Traditional employee management techniques are out of sync with human nature. Get ready for a renaissance of self-direction.
A little past noon on a rainy Friday in Charlottesville, Virginia, only a third of CEO Jeff Gunther’s employees have shown up for work. But Gunther—entrepreneur, manager, capitalist—is neither worried nor annoyed. In fact, he’s as calm and focused as a monk. Maybe that’s because he didn’t roll into the office himself until about an hour ago. Or maybe that’s because he knows his crew isn’t shirking. They’re working—just on their own terms.
Gunther has launched an experiment in autonomy at Meddius, one of a trio of companies he runs. He turned the company, which creates computer software and hardware to help hospitals integrate their information systems, into a ROWE—a results-only work environment.
ROWEs are the brainchild of Cali Ressler and Jody Thompson, two former human resources executives at the American retailer Best Buy. ROWE’s principles marry the common-sense pragmatism of Ben Franklin to the cage-rattling radicalism of American community organizer Saul Alinsky. In a ROWE workplace, people don’t have schedules. They show up when they want. They don’t have to be in the office at a certain time—or any time, for that matter. They just have to get their work done. How they do it, when they do it and where they do it is up to them.
This appealed to Gunther, who’s in his early thirties. “Management isn’t about walking around and seeing if people are in their offices,” he told me. “It’s about creating conditions for people to do their best work.” That’s why he’d always tried to give employees a long leash. But as Meddius expanded, and as Gunther began exploring new office space, he started wondering whether talented, grown-up employees doing sophisticated work needed a leash of any length. So at the company’s holiday dinner in December 2008, he made an announcement: For the first 90 days of the new year, the entire 22-person operation would try an experiment. It would become a ROWE.
“In the beginning, people didn’t take to it,” Gunther says. The office filled up around 9 a.m. and emptied out in the early evening, just as before. A few staffers had come out of extremely controlling environments and weren’t accustomed to this kind of leeway. (At one employee’s previous company, staff had to arrive each day before 8 a.m. If someone was late, even by a few minutes, the employee had to write an explanation for everyone else to read.) But after a few weeks, most people found their groove. Productivity rose. Stress declined. And although two employees struggled with the freedom and left, by the end of the test period Gunther decided to go with ROWE permanently.
“Some people [outside of the company] thought I was crazy,” he says. “They wondered, ‘How can you know what your employees are doing if they’re not here?’” But in his view, the team was accomplishing more under this new arrangement. One reason: They were focused on the work itself rather than on whether someone would call them slackers for leaving at 3 p.m. to watch a daughter’s soccer game. And since the bulk of his staff consists of software developers, designers and others doing high-level creative work, that was essential. “For them, it’s all about the craftsmanship. And they need a lot of autonomy.”
People still had specific goals they had to reach—for example, completing a project by a certain time or ringing up a particular number of sales. And if they needed help, Gunther was there to assist. But he decided against tying those goals to compensation. “That creates a culture that says it’s all about money and not enough about the work.” Money, he believes, is only a “threshold motivator.” People must be paid well and be able to take care of their families, he says. But once a company meets this baseline, dollars and cents don’t much affect performance and motivation. Indeed, Gunther thinks that in a ROWE environment, employees are far less likely to jump to another job for a $10,000 or even $20,000 increase in salary. The freedom they have to do great work is more valuable, and harder to match, than a pay raise—and employees’ spouses, partners and families are among a ROWE’s staunchest advocates.
“More companies will migrate to this as more business owners my age come up. My dad’s generation views human beings as human resources. They’re the two-by-fours you need to build your house,” he says. “For me, it’s a partnership between me and the employees. They’re not resources. They’re partners.” And partners, like all of us, need to direct their own lives.
We forget sometimes that “management” does not emanate from nature. It’s not like a tree or a river. It’s like a television or a bicycle. It’s something that humans invented. As the strategy guru Gary Hamel has observed, management is a technology. And like Motivation 2.0, it’s a technology that has grown creaky. While some companies have oiled the gears a bit, and plenty more have paid lip service to the same, at its core, management hasn’t changed much in 100 years. Its central ethic remains control; its chief tools remain extrinsic motivators. That leaves it largely out of sync with the non-routine, right-brained abilities on which many of the world’s economies now depend. But could its most glaring weakness run deeper? Is management, as it’s currently considered, out of sync with human nature itself?
The idea of management (that is, management of people rather than management of, say, supply chains) is built on certain assumptions about the basic natures of those being managed. It presumes that to take action or move forward, we need a prod—that absent a reward or punishment, we’d remain happily and inertly in place. It also presumes that once people do get moving, they need direction—that without a firm and reliable guide, they’d wander.
But is that really our fundamental nature? Or, to use yet another computer metaphor, is that our “default setting”? When we enter the world, are we wired to be passive and inert? Or are we wired to be active and engaged?
I’m convinced it’s the latter—that our basic nature is to be curious and self-directed. And I say that not because I’m a dewy-eyed idealist, but because I’ve been around young children and because my wife and I have three kids of our own. Have you ever seen a 6-month-old or a 1-year-old who’s not curious and self-directed? I haven’t. That’s how we are out of the box. If, at age 14 or 43, we’re passive and inert, that’s not because it’s our nature. It’s because something flipped our default setting.
That something could well be management—not merely how bosses treat us at work, but also how the broader ethos has leeched into schools, families and many other aspects of our lives. Perhaps management isn’t responding to our supposedly natural state of passive inertia. Perhaps management is one of the forces that’s switching our default setting and producing that state.
Now, that’s not as insidious as it sounds. Submerging part of our nature in the name of economic survival can be a sensible move. My ancestors did it; so did yours. And there are times, even now, when we have no other choice.
But today economic accomplishment, not to mention personal fulfillment, more often swings on a different hinge. It depends not on keeping our nature submerged but on allowing it to surface. It requires resisting the temptation to control people—and instead doing everything we can to reawaken their deep-seated sense of autonomy. This innate capacity for self-direction is at the heart of Motivation 3.0 and Type I behavior.
The fundamentally autonomous quality of human nature is central to self-determination theory (SDT). Edward Deci, a professor of psychology at the University of Rochester, and Richard Ryan, a former student who is now Deci’s colleague, cite autonomy as one of three basic human needs. (The others are the need for competence and the need for relatedness.) And of the three, it’s the most important—the sun around which SDT’s planets orbit. In the 1980s, as they progressed in their work, Deci and Ryan moved away from categorizing behavior as either extrinsically motivated or intrinsically motivated to categorizing it as either controlled or autonomous. “Autonomous motivation involves behaving with a full sense of volition and choice,” they write in a 2008 article in Canadian Psychology, “whereas controlled motivation involves behaving with the experience of pressure and demand toward specific outcomes that comes from forces perceived to be external to the self.”
Autonomy, as they see it, is different from independence. It’s not the rugged, go-it-alone, rely-on-nobody individualism of the American cowboy. It means acting with choice—which means we can be both autonomous and happily interdependent with others. And while the idea of independence has national and political reverberations, autonomy appears to be a human concept rather than a Western one. Researchers have found a link between autonomy and overall well-being not only in North America and Western Europe, but in Russia, Turkey and South Korea. Even in high-poverty non-Western locales like Bangladesh, social scientists have found that autonomy is something that people seek and that improves their lives.
A sense of autonomy has a powerful effect on individual performance and attitude. According to a cluster of recent behavioral science studies, autonomous motivation promotes greater conceptual understanding, better grades, enhanced persistence at school and in sporting activities, higher productivity, less burnout and greater psychological well-being. Those effects carry over to the workplace. In 2004, Deci and Ryan, along with Paul Baard of Fordham University, carried out a study of workers at an American investment bank. The three researchers found greater job satisfaction among employees whose bosses offered “autonomy support.” These bosses saw issues from the employee’s point of view, gave meaningful feedback and information, provided ample choice over what to do and how to do it and encouraged employees to take on new projects. The resulting enhancement in job satisfaction, in turn, led to higher performance on the job. What’s more, the benefits that autonomy confers on individuals extend to their organizations. For example, researchers at Cornell University studied 320 small businesses, half of which granted workers autonomy; the other half relied on top-down direction. The businesses that offered autonomy grew at four times the rate of the control-oriented firms and had one-third the employee turnover.
Yet too many businesses remain woefully behind the science. Most 21st-century notions of management presume that, in the end, people are pawns rather than players. British economist Francis Green, to cite just one example, points to the lack of individual discretion at work as the main explanation for declining productivity and job satisfaction in the U.K. Management still revolves largely around supervision, “if-then” rewards and other forms of control. That’s even true of the kinder, gentler Motivation 2.1 approach that whispers sweetly about things like “empowerment” and “flexibility.”
Indeed, just consider the very notion of “empowerment.” It presumes that the organization has the power and benevolently ladles some of it into the waiting bowls of grateful employees. But that’s not autonomy. That’s just a slightly more civilized form of control. Or take management’s embrace of “flex time.” Ressler and Thompson call it a “con game,” and they’re right. Flexibility simply widens the fences and occasionally opens the gates. It, too, is little more than control in sheep’s clothing. The words themselves reflect presumptions that run against both the texture of the times and the nature of the human condition. In short, management isn’t the solution; it’s the problem.
Perhaps it’s time to toss the very word “management” onto the linguistic ash heap alongside “icebox” and “horseless carriage.” This era doesn’t call for better management. It calls for a renaissance of self-direction.
This is an edited excerpt from Drive: The Surprising Truth About What Motivates Us by Daniel H. Pink, published by Riverhead Books. (c) 2009 by Daniel H. Pink.