“Minimal manning—and with it, the replacement of specialized workers with problem-solving generalists—isn’t a particularly nautical concept. Indeed, it will sound familiar to anyone in an organization who’s been asked to “do more with less”—which, these days, seems to be just about everyone. Ten years from now, the Deloitte consultant Erica Volini projects, 70 to 90 percent of workers will be in so-called hybrid jobs or superjobs—that is, positions combining tasks once performed by people in two or more traditional roles. Visit SkyWest Airlines’ careers site, and you’ll see that the company is looking for “cross utilized agents” capable of ticketing, marshaling and servicing aircraft, and handling luggage. At the online shoe company Zappos, which famously did away with job titles a few years back, employees are encouraged to take on multiple roles by joining “circles” that tackle different responsibilities. If you ask Laszlo Bock, Google’s former culture chief and now the head of the HR start-up Humu, what he looks for in a new hire, he’ll tell you “mental agility.” “What companies are looking for,” says Mary Jo King, the president of the National Résumé Writers’ Association, “is someone who can be all, do all, and pivot on a dime to solve any problem.”
The phenomenon is sped by automation, which usurps routine tasks, leaving employees to handle the nonroutine and unanticipated—and the continued advance of which throws the skills employers value into flux. It would be supremely ironic if the advance of the knowledge economy had the effect of devaluing knowledge. But that’s what I heard, recurrently, while reporting this story. “The half-life of skills is getting shorter,” I was told by IBM’s Joanna Daly, who oversaw an apprenticeship program that trained tech employees for new jobs within the company in as few as six months. By 2020, a 2016 World Economic Forum report predicted, “more than one-third of the desired core skill sets of most occupations” will not have been seen as crucial to the job when the report was published. If that’s the case, I asked John Sullivan, a prominent Silicon Valley talent adviser, why should anyone take the time to master anything at all? “You shouldn’t!” he replied.
As a rule of thumb, statements out of Silicon Valley should be deflated by half to control for hyperbole. Still, the ramifications of Sullivan’s comment unfurl quickly. Minimal manning—and the evolution of the economy more generally—requires a different kind of worker, with not only different acquired skills but different inherent abilities. It has implications for the nature and utility of a college education, for the path of careers, for inequality and employability—even for the generational divide. And that’s to say nothing of its potential impact on product quality and worker safety, or on the nature of the satisfactions one might derive from work. Or, for that matter, on the relevance of the question What do you want to be when you grow up? [..]
We like conscientious people because they can be trusted to show up early, double-check the math, fill the gap in the presentation, and return your car gassed up even though the tank was nowhere near empty to begin with. What struck [Michigan State University psychology professor Zachary] Hambrick as counterintuitive and interesting was that conscientiousness here seemed to correlate with poor performance. [..]
The people who did best tended to score high on “openness to new experience”—a personality trait that is normally not a major job-performance predictor and that, in certain contexts, roughly translates to “distractibility.” To borrow the management expert Peter Drucker’s formulation, people with this trait are less focused on doing things right, and more likely to wonder whether they’re doing the right things.
High in fluid intelligence, low in experience, not terribly conscientious, open to potential distraction—this is not the classic profile of a winning job candidate. But what if it is the profile of the winning job candidate of the future?
If so, some important implications follow.
One concerns “grit”—a mind-set, much vaunted these days in educational and professional circles, that allows people to commit tenaciously to doing one thing well. Angela Duckworth, a University of Pennsylvania psychology professor, has written powerfully about the value of grit—putting your head down, blocking out distractions, committing over a course of many years to a chosen path. Her writing traces an intellectual lineage that can also be found in Malcolm Gladwell’s Outliers, which explains extraordinary success as a function of endless, dedicated practice—10,000 hours or more. These ideas are inherently appealing; they suggest that dedication can be more important than raw talent, that the dogged and conscientious will be rewarded in the end.
In the stable environments Duckworth and Gladwell draw from (chess, tennis, piano, higher education), a rigid adherence to routine can no doubt serve you well. But in situations with rapidly changing rules and roles, a small but growing body of evidence now suggests that it can leave you ill-equipped.
Paul Bartone, a retired Army colonel, seemed to find as much when he studied West Point students and graduates. Traditional measures such as SAT scores and high-school class rank “predicted leader performance in the stable, highly regulated environment of West Point” itself. But once cadets got into actual command environments, which tend to be fluid and full of surprises, a different picture emerged. “Psychological hardiness”—a construct that includes, among other things, a willingness to explore “multiple possible response alternatives,” a tendency to “see all experience as interesting and meaningful,” and a strong sense of self-confidence—was a better predictor of leadership ability in officers after three years in the field. Thus, Bartone and his co-authors wrote, “traditional predictors [of performance] appear not to hold in the fast-paced and unpredictable operational environment in which military officers are working today.” [..]
All too often experts [..] fail to inspect their knowledge structure for signs of decay. “It just didn’t occur to him,” [Arizona State management professor and former Air Force officer Jeffrey] LePine said, “that he was repeating the same mistake over and over.”
Yet the limitations of curious, fluidly intelligent groups of generalists quickly become apparent in the real world. The devaluation of expertise opens up ample room for different sorts of mistakes—and sometimes creates a kind of helplessness. [..]
Grit and 10,000 hours of training are appealing in part because they reinforce American self-conceptions that have been present since the country’s founding, ideas about equality of opportunity, about the value of knowledge, about the importance of hard work. And while no one would suggest that effort itself is being devalued today—hard work is just as important in the workplace that’s emerging as in the one that’s receding—a world in which mental agility and raw cognitive speed eclipse hard-won expertise is a world of greater exclusion: of older workers, slower learners, and the less socially adept. [..]
It would be wrong to say that the 10,000-hours-of-deliberate-practice idea doesn’t hold up at all. In some situations, it clearly does. Sports, musicianship, teaching—these are fields where the rules don’t change much over time. In tennis, it pays to put in the hours mastering your serve, because you know you’ll always be serving into a box 21 feet long and 13.5 feet wide, over a net strung 3 feet high at the center. In medicine and law, the rules might change—but specialization will probably remain key. A spinal surgery will not be performed by a brilliant dermatologist. A criminal-defense team will not be headed by a tax attorney. And in tech, the demand for specialized skills will continue to reward expertise handsomely.
But in many fields, the path to success isn’t so clear. The rules keep changing, which means that highly focused practice has a much lower return. Zachary Hambrick and his co-authors showed as much in a 2014 meta-analysis. In uncertain environments, Hambrick told me, “specialization is no longer the coin of the realm.”
So where does this leave us?
It leaves us with lifelong learning, an unavoidably familiar phrase that, before I began this story, sounded tame to me—a motivational reminder that it’s never too late to learn Spanish or enroll in nighttime pottery classes. But when Guillermo Miranda, IBM’s former chief learning officer, used the term in describing to me how employees take advantage of the company’s automated career counselor, Myca, it started to sound like something new. “You can talk to the chatbot,” Miranda said, “and say, ‘Hey, Myca, how do I get a promotion?’ ”
Myca isn’t programmed to push any fixed career track. It isn’t dumb enough to try to predict the future—much less plan for it. “There is no master plan,” Miranda said. Myca just crunches data, notices correlations, and offers suggestions: Take a course on blockchain. Learn quantum computing. “Look, Jennifer!” it might say. “Three people like you just got promoted because they got these badges.”
Even as I reported this story, I found myself the target of career suggestions. “You need to be a video guy, an audio guy!” the Silicon Valley talent adviser John Sullivan told me, alluding to the demise of print media. I found it fascinating and slightly odd that Sullivan would so readily imagine that I would abandon writing—my life’s pursuit since high school—for a new line of work. More than that, though, I found the prospect of starting over just plain exhausting. Building a professional identity takes a lot of resources—money, time, energy. After it’s built, we expect to reap gains from our investment, and—let’s be honest—even do a bit of coasting. Are we equipped to continually return to apprentice mode? Will this burn us out? And will the collective work that results be as good as what came before?”
Full article: Jerry Useem, The Atlantic, July 2019