Thursday, October 18, 2012

How Much Can Your Employees Get Away With?

I was baffled. It was years ago, during my first semester as a physics and math teacher at a last-chance Brooklyn public high school. I could be as clear as day about my intentions, what I wanted from the kids, my reasons, and the consequences for non-compliance, and yet the kids did whatever they wanted. But after a while, they started to fall in line.
What was happening? One of the benefits of teaching high school is that you get uncensored feedback — a kind of radical transparency — on every aspect of your performance. I quickly noticed that the kids had figured out my "contingency maps"; they had decoded the unwritten if...then logic of my behavior. Regardless of what I said, every student knew what they could get away with and what work they were going to have to do. And this was independent of motivation, or of the power of my rewards and punishments. Some of them loved the work, while some wanted nothing to do with school. But in one way or another, that first semester, they all skipped whatever they could get away with not doing. Now that I have studied neuroscience and psychology, I understand what was happening, and what to do about it.
Unbeknownst to the students, they were using the brain's powerful ability to recognize patterns and adapt to them. As a leader who is aware of this aspect of brain function, you can learn when you are communicating the wrong message and how to communicate the right one, and thus elicit the employee behaviors and organizational culture you want. We have two competing systems in the brain, each of which relies on a different brain region. One system is a pattern recognizer, and it guides the vast majority of our behavior. It's responsible for putting together the pieces of what we see, hear, smell, touch, and so on, to give us a sense of what's going on in the world. It does this passively and effortlessly.
Now add another fact about the pattern recognizer. It is no secret that the bulk of what our brains are interested in is other people. Thus, without trying, we know who arrives late to meetings but not dinners and vice versa. We know who says there will be a penalty, but doesn't follow through. We know what situations make our work friends angry at the boss. We know which paperwork will have little effect. And we know how long it really takes to get a response from the VP or from Sales on X. The default mode of brain function is to rely on this pattern recognizer and follow the contingencies it has picked up.
Let's leave my old high school classroom and look at a familiar contingency map that illustrates how pervasive these maps can be. All drivers have a sense of what the real speed limit is, and what the probability of getting a ticket is at different speeds (at the posted speed limit, at the unwritten speed limit, and higher than either). And we drive accordingly. Most people probably cannot put a number on these probabilities, but their behavior reflects that they know them in some way, and they are often aware of the basic rules. The unwritten speed limit is roughly 5 to 10 mph higher than the posted limit in many areas, but sometimes as much as 25 mph higher. And, notably, people often feel it is unfair to be ticketed below that unwritten level. How could this feel unfair? Because the pattern recognizer is what set our expectations and prepared the appropriate emotions for the situation. If our method for enforcing the speed limit were a social experiment in trying to get people to speed, it would be brilliant.
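To make the idea concrete, a contingency map can be sketched as a simple lookup of if...then rules. This is a toy illustration only; the probabilities and category names below are invented, not real traffic data.

```python
# A toy "contingency map": the unwritten if...then rules the brain's
# pattern recognizer learns. All probabilities are invented for
# illustration, not real traffic data.
TICKET_PROBABILITY = {
    "at_posted_limit": 0.00,        # driving the posted limit: effectively no risk
    "at_unwritten_limit": 0.02,     # 5-10 mph over: the tolerated norm
    "above_unwritten_limit": 0.40,  # well past the norm: real risk of a ticket
}

def expected_behavior(risk_tolerance):
    """Pick the fastest driving option whose perceived risk feels acceptable."""
    # Options ordered slowest to fastest; drivers settle on the fastest
    # choice the pattern recognizer still flags as "safe enough".
    ordered = ["at_posted_limit", "at_unwritten_limit", "above_unwritten_limit"]
    choice = ordered[0]
    for option in ordered:
        if TICKET_PROBABILITY[option] <= risk_tolerance:
            choice = option
    return choice

# Most drivers tolerate a little risk, so behavior converges on the
# unwritten limit regardless of what the posted sign says.
print(expected_behavior(0.05))  # -> at_unwritten_limit
```

The learned map, not the posted rule, is what predicts behavior — which is exactly the point of the speed-limit example.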

It is when the pattern recognizer isn't able to handle the situation that a second system kicks in. The second system — the reasoning system — is responsible for deliberate, self-conscious decisions about how we will behave. As leaders, we try to speak to the reasoning system to ask for compliance. But most of the time, the pattern recognizer drives behavior. The reasoning system uses parts of the brain that developed more recently in evolution, are metabolically demanding, and are capable of fatiguing quickly. It hears the request "Hey everyone, I'd like to start these meetings right at 9:00" and makes an effort to comply. But without some trigger to grab the attention of this system each time, and override the pattern recognizer, the latter will soon figure out just when the meetings really start, and people will arrive accordingly. The same goes for larger organizational requests, like greater focus on safety or creativity.
With my students, what happened halfway through the year was that they stopped getting mixed messages from me — I stopped asking their reasoning systems for hard work and focus while showing their pattern recognizers that they could choose the direction the class would take if they just acted out. Regardless of what I told their reasoning systems, I also had to show their pattern recognizers, through consistent behavior on my part, what the new rules of the classroom were. Once their pattern recognizers came to expect that the safe bet was that we'd be working in class, that my attention would come from asking a good, on-topic question, and that acting out would get you left out — that made the difference.
As leaders, we need to ask ourselves, what are our own contingency maps? When we ask for change in our own organizations, what are the unwritten rules we are communicating about how things really operate, and about how they will operate after a change? To learn what rules you project, look at what's consistent in your employees' behavior, because they are behaving according to your contingency map. For example, suppose you have asked for more creativity and risk-taking on project proposals you receive, but the whole team seems to keep playing it safe. Ask yourself what you are doing to suggest that playing it safe is a good idea.
Do you respond to quickness more than creativity? Are your people in danger if their proposal is too innovative to use right away, but in no danger with a mediocre proposal that can be used off the shelf? You can become conscious of these contingencies and leverage them toward your desired ends — for instance, by finding ways to make creativity a safer choice than quickness. Be consistent. You can't hide from the unwritten rules. But you can leverage them, by removing contingencies that interfere with your goals and setting up contingencies that support them.
I'll be discussing these ideas as a panelist at the session on Organizational Change at 1:30 pm on October 17th, at the 2012 NeuroLeadership Summit, which you can watch free via livestream. Streaming is free throughout the summit (October 15-17) and available for purchase afterward. Viewers can also join the conversation on Twitter with the hashtag #2012NLS.

Romney and Obama Need to Get Stories Straight

Here's a safe bet about the 2012 presidential election: either Barack Obama or Mitt Romney will lose. And here's a bet that's nearly as safe: whoever the loser is, he will explain his loss this way: "the other guy told a better story."
We hear it after every election cycle. In 2004, the Kerry campaign complained that while they had controlled the facts, Bush controlled the story. After the 2008 election, McCain's consultants bickered over the narrative he should have crafted but couldn't commit to. Even Obama, as if predicting his own defeat before the 2012 campaign began, lamented that his greatest regret thus far was failing to create a compelling narrative for the American people. (Perhaps we should have predicted his uninspiring performance in the first debate after all?)
Campaigns built around a coherent and compelling narrative have a huge advantage over those that continually fall back on their numbers, complicated policy proposals and of-the-moment attacks. Both camps know this, though in the fog of war, it's easy to forget. But with the margin between the candidates now razor thin, forgetting this core truth is a luxury neither campaign can afford. Here's why:
Stories are enormously important in helping us human beings make sense of our world. The more uncertain and complex the times are, the more we tend to turn to stories. They order our experience, defining clear heroes who are leading the way to a better future and obvious villains who stand in their path. They don't replace facts — they contextualize them. They teach a core truth — a moral of the story that aligns with the values of the listeners. And the best stories — the ones that have always built evangelists and rallied followers — show audiences how they can take on a starring role, stepping up as the heroes themselves.
So which candidate is doing a better job of telling a story? The sad reality is that while both Romney and Obama have at times defined a core story for their campaigns, neither candidate has been disciplined or focused enough to defend it effectively. It's a far cry from 2008, when Obama's breakthrough narrative of citizen power, racial healing and a new beginning for America turned the election into a national referendum to define the story of our nation's past and future. Still, the narratives for each candidate are there, waiting to be leveraged, waiting to deliver this election to the candidate willing to put down the policy jargon and smears and tell a great story.
Romney's Story
The story Romney's been trying to tell needs to be understood in terms of the "Myth Gap" he's offering to heal. A Myth Gap is a cultural moment in which key explanations no longer make sense. For generations, the American Dream has been a core story at the heart of the American experience. Work hard, build up wealth, get ahead — it was simple and for many, very true. In 2008, that story came into serious question for millions of Americans who had indeed worked hard and built up wealth only to see it disappear.
When a significant Myth Gap opens up, a story that heals it is usually gobbled up. And Romney, indeed, has such a story. The American Dream, Romney's story goes, is as true as ever, but there's an evil force trying to squash it — a government obsessed with its own power and misguided policies of interference, embodied of course by Barack Obama and his administration. We are living through a time of deep failure, the story continues, and we need a white knight, a turnaround guy to come in and rescue the small business owner in distress. Who better to protect us than a strong father of five sons, a loving husband, a man who's rescued failing businesses wherever he found them?
Romney's narrative borrows from common conservative themes of small government and self-reliance, but, at his best, he adds a positive twist: his vision of America is not a collection of rugged, angry individualists wanting to be left alone but of optimistic small-town shopkeepers, taking care of their communities, their families and themselves.
Romney's story is not without its liabilities. First and foremost, he makes himself the absolute hero. If marketers have learned anything these past 50 years, it's that the best campaigns make the audience the heroes. If Romney can't adjust his narrative in that direction, simply returning to it may not be enough. And speaking of returning to his story, the more exclusively Romney harps on America's failed recovery these past four years rather than painting a picture of a desirable and deserved future, the further his story slips away.
Obama's Story
Obama's story was also defined at his convention with the counter-theme of "We're all in this together." It's a softer version of Yes We Can but plays on similar themes. Here we find clear heroes: American citizens working in joint purpose, sacrificing together for the good of the whole. But his story has now taken on a heavy dose of Occupy Wall Street anger and that's changed its tone. Where in 2008, the story was all about healing, Obama's 2012 story is about a determined charge forward in the face of villainous greed, inequity and corruption. It's extremely resonant, and not just with the small slice of America that showed up at Occupy encampments.
It's harder to sell a story of hope and change of course when you're in power, but at its best, Obama's story casts this campaign as the continuation of a rebel journey that has only just begun.
Like any name-brand, Obama has created an indelible and unshakeable story for himself. The American public will always associate him with his breathtaking 2008 story. The more he chooses to defend it and update it, the more he can build off of his greatest assets.
Unfortunately for his campaign, he hasn't told that story with much conviction since the Democratic Convention. And that gives voters the feeling that perhaps he himself is running out of hope, even as the results of the last four years have been far from a disaster in the view of many Americans.
So, who will strike first? Who will return to the big themes and stories before election day is upon us? I hope both do. The beauty of our democracy is that every four years, we have the chance to debate, as a nation, what our story is and should be. When our candidates refuse to carry that responsibility, we all find ourselves on the losing side of an election season.

New Product Narcissism

I want to love Samsung products. Every time I see someone with Samsung's flagship smartphone — the Galaxy S III — I can't help but be impressed. It's got an impressive 4.8-inch Super AMOLED screen (read: very high quality, large size). It sports the near field communication chips I've been dying for (read: secure mobile payments, instant device-to-device information transfer). And no one can argue that it's a beautiful device (read: it might not be as pretty as my iPhone 5, but at least people will notice that it's not an iPhone 4S).
I also love the gumption Samsung showed when it came out swinging against Apple after losing a billion dollars in their epic patent-infringement case. But at the end of the day, I don't love Samsung products. In fact, I don't love any Android devices, because I don't buy a phone to own a spectacular piece of hardware — and that's just something no big hardware company has figured out yet.
I buy a phone because over the course of a given day, things just come up. The more problems I can solve with a given device, the better that device is to me. And even if Samsung, HTC, and Motorola had a monopoly on the world's top engineers (which they don't), there's simply no way they could anticipate and solve all my problems (and if they tried, they'd end up with a disaster of a phone that looked something like this).
In reality, when a job arises in my life and I find myself in search of a solution, I tend to hire a bundle of products to complete that job. When I find myself hungry, in search of a meal, I often hire Grubhub.com and the local Thai restaurant. When I find myself trying to impress a group, I tend to hire a style magazine, a haircut, Brooks Brothers, and a new tie. When I find myself trying to entertain my houseguests, I hire an iPhone and a Philips dock built for the iPhone. A phone is often part of the solution to my problems, but much of the time it's not the whole thing.
Unfortunately, like many product designers, most Android device makers fall victim to new product narcissism: the delusional belief that their product can solve the world's problems without regard for all the experiences consumers derive from ecosystem partners. A relentless focus on ice cream, without regard for cups, spoons, or cones. The belief that products are solutions in and of themselves, instead of parts of the whole.
For example, Samsung mocks Apple's hesitancy to change hardware design. What the hardware giant fails to see are the benefits Apple offers its customers by sticking with a standard design with similar hardware for two years at a time. Apple's sub-par but static hardware is easier to design applications for — instead of quality checking thousands of devices, Apple developers only need to QA a handful. Apple's sub-par but static hardware also draws in a slew of accessory makers. From speakers, to cases, to totally new categories of product, Apple's standard design makes the iPhone the physical platform of choice.
Avoiding new product narcissism is key to building the products that will change the world. Product teams need to spend more time evaluating and solving problems instead of engaging in unproductive feature wars. I suggest three steps.
Start with the job you do for customers. If we agree that products are only valuable in their ability to complete the jobs that arise in our lives, then before you start your next design cycle, figure out what those jobs are. What does completing the job entail? What are the experiences you need to provide for your customers? What are the must-haves and what are the nice-to-haves?
Survey current solution ecosystems. Once you have an idea of what the jobs are in the arena you're vying to compete in, identify all the industry participants who currently help do that job for customers. Who makes the products? Who distributes? Who builds complementary products? Who provides post-sale support? How does each player contribute to the solution?
Identify what jobs you can complete alone, and develop a plan to empower ecosystem partners to do what you can't. Sometimes innovation requires completely replacing members of the existing ecosystem. Sometimes it doesn't. After you've charted how current ecosystems solve your customers' problems, figure out what you can and can't deliver. Be honest. Then, if you still want to target jobs-to-be-done that are beyond the capability of your team, develop a plan to empower ecosystem partners.
The Segway was a technological marvel. It took Steve Jobs and Jeff Bezos just minutes to conclude that it had no market value, because it didn't fit into our system. We don't have enclosed roads, we don't have convenient Segway (or, in many cities, even bike) lanes, and there are no systems to help a Segway carry my groceries home or haul my kids to school. It was a great product, but it was a terrible solution.
I want to love Samsung handsets. Technically, they're probably the best devices on the market. But while they might be the best products, they're certainly not the best solutions yet. If you want to avoid that same criticism, recognize that you're competing to be the best answer to a problem — and the value you create comes from far more than the product you deliver to your customers in the box.

Metacognition: The Skill Every Global Leader Needs

The increasingly international nature of business means leaders need new skills to realize the full potential of teams and networks of people from a variety of cultural backgrounds. At the NeuroLeadership Summit being held in New York this week, top executives from Citibank, the American Management Association, and American University joined me on a panel to explore these new skills — skills including handling complexity, communicating virtually, and working across cultures.
Key among those is a thinking skill called cultural metacognition. Metacognition simply means thinking about thinking; in this context, thinking about your cultural assumptions. According to our research, if you can gain awareness of your assumptions, you can build trust and take your team beyond cooperating on a task to true creative collaboration.

Imagine you're driving in a foreign city. It takes heightened self-awareness to avoid getting lost, so you need to be aware of the ways in which your mental map may be incomplete. You also need to actively check your assumptions against passing signs and landmarks.
Managers leading teams from different cultures confront a similar challenge. To navigate a working relationship with someone from another culture, you should be aware of your working assumptions about the other person. Checking for signs during the interaction that these assumptions apply is crucial to avoiding wrong turns or collisions in the relationship.
In a recently published paper led by my former student Roy Chua, now a professor at Harvard Business School, we used three different methods to test whether higher cultural metacognition leads to successful creative collaboration across cultures.

First, we assessed individual differences in the cultural metacognition of 43 middle-level managers enrolled in an executive MBA course, using the four-factor self-report cultural intelligence scale. For each manager, we also surveyed former coworkers whose cultural upbringing was different from the manager's, and asked how they rated the manager's effectiveness in creative collaboration. Managers with higher metacognition scores were rated as more effective by the other-culture managers who had worked with them.

Second, we surveyed the social networks of another group of executive MBA students, querying the 24 most important contacts in their professional network: Do they share new ideas with the contact? Do they trust the person affectively — from the heart, based on emotions? And, do they trust the person cognitively — from the head, based on evidence?
We asked the MBA students for their contacts' backgrounds as well as their cultural upbringing. As expected, the MBA students' cultural metacognition scores predicted the extent to which they experienced idea sharing, specifically in their cross-cultural relationships, though not in their same-culture relationships. To understand how high metacognition fosters the flow of ideas in intercultural relationships, the survey also measured two aspects of trust: affective trust (felt rapport) and cognitive trust (perceived reliability). Results showed that the creative collaboration gap in intercultural relationships arose from a deficit in affective, not cognitive, trust. This suggests that cultural metacognition works through fostering affective rapport that enables idea sharing and innovation.
But does trust lead to collaboration, or collaboration lead to trust? The final study tested this question in an experiment involving 236 undergraduates, invited to the laboratory two at a time. These participants were presented with a challenge similar to the television show "Iron Chef." They were shown ingredients used in different cuisines and asked to devise a recipe for an innovative chicken dish. After the individual challenge, participants were paired with someone from a different cultural background and asked to collaboratively produce a recipe different from what either had created individually. As in the prior studies, higher cultural metacognition led to greater affective trust and ultimately, more idea sharing and better creative collaboration — as rated subjectively by the participants, and as scored objectively by the expert chef judges.

A manipulation in the experiment confirmed our hypothesis about how the affective trust comes about. Half of the intercultural teams were given 10 minutes to talk personally before working together on the design task, while the other half went directly into the design task without this opportunity. Higher cultural metacognition predicted greater affective trust and creative collaboration only in the condition where groups had the personal conversation. In short, individuals with higher cultural metacognition did better at the get-acquainted conversations; their sensitivity about when to rely on cultural preconceptions meant that they were able to bridge the intercultural gap.
The good news is that cultural metacognition can be developed and strengthened over time in the same way driving in a foreign city can be improved. Here's what managers should consider:
  • Take positions and assignments in other countries and actively compare notes with others to gain a richer sense of how cultural lenses shape perceptions.
  • Keep a journal of your successes, failures, and surprises in adapting to a new culture.
  • Develop a checklist of questions to answer before a first meeting with a new contact. Checklists have been proven to reduce mistakes in aviation and medicine, helping pilots and doctors avoid errors under pressure; the same is true in business. So, for instance, a Greek businessperson heading to Germany for a negotiation session might deliberately consider certain norms, such as starting the meeting precisely on time, setting a specific agenda, and presenting detailed, fact-based evidence for each argument.
  • Use web-based tools like Culture Navigator to inform and test your assumptions about a culture that you do not know well.
To watch a livestream of the conference, visit the Neuroleadership Summit website.

What the Space Race Can Teach Us About Collaboration

As I watched the Space Shuttle making its way across the streets of Los Angeles recently en route to its final home, I found myself thinking about how our space program began. In particular, one little-known fact has long fascinated me: President Kennedy dreamed of a partnership with our nation's closest rival, the U.S.S.R., then led by Premier Khrushchev.
Now, as Earth faces environmental and social challenges, there's much we can learn from the way America rose to the challenge of the space race. In broad strokes, the challenges we face are deep and interconnected. The world's population will swell to 9.5 billion by 2050. Volatile weather is the new normal. Our global food system fails two out of every seven people. Recent studies highlight the urgent need for $50-100 trillion in global infrastructure investments by 2030 just to keep the lights on.
Our actions today will determine the world we give our kids tomorrow. President Kennedy's space race leadership teaches us five valuable lessons about how to nurture the kind of collaboration that can stand the test of time.

Perceived crises can spur massive and worthwhile goals

A key theme of the 1960 U.S. Presidential election was whether America could maintain its scientific edge. In response to the Soviet Union's launch of the first man into space, President Kennedy inspired a nation to dream bigger and put a man on the Moon. We're in another crisis now. Approached correctly, these crises give today's companies a stage for sweeping innovation in technology and management practices alike.

Worthwhile goals can inspire entire generations

President Kennedy made science cool. It became the playground of our imaginations. Children imagined themselves as spacemen. Companies envisioned opportunities to grow. There was now a vision to unlock society's kinetic energy. An argument can be made that the dot-com era's roots trace back to this renewed interest in science in the 1960s.

Think and act in timescales that outlive your leadership

President Kennedy's goal implied that he would not be in office to reap the benefits of his leadership. His example shows that legacy can serve as an inviting substitute for short-term glory. With CEO tenures becoming ever shorter, the temptation is to prioritize short-term results. It is essential we rise above this temptation.

Partnerships with strange bedfellows must be considered

President Kennedy actually sought to collaborate with our fiercest rival - the U.S.S.R. - in order to achieve his vision. During one public address, Kennedy said: "...in a field where the United States and the Soviet Union have a special capacity--in the field of space--there is room for new cooperation..." With history as our teacher, there is no reason why companies should not be prepared to work with their fiercest rivals too.

Cross-sector collaboration has its benefits

Cross-sector collaboration was crucial to the achievement of President Kennedy's vision. Many companies within the private sector grew as a result. For example, IBM provided computational power. In IBM's view, participation gave the company the opportunity to challenge itself to develop the cutting edge in information and data management. Not to mention a new client.

In order to answer the call of today's challenges, the private, public, and civil sectors must collaborate. In the process, we will make the kind of history we can be proud of once more.

More blog posts by Eric Lowitt

Demand and Sales Aren't Equivalent

Steve Carlotti, CEO of the company where I work, likes to say: "Sales is what you buy. Demand is what you want. Growth comes from bringing the two together." As companies try to exploit the opportunities presented by Big Data, the difference between those two things is an essential insight.
Most executives assume that sales equal demand. Very often, this couldn't be further from the truth. The challenge with sales data is that it is too superficial. First, shopping occurs at the household level, but demand is at the individual level.
Take the last grocery bill for beverages for my family of five. There is no beverage that our entire household consumes. Four out of five of us drink milk, three out of five drink juice, two out of five drink coffee, and one out of five prefers enhanced water.
No sales database can tease apart these nuances, and if you're simply measuring sales, it's hard to tell who in my family represents demand for each of these products. Also, the past does not predict the future. For nearly all the categories above, we were buying different variations (e.g., soy milk, almond milk, organic milk) and brands of each six months ago.
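To see why, here's a toy sketch (with invented names, using the preference counts from the beverage example above): the sales database holds a single household-level record, while demand lives in per-person preferences that record cannot recover.

```python
# One household's beverage purchases, as a sales database sees them:
# a single record with no indication of who wants what.
household_sales = {"milk": 2, "juice": 1, "coffee": 1, "enhanced_water": 1}

# The underlying demand is per person (names invented for illustration).
individual_demand = {
    "Ana": {"milk", "juice", "coffee"},
    "Ben": {"milk", "juice", "coffee"},
    "Cal": {"milk", "juice"},
    "Dee": {"milk"},
    "Eli": {"enhanced_water"},
}

def drinkers_of(product):
    """Return who in the household actually wants this product."""
    return [person for person, prefs in individual_demand.items() if product in prefs]

# 4 of 5 drink milk, 2 of 5 drink coffee -- counts that the
# household-level sales record alone cannot reproduce.
print(len(drinkers_of("milk")), len(drinkers_of("coffee")))
```

The point of the sketch: `household_sales` is what most analytics see, but only something like `individual_demand` tells you who the demand actually belongs to.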
Finally, sales don't equal demand because consumers settle for less far more often than most realize. Consumers are forced to buy hot dogs and hot dog buns in different pack sizes. Socks come in a handful of sizes when shoe sizes range across 20+ lengths and 7+ widths, for a total of 140+ permutations. Twenty-nine percent of beer drinkers don't like the taste of beer! Between out-of-stocks, suboptimal assortments, pricing inefficiencies, difficult point-of-sale experiences, misaligned brands, and redundant innovation, I'd be surprised if consumers were happy with more than half their purchases.
If the challenge of analyzing sales data is breadth, the challenge of profiting from demand data is the inverse: it requires such depth that purpose, practicality, and profit get lost. Demand data comes in the form of market research, demographic and behavioral databases, and more recently social media and search, all of which require you to wallow in the primordial soup of unmet needs to figure it out.
Demand is primal. I've seen consumers cry when given just the right stapler, because being neat and organized is part of their identity. I've seen people wax poetic about how a basic bar of soap can be a passport to paradise, if only for 10 minutes. It's easy to overlook the Pandora's box of human emotions that even the most commoditized of products can unlock, because each of us has a profoundly complex and uniquely rich story.
Demand is also paradoxical. Consumers frequently say and do very different things. The 29% of beer drinkers who don't really like beer are reluctant to say so out loud.
Finally, demand needs to be measured in profits, by quantifying the economic value of a Facebook Like, a Google search, or a top-two-box score on a survey. That enables research to be linked to resource allocation and ROI, and ensures purpose and practicality won't be lost along the way.
While sales and demand data have their own challenges, the biggest upside will come from better integration across both, as well as a third area of data we have yet to talk about: what people watch. I am a consumer (demand data), a shopper (sales data), and a watcher (media data). Yet most companies are focused on going deeper in one area instead of integrating across all three. The challenge is that the keys to the three datasets are held by different companies: consumer (manufacturers), shopper (retailers), and watcher (media). Even the digital arena is still fairly fragmented across consumers (Facebook, Twitter), shoppers (Amazon), and watchers (YouTube, Netflix), though some are trying hard to get to all three, like Google (Google+, search, YouTube) and Amazon (reviews and ratings, retail, and Prime video).
This is why big demand data will require renaissance executives who are both right- and left-brained, schooled in anthropology and accounting, and equally skilled at using a microscope and a telescope to solve problems and find growth.

Why I Decided to Rethink Hiring Smart People

Chris Argyris's "Teaching Smart People How to Learn" utterly changed the way I thought about management. It didn't just give me a somewhat different view; it convinced me of the exact opposite of what I had believed before I'd read it. That's a heck of a lot of influence for 10 and a half pages!
At the time, I was a director at the strategy consulting firm Monitor, and a few months before the article was published in the May-June 1991 issue, we had formed a four-person Global Executive Committee to run the firm, so I was more intimately involved in its management than I had ever been before.
We had a pretty simple recruiting philosophy during our swift ramp-up from inception in 1983 to that point in time: hire super smart consultants because, thanks to their great intellect, they will be able to learn best and fastest. In fact, we had a thoroughly obnoxious catchphrase — stupid is forever — that I am very embarrassed ever existed, and repeating it here is part of my penance for once holding the view. Its (deeply flawed) logic was that you could teach someone all the interpersonal skills necessary as long as they were really smart. But if they weren't really smart to begin with, there was nothing you could do.
We were a Harvard Business School shop in the early days and, having great respect for Baker Scholars (the top 5% of the HBS class), we hired as many of them as we could. But they didn't work out nearly as well as we expected, and some flamed out pretty spectacularly. As is often the case, we attributed that to flawed execution of a fundamentally awesome theory — we had just hired the wrong super smart people.
Then I read "Teaching Smart People How to Learn," which argued trenchantly and compellingly that really smart people have the hardest time learning. They are so very smart that they are also very "brittle," to use Argyris's descriptor. When something goes wrong, rather than reflect on what they might have done to contribute to the error, they look entirely outside themselves for the causes and blame outside forces: irrational clients, impossible time pressure, lack of adequate resources, shifts beyond their control. Rather than learn from their errors, they doom themselves to repeat them.
Before reading the article, I would have been inclined to finish that last sentence with "despite being so very smart." After the article, my conclusion was "because they are so very smart." I personally changed my philosophy and worried a lot more about "smart is forever" than the opposite, and we at Monitor changed how we recruited and developed ever after.
The article had another positive knock-on effect for me, which added to its already great influence: It got me thinking about other assumed unalloyed goods. We had, for instance, assumed that "smartness" — in Monitor's case defined as analytical brilliance — was an unalloyed good. If Tom had a smartness rating of 10 and Sally was a 12, then she was just plain better. And if Jorge was a 15, he was better still. There was no downside to more of this obviously meritorious attribute, so just get more.
Of course, the faults of this reasoning are hardly news — we have an ancient expression which holds that "too much of a good thing may not be such a good thing" — but the article really brought that home to me in the business context. More time to do a client's project just might not be a good thing. A bigger consulting team for a given project might not be a good thing. Higher client billings might not be a good thing. More offices might not be a good thing. More consultants might not be a good thing.
Since reading the article, I have looked more critically at every unalloyed good in my world. I transformed how I consulted, how I built client relationships, how I managed the aspects of Monitor I ran, and how I have led the Rotman School as dean. It has made me better at what I do — my thanks to Chris Argyris and "Teaching Smart People How to Learn."
More blog posts by Roger Martin