Everyone Is Watching You Fail, Except They Aren't, Except You Think They Are
The guy two seats down from me in the all-hands meeting asked the question I'd been holding in my mouth for forty-five minutes. He just said it. Out loud. Into the open air of a conference call with, I don't know, two hundred people on it, including several vice presidents and at least one person whose title I've never fully understood but whose Slack messages make my stomach tighten.¹ He asked, basically: "I'm sorry, but can someone explain what we actually mean when we say 'customer-centric' here, because I've heard it nine times today and I'm not sure any of us are saying the same thing." The silence was about three seconds long, which in a meeting of this size feels approximately geological. And then the SVP of Product said, "That is, honestly, the best question anyone has asked all quarter."
I'd had the same question. I'd typed it into the chat box and then deleted it. Twice.
Here's what I was thinking during those forty-five minutes, and I want to be precise about this because I think the precision matters: I was thinking about the way my question would sound. Not whether it was a good question (I was 80% sure it was a good question, which is a pretty strong batting average for questions asked in professional settings). I was thinking about how the asking of it would make me look. Specifically, I was worried it would reveal something about me, some gap in my understanding everyone else had already quietly filled, some professional deficiency I'd managed to hide for years and was now, in one careless moment, about to expose to two hundred of my colleagues: a fairly grandiose fear for a Tuesday afternoon.²
This is a thing I do. Have done. Keep doing. Despite understanding, on an intellectual level, exactly why it's irrational. I collect the unasked questions and unsent emails and unconsidered risks the way some people collect vinyl records or regrets, which (now that I'm thinking about it) may be the same collection.
Psychologists have a name for the specific cognitive error I was making in my chair. They call it the spotlight effect, and it works exactly the way you'd guess from the name: we massively overestimate how much other people are paying attention to us.³ The research on this has been replicated across dozens of studies, and the finding is consistent enough to be depressing: people routinely estimate they're being noticed and evaluated by roughly twice as many observers as are actually paying attention. Which means the audience you're performing your competence for is, at any given moment, about half the size you think it is. And the half watching? Most of them are cycling through their own private calculations about how they look, which means even the people noticing you aren't really noticing you. They're noticing you the way you notice wallpaper. Briefly, peripherally, and without retention.
There's something relieving and devastating about this simultaneously. Relieving because it means the theater you've been performing in is mostly empty seats. Devastating because it means you've been declining to raise your hand for years, carefully protecting yourself from a judgment barely anyone was going to render in the first place.⁴
I want to be careful here not to make this sound simpler than it is, because the fear of looking stupid is not a small or silly thing, even if the cognitive science says we're miscalibrating its probability. The fear is, in a certain way, very old. Older than performance reviews and all-hands meetings. There's a reason Aristotle spent considerable time on shame in the Rhetoric, and it wasn't because he was worried about his quarterly OKRs. The fear of being laughed at, of being found inadequate, of being exposed as someone who doesn't quite belong in the room they're sitting in, runs deep enough to operate beneath our conscious awareness.⁵ It doesn't feel cognitive. It feels physical. A tightening in the chest. A sudden heat in the ears. The conversational equivalent of pulling your hand back from a hot stove before your brain has even finished the sentence "that's hot."
And here's what makes it so persistent: the illusion of transparency, the spotlight effect's anxious cousin, tells us not just that everyone is watching, but that everyone can see through us. That our nervousness, our uncertainty, our fraudulence (perceived or otherwise) is somehow visible on our faces, readable in our body language, detectable in the micro-tremor of our voice when we say "I have a question."⁶ We think we're made of glass. We're actually, for all practical social purposes, closer to brick. But knowing you're made of brick and feeling made of glass are very different operations. I can tell you right now, with full access to the peer-reviewed literature, that if I'd asked my question in that meeting, roughly 180 out of 200 people would not have thought about it for more than four seconds. I can know this the way I know the earth is round. And it would still have taken considerable effort to hit unmute.⁷
This is the part where I want to talk about graveyards, but not in a morbid way. Or maybe in a slightly morbid way, since a little morbidity is probably appropriate when you're talking about the finite number of Tuesdays you're going to get.
Seneca, writing sometime around 49 AD (which is a long time ago to be making a point still applicable to people who attend Zoom meetings), observed something about the way we relate to time: we act as if we have an unlimited supply.⁸ He was writing to his friend Paulinus, and while the context was different (Roman political life, not life in the SaaS world), the observation has an uncomfortable staying power. We defer. We postpone. We tell ourselves "one day," which is a phrase with a peculiar grammatical status because it doesn't actually reference any specific day. "One day" is a placeholder for a commitment we haven't made, a date with no calendar entry, an intention with no architecture behind it. It's putting a sticky note on a sticky note.
And the reason "one day" is so seductive is that it lets you feel the warmth of the aspiration without the friction of the attempt.⁹ You get to imagine yourself as the person who would ask the hard question, who would submit the application, who would try the thing they've been circling for years, without ever having to confront the gap between that person and the person currently sitting in their chair with a question typed and deleted in the chat box. There is a whole life you can live inside the gap between "I should" and "I will." And it is a comfortable life.
Here's where it gets strange, though, and where the cognitive science takes a turn people don't expect. The social psychologist Elliot Aronson ran an experiment in 1966 at the University of Minnesota where he had participants listen to recordings of a person answering quiz questions.¹⁰ In one version, the person was clearly brilliant, getting 92% of the questions right. In another version, the person was average. Then, in some recordings, the brilliant person accidentally spilled coffee all over himself. Aronson wanted to know: does the blunder make the brilliant person less attractive?
The answer was the opposite. The brilliant person who spilled coffee was rated as more likable than the brilliant person who didn't. The mistake made him more human, more approachable, less intimidatingly perfect. Aronson called this the pratfall effect, and it turns the entire architecture of our fear inside out.¹¹ Because consider what this means: the thing we're most afraid of, the stumble, the dumb question, the visible imperfection, is often the exact thing that makes other people feel closer to us. We spend enormous energy trying to appear flawless, and the research suggests flawlessness is precisely what makes people feel distant from us. Spilled coffee is not the catastrophe. Spilled coffee is the connection.
I have a friend who wanted to learn guitar for eleven years.¹² Eleven years of saying "one day" and watching YouTube tutorials without picking up the instrument. Not because he lacked time or money or access to guitars. Because he was, as he eventually admitted to me over what was probably too many beers, afraid of sounding bad. Of being a thirty-eight-year-old man playing "Smoke on the Water" badly in his living room while his wife and teenage daughter could hear. Of the gap between the guitar player in his head and the guitar player he would actually be on day one, day ten, day one hundred. He started playing last March. He sounds, by his own admission, "terrible, but in a way I'm enjoying." Which struck me as one of the more honest self-assessments I've heard from another person.
The thing about being afraid to look stupid is it only works as a strategy if you're optimizing for the opinions of people who are (a) paying attention and (b) going to remember. And we've established that most people are doing neither of those things. The audience in your head, the one wincing when you mispronounce something or ask the obvious question or play a C chord with the wrong finger, is a projection. A sophisticated one, cobbled together from every cafeteria slight and parental correction and seventh-grade humiliation you've accumulated over a lifetime of being a self-conscious organism, but a projection nonetheless.¹³
Which is not to say there are no audiences. There are. Some people will laugh. Some colleague, somewhere, will think you asked a dumb question. Some person in the open-mic audience will feel secondhand embarrassment for you, and not even the generous kind. These people exist. They're just not the majority, and they're not the ones you should be calibrating your decisions around. Brené Brown borrowed a line from Theodore Roosevelt about this, the one about the man in the arena whose face is marred by dust and sweat.¹⁴ Roosevelt was talking about civic life, but the principle scales down: credit doesn't belong to the person who pointed out where the strong man stumbled.
This gets me thinking about a very specific arithmetic. Not the kind with numbers (though there are numbers, if you want them).¹⁵ The arithmetic of regret. Because when researchers study what people regret most at the ends of their lives, the answer is almost always the same: not the things they tried and failed at, but the things they never tried at all. The psychologist Thomas Gilovich has done extensive work on this, and his findings are clear enough to be uncomfortable. In the short term, we regret our actions: the email we shouldn't have sent, the joke that didn't land, the question that turned out to be, in fact, dumb. But over time, those action regrets fade. They get absorbed into the general background noise of being alive. The inaction regrets, though. The "I should have asked her out" and "I should have applied for that job" and "I should have said what I actually thought in that meeting." Those don't fade. They calcify.¹⁶
This might be the most quietly cruel thing about the fear of looking stupid: it poses as protector. It presents itself as the friend who pulls you back from the edge, who says "don't do that, you'll embarrass yourself, wait until you're ready." But readiness, in this context, is a fiction.¹⁷ There is no amount of preparation sufficient to guarantee you won't look foolish. There is no rehearsal thorough enough to eliminate the gap between intention and execution. The friend pulling you back from the edge isn't saving you from the fall. The friend is saving you from the climb back up.
Something shifted for me (and I want to resist the temptation to make this sound more dramatic than it was, because the truth is less cinematic and more the slow adjustment of a desk lamp's angle) when I started paying attention to the people I actually admired. Not the polished ones. Not the ones who seemed to move through professional and social life with the frictionless ease of someone who has never once typed a question into a chat box and then deleted it. The ones I admired, on closer inspection, were the ones who seemed willing to be bad at things in public. The colleague who asked clarifying questions in meetings without apology. The friend who started a business she didn't fully understand and talked about her confusion openly. The guy two seats down from me who just said the thing.
And this is where Aronson's research lands differently than the spotlight effect, because it's not just saying "nobody noticed your mistake." It's saying something more radical: the people who did notice your mistake may actually respect you more for it.¹⁸ The stumble is the thing your audience relates to. The unpolished attempt is the thing they remember warmly. We've been constructing elaborate armor against a threat that is, in many cases, its own antidote.
I asked a question in a meeting yesterday. A different meeting, smaller, but still public enough to trigger the old machinery. It wasn't a revelatory question. It wasn't a brave question. It was a question about whether a particular workflow applied to a particular set of accounts, and I asked it because I genuinely didn't know, and because the alternative was to spend forty-five more minutes pretending I did.
Nobody gasped. Nobody typed anything disparaging in a side channel (or if they did, I'll never know, which amounts to the same thing). The meeting moved on. The question was answered. The world continued to rotate at its customary speed.
And then, last Thursday, I did something harder. I sent a message to the guy from the all-hands. The one who asked the question I didn't. I'd drafted it eleven minutes after the meeting and then let it sit unsent for two weeks, which is a long time to hold a two-sentence thank-you note hostage because you're worried it'll seem weird. I finally hit send. His reply came back in under a minute. Four words.
"Ha, I almost didn't."
¹ I have encountered at least three people in my career whose titles I cannot parse. "Senior Director of Strategic Enablement" tells me approximately as much about someone's job as "Regional Manager of Vibes." ↩︎
² I should note that "grandiose" is doing a lot of work in this sentence. What I mean is: the fear was simultaneously enormous in its felt intensity and trivially small in its actual stakes. Nobody was going to fire me for asking a question. The worst plausible outcome was a three-second silence, which, while uncomfortable, is not a catastrophe by any reasonable metric. ↩︎
³ The research on this, pioneered by Thomas Gilovich, Victoria Husted Medvec, and Kenneth Savitsky at Cornell in 2000, has been replicated extensively. The core finding holds across contexts: we think we're being watched far more than we are. Their original paper is worth reading if you've ever felt even mildly self-conscious in public, which is to say: it is worth reading. ↩︎
⁴ There's a version of this realization that tips into nihilism, obviously: "Nobody's watching, so nothing matters." I don't think that's the right conclusion, but I understand the appeal. The better conclusion, or at least the one I'm trying to land on, is: "Nobody's watching, so the only opinion you're actually managing is your own." ↩︎
⁵ The gelotophobia research (yes, there is a formal term for the fear of being laughed at, because psychology names everything) suggests this fear sits at the intersection of shame sensitivity and social evaluation. In extreme cases it can be genuinely debilitating. For most of us, though, it just functions as a very effective brake pedal on the car we keep saying we want to drive. ↩︎
⁶ The research on the illusion of transparency, also by Gilovich and colleagues, is striking: speakers consistently overestimate how obviously nervous they appear to an audience. Even liars overestimate how detectable their lies are. We live in a state of perpetual false exposure. ↩︎
⁷ This is, I think, the central frustration of being a cognitively sophisticated animal: you can understand the bias, name the bias, cite the peer-reviewed literature on the bias, and still be fully subject to the bias. Knowing about the spotlight effect does not turn off the spotlight. It just means you're standing in it with a bibliography. ↩︎
⁸ The relevant Seneca quote, paraphrased because I can never remember the exact translation: "It is not that we have a short time to live, but that we waste a great deal of it." He was 53 when he wrote this, which feels about the right age to start noticing the math doesn't add up. ↩︎
⁹ This is related to what psychologists sometimes call attribute substitution: swapping a hard question (will I actually do this?) for an easier one (can I imagine myself doing this?). The answer to the second question is almost always yes, which is how you end up with a decade of "one day" and a closet full of unused equipment. ↩︎
¹⁰ Aronson's original 1966 paper is one of those beautiful artifacts of mid-century social psychology where the experimental design is so clean it almost reads as a parable. He went on to become one of the most cited psychologists of the twentieth century, which seems appropriate for someone whose most famous finding was about the hidden advantages of being imperfect. ↩︎
¹¹ I want to be precise about the pratfall effect's scope: it works best when the person is already perceived as competent. If you're perceived as mediocre, the blunder doesn't help. Which is, if you think about it, even more reason to take the risk, since most of us are more competent than we give ourselves credit for, and the blunder is what lets people see the human behind the competence. ↩︎
¹² His name is Mike, and he gave me permission to include this, which is more than I can say for my use of the term "too many beers," which he disputes. He says it was exactly the right number. ↩︎
¹³ The seventh-grade thing is specific to me, I realize, but I suspect most people have their own version. Details change. Architecture doesn't. ↩︎
¹⁴ Roosevelt delivered this speech at the Sorbonne in 1910. The full passage is longer and more complex than the excerpt usually quoted, but the part about the man in the arena has entered common usage for a reason: it clarifies something about the relationship between risk and judgment more efficiently than most philosophy. ↩︎
¹⁵ If you want the numbers: assume you have about 4,000 weeks of adult life (I'm borrowing this framing from Oliver Burkeman, who borrowed it from doing math). That's a lot of weeks, until you subtract the ones spent sleeping, commuting, and attending meetings about meetings. The remainder is smaller than you want it to be. ↩︎
¹⁶ Gilovich's research on regret distinguishes between "hot" regrets (recent, action-based, temporary) and "wistful" regrets (old, inaction-based, permanent). The wistful ones are the ones people mention on their deathbeds, which should probably tell us something about how to allocate our courage. ↩︎
¹⁷ I mean this quite literally. The idea of "being ready" contains a hidden assumption: there exists a state of preparation after which the thing becomes safe. This is true for, say, surgery. It is not true for asking a question in a meeting, sending a pitch email, trying a new skill, or any of the ten thousand small vulnerabilities ordinary life contains. For those, readiness is a moving goalpost you can chase forever without catching. ↩︎
¹⁸ This is the part of Aronson's work I find most unsettling, in a productive way. We assume the audience wants our perfection. The audience, it turns out, wants to see a coffee stain. ↩︎