The Noisy Room

A story about common confusion

It began with a study.

In December 2025, Stanford researchers analyzed 2.2 billion social media posts looking for a pattern. They wanted to know what percentage of users had posted severely toxic content: not rudeness, not sarcasm, but speech so hateful that 90% of people would flag it as problematic.1

With this data in hand, they then asked thousands of people to answer a simple question:

Take a guess: what percentage of social media users do you think post severely toxic content?

The Bar

Imagine walking into a bar with 100 people. Three of them are screaming about politics, about each other, about nothing. But the bouncer, who gets paid based on how long you stand there staring, has wired those three into the sound system and turned it up to ten.

You walk in, hear the roar, and conclude: this place is full of lunatics. Never hearing the 97 people having normal conversations a few feet away.

This is social media. The bouncer is an algorithm. And whether you like it or not, you've been a bystander.

Pick a contentious topic. This is what your feed might look like.

Reading this feed, you might reasonably conclude that the country is split between unhinged extremes. It is not. And the gap between what Americans actually believe and what the feed suggests they believe may be the most consequential thing platforms are failing to show you.

See the Room

Let's scale a hypothetical social media platform down to a single room with 100 people inside. This is what it looks like:

[Visualization] The actual room: 97 regular users, and 3 who have ever posted severely toxic content. On most platforms, ~3% of accounts produce about 1/3 of all content, and engagement ranking amplifies high-reaction content from that prolific few into your feed.

This pattern repeats across platforms. On Twitter/X, toxic tweets receive ~86% more retweets and ~27% more visibility than non-toxic ones; 0.3% of users shared 80% of all contested news;14 and just 6% of users produce roughly 73% of all political tweets.16 On TikTok, 25% of users produce 98% of all public videos.15 The specific numbers vary. The dynamic is the same: a small minority of highly active users overwhelms the majority.
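To make that concentration concrete, here is a minimal Python sketch of how one could measure what share of all posts the most active slice of accounts produces. The function and the account counts are hypothetical, chosen only to mirror the ~3%-produce-a-third pattern above:

```python
def top_share(post_counts: list[int], top_frac: float) -> float:
    """Fraction of all posts produced by the most active `top_frac` of accounts."""
    counts = sorted(post_counts, reverse=True)
    k = max(1, round(len(counts) * top_frac))  # how many accounts make up the top slice
    return sum(counts[:k]) / sum(counts)

# A hypothetical room of 100 accounts: 3 prolific posters and 97 occasional ones.
room = [95, 97, 99] + [6] * 97
print(round(top_share(room, 0.03), 2))  # → 0.33: the top 3% produce a third of the posts
```

Run against real per-account post counts, the same three lines would reproduce the concentration figures cited above.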

After enough time consuming content in this room, your brain performs a kind of ambient demography. The feed becomes a sort of census. You conclude — logically — that the behavior must be widespread. The room might just be full of extreme people! Maybe most people really do believe these things.

This is not just about what we see on social media

If this were just about the tone of our posts, it wouldn't matter much. But the distortion drives some genuinely harmful patterns of behavior.

Pattern 1 The Majority Goes Silent

When people in the majority look at the feed and assume they're outnumbered, they often self-censor.3 The dynamic replicates on social media:17 fear of social isolation suppresses opinion expression on platforms where it's perceived to be unwelcome. People go quiet, or leave the platform entirely. They cede the space to users with more extreme politics.

Pattern 2 The Loud Minority Thinks It's the Majority

The minority who post aggressively develop a distortion of their own: they come to believe they are the majority.5

A study of 17 extremist forums found exactly this pattern: the more someone posted, the more they believed the public agreed with them. Heavier participation bred false consensus.

Pattern 3 Everyone Gets Each Other Wrong

Both sides develop wildly inaccurate beliefs about who the other side actually is.6 See how some of your own beliefs line up:

What percentage of Democratic supporters do you think are LGBTQ?

What percentage of Republican supporters do you think earn over $250,000 a year?

The distortion extends to policy beliefs. Step through to see the perception gap on the issue of immigration.

Source: More in Common (2019) & Moore-Berg et al., PNAS 2020. Illustrative.
Pattern 4 Politicians Follow the Perceived Room, Not the Real One

Elected officials are very good at sensing political sentiment. It's literally their job. (They are not elected to correct people's beliefs.)

Politicians who can build a coalition around a perceived belief are more likely to win. They position themselves against an opponent who doesn't exist, but whom their supporters believe exists.

And remember: most of our politics now happens on social media. Candidates read the same distorted feed as everyone else, and they have little reason to doubt it.

The window of discourse shifts. Not because opinions changed, but because perceptions of opinions did.

Pattern 5 Misperception Turns into Hostility

When you believe the other side is extreme, you become more willing to treat them as a threat.7

Both Democrats and Republicans vastly overestimate how many on the other side support political violence. The result is a populace primed to assume the other side is ready to do horrible things.

"What percentage of the other side supports political violence?"

Democrats estimate that 35.5% of Republicans support political violence, roughly 3.4× the actual figure.
Republicans estimate that 37.1% of Democrats support political violence, roughly 4.0× the actual figure.
Both sides were wrong by 3 to 4 times. When researchers corrected these beliefs, partisan hostility dropped.

Each step feeds the next. The distortion is self-reinforcing.

Knowing Isn't Enough

Okay. So now you know that a small minority dominates the feed.

You know that Republicans and Democrats actually have a far more nuanced set of opinions about contested issues.

Does that fix it? Not really. You also know that everyone else doesn't know it. And as long as the world keeps operating as if the distortion were real, it's rational for you to act as if it were too — even though you know it's wrong. The room hasn't changed, even if you know the people inside it are confused.

This is called a common knowledge problem.

Private knowledge
You've read the stat. But you have no idea who else has. The feed still looks the same. You still assume you're outnumbered. You stay quiet.

Steven Pinker lays this out cleanly in his excellent recent book When Everyone Knows That Everyone Knows.8 Learning a fact changes what you know. Seeing it displayed publicly, where you know everyone else can see it too, changes what everyone knows, and in turn how everyone acts.

Social media has no public square. It has 300 million private windows, each showing a different distortion of the same room. Making our shared beliefs visible to everyone at once could radically change that.

The Idea

So what can we do about this?

Fortunately, there's some good evidence showing how it can be fixed. Multiple studies show that when misperceptions are corrected in a public way, hostility drops. Mernyk et al. found that a single correction reduced partisan hostility for a full month.7 Lee et al. found that correcting overestimates of toxic users improved how people felt about their country and each other.1

We can do this today.

Imagine every post on a contested topic had a quiet link beneath it. Not a fact check, a label, or a warning. Instead — what if it had a Community Check?

How do people actually feel about this?


A Community Check is an open-source design layer that could be deployed across social media, beneath contentious posts, to help users understand how other people on the platform (or the nation) actually feel about an issue.

It is a way of quickly adding context to the most hot-button viral issues, giving people more visibility into the opinions of the public.

The Idea in Action

Let's explore this intervention with a topic that cuts across political identity:

Money in Politics

On the surface, this seems contentious. But it's actually a supermajority issue: 81% are concerned about the influence of money on elections, including 78% of Republicans and 90% of Democrats. 75% say unlimited spending weakens democracy. Only 15% believe unlimited political spending is protected free speech.

And yet, very little changes, largely because everyone assumes the other side is fine with it. The feed is full of people defending their team's donors and attacking the other team's. It might look like a 50/50 partisan battle, but it's not. It's a majority consensus that cannot see itself.

What if you could see this consensus?

@real_talk_politics · 2h
Everyone complains about money in politics but the second their candidate gets a massive donation they shut up real fast. You don't hate money in politics. You hate when the OTHER side has more of it.
♡ 11,847 💬 6,203 ↻ 2,891

Community Check draws from a random sample of platform users + robust national polls, surveyed independently of the content. The sample is statistically representative. The results update continuously. And critically: everyone sees the same numbers.
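As a sketch of what blending a platform sample with national polls could mean in code, here is one deliberately simple rule: a sample-size-weighted average. The class name, field names, weighting scheme, and sample sizes are illustrative assumptions, not part of any published spec; the 81% and the hypothetical platform figure echo the money-in-politics numbers above.

```python
from dataclasses import dataclass

@dataclass
class Estimate:
    share_agree: float  # proportion agreeing with the statement, 0..1
    n: int              # sample size behind the estimate

def blend(platform: Estimate, national: Estimate) -> float:
    """Sample-size-weighted average of a platform sample and a national poll.

    Larger samples count for more, so a 10,000-person national poll
    dominates a 500-user platform sample.
    """
    total_n = platform.n + national.n
    return (platform.share_agree * platform.n
            + national.share_agree * national.n) / total_n

# Hypothetical inputs for "concerned about the influence of money on elections":
platform_sample = Estimate(share_agree=0.77, n=500)   # platform's own random sample
national_poll = Estimate(share_agree=0.81, n=10_000)  # independent national survey
print(f"{blend(platform_sample, national_poll):.0%}")  # → 81%
```

A real deployment would need proper survey weighting and confidence intervals, but even this toy rule shows the key property: the displayed number is anchored to representative samples, not to whoever happened to engage with the post.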

Why This Isn't Fact Checking or Audience Polling

Traditional fact-checking is top-down, and it often feels like dictation from above. That's hard for people to stomach, and content moderation has for years been perceived as removing speech. A Community Check removes nothing; it simply adds context, much like the crowdsourced Community Notes feature (an inspiration for this project).

Nor is this just a user poll under a post. It draws from a random sample of all platform users, coupled with statistically representative national surveys. It's an actual window into the views of the majority, not just the views of whoever happens to be looking at the post.

It Works for Video Too

Short-form video is the fastest-growing vector for political distortion. The same dynamic applies — a small minority of creators produce the vast majority of political content — but video bypasses the pause that text gives you. Community Check can adapt. Tap through to see how.

"Money IS free speech. Deal with it. 🇺🇸 Citizens United was CORRECT"
@liberty_caucus_tv · #FreeSpeech #CitizensUnited 🔥
♡ 5,247 💬 612 ↻ 1,742
A political video crosses the engagement threshold. 51K views, 612 comments (1.2%), 1.7K shares (3.4%). The feed shows outrage. But what do people actually think?
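The engagement-threshold step could be expressed as a simple trigger. The function and every threshold below are illustrative placeholders, not values from the spec; the example numbers are the video's from the walkthrough above:

```python
def needs_community_check(views: int, comments: int, shares: int,
                          min_views: int = 50_000,
                          min_comment_rate: float = 0.01,
                          min_share_rate: float = 0.03) -> bool:
    """Trigger a Community Check once a post is both widely seen and
    unusually reactive. All thresholds are illustrative placeholders."""
    if views < min_views:
        return False  # not widely seen yet; no check needed
    return (comments / views >= min_comment_rate
            or shares / views >= min_share_rate)

# The video above: 51K views, 612 comments (1.2%), 1,742 shares (3.4%).
print(needs_community_check(51_000, 612, 1_742))  # → True
```

Gating on reach plus reaction rate keeps the check off ordinary posts and attaches it only where the distortion is actually being amplified.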

See technical specs for how it works below ↓

We Could Do This Now

Platforms already have a lot of these capabilities. They already survey users. They even know how to run sophisticated polls. There are a few technical details to work out (spec here), but this is not a hard problem to solve.

The unseen majority is the public. And the public deserves to know itself.

A tiny minority, dominating the feed. That's all it ever was. The rest of us were here the whole time, quiet and decent and waiting to be seen.

Follow my other work here

See FAQ See technical specifications

Community Check is a free and open specification.

The complete technical spec, research base, and open questions are published for researchers, engineers, and platform designers to stress-test and build on. Please steal it with attribution.

View on GitHub
References
1 Lee, Neumann, Zaki & Hancock, "Americans overestimate how many social media users post harmful content," PNAS Nexus, 4(12), 2025. n=1,090. Benchmark: Kumar et al., "Understanding the Behaviors of Toxic Accounts on Reddit," WWW '23, 2023. 3.1% of accounts produced 33.3% of all comments.
2 Grinberg et al., "Fake news on Twitter during the 2016 U.S. presidential election," Science, 363(6425), 2019. 0.1% of users accounted for nearly 80% of contested news sources shared.
3 Noelle-Neumann, "The Spiral of Silence," J. Communication, 24(2), 1974.
4 Hampton et al., "Social Media and the 'Spiral of Silence'," Pew Research, 2014.
5 Wojcieszak, "False Consensus Goes Online," Public Opinion Quarterly, 72(4), 2008.
6 Ahler & Sood, "The Parties in Our Heads," J. Politics, 80(3), 2018. 342% overestimate.
7 Mernyk et al., "Correcting Inaccurate Metaperceptions," PNAS, 119(16), 2022. n=4,741. Effects lasted 1 month.
8 Pinker, When Everyone Knows That Everyone Knows, Simon & Schuster, 2025.
10 Moore-Berg, Ankori-Karlinsky, Hameiri & Bruneau, "Exaggerated meta-perceptions predict intergroup hostility between American political partisans," PNAS, 117(26), 2020. ~80% of both parties overestimated opposing party hostility by 50-300%. See also: "America's Divided Mind," Beyond Conflict, 2020.
11 Sparkman, Geiger & Weber, "Americans experience a false social reality by underestimating popular climate policy support by nearly half," Nature Communications, 13, 4779, 2022. n=6,119. 80% of Americans support siting renewables locally; perceived support: 43%.
12 Yudkin, Hawkins & Dixon, "The Perception Gap," More in Common, 2019. n=2,100 via YouGov. Average overestimation of opposing party's extreme views: ~55% estimated vs ~30% actual.
13 More in Common, "Americans' Environmental Blind Spot," 2022. 73% of Republicans support U.S. clean energy leadership; Republicans estimate only 33% of their own party agrees.
14 Baribi-Bartov, Munger & Pan, "Supersharers of fake news on Twitter," Science, 384(6700), 2024. 0.3% of users shared 80% of contested news during the 2020 U.S. election.
15 Pew Research Center, "How U.S. Adults Use TikTok," 2024. 25% of users produce 98% of all public videos.
16 Bail, Breaking the Social Media Prism, Princeton University Press, 2021; Pew Research Center, 2021. 6% of U.S. Twitter users produce ~73% of all political tweets.
17 Oz, Shahin & Greeves, "Platform affordances and spiral of silence: How perceived differences between Facebook and Twitter influence opinion expression online," Technology in Society, 76, 2024. Fear of social isolation suppresses opinion expression on platforms where it's perceived to be unwelcome — confirming the spiral-of-silence dynamic operates on social media, with platform-specific affordances (network association, anonymity, social presence) moderating the effect.

Common Questions

Technical Specification

How Community Check would work in practice, from data sources to platform integration.