Eighteen Million Americans Present at Mass Shootings?
The math on mass shootings doesn’t add up.
Publication
Pyrooz, D. C., Densley, J. A., & Peterson, J. K. (2025). Direct exposure to mass shootings among US adults. JAMA Network Open, 8(3), e250283. https://doi.org/10.1001/jamanetworkopen.2025.0283
What Was the Question?
The authors set out to answer a basic question: how many Americans have been directly exposed to a mass shooting? While previous surveys have captured indirect exposure (hearing about shootings, knowing someone affected), little was known about how many people were actually present during one of these events or physically injured. The study also sought to identify which sociodemographic groups are most at risk.
How Did They Look at It?
In January 2024, Pyrooz and colleagues used YouGov to administer a nationally representative online survey of 10,000 adults (18 years or older). The survey defined a mass shooting as a “gun-related crime where four or more people are shot in a public space.” Respondents were asked whether they had ever been physically present during such an event, and if so, whether they were injured. The authors used weighted multivariable logistic regression to estimate prevalence and identify demographic correlates such as age, gender, race, income, and education.
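The paper does not publish code, but a minimal sketch of that kind of analysis might look like the following. Everything here is an assumption for illustration: the data are synthetic, the effect sizes are invented, and statsmodels' `freq_weights` stands in for a full design-based survey weighting.

```python
# A minimal sketch of weighted logistic regression on survey data.
# Not the authors' code: data, variable names, and rates are invented.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 10_000  # matches the survey's sample size

# One illustrative covariate: 1 = younger respondent, 0 = older.
younger = rng.integers(0, 2, n)
p_true = np.where(younger == 1, 0.10, 0.04)  # assumed exposure rates
present = rng.binomial(1, p_true)            # "physically present" outcome
weights = rng.uniform(0.5, 2.0, n)           # stand-in for YouGov survey weights

# Weighted logistic regression of exposure on the covariate.
X = sm.add_constant(younger.astype(float))
fit = sm.GLM(present, X, family=sm.families.Binomial(),
             freq_weights=weights).fit()
print(fit.summary())

# Weighted prevalence estimate.
print("weighted prevalence:", np.average(present, weights=weights))
```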
What Did They Find?
Roughly 7% of U.S. adults—about one in fifteen—reported having been physically present at a mass shooting. About 2% reported being physically injured, whether from being shot, trampled, or otherwise hurt. More than half of these incidents occurred since 2015, and three-quarters took place in the respondents’ own communities. Neighborhoods, rather than schools or workplaces, were the most common settings.
Younger people (especially Millennials and Gen Z), males, and Black respondents were more likely to report direct exposure. Asian respondents were less likely. There were no significant differences by income or education, suggesting mass shooting exposure cuts across socioeconomic boundaries.
So What?
The authors argue that their findings reveal mass shootings to be a pervasive public health issue rather than a rare criminal phenomenon. They highlight that direct exposure—being physically present or injured—is far more common than previously understood, affecting roughly one in fifteen adults. This, they suggest, means that the social and psychological toll of mass shootings extends deeply into American communities, not just into the headlines.
They interpret the demographic differences—greater exposure among younger generations, males, and Black respondents—as evidence of unequal risk across society, requiring targeted public health interventions. At the same time, because exposure did not vary by income or education, the authors contend that mass shootings cut across socioeconomic lines and should be treated as a widespread societal concern. Ultimately, they frame their study as a call for comprehensive prevention and post-incident support strategies, emphasizing that mass shootings must be addressed as a population-level health problem rather than an isolated law enforcement issue.
My $.02
Let’s do a little back-of-the-napkin math.
The authors report that 6.95% of U.S. adults—roughly one in fifteen—have been physically present during a mass shooting. Using the U.S. Census Bureau’s 2024 estimate of about 260 million adults, that translates to roughly:
260,000,000 × 0.0695 = 18,070,000
So, by their numbers, about 18 million people have been directly present at a mass shooting.
Now, according to the Gun Violence Archive, there have been roughly 4,900 mass shootings in the U.S. over the past decade, depending on the exact definition and cut-off date. Divide one by the other:
18,000,000 ÷ 4,900 ≈ 3,673
That's over 3,600 people per mass shooting.
If we stretch the time window to the last 20 years, that number goes down, but these events have been increasing in frequency, not decreasing, so expanding the timeframe won't come anywhere near halving the number per event. Still, this is back-of-the-napkin math, so let's do it anyway and call it 1,800 people per event over the last 20 years.
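Here is the same napkin math as a short Python sketch, using the rounded figures quoted above:

```python
# Back-of-the-napkin check in code. All inputs are the rounded
# figures quoted above, not exact values.
adults = 260_000_000    # ~2024 Census estimate of US adults
prevalence = 0.0695     # reported "physically present" rate
exposed = adults * prevalence
print(f"implied adults present at a mass shooting: {exposed:,.0f}")  # ~18.1M

shootings_10yr = 4_900  # Gun Violence Archive, roughly the past decade
print(f"implied presence per event (10 yr): {exposed / shootings_10yr:,.0f}")  # ~3,700

# Generously doubling the event count for a 20-year window, even though
# rising frequency means the true count would be well under double:
shootings_20yr = 2 * shootings_10yr
print(f"implied presence per event (20 yr): {exposed / shootings_20yr:,.0f}")  # ~1,800
```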
To put that in context, the 2017 Las Vegas Route 91 Harvest Festival shooting, the deadliest in modern U.S. history, had around 22,000 people present at the concert and hundreds or perhaps thousands more on Las Vegas Boulevard (LVMPD Report). Most incidents, however, involve at most dozens of people, not thousands. If 1,800 people were physically present at the average mass shooting, every event would draw a crowd the size of a mid-size concert or sporting event, which simply doesn't match reality.
So the estimates of how many people were present are, to say the least, implausible.
The most plausible explanation for the mismatch between the survey’s estimates and the known number of mass shootings is the survey method itself. Measuring rare, emotionally charged events with self-report surveys is a recipe for error. People often fold more common gun violence into the category of “mass shooting,” especially when definitions are broad or unfamiliar. Memory distortions can also inflate what someone believes they witnessed. Add in demand characteristics and simple courtesy bias—respondents wanting to be helpful or to endorse the premise of the survey—and the numbers can drift far from reality. When all of this sits on top of an opt-in online panel, inflation becomes almost guaranteed.
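To see how little it takes, here is a toy simulation. Both rates are invented for illustration: a true presence rate of 0.2% and a combined misreport rate of 7%.

```python
# A toy simulation of how a small false-positive rate swamps a rare event.
# The "true" prevalence and misreport rate are assumptions, not estimates
# from the study.
import numpy as np

rng = np.random.default_rng(42)
n = 10_000               # same order as the survey's sample
true_prevalence = 0.002  # assume 0.2% of adults were actually present
false_positive = 0.07    # assume 7% of everyone else misreports "yes"
                         # (misclassification + recall error + courtesy bias)

actually_present = rng.random(n) < true_prevalence
says_yes = actually_present | (rng.random(n) < false_positive)

print(f"true prevalence:     {actually_present.mean():.2%}")  # ~0.2%
print(f"surveyed prevalence: {says_yes.mean():.2%}")          # ~7%, mostly noise
```

Under those assumptions, the survey's estimate is generated almost entirely by the error rate; the event being measured barely registers.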
Lott and Moody’s (2025) follow-up critique of the Pyrooz et al. study makes this point even clearer. Using their own survey, Lott and Moody found that 7.8 percent of respondents said they had been present at a mass shooting, almost identical to Pyrooz et al.’s estimate. The symmetry is the giveaway. Two independent surveys producing the same result would normally strengthen confidence, but in this case it signals that the method itself is generating a repeatable error. The estimate is impossible when compared to incident-based data, which means the surveys are systematically mismeasuring the phenomenon.
To their credit, Lott and Moody also acknowledge that their own estimate cannot be accurate. Put simply, there have not been enough mass shootings in the United States to produce tens of millions of directly exposed adults, let alone millions of injuries. Their critique underscores the central issue: these surveys are capturing something, but it is not literal presence at mass shootings.
The broader takeaway is that survey data can highlight lived experience, but they can also badly overshoot when respondents misclassify or overstate events. Better definitions, cross-checks with verified incident data, and survey instruments built to reduce recall and demand biases are needed. Until then, estimates of direct mass shooting exposure based on broad self-report surveys are likely to keep producing dramatic but untenable numbers.

This is a common error when using surveys to measure rare events. Here is a Chance article from almost 30 years ago making a similar critique of Gary Kleck's work on defensive gun use: https://sites.stat.columbia.edu/gelman/surveys.course/Hemenway1997.pdf
I think I originally saw the idea from Noah Smith, but you can ask basically any crazy question on a survey and roughly 1 in 12 people (around 8%) will answer in the affirmative. Ask "have you been abducted by aliens?" and you will get people who say yes. So anything rarer than about 1 in 10 people is quite hard to estimate from surveys in practice.
Part of it is people answering junk (some surveys, like Monitoring the Future, include fake drug questions to catch kids who just say yes to everything). Some of it is bad recall and poor reading comprehension (not as a dig, just as a fact of life: you should not assume college-level reading ability for any survey).
Just to add to this, FBI UCR data show fewer than 11,000 firearm homicides annually; call it a quarter million over 20 years. Even if every one of those homicides were part of a "mass shooting," that puts about 80 eyewitnesses per victim. And if a "mass shooting" means four or more victims, that's 320 or more witnesses at each one.
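That arithmetic checks out; here it is in code, using the comment's own cited figures:

```python
# Reproducing the comment's arithmetic with its own cited figures.
firearm_homicides_per_year = 11_000
victims_20yr = firearm_homicides_per_year * 20  # 220,000; "call it a quarter million"
implied_present = 18_000_000                    # from the survey estimate above

per_victim = implied_present / victims_20yr
print(f"witnesses per victim: {per_victim:.0f}")              # ~82, i.e. ~80
# With at least four victims per "mass shooting":
print(f"witnesses per 4-victim event: {4 * per_victim:.0f}")  # ~327, i.e. 320+
```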
The same groups that rejected defensive gun use (DGU) survey data from Gary Kleck and William English as an unreliable indicator are now championing this survey.