Science 2026-03-20

52% of ADHD videos on TikTok contain misinformation, systematic review finds

A first-of-its-kind analysis across five major platforms reveals TikTok as the worst offender for inaccurate mental health content, while YouTube Kids stands out for near-zero misinformation rates

Social media has become the de facto mental health classroom for a generation. Teenagers and young adults now routinely turn to TikTok, Instagram, and YouTube not just for entertainment but for information about symptoms, diagnoses, and treatment options. For many, a 60-second video is the first step toward understanding what might be happening in their own minds. The question is whether that first step leads somewhere helpful or somewhere dangerous.

A new systematic review from the University of East Anglia provides the most comprehensive answer yet — and for TikTok in particular, the numbers are stark.

Five platforms, 5,000 posts, one pattern

Researchers at UEA's Norwich Medical School analyzed more than 5,000 social media posts about mental health across YouTube, TikTok, Facebook, Instagram, and X (formerly Twitter). The topics ranged widely: autism, ADHD, schizophrenia, bipolar disorder, depression, eating disorders, OCD, anxiety, and phobias. It is the first systematic review to examine mental health and neurodivergence content across multiple platforms simultaneously.

The headline finding: misinformation rates on social media reached as high as 56% for some topic-platform combinations. But the rates varied dramatically by platform, topic, and — most tellingly — by who created the content.

TikTok's algorithm and the misinformation machine

TikTok consistently scored worse than other platforms. Of the ADHD-related videos analyzed, 52% contained inaccurate information. For autism content, the figure was 41%. By comparison, YouTube averaged 22% misinformation across mental health topics, and Facebook came in under 15%.

The disparity has structural roots. TikTok's recommendation algorithm is engineered to surface content that generates rapid engagement — likes, shares, comments, rewatches. Accuracy is not a variable the algorithm weighs. Once a user shows interest in ADHD or anxiety content, the platform serves a stream of increasingly similar posts, creating what Alice Carter, who conducted the research for her doctoral thesis, described as echo chambers that reinforce false or exaggerated claims. Viral speed outpaces fact-checking.

The format itself compounds the problem. TikTok videos are short, visually driven, and optimized for emotional resonance rather than nuance. A 45-second clip asserting that certain everyday behaviors are signs of undiagnosed ADHD can rack up millions of views. A careful clinical explanation of what ADHD actually involves — its diagnostic criteria, its heterogeneity, the distinction between clinical symptoms and normal variation — doesn't fit the format as neatly.

The professional-amateur gap

The most striking finding may be the divide between professional and non-professional content creators. On TikTok, just 3% of videos made by healthcare professionals contained misinformation. Among non-professionals, the rate was 55%, roughly an 18-fold difference in misinformation rates.

The problem isn't that lived-experience content has no value. Personal stories help people feel understood and can raise awareness of conditions that carry stigma. But the review makes clear that personal experience, however sincere, is not a reliable substitute for clinical knowledge when it comes to describing symptoms, diagnostic criteria, or treatment options. And on TikTok, professional voices represent a tiny fraction of the mental health content ecosystem.

Eleanor Chatburn, a researcher at UEA's Norwich Medical School, put the concern in practical terms: when misleading content circulates at scale, it can lead to self-misdiagnosis, delayed clinical assessment for people who genuinely need help, increased stigma, and avoidance of evidence-based treatment. Someone who encounters a viral video claiming that a specific supplement cures anxiety may delay seeking therapy or medication that actually works.

YouTube Kids: moderation makes a measurable difference

One platform stood out for the right reasons. YouTube Kids, which operates under stricter content moderation rules than its parent platform, contained zero misinformation for anxiety and depression content and only 8.9% for ADHD. Standard YouTube, by contrast, was described as highly inconsistent — quality varied dramatically by channel, topic, and individual creator.

The comparison suggests that moderation works. When platforms invest in content review and enforce accuracy standards, misinformation rates drop. When they don't, the algorithmic drive toward engagement amplifies whatever content generates the strongest emotional response, regardless of its accuracy.

Pathologizing the ordinary, minimizing the serious

Misinformation about mental health cuts in two directions simultaneously. Some content pathologizes ordinary behavior — reframing normal forgetfulness as ADHD, shyness as social anxiety disorder, mood fluctuations as bipolar disorder. This drives unnecessary worry and can overwhelm clinical services with referrals that don't meet diagnostic thresholds.

Other content minimizes serious conditions — portraying schizophrenia through a lens of fear, trivializing eating disorders, or suggesting that depression can be resolved through willpower or lifestyle changes alone. Both distortions cause harm, and both circulate freely on platforms that treat mental health content no differently from cooking videos or dance trends.

The review found that neurodivergence content — particularly autism and ADHD — carried higher misinformation rates than many other mental health topics. This tracks with a broader cultural moment in which ADHD and autism have become major themes on social media, generating enormous engagement and, inevitably, enormous volumes of inaccurate content.

The call for clinicians to become creators

The researchers concluded with a recommendation that sounds simple but would require a significant shift in professional culture: health organizations and clinicians need to create and promote evidence-based content on the platforms where young people actually spend their time. The answer to bad information isn't just content moderation — it's competition. Accurate content needs to be as visible, as engaging, and as accessible as the misinformation it seeks to displace.

The team also called for improved content moderation by platforms, standardized tools for assessing the quality of online mental health information, and clearer consensus definitions of what constitutes misinformation in this domain.

Whether any of that happens depends on platform incentives, regulatory pressure, and whether clinical institutions are willing to meet audiences where they are rather than waiting for patients to arrive at a clinic door. In the meantime, the data is clear: for millions of young people, TikTok is a primary source of mental health information. And more than half of what they're seeing about ADHD isn't accurate.

Source: "The Quality of Mental Health and Neurodivergence-Related Information on Social Media: A Systematic Review," published in The Journal of Social Media Research. Research by Dr. Alice Carter and Dr. Eleanor Chatburn, Norwich Medical School, University of East Anglia. Systematic review of 5,000+ posts across YouTube, TikTok, Facebook, Instagram, and X.