Image by Raphael Nogueira

THE EDUCATION EDIT

Are TikTok and Instagram Science Videos Making Us Overconfident?

Have you ever watched a short, flashy science TikTok or Instagram Reel and thought, "Wow, I totally get this — I could probably explain it to anyone!" 🤯 Well, new research suggests that while these bite-sized videos are amazing for making scientific ideas accessible, they might also be making you a little... overconfident.


This study is a fantastic way to link the research methods you’re learning in class to a relevant, modern psychological issue: how we consume information online.


🔬 The Psychology of Feeling Smart

The research by Salzmann, Walther, and Kaspar (2025) dug into how simplifying complex scientific concepts for an online audience affects both understanding and confidence. They essentially wanted to know if 'easy' equals 'expert.'


The Method: What They Did

The researchers recruited 179 university students with no prior psychology knowledge and randomly split them into different groups (an Independent Groups Design). The core experiment manipulated two key things (our Independent Variables - IVs):

  1. Video Language: Participants watched four psychology videos presented in either:

     - Plain Language: easy-to-understand, everyday terms, or

     - Academic Language: more technical terminology (like a textbook).

  2. Bias Awareness: Half of all participants watched an extra video beforehand about the "easiness effect", a cognitive bias where people confuse ease of understanding with depth of knowledge, causing them to overestimate their expertise.


After watching the videos, the students completed self-report tasks to measure the Dependent Variables (DVs): how well they understood the content, their confidence in judging the accuracy of the research, and their likelihood of sharing or commenting on the videos.
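To make the 2×2 independent groups design concrete, here is a minimal Python sketch of how participants could be randomly allocated across the four language × warning conditions. This is purely illustrative: the function and variable names are invented, not the authors' actual procedure.

```python
import random

# The two IVs described in the study (labels are our own shorthand)
LANGUAGE_CONDITIONS = ["plain", "academic"]
WARNING_CONDITIONS = [True, False]  # saw the extra "easiness effect" video or not


def assign_groups(n_participants, seed=42):
    """Randomly allocate each participant to one of the four
    language x warning cells (an independent groups design)."""
    rng = random.Random(seed)
    cells = [(lang, warn)
             for lang in LANGUAGE_CONDITIONS
             for warn in WARNING_CONDITIONS]
    # Repeat the four cells until every participant is covered,
    # then shuffle so allocation to conditions is random
    allocation = (cells * (n_participants // len(cells) + 1))[:n_participants]
    rng.shuffle(allocation)
    return [{"id": i, "language": lang, "warning": warn}
            for i, (lang, warn) in enumerate(allocation)]


groups = assign_groups(179)  # the study's sample size
print(len(groups))  # 179 participants, spread almost evenly over 4 cells
```

Note that with 179 participants the four cells cannot be perfectly equal: one cell ends up with 44 participants and the other three with 45.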


Key Findings: The Overconfidence Trap

The results were a classic psychological paradox:


Better Understanding, Higher Confidence: Participants who watched the plain-language videos (the 'easy' ones) understood the content better overall. That's great! However, these same people also reported feeling significantly more confident in their ability to evaluate the research—even if they’d only just learned the basics.


The Warning Wasn't Enough: Crucially, simply being warned about the easiness effect beforehand didn't stop the overconfidence. This suggests that the cognitive bias is powerful and hard to suppress, even when we are consciously aware of it.


The Takeaway: False Sense of Expertise

Simplified science videos are vital for accessibility, but the study shows they can unintentionally give viewers a false sense of expertise. This is a critical insight for psychology—it highlights that making a concept easy to process doesn't mean people will then accurately evaluate it. We mistake the fluency of the information for the depth of our own knowledge.



🧠 Applying Research Methods: A Deeper Dive

Now put on your examiner's hat and break this study down using the research methods concepts you need for your A-level exams.


Analysing the Laboratory Experiment

The research by Salzmann et al. is an excellent example of a laboratory experiment designed specifically to establish a clear cause-and-effect relationship between how information is presented and our confidence levels. The key strength here is control.


To achieve this, the researchers carefully operationalised both the IVs and the DVs. For instance, the IV of video language was meticulously defined by having two distinct scripts (Plain vs. Academic) for the same content. Crucially, the DV of confidence wasn't vague; it was measured precisely using a quantitative self-report via a 7-point Likert-style scale (e.g., 1=Not at all confident to 7=Extremely confident in judging accuracy).
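Because the confidence DV is quantitative, it produces data you can summarise and compare between conditions. Here is a toy Python example of that idea; the ratings below are invented for illustration and are not the study's actual data.

```python
# Hypothetical 7-point Likert confidence ratings
# (1 = not at all confident, 7 = extremely confident)
plain_group = [6, 5, 7, 6, 5, 6, 7, 4]     # watched plain-language videos
academic_group = [4, 3, 5, 4, 4, 3, 5, 4]  # watched academic-language videos


def mean(ratings):
    """Average confidence rating for one condition."""
    return sum(ratings) / len(ratings)


print(mean(plain_group))     # higher average confidence
print(mean(academic_group))  # lower average confidence
```

A real analysis would then test whether a difference like this is statistically significant, but the point here is simply that a well-operationalised DV turns "confidence" into numbers that can be compared across groups.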


Because the study was conducted in a lab, extraneous variables were minimised. Factors like the content topic, video length, presentation style, and setting were held constant across all groups. This tight control allows us to be confident that the difference in confidence observed was caused by the manipulation of the IV (the language used), giving the study strong Internal Validity.


Validity: Internal vs. Ecological

However, the controlled lab setting that boosts internal validity often comes at a cost to Ecological Validity. When analysing the study, we must consider Mundane Realism. The participants weren't actually scrolling through a personalised, engaging TikTok feed; they sat in a controlled environment and were instructed to watch specific psychology videos.


This highly artificial setting raises questions: does the 'easiness effect' bias hold true when students are distracted, rapidly scrolling, and emotionally engaged in their natural online environment? The findings suggest the bias exists, but further field research would be needed to confirm this effect in a real-world context.


Sampling and Generalisability

The study used Opportunity Sampling, recruiting 179 university students available at the time. While this was convenient, it presents a limitation for Population Validity. University students, even those not studying psychology, may have higher baseline literacy and academic skills or higher general confidence levels than the wider population. The findings may not accurately generalise to older adults, or to younger, less academically engaged teenagers who form the core audience of platforms like TikTok. This means we must be cautious when generalising the conclusions about social media usage across the entire population.


💡 Reflection

This study is a great example of applying research methods to real-world psychology. It shows how design choices, sampling, and measurement all influence the conclusions we can draw. It also highlights the importance of critical thinking—just because something is easy to understand doesn't mean you fully understand it!


