Recently there was a bit of a panic in the media regarding a very common item in kitchens all around the world: black plastic utensils used for flipping, scooping, and otherwise handling our food while preparing culinary delights. The claim was that the recycled plastic used in many of these utensils leaches a nasty flame-retardant chemical, decabromodiphenyl ether (BDE-209), at a rate that would bring users dangerously close to the maximum allowed intake limit for humans. The claim, however, was incorrect: the researchers behind the original study got their calculation of that intake limit wrong by a factor of ten.
This recent example is emblematic of how a simple mistake, combined with a reluctance to validate conclusions, can send successive consumers down a game of telephone: the original text may already be wrong, each node passes it along unchecked, and suddenly everyone knows that using certain kitchen utensils, microwaving dishes, or adding that one thing to your food is pretty much guaranteed to kill you.
How does one go about defending oneself from becoming an unwitting factor in creating and propagating misinformation?
Making Mistakes Is Human
We all make mistakes, as none of us is perfect. Our memory is lossy, our focus drifts, and one momentary glitch is all it takes to make that typo, forget to carry the one, or miss that one crucial moment on the road. As a result we have invented many ways to compensate for our flawed brains, much of it centered around double-checking and peer validation, along with increasingly automated means to keep an operator focused and to intervene when said operator does not act in time.
The error in the black plastic utensils study appears to be an innocent mistake that didn’t get caught before publication. Media publications, rushing to get that click-worthy scoop written up, then likely assumed that the original authors and the peer-review process had caught any major mistakes. Unfortunately the original study by Megan Liu et al. in Chemosphere listed the BDE-209 reference dose for a 60 kg adult as 42,000 ng/day, when the reference dose per kg of body weight is 7,000 ng.
It doesn’t take a genius to see that 60 times 7,000 makes 420,000 ng/day, and as it’s at the core of the conclusion being drawn, it ought to have been checked and double-checked alongside the calculated daily intake from contaminated cooking utensils at 34,700 ng/day. This ‘miscalculation’ as per the authors changed the impact from a solid 80% of the reference dose to not even 10%, putting it closer to the daily intake from other sources like dust. One factor that also played a role here, as pointed out by Joseph Brean in the earlier linked National Post article, is that the authors used nanograms, when micrograms would have sufficed and cut three redundant zeroes off each value.
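The arithmetic at the heart of the correction is simple enough to check in a few lines. This sketch (values taken from the article; variable names are my own) shows how the published limit of 42,000 ng/day differs from the correct one by a factor of ten, and how working in micrograms would have trimmed the three redundant zeroes:

```python
# Reference dose for BDE-209: 7,000 ng per kg of body weight per day
REFERENCE_DOSE_NG_PER_KG = 7_000
BODY_WEIGHT_KG = 60

# Correct daily limit for a 60 kg adult
daily_limit_ng = REFERENCE_DOSE_NG_PER_KG * BODY_WEIGHT_KG  # 420,000 ng/day
published_limit_ng = 42_000  # the (erroneous) figure used in the study

# Estimated daily intake from contaminated utensils, per the study
intake_ng = 34_700

# Share of the reference dose: ~80% with the wrong limit, under 10% with the right one
wrong_fraction = intake_ng / published_limit_ng  # ~0.83
correct_fraction = intake_ng / daily_limit_ng    # ~0.083

# Expressing values in micrograms would have cut three zeroes off each figure
intake_ug = intake_ng / 1_000  # 34.7 µg/day
```

A double-check this trivial is exactly the kind of validation that should have happened before the conclusion was drawn.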
Of note with the (human) brain is that error detection and correction are an integral part of learning, and this process can be readily detected with an EEG scan as an event-related potential (ERP), specifically an error-related negativity (ERN). This is something that we consciously experience as well, such as when we perform an action like typing some text and before we have a chance to re-read what we wrote we already know that we made a mistake. Other common examples include being aware of misspeaking even as the words leave your mouth and that sense of dread before an action you’re performing doesn’t quite work out as expected.
An interesting case study here involves these ERNs in the human medial frontal cortex, as published in Neuron back in 2018 by Zhongzheng Fu et al. (with a related Cedars-Sinai article). In this experimental setup volunteers were monitored via EEG as they were challenged with a Stroop task. During this task the self-monitoring of errors plays a major role, as saying the word competes with saying the color, a struggle that’s visible in the EEG and that shows the active error-correcting neurons to be located in regions like the dorsal anterior cingulate cortex (dACC). A good explanation can be found in this Frontiers for Young Minds article.
The ERN signal strength changes with age, becoming stronger as our brain grows and develops, including pertinent regions like the cingulate cortex. Yet as helpful as this mechanism is, mistakes will inevitably slip through, which is why proofreading a text requires a fresh pair of eyes, ideally a pair not belonging to the person who originally wrote said text, as they may be biased to pass over their own mistakes.
Cognitive Biases
Although there is at this point no evidence to support the hypothesis that we are just brains in jars gently sloshing about in cerebrospinal fluid as sentient robots feed said brains a simulated reality, effectively this isn’t so far removed from the truth. Safely nestled inside our skulls, we can only obtain a heavily filtered interpretation of the world around us via our senses, each of which throws away significant amounts of data (in e.g. the retina) before the remaining data percolates through their respective cortices and subsequent neural networks, until whatever information is left seeps up into the neocortex where our consciousness resides as a somewhat haphazard integration of data streams.
Along the way there are countless (subconscious) processes that can affect how we consciously experience this information seepage. These are collectively called ‘cognitive biases’, and include common types like confirmation bias. This bias is especially prevalent, as humans appear to be strongly inclined to seek out confirmation of existing beliefs rather than narratives that may challenge said beliefs.
Unsurprisingly, examples of confirmation bias are everywhere, ranging from the subtle (e.g. overconfidence and faulty reasoning when diagnosing a defect) to the extreme, such as dogmatic beliefs affecting large groups where any challenge to the faulty belief is met by equally extreme responses. Common examples here are anti-vaccination beliefs – where people will readily believe that vaccines cause everything from cancer to autism – and anti-radiation beliefs, which range from insisting that electromagnetic radiation from powerlines, microwave ovens, WiFi, etc. is harmful, to believing various unfounded claims about nuclear power and the hazards of ionizing radiation.
In the case of our black plastic kitchen utensils some people in the audience likely already had a pre-existing bias towards believing that plastic cooking utensils are somehow bad, and for whom the faulty calculation thus confirmed this bias. They would have had little cause to validate the claim and happily shared it on their social media accounts and email lists as an irrefutable fact, resulting in many of these spatulas and friends finding themselves tossed into the bin in a blind panic.
Trust But Verify
Obviously you cannot go through each moment of the day validating every single piece of information that comes your way. The key here is to validate and verify where it matters. After reading such an alarmist article about cooking utensils in your local purveyor of journalistic integrity and/or social media, it behooves you to investigate these claims and possibly even run the numbers yourself, before making your way over to the kitchen to forcefully rip all of those claimed carriers of cancer seeds out of their respective drawers and hurl them into the trash bin.
The same kind of due diligence is important when a single, likely biased source makes a particular claim. Especially in this era where post-truth often trumps intellectualism, it’s important to take a step back when a claim is made and consider it in a broader context. While this miscalculation with flame-retardant levels in black kitchen utensils won’t have much of an impact on society, the many cases of clear cognitive bias in daily life as well as their exploitation by the unscrupulous brings to mind Carl Sagan’s fears about a ‘celebration of ignorance’ as expressed in his 1995 book The Demon-Haunted World: Science as a Candle in the Dark.
With a populace primed to respond to every emotionally-charged sound bite, we need these candles more than ever.
Statistics show that 5 out of 4 researchers have denied that their data was wrong.
Dang it, the black plastic spatula already went into landfill.
Fortunately I have a red plastic one. Surely it’s much safer.
It’s certainly faster!
You need the orange utensils, the original ones from the ’70s, in “cadmium orange” ;)
Rather than a confirmation bias against black kitchen utensils, I suspect the issue is the bias caused by the very clear financial and status rewards for study authors towards writing exciting new discoveries. And of journals towards publishing said “discoveries”. And of the media towards publicising them.
No researcher, journal, or paper (tabloid or other) ever got rich from publishing: “totally expected: researchers find everyday item no one thought is dangerous is in fact not dangerous”.
To be fair, some researchers do get paid to confirm expected, non-revolutionary results, but not many, and none get famous for it.
And if you lie the right way you get pretty damn (in)famous.
Like https://en.wikipedia.org/wiki/Andrew_Wakefield?useskin=vector
Calculation mistakes have been around for ages. Early on, someone calculated the iron content of spinach but screwed up the calculation and credited spinach with 10x more iron than it actually has. Spinach-loving Popeye was born from that mistake.
Coffee bad, coffee good. Aspirin bad, aspirin good, eggs bad, eggs good. Endless flip-flop-flips. You can often manipulate stats to get the outcome you like. Mistakes of course just add to the confusion. So it goes. Now spatulas. All in the name of science… which has a habit of becoming ‘policy’/directives by governments…
I myself have been known to pore over the coffee research and waffle on the health benefits of bacon and sausage. I also have another topic on which I flip… eggs.
Dang, now I’m hungry…
Which spatula do I grab?
The handiest one that does the job. Life is a sexually transmitted disease that is 100% fatal
This is exactly correct and is the problem.
It isn’t that people don’t trust facts. The problem is that the “experts” keep changing their minds on the facts or what those facts mean (mistakes just make this worse).
And, thus, trust in “experts” drops. As we must rely upon others to do the research and tell us what it means, trust is paramount.
The problem is therefore a lack of trust in experts, especially in this modern time period where an expert can be found to back any construct you can imagine.
“Experts say…” is no longer a meaningful statement, it is, sadly, for many, a signal for “fraud alert”.
““Experts say…” is no longer a meaningful statement, it is, sadly, for many, a signal for “fraud alert”.”
I blame the TV stations, in parts. They always have their own random “experts”.
What’s also often neglected is the life experience of certain people.
People who have lived for decades next to a problematic nuclear reactor may have a different experience/knowledge than a professor who never did set foot next to it.
In an ideal world, both of them would have a dialog and discuss the matter in a civil way, at eye level.
Very interesting article!
I only use inox steel or wood kitchen utensils.
“How does one go about defending oneself from becoming an unwitting factor in creating and propagating misinformation?”
Media competency. We already learned in school to question things we read/see/hear in the news and do our own research (in short, take it with a grain of salt).
For example, by comparing information available in both books and on the internet.
That’s why we roll our eyes and feel sadness when our parents do fall for fake news on the internet.
The internet never was a safe place or reliable source.
That’s why we still have professional journalists who research and fact-check things in the world.
Unfortunately, our parents never learnt to double-check information. They grew up with newspapers and TV and never had to doubt.
Unfortunately, the internet isn’t like that. Historically, it’s a chaotic place.
A popular statistics and social science blog (Statistical Modeling, Causal Inference, and Social Science) just today had a good writeup of much of the press that got the story wrong and then how well they did at correcting their readers: https://statmodeling.stat.columbia.edu/2024/12/19/how-did-the-press-do-on-that-black-spatula-story/
End result: they didn’t do a good job. Many didn’t reference the press release from Dr. Schwarcz at (https://www.mcgill.ca/oss/article/critical-thinking-health-and-nutrition/are-black-plastic-spatulas-and-serving-spoons-safe-use). This write-up skipped his work too! Many also didn’t link or mention the National Post article at https://nationalpost.com/news/canada/black-plastic (which this write-up thankfully did). Many news articles appear to avoid mentioning or linking to any perceived competitor, instead rewriting (plagiarizing?) their work as their own.
So, good job Hackaday for at least linking to one of the early news article. Do better next time by mentioning who actually discovered the fault. And maybe the next time an academic gets things wrong, see how it’s reported or corrected by newspapers and the media.
Wait, so black cooking utensils DON’T ooze BDE?