A researcher tested a new weight-loss supplement. She gave it to more than 200 overweight adults and found that, according to precise weight measurements, their average weight had decreased two months after they began taking the supplement. She concluded that the supplement promotes weight loss.
Is that claim accurate or flawed?
Ask this question of a group of adults, and many will tell you the claim checks out. After all, it is based on an experiment that used a "precise" scale, studied a decent number of subjects, and measured changes over time.
But the correct answer is that the researcher does not have enough information to support her conclusion. Her experiment did not include a control group of people who did not take the supplement, meaning she could not say for sure whether the pill, or some other factor, caused the weight loss.
This is the kind of subtle distinction that critical thinking skills should help illuminate. Yet many people – adults and children alike – struggle to identify this type of flaw in claims that seem reasonable at first glance.
It turns out that developing critical thinking skills can be difficult, even though many educators believe that "we are training our students to be able to do this," says Motz, a researcher at Indiana University.
Perhaps a key element is missing when teaching students to spot faulty reasoning: practice. Motz and other psychology researchers at Indiana University tested a promising way to strengthen critical-thinking muscles in a study whose findings give them confidence.
The research aimed to examine and improve participants' ability to identify the following common fallacies, which can lead people to draw incorrect conclusions from information and data:
Lack of a control group
Mistaking correlation for causation
Every group of study participants began the experiment by taking a pre-test and receiving training on critical thinking. But only one group actually spent time putting that training into practice, through an exercise that asked them to read paragraphs about scientific claims – like the first paragraph of this article – and answer multiple-choice questions about possible problems in the logic. The second group took a different type of quiz instead, while the third group did not engage in either activity.
At the end of the experiment, all participants took a post-test measuring their ability to identify flawed reasoning. According to Motz, all three groups improved, but the group that completed the extra critical-thinking practice saw a "significantly higher gain" – a roughly two-fold improvement over their initial pre-test scores.
This suggests that training alone does little to improve people's critical-thinking skills, but that practicing pattern recognition can make a big difference.
"You can't just say, 'Hey, here's how you evaluate information,'" Motz says. "You have to look at where people get confused and understand how they messed up – and why."
The research was supported by a grant from the Reboot Foundation, which advocates and works to improve critical thinking skills. An article about the research is currently a preprint, meaning it has not yet been published in a peer-reviewed scientific journal. The data are publicly available online.
The authors hope the study will motivate more teachers to include critical-thinking training and practice exercises in their curricula.
“We need to have a huge, crowd-sourced test bank of critical thinking items,” Motz says. “And people can include them in many different subjects.”
This would fit one of the researchers' broader goals: "empowering other teachers to experiment in a relatively simple way in their classrooms," says Emily Fyfe, a study author and assistant professor in the Department of Psychological and Brain Sciences at Indiana University. "I think this is a perfect topic that people will be very interested in."