How to Easily Evaluate the Quality of Research

A lot of numbers are thrown at us every day by people who want to convince us to agree with them. A good study adds credibility to the point being made. But not all studies are credible.

In The Price of Freedom is Personal Responsibility, I shared tips and tools to help verify the information you come across. This article shows how to evaluate the quality of research being referenced.

A Google search for “new study shows” returns over 650,000,000 results. That’s a lot of new studies! How do you determine which of these studies are credible?

You might question a study if the results don’t fit with your experience of reality. For example, anyone who has personally eaten under stress would question today’s top Google search result – New Study Shows Stress Eating is a Myth. When you question the results, it’s worth taking a closer look. If you look closely at this article, you’ll see that only 59 people participated in the study – not nearly enough to draw such a sweeping conclusion.
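To see why 59 participants is so limiting, here is a minimal back-of-the-envelope sketch in Python (my own illustration, assuming the participants were a simple random sample, which the study may not have used): with 59 people, the 95% margin of error on any reported percentage is roughly plus or minus 13 points.

```python
import math

# Rough 95% margin of error for a percentage estimated from a simple
# random sample of n people (worst case, p = 0.5). This is only a
# back-of-the-envelope check and assumes random sampling.
def margin_of_error(n, p=0.5, z=1.96):
    return z * math.sqrt(p * (1 - p) / n)

print(f"n = 59:   +/- {100 * margin_of_error(59):.0f} percentage points")    # roughly +/- 13
print(f"n = 1000: +/- {100 * margin_of_error(1000):.0f} percentage points")  # roughly +/- 3
```

In other words, any percentage the study reports could easily be off by double digits, before even considering who those 59 people were.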

But what about today’s number two search result – Shocking New Study Shows Chemo Kills Half of Cancer Patients? It’s hard to dismiss this based on personal experience alone. However, if you read the article closely, you’ll see that although it cites a credible study, the results are an indictment of one particular hospital, not of chemo treatment. And if you click through to the research study itself, you’ll see that its conclusion is that certain groups of people (by type of cancer, age, general health, etc.) are at greater risk from chemo treatment. Nothing shocking or new about that. This article, published in Natural News, is an example of how the results of a credible study can be manipulated to promote a publication’s own agenda – in this case, alternative medicine.

Unfortunately, it has become common practice to cite questionable research, or to manipulate research results, in order to support an agenda.

Don’t Be Manipulated by Numbers

If you don’t know how to evaluate the quality of research, you can be easily manipulated by numbers or fall prey to confirmation bias (the tendency to see new information as confirmation of what you already believe).

The good news is that it’s not as hard to learn as it might seem. You don’t need to be a statistician to evaluate the quality of research; there are simple checks anyone can do.

4 Questions to Ask When Reading Articles That Reference Research

When numbers are being quoted, ask these questions to highlight potential concerns.

1. Who conducted the research?
Research produced or funded by groups with a strong political or commercial agenda is less likely to be trustworthy, since these groups have a vested interest in the study’s findings supporting their viewpoint.

2. Who is reporting the results?
Media coverage often aims to be attention-grabbing and succinct. As a result, reporters sometimes oversimplify the research or present only a small piece of it rather than the whole picture.

When those reporting the results have a political or commercial agenda, they may not fully or accurately summarize the original research. The report on the chemo study above is one example.

3. Are they jumping to conclusions?
Correlation does not imply causation. Just because two things are related does not mean one caused the other. For example, numerous studies showed that women taking hormone replacement therapy (HRT) also had a lower incidence of heart disease. For many years people assumed that HRT protected against heart disease, but later studies showed this was not the case. The two were simply correlated, because the women who took HRT also tended to take better care of their health in other ways. (The short simulation after this list shows how a hidden factor like that can create a correlation on its own.)

4. Is this a one-time study?
A single study, no matter how good, needs to be viewed in the context of other research on the topic. What do other studies show? Results obtained at one time might not hold true at another.
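To make the correlation-versus-causation point concrete, here is a minimal toy simulation in Python (my own hypothetical illustration, not data from the actual HRT studies). In it, HRT has no effect on heart disease at all; a hidden factor (being generally health-conscious) makes women both more likely to take HRT and less likely to develop heart disease, and that alone makes the HRT group look healthier.

```python
import random

random.seed(42)

N = 100_000
hrt = [0, 0]     # [heart disease cases, total] among women taking HRT
no_hrt = [0, 0]  # [heart disease cases, total] among women not taking HRT

for _ in range(N):
    # Hidden confounder: some women are generally more health-conscious.
    health_conscious = random.random() < 0.4

    # Health-conscious women are more likely to take HRT...
    takes_hrt = random.random() < (0.6 if health_conscious else 0.2)

    # ...and less likely to develop heart disease. Note that disease risk
    # depends only on health-consciousness here; HRT itself does nothing.
    has_disease = random.random() < (0.05 if health_conscious else 0.15)

    group = hrt if takes_hrt else no_hrt
    group[0] += has_disease
    group[1] += 1

print(f"Heart disease rate, HRT group:    {100 * hrt[0] / hrt[1]:.1f}%")
print(f"Heart disease rate, no-HRT group: {100 * no_hrt[0] / no_hrt[1]:.1f}%")
```

Running this shows roughly an 8% heart disease rate in the HRT group versus roughly 12–13% in the no-HRT group, even though HRT does nothing in the model. Correlation alone cannot tell you which, if any, of the related factors is doing the causing.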

6 Steps to Easily and Quickly Evaluate the Quality of Research

If an article reporting on research raises questions for you, click through and take a look at the study itself. Be wary of articles that reference research but provide no link to the study.

Another advantage of looking directly at the research is that you often find interesting and useful information that the articles referencing the study leave out.

1. Consider the trustworthiness of researchers.
Are you familiar with the organization that conducted the research? Research conducted through a respected, known organization like Gallup or Harvard is more likely to be trustworthy.

2. Look at who was in the group.
Reported research is based on the assumption that the findings for a sample of people can be generalized to the larger population. How well does the sample represent that population? What groups are reported, and how well do the respondents represent those groups? What was the sample size? As with the stress-eating study discussed earlier, 59 people is too small a sample to generalize from.

3. How was the data gathered?
What was the procedure for selecting the sample? When results come from an online survey, consider what motivated individuals to respond; they might not be representative of the larger population. Take a look at the questions that were asked. If a test was given, was it a standardized test?

4. How was the data measured?
Was the data measured carefully, or is this simply a study reporting numbers? Do the researchers report measures of reliability and validity? Reliability means the measurement produces the same results when repeated; validity means it measures what it claims to measure.

5. How was the data analyzed?
Are the reported differences significant? A difference between 25% and 40% might not mean anything at all if the groups are small. A “significant” result is one that is unlikely to have occurred by chance, and differences must be statistically significant to be considered meaningful. (The sketch after this list shows how sample size decides whether that 25%-versus-40% gap means anything.)

6. Are the conclusions justified?
Is the sample size large enough to generalize from?
Are the conclusions based on sufficient data?
Are the reported differences meaningful statistically?
If the study is attributing causation, is more than correlation demonstrated?
Do the results make sense based on what you read?
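
As a rough illustration of the significance question in step 5, here is a minimal sketch in Python using a standard pooled two-proportion z-test with a normal approximation (the 25%-versus-40% split and the group sizes are my own hypothetical numbers, not from any real study).

```python
import math

def two_proportion_p_value(x1, n1, x2, n2):
    """Two-sided p-value for the difference between two proportions,
    using the pooled two-proportion z-test (normal approximation)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = abs(p1 - p2) / se
    return math.erfc(z / math.sqrt(2))  # P(|Z| >= z) for a standard normal

# 25% vs 40% in two groups of 40 people each: p is roughly 0.15,
# so chance alone could easily explain the gap.
print(two_proportion_p_value(10, 40, 16, 40))

# The same 25% vs 40% gap in two groups of 400 each: p is far below
# 0.05, so the difference is statistically significant.
print(two_proportion_p_value(100, 400, 160, 400))
```

The same percentage gap can be meaningless in small groups and highly significant in large ones, which is why sample size and the significance test both matter when judging a study’s conclusions.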

 

Photo credit: Bigstock/Igor Stevanovic