Objectivity, subjectivity, and how to evaluate scientific studies

TZ (1000+ posts), Wed Nov-07-07 05:58 AM
Original message
This post is not an attempt to sway anyone's beliefs about homeopathy, UFOs, ESP, etc., but merely an attempt to share what I have learned about evaluating data and evidence. Critical thinking skills are, IMHO, among the most important tools in a scientist's toolbox, and something more Americans need to learn. I have come to see that people who have not had the benefit of a scientific background or education may not understand some of these distinctions well. So, in an effort to educate and enlighten (something I try to do with my posts), I will explain certain skills that are taught. I would like to think people could use them to better examine everything from the aforementioned topics to what our government claims is true and what other politicians say and do.

First: subjective evidence vs. objective evidence

Subjective evidence is basically how each of us personally observes and evaluates; it is your own personal interpretation of sensory input. Sometimes it is accurate and sometimes it is not. I hear people say, "I know what I saw. My eyes don't lie." Maybe they don't, but our brains are certainly capable of misinterpreting what we see and hear. That is actually what happens with learning disabilities like dyslexia: the brain does not interpret the input correctly (transposing letters in correctly written words, leaving the text incomprehensible to the sufferer). Sometimes we see what we expect or want to see, like the example I cited in my previous post about the serial sniper: people were looking for, and "seeing," a white box truck instead of the real vehicle, a maroon Caprice, because that is what they believed the suspect vehicle was.

Objective evidence: some people claim that no one can ever be completely free of their own biases, agendas, opinions, etc. That is true to some extent, but there are ways to compensate for it. It is one of the reasons peer review is so important in the evaluation of study data. Journals try to find reviewers who, while working in the field, have no direct connection to the authors and no reason to be biased for or against them, so as to get as close as possible to an objective evaluation of the methodology, the results, and the conclusions drawn from those results. That is why, when people cite sources that do not apply good objective standards to data (and it is why I don't like Wikipedia as a citation for scientific evidence), I have a hard time taking them seriously.

Next: how to evaluate a scientific study. As everyone should know, the fact that a study has been published does not make it good science, especially if the peer review was lacking. Seriously flawed studies do sometimes get published and used (because of political agendas, etc.), and it helps to understand what you are looking at before you accept or reject a study.

Study size: Of course this makes a HUGE difference. Small studies don't tell us much about how well the conclusions generalize to the broader population; obviously, a bigger sample size is better. Unfortunately, sample size is a big limitation in clinical trials. It is understandably hard to get people to volunteer to be "guinea pigs," so to speak. Despite what some think, that is the reason a lot of drugs have problems after they reach the market (Vioxx, for example). Even with accepted clinical trial sample sizes, there are often not enough participants to anticipate all the possible side effects that will appear once the drug is widely available to the public. That is why the FDA has an indefinite post-marketing phase for monitoring drugs: it understands the scientific limitations of clinical trials. Vioxx was known to have potentially serious side effects (as many drugs do, unfortunately; nothing is 100% safe, and medicine is not an exact science, with plenty of unpredictability involved), but because of the natural limits on study size, it was not understood how common those serious health problems were until widespread use of the drug made it evident. Thus Vioxx was pulled.
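To make the sample-size point concrete, here is a minimal sketch of the arithmetic. The trial size and incidence rate below are hypothetical numbers I picked for illustration, not figures from Vioxx or any real trial; the point is how easily a rare side effect can slip past a trial of ordinary size.

import math  # not strictly needed; plain arithmetic suffices

# Illustration: how easily a clinical trial can miss a rare side effect.
# The numbers below (trial size, incidence) are hypothetical and chosen
# only to show the arithmetic; they are not from any real trial.

def prob_trial_sees_no_cases(n_patients: int, true_incidence: float) -> float:
    """Probability that none of n_patients experiences the side effect,
    assuming independent events with probability true_incidence each."""
    return (1.0 - true_incidence) ** n_patients

n = 3000                # hypothetical trial size
incidence = 1 / 10000   # hypothetical true rate: 1 in 10,000 patients

p_miss = prob_trial_sees_no_cases(n, incidence)
print(f"Chance the trial observes zero cases: {p_miss:.1%}")
# Comes out to roughly 74%: the trial would most likely never see the
# side effect at all, even though in a population of millions it would
# affect thousands of people.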

Statistical analysis: This is not my area of strength, so I won't get overly technical, but suffice it to say that there are different ways of analyzing data and different statistical tests, and some are more appropriate than others for a given data set. I have seen people try to manipulate results this way, by using an inappropriate analysis. It is entirely possible to get a significant result using one type of test and a non-significant result using another.
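As a rough sketch of what that looks like in practice (the data here are simulated, and the exact p-values depend on the random draw), this runs a parametric t-test and a nonparametric Mann-Whitney U test on the same skewed data. The two tests can land on opposite sides of the conventional 0.05 cutoff, which is why the choice of test matters.

# Sketch: the same two samples analyzed with two different tests.
# The data are simulated (skewed, outlier-prone), so exact p-values vary;
# the point is only that the choice of test can change whether a result
# looks "significant."
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
group_a = rng.lognormal(mean=0.0, sigma=1.0, size=25)   # e.g. placebo
group_b = rng.lognormal(mean=0.5, sigma=1.0, size=25)   # e.g. treatment

# Parametric test: assumes roughly normal data, sensitive to skew and outliers.
t_stat, p_t = stats.ttest_ind(group_a, group_b)

# Nonparametric test: compares ranks, makes weaker assumptions.
u_stat, p_u = stats.mannwhitneyu(group_a, group_b, alternative="two-sided")

print(f"t-test p-value:          {p_t:.3f}")
print(f"Mann-Whitney U p-value:  {p_u:.3f}")
# If one p-value falls below 0.05 and the other does not, the "conclusion"
# depends entirely on which test the authors chose to report.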

Wrong conclusions drawn / what statistically significant figures mean: A statistically significant result does not always mean we can draw firm conclusions from it, and the wrong conclusions are often drawn (again, usually when an agenda of some sort is involved; this is what happened with Avandia, IMO, where eagerness to market led to ignoring or misreading the safety studies). In small study samples, a small change up or down produces large swings in statistical significance, and a result can look more legitimate than it really is. A recent study posted here reported a doubling (2x) of a health risk in rats, but when you looked at the actual figures, the risk went from roughly 0.5% to 1.0%, or something like that. That is not a huge concern in reality, even though it is indeed "statistically significant."
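Here is a small sketch using the rough numbers from that example (0.5% rising to 1.0%, which are approximate figures, not the study's exact data), to show why a "doubling" of risk can still be a small absolute change.

# Sketch using the approximate numbers quoted above (0.5% -> 1.0%).

baseline_risk = 0.005   # 0.5% in the control group
exposed_risk  = 0.010   # 1.0% in the exposed group

relative_risk = exposed_risk / baseline_risk        # the "2x the risk" headline
absolute_increase = exposed_risk - baseline_risk    # extra risk per subject
number_needed_to_harm = 1 / absolute_increase       # subjects per extra case

print(f"Relative risk:           {relative_risk:.1f}x")
print(f"Absolute risk increase:  {absolute_increase:.1%}")
print(f"Number needed to harm:   {number_needed_to_harm:.0f}")
# A 2.0x relative risk, but only a 0.5 percentage-point absolute increase:
# about 1 extra case per 200 subjects.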

Anyway, those are some of the methods I have learned (both formally and informally) for evaluating evidence. As I said, too many people jump to conclusions about various issues, scientific, political, and otherwise, IMO, so I thought I would share some of the ways I and others like me draw conclusions.
Feel free to disagree with my analysis, but I think it is a rational and logical way of dealing with the world around us.