Borepatch has a fantastic post about the disconnect between science and public perception, and the pitfalls of forming public policy with the tools of the trade - in this case, the hot mess of the day, global warming.
...What Mann did that has been the source of controversy is use a pretty unusual statistical analysis method. It's been shown that his method generates hockey stick shaped graphs when even random data is used as the input - say, the list of telephone numbers in your town. So far, the debate about this has had more than a bit of the playground did too/did not to it. This means that the hockey stick graph has won, because it is so visually striking.
Read the whole thing here - don't be afraid - he's written this for everyone, not just science geeks!
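Since Borepatch is talking about a statistical quirk rather than the climate itself, here's a rough Python sketch of the kind of thing he means - not Mann's actual code or data, just made-up "red noise" series run through a short-centered principal component step, the step the critics flagged. The series count, record length, and AR(1) noise model are all hypothetical, picked only to show that the leading component tends to come out flat with a sharp bend at the end even though there's no signal in the input.

```python
import numpy as np

rng = np.random.default_rng(0)

n_series, n_years, calib = 70, 600, 100  # hypothetical sizes, not the real study's

# Persistent "red noise" proxies: AR(1) series with no real signal in them.
def ar1(n, phi=0.9):
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.normal()
    return x

proxies = np.column_stack([ar1(n_years) for _ in range(n_series)])

# "Short-centered" step: subtract the mean of only the last `calib` years
# instead of the mean of the full record. Series that happen to drift up
# (or down) near the end get inflated weight in the leading component.
centered = proxies - proxies[-calib:].mean(axis=0)

# Leading principal component via SVD.
u, s, vt = np.linalg.svd(centered, full_matrices=False)
pc1 = u[:, 0] * s[0]

# pc1 tends to be flat for most of its length with a sharp bend near the
# end -- a hockey stick -- even though the input is pure noise.
print(pc1[:5], pc1[-5:])
```

Swap the short-centering line for a full-record mean and the hockey stick mostly goes away, which is the whole point of the argument.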
When I was performing the actual data collection for the first big-money research study I worked on, I ran into a brick wall right away. I had designed a test regime to look at the interplay between differing types of neural input and behavior in lobsters. My particular holy grail was trying to discover how animals like lobsters use smell to find food in the water. It's a dry subject to me now, but at the time I was fascinated... the belief was that if we could understand the lobster's ability to track smells, we could design robots to seek out oil, chemical, or explosive traces in the water, with no human involvement or risk.
Well, I ran into a problem, I did. I had the gigantic laboratory I needed, with a 60-foot-long water tank that produced smell-free, turbulence-free water (turbulence affects how smells are distributed - chaotic patterns form, and it's complicated, but we made a turbulence-free environment to eliminate the need for the chaotic math, and thus the gigantic water tank, about half the volume of an Olympic pool). Motion-sensitive low-light cameras, lobster blindfolds, you name it - a couple of million bucks went into this.
My problem was that the lobsters weren't hungry when they were in the tank. They were uninterested in following the smells I injected (along with the fluorescing dye that I once accidentally released into the neighboring estuary, dyeing the water, vegetation, and about 50 expensive sailboats a pleasing red). For three months, I sat in the dark and watched seemingly random lobster behavior, most of which consisted of finding a little house to sit in until the trial was over and it was time to go back to their personal tanks.
Well, I'm no supergenius; I'm a worker. I was probably the least intelligent person in my lab, but I'm a fast learner, can bullshit with the best of them, and can put in 18-hour days without throwing a hissy fit. I cranked out my study, and more or less got a crash course in neurobiology in 3 months, having started off knowing almost nothing about the subject.
I'm not the brightest bulb, as I said, but my supervisor was. My supervisor earned two Ph.D.s at the same time, which was why he was at Woods Hole... and he didn't see much good in the statistical array we had designed to test the data.
So, we did what so many scientists do. We broke up the datasets and reverse-engineered the statistics. We found statistical tests that worked, and formed null hypotheses that fit the test and the data. Nowadays this is called data mining, but that term strikes me as disingenuous. What we did was make a silk purse out of a sow's ear.
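For anyone who wants to see what "finding statistical tests that worked" looks like stripped of the lobsters, here's a hedged little Python sketch - none of this is my actual data or analysis, just pure noise and an arbitrary grouping fed through a t-test, to show how easily a "significant" result falls out when you choose the test after looking at the data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Pure noise standing in for behavioral measurements -- no real effect anywhere.
data = rng.normal(size=(200, 20))   # 200 trials, 20 made-up behavioral metrics

significant = []
for i in range(data.shape[1]):
    # Split each metric by an arbitrary grouping (say, odd vs. even trials)
    a, b = data[::2, i], data[1::2, i]
    t, p = stats.ttest_ind(a, b)
    if p < 0.05:
        significant.append((i, p))

# With 20 metrics and a 5% threshold, about one "discovery" shows up
# on average even though nothing is there.
print(significant)
```

Run it a few times: twenty made-up metrics, a 5% cutoff, and you'll average roughly one bogus "discovery" per run - which is exactly the sow's-ear problem.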
Now, my sow's ear and I were well received, don't get me wrong. We got the animal behavior geeks all hot and bothered, and the neuro folks got all soggy and hard to light. I disproved a couple of hypotheses that were popular at the time, and proved something small and significant, but nothing earth-shattering... and, as I discovered not long afterwards, there was a good reason why I'm more interested in fisheries and fish farming: I got a prison pallor sitting in the dark on Cape Cod in the summer, and, sharing a house with 5 single women, 3 of whom were hot, I wasn't even around enough to hit on any of them.
*********************************
Although I don't do any research anymore, more's the pity, I do occasionally lecture at conferences designed to prepare social science students to perform cross-cultural research studies. The social 'sciences' being for the most part mental masturbation, in my opinion, I try to hammer home the need for rigorous statistical analysis as part and parcel of a good study. Rule number one, in my book, is for the researcher to understand that statistics are not meant to prove or disprove anything, and are not meant to unveil higher truths... they are simply ways to help support your main point in the study without needing a crowd of 50 other scientists to testify in person that you're not full of shit.
I think the kids like my colorful language. I've lost my fear of public speaking, and I like to sound like a lobsterman, not a geek.
Monday, August 23, 2010
3 comments:
So that's how the sausage is made!
If I could quote Homer J. Simpson: "Facts are meaningless! They can be used to prove anything!"
So that's why ship hulls are red below the waterline.