Sunday, October 05, 2025

Meta-analyses

We fear that the study we are depending on will turn out to be a one-off, and not a good basis for choosing action. Especially when a majority of the studies seem to point in a direction we like, but there are these pesky other studies that show the null hypothesis or even the opposite of our desires. If only, if only someone could look at all the studies and put them together, to see what they say in aggregate. Then, then we would know what the answer is. It sounds like not only an answer, but, well, a meta-answer.

Yeah, if only.

Let's start with inclusion. The hundred studies we want to put together are not of the same design. With respect to the bees I just mentioned, some of the studies measure only bees. Others measure all insects. Others measure land insects or water insects. Some are one-shot, point-in-time counts. Others measure populations over years, but not in the same places every year, to get a sampling of the whole county - or even the whole country. Does a decline of all insects in Denmark 1980-2000 deserve to go in alongside all land insects in Spain 2015-2020? Out of the hundred studies, perhaps only six fit our criteria closely enough, and even those aren't identical.

Is depression worse now than it was in the 1950s? Well, there's nothing like relying on the memories of 80-year-olds about whether the adults seemed depressed when they were children, eh? But if a study gets linked at a popular online site, that's exactly what you will get in the comments, with people getting steamed up about it, too. So you go for some objective criteria. How about suicide rates? Eh, coroners didn't always write "suicide" even when they thought it was, out of kindness to the family. How about calls to suicide hotlines? Do we go by whether people describe themselves as depressed? How about a 1-5 happiness scale? What if people lie about that? If one study has N=1732 and another has N=47 college-age females, how do you combine that? Let's devise a test that gives us a number at the end and compare numbers. Okay, there are twenty-'leven depression tests, each asking similar but not identical questions. And that's just the single tests themselves. Now let's try to gather all the tests together and combine them. What could possibly go wrong?
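To make the N=1732 versus N=47 question concrete, here is a minimal sketch of the usual answer: a meta-analysis weights each study's effect estimate by the inverse of its variance, so the large, precise studies dominate the pooled number. The effect sizes and standard errors below are invented for illustration, not taken from any real depression study.

```python
# Minimal sketch of fixed-effect, inverse-variance pooling of effect sizes.
# The effect sizes and standard errors are invented for illustration only.

studies = [
    {"name": "large survey (N=1732)", "effect": 0.15, "se": 0.05},
    {"name": "small sample (N=47)",   "effect": 0.60, "se": 0.30},
]

# Each study is weighted by 1 / variance, so precise (large-N) studies dominate.
weights = [1 / s["se"] ** 2 for s in studies]
pooled = sum(w * s["effect"] for w, s in zip(weights, studies)) / sum(weights)
pooled_se = (1 / sum(weights)) ** 0.5

for s, w in zip(studies, weights):
    print(f"{s['name']}: weight {100 * w / sum(weights):.1f}%")
print(f"pooled effect = {pooled:.2f} (SE {pooled_se:.2f})")
# The N=1732 study gets roughly 97% of the weight; the N=47 study barely moves the result.
```

Which is the point, and also the worry: the pooled number is mostly whatever the handful of big studies say, and it still assumes the two studies were measuring the same thing in the first place.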

Publication bias can destroy just about anything. What if you are studying political hatred and you find out your test shows that people who didn't hate anyone when they were twenty still don't hate anyone when they are fifty? No matter how you frame that, it's going to be hard to find a journal to publish it. So you don't try, and neither did sixteen other colleagues studying something similar that decade. What will get published? All the studies that show that hatred changes significantly over time.
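The file-drawer effect is easy to demonstrate with a toy simulation. Here is a minimal sketch, with every number invented: the true effect is zero, but only the studies that happen to come out positive and statistically significant get "published," and the published literature then shows an effect that does not exist.

```python
# Toy simulation of publication bias (the "file drawer" problem).
# All numbers are invented for illustration; no real data involved.
import random
import statistics

random.seed(1)

TRUE_EFFECT = 0.0        # in reality, nothing changes over time
N_STUDIES = 1000
SE = 0.15                # assumed standard error of each study's estimate

all_results, published = [], []
for _ in range(N_STUDIES):
    observed = random.gauss(TRUE_EFFECT, SE)   # what one study happens to find
    all_results.append(observed)
    if observed / SE > 1.96:                   # only positive, "significant" findings get written up
        published.append(observed)

print(f"Mean of all {N_STUDIES} studies:      {statistics.mean(all_results):+.3f}")
print(f"Mean of {len(published)} published studies: {statistics.mean(published):+.3f}")
# The published studies average well above zero, even though the true effect is zero.
```

A meta-analysis that can only see the published studies will pool them into a confident, precise, and entirely spurious effect.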

When you see the word "meta-analysis," do not be relieved or encouraged. It doesn't mean your questions are answered; it means you have new questions. Even worse, people with biases they don't notice will find those biases magnified when they go meta.

You will still get some new information, and it might be useful. That is all. 
