
A new study finds that a “hidden universe of uncertainty” may underlie many scientific findings, especially in the social sciences.
When dozens of research teams used the same data set to test the same hypothesis, that immigration reduces support for social policy, they produced markedly different results, according to a new study published October 28 in the journal Proceedings of the National Academy of Sciences.
The findings suggest that it can be very difficult to be confident of conclusions in such fields, because small changes in researchers’ initial choices can yield dramatically different outcomes.
In the new study, Nate Breznau, a postdoctoral researcher at the University of Bremen in Germany, and his colleagues asked 161 researchers in dozens of research teams to test a common hypothesis: that immigration reduces support for government social policy. This question has been examined hundreds of times in the social science literature, Breznau told Live Science, and the results are all over the map.
As a starting point, they gave the research teams data on six government policy questions from the International Social Survey Programme, a vast data set that tracks policy attitudes across 44 countries.
Next, they asked the teams to use logic and prior knowledge to develop models to explain the relationship between migration and support for government social services.
For example, one group might predict that an increased influx of immigrants into a country increases competition for scarce resources, which in turn reduces support for social services. The research teams then had to decide what types of data to use to answer that question (for example, net migrant flow into a country, gross domestic product, or average or median income in different regions), as well as what types of statistical analyses to run.
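To see why those choices matter, consider a rough illustration (a synthetic sketch, not the study’s actual code or data): two defensible specifications of the same regression, differing only in whether GDP is included as a control, can flip the apparent sign of migration’s effect. The variable names and all numerical relationships below are invented for illustration.

```python
# Synthetic sketch: how one modeling choice can flip a conclusion.
import numpy as np

rng = np.random.default_rng(0)
n = 500
migration = rng.normal(size=n)               # e.g., net migrant flow (standardized, synthetic)
gdp = 0.6 * migration + rng.normal(size=n)   # GDP correlated with migration (assumption)
support = 0.5 * gdp - 0.1 * migration + rng.normal(size=n)  # support for social policy

def ols_coef(columns, y):
    """Ordinary least squares via lstsq; returns coefficients, intercept first."""
    X = np.column_stack([np.ones(len(y))] + list(columns))
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Specification A: migration alone. It absorbs GDP's effect, so the
# coefficient comes out positive (~0.2).
print(ols_coef([migration], support)[1])

# Specification B: control for GDP. The migration coefficient turns
# negative (~-0.1), the opposite conclusion from the same data.
print(ols_coef([migration, gdp], support)[1])
```

Neither specification is obviously wrong; they simply encode different assumptions about what should be held constant, which is exactly the kind of seemingly minor decision the study examined.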
The research teams’ results generally mirrored the scattered literature: 13.5% said no conclusion could be drawn, 60.7% rejected the hypothesis, and 28.5% found the hypothesis was supported.
Breznau’s team then ran its own statistical analysis to try to understand why the groups came to such different conclusions.
They found that neither bias nor inexperience explained the variance. Instead, hundreds of seemingly minor decisions appear to have nudged the conclusions one way or the other. Even more surprising, no identifiable set of those decisions could account for the variation, possibly because there was insufficient data to compare the different models. (The study has one notable limitation: the authors’ analysis is itself a statistical model and therefore subject to uncertainty, too.)
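To make that kind of check concrete, here is a stylized sketch (not the authors’ actual analysis or data) of one way to test whether identifiable analytic choices explain the spread of results: regress each submitted model’s reported effect size on indicators for its coded choices and see how much variance those indicators explain. The number of models, the choices, and the effect sizes below are all hypothetical.

```python
# Stylized sketch: do coded analytic choices explain the variance in results?
import numpy as np

rng = np.random.default_rng(1)
m = 200                                      # hypothetical number of submitted models
choices = rng.integers(0, 2, size=(m, 5))    # five binary analytic decisions per model
effects = rng.normal(size=m)                 # reported effect sizes (pure noise here)

# Fit effects ~ choices and compute R^2, the share of variance explained.
X = np.column_stack([np.ones(m), choices])
beta, *_ = np.linalg.lstsq(X, effects, rcond=None)
resid = effects - X @ beta
r_squared = 1 - resid.var() / effects.var()
print(f"variance explained by coded choices: {r_squared:.2f}")  # near zero
```

A near-zero R² in such a regression would mean the coded decisions, taken together, fail to predict which way a model’s result came out, which is the puzzle the study reports.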
It is not clear to what extent this universe of uncertainty plagues other sciences. Models in astrophysics, for example, may be simpler than models of large-scale human interaction, Breznau said.
For example, there are 86 billion neurons in the human brain and 8 billion people on the planet, and all of these people interact in complex social networks.
“There may be basic laws that govern human social and behavioral organization, but we certainly don’t have the tools to learn about them,” Breznau told Live Science.
One takeaway from the study, Breznau said, is that researchers should spend more time refining their hypotheses before jumping into data collection and analysis, and the new study’s own hypothesis is a case in point.
“Does migration undermine support for social policy? It is a very typical hypothesis in the social sciences, but it is probably too vague to have a definitive answer,” he said.
A narrower, more specific question may yield better results, Breznau said.
If you want to see how different variables and modeling choices affected each model’s results, you can explore them through the study’s interactive Shiny app.