August 27th, 2013 in Other Sciences / Social Sciences
(Phys.org) - Researchers have found that authors of “soft science” research papers tend to overstate results more often than researchers in other fields. In their paper published in Proceedings of the National Academy of Sciences, Daniele Fanelli and John Ioannidis write that the worst offenders are based in the United States.
In the science community, “soft” research has come to mean research done in areas that are difficult to measure, behavioral science being the best known. Science conducted on the ways people (or animals) respond in experiments is often difficult to reproduce or to describe in measurable terms. For this reason, the authors note, research based on behavioral methodologies has for several decades been considered to be at higher risk of bias than other sciences. Such biases, they suggest, tend to lead to inflated claims of success.
The problem, Fanelli and Ioannidis suggest, is that soft science offers more “degrees of freedom”: researchers have more room to engineer experiments that will confirm what they already believe to be true. Success in such sciences is thus defined as meeting expectations, rather than reaching a clearly defined goal or discovering something new.
The researchers came to these conclusions by locating and analyzing 82 recent meta-analyses (papers that pool and reanalyze previously published studies) in genetics and psychiatry, covering 1,174 primary outcomes. Including genetics allowed the duo to compare soft science studies with hard science studies, as well as with studies that combined the two.
In analyzing the data, the pair found that researchers in the soft sciences not only tended to report more extreme findings, but also more often reported that the outcome of their research matched their original hypotheses. They also found that papers with a U.S.-based corresponding author tended to be the worst offenders. In the researchers' defense, Fanelli and Ioannidis suggest that the publish-or-perish atmosphere in the U.S. contributes to the problem, as does the difficulty of defining what counts as success in the soft sciences. The authors also noted that research that combined hard and soft science was less likely than purely soft science research to produce inflated results.
More information: “US studies may overestimate effect sizes in softer research,” PNAS, published online before print August 26, 2013. DOI: 10.1073/pnas.1302997110
Abstract
Many biases affect scientific research, causing a waste of resources, posing a threat to human health, and hampering scientific progress. These problems are hypothesized to be worsened by lack of consensus on theories and methods, by selective publication processes, and by career systems too heavily oriented toward productivity, such as those adopted in the United States (US). Here, we extracted 1,174 primary outcomes appearing in 82 meta-analyses published in health-related biological and behavioral research sampled from the Web of Science categories Genetics & Heredity and Psychiatry and measured how individual results deviated from the overall summary effect size within their respective meta-analysis. We found that primary studies whose outcome included behavioral parameters were generally more likely to report extreme effects, and those with a corresponding author based in the US were more likely to deviate in the direction predicted by their experimental hypotheses, particularly when their outcome did not include additional biological parameters. Nonbehavioral studies showed no such “US effect” and were subject mainly to sampling variance and small-study effects, which were stronger for non-US countries. Although this latter finding could be interpreted as a publication bias against non-US authors, the US effect observed in behavioral research is unlikely to be generated by editorial biases. Behavioral studies have lower methodological consensus and higher noise, making US researchers potentially more likely to express an underlying propensity to report strong and significant findings.
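The deviation measure described in the abstract can be sketched in a few lines of code. The example below is a minimal, hypothetical illustration (made-up numbers, not the authors' data or code): it computes an inverse-variance-weighted summary effect for a single meta-analysis and then measures how far each primary outcome deviates from that summary, which is the general idea behind comparing individual results to the meta-analytic summary.

```python
# Minimal sketch with hypothetical data (not the authors' code): measure how
# each primary outcome deviates from the inverse-variance-weighted summary
# effect of its meta-analysis.

# Hypothetical primary outcomes from one meta-analysis: (effect size, standard error)
primary_outcomes = [
    (0.42, 0.10),
    (0.15, 0.08),
    (0.55, 0.20),
    (0.30, 0.12),
]

# Fixed-effect summary: weight each outcome by the inverse of its variance.
weights = [1.0 / (se ** 2) for _, se in primary_outcomes]
summary = sum(w * es for (es, _), w in zip(primary_outcomes, weights)) / sum(weights)

# Deviation of each primary outcome from the summary effect, expressed in
# standard-error units so outcomes of different precision are comparable.
for es, se in primary_outcomes:
    deviation = (es - summary) / se
    print(f"effect={es:+.2f}  summary={summary:+.2f}  deviation={deviation:+.2f} SE")
```

In the study itself, deviations of this kind were then compared across behavioral and nonbehavioral outcomes and across the countries of the corresponding authors.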
© 2013 Phys.org
“Researchers find researchers overestimate soft-science results, US the worst offender.” August 27th, 2013. http://phys.org/news/2013-08-overestimate-soft-science-resultsus-worst.html