A study of how people perceive human faces will kick off a new initiative to massively scale up, accelerate, and reproduce psychology studies.

The initiative—dubbed the “Psychological Science Accelerator” (PSA)—has so far forged alliances with more than 170 laboratories on six continents in a bid to enhance the ability of researchers to collect data at multiple sites on a massive scale. It is led by psychologist Christopher Chartier of Ashland University in Ohio, who says he wants to tackle a long-standing problem: the “tentative, preliminary results” produced by small studies conducted in relatively isolated laboratories. Such studies “just aren’t getting the job done,” he says, and PSA’s goal is to enable researchers to expand their reach and collect “large-scale confirmatory data” at many sites.

To gain access to the accelerator, researchers submit proposals to Chartier, who then forwards anonymized versions of the submissions to a five-member selection committee. The committee weighs factors such as how important the research question is, what impact it might have on the field, and how feasible data collection would be. Promising proposals are then passed to other committees—totaling more than 40 people—for feedback. The selection committee then makes the final call.

PSA received eight proposals in its first round, and late last month it approved its first study. Proposed by psychologists Benedict Jones and Lisa DeBruine of the University of Glasgow in the United Kingdom, the study aims to discover whether the research findings of Alexander Todorov, a psychologist at Princeton University, can be replicated on a global scale. Todorov has reported that people evaluate human faces along two main dimensions: valence and dominance. Valence is essentially a measure of perceived trustworthiness, whereas dominance is a measure of perceived physical strength.

Todorov’s findings have been successfully replicated in the United States and the United Kingdom, the two researchers say. Now, they want to tap PSA’s network to see whether the findings hold up in other parts of the world. “We think this deserves an answer on a global scale,” Chartier says.

More than 50 of PSA’s collaborating labs have already committed to collecting data as part of the study. And Chartier notes that once the results are ready, the collective will publicly release all its data. That should help counter publication bias—the tendency for positive results to be published while null results go unreported—which is an especially thorny issue in meta-analyses that pool multiple studies. Chartier says all PSA-collected data and experimental materials will be open access by default (with exceptions for sensitive subjects), and researchers will preregister their studies, allowing peer reviewers to examine methods and proposed analyses before experiments begin.

Because of the large number of collaborators, PSA studies are likely to have hundreds of authors. And that could pose a problem for researchers seeking to use the studies to qualify for tenure or secure promotions. “One potential concern is that the reward system in science is not ready to accommodate large-scale collaboration,” says Brian Nosek, executive director of the Center for Open Science in Charlottesville, Virginia. But he believes “the dramatic gains that these projects will demonstrate will inspire change in the reward systems.”

PSA isn’t the only effort aiming to change how researchers conduct psychological studies, which have received extensive criticism for a lack of reproducibility. Others include the Many Labs Replication Project and the Pipeline Project. Earlier this year, Chartier also launched StudySwap, an online platform designed to help researchers find collaborators for replication studies and exchange resources.