More than a decade ago, social psychology was confronted with a “crisis of confidence” (Pashler & Wagenmakers, 2012, p. 528). Systematic replication efforts suggested a relatively high non-replication rate (e.g., Klein et al., 2014; Open Science Collaboration, 2015), and the use of questionable research practices (QRPs; Banks et al., 2016) appeared to be common (Götz et al., 2021; John et al., 2012). This “manifest” crisis was preceded by another, more “latent” crisis that was perhaps even more problematic—a “period, when we were [even] unaware of the problem and thus did nothing about it” (Nelson et al., 2018, p. 512). Back then, it seems, the field did not value replications and, thus, was not even aware of the possible low replicability of its research findings. Yet, replications play a central role in ensuring trust in scientific fields, such as social psychology (Simons, 2014). As they clarify research findings’ reliability, they can be seen as the “Supreme Court of the scientific system” (Collins, 1985, p. 19). Conversely, without replications, it is more difficult to assess the reliability of our findings.
Since the beginning of the manifest replication crisis, methods have been developed to improve the field’s research and publication practices. For example, open science practices (OSPs) aim to increase transparency, openness, and reproducibility. In 2015, OSPs, including support for publishing replications, were described in a policy framework for journals called the Transparency and Openness Promotion (TOP) Guidelines (Nosek et al., 2015). Initiatives like the TOP Guidelines may not be sufficient to initiate broad cultural change without support from relevant stakeholders. Indeed, we may still work in a “dysfunctional social culture” (Nosek et al., 2022, p. 733) that discourages replications. Journals are particularly important for advancing culture change because they are gatekeepers for scientific discoveries (Aguinis et al., 2020; Kepes et al., 2018). Thus, we investigated whether social psychology journals increased their willingness to publish replications between 2015 and 2022, as indicated by the public policies on their websites. Moreover, we explored whether the rate at which social psychology journals publish replications has increased since 2015, and whether this rate differs depending on whether their websites welcome replications.
Literature Review
Replications play a central role in ensuring trust in entire scientific fields, such as social psychology, because they examine the reliability of these fields’ findings (Simons, 2014). They offer the possibility to evaluate scientific discoveries and, thus, should be “a fundamental feature of the scientific process” (Zwaan et al., 2018, p. 3). Replications, if conducted rigorously, inform theory and matter independently of their results. They allow the examination of sampling error and artefacts and may even unveil fraud (Schmidt, 2009). In the long term, they ensure the stability of our knowledge (Hüffmeier et al., 2016; Radder, 1996). However, historical evidence suggests that social psychology has not (a) valued (Giner-Sorolla, 2012; Hüffmeier et al., 2016), (b) incentivized (Koole & Lakens, 2012), or (c) conducted replications (Makel et al., 2012; Schmidt, 2009). It follows that, in the absence of a research culture that facilitates conducting and appreciating replications, we know too little about whether individual findings, or the field as a whole, are trustworthy.
Relevance of Journals’ Expressed Support for Replications
At first glance, replications may appear to garner greater appreciation in social psychology today. For example, different systematic replication efforts (e.g., the “Many Labs” projects; e.g., Ebersole et al., 2016; Klein et al., 2014) were published in prestigious journals, like Advances in Methods and Practices in Psychological Science, Journal of Experimental Social Psychology, and Social Psychology. In addition, various researchers advocate for making replications mainstream (Hüffmeier et al., 2016; Zwaan et al., 2018).
However, even if individual researchers or entire labs begin to appreciate and conduct replications, these efforts may not be enough to initiate broad cultural change (Nosek et al., 2022). Researchers’ career advancement is often influenced by the quantity of publications, especially those in highly prestigious journals (Aguinis et al., 2020). Publications in such outlets can be even more important than research quality, especially when it comes to promotion, tenure, and reward decisions (Gervais et al., 2015; Gomez-Mejia & Balkin, 1992). In turn, journals decide what gets published: They determine which type of studies have a high chance of getting published and, thus, are particularly appealing for researchers to conduct. As such, journals function as gatekeepers for what type of research is valued and rewarded, including replications (Kepes et al., 2018).
For example, the Journal of Personality and Social Psychology declined to send a replication (Ritchie et al., 2012) of Bem’s (2011) precognition research out to peer-review, offering the following explanation: “This journal does not publish replication studies, whether successful or unsuccessful” (Aldhous, 2011). Hence, journals’ expressed support for replications, or lack thereof, probably has a strong impact on whether researchers conduct replications. As Giner-Sorolla (2012, p. 566) put it: “The unglamorous nature of replication work, confronted with the narrow publishing bottleneck, makes much of it unpublishable, and therefore not worth starting, in a world of precarious careers and limited resources.” Therefore, a sustainable and broad cultural change might only be achieved if stakeholders change policies and create incentives for conducting replications (Nosek et al., 2022).
Are Replications Mainstream Now?
Social psychology’s “manifest” replication crisis came with a “big bang” and was covered by neighboring fields as well as the media (for an overview, see Pashler & Wagenmakers, 2012). However, errors enable learning processes (Zhao, 2011). Negative emotions related to one’s errors trigger a motivation to learn from them (Zhao, 2011), and negative media coverage and critical scrutiny from neighboring disciplines most likely led to negative emotions among social psychologists. Further, errors with major (vs. minor) negative consequences lead to even more learning because they attract more attention (i.e., “negative outcome bias”, Zakay et al., 2004, p. 151). As public trust in social psychology findings and in its scientific community could strongly diminish in view of low replication rates, the field’s errors clearly had the potential to trigger severe negative consequences.
This collective awakening was followed by intense discussions about methods to improve the field’s publication practices—for example, the introduction of the TOP guidelines, including replications (Nosek et al., 2015). Taken together, the field of social psychology had (a) the motivation (i.e., impending negative consequences), (b) the tools (i.e., preregistrations or replications), and (c) the time (i.e., one decade) to learn from its errors (i.e., the low appreciation of replications; Zakay et al., 2004; Zhao, 2011) and to work towards the needed change in its scholarly culture (Pashler & Wagenmakers, 2012; Spellman et al., 2017). Therefore, over time, the field of social psychology may have changed its replication policies to address its replication problems. Thus, we hypothesized the following:
Hypothesis: Replications are welcomed1 more often in social psychology journals in 2022 than they were in 2015.
Overview of the Investigation
We first examined whether expressed support for replication studies in the policies of social psychology journals changed from 2015 to 2022. We coded journal policies in 2015 and used this as a benchmark date because the publication of a large-scale replication study by the Open Science Collaboration (2015) that year made replication widely visible. We recoded those same journals’ policies in 2022 to assess whether they were responsive to the advocacy for increasing the cultural acceptance of replication research. If explicit support for replications is still missing, this could indicate that social psychology upholds its dysfunctional culture (Nosek et al., 2022). Second, we estimated in an exploratory fashion how many articles published in social psychology journals dealt with replications, relative to all published articles, comparing articles published before and after 2015 (cf. Makel et al., 2012). We also examined the relationship between expressed support for replications on the journals’ websites and the number of replication-related articles published in these journals. This comparison can provide insight into whether any observed change in policy is associated with actual change in publishing replication-related research.
Third, we examined the general adoption of TOP-compliant policies (Nosek et al., 2015) in social psychology journals in 2022. This examination complements our data with an independent coding of current journal policies, and it provides an opportunity to characterize the policy landscape for replication in comparison with other transparency- and rigor-enhancing policies. Thus, the corresponding analyses show whether the development of expressed support for replication studies differs from the development of other open science policies.
Method
We preregistered this project on the OSF.2 The supplementary materials, including the full dataset, are publicly available (see Supplementary Materials).
Journal Selection
Our study compared the expressed support for replications in the policies of social psychology journals in 2015 and 2022. We analyzed the journals’ stated policies on their websites. For the data from 2022, we based our content analysis on all journals that were listed in the 2021 Journal Citation Reports (N = 65), for the journal category of “Psychology, Social.” For the data from 2015, we included all journals that were listed in the 2013 Journal Citation Reports in the same journal category. A summary of all included journals is provided in the Appendix. Journals were excluded from our analyses (a) if they exclusively published theoretical papers, meta-analyses, narrative reviews, or conference contributions (n = 4), (b) if they did not publish in English or German (n = 1), or (c) if they were not included in our dataset for 2015 (i.e., if they were not included in the 2013 Journal Citation Reports, n = 9). The final sample included N = 51 journals.
Coding Procedure
For our coding of expressed support for replications, we examined the following sections on the journals’ websites (i.e., an exhaustive coding): (a) aims and scopes, (b) author guidelines, (c) the general journal description, and (d) manuscript submission. We coded whether and how journals mentioned replications using one variable with four levels: discouraged, not mentioned, considered secondary, or welcomed (see Table 1).
Table 1
Coding of Replications
| Coding category |
|---|
| Replications are mentioned, considered as equally important as original research, and welcomed |
| Replications are mentioned, but considered secondary |
| Replications are not mentioned |
| Replications are mentioned, but are not welcomed or are even discouraged |
Note. “Considered secondary” means that journals mention replication studies, but consider them as secondary contributions and, for instance, publish them only online.
The data from 2015 were coded in January 2015 by two of the authors (κ = .873; two discrepancies were resolved through discussion). The data from 2022 were coded between February 7, 2022 and February 17, 2022. The first author and a second coder began by coding an initial set of five randomly selected journals. All discrepancies were discussed until consensus was reached. Afterwards, both coders coded all remaining journals independently. The resulting average interrater reliability was high (κ = .956; Landis & Koch, 1977). The one discrepancy that occurred was resolved through discussion. For exploratory purposes, we also collected (a) the journals’ impact factor, (b) the journals’ publisher, and (c) whether the journals were signatories of the Committee on Publication Ethics (COPE) or mentioned that their guidelines are based on the COPE standards, which include “principles of transparency and best practice in scholarly publishing” (COPE et al., 2018).
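To make the agreement statistic concrete: Cohen’s κ compares the observed proportion of agreement, p_o, with the agreement expected by chance from the coders’ marginal category frequencies, p_e, via κ = (p_o − p_e) / (1 − p_e). The following minimal Python sketch illustrates this computation; the ratings shown are hypothetical examples on the four-level policy variable, not our actual coding data.

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two raters: (p_o - p_e) / (1 - p_e)."""
    assert len(ratings_a) == len(ratings_b)
    n = len(ratings_a)
    # Observed agreement: proportion of items both raters coded identically.
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Chance agreement: computed from each rater's marginal category frequencies.
    freq_a, freq_b = Counter(ratings_a), Counter(ratings_b)
    categories = set(freq_a) | set(freq_b)
    p_e = sum((freq_a[c] / n) * (freq_b[c] / n) for c in categories)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical ratings on the four-level variable from Table 1.
coder_1 = ["welcomed", "not_mentioned", "not_mentioned", "secondary", "discouraged"]
coder_2 = ["welcomed", "not_mentioned", "welcomed", "secondary", "discouraged"]
print(round(cohens_kappa(coder_1, coder_2), 3))  # 0.737
```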
Results
Confirmatory Analyses
We examined the expressed support for replications in 2022 and compared it with 2015. Table 2 shows whether and how social psychology journals mentioned replications in 2015 and 2022. In 2022, most social psychology journals (36 of 51; 71%) did not mention replications at all. One journal (1 of 51; 2%; Organizational Behavior and Human Decision Processes) discouraged the submission of replication studies in its “submission guidelines” section as follows: “Significant contributions are less likely from research that merely replicates previous findings, revisits established findings using different samples or measures, or offers an incremental advancement to an existing body of knowledge” (information obtained in February 2022). Another journal (1 of 51; 2%; Journal of Personality and Social Psychology) mentioned replications in the coded sections, but considered them secondary contributions by indicating in its “guide for authors” section that it publishes them online only: “Replication manuscripts, if accepted, will be published online only and will be listed in the Table of Contents in the print journal” (information obtained in February 2022). The remaining journals that mentioned replications (13 of 51; 25%) expressed their support for them. However, even these journals mostly did not communicate such a policy in their aims and scopes sections, arguably the most prominent part of their websites: Only three of those 13 journals mentioned in their aims and scopes sections that replication studies were welcomed. One of those three journals specified that it focuses on original contributions, but is also open to replications (this instance could therefore also have been reasonably coded as “replications considered secondary”; see Table 1).
Table 2
Frequencies: Treatment of the Topic of Replications in Social Psychology Journals
| Coding category | N (2015) | % (2015) | N (2022) | % (2022) |
|---|---|---|---|---|
| Discouraged | 1 | 2 | 1 | 2 |
| Not Mentioned | 42 | 82 | 36 | 71 |
| Considered Secondary | 2 | 4 | 1 | 2 |
| Welcomed | 6 | 12 | 13 | 25 |
| Total | 51 | 100 | 51 | 100 |
In comparison to 2015, the number of journals that discouraged replications on their websites did not change. Six fewer journals did not mention replications on their websites in 2022 (36 in 2022 vs. 42 in 2015; see Table 2), and one fewer journal considered replications secondary (one in 2022 vs. two in 2015). Further, seven more journals welcomed replications on their websites in 2022 than in 2015 (13 vs. 6). Thus, the number of journals that mentioned openness to replications (i.e., welcomed or considered secondary) increased from 8 to 14, indicating that replications were supported more often in 2022 than in 2015, which is consistent with our hypothesis.3
Exploratory Analyses
As preregistered, we explored possible relationships between journals’ expressed support for replications in 2022 and (a) the journals’ impact factor, (b) whether the journals were a signatory of COPE, and (c) the journals’ publisher (e.g., Wiley or Taylor & Francis). We found a small, positive correlation between journals’ expressed support for replications (coded as follows: 0 = discouraged, 1 = not mentioned, 2 = considered secondary, 3 = welcomed) and the journals’ impact factor, Kendall’s τb = .14. Journals with a higher impact factor were slightly more likely to mention support for replications on their websites. Exploratory analyses with the COPE and publisher variables are reported in the Appendix.
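To illustrate this correlational analysis: Kendall’s τ-b is a natural choice here because the four-level policy variable produces many ties. A minimal sketch using SciPy’s implementation, with hypothetical journal values rather than our actual data:

```python
from scipy.stats import kendalltau

# Hypothetical journal-level data.
# Policy coding: 0 = discouraged, 1 = not mentioned, 2 = secondary, 3 = welcomed.
policy = [1, 1, 3, 0, 2, 3, 1, 1, 3, 1]
impact_factor = [2.1, 1.8, 4.5, 3.0, 2.7, 5.2, 1.5, 2.9, 3.8, 2.2]

# SciPy's kendalltau computes the tau-b variant by default,
# which corrects for ties in both variables.
tau, p_value = kendalltau(policy, impact_factor)
print(f"tau-b = {tau:.2f}, p = {p_value:.3f}")
```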
We next explored whether journals’ support for replications (vs. not) was associated with an increased rate of publishing replication-related research, and whether this rate differed before versus after 2015.4 In September 2022, using Web of Science, we searched the entire publication history of each of the 51 coded social psychology journals, which are part of the Journal Citation Reports, to identify (a) the total number of articles published and (b) the number of articles that contained the search term “replicat*” in the search category “topic” (i.e., any articles containing words with the stem “replicat” in their title, abstract, author keywords, or Keywords Plus; see Makel et al., 2012, for a first example of this approach). The replication rate of each journal, estimating the percentage of articles that discussed replications, was calculated as the number of articles containing “replicat*” divided by the total number of published articles. This analysis was conducted separately for two time periods (i.e., 1945–2015 and 2016–2022) to examine whether the replication rate changed over time (Table 3).
As we did not verify whether the 3,582 hits retrieved with the term “replicat*” were in fact replications (cf. Makel et al., 2012), our estimated replication rates are likely overestimates (i.e., they include false positives). Note that this search strategy also includes papers that discuss replication prominently but do not themselves report replication studies.
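The rate calculation itself is straightforward; the following sketch illustrates it with hypothetical per-journal hit counts (the actual pooled counts appear in Table 3):

```python
# Hypothetical Web of Science counts per journal and period:
# (total articles, articles matching "replicat*").
counts = {
    "1945-2015": {"Journal A": (1200, 40), "Journal B": (800, 20)},
    "2016-2022": {"Journal A": (400, 30), "Journal B": (300, 12)},
}

def replication_rate(period):
    """Pooled rate: articles containing "replicat*" divided by all articles."""
    total = sum(n_articles for n_articles, _ in counts[period].values())
    hits = sum(n_hits for _, n_hits in counts[period].values())
    return hits / total

early, late = replication_rate("1945-2015"), replication_rate("2016-2022")
relative_increase = (late - early) / early  # analogous to the 43% we report
print(f"{early:.1%} -> {late:.1%} ({relative_increase:.0%} increase)")
```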
Table 3
Articles That Mentioned “Replicat*” (N = 3,582)
| Journal group | N journals | N articles, 1945–2015 | N “replicat*”, 1945–2015 | Rate, 1945–2015 | N articles, 2016–2022 | N “replicat*”, 2016–2022 | Rate, 2016–2022 | Increase |
|---|---|---|---|---|---|---|---|---|
| Journals that discouraged/did not mention replications in 2015 and 2022 | 36 | 25,319 | 707 | 2.8% | 15,734 | 586 | 3.7% | 32% |
| Journals that considered replications secondary in 2015 and did not mention replications in 2022 | 1 | 1,085 | 50 | 4.6% | 570 | 52 | 9.1% | 97% |
| Journals that became supporters of replications | 7 | 13,658 | 578 | 4.2% | 7,402 | 407 | 5.5% | 30% |
| Journals that were supporters of replications in 2015 and 2022 | 7 | 11,398 | 673 | 5.9% | 4,489 | 529 | 11.8% | 100% |
| Total | 51 | 51,460 | 2,008 | 3.9% | 28,195 | 1,574 | 5.6% | 43% |
Note. “N journals” indicates the number of journals within each group. “N articles” indicates the total number of published articles. “N ‘replicat*’” indicates the number of articles that used the term “replicat*” in their title, abstract, or keywords. “Rate” is the estimated replication rate (i.e., the number of articles containing “replicat*” divided by the total number of articles). “Increase” is the relative change in the replication rate from the 1945–2015 period to the 2016–2022 period.
Overall, from 1945 to 2022, the term “replicat*” was used in 4.5% (3,582 of 79,655) of articles, with individual journals ranging from 0% to 22.3% (see the Supplementary Materials for journal-level data). However, in the period from 2016 to 2022, the term “replicat*” was used more often (5.6%; 1,574 of 28,195 articles) than in the period from 1945 to 2015 (3.9%; 2,008 of 51,460 articles), a 43% relative increase. Thus, usage of the term “replicat*” was more common across these journals after 2015 than before; however, the term was used in a relatively small proportion of all articles in both time periods.
To examine potential differences between journals expressing support versus no support for replications on their websites, we further estimated replication rates separately for the four groups of journals: Journals that (a) discouraged/did not mention replications in 2015 and 2022, (b) considered replications secondary in 2015 and did not mention replications in 2022, (c) became supporters of replications in 2022, and (d) were supporters of replications in 2015 and 2022. Interestingly, we observed that journals consistently expressing support for replications on their websites (group d), in fact, appeared to publish more articles dealing with replications than journals consistently not expressing support (group a). We found this pattern for both time periods, 1945–2015 (i.e., 5.9% vs. 2.8% for journals that did vs. did not express support for replications on their websites both in 2015 and 2022, respectively) and 2016–2022 (i.e., 11.8% vs. 3.7% for journals that did vs. did not express support for replications on their websites both in 2015 and 2022, respectively). Thus, journals that expressed support for replications on their websites appeared more likely to actually publish replications.
Further, the overall increase in the replication rate was present in all four groups of journals. However, journals that already supported replications in 2015 (group d) showed a steeper increase over time than journals that consistently discouraged/did not mention replications (group a) or journals that became new supporters of replications (group c): an increase of 100% vs. 32% vs. 30%, respectively.
As an additional exploratory analysis, we examined the adoption of TOP-compliant policies for replication and other open science practices (Nosek et al., 2015) in social psychology journals in 2022 by examining the journals’ TOP Factor. The director of policy at the Center for Open Science (COS), David Mellor, described the TOP Factor as “a modular set of indicators of journal policies to facilitate the visibility of good research practices” (COS, 2020). The TOP Factor is based primarily on the eight standards5 from the TOP guidelines (Nosek et al., 2015) and, thus, evaluates the degree to which journals support transparency and reproducibility. It assesses the degree to which a journal adopts each of the eight standards, as well as two other initiatives, Registered Reports (Chambers & Tzavella, 2022) and open science badges (Kidwell et al., 2016), on four increasing levels (i.e., Level 0 to Level III). The maximum score that can be achieved when a journal adopts all ten items at the highest level is a TOP Factor of 30. We collected the journals’ TOP Factor from the open database of TOP Factor codings (https://topfactor.org/). A TOP Factor rating was available for N = 40 of our included journals; the ratings for the remaining 11 journals were coded by the first author for the ensuing analysis.
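To make the scoring rule concrete: the TOP Factor is simply the sum of a journal’s adoption levels (0–3) across the ten rated items, so full adoption yields 10 × 3 = 30. A minimal sketch, with paraphrased item names and a hypothetical journal rating:

```python
# The eight TOP standards plus two additional initiatives (names paraphrased).
TOP_ITEMS = [
    "citation", "data_transparency", "code_transparency",
    "materials_transparency", "design_reporting",
    "study_preregistration", "analysis_preregistration", "replication",
    "registered_reports", "open_science_badges",
]

def top_factor(levels: dict[str, int]) -> int:
    """Sum of adoption levels (Level 0 to Level III) across all ten items."""
    assert all(0 <= levels.get(item, 0) <= 3 for item in TOP_ITEMS)
    return sum(levels.get(item, 0) for item in TOP_ITEMS)

# Hypothetical journal: some data policy, Registered Reports for replications.
example_journal = {"data_transparency": 2, "study_preregistration": 1,
                   "replication": 3, "registered_reports": 3}
print(top_factor(example_journal))  # 9 (out of a maximum of 30)
```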
On average, social psychology journals had a TOP Factor of M = 5.92 (SD = 6.56). The lowest observed TOP Factor was 0 and the highest was 23. Overall, 13 journals (25%) had a TOP Factor of 0 and 11 journals (22%) had a TOP Factor of 1. Three journals (6%) received a TOP Factor of 20 or above. Table 4 shows the journals’ TOP Factor ratings concerning replication.
Figure 1 offers a visualization of replication policy adoption in comparison with the other guidelines. Of the 51 included journals, 38 (75%) discouraged submission of replication studies or said nothing about it (i.e., TOP Factor level “Not Implemented”). Four journals (8%) encouraged submissions of replication studies (i.e., TOP Factor level I). One journal (2%) encouraged submissions of replication studies and conducted results-blind reviews (i.e., TOP Factor Level II). Finally, eight journals (16%) used Registered Reports as a submission option for replication studies with peer-review prior to observing the study outcomes (i.e., TOP Factor Level III). Taken together, the TOP Factor rating for replications (i.e., 75% not implemented) reveals a similar picture as our coding of the journals’ support for replications (i.e., 71% not mentioned).
Table 4
TOP Factor Scores of Social Psychology Journals for Replication
| TOP Factor level for replication | N | % |
|---|---|---|
| Not Implemented: Journal discourages submission of replication studies, or says nothing about it | 38 | 75 |
| Level I: Journal encourages submission of replication studies | 4 | 8 |
| Level II: Journal encourages submission of replication studies and conducts results-blind review | 1 | 2 |
| Level III: Journal uses Registered Reports as a submission option for replication studies, with peer review prior to observing the study outcomes | 8 | 16 |
| Total | 51 | 100 |
Note. Percentages may not total 100 due to rounding.
Figure 1
TOP Factor Scores of Social Psychology Journals
Note. Policies are contrasted with “Replication” as our reference point (depicted as the left-most bar). The remaining policies are ordered by the proportion of journals adopting the policy at any level. Visualization adapted from Nosek et al. (2022).
Finally, to explore whether newer journals express more support for replications and, thus, represent a separate and more appreciative “market” for replication studies, we coded the policies of several prominent psychological open access journals that were created near or after 2015 in response to the call to increase the rigor and transparency of research (i.e., Collabra: Psychology, Comprehensive Results in Social Psychology, Meta-Psychology, Personality Science, and Social Psychological Bulletin). All of these journals welcomed replications on their websites, and their average TOP Factor of 22.4 (range: 18 to 27) was well above the average of the social psychology journals included in our study. New journals created, in part, to improve the credibility of research thus have a strongly positive stance toward open science and replications specifically. As a group, they may represent an emerging alternative market for replications, because most of the more “traditional” journals still did not explicitly welcome replications.
Discussion
Theoretical Implications and Future Research
Our study examined whether the expressed support for replication studies in the policies of social psychology journals changed between 2015 and 2022. Our hypothesis was supported: Replications were welcomed more often in social psychology journals in 2022 than in 2015. However, explicit support for replications remains limited to a minority of journals. Most journals did not update their policies after 2015 and still did not mention replications at all on their websites. Further, even those journals that generally expressed support for replications mostly did not mention them as a possible publication option in their most visible sections (i.e., the aims and scopes).
Our exploratory search on the number of published replication studies suggests that journals that mention and support replications on their websites are in fact more likely to publish replications. Further, journals that expressed support for replications on their websites in both 2015 and 2022 showed a larger increase in their replication rate during that period than journals that started supporting replications sometime between 2015 and 2022, or that did not explicitly support replications in 2015 or 2022. Journals with a positive public stance toward replications in 2015 may have been particularly attractive to researchers interested in replication, and changed policies may take some time to bring about change in scholarly practice. Further, the analyzed group of mainstream social psychology journals has been comparatively slow to adapt its guidelines compared with journals such as Collabra: Psychology and Social Psychological Bulletin that emerged during this period with the aim, in part, of improving research credibility. The introduction of these journals is possibly a reaction to the slow changes of more traditional journals and appears to fulfill the extant need for journals that support the publication of replications. Altogether, our findings suggest that the field of social psychology may still be characterized by a “dysfunctional social culture” (Nosek et al., 2022, p. 733), in which replications are not considered ordinary scientific practice. This conclusion applies to most traditional journals, but it is qualified by some policy change, by more discussion of replication across all journals regardless of policy, and by the emergence of more progressive journals promoting strong open practices, including replication.
Our exploratory analyses of the TOP Factor suggest that journals adopt language supporting replication at rates similar to other rigor- and transparency-promoting policies, such as data sharing or preregistration. Many journals have adopted some TOP-compliant policies, but a majority still have no or minimal such policies. As such, the adoption of TOP-compliant policies is still proceeding slowly. To better understand and change this state of affairs, future research could investigate journal publishers’, editors’, and editorial boards’ reasons for not actively supporting replications, among other open science practices, in their policies.
Practical Implications
If journals, as important gatekeepers, do not support and incentivize replications, then researchers may simply not conduct them. As a result, the actual “latent” crisis underlying the “manifest” crisis—a field that does not value replications, resulting in a lack of replication studies—cannot be addressed. In turn, the insufficient appreciation of replications threatens to leave unclear which of our scientific findings are reliable and trustworthy. Thus, the field (i.e., researchers, reviewers, editors, journals, publishers) may need to reconsider, and more strongly emphasize, the role played by replications in supporting its renaissance as a stronger field (Nelson et al., 2018).
Limitations
Our study has a number of limitations. For instance, we interpreted missing statements about replications as journals not supporting and valuing replications, which may overstate the extent of the problem. However, if a journal does not mention replications on its website at all, researchers are at least not encouraged to conduct and submit a replication to this journal, as it remains unclear whether a replication has a good chance of being published. This lack of communication about replications is noteworthy because journals often mention welcomed article types (e.g., reviews, meta-analyses, or quantitative and qualitative studies). Further, our additional search for articles containing the term “replicat*” in the title, abstract, or keywords (e.g., Makel et al., 2012) suggests that there is a relationship between expressed support for replications on social psychology journals’ websites and how often replications were mentioned in their published articles. Thus, missing statements about replications seem to be associated with fewer published replications. Still, there is some ambiguity in the interpretation of missing statements about replications.
The appearance of the term “replicat*” in a paper’s title, abstract, or keywords does not mean that the paper itself included a replication study. It is plausible that use of the term is associated with the likelihood that replication studies are part of the paper, but future research could distinguish between talking about and actually conducting replication research.
The TOP Factor ratings we used for our exploratory analysis are based on individual codings by community members. Different researchers can complete a journal evaluation form on the TOP Factor website, which is then reviewed and added to the database. Therefore, the TOP Factor scores for different journals were recorded at different times (ranging from December 2019 to September 2022 for our sample), and journals may have updated their standards in the meantime. However, our own coding of the journals’ replication policies in 2022 is not affected by this limitation and paints a very similar picture of the journals’ policies.
Conclusion
Our findings underscore the slow change toward a replication culture in a substantial portion of mainstream social psychology journals. We observed a modest change over time in their expressed support for replications and in the amount of prominent discussion of replication in published articles. Overall, these findings reveal an enduring lack of encouragement as communicated by current journal policies. Thus, the field’s norms and practices regarding replications, and other open science practices, do not seem to have changed substantially in the last decade—replications still appear to be far from becoming mainstream.