Pre-registered replications & metas

Quick jump to sections: Background / Get involved / Our team / Media attention / Project summary / 2017-8 replications / 2018-9 course replications / 2018-9 guided theses replications / Planned for 2019-20 / Registered Reports / Open Science Initiatives

 

Background

In 2016, following recent developments in psychological science (the so-called “replication/reproducibility crisis”) and gaining my academic independence, I decided to make major changes to my research agenda: to prioritize pre-registered replications and pre-registered meta-analyses, and to focus on the realm of judgment and decision making. The aim was to revisit research findings I once took for granted and re-establish the foundations on which I hope to build my research. I therefore decided that all my teaching and mentoring work with guided thesis students would involve either pre-registered replications or pre-registered meta-analyses, at least as a first step, to examine the classics in the field.

In 2017 I guided 3 master’s students at Maastricht University to pre-test this realignment. It far exceeded my expectations. We completed 3 pre-registered replications, 3 pre-registered meta-analyses, and one review paper summarizing the insights gained. After joining HKU in December 2017, I decided to scale up, mass-mobilize HKU’s undergraduate students, and lead a massive pre-registered replication effort. In the first year (two semesters) of running this project, we successfully completed 45 replication projects, making this one of the largest replication efforts in social psychology. Each replication project has a full pre-registration, open data and code, and a submission-ready student report written in APA style. In the second semester, most of the replications also included extensions with interesting contributions and insights.

I will continue running this in the 2019-20 academic year with 20+ new replications and extensions. If any of this is of interest to you, there are many ways to join in. I am looking for interested early-career researchers to join us; see more information below.

 

Get involved

You’re invited to:

  1. Read reports, browse open data and code from our mass pre-registered JDM replications project
  2. Browse through sample reports and outputs – aren’t these students amazing?
  3. Read, use, and/or contribute to our collaborative pre-registered replication projects guide.
  4. Read, use, and/or contribute to our collaborative R/JAMOVI/JASP guide.
  5. Read, use, and/or contribute to our collaborative replication extensions guide.
  6. Read, use, and/or contribute to our collaborative Effect size, confidence intervals, and power analyses guide (see the sketch after this list).
  7. Browse and/or use our replications resource collection cloud folder.
  8. Keep track of the project on ResearchGate.
  9. Join our Slack workspace.
  10. Watch videos about the science replication crisis.
  11. Join the mailing list for events related to this project and open-science.
  12. If you are an Early Career Researcher (advanced PhD student, postdoc, or early assistant professor) – collaborate with us! (jump to the section “Collaborations on academic submissions”). Take the lead-author role on one of our completed projects and help us submit the high-quality student reports to journals. Choose among the “still seeking collaborators” projects listed below, and email me.
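
To give a flavor of what the effect size, confidence intervals, and power analyses guide covers, here is a minimal sketch in R using the pwr package (the effect size and power target are hypothetical placeholders, not values from any specific project) of an a priori power analysis for a simple two-group replication:

    # Minimal sketch: a priori power analysis for a two-group replication.
    # The numbers are hypothetical placeholders, not from any specific project.
    library(pwr)

    # Suppose the original study reported Cohen's d = 0.40; a common heuristic is to
    # power the replication to detect a smaller effect (e.g., 75% of the original).
    target_d <- 0.40 * 0.75

    # Required sample size per group for 95% power, alpha = .05, two-tailed test:
    pwr.t.test(d = target_d, power = 0.95, sig.level = 0.05,
               type = "two.sample", alternative = "two.sided")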

 

Our team

There are currently 11 Early Career Researcher collaborators who have joined us to finalize students’ reports for journal submission:

  1. 2 US: Paul Henne & Andrew Smith
  2. 1 Hong Kong: Prasad Chandrashekar (2 submitted)
  3. 1 Canada: Jieying Chen (1 submitted)
  4. 1 France: Ignazio Ziano (1 submitted)
  5. 1 Netherlands: Tony Evans
  6. 1 UK: Paul Hanel
  7. 1 Norway: Hallgeir Sjåstad
  8. 1 New Zealand: Andrew Vonasch
  9. 1 Singapore: Mansur Khamitov
  10. 1 Sweden: Burak Tunca

See collaborator names next to the projects listed below.

Students so far involved in this project:

  • 70 undergraduates PSYC2020 Fundamentals of Social Psychology 2017-8 Spring course
  • 25 undergraduates PSYC3052 Advanced Social Psychology 2017-8 Spring course
  • 25 undergraduates PSYC3052 Advanced Social Psychology 2018-9 Autumn course
  • 27 undergraduates PSYC2071 Judgment and Decision Making 2018-9 Autumn course
  • 3 guided Maastricht University master’s thesis students in the year 2017
  • 4 guided HKU undergraduate thesis students in the year 2018-9
  • 2 guided HKU master’s thesis students in the year 2018-9

Teaching assistants:

  • 4 teaching assistants in PSYC2020 courses
  • 2 teaching assistants in PSYC3052/2017 courses

Big thanks to all involved students and teaching assistants.

 

Funding

  • The first semester we ran this, in Spring 2017-8, was supported with 2,000 euros by an EASP Seedcorn grant (2018).
  • The second semester we ran this, in Autumn 2018-9, was funded by Gilad Feldman’s HKU new faculty seed fund (~60,000 HKD).
  • Data collection in Autumn 2019-20 is supported by an HKU teaching development grant (~250,000 HKD).

 

Media attention

Media mentions of the project or related outputs:

  1. Bias Blind Spot replication
  2. Alicke 1985 replication: better than average effect

 

Project summary

WARNING: These are preliminary, student-summarized findings that still need to be rechecked and verified.

I summarized the findings in a poster presented in summer 2019.

Portrait version for SIPS2019:

2019 mass pre-registered replication project status update portrait poster

Download the poster: PDF / JPG

Landscape earlier version:

2019 mass pre-registered replication project status update poster

Download the poster: PDF / JPG / Rmarkdown source

 

2017 Maastricht and 2017-8 HKU spring semester

Successful replications

  1. Action-effect (Kahneman & Tversky, 1982): Replicated several times (> 8). [multiple projects concluded by Gilad Feldman; sample publication]
  2. Inaction effect (Zeelenberg et al., 2002): Replicated Experiment 1 several times (> 4). [multiple projects concluded by Gilad Feldman; sample preprint]
  3. Omission bias (Spranca, Minsk, & Baron, 1991): Replicated two scenarios from Experiment 1 [project concluded by Tijen Yay; preprint]
  4. Exceptionality effect (Kahneman & Miller, 1986): Replicated two experiments (hitchhiker and car accident scenarios). [project concluded by Lucas Kutscher; publication]
  5. Exceptionality effect (Seta et al., 2001): Replicated 3 times. [concluded by Gilad Feldman; preprint]
  6. Name letter effect (Nuttin, 1987): Replicated the main experiment. [project led by Ignazio Ziano; preprint]
  7. Endowment effect & transaction demand (Mandel, 2002): Replicated Experiment 1 [project led by Ignazio Ziano; preprint]
  8. Bias blind spot (Pronin et al., 2002): Replicated Experiments 1b and 2 [project concluded by Subramanya Prasad CHANDRASHEKAR; preprint]
  9. Actor-observer bias in free will attributions (Pronin et al., 2010): Replicated twice in US/HK [writeup led by Hallgeir Sjåstad]
  10. Preference for indirect harm (Royzman & Baron, 2002): Replicated Experiment 2 twice (US/HK) and Experiment 3 once (HK). [project concluded by Ignazio Ziano; preprint]
  11. Inaction inertia (Tykocinski et al., 1995): Replicated Experiment 1 twice in US/HK samples. [project completed by Jieying Chen; preprint]
  12. Status quo bias (Samuelson & Zeckhauser, 1988): Replicated 4 scenarios from Experiment 1 (2 of the 4 ran twice) [writeup led by Qinyu Xiao]

 

Still seeking collaborators:
  1. Escalation of commitment (Arkes & Blumer, 1985): Replicated Experiments 1 and 4 twice (US/HK). [can be combined with escalation of commitment that ran in the second semester, and perhaps with the anticipated regret thesis project]
  2. Bias blind spot (Pronin & Kugler, 2007): Replicated twice in US/HK, in at least 2 of 3 categories

 

Semi-successful replications

  1. Exceptionality effect (Miller & McFarland, 1986): Replicated twice using a regret DV, but not using the original compensation DV. [project concluded by Lucas Kutscher; publication]
  2. Doing/allowing morality asymmetry (Cushman et al., 2008): Replicated Experiment 1 in the US but less so in a small, underpowered HK sample. [writeup led by Mansur Khamitov]

Inconclusive

  1. Force-Intention in moral judgment (Greene et al., 2009): Unsuccessful in replicating Experiment 1b in US MTurk samples, and inconclusive in the HK sample (medium effect size, but the sample was underpowered to detect it; see the power sketch below). [Joined the Psychological Science Accelerator to follow up on this in a mass collaboration] [writeup led by Mansur Khamitov]
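
As a rough illustration of what “underpowered” means here, below is a minimal sketch in R using the pwr package; the sample size and effect size are hypothetical placeholders, not the actual values from this replication:

    # Hypothetical illustration of an underpowered test (placeholder n, not the actual study).
    library(pwr)

    # Power to detect a medium effect (d = 0.50) with 40 participants per group:
    pwr.t.test(n = 40, d = 0.50, sig.level = 0.05, type = "two.sample")
    # power comes out at roughly 0.60, i.e., about a 40% chance of missing a true medium effect

    # Smallest effect detectable with 80% power at this sample size:
    pwr.t.test(n = 40, power = 0.80, sig.level = 0.05, type = "two.sample")
    # d of roughly 0.63, i.e., larger than a conventional medium effect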

 

Unsuccessful replications; need to be revisited further

  1. Endowment effect and goal relevance (Irmak, Wakslak, & Trope, 2013): Unsuccessful in replicating the second experiment. [project led by Ignazio Ziano; preprint]

 

Still seeking collaborators to turn these into Registered Reports
  1. Actor-observer bias (Pronin et al., 2007): Unsuccessful in replicating Experiment 1 twice (US/HK)
  2. Anchoring effect by framing (Wong & Kwong, 2000): Unsuccessful in replicating twice, in US and HK samples. Very likely due to culture/language/translation issues. [very messy methodology and findings; postponed]
  3. Folk intentionality (Malle & Knobe, 1997): Twice (US/HK) found an effect when none was expected (actor-observer asymmetry). [there is a high-quality written manuscript; needs someone to follow through to publication]

 

 

HKU 2018-9 autumn semester

Note: Most of the projects from this semester are replications that include terrific extensions.

  1. Conjunction effect (Mellers, Hertwig, & Kahneman, 2001): Experiments 1-3 integrated design [project completed by Subramanya Prasad CHANDRASHEKAR; preprint]
  2. First instinct fallacy (Kruger, Wirtz & Miller 2005): Experiment 2 [writeup led by Paul Henne]
  3. Less is better (Hsee, 1998): Studies 1, 2, and 4 [writeup led by Andrew Vonasch]
  4. Disjunction effect (Tversky & Shafir, 1992): Experiment 1 [writeup led by Ignazio Ziano]
  5. Effort heuristic (Kruger et al., 2004): Combined Experiments 1-2 [writeup led by Tony Evans]
  6. Pluralistic ignorance (Miller & McFarland, 1987): Experiment 1 [writeup led by Paul Hanel]
  7. Money illusion (Shafir, Diamond, & Tversky, 1997): Problems 1-4 [writeup led by Ignazio Ziano]
  8. Choosing versus rejecting (Shafir, 1993): All problems in the paper [writeup led by Subramanya Prasad CHANDRASHEKAR]
  9. Hindsight bias (Slovic & Fischhoff, 1977): Experiment 1 [writeup led by Jieying Chen]
  10. Hindsight bias (Fischhoff, 1975): Experiment 2 [writeup led by Jieying Chen]
  11. Anchoring-and-adjustment heuristic (Epley & Gilovich, 2006): Study 1b [very messy methodology and findings; failed, but hard to tell why. The writeup will aim to turn this into a Registered Report.] [writeup led by Andrew Smith]

 

Still seeking collaborators


These should be very straightforward…

  1. Outcome bias (Baron & Hershey, 1988): Experiment 1 [Successful]
  2. Fundamental predictor error (Hsee & Weber, 1997): Experiment 1 [Successful]
  3. Insensitivity to sample bias (Hamill, Wilson, & Nisbett, 1980): Study 1 [Inconclusive]
  4. Irrational reactions to negative outcomes (Epstein, Lipson, Holstein, & Huh, 1992): Combining Studies 1 and 2 [Mostly successful replication]

These are a bit trickier; one possible strategy is to submit them as a Registered Report.

  1. Escalation of commitment (Staw, 1976): Study 1 [Inconclusive, likely a failed replication; can be combined with Arkes & Blumer, 1985 from the 2017-8 semester]
  2. Relevance of irrelevant information (Schwarz, Strack, Hilton, & Naderer, 1991): Experiment 1 [Failed replication]
  3. Regret aversion (Zeelenberg et al., 1996): Experiment 1

 

2018-9 guided thesis/internship students

Completed

These are outstanding theses by guided students, and all include terrific extensions. They feature the most comprehensive pre-registrations I have ever read, with work exceeding PhD level. I took an active part throughout the whole process. These are very close to submission.

  1. Cognitive-experiential self-theory model (Epstein, Denes-Raj, & Pacini, 1995) [Papara] [successful] [preprint] [writeup led by Subramanya Prasad CHANDRASHEKAR]
  2. Past-future asymmetry (Caruso, Gilbert, & Wilson, 2008): Experiments 1 and 4 [Florence] [failed replication, to be submitted as a Registered Report] (preprint) [writeup led by Burak Tunca]
  3. Global Self-Evaluation, Desirability and Controllability (Alicke, 1985, JPSP) [Cora] [mostly successful] (preprint) [writeup led by Ignazio Ziano]

 

Still seeking collaborators:
  1. Anticipated regret and escalation of commitment (Wong & Kwong, 2007) [Rachel] [mixed; some things replicated, others didn’t] (preprint)
  2. Counterfactuals, causal attributions, and the hindsight bias: A conceptual integration (Roese & Olson, 1996, JESP) [Roxane] [mixed] (preprint)
  3. Disjunction Bias (Hsee & Zhang, 2004, JPSP): Experiments 2 and 3 [Reanna] [mostly successful] (preprint)

 

In process

  1. Decoy effect (Ariely & Wallsten, 1995, OBHDP) [Qinyu Xiao]
  2. Decoy effect (Connolly, Reb, & Kausel, 2013, JDM) [Qinyu Xiao]

 

Planned replications for academic year 2019-20

If you’ve been invited to consider this list, please choose among those not suggested or picked by others:

The following replications (14 studies in total; articles with 2 studies replicated are indicated with *) will be conducted in PSYC3052 in Autumn 2019-20:

  1. Unrealistic optimism (Weinstein, 1980) – 5693 citations [TA – Ziqing]
  2. Subjective Probability: A Judgment of Representativeness (*Kahneman & Tversky, 1972, Cognitive Psychology) – 5228 citations
  3. The “false consensus effect”: An egocentric bias in social perception and attribution processes (*Ross et al, 1977, JESP) – 2939 citations
  4. The affect heuristic in judgments of risks and benefits (Study 2; Finucane et al., 2000, JBDM) – 2691 citations [peer review – Emir Efendic]
  5. Defaults, Framing and Privacy: Why Opting In-Opting Out (Johnson, Bellman, & Lohse, 2002, ML) – 333 citations
    1. with Do defaults save lives? Default bias (Johnson & Goldstein, 2003, Science) – 1740 citations
  6. Knowing with Certainty: Appropriateness of Extreme Confidence (Fischhoff, Slovic, & Lichtenstein, 1977, JEPG) – 1731 citations
  7. Are humans good intuitive statisticians after all? Rethinking some conclusions from the literature on judgment under uncertainty (Cosmides & Tooby, 1996, Cognition) – 1460 citations [PDF]
  8. Social Utility and Decision Making in Interpersonal Contexts (Loewenstein, Thompson, & Bazerman, 1989, JPSP) – 1316 citations
  9. It won’t happen to me: Unrealistic optimism or illusion of control? (McKenna, 1993, BJP) [TA – Ziqing]
  10. Effort for payment: A tale of two markets (Heyman & Ariely, 2004, PS)
  11. You don’t know me, but I know you: The illusion of asymmetric insight (Pronin et al., 2001, JPSP)
  12. The disparity between the actual and assumed power of self-interest (Miller & Ratner, 1998, JPSP)

 

These are open for thesis students and interns:

  1. Discounting Future Green: Money Versus the Environment (Hardisty & Weber, 2009, JEP:G) – 285 citations [peer review – Emir Efendic] (materials available from the author’s website: [Study 1 materials / Study 2 materials / Study 3 materials / Study 1 data / Study 2 data / Study 3 data])
  2. The base-rate fallacy in probability judgments (Bar-Hillel, 1980) – 1267 citations
  3. To Do or to Have? That Is the Question (Van Boven & Gilovich, 2003, JPSP) – 1015 citations (PDF)
  4. Self–other judgments and perceived vulnerability to victimization (Perloff & Fetzer, 1986) – 867 citations
  5. Confirmation bias in sequential information search after preliminary decisions: an expansion of dissonance theoretical research on selective exposure to information (Jonas, Schulz-Hardt, Frey, & Thelen, 2001, JPSP) – 659 citations
  6. Crime and punishment: Distinguishing the roles of causal and intentional analyses in moral judgment (Cushman, 2008, Cognition) – 620 citations [PDF]
  7. From chump to champ: People’s appraisals of their earlier and present selves (Wilson & Ross, 2001, JPSP) – 553 citations [PDF]
  8. The temporal pattern to the experience of regret (Gilovich & Medvec, 1994, JPSP) – 478 citations [PDF]
  9. Disparity between the actual and assumed power of self-interest (Miller & Ratner, 1998, JPSP) – 425 citations [PDF]
  10. Effect of temporal perspective on subjective confidence (Gilovich et al., 1993, JPSP) – 337 citations [PDF]
  11. A Dirty Word or a Dirty World?: Attribute Framing, Political Affiliation, and Query Theory (Hardisty, Johnson, & Weber, 2010, Psychological Science) – 302 citations
  12. Looking forward, looking back: Anticipation is more evocative than retrospection (Van Boven & Ashworth, 2007, JEPG) – 225 citations
  13. The Peculiar Longevity of Things Not So Bad (Gilbert et al., 2004, Psychological Science) – 223 citations.
  14. Misuse of useless information (Bastardi & Shafir 1998, JPSP) – 204 citations.
  15. Partitioning default effects: why people choose not to choose (Dinner et al., 2011, JEPG) – 190 citations
  16. Outcome feedback: Hindsight and information (Hoch & Roediger, 1989, JEP:LMC) – 187 citations
  17. Words of estimative probability (Kent, 1964, CIA US government) | (WIKI) – 154 citations but has had quite an impact
  18. Illusion of asymmetric insight (Pronin et al., 2001, JPSP) – 133 citations [PDF]
  19. On the framing of medical decisions (McNeil, Pauker, & Tversky, 1988) – cited 120 times, one of the only joint evaluations of framing effects.

 

These are available for thesis students who want more challenging targets (some from marketing and economics):

  1. Gambling with the house money and trying to break even: The effects of prior outcomes on risky choice (Thaler & Johnson, 1990, Management Science) – 2500+ citations
  2. Affect, generalization, and the perception of risk (Johnson & Tversky, 1983, JPSP) – 2000+ citations (pdf)
  3. Compromise effect: Choice Based on Reasons: The Case of Attraction and Compromise Effects (Simonson, 1989, JCR) – 1878 citations.
  4. Retrievability: Judged Frequency of Lethal Events (Lichtenstein et al., 1978) – 1798 citations
  5. Being Better but not Smarter than Others: The Muhammad Ali Effect (Allison, Messick, & Goethals, 1989, SC) – 365 citations

 

Already picked by prospective thesis students:

  1. What is beautiful is good (Dion, Berscheid, & Walster, 1972, JPSP) – 4210 citations [picked by Samson]
  2. Confirmation Bias: Reasons for confidence (Koriat, Lichtenstein, & Fischhoff, 1980) – 1662 citations [picked by Tai Yik Long]
  3. Lake Wobegon be gone! The “below-average effect” and the egocentric nature of comparative ability judgments (Kruger, 1999, JPSP) – 986 citations [picked by Isabelle]
  4. Why it won’t happen to me: perceptions of risk factors and susceptibility (Weinstein, 1983) – 1253 citations [picked by Leo Chan]
  5. The role of conscious reasoning and intuition in moral judgment: Testing three principles of harm (Cushman, Young, & Hauser, 2006, Psychological science) – 1056 citations. [picked by Michelle Cheng]
  6. Advice taking in decision making: Egocentric discounting and reputation formation (Yaniv, & Kleinberger, 2004, OBHDP) [suggested by Shoham Choshen-Hillel] – 513 citations [picked by Cherry Lau]
  7. Exploring the “planning fallacy”: Why people underestimate their task completion times (Buehler et al., 1994, JPSP) – 1293 citations [picked by Danny]
    1. combined with: Are we all less risky and more skillful than our fellow drivers? (Svenson, 1981, Acta Psychologica) [PDF] – 2112 citations (short, needs to be combined with something else) [picked by Danny]

 

Related initiatives by others

I’ve recently become aware of similar initiatives conducting mass replications of JDM findings:

  1. The Hagen Cumulative Science Project
    1. How to Teach Open Science Principles in the Undergraduate Curriculum (preprint)
  2. Replicability and Reproducibility of heuristics and biases in Judgment and Decision Making (RR-JDM) @ Linköping University, by the JEDI lab

 

Other Registered Report initiatives

Projects by myself together with collaborators and students:

  1. Replication Registered Report: Cheerleader effect – Hierarchical Encoding Makes Individuals in a Group Seem More Attractive (Walker & Vul, 2014, Psychological Science) – 55 citations [led by Maria Sophia Heering with Stefano Livi]
  2. Replication Registered Report: Manipulations of Emotional Context Shape Moral Judgment (Valdesolo & DeSteno, 2006, Psychological Science) [led by Raluca Diana Szekely-Copîndean]
  3. Meta-analysis Registered Report: agency constructs and free will beliefs [with Krishna Savani and NTU student team]
  4. Meta-analysis Registered Report: free will beliefs and outcomes [with Krishna Savani and NTU student team]
  5. Meta-analysis Registered Report: values and the dark triad [with Velvetina Lim]

 

Many-labs type collaborative open-science projects:

  1. Collaborative multi-lab Registered Report: Accelerated CREP – RRR: Turri, Buckwalter, & Blouw (2015) [with Jiaxin Bill Shi and international collaboration] (In Principle Acceptance at AMPPS)
  2. Collaborative multi-lab Registered Report: PSA 006 Moral thinking across the world: Exploring the influence of personal force and intention in moral dilemma judgments
    [with Jiaxin Bill Shi and international collaboration] (reviewed at Nature Human Behaviour)
  3. Collaborative multi-lab Registered Report: STRAEQ-2: Development and Validation of the Social Thermoregulation, Risk Avoidance, and Eating Questionnaire – 2 (Proposal / OSF / OSF Wiki) [with Jiaxin Bill Shi and international collaboration] (in initial stages)

Open Science Initiatives

I’ve joined several related open-science initiatives. You’re welcome to join as well; email me if you’re interested in more information:

  1. Responding to reviewers and editors: Collaborative database (Contribute using form or table edit)
  2. Using replication as a teaching tool in the classroom
  3. Preregistration Planning and Deviation Documentation (PPDD) (APS hackathon: The Space between Pre-Registration and Publication: Deviation Documentation)
  4. Taking stock of the credibility revolution: Scientific reform 2011-now (started in this SIPS hackathon: Creating (and Mapping) the History of Scientific Reform; collaborative mapping of publications)
  5. Helping researchers identify their smallest effect size of interest (replications part)
  6. Resources for Learning (and Teaching) How to Conduct Meta-Analyses
  7. Mapping degrees of freedom in systematic review
  8. The Hidden Academia (gaming the system practices)
  9. Open psychological datasets (OSF page)

 

Also keeping an eye on and hoping to get more involved with:

  1. Framework for Open and Reproducible Research Training (FORRT) [XLS]
  2. A Crowdsourced Effort to Develop a Lab Manual Template for Social and Behavioural Scientists
  3. QRP Reviewer Guidelines / QRP reading list
  4. Building an open science knowledge base
  5. Contributorship Guidelines / Authorship accountability
  6. Analytic reproducibility
  7. List of Process of Systematic Review/Meta-Analysis (with some Best Practices)

For more initiatives, see the PSA Meta-research hub.