
Verified by Psychology Today


Is Anchoring and Adjustment the Best-Replicated Finding?

Insufficient adjustment from an anchor when estimating numbers has replicated very well.

Key points

  • When people use a number they know (an anchor) to estimate an unknown number, their estimate stays too close to the anchor.
  • This finding has been very well replicated and could be considered social psychology's best-replicated finding.
  • The finding originates in behavioral economics and in judgment and decision-making research.

In a previous post, I invited scholars to nominate what they think is the best-replicated finding in social psychology. Lukas Röseler, of the University of Bamberg, wrote me to make a case for anchoring and (insufficient) adjustment. It is a very good case.

The Anchor

More precisely, the finding is this: When people have to estimate some number they do not know, they sometimes start with a number they do know. The number they know is the anchor; they then adjust away from it to estimate the unknown number. The finding is that the adjustments fall short, so the estimate ends up closer to the anchor than it should be.

It is how real estate appraisers work. What is the value of your house? It is impossible to know in advance, precisely, what someone will pay for it. No two houses are identical, and, in complicated neighborhoods, they differ in many ways. To estimate it, the appraiser starts with some definite numbers, such as how much other houses, similar and nearby, sold for. And then the appraiser makes adjustments, boosting the price if the garage is bigger or the view better, reducing it if there are fewer bathrooms, and so on. Even so, their estimates tend to be distorted toward the anchor. If you want a higher value for your house, make sure the appraiser starts by comparing it with houses that are more expensive!

In studies, the experimenter provides the anchor. Sometimes it is made obviously arbitrary, such as each participant starting with the last four digits of their social security number, or spinning a wheel to get a random number. Even so, the final estimate is biased to be too close to the anchor.
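The mechanism can be sketched in a few lines of code. This is a toy model, not anything from the studies cited: the function, the adjustment rate of 0.5, and the anchor values are all hypothetical, chosen only to show how an estimate that covers part of the distance from anchor to truth stays biased toward the anchor.

```python
def biased_estimate(anchor, true_value, adjustment_rate=0.5):
    """Toy model of anchoring and insufficient adjustment.

    The estimator starts at the anchor and covers only a fraction
    (adjustment_rate < 1) of the distance to the true value, so the
    final estimate remains pulled toward the anchor. The rate of 0.5
    is an arbitrary illustrative parameter, not an empirical figure.
    """
    return anchor + adjustment_rate * (true_value - anchor)

# Hypothetical task: estimate the length of the Mississippi River
# (roughly 2,340 miles), starting from a low or a high anchor.
true_length = 2340
low_anchor_estimate = biased_estimate(anchor=500, true_value=true_length)
high_anchor_estimate = biased_estimate(anchor=5000, true_value=true_length)

# A low anchor drags the estimate down; a high anchor pulls it up.
print(low_anchor_estimate)   # 1420.0
print(high_anchor_estimate)  # 3670.0
```

In this sketch the two estimates bracket the true value from opposite sides, which is exactly the signature anchoring studies look for: participants given different arbitrary starting numbers end up with systematically different final answers.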

Röseler documents that this effect has been found many times, in many contexts. There are currently six meta-analyses, all of which found large and robust evidence for the effect. Five of them do not overlap, insofar as they focus on different areas of study, such as legal decision-making (Bystranowski), how much consumers are willing to pay for a product (Li), and negotiations (Orr). Röseler’s own meta-analysis examined openly available data sets that offered a standard test of the basic idea. He adds that his meta-analysis found the effect sizes were about the same in unpublished studies as in published ones, which is quite rare for meta-analyses, thus further strengthening the case.

Multiplicity of Methods

Multiplicity of methods is also evident, at least in what is being estimated. Studies done in the United States might have people estimate how long the Mississippi River is, but asking participants in Asia to estimate an American river would likely yield more variance. Multiplicity of labs is another strength of this research domain, as many different researchers have found these effects.

Another criterion I proposed for the best-replicated finding was preregistration. Again, the anchoring-adjustment literature looks strong. Many recent studies are preregistered.

Another criterion for something to be considered a best-replicated finding was success at multi-site replication. Anchoring was included in the first “Many Labs” project (Klein et al., 2014), and it was quite successful. Indeed, four of the 12 mini-studies included in that project were anchoring-adjustment studies, and all four found significant support. (They yielded four of the five largest effects in the project.)

Note: I call these "mini-studies" because they took only a minute or two for the participant to complete. Old-school social psychologists might say it is unfair to compare a two-minute study based on one rating with the traditional sort of social psychology experiment that crafted a longer, fuller experience for each participant. The constraints and contingencies are quite different. For example, each participant in Klein’s project did all 12 of the studies in the same sitting, whereas, with the full-length studies, researchers typically sought participants who had little or no prior participation experience. This is why my review of the multi-site studies kept the mini-studies in a separate category from the full-length ones (and my previous post focused on the latter). It may be easier to do a two-minute experiment than, say, the Milgram obedience study. Still, ease of doing the experiment is hardly a disqualifying factor.

Judgment and Decision-Making

I suppose some people will question whether anchoring and adjustment counts as social psychology. It is, after all, just a mistake people make when estimating numbers, which is hardly a social thing. I have long pushed for including judgment and decision-making (JDM) in social psychology, such as by recruiting a chapter on it for my graduate textbook. Anchoring and adjustment are also covered in my undergraduate social psychology textbook, and I suspect in most other textbooks as well, so I am inclined to count it as social psychology, even if borrowed from JDM. To be sure, it is slightly embarrassing that some of social psychology’s best replication successes come from work borrowed from other fields. Of the 35 full-length studies we reviewed in surveying the multi-site replication literature, the most successful one was presented and published as applied cognitive psychology: Ito et al. (2019) replicated not only the finding but even the effect size of the finding that eyewitnesses who discuss the event can influence each other’s testimony in a lab simulation.

Nevertheless, the finding of inadequate adjustment after anchoring fits all the criteria for replication success. It has a strong case to be considered the best-replicated finding in social psychology.

Once again, I invite knowledgeable researchers to nominate other findings as plausible candidates for social psychology’s “best-replicated” status. Let’s celebrate our field’s successes!

Note: Dr. Röseler invites researchers to use his website, which has results from 94 studies available for dynamic meta-analysis.


Röseler, L., Weber, L., Stich, E., Günther, M., Helgerth, K. A. C., Wagner, F. S., Antunovic, M., Bahník, S., Barrera, F., Baumeister, R. F., Bermeitinger, C., Bickenbach, S. L. C., Blank, P. A., Blower, F. B. N., Bögler, H. L., Boo, F. L., Boruchowicz, C., Bühler, R. L., Burgmer, P., Cheek, N. N., Dohle, S., Dorsch, L., Dück, M. S., Halali, E., Fels, S.-A., Fischer, A. L., Frech, M.-L., Freira, L., Friedinger, K., Genschow, O., Harris, A., Häusser, J. A., Hedgebeth, M., Henkel, M., Horvath, D., Hügel, J. C., Igna, E. L. E., Imhoff, R., Intelmann, P., Ioannidis, K., Karg, A. H., Klamar, A., Klein, C., Klusmann, B., Knappe, E., Köppel, L.-M., Koßmann, L., Kraft, P., Kroworsch, M. K., Krueger, S. M., Kühling, S., Lagator, S., Lammers, J., Loschelder, D. D., Milstein, N., Molden, D. C., Navajas, J., Norem, J. K., Novak, J., Onuki, Y., Page, E., Panse, F., Papenmeier, F., Pavlovic, Z., Rebholz, T. R., Rinn, R., Rodgers, S., Röseler, J. J., Roßmaier, K. V., Sartorio, M., Scheelje, L., Schindler, S., Schreiner, N. B., Schreiter, M. L., Seida, C., Shanks, D. R., Siems, M.-C., Stitz, M., Starkulla, M., Stäglich, M., Thies, K., Thum, E., Undorf, M., Unger, B. D., Urlichich, D., Vadillo, M. A., Wackershauser-Sablotny, V., Wagner, F. S., Wessel, I., Wolf, H., Zhou, A., Zimdahl, M., & Schütz, A. (2022). OpAQ: Open Anchoring Quest, Version

Bystranowski, P., Janik, B., Próchnicki, M., & Skórska, P. (2021). Anchoring effect in legal decision-making: A meta-analysis. Law and Human Behavior, 45(1), 1–23.

Li, L., Maniadis, Z., & Sedikides, C. (2021). Anchoring in Economics: A Meta-Analysis of Studies on Willingness-To-Pay and Willingness-To-Accept. Journal of Behavioral and Experimental Economics, 90, 101629.

Orr, D., & Guthrie, C. (2006). Anchoring, information, expertise, and negotiation: New insights from meta-analysis. Ohio State Journal on Dispute Resolution, 21(3), 597–628.

Röseler, L., & Schütz, A. (2022). Hanging the Anchor Off a New Ship: A Meta-Analysis of Anchoring Effects. Advance online publication.

Klein, R. A., Ratliff, K. A., Vianello, M., Adams, R. B., Bahník, Š., Bernstein, M. J., Bocian, K., Brandt, M. J., Brooks, B., Brumbaugh, C. C., Cemalcilar, Z., Chandler, J., Cheong, W., Davis, W. E., Devos, T., Eisner, M., Frankowska, N., Furrow, D., Galliani, E. M., . . . Nosek, B. A. (2014). Investigating Variation in Replicability. Social Psychology, 45(3), 142–152.

Jacowitz, K. E., & Kahneman, D. (1995). Measures of anchoring in estimation tasks. Personality and Social Psychology Bulletin, 21(11), 1161–1166.

Röseler, L., Schütz, A., Blank, P. A., Dück, M., Fels, S., Kupfer, J., Scheelje, L., & Seida, C. (2021). Evidence against subliminal anchoring: Two close, highly powered, preregistered, and failed replication attempts. Journal of Experimental Social Psychology, 92, 104066.

Baumeister, R. F., Tice, D. M., & Bushman, B. J. (2022). A review of multi-site replication projects in social psychology: Methodological ideal or collective self-destruct mechanism? Manuscript submitted for publication, Ohio State University.

Ito, H., Barzykowski, K., Grzesik, M., Gülgöz, S., Gürdere, C., Janssen, S. M. J., Khor, J., Rowthorn, H., Wade, K. A., Luna, K., Albuquerque, P. B., Kumar, D., Singh, A. D., Cecconello, W. W., Cadavid, S., Laird, N. C., Baldassari, M. J., Lindsay, D. S., & Mori, K. (2019). Eyewitness memory distortion following co-witness discussion: A replication of Garry, French, Kinzett, and Mori (2008) in ten countries. Journal of Applied Research in Memory and Cognition, 8(1), 68–77. doi:10.1016/j.jarmac.2018.09.004
