
My Biggest Research Mistake
Adventures and Misadventures in Psychological Research

Edited by: Robert J. Sternberg

March 2019 | 235 pages | SAGE Publications, Inc

My Biggest Research Mistake helps students and professionals in the field of psychological science learn from the diverse mistakes of successful psychological scientists. Through 57 personal stories drawn from the experiences of fellows in the Association for Psychological Science (APS), editor Robert J. Sternberg presents the mistakes of experts in the field as opportunities for learning, allowing students to avoid making the same mistakes in their own work.

Robert J. Sternberg
1. Introduction: How I Learned from Mistakes
I. Failure in conceptualizing research
Nick Haslam
2. Grandiosity and over-ambition
Harry P. Bahrick
3. Separating data-based from non-data-based evaluations
Judy S. Deloache
4. Too clever by half
Barbara Finlay
5. Death is not the answer
E. Tory Higgins
6. Thinking more is more when less is more
Ying-yi Hong
7. Manipulation checks can ruin your study
Jerome Kagan
8. Beware of popular premises
Saul M. Kassin
9. The need for blind testing
John F. Kihlstrom
10. Finding implicit memory in post-hypnotic amnesia
Joachim I. Krueger and Johannes Ullrich
11. Social coordination in the wild
Barbara C. Malt
12. Data distress
David Matsumoto and Hyisung C. Hwang
13. A big mistake in interpreting cultural differences
Richard McCarty
14. In praise of pilot studies
Nora S. Newcombe
15. Start strong, plan ahead
Howard C. Nusbaum
16. A mistake in studying the role of sleep in speech
Lisa S. Onken
17. Short-term gains, long-term impasse
Richard E. Petty
18. Be as careful after your study is run as you are before
Paul Slovic
19. Lessons learned from a failed experiment
Laurence Steinberg
20. Raging hormones
Robert J. Sternberg
21. A failure in fidelity of experimental treatments
Peter Suedfeld
22. Stumbling in the dark
Rebecca Treiman
23. Pilot, pilot, pilot
Ovid Tzeng
24. Failure to recognize that surface differences don’t necessarily imply underlying processing differences
Bernard Weiner
25. Farfel flees from his feast
II. Prematurely jumping to conclusions
Eva L. Baker
26. Jumping to the wrong conclusion: A lesson about people and learning
III. Following a garden path
Maya Bar-Hillel
27. Why didn’t I see it earlier?
Charlotte J. Patterson
28. Losing time
IV. Using measures of dubious reliability/validity
David M. Buss
29. Virginity in mate selection
Robert A. Baron
30. New fields, new errors: Breaking rules every researcher should know
Linda M. Bartoshuk and Derek J. Snyder
31. How do we compare sensory or hedonic intensities across groups?
Larry E. Beutler and Samaria Lenore
32. Science marches on its measures
C. J. Brainerd
33. Reliability is not readiness
Gary P. Latham
34. The importance of being there
Frank Worrell
35. Failure to conduct a pilot study
V. Carelessness
Daniel R. Ilgen
36. Small change—big mistake: Check and check again
Reinhold Kliegl
37. Losing my dissertation data
Mitchell J. Prinstein
38. Peers, procedures, and panic: A careless error that offered a lifetime of benefits
Nancy L. Segal
39. Multiple missteps: The twin study that should have been
June P. Tangney
40. Always late: Causes and consequences far and wide
Thomas S. Wallsten, Gal Zauberman, and Dan Ariely
41. When results are too good to be true, they are probably not true
VI. Over-relying on others
Mary Hegarty
42. Of course our program is error-free—not!
Julie T. Fitness
43. Hiring a woman to do a man’s job: The perils of equal opportunity employment when running (ruining) social psychological experiments
VII. Error in statistical analysis
Regina F. Frey and Mark A. McDaniel
44. The case of the enterprising instructor
Donald J. Foss
45. Self-help can be no help at all: Some unambiguous advice
Isabel Gauthier
46. A third-variable problem in face recognition
VIII. Generalizability of findings
J. Lawrence Aber
47. Not establishing the cross-cultural validity of measures of key constructs in a high-stakes field experiment
Karen Adolph
48. Ecological validity: Mistaking the lab for real life
David H. Barlow
49. A major error in the evaluation of psychological treatments for anxiety
IX. Failure to understand the “system”
Stephen J. Ceci
50. Mistakes were made (but not by me)
Klaus Fiedler
51. A missed opportunity to improve on credibility analysis in criminal law
Jack M. Fletcher
52. The importance of professional discourse
Richard M. Lerner and Jun Wang
53. “Nem di gelt?” or Can accepting grant awards be a bad thing?
David B. Pisoni
54. Keep your friends close but your enemies closer: With whom should you share your creative ideas?
Jonathan A. Plucker
55. Walking ethical tightropes in research collaborations
X. Societal costs outweigh societal benefits
James C. Kaufman
56. The danger of superficial success
Robert J. Sternberg
57. Kinds of research mistakes
Key features
  • A student-friendly writing style makes the content approachable and easy for readers to retain.
  • Personal stories from fellows of the Association for Psychological Science (APS) show readers that even the most distinguished experts make mistakes, and that examining those mistakes can help readers avoid them.
  • Critical thinking questions give students the opportunity to reflect on preceding essays.
  • Stories drawn from all areas of psychology appeal to learners across the discipline, not just within a single subfield.


ISBN: 9781506398846