Wednesday, October 7, 2015

60% of psychology findings do not replicate: Implications for educational practice and research

In late August, somewhat shocking results were published: efforts to replicate the findings of some of the most important studies in psychology failed. Replication of findings is one of the most important aspects of science. Results should not be taken seriously until they are replicated. However, most studies never get replicated.

This means that a majority of the most important findings in the field of psychology, ones that had been widely trusted and used by therapists and psychiatrists, are not valid.

Implications for Education Research

When you decide to implement something based on a research finding, you are assuming that the results from the research will be replicated in your schools. However, if laboratory-based research cannot even be replicated in the laboratory, it is even less likely that the results will be replicated in your school(s), where far less control exists. In other words, you are unlikely to see the same positive effects in your school.

Does this mean that leaders should not trust research findings? No!

The good news is that the glass is almost half full: forty percent of the studies were replicated. What was the key characteristic of the replicable studies, especially since there were no indications of fraud? The key difference was that the replicated studies had larger effects, i.e., greater benefits, than those that were not replicable. This validates a key point of Chapter 3, which discusses criteria for determining the practical significance of research. The basic conclusion is that you should only trust research with BIG effects, and Chapter 3 provides guidance on how to identify such research. If you follow the recommendations in that chapter, chances are better that any practice you adopt on the basis of research will fall into the category of the 40% that replicate their positive effects in your school(s).
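
For readers who want a concrete sense of what a "big" effect looks like in numbers, here is a minimal sketch using Cohen's d, one common standardized effect-size measure. This is my own illustration with made-up scores, not an example from the book or the replication report, and Chapter 3's criteria for practical significance may rely on different measures.

    # Minimal illustration: Cohen's d as one standardized effect-size measure.
    # The scores below are hypothetical; they are not data from any study.
    import math

    def cohens_d(treatment, control):
        """Standardized mean difference between two groups (pooled SD)."""
        n1, n2 = len(treatment), len(control)
        m1 = sum(treatment) / n1
        m2 = sum(control) / n2
        v1 = sum((x - m1) ** 2 for x in treatment) / (n1 - 1)
        v2 = sum((x - m2) ** 2 for x in control) / (n2 - 1)
        pooled_sd = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
        return (m1 - m2) / pooled_sd

    # Hypothetical test scores for a treatment classroom and a comparison classroom
    treatment_scores = [78, 85, 90, 72, 88, 95, 81]
    control_scores = [70, 75, 80, 68, 74, 79, 72]

    # Prints roughly 1.6, well above the conventional 0.8 threshold for a "large" effect
    print(round(cohens_d(treatment_scores, control_scores), 2))

By this conventional benchmark, an effect around 0.2 is small and one above 0.8 is large; the general point of the paragraph above is that practices backed only by small effects are the ones least likely to replicate in your school(s).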


Friday, June 26, 2015

Hi Everyone:

This blog is dedicated to having conversations with readers of my book:

Authentic Quantitative Analysis for Education Leadership Decision-Making and EdD Dissertations: A Practical, Intuitive, and Intelligible Approach 

How to Critique and Apply Quantitative Research to Improve Practice, and Develop a Rigorous and Useful EdD Dissertation


The goal of this blog is for us to share with each other our reactions, critiques, suggestions, and experiences with using this book in an EdD program. The advantage of publishing with NCPEA is that, in addition to keeping the price low for students, their print-on-demand model makes it possible to make frequent revisions.

In addition, the goal is to start a conversation among us as a community about the use of quantitative analysis in EdD programs. Hopefully, those who join in will include not only the individuals who teach the methodology courses, but also a wide range of faculty and program directors. Please share your ideas with me and the others who join in.

Stanley Pogrow