
Tailoring Evaluations

 

Topics

- Who does evaluations?
- Who is your audience?
- Working with clients
- What is tailored?
- Formative vs. summative evaluations
- Scientific vs. practical considerations
- Managing evaluation projects
- Detecting cheaters and other unintended consequences

Readings & Handouts

- Rossi, et al., Chapters 2 & 3
- Wholey, et al., Chapters 1, 2, 4, 9, 10, 23-28
- Reserve #2, #9, #10
- Report: Program Evaluation: Experienced Agencies Follow a Similar Model for Prioritizing Research Needs (GAO-11-176)

- Program evaluation can be a useful tool for policy analysts and program managers because it can help shed light on the relative effectiveness of different policy options. For example, education reform proposals (2002) often focus on smaller class sizes, single-sex classrooms, and uniforms. But how effective are these interventions? Evaluations can often provide this type of information.

- Read this essay from the NY Times (6/1/08) about why science matters and why it is a way of life. In many ways, a goal of this class is to get you to employ more of a scientific perspective as a manager and analyst.

- At the same time, you have to be a critical consumer of "science" and information. See this interesting commentary from the Washington Times about claims made during the health reform debate (2009). This set of handouts also provides useful insight into the importance of having the right perspective when it comes to evaluating risk, data, and things purported to be fact.

- Policy debates often involve "experts." But what is an expert? See this interesting article from USA Today (7/18/05). So how do you know whom to trust and what information to believe? Take Dr. Dwight Lundell. He has argued for years that fat and cholesterol are not the main cause of heart disease and that inflammation in the artery wall is the real culprit; more importantly, he claims that the expensive statin medications used to lower cholesterol may actually be exacerbating problems with arterial disease. He is a heart surgeon with 25 years of experience and 5,000 open heart surgeries. Is he an expert? He has certainly sold a lot of books, and a quick Google search produces plenty of instances where he is referred to as such. At the same time, he lost his medical license in 2008, and there are plenty of other reasons to suggest he is a quack.

- It is also important for evaluators to use numbers whenever possible. The problem, of course, is that people often have trouble understanding what numbers mean. Watch this short clip from David Letterman for some examples of the trouble people have interpreting numbers: http://www.youtube.com/watch?v=2dJanAgydps

- Another example of the influence that phony "experts" can have on society is the argument that the MMR vaccine causes autism. Is autism caused by the MMR vaccine? While the answer is no, that has not stopped "experts" from claiming otherwise.
- The evidence first suggesting the link was published by Dr. Wakefield and his colleagues (1998) in The Lancet and was subsequently retracted in 2004.
- Feeding the controversy are events like the 2010 court award of more than $1.5 million to Hannah Poling in a vaccine-autism claim, which continues to foster beliefs that there is a government cover-up of the link between MMR and autism (CBS News 9/9/10).
- The federal government's Vaccine Injury Compensation Program (VICP) has also made numerous awards for vaccine-induced brain injuries, which mimic symptoms of autism and feed these fears. See this controversial evaluation of the VICP by Hollan, et al., published in the Pace Environmental Law Review in 2011. However, these payments are not for vaccines causing autism, and two of the authors represent clients with claims in the VICP on behalf of family members. See this interesting critique of the study on wired.com pointing out some of the flaws in their analysis.
- However, all of the best scientific evidence suggests that there is no link at all. Ironically, evidence for the lack of a link comes from the large number of people who stopped using the MMR vaccine after the media reports surrounding Wakefield's 1998 Lancet study. If the vaccine caused autism, that drop in vaccination should have been followed by a measurable decrease in autism rates, but there is no evidence that any such decrease occurred.
- Moreover, despite the claims paid out by the VICP, in March 2010 a federal court ruled, in 600 pages of findings, that the families of children diagnosed with autism are not entitled to compensation because there is insufficient evidence of a link between MMR and autism.
- So why the increase in autism rates? In all likelihood it is due to changes in the definition of autism and better diagnostic tools that find more cases. For example, if there were really an increase, you would see higher rates in children than in adults, but as a recent UK study by Brugha, et al. published in the Archives of General Psychiatry (5/12) indicates, that is not the case. It may also be due to government services and other policies that create incentives for more cases to be diagnosed. One newer theory is that it may be environmental and related to chemical exposure (CNN 6/11).
- The one thing we do know is that if you weigh the relative risks according to the CDC, it is safe to vaccinate your children, and failing to do so places them and others at greater risk of illness or even death (a small arithmetic sketch of this kind of risk weighing appears after this list).

- As a result of this fear over the erroneous reports of a link between the MMR vaccine and autism, large numbers of people have chosen not to vaccinate their children. This has led to measles outbreaks in the U.S. and the U.K.

- Being an effective program evaluator requires not only thinking about programs and their associated outputs and outcomes in a critical manner; it also requires being open to the sometimes counterintuitive findings that present themselves. For example, see this article in the MailOnline that describes the results of a study on the importance of eating more vegetables, something we all simply assume is true. But is it? See this rebuttal of the study's conclusions from January '11. Is it actually safer to be modestly overweight than underweight? See this report in JAMA on obesity and death (2005) and a related article about the CDC linking risk of death to obesity (2005).

- When in doubt, trust your gut instincts. Sometimes that is the best strategy for making decisions. See this interesting article from the U.K. Mail Online (8/11).

- While program evaluation is a useful analytical tool that can help shape policies and improve program administration, it can be very resource intensive. In some cases, it might even cost more to evaluate a program than is currently being spent on the program itself. Thus, when tailoring an evaluation, some consideration should be given to whether the potential value of the information obtained is worth the cost (a back-of-the-envelope version of this comparison is sketched after this list). For example, was it really worth the cost of the LA Nutrition Study Evaluation (Newsweek, 8/09) to find out that kids like the taste of fruit?

- It is also important to remember that evaluation and program monitoring can create incentives to cheat. This is particularly evident in education reform efforts. See this USA Today investigative report, Testing the System (Mar. '11), this set of articles on a widespread scandal in Atlanta (7/11), and this story on problems in South Carolina (2008). A simple statistical screen for suspicious score gains is sketched after this list.

- It is also important to remember that an evaluation is just one type of information used by policymakers; sometimes programs that are effective are rejected. For example, see this interesting article about the politics surrounding red light cameras at intersections (MSNBC 6/24/11). In other instances, evaluations suggest programs are relatively ineffective, yet politicians choose to keep funding them. For example, D.A.R.E. remains popular with the public and politicians despite the fact that there is little evidence the program achieves its desired outcomes (see RES #9 and RES #10).
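
The kind of risk weighing mentioned in the MMR discussion above can be made concrete with a bit of arithmetic. The sketch below compares the expected chance of serious harm from vaccinating versus not vaccinating. The rates used are hypothetical placeholders chosen purely for illustration, not CDC figures; the point is the structure of the comparison, not the specific numbers.

```python
# A minimal sketch of weighing relative risks.
# All rates are HYPOTHETICAL placeholders, not CDC figures.

def expected_harm(prob_event: float, prob_serious_given_event: float) -> float:
    """Expected probability of serious harm per child for one pathway."""
    return prob_event * prob_serious_given_event

# Hypothetical pathway 1: vaccinate; a very small chance of a serious adverse reaction.
risk_vaccinate = expected_harm(prob_event=1e-6, prob_serious_given_event=1.0)

# Hypothetical pathway 2: skip the vaccine; some chance of catching measles during
# an outbreak, with some chance of serious complications if infected.
risk_skip = expected_harm(prob_event=1e-3, prob_serious_given_event=1e-2)

print(f"Expected serious harm if vaccinated:     {risk_vaccinate:.2e}")
print(f"Expected serious harm if not vaccinated: {risk_skip:.2e}")
print(f"Relative risk (skip / vaccinate):        {risk_skip / risk_vaccinate:.0f}x")
```

Under these made-up rates, skipping the vaccine carries an expected harm an order of magnitude higher; plugging in actual published rates is left as an exercise.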
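
One of the bullets above notes that an evaluation is only worth doing if the information it produces is worth its cost. A rough way to frame that judgment is sketched below; every figure is a hypothetical placeholder, and the decision probabilities are pure assumptions.

```python
# Back-of-the-envelope check on whether an evaluation is worth its cost.
# Every figure below is a HYPOTHETICAL placeholder for illustration only.

program_budget = 200_000        # annual program spending (hypothetical)
evaluation_cost = 250_000       # cost of a rigorous evaluation (hypothetical)
prob_changes_decision = 0.30    # chance the findings alter a funding decision (assumed)
share_redirected = 0.50         # share of the budget redirected if they do (assumed)
years_of_influence = 3          # years the findings keep informing decisions (assumed)

expected_value_of_information = (
    prob_changes_decision * share_redirected * program_budget * years_of_influence
)

print(f"Expected value of the information: ${expected_value_of_information:,.0f}")
print(f"Cost of the evaluation:            ${evaluation_cost:,.0f}")
if expected_value_of_information < evaluation_cost:
    print("Under these assumptions, the evaluation costs more than the information is worth.")
else:
    print("Under these assumptions, the evaluation plausibly pays for itself.")
```

With these particular placeholders, the evaluation costs more than the program's budget it might redirect, which is exactly the situation the bullet above warns about.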
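
The cheating scandals cited above were uncovered in part through statistical screens for implausible year-over-year gains in test scores. The sketch below shows one very simple screen of that kind; the classroom data and the two-standard-deviation threshold are hypothetical, and real forensic analyses (erasure audits, item-level response patterns) go much further.

```python
# A simple screen for suspiciously large year-over-year test score gains.
# The classroom data and the 2-standard-deviation threshold are HYPOTHETICAL.
from statistics import mean, stdev

# (classroom, average scale score last year, average scale score this year)
classrooms = [
    ("Room 101", 640, 652),
    ("Room 102", 655, 661),
    ("Room 103", 630, 698),   # an implausibly large one-year jump
    ("Room 104", 648, 650),
]

gains = [this_yr - last_yr for _, last_yr, this_yr in classrooms]

for i, ((name, _, _), gain) in enumerate(zip(classrooms, gains)):
    # Compare each classroom's gain to the norm among the OTHER classrooms,
    # so an extreme value cannot hide itself by inflating the benchmark.
    others = [g for j, g in enumerate(gains) if j != i]
    z = (gain - mean(others)) / stdev(others)
    flag = "  <-- flag for review" if z > 2 else ""
    print(f"{name}: gain = {gain:+d} points, z = {z:+.1f}{flag}")
```

Only Room 103 gets flagged here; in practice a screen like this is just a pointer to classrooms worth auditing, not proof of cheating.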

 

Lecture Notes

- Download as a PDF file (Supplemental notes from lecture)

 

Web Resources

Professional Societies for Evaluators

- American Evaluation Association
- Association for Public Policy Analysis and Management (APPAM)

  

Organizations and Think Tanks Involved in Evaluation Research
- Mathematica Policy Research, Inc.
- RAND Corporation
- Urban Institute
- Brookings Institution
- American Enterprise Institute
- Heritage Foundation
- Hudson Institute
- Hoover Institution
- Cato Institute
- Resources for the Future
- John Locke Foundation
- National League of Cities
- National Association of Counties (NACo)
- National Governors Association

- Abt Associates
- MDRC (nonprofit education and social policy research organization)
- Pew Charitable Trusts
- United Way of America, Outcome Measurement Resource Network
- U.S. Government Accountability Office
- U.S. Department of Health and Human Services, Agency for Healthcare Research and Quality
- DOE, Office of Energy Efficiency and Renewable Energy, Program Evaluation
- DOJ, Bureau of Justice Assistance, Center for Program Evaluation
- Centers for Disease Control and Prevention, Evaluation Working Group
- UCLA Center for Health Policy Research
 
Evaluation Journals
- American Journal of Evaluation
- New Directions for Program Evaluation
- Evaluation Review (formerly Evaluation Quarterly)
- Evaluation Studies Review Annual
- Evaluation Practice
- Evaluation and Program Planning
- Journal of Policy Analysis and Management
- Evaluation in Education
- Evaluation and Human Services
- Evaluation and the Health Professions
 
General Sources for Evaluation Studies
- Links to evaluation resources
- Planning and Evaluation Resource Center
- Online Evaluation Research Library
- Virtual Library - Evaluation
- Vanderbilt Institute for Public Policy Studies, Center for Evaluation Research and Methodology
- Applied Survey Research
- Western Michigan University Evaluation Center
- Joint Center for Poverty Research
