Blog

April 9, 2018

Research on Evaluation: It Takes a Village (The Solutions)

Our first post lamented the poor response rates in research on evaluation. There are many reasons for these poor response rates, but there are also many things we can do to improve them and, in turn, improve the state of research on evaluation. How can evaluators improve response rates? Coryn et al. (2016) suggest that evaluators find research on evaluation important. However, the response rates to these projects would suggest otherwise. As with any area of opportunity, there are often several components that influence success. Yes, evaluators should naturally care more about propelling our field forward, but the ability
Read More
April 3, 2018

Research on Evaluation: It Takes a Village (The Problem)

Response rates from evaluators are poor. Despite research suggesting that AEA members consider research on evaluation important, response rates for research on evaluation studies are often only between 10% and 30%.1 As evaluators ourselves, we understand how busy we can be. However, we believe that evaluators should spend more time contributing to these studies. These studies can be thought of as evaluations of our field. What are our current practices? How should we train evaluators? What can we improve? How do our evaluations lead to social betterment? These are just some of the broad questions these studies aim
Read More
February 25, 2018

Visualizing Statistical Significance – and Effect Sizes!

Ann Emery recently posted an awesome blog post on visualizing statistical significance. Starting with a table of statistics with lots of numbers and asterisks (*), she ended up with this…
Read More
February 11, 2018

Confessions of a QUANT

I have a confession to make. I am a QUANT. By a QUANT, I mean that I am good at quantitative methods and, because I’m good at them, I tend…
Read More
January 21, 2018

Does This Logic Model Make My Program Look Good?

Over the past several years, data visualization has taken the evaluation community by storm. Today, there are dozens of blogs and online resources to help evaluators hop on the #dataviz train and communicate findings more effectively. The start of a new year is the perfect time to adopt new data visualization trends and apply them to your practice. However, before you jump on the bandwagon, it is worth testing assumptions about what works and what does not. That’s why we at the Claremont Evaluation Center decided to study the effectiveness of data visualization principles applied to logic models.
Read More
January 1, 2018

2017 in Review — and Looking Forward to 2018

Image credit: Vladimir Kudinov http://www.vladimirkudinov.com/ This year, instead of yearly goals, which are too long-term and fluffy to really mean anything or actually be accomplished, I focused on quarterly goals. This was…
Read More
December 31, 2017

Analyzing my Twitter Posts from 2017

A recent post on AEA365, plus my Evaluation Twitter working group, inspired me to finally learn how to scrape tweets in R! The AEA365 post linked to a tutorial on…
Read More
November 12, 2017

From Learning to Action: Reflections from Eval17

I am home from Eval17 and wanted to reflect on my experiences at the conference. I attended many interesting sessions, but three things have really stuck with me.
Read More
October 23, 2017

Dana presents at Eval17: Surveying children, using vignettes to train staff, and more!

I am really looking forward to meeting you all at the annual AEA conference, Eval17! I wanted to share with you the details of my various presentations and hope you…
Read More
October 12, 2017

Can evaluators be the bridge in the research-practice gap?

Researchers and practitioners agree that there is a gap between research (or theory) and practice. While the reasons for this gap are plentiful, they boil down to researchers and practitioners…
Read More

Footnotes

  1. Notably, the study on research on evaluation had a response rate of 44% (Coryn et al., 2016). While this is much higher than most research on evaluation studies—and it is unclear how they achieved this, since all they mention is that they used Dillman’s principles—it is still low enough to call into question the generalizability of the findings. For instance, it may be more accurate to say that only 44% of evaluators care about research on evaluation, since the remaining 56% didn’t even bother to participate!
  2. If you are interested in learning more about “visual efficiency”, check out Huang, W., Eades, P., & Hong, S. H. (2009). Measuring effectiveness of graph visualizations: A cognitive load perspective. Information Visualization, 8(3), 139-152.