
Getting Beyond the Survey

Posted by: Susan Kardos

January 31, 2019

In November, I wrote a piece, posted here, about the wide chasm that often exists between research and practice. In that post, I promised additional posts related to the broader topic of “evidence-informed decision making.”
During the first week of January, in a WeWork conference space in New York City, I had the honor of spending the day with the Nachshon Project Graduate Fellows during their 10-day intensive on “data-driven decision making” through the lens of the Jewish day school sector. Organized and facilitated by Prizmah and Nachshon Project staff, with faculty assembled by Prizmah, it was an intensive week of learning and site visits: after concentrated work in New York, the fellows, their faculty, and Prizmah staff traveled to Jewish day schools in Memphis and Detroit. The intensive was meant to deepen the fellows’ knowledge of Jewish day schools as potential sites for their future work, introduce them to applicable concepts and methods related to data-driven decision making, and enrich their cohort experience. I was delighted to offer three sessions to frame their subsequent learning, and I learned a lot from their thoughtfulness and passion.
There are lots of different angles from which to think about “data-driven decision making” (I prefer “evidence-informed decision making”). These include the following:

  • how to find and use existing data to inform decisions;
  • how to apply existing theories or study findings in one’s own context;
  • how to collect your own data;
  • how to interpret data collected at your site;
  • how to guide expert researchers or evaluators to do a study most useful to you; and
  • how to engage in data collection and analysis that is not only useful to you at your site, but also contributes to a broader knowledge base about educational issues.

And those are just a few.
Here I am focusing on one critically important and seemingly obvious point about research, inquiry, data collection, and evaluation: You must first get clear about what it is you want to know. If there is one key nugget to take away from this post, that is it, and I invite you to stick with me for the next few minutes to work it through. But if you know you always spend sufficient time, at the outset, clearly articulating the question you want to answer (and really considering it and appreciating all its dimensionality), and you do this with colleagues and critical friends, then you’ve just found yourself a few extra minutes to do something else. (Enjoy!)
You’ve heard it before, or you may even have said it. You’re in a conversation with colleagues, grantees, funder partners, or practitioners, and someone says: “We just finished this program/initiative/event and we want to do a survey…” That sort of thinking is akin to “I just finished my workout and I want to go get my blood pressure measured.” That may or may not be a good idea; we can only determine what you should measure (Blood pressure? Heart rate? Weight? BMI? Mood? Strength? Flexibility? Social satisfaction? Sense of accomplishment? Persistence?), when you should measure it, and how you should measure it after we understand what it is you really want to know about the workout.
In the session with the Nachshon Graduate Fellows, I showed them this video of a children’s choir (take a look: if you’re like me and you find children singing uplifting and inspiring, you’ll be happy you did). I asked them what sorts of data they could collect about what they had just seen. Think about it. What data could you collect? How many kids? What are their ages? How many boys and girls? How often do they practice? When did they join the choir? Who are the soloists? When was the choir formed? For what reason? Good questions. All pretty easy to answer. I then showed them the video again and asked them to step back and think about their interests as graduate students, rabbinical students, and Jewish educators. What was it that they really, really wanted to know? What might help them understand or do their own work better?
Their questions were wonderful, important, and complex:  How did the choir director come to be the director of this choir?  What are the relationships among the children like?  What keeps the singers (and staff) motivated?  How do the children become members of the choir, and why is that particular process employed?  What, if anything, is the “hidden curriculum”?  What other things do these children do together and why?  What was the process for choosing that particular song; what role, if any, did the children have and why?  Have the children developed relationships, beliefs, or behaviors as a result of participation in the choir?  What is the nature of those relationships, beliefs, or behaviors?  How do parents support their children’s participation in the choir?  What is the average tenure of children in the choir? Is that the ideal tenure? Why or why not?  What is the business model for Voices of Hope?  What are its biggest organizational challenges, and is the organization sustainable?  What impact did the singing have on the audience members? What did the judges think and feel, and why?
It should be apparent that when you start from what you really want to know, rather than from what data you can collect (or, worse yet, from how you can collect the data, à la “let’s do a survey”), the nature of the questions changes powerfully.
So when you think you want to understand something better in your work, the three main questions you should ask are:

  1. What do I really want to know?
  2. Why do I really want to know that?
  3. What decisions can I better make once I know that?

Try to apply these questions to something in your own practice or organization.  What do you really want to understand better and why?  If you understood that aspect of your practice or organization better, what could you do or decide differently?  Here I’m focusing specifically on research you might do yourself or hire someone else to do about your work or your organization, but the basic principle applies to large multi-year, multi-million-dollar studies too.  If you don’t spend adequate time understanding what driving question you want to answer, you’ll have a very hard time taking any steps after that.
Here are some examples of questions that might guide a teacher’s study of her practice:  To what extent are my students able to choose “just right” books?  To what extent do my students consider their relationship with God during tefila?  To what extent are my students’ “study skills” improving?  What do I really believe about the STEM abilities of the girls in my class?  To what extent do my students feel ownership over the learning community I’m trying to create in my classroom?  In what ways are my students learning to think like scientists?  To what extent are my students able to understand complex concepts related to Israel?  You get the picture.
A funder might want to know: To what extent do our funded professional development programs align (or conflict) with what is already known in the field about effective professional development? In what ways, beyond grant dollars, do my grantees benefit from our partnership? To what extent does the reporting we require from our grantees advance their learning and their work (versus our required reports being time-consuming compliance documents)? When I look across programs, are there subcategories of potential participants (people from a certain neighborhood or geography, people who hold a certain set of beliefs or practices, people of a certain age or economic status, etc.) who aren’t accessing the programs we are offering? Getting the question right requires a thoughtful process and, I believe, collaboration.
Getting the question right, as critical as it is, is still only half the battle (or, I would argue, more like two-thirds).  The next step is considering the following:

  • Is what I am seeking to know, knowable?
  • What kind of data do I need?
  • How can I (what methods will I use to) get the data systematically?
  • How can I (what methods will I use to) analyze the data systematically?
  • How will I interpret the data?
  • How will I understand the limitations of this inquiry?

My intention is not to discourage inquiry, but rather to point out that there is more to think about besides just what survey questions you’ll use. To be most useful and usable, any research, evaluation, or self-study must begin with a good question.
At AVI CHAI, we believe that the field of Jewish education can benefit greatly from a better-developed culture of inquiry, where educators and funders demand and use evidence (where it exists) to make their best decisions, and where researchers produce usable (new word idea: “Jewsable”) knowledge that helps Jewish educators and funders solve their most pressing problems and answer their most vexing questions. We invite you to learn more about AVI CHAI’s efforts to promote this culture of inquiry.
Dr. Susan Kardos is Senior Director, Strategy & Education Planning at The AVI CHAI Foundation.
