
Monday, September 11, 2017

How do you measure engagement? What we learned during Summer Reading 2017

Starting this year, my role in summer reading is a specific one: I develop and implement our measurement tools and strategies for capturing program outcomes and other evaluative data. To that end, the reading clubs committee meeting that matters most to my work is the one where they confirm the program goals for the year. Not the theme, not the design of the program--the program goals. Once I've got a grasp of what the goals are, and why they've been chosen, I start thinking about how we can collect data to help us gauge our success.

For Summer Reading 2017, which here at Skokie was called Camp Imagine, our reading clubs committee identified two main goals of our program:

Goal 1: Increase the completion rate by 10%. This goal is a change in outputs--the proportion of registrants who went on to finish the program this summer as compared to the proportion from last summer. You need two data points to get your completion rate: the number of people who signed up for the program, and the number of people who completed the program. This is pretty standard data to collect for reading programs across the country.
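For anyone who likes seeing the arithmetic spelled out, here's a quick sketch in Python; the registration and completion counts below are invented for illustration, not our actual figures:

```python
# Completion rate = completers / registrants.
# All figures below are hypothetical, just to show the comparison.
registrants_2016, completers_2016 = 3200, 1280
registrants_2017, completers_2017 = 3700, 1650

rate_2016 = completers_2016 / registrants_2016   # 40.0%
rate_2017 = completers_2017 / registrants_2017   # ~44.6%

print(f"2016 completion rate: {rate_2016:.1%}")
print(f"2017 completion rate: {rate_2017:.1%}")
# Note: a "10% increase" goal can mean +10 percentage points
# (rate_2016 + 0.10) or a 10% relative increase (rate_2016 * 1.10);
# it's worth being explicit about which one you're tracking.
print(f"Change: {rate_2017 - rate_2016:+.1%}")
```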

Goal 2: Increase engagement. This goal encompasses both a behavior in itself, which is an output, and a change in behavior, which is an outcome. In other words, the reading clubs committee was designing a program that would hopefully result in more people participating across the whole summer as well as engaging with both the program and the library at greater rates than they might have otherwise.

(Also, a note here: we didn't attempt to measure engagement last summer, which means we didn't have a baseline to compare this summer's numbers to. As a result, we were actually measuring our success toward the goal of facilitating engagement, as opposed to increasing engagement year-over-year. We can track that next year, using our 2017 numbers as our baseline for comparison.)

Now, you may be thinking to yourself: but how do you measure engagement? That, my friends, is a really, really good question. And there's not necessarily a single best answer. You see, measuring engagement starts with identifying what engagement means to you with regard to your specific program or service.

So pretty much immediately after I got word that these were the goals selected by the reading clubs committee, I went to them with some questions. Specifically, what did they have in mind when they envisioned patron engagement? That's when I learned that they envisioned Camp Imagine participants sticking with the program throughout the summer; engaging in specific participatory behaviors as part of the program; using the library in new ways; and interacting with one another and the community around books and reading. That may still seem somewhat amorphous and abstract, but this definition gave me plenty to work with--especially since the committee was designing the program so that participants could choose specific activities and challenges that tied into the engagement goal.

That length-of-participation type of engagement--sticking with the program throughout the summer--was measurable by seeing how many days, on average, Camp Imagine participants were active in the program. We tallied the number of days between registration and final visit to the Camp Imagine desk for 20% of our participants (a solid sample, considering we had over 3700 participants). This was a straight output numbers analysis.
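If you're curious what that tally looks like in practice, here's a minimal sketch; the dates are made up, and `sample` stands in for the 20% of participant records we pulled:

```python
from datetime import date

# Each record: (registration date, last visit to the Camp Imagine desk).
# These are hypothetical stand-ins for the sampled records.
sample = [
    (date(2017, 6, 5), date(2017, 7, 14)),
    (date(2017, 6, 12), date(2017, 8, 2)),
    (date(2017, 6, 20), date(2017, 6, 28)),
]

days_active = [(last - first).days for first, last in sample]
print(f"Average days active: {sum(days_active) / len(days_active):.0f}")
```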

Because of the way we implement our summer reading program and the volume of participants involved, we knew we'd rely on a participant survey to capture most of the remaining data on whether we were actually facilitating engagement. With the goals and definitions in mind, I got down to creating our survey questions to measure engagement.

The reading clubs committee defined engagement, first, as participating directly with the program in a few specific ways. That's not the same as just participating in the program. Rather, a specific engagement activity option was available at every level of Camp Imagine: submitting something to the Camp Mailbox. It might be a book review; a picture or drawing of a friend; a favorite recipe; a map of places traveled, or hoped to travel; and more sharing prompts like these. We had a physical Camp Mailbox just next to our Camp Imagine desk, where all registration and badge completion interactions took place. The program was designed to give participants the option of engaging directly with the program and program themes by submitting to the Camp Mailbox, and this type of participation was incentivized when we featured Camp Mailbox submissions on a display next to the camp desk and on our Facebook page. And so, when it came time to quantify this type of engagement, we asked two specific questions on the summer reading survey:

1) While participating in Camp Imagine, I submitted an item to the Camp Mailbox. (Options: Yes / No / Not Sure)

2) While participating in Camp Imagine, I looked at items submitted to the Camp Mailbox on display in the library or on the library's Facebook page. (Options: Yes / No / Not Sure)

The next definition of engagement was using the library in new ways. Once again, this type of engagement was built into the design of the program as an option. At every level of the program, participants were invited to find relevant information in a library database or resource; try reading in a new genre; set up an account with and try a downloadable resource like Hoopla or RBDigital; and more. To get at whether participants were in fact using the library in new ways, we asked one specific question on the summer reading survey:

3) While participating in Camp Imagine, I tried a library resource I'd never used before. (Options: Yes / No / Not Sure)

Finally, the last definition of engagement was that participants would ideally interact with one another and the larger community around books and/or reading. Again, this type of behavior was incentivized, with many levels of the program including the activity of sharing a book or other media recommendation with another person--be it a friend, family member, librarian, or anyone else. This type of engagement is also where my initial research into measuring engagement--specifically during summer reading--paid off. I had explored write-ups and descriptions of how California libraries have been tracking summer reading outcomes since 2011, and that's where we landed on the concept that engagement with a reading program can result in participants seeing their community as a place that values reading. Combining our program-incentivized sharing with the concepts learned from California's outcome iterations, we asked two specific questions on the summer reading survey:

4) While participating in Camp Imagine, I shared a book recommendation with a family member, friend, or other community member. (Options: Yes / No / Not Sure)

5) As a result of participating in Camp Imagine, I feel that Skokie is a community of readers. (Options: Strongly Agree / Agree / Neither Agree nor Disagree / Disagree / Strongly Disagree)

The survey was rounded out by questions from PLA's Project Outcome summer reading surveys--we feel it's important that we contribute to the project, which provides its aggregate data for library advocacy across the country, whenever it fits into our own strategies and plans for capturing program outcomes.

Our final survey was one half-sheet of paper, front and back, and we got a response rate that allowed us a pretty high degree of confidence in the data we collected.
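(If you want to put a number on that confidence: with a finite pool of participants, the standard margin-of-error formula for a sample proportion does the trick. A rough sketch, using a hypothetical response count since the exact figure isn't given above:)

```python
import math

population = 3700   # approximate number of Camp Imagine participants
responses = 400     # hypothetical count of completed surveys

# 95% margin of error for a proportion near 0.5,
# with a finite population correction.
z = 1.96
moe = z * math.sqrt(0.25 / responses)
moe *= math.sqrt((population - responses) / (population - 1))
print(f"Margin of error: about ±{moe:.1%}")
```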

Curious what we found?

Our graphic designer Vanessa Rosenbaum created a gorgeous 4-page final report for Camp Imagine. Here's the page about engagement. I put together the data analysis and narrative, and the sidebar includes items from the Camp Mailbox and anecdotes.

From analyzing the length of time that Camp Imagine registrants engaged in the program (which was 73 days long in total), we found that:
  • Youth who earned at least one badge (i.e., didn't just register and never return) participated on average 34 days, or about five weeks.
  • Teens who earned at least one badge participated on average 29 days, or about four weeks.
When we looked at the top 50% and top 25% of youth and teen participants, these numbers were even higher. This type of sustained engagement has implications for combating the summer slide.
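Slicing those percentiles is simple once you have the per-participant tallies; a sketch, with invented numbers standing in for the sampled data:

```python
import statistics

# Hypothetical days-active tallies for sampled participants.
days_active = [8, 15, 22, 29, 34, 41, 48, 55, 62, 70]

median = statistics.median(days_active)            # 50th percentile cutoff
top_half = [d for d in days_active if d >= median]
quarter = max(1, len(days_active) // 4)
top_quarter = sorted(days_active)[-quarter:]

print(f"Average days active, top 50%: {sum(top_half) / len(top_half):.0f}")
print(f"Average days active, top 25%: {sum(top_quarter) / len(top_quarter):.0f}")
```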

From our survey questions, here's what we learned about program engagement:
  • 84% of participants submitted something to the Camp Mailbox.
  • 61% of participants viewed Camp Mailbox submissions on display in the library or on Facebook.
  • 65% of participants tried a library resource they'd never used before.
  • 75% of participants recommended a book to a family member, friend, or other community member.
  • 94% of participants feel Skokie is a community of readers.
Now, considering participants had the option, at every single level of the program, to complete each level in the traditional summer reading way--by reading alone--we think these numbers are pretty remarkable. In every metric, over half of program participants opted into our engagement activities alongside the traditional reading activities. And the fact that 94% of participants feel our community is one of readers? Well, that makes a librarian's heart happy. And these data all provide solid baselines for continuing to measure engagement over the course of subsequent reading clubs.
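(The survey percentages themselves are just straight tallies of the response options; something like this, with invented responses:)

```python
from collections import Counter

# Hypothetical answers to "I submitted an item to the Camp Mailbox."
answers = ["Yes", "Yes", "No", "Yes", "Not Sure", "Yes", "No", "Yes"]

counts = Counter(answers)
# 62% with these made-up answers.
print(f"Submitted to the Camp Mailbox: {counts['Yes'] / len(answers):.0%}")
```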

So that's how we did it: that's how we measured engagement during Summer Reading 2017.


