Foundations and donors are increasingly questioning the impact of their funds at museums and historic sites, a trend that’s also growing in business, according to Jack Phillips and James Kirkpatrick at a session at the ASTD conference yesterday. After the recent recession, they’ve found that CEOs are increasingly asking about the return on investment (ROI) of every program and activity, including employee training and education. Although training claims to be an essential contributor to business productivity and performance, it hasn’t been adequately measured or evaluated, and thus can’t prove its value. That surprised me because I thought that was a struggle only for museums and historic sites. We seem to be continually fighting to prove our worth and, other than economic impact, haven’t been able to show why we matter in our communities. It looks like we’re not alone.
Phillips and Kirkpatrick are the leaders in the field of measuring performance in business and have developed frameworks that “define the levels at which programs are evaluated and how data are captured at different times from different sources.” Although they disagree on whether the framework should have four or five levels, they both agree that evaluation is usually stuck at the first two levels. While the Kirkpatrick Model consists of four levels, Phillips has a five-level evaluation framework as follows:
| Level | Measurement Focus | Key Questions Asked |
|---|---|---|
| 1. Reaction and Planned Action | Measures participant satisfaction with the program or process. | Is the program or process relevant, important, useful, or helpful? |
| 2. Learning | Measures changes in knowledge, skills, and attitudes. | Did the participants increase or enhance knowledge, skills, or perceptions? |
| 3. Application and Implementation | Measures changes in performance or action. | Are participants applying the knowledge, skills, or information? |
| 4. Business Impact | Measures changes in key business measures. | How does the application improve output, quality, cost, time, and satisfaction? |
| 5. ROI | Compares program benefits to the costs. | Do the monetary benefits of the program exceed the investment in the program? |
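If you’re wondering what the Level 5 calculation actually looks like, here’s a minimal sketch in Python, assuming the commonly cited Phillips formulas (a benefit-cost ratio and an ROI percentage); the dollar figures are invented for illustration and aren’t from the session:

```python
# A minimal sketch of Phillips's Level 5 (ROI) arithmetic, not taken from the
# session itself. Assumes the commonly cited Phillips formulas:
#   BCR     = program benefits / program costs
#   ROI (%) = (program benefits - program costs) / program costs * 100
# The dollar figures below are hypothetical.

def benefit_cost_ratio(benefits: float, costs: float) -> float:
    """Return the benefit-cost ratio (BCR) for a program."""
    return benefits / costs

def roi_percent(benefits: float, costs: float) -> float:
    """Return ROI as a percentage of the fully loaded program costs."""
    return (benefits - costs) / costs * 100

if __name__ == "__main__":
    # Hypothetical workshop: $20,000 in costs, $50,000 in monetized benefits
    # (e.g., staff time saved, increased attendance revenue).
    costs, benefits = 20_000, 50_000
    print(f"BCR: {benefit_cost_ratio(benefits, costs):.2f}")  # 2.50
    print(f"ROI: {roi_percent(benefits, costs):.0f}%")        # 150%
```

Of course, the hard part isn’t the arithmetic but converting program results into credible monetary benefits, which is what the lower levels of the framework help establish.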
Both believe that education and training programs have to demonstrate their value to the organization and need to figure this out before the CEO asks for this information–otherwise they may lose their budgets and jobs. They also believe that evaluation is a disruptive force and part of the “change management” process. We can make a parallel argument in the museum and historic site field. We need to demonstrate the impact of our programs and activities before funders ask–solid evaluations at the higher levels can help make the case. Evaluation can also be difficult to implement because people don’t like to change. If you’d like to learn more, both Phillips and Kirkpatrick have lots of resources on their websites and have written several books on the topic.
On an entirely different note, I’m struck by how differently the conference is organized. At a museum or history conference, most sessions run 90 minutes and are dominated by panels of 2-3 speakers with a moderator. At ASTD, most sessions are led by one person, allowing them to go into a topic in depth or incorporate some activities. When panel sessions are used, it’s to discuss various perspectives on the same issue (museum conferences tend to avoid disagreements, although history conferences are notorious for chewing up speakers during the Q&A). They also vary the length of the sessions, starting with 120-minute sessions at 10 am and then slowly reducing to 60 minutes for the 4:30 pm session. That provides a way to match time and topic, but it also follows the declining energy of participants as the day progresses.
Max, thank you for this post! Sure glad we are not alone. Evaluation can serve so many functions in an organization–not the least of which is professional learning about museum practice and results–and you are right, people do not like to change even when the data provides evidence that a change is needed to achieve the museum’s goal. So in a conference about training, are there any sessions about training humans to embrace change and flexibility? If so, can you share?
Thanks, Randi. There were lots of sessions on managing change; however, I didn’t get a chance to hear them. Looking at the recently released second edition of the ASTD Handbook (a big, fat anthology codifying best practices in training and development), they recommend Managing at the Speed of Change by Conner (2006); The Heart of Change by Kotter and Cohen (2013); and Change is Everybody’s Business by McLagan (2002).
Considering that most historic sites and house museums in America have very small staffs, doesn’t this push for accountability favor the few really large sites that can afford to spend the man hours evaluating their ROI? Sites with one, two, or three overworked staff members don’t usually have the time to devote to their own self-analysis, so will the smaller sites starve for lack of grant funds while the larger museums with staff available for self-analysis gather the majority of the funding? For that matter, how are small for-profit businesses with one or two employees going to engage in this level of introspection?
When I was at the conference yesterday, I posed your question to Jim Kirkpatrick. He said it’s possible to do good evaluation with small non-profits and other organizations with limited resources–he’s done it, for example, with small church groups who are setting up schools and orphanages overseas. The key is to start with the end in mind (Level 4): what are the desired outcomes? Every organization should have that in writing. What will vary depending on capacity is the amount and type of evaluation. For small organizations, just focus on the one program or activity that you believe contributes the most to outcomes. That’s where you evaluate reaction (Level 1) and learning (Level 2). Again, conducting evaluation is intentionally disruptive, so it will require additional effort to implement and maintain.
Great topic, Max. Perhaps the more general point, and one that may be more comfortable to some, is simply, “Have objectives and measure your progress in achieving them and the resources used in the process.”
As another commenter indicated, this may be more difficult at smaller organizations. In the case of all-volunteer organizations, for some the objective may be keeping a historic house open and used so that it isn’t abandoned and razed. For others, it may be maintaining the archives of a town, knowing that someday someone will find them useful.
For organizations that are a little larger, especially those that receive grant money, it seems very reasonable to expect that there is not just a purpose, but actual goals with measurable results.
Regarding your observation about the format of the conference you attended versus the typical museum or history conference, wouldn’t it be great to see some upcoming conferences act on that idea? Different formats, including those that provide exposure to more tightly condensed talks, make a lot of sense.
Shorter formats are also well-suited to presenting specific ideas that attendees could adopt or adapt. For example: five talks in about an hour, with nine minutes or so per speaker followed by one follow-up question for each.
This might be done thematically, such as . . .
– Creative special events
– Creative fund-raising ideas
– Creative programs for kids
. . . to name a few.
Further discussion would be in the hall at the end of the session or at a birds-of-a-feather-go-have-drinks-together informal session.
One of the things that drives the current format is the way in which selections are made, with a CFP deadline many months before the conference and all of the work done by small committees. Imagine instead if a few sessions were given over to some new approaches. For example, invite people to post what they want to talk about 30 days in advance and give everyone a 10-day window to vote on the ones they want to see.
I recently proposed doing a History Camp–an unconference focused on history–in conjunction with the National Council on Public History conference in 2015: http://www.thehistorylist.com/blog_posts/history-camp-nashville-proposed-for-ncph-2015-annual-conference. Will be interesting to hear the committee’s reaction.
Thanks, Lee! Great suggestions.