Photo by Allison on Flickr.
I recently realised that we, cultural institutions, are using the wrong metrics to measure our online success, because we’re measuring just that: generic success. We’re using statistics and software that are perfectly fine when you’re selling Cokes, but might not be ideal for culture, heritage and the arts.
In the real world we know our success cannot easily be measured in hard figures. Visitor numbers and shop turnover are important KPIs, especially as our funding and financial well-being often depend on them. Yet these quantitative measures of success are hardly ever part of our mission. Instead, we consider ourselves successful when we change behaviour, increase knowledge, spark imagination… Evaluators use complex toolkits and checklists to see whether an exhibition had the right impact, or an event the expected outcome. In the real world, we are successful if we are significant.
Not online. In almost all project presentations I’ve seen in the past year, success is measured in hits, comments and likes. Sometimes more advanced metrics are used that hint at participation, enthusiasm or loyalty. Once or twice I’ve heard people refer to quality of service (search queries resolved). Success, online, is a number.
If online is a full part of your institution, online success is significance as well.
Surely a thousand donations to your crowdfunding project are a success, but do they make a dent in your universe? And similarly, 2,000 visitors to your mini site is nothing to boast about, but what if all their lives were changed because of it?
At the annual Ecsite conference last week I joined a session on real-world evaluation of exhibitions. I was amazed by how well the tools presented showed whether an activity was significant. Returning home, I found an email from booking.com asking me to rate my hotel and experience, which showed me how easily such evaluation tools can be integrated into our online activities. I’ve left my email address on a gazillion cultural websites, yet never have I been asked to answer 7 straightforward questions that could have told the institution whether it was significant in my life.
(I’m not talking about the 300+ question SurveyMonkey surveys sent out by interns before, during or after a website redesign, by the way; I’ve had enough of those.)
For the next cultural online activity I design, I will work with an evaluator to measure significance rather than just quantitative success. The framework I use when designing activities already acknowledges the organisational mission, so defining significance will be easy.
What about you? I’m sure I’ve missed a whole lot of online projects that ignored Google Analytics and instead focused on other success indicators in their evaluation. Tell me about them in the comments, if you please.