
3 Jun, 2012 • Technology

Going from measuring online success to measuring significance

I recently realised that we, cultural institutions, are using the wrong metrics to measure our online success, because we’re measuring just that: generic success. We’re using statistics and software that are perfectly fine when you’re selling Cokes, but might not be ideal for culture, heritage and the arts.

In the real world we know our success cannot easily be measured in hard figures. Visitor numbers and shop turnover are important KPIs, especially as our funding and financial well-being often depend on them. Yet these quantitative measures of success are hardly ever part of our mission. Instead, we consider ourselves successful when we change behaviour, increase knowledge, spark imagination… Evaluators use complex toolkits and checklists to see if an exhibition had the right impact, or an event the expected outcome. In the real world, we are successful if we are significant.

Not online. In almost all project presentations I’ve seen in the past year, success is measured in hits, comments and likes. Sometimes more advanced metrics are used that hint at participation, enthusiasm or loyalty. Once or twice I’ve heard people refer to quality of service (search queries resolved). Success, online, is a number.

If online is a full part of your institution, online success is significance as well.

Surely a thousand donations to your crowdfunding project count as a success, but do they make a dent in your universe? And similarly, 2,000 visitors to your mini site is nothing to boast about, but what if all their lives were changed because of it?

At the annual Ecsite conference last week I joined a session on real-world evaluation of exhibitions. I was amazed by how well the tools presented showed whether an activity was significant. Returning home, I found an email from booking.com in my inbox asking me to rate my hotel and experience, which showed me how easily such evaluation tools can be integrated into our online activities. I’ve left my email address on a gazillion cultural websites, yet never have I had to answer 7 straightforward questions that could have told the institution whether it was significant in my life.

(I’m not talking about the 300+ question SurveyMonkey surveys sent out by interns before, during or after a website redesign, by the way; I’ve had enough of those.)
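To give a rough idea of how lightweight such a follow-up could be, here is a minimal sketch of a short post-visit questionnaire and a simple way to summarise the answers. The seven questions, the 1–5 scale and the scoring function are all hypothetical illustrations on my part, not a validated evaluation instrument and not taken from any of the toolkits mentioned above.

```python
# A hypothetical post-visit "significance" questionnaire: seven short
# questions on a 1-5 agreement scale, emailed after an online activity.
# Questions and scoring are illustrative only.

QUESTIONS = [
    "I learned something I didn't know before.",
    "The activity changed how I think about the topic.",
    "I talked about it with someone afterwards.",
    "I looked up more information because of it.",
    "It inspired me to visit (or revisit) the institution.",
    "I would recommend it to a friend.",
    "It was a meaningful use of my time.",
]

def significance_score(answers: list[int]) -> float:
    """Average one visitor's 1-5 answers into a single 0-1 score."""
    if len(answers) != len(QUESTIONS):
        raise ValueError("expected one answer per question")
    if any(a < 1 or a > 5 for a in answers):
        raise ValueError("answers must be on a 1-5 scale")
    return sum((a - 1) / 4 for a in answers) / len(answers)

if __name__ == "__main__":
    # One visitor's made-up answers, in question order.
    print(round(significance_score([4, 5, 3, 4, 2, 5, 4]), 2))  # -> 0.71
```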

For the next cultural online activity I design, I will work together with an evaluator to measure significance rather than just quantitative success. The framework I use when designing activities already acknowledges the organisational mission, so defining significance will be easy.

What about you? I’m sure I’ve missed a whole lot of online projects that ignored Google Analytics and instead focused on other success indicators in their evaluation. Tell me about them in the comments, if you please.

Photo by Allison on Flickr.

Comments

  • Seb Chan (http://www.sebchan.com)

    There’s a lot of interesting work and thinking going on in academia under the banner of AltMetrics. Much of this is relevant to how museums can rethink measuring ‘significance’ and ‘impact’ – http://altmetrics.org/manifesto/

    I think you need to make the distinction between immediate, medium-term and long-term time scales too. Most of the quantitative tools we have at our disposal are about ‘immediate’ impact (this got even worse when everyone was excited about ‘Real Time’ in Google Analytics). But really, a lot of what we are talking about when we think of ‘impact’ are the 10+ year impacts that can only be discovered through longitudinal studies. There have been a few of these – “what made you choose your career as a scientist? Well, when I was 8 I remember visiting this exhibition at the museum and . . . ” – but nowhere near enough.

    Arriving in New York, there was this great campaign for the Brooklyn Academy of Music (BAM) which captures the challenge and opportunity for arts organisations. Their tagline ‘BAM! And then it hits you’ captures the reality that after you’ve seen an amazing show it might not be for a few days afterwards that you really ‘get it’. And often it is those shows that you keep on talking about that have been the most ‘impactful’ and ‘effective’. See the NYT on the campaign here – http://www.nytimes.com/2011/09/19/business/media/new-campaign-lets-bam-hit-you-all-over.html?pagewanted=all

  • Jasper Visser (http://themuseumofthefuture.com/)

    Hi Seb, thanks for your reply. I really enjoy the thinking in the AltMetrics manifesto, as it takes the thinking about online impact one step further.

    If I remember the courses I did on evaluation and testing at university correctly, there are countless ways to measure medium- and long-term effects, but most of them require designs much more elaborate and proactive than installing Google Analytics and staring at graphs. It’d be good to pick up those books again and translate these methodologies to the online cultural world (I’m writing this down as a to-do).

    Going to the session at Ecsite and writing this blog post has made me realise that although we focus on tangible results in the online work we do at Inspired by Coffee, the metrics/evaluation part of our work deserves far more attention than it currently gets.

  • bridgetmck (http://twitter.com/bridgetmck)

    This is a good question. A lot of my work is evaluating cultural learning and outreach projects, whether they are IRL, online or hybrid. We might use Google Analytics etc. as part of the quantitative work, but 99% of my time is spent asking questions like: How long will people be using this, and if not for long, how can it evolve or have a legacy? How does it meet the needs of educators and learners? How can these resources be made more accessible to a wider group? How is it developing their skills of digital literacy? How does it build knowledge assets for the museum? And so on. We’ve also done a little work using more sophisticated techniques to measure Social Return on Investment. Each evaluation project is custom-designed, and because they’re mostly qualitative, we use a range of observational, dialogic and narrative methods. It would be really interesting to develop a more systematic approach to measuring impact, so I’m keen to take a closer look at the AltMetrics Seb points to.

  • Pingback: How is the world different because your museum is online? « museum geek

  • Pingback: Measuring the effectiveness of cultural institutions’ online activities - Historia i Media

  • Pingback: meaningfulness coefficient | widownia