- must know what we are trying to achieve, before deciding what to measure.
- online measurement is not an exact science, but that doesn’t mean results are invalid. Need a balance between what we are willing to accept as hard truths and what we treat as mere indicators or estimates.
- Numbers don’t mean anything by themselves. Need context, e.g. compared against something (see notes on comments) or against your end in mind (e.g. action, sales)
- the target audience of the data must also accept/understand the measurement method (see Intuit case study)
No point starting social media if you don’t have an objective in mind, and if you can’t measure that objective (see Katie Delahaye Paine, “7 steps to measurable social media success”).
There are many possible metrics (see David Berkowitz, “100 ways to measure social media”)
Understanding Analysis: “The most important part of all analysis is whether and how the resulting metrics will be used”.
E.g. of an ambiguous question: “are fat people lazy?” How fat is fat? How lazy is lazy? Measured against what? Equally vague is the question, “is our marketing effective?” (a way to clarify is to define effectiveness, e.g. as increased sales)
Author highly recommends “The Thinker’s Guide to Analytic Thinking” by Dr Linda Elder and Dr Richard Paul (www.criticalthinking.org) for practical advice on approaching an analysis project.
Flow of customer/ company relationship:
- get people to know you (attention, reach)
- get people to like you (emotions, sentiments, value)
- get people to interact with you (trigger action)
- convince them to buy (trigger action)
Reach: the % of the people you want to touch that you can actually get hold of (but not necessarily get action from).
“measuring by alarm”: alerting when mentions dip below or rise above the usual volume of chatter
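The “measuring by alarm” idea can be sketched as a simple baseline-and-threshold check. This is my illustration, not from the book; the function name and the two-standard-deviation default are assumptions:

```python
from statistics import mean, stdev

def mention_alarm(history, today, k=2.0):
    """Fire an alarm when today's mention count deviates from the usual
    volume of chatter by more than k standard deviations.

    history: daily mention counts for recent "normal" days
    today:   today's mention count
    """
    baseline = mean(history)
    spread = stdev(history)
    return abs(today - baseline) > k * spread

# A typical day stays silent; a spike (or collapse) trips the alarm.
print(mention_alarm([100, 98, 102, 101, 99], 101))  # False
print(mention_alarm([100, 98, 102, 101, 99], 300))  # True
```

In practice the baseline window and threshold would be tuned to the brand’s normal chatter; the point is only that an “alarm” is a deviation test, not an absolute count.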
Twitter retweets as a measure. Tweetbeep.
Twitter case study: salesforce.com. More of an experiment for them in tracking reach via Twitter. They instantly forward any mention of their CEO to their PR dept to monitor reputation risk.
- helps track photos & videos
Mentions the Web Analytics Association in the US, which published draft Social Media Standards.
Measuring Facebook widgets: hosted, viewed, grabbed, installed, used, uninstalled, active, time on page, mouseover, event.
Jodi McDermott, http://www.clearsprings.com. How data/measurements may not be precise (book has a very good anecdote of what not to do; reasons why measurement is not consistent, though that may be changing). “Numbers don’t have to be precise, just compelling”. Suggests ranges and estimates.
Study: “Firm-Created Word-of-Mouth Communication: Evidence from a Field Test”, Marketing Science, Vol. 28, No. 4, David Godes & Dina Mayzlin, 2009. Suggests that word of mouth between less loyal fans (not highly loyal) and looser acquaintances (not friends) maximized incremental sales. [I noted the word incremental.] And that opinion leaders and fans were already “preaching to the converted”, so reaching them would not result in additional sales. Must have conversations “where none would have naturally occurred otherwise”. See http://www.webanalyticsassociation.org/en/art/712
Chapter 4, measuring sentiment. Cites a study about measuring tweets on Michael Jackson’s death: “Detecting Sadness in 140 Characters: Sentiment Analysis and Mourning Michael Jackson on Twitter”.
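Tweet-sentiment studies of this kind commonly use a lexicon-counting approach. A hedged sketch of that general technique (the word lists and function are my own illustration, not taken from the study or the book):

```python
# Tiny illustrative lexicons -- real studies use much larger ones.
POSITIVE = {"love", "great", "legend", "amazing"}
NEGATIVE = {"sad", "rip", "loss", "terrible"}

def sentiment(tweet):
    """Classify a tweet by counting positive vs negative lexicon hits."""
    words = tweet.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(sentiment("so sad such a terrible loss"))   # negative
print(sentiment("what a legend love his music"))  # positive
```

Even this toy version shows why sentiment numbers are estimates, not hard truths: sarcasm, negation, and mourning-specific vocabulary all slip past simple word counts.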
Author defines engagement as “when somebody cares and interacts”. Suggests people can care about, say, the weather and interact via online platforms without really being concerned about the brand. Or they can care about/be interested in a brand but be unwilling or unable to interact, so little engagement.
On bookmarking services; idea of noteworthiness: e.g. how many people share/forward your content, how many link to your content, how many people click through in a given period. Cited from the Web Analytics Association.
“engagement food chain”: see, save, rate, repeated, comment, click, interact, purchase, recommend.
Comments – suggests reviewing the types of posts that get comments, and what sort of comments.
“bad reviews are your friend”
A case on using Net Promoter Score: Tektronix, http://www.btobonline.com, 9 Jun 2008
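Net Promoter Score has a standard formula: percent promoters (scores 9–10 on the 0–10 “how likely are you to recommend us?” question) minus percent detractors (scores 0–6). A minimal sketch with made-up survey data:

```python
def net_promoter_score(scores):
    """NPS = %promoters (9-10) - %detractors (0-6) on a 0-10 scale.
    Scores of 7-8 are "passives" and count toward neither side."""
    n = len(scores)
    promoters = sum(s >= 9 for s in scores)
    detractors = sum(s <= 6 for s in scores)
    return 100.0 * (promoters - detractors) / n

# Hypothetical survey responses: 3 promoters, 1 passive, 1 detractor.
print(net_promoter_score([10, 10, 9, 7, 2]))  # 40.0
```

The result ranges from −100 (all detractors) to +100 (all promoters), which is why NPS is usually quoted as a bare number rather than a percentage.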
Suggests using Google Alerts to track “I would recommend (product/company)” or “I would not recommend…”
“ask them to participate”: you can’t control social media conversations but you can bring them closer.
Case study on Dell’s http://www.ideastorm.com.
Mentions Motorola’s wiki where people can post their self written user manuals.
Chapter 7, getting results.
Recommends Eric T. Peterson’s Big Book of Key Performance Indicators, on measuring web performance.
“all good KPIs drive action”: if a KPI changes drastically or unexpectedly and doesn’t compel some action or reaction, it is probably not an important one, or even redundant.
Chris Lake’s 35 social media KPIs to measure engagement:
E.g. Comments, fans, downloads, posts, reviews, time spent, uploads, forwards, favourites.
Talks about “Key Listening Indicators” and “key community indicators” (latter includes active members, page views, new members, retention, improvements in marketing efficiency, sales)
See “a social media marketing campaign deconstructed” http://econsultancy.com/blog/2363-a-social-media-marketing-campaign-deconstructed
Shared a case study of a company, Intuit, which adopted a traditional research approach (control & study group) to measuring the impact of a web campaign. Management accepted the results because they accepted the research methods. But they also acknowledged that being able to measure hindered them “in how far they want to strive for”: once they understood how to measure, they started to put limitations on the creative side. There was value in first innovating new things, then finding a way to measure later.
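The control-and-study-group approach in the Intuit notes boils down to measuring incremental lift: the exposed group’s conversion rate minus the control group’s. A sketch with invented numbers (the function name, group sizes, and conversion counts are all assumptions for illustration):

```python
def incremental_lift(study_conv, study_size, control_conv, control_size):
    """Incremental impact of a campaign: conversion rate of the exposed
    (study) group minus that of the unexposed (control) group."""
    return study_conv / study_size - control_conv / control_size

# e.g. 120/1000 exposed users converted vs 80/1000 in the control group:
print(round(incremental_lift(120, 1000, 80, 1000), 4))  # 0.04
```

The control group is what makes the result “incremental”: it subtracts out the sales that would have happened anyway, echoing the Godes & Mayzlin point about conversations “where none would have naturally occurred otherwise”.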
An issue is that employees may feel that asking them to justify their results, on top of getting something going in the first place, could be the straw that breaks the camel’s back.
Lists further readings.