“The Editor Behind the Curtain” – Erin Kissane at Confab
And so it came to pass that after many hours and umpteen blog posts, I’d published all of my notes from Confab. And I’ve left one of the best until last. Erin Kissane gave a fantastic talk about using numbers for good, but with a necessary level of caution.
Erin Kissane’s talk title referenced The Wizard of Oz, one of my little daughter’s favourite movies, so how could I not love it? Especially when she revealed a sub-plot of how having an editorial strategy will help save your ass and find your way during a period of rapid change. Erin doesn’t doubt we are in a period of rapid change, or as she put it “shiny new stuff is always appearing.”
I love shiny new stuff, me. But Erin pointed out that most organisations or people can do three things with “shiny new stuff”. They “deride and ignore” — y’know, the iPad is lame, or Twitter will never catch on. Or they “embrace unquestioningly”. Or they take a moment to work out what is helpful and use what matters.
Unsurprisingly Erin urged us to be in the last camp, and to think about the roots of “editorial strategy”. Editorial is not, she said, about the red pen, about being a stickler, about obsessing over syntax. And editorial is not only style, voice or tone. She said she prefers to think about it as “steering the ship” — not the actions of one person at the helm, or one navigator, or one person in the engine room, but a combination of them under guidance that will keep the whole ship heading in the right direction during a storm.
Erin then went on to talk about her experiences recently working with data-journalists, and I had one of those crash-zoom moments: in a room with a couple of hundred people, only a handful put their hands up to say they knew what data-journalism was. In the same way, I’d wager a room full of 200 data-journalists would struggle to know what “content strategy” was. For a few brief seconds I felt like I was at the centre of a really nerdy Venn diagram.
Erin described data-journalists as “people who live and breathe data” in order to extract stories from it. She thinks there are some useful things for content strategists to gain from the practice because, she said, we do have an awful lot of data, and we are going to get even more. Content strategists might think of themselves as “word people”, but Erin reminded us that we use content score-cards and spreadsheets and analytics and user-testing groups, and that this is all data. The promise of data, she said, is that we want to get closer to our users, and knowing more about their behaviour should help us to help them.
But she also cautioned against the “hypnosis” of data. That because we have numbers we treat them as objectively true, even though any set of numbers will be biased by what we could collect, and why we chose to collect it in that way. A real risk is that we have some data, and so then we ask some questions of what we’ve got, without expanding our horizons. It is much better to ask important questions, and then seek out the data that will answer them. Sticking to datasets you already have also means potentially missing out on stories, connections, and different interpretations.
And numbers don’t tell the whole story. She mentioned one particular website that occasionally sends massive traffic spikes to a project she works on. The load makes the server guy sweat, she explained, but since the audience of that website are not their target market, it isn’t actually of any use to them, even though the raw server log numbers would suggest “Brilliant, loads of traffic, we’re doing great.”
Without naming names, Erin also suggested that some content strategy practitioners were selling their content scoring methods as “scientific” when actually they were “some dude in a basement putting numbers in a spreadsheet read off some post-it notes.” Numbers in this context are a convenience, not a rigorous thing from on high. It was, she said, all too tempting sometimes to try and prove a case to a client by blinding them with numbers rather than really saying “here is how we evaluated this” and actually taking them through the reasoning.
Erin said we risk falling for “a wish or desire to mechanise human judgement” or “the longing we have to replace the dirty messy human stuff with this crystalline new structure of data”, but there were strong positives for content strategists. We should be looking to automate what can be automated, and use machines and data to build tools and rules that support editorial judgement. I always think that a great CMS is one that captures every bit of human input that has gone into the editing process — how long was this on the homepage, did it go in a big image slot for strong photo-stories, was it linked to by a lot of other articles, that sort of data.
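A CMS like that might store those human editorial signals as a small record per article. Here is a minimal sketch of the idea — the field names and the toy scoring heuristic are my own invention for illustration, not anything from the talk or a real CMS:

```python
from dataclasses import dataclass

@dataclass
class EditorialRecord:
    """Hypothetical per-article record of human editorial input."""
    slug: str
    minutes_on_homepage: int = 0     # how long it sat on the homepage
    big_image_slot: bool = False     # promoted in a large photo slot?
    inbound_article_links: int = 0   # links from other articles

    def promotion_score(self) -> float:
        """Toy heuristic combining the captured editorial signals."""
        return (self.minutes_on_homepage / 60.0
                + (2.0 if self.big_image_slot else 0.0)
                + 0.5 * self.inbound_article_links)

record = EditorialRecord("oz-editor", minutes_on_homepage=120,
                         big_image_slot=True, inbound_article_links=4)
print(record.promotion_score())  # 2.0 + 2.0 + 2.0 = 6.0
```

The point isn’t the formula — it’s that each field records a human judgement the machine can then surface, which is exactly the “tools and rules that support editorial judgement” Erin described.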
Data can be a great ally for content strategists, Erin explained, but only by sticking to our editorial roots. What are we trying to accomplish with this piece of content? And for whom? Numbers can help us measure whether we are achieving those goals, but they can’t tell us everything.