A/B testing content

I enjoyed reading a story about how The Huffington Post A/B tests different headlines in real time.

A/B testing is widespread among high-traffic internet sites and can provide useful insight into user behavior patterns and, in aggregate, a good picture of what works and what does not. It's not a panacea, however, and the results are rarely black and white, leaving room for interpretation. Regardless, I like the idea of applying the same technique to content as to presentation and algorithms.
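The mechanics are simple enough to sketch. The following is a minimal, hypothetical illustration (not HuffPo's actual system): each page view is randomly assigned one headline variant, impressions and clicks are tallied, and the click-through rates are compared afterwards.

```python
import random
from collections import defaultdict

class HeadlineTest:
    """Toy headline A/B test: assign a variant uniformly at random
    per impression and tally clicks to estimate click-through rate."""

    def __init__(self, variants):
        self.variants = list(variants)
        self.impressions = defaultdict(int)
        self.clicks = defaultdict(int)

    def serve(self):
        # Pick a headline at random for this page view and count the impression.
        headline = random.choice(self.variants)
        self.impressions[headline] += 1
        return headline

    def record_click(self, headline):
        self.clicks[headline] += 1

    def ctr(self, headline):
        # Click-through rate; zero if the variant was never shown.
        shown = self.impressions[headline]
        return self.clicks[headline] / shown if shown else 0.0
```

In practice you would also want a significance test before declaring a winner, since small differences in raw click-through rate are often just noise.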

There's a story about how the staff at radio listening stations in Britain during World War II learned to recognize individual foreign radio operators by subtle habits in the rhythm and speed of their Morse code. Presumably writers for a multiple-source publication like HuffPo have their own vocabulary and grammar habits in the wording of their article titles too, so it would be fascinating to see whether the increased engagement actually represents a subconscious preference for a particular contributor rather than headline effectiveness itself.