I’m in charge of putting together our weekly events emails. As such, I’m interested in discovering patterns, developing hypotheses, and testing my theories, all in the name of providing a more useful experience for our patrons. Each quarter I analyze the data to look for new insights, check the answers to my questions, and ask library stakeholders for guidance on what we should test next.
Our findings for April to June:
* Do email opens go up if we send to low-open users on Thursday instead of Friday? The six-week test showed it doesn’t make a difference.
* Of the two subject lines we test each week across three segments, two of the segments usually pick the same winner.
* Opens decreased in the spring, but not as drastically as I expected.
* New subscribers’ open and click-through rates dropped sharply from winter to spring.
* Email unsubscribes and bounces decreased. The absolute numbers are small, which makes it sound very impressive to say “We had a 26% decline in unsubscribes!”
* We’re gaining email subscribers thanks to the form on our website.
* People do scroll all the way down to the Did You Knows. We know this because their click-through rate is comparable to that of items higher up in the email.
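That caveat about small numbers is worth making concrete. With a small subscriber list, a handful of people can swing the percentage dramatically. The figures below are made up for illustration (they are not our real unsubscribe counts), but they show how the same percentage decline can represent either a few people or hundreds:

```python
def percent_change(before: int, after: int) -> float:
    """Percentage change from one period to the next.

    Negative result means a decline.
    """
    return (after - before) / before * 100

# Hypothetical counts: 27 unsubscribes last quarter vs. 20 this quarter.
# That is only 7 fewer people, yet it reads as a ~26% decline.
print(percent_change(27, 20))

# The same percentage at a much larger scale would mean 700 fewer people.
print(percent_change(2700, 2000))
```

This is why I try to report the raw counts alongside any percentage change.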
Next round I’m thinking about testing:
* Does the age category of the featured event affect unsubscribe rates? I believe the answer is yes, but I’m eager to see whether the data bears that out.
Of course, I need to look at this from a longer-term perspective as well. Perhaps in December, when things usually slow down, I can loosely compare some of my numbers against the 2013–2016 figures.