Ever deployed an email program change right before Black Friday? 😬 In the latest Arcana episode, Mickey Chandler dives into why timing your email program changes matters more than you think. From authentication updates to ESP migrations, learn how to implement changes without risking deliverability or compliance issues.
🎯 Key takeaways:
Why January is ideal for email program changes
What to think about when testing
Creating fallback plans that work
Listen now to avoid the costly mistakes we've seen over 20+ years in email deliverability. #EmailMarketing #Deliverability #EmailAuthentication #PrivacyCompliance
Transcript
There's never a perfect time to make changes to your email program, but there are definitely wrong times. If you've ever deployed changes during peak sending periods or rushed updates to meet compliance deadlines, you know exactly what I mean. This week on Arcana: how to implement changes to your email program without breaking everything in the process.

Welcome to Arcana, Whizardries' podcast for email deliverability and privacy. January is an interesting time in the email world. The holiday rush is over, teams are catching their breath, and a lot of companies are looking at their email programs, thinking about what changes they need to make in the coming year. Maybe it's updating authentication to meet new requirements, which we were doing this time last year. Maybe it's adjusting processes to account for new privacy compliance requirements. Maybe it's just time to migrate to a new ESP. Whatever changes you're planning, how you implement those changes matters just as much as what you're implementing.

The risks of poor timing aren't theoretical. When we look at email program changes that fail, rushed implementations during high-stakes periods aren't the most common cause, but they rank right up there. Process changes that happen during peak sending windows are another common reason things fail. So are last-minute privacy updates that get pushed live without discussion or notification to impacted teams, and infrastructure migrations when teams are already stretched thin. All of these can cause problems for a program, and they aren't just inconvenient, they're potentially catastrophic for your company. A failed authentication deployment can mean that legitimate mail gets blocked. Privacy compliance issues can trigger regulatory investigations, and don't forget, GDPR has penalties that range up to 4 percent of your company's global turnover, and those investigations can last months or years. Infrastructure problems can take days or weeks to fully resolve.

The difference between successful and failed implementations often comes down to three things: timing, testing, and having a fallback plan. Let's break these down, because each one matters more than you might think.

Let's start with timing. Every email program has natural rhythms: times when volume is higher, times when it's lower, times when you can afford a hiccup, and times when you absolutely cannot have anything go wrong. Understanding these rhythms is critical. For instance, if you're in retail, implementing major changes between, say, Halloween and Christmas isn't just risky, it's potentially catastrophic. Even a small delivery problem during peak season can have massive revenue impact. But January? That's different. Volume is lower, especially compared to December. The stakes are lower, and you now have room to handle unexpected issues. The same principle applies to B2B senders. Maybe your busy period isn't the stretch between Thanksgiving and Christmas. Maybe it's the end of a quarter, or your annual conference, when everyone is in town attending sessions and workshops. Whatever your company's rhythm is, you need to work with it, not against it. But it's not just about avoiding busy periods. It's about taking advantage of quiet ones. These windows of opportunity are perfect for testing changes, validating your results, and having time to roll back if something goes wrong.
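As a rough illustration of mapping those rhythms, here is a minimal Python sketch that totals monthly send volume from a hypothetical CSV export of send logs and flags the quietest months as candidate change windows. The file name, column names, and the 75 percent cutoff are assumptions for the example, not part of any particular platform's tooling.

```python
# Minimal sketch: find quiet months in historical send volume.
# Assumes a CSV export of send logs with "sent_at" (ISO date) and "messages"
# columns -- the file name and columns here are hypothetical.
import csv
from collections import defaultdict
from datetime import datetime

def monthly_volume(path="send_log.csv"):
    totals = defaultdict(int)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            month = datetime.fromisoformat(row["sent_at"]).strftime("%Y-%m")
            totals[month] += int(row["messages"])
    return dict(totals)

def quiet_windows(totals, fraction=0.75):
    """Return months whose volume falls below `fraction` of the average month."""
    average = sum(totals.values()) / len(totals)
    return sorted(m for m, v in totals.items() if v < average * fraction)

if __name__ == "__main__":
    totals = monthly_volume()
    print("Candidate change windows:", quiet_windows(totals))
```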
Which brings us to our second point: testing. Let's break down what proper testing actually requires.

You need a complete staging environment that mirrors your production infrastructure. You want authentication that works. You want privacy controls that are in place. You want routing rules that closely align with what you're already doing in production. Testing needs to cover seed list addresses at major mailbox providers, whether you're paying someone for them or you've set them up yourself. Why? Because you need to verify that your new system passes authentication for every mechanism you've deployed: SPF, DKIM, DMARC, all of that needs to be validated. You need to validate that your privacy compliance controls are functioning. And most importantly, you need to generate detailed logs and metrics that you can use to identify potential issues before they become problems.

The reality is that email is not easy; it's fairly complex. You're dealing with authentication. You're dealing with reputation. You're dealing with content filtering. You're dealing with privacy compliance issues. And all of these interact with each other. A change that looks fine in testing might cause problems at scale. Or worse, it may cause problems that don't show up until days or weeks later.

So when we're talking about testing and we say you need to validate your results, what does that actually mean? It's not enough just to check whether messages are delivering, that is, whether the mailbox provider receives the message and tells you, "I'll take it and do something with it." You want specific, measurable criteria for success. That includes checking your authentication pass rates. Gmail is particularly helpful here, because it places headers in your incoming messages that show what it checked and whether or not each check validated. You want to look at delivery rates for the major mailbox providers you're dealing with. You want to look at response times for the sending infrastructure. You want to check that your privacy compliance controls are in place and working: if someone gets a mail at one of those mailbox providers and clicks the unsubscribe button, will they actually be unsubscribed? And you want to be able to check your new engagement metrics and compare them to your old ones.

Now, a word of warning here: it's important to understand that when you change infrastructure, especially if you're doing something like changing an email service provider, you may in fact see a dip in deliverability for a period of time while the mailbox providers get used to the new infrastructure. That's not unusual. But you want to have engagement metrics that you can use as a baseline so that you know when things are back to normal. And if things don't come back to normal over an extended period, you need to know where they should be, so you can talk to your provider about how to get them back there. And then you also want system logs that are showing you expected behavior.
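To make that authentication check concrete: Gmail, like other providers that follow RFC 8601, records its SPF, DKIM, and DMARC verdicts in an Authentication-Results header on the delivered message. Below is a minimal Python sketch that pulls those verdicts out of a single message saved from a seed inbox. The file name is hypothetical, and a real validation pass would fetch messages from all of your seed accounts and aggregate pass rates rather than inspecting one file.

```python
# Minimal sketch: read SPF/DKIM/DMARC verdicts from the Authentication-Results
# header of a message saved from a seed inbox (the file name is hypothetical).
import re
from email import policy
from email.parser import BytesParser

def auth_results(path="seed_message.eml"):
    with open(path, "rb") as f:
        msg = BytesParser(policy=policy.default).parse(f)
    verdicts = {}
    # A message can carry several Authentication-Results headers; check them all.
    for header in msg.get_all("Authentication-Results", []):
        for mech, result in re.findall(r"\b(spf|dkim|dmarc)=(\w+)", str(header), re.I):
            verdicts[mech.lower()] = result.lower()
    return verdicts

if __name__ == "__main__":
    results = auth_results()
    for mech in ("spf", "dkim", "dmarc"):
        print(f"{mech.upper()}: {results.get(mech, 'not reported')}")
```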
Changes to your email program don't just affect your email team. One of the things I have seen throughout my 20-year career in email is that people really only think about their own team. So the email team is charged with implementing things, and when they make changes, they make certain that everybody on the email team knows. But changes to your email program can also affect your marketing team, your sales team, even your customer service team. Anyone who relies on email to do their job is impacted, and this means that your implementation plan needs to include communication strategies so that those teams find out the things that matter to them. An implementation plan needs to spell out when you'll notify internal teams about what's going on, how you'll handle questions and concerns, what metrics you'll be able to share with those teams to demonstrate success, and what's going to tell you when something goes wrong.

Because even with perfect timing and thorough testing, things can still go wrong. Mr. Murphy's been around for a long time, and he's not going anywhere. That's where your fallback plan comes in. Every change needs three things: an implementation plan, a validation plan, and a rollback plan. Your implementation plan outlines how you're going to make the change. Your validation plan confirms that the change worked. But your rollback plan? That's your safety net. It's what you do when something unexpected happens. And something unexpected usually does happen. Maybe authentication starts failing for some subset of messages. Maybe your privacy compliance tools start flagging false positives. Maybe delivery rates drop at a specific mailbox provider. That's why having a detailed fallback plan is crucial. It's not admitting defeat, it's being prepared. You need predetermined triggers that tell you it's time to roll back a change (a rough sketch of what such a trigger might look like follows the transcript). You need documented procedures for how you'll roll back. And most importantly, you need to test those procedures before you need them.

So what does that mean for your email program? Let's talk practical steps. First, map out your email program's natural rhythm. When are your high-volume periods? When are things quieter? When do you have the bandwidth to handle potential issues? Second, document everything: your implementation plan, your validation criteria, and especially your rollback procedures. Being in the middle of an issue is not the time to figure out how to undo what you've been trying to do. Third, build a testing framework that covers all of your bases. Don't just test that messages deliver. Test that they authenticate. Test that your controls work. Test that your reporting captures everything you need. Finally, be realistic about timelines. Change often takes longer than you think. Testing takes longer than you think. Validation takes longer than you think. Build in buffer time. You'll need it.

So before our next episode, there are three things that I want you to do. Number one, document your program's last 12 months and use that to forecast peak periods. Number two, create a change impact matrix: who needs to know about changes, who needs to approve them, and who needs to help implement them. And number three, review your current testing procedures and identify any gaps.

Remember, there's never a perfect time to change your program, but there are definitely wrong times. The key is knowing the difference and planning accordingly. This has been Arcana. Contact us at whizardries.com for more information on how we can help you get your mail delivered.
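As mentioned above, here is a minimal sketch of what a predetermined rollback trigger might look like in code, assuming you already collect baseline and post-change metrics. The metric names and thresholds are illustrative assumptions, not recommendations for any particular program.

```python
# Minimal sketch of a predetermined rollback trigger: compare post-change
# metrics against a pre-change baseline and report which thresholds tripped.
# Metric names and thresholds are illustrative, not recommendations.
from dataclasses import dataclass

@dataclass
class Threshold:
    metric: str
    max_relative_drop: float  # e.g. 0.10 means "trip on a drop of more than 10%"

THRESHOLDS = [
    Threshold("delivery_rate", 0.05),
    Threshold("open_rate", 0.20),
    Threshold("dmarc_pass_rate", 0.02),
]

def rollback_triggers(baseline: dict, current: dict) -> list[str]:
    """Return the names of metrics that have dropped past their threshold."""
    tripped = []
    for t in THRESHOLDS:
        base, now = baseline.get(t.metric), current.get(t.metric)
        if base and now is not None and (base - now) / base > t.max_relative_drop:
            tripped.append(t.metric)
    return tripped

if __name__ == "__main__":
    baseline = {"delivery_rate": 0.98, "open_rate": 0.22, "dmarc_pass_rate": 0.99}
    current = {"delivery_rate": 0.91, "open_rate": 0.21, "dmarc_pass_rate": 0.99}
    print("Roll back?", rollback_triggers(baseline, current) or "no triggers tripped")
```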