5 Things to Watch Out for with Marketing Mix Modeling

Marketing Mix Modeling (MMM) is a powerful, data-driven statistical analysis tool designed to optimize marketing spend and strategy by exploring multiple variables that contribute to marketing performance. However, MMM also has common pitfalls that can lead to skewed results or incorrect assumptions. In this article, we will explore five things to watch out for in MMM and the best ways to avoid them.  

  1. Not validating data

Businesses rely on sound, reliable data to drive business decisions. If data is incomplete, inconsistent, or simply poor quality, the result is incorrect assumptions and time wasted acting on wrong conclusions. There are many ways data and models can go wrong, such as data decay and overfitting – this is why validation is a crucial process in MMM to reduce the likelihood of such errors.

Overfitting happens when a model fits its training data too closely. It performs well on the training data but poorly on new data, making it unable to produce accurate forecasts beyond what it was trained on – it listens to the “noise” rather than the “signal.”

Data decay occurs when data loses value over time and becomes outdated because of changing trends, new business processes, or other real-world factors. This can affect marketing activities, especially if the data hasn’t been adjusted to reflect recent trends.

How to avoid:

  • Validate your data – Ensure that the data is clean, consistent, and accurate across all departments of your organization.  
  • Pre-process data – Address missing values, outliers, and other anomalies before modeling.
  • Conduct backtesting regularly – This involves running the model on historical data to test the accuracy of its forecasts. In fact, this is a crucial part of model validation and should be done on a regular basis to ensure model forecast accuracy.  
  • Refresh data regularly – Consistently keep data up to date by incorporating the latest inputs to reflect current trends and conditions.  
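The backtesting step above can be sketched in a few lines. This is a minimal, hypothetical example (the spend and sales figures are simulated, not real data): fit a simple model on the first 80% of the weekly history, forecast the held-back weeks, and score the forecast with MAPE.

```python
import numpy as np

# Hypothetical weekly data: marketing spend and the sales it drives.
rng = np.random.default_rng(0)
spend = rng.uniform(10, 100, size=104)            # two years of weekly spend
sales = 50 + 2.0 * spend + rng.normal(0, 5, 104)  # assumed relationship + noise

# Backtest: train on the first 80% of history, forecast the rest.
split = int(len(spend) * 0.8)
X_train = np.column_stack([np.ones(split), spend[:split]])
coef, *_ = np.linalg.lstsq(X_train, sales[:split], rcond=None)

X_test = np.column_stack([np.ones(len(spend) - split), spend[split:]])
forecast = X_test @ coef

# Mean absolute percentage error on the held-back period.
mape = np.mean(np.abs((sales[split:] - forecast) / sales[split:])) * 100
print(f"Backtest MAPE: {mape:.1f}%")
```

Running this regularly as new weeks arrive gives an ongoing read on whether the model's forecasts still hold up.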
  2. Ignoring carry-over effects and saturation effects

Carry-over effects and saturation effects tend to be overlooked by marketers, but they are crucial in understanding the impact of long-term brand building versus short-term sales lifts.  

Carry-over effects (or adstock effects) refer to the lingering impact of a campaign over time. For example, you may start promoting a new product launch a few weeks ahead, but only see significant impact closer to the launch date – the earlier promotion carries over.

Saturation effects refer to the diminishing returns from repetitive exposure. The law of diminishing returns states that beyond a certain point, each additional unit of input yields a progressively smaller increase in output (or return on investment). This also applies to marketing – as you continue to increase advertising spend, efficiency goes down, and this holds for all channels, whether TV, display, or print.

How to avoid:

  • Carry-over effects – Use models that account for the residual impact of past campaigns on current sales, including lagged variables which capture delayed responses to marketing.  
  • Saturation effects – Build diminishing returns into the model to reflect how marketing efforts become less effective as exposure accumulates. Accounting for saturation keeps the model from overestimating the impact of marketing activities and shows where to stop increasing spend on channels that are already saturated, so budget can be allocated elsewhere.
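Both effects are commonly modeled as transforms applied to the spend series before it enters the model. Below is a minimal sketch: a geometric adstock for carry-over and a Hill-type curve for saturation. The decay rate and half-saturation point are illustrative assumptions; in practice they are estimated from your own data.

```python
import numpy as np

def geometric_adstock(spend, decay=0.5):
    """Carry-over: each period retains `decay` of the previous period's
    adstocked value, so past spend keeps contributing to the current effect."""
    adstocked = np.zeros_like(spend, dtype=float)
    carry = 0.0
    for t, x in enumerate(spend):
        carry = x + decay * carry
        adstocked[t] = carry
    return adstocked

def hill_saturation(x, half_sat=50.0, shape=1.0):
    """Diminishing returns: response flattens as (adstocked) spend grows.
    `half_sat` is the spend level producing half the maximum effect."""
    return x**shape / (half_sat**shape + x**shape)

# One burst of spend, then nothing: the effect decays over the following
# weeks instead of vanishing, and extra spend would face flattening returns.
spend = np.array([100.0, 0.0, 0.0, 0.0])
effect = hill_saturation(geometric_adstock(spend, decay=0.6))
print(effect.round(3))
```

The key point: without these transforms, the model attributes each week's sales only to that week's raw spend, missing both the lag and the flattening.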
  3. Relying on impressions instead of spend

Measuring marketing effectiveness based on ‘vanity metrics’ like impressions, likes, followers, etc., can be misleading, as they do not necessarily correlate with actual sales or ROI. These metrics look impressive on the surface (and can make you feel good!) but tend to lack substance and don't always translate into meaningful business impact. That is not to say they are useless – context, and knowing how they tie back to business goals, is key when using them.

How to avoid:

  • Focus on spend efficiency, rather than just counting impressions – This is important in gauging the actual impact of marketing activities as it relates to business goals. Impressions are useful for gauging brand awareness and reach, but at the end of the day, what matters most is whether investing in those activities had an impact on ROI and other relevant metrics like conversions and sales.
  • A/B testing – Run tests comparing campaign performance based on conversion metrics rather than just impressions.
  • Use impressions only when necessary – Impressions can help measure reach and awareness, but it's crucial to model how the dollars spent translate into more actionable metrics like sales and customer retention.  
  4. Treating all marketing channels the same

This relates to the challenge of cross-channel marketing, where each channel gets individual time and investment, but all relate back to ‘one single source of truth.’ A common mistake is treating every channel the same, such as assigning them all the same KPI, when not all channels contribute equally. This can leave some channels heavily underutilized simply because they did not generate the same results as others. For example, conversions may be a poor KPI for social media (users are rarely in the ‘mindset’ to buy while browsing), whereas conversions are far more relevant for a website, where users are more likely to convert. Remember that marketing channels often differ in their response curves, effectiveness, and how consumers interact with them.

How to avoid:

  • Use channel-specific response curves – Adjust the MMM model to account for how different channels perform, and avoid giving all channels the same weight. For example, digital ads may perform differently compared to TV ads due to different levels of engagement.
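One way to express this in a model is to give each channel its own response-curve parameters. The sketch below uses Hill-type curves with purely illustrative numbers (a hypothetical TV channel that saturates slowly at high spend, and a hypothetical social channel that saturates quickly); in a real MMM these parameters are estimated per channel from data.

```python
# Illustrative per-channel curve parameters (not real benchmarks):
# half_sat = spend producing half the channel's maximum effect,
# max_effect = the channel's ceiling on incremental sales.
channel_params = {
    "tv":     {"half_sat": 200.0, "max_effect": 80.0},
    "social": {"half_sat": 30.0,  "max_effect": 25.0},
}

def channel_response(channel, spend):
    """Hill-type response curve with channel-specific parameters."""
    p = channel_params[channel]
    return p["max_effect"] * spend / (p["half_sat"] + spend)

def marginal_return(channel, spend, step=1.0):
    """Approximate extra effect from one more unit of spend."""
    return channel_response(channel, spend + step) - channel_response(channel, spend)

# At the same spend level, the two channels respond very differently,
# so giving them the same weight (or KPI) would misallocate budget.
for ch in channel_params:
    print(ch, round(channel_response(ch, 100.0), 1),
          round(marginal_return(ch, 100.0), 3))
```

Comparing marginal returns across channels like this is what lets the model say where the next dollar of budget does the most work.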
  5. Bias

There are multiple ways bias can occur – modeling bias, aggregation bias, and missing-variable bias among them – and this can be a big deal, since a biased model can still generate output that appears ‘logical’ but is in fact quite flawed.

Types of bias:

  • Aggregation bias – This occurs when data is combined or averaged across groups in a way that masks or distorts differences within the data, which may result in misleading conclusions.
  • Missing-variable bias – This occurs when variables important to business drivers are left out of a model, intentionally or not, leading to incorrect conclusions about the data. In other words, omitting important factors from the analysis can dramatically skew the results.
  • Modeling bias – This refers to systematic errors in the model, which can arise from incorrect assumptions, limitations in the model, or biased data, causing it to make incorrect predictions and generate skewed results.

How to avoid:

  • Continually refine assumptions and test multiple scenarios to ensure that you are not making conclusions based on skewed data.  
  • Use hold-out samples – This refers to a random sample of data that is excluded from the model training, ensuring that the model is not overfitting the data and is not biased towards any one specific variable. This allows for an unbiased evaluation of the model’s performance when applied to new data.  
  • Work closely with the person in charge of the model – Make sure you fully understand which parameters go into your MMM model and how they relate to your business objectives, as well as which parameters are chosen by a human (based on knowledge and expertise) versus learned by the model itself.
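The hold-out idea above can be sketched concretely. This minimal, hypothetical example (simulated spend and sales, not real data) randomly sets aside 20% of weeks, fits on the rest, and reports R² on the unseen weeks as an unbiased check of how the model generalizes.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical dataset: weekly spend on two channels and observed sales.
n = 120
X = rng.uniform(0, 100, size=(n, 2))
y = 30 + 1.5 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 4, n)

# Randomly hold out 20% of weeks; the model never sees them during fitting.
idx = rng.permutation(n)
holdout, train = idx[: n // 5], idx[n // 5 :]

X_train = np.column_stack([np.ones(len(train)), X[train]])
coef, *_ = np.linalg.lstsq(X_train, y[train], rcond=None)

X_hold = np.column_stack([np.ones(len(holdout)), X[holdout]])
pred = X_hold @ coef

# R^2 on the hold-out set: high only if the model generalizes,
# not merely memorizes the training weeks.
ss_res = np.sum((y[holdout] - pred) ** 2)
ss_tot = np.sum((y[holdout] - y[holdout].mean()) ** 2)
r2 = 1 - ss_res / ss_tot
print(f"Hold-out R^2: {r2:.2f}")
```

If hold-out performance is much worse than training performance, that gap is itself the warning sign of overfitting or bias.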

In conclusion, while Marketing Mix Modeling (MMM) can offer valuable insights into the effectiveness of marketing strategies, avoiding these five common pitfalls is key to ensuring accurate results from your MMM model – and to increasing your ROI over the long term.
