
Planning Accuracy

Measure forecast precision

Page navigation

To access the "Demand Planning Accuracy" screen, click the "Forecast" button.

This will open the "Demand Planning Accuracy" page, which allows you to evaluate the performance of both your "Forecast" and your "Plan".

Besides the traditional filter controls, "Demand Planning Accuracy" gives you control over the data you visualize.

Data Selection:

The "Historical Data" gives you access to 2 types of datasets:

  • Actual: This is the historical "Sales Out" that was uploaded into SIMCEL
  • Base Demand: This is the historical "Sales Out" data cleaned of any planned and unplanned demand events

The "Projection Data" gives you access to 4 types of datasets:

  • Committed Plans: The consolidation of all past "Plans set as primary" (validated for execution), which include the Trade and Marketing events and their impacts, with an offset of 1 day.
  • Committed Plans (90 Days Offset): The consolidation of all past "Plans set as primary" (validated for execution), which include the Trade and Marketing events and their impacts, with an offset of 90 days (lag 3).
  • Forecast Base: The consolidation of all past "Forecast Baseline" runs, which do not include Trade and Marketing events, with an offset of 1 day.
  • Forecast Base (90 Days Offset): The consolidation of all past "Forecast Baseline" runs, which do not include Trade and Marketing events, with an offset of 90 days (lag 3).

Note that not every historical dataset can be compared with every projection dataset; the meaningful pairings are:

  • Actual VS Committed Plans or Committed Plans (90 Days Offset): This allows you to compare the planning accuracy, including any Trade and Marketing events
  • Base Demand VS Forecast Base or Forecast Base (90 Days Offset): This allows you to compare the statistical forecast accuracy, excluding any Trade and Marketing events

Error Metrics:

The "Error Metrics" control lets you select the accuracy performance metrics:

SIMCEL uses 6 different metrics (please find the definitions of these KPIs in the Appendix):

  • MAE: Mean Absolute Error
  • MAPE: Mean Absolute Percentage Error
  • MSE: Mean Square Error
  • RMSE: Root Mean Square Error
  • SMAPE: Symmetric Mean Absolute Percentage Error
  • WMAPE: Weighted Mean Absolute Percentage Error
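The definitions from the Appendix can be sketched in code. The function below implements the standard textbook formulas; it is an illustrative sketch, not SIMCEL's internal implementation, and assumes strictly positive demand for the percentage-based metrics:

```python
import math

def error_metrics(forecast, demand):
    """Standard textbook accuracy metrics over paired forecast/demand series."""
    errors = [f - d for f, d in zip(forecast, demand)]
    n = len(errors)
    mae = sum(abs(e) for e in errors) / n
    # MAPE divides each error by that period's demand, so high errors in
    # low-demand periods dominate the average.
    mape = sum(abs(e) / d for e, d in zip(errors, demand)) / n
    mse = sum(e * e for e in errors) / n
    rmse = math.sqrt(mse)
    # SMAPE divides by the average of actual and forecast, penalizing
    # over- and under-forecasting symmetrically.
    smape = sum(abs(f - d) / ((abs(f) + abs(d)) / 2)
                for f, d in zip(forecast, demand)) / n
    # WMAPE weights each period by its demand volume.
    wmape = sum(abs(e) for e in errors) / sum(demand)
    return {"MAE": mae, "MAPE": mape, "MSE": mse,
            "RMSE": rmse, "SMAPE": smape, "WMAPE": wmape}

metrics = error_metrics(forecast=[110, 95, 120], demand=[100, 100, 100])
```

Aggregating the series before calling the function reproduces the effect of the granularity selectors below: the coarser the aggregation, the lower the errors tend to be.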

Data Granularity:

The "Product Granularity" lets you select the level at which the product data will be aggregated before computing the error metrics.

The "Customer Granularity" lets you select the level at which the customer data will be aggregated before computing the error metrics.

The "Time Granularity" lets you select the level at which the Time dimension will be aggregated before computing the error metrics.

Analytics:

The "Error Metrics" panel gives a summary of the projection accuracy based on the selected error metrics. The data is affected by the filters and the aggregation level selected (as a rule of thumb, the more atomic the data, the less accurate the projection).

The "Historical VS Projection Time Series" displays the historical data and the projection data in the same line chart, allowing you to spot the periods where the 2 datasets deviate.

The "Historical VS Projection" displays all data points according to the aggregation level and allows you to assess discrepancies from over/under-forecasting.

The "Volume Vs Error Matrix" displays, for customer and product segments, their respective accuracy as well as their volume. It allows you to identify the best and worst performers.

Metrics Computation:

The Demand Planning Accuracy table is not updated automatically. If there are any changes in the historical data (new data uploaded, base demand adjusted) or in the projection data (forecast updated, new event allocated to the primary scenario, new primary scenario), you will need to click the "Run Metrics" button to ensure that all the analytics display the latest data.

Export Demand Dynamic Report

The Demand Dynamic Report, authored by SIMCEL, analyzes product historical sales performance. The report provides insights into overall demand trends, key contributors, and significant outliers.

To export the report:

  • Click the "View Pre-Forecast Report" button
  • You can configure some meta information of the report:
    • Select Analysis Date Range: Choose the period for the report analysis.
    • Select Time aggregation level: Select how you wish to aggregate time data (Monthly, Quarterly).
    • Channels (Optional): Define a subset of customer segments.
    • Select Segment Group: Choose the aggregation level for historical demand and forecast data, such as Brand, Category, or Channel.
    • Select UoM: Specify the unit of measurement for analyzing historical demand and forecast data.
    • Set outlier coefficient (# stddev multipliers): Set the number of standard deviation multipliers for outlier detection and analysis. This helps in identifying significantly high or low sales volumes during the selected period.
    • Set threshold for # Top Contributors: Establish the number of top contributing segments (e.g., brands or product IDs) to be highlighted in the report. This is the absolute threshold.
    • Set threshold for % contribution of Top Contributors: Define the percentage threshold that identifies top contributors by their share of total demand, sales volume, etc. If the number of contributors exceeding this percentage surpasses the absolute threshold, it will be adjusted to match the absolute threshold limit.
    • Select Date Range to evaluate Top Contributors: Select the specific time frame for identifying top contributors, based on their contribution to sales or demand within the last year or another defined period. For example, we only care about brands that make up 80% of sales in the last 12 months.
  • Then click "Generate"; an HTML report will be automatically saved to your computer.

Appendix

Forecast KPIs definition

Error

The error is the forecast minus the demand.
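In symbols, taking $f_t$ as the forecast and $d_t$ as the demand for period $t$ (the notation used for the Appendix formulas):

```latex
e_t = f_t - d_t
```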

Note that with this definition, if the forecast overshoots the demand, the error will be positive, and if the forecast undershoots the demand, then the error will be negative.

Bias

The bias is defined as the average error.
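With $e_t$ the error for period $t$, this reads:

```latex
\mathrm{bias} = \frac{1}{n} \sum_{t=1}^{n} e_t
```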

Where n is the number of historical periods where you have both a forecast and a demand.

As a positive error on one item can offset a negative error on another item, a forecast model can achieve very low bias and not be precise at the same time.

MAPE

MAPE is the sum of the individual absolute errors divided by the demand (each period separately). It is the average of the percentage errors.
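In symbols, with $e_t$ the error and $d_t$ the demand for period $t$:

```latex
\mathrm{MAPE} = \frac{1}{n} \sum_{t=1}^{n} \frac{|e_t|}{d_t}
```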

MAPE is a poor accuracy indicator. As you can see in the formula, MAPE divides each error individually by the demand, so it is skewed: high errors during low-demand periods will have a major impact on MAPE.

MAE

It is the mean of the absolute error.

One of the first issues of this KPI is that it is not scaled to the average demand. If one tells you that MAE is 10 for a particular item, you cannot know if this is good or bad. If your average demand is 1000, it is, of course, astonishing, but if the average demand is 1, this is a very poor accuracy. To solve this, it is common to divide MAE by the average demand to get a %:
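In symbols, with $\bar{d}$ the average demand over the $n$ periods:

```latex
\mathrm{MAE} = \frac{1}{n} \sum_{t=1}^{n} |e_t|,
\qquad
\mathrm{MAE\%} = \frac{\mathrm{MAE}}{\bar{d}}
```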

🗒️
MAPE/MAE Confusion: Many people refer to MAE as MAPE, leading to misunderstandings. To avoid confusion and ensure accurate comparisons, it is advisable to specify the error calculation method used in any discussion or analysis.

MSE

It is the average of the squares of the errors. Here, the error is the difference between the forecast and the actual demand.


Just as for MAE, MSE is not scaled to the demand. We can then define MSE% as such,
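In symbols (the MSE% scaling shown here, dividing by the squared average demand so the ratio is dimensionless, is one common convention and an assumption on our part):

```latex
\mathrm{MSE} = \frac{1}{n} \sum_{t=1}^{n} e_t^{2},
\qquad
\mathrm{MSE\%} = \frac{\mathrm{MSE}}{\bar{d}^{\,2}}
```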

RMSE

It is defined as the square root of the mean squared error.

Just as for MAE, RMSE is not scaled to the demand. We can then define RMSE% as such,
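In symbols, with $\bar{d}$ the average demand (dividing by the average demand is the usual scaling for RMSE%):

```latex
\mathrm{RMSE} = \sqrt{\frac{1}{n} \sum_{t=1}^{n} e_t^{2}},
\qquad
\mathrm{RMSE\%} = \frac{\mathrm{RMSE}}{\bar{d}}
```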

🗒️
Compared to MAE, MSE and RMSE give more weight to larger errors. This means that larger forecast errors have a disproportionately larger impact on these metrics, making them sensitive to outliers and large deviations.

WMAPE

It is similar to MAPE but takes into account the importance of different periods or products by applying weights. The weight assigned to each period or product reflects its relative importance to the total sales volume, demand, or other quantities considered critical to the business.
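The most common formulation (assumed here) weights each period by its demand volume, in which case WMAPE simplifies to the sum of absolute errors over total demand:

```latex
\mathrm{WMAPE} = \frac{\sum_{t=1}^{n} |e_t|}{\sum_{t=1}^{n} d_t}
```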

SMAPE

It is a measure of forecast accuracy that is symmetric around zero, meaning it penalizes over-forecasting and under-forecasting equally. It is calculated as the average absolute percentage error between the actual and forecasted values, divided by the average of the actual and forecasted values.
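In symbols:

```latex
\mathrm{SMAPE} = \frac{1}{n} \sum_{t=1}^{n} \frac{|f_t - d_t|}{\left(|f_t| + |d_t|\right)/2}
```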
