Mean Squared Error (MSE) is a regression loss function that measures the average squared differences between predicted and actual values.
MSE calculates the squared difference between each prediction and its true value, then averages these squared errors across all data points. The squaring operation penalizes larger errors more heavily than smaller ones, making models trained with it sensitive to outliers.
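To illustrate the outlier sensitivity, here is a minimal sketch (values chosen for illustration): two sets of predictions with the same total absolute error, where squaring makes the single large error dominate.

```python
import numpy as np

y_true = np.array([10.0, 10.0, 10.0])
small_errors = np.array([11.0, 9.0, 11.0])   # errors: 1, -1, 1 (total |error| = 3)
with_outlier = np.array([10.0, 10.0, 13.0])  # errors: 0, 0, 3 (total |error| = 3)

mse_small = np.mean((y_true - small_errors) ** 2)    # (1 + 1 + 1) / 3 = 1.0
mse_outlier = np.mean((y_true - with_outlier) ** 2)  # (0 + 0 + 9) / 3 = 3.0
print(mse_small, mse_outlier)  # 1.0 3.0
```

Both prediction sets miss by a total of 3, yet the one outlier triples the MSE.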
Lower MSE values indicate better model performance, with 0 representing perfect predictions. MSE is always non-negative and expressed in squared units of the target variable, which can make interpretation challenging compared to metrics like Root Mean Squared Error (RMSE).
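The unit issue mentioned above can be seen directly: taking the square root of MSE yields RMSE, which is back in the target variable's original units. A short sketch with illustrative numbers:

```python
import numpy as np

y_true = np.array([3.0, -1.0, 2.0, 7.0])
y_pred = np.array([2.5, 0.0, 2.0, 8.0])

mse = np.mean((y_true - y_pred) ** 2)  # in squared units of the target
rmse = np.sqrt(mse)                    # back in the target's original units
print(mse, rmse)  # 0.5625 0.75
```

If the target were a price in dollars, the MSE here would read "0.5625 dollars squared," while the RMSE of 0.75 dollars is directly interpretable as a typical error size.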
The formula is:

MSE = (1/n) * Σ (y_i − ŷ_i)²

Where:
- n is the number of data points
- y_i is the actual value of the i-th observation
- ŷ_i is the predicted value of the i-th observation
For example, given actual stock prices [100, 105, 110] and predictions [98, 107, 108], the MSE can be computed as:
```python
import numpy as np

y_true = np.array([100, 105, 110])
y_pred = np.array([98, 107, 108])

# Errors are 2, -2, 2; squared errors are 4, 4, 4; their mean is 4.0
mse = np.mean((y_true - y_pred) ** 2)
print(f"MSE: {mse}")  # Output: MSE: 4.0
```