
Commit 3e255e2
Author: TASMAYU
Add RMSE and Log-Cosh loss functions to loss_functions.py
1 parent: 788d95b

File tree: 1 file changed, +24 −0 lines

machine_learning/loss_functions.py (24 additions, 0 deletions)
@@ -667,3 +667,27 @@ def kullback_leibler_divergence(y_true: np.ndarray, y_pred: np.ndarray) -> float
     import doctest

     doctest.testmod()
+
+
+def root_mean_squared_error(y_true: np.ndarray, y_pred: np.ndarray) -> float:
+    """
+    Calculate the Root Mean Squared Error (RMSE) between ground truth and predicted values.
+    # ... docstring continues ...
+    """
+    # Check that the input arrays have the same length
+    if len(y_true) != len(y_pred):
+        raise ValueError("Input arrays must have the same length.")
+
+    # Compute the element-wise errors between true and predicted values,
+    # then square each error
+    squared_errors = (y_true - y_pred) ** 2
+
+    # Average the squared errors; this gives the Mean Squared Error (MSE)
+    mean_squared_error = np.mean(squared_errors)
+
+    # Take the square root of the MSE to get the RMSE,
+    # which brings the units back to the original scale
+    return np.sqrt(mean_squared_error)
+
+
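The committed RMSE function can be exercised with the minimal sketch below, which reproduces the logic shown in the diff. The commit message also mentions a Log-Cosh loss, but its code is not visible in this diff; the `log_cosh_loss` function below is therefore an assumption based on the standard definition, not the committed implementation.

```python
import numpy as np


def root_mean_squared_error(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """RMSE between ground-truth and predicted values, as added in the diff."""
    if len(y_true) != len(y_pred):
        raise ValueError("Input arrays must have the same length.")
    # sqrt of the mean of squared errors
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))


def log_cosh_loss(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """Log-Cosh loss (assumed standard formulation; not shown in the visible diff)."""
    return float(np.mean(np.log(np.cosh(y_pred - y_true))))


y_true = np.array([1.0, 2.0, 3.0])
y_pred = np.array([1.0, 2.0, 5.0])
# squared errors are [0, 0, 4], so RMSE = sqrt(4/3)
print(root_mean_squared_error(y_true, y_pred))  # ≈ 1.1547
```

Taking the square root of the MSE is what makes RMSE interpretable in the same units as the target variable, which is the rationale the committed comments describe.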
