Commit f3f8a57

[pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
1 parent 9ae7622 commit f3f8a57

File tree

1 file changed: +4 −3 lines


machine_learning/loss_functions.py

Lines changed: 4 additions & 3 deletions
(The removed and re-added docstring lines below appear identical because the change is whitespace-only, consistent with the pre-commit auto-fix in the commit message.)

@@ -662,12 +662,13 @@ def kullback_leibler_divergence(y_true: np.ndarray, y_pred: np.ndarray) -> float
     kl_loss = y_true * np.log(y_true / y_pred)
     return np.sum(kl_loss)

+
 def root_mean_squared_error(y_true, y_pred):
     """
     Root Mean Squared Error (RMSE)

-    Root Mean Squared Error (RMSE) is a standard metric used to evaluate the accuracy of regression models.
-    It measures the average magnitude of the prediction errors, giving higher weight to larger errors due to squaring.
+    Root Mean Squared Error (RMSE) is a standard metric used to evaluate the accuracy of regression models.
+    It measures the average magnitude of the prediction errors, giving higher weight to larger errors due to squaring.
     The RMSE value is always non-negative, and a lower RMSE indicates better model performance.

     RMSE = sqrt( (1/n) * Σ (y_true - y_pred) ^ 2)
@@ -680,7 +681,7 @@ def root_mean_squared_error(y_true, y_pred):

     Returns:
         float: The RMSE Loss function between y_Pred and y_true
-
+
     Example:
         >>> y_true = np.array([100, 200, 300])
        >>> y_pred = np.array([110, 190, 310])
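The RMSE formula quoted in the docstring, sqrt((1/n) * Σ (y_true - y_pred)^2), can be sketched as a minimal NumPy implementation. This is an illustration of the formula only, not the repository's exact function body, which is not shown in this diff:

```python
import numpy as np


def root_mean_squared_error(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    # RMSE = sqrt((1/n) * Σ (y_true - y_pred) ** 2)
    # np.mean handles the (1/n) * Σ part; np.sqrt finishes the formula.
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))


# Values taken from the docstring example in the diff:
y_true = np.array([100, 200, 300])
y_pred = np.array([110, 190, 310])
print(root_mean_squared_error(y_true, y_pred))  # 10.0
```

With errors of −10, 10, and −10, each squared error is 100, the mean is 100, and the square root gives 10.0, which is why a lower RMSE indicates better fit: large deviations dominate through the squaring.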

0 commit comments
