Commit bb4c622

Added Tanh Activation Function
1 parent abc0025 commit bb4c622

File tree

1 file changed

+40
-0
lines changed
  • neural_network/activation_functions

"""
This script demonstrates the implementation of the Hyperbolic Tangent (Tanh) function.

The tanh function is a hyperbolic function that maps any real-valued input to a value
between -1 and 1. It's commonly used as an activation function in neural networks
and is a scaled version of the sigmoid function.

For more detailed information, you can refer to the following link:
https://en.wikipedia.org/wiki/Hyperbolic_functions#Hyperbolic_tangent
"""

import numpy as np


def tanh(vector: np.ndarray) -> np.ndarray:
    """
    Implements the hyperbolic tangent (tanh) activation function.

    Parameters:
        vector (np.ndarray): A vector that consists of numeric values

    Returns:
        np.ndarray: Input vector after applying tanh activation function

    Formula: f(x) = (e^x - e^(-x)) / (e^x + e^(-x)) = (e^(2x) - 1) / (e^(2x) + 1)

    Examples:
    >>> tanh(np.array([-1.0, 0.0, 1.0, 2.0]))
    array([-0.76159416,  0.        ,  0.76159416,  0.96402758])

    >>> tanh(np.array([-5.0, -2.5, 2.5, 5.0]))
    array([-0.9999092, -0.9866143,  0.9866143,  0.9999092])
    """
    return np.tanh(vector)


if __name__ == "__main__":
    import doctest

    doctest.testmod()
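The docstring above states the explicit exponential formula and notes that tanh is a scaled version of the sigmoid. A minimal sketch checking both claims against `np.tanh`, using the identity tanh(x) = 2 * sigmoid(2x) - 1 (the helper names here are illustrative, not part of the commit):

```python
import numpy as np


def tanh_from_formula(x: np.ndarray) -> np.ndarray:
    # Direct evaluation of f(x) = (e^(2x) - 1) / (e^(2x) + 1)
    e2x = np.exp(2.0 * x)
    return (e2x - 1.0) / (e2x + 1.0)


def tanh_from_sigmoid(x: np.ndarray) -> np.ndarray:
    # tanh as a shifted, scaled sigmoid: 2 * sigmoid(2x) - 1
    sigmoid = 1.0 / (1.0 + np.exp(-2.0 * x))
    return 2.0 * sigmoid - 1.0


x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
assert np.allclose(tanh_from_formula(x), np.tanh(x))
assert np.allclose(tanh_from_sigmoid(x), np.tanh(x))
```

In practice `np.tanh` is the right call for the library function itself: it is numerically stable for large-magnitude inputs, where a naive `e^(2x)` overflows.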
