Rohan's Bytes
ML Interview Q Series: Why is the ReLU activation function frequently chosen instead of Sigmoid in deep neural network architectures?
Rohan Paul
Apr 5
👉 Browse the full ML Interview series here.
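The short version of the standard answer: the sigmoid's derivative is at most 0.25 and decays toward zero for large |x|, so backpropagating through many sigmoid layers multiplies many small factors together and gradients vanish, while ReLU passes a gradient of exactly 1 for every positive input and costs only a comparison to evaluate. A minimal NumPy sketch of that gradient comparison (my own illustration, not code from the post):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_grad(x):
    s = sigmoid(x)
    return s * (1.0 - s)          # peaks at 0.25, vanishes for large |x|

def relu_grad(x):
    return (x > 0).astype(float)  # exactly 1 for any positive input

xs = np.array([-6.0, -2.0, 0.0, 2.0, 6.0])
print("sigmoid'(x):", np.round(sigmoid_grad(xs), 4))
# -> roughly [0.0025, 0.105, 0.25, 0.105, 0.0025]
print("relu'(x):   ", relu_grad(xs))
# -> [0., 0., 0., 1., 1.]

# Backprop multiplies one such factor per layer: with sigmoid the product
# shrinks geometrically (at most 0.25 per layer), while active ReLU units
# pass gradients through unchanged.
print("50-layer worst-case sigmoid factor:", 0.25 ** 50)
```

ReLU's zero gradient for negative inputs can still kill units ("dying ReLU"), which is why variants like Leaky ReLU exist, but in practice the non-saturating positive side and the cheap max(0, x) computation make it the default choice for deep architectures.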