
Showing posts from March 21, 2019

What activation function should I use for a specific regression problem?

Which is better for regression problems: a neural net with tanh/sigmoid (and exponential-like) activations, or one with ReLU and a linear output? The standard choice is ReLU, but it is a brute-force solution that requires a certain network size, and I would like to avoid building a very large net. Sigmoid is often preferred, but in my case the regression target takes values in the range (0, 1e7). Perhaps a sigmoid net with a linear head would work? I am curious about your take on the subject.

Tags: machine-learning, neural-network, deep-learning, regression, activation-function
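The core tension in the question can be sketched numerically: a sigmoid output is bounded in (0, 1), so it cannot emit targets up to 1e7 on its own, whereas a linear head is unbounded, and rescaling a sigmoid output by the target range recovers coverage of (0, 1e7). A minimal NumPy-only sketch (the head weights here are hypothetical illustration values, not learned parameters):

```python
import numpy as np

# Common activation choices (framework-agnostic NumPy definitions).
def relu(x):
    return np.maximum(0.0, x)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Pre-activations of a hypothetical output neuron.
z = np.array([-3.0, 0.0, 3.0])

# A raw sigmoid output is confined to (0, 1): it saturates and can
# never reach targets on the order of 1e7.
bounded = sigmoid(z)

# Option 1: linear head on top of bounded hidden activations.
# The last layer is unbounded, so large targets are representable.
w_head, b_head = 5.0e6, 5.0e6          # hypothetical head parameters
linear_out = w_head * np.tanh(z) + b_head

# Option 2: rescale a sigmoid output by the known target range.
# This is equivalent to training on targets divided by 1e7.
scaled_out = 1e7 * bounded
```

In practice the second option amounts to normalizing the targets into (0, 1) before training, which is a common alternative to enlarging the network: the output activation stays bounded while the inverse scaling is applied at prediction time.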