Loss is decreasing but val_loss is not [duplicate]
This question already has an answer here:
Validation loss is not decreasing (2 answers)
If loss is decreasing but val_loss is not, what is the problem and how can I fix it? I get unclear results like this:

[plot of training and validation loss]

lstm loss-function
marked as duplicate by Antonio Jurić, Siong Thye Goh, Sean Owen♦ yesterday
This question has been asked before and already has an answer. If those answers do not fully address your question, please ask a new question.
Are you sure this isn't backwards? It would be odd for validation loss to be consistently lower than training loss. Not impossible, but atypical.
– Sean Owen♦ yesterday
asked 2 days ago by user145959; edited 2 days ago
1 Answer
This indicates that the model is not generalizing (it is overfitting). A few options are:
- Get more training data
- Reduce the complexity of the model (number of LSTM layers, complexity of the dense layers)
Andrew Ng has a good video on this topic:
https://www.youtube.com/watch?v=OSd30QGMl88
A tutorial specific to LSTMs:
https://machinelearningmastery.com/diagnose-overfitting-underfitting-lstm-models/
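To make the "reduce complexity" option concrete, here is a quick parameter count for a single standard LSTM layer. The helper below is a hypothetical illustration (not part of the answer); it uses the usual formula of 4 gates, each with a kernel over the hidden and input dimensions plus a bias:

```python
def lstm_params(input_dim: int, hidden_units: int) -> int:
    """Weight count of one standard LSTM layer: 4 gates, each with
    a (hidden x (hidden + input)) kernel plus a bias of size hidden."""
    h, d = hidden_units, input_dim
    return 4 * (h * (h + d) + h)

# Halving the hidden size cuts capacity by roughly 3x here:
print(lstm_params(64, 128))  # 98816 weights
print(lstm_params(64, 64))   # 33024 weights
```

Because the hidden size appears quadratically, even a modest reduction in units shrinks the model substantially, which is why it is often the first knob to turn against overfitting.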
answered 2 days ago by Shamit Verma
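The pattern the question describes (training loss still falling while val_loss has stalled) can also be flagged programmatically from the `loss` and `val_loss` lists that a Keras-style `History.history` provides. This is a minimal sketch; the helper name and the comparison window are my own assumptions, not anything from the answer:

```python
def looks_overfit(loss, val_loss, window=3):
    """Return True if training loss is still falling over the last
    `window` epochs while validation loss has stopped falling."""
    if len(loss) < window + 1 or len(val_loss) < window + 1:
        return False  # not enough epochs to judge
    train_falling = loss[-1] < loss[-1 - window]
    val_not_falling = val_loss[-1] >= val_loss[-1 - window]
    return train_falling and val_not_falling

# Train loss keeps dropping while val loss plateaus, then rises:
loss     = [1.0, 0.7, 0.5, 0.35, 0.25, 0.18]
val_loss = [1.0, 0.8, 0.7, 0.70, 0.72, 0.75]
print(looks_overfit(loss, val_loss))  # True
```

A check like this is a crude stand-in for eyeballing the learning curves, but it is handy as an early-stopping trigger in a training loop.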