k-fold cross validation in keras for regression using sklearn
I am using a wrapper to run scikit-learn k-fold cross-validation with Keras for a regression problem with an ANN, but the "accuracies" I get look very weird. The same approach worked fine for a classification problem. I am attaching the code below. Is there anything I'm doing wrong?



from keras.wrappers.scikit_learn import KerasRegressor
from sklearn.model_selection import cross_val_score
from keras.models import Sequential
from keras.layers import Dense

def build_regressor():
    # Two hidden layers of 8 units and a single linear output for regression
    regressor = Sequential()
    regressor.add(Dense(units = 8, kernel_initializer = 'uniform', activation = 'relu', input_dim = 15))
    regressor.add(Dense(units = 8, kernel_initializer = 'uniform', activation = 'relu'))
    regressor.add(Dense(units = 1, kernel_initializer = 'uniform'))
    regressor.compile(optimizer = 'adam', loss = 'mse', metrics = ['mae'])
    return regressor

regressor = KerasRegressor(build_fn = build_regressor, batch_size = 10, epochs = 100)
accuracies = cross_val_score(estimator = regressor, X = X_train, y = y_train, cv = 10, n_jobs = 1)
mean = accuracies.mean()
variance = accuracies.std()  # note: .std() is the standard deviation, not the variance

neural-network keras scikit-learn regression cross-validation

asked 2 days ago by Chinni

  • What exactly do you mean they "look very weird"? Care to share them? – desertnaut, yesterday

  • These are the 10 values obtained for the "accuracies": -15.8012, -13.6942, -14.537, -22.315, -13.333, -15.8931, -16.9658, -13.4334, -21.4675, -39.7934. – Chinni, yesterday

  • "Accuracies" is the wrong term here (you are in a regression setting); these are 10 values of negative MSE (or MAE). What is weird about them? – desertnaut, yesterday

  • I was expecting that "accuracies" would contain r2_scores, since it is a regression problem. Correct me if I am wrong. – Chinni, yesterday

  • Well, the API is rather poorly documented, but I would be highly surprised if the Keras people used R^2 at all; it is practically never used in predictive contexts and seems like a fossil from the old statistics era. See the last part of my SO answer "scikit-learn & statsmodels - which R-squared is correct?" for more. – desertnaut, yesterday
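
To make the comment thread concrete: with scoring=None, cross_val_score falls back to the estimator's own score method, and the legacy KerasRegressor wrapper scores by negating the Keras loss, which is MSE here. Below is a minimal sketch of how one might verify this; it reuses regressor, X_train and y_train from the question, and the explicit 'neg_mean_squared_error' comparison is an illustration rather than part of the original thread.

from sklearn.model_selection import cross_val_score

# Default scoring=None: cross_val_score uses KerasRegressor.score, which
# returns the negated Keras loss, i.e. roughly -MSE because the model is
# compiled with loss='mse'.
default_scores = cross_val_score(estimator = regressor, X = X_train, y = y_train, cv = 10)

# Requesting negative MSE explicitly should give values in the same ballpark;
# exact numbers differ because of random weight initialisation and because
# Keras averages the loss over batches rather than over the whole fold.
neg_mse_scores = cross_val_score(estimator = regressor, X = X_train, y = y_train,
                                 cv = 10, scoring = 'neg_mean_squared_error')

print(default_scores)
print(neg_mse_scores)
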
1 Answer

Found the answer in the sklearn documentation. The default scoring parameter for cross_val_score is None, in which case the scores come from the estimator's own score method rather than from r2_score. Since I was expecting R^2 values, I have to request that scorer explicitly:

accuracies = cross_val_score(estimator = regressor, X = X_train, y = y_train, scoring = 'r2', cv = 10, n_jobs = 1)

After adding the scoring parameter I was able to get the r2_scores.

answered 9 hours ago by Chinni
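
For completeness, a minimal sketch of the corrected cross-validation call, plus an optional cross-check that is an addition for illustration and not part of the original answer; it reuses regressor, X_train and y_train from the question and assumes the wrapper cooperates with cross_val_predict.

from sklearn.model_selection import cross_val_score, cross_val_predict
from sklearn.metrics import r2_score

# Ask for R^2 explicitly instead of relying on the default scorer.
r2_scores = cross_val_score(estimator = regressor, X = X_train, y = y_train,
                            scoring = 'r2', cv = 10, n_jobs = 1)
print(r2_scores.mean(), r2_scores.std())

# Rough cross-check: pool the out-of-fold predictions and compute a single
# R^2 on them (not identical to averaging the per-fold scores, but it should
# tell a similar story).
y_pred = cross_val_predict(estimator = regressor, X = X_train, y = y_train, cv = 10)
print(r2_score(y_train, y_pred))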