Certainty of a classifier
$begingroup$


How can I build a classifier that predicts class 1 by default, but assigns class 0 whenever it believes with at least 80% certainty that the sample belongs to class 0? And how do I check how certain a classifier is about its prediction?

python classifier










$endgroup$




edited 8 hours ago









pcko1

asked 8 hours ago









Oman

  • 1




    $begingroup$
    Why don't you use a classifier that can export probabilities (like a Decision Tree) and make the prediction manually from there? If the probability of class 0 is > 0.8, return 0, else return 1.
    $endgroup$
    – Tasos
    8 hours ago
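A minimal sketch of what this comment suggests, assuming scikit-learn and hypothetical X_train, y_train, X_test arrays (not part of the original post):

from sklearn.tree import DecisionTreeClassifier
import numpy as np

tree = DecisionTreeClassifier(max_depth=3, random_state=0)
tree.fit(X_train, y_train)                    # hypothetical, pre-split training data

p_class0 = tree.predict_proba(X_test)[:, 0]   # column 0 holds P(class 0)
predictions = np.where(p_class0 > 0.8, 0, 1)  # class 0 only when > 80% certain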














2 Answers
$begingroup$

Many classifiers give you the option of getting a predicted probability, and you can then apply a threshold yourself. Here is how it can be done with sklearn:



from sklearn.ensemble import RandomForestClassifier
from sklearn.datasets import make_classification

# Make a toy dataset
X, y = make_classification(n_samples=1000, n_features=4,
                           n_informative=2, n_redundant=0,
                           random_state=0, shuffle=False)

clf = RandomForestClassifier(n_estimators=100, max_depth=2,
                             random_state=0)
clf.fit(X, y)

# predict_proba(X)[:, 0] is the predicted probability of class 0;
# predict 0 only when that probability exceeds 0.8, otherwise predict 1
predictions = 1 - (clf.predict_proba(X)[:, 0] > 0.80)





$endgroup$
edited 7 hours ago
answered 8 hours ago
Simon Larsson

$begingroup$

You can build a neural network with a softmax activation on the output layer, which gives you class probabilities in the range [0, 1]. You can then post-process those predictions however you like, e.g. by using a threshold of 0.8 for the binary decision between classes 0 and 1.
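A minimal sketch of this idea, assuming a tf.keras model and reusing the X, y from the sklearn example in the other answer; the layer sizes and training settings are illustrative, not part of the original answer:

import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Two-unit softmax output yields [P(class 0), P(class 1)] per sample
model = keras.Sequential([
    layers.Dense(16, activation="relu", input_shape=(4,)),
    layers.Dense(2, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.fit(X, y, epochs=10, verbose=0)             # X, y as in the other answer

proba = model.predict(X)                          # shape (n_samples, 2)
predictions = np.where(proba[:, 0] > 0.8, 0, 1)   # class 0 only when > 80% certain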






$endgroup$
answered 8 hours ago
pcko1