How to show progress of sklearn.multioutput.MultiOutputRegressor and XGBRegressor?
Is it possible to show the training progress of MultiOutputRegressor in sklearn? When processing a huge dataset, my program runs for a long time and I have no idea how long it will take. I have reduced my program to a minimal working example below.
import numpy as np
from sklearn.multioutput import MultiOutputRegressor
import xgboost as xgb

# Toy data: 10 samples, 5 columns; the last two columns are the targets
df = np.arange(50).reshape(10, 5)
train = df[:8]
test = df[8:]
X_train = train[:, 0:-2]
X_test = test[:, 0:-2]
y_train = train[:, -2:]
y_test = test[:, -2:]

eval_set = [(X_test, y_test)]
multioutputregressor = MultiOutputRegressor(xgb.XGBRegressor(eval_set=eval_set, verbose_eval=True))
multioutputregressor.fit(X_train, y_train)
predictions = multioutputregressor.predict(X_test)
print(predictions)
scikit-learn xgboost
asked Aug 8 '18 at 17:39
Dennis
There is no built-in progress bar for scikit-learn or most ML algorithms. There might be a solution using the tqdm package (see github.com/scikit-learn/scikit-learn/issues/7574), but I am not sure whether it will work with MultiOutputRegressor.
– Majid Mortazavi Aug 9 '18 at 9:34
Thanks, I tried this, but unfortunately it doesn't work in my case, since XGBRegressor doesn't have a partial_fit method.
– Dennis Aug 9 '18 at 15:47
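The approach discussed in the linked scikit-learn issue relies on partial_fit, which XGBRegressor indeed does not implement. A workaround that needs no partial_fit is to skip MultiOutputRegressor and loop over the target columns yourself, wrapping the loop in tqdm so you at least see per-target progress. The following is only a sketch of that idea; the helper name fit_per_target and the sequential loop are my own illustration, not part of either library.

import numpy as np
from sklearn.base import clone
from tqdm import tqdm
import xgboost as xgb

def fit_per_target(base_estimator, X, y):
    # Fit one clone of the base estimator per target column,
    # showing a tqdm progress bar over the columns.
    y = np.asarray(y)
    estimators = []
    for i in tqdm(range(y.shape[1]), desc="targets"):
        est = clone(base_estimator)
        est.fit(X, y[:, i])
        estimators.append(est)
    return estimators

estimators = fit_per_target(xgb.XGBRegressor(), X_train, y_train)
predictions = np.column_stack([est.predict(X_test) for est in estimators])
print(predictions)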
1 Answer
You could consider modifying the code of MultiOutputEstimator or XGBModel to introduce some debugging output. These appear to be the classes that implement the fitting logic of the two libraries, and their source files ship with the packages installed on your local machine.
For example, you could print when the separate parallel jobs start and stop in MultiOutputEstimator.fit (inherited and thus reused by MultiOutputRegressor), around lines 167-169. Also consider the tip in the documentation of the n_jobs parameter:
If 1 is given, no parallel computing code is used at all, which is useful for debugging.
You can take a similar approach to add debugging output to XGBModel.fit (the basis of XGBRegressor).
edited Aug 10 '18 at 12:51
answered Aug 10 '18 at 12:32
mapto
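As an editorial sketch of the idea in this answer, one can get similar debugging output without editing the installed library sources by subclassing MultiOutputRegressor and fitting the per-target estimators sequentially (the n_jobs=1 behaviour) with a print per target, while forwarding a per-target eval_set and verbose=True to XGBRegressor.fit so each booster also reports its boosting rounds. The class name VerboseMultiOutputRegressor is made up for illustration; note that eval_set and verbose are fit-time parameters of the xgboost sklearn wrapper, not constructor arguments as used in the question.

import numpy as np
from sklearn.base import clone
from sklearn.multioutput import MultiOutputRegressor
import xgboost as xgb

class VerboseMultiOutputRegressor(MultiOutputRegressor):
    # Illustrative subclass: fit one estimator per target column
    # sequentially and report progress, instead of relying on the
    # silent parallel loop inside MultiOutputRegressor.fit.
    def fit(self, X, y, eval_set=None):
        y = np.asarray(y)
        self.estimators_ = []
        for i in range(y.shape[1]):
            print("Fitting target %d of %d ..." % (i + 1, y.shape[1]))
            est = clone(self.estimator)
            if eval_set is not None:
                # Slice the evaluation targets to the current column so each
                # single-output XGBRegressor receives a matching eval_set.
                per_target_eval = [(X_val, np.asarray(y_val)[:, i]) for X_val, y_val in eval_set]
                est.fit(X, y[:, i], eval_set=per_target_eval, verbose=True)
            else:
                est.fit(X, y[:, i])
            self.estimators_.append(est)
        return self

model = VerboseMultiOutputRegressor(xgb.XGBRegressor())
model.fit(X_train, y_train, eval_set=[(X_test, y_test)])
predictions = model.predict(X_test)
print(predictions)

With the question's toy data this should print one "Fitting target ..." line per target column, plus one evaluation line per boosting round for each underlying XGBRegressor.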