TensorFlow deep learning network not utilizing GPU?
I have an Nvidia GeForce GT 755M (PC), which I've heard should be at least functional for running deep learning models. But when I train my model (a DCGAN) and check the per-process info in Task Manager (Windows 10), I see close to 100% CPU utilization and very little GPU activity (1-5%). That level of GPU activity seems constant even when I'm not running a model, so my concern is that I'm not using my graphics card at all, and that I would get better performance if I did.
Furthermore, when I run the following code it indicates that no GPU was found:
import tensorflow as tf
import warnings

if not tf.test.gpu_device_name():
    warnings.warn('No GPU found. Please use a GPU to train your neural network.')
else:
    print('Default GPU Device: {}'.format(tf.test.gpu_device_name()))
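For completeness, here is another way to enumerate the devices TensorFlow can see (this matches my understanding of the TensorFlow 1.x API; device_lib is a semi-internal module):

import tensorflow as tf
from tensorflow.python.client import device_lib

# True only if the installed TensorFlow binary was built with CUDA support;
# the plain CPU-only package prints False here.
print(tf.test.is_built_with_cuda())

# Lists every device TensorFlow has registered. A working GPU setup should
# show a /device:GPU:0 entry in addition to the CPU.
print(device_lib.list_local_devices())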
So my questions are the following:
- Should this GPU meaningfully improve performance over training on the CPU?
- Why isn't TensorFlow finding the GPU?
- How can I make TensorFlow utilize the GPU?
I am just beginning to learn about the hardware side of ML, so any advice, and any recommended beginner tutorials for building basic understanding/intuition, would be much appreciated!
Screenshot of component utilization while training a convolutional NN:
Thanks!
machine-learning neural-network deep-learning tensorflow gan
asked 9 hours ago by L Xandor (new contributor); edited 9 hours ago
$begingroup$
I have a Nvidia GeForce GT 755M (PC), which I heard should be at least functional for running deep learning models. But when I train my model (DCGAN) and check the task manager process info (Win 10) I see close to 100% CPU utilization, and very little GPU activity (1-5%). That amount of GPU activity seems to be the constant even if I'm not running a model...so my concern is that I'm not using my graphics card at all, and that I would get better performance if I did.
Furthermore, when I run the following code it indicates that no GPU was found:
import tensorflow as tf
import warnings
if not tf.test.gpu_device_name():
warnings.warn('No GPU found. Please use a GPU to train your neural network.')
else:
print('Default GPU Device: {}'.format(tf.test.gpu_device_name()))
So my questions are the following:
- Is my GPU something that should improve performance vs using the CPU?
- Why isn't tensorflow finding the GPU?
- How can I make Tensorflow utilize the GPU?
I am just beginning to learn about the hardware aspect of ML so any advice and any recommended beginners tutorials for getting basic understanding/intuition would be much appreciated!
Screenshot of component utilization while training a convolutional NN:
Thanks!
machine-learning neural-network deep-learning tensorflow gan
New contributor
$endgroup$
$begingroup$
What does running nvidia-smi in the terminal return? Have you installed tensorflow-gpu instead of just tensorflow? Have you downloaded and installed cuDNN?
– pcko1
9 hours ago
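If the CPU-only tensorflow package turns out to be the culprit, the usual fix in the 1.x line is to swap it for tensorflow-gpu and install a matching CUDA toolkit and cuDNN, as the comment suggests. After that, TensorFlow can log where each op is placed, which confirms whether the GPU is actually in use (a minimal sketch, assuming the 1.x Session API):

import tensorflow as tf

# log_device_placement=True makes TensorFlow print the device each op is
# assigned to (CPU or GPU) when the session runs.
config = tf.ConfigProto(log_device_placement=True)

with tf.Session(config=config) as sess:
    a = tf.constant([1.0, 2.0, 3.0], name='a')
    b = tf.constant([4.0, 5.0, 6.0], name='b')
    print(sess.run(a + b))  # the placement log is written to stderr

If the GPU build and drivers are working, the log should show ops assigned to /device:GPU:0; if everything still lands on the CPU, the installation is the problem rather than the model code.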