TensorFlow deep learning network not utilizing GPU?

I have an Nvidia GeForce GT 755M (PC), which I have heard should be at least functional for running deep learning models. But when I train my model (a DCGAN) and check the process info in Task Manager (Windows 10), I see close to 100% CPU utilization and very little GPU activity (1-5%). The GPU activity stays at that level even when I'm not running a model, so my concern is that I'm not using my graphics card at all, and that I would get better performance if I did.



Furthermore, when I run the following code, it indicates that no GPU was found:



import tensorflow as tf
import warnings

# tf.test.gpu_device_name() returns an empty string when TensorFlow cannot see a GPU
if not tf.test.gpu_device_name():
    warnings.warn('No GPU found. Please use a GPU to train your neural network.')
else:
    print('Default GPU Device: {}'.format(tf.test.gpu_device_name()))
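
For completeness, listing the local devices shows everything TensorFlow can see (a minimal check; device_lib is an internal TensorFlow 1.x module, so this assumes a 1.x install):

from tensorflow.python.client import device_lib

# Prints one entry per device TensorFlow can use; a working GPU setup shows
# a device with device_type 'GPU' in addition to the CPU
print(device_lib.list_local_devices())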


So my questions are the following:




  • Should my GPU give a meaningful speedup over training on the CPU?

  • Why isn't TensorFlow finding the GPU?

  • How can I make TensorFlow use the GPU? (See the sketch after this list for what I would expect to work.)
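
For the last question, my understanding is that explicit device placement like the following should work once a GPU is visible (a minimal TensorFlow 1.x graph-mode sketch with toy constants, not my actual DCGAN):

import tensorflow as tf

# Pin a toy computation to the first GPU; TensorFlow raises an error at
# session run time if '/gpu:0' does not exist, which makes this a quick check
with tf.device('/gpu:0'):
    a = tf.constant([1.0, 2.0, 3.0], name='a')
    b = tf.constant([4.0, 5.0, 6.0], name='b')
    c = a * b

# log_device_placement=True prints the device each op was assigned to
with tf.Session(config=tf.ConfigProto(log_device_placement=True)) as sess:
    print(sess.run(c))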


I am just beginning to learn about the hardware side of ML, so any advice, and any recommended beginner tutorials for building basic understanding/intuition, would be much appreciated!



Screenshot of component utilization while training a convolutional NN:
[screenshot]



Thanks!







machine-learning neural-network deep-learning tensorflow gan






asked 9 hours ago









L XandorL Xandor

1062




1062




  • What does running nvidia-smi in the terminal return? Have you installed tensorflow-gpu instead of just tensorflow? Have you downloaded and installed cuDNN?
    – pcko1, 9 hours ago
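
The points the comment raises can also be checked from inside Python (a minimal sketch using TensorFlow 1.x test utilities):

import tensorflow as tf

# True only if the installed TensorFlow binary was built with CUDA support,
# i.e. the tensorflow-gpu package rather than the CPU-only tensorflow package
print(tf.test.is_built_with_cuda())

# Empty string when no GPU is usable at runtime, even on a CUDA-enabled
# build (e.g. driver, CUDA toolkit, or cuDNN missing or incompatible)
print(tf.test.gpu_device_name())

If the first call prints False, the CPU-only tensorflow package is installed; if it prints True but the second call returns an empty string, the CUDA/cuDNN setup is the more likely culprit.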