Timeline for How to make my Neural Network run on GPU instead of CPU
Current License: CC BY-SA 4.0
11 events
| when | what | action | by | license | comment |
|---|---|---|---|---|---|
| Apr 2, 2020 at 20:47 | comment | added | Dean | | @n1k31t4, thanks for your classic contribution. However, trying to run your scripts on the latest tensorflow(gpu) gives me the error: RuntimeError: The Session graph is empty. Add operations to the graph before calling run(). Kindly show me the modifications I need to make. |
| Sep 6, 2019 at 10:17 | history | edited | n1k31t4 | CC BY-SA 4.0 | Added conda env instructions |
| Dec 2, 2018 at 22:55 | vote | accept | Deni Avinash | | |
| Dec 2, 2018 at 22:51 | comment | added | Deni Avinash | | Hi, looks like my code runs on the GPU as well. Thanks for sharing the code. I also ran a normal ANN today and found that it used the GPU. Looks like the RNN code I had was too much for the GPU. Thanks for helping me out. |
| Dec 2, 2018 at 16:49 | comment | added | n1k31t4 | | @DeniAvinash - please try using my update to confirm your GPU is working. |
| Dec 2, 2018 at 16:49 | history | edited | n1k31t4 | CC BY-SA 4.0 | Added extra code to test availability of GPU |
| Dec 2, 2018 at 5:45 | comment | added | Deni Avinash | | I'm trying to run a simple RNN program. Now, after following a few of the prior steps, I'm getting a 'GPU Sync failed' error, which I presume means the GPU is being used but is running into a capacity issue. |
| Dec 2, 2018 at 4:40 | comment | added | n1k31t4 | | It seems like your Python interpreter is just not able to see the GPU. This might be a permissions issue or Windows related, as I haven't ever had this problem on Linux. If following that linked tutorial doesn't fix the problem, I am afraid I don't know a solution. My only final thought would be that the model you are training doesn't actually use the GPU. Could you share a minimal example of a model with the training code, which still doesn't use the GPU? |
| Dec 2, 2018 at 4:21 | comment | added | Deni Avinash | | I tried running the given conda command in the prompt. I use an MSI laptop, which has the MSI Centre that displays CPU and GPU consumption. When I train my neural network I don't see the GPU being used, even after running the given command. |
| Dec 2, 2018 at 4:17 | comment | added | Deni Avinash | | Thanks for the information. I have an Nvidia GTX 1060 GPU. |
| Dec 2, 2018 at 2:54 | history | answered | n1k31t4 | CC BY-SA 4.0 | |
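
The April 2020 comment points at the TensorFlow 1.x/2.x break: `tf.Session` was removed in TensorFlow 2.x, so 1.x-style session code raises `RuntimeError: The Session graph is empty`. Below is a minimal sketch, assuming TensorFlow 2.1 or later, of how one might confirm GPU visibility without using a `Session`; it is an illustrative example, not the code from the original answer.

```python
import tensorflow as tf

# List the GPUs TensorFlow can actually see (TF >= 2.1).
gpus = tf.config.list_physical_devices('GPU')
print("GPUs visible to TensorFlow:", gpus)

if gpus:
    # Run a small matrix multiplication explicitly on the first GPU
    # to confirm that ops are really placed on the device.
    with tf.device('/GPU:0'):
        a = tf.random.uniform((1000, 1000))
        b = tf.random.uniform((1000, 1000))
        c = tf.matmul(a, b)
    print("Computed on:", c.device)
else:
    print("No GPU detected; check drivers, CUDA/cuDNN, and the conda env.")
```

If the list is empty while a GPU is installed, the interpreter cannot see the device (the situation discussed in the Dec 2018 comments), which usually points to a driver, CUDA/cuDNN, or environment issue rather than to the model code.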