Conversation
This is great! Will let you know if I am able to run the code.
@alessandro-montanari I find that the multi-GPU version produces worse results than the single-GPU version: it makes a lot of wrong detections. Is there anything I should watch out for when running the multi-GPU version?
That's weird. |
Unfortunately I am having some weird issues with the images where the code fails in preprocessing.py at line 238. Anyway, with the code we are using for our application (basically this one plus some other changes to your implementation) we didn't see any loss in accuracy going from 1 GPU (batch size = 40) to 4 GPUs (batch size = 160). I'll try to come back to this, but please let me know if you have any news.
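For reference, here is a minimal sketch of the kind of data-parallel setup described above, where the per-GPU batch stays fixed (40) and the global batch grows with the number of GPUs (160 on 4 GPUs). It uses tf.distribute.MirroredStrategy; the toy model, random data, and optimizer settings are placeholders, not the actual code from this repository:

```python
# Sketch only: data-parallel training with a global batch that scales with GPU count.
import numpy as np
import tensorflow as tf

PER_GPU_BATCH = 40                                    # 40 on 1 GPU, 160 on 4 GPUs

strategy = tf.distribute.MirroredStrategy()           # uses all visible GPUs
global_batch = PER_GPU_BATCH * strategy.num_replicas_in_sync

with strategy.scope():                                # variables are mirrored on every GPU
    # Placeholder network; substitute the repo's actual model here.
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(128, activation="relu", input_shape=(32,)),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer=tf.keras.optimizers.SGD(learning_rate=1e-3),
                  loss="sparse_categorical_crossentropy")

# Synthetic stand-in data; each GPU receives global_batch / num_replicas examples per step.
x = np.random.rand(4096, 32).astype("float32")
y = np.random.randint(0, 10, size=(4096,))
dataset = tf.data.Dataset.from_tensor_slices((x, y)).shuffle(4096).batch(global_batch)

model.fit(dataset, epochs=2)
```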
Has anyone been able to train with more than one GPU?
@msis what error do you get? |
@alessandro-montanari Here's the trace with … and in … N.B. I used …
I have an issue with multi-GPU training. The only difference in configuration is the batch size (multi-GPU: 64, single-GPU: 16).
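One thing worth double-checking when only the batch size changes (not confirmed to be the cause of the accuracy drop reported here) is whether the learning rate was scaled along with it. A common heuristic is linear scaling, sketched below with a placeholder base learning rate:

```python
# Linear-scaling heuristic: if the batch goes from 16 (single GPU) to 64 (multi-GPU),
# multiply the base learning rate by the same factor. BASE_LR is illustrative,
# not taken from this repo's configuration.
BASE_BATCH = 16      # single-GPU batch size from the comment above
BASE_LR = 1e-3       # assumed single-GPU learning rate (placeholder)

def scaled_lr(batch_size, base_batch=BASE_BATCH, base_lr=BASE_LR):
    """Scale the learning rate linearly with the batch size."""
    return base_lr * batch_size / base_batch

print(scaled_lr(64))   # batch 64 -> 4e-3
```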
I tried this code on CPU and on a single GPU and it works fine. I tried a previous version on 4 GPUs and it worked fine too. I will be able to try this version on multiple GPUs on Monday.
Let me know what you think.