Hello, I ran into the same problem as you.
I also noticed that the GIF my model generated is 6.60 MB,
but the one in this repo's gifs folder is 8.22 MB.
What causes this difference?
I hope the author can answer this question. Thank you very much!
I guess the learning rate and the number of training epochs are causing the problem. I changed the learning rate to 5x the default (lr = 5e-4), and the results improved.
After 50 epochs of training:
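For reference, the suggestion above corresponds to something like the following when building the optimizers. This is a hypothetical sketch, not the actual code in `main.py`: the module and variable names (`generator`, `discriminator`, `g_opt`, `d_opt`) and the Adam betas are assumptions, and tiny stand-in `nn.Linear` modules replace the real networks.

```python
lr = 5e-4  # commenter's suggested value, 5x the default of 1e-4

# Hypothetical wiring of the higher learning rate into Adam optimizers.
# Guarded with try/except so the sketch still loads without torch installed.
try:
    import torch.nn as nn
    import torch.optim as optim

    generator = nn.Linear(100, 784)      # stand-in for the real generator
    discriminator = nn.Linear(784, 1)    # stand-in for the real discriminator

    g_opt = optim.Adam(generator.parameters(), lr=lr, betas=(0.9, 0.99))
    d_opt = optim.Adam(discriminator.parameters(), lr=lr, betas=(0.9, 0.99))
except ImportError:
    pass
```

Whether 5e-4 is an improvement likely depends on batch size and the number of critic iterations per generator step, so it is worth treating as one point in a small sweep rather than a fixed recipe.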
Thanks for sharing this repo.
Running the code (main.py, 200 epochs, parameters as mentioned in the README, only fixing the issue mentioned in #1), I couldn't reproduce the results shown in https://github.com/EmilienDupont/wgan-gp/blob/master/gifs/mnist_200_epochs.gif. Instead, the generated samples look as follows. Any hints on what would need to be changed to make the training succeed?
One step towards reproducibility would be to fix the random seeds at the beginning of main.py, e.g. with the following code:
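The original snippet did not survive extraction; below is a typical seed-fixing helper of the kind the comment describes, not necessarily the poster's exact code. The function name `set_seed` and the default seed value are assumptions, and the numpy/torch calls are guarded so the sketch runs even where those packages are missing.

```python
# Hypothetical seed-fixing snippet for the top of main.py (reconstruction,
# not the original poster's code).
import random

def set_seed(seed=0):
    """Fix the relevant RNG seeds so training runs are repeatable."""
    random.seed(seed)
    try:
        import numpy as np
        np.random.seed(seed)
    except ImportError:
        pass
    try:
        import torch
        torch.manual_seed(seed)
        torch.cuda.manual_seed_all(seed)
        # cuDNN autotuning is nondeterministic; trade some speed for determinism
        torch.backends.cudnn.deterministic = True
        torch.backends.cudnn.benchmark = False
    except ImportError:
        pass

set_seed(0)
```

Note that even with all seeds fixed, some CUDA operations remain nondeterministic, so generated samples (and the resulting GIF size) may still differ slightly across machines and driver versions.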