
Why does a smaller batch size give better results? And why did the G loss go up as epochs increased? #39

@AAAeray

Description


My dataset contains 8k+ images at 256×256. I set batch size = 32 and trained on four 12 GB GPUs, but the results were worse than with batch size = 4 on a single GPU, in both training speed and image quality. Why does a smaller batch size give better results? Is there an optimal batch size?
Besides, with batch size = 4, once the epoch passed 70 the G loss rose noticeably and the quality of the generated images degraded. Why did this happen? (Training was interrupted at epoch 54; I reloaded the weight files and optimizer states from epoch 53.)
Thanks!
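For reference, the resume step described above can be sketched as follows. This is a minimal sketch assuming PyTorch (the `G`/`D` stand-in modules, file path, and hyperparameters are hypothetical, not taken from this repo). A common cause of a loss jump after resuming is restoring only the weights and not the optimizer state (e.g. Adam's running moment estimates), so both are saved and restored here.

```python
import torch
import torch.nn as nn

# Hypothetical stand-ins for the generator and discriminator.
G = nn.Linear(8, 8)
D = nn.Linear(8, 1)
opt_G = torch.optim.Adam(G.parameters(), lr=2e-4, betas=(0.5, 0.999))
opt_D = torch.optim.Adam(D.parameters(), lr=2e-4, betas=(0.5, 0.999))

def save_checkpoint(path, epoch):
    # Save model weights AND optimizer states together.
    torch.save({
        "epoch": epoch,
        "G": G.state_dict(),
        "D": D.state_dict(),
        "opt_G": opt_G.state_dict(),
        "opt_D": opt_D.state_dict(),
    }, path)

def load_checkpoint(path):
    ckpt = torch.load(path)
    G.load_state_dict(ckpt["G"])
    D.load_state_dict(ckpt["D"])
    # Without these two lines, Adam restarts with cold moment estimates,
    # which can cause exactly the kind of loss jump described above.
    opt_G.load_state_dict(ckpt["opt_G"])
    opt_D.load_state_dict(ckpt["opt_D"])
    return ckpt["epoch"] + 1  # epoch to resume from

save_checkpoint("ckpt_epoch53.pt", 53)
start_epoch = load_checkpoint("ckpt_epoch53.pt")
print(start_epoch)  # resumes at epoch 54
```

If the loss still jumps after a correctly restored checkpoint, the learning-rate schedule (if any) also needs to be fast-forwarded to the resume epoch.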
