Describe the bug
Should these three lines

```python
optimizer.step()
lr_scheduler.step()
optimizer.zero_grad()
```

be indented so that they fall inside the `if accelerator.sync_gradients:` block? My understanding is that when gradient accumulation is enabled, the optimizer update should only happen once gradients have been synchronized, i.e. inside that `if` block. Could this be a bug?
https://github.com/huggingface/diffusers/blob/main/examples/flux-control/train_control_lora_flux.py#L1299-L1301
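For context, here is a minimal, self-contained sketch of the gradient-accumulation pattern in question. The model, data, optimizer, scheduler, and hyperparameters are toy stand-ins I made up for illustration and are not taken from the linked script; only the indentation level of the three lines mirrors the code linked above.

```python
# Minimal sketch; assumes `torch` and `accelerate` are installed.
# Toy model/data/scheduler, not the objects used in train_control_lora_flux.py.
import torch
from accelerate import Accelerator

accelerator = Accelerator(gradient_accumulation_steps=4)

model = torch.nn.Linear(8, 1)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
lr_scheduler = torch.optim.lr_scheduler.LambdaLR(optimizer, lambda _: 1.0)
model, optimizer, lr_scheduler = accelerator.prepare(model, optimizer, lr_scheduler)

data = [(torch.randn(2, 8), torch.randn(2, 1)) for _ in range(8)]

for step, (x, y) in enumerate(data):
    with accelerator.accumulate(model):
        loss = torch.nn.functional.mse_loss(model(x), y)
        accelerator.backward(loss)

        if accelerator.sync_gradients:
            accelerator.clip_grad_norm_(model.parameters(), 1.0)

        # The three lines in question sit at this indentation level in the
        # linked script, i.e. outside the `if accelerator.sync_gradients:`
        # block, so they run on every micro-batch rather than only on the
        # micro-batches where gradients are synchronized.
        optimizer.step()
        lr_scheduler.step()
        optimizer.zero_grad()

        # `sync_gradients` should be True only on every 4th micro-batch here.
        print(f"step={step} sync_gradients={accelerator.sync_gradients}")
```

The question is whether those three calls should instead be moved under the `if accelerator.sync_gradients:` check so that they execute only on the synchronizing micro-batches.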
Reproduction
None
Logs
System Info
None
Who can help?
No response