Conversation

@yfyeung (Collaborator) commented on May 15, 2025

This PR adds the zipformer_llm_zh recipe.

New features and modifications:

  • Introduce a batch-shaving mechanism that dynamically shrinks the batch when a CUDA OOM occurs, preventing training interruptions (a sketch of the idea follows this list).
  • Transition to full DeepSpeed training, removing torch.autocast (enabled by Fix scaling.py: ensure SwooshL/SwooshR output dtype matches input dtype #1940).
  • Expose more DynamicBucketingSampler parameters for more efficient batching (see the second sketch below).
  • Fix data preparation from Hugging Face.
  • Set world_size and rank explicitly for the dataloader (also shown in the sampler sketch).
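
A minimal sketch of what a batch-shaving loop could look like; the function and helper names below are hypothetical placeholders, not the recipe's actual code. The idea is to catch the CUDA OOM, free the cache, drop part of the batch, and retry instead of letting one oversized batch kill the run.

```python
import torch


def is_oom_error(e: RuntimeError) -> bool:
    # CUDA OOM surfaces as a RuntimeError whose message mentions "out of memory".
    return "out of memory" in str(e)


def shave(batch: dict, keep_ratio: float = 0.5) -> dict:
    # Hypothetical helper: keep only the first portion of the utterances in the batch.
    keep = max(1, int(len(batch["inputs"]) * keep_ratio))
    return {k: v[:keep] for k, v in batch.items()}


def compute_loss_with_shaving(compute_loss, batch, max_retries: int = 3):
    """Retry the loss computation on a progressively smaller batch when OOM
    occurs, so training continues instead of being interrupted."""
    for attempt in range(max_retries + 1):
        try:
            return compute_loss(batch)
        except RuntimeError as e:
            if not is_oom_error(e) or attempt == max_retries:
                raise
            torch.cuda.empty_cache()
            batch = shave(batch)  # drop part of the batch and try again
```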
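
And a sketch of the sampler-related changes, using lhotse's DynamicBucketingSampler. The manifest path and parameter values are placeholders chosen for illustration; the exact set of options the recipe exposes is defined in its own training/datamodule scripts.

```python
import torch.distributed as dist
from lhotse import CutSet
from lhotse.dataset import DynamicBucketingSampler

# Placeholder manifest path; the recipe's actual manifests live under its data dir.
cuts = CutSet.from_file("data/fbank/cuts_train.jsonl.gz")

world_size = dist.get_world_size() if dist.is_initialized() else 1
rank = dist.get_rank() if dist.is_initialized() else 0

sampler = DynamicBucketingSampler(
    cuts,
    max_duration=600.0,     # seconds of audio per batch
    num_buckets=30,         # more buckets -> tighter duration matching
    buffer_size=30000,      # how many cuts are pre-read for bucketing
    shuffle=True,
    drop_last=True,
    world_size=world_size,  # passed explicitly instead of being inferred
    rank=rank,
)
```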

@yfyeung changed the title from "[WIP] Add Streaming Zipformer LLM recipe for ASR" to "[WIP] Add streaming Zipformer encoder for LLM based ASR" on May 15, 2025
@yfyeung changed the title from "[WIP] Add streaming Zipformer encoder for LLM based ASR" to "[WIP] Support streaming Zipformer encoder for LLM based ASR" on May 15, 2025
@yfyeung changed the title from "[WIP] Support streaming Zipformer encoder for LLM based ASR" to "[WIP] Support Zipformer encoder for LLM based ASR" on May 15, 2025
@yfyeung changed the title from "[WIP] Support Zipformer encoder for LLM based ASR" to "[Not for merge] Support Zipformer encoder for LLM based ASR" on Jun 2, 2025