ColossalAI/colossalai/zero

Latest commit: 1a3315e336 by littsk, "[hotfix] Add layer norm gradients all-reduce for sequence parallel (#4926)", 1 year ago
Name          Last commit                                                                   Last updated
gemini        [hotfix] fix grad accumulation plus clipping for gemini (#5002)              1 year ago
low_level     [hotfix] Add layer norm gradients all-reduce for sequence parallel (#4926)   1 year ago
__init__.py   [misc] update pre-commit and run all files (#4752)                           1 year ago
wrapper.py    [misc] update pre-commit and run all files (#4752)                           1 year ago