ColossalAI/colossalai/legacy/engine/gradient_handler

Latest commit: dc003c304c "[moe] merge moe into main (#4978)" by Xuanlei Zhao, 1 year ago
File                                     Last commit message                              Age
__init__.py                              [moe] merge moe into main (#4978)                1 year ago
_base_gradient_handler.py                [misc] update pre-commit and run all files (#4752)  1 year ago
_data_parallel_gradient_handler.py       [misc] update pre-commit and run all files (#4752)  1 year ago
_moe_gradient_handler.py                 [misc] update pre-commit and run all files (#4752)  1 year ago
_pipeline_parallel_gradient_handler.py   [misc] update pre-commit and run all files (#4752)  1 year ago
_sequence_parallel_gradient_handler.py   [misc] update pre-commit and run all files (#4752)  1 year ago
_zero_gradient_handler.py                [misc] update pre-commit and run all files (#4752)  1 year ago
utils.py                                 [legacy] move engine to legacy (#4560)           1 year ago
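The modules listed above share one pattern: a base gradient handler that subclasses override to synchronize gradients after the backward pass (data-parallel all-reduce, MoE, ZeRO, and so on). The sketch below is illustrative only, not the actual ColossalAI implementation: the class and method names (`BaseGradientHandler`, `handle_gradient`) mirror the legacy API, but the all-reduce is simulated with plain Python lists so the example stays self-contained.

```python
class BaseGradientHandler:
    """Base class: subclasses synchronize gradients after backward().

    Hypothetical, simplified stand-in for the legacy ColossalAI base
    gradient handler; it only stores a parameter list here.
    """

    def __init__(self, params):
        self.params = params  # list of dicts, each with a "grad" entry

    def handle_gradient(self):
        raise NotImplementedError


class DataParallelGradientHandler(BaseGradientHandler):
    """Averages each parameter's gradient across simulated ranks,
    mimicking the all-reduce a data-parallel handler would perform."""

    def __init__(self, params, rank_grads):
        super().__init__(params)
        self.rank_grads = rank_grads  # one gradient list per "rank"

    def handle_gradient(self):
        # Simulated all-reduce: average gradient i over all ranks.
        n = len(self.rank_grads)
        for i, p in enumerate(self.params):
            p["grad"] = sum(rank[i] for rank in self.rank_grads) / n


# Two parameters, gradients computed on two simulated ranks.
params = [{"grad": None}, {"grad": None}]
rank_grads = [[1.0, 2.0], [3.0, 4.0]]
DataParallelGradientHandler(params, rank_grads).handle_gradient()
print([p["grad"] for p in params])  # → [2.0, 3.0]
```

In the real engine, the trainer invokes each registered handler's `handle_gradient()` between the backward pass and the optimizer step; the other files in this directory supply the same hook specialized for pipeline, sequence-parallel, MoE, and ZeRO setups.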