ColossalAI/colossalai/nn
Latest commit: dc003c304c by Xuanlei Zhao, [moe] merge moe into main (#4978), 1 year ago
Name           Last commit                                                                Age
layer/         [moe] merge moe into main (#4978)                                          1 year ago
loss/          [moe] merge moe into main (#4978)                                          1 year ago
lr_scheduler/  [hotfix] fix lr scheduler bug in torch 2.0 (#4864)                         1 year ago
optimizer/     [test] add no master test for low level zero plugin (#4934)                1 year ago
__init__.py    [legacy] move communication and nn to legacy and refactor logger (#4671)   1 year ago
init.py        [misc] update pre-commit and run all files (#4752)                         1 year ago