ColossalAI/colossalai/nn

Latest commit e5ce4c8ea6 by Hongxin Liu: [npu] add npu support for gemini and zero (#5067), 1 year ago
layer          [moe] merge moe into main (#4978)                                          1 year ago
loss           [moe] merge moe into main (#4978)                                          1 year ago
lr_scheduler   [hotfix] fix lr scheduler bug in torch 2.0 (#4864)                         1 year ago
optimizer      [npu] add npu support for gemini and zero (#5067)                          1 year ago
__init__.py    [legacy] move communication and nn to legacy and refactor logger (#4671)   1 year ago
init.py        [misc] update pre-commit and run all files (#4752)                         1 year ago