ColossalAI/colossalai (latest commit da39d21b71 by Hongxin Liu: [moe] support mixtral (#5309), 10 months ago)
Directory/File    Last commit                                                Updated
_C
_analyzer
accelerator       [accelerator] fixed npu api                                10 months ago
amp               [npu] change device to accelerator api (#5239)             11 months ago
auto_parallel     [npu] change device to accelerator api (#5239)             11 months ago
autochunk
booster           [moe] support mixtral (#5309)                              10 months ago
checkpoint_io     [moe] support mixtral (#5309)                              10 months ago
cli
cluster           fix-test (#5210)                                           11 months ago
context
device
fx
inference
interface
kernel            [feat] refactored extension module (#5298)                 10 months ago
lazy
legacy            [feat] refactored extension module (#5298)                 10 months ago
logging
moe               [moe] support mixtral (#5309)                              10 months ago
nn                [lr-scheduler] fix load state dict and add test (#5369)    10 months ago
pipeline          [feat] refactored extension module (#5298)                 10 months ago
shardformer       fix typo change dosen't to doesn't (#5308)                 10 months ago
tensor            [moe] support mixtral (#5309)                              10 months ago
testing           [npu] change device to accelerator api (#5239)             11 months ago
utils             Merge pull request #5310 from hpcaitech/feature/npu        10 months ago
zero              [moe] support mixtral (#5309)                              10 months ago
__init__.py       [accelerator] init the accelerator module (#5129)          1 year ago
initialize.py     [npu] change device to accelerator api (#5239)             11 months ago