ColossalAI/colossalai
Latest commit: flybird11111 0a25e16e46 [shardformer]gather llama logits (#5398), 9 months ago
Name            Last commit                                               Age
_C
_analyzer
accelerator     [accelerator] fixed npu api                               10 months ago
amp             [npu] change device to accelerator api (#5239)            11 months ago
auto_parallel   [hotfix] Fix wrong import in meta_registry (#5392)        9 months ago
autochunk
booster         [fsdp] impl save/load shard model/optimizer (#5357)       9 months ago
checkpoint_io   Merge pull request #5372 from hpcaitech/exp/mixtral       10 months ago
cli
cluster
context
device
fx
inference
interface
kernel          [feat] refactored extension module (#5298)                10 months ago
lazy
legacy          [feat] refactored extension module (#5298)                10 months ago
logging
moe             [moe] fix tests                                           10 months ago
nn              [lr-scheduler] fix load state dict and add test (#5369)   10 months ago
pipeline        [feat] refactored extension module (#5298)                10 months ago
shardformer     [shardformer]gather llama logits (#5398)                  9 months ago
tensor          [moe] support mixtral (#5309)                             10 months ago
testing         [npu] change device to accelerator api (#5239)            11 months ago
utils           Merge pull request #5310 from hpcaitech/feature/npu       10 months ago
zero            [llama] fix training and inference scripts (#5384)        9 months ago
__init__.py
initialize.py   [npu] change device to accelerator api (#5239)            11 months ago