ColossalAI/colossalai
Latest commit: ff507b755e by hxwang, "Merge branch 'main' of github.com:hpcaitech/ColossalAI into prefetch" (6 months ago)
_C/
_analyzer/
accelerator/
amp/
auto_parallel/    [misc] refactor launch API and tensor constructor (#5666) (7 months ago)
autochunk/
booster/          Merge branch 'main' of github.com:hpcaitech/ColossalAI into prefetch (6 months ago)
checkpoint_io/    [lora] add lora APIs for booster, support lora for TorchDDP (#4981) (7 months ago)
cli/
cluster/          [Feature] Distributed optimizers: Lamb, Galore, CAME and Adafactor (#5694) (7 months ago)
context/          [Fix]: implement thread-safety singleton to avoid deadlock for very large-scale training scenarios (#5625) (7 months ago)
device/           [Feature] Distributed optimizers: Lamb, Galore, CAME and Adafactor (#5694) (7 months ago)
fx/
inference/        [Inference] Fix readme and example for API server (#5742) (6 months ago)
interface/        [Feature] Distributed optimizers: Lamb, Galore, CAME and Adafactor (#5694) (7 months ago)
kernel/           [NFC] Fix code factors on inference triton kernels (#5743) (6 months ago)
lazy/             [lazy] fix lazy cls init (#5720) (7 months ago)
legacy/           [sync] Sync feature/colossal-infer with main (6 months ago)
logging/
moe/
nn/               [Feature] Distributed optimizers: Lamb, Galore, CAME and Adafactor (#5694) (7 months ago)
pipeline/         [LowLevelZero] low level zero support lora (#5153) (7 months ago)
quantization/     [Feature] qlora support (#5586) (7 months ago)
shardformer/      [Colossal-Inference] (v0.1.0) Merge pull request #5739 from hpcaitech/feature/colossal-infer (6 months ago)
tensor/           [Feature] Distributed optimizers: Lamb, Galore, CAME and Adafactor (#5694) (7 months ago)
testing/
utils/
zero/             Merge branch 'main' of github.com:hpcaitech/ColossalAI into prefetch (6 months ago)
__init__.py
initialize.py     [misc] refactor launch API and tensor constructor (#5666) (7 months ago)