ColossalAI/colossalai
Latest commit 15055f9a36 by Edenzzzz: [hotfix] quick fixes to make legacy tutorials runnable (#5559) (8 months ago)
Name            Last commit (last updated)
_C
_analyzer       [hotfix] quick fixes to make legacy tutorials runnable (#5559) · 8 months ago
accelerator     [hotfix] fix typo change MoECheckpintIO to MoECheckpointIO (#5335) · 9 months ago
amp
auto_parallel
autochunk
booster         [shardformer] Sequence Parallelism Optimization (#5533) · 8 months ago
checkpoint_io   [hotfix] fix typo change MoECheckpintIO to MoECheckpointIO (#5335) · 9 months ago
cli
cluster         [shardformer] Sequence Parallelism Optimization (#5533) · 8 months ago
context
device
fx
inference       [shardformer, pipeline] add `gradient_checkpointing_ratio` and heterogenous shard policy for llama (#5508) · 8 months ago
interface
kernel          [shardformer] update colo attention to support custom mask (#5510) · 8 months ago
lazy
legacy          Fix ColoTensorSpec for py11 (#5440) · 8 months ago
logging
moe             [hotfix] fix typo change MoECheckpintIO to MoECheckpointIO (#5335) · 9 months ago
nn              [hotfix] quick fixes to make legacy tutorials runnable (#5559) · 8 months ago
pipeline        [shardformer, pipeline] add `gradient_checkpointing_ratio` and heterogenous shard policy for llama (#5508) · 8 months ago
shardformer     [shardformer] Sequence Parallelism Optimization (#5533) · 8 months ago
tensor          fixed layout converter caching and updated tester · 8 months ago
testing         [shardformer] update colo attention to support custom mask (#5510) · 8 months ago
utils
zero            [shardformer] Sequence Parallelism Optimization (#5533) · 8 months ago
__init__.py
initialize.py
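For orientation, here is a minimal, hedged sketch of how a few of the listed subpackages (booster and its plugins, together with the launch helpers exported from initialize.py) are typically combined in a training script. The model, dataset, and hyperparameters are synthetic stand-ins, the launch call assumes a torchrun-style distributed environment, and the exact launch signature varies across releases.

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

import colossalai                               # launch helpers come from initialize.py
from colossalai.booster import Booster          # booster/ is the high-level training entry point
from colossalai.booster.plugin import TorchDDPPlugin

# Sketch only: assumes the script is started with torchrun. Older releases expect a
# config dict here; newer ones take no config argument.
colossalai.launch_from_torch(config={})

# Synthetic stand-ins so the example is self-contained.
model = torch.nn.Linear(32, 4)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = torch.nn.CrossEntropyLoss()
dataset = TensorDataset(torch.randn(256, 32), torch.randint(0, 4, (256,)))
train_loader = DataLoader(dataset, batch_size=16, shuffle=True)

# The chosen plugin (TorchDDP here; others cover the zero/ and shardformer/ paths)
# decides how the objects are wrapped for distributed training.
booster = Booster(plugin=TorchDDPPlugin())
model, optimizer, criterion, train_loader, _ = booster.boost(
    model, optimizer, criterion, train_loader
)

for inputs, labels in train_loader:
    loss = criterion(model(inputs), labels)
    booster.backward(loss, optimizer)           # backward is routed through the active plugin
    optimizer.step()
    optimizer.zero_grad()
```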