ColossalAI/colossalai
Latest commit: 638a07a7f9 by Frank Lee, "[test] fixed gemini plugin test (#3411)", 2 years ago
Name                 Last commit message                                                              Last change
_C
_analyzer            [autoparallel] adapt autoparallel with new analyzer (#3261)                      2 years ago
amp                  [NFC] polish colossalai/amp/__init__.py code style (#3272)                       2 years ago
auto_parallel        [test] fixed gemini plugin test (#3411)                                          2 years ago
autochunk            [autochunk] support vit (#3084)                                                  2 years ago
booster              [booster] implement Gemini plugin (#3352)                                        2 years ago
builder
checkpoint_io        [booster] implemented the torch ddd + resnet example (#3232)                     2 years ago
cli                  [NFC] polish colossalai/cli/benchmark/models.py code style (#3290)               2 years ago
cluster              [booster] implemented the torch ddd + resnet example (#3232)                     2 years ago
communication
context              [NFC] polish colossalai/context/random/__init__.py code style (#3327)            2 years ago
device               [hotfix] add copyright for solver and device mesh (#2803)                        2 years ago
engine               [NFC] polish colossalai/engine/gradient_handler/__init__.py code style (#3329)   2 years ago
fx                   [autoparallel] adapt autoparallel with new analyzer (#3261)                      2 years ago
gemini               [NFC] polish colossalai/gemini/paramhooks/_param_hookmgr.py code style           2 years ago
interface            [booster] implemented the torch ddd + resnet example (#3232)                     2 years ago
kernel               updated flash attention usage                                                    2 years ago
logging
nn                   [moe] add checkpoint for moe models (#3354)                                      2 years ago
pipeline             [pipeline] Add Simplified Alpa DP Partition (#2507)                              2 years ago
registry
tensor               Add interface for colo tesnor dp size (#3227)                                    2 years ago
testing
trainer
utils                [lazyinit] combine lazy tensor with dtensor (#3204)                              2 years ago
zero                 [zero] Refactor ZeroContextConfig class using dataclass (#3186)                  2 years ago
__init__.py
constants.py
core.py
global_variables.py  [NFC] polish colossalai/global_variables.py code style (#3259)                   2 years ago
initialize.py        Fix False warning in initialize.py (#2456)                                       2 years ago