ColossalAI/colossalai
Latest commit: Frank Lee, 7d8d825681 — [booster] fixed the torch ddp plugin with the new checkpoint api (#3442), 2023-04-06 09:43:51 +08:00
Name | Last commit | Date
_C | |
_analyzer | [autoparallel]integrate auto parallel feature with new tracer (#3408) | 2023-04-04 17:40:45 +08:00
amp | [NFC] polish colossalai/amp/__init__.py code style (#3272) | 2023-03-29 15:22:21 +08:00
auto_parallel | [autoparallel]integrate auto parallel feature with new tracer (#3408) | 2023-04-04 17:40:45 +08:00
autochunk | |
booster | [booster] fixed the torch ddp plugin with the new checkpoint api (#3442) | 2023-04-06 09:43:51 +08:00
builder | |
checkpoint_io | [checkpoint] refactored the API and added safetensors support (#3427) | 2023-04-04 15:23:01 +08:00
cli | [NFC] polish colossalai/cli/benchmark/models.py code style (#3290) | 2023-03-29 15:22:21 +08:00
cluster | [booster] implemented the torch ddd + resnet example (#3232) | 2023-03-27 10:24:14 +08:00
communication | |
context | [NFC] polish colossalai/context/random/__init__.py code style (#3327) | 2023-03-30 14:19:26 +08:00
device | |
engine | [format] Run lint on colossalai.engine (#3367) | 2023-04-05 23:24:43 +08:00
fx | [autoparallel] adapt autoparallel with new analyzer (#3261) | 2023-03-30 17:47:24 +08:00
interface | [booster] implemented the torch ddd + resnet example (#3232) | 2023-03-27 10:24:14 +08:00
kernel | updated flash attention usage | 2023-03-20 17:57:04 +08:00
logging | |
nn | [zero] reorganize zero/gemini folder structure (#3424) | 2023-04-04 13:48:16 +08:00
pipeline | |
registry | |
tensor | Fix typo (#3448) | 2023-04-06 09:43:31 +08:00
testing | |
trainer | |
utils | [zero] reorganize zero/gemini folder structure (#3424) | 2023-04-04 13:48:16 +08:00
zero | [example] update examples related to zero/gemini (#3431) | 2023-04-04 17:32:51 +08:00
__init__.py | |
constants.py | |
core.py | |
global_variables.py | [NFC] polish colossalai/global_variables.py code style (#3259) | 2023-03-29 15:22:21 +08:00
initialize.py | [zero] reorganize zero/gemini folder structure (#3424) | 2023-04-04 13:48:16 +08:00