ColossalAI/colossalai

Latest commit: 86cf6aed5b "Fix/format (#4261)" by Michelle, 2023-07-26 14:12:57 +08:00
  * revise shardformer readme (#4246)
  * [example] add llama pretraining (#4257)
  * [NFC] polish colossalai/communication/p2p.py code style
  Co-authored-by: Jianghai, binmakeswell, Qianran Ma
Name            | Last commit                                                                           | Date
_C/             |                                                                                       |
_analyzer/      |                                                                                       |
amp/            |                                                                                       |
auto_parallel/  | [test] fixed tests failed due to dtensor change (#4082)                              | 2023-07-04 16:05:01 +08:00
autochunk/      |                                                                                       |
booster/        | [NFC] Fix format for mixed precision (#4253)                                          | 2023-07-26 14:12:57 +08:00
builder/        |                                                                                       |
checkpoint_io/  | [checkpointio] Sharded Optimizer Checkpoint for Gemini Plugin (#4302)                 | 2023-07-21 14:39:01 +08:00
cli/            | [cli] hotfix launch command for multi-nodes (#4165)                                   | 2023-07-04 17:54:40 +08:00
cluster/        |                                                                                       |
communication/  | Fix/format (#4261)                                                                    | 2023-07-26 14:12:57 +08:00
context/        |                                                                                       |
device/         | [format] applied code formatting on changed files in pull request 4152 (#4157)       | 2023-07-04 16:07:47 +08:00
engine/         | [nfc] fix ColossalaiOptimizer is not defined (#4122)                                  | 2023-06-30 17:23:22 +08:00
fx/             |                                                                                       |
interface/      | [checkpointio] Unsharded Optimizer Checkpoint for Gemini Plugin (#4141)               | 2023-07-07 16:33:06 +08:00
kernel/         | [Kernels] added triton-implemented of self attention for colossal-ai (#4241)         | 2023-07-18 23:53:38 +08:00
lazy/           | [lazy] support init on cuda (#4269)                                                   | 2023-07-19 16:43:01 +08:00
logging/        |                                                                                       |
nn/             | [shardformer] integrated linear 1D with dtensor (#3996)                               | 2023-07-04 16:05:01 +08:00
pipeline/       |                                                                                       |
registry/       |                                                                                       |
shardformer/    | revise shardformer readme (#4246)                                                     | 2023-07-17 17:30:57 +08:00
tensor/         | [dtensor] fixed readme file name and removed deprecated file (#4162)                  | 2023-07-04 18:21:11 +08:00
testing/        | [checkpointio] Unsharded Optimizer Checkpoint for Gemini Plugin (#4141)               | 2023-07-07 16:33:06 +08:00
trainer/        |                                                                                       |
utils/          |                                                                                       |
zero/           | [checkpointio] Sharded Optimizer Checkpoint for Gemini Plugin (#4302)                 | 2023-07-21 14:39:01 +08:00
__init__.py
constants.py
core.py
global_variables.py
initialize.py