ColossalAI/colossalai
Latest commit: d5c5bc219e "[SC] add GPT example for auto checkpoint (#1889)" by Boyuan Yao, 2 years ago
Name | Last commit | Age
amp | [NFC] polish colossalai/amp/naive_amp/__init__.py code style (#1905) | 2 years ago
auto_parallel | [autoparallel] user-friendly API for CheckpointSolver. (#1879) | 2 years ago
builder | [NFC] polish colossalai/builder/__init__.py code style (#1560) | 2 years ago
cli | |
communication | |
context | updated tp layers | 2 years ago
device | [autoparallel] add numerical test for node strategies (#1760) | 2 years ago
engine | [engin/schedule] use p2p_v2 to recontruct pipeline_schedule (#1408) | 2 years ago
fx | [SC] add GPT example for auto checkpoint (#1889) | 2 years ago
gemini | MemStatsCollectorStatic (#1765) | 2 years ago
kernel | [kernel] added jit warmup (#1792) | 2 years ago
logging | |
nn | [inference] overlap comm and compute in Linear1D_Row when stream_chunk_num > 1 (#1876) | 2 years ago
pipeline | [Pipeline]Adapt to Pipelinable OPT (#1782) | 2 years ago
registry | |
tensor | [autoparallel] fix bugs caused by negative dim key (#1808) | 2 years ago
testing | [unittest] added doc for the pytest wrapper (#1704) | 2 years ago
trainer | [NFC] polish _checkpoint_hook.py code style (#1722) | 2 years ago
utils | [utils] fixed lazy init context (#1867) | 2 years ago
zero | [zero] migrate zero1&2 (#1878) | 2 years ago
__init__.py | version to 0.1.11rc2 (#1832) | 2 years ago
constants.py | updated tp layers | 2 years ago
core.py | |
global_variables.py | updated tp layers | 2 years ago
initialize.py | |
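Taken together, these entries show the top level of the colossalai package: initialize.py holds the launch and initialization entry points, amp and zero cover mixed precision and ZeRO memory optimization, and engine, nn, and trainer cover model building and the training loop. The snippet below is a minimal usage sketch of how these pieces were typically wired together in the 0.1.x-era API listed here; launch_from_torch, colossalai.initialize, and the Engine methods are assumed from common usage examples, not confirmed by this listing.

```python
# Minimal sketch, assuming the ColossalAI 0.1.x-era API (launch_from_torch,
# colossalai.initialize, Engine.backward/step); not verified against this revision.
import torch
import torch.nn as nn

import colossalai

# Read rank/world size from the torch.distributed launcher environment and
# set up the global parallel context (see context/ and core.py above).
colossalai.launch_from_torch(config={})

model = nn.Linear(1024, 1024)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.MSELoss()

# Wrap model/optimizer into an Engine that applies the configured
# AMP / ZeRO / gradient handling (see amp/, zero/, engine/ above).
engine, *_ = colossalai.initialize(model=model, optimizer=optimizer, criterion=criterion)

engine.train()
for data, label in [(torch.randn(8, 1024), torch.randn(8, 1024))]:
    engine.zero_grad()
    output = engine(data)
    loss = engine.criterion(output, label)
    engine.backward(loss)
    engine.step()
```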