ColossalAI/colossalai
Latest commit de0d7df33f by digger yu: [nfc] fix typo colossalai/zero (#3923), 1 year ago
Name                 Last commit                                                   Last updated
_C
_analyzer
amp                  [bf16] add bf16 support (#3882)                               1 year ago
auto_parallel
autochunk
booster              [doc] fix docs about booster api usage (#3898)                1 year ago
builder
checkpoint_io
cli                  Modify torch version requirement to adapt torch 2.0 (#3896)  1 year ago
cluster
communication
context
device               [nfc] fix typo colossalai/cli fx kernel (#3847)               2 years ago
engine
fx                   [nfc] fix typo colossalai/cli fx kernel (#3847)               2 years ago
interface
kernel               [bf16] add bf16 support (#3882)                               1 year ago
lazy                 [lazy] fix compatibility problem on torch 1.13 (#3911)        1 year ago
logging
nn                   [nfc]fix typo colossalai/pipeline tensor nn (#3899)           1 year ago
pipeline             [nfc]fix typo colossalai/pipeline tensor nn (#3899)           1 year ago
registry
tensor               [nfc]fix typo colossalai/pipeline tensor nn (#3899)           1 year ago
testing
trainer              fix typo with colossalai/trainer utils zero (#3908)           1 year ago
utils                fix typo with colossalai/trainer utils zero (#3908)           1 year ago
zero                 [nfc] fix typo colossalai/zero (#3923)                        1 year ago
__init__.py
constants.py
core.py
global_variables.py
initialize.py        [nfc] fix typo colossalai/zero (#3923)                        1 year ago
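
Among the top-level files, initialize.py defines the package's distributed launch entry points (launch, launch_from_torch, and related helpers) that the subpackages above build on. Below is a minimal sketch of how those entry points are typically invoked; the exact launch_from_torch signature varies between ColossalAI releases, and the empty config dict and script name are placeholders rather than details taken from this listing.

```python
# Minimal sketch (assumed usage, not taken from this listing): bootstrapping a
# distributed run via the entry points defined in initialize.py. The
# launch_from_torch signature differs across ColossalAI releases; the empty
# config dict here is a placeholder.
import colossalai
import torch.distributed as dist


def main():
    # Reads rank/world-size from the environment variables set by torchrun
    # and initializes ColossalAI's distributed context on top of them.
    colossalai.launch_from_torch(config={})

    # Once launched, the usual torch.distributed primitives are available.
    print(f"process {dist.get_rank()} of {dist.get_world_size()} initialized")


if __name__ == "__main__":
    main()
```

A script like this would typically be started with torchrun, e.g. `torchrun --nproc_per_node=4 train.py`, where train.py is a hypothetical file name.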