ColossalAI/colossalai

Latest commit 077a66dd81 by アマデウス: updated attention kernel (#2133), 2 years ago.
Name                 Last commit                                                        Last update
_C                   [optimizer] add div_scale for optimizers (#2117)                   2 years ago
amp                  [kernel] move all symlinks of kernel to `colossalai._C` (#1971)    2 years ago
auto_parallel        [autoparallel] process size nodes in runtime pass (#2130)          2 years ago
builder
cli                  [cli] updated installation check with more information (#2050)     2 years ago
communication
context              updated tp layers                                                  2 years ago
device               [device] update flatten device mesh usage (#2079)                  2 years ago
engine
fx                   [autoparallel] support linear function bias addition (#2104)       2 years ago
gemini               [Gemini] update API of the chunkmemstatscollector. (#2129)         2 years ago
kernel               updated attention kernel (#2133)                                   2 years ago
logging              fixed logger                                                       2 years ago
nn                   [Gemini] chunk init using runtime visited param order (#2115)      2 years ago
pipeline             [PP Middleware] Add bwd and step for PP middleware (#2111)         2 years ago
registry
tensor               [NFC] polish comments for Chunk class (#2116)                      2 years ago
testing              [zero] test gradient accumulation (#1964)                          2 years ago
trainer              [polish] remove useless file _mem_tracer_hook.py (#1963)           2 years ago
utils                [hotfix] fix a typo in ColoInitContext (#2106)                     2 years ago
zero                 [Gemini] update API of the chunkmemstatscollector. (#2129)         2 years ago
__init__.py          [setup] supported conda-installed torch (#2048)                    2 years ago
constants.py         updated tp layers                                                  2 years ago
core.py
global_variables.py  updated tp layers                                                  2 years ago
initialize.py
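
For orientation, below is a minimal sketch of how this package is typically entered. It assumes the launch_from_torch() helper defined in initialize.py above and the rank-aware logger from the logging subpackage; demo.py and the empty config dict are illustrative, not part of this listing.

```python
# Minimal sketch (hypothetical demo.py); run under torch.distributed, e.g.
#   torchrun --nproc_per_node=2 demo.py
import colossalai
from colossalai.logging import get_dist_logger

# launch_from_torch (defined in initialize.py) reads RANK / WORLD_SIZE /
# MASTER_ADDR from the environment set by torchrun and initializes the
# global distributed context managed by the `context` subpackage.
colossalai.launch_from_torch(config={})

# The distributed logger from the `logging` subpackage filters by rank.
logger = get_dist_logger()
logger.info("ColossalAI context initialized", ranks=[0])
```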