ColossalAI/colossalai
Latest commit: 8c66a1d0aa by Jiarui Fang, "[polish] remove useless file _mem_tracer_hook.py (#1963)" (2 years ago)
Directory / file      | Last commit                                                                                 | Last updated
amp                   | [NFC] polish colossalai/amp/naive_amp/__init__.py code style (#1905)                       | 2 years ago
auto_parallel         | [autoparallel] support addmm in tracer and solver (#1961)                                   | 2 years ago
builder               |                                                                                             |
cli                   |                                                                                             |
communication         |                                                                                             |
context               | updated tp layers                                                                           | 2 years ago
device                | [autoparallel] add numerical test for node strategies (#1760)                               | 2 years ago
engine                |                                                                                             |
fx                    | [autoparallel] support addmm in tracer and solver (#1961)                                   | 2 years ago
gemini                | [Gemini] polish memstats collector (#1962)                                                  | 2 years ago
kernel                | updated flash attention api                                                                 | 2 years ago
logging               | fixed logger                                                                                | 2 years ago
nn                    | [Gemini] add GeminiAdamOptimizer (#1960)                                                    | 2 years ago
pipeline              | [Pipeline]Adapt to Pipelinable OPT (#1782)                                                  | 2 years ago
registry              |                                                                                             |
tensor                | [autoparallel] remove redundancy comm node (#1893)                                          | 2 years ago
testing               | [unittest] added doc for the pytest wrapper (#1704)                                         | 2 years ago
trainer               | [polish] remove useless file _mem_tracer_hook.py (#1963)                                    | 2 years ago
utils                 | [ColoTensor] reconfig ColoInitContext, decouple default_pg and default_dist_spec. (#1953)   | 2 years ago
zero                  | [Gemini] polish memstats collector (#1962)                                                  | 2 years ago
__init__.py           | version to 0.1.11rc2 (#1832)                                                                 | 2 years ago
constants.py          | updated tp layers                                                                           | 2 years ago
core.py               |                                                                                             |
global_variables.py   | updated tp layers                                                                           | 2 years ago
initialize.py         |                                                                                             |