ColossalAI/colossalai
Jiarui Fang bc0e271e71
[builder] use builder() for cpu adam and fused optim in setup.py (#2187)
2022-12-23 16:05:13 +08:00
_C [optimizer] add div_scale for optimizers (#2117) 2022-12-12 17:58:57 +08:00
amp [builder] runtime adam and fused_optim builder (#2184) 2022-12-23 14:14:21 +08:00
auto_parallel [autoparallel] integrate_gpt_related_tests (#2134) 2022-12-23 12:36:59 +08:00
builder
cli [cli] updated installation check with more information (#2050) 2022-11-30 17:53:55 +08:00
communication
context
device [device] update flatten device mesh usage (#2079) 2022-12-05 16:16:07 +08:00
engine
fx [Pipeline Middleware] Fix deadlock when num_microbatch=num_stage (#2156) 2022-12-23 11:38:43 +08:00
gemini [hotfix] fix auto policy of test_sharded_optim_v2 (#2157) 2022-12-20 23:03:18 +08:00
kernel [builder] use builder() for cpu adam and fused optim in setup.py (#2187) 2022-12-23 16:05:13 +08:00
logging fixed logger 2022-11-15 16:00:07 +08:00
nn [builder] runtime adam and fused_optim builder (#2184) 2022-12-23 14:14:21 +08:00
pipeline [Pipeline Middleware] Fix deadlock when num_microbatch=num_stage (#2156) 2022-12-23 11:38:43 +08:00
registry
tensor [autoparallel] memory estimation for shape consistency (#2144) 2022-12-21 10:39:37 +08:00
testing [zero] test gradient accumulation (#1964) 2022-11-29 13:00:30 +08:00
trainer [polish] remove useless file _mem_tracer_hook.py (#1963) 2022-11-16 15:55:10 +08:00
utils [Gemini] Update coloinit_ctx to support meta_tensor (#2147) 2022-12-19 22:37:07 +08:00
zero [example] add zero1, zero2 example in GPT examples (#2146) 2022-12-20 14:30:27 +08:00
__init__.py [setup] supported conda-installed torch (#2048) 2022-11-30 16:45:15 +08:00
constants.py
core.py
global_variables.py
initialize.py