ColossalAI/colossalai
Latest commit 48d33b1b17 by HELSON: "[gemini] add get static torch model (#2356)" (2023-01-06 13:41:19 +08:00)
Name                 Last updated                Last commit
_C                   2022-12-12 17:58:57 +08:00  [optimizer] add div_scale for optimizers (#2117)
amp                  2023-01-04 15:09:57 +08:00  [NFC] polish colossalai/amp/torch_amp/torch_amp.py code style (#2290)
auto_parallel        2023-01-06 09:26:49 +08:00  [workflow] New version: Create workflow files for examples' auto check (#2298)
builder
cli                  2023-01-04 15:09:57 +08:00  [NFC] polish colossalai/cli/benchmark/__init__.py code style (#2308)
communication        2023-01-04 15:09:57 +08:00  [NFC] polish communication/p2p_v2.py code style (#2303)
context              2022-12-27 12:42:46 +08:00  [hotfix] Fixing the bug related to IPv6 support
device               2023-01-05 16:39:55 +08:00  [device] alpha beta profiler (#2311)
engine
fx                   2023-01-04 14:44:22 +08:00  [auto-parallel] refactoring ColoTracer (#2118)
gemini               2023-01-03 15:55:35 +08:00  [Gemini] fix the convert_to_torch_module bug (#2269)
kernel               2023-01-04 16:32:32 +08:00  [builder] reconfig op_builder for PyPI install (#2314)
logging              2022-12-29 22:59:39 +08:00  [logger] hotfix, missing _FORMAT (#2231)
nn                   2023-01-06 13:41:19 +08:00  [gemini] add get static torch model (#2356)
pipeline             2023-01-03 17:20:59 +08:00  [example] add benchmark (#2276)
registry
tensor               2023-01-03 17:18:07 +08:00  [autoparallel] fix runtime apply memory estimation (#2281)
testing              2023-01-04 11:59:56 +08:00  [amp] add gradient clipping for unit tests (#2283)
trainer              2022-11-16 15:55:10 +08:00  [polish] remove useless file _mem_tracer_hook.py (#1963)
utils                2022-12-23 20:57:41 +08:00  [builder] unified cpu_optim fused_optim interface (#2190)
zero                 2023-01-03 17:22:34 +08:00  [zero] polish low level zero optimizer (#2275)
__init__.py          2022-11-30 16:45:15 +08:00  [setup] supported conda-installed torch (#2048)
constants.py         2022-11-02 12:19:38 +08:00  updated tp layers
core.py
global_variables.py  2022-11-02 12:19:38 +08:00  updated tp layers
initialize.py