ColossalAI/colossalai
Latest commit: [FAW] move coloparam setting in test code. (#1429) by Jiarui Fang, commit 10b3df65c8, 2022-08-10 14:31:53 +08:00
| Name | Last commit | Date |
| --- | --- | --- |
| amp | [doc] update rst and docstring (#1351) | 2022-07-21 15:54:53 +08:00 |
| builder | [NFC] polish colossalai/builder/builder.py code style (#1265) | 2022-07-13 12:08:21 +08:00 |
| cli | | |
| communication | [communication] add p2p_v2.py to support communication with List[Any] (#1407) | 2022-08-09 11:40:04 +08:00 |
| context | [doc] update rst and docstring (#1351) | 2022-07-21 15:54:53 +08:00 |
| device | [device] add DeviceMesh class to support logical device layout (#1394) | 2022-08-02 19:23:48 +08:00 |
| engine | [hotfix] fix PipelineSharedModuleGradientHandler (#1314) | 2022-07-14 17:31:13 +08:00 |
| fx | [fx] patched torch.max and data movement operator (#1391) | 2022-08-01 15:31:50 +08:00 |
| gemini | [zero] add has_inf_or_nan in AgChunk; enhance the unit test of AgChunk (#1426) | 2022-08-10 11:37:28 +08:00 |
| kernel | [hotfix] fix CPUAdam kernel nullptr (#1410) | 2022-08-05 19:45:45 +08:00 |
| logging | | |
| nn | [FAW] move coloparam setting in test code. (#1429) | 2022-08-10 14:31:53 +08:00 |
| pipeline | | |
| registry | | |
| tensor | [tensor] add shape consistency feature to support auto spec transform (#1418) | 2022-08-10 11:29:17 +08:00 |
| testing | | |
| trainer | | |
| utils | [hotfix] fix a running error in test_colo_checkpoint.py (#1387) | 2022-07-29 15:58:06 +08:00 |
| zero | [hotfix] zero optim prevents calling inner optim.zero_grad (#1422) | 2022-08-09 16:08:12 +08:00 |
| __init__.py | [NFC] polish colossalai/__init__.py code style (#1285) | 2022-07-13 12:08:21 +08:00 |
| constants.py | | |
| core.py | | |
| global_variables.py | | |
| initialize.py | [hotfix] remove potiential circle import (#1307) | 2022-07-14 13:44:26 +08:00 |
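The package's top-level entry points live in initialize.py, while the subdirectories above (amp, zero, nn, engine, ...) supply the parallel and memory-optimization features those entry points wire together. Below is a minimal sketch of how these entry points are typically used with the 2022-era API; the config path, model, and data are placeholders for illustration, not part of this listing.

```python
# Sketch only: assumes the 2022-era colossalai.launch_from_torch /
# colossalai.initialize API; 'config.py', the model, and the data are placeholders.
import torch
import torch.nn as nn

import colossalai

# Set up the distributed environment from torchrun-provided env vars; the config
# file may enable features from the subpackages listed above (amp, zero, ...).
colossalai.launch_from_torch(config='config.py')

model = nn.Linear(16, 4)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# colossalai.initialize wraps model/optimizer/criterion into an Engine that
# applies the configured parallelism and memory optimizations.
engine, *_ = colossalai.initialize(model, optimizer, criterion)

# One dummy training step driven through the Engine.
engine.train()
x = torch.randn(8, 16).cuda()
y = torch.randint(0, 4, (8,)).cuda()
engine.zero_grad()
loss = engine.criterion(engine(x), y)
engine.backward(loss)
engine.step()
```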