ColossalAI/colossalai
Latest commit: dba7e0cfb4 by ver217, "make AutoPlacementPolicy configurable (#1191)", 2022-06-30 15:18:30 +08:00
| Name | Last commit | Date |
| --- | --- | --- |
| amp | [hotfix]different overflow status lead to communication stuck. (#1175) | 2022-06-27 09:53:57 +08:00 |
| builder | [pipeline] refactor the pipeline module (#1087) | 2022-06-10 11:27:38 +08:00 |
| cli | [hotfix] fix some bugs caused by size mismatch. (#1011) | 2022-05-23 14:02:28 +08:00 |
| communication | [hotfix]fixed p2p process send stuck (#1181) | 2022-06-28 14:41:11 +08:00 |
| context | | |
| engine | [hotfix]fix some bugs caused by refactored schedule. (#1148) | 2022-06-21 22:46:30 +08:00 |
| fx | [fx] patched conv and normalization (#1188) | 2022-06-29 18:58:38 +08:00 |
| gemini | make AutoPlacementPolicy configurable (#1191) | 2022-06-30 15:18:30 +08:00 |
| kernel | [optim] refactor fused sgd (#1134) | 2022-06-20 11:19:38 +08:00 |
| logging | | |
| nn | [refactor] move chunk and chunkmgr to directory gemini (#1182) | 2022-06-29 13:31:02 +08:00 |
| pipeline | [pipeline]add customized policy (#1139) | 2022-06-21 15:23:41 +08:00 |
| registry | Remove duplication registry (#1078) | 2022-06-08 07:47:24 +08:00 |
| tensor | [tensor] remove gpc in tensor tests (#1186) | 2022-06-29 14:08:40 +08:00 |
| testing | [test] skip tests when not enough GPUs are detected (#1090) | 2022-06-09 17:19:13 +08:00 |
| trainer | fix issue #1080 (#1071) | 2022-06-07 17:21:11 +08:00 |
| utils | [context]use meta tensor to init model lazily. (#1187) | 2022-06-29 21:02:30 +08:00 |
| zero | [refactor] move chunk and chunkmgr to directory gemini (#1182) | 2022-06-29 13:31:02 +08:00 |
| __init__.py | | |
| constants.py | fix typo in constants (#1027) | 2022-05-26 08:45:08 +08:00 |
| core.py | [Tensor] distributed view supports inter-process hybrid parallel (#1169) | 2022-06-27 09:45:26 +08:00 |
| global_variables.py | | |
| initialize.py | [ddp] supported customized torch ddp configuration (#1123) | 2022-06-15 18:11:53 +08:00 |