ColossalAI/colossalai
Latest commit e83b2ce853 by XYE: [NFC] polish colossalai/nn/layer/vanilla/layers.py code style (#1295), 2022-07-13 12:08:21 +08:00
amp [hotfix]different overflow status lead to communication stuck. (#1175) 2022-06-27 09:53:57 +08:00 (see the overflow-sync sketch below)
builder [NFC] polish colossalai/builder/builder.py code style (#1265) 2022-07-13 12:08:21 +08:00
cli
communication [NFC] polish colossalai/communication/collective.py (#1262) 2022-07-13 12:08:21 +08:00
context
engine [NFC] polish colossalai/engine/ophooks/utils.py code style (#1256) 2022-07-13 12:08:21 +08:00
fx [fx] added ndim property to proxy (#1253) 2022-07-12 15:27:13 +08:00
gemini make AutoPlacementPolicy configurable (#1191) 2022-06-30 15:18:30 +08:00
kernel [NFC] polish colossalai/kernel/cuda_native/csrc/kernels/include/kernels.h code style (#1291) 2022-07-13 12:08:21 +08:00
logging
nn [NFC] polish colossalai/nn/layer/vanilla/layers.py code style (#1295) 2022-07-13 12:08:21 +08:00
pipeline [pipeline]add customized policy (#1139) 2022-06-21 15:23:41 +08:00
registry Remove duplication registry (#1078) 2022-06-08 07:47:24 +08:00
tensor [hotfix] Dist Mgr gather torch version (#1284) 2022-07-13 00:18:56 +08:00
testing [test] skip tests when not enough GPUs are detected (#1090) 2022-06-09 17:19:13 +08:00 (see the GPU-gating sketch below)
trainer
utils [tensor] distributed checkpointing for parameters (#1240) 2022-07-12 15:51:06 +08:00
zero [hotfix] fix sharded optim step and clip_grad_norm (#1226) 2022-07-08 13:34:48 +08:00 (see the grad-clipping sketch below)
__init__.py [NFC] polish colossalai/__init__.py code style (#1285) 2022-07-13 12:08:21 +08:00
constants.py
core.py [Tensor] distributed view supports inter-process hybrid parallel (#1169) 2022-06-27 09:45:26 +08:00
global_variables.py
initialize.py [ddp] supported customized torch ddp configuration (#1123) 2022-06-15 18:11:53 +08:00 (see the entry-point sketch below)
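Taken together, `initialize.py`, `amp`, and `cli` form the user-facing surface of this tree: a training script launches a distributed context, then wraps the model in an engine that applies whatever features the config requests. Below is a minimal sketch of that flow, assuming the v0.1.x-era API (`launch_from_torch`, `colossalai.initialize`, `AMP_TYPE`); the toy model and config values are illustrative, not taken from this repository.

```python
# Hedged sketch of a ColossalAI entry point, v0.1.x-era API; the toy
# model and config values are illustrative, not from this repository.
import colossalai
import torch
import torch.nn as nn
from colossalai.amp import AMP_TYPE

# Config dict consumed by colossalai.initialize; `fp16` selects
# torch.cuda.amp-backed mixed precision (handled by the amp package).
CONFIG = dict(fp16=dict(mode=AMP_TYPE.TORCH))

def main():
    # Reads rank/world size from the environment variables set up by
    # torchrun or the `colossalai run` launcher (the cli package).
    colossalai.launch_from_torch(config=CONFIG)

    model = nn.Linear(16, 2)
    optimizer = torch.optim.SGD(model.parameters(), lr=1e-2)
    criterion = nn.CrossEntropyLoss()

    # Wraps model/optimizer/criterion into an Engine that applies the
    # configured features (AMP here; ZeRO, gradient handlers, etc. are
    # enabled the same way through the config).
    engine, *_ = colossalai.initialize(model, optimizer, criterion)
    engine.train()

if __name__ == "__main__":
    main()
```

A script like this is typically started with `torchrun` or with the repository's own launcher, e.g. `colossalai run --nproc_per_node 2 train.py`.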
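The `amp` hotfix above names a classic mixed-precision pitfall: under dynamic loss scaling, ranks that disagree on whether gradients overflowed also disagree on whether to skip the optimizer step, so some ranks enter the next collective while others do not, and the job hangs. A hedged sketch of the usual remedy, written against plain `torch.distributed` rather than ColossalAI's actual implementation:

```python
# Sketch: synchronize the overflow verdict so every rank makes the same
# skip/step decision. Generic torch.distributed code, not ColossalAI's.
import torch
import torch.distributed as dist

def overflow_everywhere(local_overflow: bool) -> bool:
    # Assumes one CUDA device per rank and an initialized process group.
    flag = torch.tensor([1.0 if local_overflow else 0.0], device="cuda")
    # MAX reduction: if any rank overflowed, all ranks see overflow.
    dist.all_reduce(flag, op=dist.ReduceOp.MAX)
    return bool(flag.item())
```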
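The `zero` hotfix sits in the same family of problems: with a sharded optimizer, each rank holds only a slice of the gradients, so clipping by global norm must assemble that norm collectively before scaling. A generic sketch of the pattern (not the repository's implementation):

```python
# Sketch: global L2 grad-norm clipping over sharded gradients. Each rank
# sums its local squared norms, all-reduces the sum, then scales in place.
import torch
import torch.distributed as dist

def clip_sharded_grad_norm_(local_grads, max_norm: float) -> float:
    # Assumes a non-empty local shard and an initialized process group.
    local_sq = sum(g.pow(2).sum() for g in local_grads)
    dist.all_reduce(local_sq, op=dist.ReduceOp.SUM)
    total_norm = local_sq.sqrt().item()
    clip_coef = max_norm / (total_norm + 1e-6)
    if clip_coef < 1.0:
        for g in local_grads:
            g.mul_(clip_coef)
    return total_norm
```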
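Finally, the `testing` commit above gates tests on the number of visible GPUs. ColossalAI ships its own helpers in `colossalai/testing`; the sketch below is only the generic pytest equivalent, with a hypothetical test name:

```python
# Sketch: skip a test when fewer GPUs are available than it needs.
# Generic pytest idiom, not colossalai.testing's actual helper.
import pytest
import torch

requires_4_gpus = pytest.mark.skipif(
    torch.cuda.device_count() < 4,
    reason="needs at least 4 GPUs",
)

@requires_4_gpus
def test_tensor_parallel_on_4_gpus():  # hypothetical test
    ...
```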