ColossalAI/colossalai/utils

Latest commit: 250be4d31e by Frank Lee, 2022-07-15 17:47:12 +08:00
[utils] integrated colotensor with lazy init context (#1324)

Commit message:
* [utils] integrated colotensor with lazy init context
* polish code
* polish code
* polish code
checkpoint                 [checkpoint] add ColoOptimizer checkpointing (#1316)           2022-07-15 09:52:55 +08:00
data_sampler               Refactored docstring to google style                           2022-03-29 17:17:47 +08:00
model                      [utils] integrated colotensor with lazy init context (#1324)   2022-07-15 17:47:12 +08:00
multi_tensor_apply         Refactored docstring to google style                           2022-03-29 17:17:47 +08:00
profiler                   [hotfix] remove potiential circle import (#1307)               2022-07-14 13:44:26 +08:00
tensor_detector            Refactored docstring to google style                           2022-03-29 17:17:47 +08:00
__init__.py                [refactory] add nn.parallel module (#1068)                     2022-06-06 15:34:41 +08:00
activation_checkpoint.py   [util] fixed activation checkpointing on torch 1.9 (#719)      2022-04-12 09:35:45 +08:00
checkpointing.py           polish checkpoint docstring (#637)                             2022-04-02 13:34:33 +08:00
common.py                  [hotfix] fix sharded optim step and clip_grad_norm (#1226)     2022-07-08 13:34:48 +08:00
cuda.py                    [refactor] refactor the memory utils (#715)                    2022-04-11 16:47:57 +08:00
memory.py                  [gemini] APIs to set cpu memory capacity (#809)                2022-04-19 16:05:22 +08:00
moe.py                     Refactored docstring to google style                           2022-03-29 17:17:47 +08:00
timer.py                   Refactored docstring to google style                           2022-03-29 17:17:47 +08:00