ColossalAI/colossalai/utils
Latest commit: 01066152f1 by Nikita Shulga, "Don't use `torch._six`" (#2775), 2023-02-17 09:22:45 +08:00

Commit message:
* Don't use `torch._six`: `torch._six` is a private API that was removed upstream in https://github.com/pytorch/pytorch/pull/94709
* Update common.py
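Since the commit above touches code that imported from the removed private module, here is a minimal, hypothetical sketch of that kind of migration (not the actual ColossalAI patch; `clip_coef` is an illustrative helper): fall back to the public `torch.inf` / `math.inf` instead of `torch._six.inf`.

```python
# Hedged sketch, assuming recent PyTorch exposes torch.inf; math.inf is the
# fallback for older releases. The old, now-broken import was:
#   from torch._six import inf
import math

import torch

inf = getattr(torch, "inf", math.inf)


def clip_coef(total_norm: float, max_norm: float) -> float:
    """Toy gradient-clipping coefficient that treats an infinite/NaN norm safely."""
    if math.isnan(total_norm) or total_norm == inf:
        return 0.0
    return min(1.0, max_norm / (total_norm + 1e-6))
```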
| Name | Last commit | Last updated |
| --- | --- | --- |
| `checkpoint` | [hotfix] fix a running error in test_colo_checkpoint.py (#1387) | 2022-07-29 15:58:06 +08:00 |
| `checkpoint_io` | [CheckpointIO] a uniform checkpoint I/O module (#1689) | 2022-11-08 15:15:13 +08:00 |
| `data_sampler` | Refactored docstring to google style | 2022-03-29 17:17:47 +08:00 |
| `model` | [gemini] fix colo_init_context (#2683) | 2023-02-13 17:53:15 +08:00 |
| `multi_tensor_apply` | [setup] support pre-build and jit-build of cuda kernels (#2374) | 2023-01-06 20:50:26 +08:00 |
| `profiler` | [Gemini] clean no used MemTraceOp (#1970) | 2022-11-17 13:41:54 +08:00 |
| `rank_recorder` | [pipeline/rank_recorder] fix bug when process data before backward \| add a tool for multiple ranks debug (#1681) | 2022-10-09 17:32:57 +08:00 |
| `tensor_detector` | [NFC] polish utils/tensor_detector/__init__.py code style (#1573) | 2022-09-08 22:11:04 +08:00 |
| `__init__.py` | [ddp] add is_ddp_ignored (#2434) | 2023-01-11 12:22:45 +08:00 |
| `activation_checkpoint.py` | [utils] Add use_reetrant=False in utils.activation_checkpoint (#1460) | 2022-08-16 15:39:20 +08:00 |
| `checkpointing.py` | [utils] refactor parallel layers checkpoint and bcast model on loading checkpoint (#1548) | 2022-09-06 20:18:35 +08:00 |
| `common.py` | Don't use `torch._six` (#2775) | 2023-02-17 09:22:45 +08:00 |
| `cuda.py` | [refactor] refactor the memory utils (#715) | 2022-04-11 16:47:57 +08:00 |
| `memory.py` | [gemini] APIs to set cpu memory capacity (#809) | 2022-04-19 16:05:22 +08:00 |
| `moe.py` | Refactored docstring to google style | 2022-03-29 17:17:47 +08:00 |
| `timer.py` | Refactored docstring to google style | 2022-03-29 17:17:47 +08:00 |