ColossalAI/colossalai/utils
Latest commit: 01066152f1 by Nikita Shulga, 2023-02-17 09:22:45 +08:00

Don't use `torch._six` (#2775)

* Don't use `torch._six`: this is a private API that is gone after https://github.com/pytorch/pytorch/pull/94709.
* Update common.py
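For context, a minimal sketch of the kind of change this commit makes in common.py, assuming the usual pattern of importing `inf` from the private `torch._six` module; the exact lines touched may differ:

```python
# Before (breaks once pytorch/pytorch#94709 deletes the private module):
#   from torch._six import inf

# After: take `inf` from a public location, with a stdlib fallback for
# PyTorch releases that predate `torch.inf`.
import math

try:
    from torch import inf
except ImportError:
    inf = math.inf  # mathematically identical; both are float('inf')

# `inf` is then used as before, e.g. as the norm_type sentinel that
# selects the max-abs reduction when clipping gradients.
```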
checkpoint/
checkpoint_io/             [CheckpointIO] a uniform checkpoint I/O module (#1689)             2022-11-08 15:15:13 +08:00
data_sampler/
model/                     [gemini] fix colo_init_context (#2683)                             2023-02-13 17:53:15 +08:00
multi_tensor_apply/        [setup] support pre-build and jit-build of cuda kernels (#2374)    2023-01-06 20:50:26 +08:00
profiler/                  [Gemini] clean no used MemTraceOp (#1970)                          2022-11-17 13:41:54 +08:00
rank_recorder/             [pipeline/rank_recorder] fix bug when process data before backward | add a tool for multiple ranks debug (#1681)    2022-10-09 17:32:57 +08:00
tensor_detector/
__init__.py                [ddp] add is_ddp_ignored (#2434)                                   2023-01-11 12:22:45 +08:00
activation_checkpoint.py
checkpointing.py
common.py                  Don't use `torch._six` (#2775)                                     2023-02-17 09:22:45 +08:00
cuda.py
memory.py
moe.py
timer.py