ColossalAI/colossalai/utils
Latest commit: 35c0c0006e "[utils] lazy init. (#2148)" by Super Daniel, 2 years ago
Name                       Last commit                                                                                                        Age
checkpoint/
checkpoint_io/             [CheckpointIO] a uniform checkpoint I/O module (#1689)                                                             2 years ago
data_sampler/
model/                     [utils] lazy init. (#2148)                                                                                         2 years ago
multi_tensor_apply/        [setup] support pre-build and jit-build of cuda kernels (#2374)                                                    2 years ago
profiler/                  [Gemini] clean no used MemTraceOp (#1970)                                                                          2 years ago
rank_recorder/             [pipeline/rank_recorder] fix bug when process data before backward | add a tool for multiple ranks debug (#1681)   2 years ago
tensor_detector/
__init__.py                [ddp] add is_ddp_ignored (#2434)                                                                                   2 years ago
activation_checkpoint.py
checkpointing.py
common.py                  [ddp] add is_ddp_ignored (#2434)                                                                                   2 years ago
cuda.py
memory.py
moe.py
timer.py
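
The modules above form ColossalAI's general-purpose utility layer: device helpers (cuda.py), timing (timer.py), activation checkpointing (activation_checkpoint.py, checkpointing.py), memory and profiling helpers (memory.py, profiler/), and checkpoint I/O (checkpoint/, checkpoint_io/). The snippet below is a minimal usage sketch, assuming the colossalai.utils package at this revision re-exports get_current_device (from cuda.py) and MultiTimer (from timer.py); exact names and signatures may differ between versions.

    # Hedged sketch: assumes colossalai.utils exposes get_current_device and
    # MultiTimer at this revision; accessor names are assumptions, not verified API.
    import torch
    from colossalai.utils import get_current_device, MultiTimer

    device = get_current_device()               # CUDA device bound to this process, CPU fallback
    x = torch.randn(1024, 1024, device=device)

    timer = MultiTimer()                        # collection of named timers from timer.py
    timer.start("matmul")
    y = x @ x
    timer.stop("matmul", keep_in_history=True)  # keep the measurement for later inspection

    # Assumed accessors for reading back the recorded timing.
    print(timer.get_timer("matmul").get_history_mean())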