ColossalAI/colossalai

Latest commit 68dcd51d41 by Jiarui Fang: [Tensor] update ColoTensor torch_function (#822), 3 years ago
Name                  Last commit                                                                                   Age
amp/                  [hotfix] fix memory leak in zero (#781)                                                       3 years ago
builder/              [NFC] polish colossalai/builder/builder.py code style (#662)                                  3 years ago
cli/                  [cli] added check installation cli                                                            3 years ago
communication/        [util] fixed communication API depth with PyTorch 1.9 (#721)                                  3 years ago
context/              [compatibility] used backward-compatible API for global process group (#758)                  3 years ago
engine/               [refactor] moving grad acc logic to engine (#804)                                             3 years ago
gemini/               [tensor] reorganize files (#820)                                                              3 years ago
kernel/               Revert "[zero] add ZeroTensorShardStrategy (#793)" (#806)                                     3 years ago
logging/
nn/                   [TP] change the check assert in split batch 2d (#772)                                         3 years ago
registry/
tensor/               [Tensor] update ColoTensor torch_function (#822)                                              3 years ago
testing/              [test] added a decorator for address already in use error with backward compatibility (#760)  3 years ago
trainer/              [log] local throughput metrics (#811)                                                         3 years ago
utils/                [gemini] APIs to set cpu memory capacity (#809)                                               3 years ago
zero/                 [gemini] collect cpu-gpu moving volume in each iteration (#813)                                3 years ago
__init__.py
constants.py
core.py
global_variables.py
initialize.py         [gemini] APIs to set cpu memory capacity (#809)                                               3 years ago
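
For orientation, the top-level entries above fit together as follows: initialize.py exposes the launch and initialize entry points, which wrap a model, optimizer, and criterion in an Engine (engine/) that applies configured features such as AMP (amp/), ZeRO (zero/), or gradient accumulation, while logging/ provides the distributed logger. Below is a minimal sketch of that flow, assuming the v0.1-era API of this snapshot; the toy model, random data, and empty config are illustrative placeholders, not part of the repository.

    # Sketch against the assumed v0.1-era ColossalAI API; run under
    # torchrun (default NCCL backend, so a GPU setup is assumed).
    import torch
    import colossalai
    from colossalai.logging import get_dist_logger

    # Read rank/world size from the torchrun environment; an empty
    # config dict (a placeholder here) falls back to defaults.
    colossalai.launch_from_torch(config={})

    model = torch.nn.Linear(32, 4)                        # toy stand-in model
    optimizer = torch.optim.SGD(model.parameters(), lr=1e-2)
    criterion = torch.nn.CrossEntropyLoss()
    dataset = torch.utils.data.TensorDataset(
        torch.randn(64, 32), torch.randint(0, 4, (64,)))  # random toy data
    train_loader = torch.utils.data.DataLoader(dataset, batch_size=8)

    # initialize() returns an Engine plus (possibly wrapped) dataloaders;
    # the engine applies whatever features the config enables.
    engine, train_dataloader, _, _ = colossalai.initialize(
        model=model,
        optimizer=optimizer,
        criterion=criterion,
        train_dataloader=train_loader,
    )

    logger = get_dist_logger()  # distributed logger from logging/
    engine.train()
    for inputs, labels in train_dataloader:
        engine.zero_grad()
        outputs = engine(inputs)
        loss = engine.criterion(outputs, labels)
        engine.backward(loss)
        engine.step()
    logger.info(f'final batch loss: {loss.item():.4f}', ranks=[0])

Launched with, for example, torchrun --nproc_per_node=1 train.py. The trainer/ package builds a higher-level Trainer abstraction on top of this same Engine loop.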