ColossalAI/colossalai
Latest commit: Frank Lee, b862d89d00, "[doc] improved docstring in the logging module (#861)", 3 years ago
Name                  Last commit
amp                   [hotfix] fix memory leak in zero (#781), 3 years ago
builder               modefied the pp build for ckpt adaptation (#803), 3 years ago
cli                   [cli] refactored micro-benchmarking cli and added more metrics (#858), 3 years ago
communication         [doc] improved docstring in the communication module (#863), 3 years ago
context               [compatibility] used backward-compatible API for global process group (#758), 3 years ago
engine                [refactor] moving grad acc logic to engine (#804), 3 years ago
gemini                [gemini] polish code (#855), 3 years ago
kernel                Revert "[zero] add ZeroTensorShardStrategy (#793)" (#806), 3 years ago
logging               [doc] improved docstring in the logging module (#861), 3 years ago
nn                    [gemini] add GeminiMemoryManger (#832), 3 years ago
registry              [usability] added assertion message in registry (#864), 3 years ago
tensor                [tensor] an initial dea of tensor spec (#865), 3 years ago
testing               [test] added a decorator for address already in use error with backward compatibility (#760), 3 years ago
trainer               [log] local throughput metrics (#811), 3 years ago
utils                 [pipelinable]use ColoTensor to replace dummy tensor. (#853), 3 years ago
zero                  [zero] use GeminiMemoryManager when sampling model data (#850), 3 years ago
__init__.py
constants.py
core.py               [polish] polish singleton and global context (#500), 3 years ago
global_variables.py   [MOE] add unitest for MOE experts layout, gradient handler and kernel (#469), 3 years ago
initialize.py         modefied the pp build for ckpt adaptation (#803), 3 years ago
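Since the commit highlighted above (#861) touches the logging directory, here is a minimal sketch of how that module is commonly used; it assumes the get_dist_logger() helper and the ranks keyword argument, so verify both against the docstrings improved in that commit.

```python
# Minimal usage sketch for the colossalai.logging module (assumed API).
from colossalai.logging import get_dist_logger

# Obtain the process-aware logger singleton.
logger = get_dist_logger()

# Log only on rank 0 so multi-process runs do not duplicate the message
# (the ranks= keyword is assumed here; check the module's docstrings).
logger.info("training started", ranks=[0])
```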