ColossalAI/colossalai/booster

Latest commit: Hongxin Liu 5452df63c5 [plugin] torch ddp plugin supports sharded model checkpoint (#3775), 2 years ago

mixed_precision/   [amp] Add naive amp demo (#3774)                                      2 years ago
plugin/            [plugin] torch ddp plugin supports sharded model checkpoint (#3775)   2 years ago
__init__.py        [booster] implemented the torch ddp + resnet example (#3232)          2 years ago
accelerator.py     [booster] added the accelerator implementation (#3159)                2 years ago
booster.py         [doc] Fix typo under colossalai and doc (#3618)                       2 years ago