ColossalAI/colossalai/booster

Latest commit: flybird11111 46e091651b, "[shardformer] hybridparallelplugin support gradients accumulation. (#5246)", 10 months ago

mixed_precision   [npu] add npu support for hybrid plugin and llama (#5090)                    1 year ago
plugin            [shardformer] hybridparallelplugin support gradients accumulation. (#5246)   10 months ago
__init__.py
accelerator.py    [misc] update pre-commit and run all files (#4752)                            1 year ago
booster.py        [lazy] support from_pretrained (#4801)                                       1 year ago