ColossalAI/colossalai/zero/gemini
Latest commit 7303801854 by Hongxin Liu, 9 months ago: [llama] fix training and inference scripts (#5384)
Name                 Last commit                                                                                   Last updated
chunk/               [feat] refactored extension module (#5298)                                                    10 months ago
memory_tracer/       [npu] change device to accelerator api (#5239)                                                11 months ago
__init__.py          [shardformer]: support gpt-j, falcon, Mistral and add interleaved pipeline for bert (#5088)   1 year ago
gemini_ddp.py        [llama] fix training and inference scripts (#5384)                                            9 months ago
gemini_hook.py       [misc] update pre-commit and run all files (#4752)                                            1 year ago
gemini_mgr.py        [npu] add npu support for gemini and zero (#5067)                                             1 year ago
gemini_optimizer.py  [checkpointio] fix gemini and hybrid parallel optim checkpoint (#5347)                        10 months ago
placement_policy.py  [npu] change device to accelerator api (#5239)                                                11 months ago
utils.py             [npu] change device to accelerator api (#5239)                                                11 months ago
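
For orientation: this directory implements ColossalAI's Gemini, a heterogeneous memory manager for ZeRO that keeps model state in chunks and moves them between CPU and GPU (gemini_ddp.py wraps the model, gemini_optimizer.py wraps the optimizer, placement_policy.py decides chunk placement). Below is a minimal sketch of driving these pieces through ColossalAI's documented Booster/GeminiPlugin API; the toy model and tensor shapes are placeholders, and exact plugin arguments (e.g. placement_policy, precision) and the launch signature vary between releases, so treat this as an illustration rather than the repository's own example.

# Run with: torchrun --nproc_per_node=<N> this_script.py
import torch
import torch.nn as nn

import colossalai
from colossalai.booster import Booster
from colossalai.booster.plugin import GeminiPlugin


def main():
    # Initialize the distributed environment from torchrun's env vars.
    # Newer ColossalAI releases drop the `config` argument entirely.
    colossalai.launch_from_torch(config={})

    model = nn.Linear(1024, 1024)  # placeholder model for illustration
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    criterion = nn.MSELoss()

    # GeminiPlugin wires this directory's components together: the model is
    # wrapped by GeminiDDP, the optimizer by GeminiOptimizer, and chunk
    # placement follows the chosen policy ("static" or "auto").
    plugin = GeminiPlugin(placement_policy="static", precision="fp16")
    booster = Booster(plugin=plugin)
    model, optimizer, criterion, _, _ = booster.boost(model, optimizer, criterion)

    x = torch.randn(8, 1024).cuda()
    y = torch.randn(8, 1024).cuda()
    loss = criterion(model(x), y)
    # Gemini needs booster.backward(loss, optimizer) instead of loss.backward()
    # so gradients flow through the chunk manager.
    booster.backward(loss, optimizer)
    optimizer.step()
    optimizer.zero_grad()


if __name__ == "__main__":
    main()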