ColossalAI/colossalai/legacy/zero/gemini
Latest commit: 148469348a by ver217 ("Merge branch 'main' into sync/npu"), 10 months ago
Name                          Last commit                                                                                    Last updated
ophooks                       [misc] update pre-commit and run all files (#4752)                                             1 year ago
paramhooks                    [misc] update pre-commit and run all files (#4752)                                             1 year ago
__init__.py                   [shardformer]: support gpt-j, falcon, Mistral and add interleaved pipeline for bert (#5088)   1 year ago
colo_init_context.py          [shardformer]: support gpt-j, falcon, Mistral and add interleaved pipeline for bert (#5088)   1 year ago
gemini_context.py             [misc] update pre-commit and run all files (#4752)                                             1 year ago
stateful_tensor.py            [misc] update pre-commit and run all files (#4752)                                             1 year ago
stateful_tensor_mgr.py        [npu] change device to accelerator api (#5239)                                                 11 months ago
tensor_placement_policy.py    [npu] change device to accelerator api (#5239)                                                 11 months ago
tensor_utils.py               [misc] update pre-commit and run all files (#4752)                                             1 year ago