ColossalAI/colossalai/legacy/zero/gemini
Latest commit: ver217 148469348a, "Merge branch 'main' into sync/npu", 10 months ago
ophooks/
paramhooks/
__init__.py                  [shardformer]: support gpt-j, falcon, Mistral and add interleaved pipeline for bert (#5088)  1 year ago
colo_init_context.py         [shardformer]: support gpt-j, falcon, Mistral and add interleaved pipeline for bert (#5088)  1 year ago
gemini_context.py
stateful_tensor.py
stateful_tensor_mgr.py       [npu] change device to accelerator api (#5239)  11 months ago
tensor_placement_policy.py   [npu] change device to accelerator api (#5239)  11 months ago
tensor_utils.py