ColossalAI/colossalai/utils

Latest commit: 7172459e74 by Wenhao Chen, "[shardformer]: support gpt-j, falcon, Mistral and add interleaved pipeline for bert (#5088)", 1 year ago
| Name | Last commit | Last updated |
|------|-------------|--------------|
| model/ | [misc] update pre-commit and run all files (#4752) | 1 year ago |
| multi_tensor_apply/ | [misc] update pre-commit and run all files (#4752) | 1 year ago |
| rank_recorder/ | [misc] update pre-commit and run all files (#4752) | 1 year ago |
| tensor_detector/ | [misc] update pre-commit and run all files (#4752) | 1 year ago |
| __init__.py | [npu] add npu support for gemini and zero (#5067) | 1 year ago |
| common.py | [misc] update pre-commit and run all files (#4752) | 1 year ago |
| device.py | [npu] add npu support for hybrid plugin and llama (#5090) | 1 year ago |
| memory.py | [shardformer]: support gpt-j, falcon, Mistral and add interleaved pipeline for bert (#5088) | 1 year ago |
| timer.py | [npu] add npu support for gemini and zero (#5067) | 1 year ago |