ColossalAI/colossalai/legacy

Latest commit: 7172459e74 by Wenhao Chen, "[shardformer]: support gpt-j, falcon, Mistral and add interleaved pipeline for bert (#5088)", 1 year ago
Name                  Last commit                                                                                          Age
amp                   [npu] add npu support for hybrid plugin and llama (#5090)                                            1 year ago
builder               [misc] update pre-commit and run all files (#4752)                                                   1 year ago
communication         [misc] update pre-commit and run all files (#4752)                                                   1 year ago
context               [hotfix] fix torch 2.0 compatibility (#4936)                                                         1 year ago
engine                [npu] add npu support for gemini and zero (#5067)                                                    1 year ago
inference             [inference] Refactor inference architecture (#5057)                                                  1 year ago
nn                    [npu] add npu support for gemini and zero (#5067)                                                    1 year ago
pipeline              [misc] update pre-commit and run all files (#4752)                                                   1 year ago
registry              [misc] update pre-commit and run all files (#4752)                                                   1 year ago
tensor                [hotfix] fix torch 2.0 compatibility (#4936)                                                         1 year ago
trainer               [misc] update pre-commit and run all files (#4752)                                                   1 year ago
utils                 [npu] add npu support for hybrid plugin and llama (#5090)                                            1 year ago
zero                  [shardformer]: support gpt-j, falcon, Mistral and add interleaved pipeline for bert (#5088)          1 year ago
__init__.py           [bug] fix get_default_parser in examples (#4764)                                                     1 year ago
constants.py          [misc] update pre-commit and run all files (#4752)                                                   1 year ago
core.py               [misc] update pre-commit and run all files (#4752)                                                   1 year ago
global_variables.py   [misc] update pre-commit and run all files (#4752)                                                   1 year ago
initialize.py         [moe] merge moe into main (#4978)                                                                    1 year ago