Latest commit: flybird11111 · 365671be10 · fix-test (#5210) · 11 months ago
| Name | Latest commit | Age |
| --- | --- | --- |
| _C | [setup] support pre-build and jit-build of cuda kernels (#2374) | 2 years ago |
| _analyzer | [misc] update pre-commit and run all files (#4752) | 1 year ago |
| amp | [npu] add npu support for gemini and zero (#5067) | 1 year ago |
| auto_parallel | [npu] add npu support for gemini and zero (#5067) | 1 year ago |
| autochunk | [misc] update pre-commit and run all files (#4752) | 1 year ago |
| booster | fix-test (#5210) | 11 months ago |
| checkpoint_io | [pipeline,shardformer] Fix p2p efficiency in pipeline, allow skipping loading weight not in weight_map when `strict=False`, fix llama flash attention forward, add flop estimation by megatron in llama benchmark (#5017) | 1 year ago |
| cli | [bug] Fix the version check bug in colossalai run when generating the cmd. (#4713) | 1 year ago |
| cluster | fix-test (#5210) | 11 months ago |
| context | [moe] merge moe into main (#4978) | 1 year ago |
| device | [npu] add npu support for hybrid plugin and llama (#5090) | 1 year ago |
| fx | [misc] update pre-commit and run all files (#4752) | 1 year ago |
| inference | [Hotfix] Fix model policy matching strategy in ShardFormer (#5064) | 1 year ago |
| interface | [lazy] support from_pretrained (#4801) | 1 year ago |
| kernel | fix thrust-transform-reduce error (#5078) | 1 year ago |
| lazy | [doc] add lazy init docs (#4808) | 1 year ago |
| legacy | [shardformer]: support gpt-j, falcon, Mistral and add interleaved pipeline for bert (#5088) | 1 year ago |
| logging | [misc] update pre-commit and run all files (#4752) | 1 year ago |
| moe | [hotfix]: modify create_ep_hierarchical_group and add test (#5032) | 1 year ago |
| nn | [npu] add npu support for gemini and zero (#5067) | 1 year ago |
| pipeline | [pipeline]: add p2p fallback order and fix interleaved pp deadlock (#5214) | 11 months ago |
| shardformer | support linear accumulation fusion (#5199) | 11 months ago |
| tensor | fix (#5158) | 12 months ago |
| testing | [npu] add npu support for hybrid plugin and llama (#5090) | 1 year ago |
| utils | [shardformer]: support gpt-j, falcon, Mistral and add interleaved pipeline for bert (#5088) | 1 year ago |
| zero | [shardformer]: support gpt-j, falcon, Mistral and add interleaved pipeline for bert (#5088) | 1 year ago |
| __init__.py | [misc] update pre-commit and run all files (#4752) | 1 year ago |
| initialize.py | [npu] add npu support for gemini and zero (#5067) | 1 year ago |