ColossalAI/tests/test_shardformer/test_model

Latest commit: 79718fae04 by flybird11111
[shardformer] llama support DistCrossEntropy (#5176)
* fix

* test ci

* fix ci

* llama support dist-cross

* [Colossal-Llama-2] Add finetuning Colossal-Llama-2 example (#4878)

* Add finetuning Colossal-Llama-2 example

* Add finetuning Colossal-Llama-2 example 2

* Add finetuning Colossal-Llama-2 example and support NEFTuning

* Add inference example and refine neftune

* Modify readme file

* update the imports

---------

Co-authored-by: Xu Yuanchen <yuanchen.xu00@gmail.com>
Co-authored-by: Camille Zhong <44392324+Camille7777@users.noreply.github.com>


Co-authored-by: Yuanchen <70520919+chengeharrison@users.noreply.github.com>
Co-authored-by: Xu Yuanchen <yuanchen.xu00@gmail.com>
Co-authored-by: Camille Zhong <44392324+Camille7777@users.noreply.github.com>
2023-12-13 01:39:14 +08:00
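The commit above adds DistCrossEntropy, a cross-entropy loss computed over a vocabulary that is sharded across tensor-parallel ranks, so that no rank ever materializes the full logits. The sketch below is only an illustration of the underlying idea in plain Python: the shard list stands in for per-rank tensors, the reductions stand in for all-reduce, and the function names are ours, not ColossalAI's API.

```python
import math

def cross_entropy(logits, target):
    # Reference loss on the full (unsharded) logits:
    # loss = logsumexp(logits) - logits[target]
    m = max(logits)  # subtract the max for numerical stability
    lse = m + math.log(sum(math.exp(x - m) for x in logits))
    return lse - logits[target]

def dist_cross_entropy(shards, target):
    """Vocab-parallel cross entropy (simulated).

    Each entry of `shards` is the contiguous slice of vocab logits
    held by one rank. The three reductions below model three
    all-reduce steps in a real tensor-parallel implementation.
    """
    # 1) all-reduce(MAX): global max for numerical stability
    m = max(max(s) for s in shards)
    # 2) all-reduce(SUM): each rank sums exp(logit - m) over its shard
    denom = sum(sum(math.exp(x - m) for x in s) for s in shards)
    # 3) only the rank whose vocab slice contains `target`
    #    contributes the target logit (others contribute nothing)
    offset = 0
    target_logit = None
    for s in shards:
        if offset <= target < offset + len(s):
            target_logit = s[target - offset]
        offset += len(s)
    # loss = logsumexp(all logits) - target logit
    return m + math.log(denom) - target_logit
```

Splitting a logits vector into shards and comparing against the unsharded reference gives the same loss, which is the property such a test file would exercise against the single-device model.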
__init__.py [shardformer] adapted T5 and LLaMa test to use kit (#4049) 2023-07-04 16:05:01 +08:00
_utils.py [hotfix] fix torch 2.0 compatibility (#4936) 2023-10-18 11:05:25 +08:00
test_shard_bert.py [hotfix] Add layer norm gradients all-reduce for sequence parallel (#4926) 2023-11-03 13:32:43 +08:00
test_shard_blip2.py [misc] update pre-commit and run all files (#4752) 2023-09-19 14:20:26 +08:00
test_shard_bloom.py [hotfix] Add layer norm gradients all-reduce for sequence parallel (#4926) 2023-11-03 13:32:43 +08:00
test_shard_chatglm2.py [hotfix] Add layer norm gradients all-reduce for sequence parallel (#4926) 2023-11-03 13:32:43 +08:00
test_shard_falcon.py [shardformer]: support gpt-j, falcon, Mistral and add interleaved pipeline for bert (#5088) 2023-11-28 16:54:42 +08:00
test_shard_gpt2.py [hotfix] Add layer norm gradients all-reduce for sequence parallel (#4926) 2023-11-03 13:32:43 +08:00
test_shard_gptj.py [shardformer] llama support DistCrossEntropy (#5176) 2023-12-13 01:39:14 +08:00
test_shard_llama.py [misc] update pre-commit and run all files (#4752) 2023-09-19 14:20:26 +08:00
test_shard_mistral.py [shardformer]: support gpt-j, falcon, Mistral and add interleaved pipeline for bert (#5088) 2023-11-28 16:54:42 +08:00
test_shard_opt.py [misc] update pre-commit and run all files (#4752) 2023-09-19 14:20:26 +08:00
test_shard_sam.py [misc] update pre-commit and run all files (#4752) 2023-09-19 14:20:26 +08:00
test_shard_t5.py [misc] update pre-commit and run all files (#4752) 2023-09-19 14:20:26 +08:00
test_shard_vit.py [hotfix] fix torch 2.0 compatibility (#4936) 2023-10-18 11:05:25 +08:00
test_shard_whisper.py [misc] update pre-commit and run all files (#4752) 2023-09-19 14:20:26 +08:00