ColossalAI/tests/test_shardformer
flybird11111 7486ed7d3a
[shardformer] update llama2/opt finetune example and fix llama2 policy (#4645)
* [shardformer] update shardformer readme

* [shardformer] update llama2/opt finetune example and shardformer update to llama2

* [shardformer] change dataset

* [shardformer] fix CI

* [shardformer] fix

* [example] update opt example

* [example] resolve comments
2023-09-09 22:45:36 +08:00
test_layer              [shardformer] Add overlap support for gpt2 (#4535)                              2023-08-29 18:30:50 +08:00
test_model              [shardformer] update llama2/opt finetune example and fix llama2 policy (#4645)  2023-09-09 22:45:36 +08:00
__init__.py             [shardformer] adapted T5 and LLaMa test to use kit (#4049)                      2023-07-04 16:05:01 +08:00
test_shard_utils.py     [test] add shard util tests                                                     2023-08-15 23:25:14 +08:00
test_with_torch_ddp.py  [shardformer] support lazy init (#4202)                                         2023-08-15 23:25:14 +08:00