ColossalAI/tests/test_shardformer
Baizhou Zhang ed4c448488 [pipeline] rewrite t5 tests & support multi-tensor transmitting in pipeline (#4388)
* fix remaining t5 bugs / rewrite t5 tests
* fix multi-tensor communication in pipeline
* rearrange test_config
* fix KeyError in sync_shared_params
* fix get_held_layers & Randomizer, complete t5 tests
* remove debug print statements
* fix get_held_layers by modifying _release_unheld_layers
* fix _get_recursive_held_layers bug
2023-08-15 23:25:14 +08:00
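The headline change above is support for transmitting multiple tensors between pipeline stages. As a rough, generic illustration of that idea (this is not ColossalAI's p2p implementation; the helper names `send_tensor_list` / `recv_tensor_list` are made up for this sketch), one stage can first ship pickled metadata describing the tensors and then send each buffer in turn:

```python
# Generic sketch of multi-tensor point-to-point transfer between two ranks,
# assuming a CPU/Gloo process group. Not ColossalAI's actual p2p API.
import pickle

import torch
import torch.distributed as dist


def send_tensor_list(tensors, dst):
    # Ship shapes/dtypes first so the receiver can pre-allocate buffers.
    meta = pickle.dumps([(tuple(t.shape), t.dtype) for t in tensors])
    header = torch.frombuffer(bytearray(meta), dtype=torch.uint8)
    dist.send(torch.tensor([header.numel()], dtype=torch.long), dst=dst)
    dist.send(header, dst=dst)
    for t in tensors:
        dist.send(t.contiguous(), dst=dst)


def recv_tensor_list(src):
    # Receive the metadata header, then each tensor in the same order.
    length = torch.empty(1, dtype=torch.long)
    dist.recv(length, src=src)
    header = torch.empty(int(length.item()), dtype=torch.uint8)
    dist.recv(header, src=src)
    meta = pickle.loads(bytes(header.tolist()))
    received = []
    for shape, dtype in meta:
        buf = torch.empty(shape, dtype=dtype)
        dist.recv(buf, src=src)
        received.append(buf)
    return received
```

The receiver gets the buffers back in send order, which is enough to reassemble a stage's multi-output activations on the next stage.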
test_layer update some modules with the new API 2023-08-15 23:25:14 +08:00
test_model [pipeline] rewrite t5 tests & support multi-tensor transmitting in pipeline (#4388) 2023-08-15 23:25:14 +08:00
__init__.py [shardformer] adapted T5 and LLaMa test to use kit (#4049) 2023-07-04 16:05:01 +08:00
test_shard_utils.py [test] add shard util tests 2023-08-15 23:25:14 +08:00
test_with_torch_ddp.py [shardformer] support lazy init (#4202) 2023-08-15 23:25:14 +08:00
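For orientation, `test_with_torch_ddp.py` checks that a shardformer-processed model still trains under `torch.nn.parallel.DistributedDataParallel`. A minimal sketch of that DDP-wrapping pattern with a stand-in `nn.Linear` (plain PyTorch only, not ColossalAI's test kit or its lazy-init path) looks roughly like:

```python
# Minimal sketch: wrap a (stand-in) sharded model in torch DDP and take one
# optimizer step. Assumes a Gloo backend and launch via torchrun.
import torch
import torch.distributed as dist
import torch.nn as nn
from torch.nn.parallel import DistributedDataParallel as DDP


def main():
    dist.init_process_group(backend="gloo")   # env:// rendezvous from torchrun
    model = nn.Linear(16, 16)                 # stand-in for a sharded submodule
    ddp_model = DDP(model)                    # syncs gradients across ranks
    opt = torch.optim.SGD(ddp_model.parameters(), lr=1e-3)

    x = torch.randn(4, 16)
    loss = ddp_model(x).sum()
    loss.backward()                           # gradient all-reduce happens here
    opt.step()
    dist.destroy_process_group()


if __name__ == "__main__":
    main()
```

Run with, for example, `torchrun --nproc_per_node=2 sketch.py`.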