ColossalAI/tests/test_shardformer
Jianghai d0807122e2 [pipeline] test pure pipeline process using llama (#4218)
* bloom policy

* llama pipeline forward and tests

* fix the output and attention_mask

* fix name

* bind argument to policy

* Revert "bloom policy"

This reverts commit 8dee68a0a2.

This policy should be reverted and copied to feature/bloom

* revert the bloom changes

* remove unneeded inputs

* gpt

* finish llama

* causal lm and sequence classification

* revision

* add pure pipeline test

* fixed version

* fixed version

* pure pipeline
2023-08-15 23:25:14 +08:00
Name                      Last commit                                                     Date
test_layer/               [shardformer] support inplace sharding (#4251)                  2023-08-15 23:25:14 +08:00
test_model/               [pipeline] test pure pipeline process using llama (#4218)       2023-08-15 23:25:14 +08:00
__init__.py               [shardformer] adapted T5 and LLaMa test to use kit (#4049)      2023-07-04 16:05:01 +08:00
test_shard_utils.py       [test] add shard util tests                                     2023-08-15 23:25:14 +08:00
test_with_torch_ddp.py    [shardformer] support lazy init (#4202)                         2023-08-15 23:25:14 +08:00
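
The entries above are ordinary pytest modules, so a minimal sketch of exercising this directory might look like the following (assumptions: pytest is installed, the command is run from the ColossalAI repository root, and the distributed cases spawn their own worker processes as these test modules typically do; the path is taken directly from the listing above).

    # Minimal sketch (assumption: run from the ColossalAI repository root with
    # pytest installed; distributed tests are assumed to manage their own workers).
    import pytest

    if __name__ == "__main__":
        # -s surfaces worker stdout; the path matches the directory listed above.
        exit_code = pytest.main(["-s", "tests/test_shardformer"])
        raise SystemExit(exit_code)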