ColossalAI/tests/test_shardformer/test_model
Hongxin Liu 890774b2fb [shardformer] support lazy init (#4202) 2023-08-15 23:25:14 +08:00
* [shardformer] support lazy init
* [shardformer] linear support lazy init
* [shardformer] embedding support lazy init
* [shardformer] norm support lazy init
* [shardformer] fused linear support lazy init
* [test] update shardformer test layer
* [test] shardformer with lazy init fit ddp
* [lazy] hotfix deepcopy of param
* [shardformer] fix bert policy and update test
* [shardformer] fix bloom policy and update test
* [shardformer] fix opt policy and update test
* [shardformer] fix t5 policy and update test
* [shardformer] fix gpt2 policy and update test
* [shardformer] fix llama policy and update test
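For context, the lazy-init work in #4202 defers parameter allocation so that weights can be materialized after the model has been sharded rather than allocated in full up front. The commit titles refer to ColossalAI's own LazyInitContext; the snippet below is only a minimal PyTorch-only sketch of the underlying idea (meta-device construction followed by materialization), not code from this repository:

```python
import torch
import torch.nn as nn

# Build the model on the meta device: parameter shapes and module structure
# are recorded, but no real storage is allocated yet.
with torch.device("meta"):
    model = nn.Sequential(nn.Linear(1024, 4096), nn.GELU(), nn.Linear(4096, 1024))

# ... a sharding pass could now rewrite or split layers cheaply,
# since no weights have been allocated ...

# Materialize afterwards: allocate real storage, then re-run each
# layer's own initializer to fill it with valid values.
model = model.to_empty(device="cpu")
for module in model.modules():
    if hasattr(module, "reset_parameters"):
        module.reset_parameters()

print(sum(p.numel() for p in model.parameters()))  # parameters now hold real data
```

The per-model tests in this directory exercise the corresponding lazy-init path for BERT, BLOOM, GPT-2, LLaMA, OPT, and T5, matching the policy fixes listed in the commit message.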
__init__.py [shardformer] adapted T5 and LLaMa test to use kit (#4049) 2023-07-04 16:05:01 +08:00
_utils.py [shardformer] support lazy init (#4202) 2023-08-15 23:25:14 +08:00
test_shard_bert.py [shardformer] support lazy init (#4202) 2023-08-15 23:25:14 +08:00
test_shard_bloom.py [shardformer] support lazy init (#4202) 2023-08-15 23:25:14 +08:00
test_shard_gpt2.py [shardformer] support lazy init (#4202) 2023-08-15 23:25:14 +08:00
test_shard_llama.py [shardformer] support lazy init (#4202) 2023-08-15 23:25:14 +08:00
test_shard_opt.py [shardformer] support lazy init (#4202) 2023-08-15 23:25:14 +08:00
test_shard_t5.py [shardformer] support lazy init (#4202) 2023-08-15 23:25:14 +08:00
test_shard_vit.py [shardformer] added embedding gradient check (#4124) 2023-07-04 16:05:01 +08:00
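The per-model tests above share helpers from _utils.py, whose contents are not shown here. As a rough, hypothetical outline only (the function names and structure are assumptions, not the repository's actual test kit), a multi-rank shard test typically spawns one process per rank and runs the sharded forward/backward check inside each:

```python
# Hypothetical outline of a spawn-based shard test; not the repository's code.
import torch.distributed as dist
import torch.multiprocessing as mp


def _run_shard_test(rank: int, world_size: int, port: int) -> None:
    # Each spawned process joins the same process group, mimicking the
    # multi-rank setup that sharded forward/backward checks require.
    dist.init_process_group(
        backend="gloo",
        init_method=f"tcp://127.0.0.1:{port}",
        rank=rank,
        world_size=world_size,
    )
    # ... build the reference and lazily initialized models, shard the latter,
    # then compare outputs and gradients across ranks ...
    dist.barrier()
    dist.destroy_process_group()


def test_shard_model_sketch():
    world_size = 2
    mp.spawn(_run_shard_test, args=(world_size, 29500), nprocs=world_size)


if __name__ == "__main__":
    test_shard_model_sketch()
```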