ColossalAI/tests/test_shardformer
Jianghai a88e92251d [pipeline] add chatglm (#4363)
* add pipeline policy and bert forward to be done

* add bertmodel pipeline forward and make tests

* add Bert_Policy and test for policy

* update formatting

* update formatting

* update the code

* fix bugs

* fix name conflict

* add bloom model and policy, revise the base class of policy

* revise

* revision

* add bert_for_pretraining

* add bert_for_pretraining forward and policy

* fix typos

* cancel warning

* change the immediate output to default dict

* change the default output of get_shared_params

* add chatglm

* add

* chatglm

* chatglm

* finish chatglm

* deletes

* fix rmsnorm

* chatglm

* fix chatglm shard

* init
2023-08-15 23:25:14 +08:00
test_layer update some module with new api version 2023-08-15 23:25:14 +08:00
test_model [pipeline] add chatglm (#4363) 2023-08-15 23:25:14 +08:00
__init__.py [shardformer] adapted T5 and LLaMa test to use kit (#4049) 2023-07-04 16:05:01 +08:00
test_shard_utils.py [test] add shard util tests 2023-08-15 23:25:14 +08:00
test_with_torch_ddp.py [shardformer] support lazy init (#4202) 2023-08-15 23:25:14 +08:00