mirror of https://github.com/hpcaitech/ColossalAI
890774b2fb
* [shardformer] support lazy init
* [shardformer] linear support lazy init
* [shardformer] embedding support lazy init
* [shardformer] norm support lazy init
* [shardformer] fused linear support lazy init
* [test] update shardformer test layer
* [test] shardformer with lazy init fit ddp
* [lazy] hotfix deepcopy of param
* [shardformer] fix bert policy and update test
* [shardformer] fix bloom policy and update test
* [shardformer] fix opt policy and update test
* [shardformer] fix t5 policy and update test
* [shardformer] fix gpt2 policy and update test
* [shardformer] fix llama policy and update test
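The commit above adds lazy initialization support to the shardformer layers, so that full-size weights need not be allocated before a model is sharded. Below is a minimal sketch of that general flow, assuming a working `colossalai` installation; the toy `nn.Sequential` model, its dimensions, and the placement of the sharding step are illustrative assumptions, not code taken from this repository's tests.

```python
# Hedged sketch of lazy init as referred to in the commit message above.
# Assumption: colossalai provides LazyInitContext under colossalai.lazy.
import torch.nn as nn

from colossalai.lazy import LazyInitContext

# Build the model inside the lazy context: parameters are created lazily
# instead of being allocated immediately, so no full-size weights exist yet.
with LazyInitContext():
    model = nn.Sequential(          # hypothetical toy model for illustration
        nn.Linear(1024, 4096),
        nn.GELU(),
        nn.Linear(4096, 1024),
    )

# In the shardformer tests, sharding would happen at this point, so that each
# rank only ever materializes its own slice of every parameter.

# Materialize the remaining lazy parameters (allocate and initialize them).
model = LazyInitContext.materialize(model)
```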
test_dist_crossentropy.py
test_dropout.py
test_embedding.py
test_layernorm.py
test_linear_1d.py
test_qkv_fused_linear_1d.py
test_vocab_parallel_embedding_1d.py