ColossalAI/colossalai/shardformer/layer

Latest commit 7c8be77081 by Bin Jia, 1 year ago:
[shardformer/sequence parallel] support gpt2 seq parallel with pp/dp/tp (#4460)
File                  Last commit message                                                                Last change
__init__.py           [shardformer] fix import                                                           1 year ago
_operation.py         [shardformer/sequence parallel] support gpt2 seq parallel with pp/dp/tp (#4460)    1 year ago
dropout.py            [shardformer] supported bloom model (#4098)                                        1 year ago
embedding.py          [shardformer] fix embedding                                                        1 year ago
linear.py             [shardformer/sequence parallel] Cherry pick commit to new branch (#4450)           1 year ago
loss.py               fix some typo colossalai/shardformer (#4160)                                       1 year ago
normalization.py      [shardformer] support inplace sharding (#4251)                                     1 year ago
parallel_module.py    [shardformer] supported fused qkv checkpoint (#4073)                               1 year ago
qkv_fused_linear.py   [shardformer/sequence parallel] Cherry pick commit to new branch (#4450)           1 year ago
utils.py              [misc] resolve code factor issues (#4433)                                          1 year ago
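
The files above provide the tensor-sharded building blocks (linear, embedding, dropout, normalization, fused QKV, loss) that shardformer substitutes into models. For orientation, here is a minimal plain-PyTorch sketch of the column-parallel idea behind linear.py and qkv_fused_linear.py: each rank holds a slice of the output features and an optional all-gather reassembles the full activation. The class name, constructor, and gather_output flag are illustrative assumptions for this sketch, not ColossalAI's actual API.

```python
# Illustrative sketch of column-parallel linear; NOT ColossalAI's real API.
import torch
import torch.nn as nn
import torch.distributed as dist


class ColumnParallelLinear(nn.Module):
    """Each rank owns out_features // world_size output columns.

    Forward computes the local slice y_local = x @ W_local^T + b_local;
    with gather_output=True, an all-gather reconstructs the full output.
    """

    def __init__(self, in_features: int, out_features: int, gather_output: bool = True):
        super().__init__()
        world_size = dist.get_world_size()
        assert out_features % world_size == 0, "out_features must divide world size"
        self.out_per_rank = out_features // world_size
        self.linear = nn.Linear(in_features, self.out_per_rank)
        self.gather_output = gather_output

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        y_local = self.linear(x)  # shape (..., out_per_rank)
        if not self.gather_output:
            return y_local
        # Collect every rank's slice and concatenate along the feature dim.
        parts = [torch.empty_like(y_local) for _ in range(dist.get_world_size())]
        dist.all_gather(parts, y_local)
        return torch.cat(parts, dim=-1)  # shape (..., out_features)
```

A row-parallel counterpart splits in_features instead and finishes with an all-reduce rather than an all-gather; pairing the two (column-parallel then row-parallel) keeps communication to one collective per pair of layers, which is the standard tensor-parallel layout these files implement for transformer blocks.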