ColossalAI/colossalai/nn/layer

Latest commit: Frank Lee · 015af592f8 · [shardformer] integrated linear 1D with dtensor (#3996) · 1 year ago
Name                Last commit                                                                                  Age
colossalai_layer    fixed using zero with tp cannot access weight correctly                                     2 years ago
moe                 [doc] Fix typo under colossalai and doc(#3618)                                              2 years ago
parallel_1d         [shardformer] add Dropout layer support different dropout pattern (#3856)                   1 year ago
parallel_2d         [utils] refactor parallel layers checkpoint and bcast model on loading checkpoint (#1548)   2 years ago
parallel_2p5d       [utils] refactor parallel layers checkpoint and bcast model on loading checkpoint (#1548)   2 years ago
parallel_3d         improved allgather & reducescatter for 3d                                                   2 years ago
parallel_sequence   [nfc] fix typo colossalai/nn (#3887)                                                        2 years ago
utils
vanilla             added skip_bias_add for non-tp linear                                                       2 years ago
wrapper             [NFC] polish colossalai/nn/layer/wrapper/pipeline_wrapper.py code style (#1303)             2 years ago
__init__.py
base_layer.py       [shardformer] integrated linear 1D with dtensor (#3996)                                     1 year ago