Commit Graph

2539 Commits (fee32a3b785327d63ff9cdaea4451b4cfe071d2a)

Author SHA1 Message Date
Frank Lee fee32a3b78 [docker] added ssh and rdma support for docker (#4192)
1 year ago
Frank Lee 190a6ea9c2 [dtensor] fixed readme file name and removed deprecated file (#4162)
1 year ago
Frank Lee cc3cbe9f6f [workflow] show test duration (#4159)
1 year ago
Hongxin Liu 1908caad38 [cli] hotfix launch command for multi-nodes (#4165)
1 year ago
digger yu 2ac24040eb fix some typo colossalai/shardformer (#4160)
1 year ago
github-actions[bot] c77b3b19be [format] applied code formatting on changed files in pull request 4152 (#4157)
1 year ago
Frank Lee f447ca1811 [chat] removed cache file (#4155)
1 year ago
Frank Lee 89f45eda5a [shardformer] added development protocol for standardization (#4149)
1 year ago
Frank Lee 1fb0d95df0 [shardformer] made tensor parallelism configurable (#4144)
1 year ago
Frank Lee 74257cb446 [shardformer] refactored some doc and api (#4137)
1 year ago
jiangmingyan 7f9b30335b [shardformer] write an shardformer example with bert finetuning (#4126)
1 year ago
Frank Lee ae035d305d [shardformer] added embedding gradient check (#4124)
1 year ago
Frank Lee 44a190e6ac [shardformer] import huggingface implicitly (#4101)
1 year ago
Frank Lee 6a88bae4ec [shardformer] integrate with data parallelism (#4103)
1 year ago
Frank Lee f3b6aaa6b7 [shardformer] supported fused normalization (#4112)
1 year ago
Frank Lee b1c2901530 [shardformer] supported bloom model (#4098)
1 year ago
Kun Lin 8af29ee47a [shardformer] support vision transformer (#4096)
1 year ago
jiangmingyan ac80937138 [shardformer] shardformer support opt models (#4091)
1 year ago
Frank Lee d33a44e8c3 [shardformer] refactored layernorm (#4086)
1 year ago
Frank Lee c4b1b65931 [test] fixed tests failed due to dtensor change (#4082)
1 year ago
FoolPlayer 92f6791095 [shardformer] Add layernorm (#4072)
1 year ago
Frank Lee 70c58cfd4f [shardformer] supported fused qkv checkpoint (#4073)
1 year ago
FoolPlayer 0803a61412 [shardformer] add linearconv1d test (#4067)
1 year ago
Frank Lee 8eb09a4c69 [shardformer] support module saving and loading (#4062)
1 year ago
FoolPlayer 7740c55c55 support kit use for bert/gpt test (#4055)
1 year ago
Frank Lee f22ddacef0 [shardformer] refactored the shardformer layer structure (#4053)
1 year ago
Frank Lee 58df720570 [shardformer] adapted T5 and LLaMa test to use kit (#4049)
1 year ago
FoolPlayer 4021b9a8a2 [shardformer] add gpt2 test and layer class refactor (#4041)
1 year ago
Frank Lee d857f3dbba [shardformer] supported T5 and its variants (#4045)
1 year ago
Frank Lee c1d5453e9f [shardformer] adapted llama to the new API (#4036)
1 year ago
FoolPlayer 74d176c8d8 [shardformer] fix bert and gpt downstream with new api (#4024)
1 year ago
Frank Lee e253a07007 [shardformer] updated doc (#4016)
1 year ago
FoolPlayer df018fc305 support bert with new api
1 year ago
FoolPlayer 507c0ad368 add vocabembedding layer
1 year ago
Frank Lee 45d9384346 [shardformer] removed inplace tensor sharding (#4018)
1 year ago
Frank Lee 3893fa1a8d [shardformer] refactored embedding and dropout to parallel module (#4013)
1 year ago
FoolPlayer dfca9678fa integrate with dist layer (#4011)
1 year ago
Frank Lee 015af592f8 [shardformer] integrated linear 1D with dtensor (#3996)
1 year ago
FoolPlayer d3bc530849 [shardformer] Refactor shardformer api (#4001)
1 year ago
Frank Lee 611971248c [device] support init device mesh from process group (#3990)
1 year ago
FoolPlayer a2f9af810d [shardformer] fix an error in readme (#3988)
1 year ago
FoolPlayer f7774ec0f3 [Shardformer] Downstream bert (#3979)
1 year ago
wukong1992 c1c672d0f0 [shardformer] shardformer support t5 model (#3994)
1 year ago
wukong1992 6b30dfb7ce [shardformer] support llama model using shardformer (#3969)
1 year ago
FoolPlayer 45927d5527 [shardformer] Add dropout layer in shard model and refactor policy api (#3949)
1 year ago
FoolPlayer a73130482d [shardformer] Unit test (#3928)
1 year ago
FoolPlayer f1cb5ac6bf [shardformer] Align bert value (#3907)
1 year ago
FoolPlayer 79f8d5d54b [shardformer] add gpt2 policy and modify shard and slicer to support (#3883)
1 year ago
FoolPlayer 70173e3123 update README (#3909)
1 year ago
FoolPlayer ab8a47f830 [shardformer] add Dropout layer support different dropout pattern (#3856)
1 year ago