Commit Graph

182 Commits (3af13a2c3e917ce23d44050cfeea71cfa9f23e81)

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| jiaruifang | 7977422aeb | add bert for unitest and sharded model is not able to pass the bert case | 3 years ago |
| Frank Lee | 3d5d64bd10 | refactored grad scaler (#338) | 3 years ago |
| Frank Lee | 6a3188167c | set criterion as optional in colossalai initialize (#336) | 3 years ago |
| Jie Zhu | 3213554cc2 | [profiler] add adaptive sampling to memory profiler (#330) | 3 years ago |
| ver217 | 1388671699 | [zero] Update sharded model v2 using sharded param v2 (#323) | 3 years ago |
| jiaruifang | 799d105bb4 | using pytest parametrize | 3 years ago |
| jiaruifang | dec24561cf | show pytest parameterize | 3 years ago |
| Jiarui Fang | 11bddb6e55 | [zero] update zero context init with the updated test utils (#327) | 3 years ago |
| Frank Lee | 6268446b81 | [test] refactored testing components (#324) | 3 years ago |
| HELSON | 4f26fabe4f | fixed strings in profiler outputs (#325) | 3 years ago |
| Jiarui Fang | de0468c7a8 | [zero] zero init context (#321) | 3 years ago |
| 1SAA | 73bff11288 | Added profiler communication operations | 3 years ago |
| binmakeswell | d275b98b7d | add badge and contributor list | 3 years ago |
| LuGY | a3269de5c9 | [zero] cpu adam kernel (#288) | 3 years ago |
| Jiarui Fang | 90d3aef62c | [zero] yet an improved sharded param (#311) | 3 years ago |
| Jiarui Fang | c9e7d9582d | [zero] polish shard strategy (#310) | 3 years ago |
| ver217 | 3092317b80 | polish code | 3 years ago |
| ver217 | 36f9a74ab2 | fix sharded param hook and unit test | 3 years ago |
| ver217 | 001ca624dd | impl shard optim v2 and add unit test | 3 years ago |
| Jiarui Fang | 74f77e314b | [zero] a shard strategy in granularity of tensor (#307) | 3 years ago |
| Jiarui Fang | 80364c7686 | [zero] sharded tensor (#305) | 3 years ago |
| Jie Zhu | d344689274 | [profiler] primary memory tracer | 3 years ago |
| FrankLeeeee | dfc3fafe89 | update unit testing CI rules | 3 years ago |
| FrankLeeeee | bbbfe9b2c9 | added compatibility CI and options for release ci | 3 years ago |
| FrankLeeeee | 115bcc0b41 | added pypi publication CI and remove formatting CI | 3 years ago |
| ver217 | b105371ace | rename shared adam to sharded optim v2 | 3 years ago |
| ver217 | 70814dc22f | fix master params dtype | 3 years ago |
| ver217 | 795210dd99 | add fp32 master params in sharded adam | 3 years ago |
| ver217 | a109225bc2 | add sharded adam | 3 years ago |
| Jiarui Fang | 8f74fbd9c9 | polish license (#300) | 3 years ago |
| Jiarui Fang | e17e92c54d | Polish sharded parameter (#297) | 3 years ago |
| ver217 | 7aef75ca42 | [zero] add sharded grad and refactor grad hooks for ShardedModel (#287) | 3 years ago |
| Frank Lee | 9afb5c8b2d | fixed typo in ShardParam (#294) | 3 years ago |
| Frank Lee | 27155b8513 | added unit test for sharded optimizer (#293) | 3 years ago |
| Frank Lee | e17e54e32a | added buffer sync to naive amp model wrapper (#291) | 3 years ago |
| Jiarui Fang | 8d653af408 | add a common util for hooks registered on parameter. (#292) | 3 years ago |
| Jie Zhu | f867365aba | bug fix: pass hook_list to engine (#273) | 3 years ago |
| Jiarui Fang | 5a560a060a | Feature/zero (#279) | 3 years ago |
| binmakeswell | 08eccfe681 | add community group and update issue template (#271) | 3 years ago |
| Sze-qq | 3312d716a0 | update experimental visualization (#253) | 3 years ago |
| binmakeswell | 753035edd3 | add Chinese README | 3 years ago |
| 1SAA | 82023779bb | Added TPExpert for special situation | 3 years ago |
| HELSON | 36b8477228 | Fixed parameter initialization in FFNExpert (#251) | 3 years ago |
| アマデウス | e13293bb4c | fixed CI dataset directory; fixed import error of 2.5d accuracy (#255) | 3 years ago |
| 1SAA | 219df6e685 | Optimized MoE layer and fixed some bugs | 3 years ago |
| zbian | 3dba070580 | fixed padding index issue for vocab parallel embedding layers; updated 3D linear to be compatible with examples in the tutorial | 3 years ago |
| ver217 | 24f8583cc4 | update setup info (#233) | 3 years ago |
| github-actions | b9f8521f8c | Automated submodule synchronization | 3 years ago |
| Frank Lee | f5ca88ec97 | fixed apex import (#227) | 3 years ago |
| Frank Lee | eb3fda4c28 | updated readme and change log (#224) | 3 years ago |