Commit Graph

25 Commits (055fbf5be680dfde20be1c51302f3c8b154a93e4)

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| アマデウス | 297b8baae2 | [model checkpoint] add gloo groups for cpu tensor communication (#589) | 3 years ago |
| Liang Bowen | 2c45efc398 | html refactor (#555) | 3 years ago |
| Liang Bowen | ec5086c49c | Refactored docstring to google style | 3 years ago |
| Jiarui Fang | a445e118cf | [polish] polish singleton and global context (#500) | 3 years ago |
| HELSON | f24b5ed201 | [MOE] remove old MoE legacy (#493) | 3 years ago |
| Jiarui Fang | 65c0f380c2 | [format] polish name format for MOE (#481) | 3 years ago |
| HELSON | 7544347145 | [MOE] add unitest for MOE experts layout, gradient handler and kernel (#469) | 3 years ago |
| HELSON | 84fd7c1d4d | add moe context, moe utilities and refactor gradient handler (#455) | 3 years ago |
| Frank Lee | b72b8445c6 | optimized context test time consumption (#446) | 3 years ago |
| Frank Lee | 1e4bf85cdb | fixed bug in activation checkpointing test (#387) | 3 years ago |
| RichardoLuo | 8539898ec6 | flake8 style change (#363) | 3 years ago |
| ziyu huang | a77d73f22b | fix format parallel_context.py (#359) | 3 years ago |
| Maruyama_Aya | e83970e3dc | fix format ColossalAI\colossalai\context\process_group_initializer | 3 years ago |
| アマデウス | 9ee197d0e9 | moved env variables to global variables; (#215) | 3 years ago |
| HELSON | 0f8c7f9804 | Fixed docstring in colossalai (#171) | 3 years ago |
| Frank Lee | e2089c5c15 | adapted for sequence parallel (#163) | 3 years ago |
| HELSON | dceae85195 | Added MoE parallel (#127) | 3 years ago |
| ver217 | a951bc6089 | update default logger (#100) (#101) | 3 years ago |
| ver217 | 96780e6ee4 | Optimize pipeline schedule (#94) | 3 years ago |
| アマデウス | 01a80cd86d | Hotfix/Colossalai layers (#92) | 3 years ago |
| アマデウス | 0fedef4f3c | Layer integration (#83) | 3 years ago |
| ver217 | 8f02a88db2 | add interleaved pipeline, fix naive amp and update pipeline model initializer (#80) | 3 years ago |
| Frank Lee | 35813ed3c4 | update examples and sphnix docs for the new api (#63) | 3 years ago |
| Frank Lee | da01c234e1 | Develop/experiments (#59) | 3 years ago |
| zbian | 404ecbdcc6 | Migrated project | 3 years ago |