Commit Graph

19 Commits (07cb21142fc1daaf1a402f827721d3fdeb56d075)

Author | SHA1 | Message | Date
ver217 | ae71036cd2 | [utils] refactor parallel layers checkpoint and bcast model on loading checkpoint (#1548) | 2 years ago
Zirui Zhu | 598cde4a0f | [NFC] polish colossalai/nn/layer/parallel_2p5d/layers.py code style (#972) | 3 years ago
ver217 | 58580b50fe | Revert "[NFC] Hotfix/format (#984)" (#986) | 3 years ago
binmakeswell | 0772828fba | [NFC] Hotfix/format (#984) | 3 years ago
Ziyue Jiang | 4b01da24cd | [TP] change the check assert in split batch 2d (#772) | 3 years ago
アマデウス | b8899e0905 | [TP] allow layernorm without bias (#750) | 3 years ago
Frank Lee | eda30a058e | [compatibility] fixed tensor parallel compatibility with torch 1.9 (#700) | 3 years ago
Liang Bowen | 828e465622 | [hotfix] Raise messages for indivisible batch sizes with tensor parallelism (#622) | 3 years ago
アマデウス | 93089ed708 | [model checkpoint] updated saving/loading for 2.5d layers (#596) | 3 years ago
Liang Bowen | ec5086c49c | Refactored docstring to google style | 3 years ago
Yuer867 | 4a0f8c2c50 | fix format parallel_2p5d (#357) | 3 years ago
zbian | 3dba070580 | fixed padding index issue for vocab parallel embedding layers; updated 3D linear to be compatible with examples in the tutorial | 3 years ago
アマデウス | 9ee197d0e9 | moved env variables to global variables; (#215) | 3 years ago
HELSON | 0f8c7f9804 | Fixed docstring in colossalai (#171) | 3 years ago
BoxiangW | 4a3d3446b0 | Update layer integration documentations (#108) | 3 years ago
アマデウス | 01a80cd86d | Hotfix/Colossalai layers (#92) | 3 years ago
アマデウス | 0fedef4f3c | Layer integration (#83) | 3 years ago
Frank Lee | da01c234e1 | Develop/experiments (#59) | 3 years ago
zbian | 404ecbdcc6 | Migrated project | 3 years ago