ColossalAI/colossalai/nn

Latest commit: cefc29ff06 by ver217, "[tensor] impl ColoDDP for ColoTensor (#1009)", 3 years ago
Name           Last commit                                                                            Age
layer          [NFC] polish colossalai/nn/layer/utils/common.py code style (#983)                    3 years ago
loss           [hotfix] Raise messages for indivisible batch sizes with tensor parallelism (#622)    3 years ago
lr_scheduler   Refactored docstring to google style                                                   3 years ago
metric         [hotfix] Raise messages for indivisible batch sizes with tensor parallelism (#622)    3 years ago
model          Develop/experiments (#59)                                                              3 years ago
optimizer      [zero] improve adaptability for not-shard parameters (#708)                            3 years ago
__init__.py    Layer integration (#83)                                                                3 years ago
init.py        Refactored docstring to google style                                                   3 years ago
parallel.py    [tensor] impl ColoDDP for ColoTensor (#1009)                                           3 years ago
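The listing above covers self-contained building blocks (layers, losses, metrics, optimizers, LR schedulers) rather than a single pipeline. As an illustration of how one of these modules is typically driven, here is a minimal, hedged sketch using a scheduler from colossalai.nn.lr_scheduler; the class name CosineAnnealingWarmupLR and its (optimizer, total_steps, warmup_steps) signature are assumptions based on this snapshot of the repository and may differ in your checkout.

```python
# Minimal sketch: warmup followed by cosine annealing, stepping the
# scheduler once per optimizer step. The import path and constructor
# arguments below are assumptions; verify against your ColossalAI version.
import torch
from colossalai.nn.lr_scheduler import CosineAnnealingWarmupLR  # assumed import path

model = torch.nn.Linear(16, 4)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# Warm up for 10 steps, then cosine-anneal over the remaining 90.
scheduler = CosineAnnealingWarmupLR(optimizer, total_steps=100, warmup_steps=10)

for step in range(100):
    optimizer.zero_grad()
    loss = model(torch.randn(8, 16)).sum()  # dummy forward pass and loss
    loss.backward()
    optimizer.step()
    scheduler.step()  # advance the learning-rate schedule
```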