ColossalAI/colossalai/nn/loss
Latest commit: 8edb777cc2 by shenggan, "[NFC] polish colossalai/nn/loss/loss_2p5d.py code style (#1553)", 2 years ago
File            Last commit                                                                           Age
__init__.py     moved env variables to global variables; (#215)                                       3 years ago
loss_1d.py      [tensor] add unitest for colo_tensor 1DTP cross_entropy (#1230)                       2 years ago
loss_2d.py      [hotfix] Raise messages for indivisible batch sizes with tensor parallelism (#622)    3 years ago
loss_2p5d.py    [NFC] polish colossalai/nn/loss/loss_2p5d.py code style (#1553)                       2 years ago
loss_3d.py      html refactor (#555)                                                                  3 years ago
loss_moe.py     html refactor (#555)                                                                  3 years ago
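
For context, the modules listed above appear to implement cross-entropy losses specialized for the different tensor-parallel layouts (1D, 2D, 2.5D, 3D) plus a mixture-of-experts variant, with __init__.py exposing a unified CrossEntropyLoss wrapper. The snippet below is a minimal usage sketch under that assumption; the import path, the dispatch-on-parallel-mode behavior, and the fallback to plain PyTorch cross-entropy when no tensor parallelism is configured are inferred from this era of the codebase, not guaranteed by this listing.

    # Minimal usage sketch (assumptions: legacy colossalai.nn API; CrossEntropyLoss
    # dispatches to the loss_1d/2d/2p5d/3d implementation matching the configured
    # tensor-parallel mode, and falls back to plain PyTorch cross-entropy otherwise).
    import torch
    from colossalai.nn import CrossEntropyLoss  # assumed import path

    criterion = CrossEntropyLoss()               # layout-specific loss chosen internally
    logits = torch.randn(8, 1000)                # (batch_size, num_classes)
    labels = torch.randint(0, 1000, (8,))
    loss = criterion(logits, labels)
    print(loss.item())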