ColossalAI/colossalai/nn

Latest commit: 6a3f9fda83 by LuGY: [cuda] modify the fused adam, support hybrid of fp16 and fp32 (#497), 3 years ago
Name           Last commit                                                               Age
layer          [polish] polish singleton and global context (#500)                      3 years ago
loss           [polish] polish singleton and global context (#500)                      3 years ago
lr_scheduler   Fixed docstring in colossalai (#171)                                     3 years ago
metric         fixed CI dataset directory; fixed import error of 2.5d accuracy (#255)   3 years ago
model          Develop/experiments (#59)                                                3 years ago
optimizer      [cuda] modify the fused adam, support hybrid of fp16 and fp32 (#497)     3 years ago
__init__.py    Layer integration (#83)                                                  3 years ago
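The latest commit (#497) extends the fused Adam kernel under `optimizer/` so that a single optimizer instance can update fp16 and fp32 parameters together. Below is a minimal sketch of that usage, assuming a CUDA build of ColossalAI with its fused kernels compiled; the model shape and hyperparameters are arbitrary illustrations, not taken from this repo.

```python
import torch
from colossalai.nn.optimizer import FusedAdam

# Two linear layers: the first is cast to fp16, the second stays fp32,
# giving a parameter set with mixed precisions (the "hybrid" case).
model = torch.nn.Sequential(
    torch.nn.Linear(64, 64),
    torch.nn.Linear(64, 8),
).cuda()
model[0].half()

# One FusedAdam over all parameters; after #497 the fused multi-tensor
# kernel handles the fp16 and fp32 tensors in the same step.
optimizer = FusedAdam(model.parameters(), lr=1e-3)

x = torch.randn(4, 64, device="cuda", dtype=torch.float16)
loss = model[1](model[0](x).float()).sum()
loss.backward()
optimizer.step()
```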