ColossalAI/colossalai/nn
Latest commit: LuGY 105c5301c3 — [zero] added hybrid adam, removed loss scale in adam (#527), 3 years ago
layer/         [polish] polish singleton and global context (#500)           3 years ago
loss/          [polish] polish singleton and global context (#500)           3 years ago
lr_scheduler/
metric/
model/
optimizer/     [zero] added hybrid adam, removed loss scale in adam (#527)   3 years ago
__init__.py