mirror of https://github.com/hpcaitech/ColossalAI
* add test
* fix no_sync bug in low level zero plugin
* fix test
* add argument for grad accum
* add grad accum in backward hook for gemini
* finish implementation, rewrite tests
* fix test
* skip stuck model in low level zero test
* update doc
* optimize communication & fix gradient checkpoint
* modify doc
* cleaning codes
* update cpu adam fp16 case
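The changes above add gradient accumulation support (via an optimizer argument and a backward hook). As a rough illustration of the general pattern involved — not ColossalAI's actual implementation, and using only plain Python with hypothetical names — gradients from several micro-batches are summed and the optimizer steps once per accumulation window:

```python
# Minimal sketch of gradient accumulation for a 1-D model y = w * x.
# All function and variable names here are illustrative, not ColossalAI API.

def grad_mse(w, x, y):
    # Gradient of the squared error (w*x - y)^2 with respect to w.
    return 2.0 * (w * x - y) * x

def train(w, data, accum_steps, lr):
    grad_sum = 0.0
    for step, (x, y) in enumerate(data, start=1):
        # Accumulate gradients across micro-batches instead of
        # stepping the optimizer after every one.
        grad_sum += grad_mse(w, x, y)
        if step % accum_steps == 0:
            # One SGD step per accum_steps micro-batches,
            # using the averaged gradient.
            w -= lr * grad_sum / accum_steps
            grad_sum = 0.0
    return w

data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0), (4.0, 8.0)]
w = train(0.0, data, accum_steps=2, lr=0.05)
```

In distributed setups this is typically paired with a `no_sync`-style context so that gradient all-reduce happens only on the final micro-batch of each window, which is the communication optimization the change list refers to.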
Directory contents:

* utils/
* __init__.py
* albert.py
* beit.py
* bert.py
* gpt2.py
* hanging_param_model.py
* inline_op_model.py
* nested_model.py
* registry.py
* repeated_computed_layers.py
* resnet.py
* simple_net.py