Mirror of https://github.com/hpcaitech/ColossalAI
Latest commit messages:

* init a checkpoint dir
* [checkpoint]support resume for cosinewarmuplr
* [checkpoint]add unit test
* fix some bugs but still not OK
* fix bugs
* make it faster
* [checkpoint]support generalized scheduler
* polish
* [tensor] torch function return colotensor
* polish
* fix bugs
* remove debug info
* polish
* polish
* [tensor] test_model pass unittests
* polish
* [hotfix] fx get comm size bug

Co-authored-by: ZhaoYi1222 <zhaoyi9499@gmail.com>

Files in this directory:

* __init__.py
* adding_split_node_pass.py
* meta_info_prop.py
* shard_1d_pass.py
* utils.py