Mirror of https://github.com/hpcaitech/ColossalAI
Topics: ai, big-model, data-parallelism, deep-learning, distributed-computing, foundation-models, heterogeneous-training, hpc, inference, large-scale, model-parallelism, pipeline-parallelism
Latest commit: `707b11d4a0` by HELSON, 2 years ago
| Name | Last commit |
|---|---|
| common_utils/ | 2 years ago |
| core/ | 2 years ago |
| model/ | 2 years ago |
| test_colo_checkpoint_tools.py | 2 years ago |
| test_comm_spec_apply.py | 2 years ago |
| test_context.py | 2 years ago |
| test_mix_gather.py | 2 years ago |
| test_parameter.py | 2 years ago |
| test_shape_consistency.py | 2 years ago |
| test_shape_consistency_apply.py | 2 years ago |
| test_sharded_linear.py | 2 years ago |
| test_sharding_spec.py | 2 years ago |
| test_tp_with_zero.py | 2 years ago |