mirror of https://github.com/hpcaitech/ColossalAI
Topics: ai, big-model, data-parallelism, deep-learning, distributed-computing, foundation-models, heterogeneous-training, hpc, inference, large-scale, model-parallelism, pipeline-parallelism
Directory contents:

- `d_tensor/`
- `moe_tensor/`
- `padded_tensor/`
- `__init__.py`
- `colo_parameter.py`
- `colo_tensor.py`
- `comm_spec.py`
- `param_op_hook.py`
- `shape_consistency.py`
- `sharding_spec.py`
- `utils.py`