ColossalAI/tests/test_tensor

Latest commit: ea13a201bb by HELSON — [polish] polish code for get_static_torch_model (#2405) (2 years ago)
| Name | Last commit message | Last commit date |
| --- | --- | --- |
| common_utils | [testing] add beit model for unit testings (#2196) | 2 years ago |
| core | [colotensor] add Tensor.view op and its unit test (#1343) | 2 years ago |
| model | [Gemini] remove static tracer (#2083) | 2 years ago |
| test_colo_checkpoint_tools.py | [colotensor] use cpu memory to store state_dict (#1367) | 2 years ago |
| test_comm_spec_apply.py | [autoparallel] shard param and buffer as expected (#1753) | 2 years ago |
| test_context.py | [ColoTensor] reconfig ColoInitContext, decouple default_pg and default_dist_spec. (#1953) | 2 years ago |
| test_mix_gather.py | [autoparallel] mix gather (#1977) | 2 years ago |
| test_parameter.py | [refactor] refactor ColoTensor's unit tests (#1340) | 2 years ago |
| test_shape_consistency.py | [autoparallel] update CommSpec (#1667) | 2 years ago |
| test_shape_consistency_apply.py | [autoparallel] shard param and buffer as expected (#1753) | 2 years ago |
| test_sharded_linear.py | [ColoTensor] ColoInitContext initialize parameters in shard mode. (#1937) | 2 years ago |
| test_sharding_spec.py | [autoparallel] handled illegal sharding strategy (#1728) | 2 years ago |
| test_tp_with_zero.py | [polish] polish code for get_static_torch_model (#2405) | 2 years ago |