ColossalAI/tests/test_tensor
Latest commit 4b03c25f85: [tensor]add 1D device mesh (#1492) (YuliangLiu0306, 2 years ago)
Name | Last commit | Last updated
common_utils | [colotensor] add Tensor.view op and its unit test (#1343) | 2 years ago
core | [colotensor] add Tensor.view op and its unit test (#1343) | 2 years ago
model | [hotfix] fix megatron_init in test_gpt2.py (#1357) | 2 years ago
test_chunk.py | [hotfix] ZeroDDP use new process group (#1333) | 2 years ago
test_colo_checkpoint_tools.py | [colotensor] use cpu memory to store state_dict (#1367) | 2 years ago
test_comm_spec_apply.py | [tensor]add 1D device mesh (#1492) | 2 years ago
test_context.py | [refactory] add nn.parallel module (#1068) | 3 years ago
test_parameter.py | [refactor] refactor ColoTensor's unit tests (#1340) | 2 years ago
test_shape_consistency.py | [tensor] shape consistency generate transform path and communication cost (#1435) | 2 years ago
test_shape_consistency_apply.py | [tensor]add 1D device mesh (#1492) | 2 years ago
test_sharded_linear.py | [tensor] added linear implementation for the new sharding spec (#1416) | 2 years ago
test_sharding_spec.py | [tensor]build sharding spec to replace distspec in future. (#1405) | 2 years ago
test_zero_optim.py | [unit test] add megatron init test in zero_optim (#1358) | 2 years ago