ColossalAI/tests/test_tensor
Latest commit: f7e276fa71 by Jiarui Fang, [Gemini] add GeminiAdamOptimizer (#1960), 2 years ago
| Name | Last commit | Last update |
| --- | --- | --- |
| common_utils | | |
| core | | |
| model | [utils] remove lazy_memory_allocate from ColoInitContext (#1844) | 2 years ago |
| test_colo_checkpoint_tools.py | [colotensor] use cpu memory to store state_dict (#1367) | 2 years ago |
| test_comm_spec_apply.py | [autoparallel] shard param and buffer as expected (#1753) | 2 years ago |
| test_context.py | [ColoTensor] reconfig ColoInitContext, decouple default_pg and default_dist_spec. (#1953) | 2 years ago |
| test_parameter.py | | |
| test_shape_consistency.py | [autoparallel] update CommSpec (#1667) | 2 years ago |
| test_shape_consistency_apply.py | [autoparallel] shard param and buffer as expected (#1753) | 2 years ago |
| test_sharded_linear.py | [ColoTensor] ColoInitContext initialize parameters in shard mode. (#1937) | 2 years ago |
| test_sharding_spec.py | [autoparallel] handled illegal sharding strategy (#1728) | 2 years ago |
| test_tp_with_zero.py | [Gemini] add GeminiAdamOptimizer (#1960) | 2 years ago |