ColossalAI/tests/test_fx/test_tracer

Latest commit: 19438ea0ef by YuliangLiu0306, "[hotfix] skip gpt tracing test (#2064)", 2022-12-02 16:48:28 +08:00
Name                                        Latest commit                                                                       Date
test_hf_model/                              [hotfix] skip gpt tracing test (#2064)                                              2022-12-02 16:48:28 +08:00
test_timm_model/                            [fx] add a symbolic_trace api. (#1812)                                              2022-11-08 13:59:20 +08:00
test_torchaudio_model/                      [fx] add a symbolic_trace api. (#1812)                                              2022-11-08 13:59:20 +08:00
test_torchrec_model/                        [fx] add a symbolic_trace api. (#1812)                                              2022-11-08 13:59:20 +08:00
test_torchvision_model/                     [fx] add a symbolic_trace api. (#1812)                                              2022-11-08 13:59:20 +08:00
test_activation_checkpoint_annotation.py    [autoparallel] move ckpt solvers to autoparallel folder / refactor code (#1764)     2022-11-01 10:43:15 +08:00
test_bias_addition_module.py                [autoparallel]add essential CommActions for broadcast oprands (#1793)               2022-11-04 18:36:42 +08:00
test_control_flow.py                        [fx] supported data-dependent control flow in model tracing (#1185)                 2022-06-29 15:05:25 +08:00
test_functional_conv.py                     [fx]patch nn.functional convolution (#1528)                                         2022-09-01 19:05:07 +08:00
test_patched_module.py                      [fx] fixed adapative pooling size concatenation error (#1489)                       2022-08-25 09:05:07 +08:00
test_patched_op.py                          [fx] patched torch.max and data movement operator (#1391)                           2022-08-01 15:31:50 +08:00