ColossalAI/tests/test_utils
Latest commit: 501a9e9cd2 by oahzxl, [hotfix] polish flash attention (#1802), 2022-11-07 14:30:22 +08:00
Name | Last commit | Last commit date
test_checkpoint | [hotfix] skipped unsafe test cases (#1282) | 2022-07-13 00:08:59 +08:00
test_activation_checkpointing.py | [utils] Add use_reetrant=False in utils.activation_checkpoint (#1460) | 2022-08-16 15:39:20 +08:00
test_colo_checkpoint.py | [hotfix] fix a running error in test_colo_checkpoint.py (#1387) | 2022-07-29 15:58:06 +08:00
test_commons.py | [gemini] add GeminiMemoryManger (#832) | 2022-04-24 13:08:48 +08:00
test_flash_attention.py | [hotfix] polish flash attention (#1802) | 2022-11-07 14:30:22 +08:00
test_lazy_init_ctx.py | [utils] integrated colotensor with lazy init context (#1324) | 2022-07-15 17:47:12 +08:00
test_memory.py | [test] ignore 8 gpu test (#1080) | 2022-06-08 23:14:18 +08:00
test_norm_gradient_clipping.py | [Doc] add more doc for ColoTensor. (#1458) | 2022-08-16 10:38:41 +08:00
test_zero_gradient_clippling.py | [test] refactored with the new rerun decorator (#763) | 2022-04-15 00:33:04 +08:00