ColossalAI/colossalai/pipeline
Latest commit: d0ec221b38 by duanjunwen, "[fix] fix fail case test_shard_llama", 1 month ago
schedule/              [fix] fix fail case test_shard_llama                                                                              1 month ago
__init__.py            [feat] add zerobubble pp (just a frame now); add POC test for dx_dw; add test for zerobubble                       3 months ago
p2p.py                 fix object_to_tensor usage when torch>=2.3.0 (#5820)                                                              5 months ago
stage_manager.py       [fix] fix fail case test_shard_llama                                                                              1 month ago
weight_grad_store.py   [feat] support meta cache, meta_grad_send, meta_tensor_send; fix runtime too long in Recv Bwd; benchmark for llama + Hybrid(tp+pp)   1 month ago