Directory: ColossalAI/colossalai/shardformer/shard

Latest commit: dcd41d0973 (Wang Binluo), 2024-10-15 15:17:21 +08:00
    Merge pull request #6071 from wangbluo/ring_attention
    [Ring Attention] fix the 2d ring attn when using multiple machine
File                 Last commit message                                                                                            Date
__init__.py          [shardformer, pipeline] add `gradient_checkpointing_ratio` and heterogenous shard policy for llama (#5508)    2024-04-01 11:34:58 +08:00
grad_ckpt_config.py  [shardformer] refactor pipeline grad ckpt config (#5646)                                                       2024-04-25 15:19:30 +08:00
shard_config.py      Merge pull request #6071 from wangbluo/ring_attention                                                          2024-10-15 15:17:21 +08:00
sharder.py           [nfc] fix typo colossalai/shardformer/ (#5133)                                                                 2024-01-04 16:21:55 +08:00
shardformer.py       [FP8] rebase main (#5963)                                                                                      2024-08-06 16:29:37 +08:00
utils.py             [pipeline] update shardformer policy                                                                           2023-08-15 23:25:14 +08:00
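The files listed above form the shard-configuration layer of Shardformer: `shard_config.py` holds the feature flags the sharder consults, and the commit messages reference knobs such as a gradient-checkpointing ratio and sequence/ring-attention parallelism. As a rough, hedged illustration only (the class and field names below are assumptions for this sketch, not ColossalAI's actual `ShardConfig` API), such a config is typically a plain dataclass of flags validated on construction:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical sketch of a shard-config dataclass. All names here are
# illustrative assumptions, NOT ColossalAI's real ShardConfig fields.
@dataclass
class ShardConfigSketch:
    enable_tensor_parallelism: bool = True       # split linear layers across ranks
    enable_sequence_parallelism: bool = False    # e.g. ring attention over the sequence dim
    pipeline_stage_manager: Optional[object] = None  # set when pipeline parallelism is on
    gradient_checkpoint_ratio: float = 0.0       # fraction of layers to recompute

    def __post_init__(self) -> None:
        # Basic sanity check: the checkpointing ratio must be a valid fraction.
        if not 0.0 <= self.gradient_checkpoint_ratio <= 1.0:
            raise ValueError("gradient_checkpoint_ratio must be in [0, 1]")


# Example: enable sequence parallelism and checkpoint half the layers.
cfg = ShardConfigSketch(enable_sequence_parallelism=True,
                        gradient_checkpoint_ratio=0.5)
```

A sharder module (like `sharder.py` here) would then read these flags to decide which model-rewriting policies to apply; the dataclass itself carries no logic beyond validation.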