ColossalAI/colossalai/shardformer/policies

Latest commit: 17904cb5bf by Hongxin Liu, 2024-08-27 10:09:43 +08:00
Merge pull request #6012 from hpcaitech/feature/fp8_comm
[fp8] support fp8 communication and fp8 training for Colossalai
File             Last commit                                                          Date
__init__.py      [shardformer] init shardformer code structure (#3731)                2023-07-04 16:05:01 +08:00
auto_policy.py   [FP8] rebase main (#5963)                                            2024-08-06 16:29:37 +08:00
base_policy.py   [fp8] Merge feature/fp8_comm to main branch of Colossalai (#6016)    2024-08-22 09:21:34 +08:00
bert.py          [fp8] support hybrid parallel plugin (#5982)                         2024-08-12 18:17:05 +08:00
blip2.py         [fp8] support hybrid parallel plugin (#5982)                         2024-08-12 18:17:05 +08:00
bloom.py         [fp8] support hybrid parallel plugin (#5982)                         2024-08-12 18:17:05 +08:00
chatglm2.py      [fp8] support hybrid parallel plugin (#5982)                         2024-08-12 18:17:05 +08:00
command.py       [fp8] Merge feature/fp8_comm to main branch of Colossalai (#6016)    2024-08-22 09:21:34 +08:00
deepseek.py      [Feature] Zigzag Ring attention (#5905)                              2024-08-16 13:56:38 +08:00
falcon.py        [fp8] support hybrid parallel plugin (#5982)                         2024-08-12 18:17:05 +08:00
gpt2.py          [fp8] support hybrid parallel plugin (#5982)                         2024-08-12 18:17:05 +08:00
gptj.py          [fp8] support hybrid parallel plugin (#5982)                         2024-08-12 18:17:05 +08:00
llama.py         Merge pull request #6012 from hpcaitech/feature/fp8_comm             2024-08-27 10:09:43 +08:00
mistral.py       [fp8] Merge feature/fp8_comm to main branch of Colossalai (#6016)    2024-08-22 09:21:34 +08:00
mixtral.py       [fp8] Merge feature/fp8_comm to main branch of Colossalai (#6016)    2024-08-22 09:21:34 +08:00
opt.py           [fp8] support hybrid parallel plugin (#5982)                         2024-08-12 18:17:05 +08:00
qwen2.py         [Feature] Zigzag Ring attention (#5905)                              2024-08-16 13:56:38 +08:00
sam.py           [fp8] support hybrid parallel plugin (#5982)                         2024-08-12 18:17:05 +08:00
t5.py            [fp8] support hybrid parallel plugin (#5982)                         2024-08-12 18:17:05 +08:00
vit.py           [fp8] support hybrid parallel plugin (#5982)                         2024-08-12 18:17:05 +08:00
whisper.py       [fp8] support hybrid parallel plugin (#5982)                         2024-08-12 18:17:05 +08:00