ColossalAI/colossalai/checkpoint_io

Latest commit: b2ad0d9e8f by Elsa Granger, 1 year ago
[pipeline,shardformer] Fix p2p efficiency in pipeline, allow skipping loading weight not in weight_map when `strict=False`, fix llama flash attention forward, add flop estimation by megatron in llama benchmark (#5017)
__init__.py: [misc] update pre-commit and run all files (#4752) (1 year ago)
checkpoint_io_base.py: [shardformer] fix master param sync for hybrid plugin/rewrite unwrapping logic (#4758) (1 year ago)
general_checkpoint_io.py: [shardformer] fix master param sync for hybrid plugin/rewrite unwrapping logic (#4758) (1 year ago)
hybrid_parallel_checkpoint_io.py: [pipeline,shardformer] Fix p2p efficiency in pipeline, allow skipping loading weight not in weight_map when `strict=False`, fix llama flash attention forward, add flop estimation by megatron in llama benchmark (#5017) (1 year ago)
index_file.py: [misc] update pre-commit and run all files (#4752) (1 year ago)
utils.py: [shardformer] Fix serialization error with Tensor Parallel state saving (#5018) (1 year ago)