ColossalAI/colossalai/checkpoint_io
Latest commit 4c4482f3ad by flybird11111: [example] llama2 add fine-tune example (#4673)
* [shardformer] update shardformer readme
* [shardformer] update llama2/opt finetune example and shardformer update to llama2
* [shardformer] change dataset
* [shardformer] fix CI
* [shardformer] fix
* [example] update opt example
* [example] resolve comments
* fix
* [example] llama2 add finetune example
* update llama2 example
* Update requirements.txt
2023-09-15 18:45:44 +08:00
File | Last commit | Date
__init__.py | [hotfix] fix typo in hybrid parallel io (#4697) | 2023-09-12 17:32:19 +08:00
checkpoint_io_base.py | [checkpointio] Unsharded Optimizer Checkpoint for Gemini Plugin (#4141) | 2023-07-07 16:33:06 +08:00
general_checkpoint_io.py | Merge branch 'main' into feature/shardformer | 2023-09-04 23:43:13 +08:00
hybrid_parallel_checkpoint_io.py | [example] llama2 add fine-tune example (#4673) | 2023-09-15 18:45:44 +08:00
index_file.py | [checkpointio] General Checkpointing of Sharded Optimizers (#3984) | 2023-06-15 15:21:26 +08:00
utils.py | [legacy] move communication and nn to legacy and refactor logger (#4671) | 2023-09-11 16:24:28 +08:00
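For context, this directory is ColossalAI's checkpoint IO layer: checkpoint_io_base.py defines the CheckpointIO interface, general_checkpoint_io.py and hybrid_parallel_checkpoint_io.py provide concrete implementations, index_file.py describes the index of sharded checkpoint files, and utils.py holds shared helpers. The snippet below is a minimal sketch of driving this module directly; it assumes GeneralCheckpointIO exposes save_model/load_model with the parameters used here, as in ColossalAI releases around this commit, so signatures and defaults should be checked against the installed version.

```python
# Minimal sketch: saving and reloading a model through colossalai.checkpoint_io.
# Assumes GeneralCheckpointIO exposes save_model/load_model as in ColossalAI
# releases around this commit; verify against your installed version.
import torch.nn as nn

from colossalai.checkpoint_io import GeneralCheckpointIO

model = nn.Linear(1024, 1024)  # stand-in for a real model (e.g. a LLaMA-2 variant)
ckpt_io = GeneralCheckpointIO()

# With shard=True the state dict is assumed to be split into size-capped files,
# with an index file (see index_file.py) recording which tensor lives in which shard.
ckpt_io.save_model(model, "./ckpt", shard=True, size_per_shard=1024)

# Loading reads the index file in the checkpoint directory and restores the
# parameters in place.
ckpt_io.load_model(model, "./ckpt")
```

In normal training code this class is usually not touched directly; the Booster/plugin picks the matching CheckpointIO implementation (for example the hybrid parallel one) and exposes it through its own save/load methods.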