digger yu | 518b31c059 | [docs] change placememt_policy to placement_policy (#3829) | 2023-05-24 14:51:49 +08:00
* fix typo colossalai/autochunk auto_parallel amp
* fix typo colossalai/auto_parallel nn utils etc.
* fix typo colossalai/auto_parallel autochunk fx/passes etc.
* fix typo docs/
* change placememt_policy to placement_policy in docs/ and examples/

ver217 | 26b7aac0be | [zero] reorganize zero/gemini folder structure (#3424) | 2023-04-04 13:48:16 +08:00
* [zero] refactor low-level zero folder structure
* [zero] fix legacy zero import path
* [zero] fix legacy zero import path
* [zero] remove useless import
* [zero] refactor gemini folder structure
* [zero] refactor gemini folder structure
* [zero] refactor legacy zero import path
* [zero] refactor gemini folder structure
* [zero] refactor gemini folder structure
* [zero] refactor gemini folder structure
* [zero] refactor legacy zero import path
* [zero] fix test import path
* [zero] fix test
* [zero] fix circular import
* [zero] update import

Fazzie-Maqianli | 292c81ed7c | fix/transformer-verison (#2581) | 2023-02-08 13:50:27 +08:00

jiaruifang | 025b482dc1 | [example] dreambooth example | 2023-01-18 18:42:56 +08:00

Haofan Wang | cfd1d5ee49 | [example] fixed seed error in train_dreambooth_colossalai.py (#2445) | 2023-01-11 16:56:15 +08:00

HELSON | 48d33b1b17 | [gemini] add get static torch model (#2356) | 2023-01-06 13:41:19 +08:00

Haofan Wang | 9edd0aa75e | Update train_dreambooth_colossalai.py | 2023-01-05 15:49:57 +08:00
accelerator.num_processes -> gpc.get_world_size(ParallelMode.DATA)

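A minimal sketch of the substitution in 9edd0aa75e, assuming a typical call site: only accelerator.num_processes -> gpc.get_world_size(ParallelMode.DATA) comes from the commit itself; the effective-batch-size computation and placeholder values around it are illustrative, and the imports follow the legacy ColossalAI parallel-context API, which must have been initialized via colossalai.launch.

```python
# Hedged sketch, not the exact diff: the surrounding effective-batch-size
# computation and placeholder values are assumptions for illustration.
# Requires colossalai.launch(...) to have initialized the parallel context.
from colossalai.context import ParallelMode
from colossalai.core import global_context as gpc

train_batch_size = 2              # per-process batch size (placeholder)
gradient_accumulation_steps = 1   # placeholder

# Before (Accelerate): total_batch_size = train_batch_size * accelerator.num_processes
# After (ColossalAI): take the world size from the data-parallel group instead.
total_batch_size = (
    train_batch_size
    * gpc.get_world_size(ParallelMode.DATA)
    * gradient_accumulation_steps
)
print(f"effective train batch size: {total_batch_size}")
```
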
Fazzie-Maqianli | a9b27b9265 | [example] fix dreambooth format (#2315) | 2023-01-04 16:20:00 +08:00

BlueRum | 1405b4381e | [example] fix save_load bug for dreambooth (#2280) | 2023-01-03 17:13:29 +08:00

Fazzie-Maqianli | ce3c4eca7b | [example] support DreamBooth (#2188) | 2022-12-23 16:47:30 +08:00