ColossalAI/colossalai/auto_parallel
Latest commit: 0da1d00399 by YuliangLiu0306 — 2022-11-17 20:11:53 +08:00
[autoparallel] support distributed dataloader option (#1906)

* [autoparallel] support distributed dataloader option
* update output handler to support ddp dataloader
* polish code
Name            Last commit                                                      Date
checkpoint      [autoparallel] user-friendly API for CheckpointSolver. (#1879)   2022-11-10 20:59:28 +08:00
meta_profiler   [autoparallel] add torch.nn.ReLU metainfo (#1868)                2022-11-16 23:12:31 +08:00
passes          [autoparallel] remove redundancy comm node (#1893)               2022-11-15 10:53:41 +08:00
pipeline_shard  [autoparallel] init new folder structure (#1696)                 2022-10-13 14:18:55 +08:00
tensor_shard    [autoparallel] support distributed dataloader option (#1906)     2022-11-17 20:11:53 +08:00
__init__.py     [autoparallel] standardize the code structure (#1469)            2022-08-19 15:51:54 +08:00