ColossalAI/colossalai
Latest commit: [FCE] update interface for frequency statistics in FreqCacheEmbedding (#1462) by Geng Zhang (0aad53c62b), 2022-08-23 17:38:24 +08:00
| Name | Last commit | Last updated |
| --- | --- | --- |
| `amp` | [doc] update rst and docstring (#1351) | 2022-07-21 15:54:53 +08:00 |
| `auto_parallel` | [autoparallel] integrate auto parallel with torch fx (#1479) | 2022-08-23 14:23:08 +08:00 |
| `builder` | | |
| `cli` | | |
| `communication` | [communication] add p2p_v2.py to support communication with List[Any] (#1407) | 2022-08-09 11:40:04 +08:00 |
| `context` | [doc] update rst and docstring (#1351) | 2022-07-21 15:54:53 +08:00 |
| `device` | [tensor] support runtime ShardingSpec apply (#1453) | 2022-08-19 13:39:51 +08:00 |
| `engine` | [engin/schedule] use p2p_v2 to recontruct pipeline_schedule (#1408) | 2022-08-12 11:33:26 +08:00 |
| `fx` | [fx] Fix ckpt functions' definitions in forward (#1476) | 2022-08-22 16:59:54 +08:00 |
| `gemini` | [zero] add chunk_managerV2 for all-gather chunk (#1441) | 2022-08-11 19:17:24 +08:00 |
| `kernel` | [hotfix] fix CPUAdam kernel nullptr (#1410) | 2022-08-05 19:45:45 +08:00 |
| `logging` | | |
| `nn` | [FCE] update interface for frequency statistics in FreqCacheEmbedding (#1462) | 2022-08-23 17:38:24 +08:00 |
| `pipeline` | [pipeline/rpc] implement a demo for PP with cuda rpc framework (#1470) | 2022-08-22 10:50:51 +08:00 |
| `registry` | | |
| `tensor` | [autoparallel] Add conv handler to generate strategies and costs info for conv (#1467) | 2022-08-19 14:57:23 +08:00 |
| `testing` | | |
| `trainer` | | |
| `utils` | [utils] Add use_reetrant=False in utils.activation_checkpoint (#1460), see the sketch below the table | 2022-08-16 15:39:20 +08:00 |
| `zero` | [utils] Impl clip_grad_norm for ColoTensor and ZeroOptimizer (#1442) | 2022-08-11 22:58:58 +08:00 |
| `__init__.py` | | |
| `constants.py` | | |
| `core.py` | | |
| `global_variables.py` | | |
| `initialize.py` | [hotfix] remove potiential circle import (#1307) | 2022-07-14 13:44:26 +08:00 |
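The `utils` row refers to PR #1460, which enables the non-reentrant path for activation checkpointing. As a rough illustration only, the sketch below shows what `use_reentrant=False` means using PyTorch's public `torch.utils.checkpoint` API; the toy module and sizes are invented for the example, and this is not the `utils.activation_checkpoint` wrapper named in the commit title.

```python
# Minimal sketch of non-reentrant activation checkpointing with the public
# PyTorch API. It only illustrates the use_reentrant=False behaviour referenced
# by PR #1460; the module below is a made-up example, not ColossalAI code.
import torch
from torch.utils.checkpoint import checkpoint


class TwoLayerBlock(torch.nn.Module):  # hypothetical toy module
    def __init__(self, dim: int = 128):
        super().__init__()
        self.fc1 = torch.nn.Linear(dim, dim)
        self.fc2 = torch.nn.Linear(dim, dim)

    def forward(self, x):
        # Recompute fc1's activations during backward instead of storing them.
        # use_reentrant=False selects the non-reentrant implementation, which
        # supports keyword arguments and inputs that do not require gradients.
        h = checkpoint(self.fc1, x, use_reentrant=False)
        return self.fc2(torch.relu(h))


x = torch.randn(4, 128, requires_grad=True)
out = TwoLayerBlock()(x)
out.sum().backward()
```

Non-reentrant checkpointing trades extra recomputation for lower activation memory, and recent PyTorch releases recommend it over the reentrant default for exactly the reasons noted in the comments above.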