Commit Graph

1057 Commits (e070ca45c62e09fc709cd16bf77bd0bfef30eedc)
 

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| Jiarui Fang | af5438caa2 | [FAW] refactor reorder() for CachedParamMgr (#1514) | 2 years ago |
| Jiarui Fang | 9feee6d06b | [FAW] LFU initialize with dataset freq (#1513) | 2 years ago |
| CsRic | 1b8fee8e9c | [FAW] shrink freq_cnter size (#1509) | 2 years ago |
| github-actions[bot] | f8945eef17 | Automated submodule synchronization (#1511) | 2 years ago |
| Boyuan Yao | 4acc58ee20 | [fx] Fix activation codegen dealing with checkpointing first op (#1510) | 2 years ago |
| Boyuan Yao | ac3a453a50 | [fx] fix the discretize bug (#1506) | 2 years ago |
| Boyuan Yao | 31fffd3fc5 | [fx] fix wrong variable name in solver rotor (#1502) | 2 years ago |
| Sze-qq | 3b6a5e2593 | update OPT experiment result for 8 GPUs (#1503) | 2 years ago |
| Jiarui Fang | ba61109b6c | [FAW] remove code related to chunk (#1501) | 2 years ago |
| Jiarui Fang | d5085bb317 | [FAW] add more docs and fix a warning (#1500) | 2 years ago |
| Kirigaya Kazuto | 5a6fd71f90 | [pipeline/rpc] update outstanding mechanism \| optimize dispatching strategy (#1497) | 2 years ago |
| CsRic | 0ed2f46131 | [FAW] FAW embedding use LRU as eviction strategy initialized with dataset stats (#1494) | 2 years ago |
| YuliangLiu0306 | 8b7d6bd5be | [autoparallel] add more sharding strategies to conv (#1487) | 2 years ago |
| github-actions[bot] | eda3de2701 | Automated submodule synchronization (#1499) | 2 years ago |
| Boyuan Yao | de1e716dc4 | [fx] Add activation checkpoint solver rotor (#1496) | 2 years ago |
| Super Daniel | 09c023bee2 | [fx] add more op patches for profiler and error message for unsupported ops. (#1495) | 2 years ago |
| YuliangLiu0306 | 413c053453 | [autoparallel] add cost graph class (#1481) | 2 years ago |
| YuliangLiu0306 | 4b03c25f85 | [tensor] add 1D device mesh (#1492) | 2 years ago |
| CsRic | b8d0e39eaf | [FAW] LFU cache for the FAW | 2 years ago |
| Kirigaya Kazuto | 9145aef2b4 | [pipeline/rpc] implement distributed optimizer \| test with assert_close (#1486) | 2 years ago |
| Frank Lee | 3da68d6b1b | [fx] fixed adaptive pooling size concatenation error (#1489) | 2 years ago |
| Jiarui Fang | cde7b8a5b8 | [FAW] init an LFU implementation for FAW (#1488) | 2 years ago |
| Super Daniel | 32efe8e740 | [fx] add profiler for fx nodes. (#1480) | 2 years ago |
| Frank Lee | d39e11dffb | [autoparallel] added namespace constraints (#1490) | 2 years ago |
| Kirigaya Kazuto | a6c8749198 | [pipeline/rpc] support interleaving \| fix checkpoint bug \| change logic when dispatch data in work_list to ensure steady 1F1B (#1483) | 2 years ago |
| github-actions[bot] | d6e3dca436 | Automated submodule synchronization (#1484) | 2 years ago |
| Geng Zhang | 0aad53c62b | [FCE] update interface for frequency statistics in FreqCacheEmbedding (#1462) | 2 years ago |
| Frank Lee | ede326298b | [autoparallel] integrate auto parallel with torch fx (#1479) | 2 years ago |
| github-actions[bot] | 8fb09a950a | Automated submodule synchronization (#1478) | 2 years ago |
| fastalgo | 0f438d15ee | Update README.md | 2 years ago |
| Sze-qq | 1750d6f573 | [doc] update readme with the new xTrimoMultimer project (#1477) | 2 years ago |
| Boyuan Yao | 1f2e547f7a | [fx] Fix ckpt functions' definitions in forward (#1476) | 2 years ago |
| Kirigaya Kazuto | bb5f5289e0 | [pipeline/rpc] implement a demo for PP with cuda rpc framework (#1470) | 2 years ago |
| Frank Lee | 628c7e3fc8 | [autoparallel] added dot handler (#1475) | 2 years ago |
| github-actions[bot] | d08566fb61 | Automated submodule synchronization (#1472) | 2 years ago |
| Frank Lee | 9dae9bb2bc | [autoparallel] introduced baseclass for op handler and reduced code redundancy (#1471) | 2 years ago |
| Frank Lee | 3a54e1c9b7 | [autoparallel] standardize the code structure (#1469) | 2 years ago |
| YuliangLiu0306 | 26a37b5cd5 | [autoparallel] Add conv handler to generate strategies and costs info for conv (#1467) | 2 years ago |
| Jiarui Fang | 1b491ad7de | [doc] update docstring in ProcessGroup (#1468) | 2 years ago |
| YuliangLiu0306 | b73fb7a077 | [tensor] support runtime ShardingSpec apply (#1453) | 2 years ago |
| github-actions[bot] | 177d3f5718 | Automated submodule synchronization (#1465) | 2 years ago |
| Super Daniel | bbc58d881b | [fx] fix MetaInfoProp for incorrect calculations and add detections for inplace op. (#1466) | 2 years ago |
| Super Daniel | e7383f578b | [fx] add rules to linearize computation graphs for searching. (#1461) | 2 years ago |
| github-actions[bot] | a7a3d55114 | Automated submodule synchronization (#1452) | 2 years ago |
| Boyuan Yao | 092b9c8f49 | [fx] Add use_reentrant=False to checkpoint in codegen (#1463) | 2 years ago |
| Boyuan Yao | 47fd8e4a02 | [utils] Add use_reentrant=False in utils.activation_checkpoint (#1460) | 2 years ago |
| Jiarui Fang | 36824a304c | [Doc] add more doc for ColoTensor. (#1458) | 2 years ago |
| Jiarui Fang | a1476ea882 | [NFC] polish doc style for ColoTensor (#1457) | 2 years ago |
| Super Daniel | 0dbd61c29b | [fx] fix test and algorithm bugs in activation checkpointing. (#1451) | 2 years ago |
| Jiarui Fang | b1553fdf96 | [NFC] global vars should be upper case (#1456) | 2 years ago |