| Name | Last commit | Commit date |
| --- | --- | --- |
| `amp` | [hotfix] fix initialize bug with zero (#442) | 2022-03-17 13:16:22 +08:00 |
| `builder` | add pytorch hooks (#179) | 2022-01-25 22:20:54 +08:00 |
| `communication` | fix format (#332) | 2022-03-11 15:50:28 +08:00 |
| `context` | [MOE] add unitest for MOE experts layout, gradient handler and kernel (#469) | 2022-03-21 13:35:04 +08:00 |
| `engine` | [MOE] polish moe_env (#467) | 2022-03-19 15:36:25 +08:00 |
| `kernel` | [formart] format fixed for kernel\cuda_native codes (#335) | 2022-03-11 15:50:28 +08:00 |
| `logging` | [log] better logging display with rich (#426) | 2022-03-16 09:51:15 +08:00 |
| `nn` | [MOE] add unitest for MOE experts layout, gradient handler and kernel (#469) | 2022-03-21 13:35:04 +08:00 |
| `registry` | add pytorch hooks (#179) | 2022-01-25 22:20:54 +08:00 |
| `testing` | optimized context test time consumption (#446) | 2022-03-17 14:40:52 +08:00 |
| `trainer` | Added profiler communication operations | 2022-03-11 15:50:28 +08:00 |
| `utils` | [MOE] add unitest for MOE experts layout, gradient handler and kernel (#469) | 2022-03-21 13:35:04 +08:00 |
| `zero` | zero init ctx receives a dp process group (#471) | 2022-03-21 11:18:55 +08:00 |
| `__init__.py` | Develop/experiments (#59) | 2021-12-09 15:08:29 +08:00 |
| `constants.py` | fix format constants.py (#358) | 2022-03-11 15:50:28 +08:00 |
| `core.py` | [MOE] polish moe_env (#467) | 2022-03-19 15:36:25 +08:00 |
| `global_variables.py` | [MOE] add unitest for MOE experts layout, gradient handler and kernel (#469) | 2022-03-21 13:35:04 +08:00 |
| `initialize.py` | [MOE] add unitest for MOE experts layout, gradient handler and kernel (#469) | 2022-03-21 13:35:04 +08:00 |