ColossalAI/colossalai
Latest commit f552b11294 by Jiarui Fang: [zero] label state for param fp16 and grad (#551) (2022-03-30 15:57:46 +08:00)
Name                 Last commit                                                                     Date
amp/                 Refactored docstring to google style                                            2022-03-29 17:17:47 +08:00
builder/             Refactored docstring to google style                                            2022-03-29 17:17:47 +08:00
communication/       Refactored docstring to google style                                            2022-03-29 17:17:47 +08:00
context/             Refactored docstring to google style                                            2022-03-29 17:17:47 +08:00
engine/              [zero] label state for param fp16 and grad (#551)                               2022-03-30 15:57:46 +08:00
kernel/              [cuda] modify the fused adam, support hybrid of fp16 and fp32 (#497)            2022-03-25 14:15:53 +08:00
logging/             Refactored docstring to google style                                            2022-03-29 17:17:47 +08:00
nn/                  [TP] Add gather_out arg to Linear (#541)                                        2022-03-30 09:35:46 +08:00
registry/            Refactored docstring to google style                                            2022-03-29 17:17:47 +08:00
testing/             [test] fixed rerun_on_exception and adapted test cases (#487)                   2022-03-25 17:25:12 +08:00
trainer/             Refactored docstring to google style                                            2022-03-29 17:17:47 +08:00
utils/               [zero] dump memory stats for sharded model (#548)                               2022-03-30 09:38:44 +08:00
zero/                [zero] label state for param fp16 and grad (#551)                               2022-03-30 15:57:46 +08:00
__init__.py          Develop/experiments (#59)                                                       2021-12-09 15:08:29 +08:00
constants.py         fix format constants.py (#358)                                                  2022-03-11 15:50:28 +08:00
core.py              [polish] polish singleton and global context (#500)                             2022-03-23 18:03:39 +08:00
global_variables.py  [MOE] add unitest for MOE experts layout, gradient handler and kernel (#469)   2022-03-21 13:35:04 +08:00
initialize.py        Refactored docstring to google style                                            2022-03-29 17:17:47 +08:00
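
For orientation, most of the entries above are plumbing around one workflow: initialize.py exposes the library's entry point, colossalai.initialize, which wraps a model, optimizer, and criterion into an Engine (engine/ above), applying whatever AMP (amp/) or ZeRO (zero/) behavior the user's config requests. Below is a minimal sketch of that entry point, assuming the v0.x-era API matching these commit dates; the toy model, optimizer, and config values are hypothetical stand-ins, not taken from this listing.

    import torch
    from torch import nn

    import colossalai

    # Hypothetical toy model and optimizer; stand-ins for real training code.
    model = nn.Linear(1024, 10)
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    criterion = nn.CrossEntropyLoss()

    # Read rank/world-size from the environment set up by torchrun, then
    # build the global distributed context (see context/ and core.py above).
    colossalai.launch_from_torch(config={})

    # Wrap model/optimizer/criterion into an Engine (engine/ above); the
    # full return is (engine, train_dataloader, test_dataloader, lr_scheduler).
    engine, *_ = colossalai.initialize(model, optimizer, criterion)

    engine.train()
    # In the training loop, engine.backward(loss) and engine.step() stand in
    # for loss.backward() and optimizer.step(), letting the runtime hook in
    # mixed-precision and sharded-gradient handling behind the scenes.

Routing backward() and step() through the Engine is what lets features touched by the commits above, such as ZeRO gradient sharding and the fused fp16/fp32 Adam kernel, intercept the training step without changes to user code.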