ColossalAI/colossalai
Latest commit: 6e553748a7 by ver217, "polish sharded optim docstr and warning" (#770), 2022-04-14 21:03:59 +08:00
| Name | Last commit | Date |
| --- | --- | --- |
| amp | [bug] fixed grad scaler compatibility with torch 1.8 (#735) | 2022-04-12 16:04:21 +08:00 |
| builder | [NFC] polish colossalai/builder/builder.py code style (#662) | 2022-04-06 11:40:59 +08:00 |
| communication | [util] fixed communication API depth with PyTorch 1.9 (#721) | 2022-04-12 09:44:40 +08:00 |
| context | [compatibility] used backward-compatible API for global process group (#758) | 2022-04-14 17:20:35 +08:00 |
| engine | [zero] refactor memstats_collector (#746) | 2022-04-14 12:01:12 +08:00 |
| gemini | [gemini] init gemini individual directory (#754) | 2022-04-14 16:40:26 +08:00 |
| kernel | [NFC] polish colossalai/kernel/cuda_native/csrc/multi_tensor_adam.cu code style (#667) | 2022-04-06 11:40:59 +08:00 |
| logging | Refactored docstring to google style | 2022-03-29 17:17:47 +08:00 |
| nn | [TP] allow layernorm without bias (#750) | 2022-04-14 11:43:56 +08:00 |
| registry | Refactored docstring to google style | 2022-03-29 17:17:47 +08:00 |
| testing | [test] added a decorator for address already in use error with backward compatibility (#760) | 2022-04-14 16:48:44 +08:00 |
| trainer | fix the ckpt bugs when using DDP (#769) | 2022-04-14 21:03:24 +08:00 |
| utils | [zero] refactor memstats_collector (#746) | 2022-04-14 12:01:12 +08:00 |
| zero | polish sharded optim docstr and warning (#770) | 2022-04-14 21:03:59 +08:00 |
| __init__.py | Develop/experiments (#59) | 2021-12-09 15:08:29 +08:00 |
| constants.py | fix format constants.py (#358) | 2022-03-11 15:50:28 +08:00 |
| core.py | [polish] polish singleton and global context (#500) | 2022-03-23 18:03:39 +08:00 |
| global_variables.py | [MOE] add unit test for MOE experts layout, gradient handler and kernel (#469) | 2022-03-21 13:35:04 +08:00 |
| initialize.py | fix initialize about zero | 2022-04-13 19:10:21 +08:00 |
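For orientation, these directories map onto one training flow: `initialize.py` wraps the user's model and optimizer into an Engine (`engine/`), applying mixed precision (`amp/`) and ZeRO sharding (`zero/`) from the user config, with process groups held by the global context (`core.py`, `context/`). The sketch below follows ColossalAI's documented v0.1.x-era usage (`launch_from_torch`, `initialize`, the Engine methods, `AMP_TYPE`); treat exact signatures at this particular commit as an assumption, not a guarantee.

```python
# Minimal sketch only: run with `torchrun --nproc_per_node 1 train_sketch.py`.
# API names follow ColossalAI's documented v0.1.x usage; behavior at this
# specific commit is an assumption.
import torch
import torch.nn as nn

import colossalai
from colossalai.amp import AMP_TYPE
from colossalai.context import ParallelMode
from colossalai.core import global_context as gpc
from colossalai.logging import get_dist_logger

# Config normally lives in a config.py; the fp16 key is handled by amp/,
# and a `zero=dict(...)` key would be handled by zero/.
CONFIG = dict(fp16=dict(mode=AMP_TYPE.TORCH))


def main():
    # Reads rank/world size from the torch launcher env and sets up the
    # process groups held by the global context (core.py, context/).
    colossalai.launch_from_torch(config=CONFIG)

    logger = get_dist_logger()  # logging/
    logger.info(f'world size: {gpc.get_world_size(ParallelMode.GLOBAL)}', ranks=[0])

    model = nn.Linear(32, 4)
    optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)
    criterion = nn.CrossEntropyLoss()

    # initialize.py wraps everything into an Engine (engine/), applying
    # amp/zero according to the config.
    engine, *_ = colossalai.initialize(model=model,
                                       optimizer=optimizer,
                                       criterion=criterion)

    engine.train()
    data = torch.randn(8, 32).cuda()
    label = torch.randint(0, 4, (8,)).cuda()

    engine.zero_grad()
    output = engine(data)
    loss = engine.criterion(output, label)
    engine.backward(loss)
    engine.step()
    logger.info(f'loss: {loss.item():.4f}', ranks=[0])


if __name__ == '__main__':
    main()
```

The `trainer/` directory builds a higher-level Trainer on top of this same Engine, and `utils/` supplies the memory-statistics collectors referenced in the memstats_collector commits above.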