ColossalAI/colossalai/nn
FoolPlayer ab8a47f830 [shardformer] add Dropout layer support different dropout pattern (#3856)
* add dropout layer, add dropout test

* modify seed manager as context manager

* add a copy of col_nn.layer

* add dist_crossentropy loss; separate module test

* polish the code

* fix dist crossentropy loss
2023-07-04 16:05:01 +08:00
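The latest commit's body mentions reworking the seed manager as a context manager so Dropout can support different dropout patterns across tensor-parallel ranks (same mask everywhere for replicated activations, distinct masks for sharded ones). As a rough illustration of that idea only — `SeedManager`, `add_mode`, and `mode` here are hypothetical names, and ColossalAI's real seed manager works on CUDA RNG states rather than Python's `random` module — a context-manager-based seed manager can be sketched as:

```python
import contextlib
import random


class SeedManager:
    """Illustrative sketch, not ColossalAI's actual API.

    Keeps one saved RNG state per named mode. Entering a mode swaps that
    state in; leaving it saves the advanced state back (so the mode's
    random stream continues next time) and restores the outer state.
    """

    def __init__(self):
        self._states = {}  # mode name -> saved random-module state

    def add_mode(self, name: str, seed: int) -> None:
        # Capture the state produced by `seed` without disturbing the
        # caller's current RNG state.
        outer = random.getstate()
        random.seed(seed)
        self._states[name] = random.getstate()
        random.setstate(outer)

    @contextlib.contextmanager
    def mode(self, name: str):
        # Swap in the mode's state for the duration of the `with` block.
        outer = random.getstate()
        random.setstate(self._states[name])
        try:
            yield
        finally:
            self._states[name] = random.getstate()
            random.setstate(outer)


# Two ranks seeded identically draw identical dropout masks; seeding the
# mode differently per rank would give independent masks instead.
rank0 = SeedManager()
rank1 = SeedManager()
rank0.add_mode("tp", 1234)
rank1.add_mode("tp", 1234)
with rank0.mode("tp"):
    mask_a = [random.random() for _ in range(4)]
with rank1.mode("tp"):
    mask_b = [random.random() for _ in range(4)]
```

The context-manager form keeps the swap/restore pair in one place, so user code cannot forget to restore the global RNG state after sampling a mask.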
_ops          [doc] Fix typo under colossalai and doc(#3618)                              2023-04-26 11:38:43 +08:00
layer         [shardformer] add Dropout layer support different dropout pattern (#3856)   2023-07-04 16:05:01 +08:00
loss          [nfc] fix typo colossalai/nn (#3887)                                        2023-06-05 16:04:27 +08:00
lr_scheduler  [NFC] polish colossalai/nn/lr_scheduler/linear.py code style (#1716)        2022-10-19 12:20:51 +08:00
metric        [NFC] polish colossalai/nn/metric/_utils.py code style (#1727)              2022-10-19 12:20:51 +08:00
optimizer     [nfc]fix typo colossalai/pipeline tensor nn (#3899)                         2023-06-06 14:07:36 +08:00
parallel      [nfc] fix typo colossalai/nn (#3887)                                        2023-06-05 16:04:27 +08:00
__init__.py   [kernel] added jit warmup (#1792)                                           2022-11-08 16:22:23 +08:00
init.py       [NFC] polish colossalai/nn/init.py code style (#1292)                       2022-07-13 10:51:55 +08:00