ColossalAI/colossalai/nn
Latest commit: 015af592f8 by Frank Lee — [shardformer] integrated linear 1D with dtensor (#3996), 1 year ago
_ops          [doc] Fix typo under colossalai and doc (#3618)          2 years ago
layer         [shardformer] integrated linear 1D with dtensor (#3996)  1 year ago
loss          [nfc] fix typo colossalai/nn (#3887)                     2 years ago
lr_scheduler
metric
optimizer     [nfc] fix typo colossalai/pipeline tensor nn (#3899)     2 years ago
parallel      [nfc] fix typo colossalai/nn (#3887)                     2 years ago
__init__.py   [kernel] added jit warmup (#1792)                        2 years ago
init.py