mirror of https://github.com/hpcaitech/ColossalAI
To be compatible with the new change in the Transformers library, where a new argument `padding_mask` was added to the forward function of the attention layer: https://github.com/huggingface/transformers/pull/25598 (see the compatibility sketch after the listing below).
| Name |
|---|
| dataset |
| model |
| tokenizer |
| utils |
| __init__.py |
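
The modules listed above build on the Transformers attention layers, so any patched attention forward has to tolerate the extra `padding_mask` keyword introduced by the PR referenced above. Below is a minimal sketch of that compatibility pattern, not this repo's actual code; `PatchedAttention`, its shapes, and the placeholder computation are illustrative assumptions.

```python
from typing import Optional

import torch
import torch.nn as nn


class PatchedAttention(nn.Module):
    """Stand-in attention module showing the compatibility pattern."""

    def __init__(self, hidden_size: int = 64):
        super().__init__()
        self.proj = nn.Linear(hidden_size, hidden_size)

    def forward(
        self,
        hidden_states: torch.Tensor,
        attention_mask: Optional[torch.Tensor] = None,
        # New keyword from the Transformers change; accepted so newer callers
        # that pass it do not raise a TypeError.
        padding_mask: Optional[torch.LongTensor] = None,
        **kwargs,  # tolerate any further keyword additions as well
    ) -> torch.Tensor:
        # Placeholder computation standing in for the real attention math.
        return self.proj(hidden_states)


# Callers that pass padding_mask (as newer Transformers versions do) now work,
# and older callers that omit it keep working too.
attn = PatchedAttention()
x = torch.randn(1, 4, 64)
out = attn(x, padding_mask=torch.ones(1, 4, dtype=torch.long))
```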