mirror of https://github.com/hpcaitech/ColossalAI

Directory contents (latest commit: "remove flash-attn dep; rm padding llama; revise infer requirements; move requirements out of module"):

- __init__.py
- nopadding_llama.py