mirror of https://github.com/hpcaitech/ColossalAI
86cf6aed5b
* revise shardformer readme (#4246)
* [example] add llama pretraining (#4257)
* [NFC] polish colossalai/communication/p2p.py code style

Co-authored-by: Jianghai <72591262+CjhHa1@users.noreply.github.com>
Co-authored-by: binmakeswell <binmakeswell@gmail.com>
Co-authored-by: Qianran Ma <qianranm@luchentech.com>
_C
_analyzer
amp
auto_parallel
autochunk
booster
builder
checkpoint_io
cli
cluster
communication
context
device
engine
fx
interface
kernel
lazy
logging
nn
pipeline
registry
shardformer
tensor
testing
trainer
utils
zero
__init__.py
constants.py
core.py
global_variables.py
initialize.py