mirror of https://github.com/hpcaitech/ColossalAI
#!/usr/bin/env bash

# Model checkpoint: defaults to the HuggingFace repo, or pass a local path/repo as $1.
PRETRAINED=${1:-"hpcai-tech/grok-1"}

# Launch tensor-parallel inference across 8 GPUs on this node. Each string after
# --text is a test prompt; the second one is Chinese ("Translate the following
# sentence into English. I like watching movies and reading books.").
torchrun --standalone --nproc_per_node 8 inference_tp.py --pretrained "$PRETRAINED" \
    --max_new_tokens 100 \
    --text "The company's annual conference, featuring keynote speakers and exclusive product launches, will be held at the Los Angeles Convention Center from October 20th to October 23rd, 2021. Extract the date mentioned in the above sentence." \
    "将以下句子翻译成英语。 我喜欢看电影和读书。" \
    "All books have the same weight, 10 books weigh 5kg, what is the weight of 2 books?"
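The `${1:-"hpcai-tech/grok-1"}` line uses bash's default-value parameter expansion: the first positional argument is used if given, otherwise the quoted default. A minimal sketch of that behavior in isolation (the `pick_model` function name is illustrative, not part of the script above):

# Demonstrates the ${1:-default} expansion used by the script:
# if the first argument is unset or empty, the default is substituted.
pick_model() {
  local PRETRAINED=${1:-"hpcai-tech/grok-1"}
  echo "$PRETRAINED"
}

pick_model                   # prints the default: hpcai-tech/grok-1
pick_model /path/to/ckpt     # prints the explicit argument: /path/to/ckpt

This is why the script can be invoked either as `bash run_inference_tp.sh` or with a local checkpoint path as its first argument.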