Mistral
NOTE
To make your life easier, run these commands from the recipe directory (here recipes/mistral).
Retrieve and convert model
Set environment variables
export EOLE_MODEL_DIR=<where_to_store_models>
export HF_TOKEN=<your_hf_token>
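Optionally, make sure the storage directory exists before converting (a small convenience step; depending on your setup the converter may create it for you):
mkdir -p ${EOLE_MODEL_DIR}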
Download and convert model
eole convert HF --model_dir mistralai/Mistral-7B-v0.3 --output ${EOLE_MODEL_DIR}/mistral-7b-v0.3 --token $HF_TOKEN
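To check that the conversion finished and the checkpoint landed where the inference config expects it, you can list the output directory (the exact file names depend on the eole version):
ls ${EOLE_MODEL_DIR}/mistral-7b-v0.3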
Inference
Write test prompt to text file
echo -e "What are some nice places to visit in France?" | sed ':a;N;$!ba;s/\n/｟newline｠/g' > test_prompt.txt
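The sed expression joins a multi-line prompt into a single line, replacing literal newlines with the ｟newline｠ placeholder, since the prediction input is read one example per line. To verify the prompt file before running inference:
cat test_prompt.txt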
Run inference
eole predict -c mistral-7b-awq-gemm-inference.yaml -src test_prompt.txt -output test_output.txt
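Once prediction finishes, the generated text is written to the file passed via -output, so you can inspect it directly:
cat test_output.txt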