Mixtral
NOTE
To make your life easier, run these commands from the recipe directory (here recipes/mixtral).
Retrieve and convert model
Set environment variables
export EOLE_MODEL_DIR=<where_to_store_models>
export HF_TOKEN=<your_hf_token>
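As an optional sanity check before continuing, you can confirm both variables are set in your shell (this is just a convenience, not part of the recipe):

# Both variables should print non-empty values
echo "EOLE_MODEL_DIR=${EOLE_MODEL_DIR}"
echo "HF_TOKEN is ${HF_TOKEN:+set}"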
Download and convert model
eole convert HF --model_dir TheBloke/Mixtral-8x7B-Instruct-v0.1-AWQ --output ${EOLE_MODEL_DIR}/mixtral-8x7b-instruct-v0.1-awq --token $HF_TOKEN
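Once the conversion finishes, the converted model is written under the --output path given above. A simple listing (illustrative; exact file names depend on your eole version) confirms the directory was created:

# Check that the converted model directory exists and is populated
ls -lh ${EOLE_MODEL_DIR}/mixtral-8x7b-instruct-v0.1-awq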
Inference
Write test prompt to text file
echo -e "What are some nice places to visit in France?" | sed ':a;N;$!ba;s/\n/｟newline｠ /g' > test_prompt.txt
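The sed expression only matters for multi-line prompts: it joins all lines into one, replacing each line break with the ｟newline｠ placeholder so the prompt is passed as a single source segment. For example, a hypothetical two-line prompt would be flattened like this:

# Hypothetical multi-line prompt; real line breaks become ｟newline｠ placeholders
echo -e "What are some nice places to visit in France?\nList three, one per line." | sed ':a;N;$!ba;s/\n/｟newline｠ /g' > test_prompt.txt
# test_prompt.txt now contains a single line:
# What are some nice places to visit in France?｟newline｠ List three, one per line.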
Run inference
eole predict -c mixtral-inference-awq.yaml -src test_prompt.txt -output test_output.txt
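Predictions are written to test_output.txt. If the model's answer contains ｟newline｠ placeholders, you can convert them back to real line breaks when reading the output (a small convenience sketch assuming GNU sed, not part of the recipe):

# Replace ｟newline｠ placeholders with actual newlines for readability
sed 's/｟newline｠/\n/g' test_output.txt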