AX-M1 on a Rock 5b+

Hi there,

I recently received an AX-M1 M.2 module and of course wanted to give it a try. I'm running the Radxa OS image as documented here.

Although I was able to install the driver successfully, I failed to install the huggingface-cli. According to the AX-M1 documentation, the huggingface-cli should be installed like this:

pip3 install -U "huggingface_hub[cli]"
This didn't work (error: externally-managed-environment). So I tried pipx, which got me one step further, but I still can't start the huggingface-cli.

I wasn't able to find anything in this forum or on Discord. Since the Rock 5b+ is officially supported and tested, I assume that at some point someone has successfully installed the AX-M1 on a Rock 5b+ and can maybe give me a fresh idea to solve my problem?

domidum

Hi, domidum

You need to install the Python package in a virtual environment.

Please have a look at the following link:
https://docs.radxa.com/en/rock5/rock5b/app-development/venv_usage
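
For example, something like this should get the huggingface-cli working (the path ~/hf-venv is just a placeholder, you can put the venv anywhere you like):

python3 -m venv ~/hf-venv
source ~/hf-venv/bin/activate
pip3 install -U "huggingface_hub[cli]"
huggingface-cli --help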

best,
Morgan

Hi, Morgan

Thank you very much, that link indeed solved my problem. I assume that, in general, running LLMs on the AX-M1 always has to happen within such an environment?

Kind regards
Dominic

Hi, Dominic

If you want to use a Python package, you need to run the Python script inside the virtual environment.
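
For example (my_script.py stands in for whatever script you want to run):

source ~/hf-venv/bin/activate
python3 my_script.py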

best,
Morgan

Hi Morgan,

thanks for your answer.
According to what I found in the Radxa documentation and in AX-LLM, it's all Python based.

What other possibilities exist to run models on an AX-M1 besides the Python package based approach? On https://huggingface.co/AXERA-TECH I saw vLLM, but I'm not quite sure whether the AX8850 is officially supported?
Sorry for my lacking knowledge, it's all still pretty new to me.

kind regards
Domidum

Hi, domidum

AXCL supports both C++ and Python, so you can also run your model through C++.
As for the AX-M1, it does not support vLLM; did you mean VLM instead?

best,
Morgan