The vLLM image in OPEA doesn't work due to a known bug in vLLM (HabanaAI/vllm-fork#462).

The documented steps for building the vLLM image say to pin setuptools to version 69.5.1, but the `RUN pip install setuptools` line that the suggested sed command targets does not exist in Dockerfile.hpu:

`sed -i "s/RUN pip install setuptools/RUN pip install setuptools==69.5.1/g" docker/Dockerfile.hpu`

Please clean up the documentation so that the build goes smoothly, and push a working image to the OPEA Harbor registry.
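For reference, a minimal local workaround sketch is below. The repo URL is the HabanaAI fork named in the bug report, but the insertion point for the setuptools pin and the image tag are assumptions and may need adjusting to match the actual Dockerfile.hpu layout (e.g. multi-stage builds or a different dependency-install step).

```sh
# Minimal sketch of a local workaround, assuming the HabanaAI/vllm-fork layout
# with docker/Dockerfile.hpu; insertion point and image tag are illustrative.
git clone https://github.com/HabanaAI/vllm-fork.git
cd vllm-fork

# The documented sed expects a "RUN pip install setuptools" line that is not
# present, so add an explicit pin instead. Inserting right after the first FROM
# line is a guess; move the pin to wherever dependencies are installed if needed.
if ! grep -q "setuptools==69.5.1" docker/Dockerfile.hpu; then
    sed -i '0,/^FROM/s/^FROM.*/&\nRUN pip install setuptools==69.5.1/' docker/Dockerfile.hpu
fi

docker build -f docker/Dockerfile.hpu -t opea/vllm-hpu:local .
```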