
[Feature] Llamaindex integration #1112

Open
joshuayao opened this issue Nov 11, 2024 · 0 comments · May be fixed by run-llama/llama_index#16666
Labels
feature New feature or request
Milestone

Comments

@joshuayao
Collaborator

joshuayao commented Nov 11, 2024

Priority

P1-Stopper

OS type

Ubuntu

Hardware type

Xeon-GNR

Running nodes

Single Node

Description

Integrate OPEA into llama-index: add a llama-index wrapper for LLMs hosted by OPEA's GenAIComps library.

Owner: logan-markewich
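The requested wrapper did not exist when this issue was opened, so as a rough sketch only: the shape of such a wrapper might look like the class below, assuming the OPEA GenAIComps LLM microservice exposes an OpenAI-compatible chat-completions endpoint. The class name, port, endpoint path, and model name here are all hypothetical, not the API of the eventual llama-index integration (see the linked PR run-llama/llama_index#16666 for the actual implementation).

```python
# Hypothetical sketch of an OPEA LLM wrapper, stdlib only.
# Payload fields follow the OpenAI chat-completions schema, which OPEA's
# GenAIComps LLM microservices are assumed to mirror.
import json
from urllib import request


class OPEALLMSketch:
    """Minimal stand-in for a llama-index-style LLM wrapper around an OPEA endpoint."""

    def __init__(self, api_base: str, model: str):
        self.api_base = api_base.rstrip("/")
        self.model = model

    def _build_payload(self, prompt: str, max_tokens: int = 128) -> dict:
        # Wrap a plain prompt in a single-turn chat request.
        return {
            "model": self.model,
            "messages": [{"role": "user", "content": prompt}],
            "max_tokens": max_tokens,
        }

    def complete(self, prompt: str) -> str:
        # POST the request to the (assumed) OpenAI-compatible endpoint.
        req = request.Request(
            f"{self.api_base}/v1/chat/completions",
            data=json.dumps(self._build_payload(prompt)).encode(),
            headers={"Content-Type": "application/json"},
        )
        with request.urlopen(req) as resp:
            body = json.load(resp)
        return body["choices"][0]["message"]["content"]


# Usage (no network call made here; only the request payload is built):
llm = OPEALLMSketch(api_base="http://localhost:9009", model="Intel/neural-chat-7b-v3-3")
payload = llm._build_payload("Hello")
```

A real llama-index integration would instead subclass `llama_index.core.llms.LLM` and implement its `complete`/`chat` interface, but the request/response plumbing against the OPEA service would resemble the above.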

@joshuayao joshuayao added the feature New feature or request label Nov 11, 2024
@joshuayao joshuayao added this to the v1.2 milestone Nov 11, 2024
@joshuayao joshuayao added this to OPEA Nov 11, 2024
@joshuayao joshuayao linked a pull request Nov 25, 2024 that will close this issue
@joshuayao joshuayao moved this to In progress in OPEA Nov 25, 2024