docs: change quickstart to ollama #249
base: main
Conversation
docs/vectorizer-quick-start.md
Outdated
```yaml
dockerfile: Dockerfile
context: ../pgai/projects/extension
target: pgai-test-db
```
The images need to be exchanged for the real ones once they work. This is how I tested it, though.
done
docs/vectorizer-quick-start.md
Outdated
Before we start, we need to tell Ollama to download an embedding model so we can use it with pgai. For this example we will use the `nomic-embed-text` model.
To download it into the container, simply run:
```
docker-compose exec ollama ollama pull nomic-embed-text
```
Suggested change:
```diff
-docker-compose exec ollama ollama pull nomic-embed-text
+docker compose exec ollama ollama pull nomic-embed-text
```
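As background on why the pull is needed: once the model is present, the Ollama container serves it over an HTTP API on port 11434, which is what pgai's embedding functions call. A minimal sketch of the request shape, assuming Ollama's documented `/api/embeddings` endpoint and the `ollama` compose service name used in this quickstart:

```python
import json

# Body of the request pgai effectively sends on our behalf:
#   POST http://ollama:11434/api/embeddings
# "ollama" is the docker compose service name; 11434 is Ollama's default port.
payload = {
    "model": "nomic-embed-text",  # the model pulled above
    "prompt": "good food",        # the text to embed
}

print(json.dumps(payload))
```

The response carries an `"embedding"` array of floats, which pgai can then store as a pgvector value.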
done
docs/vectorizer-quick-start.md
Outdated
@@ -123,20 +121,20 @@ To create and run a vectorizer, then query the auto-generated embeddings created
```sql
SELECT
    chunk,
-    embedding <=> ai.openai_embed('text-embedding-3-small', 'good food', dimensions=>768) as distance
+    embedding <=> ai.ollama('nomic-embed-text', 'good food', host => 'http://ollama:11434') as distance,
```
Suggested change:
```diff
-embedding <=> ai.ollama('nomic-embed-text', 'good food', host => 'http://ollama:11434') as distance,
+embedding <=> ai.ollama_embed('nomic-embed-text', 'good food', host => 'http://ollama:11434') as distance
```
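Putting the suggestion in context, the full similarity query would look roughly like this (a sketch: the `FROM`/`ORDER BY` clauses and the `my_contents_embedding` view name are hypothetical stand-ins for whatever the quickstart's vectorizer creates; `chunk` and `embedding` are the columns shown in the diff above):

```sql
SELECT
    chunk,
    embedding <=> ai.ollama_embed('nomic-embed-text', 'good food', host => 'http://ollama:11434') AS distance
FROM my_contents_embedding  -- hypothetical name: use the embedding view your vectorizer created
ORDER BY distance
LIMIT 5;
```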
done
```diff
-docker compose up -d db
+docker compose up -d
```
Note: this still won't work, because the vectorizer worker isn't robust to failures when connecting to the DB, etc. I've opened a PR which fixes this: #263
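For anyone following along, robustness of that kind usually means retrying the initial DB connection with backoff instead of crashing on the first failure; a hypothetical sketch of the idea (not the actual code from #263):

```python
import time

def connect_with_retry(connect, attempts=5, base_delay=1.0):
    """Call `connect` until it succeeds, backing off exponentially.

    `connect` stands in for whatever opens the worker's DB connection;
    re-raises the last error if every attempt fails.
    """
    for attempt in range(attempts):
        try:
            return connect()
        except Exception as exc:
            if attempt == attempts - 1:
                raise
            delay = base_delay * (2 ** attempt)
            print(f"connect failed ({exc}); retrying in {delay:.2f}s")
            time.sleep(delay)

# Example with a connection that fails twice, then succeeds:
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("db not ready")
    return "connection"

print(connect_with_retry(flaky, base_delay=0.01))
```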
done
…quickstart-guide-to-use-ollama
The Docker images don't support this yet, so it's probably best to merge after the release.