Chunking support for XMLScraperGraph #800
Comments
@VinciGit00 is this something we are looking to support?
Yes please, @madguy02. If you can help us, we would be glad.
Can you assign it to me, @VinciGit00?
Hi @madguy02, I assigned it.
So, can you tell me more about the model, @Etherealspringfall? How many tokens does it support? Generally, 50,000 tokens is well within limits for GPT-3.5 or GPT-4, and the error says up to 129,024 is supported. I tried the code out with GPT-4 mini and it does not give me the error above, so I think it has to do with the token limits of the model. Moreover, AFAIK, XMLScraperGraph is used to scrape .xml files; I'm not sure whether .html files are supported in this case. @Etherealspringfall, can you run this code and give me the number of tokens in the text you are sending:
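The exact snippet referenced above is not shown; a minimal token-count sketch along those lines, assuming the tiktoken library and its cl100k_base encoding (the page.html path is a hypothetical placeholder):

```python
# Token-count sketch (assumes the tiktoken library; cl100k_base is the
# encoding used by GPT-3.5/GPT-4 class models, adjust for your model).
import tiktoken

def count_tokens(text: str, encoding_name: str = "cl100k_base") -> int:
    encoding = tiktoken.get_encoding(encoding_name)
    return len(encoding.encode(text))

# Hypothetical local file; substitute the actual source being scraped.
with open("page.html", "r", encoding="utf-8") as f:
    source = f.read()

print(f"Tokens in source: {count_tokens(source)}")
```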
That graph is deprecated.
Hi, I'm encountering a length limit when using a third-party model to extract local HTML. Can chunking support be added to XMLScraperGraph?
code:
error:
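Until chunking is supported in the graph itself, one possible workaround is to pre-split the oversized source and process each piece separately. A minimal sketch, assuming LangChain's RecursiveCharacterTextSplitter rather than anything built into XMLScraperGraph; the file path and chunk sizes are illustrative:

```python
# Workaround sketch: split an oversized HTML/XML source into token-bounded
# chunks before handing each chunk to the scraper graph or LLM.
# Assumes the langchain-text-splitters package; names come from that library,
# not from XMLScraperGraph itself.
from langchain_text_splitters import RecursiveCharacterTextSplitter

splitter = RecursiveCharacterTextSplitter.from_tiktoken_encoder(
    encoding_name="cl100k_base",  # assumption: GPT-3.5/GPT-4 style tokenizer
    chunk_size=40_000,            # stay safely under the model's context limit
    chunk_overlap=500,            # overlap so content spanning a boundary survives
)

# Hypothetical local file; substitute the actual HTML/XML being scraped.
with open("page.html", "r", encoding="utf-8") as f:
    source = f.read()

chunks = splitter.split_text(source)
print(f"{len(chunks)} chunks; largest is {max(len(c) for c in chunks)} characters")

# Each chunk could then be passed through the graph in turn and the
# per-chunk answers merged afterwards.
```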