Replies: 3 comments 1 reply
-
Also need this feature.
-
From a plugin perspective, this should be fairly simple to implement (see https://github.com/just1984/All_Tabs_2_TXT_Plugin). I am only evaluating Tabby at the moment and will try it out soon, but as far as I understand, Tabby lets you connect to a repository and then builds a RAG database from tree-sitter queries. It does not send the context of the open tabs along with the LLM prompt (as GitHub Copilot does). This could lead to quite poor completions when Tabby is not connected to a repository. Please correct me if I am wrong.
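To illustrate what "sending the context from the open tabs" could mean in practice, here is a minimal sketch of assembling open-tab contents into a prompt prefix. All names here (`OpenTab`, `buildTabContext`, the character budget) are illustrative assumptions, not Tabby's or any plugin's actual API:

```typescript
// Sketch: assemble the contents of open editor tabs into a prompt prefix.
// All identifiers are hypothetical; a real plugin would pull tab contents
// from the IDE's API (e.g. the editor's open-documents list).

interface OpenTab {
  path: string;     // file path shown in the tab
  content: string;  // current buffer text
}

// Concatenate tab contents under a rough character budget.
// A production implementation would count model tokens instead of characters
// and rank tabs by relevance to the cursor position.
function buildTabContext(tabs: OpenTab[], maxChars = 4000): string {
  const parts: string[] = [];
  let used = 0;
  for (const tab of tabs) {
    const snippet = `// File: ${tab.path}\n${tab.content}\n`;
    if (used + snippet.length > maxChars) break;
    parts.push(snippet);
    used += snippet.length;
  }
  return parts.join("\n");
}
```

The resulting string would be prepended to the completion prompt, giving the model cross-file context even without a repository-backed RAG index.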
-
Add to idea list #2674.
-
Hi Tabby team,
I am currently trying Tabby with the CodeLlama-13B model. Generating code from comments in the current file works well, and the generation speed is also satisfactory. However, the results are less satisfactory in more complex code-completion and unit-test-generation scenarios: the IDE plugin does not seem to read the context of nearby tabs. Does the Tabby team plan to add this feature? It would be useful for small and medium-sized models and above.