new: langchain plugin #44
Conversation
```python
# self._adj._fetch_all()
# TODO: what would this look like?

# NOTE: OUT OF SERVICE
# def chat(self, prompt: str) -> str:
#     if self.__qa_chain is None:
#         if not self.graph_exists_in_db:
#             return "Could not initialize QA chain: Graph does not exist"

# def push(self) -> None:
#     TODO: what would this look like?
```
What about those two TODOs?
Can you share more information on those two methods? What are they for?
Yeah, these are placeholders for a potential feature later on, completely unrelated to `chat()`:

- `pull` proposes the idea of fetching the "latest" state of an ArangoDB Graph, which would introduce the concept of cross-collaboration on the same ArangoDB Graph. I don't have an implementation in mind yet, so it's purely hypothetical.
- `push` would be more towards supporting transactions in an nxadb Graph. So something like this:
```python
import nx_arangodb as nxadb

G = nxadb.Graph(name="MyGraph", use_transactions=True)

# Only modifies the cache:
G.nodes[1]['foo'] = 'bar'
G.add_node(2, foo='buz')
G.add_edge(1, 2)

G.push()  # Sends the last 3 operations to the DB
LGTM in general. Before we merge, we need to discuss the two introduced TODOs.
Bringing back the LangChain plugin as an optional feature.

The default LLM used is GPT-4 via the `ChatOpenAI` class of the `langchain_openai` module, but users are free to pass their own model via the `llm` parameter to the `chat()` function.
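The default-with-override pattern described above can be sketched as follows. This is a self-contained illustration, not the plugin's actual code: the stub LLM and the standalone `chat()` function are hypothetical stand-ins so the example runs without an OpenAI key.

```python
# Hypothetical sketch of the chat() signature: llm defaults to the
# built-in model when omitted, and any callable can be passed instead.
from typing import Callable, Optional

def chat(prompt: str, llm: Optional[Callable[[str], str]] = None) -> str:
    if llm is None:
        # The real plugin would construct ChatOpenAI(model="gpt-4") here;
        # a stub stands in so this sketch is self-contained.
        llm = lambda p: f"[gpt-4] {p}"
    return llm(prompt)

print(chat("hello"))                 # prints "[gpt-4] hello"
print(chat("hello", llm=str.upper))  # prints "HELLO"
```

The point is only the shape of the API: callers get a sensible default model but can inject any compatible LLM object through the `llm` parameter.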