fix: ollama local and llama local #521
Merged
Conversation
Contributor
This is good and was working locally, however there are a few issues:
Contributor
Author
Which external provider? OpenAI? Where did you define it, in the .env file or the character? I tested the different variations with OLLAMA.
Contributor
I tested with anthropic and it still initializes OLLAMA. The env is within the character file.
Contributor
Author
I updated defaultCharacter.ts with ANTHROPIC and it didn't try to download the llamalocal model; it attempts to call the anthropic API. Do you have anything else related to llama defined in your .env? Are you on the latest code?
Contributor
In the logs it tells me it has initialised.
Ollama fix
fix: ollamaModel already defined
Ollama fix

Ollama local broken and local llama issues
Relates partially to issue #443
Risks
Low risk: affects only the Llama and Ollama providers (llama.ts).
Background
Eliza is supposed to work with local Ollama, but this was broken in several ways: the wrong embeddings were being loaded, and the local llama model was being downloaded even when local Ollama was configured.
The model provider can be selected in the character file by setting ModelProviderName to LLAMALOCAL or OLLAMA, e.g. modelProvider: ModelProviderName.LLAMALOCAL (see the sketch below).
The default Ollama model can be set in the .env file via the OLLAMA_MODEL environment variable.
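For illustration, a minimal character sketch in TypeScript, assuming the `Character` type and `ModelProviderName` enum are exported from the core package; the agent name, import path, and omitted fields are illustrative placeholders, not taken from this PR:

```typescript
import { type Character, ModelProviderName } from "@ai16z/eliza";

// Minimal character sketch: only modelProvider matters for provider
// selection here; the name and any omitted required fields are
// illustrative placeholders.
export const character = {
    name: "LocalOllamaAgent",
    modelProvider: ModelProviderName.OLLAMA, // or ModelProviderName.LLAMALOCAL
    // ...remaining character fields
} as Character;
```

The model that Ollama should serve can then be pointed at via the .env file, e.g. `OLLAMA_MODEL=mistral` (the model name is illustrative).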
What does this PR do?
What kind of change is this?
Bug fixes and improvements
Documentation changes needed?
No, but we might want to document the VERBOSE=true flag, which enables more detailed logging for debugging.
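For reference, enabling it is a one-line .env entry (the flag name is taken from the note above):

```
# Enable more detailed logging for debugging
VERBOSE=true
```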
Testing
Tested with each of the providers, LLAMALOCAL and OLLAMA.
Where should a reviewer start?
Detailed testing steps
For local Ollama, you will need to install the Ollama software and pull the models that you will be using, as sketched below.
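A sketch of that setup using the standard Ollama CLI (the model name is illustrative; pull whichever model your character uses):

```sh
# Pull the model the agent will use (model name is illustrative)
ollama pull mistral

# Start the Ollama server; by default it listens on http://localhost:11434
ollama serve
```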