update readme with local instructions #172


Merged · 1 commit · Aug 5, 2024
15 changes: 14 additions & 1 deletion README.md
@@ -41,7 +41,7 @@ kubectl krew install kubectl-ai/kubectl-ai

- [OpenAI API key](https://platform.openai.com/overview)
- [Azure OpenAI Service](https://aka.ms/azure-openai) API key and endpoint
- OpenAI API-compatible endpoint (such as [AIKit](https://github.com/sozercan/aikit) or [LocalAI](https://localai.io/))
- [OpenAI API-compatible endpoint](#set-up-a-local-openai-api-compatible-endpoint) (such as [AIKit](https://github.com/sozercan/aikit) or [LocalAI](https://localai.io/))

For OpenAI, Azure OpenAI, or an OpenAI API-compatible endpoint, you can use the following environment variables:

@@ -59,6 +59,19 @@ Azure OpenAI service does not allow certain characters, such as `.`, in the deployment name.

```shell
export AZURE_OPENAI_MAP="gpt-3.5-turbo=my-deployment"
```

### Set up a local OpenAI API-compatible endpoint

If you don't have OpenAI API access, you can set up a local OpenAI API-compatible endpoint with [AIKit](https://github.com/sozercan/aikit) on your local machine, even without any GPUs! For more information, see the [AIKit documentation](https://sozercan.github.io/aikit/).

```shell
docker run -d --rm -p 8080:8080 ghcr.io/sozercan/llama3.1:8b
export OPENAI_ENDPOINT="http://localhost:8080/v1"
export OPENAI_DEPLOYMENT_NAME="llama-3.1-8b-instruct"
export OPENAI_API_KEY="n/a"
```
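
Optionally, you can confirm the local endpoint responds before pointing `kubectl-ai` at it. This is a minimal sketch, assuming the container is running and exposes the standard OpenAI-compatible `/v1/models` route on the port mapped above:

```shell
# Optional sanity check: list the models served by the local endpoint.
# Assumes the OpenAI-compatible /v1/models route; adjust the port if you
# mapped something other than 8080 in the docker run command above.
curl http://localhost:8080/v1/models
```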

After setting the environment variables as above, you can use `kubectl-ai` as usual.
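
For example, an invocation might look like the sketch below, assuming the krew-installed plugin is invoked as `kubectl ai`; the prompt is illustrative and the generated manifest depends on the local model:

```shell
# Illustrative prompt; output varies with the model served by the local endpoint.
kubectl ai "create an nginx deployment with 3 replicas"
```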

### Flags and environment variables

- `--require-confirmation` flag or `REQUIRE_CONFIRMATION` environment variable can be set to prompt the user for confirmation before applying the manifest. Defaults to true.