- macOS: Download from https://ollama.com/download/Ollama.dmg.
- Windows: Download from https://ollama.com/download/OllamaSetup.exe.
- Linux: Run `curl -fsSL https://ollama.com/install.sh | sh`, or refer to the official Ollama installation guide.
- Docker: The official Ollama Docker image `ollama/ollama` is available on Docker Hub.
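For the Docker route, a minimal sketch of starting the server, following Ollama's documented Docker quickstart (assuming the default API port and a named volume for model storage):

```shell
# Pull and start the official Ollama image, exposing the default API port 11434
docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
```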
Because updates to the official Ollama release may occasionally introduce incompatibilities, we maintain an additional branch to support stable execution of the MiniCPM-V series models.
Environment requirements:
- go version 1.22 or above
- cmake version 3.24 or above
- C/C++ compiler, e.g. Clang on macOS, TDM-GCC (Windows amd64) or llvm-mingw (Windows arm64), GCC/Clang on Linux
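Before building, you can sanity-check that the required toolchain is on your PATH (exact version output varies by platform):

```shell
go version        # should report go1.22 or above
cmake --version   # should report 3.24 or above
cc --version      # confirms a C/C++ compiler is available
```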
Clone the OpenBMB Ollama fork:

```shell
git clone https://github.com/tc-mb/ollama.git
cd ollama
git checkout MIniCPM-V
```

Then build and run Ollama from the root directory of the repository:

```shell
go build .
./ollama serve
```

The MiniCPM-V 4 model can be used directly:

```shell
./ollama run openbmb/minicpm-v4
```

Separate the input prompt and the image path with a space:

```
What is in the picture? xx.jpg
```
You can also call the local Ollama REST API, for example from Python:

```python
import base64
import requests

def generate(query, image_path):
    # Convert the image file to a base64-encoded string
    with open(image_path, 'rb') as image_file:
        encoded_string = base64.b64encode(image_file.read()).decode('utf-8')
    data = {
        "model": "minicpm-v4",
        "prompt": query,
        "stream": False,
        "images": [encoded_string]  # The 'images' list can hold multiple base64-encoded images.
    }
    # Send the request to the local Ollama API
    url = "http://localhost:11434/api/generate"
    response = requests.post(url, json=data)
    return response
```

If the method above fails, please refer to the following guide.
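The request body format can be checked without a running server. A small sketch (using stand-in bytes rather than a real image file) shows that the `images` field carries plain base64 that the server decodes back to the original bytes:

```python
import base64
import json

# Stand-in bytes play the role of a real image file's contents.
fake_image_bytes = b"\x89PNG\r\n\x1a\n"  # the 8-byte PNG signature

encoded_string = base64.b64encode(fake_image_bytes).decode("utf-8")
data = {
    "model": "minicpm-v4",
    "prompt": "What is in the picture?",
    "stream": False,
    "images": [encoded_string],
}

# The payload must be JSON-serializable, and the image must round-trip.
body = json.dumps(data)
assert base64.b64decode(json.loads(body)["images"][0]) == fake_image_bytes
print("payload ok")
```

On a successful non-streaming call, the generated text is in the `response` field of the returned JSON, i.e. `response.json()["response"]`.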
- HuggingFace: https://huggingface.co/openbmb/MiniCPM-V-4-gguf
- ModelScope: https://modelscope.cn/models/OpenBMB/MiniCPM-V-4-gguf
Create and edit a Modelfile:

```shell
vim minicpmv4.Modelfile
```

The content of the Modelfile should be as follows:
```
FROM ./MiniCPM-V-4/model/Model-3.6B-Q4_K_M.gguf
FROM ./MiniCPM-V-4/mmproj-model-f16.gguf
TEMPLATE """{{- if .Messages }}{{- range $i, $_ := .Messages }}{{- $last := eq (len (slice $.Messages $i)) 1 -}}<|im_start|>{{ .Role }}{{ .Content }}{{- if $last }}{{- if (ne .Role "assistant") }}<|im_end|><|im_start|>assistant{{ end }}{{- else }}<|im_end|>{{ end }}{{- end }}{{- else }}{{- if .System }}<|im_start|>system{{ .System }}<|im_end|>{{ end }}{{ if .Prompt }}<|im_start|>user{{ .Prompt }}<|im_end|>{{ end }}<|im_start|>assistant{{ end }}{{ .Response }}{{ if .Response }}<|im_end|>{{ end }}"""
SYSTEM """You are a helpful assistant."""
PARAMETER top_p 0.8
PARAMETER num_ctx 4096
PARAMETER stop ["<|im_start|>","<|im_end|>"]
PARAMETER temperature 0.7
```
Parameter Descriptions:
| first FROM | second FROM | num_ctx |
|---|---|---|
| Path to your language GGUF model | Path to your vision (mmproj) GGUF model | Maximum context length |
```shell
./ollama create minicpm-v4 -f minicpmv4.Modelfile
```

In a new terminal window, run the model instance:

```shell
./ollama run minicpm-v4
```

Enter the prompt and the image path, separated by a space:

```
What is in the picture? xx.jpg
```