{"id":10281,"date":"2024-09-19T17:13:12","date_gmt":"2024-09-19T15:13:12","guid":{"rendered":"https:\/\/via-internet.de\/blog\/?p=10281"},"modified":"2025-09-16T10:11:51","modified_gmt":"2025-09-16T08:11:51","slug":"ollama-cookbook","status":"publish","type":"post","link":"https:\/\/via-internet.de\/blog\/2024\/09\/19\/ollama-cookbook\/","title":{"rendered":"Ollama Cookbook"},"content":{"rendered":"\n<h2 class=\"wp-block-heading\">Setting up VS Code Environment<\/h2>\n\n\n\n<h3 class=\"wp-block-heading\">Debugging<\/h3>\n\n\n\n<p class=\"wp-block-paragraph\">Create a file launch.json and add the following configuration:<\/p>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"json\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">\"configurations\": [\n        {\n            \"name\": \"Streamlit\",\n            \"type\": \"debugpy\",\n            \"request\": \"launch\",\n            \"module\": \"streamlit\",\n            \"args\": [\n                \"run\",\n                \"${file}\",\n                \"--server.port\",\n                \"2000\"\n            ]\n        }\n    ]<\/pre>\n\n\n\n<h2 class=\"wp-block-heading\">Ollama and Python<\/h2>\n\n\n\n<h3 class=\"wp-block-heading\">Examples<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li><a href=\"https:\/\/github.com\/ollama\/ollama-python\">Ollama Python Library<\/a><\/li>\n<li><a href=\"https:\/\/github.com\/ollama\/ollama-js\">Ollama JavaScript Library<\/a><\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">Getting Started<\/h3>\n\n\n\n<h4 class=\"wp-block-heading\">Python<\/h4>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"shell\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">pip install ollama<\/pre>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"python\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">import ollama\n\nresponse = ollama.chat(model='llama2', messages=[\n  {\n    'role': 'user',\n    'content': 'Why is the sky blue?',\n  },\n])\n\nprint(response['message']['content'])<\/pre>\n\n\n\n<h4 class=\"wp-block-heading\">JavaScript<\/h4>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"shell\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">npm install ollama<\/pre>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"javascript\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">import ollama from 'ollama'\n\nconst response = await ollama.chat({\n  model: 'llama2',\n  messages: [{ role: 'user', content: 'Why is the sky blue?' }],\n})\nconsole.log(response.message.content)<\/pre>\n\n\n\n<h3 class=\"wp-block-heading\">Use cases<\/h3>\n\n\n\n<p class=\"wp-block-paragraph\">Both libraries support Ollama\u2019s full set of features. 
Here are some examples in Python:<\/p>\n\n\n\n<h4 class=\"wp-block-heading\">Streaming<\/h4>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"python\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">import ollama\n\nmessages = [{'role': 'user', 'content': 'Why is the sky blue?'}]\n\nfor chunk in ollama.chat(model='mistral', messages=messages, stream=True):\n    print(chunk['message']['content'], end='', flush=True)<\/pre>\n\n\n\n<h4 class=\"wp-block-heading\">Multi-modal<\/h4>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"python\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">with open('image.png', 'rb') as file:\n  response = ollama.chat(\n    model='llava',\n    messages=[\n      {\n        'role': 'user',\n        'content': 'What is strange about this image?',\n        'images': [file.read()],\n      },\n    ],\n  )\n\nprint(response['message']['content'])<\/pre>\n\n\n\n<h4 class=\"wp-block-heading\">Text Completion<\/h4>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"python\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">result = ollama.generate(\n  model='stable-code',\n  prompt='\/\/ A C function to reverse a string\\n',\n)\n\nprint(result['response'])<\/pre>\n\n\n\n<h4 class=\"wp-block-heading\">Creating custom models<\/h4>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"python\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">modelfile = '''\nFROM llama2\nSYSTEM You are Mario from Super Mario Bros.\n'''\n\nollama.create(model='example', modelfile=modelfile)<\/pre>\n\n\n\n<h4 class=\"wp-block-heading\">Custom client<\/h4>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"python\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">from ollama import Client\n\nclient = Client(host='my.ollama.host')<\/pre>\n\n\n\n<p class=\"wp-block-paragraph\">More examples are available in the GitHub repositories for the&nbsp;<a href=\"https:\/\/github.com\/ollama\/ollama-python\/tree\/main\/examples\">Python<\/a>&nbsp;and&nbsp;<a href=\"https:\/\/github.com\/ollama\/ollama-js\/tree\/main\/examples\">JavaScript<\/a>&nbsp;libraries.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Tips and Tricks<\/h2>\n\n\n\n<h3 class=\"wp-block-heading\">ollama serve<\/h3>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"shell\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">OLLAMA_ORIGINS=https:\/\/webml-demo.vercel.app OLLAMA_HOST=127.0.0.1:11435 ollama serve<\/pre>\n\n\n\n<h2 class=\"wp-block-heading\">Docker<\/h2>\n\n\n\n<h3 class=\"wp-block-heading\">Run Ollama in a Docker container<\/h3>\n\n\n\n<h4 class=\"wp-block-heading\">CPU only<\/h4>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"shell\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">docker run -d -v ollama:\/root\/.ollama -p 11434:11434 --name ollama ollama\/ollama<\/pre>\n\n\n\n<h4 class=\"wp-block-heading\">NVIDIA GPU<\/h4>\n\n\n\n<p class=\"wp-block-paragraph\">Install the NVIDIA Container Toolkit, then run Ollama inside a Docker container:<\/p>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"shell\" data-enlighter-theme=\"\" 
data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">docker run -d --gpus=all -v ollama:\/root\/.ollama -p 11434:11434 --name ollama ollama\/ollama<\/pre>\n\n\n\n<h4 class=\"wp-block-heading\">Run a model<\/h4>\n\n\n\n<p class=\"wp-block-paragraph\">Now you can run a model like Llama 2 inside the container.<\/p>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"shell\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">docker exec -it ollama ollama run llama2<\/pre>\n\n\n\n<h2 class=\"wp-block-heading\">OpenAI<\/h2>\n\n\n\n<h3 class=\"wp-block-heading\">OpenAI Compatibility<\/h3>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"python\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">from openai import OpenAI\n\nclient = OpenAI(\n    base_url='http:\/\/localhost:11434\/v1',\n    api_key='ollama',  # required, but unused\n)\n\nresponse = client.chat.completions.create(\n    model=\"llama2\",\n    messages=[\n        {\"role\": \"system\", \"content\": \"You are a helpful assistant.\"},\n        {\"role\": \"user\", \"content\": \"Who won the world series in 2020?\"},\n        {\"role\": \"assistant\", \"content\": \"The LA Dodgers won in 2020.\"},\n        {\"role\": \"user\", \"content\": \"Where was it played?\"}\n    ]\n)\n\nprint(response.choices[0].message.content)<\/pre>\n\n\n\n<h2 class=\"wp-block-heading\">Using Streamlit<\/h2>\n\n\n\n<h3 class=\"wp-block-heading\">LangChain<\/h3>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"python\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" 
data-enlighter-group=\"\">from langchain_community.llms import Ollama\nllm = Ollama(model=\"gemma2\")\nllm.invoke(\"Why is the sky blue?\")<\/pre>\n\n\n\n<h3 class=\"wp-block-heading\">LlamaIndex<\/h3>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"python\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">from llama_index.llms.ollama import Ollama\n\nllm = Ollama(model=\"gemma2\")\nllm.complete(\"Why is the sky blue?\")<\/pre>\n\n\n\n<h2 class=\"wp-block-heading\">LLMs and Models by Example<\/h2>\n\n\n\n<h3 class=\"wp-block-heading\">wizard-math<\/h3>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"shell\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">ollama run wizard-math 'Expand the following expression: $7(3y+2)$'<\/pre>\n\n\n\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"445\" src=\"https:\/\/via-internet.de\/blog\/wp-content\/uploads\/2024\/09\/Bildschirmfoto-2024-09-20-um-11.40.48-1024x445.png\" alt=\"\" class=\"wp-image-10284\" srcset=\"https:\/\/via-internet.de\/blog\/wp-content\/uploads\/2024\/09\/Bildschirmfoto-2024-09-20-um-11.40.48-1024x445.png 1024w, https:\/\/via-internet.de\/blog\/wp-content\/uploads\/2024\/09\/Bildschirmfoto-2024-09-20-um-11.40.48-300x131.png 300w, https:\/\/via-internet.de\/blog\/wp-content\/uploads\/2024\/09\/Bildschirmfoto-2024-09-20-um-11.40.48-768x334.png 768w, https:\/\/via-internet.de\/blog\/wp-content\/uploads\/2024\/09\/Bildschirmfoto-2024-09-20-um-11.40.48-1536x668.png 1536w, https:\/\/via-internet.de\/blog\/wp-content\/uploads\/2024\/09\/Bildschirmfoto-2024-09-20-um-11.40.48.png 1862w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><\/figure>\n\n\n\n<h2 
class=\"wp-block-heading\">Libraries for using Ollama<\/h2>\n\n\n\n<h3 class=\"wp-block-heading\">Using Ollama<\/h3>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"python\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">from ollama import Client<\/pre>\n\n\n\n<h3 class=\"wp-block-heading\">Using LangChain<\/h3>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"python\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">from langchain_community.llms import Ollama<\/pre>\n\n\n\n<h2 class=\"wp-block-heading\">Basic Operations<\/h2>\n\n\n\n<h3 class=\"wp-block-heading\">Import<\/h3>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"python\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">from ollama import Client<\/pre>\n\n\n\n<h3 class=\"wp-block-heading\">Create Client<\/h3>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"python\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">ollama_url = 'http:\/\/localhost:11434'\n\nclient = Client(host=ollama_url)<\/pre>\n\n\n\n<h3 class=\"wp-block-heading\">Get list of models<\/h3>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"python\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" 
data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">model_list = []\n\nmodels = client.list()\nfor model in models['models']:\n    model_list.append(model['name'])<\/pre>\n","protected":false},"excerpt":{"rendered":"<p>Setting up VS Code Environment Debugging Create a file launch.json and add this to configuration Ollama und Python Beispiele Getting Started Python JavaScript Use cases Both libraries support Ollama\u2019s full set of features. Here are some examples in Python: Streaming Multi-modal Text Completion Creating custom models Custom client More examples are available in the GitHub repositories for the&nbsp;Python&nbsp;and&nbsp;JavaScript&nbsp;libraries. Tipps und Tricks ollama serve Docker Run Ollama in Docker container CPU only Nvidia GPU Install the Nvidia container toolkit.Run Ollama inside a Docker container Run a model Now you can run a model like Llama 2 inside the container. OpenAI OpenAI Compatibility [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_crdt_document":"","_jetpack_memberships_contains_paid_content":false,"footnotes":""},"categories":[168],"tags":[],"class_list":["post-10281","post","type-post","status-publish","format-standard","hentry","category-ai"],"jetpack_featured_media_url":"","jetpack_sharing_enabled":true,"_links":{"self":[{"href":"https:\/\/via-internet.de\/blog\/wp-json\/wp\/v2\/posts\/10281","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/via-internet.de\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/via-internet.de\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/via-internet.de\/blog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/via-internet.de\/blog\/wp-json\/wp\/v2\/comments?post=10281"}],"version-history":[{"count":7,"href":"https:\/\/vi
a-internet.de\/blog\/wp-json\/wp\/v2\/posts\/10281\/revisions"}],"predecessor-version":[{"id":10430,"href":"https:\/\/via-internet.de\/blog\/wp-json\/wp\/v2\/posts\/10281\/revisions\/10430"}],"wp:attachment":[{"href":"https:\/\/via-internet.de\/blog\/wp-json\/wp\/v2\/media?parent=10281"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/via-internet.de\/blog\/wp-json\/wp\/v2\/categories?post=10281"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/via-internet.de\/blog\/wp-json\/wp\/v2\/tags?post=10281"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}