Working with local LLMs
List of Frameworks and Tools
Ollama
curl -fsSL https://ollama.com/install.sh | sh
ollama run llama2
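Once the Ollama server is running, it also exposes a local HTTP API (by default on port 11434). A minimal sketch of a non-streaming request with curl, assuming the default port and the /api/generate endpoint; the model and prompt are just placeholders:

curl http://localhost:11434/api/generate -d '{
  "model": "llama2",
  "prompt": "Why is the sky blue?",
  "stream": false
}'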
LLM
pip install llm
llm install llm-gpt4all
llm -m the-model-name "Your query"
llm aliases set falcon ggml-model-gpt4all-falcon-q4_0
llm -m ggml-model-gpt4all-falcon-q4_0 "Tell me a joke about computer programming"
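Setting the alias lets later invocations use the shorter name instead of the full model identifier. A quick sketch, assuming the llm CLI's models subcommand for listing what is installed:

llm models
llm -m falcon "Tell me a joke about computer programming"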