# Ollama

Ollama provides one-click deployment of local LLMs.

## Install
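
One common option is to run Ollama as a Docker container. The sketch below assumes Docker is installed and names the container `ollama`, matching the `docker exec` command used later in this guide:

```bash
# Start the Ollama server in the background, exposing its default port 11434
# and persisting downloaded models in a named volume.
$ docker run -d \
    --name ollama \
    -v ollama:/root/.ollama \
    -p 11434:11434 \
    ollama/ollama
```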

## Launch Ollama

Decide which LLM you want to deploy (see the [Ollama library](https://ollama.com/library) for the supported models), say, **mistral**:

```bash
$ ollama run mistral
```

Or, if Ollama is running in a Docker container named `ollama`:

```bash
$ docker exec -it ollama ollama run mistral
```
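
Before configuring RAGFlow, you can check that the Ollama service is reachable and that the model has been pulled. A minimal sketch using Ollama's REST API, assuming the default port 11434:

```bash
# List the models available on the local Ollama server.
$ curl http://localhost:11434/api/tags

# Ask the model for a short completion to confirm it responds.
$ curl http://localhost:11434/api/generate -d '{
    "model": "mistral",
    "prompt": "Say hello in one sentence.",
    "stream": false
  }'
```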

## Use Ollama in RAGFlow

- Go to 'Settings > Model Providers > Models to be added > Ollama'.

> Base URL: Enter the base URL where the Ollama service is accessible, like `http://<your-ollama-host>:11434`.

- Use Ollama Models.
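
To confirm that the base URL you entered is reachable from where RAGFlow runs, you can issue a test request from that machine. A minimal sketch assuming the `mistral` model deployed above (replace `<your-ollama-host>` with the actual host or IP):

```bash
# A JSON response containing a "message" field indicates the chat endpoint works.
$ curl http://<your-ollama-host>:11434/api/chat -d '{
    "model": "mistral",
    "messages": [{"role": "user", "content": "Hello"}],
    "stream": false
  }'
```

Note that if RAGFlow itself runs in Docker, `localhost` refers to the RAGFlow container, so the base URL usually needs the host machine's address (or the Ollama container's address on a shared Docker network) rather than `localhost`.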