For the Japanese version, click here
Dify
Dify is an open-source GUI platform designed for building AI applications. This time, I use Dify to quickly prototype and test before developing in LangChain.
Local installation
For an easy installation, simply use the default settings of Dify.
Requirement
Ensure your machine meets these minimum system requirements and that Docker and Docker Compose are installed:
- CPU >= 2 Core
- RAM >= 4GB
Step 1: Clone Dify
Clone the Dify repository to your machine:
git clone https://github.com/langgenius/dify.git
Actually, you only need the dify/docker directory from the repository.
Step 2: Start Dify
Navigate to the docker directory and run the following command to start Dify:
cd dify/docker
docker compose up -d
Results:
[+] Running 11/11
✔ Network docker_default Created 0.0s
✔ Network docker_ssrf_proxy_network Created 0.0s
✔ Container docker-ssrf_proxy-1 Started 1.7s
✔ Container docker-db-1 Started 1.6s
✔ Container docker-weaviate-1 Started 1.6s
✔ Container docker-redis-1 Started 1.7s
✔ Container docker-sandbox-1 Started 1.7s
✔ Container docker-web-1 Started 1.7s
✔ Container docker-api-1 Started 2.3s
✔ Container docker-worker-1 Started 2.2s
✔ Container docker-nginx-1 Started 2.8s
Step 3: Verify container status
Run the following command to check if all containers are running successfully:
docker compose ps
Results:
NAME IMAGE COMMAND SERVICE CREATED STATUS PORTS
docker-api-1 langgenius/dify-api:0.6.10 "/bin/bash /entrypoi…" api 5 minutes ago Up 5 minutes 5001/tcp
docker-db-1 postgres:15-alpine "docker-entrypoint.s…" db 5 minutes ago Up 5 minutes (healthy) 5432/tcp
docker-nginx-1 nginx:latest "/docker-entrypoint.…" nginx 5 minutes ago Up 5 minutes 0.0.0.0:80->80/tcp
docker-redis-1 redis:6-alpine "docker-entrypoint.s…" redis 5 minutes ago Up 5 minutes (healthy) 6379/tcp
docker-sandbox-1 langgenius/dify-sandbox:0.2.1 "/main" sandbox 5 minutes ago Up 5 minutes
docker-ssrf_proxy-1 ubuntu/squid:latest "entrypoint.sh -f /e…" ssrf_proxy 5 minutes ago Up 5 minutes 3128/tcp
docker-weaviate-1 semitechnologies/weaviate:1.19.0 "/bin/weaviate --hos…" weaviate 5 minutes ago Up 5 minutes
docker-web-1 langgenius/dify-web:0.6.10 "/bin/sh ./entrypoin…" web 5 minutes ago Up 5 minutes 3000/tcp
docker-worker-1 langgenius/dify-api:0.6.10 "/bin/bash /entrypoi…" worker 5 minutes ago Up 5 minutes 5001/tcp
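If any container shows a restarting or exited status instead of "Up", inspect its logs before moving on. A minimal troubleshooting sketch, using the service names from the `docker compose ps` output above:

```shell
# Tail the logs of one service (replace "api" with any service name
# from the compose file, e.g. web, worker, db, nginx)
docker compose logs -f api

# If something went wrong, tear the stack down and start it again
docker compose down
docker compose up -d
```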
Access Dify
Access http://localhost/install to use Dify.
Since I have used Dify before, it displays the login screen. If this is your first run, you will need to create an account.
Your workspace will look like this:
Set up the model
Click the top right corner and select Settings > Model Provider.
Using Ollama
- Visit https://www.ollama.com/ and download the Ollama client for your system
- Run Ollama (I will use Llama2, but you can choose any model that suits your needs; visit the Ollama models page for more details)
ollama run llama2
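Before wiring it into Dify, you can confirm that the Ollama server is actually serving. By default it listens on port 11434; a quick check, assuming the default installation:

```shell
# The root endpoint answers "Ollama is running" when the server is up
curl http://localhost:11434

# /api/tags lists the models pulled locally
# (llama2 should appear after `ollama run llama2`)
curl http://localhost:11434/api/tags
```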
Integrate Ollama in Dify
- Scroll down and choose Ollama
- Fill in:
- Model Name: llama2
- Base URL:
http://<your-ollama-endpoint-domain>:11434
Enter the base URL where the Ollama service is accessible. Since I run Dify from Docker, I set the base URL to http://host.docker.internal:11434
- Model Type: Chat
- Model Context Length: 4096
The maximum context length of the model. If unsure, use the default value of 4096.
- Maximum Token Limit: 4096
The maximum number of tokens returned by the model. If there are no specific requirements, this can match the model context length.
- Support for Vision: Yes
Check this option if the model supports image understanding (multimodal), like llava.
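If Dify cannot reach the model after saving, it helps to test the base URL from inside the containers, since http://host.docker.internal:11434 must resolve there, not just on the host. A troubleshooting sketch, assuming curl is available in the api image; on Linux, host.docker.internal may not resolve by default, in which case use the host's IP instead:

```shell
# Run from the dify/docker directory.
# The root endpoint should answer "Ollama is running".
docker compose exec api curl -s http://host.docker.internal:11434
```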
Using OpenAI
I am currently using OpenAI to create prototypes and run tests.