dify-openai-cloak is a service that acts as a proxy between OpenAI-compatible clients and the Dify API.
It translates OpenAI API requests to Dify API requests, allowing you to use Dify's AI capabilities with OpenAI-compatible clients.
- Translates OpenAI API requests to Dify API requests
- Supports multiple models with separate API keys
- Docker support
See project board for more details.
Please note that this project is still under development and the current implementation is not perfect. It lacks some features and has some limitations, notably:
- It uses a "fake" streaming implementation; responses are not actually streamed from Dify.
- It only uses the last message from the OpenAI request as the query for Dify.
- It doesn't handle all possible fields from the OpenAI request (like temperature, max_tokens, etc.).
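To make the last two limitations concrete, the request translation effectively behaves like the sketch below. This is illustrative Python only (the service itself is not written in Python), and the Dify payload fields (`query`, `response_mode`, `user`) are assumptions based on Dify's chat-messages API, not taken from this project's source:

```python
def openai_to_dify(openai_request: dict) -> dict:
    """Sketch of the OpenAI -> Dify translation, mirroring the
    limitations above: only the last message is kept, and fields
    like temperature and max_tokens are silently dropped."""
    messages = openai_request.get("messages", [])
    query = messages[-1]["content"] if messages else ""
    return {
        "query": query,                # only the last message survives
        "response_mode": "blocking",   # streaming is simulated downstream
        "user": openai_request.get("user", "anonymous"),
    }
```

In particular, any system prompt or earlier conversation turns in `messages` are discarded before the request reaches Dify.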
Feel free to submit an issue or pull request.
1. Pull the Docker image:

   ```bash
   docker pull ghcr.io/mintyfrankie/dify-openai-cloak:latest
   ```

2. Create a `config.yaml` file with your configuration (see the Configuration section below).

3. Run the Docker container:

   ```bash
   docker run -d -p 3000:3000 -v /path/to/your/config.yaml:/app/config/config.yaml ghcr.io/mintyfrankie/dify-openai-cloak:latest
   ```

   Replace `/path/to/your/config.yaml` with the actual path to your configuration file.
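To smoke-test the running container, send it an OpenAI-style chat completion request. A sketch in Python, assuming the proxy exposes the standard `/v1/chat/completions` path on port 3000 (per the `docker run` above); the model name `my-dify-app` is a placeholder for whatever you configured:

```python
import json
import urllib.request

# Placeholder values: "my-dify-app" must match a model name from your
# config.yaml; the port matches the docker run command above.
payload = {
    "model": "my-dify-app",
    "messages": [{"role": "user", "content": "Hello!"}],
}

def chat(base_url: str = "http://localhost:3000") -> dict:
    req = urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))
```

Any OpenAI-compatible client should work the same way once its base URL points at the proxy.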
1. Clone the repository:

   ```bash
   git clone https://github.com/mintyfrankie/dify-openai-cloak.git
   cd dify-openai-cloak
   ```

2. Install dependencies:

   ```bash
   pnpm install
   ```

3. Copy the example configuration file and edit it with your settings:

   ```bash
   cp config.example.yaml config.yaml
   ```

4. Start the server:

   ```bash
   pnpm start
   ```
You can configure the application using either a `config.yaml` file or environment variables. The application looks for the config file in the following locations:

1. `/app/config/config.yaml` (for Docker deployments)
2. `./config.yaml` (in the project root)

If no config file is found, it falls back to environment variables.
Refer to `config.example.yaml` for more details.
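Purely as an illustration of the multi-model setup (the authoritative schema lives in `config.example.yaml`; every key below is a guess, not the project's actual format), a configuration might look roughly like:

```yaml
# Hypothetical shape only -- check config.example.yaml for the real schema.
server:
  port: 3000
models:
  - name: my-dify-app           # model name exposed to OpenAI clients
    api_key: app-xxxxxxxxxxxx   # the corresponding Dify app's API key
  - name: another-app
    api_key: app-yyyyyyyyyyyy
dify:
  base_url: https://api.dify.ai/v1
```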