This is a simple web interface for a locally hosted language model.

This project provides a small web interface to an LLM served by a vLLM engine.
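
vLLM exposes an OpenAI-compatible HTTP API, so a server can stream model output token by token. As a rough illustration (not code from this repository), a streaming request could look like the sketch below; the base URL, API key, and model name are placeholders you would adapt to your deployment:

```python
# Minimal sketch: streaming a chat completion from a vLLM server via its
# OpenAI-compatible API. Base URL, API key, and model name are assumptions.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # assumed vLLM endpoint
    api_key="EMPTY",                      # vLLM accepts any key by default
)

stream = client.chat.completions.create(
    model="your-model-name",              # placeholder model id
    messages=[{"role": "user", "content": "Hello!"}],
    stream=True,                          # stream tokens as they are generated
)

for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)
```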

It implements simple authentication with JSON Web Tokens (JWT).
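
JWT authentication generally means issuing a signed token at login and verifying it on every subsequent request. The following is a hedged sketch using the PyJWT library; the secret handling, expiry, and claim layout are illustrative and not necessarily what this project's server does:

```python
# Illustrative JWT issue/verify helpers using PyJWT; secret, expiry, and
# claims are assumptions, not this project's actual implementation.
import datetime
import jwt

SECRET = "change-me"  # in practice, read from an environment variable


def issue_token(username: str) -> str:
    """Create a signed token that expires after one hour."""
    payload = {
        "sub": username,
        "exp": datetime.datetime.now(datetime.timezone.utc)
        + datetime.timedelta(hours=1),
    }
    return jwt.encode(payload, SECRET, algorithm="HS256")


def verify_token(token: str) -> str:
    """Return the username if the token is valid; raise jwt.InvalidTokenError otherwise."""
    payload = jwt.decode(token, SECRET, algorithms=["HS256"])
    return payload["sub"]
```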

Build and start the stack with Docker Compose:

```sh
docker-compose up --build
```

Stop it again with:

```sh
docker-compose down
```