This is a simple web interface for a local language model.

This project aims to provide a small web interface to an LLM served by a vLLM engine.
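
vLLM exposes the model through its OpenAI-compatible HTTP API, so the backend mainly has to forward chat requests and relay the streamed tokens. Below is a minimal TypeScript sketch for a Node-style runtime; the base URL, model name and message shape are placeholders, not values taken from this repository:

```typescript
// Minimal sketch: stream a chat completion from a vLLM server.
// VLLM_URL and the model name are assumptions, not values from this repo.
const VLLM_URL = "http://localhost:8000/v1/chat/completions";

async function streamChat(prompt: string): Promise<void> {
  const res = await fetch(VLLM_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "my-model",                          // must match the served model
      messages: [{ role: "user", content: prompt }],
      stream: true,                               // reply as Server-Sent Events
    }),
  });
  if (!res.ok || !res.body) throw new Error(`vLLM request failed: ${res.status}`);

  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  let buffer = "";
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    const lines = buffer.split("\n");
    buffer = lines.pop() ?? "";                   // keep any incomplete line for the next chunk
    for (const line of lines) {
      const data = line.replace(/^data: /, "").trim();
      if (!data || data === "[DONE]") continue;   // "[DONE]" marks the end of the stream
      const token = JSON.parse(data).choices?.[0]?.delta?.content;
      if (token) process.stdout.write(token);     // forward tokens as they arrive
    }
  }
}

streamChat("Hello!").catch(console.error);
```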

It implements simple authentication with JSON Web Tokens (JWT).
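
The token flow could look like the following sketch. It assumes the `jsonwebtoken` package; the payload fields, expiry time and the environment variable holding the secret are illustrative, not taken from this repository:

```typescript
// Minimal JWT sketch, assuming the `jsonwebtoken` package.
import jwt from "jsonwebtoken";

// Assumed to be configured via the server environment (e.g. .env.server).
const JWT_SECRET = process.env.JWT_SECRET ?? "change-me";

// Issue a short-lived token after the user's credentials were checked.
function issueToken(username: string): string {
  return jwt.sign({ sub: username }, JWT_SECRET, { expiresIn: "1h" });
}

// Verify the token sent with each request; throws if it is invalid or expired.
function verifyToken(token: string): string {
  const payload = jwt.verify(token, JWT_SECRET) as jwt.JwtPayload;
  return payload.sub as string;
}

const token = issueToken("alice");
console.log(verifyToken(token)); // "alice"
```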

Build and start the stack with Docker Compose, and shut it down again with:

```sh
docker-compose up --build
docker-compose down
```
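
The services are defined in `docker-compose.yml`. `.env.server.example` shows the expected server-side environment variables (presumably to be copied to `.env.server` and adjusted before the first start), and `nginx.conf.example` provides a sample reverse-proxy configuration.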