llm-py-web/server
Charlotte Som 77e7e0b0cf dont clear the system prompt actually
simonw plugins (e.g. llm-anthropic, llm-gemini) don't send the system
prompt as a {role: system} message in build_messages, so the model will
'forget' its system instructions if it's not retransmitted every time :/
2025-03-26 20:55:58 +00:00
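The commit above describes the workaround: since some plugins omit the stored system prompt from the rebuilt message list, the server must re-attach it on every request rather than only the first. A minimal sketch of that idea, where `build_messages` is a hypothetical illustrative helper (not the plugins' actual implementation):

```python
def build_messages(history, user_text, system=None):
    """Rebuild the chat message list for one inference request.

    Re-sends the system prompt on every call: if it were only sent with
    the first turn, a plugin that drops {role: system} from its stored
    conversation would make the model 'forget' its instructions.
    """
    messages = []
    if system:
        # Always prepend the system prompt, even mid-conversation.
        messages.append({"role": "system", "content": system})
    for role, content in history:
        messages.append({"role": role, "content": content})
    messages.append({"role": "user", "content": user_text})
    return messages
```

With this shape, each websocket inference call passes the conversation's system prompt again, so the downstream plugin's message builder always sees it.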
__init__.py send and display the conversation name 2025-02-26 10:21:15 +00:00
http.py write a little frontend 2025-02-26 08:10:58 +00:00
inference.py dont clear the system prompt actually 2025-03-26 20:55:58 +00:00
tid.py get inference over websocket working 2025-02-26 04:20:28 +00:00