Health Status
Live Connections
Shows all currently active proxy connections, with each connection's queue state, streaming mode, and live metrics.
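The state tracked per connection can be pictured as a small record. Here is a minimal sketch in TypeScript; the field names are invented for illustration and may differ from the proxy's actual schema:

```ts
// Illustrative shape of one live-connection entry (not the proxy's real schema).
interface LiveConnection {
  id: string;                        // unique connection identifier
  backend: string;                   // backend currently serving the request
  queueState: "queued" | "running";  // whether the request is still waiting
  streaming: boolean;                // true for streamed (SSE) responses
  queuedMs: number;                  // time spent waiting in the queue
  tokensGenerated: number;           // live token count so far
}
```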
Backends
Inspect backend status, load, errors, and concurrency settings in one place; a sketch of the per-backend record follows the table.
| Backend | Status | Load | Models | Metrics | Last Error | Controls |
|---|---|---|---|---|---|---|
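A row of this table maps naturally onto a record like the sketch below; again, the fields are assumptions for illustration, not the proxy's actual data model:

```ts
// Illustrative shape of one backends-table row (field names are assumptions).
interface BackendStatus {
  name: string;                        // backend identifier
  status: "up" | "down" | "starting";  // health state
  load: number;                        // in-flight requests
  maxConcurrency: number;              // configured concurrency limit
  models: string[];                    // models this backend serves
  lastError?: string;                  // most recent error, if any
}
```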
Chat Debugger
Chat directly through the proxy, much like the built-in llama.cpp web frontend, with live token streaming, sampler parameters, and raw response inspection; a streaming sketch follows the panels below.
Request JSON
// request appears here
Response / Stream
// response appears here
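To exercise the same path from outside the UI, a client can stream tokens through the proxy directly. The sketch below assumes the proxy forwards an OpenAI-compatible `/v1/chat/completions` endpoint (which llama.cpp's server exposes) and listens on `localhost:8080`; the address and the `model` name are placeholders:

```ts
// Minimal sketch: stream a chat completion through the proxy.
// The endpoint, port, and model name are assumptions; adjust for your setup.
async function streamChat(prompt: string): Promise<void> {
  const res = await fetch("http://localhost:8080/v1/chat/completions", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "default",   // placeholder model name
      stream: true,       // request server-sent events
      temperature: 0.8,   // sampler parameters pass through the proxy
      messages: [{ role: "user", content: prompt }],
    }),
  });
  if (!res.ok || !res.body) throw new Error(`proxy error: ${res.status}`);

  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  let buffer = "";
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    // SSE frames are separated by blank lines; keep any partial frame buffered.
    const frames = buffer.split("\n\n");
    buffer = frames.pop() ?? "";
    for (const frame of frames) {
      const data = frame.replace(/^data: /, "").trim();
      if (data === "[DONE]") return;
      const chunk = JSON.parse(data);
      process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
    }
  }
}

streamChat("Why is the sky blue?").catch(console.error);
```

Splitting only on blank-line boundaries keeps partial SSE frames intact across chunk reads, so each `data:` line is parsed exactly once.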
Recent Requests
Live history with queue time, target backend, and outcome.
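One history entry could be modeled as in the following sketch; the field names are illustrative, not the proxy's actual schema:

```ts
// Illustrative shape of one recent-requests history entry.
interface RequestRecord {
  timestamp: string;                      // when the request arrived
  queueMs: number;                        // time spent queued before dispatch
  backend: string;                        // backend that handled the request
  outcome: "ok" | "error" | "cancelled";  // final result
  status?: number;                        // HTTP status, when applicable
}
```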