Allow viewing conversations even when llama server is down (#16255)

* webui: allow viewing conversations and sending messages even if llama-server is down

- Cached llama.cpp server properties in browser localStorage on startup, persisting successful fetches and falling back to the cached copy when a refresh fails, so the chat UI keeps rendering while the backend is unavailable.
- Cleared the stored server properties when resetting the store to prevent stale capability data after cache-backed operation.
- Kept the original error-splash behavior when no cached props exist so fresh installs still surface a clear failure state instead of rendering stale data.
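
The caching behavior described above can be sketched as follows. This is an illustrative outline, not the actual webui code: the names (`PROPS_CACHE_KEY`, `ServerProps`, `loadServerProps`, `resetPropsCache`) are assumptions, and storage is injected so the logic also runs outside a browser (pass `window.localStorage` there).

```typescript
// Hypothetical sketch of the cache-then-fallback pattern for /props.
interface ServerProps {
  model_path?: string;
  [key: string]: unknown;
}

// Minimal storage interface; window.localStorage satisfies it in a browser.
interface KVStore {
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
  removeItem(key: string): void;
}

const PROPS_CACHE_KEY = 'llama_server_props'; // illustrative key name

async function loadServerProps(
  fetchProps: () => Promise<ServerProps>,
  store: KVStore
): Promise<{ props: ServerProps; fromCache: boolean } | null> {
  try {
    // Fresh fetch succeeded: persist it and use it.
    const props = await fetchProps();
    store.setItem(PROPS_CACHE_KEY, JSON.stringify(props));
    return { props, fromCache: false };
  } catch {
    // Backend unavailable: fall back to the last cached copy, if any.
    const cached = store.getItem(PROPS_CACHE_KEY);
    if (cached !== null) {
      return { props: JSON.parse(cached) as ServerProps, fromCache: true };
    }
    // No cache (fresh install): caller shows the original error splash.
    return null;
  }
}

// Clearing the cache on store reset prevents stale capability data.
function resetPropsCache(store: KVStore): void {
  store.removeItem(PROPS_CACHE_KEY);
}
```

The `fromCache` flag is what lets the UI distinguish "live data" from "stale data plus warning banner".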

* feat: Add UI for `props` endpoint unavailable + cleanup logic

* webui: extend cached props fallback to offline errors

Treat connection failures (connection refused, DNS errors, timeouts, generic
fetch failures) the same way as server 5xx responses, so the warning banner
appears when a cached copy is available instead of falling back to a full
error screen.
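
A minimal sketch of that classification, with assumed names (`isOfflineLikeError`, `shouldUseCachedProps`) and assumed matched substrings rather than the webui's actual predicates:

```typescript
// Illustrative only: decide whether a failed /props request should reuse
// the cached copy instead of showing the full error screen.
function isOfflineLikeError(err: unknown): boolean {
  if (!(err instanceof Error)) return false;
  // In browsers, fetch() rejects with a TypeError on refused connections,
  // DNS failures, and similar network-level problems.
  if (err.name === 'TypeError') return true;
  const msg = err.message.toLowerCase();
  return (
    msg.includes('econnrefused') ||  // connection refused
    msg.includes('enotfound') ||     // DNS lookup failed
    msg.includes('timeout') ||       // request timed out
    msg.includes('failed to fetch')  // generic browser fetch failure
  );
}

function shouldUseCachedProps(status: number | null, err: unknown): boolean {
  // 5xx from the server, or no response at all: fall back to cache.
  if (status !== null && status >= 500) return true;
  return status === null && isOfflineLikeError(err);
}
```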

* webui: keep the chat form enabled when a server warning is present so operators can still send messages

e.g. to restart the backend via llama-swap, even while cached /props data is in use

* chore: update webui build output

---------

Co-authored-by: Pascal <admin@serveurperso.com>
Commit: 1a18927894 (parent e0539eb6ae)
Author: Aleksander Grygier
Date: 2025-09-26 18:35:42 +02:00
7 changed files with 149 additions and 6 deletions

```diff
@@ -27,11 +27,10 @@ export async function validateApiKey(fetch: typeof globalThis.fetch): Promise<vo
 	if (!response.ok) {
 		if (response.status === 401 || response.status === 403) {
 			throw error(401, 'Access denied');
-		} else if (response.status >= 500) {
-			throw error(response.status, 'Server error - check if llama.cpp server is running');
-		} else {
-			throw error(response.status, `Server responded with status ${response.status}`);
 		}
+		console.warn(`Server responded with status ${response.status} during API key validation`);
+		return;
 	}
 } catch (err) {
 	// If it's already a SvelteKit error, re-throw it
@@ -40,6 +39,6 @@ export async function validateApiKey(fetch: typeof globalThis.fetch): Promise<vo
 	}
 	// Network or other errors
-	throw error(503, 'Cannot connect to server - check if llama.cpp server is running');
+	console.warn('Cannot connect to server for API key validation:', err);
 }
```
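
Assembled from the hunks above, the resulting flow is roughly the following. This is a self-contained sketch, not the file itself: SvelteKit's `error` helper is stubbed as `svelteError`, the `/props` endpoint path is illustrative, and the re-throw check is simplified.

```typescript
// Stub for SvelteKit's error(status, message) so the sketch runs standalone.
function svelteError(status: number, message: string): Error {
  const e = new Error(message);
  (e as { status?: number }).status = status;
  return e;
}

export async function validateApiKey(
  fetchFn: typeof globalThis.fetch
): Promise<void> {
  try {
    const response = await fetchFn('/props'); // endpoint path is illustrative
    if (!response.ok) {
      if (response.status === 401 || response.status === 403) {
        // Auth failures still hard-fail: a wrong key must be surfaced.
        throw svelteError(401, 'Access denied');
      }
      // 5xx and other statuses: warn and continue with cached data.
      console.warn(`Server responded with status ${response.status} during API key validation`);
      return;
    }
  } catch (err) {
    // Simplified stand-in for "if it's already a SvelteKit error, re-throw".
    if ((err as { status?: number })?.status === 401) throw err;
    // Network or other errors: warn instead of blocking the UI.
    console.warn('Cannot connect to server for API key validation:', err);
  }
}
```

The key change is that only auth failures throw; every other failure mode degrades to a console warning so the cached-props UI path stays reachable.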