Rubberduck: Emulate OpenAI/Anthropic locally with caching and failure injection
LLM caching proxy server that emulates popular LLMs with the ability to simulate failures - Zipstack/rubberduck
Rubberduck provides caching, failure simulation, rate limiting, per-user proxy instances, and detailed logging for testing and development of LLM-powered applications.

- Dashboard: Live system stats and proxy monitoring
- Proxy Management: Full lifecycle control with visual status indicators
- Logs: Real-time streaming with advanced filtering
- Settings: Global configuration and security controls
- Stripe-inspired UI: Clean, modern, responsive design
- Proxy Lifecycle: Create, start, stop, and configure proxies
- Authentication: Register, login, and logout flows
- Failure Simulation: Test timeouts, error injection, and rate limiting
- Caching: Verify cache hits/misses
- Logging: Check request logging and export
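As a rough illustration of how a caching proxy like this is typically exercised, the sketch below points the OpenAI Python client at a locally running proxy instance and sends the same request twice, comparing latency to spot a cache hit. The base URL, port, model name, and API key handling are assumptions for illustration, not Rubberduck's documented defaults.

```python
# Minimal sketch: exercising a local LLM caching proxy with the OpenAI client.
# The base URL/port and API key below are assumptions, not documented defaults.
import time
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8001/v1",  # assumed address of a local proxy instance
    api_key="test-key",                   # the proxy may forward or ignore this, depending on config
)

def timed_completion(prompt: str) -> float:
    """Send a chat completion through the proxy and return elapsed seconds."""
    start = time.perf_counter()
    client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
    )
    return time.perf_counter() - start

first = timed_completion("Say hello")   # expected cache miss: forwarded upstream
second = timed_completion("Say hello")  # expected cache hit: served from the proxy's cache
print(f"first call: {first:.2f}s, repeat call: {second:.2f}s")
```

If the repeat call is not noticeably faster, the proxy is either not caching or treating the requests as distinct (for example, when sampling parameters differ between calls).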