hey @s44002
The part that handles all your actual LLM traffic (the Gateway) is totally solid. It's the backbone that's been running smoothly for ages now and has served more than 2 billion API requests.
What you ran into was just a UI hiccup in our dashboard where you set things up. None of your production stuff or Virtual Keys were affected since the Gateway runs completely separate from the dashboard.
The Gateway gets super rigorous testing before any updates, since we know how critical it is for your production workloads. While the UI might hit some weird edge cases as we keep building it out, we designed the Gateway to be stateless specifically so that UI issues can't mess with your actual API calls.
Bottom line: your production stuff is 100% safe and stable. The UI thing was just surface level.
Please let me know if that makes sense.
P.S. We openly display our service status at
https://status.portkey.ai/