Alright. Yup, awaiting input.
Here's some intel on LangChain:
https://github.com/langchain-ai/langchainjs/discussions/5258

What I'm currently going to ship, at least for now: just break the loop in the front-end (sketched after the list below). My UX will be perfect, but:
- The server will make some noise about the unhandled abort (a guard for this is sketched below):
  ```
  ⨯ unhandledRejection: TypeError: Invalid state: WritableStream is closed
      at close (webpack:///src/app/api/chat/route.ts?46b7:219:21)
    217 |         await writer.write(encoder.encode('\n\nAn error occurred while generating the response.'));
    218 |       } finally {
  > 219 |         await writer.close();
        |               ^
    220 |       }
    221 |     })();
    222 |  {
    code: 'ERR_INVALID_STATE'
  }
  ```
- In my Portkey logs I can see the full reply keep streaming in after the client breaks, so the API will still charge me for all of the response tokens. Could be an issue with the o1 model. Forwarding an abort signal to the model call (sketched below) should actually cancel the upstream request.
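
For reference, the front-end break looks roughly like this. A minimal sketch, not my actual component: `/api/chat`, `streamChat`, `onToken`, and the `stopRequested` flag are placeholder names, and the real app would flip the flag from a Stop button.

```typescript
// Hypothetical front-end loop; all names here are placeholders.
let stopRequested = false; // a Stop button would set this to true

async function streamChat(prompt: string, onToken: (t: string) => void) {
  const res = await fetch('/api/chat', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ prompt }),
  });
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  try {
    while (true) {
      const { done, value } = await reader.read();
      if (done || stopRequested) break;
      onToken(decoder.decode(value, { stream: true }));
    }
  } finally {
    // cancel() aborts the underlying request; on the server this closes the
    // stream early, which is exactly what makes writer.close() throw above.
    await reader.cancel();
  }
}
```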
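The unhandledRejection itself is easy to silence by treating a failed close() as expected. A sketch, assuming the route pipes through a TransformStream writer the way the code frame suggests; the handler shape is reconstructed, not the original file:

```typescript
export async function POST(req: Request) {
  const encoder = new TextEncoder();
  const { readable, writable } = new TransformStream<Uint8Array, Uint8Array>();
  const writer = writable.getWriter();

  (async () => {
    try {
      // ... pipe model tokens into `writer` here ...
    } catch {
      // write() can also reject once the client is gone, so swallow that too
      await writer
        .write(encoder.encode('\n\nAn error occurred while generating the response.'))
        .catch(() => {});
    } finally {
      // a client abort closes the stream before we get here, so close()
      // rejects with ERR_INVALID_STATE; catching it removes the log noise
      await writer.close().catch(() => {});
    }
  })();

  return new Response(readable, {
    headers: { 'Content-Type': 'text/plain; charset=utf-8' },
  });
}
```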
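That only quiets the log, though; the model keeps generating. To stop the token billing, the abort has to reach the upstream call. LangChain runnables accept an `AbortSignal` in their call options, and in recent Next.js versions the route handler's `req.signal` fires when the client disconnects, so the two can be wired together. A sketch assuming `@langchain/openai`; the model name and prompt handling are placeholders:

```typescript
import { ChatOpenAI } from '@langchain/openai';

export async function POST(req: Request) {
  const { prompt } = await req.json();
  const model = new ChatOpenAI({ model: 'gpt-4o-mini' }); // placeholder model
  const encoder = new TextEncoder();

  const stream = new ReadableStream<Uint8Array>({
    async start(controller) {
      try {
        // Forwarding req.signal cancels the underlying OpenAI request as
        // soon as the client aborts, so no further output tokens accrue.
        const chunks = await model.stream(prompt, { signal: req.signal });
        for await (const chunk of chunks) {
          controller.enqueue(encoder.encode(String(chunk.content)));
        }
        controller.close();
      } catch {
        // an AbortError lands here on disconnect; the response stream may
        // already be torn down, so a failed close() can be ignored
        try {
          controller.close();
        } catch {}
      }
    },
  });

  return new Response(stream, {
    headers: { 'Content-Type': 'text/plain; charset=utf-8' },
  });
}
```

The `signal` call option is LangChain's documented way to cancel a request; whether the provider then stops charging mid-generation is something I'd verify against the Portkey logs.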