I have a stream of text that I send to a function to turn into speech. I send the text sentence by sentence for better latency. The issue I'm running into is handling interruptions. Right now, I send each sentence to a function that returns a promise, so I end up with a chain of awaits that piles up fairly quickly. When I try to interrupt the function, the loop breaks correctly, but the await chain continues to run. Here is a snippet of the code (send_voice returns a promise):
```javascript
const stream = await openai.chat.completions.create({
  model: "gpt-3.5-turbo",
  messages: conversation_history,
  stream: true,
});

let sentence = "";
for await (const chunk of stream) {
  const content = chunk.choices[0]?.delta?.content || "";
  sentence += content;
  if (interruptionState.value) {
    console.log("breaking");
    break; // break is correctly called, but the send_voice chain continues
  }
  if (sentence.endsWith('.') || sentence.endsWith('!') || sentence.endsWith('?')) {
    console.log("FINAL: ", sentence);
    conversation_history.push({ "role": "system", "content": sentence });
    await send_voice(sentence, ws, elleven_key, voice_id, interruptionState);
    sentence = "";
  }
}
```
I know that I can’t stop the await chain, but is there any way around this? I essentially need to call send_voice sequentially and be able to stop it quickly on interruption. I’ve been stuck on this for a while so any help would be much appreciated!
The issue you're facing is that each await in the loop starts a promise, and breaking out of the loop doesn't cancel a promise that is already running; it will continue to completion on its own. One way to handle this is to use a flag that indicates whether processing should continue or stop.
Here’s an example of how you could modify your code to use a flag for interruption:
```javascript
const stream = await openai.chat.completions.create({
  model: "gpt-3.5-turbo",
  messages: conversation_history,
  stream: true,
});

let sentence = "";
let continueProcessing = true;

const processChunk = async (chunk) => {
  const content = chunk.choices[0]?.delta?.content || "";
  sentence += content;
  if (sentence.endsWith('.') || sentence.endsWith('!') || sentence.endsWith('?')) {
    console.log("FINAL: ", sentence);
    conversation_history.push({ "role": "system", "content": sentence });
    await send_voice(sentence, ws, elleven_key, voice_id, interruptionState);
    sentence = "";
  }
};

for await (const chunk of stream) {
  // Check the flag before starting any new work. An interruption handler
  // elsewhere (see the sketch below) sets continueProcessing to false.
  if (!continueProcessing) {
    console.log("Breaking");
    break;
  }
  await processChunk(chunk);
}
```
By setting continueProcessing to false from your interruption handler, you stop processing at the next iteration of the loop, before any new send_voice call is started. The processChunk function is then responsible for the logic within each chunk.
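For illustration, here is one hypothetical way to flip the flag from an interruption event. The event source here (a ws message with a type of "interrupt") is an assumption, since your question doesn't show how interruptions arrive:

```javascript
// Hypothetical wiring: the "interrupt" message type is an assumption,
// not something from the original code.
ws.on("message", (raw) => {
  const msg = JSON.parse(raw);
  if (msg.type === "interrupt") {
    continueProcessing = false;      // stops the loop before the next chunk
    interruptionState.value = true;  // lets send_voice bail out early too
  }
});
```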
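That said, a flag checked between chunks can't stop an await that is already in flight. If you can't modify send_voice itself, one workaround is to stop waiting on it by racing its promise against the interruption flag. This is a minimal sketch, assuming interruptionState.value is set by your interruption handler as shown above; note that it stops the awaiting, not the work already running inside send_voice:

```javascript
// Minimal sketch: resolve as soon as either the promise settles or the
// interruption flag flips. The polling interval is arbitrary.
function awaitUnlessInterrupted(promise, interruptionState, pollMs = 50) {
  return new Promise((resolve, reject) => {
    const timer = setInterval(() => {
      if (interruptionState.value) {
        clearInterval(timer);
        resolve({ interrupted: true });
      }
    }, pollMs);
    promise.then(
      (value) => { clearInterval(timer); resolve({ interrupted: false, value }); },
      (err) => { clearInterval(timer); reject(err); }
    );
  });
}

// Usage in the loop, in place of the plain await:
// const result = await awaitUnlessInterrupted(
//   send_voice(sentence, ws, elleven_key, voice_id, interruptionState),
//   interruptionState
// );
// if (result.interrupted) break;
```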