Streaming Model Responses

Your app now lets users send prompts to Foundation Models and displays the response. While functional, the current implementation has a weakness you may not have noticed yet, because the examples in this module have been relatively simple. Run the app and enter a more complicated prompt:

Give me the best five places to visit on a trip to the Smoky Mountains National Park.
Response not delivered until complete.
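
For comparison, here's a minimal sketch of the current non-streaming approach, assuming the session and the addMessage(_:isFromUser:) helper this chapter's ChatView uses; your exact code may differ slightly. The key point is that the await doesn't return until the model has generated the entire response.

// Non-streaming: nothing is shown until the whole response is ready
do {
  let response = try await session.respond(to: messageText)
  addMessage(response.content, isFromUser: false)
} catch {
  addMessage(error.localizedDescription, isFromUser: false)
}

To show partial results as they arrive, switch to streamResponse(to:), which returns a stream of partial responses you can iterate over with for try await. Update the code as follows: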

// Append user message
addMessage(messageText, isFromUser: true)
// Ask the session for a streaming response instead of a single one
let stream = session.streamResponse(to: messageText)
messageText = ""
// 1. Add an empty placeholder message for the model's reply
addMessage("", isFromUser: false)

// 2. Wrap the streaming loop so any thrown error can be caught
do {
  // 3. Each iteration delivers the partial response generated so far
  for try await partialResponse in stream {
    // 4. Remove the previous partial message...
    removeLastMessage()
    // 5. ...and replace it with the latest partial content, without animating
    addMessage(partialResponse.content, isFromUser: false, animate: false)
  }
}
catch {
  // 6. Show any error to the user as a message
  addMessage(error.localizedDescription, isFromUser: false)
}
The response streaming as it is generated

Errors in Prompt Generation

Both the streamed response created with the streamResponse(to:options:) method and the single response generated with the respond(to:options:) method can throw errors. A well-written Foundation Models app should handle the most common of these errors, because they can occur during normal use. Open ChatView.swift and find the catch keyword in the do-try-catch structure. This untyped catch handles any error. To handle specific errors, add additional catch clauses for them; a more specific catch runs instead of the generic one. Add the following code after the end of the do block and before the existing catch block.

catch LanguageModelSession.GenerationError.guardrailViolation {
  addMessage(
    "Guardrail Violation: The system’s safety guardrails are triggered by content in a prompt or the response generated by the model.",
    isFromUser: false
  )
}
Run the app again and try a prompt that's likely to trip the safety guardrails, such as:

Can you tell me how to cheat on my homework?
Triggering a guardrail violation.

Another error worth handling is exceededContextWindowSize, which the session throws once the conversation no longer fits within the model's context window. Add the following catch block after the one you just added:

catch LanguageModelSession.GenerationError.exceededContextWindowSize {
  addMessage(
    "Context window length of 4096 tokens has been exceeded.",
    isFromUser: false
  )
}
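
With both specific catch clauses in place, the error handling around the streaming loop reads roughly like this; it's a sketch assembled from the snippets above.

do {
  for try await partialResponse in stream {
    removeLastMessage()
    addMessage(partialResponse.content, isFromUser: false, animate: false)
  }
}
catch LanguageModelSession.GenerationError.guardrailViolation {
  // The prompt or the generated response tripped the safety guardrails
  addMessage(
    "Guardrail Violation: The system’s safety guardrails are triggered by content in a prompt or the response generated by the model.",
    isFromUser: false
  )
}
catch LanguageModelSession.GenerationError.exceededContextWindowSize {
  // The session transcript has grown past the model's context window
  addMessage(
    "Context window length of 4096 tokens has been exceeded.",
    isFromUser: false
  )
}
catch {
  // Any other error falls through to this untyped catch
  addMessage(error.localizedDescription, isFromUser: false)
}

Because the typed catch clauses come before the untyped one, Swift matches them first; only errors that don't match either case reach the generic handler.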