Stream generated text responses using the Gemini API
Firebase AI Logic supports basic streaming of text responses using generateContentStream or sendMessageStream (for chat).
Instead of waiting for the model to finish generating its entire result, you can use streaming to handle partial results as they arrive, which makes interactions feel faster.
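For example, with the Web SDK you can iterate over the response chunks as they are produced. The following is a minimal sketch, assuming the Firebase AI Logic Web SDK (the firebase/ai module) with the Gemini Developer API backend and a placeholder model name; adapt the Firebase config and model to your project.

```typescript
// Minimal sketch (assumptions: Firebase AI Logic Web SDK via "firebase/ai",
// Gemini Developer API backend, placeholder model name).
import { initializeApp } from "firebase/app";
import { getAI, getGenerativeModel, GoogleAIBackend } from "firebase/ai";

// Your web app's Firebase configuration (placeholder values).
const firebaseApp = initializeApp({ /* your Firebase config */ });

// Initialize the backend and create a model instance.
const ai = getAI(firebaseApp, { backend: new GoogleAIBackend() });
const model = getGenerativeModel(ai, { model: "gemini-2.5-flash" });

async function streamStory() {
  // generateContentStream returns partial results as they are produced.
  const result = await model.generateContentStream(
    "Write a short story about a magic backpack."
  );

  // Handle each chunk as soon as it arrives instead of waiting for the
  // full response.
  for await (const chunk of result.stream) {
    console.log(chunk.text());
  }

  // The aggregated response is still available once streaming completes.
  const fullResponse = await result.response;
  console.log("Total characters:", fullResponse.text().length);
}

streamStory();
```

For chat, the same pattern applies with sendMessageStream on a chat session, so each turn's reply can also be rendered incrementally.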
Check out the documentation on streaming generated text responses for the following:
[[["Easy to understand","easyToUnderstand","thumb-up"],["Solved my problem","solvedMyProblem","thumb-up"],["Other","otherUp","thumb-up"]],[["Missing the information I need","missingTheInformationINeed","thumb-down"],["Too complicated / too many steps","tooComplicatedTooManySteps","thumb-down"],["Out of date","outOfDate","thumb-down"],["Samples / code issue","samplesCodeIssue","thumb-down"],["Other","otherDown","thumb-down"]],["Last updated 2025-11-21 UTC."],[],[]]