
fix: Ensure chat history refreshes when switching back to conversations#30389

Merged
zhsama merged 1 commit into langgenius:main from devbyteai:fix/chat-history-stale-cache-30378
Dec 31, 2025

Conversation

@devbyteai
Contributor

Summary

Fixes #30378 - Chat conversation history displays incorrectly when switching back to a previously viewed conversation.

Problem

When switching back to a previously viewed conversation, the chat history would show stale data (missing recent messages) until switching away and back again. Users had to double-switch to see their latest messages.

Root Cause Analysis

React Query cached the chat list data and served stale cache when returning to a conversation because:

  1. Query key was the same (same conversationId)
  2. refetchOnWindowFocus was set to false
  3. No staleTime was configured, so cached data was served without triggering a background refetch on remount

The issue lies in the useShareChatList hook, which fetches chat messages but didn't mark its data as stale for revalidation.
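The staleness check behind this behavior can be illustrated with a minimal cache model. This is a sketch only, not React Query's actual implementation; all names here are hypothetical:

```typescript
// Minimal sketch of a stale-while-revalidate cache (illustrative; this is
// not React Query's internals, and all names are hypothetical).
type Entry<T> = { data: T; updatedAt: number }

class SwrCache<T> {
  private entries = new Map<string, Entry<T>>()

  // Return any cached data immediately, and decide whether a background
  // refetch should run based on the entry's age versus staleTime.
  read(key: string, staleTime: number, now: number): { data?: T; refetch: boolean } {
    const entry = this.entries.get(key)
    if (!entry)
      return { data: undefined, refetch: true } // cache miss: must fetch
    const isStale = now - entry.updatedAt >= staleTime
    return { data: entry.data, refetch: isStale }
  }

  write(key: string, data: T, now: number) {
    this.entries.set(key, { data, updatedAt: now })
  }
}
```

In this model, `staleTime: 0` makes every read of an existing entry report `refetch: true` while still handing back the cached data synchronously, whereas a large `staleTime` serves the cache with no refetch at all.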

Solution

Added staleTime: 0 to useShareChatList hook to ensure data is always considered stale. This triggers a background refetch when returning to a conversation while still showing cached data immediately (stale-while-revalidate pattern).

```ts
return useQuery({
  queryKey: shareQueryKeys.chatList(params),
  queryFn: () => fetchChatList(...),
  staleTime: 0, // Always refetch in background when returning to conversation
})
```

Impact Analysis

Components Affected

  1. chat-with-history/hooks.tsx - Main chat with conversation history
  2. embedded-chatbot/hooks.tsx - Embedded chatbot component

Both components benefit from this fix with no breaking changes.

Why staleTime: 0 is Safe

  1. Returns cached data immediately (no loading state/flicker)
  2. Background refetch is transparent to the user
  3. Network calls are deduplicated by React Query
  4. No performance regression - same API behavior, fresher data

What This Does NOT Affect

  1. Initial load behavior (unchanged)
  2. New conversation creation (unchanged)
  3. Message sending (unchanged)
  4. Other queries (conversations list, etc.)

Testing

Test 1: Switch Back to Conversation
  - Open conversation A, send messages
  - Switch to conversation B
  - Switch back to conversation A
  - Expected: Recent messages appear immediately or after a brief background refresh
  - Result: Pass

Test 2: No Loading Flicker
  - Switch between conversations rapidly
  - Expected: Cached data shown immediately, no loading spinners
  - Result: Pass

Test 3: Fresh Data After Server Update
  - Open the conversation in two tabs
  - Send a message in tab 1
  - Switch away and back in tab 2
  - Expected: New message appears after background refetch
  - Result: Pass

Unit Test Added

Added test case "should always consider data stale to ensure fresh data on conversation switch (GitHub #30378)" that verifies:

  - First fetch returns initial data
  - Remount triggers a background refetch
  - Data updates with the fresh response

Checklist

  - Reproduced the original issue
  - Identified root cause (React Query cache + no staleTime)
  - Analyzed impact on all components using useShareChatList
  - Verified no performance regression
  - Added unit test for the fix
  - Tested conversation switching scenarios
  - No breaking changes
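The remount behavior the unit test exercises can be reproduced without React Query in a few lines. This is a sketch under the assumption that mount logic reduces to "serve the cache, refetch if stale"; the function and variable names are hypothetical, not part of the actual test file:

```typescript
// Sketch of the mount behavior the unit test verifies (hypothetical names;
// this stands in for React Query's mount logic with staleTime: 0).
async function mountQuery<T>(
  cache: Map<string, T>,
  key: string,
  fetchFn: () => Promise<T>,
  staleTime: number,
): Promise<{ immediate?: T; settled: T }> {
  const cached = cache.get(key)
  // With staleTime: 0, any cached entry is already stale, so a refetch
  // always runs; the cached value is still available synchronously.
  const mustFetch = cached === undefined || staleTime === 0
  const settled = mustFetch ? await fetchFn() : cached
  cache.set(key, settled)
  return { immediate: cached, settled }
}
```

First mount fetches from scratch; a remount after the server gains a new message hands back the old list immediately (no loading flicker) and then settles on the fresh one, which is the double-switch bug eliminated.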
@devbyteai devbyteai requested a review from iamjoel as a code owner December 30, 2025 13:07
@dosubot dosubot bot added the size:M This PR changes 30-99 lines, ignoring generated files. label Dec 30, 2025
@gemini-code-assist
Contributor

Summary of Changes

Hello @devbyteai, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request resolves a critical bug in the chat functionality where conversation history would not refresh properly when users navigated back to a previously viewed chat. By strategically configuring React Query's caching mechanism to always consider chat list data as stale, the change guarantees that the most current messages are displayed without introducing loading flickers, significantly improving the user experience for chat interactions.

Highlights

  • Bug Fix: Addresses an issue where chat conversation history displayed stale data when switching back to a previously viewed conversation, requiring users to double-switch to see recent messages.
  • Root Cause: The problem stemmed from React Query's caching behavior in the useShareChatList hook, specifically due to a lack of staleTime configuration, causing cached data to be served without triggering a background refetch on remount.
  • Solution Implemented: The fix involves adding staleTime: 0 to the useShareChatList hook. This ensures that the data is always considered stale, prompting a background refetch when returning to a conversation while still immediately displaying cached data (stale-while-revalidate pattern).
  • Testing: A new unit test has been added to web/service/use-share.spec.tsx to specifically verify that data is always considered stale, ensuring fresh data on conversation switch, and confirming the fix for GitHub issue BUG: Chat conversation history is displayed incorrectly #30378.



@gemini-code-assist gemini-code-assist bot left a comment


Code Review

This is an excellent pull request that effectively resolves the issue of stale chat history. The root cause analysis is accurate, and the solution of setting staleTime: 0 on the useShareChatList query is a simple and idiomatic way to implement the desired 'stale-while-revalidate' behavior in React Query. This ensures a smooth user experience by showing cached data immediately while fetching fresh data in the background. The new unit test is particularly well-written; it accurately simulates the component lifecycle of switching conversations (unmount and remount) and thoroughly validates that the data is correctly updated. The changes are clean, well-documented, and demonstrate a solid understanding of the problem and the library being used. No issues were found.

@dosubot dosubot bot added the lgtm This PR has been approved by a maintainer label Dec 31, 2025
@zhsama zhsama merged commit 1b8e80a into langgenius:main Dec 31, 2025
10 checks passed
@dosubot dosubot bot mentioned this pull request Mar 2, 2026
