## Overview

The purpose of this proposal is to introduce a debounce mechanism for the `useQuery` hook. The key idea is to add a new `take` option, together with a `debounceTime` option, that controls how rapid query-key changes are handled.

## Problem Statement

Currently, developers need to manually debounce queries when working with features like typeahead search, auto-complete, or real-time filters. This logic has to be reimplemented by every developer, leading to repetitive boilerplate code. While React Query offers the `staleTime` and `cacheTime` options, these do not address input-driven debounce logic. Most developers rely on libraries like `lodash.debounce` or a custom `useDebounce` hook instead.

## Proposed Solution

We propose adding a debounce mechanism to `useQuery` via two new options.

### New Option

```js
const { data } = useQuery(queryKey, queryFn, {
  take: 'first' | 'last' | 'every', // New option (default: 'every')
  debounceTime: 300, // The debounce window (in milliseconds)
});
```

### Option Definitions

- `take: 'last'` – only the last query within the debounce window runs, after the input has settled (classic debounce).
- `take: 'first'` – the first query runs immediately; subsequent changes within the window are ignored.
- `take: 'every'` – every query runs, but with at least `debounceTime` milliseconds between executions (rate limiting).
- `debounceTime` – the length of the debounce window, in milliseconds.
## Usage Examples

### 1️⃣ Take the "Last" Event (Classic Debounce)

This is the most typical use case for debouncing: only the last query is triggered, after the user stops typing for the specified delay.
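Under the proposed API, usage might look like this (an illustrative sketch only; the query key and `fetchSearchResults` are assumptions for the example):

```js
// Illustrative sketch of the proposed 'last' option: the query re-runs
// only after `searchTerm` has stopped changing for 300ms.
const { data } = useQuery(['search', searchTerm], fetchSearchResults, {
  take: 'last',
  debounceTime: 300,
});
```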
**Use case:** Autocomplete or search bars, where the last input is the only one that matters.

### 2️⃣ Take the "First" Event

This approach is useful when you want to capture the first change and ignore everything else that happens within the debounce window.
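Again as an illustrative sketch of the proposed API (`submitForm` and the query key are hypothetical):

```js
// Illustrative sketch of the proposed 'first' option: the query fires
// immediately on the first change; further changes within the 300ms
// window are ignored.
const { data } = useQuery(['submit', formValues], submitForm, {
  take: 'first',
  debounceTime: 300,
});
```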
**Use case:** Situations like form submission, where you want to react immediately to the first action and ignore subsequent rapid actions.

### 3️⃣ Take "Every" Event (Rate Limiting)

This allows every event to be processed, but with a minimum debounce delay between queries.
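One more illustrative sketch (`sendPositionUpdate` and the query key are hypothetical):

```js
// Illustrative sketch of the proposed 'every' option: every change is
// processed, but consecutive executions are spaced at least 300ms apart.
const { data } = useQuery(['position', coordinates], sendPositionUpdate, {
  take: 'every',
  debounceTime: 300,
});
```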
**Use case:** Ideal for scenarios where you want to send periodic updates to an API while still reducing the frequency of requests.

## Proposed Implementation

### Implementation Steps
## Alternatives Considered

## Benefits of This Proposal

## Drawbacks

## Request for Feedback

We'd love feedback on the following aspects:
We look forward to your thoughts and feedback!
---
Thanks for the detailed proposal - it warrants a detailed answer. We've stated in the past that debouncing is one feature that will not make it into the Query API. I've talked about that in my recent talk at the React Advanced Conference, "React Query API Design - Lessons Learned". The reason is manifold:

- `lodash.debounce` has a lot of options - `leading`, `trailing`, `maxWait` - which I think is what users would expect us to support, too. Some users might want to have throttling instead - another option we'd need to support. All of this adds to bundle size as well: `lodash.debounce` is 1.14kb, which is roughly a 10% increase of our current size.
You're saying it "adds unnecessary complexity to projects"; I would argue the opposite - that it keeps unnecessary complexity out of React Query. It's really just one additional line of code:

```diff
+ const debouncedSearchTerm = useDebounce(searchTerm, 500)

  const { data } = useQuery({
-   queryKey: ['search', searchTerm],
+   queryKey: ['search', debouncedSearchTerm],
    queryFn: fetchSearchResults,
  });
```

You can pick your favourite `useDebounce` implementation.
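For completeness, here is a minimal sketch of what such a `useDebounce` hook could look like (one possible user-land implementation, not any specific library's):

```js
import { useEffect, useState } from 'react';

// Returns `value` only after it has stopped changing for `delay` ms.
function useDebounce(value, delay) {
  const [debouncedValue, setDebouncedValue] = useState(value);

  useEffect(() => {
    const timeout = setTimeout(() => setDebouncedValue(value), delay);
    // Changing `value` before the delay elapses resets the timer.
    return () => clearTimeout(timeout);
  }, [value, delay]);

  return debouncedValue;
}
```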
With your proposal, the QueryKey would change with every keystroke, which creates a new cache entry each time, and then we would somehow debounce the call to the QueryFn. The user-land implementation debounces the queryKey itself, which means React Query only learns about the new cache entry after debouncing has finished. This is conceptually much closer to what you'd expect, as we don't create all those in-between entries.
For example, for auto-complete, we likely just don't want to send events until the user has "stopped" typing. Defining an arbitrary time for what "stopped" means seems ... arbitrary? In my example, it will also delay the fetch by 500ms after the user has stopped. Since React 18, React offers the built-in `useDeferredValue` hook, which we can use instead:

```js
const deferredSearchTerm = React.useDeferredValue(searchTerm)

const { data } = useQuery({
  queryKey: ['search', deferredSearchTerm],
  queryFn: fetchSearchResults,
});
```
With `useDeferredValue`, there is no fixed delay to tune, and we kind of get the best of both worlds with a React built-in feature.