
Add LRU cache eviction to CombinePerKeyPrecombineOperator #37466

Merged
damccorm merged 2 commits into apache:master from junaiddshaukat:add-lru-cache-precombine-operator on Feb 2, 2026

Conversation

@junaiddshaukat
Contributor

Summary

This PR implements LRU (Least Recently Used) cache eviction for CombinePerKeyPrecombineOperator, addressing the TODO at line 487 of operators.ts.

Changes

  1. Added touchKey() method - Moves accessed keys to the end of the Map (most recently used position)
  2. Added flushLRU() method - Evicts entries from the front (least recently used) until cache is under limit
  3. Updated process() method - Each key access now updates its LRU position

Technical Details

JavaScript Maps preserve insertion order, so we implement LRU as follows (sketched below):

  • On access: delete and re-insert the key (moves to end)
  • On eviction: iterate from front (oldest entries first)
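
To make the pattern concrete, here is a small self-contained sketch (illustrative only; the names cache, MAX_KEYS, touch and evict are not from the Beam codebase) showing how delete-and-re-insert plus front-first iteration yields LRU behavior on a plain Map:

  // Standalone illustration of the Map-based LRU pattern described above.
  const MAX_KEYS = 3;
  const cache = new Map<string, number>();

  // "Touch": delete + re-insert moves a key to the end of the Map, i.e. the
  // most recently used position, because Maps iterate in insertion order.
  function touch(key: string, value: number): void {
    if (cache.has(key)) {
      cache.delete(key);
    }
    cache.set(key, value);
  }

  // Evict from the front (least recently used) until the cache is at or under target.
  function evict(target: number): void {
    for (const key of cache.keys()) {
      if (cache.size <= target) break;
      cache.delete(key); // deleting the current entry while iterating a Map is safe
    }
  }

  touch("a", 1);
  touch("b", 2);
  touch("c", 3);
  touch("a", 10); // "a" is now most recently used
  touch("d", 4);  // cache order: b, c, a, d
  evict(MAX_KEYS); // evicts "b", the least recently used key
  console.log([...cache.keys()]); // ["c", "a", "d"]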

Related Issues

Fixes #37465

Testing

  • All 297 existing tests pass
  • Prettier check passes

@gemini-code-assist
Contributor

Summary of Changes

Hello @junaiddshaukat, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request significantly enhances the CombinePerKeyPrecombineOperator by integrating a robust LRU cache eviction mechanism. This change addresses a long-standing TODO, improving the operator's memory management and performance by intelligently removing less frequently accessed data when the cache capacity is reached. The implementation leverages native JavaScript Map behavior to maintain the LRU order efficiently.

Highlights

  • LRU Cache Eviction: Implemented Least Recently Used (LRU) cache eviction for the CombinePerKeyPrecombineOperator to efficiently manage its internal state when the number of keys exceeds a defined limit.
  • New touchKey() Method: Introduced a touchKey() private method that moves an accessed key to the end of the internal Map, effectively marking it as most recently used by leveraging JavaScript Map's insertion order preservation.
  • New flushLRU() Method: Added a flushLRU() method responsible for evicting the least recently used entries from the cache (those at the front of the Map) until the cache size is within the specified limit.
  • Updated process() Logic: The process() method was updated to utilize the new touchKey() method for every key access, ensuring that the LRU order is maintained. It also now calls flushLRU() instead of the previous random flush logic when the cache size limit is exceeded.

@junaiddshaukat
Contributor Author

Hi @damccorm, I've added LRU to the PrecombineOperator. Please have a look when you get time. Thanks!

@github-actions
Contributor

Assigning reviewers:

R: @Abacn added as fallback since no labels match configuration

Note: If you would like to opt out of this review, comment assign to next reviewer.

Available commands:

  • stop reviewer notifications - opt out of the automated review tooling
  • remind me after tests pass - tag the comment author after tests pass
  • waiting on author - shift the attention set back to the author (any comment or push by the author will return the attention set to the reviewers)

The PR bot will only process comments in the main thread (not review comments).

@damccorm
Contributor

damccorm commented Feb 2, 2026

/gemini review

Contributor

@gemini-code-assist bot left a comment

Code Review

This pull request implements LRU cache eviction for CombinePerKeyPrecombineOperator, which is a valuable enhancement. The implementation is generally sound, but I've identified a couple of areas for improvement. Firstly, the logic in the process method for updating the cache can be simplified and made more efficient, which would also eliminate the need for the new touchKey helper method. Secondly, there is considerable code duplication between the new flushLRU method and the existing flush method, which should be refactored to enhance maintainability. My specific comments provide detailed suggestions for addressing these points.

Comment on lines 488 to 497
      const existingAccumulator = this.groups.get(wkey);
      const newAccumulator =
        existingAccumulator !== undefined
          ? this.combineFn.addInput(existingAccumulator, wvalue.value.value)
          : this.combineFn.addInput(
              this.combineFn.createAccumulator(),
              wvalue.value.value,
            );
      // Move to end (most recently used) by delete + set
      this.touchKey(wkey, newAccumulator);
Contributor

medium

The logic for updating the accumulator and managing the LRU behavior can be simplified and made slightly more performant. The current approach uses a touchKey helper method which is called for every element. This results in a delete operation even for new keys, which is unnecessary. By inlining this logic into the process method, we can avoid the helper and the extra operation for new keys, making the code more concise and efficient. This change would also make the touchKey method (lines 477-480) obsolete, allowing for its removal.

      const accumulator = this.groups.get(wkey);
      if (accumulator !== undefined) {
        this.groups.delete(wkey);
      }
      this.groups.set(
        wkey,
        this.combineFn.addInput(
          accumulator ?? this.combineFn.createAccumulator(),
          wvalue.value.value
        )
      );

Comment on lines 517 to 546
  flushLRU(target: number): ProcessResult {
    const result = new ProcessResultBuilder();
    const toDelete: string[] = [];
    // Iterate from the front (oldest/least recently used entries)
    for (const [wkey, values] of this.groups) {
      if (this.groups.size - toDelete.length <= target) {
        break;
      }
      const parts = wkey.split(" ");
      const encodedWindow = parts[0];
      const encodedKey = parts[1];
      const window = decodeFromBase64(encodedWindow, this.windowCoder);
      result.add(
        this.receiver.receive({
          value: {
            key: decodeFromBase64(encodedKey, this.keyCoder),
            value: values,
          },
          windows: [window],
          timestamp: window.maxTimestamp(),
          pane: PaneInfoCoder.ONE_AND_ONLY_FIRING,
        }),
      );
      toDelete.push(wkey);
    }
    for (const wkey of toDelete) {
      this.groups.delete(wkey);
    }
    return result.build();
  }
Contributor

medium

There is significant code duplication between flushLRU and the flush method. The logic for iterating, decoding keys, sending results, and collecting keys for deletion is nearly identical. While the loop termination condition is checked at a different point, the core processing logic is the same. To improve maintainability and reduce redundancy, this common logic should be extracted into a private helper method that both flushLRU and flush can call.

Contributor

Do we even need to keep the existing flush implementation? We could probably replace it with a clear operation that is a simpler equivalent of flush(0)

Contributor Author

Great suggestion! You're absolutely right: since flush is only called with target=0 (in finishBundle), flushLRU(0) achieves the same result of flushing all entries, so we can simply replace flush(target) with flushLRU(0). Let me fix that!

@junaiddshaukat
Contributor Author

@damccorm, I have addressed all review feedback:

  • Replaced flush() with flushLRU(0) and removed the duplicate method
  • Removed code duplication between flush and flushLRU
  • Inlined touchKey logic and optimized it to call delete() only when the key exists

Let me know if further changes are needed!
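
For readers skimming the thread, here is a minimal standalone sketch of the shape the operator ends up with after this round of review; the class name PrecombineCacheSketch and its members are illustrative assumptions, not the actual operators.ts API:

  // Hedged sketch only: a single flushLRU(target) covers both the over-limit
  // eviction path and the end-of-bundle full flush (flushLRU(0)).
  class PrecombineCacheSketch<A> {
    private groups = new Map<string, A>();

    constructor(
      private readonly addInput: (acc: A | undefined, value: unknown) => A,
      private readonly maxKeys: number,
    ) {}

    process(key: string, value: unknown): A[] {
      // Inlined "touch": delete only if the key already exists, then re-insert
      // so the key moves to the most recently used position.
      const existing = this.groups.get(key);
      if (existing !== undefined) {
        this.groups.delete(key);
      }
      this.groups.set(key, this.addInput(existing, value));
      // Evict down to the limit only when it is exceeded.
      return this.groups.size > this.maxKeys ? this.flushLRU(this.maxKeys) : [];
    }

    finishBundle(): A[] {
      // A full flush is just LRU eviction down to an empty cache.
      return this.flushLRU(0);
    }

    private flushLRU(target: number): A[] {
      const flushed: A[] = [];
      // Map iteration starts at the least recently used (oldest-inserted) key.
      for (const [key, acc] of this.groups) {
        if (this.groups.size <= target) break;
        flushed.push(acc);
        this.groups.delete(key); // safe to delete the current entry during iteration
      }
      return flushed;
    }
  }

In the real operator the flushed accumulators are sent downstream via this.receiver.receive rather than returned, as the flushLRU listing earlier in this thread shows; the control flow is otherwise the same idea.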

Contributor

@damccorm left a comment

Thanks, LGTM

@damccorm damccorm merged commit e4cd040 into apache:master Feb 2, 2026
18 of 27 checks passed