Adopt the new generated accessor methods in turbo-tasks-backend (#88397).

Remove the `CachedDataItem` enum and related support macros.

This new approach is more ergonomic and memory-efficient, and it generates slightly smaller serialized payloads.
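As a rough illustration of the shape of the change (the names below are hypothetical, not the actual turbo-tasks-backend API): the old model stored heterogeneous `CachedDataItem` values and had to `match` on every access, while the new model generates typed accessors on a plain struct.

```rust
// Illustrative sketch only: `CachedDataItem`'s variants and `TaskStorage`'s
// fields are invented here to show the pattern, not the real definitions.

// Old model: a flat list of enum values, forcing a `match` at every access.
enum CachedDataItem {
    Output(u32),
    Dirty(bool),
}

fn old_get_output(items: &[CachedDataItem]) -> Option<u32> {
    items.iter().find_map(|item| match item {
        CachedDataItem::Output(v) => Some(*v),
        _ => None,
    })
}

// New model: a struct with (generated) typed accessors; no enum, no `match`.
#[derive(Default)]
struct TaskStorage {
    output: Option<u32>,
    dirty: Option<bool>,
}

impl TaskStorage {
    fn output(&self) -> Option<u32> {
        self.output
    }
    fn set_output(&mut self, v: u32) {
        self.output = Some(v);
    }
}

fn main() {
    let items = vec![CachedDataItem::Dirty(true), CachedDataItem::Output(7)];
    assert_eq!(old_get_output(&items), Some(7));

    let mut storage = TaskStorage::default();
    storage.set_output(7);
    assert_eq!(storage.output(), Some(7));
    assert_eq!(storage.dirty, None);
}
```

The accessor form also lets each field get a dedicated storage slot instead of being searched for in a list, which is where the ergonomics and memory wins come from.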

From measuring vercel-site:
```
canary@e05510ab3bf91c02c7381e2c471363561ffa1198:
$ hyperfine -p 'rm -rf .next' -w 2 -r 10  'pnpm next build --turbopack --experimental-build-mode=compile'
Benchmark 1: pnpm next build --turbopack --experimental-build-mode=compile
  Time (mean ± σ):     55.981 s ±  0.885 s    [User: 386.683 s, System: 111.484 s]
  Range (min … max):   54.438 s … 57.723 s    10 runs
```

```
01-09-migrate-to-typed-accessors@d343fad5bf92af9c5271c62a24aedcf02c3d83d4:
$ hyperfine -p 'rm -rf .next' -w 2 -r 10  'pnpm next build --turbopack --experimental-build-mode=compile'
Benchmark 1: pnpm next build --turbopack --experimental-build-mode=compile
  Time (mean ± σ):     54.897 s ±  0.496 s    [User: 389.400 s, System: 127.353 s]
  Range (min … max):   54.298 s … 55.681 s    10 runs
```

So: a small savings of a little over a second. Not too bad for what is mostly the removal of a bunch of `match` expressions that the old model required.

Similarly, we should expect a small reduction in disk size, since we no longer explode state into a `Vec<CachedDataItem>`, which incurred overhead from redundantly encoding enum discriminants and from not being able to length-prefix collections.
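A toy byte-accounting model makes the overhead concrete. This is not the real serializer, and the payload and prefix sizes are assumptions; it only shows why a per-item discriminant byte adds up across a large cache.

```rust
// Hypothetical encoding sketch (not the actual turbo-tasks serializer).

// Old layout: Vec<CachedDataItem>, one discriminant byte per item.
fn old_encoded_len(num_items: usize, payload_bytes_per_item: usize) -> usize {
    // assumed 8-byte vec length prefix + (1 discriminant byte + payload) per item
    8 + num_items * (1 + payload_bytes_per_item)
}

// New layout: items of one kind grouped into a single length-prefixed
// collection, so the "which kind" information is stored once, not per item.
fn new_encoded_len(num_items: usize, payload_bytes_per_item: usize) -> usize {
    // assumed 1 kind byte + 8-byte length prefix + payloads back to back
    1 + 8 + num_items * payload_bytes_per_item
}

fn main() {
    // ~260M items were serialized in the vercel-site build described below.
    let items = 260_000_000usize;
    let saved = old_encoded_len(items, 16) - new_encoded_len(items, 16);
    // Saves roughly one byte per item, i.e. on the order of 260 MB.
    println!("saved {} bytes (~{} MB)", saved, saved / 1_000_000);
}
```

Whatever the exact prefix sizes, the old layout pays the discriminant cost `num_items` times while the new layout pays the kind tag once, so the saving scales linearly with item count.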

From a clean `.next` dir on vercel-site, with a production build and caching enabled, the cache size is:

before
```
$ du -h .next/cache
4.3G	.next/cache/turbopack/v16.1.0-canary.0-518-ge05510ab3bf9
```
after
```
$ du -h .next/cache
4.1G	.next/cache/turbopack/v16.1.0-canary.0-546-gd343fad5bf92
```
This is expected: the main cache-size win comes from dropping the `CachedDataItem` discriminant values. In this build we were serializing about 260M items, so saving one byte per item works out to roughly 260 MB, in line with the ~0.2 GB drop that `du` reports.

In some measurements I saw much more dramatic wins of >1 GB, which I do not have a great explanation for.
Merged as commit 0866832 into code:canary on Jan 22, 2026.