
Commit d92472c: Add 'Difference to reference implementation' section to README.md (1 parent: 335477f)

1 file changed: README.md (+36 −1 lines)
@@ -61,14 +61,49 @@ and Nicholas Schrock (@schrockn) from [Facebook](https://www.facebook.com/), the

### Manual dispatching

The original data loader was written in JavaScript for NodeJS. NodeJS is single-threaded in nature, but simulates asynchronous logic by invoking functions on separate threads in an event loop, as explained [in this post](http://stackoverflow.com/a/19823583/3455094) on StackOverflow.
67+
68+
[Vert.x](http://vertx.io) on the other hand also uses an event loop ([that you should not block!!](http://vertx.io/docs/vertx-core/java/#golden_rule)), but comes
69+
with actor-like [`Verticle`](http://vertx.io/docs/vertx-core/java/#_verticles)s and a
70+
distributed [`EventBus`](http://vertx.io/docs/vertx-core/java/#event_bus) that make it inherently asynchronous, and non-blocking.

NodeJS generates so-called 'ticks' in which queued functions are dispatched for execution, and Facebook's `DataLoader` uses the `nextTick()` function in NodeJS to _automatically_ dequeue load requests and send them to the batch execution function for processing.

And here is an *IMPORTANT DIFFERENCE* compared to how _this_ data loader operates!
In NodeJS the batch preparation will not affect the asynchronous processing behaviour in any way. It will just prepare
78+
batches in 'spare time' as it were.
79+

This is different in Vert.x: you actually _delay_ the execution of your load requests until the moment you call `dataLoader.dispatch()`, compared to just handling futures directly.
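To make the difference concrete, here is a minimal, self-contained sketch of the manual dispatching idea in plain Java (`CompletableFuture`, no Vert.x dependency). The class and method names here are illustrative only, not the actual `vertx-dataloader` API: `load()` merely enqueues work, and nothing completes until `dispatch()` hands the whole queue to the batch function.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.CompletableFuture;
import java.util.function.Function;

// Illustrative manual-dispatch loader; names are hypothetical, not the library's API.
class ManualBatchLoader<K, V> {
    private final Function<List<K>, List<V>> batchFunction;
    private final List<K> queuedKeys = new ArrayList<>();
    private final List<CompletableFuture<V>> queuedFutures = new ArrayList<>();

    ManualBatchLoader(Function<List<K>, List<V>> batchFunction) {
        this.batchFunction = batchFunction;
    }

    // load() only enqueues; the returned future stays incomplete until dispatch()
    CompletableFuture<V> load(K key) {
        CompletableFuture<V> future = new CompletableFuture<>();
        queuedKeys.add(key);
        queuedFutures.add(future);
        return future;
    }

    // dispatch() sends the whole queue to the batch function in one call
    void dispatch() {
        List<V> values = batchFunction.apply(new ArrayList<>(queuedKeys));
        for (int i = 0; i < queuedFutures.size(); i++) {
            queuedFutures.get(i).complete(values.get(i));
        }
        queuedKeys.clear();
        queuedFutures.clear();
    }
}

public class ManualDispatchDemo {
    public static void main(String[] args) {
        ManualBatchLoader<Integer, String> loader = new ManualBatchLoader<>(keys -> {
            List<String> result = new ArrayList<>();
            for (Integer k : keys) result.add("user-" + k);
            return result; // one batched call for all queued keys
        });

        CompletableFuture<String> a = loader.load(1);
        CompletableFuture<String> b = loader.load(2);
        System.out.println(a.isDone()); // false: nothing runs until dispatch()

        loader.dispatch();
        System.out.println(a.join()); // user-1
        System.out.println(b.join()); // user-2
    }
}
```

The point of the sketch is only the sequencing: both `load()` calls return immediately with incomplete futures, and the batch function runs exactly once, when you decide to dispatch.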

Does this make Java `DataLoader` any less useful than the reference implementation? I would argue this is not the case, and there are also gains to this different mode of operation:

- In contrast to the NodeJS implementation, _you_ as the developer are in full control of when batches are dispatched
- You can attach any logic that determines when a dispatch takes place
- You still retain all other features: full caching support and batching (e.g. to optimize message bus traffic, GraphQL query execution time, etc.)

However, with batch execution control comes responsibility! If you forget to make the call to `dispatch()`, then the futures in the load request queue will never be batched, and thus _will never complete_! So be careful when crafting your loader designs.
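A small plain-Java illustration of this failure mode (no Vert.x; names are illustrative): a future that nothing ever completes blocks `get()` forever, so a defensive timeout makes the forgotten dispatch visible instead of hanging the caller.

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;

public class ForgottenDispatchDemo {
    public static void main(String[] args) throws InterruptedException, ExecutionException {
        // Stands in for a queued load request whose dispatch() was never called:
        // nothing will ever complete this future.
        CompletableFuture<String> pendingLoad = new CompletableFuture<>();

        try {
            // Without the timeout, this get() would block forever.
            pendingLoad.get(100, TimeUnit.MILLISECONDS);
        } catch (TimeoutException e) {
            System.out.println("load never completed - did you forget dispatch()?");
        }
    }
}
```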

*Note*: In future releases the danger of not invoking dispatch will be greatly diminished. There will be an optional dispatch timeout, and some other optional features that ensure all load requests eventually complete. See [Project plans](#project-plans) for upcoming features and ideas.

### Additional features

- Initial release is a feature-complete port of the reference implementation (with the only change being [Manual dispatching](#manual-dispatching)).
- See [Project plans](#project-plans) for upcoming features and ideas.

## Let's get started!

### Installing

No more talking. Let's install the `vertx-dataloader` dependency and look at some actual code!

### Building

### Using
