Show current benchmarking results #123
Coincidentally, I just updated the code size benchmarks this morning, cleaning up the code as well. It's easy to measure code size, not so much the performance. For example, google-protobuf deserializes into an intermediate state. So a simple roundtrip might show different results compared to when all fields are set with the getter / setter methods. That being said, having performance benchmarks for protobuf.js would be great.
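To make the caveat concrete, here is a minimal sketch of the kind of ops/s measurement being discussed. This is illustrative code, not the benchmark harness from this repo: the `opsPerSecond` helper and the JSON stand-in for a protobuf roundtrip are assumptions for the example. The point is that timing a whole roundtrip can hide the cost of an intermediate representation, which only surfaces once every field is actually touched.

```typescript
// Illustrative ops/s harness — NOT the repo's benchmark code.
// `work` is whatever serialize/deserialize roundtrip you want to time.
function opsPerSecond(work: () => void, durationMs = 500): number {
  // warm up so the JIT has settled before we start measuring
  for (let i = 0; i < 100; i++) work();
  const start = Date.now();
  let ops = 0;
  while (Date.now() - start < durationMs) {
    work();
    ops++;
  }
  return ops / ((Date.now() - start) / 1000);
}

// Stand-in "roundtrip": JSON here, binary protobuf in a real benchmark.
const payload = { name: "test", values: [1, 2, 3] };
const ops = opsPerSecond(() => JSON.parse(JSON.stringify(payload)));
console.log(ops.toFixed(0), "ops/s");
```

A library that defers real deserialization to an intermediate state would look fast under this loop but slower once the `work` closure also reads every field through getters.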
Partially address timostamm#123

Results of the perf benchmark on my machine:

### read binary

```
google-protobuf             : 413.662 ops/s
ts-proto                    : 1,324.736 ops/s
protobuf-ts (speed)         : 1,461.452 ops/s
protobuf-ts (speed, bigint) : 1,475.889 ops/s
protobuf-ts (size)          : 1,250.677 ops/s
protobuf-ts (size, bigint)  : 1,255.167 ops/s
protobufjs                  : 1,732.049 ops/s
```

### write binary

```
google-protobuf             : 906.883 ops/s
ts-proto                    : 3,805.993 ops/s
protobuf-ts (speed)         : 430.632 ops/s
protobuf-ts (speed, bigint) : 448.063 ops/s
protobuf-ts (size)          : 378.682 ops/s
protobuf-ts (size, bigint)  : 392.511 ops/s
protobufjs                  : 1,539.768 ops/s
```

### from partial

```
ts-proto            : 4,503.332 ops/s
protobuf-ts (speed) : 1,568.577 ops/s
protobuf-ts (size)  : 1,555.881 ops/s
```

### read json

```
ts-proto            : 3,719.366 ops/s
protobuf-ts (speed) : 889.613 ops/s
protobuf-ts (size)  : 890.034 ops/s
protobufjs          : 4,120.232 ops/s
```

### write json

```
ts-proto            : 13,200.842 ops/s
protobuf-ts (speed) : 1,865.668 ops/s
protobuf-ts (size)  : 1,862.537 ops/s
protobufjs          : 4,199.372 ops/s
```

### read json string

```
ts-proto            : 957.057 ops/s
protobuf-ts (speed) : 416.284 ops/s
protobuf-ts (size)  : 421.48 ops/s
protobufjs          : 910.256 ops/s
```

### write json string

```
ts-proto            : 1,000.572 ops/s
protobuf-ts (speed) : 943.338 ops/s
protobuf-ts (size)  : 949.784 ops/s
protobufjs          : 1,446.891 ops/s
```
Thanks for the PR! The makefile bug from this comment is fixed in commit 326299f, which makes sure the performance benchmarks run with the same payload every time. I'm not saying benchmarks should only run with the large payload; this was just fixing the obvious makefile bug. All testees are in the same ballpark with the large payload size:
I think the benchmarks should run on several payload sizes. There are some factor-10 gaps with the smaller payload you were measuring that are worth closer investigation.
For reference: Large payload: 1.2 MiB - FileDescriptorSet for
Great results! One thing that caught my eye is the massive difference in writing JSON between
But look closely at the numbers. They are put into perspective when you realize that they measure turning the internal representation into a JSON object. What you need in practice is a JSON string:
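A sketch of the distinction being made here, using plain objects and hypothetical function names rather than the actual protobuf-ts API: a "write json object" benchmark stops at step 1, while what you ship over the wire requires step 2, which additionally pays for `JSON.stringify`.

```typescript
// Hypothetical message shape — the point is the two-step cost structure.
interface Msg {
  id: number;
  tags: string[];
}

// Step 1: internal representation -> plain JSON object.
// This is all that a "write json object" benchmark measures.
function writeJsonObject(msg: Msg): object {
  return { id: msg.id, tags: msg.tags.slice() };
}

// Step 2: what you actually need in practice — a JSON *string*,
// which pays the additional cost of JSON.stringify.
function writeJsonString(msg: Msg): string {
  return JSON.stringify(writeJsonObject(msg));
}

const msg: Msg = { id: 1, tags: ["a", "b"] };
console.log(writeJsonString(msg)); // {"id":1,"tags":["a","b"]}
```

A library that is very fast at producing the intermediate object can still end up close to the others once the stringify cost is included, which is what the "write json string" numbers show.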
I think the manual deserves a performance comparison table at the end of the section Code size vs speed. It should just show numbers for binary I/O and JSON (string) I/O. It should show generator version number and parameters, preferably in a one simple table. It should be mentioned how and where this is measured, and with what payload size. The table should be generated by a script, similar to the code size report. These are the results including protobuf.js:
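A sketch of the kind of script that could emit such a table. The result shape, function names, and column layout are assumptions for illustration, not the repo's actual report script, which would also need to record generator versions and parameters as suggested above.

```typescript
// Hypothetical result record produced by a benchmark run.
interface BenchResult {
  name: string;
  opsPerSecond: number;
}

// Render one benchmark section as a markdown table, similar in
// spirit to the existing generated code size report.
function renderTable(title: string, results: BenchResult[]): string {
  const rows = results.map(
    r => `| ${r.name} | ${r.opsPerSecond.toLocaleString("en-US")} ops/s |`
  );
  return [
    `### ${title}`,
    "",
    "| library | throughput |",
    "|---|---|",
    ...rows,
  ].join("\n");
}

console.log(renderTable("read binary", [
  { name: "protobuf-ts (speed)", opsPerSecond: 1461.452 },
  { name: "protobufjs", opsPerSecond: 1732.049 },
]));
```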
Looks like there has been a regression in v2.0.0-alpha.9. We stopped generating
Was this discovery made in some off-GitHub discussion? I'm curious more than anything, since I can't find any issue or PR mentioning this. Glad it was spotted though! Completely unrelated, but what on Earth is going on with ts-proto's "write json object" benchmark? Being nearly an order of magnitude faster than the underlying library it uses (protobufjs) seems odd.
See #147 (comment) and #147 (comment)
It's impressive, right? I don't think ts-proto is sharing any code with protobufjs for JSON.
Hello. When I run
Where protobuf.js is about twice as fast. Is this what is referenced above?
Am I in the smaller payload segment? Am I holding it wrong?
I'm happy to provide additional information; just tell me what you need. All the best!
Without seeing your benchmark code it's hard to say. It depends on what types of fields you're using. Are you using int32/int64/uint32/uint64 fields? In the case of string: how long are they? Are the strings all ASCII? Are you including multi-byte UTF-8 characters? How many? How many multi-byte UTF-8 characters do you have in a row? What JS runtime and version are you using? All of these things will affect the results.

You can see some differences just based on the string length being decoded here, where Node v18.13 and higher starts performing faster using the

And here's another case where different browsers start performing better using

So yeah, it depends on what you're trying to encode/decode.
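The length-dependent crossover described above can be sketched like this. The cutoff value below is purely illustrative (real libraries tune it per runtime from benchmarks like the ones linked), and the function is a generic example, not protobuf-ts's actual decoder:

```typescript
// Decode UTF-8 with a length-based strategy switch: a hand-rolled
// String.fromCharCode loop tends to win for short ASCII chunks, while
// TextDecoder tends to win for longer inputs on newer runtimes.
const sharedDecoder = new TextDecoder("utf-8");
const THRESHOLD = 32; // illustrative cutoff — tune per runtime/version

function decodeUtf8(bytes: Uint8Array): string {
  if (bytes.length < THRESHOLD) {
    let out = "";
    let i = 0;
    // fast path only holds while the input stays ASCII
    while (i < bytes.length && bytes[i] < 0x80) {
      out += String.fromCharCode(bytes[i++]);
    }
    if (i === bytes.length) return out;
    // fell off the ASCII fast path: let TextDecoder handle it properly
  }
  return sharedDecoder.decode(bytes);
}

const ascii = new TextEncoder().encode("hello");
console.log(decodeUtf8(ascii)); // hello
```

Which branch wins, and where the crossover sits, is exactly what varies between Node versions and browsers, so a single threshold is never optimal everywhere.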
Thanks for your reply. I'll address what I can, and if it's helpful I could attach the protos; although they're hard to read because of my employer, they might shed some insight. I've only looked at decoding.
Not running in the browser. Tested with supplying TextEncoder as well as Buffer; Buffer was slower in this case. This is what I'm having as payload for
I added a repro repo here https://github.com/osadi/pbbench which uses benny, and not the code from this repo, but the numbers looked similar when I tried the same code in both.
So yeah, protobufjs is likely to always be faster with just ASCII strings. Its UTF-8 string decoder takes some shortcuts that a conformant one cannot.
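One example of such a shortcut, as a sketch rather than protobuf.js's actual code: decode bytes straight through `String.fromCharCode` without validating the input. A conformant decoder must instead replace invalid sequences with U+FFFD, and that validation costs time on every string.

```typescript
// Non-conformant shortcut: map each byte to a char and never validate
// multi-byte sequences. Fast on ASCII, wrong on malformed input.
function asciiShortcutDecode(bytes: Uint8Array): string {
  let out = "";
  for (let i = 0; i < bytes.length; i++) {
    out += String.fromCharCode(bytes[i]); // no validation at all
  }
  return out;
}

// A conformant decoder replaces invalid sequences with U+FFFD.
const conformant = new TextDecoder("utf-8");

const valid = new TextEncoder().encode("abc");
const invalid = new Uint8Array([0x61, 0xff, 0x62]); // 0xff is never valid UTF-8

console.log(asciiShortcutDecode(valid) === conformant.decode(valid));     // true
console.log(asciiShortcutDecode(invalid) === conformant.decode(invalid)); // false
```

On well-formed, all-ASCII payloads the two agree and the shortcut is cheaper; the difference only shows up on input a conformant decoder is required to sanitize.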
I'm probably misunderstanding something. I've removed the only two string fields in my example, and protobuf.js seems to do even better? I really want to use protobuf-ts; I just need to figure out if I'm doing it wrong, and try to sell it to the team. I'm not decoding the object as JSON, or a JSON string, if that's what you're referring to. It's base64 which is fed to Buffer, which is then the input to

Thank you. And again, if I should remove my comments derailing your actual question, just let me know. No problem.
Out of curiosity, is there a reason you're looking to use protobuf-ts over protobuf-es? I only ask because the latter is A) much more actively maintained, B) supports more/newer protobuf features, and C) the project is led by the same person who made protobuf-ts, but with protobuf-es they're actually getting paid to work on it :) FWIW I think protobuf.js is going to be faster in most cases than either protobuf-ts or protobuf-es (but not like an order of magnitude faster; we're talking less than twice as fast). If you care about protobuf conformance then it's probably wise to stay away from protobuf.js, but if speed is paramount it's really hard to beat protobuf.js.
:) No, there's no reason. I was just tasked with working on an app that used protobuf.js, and I wasn't too keen on using that, so I looked at alternatives. I'm happy you took the time to answer and provided alternatives. Much appreciated.
The manual currently provides a comparison between code size vs speed, but it only shows the resulting size of the generated code, so we don't know the runtime difference. There is already a benchmarks package which provides code to perform benchmarks, but it leaves something to be desired. Namely: the current results. :)
It would be helpful to show this information (especially the perf.ts results) somewhere. Ideally it could be included in the code size vs speed section of the manual.
P.S. A comparison against protobuf.js would be very nice, since that library tends to be the fastest out there at the moment.
P.P.S. I've already taken a crack at adding protobuf.js to the perf.ts benchmarks locally and it seems like protobuf.js can decode/encode the binary about twice as fast as [email protected]. Any ideas how that gap could be closed? Is protobuf.js taking shortcuts that aren't conformant to the proto spec? Are there any techniques that could be copied from protobuf.js?
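For what it's worth, one technique protobuf.js is publicly known for is generating specialized (de)serializer functions at runtime instead of interpreting a generic schema on every call. Here is a toy sketch of the idea; the field layout and names are made up, and this is far simpler than protobuf.js's actual codegen:

```typescript
// Toy sketch of runtime codegen: compile a specialized reader for a
// fixed field layout instead of looping over a generic descriptor.
type FieldSpec = { name: string; index: number };

function compileReader(
  fields: FieldSpec[]
): (values: number[]) => Record<string, number> {
  // Build the function body as source text, one line per field,
  // so the resulting function has no per-field loop at call time.
  const body = [
    "return {",
    ...fields.map(f => `  ${JSON.stringify(f.name)}: values[${f.index}],`),
    "};",
  ].join("\n");
  return new Function("values", body) as (
    values: number[]
  ) => Record<string, number>;
}

const read = compileReader([
  { name: "id", index: 0 },
  { name: "count", index: 1 },
]);
console.log(read([7, 42])); // { id: 7, count: 42 }
```

Whether this particular trick accounts for the gap you measured is an open question; it trades `new Function` (which some environments forbid, e.g. under strict CSP) for tighter per-message code.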
Thank you again for this wonderful project!