Replies: 6 comments
-
Ha. Interesting! Thanks for flagging. Will investigate more this week.
-
I can try to migrate to mitata if you think it would be useful. It took me about 15 minutes on a different repo because the API is pretty similar to Vitest's.
-
Do you think this is a config issue? What if we increase the warmup in Vitest?
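For reference, a sketch of what raising the warmup could look like. This assumes Vitest's `bench` forwards its options object to tinybench (the benchmark names and values here are hypothetical, not from the linked repro):

```typescript
import { bench } from 'vitest'

// Hypothetical benchmark; the options object is passed through to tinybench.
bench('add', () => { void (1 + 2) }, {
  warmupTime: 1_000,       // ms spent warming up before sampling
  warmupIterations: 1_000, // minimum number of warmup iterations
  time: 2_000,             // ms spent collecting samples
  iterations: 10_000,      // minimum number of sampled iterations
})
```

If the discrepancy were only insufficient warmup, bumping these per benchmark would be a cheaper experiment than switching frameworks.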
-
I don't think it's that, because of the huge discrepancy, but I will play around with it when I get some more time. Switching took my benchmarks from 2x to 30x, so I thought it might be interesting for viem.
-
Might be worth posting findings here too: vitest-dev/vitest#2220
-
Looks like the
-
Is there an existing issue for this?
Package Version
1.7.0
Current Behavior
Vitest's overhead seems to slow down the benchmarked code enough that its maximum performance is never reached. I was working on an application-level Ethereum SDK to replace @uniswap/sdk-core, and the performance gains I expected to show up clearly weren't there. A really quick proof of concept is benchmarking a no-op against an add function: in Vitest there is often no measurable difference between the two. My intuition is that functions small enough get drowned out by Vitest's own overhead.
Expected Behavior
I think switching benchmarking frameworks to mitata should give much more accurate (and, more importantly, more favorable) results for viem over ethers.
Steps To Reproduce
With mitata:
Notice the much larger iteration times, and that ops/sec when run under mitata is a few orders of magnitude greater than under Vitest.
Link to Minimal Reproducible Example (StackBlitz, CodeSandbox, GitHub repo etc.)
https://github.com/kyscott18/vitest-overhead-example
Anything else?
No response