vm: lazy load internal/vm/module for performance
Even though the benchmark results are open for discussion, I think lazy loading is a net win here: none of the comparisons shows a significant regression, and the only statistically significant result is an improvement.
```
                                                                    confidence improvement accuracy (*)   (**)  (***)
 vm/context-global-proxy.js n=100000                                                1.54 %      ±2.19% ±2.94% ±3.87%
 vm/create-context.js n=100                                                        -0.14 %      ±1.50% ±1.99% ±2.60%
 vm/run-in-context.js withSigintListener=0 breakOnSigint=0 n=1                      0.46 %      ±2.56% ±3.40% ±4.43%
 vm/run-in-context.js withSigintListener=0 breakOnSigint=1 n=1               *      3.16 %      ±3.01% ±4.01% ±5.22%
 vm/run-in-context.js withSigintListener=1 breakOnSigint=0 n=1                     -1.89 %      ±3.55% ±4.73% ±6.15%
 vm/run-in-context.js withSigintListener=1 breakOnSigint=1 n=1                     -0.69 %      ±2.34% ±3.11% ±4.07%
 vm/run-in-this-context.js withSigintListener=0 breakOnSigint=0 n=1                 1.52 %      ±1.68% ±2.23% ±2.90%
 vm/run-in-this-context.js withSigintListener=0 breakOnSigint=1 n=1                 0.99 %      ±2.09% ±2.78% ±3.62%
 vm/run-in-this-context.js withSigintListener=1 breakOnSigint=0 n=1                 1.45 %      ±1.67% ±2.23% ±2.90%
 vm/run-in-this-context.js withSigintListener=1 breakOnSigint=1 n=1                 0.06 %      ±1.50% ±2.00% ±2.60%
```
Be aware that when doing many comparisons, the risk of a false-positive result increases. In this case there are 10 comparisons, so you can expect the following number of false-positive results:
0.50 false positives when considering a 5% risk acceptance (*, **, ***),
0.10 false positives when considering a 1% risk acceptance (**, ***),
0.01 false positives when considering a 0.1% risk acceptance (***).
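The change the title describes, deferring the `require` of `internal/vm/module` until it is first needed, follows the usual lazy-load pattern. A minimal self-contained sketch of that pattern (the names and the stand-in loader here are hypothetical, not Node's actual internals):

```javascript
// Lazy-load sketch: defer loading a heavy dependency until first use,
// so code paths that never touch it skip the load cost at startup.
// `loadVmModule` is a hypothetical stand-in for
// `require('internal/vm/module')`.
let cachedVmModule;

function loadVmModule() {
  // Stand-in for the real module's exports, so the example runs on its own.
  return { SourceTextModule: class SourceTextModule {} };
}

function lazyVmModule() {
  if (cachedVmModule === undefined) {
    cachedVmModule = loadVmModule(); // only the first call pays the load cost
  }
  return cachedVmModule; // subsequent calls reuse the cached exports
}
```

Call sites then use `lazyVmModule().SourceTextModule` instead of a top-level `require`, which moves the module-load cost from startup to the first actual use.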