perf_hooks: reducing overhead of performance observer entry list

Continuing the work started in https://github.com/nodejs/performance/issues/109.

```
                                                       confidence improvement accuracy (*)   (**)  (***)
perf_hooks/performance-observer.js pending=1 n=100000         ***    147.63 %       ±3.89% ±5.23% ±6.89%
perf_hooks/performance-observer.js pending=10 n=100000        ***    104.54 %       ±5.69% ±7.60% ±9.92%

Be aware that when doing many comparisons the risk of a false-positive
result increases. In this case, there are 2 comparisons, you can thus
expect the following amount of false-positive results:
  0.10 false positives, when considering a   5% risk acceptance (*, **, ***),
  0.02 false positives, when considering a   1% risk acceptance (**, ***),
  0.00 false positives, when considering a 0.1% risk acceptance (***)
```

cc @nodejs/performance
