Recently at work I spent some time researching tools for testing web performance.
- Web browser developer tools: the Timeline, Profiling, and Audit panels
- jsperf.com (for JS code only)
- Boomerang http://lognormal.github.io/boomerang/doc/ can collect some JS/page data, but does not cover memory-leak research.
- DNS Latency test: http://www.lognormal.com/boomerang/doc/howtos/howto-8.html
- “bandwidth/latency along with page load time” http://www.lognormal.com/boomerang/doc/howtos/howto-3.html
- window.performance + Spec draft: https://dvcs.w3.org/hg/webperf/raw-file/tip/specs/NavigationTiming/Overview.html
- Jiffy http://code.google.com/p/jiffy-web/ allows developers to:
- measure individual pieces of page rendering (script load, AJAX execution, page load, etc.) on every client
- report those measurements and other metadata to a web server
- aggregate web server logs into a database
- generate reports
- http://www.webpagetest.org/ — an online tool that collects performance data, with a choice of test location and web browser type.
- also available as a CLI (“npm install webpagetest -g”) and as a Jenkins plugin
- http://www.showslow.com/ — an online tool that collects data for a given URL and allows comparison between servers.
- Web Episodes http://stevesouders.com/episodes/
- dynaTrace profiling tool: http://www.compuware.com/en_us/application-performance-management.html
- Chrome extensions for performance testing:
- “Performance Appraisal”, “Web Performance”, “Page performance” (using NavigationTiming), “Show Slow”, “Page Speed by Google”, “SPOF-O-Matic”
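Several of the tools above (window.performance, the NavigationTiming-based extensions) build on the same spec data. As a minimal sketch, here is how a few common metrics can be derived from a Navigation Timing object such as `window.performance.timing`; the field names follow the W3C spec draft linked above, while the sample values are made up purely for illustration:

```javascript
// Derive common page metrics from a Navigation Timing object.
// In a browser you would pass window.performance.timing.
function navigationMetrics(t) {
  return {
    dnsLookup: t.domainLookupEnd - t.domainLookupStart,
    tcpConnect: t.connectEnd - t.connectStart,
    timeToFirstByte: t.responseStart - t.navigationStart,
    domReady: t.domContentLoadedEventEnd - t.navigationStart,
    pageLoad: t.loadEventEnd - t.navigationStart,
  };
}

// Illustrative sample only (milliseconds since the epoch):
const sample = {
  navigationStart: 1000,
  domainLookupStart: 1005,
  domainLookupEnd: 1025,
  connectStart: 1025,
  connectEnd: 1060,
  responseStart: 1200,
  domContentLoadedEventEnd: 1800,
  loadEventEnd: 2400,
};

console.log(navigationMetrics(sample));
// → { dnsLookup: 20, tcpConnect: 35, timeToFirstByte: 200, domReady: 800, pageLoad: 1400 }
```

All of the timestamps are absolute, so each metric is a difference against the relevant start event; that is also how tools like Boomerang and the NavigationTiming extensions compute their numbers.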
A common mistake is fetching too much information dynamically with too many calls. Take a product page with 10 products: the developer may decide to use AJAX to load detailed product information for each product individually. That means 10 XHR calls for every page view. It works, but the user has to wait through 10 round trips to the server before seeing the final result, and the server must handle 10 additional requests, which puts extra pressure on the server infrastructure.