So I have a problem. One of my customers at $DAYJOB occasionally has serious issues with web page load times from a user's perspective (i.e. the time it takes all elements of the page to be downloaded and rendered in a browser).
As a hosting provider, I'm only interested in the performance of my servers. Generally what I'll do is use wget to fetch the page and send it to /dev/null to see if I'm having a problem. That's great for proving that I do or do not have an issue, but what I want is some command-line way to see the aggregate load time for all elements on a page, with a breakdown of the individual elements, so I can act more as a partner in identifying the issue rather than simply saying "not me, go talk to someone else!" :)
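For reference, this is roughly the check I run today (with example.com standing in for the customer's site):

    # Time a single fetch of the page HTML, discarding the output
    time wget -q -O /dev/null http://example.com/

That only times the HTML document itself, not the images, CSS, and scripts it references, which is exactly the gap I'm trying to close.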
Surely wget can do this (I think), but a trip through the man page doesn't immediately reveal anything, and a recursive run (wget -r with -l 0 or -l 1) doesn't seem to do it either. Is there something I'm missing, or some other utility that will let me do this?
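In case it matters, the recursive attempt looked something like this:

    # One level of recursion (also tried -l 0, which means no depth limit)
    wget -r -l 1 http://example.com/

It downloads plenty of files, but I don't get any per-element or total timing out of it.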