Your answer is in your question: to get the page load time of a site in a browser, you should use a browser. Not only is a real browser the most accurate representation of what your users actually experience, but you'd be surprised how hard it is to get a synthetic tool to record a true, subjective figure for page load time. If you also want data for Firefox, Chrome, etc., then install them and use them. The best (only?) approach to cross-browser page load tuning is to repeat the tests across multiple browsers!
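If you want to put a script behind that stopwatch, one option is to drive a real browser and read the timings the browser itself records. Here's a minimal sketch using Selenium and the Navigation Timing API; it assumes Selenium and a matching driver are installed, and the URL is just a placeholder:

```python
# Minimal sketch: time a full page load in a real browser via Selenium
# and the Navigation Timing API. URL is a placeholder, not a real target.
from selenium import webdriver

URL = "https://example.com"  # placeholder page under test

driver = webdriver.Chrome()  # swap in webdriver.Firefox(), etc. for other browsers
try:
    driver.get(URL)  # blocks until the page's load event has fired
    # Navigation Timing: milliseconds from navigation start to load event end.
    load_ms = driver.execute_script(
        "const t = window.performance.timing;"
        "return t.loadEventEnd - t.navigationStart;"
    )
    print(f"Full page load: {load_ms} ms")
finally:
    driver.quit()
```

Run the same script with webdriver.Firefox() and friends and you get the cross-browser comparison for free.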
So, that's the client-side stuff (aka WPO, Web Performance Optimization).
For the server, you could use a tool like JMeter; this loads your server, not your client. You want to test your server separately from your client, and when you are testing the server, you should focus on the server and pretty much ignore the client. JMeter and its friends are not browsers, but they are very good at simulating the server calls that come from browsers, and it is at this level (the HTTP level) that this task should be performed. So, to recap: this is a server-focused activity, not a client one.
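To illustrate the idea (this isn't JMeter itself, just a minimal Python sketch of the same HTTP-level approach): fire concurrent requests at the server and record response times, with no browser anywhere in sight. The URL and user counts are placeholders:

```python
# Minimal sketch of HTTP-level load testing (what JMeter does, minus the
# browser): concurrent GETs against the server, timing each response.
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

URL = "https://example.com/page"  # placeholder endpoint under test
CONCURRENT_USERS = 20             # placeholder load profile
REQUESTS_PER_USER = 10

def hit(url):
    """Time a single HTTP GET -- the unit of work a load tool repeats."""
    start = time.perf_counter()
    with urllib.request.urlopen(url) as resp:
        resp.read()
    return time.perf_counter() - start

with ThreadPoolExecutor(max_workers=CONCURRENT_USERS) as pool:
    times = list(pool.map(hit, [URL] * (CONCURRENT_USERS * REQUESTS_PER_USER)))

print(f"{len(times)} requests: mean {sum(times) / len(times) * 1000:.0f} ms, "
      f"max {max(times) * 1000:.0f} ms")
```

Notice there's no rendering, no JS execution, no user experience here at all; that's the point. At this stage the server's response times are the only thing you're tuning.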
Once you've done all that, then yes, client-side tools can be useful for automation and regression testing, but they're really only useful once you've already tuned things.
Reasons why synthetic tools aren't as good as a stopwatch and a human brain:
Most tools will record how long it takes to load everything, but these days that's not always the same as the user experience. We go to great efforts to push loading into the background or to the bottom of the page (below the fold), but synthetic tools don't see these things.
In the same vein, JS and images might still be loading, but from the user's perspective the page is complete. A tool would not see this; a human brain would.
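You can actually see this gap by comparing a paint metric against the load event. A minimal sketch, again via Selenium (Chrome assumed, placeholder URL), reading the browser's Paint Timing entries:

```python
# Sketch: compare "page looks ready" (first contentful paint) with
# "everything has loaded" (load event) in a real browser.
from selenium import webdriver

URL = "https://example.com"  # placeholder page under test

driver = webdriver.Chrome()
try:
    driver.get(URL)
    metrics = driver.execute_script(
        # first-contentful-paint: when the user first sees content;
        # loadEventEnd: when every last resource has finished loading.
        "const paint = performance.getEntriesByType('paint')"
        "    .find(e => e.name === 'first-contentful-paint');"
        "const t = performance.timing;"
        "return {fcp: paint ? paint.startTime : null,"
        "        full: t.loadEventEnd - t.navigationStart};"
    )
    print("First contentful paint (ms):", metrics["fcp"])
    print("Everything loaded (ms):", metrics["full"])
finally:
    driver.quit()
```

On a page that defers images and scripts, the first number is often a fraction of the second, and it's the first one that's closer to what the stopwatch-and-brain method measures.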
The way a page loads can vary subtly in ways that make little difference to the human experience but could cause a tool to throw a hissy fit. For example, a third-party call might time out, but if it doesn't block the page, the user might never even notice.