The server itself may work perfectly well, so the main bottlenecks often appear in how the individual webpage elements are coded. Developers can easily forget the importance of avoiding excessive coding acrobatics, because every added sequence ultimately costs time, space, or bandwidth.
Running time-consuming algorithms (hash generation, compression, intensive database queries, ...) in the first elements of the page creates a bottleneck for every user with a slow processor. The same applies to scripts or external files of considerable size, since bandwidth is sometimes limited or the connection cannot reach high download speeds. The problem is worse still when the host serving those external files runs into trouble: every page that depends on them slows down until the issue is resolved. The solution is to keep the files on your own server, inline the most important elements in the appropriate location with the execution flow in mind, and apply techniques that reduce file sizes.
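One way to keep expensive work off the request path is to compute it once and cache the result. A minimal sketch, assuming a Python backend; the `asset_hash` helper here is hypothetical, not part of any framework:

```python
import functools
import hashlib

# Hypothetical helper: hash a static asset (e.g. for a cache-busting URL).
# lru_cache ensures the costly read-and-hash runs once per path, not on
# every page request.
@functools.lru_cache(maxsize=None)
def asset_hash(path: str) -> str:
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()
```

The same idea applies to compression or heavy database queries: precompute at deploy time, or cache the first result, so slow clients are not made to wait for work that never changes.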
The most straightforward of these techniques are programs or plug-ins that compress files offline by removing unnecessary data. When using libraries, it is very common for most of the code to go unused, and sadly there is no conventional method for selecting only what is needed. To avoid risks, a developer can use tools like Firebug to identify where that dead, unnecessary code lies and then delete it.
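To illustrate what such offline compression does, here is a deliberately simple CSS minifier in Python: it strips comments and collapses whitespace, the same kind of "unnecessary data" removal that real tools perform far more thoroughly. This is a sketch, not a production minifier:

```python
import re

def minify_css(css: str) -> str:
    # Strip /* ... */ comments.
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.S)
    # Collapse runs of whitespace into a single space.
    css = re.sub(r"\s+", " ", css)
    # Drop spaces around punctuation CSS does not need.
    css = re.sub(r"\s*([{};:,])\s*", r"\1", css)
    return css.strip()

print(minify_css("body {\n  color: red; /* brand color */\n}"))
# → body{color:red;}
```

Running this once at build time means every visitor downloads the smaller file, at no runtime cost to the server.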
Another important point is how bandwidth can be consumed by automated bots with little or no benefit. It is worth knowing how to monitor them and using IP-blocking tools or tailored scripts to curb their activity. With applications like cPanel on the web server, logs can be checked and archived on a daily basis. To make things easier, the site should have an automated system that warns the administrators by email when something unusual happens, so they can act promptly and preserve a good user experience.
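The monitoring step can be sketched in a few lines: count requests per client IP in an access log and flag any address that exceeds a threshold. The threshold value and the combined-log format are assumptions; a real setup would tune both and could feed the result to a blocking tool or an email alert (e.g. via `smtplib`):

```python
import re
from collections import Counter

# Matches the leading client IP of an Apache/nginx combined-format log line.
LOG_LINE = re.compile(r"^(\d+\.\d+\.\d+\.\d+) ")

def suspicious_ips(log_lines, threshold=100):
    """Return {ip: request_count} for IPs above the threshold --
    candidates for blocking or for an email alert to the admins."""
    counts = Counter()
    for line in log_lines:
        m = LOG_LINE.match(line)
        if m:
            counts[m.group(1)] += 1
    return {ip: n for ip, n in counts.items() if n > threshold}
```

A daily cron job running this over the archived logs, mailing the result when the dictionary is non-empty, covers the "warn the administrators" part with very little code.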