Even though bandwidth keeps increasing, website optimization is a bigger issue than ever. Mobile devices don't benefit from modern fiber lines, and web applications keep getting more complex, requiring more and more resources to load.
On top of that, loaded resources may pull in additional resources that couldn't be requested earlier (because they were unknown), making the page load even longer.
There are plenty of books, articles and tutorials out there describing possible improvements, as well as tools that analyze performance, but going through all the items takes time...
We recently needed to write an algorithm to unpack a specific file from a proprietary archive format.
The fun part is that the initial task quickly transformed into a research task as our colleague Michael wanted to dig deeper into the topic. Here is the story behind the performance boost.
I had never worked with binary files in PHP before, so I decided to first get it working and care about optimization later.
My straightforward approach of using file_get_contents and ordinary string operations resulted in a very slow, memory-hungry process.
Extracting a 10 KB file from a 2 MB archive took the algorithm ~1,200 ms and ~14 MB of memory at its peak.
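To give an idea of what such a first version can look like, here is a minimal sketch of that approach. The real archive format is proprietary, so the entry layout in this example (a 4-byte name length followed by a 4-byte payload size, both big-endian) is purely hypothetical:

```php
<?php
// Minimal sketch of the naive approach; the entry layout is made up
// for illustration, since the real format is proprietary.
function extractFileNaive(string $archivePath, string $wanted): ?string
{
    // The whole archive ends up in memory as one string...
    $data = file_get_contents($archivePath);
    $size = strlen($data);

    $offset = 0;
    while ($offset + 8 <= $size) {
        // ...and every substr() call copies yet another chunk of it.
        $header = unpack('NnameLen/NpayloadLen', substr($data, $offset, 8));
        $name   = substr($data, $offset + 8, $header['nameLen']);
        $start  = $offset + 8 + $header['nameLen'];

        if ($name === $wanted) {
            return substr($data, $start, $header['payloadLen']);
        }
        $offset = $start + $header['payloadLen'];
    }

    return null;
}
```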
This had to be optimized, and I managed to bring it down to ~30 ms and ~1 MB of memory at its peak.
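As a rough illustration of where such savings usually come from in PHP: instead of loading the whole archive into one string, the file can be streamed with fopen(), fread() and fseek(), so that only the entry headers and the wanted payload ever touch memory. The sketch below reuses the hypothetical layout from above and is not the exact code from this project:

```php
<?php
// Sketch of the kind of change that typically yields such a speedup:
// stream the archive instead of holding it in memory as one string.
// Same hypothetical entry layout as above.
function extractFileStreamed(string $archivePath, string $wanted): ?string
{
    $fp = fopen($archivePath, 'rb');
    if ($fp === false) {
        return null;
    }

    try {
        // Read 8-byte entry headers until the archive is exhausted.
        while (strlen($header = fread($fp, 8)) === 8) {
            $meta = unpack('NnameLen/NpayloadLen', $header);
            $name = fread($fp, $meta['nameLen']);

            if ($name === $wanted) {
                // Only the payload we actually want is read into memory.
                return fread($fp, $meta['payloadLen']);
            }
            // Skip payloads we don't care about without reading them.
            fseek($fp, $meta['payloadLen'], SEEK_CUR);
        }
        return null;
    } finally {
        fclose($fp);
    }
}
```

With this pattern, peak memory is roughly bounded by the size of the extracted file plus a few header reads, which fits the order of magnitude of the numbers above.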