With the days of dial-up and pitiful 2G data connections long behind most of us, it would seem tempting to stop caring about how much data an end-user is expected to suck down that big and wide bro…
Even if you do cache everything, each site hosts its own copy of jQuery or whatever the kids use these days, and your proxy isn’t going to cache that any better than the client already does.
don’t they always have a short cache timeout? the proxy could tell the client the cache lifetime is long, and handle revalidation itself: when its own copy gets stale, it sends a conditional request to the origin (If-None-Match / If-Modified-Since), and if the asset hasn’t changed the origin answers 304 Not Modified, so nothing actually gets re-downloaded and the proxy keeps serving its cached copy.
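A minimal sketch of that revalidation idea, with an in-memory cache and a stand-in origin. All names here (`fetch_from_origin`, `proxy_get`, the example ETag) are illustrative, not a real proxy API:

```python
# Sketch: the proxy hands the client a long max-age regardless of what
# the origin said, but keeps the origin's ETag so it can revalidate
# cheaply with a conditional GET. Assumes a single-threaded toy cache.

CACHE = {}  # url -> {"etag": ..., "body": ...}

def fetch_from_origin(url, etag=None):
    """Stand-in for a conditional GET against the origin.
    Returns (status, etag, body); 304 means 'not modified'."""
    # Hypothetical origin: the asset never changes in this sketch.
    origin_etag = '"v1"'
    if etag == origin_etag:
        return 304, origin_etag, None
    return 200, origin_etag, b"console.log('jquery-ish')"

def proxy_get(url):
    entry = CACHE.get(url)
    if entry is None:
        # Cold cache: fetch unconditionally and store the ETag.
        _status, etag, body = fetch_from_origin(url)
        entry = {"etag": etag, "body": body}
        CACHE[url] = entry
    else:
        # Revalidate upstream; on 304 keep the cached body as-is.
        status, etag, body = fetch_from_origin(url, etag=entry["etag"])
        if status != 304:
            entry = {"etag": etag, "body": body}
            CACHE[url] = entry
    # Tell the client to cache for a long time, whatever the origin's
    # own Cache-Control said.
    headers = {"Cache-Control": "max-age=31536000", "ETag": entry["etag"]}
    return headers, entry["body"]
```

A real proxy would revalidate only after its own freshness window expires rather than on every request, but the 304 round-trip is the part that saves the bandwidth.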
and all the jquery copies could also be eliminated with a filesystem that can do deduplication, even if just periodically. btrfs and XFS support reflink copies (ext4 doesn’t), and tools like rmlint help find the duplicates.
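A rough sketch of that periodic dedup, assuming Linux: group files by content hash, then collapse each duplicate into a reflink clone via the FICLONE ioctl. `find_duplicates` and `reflink_dedupe` are made-up names for illustration, not rmlint's actual mechanism:

```python
import hashlib
import os
from collections import defaultdict

def find_duplicates(root):
    """Group regular files under `root` by SHA-256 of their content;
    return only the groups with more than one member."""
    by_hash = defaultdict(list)
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            h = hashlib.sha256()
            with open(path, "rb") as f:
                for chunk in iter(lambda: f.read(1 << 16), b""):
                    h.update(chunk)
            by_hash[h.hexdigest()].append(path)
    return [paths for paths in by_hash.values() if len(paths) > 1]

def reflink_dedupe(src, dst):
    """Replace `dst` with a reflink clone of `src` via the FICLONE
    ioctl (supported on btrfs and XFS, not ext4). Clones into a temp
    file first so a failure leaves `dst` untouched."""
    import fcntl
    FICLONE = 0x40049409  # from <linux/fs.h>
    tmp = dst + ".reflink-tmp"
    with open(src, "rb") as s, open(tmp, "wb") as d:
        try:
            fcntl.ioctl(d.fileno(), FICLONE, s.fileno())
        except OSError:
            os.unlink(tmp)
            raise  # filesystem without reflink support; skip this pair
    os.replace(tmp, dst)
```

In practice you’d want a dedup ioctl like FIDEDUPERANGE (which verifies the ranges really match before sharing extents) rather than trusting the hash alone; that is what rmlint-style tools use under the hood.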