Nearly everything done nowadays with 500MB+ of JavaScript libraries could already be done more than a decade ago with jQuery + AJAX.
It still can. Even without those oldies.
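A minimal sketch of the point, in plain browser JavaScript with no library at all. The /api/articles endpoint and the #articles list element are hypothetical stand-ins:

    // Fetch JSON from the server and render it into the page:
    // no framework, no virtual DOM, no build step
    fetch('/api/articles')
      .then((res) => res.json())
      .then((articles) => {
        const list = document.querySelector('#articles');
        for (const a of articles) {
          const li = document.createElement('li');
          li.textContent = a.title;
          list.appendChild(li);
        }
      })
      .catch((err) => console.error('Failed to load articles:', err));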
It feels like this has been an issue for some time now: the internet keeps ballooning in how resource-heavy it is even though many websites aren’t becoming all that more functional. It’s the reason there’s a meme of people being surprised that their browser tab is taking up so much RAM. Yeah, that news website may function similarly to how it did 10 years ago, but that tiny thumbnail is technically an autoplaying 1080p video; the photos and thumbnails loaded in the background are fairly large and high-res even before you click on them; and there are countless other things running in the background that just aren’t worth it.
There was a period in the late ’00s and early-to-mid ’10s when the rise of smartphones delayed this trend and forced developers to reconsider a more minimal global experience. Flash was killed off, things got lighter-weight, and the new media-rich features were better optimized for performance.
I think it’s also not just that developers tend to have better devices so much as it’s a result of the time, energy, and resources put toward building software. It’s similar to video games. In the old days, to save on resources, a 2D game might use a single texture tile that could be mirrored, rotated, or color-swapped so that precious RAM could be spared. Once the baseline or average hits a certain point (or a new console generation appears), a lot of that “optimization” goes away because it’s no longer needed. Sometimes it’s obvious and we’re better for it, like clouds no longer having to pull double duty as bushes, but other times it means we move on to something that technically looks marginally better but absolutely leaves a good chunk of contemporary hardware in the dust.
I think the most frustrating thing about websites is that things aren’t that different for all the under-the-hood changes we get. Google Maps is a lot slower in Firefox than it used to be, and the Android app uses more resources on mid-range hardware than it used to (I’d know; I remember using it on my HTC Dream/G1). Functionally, I’ve been able to do the same things I can do now on Google Maps for probably more than a decade. New technology has been introduced on the backend to make Maps “better,” but at the cost of CPU ticks and snappiness. Likewise, a lot of news and article websites don’t look that much different than they did 10 years ago. Sure, things are laid out differently and aesthetics change, but the navigation is fairly steady. Yet we have all this JavaScript and bandwidth-sucking media autoloading, creating a slower experience. Even modern hardware can suffer from this.
This was more interesting than I expected. Though they didn’t clarify where the $700,000 figure comes from; given the context, I assume it’s customers on slower devices/connections leaving rather than something like bandwidth costs?
One retailer realized they were losing $700,000 a year per kilobyte of JavaScript, Russell said.
It’s a retailer, so definitely lost sales or conversions.
At $700k per kB, it really can’t be anyone other than Amazon.
Often, it boils down to one common problem: Too much client-side JavaScript. This is not a cost-free error. One retailer realized they were losing $700,000 a year per kilobyte of JavaScript, Russell said.
“You may be losing all of the users who don’t have those devices because the experience is so bad,” he said.
They just didn’t link to any context for that one retailer. But it’s “bring back old reddit” energy directed at everything SPA-ish.
edit to give it a little personal context: I was stuck on geosat internet for a little while and could not use Amazon’s site across the connection. I’m not sure if they’re the retailer mentioned. But the only way I could make it usable was to apply the uBlock rule
*.images-amazon.com/*.js^
described here. (That filter blocks every script served from Amazon’s image CDN; the trailing ^ is the filter syntax’s separator token.) What really stunk about it was that if you’re somewhere where geosat is/was the only option, you’re highly dependent on online retail. And knowing how to manage uBlock rules is not exactly widespread knowledge.
Amazon found back in 2006 that every additional 100 ms of page load time cost about 1% of sales, and considering that they’re talking about bloat, yeah, it’d just be from the increased load time and the customers lost to it.
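Rough math on how those two figures could line up, purely as a back-of-envelope sketch: the msPerKb value below is an assumption (a ballpark parse/compile cost for 1 kB of JavaScript on a low-end phone), not something from the article.

    // From the article: $700,000 lost per year per kilobyte of JavaScript
    const lossPerKbPerYear = 700000;
    // Amazon's 2006 finding: ~1% of sales lost per extra 100 ms of load time
    const salesLostPer100ms = 0.01;
    // Assumption: 1 kB of JS costs roughly 1 ms to parse/compile on a slow device
    const msPerKb = 1;

    // Fraction of annual sales lost per kB, and the revenue that would imply
    const lossFractionPerKb = salesLostPer100ms * (msPerKb / 100);
    const impliedRevenue = lossPerKbPerYear / lossFractionPerKb;
    console.log('Implied annual revenue: $' + (impliedRevenue / 1e9).toFixed(1) + 'B');
    // Prints "$7.0B" under these assumptions, i.e. big-retailer territory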
The funny thing is that internet speeds, back in 2006, were significantly lower than today. And here we are, with 10x the speeds and pages somehow loading slower than back then!
The article mentions that the new JavaScript frameworks cater more to developers. As if developers have to use the garbage sites they’re developing. This article is very confused and badly written. Also, I would love to see a source for that clickbaity title other than “some guy said something.” As a developer, I also find this sentence hilarious: “Russell contended we’ve over-prioritized developer experience and the end user experience suffers as a result.” This is clearly from somebody who doesn’t understand the concept of JavaScript framework fatigue.