For the most part I think the reason so many web devs put up with the “all-react” (and similar) development experience is basically cargo-culting. If you admit you don’t like it, chances are there’s at least one front-end hipster around who will mock you as outdated, and that’s enough to silence most people. For the hipsters, the problems of SPAs are hard, and engineers like hacking on hard problems. Also, the fact that the solutions don’t work very well means they’re constantly being reinvented, which means if you do the work to keep up with it all you’re rewarded by being regarded as an expert, which is nice.
Lastly, I wouldn’t underestimate how this has built up slowly over time, and therefore how many people just don’t know any better.
This is a good, very productive, very fast-learning developer I’m talking about. He had literally never tried to use the DOM API, and didn’t realize it was, you know, useful.
I think there’s a lot of that in front end world today.
I fear that most authors (and most creators of images and links) are not knowledgeable enough to see the web's shortcomings and that it will be very hard to explain the shortcomings to them -- with the result that most authors will continue to consider their job to be done once they have put their writings (and images and links) on the web.
No greater mistake can be made than to imagine that what has been written latest is always the more correct; that what is written later on is an improvement on what was written previously; and that every change means progress. Men who think and have correct judgment, and people who treat their subject earnestly, are all exceptions only. Vermin is the rule everywhere in the world: it is always at hand and busily engaged in trying to improve in its own way upon the mature deliberations of the thinkers.

So that if a man wishes to improve himself in any subject he must guard against immediately seizing the newest books written upon it, in the assumption that science is always advancing and that the older books have been made use of in the compiling of the new. They have, it is true, been used; but how? The writer often does not thoroughly understand the old books; he will, at the same time, not use their exact words, so that the result is he spoils and bungles what has been said in a much better and clearer way by the old writers; since they wrote from their own lively knowledge of the subject. He often leaves out the best things they have written, their most striking elucidations of the matter, their happiest remarks, because he does not recognise their value or feel how pregnant they are. It is only what is stupid and shallow that appeals to him.

An old and excellent book is frequently shelved for new and bad ones; which, written for the sake of money, wear a pretentious air and are much eulogised by the authors’ friends. In science, a man who wishes to distinguish himself brings something new to market; this frequently consists in his denouncing some principle that has been previously held as correct, so that he may establish a wrong one of his own. Sometimes his attempt is successful for a short time, when a return is made to the old and correct doctrine.
These innovators are serious about nothing else in the world than their own priceless person, and it is this that they wish to see make its mark.
It's really sad that in 2020, 10k+ engineers can't make a photo-, video-, post-, and message-sharing website that is not a pain to use. We collectively failed as a profession. If one needs 2MB of CSS for such a website, there is clearly a problem.
There is no excuse for how slow software is today. None.
Many parts of Windows 95 were faster in wall clock time in 1995 on the hardware of 1995 than today's Windows 10 is on the hardware of today. Yes, today's software does more, but THAT MUCH more? Are you sure?
The hardware we have is very fast. Software developers have been relying on hardware upgrades for performance improvements for far too long, and now few software developers know how fast things can be, if they just try a tiny little bit.
Also, OOP teaches developers how to think about software in ways that are exactly opposite to how computers actually work efficiently. Object-oriented programming is just inherently slower because it encourages developers to think of things one at a time. Computers like to do things in batches.
More people need to think about performance, because clock speeds aren't going up like they used to, and we still don't know how to write software that spreads across a lot of cores very well. The free ride that hardware upgrades provided us is quickly coming to an end.
tl;dr: everyone needs to learn how processor caches work, especially the 24-year-old JS devs who think they already know everything.
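The "one at a time vs. batches" point above can be sketched in a few lines. This is an illustrative layout comparison only (Python here for brevity; the names `Particle`, `step_oop`, and `step_batch` are made up for the example, and in an interpreted language both versions are equally slow):

```python
# One-at-a-time, object-oriented style: each Particle is a separate
# heap object, so a traversal chases a pointer per object.
class Particle:
    def __init__(self, x, vx):
        self.x = x
        self.vx = vx

def step_oop(particles, dt):
    for p in particles:            # touch one whole object at a time
        p.x += p.vx * dt

# Batch, data-oriented style: one contiguous array per field, so the
# same update is a single linear sweep over memory.
def step_batch(xs, vxs, dt):
    for i in range(len(xs)):       # one field, whole batch
        xs[i] += vxs[i] * dt

particles = [Particle(float(i), 1.0) for i in range(4)]
xs = [float(i) for i in range(4)]
vxs = [1.0] * 4

step_oop(particles, 0.5)
step_batch(xs, vxs, 0.5)
assert [p.x for p in particles] == xs   # same result, different layout
```

In C, C++, or Rust the second layout keeps the traversal on consecutive cache lines and opens the door to SIMD, which is where the cache-friendliness the commenter alludes to actually pays off.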
What has been created by this half century of massive corporate propaganda is what's called "anti-politics". So that anything that goes wrong, you blame the government. Well okay, there's plenty to blame the government about, but the government is the one institution that people can change... the one institution that you can affect without institutional change. That's exactly why all the anger and fear has been directed at the government. The government has a defect - it's potentially democratic. Corporations have no defect - they're pure tyrannies. So therefore you want to keep corporations invisible, and focus all anger on the government. So if you don't like something, you know, your wages are going down, you blame the government. Not blame the guys in the Fortune 500, because you don't read the Fortune 500. You just read what they tell you in the newspapers... so you don't read about the dazzling profits and the stupendous dizz, and the wages going down and so on, all you know is that the bad government is doing something, so let's get mad at the government.
Just as an aside, to give you an interesting benchmark—on roughly the same system, roughly optimized the same way, a benchmark from 1979 at Xerox PARC runs only 50 times faster today. Moore’s law has given us somewhere between 40,000 and 60,000 times improvement in that time. So there’s approximately a factor of 1,000 in efficiency that has been lost by bad CPU architectures.
The myth that it doesn’t matter what your processor architecture is—that Moore’s law will take care of you—is totally false.
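Kay's "factor of 1,000" follows directly from the two figures he gives; a quick check of the arithmetic (the numbers are his, the variable names are mine):

```python
realized_speedup = 50                    # the 1979 benchmark runs 50x faster today
moore_low, moore_high = 40_000, 60_000   # Moore's-law improvement over the same span

# Efficiency lost to bad CPU architectures: potential / realized.
lost_low = moore_low / realized_speedup      # 800
lost_high = moore_high / realized_speedup    # 1,200
# "approximately a factor of 1,000" sits between these bounds
```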
Perhaps it was commercialization in the 1980s that killed off the next expected new thing. Our plan and our hope was that the next generation of kids would come along and do something better than Smalltalk around 1984 or so. We all thought that the next level of programming language would be much more strategic and even policy-oriented and would have much more knowledge about what it was trying to do. But a variety of different things conspired together, and that next generation actually didn’t show up. One could actually argue—as I sometimes do—that the success of commercial personal computing and operating systems has actually led to a considerable retrogression in many, many respects.
You could think of it as putting a low-pass filter on some of the good ideas from the ’60s and ’70s, as computing spread out much, much faster than educating unsophisticated people can happen. In the last 25 years or so, we actually got something like a pop culture, similar to what happened when television came on the scene and some of its inventors thought it would be a way of getting Shakespeare to the masses. But they forgot that you have to be more sophisticated and have more perspective to understand Shakespeare. What television was able to do was to capture people as they were.
So I think the lack of a real computer science today, and the lack of real software engineering today, is partly due to this pop culture.