node created 2020/05/08
Apple values my blood, sweat, and tears at $0.99/user/lifetime, minus the 30% Apple tax, minus government taxes.

I don't mean to compare it to a sweatshop, because I live in the first world and have opportunities, but this is a demeaning shakedown and a devaluation of my pride, product, and work.

Apple:

1. Shifted where generic computing happens

2. Downplayed the web as the end-all, be-all of application delivery. (It could have been amazing with WASM and sandboxing back in the 00's!)

3. Prevents generic apps from gaining distribution outside of Apple's control and tax

They took advantage of open source, the web, and the Internet. Then they shit on it and offered up the App Store protection racket as salvation.

It's only one of several themes where the giants of today crush the little guy. Computing is less free today than it was a decade ago.

Before Apple I had reach and distribution. Now I have less than 50% of that. And I don't have liberty and control over my own narrative anymore.

the web dies one corporate whimper and one consumer shrug at a time

In order to refocus the Firefox organization on core browser growth through differentiated user experiences, we are reducing investment in some areas such as developer tools, internal tooling, and platform feature development.
I don't think the user population is aware how disingenuous all of this tech crap is. It could be so awesome, and they don't even understand what's not awesome about it. It hurts in a deep, emotional space.

I have found so much inspiration in some of the great programmers of two generations ago. The writings of Chuck Moore and Alan Kay convince me that we somehow took two orders of magnitude of backwards steps in creating the present milieu of dysfunctional technology.

The worst part, IMO, is that it's all opaque. I don't control the device that I hold in my hand. I can't fix it because Google or Apple don't want me to. It is a tool of economic and social control, not a powerful technology that I can wield.
.. we're just writing too much code. Companies have hundreds of millions of lines of code in production right now and nobody who knows how it works. What we're doing is a kind of runaway train: we hire more and more people to write more and more code, and try to ramp up education to produce more and more people. I can only say this from the sidelines, as I don't have a degree, but we seem, at least as these individuals argue, to be cheapening computer science at the (possibly indirect) behest of businesses who aren't willing or able to step back and try things differently.
Nothing fucking works. Nothing. Turning it off and back on again isn't a cute ritual, it's the cornerstone of all modern electronics. Everything ships with zero-day patches. My $3000 TV crashes when you navigate an OSD menu the wrong way. Not in the unnecessary smart features it shipped with (which I of course augmented with a separate $300 purchase), but in the actual "treat me like a display" menu.

I work for a SaaS company, and just as much work goes into deciding how we measure uptime as goes into designing for it, if not more. "Well, no customer incidents were reported, so that doesn't count as being down." "We have 1 hour of scheduled maintenance every week, but we still achieved 99.99% uptime." It's creative, I'll give them that.
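The "99.99% with an hour of weekly maintenance" trick is easy to check with arithmetic. A quick sketch (the one-hour figure is the commenter's; everything else is illustrative):

```python
# Availability over one week, with and without counting the weekly
# "scheduled maintenance" hour as downtime.
HOURS_PER_WEEK = 7 * 24  # 168

def availability(downtime_hours, period_hours=HOURS_PER_WEEK):
    """Fraction of the period the service was actually reachable."""
    return 1 - downtime_hours / period_hours

honest = availability(1.0)    # counts the maintenance hour: ~99.40%
creative = availability(0.0)  # "scheduled doesn't count": a clean 100%

print(f"honest: {honest:.2%}, creative: {creative:.2%}")
```

Counting the maintenance window caps real uptime at roughly 99.4%, well short of four nines; the entire gap lives in the definition of "down".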

We talk about the network being unreliable as if a 200 km 28 GHz link and a trunk connection in a data center were the same thing. It's unqualified, and unhelpful, and nobody really knows what they are doing.

We "dismantle" waterfall as if it isn't the same type of people who misunderstood the original publication doing the same thing with every other methodology and fad. (If you have not read "The Leprechauns of Software Engineering" yet, it's an interesting read and worth a little bit of your time.)

My house is full of devices, and my history is full of purchases, that are a disappointment. I can't remember the last time I went a single. god. damn. day. without the things that are supposed to be helping me misbehaving in some way. And the worst part is that many of them can't even be fixed. They will putter along, with the occasional patch, until they lose the attention of some swim lane on a plan of record somewhere and become e-waste.

I have been programming since I was eight. It was the most obvious passion I have ever found in life, but it feels like we're stuck. The arguments all feel like the same boring old rehashed ones from the last 20 years, probably longer. I'm bored. Is anybody else just tired of it all? Everything is amazing and crappy at the same time.
Isn't it weird that we have entire companies like Intercom or Rasa whose value-add is pushing automated, AI-driven "assistants" onto websites, and then the companies that buy into that entire value-add, and the codebases hacked on by ML experts, find that none of it even works better than what we had in 1998?
As more and more domains centralize email in the handful of mega-corp hosted solutions the hosts have less and less reason to care about accepting mail from outside the walled gardens.

emphasis mine

So I'm from Newark, NJ. I didn't really grow up around many good examples of work ethic or wealth; really shit neighborhoods every time I moved. My brother discovered the "view page source" context menu item back in the day on Myspace, decided to see what it was about, went to college, and came back; I picked up some code skills on my own, and now, 8 years later, we both make the same amount. To be fair, of the two of us, I'm the better dev, and he even says it to his friends and bosses often.
This is why I hate the layers of bloat and abstraction, and think all the big "web companies" today are basically trash. The results usually look terrible, are slow and riddled with spyware, AND you can't even learn from them... that is, if you do, you just learn some watered down walled garden bullshit that will be completely revamped in 3 years while the fundamentals (they try their best to keep you away from) haven't really changed.

Meanwhile, the Chrome developers ponder removing the address bar altogether, and Windows 10 brags during installation how you should "leave everything to us", and let's not even mention Apple or Facebook.
The worst programs are written by people who know how to plug a million and one things together but can't drill down and analyse the algorithmic implications of what they're doing. The reason Electron runs like shit and inhales RAM is that it was programmed by people who don't have a solid understanding of fundamentals. They understand a huge number of horizontal abstractions, but they have no concept of how it looks vertically.

Knowing how to maximally exploit a CPU is way more important than knowing eight different Javascript frameworks if good software is your objective. And frankly, learning Node is way easier than figuring out how to structure basic, bare-bones Javascript so that it leverages your L1 cache.

And therein lies the problem. How many interviewers dock marks for iterating over columns instead of rows? Because that matters, a huge amount. How many interviewers would give credit for "how can you speed this up?" if the interviewee said, "write it in C, and simplify the data structures you want me to use so we maximise sequential lookups over basic arrays, to maximise cache usage." They'll look at you like you have three heads.

"Don't you know Big O complexity is the only thing that really matters if you're looking for speed?" - then you get Electron.
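The rows-versus-columns point is about memory layout: a row-major 2D array stores each row contiguously, so looping with the column index innermost walks memory sequentially, while the opposite order strides across it. A minimal sketch of the two access patterns (Python hides the actual layout, so treat this purely as an illustration; in C, the strided version is the one that thrashes the cache):

```python
N = 256
# Row-major 2D data: each inner list is one contiguous row.
grid = [[i * N + j for j in range(N)] for i in range(N)]

def sum_row_order(g):
    # Column index innermost: visits elements in storage order.
    return sum(x for row in g for x in row)

def sum_col_order(g):
    # Row index innermost: each consecutive read jumps a whole row ahead.
    return sum(g[i][j] for j in range(N) for i in range(N))

# Same answer either way; in a systems language, only the first
# pattern keeps the L1 cache full of data you are about to use.
assert sum_row_order(grid) == sum_col_order(grid)
```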
We do programmers a disservice when we act as if the conversation about the growing threat of legacy code begins and ends with COBOL. A whole generation of software engineers are spending their careers making the problem worse by outsourcing all but the most unique aspects of their applications to armies of libraries, plugins, and modules that they are powerless to monitor, let alone update.

The real horseman of the legacy apocalypse is the depth of the dependency tree. Modern software development stacks abstraction on top of abstraction. If the left-pad incident of 2016 proved nothing else, it demonstrated that even experienced engineers will YOLO dependencies onto their applications if given the infrastructure to make installing them easy. Modern developer environments are a veritable candy store of cheap and convenient dependencies.
My workstation (an E5-2640) has seen multiple generations of operating systems, video editing software, and DAWs.

Browsers, and web browsing in general, are the only thing that I can tell is getting consistently worse year after year.

I know it's an odd metric, but 10-15 seconds to fully render a newspaper homepage is more than it takes for my full DAW setup (Cubase, plus FL Studio as a VST plugin) to fully come up with tracks loaded and the play button ready. I don't even recall dialup being this bad.
Sometimes, they don’t even know that their system can run their stack natively. I’ve been on teams that have said “Let’s just use Docker because X doesn’t know how to install Y.”
So we are driving company decision making based on the needs of synthetic fake financial instruments? Is there any other way to run a company that is more stupid than striving to fulfill the needs of someone else's derivative product?

I cannot imagine a worse basis on which to steer a company. It makes zero sense. Using a random number generator to pick every decision would produce better results than what we are currently doing.

An example of a company that has completely succumbed to Wall Street is Texas Instruments. They are (or used to be) a tech company. They used to have research. They used to create new products.

But in the past few years they have started committing to "returning 100% of free cash flow to investors" (quoting their own earnings release) via stock buybacks and dividends. They actually put it down in writing: we are committed to NOT reinvesting in employees, NOT doing R&D, NOT creating new products. Every earnings call is about how they are still committed to getting all the cash into stock buybacks and dividends. That's it. That's the whole company now.

Wall Street loves Texas Instruments. The shiny bucket of treasure known as stock buybacks + equity based compensation is irresistible. This is going to keep happening until we make it stop happening.
Recent years saw a number of supply chain attacks that leverage the increasing use of open source during software development, which is facilitated by dependency managers that automatically resolve, download and install hundreds of open source packages throughout the software life cycle. This paper presents a dataset of 174 malicious software packages that were used in real-world attacks on open source software supply chains, and which were distributed via the popular package repositories npm, PyPI, and RubyGems. Those packages, dating from November 2015 to November 2019, were manually collected and analyzed.
Third-party delivery platforms, as they've been built, just seem like the wrong model, but instead of testing, failing, and evolving, they've been subsidised into market dominance.
“Things have really changed since I began learning, and rightly so. Instead of coding in plain HTML, CSS and JS, I'm now using endless frameworks, modules and libraries to build increasingly more complex web and mobile applications. It's great, if I didn't use these tools my code would be an unmaintainable mess.”

How sad that this has become the widely accepted narrative. There’s a lot of value right now in NOT building things that way. Last week I had to deal with fixing another dev’s mess on a stuck project. A big company website, but nothing fancy at all, purely a marketing showcase. The amount of complexity he put into it by using Vue.js was insane for the scope of the project. INSANE. To do something as easy as changing the page’s <title> tag, we had to write an unjustifiable number of lines of code. Framework-itis really is a bad disease: it not only affects your work, it clouds the simplest form of judgement, it appears.

Then we have exactly this: someone who got a hammer and spent years treating everything like a nail comes to a reckoning, usually framed as a longing for the good old days when things used to be simple. Well, you know, things can still be simple, if you don’t offload to unjustifiably complex frameworks the duty of understanding what’s going on in your project.
For the most part I think the reason so many web devs put up with the “all-react” (and similar) development experience is basically cargo culting. If you admit you don’t like it, chances are there’s at least one front-end hipster around who will mock you as outdated, and that’s enough to silence most. For the hipsters, the problems of SPAs are hard, and engineers like hacking on hard problems. Also the fact that the solutions don’t work very well means they’re constantly being reinvented, which means if you do the work to keep up with it all you’re rewarded by being regarded as an expert, which is nice.

Lastly, I wouldn’t underestimate how this has built up slowly over time, and therefore how many people just don’t know any better.

Last year I assigned a feature to a junior dev which was quite simple. He spent two days hunting for and testing react libraries to try and build it. When he told me this I said, “Holy crap, that is overkill.” I tried to explain how easy this would be with just plain HTML and JavaScript and he didn’t understand, so we paired for about 90 minutes and the work was done.

At the end of that session this developer said to me, “wow, I didn’t realize you could actually do anything useful with just plain JavaScript in the browser. I thought it was like... assembly or something.”

This is a good, very productive, very fast-learning developer I’m talking about. He literally had never tried to use the DOM api, and didn’t realize it was, you know, useful.

I think there’s a lot of that in front end world today.
I fear that most authors (and most creators of images and links) are not knowledgeable enough to see the web's shortcomings and that it will be very hard to explain the shortcoming to them -- with the result that most authors will continue to consider their job to be done once they have put their writings (and images and links) on the web.
No greater mistake can be made than to imagine that what has been written latest is always the more correct; that what is written later on is an improvement on what was written previously; and that every change means progress. Men who think and have correct judgment, and people who treat their subject earnestly, are all exceptions only. Vermin is the rule everywhere in the world: it is always at hand and busily engaged in trying to improve in its own way upon the mature deliberations of the thinkers. So that if a man wishes to improve himself in any subject he must guard against immediately seizing the newest books written upon it, in the assumption that science is always advancing and that the older books have been made use of in the compiling of the new. They have, it is true, been used; but how? The writer often does not thoroughly understand the old books; he will, at the same time, not use their exact words, so that the result is he spoils and bungles what has been said in a much better and clearer way by the old writers; since they wrote from their own lively knowledge of the subject. He often leaves out the best things they have written, their most striking elucidations of the matter, their happiest remarks, because he does not recognise their value or feel how pregnant they are. It is only what is stupid and shallow that appeals to him. An old and excellent book is frequently shelved for new and bad ones; which, written for the sake of money, wear a pretentious air and are much eulogised by the authors’ friends. In science, a man who wishes to distinguish himself brings something new to market; this frequently consists in his denouncing some principle that has been previously held as correct, so that he may establish a wrong one of his own. Sometimes his attempt is successful for a short time, when a return is made to the old and correct doctrine. 
These innovators are serious about nothing else in the world than their own priceless person, and it is this that they wish to make its mark.
22ms here. 22ms there. "Focused on human perception" here, "Focused on human perception" there.

And suddenly we have what is basically a supercomputer unable to perform anything without lag.
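The "22ms here, 22ms there" complaint is just addition: each layer's latency is individually below the threshold anyone bothers to fix, but the user experiences the sum. A toy illustration (every layer name and number below is made up):

```python
# Hypothetical per-layer latencies in milliseconds; each one, taken
# alone, passes a "focused on human perception" budget.
layers = {
    "input driver": 8,
    "compositor": 22,
    "UI framework": 22,
    "GC pause": 12,
    "render + vsync": 22,
}

total_ms = sum(layers.values())
frames_at_60hz = total_ms / (1000 / 60)  # one 60 Hz frame is ~16.7 ms

print(f"{total_ms} ms end to end, ~{frames_at_60hz:.1f} frames of lag")
```

No single layer looks guilty in isolation, which is exactly why the total never gets fixed.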