node created 2012/06/17
last changed 2012/06/17
 2 w ago

bullshit-introducing clichés

  • at the end of the day..
  • any reasonable person would agree that..
  • I'm not entirely sure that I'm convinced I fully agree that..
 3 w ago

In response to "Superintelligence - The Idea That Eats Smart People"

http://idlewords.com/talks/superintelligence.htm

The danger of computers becoming like humans is not as great as the danger of humans becoming like computers.
What eats me is the trajectory we are on as humans. Runaway actual intelligence, even if it destroyed humanity, wouldn't worry me as much; I'd wish it good luck. IMO even a totally random dice roll is better than what we're aiming at. But AI is more a meme than even an honest intent. It's like saying "I really really want blueberry pie", but when you ask what that is, it gets real murky real quick. That doesn't stop the hype, as if wanting something a lot makes up for not knowing what it is. But that doesn't prime a person to *make* blueberry pie, it primes them to get lured by what they think is the smell of blueberry pie.

Here's something to note: as a discussion about "AI" grows in length, the probability of things getting explained via something somebody saw in a movie, read in a book, or saw on TV (glossing over the 99.9% of the "details" those left out) approaches 1. You may say we make this fiction because of our achievements, or may point to things that actually did come to pass (of course, compared to the stuff that didn't, even from the same authors, that's nothing). And I love using examples, too, and I sure love quotes.

But still, I think when we are this steeped in variations of the same thing over and over and over, of course we'll "consider it" at some point, and the moral or philosophical depth is drastically reduced by our already being primed. We're like people who don't see what we build with our hands, because we wear VR goggles that show us movies of our childhood or some console game.

What I can see us realistically making are "idols" with eyes that do not see, with audio output, perfect speech synthesis, that does not convey meaning, incredibly fast analysis that is not thought. From the get-go, starting with the Turing test, it was more about what something seems like from the outside than what it is to itself on the inside.

Furthermore, we might make human-level AI no problem, EZ PZ, but not by making AI so smart, rather by making humans dumber. We're already training ourselves to select what we consume and think from discrete pre-configured options. We notice and complain about the effects of this in all sorts of smaller areas, but it's a general trend, and I think it's not so much about creating something "better" than humans as about removing human agency.

The frightening coincidence of the modern population explosion with the discovery of technical devices that, through automation, will make large sections of the population 'superfluous' even in terms of labor, and that, through nuclear energy, make it possible to deal with this twofold threat by the use of instruments beside which Hitler's gassing installations look like an evil child's fumbling toys, should be enough to make us tremble.
"Eichmann in Jerusalem: A Report on the Banality of Evil"
Meanwhile, there's this idea that humans becoming "superfluous" means we'll all be free from "bad" work, and free for fun work and leisure. How would we get from an *increasing* concentration of wealth in fewer hands to some communist utopia? Is that some kind of integer overflow, where enough greed and power wrap around to sharing and letting others live and decide their own fate? We're connected to that (like Michael Scott is to the baby of his boss) by delusion; the path we're on doesn't lead there.

Throw away a word here, do something that "everybody does" there, adapt to "how the world is" some, and there you go: a blank nothing that can be deprecated without guilt or resistance. The desire to control human agency is met more than halfway by our desire to shed it, to abdicate responsibility, to become a piece of flotsam flowing down the river of history to the ocean of technotopia, to enter the holy land of the holodeck, where we can consume endlessly. We digitize, we sample, that's how we make things "manageable", and at high enough resolution we can fool ourselves, or have something "good enough to work with".

And just like children who get too much sugar too early tend not to like fruit as much, because it isn't as extremely sweet, our abstractions lure some people to prefer them over the dirty, fractal, infinite real world, or to prefer the exchange of emojis and pre-configured figures of speech over real human contact, silence that isn't awkward, thinking about what you're trying to say, or even coming up blank and that being okay... Just like we go "posterized, high contrast" in all sorts of ways already, I have no problem supposing that we will come up with a form of alienation like that, but for thinking. I just have no clue what it will look like.

We already have it with language of course, but I'm sure we can take that to the next level, maybe with neural interfaces. If we can't read and transmit thoughts in their fullness and depth, then hey, just reduce our thoughts to the equivalent of grunts, that might work. Become like a computer, 0 and 1. Convince yourself that's just what humans have been all along, remember Star Trek wisdom, don't be so proud as to consider your brain more than a "meat machine", don't deny Data his quest to become human! Might is right, cue super emotional music swelling up.
 4 w ago

spooky thought of the day

The attack on general purpose computing is just a means, the attack on general purpose thinking is the end.
 4 w ago
What turned me off from most gaming was the soullessness on many levels. But just looking at youtubers, 99% of their "discussions" of games make me want to throw up, too, and that includes most supposedly "serious" ones. Even when they're going by a script, if you reduced what they said to the actual meanings, there would be little to nothing left except misused phrases.

I used to despise show wrestling as a kid, but yesterday I saw 5 minutes of a Star Craft world championship finals game, and the announcers made me insane with hatred within 15 seconds. "What an epic moment", oh, how those units come out of the base "*so* instantly". Consider Bill Hicks' imitation of Jay Leno, then consider how much worse than Jay Leno "gaming" shills have become.

My theory is that this has to do with a shift from games being about something, even if that's just silly fun or violence, to being about escaping from facing reality. People say shit that doesn't mean anything, and the people they say it to don't bother to ask "what does that even mean", because one hand washes the other, and they are in an unconscious conspiracy of pretending that what they cobble together are sentences.
 3 m ago

Climate change jokes

Oh no! For the first time in 69 years we no longer have overstuffed shelves and have to choose between 3 instead of 7 varieties of tomatoes.

In other words: whoever takes climate change seriously is a brainless consumer, unlike those who make fun of any seriousness (and have to, in order to project their own stupidity).
 4 m ago

Immortality

Ever noticed that the people who obsess about immortality suck really badly at life? I noticed that during my youth... whenever anything about the subject was on TV, I watched it with fascination, and slowly noticed rather unflattering patterns.

You ever look at their faces? ... “I’m pro-life!” [makes a pinched face of hate and fear; his lips are pursed as though he’s just sucked on a lemon.] “I’m pro-life!” Boy, they look it, don’t they? They just exude joie de vivre. You just want to hang with them and play Trivial Pursuit all night long.
Bill Hicks

Take this guy, who apparently is still held in high esteem by some in the "tech industry":

http://www.calvertjournal.com/opinion/show/1487/roscosmos-proton-m-tsiolkovsky

However, Tsiolkovsky’s interests did not lie just in the fields of engineering and rocket design; he was also interested in social reform. In 1928 he published a book called The Unknown Intelligence in which he argued that humans would colonise our galaxy and introduce the philosophy of panpsychism, a sinister form of anthropocentric perfectionism with a eugenic streak. Tsiolkovsky believed that atoms have their own form of intelligence and that if all the lower forms of life were eliminated, then the suffering of the “human, higher atoms” would be lessened, as they would not have to go back to the bottom of the pyramid of existence but would be reused again in the highest form of matter — humans. To achieve this, Tsiolkovsky suggested sterilising all fauna and aquatic life, and most of the flora on Earth, leaving only those plants necessary for nutrition. His plans did not stop there, but embraced the full extent of eugenics: he proposed using the same remedy to eliminate all “imperfect” members of humankind, so that only the best, healthiest and most intelligent people would be allowed to reproduce. Their offspring would then go on to create a higher caste of Nietzschean Übermenschen and, ultimately, reach the much longed-for goal of immortality.

Nietzschean, you say? Nietzsche would have puked his guts out.

The earth has become small, and on it hops the last man, who makes everything small. His race is as ineradicable as the flea; the last man lives longest.

'We have invented happiness,' say the last men, and they blink. They have left the regions where it was hard to live, for one needs warmth.
 5 m ago

"Hacksaw Ridge" - what a story o_O

https://en.wikipedia.org/wiki/Hacksaw_Ridge

Other changes to the story occur near the end of the movie when Doss is placed on a stretcher. In real life, Doss had another wounded man take his place on the stretcher. After treating the soldier, a sniper shot fractured Doss' arm and he crawled over 300 yards to safety after being left alone for five hours. Gibson omitted this from the film because he felt that the audience would not find the scene believable.
 7 m ago

"If I don't do it, someone else will"

Corollary: plenty of skilled engineers with fewer moralistic constraints will jump at the chance to do interesting work for high pay. For a company as large and wealthy as Google, they can continue to raise offer salaries until they are adequately staffed.

There is a school of thought that recommends “moral” people doing “immoral” work because if those people left then other “immoral” people will take those jobs and more readily implement “immoral” features. So the “moral” engineers have an incentive to stay and act as a front line against “immoral” actions, or at least have an insider’s position for whistleblowing.

Military drones are here to stay, and whether or not the US builds them, other military powers certainly will.

Ultimately, I don’t think this changes anything.

Joseph Weizenbaum pointed out the intellectual and moral bankruptcy of this "logic" in an interview: it's like saying every year there's bound to be rapes happening in Berlin -- so why not just go out and rape someone?

If your only excuse is that someone else would do it, you *are* that other person. Any simple person on the street would probably understand Weizenbaum's point a lot better than some people who frequent HN. Isn't that interesting? Do you really think you know something he didn't, rather than merely demonstrating what has been well known for about a century to anyone with even a slight interest in personal responsibility, sophistry, and the industrialization of murder?

So the “moral” engineers have an incentive to stay and act as a front line against “immoral” actions

Instead they seem to prevent NOTHING in actual history so far, but time and time again act as fig leaves. "Yeah, it's bad, but there are some good guys in there trying to do the right thing." They're offering themselves up as hostages, in a way, and it robs society of the determination with which it would otherwise defend itself. "School of thought", more like school of unthought. School of "nobody could have known that would happen".

It's like the onion-layer structure of totalitarian systems: front organizations, with ever more secretive, slightly more extreme inner layers that look down on the fools in the outer circles for not being radical enough, or for making compromises with the outside world that "just doesn't get it". That layer structure and the playing for time serve a function, and it works beautifully. In the end, everybody is just reacting, and nobody is responsible.

We're essentially building a tower that will short-circuit the heavens and the earth and kill us all if we finish it, and that will kill us all if it just keels over. But that doesn't mean you can't climb down instead of up, and spend energy on finding ways down, on building nets for others to jump some distance safely, rather than on rationalizing climbing up. Yeah, if you don't get drunk somebody else still might, but guess what, the best way to detect and address that is sobriety on your part. Swimming further out into the ocean of kool-aid just increases the distance you have to swim back; it does not lead to that ocean turning into drinkable water, much less land.
 8 m ago

made my day

Emphasis mine:
Now, perhaps even more ominously, Quartz pointed out on Sunday that Hannah Arendt’s 1951 analysis of Nazism and Stalinism The Origins of Totalitarianism has sold out on Amazon completely. This is extremely metal. The Origins of Totalitarianism is a nearly 600-page, dense slab of genius that I fondly recall never quite finishing in college. 1984 and the Constitution both easily fit within its shadow. It’s worth it.

from here, via Why we all need to read ‘The Origins of Totalitarianism’