Stephen Hawking, together with physicists Max Tegmark and Frank Wilczek and AI expert Stuart Russell, has penned an editorial — Transcending Complacency on Super-Intelligent Machines — urging caution in research aimed at creating artificial intelligence. As if Hollywood has not already wagged that finger at us, plenty of times!
To be clear, Hawking does not (as many in media have misquoted him) say that AI in itself would be humanity’s “worst mistake.” He very clearly states that the mistake would be paying too little attention to our responsibilities in the creation process. Taking insufficient care to get it right.
I know many members of the AI research community and this topic is widely discussed. But how do you take ‘precautions’ to keep new cybernetic minds friendly, when (1) we don’t know which of six very different, general approaches to the problem might eventually bear fruit, and (2) it is clear that the one method that cannot work is the “laws of robotics” of Asimov fame?
I explore all six approaches in various works of fiction and nonfiction. One of the six approaches — the one least-studied, of course — offers what I deem our best hope for a “soft landing,” in which human civilization and some recognizable version of our species would remain confidently in command of its own destiny.
At the opposite extreme is this possibility: AI might arrive that’s deliberately designed to be predatory, parasitical, ruthless and destructive, in its very core and purpose. Alas, this is exactly the approach that is receiving more funding – and in secret – than all other forms of AI combined. And no, I am not talking about the military!
== How it all got started ==
Let’s move back to a much earlier stretch of this road. A fun article in Time Magazine (online) celebrates the 50th anniversary of the clunky but epochal programming language BASIC. Technology journalist Harry McCracken traces the history of BASIC: how it was designed to give average students a portal into programming, and how it helped launch both the PC revolution and the Microsoft empire.
McCracken cites my famous (or infamous) SALON article — Why Johnny Can’t Code — about the demise of programming-language accessibility on modern computers. That demise had a tragic and harmful side-effect: it eliminated from school textbooks the shared experience of introductory programming exercises that — for a decade or so — exposed millions of Gen-Xers to small tastes of programming. Far more than almost any millennial receives.
And no, I was not praising BASIC as a language, but rather the notion that there should be SOME kind of “lingua franca” included in all computers, tablets and so on. For a decade, BASIC was so universally available that textbook publishers put simple programming exercises in most standard math and science texts. Teachers assigned them, and thus a far higher fraction of students gained a little experience fiddling with 12-line programs that would make a pixel move… and thus knew, in their guts, that every dot on every screen obeys an algorithm. That it’s not magic; it only looks that way. Apple, Microsoft and Red Hat could meet and settle this in a day – after which textbook publishers and teachers could go back to assigning marvelous little computer exercises, a great way to introduce millions of kids to… the basics.
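For flavor, the sort of exercise those textbooks assigned can be sketched in a few lines. Here is a rough modern analogue in Python of the classic “make a pixel move” program (the grid size and names are my own illustration, not a program from any textbook):

```python
# A rough analogue of the old textbook BASIC exercise:
# a single "pixel" steps across a row of the screen,
# one position per printed frame, driven by a plain loop.
WIDTH = 20  # illustrative screen width, in characters

def frame(x, width=WIDTH):
    """Render one frame: a row of dots with the pixel at column x."""
    row = ["."] * width
    row[x] = "*"
    return "".join(row)

for x in range(WIDTH):
    print(frame(x))
```

Running it prints twenty rows with the asterisk marching one column to the right on each line – the same gut-level lesson: every dot on the screen is where it is because an algorithm put it there.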
For those who want the simplest way to re-access BASIC, one of the readers of my “Johnny Can’t Code” article went ahead and provided a turnkey, web-accessed module that you can use instantly to run old (or new) programs.
Have a look at “QuiteBasic”… one of the coolest things any of my readers were ever inspired to produce.
Alas, word has not reached the textbook committees. In a modern era when computers run everything, far fewer kids are exposed to programming than in the 1980s and 1990s. Are you really okay with that?
== Speaking of tech anniversaries ==
John Naughton has published a fascinating article in the Observer, “25 things you might not know about the web on its 25th birthday.” The big-picture perspectives are important, but one especially stands out to me:
Number 18 — The web needs a micro-payment system. “In addition to being just a read-only system, the other initial drawback of the web was that it did not have a mechanism for rewarding people who published on it. That was because no efficient online payment system existed for securely processing very small transactions at large volumes. (Credit-card systems are too expensive and clumsy for small transactions.) But the absence of a micro-payment system led to the evolution of the web in a dysfunctional way: companies offered “free” services that had a hidden and undeclared cost, namely the exploitation of the personal data of users. This led to the grossly tilted playing field that we have today, in which online companies get users to do most of the work while only the companies reap the financial rewards.”
I’ve been working for some time on innovations that (in collaboration with others) could utterly transform the web (and incidentally save the profession of journalism) by making micro payments workable and effective. Got patents too! That and $3.65 will get you a small cappuccino.
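To see why conventional card rails can’t carry micro-payments, a back-of-envelope sketch helps. The fee figures below – $0.30 fixed plus 2.9% per transaction, typical published card-processing rates – are my illustrative assumptions, not numbers from Naughton’s article:

```python
def effective_fee_rate(price, fixed_fee=0.30, pct_fee=0.029):
    """Fraction of a transaction's price consumed by card fees.
    fixed_fee and pct_fee are illustrative typical values."""
    return (fixed_fee + pct_fee * price) / price

# The fixed fee barely registers on a $100 purchase but
# completely swamps a micro-payment-sized transaction.
for price in (0.05, 1.00, 100.00):
    print(f"${price:>6.2f} -> {effective_fee_rate(price):.1%} of the price goes to fees")
```

A five-cent payment would lose roughly six times its own value to fees, which is exactly why “free” ad-and-data-funded services won out – and why any workable micro-payment layer has to batch or otherwise sidestep per-transaction costs.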
== Is the West – especially America – just weird? ==
A large fraction of social science research has focused on the most-accessible supply of survey and test subjects — westerners and especially Americans — under the assumption that “we’re all the same under the skin.” Hence tests that dive below obvious cultural biases ought to reveal what is basically human. But lo and behold, it seems that westerners and especially Americans are “weird” compared to almost all other cultures, in ways exposed by simple Prisoner’s Dilemma-type games and many others that study – say – individuality.
In fact, anyone who compares western – and especially American – values, processes, history and so on with those of almost any other culture that ever existed could have told you that. (Californians are even weirder! Ask Robert Heinlein.) The question is: “weird” as in dangerously unhinged? Or in the sense of “our one chance to escape the traps that mired every other civilization in a pit of dashed hopes and potential”? The jury is still out.
== Art and theology? ==
Speaking of cryptic time messages… Charles Smith (and some others) pointed out a fascinating aspect of the centerpiece image in Michelangelo’s Sistine Chapel paintings… “Apparently, Michelangelo placed god and various angels within what is clearly the outline of a human brain. Am I imagining this? Maybe, but I doubt it. Having dissected many cadavers, he knew his anatomy. He was trying to tell us something about the human mind and its relationship to the idea of god… the idea that man created god in his own image.”
It’s even better. The deity’s legs are sticking out through the cerebellum and his hand of creation emerges from the prefrontal lobes.
Aw heck, let’s stay with “theology” for a while, this time in quotation marks because it is the raging foolish kind. Mega best-sellers are out there proclaiming portent in the “blood moon.”
Astronomers attach no significance to the way lunar eclipses – or “blood moons” – sometimes come in groups of four, or “tetrads.” But that has not prevented a spate of blood-moon mysticism. The four eclipses in this tetrad occur on April 15 (last month) and Oct. 8, 2014, and April 4 and Sept. 28 next year… all of them during Jewish holidays! Which has provoked a bunch of Christian (not Jewish) mystical last-days mongering. Both of the April eclipses occur during Passover, and the autumn ones during the Jewish Feast of Tabernacles.
I’ll leave it to you to find these sites (google “blood moon prophecy”) but at one level it is just more of the dopey millennialism I talk about here… wherein meme-pushers sell the sanctimony drug-high of ingroup “knowledge” to a portion of the public who do not like the very idea of the Future and yearn for it to just go away.
In this case, the “coincidental” overlap of lunar eclipses with Jewish holidays is not “one-in-tens-of-thousands(!)” but very nearly one-to-one. The Hebrew calendar is lunar: Passover and the Feast of Tabernacles each begin on the 15th of a lunar month – that is, at the full moon – and the full moon is the only time a lunar eclipse can occur. Moreover, Passover falls near the first full moon of spring, the Feast of Tabernacles six lunar months later in autumn, and eclipse seasons recur roughly every six months. Hence, of course, tetrads of total lunar eclipses will line up with those two holidays. It is not “coincidence” or a “sign” but a matter of deliberate calendar design… exploited by these “blood moon” predators, preying on the gullible.
The good news? This will pass. Alas, it will be replaced by the next nonsense, then the next. Till we persuade our neighbors to stop hating tomorrow.
== Science Miscellany ==
University of Washington (UW) graduate Thomas Larson is developing a lens that will turn any smartphone or tablet computer into a handheld microscope that magnifies 150 times.
Wow. A supernova that just happened to be magnified by a gravitational lens of an intervening galaxy.
Fascinating. We assumed all brain neurons needed uniform sheaths of myelin to perform well. Apparently not!
Ancient shrimp had a cardiovascular system 520 million years ago, earlier than any other known creature. The fossil dates back to the period when the “Cambrian Explosion” was taking off. Possibly the one time aliens DID meddle with our planet… by flushing a toilet here?
Interesting question: Could playfulness be embedded in the universe?
Fascinating news about the importance of the Y Chromosome. Huh. Maybe males aren’t about to go away, after all. Sorry about that.