Blogotariat

Oz Blog News Commentary

Articles from Digitopoly

Can a superintelligence self-regulate and not destroy us?

November 16, 2017 - 00:11 -- Admin

One of the most compelling reasons why a superintelligent (i.e., way smarter than human) artificial intelligence (AI) may end up destroying us is the so-called paperclip apocalypse. Posited by Nick Bostrom, this scenario involves some random engineer creating an AI with the goal of making paperclips. That AI then becomes superintelligent and, in the single-minded pursuit of paperclip making, ends up appropriating all of the world's (and maybe the universe's) resources.

Apple and the AI non-threat

November 7, 2017 - 05:02 -- Admin

Technology and business forecasting is hard. I get that. That is one reason I try to avoid it, except, of course, for fun or to get attention. But the one thing that continues to perplex me is why pundits keep coming back to one specific well: the claim that Apple is doomed.
