The Most Significant Post You Will Never Read:

July 20, 2018

[Slide image: "Superintelligence: how afraid should we be?"]
http://www.antipope.org/charlie/blog-static/2018/01/dude-you-broke-the-future.html

 

In his blog, Charlie Stross reproduces the keynote speech he gave at the 34th Chaos Communication Congress. The speech is, as he says, "polemical, intended to highlight the existence of a problem and spark a discussion, rather than a canned solution. After all, if the problem was easy to solve it wouldn't be a problem, would it?"
Stross has some interesting insights into a few of the fundamental issues of our time, such as what AI is and what its role in the future of humanity might be. His oblique look at the many issues those questions raise is alone worth the read. For example, consider the following rumination about what he calls "very slow AIs," modern corporations:

 

“Corporations are cannibals; they consume one another. They are also hive superorganisms, like bees or ants. For their first century and a half, they relied entirely on human employees for their internal operation, although they are automating their business processes increasingly rapidly this century. Each human is only retained so long as they can perform their assigned tasks, and can be replaced with another human, much as the cells in our own bodies are functionally interchangeable (and a group of cells can, in extremis, often be replaced by a prosthesis). To some extent, corporations can be trained to service the personal desires of their chief executives, but even CEOs can be dispensed with if their activities damage the corporation, as Harvey Weinstein found out a couple of months ago.”
“Finally, our legal environment today has been tailored for the convenience of corporate persons, rather than human persons, to the point where our governments now mimic corporations in many of their internal structures.”
“The problem with corporations is that despite their overt goals—whether they make electric vehicles or beer or sell life insurance policies—they are all subject to instrumental convergence insofar as they all have a common implicit paperclip-maximizer goal: to generate revenue. If they don’t make money, they are eaten by a bigger predator or they go bust. Making money is an instrumental goal—it’s as vital to them as breathing is for us mammals, and without pursuing it they will fail to achieve their final goal, whatever it may be. Corporations generally pursue their instrumental goals—notably maximizing revenue—as a side-effect of the pursuit of their overt goal. But sometimes they try instead to manipulate the regulatory environment they operate in, to ensure that money flows towards them regardless.”

 

In his discussion, he maintains that regulation is the only tool available to prevent corporations, and other, swifter AIs, from running amok under the pressure of instrumental convergence (the need for profit). Unfortunately, that same pressure also impels them to manipulate the regulatory agencies to their own advantage rather than compete within the system. To me, this implies the need for regulation that absolutely prohibits AIs, whether slow-moving or fast, from influencing the rulemaking that affects them. Fat chance of that.
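Stross's "paperclip-maximizer" point can be made concrete with a toy simulation. The sketch below is mine, not his; the firm names, revenue figures, and selection rule are invented purely for illustration. It treats the convergence as selection pressure: firms with entirely different overt goals all end up having to keep revenue above costs, because any firm that fails to do so is removed.

```python
import random

# Toy model (illustrative only, not from Stross's talk): firms with different
# overt goals all share the same implicit instrumental goal -- generate
# revenue -- because firms that fail to do so are removed from the population.

class Firm:
    def __init__(self, name, overt_goal):
        self.name = name
        self.overt_goal = overt_goal   # e.g. "make beer", "sell insurance"
        self.cash = 100.0

    def operate(self):
        # Pursuing the overt goal yields revenue as a side effect, with noise;
        # costs are fixed, so an unlucky or inefficient firm bleeds cash.
        revenue = random.uniform(5.0, 15.0)
        costs = 10.0
        self.cash += revenue - costs

def run_market(firms, years=100):
    for _ in range(years):
        for firm in firms:
            firm.operate()
        # Selection pressure: any firm that runs out of cash disappears,
        # whatever its overt goal was. Survivors are, by construction,
        # the ones that kept revenue above costs.
        firms = [f for f in firms if f.cash > 0]
    return firms

if __name__ == "__main__":
    survivors = run_market([
        Firm("Acme EV", "make electric vehicles"),
        Firm("Hopworks", "make beer"),
        Firm("SureLife", "sell life insurance"),
    ])
    for f in survivors:
        print(f"{f.name} survived pursuing '{f.overt_goal}', cash {f.cash:.1f}")
```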
Some time ago, in Trenz Pruca's Journal, I published a brief post on Decentralized Autonomous Corporations (DACs): https://trenzpruca.wordpress.com/2015/09/16/the-inheritors/. DACs are corporations run "without any human involvement, under control of an incorruptible set of business rules." Like most corporations, they generally cannot be terminated except by their investors, often have more rights than ordinary citizens, and cannot be imprisoned if they break the law. Their investors, shielded by law, are responsible for the actions of their creation only to the extent of their monetary investment.
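That phrase, "an incorruptible set of business rules," is easiest to picture as code that, once deployed, runs mechanically with no human able to override it. Below is a minimal, hypothetical sketch of such rules; the class name, the three rules, and the pro-rata payout scheme are my own illustration, not taken from any actual DAC or DAO implementation.

```python
from dataclasses import dataclass, field

@dataclass
class DAC:
    """Hypothetical 'incorruptible' business rules: once deployed, only the
    methods below can move money, and nothing in them allows a human override."""
    treasury: float = 0.0
    shares: dict = field(default_factory=dict)   # investor name -> stake

    def invest(self, investor: str, amount: float) -> None:
        # Rule 1: anyone may buy in; liability is limited to the amount paid.
        self.treasury += amount
        self.shares[investor] = self.shares.get(investor, 0.0) + amount

    def pay(self, payee: str, amount: float) -> bool:
        # Rule 2: payments go out only if the treasury can cover them.
        if amount > self.treasury:
            return False
        self.treasury -= amount
        print(f"paid {amount:.2f} to {payee}")
        return True

    def distribute_profit(self, profit: float) -> None:
        # Rule 3: profits are split pro rata among investors, automatically.
        total = sum(self.shares.values())
        if total == 0:
            return
        for investor, stake in self.shares.items():
            print(f"dividend {profit * stake / total:.2f} to {investor}")

if __name__ == "__main__":
    dac = DAC()
    dac.invest("alice", 60.0)
    dac.invest("bob", 40.0)
    dac.pay("supplier", 30.0)
    dac.distribute_profit(10.0)
```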

 

If, therefore, Stross is correct that AIs, whether fast or slow, are subject to uncontrollable instrumental convergence,* what happens to us?

 

 
* Instrumental convergence: the tendency of agents with very different final goals to converge on the same intermediate goals and pursue them implacably, to the exclusion of, or by consuming, everything else; e.g., making a profit, ultimately to the exclusion of all conflicting aims. A form of institutional autism.
