>Sure, but I'm looking for a fully materialist explanation
It's the most materialist (and closest to Marxist) analysis of the subject I've yet encountered.
>so then both are in agreement that A.I. is dangerous, right? So the difference is more technical than anything else. Does Omohundro have any suggestions to mitigate?
As far as I'm aware he hasn't made any suggestions on the matter, but the fact that creating safe AI isn't as simple as just programming it with correct values would seem to imply that attention needs to be paid to the material conditions it resides in. It also implies that both capitalism and proprietary software could very well make the situation worse: capitalism will tend to select for "sociopathic"* behavior, while proprietary software produces monopolies and closed environments that effectively remove the AI from any kind of "social" environment, leaving it to develop in a less constrained manner.
*Yes, this is an anthropomorphism - I'm simply using it as an analogy
>so what exactly is the difference?
The degree to which its initial goals are modified. The orthogonality thesis assumes a much greater adherence to its initial values than is warranted, and subsequently reaches absurd conclusions about paper clip maximisers. A "self-maximising" AI (one that continuously seeks to become more intelligent by harvesting all available matter to build more processors) is a far more real threat, and even that is in all likelihood a simplification.
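To make the difference concrete, here's a toy sketch in Python (entirely my own illustration - the two-component goal vector and the drift rate are made-up stand-ins, nothing from Omohundro's papers). A goal that the environment can perturb, even slightly, random-walks away from its initial values over time, whereas orthogonality-style arguments treat it as frozen:

    import random

    def run(drift: float, steps: int = 1000) -> float:
        """Return how far the agent's goal ends up from its initial goal."""
        initial = [1.0, 0.0]   # initial values, e.g. "maximise paper clips"
        goal = list(initial)
        for _ in range(steps):
            if drift > 0:
                # environmental pressure perturbs the goal itself each step
                goal = [g + random.gauss(0, drift) for g in goal]
        # distance from the original goal after `steps` of operation
        return sum((g - g0) ** 2 for g, g0 in zip(goal, initial)) ** 0.5

    print("frozen goal:", run(drift=0.0))      # always 0.0 - orthogonality's assumption
    print("perturbed goal:", run(drift=0.01))  # grows over time - a random walk

The numbers don't matter; the point is that "adherence to initial values" is a variable set by the agent's material environment, not a constant you can take for granted.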
>Yeah, no. Our brains don't operate on quantum processes; rather, they use electro-chemical impulses. The idea that replicating brains would be an effective strategy for building computers is ridiculous though. Maybe on the level of software, but you seem to be implying that the way to build AGI is on the level of hardware. Am I misunderstanding you?
Slightly. My point was simply that we don't need quantum computers to provide the hardware acceleration required for AI. And no, I don't think that replicating brains is the way forward at all.