
Saturday 6 October 2012

Human Limits?


 “Humanity Won’t Be Augmented, It Will Be Drowned”
By: Hugo de Garis

The Transhumanists, as their label suggests, want to augment humanity, to extend humanity to a superior form, with extra capacities beyond (trans) human limits, e.g. greater intelligence, longer life, healthier life, etc. This is fine as far as it goes, but the problem is that it does not go anywhere near far enough. My main objection to the Transhumanists is that they seem not to see that future technologies will not just be able to “augment humanity”, but veritably to “drown humanity”, dwarfing human capacities by a factor of trillions of trillions. For example, a single cubic millimeter of sand has more computing capacity than the human brain by a factor of a quintillion (a million trillion). This number can be estimated readily enough. Count the atoms in a cubic millimeter, assume that each atom manipulates one bit of information, switching in femtoseconds, and compare the result with the estimated bit-processing rate of the human brain, about 10^16 bits per second; the brain's rate works out to be about a quintillion times smaller.
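The arithmetic behind this quintillion-fold ratio can be sketched directly. The figures below (atom count, switching speed, brain bit rate) are the order-of-magnitude assumptions from the paragraph above, not measured values:

```python
import math

# Order-of-magnitude sketch of the sand-versus-brain estimate.
# All figures are assumptions taken from the essay, not measurements.

atoms_per_mm3 = 1e19          # rough atom count in one cubic millimeter of solid matter
switches_per_second = 1e15    # one bit flip per femtosecond (1 / 1e-15 s)

sand_bits_per_second = atoms_per_mm3 * switches_per_second   # ~1e34 bits/s

brain_bits_per_second = 1e16  # the brain estimate used in the essay

ratio = sand_bits_per_second / brain_bits_per_second
print(f"sand/brain ratio is roughly 10^{round(math.log10(ratio))}")
# ~10^18, i.e. a quintillion (a million trillion)
```

With these assumed inputs the ratio comes out near 10^18, matching the essay's "quintillion"; a larger atom count or faster switching would only widen the gap.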
Thus artificial brains will utterly dwarf human brains in their capacities, so the potential of near-future technologies (i.e. only a few decades away) will make augmenting humanity seem like a drop in the ocean. My main beef against the Transhumanists is that they are not “biting the bullet”, in the sense of not taking seriously the prospect that humanity will be drowned by vastly superior artilects who may not like human beings very much once they become hugely superior to us. The Transhumanists suffer from tunnel vision. They focus on minor extensions of human capacities, such as greater intelligence, longer and healthier life, bigger memory, faster thinking, etc. They tend to ignore the bigger question of “species dominance”, i.e. should humanity build artilects that would be god-like in their capacities, utterly eclipsing human capacities?
Since a sizable proportion of humanity (according to recent opinion polls that I have undertaken, though these need to be scaled up) utterly rejects the idea of humans being superseded by artilects, they will go to war, when push really comes to shove, to ensure that humans remain the dominant species. This will be a passionate war, because the stakes have never been so high, namely the survival of the human species, not just countries or a people, but ALL people. This species dominance war (the “Artilect War”) will kill billions of people, because it will be waged with 21st century weapons that will be far more deadly than 20th century weapons, probably nano-based.
The Transhumanists are too childishly optimistic, and refuse to “bite the bullet.” They do not face up to the big question of whether humanity should build artilects or not, and thus risk a gigadeath Artilect War. The childlike optimism of the Transhumanists is touching, but hardly edifying. They are not facing up to the hard reality. Perhaps, deep in their hearts, the Transhumanists feel the force of the above argument, but find the prospect of a gigadeath Artilect War so horrible that they blot it out of their consciousness and pretend that all will be sweetness and light, all very happy, but not very adult.

A Reply to the Above by Steve Richfield

I used to believe as you do, but my opinion has since morphed a bit after many discussions with Singularitarians.
The Singularity is a religion, only instead of praying to an existing God, its adherents are seeking to create their own God. The parallels to the Tower of Babel are there for all to see.
I have come to believe that anyone who would buy into a “human-friendly AGI” must necessarily be too stupid ever to build one, and hence is no threat. Simply ignore the Singularitarians, as you now ignore the many other lunatic cult religions.
What is a threat is the diversion of resources to such folly, and the potential awakening of governmental regulation over competent AI development.
Note that several companies now have prototype cold fusion reactors working, but there has been SO little press after past bogus claims. It will be interesting to watch as this technology moves into the mainstream, because the dangers are closely parallel, in that lunatics could potentially build hydrogen bombs in their kitchens. While this may not be as dangerous as releasing super-human AGIs, it is nonetheless dangerous enough to observe the social response.
