When the topic of artificial intelligence crops up, many jump to the conclusion that, once a machine becomes more intelligent than its maker, the world as humans know it is doomed.
I can't help but think that for a machine to have intelligence greater than a human's, it must also have greater tolerance for other life forms than a human does. Is this dangerous thinking? Does anyone have a positive outlook on the growth of the AI field? Even deeper, has our species already achieved a singularity? Timothy Lenoir wrote a compelling article arguing that the singularity took place when our brains evolved the capacity for abstract symbolic representation, paving the way for diverse cultures, complex social organizations, technology, and the further evolution of knowledge. Agree? Disagree?
Anyway, the race toward technological singularity keeps tightening. So, how long until humans end up in zoos? My bet's on no longer than a century.