From: Philip Sutton (Philip.Sutton@green-innovations.asn.au)
Date: Thu Jul 14 2005 - 20:27:46 MDT
Hi Eliezer,
You said:
> I think that all utility functions containing no explicit mention of
> humanity (example: paperclips) are equally dangerous.
The sort of super-powerful AGI that we are discussing in this thread would most likely find a way to propagate itself through the universe, so it is not just a threat to life on Earth but a threat to life everywhere in the universe. Having a specific reference to 'humanity' somewhere in AGI goal structures is probably a good idea, but it is massively inadequate with respect to the need to protect life in the universe at large. Although vitally important to every human, caring for humans is a very small specific case of a much bigger general problem.
Cheers, Philip