From: m.l.vere@durham.ac.uk
Date: Sun May 14 2006 - 17:02:15 MDT
Quoting Phillip Huggan <cdnprodigy@yahoo.com>:
> I don't know why the snipers haven't killed you yet, but you are polluting my
> inbox. What is arbitrarily recognized as right or wrong by individuals is
> man-made, but first-hand conscious experience is an objective property of
> this universe.
Why should anyone care about the conscious experiences of others?
> So there is an objective ranking of universe energy
> configurations that can be termed Morality.
Ok, this depends on the precise definition of terms, but if so, why should we
care?
> We may not achieve anything
> close to the most utilitarian configuration, but it is an objective target
> (not mere subjective smoke and mirrors). That target is the ideal goal for
> an AGI,
I think it is very unlikely that such a thing exists.
> but anything that doesn't make our aggregate standards-of-living over
> the course of our future light-cone volume worse, would be acceptable.
Perhaps, but I can imagine CV making it (at least for me) 'worse' - not that
we can know for sure.
> You
> are pawning off your own selfish philosophy (nothing wrong with it) as an AGI
> barrier.
Eh? No, I personally don't want an AI built which will force morality onto me.
Thus I communicate this to an AGI list.
> Being selfish does not make you a nihilist.
Agreed. Being a nihilist means questioning everything, and accepting
no 'absolute truth' which cannot be proven.
> Spamming sl4 with
> kindergarten philosophy is selfish.
Kindly disprove the philosophy before you insult it.