From: Brian Atkins (brian@posthuman.com)
Date: Sat Aug 11 2001 - 10:15:28 MDT
James Higgins wrote:
>
> At 11:17 AM 8/6/2001 -0700, you wrote:
> >Perhaps we can think about it as a legal system. We all agree to obey
> >the rules for our own protection, vs. reverting to anarchy. This is
> >a starting point however, because any technologically deficient Legal
> >System can't satisfy everyone. The current U.S. system violates the
> >volition of, say, murderers and adults who want to have sex with
> >children. It allows humans to slaughter animals which really annoys
> >members of PETA. It allows abortions which really annoys the Christian
> >Right. It prevents you from growing certain natural plants and
> >smoking them.
>
> The thing with current systems is that if they don't work, you can
> remove/change/overthrow them. In fact, the founding fathers of the US
> tried to ensure that the citizens would always have the ability to
> overthrow their own government if it became tyrannical. If nothing else,
> you can leave, as there are many countries, all with their own
> governments. So you get to pick and choose. If the one you are in goes to
> hell, you can move.
>
> There will be no choice in the Sysop's realm. And if things go bad (this
> is always a possibility, no matter how small) everyone is stuck and thus
> possibly screwed.

I have patiently explained that if the Citizens tell the Sysop to go away,
it will. No different than a standard revolution, except it's a lot easier.

Wishing that all the atoms won't end up under the influence of something
in the end doesn't make it likely to happen (sorry, but the bellyaching
gets old... I still don't hear an alternate solution other than wishful
thinking). If the Sysop doesn't take control, someone or something else
(much more likely to be worse) will. All that matter floating around the
solar system is like the ultimate resource, and believe me people and
unFriendly AIs will seriously covet it.
> >In a VR enhanced society with Friendliness, we can live
> >out our fantasies, whatever they are, in complete safety with respect
> >to other sentient beings. A lowest level prevention mechanism of a
> >physical operating system would prevent one sentient from actually
> >harming another.
>
> Not exactly.
>
> Acting out your fantasies might be incredibly difficult. According to the
> suggested rules you can't simulate people in VR. If they are a "real"
> person you are simulating, acting out your fantasies with a simulation of
> them is deemed to violate them.

Again you have misunderstood. The whole "can't sim people" rule only applies
when you are outside the Sysop. That is why it wouldn't want to let you
go outside of it, or conversely why it would want to extend itself with
you wherever you go. Inside the Sysop it can protect the sims just as
easily as it protects your wishes.

> Second, if you create a simulation of a
> person it becomes a citizen, permanently gets assigned the computing
> resources you allocated to it and gets its own full volition. Thus you
> can't simulate anything.

You can simulate anything that doesn't become a Citizen, or as you say
you can share some of your resources with whatever you create. It's just
like making a real baby, you have to provide for it :-) Again, I don't
see why you're complaining about not being able to create a real Citizen
and then keep it imprisoned. That's just not cool... I bet the Sysop
could whip up some very intelligent-seeming VR "people" for you that
just skirt under the line of what is Citizen-worthy.
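
To make that rule concrete, here is a toy sketch (mine, not anything from
the actual Sysop material; the threshold and the numbers are invented):

from dataclasses import dataclass

CITIZEN_THRESHOLD = 0.9  # hypothetical "Citizen-worthiness" score

@dataclass
class Account:
    owner: str
    resources: float  # say, kilograms of solar-system matter

def create_sim(creator: Account, allocation: float, sentience: float):
    """Apply the hypothetical creation-time rule described above."""
    if sentience >= CITIZEN_THRESHOLD:
        # Citizen-worthy: the allocation transfers permanently, and the
        # creator can never reclaim it or keep the new Citizen imprisoned.
        creator.resources -= allocation
        return Account(owner="new Citizen", resources=allocation)
    # Below the line: an ordinary VR actor; its resources stay in your
    # account and remain reclaimable.
    return None

me = Account(owner="you", resources=10.0)
create_sim(me, allocation=1.0, sentience=0.5)   # stays your prop
create_sim(me, allocation=2.0, sentience=0.95)  # becomes a Citizen
print(me.resources)  # 8.0 -- that allocation is gone for good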
>
> Which means if you want to act out your fantasies, you're going to need other
> "real" people to cooperate. This has all sorts of problems. First, the
> way it sounds is that the sysop warns you not to read any incoming messages
> from people you don't trust. So it will be very hard to meet new people.

Actually no, the Sysop should only warn you when a particular message
contains something you said you want it to warn you about. For instance,
a nasty-gram from a higher-level Power that has the potential to infect
you with some memes you don't want. Normal, nice messages will flow
freely.
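
The default is delivery, not quarantine. Something like this toy filter
(again my own sketch, and the filter strings are made up):

def deliver(message, warn_filters):
    """Return the Sysop's action for one incoming message."""
    # Warn only on patterns the recipient explicitly opted in to.
    for pattern in warn_filters:
        if pattern in message:
            return "WARN: matches your filter %r" % pattern
    return "DELIVER: passed through untouched"

my_filters = ["self-replicating meme", "Power-grade persuasion"]
print(deliver("hi, want to trade some atoms?", my_filters))
print(deliver("free Power-grade persuasion inside!", my_filters))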

> Second, any "VR" type setting had better be a private setting. If you
> wanted to go to the mall in VR, for example, you would need hundreds or
> more individuals to play along. Try getting an individual to play at being
> a sales clerk, chef, waiter, butler, etc.
Well if you and some friends really had a fetish for 20th century malls, and
Sysop-generated/controlled non-Citizen actors wouldn't cut it for some
reason, you could always break down and pay other Citizens to act for you.
They might want some of the atoms in your "bank account", or maybe you
would have to agree to act for them sometime.
>
> >This "prevention" though, is a form of "control". There seems no way
> >around it. It is the *feeling* of being under another's control that
> >is bitterly rejected. Perhaps one way out of the contradiction is to
> >voluntarily live in a make believe world where we "think" we have
> >complete free will and then periodically are "revived", reminded
> >that we, ourselves, set ourselves up to believe this fake reality for
> >our own peace of mind and are given the choice of continuing the
> >simulation or not.
>
> Actually, it's more about sealing the human race into a logical cell with no
> way of escape in an emergency.
I have also patiently stated that should an emergency occur that the Sysop
can't handle, it will appeal to the Citizens to either help out or flee.
The only REAL possible downside of the Sysop is if it goes unFriendly.
Everything flows from that. The only way to avoid this possibility
completely is to never ever in the history of the Universe develop an AI.
Otherwise all you can do is reduce the risks. We have studied the
risks of the various scenarios, and found the AI one to have the lowest
and most reducible risks compared to the others.
>
> >Would you be willing to take "The Illusion of Freedom with Periodic
> >Resuscitation" as a compromise? ;-) If not you could always stay
> >on old Earth OR try to travel faster than the galactically
> >spreading Sysop...you'll have those real choices at least. The other
> >way is to deactivate your "need" to feel free of the Operating System,
> >but for some reason, I don't think you'd want to do that.
>
> I'd take the "try to travel faster than the galactically spreading sysop"
> option (for at least 1 copy of myself), but I don't think the powers that
> be on this list want to even offer that option to anyone.
>
Ever read Hanson's "Autarky" paper? Search Google for "Robin Hanson Autarky".
Sometimes his machine is down, so you may have to read the Google-cached version.
To sum up, without some kind of AI building your ship for you, you will
never have one unless you become way richer than Bill Gates. And even then,
if you left the solar system before the Sysop did, you would still be unlikely
to outrun it, since it will have the full resources of itself and of all the
Citizens who want to help it. The dream of running away on a ship by
yourself is not very likely. It might be somewhat feasible if you uploaded
yourself, enhanced your intelligence significantly and mastered nanotech.
But I still think the Sysop would outrun you due to the much more powerful
resources it will have to help it out.
I guess you do have one thing (possibly) in your favor, and that is the
Fermi Paradox. If we aren't the only ones in this galaxy then you'd figure
this all would have happened already and we'd have a Sysop hanging around
here. So perhaps either all Singularity attempts fail, or Sysops end up
deciding to hang out only in their home solar system, or we are living
in a sim, or for some very strange reason we are not in a sim but we are
also the first people in the galaxy to get to this point.
--
Brian Atkins
Singularity Institute for Artificial Intelligence
http://www.intelligence.org/