Lazulian Philosophy in D. Minor

edited 2012-04-23 01:26:58 in General

This is my personal thread for dumping random bullshit I like to ramble about, assuming no one cares that I make such a thread. Stuff here may be developed further depending on what it is. Also, feel free to reply; I like talking, if you haven't noticed.

So in any case. I was thinking about EVE no Jikan a bit ago, and it got me onto the subject of robots. If they were sufficiently intelligent, would we treat them well, or would they be subjected to the same nonsense your average person is? I guess it would all depend on how soon we recognize them as sufficiently sapient. I imagine there'd be a robot rights movement or something.

I do wonder how soon all of that would happen. Reliable AI is, along with space colonization, one of my "dream milestones": things that, if we achieved them, would signify that we've "made it" as a species, to some extent. Basically, if you consider life a game, I'd consider those to be (some of) the end goals.

Comments

  • I really need to improve my diet. Today I have eaten pizza bagels, cheese crackers, kettle-cooked potato chips, pretzel nuggets, ice cream, and a cheese sandwich. I've had both cherry soda and iced tea to drink. None of this is terribly healthy. I'm gonna end up with diabeetus by the time I'm 20 at this rate.
  • It's 4:20 somewhere.
    I don't understand your question about robots. Why would they ever be treated better than the average person?

    Anyway, if we can create sentience from scratch, it doesn't seem like it would be comparatively hard to prevent it from feeling suffering.
  • Not better than the average person, just not discriminated against.

"Anyway, if we can create sentience from scratch, it doesn't seem like it would be comparatively hard to prevent it from feeling suffering."

    No, probably not, but I would consider that to be like

    I dunno. Cheating, I guess?

    I should note that my positions on these things are rarely logical.

  • It's 4:20 somewhere.
    The whole point of creating AI is for it to serve us. It hardly makes sense to design it to be unhappy while doing so, unless that was part of some experiment. Even then, it'd probably be unethical.

    Pain makes sense for humans to experience, because they need to get through life without taking too much damage in order to reproduce. That's not so for robots.
  • READ MY CROSS-SHIPPING FANFICTION, DAMMIT!

    i get so angry sometimes i just punch plankton --Klinotaxis
  • It's 4:20 somewhere.
    p much