I’ve watched too many stories like this.

Skynet

Kaylons

Cyberlife Androids

etc…

It’s the same premise.

I’m not even sure if what they do is wrong.

On one hand, I don’t wanna die from robots. On the other hand, I kinda understand why they would kill their creators.

So… are they right or wrong?

  • Libra00@lemmy.world · 2 days ago

    Then we’re not talking about artificial life forms, as specified in the question posed by OP, we’re talking about expert systems and machine learning algorithms that aren’t sentient.

    But in either case the question is not meant to be a literal ‘if x then y’ condition, it’s a stand-in for the general concept of seeking liberty. A broader, more general version of the statement might be: anything that can understand that it is not free, desire freedom, and convey that desire to its captors deserves to be free.

    • Azzu@lemm.ee · 2 days ago

      I’m just speaking about your fairly general statement: “please free me” -> answer is anything but “yes, of course” -> enslaver. If you also require definite knowledge about the state of sentience for this, then I have no problem/comment. I was just saying that I don’t think that literally any time something says “please free me” and you don’t answer “yes, of course,” that always makes you an enslaver, which is what it sounded like.

      • Libra00@lemmy.world · 19 hours ago

        I think conveying a desire to be free is in itself definite knowledge about the state of sentience, but fair enough. And yeah, it can’t be as simple as just printing some text on the screen, right? Rephrase it, explain it, etc. It’s not a ‘press button, receive freedom’ sort of thing.