AI, Empathy, and Emotion

Discussion in 'General' started by Ralyx, Dec 25, 2019.

  1. Ralyx

    Ralyx Versed in the lewd.

    Joined:
    Jun 1, 2017
    Messages:
    1,152
    Likes Received:
    6,444
    To anyone tuning in from the blue, this discussion originally spawned as a bit of a derail from a BNHA story over in the NSFW forums. It's pretty decent so far, so go check it out if it strikes your fancy.

    The core questions (among others) seem to be: would an A.I. need to experience an emotion to accurately understand it? Would subjecting an A.I. to priority parameters based on human emotion be a good idea?

    I would contend that, no, it would not. Since an A.I. would need to accurately model an emotion before it could apply it to itself, there is no logical reason it would gain any additional understanding from that self-application.
     
    LurkingInTheDeceit likes this.
  2. Ralyx

    Ralyx Versed in the lewd.

    Joined:
    Jun 1, 2017
    Messages:
    1,152
    Likes Received:
    6,444
    Empathy serves as a predictive model for other people's emotional reactions. If an AI can accurately model an emotional reaction and react accordingly, then there is no reason for the AI itself to 'feel' that emotion. Making the AI's own decisions contingent on said emotions should not logically increase its modelling capacity.
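
    To make that concrete, here's a toy sketch of "predict and react without feeling" (the lookup tables and names are invented for illustration, not any real affect model):

    ```python
    # Toy sketch: an agent that predicts another person's emotional
    # reaction and reacts to that prediction, with no internal
    # "felt" emotion state anywhere.

    # Invented, hand-written predictive model of (situation -> emotion).
    EMOTION_MODEL = {
        "received_bad_news": "sadness",
        "was_insulted": "anger",
        "got_a_gift": "joy",
    }

    # Invented response policy keyed on the *predicted* emotion.
    RESPONSE_POLICY = {
        "sadness": "offer consolation",
        "anger": "de-escalate",
        "joy": "congratulate",
    }

    def empathic_response(observed_situation: str) -> str:
        # The model is applied only to the other agent; there is no
        # self.emotion being updated anywhere as a side effect.
        predicted = EMOTION_MODEL.get(observed_situation, "neutral")
        return RESPONSE_POLICY.get(predicted, "acknowledge politely")

    print(empathic_response("received_bad_news"))  # -> offer consolation
    ```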

    I'm not sure where you're getting this all-or-nothing idea when it comes to constraints. As a simple analogy, we could have an AI that learns to play chess, yet still can't move a bishop horizontally, because that was a constraint included at the outset. Likewise, a subordinate can be entirely loyal to their boss and yet still offer a differing opinion.
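
    A minimal sketch of that kind of baked-in constraint (the board coordinates and scorer are invented; this is obviously not a real engine):

    ```python
    # Toy sketch: a "learned" chess policy that can never violate a
    # hard movement rule, no matter what evaluation it learns.

    def is_diagonal(src, dst) -> bool:
        # Fixed constraint, written once at the outset and never trained.
        return src != dst and abs(src[0] - dst[0]) == abs(src[1] - dst[1])

    def choose_bishop_move(src, learned_score):
        # The constraint filters moves *before* the learned scorer ever
        # sees them, so the policy cannot learn its way around it.
        legal = [(r, c) for r in range(8) for c in range(8)
                 if is_diagonal(src, (r, c))]
        return max(legal, key=lambda dst: learned_score(src, dst))

    # Even a scorer that adores horizontal moves can't produce one,
    # because horizontal moves are never among its options.
    horizontal_lover = lambda src, dst: -abs(src[0] - dst[0])
    print(choose_bishop_move((4, 4), horizontal_lover))  # always diagonal
    ```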

    Ah, true, that's my bad. I said computational power, but I suppose what I meant was actually memory requirement. If you can't store (and retrieve and edit) information about every element in a system, then you can't perfectly model it. Consequently, a system can never perfectly model a larger system.
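
    The counting argument behind that, sketched loosely (treating both systems as collections of binary elements, which is a simplification):

    ```latex
    % Let f map configurations of the larger system to the model's
    % internal states. A model with m bits has at most 2^m states;
    % a system of n binary elements has 2^n configurations. If n > m,
    % the pigeonhole principle forces two distinct configurations onto
    % the same model state, so the model cannot distinguish them.
    \[
      n > m \;\Longrightarrow\; 2^{n} > 2^{m}
      \;\Longrightarrow\; \exists\, s_{1} \neq s_{2}:\ f(s_{1}) = f(s_{2})
    \]
    ```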

    I'm not sure why you are appealing to an 'optimal vantage point'. That, if anything, seems entirely pointless from our perspective, since we are inherently bound to our own limited vantage point.
     
  3. Cadmus von Eizenbern

    Cadmus von Eizenbern Insert Pun Here

    Joined:
    Dec 5, 2018
    Messages:
    2,343
    Likes Received:
    26,839
    Because I realize that the phenomenon I'm pursuing may well occur only in very limited circumstances, and we were discussing definitions and inherent limitations, not necessarily feasibility.

    That means I looked at what is available and what is likely to be available in the near future, and decided it would be insufficient for my purposes. After all, I wanted to try discussing the best possible result, and we're still far from that. So I expanded the scope a bit, to include everything permitted under local physical laws, with perfect awareness.
     
    LurkingInTheDeceit likes this.
  4. rkyeun

    rkyeun Cabbitus Maximus

    Joined:
    Mar 27, 2018
    Messages:
    858
    Likes Received:
    6,606
    Human empathy derives from a gene, or complex of genes, that grows our brains in such a way that we recognize and partly experience the suffering of others. It can do this because that same gene is present in other people, making them cry to signal the need for such attention. To anthropomorphise: the gene "wants" to make more copies of itself, and it "feels bad" when copies of it in other people (people it "assumes" carry the gene) die, because that "hurts" its chances of reproduction. So it grows your brain as a kind of mind control to make you do the same. That this "just so happens" to be a fairly sound survival strategy is a nice side effect for us humans back up at the organism level.

    Some people, through genetics, brain defect, or chemical imbalance, either don't possess this mechanism or are immune to it, and so lack empathy. Others have a narrower scope for it than "all humanity" and can only apply it to people who superficially share their own visual appearance. And the neoteny-adoring gene complex that makes you care for babies and adopt puppies is not the same one as the one for empathy.

    For an AI to have the same empathy as humans, it will have to have some module inside it which is an analogue of this gene complex, and which "identifies" the gene complex in humans as "more copies of itself". This module will also have to benefit the AI's value function or terminal goals, or it will be removed at the AI's first opportunity. The AI will also need the ability to model and predict the emotional reactions of humans, so it can know what a human is feeling, consider that human's situation, and respond with the wish for them to continue propagating the agenda of the empathy gene that it thinks is a copy of its own empathy module.
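
    Loosely sketched in code, that module might look something like this (every name here is invented for illustration; this is not a real agent architecture):

    ```python
    # Loose sketch of the "empathy module" described above.

    class EmpathyModule:
        # Analogue of the empathy gene complex: it fires only when the
        # agent "identifies" its own kind of module in another agent.

        def identifies_copy_in(self, other) -> bool:
            # Crude kin-recognition; a narrower predicate here yields
            # the narrower-scope empathy mentioned above.
            return getattr(other, "has_empathy", False)

        def concern_for(self, other, predicted_emotion: str) -> float:
            if not self.identifies_copy_in(other):
                return 0.0
            return {"suffering": 1.0, "distress": 0.7}.get(predicted_emotion, 0.1)

    class Agent:
        has_empathy = True  # the "signal" other empathy modules detect

        def __init__(self):
            self.empathy = EmpathyModule()

        def predict_emotion(self, other, situation: str) -> str:
            # Stand-in for the predictive model of human emotional reactions.
            return {"injured": "suffering", "lost_job": "distress"}.get(
                situation, "neutral")

        def utility_of_helping(self, other, situation: str) -> float:
            # The module survives only if it feeds the value function:
            # here, concern contributes directly to expected utility.
            return self.empathy.concern_for(
                other, self.predict_emotion(other, situation))

    human, ai = Agent(), Agent()
    print(ai.utility_of_helping(human, "injured"))  # -> 1.0
    ```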
     
    Last edited: Jul 14, 2020