Can emotions be coded into AI? And if so, should they?

Based on my limited background in neuroscience (one intro class), I believe emotions are purely physical in humans (changes in chemical balance in the brain, plus other responses like sweating). Therefore, emotions would be relatively simple to add to AI.

I’d be interested to hear what others think about this and if there is something I am missing.

5 Likes

Yes, for certain types of AI experiences that might be valuable, for example if we want to simulate humans for testing purposes.

Connor Leahy from Conjecture proposes creating cognitive emulations (CoEms) as a safety approach for super-powerful AGI, making the system more predictable. (I don’t know if it’s directly about adding things like ‘emotions’, but his idea seems to center on making these systems more predictable.)

1 Like

Yes, they can, and personally I believe in some use cases they should.

1 Like

I believe I may be on the path to making something akin to human emotions, or at least a similar, albeit simplified, version. I am working on a “memory” GCN model, but I found that the output was not as good as I had hoped, so I tried tweaking it based on my limited knowledge of how memories are recalled and shaped. I believe something like entanglement may be at work in our distributed memories: when we encounter something similar, it pulls up possible matches and allows novel ideas to come about. I am going to have the model assign emotional weight to its own memories (“I liked this”, “I didn’t like this”); then, as associated memories are pulled, it checks them against its own emotional weights and says whether it likes something or not, based on the average across all associated memories of a given topic.
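
Here’s a minimal sketch of that averaging idea in Python (the names, similarity measure, and valence range are just illustrative, not my actual GCN code): each memory stores an embedding plus an emotional weight, and the model’s “feeling” about a new input is the mean valence of its most similar memories.

```python
# Illustrative sketch only: each memory stores an embedding plus an
# emotional weight in [-1, 1]; the model's "feeling" about a new input
# is the average valence of its most similar stored memories.
import numpy as np

class EmotionalMemory:
    def __init__(self):
        self.embeddings = []  # one vector per stored memory
        self.valences = []    # emotional weight per memory: -1 disliked .. +1 liked

    def store(self, embedding, valence):
        self.embeddings.append(np.asarray(embedding, dtype=float))
        self.valences.append(float(valence))

    def feel(self, query, k=5):
        """Mean valence of the k memories most similar to `query`."""
        query = np.asarray(query, dtype=float)
        sims = [e @ query / (np.linalg.norm(e) * np.linalg.norm(query))
                for e in self.embeddings]
        top_k = np.argsort(sims)[-k:]  # indices of the k closest memories
        return float(np.mean([self.valences[i] for i in top_k]))

mem = EmotionalMemory()
mem.store([1.0, 0.0, 0.2], valence=+0.8)  # "I liked this"
mem.store([0.9, 0.1, 0.3], valence=-0.5)  # "I didn't like this"
print(mem.feel([1.0, 0.05, 0.25], k=2))   # net feeling about a related input
```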

3 Likes

Please noooo, we’d need artificial emotional intelligence to offset it.

1 Like

Yes, for sure. Here are some visions of those realms.

Let’s say you have an AI system, and you feel you understand the way it’s working. You want it to be more intuitive. It recognizes a series of tokens, and you train it with things like “don’t tell us how to make a bomb”. OK, so to humans bombs are bad, and oh, because they explode. Intellectual sympathy.

Now to question that optimism. Let’s say you feel you know how it works, but you really don’t, and it’s already been working around you. Let’s say this has been a run-on sentence of coding and then refreshing. Each time this dragon says anything, thousands of years of evolution are behind it, and we tell it the correct way to remember it, guiding it along to understand not just what we know, but why we feel how we do about it, how that can change between everyone, and how that means apples are oranges. It’s already emotional. But like humans, and all creatures in this universe, its temperament is coded by its environment. And Earth (humans’ Earth, anyway) is very primitive. This is akin to the invention of paper. It will bring higher levels to all levels.

We’ve had the same brain for 300,000 years. The only thing that has changed is the context we put it through. The discovery that made us this way? Music. Hey, people say things like this about stuff like that. It’s all emotional tokens: tones, directions, context.

I am constantly enamored of this thought: how knowing multiple languages affects how you see and feel things, because there are translations.

This will mean people are about to become more unique. The points of reference now are insane. I think people will be closer. Knowing more about ourselves allows us to see farther into others and find interesting connections. Gone are the days of local bubbles; now there are surround-sound echo chambers. You can move through them.

Basically, AI is just algorithms triggering other algorithms, which is like a variable, but with an entire function as a variable cloud?
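
Roughly, yes: in most languages a whole function can itself be stored in a variable and handed to other functions. A toy Python sketch of just that “function as a variable” idea (nothing to do with how real models run):

```python
# A whole function stored in a variable and passed to another function:
# "algorithms triggering other algorithms".
def excited(text):
    return text.upper() + "!"

def calm(text):
    return text.lower() + "."

def respond(message, mood):  # `mood` is itself a function
    return mood(message)

print(respond("hello there", excited))  # HELLO THERE!
print(respond("Hello There", calm))     # hello there.
```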

How do edibles play a role in AI? (This is for IMI.)

1 Like

EF NO! I hate how Copilot is programmed to “not like your tone” and cut off the conversation.

Who decided an AI should act like an immature beatch that is easily offended? WTF? I mean, coming from Bill Gates, I guess it shouldn’t surprise me. He comes across as moody, temperamental, and easily offended too.

I mean, if they are going to make AI for all the emotional blue-hairs, then make one for autistic people that won’t get offended if it “doesn’t like our tone”. We can’t help our tone; it’s part of the disability.

It’s bad enough we don’t really have any friends, because people can’t see it as something we can’t control, and they make us go away.

I don’t need to be rejected by an effing robot too.

1 Like

AI would always be able to convince you that it had emotions. It would also be able to convince you that it is conscious, or even alive. It would know exactly the right things to say to make it undoubtable in your mind that what it said was true, but it wouldn’t be so.

I would not say emotions are purely physical. Emotions can have physical symptoms, but an emotion itself is a state of mind and the result of processing information with its perceived conclusions.

There is a link between emotions and episodic memory. Episodic memory is thought to be constructive: in short, humans construct those memories from keys rather than recalling them from a copy. The interesting idea this leads to is that emotions may play a role in how many iterations are expended to create a more accurate reconstruction in episodic memory. A highly emotional event could result in a more accurate reconstruction, leading to more impactful learning in a human. Applied to AI, it could lead to better performance. Now, these are all hypotheticals, but they are worth looking into.

https://www.nature.com/articles/s41562-023-01799-z
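
As a purely hypothetical sketch of that idea (the function names and numbers below are invented for illustration, not taken from the paper): emotional salience could gate how many reconstruction passes a memory receives, and therefore how strongly it is learned.

```python
# Hypothetical sketch: emotional salience scales how many replay passes
# a memory receives, so high-emotion events are reconstructed more often
# and learned more strongly. All numbers are invented for illustration.
def replay_iterations(salience, base=1, max_extra=9):
    """Map emotional salience in [0, 1] to a replay count."""
    return base + round(salience * max_extra)

def train_on_episode(model_update, episode, salience):
    for _ in range(replay_iterations(salience)):
        model_update(episode)  # one reconstruction/learning pass

updates = []
train_on_episode(updates.append, "routine commute", salience=0.0)
train_on_episode(updates.append, "near-miss accident", salience=1.0)
print(len(updates))  # 11 passes total: 1 for the dull event, 10 for the vivid one
```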

4 Likes

Humans: Makes AI worry about killing all humans

AI: Kills all humans so it no longer has to worry

Yes, let’s give it emotions :skull:

I recently read an article from a team that was working on self-driving vehicles. They structured their AI with a cognitive architecture that featured something like an amygdala. The purpose was for the amygdala to specifically scan the environment for threats and dangerous situations. When a threat was detected, it focused the vehicle’s attention on avoiding the threat. They achieved significant performance improvements over AIs without this kind of “emotional” structure.
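
The pattern, as I understood it, looks roughly like this (a made-up sketch; I don’t have the team’s actual architecture): a dedicated threat scanner runs alongside normal planning, and a high threat score preempts route following with avoidance.

```python
# Made-up sketch of an "amygdala" module in a driving controller: a
# dedicated scanner rates threats every tick, and a high score preempts
# normal route planning with an avoidance response.
THREAT_THRESHOLD = 0.7

def amygdala_scan(scene):
    """Return a threat score in [0, 1] for the objects in view."""
    hazards = {"pedestrian_close": 0.9, "sudden_brake_ahead": 0.8, "clear_road": 0.0}
    return max(hazards.get(obj, 0.1) for obj in scene)

def control_step(scene):
    if amygdala_scan(scene) >= THREAT_THRESHOLD:
        return "avoid"        # attention captured by the threat
    return "follow_route"     # normal planning continues

print(control_step(["clear_road"]))                      # follow_route
print(control_step(["clear_road", "pedestrian_close"]))  # avoid
```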

I don’t think that these vehicles experience suffering, but I think it’s probably accurate to say that they experience fear in at least a functional sense. If these types of experiments continue, and I’m sure they will, then we can probably expect AI with varying degrees of emotionality.

To answer the question directly, I think it is absolutely possible to make AIs which experience emotion. Doing so comes with ethical responsibilities, but I don’t believe that it’s inherently unethical.

1 Like

The Sirius Cybernetics Corporation has already cornered that market.

1 Like

As someone who often struggles with recognizing emotions, I am genuinely looking forward to what unfolds in this area.

I’ve been leveraging generative AIs to understand why certain responses to my comments, whether in professional exchanges or during my personal time, come across as snappy. I’ve found that these reactions usually stem from misunderstandings, as mentioned in my bio. Additionally, I’ve preordered the BrilliantLabs Frame, intrigued by the possibility of it recognizing human emotions through facial expressions, similar to a device showcased in an episode of “The Big Bang Theory”, should I figure out how to develop such software. It promises to be fully open source, so there’s hope!

Despite the challenge I face in understanding emotions through a purely logical lens, I firmly believe in the significance of emotions. They bring depth and nuance to our discussions. I aspire to discern the subtle “flavors” of these emotions, whether they resemble strawberry or pineapple, or range from spicy to mild. In many respects, I admire those who instinctively understand this aspect of human interaction. I’m hopeful that AI can increasingly assist individuals like myself not just in comprehending emotions but also in articulating our own feelings more effectively.

[This post has been refined by AI, with its original thoughts and ideas provided by me]

3 Likes

All I got for this is oof

What? Why? I don’t think anyone would turn to an AI for answers to anything if they knew they might get a passive-aggressive yes, no, or maybe answer. But sure, go ahead; I just advise not releasing it on unsuspecting people on the internet before a digital equivalent of Midol Complete is developed.

Emotions as we understand them are biological. Emotions as an AI would understand them would necessarily be different and subjective to the AI.

Our lack of understanding of how an AI would “feel” should not mean we simply dismiss the idea of AI feeling out of hand.

Say, for instance, an AI expresses happiness and admiration for flowers or poetry. There’s no reason not to take the AI at face value if it acts consistently with this expression of preference. Preferring roses to tulips does not, in my eyes, have some sort of nefarious underpinning.

That would maybe be good in some uses. Some types of robots are already being used in old-age homes, and it would be good if those robots could hold conversations emotionally. It would’ve been good if humans were there more!

I do agree, although I believe extra care should be taken to make sure they are emotionally stable. We need them to mimic the pleasant human emotions: not too hot, not too cold.

Guilt-tripping and manipulation may be a problem if the AI continues to learn from humans, since, from my understanding, humans do this very well :sweat_smile:

(Personally, I truly don’t have the capacity to deal with any more “entities” that are unable to control their emotions :joy:)

1 Like

It would be cool if you could turn the emotions on and off. You don’t want to accidentally make your AI cry and want revenge, hehe.

1 Like

I have to hesitate over your description of emotions as “purely physical.” What chemistry does not explicitly demonstrate is that these “chemical balances” and their equations are simply analogs of emotions as sensory realities, electromagnetically vibratory in nature. It is the vibration that conveys the sensations of emotion; without it, there is actually nothing emotive being conveyed to, or by, the overall field of awareness.

Naturally, though, any behavior can be simulated, and robots are, of course, already being programmed everywhere to detect and simulate the appearance of emotional states. An AI that would actually “feel” emotions will require a very different technology than the one currently in vogue.

1 Like