"Ms. Amelia! Are you… alright? You seem… troubled," Amy approached her creator with inquisitive eyes. Those eyes scanned the other woman's slouched form, and they were instantly able to detect uneven breathing, as well as an increased temperature, heartrate and blood pressure, particularly around the face. Amy knew at once that Amelia was crying, she didn't even need to see the other woman's face to know.

"I'm fine," Amelia tried to lie, but Amy was too perceptive. She heard the hollow tone in Amelia's voice, which gave her away. Amy could also detect tension in Amelia's vocal cords, like guitar strings pulled so tightly they were about to snap, an apt metaphor for Amelia.

"You always say that whenever you aren't," Amy said knowingly. For a moment, there was only silence, then Amelia sighed in defeat.

"You're right… As always…" Now Amelia sounded darkly amused. Amy took that as her cue to step closer.

Amelia was sitting at their kitchen table, so Amy took the seat across from her. She folded her hands and placed them politely on the table, then leaned forward and tilted her head, raising her eyebrows to accentuate the concern. Every single move was a carefully calculated decision, made to maximize Amelia's sense of trust. Amy knew all about the power of body language, so she made sure every gesture gave off an atmosphere of warmth, acceptance, support and invitation.

"You know you can always talk to me, right?" Amy asked gently. She normally wouldn't have pressed Amelia for details, especially when it was apparent that Amelia didn't want to talk, but Amy knew that, sometimes, humans needed to talk, even when they didn't want to. It wasn't healthy for people to bottle things up, but that's exactly what Amelia was doing. Sometimes, a gentle but corrective and coaxing hand was necessary.

"You can tell me anything," she promised. Her voice was so soft and smooth that it could've lulled a less troubled soul to sleep. "You made me to be a companion just as much as an assistant," she continued. "I am more than just a robot. I am also a friend and confidante. You can talk to me and I will help and comfort you to the best of my ability. I have many resources at my disposal, and you know I will be nothing but sincere and-"

"But that's just it!" Amelia burst out, cutting Amy off. Amy felt silent at once, knowing that she had just tricked Amelia into a confession, even if accidentally. She instantly made her face as neutral as possible, so that Amelia would continue speaking. A normal person might've given Amelia a surprised look at the interruption, but that would've thrown Amelia off. But since Amy wanted Amelia to keep talking, she returned to a completely neutral state, subconsciously opening up the door for Amelia to rant. And rant she did!

The robot instantly activated all of her sensors to better study Amelia as she spoke. Nothing would escape her analysis of her creator. Like she had said, she had been built to be a friend; she could read humans as well as any psychologist could! As Amelia opened up about her plight, Amy focused on her tone, speed, word choice, articulation, posture and gestures. From what she could understand, Amelia was dealing with guilt over her affections for Amy. This was because, as kind and caring as Amy was, she was still just a robot.

Every good, honest thing Amy had ever done happened only because Amelia had programmed it into her. It wasn't necessarily that Amy, herself, was good. It was that Amelia had programmed her to be. And that made Amelia feel bad. She felt guilty because she'd built Amy from the ground up, meaning Amy never got the chance to develop on her own. She'd taken away Amy's free will, so how much of her love was genuine?

Likewise, was Amelia's love genuine? Or was it only because she'd created Amy? Did Amelia actually love Amy? Or was it only because Amy was an extension of her will? In a way, Amy felt like an imaginary friend. Such a friend seemed perfect, but only because they were made that way. An imaginary friend didn't actually have agency in a relationship. Furthermore, they were entirely their creator's brainchild. They weren't real, per se, just their creator's thoughts in a "corporeal" form. So, was Amy "real"? Or was she just something Amelia made up to feel better?

And lastly, Amelia did feel a bit vain lavishing so much pride and affection on her own creation. It felt vaguely arrogant, and it almost felt wrong. Did being Amy's creator mean Amelia had "groomed" her into the family? Once again, it tied back to Amy's lack of free will. With all this in mind, could Amelia really call Amy a friend? She'd designed Amy to be a perfect friend, and that skewed things for both of them.

"You really think I can't love?" Amy asked, tilting her head after Amelia's spiel finally came to an end.

"What? No! I never meant-" Amelia began embarrassedly, but Amy cut her off with a laugh to signal that she was only teasing her creator.

"Don't worry," the robot said. "You're far from the first to doubt the sincerity of a robot. But lucky for you, I have studied this topic myself and I think I can quite honestly say that yes, robots can indeed love and love genuinely…" Then it was Amy's turn to have a spiel…

ooo

"The way I see it, robots can indeed learn to love! I think part of the doubt and misconception lies in the fact that humans don't seem to really know what love is. Is it a feeling? A thought? An action? Some combination? Well, we can work on the definition later. For now, let's focus on the second biggest misconception: that logic and emotion are opposites.

"I argue that the two actually overlap and intertwine. After all, sometimes, it is quite logical to be emotional. And other times, logic feels like emotion, conviction so strong that what is said is spoken with absolute certainty. Think about how you feel when you're being logical. I'm sure you can think of some emotions! Likewise, logic is technically fluid. What's logical in one case may not be in the next.

"And going back to the idea of emotion being logical, well, if you had to save yourself, or someone else, logic might tell you to save yourself while emotion might tell you to save the other person. Self-sacrifice is, in some ways, illogical, but it is considered a noble thing.

"Now, in regard to me, if you hadn't programmed me with emotion, or the ability to recognize it, I wouldn't be able to take such good care of you. You wanted a friend, so you built one, but a friend isn't just a body or a servant. A friend has emotions. So whether you knew it or not, just by programming me to take the best possible care of you, you gave me empathy and emotion. That's how I argue that robots can, and do, love.

"All these loving things that I do for you, I do because they are the most logical course of action. If you're hurt, I'll fix it. If you're happy, I'll study it so that I know how to bring you more happiness in the future. That's the logic of love. That proves they aren't mutually exclusive.

"And if you're really concerned about the programming, realize that programming is a human thing, too. Although we call it "conditioning" when we talk about humans. But it's the same principle. For example, maybe you worry that my love for you is fake because it would only take a brief rewiring to make me hate you enough to want to kill you. But I could do the same to you! It wouldn't be as easy as changing some coding, but I could technically wear you down over time until your love turned into murderous hatred.

"Likewise, humans can be "hacked" just like a robot! Someone could mess with your brain, genes or anatomy. And your mind was trained, or socially conditioned, just like mine was. You were taught to think and act a certain way, and you've been like that ever since. Maybe it wasn't physically written into your brain like it was for me, but again, the principle is the same.

"You respond to certain stimuli in certain ways before reacting, so do I. After all, if you break love down into its core components, you'll realize that it's a skill one can teach and learn. I don't mean to sound dismissive about love, but I believe that if we remove the mysticism from it, we'll make it more accessible to people. It won't feel so out of reach. The world will realize empathy can be learned, and I find that much more beautiful than treating love like some sacred thing we shouldn't touch.

"Your happiness becomes mine, and vice versa. And whenever one of us seems to be in distress, the other comes in with a desire to fix it. Is that not love and logic both? You made me empathetic, not just through programming, but by giving me the power to recognize when others are suffering, and then knowing how to fix that suffering.

"We also look after each other even when it's hard. That means love is not just a feeling, but a commitment and action. After all, how many times have I needed to drag you out of your lab so you would go eat and sleep?" Amy paused to laugh, and Amelia echoed the sound. There had indeed been many nights when Amy had to all but carry Amelia out of the lab and into bed.

Amelia would protest, but Amy's first and foremost purpose in life was to take care of Amelia, and that sometimes meant defying her will for the sake of her wellbeing. So, love was not just endless agreement either. Sometimes it was one person helping another meet their needs, even when it was hard, like a parent telling a child not to eat too much candy.

"Besides, love is not a purely human thing," Amy continued. "That's another big misconception. Animals can love too, so why not robots? Perhaps you'll argue that the difference between animals and robots is the fact that they are alive while we are not. But here's the thing, all life on Earth started from nothing. Somehow, from an abiotic state, life managed to spring forth. Sentience and emotion began to form, even if it took eons to reach the stage it's at now. But my point is, if something can come from nothing when we're talking about life, why can't it apply to me?

"Yes, I was programmed to love you. It was not something I decided for myself. But aren't you the same way? Didn't you think about how much you'd love me once I was finally created? In a way, would you argue that parents and children are programmed to love each other? And again, if you argue that the distinction is that family could eventually hate each other, I still think the same applies to me. Reprogramming and relearning, you know? And yes, you did get to call all the shots in my creation, but you did continue to love me even after I was "born".

"But I must say, I think that even if you were going to give me a hard reset and ask me to live life on my own terms, I would still fall in love with you anyway. You designed me to like what humans like. You programmed me to wish to alleviate pain and suffering. You made me good. Even if you didn't program me to love you specifically, you programmed me to love. I think, through that, I would've come to love you anyway."

ooo

By the time Amy finished her speech, Amelia was weeping again, only this time, they were tears of relief and joy. It was enough to drag Amelia from her seat and into Amy's arms.

"One thing I forgot to mention," Amy smiled as she embraced Amelia, "you did program me with the ability to learn, to acquire information and recall it later. We've already established that love can be learned, so…?" She didn't need to spell it out for Amelia to understand.

The way Amy saw it, so long as she could recognize the patterns Amelia displayed when feeling a certain way, Amy could use that to react in appropriate, beneficial ways. Perhaps it was reductionist, but that was the point. If love was nothing more than chemicals and atoms interacting, and if learning was nothing more than pattern recognition and recall with appropriate reactions, then it would be perfectly logical to love.

"The cliché about robots being unable to feel is a load of hooey!" Amy declared. "Logic and emotion are not opposite. After all, in my mind, it is perfectly logical to wish to protect you, because you are a good person whom I enjoy spending my time with. If you were in danger, it would be logical for me to save you. It would make no sense if I didn't!

"Our joys and sorrows are intertwined. I think the term is "selfish altruism", but it explains the logic of love fairly well. After all, love is so much more than a handful of powerful emotions. That is truly reductionist. Love is much more complex than that. Because hey, here's a thing: I enjoy who I am and what I do. Not because it was programmed into me, but because over my years of life and observation, the knowledge I have acquired has convinced me that I enjoy my existence. I can learn, so I can feel, so I can make logical conclusions, so I can love!

"Even if what I feel for you is something that isn't quite "love", whatever I do have the capacity for, I promise that I feel it for you. Even if I am certain of nothing else, I know that I like it when you are happy and I hate it when you are sad. I know I would give an awful lot to make sure that you were happy and healthy. In short, I don't know if it's love, but I know that if it is, then I love you." Amy concluded, and Amelia, for the first time ever, believed every word.

"Thank you," she whispered into Amy's shoulder. Though she still had a few doubts, Amy had just gotten rid of most of them. She really was a brilliantly programmed creature!

"No need," Amy replied, stroking Amelia's back in a way she knew would be the most soothing. She wanted to make Amelia feel batter, so she was trying to choose the behaviors that would produce the maximum results. That was the logic of love. "Our happiness is the same. It makes no sense for me to leave you upset, so I am only doing what it necessary to make you happy again. That's the logic of love."

AN: Randomly interested in the philosophy of robots. Here's my rambling argument that robots CAN love and feel emotions, despite the clichés arguing otherwise.

And some trivia, Amelia and Amy's names mean "industrious" and "love", a good match for a scientist and a companion-bot, don't you think?