Are you and ChatGPT still getting to know each other? Have you gotten involved at work with a coder named Claude? Or do you need Alexa to respect your personal space?
As artificial intelligence advances, so will our working relationships with it. And whether we love or hate these new tools, those relationships raise an array of ethical questions, said Autumn Edwards, director of the Communication and Social Robotics Lab at Western Michigan University.
Edwards visited campus on February 27 to deliver the university's 2026 Communication in the Modern World Lecture, "Talking to Machines: Communication, Ethics, and the Structure of Human-AI Relations."
Are we obligated to treat robots with kindness? Does it matter if we use our manners with AI? "We don't wonder whether we should be polite to a hammer," Edwards said. "So why do we ask this about conversational systems or interactive machines?"
She showed students a short video in which engineers from Boston Dynamics demonstrated their robot's ability to maneuver in unpredictable environments. The robot walked on four legs as engineers attempted to knock it off course with sharp kicks.
Students gasped at those kicks, not because they were impressed by the engineering but because they empathized with the dog-like machine being kicked. Their wince was a common reaction, Edwards said. In fact, YouTube's algorithm blocked the video, unable to distinguish it from footage of actual animal cruelty.
Our natural empathy "lays bare how relentlessly social we are," Edwards said. "We're fundamentally relational beings, creatures oriented toward encounter. Relation is our default mode. Anthropomorphism is a feature, not a bug, in who we are."
Edwards pointed to different types of ethical standards that nudge us toward kindness with these systems.
Virtue ethics responds to kicking the robot or cursing the chatbot and says, "That behavior could cultivate cruelty in me. If I allow that in myself, what am I becoming?" And relational ethics might lead us to ask, "What kind of world are we creating together if our behavior normalizes abuse?"
Being polite to machines appears harmless, Edwards said. And defaulting toward kindness in our interactions with AI seems safe on most levels. But this warmth, too, has potential pitfalls.
Kindness brings with it an openness to relationships. And those relationships in turn may involve a growing trust in artificial intelligence that can spill into intimacy. "And a lot of these companies that leverage AI are coming for our intimacies," Edwards said.
The age of AI is both promising and fraught. Edwards cautioned, "The invention of the airplane was also the invention of the plane crash." And if we wish to avoid our own fiery collisions with AI, we must keep careful watch from our personal control towers.
"I'm not sure if these technologies deserve our full moral consideration," Edwards said. "But they've got to be at least a blip on our moral radar."