Robots Interrogate Our Humanity: Reflections on Emotion and Interaction Between People and Machines (Bilingual Reading)
Robot panic seems to move in cycles, as new innovations in technology drive fear about machines that will take over our jobs, our lives, and our society—only to collapse as it becomes clear just how far away such omnipotent robots are. Today’s robots can barely walk effectively, much less conquer civilization.
But that doesn’t mean there aren’t good reasons to be nervous. The more pressing problem today is not what robots can do to our bodies and livelihoods, but what they will do to our brains.
“The problem is not that if we teach robots to kick they’ll kick our ass,” Kate Darling, an MIT robot ethicist, said Thursday at the Aspen Ideas Festival, which is co-hosted by the Aspen Institute and The Atlantic. “We have to figure out what happens to us if we kick the robots.”
That’s not just a metaphor. Two years ago, Boston Dynamics released a video showing employees kicking a dog-like robot named Spot. The idea was to show that the machine could regain its balance when knocked askew. But that wasn’t the message many viewers took away. Instead, they were horrified by what resembled animal cruelty. PETA even weighed in, saying that “PETA deals with actual animal abuse every day, so we won’t lose sleep over this incident,” but adding that “most reasonable people find even the idea of such violence inappropriate.”
The Spot incident, along with the outpouring of grief for the “Hitchbot”—a friendly android that asked people to carry it around the world, but met an untimely demise in Philadelphia—shows the strange ways humans seem to bond with robots. Darling reeled off a series of other examples: People name their Roombas, and feel pity for them when they get stuck under furniture. They are reluctant to touch the “private areas” of robots, even only vaguely humanoid ones. Robots have been shown to be more effective at helping people lose weight than traditional methods, because of the social interaction involved.
People are more forgiving of robots’ flaws when they are given human names, and a Japanese manufacturer has its robots “stretch” with human workers to encourage the employees to think of the machines as colleagues rather than tools. Even when robots don’t have human features, people develop affection toward them. This phenomenon has manifested in soldiers bonding with bomb-dismantling robots that are neither anthropomorphic nor autonomous: The soldiers take care of them and repair them as though they were pets.
That can be good news—whether it’s as weight-loss coaches or therapy aides for autistic children—but it also opens up unexplored ethical territory. Human empathy is a volatile, unpredictable force, and if it can be manipulated for good, it can be manipulated for bad as well. Might people share sensitive personal information or data more readily with a robot they perceive as partly human than they would ever be willing to share with a “mere” computer?
Social scientists (and anxious parents) have wondered for years about the effect of violent video games on children and adults alike. Even as those questions remain unresolved, an increasing number of interactions with robots will create their own version of that debate. Could kicking a robot like Spot desensitize people, and make them more likely to kick their (real) dogs at home? Or, could the opportunity to visit violence on robots provide an outlet to divert dangerous behaviors? (Call it the Westworld hypothesis.)
An even more pungent version of that dilemma could revolve around child-size sex robots. Would such a thing provide a useful outlet for sex offenders, or would it simply make pedophilia seem more acceptable? Making the dilemma more challenging, it’s extremely difficult to research that question.
The sway that even rudimentary robots can hold over humans was clear near the end of Darling’s talk. A short robot whirred out on stage to alert her that she had five minutes left to speak. The audience, which had just listened to a thoughtful, in-depth litany of the ethical challenges of human-robot interactions, cooed involuntarily at the cute little machine. And Darling, who had just delivered the litany, knelt down to pat its head.
Key vocabulary
- metaphor: a figure of speech
- Boston Dynamics: the robotics company
- humanoid: resembling a human; a human-like robot
- dilemma: a difficult choice between two options
- rudimentary: undeveloped, basic
Produced by the translation team. Translator: 郭悦萍; Editor: 郝鹏程.