
Worried over robot war

Robot wars: the hidden worries behind the technology

2015-08-11 | Source: AFP-21st

Introduction: Technology advances by the day, and the rapid development of robotics has brought great convenience to our lives. But if robots become the war machines of Terminator, will humanity be able to cope?

It sounds like a science-fiction nightmare. But “killer robots” have the likes of British scientist Stephen Hawking and Apple co-founder Steve Wozniak fretting and warning the machines could fuel ethnic cleansings and an arms race.

Autonomous weapons, which use artificial intelligence to select targets without human intervention, are “the third revolution in warfare, after gunpowder and nuclear arms,” about 1,000 tech bigwigs wrote in an open letter on July 28.

Unlike drones, which require a human hand in their action, this kind of robot would have some autonomous decision-making abilities and the capacity to act on its own authority.

“The key question for humanity today is whether to start a global AI (artificial intelligence) arms race or to prevent it from starting,” they wrote.

“If any major military power pushes ahead with AI weapon development, a global arms race is virtually inevitable,” said the letter released at the opening of the 2015 International Joint Conference on Artificial Intelligence in Buenos Aires.

The idea of an automated killing machine – made famous by Arnold Schwarzenegger’s Terminator – is moving swiftly from science fiction to reality, according to the scientists.

“The deployment of such systems is – practically if not legally – feasible within years, not decades,” the letter said.

Lower bar for entry

The development of such weapons, while potentially reducing the extent of battlefield casualties, might also lower the threshold for going to battle, noted the scientists.

The scientists painted an apocalyptic scenario in which autonomous weapons fall into the hands of terrorists, dictators or warlords hoping to carry out ethnic cleansings.

The group concluded with an appeal for a “ban on offensive autonomous weapons beyond meaningful human control.”

In a 2014 BBC interview, Hawking said the development of full artificial intelligence could spell the end of the human race.

“It would take off on its own, and re-design itself at an ever increasing rate. Humans, who are limited by slow biological evolution, couldn’t compete, and would be superseded,” he said.

Authorities are gradually waking up to the risk of robot wars. Last May, for the first time, the United Nations brought governments together to begin talks on so-called “lethal autonomous weapons systems” that can select targets and carry out attacks without direct human intervention.

In 2012, the US government imposed a 10-year human control requirement on automated weapons.

There have been examples of weapons being stopped in their infancy.

After UN-backed talks, blinding laser weapons were banned in 1998, before they ever hit the battlefield.

(Translator & Editor: Ma Zheng and Chen Huan)


The article above is selected from 21st Century (《21世纪英文报》); see Issue 1115 for details.
Word bank
capacity: the ability to do or hold something — feasible: able to be done; practical
autonomous: acting independently, without outside control — threshold: the point at which something begins
apocalyptic: describing catastrophic, world-ending events — bigwig: an important person
deployment: the act of putting something into use — intervention: involvement in a situation to alter its course


 