
學(xué)習(xí)啦 > 學(xué)習(xí)英語 > 英語口語 > 殺手英語怎么說

How to Say 殺手 ("Killer") in English

By: 焯杰


  Everyone is surely familiar with the word 殺手 (killer); we know this unusual profession from all kinds of films and TV dramas. Today the 學習啦 editor introduces how to say 殺手 in English. Happy reading!

  The English for 殺手

  killer

  slayer

  Phrases with 殺手

  殺手本能 killer instinct

  職業(yè)殺手 professional killer

  Example sentences with 殺手

  1. The vital clue to the killer's identity was his nickname, Peanuts.

  2. Depression is the third thing that works to my patients' disadvantage.

  3. It's a film about a serial killer and not for the faint-hearted.

  4. Heart disease is the biggest killer of men in most developed countries.

  5. A hit man had been sent to silence her over the affair.

  6. Heart disease is the biggest killer, claiming 180,000 lives a year.

  7. Police are theorizing that the killers may be posing as hitchhikers.

  8. Other officers gave chase but the killers escaped.

  9. a cold and calculating killer

  10. It was the deadly striker's 11th goal of the season.

  11. He is a hired killer.

  12. They were professional killers who did in John.

  13. She took out a contract on her ex-husband.

  14. Cannibal killer Jeffrey Dahmer has been caught trying to hide a razor blade in his cell.

  15. Their cold-blooded killers had then dragged their lifeless bodies upstairs to the bathroom.

  Related reading: Killer Robots Will Be Mankind's Nightmare

  Mankind is a bloodthirsty species. According to Steven Pinker, the academic, for much of history being murdered by a fellow human was the leading cause of death. Civilisation is largely a tale of man’s violent instincts being progressively muffled. A part of this is the steady withdrawal of actual human flesh from the battle zone, with front lines gradually pulled apart by the advent of long-range artillery and air power, and the decline in the public’s tolerance for casualties.

  Arguably, America’s principal offensive weapon is the drone, firing on targets thousands of miles from where its controller safely sits. Given the pace of advance, it takes no imaginative leap to foresee machines displacing human agency altogether from the act of killing. Artificial brains already perform well in tasks hitherto regarded as the province of humans. Computers will be trusted with driving a car or diagnosing an illness. Algorithmic intelligence could therefore surpass the human sort for making the decision to kill.

  This prospect has prompted more than 1,000 artificial intelligence experts to write an open letter calling for the development of “lethal, autonomous weapons systems” to cease forthwith. Act now, they urge, or what they inevitably dub “killer robots” will be as widespread, and as deadly, as the Kalashnikov rifle.

  It is easy to understand military enthusiasm for robotic warfare. Soldiers are precious, expensive and fallible. Every conflict exacts a heavy toll from avoidable human error. Machines in contrast neither grow weary nor lose patience. They can be sent into places unsafe or even impossible for ordinary soldiers. Rapid improvements in computational power are giving machines “softer” skills, such as the ability to identify an individual, flesh-and-blood target. Robots could eventually prove safer than even the most experienced soldier, for example by being capable of picking out a gunman from a crowd of children — then shooting him.

  The case against robotic warfare is the same that applies to all advances in weaponry: the avoidance of unforeseeable consequences that cause unlimited damage to the innocent. Whatever precautions are taken, there is no foolproof way to stop weapons falling into the wrong hands. For a glimpse into what could go wrong, recall how Chrysler, the US carmaker, has needed to debug 1.4m vehicles after finding the car could be remotely hacked. Now imagine it came equipped with guns.

  Technological futurists also fret about the exponential nature of advances in artificial intelligence. The scientist Stephen Hawking recently warned of the “technological catastrophe” that would follow artificial intelligence vastly exceeding the human sort. Whether this is a plumb inevitability or fantasy, science itself cannot decide: but in light of the risk, how sensible can it be to arm such super-intelligences?

  The moral argument is more straightforward. The abhorrence of killing has been as important to its decline as any technological breakthrough. Inserting artificial intelligence into the causal chain would muddle the responsibility that must underpin any decision to kill. Without clear responsibility, not only might the means to wage war be enhanced, but so too might the appetite for doing so.

  Uninventing weapons is impossible: consider anti-personnel landmines — autonomous weapons in their way — which are still killing 15,000-20,000 people annually. The nature of artificial intelligence renders it impossible to foresee where the development of autonomous weapons would end. No amount of careful programming could limit the consequences. Far better not to embark on such a journey.


