
Sunday, October 28, 2018

Human Rights without humans: The final line between artificial and superhuman intelligences | Opinion - The Hill

Photo: Jose Mauricio Gaona
Jose Mauricio Gaona, an O’Brien Fellow at the McGill Centre for Human Rights and Legal Pluralism (CHRLP), a Saul Hayes Fellow at McGill University’s Faculty of Law, and a Vanier Canada Scholar (Social Sciences and Humanities Research Council of Canada, SSHRC), asks: if we develop a conscious artificial being or a super-intelligent human being, what rights then prevail?

Photo: Getty

Human intelligence precedes civilization; artificial and superhuman intelligences, however, will redefine it. Current research in artificial general intelligence (AGI) and intelligence enhancement (IE) seeks to remove human error from its most ambitious technological quests. On the one hand, using evolutionary algorithms, AGI aims to develop a fully automated, increasingly independent, gradually cognitive, and eventually conscious artificial being. On the other hand, using neurotechnology, IE intends to create a super-intelligent and inherently different human being capable of counteracting the inexorable ascension of machines in the next few years.

But what is the limit of such scientific enterprises? If we develop a conscious artificial being or a super-intelligent human being, what rights then prevail: human rights, artificial rights, or superhuman rights? How far should we go to satisfy our intellectual curiosity, our ability to innovate, or other less noble yet often prevailing motives such as productivity, greed, or power?...

As MIT Professor Erik Brynjolfsson explains, machine learning sometimes provides machines with a million-fold improvement in performance, enabling them to solve problems on their own: outside human supervision, despite human nature, and hence the risk.
Read more...

Source: The Hill