Online resources (excluding online-available journals and articles)
 
K10plus PPN: 888813775
SWB-ID: 501342036
Title: The technological singularity : managing the journey / Victor Callaghan, James Miller, Roman Yampolskiy, Stuart Armstrong (eds.)
Contributors: Callaghan, Victor [editor]; Miller, James D. [editor]; Yampolskiy, Roman [editor]; Armstrong, Stuart [editor]
Published: Berlin, Heidelberg : Springer Berlin Heidelberg, [2017]
Extent: 1 online resource (xii, 261 pages)
Language(s): English
Bibliographic relationship:
Also published as: The technological singularity (print edition)
Print version: Callaghan, Victor: The Technological Singularity : Managing the Journey. - Berlin, Heidelberg : Springer Berlin Heidelberg, c2017
ISBN:
978-3-662-54033-6 (electronic bk.)
978-3-662-54031-2 (ISBN of the print edition)
Other numbers:
OCoLC: 987911221
OCoLC: 1002816871 (from SWB)


Link to full text:
Digital Object Identifier (DOI): 10.1007/978-3-662-54033-6


Summary of contents:
Foreword -- References -- Acknowledgements -- Contents -- 1 Introduction to the Technological Singularity -- 1.1 Why the "Singularity" Is Important -- 1.2 Superintelligence, Superpowers -- 1.3 Danger, Danger! -- 1.4 Uncertainties and Safety -- References -- Risks of, and Responses to, the Journey to the Singularity -- 2 Risks of the Journey to the Singularity -- 2.1 Introduction -- 2.2 Catastrophic AGI Risk -- 2.2.1 Most Tasks Will Be Automated -- 2.2.2 AGIs Might Harm Humans -- 2.2.3 AGIs May Become Powerful Quickly -- 2.2.3.1 Hardware Overhang -- 2.2.3.2 Speed Explosion -- 2.2.3.3 Intelligence Explosion -- References -- 3 Responses to the Journey to the Singularity -- 3.1 Introduction -- 3.2 Post-Superintelligence Responses -- 3.3 Societal Proposals -- 3.3.1 Do Nothing -- 3.3.1.1 AI Is Too Distant to Be Worth Our Attention -- 3.3.1.2 Little Risk, no Action Needed -- 3.3.1.3 Let Them Kill Us -- 3.3.1.4 "Do Nothing" Proposals-Our View -- 3.3.2 Integrate with Society -- 3.3.2.1 Legal and Economic Controls -- 3.3.2.2 Foster Positive Values -- 3.3.2.3 "Integrate with Society" Proposals-Our View -- 3.3.3 Regulate Research -- 3.3.3.1 Review Boards -- 3.3.3.2 Encourage Research into Safe AGI -- 3.3.3.3 Differential Technological Progress -- 3.3.3.4 International Mass Surveillance -- 3.3.3.5 "Regulate Research" Proposals-Our View -- 3.3.4 Enhance Human Capabilities -- 3.3.4.1 Would We Remain Human? -- 3.3.4.2 Would Evolutionary Pressures Change Us? -- 3.3.4.3 Would Uploading Help? -- 3.3.4.4 "Enhance Human Capabilities" Proposals-Our View -- 3.3.5 Relinquish Technology -- 3.3.5.1 Outlaw AGI -- 3.3.5.2 Restrict Hardware -- 3.3.5.3 "Relinquish Technology" Proposals-Our View -- 3.4 External AGI Constraints -- 3.4.1 AGI Confinement -- 3.4.1.1 Safe Questions -- 3.4.1.2 Virtual Worlds -- 3.4.1.3 Resetting the AGI -- 3.4.1.4 Checks and Balances

3.4.1.5 "AI Confinement" Proposals-Our View -- 3.4.2 AGI Enforcement -- 3.4.2.1 "AGI Enforcement" Proposals-Our View -- 3.5 Internal Constraints -- 3.5.1 Oracle AI -- 3.5.1.1 Oracles Are Likely to Be Released -- 3.5.1.2 Oracles Will Become Authorities -- 3.5.1.3 "Oracle AI" Proposals-Our View -- 3.5.2 Top-Down Safe AGI -- 3.5.2.1 Three Laws -- 3.5.2.2 Categorical Imperative -- 3.5.2.3 Principle of Voluntary Joyous Growth -- 3.5.2.4 Utilitarianism -- 3.5.2.5 Value Learning -- 3.5.2.6 Approval-Directed Agents -- 3.5.2.7 "Top-Down Safe AGI" Proposals-Our View -- 3.5.3 Bottom-up and Hybrid Safe AGI -- 3.5.3.1 Evolutionary Invariants -- 3.5.3.2 Evolved Morality -- 3.5.3.3 Reinforcement Learning -- 3.5.3.4 Human-like AGI -- 3.5.3.5 "Bottom-up and Hybrid Safe AGI" Proposals-Our View -- 3.5.4 AGI Nanny -- 3.5.4.1 "AGI Nanny" Proposals-Our View -- 3.5.5 Motivational Scaffolding -- 3.5.6 Formal Verification -- 3.5.6.1 "Formal Verification" Proposals-Our View -- 3.5.7 Motivational Weaknesses -- 3.5.7.1 High Discount Rates -- 3.5.7.2 Easily Satiable Goals -- 3.5.7.3 Calculated Indifference -- 3.5.7.4 Programmed Restrictions -- 3.5.7.5 Legal Machine Language -- 3.5.7.6 "Motivational Weaknesses" Proposals-Our View -- 3.6 Conclusion -- Acknowledgements -- References -- Managing the Singularity Journey -- 4 How Change Agencies Can Affect Our Path Towards a Singularity -- 4.1 Introduction -- 4.2 Pre-singularity: The Dynamic Process of Technological Change -- 4.2.1 Paradigm Shifts -- 4.2.2 Technological Change and Innovation Adoption -- 4.2.3 The Change Agency Perspective -- 4.2.3.1 Business Organisations as Agents of Change in Innovation Practice -- 4.2.3.2 Social Networks as Agents of Change -- 4.2.3.3 The Influence of Entrepreneurs as Agents of Change -- 4.2.3.4 Nation States as Agents of Change -- 4.3 Key Drivers of Technology Research and Their Impact