
AI snake oil : what artificial intelligence can do, what it can't, and how to tell the difference / Arvind Narayanan and Sayash Kapoor

Resource type: Book (Online) | Language: English | Publisher: Princeton ; Oxford : Princeton University Press, [2024] | Copyright date: © 2024 | Description: 1 online resource (x, 348 pages) : illustrations, diagrams | ISBN:
  • 9780691249643
Subject(s): Additional physical formats: 9780691249131 | Also published as: AI snake oil. Print edition. Princeton : Princeton University Press, 2024. x, 348 pages | DDC classification:
  • 006.3 23/eng/20240324
RVK: CC 7270 | SR 8500 | ST 300 | Local classification: inf 6.20 | LOC classification:
  • Q335
DOI: 10.1515/9780691249643
Notes: ordered from the bookseller for KIT 07.2025
Summary: From two of TIME’s 100 Most Influential People in AI, what you need to know about AI, and how to defend yourself against bogus AI claims and products. Confused about AI and worried about what it means for your future and the future of the world? You’re not alone. AI is everywhere, and few things are surrounded by so much hype, misinformation, and misunderstanding. In AI Snake Oil, computer scientists Arvind Narayanan and Sayash Kapoor cut through the confusion to give you an essential understanding of how AI works and why it often doesn’t, where it might be useful or harmful, and when you should suspect that companies are using AI hype to sell AI snake oil: products that don’t work, and probably never will. While acknowledging the potential of some AI, such as ChatGPT, AI Snake Oil uncovers rampant misleading claims about the capabilities of AI and describes the serious harms AI is already causing in how it’s being built, marketed, and used in areas such as education, medicine, hiring, banking, insurance, and criminal justice. The book explains the crucial differences between types of AI, why organizations are falling for AI snake oil, why AI can’t fix social media, why AI isn’t an existential risk, and why we should be far more worried about what people will do with AI than about anything AI will do on its own. The book also warns of the dangers of a world where AI continues to be controlled by largely unaccountable big tech companies. By revealing AI’s limits and real risks, AI Snake Oil will help you make better decisions about whether and how to use AI at work and home.
Summary: "A trade book that argues that predictive AI is snake oil: it cannot and will never work. Artificial intelligence is an umbrella term for a set of loosely related technologies. For instance, ChatGPT has little in common with algorithms that banks use to evaluate loan applicants. Both are referred to as AI, but in all of the salient ways - how they work, what they're used for and by whom, and how they fail - they couldn't be more different. Understanding the fundamental differences between AI technologies is critical for a technologically literate public to evaluate how AI is being used all around us. In this book, Arvind Narayanan and Sayash Kapoor explain the major strains of AI in use today: generative AI, predictive AI, and AI for content moderation. They show readers how to differentiate between them and, importantly, make a cogent argument for which types of AI can work well and which can never work because of their inherent limitations. AI in this latter category, the authors argue, is AI snake oil: it does not and cannot work. More precisely, generative AI is imperfect but can be used for good once we learn how to apply it appropriately, whereas predictive AI can never work - despite the fact that it is being sold and marketed today in products - because we have never been able to accurately predict human behavior."--
PPN: 1904207545
Package identifier: EBA-EMB | GBV-deGruyter-alles | ZDB-23-DGG | ZDB-23-DEI
No physical items for this record

Restricted access; Controlled Vocabulary for Access Rights: online access with authorization (COAR)

http://purl.org/coar/access_right/c_16ec

In English