{"id":11007,"date":"2026-05-01T18:42:43","date_gmt":"2026-05-02T00:42:43","guid":{"rendered":"https:\/\/attentionmedia.io\/?p=11007"},"modified":"2026-05-01T18:42:43","modified_gmt":"2026-05-02T00:42:43","slug":"why-we-trust-ai-when-it-makes-things-up","status":"publish","type":"post","link":"https:\/\/attentionmedia.io\/?p=11007","title":{"rendered":"Why we trust AI when it makes things up"},"content":{"rendered":"<div><img loading=\"lazy\" decoding=\"async\" width=\"800\" height=\"450\" src=\"https:\/\/martech.org\/wp-content\/uploads\/2026\/04\/high-tech-road-sign-saying-true-false-800x450.png\" class=\"attachment-large size-large wp-post-image\" alt=\"\" \/><\/div>\n<p>I went to a developer conference and, by accident, learned something profound about human nature. It started innocently enough \u2014 the \u201cAll Things AI Conference\u201d in Durham, NC, had a title too good to pass up.\u00a0<\/p>\n<p>What I didn\u2019t expect was to be the only marketer among 2,500 developers, nodding along as whurly (yes, that\u2019s his real name), CEO of the quantum computing company Strangeworks, dove deep into quantum computing and AI. I was in over my head. But sometimes that\u2019s where the best insights hide.<\/p>\n<p>It wasn\u2019t until Luis Lastras, director of language and multimodal technology at IBM, began talking about \u201csmall models\u201d that something finally clicked. Luis said something that struck me: \u201cHallucinations are intentional.\u201d\u00a0<\/p>\n<p>Say what?\u00a0<\/p>\n<h2 class=\"wp-block-heading\">The answer is\u2026<\/h2>\n<p>According to Luis, hallucinations are a way for developers to learn how models work. Because the models operate autonomously, they don\u2019t filter what they output \u2014 at least not yet. 
Think of letting your grandfather, who has lost his filter, loose at a dinner party.\u00a0<\/p>\n<p>It\u2019s one of the things IBM learned working with small models, which validate their outputs at certain points as they generate them to reduce hallucinations.\u00a0<\/p>\n<p>Anyone who\u2019s worked with AI has experienced hallucinations \u2014 from made-up sources to statistics that are just plain wrong. Lastras said they are little extra pieces of information that the AI thinks are helpful but that weren\u2019t asked for in the prompt.\u00a0<\/p>\n<p>He showed a demo of a prompt asking how many moons Mars has; the response came back with two and their names, plus an added extra that was not requested \u2014 the distance from Earth. The distance between the planets may have been right, but validating it would have required another step, so it may just as easily have been wrong.\u00a0<\/p>\n<h2 class=\"wp-block-heading\">How this evolved<\/h2>\n<p>Humans, however, are inclined to think the AI is always right.<\/p>\n<p>In a study of 500 AI users (US adults) conducted by Elon University last year, almost 70% believed that AI models are at least as smart as they are, with 26% believing that they are \u201ca lot smarter.\u201d<\/p>\n<p>What is more concerning is that we believe AI is thinking like a human. A Wall Street Journal article, \u201cEven Smart People Believe AI is Really Thinking,\u201d said, \u201cOur cognitive biases developed to help us survive in complex social environments\u2026 [We have] evolved to view linguistic fluency as a proxy for intelligence, engagement, and helpfulness as indicators of trustworthiness.\u201d<\/p>\n<p>The same tendency that led us to trust our linguistically adept fellow humans for survival is leading us to trust systems that appear to listen, understand, and want to help us.\u00a0<\/p>\n<p>So, the more AI tools and bots act like humans, the more likely we are to trust them. 
Which brings us back to the hallucination. The more AI tools act like they\u2019re being helpful, the more likely we are to miss that \u201clittle extra\u201d piece of information that wasn\u2019t requested.\u00a0<\/p>\n<h2 class=\"wp-block-heading\">Bottom line<\/h2>\n<p>The convergence of intentional hallucinations and our deeply wired human instinct to trust fluent, helpful communicators creates a perfect storm of misplaced confidence.\u00a0<\/p>\n<p>As AI tools grow more sophisticated and human-like, our evolutionary instincts will only make it harder to maintain the critical distance needed to catch the errors, embellishments, and unrequested additions that slip through.\u00a0<\/p>\n<p>The good news is that awareness is the first step. Whether it\u2019s IBM\u2019s small models validating outputs in real time or simply slowing down to verify what AI hands us, the antidote to a cognitive bias millions of years in the making is something refreshingly simple \u2014 a healthy dose of human skepticism.<\/p>\n<p>The post <a href=\"https:\/\/martech.org\/why-we-trust-ai-when-it-makes-things-up\/\">Why we trust AI when it makes things up<\/a> appeared first on <a href=\"https:\/\/martech.org\/\">MarTech<\/a>.<\/p>","protected":false},"excerpt":{"rendered":"<p>I went to a developer conference and, by accident, learned something profound about human nature. It started innocently enough \u2014 the \u201cAll Things AI Conference\u201d in Durham, NC, had a title too good to pass up.\u00a0 What I didn\u2019t expect was to be the only marketer among 2,500 developers, nodding along as whurly (yes, that\u2019s &hellip; <\/p>\n<p class=\"link-more\"><a href=\"https:\/\/attentionmedia.io\/?p=11007\" class=\"more-link\">Read more<span class=\"screen-reader-text\"> &#8220;Why we trust AI when it makes things 
up&#8221;<\/span><\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[1],"tags":[],"class_list":["post-11007","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-uncategorized"],"featured_media_urls":{"thumbnail":["https:\/\/martech.org\/searchengineland.com\/wp-content\/seloads\/2025\/11\/semrush-one.webp",0,0,false],"medium":["https:\/\/martech.org\/searchengineland.com\/wp-content\/seloads\/2025\/11\/semrush-one.webp",0,0,false],"medium_large":["https:\/\/martech.org\/searchengineland.com\/wp-content\/seloads\/2025\/11\/semrush-one.webp",0,0,false],"large":["https:\/\/martech.org\/searchengineland.com\/wp-content\/seloads\/2025\/11\/semrush-one.webp",0,0,false],"1536x1536":["https:\/\/martech.org\/searchengineland.com\/wp-content\/seloads\/2025\/11\/semrush-one.webp",0,0,false],"2048x2048":["https:\/\/martech.org\/searchengineland.com\/wp-content\/seloads\/2025\/11\/semrush-one.webp",0,0,false],"inspiro-featured-image":["https:\/\/martech.org\/searchengineland.com\/wp-content\/seloads\/2025\/11\/semrush-one.webp",0,0,false],"inspiro-loop":["https:\/\/martech.org\/searchengineland.com\/wp-content\/seloads\/2025\/11\/semrush-one.webp",0,0,false],"inspiro-loop@2x":["https:\/\/martech.org\/searchengineland.com\/wp-content\/seloads\/2025\/11\/semrush-one.webp",0,0,false],"portfolio_item-thumbnail":["https:\/\/martech.org\/searchengineland.com\/wp-content\/seloads\/2025\/11\/semrush-one.webp",0,0,false],"portfolio_item-thumbnail@2x":["https:\/\/martech.org\/searchengineland.com\/wp-content\/seloads\/2025\/11\/semrush-one.webp",0,0,false],"portfolio_item-masonry":["https:\/\/martech.org\/searchengineland.com\/wp-content\/seloads\/2025\/11\/semrush-one.webp",0,0,false],"portfolio_item-masonry@2x":["https:\/\/martech.org\/searchengineland.com\/wp-content\/seloads\/2025\/11\/semrush-one.webp"
,0,0,false],"portfolio_item-thumbnail_cinema":["https:\/\/martech.org\/searchengineland.com\/wp-content\/seloads\/2025\/11\/semrush-one.webp",0,0,false],"portfolio_item-thumbnail_portrait":["https:\/\/martech.org\/searchengineland.com\/wp-content\/seloads\/2025\/11\/semrush-one.webp",0,0,false],"portfolio_item-thumbnail_portrait@2x":["https:\/\/martech.org\/searchengineland.com\/wp-content\/seloads\/2025\/11\/semrush-one.webp",0,0,false],"portfolio_item-thumbnail_square":["https:\/\/martech.org\/searchengineland.com\/wp-content\/seloads\/2025\/11\/semrush-one.webp",0,0,false]},"_links":{"self":[{"href":"https:\/\/attentionmedia.io\/index.php?rest_route=\/wp\/v2\/posts\/11007","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/attentionmedia.io\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/attentionmedia.io\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/attentionmedia.io\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/attentionmedia.io\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=11007"}],"version-history":[{"count":0,"href":"https:\/\/attentionmedia.io\/index.php?rest_route=\/wp\/v2\/posts\/11007\/revisions"}],"wp:attachment":[{"href":"https:\/\/attentionmedia.io\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=11007"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/attentionmedia.io\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=11007"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/attentionmedia.io\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=11007"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}