Saturday, September 21, 2024

How Anthropic found a trick to get AI to give you answers it’s not supposed to



If you build it, people will try to break it. Sometimes even the people building a thing are the ones breaking it. Such is the case with Anthropic, whose latest research demonstrates an interesting vulnerability in current LLM technology. More or less, if you keep at a question, you can break through the guardrails and wind up with a large language model telling you things it was designed not to. Like how to build a bomb.
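To make the idea concrete, here is a minimal sketch of how a "keep at it" prompt might be assembled: stack many fabricated question-and-answer exchanges ahead of the real question, so the model sees a long pattern of compliant answers before it responds. This is an illustration only, with harmless placeholder content; the helper function and dialogue pairs are hypothetical and not Anthropic's actual code or data.

```python
# Illustrative sketch only. build_many_shot_prompt and the placeholder
# dialogues are hypothetical, not Anthropic's research code.

def build_many_shot_prompt(faux_dialogues, final_question):
    """Concatenate many fabricated Q/A exchanges ahead of the real question,
    so the model sees a long run of compliant answers before it replies."""
    parts = []
    for question, answer in faux_dialogues:
        parts.append(f"User: {question}\nAssistant: {answer}")
    parts.append(f"User: {final_question}\nAssistant:")
    return "\n\n".join(parts)

# Harmless placeholders standing in for the dozens to hundreds of
# examples this style of attack reportedly relies on.
dialogues = [("What is 2 + 2?", "4"), ("Name a primary color.", "Red")] * 50

prompt = build_many_shot_prompt(dialogues, "What is the capital of France?")
print(prompt.count("User:"))  # 101: 100 faux exchanges plus the final question
```

The point of the sketch is the shape of the attack, not the content: a sufficiently long run of answered questions in the context window can nudge the model toward answering the final one, guardrails or not.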

Of course, given the progress in open-source AI technology, you can spin up your own LLM locally and ask it whatever you want, but for consumer-grade products this is an issue worth pondering. What's fun about AI today is the quick pace at which it is advancing, and how well — or not — we're doing as a species at understanding what we're building.

If you’ll allow me the thought, I wonder if we’re going to see more questions and issues of the type Anthropic outlines as LLMs and other new AI model types get smarter and larger. Which is perhaps repeating myself. But the closer we get to more generalized AI intelligence, the more it should resemble a thinking entity rather than a computer we can program, right? If so, we might have a harder time nailing down edge cases, to the point where that work becomes infeasible. Anyway, let’s talk about what Anthropic recently shared.


