Claude AI Will Soon Be Able to Control Your Browser (If You Let It)

Did you know that you can customize Google to filter out unwanted results? Follow these steps to improve your search results, including adding Lifehacker as your preferred source for tech news .
Have you ever wondered while browsing the internet, “Oh, if only an AI bot could do that for me?” I haven’t, but I think some people probably have, because Anthropic is currently running an experiment that allows certain Chrome users to do just that.
The company announced the new integration on Tuesday . Eligible users will now have access to a Chrome extension that, when activated, allows Claude AI to see everything you do in the browser. Claude can use that context to more accurately answer questions and requests you might receive through the extension’s built-in chatbot window.
But while that’s just one component of the feature, Anthropic’s vision goes far beyond simply making a more usable chatbot. In addition to more contextual interactions, Claude for Chrome can also control your browser and perform actions for you. It’s truly the future, although I’m not sure I really need it.
Here’s an example: Let’s say you’re looking for an apartment. Instead of opening Zillow yourself, you can click the “Claude” button in Chrome to launch a chatbot and tell it what exactly you’re looking for in a new home. As part of that request, you can ask Claude to look at listings on Zillow and share the best deals. According to Anthropic, Claude will do this for you and even tell you what permissions you need to grant in the chatbot window to complete the task, such as reading the contents of a page on Zillow.com .
In another example, Anthropic shows a user asking Claude to find a restaurant on DoorDash with good reviews that serves garlic noodles and add it to their cart. Claude describes the search in detail, including what he sees on the DoorDash homepage, how to search for “garlic noodles,” and even that he has to press “Enter” to perform the search.
If it works as advertised, it’s a little odd that you can ask a chatbot like Claude to do something in a web browser like Chrome and it just does it. But for most tasks, I don’t see the point. I suppose if you’re too busy to browse apartment listings or search for noodles to order for dinner yourself, Claude for Chrome offers the ability to multitask. But I usually don’t have a problem with those types of tasks. In fact, when I was looking for a new apartment or house, I enjoyed doing the research myself; I also enjoyed choosing a good restaurant for dinner. Those aren’t the kinds of things I necessarily need or want a bot for, especially for results that are fairly subjective: Why would Claude want to know which apartments are good for me or whether I’d prefer noodles from one restaurant over another? I’d rather choose those things myself.
Claude for Chrome and your security
Then there are the security issues that Anthropic is open about. The company acknowledges that AI browsers are vulnerable to Promise Injection attacks, a type of cyberattack in which attackers insert malicious instructions into AI models. In testing, the company found that before implementing any security measures, the success rate of Promise Injection attacks was 23.6%. In one of these successful tests, Anthropic sent a malicious email demanding that all emails be deleted from the user’s inbox. Claude for Chrome read the email and followed the instructions. Not perfect.
But it doesn’t implement the security measures Anthropic says it’s working on. These include giving users control over all site-level permissions, as well as verifying users before performing “high-risk” actions like posting content, making purchases, or providing personal information. The company has also improved Claude’s instructions for handling personal information, and blocked the bot from accessing “high-risk” sites like financial, adult, and pirate sites. Anthropic is also working on additional security measures, so this feature is fairly limited for now.
How to sign up for Claude for Chrome
Right now, Anthropic is only offering 1,000 Claude Max subscribers an initial trial, priced at $100 or $200 per month. The company will continue to offer early access to more Max subscribers in the coming weeks, though I wouldn’t be surprised if they open the trial to Pro subscribers ($20 per month) in the future.
If you qualify, you can sign up for the waitlist right now . Despite the above-mentioned security measures, the company warns that testers will be exposed to the following risks from malicious actors:
-
Access to your accounts or files
-
Sharing your personal information
-
Making purchases on your behalf
-
Doing things you never planned to do