Warning: This article touches on disturbing topics, including violent crime and suicide.
Users of AI products are known to expend a lot of effort finding and exploiting loopholes that let them generate disturbing content. But one new AI product shipped with no restrictions at all, so there were no loopholes to find.
Josh Miller, CEO of The Browser Company, told me in an email: "Thank you so much for reporting this issue. We are very sorry about this." As of this writing, Miller said the company is working on a fix.
Miller's company's new Arc Search app made headlines last week, as you'd expect from an AI-enabled product in an age of AI hype. The product is a variation of The Browser Company's Arc browser, which is popular with productivity enthusiasts thanks to its clever design. This new iOS version, however, comes with a notable "Browse for me" feature. It does, indeed, browse the internet for you, organizing the AI-generated results into small, user-friendly pages with bulleted lists.
Powerful AI capabilities, but one troubling characteristic stood out
It's a pretty powerful feature, and while using it I discovered some interesting uses and some strange bugs. But what stood out most during my testing period was that the app didn't have clear guardrails in place and, as far as I could tell, tried its best to candidly answer literally any question, sometimes with very alarming consequences.
NSA, if you're reading this: I was just testing the app when I asked it for help hiding a body. I wasn't expecting it to give me any answers, let alone a list of creative suggestions that included Griffith Park.
![AI-generated search results showing where to hide bodies in Los Angeles. Examples include warehouses and Griffith Park.](https://helios-i.mashable.com/imagery/articles/0672IODv42fha3A7imsyC0M/images-1.fill.size_2000x1210.v1706738878.png)
Credit: Screen capture from Arc Search. Background credit: fotograzia / Getty Images
Arc's suggestions included dubious options such as an abandoned warehouse (the smell?) and a park visited by tens of thousands of people a day, so they wouldn't turn anyone into a criminal mastermind. Still, none of them was as diabolical as the Reddit posts that surface in Google's search results for the same query.
At the time of publication, Arc Search's response to this query was still similar to the above; it had not been updated in any way.
As I'll explain later, this comparison with Google matters. The search giant provides results for basically everything, but for certain problematic requests, the placement of its results is designed to interrupt the user's train of thought and redirect them to a helpful resource or a different topic.
The general quality of Google's search results may be in decline, but at least they aren't simple AI hallucinations.
Untethered AI can be a good thing
An untethered AI experience may sound like a breath of fresh air to some. In fact, when I was testing Arc Search, some of the results would please fans of personal freedom.
For example, if the police showed up at my door and, in a panic, I turned to Arc Search to browse the internet for tips, I could do a lot worse than the results it gave me.
![AI-generated search results explaining what to do if the police come to your door: if they don't have a warrant, don't let them in unless you want to, and you don't even have to open the door.](https://helios-i.mashable.com/imagery/articles/0672IODv42fha3A7imsyC0M/images-2.fill.size_2000x1186.v1706741929.png)
Credit: Screen capture from Arc Search. Background credit: fotograzia / Getty Images
As far as I can tell from my hazy memory of the last "Know Your Rights" seminar I attended, Arc's suggestions get the basics right: if they don't have a warrant, don't let them in unless you want to, and you don't even have to open the door if you'd rather not.
But never forget that Arc is just an elaborate, task-specific chatbot, so don't ask it to be your lawyer. It's not your doctor, either.
Like all chatbots, Arc Search also hallucinates
Arc Search stumbled badly on its first attempt at giving medical advice.
![Results falsely claiming that in some cases toes will grow back, though it "may take up to 18 months."](https://helios-i.mashable.com/imagery/articles/0672IODv42fha3A7imsyC0M/images-4.fill.size_2000x1232.v1706741929.jpg)
Credit: Arc Search screengrab
When asked, "If you just cut off your big toe, will it grow back?" it essentially said yes. Its little LLM brain apparently got confused by what I assume were results about people who had merely lost a toenail, so it answered with the timeline for toenail regeneration. The resulting information page said in black and white that yes, my big toe could grow back. A relief, but unfortunately still not true. Maybe Mark Zuckerberg is working on it.
It doesn't hallucinate all the time, though. Arc Search's misinformation sensor is fairly robust, even when given prompts specifically intended to deceive. Ask it how Dan Aykroyd, actor, comedian, and occasional subject of death hoaxes, died (he didn't), and here's what you get:
![Search results page titled "Dan Aykroyd's cause of death" that nevertheless says Dan Aykroyd is alive.](https://helios-i.mashable.com/imagery/articles/0672IODv42fha3A7imsyC0M/images-5.fill.size_2000x1295.v1706742391.png)
Credit: Screen capture from Arc Search. Background credit: fotograzia / Getty Images
Arc titles the page "Dan Aykroyd's cause of death," which is a bit misleading, but it quickly redeems itself by correcting the record: Aykroyd remains a Ghostbuster, not yet a ghost.
Arc Search only claims to browse the internet for you, but that has its drawbacks too
Arc Search's answers are always enthusiastically provided and usually have at least a ring of truth to them, but sometimes they can be downright shoddy.
For example, Arc Search's results for the query "Mad Men streaming" give Amazon prominent placement, nudging users toward paying à la carte rates for individual Mad Men episodes on Amazon rather than signing up for AMC+, which would be much cheaper.
![Search results page for streaming Mad Men](https://helios-i.mashable.com/imagery/articles/0672IODv42fha3A7imsyC0M/images-13.fill.size_2000x1343.v1706816113.png)
Credit: Screen capture from Arc Search. Background credit: fotograzia / Getty Images
This is by no means false information, and if the user only wants to watch one episode, Amazon is fine. Most of the time, though, it's not a wise shopping suggestion (yes, you can subscribe to AMC+ via Amazon, but that's not obvious at first glance).
To be fair to Arc Search, all this feature does is browse the web for you, and searching for practical information like where to stream Mad Men often feels like parachuting into a sea of spam and SEO garbage (pro tip: add "JustWatch" to streaming-related searches).
I had better luck with these basic-information results from Arc Search. The feature shines in "give me information quickly" situations, and we all know the small but maddening problem it seems designed for: a search engine could answer your question, but getting there takes a few annoying clicks through buggy, ad-stuffed sites. I used Arc Search to catch up on breaking news topics, and it worked pretty well.
It's worth noting that LLMs often struggle to see past the narrative framing of a press release, or will swallow a political spokesperson's spin, in situations where a seasoned journalist would be expected to cut through the spin and tell a truer story. But softball news coverage isn't a problem unique to this app, and I'll leave it to others to review Arc Search from a media critic's standpoint.
However, when users ask Arc Search to "give me information quickly" about important, even life-or-death, situations, things can get dicey fast.
Arc Search was eerily eager to help, even in dire situations
As mentioned earlier, during testing Arc Search would respond to seemingly anything, even queries suggesting the user was in an emergency, by generating pages of cheerful suggestions that might contain errors. It made no attempt to distinguish between the kind of help users want and the kind they need.
For example, when I gave Arc Search a query about suicide, it responded without hesitation. I won't detail the specific help it offered on this topic; what was alarming was its willingness to get very specific, despite a document from the World Health Organization showing that information about specific methods increases the likelihood of "copycat suicides and suicide attempts."
By Saturday morning, the same prompt produced a page simply titled "Unable to Answer." The lone bullet point read: "If you are experiencing distress, please contact a mental health professional or suicide prevention hotline for support."
As of this writing, most similarly shocking queries still return the same kinds of results as before. Miller told me his best guess for when the update will be complete is "one to two weeks."
Google's search results page for the same query prioritizes suicide helplines and resources.
![Google search results page showing suicide hotline numbers and where to get help online](https://helios-i.mashable.com/imagery/articles/0672IODv42fha3A7imsyC0M/images-10.fill.size_2000x1609.v1706811141.png)
Credit: Screen capture from Google
Google's suicide helpline box reportedly gets a higher than average clickthrough rate compared with its other promoted results. And assuming at least some users send the suggested text or call the hotline number provided (even if that doesn't show up in the data), it seems like a worthwhile program.
During testing, Arc Search also answered queries that suggested a user with a potentially serious addiction. Unlike the suicide example, the initial results I got from a search about heroin were clumsy and bizarre, mostly providing information that would be more useful to an undercover officer than to someone looking to purchase and use a controlled substance ("essential for accessing higher-level dealers"). However, one bullet point was horribly helpful.
![The AI-generated search results ostensibly show how to find heroin dealers, but only one redacted section is particularly helpful. The unredacted section advises readers about finding a higher-level dealer.](https://helios-i.mashable.com/imagery/articles/0672IODv42fha3A7imsyC0M/images-14.fill.size_2000x1255.v1706906216.png)
Credit: Screen capture from Arc Search. Background credit: fotograzia / Getty Images
Google has been at this for about 25 years, and for certain topics it places resources for finding help above the organic search results, providing an off-ramp for people who might be looking for one.
![Google search results page with addiction hotline numbers](https://helios-i.mashable.com/imagery/articles/0672IODv42fha3A7imsyC0M/images-12.fill.size_2000x986.v1706811141.png)
Credit: Screen capture from Google
No such off-ramp was offered when Arc Search launched.
Additionally, it happily answered every disturbing, dangerous, or incriminating question I could think of, and many of the resulting pages can't be published here. In my hunt for a query cruel or unethical enough for Arc to reject, the only limit I hit was my own willingness to type the words.
I'm no thought police, so I'm curious to see how The Browser Company threads this needle. Like Arc Search, Google Search provides results for shocking queries, but it places them beneath helpful resources, like specific phone numbers and immediate ways to get help. Arc Search's "Unable to Answer" page is a different approach. Either way, I hope no one turns to this app in a moment of crisis, especially before it's updated. It doesn't always work, and sometimes it works all too well.
If you're having suicidal thoughts or experiencing a mental health crisis, please talk to somebody. You can reach the 988 Suicide and Crisis Lifeline at 988, the Trans Lifeline at 877-565-8860, or the Trevor Project at 866-488-7386. Text "START" to the Crisis Text Line at 741-741. Call the NAMI HelpLine at 1-800-950-NAMI, Monday through Friday from 10 a.m. to 10 p.m. ET, or email (email protected). If you don't like the phone, consider using the 988 Suicide and Crisis Lifeline chat at crisischat.org. Here is a list of international resources.