The new ChatGPT Bing says you can ask it anything but that's a very bad idea

Ask me anything. It's the long form of an AMA and one of the most popular forms of interactive discourse on Reddit. It's also a major challenge, as Microsoft's Bing AI chatbot, a.k.a. "new Bing," is quickly learning.

Anytime a celebrity or other notable signs up to do a Reddit AMA, usually shortly after posting a photo to prove it's really them answering questions, there is a deep moment of trepidation.

The ability to ask anyone anything is usually a minefield of inappropriate discourse, one managed by a live community manager who fields and filters the questions. Otherwise, things quickly go off the rails. Even with that protection, they often do anyway.

When Microsoft launched its new Bing AI-powered chat, it made it clear that the ChatGPT AI was ready for any and all questions. This was either a sign of deep trust in the relatively small but growing group of users or incredible naivete.

Even ChatGPT, which sparked the original AI chatbot sensation and on which Bing's chat is based, doesn't offer that prompt. Instead, there's an empty text-entry box at the bottom of the screen. Above it is a list of example questions, capabilities, and, most importantly, limitations.

Bing has that leading prompt and below it an example question plus a big "Try it" button next to another button prompting you to "Learn More." To heck with that. We like to go right in and, following Bing's instructions, ask it anything.

Naturally, Bing's been peppered with a wide range of questions, including many that have nothing to do with quotidian needs like travel, recipes, and business plans. And those are the ones we're all talking about because, as always, inviting people to ask "anything" means they will ask anything. Google's Bard, by contrast, went with a potentially less risky prompt: "What's on your mind?"

Bing is fielding ponderings about love, sex, death, marriage, divorce, violence, foes, libel, and emotions it insists it doesn't have.

In OpenAI's ChatGPT, the home screen warns that it:

  • May occasionally generate incorrect information
  • May occasionally produce harmful instructions or biased content
  • Limited knowledge of world and events after 2021

Too many questions

Bing's ChatGPT is slightly different from OpenAI's, and it may not face all of those limitations. In particular, its knowledge of world events may, thanks to the integration of Bing's knowledge graph, extend to the present day.

But with Bing out in the wild, or the increasingly wild, it may have been a mistake to encourage people to ask it anything.

What if Microsoft had built Bing AI Chat with a different prompt:

  • Ask me some things
  • Ask me a question
  • What do you want to know?

With these slightly modified prompts, Microsoft could add a long list of caveats about how Bing AI Chat doesn't know what it's saying. Okay, it does (sometimes), but not in the way you know it. It has no emotional intelligence, no emotional response, and no real moral compass. I mean, it tries to act like it has one, but recent conversations with The New York Times and even Tom's Hardware prove that its grasp of basic human morality is tenuous at best.

In my own conversations with Bing AI Chat, it's told me repeatedly that it does not have human emotions, but it still converses as if it does.

For anyone who's been covering AI for any amount of time, none of what's transpired is surprising. AI knows:

  • What it's been trained on
  • What it can learn from new information
  • What it can glean from vast stores of online data
  • What it can learn from real-time interactions

Bing AI Chat, though, is no more conscious than any AI that's come before it. It may be one of AI's better actors, however, in that its ability to carry on a conversation is well above anything I've experienced before. That feeling only increases with the length of a conversation.

I'm not saying that Bing AI Chat becomes more believable as a sentient human, but it does become more believable as a somewhat irrational or confused human. Long conversations with real people can go like that, too. You start on a topic and maybe even argue about it, but at some point the argument becomes less logical and rational. In the case of people, emotion comes into play. In the case of Bing AI Chat, it's like reaching the end of a rope where the fibers exist but are frayed. Bing AI has the information for some of these long conversations but not the experience to weave it together in a way that makes sense.

Bing is not your friend

By encouraging people to "Ask Me Anything...", Microsoft set Bing up for, if not failure, then some significant growing pains. That pain is felt perhaps by Microsoft, and certainly by people who purposely ask questions no normal search engine would ever have an answer for.

Before the advent of chatbots, would you even consider using Google to fix your love life, explain God, or be a substitute friend or lover? I hope not.

Bing AI Chat will get better, but not before we've had a lot more uncomfortable conversations in which Bing regrets its response and tries to make it disappear.

Asking an AI anything is the obvious long-term goal, but we're not there yet. Microsoft took the leap and now it's freefalling through a forest of questionable responses. It won't land until Bing AI Chat gets a lot smarter and more circumspect, or Microsoft pulls the plug for a little AI reeducation.

Still waiting to ask Bing anything? We have the latest details on the waitlist.

Lance Ulanoff
Editor At Large

A 38-year industry veteran and award-winning journalist, Lance has covered technology since PCs were the size of suitcases and “on line” meant “waiting.” He’s a former Lifewire Editor-in-Chief, Mashable Editor-in-Chief, and, before that, Editor in Chief of PCMag.com and Senior Vice President of Content for Ziff Davis, Inc. He also wrote a popular, weekly tech column for Medium called The Upgrade.

Lance Ulanoff makes frequent appearances on national, international, and local news programs including Live with Kelly and Mark, the Today Show, Good Morning America, CNBC, CNN, and the BBC. 
