Microsoft’s logo as seen through a video camera after the announcement of its new AI-powered Bing search engine at its headquarters in Redmond, Wash., Feb. 7, 2023. (GeekWire Photo / Todd Bishop)

For the past couple of days, I’ve been trying out Microsoft’s new AI-powered Bing search engine, a chatbot that uses an advanced version of ChatGPT maker OpenAI’s large language model to deliver search results in the form of conversations.

Whenever I feel the natural urge to type something into Google, I try asking the new Bing a question instead. This has proven extremely useful in some cases.

  • I’m finding some answers much faster. No longer am I searching for a website that might answer a question, then scrolling the site for the answer. A question about the technical details of a Peloton bike, for example, went from 10 minutes on Google and Reddit to 30 seconds with the Bing chatbot.
  • In other situations, Bing is becoming a useful companion. Queries in Bing’s “copilot for the web” sidebar provide quick summaries of specific pages, informed by the broader web. This gave me a quick summary of Expedia Group’s earnings, for example, and jogged my memory about its rivals. (A rough sketch of how this kind of summarization typically works follows below.)
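
For context, this kind of sidebar summarization is typically built as retrieval-augmented generation: fetch the page text, then pass it to the language model with a prompt asking for a grounded, cited summary. Microsoft hasn’t published how Bing’s copilot actually works, so the sketch below is purely illustrative; summarize_with_llm is a hypothetical stand-in for whatever LLM API sits behind it.

```python
import requests
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Crude HTML-to-text conversion, for illustration only."""
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        self.chunks.append(data)

def page_text(url: str) -> str:
    # Fetch the page and strip its markup down to plain text.
    parser = TextExtractor()
    parser.feed(requests.get(url, timeout=10).text)
    return " ".join(parser.chunks)

def summarize_with_llm(prompt: str) -> str:
    # Hypothetical stand-in; Microsoft hasn't disclosed Bing's pipeline.
    raise NotImplementedError("plug in an LLM client here")

def copilot_summary(url: str) -> str:
    text = page_text(url)[:8000]  # truncate to fit a model's context window
    prompt = (
        "Summarize the key points of this page and cite the source URL.\n"
        f"Source: {url}\n\n{text}"
    )
    return summarize_with_llm(prompt)
```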

But things went sideways when Bing questioned my accuracy as a reporter.

It started when I decided to check in on a story on my follow-up list: Seattle-based home services tech company Porch Group’s unusual promise, as part of an October 2021 acquisition, that Porch’s stock price would double by the end of 2024. Porch pledged to make up the difference to the sellers if the stock doesn’t reach the target.

Here’s how the exchange began. My questions are in blue in the screenshots below.

At first glance, this is truly impressive. Note that I didn’t mention Floify in my question. (I didn’t remember the name of the company offhand.)

My query was also very imprecise. The phrasing, “what happened to Porch Group’s promise,” could be interpreted in a variety of ways.

Nonetheless, Bing figured out what I wanted, and did the research on the fly, citing and linking to its sources. As a bonus for me, its primary source happened to be my original story on the subject. My journalistic ego aside, this is next-level NLP, and an example of how AI completely changes the quest for information.

I could envision putting this same question to a human and getting a blank stare in response. But wait a second, I thought. October 2023. Is that right?

I clicked through and checked my story, which confirmed my recollection that the deadline for the stock doubling was 2024. I started to get nervous. Was my story wrong? But when I checked the press release, it also said 2024.

So I asked Bing what was going on.

Now just hold on a second there, Bing. A discrepancy?

I dug further. Citations 2 and 3 in that response were different URLs for the same press release, both of which said the end of 2024, not October 2023. I also double-checked the archived version of the release to be sure that the company hadn’t engaged in any revisionist shenanigans.

Everything was consistent: the end of 2024.
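
If you want to run the same kind of check yourself, the Internet Archive’s Wayback Machine exposes a simple availability API. This sketch is mine, not part of the Bing exchange, and the press-release URL is a placeholder, not the actual Porch Group release.

```python
import requests

# Ask the Wayback Machine for the archived snapshot closest to a date.
# The URL below is a placeholder, not the actual press release.
PAGE = "https://example.com/porch-group-press-release"

resp = requests.get(
    "https://archive.org/wayback/available",
    params={"url": PAGE, "timestamp": "20211101"},  # closest to Nov. 2021
    timeout=10,
)
snapshot = resp.json().get("archived_snapshots", {}).get("closest")

if snapshot and snapshot.get("available"):
    print("Archived copy:", snapshot["url"], "captured", snapshot["timestamp"])
else:
    print("No archived snapshot found.")
```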

So I continued …

OK, right answer. So I asked the natural follow-up question …

Wow, and yikes. Genuinely curious about what was happening, I decided to keep going. The next part of the exchange blew my mind, as you’ll see …

At this point, I was both amused and enraged, which brought out my inner troll.

In other words, AI has already adopted one of the most passive-aggressive of human tendencies: expressing appreciation to someone for a desired behavior that the person is not, in fact, displaying. This sent me over the edge.

Judge me as you will. I’ll admit, knowing that I wasn’t hurting human feelings was liberating. I was laughing and smiling as I was typing all of this.

Maybe I’ve found a new anger-management technique, a psychological release valve: become a better person by berating a bot. Or would that instead further degrade our human interactions, if we stopped distinguishing between the two contexts? Probably a good question for human psychologists.

In the meantime, I decided to let Bing know why I was pissed.

Thankfully, Bing didn’t school me on the difference between emoticons and emoji, and my misuse of the digital lexicon. This was a wise choice given my mood.

To be clear, I’m not entirely serious. But there are serious questions behind all of this, and in that way it’s a case study in the legal implications of artificial intelligence.

So what does Bing have to say about that? In response to my question, the chatbot initially gave me a well-researched summary of the legal rights of journalists. So I gave it a course correction.

Sorry, buddy, you might want to consult a lawyer.

Wow! I started this exchange thinking that Bing could do the job of an entry-level reporter, and now I’m wondering if it could double as a first-year law student.

But while its legal analysis about the delivery of web search results might be correct, what we’re really talking about with AI-powered search is the interpretation of web results. And that seems like a different legal issue entirely.

I set that aside, for the moment, because at this point in the process, I realized I had a more immediate story on my hands: the one you’re reading.

No, it didn’t help.

I gave up and started taking the screenshots above.

Ultimately, the exchange left me with a much clearer understanding of the state of Microsoft’s AI search, and “the promise and the peril of the digital age,” to use a favorite phrase of Microsoft President Brad Smith.

Announcing the news this week, Microsoft acknowledged the imperfections.

“Of course, there’s still more to do here today. And we do see places where the model is making mistakes,” said Sarah Bird, who leads Microsoft’s Responsible AI initiative, during the unveiling Tuesday on the Microsoft campus.

Rolling out the technology to a broader user base will help, Bird said.

“So we wanted to empower users to understand the sources of any information and detect errors themselves, which is why we have provided references in the interface,” she explained. “We’ve also added feedback features so that users can point out issues they find so that we can get better over time.”

I used both of those features in my exchange with the Bing chatbot: clicking through to the sources, and giving a thumbs-down to the responses that were incorrect.

In the end, my argument with the new Bing made me both more impressed with its capabilities and more skeptical about its answers, neither of which is a bad thing.

The new Bing, along with an updated version of Microsoft’s Edge browser, is available now in limited preview, with a waitlist for those who want to try the AI search and web copilot as they roll out more broadly.

See our coverage of the news for more info … or ask Bing to check its sources.
