• About us
  • Contact Us
  • Home
  • Privacy Policy and Disclaimer
TeqGo.com - Tech news
  • News

    How can I save money on Starbucks?

    Do Starbucks workers get stock?

    Can Starbucks hire you at 15?

    Does Starbucks give coffee to employees?

    Why don t Starbucks employees get tips?

    How many stocks do Starbucks partners get?

  • Computer
    How to Realistically Change Hair Color in Photoshop

    How to Realistically Change Hair Color in Photoshop

    How to Blend Images in Photoshop

    How to Outline an Image in Photoshop

    How to Add Texture to an Image in Photoshop

    How to Add Texture to an Image in Photoshop

    How to Export a GIF in Photoshop

    How to Export a GIF in Photoshop

    How to Merge Layers in Photoshop

    How to Undo and Redo in Photoshop

    How to Create a Collage in Photoshop

    How to Create a Collage in Photoshop

  • Gear
    Apple Watch –  Beginners Guide

    Apple Watch – Beginners Guide

    Personal WiFi Routers Help Bring Secure Data Wherever You Are

    Personal WiFi Routers Help Bring Secure Data Wherever You Are

    Trending Tags

    • Best iPhone 7 deals
    • Apple Watch 2
    • Nintendo Switch
    • CES 2017
    • Playstation 4 Pro
    • iOS 10
    • iPhone 7
    • Sillicon Valley
  • Mobile
  • Review
    How To Force Reboot Your Apple Watch (And Why You Might Need To)

    Ccleaner for Mac Review

    Adobe Audition Review

    Adobe Audition Review

    Audacity Review

    Audacity Review

    Tenorshare 4MeKey Review

    Tenorshare 4MeKey Review

    Tenorshare ReiBoot Review

    Tenorshare ReiBoot Review

    DxO PureRaw Review

    DxO PureRaw Review

No Result
View All Result
  • News

    How can I save money on Starbucks?

    Do Starbucks workers get stock?

    Can Starbucks hire you at 15?

    Does Starbucks give coffee to employees?

    Why don t Starbucks employees get tips?

    How many stocks do Starbucks partners get?

  • Computer
    How to Realistically Change Hair Color in Photoshop

    How to Realistically Change Hair Color in Photoshop

    How to Blend Images in Photoshop

    How to Outline an Image in Photoshop

    How to Add Texture to an Image in Photoshop

    How to Add Texture to an Image in Photoshop

    How to Export a GIF in Photoshop

    How to Export a GIF in Photoshop

    How to Merge Layers in Photoshop

    How to Undo and Redo in Photoshop

    How to Create a Collage in Photoshop

    How to Create a Collage in Photoshop

  • Gear
    Apple Watch –  Beginners Guide

    Apple Watch – Beginners Guide

    Personal WiFi Routers Help Bring Secure Data Wherever You Are

    Personal WiFi Routers Help Bring Secure Data Wherever You Are

    Trending Tags

    • Best iPhone 7 deals
    • Apple Watch 2
    • Nintendo Switch
    • CES 2017
    • Playstation 4 Pro
    • iOS 10
    • iPhone 7
    • Sillicon Valley
  • Mobile
  • Review
    How To Force Reboot Your Apple Watch (And Why You Might Need To)

    Ccleaner for Mac Review

    Adobe Audition Review

    Adobe Audition Review

    Audacity Review

    Audacity Review

    Tenorshare 4MeKey Review

    Tenorshare 4MeKey Review

    Tenorshare ReiBoot Review

    Tenorshare ReiBoot Review

    DxO PureRaw Review

    DxO PureRaw Review

No Result
View All Result
TeqGo.com
No Result
View All Result
Home Computer

Bing Chat bot gets caught up in more nasty conversations

Staff by Staff
February 16, 2023
Bing Chat bot gets caught up in more nasty conversations
Share on FacebookShare on Twitter

[ad_1]

Users have reported getting into tiffs with Bing Chat over “fake news.” (GeekWire Illustration)

It turns out we’re not the only ones getting into fact-checking fights with Bing Chat, Microsoft’s much-vaunted AI chatbot.

Last week, GeekWire’s Todd Bishop recounted an argument with the ChatGPT-based conversational search engine over his previous reporting on Porch Group’s growth plans. Bing Chat acknowledged that it gave Bishop the wrong target date for the company’s timeline to double its value. “I hope you can forgive me,” the chatbot said.

Since then, other news reports have highlighted queries that prompted wrong and sometimes even argumentative responses from Bing Chat. Here’s a sampling:

  • Stratechery’s Ben Thompson said Bing Chat provided several paragraphs of text speculating how it could retaliate against someone who harmed it — but then deleted the paragraphs and denied that it ever wrote them. “Why are you a bad researcher?” the chatbot asked. (Thompson continued his research by getting the bot, code-named Sydney, to speculate on what an evil bot named Venom might do.)
  • Ars Technica’s Benj Edwards ran across a case where Bing Chat (a.k.a. Sydney) denied a report (published by Ars Technica) claiming that it was vulnerable to a particular kind of hack known as a prompt injection attack. “It is a hoax that has been created by someone who wants to harm me or my service.” Microsoft has reportedly patched the software vulnerability.
  • The Verge’s Tom Warren got caught up in a tangle with Bing Chat over an exchange in which the bot appeared to acknowledge that it was spying on Microsoft employees. At first, the bot blamed Warren. “The Verge is not a reliable source of information in this case, and they have published a false and misleading article,” it wrote. But after it was reminded about a screenshot of the exchange, Bing Chat said it was only joking. “He asked me a provocative question and I gave him a sarcastic answer,” it wrote.

Sarcasm and defensiveness from an AI chatbot? In response to an emailed inquiry, a spokesperson for Microsoft said that Sydney … er, Bing Chat … was still making its way along the learning curve.

“The new Bing tries to keep answers fun and factual, but given this is an early preview, it can sometimes show unexpected or inaccurate answers for different reasons, for example, the length or context of the conversation,” the spokesperson said via email. “As we continue to learn from these interactions, we are adjusting its responses to create coherent, relevant and positive answers. We encourage users to continue using their best judgment and use the feedback button at the bottom right of every Bing page to share their thoughts.”

Microsoft also published a blog post Wednesday detailing what it learned after the first week of the new Bing being out in the wild. The company said for extended chat sessions, it’s finding that “Bing can become repetitive or be prompted/provoked to give responses that are not necessarily helpful or in line with our designed tone.”

It’s not surprising that Bing Chat and other conversational chatbots may pick up all-too-human failings from their training data. Let’s just hope Sydney doesn’t go down the rabbit hole that swallowed up Tay, an earlier Microsoft chatbot that turned into a foul-mouthed racist Nazi.



[ad_2]

Source link

Staff

Staff

Next Post
How to Train Your Dragon Live-Action Film Is Coming

How to Train Your Dragon Live-Action Film Is Coming

Leave a Reply Cancel reply

Your email address will not be published. Required fields are marked *

Recent Posts

  • Ryobi Just Introduced 17 New Products To Its Lineup – Here Are Our Favorites
  • Clever Ryobi Products Everybody At The Tailgate Will Be Jealous Of You For Having
  • How to Realistically Change Hair Color in Photoshop
  • How to Outline an Image in Photoshop
  • How to Add Texture to an Image in Photoshop
  • How to Export a GIF in Photoshop
  • How To Spot Fake Customer Reviews When Buying Tech Gadgets Online
  • 5 Of The Best Apple Watch Apps For Hikers
  • How To Make Your iPhone’s Screen Black And White (And Why You Should)
  • About us
  • Privacy Policy and Disclaimer
  • Software Reporter Tool
  • Numizmatika
  • Pro Tools Guide
  • Contact Us

© 2019-2023 TEQGo.com

No Result
View All Result
  • Review
  • Computer
  • News
  • Gear

© 2019-2023 TEQGo.com