On Wednesday, Microsoft employee Mike Davidson announced that the company has rolled out three distinct personality styles for its experimental AI-powered Bing Chat bot: Creative, Balanced, and Precise. Microsoft has been testing the feature since February 24 with a limited set of users. Switching between the modes produces different results, shifting the bot's balance between accuracy and creativity.
Bing Chat is an AI-powered assistant based on an advanced large language model (LLM) developed by OpenAI. A key feature of Bing Chat is that it can search the web and incorporate the results into its answers.
Microsoft announced Bing Chat on February 7, and shortly after going live, adversarial attacks regularly drove an early version of Bing Chat to simulated insanity, and users discovered the bot could be convinced to threaten them. Not long after, Microsoft dramatically dialed back Bing Chat’s outbursts by imposing strict limits on how long conversations could last.
Since then, the firm has been experimenting with ways to bring back some of Bing Chat’s sassy personality for those who wanted it but also allow other users to seek more accurate responses. This resulted in the new three-choice “conversation style” interface.
In our experiments with the three styles, we noticed that “Creative” mode produced shorter and more off-the-wall suggestions that were not always safe or practical. “Precise” mode erred on the side of caution, sometimes not suggesting anything if it saw no safe way to achieve a result. In the middle, “Balanced” mode often produced the longest responses with the most detailed search results and citations from websites in its answers.
With large language models, unexpected inaccuracies (hallucinations) often increase in frequency with increased “creativity,” which usually means that the AI model will deviate more from the information it learned in its dataset. AI researchers often call this property “temperature,” but Bing team members say there is more at work with the new conversation styles.
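As a rough illustration of what "temperature" means here (this is a generic sketch, not Bing Chat's actual code), sampling a next token typically involves dividing the model's raw logits by a temperature value before applying a softmax. Low temperatures make the pick nearly deterministic; higher temperatures flatten the distribution and make unlikely tokens more probable. The toy four-token vocabulary below is invented for the example.

```python
import numpy as np

def sample_token(logits: np.ndarray, temperature: float, rng: np.random.Generator) -> int:
    """Sample a token index from raw model logits at a given temperature.

    Higher temperature flattens the distribution (more varied, "creative" picks);
    lower temperature sharpens it toward the most likely token.
    """
    scaled = logits / max(temperature, 1e-6)           # avoid division by zero
    probs = np.exp(scaled - scaled.max())              # numerically stable softmax
    probs /= probs.sum()
    return int(rng.choice(len(probs), p=probs))

rng = np.random.default_rng(0)
logits = np.array([2.0, 1.0, 0.2, -1.0])               # toy vocabulary of 4 tokens
for t in (0.2, 0.7, 1.5):
    picks = [sample_token(logits, t, rng) for _ in range(1000)]
    print(t, np.bincount(picks, minlength=4) / 1000)    # spread widens as temperature grows
```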
According to Microsoft employee Mikhail Parakhin, switching the modes in Bing Chat changes fundamental aspects of the bot's behavior, including swapping between different AI models that have received additional training from human feedback on their output. The modes also use different initial prompts, meaning that Microsoft swaps out the personality-defining prompt, such as the one revealed in the prompt injection attack we covered in February.
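Microsoft has not published how this mode switching is implemented, but the general pattern Parakhin describes can be sketched as a per-mode swap of both the initial (system) prompt and the underlying model. The model IDs and prompt text below are placeholders invented purely for illustration.

```python
from dataclasses import dataclass

# Hypothetical illustration only: not Bing Chat's actual configuration.
# It shows the general idea of selecting a different model variant and
# personality-defining initial prompt for each conversation style.

@dataclass
class ChatModeConfig:
    model_id: str        # fine-tuned model variant assumed for the mode
    system_prompt: str   # personality-defining initial prompt

MODES = {
    "creative": ChatModeConfig(
        model_id="assistant-model-creative",   # placeholder name
        system_prompt="You are an imaginative assistant. Offer bold, original ideas.",
    ),
    "balanced": ChatModeConfig(
        model_id="assistant-model-balanced",
        system_prompt="You are a helpful assistant. Balance detail with reliability.",
    ),
    "precise": ChatModeConfig(
        model_id="assistant-model-precise",
        system_prompt="You are a careful assistant. Answer only what you can verify.",
    ),
}

def build_request(mode: str, user_message: str) -> dict:
    """Assemble a chat request for the selected conversation style."""
    cfg = MODES[mode]
    return {
        "model": cfg.model_id,
        "messages": [
            {"role": "system", "content": cfg.system_prompt},
            {"role": "user", "content": user_message},
        ],
    }

print(build_request("precise", "How long should I roast a chicken?"))
```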
While Bing Chat is still only available to those who signed up for a waitlist, Microsoft continues to refine Bing Chat and other AI-powered Bing search features as it prepares to roll them out more widely to users. Recently, Microsoft announced plans to integrate the technology into Windows 11.