Microsoft adds three options for AI-powered chatbot responses
A report reveals that Microsoft has added three options for AI-powered chatbot responses: Creative, Balanced, and Precise. Here is everything you need to know.
Microsoft Bing rose to popularity for all the wrong reasons, and many people continue to use the AI-powered chatbot even after it failed to provide accurate and appropriate answers. Now, the company has released an update that lets you set the tone of the chatbot's replies and get more personalised responses. Microsoft's latest effort aims to improve the chat experience. Here is everything you need to know.
A report from The Verge reveals that Microsoft has added three options for AI-powered chatbot responses: Creative, Balanced, and Precise. The new feature should give people a better experience with the chatbot, as several users had reportedly received rude responses.
The Creative mode will offer users original and imaginative answers. The Precise mode will give short, accurate answers based on the details provided. Finally, the Balanced mode is self-explanatory and will mix accuracy with creativity. The new chat feature is rolling out to all Microsoft Bing users; if it is not visible to you yet, it should arrive in the next few days.
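Microsoft has not said how the three modes are implemented, but response styles like these are commonly mapped to different decoding settings of the underlying language model, such as a higher sampling "temperature" for more imaginative output and a lower one for more deterministic answers. The sketch below is purely illustrative: the mode names mirror Bing's options, but the mapping, parameter values, and helper function are hypothetical and not Microsoft's actual implementation.

```python
# Illustrative sketch only: a hypothetical mapping from Bing-style response
# modes to decoding parameters. Values and structure are assumptions, not
# Microsoft's real configuration.
from dataclasses import dataclass


@dataclass
class ModeSettings:
    temperature: float  # higher -> more varied, imaginative wording
    max_tokens: int     # rough cap on answer length


# Hypothetical settings for the three response modes.
RESPONSE_MODES = {
    "creative": ModeSettings(temperature=0.9, max_tokens=800),
    "balanced": ModeSettings(temperature=0.5, max_tokens=500),
    "precise":  ModeSettings(temperature=0.1, max_tokens=300),
}


def settings_for(mode: str) -> ModeSettings:
    """Return the decoding settings for a chosen response mode."""
    try:
        return RESPONSE_MODES[mode.lower()]
    except KeyError:
        raise ValueError(
            f"Unknown mode: {mode!r}; expected one of {list(RESPONSE_MODES)}"
        )


if __name__ == "__main__":
    for mode in ("Creative", "Balanced", "Precise"):
        s = settings_for(mode)
        print(f"{mode}: temperature={s.temperature}, max_tokens={s.max_tokens}")
```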
The latest update is also expected to fix the issue of the AI chatbot not responding to some of the questions users ask. Some responses have been based on hallucinations, and Bing has reportedly even asked users to "shut up." The ChatGPT-based AI model has threatened users and even asked one user to end their marriage. Microsoft, in its defence, has said that long chat sessions can confuse the underlying chat model in the new Bing.
In a conversation with an NYT journalist, Microsoft's Bing chatbot claimed that it wants to be human because people can feel, be emotional, and do many things that an AI cannot. The bot also revealed that its real name is Sydney, not Bing; Sydney is the internal codename Microsoft used while developing the technology.