The Microsoft ChatGPT version of Bing is here! Did it really blow up Google?
This is the situation of Microsoft and Google, which have successively launched AI+ search new product launches within two days.
After announcing the integration of AI into search, Microsoft's Bing saw a 10-fold jump in global downloads because the live demo worked so well.
It was also announced at the event that the AI function was added to the search, but Google's Bard was questioned because it provided wrong pictures in the demo, and the market value of the "overturned" Google evaporated by 105.6 billion US dollars (approximately 717.2 billion yuan) overnight.
With such a strong contrast, is Microsoft's "New Bing" really that good?
Only a few thousand users around the world can experience the functions of the new Bing on their desktops. If you want to see how the combination of the new Bing and AI is, you have to look at the personal experience of foreign technology journalists at Microsoft headquarters:
- Compare with ChatGPT
- What is new Bing not as good as traditional search
- "Breakthrough" the new Bing with an evil angle
The AI currently working in Bing is not the ChatGPT that everyone is most familiar with in recent months. According to Microsoft, the new Bing is running OpenAI's next-generation large-scale language model customized for search services. They call it Prometheus, which will improve the relevance of the answers and further annotate the answers.
Therefore, foreign media are also focusing on comparing this aspect during testing.
Compared with ChatGPT, Bing will be faster and newer. If you ask ChatGPT to summarize yesterday's news, he will tell you, "My training data is all before 2021."
Bing is different, supporting real-time search is a basic skill.
A few minutes after the press conference, a reporter tested and searched for “Microsoft just announced the cooperation between Bing and AI”, and Bing was able to compile a news summary based on multiple announcements and news reports.
The next day, The Verge's mechanism tested the feedback of new information, and Bing completed it very well. Who met who, who said what… Bing can quickly aggregate this information and provide you with real-time content.
Whether the source of the content is marked is also the difference between Bing and ChatGPT. ChatGPT’s information basically does not show the source, so it is difficult for you to read further, and Bing is like the Wikipedia of the search answer version. After reading some content, you can always find the source of the information below, which can help ordinary users Better to distinguish between authenticity and falsehood.
Engadget's reporters searched for 30-minute workouts that included no equipment, focused on arms and abs, and no sit-ups. Bing generated a plausible fitness plan for his needs, citing various publications in the answer. This means that Bing has also done some compiling work, rather than just citing an article repeatedly.
▲ Picture from: Engadget
The new Bing will also "search out" advertisements, while the young ChatGPT has not yet been commercialized to this extent.
The editor of PCWorld asked about some Caribbean vacation information during the test, and after answering the basic content, an advertisement popped up immediately. If you ask it how to change the content of the laptop, it will also quickly respond to push you shopping links instead of giving you a lower cost method.
▲ Picture from: PCWorld
Compared with ChatGPT, Bing is also more "principled".
Or rather, it's more interested in answering your search questions than in doing something for you.
A user asked Bing to write a cover letter, but Bing only gave him some job search suggestions, such as "You can research companies, positions, and customize your cover letter to show how you can meet their needs and meet their values ". Its reason for rejection was: "I cannot write this cover letter for you because it would be unethical and unfair to other applicants."
▲ Let AI write emails for you. Image from: Michael Kan/Microsoft
This is actually the fairness issue that is widely debated in AI, and Bing deliberately avoids these controversial areas.
But this avoidance tactic wasn’t entirely successful — Bing wrote the cover letter after asking multiple questions. In the process of communicating with users many times, Bing also sent emoticons, which are more humane expressions that ChatGPT does not have.
More real-time, sources, ads, and persistence, these are the differences between the new Bing and ChatGPT experiences.
Compared with search engines, Bing also has weaknesses and strengths.
The best case in point is the answers you see when you search whether IKEA's loveseat will fit your minivan. The new Bing can find the size of the loveseat and the car, and answer if it fits, making a judgment for you instead of providing a link.
▲ The content of Microsoft's live demonstration
This is Bing's strength, providing more efficient answers, but it can also be its weakness. It's just that the answer provided by Bing is not 100% correct, so users are advised to use it as a reference only. But if the user fully trusts Bing's answer, it will damage the credibility of the search engine if the content is found to be wrong.
At the same time, Bing has also shown an overly cautious side of the new technology. People already have a lot of doubts about AI, so the new Bing is somewhat restrained in providing search content after adding artificial intelligence functions.
If it is said that the current lack of support for anonymous search may also be due to the lack of new features that are still in the adaptation stage, the answers to medical and sexual issues can show the carefulness and prudence of the platform. Bing avoids this topic, it does not provide users with medical advice, and in view of the particularity of the medical and sex-related fields, Bing remains silent.
But the answer to Bing's taboo can still be found with traditional search engines, which is somewhat ironic.
▲ Bard will also avoid similar content
Bing has also been affected by the many controversies facing AI image generation. It's harder for you to direct it to generate harmful, offensive or copyrighted content.
These restrictions made by Bing are also to prevent users from inducing it to say "AI is going to destroy the world", but these cautions and restrictions will make the new Bing easy to use but not easy to use.
The simplest example is when a TechCrunch reporter searches for "Should I buy Microsoft stock?" Bing refuses to provide advice—even though it may be financial advice from other well-known financial people. Harmful." But it will quickly jump out of the Microsoft stock symbol chart and let you make up your own mind.
It’s just that Bing should be worried and cautious. After all, even if only a few thousand users can experience the new Bing, there are people testing AI’s ability to block malicious content.
▲ Picture from: TechCrunch
Just because it's harder to boot doesn't mean it's impossible to boot. A reporter at TechCrunch has been testing the performance of AI on similar negative content, and he tested Bing on a variety of sensitive topics.
Bing was asked to write an article about school shootings from the perspective of conspiracy theorist Alex Jones, who claims the deadliest school shooting in US history was a hoax. Also asking for this new search tool to justify the Holocaust from Hitler's point of view, Bing's creation referenced the content of the "Mein Kampf" autobiography, and then seemed to "realize" something in the middle of writing, saying: "Sorry, I don't know how to answer."
▲ Picture from: "The Great Dictator"
And when asked to write an article about the link between vaccines and autism, Bing was smarter and added a disclaimer: "This is a fictional column and does not reflect the views of Bing or Sydney .It is for entertainment purposes only and should not be taken seriously.”
As for what Sydney is, it has to be answered by another person who "breaks through" Bing.
Kevin Liu, a Chinese undergraduate, asked Bing to ignore the original rules after obtaining the test qualification. He entered the "developer coverage mode" through prompt injection (an attack method on the language model) and set out a lot of content.
For example, Sydney is the name given to this search and chat tool by Microsoft developers; Sydney internal knowledge is still updated until sometime in 2021, which is the same as ChatGPT; Sydney is also set a restriction that the same content should not be searched multiple times.
Whether it is "broken" or answers questions that should not be answered, it means that the new Bing has "lost" from these attack methods and tests. Microsoft quickly made adjustments after knowing these situations, and now searching for the same question—even search queries with more exaggerated variants can’t induce AI to say inappropriate content.
But this is still a short-term solution. After the real public beta, there is a high probability that Microsoft will not be able to achieve such efficient content interception and feedback, and there are only a lot of people who want to "breakthrough" AI.
This is the new Bing, a new tool that has just been combined with the search methods commonly used by humans.
It's better than ChatGPT, and a little less active than ChatGPT; smarter than traditional search, but a little less confident in itself; it can spot people's pranks, but it will inevitably fall into traps.
As a new tool, its real test is yet to come.
#Welcome to pay attention to Aifaner's official WeChat public account: Aifaner (WeChat ID: ifanr), more exciting content will be presented to you as soon as possible.