Breaking news from the tech industry reveals that Huawei, the renowned Chinese multinational, is set to introduce its very own artificial intelligence chatbot called Pangu Chat. The chatbot is built on the company's own large-scale learning model, which boasts an impressive 100 billion parameters.
Pangu Chat, named after Huawei's Pangu cloud architecture, is poised to take on the AI chatbot market. Although it is roughly on the same scale as the renowned GPT-3, Pangu Chat is still considered a moderately powerful AI assistant. By comparison, the upcoming GPT-4 model is rumored to house a staggering 170 trillion parameters, which, if true, would far surpass Huawei's language model.
Excitement surrounds the forthcoming Huawei Developer Conference (Cloud) event, scheduled for July 7th, where Pangu Chat will make its grand debut. However, the initial release will primarily target business users rather than regular consumers, with availability expected to extend to government and enterprise customers in the near future.
Huawei has demonstrated its commitment to the Pangu cloud model, having invested a substantial 960 million yuan over the past three years. A significant portion of this investment went toward acquiring 2,000 Huawei Ascend 910 chipsets and 4,000 GPU/TPU cards to facilitate the training process.
After rigorous testing, the Pangu-E large model, an upgraded iteration of the Pangu cloud model, has passed with flying colors, reportedly reaching a scale of nearly 1.085 trillion parameters. This remarkable achievement positions Huawei's Pangu-E model in direct competition with the highly anticipated GPT-4, putting it well beyond GPT-3.5.
Huawei’s foray into the AI landscape is poised to shake up the market, providing businesses with a new, robust chatbot solution. With Pangu Chat’s imminent release, the possibilities for enhanced human-machine interactions are vast. Stay tuned for further updates on this exciting development as Huawei continues to redefine the boundaries of artificial intelligence.