Overview
Recently, China released three new generative artificial intelligence (AI) models, along with two multimodal text-to-image models, and this wave of innovation has drawn a strong reaction across the global technology sector. Partly as a result, US AI-related stocks fell by an average of 10% over the first two trading days of the week. Goldman Sachs Research notes that these developments not only highlight the enormous capital that US technology giants have committed to AI, but also suggest that sustained investment will still be needed to bring AI applications to scale.
Highlights of China's AI development: better performance and lower costs
Chinese AI models stand out for both performance and cost efficiency. For example, the cost of inference (the stage after training, when a model processes content it has never seen before) has fallen by more than 95% in China over the past year, and this sharp decline is expected to accelerate the adoption of generative AI applications. More notably, the new models have made progress in "deep thinking" modes, or reasoning: they work through a question step by step before answering. The process takes about 5 to 20 seconds per question, but it more closely resembles how people think, and the answers are more accurate.
The economics of Chinese AI models are another highlight. A Chinese model currently charges just $0.14 per million input tokens, a small fraction of what comparable reasoning models from large US technology companies charge. This pronounced cost advantage has already prompted some US tech giants to adjust their pricing, in some cases making previously paid models free.
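To make the pricing gap concrete, the sketch below works through the arithmetic implied by the figures quoted in this article: $0.14 per million input tokens for a Chinese model, described in the Goldman Sachs interview as a single-digit percentage of the equivalent US price. The implied US price range and the monthly token volume are illustrative assumptions, not figures from the report.

```python
# Back-of-the-envelope sketch of the token-pricing comparison quoted in this article.
# Assumptions: "single-digit percentage" is read as roughly 1%-9%, and the 1-billion-token
# monthly workload is a hypothetical volume chosen purely for illustration.

CHINA_PRICE_PER_M = 0.14  # USD per million input tokens (figure quoted in the article)


def implied_us_price(china_price: float, pct_of_us: float) -> float:
    """If the Chinese price equals pct_of_us percent of the US price, return the implied US price."""
    return china_price / (pct_of_us / 100)


def workload_cost(tokens: int, price_per_m: float) -> float:
    """Cost in USD of processing `tokens` input tokens at `price_per_m` USD per million tokens."""
    return tokens / 1_000_000 * price_per_m


if __name__ == "__main__":
    for pct in (1, 9):  # endpoints of the assumed "single-digit percentage" range
        print(f"If $0.14/M is {pct}% of the US price, the implied US price is "
              f"~${implied_us_price(CHINA_PRICE_PER_M, pct):.2f} per million tokens")

    monthly_tokens = 1_000_000_000  # hypothetical: 1 billion input tokens per month
    print(f"Monthly input cost at $0.14/M: ${workload_cost(monthly_tokens, CHINA_PRICE_PER_M):,.2f}")
```

Under these assumptions the implied US price falls somewhere between roughly $1.56 and $14 per million input tokens, which is why even large workloads remain inexpensive at the quoted Chinese rate.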
Potential shifts in capital allocation and the industry landscape
Ronald Keung, head of Goldman Sachs' Asia internet research team, notes that the arrival of Chinese AI models is pushing the industry to focus not only on raising performance but also on controlling costs, a shift that could change how global capital is invested in AI. Chinese AI companies, for example, are not only focused on driving costs down but are also trying to accomplish the same tasks with fewer chips. The rise of edge computing has also drawn attention, since smaller AI models may eventually run on phones or personal computers without relying on large data centers.
Even so, Keung notes that Chinese companies' capital spending remains low compared with that of US tech giants: capital expenditure by China's major internet companies grew 61% year over year in 2023, but from a small base, and it still trails global peers by a wide margin. Their investment has been concentrated on shareholder returns and on a domestic market buoyed by consumption-stimulus policies, rather than on overseas expansion or massive AI research spending.
Looking ahead, China's AI breakthroughs could lower technical barriers and create opportunities for more companies and industries. Keung notes that falling costs and rising performance in generative AI could push the field toward artificial general intelligence (AGI), an AI capable of excelling across every domain of human knowledge. On the application side, Goldman Sachs believes the AI market could hold plenty of surprises: AI could be used to optimize ad targeting, to build super AI-assistant apps, or even to embed transactions into social apps. Such an assistant could, for example, handle everything from booking flights to planning a get-together with friends. Whether AI will be dominated by a handful of large-scale applications, as the internet era was, or fragment into a more diverse landscape remains to be seen.
Geopolitical risks remain
Although the rise of Chinese AI is striking, geopolitical factors add uncertainty to its future. US export restrictions on chips bound for China and tight scrutiny of Chinese companies have kept global investors' risk appetite for the Chinese market relatively conservative. Goldman Sachs notes that Chinese internet companies are still valued below their US peers. Even so, their deep roots in the domestic market and continued growth in app downloads may hint at room for global expansion in the future.
In sum, China's latest breakthroughs in generative AI demonstrate strong competitiveness in both technology and cost, and they could reshape AI's application scenarios and the structure of the industry. As the technology spreads, the competitive landscape and the market opportunities ahead may hold many more possibilities.
Original English text (excerpt):
China’s AI development could speed up AI adoption
Goldman Sachs
30 January 2025
What is important about the recent AI developments in China?
Three Chinese AI models were launched last week, as well as two multi-modal text-to-image models this week. And while most of the attention has been on DeepSeek’s new model, the other models are at around the same level in terms of performance and cost per token (a token is a small unit of text).
The cost of inferencing (the stage that comes after training, when an AI model works with content that it has never seen before) has fallen by more than 95% in China over the past year. We expect this much lower inferencing cost to drive a proliferation of generative AI applications.
Some of the models launched over the past week are focused on Deep thinking modes or reasoning. That means that the chat bot goes through each of its steps when you ask a question, telling you what it’s thinking before it arrives at an answer. That takes around 5-20 seconds for every question.
The process makes sense when you look at how human beings interact — if you ask me a question, and then I give you an immediate answer in milliseconds, then the chance is that I might not have thought it through. These models think before they speak.
The performances of these models seem to have improved a lot as a result. It's mostly because they assess their own answers before giving a final output.
Are these developments likely to change the way that capital is invested in AI?
Chinese players have been focused on driving the lowest cost, and also maybe trying to use minimal chips in doing the same tasks. I think over the last week, there's also been more focus on whether edge computing is becoming more popular, which could allow smaller AI models to run on your phone or computer without connecting to mega data centers. I think these are all questions that investors have on how the landscape will evolve.
What is clear to us is that lowering the cost of AI models will drive much higher adoption, as it would make the models much cheaper to use in future.
Both our research teams in China and our US teams expect this year to be the year of AI agents and applications. The good news is that some of these Chinese models have pushed the industry to focus not just on raising the performance, but also on lowering the cost. That should drive higher and higher adoption of artificial intelligence.
How much cheaper are the AI models in China relative to the incumbent AI providers in the US?
When it comes to how much the companies charge per use of the model, which is measured on a per-token basis, the charges are significantly lower. As of last weekend, a Chinese AI model’s pricing was 14 cents per million input tokens. That’s only a single-digit percentage of the amount that an equivalent reasoning model from a large US technology company charges.
It's clear that prices are starting to come down as a result. Already, we've seen some US big tech companies adjust their pricing, including making some of their paid models free. So I think there will be a continuing race on efficiencies.
(The views in this article are provided only as a window on overseas research and do not represent the opinions or position of this platform.)
Source: IMI财经观察