
SAN FRANCISCO, Jan 28 - Developers at top U.S. AI companies are applauding the emergence of DeepSeek's AI models while questioning the notion that their own high-priced technology has been outdone by an affordable Chinese newcomer.

Chinese startup DeepSeek rose to prominence on Monday after its free AI assistant overtook OpenAI's ChatGPT atop Apple's App Store in the U.S. The company claimed to have trained its model for less than $6 million using Nvidia's lower-capability H800 chips.

As concerns about competition shook the U.S. stock market, some AI experts praised DeepSeek's skilled team and recent research advances but said they saw no cause for alarm, according to people familiar with discussions at four leading AI labs who spoke on condition of anonymity.

OpenAI CEO Sam Altman acknowledged DeepSeek's R1, calling it "an impressive model, particularly around what they're able to deliver for the price." Nvidia said DeepSeek's achievement was evidence of growing demand for its chips.

In response to customer interest, software maker Snowflake decided to incorporate DeepSeek models into its AI marketplace. Despite some concerns about hosting AI technology from China, Snowflake's executive vice president of product, Christian Kleinerman, expressed confidence in the decision, stating, "We see no issues supporting it as long as we are transparent with our customers."

Meanwhile, U.S. AI developers are rushing to analyze DeepSeek's V3 model. Although DeepSeek released a research paper in December outlining the model behind its popular app, crucial details such as total development costs remain undisclosed.

Early estimates suggest China has narrowed the gap with cutting-edge U.S. AI models from roughly 18 months to about six. DeepSeek's decision to offer its model for free has generated significant excitement but may strain its ability to secure enough chips to meet demand, according to industry insiders.

DeepSeek's success also cannot be attributed solely to a modest $6 million budget, a fraction of the estimated $250 billion U.S. cloud companies are expected to spend on AI infrastructure this year. That figure covers only the chips used in the final training run, not the overall cost of development.

Executives from top labs emphasized that total expenses extend far beyond the training run itself. The paper said the V3 training run used 2,048 of Nvidia's H800 chips, a processor Nvidia designed to comply with U.S. export controls announced in 2022.

Industry executives say earlier stages of development likely required a substantial chip investment, potentially exceeding $1 billion. Praising DeepSeek for releasing its models as open source, venture capitalist Marc Andreessen described DeepSeek R1 as a groundbreaking contribution to the AI community.

The attention DeepSeek has drawn underscores the appeal of open-source AI as a cost-effective alternative to proprietary platforms such as OpenAI's ChatGPT. Upcoming quarterly earnings reports from major American tech companies are expected to prompt a reassessment of assumptions about the resources needed to advance AI.