Jun 26, 2023

Picking Teams in AI

Yesterday, Databricks announced its intent to acquire Mosaic for $1.3b. Perhaps not coincidentally, Snowflake announced a deepened partnership with Nvidia to offer customers models & training on Nvidia’s NeMo platform.

Clouds are picking teams in one of the most important dislocations in software.

Cloud      | LLM Infrastructure
Microsoft  | OpenAI
Snowflake  | Nvidia
Databricks | Mosaic
Google     | Anthropic
Oracle     | Cohere
Amazon     | HuggingFace

Microsoft has invested over $10b, plus significant development effort, to work with OpenAI. In addition, Microsoft & Snowflake announced a deeper AI go-to-market partnership.

Snowflake’s partnership with Nvidia positions Snowflake’s cloud as a broader infrastructure platform.

Databricks, whose business revolves around Spark operating in customer environments, has announced plans to acquire Mosaic, a vertically integrated model training & management system that runs on similar workloads.

Google has invested hundreds of millions into Anthropic, complementing its efforts with Google Brain.

Oracle has paralleled Microsoft’s OpenAI partnership by investing in Cohere & seeking to build Cohere-powered products for Oracle’s cloud.

Amazon has announced HuggingFace LLMs on its SageMaker product, embracing the open-source community.

Cloudflare, which has seen tremendous interest in its R2 storage product for model training because of its lower-cost storage infrastructure, had partnered with Mosaic. It’s unclear how the Databricks acquisition might change that relationship.

Cloud players are picking teams within the LLM infrastructure layer. Most major clouds have picked an LLM partner & some will likely choose more than one.

For startups building LLM-based applications & infrastructure, this alters the calculus of selecting a cloud. Five years ago, many startups defaulted to AWS for the generous credits, broad catalog, & rapid pace of innovation.

LLM-enabled apps require customer data to train, propelling data security to the top of the list for most enterprise buyers. Startups may begin to pick clouds based first on the class of customer they can reach & the security promises the underlying platform provides, & then on the available models & cost.

Access to particular models may be a consideration, but given the rapid advances in open-source, that advantage will likely erode over time.
