Optimism to build, pessimism to design
Sequoia's thesis for datacenters, Figma AI and Notion's data infra
Welcome back to the Enterprise AI Playbook, Issue 7. Here are the successes, cautionary tales and deep dives from this week.
Successful launches - “AI is now shovel ready”
This thought piece by Sequoia focuses on the data center gold rush sparked by the demand for GPUs to power AI systems, and argues that the new capacity will benefit startups in the long term.
In particular, we will focus on the data center buildout, the rise of the “AI factory,” and its implications for energy, construction and the industrial supply chain. We believe that 2025 will be the “Year of the Data Center” and that we are on the cusp of transitioning from a hype cycle into an industrial-driven build cycle.
[…]
Whether there will be enough demand to fill them, we still don’t know yet. At the very least, training and inference costs should continue their decline, a boon to startups.
The article focuses on the capex investments being made in both cloud and first-party data centers, with the expected results being:
1) more capacity for startups
2) a boom in infrastructure
3) an improvement in building capabilities.
The thesis is interesting, but optimistic, with two elephants in the room: the ability of the US to build infrastructure, and the climate impact of more data centers.
Building and procuring infrastructure in the US (and many Western countries) has been challenging, for both the physical facilities and the energy systems around them. The US has yet to phase out coal power and struggles to maintain and modernize many energy providers, including PG&E, the utility in the heart of Silicon Valley. Perhaps big tech's procurement power and influence will be the magic bullet against bureaucracy and short-term profit motives, but the data center buildout of the past 20 years during the cloud boom did not trigger a corresponding power boom, either in utility capacity or in the adoption of cheap nuclear power. The US also struggles to build physical transformers (the grid hardware, not the neural network architecture), which are essential to any grid expansion.
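To put the grid pressure in concrete terms, here is a rough back-of-envelope calculation. The cluster size, per-GPU wattage and overhead multiplier below are illustrative assumptions, not figures from the Sequoia piece:

```python
# Back-of-envelope: grid demand of a hypothetical AI training cluster.
# All figures are assumptions for illustration, not sourced data.

gpus = 100_000      # hypothetical cluster size
gpu_watts = 700     # approximate TDP of an H100-class datacenter GPU
overhead = 1.5      # rough multiplier for CPUs, networking, cooling (PUE-style)

total_mw = gpus * gpu_watts * overhead / 1e6
print(f"Cluster draw: {total_mw:.0f} MW")  # ~105 MW

# A large US nuclear reactor produces on the order of 1,000 MW, so a handful
# of clusters at this scale consumes a meaningful fraction of one reactor,
# running continuously -- hence the pressure on utilities and transformers.
```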
Most big technology companies have also cancelled their net zero plans and increased their power consumption immensely. These climate impacts will continue to cause issues as 2024 continues its trajectory to be the hottest year on record. While investors are typically optimistic, the magic powers of big tech have still not been felt in the physical infrastructure world.
Cautionary Tales - Figma AI release misses the mark
Figma received criticism last week over its new “Make Designs” gen AI feature. In a Twitter thread, one user showed that the feature generated weather apps that looked too similar to existing apps, raising questions about how the underlying models were trained.
Figma’s generated weather apps
The CEO of Figma responded to the thread, commenting that customer data is not being used for training and that the problem was a QA issue. This led to the gen AI feature being paused pending further investigation. He continues in the thread that “we believe that craft is your competitive advantage and it’s more important than ever to create unique designs.” This is an important message given how much time and effort goes into creating, organizing and validating design patterns.
However, the messaging is not fully consistent with Figma’s data policy, which opts users on Starter and Professional plans into Figma gen AI model training by default. This training default continues to be an area of focus for SaaS companies that want to strengthen their moats with user-generated content, while potentially automating their users out of jobs.
Figma’s data policy
Full thread response.
Deep dive - Case study on Notion’s path to stronger AI infrastructure
Notion recently published a comprehensive blog on how they improved their data infrastructure and the importance of these changes to its AI strategy:
Moving several large, crucial Postgres datasets (some of them tens of TB large) to the data lake gave us a net savings of over a million dollars for 2022 and proportionally higher savings in 2023 and 2024.
For these datasets, the end-to-end ingestion time from Postgres to S3 and Snowflake decreased from more than a day to a few minutes for small tables and up to a couple of hours for large ones. Re-syncs, when necessary, can be completed within 24 hours without overloading live databases.
Most importantly, the changeover unlocked massive data storage, compute, and freshness savings from a variety of analytics and product asks, enabling the successful rollout of Notion AI features in 2023 and 2024.
The financial and operational overhead savings are tangible and reinforce the need to invest in and scale data projects to improve AI capabilities. The outline of specific problems, design decisions and tradeoffs provides a great framework for future data projects.
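As a mental model for what “ingestion from Postgres to S3” involves, here is a minimal Python sketch of timestamp-based incremental export. This is an illustrative assumption, not Notion’s implementation: pipelines at their scale typically rely on log-based change data capture rather than timestamp polling, and every name below (connection string, table, columns, bucket) is hypothetical:

```python
# Minimal sketch: incrementally pull rows changed since the last sync from
# Postgres and land them on S3 as Parquet, for downstream engines to query.

import pandas as pd
from sqlalchemy import create_engine, text

ENGINE = create_engine("postgresql://user:pass@db-host/appdb")  # hypothetical DSN

def ingest_increment(table: str, last_synced: str, bucket: str) -> str:
    """Export rows updated since `last_synced`; return the new watermark."""
    query = text(f"SELECT * FROM {table} WHERE updated_at > :ts")
    with ENGINE.connect() as conn:
        df = pd.read_sql(query, conn, params={"ts": last_synced})
    if df.empty:
        return last_synced  # nothing new; keep the old watermark
    # Partitioned paths keep downstream scans (Snowflake, Spark) cheap.
    path = f"s3://{bucket}/{table}/updated_at={last_synced}.parquet"
    df.to_parquet(path, index=False)  # requires the pyarrow and s3fs packages
    return str(df["updated_at"].max())  # watermark for the next run
```

The watermark approach is simple but misses deletes and can overload the live database on large tables, which is why the re-sync and freshness numbers in Notion’s post matter: log-based capture avoids both problems at the cost of more infrastructure.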
Full blog post
Question to ask your team:
How are AI/ML teams working with data teams on modern infrastructure requirements?
Until next week,
Denys - Enterprise AI @ Voiceflow