It's Monday, April 27th: Big Cloud put $65 billion into Anthropic in four days, and DeepSeek shipped V4 with frontier-class performance at $0.14 per million tokens.

Head over to our Events Portal to get the latest on upcoming AI Collective events near you. Search by city, date, or event format, and join thousands of builders at events across 180+ chapters on every continent (except Antarctica, for now).

Find an event in your city using the link below.👇

The top AI stories from last week, filtered for what you need to stay in the know.

1️⃣ HYPERSCALER LAND GRAB: Google Stakes Up to $40B on Anthropic

Google committed up to $40 billion to Anthropic on Thursday, $10 billion deployed immediately and $30 billion more tied to performance and commercial milestones. The round values Anthropic at $350 billion as of its February 2026 mark, with secondary interest reportedly chasing an $800 billion-plus valuation ahead of a possible October IPO.

Alongside the cash, Google Cloud will deliver 5 gigawatts of TPU capacity to Anthropic over the next five years, layered on top of an early-April Broadcom deal that brings 3.5 gigawatts of TPU capacity online starting in 2027.

The Google announcement landed four days after Amazon committed up to $25 billion more (with $5 billion immediate at a $380 billion valuation), in exchange for Anthropic pledging $100 billion-plus in AWS spend over a decade and roughly 1 gigawatt of Trainium2 and Trainium3 capacity by year-end. Combined with prior Amazon money and existing Google exposure, Anthropic now has roughly $65 billion of fresh hyperscaler capital authorized in a single week. Dario Amodei framed the spend simply: "Our users tell us Claude is increasingly essential to how they work, and we need to build the infrastructure to keep pace with rapidly growing demand."

Our Perspective

The structure is the story. Anthropic is now multi-cloud at a scale OpenAI never reached with Microsoft, with roughly 10 gigawatts of compute commitments split across two hyperscalers plus Broadcom and CoreWeave. Both Google and Amazon are betting on a frontier lab they don't control, while Microsoft and OpenAI keep converging on a shared roadmap. For builders, that means Claude has runway to stay competitive on capability without being captive to any one cloud's API economics or rate-limit politics.

It also confirms what every Q1 earnings call hinted at: the binding constraint on frontier AI is now gigawatts and grid power. Read every cloud announcement from here through year-end as a real-estate and energy story first.

2️⃣ MODEL WARS: DeepSeek's V4 Lands at Frontier Quality, Bargain Pricing

DeepSeek released preview versions of V4-Pro and V4-Flash on Friday, both open-weight and both shipping with a 1-million-token context window. V4-Pro carries 1.6 trillion total parameters with 49 billion active; V4-Flash carries 284 billion total with 13 billion active. V4-Pro is now the largest open-weight model available, more than twice the size of V3.2 (671B) and ahead of Moonshot's Kimi K2.6 (1.1T) and MiniMax M1 (456B).

Pricing undercuts every comparable Western model. V4-Pro runs $0.145 per million input tokens and $3.48 per million output tokens, beating Gemini 3.1 Pro, GPT-5.5, Claude Opus 4.7, and GPT-5.4 on cost. V4-Flash at $0.14/$0.28 sits below GPT-5.4 Nano, Gemini 3.1 Flash, GPT-5.4 Mini, and Claude Haiku 4.5.

On capability, V4-Pro matches GPT-5.4 on coding-competition benchmarks and outperforms GPT-5.2 and Gemini 3.0 Pro on certain reasoning tasks, but trails GPT-5.4 and Gemini 3.1 Pro on knowledge tests. DeepSeek's own framing puts V4 on "a developmental trajectory that trails state-of-the-art frontier models by approximately 3 to 6 months." The release came one day after the U.S. accused China of industrial-scale AI IP theft via proxy accounts; Anthropic and OpenAI have separately accused DeepSeek of distilling their models.

Our Perspective

The cost gap is what should change purchase orders. A team running a high-volume agent stack on Claude Opus 4.7 or GPT-5.5 can sub V4-Pro in for tasks where 3-to-6-month-stale frontier quality is acceptable, and cut inference cost by an order of magnitude. The Stanford AI Index 2026 already concluded China has "effectively closed" the performance gap, and V4 makes the closing concrete on every dimension that matters to builders: open weights, longer context, lower price, and benchmark scores within striking distance of the closed labs.

The geopolitics will keep complicating deployment. Several U.S. states, plus Australia, Taiwan, South Korea, Denmark, and Italy, have restrictions on DeepSeek for privacy or national-security reasons, and the IP-theft claims are unlikely to settle quickly. For most builders, the practical move is to test the open weights on private infrastructure, where most concerns evaporate, and let the policy fight resolve in the background.
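To make the order-of-magnitude claim concrete, here's a quick back-of-the-envelope comparison in Python. The V4 rates come from DeepSeek's published pricing above; the Claude Opus 4.7 figure is an illustrative placeholder we made up for the sketch, not a quoted price, and the workload volume is likewise hypothetical.

```python
# Back-of-the-envelope inference-cost comparison.
# V4 prices are DeepSeek's published per-million-token rates;
# the Opus figure is an ILLUSTRATIVE ASSUMPTION, not a quoted price.
PRICES = {  # model: (input $/M tokens, output $/M tokens)
    "deepseek-v4-pro":   (0.145, 3.48),
    "deepseek-v4-flash": (0.14,  0.28),
    "claude-opus-4.7":   (15.00, 75.00),  # hypothetical placeholder
}

def monthly_cost(model: str, input_m: float, output_m: float) -> float:
    """Dollar cost for a monthly volume given in millions of tokens."""
    in_price, out_price = PRICES[model]
    return input_m * in_price + output_m * out_price

# Hypothetical agent stack: 500M input + 100M output tokens per month.
opus = monthly_cost("claude-opus-4.7", 500, 100)
v4 = monthly_cost("deepseek-v4-pro", 500, 100)
print(f"Opus (assumed): ${opus:,.0f}/mo | V4-Pro: ${v4:,.0f}/mo | {opus / v4:.0f}x gap")
```

With those placeholder numbers the gap works out to roughly 35x; swap in your actual negotiated rates and token volumes to see where your own stack lands.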

🔗 Other News

Your pulse on the biggest events and announcements happening in AI this week, from Noah Frank ⚡️

📅 Events We’re Watching

Mark your calendars and sign up for these landmark events we’re watching, and look out for special AIC discounts where available.

April 27 – 29: AIM-2026 (San Francisco, California)

The Third International Conference on Artificial Intelligence and Machine Learning, with keynote speakers from Stanford, University of Maryland, and York University. More academic than trade show. Registration runs $299 to $1,099.

May 4 – 7: IBM Think 2026 (Boston, Massachusetts)

IBM's flagship technology conference, covering enterprise AI, cloud computing, and quantum. Heavy on real-world implementation and use cases across industries like healthcare, finance, and supply chain.

May 27 – 28: AI DevSummit 2026 (South San Francisco, California)

A two-day conference on shipping real-world AI, with tracks on management, machine learning, and enterprise integration. Speakers include Logan Ramalingam (Google Cloud), Kordel France (Toyota), and AIC’s very own Mary Grygleski! Registration starts at $1,080.

June 15 – 18: Databricks Data + AI Summit 2026 (San Francisco, California)

The leading event at the intersection of data engineering, machine learning, and AI, hosted by Databricks. In-person passes run $1,395 to $1,895, but virtual access is free. If you can only attend one event this summer, this is a strong pick.

🫵 Do You Belong in Our Newsletter?

Share your message with the world’s largest AI community. To inquire about partnership availability, reach out to our team below.

The AI Collective is a community of volunteers, made for volunteers. All proceeds directly fund future initiatives that benefit this community.

Before You Go…

Connect With Us on Socials

Get Involved in Your Community

Thank you to the thousands of volunteers around the world who make this work possible. We truly could not do this without you.

About the Authors

About Noah Frank

Noah is a researcher, innovation strategist, and ex-founder thinking and writing about the future of AI. His research explores the economics of emerging technology and organizational strategy.

About Joy Dong

Joy is a news editor, writer, and entrepreneur at the forefront of the emerging tech landscape. A former educator turned media strategist, she currently writes TEA, where she demystifies complex systems to make AI and blockchain accessible for all.
