
Google Cloud Next and the AI-Driven Data Revolution

Google Cloud Next 2025: AI breakthroughs, next-gen data infrastructure, and cutting-edge compute hardware—built for scale.

April 14, 2025

Google Cloud Next 2025

Google Cloud Next is here! And once again, it’s showcasing innovation that’s impossible to ignore. This year, the spotlight remains firmly on AI, with groundbreaking releases and enhancements that continue to push the boundaries of what’s possible. From Gemini 2.5 (the newest model with enhanced reasoning) to Agentspace (which brings AI to every employee), the momentum is undeniable. We’re just as excited as you are to see where these technologies take us next.

Behind every cutting-edge AI model and intelligent application lies a critical enabler: data infrastructure. As AI capabilities expand, the demand for high-performance, scalable, resilient, and consistent data systems grows even more urgent. Whether it’s training massive LLMs, powering real-time inference, or managing ever-growing datasets, the backbone of data infrastructure must keep pace—and Google Cloud is rising to the challenge.

In this post, we’ll highlight some of the most compelling announcements from Google Cloud Next, focusing on AI and the data infrastructure that makes it all possible. While we can’t cover everything (there’s just too much good stuff!), we’ve picked the updates we find most fascinating—and we think you will, too.


Firestore MongoDB Compatibility & Firebase Studio

At Google Cloud Next, Firestore took a major leap forward by introducing native MongoDB compatibility—a highly anticipated feature for the community. This update lets developers leverage the familiar MongoDB API, drivers, and tools while benefiting from Firestore’s serverless scalability, strong consistency, and industry-leading 99.999% availability. To break it down a bit more, Firestore with MongoDB compatibility offers:

  • Seamless Migration: Use existing MongoDB code and integrations with no application changes.
  • Serverless Advantages: Multi-region replication, single-digit millisecond reads, and pay-as-you-go pricing.
  • Flexibility & Power: Retain MongoDB’s JSON storing and querying agility while tapping into Firestore’s auto-scaling and Google Cloud’s built-in governance and integration.
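To make the "no application changes" point concrete, here is a minimal Python sketch using the standard PyMongo driver. Note that the connection-string host format below is an illustrative assumption, not the documented one; check the Firestore documentation for your database's actual endpoint.

```python
# Minimal sketch: pointing an existing PyMongo-based app at Firestore's
# MongoDB-compatible endpoint. The URI host format below is an illustrative
# assumption, not the official documented format.

def firestore_mongo_uri(database_uid: str, location: str) -> str:
    """Build a placeholder connection URI (hypothetical host format)."""
    return (
        f"mongodb://{database_uid}.{location}.firestore.goog:443/"
        "?tls=true&retryWrites=false"
    )

uri = firestore_mongo_uri("my-database-uid", "us-central1")

# With a real endpoint, unmodified MongoDB application code works as-is:
#   from pymongo import MongoClient
#   client = MongoClient(uri)
#   client.shop.orders.insert_one({"sku": "A-100", "qty": 2})
#   print(client.shop.orders.find_one({"sku": "A-100"}))
```

The point of the compatibility layer is that only the connection string changes; the driver, queries, and tooling stay the same.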

Alongside Firestore’s update, Firebase Studio, a cloud-based environment for prototyping and building full-stack AI apps, was also introduced in preview. Key features include:

  • Unified Workspace: A cloud-based integrated development environment (IDE) combined with Gemini-powered AI agents for end-to-end development.
  • Rapid Prototyping: Generate apps with multimodal prompts, including language, images, and drawings.
  • Always-on AI Assistance: Enhance your Firebase development with Gemini’s AI coding support, covering interactive chat, tool operation, and real-time inline suggestions.

This duo empowers developers to build faster with familiar tools while scaling effortlessly within the Firebase and MongoDB ecosystems.

Google Cloud Next 2025: Announcing Firestore with MongoDB Compatibility

AlloyDB AI: Supercharging PostgreSQL for the AI Era

Google unveiled groundbreaking AI integrations for AlloyDB, Google's high-performance PostgreSQL-compatible database, to transform how developers and enterprises interact with data. Already offering 4x faster transactional performance and 2x better price-performance than standard PostgreSQL, AlloyDB has now evolved into an AI-ready engine that understands both structured and unstructured data.

With the new release, AlloyDB delivers filtered vector searches up to 10x faster than standard PostgreSQL, and index creation is up to 10x faster than with PostgreSQL's HNSW index. Adaptive filtering optimizes hybrid queries involving joins, filters, and vector indexes in real time. Additionally, auto-maintained vector indexes eliminate manual rebuilds, keeping indexes in sync with source data. AlloyDB's enhanced semantic search combines vector retrieval with Vertex AI's ranking API for more precise results.

For natural language interactions, AlloyDB introduces intelligent disambiguation to clarify vague queries and secure parameterized views to prevent injection attacks. The new AI query engine embeds natural language processing directly into SQL through AI.IF() and AI.RANK() operators and expands multimodal support to generate embeddings for text, images, and videos. With these upgrades, data isn’t just stored, but is actively understood. This transforms the database into an intelligent platform for AI applications or, frankly, any future user-experience-driven applications.
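As a rough illustration of the AI query engine, here is a sketch that only builds a SQL string using the AI.IF() operator named in the announcement. The table, columns, and exact prompt-argument syntax are assumptions, and running the query would require a live AlloyDB instance, so it is constructed but not executed.

```python
# Sketch: embedding a natural-language predicate in SQL via AlloyDB's AI.IF()
# operator. Table/column names and the prompt-argument syntax are hypothetical;
# the query is only built here, since execution needs a live AlloyDB instance.

def build_review_filter_query(table: str = "product_reviews") -> str:
    return (
        f"SELECT id, review_text FROM {table} "
        "WHERE AI.IF(prompt => 'this review complains about shipping delays: ' "
        "|| review_text)"
    )

query = build_review_filter_query()
print(query)

# With a live instance, this would run through any PostgreSQL driver, e.g.:
#   import psycopg
#   with psycopg.connect(dsn) as conn:
#       rows = conn.execute(query).fetchall()
```

The appeal of this style is that a fuzzy, semantic condition lives directly in the WHERE clause instead of in a separate inference pipeline.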

BigQuery as an Autonomous Data-to-AI Platform

BigQuery now delivers an end-to-end autonomous data platform with deep AI integration. Gemini powers the experience with natural language data prep, SQL/Python code generation (60%+ acceptance rate), and automated metadata creation. The platform breaks new ground with multimodal support through ObjectRef tables, enabling unified querying of structured data and unstructured content like images/text.

For enterprise readiness, BigQuery now offers managed disaster recovery with near-real-time replication and automatic failover. It also introduces intelligent query optimizations: a low-latency mode for short queries, history-based performance tuning, and the Column Metadata Index (CMETA), which maintains consistent speed from 10GB to 100PB datasets.

This positions BigQuery as more than a warehouse—it’s now an AI-activatable data hub that makes autonomous information exploration easy at scale. Ready to query your massive amounts of data with SQL, Python, or natural language without any barriers? BigQuery could be the right choice for you.

Google Cloud Next 2025: BigQuery as an Autonomous Data-to-AI Platform

Axion: Hardware Acceleration for Cloud Databases

Innovation comes from both software and, of course, the hardware that powers it. Earlier, we covered AlloyDB’s AI-powered software advancements. Google Cloud Next also unveiled major hardware breakthroughs, starting with Axion processors, Google’s custom ARM-based CPUs designed to redefine the price-performance ratio for data workloads.

Since its launch, Axion has delivered up to 65% better efficiency than current-generation x86 instances and outpaces rival Arm chips by 10%, with adoption by 40% of Compute Engine’s top customers. Now, this performance extends to managed databases:

  • AlloyDB (our earlier highlight) achieves 50% better price-performance on Axion C4A VMs vs. Google’s N-series VMs.
  • Cloud SQL is also available as part of the preview release, offering similar gains for transactional workloads.

With integrated Titanium SSDs delivering up to 6TB storage and 2.4M random read IOPS, Axion C4A instances provide the hardware muscle needed for demanding database workloads. These advancements prove that next-gen infrastructure remains just as crucial as software innovation when scaling AI-powered applications.

AI Hypercomputer & Ironwood TPUs: The Engine Behind AI Innovation

Like data stores, modern AI demands infrastructure built for scale and efficiency. Google’s AI Hypercomputer delivers exactly that—combining over a decade of AI expertise into an integrated system of optimized hardware, open software, and flexible consumption. One of the many stars of this year’s hardware lineup is Ironwood, Google’s 7th-gen TPU, which boasts:

  • 5x more compute and 6x more memory than Trillium
  • 42.5 exaFLOPS per pod at 2x better power efficiency
  • Seamless integration with PyTorch/JAX stacks

AI has undoubtedly taken center stage at this year’s Google Cloud Next, with infrastructure innovations like these paving the way for next-generation applications. These advancements demonstrate how every layer of the stack—from silicon to software—now revolves around accelerating AI. For a deeper dive into all the announcements, explore the official blog series from Google Cloud.


Google Cloud Next 2025 proved one thing: AI’s potential is limitless. While the world focuses on models and agents, we at Dragonfly obsess over what powers them—modern data infrastructure that delivers consistent, high-performance data at scale. Through innovations like multi-threaded architecture with minimal locking and cloud-native resilience, Dragonfly provides what every application demands most—uncompromising speed.
