AI agents are being sold as the solution for planning trips, answering business questions, and solving problems of all kinds, but getting them to work with tools and data outside their chat interfaces has been tricky. Developers have had to patch together various connectors and keep them running, an approach that's fragile, hard to scale, and a source of governance headaches.
Google claims it’s trying to solve that by launching its own fully managed, remote MCP servers, which are meant to make its Google and Google Cloud services, such as Maps and BigQuery, easier for agents to plug into.
The move follows the launch of Google’s latest Gemini 3 model, as the company looks to pair stronger reasoning with more dependable connections to real-world tools and data.
“We are making Google agent-ready by design,” Steren Giannini, product management director at Google Cloud, told TechCrunch.
Instead of spending a week or two setting up connectors, developers can now essentially paste in a URL to a managed endpoint, Giannini said.
At launch, Google is starting with MCP servers for Maps, BigQuery, Compute Engine, and Kubernetes Engine. In practice, this might look like an analytics assistant querying BigQuery directly, or an ops agent interacting with infrastructure services.
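For a sense of what “just paste in a URL” means in practice, here is a minimal sketch using the open-source MCP Python SDK: a client opens a session against a remote server’s URL and asks which tools it exposes. The endpoint address is a placeholder rather than a published Google URL, and authentication is omitted for brevity.

```python
# Minimal sketch with the open-source MCP Python SDK (pip install mcp):
# connect to a remote MCP server by URL and list the tools it advertises.
# The URL below is a placeholder, not a real Google endpoint, and a real
# managed server would also require credentials (e.g. an OAuth token).
import asyncio

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

SERVER_URL = "https://example.com/mcp"  # placeholder for a managed endpoint


async def main() -> None:
    # streamablehttp_client yields read/write streams plus a session-id getter
    async with streamablehttp_client(SERVER_URL) as (read_stream, write_stream, _):
        async with ClientSession(read_stream, write_stream) as session:
            await session.initialize()
            tools = await session.list_tools()
            for tool in tools.tools:
                print(tool.name, "-", tool.description)


if __name__ == "__main__":
    asyncio.run(main())
```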
In the case of Maps, Giannini said, developers without the MCP server would have to rely on the model’s built-in knowledge. “But by giving your agent […] a tool like the Google Maps MCP server, then it gets grounded on actual, up-to-date location information for places or trips planning,” he added.
While the MCP servers will eventually be offered across all of Google’s tools, they are initially launching in public preview, meaning they’re not yet fully covered by Google Cloud’s terms of service. They are, however, available at no extra cost to enterprise customers that already pay for Google services.
“We expect to bring them to general availability very soon in the new year,” Giannini said, adding that he expects more MCP servers to trickle in every week.
MCP, which stands for Model Context Protocol, was developed by Anthropic about a year ago as an open-source standard to connect AI systems with data and tools. The protocol has been widely adopted across the agent tooling world, and Anthropic earlier this week donated MCP to a new Linux Foundation fund dedicated to open-sourcing and standardizing AI agent infrastructure.
“The beauty of MCP is that, because it’s a standard, if Google provides a server, it can connect to any client,” Giannini said. “I’m looking forward to seeing how many more clients will emerge.”
One can think of MCP clients as the AI apps on the other end of the wire that talk to MCP servers and call the tools they offer. For Google, that includes Gemini CLI and AI Studio. Giannini said he’s also tried it with Anthropic’s Claude and OpenAI’s ChatGPT as clients, and “they just work.”
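To make that client-server relationship concrete, the sketch below extends the earlier one: after connecting, a client calls a tool by name and receives structured content back. The tool name and arguments are hypothetical, assuming a BigQuery-style server that exposes a SQL tool; real servers advertise their own tool names and input schemas.

```python
# Extending the earlier sketch: after initialize(), a client invokes a tool
# by name. "execute_sql" and its arguments are hypothetical; real servers
# describe their own tool names and input schemas via list_tools().
import asyncio

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

SERVER_URL = "https://example.com/mcp"  # placeholder endpoint, as before


async def run_query() -> None:
    async with streamablehttp_client(SERVER_URL) as (read_stream, write_stream, _):
        async with ClientSession(read_stream, write_stream) as session:
            await session.initialize()
            result = await session.call_tool(
                "execute_sql",  # hypothetical tool name on a BigQuery-style server
                arguments={"query": "SELECT 1"},
            )
            for block in result.content:  # content blocks returned by the tool
                print(block)


if __name__ == "__main__":
    asyncio.run(run_query())
```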
Google argues this isn’t just about connecting agents to its services. The bigger enterprise play is Apigee, its API management product, which many companies already use to issue API keys, set quotas, and monitor traffic.
Giannini said Apigee can essentially “translate” a standard API into an MCP server, turning endpoints like a product catalog API into tools an agent can discover and use, with existing security and governance controls layered on top.
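The sketch below illustrates the underlying idea with the open-source MCP Python SDK: a single REST endpoint wrapped as an MCP tool that an agent can discover and call. The catalog URL and response shape are hypothetical, and Apigee would perform this translation as a managed service rather than through hand-written code like this.

```python
# Conceptual sketch of exposing an existing REST API as an MCP tool.
# The catalog endpoint and its response shape are hypothetical.
import requests
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("product-catalog")

CATALOG_API = "https://api.example.com/v1/products"  # hypothetical endpoint


@mcp.tool()
def search_products(query: str, limit: int = 10) -> list[dict]:
    """Search the product catalog and return matching items."""
    resp = requests.get(CATALOG_API, params={"q": query, "limit": limit}, timeout=10)
    resp.raise_for_status()
    return resp.json()["items"]  # assumes the API returns {"items": [...]}


if __name__ == "__main__":
    mcp.run()  # serves the tool over stdio by default
```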
In other words, the same API guardrails companies use for human-built apps could now apply to AI agents, too.
Google’s new MCP servers are protected by Google Cloud IAM (Identity and Access Management), a permission system that controls exactly what an agent is allowed to do with each server. They are also covered by Google Cloud Model Armor, which Giannini describes as a firewall dedicated to agentic workloads, defending against threats like prompt injection and data exfiltration. Administrators can also rely on audit logging for additional observability.
Google plans to expand MCP support beyond the initial set of servers. In the next few months, the company will roll out support for services across areas like storage, databases, logging and monitoring, and security.
“We built the plumbing so that developers don’t have to,” Giannini said.