AI Infrastructure Security

Cloudflare and GoDaddy Team Up to Give Websites Real Control Over AI Bot Swarms

Cloudflare and GoDaddy announced a new partnership that gives millions of website owners practical tools to control AI bot swarms. GoDaddy users can now easily allow, block, or set rules for crawlers from major AI companies.

Updated on April 08, 2026

Cloudflare and GoDaddy have joined forces to address one of the fastest-growing headaches on the internet today: aggressive AI bot swarms that crawl websites at massive scale. The partnership brings Cloudflare’s AI Crawl Control tool directly into GoDaddy’s hosting platform, making it simple for millions of small businesses, publishers, bloggers, and creators to decide exactly how AI companies like OpenAI, Google, and Anthropic can access their content.

For years, websites have operated under a basic understanding with search engines — crawlers index content and send traffic back in return. That balance is breaking down fast. AI companies are now sending thousands of bots to scrape pages, pull training data, and power chatbots that deliver answers directly to users without ever sending them to the original site. The result is real pressure on servers, higher hosting costs, and declining traffic and revenue for many creators.

Through this integration, GoDaddy customers gain easy access to Cloudflare’s tools that let them allow specific bots, block others, or even explore ways to charge for access. The partnership also supports newer standards like Agent Name Service and Web Bot Auth, which help verify who is crawling and improve transparency. With Cloudflare handling traffic for roughly 20% of the entire web and GoDaddy being the world’s largest domain registrar and hosting provider, this move brings meaningful control to a much wider group of everyday website owners who previously had few good options.

Key Terms

AI Crawl Control

Cloudflare’s tool that lets website owners decide which AI bots can access their site, block unwanted ones, or set specific rules for crawling.

AI Bot Swarms

Large groups of automated bots deployed by AI companies to scrape content at massive scale for training models and generating answers.

Agent Name Service & Web Bot Auth

Emerging technical standards that help verify the identity of AI agents and make their behavior more transparent to site owners.

Permission-Based Crawling

A model where websites can explicitly allow, restrict, or negotiate access when AI systems want to use their content.

Conditions Driving This Change

The explosion of AI bot swarms is being driven by several powerful forces that are reshaping the economics and operations of the open web.

  • AI companies need massive amounts of fresh, high-quality data to keep improving their models, so they are deploying thousands of crawlers across the internet at unprecedented speed and volume.

  • Traditional search engines used to send traffic back to websites in exchange for indexing content, but AI answer engines now summarize or deliver information directly to users, cutting off that traffic and revenue stream for creators.

  • Many small businesses, publishers, and individual creators lack the technical tools or expertise to manage or even measure the scale of AI crawling hitting their sites.

  • Server costs and bandwidth usage are rising noticeably for sites that receive heavy bot traffic, creating real infrastructure stress that was not part of the original web model.

  • Without clear permission systems or enforcement mechanisms, the long-term incentive to produce high-quality original content risks weakening as creators see their work extracted without compensation or benefit.

  • Major infrastructure providers like Cloudflare and GoDaddy are seeing this problem play out across millions of sites on their networks and are now stepping in with solutions that can scale broadly.

These combined pressures have created a clear tipping point. What began as a convenient way for AI companies to gather data has turned into a structural challenge for the sustainability of the open web. The Cloudflare-GoDaddy partnership is a direct and practical response to this new reality.

What Security and Governance Looked Like Before

Before this partnership, most website owners had very limited and often ineffective ways to deal with AI bots. The primary tool available to many was the basic robots.txt file, which told crawlers which parts of a site they could access. In practice, many AI bots simply ignored these instructions or found easy ways around them. Larger organizations could set up custom firewall rules or use more advanced Cloudflare features, but the average small business owner, blogger, or content creator usually had almost no practical control.
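To make the limitation concrete, here is what that robots.txt approach typically looks like. The user-agent strings below are the publicly documented names of well-known AI crawlers (OpenAI's GPTBot, Anthropic's ClaudeBot, Common Crawl's CCBot, and Google's AI-training opt-out token); the directives themselves are purely advisory, which is exactly why this mechanism proved so weak:

```txt
# Ask common AI training crawlers to stay out (honored only by compliant bots)
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: CCBot
Disallow: /

# Opt out of Google's AI training without affecting regular search indexing
User-agent: Google-Extended
Disallow: /

# All other crawlers may access the site normally
User-agent: *
Allow: /
```

Nothing in the protocol verifies who is actually crawling or enforces these rules, so a non-compliant bot can simply ignore the file or present a different user-agent string.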

Governance in this space was mostly passive and reactive. Site owners could try to block all bots or allow everything, but achieving any meaningful middle ground required technical skills that most people did not have. There was very little visibility into which specific AI companies were visiting a site, how much content they were taking, or how frequently they returned. This lack of transparency left many creators feeling frustrated as their traffic numbers dropped and hosting bills sometimes increased without clear explanation.

Accountability was almost nonexistent. Creators often felt their work was being used without permission or any return benefit, and there were few standardized ways to negotiate or enforce rules. The overall situation left the majority of the web exposed to machine-scale consumption while the people actually creating the content struggled to protect their business models or even fully understand what was happening to their sites.

What’s Changing Now

The Cloudflare and GoDaddy partnership is changing the situation in a meaningful way by bringing powerful and easy-to-use AI bot management tools to millions of regular website owners. GoDaddy is integrating Cloudflare’s AI Crawl Control directly into its hosting platform, so users can now simply decide which AI bots are allowed on their site, which ones should be blocked, and under what specific conditions crawlers can operate.

The integration also supports newer technical standards such as Agent Name Service and Web Bot Auth. These standards help websites verify the real identity of AI agents and gain much better visibility into how those agents behave. Site owners now have practical, user-friendly controls instead of having to rely on outdated or easily ignored methods. This is moving the web toward a more permission-based model where creators can protect their work when they choose to, or potentially explore ways to benefit when AI systems want to use their content.
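The allow/block/conditional model described above can be sketched in a few lines of ordinary code. This is an illustrative sketch only, not Cloudflare's or GoDaddy's actual API; the rule fields and policy values are invented for the example, while the crawler names are publicly documented bot identifiers:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical per-crawler policy: allow outright, block outright,
# or allow subject to a condition (here, a request-rate limit).
@dataclass
class CrawlRule:
    action: str                                # "allow" or "block"
    max_requests_per_min: Optional[int] = None # optional rate condition

# Example policy table keyed by declared crawler user-agent.
# The bot names are real published identifiers; the rules are made up.
POLICY = {
    "GPTBot":    CrawlRule("block"),
    "ClaudeBot": CrawlRule("allow", max_requests_per_min=30),
    "CCBot":     CrawlRule("block"),
}
DEFAULT = CrawlRule("allow")

def decide(user_agent: str, observed_rpm: int) -> str:
    """Return 'allow' or 'block' for one request from a declared crawler."""
    rule = POLICY.get(user_agent, DEFAULT)
    if rule.action == "block":
        return "block"
    if rule.max_requests_per_min is not None and observed_rpm > rule.max_requests_per_min:
        return "block"  # allowed bot, but it exceeded its rate condition
    return "allow"

print(decide("GPTBot", 1))       # block: explicitly disallowed
print(decide("ClaudeBot", 10))   # allow: within its rate condition
print(decide("ClaudeBot", 120))  # block: exceeds 30 requests/min
print(decide("UnknownBot", 5))   # allow: falls through to the default
```

Note that this logic trusts the declared user-agent string, which any bot can spoof. That gap is precisely what verification standards like Web Bot Auth aim to close: only once a crawler's identity can be verified do per-crawler rules like these become enforceable.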

By combining Cloudflare’s deep infrastructure expertise with GoDaddy’s massive reach to small businesses and individual creators, the partnership is making sophisticated bot management accessible to a much broader audience. What used to be a problem that only large tech companies could handle is now becoming standard operational infrastructure for everyday websites. This development helps restore some balance between rapid AI innovation and the economic incentives that keep the open web healthy and sustainable for the people who actually create the content.

Our Take

AI Security Take

The Cloudflare-GoDaddy partnership highlights how managing AI consumption at scale is quickly becoming core infrastructure for the web. As AI bot swarms put real pressure on websites, the ability to see what is happening, enforce rules, and maintain control is moving from a nice-to-have feature to an essential capability for site owners.

For governance and security teams inside organizations, this development carries a clear message. As companies deploy their own AI agents that crawl external websites or consume large amounts of web data, they will soon face similar questions around permission, transparency, and fair use. The tools and standards emerging now will influence how organizations think about governing their own outbound AI activity as well.

This partnership shows infrastructure providers stepping up to create practical solutions that can scale across millions of sites. GAIG tracks platforms and tools that help organizations establish visibility, enforce policies, and maintain accountability when AI systems interact with the wider internet. As AI usage continues to expand, strong governance layers that cover both internal systems and external data interfaces will become increasingly important for responsible and sustainable operations.

Related Articles

ServiceNow Launches Autonomous Workforce and Integrates Moveworks Into Its AI Platform (AI Governance Platforms, Feb 27, 2026)

Arize vs Fiddler vs Arthur: Which AI Monitoring Platform Actually Fits Your Enterprise? (Model Observability, Mar 1, 2026)

AI Governance Platforms vs Monitoring vs Security vs Compliance (AI Policy & Standards, Mar 1, 2026)
