Where Should ads.txt Be Hosted? Root Domain Rules Explained
ads.txt must live at your root domain. Not a subdomain, not a subfolder, not behind a redirect chain. Here is exactly where to place it and why DSPs care.

Key Takeaways
- ads.txt must be hosted at the root of your registered domain. The file must be accessible at `https://yourdomain.com/ads.txt`. Not a subdomain. Not a subfolder. The root path on the root domain.
- Subdomains don't inherit the root domain's ads.txt. If `blog.yourdomain.com` runs its own ad inventory, it needs its own ads.txt file. DSP crawlers check each domain independently.
- Redirects are allowed but must be minimal. One redirect from HTTP to HTTPS is fine. A chain of 5+ redirects will cause most DSP crawlers to give up and treat your inventory as unverified.
- CDN hosting works if configured correctly. The file must return HTTP 200 with `Content-Type: text/plain`. Aggressive caching can delay DSP visibility of updates by hours or days.
- HTTPS is effectively mandatory. While the original spec allowed HTTP, every major DSP now expects HTTPS. An SSL certificate error on your ads.txt path means crawlers can't fetch the file.
---
Where Should ads.txt Be Hosted?
The ads.txt spec is precise about one thing above all else: location.
The file has to be at the root of your domain. Not "somewhere on your website." Not "in a folder that makes organizational sense."
The root.
And the definition of "root" is narrower than most publishers realize.
Get the hosting wrong, and your ads.txt might as well not exist. DSP crawlers follow the IAB Tech Lab specification exactly. They request https://yourdomain.com/ads.txt. If they don't get a 200 response with plain text content at that exact URL, they move on.
Your inventory goes unverified.
Revenue disappears.
This guide covers the root domain requirement, subdomain rules, redirect handling, CDN configuration, and the edge cases that trip up publishers running complex site architectures.
The Root Domain Requirement
The IAB Tech Lab spec states that ads.txt must be served from the root path (/ads.txt) of the publisher's registered domain.
This means:
Correct:

```text
https://example.com/ads.txt
https://www.example.com/ads.txt   (if www is your canonical domain)
```

Incorrect:

```text
https://example.com/advertising/ads.txt
https://example.com/files/ads.txt
https://cdn.example.com/ads.txt
https://ads.example.com/ads.txt
```
The "root domain" is the domain registered with a registrar. For example.com, the root is example.com. For country-code TLDs like example.co.uk, the root is example.co.uk.
When a DSP receives a bid request, it extracts the publisher's domain from the request and constructs the ads.txt URL by appending /ads.txt to the root.
There's no discovery mechanism. No DNS record pointing to an alternative location. No header that says "my ads.txt is actually over here."
The crawler goes to the root path, and that's the only place it looks.
No exceptions. No workarounds.
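Because there is no discovery step, the crawler's URL construction is trivial to model. A minimal Python sketch (the function name is ours, and the domain is a placeholder, not anything from the spec):

```python
def ads_txt_url(publisher_domain: str) -> str:
    """Build the only URL a DSP crawler will check for a given domain."""
    # The crawler appends /ads.txt to the root of the publisher domain
    # taken from the bid request -- no DNS record, header, or sitemap
    # is consulted.
    return f"https://{publisher_domain}/ads.txt"

print(ads_txt_url("example.com"))     # https://example.com/ads.txt
print(ads_txt_url("example.co.uk"))   # https://example.co.uk/ads.txt
```

Note there is no path component to configure: if the file isn't at exactly this URL, it won't be found.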
How Subdomains Work (And Don't Work)
This is where the spec catches many publishers off guard.
Subdomains are treated as separate entities. A DSP verifying inventory from news.example.com will look for ads.txt at news.example.com/ads.txt, not example.com/ads.txt.
If the subdomain doesn't have its own ads.txt file, the DSP has two options:
- Fall back to the root domain. The IAB spec allows crawlers to check the root domain's ads.txt if the subdomain doesn't have one. Most major DSPs implement this fallback.
- Treat the subdomain as unverified. Some DSPs don't implement the fallback. They check the subdomain, get a 404, and move on.
The safest approach? If a subdomain runs its own ad inventory, give it its own ads.txt file. Don't rely on the root domain fallback. You can't control which DSPs implement it and which don't.
When subdomains need their own ads.txt:
- The subdomain runs independently with its own SSP relationships
- The subdomain is a separate publishing brand under the same parent domain
- The subdomain uses different monetization partners than the root domain
- You want to ensure 100% compatibility with all DSP crawlers, including those that don't implement fallback
When you don't need a separate subdomain ads.txt:
- The subdomain shares the same ad stack and SSP accounts as the root domain
- The subdomain doesn't serve ads at all
- All programmatic traffic routes through the root domain's ad server
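To make the strict (no-fallback) crawler case concrete, here's a small Python sketch listing every URL such a crawler would need to resolve; each must return HTTP 200. The subdomain names are hypothetical examples:

```python
def required_ads_txt_urls(root: str, ad_serving_subdomains: list[str]) -> list[str]:
    """URLs that must each serve an ads.txt file for a crawler
    that does NOT fall back to the root domain."""
    urls = [f"https://{root}/ads.txt"]
    urls += [f"https://{sub}.{root}/ads.txt" for sub in ad_serving_subdomains]
    return urls

for url in required_ads_txt_urls("example.com", ["news", "blog"]):
    print(url)
```

A quick way to audit the output list is to fetch each URL and confirm a 200 status, for example with `curl -s -o /dev/null -w '%{http_code}' <url>`.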
The www vs Non-www Question
If your canonical domain is www.example.com, your ads.txt should be accessible at both:

```text
https://www.example.com/ads.txt
https://example.com/ads.txt
```
Most publishers handle this by redirecting one to the other. If example.com redirects to www.example.com, then a DSP requesting example.com/ads.txt will follow the redirect and find the file at www.example.com/ads.txt.
This works as long as the redirect chain stays short.
The reverse also applies. If www.example.com redirects to example.com, host the file at example.com/ads.txt.
The key: make sure the file is accessible regardless of which form the DSP uses. Not all crawlers normalize the domain before requesting. And you can't predict which version they'll try first.
Redirect Rules: What DSP Crawlers Accept
Redirects between HTTP and HTTPS or between www and non-www variants are standard. DSP crawlers handle these without issue.
The problems start when redirect chains get long or circular.
What works:
- `http://example.com/ads.txt` to `https://example.com/ads.txt` (HTTP to HTTPS upgrade, one redirect)
- `https://example.com/ads.txt` to `https://www.example.com/ads.txt` (www normalization, one redirect)
What breaks:
- A chain of 5+ redirects before reaching the actual file. Most DSP crawlers give up after 3-5 redirects.
- A redirect loop where `example.com/ads.txt` redirects to `www.example.com/ads.txt`, which redirects back to `example.com/ads.txt`. Infinite loop. Crawler gives up.
- A redirect to a different domain entirely (e.g., redirecting to a third-party ads.txt management service at a different root domain). Some DSPs accept cross-domain redirects, others don't.
Third-party ads.txt management services that host your file on their domain and use redirects from your domain to theirs work with some DSPs but not all. If you use one of these services, test that major DSP crawlers (Google, The Trade Desk, DV360) can successfully resolve the chain and fetch the file.
Don't just assume it works.
CDN Configuration for ads.txt
If your site sits behind a CDN like Cloudflare, AWS CloudFront, or Fastly, your ads.txt file gets served through the CDN.
This works fine with the right configuration. But "the right configuration" isn't always the default.
Content type. The CDN must serve ads.txt with Content-Type: text/plain. Some CDN configurations default to text/html or application/octet-stream for unrecognized file types. Override this for the /ads.txt path specifically.
Caching. DSPs re-crawl ads.txt files every 24-72 hours. If your CDN caches the file aggressively (24+ hour TTL), updates you make won't be visible to crawlers until the cache expires. Set a short cache TTL for ads.txt, something like 1-4 hours.
Origin pull. Make sure the CDN is configured to fetch ads.txt from your origin server. If the file only exists at the CDN edge and not on your origin, deployments or origin changes could accidentally remove it.
Cache invalidation. When you update ads.txt, manually purge the CDN cache for that path. Don't wait for the TTL to expire if you've made a critical fix (like adding a missing SSP).
Every hour of delay is bids you're not getting.
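The worst-case delay before a DSP sees an edit is roughly the CDN's cache TTL plus the crawler's re-crawl interval. A quick Python sketch using the intervals cited above:

```python
def worst_case_visibility_hours(cdn_ttl_hours: int, recrawl_hours: int) -> int:
    """Upper bound on how stale your ads.txt can look to a DSP: the CDN
    can serve a stale copy for a full TTL, and the DSP only re-fetches
    once per re-crawl cycle."""
    return cdn_ttl_hours + recrawl_hours

print(worst_case_visibility_hours(24, 72))  # 96 -- a 24h TTL against a 72h crawl cycle
print(worst_case_visibility_hours(2, 72))   # 74 -- a short TTL shaves nearly a day off
```

Purging the cache manually collapses the first term to zero, which is why the invalidation step above matters for urgent fixes.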
SSL/HTTPS Requirements
The original IAB ads.txt spec didn't mandate HTTPS.
In practice, HTTPS is now required. Every major DSP crawler expects a valid SSL certificate on the domain serving ads.txt. If your certificate is expired, self-signed, or misconfigured, crawlers will fail the TLS handshake and treat the file as inaccessible.
Check these SSL scenarios:
- Mixed domains. If `example.com` has a valid certificate but `www.example.com` doesn't (or vice versa), crawlers that hit the wrong variant will fail.
- Certificate expiration. A certificate that expired yesterday means DSPs can't fetch your ads.txt today. Set up certificate renewal monitoring. This shouldn't be this complicated in 2026, but expired certs still happen all the time.
- CDN SSL. If your CDN provides the SSL certificate (like Cloudflare's Universal SSL), make sure it covers the root domain, not just `*.example.com`.
Platform-Specific Hosting
WordPress
The simplest option is the Ads.txt Manager plugin. Install it, paste your entries, and it creates a virtual ads.txt route. No file upload needed.
The plugin intercepts requests to /ads.txt and serves the content from the database.
Or upload a physical file via FTP to your WordPress root directory (the folder containing wp-config.php).
Shopify
Shopify provides a built-in ads.txt editor at Settings > Online Store > Preferences. Paste your entries and save. Shopify handles the hosting and serving automatically.
Ghost
Ghost doesn't have a native ads.txt feature. Upload the file via your hosting provider's file manager or SFTP to the root web directory.
Static Sites (Next.js, Gatsby, Hugo)
Place ads.txt in your /public or /static directory. Build tools will copy it to the root of the deployed output.
Verify after deployment that the file is accessible at your production URL. Deploy configs sometimes have surprises.
Headless CMS / JAMstack
If your frontend is deployed to Vercel, Netlify, or similar platforms, add ads.txt to the public directory of your frontend repository. These platforms serve files from public/ at the root path.
Verification After Hosting
After placing the file, verify:
- Direct browser access. Visit
https://yourdomain.com/ads.txt. You should see raw text, not an HTML page. - HTTP status code. The response must be 200, not 301 (redirect), 403 (forbidden), or 404 (not found).
- Content type header. Must be
text/plain. Check with:curl -I https://yourdomain.com/ads.txt - No authentication gates. Some sites have staging protections, password walls, or login requirements that block public access. Your ads.txt must be accessible without any authentication.
- robots.txt isn't blocking. Verify that your robots.txt doesn't contain rules that prevent DSP crawlers from accessing
/ads.txt. - External access. Test from outside your network. Internal DNS, VPNs, and development environments can show a different result than what DSPs see.
This trips up more publishers than you'd expect.
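The first few checks reduce to a handful of response properties. A small Python sketch of the pass/fail logic (the function name and structure are ours, not from the spec):

```python
def hosting_problems(status: int, content_type: str, requires_auth: bool) -> list[str]:
    """Return the hosting problems a DSP crawler would hit, if any.

    status: final HTTP status after redirects resolve
    content_type: value of the Content-Type response header
    requires_auth: whether the URL sits behind any login or password wall
    """
    problems = []
    if status != 200:
        problems.append(f"expected HTTP 200, got {status}")
    if not content_type.lower().startswith("text/plain"):
        problems.append(f"expected text/plain, got {content_type}")
    if requires_auth:
        problems.append("file is behind an authentication gate")
    return problems

print(hosting_problems(200, "text/plain; charset=utf-8", False))  # []
print(hosting_problems(404, "text/html", False))
```

Feed it the values you observe from `curl -I` on your production URL; an empty list means the basics are in place.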
Frequently Asked Questions
Can I host ads.txt on a CDN subdomain instead of my root domain?
No. Even if your content is served from cdn.example.com, the ads.txt file must be at example.com/ads.txt. DSPs construct the URL from the publisher domain in the bid request, which is your root domain, not your CDN subdomain.
What if my domain redirects to another domain?
If olddomain.com permanently redirects to newdomain.com, host ads.txt on newdomain.com. DSP crawlers will follow the redirect, but the primary domain in bid requests should be newdomain.com. If you're running ads on both domains during a migration, both domains need their own ads.txt.
Do I need ads.txt on both HTTP and HTTPS?
No, but your HTTP URL should redirect to HTTPS. Most DSPs request the HTTPS version first. As long as http://example.com/ads.txt redirects cleanly to https://example.com/ads.txt, you only need the file served once.
What if my hosting provider doesn't allow files at the root directory?
Some managed hosting platforms restrict root directory access. In that case, use a platform-specific feature (like Shopify's built-in ads.txt editor) or a plugin (like WordPress Ads.txt Manager) that programmatically serves the file at the /ads.txt route without a physical file.
If neither option exists, contact your hosting provider. They should be able to configure a route.
How do I check if DSPs can actually crawl my ads.txt?
Run a free scan on BeamFlow to verify file accessibility, syntax, and sellers.json cross-verification. You can also check from the command line with curl -v https://yourdomain.com/ads.txt and look for a 200 status code with text/plain content type.
Related Articles

ads.txt Checklist Before Going Live With a New SSP
Adding a new SSP? Use this checklist to get ads.txt right from day one. Most verification issues happen during onboarding because publishers skip steps they do not know about.

How to Track ads.txt Issues Across Multiple Domains
Managing ads.txt for one domain is straightforward. Managing it across five, ten, or fifty domains is where things break. Here is how to keep every domain verified.

How to Monitor ads.txt Changes Over Time
SSPs change sellers.json without telling you. Your own team edits ads.txt without documenting it. Monitoring changes over time catches issues before they cost revenue.
Ready to optimize your ads.txt?
Check your domain's supply chain health instantly, free.
Check Your Domain Free