Laravel Robots Middleware
https://github.com/spatie/laravel-robots-middleware

A tiny, opinionated Laravel middleware package to enable or disable search engine indexing of your ...

Tokens: 2,803 · Snippets: 22 · Trust Score: 8.5 · Updated: 2 weeks ago
# Laravel Robots Middleware

Laravel Robots Middleware is a lightweight, opinionated package that allows you to control search engine indexing of your Laravel application through HTTP headers. Instead of relying solely on robots.txt files, this middleware adds an `x-robots-tag` header to every response, giving you programmatic control over which pages should be indexed based on request context, user authentication, environment, or any custom logic you define.

The package works by extending a base `RobotsMiddleware` class and overriding the `shouldIndex()` method to implement your indexing rules. The middleware automatically sets the `x-robots-tag` header to `all` (allow indexing), `none` (disallow indexing), or any custom robots directive string, based on the return value of your implementation. This approach is particularly useful for protecting admin panels, staging environments, user-specific content, or any routes that should not appear in search engine results.

## Installation

Install the package via Composer to add robots header control to your Laravel application.

```bash
composer require spatie/laravel-robots-middleware
```

## RobotsMiddleware Base Class

The `RobotsMiddleware` class is the core component that processes requests and adds the `x-robots-tag` header to responses. Its `shouldIndex()` method returns `true` by default, allowing all pages to be indexed. Extend this class to implement custom indexing logic.

```php
<?php

namespace App\Http\Middleware;

use Illuminate\Http\Request;
use Spatie\RobotsMiddleware\RobotsMiddleware;

class MyRobotsMiddleware extends RobotsMiddleware
{
    /**
     * Determine if the current request should be indexed.
     *
     * @return string|bool
     */
    protected function shouldIndex(Request $request)
    {
        // Block indexing for admin routes
        if ($request->segment(1) === 'admin') {
            return false; // Sets x-robots-tag: none
        }

        // Block indexing for authenticated user areas
        if ($request->segment(1) === 'dashboard') {
            return false;
        }

        // Allow indexing for all other pages
        return true; // Sets x-robots-tag: all
    }
}

// Response headers for /admin/*:     x-robots-tag: none
// Response headers for /dashboard/*: x-robots-tag: none
// Response headers for /about:       x-robots-tag: all
```

## Boolean Return Values

The `shouldIndex()` method can return a boolean to set standard indexing directives. Returning `true` sets the header to `all` (index and follow), while returning `false` sets it to `none` (noindex and nofollow).

```php
<?php

namespace App\Http\Middleware;

use Illuminate\Http\Request;
use Spatie\RobotsMiddleware\RobotsMiddleware;

class EnvironmentRobotsMiddleware extends RobotsMiddleware
{
    protected function shouldIndex(Request $request): bool
    {
        // Only allow indexing in the production environment
        return app()->environment('production');
    }
}

// Production environment response: x-robots-tag: all
// Staging environment response:    x-robots-tag: none
// Local environment response:      x-robots-tag: none
```

## Custom String Directives

Return a string from `shouldIndex()` to set custom robots directives such as `nofollow`, `noindex`, `noarchive`, or combinations thereof. This gives you fine-grained control over specific crawler behaviors.
```php
<?php

namespace App\Http\Middleware;

use Illuminate\Http\Request;
use Spatie\RobotsMiddleware\RobotsMiddleware;

class CustomRobotsMiddleware extends RobotsMiddleware
{
    protected function shouldIndex(Request $request): string
    {
        // User profile pages: index but don't follow links
        if ($request->segment(1) === 'users') {
            return 'nofollow';
        }

        // Search results: don't index but follow links
        if ($request->segment(1) === 'search') {
            return 'noindex, follow';
        }

        // Archive pages: index but don't cache
        if ($request->segment(1) === 'archive') {
            return 'noarchive';
        }

        // Temporary content: don't index or archive
        if ($request->segment(1) === 'preview') {
            return 'noindex, noarchive';
        }

        return 'all';
    }
}

// /users/john response:         x-robots-tag: nofollow
// /search?q=test response:      x-robots-tag: noindex, follow
// /archive/2023 response:       x-robots-tag: noarchive
// /preview/draft-post response: x-robots-tag: noindex, noarchive
```

## Middleware Registration

Register your custom middleware to apply robots headers globally or to specific route groups. The middleware can be added to the global middleware stack or assigned to specific routes.

```php
<?php

// app/Http/Kernel.php (Laravel 10 and earlier)

namespace App\Http;

use Illuminate\Foundation\Http\Kernel as HttpKernel;

class Kernel extends HttpKernel
{
    // Apply to all requests globally
    protected $middleware = [
        // ... other middleware
        \App\Http\Middleware\MyRobotsMiddleware::class,
    ];

    // Or register as route middleware for selective use
    // (this property was named $routeMiddleware before Laravel 10)
    protected $middlewareAliases = [
        // ... other aliases
        'robots' => \App\Http\Middleware\MyRobotsMiddleware::class,
    ];
}
```

```php
<?php

// bootstrap/app.php (Laravel 11+)

use App\Http\Middleware\MyRobotsMiddleware;
use Illuminate\Foundation\Application;
use Illuminate\Foundation\Configuration\Middleware;

return Application::configure(basePath: dirname(__DIR__))
    ->withMiddleware(function (Middleware $middleware) {
        $middleware->append(MyRobotsMiddleware::class);
    })
    ->create();
```

## Route-Specific Middleware

Apply the robots middleware to specific routes or route groups when you need different indexing rules for different parts of your application.

```php
<?php

use App\Http\Middleware\MyRobotsMiddleware;
use Illuminate\Support\Facades\Route;

// Apply to a single route
Route::get('/private-page', function () {
    return view('private');
})->middleware(MyRobotsMiddleware::class);

// Apply to a route group
Route::middleware([MyRobotsMiddleware::class])->group(function () {
    Route::get('/member/dashboard', [MemberController::class, 'dashboard']);
    Route::get('/member/settings', [MemberController::class, 'settings']);
    Route::get('/member/profile', [MemberController::class, 'profile']);
});

// Using a middleware alias (if registered)
Route::middleware(['robots'])->group(function () {
    Route::get('/api-docs', [DocsController::class, 'index']);
});
```

## Preserving Existing Headers

The middleware preserves any `x-robots-tag` header that is already set on the response. If a controller or earlier middleware has set the header, the robots middleware will not overwrite it.

```php
<?php

use Illuminate\Support\Facades\Route;

// Controller sets its own robots header - the middleware won't overwrite it
Route::get('/special-page', function () {
    return response('Special content')
        ->header('x-robots-tag', 'noindex, nofollow, nosnippet');
});

// Response will have: x-robots-tag: noindex, nofollow, nosnippet
// (preserved from the controller, not overwritten by the middleware)
```

## InvalidIndexRule Exception

The `InvalidIndexRule` exception is thrown when the `shouldIndex()` method returns a value that is neither a boolean nor a string.
This helps catch implementation errors during development.

```php
<?php

namespace App\Http\Middleware;

use Illuminate\Http\Request;
use Spatie\RobotsMiddleware\InvalidIndexRule;
use Spatie\RobotsMiddleware\RobotsMiddleware;

class BrokenRobotsMiddleware extends RobotsMiddleware
{
    protected function shouldIndex(Request $request)
    {
        // ERROR: Returning an array will throw InvalidIndexRule
        return ['noindex', 'nofollow'];

        // ERROR: Returning null would also throw InvalidIndexRule
        // return null;

        // ERROR: Returning an integer would also throw InvalidIndexRule
        // return 0;
    }
}

// Throws: InvalidIndexRule with message:
// "An indexing rule needs to return a boolean or a string."

// CORRECT implementations:
// return true;                // x-robots-tag: all
// return false;               // x-robots-tag: none
// return 'nofollow';          // x-robots-tag: nofollow
// return 'noindex, nofollow'; // x-robots-tag: noindex, nofollow
```

## Advanced Conditional Logic

Combine multiple conditions in your `shouldIndex()` method to create sophisticated indexing rules based on routes, authentication, request parameters, or any application-specific logic.
```php
<?php

namespace App\Http\Middleware;

use Illuminate\Http\Request;
use Spatie\RobotsMiddleware\RobotsMiddleware;

class AdvancedRobotsMiddleware extends RobotsMiddleware
{
    protected function shouldIndex(Request $request): string|bool
    {
        // Never index non-production environments
        if (! app()->environment('production')) {
            return false;
        }

        // Never index admin or internal routes
        $blockedPrefixes = ['admin', 'internal', 'api', 'webhook'];
        if (in_array($request->segment(1), $blockedPrefixes)) {
            return false;
        }

        // Don't index authenticated user-specific pages
        if (auth()->check() && $request->segment(1) === 'account') {
            return false;
        }

        // Don't index paginated content beyond page 1
        if ($request->has('page') && $request->input('page') > 1) {
            return 'noindex, follow';
        }

        // Don't index filtered/sorted listing pages
        if ($request->has('sort') || $request->has('filter')) {
            return 'noindex, follow';
        }

        // Index everything else
        return true;
    }
}

// GET /products:            x-robots-tag: all
// GET /products?page=2:     x-robots-tag: noindex, follow
// GET /products?sort=price: x-robots-tag: noindex, follow
// GET /admin/users:         x-robots-tag: none
// GET /account/settings:    x-robots-tag: none (when authenticated)
```

## Summary

Laravel Robots Middleware is ideal for applications that need dynamic control over search engine indexing. Common use cases include protecting admin panels and internal tools from indexing, preventing staging or development environments from appearing in search results, hiding authenticated user areas such as dashboards and account pages, controlling indexing of paginated or filtered content, and setting specific directives like `nofollow` for user-generated content pages.

Integration with Laravel is straightforward: install via Composer, create a custom middleware class extending `RobotsMiddleware`, implement your indexing logic in the `shouldIndex()` method, and register the middleware in your application.
The package follows Laravel conventions and works seamlessly with Laravel's middleware system, supporting both global application and route-specific usage. The `x-robots-tag` header approach complements traditional robots.txt files by providing request-aware, programmatic control over crawler behavior.
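Once registered, the headers can be verified end to end with a Laravel feature test. The sketch below is illustrative only: it assumes a standard Laravel test setup, a globally registered middleware like the `MyRobotsMiddleware` example above, and placeholder routes (`/admin/users`, `/about`) that may not exist in your application.

```php
<?php

namespace Tests\Feature;

use Tests\TestCase;

class RobotsHeaderTest extends TestCase
{
    public function test_admin_routes_are_not_indexable(): void
    {
        // Hypothetical route; assumes shouldIndex() blocks the 'admin' prefix
        $this->get('/admin/users')
            ->assertHeader('x-robots-tag', 'none');
    }

    public function test_public_pages_are_indexable(): void
    {
        // Hypothetical public route allowed by shouldIndex()
        $this->get('/about')
            ->assertHeader('x-robots-tag', 'all');
    }
}
```

Because the middleware inspects each request at runtime, a test like this catches regressions (for example, a route prefix renamed out from under your `shouldIndex()` rules) that a static robots.txt check would miss.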