Long story short: add routes.IgnoreRoute("robots.txt"); to RegisterRoutes in your RouteConfig.cs, and add a file called robots.txt to your project root folder.
RouteConfig.cs:

    ...
    public static void RegisterRoutes(RouteCollection routes)
    {
        routes.IgnoreRoute("{resource}.axd/{*pathInfo}");
        routes.IgnoreRoute("robots.txt");

        routes.MapRoute(
            name: "Default",
            url: "{controller}/{action}/{id}",
            defaults: new { controller = "Home", action = "Index", id = UrlParameter.Optional }
        );
        ...
    }
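Why the IgnoreRoute is needed at all: IIS normally serves static files like robots.txt directly, but if your Web.config enables runAllManagedModulesForAllRequests (common in MVC projects), every request — including robots.txt — is pushed through the routing pipeline, and without the IgnoreRoute MVC tries to match it to a controller and returns a 404. A typical Web.config fragment that causes this (shown here as an illustration; check your own config rather than copying this):

    <system.webServer>
      <modules runAllManagedModulesForAllRequests="true" />
    </system.webServer>

With the IgnoreRoute in place, routing skips the request and the static file handler serves the file as-is.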
robots.txt:

    User-agent: *
    Disallow: /
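When the site goes live, don't forget to swap the file for one that allows crawling. A minimal permissive version looks like this (an empty Disallow value means nothing is blocked):

    User-agent: *
    Disallow: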
Now well-behaved crawlers will stop indexing your site. (Strictly speaking, robots.txt only asks crawlers not to crawl; a URL that is linked from elsewhere can still appear in results, but its content won't be read.) Typically you want this when you have a beta site that should be reachable on the public internet but shouldn't show up in Google search results.