News & Articles

How Can You Remove A Link From Google Search In 2021?

Tuesday, January 28th, 2020
Written by Content Team

When millions of websites are desperately trying to appear on Google’s search results page, you may be wondering why anyone would want to remove one of their URLs from Google Search. It could be that some of your products or services are outdated and no longer relevant, or maybe a deleted page is still ranking. Whatever the reason, there are multiple things you can do to remove a link from Google Search.


#1 Removing a link from Google search with a Meta Noindex tag

One of the most effective ways to remove a URL from Google Search is by adding a meta noindex tag to the <head> of your page. The HTML code for a meta noindex tag is as follows:

<meta name="robots" content="noindex">

When Googlebot next crawls your webpage and sees this tag in the <head>, it will remove that page entirely from Google’s search results. This is the most effective way of removing a page from Google Search and should be your go-to in most instances.
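For context, the tag belongs inside the page’s <head> element alongside the title and any other meta tags. Here is a minimal sketch of a page Googlebot would drop from the index; the title and body text are just placeholders:

<!DOCTYPE html>
<html>
<head>
  <title>Discontinued product</title>
  <!-- Tells Googlebot and other crawlers not to index this page -->
  <meta name="robots" content="noindex">
</head>
<body>
  <p>This page is no longer available.</p>
</body>
</html>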


#2 Removing a link from Google search with Password Protection

Password-protected pages or subfolders won’t show up in search engines because crawlers can’t get past the login to read the content. Since this blocks every kind of web crawler, not just Googlebot, it is one of the most secure ways to keep a URL out of search results.
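How you set up the password depends on your hosting. As one illustration only, on an Apache server you could protect a folder with HTTP basic authentication via an .htaccess file; the AuthName and password file path below are placeholders for your own values:

# Ask for a username and password before serving anything in this folder
AuthType Basic
AuthName "Restricted area"
# Password file created with the htpasswd utility (placeholder path)
AuthUserFile /path/to/.htpasswd
Require valid-user

Crawlers requesting these URLs receive a 401 response instead of the page content, so there is nothing for them to index.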


#3 Removing a link from Google search with Google Search Console

If you have access to Google Search Console, you can temporarily deindex one of your web pages. First delete the page in question, then send a URL removal request with one of Search Console’s tools: navigate to Legacy tools > Removals, enter your URL, press Continue and wait for the webpage to be removed from Google.

If you use this method, the URL will be removed from the search index for around 90 days; after that, there is a possibility that the webpage will be re-added to the index. So, the best way to permanently remove a link from Google Search is still the meta noindex tag.


Robots.txt file

A common misconception is that a robots.txt file is the best way of removing URLs from Google’s index. While Google won’t crawl a page that is blocked by robots.txt, the URL can still be indexed if it is linked to from other locations on the web. So, it is important to explore the other methods above before relying on your robots.txt file. If you’d like to learn more about how a robots.txt file can be used for SEO, check out our blog post.
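For reference, a typical blocking rule looks like the sketch below (the folder name is just an example). It stops compliant crawlers from fetching those pages, but on its own it does not guarantee the URLs disappear from the index:

# Applies to all crawlers
User-agent: *
# Block crawling of this folder (does not guarantee deindexing)
Disallow: /old-products/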

If you need help deindexing a page on your website, get in touch with Global Search Marketing to discuss our SEO services.