How to Reclaim Links for SEO

What Is Link Reclamation?

Link reclamation is the process of finding, fixing, or replacing broken links that point to your website. In SEO, most link-related attention goes to building new links. Building new links is valuable, but maintaining your existing link profile so you don't lose the link juice your site already has is just as important. By finding and fixing broken links through link reclamation, you preserve the quality and SEO power of your backlinks.

Why Does Link Reclamation Matter?

As mentioned above, link reclamation will ensure that your website isn’t losing out on any link juice due to broken links, moved pages or faulty URL canonicalization. Going through the link reclamation process has the added benefit of helping you to find defective pages, duplicate content issues and problems with your internal linking and site navigation.

Reclaimed links are also low-hanging fruit in the link building world. Website owners don't want broken links on their site (broken links hurt their own SEO and usability), so fixing or replacing them is in their own interest. Your internal links are the easiest to reclaim, since they are under the direct control of you or, at the very least, a team member. Fixing your internal linking is vitally important to ensure that pages with many inbound links distribute that link juice to the rest of your site.

When going through the link reclamation process, it is often advantageous to start with your internal links. There are two reasons for this:

  1. These links are under your direct control, so they are an easy way to score quick wins to improve your SEO.

  2. Crawling your site to find broken internal links will identify the pages that are most likely to have broken external links as well.

Start internal link reclamation by crawling your site. Use a web crawler like DeepCrawl or Screaming Frog. Crawling your site will return a list of your URLs along with important information about them, such as title tag, meta description, HTML header tags, and, most importantly for your purposes here, HTTP status code.

When using Screaming Frog, filter your crawl results by Client Error to display pages returning 404 errors. View all your internal links to that page by clicking on the URL in the top window, and then clicking on Inlinks in the bottom pane. If you have a large site, export your URLs using the Bulk Export option and sort/filter the Excel file by status code.

Screaming Frog broken inlinks
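If you want to script this check instead of (or alongside) a crawler, the logic is simple: for each internal link, fetch the target's HTTP status and group broken targets by the pages that link to them. A minimal sketch, where the (source, target) pairs and the status lookup are assumptions standing in for a real crawl (in production you might inject something like requests.head):

```python
# Minimal sketch of internal 404 detection, assuming you already have a
# list of (source_page, linked_url) pairs from a crawl. The fetch_status
# callable is injected so a live HTTP check can be swapped in later.
from typing import Callable, Iterable


def find_broken_links(
    links: Iterable[tuple[str, str]],
    fetch_status: Callable[[str], int],
) -> dict[str, list[str]]:
    """Group broken target URLs by the pages that link to them."""
    status_cache: dict[str, int] = {}   # avoid re-checking the same URL
    broken: dict[str, list[str]] = {}
    for source, target in links:
        if target not in status_cache:
            status_cache[target] = fetch_status(target)
        if status_cache[target] >= 400:  # 4xx/5xx = broken
            broken.setdefault(target, []).append(source)
    return broken


if __name__ == "__main__":
    # Stub statuses stand in for live HTTP checks (hypothetical URLs).
    statuses = {"/old-page": 404, "/about": 200, "/blog/post-1": 200}
    links = [("/", "/about"), ("/", "/old-page"), ("/blog", "/old-page")]
    print(find_broken_links(links, statuses.get))
    # {'/old-page': ['/', '/blog']}
```

The status cache matters on large sites: a broken page linked from hundreds of templates should only cost you one HTTP request.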

You can fix these links by setting up 301 redirects. However, the fewer links pointing to redirects on your site, the better for your usability and SEO, since a redirect adds an extra step to loading the page, and no one likes that. You are better off updating the links themselves to preserve UX and link juice.

In the past, you would normally take this opportunity to update any 302 (temporary) redirects to 301 (permanent) redirects, since search engines preferred 301s and passed more link juice through them than through 302s. However, in early 2016, Google announced that 301 and 302 redirects pass an equal amount of link juice, and that both pass full link juice.

Reclaiming external links pointed at your site is more difficult than fixing internal links, for two main reasons:

  1. These links are spread throughout the internet, instead of all residing in one place (your site). To find your backlinks, you must crawl potentially millions of pages.

  2. Since these links are on other sites, you cannot fix them yourself. In theory, site owners should be more than willing to update broken links on their pages, but that is not guaranteed.

The first step is to find your links. The good news is that if you’ve already done a link audit for your site, this step is complete. If you haven’t audited your link profile yet, you can do it manually, though the process is arduous. You can download your links from Google Search Console by exporting them as a CSV from the "Who Links the Most" section in the “Links to Your Site” area. Unfortunately, Google Search Console only provides the linking URL, so to get the link destination, you will need to either manually check each page, or crawl the list of URLs using Screaming Frog (free) or Kerboo (paid) and check the Outlinks pane for your domain. Once you have your backlinks, you will then need to recrawl those destination URLs to find their HTTP statuses.

Alternatively, you can use a backlink tool such as Majestic or Ahrefs to find and export your links, and crawl that list to find broken links.
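The "find the link destination" step above can also be scripted: fetch each linking page's HTML and pull out every anchor pointing at your domain. A stdlib-only sketch (the example page and domain are hypothetical; in practice you would feed it HTML fetched from your Search Console export):

```python
# Sketch: given the HTML of a page that links to you, collect every href
# pointing at your domain so you can then recheck those targets' statuses.
from html.parser import HTMLParser
from urllib.parse import urlparse


class OutlinkCollector(HTMLParser):
    """Collects <a href> values whose host matches a target domain."""

    def __init__(self, target_domain: str):
        super().__init__()
        self.target_domain = target_domain
        self.outlinks: list[str] = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href", "") or ""
        if urlparse(href).netloc == self.target_domain:
            self.outlinks.append(href)


def outlinks_to_domain(html: str, domain: str) -> list[str]:
    parser = OutlinkCollector(domain)
    parser.feed(html)
    return parser.outlinks


if __name__ == "__main__":
    page = (
        '<a href="https://example.com/guide">guide</a> '
        '<a href="https://other.net/x">elsewhere</a>'
    )
    print(outlinks_to_domain(page, "example.com"))
    # ['https://example.com/guide']
```

Feeding each URL from the export through this, then rechecking the collected targets' HTTP statuses, reproduces the Screaming Frog/Kerboo workflow in a few dozen lines.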

For links pointing to broken or out-of-date URLs, reclaiming your links can be as simple as fixing your pages and adding 301 redirects. However, ideally backlinks won’t point to redirects, so you should still take the time to send website owners updated URLs for their links. Make sure to point out that fixing outbound links on their sites is a win-win for everyone; this pitch should convert at a higher rate than normal link building outreach.

Another cause of losing link juice is duplicate content. Even if you always create 100% original articles for every page, your website could still be dealing with duplicate content issues thanks to canonical URL issues: the same page may be reachable at several URL variants, such as http://example.com/page, http://www.example.com/page, https://example.com/page, and http://example.com/page/index.html.

Since people probably don’t know your preferred URL structure, you could have links pointed at each version of your pages. This dilutes your site’s link juice, reducing its effectiveness and ability to rank.
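One way to surface this dilution is to normalize URL variants before counting links, so links split across duplicate versions of a page are grouped together. A minimal stdlib-only sketch, where the canonical conventions chosen here (https, non-www, folding index pages) are assumptions; use whatever rules match your site's actual preferred structure:

```python
# Sketch: fold common URL variants (scheme, www, index pages) into one
# canonical form to spot links split across duplicates of the same page.
from urllib.parse import urlparse


def canonicalize(url: str) -> str:
    parts = urlparse(url)
    host = parts.netloc.lower()
    if host.startswith("www."):
        host = host[4:]                       # prefer non-www (assumption)
    path = parts.path
    if path.endswith(("/index.html", "/index.php")):
        path = path.rsplit("/", 1)[0] + "/"   # fold index pages into the directory
    if not path:
        path = "/"
    return f"https://{host}{path}"            # prefer https (assumption)


if __name__ == "__main__":
    variants = [
        "http://www.example.com/page/index.html",
        "https://example.com/page/",
    ]
    print({canonicalize(u) for u in variants})  # both collapse to one URL
```

Running your backlink export through this before tallying link counts shows how much juice each logical page is really receiving.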

Check Google Search Console’s HTML Improvements under Search Appearance for duplicate title tags and meta descriptions. Compare the URLs of these pages to see if they are instances of the above issues.

Google Search Console HTML Improvements

Deploy 301 redirects to send visitors (and full link juice) to the canonical version of your URLs. However, if 301 redirects aren’t possible due to CMS limitations or a lack of web development resources, you can add the rel="canonical" tag to your pages (for example, <link rel="canonical" href="https://example.com/page/"> in the page’s <head> section). This HTML tag tells search engines where to find the original version of the content and where to send link juice.

The rel="canonical" tag is only a suggestion, however, so you should also set a preferred domain in Google Search Console.

Google Search Console Preferred Domain

This feature tells Google whether or not you want your URLs to start with www. So if Googlebot sees a link to the non-preferred version out on the web (for example, a link to example.com when you prefer www.example.com), it will treat it as a link to the preferred version and follow it accordingly. However, human visitors will still be sent to the URL as written, so you still need to set up 301 redirects.