Is your domain losing visibility? Then don't waste any time: find the reason for the drop quickly to avoid large traffic losses. In this article you will learn, step by step, how to find the culprit.
Note: If there are major changes in your project's SEO Visibility, first check whether they were caused by adding or removing project keywords!
1. Check the Organic Traffic Loss
Go to Organic Search > Organic Rankings and scroll down to the Visibility Chart.
Find the data point where the visibility dropped and hover over it to find out the exact date.
In this example, our domain lost 29.5% of SEO Visibility on 06.10.2019. After that, a downward trend is also visible.
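Incidentally, the percentage shown in the tooltip is an ordinary percent change between two data points. Here is a quick sketch; the visibility values are made up purely for illustration:

```python
# Quick sketch: the drop shown in the chart tooltip is a plain percent
# change between two data points. These visibility values are made up
# purely for illustration.
before, after = 125_000, 88_125  # hypothetical SEO Visibility values

change = (after - before) / before * 100
print(f"{change:+.1f}%")  # prints -29.5%
```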
2. Compare with Competitors
It can also be very helpful to look at how the competitors' SEO Visibility has developed in order to identify possible connections. To do this, click on a competitor on the right side to display that competitor's SEO Visibility.
In our example, one competitor stands out in particular: ever since our domain (red line) lost visibility, the competitor's visibility (blue line) has been rising. So there could be a connection.
3. Correlation with Google Algorithm Updates
These results also suggest that Google algorithm updates may have affected the rankings in our market. To identify possible connections, click the Google Updates button in the upper left corner. Small Google icons now appear at the bottom of the chart wherever a Google update took place.
If we look at our example, we notice a Google Update that is in close proximity to the visibility change.
By clicking the corresponding icon, we receive information about the update and can follow the link to read the related blog article. Here you can find information on algorithm changes that affect your domain in particular.
4. Analyze Winner and Loser Keywords
In order to associate the loss of visibility with certain keywords, go to Organic Search > Winner & Loser Keywords. Find the data point where the drop occurred and click on it. Now you can analyze how many keywords gained positions and traffic at that data point and how many lost them.
In our example, we can already see from the columns in the chart that the number of lost keywords was significantly higher than the number of keywords won. The legend on the right shows the precise differences: 21% of all keywords have lost rankings, and 17% have even disappeared from the rankings completely.
Further down the page are the corresponding tables of winner and loser keywords. Go to the loser keywords and make sure that the data point where the drop was recorded is set. All keywords that have lost positions are now listed, sorted by Traffic Index.
Let's take a look at our example again: some of the keywords have dropped out of the rankings completely (see "out" in the position trend). "carrie underwood" is the keyword for which our domain has lost the most traffic (-772). It is also noticeable that among the first 10 loser keywords, several belong to the "/music/" directory of the domain (see URL).
5. Examine Individual Directories
Next, check whether some directories had a bigger impact on the change in the domain's overall SEO Visibility than others. To do this, go to Organic Search > Directories. At the top of the page there is an SEO Visibility chart broken down into the individual domain directories.
In the example we see that the directories /restaurants/, /arts/ and /location/ are subject to slight fluctuations, but nothing serious is noticeable there. The /music/ directory looks different. For clarity, you can hide the other directories by clicking on them. The drop in SEO Visibility is particularly pronounced in the /music/ directory and thus has a great impact on the overall SEO Visibility.
6. Conduct a Site Crawl
We also recommend that you perform regular crawls to check all technical components of your domains. Even the smallest technical changes can have devastating effects on a page's visibility.
A crawl can easily be set up in the Site Experience. Learn how to do this in this article. Make sure that your servers do not block the Searchmetrics Bot!
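If you want to verify from your own machine that your robots.txt does not lock out the crawler, a minimal Python sketch like the following can help. The user-agent string "SearchmetricsBot" and the domain are assumptions; check the Searchmetrics documentation for the exact bot name.

```python
# Minimal sketch: check whether robots.txt blocks a crawler user agent.
# The user-agent string "SearchmetricsBot" is an assumption; verify the
# exact name in the Searchmetrics documentation before relying on it.
from urllib.robotparser import RobotFileParser

robots = RobotFileParser("https://www.example.com/robots.txt")
robots.read()

user_agent = "SearchmetricsBot"  # assumed name, see note above
if robots.can_fetch(user_agent, "https://www.example.com/"):
    print(f"{user_agent} is allowed to crawl the homepage.")
else:
    print(f"{user_agent} is blocked - the crawl will fail.")
```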
After a crawl has been completed, all results can be viewed on the subpages of the Site Experience.
Tip: Switch on your Visibility Guard to be aware of important changes immediately! More information.
You should make sure, for example, that your website does not return 404 errors. Check this on the Indexability > Server Responses subpage. Check out this article if you are not sure what each status code means. Basically, most of the pages should return status code 200. All other status codes should be checked to avoid a loss of visibility.
In our example, more than half of all links lead to the 500 Server Error. This problem should be resolved immediately.
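If you would like to spot-check status codes outside the Suite, a small script can run through a list of URLs. This is a minimal sketch using the third-party requests library; the URLs are placeholders for your own pages.

```python
# Minimal sketch: spot-check HTTP status codes for a list of URLs.
# Uses the third-party "requests" library (pip install requests);
# the URLs are placeholders for your own pages.
import requests

urls = [
    "https://www.example.com/",
    "https://www.example.com/music/",
]

for url in urls:
    response = requests.get(url, timeout=10, allow_redirects=False)
    if response.status_code != 200:
        print(f"{response.status_code}  {url}")  # anything but 200 needs a look
```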
Further important information can be obtained by analyzing the canonical tags. Go to Indexability > Canonical Tags to examine these. If only a few pages have a canonical tag, consider adapting the website structure where necessary to prevent duplicate content issues.
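A quick way to inspect the canonical tag of an individual page outside the Suite is a short script. This sketch assumes the third-party requests and beautifulsoup4 libraries; the URL is a placeholder.

```python
# Minimal sketch: extract the canonical tag from a page, if present.
# Uses "requests" and "beautifulsoup4" (both third-party); the URL
# is a placeholder.
import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/music/"
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

canonical = soup.find("link", rel="canonical")
if canonical:
    print(f"Canonical of {url}: {canonical.get('href')}")
else:
    print(f"{url} has no canonical tag.")
```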
Another reason for a decrease in traffic can be long loading times of individual pages. Long waiting times not only negatively affect the user experience, but can also lower Google rankings.
Get an overview of the loading times of your URLs under Performance > Page Speed.
In our example it becomes clear that the loading times of our URLs are in the middle range. We could try to optimize the loading times for some of the URLs listed below, for example by using smaller files.
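To get a rough feel for response times yourself, you can time the HTML download of a few URLs. Note that this sketch only measures the server response, not the full page load that tools like Lighthouse report; the URLs are placeholders.

```python
# Minimal sketch: roughly measure server response times for a few URLs.
# This only times the HTML download, not full rendering - dedicated
# tools measure the complete page load. URLs are placeholders.
import requests

urls = [
    "https://www.example.com/",
    "https://www.example.com/music/",
]

for url in urls:
    response = requests.get(url, timeout=30)
    print(f"{response.elapsed.total_seconds():.2f}s  {url}")
```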
Link analysis should not be ignored either. The report under Link Analysis > Broken & Redirected Links provides information about all these links. It goes without saying that broken links in particular should be avoided and fixed as quickly as possible.
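As a complement to the report, a short script can flag broken or redirected links on a single page. This sketch uses requests and beautifulsoup4, checks only absolute links to stay short, and the start URL is a placeholder.

```python
# Minimal sketch: find broken or redirected links on a single page.
# Uses "requests" and "beautifulsoup4"; only absolute links are checked
# here to keep the example short. The start URL is a placeholder.
import requests
from bs4 import BeautifulSoup

page = "https://www.example.com/"
soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")

for a in soup.find_all("a", href=True):
    href = a["href"]
    if not href.startswith("http"):
        continue  # skip relative links and anchors in this short sketch
    status = requests.head(href, timeout=10, allow_redirects=False).status_code
    if status >= 300:
        print(f"{status}  {href}")  # 3xx = redirected, 4xx/5xx = broken
```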
Let's stay with internal linking and take a look at the Link Equity Report (Link Analysis > Link Equity Optimization). Good internal linking leads to higher SPS values on the individual pages. Pages with low SPS values are not considered as important and usually generate less traffic under these circumstances.
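SPS is Searchmetrics' own score and the Suite computes it for you. Purely to illustrate the underlying idea that pages receiving more internal links accumulate more equity, here is a rough stand-in using a PageRank calculation over a small, hypothetical internal link graph (third-party networkx library). This is not how SPS is calculated.

```python
# Rough stand-in for an internal link-equity score: PageRank over a
# hypothetical internal link graph. SPS is Searchmetrics' own metric
# and is NOT computed this way - this only illustrates the idea that
# pages receiving more internal links accumulate more equity.
import networkx as nx

graph = nx.DiGraph()
graph.add_edges_from([
    ("/", "/music/"),
    ("/", "/restaurants/"),
    ("/music/", "/music/carrie-underwood/"),
    ("/restaurants/", "/music/"),  # cross-link boosts /music/
])

for page, score in sorted(nx.pagerank(graph).items(), key=lambda x: -x[1]):
    print(f"{score:.3f}  {page}")
```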
Make changes based on all of these findings and set up a regular crawl to monitor their effects.