This keyword was in position one with a CTR of 90%, but then the domain added a noindex directive to the page (facepalm). So Google replaced that number one ranking URL with their subdomain homepage, which was already ranking number two. However, the subdomain homepage wasn’t the ideal location for the query, as searchers couldn’t find the correct information right away. It got even worse: they then decided to 301 redirect that subdomain homepage to the top-level domain homepage, so Google was forced to rank a generic page that clearly didn’t have the correct information to satisfy that specific query.
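Issues like this are easy to catch before they cost you a ranking. As a rough illustration (not part of the original case study), here’s a small Python sketch that checks a single URL for a noindex directive and reports any redirect chain; the URL and the crude meta-tag check are placeholders, so verify anything it flags in the actual HTML.

import requests

def audit_url(url: str) -> None:
    """Report redirect hops, final status, and any noindex signals for one URL."""
    resp = requests.get(url, allow_redirects=True, timeout=10)

    # Show the redirect chain (e.g. a subdomain homepage 301ing to the root domain).
    for hop in resp.history:
        print(f"{hop.status_code} redirect: {hop.url} -> {hop.headers.get('Location')}")
    print(f"Final URL: {resp.url} ({resp.status_code})")

    # Check for a noindex directive in the X-Robots-Tag header or a meta robots tag.
    if "noindex" in resp.headers.get("X-Robots-Tag", "").lower():
        print("noindex found in X-Robots-Tag header")
    body = resp.text.lower()
    if 'name="robots"' in body and "noindex" in body:
        print("possible noindex in a meta robots tag (confirm in the HTML source)")

audit_url("https://example.com/some-ranking-page/")  # hypothetical URL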
As you can see, they then fell completely from that top position, as the page was now irrelevant and Google couldn’t retrieve the correct page for the job. Something similar happened in this next example. The result in position one for a very juicy term with a fantastic CTR suddenly returned a 404, so Google started to rank a different page from that same domain instead, one associated with a slightly similar but inexact topic. This again wasn’t the correct fit for the query, so overall performance declined.
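Purely as an illustration, a silent 404 like this is easy to catch if you routinely check the status codes of your top-ranking URLs; the list below is hypothetical (you might export it from Search Console).

import requests

TOP_RANKING_URLS = [  # hypothetical list of your best-performing pages
    "https://example.com/money-page/",
    "https://example.com/other-page/",
]

for url in TOP_RANKING_URLS:
    # HEAD keeps the check lightweight; some servers may require GET instead.
    status = requests.head(url, allow_redirects=True, timeout=10).status_code
    if status != 200:
        print(f"WARNING: {url} returned {status}")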
This is why it’s so important to look not just at the overall data, but to dig deeper, especially if there are multiple pages ranking for a keyword, so that you can see exactly what’s happening.
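For example, here’s a rough sketch (not from the original post) of pulling Search Console data broken out by query and page with the google-api-python-client library, so you can see exactly which URL Google is serving for a keyword. The credentials file, dates, and property URL are placeholders, and it assumes a service account with access to the property.

from google.oauth2 import service_account
from googleapiclient.discovery import build

creds = service_account.Credentials.from_service_account_file(
    "service-account.json",  # placeholder key file
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
service = build("searchconsole", "v1", credentials=creds)

response = service.searchanalytics().query(
    siteUrl="https://example.com/",  # placeholder property
    body={
        "startDate": "2024-01-01",
        "endDate": "2024-01-31",
        "dimensions": ["query", "page"],  # break performance out per query AND per page
        "rowLimit": 250,
    },
).execute()

for row in response.get("rows", []):
    query, page = row["keys"]
    print(query, page, row["clicks"], row["impressions"],
          round(row["ctr"] * 100, 1), row["position"])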
Got spam?

The final point is not exactly a pattern to consider, but more a wise lesson to wrap up everything I’ve explored in this post. At scale, Google is testing pages in the top 10 results in order to find the best placement based on that performance. With this in mind, why can’t we ask people to go to the SERPs, click on our results, and reap the tasty benefits of that improved position? Or better yet, why don’t we automate this continually for all of our top-10-tested queries? Of course, this approach is heavily spammy, against Google’s guidelines, and something Google can easily safeguard against.