Diggity Marketing SEO News Roundup – May 2020

Estimated reading time: 8 minutes

The last couple of months have been crazy for everyone. Now, we can see the light at the end of the tunnel and it’s time to hit the ground running. Need to gather some momentum? That’s where we come in. This month’s roundup is filled with all the latest tricks to carry you forward.

First, we have the month’s top guides. You’ll see the latest keyword research tips from Moz, get actionable advice for building tier 2 links, and pick up a new technique to track anchor text for incoming links.

After that, we have some data-packed case studies. Learn some new ways to analyze Google core updates, what the data says about how the Coronavirus has impacted SEO visibility, and why you should “launder” irrelevant content by turning it into duplicate content.

Finally, we’ll catch you up on the latest news. Get the latest on the recent algorithm updates. Then, learn what Google has to say about old content, find out how much GMB impressions have collapsed, and discover whether Google is still failing to index new content.

The Keyword Research Master Guide

Now is a great time to start thinking about the fundamentals of what we do. We can come out of this with better standards and practices, and pretty much everything begins with keyword research. This timely guide from Moz has some ideas for how you can optimize your methods to be in line with the latest updates. It covers a range of topics, including:

- How to distinguish between valuable and time-wasting keywords
- How to pull “seed” keywords from search data and competitors
- How to transfer what you’ve learned into a content strategy
- How and when to make on-page tweaks
- Which tools provide the most essential data (Moz may be a bit biased here, naturally)

Not all of this information will be new, especially if you’re a regular reader of our SEO news roundups. However, Moz is one of the biggest names in SEO, and their guides can influence what clients expect from SEO agency reports. Better to be ahead of the curve, right?
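To make that first bullet concrete, here’s a minimal sketch of one common way to separate valuable keywords from time-wasters: rank them by estimated search volume relative to ranking difficulty. The scoring formula and field names are my own illustration, not something pulled from the Moz guide.

```javascript
// Hypothetical keyword triage: score = volume / (difficulty + 1),
// so high-volume, low-difficulty keywords float to the top.
// The formula and field names are illustrative assumptions.
function rankKeywords(keywords) {
  return keywords
    .map((k) => ({ ...k, score: k.volume / (k.difficulty + 1) }))
    .sort((a, b) => b.score - a.score);
}

// Example: a long-tail keyword with modest volume but very low
// difficulty can outscore a highly competitive head term.
const ranked = rankKeywords([
  { term: "seo tools", volume: 40000, difficulty: 79 },
  { term: "tier 2 link building guide", volume: 4800, difficulty: 3 },
]);
```

Any real triage would fold in intent and business value too, but even a crude ratio like this helps cut the time-wasters early.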
If that’s where you like to be, RankClub’s tier 2 link building guide offers you an efficient way to take the lead in links.

RankClub’s Tier 2 Link Building Guide

A lot of factors go into the value of any given link. This guide makes the case that you can supercharge the best links you’ve built by pointing additional links toward those placements rather than your website. According to RankClub, these secondary (tier 2) links can turn links that are already strong into long-term authority engines.

Even better, they claim that this strategy is now simpler to pull off than it was in the past. PBNs have replaced GSA spam, web 2.0 blogs, and complex 3-4 layer tier schemes as a one-stop source for effective tier 2 links. The guide covers how to recognize proper tier 1 and tier 2 opportunities, some options for variety, and even some data from a tier 2 experiment.

While the tier 1 and tier 2 links you build are important, you also need to analyze the links that you didn’t build. The next guide in line will tell you how to track the anchor text for incoming links—using only Google Tag Manager.

Tracking the Anchor Text for Incoming Links in Google Tag Manager

The anchor text that strangers are using to link to your site can tell you a lot about what information visitors find most valuable. This data can be key to your anchor text optimization efforts, and this guide by David Vallejo tells you how you can finally start collecting it.

The process uses a custom HTML tag to make an XMLHttpRequest to a PHP file that scrapes each visitor’s referring source and copies the anchor text for your review. If some of that sounds like gibberish, don’t worry. While this method does require some coding, all of the code is provided for you. You can simply paste it into place. As the author himself states, the code is pretty rudimentary, so you or your developer may be able to improve on what’s there. If that still sounds a bit complicated, don’t worry.
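As a rough sketch of the flow just described (with a hypothetical helper and an invented endpoint name, not the author’s actual code): the custom HTML tag packages the landing URL and the visitor’s referrer, then POSTs it to the server-side script that fetches the referring page and extracts the anchor text.

```javascript
// Hypothetical helper: bundle what the server-side scraper needs in
// order to find the link (and its anchor text) pointing at your page.
function buildReferrerPayload(landingUrl, referrerUrl) {
  if (!referrerUrl) return null; // direct visits have no referrer, so no anchor to recover
  return { landing: landingUrl, referrer: referrerUrl };
}

// Inside the GTM custom HTML tag, the payload would be sent with an
// XMLHttpRequest; "/track-anchor.php" is an invented endpoint name:
//
//   var xhr = new XMLHttpRequest();
//   xhr.open("POST", "/track-anchor.php");
//   xhr.send(JSON.stringify(buildReferrerPayload(location.href, document.referrer)));
```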
The author has also provided a video for the entire process. If this guide whets your appetite for backend optimization, you’ll also enjoy the next one. Ahrefs has found that some SEOs are breaking their own pagination. Here’s how to find out if you’re one of them, and how to fix it.

SEOs Are Breaking Pagination After Google Changed Rel=Prev/Next — Here’s How to Get It Right

Google announced late last year that it no longer recognizes the rel=prev/next markup. In response, SEO teams across the web began changing their implementations. According to Ahrefs, that may have been a mistake.

First, they point out that Google isn’t the only party that ever used this markup. Other search engines still do, and it remains part of ADA (Americans with Disabilities Act) compliance and of the standards published by the World Wide Web Consortium (W3C). Google seems to have some other way to get the same information, but that’s not something the other parties that use the tag can replicate anytime soon.

Furthermore, a lot of SEOs who set out to change their implementation may have made things worse in unexpected ways. The guide contains some plans to help if you:

- Canonicalized the first page
- Orphaned your own content with misapplied noindex tags
- Blocked crawling and cut off later pages

For each one, it also tells you how to find out whether you’ve made any of these mistakes.

That covers the guides for this month, but the upcoming case studies teach their own kinds of lessons. First, let’s look at an argument for why you need to change the way you analyze core updates.

Google Core Updates: Stop Analyzing Them Like It’s 2013

If you’re an SEO, you’re probably pretty confident in your understanding of traffic, and in your ability to tell when and how an update has affected it. This chart-packed piece by Dan Shure may put that to the test. He shows you how to break down your averaged traffic and how to analyze whether an update was better or worse for it than a first glance suggests.
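To make “breaking down your averaged traffic” concrete before we get into his argument, here is a minimal sketch (with invented field names and data) that compares clicks before and after an update per page type, rather than for the domain as a whole.

```javascript
// Hypothetical segment-level comparison: net click change per page type.
// Each row is one page's clicks in equal windows before/after an update.
function changeByPageType(rows) {
  const totals = {};
  for (const r of rows) {
    const t = totals[r.pageType] || (totals[r.pageType] = { before: 0, after: 0 });
    t.before += r.clicksBefore;
    t.after += r.clicksAfter;
  }
  const change = {};
  for (const type of Object.keys(totals)) {
    change[type] = totals[type].after - totals[type].before;
  }
  return change;
}

// A flat domain total can hide offsetting moves in different segments:
const result = changeByPageType([
  { pageType: "blog", clicksBefore: 1000, clicksAfter: 700 },
  { pageType: "product", clicksBefore: 400, clicksAfter: 690 },
]);
// Blog pages lost 300 clicks while product pages gained 290 -- the
// domain total looks nearly flat even though both segments moved sharply.
```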
He argues that analyzing traffic changes at the domain level is one of the least insightful ways to judge whether a core update was good or bad for a website. The problem with assessing domain traffic is that there is rarely a domain-level solution for the effects of updates. Instead, different pages take hits or climb based on other factors. He suggests (and lays out) a plan for segmenting your traffic by:

- Pages
- Page types
- Queries
- Query types
- Device types

This, along with E-A-T-based analysis, can give you a lot more information about what an update really did to your site. Thanks to the algorithm update that just dropped, you’ll have a chance to put this into action. More on that in the news items.

Unfortunately for us all, core updates are likely less responsible for big traffic changes lately than the Coronavirus. The next case study examines the impact that it has had on visibility.

How the Coronavirus Has Impacted SEO Visibility Across Categories

Nearly every niche is experiencing volatility right now because, as this case study points out, the intentions and motivations of searchers are in flux. Some goods no longer fit in most family budgets, while others (like gaming consoles) have experienced massive surges and even shortages because of their increased value during the quarantine.

The study examines the effect the Coronavirus has had on 18 niches, including all of the following:

- Addictions & Recovery
- Alternative & Natural Medicine
- Beauty
- Finance
- Food & Drink
- News & Media
- Nutrition & Fitness
- Restaurants & Delivery
- TV, Movies & Streaming
- Video Games, Consoles & Entertainment

A series of charts breaks down not only who the winners and losers are, but also how much they’re winning or losing and which domains are benefitting. Altogether, it provides some great data you can use to target your advertising or affiliate marketing to where the money is moving right now.
If you are responding to sudden changes in visibility, especially on older sites, you may be struggling to manage some old content. The next case study has some ideas for what you can do.

Launder Irrelevant Content by Turning It Into Duplicate Content

There aren’t many satisfying methods yet for dealing with expired content. Oliver HG Mason has an idea for a workaround: turn it into duplicate content and apply what already works there. His method leans on the fact that duplicate content (when canonically linked) passes along ranking signals more reliably than irrelevant content does. He claims that by:

- Replacing irrelevant content with content from a preferred destination page
- Making the copied page canonical to the destination
- Waiting for Google to confirm the relationship
- And then 301’ing the copied page directly to the destination page

…you can create a relationship where ranking signals flow to the destination page.
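The sequence described above can be sketched as a tiny two-phase routing rule. The function and field names here are hypothetical, not Mason’s code; the point is that the retired URL first serves the destination’s content with a canonical link, and only switches to a 301 once Google has confirmed the relationship.

```javascript
// Hypothetical two-phase handler for a retired URL.
// Phase 1 (canonical not yet confirmed): serve a copy of the destination
//   page's content with a canonical link pointing at the destination.
// Phase 2 (canonical confirmed by Google): 301 straight to the destination.
function respondForRetiredPage(canonicalConfirmed, destinationUrl) {
  if (!canonicalConfirmed) {
    return {
      status: 200,
      headers: { Link: `<${destinationUrl}>; rel="canonical"` },
      body: "duplicate-of-destination", // placeholder for the copied content
    };
  }
  return { status: 301, headers: { Location: destinationUrl } };
}
```

The staged approach is the whole trick: jumping straight to the 301 is exactly the “irrelevant content” redirect that, per the case study, passes signals less reliably.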

The post Diggity Marketing SEO News Roundup – May 2020 first appeared on Diggity Marketing.
