
Diggity Marketing SEO News Roundup – February 2021


Estimated reading time: 8 minutes

SEO’s best minds are clearly back at work. An impressive set of articles was published this month, and this roundup will take you through the best of them.

We’ll open with some case studies. First, you’ll learn what three separate case studies have to say about what went down with the December 2020 update. After that, you’ll learn how many URLs you can “request indexing” for before you hit a limit.

Some fresh guides are next. They’ll teach you about the latest best practices for PBNs, how to find keywords without help from historical data, and how to analyze SERPs to rank more effectively.

Finally, we’ll look at the news. There was a search ranking algorithm update on January 28th that you shouldn’t miss, plus a Google employee’s ruling on what does and doesn’t qualify as “duplicate content.”

Google’s December 2020 Broad Core Algorithm Update Part 2: Three Case Studies That Underscore The Complexity and Nuance of Broad Core Updates
https://www.gsqi.com/marketing-blog/december-2020-google-core-algorithm-update-part-two-case-studies/

Glenn Gabe brings us this look at what we can learn about the latest core update from three different case studies.

The first case study covers the results of a news publisher that focuses on a highly specific niche. This site was hit hard over 2020, even though it appeared to be doing most things right. As Glenn put it, the site had E-A-T galore. Several core authors produced the news stories. They were qualified, and their authorship was prominently displayed. Additionally, the site had over 2 million inbound links, including some from the most authoritative sites in the world. None of this stopped the site from getting hit in the January update and again in the May update.

Glenn recommended tackling this with a series of steps targeted at problems common to news sites. By the time the December 2020 update rolled around, the site’s traffic had grown by 140%+. It’s a story that may offer some options to other struggling news sources.
The second case study involved a site that did not welcome the December update. This affiliate site lost more than half of its traffic. Through several graphs, Glenn diagnosed what he believes is the cause. In this case, he thinks that low-quality content has been allowed to overwhelm the core site content. It’s difficult to say for sure because this study hasn’t been concluded yet. Affiliate sites that have been hit may want to stay tuned.

The final case study looked at a site that had every reason to praise the December update, especially after it had been hit so hard by the May update. This site was large-scale, and it was operating in a tough niche. The May update destroyed more than 40% of its traffic. Again, Glenn noticed that a growing pile of thin content defined the site. It also had some problems with intrusive ads and mobile issues. While these issues were steadily corrected, nothing changed until the December update, when the site regained 40% of its traffic, almost overnight.

These case studies can offer a lot of ideas to sites that were hit across 2020. If you want to make some major changes to your own site, you may be interested in knowing how many URLs you can request indexing for at one time. Our next case study may have your answer.

How Many URLs Can You “Request Indexing” For in GSC? [Case Study]
https://nickleroy.com/blog-posts/request-indexing-gsc-limit/

Nick LeRoy brings us this quick look into how many indexing requests GSC will tolerate from you at one time. The “request indexing” feature was completely missing from GSC for several months. Many SEOs were excited to see it come back, but they may not have noticed that its functions have changed slightly. Nick’s case study helps to clarify some of those changes. Before the tool was taken offline, the limit had been tracked to about 50 URLs per day. Nick tested the new limits with a site that launched with more than 500,000 new pages.
These limits may be concerning for SEOs who rely on fast indexing or who might be launching new sites soon. Nick theorized that the new limits may be there to prevent automation of the whole process. It’s something to watch.

For now, let’s move on to the guides. We’ll start with Rank Club’s look at the best PBN practices for 2021.

2021’s PBN Best Practice Guide [Backed by Data]
https://rankclub.io/2021-pbn-best-practice-guide

Rob Rok of Rank Club brings us this data-backed look at how to use PBNs right in 2021. He didn’t theorize about what might work. Instead, he tracked what his busiest customers were doing and turned it into a set of recommendations.

He broke the guide down by Tier 1 and Tier 2 PBN links. Tier 1 links are the PBN links that you build directly to your site. Tier 2 links are the PBN links that you point toward your incoming links to increase their power. For each set, he tried to answer the biggest questions.

For Tier 1 links, he focused on questions like:

Q: How fast can I build PBN links to my site?
A: Approximately 3.29 per month.

Q: How many PBN links can I build to my website?
A: The average number built to one domain is 7.12.

Q: Do PBN links work on YT videos?
A: Isolated testing has shown they do, but as of yet, clients are not using them that way.

For Tier 2 links, he focused on questions like:

Q: How many Tier 2 links should be sent to a given URL?
A: The average is 2.63, though clients have been successful in building as many as 13 at a time.

Q: How many Tier 2 links can be built at a time?
A: The average number of links per order is 20.53.

Q: Should I use Tier 1 and Tier 2 links together?
A: Nearly 25% of all PBN users choose to use both of them together.

Nearly a dozen more questions and answers are covered across the full guide. Many of them are reinforced with graphs, charts, and other helpful data representations.
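As a quick illustration of the pace those averages imply, here is a back-of-the-envelope calculation in Python. The two numbers come from the Q&A above; the script itself is a sketch and not part of Rob’s guide.

```python
# A quick pacing calculation (a sketch, not from the guide) using the
# averages reported in Rob's data: at the average Tier 1 build rate,
# how long does it take to reach the average link count on one domain?
AVG_LINKS_PER_MONTH = 3.29    # average Tier 1 PBN links built per month
AVG_LINKS_PER_DOMAIN = 7.12   # average Tier 1 links built to one domain

months_to_average = AVG_LINKS_PER_DOMAIN / AVG_LINKS_PER_MONTH
print(f"Reaching the average takes about {months_to_average:.1f} months")
# roughly 2.2 months
```

In other words, the typical client spreads those 7.12 links over a couple of months rather than building them all at once.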
Now that you’ve learned something you can do with links, let’s look at how you can improve your keywords. Moz has some advice on how to find keywords when you can’t rely on historical data.

Finding Keyword Opportunities Without Historical Data
https://moz.com/blog/find-keyword-opportunities-without-historical-data

Imogen Davies brings us this in-depth look at what options you have when researching a keyword with no historical data. As she points out in the introduction, Google has confirmed that 15% of daily queries are combinations that have never been searched before. A lot of opportunities are likely buried in those queries, but it’s hard to imagine successfully ranking for them when there’s no reference point for what works. Standard keyword tools aren’t going to be helpful here because they’re built around analyzing historical data. Imogen recommends three alternative strategies.

For mining “People Also Ask,” Imogen suggests that you start by going large-scale. Use SERP API tools or repeated searches to track all of the related searches that real people make.

Scraping autosuggest is the next recommendation, and it’s easily done with the URL query string she provides for you to paste right into the search bar. It provides you with a complete list of all the suggested queries that are associated with your keyword.

She recommends that you follow up on either of these strategies by grouping everything you find into topics and themes. This will help you plan your content and get ahead of competitors on unserved queries.

Our next piece has some more advice for you on how to serve your searchers better. This time, you’re going to do it by analyzing SERPs.

How to Analyze SERPs to Win Big in Rankings
https://cxl.com/blog/analyze-serps/

Adam Steele, writing for CXL, brings us this look at how you can analyze SERPs to win big in rankings. He starts with a short history lesson on how SERPs have changed.
He points out that features are frequently the top result for most searches, and that nearly all searches are now intensely customized for intent. This emphasis on intent has turned out to be a great thing for SEOs. Now, we have a simple, visual way to confirm what Google thinks a keyword means. All we need to do is perform a search and analyze the SERPs that appear.

Adam’s guide takes us through how we can use that analysis to confirm that a content plan will satisfy the keyword’s intent. He starts with a clear example. He performs a search for the word “Apple” and shows what comes up. There isn’t a single first-page result for the fruit. Instead, it’s all about the tech company. Good luck ranking for the word “apple” if what you’re selling comes by the bushel.

He also points out that making even small shifts in the query can change the intent significantly. As an example, he points out the difference between the terms “my SEO sucks” and “why does my SEO suck.”
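Circling back to the Moz piece, the autosuggest-scraping idea can be sketched in a few lines of Python. The roundup does not reproduce Imogen’s exact query string, so the endpoint and parameters below are assumptions based on Google’s public suggest API (the `client=firefox` variant, which returns a JSON array); treat them as illustrative rather than a definitive recipe.

```python
# Sketch of the autosuggest-scraping idea (endpoint and parameters are
# assumptions, not Imogen's exact query string from the Moz article).
from urllib.parse import urlencode
import json

def suggest_url(keyword):
    """Build an autosuggest request URL for a seed keyword."""
    params = urlencode({"client": "firefox", "q": keyword})
    return "https://suggestqueries.google.com/complete/search?" + params

def parse_suggestions(payload):
    """Parse the JSON response shape: [query, [suggestion, ...]]."""
    return json.loads(payload)[1]

# Offline example using a response in the documented shape:
sample = '["seo audit", ["seo audit checklist", "seo audit tool"]]'
print(suggest_url("seo audit"))
print(parse_suggestions(sample))   # ['seo audit checklist', 'seo audit tool']
```

Feeding each suggestion back in as a new seed keyword is an easy way to build the large pool of queries you then group into topics and themes.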

The post Diggity Marketing SEO News Roundup – February 2021 first appeared on Diggity Marketing.

