
Case Study: A 4.5x Organic Traffic Increase Using (What?) Page Rank


Estimated reading time: 8 minutes

Introduction

I’ve been a director at The Search Initiative (TSI) for a while now. We’ve had some incredible results for a whole range of clients, all in very different niches. I’m going to share how my team and I took a low-authority website and put our foot on the gas to get it moving fast.

I truly believe there was a key ingredient that accelerated this website, but I’m going to share the whole process from the start. Why? Every website is different, so you need to figure out what your site needs, and that means going through the process. Here’s a sneak peek of the growth. Now learn how we did it…

Initial Analysis

Since starting TSI’s organic SEO services, I’ve realized that working with my own sites is hugely different from working with clients’ sites, especially when a website has weak foundations. I know how I want my money sites to look, so I build them with rigorous attention to detail. But if you take a website that’s been developed without a certain level of SEO knowledge, there’s usually quite a lot of on-site and off-site work to fix.

Here’s how my team broke down the initial analysis:

- Keyword Research
- On-Site Audit
- Backlink Audit
- Competitor Analysis

Keyword Research

My team tackled keyword research with two main workflows: one to monitor the health of a website, and the other for content gap analysis. When we track keywords for a website, we want to cover the core terms, but also terms that are having problems. If a term is suffering from keyword cannibalization that we’re trying to fix, it’s worth tracking it daily until the issue is resolved.

Since this client needed a substantial content strategy, we did both a health check and an initial content gap analysis. This included breaking down all the keywords for the industry into topics of related terms. In total, the process took over 20 hours and covered thousands of keywords chunked into neat topics. This work later helped with choosing page titles, headings and content.
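The chunking step described above can be sketched in code. This is a minimal illustration, not the team’s actual tooling: it buckets keyword variants into topics by reducing each phrase to a normalized set of core tokens. The stopword list and plural folding here are simplistic assumptions; a real workflow would lean on a keyword tool’s "parent topic" data.

```python
from collections import defaultdict

# Crude stopword list for illustration only; a production workflow
# would use a keyword tool's parent-topic grouping instead.
STOPWORDS = {"what", "is", "the", "for", "a", "of", "top", "rated", "best"}

def topic_key(keyword: str) -> frozenset:
    """Reduce a keyword to its core tokens so close variants share a bucket."""
    tokens = []
    for word in keyword.lower().split():
        if word in STOPWORDS:
            continue
        word = word.rstrip("s")  # crude plural folding: "foods" -> "food"
        if word:
            tokens.append(word)
    return frozenset(tokens)

def group_keywords(keywords):
    """Bucket a keyword list into topics of near-identical intent."""
    topics = defaultdict(list)
    for kw in keywords:
        topics[topic_key(kw)].append(kw)
    return dict(topics)

groups = group_keywords([
    "best dog food", "best dog foods", "what is the best dog food",
    "top rated dog food", "best food for dogs", "dog grooming tips",
])
# All five "best dog food" variants land in one topic bucket;
# "dog grooming tips" gets its own.
```

The point of the sketch is the data shape: thousands of raw keywords in, a much smaller set of topic buckets out, each of which later maps to one page title and heading set.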
Here’s an example of how we did it:

Step 1. Search broad keywords
Step 2. Review parent topics
Step 3. Find competitors for parent topics
Step 4. Reverse engineer competitors’ keywords
Step 5. Exclude outdated keywords

You can also export all of these keywords into Excel documents and filter them there, but most of the time the top keywords are fairly similar. Here’s an example for the "best dog food" term:

- best dog food
- best dog foods
- healthiest dog food
- what is the best dog food
- top rated dog food
- best food for dogs

While each keyword is unique, they all share a single intent: the users want to find out which dog foods on the market are best.

On-Site Audit

Finding all the technical and content issues with a website requires a full on-site audit. However, while big reports are easy on the eyes, it’s the small changes that make the difference. We audited the website and found a whole bunch of technical issues, from a lack of breadcrumbs and poor internal link structure to low-quality anchor text and unoptimized titles. A full on-site audit tutorial is too big for this post (perhaps coming soon), but here are some quick tips:

- Screaming Frog – A cheap way to regularly crawl your website. There are lots of ways to find errors, redirects, and missing metadata. You can also use a custom search to find all references to your keywords.
- Sitebulb – This tool is more expensive and carries a monthly recurring fee. However, it gives you lots of extra data that would be impossible to spot manually and hard to find with Screaming Frog, such as empty hyperlink references.
- Site Search – Using Google’s site search (site:domain.com) and other operators, you can find hundreds of issues with index management, outdated page titles, and multiple pages targeting the same keyword. There are a lot of quick wins here.
- Page Titles – If you wrote your page titles one or two years ago, you may find that they’re outdated now.
A quick site search with "intitle:2018" will find all your content that is either not updated or not yet recrawled by Google.
- Internal Links – A major way to pass relevance signals and authority to your core pages is through internal links. Make sure your pages are well interlinked and that you’re not using low-quality anchors, such as "click here" or "more information", from your power pages.

We focused on fixing around five issues at a time, varying from small changes like improving accessibility to bigger changes like introducing breadcrumbs for a custom-built website.

Backlink Audit

The website had a relatively small backlink profile, which meant it lacked authority, relevance signals and entry points for crawling. It also meant that a full in-depth link analysis was unnecessary for this campaign. In this instance, the initial check revealed nothing to be concerned about, so we moved on to technical implementation as soon as possible. Had the website experienced problems with its link profile, we would have done a full backlink audit to try to recover it. Here’s what to look out for:

- Link Distribution – Pointing too many links toward internal pages instead of your homepage can cause lots of issues, so make sure you’re not overdoing it.
- Anchor Text Analysis – Exact match, partial match and topical anchors are a great way to pass third-party relevance signals. Too many and you’ll be caught out for over-optimizing; too few and you won’t be competitive. Read more about anchor optimization.
- Referring IP Analysis – There is a finite number of IPv4 addresses, so this isn’t often a big cause for concern. However, it’s worth making sure you haven’t got too many links from the same IP address.
- Autonomous System Numbers – Since a server can be assigned any number of IP addresses, these systems are often grouped under an ASN. This is another way Google could flag large numbers of websites coming from the same origin.
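The anchor-text check above is easy to automate once you have a backlink export. Here is a minimal sketch, assuming you already have a flat list of anchor strings; the bucket names, classification order, and generic-anchor list are illustrative choices, not a published standard.

```python
from collections import Counter

GENERIC_ANCHORS = {"click here", "more information", "read more", "here"}

def classify_anchor(anchor: str, brand: str, keyword: str) -> str:
    """Rough anchor-text classifier. Substring checks will misfire on
    edge cases (e.g. 'food' inside 'seafood'); fine for a first pass."""
    a = anchor.lower().strip()
    if a == keyword:
        return "exact match"
    if brand in a:
        return "branded"
    if any(word in a for word in keyword.split()):
        return "partial match"
    if a in GENERIC_ANCHORS:
        return "generic"
    return "other"

def anchor_distribution(anchors, brand, keyword):
    """Percentage breakdown of a backlink profile's anchor text."""
    counts = Counter(classify_anchor(a, brand, keyword) for a in anchors)
    total = len(anchors)
    return {bucket: round(100 * n / total, 1) for bucket, n in counts.items()}

# Hypothetical brand/keyword and a tiny sample profile:
dist = anchor_distribution(
    ["best dog food", "acme pets", "dog food guide", "click here"],
    brand="acme", keyword="best dog food",
)
```

Run over a full export, a distribution heavily skewed toward "exact match" is the over-optimization signal the audit looks for, while a profile that is almost all "generic" and "branded" suggests you are leaving relevance on the table.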
My team did a case study on how to remove an algorithmic penalty; many of these audits come included in any penalty removal campaign.

Competitor Analysis

The difference between a search analyst and a data scientist is how they approach the search engines. An analyst focuses on reviewing the SERPs and finding what works best today, while a data scientist wants to understand how things work. We built our team to include both, since competitor analysis requires a keen eye for reviewing the SERPs, and algorithm analysis requires solid data scientists.

If you want to do SEO at a high level, you’ve got to be constantly reviewing competitors with various analysis tools. You’ll notice that tons of best practices get ignored in the top positions; the devil is in the details. In this instance, we found that both more content and more links would be required for long-term success.

Content Strategy

Building any long-term authority website in a competitive industry requires both an authoritative link profile and a content plan. My team reviewed the client’s existing content, looked at how other websites in the industry served their users, and then addressed these four cornerstones:

- User Intent – Before we did anything, we wanted to nail the user intent on every page. This research identified three pillars of content for the site. We’ll get into this in further detail below.
- Service Pages – These pages were dedicated to explaining what service was offered, how to get in contact, and what was included with that offering.
- Blog Content – These posts were dedicated to providing non-commercial, informative content that was interesting to the reader.
- Resource Center – This section was dedicated to giving basic information about topics in the industry. Instead of linking to Wikipedia for all our authority content, we wanted to use internal links instead.
Here’s a little bit about each section and our strategy for it:

User Intent

The biggest mistake I see all the time is also the simplest thing to check: what type of content is Google ranking in the top 10 positions? If you’re serving 10,000 words of content in a huge blog post, but Google is only interested in serving service pages with 50 words of content, you’ve missed the point. Another problem we commonly find at The Search Initiative is too much content packed into a single post when competitors rank with several shorter posts.

One of the main attractions of Thailand is its yoga retreats. If you’re searching for "yoga retreats" in America, you’re expecting to find destinations. Let’s take a look: the first result, Yoga Journal, includes almost no content aside from images and headings. That’s exactly what the users were looking for. Other websites offer a similar service and can help you make bookings.
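The "check what Google already ranks" test above can be reduced to a simple comparison. A minimal sketch, assuming you have already collected word counts for the top-10 results (here they are made-up numbers, and the 3x ratio is an arbitrary illustrative threshold, not a known ranking rule):

```python
from statistics import median

def format_mismatch(serp_word_counts, your_word_count, ratio=3.0):
    """Flag when a page's length is wildly out of line with what the
    SERP already rewards. 'ratio' is an arbitrary sanity threshold."""
    typical = median(serp_word_counts)
    if your_word_count > typical * ratio:
        return "too long for this SERP"
    if your_word_count < typical / ratio:
        return "too thin for this SERP"
    return "in line with the SERP"

# Hypothetical top-10 word counts for a query like "yoga retreats":
top10 = [450, 300, 520, 610, 280, 350, 400, 900, 330, 500]
verdict = format_mismatch(top10, your_word_count=10_000)
# A 10,000-word guide against a SERP of short destination pages
# comes back flagged as too long.
```

Word count is only a proxy for format, of course; the manual review of *what kind* of page ranks (listicle, service page, image-led destination page) still has to happen, but a check like this catches the gross mismatches quickly.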

The post Case Study: A 4.5x Organic Traffic Increase Using (What?) Page Rank first appeared on Diggity Marketing.

