
Like any respectable SEO junkie, I've read every audit-related article; I've written thousands of lines of audit-related code, and I've performed audits for friends, clients, and pretty much everyone else I know with a website.

All of this research and experience has helped me create an insanely thorough SEO audit process. And today, I'm going to share that process with you.

This is intended to be a comprehensive guide for performing a technical SEO audit. Whether you're auditing your own site, investigating an issue for a client, or just looking for good bathroom reading material, I can assure you that this guide has a little something for everyone. So without further ado, let's begin.

SEO Audit Preparation

When performing an audit, most people want to dive right into the analysis. Although I agree it's a lot more fun to start analyzing immediately, you should resist the urge.

A thorough audit requires at least a little planning to ensure nothing slips through the cracks.

Crawl Before You Walk

Before we can diagnose problems with the site, we have to know exactly what we're dealing with. Therefore, the first (and most important) preparation step is to crawl the entire website.

Crawling Tools

I've written custom crawling and analysis code for my audits, but if you want to avoid coding, I recommend using Screaming Frog's SEO Spider to perform the website crawl (it's free for the first 500 URIs and £99/year after that).

Alternatively, if you want a completely free tool, you can use Xenu's Link Sleuth; however, be forewarned that this tool was designed to crawl a website to find broken links. It displays a site's page titles and meta descriptions, but it was not created to perform the level of analysis we're going to discuss.

For more information about these crawling tools, read Dr. Pete's Crawler Face-off: Xenu vs. Screaming Frog.

Crawling Configuration

Once you've chosen (or developed) a crawling tool, you need to configure it to behave like your favorite search engine crawler (e.g., Googlebot, Bingbot, etc.). First, you should set the crawler's user agent to an appropriate string.

Popular Search Engine User Agents:

Googlebot - "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

Bingbot - "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"
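If you're scripting your own crawl, here's a minimal sketch using Python's standard library that attaches one of the user-agent strings above to an outgoing request (the URL and function name are illustrative, not from the original article):

```python
import urllib.request

# The search engine user-agent strings listed above.
USER_AGENTS = {
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
    "bingbot": "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)",
}

def build_request(url, agent="googlebot"):
    """Build a request that identifies itself as a search engine crawler."""
    return urllib.request.Request(url, headers={"User-Agent": USER_AGENTS[agent]})

req = build_request("http://www.example.com/", "googlebot")
print(req.get_header("User-agent"))
```

Any HTTP client or crawler offers an equivalent setting; the point is simply that the crawl should announce itself the way the engine's own bot would.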

Next, you should decide how you want the crawler to handle various Web technologies.

There is an ongoing debate about the intelligence of search engine crawlers. It's not entirely clear whether they are full-fledged headless browsers or simply glorified curl scripts (or something in between).

By default, I suggest disabling cookies, JavaScript, and CSS when crawling a site. If you can identify and correct the problems encountered by dumb crawlers, that work can also be applied to most (if not all) of the problems experienced by smarter crawlers.

Then, for situations where a dumb crawler just won't cut it (e.g., pages that are heavily dependent on AJAX), you can switch to a smarter crawler.

Ask the Oracles

The website crawl gives us a wealth of information, but to take this audit to the next level, we need to consult the search engines. Unfortunately, search engines don't like to give direct access to their servers, so we'll just have to settle for the next best thing: webmaster tools.

Most of the major search engines offer a set of diagnostic tools for webmasters, but for our purposes, we'll focus on Google Webmaster Tools and Bing Webmaster Tools. If you still haven't registered your website with these services, now's as good a time as any.

Helpful Videos:

How to Register Your Website with Google Webmaster Tools

How to Register Your Website with Bing Webmaster Tools

Now that we've consulted the search engines, we also need to get input from the site's visitors. The easiest way to get that input is through the site's analytics.

The Web is monitored by an ever-expanding list of analytics packages, but for our purposes, it doesn't matter which package your website is using. As long as you can investigate your site's traffic patterns, you're good to go for the upcoming analysis.

At this point, we're not finished collecting data, but we have enough to begin the analysis, so let's get this party started!

SEO Audit Analysis

The actual audit is broken down into five major sections:

Accessibility

Indexability

On-Page Ranking Factors

Off-Page Ranking Factors

Competitive Analysis

(1) Accessibility

If search engines and users can't access your site, it might as well not exist. With that in mind, let's make sure your site's pages are accessible.

Robots.txt

The robots.txt file is used to restrict search engine crawlers from accessing sections of your website. Although the file is very useful, it's also an easy way to inadvertently block crawlers.

As an extreme example, the following robots.txt entry restricts all crawlers from accessing any part of your site:
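The entry itself (per the robots exclusion standard, a wildcard user-agent with a site-wide disallow):

```
User-agent: *
Disallow: /
```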

Manually check the robots.txt file, and make sure it's not restricting access to important sections of your site. You can also use your Google Webmaster Tools account to identify URLs that are being blocked by the file.
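You can also spot-check individual URLs against your rules with Python's standard urllib.robotparser module; the rules and URLs below are a hypothetical illustration:

```python
import urllib.robotparser

# Parse a small hypothetical rule set (you could instead point
# set_url() at your live robots.txt and call read()).
rp = urllib.robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# Check whether Googlebot may fetch a few important URLs.
for url in ["http://www.example.com/", "http://www.example.com/private/data"]:
    allowed = rp.can_fetch("Googlebot", url)
    print(url, "->", "allowed" if allowed else "blocked")
```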

Robots Meta Tags

The robots meta tag is used to tell search engine crawlers whether they are allowed to index a specific page and follow its links.

When analyzing your site's accessibility, you want to identify pages that are inadvertently blocking crawlers. Here is an example of a robots meta tag that prevents crawlers from indexing a page and following its links:
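In standard robots meta tag syntax, that looks like:

```html
<meta name="robots" content="noindex, nofollow">
```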

HTTP Status Codes

Search engines and users are unable to access your site's content if you have URLs that return errors (i.e., 4xx and 5xx HTTP status codes).

During your website crawl, you should identify and fix any URLs that return errors (this also includes soft 404 errors). If a broken URL's corresponding page is no longer available on your site, redirect the URL to a relevant replacement.
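A small sketch of how you might triage crawl results by status code; the `crawl_results` data and function name are hypothetical stand-ins for your crawler's output:

```python
def classify_status(code):
    """Bucket an HTTP status code the way an audit report would."""
    if 200 <= code < 300:
        return "ok"
    if code in (301, 302, 303, 307, 308):
        return "redirect"
    if 400 <= code < 500:
        return "client error"   # e.g., 404 Not Found
    if 500 <= code < 600:
        return "server error"   # e.g., 500 Internal Server Error
    return "other"

# Hypothetical crawl output: URL -> status code.
crawl_results = {
    "http://www.example.com/": 200,
    "http://www.example.com/old-page": 404,
    "http://www.example.com/api": 500,
}

# Flag every crawled URL whose status needs attention.
broken = [u for u, c in crawl_results.items()
          if classify_status(c) in ("client error", "server error")]
print(broken)
```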

Speaking of redirection, this is also a great opportunity to inventory your site's redirection techniques. Be sure the site is using 301 HTTP redirects (and not 302 HTTP redirects, meta refresh redirects, or JavaScript-based redirects) because they pass the most link juice to their destination pages.
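To see whether a given URL answers with a 301 or a 302, you need a client that doesn't silently follow redirects. One way to do that with Python's standard library is to install a redirect handler that refuses to redirect, so the raw 3xx status surfaces (function and class names are my own, not from the article):

```python
import urllib.error
import urllib.request

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Stop urllib from following redirects so we can inspect the raw status."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # returning None makes urllib raise instead of following

def redirect_status(url):
    """Return the HTTP status code for a URL without following redirects."""
    opener = urllib.request.build_opener(NoRedirect)
    try:
        return opener.open(url).status
    except urllib.error.HTTPError as e:
        return e.code  # a 3xx surfaces as an HTTPError once redirects are off

# Example (requires network):
# redirect_status("http://www.example.com/old-page")  # 301 is what you want to see
```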

XML Sitemap

Your site's XML Sitemap provides a roadmap for search engine crawlers to ensure they can easily find all of your site's pages.

Here are a few important questions to answer about your Sitemap:

Is the Sitemap a valid XML document? Does it follow the Sitemap protocol? Search engines expect a specific format for Sitemaps; if yours doesn't conform to this format, it might not be processed correctly.

Has the Sitemap been submitted to your webmaster tools accounts? It's possible for search engines to find the Sitemap without your assistance, but you should definitely tell them about its location.

Did you find pages in the website crawl that do not appear in the Sitemap? You want to make sure the Sitemap presents an up-to-date view of the website.

Are there pages listed in the Sitemap that do not appear in the website crawl? If these pages still exist on the site, they are currently orphaned. Find an appropriate place for them in the site architecture, and make sure they receive at least one internal backlink.
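The last two questions reduce to set differences between Sitemap URLs and crawled URLs. A sketch with Python's standard ElementTree parser, using a made-up Sitemap snippet and crawl result:

```python
import xml.etree.ElementTree as ET

# Hypothetical Sitemap document for illustration.
SITEMAP_XML = b"""<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.com/</loc></url>
  <url><loc>http://www.example.com/about</loc></url>
  <url><loc>http://www.example.com/orphan</loc></url>
</urlset>"""

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_bytes):
    """Extract every <loc> URL from a Sitemap document."""
    root = ET.fromstring(xml_bytes)
    return {loc.text.strip() for loc in root.findall(".//sm:loc", NS)}

# Hypothetical set of URLs discovered by the website crawl.
crawled = {"http://www.example.com/", "http://www.example.com/about",
           "http://www.example.com/new-page"}
in_sitemap = sitemap_urls(SITEMAP_XML)

print("Missing from Sitemap:", crawled - in_sitemap)  # crawl found, Sitemap lacks
print("Possible orphans:", in_sitemap - crawled)      # Sitemap lists, crawl missed
```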

Helpful Videos:

How to Submit a Sitemap to Google

How to Submit a Sitemap to Bing

Site Architecture

Your site architecture defines the overall structure of your website, including its vertical depth (how many levels it has) as well as its horizontal breadth at each level.

When evaluating your site architecture, identify how many clicks it takes to get from the homepage to other important pages. Also, evaluate how well pages link to others in the site's hierarchy, and make sure the most important pages are prioritized in the architecture.

Ideally, you want to strive for a flatter site architecture that takes advantage of both vertical and horizontal linking opportunities.
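Click depth is just a breadth-first search over the site's link graph, starting at the homepage. A minimal sketch (the graph below is an invented example, not data from the article):

```python
from collections import deque

def click_depths(link_graph, homepage):
    """Breadth-first search: minimum clicks from the homepage to each page."""
    depths = {homepage: 0}
    queue = deque([homepage])
    while queue:
        page = queue.popleft()
        for target in link_graph.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# Hypothetical link graph: page -> pages it links to.
graph = {
    "/": ["/products", "/blog"],
    "/products": ["/products/widget"],
    "/blog": ["/blog/post-1"],
    "/blog/post-1": ["/products/widget"],
}
print(click_depths(graph, "/"))
```

Pages with a large depth value are the ones a flatter architecture (or extra internal links) would bring closer to the homepage.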

Flash and JavaScript Navigation

The best site architecture in the world can be undermined by navigational elements that are invisible to search engines. Although search engine crawlers have become much more capable over the years, it is still safer to avoid Flash and JavaScript navigation.

To evaluate your site's usage of JavaScript navigation, you can perform two separate website crawls: one with JavaScript disabled and another with it enabled. Then, you can compare the corresponding link graphs to identify sections of the website that are inaccessible without JavaScript.
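The comparison itself is another set difference over the URLs each crawl discovered (the crawl results below are hypothetical):

```python
# Hypothetical URL sets discovered by two crawls of the same site.
crawl_js_enabled = {"/", "/products", "/products/widget", "/members-area"}
crawl_js_disabled = {"/", "/products", "/products/widget"}

# Pages reachable only when JavaScript runs are invisible to "dumb" crawlers.
js_only = crawl_js_enabled - crawl_js_disabled
print("Reachable only via JavaScript navigation:", js_only)
```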
