Advanced Guide to Search Engine Optimization

Learn about fundamental concepts and how to leverage the power of SEO to get more traffic, leads, and sales.

An Introduction to Search

Think of the Internet as a spider-web of interlinked websites and webpages. Search engines follow all of the links that exist between different websites to form their view of the web. This is called “crawling.” During their crawl, search engines analyze over 200 factors to develop a holistic view of every website they encounter. When a user performs a search, the engine returns hundreds of thousands of results to their query in the form of search engine results pages (SERPs).

SEO is the combination of on-page optimizations (including the optimization of website code, page content, information hierarchy, and meta data) and off-page optimizations (including the acquisition of clout through publicity) that improve a website's keyword ranking positions in search results. This is achieved by understanding and leveraging what we know about the way search engines, like Google, crawl and index the web.

On-Page Optimizations

Optimizing website code, on-page content, information hierarchy, and meta data improves the way website users and search engines understand and interact with a website. Ensuring the website is optimized is the first step towards increasing keyword rankings and organic website sessions.

Information Architecture & Hierarchy

The website’s information architecture, the hierarchy of webpages in relation to the overall site, as well as their interrelatedness, is expressed through URL structuring and breadcrumbs. Having a clear and simple URL structure is essential in creating an environment for both users and search engines that is easy to understand.

Link Equity Distribution

Search engines crawl and index the Internet using automated scripts, often referred to as search engine robots or spiders. Unlike humans, these spiders crawl the web by following every link they encounter, within sites and between sites. They then use this interlinking information to form a map of the Internet and the relationships between sites. A mini-map is also formed for the pages within a site. The largest, most authoritative websites tend to be those with a relatively large number of links pointing towards them.

As pages with higher authority across the Internet link out to smaller sites, those smaller sites inherit some of that authority. This is because the link is passing “equity.” Links have always represented the single most important subset of "off-site elements" in terms of gaining rankings. The more linked-to a page becomes, the more authority it garners. Interlinking is a way a Webmaster can redistribute the link equity a particular page is receiving. Thus, the equity a homepage has received can be directed to deeper, value-generating pages, enabling those profit pages to rank.

Off-Page Optimizations

Search engines are influenced by a brand's clout, the online engagement around it, and the quality of those conversations. This engagement can greatly impact a website's visibility in organic search. In addition, the sentiment of the conversations, positive or negative, can also affect what appears in search for specific queries. Off-page optimization leverages brand communities, influencers, thought leaders, and content publishers, in addition to their audiences, to build brand awareness and strengthen brand recall. By building the right kind of online engagement for a brand, its website can benefit from an increase in organic search performance.

The Value of SEO

SEO is important when launching a new website, migrating a website to a new host or CMS, targeting multiple languages or regions, or when it's important to forecast demand.

SEO has proven, measurable results. With web analytics, marketers are able to effectively and efficiently set and measure target KPIs. Compared to traditional media, SEO provides a relatively low-cost alternative for businesses to reach large audiences.

Given that search engines allow users to quickly find products and services they are interested in, companies can allocate their marketing budgets toward a more qualified audience. Due to its characteristically low customer acquisition costs, SEO has one of the best returns on investment. This, coupled with the fact that there aren’t any associated cost-per-click fees, means that SEO can dramatically reduce a business' customer acquisition costs.

Identifying Clear Value Propositions

The value proposition is a statement that summarizes why a consumer should buy a product or use a service. This statement should convince a potential consumer that one particular product or service (or brand) will add more value, or solve a problem better, than other similar offerings. Keyword research, audience segmentation, and positioning will improve a marketer's understanding of consumer behaviour and will help her to formulate value propositions. If this information is used correctly, the number of website visitors that complete her specified, value-generating actions will improve.

Increase Website Visits

A good SEO strategy has the ability to increase organic search visits to a website. In most cases, though, quality is better than quantity, and SEO can help with traffic relevancy too. When pages are optimized to target a more relevant audience, shifting focus to more targeted keywords may decrease overall website traffic even as relevance improves. As users become more relevant, website conversion rates tend to improve.

Mitigating Risk

Relying on a single source of traffic can make organizations sensitive to external market conditions that are outside of their control. The Internet is in a constant state of flux. Relying on social media referrals to drive visits to your website can result in dramatic drop-offs in traffic when websites like Facebook and Twitter alter their algorithms. Investing too heavily in pay-per-click can increase your customer acquisition costs, making it increasingly difficult to compete. These are just two examples of the risks associated with relying too heavily on one channel to generate value. Like diversifying your investment portfolio, increasing the number of channels that drive traffic to your website will reduce the risk that your organization will be harmed by an unpredictable marketing environment.

Website Navigation

The interlinking between a website’s pages is an extremely important ranking factor for internal pages in Search. The following discussion includes the factors that should be considered when designing website navigation, and the resulting information architecture of a website, to ensure search engine crawlers can locate and index all important website pages.

1.1 Menus & Website Navigation

Ensure that all drop-down menus, particularly the main navigational menu, contain crawlable HTML links in the source code. Ensure nofollow directives are removed and that the navigation still functions in environments where JavaScript and Flash are disabled.

If a browser does not render a website’s drop-down menus when JavaScript is disabled, search engines may not be able to efficiently crawl the website.
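As a sketch, a crawlable menu keeps plain HTML anchor tags in the source so links exist even when JavaScript is disabled (the URLs and labels below are hypothetical):

```html
<!-- Plain <a href> links in the source are crawlable without JavaScript -->
<nav>
  <ul>
    <li><a href="/services/">Services</a>
      <ul>
        <li><a href="/services/seo/">SEO</a></li>
        <li><a href="/services/content/">Content Marketing</a></li>
      </ul>
    </li>
    <li><a href="/about/">About</a></li>
  </ul>
</nav>
```

Any JavaScript-driven show/hide behaviour can then be layered on top of this markup without removing the underlying links.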

1.2 Breadcrumbs

Use breadcrumbs to specify a page’s location in the website’s hierarchy. In addition, a single URL and breadcrumb path must be chosen and used for pages that appear under multiple categories in a navigational menu.

Search engines use a website’s breadcrumbs, along with its URL structure, navigational menus, and HTML sitemap, to better understand the website and its information architecture. Conflicting information creates confusion for users and search engine algorithms alike.

Implementing the appropriate structured data mark-up alongside breadcrumbs also helps search engines better present their results. This enriches the user experience and can increase a listing's prominence in search results. This is particularly important for resource pages and blog posts.
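A breadcrumb trail can be annotated with schema.org's BreadcrumbList mark-up in JSON-LD; a minimal sketch with hypothetical URLs:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Home", "item": "https://example.com/" },
    { "@type": "ListItem", "position": 2, "name": "Blog", "item": "https://example.com/blog/" },
    { "@type": "ListItem", "position": 3, "name": "Sample Post" }
  ]
}
</script>
```

The final item may omit "item" when it represents the current page.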

1.3 Internal Linking

Always link to a single version of any given URL. When linking to internal pages, decide whether your website’s URLs should use http or https, include www or not, and end with a trailing slash or not.

Below are eight unique URL variants of the same page. Remember to choose just one:

  • http://example.com/page
  • http://example.com/page/
  • http://www.example.com/page
  • http://www.example.com/page/
  • https://example.com/page
  • https://example.com/page/
  • https://www.example.com/page
  • https://www.example.com/page/

Keep internal linking consistent so that search engine crawlers encounter as few redirects as possible, conserving your crawl budget.

1.4 HTML Sitemap

Ensure that your website has an HTML sitemap that is dynamically generated, and place a link to it in the footer of the website. HTML sitemaps make it possible for all important, live pages on a website to be a single click away from your home page. The HTML sitemap also further communicates the website’s information architecture and the relationship between pages. The sitemap design should communicate a consistent message with the main navigational menu, as well as the website’s URL structure and breadcrumbs.

Best practices dictate that the sitemap should automatically update to include any newly created pages and their respective URLs. When a page “expires,” or is removed from the website for whatever reason, the page should also be removed from the HTML sitemap.

The HTML sitemap also allows users to quickly see all important website pages in the event they are not able to effectively navigate the site, for whatever reason. In addition, the HTML sitemap places links to all important pages one level away from any point of entry. This both facilitates website crawls and improves link equity distribution from the home page.

Note that the HTML sitemap is not to be confused with the XML sitemap that is submitted to Google through Google Search Console.

URLs & Header Response Codes

2.1 Structuring URLs

As discussed, URL structure conveys important information to search engines like Google. It’s important that existing URLs, for any website, change as little as possible. This avoids losing any page authority and keeps header response code errors to a minimum.

When creating URLs, it’s important to avoid duplicate content, prevent crawl/indexing issues, and be relevant to users.

Our Tips:

  • Avoid using parameters (“?”, “&”) in the URL. An exception applies to e-commerce websites, which often rely on parameters for product filters.
  • Do not hide any important content behind fragment identifiers (#) in the URL. Content that appears only after the “#” is typically not crawled or indexed by search engines.
  • Avoid creating URLs based on user path. It’s important that only one version of any URL exists. Rewriting URLs based on user paths will create duplicate content. This is particularly important if you are working with a Magento installation.
  • Ensure that all internal links, pointing to the same page, use consistent and accurate Final URLs.
  • When updating URLs, it’s important to implement a 301 redirect from the old URL to the new URL, if the content has moved permanently.
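As a sketch of the last tip, a permanent redirect on an Apache server (assuming .htaccess overrides are enabled; the paths and domain are hypothetical) can be implemented with a single directive:

```apacheconf
# 301 (permanent) redirect from the retired URL to its replacement
Redirect 301 /old-page/ https://www.example.com/new-page/
```

Other servers (e.g., Nginx) have equivalent directives; the key point is the 301 status code, which passes ranking authority to the new URL.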

2.2 Keyword Use in URLs

Including important keywords in URL paths benefits both users and search engines. Aim to include these keywords without creating excessively long URLs. Note that over-optimizing keywords in URLs can lead to a poor user experience that can negatively impact rankings.

2.3 Header Response Codes

Ensure links are up-to-date and return “200 OK” header response codes. Sending crawlers through multiple 301 redirects, or to too many non-existent pages that return a “404 Page Not Found” header response code, negatively impacts crawl budget and link equity distribution.

2.4 404 Pages

If a website does send a user to a “404 Page Not Found” page, it’s important that the experience is as enjoyable as possible and that users can quickly find their way to another relevant page. For this, it is recommended to create a custom 404 page template that is well-structured, informative, and looks pleasing to the eye.

Meta Information & Schema Mark-Up

3.1 Page Titles & Meta Descriptions

Search engines grab Meta Information from a page and serve it to their users in search results. Page Titles are the large, typically blue pieces of text in search results, and are used by search engine algorithms to rank pages for specific keywords.

Meta Descriptions, shown as the grey descriptive text on search result pages, are used to entice searchers to click. You can view this as “marketing text,” or an “elevator pitch.” Well-written Meta Descriptions explain what the user is going to gain by visiting the page, what makes it unique or superior in value to the alternatives (value proposition), and contain a call to action. Note that Meta Descriptions are not used by search engine algorithms to rank pages, and search engines may modify your Meta Description as they see fit.

Page Titles and Meta Descriptions are elements located in the <head> section of each webpage. The Page Title is especially important because it is the primary indicator of relevance to the search and it is the first thing that a user sees on a search engine results page (SERP).

Page Titles and Meta Descriptions should read as natural, entice the audience to click on your result, and effectively target the right keywords for the page.
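Putting the above together, a page's <head> might contain something like this (the business name, copy, and keyword targeting are hypothetical):

```html
<head>
  <title>Inbound Marketing Agency in Houston | Example Co.</title>
  <meta name="description" content="Example Co. helps Houston businesses grow
    with data-driven inbound marketing. Get your free consultation today.">
</head>
```

Note how the title leads with the target keyword while the description carries the value proposition and a call to action.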

3.2 Headings

The W3C defines HTML headings as follows:

“HTML defines six levels of headings. A heading element implies all of the font changes, paragraph breaks before and after, and any white space necessary to render the heading. The heading elements are H1, H2, H3, H4, H5, and H6 with H1 being the highest (or most important) level and H6 the least…Use the DIV element together with headings elements when you want to make the hierarchical structure of a document explicit.”

Headings should not only be used to format text in a certain way. Their primary purpose is to structure the page’s content hierarchically. H3, H4, H5, and H6 headings are not absolute requirements and should only be used in areas that make sense. When writing headings, they should be unique from page-to-page and reinforce the key themes of the page.
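A sketch of a hierarchically structured page, using headings for structure rather than styling (the topic and wording are hypothetical):

```html
<h1>Inbound Marketing Services</h1>   <!-- one H1: the page's main theme -->
<h2>Search Engine Optimization</h2>   <!-- major section -->
<h3>On-Page SEO</h3>                  <!-- subsection of the H2 above -->
<h3>Off-Page SEO</h3>
<h2>Content Marketing</h2>            <!-- sibling major section -->
```

Visual styling of each level can then be handled in CSS, independent of the document hierarchy.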

3.3 Alt Attributes

An alt attribute should be assigned to each image rendered on the page. In the event the web browser cannot load the image, this alternative text will be displayed.

Alt attributes are used by search engine algorithms to analyze the relevance and importance of images to the page. These attributes also increase the likelihood that the image will appear in image-based search results.
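A minimal example (the filename and description are hypothetical):

```html
<!-- Descriptive alt text: shown if the image fails to load, read by crawlers -->
<img src="/images/red-running-shoes.jpg" alt="Red mesh running shoes, side view">
```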

3.4 Open Graph

Social sharing isn’t only a potential ranking signal; social has become a driver for brand awareness. Controlling the messages your business serves to social audiences is an important consideration in this. By implementing the Open Graph protocol on your website, you ensure the consistency of your brand messaging across platforms that recognize it (Facebook, Twitter, and LinkedIn).

The basic open graph elements are:

  • OG:Title: The title of your object as it should appear within the graph
  • OG:Type: The type of your object (e.g., “website” or “article”)
  • OG:Image: A URL for an image that represents your object
  • OG:URL: The canonical URL of your object that will be used as its permanent ID in the graph
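These elements are expressed as meta tags in the page's <head>; a sketch with hypothetical values:

```html
<meta property="og:title" content="Advanced Guide to SEO">
<meta property="og:type" content="article">
<meta property="og:image" content="https://example.com/images/seo-guide-cover.jpg">
<meta property="og:url" content="https://example.com/guides/seo/">
```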

3.5 Structured Data

Structured data mark-up, or schema, is a standard way to annotate content so that machines can understand it. When your website includes structured data mark-up, search engines like Google may use that data to index your content better. Schema can also lead to presenting your website in a more prominent way in search results, and better integrate it into Google’s increasingly rich search experience.

Acceptable rich snippet types, along with complete documentation for schema mark-up, can be found on schema.org. The site offers several options for implementing rich snippets: Microdata, JSON-LD, or RDFa. JSON-LD is the most widely recommended implementation.

Duplicate Content

4.1 Introduction to Duplicate Content

Links aside, arguably the most important ranking factor is website content. Sites that offer superior content to that of their competitors generally rank higher for competing terms. When developing content for your site, it’s equally important to manage content duplication. Duplicate content sends a strong low-quality signal to search engines, but it isn’t always intentional. In fact, content duplication (the same, or similar, content across multiple URLs on a single domain) is often created unintentionally.

To effectively manage duplicate content, it’s important that a primary URL is defined for each unique piece of content (website page). The process of canonicalization is how we manage duplicate content across websites.

The canonicalization process directs search engines to the page that is officially recognized as true and genuine, and it is done in two ways:

  • 301 redirects force a user or search engine spider from one URL to another, passing ranking authority from the old URL to the new one; and
  • Rel=canonical tags suggest to search engines where the “home” of the content is located by specifying its URL.
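For example, if the same product page is reachable under several category paths, a rel=canonical tag in the <head> of each variant points search engines at the primary URL (hypothetical):

```html
<link rel="canonical" href="https://www.example.com/widgets/blue-widget/">
```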

4.2 Pages Under Construction

For website pages you do not want search engines to index (such is the case for pages or templates that are under development), you can:

  • Use a robots.txt Disallow directive to prevent the page from being crawled; or
  • Use a “noindex” meta tag in the page’s <head> to ensure the page will not be indexed by search engines.

Note that the two should not be combined for the same page: if a page is blocked in robots.txt, crawlers will never see its noindex tag.


The meta noindex tag should be placed in the <head> section of every page you would like to keep out of a search engine’s index and looks like this:

<meta name="robots" content="noindex">

4.3 Cross-Domain Duplicate Content

Up to this point, duplicate content has been referred to as an on-site problem where multiple pages (URLs) on the same website display the same or very similar content. The discussion will now turn to duplicate content across websites, which is equally as important.

If your site’s content finds its way onto another website, and that website has more authority, it may outrank your website for related search terms. This is why, when advertising your business and its services, it’s important to create unique content for those advertisements. Taking or repurposing existing website content for use on another website can negatively impact the quality assessment of your website.

4.4 URL Slashes

Another culprit of duplicate content is trailing slashes in URLs. If there isn’t a way to avoid the duplicate content caused by trailing slashes, ensure that all internal links point to a single version (slashed or unslashed -- just pick one!). Once this is done, apply a 301 redirect from all non-trailing-slash URLs to trailing-slash URLs (or vice-versa).
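One way to implement this site-wide redirect, sketched for an Apache server with mod_rewrite (this version forces trailing slashes; invert the rule to strip them instead):

```apacheconf
RewriteEngine On
# Skip real files (e.g. /style.css) so only page URLs are redirected
RewriteCond %{REQUEST_FILENAME} !-f
# 301-redirect any URL without a trailing slash to its slashed version
RewriteRule ^(.*[^/])$ /$1/ [R=301,L]
```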

Other On-Page Optimizations

5.1 Sitemaps

An XML sitemap, not to be confused with the HTML sitemap discussed in an earlier section, should be periodically submitted to Google & Bing using Search Console or Webmaster Tools.

These sitemaps give webmasters a way to list all indexable URLs for any given website. The purpose of the XML sitemap is to suggest how the website should be crawled and indexed, and it helps search engines discover new content. XML sitemaps differ from traditional HTML sitemaps in that they are not a page on the website that users will access.

It’s important to understand that XML sitemaps do not directly affect a page’s rankings; rather, these sitemaps affect its presence in the search engine index.

Multiple standards exist for producing an XML sitemap; however, Google recommends using the standard put forth by sitemaps.org. More information about XML sitemaps can be found in the Google help center.

The following basic rules should be followed when creating and maintaining XML sitemaps:

  • Always identify the default XML namespace
  • Keep the text UTF-8 compliant. If you use things like ampersands in your URLs they should be entity-escaped (replace “&” with “&amp;” in URLs)
  • No files with more than 50,000 entries – if you have more than this, use sitemap index files
  • No files more than 50 MB in size, uncompressed (gzip compression is an option)
  • Only include pages that should be indexed. Usually this means that you should not include internal search results URLs, or URLs with session IDs
  • Sitemap location is important. A sitemap file that is stored in a directory can only reference URLs stored within that directory or sub-directories. For this reason sitemap files are most often stored in the root directory
  • Only list unbroken links. Search Console will show erroneous URLs in a sitemap file, and you should strive to have no more than a 1% error rate
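A minimal sitemap file that follows these rules might look like this (the URLs and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/services/</loc>
  </url>
</urlset>
```

Note the default namespace declaration on <urlset>, per the first rule above.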

How to Create & Submit an XML Sitemap

  1. Identify which pages on the site are important for search engines. You can ignore things like login pages, non-canonical pages, search results pages and other pages that are not relevant to search engines.
  2. Create your sitemap file. This can be done within a CMS (or a plugin, such as Yoast) or with a desktop crawler (e.g., Xenu’s Link Sleuth).
  3. Test and validate your XML file using Google Search Console.
  4. Upload the sitemap.xml file to the root folder on your server.
  5. Reference the sitemap in your robots.txt file with a line like this: Sitemap: https://www.example.com/sitemap.xml
  6. Register your sitemap with Google and Bing via their respective Webmaster Tools interfaces.
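A sketch of the robots.txt referenced in step 5 (the domain and disallowed paths are hypothetical):

```txt
User-agent: *
Disallow: /admin/
Disallow: /search/

Sitemap: https://www.example.com/sitemap.xml
```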

5.2 Page Load Times

Search engines now assess a page’s load time and have incorporated the analysis into their algorithms. Load times should be reduced as much as possible, by avoiding the use of large image files and heavy code.

Our tips:

  • Compress and appropriately size images before uploading them
  • Minify CSS and JavaScript files, and remove unused code
  • Enable browser caching and gzip compression on the server
  • Consider serving static assets through a content delivery network (CDN)

5.3 Mobile SEO

There are two ways to render mobile-friendly websites, without creating a separate mobile URL:

  1. Responsive web design. This type of design serves the same HTML code on the same URL, regardless of the user’s device. The code, however, is designed to render the website differently by responding to the screen-size of the browser and device. Responsive website design is strongly recommended.
  2. Dynamic Serving. This type of implementation serves content on the same URL regardless of device, but generates a different version of the HTML based on the device type the server identifies.

Google strongly recommends that developers create a responsive website that adapts to the size of the user’s screen. Developers can then signal to browsers that the website will adapt to all devices, by including the following meta tag in the <head> section of all its webpages:

<meta name="viewport" content="width=device-width, initial-scale=1">

Remember that search engines like Google need to experience your website the same way a user would. Thus, serving different content to “Googlebot” can lead to an algorithmic search engine penalty.


5.4 HTTPS

The web is becoming a more secure place. That’s why Google has made a push for prioritizing secure websites and content. This means that the HTTPS protocol is used as a ranking signal in their search algorithm.

Most webmasters know that HTTPS gives website users added security and privacy. What many people don't know is that referral data from a referring site using https is stripped for sites that do not use https. This traffic will show up in your analytics as “direct” traffic. By switching to https, you regain the lost referral data in Google Analytics.

Google has also made it clear that HTTPS is a (lightweight) ranking signal, so switching to HTTPS may result in a modest “boost” in rankings.

As with everything, implementing HTTPS is not as straightforward as one might think. The first thing to keep in mind is that HTTPS may decrease page speed, which is why it is so important to optimize your website for speed first. In addition, all links, including canonical tags, image src links, and JavaScript references, will need to be changed over to HTTPS. If left unchanged, some browsers will trigger a “page is insecure” warning that may turn off many users. Finally, if a migration strategy is not well-planned, switching to HTTPS may increase the number of internal 301 redirects, which decreases crawl efficiency.

Google’s help documentation on securing your site with HTTPS covers these considerations in more detail.

5.5 Word Count

Although often overlooked, word count is an important on-page consideration. Pages with very low word counts give search engines little to work with; with more substantive content on a page, search engines typically understand its purpose better, and there is evidence suggesting they may spend more time crawling the website.

5.6 Google My Business

Google My Business is an integral part of Local SEO and should be treated as such. The Google My Business panel appears to the right of a search results page when you type in a business-related query. If your business relies on local clients, you need to optimize your Google My Business listing. This involves registering your business with the service.

It is important to maintain total uniformity in your NAPW (business name, address, phone number, website address) across your Google My Business pages and local citation sources.

Find out more about Google My Business here: 

Backlinks & Off-Page SEO

6.1 External Linking

When linking to other websites from your own website, it is important that the links reach the desired URL. It’s also important that you link to authoritative sources that preserve the integrity of your brand.

6.2 Backlink Profiles

A strong backlink profile is important to maintaining and improving rankings for your website. There are countless ways to gain and attract good backlinks. The issue is that there are just as many ways to attract spammy, low-quality backlinks. Search engines like Google do not like these types of links and may penalize you for having them. Many so-called ‘marketers’ resort to black-hat tactics of buying links from link farms and spamming directories, message boards, and the comments sections of many blogs. This is detrimental to your business and removes the long-term, positive ROI that SEO provides.

For more info about backlinks, check out this Moz post.

6.3 Local SEO & Location Signalling

Local citations are like the online version of the Yellow Pages; they are digital directory listings that help people locate your business. For multi-location firms that rely on local business, local SEO should be an integral part of the SEO strategy.

Local SEO includes the following:

  • Local Search Listing Optimization, including Google My Business and Bing Places business listings.
  • Multi-Regional Website Optimization, including business location schema
  • Local Citation Building, including the creation of Yelp and Facebook local business pages


Local Search Listings

Google serves local search results from its Google My Business platform. Here, businesses are able to list their business locations and contact information on Google Maps and dedicated search listing pages. When a user performs a search that is local in nature, search engines will typically serve these local results. The listings are usually shown alongside a map in search results. On-page and off-page optimizations ensure that these local business listings have the best chance of appearing in organic search results, when local search results are triggered.

Multi-Regional Website Optimization

Search engines also return standard organic results for queries they deem to be relevant to the searcher's intent and region. Ensuring your multi-regional website is optimized for organic search ensures that search engines rank the most relevant page for each region. This means that if someone located in Houston, TX searches for “inbound marketing agency,” search engines should return results for inbound marketing agencies located in or near Houston. In addition, multi-regional website optimization ensures that the searcher is directed to the most relevant page for their region or language.
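One common way to signal regional and language targeting is hreflang annotation in each page's <head>; a sketch with hypothetical URLs and an English-only site structure:

```html
<link rel="alternate" hreflang="en-us" href="https://example.com/en-us/" />
<link rel="alternate" hreflang="en-gb" href="https://example.com/en-gb/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/" />
```

Each regional variant should list the full set of alternates, including itself; "x-default" marks the fallback page for unmatched regions.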

Learn more about local citations in this blog post.

6.4 Social Signals

Social signals refer to the social media profiles owned by your brand and how complete and active those properties are. While their direct impact on rankings is debated, strong social signals correlate with search visibility, so it’s important not to disregard them. Additionally, social media is another channel that can be used to attract a qualified, engaged audience.

SEO Abuse & Spamming

Over the years, many so-called “SEO professionals” have spammed the Internet with poor-quality content and links, under the guise of search engine optimization. As search engine algorithms have become more sophisticated, they have become almost immune to these unethical practices. Known as black-hat search engine optimizers, these individuals are oftentimes difficult to discern from true SEO practitioners.

A company that participates in “black-hat” SEO not only runs the risk of not seeing a return on its investment, but also risks algorithmic and manual search engine penalties that can cause organic rankings and search traffic to drop. In addition, the worst offenders are permanently banned from appearing in Google search. Yes, the largest search engines employ teams of web spam specialists that manually penalize websites each and every day.

Some of the most common spamming techniques include:

  • Stuffing keywords into on-page content
  • Automating the process of building low-quality links that contribute to web-spam
  • Cloaking or hiding content on your website to manipulate search engine spiders

Keep in mind that a search engine’s mission is to provide the most relevant results to its users; its algorithms are designed to detect and demote websites suspected of spam and abuse.

Speak with an Expert

We’re always available to chat, so reach out to Tandm if you have more questions for our team of experts to answer, or if you are interested in a search marketing partnership with our agency.