Technical SEO

Technical SEO refers to anything we do that makes our site easier for search engines to crawl and index.  

Technical SEO vs. On-Page SEO vs. Off-Page SEO

Technical SEO: anything we do that makes our site easier for search engines to crawl and index.

On-Page SEO: the content that tells search engines what your page is about, including image alt text, keyword usage, meta descriptions, H1 tags, URL naming, and internal linking.

Off-Page SEO: tells search engines how popular and useful your page is through votes of confidence, most notably backlinks, or links from other sites to your own. Backlink quantity and quality boost a page's PageRank.

The audit below starts with the fundamentals and then works through five factors: crawlability, indexability, renderability, rankability, and clickability.

Technical SEO Audit Fundamentals 

Audit Your Preferred Domain 

* Your website domain impacts whether people can find you through search and provides a consistent way to identify your site. 

* When you select a preferred domain, you’re telling search engines whether you prefer the www or non-www version of your site to be displayed in the search results. 
* For example, you might select www.yourwebsite.com over yourwebsite.com. This tells search engines to prioritize the www version of your site and redirects all users to that URL. Otherwise, search engines treat these two versions as separate sites, dispersing your SEO value. 
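In addition to a server-side 301 redirect, one common way to declare the preferred version is a canonical tag in each page's <head>. A minimal sketch, assuming www.yourwebsite.com is the preferred (placeholder) domain:

<!-- On both the www and non-www versions of a page, point to the preferred www URL -->
<link rel="canonical" href="https://www.yourwebsite.com/example-page/" />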

Implement SSL (Secure Sockets Layer) 

* SSL creates a layer of protection between the web server and the browser, making your site secure. 

* When a user sends information to your website, like payment or contact info, that information is less likely to be intercepted because SSL protects it. 

* Search engines prioritize secure sites; Google announced as early as 2014 that SSL would be considered a ranking factor. 

* After you set up SSL, you’ll need to migrate any non-SSL pages from http to https.  

Steps: 

1. Redirect all http://yourwebsite.com pages to https://yourwebsite.com.

2. Update all canonical and hreflang tags accordingly (see the sketch after these steps).

3. Update the URLs in your sitemap (located at yourwebsite.com/sitemap.xml) and your robots.txt file (located at yourwebsite.com/robots.txt).

4. Set up new Google Search Console and Bing Webmaster Tools properties for your https website and track them to make sure 100% of the traffic migrates over. 
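As a sketch of step 2, assuming placeholder URLs and an English page with a German alternate, the updated tags in each page's <head> should all point at https URLs:

<link rel="canonical" href="https://yourwebsite.com/example-page/" />
<link rel="alternate" hreflang="en" href="https://yourwebsite.com/example-page/" />
<link rel="alternate" hreflang="de" href="https://yourwebsite.com/de/example-page/" />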

Optimize Page Speed 

Compress all of your files: Compression reduces the size of your images, as well as CSS, HTML, and JavaScript files, so they take up less space and load faster.  

Audit redirects regularly: Each 301 redirect adds processing time while the browser is sent to the new URL. Multiply that over several pages, or layer redirects into chains, and you'll seriously impact your site speed. 

Trim down your code: Messy, bloated code can negatively impact your site speed. Minify your HTML, CSS, and JavaScript by stripping unnecessary characters, whitespace, and unused rules. 

Use a content distribution network (CDN): CDNs are distributed web servers that store copies of your website in various geographic locations and deliver your site based on the searcher's location. Since the information between servers has a shorter distance to travel, your site loads faster for the requesting party. 

Keep plugins updated: Outdated plugins often have security vulnerabilities that make your website susceptible to malicious hackers who can harm your website's rankings. Always use the latest versions of plugins and minimize your use to the most essential ones. 

Use cache plugins: Cache plugins store a static version of your site to send to returning users, decreasing the time it takes to load the site during repeat visits. 

Use asynchronous (async) loading: Scripts are instructions that browsers need to process before they can render the HTML, or body, of your webpage, i.e. the things visitors actually want to see on your site. Typically, scripts are placed in the <head> of a website (think: your Google Tag Manager script), where they are prioritized over the content on the rest of the page. Adding the async attribute lets the browser keep parsing the HTML while the script downloads, decreasing the delay and improving page load time. 

An async script tag looks like this: <script async src="script.js"></script> 

Crawlability Checklist 

Search bots will crawl your pages to gather information about your site. If these bots are somehow blocked from crawling, they can’t index or rank your pages.  

1. Create an XML sitemap.

2. Maximize your crawl budget.

3. Optimize your site architecture.

4. Set a URL structure.

5. Utilize robots.txt.

6. Add breadcrumb menus.

7. Use pagination.

8. Check your SEO log files.

1. Create an XML sitemap. 

An XML sitemap helps search bots understand and crawl your web pages. Remember to keep your sitemap up to date as you add and remove web pages.
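A minimal sitemap.xml sketch, with placeholder URLs and dates:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://yourwebsite.com/</loc>
    <lastmod>2024-01-01</lastmod>
  </url>
  <url>
    <loc>https://yourwebsite.com/blog/technical-seo/</loc>
    <lastmod>2024-01-01</lastmod>
  </url>
</urlset>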

 2. Maximize your crawl budget. 

Crawl budget refers to the pages and resources on your site search bots will crawl. 

Tips to make the most of your crawl budget: 

*Remove or canonicalize duplicate pages. 

*Fix or redirect any broken links. 

*Make sure your CSS and JavaScript files are crawlable. 

*Check your crawl stats regularly and watch for sudden dips or increases. 

*Make sure any bot or page you’ve disallowed from crawling is meant to be blocked. 

*Keep your sitemap updated and submit it to the appropriate webmaster tools. 

*Prune your site of unnecessary or outdated content. 

*Watch out for dynamically generated URLs, which can make the number of pages on your site skyrocket. 

3. Optimize your site architecture. 

A typical site architecture positions pages like About, Product, and News at the top of the hierarchy of page importance, with deeper pages linked beneath them. 

 4. Set a URL structure. 

URL structure refers to how you organize your URLs, which is often determined by your site architecture (for example, grouping blog posts under a /blog/ subfolder). 

Tips for writing your URLs (see the example after the list): 

*Use lowercase characters. 

*Use dashes to separate words. 

*Make them short and descriptive. 

*Avoid using unnecessary characters or words. 

*Include your target keywords. 
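For example, with hypothetical URLs, here is a structure that follows these tips versus one that doesn't:

Good: https://yourwebsite.com/blog/technical-seo-checklist
Poor: https://yourwebsite.com/Blog/post.aspx?ID=1234&ref=HOMEPAGE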

5. Utilize robots.txt. 

When a web robot crawls your site, it will first check your /robots.txt file, otherwise known as the Robots Exclusion Protocol. This file can allow or disallow specific web robots from crawling your site, including specific sections or even individual pages. If you'd like to prevent bots from indexing certain pages, use a noindex robots meta tag instead; a sketch of both is below. 
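A minimal sketch of both mechanisms, using hypothetical paths:

# robots.txt: allow all bots, keep them out of /admin/, and point them to the sitemap
User-agent: *
Disallow: /admin/
Sitemap: https://yourwebsite.com/sitemap.xml

<!-- In the <head> of any page you don't want indexed -->
<meta name="robots" content="noindex" />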

6. Add breadcrumb menus. 

Breadcrumbs are exactly what they sound like: a trail that guides users back to the start of their journey on your website. It's a menu of pages that tells users how their current page relates to the rest of the site. 

Breadcrumbs should be two things: 

* Visible to users so they can easily navigate your web pages without using the Back button. 

* Marked up with structured data to give accurate context to the search bots crawling your site (see the sketch below). 
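A sketch of breadcrumb markup using schema.org's BreadcrumbList vocabulary in JSON-LD, with hypothetical pages:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Home", "item": "https://yourwebsite.com/" },
    { "@type": "ListItem", "position": 2, "name": "Blog", "item": "https://yourwebsite.com/blog/" },
    { "@type": "ListItem", "position": 3, "name": "Technical SEO" }
  ]
}
</script>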

7. Use pagination. 

Pagination uses code to tell search engines when pages with distinct URLs are related to each other.  

On the first page of a series, use rel="next" to tell the search bot which page to crawl second. Then, on page two, use rel="prev" to indicate the prior page and rel="next" to indicate the subsequent page, and so on. 
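A sketch of what that looks like in the <head> of page two of a hypothetical blog archive (note that Google has since said it no longer uses these tags as an indexing signal, though other search engines may still read them):

<link rel="prev" href="https://yourwebsite.com/blog/page/1/" />
<link rel="next" href="https://yourwebsite.com/blog/page/3/" />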

8. Check your SEO log files. 

Web servers record and store log data about every action they take on your site in log files. The data recorded includes the time and date of the request, the content requested, and the requesting IP address. 
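A single entry in a typical access log (combined log format, with hypothetical values) looks something like this:

66.249.66.1 - - [12/Jan/2024:10:15:32 +0000] "GET /blog/technical-seo/ HTTP/1.1" 200 15320 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

Filtering entries by bot user agent shows which pages search engines actually crawl, how often, and with what status codes.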

Indexability Checklist 

As search bots crawl your website, they begin indexing pages based on their topic and relevance to that topic. Once indexed, your page is eligible to rank on the SERPs.  

1. Unblock search bots from accessing pages.

2. Remove duplicate content.

3. Audit your redirects.

4. Check the mobile responsiveness of your site.

5. Fix HTTP errors.

1. Unblock search bots from accessing pages. 

Google's robots.txt tester will give you a list of pages that are disallowed, and you can use Google Search Console's URL Inspection tool to determine the cause of any blocked pages. 

2. Remove duplicate content. 

Duplicate content confuses search bots and negatively impacts your indexability. Remember to use canonical URLs to establish your preferred pages. 
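A sketch, assuming a page that is reachable at several parameterized URLs: each variant declares the clean URL as canonical in its <head>:

<!-- On https://yourwebsite.com/products/shoes/?color=blue and similar variants -->
<link rel="canonical" href="https://yourwebsite.com/products/shoes/" />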

3. Audit your redirects. 

Verify that all of your redirects are set up properly. Redirect loops, broken URLs, or — worse — improper redirects can cause issues when your site is being indexed.  

4. Check the mobile-responsiveness of your site. 

Use Google’s mobile-friendly test to check where your website needs to improve. 

5. Fix HTTP errors. 

HTTP (HyperText Transfer Protocol) errors can impede the work of search bots by blocking them from important content on your site. 

*301 Permanent Redirects are used to permanently send traffic from one URL to another. A 301 redirect passes ranking power from the old URL to the new URL and is most often used when a page has been permanently moved or removed from a website. 

* A 302 redirect lets search engines know that a website or page has been moved temporarily.  

* An HTTP 403 response code means that a client is forbidden from accessing a valid URL. The server understands the request, but it can’t fulfill the request because of client-side issues. 

*404 Error Pages tell users that the page they have requested doesn’t exist, either because it’s been removed, or they typed the wrong URL. 

*405 Method Not Allowed means the server knows the request method, but the target resource doesn’t support this method. 

* 500 Internal Server Error means the server encountered an unexpected condition that prevented it from fulfilling the request. 

* 502 Bad Gateway Error is related to miscommunication, or invalid response, between website servers. 

*503 Service Unavailable tells you that while your server is functioning properly, it is unable to fulfill the request. 

 *504 Gateway Timeout means a server did not receive a timely response from your web server to access the requested information. 

Renderability Checklist 

Search engines can only index pages they can render, so an accessible site is one that renders easily. Below are the website elements to review for your renderability audit. 

1. Server Performance 

If you notice that your server is experiencing issues, use the resources provided above to troubleshoot and resolve them. Failure to do so in a timely manner can result in search engines removing your web page from their index as it is a poor experience to show a broken page to a user. 

2. HTTP Status 

Similar to server performance, HTTP errors will prevent access to your webpages. You can use a web crawler, like Screaming Frog, Botify, or DeepCrawl to perform a comprehensive error audit of your site. 

3. Load Time and Page Size 

A delay in page load time can result in a server error that blocks bots from your webpages, or has them crawl partially loaded versions that are missing important sections of content. Bots will only spend an amount of resources proportional to a page's crawl demand when attempting to load, render, and index it, so slow, heavy pages risk not being fully indexed. 

4. JavaScript Rendering 

Google admittedly has a difficult time processing JavaScript (JS) and, therefore, recommends employing pre-rendered content to improve accessibility. Google also has a host of resources to help you understand how search bots access JS on your site and how to improve search-related issues. 

5. Orphan Pages 

Every page on your site should be linked to at least one other page — preferably more, depending on how important the page is. When a page has no internal links, it’s called an orphan page. Like an article with no introduction, these pages lack the context that bots need to understand how they should be indexed. 

6. Page Depth 

Page depth refers to how many layers down a page exists in your site structure. It’s best to keep your site architecture as shallow as possible while still maintaining an intuitive hierarchy. Sometimes a multi-layered site is inevitable; in that case, you’ll want to prioritize a well-organized site over shallowness. 

Rankability Checklist 

1. Internal and External Linking 

Linking improves crawling, indexing, and your ability to rank. 

2. Backlink Quality 

Backlinks tell search bots that External Website A believes your page is high-quality and worth crawling.  

3. Content Clusters 

Content clusters link related content so search bots can easily find, crawl, and index all of the pages you own on a particular topic. They act as a self-promotion tool to show search engines how much you know about a topic, so they are more likely to rank your site as an authority for any related search query. 

Clickability Checklist 

1. Use structured data.

2. Win SERP features.

3. Optimize for Featured Snippets.

4. Consider Google Discover.

1. Use structured data. 

Structured data employs a specific vocabulary called schema to categorize and label elements on your webpage for search bots. The schema makes it crystal clear what each element is, how it relates to your site, and how to interpret it.   
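A sketch of structured data for a blog post, using schema.org's Article type in JSON-LD with hypothetical values:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Technical SEO Checklist",
  "datePublished": "2024-01-01",
  "author": { "@type": "Person", "name": "Jane Doe" }
}
</script>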

2. Win SERP features. 

Write useful content and use structured data. The easier it is for search bots to understand the elements of your site, the better your chances of earning a rich result. Structured data can help the following content types from your site reach the top of the SERPs, thereby increasing the probability of a click-through: 

 *Articles 

*Videos 

*Reviews 

*Events 

*How-Tos 

*FAQs 

*Images 

*Local Business Listings 

*Products 

*Sitelinks 

3. Optimize for Featured Snippets. 

Featured Snippets are intended to get searchers the answers to their queries as quickly as possible. According to Google, providing the best answer to the searcher’s query is the only way to win a snippet. 

4. Consider Google Discover

Google Discover is a relatively new algorithmic listing of content by category specifically for mobile users. 
