
SEO Rules of Thumb

UNDERSTANDING SEARCH ENGINE OPTIMIZATION GOAL

SEO Rules Intro

The goal of search engine optimization (SEO) is not just getting traffic or ranking higher in the SERP; it is how much of the acquired traffic converts into leads. ROI is the final goal of every search engine campaign. Getting there can be upsetting and challenging. However, if the basic rules of SEO are fully understood and applied, the sky is the limit.

The rules are simple to follow, and understanding and applying these practices is paramount for any SEO success story.

SEO rule no.1

Gather Your Thoughts

Scrape together all your SEO thoughts and evaluate them

It is indispensable to understand the business goals and objectives, anticipate the likelihood of survival, and forecast the potential prospects for your establishment.

Moreover, running a successful SEO campaign requires you to identify the business vision and long-term mission – the prerequisite for developing the SEO techniques to engage in a winning battle with any competitor out there.

Above all, before heading to the Internet to compete for rankings, your number one priority should be what really matters in your chosen industry: the products or services that get the most attention, the need for those services or products, the leading authorities, index term (keyword) volume, and the value associated with those search terms.

Beating a competitor at their own game is possible if you painstakingly follow the steps highlighted above and apply the SEO rules below.

SEO rule no.2

Brainstorm for The Right Tools

Brainstorming for the right SEO tools to launch the campaign

With your thoughts fully organized, you can move on to a comprehensive brainstorming session, working from the collected ideas most relevant to your business. You are now halfway to a solid SEO campaign.

Now it’s high time you “batten down the hatches” and hit the Internet. Look for the right tools to conduct a proper diagnosis, audit and keyword research for your website, and don’t forget to spy on your competitors if you wish.

SEO rule no.3

Choose The Right Gadget

Choose your best SEO utility tool and master it.

So far, you have been able to organize your plans proactively with a sheer curiosity for getting it right. The larger part of any SEO campaign has now been done; the rest is to put those hypotheses to the test by acquiring the right tools and applying them with your expertise.

Remember, the purpose of acquiring such a tool is to examine and provide crucial information about the issues facing your website in terms of indexing & crawlability, redirects, encoding & technical factors, links and link structures, URLs, images, and localization.

Use any one of the following free or freemium tools to launch your campaign:

  1. Spy Tool
  2. Website Grader
  3. Check My Link
  4. SEO Report Card
  5. SEMrush
  6. Sitebulb
  7. GTMetrix
  8. Google Analytics
  9. Raven Tools
  10. Buzzsumo
  11. Ahrefs
  12. ScreamingFrog
  13. Google Lighthouse
  14. Google Search Console
  15. Majestic
  16. Moz

SEO rule no.4

Yield to the SEO Tool's Suggestions

The introduction of tools into your campaign might be a bit of a battle if you are using any of them for the first time.

But what should be of utmost importance is for the tool to uncover the hidden flaws embedded in your website. Observe these steps:

  1. Pencil down the errors that have a significant SEO impact on a web page; do this slowly to really unearth all the specks of dirt and clouds of dust on the site.
  2. Now yield to the suggestions collected from auditing your website. This time, your expertise or experience will serve as a yardstick for fixing the problems effectively. At this stage, it’s your call to decide whether you would like to fight it tooth and nail or pass the baton to an experienced SEO specialist.

Nonetheless, I have prepared a few engaging tips to help clean up some mess.

EXPERT LEVEL

Identify The Problems

At this juncture, your understanding of basic HTML may be put to the test in order to interpret the recommendations provided by the web audit tool.

Below are the major site-structure-related problems:

  1. Indexing and crawlability
  2. Redirects factor
  3. Encoding and technical factors
  4. URLs and the links
  5. Images
  6. On-page
  7. Localization

INDEXING AND CRAWLABILITY - SITE STRUCTURE

HTTP Status Codes

An HTTP status code is the response a web server renders whenever a web page is requested. The status code comes in different forms – 400, 403, 404, 500, etc.

Causes of Status Codes

  1. Request made to a non-existent file or page
  2. The inability of the server to perform or complete a certain request
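
If you want to see the status code a page returns before digging into the audit report, a quick sketch in PHP (using the built-in get_headers() function; the URL below is only a placeholder) looks like this:

<?php
// Fetch the response headers for a URL and print the status line.
// Replace the placeholder URL with the page you want to test.
$headers = get_headers('https://domain.com/some-page.html');
echo $headers[0]; // e.g. "HTTP/1.1 404 Not Found" or "HTTP/1.1 200 OK"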

HTTP STATUS CODE - 404 ERROR

404 Error Code

A 404 error response is triggered when a requested page is not found.

Causes
  1. Request made to a non-existent file or page
404 Error Fixing
  1. Optimize those pages with navigational options (a custom 404 page example appears at the end of this section) or use permanent 301 redirects.
  2. Open your .htaccess file and enter code like the following:
  2. Open up your .htaccess file, and enter the code like so
# 301 Redirect notfound.html
Options +FollowSymLinks
RewriteEngine on
Redirect 301 /notfound.html /goodpath.html

Note that notfound.html is the file that generates the error code (404 Not Found), and goodpath.html is your fresh link that renders a 200 status code when visited.
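
If you would rather keep visitors on the site than redirect every dead URL, Apache's ErrorDocument directive can serve a custom 404 page with navigational options. This is a sketch: the /404.html path is a placeholder and you would create that page yourself.

# Serve a custom, navigable 404 page instead of the bare server default
ErrorDocument 404 /404.html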

HTTP STATUS CODE - 500 ERROR

500 Error Code

A 500 error response is triggered when the server is incapable of performing a certain request.

Causes
  1. The inability of the server to perform or complete a certain request
500 Error Fixing
  1. Check the .htaccess file for misconfiguration, especially the syntax.
  2. Remove the existing .htaccess file and refresh the web page (copy the code snippet somewhere as a backup, just in case).
  3. Pay keen attention to the change when your browser makes a new request, then create a fresh .htaccess file (a minimal starting point is sketched after this list). OR
  4. Check the file permission on the index.html/php page by right-clicking on the file and setting the permission to 644 or 755. Any other value may be what caused the server to trigger this error.
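
If you decide to rebuild the file from scratch, a minimal starting point might look like the sketch below; re-add your own rules one at a time, refreshing the page after each change so you can spot the line that triggers the 500 error.

# Minimal .htaccess starting point (adjust to your own setup)
Options +FollowSymLinks
RewriteEngine on
# re-add your redirect and rewrite rules below, one at a time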

INDEXING AND CRAWLABILITY - INDEXING TAG ISSUES

Indexing Tag

Your website data may be restricted from indexing in different ways. Check that your:

  1. Robots.txt file is properly configured (check the disallow rules in this file)
  2. X-Robots-Tag / meta robots tags are implemented correctly

INDEXING TAG ISSUES - ROBOTS.TXT FILE

Robots.txt

The robots.txt file is an indexing control file that contains commands for the spider, instructing the crawler which pages should or should not be crawled. It’s a simple text file on the server that uses the term “disallow” to discourage crawlers from entering certain directories.

What it Does
  1. Instructs crawlers or robots which parts of your website they may crawl and index.
Robots.txt Fixing
  1. Ensure the file is well-formatted so search engines can fetch and read it. You can set different rules using the Allow, Disallow and User-agent directives.
  2. It is very important to ensure that your robots.txt file does not conflict with the sitemaps you have submitted to search engines (a fuller example referencing the sitemap follows below).
  3. Ensure the file is placed in the root folder, like so:
-- domain.com/robots.txt
  4. Check the wildcard (*) in the User-agent line to ensure the rules apply to the crawlers you intend.

User-agent: *
Disallow: /cgi-bin/ # keep all crawlers out of this directory
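
For illustration, a slightly fuller robots.txt might block a private directory while still allowing a public sub-folder and pointing crawlers at the sitemap. All paths below are placeholders, so adjust them to your own site.

User-agent: *
Disallow: /cgi-bin/        # keep crawlers out of this directory
Allow: /cgi-bin/public/    # but allow this public sub-folder
Sitemap: https://domain.com/sitemap.xml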

For more information on these rules, check out robotstxt.org.

INDEXING TAG ISSUES - THE X-ROBOTS-TAG

X-Robots Tag

This tag is sent in the HTTP response header of a document, preventing or allowing search engines to index or show certain files.

What it Does
  1. Inhibits search engines from indexing or showing certain files
Fixing X-Robots Tags
  1. Watch out for the following code in your header output for proper implementation.
  2. Check the PHP file that sends your headers to see whether you have implemented the command correctly:
header("X-Robots-Tag: noindex", true); // prevents search engines from indexing or showing this file
header("X-Robots-Tag: noindex, nofollow", true); // also tells search engines not to follow the links in the file

3. In the .htaccess file: if you don’t want search engines to show or index a .doc file type, add this code snippet to the .htaccess file

<FilesMatch ".doc$">
Header set X-Robots-Tag "noindex, noarchive, nosnippet"

or use the code snippet below if you suspect any foul play with robots.txt

<FilesMatch "robots.txt">
Header set X-Robots-Tag "noindex"
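
The same instruction can also be given per page with a meta robots tag, which is handy when you cannot touch the server configuration. A sketch: place it inside the <head> of the page you want to keep out of the index.

<meta name="robots" content="noindex, nofollow">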

SITEMAP

Sitemap .xml file

A sitemap is an XML file, following the Sitemap protocol, that lists all the pages available to be indexed by a crawler when it visits. It signals to search engines the important pieces of information found on the website.

What it Does
  1. Lists all the pages available to be indexed by a crawler when it visits.
Fixing Sitemap Issues
  1. The file should be located in the root directory of the website (e.g. http://www.site.com/sitemap.xml); a minimal example follows this list.
  2. If your sitemap isn’t indexed yet, submit it to the search engines (for example, through Google Search Console). Don’t forget to resubmit the sitemap after making changes to it.
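
For reference, a minimal valid sitemap.xml listing two pages might look like the sketch below; the URLs and date are placeholders.

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://domain.com/</loc>
    <lastmod>2019-01-01</lastmod>
  </url>
  <url>
    <loc>https://domain.com/about.html</loc>
  </url>
</urlset>
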
Important Tips
  1. An XML sitemap should be updated when new pages are added to the website and needs to be correctly coded.
  2. A valid .xml sitemap on your website does not guarantee that it will be flawlessly indexed by search engines. For better indexation, make sure the robots.txt file is also available in the root directory of your website.
  3. Ensure these files are free of errors and are kept current and available at all times.
  4. Find out how to create an .xml sitemap at sitemaps.org.

Redirect Factors

Check All Redirects

Website redirects can become cumbersome if the codes responsible for them are not properly installed, in the .htaccess file for instance.

Components of Redirect
  1. Domain with or without www version
  2. Issues with HTTP/HTTPS protocol
  3. Pages with 302 redirect
  4. Pages with long redirect chains
  5. Pages with a meta refresh
  6. Pages with rel="canonical"

REDIRECT OF DOMAIN

Domain With or Without (www) Version

A domain can be served with or without the www prefix, depending on the webmaster’s choice.

Causes
  1. The webmaster’s choice of preferred version
Penalty
  1. There is no penalty for indexing both the www and non-www versions.
Recommendation
  1. Set up and serve the primary www or non-www version of your site in the .htaccess file.
  2. You can also do the same by setting the preferred domain in Google Search Console.
Implementing the www and non-www versions

Add the appropriate code snippet to your .htaccess file, depending on whether you want the version with or without www.

# Redirect to a domain without www (use only one of the two blocks below)

RewriteEngine on
RewriteCond %{HTTPS} on
RewriteCond %{HTTP_HOST} ^www\.(.*)$ [NC]
RewriteRule .* https://%1%{REQUEST_URI} [R=301,L]

# Redirect to a domain with www

RewriteEngine on
RewriteCond %{HTTPS} on
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule .* https://www.%{HTTP_HOST}%{REQUEST_URI} [R=301,L]

Fixing this will help prevent search engines from indexing two versions of the website. Best practice is to set one version as the priority.

REDIRECT OF PROTOCOLS

Domain with HTTP/HTTPS

Issues with the HTTP/HTTPS protocol should be carefully checked to avoid conflict over which version is recognized by the search engines. The best way to fix this is to apply the following code snippet, instructing the search engine which protocol takes precedence over the other.

Causes
  1. You previously had your website hosted on the unencrypted protocol – HTTP
  2. You have just recently switched or upgraded from HTTP to HTTPS
Penalty
  1. Causes your website to be undervalued by search engines
  2. Confuses the search engines about which protocol should be prioritized
Recommendation
  1. In your root directory, locate the .htaccess file (via FTP)
  2. Open it with Notepad or any code editor and add the following code snippet
Implementing the redirect of HTTP to HTTPS

Add this code snippet to your .htaccess file to redirect all HTTP traffic to HTTPS

# HTTP to HTTPS:
RewriteEngine on
RewriteCond %{HTTPS} off
RewriteRule .* https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]

TEMPORARY REDIRECT

Domain with 302 Redirect

A 302 redirect is another HTTP status code, the response you get when a page is temporarily redirected to another link or file.

What it Does

Redirects a link temporarily.

Setbacks
  1. 302 redirects are temporary, so they do not signal a permanent move to search engines. If you use them instead of 301s, search engines might continue to index the old URL.
  2. They may cause search engines to disregard the new URL as a duplicate, or
  3. They may divide the link popularity between the two versions, thus hurting search rankings.
Recommendation

1. Check whether your .htaccess file contains something similar to the code snippet below

RewriteEngine On
RewriteCond %{HTTP_HOST} ^yourdomain.com [NC]
RewriteCond %{SERVER_PORT} 80
# [R] without an explicit code defaults to a temporary 302 redirect
RewriteRule ^(.*)$ https://www.yourdomain.com/$1 [R,L]

If it does, remove it and replace it with this:

# HTTP to HTTPS with a permanent 301 redirect:
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://www.yourdomain.com/$1 [R=301,L]

2. To make sure all redirects are set up correctly, also check your website for pages with a meta refresh and for pages with rel="canonical".

It is not recommended to use 302 redirects if you are permanently moving a page or a website.

LONG CHAIN REDIRECT

Long Chain Redirect

Page with Long Chain Redirect

A long redirect chain is a page whose URL passes through more than two redirects before reaching its final destination. Long redirect chains may also have a bad impact on your page ranking.

Possible Causes
  1. A poor .htaccess file setup may cause a page to end up with two or more redirects.
SEO Impacts
  1. There is a high risk that a page will not be indexed, as Google bots do not follow more than 5 redirects.
  2. Too many redirects will slow down your page speed. Every new redirect may add several seconds to the page load time.
  3. High bounce rate: users are not willing to stay on a page that takes more than 3 seconds to load.
Fixing Long Chain Redirects
  1. Check your .htaccess file and clean up your redirect rules, or remove the file completely and install a new one.
  2. Refer to the web audit suggestion pointing to this particular problem, identify the URLs with many redirects, and reduce each chain to one or two redirects at most (see the sketch after this list).
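
As a sketch (with hypothetical paths), collapsing a chain simply means pointing every old URL straight at the final destination instead of at the next hop:

# Before: a chain of hops (old -> interim -> final)
Redirect 301 /old.html /interim.html
Redirect 301 /interim.html /final.html

# After: each old URL redirects straight to the final destination
Redirect 301 /old.html /final.html
Redirect 301 /interim.html /final.html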

Meta Refresh Problems

Meta Refresh

Meta refresh is an HTML meta tag element with an http-equiv attribute whose content parameter is set to "refresh", instructing a web browser to automatically refresh or redirect the current web document after a given time interval.
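
For instance, a meta refresh that sends the visitor to another page after five seconds looks like this (the URL is a placeholder):

<meta http-equiv="refresh" content="5; url=https://domain.com/new-page.html">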

What it Does
  1. Instructs a web browser to automatically refresh or redirect the current web document after a given time interval.
  2. Creates confusion between crawlers and users
Violation
  1. Meta refresh may be seen as a violation of Google’s Quality Guidelines and therefore is not recommended from an SEO point of view.
Fixing Meta Refresh
  1. Make sure all redirects are set up correctly
  2. Check your website for pages with 302 redirects and for pages with rel="canonical"
  3. Adhere to permanent 301 redirects instead.

ENCODING

Encoding & Technical Factors

Encoding and technical factors cover issues such as:

  1. Website not being mobile friendly
  2. Pages with mixed content issues as a result of HTTP/HTTPS protocol conflicts
  3. Pages with multiple canonical URLs (the canonical tag is shown below)
  4. Pages with frames
  5. Pages with big files
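
For reference, a page declares its single preferred URL with a canonical link element in the <head>, and each page should carry exactly one. The URL below is a placeholder.

<link rel="canonical" href="https://domain.com/preferred-page.html">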

ENCODING - Mobile Usability

Mobile Issues

 Potential Mobile Issues

You can check the Mobile Usability report in Google Search Console -> Search Traffic -> Mobile Usability and fix potential mobile issues found on your site.

According to Google, the mobile-friendly algorithm affects mobile searches in all languages worldwide and has a significant impact on Google rankings.
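
Beyond the issues the report flags, one common culprit worth checking yourself is a missing viewport meta tag, without which mobile browsers render the page at desktop width. A typical declaration in the page <head> looks like this (adjust if your design requires different values):

<meta name="viewport" content="width=device-width, initial-scale=1">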

ENCODING - Mixed Content

Mixed Content Factor

Pages with Mixed Content Issues

A mixed content issue is caused by the presence of both HTTP and HTTPS resources in a web document, usually due to improper protocol redirection or an incomplete migration to a valid SSL certificate. Some browsers block pages with mixed content issues or flag them with insecure connection warnings. Google classifies mixed content into two types: active and passive mixed content.

Active mixed content interacts with the page as a whole and allows an attacker to do almost anything with the page; it includes scripts, stylesheets, iframes and Flash resources.

Passive mixed content refers to content that doesn’t interact with the rest of the page; it includes images, video, and audio content, along with other resources that cannot alter the rest of the page.

Possible Causes
  1. The inclusion of insecure (HTTP) content on pages served over HTTPS causes the browser to flag mixed content issues.
  2. If an HTTPS page includes content retrieved through regular, cleartext HTTP, this weakens the security of the entire page as the unencrypted content is accessible to sniffers and can be modified by man-in-the-middle attackers.
To-Do

If the content is passive, check the responsible element and change its protocol to the secure one. In this case, we are checking an image, which would look like this when embedded in the code:

img src ="http://domain.com/link-to-image" an insecure content served over HTTPS

Manually change it to:

img src ="https://domain.com/link-to-image" problem solved

While waiting, read the Keywords Mastermind to learn how to effectively plan your index terms and get to the top of the search engine results page.

TO BE CONTINUED…..

Pro Word.

“Having a beautiful website that is empty, or packed with junk that a search engine cannot recognize, is an absolute waste of resources. Spending time building constructive assets is the way forward.”

