Do You Know the SEO Hierarchy of Needs?

SEO Hierarchy of Needs

Maslow had his hierarchy of needs, and so does search engine optimization. With 33% of search traffic going to the first result and more than 50% going to the top three results, you'd better believe that SEO matters. (Source: Search Engine Watch)

Follow this guide to vastly improve your SEO. Let’s start at the bottom of the pyramid.


The foundation of the pyramid is crawlability: your site's ability to be found by search engines. Google explains,

“Crawlers look at webpages and follow links on those pages, much like you would if you were browsing content on the web. They go from link to link and bring data about those webpages back to Google’s servers.”

Yoast provides a helpful graphic for understanding this step.


When Google, Bing, Yahoo, or another search engine finds your website, it reads your robots.txt file. This file tells the search engine robot (also called a spider or crawler) which pages on your website it's allowed to crawl. You can find more on robots.txt files here.
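For example, a minimal robots.txt (the domain and blocked directory here are hypothetical) that lets every crawler in, keeps it out of the admin area, and points it at the sitemap might look like this:

```
# Allow all crawlers everywhere except the admin area
User-agent: *
Disallow: /wp-admin/

# Tell crawlers where the sitemap lives
Sitemap: https://example.com/sitemap.xml
```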

The crawler next looks at your HTTP response header. In it is a status code that tells the spider whether your site is okay. Status code 200 means that your site is healthy and open for business. You can find more on status codes here.
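Status codes fall into families that a crawler treats differently. Here's a minimal sketch (a hypothetical helper, not from any particular SEO tool) of how those families map to health:

```python
# A minimal sketch: classify HTTP status codes the way a crawler might.
def status_health(code: int) -> str:
    """Return a rough health label for an HTTP status code."""
    if 200 <= code < 300:
        return "healthy"        # e.g. 200 OK: page is open for business
    if 300 <= code < 400:
        return "redirect"       # e.g. 301/302: crawler follows the Location header
    if 400 <= code < 500:
        return "client error"   # e.g. 404 Not Found: a dead end for the crawler
    return "server error"       # e.g. 500/503: crawler may back off and retry

print(status_health(200))  # healthy
```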

A crawler follows the internal links on your website to understand what’s on your site. One way to help a crawler find the pages on your site is to link your pages.

Another good way to help the crawler is to create a sitemap. The easiest way I know to do this is to use a WordPress plugin that generates one for you. I use and recommend Yoast SEO for this and a number of other SEO-related tasks.
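A sitemap is just an XML list of your URLs. If you're curious what a plugin produces under the hood, here's a minimal hand-rolled sketch (the URLs are hypothetical examples):

```python
# A minimal sketch of building a sitemap.xml by hand; a plugin like
# Yoast SEO does this automatically.
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Return a sitemap.xml string listing the given URLs."""
    entries = "\n".join(
        f"  <url><loc>{escape(u)}</loc></url>" for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>"
    )

print(build_sitemap(["https://example.com/", "https://example.com/about"]))
```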


Are search engines finding all the pages on your website? A sitemap helps with this. Indexability is the ability of a search engine to find, analyze, and add your pages to its index. Site structure, internal link structure, broken links, server errors, other technical factors, and blocked crawler access can all cause indexability problems. Let's walk through a few of these.

Site structure and internal link structure work together to help web crawlers index your site. Crawlers can only find pages that are linked from your site or from someone else's. If your pages have no internal or external links pointing to them, they are harder for web crawlers to find, which hurts your indexability. Internal links also help crawlers understand how your site fits together.
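That link-following behavior can be sketched as a simple graph traversal: starting from the homepage, a crawler only discovers pages that some already-discovered page links to. The site structure below is a hypothetical example:

```python
# A minimal sketch of crawler discovery as a breadth-first traversal
# of a site's internal link graph.
from collections import deque

def reachable(link_graph, start="/"):
    """Return the set of pages a crawler can reach from the start page."""
    seen, queue = {start}, deque([start])
    while queue:
        page = queue.popleft()
        for target in link_graph.get(page, ()):
            if target not in seen:
                seen.add(target)
                queue.append(target)
    return seen

site = {"/": {"/about"}, "/about": {"/"}, "/hidden": set()}  # hypothetical
print(sorted(reachable(site)))  # ['/', '/about'] -- '/hidden' is never found
```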

Broken page redirects, also called broken links or broken URLs, are digital dead ends that will hurt your search engine rankings. There's an easy solution: use a broken link checker. I have one on my website and I run it regularly.
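The core of a broken link checker is simple. Here's a minimal sketch, assuming the HTTP statuses have already been fetched; in a real checker they'd come from requests to each URL, and the URLs below are hypothetical:

```python
# A minimal sketch of a broken-link report over already-fetched statuses.
def broken_links(statuses):
    """statuses: dict mapping URL -> HTTP status code. Returns dead ends."""
    return sorted(url for url, code in statuses.items() if code >= 400)

crawl = {
    "https://example.com/": 200,
    "https://example.com/old-page": 404,  # hypothetical dead end
    "https://example.com/removed": 410,
}
print(broken_links(crawl))
```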

Server errors can have a negative effect on your search engine rankings. When someone wants to visit your website, they click a link or type in your web address. This request goes to your server, which answers with a server header status code (I know, too much information). That code could be "HTTP/1.1 200 OK," which is fantastic. But it could also be an error code like "404 Not Found" or "410 Gone," which tells the crawler the page is a dead end, or a redirect like "302 Found," which is worth checking to make sure it's intentional. Error codes will hurt your SEO.

Other technical factors include poor site speed, unsupported scripts, forms, gated content, spam comments, low-quality links, black hat SEO tactics, outdated technologies, and even bad spelling and grammar. Granted, spelling and grammar may not seem like technical issues, but they can affect your search rankings.

Blocking web crawler access is a no-no, except for pages that you want to restrict from public access. It's easy to mistakenly block other pages from web crawlers, so it's good to check your robots.txt file.

If your robots.txt file contains "User-agent: *" followed by "Disallow: /", it is blocking web crawler access to your entire website. What you want instead is "Disallow:" with nothing after it, or no blanket Disallow rule at all, which lets crawlers in.

So, if you aren't seeing what you should be seeing, correct your robots.txt file.


Accessibility measures how easily your website can be displayed or rendered. Factors for accessibility include server performance, HTTP status, load time, JavaScript rendering, orphan pages (pages with no incoming links), and your website's resistance to hacking and spam.
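Orphan pages can be found mechanically from the internal link graph: they are the pages nothing else links to. A minimal sketch, with hypothetical page names:

```python
# A minimal sketch: find orphan pages (pages no other page links to)
# in a site's internal link graph.
def orphan_pages(link_graph):
    """link_graph: dict mapping page -> set of pages it links to."""
    linked_to = set().union(*link_graph.values()) if link_graph else set()
    return sorted(page for page in link_graph if page not in linked_to)

site = {
    "/": {"/about", "/blog"},
    "/about": {"/"},
    "/blog": {"/"},
    "/old-landing-page": {"/"},  # nothing links here: an orphan
}
print(orphan_pages(site))  # ['/old-landing-page']
```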


Rankability builds on everything we've already covered for SEO. Let's add two more elements to this picture: high-quality content and keywords. Keywords and key phrases that point to high-quality original content are winners for SEO. For SEO purposes, aim for a minimum of 1,000 words, and preferably over 1,500. Headings, paragraphs, lists, alt text for images, and metadata all help you rank higher.
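A few of these signals can be checked automatically. Here's a minimal sketch using Python's standard html.parser; the HTML snippet is a hypothetical example, and a real audit would check many more signals:

```python
# A minimal sketch of an on-page content audit: word count, heading
# count, and images missing alt text.
from html.parser import HTMLParser

class ContentAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.words = 0
        self.headings = 0
        self.images_missing_alt = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2", "h3", "h4", "h5", "h6"):
            self.headings += 1
        if tag == "img" and not dict(attrs).get("alt"):
            self.images_missing_alt += 1

    def handle_data(self, data):
        self.words += len(data.split())

audit = ContentAudit()
audit.feed("<h1>Title</h1><p>Some body text here.</p><img src='a.png'>")
print(audit.words, audit.headings, audit.images_missing_alt)  # 5 1 1
```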

Speaking of high-quality original content, I hope that you’ve found this helpful for your website.

Add your comments and questions, and your likes and shares!
