The periodic table of SEO success factors part 6
November 13, 2015

In the previous article, I explained the first column of the periodic table of SEO success factors. In this article, I will cover the next on-page group: site architecture. The right site structure can help improve your rankings, while the wrong structure can harm them.
Ac: Site Crawl
Search engines like Google or Bing have powerful software called crawlers of robots which “crawl” websites, going from one page to another very quickly. They make copies of your pages that get stored in what’s called an “index,” which is like a massive book of the web.
When someone searches, the search engine flips through this big book, finds all the relevant pages and then picks out what it thinks are the very best ones to show first. To be found, you have to be in the book. To be in the book, you have to be crawled.
Each site is given a crawl budget, an approximate amount of time or number of pages a search engine will crawl each day, based on the relative trust and authority of the site. Larger sites may seek to improve their crawl efficiency to ensure that the "right" pages are being crawled more often. The use of robots.txt, internal link structures and specifically telling search engines not to crawl pages with certain URL parameters can all improve crawl efficiency.
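As a sketch of the robots.txt approach, the file below keeps crawlers away from filtered and session URLs so the crawl budget goes to real content pages. The parameter names (`sort`, `sessionid`) and the domain are hypothetical; wildcard patterns like these are supported by Google and Bing, though not by every crawler.

```
# robots.txt — placed at the root of the site, e.g. http://www.example.com/robots.txt
User-agent: *
# Keep crawlers out of filtered/session URL variants that waste crawl budget
Disallow: /*?sort=
Disallow: /*?sessionid=
```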
However, for most sites, crawl problems can be easily avoided. In addition, it's good practice to use sitemaps, both HTML and XML, to make it easy for search engines to crawl your site.
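A minimal XML sitemap, following the sitemaps.org protocol, looks like the fragment below. The URL and dates are placeholders; each `<url>` entry lists one page you want crawled, and the optional `<lastmod>` and `<changefreq>` fields hint at how fresh the page is.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2015-11-13</lastmod>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>http://www.example.com/red-dresses/</loc>
    <changefreq>daily</changefreq>
  </url>
</urlset>
```

The sitemap is typically saved as `sitemap.xml` at the site root and submitted through Google Search Console and Bing Webmaster Tools.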
You must always remember that in designing a website you have to design for both search engines and humans!
Ad: Duplication
Sometimes that big book, the search index, gets messy. Flipping through it, a search engine might find page after page after page of what looks like virtually the same content, making it more difficult for it to figure out which of those many pages it should return for a given search. This is not good.
It gets even worse if people are actively linking to different versions of the same page. Those links, an indicator of trust and authority, are suddenly split between those versions, so search engines see a lower value than users have actually assigned that page. That's why canonicalization is so important.
You only want one version of a page to be available to search engines.
There are many ways duplicate versions of a page can creep into existence. A site may serve both www and non-www versions instead of redirecting one to the other. An e-commerce site may allow search engines to index its paginated pages, but no one searches for "page 9 red dresses." Or filtering parameters might be appended to a URL, making it look (to a search engine) like a different page.
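To see how several URL variants can all point at one page, here is a small Python sketch that collapses the common variants described above (www vs. non-www, trailing slashes, tracking and filter parameters) into one form. The parameter list and domain are illustrative assumptions, not a definitive rule set.

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical parameters that change the URL but not the page content.
IGNORED_PARAMS = {"utm_source", "utm_medium", "sessionid", "sort"}

def normalize(url: str) -> str:
    """Collapse common duplicate URL variants into one canonical form."""
    scheme, netloc, path, query, _fragment = urlsplit(url)
    # Treat www and non-www hosts as the same site.
    if netloc.startswith("www."):
        netloc = netloc[4:]
    # Drop parameters that don't change the content the page serves.
    kept = [(k, v) for k, v in parse_qsl(query) if k not in IGNORED_PARAMS]
    # Ignore trailing-slash differences.
    path = path.rstrip("/") or "/"
    return urlunsplit((scheme.lower(), netloc.lower(), path, urlencode(kept), ""))
```

With this sketch, `http://www.example.com/dresses/?utm_source=mail` and `http://example.com/dresses` normalize to the same URL, which is exactly the grouping a search engine has to work out for itself when a site doesn't canonicalize.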
For as many ways as there are to create URL bloat inadvertently, there are ways to address it. Proper implementation of 301 redirects, the use of rel=canonical tags, managing URL parameters, and effective pagination strategies can all help ensure you’re running a tight ship.
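Two of those fixes can be sketched briefly. First, a rel=canonical tag in the `<head>` of every duplicate variant tells search engines which single URL should receive the credit (the URL here is a placeholder):

```html
<!-- On every duplicate variant, point search engines at the one preferred URL -->
<link rel="canonical" href="http://www.example.com/red-dresses/" />
```

Second, a 301 redirect consolidates the www/non-www split at the server. This is one possible Apache `.htaccess` version, assuming mod_rewrite is enabled; other servers have their own syntax:

```
# .htaccess sketch (Apache): permanently redirect non-www requests to www
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The 301 handles visitors and link equity at the protocol level, while rel=canonical covers duplicates you can't redirect, such as filtered or paginated variants that users still need to reach.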
Am: Mobile Friendly
Search queries from mobile devices are growing rapidly. In 2015, Google confirmed that more searches now come from mobile devices than from desktop computers in many countries, and that gap is expected to keep widening in 2016.
So get your site mobile-friendly. You'll increase your chance of success with search rankings while making your mobile visitors happy. In addition, if you have an app, consider making use of the app indexing and linking that both search engines offer.
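One small but essential piece of a mobile-friendly page is the viewport meta tag, which tells mobile browsers to render the page at the device's width instead of a zoomed-out desktop layout. It belongs in the `<head>` of every page:

```html
<!-- Responsive viewport: render at device width rather than a shrunken desktop view -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```

On its own this tag doesn't make a layout responsive, but without it, CSS media queries and mobile-friendly tests generally won't behave as expected.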