Lesson 3: Site Architecture & Search Engine Success Factors
The next major On-The-Page group in the Periodic Table Of SEO Success Factors is site architecture. The right site structure can help your SEO efforts flourish while the wrong one can cripple them.
Ac: Site crawlability
Search engines “crawl” websites, going from one page to another incredibly quickly, acting like hyperactive speed-readers. They make copies of your pages that get stored in what’s called an “index,” which is like a massive book of the web.
When someone searches, the search engine flips through this big book, finds all the relevant pages and then picks out what it thinks are the very best ones to show first. To be found, you have to be in the book. To be in the book, you have to be crawled.
Each site is given a crawl budget, an approximate amount of time or number of pages a search engine will crawl each day, based on the relative trust and authority of the site. Larger sites may seek to improve their crawl efficiency to ensure that the “right” pages are being crawled more often. The use of robots.txt, internal link structures and specifically telling search engines not to crawl pages with certain URL parameters can all improve crawl efficiency.
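For instance, a robots.txt file can tell crawlers to skip low-value URLs so the crawl budget goes to the pages that matter. A minimal sketch, with hypothetical paths and parameter names:

```
# Applies to all crawlers
User-agent: *
# Keep internal search results out of the crawl (hypothetical path)
Disallow: /search
# Skip URLs generated by a hypothetical "sort" filtering parameter
Disallow: /*?sort=
```

Remember that robots.txt controls crawling, not indexing; a blocked URL can still appear in results if other sites link to it.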
Am: Mobile-friendly

More Google searches happen on mobile devices than on desktop. Given this, it’s no wonder that Google is rewarding sites that are mobile-friendly with a chance of better rankings on mobile searches, while those that aren’t might have a harder time appearing. Bing is doing the same.
So get your site mobile-friendly. You’ll increase your chance of success with search rankings as well as make your mobile visitors happy. In addition, if you have an app, consider making use of app indexing and linking, which both search engines offer.
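What counts as mobile-friendly varies by site, but at a minimum a responsive page declares a viewport so mobile browsers scale it to the screen instead of rendering it at desktop width. A typical tag looks like this:

```html
<meta name="viewport" content="width=device-width, initial-scale=1">
```

This goes in the page’s head; the rest of a mobile-friendly design (readable text, tap targets, no horizontal scrolling) builds on it.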
Ad: Duplicate content

Sometimes that big book, the search index, gets messy. Flipping through it, a search engine might find page after page after page of what looks like virtually the same content, making it difficult for the engine to figure out which of those many pages it should return for a given search. This is not good.
It gets even worse if people are actively linking to different versions of the same page. Those links, an indicator of trust and authority, are suddenly split between those versions. The result is a distorted (and lower) perception of the true value users have assigned that page. That’s why canonicalization is so important.
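In practice, the preferred version of a page is declared with a rel=canonical tag in the page’s head; the URL here is hypothetical:

```html
<link rel="canonical" href="https://www.example.com/red-dresses/">
```

When duplicate or near-duplicate variants carry this tag, search engines can consolidate their signals onto the one canonical URL.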
You only want one version of a page to be available to search engines.
There are many ways duplicate versions of a page can creep into existence. A site may have www and non-www versions instead of redirecting one to the other. An e-commerce site may allow search engines to index its paginated pages, but no one is going to search for “page 9 red dresses.” Or filtering parameters might be appended to a URL, making it look (to a search engine) like a different page.
For as many ways as there are to create URL bloat inadvertently, there are ways to address it. Proper implementation of 301 redirects, the use of rel=canonical tags, managing URL parameters and effective pagination strategies can all help ensure you’re running a tight ship.
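As an illustration of managing URL parameters, a site might normalize filter and tracking parameters out of its URLs before exposing them to search engines. This is a minimal sketch using Python’s standard library; the parameter names are hypothetical and would depend on your site:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical parameters that create duplicate-looking URLs
DUPLICATING_PARAMS = {"sort", "color", "utm_source"}

def canonicalize(url):
    """Strip filtering/tracking parameters so duplicate URLs collapse to one."""
    parts = urlsplit(url)
    query = [(k, v) for k, v in parse_qsl(parts.query)
             if k not in DUPLICATING_PARAMS]
    # Rebuild the URL without the stripped parameters (and without a fragment)
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(query), ""))

print(canonicalize("https://www.example.com/dresses?color=red&page=9"))
# → https://www.example.com/dresses?page=9
```

The same idea can be expressed declaratively to search engines via their URL-parameter handling tools, rather than in application code.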
As: Site speed
Google wants to make the web a faster place and has declared that speedy sites get a small ranking advantage over slower sites.
However, making your site blisteringly fast isn’t a guaranteed express ride to the top of search results. Speed is a minor factor that impacts just one in 100 queries, according to Google.
But speed can reinforce other ranking factors and may actually improve some of them. We’re an impatient bunch these days, especially when we’re on our mobile devices! So engagement (and conversion) on a site may improve thanks to a speedy load time.
Speed up your site! Search engines and humans will both appreciate it.
Au: Are your URLs descriptive?
Yes. Having the words you want to be found for within your domain name or page URLs can help your ranking prospects. It’s not a major factor, but if it makes sense to have descriptive words in your URLs, do so.
Ah: HTTPS/secure site
Google would like to see the entire web running HTTPS servers, in order to provide better security to web surfers. To help make this happen, it rewards sites that use HTTPS with a small ranking boost.
As with the site speed boost, this is just one of many factors Google uses when deciding if a web page should rank well. It alone doesn’t guarantee getting into the top results. But if you’re thinking about running a secure site anyway, then this might help contribute to your overall search success.
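If you do make the move, the HTTP version of your site should permanently redirect to HTTPS so existing links consolidate onto the secure URLs. A minimal nginx sketch (the server names are hypothetical, and your setup may differ):

```nginx
server {
    listen 80;
    server_name example.com www.example.com;
    # Permanent (301) redirect: send all HTTP requests to the HTTPS site
    return 301 https://www.example.com$request_uri;
}
```

The 301 tells search engines the move is permanent, which helps transfer the ranking signals from the old HTTP URLs.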