It’s hard to overstate the importance of search engines in modern life. The number of searches conducted every day worldwide currently stands at over 6.6 billion, while Google has become a verb and part of everyday language. As online stores replace high street brands and our lives become increasingly reliant on internet access, search engines are the gatekeepers and librarians directing us towards relevant online content.
Google now handles three quarters of the global search engine market, far outstripping the Bing/Yahoo partnership or China’s Baidu. And yet, like its rivals, Google cloaks the precise workings of its ranking algorithm in secrecy. Microsoft is equally coy about how Bing’s own immensely powerful algorithm orders and ranks web pages. This is partly to prevent black hat marketers from developing workarounds to cheat the system, though it also allows for constant fine-tuning in response to evolving web trends.
Below are ten things we know search engines consider when ranking a website against its peers and competitors:
- Its top-level domain. There are over 1,500 top-level domains in existence, forming the final part of any web address. Some are country code TLDs, like .uk; others are generic, such as .shop or .blog. A ccTLD will perform better in searches conducted in that country, while gTLDs offer no specific benefits or drawbacks. Yet although search engines claim not to discriminate between established gTLDs like .com and newer arrivals, audiences distrust unfamiliar TLDs. This reduces web traffic volumes, which in turn affects the next SEO criterion…
- User activity. A key factor in a site’s SEO ranking is its popularity. People shy away from new or niche gTLDs, but they’re more likely to visit an address they trust, and the resulting traffic signals that the platform is a valuable asset. Search engines can track how long visitors spend on each page, how they navigate around it and where they leave. Longevity matters too: with the average lifespan of a site being just over three years, Bing gives established platforms considerably more credence than newer ones.
- Mobile-friendly design. Most web traffic is carried on smartphones and tablets nowadays, rather than desktops. Search engines can estimate how long content will take to download across patchy Wi-Fi or 4G connections, elevating the ranking results of fast-loading portals. An optimal website should have a responsive framework that adapts to each device’s screen size, reformatting menus into dropdown hamburger menus, for example, without losing any content. Mobile-only sites are as frowned upon nowadays as homepages powered by Flash, while automatically playing media content increases loading times and therefore damages SEO rankings.
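As a minimal sketch of what that responsive framework looks like in practice, the fragment below pairs a viewport meta tag with a CSS media query; the breakpoint width and class names are illustrative assumptions, not a prescribed standard.

```html
<head>
  <!-- Tells mobile browsers to render at the device's real width -->
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <style>
    .nav { display: flex; }          /* full menu on wide screens */
    .nav-toggle { display: none; }   /* hamburger button hidden by default */

    /* Below an assumed 600px breakpoint, collapse the menu */
    @media (max-width: 600px) {
      .nav { display: none; }
      .nav-toggle { display: block; }
    }
  </style>
</head>
```

The same content is served to every device; only the layout changes, which is exactly what distinguishes a responsive site from a separate mobile-only one.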
- Keyword usage and density. Using the same keyword several times on a web page implies expertise, which means searches for that word will be ranked more highly. The same is true of more detailed phrases, known as long tails. However, it’s easy to overdo this, a practice known as keyword stuffing. Sophisticated machine learning algorithms enable search engines to determine when content is being aimed at them rather than people. Nowadays, they actively mark down sites with excessive keyword density.
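Keyword density is simply the share of words on a page that match the target term. The sketch below shows one plausible way to measure it; the sample text is invented, and search engines do not publish the thresholds at which density starts to count against a page.

```python
import re


def keyword_density(text: str, keyword: str) -> float:
    """Fraction of words on the page matching the keyword (case-insensitive)."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)


# Hypothetical page copy: "bread" appears 3 times out of 14 words
page = ("Our bakery sells fresh bread daily. "
        "Bread lovers visit our bakery for sourdough bread.")
print(f"{keyword_density(page, 'bread'):.1%}")  # roughly one word in five
```

A tool like this is useful for spotting your own over-optimisation before a crawler does, rather than for chasing any particular "ideal" percentage.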
- Frequent updates. One way to boost keyword performance without saturating individual pages is by regularly uploading new content. Blogs and news pages are ideal platforms for adding key terms, while a regularly updated site is seen as more authoritative, boosting its ranking results. Don’t delete old content, though, since search engines dislike disappearing pages or text. Bing doesn’t scan pages as frequently as Google, so it’s more likely to link to content that’s been removed.
- Metadata. There are plenty of places other than the body content where you can incorporate keywords and boost SEO performance. The alt text used to generate written captions for images is a great way of squeezing in additional keywords or long tails. HTML offers a variety of tags to indicate each page’s topic or content, like meta descriptions, the snippets of text displayed beneath a page’s title in search results. With effective keyword deployment, their content becomes a key driver of SEO rankings.
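The fragment below shows where those elements sit in a page; the business, filenames and wording are invented purely for illustration.

```html
<head>
  <title>Handmade Oak Furniture | Example Workshop</title>
  <!-- The meta description often becomes the snippet shown in results -->
  <meta name="description"
        content="Hand-finished oak tables and chairs, made to order in Yorkshire.">
</head>
<body>
  <!-- Alt text captions the image and can carry a relevant long-tail phrase -->
  <img src="workbench.jpg" alt="Carpenter sanding a handmade oak dining table">
</body>
```

Alt text should still describe the image honestly; stuffing it with unrelated keywords invites the same penalties as stuffing body copy.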
- Interstitials. These are the prominent advertising-style graphics that appear after pages load, often greying out or obscuring the background. They require caution, since search engines have started treating screen-filling interstitial ads as the page’s actual content. Because they’re usually little more than a call to action, this can destroy any value in the content beneath them.
- A robots.txt file. This is effectively a set of instructions telling search engine crawlers which pages are on the site. It can also identify pages that shouldn’t be included in the results, like intranet pages or staging areas where content is hidden from public view before launch. A robots.txt file is often complemented by an XML sitemap, which lists the site’s pages and shows crawlers how they link together.
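A typical robots.txt is only a few lines long. The example below follows the standard directives; the domain and directory names are placeholders.

```txt
# Served from the site root, e.g. https://example.com/robots.txt
User-agent: *
Disallow: /staging/
Disallow: /intranet/

# Point crawlers at the XML sitemap
Sitemap: https://example.com/sitemap.xml
```

Note that robots.txt is a request, not access control: it keeps well-behaved crawlers out of listed paths, but anything genuinely private still needs authentication.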
- Backlinks. Once upon a time, the number of links to a website provided a measure of a site’s SEO ranking. Low-cost link farms flourished, and their historical efforts to cheat the system mean that any site listed on a dodgy farm now gets downgraded. Quality rather than quantity is the new mantra, with a link from academic portals or widely-known media sites being worth far more than a link in a personal blog or a YouTube comment.
- Social media accounts. Despite the greater value of authoritative links, inbound traffic from social media platforms still carries value. Platforms such as Facebook and Twitter are deemed to be reputable drivers of inbound visitors, and Bing is particularly attentive to the number of likes and shares a page receives when calculating rankings. These social signals can flag up web content that’s gone viral or sparked a debate.
It should be noted that the UK’s two dominant search engines differ in a number of key respects. For instance, Bing tries to prioritise local companies over national brands, whereas Google adopts the opposite approach. Nevertheless, adhering to the guidelines laid out in this article should help your website perform far more strongly next time it’s ranked by a search engine…