Hummingbird and SEO in the Mobile Era

25 October 2013

SEO in the Mobile Era

Google’s latest update, Hummingbird, is out, and many are wondering where that leaves us regarding SEO. The fact of the matter is that the update was rolled out gradually in the weeks before its official announcement, so if you have not been affected by now, you probably won’t be. What really excites me is that Hummingbird is one move among many in a direction that is trying to make the internet a better place for users, designers, developers and SEOs. When I say “make the internet a better place” I mean that when someone searches for something, they generally want to find it. They don’t want to wade through pages of irrelevant results to find what they are really looking for because the first sites to rank are not the most relevant ones, just the ones that were (incorrectly) optimized better than the relevant sites.

The Hummingbird update is mostly about one thing: graph search, also known as semantic search. This basically means that Google is able to look at a site as an “entity” rather than a list of keywords. It allows a user to type an actual question into the search engine, e.g. “is Canon ever going to be better than Nikon” (had to throw that one in), as opposed to a string of keywords such as “Canon, Nikon, comparison”. When Google has a better overall view of what each site is about, it can answer these questions with sites that are more relevant and applicable to the search query.

Before we continue, let’s take a brief look at past events in the history of search engines and websites. First we need to understand that all search engines travel through links. That’s how they get around and that’s how they get to your site. They literally crawl the web, gathering information about all the sites they encounter. This information is then stored and used to produce the SERPs (Search Engine Results Pages), which are simply the results we get when we do a search. The more links that lead to your site, the more the search engine assumes your site is a good one, and the higher you rank in those SERPs. The subject of the site, meanwhile, was relayed to the search engine in the form of “keywords”; this is how the search engine knew what your site was about. This is a very basic breakdown of a very elaborate process. To rank in the SERPs, people used to stuff their pages with irrelevant keywords and create unnatural links through “link farms” and other methods. As you can well imagine, this started to fill the SERPs with very irrelevant links.

Google’s search spiders have recently learned to detect unnatural links and to penalize the pages they point to. Google has also done away with keywords. Please note that, to date, only Google has done away with keywords: having keywords won’t help you, and having too many (I find around eight relevant keywords per page is the limit) will result in a penalty, so if you do use them, use them correctly and sparingly.
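If by “keywords” we mean the classic meta keywords tag, a minimal example looks like this (the terms themselves are placeholders, not a recommendation):

```html
<!-- The classic meta keywords tag. Per the article, it no longer helps
     your ranking, and overstuffing it can result in a penalty.
     The terms below are placeholders for illustration only. -->
<meta name="keywords" content="wedding photography, portrait photography, photography tips">
```

The same principle applies to keywords used in the page copy itself: keep them relevant and keep them sparse.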

Another issue affecting SEO (Search Engine Optimization) nowadays is the mobile era. When I first heard of “mobile”, my immediate thought was of a teenager surfing the web on a mobile phone, which hardly seemed like a massive market, so why bother? What many tend to forget is that the number of people surfing the web on tablets is on the increase. The number of people who own a tablet and use it on a daily basis has grown exponentially, and existing operating systems have changed their entire workings to accommodate this already massive and ever-growing market.

“How do I do SEO now?”

Essentially, everything mentioned in the search engine webmaster guidelines still holds true, but there are some increasingly important new factors that are pivotal to SEO in this era:

  • Natural Links
  • Responsive Websites
  • Website Page Load Speeds
  • Structured Data

Natural Links

Google’s intention is for people to value the content on your pages and to link to them. This was always the intention, but links have been seriously abused in the past. SEOs used to make fortunes creating unnatural links to sites to boost rankings. Now, with the recent updates, many are scrambling to find and remove unnatural links to prevent pages from becoming de-indexed. For me this is a great step in the right direction, as it is forcing website owners to start providing relevant, quality content that will create natural links, and it also forces SEOs to move more in the direction of on-site SEO, which is what they were supposed to do in the first place. You can see who is linking to your website by going to Alexa or Open Site Explorer; both are free, easy-to-use tools, and you just enter your website address.

“it is forcing website owners to start providing relevant, quality content that will create natural links”

Responsive Websites

An Unresponsive Website

Responsive websites are basically websites that can adapt to different screen sizes, such as mobile screens. A simple test is to navigate to your page and “restore” the browser so the window is no longer maximised. Then re-size the window and see if the text columns adjust to the window size. Make the window roughly the size of a tablet screen, and then roughly the size of your mobile phone screen, and see if the columns adjust to accommodate each screen size.

A Responsive Website

If the text just “falls off” the page, as in the image above, your website is not responsive. (I was horrified to find that none of my own sites are responsive.) If the columns adjust, as in the image on the left, your website is responsive. If your website is not responsive, don’t panic; most aren’t, especially older websites created before the “mobile era”. You should, however, have a chat with your web developer to get this resolved soon.
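As a rough sketch of how a developer typically achieves this (the class names and the 768px breakpoint are illustrative choices, not a prescription), responsive layouts rely on a viewport meta tag and CSS media queries so that multi-column text stacks into a single column on narrow screens:

```html
<!-- Tell mobile browsers to use the device width instead of a zoomed-out desktop view. -->
<meta name="viewport" content="width=device-width, initial-scale=1">

<style>
  /* Two side-by-side columns on wide screens. */
  .column { float: left; width: 50%; padding: 1em; box-sizing: border-box; }

  /* On screens narrower than 768px (an illustrative breakpoint),
     the columns stack into a single full-width column. */
  @media (max-width: 768px) {
    .column { float: none; width: 100%; }
  }
</style>

<div class="column">First column of text…</div>
<div class="column">Second column of text…</div>
```

Resizing the browser window across that breakpoint is exactly the “do the columns adjust?” test described above.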

“If your website is not responsive, don’t panic; most aren’t, but you should have a chat with your web developer to get this resolved soon”

Website Page Load Speeds

Page load speed has been a factor in SEO for a while, but recent changes have made search engines stricter about it, again as a result of the mobile era. There are many aspects affecting page load speed, but I find the largest culprit to be image sizes. Images have to be optimized with suitable image-processing software, and it is also important to define your image dimensions in your HTML. Using CSS sprites, as well as clean HTML and JavaScript, will also improve page load speed. Again, in my opinion, all these factors are going to cause web developers to seriously up their game to stay on top. It is advisable for website owners to go to GTmetrix to check their page load speed. GTmetrix is another extremely useful and completely free tool: all you have to do is enter your URL to get your results. For web developers, GTmetrix offers a detailed report on the factors affecting the page load speed as well as the actual size of the page.
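As a quick illustration (the file names, dimensions and class names are placeholders), defining image dimensions in the HTML and serving several small icons from a single sprite image might look like this:

```html
<!-- Declaring width and height lets the browser reserve the space
     before the image downloads, so the page does not reflow. -->
<img src="photo.jpg" width="600" height="400" alt="Product photo">

<!-- A CSS sprite: several icons live in one image (icons.png is a
     placeholder), so the browser makes a single request for all of them. -->
<style>
  .icon {
    display: inline-block;
    width: 32px;
    height: 32px;
    background-image: url("icons.png");
  }
  .icon-home   { background-position: 0 0; }      /* first 32x32 tile  */
  .icon-search { background-position: -32px 0; }  /* second 32x32 tile */
</style>
<span class="icon icon-home"></span>
<span class="icon icon-search"></span>
```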

“all these factors are going to cause web developers to seriously up their game to stay on top”

Structured Data

This is by far my favourite SEO technique. What it essentially entails is defining and attaching properties to the existing content on a website page. There are standard sets of properties, which can be found at schema.org. Currently only Google’s search spiders use this schema, but I can assure you that the other search engines, Bing and Yahoo, are not very far behind. There are a number of reasons why I enjoy structured data so much. Firstly, it is the fuel behind semantic search. As an SEO, I am assured that by using the schemas correctly, I am communicating as much as possible about the website as an entity. When Google knows that much more about you, the possibilities of appearing in the searches Google finds relevant in its graph search are endless, and this increases the scope of appearing in the SERPs far more than was ever possible with keywords. Keywords were always “behind the scenes”, which allowed people to stuff in as many as possible, mostly irrelevant, to try to appear in as many searches as possible. Structured data, on the other hand, is applied to existing text, amongst other elements, and I have found that this also helps in writing more relevant text and structuring it correctly. People cannot cheat the system either, as you can only mark up the elements that actually appear on the page, which also prevents “fly-by-night” SEOs from trying to operate. Google offers a completely free Structured Data Testing Tool. It is very simple: navigate to the tool, enter either your URL or the HTML of the page, and Google will display the structured data on the page. Basically, it shows you how Google sees your page.
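As a minimal sketch of what this looks like in practice (the business name, location and URL below are placeholders), marking up existing content with schema.org microdata is a matter of adding itemscope, itemtype and itemprop attributes to the HTML you already have:

```html
<!-- schema.org microdata: the attributes describe the visible content.
     All names and values here are placeholders for illustration. -->
<div itemscope itemtype="http://schema.org/LocalBusiness">
  <h1 itemprop="name">Example Photography Studio</h1>
  <p itemprop="description">Portrait and wedding photography.</p>
  <div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
    <span itemprop="addressLocality">Cape Town</span>,
    <span itemprop="addressCountry">South Africa</span>
  </div>
  <a itemprop="url" href="http://www.example.com">www.example.com</a>
</div>
```

Pasting either the page URL or this HTML into the Structured Data Testing Tool will list the LocalBusiness item and its properties, which is exactly the “how Google sees your page” view described above.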

“always towards a better internet” ~