Search Engine Webmaster Guidelines


The following is a list of guidelines on developing and optimizing a website. These search engine webmaster guidelines are relatively generic, but most hold true across the board; the only major variable between search engines is the amount of weight each guideline carries. Following these guidelines will not only get your website indexed by the major search engines, it will also improve the visual appeal of your website, as search engines view pages in a similar manner to the way humans do.

It is important to adhere to these guidelines, as failing to do so can result in your website being penalized by losing ranking or being de-indexed by search engines.

Make Pages For Users, Not Search Engines

It is sometimes tricky not to do this. One tends to keep SEO in the back of one's mind when populating a site, and that is where it should stay. All the SEO in the world is absolutely useless if your website is visually unattractive, difficult to navigate and fails to answer the three most important questions visitors ask:

  1. Can they do what I want?
  2. Where are they?
  3. How much does it cost?

If your visitors can find their answers rapidly in a visually pleasant environment, they will easily be converted into clients.

My advice on doing this is to initially populate your website, allowing your content to flow freely. When you are finished populating the entire website, return to each page and apply your SEO in a natural flowing way.

Always ask your friends to take a look at your site and offer feedback. It is common for a designer to design something that makes perfect sense to them but makes no sense to others. Also try to avoid elements like audio, popups and video that start spontaneously when a page opens.

“One tends to keep SEO in the back of one's mind when populating a site. That is where it should stay.”

Don’t Deceive Your Users

As much as users dislike being deceived, so do search engines. If you are producing honest content for users, this should not concern you; however, I do recommend you take note of the following practices that search engines deem deceptive, in case you are unknowingly using any of these tactics:

  • Redirecting users to where they don’t want to go. Links should lead to places relevant to the anchor text and the content describing the link destination.
  • Fake content or fake markup (such as structured data) with untrue properties will result in penalties and a loss of visitors.
  • Duplicate content may not be discovered by users but will be discovered by search engines. Keep your content original.

“The consequences of a user simply closing a tab if they feel they are being deceived are minimal compared to the consequences when a search engine is being deceived.”

Use Accurate and Descriptive Page Titles and Alt Tags

[Image: an example of where the title tag appears]

Titles can be found at the top of your browser tab (next to the favicon) and at the top of your result in a search engine. It is important to describe your page accurately in this title, using keywords that are relevant only to that particular page. Avoid keyword stuffing here; it will result in a penalty. As a guideline, work your main keyword into a description of the page that is user friendly for search engines and compelling for humans. Since I always advise optimizing for all search engines, it is interesting to note that Bing values a link to a page more highly when the anchor text matches keywords in the title of the destination page and keywords in the H1 header of that page, so bear that in mind when composing page titles and internal links. Make sure each page has a unique page title.
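
As a rough sketch of the idea (the business name and keyword here are hypothetical), a descriptive, page-specific title lives in the page's HTML head:

    <head>
      <!-- A unique, descriptive title: the page's main keyword worked into readable copy -->
      <title>Studio Photography in Randburg | Example Photography Studio</title>
    </head>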

“…Bing values a link to a page if the anchor text matches keywords in the title of the destination page…”

To check what your page titles look like, type “site:yoururl.com” (for example, “site:thatwhitehatwizard.com”) into a search engine. This will give you a list of all the pages that have been indexed and show how they will appear in the SERPs.

Alt tags are “descriptions” of the image. They describe the image to blind users and to search engines. One must always bear in mind that search engines cannot see or analyze images; they only see text. A rough way to see what a search engine can actually read on a page is to navigate to the page and press “Ctrl+A” to select everything: the highlighted text is what it can read, while the solid blue blocks (images and other embedded elements) are what it cannot. Image alt tags will not appear here; to view your image alt tags, it is advisable to navigate to the page you want to check, then copy and paste the URL into an image alt tag checker for an accurate description. Remembering that blind users have pages, including alt tags, read out to them, it is important to make these alt tags compelling and descriptive and to refrain from keyword stuffing. You may not have many blind users on your page, but always remember that search engines are also blind users and will penalize or reward you according to your alt tags.
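
A minimal sketch of a descriptive alt tag (the file name and description are hypothetical):

    <!-- The alt text describes the image to blind users and to search engines -->
    <img src="studio-portrait.jpg" alt="Studio portrait of a bride holding a bouquet of roses">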

“…search engines are also blind users…”

Make Sure a Search Engine Can Understand Your Website

Following on from the previous concept of search engines being “blind”, it is important to ensure that a search engine can easily read and understand your website. Generally, if your website is in HTML, it will have no problem. The issues come in when you start adding fancy elements like Java, iFrames, Flash and PDFs. These are not easily read by search engines and are also “orphaned”. Basically, when a search engine encounters an item like a .pdf, for example, it can read it and index it, but it will display it in a search result as a separate item. Navigating to this result will take you to the .pdf only, and it will have nothing to do with your website. These elements are often tricky to work with, as one usually wants users to be able to download information, but when optimizing a site one usually encounters the following issues:

  • A .pdf is “read”, but it does not add to your site's “content” from an SEO point of view.
  • Any text in the .pdf that is similar to text on your pages will result in a “duplicate content” penalty, so be sure to add a rel="nofollow" attribute to your link to the .pdf (see the example after this list).
  • The .pdfs are indexed but appear in the SERPs as separate results, not connected to the website.
  • Flash websites can be visually appealing but are difficult for search engines to index, have longer page load times and are not preferred by users. I always instantly close the tab when I see a Flash site loading.
  • Image galleries are prime SEO real estate and sometimes (in the case of photographic websites) the only real estate one has to optimize, in the form of alt tags and structured data markup. Presenting these galleries in formats like Flash results in a loss of that optimization.
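
As a minimal sketch of the second point above (the file name is hypothetical), a link to a downloadable .pdf with the rel="nofollow" attribute could look like this:

    <!-- rel="nofollow" tells search engines not to follow the link to the duplicate .pdf -->
    <a href="/downloads/camera-guide.pdf" rel="nofollow">Download the camera guide (PDF)</a>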

“a site that a search engine finds easy to use and understand is a site that your visitor will find easy to use and understand”

Always Avoid Link Schemes

This is a favourite topic of mine, as the tempting thing to do is to accelerate your pages' ranking by generating as many back-links (external links) to your pages as quickly as possible. Your page authority is largely affected by the number and quality of links that point to that page, so by increasing the number of links to a page you increase the authority of that page. This is not as simple as it seems. Link schemes are, in a nutshell, schemes that create links to a page that are not naturally generated links. There are many different versions, e.g. linking to someone for a link in return, paying link builders to generate links, participating in link generating schemes that post articles linking to your site, and the list goes on ad infinitum.

Recently, Google de-indexed thousands of pages that had links directed at them through a type of bulk article generating scheme. This scheme worked for the pages, but as soon as the search algorithm changed they were removed from the index, basically rendering those websites absolutely useless, as they will never appear in the SERPs again.

When someone links to a page, they are essentially casting a vote for that page and telling users that the page they are pointing to contains relevant, authoritative information pertaining to the page they are currently on. There are many ways to do this in a “white-hat” fashion, but my advice here is: if you are an authority in your field, your website will become an authority, as people will naturally link to it. The best methods to use in becoming an authority are:

  • Information: Provide your users with useful information that they will want to link to and share with other users. For example, I would advise a CCTV company to post information on how to choose a camera with the correct focal length for the width of field required, combined with a table of angles for users to refer to. Doing this ensures that visitors return often to the website (and then see the specials) and also link to the site from their own websites (they should rather link to your site than run the risk of posting the same content on their own site and being penalized for duplicate content).
  • Education: Educate your users on certain topics that will keep them referring back to, and linking back to, your website. For example, I would advise someone with a photographic website to post tutorials on a speciality genre of theirs, like product photography. Again, this ensures return visitors and links to the pages.
  • Affiliation: Becoming affiliated with and sponsoring various organizations is a lot of work, but it is work that will naturally bring in business for you and also generate natural links, more so if the above points are in place.

“if you are an authority in your field, your website will become an authority”

No Cloaking

Cloaking is when you present one page to a search engine and another page to your users. There are two basic methods here. The first is to present a page with desirable content to the search engine to gain ranking in the SERPs, while users get another page with undesirable content. The other is to present a page that is rich in text content to search engines while users are directed to a page that is more visually appealing. Cloaking is rather technical to achieve, so it is nothing for the average web developer to stress about: if you are cloaking, you know you are cloaking, and it is a great idea to stop, as this can well lead to the page being de-indexed.

“Cloaking is a black hat search engine optimization (SEO) technique in which the content presented to the search engine robot is different from that presented to the users’ browser.” ~ Wikipedia

Avoid Deceptive Redirects

Redirects are used when you want to redirect a user from the URL they wanted to go to, to another URL. There are plenty of reasons to do this, but one always has to ask: “Is the destination content relevant to the desired content?” and “Do the users want to go to the redirected content?” If the answer to both questions is “yes”, then you have a natural redirect, which is acceptable. I, for one, am not a fan of redirects at all.

“always redirect search engines and users to the same destination”

Never Use Hidden Text or Links

Hiding text or text links is done by making the text colour the same as, or similar to, the background colour. Search engine robots now have the ability to detect the difference between the text colour and the background colour, and the use of hidden text and text links will negatively affect your rankings.

“I have encountered web-masters using this technique in their clients’ pages as well as their own, much to their detriment.”

Avoid Doorway Pages

Doorway pages are pages that contain similar content but with certain keywords changed in order to rank for a particular product in various cities; for example, a photographer will have a page for Studio Photography optimized for one city and another Studio Photography page optimized for a neighbouring city. The guideline here is that, even though your product may be good, you generally cannot rank as well for one area as you can for another. My advice is to define your area of service and optimize for that; you will achieve far better results than by diluting it over a larger area. As your page authority increases, you can increase your target area, assuming you serve that area. I did have a photographer client who alternates between two cities quite far apart, and my advice in that situation was to provide accurate markup on the elements of her pages and let the search engines determine the areas she worked in from that.

“If your product is so attractive that people want it, they will buy it wherever they are.”

Never Use Duplicate Content or Scraping

Populating this particular website brings tears to my eyes, as I have already created a section on my photographic website covering some SEO guidelines for photographers, and I now have to generate new content for this site. Search engines cache the pages they index, compare that content to new content they discover, and penalize the pages that duplicate it. Webmasters often make use of article spinners that change words to synonyms to try to fool search engines; the algorithms are advanced enough to detect this.

Another common cause of concern is content, in the form of Twitter feeds and the like, that is displayed on a page. The common guideline here is to ensure that such an automated feed makes up no more than 10 percent of the content on that page.

“…remember, a robot is a robot and will detect content that has been spun by a robot…”

Steer Clear of Keyword Stuffing

This ties in very much with “create content for users, not search engines”. When creating your content, write it naturally and let your SEO consultant edit it for optimization. Basically, your main keywords should appear in the first paragraph, above the fold. When written naturally, your keywords should make up no more than 2 to 5 percent of the content of the page.

Keyword stuffing alerts search engines to an over-optimized page and will put visitors off contacting you.

“I am a Randburg based photographer with a studio in Randburg for my photography where I also do product photography in Randburg and wedding photography for people in Randburg for people who are looking for a Randburg photographer.”, does not read at all well and will not convert visitors at all. It should rather read as: “I am a Randburg based photographer specializing in studio, wedding and product photography.” This is simple, to the point, contains relevant keywords and instantly describes what you are about.

“your main keywords should appear in the first paragraph, above the fold”

Internal Linking and Hierarchy

It is considered good practice to plan your website well before you begin to develop it; this way your menus will flow naturally in a well organized structure. It is important that the hierarchy of your pages makes sense and that the “parent/child” relationships are relevant. Take, for example, the hierarchy of this section: SEO -> How We Do SEO -> Search Engine Webmaster Guidelines. It follows a hierarchical order that answers the questions: “What is it?”, “How is it done?”, “What are the guidelines supporting how it is done?”. If I had to throw in a section on “Website design packages”, it would make no sense at all. When doing your site planning, try to make sure your menus don’t go more than three levels deep, e.g. Home -> SEO -> How We Do SEO is pretty much the maximum I would go. Any more levels down and search engines will struggle, and so will your users. I have all too often felt “lost” in a website with confusing menus and levels.

[Image: creating a readable sitemap for your users]

It is also good practice to create a static text link to each page of your site from your home page. Prefer text links, as image links cannot be “seen” by blind users, whether they are search engines or people. Menus on your home page are often Java menus, which search engines cannot parse, so they cannot navigate to all the pages. Looking at the example on the right, I have worked the links to the pages into the content of the home page. Doing it this way ensures that users and search engines can easily navigate to the rest of the site: the links are in text and the anchor text of the links is taken care of nicely. When you are developing a website and you or the client is struggling to create the home page text, this is an easy “structure” to use as a base and, in doing so, your keywords are also strategically placed on your home page.

One example here of structuring for a search engine rather than a user is to create a “link to home” on all pages other than your home page. The natural tendency is to create a link to your “contact us” page. Remember, each individual page has its own page authority, dependent on the number of “votes” for it. Your “contact us” page has very little content, so there is no reason for the other pages to “vote” for it; your home page needs those “votes” more. In linking back to your home page you also keep search engines and users on your website for longer.
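
A minimal sketch of this approach (the page names and anchor text are hypothetical):

    <!-- On the home page: text links to inner pages, worked into the copy with descriptive anchor text -->
    <p>We specialize in <a href="/studio-photography">studio photography</a> and
    <a href="/wedding-photography">wedding photography</a> in Randburg.</p>

    <!-- On inner pages: link back to the home page rather than to the "contact us" page -->
    <p>Back to the <a href="/">Randburg photography home page</a>.</p>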

“I have all too often felt ‘lost’ in a website with confusing menus and levels.”

Keep the Number of Links on Your Pages Down

Search engines have suggested that the number of links on each particular page be kept to a reasonable amount. Though there is no set rule here, take the maximum as being around 100; any more and your page could well be seen as a “link farm”, which is discouraged. Remember also that a page with hundreds of relevant external links pointing to it can acceptably have more links pointing out of it. Too many links out of a page can easily lead your users to navigate away from your website. With this in mind, I always make links pointing out of my websites open in a new tab, so users can enjoy the link and still stay on my site.

It is very important to add a rel=”nofollow” attribute to the <a> tag on all paid or advertising links. Failure to do so can quickly lead to your page being removed from the search engine’s index.
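
A minimal sketch of an outbound paid or advertising link that covers both points above (the destination URL is hypothetical):

    <!-- rel="nofollow" marks the paid link; target="_blank" opens it in a new tab so the visitor stays on your site -->
    <a href="https://www.example-advertiser.com" rel="nofollow" target="_blank">Visit our advertising partner</a>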

“I always make links pointing out of my websites open in a new tab, so users can enjoy the link and still stay on my site.”

Use robots.txt Wisely

The robots.txt file is used to tell search engine spiders where to go and what to follow. If the file is missing or empty, spiders will crawl all of the content. This has its uses: for example, if you have a downloadable .pdf document with the same content as one of your pages, it is wise to use a robots.txt file to prevent search engines from crawling and indexing it and penalizing you for duplicate content.
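
A minimal robots.txt sketch along those lines (the .pdf path is hypothetical):

    # Allow all spiders to crawl the site, but keep the duplicate .pdf out of the index
    User-agent: *
    Disallow: /downloads/camera-guide.pdf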

“Make sure you know and fully understand robots.txt files before using one, as I have seen entire websites with no-follows in the code.”

Submit Your Content to Search Engines

Submitting your website to Google and other search engines is a fairly easy process. You can submit your site to Google directly, submit a sitemap through webmaster tools, or submit rich snippets.
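
A minimal sketch of the kind of sitemap you might submit (the URLs are hypothetical):

    <?xml version="1.0" encoding="UTF-8"?>
    <!-- A bare-bones sitemap.xml listing the pages you want indexed -->
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/</loc>
      </url>
      <url>
        <loc>https://www.example.com/studio-photography</loc>
      </url>
    </urlset>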

 

“I always maintain that, bearing in mind that search engines and users often see and navigate a website in similar ways, planning your site for search engines and populating it for users is the winning recipe”