Search Engine Optimisation

A lot has been said about designing search engine friendly sites, but your audience is people, not a search engine. This is why, before you even begin to think about how to construct a web site or write the copy for it, you need to conduct market research.

Very few web sites cater for everyone. Most web sites have a specific target audience, usually made up of a large proportion of a single age group, gender, income bracket, etc. Of course, the target audience for your site may be considerably smaller, focusing on people working in a single industry, or it may be larger. The point is to establish the target market for your site before you begin working on the design or content.

You must also decide what your site is going to offer your target audience. Is your site informational in nature? Are you going to offer online sales? Is it primarily an online brochure? Are you planning to use your online presence to boost offline sales, increase brand awareness or build confidence in your company?

Combining what you consider to be your target market with what you hope to accomplish with your web site should give you a good idea of the type of copywriting, design style and navigation structure to employ. But if you are hoping to rank well on search engines and directories, it is essential that you first conduct keyword research.

Keyword Research

The keywords you are researching now will need to go into the web site’s content. Keyword selection goes hand in hand with web site copywriting but, if possible, should be done first. It is important to remember that not everyone searches the web the same way. You have to take into account variations in terminology determined by age, profession, where in the world the searcher is from, and so on. Remember that there are often several ways of saying the same thing.

NEVER try to target keywords that are not relevant to your site. You want qualified traffic from people who have spent time on a search engine or directory specifically to find the products and services you offer. The more qualified the traffic, the more easily that traffic will translate into sales (assuming your site can sell itself once visitors get there).

First, make a list of every phrase you can think of that people might use to find what your site is going to offer. Ask your friends and colleagues for input, and try to find people who are within your site’s target audience to get an idea of how they would search.

Once you have a good long list, use search engine databases to refine it and see what people have actually searched for when looking for similar sites. If you have money to spend, buy Wordtracker. Otherwise use Overture’s free keyword suggestion tool or Google’s tool. You can also check Google, AltaVista and FAST Alltheweb.com using a Boolean search to see how many other sites are listed when you type in your keywords. This will give you an idea of how many other sites are also using those terms.

Now, looking at your target audience you need to decide which terms THEY, not you or your marketing agency, are most likely to use. Bear in mind that most people do “natural language” searches. They do not use marketing terminology. Generally speaking, they use simple 2-4 word phrases. You may want your web site to say “marvellously wonderful widgets” but your target audience is most likely going to look for “blue widget” and so find your competitor who chose to stick with his keyword research despite what the marketing department said.

You will need to use common sense when compiling the information from friends, online search query databases and the number of other sites using your terms. Bear in mind that your site’s text and optimisation can be refined as you go along, so you do not need to get it 100% perfect the first time around. You will most often find that, even after exhaustive keyword research, the site will need to be tweaked in a year (or sooner), as your web site statistics will reveal that people are using other terms to search for you. The way people search changes according to industry trends, new products on the market and so on. And most likely your site will change as well.

Now that you have a good list of highly relevant phrases you can start working on your search engine friendly, keyword rich web site content.

Title and Meta Tags

Before we go any further, let me reiterate that the most important element in optimising sites for high search engine listings is HTML body text. If the pages do not have keyword rich text, even the best titles and meta tags are not going to go very far toward good listings.

The titles and meta tags are supposed to highlight the most important keywords, as used throughout the HTML body text on that web page, as well as match the overall theme of the site. They are meant to be an addition to, not a replacement for, keyword rich HTML copy.

After the body copy, the <title></title> tag is the most important element in search engine optimisation for a web page (please note that the theme, navigation and inbound links are the most important elements for successful search engine placement for the site overall). The text in the title tag should be:

The <meta name="description" content=""> tag is not an important element as far as search engine optimisation is concerned, but it is important for your site’s listings: a well worded description has often been the deciding factor in whether a searcher clicks on the first, second or third listing, with many choosing a lower listing because of a better title and description. Please note that not all engines use the meta description tag when displaying results; some use the first few lines of text from your web page, and some use text from anywhere in your web page providing it contains the terms that were used in the search. The meta description tag should be:

Next comes the <meta name="keywords" content=""> tag, about which there are so many misconceptions. In short, if you don’t have the time, forget this tag completely. I personally use it on almost every web page I optimise, simply because I believe that every little bit might count, but I never spend a lot of time on it. Simply choose the few main keywords that are being targeted on that particular web page and add them to the meta keywords tag. Try not to repeat any single word if possible. To avoid repetition it is advisable not to use commas, as this allows you to string words together to make up several possible combinations without having to repeat any one of them.

The meta keywords tag should be:

There are a lot of other meta tags but none of them are currently useful for search engine optimisation. That doesn’t mean they should not be used, just don’t expect them to affect the site’s rankings.
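
To make this concrete, here is a rough illustration of how the title and meta tags might look for a hypothetical page targeting “blue widgets” (the company name, wording and keywords are invented for the example):

<head>
<!-- Title: leads with the page's main keyphrase and reads naturally -->
<title>Blue Widgets - Handmade Blue Widgets from Example Widget Co</title>
<!-- Description: a well worded sentence that may be shown beneath the title in the listings -->
<meta name="description" content="Buy handmade blue widgets online. Example Widget Co supplies quality blue widgets to trade and retail customers.">
<!-- Keywords: no commas, so the words combine into several phrases without repeating any of them -->
<meta name="keywords" content="blue widgets handmade buy online trade retail">
</head>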

Alt Tags and Link Text

Alt tags are not read by every search engine, but some do take note of them. Accessibility equipment, however, needs alt tags to describe pictures, buttons and non-text links to disabled users.

To make the alt tag useful for both accessibility and search engine optimisation it should be used wisely. The alt tag should describe the graphic while using a keyword that can sensibly be associated with it. Refrain from using large paragraphs or keyword stuffing, as that can, and usually will, be considered spam.

Link text can be very important as search engines do take note of the words used in the anchor text to describe the page being linked to. As with the alt text (and every other aspect of professional search engine optimisation) link text should be used wisely and make sense to the viewer.

Using text for links instead of buttons, image maps or framed menus not only allows the engines easier access when spidering the site, it also makes the site easier for disabled users to navigate.
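
As a quick illustration (the file names and wording here are invented), a sensible alt text and a descriptive text link might look like this:

<img src="blue-widget.jpg" alt="Handmade blue widget" width="150" height="100">
<a href="blue-widgets.htm">Read more about our handmade blue widgets</a>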

Writing for Search Engines

Before we go any further, please note that we are talking about visible HTML text. Graphical text, flash movies, javascript mouseovers and the like do not count because the search engines cannot read them. The same almost holds true for framed and dynamic sites. As with almost everything in life there are workarounds that can be employed, and the correct use of these workarounds can lead to good listings. But the sites that consistently maintain high rankings usually have a good amount of keyword rich HTML text on static pages.

First of all, decide how many pages your site is going to have. Then assign a “theme” to each page (e.g. about us, widgets, red widgets, international widgets). Now take your keyword research list and divide the phrases between the pages, making sure they match the page theme. You should have 1-3 key phrases per page. Most likely there will be other secondary phrases that you can easily work into the copy as well. The more variations of the key phrases you can use the better, but be sure you don’t lose focus on the 1-3 main ones that you have chosen for that page. If a keyword or phrase applies to more than one page, all the better, especially if there are a number of variations for that phrase.

You should try to have between 200 and 500 words of copy per page. If possible, the top line of text should read like a newspaper headline and contain that page’s most important keyphrase. The first paragraph should ideally contain all of that page’s main keyphrases, and they should be repeated at least twice more in the text. But make sure the copy reads well. Do NOT use hidden text, hidden layers or any other spam tricks, as the site will eventually be caught (or reported by your competitors) and penalised or banned. It simply isn’t worth it and isn’t necessary.

Online Sales Copy

As stated on the preceding page, search engines like lots of keyword rich HTML text. But marketing departments will tell you that your visitors want less. Actually, what humans want is to be able to read the text easily. Try using:

The amount of text you use on your site will depend on the purpose of your site and your target audience. But even if you feel it’s best to use short, snappy online sales copy you can still make it keyword rich. If you need to use short pages of sales copy and do not want every page to contain 200-500 words of keyword rich text, then make sure you also create product info sheets and FAQ pages for those visitors who do want more information. Ideally you should have one keyword rich page per product or service.

Search Engine Friendly Web Design

Now that you’ve sorted out your keyword rich web site content you need to attach it to a search engine friendly design and structure.

It has often been said that search engines prefer “Big, Dumb and Ugly” sites. The reason “Big, Dumb and Ugly” sites usually do well in search engine listings is that the navigation structure is so simple and there are no “bells and whistles” such as flash, dynamic content and so on. BDU sites usually load very quickly, have few or no graphics and have a lot of keyword rich text. So if you simply take the BDU site’s main attractions as the engines see them (simple navigation, fast loading pages and lots of keyword rich HTML text) and use them to develop a professionally designed web site, you will achieve good listings without the site having to be BDU.

Web Site Architecture and Link Structure

In order for the engines to be able to get around your site easily and index every page, you will ideally have a text based navigation structure. Drop down menus, image maps and frameset menus are difficult, and in some cases impossible, for the search engine spiders to follow. So make it easy on them by either using a completely text based navigation system or providing text links throughout each page’s copy.

If possible, all of the site’s pages should reside in the root directory. Where there are too many pages for this to be feasible, try to keep the site architecture as simple and as close to the root as possible. Avoid having a URL that looks like www.mysite.com/directories/level1/level2/page1.htm. The further from the root the page is, the less likely it is to be crawled.

The link structure should allow the user and the spider to get to any page from any page. In very large sites this will translate to reaching any directory’s main page from any page. The user should never be more than one or two clicks away from where he or she wants to go, and the path to get there should be impossible to miss.

Since you have already researched and divided up your keywords for each page, give each page its main keywords as a file name. If the name is made up of two words, separate them with a dash. The same goes for directory names. So instead of having www.mysite.com/directory/page1.htm make it www.mysite.com/widgets/blue-widgets.htm. Your file and directory names will not make or break your listings, but it can’t hurt and every little bit extra may help (this is also true of domain names).
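
Putting the two ideas together, a simple text based navigation block using keyword file names might look something like this (the pages named here are purely illustrative):

<p>
<a href="index.htm">Home</a> |
<a href="widgets.htm">Widgets</a> |
<a href="blue-widgets.htm">Blue Widgets</a> |
<a href="red-widgets.htm">Red Widgets</a> |
<a href="contact.htm">Contact Us</a>
</p>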

Do not obsess over it and register domains or make up directories and very long file names just to use keywords, as that can make it very difficult for people to remember the URL and increase the chances of it being mistyped. As with everything to do with your site, your primary goal is not simply to be found but to sell. You have to balance every aspect of your site’s design between what is good for search engine optimisation and what your target audience needs.

Frames Based Sites

Frames are often used to aid navigation. Sometimes they’re used because the designer or site owner thinks it’s the only way to get the desired look or functionality. In some cases, they may be correct, but not always.

Frames are not the best option when it comes to search engines, but they are no longer the major stumbling block they were several years ago, as long as certain considerations are made to ensure that framed sites are still accessible to both engines and disabled users. If you have to use frames, the best way to ensure both engines and disabled users can still access the site is to create a text based non-frames version. If this isn’t possible, at least ensure the following are taken into consideration for the framed site.
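
A common workaround (a sketch only, and the page names are invented) is to give the frameset page a proper title and a noframes section containing keyword rich text and ordinary text links, so that engines and non-frames browsers still have something to read and follow:

<html>
<head>
<title>Blue Widgets - Example Widget Co</title>
</head>
<frameset cols="20%,80%">
<frame src="menu.htm" name="menu">
<frame src="main.htm" name="main">
<noframes>
<body>
<p>Example Widget Co supplies handmade blue widgets to trade and retail customers.</p>
<p><a href="main.htm">Enter the non-frames version of the site</a></p>
</body>
</noframes>
</frameset>
</html>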

Search Engine Friendly Dynamic Sites

As optimising dynamic sites can be difficult I would recommend that you outsource the search engine promotion to a professional, ensuring that the SEO consultant you choose has a history of successfully promoting these types of sites.

A dynamic page contains content that is generated on the fly by taking results from a lookup table or database in order to satisfy a specified query. Dynamic sites are convenient for running shopping carts and for quickly updating and customising content; some sites use dynamic content only for certain sections, while others are built entirely on dynamic URLs.

You can usually tell if a page is dynamic by the “?” or other special characters in the page’s URL. Although all of the major engines can read pages whose URLs contain non-alphanumeric characters (such as &, +, %, $ and ?), they may not actively spider such sites, choosing instead to spider only the dynamic pages that are directly submitted to them or directly linked to from a static page.

The reason the engines shy away from actively spidering dynamic sites is that they do not want to be caught in what is known as a “spider trap”. Depending on how the site is coded, a single dynamic page can be generated hundreds of times, each time with its own unique session-ID-based URL.

So if you want the search engines to spider your dynamic site you will need to pay for inclusion (Inktomi, Lycos/Alltheweb and AltaVista offer this), ensure that workarounds are put in place, or both.

The best workaround for most dynamic sites is to get rid of any characters that are incompatible with search engine indexing, most notably ?, = and &. This can be done by creating scripts on the server that modify how it “serves up” dynamic pages. Even though the content is dynamic, the “normal” URLs allow the site to be properly spidered.
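
As a rough sketch of what such a server side workaround might look like (this assumes an Apache server with mod_rewrite enabled, and the script name products.php is invented), a rewrite rule can present a dynamic product page under a static looking, keyword rich URL:

# In .htaccess or the main server configuration
RewriteEngine On
# Serve /widgets/blue-widgets.htm from the underlying dynamic script
RewriteRule ^widgets/([a-z-]+)\.htm$ /products.php?item=$1 [L]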

If you have already built the site and you cannot change the URLs using the present programming, you will either have to pay to have the site redeveloped using technology that allows search engine friendly URLs or opt for other workarounds.

Search Engine Friendly Design – Flash

Flash itself is not bad and for some target markets it can be a great feature to have as part of your site. Unfortunately, flash isn’t the best format for high search engine placement. It also seems that a lot of designers do not know how to make the best use of flash.

Until recently the most common use of flash seems to have been the flash movie on a splash page. Studies have shown that almost everyone presses the “skip intro” button, so the flash intro is neither an attention grabber nor a means of enhancing brand awareness. Worse, a flash splash page will kill your site’s search engine listings, or ensure you never get any if it’s a new site.

Flash text can only be read by FAST Alltheweb.com. None of the other engines can currently read flash text or follow flash links. Although FAST Alltheweb.com is now indexing flash, this does not make flash search engine friendly. Flash MX is supposed to move flash design away from “useless” and “fancy” and towards “content” and “usability”. It is meant to go some way towards making flash more accessible to disabled users, and possibly to search engines, but the designer needs to know how to use it correctly for it to be effective in this way. It will be some time before flash can be considered either truly accessible or indexable.

So is flash itself horrid? No. The judicious and purposeful use of flash on web sites whose target audience will appreciate it is a good thing and can help to increase the site’s “stickiness”. Games, quizzes, surveys, online forms, multimedia streaming and other interactive features are all good uses for flash. And since the rest of the site does not need to be designed in flash, using flash technology in these ways will enhance your web presence without taking anything away from the engines or from disabled users, who will still be able to read and navigate the rest of your HTML site.

But it goes without saying that sites composed entirely in flash are not going to achieve high search engine placement, even on FAST Alltheweb.com. The problem with flash and search engines is primarily with:

Tables

The way the engines view your page is different from the way it appears to the human viewer. If you go to a web page, click on “View” in the top left hand corner of your browser and then click on “Source”, you will see the page the way the engines see it. Search engines read the HTML code from the top of the page down. This means that the way they view the text in tables can be very different from the way it is laid out for the human viewer on the screen.

To ensure that your most important keywords are placed prominently within the tables you will have to view the tables in the source code. You may find that the more important keywords, although appearing on screen at the top of the tables, are in fact a long way down the page in the source code, below less important text, graphics or the navigation structure.

To ensure that your most important text is the first thing the search engines see use the simple table trick found on the following sites:
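
The trick is usually a variation on the layout below (a sketch from general practice, not taken from those sites): an empty placeholder cell and a rowspan let the main content cell come first in the source code even though the navigation column appears first on screen.

<table>
<tr>
<!-- small empty placeholder cell that sits above the navigation column -->
<td></td>
<!-- the main content cell spans both rows, so it comes first in the source -->
<td rowspan="2">Keyword rich body text goes here and is the first real content the spider meets in the source code.</td>
</tr>
<tr>
<td>
<!-- navigation column, which now appears later in the source -->
<a href="blue-widgets.htm">Blue Widgets</a><br>
<a href="red-widgets.htm">Red Widgets</a>
</td>
</tr>
</table>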

JavaScript

JavaScript is one of the most common causes of “code bloat”. Search engines usually only read so far down a web page (remember, the source code, not the visible page), and if the first half of it is taken up by javascript they may give up and go away before getting to your keyword rich text.

To ensure this doesn’t happen, simply put all of that page’s javascript in an external .js file. Not only will this increase your page’s search engine friendliness, it will also make the page faster to load.

To create a .js file, use a text editor and put any javascript code that you would normally embed directly in your web page into a separate file. This file should only contain your JavaScript code, no HTML. Then on your actual page, you put this in the <head> section:
<script language="JavaScript" src="filename.js">
</script>
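
The external file itself contains nothing but the JavaScript, with no script tags or other HTML. For instance, a hypothetical rollover function in filename.js might look like this:

// filename.js - plain JavaScript only, no HTML tags
function swapImage(name, src) {
  // swap a named image when the visitor moves the mouse over a button
  if (document.images) {
    document.images[name].src = src;
  }
}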

Another solution is to place the javascript at the end of the page (in the source code), below all of the text and links. But this can sometimes cause problems with rollover buttons, interactive date scripts, banners and the like. If it does, use the external .js file instead, especially if you use the same javascript on more than one page: you can call the same .js file from all of them, making life easier for the designer and decreasing overall download time.

It should be noted that if all of your navigation is done using javascript, the search engines will not be able to follow your links. To ensure your entire site can be spidered, make sure you use a noscript tag.
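
The noscript block simply repeats the navigation as ordinary text links so that spiders (and visitors with JavaScript turned off) can still reach every page. Something along these lines, with illustrative page names:

<noscript>
<a href="index.htm">Home</a> |
<a href="widgets.htm">Widgets</a> |
<a href="contact.htm">Contact Us</a>
</noscript>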

Cascading Style Sheets

Cascading Style Sheets, or CSS, are an addition to HTML that allows the designer to control various web page design parameters by pre-defining them in the <head> of the page, either by referencing an external CSS file or by placing the CSS information for that particular page directly in the <head>.

CSS massively decreases download time and can save design time as well if the same CSS file is used throughout the site. As far as search engines are concerned, using an external CSS file removes code bloat. You can also use CSS to give the engines what they want, such as the hefty and ugly <H1> tags, without having to compromise on the way the site appears to the human viewer. Using CSS to define margins, fonts, link appearance, colours and placement means that you can feed the engines what would normally be a Big, Dumb and Ugly page without it appearing that way to humans.

Please note that we are not talking about spam here. All we are saying is that you can set the <H1> tag in CSS to look like small, bold type while the engines will still treat it with all the respect that text in the (traditionally horrid looking) <H1> tag deserves.
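
A minimal sketch of the idea (the sizes are arbitrary and the heading text is invented):

<style type="text/css">
/* The H1 keeps its weight with the engines but renders as modest bold text */
h1 { font-size: 12px; font-weight: bold; margin: 0; }
</style>

<h1>Handmade Blue Widgets</h1>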

What you should NOT use CSS for is hidden layers in an effort to get more keywords in without having to have them in the visible text, otherwise known as spam. Even if the engines don’t get you (and they eventually will) your competitors are keeping a sharp eye out and they (or their promotions company) will report it. Hidden layers are a big no-no so don’t even think about it.

Robots.txt

All major search engines support robots.txt, and it is the first file a spider will request when visiting a site. A robots.txt file should therefore be part of every site. It should be used to exclude the robots from sensitive material that is not password protected, any under construction pages, test sections, .js files, style sheets and the CGI-BIN.

Robots.txt files are not only useful for stopping the engines from grabbing sensitive pages, they also help to steer the engines towards where they should go instead. If you have a very large site, the engines may spend so much time spidering your image directory, test directory or other non-essentials that they run out of time and go off to another site. This is not what you want. You want the engines to get to the important, keyword rich pages on your site. So use a robots.txt to keep them away from anything that isn’t going to be beneficial to your search engine rankings.

As the robots.txt is explicit, you should never ‘Allow’ anything in the file unless that is the ONLY directory or page that you want the engines to index. As that would be a very rare occurrence, it is best to use the file primarily to disallow what you don’t want indexed. Run the robots.txt through a validator to ensure it’s written correctly, or it may end up doing more harm than good.
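
A typical robots.txt along the lines described above might look like this (the directory names are examples only):

User-agent: *
Disallow: /cgi-bin/
Disallow: /test/
Disallow: /under-construction/
Disallow: /scripts/
Disallow: /css/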

Web Site Optimisation

Once the site is designed and all the content has been added it’s ready to be optimised for the best search engine placement. Hopefully you’ve already done your keyword research and have used keyword rich text on as many pages as possible.

The next step in the search engine optimisation process is to further highlight your keywords by using titles, meta tags, alt tags and link text.

Don’t Do Spam

There are a lot of different definitions of spam, from the very vague “anything that’s bad” to a list of every technique that could possibly be considered spam. At the heart of the matter, though, are the engines themselves. Since they will be the ones to penalise or ban your site if what you do is what they define as spam, it stands to reason that the definition should come from them.

Unfortunately, the engines are not completely clear on the issue either when it comes to actual techniques, probably because they feel that if they spell it out exactly, the spammers will find some new way of doing it that isn’t covered in the engines’ anti-spam list (which would probably happen).

Some people and some engines have defined spam as “anything you wouldn’t do if the engines didn’t exist”. The problem with that definition is the conclusion that any method of trying to get good rankings, even something as simple as writing keyword rich text, is spam. Try applying that logic to any other form of marketing and you’ll find it just doesn’t work, as it would make every marketer a spammer.

But just because there isn’t a list of every type of spam, or a clear definition of what all engines consider to be spam, doesn’t mean you’re free to use the techniques that have been labelled spam. These include:

Bear in mind that some of the above techniques are employed with no thought to search engine positioning. For example:

It does appear that the engines make exceptions in some cases, but all of the above raise “red flags” and may very well cause your site to be penalised or banned. Just because another site used a technique without any harm to its listings doesn’t mean you will be able to. Life isn’t always fair and the engines are not answerable to you. So, to be safe, avoid all appearance of evil and don’t do spam.
