Internet Search And SEO

Sunday, November 19, 2006
Internet Search

Over the last decade the Internet has grown into an integral part of our everyday life. What was once mostly a network of research institutions has become an unprecedented medium for commercial and social use. With the number of web pages now exceeding 12 billion, it is extremely difficult to find information without using search engines.

Search engines appeared almost as soon as the Web was born. A pioneering search engine called Archie, created at McGill University in Montreal in 1990, was used to search files located on public FTP sites. The first search engine built to crawl and index pages on the web, the World Wide Web Wanderer (also known as Wandex), was introduced at MIT in 1993 and worked on principles similar to those used by modern web search engines.

However, it was not until the late 90s that search engines became the primary tool for finding information on the Internet. Before that, Internet directories such as Yahoo! or BestOfTheWeb were used to locate web pages. But once indexing and ranking algorithms reached a state where search engines could quickly return highly relevant results, it didn't take long for web users to recognize the usefulness of Internet search.

Search Results Relevancy

It is all about the relevancy of results. In their infancy, search engines were utterly vulnerable to abuse. Originally they indexed only file names, and later meta-tags. It doesn't take much to stuff file names and meta-tags with popular keywords that are irrelevant to the document's content, so users querying a search engine often received results that failed to satisfy them. That is why Internet directories, with pages classified into categories by human reviewers, were far more useful than the search engines of those days.

The problem of relevancy became increasingly important with the growing commercial use of the Web. The profits of commercial websites depend on their exposure, so anything that can improve their placement in search results is worth doing. Thus the first attempts were made to influence search results by using popular but irrelevant keywords to drive more traffic.

With growing computational and storage capacity, search engines became able to index the entire content of a page. But it was still possible to abuse search results with excessive keyword stuffing, hidden keyword-rich text, and many other spammy techniques. Relying solely on page content, it was impossible to ensure the relevancy of search results, so more advanced ranking algorithms were introduced.
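To see why content-only ranking is so easy to game, consider a toy term-frequency scorer in Python. This is a deliberately naive sketch for illustration, not any real engine's ranking function, and the sample pages are invented:

def tf_score(query, text):
    # Fraction of the page's words that match the query term.
    words = text.lower().split()
    return words.count(query.lower()) / len(words) if words else 0.0

genuine = "book cheap airline tickets and compare fares from major airlines"
stuffed = "airline tickets airline tickets airline tickets buy airline tickets"

print(tf_score("tickets", genuine))  # 0.1  - a useful page gets a modest score
print(tf_score("tickets", stuffed))  # 0.44 - the stuffed page wins on content alone

Any scoring function that looks only at the words on the page can be inflated simply by repeating those words, which is exactly what early spammers did.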

Search Engine Optimization

But even modern, sophisticated search engine algorithms are not perfect and can't guarantee 100% relevancy. Many loopholes still exist that can be exploited to manipulate search results. Such manipulation is not necessarily unethical conduct. Consider the three biggest travel websites: http://Travelocity.com, http://Expedia.com and http://Priceline.com. They provide booking services of similar quality and are equally relevant for keywords like 'vacation' or 'airline tickets'. Nevertheless, they rank differently in search results, and it is natural for a lower-ranked site to try to improve its position by using search engine optimization techniques.

Search Engine Optimization, or SEO, is about determining which factors influence the placement of a page in search results and adjusting those factors to obtain a higher position. To optimize the relevance of a page, you adjust keyword frequency and proximity and add keywords to headings and bold text. It must look natural, though: excessive keyword stuffing can have negative effects. To increase the popularity of a page, you must obtain incoming links with targeted keywords in the anchor text, making sure the links come from authoritative pages and not from irrelevant link farms.
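The link-popularity side can be illustrated with a simplified version of the PageRank iteration that Google's link analysis made famous. This is a minimal sketch: the three-page graph, damping factor, and iteration count below are illustrative assumptions, not real-world parameters:

def pagerank(links, damping=0.85, iterations=50):
    # links maps each page to the list of pages it links to.
    pages = list(links)
    n = len(pages)
    rank = {page: 1.0 / n for page in pages}
    for _ in range(iterations):
        new_rank = {page: (1.0 - damping) / n for page in pages}
        for page, outgoing in links.items():
            if not outgoing:  # a dangling page spreads its rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outgoing)
                for target in outgoing:
                    new_rank[target] += share
        rank = new_rank
    return rank

# A hypothetical three-page web: 'hub' is linked to by both other pages.
web = {
    "hub":   ["blog"],
    "store": ["hub"],
    "blog":  ["hub"],
}
for page, score in sorted(pagerank(web).items(), key=lambda kv: -kv[1]):
    print(page, round(score, 3))

The page that attracts incoming links from well-ranked pages ('hub' in this hypothetical graph) ends up with the highest score, which is why a link from an authoritative page is worth far more than dozens of links from an irrelevant link farm.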

Search engine optimization is an extremely dynamic field. Once a loophole in a ranking algorithm becomes known and widely used, search engines change the algorithm to neutralize its effect. Every algorithm update can drastically affect the positions and traffic of websites that rely on aggressive 'crash and burn' optimization strategies. Such strategies exploit the most recently discovered loopholes and tricks and can put pages into top positions quite quickly, but their effect lasts only until the next algorithm update.

Another, far more time-consuming approach is to create websites with high-quality content, following the so-called Google quality guidelines. This approach emphasizes a website's value to its users and advises against using tricks to manipulate search results. In short: good websites will eventually rank high. But it takes years and enormous effort to make a site popular this way, and some sites are by nature difficult to promote like this. For example, few grocery stores can afford to turn their websites into top Internet grocery portals; it just wouldn't be worth the effort.

SEO alone can't guarantee a lasting effect, while forgoing SEO entirely makes promoting a site far too laborious. The wise way to make a website popular is to make sure it is useful to users, obtain quality incoming links, and fine-tune your pages for the targeted keywords. Follow Brett Tabke's famous 'Successful Site in 12 Months with Google Alone': four years on, it is still a valid approach.

About The Author
Oleg Ishenko, Berlin, Germany. BSc in Telecommunications Industry Business Management. Master's student at Humboldt University Berlin. MCSE and MCDBA certified.
Get more useful information at our Comprehensive SEO and Online Marketing Research - http://www.seoresearcher.com
Source: Articlecity.com