What Are Search Engine Algorithms, Bots, Crawlers, and Spiders, and How Do They Work? (2014)

The Advancement of Search Engine Algorithms


Today’s search engines and their algorithms are more sophisticated than ever. Type a word or phrase into a search box, click the search button, and you wait only a few seconds to get thousands of results. On Google you need only type letter by letter to see a different set of results each time, and in most cases suggested keywords appear to automatically complete your search phrase.
Not only that, most of the time those results take you to websites that are relevant to what you are searching for, and only very rarely does the link you click on turn out to be non-existent or unavailable.

What is a Search Engine Algorithm?


The easiest way to answer this question: search engines use algorithms to process the billions of pieces of information they have collected about web pages on the internet. An algorithm is a list of filters and rules used to decide which websites most closely relate to the search term entered into the search engine. What you then see on the screen are the results the algorithm decided were the best matches for your searched words or phrases.
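To make the idea of "filters and rules" concrete, here is a deliberately simplified sketch. It is not any real search engine's algorithm; the sites, keywords, and scoring rule are all made up for illustration. It scores each page in a tiny index by how many of the query's terms it matches, then sorts the pages by that score.

```python
# Hypothetical sketch: one very crude ranking "rule" -- count how many
# query terms appear among a page's collected keywords. Real engines
# combine hundreds of signals, but the shape of the process is similar.

def score(page_keywords, query_terms):
    """Return the number of query terms found in the page's keywords."""
    return sum(1 for term in query_terms if term in page_keywords)

# A toy index: URL -> keywords a crawler collected (all invented).
index = {
    "www.example-fishing.com": {"fishing", "rods", "bait"},
    "www.example-cooking.com": {"recipes", "fish", "cooking"},
}

query = {"fishing", "bait"}

# Rank every page in the index, best match first.
results = sorted(index, key=lambda url: score(index[url], query),
                 reverse=True)
```

Here the fishing site matches both query terms while the cooking site matches neither, so it is listed first: that ordering is the "result" the algorithm shows you.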

What you may not know is that little programs called bots, crawlers, or spiders have visited virtually every website on the internet to collect information on every one of them. These little programs collect keywords, phrases, and other coding located on every website, then store this information in the huge databases used by the search engines. Virtually a copy of every publicly available website and picture worldwide can be found on these search engines.
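The collecting step above can be sketched too. This is a hypothetical, minimal version of what a spider might pull out of a single page it has fetched: the title, the keywords meta tag, and the visible text. The HTML snippet and site name are invented; a real crawler would fetch the page over the network and follow its links, which is omitted here.

```python
# Hypothetical sketch of what a crawler extracts from one page it visited,
# using Python's standard-library HTML parser.
from html.parser import HTMLParser

class PageCollector(HTMLParser):
    """Collects the <title>, meta keywords, and visible text of a page."""

    def __init__(self):
        super().__init__()
        self.title = ""
        self.keywords = []
        self.text = []
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name") == "keywords":
            # Split the comma-separated keywords meta tag.
            self.keywords = [k.strip()
                             for k in attrs.get("content", "").split(",")]

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data
        elif data.strip():
            self.text.append(data.strip())

# Imagine the crawler just fetched this (made-up) page:
html_doc = """<html><head><title>XYZ Widgets</title>
<meta name="keywords" content="widgets, gadgets"></head>
<body><p>We sell widgets.</p></body></html>"""

collector = PageCollector()
collector.feed(html_doc)
# The engine would now store collector.title, collector.keywords, and
# collector.text in its database, indexed under the page's URL.
```

Multiply this by billions of pages, and you have the database the algorithm in the previous section runs its filters and rules against.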

Thoroughly confused yet? Let’s look at an example. Say you go to a website URL such as www.xyz.com. The funny thing is, what you see on the page (also referred to as content) accounts for only about 20 percent of what search engines use to decide whether a site is relevant for any keyword search.

