It is the search engine bots that ultimately index your website and bring it to the notice of potential visitors. So it is worth understanding how search engine bots actually index your website and how they present your information to their users.
There are basically five types of search engines. The first is the crawler-based (or spider-based) search engine.
Crawler-based search engines use spiders to index websites from the World Wide Web. When you submit your website to a search engine by completing its required submission page, the search engine's spider will index your entire site. A 'spider' or 'bot' is an automated program run by the search engine system. The spider visits a website, reads the content on the actual site and the site's descriptions, and also follows the links that the site connects to. The spider then returns all that information to a central repository, where the data is indexed. It will visit each link you have on your website and index those sites as well. Some spiders will only index a certain number of pages on your site, so don't create a site with 500 pages!
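The crawl-and-index cycle described above can be sketched in a few lines of Python. This is a minimal, hypothetical illustration: the two-page "site" below is a made-up dictionary standing in for the Web, whereas a real spider would fetch pages over HTTP and handle far more than plain links and text.

```python
from html.parser import HTMLParser

# A made-up two-page "website" used in place of real HTTP fetches.
SITE = {
    "/": '<a href="/about">About</a> Welcome to the home page',
    "/about": '<a href="/">Home</a> We write about search engines',
}

class LinkAndTextParser(HTMLParser):
    """Collect the links and the visible text found on one page."""
    def __init__(self):
        super().__init__()
        self.links = []
        self.text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

    def handle_data(self, data):
        self.text.append(data)

def crawl(start, max_pages=10):
    """Visit pages, store their text in a central index, follow their links."""
    index = {}
    queue = [start]
    # Cap the number of pages, just as real spiders limit how much they index.
    while queue and len(index) < max_pages:
        url = queue.pop(0)
        if url in index or url not in SITE:
            continue                      # skip pages already seen or unknown
        parser = LinkAndTextParser()
        parser.feed(SITE[url])
        index[url] = " ".join(parser.text).strip()
        queue.extend(parser.links)        # follow the links found on the page
    return index

index = crawl("/")
```

Starting from "/", the spider reads the home page, discovers the link to "/about", visits it in turn, and ends up with both pages' text stored in its index.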
The spider will periodically return to the sites to check for any information that has changed. The frequency with which this happens is determined by the moderators of the search engine.
A spider is almost like a book: it holds a table of contents, the actual content, and the links and references for all the websites it finds during its search, and it may index up to a million pages a day.
Examples: Google, Ask
When you ask a search engine to locate a website or some information, it is actually searching through the index it has created, not searching the Web itself. Different search engines produce different rankings because not every search engine uses the same algorithm to search through its index.
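Both points above can be shown with a toy inverted index. Everything here is hypothetical, assuming the crawler has already stored the two page texts below; a query lookup only scans this index, never the pages themselves, and swapping in a different ranking function reorders the same set of hits.

```python
# Page texts assumed to have been collected earlier by a crawler.
pages = {
    "a.html": "search engines use spiders to index the web",
    "b.html": "spiders index pages and spiders follow links",
}

# Build the inverted index: word -> {page: occurrence count}.
inverted = {}
for url, text in pages.items():
    for word in text.split():
        inverted.setdefault(word, {}).setdefault(url, 0)
        inverted[word][url] += 1

def search(word, rank):
    """Return the pages containing `word`, ordered by a ranking function."""
    hits = inverted.get(word, {})          # look up the index, not the pages
    return sorted(hits, key=lambda url: rank(url, hits[url]), reverse=True)

# Two toy ranking schemes; real engines use far richer (and secret) signals.
freq_order = search("spiders", lambda url, count: count)       # by frequency
alpha_order = search("spiders", lambda url, count: -ord(url[0]))  # by name
```

The word "spiders" appears twice in b.html and once in a.html, so the frequency-based ranking puts b.html first while the name-based one puts a.html first: the same index, two algorithms, two different rankings.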