Indexing a site on a search engine

     Before a site appears in search results, a search engine must index it. An indexed site has been visited and analyzed by a search robot, with the relevant information saved in the search engine's database. If a page is present in the search engine's index, it can be displayed in search results; otherwise, the search engine knows nothing about it and cannot display any information from the page.
     There are two ways to get a site indexed by a search engine.

    1. Submit your site to a search engine manually, if it provides this service.
    2. Let search engine robots find it automatically.
To make your site friendly to search engine robots, follow the rules below:
*   Try to make every page of your site reachable within three mouse clicks from your home page. A sitemap helps provide this usability.
*   Avoid the common SEO mistakes described below.
*   Remember that search engines index no more than the first 100-200 KB of text on a page. Hence the rule: do not use pages with more than 100 KB of text if you want them to be indexed completely.

     The behavior of search engine robots can be controlled with a robots.txt file. This file allows you to explicitly permit or forbid robots to index particular pages on your site. A minimal example is sketched below.
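
As a sketch only, a robots.txt file placed in the site root might look like this (the directory names are hypothetical):

```
# Applies to all search robots
User-agent: *
# Forbid indexing of these hypothetical sections
Disallow: /admin/
Disallow: /tmp/
# Everything not listed above remains open to indexing
```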

     The databases of search engines are constantly being updated; records in them may change, disappear and reappear. That is why the number of indexed pages on your site may sometimes vary. One of the most common reasons for a page to disappear from indexes is server unavailability. This means that the search robot could not access it at the time it was attempting to index the site. After the server is restarted, the site should eventually reappear in the index.

Common SEO mistakes


1. Use of graphic headers and menus



    Very often sites are designed with a graphic header. The upper part of a page is a very valuable place where you should put your most important keywords for the best SEO. With a graphic image in that position, the prime spot is wasted, since search engines cannot make use of images: the text in the image cannot be indexed and so does not contribute to the page's ranking. If you must present a logo, the best way is a hybrid approach: place the graphic logo at the top of each page and size it so that it does not occupy the entire width, then use a text header to make up the rest of the width.


    Internal links on your site should contain keywords, which gives an additional advantage in SEO ranking. If your navigation menu consists of graphic elements to make it more attractive, search engines will not be able to index the text of its links. If using a graphic menu cannot be avoided, at least remember to specify correct ALT attributes for all images, as in the sketch below.
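
A minimal HTML sketch of such a menu item (the URL, file name, and keywords are hypothetical): the ALT text gives the robot keyword-bearing text to index even though the visible link is an image.

```html
<!-- Graphic menu item: the ALT text carries the keywords the image itself cannot -->
<a href="/seo-services.html">
  <img src="/img/menu-seo-services.png" alt="SEO services and site optimization">
</a>
```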


2. Use of session IDs



    Some sites use session identifiers. This means that each visitor gets a unique parameter (e.g. &session_id=...) appended to the address of each page visited on the site. Session IDs help site owners collect useful statistics, including information about visitors' behavior. However, from the point of view of a search robot, a page with a new address is a brand new page: each time the robot comes to such a site it receives a new session identifier and treats the pages as new ones on every visit. If you want to improve your site's SEO, it is advisable to avoid using session IDs in your URLs.
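
For illustration (the domain and parameter values are hypothetical), the same page could be seen by a robot under two different addresses on two visits, and therefore be counted as two different pages:

```
http://www.example.com/catalog.html?session_id=A1B2C3   <- first visit
http://www.example.com/catalog.html?session_id=X9Y8Z7   <- next visit, treated as a new page
```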


3. Hidden text (an SEO deception method)



Hidden text (e.g. text whose color is the same as the background color) allows site owners to cram a page with their desired keywords without affecting the page's logic or visual layout, because the invisible text is still seen by search robots. The use of such deceptive optimization methods may result in the site being banned, that is, excluded from the index (database) of the search engine.


4. One-pixel links (an SEO deception method)



Search engines consider the use of tiny, almost invisible, graphic image links just one pixel wide and high as an attempt at deception, which may lead to a site ban.


Registered my blog on Technorati

HKZY38CFCFSV

Technorati is a site that has done a great job of helping bloggers gain more and more traffic to their blogs.

External Ranking Factors (Part 2)

1. Inbound links to a site
An analysis of inbound links to the page being evaluated is one of the key factors in page ranking. This is the only factor that is not controlled by the site owner.
   It is well known that interesting sites attract more inbound links, because owners of other sites on the Internet tend to publish links to a resource they consider worthwhile. The search engine uses this inbound link criterion in its evaluation.
   Therefore, two main factors influence how pages are stored by the search engine and sorted for display in search results:
    - Relevance, as described in the previous section on internal ranking factors.
    - Number and quality of inbound links, also known as link citation, link popularity or citation index.

2. Link importance (citation index, link popularity)

   Simply counting the number of inbound links does not give enough information to evaluate a site. It is obvious that a link from www.google.com should mean much more than a link from some personal homepage like www.hostingcompany.com/~myhomepage.html. You have to take into account link importance as well as the number of links.
   The citation index is used to evaluate the number and quality of inbound links to a site. It is a numeric estimate of the popularity of a resource, expressed as an absolute value representing page importance. Each search engine uses its own algorithm to estimate a page's citation index.
   Besides the absolute citation index value, a scaled citation index is also sometimes used. It indicates the popularity of a page relative to the popularity of other pages on the Internet.
3. Link text
   The link text (anchor text) of any inbound link to a site is very important in search result ranking, e.g. a link pointing to www.seo-n-hack.blogspot.com. If the link text contains appropriate keywords, the search engine regards it as an additional and highly significant recommendation that the site actually contains valuable information relevant to the search query; see the sketch below.
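
As a hypothetical HTML illustration using the blog's own address: the first link passes keyword information to the search engine, while the second passes almost none.

```html
<!-- Keyword-bearing link text: tells the engine what the target site is about -->
<a href="http://www.seo-n-hack.blogspot.com">SEO and hacking tips</a>

<!-- Generic link text: carries no useful keywords -->
<a href="http://www.seo-n-hack.blogspot.com">click here</a>
```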
4. Google PageRank
   Google was the first company to patent a system that takes inbound links into account. The algorithm was named PageRank.
PageRank is estimated separately for each web page and is determined by the PageRank (citation) of other pages referring to it. It is a kind of “virtuous circle.” The main task is to find the criterion that determines page importance. In the case of PageRank, it is the possible frequency of visits to a page.
The PageRank of a specified web page is thus defined as the probability that a user may visit that web page. It follows that the sum of the probabilities for all existing web pages is exactly one, because the user is assumed to be visiting some page on the Internet at any given moment.
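
As a hedged sketch, one commonly quoted, probability-normalized form of the formula (not necessarily the exact version Google uses today) is:

```latex
PR(A) = \frac{1 - d}{N} + d \sum_{i=1}^{n} \frac{PR(T_i)}{C(T_i)}
```

Here T_1 ... T_n are the pages that link to page A, C(T_i) is the number of outbound links on page T_i, N is the total number of pages considered, and d (typically about 0.85) is the damping factor: the probability that the user follows a link rather than jumping to a random page. With this normalization, the PageRank values of all pages sum to one, matching the probability interpretation above.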

  You can determine the PageRank value of any web page with the help of the Google ToolBar, which shows a PageRank value within the range from 0 to 10. Note that the Google ToolBar does not show the exact PageRank probability value, but only the PageRank range a particular site falls in. Each range (from 0 to 10) is defined according to a logarithmic scale.

   Here is an example: each page has a real PageRank value known only to Google. To derive the displayed PageRank range for the ToolBar, Google uses a logarithmic scale, as shown in this table:
          Real PR                 ToolBar PR
          1-10                        1
          10-100                      2
          100-1,000                   3
          1,000-10,000                4
          etc.
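
As a hedged reading of this illustrative table, the displayed value corresponds roughly to:

```latex
\text{ToolBar PR} \approx \lfloor \log_{10}(\text{real PR}) \rfloor + 1
```

so, for example, a real PageRank of 5 would display as 1, while a real PageRank of 500 would display as 3.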


   This shows that the PageRank ranges displayed on the Google ToolBar are not all equal. It is easy, for example, to increase PageRank from one to two, while it is much more difficult to increase it from six to seven.

   In practice, PageRank is mainly used for two purposes:
1. A quick check of a site's popularity.
2. Evaluation of the level of competition for a search query, which is a vital part of SEO work.
