Indexing a site in a search engine

     Before a site appears in search results, a search engine must index it. An indexed site has been visited and analyzed by a search robot, with the relevant information saved in the search engine's database. If a page is present in the search engine index, it can be displayed in search results; otherwise, the search engine knows nothing about it and cannot display any information from the page.
     There are two ways to get a site indexed by a search engine:

    1. Register your site with a search engine manually, if it provides this service.
    2. Let search engine robots find it automatically.
To make your site friendly to search engine robots, follow the rules below:
*   Try to make every page of your site reachable within three mouse clicks from your home page. A sitemap helps provide this usability (see the sketch after this list).
*   Avoid the common SEO mistakes described in the earlier post below.
*   Remember that search engines index no more than the first 100-200 KB of text on a page. Hence the rule: do not use pages with more than 100 KB of text if you want them to be indexed completely.
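
As a concrete illustration of the three-click rule, here is a minimal sketch in Python. The site map is a made-up example expressed as a dict of internal links; a breadth-first walk from the home page gives each page's click depth.

```python
# A minimal sketch of checking the three-click rule on a hypothetical
# site map. First visit in a breadth-first walk = shortest click path.
from collections import deque

site = {                     # page -> pages it links to (made-up example)
    "/": ["/about", "/sitemap"],
    "/about": ["/team"],
    "/sitemap": ["/team", "/deep/article"],
    "/team": [],
    "/deep/article": [],
}

depth = {"/": 0}
queue = deque(["/"])
while queue:
    page = queue.popleft()
    for target in site[page]:
        if target not in depth:          # first visit = shortest path
            depth[target] = depth[page] + 1
            queue.append(target)

too_deep = [p for p, d in depth.items() if d > 3]
print(depth)     # click depth of every reachable page
print(too_deep)  # pages that break the three-click rule ([] here)
```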

     The behavior of search engine robots can be controlled by creating a robots.txt file. This file allows you to explicitly permit or forbid them to index particular pages on your site.
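
For instance, here is a minimal sketch of how robots.txt rules are read, using Python's standard urllib.robotparser module. The rules and URLs below are hypothetical examples, not a recommended configuration.

```python
# Interpret a small robots.txt the way a well-behaved crawler would.
from urllib.robotparser import RobotFileParser

rules = """
User-agent: *
Disallow: /admin/
Disallow: /tmp/
Allow: /
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)  # parse the rules as a crawler would

# can_fetch() answers: may this robot index this URL?
print(parser.can_fetch("*", "http://example.com/index.html"))        # True
print(parser.can_fetch("*", "http://example.com/admin/stats.html"))  # False
```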

     The databases of search engines are constantly being updated; records in them may change, disappear and reappear. That is why the number of indexed pages on your site may sometimes vary. One of the most common reasons for a page to disappear from indexes is server unavailability. This means that the search robot could not access it at the time it was attempting to index the site. After the server is restarted, the site should eventually reappear in the index.

Mistakes frequently made in SEO


1. Use of graphic headers and menus



    Very often, sites are designed with a graphic header. The upper part of a page is a very valuable place where you should insert your most important keywords for the best SEO. In the case of a graphic image, that prime position is wasted, since search engines cannot read text inside images. The text in the header cannot be indexed and so will not contribute toward the page rank. If you must present a logo, the best way is a hybrid approach: place the graphic logo at the top of each page and size it so that it does not occupy the entire width, then use a text header to make up the rest of the width.


    Internal links on your site should contain keywords, which gives an additional advantage in SEO ranking. If your navigation menu consists of graphic elements to make it more attractive, search engines will not be able to index the text of its links. If you cannot avoid using a graphic menu, at least remember to specify correct ALT attributes for all its images.


2. Use of session IDs



    Some sites use session identifiers. This means that each visitor gets a unique parameter (&session_id=) when he or she arrives at the site, and this ID is added to the address of each page visited. Session IDs help site owners collect useful statistics, including information about visitors' behavior. However, from the point of view of a search robot, a page with a new address is a brand new page: each time the robot comes to such a site, it gets a new session identifier and considers the pages to be new ones on every visit. If you want to improve your site's SEO, it is advisable to avoid the use of session IDs on your site.
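
If your platform forces session parameters into URLs, one common mitigation is to expose a canonical address with the session parameter stripped. Here is a minimal sketch, assuming the parameter is named session_id as in the example above:

```python
# Strip a session parameter so every visit maps to one canonical URL.
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

def strip_session_id(url, param="session_id"):
    parts = urlparse(url)
    # keep every query parameter except the session identifier
    query = [(k, v) for k, v in parse_qsl(parts.query) if k != param]
    return urlunparse(parts._replace(query=urlencode(query)))

print(strip_session_id("http://example.com/page?session_id=abc123&lang=en"))
# -> http://example.com/page?lang=en
```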


3. Hidden text (an SEO deception method)



Hidden text (e.g. text whose color is the same as the background color) allows site owners to cram a page with their desired keywords without affecting the page logic or visual layout. Although invisible to human visitors, this text is still seen by search robots. The use of such deceptive optimization methods may result in the site being banned, that is, excluded from the index (database) of the search engine.


4. One-pixel links (an SEO deception method)



Search engines consider the use of tiny, almost invisible graphic image links, just one pixel wide and high, as an attempt at deception, which may lead to a site ban.


Registered my blog on Technorati

HKZY38CFCFSV

Technorati is a site that has done a great job helping bloggers gain more and more traffic to their blogs.

External Ranking Factors (Part 2)

1. Inbound links to a site
An analysis of inbound links to the page being evaluated is one of the key factors in page ranking. This is the only factor that is not controlled by the site owner.
   It is well known that interesting sites attract more inbound links, because owners of other sites tend to publish links to a resource they consider worthwhile. The search engine uses this inbound-link criterion in its evaluation.
   Therefore, two main factors influence how pages are stored by the search engine and sorted for display in search results:
    - Relevance, as described in the previous section on internal ranking factors.
    - Number and quality of inbound links, also known as link citation, link popularity or citation index.

2. Link importance (citation index, link popularity)

   Simply counting the number of inbound links does not give enough information to evaluate a site. It is obvious that a link from www.google.com should mean much more than a link from some homepage like www.hostingcompany.com/~myhomepage.html. You have to take into account link importance as well as the number of links.
   The citation index is used to evaluate the number and quality of inbound links to a site. It is a numeric estimate of the popularity of a resource, expressed as an absolute value representing page importance. Each search engine uses its own algorithm to estimate a page's citation index.
   Besides the absolute citation index value, a scaled citation index is sometimes used. It indicates the popularity of a page relative to the popularity of other pages on the Internet.
3. Link text
   The link text of any inbound site link is very important in search result ranking. For example, consider a link pointing to www.seo-n-hack.blogspot.com: if the link text contains appropriate keywords, the search engine regards it as an additional and highly significant recommendation that the site actually contains valuable information relevant to the search query.
4. Google PageRank
   Google was the first company to patent a system that takes inbound links into account. The algorithm was named PageRank.
PageRank is estimated separately for each web page and is determined by the PageRank (citation) of the other pages referring to it. It is a kind of "virtuous circle." The main task is to find a criterion that expresses page importance; in the case of PageRank, it is the possible frequency of visits to a page.
The PageRank of a web page is thus defined as the probability that a user may visit that page. It follows that the sum of probabilities over all existing web pages is exactly one, because the user is assumed to be visiting some Internet page at any given moment.
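
To make the random-visit idea concrete, here is a minimal sketch of PageRank's classic power iteration over a tiny, made-up four-page link graph. The 0.85 damping factor is the standard value from the original PageRank paper, not something Google discloses for its live system.

```python
# PageRank as the stationary probability of a "random surfer".
DAMPING = 0.85  # standard damping factor from the PageRank paper

links = {            # page -> pages it links to (made-up graph)
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],
}

pages = list(links)
rank = {p: 1.0 / len(pages) for p in pages}  # start uniform

for _ in range(50):  # power iteration until the ranks settle
    new_rank = {p: (1 - DAMPING) / len(pages) for p in pages}
    for page, outgoing in links.items():
        share = DAMPING * rank[page] / len(outgoing)
        for target in outgoing:
            new_rank[target] += share
    rank = new_rank

print(rank)                # per-page visit probabilities
print(sum(rank.values()))  # ~1.0, matching the text above
```

Note how the per-page values sum to one, matching the probability interpretation described above.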

  You can determine the PageRank value for any web page with the help of the Google ToolBar that shows a PageRank value within the range from 0 to 10. It should be noted that the Google ToolBar does not show the exact PageRank probability value, but the PageRank range a particular site is in. Each range (from 0 to 10) is defined according to a logarithmic scale.

   Here is an example: each page has a real PageRank value known only to Google. To derive a displayed PageRank range for their ToolBar, they use a logarithmic scale, as shown in this table:

          Real PR          ToolBar PR
          1-10                 1
          10-100               2
          100-1000             3
          1000-10,000          4
          etc.


   This shows that the PageRank ranges displayed on the Google ToolBar are not all equal. It is easy, for example, to increase PageRank from one to two, while it is much more difficult to increase it from six to seven.
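
A toy illustration of that log scale, assuming the mapping in the table above (the real mapping is known only to Google):

```python
# Map a hypothetical "real" PageRank to a 0-10 ToolBar value.
import math

def toolbar_pr(real_pr):
    """Logarithmic mapping assumed from the table above."""
    return min(10, int(math.log10(real_pr)) + 1) if real_pr >= 1 else 0

for real in (5, 50, 5_000, 5_000_000):
    print(real, "->", toolbar_pr(real))
# 5 -> 1, 50 -> 2, 5000 -> 4, 5000000 -> 7
```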

   In practice, PageRank is mainly used for two purposes:
1. A quick check of a site's popularity.
2. Evaluation of the level of competition for a search query, which is a vital part of SEO work.

Internal ranking factors affecting search ranks (Part 1)

1.  Amount of text on a page

A page consisting of just a few words is less likely to get to the top of search results. Search engines love sites with high information content, so you should generally try to increase the text content of your site in the interest of SEO. The optimum page size is 500-3000 words (or 2,000 to 20,000 characters). More page text also increases search engine visibility, because it raises the chance that occasional and accidental search queries will match the page and list it. This factor sometimes results in a large number of visitors.

2.  Number of keywords on a page

Keywords must be used at least three to four times in the page text. The upper limit depends on the overall page size: the larger the page, the more keyword repetitions can be made. Keyword phrases (word combinations consisting of several keywords) are worth a separate mention. The best SEO results are observed when a keyword phrase is used several times in the text with all keywords in the phrase arranged in exactly the same order. In addition, all of the words from the phrase should be used separately several times in the remaining text. There should also be some difference (dispersion) in the number of entries for each of these repeated words.

   E.g. suppose we optimize a page for the phrase "seo services" (one of our keywords for this site). It would be good to use the phrase "seo services" in the text 10 times, the word "seo" 7 times elsewhere in the text, and the word "services" 5 times. The numbers here are hypothetical, but they illustrate the general optimization idea well.

3.  Keyword density and SEO

   Keyword density is the relative frequency of a word in the page text, expressed as a percentage.

E.g. if a word is used 5 times on a page containing 100 words, its keyword density is 5%. If the density of a keyword is too low, the search engine will pay little attention to it. If the density is too high, the search engine may activate its spam filter. If this happens, the page will be penalized and its position in search listings will be deliberately lowered.
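
The calculation itself is trivial; here is a minimal sketch:

```python
# Keyword density as described above: occurrences / total words * 100.
def keyword_density(text, keyword):
    """Percentage of words in `text` that equal `keyword`."""
    words = text.lower().split()
    if not words:
        return 0.0
    return 100.0 * words.count(keyword.lower()) / len(words)

text = " ".join(["seo"] * 5 + ["filler"] * 95)  # 100 words, 5 of them "seo"
print(keyword_density(text, "seo"))  # 5.0
```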

   The optimum keyword density is 5-7%. In the case of keyword phrases, you should calculate the total density of each of the individual keywords comprising the phrase to make sure it is within the specified limits. In practice, a keyword density of more than 7-8% does not seem to have any negative SEO consequences, but it brings no benefit and can reduce the legibility of the content from a user's viewpoint.

4.  Location of keywords on a page

   A very short rule for SEO experts: "the closer a keyword or keyword phrase is to the beginning of a document, the more significant it becomes for the search engine."

5.  Text format and SEO

   Search engines give extra weight to text that is highlighted or given special formatting. We recommend:
   - Use keywords in headings. Headings are text highlighted with the «H» HTML tags; the «h1» and «h2» tags are most effective. Currently, CSS allows you to redefine the appearance of text highlighted with these tags, so «H» tags are used less nowadays, but they are still very important in SEO work;
   - Highlight keywords with bold fonts. Do not highlight the entire text! Just highlight each keyword two or three times on the page, and use the «strong» tag for highlighting instead of the more traditional «B» bold tag.

6.  «TITLE» tag

   This is one of the most important tags for search engines; make use of this fact in your SEO work. Keywords must be used in the TITLE tag. The link to your site that is displayed in search results normally contains text derived from the TITLE tag, so it functions as a sort of virtual business card for your pages. Often, the TITLE tag text is the first information about your website that the user sees, which is why it should not only contain keywords but also be informative and attractive: you want the searcher to be tempted to click on your listed link and navigate to your website. Only 50-80 characters from the TITLE tag are displayed in search results, so you should limit the size of the title to this length.

7.  Keywords in links

   A simple SEO rule: "use keywords in the text of page links that refer to other pages on your site and to any external Internet resources. Keywords in such links can slightly enhance page rank."

8.  «ALT» attributes in images

   Any page image has a special optional attribute known as "alternative text," specified using the HTML «ALT» attribute. Search engines save the value of image ALT attributes when they parse (index) pages, but do not use it to rank search results.

   Currently, the Google search engine takes into account text in the ALT attributes of those images that are links to other pages. The ALT attributes of other images are ignored. There is no information regarding other search engines, but we can assume that the situation is similar. We consider that keywords can and should be used in ALT attributes, but this practice is not vital for seo purposes.
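
As an illustration of that distinction, here is a minimal sketch using the third-party BeautifulSoup library to separate ALT text on linked images from ALT text on plain images. The HTML fragment is a made-up example.

```python
# Distinguish ALT text on images that are links from ALT text on
# plain images, reflecting the distinction described above.
from bs4 import BeautifulSoup

html = """
<a href="/contact"><img src="btn.gif" alt="contact our seo services"></a>
<img src="photo.jpg" alt="office photo">
"""

soup = BeautifulSoup(html, "html.parser")
for img in soup.find_all("img"):
    kind = "link image" if img.find_parent("a") else "plain image"
    print(kind, "->", img.get("alt"))
# link image -> contact our seo services
# plain image -> office photo
```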

9.  Description Meta tag

   This tag is used to specify page descriptions. It does not influence the SEO ranking process, but it is still very important: a lot of search engines (including the largest one, Google) display information from this tag in their search results if it is present on a page and its content matches the content of the page and the search query. Experience has shown that a high position in search results does not always guarantee large numbers of visitors; an attractive description can persuade users to click your link rather than a competitor's.

10.  Keywords Meta tag

   This Meta tag was initially used to specify keywords for pages, but it is hardly ever used by search engines now and is often ignored in SEO projects. If you do use it, observe the following rule: only keywords actually used in the page text should be added to it.

Search Engines vs. Directories

The line between search engines and directories is a thin one, but there are some important differences below the surface.
The most important difference is in the way they collect sites to add to their databases.
Search engines rely on "crawlers" or "robots" (also called "spiders").
Crawlers are programs that continuously roam the web, visiting web pages and depositing information about those pages into a searchable database (also called an "index"). They automatically follow links from one page to the next and from one site to the next.
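
Here is a minimal sketch of that roam-and-deposit loop, using the third-party requests and BeautifulSoup libraries and a placeholder seed URL. A real crawler also respects robots.txt, rate limits, and much more.

```python
# Fetch a page, record its words in an inverted index, follow its links.
from collections import defaultdict, deque
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

index = defaultdict(set)  # word -> set of pages containing it
seen = set()
queue = deque(["http://example.com/"])  # hypothetical seed URL

while queue and len(seen) < 10:  # small cap for the sketch
    url = queue.popleft()
    if url in seen:
        continue
    seen.add(url)
    soup = BeautifulSoup(requests.get(url, timeout=5).text, "html.parser")
    for word in soup.get_text().lower().split():
        index[word].add(url)            # "deposit" page info in the index
    for link in soup.find_all("a", href=True):
        queue.append(urljoin(url, link["href"]))  # follow links onward

print(sorted(index.get("example", [])))  # pages containing "example"
```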


SEARCH ENGINES


A popular myth is that search engines really "search the web" every time someone enters a search query. If that were true, you'd have to wait days - perhaps even weeks - for the search engine to return the results of a search.
Instead, search engines search through databases of indexed web sites, and it takes only a fraction of a second.
Search engine spiders can only find pages that are
1. linked to from other pages they already know about, or
2. submitted to them directly.
That's it.
So that means they only index part of the web?
Yes.
No search engine can claim to have indexed the entire web. Each can only index a subset of the web: the part it "knows about." And with millions of pages being added to the web daily, the part they know about is only a small percentage of the total.
What matters to you and me is for our pages to be among those that are indexed. That’s simple enough.

E.g. Google, Bing, Yahoo!, AltaVista and so on...


DIRECTORIES

Directories do not use spiders. Instead, they use real people (editors) who visit and evaluate sites and add them only if they meet the directory's minimum quality requirement.

This is an important difference:

Search engine spiders can index thousands of pages a day.

Directory editors can't.

So why do we have directories if they can't compete?

The answer is quality.
Human directory editors are considerably harder to impress than spiders. The page has to offer unique information or a unique product. When you submit a site to a directory, the editor of the category where you submitted it will take a look at your site and decide if it's good enough to add to the directory.

Expect directory editors to reject pages with typos, broken links, unclear navigation, etc.

That said, getting your site into a directory is not that hard.
If your site offers high-quality content, is not broken or under construction and you follow the directory's submission rules, you shouldn't have trouble getting in.
We'll look at the details in the Link Popularity section later in the book.


How search engines work...

Most people couldn't care less.

I know I didn’t when I started out. I used that car excuse:
"You don't have to understand how a car works to be able to drive one."
But along the way I couldn't avoid learning how search engines work, and the more I learned, the more it all made sense. I realized... this is nothing like driving a car. There's no competition when you're driving a car (assuming you just want to get from A to B).
SEO is more like racing a car.
(If you're a racecar driver or a racing fan, you'll know how important it is for the driver to understand which parts of the car's anatomy perform which functions.)
The SEO industry is becoming increasingly competitive, but there are still only 10 spots in the top 10 for any keyword. This stuff gives you an edge over the hordes clamoring for position.

first post for SEO...

Hi everyone...
Today I am going to start blogging on SEO.
I hope you will like it and that it will be helpful for many SEO beginners.
So now let's start...
