Thursday, March 26, 2009
Stop words are common words that search engines ignore when processing a search phrase. This is done to save space on their servers and to accelerate the search process.
When a search is conducted, the engine excludes the stop words from the search query, replacing each of them with a marker. A marker is a placeholder symbol substituted for the stop words; the intention is to save space. This way, the search engines are able to index more web pages in the space saved, while still retaining the relevancy of the search query.
Besides, omitting a few words also speeds up the search process. For instance, if a query consists of three words, the Search Engine would generally make three runs, one for each word, and display the listings. However, if omitting one of the words makes no difference to the search results, it can be excluded from the query, and the search consequently becomes faster. Some commonly excluded "stop words" are:
after, also, an, and, as, at, be, because, before, between, but, for, from, however, if, in, into, of, or, other, out, since, such, than, that, the, these, there, this, those, to, under, upon, when, where, whether, which, with, within, without
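The filtering step described above can be sketched in a few lines of JavaScript. This is a minimal illustration, not an engine's actual implementation: the sample list below is a small subset of the words given above, and the function name is hypothetical.

```javascript
// A small sample of the stop words listed above; real engines
// maintain much larger lists.
const STOP_WORDS = new Set([
  "after", "also", "an", "and", "as", "at", "be", "of", "the", "to", "in"
]);

// Drop stop words from a query before it is matched against the index.
function stripStopWords(query) {
  return query
    .toLowerCase()
    .split(/\s+/)
    .filter((word) => !STOP_WORDS.has(word));
}
```

For example, `stripStopWords("The history of the Internet")` keeps only the two meaningful words, `history` and `internet`.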
Labels: Seo Techniques
Saturday, March 21, 2009
We discussed earlier the prevalence of frames-based websites. Many amateur web designers do not understand the drastic effect frames can have on search engine visibility. Such ignorance is reinforced by the fact that some Search Engines, such as Google and Ask.com, are actually frames-capable. Ask.com spiders can crawl through frames and index all the web pages of a website. However, this is only true for a few Search Engines.
The best solution, as stated above, is to avoid frames altogether. If you still decide to use frames, another remedy to this problem is using Javascript. Javascript can be added anywhere on a page and is visible to Search Engines. Javascript-written links enable spiders to crawl to other web pages, even if they do not recognize frames.
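As a sketch of the remedy above, a small script can write plain HTML links to the framed pages so that spiders which do not recognize frames can still follow them. The function name and page names below are hypothetical.

```javascript
// Build plain <a> links to the framed pages so spiders that do not
// recognize frames can still reach them.
function buildNavLinks(pages) {
  return pages
    .map((p) => `<a href="${p.href}">${p.title}</a>`)
    .join(" | ");
}
```

The resulting string could then be written into each page, for example with `document.write(buildNavLinks([{ href: "products.html", title: "Products" }]))` near the bottom of the body.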
With a little trial and error, you can make your frame sites accessible to both types of search engines.
Labels: Seo Techniques
Sunday, March 15, 2009
Many websites make use of frames on their web pages. In some cases, more than two frames are used on a single web page. Most websites use frames because each frame’s content comes from a different source. A master page known as a “frameset” controls the assembly of content from the different sources into a single web page. Frames thus make it easier for webmasters to combine multiple sources into one page. This, however, has a huge disadvantage when it comes to Search Engines.
Some of the older Search Engines do not have the capability to read content from frames. They crawl only the frameset instead of all the framed web pages; consequently, web pages with multiple frames are ignored by the spider. There is a tag known as “NOFRAMES” (whose content is ignored by frames-capable browsers) that can be inserted in the HTML of these web pages. Spiders are able to read the information within the NOFRAMES tags; without it, such Search Engines see only the frameset.
Moreover, if there are no links to other web pages in the NOFRAMES blocks, the search engines won't crawl past the frameset, thus ignoring all the content-rich web pages that the frameset controls. Hence, it is always advisable to build web pages without frames, as frames can easily make your website invisible to Search Engines.
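To illustrate the NOFRAMES tag discussed above, here is a minimal frameset with a fallback block that spiders and frame-incapable browsers can read. The file names are hypothetical.

```html
<!-- A frameset with a NOFRAMES fallback; menu.html and content.html
     are illustrative names. -->
<frameset cols="25%,75%">
  <frame src="menu.html">
  <frame src="content.html">
  <noframes>
    <body>
      <p>Navigation for browsers and spiders without frame support:</p>
      <a href="menu.html">Menu</a>
      <a href="content.html">Content</a>
    </body>
  </noframes>
</frameset>
```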
Labels: Seo Techniques
Thursday, March 12, 2009
You do not have to submit all the pages of your site. As stated earlier, many sites have restrictions on the number of pages you can submit. A key page, or a page that has links to many inner pages, is ideal, but you should also submit some inner pages. This ensures that even if the first page is missed, the crawler still gets to the other pages, and to all the important pages through them. Submit at least your key 3 to 4 pages. Choose the ones that have the most relevant content and keywords for your target search string, and verify that they link to the other pages properly.
Labels: Seo Techniques
Monday, March 9, 2009
We noted above that spiders may bypass long and “difficult” pages. They have their own time-out settings and other controls that help them come unstuck from such pages, so you do not want such a page to become your “gateway” page. One tip is to keep the page size below 100 kb.
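A quick way to apply the 100 kb rule of thumb above is to measure a page's HTML before publishing it. This is a sketch (the function name is hypothetical, and the limit is a guideline, not a fixed specification); it uses Node's built-in `Buffer` to count bytes.

```javascript
// Check whether a page's HTML exceeds the ~100 kb guideline mentioned
// above. Returns true when the page is likely too heavy for spiders.
function exceedsSizeLimit(html, limitKb = 100) {
  return Buffer.byteLength(html, "utf8") > limitKb * 1024;
}
```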
Labels: Seo Techniques
Saturday, March 7, 2009
In affiliate programs, sites that send you traffic and visitors are paid per click or by other parameters (such as the number of pages visited on your site, time spent, transactions, etc.). The most common contractual arrangement is payment per click or per click-through. Affiliates use tracking software that monitors such clicks using a redirection measurement system. The value of affiliate programs in boosting your link analysis is doubtful; nevertheless, they do not appear to do any harm. They do provide you visitors, and that is important. In the case of some search engines, redirects may even count in favor of your link analysis. Use affiliate programs, but do not treat them as a major optimization strategy.
Several pages in e-commerce and other functional sites are generated dynamically and have a “?” or “&” sign in their dynamic URLs. These signs separate the CGI variables. While Google will crawl such pages, many other engines will not. One inconvenient solution is to develop static equivalents of the dynamic pages and host them on your site.
Another way to avoid such dynamic URLs is to rewrite them using a syntax that the crawler accepts and that the application server understands as equivalent to the dynamic URL. The Amazon site displays dynamic URLs in such a syntax. If you are using the Apache web server, you can use Apache rewrite rules to enable this conversion.
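As a sketch of the Apache approach mentioned above, a rewrite rule can map a crawler-friendly, static-looking URL back to the real dynamic page. The paths and parameter names below are illustrative, not taken from any particular site.

```apache
# Hypothetical rule: serve /product/42.html from the real dynamic
# page /product.cgi?id=42, so crawlers never see the "?" form.
RewriteEngine On
RewriteRule ^product/([0-9]+)\.html$ /product.cgi?id=$1 [L]
```

Internal links on the site would then use the `/product/42.html` form, while the application server continues to receive the CGI variables it expects.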
One good tip is to prepare a crawler page (or pages) and submit it to the search engines. This page should have no text or content except links to all the important pages that you wish to be crawled. When the spider reaches this page, it will follow all the links and pull the desired pages into its index. You can also break the main crawler page into several smaller pages if it becomes too large. The crawler will not reject smaller pages, whereas larger pages may be bypassed if the crawler finds them too slow to spider.
You need not be concerned that search results will show this “sitemap” page and disappoint the visitor. This will not happen: the site map has no searchable content and will not be included in the results, but all the other pages will. We found that the site wired.com had published hierarchical sets of crawler pages: the first crawler page lists all the category headlines; those links lead to a set of links with all the story headlines, which in turn lead to the news stories.
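The advice above about breaking a large crawler page into several smaller ones can be sketched as a small generator. This is an illustration only; the function name is hypothetical.

```javascript
// Split a long list of URLs into several small crawler pages, each
// containing nothing but plain links, as suggested above.
function buildCrawlerPages(urls, perPage) {
  const pages = [];
  for (let i = 0; i < urls.length; i += perPage) {
    const links = urls
      .slice(i, i + perPage)
      .map((u) => `<a href="${u}">${u}</a>`)
      .join("\n");
    pages.push(`<html><body>\n${links}\n</body></html>`);
  }
  return pages;
}
```

Each generated page stays small enough for the spider, and together the pages link to every URL you want indexed.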
Labels: Seo Techniques
Monday, March 2, 2009
It's important to note that you shouldn't try to optimize your home page for more than one theme; multiple themes just end up weakening one another. By using simple links to your alternative content, a link to your humor page can get folks where they want to go, and you can then write the humor page as a secondary index optimized toward a humor theme. In the end, each page should be optimized for the main topic of that page or site section.
Search engine optimization is made up of many simple techniques that work together to create a comprehensive overall strategy. This combination of techniques is greater as a whole than the sum of the parts. While you can skip any small technique that is a part of the overall strategy, it will subtract from the edge you'd gain by employing all the tactics.
Labels: Seo Techniques