broosAdmin
*only for wapmaster*
I believe you know exactly what search engines use to find content on a site: a simple online *robot* called a crawler. When I was working on the ZAMBIAN government online search, we were told to build a search engine that searches only government websites. Wow, I can say I got to know a crawler well.
Now here is how it works. A crawler does not read your content directly; it first visits your site's root to look for a *text file* which must be named
" robots.txt "
A robots.txt file commands a crawler as to which pages it may crawl and which pages must not be crawled. You can also direct the crawler to your *sitemap*. A sitemap can be in XML, TXT or HTML format, and I guess Google and Yandex prefer XML. Below this site you will find my sitemap. It contains all the links to my wapsite, which the crawler likes and reads. Then my pages stand a lucky chance of appearing in searches on Google, Yandex and Bing.
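For example, a minimal robots.txt could look like this (the paths and the sitemap URL here are just placeholders, not my real ones):

```
# Rules for all crawlers
User-agent: *
# Do not crawl the admin area
Disallow: /admin/
# Everything else may be crawled
Allow: /
# Point the crawler to the sitemap
Sitemap: http://example.com/sitemap.xml
```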
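And an XML sitemap itself is just a list of your links, one `<url>` entry per page. A small sketch (the URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- full address of one page on the site -->
    <loc>http://example.com/blog</loc>
    <!-- when the page last changed -->
    <lastmod>2015-01-01</lastmod>
    <!-- hint for how often the crawler should come back -->
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```

The crawler reads each `<loc>` and follows it, so every page you want found should have its own entry.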
Now see more topics on
http://zone.wapamp.com/blog