prevent a page from being picked up by a search engine
Posted: Tue Sep 17, 2002 2:37 am
by noguru
Is there a way to prevent a webpage from being picked up by a search engine? I used frames, and now Google is picking up not only the default page (which contains the framesets) but also the page in the main frame. When that page is loaded on its own, the functionality in the other frames is lost. I want only the default page to be picked up by search engines.
Posted: Tue Sep 17, 2002 2:43 am
by dusty
<META NAME="ROBOTS" CONTENT="NOINDEX">
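For the tag to take effect it has to sit in the page's &lt;head&gt;. A minimal sketch (the NOFOLLOW value is an optional extra, added here as an assumption, that also tells robots not to follow the page's links):

```
<head>
  <meta name="ROBOTS" content="NOINDEX, NOFOLLOW">
</head>
```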
Posted: Tue Sep 17, 2002 5:07 am
by noguru
I've tried it, but Google still picks up my page. Does it take a while?
In the meantime I've written the following piece of client-side script that also does the trick: it checks whether the window is inside a frame and, if not, redirects to the default page:
Code:
<script language="JavaScript">
// If this page is loaded outside its frameset, window.name is empty;
// in that case send the visitor to the framed default page instead.
var wn = window.name;
if (wn.length == 0) {
    window.parent.location.href = 'http://whatever/';
}
</script>
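One caveat with the script above: window.name is empty only if nothing has ever named the window, so the check can misfire. A more robust sketch compares the window to its top-most ancestor instead (isFramed is a hypothetical helper name; the redirect URL is a placeholder):

```javascript
// Returns true when win is loaded inside a frame or frameset.
// In a frameset, win.top is the outermost window, not win itself.
function isFramed(win) {
  return win.top !== win;
}

// In the page itself one would then do (browser-only):
// if (!isFramed(window)) {
//   window.location.href = 'http://whatever/';
// }
```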
Thanks anyway.
Posted: Tue Sep 17, 2002 8:16 am
by dusty
Google has to reindex your site before the change takes effect, I believe. Some search engines are manual and you have to log in and request it yourself.
Posted: Tue Sep 17, 2002 11:25 am
by phice
Search Google for "robots.txt". There are a few directives you can put in that file to stop search engines from crawling certain files and folders. This is especially useful for /admin, /beta, or /test folders.
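For example, a minimal robots.txt (served from the site root; the folder paths below are just placeholders) could look like this:

```
User-agent: *
Disallow: /admin/
Disallow: /beta/
Disallow: /test/
```

Note that robots.txt blocks crawling of whole paths, while the META ROBOTS tag works per page, so for keeping a single framed page out of the index the meta tag is the closer fit.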