- Original Poster
- #1
Hi, I recently changed providers for my website and site traffic has doubled. However, when looking at Google's Webmaster Tools it says:
"there are severe health issues with your site - some important pages are being blocked in robots.txt"
Under crawl errors it says there are 3,306 pages restricted by robots.txt.
I assumed this was just because it had duplicate listings for each page, as I only have around 180 pages.
Is this message something I should just ignore? The robots.txt file hasn't been altered at all since it was first set up.
I am using Magento; the file is as follows:
User-agent: *
Disallow: /index.php/
Disallow: /*?
Disallow: /*.js$
Disallow: /*.css$
Disallow: /checkout/
Disallow: /tag/
Disallow: /app/
Disallow: /downloader/
Disallow: /js/
Disallow: /lib/
Disallow: /*.php$
Disallow: /pkginfo/
Disallow: /report/
Disallow: /skin/
Disallow: /var/
Disallow: /catalog/
Disallow: /customer/
Disallow: /sendfriend/
Disallow: /review/
Disallow: /*SID=
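(For anyone who wants to test which URLs a file like this blocks, here's a quick sketch using Python's standard urllib.robotparser — example.com and the sample URLs are placeholders. Note that Python's parser only handles the simple prefix rules, not the * and $ wildcard patterns that Googlebot understands, so only the plain directory rules are included below:)

```python
from urllib import robotparser

# A subset of the Magento rules above. Only simple prefix rules are
# used here, since urllib.robotparser does not support the * and $
# wildcards that Googlebot does.
rules = """\
User-agent: *
Disallow: /checkout/
Disallow: /catalog/
Disallow: /customer/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# A normal product page is allowed; checkout URLs are blocked.
print(rp.can_fetch("*", "https://example.com/some-product.html"))  # True
print(rp.can_fetch("*", "https://example.com/checkout/cart/"))     # False
```

Google's own robots.txt Tester in Webmaster Tools is the authoritative check for the wildcard rules like /*? and /*SID=, which is what usually accounts for the large count of blocked duplicate URLs.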
Thanks for your help,
Sarah
