Although you may not know it, some of your web
pages might block Google. If Google cannot access all of your web
pages, you're losing visitors and sales. Here are five reasons why
Google cannot access your pages:
1. Errors in the robots.txt file of your website keep Google away

The Disallow directive of the robots.txt
file is an easy way to exclude single files or whole directories from
indexing. To exclude individual files, add this to your robots.txt file:

User-agent: *
Disallow: /directory/name-of-file.html

To exclude whole directories, use this:

User-agent: *
Disallow: /first-directory/
Disallow: /second-directory/

If your website has a robots.txt file, double
check your robots.txt file to make sure that you do not exclude
directories that you want to see in Google's search results.

Note that your website visitors can still see the pages that you
exclude in the robots.txt file. Check your website with the
website audit tool in SEOprofiler to find out if there are
issues with the robots.txt file.
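
You can also test a robots.txt file programmatically. Here is a
minimal sketch in Python, using the standard urllib.robotparser
module (the URLs are placeholders, not real pages of your site):

from urllib import robotparser

# Placeholder URLs - replace them with your own robots.txt and pages.
robots_url = "https://www.example.com/robots.txt"
test_urls = [
    "https://www.example.com/",
    "https://www.example.com/private/page.html",
]

parser = robotparser.RobotFileParser()
parser.set_url(robots_url)
parser.read()  # download and parse the robots.txt file

for url in test_urls:
    if parser.can_fetch("Googlebot", url):
        print(url, "- Googlebot may crawl this page")
    else:
        print(url, "- blocked by robots.txt")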
2. Your pages use the meta robots noindex tag

The meta robots noindex tag
enables you to tell search engine robots that a particular page should
not be indexed. To exclude a web page from the search results, add the
following code in the <head> section of a web page:

<meta name="robots" content="noindex, nofollow">
In this case, search engines won't index the page and they also won't
follow the links on the page. If you want search engines to follow the
links on the page, use this tag:
<meta name="robots" content="noindex, follow">
The page won't appear on Google's results page then, but the links will
be followed. If you want to make sure that Google indexes all pages,
remove this tag.
The meta robots noindex tag only influences search engine robots.
Regular visitors of your website can still see the pages. The website
audit tool in SEOprofiler will also inform you about issues with the
meta robots noindex tag.
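
If you want to check your pages for this tag automatically, a short
script can do it. Here is a minimal sketch in Python, using only the
standard library (the URL is a placeholder):

from html.parser import HTMLParser
import urllib.request

class RobotsMetaParser(HTMLParser):
    # Collects the content of all <meta name="robots"> tags on a page.
    def __init__(self):
        super().__init__()
        self.robots_content = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            if (attrs.get("name") or "").lower() == "robots":
                self.robots_content.append(attrs.get("content") or "")

url = "https://www.example.com/"  # placeholder - use your own page
html = urllib.request.urlopen(url).read().decode("utf-8", errors="replace")

parser = RobotsMetaParser()
parser.feed(html)

for content in parser.robots_content:
    if "noindex" in content.lower():
        print("Found a noindex tag:", content)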
3. Your pages send the wrong HTTP status code

The server header status code
enables you to send
website visitors and search engine robots to different places on your
website. A web page usually has a "200 OK" status code. For example,
you can use these server status codes:
- 301 moved permanently: this request and all future requests
should be sent to a new URL.
- 403 forbidden: the server refuses to respond to the request.
For search engine optimization purposes, a 301 redirect should be used
if you want to make sure that visitors of old pages get redirected to
the new pages on your website.
The website audit tool in SEOprofiler shows the status
codes that are used by your website
and it also highlights pages
with problematic status codes.
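
To see the status codes that your pages actually return, you can
request them without following redirects. Here is a minimal sketch in
Python (the URLs are placeholders); http.client does not follow
redirects, so a 301 or 403 is reported as-is:

import http.client
from urllib.parse import urlparse

def status_of(url):
    # Send a HEAD request and return the raw status code of the response.
    parsed = urlparse(url)
    connection = http.client.HTTPSConnection(parsed.netloc, timeout=10)
    connection.request("HEAD", parsed.path or "/")
    response = connection.getresponse()
    connection.close()
    return response.status, response.reason

# Placeholder URLs - replace them with pages of your own website.
for url in ["https://www.example.com/", "https://www.example.com/old-page.html"]:
    code, reason = status_of(url)
    print(url, "->", code, reason)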
4. Your pages are password protected

If you password protect your
pages, only visitors who know the password will be able to view the
content. Search engine robots won't be able to access the pages. Password
protected pages can have a negative influence on the user experience, so
you should thoroughly test this.
5. Your pages use cookies or JavaScript

Cookies and JavaScript can
also keep search engine robots away from your door. For example, you
can hide content by making it only accessible to user agents that
accept cookies. It might be that
your web pages use JavaScript to display the content of the
pages. Google can parse these pages to some extent, but you're making it
unnecessarily difficult then.
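
A simple way to approximate what a search engine robot sees is to
download the raw HTML without cookies and without executing
JavaScript, and then check whether your important content is in it.
Here is a minimal sketch in Python (the URL and the phrase are
placeholders):

import urllib.request

# Placeholders - use one of your own pages and a text that should be
# visible on that page.
url = "https://www.example.com/products.html"
phrase = "our products"

request = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
html = urllib.request.urlopen(request).read().decode("utf-8", errors="replace")

if phrase.lower() in html.lower():
    print("The text is in the raw HTML - simple robots can see it.")
else:
    print("The text is not in the raw HTML - it may require JavaScript or cookies.")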
How to find these problems on your website

In general, you
want Google to index your pages. For that reason, it is important to
find potential problems on your site. The website audit tool in
SEOprofiler locates all issues on your site and it also shows you how
to fix these problems. If you haven't done it yet, try SEOprofiler now.
Please forward this article to your friends!