Weekly SEO news: 29 October 2013
Welcome to the latest issue of the Search Engine Facts newsletter.

Sometimes, your website might contain pages that you do not want to appear in Google's search results. Some people block Google without knowing it, and others don't know how to exclude individual pages. This week's article shows you how to prevent Google from indexing pages of your website.

In the news: Google's recent changes mainly benefit Google, Google tests large banner ads on the search results page, Google might ignore your disavow links file, and more.

Table of contents:

1. Five ways to keep your website out of Google
2. Search engine news and articles of the week
3. Recommended resources
4. Previous articles

We hope that you enjoy this newsletter and that it helps you to get more out of your website. Please pass this newsletter on to your friends.

Best regards,
Andre Voget, Johannes Selbach, Axandra CEO

1. Five ways to keep your website out of Google

Sometimes, your website might contain pages that you do not want to appear in Google's search results. Some people block Google without knowing it, and others don't know how to exclude individual pages. This week's article shows you how to prevent Google from indexing pages of your website.

1. Block pages and directories with the robots.txt file

The disallow directive of the robots.txt file is an easy way to exclude single files or whole directories from indexing. To exclude individual files, add this to your robots.txt file:

User-agent: *
Disallow: /directory/name-of-file.html

To exclude whole directories, use this:

User-agent: *
Disallow: /first-directory/
Disallow: /second-directory/

If you use this method, double-check your robots.txt file to make sure that you do not exclude directories that you want to appear in Google's search results.

Note that your website visitors can still see the pages that you exclude in the robots.txt file.

2. Block individual pages with the meta robots noindex tag

The meta robots noindex tag enables you to tell search engine robots that a particular page should not be indexed. To exclude a web page from the search results, add the following code to the <head> section of the page:

<meta name="robots" content="noindex, nofollow">

In this case, search engines won't index the page and they also won't follow the links on the page. If you want search engines to follow the links on the page, use this tag:

<meta name="robots" content="noindex, follow">

The meta robots noindex tag only influences search engine robots. Regular visitors to your website can still see the pages.

3. Block pages with the correct server header status

The server header status code tells browsers and search engine robots how to treat a page, and you can use it to redirect or block requests. A web page usually has a "200 OK" status code. For example, you can use these server status codes:
  • 301 Moved Permanently: this request and all future requests should be sent to a new URL.
  • 403 Forbidden: the server refuses to respond to the request.
For search engine optimization purposes, use a 301 redirect if you want to make sure that visitors to old pages are redirected to the new pages on your website.
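
For example, if your website runs on an Apache web server, you can send a "301 moved permanently" status code with a Redirect directive in your .htaccess file. This is a minimal sketch; the domain and file names are placeholders:

# .htaccess: permanently redirect an old page to its replacement
Redirect 301 /old-page.html http://www.example.com/new-page.html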

4. Password protect your web pages

If you password protect your pages, only visitors who know the password will be able to view the content.

Search engine robots won't be able to access the pages. Password protected pages can have a negative influence on the user experience, so you should test this thoroughly.
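
For example, if your website runs on an Apache web server, you can password protect a directory with HTTP basic authentication in an .htaccess file. This is a minimal sketch; the path to the .htpasswd file is a placeholder, and the user accounts must be created with the htpasswd tool:

# .htaccess: require a user name and password for this directory
AuthType Basic
AuthName "Private area"
AuthUserFile /home/example/.htpasswd
Require valid-user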

5. Use cookies and JavaScript to present your content

Cookies and JavaScript can also help you to keep search engine robots away from your door. For example, you can hide content by making it only accessible to user agents that accept cookies.

You can also use complex JavaScript to deliver your content. Most search engine robots do not execute complex JavaScript code.
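
For example, the following sketch only writes the content to the page if the visitor's browser accepts cookies. The element ID and the inserted text are placeholders; most search engine robots do not accept cookies or execute this script, so they will not see the content:

<div id="hidden-content"></div>
<script>
// Try to set a test cookie. If the browser accepts it,
// insert the content into the placeholder element.
document.cookie = "cookietest=1";
if (document.cookie.indexOf("cookietest=1") !== -1) {
  document.getElementById("hidden-content").innerHTML =
    "This content is only visible to visitors who accept cookies.";
}
</script>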

Blocking Google can be helpful for some pages. In general, however, you want Google to index your pages. The tools in SEOprofiler help you to make sure that Google and other search engines can index your web pages correctly.

If you haven't done it yet, create your SEOprofiler account now. If you want to save time, order the full version for our special price:

Create a free trial account now or get the full version and save 99%.

2. Search engine news and articles of the week

Are Google's search UI changes motivated by profit or better UX?
"Clearly, the goal is to keep the visitor on Google's properties for as long as possible, but there is more danger here than first meets the eye. [...]

The actual sources of information are becoming less and less prominent, Google turns into a scraper, an affiliate player even in some niches where that info can be monetised into leads (e.g. finance, travel).

It stops being the search engine and starts being a monetisation engine, but ultimately, it's the only one profiting from it, at others' expense."



Google tests very large banner ads in search results

Google is currently running a limited, US-only test, in which advertisers can include an image as part of the search ads that show in response to certain branded queries.

The new ads are very big banners that will appear at the top of the search result page above the first results.



Google: we might reject your disavow link files

"In an online discussion, Google’s Eric Kuan said that Google might choose to reject your disavow links file.

The disavow tool should be used primarily if you have done everything you can to remove an inorganic link and still cannot get the link removed. If you are primarily disavowing links and not trying to remove them, your reconsideration request may not be successful."


Bing: How a higher Quality Score in Bing Ads can mean more clicks and more sales
"You may be tempted to run on every keyword variation and synonym you can think of, but unless there is a relevant connection between that keyword, your ad and your web site, you run the risk of managing keywords that may not serve."

Related: Stop obsessing over AdWords Quality Score



Huge Google shift points to faster search results
"Researchers at USC have stumbled on a huge change in how Google architects its search services. The result? Reduced lag in serving search queries, especially in more far-flung regions (as in, far from Google’s own data centres)."


Search engine newslets

  • Google's Matt Cutts: more pages do not equal better rankings.
  • Google Maps Easter egg for Philadelphia.
  • Google's 100 search results per page breaks.
  • Twitter will raise more than $1 billion in IPO.
  • Pinterest does another massive funding — $225 million at $3.8 billion valuation.
  • Google testing new local listing “About Page” layout – Just what are they thinking?
3. Recommended resources

How to get a website health-check quickly and easily

Doing a regular audit of your web pages is important if you want to make sure that search engines can index all of your pages. The more pages your website has, the more likely it is that some of them contain errors that have to be corrected.

The website audit tool in SEOprofiler automatically checks your web pages once per week. You will get a report that shows all the issues that should be corrected. By correcting these issues, you make sure that search engines can index all of your pages correctly. Here’s a screenshot of the overview page:

[Screenshot: SEOprofiler website audit overview page]

You can get the website audit tool and all other tools in SEOprofiler for our special offer price:

Create a free trial account now or test the full version for only $1.

4. Previous articles