Weekly SEO news: 12 August 2014
Welcome to the latest issue of the Search Engine Facts newsletter.

It might be that you're blocking Google without knowing it. That means that Google won't index all the pages of your website. In this article, you'll learn how to make sure that you do not block Google inadvertently.

In the news: Google gives HTTPS sites a small ranking boost, Google hasn't updated the official PageRank for 8 months, Google query tricks, Facebook bans 'Like gates', and more.

Table of contents:

We hope that you enjoy this newsletter and that it helps you to get more out of your website. Please pass this newsletter on to your friends.

Best regards,
Andre Voget, Johannes Selbach, Axandra CEO

1. Five ways to inadvertently keep Google from indexing your website

It might be that you're blocking Google without knowing it. That means that Google won't index all of the pages of your website. In this article, you'll learn the most common ways in which websites block Google, and how to make sure that you do not block Google inadvertently.


1. Errors in the robots.txt file of your website will keep Google away

The disallow directive of the robots.txt file is an easy way to exclude single files or whole directories from indexing. To exclude individual files, add this to your robots.txt file:

User-agent: *
Disallow: /directory/name-of-file.html

To exclude whole directories, use this:

User-agent: *
Disallow: /first-directory/
Disallow: /second-directory/

If your website has a robots.txt file, double-check it to make sure that you do not exclude directories that you want to appear in Google's search results.

Note that your website visitors can still see the pages that you exclude in the robots.txt file. Check your website with the website audit tool in SEOprofiler to find out if there are any issues with the robots.txt file.
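You can also test your rules yourself. The following is a minimal sketch using Python's standard urllib.robotparser module; the rules and URL paths are made-up examples matching the snippet above:

```python
from urllib import robotparser

# Hypothetical robots.txt rules, matching the example above.
rules = """
User-agent: *
Disallow: /first-directory/
Disallow: /second-directory/
"""

parser = robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# Googlebot is blocked from the disallowed directories...
print(parser.can_fetch("Googlebot", "/first-directory/page.html"))  # False
# ...but may crawl everything else.
print(parser.can_fetch("Googlebot", "/contact.html"))  # True
```

In a real audit you would load your live robots.txt with RobotFileParser.set_url() and read() instead of parsing a string.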

2. Use the meta robots noindex tag and Google will go away

The meta robots noindex tag enables you to tell search engine robots that a particular page should not be indexed. To exclude a web page from the search results, add the following code in the <head> section of a web page:

<meta name="robots" content="noindex, nofollow">

In this case, search engines won't index the page and they also won't follow the links on the page. If you want search engines to follow the links on the page, use this tag:

<meta name="robots" content="noindex, follow">

The page won't appear on Google's result pages, but the links on it will be followed. If you want search engines to index all of your pages, remove this tag.

The meta robots noindex tag only influences search engine robots. Regular visitors of your website still can see the pages. The website audit tool in SEOprofiler will also inform you about issues with the meta robots noindex tag.
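To audit this by hand, you can scan a page's HTML for the tag. Here is a minimal sketch with Python's standard html.parser module; the page source and class name are made-up examples:

```python
from html.parser import HTMLParser

class NoindexFinder(HTMLParser):
    """Collects the content of every <meta name="robots"> tag."""
    def __init__(self):
        super().__init__()
        self.robots_directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.robots_directives.append(attrs.get("content", "").lower())

# Hypothetical page source:
page = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
finder = NoindexFinder()
finder.feed(page)
blocked = any("noindex" in d for d in finder.robots_directives)
print(blocked)  # True: this page asks search engines not to index it
```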

3. The wrong HTTP status code will send Google away

The HTTP status code that your server returns tells browsers and search engine robots how to handle a URL. A normal web page returns a "200 OK" status code. Other common status codes include:
  • 301 Moved Permanently: this request and all future requests should be sent to a new URL.
  • 403 Forbidden: the server refuses to respond to the request.
For search engine optimization purposes, use a 301 redirect if you want to make sure that visitors of old pages get redirected to the new pages on your website.

The website audit tool in SEOprofiler shows the different status codes that are used by your website and it also highlights pages with problematic status codes.
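The rules above can be sketched as a small triage helper. This is an illustrative Python function (not part of SEOprofiler); in a real crawl you would feed it the status codes returned by your server:

```python
def seo_status_advice(status: int) -> str:
    """Explain what a status code means for search engine indexing."""
    if status == 200:
        return "OK: the page can be crawled and indexed."
    if status == 301:
        return "Permanent redirect: send robots and visitors to the new URL."
    if status in (302, 307):
        return "Temporary redirect: prefer a 301 for pages that moved for good."
    if status == 403:
        return "Forbidden: the server refuses the request; the page won't be indexed."
    if status == 404:
        return "Not found: the page will drop out of the index."
    return f"Status {status}: check the server configuration."

print(seo_status_advice(301))
```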

4. Google won't index password protected pages

If you password protect your pages, only visitors who know the password will be able to view the content.

Search engine robots won't be able to access password protected pages, so these pages will not appear in search results. Password protection can also have a negative influence on the user experience, so test it thoroughly.
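On an Apache web server, for example, password protection is typically configured with HTTP Basic Authentication; the file paths and realm name below are placeholders:

```apache
# .htaccess in the directory you want to protect (paths are examples)
AuthType Basic
AuthName "Members only"
AuthUserFile /home/example/.htpasswd
Require valid-user
```

Search engine robots do not send credentials, so they receive a "401 Unauthorized" response and never see the content.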

5. If your pages require cookies or JavaScript, Google might not be able to index your pages

Cookies and JavaScript can also keep search engine robots away from your door. For example, you can hide content by making it only accessible to user agents that accept cookies.

You can also deliver your content through complex JavaScript code. Most search engine robots do not execute complex JavaScript, so they won't be able to read your pages.
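As an illustration, content that is written by a script only exists after a browser runs it; a robot that does not execute JavaScript sees an empty page in the HTML source (the element ID and text are made-up examples):

```html
<div id="content"></div>
<script>
  // This text is inserted at runtime. Robots that do not run
  // JavaScript never see it in the downloaded HTML source.
  document.getElementById("content").innerHTML = "Welcome, visitors!";
</script>
```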

In general, you want Google to index your pages. The tools in SEOprofiler help you to make sure that Google and other search engines can index your web pages correctly. Use the website audit tool in SEOprofiler to make sure that you do not inadvertently block Google.

Back to table of contents - Visit SEOprofiler.com
2. Search engine news and articles of the week
Google gives HTTPS sites a small ranking boost
"We're starting to use HTTPS as a ranking signal. For now it's only a very lightweight signal — affecting fewer than 1% of global queries, and carrying less weight than other signals such as high-quality content — while we give webmasters time to switch to HTTPS."


Google toolbar PageRank hasn't been updated in 8 months

"So will they update it in the future? Maybe, I mean, if all these sites switch to HTTPS, all the PR will be down to 0, so they might have to refresh it once in the future, to assign PageRank to those new URLs. Personally, I wouldn't mind if they never updated it again."


Mobile search spend in UK and Australia outpaces U.S.

"While average phone CPCs are 12 cents lower than tablet in Australia, the average CPC spread between phone and tablet has narrowed in the US and UK. In the US, average phone CPCs are $0.04 cents lower than tablet. In the UK, the phone CPCs are just .02 euros shy of tablets."


Google query tricks

"This isn't some new Google Search feature, but I thought it's worth sharing. Google has some smart algorithms that process your queries and can determine what you intended to type even if it's not properly formatted."

Facebook bans the “Like gate”; pages have 90 days to comply

"In an update to its Platform Policy this week, the social network said it will no longer allow Pages to require a user to Like a Page to gain access to content, contests, apps or rewards."

Search engine newslets
  • Google readies test to target users with ads across the mobile web and apps.
  • No address? No ratings for you! Facebook ratings axed for pages without a physical location.
  • Through the Google lens: search trends August 1-7.
3. Recommended resources

Everything you need

SEOprofiler is a full-featured web-based SEO tool that offers everything you need to get high rankings on Google:

  • automatic weekly website audits
  • tools for keyword research and analysis
  • competitive intelligence tools
  • automated ranking checks
  • website analytics
  • tools for spam-free link building
  • link disinfection tool
  • social media monitoring
  • white-label reports (web-based and PDF)
  • and much more.



Create a free trial account now or test the full version.

Back to table of contents - Visit SEOprofiler.com
4. Previous articles
Back to table of contents - Visit SEOprofiler.com

Do you have a minute?