SEO


To follow or not to follow


by Sally Kavanagh

The way in which the nofollow tag is being used seems to be getting silly.

When it was first introduced it was intended to prevent the abuse caused by spamming blog comments with links. A very good idea, although my blog is still bombarded with spam comments if I don’t take action to prevent them.

Then it was used to stop paid links – links from adverts and the like – from passing link juice. Again quite sensible and reasonable, but it is just beginning to get difficult. Surely a link from a reputable directory that vets all its listings should pass PageRank. No low-ranking spam site is going to either pay (if it’s a paid directory) or pass editorial review, so surely such a link is a legitimate SEO signal that the search engines could use.
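For anyone unfamiliar with the mechanics, the nofollow is simply a rel attribute on the link itself – the URLs below are placeholders for illustration:

```html
<!-- An editorially given link: passes PageRank as normal -->
<a href="https://example.com/">Example Widgets</a>

<!-- A paid or untrusted link: nofollow tells the engines not to pass PageRank -->
<a href="https://example.com/" rel="nofollow">Example Widgets</a>
```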

Today I read a very scary headline in an email shot from Econsultancy’s Daily Pulse:

“Has Google really just killed the PR industry?”

The article references a recent update to Google’s Webmaster Tools guidelines.

The relevant section reads:

Additionally, creating links that weren’t editorially placed or vouched for by the site’s owner on a page, otherwise known as unnatural links, can be considered a violation of our guidelines. Here are a few common examples of unnatural links that violate our guidelines:

•   Links with optimized anchor text in articles or press releases distributed on other sites. For example:
There are many wedding rings on the market. If you want to have a wedding, you will have to pick the best ring. You will also need to buy flowers and a wedding dress.

Now does that mean that if I write a press release I should not use keyword anchor text? Surely if I am creating a link to a page on my client’s site, it makes sense to use anchor text that tells my reader and the search engines what the page is about. And Google has said that links in press releases do not benefit rankings – https://productforums.google.com/forum/#!topic/webmasters/O178PwARnZw/discussion
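To illustrate the distinction Google is drawing (with a made-up URL), this is the difference between keyword-rich anchor text and the generic alternative:

```html
<!-- Optimised anchor text: tells readers and engines what the page is about,
     but in a distributed press release it now risks being flagged as unnatural -->
<a href="https://example.com/sports-watches">sports watches</a>

<!-- Generic anchor text: "safe", but tells no one anything -->
<a href="https://example.com/sports-watches">click here</a>
```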

We are in danger of getting to the point where what would have been considered ‘good practice’ until recently is now branded as unnatural link building!

So perhaps we should all start optimising for ‘click here’ and ‘contact us’. Then Google will realise that no one is bothering with the meta keywords tag, go back to ranking sites on that and on keyword density, and everything will have come full circle.


 

Baidu – is it opening up to the UK?


by Sally Kavanagh

There is no doubt that Baidu, the Chinese-language search engine, is making moves westwards. It seems Baidu’s revenue has fallen in recent years, and this has been cited as a possible reason. On the other hand, you could say it simply makes sense for Baidu to capitalise on its technology and to exploit its 70–80% share of the huge Chinese online market.

At the end of February, Baidu announced its English-language website for developers, and it looks like this is part of Baidu’s effort to maintain its dominance in the Chinese mobile market in the face of competition from rivals such as Qihoo.

Another development is that Baidu has been approaching leading Western brands to partner with it as advertisers. A launch event is being held in London on March 18th.

So what does this mean for UK businesses looking to gain visibility in the Chinese search engine? It certainly looks as though Baidu is planning to open up more to the English-speaking world. Currently Baidu is a Chinese-language-only engine: sites need to be translated into Simplified Chinese, and it is hard, if not impossible, to get rankings if the site is not hosted in China.

Baidu does offer paid listings on the PPC model. It is not as straightforward to sign up as with Google or Bing, and there is a minimum spend of $2,000 a month, but it might offer a very interesting way of testing the Chinese market before investing larger sums on the ground. More information at http://www.chinasearchint.com/en/why-baidu/getting-started

Baidu could prove an interesting area for SEO in the near future, as China’s rapid development means it is looking more and more to Western markets for expansion.

 


 

Google algorithm changes in 2012 and checking rankings


by Sally Kavanagh

As we all know, Google changes its algorithm frequently – small changes are reputed to happen on a weekly if not daily basis. Wordtracker recently published a very useful little article summarising the major updates that occurred in 2012. The description of how this affects checking rankings is particularly useful, especially the reference to Google NCR (No Country Redirect) if you want to see how rankings look in another country.

The full article is available on the Wordtracker site.

The main changes last year were an increase in the personalisation of search and the two big anti-spam updates, Penguin and Panda.

I have big issues with the personalisation of search, although I can see the reasoning behind it – delivering results aligned with the searcher’s interests. My concern is that this inevitably means manipulating results, which veers towards censoring them. The aim may be laudable but the unintended consequences are not.

On the other hand, Penguin aimed to remove spammy sites, especially those with over-optimised links (remember Florida back in 2003 – nothing changes!), while Panda is designed to demote sites with very thin content.

The Wordtracker article is a good résumé of the 2012 updates and, more importantly, of how to deal with them.

 


 

Getting pages corrected and reindexed in Google


We’ve all done it. You upload a new page or new content to your site and there is a mistake in it. The headline includes a terrible typo, or you missed a zero off the promotional price banner. In the past, all you could do was wait for Google to respider and then reindex the page. Depending on the site and how deep into it the page was, that could take some time. All very difficult.

Google’s Webmaster Tools has now added a natty new feature that overcomes this. You’ll find it under ‘Diagnostics’ and then ‘Fetch as Googlebot’.

Fetch as Googlebot has been around for a while. It allows you to ask Googlebot to come and spider a page. Enter the URL, click Fetch, and the page will be listed. Simply click ‘Success’ once the page has been spidered and you will see the code that Google has read. The tool was initially intended to allow webmasters to check whether their sites had been hacked, but it is also useful because it shows the header status code (revealing any redirects etc.) and the download time, an increasingly important factor in determining rankings.

The new feature is that you can now submit the page to Google, meaning that the new page or content jumps the queue for getting spidered.

You can go further and ask Google to respider and reindex all the pages that the corrected page links to.

There is a limit on the use of Fetch as Googlebot. Matt Cutts says in his video on this that you have 50 fetches per week and 10 linked-page submissions. But I have a WMT account with about 30 domains on it, and that is allowing me 500 fetches – across all the domains, of course. I am still limited to 10 linked-page submissions though.

This tool has been around for about six months, but if you missed it, it’s very useful to know about for when the next drama happens.

Google’s Matt Cutts describes the process in more detail in his video.


 

SEO and competitive analysis


SERPS Redux is a useful, quick way of looking at the top results from Google for competitive SEO analysis.
 

Researching your link profile


I was presenting a course on how to optimise on-page content last week in Eastleigh, and the subject of link profiles came up – not surprisingly, perhaps, in any SEO presentation.

The tool I recommend is Majestic SEO’s Site Explorer. It offers quite a lot for free – all you need to do is register – and if you want to look at more sites or go into greater depth, subscriptions start at £9.99 per month.

Wordtracker also provides an excellent tool called Link Builder. This is more expensive, but its interface is easier to use and it analyses the links a bit more for you. The underlying data is taken from Majestic SEO; Wordtracker then makes it a little more accessible.


 

SEO – always traffic never just rankings!


“The challenge is that 60% of people use the keyword ‘watches’ to search on, and this is the main focus for optimisation. Currently listed on page three using this term, the objective for this brief is to move the listing to page 2. Please quote based on meeting this objective.”

This was the request I received last week from a prospective new client; the site offers watches, specifically aimed at the sports market. It threw up some interesting issues because, quite simply, it is not a brief I am prepared to take on! On the other hand, I find it extremely difficult to say no, especially when this is a prestigious site with plenty of SEO potential.

My problem is that SEO should be all about attracting qualified traffic to a site, not about chasing rankings. The two are not entirely unrelated of course, but neither are they synonymous. So this was the approach I took…

Many thanks for inviting me to provide a quote to meet the above objective. However, I would like to suggest a slightly different approach, as in my experience better results are obtained by developing organic search engine visibility across as wide a range of keywords as possible, rather than concentrating all resources on one, or a very few, generic keywords. In other words, focus on generating high-quality relevant traffic rather than on rankings.

The rationale behind this approach is as follows:

  • The keywords (phrases) that people use in search engines are incredibly varied, and the best-converting keywords are generally the most specific – the so-called long tail of search. A search for ‘watches for deep sea diving’ is much more likely to result in a sale than one for ‘watches’: the generic searcher may just be looking to see what is available, whereas the specific searcher is likely to be ready to buy once he finds exactly what he is looking for.
  • Google (and all the other engines) uses the number of visitors to a site as one of the 200 or so parameters that determine which sites rank where. So in order to rank on page 1 for ‘watches’, it will be necessary to have a very active site – excluding PPC activity – but such traffic only comes with good rankings, and good rankings only develop with traffic. The solution to this conundrum is to develop a wide range of rankings for long-tail keywords. These will bring in high-quality traffic likely to convert and, in doing so, help to develop rankings for the more competitive, generic terms such as ‘watches’.
  • Google can and does change its algorithm without notice, and certainly without any explanation. The wider the range of good rankings a site has, the less vulnerable it is to these changes. This also means that it is not possible to guarantee results in the organic listings. Payment on results is an option, though it tends not to be as straightforward as it may at first seem.
  • Targeting long-tail keywords is the best use of SEO resources.

I then continued the quote in the normal way with an outline of expected outcomes, and details of initial and ongoing work that I felt the site needed.

I got the job! The moral of this story? A large part of SEO is about educating the client, and that can begin well before you even start working for them.

And yes, the client came back very quickly, could see the sense behind my approach and we are currently discussing how we can go forward.


 

Alt text and image title text, their impact on SEO


Alt text and image title text are often available in CMSs, but what are they and how do they help with image and page optimisation?
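For anyone unfamiliar with the markup itself, alt and title are simply attributes on the image tag – the filename and wording below are invented for illustration:

```html
<!-- alt: describes the image for search engines and screen readers
     title: shown as a tooltip in most browsers -->
<img src="diving-watch.jpg"
     alt="Stainless steel diving watch on a wrist"
     title="Our best-selling diving watch">
```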
 

The importance of SEO in site redesign


It is so easy to throw the baby out with the bath water. The US news site Gawker did this in a big way when they redesigned their site recently. Econsultancy, the UK authority site on emarketing, analysed their problems very effectively in a news item.

Econsultancy’s number one issue with the new site is the apparent total lack of thought given to SEO. I am still amazed at the lack of understanding of SEO among so many (though not, of course, all) web designers and developers.

SEO is not a skill I expect designers and developers to possess – in my experience it takes a different sort of brain – but it is one it is vital they take fully into account if they are to deliver a website fit for purpose.


 

Social media activity helps SEO – it’s official


Google and Bing have both confirmed that they include social media activity on Twitter and Facebook in their ranking algorithms.