
Introducing Contrado Digital

We are delighted to announce the launch of our integrated online marketing agency Contrado Digital, specialists in Search Engine Optimisation.

We launched Contrado Digital because we believe the key to successful Search Engine Optimisation is integrating Search throughout your organisation. We have found that breaking down silos and joining teams together achieves awesome results.

 

Who’s Behind Contrado Digital?


Michael Cropper
Founder & Managing Director

Contrado Digital has been launched by Michael Cropper who has vast experience working across a range of industries including Luxury Travel, Budget Travel, Charity, Transport, Building, Trade Insulation and Ink & Toner. Michael has helped consistently deliver results on both the client side and the agency side throughout his career and decided it was time to branch out and help more businesses succeed online.

We work with a team of search specialists, infographic designers, content writers and web developers to help deliver the best results possible for your business.

Watch this space as the Contrado Digital team grows – we have big plans for the next 12 months and are always looking for great people to work with us!

 

Why Integrated Online Marketing?

The online environment is more competitive now than it has ever been, with more businesses aiming to capture the billions of people searching online every day for different products and services. We believe that to succeed long term online is ultimately about improving your whole business so search engines see that you are an authority in your industry.

 


Size of Google’s Index Back in 2008

 

Integrating Search throughout your organisation leads to a deeper understanding of your customers. This leads to better products and services being tailored to what your customers want and need.

Integrating Search throughout your organisation helps teams working together to achieve a common goal – increased traffic and sales from Organic Search.

It’s all well and good saying these things, so here are a few examples to illustrate how integrated online marketing can help your business grow.

 

Example 1 – Call Centre Support / Live Chat Functionality

Do you run a call centre or live chat to support customer queries? Have you ever thought about how Search can be integrated into this area?

What areas or topics are customers continually asking you about? What are their common queries or questions about your products or services? Continually reviewing this can be a fantastic source of information which can help feed into your content plan on your website.

If customers keep asking common questions about a product, then create some content on the website that answers their question. This will lead to a happier customer who now has the information they needed sooner. It will also lead to higher quality content being displayed on your website, which we all know is what search engines are looking for. As an added bonus, this is also likely to reduce call centre costs.

 

Example 2 – In-Store Staff


Bridge the gap between offline and online

Do you have physical stores on the high street or branches customers can visit? Have you ever thought about how Search can be integrated into this area?

Your staff are interacting with customers on a daily basis and will gather a phenomenal amount of information within this time about what customers are looking for when purchasing your product and service. Looking at ways to capture this information then feeding it into your website content plan can be extremely valuable.

What are your customers using the products and services for? What information do they need before making a purchasing decision? What other considerations are they making? All of this information can be understood and utilised to create content on your website. Again, this additional content that helps your customers is also going to help drive additional traffic to your website from Organic Search.

How about your in-store staff encouraging customers to engage online? Encourage them to ‘check-in’ to your store. Encourage customers to review your business after they have made a purchase. Encourage customers to share information socially once they have just made a purchase. The opportunities to integrate offline with online are endless and can seriously help your Search Engine Optimisation strategy.

 

Example 3 – Marketing Efforts

Do you regularly run TV campaigns, special promotions or competitions to engage your customers? Have you ever thought about how Search can be integrated into this area?

If you are running a TV campaign, then why not embed the ad on a page on your website and promote that page socially. This will not only increase the reach of the campaign but also help drive additional social shares and backlinks to your website. To generate additional traffic, you could also promote this on YouTube to capture the huge audience that visits the website daily.

If you are running a competition then instead of running the competition on a social platform, why not look at setting the competition up on your website. By promoting the competition page on your website this will help increase the number of backlinks, increase social shares and drive additional referral traffic to your website.

If you are creating a promotional page for a marketing campaign, then why not see if there are any specific keywords that could be targeted on the page. This small tweak can lead to additional organic traffic landing on the promotional page which may even perform better than other channels.

 

Integrated Online Marketing

Hopefully that gives a few examples of how Search can be integrated throughout your organisation to help drive awesome results for your business. Integration takes time, effort, collaboration and creativity to make things work better.

 

How We Work - Delivering Together

 

What Does Contrado Mean?

Contrado is Latin for “to deliver together” and is the foundation of how we work and our beliefs. We work with your organisation to deliver results together. You are experts in what you do and we like to think we know a thing or two about what we do. Together we can deliver results to help grow your business online.

The phrase “to deliver together” also links with integrated online marketing. Successfully integrating marketing and business activities ultimately means working together towards a common goal. This is something we strongly believe in and have seen time and time again how this process helps achieve awesome results.

 

Stay in Touch

Keep in touch with us socially on Twitter, Facebook, Google+, LinkedIn and subscribe to our Newsletter to stay up to date with all the latest news and trends in Search Engine Optimisation and the Digital Industry.

Looking to find out more sooner? Then get in touch to see how we can help grow your business online.

 

17 Of My 2013 SEO Predictions

At the beginning of 2012 I made some quite accurate SEO predictions, so I thought I’d have another go this year and see how accurate they are again at the end of the year. Here are a few predictions that I think/hope will happen at some point in 2013.

 

1. Restricted API Access

On a couple of occasions in 2012 there have been issues with API access for some large services, which have put pressure on businesses using those services to make changes. The main one was the Twitter API, where the terms of service changed back in August 2012 to make them more restrictive for people using the service.

Another change was how Google cracked down on usage of the AdWords API by revoking SEOmoz’s access in November and threatening to remove AdWords API access from Raven Tools if they didn’t remove their scraped rankings data.

API access to any service makes it extremely easy to utilise its data by creating more integrated products, services and data intelligence systems. I predict we are going to see a lot more API access restricted throughout 2013 as more people begin to take advantage of this technology, with Google specifically having a large list of APIs on offer.

 

 

2. Cross-Device Tracking Becomes a Reality

2012 has been the year where more people used multiple devices during their purchasing process than ever before. Tracking this behaviour accurately is extremely difficult because the technologies are all cookie based for users who aren’t logged in to your website. Visit from multiple devices and, in terms of analytics tracking, you are classed as two separate users with no connection at all.

What I would love to see in 2013 is for someone to come up with a solution to this issue and for it to be possible to track across multiple devices during the purchasing funnel. This would enable marketers to fully understand how customers behave and provide data to backup why people need a mobile website, why responsive design is important, how and when people convert, what the assisted conversions are, and all of the other interesting metrics that come along with this type of tracking.

With more and more people being continually logged into Google and Facebook while browsing the web maybe there is a solution to this here to at least provide some useful data across devices. While this is certainly not perfect (and I doubt that any system would ever be!) this could be an interesting step towards understanding the multiple-device purchasing process.
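To make the idea concrete, here is a minimal sketch of how logged-in IDs could stitch cookie-based sessions together across devices. All names and events here are hypothetical, and this is just an illustration of the stitching concept, not any real analytics product:

```python
# Hypothetical sketch: stitching cookie-based sessions into one
# cross-device journey, assuming a login event has linked each
# device cookie to the same account ID.
from collections import defaultdict

# (cookie_id, account_id or None, device, action) events
events = [
    ("cookie-A", "user-1", "mobile", "viewed product"),
    ("cookie-B", "user-1", "desktop", "purchased"),
    ("cookie-C", None, "tablet", "viewed product"),  # never logged in
]

def stitch(events):
    journeys = defaultdict(list)
    for cookie, account, device, action in events:
        # Fall back to the cookie ID when no login links the device,
        # which is exactly the "two separate users" problem described above.
        key = account or cookie
        journeys[key].append((device, action))
    return dict(journeys)

print(stitch(events))
```

Here "user-1" ends up with a single journey spanning mobile and desktop, while the never-logged-in "cookie-C" visitor remains an isolated session, which is why logged-in data only ever gives a partial picture.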

I think this prediction is unlikely to come true – it is more of a ‘fingers crossed’ wish!

 

Image courtesy of Search Engine Land

 

3. Big Businesses Gain a Deeper Understanding of SEO

Often larger organisations don’t fully understand the bigger picture with SEO. It is often seen as something that is either a one-off piece of work or something that you can simply bolt on to projects/websites. A bit like magic really.

With updates like Penguin and Panda forcing SEOs to look towards the higher quality side of SEO, I predict that this will in turn translate into bigger businesses gaining a much deeper understanding of what is involved with SEO. And this is only a good thing.

The better larger organisations understand SEO the more it can be integrated throughout their whole processes to get the best results possible.

 

 

4. Google Launches an Integrated Travel Product

With the speed that Google is moving in the travel space I predict that they will launch an integrated travel product which joins up all of their products/services that they have launched over the past couple of years.

Initially it started with purchasing several key businesses in the travel space. This was then used as the basis for creating their own travel products (which were pushed to the top of the SERPs!). The next logical step is to integrate these into something much more useful and that spans across the whole process of booking a holiday or a hotel.

Currently Google have;

 

 

I have posted a lot about how Google is entering the travel market over the past couple of years;

 

 

I would be extremely surprised if something like this didn’t launch in 2013. Whatever they launch, I predict that it will be a cross-platform product with some kind of social element integrated throughout (Google+ of course!)

Here is what it could look like (okay, I may have gone a little overboard with the ‘book’ buttons…)

 

 


 

5. Google+

I predict that Google announces further spurious statistics about the failing social media platform, Google+. A while back when I attended the Google@Manchester event they mentioned that;

 

“Google+ as a social network has a total of 400 million users with 100 million of these being active every month.”

 

According to Wikipedia (so it must be true…) Google has 53,546 employees, which would certainly account for around 0.053% of that user base (if my maths is correct!). Not quite sure who the rest are though…. For pure amusement and pointlessness’ sake, I am going to estimate (okay, make up) that each Google employee has convinced 10 of their family/friends to use the social network, each of whom has then convinced 2 of their friends to do the same, which gives….

  • 53,546 Google employees have convinced
  • 535,460 of their friends/family to join Google+ who have then convinced a further
  • 1,070,920 people to give it a go too.
  • This leaves us with a total of 53,546 + 535,460 + 1,070,920 people who are (in some way) related to Google employees, which gives us 1,659,926 people who are using Google+ each month – or around 1.66% of the active user base are Google ‘related’
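For anyone who wants to check the back-of-envelope arithmetic (same tongue-in-cheek assumptions as above), it fits in a few lines of Python:

```python
# Tongue-in-cheek check of the back-of-envelope Google+ numbers.
employees = 53_546                 # Wikipedia's headcount figure
friends = employees * 10           # assumption: each employee convinces 10 people
friends_of_friends = friends * 2   # assumption: each of those convinces 2 more

total = employees + friends + friends_of_friends
active_users = 100_000_000         # Google's claimed monthly active users

print(total)                                  # 1659926
print(round(total / active_users * 100, 2))   # 1.66 (% of active users)
```

Extremely scientific, as promised.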

 

As you can see… these figures are based on extremely scientific and accurate data sources, so cannot be questioned in any way :-) I also imagine that Google came to the figures it announced in a similarly scientific way…

 

 

 

6. Forget ‘Big Data’, Businesses Begin to Understand ‘Small Data’

‘Big Data’ – one of the phrases thrown around throughout 2012. I love data, I am a total geek and love spreadsheets, but what I have found is that most businesses struggle to understand small data, let alone big data (think Tesco’s data warehouses for their Clubcard).

I predict that in 2013 businesses are going to get a better understanding of the ‘small data’ that is already available to them and how this can inform business decisions. There is always so much information available within the data when you ask the right questions.

 

 

7. Google Jobs Launches

One vertical that Google has left largely untouched so far is the Jobs market. This is a place that is rife with aggregators and, to be fair, a lot of spammy and low quality websites (don’t get me wrong – they aren’t all like that – but there is a lot of junk out there..). This could be an area where Google launch a new Schema.org/Rich Snippet markup to help them better understand the jobs that are available.

The next step generally after this happens is that Google create their own vertical to compete against the people who have just handed them the structured data on a plate. While I think Travel and Social are areas that Google is more interested in at present, I wouldn’t be surprised if they begin making moves in this area.

 

 

8. Integrated SEO

As I mentioned in the earlier prediction about big businesses gaining a deeper understanding of SEO, this one follows on: as understanding grows, SEO becomes more integrated throughout the whole organisation – from integrating TV campaigns online, to integrating social with other areas of marketing, to working more closely with PR teams.

I predict that we are going to see much more integrated SEO throughout 2013.

 

 

9. Google Monetises More Previously-Free Services

Google currently has a large list of APIs available for people to use. This list has shrunk significantly over the past couple of years, due to Google deprecating APIs or starting to monetise them, as they have done with the Google Maps API.

I predict that in 2013 we are going to see Google looking to monetise more of their previously free APIs. It is difficult to say which these could be, but some of the more prominent contenders are the YouTube API and the Google Webmaster Tools API (the latter would be a contender after ‘persuading’ companies such as Raven Tools and SEOmoz to remove scraped data. I imagine next they will be informing them that they can pay to use the rankings data within the Google Webmaster Tools API!).

 

 

10. Bing Integrates Search into Other Products

Bing is still one to watch. Last year I predicted that they would gain search engine market share and they did, by a small amount. I predict that Bing (or Microsoft specifically) will begin to be integrated more and more into other products such as the Xbox and other Windows based devices such as mobile, Surface and other tablets.

Bing has created strategic partnerships with Facebook in the past, and I believe this will be key for them to grow further. If Facebook launch their own search platform though, I reserve the right to scrap this prediction altogether as this will have a reasonable impact on Bing’s market share :-)

 

 

11. SEO Role Expands

In the past the SEO role has often been quite isolated, looking at things purely from an SEO point of view. With Google updates making old skool SEO much harder to game, I predict that the role of an SEO will expand further. While I can’t imagine the role will be hands-on in the new areas, I believe the additional responsibility will be to gain a deeper understanding of different business areas, which will help the SEO team integrate better within an organisation.

These additional understandings could include PPC, Email and Offline Advertising. This can only be a good thing as a better understanding helps people to think more creatively about how to get the best results possible.

 

 

12. Businesses Test AdWords for Non-Commercial Keywords

While not traditionally SEO – though according to the above prediction it could well become a bigger part of the role – I predict that in 2013 Google is going to be pushing people to bid on non-commercial keywords within Google AdWords.

Why? Because it is an additional revenue stream for Google that is largely untapped, with the exception of the Google Grants for non-profit organisations. This will likely be pushed more as multi-touch point analysis becomes easier to track. Imagine if you placed an advert for the keyword “Best places to visit in Bangkok” and landed the person on a travel guide style piece of content. Then if you are also advertising for things such as “Bangkok Hotels” then is this user more likely to purchase the product with you as they have come across your brand before? Or not?

The cost per click for commercial keywords is always growing as more people start to bid on them, so I predict that we are going to see more people testing this type of advertising to generate a higher return on investment.

 

 

13. Links Become Even Harder to Game

While Google has made a lot of changes in relation to links over the past 12 months, I predict we are going to see an even greater change for links throughout 2013. Matt Cutts has already mentioned that Google ‘may’ target infographic websites as these aren’t really endorsements of your website. While I disagree with this for high quality infographic websites, I can understand where he is coming from as there are a lot of junky infographic websites out there.

Google has also mentioned that the guest blogging trend hasn’t been of the highest quality in the past and has been done more for the link value than anything else. Working in SEO, you certainly come across enough websites while trawling the web to fully understand what he means. I predict we are going to see a lot less value passed from these types of websites in the future, as often they are a bunch of niche websites owned by the same person, who is charging for posting. Ultimately, this is just a glorified link wheel with content and no real added value to anyone.

Overall in relation to links I predict that Google is going to crack down on medium quality links and really push people to think about real world and valuable links.

 

 

14. Google’s Pet Panda Goes to Sleep….for a while

I predict that in 2013 Google Panda will become a thing of the past – everyone who was going to be hit for low quality content has been hit already. Instead, this algorithm update will simply be incorporated into the normal search engine algorithm. While I don’t think this is the last we will see of Google targeting low quality content, it will be less prominent throughout 2013.

Who knows what the next big algorithm update will be called, I’m going to hedge my bets on…..The Google Puffin Update. (Black & White = Check. Animal = Check. Begins with P = Check.)

 

 

15. Delayed SEO Becomes a Reality

Back in August Bill Slawski did a post about ‘Transition Rank‘, a new Google patent about delaying algorithmic responses to make it harder to understand what is happening. I predict that in 2013 Google will announce the implementation of the ‘delayed response’ algorithm they applied to patent earlier in the year.

This will make it much harder to track direct results from specific pieces of work that have been undertaken. This type of change could also contribute towards the SEO role becoming more diverse as it turns into more of a marketing role.

 

 

16. Web Analytics Tools Become Even More Inaccurate

I predict that in 2013 we are going to witness analytical blindness as (not provided) reaches insane levels. Analytical blindness not in the sense that there is too much data, but in the sense that there simply isn’t anything we can see! Some reports are showing (not provided) data anywhere from 40% to 60% of traffic. On my own blog around 60% of organic traffic is currently (not provided), and while this will be higher than average given the content of the blog, it is a signal of where things are heading for everyone.

In addition to Google’s (not provided) data scandal, there are also issues with iOS 6 users on the iPhone, whose search data is automatically encrypted, which means….you guessed it….more (not provided) data.

Tracking results accurately is becoming a joke and being made harder and harder. I predict that in 2013 things are only going to get more difficult.

 

 

17. Increased Focus For User Generated Content

I predict that in 2013 user generated content will become a priority as brands struggle to keep up with the on-going requirement for high quality, unique and relevant content needed to sustain rankings. It is clear that several top brands are starting to understand this and have begun engaging a lot more with their customers, in better ways than just having a conversation over social media. Instead, they are starting to use brand advocates to generate content for them – usually after some kind of incentive.

User generated content can be such a powerful thing when there is simply too much information for one team of people to write about. Content creation takes a large amount of resource, so if you can use your audience to supplement this work it is a winning combination. I predict that we are going to see an increase in this type of content creation through 2013.

 

 

 

Happy 2013 Everyone! :-)

 

2 thoughts on “17 Of My 2013 SEO Predictions”

  1. Some great predictions, many of which are already turning into reality – for example the delayed ranking algorithm is already being witnessed. The prediction for unified Google travel products might or might not become a reality, as it would highly devalue Google’s search neutrality. Google Jobs might be a reality, but we have not seen even a slight indication of this. The most interesting part of your article is the insistence on small data, which I interpret as analysing already-available data at a micro or user-lifecycle level.

    1. Hi Ved,

      Thanks for the comments. For your comment about the ‘small data’ trend, yes this is all about getting people to analyse the data that is already available to people.

      As a simple example, within Google Analytics it is possible to segment brand and non-brand traffic. From this you can identify the differences in conversion rate and ultimately the revenue/profit each generates. It would then be possible to see if there are any marketing campaigns that could ultimately pay for themselves, along the lines of: “if we can convert 10 people into liking our brand and drive traffic to our website, then they are going to convert at y% more, which means we would make £z in revenue/profit – hey, that is more than the cost of the campaign!”

      It is really simple stuff to do, and in my experience it is just something that isn’t often looked at to help with the decision-making process.
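The back-of-envelope calculation in that reply can be sketched in a few lines of Python. Every number below is hypothetical – the point is the shape of the sum, not the figures:

```python
# Hypothetical numbers only: does a brand-awareness campaign pay for itself?
campaign_cost = 500.0        # assumed cost of the campaign (£)
new_brand_visitors = 10      # people the campaign converts into brand searchers
visits_per_person = 12       # assumed visits per person over the period
brand_conv_rate = 0.05       # assumed conversion rate for brand traffic
avg_order_value = 120.0      # assumed revenue per conversion (£)

# Expected revenue from the new brand traffic over the period
revenue = new_brand_visitors * visits_per_person * brand_conv_rate * avg_order_value

print(revenue)
print(revenue > campaign_cost)  # does it beat the cost of the campaign?
```

Swap in your own segmented Google Analytics numbers for the assumed ones and the same comparison tells you whether the campaign is worth running.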

How Accurate Were My 2012 SEO Predictions?

Back at the start of 2012 I predicted 13 things that would happen in the world of SEO over the next 12 months, looking back on these let’s see how many of these came true.

1. Social Signals Gain Importance

Back in 2011 Eric Schmidt announced that “the social signal, the people you ‘hang with’, is actually a ranking signal” – but has this actually manifested itself into anything with a little more value throughout 2012?

Well, I’m not too sure. While there have been tests to see how well social signals correlate with high rankings, the results are certainly not conclusive, and social signals are unlikely to be the cause of high rankings alone.

Then during Matt Cutts’ keynote at SMX Advanced in 2012, he said that while social signals are important, they are still nowhere near as important as links. Watch the full video below for what Matt Cutts has to say about social signals compared to links.

 

 

So all in all, I don’t think that the social signals have really increased that much in importance throughout 2012 purely for their standard SEO benefit.

That said, in my opinion social signals are an extremely important factor for long term success, for the simple reason that the more social mentions and shares you get, the more real people are exposed to your brand online. This can only be a good thing, as these are the type of people who will go on to link to your website from their blogs and the other communities they take part in online.

 

2. High Quality Content is Imperative for SEO Success

I think it goes without saying that this is one of the areas that has grown the most throughout 2012 within the SEO community. With the Panda update rolling out almost monthly throughout the year, high quality content could not be more important.

There have even been companies specifically re-branding themselves as content marketing companies as opposed to traditional SEO companies – companies such as Blue Glass UK after they acquired Kev Gibbons’ company Quatro in November 2012.

I have seen first hand on many occasions how important quality content is for increasing rankings for both non-competitive and competitive keywords. Write content –> let Google index the new content –> see increased rankings as a result.

John Doherty did a post back in October 2012 on SEOmoz which looked at what type of content got links in 2012, in which one of the interesting pieces of correlated data was that the longer the content, the more links were generated. While this isn’t to say that longer content will always generate more links, it is an interesting correlation nonetheless.

 

 

3. Rich Media Becomes More Important

This one kind of follows on from the previous mention about high quality content, as high quality content to me is going beyond the traditional couple of hundred words on a page to explain the topic. Instead it is about creating high quality content in whatever format is best to get the message or information across to the user.

Another interesting piece of data within the post what type of content gets links in 2012 was that posts with images generate more linking root domains compared to those without. The graph of this data is shown below;

 

So overall I would say that rich media has become a lot more important throughout 2012 than it has been in the past. More people are producing a wider range of content, with infographics becoming a favourite in the SEO world – although Matt Cutts has announced that;

 

“I would not be surprised if at some point in the future we did not start to discount these infographic-type links to a degree. The link is often embedded in the infographic in a way that people don’t realize, vs. a true endorsement of your site.”

 

Personally, I believe this was just a bit of a bluff. People love infographics and they communicate data in a way that is very easy to digest and understand. So while there may be some tweaks into targeting lower quality infographics in the future, this is not specific to infographics but more in line with targeting lower quality content overall. What I took from Matt’s statement was that you shouldn’t be producing low quality infographics purely for links but you should be producing high quality ones because they are genuinely useful for people.

 

4. Google Providing Answers Directly Within Search Results

Well…where to even begin with this one. There have been so many changes within the search results, with Google directly answering questions, that there is quite a bit to cover here!

Back in June 2012 Google launched Knowledge Graph to the world, which started to answer questions directly within the search results for an enormous number of queries – ultimately leading to less traffic for the websites this information was scraped from.

 

 

Since then there has been more and more of this type of content showing up directly within the search results including when you search for “things to do in Paris” which shows a huge list of different points of interest in Paris;

 

 

I would hate to think how much of an impact this has had for companies such as Lonely Planet and Trip Advisor as they are some good sources of this type of information, but I imagine they are getting much less traffic now these types of queries are being answered directly within the search results.

Here is another example of this huge bar when searching for “Bruce Willis Films”;

 

 

I’m guessing IMDB has had a bit of a hammering from the loss in traffic from this too.

This has been such a huge change within the search results for 2012 I dread to think of the impact this has had on some of these businesses. If anyone has any statistics about websites where their traffic has been hit hard from these introductions then let me know as it would be great to see some actual numbers.

 

5. Google Gets a Slap on the Wrist

Finally….finally…well almost. Lots and lots of things have been happening in this space, with Google being sued for countless different issues around the globe. Just to name a few: Google are being investigated for the tax avoidance schemes they have set up, and actually appeared in front of the Public Accounts Committee in November (which was rather amusing viewing if you saw it!).

Danny Sullivan of Search Engine Land also wrote a letter to the FTC about how search engines need to disclose more in relation to paid inclusion, although nothing has been heard back unfortunately.

And let’s not forget Barry Adams’ amusingly titled Google’s 2012 Clusterfuck Countdown, which lists a huge number of issues where Google is being held accountable for different things and generally messing up.

 

6. Mobile SEO Gains Traction

At the start of the year I predicted that Google would announce some kind of meta tag that would help them understand what the mobile version of a website is and what is the equivalent version on the desktop websites.

And I couldn’t have been more spot on with this. In June 2012 Google announced exactly this, which is outlined in their official mobile website guidelines.

Google recommends;

When building a website that targets smartphones, Google supports three different configurations:

  1. Sites that use responsive web design, i.e. sites that serve all devices on the same set of URLs, with each URL serving the same HTML to all devices and using just CSS to change how the page is rendered on the device. This is Google’s recommended configuration.
  2. Sites that dynamically serve all devices on the same set of URLs, but each URL serves different HTML (and CSS) depending on whether the user agent is a desktop or a mobile device.
  3. Sites that have separate mobile and desktop URLs.

 

This is a really good improvement that Google have made, as it ensures there are no duplicate content issues when displaying different content to people on different devices, along with removing the potential for penalties when redirecting users based on the ‘User-Agent’ header.
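For the third configuration (separate mobile and desktop URLs), Google’s guidelines describe bidirectional annotations between the two versions. A sketch along the following lines, with hypothetical example URLs, shows the idea:

```html
<!-- On the desktop page (www.example.com/page-1): point to the mobile version -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="http://m.example.com/page-1">

<!-- On the mobile page (m.example.com/page-1): point back to the desktop version -->
<link rel="canonical" href="http://www.example.com/page-1">
```

These annotations tell Google that the two URLs are the same content in two formats, which is what removes the duplicate content risk mentioned above.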

 

7. Less Importance on Number of Back Links

Need I say anything more than the word, Penguin? Back in April 2012 Google let another one of their black and white animals out of the bag with the Penguin update.

The quantity and quality of backlinks used to be extremely important, and while links are still an extremely important area to focus on, they need to be treated carefully. With the Penguin update actively penalising websites with a high quantity of low quality backlinks, we need to ensure that the links being built are good quality to avoid being placed into this category.

Previously, low quality backlinks simply didn’t pass much (if any) value at all, but this update was followed by Google announcing that they are sending out warnings to webmasters stating they have lots of low quality links pointing at their website and need to do something about it, which is insane.

After which Google then announced a Disavow tool in October 2012, which is even more ludicrous as it expects webmasters to do Google’s job for them. Absolute nonsense.

 

8. Only Google Reviews Are Used

No surprise again here, with Google deciding to screw over everyone else who previously provided reviews for their products. Google have now built up enough reviews of their own to not have to worry about external reviews. In addition to this, Google purchased Zagat back in September 2011 and have begun to integrate these reviews into Google’s services as well.

 

 

Below is a screenshot of how Google’s own reviews and their Zagat reviews appear on the Places pages;

 

 

The only place where it appears that external reviews are even being mentioned at the moment (specifically for a couple of hotel searches) is right at the bottom of the Places page (below both Google’s own reviews and the Zagat reviews) which shows;

 

 

The same is happening when reviews are included within the Knowledge Graph when it appears for different searches, whereby only Zagat reviews and Google-owned reviews are being displayed.

Honestly though, this should come as no surprise, as this is what Google do when rolling out all of their own products and services: they utilise key partners initially until they can build up enough of their own properties, then simply get rid of the people they were previously working with.

 

9. Bing Gains Market Share

Back at the start of 2012 I predicted that Bing’s search engine market share would grow to around 20% and while it has grown this year it hasn’t quite grown by that amount. Below is the search engine market share data for the past three years sourced from ComScore;

 

 

As you can see in the above chart, Bing’s market share has actually been steadily increasing over the past few years which is always a good sign in my opinion as it brings a bit more competition to the search engine scene. Google is still massively ahead of the rest though and will continue to be in this position for quite some time.

Below is a pie chart of the search engine market share in October 2012, which shows Bing at 16% of the search engine market. This is actually still a very good increase for Bing, as previously they were at 15% of the market, so at least they are growing;

 

 

During 2012 Bing also launched a campaign targeted at Google called Scroogled, which was a nice way to make people realise that there are other alternatives to Google. Hopefully we will see more of these campaigns from Bing in 2013.

 

10. Google Gets More Personal

It was looking like there was going to be a lot more personalisation happening throughout 2012, and there have certainly been more updates to the authorship algorithm, which allows more people to get their photo listed within the search results next to their own content.

Search Plus Your World launched in January 2012 and has received a lot of criticism since, but it emphasises how Google is really pushing personalised search by linking it up with their Google+ social network, which is going to be a massive focus in 2013.

 

 

 

11. The Death of SEO

:-)

 

12. Authorship Becomes Essential For Content

With authorship being pushed quite a bit by Google, as it links in with Google+, this has certainly become a more important area to focus on, simply because the added image listed within the SERPs helps drive additional organic traffic through to your website.

The markup hasn’t become essential for content writers though; while it is certainly an area that a lot of people are focusing on, it isn’t essential. Currently there doesn’t appear to be any evidence to suggest that if a piece of content is written by an influential person on a non-influential website, this ‘Person Rank’ passes any additional value or trust towards that website.

Maybe Google will announce something like this next year, but for 2012, while authorship is important to help you stand out in the search results, it doesn’t appear to be doing anything more than that at present.

 

13. Twitter Launches Analytics Platform

Well…it was a bit of wishful thinking :-) Maybe next year!

 

 

Summary

Overall though I think some of the predictions I made at the beginning of the year were actually quite accurate;

  1. Social Signals Gain Importance: Certainly still important but haven’t quite had the direct impact that was predicted
  2. High Quality Content is Imperative for SEO Success: More Panda updates, traditional SEO companies re-branding themselves as content marketing agencies and evidence to suggest that longer content generates more links
  3. Rich Media Becomes More Important: Certainly more people focusing on richer content and I can see this being the trend for the longer term
  4. Google Providing Answers Directly Within Search Results: An unbelievable amount of changes in this area
  5. Google Gets Slap On Wrist: Multiple slaps on the wrist
  6. Mobile SEO Gains Traction: Some nice changes to avoid duplicate content and potential penalties for redirects
  7. Less Importance on Number of Back Links: Penguin
  8. Only Google Reviews Are Used: Only Google’s own reviews and Zagat reviews being used now
  9. Bing Gains Market Share: By 1 percentage point, not quite the 5 points predicted, but it still gained
  10. Google Gets More Personal: Search Plus Your World
  11. The Death of SEO: :-) 
  12. Authorship Becomes Essential For Content: Certainly not essential but it can help stand out in the search results
  13. Twitter Launches Analytics Platform: We can only hope this will come next year!

 

Out of the 13 SEO predictions for 2012, I am quite happy that 9 of them have come true. Now time to have a think about what could happen in 2013…

2 thoughts on “How Accurate Were My 2012 SEO Predictions?”

  1. Thanks for the shout out, Mick. :) 9 out of 12 is a pretty good hit rate (I’m deliberately discounting prediction 11 :)). Statistically you’ve beaten pretty much every other set of predictions out there, including many by professional ‘futurists.’

How to Analyse Traffic from Link Building Work

A large part of SEO is about link building; you don’t need me to tell you that. What is important though is not just the changes in rankings happening due to the link building, but also the amount of referral traffic coming through from this work.

If your links aren’t generating traffic then I would begin to question the long term success those types of links are having on your website, rankings and brand.

Below outlines a quick and simple step by step process to quickly assess how effective your link building techniques have been in generating traffic to your website.

 

Step 1: Download all referral traffic from Google Analytics

Go to Google Analytics –> Traffic Sources –> Sources –> Referrals

 

 

Then view the maximum number of rows, 500, by using the filter at the bottom of the screen.

 

 

Then export all of this data into a CSV file which can easily be opened later in Excel. Click on Export –> CSV at the top of the screen as shown in the screenshot below

 

 

 

Step 2 – Combine Links Data with Referral Traffic

The next step is to open up the file you have just downloaded and re-save it as a normal Excel file, as we will be adding in some more tabs, data and look-ups which standard CSV files aren’t designed to handle.

Once you have done this do the following;

 

  1. Rename the first tab to ‘Google Analytics Referral Data’
  2. Create a second tab called ‘Links Built’ – this will help when doing the cool Excel magic a little later
  3. Add into the ‘Links Built’ tab (you guessed it!) all of the links you have built

Below is a screenshot of some of the referral traffic to my website;

 

Below is a screenshot of some of the links which I have built to my site by shamelessly self promoting my content;

 

 

The links in the above screenshot are just a small sample I have scraped together for this blog post; I don’t actually keep track of this for my own blog – I have got much better things to do with my life :-) For people building links in competitive industries and on larger websites, you will likely have lists which go into the hundreds or thousands, so this method can really save you some time.

So now you have all of the link and referral data within one Excel file which you can then do some cool Excel magic on.

 

Step 3 – Do Cool Excel Magic

Now you want to find out how many visitors the link building has resulted in. As mentioned previously, if you are just building links for the SEO value and not for traffic, is this really going to be a good long term SEO strategy?

The next step here is to add an extra column next to the list of referral traffic to see if this referral traffic was from the effort you put into link building;

 

 

If you want to copy and paste the formula then it is;

 

=IF(COUNTIF('Links Built'!$A$2:$A$100,CONCATENATE("*", A8, "*"))>0,"Yes","No")

 

The formula may look a little scary, but all it is essentially saying is: “See if the domain in A8 is contained within the list of links which I have built”. For a full guide on what this means, take a look through the blog post explaining this in detail, How to VLOOKUP Using Partial Match.

The different parts of the formula are saying;

  • Count the cells in the range A2:A100 where…
  • The cell contains a partial match on cell A8
  • If there is a match, then put “Yes” in the cell
  • If not, then put “No” in the cell
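The steps above can be sketched outside Excel too; here is a minimal Python version of the same logic (the referral domains and links below are purely hypothetical examples):

```python
# Python sketch of the spreadsheet check: for each referral source, flag
# "Yes" if its domain appears anywhere in the list of links built.
# The referral domains and links below are purely hypothetical examples.

def flag_built_links(referral_domains, links_built):
    """Mirror of COUNTIF('Links Built'!A:A, "*domain*") > 0 for each domain."""
    results = {}
    for domain in referral_domains:
        # A substring test is the Python equivalent of the "*...*" wildcard match
        hit = any(domain in link for link in links_built)
        results[domain] = "Yes" if hit else "No"
    return results

referrals = ["example.com", "another-blog.net"]
links = ["http://www.example.com/guest-post.html"]
print(flag_built_links(referrals, links))
# {'example.com': 'Yes', 'another-blog.net': 'No'}
```

As with the Excel formula, this only matches on domain, so the same caveat about page-level attribution applies.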

 

Now simply drag this formula down through to all of your referral traffic to see if this was from a link which was built or not.

Note, this check only looks at the domain name and is not page specific. So if you built a link on www.example.com/page1.html which drove 0 visits, and you got an organic link on www.example.com/page2.html which drove 100 visits, then all of this traffic would be attributed to the link building work you have done – which isn’t correct in this example.

By default Google Analytics only displays the domain name, not the full referral path. It is possible to set up an additional profile within Google Analytics to get the full URL for referrals; follow the guide in the link. One thing to note is that Google Analytics profiles only show data from the date they were set up, so you cannot see historical data with this method. Although if you set it up now, you can begin to get more accurate data in the future.

Second note: the formula described above can also be run as a VLOOKUP if you need to pull the data back into this tab, by simply editing the formula as follows (although this method will only bring back the first occurrence of the domain name, unless you have the full referral path within your analytics profile);

 

=VLOOKUP(CONCATENATE("*", A8, "*"), 'Links Built'!$A$2:$A$100, 1, FALSE)

 

Step 4 – Analyse the Data

Now you will have ended up with a list of Yes/No values telling you whether you built the link for each traffic source. Below is an example of how this can look after you have filtered to show only the “Yes” rows;

 

 

As you can see from the three domains listed above, these have driven over 1,000 visits to my blog in a short period of time. Are these good for SEO? Well, traditionally you may argue that the links are nofollowed so they don’t offer any value. Personally I would rather have a nofollowed link which drives actual traffic and real people to my website than a followed link which doesn’t drive any traffic at all.

I would suggest running reports like this on a regular basis to continually assess whether the work you are doing is actually driving real users to your website, and not just building links for the pure PageRank benefit of the link. If all of your links have driven 0 traffic to your website in the past X months, then I would begin to ask whether what you are doing is going to have real long term results for your website.

If you also track the type of website where you have been building links, such as guest blog posts, infographics, directories etc., then you can quickly assess which types of links are or aren’t driving traffic to your website, which can help gain further insights into what is working from a traffic point of view.

How I Built SimpleSitemapGenerator.org in a Weekend

Whilst working in the SEO industry there are times when certain tools would make your life easier and you just can’t quite find a tool that does the job you need. This is one occasion where I was looking for a simple sitemap generator, and all of the tools I could find either limited the number of URLs to a really small number or didn’t allow me to tell the tool what the URLs actually were.

So that is why I built SimpleSitemapGenerator.org over a weekend. I’m sure there will be sitemap tools out there that can achieve a similar result which I simply haven’t found, but my patience was wearing thin searching :-)

Was it difficult? Not really. It was just working through some basic logic to build in exactly what I needed. Below explains how I built the tool.

 

What is it built with?

Simple Sitemap Generator is built on a Java platform running on an Apache Tomcat web server. Why? Because I know Java. The exact same task could be achieved using any programming language you choose. My preferred method of developing websites is using the Integrated Development Environment (IDE) called NetBeans.

Some hardcore programmers prefer not to use these types of tools, as you can get yourself tied in knots in ways that require a deeper understanding to untangle – so if you only use these tools you may find it difficult to figure out what is wrong. Personally, I prefer to make my life as simple as possible – why make things more difficult than they have to be to achieve the task in hand?

 

 

How does it work?

Quite simple really: the list of URLs is parsed by a Java program behind the scenes which separates the URLs by the new line character. The other items, including the change frequency, last modified date and the priority, are also picked up from the main form and then used in the program.

The program ultimately just runs through each of the URLs in the list (up to a maximum of 50,000 URLs, due to this limitation within XML sitemaps) and wraps the correct tags around each item based on the latest XML sitemap specification.
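As a rough illustration (this is not the tool’s actual Java source), the core logic described above can be sketched in a few lines of Python:

```python
# Sketch of the logic described above: split the pasted text into one URL
# per line, cap the list at the 50,000-URL limit from the XML sitemap
# specification, and wrap each URL in the appropriate tags. The changefreq
# and priority parameters mirror the form fields.
from xml.sax.saxutils import escape

SITEMAP_URL_LIMIT = 50000  # maximum URLs allowed in a single XML sitemap

def build_sitemap(url_text, changefreq="weekly", priority="1.0"):
    urls = [line.strip() for line in url_text.splitlines() if line.strip()]
    urls = urls[:SITEMAP_URL_LIMIT]
    parts = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for url in urls:
        parts.append("  <url>")
        parts.append("    <loc>%s</loc>" % escape(url))
        parts.append("    <changefreq>%s</changefreq>" % changefreq)
        parts.append("    <priority>%s</priority>" % priority)
        parts.append("  </url>")
    parts.append("</urlset>")
    return "\n".join(parts)

print(build_sitemap("http://www.example.com/\nhttp://www.example.com/page1.html"))
```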

Below is a simple diagram of how the program uses the data which has been entered on the form, so you can see how the logic works. I have excluded the Java code so it is a little easier to understand for non-technical people.

(click on image for a larger view)

 

 

Then the sitemap is complete! So it is just about displaying that nicely to the user.

 

How did you design the logo?

I am not a big fan of designing anything and I am very poor at doing so. My preferred method of developing logos and nice graphics is using Microsoft Word, combined with Paint.NET to achieve a few nicer effects if needed.

Why do I use these tools? Because the more advanced tools are way beyond my skill set and I don’t have the time or desire to try and master them. The basics serve my purposes for the time being, though that’s not to say I won’t learn in the future – just not in the near future.

 

Why didn’t you build in a website scraper?

A lot of other sitemap generator tools have built-in website scrapers and can identify all URLs on your website easily, although these are always limited by the number of URLs they can crawl. There are several reasons why I didn’t build a web scraper into the tool;

The first reason is that a tool which crawls the whole of a website is left open to abuse by people wanting to attack certain websites, by making the tool send thousands of requests towards a target site. This is more commonly known as a Denial of Service (DoS) attack. This volume of requests can bring websites to their knees or take them totally offline.

If I built a scraper function into the tool then it would be very simple for someone to enter “www.website.com” into the scraper tool, press ‘go’, and continue doing the same in endless tabs in their browser. The result would be thousands of requests going to www.website.com. There are always ways to get around this type of abuse, but this requires more time to build into the tool.

The second reason why I didn’t build a web scraper into the tool is because there are already really good tools out there that can do this for you, namely Xenu Link Sleuth. Why re-invent the wheel?

I primarily built this tool for myself as I work on a lot of different websites, so it makes my life simpler. I can quickly identify all URLs on a website using Xenu, so I didn’t need to re-build this; I can simply use a combination of tools to achieve the task, which works out quicker.

The third reason is that the server overhead of crawling a large number of URLs to scrape a website, then parsing all of the information to use in the sitemap, is considerable. Since there will likely be very little income from the tool (advertising makes pennies!), this would purely be a loss making exercise for me, and that doesn’t sound like much fun.

 

Why can’t you have different priorities for different URLs?

Because I didn’t build this in, as (in my opinion – I’m sure there will be people with other opinions on this!) there is very little value in changing this between 0.1, 0.4 or 1.0. The aim of the tool is to quickly build an XML sitemap from a list of URLs so you can tell search engines about content they may not already be aware of. If you want to quickly tell them about content, then why would you set a lower priority for it?

While it may be interesting to build into the tool a way to prioritise URLs based on their importance, there are no plans to do this in the near future. If you want to begin doing things like this then I suggest you build a custom XML sitemap generator which is more integrated into your content management system / database so that it can be continually upgraded.

 

 

How did you choose the font and colour scheme?

As you know already, I created the logo in Microsoft Word; you may notice the font from another post I did a while ago about the 200 signals in Google’s ranking algorithm (and yes, that image was also created in Word). Why the font? Because I like it. Simple as that.

Why the colour scheme? For the same reason, I like that basic green colour in Word for colouring sections of text in (I usually use this for ticking off items on a to-do list or similar) so it seemed like a nice choice and I think it works quite well.

How about the main navigation colour scheme? Well, I actually just pulled the whole navigation from another website I have developed, as I wanted to quickly create a navigation menu and there was little value in creating one from scratch. So this was more of a quick and dirty approach which achieves the aim of being a navigation menu.

 

How did you get the XML sitemap to look pretty?

If you view the sitemap for the actual website, http://www.simplesitemapgenerator.org/sitemap.xml, you can see the sitemap is styled nicely, as seen below;

 

 

Isn’t an XML sitemap supposed to look like a normal XML document though? Well, usually yes, but it is possible to style XML sitemaps so they look nice. This uses an XML stylesheet, which is achieved by adding a line of code to the top of the XML sitemap as follows;

 

<?xml-stylesheet type="text/xsl" href="http://www.simplesitemapgenerator.org/sitemap-stylesheet.xsl"?>

 

This line of code pulls in the stylesheet information from a separate stylesheet file, which makes the XML document look a little nicer. I will be doing another post about how to create these, as they are reasonably straightforward to implement and can make your XML sitemap a little more user friendly. They also have other SEO benefits, such as being able to easily ping all of the URLs to ensure they are working.
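As an illustration only (this is not the actual sitemap-stylesheet.xsl used on the site), a minimal stylesheet along these lines could render the sitemap’s urlset as a simple HTML table:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns:s="http://www.sitemaps.org/schemas/sitemap/0.9">
  <xsl:template match="/">
    <html>
      <body>
        <h1>XML Sitemap</h1>
        <table border="1">
          <tr><th>URL</th><th>Last Modified</th><th>Priority</th></tr>
          <!-- Loop over every <url> entry in the sitemap namespace -->
          <xsl:for-each select="s:urlset/s:url">
            <tr>
              <td><a href="{s:loc}"><xsl:value-of select="s:loc"/></a></td>
              <td><xsl:value-of select="s:lastmod"/></td>
              <td><xsl:value-of select="s:priority"/></td>
            </tr>
          </xsl:for-each>
        </table>
      </body>
    </html>
  </xsl:template>
</xsl:stylesheet>
```

The browser applies this transformation when it loads the sitemap, so search engines still see a plain XML document while humans see the table.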

 

Summary

So there is a bit of information about how I built SimpleSitemapGenerator.org in a weekend. Quite simple really; it was just about allowing basic data to be entered into a form, then parsing the results and outputting them in a nice format in line with the latest XML sitemap specifications.

I always encourage people to give something a go and try to solve a problem yourself, as it really isn’t that difficult. The added bonus is that it’s fun doing so too!

This tool has certainly made my life easier and will continue to do so. I hope it can be of some use to you as well. If you do find it useful then please share :-)

Excel Tips & Tricks for SEO

A quick reference manual for myself of the regular Excel things and useful formulas I use for various tasks. If you find it useful too, then please share around :-) It will be a growing list, added to as/when I find something useful that I use a lot but can never remember the exact way of doing!

 

How to Count the Number of Occurrences of Text in a Cell

Quick reference tip;

 

=SUM(LEN(<range>)-LEN(SUBSTITUTE(<range>,"text","")))/LEN("text")+1

=SUM(LEN(B2)-LEN(SUBSTITUTE(B2,"|","")))/LEN("|")+1 (example)

=SUM(LEN(B2)-LEN(SUBSTITUTE(B2,"separator","")))/LEN("separator")+1 (example)
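As a sanity check, here is the same calculation in Python using the pipe-separated example above: removing every separator and comparing lengths counts the separators, and the +1 turns that into an item count.

```python
# The Excel trick in Python: removing every separator and comparing lengths
# counts the separators; adding 1 turns that into the number of items.
cell = "apples|pears|oranges"
separators = (len(cell) - len(cell.replace("|", ""))) // len("|")  # 2
items = separators + 1  # 3, matching the Excel result
print(items)  # 3
```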

 

Read the full blog post about how to count the number of occurrences of text in a cell for a full explanation of how to use this.

 

How to Count the Number of Occurrences of Text in a Column in Excel

Quick reference tip;

 

=COUNTIF(<range>, "text")

=COUNTIF($A$2:$A$8, A2) (example)

=COUNTIF($A$2:$A$8, "Apples") (example)

 

Read the full blog post about How to Count the Number of Occurrences of Text in a Column in Excel for a full explanation of how to use this.

 

How to VLOOKUP Using Partial Matches in Excel

Quick reference tip;

 

=VLOOKUP(CONCATENATE("*", <lookup value>, "*"), <range>, 1, FALSE)

=VLOOKUP(CONCATENATE("*", A2, "*"), B2:B10, 1, FALSE) (example)

=VLOOKUP(CONCATENATE("*", "Jim", "*"), B2:B10, 1, FALSE) (example)

 

Read the full blog post about How to VLOOKUP Using Partial Match for how to use this.

 

How To Get The Domain Name From a URL in Excel

Quick reference tip;

 

=MID({CELL OF FULL URL}, FIND("//", {CELL OF FULL URL})+2, FIND("/", {CELL OF FULL URL}, 10)-8)

=MID(A1, FIND("//", A1)+2, FIND("/", A1, 10)-8)

=MID("http://www.michaelcropper.co.uk/2012/10/how-to-scrape-the-href-attribute-using-xpathonurl-seo-tools-1252.html", FIND("//", "http://www.michaelcropper.co.uk/2012/10/how-to-scrape-the-href-attribute-using-xpathonurl-seo-tools-1252.html")+2, FIND("/", "http://www.michaelcropper.co.uk/2012/10/how-to-scrape-the-href-attribute-using-xpathonurl-seo-tools-1252.html", 10)-8)
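For comparison, the same extraction in Python, where the standard library does the parsing for you (using the example URL above):

```python
# The same extraction in Python: urlparse pulls the domain (netloc) out of
# the URL without any character-position arithmetic.
from urllib.parse import urlparse

url = ("http://www.michaelcropper.co.uk/2012/10/"
       "how-to-scrape-the-href-attribute-using-xpathonurl-seo-tools-1252.html")
domain = urlparse(url).netloc
print(domain)  # www.michaelcropper.co.uk
```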

 

Read the full blog post on How To Get The Domain Name From a URL in Excel for detailed information about how to use this formula.