SEO Analysis: Buying a Website vs Starting a New One

Read below the video for further description, and make sure to leave a comment with your view or opinion in the comments section.


Each time I have an idea or start a new project, I ask myself whether I should buy an existing website or start a new one.

Because I’m mostly trying to be different and gain traction by creating new information, I will most likely always choose to start a new website (though that could change down the road).

But the question is interesting and hasn’t really been analysed from an SEO point of view.

So what are the SEO advantages and disadvantages to buying a website versus starting a new one?



It would seem like buying a website is the best option from an SEO point of view.

To an extent that is true. If you can make sure that the site you are buying hasn’t done anything black hat or used any sneaky SEO tricks, and it is in a niche that you will love writing content for, then buying a website makes a lot of sense from a purely search engine optimization point of view.

Other factors

While buying a website is the preferred option for SEO, you also have to consider whether you have the cash required to buy one and whether you are the type of person who could run a website they did not create.

Personally I like to have control, to be the originator of the idea, and to be able to mold that idea around my skills and interests. So I wouldn’t be the best person to go buying a website, because my personality wouldn’t suit it.

You have to ask yourself whether you need to be the originator and in full control, or whether you can slot into somebody else’s idea.

If you do have the cash and personality to buy a website then there is only one more step to ensure your acquisition is a success.

SEO track record

Unfortunately some unsavoury SEOs use black hat (sneaky) strategies to try to advance their website’s ranking. Sometimes this works in the short term, but it will never work in the long term. Buying a site that uses these techniques is a disaster waiting to happen.

There are a number of free and paid tools available to you to do a background check on the site.

Make sure you (or an expert) check whether the site breaks any of Google’s Webmaster Guidelines.

Here are some handy tools to help you out:



If you have the cash and the personality, and you find a site that suits your skills and interests and hasn’t infringed Google’s Webmaster Guidelines, then grab your opportunity now.

Spelling and Grammar in the Google Algorithm



Matt Cutts is one of my favourite SEO commentators and has been answering SEOs’ questions for a long time.

But a recent video I watched from Matt got my blood boiling, or at least bubbling.

Matt was asked whether spelling and grammar were part of the Google algorithm.


Basically Matt’s response was that having spelling and grammar in the algorithm made a lot of sense, because users wanted pages and websites that were well written and easy to read.

We aren’t talking about university level English here, just a factor that would relegate poorly written pages (potentially written by non-native English/[relevant language] speakers) that provide little or no value to the user.

No user wants these kinds of results, and implementing a factor that would get rid of them makes so much sense. Too much sense, apparently.

Matt answered the question first of all and said that he wasn’t aware of Google using these factors in their algorithm.

Essentially Google doesn’t use spelling and grammatical factors in their algorithm.

In fairness to Matt he did say it would be a factor worth looking into.

Google’s Solution

Matt gave a couple of reasons why Google doesn’t use these factors in their algorithm:

  • There are potential errors when creating an algorithm/factor that would measure spelling and grammatical mistakes, e.g. what if a page quoted French and the main page was in English?
  • Google believe that PageRank will do the same job as such factors, i.e. if a page has poor grammar or spelling, fewer reputable people will link to it and therefore it will automatically be relegated down the search results.

On the surface this looks like smart reasoning from some very smart people, but let’s look below the surface.

Rebuttal - Google’s Double Standard

First off, Google already have the technology in terms of spelling (they pride themselves on their spelling corrections in search) and language and dialect differentiation (Google Translate). So with a little fine-tuning the algorithm would have a low error rate.

Although they don’t have any major grammar corrector that I’m aware of, they do have thousands of the world’s top engineers who could create one.

Plus, users are probably only interested in a page having decent, readable grammar. We’re not asking Google to come up with a masterpiece, just a basic algorithm that identifies the grammatical errors that are obviously apparent to a reader.

Double Standard

The whole argument that PageRank would do a better job is totally flawed.

If Google had that much trust in PageRank, then their algorithm would simply find out what a page is about and whether it’s doing anything sneaky/black hat, and then rank in accordance with the quality of links.

But it doesn’t. Why not?

Because Google know that PageRank can make mistakes too. For example, what if the most reputable online English tutor linked to a poorly written page as an example of what not to do when writing a web page? That’s right, that page would probably benefit from the link (assuming the tutor didn’t use the nofollow tag).

That’s why Google combine PageRank with other factors that make a lot of sense to use:

PageSpeed Example: Google currently use a factor that demotes pages that load slowly and increases the ranking of pages that load quickly.

Here’s what Google have to say about PageSpeed: “we believe that making our websites load and display faster improves the user’s experience and helps them become more productive.” – Source

And another good one: “Speeding up websites is important — not just to site owners, but to all Internet users. Faster sites create happy users” – Source

Hey, doesn’t that sound similar? You could say that spelling and grammar “improve the user experience” and that “well written sites create happy users.”

Yet Google use PageSpeed as a factor and not spelling and grammar.

Google has shown that PageRank needs to be combined with other factors to get the best results.

Ignoring a factor such as spelling and grammar is just Google contradicting itself.

If I Were Head of Search…

If I were head of search at Google, I would:

  • Develop a set of new factors to determine the grammatical and spelling rating of a page.
  • Create a set of signals within the spelling and grammar algorithm to rate how accurate the algorithm is on a page-by-page basis. If there was a significant level of doubt, I would minimize the use of that factor on the page, or not use it at all and defer back to PageRank.
  • Create a notification method within Google Webmaster Tools that would let sites know which pages are being penalized and why, along with a reconsideration request so that sites that have cleaned up their pages can be re-evaluated.
  • And I would share this page on Facebook and Twitter, hint, hint.
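To make the confidence-gating idea concrete, here is a toy sketch of what such a factor might look like. Everything in it is hypothetical: the word list, thresholds, weights and function names are invented for illustration, and Google’s real signals are of course not public.

```python
# Toy sketch of a confidence-gated spelling factor.
# All names, thresholds and weights are invented for illustration.

ENGLISH_WORDS = {"the", "quick", "brown", "fox", "jumps", "over",
                 "a", "lazy", "dog", "page", "is", "well", "written"}

def spelling_factor(tokens, min_confidence=0.8):
    """Return a 0..1 spelling score, or None when the algorithm
    is unsure (e.g. the page may quote another language)."""
    if not tokens:
        return None
    known = sum(1 for t in tokens if t.lower() in ENGLISH_WORDS)
    confidence = known / len(tokens)
    # Too many unrecognised tokens: we can't tell misspellings
    # from foreign-language quotes, so decline to judge the page.
    if confidence < min_confidence:
        return None
    return confidence  # here: the share of correctly spelled words

def rank_score(pagerank, tokens):
    factor = spelling_factor(tokens)
    if factor is None:
        return pagerank                   # defer back to PageRank
    return 0.9 * pagerank + 0.1 * factor  # small blended weighting

blended = rank_score(0.5, ["the", "quick", "brown", "fox"])    # factor trusted
deferred = rank_score(0.5, ["le", "renard", "brun", "rapide"]) # PageRank only
```

The key design point is the `None` return: when the sketch can’t tell misspellings from, say, a French quotation on an English page, it simply declines to judge and the ranking falls back to PageRank alone.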

Let me know if you spot any spelling or grammatical errors in this article :)

Video: How the Google Algorithm and the Panda Update Have Failed Users



The recent claims by Hubpages, who were hit hard by the Panda update, that moving all their content to various subdomains has returned their traffic to near pre-Panda levels, suggest that there is a clear problem with the Google algorithm and that it has been failing users.

Hubpages lost over 50% of their search engine traffic in February after Google clamped down on the site, and a lot of other sites, that it claimed hosted low quality content.

After consulting with Matt Cutts, a Google fellow, Hubpages decided that giving each author their own subdomain, much as WordPress and Blogger do, would be the best way to split up the content and help Google distinguish what is high quality and what is crap.

Following the change Hubpages have reported that their traffic levels are back near pre-Panda levels.

It is already widely known that Google treat subdomains as individual sites, and the site-wide blanket factors Google apply don’t impact subdomains as much as if the content were hosted on a single domain.

I’m not worried about whether this is the best way to combat the Panda update; I know a lot of people agree and disagree with it as a solution. What I am worried about is the obvious and gaping hole that clearly still exists in the Google algorithm.

Google Lost Focus

In my opinion Google have lost focus on what makes a great algorithm and applied too much weight to algorithmic factors that impact Google’s opinion of a website as a whole.

Factors like the homepage’s PageRank, the number of indexed pages, the number of inbound links and social media influence on the site as a whole.

The Hubpages example shows that Google are currently ranking pages based on site-wide factors as opposed to on a page-by-page basis.

While this may in some cases lead to better quality search results (and Google extensively research the impact of changes to their algorithm on search quality before implementing them), in other cases it has led to spam and low quality search results ruling the web, and to the proliferation of content farms.

As Google strove to move away from on-page factors, because some are easily manipulated by the webmaster, they left themselves open to large sites with a big brand showing up in the search results where they shouldn’t, simply because they rank well on the site-wide factors.

Right Idea – Poor Execution

I understand why Google tried to reduce the weighting on on-page factors like having the keyword in the title tag, the first 90 words, the URL, the image alt tag, etc.

But as Google moved away from those on-page factors, they should have moved towards other on-page factors that are harder to manipulate.

Instead they focussed on a lot of site-wide factors that were harder, but not impossible, to manipulate.

The Panda update essentially sought to regain the balance, but instead of doing so by increasing the weighting of hard-to-manipulate on-page factors like Latent Dirichlet Allocation (complicated factors that are hard, or not worth the effort, to game), Google again leaned on site-wide signals.

The article I linked to above shows a large correlation between LDA and search engine ranking, so Google may already be using these factors, but clearly not enough; or maybe they don’t trust them to a level where they think it would be wise to increase their weighting.

A possible solution would be for Google to implement conditional factors that are only looked at when there is a large number of content farm/big brand sites for one query.

So in the case of these big-brand dominated queries, the weighting of LDA-type factors would be increased, most likely improving the quality of those search results, while the queries that the current algorithm already returns as high quality would be unchanged.

It’s sort of like having an algorithm analyse the work of the Google algorithm and make decisions from the findings.
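That two-pass idea can be sketched in a few lines. This is purely illustrative: the field names, weights and thresholds below are invented, and it shows only the shape of a conditional factor, not anything Google actually does.

```python
# Illustrative sketch of a "conditional factor": page-level relevance
# (LDA-style) gets more weight only on queries whose current results
# are dominated by big-brand / content-farm domains.
# All field names, weights and thresholds are invented.

def is_brand_heavy(results, threshold=0.5):
    """Second-pass check on the first algorithm's output: what share
    of the top results come from big-brand domains?"""
    brands = sum(1 for r in results if r["big_brand"])
    return brands / len(results) > threshold

def final_score(page, brand_heavy_query):
    sitewide = page["sitewide"]   # site-level reputation signals, 0..1
    relevance = page["lda"]       # page-level topical relevance, 0..1
    if brand_heavy_query:
        # Brand-dominated query: trust page-level relevance more.
        return 0.4 * sitewide + 0.6 * relevance
    return 0.7 * sitewide + 0.3 * relevance

top10 = [{"big_brand": True}, {"big_brand": True}, {"big_brand": False}]
reweight = is_brand_heavy(top10)  # True: 2 of 3 results are big brands
```

Queries the current algorithm already handles well never trigger the re-weighting branch, so their results are left untouched, which is the whole point of making the factor conditional.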

Of course there are many other possible solutions but that is a potential one.


The reason I decided to write this article is that we all (myself included) looked up to Google as the search engine powerhouse, but they may now be living on their original reputation as revolutionaries and spam killers when this is no longer the case.

They may still be the best search engine in terms of search quality, but with moves like the Panda update, and the state of the algorithm before that, I suspect that Google could be overtaken if a new search engine with new ideas starts up.

While I would love to see somebody like Blekko do well and maybe even implement solutions like I have suggested, with their current attitude towards search and their current results I can’t see them overtaking Google any time soon.

Still I would love to be proven wrong and I would love for a small search engine with a point to prove to take on Google with these types of queries with a new and creative solution and put some heat on the search giant.


Video: How Many Factors Are There in a Search Engine Algorithm?



After Google and Bing traded blows over the number of factors their respective algorithms were composed of, there was considerable confusion in the SEO industry about how many factors the algorithms really had.

Tit for Tat

Google originally said that their algorithm had around 200 main factors.

In response, Bing, probably looking to get one up on their competition, decided to announce that they had a couple of thousand factors in their algorithm.

The confusion increased as Google explained that each of their 200 main factors had around 50 sub-factors, and as a result the number of factors, parameters, elements or whatever you are meant to call them is somewhere around 10,000.

Bing, not wanting to be outshone, also explained that it all depended on what you counted as a factor and an element.

Both search engines came out of the war of words looking more than a little sinister and backhanded, but the common SEO came out hungry for reliable information.

My personal hunger for this information and finding the truth behind the search engine algorithms has given rise to TheOpenAlgorithm project, which will hopefully result in a more transparent, open search industry and help a lot of webmasters. As well as being a lot of fun for myself, of course.

So, how many factors are there, really?

After all of that, the webmasters of the world were still left wondering what that reliable information and number would be.

I can only speculate and the algorithms are always changing, introducing new factors. But my personal opinion would be that both Google and Bing use around 10,000 factors in their algorithm.

I suspect that Google, being a more developed and older search engine, would have somewhat more factors in their algorithm than Bing, but again that’s only speculation.

I’m using 10,000 as the goal for my project as I try to uncover as many factors as possible.


Unless you’re running projects or research, or are looking for a number to use in an article you are writing, you shouldn’t be worrying about the number of factors in the search engine algorithms.

The actual number doesn’t impact you and focussing on it will only distract you from the goal, creating a great site for users.

I’m not naive, and we all know SEO is crucial to your site’s success, but don’t over-focus on it. Stay tuned here via Twitter, Facebook or RSS and you will have all the necessary information for creating a search engine and user friendly website.

If you liked this video you may want to check out my other videos and please leave a comment with any questions or opinions you may have.

Video: The Foundation for Great SEO – A Great Content Strategy


Everybody knows content is important, but most people just don’t get it when it comes to their website. They either forget that great content is necessary, or think any content is worthy of search engine traffic.

Most of the time I see this with ecommerce or service-based, static websites where the end goal is to get you to buy something. But I have also seen it when bloggers focus too much on stats like traffic and ad revenue.

As a result these webmasters lose focus and write content that is mundane and stale. Stuff like “What is SEO?” and “What is YouTube?”. Of course there’s a need for this content, but you would have to be kidding yourself to believe that this need hasn’t already been fulfilled.

The foundation for great SEO and getting a lot of search engine traffic is to have great user focussed content.

To be constantly creating this fantastic content you are going to need to have a content strategy in place.

Creating a Great Content Strategy

Using this simple self-assessment you can accurately determine whether you are implementing a great content strategy or whether you’re a webmaster who has lost their way.

Ask yourself: “Are you writing content 100%, totally for the user, and not thinking of the search engine at all when you write your first draft?”

If you can honestly say yes then you are fine and all you need to do is keep going and creating more unique content.

If you are thinking of the search engine and of keywords on your first draft, then you’re probably writing crap content.

(P.S. I’m not saying you should never think about the search engine. Your final draft/editorial check should make sure you are using the right keywords in the right places, and all that other SEO stuff I like to talk about, but your first draft is all about the user.)

If you answered no to that question then run this idea creation process through your mind.

What content is my target market missing, what do they really want that they don’t already have?

(That could be video news, tutorial articles, webinars, a plain old tips blog, it could be anything and you know your target market and your industry best so you should be able to answer this question with a bit of thought.)

How am I going to fulfil what they want?

This part’s the content strategy: you should be able to write down what it is you’re working towards (be that a regularly updated blog, a daily news update, etc.). If you ask yourself every day, “Am I moving towards this goal?”, and you are, then you are bound to get a lot of search engine traffic.


Really, what this is about is a mental refresh of where you are and where you need to be. A lot of websites lose their way, or are off track from the beginning. The foundation of any site is great user-focussed content, and you need to create it regularly.

There is a lot more to SEO than just the content but if you stay tuned to TheOpenAlgorithm (I highly recommend our email newsletter) and my latest findings and blog posts you will find that peripheral work a breeze.

While all the other SEO peripherals are important, it’s more important to create a great foundation for your SEO house than to have a great design and no ground to put it on.

Video: Guest Blogging Advanced Strategy – Why You Need to Brag More




I’m a huge fan of guest blogging. I believe it’s a great way to build your personal reputation and your site’s reputation with the search engines, increase your traffic, and improve your long term SEO success.

Guest blogging helped me launch my first ever product (Link Building Mastery) which I went on to sell the rights to a new owner.

I have even launched a website and associated product totally to help teach people how to do guest blogging right.

My story

When I launched that first product I had a low traffic website with a PageRank of 0 and no email or social media list.

I decided to market the product through all the usual methods: I dabbled in Google AdWords, posted in forums and tried to recruit affiliates, but none of them worked as well as guest blogging. Since then I have been a total convert and devoted guest blogger.

It wasn’t all peaches and roses: my first five guest posts sent me 18 visitors, and it wasn’t because of the sites I was blogging on (one of the posts was on SEOMoz, one of the biggest blogs in my niche).

I decided to try one more post with some slight changes to how I wrote it.

The main change was that instead of writing tutorials or commenting on news, I would write about my own experiences and share my learnings, share my story.

In this case I shared the story of how I developed a product, gave an excerpt from the product and linked to it many times in the article.

It worked. That post sent me nearly double what my first five posts had sent combined: 31 visitors in the first 20 days, and hundreds more before I sold the website in July 2011.

Since then I have written a lot of guest posts almost all of them drawing from my personal experience. All of them have sent me a lot of traffic, some sending me hundreds in the first day after publishing.

I wrote about this strategy in another guest post over at the 2CreateaWebsite blog, which ironically sent me over 300 visitors within the week after writing the post.

The Learning

The major learning for me was that to make a connection with your reader and compel them to click on the link to your blog in your author bio, you need to share a story, to show your personal value beyond just being a good source of information.

You need each and every reader to believe you are an authority in your field before they finish reading the post.

If you can accomplish that through sharing your experience you too can double the traffic you get from your next guest post.


Of course there is a lot more to successful guest blogging but if you implement this advanced strategy you certainly will see the results.

On my website GuestBlogging101 I share more advanced strategies from my personal experience, that will help you quadruple not double the traffic you get from your next guest post.

If you liked this video, check out my other videos.

  • Didn’t like this video.
  • Used guest blogging but it didn’t work out.
  • Want to share your own guest blogging story or advanced strategy.
  • Used this strategy with success.
  • Got a question.

Leave a comment below, I try and respond to every comment submitted.


Video: How to Get Interviews with Industry Leaders and Use Them on your Site



I first came across interviewing people as a strategy to build traffic and content from David Tiefenthaler, when I read a post he did for the 2CreateaWebsite blog.

David had quadrupled his traffic, so I decided to give interviewing a go.

Other than guest blogging, interviewing experts would have to be the strategy that I have used the most over all my sites and sites that I help manage.

My first foray into interviews was with my former website, where I had an interviews section (hopefully the new owner will keep that section and link alive; if not, let me know in the comments section).

My second was with my eBook Link Building Mastery, for which I interviewed 15 incredible people, including some major names like Glen Allsopp, Marko Saric, Yaro Starak, Andrew Warner, Tamar Weinburg and even David (the same David who led me to take an interest in interviewing people).

And since then I have used interviews on pretty much every site I have helped, owned or consulted on.

I have learnt a lot about interviewing people: how to get interviews with industry leaders, A-list bloggers and very busy people.

I have conducted over 45 interviews (edit: now over 70) and plan to do a lot more (including some for TheOpenAlgorithm).

In this video I discuss how to get interviews with experts, how to publish them and get the most from them from a traffic point of view.

The Process

The basic process behind securing and publishing an interview is:

Identify a target – Email them asking for an interview – Conduct the interview – Publish – Share via email and social media – Make sure the interviewee shares with their social media followers – Rinse & Repeat.

There’s a lot more to it and conducting interviews is an art in itself. There are a lot of advanced strategies and things you can do to get the most out of your interviews but that’s the basic process.

I take that basic process into a little more depth in the video so I recommend you watch it. I’m still skimming the surface but the best way to learn is to go out and do it for yourself, so after you have watched the video go and try it out.

Try interviewing somebody small first (a friend or colleague) and build yourself up to the big names as you get used to the process.

If you have any questions or would like some more in-depth information on the process make sure to leave a comment below.


Interviews are a great way to build connections, traffic and high quality content. The process behind interviews is very important to get right, and it takes 4 or 5 interviews before you feel comfortable interviewing an industry leader.

You can conduct your interviews over the phone, via Skype, by chat (Gmail, MSN, Skype chat) or by email; it doesn’t really matter. The crucial thing is to record the interview if it’s audio or video and provide a transcription below the audio file or video.

The most important thing about interviewing people, is to interview people so don’t worry too much about the theory just go and do it.

If you liked this video, check out my other videos.

  • Didn’t like this video.
  • Used interviews on your site but it didn’t work out or got rejected.
  • Want to share your own interviewing success story.
  • Think I left something important out.
  • Got a question.

Leave a comment below, I try and respond to every comment submitted.


What Mark Zuckerberg Can Teach You About Blogging


I was thinking of starting TheOpenAlgorithm as a commercial venture in February 2011. The plan was to create a great project that would attract people by giving away information on factors and potential factors, and to make money from it through a membership area or by selling training books or video courses.

Then I watched The Social Network.

In it Mark Zuckerberg, billionaire and current CEO of one of the biggest companies in the world, refuses to put advertisements on Facebook.

I abandoned my monetization plan after reflecting on the movie and the Facebook story. This proved a valuable blogging lesson for me that I hope to share with you.

How Mark Zuckerberg Can Make You A Better Blogger


Obviously Zuck is a smart guy and a rich guy, so why would such a smart and rich teenager turn down the opportunity to make money from a project that he had literally worked on 24/7 for over a year?

For two key reasons, that will help you become a better blogger.

The Business Model

Mark knew his business model better than anyone else. His argument was that Facebook had to “be cool” before they put advertisements on the site.

But how cool did it need to be? It already had thousands, if not millions, of members.

What Mark Zuckerberg recognized better than anybody else was that Facebook could only succeed if it had millions of active users. Until most of the civilized world was on Facebook, it was replaceable, just like MySpace and Bebo.

To aid world domination Mark felt it necessary to turn off ads, because ads turn off users, and he was right to do it.

Blogging has the same business model as Facebook: gain thousands of loyal and active fans (members), who comment, click, share and buy regularly.

Blogs fail without members.

When a blog reaches a certain number of members/loyal fans, it becomes too big to fail, unless they do something really stupid.

There are many measurements of when a blog reaches this point. But for me there are a few key indicators:

  • Steady and growing traffic.
  • Consistent numbers of blog comments and traffic per post.
  • Consistently showing up for the keywords you want to rank for.
  • Having every post amplified by members via social networks immediately after publication.
  • Number of email/RSS subscribers and social media followers as compared with other blogs in your niche.

There are a few vital characteristics of blogs and bloggers who reach this point.

Product Oriented: Forget About Money


Zuck was adamant that Facebook must be perfect for users.

When you build a website focussed on the user and not the money, the money will come.

When you build a website focussed on the money, it’s unlikely the user will come.

As a blogger your product is your blog as a whole.

From my point of view there are two parts to a blog, the information it provides and the platform it is presented on.

The information is essentially the content: how it’s written, is it user friendly, is it easy to understand, is it broken into small paragraphs, are there grammatical mistakes, are there images and videos, etc.?

The platform is the blog’s design: is it social, is the text easy to read, do you have a mobile theme, how long does it take to load, etc.?

Being product oriented is the most crucial part of blogging. With 133 million blogs out there to compete with, you will need valuable, well written and well presented content to have a chance of competing, let alone dominating.

Like Zuck you need to be obsessed with making your blog perfect for users.

Gain Market Share




Just like in the movie, it is not only crucial that you have a great product; it is even more important to market it.

And as documented in The Social Network, Facebook grew gradually from college to college, gaining market share of each college’s social networking time.

That’s right, even Facebook, with its astonishing growth, grew one college at a time.

Look at Groupon, the fastest growing company in the world: they have grown one city at a time.

Of course all the usual blog marketing techniques apply, blog commenting, forum posting, guest posting, etc.

But I think you should look at marketing your blog differently.

Go from one competitor’s blog to the next (and remember, every blog is a competitor for a user’s online time, but that doesn’t mean you can’t live in harmony with them), dominating them one step at a time.

At each blog/step, comment profusely, guest post, link, like, share and mention the blog on Twitter until you have won over as many of that blog’s users as possible to become members of your blog.

Then move onto the next blog, forum or group and the next, and the next and so on.

Take an aggressive marketing stance with your blog and you will win over new members.

Don’t feel bad about being so cut-throat; if that blog or forum was any good then the users will remain members of both.


My argument would clearly be to forget about making money from your blog until you reach the goals set out in this post or your own personal goals for the blog.

Have a viable monetization model in mind and on paper before starting your blog so that when you reach these goals you are able to make money from your blog.

It’s crucial to have this done, otherwise you may walk into a niche where people simply don’t spend and you don’t earn.

You don’t have to stick to the monetization plan (in fact you probably won’t), but make sure you have one, or your blog may fail just when it has touched success.

Be focussed on your product, i.e. the information and the platform you present it on to users. Grow gradually, eating up market share and stealing users from other blogs, and you will have set your blog up for Facebook-style (though probably not Facebook-sized) success.

10 Non Techie Reasons Why I Learned Python And You Should Too


After coming up with a mental template for my project, I knew there would be a need to automate it in a way I didn’t know how.

I contemplated raising funding (most likely through Kickstarter) and hiring a programmer or taking on a partner who could program, but that would make this a business and not a project.

For those of you who don’t know, I want to keep TheOpenAlgorithm as a project free of monetization for as long as possible, but I do at some point have plans to make money from it.

That’s why I decided that I needed to learn to program. Knowing this I set about researching the various programming languages.

That’s what I am going to share with you today: 10 non-technical, simple-to-understand reasons why I picked Python and why I believe you should too.

I knew more than the average Joe about programming, but I was still totally out of my depth when I read articles talking about functions, LISt Processing and other Pythony things; in fact I am still confused when I read these articles.


That’s because these articles are written by Python experts who forget what it was like when they were new to programming, or they are written for experienced programmers who are thinking of learning a second language.

I simply couldn’t find a decent guide that would weigh up or promote one language over another in a way I could understand.

But I was lucky: I asked my contacts who knew how to program what language to learn and why.

10 Reasons you should learn Python

Disclaimer: I thought I would write this guide now, while I am still in the process of learning Python, so that I simply can’t confuse you: I’m nowhere near a Python expert yet. When I am an expert I’m sure I will write a more techie guide to Python’s features, but for now this is beginner’s essential knowledge.

  • It’s free: I’ve personally never heard a better reason to do anything. Python is totally free to download, use and play with, because a bunch of crazy volunteers devote their time to improving the language (much like Wikipedia).


  • It’s really easy to learn: Not only have I been told it’s a simple language to learn, I have experienced it first hand. Despite not yet being an expert, I have seen how fast my progress has been. I attribute this to the way the language was designed: the commands (that’s the code you write) are mostly in normal English, so if you want to tell the computer to write something you type print “something” and run the program.

This makes it easy to remember commands and also makes it easy to understand what you are doing. Apparently other languages don’t act like this, and you have to remember nonsensical abbreviations.
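To illustrate (a trivial sketch; the names here are made up), a few lines of Python read almost like plain English:

```python
# A tiny example of how Python commands read like plain English.
# (Written in the print() function style, which also works in Python 3.)
name = "TheOpenAlgorithm"
print("Hello from " + name)

# Doubling some numbers is just as readable.
for number in [1, 2, 3]:
    print(number * 2)
```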

  • Free resources: Those crazy volunteers and Python members took their generosity to the next level when they created a great beginners’ guide to Python. Couple that with some great YouTube tutorials and you have yourself a language that is not only free, but also free to learn. Great, huh?


  • Paid resources: Unfortunately not everything about Python is free; you might have to shell out $20 to buy a book, or pay for the petrol to get to your local library, because I would recommend learning Python fully from a paid resource.

Because of all the free resources, the paid ones have to be really good to sell, and they tend to have a better structure to them. In the course of learning to program I have bought two books, the best for beginners being Hello World! Computer Programming For Kids and Other Beginners.

Don’t be put off that the book is aimed at kids, that just means it is easy to use and contains simple language anyone could understand.

But if you’re afraid to have a book with “Kids” in the title on your bookshelf, then the other book I bought was Python 3 for Absolute Beginners. I found it less useful, as it was more theory-based than the other book (which contained a lot of exercises), but it would probably be quite handy for somebody with some basic programming experience.

  • Google use it: In fact Python is one of Google’s preferred languages: they are always looking to hire experts in it, and they have built many of their popular products with it (much of Google’s spidering and back-end search capability was written in Python).

So I guess if you’re looking for a job with Google, Python’s a great place to start.

When you have the Google stamp of approval, you know you’re onto a winner.

  • It’s versatile: Ok, I promised no techie stuff so I’ll keep it simple. Python can be used for small, large, online and offline projects. It’s versatile, get it?


  • It’s quick: Some languages take an age to program in; not Python. Remember, it was created with the programmer in mind, and that means it is simple and quick to write code in.


  • Up to date: Because of Python’s volunteers, and the fact that it’s an open source language, there are always people trying to improve it. That means new versions of the language are regularly released, keeping it fresh and up to date with current trends, and making it a more powerful language that is less likely to fade away into obscurity.


  • Fast (not just easy) to learn: A Google employee who turned me onto Python said I could become “reasonably proficient in it in less than two months”. You wouldn’t say that about learning French.

If you have a brain suited to programming, i.e. you like computers, aren’t afraid of simple maths equations and are a problem solver, then you should be able to learn your new skill quickly, which is a real bonus.

  • Great community: Ever have a problem you can’t figure out, or a link you can’t find? Just ask one of the thousands of Python community members who are more than willing to help out. You will find them on forums, Twitter, Facebook, Q&A sites, pretty much everywhere.

Not many languages have as open and helpful a community, which makes it a lot less frustrating when you are stuck or can’t find a bug in your code.

In Ireland, where I live, there’s even a Python group that meets up every month, so I know if I ever need help or an expert’s view I can always go along to a meet-up.

I hope that helps you decide which programming language you want to learn. On a personal note, I have found Python to be easy, fun and simple to learn. Although I haven’t really done anything with it yet, the people who I have talked to about TheOpenAlgorithm seem to think Python is ideal and completely capable of doing the job.

I would highly recommend learning Python, and if not Python then some other programming language. It’s well worth the effort: the sense of accomplishment from creating a program is great, and even if you don’t create the new Facebook you will at least better understand how your computer, iPod and smartphone work.

Stay tuned to all my posts, both on Python and on the interesting things I will be doing with Python by following us on Twitter, subscribing to the RSS feed, connecting with me on LinkedIn, liking us on Facebook, viewing us on YouTube or subscribing to our posts via email.

Young Scientist 2011


For those of you who don’t know already, the idea for this blog is a result of the BT Young Scientist and Technology Exhibition 2011 (or BTYSTE for short).

I did a project for the prestigious fair, the largest of its kind, which has been running for 47 years in Dublin, Ireland.

BT Young Scientist and Technology Exhibition Logo

The project entitled “Investigating the factors of a search engine algorithm” was relatively successful and proved an excellent starting ground for the now larger scale project.

And as you might have guessed, I tried to find as many as possible of the 200 main factors the search engines use to determine where a site ranks when you search for something on Google, Yahoo, Bing, Ask, etc.

I used a testing method that has since been called “reverse engineering the Google algorithm”, despite the fact that I didn’t know what reverse engineering was before coming up with it. To me it just seemed like the logical way to try to prove certain factors and come up with new ones.

I came up with a list of 157 things that definitely are, probably are or might be factors, and I tried to test whether each one was a factor and, in the process, find its weighting in the algorithms too.

The way I went about that was to take an individual factor, let’s say Page Speed (how quickly a page loads), and compare the load speed of every page in the top 5 results of a Google search against that of the 25th–30th results, over 30 searches per factor.

The results I got back were an average of how quickly a page loads among high-ranking sites (the ones in the top 5) and an average Page Speed for the sites that didn’t rank so well, the low-ranking sites in the 25–30 band.

I would compare these averages. If there was a difference between the high-ranking and low-ranking sites, then that factor was impacting how pages rank in the search engines, so it could be confirmed with relatively high certainty as a factor. And by comparing the size of the difference with the differences found for other factors, you could estimate the factor’s weighting.

The greater the difference the more impact it was having and therefore the greater the weighting.
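As a rough sketch of that comparison (the load times below are made up for illustration; the real measurements were collected manually), the calculation for a single factor might look like this in Python:

```python
# Sketch of the comparison for one factor (Page Speed), using made-up numbers.
# In the real test there would be one set of measurements per search,
# across 30 test searches.
top_5_load_times = [1.2, 0.9, 1.4, 1.1, 1.0]       # seconds, high-ranking pages
rank_25_30_load_times = [2.8, 3.1, 2.5, 3.4, 2.9]  # seconds, low-ranking pages

def average(values):
    return sum(values) / len(values)

high_avg = average(top_5_load_times)
low_avg = average(rank_25_30_load_times)

# A gap between the two averages suggests the factor matters; the size of
# the gap, relative to the gaps for other factors, hints at its weighting.
difference = low_avg - high_avg
print("High-ranking average: %.2fs" % high_avg)
print("Low-ranking average:  %.2fs" % low_avg)
print("Difference:           %.2fs" % difference)
```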

Follow all that?

If you did, that’s great; if not, you’re probably in the majority, so I’ll summarize. I created a testing method that could be used to confirm a search engine algorithmic factor, test a new one, return a result as to whether it was a factor, and then tell you how important that factor is in the algorithm.

Of course there were a number of problems with this testing method.

  • The project was done in the space of two months, with time also spent preparing presentations, report books and a project diary, as well as preparing to spend a week in a hall with 500 other projects, presenting my idea to the general public. That meant I didn’t have as much time to focus on the testing as I would have liked. As a result I only tested 20 factors using this method, and came up with the list of 157 factors based on intuition and other lists available.
  • I don’t know a programming language capable of doing the testing automatically, so I did the testing manually. I am currently learning Python, which will allow me to do this in the future. As it was, I only tested these 20 factors over 300 web pages, which simply isn’t a large enough sample size.
  • The test was simply not scientific enough: I used no proven formula and took only a small sample size.


Despite all these problems, the week at BTYSTE was interesting and inspiring, with people even paying for me to email them the list of factors. Essentially, people found my half-hearted effort at a project interesting, so I figured I would develop it further.

In the 3 days I spent presenting the project to the public, I talked to a number of SEOs and webmasters, and I forged relationships with a lot of important people who will be able to help me as I continue to better the project.

Following the interest at BTYSTE I have been offered a number of jobs, and a Google employee and a high-level programmer have given me advice on how to improve the project.

As a result I have come up with a fully developed system for testing search engine algorithmic factors as accurately as possible.

All in all the BTYSTE was a great week, I enjoyed talking face to face with fellow SEOs and gained a lot of contacts and experience. It also has driven me on to continue with the project, bettering it and maybe even entering next year.