Category Archives: Videos

SEO Analysis: Buying a Website vs Starting a New One

 
Read below the video for further description, and make sure to leave a comment with your view or opinion in the comments section.


 

Each time I have an idea or start a new project, I find myself asking whether I should buy a website or start a new one.

For the most part, because I’m trying to be different and gain traction by creating new information, I will most likely always choose to start a new website (but that could change down the road).

But the question is interesting and hasn’t really been analysed from an SEO point of view.

So what are the SEO advantages and disadvantages to buying a website versus starting a new one?

Comparison


It would seem like buying a website is the best option from an SEO point of view.

To an extent that is true. If you can make sure that the site you are buying hasn’t done anything black hat or used any sneaky SEO tricks, and it is in a niche that you will love writing content for, then buying a website makes a lot of sense from a purely search engine optimization point of view.

Other factors

While buying a website is the preferred option for SEO, you also have to consider whether you have the cash required to buy one and whether you are the type of person who could run a website that you did not create.

Personally I like to have control and also to have been the originator of the idea and then be able to mold that idea around my skills and interests. So I wouldn’t be the best person to go buying a website because my personality wouldn’t suit it.

You have to ask yourself whether you need to be the originator and in full control, or whether you can slot into somebody else’s idea.

If you do have the cash and personality to buy a website then there is only one more step to ensure your acquisition is a success.

SEO track record

Unfortunately, some unsavoury SEOs use black hat (sneaky) strategies to try to advance their website’s rankings. Sometimes this works in the short term, but it will never work in the long term. Buying a site that uses these techniques is a disaster waiting to happen.

There are a number of free and paid tools available to you to do a background check on the site.

Make sure you (or an expert) check whether the site breaks any of Google’s Webmaster Guidelines.

Here are some handy tools to help you out:

 

Summary

If you have the cash and the personality, and you find a site that suits your skills and interests and hasn’t infringed Google’s Webmaster Guidelines, then grab your opportunity now.

Spelling and Grammar in the Google Algorithm

 
Read below the video for further description, and make sure to leave a comment with your view or opinion in the comments section.

 

Matt Cutts is one of my favourite SEO commentators and has been answering SEOs’ questions for a long time.

But a recent video I watched from Matt got my blood boiling, or at least bubbling.

Matt was asked whether spelling and grammar were part of the Google algorithm.

Response

Basically Matt’s response was that having spelling and grammar in the algorithm made a lot of sense, because users wanted pages and websites that were well written and easy to read.

We aren’t talking about university level English here, just a factor that would relegate poorly written pages (potentially written by non-native English/[relevant language] speakers) that provide little or no value to the user.

No user wants these kinds of results, and implementing a factor that would get rid of them makes so much sense. Too much sense, apparently.

Matt answered the question first of all, saying that he wasn’t aware that Google used these factors in their algorithm.

Essentially Google doesn’t use spelling and grammatical factors in their algorithm.

In fairness to Matt he did say it would be a factor worth looking into.

Google’s Solution

Matt gave a couple of reasons why Google doesn’t use these factors in their algorithm:

  • There are potential errors when creating an algorithm/factor that would measure spelling and grammatical mistakes, e.g. what if a page quoted French but the main page was in English?
  • Google believe that PageRank will do the same job as such factors, i.e. if a page has poor grammar or spelling, fewer reputable people will link to it and therefore it will automatically be relegated down the search results.

On the surface this looks like smart reasoning from some very smart people, but let’s look below the surface.

Rebuttal - Google’s Double Standard

First off, Google already have the technology in terms of spelling (they pride themselves on their spelling corrections in search) and language and dialect differentiation (Google Translate). So with a little fine-tuning, the algorithm would have a low error rate.

Although they don’t have any major grammar corrector that I’m aware of, they do have thousands of the top engineers in the world to create one.

Plus, as users are probably only interested in making sure the page has decent, readable grammar, we’re not asking Google to produce a masterpiece; just a basic algorithm that identifies the grammatical and spelling errors that are obviously apparent to a reader going through the page.
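
To make that concrete, here’s a minimal sketch (in Python, purely for illustration, and in no way Google’s actual approach) of the sort of basic check I’m describing: estimate what fraction of a page’s words look misspelled and only flag pages where the rate is obviously high. The word-list file and the 15% threshold are my own assumptions.

```python
# Minimal sketch of a "basic" spelling check: flag a page only when the
# rate of unrecognised words is obviously high. The dictionary file and
# threshold are illustrative assumptions, not anything Google has confirmed.
import re

def load_dictionary(path="words.txt"):  # hypothetical word-list file
    with open(path, encoding="utf-8") as f:
        return {line.strip().lower() for line in f if line.strip()}

def misspelling_rate(page_text, dictionary):
    words = re.findall(r"[a-z']+", page_text.lower())
    if not words:
        return 0.0
    unknown = sum(1 for word in words if word not in dictionary)
    return unknown / len(words)

def looks_poorly_written(page_text, dictionary, threshold=0.15):
    # Only demote pages whose errors would be obvious to a reader,
    # mirroring the standard argued for above.
    return misspelling_rate(page_text, dictionary) > threshold
```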

Double Standard

The whole argument that PageRank would do a better job is totally flawed.

If Google had that much trust in PageRank, then their algorithm would simply find out what a page is about, check whether it’s doing anything sneaky/black hat, and then rank it according to the quality of its links.

But it doesn’t. Why not?

Because Google know that PageRank can make mistakes too. For example, what if the most reputable online English tutor linked to a poorly written page as an example of what not to do when writing a web page? That’s right, that page would probably benefit from that link (assuming the tutor didn’t use the nofollow attribute).

That’s why Google combine PageRank with other factors that make a lot of sense to use:

PageSpeed Example: Google currently use a factor that demotes pages that load slowly and increases the ranking of pages that load quickly.

Here’s what Google have to say about PageSpeed: “we believe that making our websites load and display faster improves the user’s experience and helps them become more productive.” – Source

And another good one: “Speeding up websites is important — not just to site owners, but to all Internet users. Faster sites create happy users” – Source

Hey, doesn’t that sound similar? You could say that spelling and grammar “improve the user’s experience” and that “well written sites create happy users.”

Yet Google use PageSpeed as a factor and not spelling and grammar.

Google have shown that PageRank needs to be combined with other factors to get the best results.

When a factor such as spelling and grammar is ignored it’s just Google contradicting itself.

If I Were Head of Search…

If I were head of search at Google I would:

  • Develop a set of new factors to determine the grammatical and spelling rating of a page.
  • Create a set of signals within the spelling and grammar algorithm to rate how confident it is on a page-by-page basis; if there were a significant level of doubt, I would minimize the use of that factor on the page or not use it at all and defer back to PageRank (a rough sketch of this blending logic follows this list).
  • Create a notification method within Google Webmaster Tools that would let sites know which pages are being penalized and why, and of course a reconsideration request process so that sites that have cleaned up their pages can be re-evaluated.
  • And I would share this page on Facebook and Twitter, hint, hint.
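
As promised above, here is a rough, purely hypothetical sketch of that confidence-gated blending: the grammar/spelling score only influences a page’s ranking when the algorithm is confident in its own judgement of that page, and otherwise the page is ranked on link-based signals alone. The weights and threshold are invented for illustration.

```python
# Hypothetical confidence-gated blending of a grammar/spelling factor with
# link-based (PageRank-style) signals. All weights and thresholds are made up.

def page_score(link_score, grammar_score, grammar_confidence,
               min_confidence=0.8, grammar_weight=0.1):
    """All inputs are assumed to be normalised to the 0..1 range."""
    if grammar_confidence < min_confidence:
        # Significant doubt about the grammar judgement: ignore the factor
        # entirely and defer back to PageRank-style link signals.
        return link_score
    # Confident judgement: let poor writing drag the score down a little.
    return (1 - grammar_weight) * link_score + grammar_weight * grammar_score
```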

Let me know if you spot any spelling or grammatical errors in this article :)

Video: How the Google Algorithm and the Panda Update Have Failed Users

 
Read below the video for further description, and make sure to leave a comment with your view or opinion in the comments section.

 

The recent claims by Hubpages, who had been hit hard by the Panda update, that moving all their content to various subdomains has returned their traffic levels to near pre-Panda levels, suggest that there is a clear problem with the Google algorithm and that it has been failing users.

Hubpages lost over 50% of their search engine traffic after Google clamped down on the site, and a lot of other sites that it claimed hosted low quality content, in February.

After consulting with Matt Cutts, a Google fellow, Hubpages decided that giving each author their own subdomain, much like WordPress and Blogger do, would be the best way to split up the content and help Google distinguish what is high quality and what is crap.

Following the change Hubpages have reported that their traffic levels are back near pre-Panda levels.

It is already widely known that Google treat subdomains as individual sites, so the site-wide blanket factors Google apply don’t impact subdomains as much as they would if the content were hosted on a single domain.

I’m not worried about whether this is the best way to combat the Panda update; I know a lot of people agree and disagree with it as a solution. What I am worried about is the obvious and gaping hole in the Google algorithm that clearly still exists.

Google Lost Focus

In my opinion Google have lost focus on what makes a great algorithm and applied too much weight to algorithmic factors that impact Google’s opinion of a website as a whole.

Factors like the homepage’s PageRank, number of indexed pages, number of inbound links and social media influence on the site as a whole.

The Hubpages example shows that Google are currently ranking pages based on site-wide factors as opposed to on a page-by-page basis.

While this may in some cases lead to better quality search results (and Google extensively research the impact of changes to their algorithm on search quality before implementing them), in other cases it has led to spam and low quality results ruling the web and the proliferation of content farms.

As Google strove to move away from on-page factors, because some are easily manipulated by the webmaster, they left themselves open to large sites with a big brand showing up in the search results where they shouldn’t, simply because they rank well on the site-wide factors.

Right Idea – Poor Execution

I understand why Google tried to reduce the weighting on on-page factors like having the keyword in the title tag, first 90 words, URL, image alt tag, etc.

But as Google moved away from those on-page factors, they should have moved towards other on-page factors that were harder to manipulate.

Instead they focussed on a lot of site-wide factors that were harder but not impossible to manipulate.

The Panda update essentially sought to regain the balance, but instead of regaining it by increasing the weighting of hard-to-manipulate on-page factors, like Latent Dirichlet Allocation (LDA) and other complicated factors that are hard, or not worth the effort, to game, it leaned even further on site-wide judgements of quality.

The article I linked to above shows a large correlation between LDA scores and search engine rankings, so Google may already be using these factors, but clearly not enough; or maybe they don’t trust them enough to think it would be wise to increase their weighting.

A possible solution would be for Google to implement conditional factors that are only looked at when there is a large number of content farm/big brand sites for one query.

So in the case of these big-brand-dominated queries, the weighting of LDA-type factors would be increased, most likely improving the quality of those search results, while the queries that the current algorithm already returns high quality results for would be unchanged.

It’s sort of like having an algorithm analyse the work of the Google algorithm and make decisions from the findings.
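
To illustrate (and this is pure speculation on my part, not how Google actually works), the sketch below only re-ranks a query’s results when a large share of them come from a hypothetical list of big-brand/content-farm domains; in that case it blends in a page-level topical relevance score of the LDA flavour mentioned above. The domain list, threshold and weight are all assumptions.

```python
# Speculative sketch of a "conditional factor": re-rank a query's results
# only when big-brand/content-farm domains dominate them, blending in a
# page-level topical relevance score (e.g. an LDA-style score, assumed to
# already be computed). Domains, thresholds and weights are illustrative.
from urllib.parse import urlparse

BIG_BRAND_DOMAINS = {"content-farm.example", "big-brand.example"}  # hypothetical

def rerank_if_brand_dominated(results, farm_share=0.5, topical_weight=0.4):
    """results: list of dicts with 'url', 'base_score' and 'topical_score' (0..1)."""
    if not results:
        return results
    hits = sum(
        1 for r in results
        if urlparse(r["url"]).netloc.removeprefix("www.") in BIG_BRAND_DOMAINS
    )
    if hits / len(results) < farm_share:
        return results  # query looks fine, leave the existing ranking alone
    # Brand-dominated query: give the page-level topical score more weight.
    def blended(r):
        return (1 - topical_weight) * r["base_score"] + topical_weight * r["topical_score"]
    return sorted(results, key=blended, reverse=True)
```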

Of course there are many other possible solutions but that is a potential one.

Summary

The reason I decided to write this article is that we all (myself included) look up to Google as the search engine powerhouse, when they may simply be living on their original reputation as revolutionaries and spam killers even though that is no longer the case.

They may still be the best search engine in terms of search quality, but with moves like the Panda update and the state of the algorithm before it, I suspect that Google could be overtaken if a new search engine with new ideas starts up.

While I would love to see somebody like Blekko do well and maybe even implement solutions like I have suggested, with their current attitude towards search and their current results I can’t see them overtaking Google any time soon.

Still, I would love to be proven wrong; I would love for a small search engine with a point to prove to take on Google on these types of queries with a new and creative solution and put some heat on the search giant.

 

Video: How Many Factors Are There in a Search Engine Algorithm?

 
Read below the video for further description, and make sure to leave a comment with your view or opinion in the comments section.

 

After Google and Bing traded blows over the number of factors their respective algorithms are composed of, there was considerable confusion in the SEO industry about how many factors the algorithms really have.

Tit for Tat

Google originally said that their algorithm had around 200 main factors.

In response, Bing, probably looking to get one up on their competition, decided to announce that they had a couple of thousand factors in their algorithm.

The confusion increased as Google explained that each of their 200 factors can have around 50 sub-factors, so the total number of factors, parameters, elements or whatever you are meant to call them works out at somewhere around 200 × 50 = 10,000.

Bing, not wanting to be outshone, also explained that it all depended on what you counted as a factor and what as an element.

Both search engines came out of the war of words looking more than a little sinister and backhanded, but the common SEO came out hungry for reliable information.

My personal hunger for this information and finding the truth behind the search engine algorithms has given rise to TheOpenAlgorithm project, which will hopefully result in a more transparent, open search industry and help a lot of webmasters. As well as being a lot of fun for myself, of course.

So, how many factors are there, really?

After all of that, the webmasters of the world were still left wondering what that reliable information and number would be.

I can only speculate and the algorithms are always changing, introducing new factors. But my personal opinion would be that both Google and Bing use around 10,000 factors in their algorithm.

I suspect that Google, being a more developed and older search engine, has somewhat more factors in its algorithm than Bing, but again that’s only speculation.

I’m using 10,000 as the goal for my project as I try to uncover as many factors as possible.

Summary

Unless you’re running projects or research, or are looking for a number to use in an article you are writing, you shouldn’t be worrying about the number of factors in the search engine algorithms.

The actual number doesn’t impact you, and focussing on it will only distract you from the goal: creating a great site for users.

I’m not naive, and we all know SEO is crucial to your site’s success, but don’t over-focus on it. Stay tuned here via Twitter, Facebook or RSS and you will have all the information you need to create a search engine and user friendly website.

If you liked this video you may want to check out my other videos and please leave a comment with any questions or opinions you may have.

Video: The Foundation for Great SEO – A Great Content Strategy

 
Read below the video for further description, and make sure to leave a comment with your view or opinion in the comments section.
 

Everybody knows content is important, but most people just don’t get it when it comes to their own website. They either forget great content is necessary or think any content is worthy of search engine traffic.

The majority of the time I see this with ecommerce or service-sales-based, static websites where the end goal is to get you to buy something. But I have also seen it when bloggers focus too much on stats like traffic and ad revenue.

As a result these webmasters lose focus and write content that is mundane and stale; stuff like “What is SEO?” and “What is YouTube?”. Of course there’s a need for this content, but you would have to be kidding yourself to believe that this need hasn’t already been fulfilled.

The foundation for great SEO and getting a lot of search engine traffic is to have great user-focussed content.

To be constantly creating this fantastic content you are going to need to have a content strategy in place.

Creating a Great Content Strategy

Using this simple self-assessment you can accurately determine whether you are implementing a great content strategy or whether you’re a webmaster who has lost their way.

Ask yourself: “Are you writing content 100%, totally for the user, and not thinking of the search engine at all, when you write your first draft?”

If you can honestly say yes then you are fine and all you need to do is keep going and creating more unique content.

If you are thinking of the search engine and thinking of keywords in your first draft, then you’re probably writing crap content.

(P.S. I’m not saying you should never think about the search engine; your final draft/editorial check should make sure you are using the right keywords in the right places (and all that other SEO stuff I like to talk about), but your first draft is all about the user.)

If you answered no to that question then run this idea creation process through your mind.

What content is my target market missing? What do they really want that they don’t already have?

(That could be video news, tutorial articles, webinars, a plain old tips blog, it could be anything and you know your target market and your industry best so you should be able to answer this question with a bit of thought.)

How am I going to fulfil what they want?

This part is the content strategy: you should be able to write down what it is you’re working towards (be that a regularly updated blog, a daily news update, etc.). If you ask yourself every day, “am I moving towards this goal?” and you are, then you are bound to get a lot of search engine traffic.

Summary

Really, what this is about is a mental refresh of where you are and where you need to be; a lot of websites lose their way or are off track from the beginning. The foundation for any site is great, user-focussed content and you need to create it regularly.

There is a lot more to SEO than just the content but if you stay tuned to TheOpenAlgorithm (I highly recommend our email newsletter) and my latest findings and blog posts you will find that peripheral work a breeze.

While all the other SEO peripherals are important, it’s more important to create a great foundation for your SEO house than to have a great design and no ground to put it on.

Video: Guest Blogging Advanced Strategy – Why You Need to Brag More

 

Read below the video for further description, and make sure to leave a comment with your view or opinion in the comments section.

 

I’m a huge fan of guest blogging. I believe it’s a great way to build your personal reputation and your site’s reputation with the search engines, increase your traffic and boost your long term SEO success.

Guest blogging helped me launch my first ever product (Link Building Mastery) which I went on to sell the rights to a new owner.

I have even launched a website and associated product totally to help teach people how to do guest blogging right.

My story

When I launched that first product I had a low traffic website with a PageRank of 0 and no email or social media list.

I decided to market the product through all the usual methods: I dabbled in Google Adwords, posted in forums and tried to recruit affiliates, but none of them worked as well as guest blogging. Since then I have been a total convert and devoted guest blogger.

It wasn’t all peaches and roses: my first five guest posts sent me 18 visitors, and it wasn’t because of the sites I was blogging on (one of them was one of the biggest blogs in my niche, SEOMoz).

I decided to try one more post with some slight changes to how I wrote it.

The main change was that instead of writing tutorials or commenting on news, I would write about my own experiences and share what I had learned; share my story.

In this case I shared the story of how I developed a product, gave an excerpt from the product and linked to it many times in the article.

It worked: that post sent me nearly double the traffic of my first five posts combined (31 visitors in the first 20 days, and hundreds more until I sold the website in July 2011).

Since then I have written a lot of guest posts, almost all of them drawing from my personal experience. All of them have sent me a lot of traffic, some sending me hundreds of visitors in the first day after publishing.

I wrote about this strategy in another guest post over at the 2CreateaWebsite blog, which ironically sent me over 300 visitors within the week after writing the post.

The Learning

The major learning for me was that to make a connection with your reader and compel them to click on the link to your blog in your author bio, you need to share a story and show your personal value beyond just being a good source of information.

You need each and every reader to believe you are an authority in your field before they finish reading the post.

If you can accomplish that through sharing your experience you too can double the traffic you get from your next guest post.

Summary

Of course there is a lot more to successful guest blogging but if you implement this advanced strategy you certainly will see the results.

On my website GuestBlogging101 I share more advanced strategies from my personal experience that will help you quadruple, not just double, the traffic you get from your next guest post.

If you liked this video, check out my other videos.

  • Didn’t like this video.
  • Used guest blogging but it didn’t work out.
  • Want to share your own guest blogging story or advanced strategy.
  • Used this strategy with success.
  • Got a question.

Leave a comment below; I try to respond to every comment submitted.

 

Video: How to Get Interviews with Industry Leaders and Use Them on your Site

 
Read below the video for further description, and make sure to leave a comment with your view or opinion in the comments section.


 

I first came across interviewing people as a strategy to build traffic and content from David Tiefenthaler of tips4running.com, when I read a post he did for the 2CreateaWebsite blog.

David had quadrupled his traffic, so I decided to give interviewing a go.

Other than guest blogging, interviewing experts would have to be the strategy I have used the most across all my sites and the sites that I help manage.

My first foray into interviews was with my former website 2buildbacklinks.com, where I had an interviews section (hopefully the new owner will keep that section and link alive; if not, let me know in the comments section).

My second was with my eBook Link Building Mastery, where I interviewed 15 incredible people, including some major names like Glen Allsopp, Marko Saric, Yaro Starak, Andrew Warner, Tamar Weinburg and even David (the same David who led me to take an interest in interviewing people).

And since then I have used interviews on pretty much every site I have helped, owned or consulted on.

I have learnt a lot about interviewing people, how to get interviews with industry leaders, A-list bloggers and very busy people.

I have conducted over 45 interviews (edit: now over 70) and plan to do a lot more (including some for TheOpenAlgorithm).

In this video I discuss how to get interviews with experts, how to publish them and get the most from them from a traffic point of view.

The Process

The basic process behind securing and publishing an interview is:

Identify a target – Email them asking for an interview – Conduct the interview – Publish – Share via email and social media – Make sure the interviewee shares with their social media followers – Rinse & Repeat.

There’s a lot more to it and conducting interviews is an art in itself. There are a lot of advanced strategies and things you can do to get the most out of your interviews but that’s the basic process.

I take that basic process into a little more depth in the video so I recommend you watch it. I’m still skimming the surface but the best way to learn is to go out and do it for yourself, so after you have watched the video go and try it out.

Try interviewing somebody small first (a friend or colleague), and build yourself up to the big names as you get used to the process.

If you have any questions or would like some more in-depth information on the process make sure to leave a comment below.

Summary

Interviews are a great way to build connections, traffic and high quality content. The process behind interviews is very important to get right, and it takes 4 or 5 interviews before you feel comfortable interviewing an industry leader.

You can conduct your interviews over the phone, via Skype, by chat (Gmail, MSN, Skype chat) or by email; it doesn’t really matter. The crucial thing is to record the interview if it’s audio or video and provide a transcription below the audio file or video.

The most important thing about interviewing people is to actually interview people, so don’t worry too much about the theory; just go and do it.

If you liked this video, check out my other videos.

  • Didn’t like this video.
  • Used interviews on your site but it didn’t work out or got rejected.
  • Want to share your own interviewing success story.
  • Think I left something important out.
  • Got a question.

Leave a comment below; I try to respond to every comment submitted.