Naturally, everyone is interested in getting better search engine rankings. A persistent myth, however, is that you need to reach x words in your articles; otherwise they won't rank well, or they might not get indexed at all.
As of this writing, Brugbart has a number of articles that are just below 100 words in length, and nearly all of them are indexed. I think the smallest articles we have had have been around 50 words!
But my point is not to get people to write shorter posts. My point with this article is that while short pages are useful in some circumstances, they may not be so useful in others.
Brugbart has a tool to help identify potentially low-quality content that needs to be improved. I am not afraid to delete content of poor quality, or to completely rewrite older articles.
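The Brugbart tool itself isn't published, but as an illustration only, here is a minimal sketch of how a thin-content check might work. The page data and the 100-word threshold are hypothetical; a flagged page is a prompt for review, not an automatic candidate for deletion, since short pages can be perfectly fine:

```python
# Illustration only: a minimal thin-content checker, not the actual Brugbart tool.
# The 100-word threshold is a hypothetical starting point, not a rule.

def flag_thin_pages(pages, min_words=100):
    """Return (url, word_count) pairs for pages below min_words, shortest first."""
    flagged = []
    for url, text in pages.items():
        count = len(text.split())
        if count < min_words:
            flagged.append((url, count))
    return sorted(flagged, key=lambda pair: pair[1])

if __name__ == "__main__":
    # Hypothetical sample pages mapped to their plain-text content.
    sample = {
        "/css/selectors": "word " * 250,
        "/html/hr-tag": "word " * 40,
    }
    for url, count in flag_thin_pages(sample):
        print(f"{url}: {count} words")
```

In practice you would review each flagged page by hand and decide whether to improve, merge, or delete it.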
How much or little is needed?
As you will likely find discussed in other articles on Brugbart, writing short is generally not recommended. Writing long is usually better than writing short, but this has little or nothing to do with the length of the posts themselves. Rather, it has to do with the originality and uniqueness of your content.
If your content is unique, you will nearly always rank regardless of how long your posts are. But if a million other bloggers are writing about the same topics, it will be harder, and you may need to put more effort into your writing.
The ideal content length has been a topic of debate in SEO communities, and while there might be some truth to the longer-is-better idea, as I also discussed above, it only seems to apply in some cases.
Competitive niches might be a determining factor
If there's a lot of existing content in a certain field, long posts should in theory perform better in the search engines than short posts. But even then, there's still no guarantee that you will outperform other websites on the subject.
Much of this can be known almost intuitively, simply by performing simple thought experiments. For instance, you could perform such an experiment by asking questions like:
- What are the main ranking factors that Google uses, besides links?
- What is page X doing differently to rank higher than page Y?
But to answer a question like that, you would also need to know how Google ranks search results. Answering such questions precisely can be hard, but you can still get close enough to make the necessary changes to your pages for them to start ranking better in the search results.
Thought experiments like this are good, because they allow you to spend less time searching for information or running experiments. In this case, a simple thought experiment suggests that there really is no minimum word count: Google wants to index pages as long as they are relevant to searchers.
Where does the 300-word myth come from?
Presumably it originates with blackhat SEOs who got hit by the Panda and Penguin updates, which were algorithm changes aimed at improving the quality of search.
Panda was a change to Google's search algorithm that first appeared on February 23, 2011, a long time ago now. The purpose of the update was to boost the ranking of high-quality pages and to demote pages of lower quality. Generally, the purpose of all updates to the algorithm is to improve search, so we will not focus too much on the individual updates. However, in a blog post, Amit Singhal from Google mentions something very interesting:
"One other specific piece of guidance we've offered is that low-quality content on some parts of a website can impact the whole site's rankings, and thus removing low quality pages, merging or improving the content of individual shallow pages into more useful pages, or moving low quality pages to a different domain could eventually help the rankings of your higher-quality content." – More guidance on building high-quality sites
Which brings me back to one of my earlier points in this article: the importance of updating old content, and even deleting our low-quality content. By getting rid of garbage and improving old content, you can actually improve the ranking of your website as a whole, and not just the individual pages.
Penguin focused mostly on unnatural links, that is, websites that had a portfolio of fake links pointing to them. It has been typical of blackhatters to try to manipulate their rankings through buying links or spamming the links themselves.
As these blackhat SEOs got hit, they likely invented the minimum-300-words idea to avoid getting punished for thin content.