
Too Much SEO?

So, you’ve learned all about keywords, titles, and meta descriptions. You’ve edited your blog posts and practiced better writing habits. Your SEO campaign is on track! Your site traffic is up, and you’re ranking better in search results than ever before. Hoping to boost your results even further, you optimize a little more: some tweaks here and there, a few more keywords placed ever-so-strategically in your articles and body copy. After a while, you notice your numbers are slumping. Bounce rates are up. What could have happened? Where did you go wrong?

There’s ongoing debate about SEO over-optimization. Google’s goal is to ‘organize the world’s information and make it universally accessible and useful’, and SEO tactics don’t necessarily further that goal. The problem is that Google’s algorithm remains a puzzle to be solved, not a true metric of quality. SEO best practices are meant to make pages more appealing for the ranking algorithm, which isn’t always conducive to great content. Over the past several years, Google has taken strides to address the gap between great content and SEO best practices by penalizing over-optimized content. Today we’re going to look at how over-optimization can hurt your website more than help it.

SEO sounds dumb (sometimes)

I’ve mentioned before that SEO can lead to content that sounds stilted and awkward. In an attempt to appeal to a robot, you end up driving away human connections. There have been many times when I’ve gone to a major SEO blog or website for research purposes and left feeling a little jaded about the robotic tone or intrusive calls to action. The problem with SEO is that it encourages people to write for the algorithm. Keyword density is one unfortunate metric that can end up encouraging bad writing. While keyword stuffing (cramming far too many keywords onto a single page) has long been punished, most SEO experts agree that keywords are still vital to search ranking. Having to rewrite content to include more keywords so a robot can tell what your blog is about is less than ideal. In real conversations, spoken or written, you don’t go back and repeat “Kelowna web design and SEO company” over and over again to make sure your audience knows what you’re talking about. If you did, you’d sound like a crazy person.
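If you’re curious how keyword density actually gets measured, here’s a minimal sketch in Python. Everything in it is illustrative: the target phrase, the sample copy, and the 2% threshold are my own assumptions, not figures Google publishes.

# Minimal sketch: what fraction of the words on a page belong to a target phrase?
# The phrase, sample text, and 2% threshold below are illustrative assumptions.
import re

def keyword_density(text, phrase):
    words = re.findall(r"[a-z0-9']+", text.lower())
    target = phrase.lower().split()
    if not words:
        return 0.0
    hits, i = 0, 0
    while i <= len(words) - len(target):
        if words[i:i + len(target)] == target:
            hits += 1
            i += len(target)   # count non-overlapping matches of the phrase
        else:
            i += 1
    return hits * len(target) / len(words)

article = ("Our Kelowna web design team builds fast sites. "
           "Kelowna web design doesn't have to be complicated, "
           "and good Kelowna web design starts with good content.")
density = keyword_density(article, "Kelowna web design")
print(f"Keyword density: {density:.0%}")    # 36% for this deliberately stuffed sample
if density > 0.02:                          # illustrative threshold only
    print("This copy probably reads as stuffed to both people and crawlers.")

Notice that the stuffed sample reads badly to a human long before any calculation flags it; the number just confirms what your ear already told you.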

Fortunately, Google is working on this issue. Back in 2012, they made some big changes to how the algorithm ‘reads’ pages, enacting penalties for artificial links and keyword stuffing. Now, one instance of a target keyword is a good start, and two or three is great; beyond that, the returns diminish quickly, and too many will actually start hurting your search ranking. Google has also grown sophisticated enough to recognize words similar to your target keywords. Instead of simply looking for repeated instances of a keyword, it takes related terms into account. In theory, this limits the impact of keyword stuffing even further and benefits content that discusses topics at length and in detail.

What’s the takeaway here? Remember to write for humans, and spruce up your writing with a few SEO tricks, not the other way around. You can bet Google’s algorithm will grow ever more refined, and better able to weed out poor writing.

User experience trumps search ranking

I’ve already begun to mention this concept above, but let’s explore it more. Imagine, through a ton of SEO work and a little luck, your website is ranked #1 overall for your biggest, most important keyword. That’s going to mean a ton of traffic to your site. Think of all the new customers you’ll have! However, something’s gone wrong. Your business is doing better than it did when you were on page 3 of Google, but not that much better. Your bounce rate (the percentage of visitors who leave your site without viewing any other pages or engaging in any way) is crazy high, and the leads just aren’t coming in. Meanwhile, your competitor across town, who barely ranks on the first page, is slammed busy every day. What happened?

While this scenario is unlikely, it’s not impossible. Boosting your search ranking won’t do much if the website your visitors land on is cluttered or unappealing, or if the content is poor or misleading. The fact is, a million views a month with an 80% bounce rate can be worse than half a million views with a 50% bounce rate. While ranking highly is definitely a big boon to your business, you need to be better than your competitors to capitalize on that high-ranking keyword. Often, the better site ends up ranking well eventually anyway, but it’s worth your consideration: if you only have a few hours a week to work on your website, you need to decide whether low traffic or a high bounce rate is the bigger problem, and work accordingly.
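The arithmetic behind that claim is easy to sanity-check. Here’s a quick sketch using the hypothetical traffic figures above; the numbers are purely for illustration.

# Engaged visitors = total views x (1 - bounce rate), using the
# hypothetical figures from the paragraph above.
big_but_bouncy = 1_000_000 * (1 - 0.80)     # 200,000 people who stick around
smaller_but_sticky = 500_000 * (1 - 0.50)   # 250,000 people who stick around
print(big_but_bouncy, smaller_but_sticky)

In that scenario, the “smaller” site keeps 50,000 more potential leads on the page every month, despite half the traffic.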

What’s the answer, then?

If the question you’re asking is ‘Can I over-optimize my website content?’, the answer is yes. If you’re asking ‘How much SEO is too much SEO?’, the answer is a little less clear. As I’ve mentioned, Google’s algorithm is a closely guarded secret. There’s no way to know the definitive best number, type, and location of keywords or links, so you have to work from the SEO community’s highly educated guesses and Google’s own published guidance. What’s important to understand is that SEO is only part of an online business’s success. As SEO continues to change and evolve, it should become easier to simply write the best article you can and rank well for it. In the meantime, it’s an ongoing balancing act.

Do you need some great content written for your website, or need a site refresh to drop your bounce rate? Contact us today to get a free quote and consultation for your website!