KISS (Keep It Simple, Silly) is often dismissed as a tired acronym, but in the case of SEO it is these days a more accurate position to hold than the popular ‘content is king’ mantra.
This is because, while content remains the single most impactful weapon in the SEO armoury (and arguably the simplest), the overall strategy must be straightforward enough to be relatively transparent – for a whole host of reasons, not least any future handover. KISS is also a philosophy that developers should, and generally do, try to follow.
A recent communication from Martin Splitt, search developer advocate at Google, included the following statement: “You might shoot yourself in the foot when you don’t expect it, so why would you build something more brittle if all it does is solve a non-problem?” The fact is that there are many people out there who understand just enough about development, SEO and the relevant technology to position themselves perfectly to advise companies that their sites need changing and restructuring, when in reality the cost will far outweigh any potential benefit. Martin Splitt went on to drive this point home: “These are things that worry me a lot and, oftentimes, it is either very over eagerly excited developers or SEOs who understand enough of the technology to be dangerous with it.”
The point, then, is that insisting on simplicity and transparency pays off for business owners and SEOs alike. As a business owner you are less exposed to trouble and unnecessary cost; as an SEO you will earn a reputation for trustworthiness, and that is worth its weight in gold. You are also less likely to cause problems for yourself and your clients.
One of the biggest problems caused by unnecessary tinkering is crawling errors. The number of websites Google finds that are linking incorrectly is apparently staggering. Internal and external links are equally problematic, and errors in both stem predominantly from SEOs who are trying, as Martin Splitt puts it, to “reinvent the wheel.” Linking is simple; when it is made complex it may appear to work, but it can lead to sporadic failures that often trip up crawlers.
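To make that concrete, here is a minimal Python sketch – not a Google tool, and using made-up markup – of the difference a crawler sees between a plain anchor link and a “reinvented” one that only works through a JavaScript click handler.

```python
# Minimal sketch: real <a href="..."> elements give crawlers a URL to follow,
# while navigation wired to onclick handlers on other elements does not.
from html.parser import HTMLParser

class LinkAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.crawlable = []   # anchors with an href a crawler can follow
        self.suspect = []     # elements that rely on a JavaScript click handler

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a" and attrs.get("href"):
            self.crawlable.append(attrs["href"])
        elif "onclick" in attrs:
            self.suspect.append(tag)

# Hypothetical markup: both look like links to a user; only the first does to a crawler.
sample = '<a href="/pricing">Pricing</a> <span onclick="goTo(\'/pricing\')">Pricing</span>'
audit = LinkAudit()
audit.feed(sample)
print("Crawlable links:", audit.crawlable)            # ['/pricing']
print("JavaScript-only navigation:", audit.suspect)   # ['span']
```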
A misinterpretation of function, cause and effect on the part of SEOs is also responsible for many common JavaScript issues. One classic example of the damaging effect of tinkering in this area: an SEO looks at his or her site and sees that Google has dropped it from the index. Often, investigation reveals that the robots.txt file has been set to disallow a given URL. If that blocked URL happens to be one Googlebot needs in order to render the page (the API that delivers the content, for example), the search engine will not be able to see the site’s content at all, irrespective of how meticulously the JavaScript is constructed. Arguably, of course, the issue is not entirely one of needing to KISS; it is also the age-old problem that a little knowledge is a dangerous thing. If the SEOs in question understood the pitfalls, these sorts of mistakes could be avoided. But on the whole there is little to be gained from over-icing the cake, and the overuse of JavaScript is almost certainly an example of this.
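Checking for this is straightforward. Here is a minimal sketch using Python’s standard urllib.robotparser; the domain and API path are hypothetical placeholders, and the point is simply that if the URL your JavaScript fetches content from is disallowed for Googlebot, the rendered page is effectively empty as far as the search engine is concerned.

```python
# Minimal sketch: check whether Googlebot may fetch the URL your JavaScript
# depends on. The domain and path below are hypothetical placeholders.
from urllib.robotparser import RobotFileParser

robots = RobotFileParser()
robots.set_url("https://www.example.com/robots.txt")
robots.read()  # download and parse the live robots.txt (requires network access)

api_url = "https://www.example.com/api/content"  # the endpoint the page's JavaScript calls
if robots.can_fetch("Googlebot", api_url):
    print("Allowed: Googlebot can fetch the content API")
else:
    print("Blocked: Googlebot cannot fetch the content API, so rendered pages may appear empty")
```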
The best advice, as offered by Google for many years, is to build websites for users, not search engines. Subtle tweaking for search engines should be more of an afterthought, long after the site is optimised for users across all platforms.
John Hinds is a director at Lojix, a digital marketing agency based in Yorkshire.