Our website keeps getting better, and now we want to share another update: SEO optimization.
As we work in the booming world of IT services, it is vitally important to expand our reach across search engines so that more users can find the services they need. Another aspect is that optimization is not always equal to quality: some companies overuse SEO techniques to reach the top pages of search results, cluttering them up and degrading their quality. That's why we don't want to be anywhere near these cheaters.
Why do we care about this? SEO helps us organize website content in a better way, making it more consistent and valuable for end users, and it also brings our services to a new audience. That's why we've applied a comprehensive approach based on a set of best practices to make our brand-new website not only user-friendly, but also search-engine-friendly.
What have we done? First of all, we worked to eliminate the validation errors that appeared after the site was fully redesigned. We've managed to get rid of 70% of them, while the rest are related to third-party plugins and cannot be fixed immediately. Many articles across the Internet claim that validation has no significant impact on how high your website ranks in search results; in the real world this hypothesis has been neither confirmed nor refuted. But the fact that the website code becomes cleaner and tidier is beyond dispute.
Secondly, we properly configured all the meta tags, attributes and headers to make them compliant with current SEO practices. We even added some new code to the templates where we found discrepancies. Now the existing pages have relevant meta descriptions, a clear text structure and cross-links. The same actions were applied to images. As a result, there are no more errors or missing attributes.
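As an illustration, a page head configured along these lines might look like the sketch below. The URLs, titles and descriptions are placeholders, not our actual markup:

```html
<head>
  <!-- a unique, descriptive title per page -->
  <title>Managed DevOps Services | Example Co</title>
  <!-- a relevant meta description shown in search snippets -->
  <meta name="description" content="Managed DevOps services: cloud infrastructure, CI/CD pipelines and 24/7 monitoring.">
  <!-- a canonical link to avoid duplicate-content issues -->
  <link rel="canonical" href="https://example.com/services/devops/">
</head>

<!-- images carry descriptive alt attributes so they are indexable -->
<img src="/img/devops-pipeline.png" alt="CI/CD pipeline diagram">
```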
Thirdly, we added accurate robots.txt and sitemap files. While robots.txt was created in a straightforward way with nothing extraordinary, the sitemap was handled a bit differently. Many sites maintain an XML sitemap to simplify indexing, but in most cases regular users have no comparable way to browse all the published pages of a site. So we decided to create two separate sitemaps, one for search engines (XML) and one for users (HTML), to improve navigation and reduce the number of situations when people cannot find the services or articles they are looking for. The link to the current HTML sitemap is located in the footer, so all published pages can be accessed from one single place using a simple and clean structure.
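To give an idea of what these files typically contain, here is a minimal sketch of a robots.txt and an XML sitemap entry. The paths, dates and values are placeholders, not the actual files:

```
# robots.txt — allow all crawlers and advertise the XML sitemap
User-agent: *
Disallow:

Sitemap: https://example.com/sitemap.xml
```

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/services/devops/</loc>
    <lastmod>2019-01-15</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```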
There were a number of other minor SEO improvements, but these are the most valuable ones.
As we deliver technology services with outstanding quality and accuracy, it is important to keep our website as simple as possible so users have a remarkable experience browsing it. We are confident that the SEO improvements mentioned above will greatly help our visitors and maintain their confidence in us.
Feel free to browse the latest insights and hints on DevOps, Big Data, Machine Learning and Blockchain from IT Svit!