If organic search results are going to be the primary source of traffic to your website, you need to keep this in mind throughout the process of coding your pages. Contrary to popular belief, there is a lot more to SEO than content marketing, synonyms, and keywords. Much of the code that goes into the site itself determines how well it will perform for the foreseeable future.
Put Yourself on the Map
The first thing you must do is ensure that your pages are accessible to Google and other search engines and that their robots (or spiders) can see the content. The simplest way to check is to use "Fetch as Google" in the "Crawl" section of Google Search Console. It is worth noting that crawlers handle iframes poorly, since content inside an iframe is generally attributed to its source page rather than yours, and Silverlight and Flash content suffers too. It is a good idea to keep all important indexable content in plain HTML.
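One quick sanity check alongside "Fetch as Google" is your robots.txt file: an overly broad Disallow rule can keep crawlers away from content you want indexed. A minimal sketch of a robots.txt that leaves the whole site crawlable follows; the admin path is purely an illustration, not something from this article:

```text
# robots.txt at the site root (e.g. http://website.com/robots.txt)
User-agent: *
# Block a hypothetical admin area; everything else stays crawlable
Disallow: /admin/
```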
Optimized, Clean URLs
Your URL is not only an important cog in the user experience but also critical to SEO. Since it is the first thing any crawler sees, it should be clean, short, easy to read, and descriptive of the content. Long strings of numeric parameters are a serious no-no because they tell readers and robots nothing. It makes much more sense to have http://website.com/products/category_specific/specific_product/ as the URL for a simple product page than something along the lines of http://website.com/products/index.jsp?category_id=31&product_id=16/. Parameters confuse robots as well as visitors and give search engines no signal about relevance to particular keywords. If you use a CMS with sorting, filters, and dynamic elements, you are bound to end up with parameters. Mature content management systems like WordPress, however, let you customize permalinks, which is a good idea.
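If your server runs Apache, URL rewriting is one common way to show clean URLs while the application still receives its parameters. The sketch below is a minimal .htaccess example; the slug pattern and the parameter names (category, product) are assumptions for illustration, not taken from the article:

```apache
# .htaccess sketch: map /products/<category>/<product>/ to the
# underlying parameterized page. Slug pattern and parameter names
# are hypothetical.
RewriteEngine On
RewriteRule ^products/([a-z0-9_-]+)/([a-z0-9_-]+)/?$ /products/index.jsp?category=$1&product=$2 [L,QSA]
```

With a rule like this, crawlers and visitors only ever see the descriptive URL, while the backend page continues to read its query-string parameters.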
Get the Most Out of Meta Tags
Search engines learn a lot about your site from meta tags too. The title tag is the most important: it should contain your most important target keyword along with any supporting text, and should never exceed 60 characters. You can use the pipe (|) character to separate keywords, as in <title>Keyword1 for readers | Keyword2 guide</title>. If you are optimizing for local search rankings, include your target location and industry in the title tag.
Descriptions do not factor directly into rankings, but they still matter: search engines combine the description and the title into the search snippet and use them to learn about the page. They are essentially free text ads, with the matched keywords shown in bold, and a good one can lower bounce rate and boost click-through rate. Terms like "sale", "cheap", "discounts", or "free shipping" attract shoppers. The description should not exceed 160 characters.
Finally, the robots meta tag tells search engines whether you want the page indexed, for example <meta name="robots" content="noindex"/>. Note that this is different from blocking the page in robots.txt: a page blocked from crawling can still show up on SERPs, just with the message 'A description for this result is not available because of this site's robots.txt'. For a noindex tag to take effect, the crawler must actually be allowed to fetch the page. It is worth noting that you don't need to write the metadata yourself, and can hire a top NY SEO Expert to help you out.
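Putting the three tags together, a page's head section might look like the sketch below; the title, description, and keyword text are placeholder examples, not recommendations from this article:

```html
<head>
  <!-- Title: main keyword first, under 60 characters -->
  <title>Blue Widgets for Hikers | Widget Buying Guide</title>
  <!-- Description: shown in the search snippet, under 160 characters -->
  <meta name="description"
        content="Compare lightweight blue widgets for hiking. Free shipping on orders over $25." />
  <!-- Robots: use "index, follow" (or omit the tag) on pages you want
       indexed; swap in "noindex" to keep a page out of the index -->
  <meta name="robots" content="index, follow" />
</head>
```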
Make It Mobile-Friendly
Google now takes mobile devices very seriously when ranking pages, and it offers a mobile-friendly test that lets you check how well you are doing. Several design issues can arise, and you should stay on top of them. Page loading time is a huge factor: Google expects pages to render above-the-fold (ATF) content in one second or less. After factoring in the DNS lookup, the TCP handshake, and the HTTP request and response, only about 400 milliseconds are left to actually render the ATF content.
You should optimize image sizes in GIMP or Photoshop, make the most of caching, keep ATF content below roughly 14.6 KB compressed (the amount that fits in the initial TCP congestion window), minify code by stripping extra characters from stylesheets and JavaScript with tools like JSMin or YUI Compressor, and try Google's Accelerated Mobile Pages (AMP) tools.
The PageSpeed Insights tool is a great place to gather performance metrics. As far as design goes, responsive design is as good as it gets: you simply set the viewport meta tag and design with other devices in mind. Dynamic serving is more holistic but takes a lot more time, since it requires detecting the user agent and serving a specialized version of the site to desktop or mobile devices. The very last resort is a separate subdomain for mobile devices, using rel="alternate" and rel="canonical" annotations to tie the desktop and mobile versions together and avoid duplicate-content problems.
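For the responsive route, the viewport tag is the key piece; for a separate mobile subdomain, the alternate/canonical pairing looks like the sketch below. The m.website.com URLs and the media query breakpoint are illustrative assumptions:

```html
<!-- Responsive design: tell mobile browsers to use the device width -->
<meta name="viewport" content="width=device-width, initial-scale=1" />

<!-- Separate mobile URLs: on the desktop page... -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="http://m.website.com/page" />
<!-- ...and on the mobile page, point back to the desktop version -->
<link rel="canonical" href="http://website.com/page" />
```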
Search engines are not the enemy, so as long as you provide good content that serves users well, you really don't need to worry. There are a million different techniques for boosting results, but sticking to a few good ones is always a fantastic idea.