Robot Files, A Primer

Robots Files Robot files (robots.txt) are plain-text documents that reside at the root of a domain. They give bots direction on what to crawl and what not to crawl. A robots file can address all bots or only specific ones. “Do not crawl” directives can target a specific URL or an entire folder[…]
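A minimal sketch of such a file is below; the paths and bot names are hypothetical placeholders, not recommendations for any particular site:

```text
# robots.txt — lives at https://www.example.com/robots.txt
# Rules for all bots
User-agent: *
Disallow: /private/        # do not crawl anything in this folder
Disallow: /drafts/page.html  # do not crawl this specific URL

# Rules targeting one specific bot
User-agent: Googlebot
Allow: /
```

Bots read the group whose `User-agent` line matches them; the `*` group applies to everyone else.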

SEO for website migrations

URL Structure, A Primer

URL Structure/Hierarchy In DNS and Google Search Console, ensure the account is set to manage URL variations, including http, https, www, and non-www versions of the site. There should be a uniform method of delivering URLs so as not to create duplicate content across what search engines see as distinct websites. Additionally, it’s important to ensure that subfolders[…]
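The "uniform method of delivering URLs" idea can be sketched as a small normalization step. This is an illustrative example only, assuming a hypothetical site policy of https + www as the canonical form:

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url):
    """Collapse http/https and www/non-www variations into one
    canonical form (here: https + www, an assumed site policy)."""
    parts = urlsplit(url)
    host = parts.netloc.lower()
    if not host.startswith("www."):
        host = "www." + host
    # Force https, keep path and query, drop any fragment
    return urlunsplit(("https", host, parts.path or "/", parts.query, ""))

print(canonicalize("http://Example.com/page"))
```

In practice this policy is usually enforced with server-side 301 redirects rather than application code, but the mapping is the same: every variation resolves to a single URL.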

Conversion rate optimization and user experience with Oni

Sitemaps, A Primer

Sitemaps Sitemaps are XML files that list a website’s URLs so search engines can access them easily. Elements of a sitemap include the location of each page, its priority, its update frequency, and the images it contains (among other elements). A sitemap is ideally located at the root of a website (www.example.com/sitemap.xml); however, unlike robots files, it can[…]
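The elements listed above can be seen in a minimal sitemap sketch; the URL, image path, and values here are hypothetical:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>https://www.example.com/</loc>       <!-- location of the page -->
    <changefreq>weekly</changefreq>           <!-- frequency of updates -->
    <priority>1.0</priority>                  <!-- priority of the page -->
    <image:image>                             <!-- image on the page -->
      <image:loc>https://www.example.com/img/hero.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```

Each additional page gets its own `<url>` entry inside the `<urlset>`.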