If you haven’t already heard, Google Search has announced that it will now prioritize mobile-friendly pages in mobile search results. This means that if Google detects that your website is not mobile friendly, your pages end up at the bottom of the pile on mobile searches compared to other results. Unfortunately, this also means reaching fewer people who are browsing on tablets and mobile devices.
- Google wants the ever-growing population of tablet and mobile users to get high-quality results
- Google wants to ensure that people who find results through Google on non-desktop devices can view content properly without zooming or tapping
How are they accomplishing this?
- Pages that do not pass the mobile-friendly check will be ranked accordingly
- Companies and organizations can check their pages with Google's Mobile-Friendly Test tool
While frustrating, it makes sense that one of the largest search engines would prioritize these results: people will only keep using the service if it returns results they can actually read on their devices. Site owners can also verify through the webmaster tools whether their pages comply. Building a responsive website is key to staying flexible as tablet and mobile usage continues to grow.
If you are running Drupal or WordPress, you might need to modify the robots.txt file so that Googlebot can crawl the assets it needs to render and verify your site properly. For Drupal, you can do this by adding the following section:
```
User-agent: Googlebot

# Directories
Disallow: /includes/
Disallow: /misc/
Disallow: /modules/
Disallow: /profiles/
Disallow: /scripts/
Disallow: /themes/

# Files
Disallow: /CHANGELOG.txt
Disallow: /cron.php
Disallow: /INSTALL.mysql.txt
Disallow: /INSTALL.pgsql.txt
Disallow: /INSTALL.sqlite.txt
Disallow: /install.php
Disallow: /INSTALL.txt
Disallow: /LICENSE.txt
Disallow: /MAINTAINERS.txt
Disallow: /update.php
Disallow: /UPGRADE.txt
Disallow: /xmlrpc.php

# Paths (clean URLs)
Disallow: /admin/
Disallow: /comment/reply/
Disallow: /filter/tips/
Disallow: /node/add/
Disallow: /search/
Disallow: /user/register/
Disallow: /user/password/
Disallow: /user/login/
Disallow: /user/logout/

# Paths (no clean URLs)
Disallow: /?q=admin/
Disallow: /?q=comment/reply/
Disallow: /?q=filter/tips/
Disallow: /?q=node/add/
Disallow: /?q=search/
Disallow: /?q=user/password/
Disallow: /?q=user/register/
Disallow: /?q=user/login/
Disallow: /?q=user/logout/

# =========================================
Allow: /*.js*
Allow: /*.css*
# The most specific rule based on the length of the [path] entry will
# trump the less specific (shorter) rule.
# The order of precedence for rules with wildcards is undefined:
# https://developers.google.com/webmasters/control-crawl-index/docs/robots_txt
# Thanks to Marcel Jong and sunishabraham
Allow: /misc/*.js
Allow: /modules/*.js
Allow: /modules/*.css
# =========================================
```
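The comments in the rules above note that the most specific matching rule (by path length) trumps shorter ones, with Allow winning ties. If it helps to see that precedence logic in action, here is a minimal Python sketch of it; the rule list and `is_allowed` helper are hypothetical, and `fnmatch` is only a rough approximation of Google's wildcard matching:

```python
import fnmatch

# A hypothetical subset of the Drupal rules above, as (verb, pattern) pairs.
RULES = [
    ("Disallow", "/misc/"),
    ("Disallow", "/modules/"),
    ("Allow", "/misc/*.js"),
    ("Allow", "/modules/*.js"),
    ("Allow", "/modules/*.css"),
]

def is_allowed(path):
    """Longest matching pattern wins; on a length tie, Allow beats Disallow."""
    best = None  # (pattern_length, is_allow)
    for verb, pattern in RULES:
        # robots.txt patterns are prefix matches, so append '*' before
        # handing them to fnmatch, which also handles the internal '*'.
        if fnmatch.fnmatch(path, pattern + "*"):
            candidate = (len(pattern), verb == "Allow")
            if best is None or candidate > best:
                best = candidate
    return True if best is None else best[1]

print(is_allowed("/misc/jquery.js"))   # directory is blocked, but JS is allowed -> True
print(is_allowed("/misc/print.css"))   # no Allow rule for CSS under /misc/ -> False
print(is_allowed("/index.html"))       # no rule matches, default is allow -> True
```

This is why the `Allow: /misc/*.js` line works even though `/misc/` is disallowed: the Allow pattern is longer, so it takes precedence for JavaScript files in that directory.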
And for WordPress you can add the following:
```
User-agent: Googlebot
Disallow: /wp-admin/
Allow: /*.js*
Allow: /*.css*
Allow: /wp-admin/admin-ajax.php*
```
Once that has been done, you can go into Google Search Console and submit your updated robots.txt file. After Google has fetched it, your site will be indexed appropriately for both desktop and non-desktop search results.