This Week’s Industry News
Compiled by the Rocket Clicks Team
Chrome DevTools Offers Option to Adjust Geolocation
Google Chrome users can now override their geolocation and report a different location using Chrome DevTools. This lets users test how a website responds to specific locations, which is especially helpful when the site’s user experience changes based on location.
How to Show Different Geolocation Using Chrome DevTools:
- In DevTools, press Command+Shift+P (Mac) or Control+Shift+P (Windows, Linux, Chrome OS) to open the Command Menu
- Type ‘show sensors’, and press Enter. This will open the Sensors tab at the bottom of your DevTools window.
- Select one of the preset cities, or select ‘Location unavailable’ to see how the site behaves when the user’s location is not available
- To add a location not on the list, press the ‘Manage’ button to the right of the dropdown menu, and press the ‘Add location…’ button
- Include the location name along with the city’s coordinates, and press ‘Add’
Once the preferred geolocation is entered, close the DevTools panel and the browser will continue to serve pages as if the user were located in the chosen area.
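The override affects whatever the page reads from the browser’s Geolocation API. As a rough sketch (the function and messages below are illustrative, not from the article), this is the kind of page-side code whose behavior the Sensors override changes:

```javascript
// Page code that reacts to the user's (possibly overridden) location.
// With the DevTools Sensors override active, the coordinates handed to
// this callback are the spoofed ones, not the user's real position.
function describePosition(position) {
  const { latitude, longitude } = position.coords;
  return `Showing results near ${latitude.toFixed(2)}, ${longitude.toFixed(2)}`;
}

// In the browser, this would typically be wired up like so:
// navigator.geolocation.getCurrentPosition(
//   (pos) => console.log(describePosition(pos)),
//   (err) => console.log('Location unavailable:', err.message)
// );
```

Selecting ‘Location unavailable’ in the Sensors tab exercises the error callback above, which is why it is useful for testing the no-location path.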
Source: Search Engine Journal
Google Ads Removing Two Bidding Strategies
Google is planning to remove the ‘Target Search Page Location’ and ‘Target Outranking Share’ bid strategies in late June. With the capabilities of Target Impression Share, introduced last November, Google has decided to retire the two older strategies. Target Impression Share is a smart bidding strategy that automatically sets bids based on where the ad will show up. Campaigns using Target Search Page Location or Target Outranking Share will be automatically migrated to Target Impression Share, and after migration, campaigns will be automatically optimized based on previous target locations and historical impression share.
Source: Search Engine Journal
Google Announces Mobile-First Indexing by Default for New Sites Starting July 1
Mobile-first indexing will be enabled by default for all new websites starting July 1, 2019. This means Google will crawl any site previously unknown to Google Search using a smartphone Googlebot by default. Any site published after June 30 will need to be ready for this method of crawling.
As for existing sites, owners can check mobile-first indexing status with the URL Inspection Tool in Search Console, which shows how a URL was last crawled and indexed. Google will continue to monitor and evaluate existing pages and will notify owners through Search Console when a site is ready for mobile-first indexing. Existing sites can improve readiness by focusing on content, structured data, and other metadata. For new sites, Google recommends responsive web design from the start, including using a single URL for both the desktop and mobile versions of a page.
Twitter is Testing the Effects of Showing Users More Ads
Twitter is displaying more sponsored posts in users’ feeds in an experiment that tests how often ads are shown. Not all users see the same ad frequency, and one factor that determines it is how a person continues to use Twitter after being exposed to a number of ads. If someone’s Twitter usage drops after being shown ads at a greater frequency, they may be shown fewer ads as a result. This test may give advertisers more opportunities to appear in front of their target audience. However, it may also hurt advertisers whose ads create a poor experience, if users consistently react negatively to them.
Source: Search Engine Journal
3 Ways SEOs Can Utilize the Wayback Machine
Whether you’re using it to troubleshoot an issue, find old content, or simply recall details of a previous promotion, the Wayback Machine is a handy tool for searching the Internet Archive. Here are three ways to utilize it:
- Troubleshoot a Shift in Traffic – Whether it’s for your own site or a competitor’s, the Wayback Machine can be used to pinpoint issues with on-page meta tags, internal linking, or page changes by comparing an older version of a page to the current one. Enter the URL of interest into the search box at archive.org and choose a result from around the date you believe the code may have changed. The cached page will load in the browser just like a regular website, but with a header from archive.org. Analyze the page for any differences in structure and content, which may provide a clue as to why traffic has fluctuated.
- Check Indexing Issues Using robots.txt – The Wayback Machine also archives robots.txt files, which makes it possible to quickly identify any changes in crawling permissions.
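For frequent checks, the comparison itself can be scripted. A minimal sketch, assuming both robots.txt versions have already been downloaded (for example, the archived copy from the Wayback Machine and the live one); it naively collects Disallow paths and deliberately ignores user-agent grouping and wildcard rules:

```javascript
// Naive robots.txt comparison: collects Disallow paths from each
// version and reports rules added or removed between snapshots.
// (Simplified sketch: ignores user-agent grouping and wildcards.)
function disallowedPaths(robotsTxt) {
  return new Set(
    robotsTxt
      .split('\n')
      .map((line) => line.trim())
      .filter((line) => line.toLowerCase().startsWith('disallow:'))
      .map((line) => line.slice('disallow:'.length).trim())
      .filter((path) => path.length > 0)
  );
}

function diffRobots(oldTxt, newTxt) {
  const oldRules = disallowedPaths(oldTxt);
  const newRules = disallowedPaths(newTxt);
  return {
    added: [...newRules].filter((p) => !oldRules.has(p)),
    removed: [...oldRules].filter((p) => !newRules.has(p)),
  };
}
```

A newly added rule such as `Disallow: /blog/` showing up in the “added” list could explain a sudden drop in indexed pages.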
- Research Backlink Strategies – Use a backlink tool to identify lost links pointing to a website. Then, look those pages up in the Wayback Machine to see how they originally linked to the target site. Since legitimately earned links are rarely removed, this helps determine why a link is no longer there.
Source: Search Engine Journal
3 Things to Remember When Testing Low Volume Ad Accounts
- It’s Going to Take Time – In any account with a lower budget, a test will likely take 30-60 days to reach any sort of significance. To understand how long a specific test may take, identify the metrics you’ll measure success by, then use historical data to estimate the length of the test.
- Try to Keep Things the Same – Conditions will change over those 30-60 days, so keep track of what is being tested before making optimizations, so as not to disturb tests in progress. Any change to the testing environment will affect results, and if the test spans multiple campaigns, you have to be especially careful about every change you make.
- Finding Statistical Significance – While testing low-volume ad accounts can be time-consuming, it is best to wait until the test has reached statistical significance before declaring a “winner”. If a winner is declared too soon, without enough data, subsequent decisions could compound the error of that original test without the full results ever being known.
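One common way to check significance on conversion data is a two-proportion z-test, sketched below; the 1.96 threshold corresponds to roughly 95% confidence (two-tailed). This is an illustration of the general technique, not a method prescribed by the article:

```javascript
// Two-proportion z-test: is the difference in conversion rate
// between variants A and B statistically significant at ~95%?
function zTest(convA, visitsA, convB, visitsB) {
  const pA = convA / visitsA;
  const pB = convB / visitsB;
  // Pooled conversion rate under the null hypothesis (no difference).
  const pPool = (convA + convB) / (visitsA + visitsB);
  const se = Math.sqrt(pPool * (1 - pPool) * (1 / visitsA + 1 / visitsB));
  const z = (pA - pB) / se;
  // |z| >= 1.96 corresponds to p < 0.05 (two-tailed).
  return { z, significant: Math.abs(z) >= 1.96 };
}
```

With low-volume accounts, small daily visit counts keep the standard error large, which is exactly why such tests need 30-60 days (or more) before `significant` flips to true.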