I missed this article during the hustle and bustle of Christmas, and I suspect that’s exactly what Google hoped would happen: that people would be too distracted to notice the news that Google was going to ruin email marketing to Gmail users. Here’s the important part of the article:
“…Google has just announced a move that will [...] cache all images for Gmail users. Embedded images will now be saved by Google, and the e-mail content will be modified to display those images from Google’s cache, instead of from a third-party server. E-mail marketers will no longer be able to get any information from images…”
So what? Well, just like [not provided], Google is disconnecting marketers from their target audience and effectively blinding them to a very important piece of data: open rate.

When a marketing email is opened, its images are downloaded from the email marketing company’s server via a special image call. That call sends back information about you, including your IP address, email address, a timestamp, etc. It tells the email marketing company that you opened the email and viewed their offer, which is what gives them an “open rate”. Since Google is going to cache all images on their own servers and display the cached copies in Gmail, the email marketing company will no longer receive information about who has opened their emails and will no longer get good “open rate” data.

With an email campaign it’s really important to understand how many people opened an email vs. how many clicked a link. Once Google caches the images, the only way the email marketing company can tell that someone opened an email is if they click a link, which makes Gmail’s open rate equal to its click-through rate. If those two are equal, it’s difficult to say whether the email marketing company delivered a compelling message, because they won’t really know how many people opened the email vs. how many engaged with it.
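To make the mechanics concrete, here’s a minimal sketch of how a tracking pixel works. All the names and URLs below are illustrative, not any particular vendor’s API: the sender embeds identifiers in the image URL, and when a mail client fetches that URL, the sender’s server can log an “open” (along with the client IP and a timestamp from the HTTP request itself).

```python
from urllib.parse import urlencode, urlparse, parse_qs

def tracking_pixel_url(base_url, campaign_id, recipient_email):
    """Build the per-recipient image URL embedded in the email HTML.

    `base_url` would point at a 1x1 transparent GIF on the sender's
    server; the query string identifies who opened the message.
    """
    return base_url + "?" + urlencode({"c": campaign_id, "r": recipient_email})

def record_open(requested_url):
    """What the sender's server does when the pixel is requested:
    parse the identifiers back out and log an open event. A real
    server would also capture the requester's IP and a timestamp."""
    query = parse_qs(urlparse(requested_url).query)
    return {"campaign": query["c"][0], "recipient": query["r"][0]}

# The HTML email would contain something like
#   <img src="https://mail.example.com/open.gif?c=holiday2013&r=user%40example.com">
url = tracking_pixel_url("https://mail.example.com/open.gif",
                         "holiday2013", "user@example.com")
print(record_open(url))
```

In the scenario the article describes, Gmail serves the image from Google’s cache instead, so this request never reaches the sender’s server, and that’s precisely the signal that disappears.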
In a digital age, those who control the data, control the world.
In a never-ending onslaught against “free” organic listings, Google continues to push the top organic spot further and further down the page. I’ve even seen reports that some searches have no organic results above the fold. I first posted about this topic back in March of 2012, when I posted screenshots of how much the results had shifted down over a couple of years. In March of 2010, the #1 organic listing was 138px from the top of the browser. By March of 2012 it had dropped to 296px. In the screenshot below, Google has dropped the #1 organic listing down to 706px from the top of the page. If that weren’t bad enough, Google is also inserting image results below the #1 ranking, pushing #2 another 166px down the page and ensuring that only one organic result shows up above the fold.
Google puts more self-promotion above organic results.
Google is pushing their paid platforms hard by devaluing organic listings, which forces more and more webmasters to chase clicks from paid services like AdWords and Product Search. The image below shows how many links on the page drive users into other Google properties instead of to someone’s site. Only 13% of the links above the fold for this query lead off Google’s “property”.
Only five out of the 41 links in this screenshot take you off Google’s site.
I don’t expect this to end anytime soon. I think that in two or three years Google won’t provide any organic listings on the first page of results.
A client launched a new website and we’re helping them make it as fast as possible. Google’s PageSpeed Insights is great at analyzing the code on the page to tell you where you can make improvements, but it doesn’t pay any attention to things like DNS lookups, server response times, connection types, server locations, etc. For a more in-depth evaluation of page speed I like to use WebPageTest.org. During today’s test I found that the worst offending response times come from Google and Facebook! Check out the image below to see how much of a time-suck they are when you use their +1 and “Like” buttons on your site. My favorite part is that Facebook is calling Ireland to serve up some content. Really, Facebook? Ireland??
Facebook “Like” and Google +1 calls account for 3,595ms of load time per page for my client’s site. (click for a larger view)
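If you want to put a number on this yourself, you can read the per-request timings off a WebPageTest waterfall and total up the time spent on hosts you don’t control. A rough sketch, with the helper name and hostnames being my own illustrations (not WebPageTest’s API), and the millisecond figures invented to match the 3,595ms total above:

```python
def third_party_cost_ms(request_timings, third_party_hosts):
    """Sum the load time attributable to third-party widget hosts.

    `request_timings` maps each requested host to the milliseconds it
    contributed to the page load (as read off a WebPageTest waterfall);
    `third_party_hosts` is the set of hosts you don't control, e.g. the
    Like / +1 button domains.
    """
    return sum(ms for host, ms in request_timings.items()
               if host in third_party_hosts)

# Illustrative numbers in the spirit of the screenshot above:
timings = {
    "www.myclientsite.example": 900,   # the client's own server
    "apis.google.com": 1200,           # +1 button assets
    "www.facebook.com": 1400,          # Like button iframe
    "static.ak.facebook.com": 995,     # ...served from Ireland, apparently
}
social = {"apis.google.com", "www.facebook.com", "static.ak.facebook.com"}
print(third_party_cost_ms(timings, social))  # 3595
```

The usual mitigation, for what it’s worth, is to load these buttons asynchronously so they at least don’t block the rest of the page from rendering.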
Well, the US government could not come to an agreement about the budget. The shutdown really isn’t impacting “regular” people, just those employed by and receiving benefits from the government. Oh, and people going to parks. That being said, the shutdown is about to impact a BUNCH of website owners. Here’s why…
Since the shutdown puts non-essential government employees back on their couches, all the server and network admins have FLOODED Xbox Live and the PlayStation Network. While those guys are out fighting zombies, pwning n00bs, and leveling up, their servers and networks sit unattended and shut down. What’s that got to do with SEOs? If your site gets a boost in ranking from .gov links (and if you have them, it does), some of those sites are now returning 404s in the absence of the nerds who care for them.
nasa.gov, along with many other government sites, currently 404s!
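If you want to audit your own backlink sources the same way, a small sketch like this separates the links that still resolve from the ones that now 404. All the names are mine, and the status-fetching callable is injected so it can be swapped for a real HTTP client (e.g. `urllib.request.urlopen` wrapped in a try/except) while staying testable offline:

```python
def audit_links(urls, fetch_status):
    """Partition backlink source URLs by whether they still resolve.

    `fetch_status` is any callable that returns an HTTP status code
    for a URL; here it's injected so the logic can run without a
    network connection.
    """
    live, dead = [], []
    for url in urls:
        (live if fetch_status(url) == 200 else dead).append(url)
    return live, dead

# Simulated shutdown: most .gov pages answering 404.
statuses = {
    "https://www.nasa.gov/page": 404,
    "https://www.usda.gov/page": 404,
    "https://www.loc.gov/page": 200,   # a few stayed up
}
live, dead = audit_links(statuses, statuses.get)
print(len(live), len(dead))  # 1 2
```

Run that across a competitor’s .gov backlinks before and after the shutdown and you get exactly the kind of drop described below.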
I thought maybe it wouldn’t impact sites because Google would “fudge” the numbers for a while until the .gov sites came back. Nope. During our competitive reviews we check the number of .gov links pointing to competitors. About a week ago one of the sites we reviewed had 1,460 links from .gov websites. Today they have 87. Eighty-seven. Did it hurt their ranking? Well, in the words of Phil Robertson, “He gone.”
1,460 .gov links before the shutdown. 87 after. Ouch.
Brace yourselves… if your site depends on .gov links, you could drop like a rock. Even once the .gov sites come back, it could take a long time to recover your ranking, and the sudden reappearance of all those links could trip a filter for over-acquisition of links in a short period of time. I assume Google would “fix” that, because it would be a monumental FAIL if they didn’t.
Today Google announced a new feature in Webmaster Tools that reports whether you have a manual action against your site. I’m a bit surprised by this, because it should already be abundantly clear to webmasters whether they have a manual action against their site or not. Google has said time and again that if there’s a [recent] manual action against a site, they will notify the webmaster via email and display a message in Google Webmaster Tools. I guess people just aren’t paying attention.

I think this tool was really released to help stem the tide of reconsideration requests coming from people who are confused about the difference between algorithmic and manual penalties. The last couple of years have seen broad algorithmic penalization of websites from Panda and Penguin, the biggest shakeup in many years, and all those actions against websites have probably generated millions and millions of unnecessary reconsideration requests. Google says this report is an effort to be more transparent, but I believe they only created this “feature” to help their Webspam team, which must be inundated with reconsideration requests from people who can’t tell whether their site has a manual penalty. There’s no reason for them to be any more transparent than sending an email and issuing a notice in Webmaster Tools, unless it benefits themselves.

Whatever the real reason, the “feature” is there, and you can check your own GWMT account now to see, without a shadow of a doubt, whether you have a manual action or not.
Google’s new report in Webmaster Tools shows whether you have a manual penalty or not – if you weren’t already paying attention to their other notifications.