An upcoming update to WordPress will no longer support Internet Explorer 6. A blog post published on May 12 announced the change. Version 3.2, which is currently in beta testing and scheduled for release in June 2011, will impact many bloggers around the world because the WordPress developers no longer account for IE6 in the design and coding of the backend admin area or in their main themes. Good for them!
The message from everyone: IE6 GO AWAY!
Google has finally improved the information provided by the AdWords Traffic Estimator and AdWords Keyword tools. In their blog post they explain that they've improved the algorithm behind the estimates.
The change is actually pretty significant. To test it, I searched for “farm animal toys,” a phrase we researched for a client less than a week ago while expanding the list of phrases we promote for him. (We use the Google AdWords Traffic Estimator tool as part of our phrase research.) At that time the tool reported 275 monthly searches for that phrase in Google. I checked the same phrase today and the tool reported 720 monthly searches! That is a SIGNIFICANT change, especially when compared against the competition in the niche. We use a very basic ratio to represent competitiveness for phrases as an illustration for clients, and that ratio was more than halved, showing that the niche could be much more lucrative to pursue than we thought. (We actually look at a lot of more “nerdy” stuff, but the “nerdy” stuff tends to overwhelm our clients, so we just go with one simple illustration.)
Google updated their Traffic Estimator and Keyword Tools! (click for a larger view)
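We don't publish the exact formula, but here's a rough sketch of the kind of simple ratio we mean, assuming it divides competing results by monthly searches (the competing-results figure below is made up purely for illustration):

```python
def competitiveness_ratio(competing_results, monthly_searches):
    """Hypothetical competition-to-demand ratio: lower means an easier niche."""
    return competing_results / monthly_searches

# Assume the number of competing pages stayed constant; only the
# reported search volume changed from 275 to 720 per month.
before = competitiveness_ratio(1_000_000, 275)  # ~3636
after = competitiveness_ratio(1_000_000, 720)   # ~1389
print(after < before / 2)  # the ratio was more than halved
```

With the search volume more than doubling and everything else fixed, the ratio necessarily drops by more than half, which is why the niche suddenly looks much more attractive.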
I think I see Google doing some testing. In one browser I’m getting 1.19 billion results for “world video news” and in another browser I’m getting 586 million. Both are logged in with the same account, both are getting Facebook results pulled into the results, etc. It’s not browser-specific, though, because when I open a new incognito window in Chrome and open Facebook, I get the lower number of results. I’m going to keep playing with it to see what I can come up with.
1.19 Billion Results in Chrome
586 million results in Firefox
I was asked by a new potential client to review the SEO reports sent to him by the SEO company that I’ll be replacing. I was appalled. I absolutely could NOT believe what the other company is passing off as “SEO.” They were posting links on many, many spam link farms and calling it “link building.” I checked 22 of the sites where they got links for this potential client and all 22 of them are complete spam sites. The other “SEO” used the exact same anchor text for every link they created and obviously spent no time checking the quality of the linking sites. What a joke.
Another section of this report showed something they were calling “social marketing,” which included posting “articles” to Google Knol. I checked, and 87 of the 88 posts the “SEO” made on this company’s behalf have been flagged as spam. Want to tell Google you’re a spammer? Spam one of their own properties!! What’s worse than spamming Google Knol with these articles? Posting the EXACT same articles – the ones that Google flagged as spam – on a couple dozen other sites.
They also did something called “PDF Distribution,” and I was amazed to see that they were “distributing” the exact same PDF to 11 sites at the same time. Worse, many of these “PDF Distribution” sites present content to Google as flat HTML, not as a PDF, so all Google sees is tons of duplicate content on multiple sites. Google will make a determination of which site, if any, is the original owner of the content and exclude all the rest from the listings. This means that if Google decides one of the “PDF Distribution” sites owns the content produced for the client, the client’s PDF won’t show up anywhere.
This is the EXACT thing I talk about ALL the time. The “SEO” company – which I wish I could name – is a disgrace to our industry. The company is completely shooting their client in the foot and ruining his chances of ranking competitively anytime in the near future. Because he’s trusted this other company for so long, it’s going to take a lot of work to convince this potential client that I know what I’m talking about, that I know what I’m doing, and that I don’t spam.
Ever wonder how Google knows in real-time which roads have traffic jams and which roads are clear? It may surprise you to learn that they get some of that information from YOU!
I’ve always kind of ignored Google’s real-time traffic reports thinking that they couldn’t possibly know what was going on and that they were just guessing or regurgitating data from those DOT traffic sensors embedded in the highway.
I started digging around one day to find out how Google was doing it after I visited Denver for a wedding last November. I was on a side street during a jam-up and decided to use my iPhone to try to find a way around. Google somehow knew there was a traffic jam right where I was! I looked around and could see no traffic cameras, no sensors in the road – nothing that would be reporting to Google. Somehow they knew where the jam-up started and where it ended. I was puzzled, so I started looking around online to find out what was going on. That’s when I found a Google blog post discussing how they collect the data. Google collects traffic data from smartphones running the Google Maps app! Google is sent small, anonymous bits of data containing your GPS coordinates. From those coordinates Google determines your location, direction of travel, and average speed. They combine that with data from other people on the same road and DOT sensors (if applicable) to come up with their real-time traffic maps. The screenshot below is from Google Patent Application US 2010/0286899 A1 and is a funny illustration of what data is collected and delivered to an “information provider” which transmits the data to Google.
Illustration from Google Patent App showing traffic data collection from sensors. (click for a larger view)
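To give a rough idea of the math involved, here's a minimal sketch – my own guess at the approach, not Google's actual code – of how an average speed could be derived from a phone's anonymous GPS pings (the coordinates below are made up):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS coordinates."""
    r = 6_371_000  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def average_speed_kmh(pings):
    """pings: list of (timestamp_seconds, lat, lon) from one anonymous phone."""
    dist = sum(
        haversine_m(a[1], a[2], b[1], b[2])
        for a, b in zip(pings, pings[1:])
    )
    elapsed = pings[-1][0] - pings[0][0]
    return (dist / elapsed) * 3.6  # m/s -> km/h

# Two pings one minute apart, about 170 m along a road: jam speed.
pings = [(0, 39.0997, -94.5786), (60, 39.1012, -94.5786)]
print(round(average_speed_kmh(pings), 1))  # roughly 10 km/h
```

Combine enough of these per-phone speeds on the same stretch of road and you get a live picture of how fast traffic is actually moving there.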
Another amazing thing is that Google also provides predictive traffic data, calculated from the historical data they’ve collected. It’s a pretty sweet feature that is currently available at the bottom left side of the maps screen when traffic is enabled. You can see in the screenshot below that Wednesdays are OK for traffic in the KC area.
Google's Predictive Traffic at Work in Kansas City. (Click for a larger view)
I used Google’s Predictive Traffic when I was driving to Austin in March to speak at Pubcon. I checked the highest-traffic times in Dallas and timed my trip to miss them. (I have to go through Dallas at nearly rush hour on my way from KC.) It worked out great and I avoided the biggest jams.
Google must be thrilled with the proliferation of smartphones and the fact that they currently have THE maps application.
I’d be remiss not to point out the traffic data you can get from Google Earth. Thanks to my friend Kirby for pointing that out. On Google Earth you can see a lot of little dots all over the roads. If you click on a dot, it tells you that vehicle’s live traffic speed. It’s a VERY interesting thing to look at, and quite shocking to see how many phones Google can pull traffic from in real-time!
Live Traffic from Google Earth (click for a larger view)