Online Marketer Conference & Expo returns to Melbourne in November

Online Marketer returns to Melbourne in November 2011, and this year sees the return of our three staple Conference products plus two new Workshops.

The Online Marketer Conference Series & Expo combines two of the world's most respected marketing conferences under one roof, together with our entry-level conference, Online Marketer Boot-Camp, with all-new content, PLUS two new events operating under our new "Workshop" brand.

Tuesday November 15th kicks off with both Search Marketing Expo and eMetrics Marketing Optimization Summit running side by side.

Wednesday November 16th features our two new Workshops: the all-new Social Media Marketer Workshop, plus the Mobile Marketer Workshop that we successfully introduced at our Sydney event back in April.

  • Search Marketing Expo – Intermediate to advanced level
  • eMetrics Marketing Optimization Summit – Intermediate Level
  • Mobile Marketer Workshop – Intermediate Level
  • Social Media Marketer Workshop – Intermediate Level
  • Online Marketer Boot-Camp – Entry Level

Registration options:

We have a variety of pass options for this event, starting with the 2 Day ALL ACCESS Pass. This gives you access to the entire event, with the flexibility to attend any session on any day; you have free rein. The 2 Day Pass is only $1,395 + GST.

Special Offer:

If you would like to take advantage of our traditional Pre-Agenda discount offer and save yourself 30%, then click here to book today. SAVE 30% and pay only $976.50 + GST.

Vital Information

Where: Hilton On The Park, Melbourne
When: November 15 & 16

Our next newsletter will have a full overview of what each event's agenda will hold, along with some announcements about our Speakers for 2011.

Regards

Barry Smyth

Founder & Content Director

Google’s Own Words About the Farmer/Panda Update

By Vanessa Fox over at SMX Sydney’s partner site – Searchengineland.com

An amazing amount of great information came out of SMX West last week. Below is a summary of some of what I found to be the most actionable and useful.

Google’s Own Words About the Farmer/Panda Update

Google’s Matt Cutts said that while the change isn’t 100% perfect, searcher feedback has been overwhelmingly positive. He noted that the change is completely algorithmic with no manual exceptions.

Blocking “Low Quality” Content

Matt reiterated that enough low quality content on a site could reduce rankings for that site as a whole. Improving the quality of the pages or removing the pages altogether are typically good ways to fix that problem, but a few scenarios need a different solution.

For instance, a business review site might want to include a listing for each business so that visitors can leave reviews, but those pages typically have only business description information that’s duplicated across the web until visitors have reviewed it. A question/answer site will have questions without answers… until visitors answer them.

In cases like this, Google's Maile Ohye recommended using a <meta name=robots content=noindex> on the pages until they have unique and high-quality content on them. She recommends this over blocking via robots.txt so that search engines can know the pages exist and start building history for them, so that once the pages are no longer blocked, they can more quickly be ranked appropriately. I noted in the panel where we discussed this that an exception might be a very large site, where robots.txt would ensure that the search engine bots were spending the available crawl time on the pages with high-quality content.
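To make the noindex-until-unique-content approach concrete, here is a minimal sketch in Python using Flask. The route, template and load_reviews helper are hypothetical stand-ins, not anything from the article; the point is simply that the page stays crawlable but carries a robots noindex meta tag until it has unique review content.

```python
# Minimal sketch (hypothetical route and data model): emit a robots noindex
# meta tag until a business listing has unique reviews, instead of blocking
# the URL in robots.txt.
from flask import Flask, render_template_string

app = Flask(__name__)

PAGE = """
<html>
  <head>
    <title>{{ name }}</title>
    {% if noindex %}<meta name="robots" content="noindex">{% endif %}
  </head>
  <body>
    <h1>{{ name }}</h1>
    {% for review in reviews %}<p>{{ review }}</p>{% endfor %}
  </body>
</html>
"""

def load_reviews(business_id):
    # Placeholder for a real database lookup.
    return []

@app.route("/business/<business_id>")
def business_page(business_id):
    reviews = load_reviews(business_id)
    # No unique content yet: ask engines not to index, but let them crawl the
    # page so it can build history and be indexed quickly once reviews exist.
    return render_template_string(
        PAGE,
        name=f"Business {business_id}",
        reviews=reviews,
        noindex=(len(reviews) == 0),
    )
```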

Ad-To-Content Ratio

Matt said that having advertising on your site does not inherently reduce its quality. However, it is possible to overdo it. I had noted in my earlier articles about this change that, in particular, pages with no content and only ads above the fold, as well as pages with so many ads that it's difficult to find the non-advertising content, often provide a poor user experience.

Slowed Crawl

Matt also noted that if Google determines a site isn't as useful to users, they may not crawl it as frequently. My suggestion based on this is to take a look at your server logs to determine what pages Googlebot is crawling and how often those pages are crawled. This can give you a sense of how long it might take before quality changes to your site take effect. If Google only crawls a page every 30 days, then you can't expect quality improvements to change your rankings in 10 days, for instance.
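If you want to do that log check, a rough sketch in Python follows. It assumes combined-format access logs at a hypothetical path and simply matches the Googlebot user agent string; adapt the pattern to your own log format.

```python
# Rough sketch: estimate how often Googlebot revisits each URL by scanning a
# combined-format access log. Log path and format are assumptions.
import re
from collections import defaultdict
from datetime import datetime

LOG_LINE = re.compile(
    r'\S+ \S+ \S+ \[(?P<ts>[^\]]+)\] "(?:GET|HEAD) (?P<path>\S+) [^"]*" '
    r'\d+ \S+ "[^"]*" "(?P<ua>[^"]*)"'
)

def googlebot_crawl_gaps(log_path="access.log"):
    hits = defaultdict(list)
    with open(log_path) as fh:
        for line in fh:
            m = LOG_LINE.match(line)
            if not m or "Googlebot" not in m.group("ua"):
                continue
            ts = datetime.strptime(m.group("ts"), "%d/%b/%Y:%H:%M:%S %z")
            hits[m.group("path")].append(ts)

    gaps = {}
    for path, times in hits.items():
        times.sort()
        if len(times) > 1:
            deltas = [(b - a).days for a, b in zip(times, times[1:])]
            gaps[path] = sum(deltas) / len(deltas)  # average days between crawls
    return gaps

if __name__ == "__main__":
    for path, days in sorted(googlebot_crawl_gaps().items(), key=lambda x: -x[1]):
        print(f"{path}: crawled roughly every {days:.1f} days")
```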

International Roll Out

Matt confirmed that the algorithm change is still U.S.-only at this point, but is being tested internationally and would be rolling out to additional countries "within weeks". He said that the type of low quality content targeted by the changes is more prevalent in the United States than in other countries, so the impact won't be as strong outside the U.S.

Continued Algorithm Changes

Matt said that many more changes are queued up for the year. He said the focus this year is on making low quality content and content farms less visible in search results as well as helping original creators of content be more visible. Google is working to help original content rank better and may, for instance, experiment with swapping the position of the original source and the syndicated source when the syndicated version would ordinarily rank highest based on value signals to the page. And they are continuing to work on identifying scraped content.

Does this mean that SEO will have to continue to change? Not if your philosophy is to build value for users rather than build to search engine signals. As Matt said, “What I said five years ago is still true: don’t chase algorithm, try to make sites users love.”

Gil Reich noted my frustration at the way some people seemed to interpret this advice.

“The Ask the SEOs sessions used to be battles between Black and White. Now, with the same participants, it’s between “focus on users” and “focus on creating the footprint that sites that focus on users have.” It seemed that Vanessa found this debate far more frustrating than when her fellow panelists were simply black hats.”

It’s true. For instance, at one point, Bruce Clay said that they’d done analysis of the sites that dropped and those that didn’t and they found that those that dropped tended to have almost an identical word count across all articles. So, he said it was important to mix up article length. I told the audience that I hoped that what they got out of the session was not that they had to go back to their sites and make sure all the lengths were different but that they actually made sure each page of content was useful.

You know the best way to ensure your site has a “footprint that sites that focus on users have”? Focus on users!

“White Hat” Cloaking?

We hear a lot about "white hat" cloaking. This tends to mean anytime a site is configured to specifically show search engines different content than visitors for non-spam reasons. For instance, a search engine might see HTML content while visitors see Flash. The site might not serve ads to search engines or present only the canonical versions of URLs.

Some SEOs say that Google condones this type of cloaking. However, Matt was definitive that this is not the case. He said:

“White hat cloaking is a contradiction in terms at Google. We’ve never had to make an exception for “white hat” cloaking. If someone tells you that — that’s dangerous.”

Matt said anytime a site includes code that special cases for Googlebot by user agent or IP address, Google considers that cloaking and may take action against the site.

First-Click Free Program and Geotargeting

Danny Sullivan asserted that the “first-click free” program is white hat cloaking, but Matt disagreed, saying that the site was showing exactly the same thing to Googlebot and visitors coming to the site from a Google search. Matt also noted that showing content based on geolocation is not cloaking as it’s not special casing for the Googlebot IPs (but rather by the geographic location).

Cloaking For Search-Friendly URLs

Note that several Search Engine Land articles (including this and this) assert that cloaking to show search-engine friendly URLs is OK with the search engines, but Google in particular has been definitive that this implementation is not something they advocate.

The basis for some thinking this scenario is OK by Google seems to be a statement from a Google representative at SES Chicago in 2005. But back in 2008, Matt Cutts clarified to me that “cloaking by rewriting URLs based on user-agent is not okay, at least for Google.” (My understanding is that was the first speaking engagement for the Google representative and he wasn’t fully up on the intricacies of cloaking.)

In any case, there are now lots of workarounds for URL issues. If developers are unable to fix URLs directly or implement redirects, they can simply use the rel=canonical link attribute or the Google Webmaster Tools parameter handling feature (Bing Webmaster Tools has a similar feature).
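As an illustration of the rel=canonical workaround, here is a small Python sketch that computes a canonical URL by dropping parameters that only vary tracking or session state, then emits the corresponding link element. The parameter names are examples only; which parameters are safe to drop depends entirely on your site.

```python
# Illustrative sketch: build a canonical URL by dropping parameters that only
# carry tracking/session state, then emit the rel=canonical link element.
# IGNORED_PARAMS is an example list, not an official one.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

IGNORED_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "sessionid", "sort"}

def canonical_link(url):
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in IGNORED_PARAMS]
    canonical = urlunsplit((parts.scheme, parts.netloc, parts.path,
                            urlencode(kept), ""))
    return f'<link rel="canonical" href="{canonical}">'

print(canonical_link(
    "http://www.example.com/widgets?colour=blue&utm_source=news&sessionid=42"))
# -> <link rel="canonical" href="http://www.example.com/widgets?colour=blue">
```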

Hiding Text for Accessibility

What about using a -999px CSS text indent for image replacement? Maile had previously done a post on her personal blog noting that hiding text in this way can be seen as violating the Google webmaster guidelines even if it's done for accessibility, not spammy reasons. Generally, you can use a different implementation (such as ALT attributes or CSS sprites). On stage at SMX, Maile also recommended using font-face. This can be tricky to implement, but at least for the font files themselves, you can use Google Web Fonts rather than building them yourself.

Matt seconded this in a later session: “hidden text? Not a good idea. I think Maile covered that in an earlier session.”

Showing Different Content to New Visitors Vs. Returning Visitors

Someone asked about showing different content to new vs. returning users. Both Matt and Duane Forrester of Bing commented that it was best to be careful with this type of technique. Generally, this type of scenario is implemented via a cookie. Both new visitors and Google won’t have a cookie for the site, while a returning visitor will have one. Matt noted that if you treat Googlebot the same as a new user, this generally is fine.
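Here is a minimal sketch of that cookie-based pattern in Python/Flask, with a hypothetical cookie name: Googlebot and first-time visitors carry no cookie, so both get the same default page, which is the behaviour Matt described as generally fine.

```python
# Minimal sketch (hypothetical cookie name and copy): returning visitors are
# detected via a cookie; Googlebot and first-time visitors have no cookie, so
# they both see the same "new visitor" version of the page.
from flask import Flask, request, make_response

app = Flask(__name__)

@app.route("/")
def home():
    if request.cookies.get("returning") == "1":
        # Returning visitors get the tailored version.
        return "Welcome back! Here are your recent items."
    # New visitors and crawlers (no cookie) get the same default page.
    resp = make_response("Welcome! Here is our standard home page.")
    resp.set_cookie("returning", "1", max_age=60 * 60 * 24 * 365)
    return resp
```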

Building Links

Links continue to be important, but how can sites acquire them using search engine-approved methods? Matt said to ask yourself "how can I make a good site that people will love?" Make an impression and build a great site with a loyal audience (not necessarily through search) that brings brand value, and links will come.

Creating Value Beyond Content

Someone asked how to get links to internal pages of an ecommerce site. Product pages just aren’t that interesting and don’t have a ton of editorial content. Matt recommended looking at different ways of slicing the data and becoming an authority in the space.

How Valuable is Article Marketing?

Not very.  Both Duane and Matt said that articles syndicated hundreds of times across the web just don’t provide valuable links and in any case, they aren’t editorially given. Duane made things simple: “don’t do the article marketing stuff.”

He suggested contacting an authority site in your space to see if they would publish a guest article that you write particularly for them. If the authority site finds your content valuable enough to publish, that’s a completely different situation from article hubs that allow anyone to publish anything.

What About Links in Press Releases?

Someone noted that while paid links violate the search engine guidelines, you can pay a press release service to distribute your release to places such as Google News, so don't those links count? Matt clarified that the links in the press releases themselves don't count for PageRank value, but if a journalist reads the release and then writes about the site, any links in that news article will then count.

Are Retweets More Valuable Than Links?

Someone asked about the recent SEOmoz post that concluded that retweets alone could boost rankings. Matt said he had asked Amit Singhal, who heads Google's core ranking team, if this was possible. He said that Amit confirmed links in tweets are not currently part of Google's rankings, so the conclusions drawn by the post were not correct. Rather, other indirect factors were likely at play, such as some people who saw the tweet later linking to it. (Purely speculating on my part, those tweets could have been embedded in other sites that in turn were seen as links.)

Matt mentioned that signals such as retweets might help in real-time search results and then talked about a recent change that causes searchers to see pages that have been tweeted.

Some mistakenly took this to mean that the Google algorithm would give a rankings boost to pages that have been tweeted vs. those that haven’t, but Matt was talking about the change a few weeks ago that personalizes search results based on a searcher’s social network  connections. As Matt McGee explained in his Search Engine Land article about it:

In some cases, Google will simply be annotating results with a social search indicator, says Google’s Mike Cassidy, Product Management Director for Search. Google’s traditional ranking algorithms will determine where a listing should appear, but the listing may be enhanced to reflect any social element to it.

In other cases, the social search element will change a page’s ranking — making it appear higher than “normal.” This, I should add, is a personalized feature based on an individual’s relationships. The ranking impact will be different based on how strong your connections are, and different people will see different results.

Can Competitors Buy Links To Your Site and Hurt Your Site’s Rankings?

This is an age-old question, but as several high profile sites have had ranking demotions lately for external link profiles that violate the search engine guidelines, it was top of some people's minds. Matt reiterated that competitors generally can't do anything to hurt another site. The algorithms are built to simply not value links that violate search engine guidelines. Demotions generally only occur when a larger pattern of violations is found.

What About Exact Match Domains and Incoming Anchor Text?

Several people have commented on spammy sites with exact match domains and lots of spammy incoming links with exact match anchor text ranking quite well in Google and Bing. Matt said they are looking into this.

How Google Handles Spam and Reconsideration Requests

Matt reiterated some basics about how this works:

  • Google’s approach to spam: Engineers write algorithms to address spam issues at scale. In parallel, manual teams are both proactive and reactive in looking for spam and both removing it from the index and providing it to  the engineering team to help them modify their algorithms to not only find that specific spam instance, but all similar instances across the web.
  • How Google handles spam reports: Spam reports have four times the weight of other spam found in terms of manual action as it’s clearly spam that a searcher has seen in results.
  • Notifying site owners of guidelines violations: Google is looking at providing additional transparency around guidelines violations found on sites. (They already provide some details in the Google Webmaster Tools message center.)
  • How Google handles reconsideration requests: Only reconsideration requests from sites that have a manual penalty are routed to Googlers for evaluation. Generally, the reconsideration process takes less than a week. Algorithmic penalties can’t be manually removed. Rather, as Google recrawls the pages the algorithm adjusts the rankings accordingly.

The Farmer/Panda Update: New Information From Google and The Latest from SMX West

By Vanessa Fox over at SMX Sydney’s partner site – Searchengineland.com

Today at SMX West, the subject of the day was content farms. One session included the founder of (sometimes-called content farm) Associated Content, Luke Beatty, and focused on the good that can be learned from content farms, as well as what sites who have been hit by Google’s Farmer aka Panda update can do to regain their rankings.

This discussion coincides with an updated posting from Google in the webmaster discussion forum thread they opened for site owners on the subject. What can we learn?

To summarize:

  • Substantial low quality content on a site can cause the rankings for the entire site to decline (even for the high quality pages)
  • Evaluate your web site for poor quality pages (not useful, poorly written, non-unique, or thin) and remove them
  • Overall user experience is likely important: design and usability, ad-to-content ratio, brand perception
  • Look at both content and page templates (do the templates overwhelm the pages with ads? Provide a poor user interface?)
  • After ensuring all content on the site is high quality, focus on engagement and awareness (through social media and other channels)
  • Diversify into other channels; even within search, look beyond web search to Google News and "one box" style results such as blogs, images, and videos
  • We can potentially learn from content farms, particularly in how they pinpoint what audiences are interested in and what problems they are trying to solve as well as how they harness crowdsourcing.

Official From Google: Low Quality Content On a Part of a Site Can Impact a Site’s Ranking As A Whole

First, Google’s words:

Our recent update is designed to reduce rankings for low-quality sites, so the key thing for webmasters to do is make sure their sites are the highest quality possible. We looked at a variety of signals to detect low quality sites. Bear in mind that people searching on Google typically don’t want to see shallow or poorly written content, content that’s copied from other websites, or information that are just not that useful. In addition, it’s important for webmasters to know that low quality content on part of a site can impact a site’s ranking as a whole. For this reason, if you believe you’ve been impacted by this change you should evaluate all the content on your site and do your best to improve the overall quality of the pages on your domain. Removing low quality pages or moving them to a different domain could help your rankings for the higher quality content.

Let’s break that down:

  • This algorithm specifically targets sites (not necessarily content farms) that are low quality in a number of ways, such as:
    • Shallow content (not enough content to be useful)
    • Poorly written content
    • Content copied from other sites
    • Content that’s not useful
  • Low quality content on part of the site can impact the rankings of the entire site
  • Remove the low quality pages of the site to increase rankings of the high quality pages

A key phrase is "information that [is] just not useful". It's not enough that content is unique and verbose. Another key point is that even high quality pages can lose rankings if poor quality pages tarnish the overall site.

Advice From SMX West

Matthew Brown of AudienceWise (previously with the NY Times) had a concrete action plan for sites that had lost rankings. He noted that Google never said they were specifically targeting content farms and that all variety of sites were impacted, including ecommerce sites, shopping comparison sites, and sites with thin content.

Brand Perception

He speculated that user experience and brand likely contribute to a site’s overall perception of quality. It makes sense, as someone who has a positive experience with a brand is likely to click on that brand if it shows up in the search results, whereas someone who has had a poor experience with a brand may skip it in the search results and click the next listing.

For well-known brands, the searcher need not have had any experience with a brand before, but may click on the listing based on the overall authority and credibility of the brand. (This could potentially be why sites such as the Huffington Post don’t seem to have been affected, despite having “content farm-like” content.)

Quality Content Ratios

Brown said that the rankings changes seem page-based, based on what he'd seen, but that he'd also seen instances that seemed to be site-wide if enough poor content existed on the site. He talked about a quality vs. quantity ratio on sites. If the ratio of low quality content is high enough, it could bring the entire site down. This aligns very closely with Google's latest statement.

He speculated that even small sites could be impacted (even though they don’t fit the content farm definition) based on the ratio of good quality pages. Other things to look at include number of links to pages throughout the site vs. to the home page and the ratio of content to ads on a page.

Factors of Sites That Weren’t Impacted

Brown noted that content farm-like sites that seemed not to lose rankings had common factors such as brand awareness and credibility (like my Huffington Post example), inclusion in Google News, lots of links to internal pages, and substantial social media sharing.

He felt design and user experience play a part as well, showing an example from ehow.com (some say content farm-like, yet not impacted by this Google change) with a clean user interface and few ads above the fold.

He recommended:

  • Getting rid of poor quality pages entirely (redirect them if it makes sense, otherwise 404 them; see the sketch after this list)
  • Building out brand signals
  • Working on promotion and engagement
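For the first recommendation, here is a small Python/Flask sketch (the URL mapping is hypothetical): removed pages that have a sensible replacement are permanently redirected, and the rest return a 404 so they drop out of the index.

```python
# Sketch with a hypothetical mapping of removed pages: redirect the ones that
# have a good replacement, return 404 for the rest (410 Gone is an alternative
# some sites prefer).
from flask import Flask, redirect, abort

app = Flask(__name__)

# Removed page -> replacement page (or None when there is no good equivalent).
REMOVED_PAGES = {
    "/old-thin-article": "/comprehensive-guide",
    "/duplicate-listing": None,
}

@app.route("/<path:page>")
def handle_page(page):
    path = "/" + page
    if path in REMOVED_PAGES:
        target = REMOVED_PAGES[path]
        if target:
            return redirect(target, code=301)
        abort(404)
    return f"Normal content for {path}"
```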

He said the answer wasn’t more writers, but improving the quality of the content that’s there and focusing on building attention through marketing and engagement.

Associated Content: Google Referrals Down for 2/3 of Content

Luke Beatty, who founded Associated Content (and, now that Yahoo has acquired the company, is a VP there), said that while 2/3 of their content has seen significant declines in Google referrals since the algorithm change, 1/3 of their content has seen increases. However, it sounded as though the content that has seen increases may reside on the Yahoo network, rather than on associatedcontent.com.

Previous reports have associatedcontent.com traffic from Google down over 90%. (Just over a year ago, CEO Patrick Keane said: “With 90% of our editorial viewed at least once a month, clearly our organic search strategy is working.”)

If indeed it is the case that articles housed on Yahoo properties, rather than associatedcontent.com, are faring better in Google search results, it could be that associatedcontent.com as a whole has suffered as Google described: "low quality content on part of a site can impact a site's ranking as a whole". Of course, it could also be that the higher quality articles were the ones chosen for publishing on Yahoo properties.

Indexing Not a Data Point For This Algorithm Change

One thing to note: Beatty said that 93% of associatedcontent.com pages are still indexed in Google and seemed to use that stat to indicate that they weren’t completely out of Google’s favor. But looking at indexing numbers is probably not the best indicator of impact from this algorithm change as it seems to have impacted ranking, but not indexing.

How Traditional Media Can Learn From Content Farms

Tim Ruder of Perfect Market looked at content farms from a different angle. He said that the quality of most mainstream media is high, but the economics are challenging. Ruder pointed out that mainstream media didn't seem to be impacted by this change, although they produce many articles a day on their sites. But they focus on quality. According to Pam Horan, president of the Online Publishers Association (OPA), which includes content publishers such as CNN.com, "Traffic to sites that belong to the Online Publishers Association grew between 5% and 50% the day after Google's tweak."

Right now, print media supports online efforts, but print revenue is declining, so things will have to change. Ruder is from a traditional media background and wonders what content farms have done right that traditional media can learn from. In particular, content farms do a good job of understanding what people are searching for and the language they use. Traditional media can take a lesson from this to better engage with readers.

Building Content For Searcher Needs

Matt McGee, who moderated the session, later asked whether Google was inherently taking issue with the idea of looking at search volumes and building content to address those needs.

The panelists said it's the quality of the content that's at issue. I wrote about this before the algorithm shift in my article about the 2011 Super Bowl start time.

Here’s the thing about search trends and seasonal searches. A lot of discussion lately has focused on “content farms” that write and publish content based on what people search for most. For the most part, these sites don’t provide a lot of value, as they’re not written by someone who’s an expert or passionate about a topic. They’re simply a bunch of words that match a popular topic. To be clear, that’s not what I’m advocating. Instead, I’m saying that organizations such as the NFL and Fox could use search data to find out what their audiences need and make sure they are meeting those needs. In this case, the NFL audience is turning to the search box to find out what time the game starts. If the NFL would provide this information and make sure it’s visible in search, people would be happy and they’d tune into the game on time. Everyone wins.

One panelist noted that their jobs as marketers were to provide marketing for their clients (not ensure the sites are high quality). This is an old debate. Is SEO about ranking or about conversion? In my opinion, SEO goes beyond just marketing. SEO is also about understanding what your audience is looking for and meeting those needs. If you approach SEO from that perspective, you build your site in line with what Google is trying to surface with this algorithm change.

Mobile SEO: Considering your Mobile Web Strategy

Mobile web access is growing by leaps and bounds each year. If you are not paying attention to your mobile web traffic, you probably should be.

There are a lot of different ways to create a mobile-friendly user experience on your website, but there are a few basic options:

  • Do nothing – Some sites work well on mobile phones without much work
  • Create mobile/traditional hybrid pages – Add a 'handheld' mobile stylesheet to re-format the existing pages so they display better on mobile phones
  • Create mobile-specific pages – These are separate pages that are designed specifically for viewing on a mobile device
  • Dynamic content delivery – Use a device-aware content management system that detects the device accessing it and adapts the content it serves accordingly (a minimal sketch follows this list)
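As a crude illustration of the dynamic content delivery option, here is a Python/Flask sketch that serves different markup at the same URL depending on the requesting device. Real device-aware systems typically use a proper device-detection library rather than the simple user-agent substring check shown here; the page markup and hint list are placeholders.

```python
# Crude illustration of "dynamic content delivery": the same URL serves
# different markup depending on the requesting device. Real systems usually
# rely on a device-detection library rather than substring checks.
from flask import Flask, request, make_response

app = Flask(__name__)

MOBILE_HINTS = ("iPhone", "Android", "BlackBerry", "Windows Phone", "Opera Mini")

DESKTOP_PAGE = "<html><body><h1>Full site</h1><p>All content and sidebars.</p></body></html>"
MOBILE_PAGE = "<html><body><h1>Mobile site</h1><p>Trimmed, touch-friendly layout.</p></body></html>"

def is_mobile(user_agent):
    return any(hint in user_agent for hint in MOBILE_HINTS)

@app.route("/")
def home():
    ua = request.headers.get("User-Agent", "")
    resp = make_response(MOBILE_PAGE if is_mobile(ua) else DESKTOP_PAGE)
    # The same URL serves different markup, so tell caches it varies by user agent.
    resp.headers["Vary"] = "User-Agent"
    return resp
```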

Mobile Website Architecture Options

The method that you choose for creating your mobile site will depend on the type of content that you are serving and the development resources that you have available. In the mobile world, spending more time and money generally yields a more predictable and adaptable user experience across a larger number of phones.

The problem is that there are so many phones available that it can be hard to predict, test or know that your site will work well across all of them. When you add in the possibility of different mobile browsers, carriers and phone settings, it becomes almost impossible.

With mobile development, there is always a point of diminishing returns.

If you are being purely pragmatic and ROI-driven, there is always a point where you can keep spending time and money improving the user experience towards perfection, but the improvements become so specific and niche that they have only a minor impact on the site's ability to drive revenue.

That said, you must remember that most mobile visitors will not understand how complex it actually is to build a mobile site that works well on all phones, and they will have no sympathy. Believe it or not, many visitors to your mobile site will not know the difference between a mobile website and a mobile application.

As more and more companies start to replicate their websites in app form, you can start to understand why. The blurring of this distinction is actually changing the expectations of mobile users: when it comes to mobile sites, users actually expect them to look and behave like apps. This can be a mixed blessing for your web development team.

Why Mobile Friendly Is Not Mobile SEO

By Bryson Meunier over at SMX Sydney’s partner site – Searchengineland.com

My Google Reader feed went haywire with “mobile SEO” alerts when Google posted their “Making Websites Mobile Friendly” advice on the Google Webmaster Central Blog. The odd thing is, they didn’t say anything about SEO or search quality in the post, which has to do with how Google crawls and indexes mobile websites today.

SEOs and Search Engine Land readers know that search engine optimization at a very basic level is the practice of making content more visible in search engines by ensuring that a site is properly crawled and indexed, that it contains the content users want using the queries they use to search for that content, and that the content is properly marketed to those users through link building or some other method of content distribution.

What Google posted (in the above link) related entirely to crawling and indexing, and contained nothing at all about returning the proper content for the proper queries, or making that content visible to mobile users when they’re looking for it.

The distinction should be painfully clear to anyone who has searched for a mobile site on their Android or iPhone recently. The current smartphone version of Google search returns a lot of results related to Mobile, Alabama when you do a navigational search for a mobile site and the mobile site you're looking for either doesn't exist, has been excluded for fear of duplicating content, or uses handheld stylesheets to render content.

I tried to find an example of a popular site that has a lot of search volume from smartphones that fits this mold, but of the 30 sites I looked at that showed up in the Google keyword tool as having high demand for their brand plus the word “mobile” (excluding software and mobile carrier queries), all sites delivered mobile sites to smartphone user agents.

No sites in this sample of 685k monthly smartphone searches for mobile sites use handheld stylesheets to render mobile content or are apparently run by webmasters who are concerned about diluting their link equity with mobile content.

If you do want to see an example of a site that has a mobile version, but is excluded from the index with robots.txt, try searching [home depot mobile site] on a smartphone.

Figure 1: [home depot mobile site] on an iPhone 4 shows criticism, a mobile site for an arena and a desktop site for mobile homes, but no Home Depot mobile site in the organic listings. Home Depot used to send their mobile traffic to a Digby site that was blocked in robots.txt. They've since redirected smartphones to an iPad-compatible site, but the Digby site is still in Google's index.

The Great Mobile SEO Divide

If this debate is new to you, there are essentially two camps among SEOs when it comes to creating mobile content: the one URL SEO camp and the mobile URL SEO camp (which is a microcosm of the One Web vs Mobile Web debate that has been going on for years in the mobile marketing and development community).

The one URL SEO camp, represented ably in the past by Barry Schwartz, Michael Martin, Cindy Krum, Rand Fishkin et al., is in favor of displaying one URL to mobile and desktop users alike, and rendering the content with handheld stylesheets for mobile users. It's clearly easier than developing two sites, and it doesn't create two URLs, potentially splitting the site's link popularity and making it more difficult to rank for competitive terms.

The mobile URL SEO camp, represented ably in the past by dotMobi, Shari Thurow, Yours Truly, Matt Cutts and others, maintains that there is no evidence of reduced ranking of mobile sites as a result of split link popularity, and that treating mobile sites as duplicate content in the same way that printable URLs are duplicate content is a false analogy, as no one is actively looking for printable copies of pages in search engines and search engines don't treat mobile sites as duplicate content.

Mobile users require a mobile user experience, and without it they might get content that is confusing or irrelevant, and will convert at a lower rate.

When Google came out with their “official” stance on this through the Webmaster Central Blog (official is in quotes, as they’ve said something slightly different with regard to mobile URLs on the Webmaster Central YouTube channel last month and at Searchology 2009), this is essentially what they said:

  • Mobile user detection is not cloaking (Which they have said before here, here (PDF) and here).
  • Google currently has no way of delivering mobile content created for smartphones, but this may change in the future.
  • If you haven’t created mobile content, you don’t have to do anything but your content may be transcoded for certain users.
  • Mobile as Google defines it today primarily refers to feature phones (or “traditional” phones) that don’t have full internet browsers.
  • Mobile URLs are fine. The best practice when serving mobile content at mobile URLs (like m.*.com) is to use mobile browser detection and permanently redirect mobile user agents (see the sketch after this list). If this is done, Google will be able to serve the right content to the right users (as they have said before in the Google SEO Starter Guide).
  • Mobile sitemaps are intended for feature phone content.
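Here is a minimal Python/Flask sketch of that mobile-URL pattern: detect mobile user agents and permanently redirect them to the equivalent page on a mobile subdomain. The domain and user-agent hints are illustrative only.

```python
# Minimal sketch of the mobile-URL pattern: detect mobile user agents and
# permanently redirect them to the equivalent page on a mobile subdomain.
# The domain and user-agent hints below are illustrative only.
from flask import Flask, request, redirect

app = Flask(__name__)

MOBILE_HINTS = ("iPhone", "Android", "BlackBerry", "Opera Mini")

@app.before_request
def send_mobiles_to_mobile_site():
    ua = request.headers.get("User-Agent", "")
    if any(hint in ua for hint in MOBILE_HINTS):
        # 301 so search engines associate the mobile URL with mobile users.
        return redirect("http://m.example.com" + request.path, code=301)
    return None  # fall through to the normal (desktop) handler
```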

When Google put out this announcement my Google Reader filled up with a few sources I respect and many I’ve never heard of proclaiming that Google has officially said you don’t need mobile URLs for mobile SEO.

Of course, the post never says this, or anything about mobile SEO. What it says is that webmasters don’t need to do anything for mobile users right now, as Google will make their desktop sites mobile friendly when they think it’s appropriate.

If being crawled and returned for certain mobile users is enough for you, continue to do nothing and you will probably be “mobile friendly” in spite of yourself. But don’t call it SEO.

SEOs know that transcoders can make a site unusable, and sometimes make it impossible for search engine traffic to convert. SEOs also know that mobile users prefer mobile content, and convert at a higher level when given mobile content on a mobile device. SEOs also know that people search differently on their mobile phones than they do on their desktop computers, and transcoded or mobile formatted content may not include the queries they’re searching for or calls to action that make sense in their context.

SEOs now also know that Google uses toolbar data in its ranking algorithms, and a high bounce rate or other metrics that can be measured by the toolbar can theoretically adversely affect a web site’s ranking in search engines, including the kind that is created by serving desktop content to smartphone users. In other words, SEOs know that what Google calls “mobile friendly” in this article is not often what is best for site owners and users.

Google recognizes this, more or less, in the post, and says “However, for many websites it may still make sense for the content to be formatted differently for smartphones, and the decision to do so should be based on how you can best serve your users.” It’s noncommittal on Google’s part on mobile SEO, and leaves it up to the site owner to serve mobile content in a way that best serves the users.

For me, that way is still redirecting mobile user agents to mobile URLs with mobile-specific user experiences, as that’s what the data says they respond to best. You can make a site “mobile friendly” without following this advice, but optimization is something else entirely.

For now, webmasters can get by with either strategy, provided they don’t mind not appearing for navigational searches or any of the other search quality issues mentioned above. But given that Google is ostensibly about providing the best result for the user’s query in context, and their top priorities for 2011 are all mobile, just doing the bare minimum to satisfy the growing base of mobile users is not likely to work forever.

Google: Mobile Growth Occurring Faster Than Expected

By Pamela Parker over at SMX Sydney’s partner site – Searchengineland.com

Google CEO Eric Schmidt says consumers’ adoption of the company’s mobile services has happened more quickly than executives expected. Schmidt, who is set to step down as CEO in April, spoke at the IAB Annual Leadership Meeting in Palm Springs.

“It’s happening, and it’s happening faster,” he said. “We look at the charts internally, and it’s happening faster than all of our predictions.”

As an example, Schmidt told the audience that Google sees 200 million mobile playbacks of content on YouTube per day. Additionally, the company saw mobile search spike in conjunction with a real world event — the Super Bowl. When Chrysler advertised on the big game, search on computers surged 48 times over the normal volume of queries, but search on mobile spiked even higher — 102 times the normal volume.

Another Super Bowl advertiser, GoDaddy.com, saw an even more profound effect. Search on desktops was up 38 times higher than normal, but mobile was up 315 times higher than the usual volume.

Schmidt also said he saw display advertising playing a strong role in the mobile environment. “I see the union of the mobile device and advertising, and particularly display advertising, as the thing that’s really going to revolutionize this,” he said.

In the past few months we’ve seen Google, which is the creator of the Android operating system for smartphones and tablets, indicate its strong commitment to and belief in the future of mobile. In August, Google said it was seeing a $1 billion annualized run rate for mobile revenues.

SMX Elite nearly sold out!

Our inaugural SMX Elite day is now close to selling out.

One of the important parts of SMX Elite, apart from highly advanced SEO training, is networking: networking with your peers and networking with SMX Sydney's group of exceptionally gifted and talented international speakers.

All SMX Elite Pass holders gain access to the International Speaker Dinner on the Friday after SMX Sydney finishes up. Being elite in nature, we made a conscious decision to limit the numbers for this event, as we have limited the seating for the dinner and we wanted to ensure that delegates and speakers alike had ample time to network and share new views and ideas.

This event sold out without any of the attendees seeing the agenda, which is a testament to the quality of programming on offer with Online Marketer branded events.

If you have been in search for eons and SEO is your thing, then you will find that the SMX Elite Pass option is where you want to be.

At the time of sending this on Feb 23 we only had 6 tickets available.

SMX Sydney International Speakers

Every year we have new and old faces joining us from the international search marketing community, and 2011 is no different from any other year.

Here is our current line up of speakers for 2011 (A-Z by surname):

Check out the agendas for all three days and make up your own mind:

Latest Discount Offer

If you were not lucky enough to have secured the 40% discount, then we are happy to announce that you can still save 25%, as long as you book before the end of this week – Friday 25th Feb, 2011.

After that date we will still offer a 10% discount, so now is the time to be a smart Online Marketer and cash in on the savings we are offering.

Please don't forget that we have a range of pass options that allow you to combine our Mobile Marketing Workshop, eMetrics & SMX Elite. There are plenty of options to choose from; you can download a pass options overview here.

SMX Sydney Agendas Live NOW

We have four pieces of news for you today.

  1. SMX Sydney Keynote Announcement
  2. SMX Sydney & SMX Elite Agendas are posted
  3. A New Website
  4. Discount for the events

SMX Sydney Keynote Speaker Announcement

Today we are proud to announce our two Keynote speakers for SMX Sydney 2011.

First cab off the rank on Day One (Thursday April 14) is Danny Sullivan. Danny is Editor-in-Chief of Search Engine Land, the leading news and analysis blog for search marketing professionals, and is a creative force behind SMX; he has been involved in search for more than 10 years.

Our Keynote speaker for Day Two (Friday April 15th) is Stefan Weitz. Stefan is the Director of Bing and is charged with working with people and organisations across the industry to promote and improve Search technologies.

Coincidentally, our Keynote with Stefan occurs on what is now known as #bingfriday. This was a term that Greg Boser came up with in our final session at SMX Melbourne in November last year; in short, he suggested that we all use Bing on Fridays to break the Google habit, hence the name – #bingfriday.

Stefan's Keynote will be broken into two parts: a short 20-minute formal presentation on the confluence of Social data and Search, followed by a "fireside" chat with Stefan, Danny Sullivan & Barry Smyth.

We are very excited to have both of these Industry Thought Leaders involved in the 2011 event.

SMX Sydney and SMX Elite Agendas

This year's agenda has been our most challenging yet, and we have taken a lot of time to build this one so that we get it right. In order to make sure that we keep up to speed with more advanced SEO practitioners, we have added an additional day of training with the SMX Elite Pass; this gives us more room in our Thursday and Friday agendas for more mainstream content.

Please remember that the content for the Thursday and Friday agendas is programmed for people who have at least 3+ years of experience in search and social media marketing. If you have been in online marketing for less than this, you will find the content leans more towards an advanced level, whereas if you have been around a while you will find that the content is of an intermediate nature.

If you have been in search for eons and SEO is your thing, then you will find that the SMX Elite Pass option is where you want to be. We only have eight spots left for Elite.

Check out the agendas for all three days and make up your own mind:

New Website

Given that you are on the site, you may already know that we have four events that take place under the same roof. Late last year we decided to take the step of rebranding all of these under one name – Online Marketer.

As such, we now have a new website that all of that content now lives under. Special thanks goes out to our Brisbane-based web developer, iReckon, for helping us get this up and running through the tough times of the floods. If you need any help building a new website from scratch using WordPress or any other technology, then these are your guys; we can't recommend them highly enough.

Latest Discount Offer

If you were not lucky enough to have secured the 40% discount, then we are happy to announce that you can still save 25%, as long as you book before the end of next week – Friday 25th Feb, 2011.

After that date we will still offer a 10% discount, so now is the time to be a smart Online Marketer and cash in on the savings we are offering.

Please don't forget that we have a range of pass options that allow you to combine our Mobile Marketing Workshop, eMetrics & SMX Elite. There are plenty of options to choose from; you can download a pass options overview here.

That's all for now. Some more news is coming next week regarding speaker announcements for SMX, agenda information for the eMetrics Marketing Optimization Summit, plus a new session at SMX that we are sure will excite you.

Till then take care.