Pages

Thursday, May 29, 2014

eBay Slapped By Google: But Was It Algorithmic Or Manual?

As you know, Google released Panda 4.0 last week and a bunch of sites dropped like rocks, including eBay. In fact, here is an updated SearchMetrics chart of the drop: it shows a 78% loss in search visibility!

SearchMetrics eBay

But now, the question is: was it Panda 4.0 that hit eBay, or a manual action? Re/Code cites sources, which I have confirmed as well, saying it was a manual action. Of course, Google won't comment, but it is incredibly likely that eBay was hit with a manual action.
In fact, as refugeeks.com and the other tracking tools reported, the loss was specific to a single directory on eBay's site.
Doesn't Panda also impact parts of sites, like directories, not just whole sites? Yes, it is possible. So was this Panda or manual? Possibly both.
Does it matter whether it was a manual action or an algorithmic one? Yes! It changes how you go about fixing the issue and the timing of any recovery.

Thursday, May 22, 2014

No, Google Says There’s Been No Penguin Update

This morning, I noticed a lot of buzz around a possible Google Penguin update. The SEO space was noticing huge changes in the search results for previously penalized sites, many of which had been impacted by the Google Penguin update.

The Google Penguin algorithm targets sites trying to manipulate their search results through link building efforts that Google deems unnatural. This is not the manual action for unnatural links but rather an algorithm that detects these types of links automatically.
A Google spokesperson told us at Search Engine Land that there are no Penguin or other spam efforts going on now. So this update, at least according to Google, is not spam related. In fact, Google made it sound like there was no update at all.
There were many webmasters talking about changes in rankings over the past 24-48 hours, independent of the Panda 4.0 update from a week ago. Most of them said the update impacted sites of theirs that had Penguin issues. But not all were convinced it was indeed Penguin related. Either way, many SEOs and webmasters noticed ranking changes.


One graph from a site that had an algorithmic penalty showed a recovery in the past couple of days. Other webmasters shared their analytics with me as well, showing huge changes in traffic, while others simply claimed Penguin or other recoveries.
When I responded to some SEOs with Google's statement that there was no Penguin or other update, several said they don't believe it.

Thursday, May 15, 2014

Using Schema Markup to Boost Your Google Rankings

As a marketer or someone involved in SEO, you’ve probably heard of Schema markup, but if you haven’t, it’s okay. While frequently viewed as a relatively new technology to increase search engine rankings and the accessibility and usability of your pages, the markup has actually been around for years. If it’s not currently part of your SEO strategy, now is the time to make a change.
Keep reading to learn more about the basics of schema markup, along with 3 compelling reasons to start using it on your website today.



What is it and How Does it Work?
According to Dan Shewan at WordStream, schema is “a type of microdata that makes it easier for search engines to parse and interpret the information on your webpages more effectively so they can serve relevant results to users based on search queries.”
To put it even more simply, schema markup highlights the information that matters most on the backend of a website, putting it front and center for search engines and increasing the odds that a site using the markup stands out above one that does not. It is designed to make search engine crawlers' jobs easier and more effective.

Additional information is available on Schema.org, launched in 2011 as a result of the Schema project, a collaboration between large search engines including Google, Bing, and Yahoo!. The site brings schema markup to the forefront of website design by providing a large collection of schemas, largely viewed as HTML tags, that web designers, developers, and average users can use to improve search engine placement.

Schema works similarly to other markup formats, applying microdata to page content and defining, in HTML terms, exactly what a webpage contains and how it should be treated. It presents data in an easy-to-read format for search engine crawlers, making it more likely that relevant information will be presented to searchers. It makes the crawlers' jobs easier and is therefore generally rewarded.
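To make that concrete, here is a minimal sketch of what microdata looks like in practice. The business name, addresses, and URLs are made up for illustration; the `itemscope`, `itemtype`, and `itemprop` attributes are the real schema.org mechanism that labels each value for crawlers:

```html
<!-- Plain HTML: a crawler has to guess what each line of text means -->
<div>
  <h1>Tony's Pizzeria</h1>
  <p>123 Main Street, Springfield</p>
</div>

<!-- The same content with schema.org microdata: every value is labeled -->
<div itemscope itemtype="https://schema.org/Restaurant">
  <h1 itemprop="name">Tony's Pizzeria</h1>
  <p itemprop="address" itemscope itemtype="https://schema.org/PostalAddress">
    <span itemprop="streetAddress">123 Main Street</span>,
    <span itemprop="addressLocality">Springfield</span>
  </p>
  Call us: <span itemprop="telephone">555-0100</span>
</div>
```

Both versions render identically to a human visitor; only the second tells a crawler unambiguously that "Tony's Pizzeria" is a restaurant's name and "555-0100" is its phone number.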


1. Matt Cutts has Been Recommending it For Years.

Matt Cutts, head of Google's Webspam team and the person behind actions taken against many websites using questionable SEO strategies, has been recommending schema markup for years. This is big: as someone who has helped write the technology behind the largest search engine in the world, his words carry weight.
Back in 2012, in a webmaster help video, Cutts shared the following about schema markup:
“Just because you implement schema.org doesn’t mean you necessarily rank higher. But, there are some corner cases, like if you were to type in ‘lasagna,’ and then click over on the left hand side and click on ‘recipes,’ that’s the sort of thing where using schema.org markup might help, because then you’re more likely to be showing up in that at all.”
While he was reluctant to outright say that it would help rankings, he made it clear that it definitely wouldn't hurt anything. Coming from one of the top names at Google, this matters.

2. Rich Snippets, Like Those Featured in Google SERPs, Result in Higher CTRs.

Schema markup leads to tangible benefits, including enhanced search engine results pages (SERPs) that stand out among the competition. While standard results generally include just a title and a snippet of a webpage, targeted schema markup can pull in customer ratings, photos, and more.

Yes, it's more visually appealing, but it's also more effective. Information shared by Search Engine Land indicates that rich snippets, listings that include more information than standard search engine listings, can increase click-through rates by 30%. Thirty percent more clicks can do a lot to take a company to the next level, while looping back and increasing search engine visibility even more. Schema markup enables this cycle and, as such, should be a serious consideration for increasing rankings and website effectiveness.
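The star ratings and photos in those rich snippets come from the same kind of markup. As a hedged sketch (the recipe, rating, and image file are invented for illustration), here is how a recipe page might expose the data a rich snippet draws on:

```html
<!-- A recipe marked up so crawlers can surface rating, photo, and
     cook time directly in the search listing -->
<div itemscope itemtype="https://schema.org/Recipe">
  <h1 itemprop="name">Classic Lasagna</h1>
  <img itemprop="image" src="lasagna.jpg" alt="Classic lasagna">
  <span itemprop="totalTime" content="PT90M">1.5 hours</span>
  <div itemprop="aggregateRating" itemscope
       itemtype="https://schema.org/AggregateRating">
    Rated <span itemprop="ratingValue">4.7</span>/5
    by <span itemprop="reviewCount">213</span> reviewers
  </div>
</div>
```

Without the `aggregateRating` block, the search engine has no structured way to know the page carries a 4.7-star rating, and the listing falls back to a plain title and snippet.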

3. Pages with Schema Markup Rank an Average of 4 Positions Higher on Google

A recent study by Searchmetrics revealed that while less than 1 percent of sites on the Internet have implemented schema markup, those that have rank an average of 4 positions higher on Google. That can mean the difference between page 1 and page 2, between being displayed in the top results and being buried, never to be found by searchers and prospective clients.
The study goes on to explain that Google shows results with schema markup, like those mentioned in reason number 2, for over 36% of keyword queries, while results without markup appear far less frequently.

The bottom line is simple. Schema has been recommended by individuals behind the top search engines in the world and cannot hurt anything; the rich snippets it enables lead to higher click-through rates; and pages with the markup rank higher in search results than sites without it. Not using the markup, based on these reasons alone, just doesn't make sense.
If you’re ready to put schema markup to work for your site, or are interested in learning more, check out Schema.org’s getting started guide. The time to start is now.

Wednesday, May 7, 2014

Local Listing Management Priorities: How & Where to Syndicate Local Data

Multi-location businesses face unique challenges in the management, optimization, and distribution of local business listing information. Accuracy is critical for every business, but those with a number of locations must also master scale.

It's easy to get bogged down in spreadsheets and a hodgepodge of tools. After all, most marketers struggle to see the ROI from local campaigns, so it can be difficult to justify putting budget here to streamline your efforts. You might simply rely on Yext, or have an agency do the work for you.

We've looked at on-page validation and SEO factors, local landing page user experience, and local listing management together as components of your blueprint for multi-location brand success. This post dives deeper into local data syndication: where to focus your efforts, how to do it properly, and how to scale your efforts for maximum ROI.

1. Start With the Essentials: Google, Bing, Yahoo

The "big three" are free and together account for 96.2 percent of search traffic. This March, 13.1 billion, or 67.5 percent, of the 19.4 billion explicit core searches in the U.S. took place on Google sites. Together, Microsoft (Bing) and Yahoo sites powered another 28.7 percent of searches.

Many major brands focus on Google alone, forgetting about Yahoo and Bing. This is a huge missed opportunity to rank higher through better optimization for these free platforms.

Even worse, incorrect or outdated local business information served up to the 28.7 percent of people who search with Bing or Yahoo can result in poor customer experiences.

2. Focus on Lowest-Cost Data Aggregators: Accuracy, Optimization & Automation

Once you're straight with Google, Bing, and Yahoo, focus on your lowest-cost data aggregators. These are your options costing less than $99 a year, per location, combined: Infogroup, Acxiom, Localeze, and Factual, among others.

Aggregators are a low-cost solution that can deliver great value, enabling your brand to build hundreds of valuable, relevant backlinks to your local landing pages. They help you offer a better user experience, increase your local presence, and rank higher in local organic search results.

How Can You Get Right With Your Low-Cost Data Aggregators?

Make sure you include and verify as much of the following information as you can:
  • Business name
  • Address
  • Telephone
  • Landing Page URL
  • Neighborhood Information
  • Cities, Areas Served
  • Brand Logo and Storefront Images
  • Business Description with services (keywords) offered and city you serve
  • Categories that match your services (keywords)
  • Hours
  • Holiday Hours
  • Taglines & Brands Carried (keywords)
  • Events
  • Social Networks URLs
  • Local Map URLs
  • Review Site URLs
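The fields above map almost one-to-one onto schema.org's LocalBusiness vocabulary, so the same record you syndicate to aggregators can also be embedded on each local landing page. A sketch using JSON-LD, with every business name, address, and URL a placeholder:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Acme Plumbing - Springfield",
  "description": "Emergency plumbing and drain cleaning serving Springfield.",
  "url": "https://example.com/locations/springfield",
  "telephone": "555-0100",
  "logo": "https://example.com/logo.png",
  "image": "https://example.com/storefront-springfield.jpg",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main Street",
    "addressLocality": "Springfield",
    "addressRegion": "IL",
    "postalCode": "62701"
  },
  "areaServed": ["Springfield", "Shelbyville"],
  "openingHours": "Mo-Fr 08:00-18:00",
  "sameAs": [
    "https://www.facebook.com/acmeplumbing",
    "https://www.yelp.com/biz/acme-plumbing-springfield"
  ]
}
</script>
```

Keeping one canonical record like this, and generating both your aggregator feeds and your on-page markup from it, is one way to keep NAP data consistent everywhere at scale.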

3. Home in on Your Higher-Cost Local Listings Management Solutions on Popular Networks

Now it's time to get right with your higher-cost, more niche local listing management solutions on popular networks: Yelp, Yext, Foursquare, Facebook, and others. These are your $299 to $499 per year, per location options.

Social networks have evolved from a place to post cat pictures and chat with your high school friends to a valuable source of information for mobile consumers. Real-time data distributors like Yext and niche directory sites like Yelp, Foursquare, and even Facebook function as social networks, but also act as or power business review sites and house local business listing information.

Automating the process of populating, verifying, and updating your listings on these networks makes this a far less daunting task with a higher quality outcome.

4. Track, Evaluate & Optimize: Tactics for Ongoing Best Practice Management

Alongside the automation of your local listing syndication, look to citation management and ranking tools for ongoing improvements. These will help you track performance, measure increases in ranking and traffic, and more effectively optimize your local listings across the top search engines, data aggregators, and popular networks.

Using citation management and ranking tools increases ranking and traffic, and will give you the insight and reporting you need to move the needle:
  • Bright Local
  • Moz Local
  • Whitespark
  • Chatmeter
  • Yext
  • Places Scout
  • Google Analytics

Key Takeaways

  • Automate, automate, automate! It not only saves you time, but ensures greater accuracy and therefore higher rankings and a better user experience.
  • Start with essential free listings and systematically work your way through the above items to get it right. Optimize, measure, and improve on a consistent basis. Get out of the weeds of the niche directories and start with the three major ones for the biggest win.
  • Take advantage of the different types of information you can share on various listings solutions; go beyond NAP (name, address, phone number). Connect to your social networks, offer holiday hours and other relevant operational information, add images, etc.
If you're finding yourself drowning in spreadsheets and constantly playing catch-up to keep your local listings up to date and accurate, it's time to implement a smarter, more effective local listings syndication strategy.

Managing all listings from a single CMS reduces your time investment and will help you increase your rankings. This approach will also grow your local presence and ultimately achieve the lowest cost per visitor for your multi-location business.