Pages

Monday, August 26, 2013

Google’s Local Carousel

Google’s recent announcement of the local “Carousel” has proven to be a major advancement in how search results are displayed. Because humans are visual creatures, the Carousel offers users various channels to discover information and brands in a more visual and interactive manner.

Understanding How the Visual Carousel for Local Search Impacts Clicks, Rankings, and Your Overall SEO Strategy

Google’s Carousel could be a major shake-up for local search marketing campaigns. Although the Carousel does not show for every search, our research indicates that the Carousel layout is most likely to appear for search queries where user interaction, such as reviews and photo uploads, is common. We believe this is likely to expand to other search results in the near future.

How Carousel Works

The Carousel view displays a horizontal strip of images at the top of the search engine results page (SERP) that slides from left to right (if the quantity of results warrants it) and is highlighted with a black background. Previously, when searching for restaurants, bars, or other nearby places, local results were displayed as a vertical text list, typically showing about five to seven businesses, in a fairly plain and even somewhat boring fashion.

The new Carousel format includes many more listings and is much more eye-catching, with rich information for each business listed, such as an image, the number of reviews, the review score, average cost, and type of food.




Typically, we see Google continue to push organic listings farther down the SERP. Organic results are getting less real estate due to expanded coverage for Knowledge Graph results (just try searching “acne treatment”) and product listing ads (PLAs). The local Carousel listings, however, sit at the very top of the page, above paid search, Knowledge Graph results, and organic listings.

Furthermore, the new horizontal design displays about 30 percent more business listings while covering about 40 percent less vertical space, leaving more room for organic listings than the previous local results did. Even more local results can be displayed once the user moves the map to his or her desired area. This gives local SEO a much larger opportunity to dominate the search results page.

Impact on Click-throughs in the Carousel

Because search results are shown horizontally rather than vertically, traditional ranking CTRs may no longer apply. A recent study published on LocalU.org found that the No. 1 local ranking position did not always get the highest number of clicks. In the test, the actions of 83 people were tracked on the new layout.

The results showed that 48 percent of the total clicks were on Carousel results and only 14.5 percent of the clicks were on the map. Additionally, the eighth Carousel result had the most clicks and the third Carousel result had the second most clicks.



[Source : http://www.covario.com/2013/07/googles-local-carousel-a-new-form-of-discovery-marketing/#fbid=Qxr9oXvOtlT ]


Monday, August 19, 2013

7 Mistakes that Lead to Guest Post Failure

Guest posting! The highway to unbridled blogging success. But for us fellow failure-chasers, you’re in luck, because writing and submitting a guest post offers some real opportunities for spectacular failure.

1. Be as timid as humanly possible


The first opportunity for failure is the pitch. Confidence carries the day when it comes to guest posting. So if it’s failure you’re looking for, don’t show any confidence. Try not to sell your idea, and make sure you don’t actually write the post you’re proposing. Be hesitant, and make it apparent that you’re wasting your host’s time. With a bit of luck, they won’t send you so much as a read receipt.

2. Don’t startle the readers

Maybe the A-Lister you’ve just pathetically pitched has taken pity on you, and asked you to draft up your post. What he’s looking for here is some competence. So make sure you don’t show any. Starting with a bang and grabbing attention leads to success, so don’t do it. Write cautiously and quietly, so as not to startle your audience into action.

3. Imitation is the most sincere form of flattery — so shamelessly copy content

Your lukewarm opening should have dissuaded all but the most persistent of writers, so it’s going to take some real incompetence to screw this up now. The quickest way is to do something that’s been done before. Retread old ground, but not in a new and interesting way. No, simply regurgitate your host’s best piece with some added spelling mistakes and grammatical errors. Be very cautious with this, as covering old topics in a fresh way is actually a terrific way to write a popular guest post. Make sure not to add your own twist or fresh angle and you should be fine.

4. Shamelessly plug your own blog

It’s now time to look over the content you’ve just haphazardly thrown together. To hit the dizzying lows of total failure, you need to employ an ancient SEO technique known as “spamming.” In other words, drop your link into the post so often that the page becomes nearly unreadable. This is going to fail for two main reasons. One: it’s going to make your host even less likely to publish your piece. Two: it doesn’t work.

5. Make your ending as flat as possible

If you’ve done everything wrong up until now, you should be faced with a pathetic piece of trash, where every second word is a link to your blog. Congratulations. You’re nearly done with the writing. All that’s left for you to do is cobble together an ending that peters out. And whatever you do, don’t forget to leave out effective closing techniques like a strong call to action.

6. Treat your host like you’re one of The Sex Pistols

The chances are that you’re going to have to interact with your host as they attempt to polish the steaming post you’ve just deposited in their inbox. So now’s the time to channel some old-school punk. Just like The Sex Pistols in their first TV interview, swear at your host, avoid giving direct answers, and give the impression that the conversation is beneath you. With luck, this should be enough to make sure you don’t get published …

7. Run like mad and don’t ever look back


If after all of this, by some horrible stroke of luck you do get published, there’s still one more opportunity for failure.

Demonstrate a complete lack of commitment to your guest post. Don’t reply to comments, don’t promote it on Twitter or Facebook, and certainly don’t write a post on your own blog to take advantage of the new traffic that your guest post provides.

And with that, you’ll have blown your big guest posting chance.

Friday, August 16, 2013

KML Files

Keyhole Markup Language (KML) is an XML standard used to represent geographic data in 2D and 3D maps. The format was developed for use in Google Earth (originally created by Keyhole, Inc., which Google later acquired). It accepts various parameters, from name, address, and phone number to GPS coordinates.
According to Google Developers, “KML is a file format used to display geographic data in an Earth browser such as Google Earth, Google Maps, and Google Maps for mobile. KML uses a tag-based structure with nested elements and attributes and is based on the XML standard.” Once consumers can consistently find your business through these resources, it can improve your traffic volume significantly. This makes it a worthwhile option for your local SEO campaign. An easy way to transfer your business’s info to Google’s servers is through a KML generator.

We submit two files: a GeoSitemap, which tells Google where to find the KML file on your server, and the KML file itself, which contains the GPS coordinates that tell Google exactly where your physical address is. Here is a pictorial representation of that same information.
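As a rough sketch of what such a KML file contains (the business name, address, phone number, and coordinates below are invented placeholders, not real data), a minimal locations.kml for a single-location business looks something like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Document>
    <Placemark>
      <name>Example Business Name</name>
      <address>123 Main Street, Anytown, CA 90210</address>
      <phoneNumber>555-555-5555</phoneNumber>
      <Point>
        <!-- KML lists longitude first, then latitude -->
        <coordinates>-118.25,34.05</coordinates>
      </Point>
    </Placemark>
  </Document>
</kml>
```

A business with multiple branches simply gets one Placemark element per location inside the same Document.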



Create a KML File for a Local Business

The steps to create a KML file are:

1.  Go to Geo sitemap generator.com and click on the Start button.
2.  Select Manual input, fill in all the required fields, and click on Add Location.
3.  If your business has multiple branches, add all of them by clicking on Add Location.
4.  Once you have added all the branches, click on Step 2.
5.  Now add the URL of your website (use the www or non-www version depending on your setup). If your site has multiple subsites and you want each of them to have its own KML file, you can do that; just make sure you add the appropriate locations.kml file to those folders. In most cases you can leave the default after adding your site’s URL.
6.  The Description and Author fields for your KML file are not compulsory, but you can fill them in if you want to; then hit the Generate button. I usually go for the download option, but if you have a lot of locations you might want to have the files emailed to you when they are ready.
7.  You will now be presented with two files, the KML file and a Geo sitemap; download both of them.
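For reference, the Geo sitemap produced by the steps above is a small XML file that simply points Google at your KML file. A sketch of its typical shape (the domain and file name below are placeholders; substitute your own):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:geo="http://www.google.com/geo/schemas/sitemap/1.0">
  <url>
    <loc>http://www.example.com/locations.kml</loc>
    <geo:geo>
      <geo:format>kml</geo:format>
    </geo:geo>
  </url>
</urlset>
```

Upload both files to your web server’s root folder, then submit the Geo sitemap through Google Webmaster Tools like any other sitemap.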

Done right, this is a simple way to gain major benefits for a local SEO campaign.


Monday, August 5, 2013

Web Robots

Web Robots (also known as Web Wanderers, Crawlers, or Spiders) are programs that traverse the Web automatically. Search engines such as Google use them to index web content, spammers use them to scan for email addresses, and they have many other uses.

It works like this: a robot wants to visit a website URL, say http://www.example.com/welcome.html . Before it does so, it first checks for http://www.example.com/robots.txt and obeys the rules it finds there.

The simplest robots.txt file uses two rules:

•   User-agent: the robot the following rule applies to
•   Disallow: the URL you want to block

Disallow indexing of everything

User-agent: *
Disallow: /

The "User-agent: *" means this section applies to all robots. The "Disallow: /" tells the robot that it should not visit any pages on the site.

Allow indexing of everything

User-agent: *
Disallow:

Disallow indexing of a specific folder

User-agent: *
Disallow: /folder/

To exclude a single robot

User-agent: BadBot
Disallow: /

To allow a single robot

User-agent: Google
Disallow:

User-agent: *
Disallow: /
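The effect of rules like the ones above can be sanity-checked with Python’s standard urllib.robotparser module (Python 3; the bot names and URLs below are just placeholders):

```python
from urllib import robotparser

# The "allow a single robot" rules from the example above.
rules = """
User-agent: Google
Disallow:

User-agent: *
Disallow: /
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# The named robot may fetch anything; everyone else is blocked.
print(rp.can_fetch("Google", "http://www.example.com/welcome.html"))   # True
print(rp.can_fetch("BadBot", "http://www.example.com/welcome.html"))   # False
```

In practice you would call set_url() and read() to fetch a live robots.txt instead of parsing a string.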

Crawl-delay directive

Several major crawlers support a Crawl-delay parameter, set to the number of seconds to wait between successive requests to the same server:

User-agent: *
Crawl-delay: 10
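Python’s robotparser also exposes this value (via crawl_delay(), available in Python 3.6+), which a polite crawler can use to throttle itself:

```python
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Crawl-delay: 10",
])

# Any bot matching the "*" entry should wait 10 seconds between requests.
print(rp.crawl_delay("AnyBot"))  # 10
```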

Allow directive

Some major crawlers support an Allow directive which can counteract a following Disallow directive. This is useful when one tells robots to avoid an entire directory but still wants some HTML documents in that directory crawled and indexed.

Disallow Googlebot from indexing a folder, except for one file in that folder

User-agent: Googlebot
Disallow: /folder1/
Allow: /folder1/myfile.html
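This combination can also be checked with Python’s urllib.robotparser, with one caveat: Python’s parser applies rules first-match-wins, so the Allow line must come before the Disallow line for it to take effect there (Googlebot itself uses most-specific-match, so the order shown above also works for Google). The URL below is a placeholder:

```python
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: Googlebot",
    "Allow: /folder1/myfile.html",   # placed first: Python matches rules in order
    "Disallow: /folder1/",
])

print(rp.can_fetch("Googlebot", "http://www.example.com/folder1/myfile.html"))  # True
print(rp.can_fetch("Googlebot", "http://www.example.com/folder1/other.html"))   # False
```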

Sitemap

Some crawlers support a Sitemap directive, allowing multiple Sitemaps to be listed in the same robots.txt in the form:

Sitemap: http://www.gstatic.com/s2/sitemaps/profiles-sitemap.xml
Sitemap: http://www.google.com/hostednews/sitemap_index.xml

Host

Some crawlers (notably Yandex) support a Host directive, allowing websites with multiple mirrors to specify their preferred domain.

Host: example.com
(Or)
Host: www.example.com

This is not supported by all crawlers and, if used, it should be inserted at the bottom of the robots.txt file, after the Crawl-delay directive.