Archives for: January 2016

The Most Important Things We Learned About Google’s Panda Algo

Posted by jenstar

Webmasters were caught by surprise two weeks ago, when Google released many new statements about their Panda algorithm to The SEM Post. Traditionally, Google tends to be rather quiet about their search algorithms, but this time they were quite transparent, sharing a lot of Panda-related information that many SEOs weren't aware of.

Here are what I consider to be the top new takeaways from Google about the Panda algorithm. These are all things SEOs can put into action, either to create new, great-quality content or to improve the quality of their existing content.

First, the Panda algorithm is specifically about content. It’s not about links, it’s not about mobile-friendliness, it’s not about having an HTTPS site. Rather, the Panda algorithm rewards great-quality content by demoting content that’s either quite spammy in nature or that’s simply not very good.

Now, here are the most important things you should know about Panda, including some of the mistakes and misconceptions about the algorithm that have confused even expert SEOs.

Removing content Google considers good

One big issue is that many SEOs have been promoting the widespread removal of content from websites that were hit by Panda. In actuality, however, what many webmasters don’t realize is that they could be shooting themselves in the foot by doing this.

When performing content audits, many penalty experts will cut a wide swath through the site’s content and remove it. Whether claiming that X% of content needs to be removed to recover from Panda or that older, less fresh content needs to be removed, doing this without the proper research will cause rankings to decrease even further. It’s never a “surefire Panda recovery tactic,” despite what some might say.

Unfortunately for SEOs, there’s no magic formula to recover from Panda when it comes to the quantity, age, or length of the content on the site. Instead, you need to look at each page to determine its value. The last thing you want to do is remove pages that are actually helping.

Fortunately, we have the tools to determine the “good versus bad” when it comes to figuring out what Google considers quality. The answer lies in both Google Analytics (or whatever your preferred site analytics program is) and Google Search Console.

If Google is sending traffic to a page, then it considers that page high-quality enough to rank. If you remove one of these pages because it was written a few years ago or because it falls below some magic word-count threshold, you lose all the future traffic Google would have sent to it.

If you’re determined to remove content, at least verify that Google isn’t sending those pages traffic before you add to your Panda problems by losing more traffic.
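If you want to run this check in bulk rather than page by page, a minimal sketch might look like the following. It assumes a page-level CSV export from Search Console with `page` and `clicks` columns; real export headers vary by report and language, so adjust the names to match your file.

```python
import csv

def removal_candidates(csv_path, min_clicks=1):
    """Return pages from a Search Console export that receive no search clicks.

    Assumes a CSV with 'page' and 'clicks' columns (adjust to your export).
    Only pages below min_clicks are even worth considering for removal;
    everything else is demonstrably earning traffic from Google.
    """
    candidates = []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            if int(row["clicks"]) < min_clicks:
                candidates.append(row["page"])
    return candidates
```

Anything this returns still deserves a manual look before deletion; anything it doesn't return should almost certainly stay.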

Your content should match the search query

We all laugh when we look at our Google Search Console Search Analytics and see the funny keywords people search for. However, part of providing quality content is delivering on those content expectations. In other words, if a search query repeatedly brings visitors to a specific page, make sure that page delivers the promised content.

From the Panda Algo Guide:

A Google spokesperson also took it a step further and suggested using it also to identify pages where the search query isn’t quite matching the delivered content. “If you believe your site is affected by the Panda algorithm, in Search Console’s Search Analytics feature you can identify the queries which lead to pages that provide overly vague information or don’t seem to satisfy the user need for a query.”

So if your site has been impacted by Panda — or you’re concerned it might be and want to be proactive — start matching up popular queries with their pages, making sure you’re fully delivering on those content query expectations. While this won’t be as big of a concern for sites not impacted by Panda, it’s something to keep in mind if you do notice those “odd” keywords popping up with frequency.

Ensuring your content matches the query is also one of the easiest Panda fixes you can do, although it might take some legwork to spot those queries that under-deliver. Often, it’s just a matter of slightly tweaking a paragraph or two, or adding an additional few paragraphs to change the content for those queries from “meh” to “awesome.” And if you deliver that content on the visitor’s landing page, it means they’re more likely to stick around, view more of your content, and share it with others — rather than hitting the back button to find a page that does answer their query.

Fixable? Or kill it with fire?

“Fixing” versus “removing” is another area where many experts disagree. Luckily, it’s been one of the areas that Google has been pretty vocal about if you know where to find those comments.

Google has been a longtime advocate of fixing poor quality content. Both Gary Illyes and John Mueller have repeatedly talked about improving the quality of content.

In a hangout, John Mueller said:

Overall, the quality of the site should be significantly improved so we can trust the content. Sometimes what we see with a site like that is it will have a lot of thin content, maybe there’s content you are aggregating from other sources, maybe there’s user-generated content where people are submitting articles that are kind of low quality, and those are all the things you might want to look at and say, what can I do; on the one hand, if I want to keep these articles, maybe prevent these from appearing in search.

Now, there are always edge cases, and this is what many experts get hung up on. The important thing to remember is that Google’s not talking about those weird, random edge cases, but rather what applies to most websites. Is it forum spam for the latest and greatest Uggs seller? Of course, you’ll want to remove or noindex it. But if it’s the content you hired your next-door neighbor to write for you, or “original” content you bought off of Fiverr? Improve it instead.

If you do have thin content that you’ll want to upgrade in the future, you can always noindex it for now. If it’s not indexable by Google, it’s not going to hurt you, from a Panda perspective. However, it’s important to note that you still need to have enough quality content on your site, even if you’re noindexing or removing the bad stuff.
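For illustration, here is what noindexing a thin page amounts to at the HTML level. This is a deliberately simplistic string-based sketch; in practice a CMS setting, a template change, or an `X-Robots-Tag` HTTP header is the usual way to do it.

```python
def add_noindex(html):
    """Insert a robots noindex meta tag into an HTML document's <head>.

    A simplistic sketch that assumes a bare '<head>' opening tag; real
    sites should set this via their CMS, templates, or the X-Robots-Tag
    HTTP header rather than string surgery.
    """
    tag = '<meta name="robots" content="noindex">'
    if tag in html:
        return html  # already noindexed; nothing to do
    return html.replace("<head>", "<head>\n" + tag, 1)
```

Once Google recrawls a page carrying this tag, it drops out of the index, so from a Panda perspective it no longer counts for or against you.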

This is also what Google recommended in the Panda Algo Guide:

A Google spokesperson also said this, when referring to lower quality pages. “Instead of deleting those pages, your goal should be to create pages that don’t fall in that category: pages that provide unique value for your users who would trust your site in the future when they see it in the results.”

Still determined to remove it after checking all the facts? Gary Illyes gave suggestions during his keynote at Pubcon last year on how to remove thin content properly.

Ranking with Panda

One of the most surprising revelations from Google is that sites can still rank while being affected by Panda. While there are certainly instances where Panda impacts an entire site, and this is probably true in the majority of cases, it is possible that only some pages are negatively impacted by Panda. This is yet another reason you want to be careful when removing pages.

From the Panda Algo Guide:

What most people are seeing are sites that have content that is overwhelmingly poor quality, so it can seem that an entire site is affected. But if a site does have quality content on a page, those pages can continue to rank.

A Google spokesperson confirmed this as well.

The Panda algorithm may continue to show such a site for more specific and highly-relevant queries, but its visibility will be reduced for queries where the site owner’s benefit is disproportionate to the user’s benefit.

This comment reinforces the idea from Google that a key part of Panda is where Google feels the site owner is getting the most benefit from a visitor to their site, rather than vice-versa.

Duplicate content

One of the first things webmasters do when they get hit by Panda is freak out over duplicate content. And while managing your duplicate content is always a good idea from a technical standpoint, it doesn’t actually play any role in Panda, as John Mueller confirmed late last year.

And even then, John Mueller described fixing duplicate content on a priority scale as “somewhere in the sidebar or even quite low on the list.” In other words, focus on what Panda is impacting first, then clean up the non-Panda related technical details at the end.

Bottom line: Duplicate content can certainly affect your SEO. But from a Panda perspective, if your main focus is on getting your site ranking well again in Google after a Panda hit, leave it until the end. Google is usually pretty good about sorting it out, and if not, it’s fixable with either some redirects or canonicals.

Word count

Many webmasters fixate on the idea that content has to be a certain number of words to be deemed “Panda-proof.” There are plenty of instances of thousand-word articles that are extremely poor quality, and other examples of content so great that even having only a hundred or so words will trigger a featured snippet… something Google tends to give only to higher-quality sites.

Now, if you’re writing content, there’s nothing wrong with trying to set up certain benchmarks for the number of words — especially if you have contributors or you’re hiring writers. There’s no issue with that. The issue is with falsely believing that word count is related to quality, both in Google’s eyes and from the Panda algo perspective.

It’s very dangerous to assume that because an article or post is under a specific word count, it needs to be removed or improved. Instead, as when deciding whether to remove content, check whether Google is sending referrals to those pages. If they’re ranking and receiving traffic from Google, word count is not an issue.

Advertising & affiliate links

The role that advertising and affiliate links play in Google Panda is an interesting one, and it’s a topic John Mueller from Google has brought up in his Google Hangouts as well. This isn’t to say that all advertising or all affiliate links are bad. The problem is the content surrounding them: how much there is and what it’s like.

Where there’s an impact is in the amount of advertising and affiliate links relative to the content. Will Google consider a page that is essentially just affiliate links, without any quality content, good? It’s not that Panda specifically targets ads or affiliate content; there are lots of awesome affiliate sites out there that rank really well and aren’t affected by Panda whatsoever.

The problem lies in the disconnect between the balance of useful content and monetization. At Pubcon, Gary Illyes said the value to the visitor should be higher than the value to the site owner. But as we see on many sites, that balance has tipped the other way, where the visitor is seen merely as a means of revenue, without concern about giving that visitor any value back.

You don’t need to hit your visitors over the head with a huge amount of advertising and affiliate links to make money. That visitor brings a lot of additional value to your site when they don’t feel your site is too ad heavy. From the Panda Algo Guide:

There are also benefits from traffic even if it doesn’t convert into a click on an affiliate link. Maybe they share it on social media, maybe they recommend it to someone, or they return at a later time, remembering the good user experience from the previous visit.

A Google spokesperson also said, “Users not only remember but also voluntarily spread the word about the quality of the site, because the content is produced with care, it’s original, and shows that the author is truly an expert in the topic of the site.” And this is where many affiliate sites run into problems.

There’s another thing that often happens when a website is hit by Panda: naturally, the revenue from the ads they do have on the site goes down. Unfortunately, often the response to this loss of revenue is to increase the number of ads or affiliate links to compensate. But this degrades the value of the content even further and, despite the knee-jerk reaction, is not the appropriate move in a Panda-busting plan.

Bottom line: There is absolutely nothing wrong with having advertising or affiliate links on a site. That alone won’t cause a Panda issue. What can cause a Panda issue, rather, is how and how much you present these things. Ads and affiliate links should support your content, not overwhelm it.

User-generated content

What about user-generated content? Sadly, it’s getting a pretty bad rap these days. But it’s getting this reputation because of the crappy user-generated content out there, not the high-quality user-generated content you see on some sites. Many so-called experts advise removing all user-generated content, but again, that’s one of those moves that can negatively impact your site.

Instead, look at the actual user-generated content you have on your site and decide whether it’s quality or not. For example, YouMoz is considered to be fairly high-quality user-generated content: all posts still have to be approved by editors, and only a small percentage of submitted articles make it live on the site. Even then, the editors work to improve and edit the pieces as necessary, ensuring that even though it is user-generated content, it’s still high quality.

But like any content on the web, user-generated or not, there are different levels of quality. If your user-generated content quality is very high, then you have nothing to worry about. You could have a different contributor for every single article if you wanted to. It has nothing to do with how you obtained the content for your site, but rather how high-quality and valuable that content is.

Likewise, with forums or community-driven sites where all the content is user-contributed, it’s about how high-quality that content is, not about who contributes it. Sites like Stack Overflow have hundreds of thousands of contributors, yet the content is considered very high quality and the site does extremely well in Google’s search results.

If your user-generated content has both high points and low points in quality, there are a few actions Google recommends so that the lower-quality content doesn’t drag down the entire site. John Mueller said that if you can recognize the types of lower-quality content on the forum, or the patterns that tend to match it, then you can block it from being indexed by Google. This might mean noindexing your welcome forum where people post introductions about themselves, or blocking the chitchat forums while leaving the helpful Q&A indexable.

And, of course, you need to deal with any spam in your user-generated content, whether it’s something like YouMoz or a forum for people who all love a specific hobby. Have good guidelines in place to prevent your active users from spamming or link-dropping. And use some of the many forum add-ons that identify and remove spam before Google can even see it.

Do not follow the advice of those who say all user-generated content is bad… it’s not. Just ensure that it’s high quality, and you won’t have a problem with Panda from the start.

Commenting

You may have noticed a trend lately: Many blogs and news sites are removing comments from their sites completely. When you do this, though, you’re removing a signal that Google can use that shows how well people are responding to your content. Like any content, comments aren’t all bad simply because they’re comments — their quality is the deciding factor, and this will vary.

And it’s not just the Google perspective that dictates why you should keep them. Having a comment section can keep visitors coming back to your site to check for new commentary, and it can often offer additional insights and viewpoints on the content. Communities can even form around comment sections. And, of course, it adds more content.

But, like user-generated content, you need to make sure you’re keeping it high quality. Have a good comments policy in place; if you’re in doubt, don’t approve the comment. Your goal is to keep those comments high-quality, and if there’s any suspicion (such as a username of “Buy Keyword Now,” or it’s nothing more than an “I agree” comment), just don’t allow it.

That said, allowing low-quality comments can affect the site, something John Mueller has confirmed. I wouldn’t panic over a handful of low-quality comments, but if the overall value of the comments is pretty low, you probably want to weed them out, keep the high-quality comments, and be a little bit more discriminating going forward.

Technical issues

No, technical issues do not cause Panda. However, it’s still a widespread belief that things like page speed, duplicate content, or even what TLD the site is on can have an impact on Panda. This is not accurate at all.

That said, these kinds of technical issues do have an impact on your overall rankings — just not for Panda reasons. So, it’s best practice to ensure your page speed is good, you’re not running long redirect chains, and your URL structure is good; all these things do affect your overall SEO with Google’s core algorithm. With regards to recovering from Panda, though, it doesn’t have an impact at all.

“Core” Algo

One of the surprises was the addition of the core algo comment, where Google revealed to The SEM Post that Panda was now part of the core algorithm. But what does this mean? Is it even important to the average SEO?

The answer is no. Previously, Panda was a filter added after the core search algo. Now, while it’s moved to become part of that core algo, Panda itself is essentially the same, and it still impacts websites the same way.

Google confirmed the same. Gary Illyes from Google commented on it being one of the worst takeaways from all the Panda news.

A2. I think this is the worst takeaway of the past few days, but imagine an engine of a car. It used to be that there was no starter (https://en.wikipedia.org/wiki/Starter_(engine)), the driver had to go in front of the car, and use some tool to start the engine. Today we have starters in any petrol engine, it’s integrated. It became more convenient, but essentially nothing changed.

For a user or even a webmaster it should not matter at all which components live where, it’s really irrelevant, and that’s why I think people should focus on these “interesting” things less.

It really doesn’t make a difference from an SEO’s perspective, despite the initial speculation it might have.

Overall

Google released a lot of great Panda information last week, and all of it contained advice that SEOs can put into action immediately — whether to ensure their site is Panda-proofed, or to fix a site that had been slapped by Panda previously.

The bottom line: Create high-quality content for your websites, and you won’t have to worry about Pandas.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

No-Hype SEO: A Realistic Formula For Making SEO Work For Your Business, Part 2

In part 2 of his two-part series on simplifying your SEO efforts to achieve results, columnist Daniel Faggella shows how you can turn visitors into leads — and leads into sales.

SearchCap: Republican Debate & Google, AdWords iOS App & Adobe Report

Below is what happened in search today, as reported on Search Engine Land and from other places across the web.

Targeted Link Building in 2016 – Whiteboard Friday

Posted by randfish

SEO has much of its roots in the practice of targeted link building. And while it’s no longer the only core component involved, it’s still a hugely valuable factor when it comes to rank boosting. In this week’s Whiteboard Friday, Rand goes over why targeted link building is still relevant today and how to develop a process you can strategically follow to success.

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re going to chat about four questions that kind of all go together around targeted link building.

Targeted link building is the practice of reaching out and trying to individually bring links to specific URLs or specific domains — usually individual pages, though — and trying to use those links to boost the rankings of those pages in search engine results. And look, for a long time, this was the core of SEO. This was how SEO was done. It was almost the start and the end.

Obviously, a lot of other practices have come into play in the industry, and I think there’s even been some skepticism from folks about whether targeted link building is still a valid practice. I think we can start with that question and then get on to some of these others.

When does it make sense?

In my opinion, targeted link building does make sense when you fulfill certain conditions. We know from our experimentation, from correlation data, from Google’s own statements, from lots of industry data that links still move the needle when it comes to rankings. If you have a page that’s ranking number 4, you point a bunch of new links to it from important pages and sites around the web, particularly if they contain the anchor text that you’re trying to rank for, and you will move up in the rankings.

It makes sense to do this if your page is already ranking somewhere in the, say, top 10 to 20, maybe even 30 results and/or if the page has measurable high impact on business metrics. That could be sales. It could be leads. It could be conversions. Even if it’s indirect, if you can observe both those things happening, it’s probably worthwhile.

It’s also okay if you say, “Hey, we’re not yet ranking in the top 20, but our paid search page is ranking on page 1. We know that we have high conversions here. We want to move from page 3 or page 4 up to page 1, and then hopefully up into the top two or three results.” Then it is worth this targeted link building effort, because when you build up that visibility and grow those rankings, you can be assured that you are going to gain more visits and more traffic that will convert, pushing those key business metrics up. So I do think targeted link building still makes sense when those conditions are fulfilled.

Is this form of link building worthwhile?

Is this something that can actually do the job it’s supposed to do? The answer: yeah. Look, if rank boosting is your goal, and you already have a page that’s performing well from a conversion and user-experience standpoint (pages per visit, browse rate, time on site, a low bounce rate), and the page itself is clearly accessible, well targeted, and well optimized, then links are going to be one of the most powerful elements, if not the most powerful, for moving your rankings. But you’ve got to have a scalable, repeatable process to build links.

You need the same thing that we look for broadly in our marketing practices, which is that flywheel. Yes, it’s going to be hard to get things started. But once we do, we can find a process that works for us again and again. Each successive link that we get and each successive page whose rankings we’re trying to move gets easier and easier because we’ve been there before, we’ve done it, we know what works and what doesn’t work, and we know the ins and outs of the practice. That’s what we’re searching for.

When it comes to finding that flywheel, the tactics that still work fit into three categories. I’m not going to get into the individual specific tactics themselves, but they fall into these three buckets. What we’ve found is that for each individual niche, for each industry, for each different website, and for each link builder, each SEO, each one of you out there, there’s a process or combination of processes that works best. So I’m not going to dictate to you which tactic works best, but you’ll generally find them in these three buckets.

Buckets:

One: one-to-one outreach. This is you going out and sending usually an e-mail, but it could be a DM, a tweet, or an @-reply. It could be a phone call. It could be (and I literally got one of these today) a letter in the mail, hand-addressed to me, from someone who’d created a piece of content and wanted to know if I would be willing to cover it. It wasn’t exactly up my alley, so I’m not going to. But I thought that was an interesting form of one-to-one outreach.

It could be broadcast. Broadcast is things like social sharing, where we’re broadcasting out a message like, “Hey, we’ve produced this. It’s finally live. We launched it. Come check it out.” That could go through bulk e-mail. It could go through an e-mail subscription. It could go through a newsletter. It could go through press. It could go through a blog.

Then there’s paid amplification. That’s things like social ads, native ads, retargeting, display, all of these different formats. Typically, what you’re going to find is that one-to-one outreach is most effective when you can build up those relationships and when you have something that is highly targeted at a single site, single individual, single brand, single person.

Broadcast works well if, in your niche, certain types of content or tools or data gets regular coverage and you already reach that audience through one of your broadcast mediums.

Paid amplification tends to work best when you have an audience that you know is likely to pick those things up and potentially link to them, but you don’t already reach them through organic channels, or you need another shot at reaching them from organic and paid, both.

Building a good process for link acquisition

Let’s end here with the process for link acquisition. I think this is kind of the most important element here because it helps us get to that flywheel. When I’ve seen successful link builders do their work, they almost all have a process that looks something like this. It doesn’t have to be exactly this, but it almost always falls into this format. There’s a good tool I can talk about for this too.

But the idea being the first step is opportunity discovery, where we figure out where the link opportunities that we have are. Step 2 is building an acquisition spreadsheet of some kind so that we can prioritize which links we’re going to chase after and what tactics we’re going to use. Step 3 is the execution, learn, and iterate process that we always find with any sort of flywheel or experimentation.

Step 1: Opportunity discovery

We might find that it turns out for the links that we’re trying to get relevant communities are a great way to acquire those links. We reach out via forums or Slack chat rooms, or it could be something like a private chat, or it could be IRC. It could be a whole bunch of different things. It could be blog comments.

Maybe we’ve found that competitive links are a good way for us to discover some opportunities. Certainly, for most everyone, competitive links should be on your radar, where you go and you look and you say, “Hey, who’s linking to my competition? Who’s linking to the other people who are ranking for this keyword and ranking for related keywords? How are they getting those links? Why are those people linking to them? Who’s linking to them? What are they saying about them? Where are they coming from?”

It could be press and publications. There are industry publications that cover certain types of data or launches or announcements or progress or what have you. Perhaps that’s an opportunity.

Resource lists and linkers. So there’s still a ton of places on the web where people link out to. Here’s a good set of resources around customer on-boarding for software as a service companies. Oh, you know what? We have a great post about that. I’m going to reach out to the person who runs this list of resources, and I’m going to see if maybe they’ll cover it. Or we put together a great meteorology map looking at the last 50 winters in the northeast of the United States and showing a visual graphic overlay of that charted against global warming trends, and maybe I should share that with the Royal Meteorological Society of England. I’m going to go pitch their person at whatever.ac.uk it is.

Blog and social influencers. These are folks who tend to run, obviously, popular blogs or popular social accounts on Twitter or on Facebook or on LinkedIn, or what have you, Pinterest. It could be Instagram. Potentially worth reaching out to those kinds of folks.

Feature, focus, or intersection sources. This one’s a little more complex, but the idea is to find an intersection between some element you’re providing through your page’s content and the things that other organizations or people have an interest in.

So, for example, on my meteorology example, perhaps you might say, “Lots of universities that run meteorology courses would probably love an animation like this. Let me reach out to professors.” “Or you know what? I know there’s a data graphing startup that often features interesting data graphing stuff, and it turns out we used one of their frameworks. So let’s go reach out to that startup, and we’ll check out the GitHub project, see who the author is, ping that person and see if maybe they would want to cover it or link to it or share it on social.” All those kinds of things. You found the intersections of overlapping interest.

The last one: biz dev and partnerships. This is certainly not a comprehensive list; there could be tons of other opportunity discovery mechanisms. This covers a lot of them, and a lot of the ones that tend to work for link builders. But you can and should think of many other ways to find new link opportunities.

Step 2: Build a link acquisition spreadsheet

Gotta build that link acquisition spreadsheet. The spreadsheet almost always looks something like this. It’s not that dissimilar to how we do keyword research, except we’re prioritizing things based on: How important is this and how much do I feel like I could get that link? Do I have a process for it? Do I have someone to reach out to?

So what you want is either the URL or the domain from which you’re trying to get the link. The opportunity type — maybe it’s a partnership or a resource list or press. The approach you’re going to take, the contact information that you’ve got. If you don’t have it yet, that’s probably the first thing on your list is to try and go get that. Then the link metrics around this.
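As a rough sketch, those columns can live in a plain CSV. The column names below are my own guesses at headings matching the fields Rand describes, plus a `status` field for tracking outreach; rename them to fit your own workflow.

```python
import csv

# Hypothetical column names modeled on the fields described above:
# target, opportunity type, approach, contact, a link metric, and status.
COLUMNS = ["url_or_domain", "opportunity_type", "approach",
           "contact", "domain_authority", "status"]

def write_acquisition_sheet(path, rows):
    """Write a minimal link-acquisition spreadsheet as a CSV file.

    Each row is a dict keyed by COLUMNS; missing contact info is the
    first gap to fill before any outreach happens.
    """
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=COLUMNS)
        writer.writeheader()
        writer.writerows(rows)
```

From there you can sort or filter the sheet by opportunity type or link metric to prioritize which links to chase first.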

There’s a good startup called BuzzStream that does sort of a system, a mechanism like this where you can build those targeted link outreach lists. It can certainly be helpful. I know a lot of folks like using things like Open Site Explorer and Followerwonk, Ahrefs, Majestic to try and find and fill in a bunch of these data points.

Step 3: Execute, learn, and iterate

Once we’ve got our list and we’re going through the process of actually using these approaches and these opportunity types and this contact information to reach out to people, get the links that we’re hoping to get, now we want to execute, learn, and iterate. So we’re going to do some forms of one-to-one outreach where we e-mail folks and we get nothing. It just doesn’t work at all. What we want to do is try and figure out: Why was that? Why didn’t that resonate with those folks?

We’ll do some paid amplification that just reaches tens of thousands of people, low cost per click, no links. Just nothing, we didn’t get anything. Okay, why didn’t we get a response? Why didn’t we get people clicking on that? Why did the people who clicked on it seem to ignore it entirely? Why did we get no amplification from that?

We can have those ideas and hypotheses and use that to improve our processes. We want to learn from our mistakes. But to do that, just like investments in content and investments in social and other types of investments in SEO, we’ve got to give ourselves time. We have to talk to our bosses, our managers, our teams, our clients and say, “Hey, gang, this is an iterative learning process. We’re going to figure out what forms of link building we’re good at, and then we’re going to be able to boost rankings once we do. But if we give up because we don’t give ourselves time to learn, we’re never going to get these results.”

All right, look forward to your thoughts on tactical link building and targeted link building. We’ll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

5 Questions About Social Marketing for Dr. Diogo Verissimo

Social marketing campaigns aim to influence the behavior of a target group in a way … Answer: Social Marketing is the application of marketing concepts and … This effort to build local support is pivotal, as conservationists know from …

Graphcomment Comment System

Emailed Author: There are issues with your plugin code. Please read this ENTIRE email, address all listed issues, and reply to this email with your corrected code attached. It is required for you to read and reply to these emails, and failure to do so will result in your plugin being rejected.

## Including jquery files (or calling them remotely)

Your plugin has included your own copy of a core jQuery file (or called it remotely, probably from Google or jquery.com).

WordPress includes its own version of jquery and many other similar JS files, which have all been rigorously tested with WP and many of the most common plugins. In order to provide the best compatibility and experience for our users, we ask that you not package your own (especially not an older version) and instead use wp_enqueue_script() to pull in WordPress’s version.

Please review http://codex.wordpress.org/Function_Reference/wp_enqueue_script and update your plugin accordingly. You need to both change your code to use our jQuery and remove the unused files. Remember! Keeping unused files out of your plugins makes them smaller and less potentially vulnerable! If you have any jQuery files included in your plugin that WP core has, just delete them.

Offloading jQuery js, css, and other scripts to Google (or jquery.com or anywhere else frankly) is similarly disallowed for the same reasons, but also because you’re introducing an unnecessary dependency on another site. If the file you’re trying to use isn’t a part of WordPress Core, then you should include it -locally- in your plugin, not remotely. Please check first. We have a LOT of JS files 🙂

If your code doesn’t work with the built-in versions of jQuery, it’s most likely a noConflict issue: WordPress loads jQuery in noConflict mode, so use jQuery instead of $, or wrap your code in a noConflict-safe closure.

If you can’t guess, we -really- want you to use our JS files, and if you can’t, we need to know why so we can fix things for everyone. If you’re just including it because you want to support old versions of WP, or because you think they may not have jQuery, please don’t. If they don’t have the default jQuery, a lot more than your plugin will break. And if they’re on older versions of WordPress, they need to upgrade.

We do not recommend you support anything except the most recent version of WP and one release back. After all, we don’t.

wp_register_script('jquery', plugins_url('/theme/vendors/jquery/dist/jquery.min.js', __FILE__));
wp_register_script('jquery-ui', plugins_url('/theme/vendors/jquery-ui/jquery-ui.min.js', __FILE__), array('jquery'));

Those aren’t needed. Please use ours.
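For reference, a minimal sketch of the replacement the reviewers are asking for, using WordPress's own script handles ('jquery' and 'jquery-ui-core' are core handles; the function name gc_enqueue_scripts is a hypothetical placeholder):

```php
<?php
// Enqueue the copies of jQuery and jQuery UI that ship with WordPress
// core instead of registering bundled files. No file paths are needed;
// core registers these handles for you.
function gc_enqueue_scripts() {
    wp_enqueue_script( 'jquery' );
    wp_enqueue_script( 'jquery-ui-core' );
}
add_action( 'wp_enqueue_scripts', 'gc_enqueue_scripts' );
```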

----

Please make sure you’ve addressed ALL issues brought up in this email. When you’ve corrected your code, reply to this email with the updated code attached as a zip, or provide a link to the new code for us to review. If you have questions, concerns, or need clarification, please reply to this email and just ask us.

(While we have tried to make this review as exhaustive as possible we, like you, are humans and may have missed things. As such, we will re-review the ENTIRE plugin when you send it back to us. We appreciate your patience and understanding in this.)

My Single Best SEO Tip for Improved Web Traffic

Posted by Cyrus-Shepard

Howdy Moz Fans,

After more than 5 years — including an 18-month hiatus as a Moz associate — tomorrow marks my last day working as a Mozzer.

Make no mistake — I love this job, company, and community. Moz has taught me to be a better marketer. Both Rand Fishkin and Sarah Bird (and many others) have taught me more about emotional intelligence and how to treat others than I thought possible of myself. Moz has introduced me to amazing coworkers and industry folk around the world. I’m truly grateful for this experience.

Since my first YouMoz post was accepted for publication by Jen Lopez before I even worked here, I’ve done my best to share SEO tips and tactics to help people advance their marketing and improve online visibility. These posts are truly the thing I’m most proud of.

Time for one last SEO tip, so I hope it’s a good one…

SEO white lies

The beauty of SEO is that, instead of pushing a marketing message onto folks who don’t want to hear what you have to say, you can reverse-engineer the process to discover exactly what people are looking for, create the right content for it, and appear before them at exactly the moment they are looking for it. It’s pull vs. push.

Works like magic. Customers come to you.

Let’s begin this process by telling a lie.

“Content is king.”

Bull hockey. The king doesn’t rule jack squat. A truer statement is this: If content is king, then the user is queen, and she rules the universe. Let’s say that again, because this is important.

“The user is queen, and she rules the universe.”

Google only cares about your content inasmuch as it answers the user’s search query. Search results are not a collection of “good” content; they are a ranked list of content that best satisfies what the user is looking for.

Here’s a typical process many SEOs use when building content:

  1. Conduct keyword research to discover what people are searching for relative to your niche.
  2. Pick a series of high-volume, low-competition phrases.
  3. Build content around these phrases and topics.
  4. Launch and market the page. Build some links.
  5. Watch the traffic roll in (or not).
  6. Move on to the next project.

The shortcoming of this approach is that steps 1–4 are often hit or miss. Google’s Keyword Planner, perhaps the best keyword tool available, is famous for not surfacing most long-tail keywords. Additionally, creating the exact content and building the right links that Google needs to rank you for precise phrases is challenging.

Unfortunately, this is where most people stop.

My advice: Don’t stop there.

This whole process relies on traditional SEO signals, like keyword usage and PageRank (yes, it’s a real ranking factor), to rank your content higher. While these factors remain hugely important, they miss the point of where SEO has already moved.

In our latest Ranking Factors Expert Survey, we asked over 150 top search marketers to rate which factors they see gaining and losing significance in Google’s algorithm. While most traditional SEO features were expected to retain or decrease in influence, user-based features were expected to increase.

In addition to signals like mobile-friendliness, site speed, overall UX, and perceived quality, the factors I want to focus on today include:

  1. Page matches the searcher’s intent: In other words, the page has a high probability of being what the user is actually looking for.
  2. Search engine results clickstream data: This may include measuring the search results that users actually click, as well as the pogo-sticking effect.
  3. Task completion: The user is able to complete the task they set out to do. In other words, their questions have been completely answered.

What I am going to talk about is how to improve all three of these factors for underperforming pages at the same time, using a single technique.

Here’s the tip: Optimize for how users are actually using the page — as opposed to how you optimized the page ahead of time — and you’ll see significantly better traffic.

Once you begin receiving traffic from search engines, you have an incredible amount of data regarding real search visits. If your page receives any traffic at all, Google has already guessed what your content is about — right or wrong — and is sending some traffic to you. In all reality, there is a gap between the traffic you thought you were optimizing for when you created the page, and the traffic you are actually getting.

You want to close that gap. We’ll ask and answer these 3 questions:

  1. Is my content matching the intent of the visitors I’m actually receiving?
  2. Based on this intent, is my search snippet enticing users to click?
  3. Does my page allow users to complete their task?

Here’s how we’re going to do it. I present your SEO homework.

1. Identify your low-to-mid performing pages

This process works best on pages with lower or disappointing traffic levels. The reason you want to stay away from your high-performing pages is the adage: “If it ain’t broke, don’t fix it.”

That’s not to say that high-performing pages can’t be improved, but whenever you make changes to a page you risk ruining the things that work well, so for now we’re going to focus on our under-performers.

The simplest way is to use analytics to identify pages you believe are high quality — and target good keyword phrases — but receive less traffic than you’d expect based on site averages.

For this example I’ll use Google Search Console for my data, although you could use other platforms such as Bing Webmaster or even features found in Moz Pro.

Here’s a picture of our traffic and search queries for Followerwonk. While it’s a good amount of traffic, something looks off with the second URL: it receives 10x more impressions than any other URL, but only gets a 0.25% click-through rate. We’ll use this URL for our process.
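That kind of screening doesn't have to be eyeballed. A minimal sketch, assuming a Search Console export reduced to impressions and clicks per URL (the numbers below are illustrative, loosely mirroring the Followerwonk example):

```python
# Flag pages whose impressions are far above the site average but whose
# click-through rate is far below it. All numbers are made up.
pages = [
    {"url": "/analyze", "impressions": 12000,   "clicks": 900},
    {"url": "/bio",     "impressions": 1000000, "clicks": 2500},
    {"url": "/compare", "impressions": 9000,    "clicks": 700},
]

total_impr = sum(p["impressions"] for p in pages)
total_clicks = sum(p["clicks"] for p in pages)
site_ctr = total_clicks / total_impr          # site-wide click-through rate
avg_impr = total_impr / len(pages)            # average impressions per page

# Underperformers: lots of visibility, well-below-average CTR.
underperformers = [
    p["url"] for p in pages
    if p["impressions"] > 2 * avg_impr
    and p["clicks"] / p["impressions"] < site_ctr
]
print(underperformers)
```

The thresholds (2x average impressions, below site-wide CTR) are arbitrary starting points; tune them to your own site's averages.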

2. Discover mismatches between user intent and content

Next, we want to discover the keyword phrases that surface our URL in search results. Here’s how you do it in Search Console.

After clicking on the URL you wish to analyze, you’ll find a page of data isolating that URL, but it will lack keywords. Now hit the “Queries” tab to see the keywords filtered for this specific URL.

For our Followerwonk URL, we discover an interesting result. The phrase “twitter search” generated a million search impressions, but only 724 clicks. Google believes we deserve to rank for this query, but obviously the page doesn’t offer what people are looking for.

Or does it?

The Followerwonk Bio Search page offers advanced Twitter bio search, complete with lots of advanced options you can’t find on Twitter. It’s reasonable that tons of people searching for “twitter search” would find enormous value in this page. So why the disconnect?

A quick screenshot reveals the heart of the problem.

That’s it — the entire page. Very little explanatory text makes it difficult to quickly grasp what this page is about. While this is an awesome page, it fails in one key aspect for its highest volume search query.

The page fails to satisfy user intent. (At least in a quick, intuitive way.)

So how can we fix this? Let’s move on to the next steps.

3. Optimizing for user intent

Now that we understand how users are actually finding our page, we want to make it obvious that our page is exactly what they are looking for to solve their problem. There are 5 primary areas where this can be accomplished.

  1. Title tag
  2. Meta description
  3. Page title and headers
  4. Body text
  5. Call to action

Rewriting the title tags and descriptions of underperforming pages to include the keyword queries users perform to find your URL can lead to a quick increase in clicks and visits.

Additionally, after you get these clicks, there’s a growing (though still inconclusive) body of evidence that higher click-through rates may lead to higher rankings. In the end, it really doesn’t matter. The whole point is that you get more traffic, one way or another.

The key is to take this data to optimize your search snippet in a way that entices more and better traffic.

Earning the click is only half the battle. After we get the visitor on our site, now we have to convince them (almost immediately) that we can actually solve the problem they came here to find. Which leads to…

4. Improving task completion

Consider this: A user searches for “best restaurants in Seattle.” You want your pizza parlor to rank #1 for this query, but will this satisfy the user?

Likely not, as the user is probably looking for a list of top restaurants, complete with reviews, hours, maps, and menus. If you can offer all of this, as TripAdvisor, OpenTable, and Yelp do, then you’ve helped the user complete their task.

The key to task completion is to make solving the user’s problem both clear and immediate. On our Followerwonk page, this could be accomplished by making it immediately clear that visitors can perform an advanced Twitter search, for free, along with an expectation of what the results would look like.

A standard for task completion can be found by answering the following question: After the user visits this page, will they have completely found what they are looking for, or will they need to return to Google for help?

When the query is satisfied by your website, then you’ve achieved task completion, and likely deserve to rank very highly for the targeted search query.

5. Submit for reindexing

The beauty of this process is that you can see results very quickly. The easiest thing to do is to submit the page for reindexing in Google, which can help your changes appear in search results much faster.

You may see changes submitted this way reflected in search results within minutes or hours. Usually it’s not more than a day or two.

6–7. Measure results, tweak, and repeat

Now that your results are live, you want to measure present performance against past. After a few days or weeks (whenever you have enough data to make statistically significant decisions) you want to specifically look at:

  • Rankings, or overall impressions
  • Clicks and click-through rate
  • Engagement metrics, including bounce rate, time on site, and conversions
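For the "statistically significant" part, a simple two-proportion z-test on before/after CTR is one common sanity check (not the author's stated method, just one option). A minimal sketch with illustrative numbers:

```python
import math

def ctr_z_score(clicks_before, impr_before, clicks_after, impr_after):
    """Two-proportion z-test for a change in click-through rate.
    A |z| above ~1.96 suggests significance at roughly p < 0.05."""
    p1 = clicks_before / impr_before
    p2 = clicks_after / impr_after
    pooled = (clicks_before + clicks_after) / (impr_before + impr_after)
    se = math.sqrt(pooled * (1 - pooled) * (1 / impr_before + 1 / impr_after))
    return (p2 - p1) / se

# Illustrative numbers: CTR moved from 0.25% to 0.40% on similar volume.
z = ctr_z_score(clicks_before=250, impr_before=100000,
                clicks_after=400, impr_after=100000)
print(round(z, 2))
```

If impression volume is low, wait longer before judging; the test only has power once both periods have enough data.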

Warning: You may not get it right the first time. That’s okay. It’s fine to iterate and improve (as long as you don’t destroy your page in the process). In fact, that’s the whole point!

If you follow this process, you may see not only increases in traffic, but improved traffic coming to your site that better aligns what you offer with what the visitor is searching for.

Content that best aligns with user intent is what search engines want to deliver to their users. This is what you want to broadcast to search engines. The results can be rewarding.

Transitions

What’s next for me? In the near term, I’m starting a boutique online publishing/media company, tentatively named Fazillion. (Our aim is to produce content with heart, as we ourselves are inspired by sites like Mr. Money Mustache, Wait but Why, and Data is Beautiful.)

I can’t express enough how much this company and this community means to me. Moving on to the next adventure is the right thing to do at this time, but it makes me sad nonetheless.

Coincidentally, my departure from Moz creates a unique job opening for a talented SEO and Content Architect. It should make a wonderful opportunity for the right person. If you’re interested in applying, you can check it out here: SEO Content Marketer at Moz

Happy SEO, everybody! If you see me walking down the street, be sure to say hi.


New US Presidential “Candidate Cards” Are A Disaster For Google’s Search Quality

An experiment to allow US presidential candidates an unprecedented guaranteed spot at the top of Google’s search results has turned into a megaphone for one.

Please visit Search Engine Land for the full article.

LAPS announces new marketing campaign

The new campaign, created by PMDG Marketing Communications, is designed to capture the playful and positive personality of the loving homeless …

Content Randomizer FREE

This plugin randomly shuffles content around in an article using a simple short code to create dynamic articles and pages that change on each visit.