Tuesday, 28 June 2016

10 Illustrations of How Fresh Content May Influence Google Rankings (Updated)

Posted by Cyrus-Shepard

[Estimated read time: 11 minutes]

How fresh is this article?

Through patent filings over the years, Google has explored many ways that it might use “freshness” as a ranking signal. Back in 2011, we published a popular Moz Blog post about these “Freshness Factors” for SEO. Following our own advice, this is a brand new update of that article.

In 2003, Google engineers filed a patent named Information retrieval based on historical data that shook the SEO world. The patent not only offered insight into the mind of Google engineers at the time, but also seemingly provided a roadmap for Google's algorithm for years to come.

In his series on the “10 most important search patents of all time,” Bill Slawski shows how this patent spawned an entire family of Google child patents, the most recent dating from October 2011.

This post doesn't attempt to describe all the ways that Google may determine freshness to rank web pages; instead, it focuses on the areas we're most likely to influence through SEO.

Giant, great big caveat: Keep in mind that while multiple Google patent filings describe these techniques - often in great detail - we have no guarantee how Google uses them in its algorithm. While we can't be 100% certain, evidence suggests that Google uses at least some, and possibly many, of these techniques to rank search results.

For another take on these factors, I highly recommend reading Justin Briggs' excellent article Methods for Evaluating Freshness.

When “Queries Deserve Freshness”

Former Google Fellow Amit Singhal once explained how “Different searches have different freshness needs.”

The implication is that Google measures all of your documents for freshness, then scores each page according to the type of search query.

Singhal describes the types of keyword searches most likely to require fresh content:


  • Recent events or hot topics: “occupy oakland protest” “nba lockout”

  • Regularly recurring events: “NFL scores” “dancing with the stars” “exxon earnings”

  • Frequent updates: “best slr cameras” “subaru impreza reviews”

Google may determine exactly which queries require fresh content by monitoring the web and their own huge warehouse of data, including:


  1. Search volume: Are queries for a particular term spiking (e.g. “Earthquake Los Angeles”)?

  2. News and blog coverage: If a number of news organizations start writing about the same subject, it's likely a hot topic.

  3. Social media: A spike in mentions of a particular topic may indicate the topic is “trending.”

While some queries need fresh content, other search queries may be better served by older content.

Fresh is often better, but not always. (More on this later.)

Below are ten ways Google may determine the freshness of your content. Images courtesy of my favorite graphic designer, Dawn Shepard.

1. Freshness by inception date

Initially, a web page can be given a “freshness” score based on its inception date; this score decays over time. The freshness boost may help a piece of content rank for certain search queries, but it degrades as the content ages.

The inception date is often when Google first becomes aware of the document, such as when Googlebot first indexes a document or discovers a link to it.


"For some queries, older documents may be more favorable than newer ones. As a result, it may be beneficial to adjust the score of a document based on the difference (in age) from the average age of the result set."
– All captions from US Patent Document Scoring Based on Document Content Update
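To make the decay idea concrete, here's a toy scoring function (purely illustrative - the patent describes the concept but gives no formula, and the 180-day half-life below is an invented number):

```javascript
// Illustrative freshness boost: 1.0 on the inception date, decaying
// exponentially with age. The 180-day half-life is an arbitrary
// assumption, not a figure from any Google patent.
function freshnessBoost(ageInDays, halfLifeDays = 180) {
  return Math.pow(0.5, ageInDays / halfLifeDays);
}
```

Under this sketch, a brand-new page scores 1.0 while a year-old page scores roughly 0.24, so the same content earns a smaller freshness boost as it ages.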

2. Amount of change influences freshness: How much

The age of a webpage or domain isn't the only freshness factor. Search engines can score regularly updated content for freshness differently from content that doesn't change. In this case, the amount of change on your webpage plays a role.

For example, changing a single sentence won't have as big of a freshness impact as a large change to the main body text.


"Also, a document having a relatively large amount of its content updated over time might be scored differently than a document having a relatively small amount of its content updated over time."

In fact, Google may choose to ignore small changes completely. That's one reason why when I update a link on a page, I typically also update the text surrounding it. This way, Google may be less likely to ignore the change. Consider the following:

"In order to not update every link's freshness from a minor edit of a tiny unrelated part of a document, each updated document may be tested for significant changes (e.g., changes to a large portion of the document or changes to many different portions of the document) and a link's freshness may be updated (or not updated) accordingly."

3. Changes to core content matter more: How important

Changes made in “important” areas of a document will signal freshness differently than changes made in less important content.

Less important content includes:


  • JavaScript

  • Comments

  • Advertisements

  • Navigation

  • Boilerplate material

  • Date/time tags

Conversely, “important” content often means the main body text.

So simply changing out the links in your sidebar, or updating your footer copy, likely won't be considered a signal of freshness.


"…content deemed to be unimportant if updated/changed, such as Javascript, comments, advertisements, navigational elements, boilerplate material, or date/time tags, may be given relatively little weight or even ignored altogether when determining UA."

This brings up the issue of timestamps on a page. Some webmasters like to update timestamps regularly - sometimes in an attempt to fake freshness - but the evidence on how well this works is conflicting. Suffice it to say, the freshness signals are likely much stronger when you keep the actual page content itself fresh and updated.

4. The rate of document change: How often

Content that changes more often is scored differently than content that only changes every few years.

For example, consider the homepage of the New York Times, which updates every day and has a high degree of change.


"For example, a document whose content is edited often may be scored differently than a document whose content remains static over time. Also, a document having a relatively large amount of its content updated over time might be scored differently than a document having a relatively small amount of its content updated over time."

Google may treat links from these pages differently as well (more on this below). For example, a fresh “link of the day” from the Yahoo homepage may be assigned less significance than a link that remains more permanently.

5. New page creation

Instead of revising individual pages, fresh websites often add completely new pages over time. (This is the case with most blogs.) Websites that add new pages at a higher rate may earn a higher freshness score than sites that add content less frequently.


"UA may also be determined as a function of one or more factors, such as the number of 'new' or unique pages associated with a document over a period of time. Another factor might include the ratio of the number of new or unique pages associated with a document over a period of time versus the total number of pages associated with that document."

Some webmasters advocate adding 20–30% new pages to your site every year. Personally, I don't believe this is necessary as long as you send other freshness signals, including keeping your content up-to-date and regularly earning new links.

6. Rate of new link growth signals freshness

Not all freshness signals are restricted to the page itself. Many external signals can indicate freshness as well, oftentimes with powerful results.

If a webpage sees an increase in its link growth rate, this could signal relevance to search engines. For example, if folks start linking to your personal website because you're about to get married, your site could be deemed more relevant and fresh (as far as this current event goes).


"…a downward trend in the number or rate of new links (e.g., based on a comparison of the number or rate of new links in a recent time period versus an older time period) over time could signal to search engine 125 that a document is stale, in which case search engine 125 may decrease the document's score."

Be warned: an unusual increase in linking activity can also indicate spam or manipulative link building techniques. Search engines are likely to devalue such behavior. Natural link growth over time is usually the best bet.
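The comparison the patent describes - the rate of new links in a recent window versus an older one - could be sketched like this (illustrative only; the 20% bands are invented numbers):

```javascript
// Sketch: compare new-link counts in a recent window vs. an older
// window of equal length. A falling rate suggests staleness; a rising
// rate suggests freshness. The 20% bands are arbitrary assumptions.
function linkTrend(recentNewLinks, olderNewLinks) {
  if (olderNewLinks === 0) return recentNewLinks > 0 ? "fresh" : "steady";
  const ratio = recentNewLinks / olderNewLinks;
  if (ratio > 1.2) return "fresh";
  if (ratio < 0.8) return "stale";
  return "steady";
}
```

A page that earned 10 links last month after earning 50 the month before would trend "stale" under this sketch, even though it's still earning links.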

7. Links from fresh sites pass fresh value

Links from sites that have a high freshness score themselves can raise the freshness score of the sites they link to.

For example, a link from an old, static site that hasn't been updated in years may not pass the same level of freshness value as a link from a fresh page, e.g. the homepage of Wired. Justin Briggs coined the term “FreshRank” for this concept.


"Document S may be considered fresh if n% of the links to S are fresh or if the documents containing forward links to S are considered fresh."

8. Traffic and engagement metrics may signal freshness

When Google presents a list of search results to users, the results the users choose and how much time they spend on each one can be used as an indicator of freshness and relevance.

For example, if users consistently click a search result further down the list, and they spend much more time engaged with that page than the other results, this may mean the result is more fresh and relevant.


"If a document is returned for a certain query and over time, or within a given time window, users spend either more or less time on average on the document given the same or similar query, then this may be used as an indication that the document is fresh or stale, respectively."

You might interpret this to mean that click-through rate is a ranking factor, but that's not necessarily the case. A more nuanced interpretation might say that the increased clicks tell Google there is a hot interest in the topic, and this page - and others like it - happen to match user intent.

For a more detailed explanation of this CTR phenomenon, I highly recommend reading Eric Enge's excellent article about CTR as a ranking factor.

9. Changes in anchor text may devalue links

If the subject of a web page changes dramatically over time, it makes sense that any new anchor text pointing to the page will change as well.

For example, if you buy a domain about racing cars, then change the format to content about baking, over time your new incoming anchor text will shift from cars to cookies.

In this instance, Google might determine that your site has changed so much that the old anchor text is now stale (the opposite of fresh) and devalue those older links entirely.


"The date of appearance/change of the document pointed to by the link may be a good indicator of the freshness of the anchor text based on the theory that good anchor text may go unchanged when a document gets updated if it is still relevant and good."

The lesson here is that if you update a page, don't deviate too much from the original context or you may risk losing equity from your pre-existing links.

10. Older is often better

Google understands the newest result isn't always the best. Consider a search query for “Magna Carta.” An older, authoritative result may be best here.

In this case, having a well-aged document may actually help you.

Google's patent suggests they determine the freshness requirement for a query based on the average age of documents returned for the query.


"For some queries, documents with content that has not recently changed may be more favorable than documents with content that has recently changed. As a result, it may be beneficial to adjust the score of a document based on the difference from the average date-of-change of the result set."

A good way to determine this is to simply Google your search term, and gauge the average inception age of the pages returned in the results. If they all appear more than a few years old, a brand-new fresh page may have a hard time competing.
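The patent's adjustment - scoring a document by its distance from the result set's average age - might be sketched like this (purely illustrative; Google publishes no such function):

```javascript
// Average age (in days) of the documents in a result set.
function averageResultAge(resultAgesDays) {
  return resultAgesDays.reduce((sum, a) => sum + a, 0) / resultAgesDays.length;
}

// Sketch: how far a candidate page's age sits from that average.
// A larger mismatch suggests a poorer fit for the query, whichever
// direction the gap runs.
function ageMismatch(pageAgeDays, resultAgesDays) {
  return Math.abs(pageAgeDays - averageResultAge(resultAgesDays));
}
```

For a query where the current results average several years old, a brand-new page shows a large mismatch - consistent with the advice above to gauge the age of the ranking pages before expecting a fresh one to compete.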

Freshness best practices

The goal here shouldn't be to update your site simply for the sake of updating it and hoping for better ranking. If this is your practice, you'll likely be frustrated with a lack of results.

Instead, your goal should be to update your site in a timely manner that benefits users, with an aim of increasing clicks, user engagement, and fresh links. These are the clearest signals you can pass to Google to show that your site is fresh and deserving of high rankings.

Aside from updating older content, other best practices include:


  1. Create new content regularly.

  2. When updating, focus on core content, and not unimportant boilerplate material.

  3. Keep in mind that small changes may be ignored. If you're going to update a link, you may consider updating all the text around the link.

  4. Steady link growth is almost always better than spiky, inconsistent link growth.

  5. All other things being equal, links from fresher pages likely pass more value than links from stale pages.

  6. Engagement metrics are your friend. Work to increase clicks and user satisfaction.

  7. If you change the topic of a page too much, older links to the page may lose value.

Updating older content works amazingly well when you also earn fresh links to the content. A perfect example of this is when Geoff Kenyon updated his Technical Site Audit Checklist post on Moz.


Be fresh.

Be relevant.

Most important, be useful.


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

Monday, 27 June 2016

Predicting Intent: What Unnatural Outbound Link Penalties Could Mean for the Future of SEO

Posted by Angular

[Estimated read time: 8 minutes]

As SEOs, we often find ourselves facing new changes implemented by search engines that impact how our clients' websites perform in the SERPs. With each change, it's important that we look beyond its immediate impact and think about its future implications so that we can try to answer this question: "If I were Google, why would I do that?"

Recently, Google implemented a series of manual penalties that affected sites deemed to have unnatural outbound links. Webmasters of affected sites received messages like this in Google Search Console:

[Screenshot: Google Search Console manual action notice for unnatural outbound links]

Webmasters were notified in an email that Google had detected a pattern of "unnatural artificial, deceptive, or manipulative outbound links." The manual action itself described the link as being either "unnatural or irrelevant."

The responses from webmasters varied in their usual extreme fashion, with recommendations ranging from "do nothing" to "nofollow every outbound link on your site."

Google's John Mueller posted in product forums that you don't need to nofollow every link on your site, but you should focus on nofollowing links that point to a product, sales, or social media page as the result of an exchange.

Now, on to the fun part of being an SEO: looking at a problem and trying to reverse-engineer Google's intentions to decipher the implications this could have on our industry, clients, and strategy.

The intent of this post is not to dispute the opinion that this was specifically focused on bloggers who placed dofollow links on product/business reviews, but to present a few ideas to spark discussion about the potential big-picture strategy that could be at play here.

A few concepts that influenced my thought process are as follows:


  • Penguin has repeatedly missed its "launch date," which indicates that Google engineers don't feel it's accurate enough to release into the wild.


  • The growth of negative SEO makes it even more difficult for Google to identify and penalize sites for tactics that the site owners may not have implemented themselves.

  • Penguin temporarily impacted link-building markets in a way Google would want. The decline reached its plateau in July 2015, as shown in this graph from Google Trends:
    [Google Trends graph: declining search interest in “link building”]

If I were Google, I would expect webmasters impacted by the unnatural outbound links penalty to respond in one of these ways:


  1. Do nothing. The penalty is specifically stated to "discount the trust in links on your site." As a webmaster, do you really care if Google trusts the outbound links on your site or not? What about if the penalty does not impact your traffic, rankings, visibility, etc.? What incentive do you have to take any action? Even if you sell links, if the information is not publicly displayed, this does nothing to harm your link-selling business.

  2. Innocent site cleanup effort. A legitimate site that has not exchanged goods for links (or wants to pretend they haven't) would simply go through their site and remove any links that they feel may have triggered the issue and then maybe file a bland reconsideration request stating as much.

  3. Guilty site cleanup effort. A site that has participated in link schemes would know exactly which links are the offenders and remove them. Now, depending on the business owner, some might then file a reconsideration request saying, "I'm sorry, so-and-so paid me to do it, and I'll never do it again." Others may simply state, "Yes, we have identified the problem and corrected it."

In scenario No. 1, Google wins because this helps further the fear, uncertainty, and doubt (FUD) campaigns around link development. It is suddenly impossible to know if a site's outbound links have value because they may possibly have a penalty preventing them from passing value. So link building not only continues to carry the risk of creating a penalty on your site, but it suddenly becomes more obvious that you could exchange goods/money/services for a link that has no value despite its MozRank or any other external "ranking" metric.

In scenarios No. 2 and No. 3, Google wins because they can monitor the links that have been nofollowed/removed and add potential link scheme violators to training data.

In scenario No. 3, they may be able to procure evidence of sites participating in link schemes through admissions by webmasters who sold the links.

If I were Google, I would really love to have a control group of known sites participating in link schemes to further develop my machine-learned algorithm for detecting link profile manipulation. I would take the unnatural outbound link data from scenario No. 3 above and run those sites as a data set against Penguin to attempt 100% confidence, knowing that all those sites definitely participated in link schemes. Then I would tweak Penguin with this training dataset and issue manual actions against the linked sites.

This wouldn't be the first time SEOs have predicted a Google subtext of leveraging webmasters and their data to help them further develop their algorithms for link penalties. In 2012, the SEO industry was skeptical regarding the use of the disavow tool and whether or not Google was crowdsourcing webmasters for their spam team.


"Clearly there are link schemes that cannot be caught through the standard algorithm. That's one of the reasons why there are manual actions. It's within the realm of possibilities that disavow data can be used to confirm how well they're catching spam, as well as identifying spam they couldn't catch automatically. For example, when web publishers disavow sites that were not caught by the algorithm, this can suggest a new area for quality control to look into." - Roger Montti, Martinibuster.com



What objectives could the unnatural outbound links penalties accomplish?


  1. Legit webmasters could become more afraid to sell/place links because they get "penalized."

  2. Spammy webmasters could continue selling links from their penalized sites, which would add to the confusion and devaluation of link markets.

  3. Webmasters could become afraid to buy/exchange links because they could get scammed by penalized sites and be more likely to be outed by the legitimate sites.

  4. The Penguin algorithm could have increased confidence scoring and become ready for real-time implementation.



"There was a time when Google would devalue the PR of a site that was caught selling links. With that signal gone, and Google going after outbound links, it is now more difficult than ever to know whether a link acquired is really of value." -- Russ Jones, Principal Search Scientist at MOZ



Again, if I were Google, the next generation of Penguin would likely heavily weight irrelevantly placed links, and not just commercial keyword-specific anchor text. Testing this first on the sites I think are guilty of providing the links and simply devaluing those links seems much smarter. Of course, at this point, there is no specific evidence to indicate Google's intention behind the unnatural outbound links penalties were intended as a final testing phase for Penguin and to further devalue the manipulated link market. But if I were Google, that's exactly what I would be doing.



"Gone are the days of easily repeatable link building strategies. Acquiring links shouldn't be easy, and Penguin will continue to change the search marketing landscape whether we like it or not. I, for one, welcome our artificially intelligent overlords. Future iterations of the Penguin algorithm will further solidify the “difficulty level” of link acquisition, making spam less popular and forcing businesses toward legitimate marketing strategies." - Tripp Hamilton, Product Manager at Removeem.com

Google's webmaster guidelines show link schemes are interpreted by intent. I wonder what happens if I start nofollowing links from my site for the intent of devaluing a site's rankings? The intent is manipulation. Am I at risk of being considered a participant in link schemes? If I do link building as part of an SEO campaign, am I inherently conducting a link scheme?


So, since I'm an SEO, not Google, I have to ask myself and my colleagues, "What does this do to change or reinforce my SEO efforts?" I immediately think back to a Whiteboard Friday from a few years ago that discusses the Rules of Link Building.


"At its best, good link building is indistinguishable from good marketing." - Cyrus Shepard, former Content Astronaut at Moz





When asked what type of impact SEOs should expect from this, Garret French from Citation Labs shared:



"Clearly this new effort by Google will start to dry up the dofollow sponsored post, sponsored review marketplace. Watch for prices to drop over the next few months and then go back and test reviews with nofollowed links to see which ones actually drive converting traffic! If you can't stomach paying for nofollowed links then it's time to get creative and return to old-fashioned, story-driven blog PR. It doesn't scale well, but it works well for natural links."

In conclusion, as SEOs, we are responsible for predicting the future of our industry. We do not simply act in the present. Google does not wish for its results to be gamed and has departments full of data scientists dedicated to building algorithms to identify and devalue manipulative practices. If you are incapable of legitimately building links, then you must mimic legitimate links in all aspects (or consider a new career).

Takeaways

Most importantly, any links that we try to build must provide value. If a URL links to a landing page that is not contextually relevant to its source page, then this irrelevant link is likely to be flagged and devalued. Remember, Google can do topical analysis, too.

In link cleanup mode or Penguin recovery, we've typically treated unnatural links as obvious when they carry a commercial keyword (e.g. "insurance quotes") as anchor text, because natural links tend to use the URL, brand, or navigational labels instead. It would also be safe to assume that natural links tend to occur in content that is topically related to the destination page, so link relevance should be considered as well.

Finally, we should continue to identify and present clients with methods for naturally building authority by providing value in what they offer and working to build real relationships and brand advocates.

What are your thoughts? Do you agree? Disagree?



Tuesday, 21 June 2016

Context is King: A Million Examples of Creative Ad Campaigns Getting it Right

Posted by Daniel_Marks

[Estimated read time: 6 minutes]


This was one of the first television commercials ever to air: an announcer speaking into a microphone, straight to camera.





Talking to the camera on a mic was the obvious way to leverage television: after all, that's how radio commercials worked. Now, advertisers could just put radio commercials on television. What an exciting new advertising medium!


As it turns out, putting radio commercials on television wasn't really the best use of this new medium. Sound familiar? This seems awfully similar to the current practice of turning your television commercial into a YouTube pre-roll ad. However, the difference this time isn't the media format, which is largely similar (YouTube videos are still video, banner ads are still text + image, podcast sponsorships are still voice, etc.). Instead, the difference is how people are consuming the content; in other words, the context.


A television commercial is a relatively static experience: 30 seconds of video placed within a few appropriate time slots, reaching people in their living room (or possibly bedroom). A Facebook newsfeed ad is a little more dynamic: it can be seen anywhere (home, office, bus, etc.), at any time, by anyone, in almost any format, and next to almost any content. The digital age has basically exacerbated the "problem" of context by offering up a thousand different ways for consumers to interact with your marketing.


But, with great problems comes great opportunity - or something like that. So, what are some ways to leverage context in the digital age?


Intent context


Different channels have different user intents. On one end of the funnel are channels like Facebook and Snapchat that are great fillers of the empty space in our lives. This makes them well-suited for top-of-funnel brand advertising because you aren't looking for something specific and are therefore more receptive to brand messaging (though you can certainly use Facebook for direct marketing purposes).


BuzzFeed, for example, has done a great job of tailoring their Snapchat content to the intent of the channel - it's about immediate gratification, not driving off-channel behaviors.





This feels like you're watching your friend's Snapchat story, not professionally produced branded content. However, it's still the early days for Snapchat - all companies, including BuzzFeed, are trying to figure out what kind of content makes sense for their goals.


As for Facebook, there are plenty of examples of doing brand awareness right, but one of the more famous ones is by A1 Steak Sauce. It was both set and promoted (in part) on Facebook.





Critically, the video works with or without sound.


On the other end of the funnel is something like AdWords: great when you know what you're looking for, not so great when you don't. This subway ad for health insurance from Oscar feels pretty out of place when you use the same copy for AdWords.






Getting intent right means that you need to actually experience your ad as a user would. It's not enough to put a bunch of marketers together in a conference room and watch the YouTube ad you created; you need to feel the ad as a user would - say, watching it in the living room right after clicking a friend's YouTube link from Facebook to see a soccer highlight (or whatever).


Situational context


Situational context (is that redundant?) can be leveraged with a whole range of strategies, but the overarching theme is the same: make users feel like the ad they're seeing is uniquely built for their current situation. It's putting a YouTube star in pre-roll ads on their own channel, quickly tweeting something off the back of a current event, or building digital experiences that are relevant to the sporting event a user is watching. There are thousands of examples of doing this right.





Behavioral context


You might want people on Facebook to watch your video with sound, but the reality is that 85% of Facebook video views are silent. You might want people to watch your brilliant one-minute YouTube ad, but the reality is that 94% of users skip an ad after 5 seconds. You need to embrace user behaviors instead of railing against them, like these smart people:




  • Geico makes an “unskippable” 5-second YouTube ad:

    How do you reach people who skip your commercial after 5 seconds? Make the ad 5 seconds long!


Understanding channel behaviors means not using channel features for the sake of channel features while still taking advantage of behaviors that allow for richer ad experiences. It means using the channel yourself, looking up the relevant research, talking to experts, and making informed decisions about how people will actually engage with your creative work.


Location context


A user's location can prompt geographic-specific advertising (for example, Facebook Local Awareness Ads or in-store Snapchat filters). It can feel gimmicky when used needlessly, but can provide a compelling marketing experience when done right.


Airbnb's slogan is "belong anywhere." One of the ways to feel like a local in a new city is to have locals give you a personal tour - which is exactly what Airbnb provides by targeting people on mobile when they're looking for directions.





Or you can just make use of location services in more straightforward ways, like how the Bernie Sanders campaign targeted his core demographics in New York before the important primary by using Snapchat Geofilters.


However, be careful about inferring location from device - only 17% of mobile searches are on the go.


Audience context


Audience targeting is likely the most powerful form of context provided by digital marketing. You can segment your audience in a thousand different ways - from Facebook Lookalikes to Google Customer Match - that a billboard could only dream of. The more you customize your ad copy to the audience you're targeting, the better off you'll be. (There seems to be a running theme here…)


You could speak directly to the audience of your competitors by targeting branded keywords.




Or better yet, target competitor customers who are about to change services.




Retargeting is another powerful way to use audience context: change your copy to reflect the actions a user has taken on your site (there are more great retargeting examples here).




Then, of course, there are all the obvious ways of leveraging audience, such as adjusting your value proposition, using a slightly different tone, or tweaking the offer you provide.


There's a cliché that the digital age has killed advertising creativity: forget about clever copy or innovative work; it's all about spreadsheets and algorithms now. This couldn't be further from the truth. The Internet didn't kill advertising creativity - it just raised the bar. Content in all its forms (video ads, blog posts, tweets, etc.) will always be important. It might be harder to buy engaged eyeballs for your 30-second commercial online, but content done right can reach millions of people who are voluntarily consuming it. More importantly, though, the Internet lets you engage with your audience in a thousand innovative ways, providing a revamped arena for marketing creativity: context.


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

Monday, 20 June 2016

Using Google Tag Manager to Dynamically Generate Schema/JSON-LD Tags

Posted by serpschris

[Estimated read time: 7 minutes]

One of the biggest takeaways from SearchFest in Portland earlier this year was the rapidly rising importance of semantic search and structured data - in particular Schema.org. And while implementing Schema used to require a lot of changes to your site's markup, the JSON-LD format has created a great alternative to adding microdata to a page with minimal code.

mike arnesen searchfest 2016

Check out Mike Arnesen's deck from his SearchFest talk, "Understanding & Facilitating Semantic Search," for a great overview on using structured data.

What was even more exciting was the idea that you could use Google Tag Manager to insert JSON-LD into a page, allowing you to add Schema markup to your site without having to touch the site's code directly (in other words, no back and forth with the IT department).

Trouble is, while it seemed like Tag Manager would let you insert a JSON-LD snippet on the page no problem, it didn't appear to be possible to use other Tag Manager features to dynamically generate that snippet. Tag Manager lets you create variables by extracting content from the page using either CSS selectors or some basic JavaScript. These variables can then be used dynamically in your tags (check out Mike's post on semantic analysis for a good example).

So if we wanted to grab that page URL and pass it dynamically to the JSON-LD snippet, we might have tried something like this:

Using tag manager to insert JSON-LD with dynamic variables

But that doesn't work. Bummer.

This means that if you wanted to use GTM to add the BlogPosting Schema type to each of your blog posts, you would have to create a different tag and trigger (based on the URL) for each post. Not exactly scalable.

But, with a bit of experimentation, I've figured out a little bit of JavaScript magic that makes it possible to extract data from the existing content on the page and dynamically create a valid JSON-LD snippet.

Dynamically generating JSON-LD

Our first example doesn't work because Tag Manager replaces each variable with a little piece of JavaScript that calls a function, which returns the value of the variable in question.

We can see this error in the Google Structured Data Testing Tool:

JSON-LD Google Tag Manager variable error

The error is the result of Tag Manager inserting JavaScript into what should be a JSON tag - this is invalid, and so the tag fails.
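You can reproduce the failure in isolation: once Tag Manager substitutes a variable, the snippet contains a JavaScript function call, which no JSON parser will accept. In this sketch, the container ID and macro index are hypothetical - they just stand in for whatever GTM renders:

```javascript
// What GTM renders inside the "JSON-LD" snippet is JavaScript, not JSON
// (container ID and macro index below are made up for illustration):
var rendered = '{ "url": google_tag_manager["GTM-XXXXXX"].macro(12) }';

// A strict JSON parser rejects the unquoted function call, just as
// Google's structured data parser does.
var isValidJson;
try {
  JSON.parse(rendered);
  isValidJson = true;
} catch (e) {
  isValidJson = false;
}
```

Since the browser never executes the contents of an `application/ld+json` tag, that embedded function call is never evaluated - it just sits there as invalid JSON.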

However, we can use Tag Manager to insert a JavaScript tag, and have that JavaScript tag insert our JSON-LD tag.

Google Tag Manager JSON-LD insertion script

If you're not super familiar with JavaScript, this might look pretty complicated, but it actually works the exact same way as many other tags you're probably already using (like Google Analytics, or Tag Manager itself).

Here, our Schema data is contained within the JavaScript "data" object, which we can dynamically populate with variables from Tag Manager. The snippet then creates a script tag on the page with the right type (application/ld+json), and populates the tag with our data, which we convert to JSON using the JSON.stringify function.
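In outline, the pattern looks like the sketch below. This is not the exact snippet from the screenshot - the Organization values are placeholders, and in a real GTM tag they would be `{{Variable}}` references:

```javascript
// The data object a GTM tag would populate with variables;
// placeholder values stand in for {{Page URL}} and friends.
var data = {
  "@context": "http://schema.org",
  "@type": "Organization",
  "name": "Example Company",
  "url": "https://www.example.com/"
};

// Create a <script type="application/ld+json"> element, serialize the
// data object with JSON.stringify, and append the tag to the page.
function injectJsonLd(jsonData) {
  var script = document.createElement('script');
  script.type = 'application/ld+json';
  script.text = JSON.stringify(jsonData);
  document.head.appendChild(script);
}

// In the browser, the tag body would end by calling:
// injectJsonLd(data);
```

Because the variables are evaluated inside a JavaScript tag first, the string that finally lands in the `ld+json` tag is plain, valid JSON.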

The purpose of this example is simply to demonstrate how the script works (dynamically swapping out the URL for the Organization Schema type wouldn't actually make much sense). So let's see how it could be used in the real world.

Dynamically generating Schema.org tags for blog posts

Start with a valid Schema template

First, build out a complete JSON-LD Schema snippet for a single post based on the schema.org/BlogPosting specification.

example article schema template
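A template along these lines (with hypothetical values standing in for the real post data - your own template should use your actual URLs, names, and image dimensions) looks something like:

```json
{
  "@context": "http://schema.org",
  "@type": "BlogPosting",
  "mainEntityOfPage": {
    "@type": "WebPage",
    "@id": "https://www.example.com/blog/example-post/"
  },
  "headline": "Example Post Title",
  "image": {
    "@type": "ImageObject",
    "url": "https://www.example.com/images/example.jpg",
    "width": 1200,
    "height": 628
  },
  "datePublished": "2016-06-20T09:00:00+00:00",
  "dateModified": "2016-06-20T09:00:00+00:00",
  "author": {
    "@type": "Person",
    "name": "Jane Doe"
  },
  "publisher": {
    "@type": "Organization",
    "name": "Example Company",
    "logo": {
      "@type": "ImageObject",
      "url": "https://www.example.com/images/logo.png"
    }
  },
  "description": "A short summary of the example post."
}
```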

Identify the necessary dynamic variables

There are a number of variables that will be the same between articles; for example, the publisher information. Likewise, the main image for each article has a specific size generated by WordPress that will always be the same between posts, so we can keep the height and width variables constant.

In our case, we've identified 7 variables that change between posts that we'll want to populate dynamically:
identify schema properties for dynamic substitution by tag manager

Create the variables within Google Tag Manager


  • Main Entity ID: The page URL.

  • Headline: We'll keep this simple and use the page title.

  • Date Published and Modified: Our blog is on WordPress, so we already have meta tags for "article:published_time" and "article:modified_time". The modified_time isn't always included (unless the post is modified after publishing), but the Schema specification recommends including it, so we should set dateModified to the published date if there isn't already a modified date. In some circumstances we may need to re-format the date - fortunately, in this case, it's already in ISO 8601 format, so we're good.

  • Author Name: In some cases we're going to need to extract content from the page. Our blog lists the author and published date in the byline. We'll need to extract the name, but leave out the time stamp, trailing pipe, and spaces.

tag manager extract author name from page markup


  • Article Image: Our blog has Yoast installed, which has specified image tags for Twitter and Open Graph. Note: I'm using the meta twitter:image instead of the og:image tag value due to a small bug that existed with the open graph image on our blog when I wrote this.

  • Article Description: We'll use the meta description.
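The author-name variable above can be sketched as a GTM Custom JavaScript variable like the one below. The selector and the fallback company name are hypothetical (your own byline markup will differ), and GTM expects the function itself to be anonymous:

```javascript
// GTM Custom JavaScript variable: grab the byline's text node only
// (skipping the <time> element), then strip the trailing " | ".
// Selector and fallback value are hypothetical.
var getAuthorName = function () {
  try {
    var byline = document.querySelector('.entry-meta .byline');
    var text = byline.childNodes[0].textContent;
    return text.slice(0, -3); // drop the trailing " | " separator
  } catch (e) {
    return 'Example Company'; // fallback if the selector fails
  }
};
```

The try/catch means the variable always returns something usable, even if a template change breaks the selector.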

Here is our insertion script, again, that we'll use in our tag, this time with the properties swapped out for the variables we'll need to create:

google tag manager json-ld insertion script with dynamic variables

I'm leaving out dateModified right now - we'll cover that in a minute.

Extracting meta values

Fortunately, Tag Manager makes extracting values from DOM elements really easy - especially because, as is the case with meta properties, the exact value we need will be in one of the element's attributes. To extract the page title, we can get the value of the title tag. We don't need to specify an attribute name for this one:

configuring a google tag manager tag to extract the title value

For meta properties, we can extract the value from the content attribute:

configuring a google tag manager tag to extract a meta content value

Tag Manager also has some useful built-in variables that we can leverage - in this case, the Page URL:

Tag Manager Page URL built-in variable

Processing page elements

For extracting the author name, the markup of our site means a straight selector won't work, so we'll need to use some custom JavaScript to grab just the text we want (the text of the span element, not the time element) and strip off the last three characters (" | ") to get just the author's name.

In case there's a problem with this selector, I've also put in a fallback (just our company name) to make sure that a value is returned if the selector fails.

custom JavaScript google tag manager variable to extract and process copy

Testing

Tag Manager has a great feature that allows you to stage and test tags before you deploy them.

google tag manager debug mode

Once we have our variables in place, we can enter Preview mode and head to one of our blog posts:

testing tag manager schema variables

Here we can check the values of all of our variables to make sure that the correct values are coming through.

Finally, we set up our tag and configure it to fire where we want. In this case, we're just going to fire these tags on blog posts:

tag manager trigger configuration

And here's the final version of our tag.

For our dateModified parameter, we added a few lines of code that check whether our modified variable is set and, if it's not, set the "dateModified" JSON-LD variable to the published date.
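That dateModified handling reduces to a small guard. A minimal sketch (the variable names here are mine, not the ones from the screenshot):

```javascript
// If no modified date was extracted from the page, fall back to the
// published date, since the Schema spec recommends always providing
// dateModified.
function resolveDateModified(datePublished, dateModified) {
  return dateModified || datePublished;
}
```

Using `||` treats a missing, null, or empty-string modified date the same way, which matches how an absent meta tag comes through as a GTM variable default.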
You can find the raw code here: https://gist.github.com/chrisgoddard/bbc998efc270929d0a67305d0941c6eb

dynamic schema json-ld tag

Now we can save the tag, deploy the current version, and then use the Google Structured Data Testing Tool to validate our work:

google structured data testing tool validates dynamically generated JSON-LD

Success!

This is just a first version of this code, which serves to test the idea that we can use Google Tag Manager to dynamically insert JSON-LD/Schema.org tags. However, after just a few days we checked in with Google Search Console, and it confirmed that the BlogPosting Schema was successfully found on all of our blog posts with no errors - so I think this is a viable method for implementing structured data.

valid structured data found in Google Search Console

Structured data is becoming an increasingly important part of an SEO's job, and with techniques like this we can dramatically improve our ability to implement structured data efficiently, and with minimal technical overhead.

I'm interested in hearing the community's experience with using Tag Manager with JSON-LD, and I'd love to hear if people have success using this method!

Happy tagging!

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team.
Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

Friday, 17 June 2016

Why Listing Accuracy is Important - Whiteboard Friday

Posted by George-Freitag

Whether you manage a single local listing or hundreds, the consistency of your NAP data across the web can either help grow your business, or serve as a barrier to customer discovery.

With answers for business owners, SEOs dabbling in local search, and those at the enterprise level searching for a broad solution, Moz Local's George Freitag stars in this week's Whiteboard Friday, covering all the boons and secrets of listing accuracy (give him a warm welcome, folks!).

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Hi. My name is George Freitag with Moz Local, and today I wanted to talk about listing accuracy. The reason I wanted to talk about it is that it's one of those topics that is brought up a lot in local search. If you work in local search directly, or if you work with an agency or an SEO on local search, it's one of those things that you've probably heard a lot about: you understand it's something you have to do, you understand that it's very important because it is, but you might not understand exactly what it means, why it's important, how come you have to do it, or why it takes as much time as it does.
So today, I wanted to spend the time to go over why listing accuracy is important, how it works, and how you can do that.

So to start out, let's just look at an actual local search result. Let's say this is a search for what you do - a search for a burger restaurant, if you run a burger restaurant. You've got your three local results. It's on a phone, so that's pretty much all you get, those three listings, and this is where you want to be, but you aren't.

So why aren't you? How come your clients and your competitors are on this search result, but you aren't? It has to do with trust.

A story about trust

So to demonstrate why trust is so important, I want to go over a quick story about when I was looking for a bank a few years ago. I don't go to the bank that often. So when it came time, I got out my phone, punched out Google Maps, and then proceeded to walk for the next seven blocks with my face buried in my phone, checking my Twitter and email and whatever. When the little lady inside told me I'd arrived, I looked up and saw that the place was closed. Not even just closed for the day - it was closed altogether.

So that is Google's greatest fear, because the one reason why you, me, or anyone will stop using Google is if that happens over and over again. If I repeatedly get sent to businesses that don't exist, or if I try to call them and the call information is incorrect, then I'm going to stop using Google, and so Google takes that very seriously.

In fact, Google is putting itself even more on the line with additional business information. If you've done a local search lately - and I assume that you have if you're watching this video - you'll see that in the Knowledge Graph they're giving you all sorts of information. They're telling you whether or not the store is open. They're linking you to reviews. They're just giving a ton of information in their local search results. And if they're not confident in all the information they have, then they're not going to show it, because if they repeatedly show that a place is closed when it's not, then you're going to stop using Google. So Google takes trust very seriously.

Listing accuracy

So how does Google determine trust? That is where listing accuracy comes into play. Listing accuracy is Google's method for determining whether or not it can trust a local business search result.

So to show how that works, let's say this is your business, and let's say that you've already set up your Google My Business page. You've already set up your name, address, and phone - your NAP. If you haven't done anything along those lines, there's plenty of information on our website and in other places online about how to do that.

But let's say that you've already set up your Google My Business. It's absolutely perfect, with your business name, all your hours, and all the images filled out, and it's still not showing up here.
Well, one of the reasons may be because of this concept of listing accuracy.

So here's your listing, but what you might not know of, or might only be kind of aware of, are these other slight variations that exist elsewhere online. Let's say you've got one variation where the address is slightly different - like a suite number that you don't want to have, but sometimes it's mentioned. Maybe it's an old phone number or an old cell phone number that somehow got indexed, or an old tracking number. Maybe it's just some general bad information about where you are, or a website that's slightly different.

Then over here, you've got another variation. Maybe this is a different business name. Maybe this is a business that was in your location before you moved in. So these places just sort of exist out there on the web somewhere, and they might even be in a place you don't know about - an obscure website that you don't ever see, and that you're not really worried about because you know it isn't really being seen by your customers.

So why should you worry about it? Why should you care about a website with bad information, on a source that you're never going to visit, that you know your customers are never going to visit, and that probably isn't driving that much traffic to your website anyway? Well, it has to do with this concept of listing accuracy, because, again, this is how Google measures how much it can trust your information.

The local ecosystem

So over here we've got what we call the local ecosystem. You might've seen our graphic on our website, which explains what the different data sources are. I just want to demonstrate how it works.

So for your listing, Google can get its information from a bunch of different places. One of them, of course, is you - providing your information directly to Google My Business. This can also be the "Report a problem" link in Google Maps, or Google Map Maker.

But in addition to that, there are all these other places that Google knows it can get business information from. Some of these places provide information directly to Google through feeds, and some of them Google just knows about because it can crawl them (basically, it can crawl anything on the web). These are places like:

  • Phone directories
  • Phone books
  • Business directories for specific businesses, like OpenTable or Healthgrades
  • Review sites like Citysearch or Insider Pages
  • News sites
  • A restaurant review about your business
  • Government websites

Each time one of these places mentions your business information, it increases the confidence that Google has in the information that you provided. So if this place and this place both mention you, that works to increase the confidence Google has in the business information you provided, making it more likely to show your business in its search results. So the more times they mention you, the greater confidence it has in your information.

But if you have these other variations sort of floating around out there, then all of a sudden Google's got some conflicting sources about your business information. So let's say that all these places are mentioning you the way that you want to be mentioned, but these places are giving slight variations.
So all of a sudden Google's getting two different addresses, and it's becoming a little bit less confident in the information you've provided. Maybe now this place is giving a completely different phone number, so Google doesn't really want to show you, because it doesn't want to put that call button on your search result.

Each one of these mentions decreases the amount of confidence, and you also lose the opportunity to build confidence in your listing. In fact, if there are enough sources out there saying one thing that contradicts what you're saying on your own Google My Business page, those places can actually override what you're providing, and Google will deem them more trustworthy than the information you're providing directly. So if all these places are saying that you're open 'til six and you're telling Google you're open 'til eight, all of a sudden Google is telling everyone that you're closed when you're not, which can be absolutely detrimental to your business.

So how do you fix this, and what do you do about it? Well, it should be pretty clear from this graph: you want to find all these instances of Google locating business information that is not consistent with yours, and make them consistent. You're going to go out, find the source that's pointing at one of these variations, and fix it so it's pointing at your own listing. Then, all of a sudden, instead of taking away from the confidence and trust Google has in your listing, it's building towards it.

Finding NAP variations

The way you find these variations can just be through some pretty straightforward Google searches. So let's go down here to some examples. You're going to use a quoted search to look for different types of information.

If you already know about some bad information out there - you know there's an old business name that you used to go under, or a variation of your address - you can just do a quoted search for that information directly, find all the sources that bring up that bad information, go to each website, fix it, and then move on. All along, you're building the trust that Google has in your business listing.

For the places you're not yet sure about - like I said, there might be some directories you don't even know about, or some variations of your business information that you might not be aware of, so you can't search for them directly - the best way is to search for your phone number in different formats.

Darren Shaw of Whitespark did a great Whiteboard Friday just a few weeks ago about exactly this: how to find all of your NAP variations in Google. I recommend reading it if you want to follow through with this.

There are also some tools you can use.
Andrew Shotland of Local SEO Guide has a Chrome plug-in, called Local Citation Finder*, that will open up a whole bunch of different variations of your business information in different Chrome tabs, and can really help with that.

*Editor's note: The tool mentioned here is actually called the NAP Hunter Chrome Extension, but Whitespark's Local Citation Finder tool is another fantastic resource to keep in your local SEO toolkit. ;-)

Enterprise-level solutions for cleaning up NAP consistency

So this might work if you've got one, a couple dozen, or maybe a hundred or so businesses - you can probably do this by hand and find all these places yourself. But if you have a ton of businesses - a few hundred or even a few thousand - then this is suddenly not that scalable, and it's time to move to one of these solutions where you're working with some of the primary data sources.

These are the sources that provide the information to all of these places. The big ones are Localeze, Neustar, Infogroup, Acxiom, and Factual. We're getting into the paid options now, but some of these places have been around for decades - they're the ones who provided the phone books with their information. You might have gone to Yelp, tried to sign up, seen that you're already on there, and wondered why. This is why.

They're getting their information from these places, which push out all the information to these other places.

So if you go to one or more of these, either directly or through a service like Moz Local or Yext, you can correct your information on one of these platforms, and it'll push it out to every place in its network. That will correct the information here, which in turn will make its way over to building Google's confidence in your Google My Business listing, increasing the likelihood of Google putting your business in its search results.

If one of these places finds an inaccurate listing, it will correct it. So let's say that this phone book is in Infogroup's network and it encounters the inaccurate data: it will hit it and fix it, and then all of a sudden, instead of hurting you, this is again building confidence in your Google My Business listing.

Another benefit of working with these places is that they'll get you into places you probably weren't even aware of. So, in addition to fixing points of bad data, you're also creating new points of accurate data that didn't exist in the first place, which again builds confidence in your business listing and increases the likelihood of showing up in those local search results.

The last step is maintenance. This is not a one-time thing. Over time this information can be corrupted, because these places not only get their information from a primary data provider, they also get it from each other. In some cases, a primary data source might be crawling the very sites it indirectly provides information to, and if you've ever played a game of telephone, you know how this will end up.
So you do need to go back and revisit these exercises from time to time, looking for business inaccuracies through Google manually or keeping up a relationship with one of these top-level data providers.

So in summary, what you want to do is:

  1. Get your Google My Business listing set up,
  2. Find all the variations and inaccuracies in your data,
  3. Fix them, and
  4. Work with a primary data provider to push out the correct information.

Then all of these places will build up the confidence that you're already providing Google in your Google My Business listing, making it more and more likely for Google to show you in its search results.

Thanks for watching. If you want to keep up with the latest in local search news, you can follow us on our brand-new social accounts, @MozLocal, on both Twitter and Facebook. Once again, my name is George Freitag with Moz Local, and thanks for watching.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

Thursday, 16 June 2016

"A Complete Failure": What Tech Businesses Can Learn from a Sports Blog's Scandal

Posted by RuthBurrReedy

[Estimated read time: 16 minutes]

On February 17th, sports news and opinion blog SB Nation posted a longform story titled "Who is Daniel Holtzclaw?" The story, which took a sympathetic viewpoint on convicted rapist Daniel Holtzclaw, sparked a huge amount of controversy among SB Nation readers and garnered the blog a great deal of media attention.
It took less than 24 hours for SB Nation to pull the piece, calling it “a complete failure.”<br /> </p><p>In the aftermath, SB Nation's parent company, Vox Media, convened a panel of editors and journalists to investigate how a piece taking such a controversial stance on such a sensitive topic could have been published on the site without anyone saying “Hang on, is this a good idea?”<br /> </p><h2>What does this have to do with digital marketing?<br /> </h2><p>When I read the peer review report, I was expecting a fascinating glimpse into the inner workings of a very popular media site. What I wasn't expecting was how many of the review's findings were exactly the kinds of struggles, missteps, and roadblocks I've seen at my and my friends' workplaces over the years. If you're a startup, a rapidly growing business, or a company that relies heavily on technology for communication, chances are you have or will run into many of the same pitfalls that plagued SB Nation.<br /> </p><p>Regardless of your feelings about the case itself or SB Nation's coverage of it, the Vox peer review has some valuable lessons on how a business can avoid inadvertently launching a product (or website, or piece of content) it's not proud of. <a target="_blank" href="https://cdn1.vox-cdn.com/uploads/chorus_asset/file/6553831/Peer_Review_Final.0.pdf">The entire peer review is worth a read</a>, but I've pulled out the biggest lessons for businesses here.<br /> </p><h2>Realistic production schedules<br /> </h2><p>The Longform blog was publishing one longform piece of content every week, a pace that the peer review called “...a furious rate of output that's not conducive to consistently excellent reporting and editing.”<br /> </p><p>In the tech world, there's enormous pressure to ship and keep shipping. For content creators, there is often a quota of pieces per week or per month to be met. 
<strong>An environment that stresses pressure to consistently produce over pressure to do excellent work is one that is ultimately going to result in some inferior products getting pushed out there.</strong> Taking shortcuts in the name of speed also results in <a href="https://en.wikipedia.org/wiki/Technical_debt" target="_blank">technical debt</a> that can take a business years to catch up with.<br /> </p><p><img src="http://d1avok0lzls2w.cloudfront.net/uploads/blog/575e237364ffd2.32026401.jpg"><br /> </p><p class="caption" style="text-align: center; font-size: 10px"><a target="_blank" href="https://pixabay.com/en/startup-start-up-people-593343/">Photo via Pixabay</a><br /> </p><p>“Done is better than perfect” is a common mantra at tech companies, and there's a lot of value in not sweating the small stuff. Scope creep is a real problem that can bog a project down indefinitely. Still, it's important to make sure that your organization isn't encouraging a breakneck pace at the expense of quality.<br /> </p><p>Encourage your team to be honest about how long a project might take, and then build in some squish time to handle the unforeseen (more on that next). Recognize that not all product updates (blog posts, ad campaigns) are created equal, and some will take more time than others. If your team is constantly in “crunch mode” trying to get things out the door, that's a big clue that your pace is too fast and mistakes are going to be missed.<br /> </p><h2>Build in time for Quality Assurance<br /> </h2><p>This is one I've seen time and again around a new website launch. As delays are almost unavoidable when building a new site, the “final product” often isn't ready for review until a day or two before launch is scheduled to happen. It certainly can be possible to test and review a site in that amount of time, but <strong>only if everything is actually ready to go</strong>. 
If the QA review finds any major problems, there's no time to fix them before the launch deadline.<br /> </p><p>At SB Nation, editors and reviewers had between two and four days to turn a longform piece around for publishing before it was slated to go live. This meant heavy revisions were out of the question without pushing back the timeline to publish. Furthermore, the peer review discovered that many pieces were being published with only one other person having reviewed them - and then only from a copy-editing perspective, not for content.<br /> </p><p><img src="http://d1avok0lzls2w.cloudfront.net/uploads/blog/57624722a9eab8.17689237.jpg" alt="Quote: 'A QA process is meaningless without a strategy to fix the problems it uncovers.'" style="border:none;"><br /> </p><p>Even if your team are all geniuses who consistently do a dynamite job, any writer will tell you that it's impossible to proofread your own work. An outside perspective is absolutely vital to finding and fixing the problems any project might have; the person who built it is just going to be too close to the work to do it themselves.<br /> </p><p>Say it with me: <strong>a QA process is meaningless without a strategy to fix the problems it uncovers.</strong> Do not expect that your projects will just need a quick review-and-polish before they're ready to go live. Instead, make sure that you've budgeted enough time to both thoroughly test and review, and assume that the QA process will uncover at least one major issue that you'll need to build in time to fix. In order to do that, you'll have to make sure you:<br /> </p><h2>Don't get married to your deadlines<br /> </h2><p>There are occasions where a hard deadline is unavoidable. You need the new website to be built before your major trade show event, or your blog post is responding to a developing news situation that won't be as relevant tomorrow. 
There are plenty of situations, however, when deadlines are more malleable than you might think. Sometimes, the cost of launching something that's not fully ready to go is much higher than the cost of pushing back a deadline by a week or two.<br /> </p><p><img src="http://d1avok0lzls2w.cloudfront.net/uploads/blog/57624750614964.33998578.jpg" alt="Quote: 'Don't get so focused on “done” that you lose sight of what you're trying to do with your project in the first place.'" style="border:none;"><br /> </p><p>As a leader, one thing you can do is build in internal deadlines throughout the life of a project, so as each deadline approaches you're checking in regularly about whether or not it's realistic. This can mean saying things like “We will need two weeks for QA, so all work on the product will need to be completed by June 15th to launch on July 1st,” or “The blue widget team will need 3 weeks to complete their portion of the project, so if it's not handed off to them by May 25th, we are jeopardizing the launch date.” By setting these internal deadlines, you're building in an early warning system for yourself so you have plenty of time to plan and manage expectations. You'll also set a precedent for treating QA time as sacred, instead of cutting into it when other portions of the project overrun.<br /> </p><p>You should also decide ahead of time what you do and don't need in order to launch something. This is commonly referred to as defining the MVP, or Minimum Viable Product. 
If you have to sacrifice a few bells and whistles to hit your deadline, it may be worth it - but make sure you don't get so focused on “done” that you lose sight of what you're trying to do with this project in the first place.<br /> </p><h2>Be clear about roles and responsibilities<br /> </h2><p>Vox uncovered a problem at SB Nation that is so, so typical at companies that have undergone a period of fast growth: their internal leadership structure had not grown to scale with their increased numbers. The result was confusion about who was responsible for what, and who had the power to say “No” to whom.<br /> </p><p>This type of problem can be especially insidious at an organization with a “flat” org chart, where there aren't a ton of pre-defined internal hierarchies. Too much middle management does run the risk of turning your business into the company from Office Space - but <strong>too little oversight can be just as dangerous. </strong>A clearly-defined org chart can help your employees figure out who they need to contact in a crisis, and make sure that everyone's in the loop who needs to be.<br /> </p><p><img src="http://d1avok0lzls2w.cloudfront.net/uploads/blog/575e225de1eb00.74890088.jpg"><br /> </p><p class="caption" style="text-align: center; font-size: 10px"><a target="_blank" href="https://pixabay.com/en/boss-executive-businesswoman-454867/">Photo via Pixabay</a><br /> </p><p>Taking the time to build some internal structure, define roles and responsibilities, and have a clear process to escalate problems as needed might sound unbearably stuffy for your hip young tech business. You may feel that “everyone knows” who's responsible for what because you're such a tight-knit bunch. 
I'm sure I'm not the only one who felt an unpleasant shiver of recognition when I read this quote from the peer review, though:<br /> </p><blockquote><em>Asked why he thought senior staffers felt they couldn't overrule Stout, Hall told us, “I think people didn't know how... We have a very tight, personality-based workplace. And I think sometimes if you don't know that person - people didn't feel like there was a formal way to do it.”</em><br /> </blockquote><p><strong>Make sure there is a formal way to do it. </strong>Believe me, you will be so happy that you got these things in place before you needed them, instead of waiting until you have a huge, public, embarrassing incident to make you realize you need them.<br /> </p><h2>Empower feedback at a cultural level<br /> </h2><p>There were multiple people within the SB Nation organization who were concerned about the Holtzclaw piece, but most of them felt like it was not their job, or their place, to give feedback. The copy editor who was the first to review the piece specifically said he thought sharing his concerns was “above his pay grade.”<br /> </p><p>It can be difficult to know whether or not criticism or raising concerns is appropriate in a given situation, especially for less-experienced team members who may not be familiar with workplace etiquette. 
This is exacerbated when decisions are made via email chains - a given observer may not want to derail the thread with their concerns, or risk looking foolish or overstepping their bounds in front of their colleagues.<br /> </p><p><img src="http://d1avok0lzls2w.cloudfront.net/uploads/blog/57624779c8d868.97686916.jpg" alt="Quote: 'It won't matter if you ask for feedback if you're not taking the time to support it culturally.'" style="border:none;"><br /> </p><p>As a leader in your organization, pay attention to decision-making processes and make sure that you are creating a space to actively solicit and encourage feedback from the people involved, regardless of their pay grade. You need to pay attention to the “unwritten rules” of your workplace, too - what happens when someone voices a critique? Are they listened to? Taken seriously? Dismissed? Ridiculed? Does change ever come out of it? <strong>It won't matter if you ask for feedback if you're not taking the time to support it culturally.</strong><br /> </p><p>This is one of many instances in which defining and living by a set of company values can be so important. At UpBuild, <a href="http://www.upbuild.io/our-culture/" target="_blank">two of our values are transparency and integrity</a>, so creating a culture where feedback is warmly encouraged fits right in with what we've said we want to do. Knowing that they're taking an action that's supported by the company's core values can encourage someone to risk speaking up when they otherwise might not have.<br /> </p><h2>Avoid isolation and concentration of power<br /> </h2><p><img src="http://d1avok0lzls2w.cloudfront.net/uploads/blog/576248268b8f55.65378603.jpg" alt="Quote: 'There should be no one person who holds so much exclusive knowledge about an area of your business that it wouldn't be able to continue if they left.'" style="border:none;"><br /> </p><p>Most startups have that one person. 
They've been there for a while, maybe since the beginning, and since the business started out lean and mean, they've worn a lot of hats while they've been there. Maybe they helped build some of the systems the company runs on. At any rate, they've got one product, or one area, that is “theirs” alone - nobody else really touches it, and in some cases nobody else really knows exactly what they do with it. What people do know is they have to stay on this person's good side, because any time their work intersects with this person's individual fiefdom within the company, they'll need their cooperation to get it done. In the case of SB Nation's Longform blog, even people who were ostensibly editor Glenn Stout's peers didn't feel empowered to kill a piece he'd championed, because it was “his” blog.<br /> </p><p style="text-align: center;"><img src="http://d1avok0lzls2w.cloudfront.net/uploads/blog/575e21934e2408.57510421.jpg"><br /> </p><p class="caption" style="text-align: center; font-size: 10px"><a target="_blank" href="https://pixabay.com/en/entrepreneur-startup-start-up-man-593353/">Photo via Pixabay</a><br /> </p><p>Additionally, the team was distributed across the country, which meant they used a combination of email and Slack to communicate. According to Stout, “When I first started hearing about Slack, I had no idea even what it was. And then the edict came down that we have to go on Slack. And I had to find out what it was. 
And I would use it occasionally, but I'm just much more comfortable with emails.”<br /> </p><p>What I found so fascinating about this portion of the report was that not only did Stout's non-participation in Slack distance him from his co-workers, giving them less of a sense of what he was working on, <strong>it also created the impression that he was exempt from the rules and expectations that governed much of the rest of the organization.</strong><br /> </p><p>If you recognize an internal fiefdom forming (or know of one that's already there), break it up by building in some redundancies - there should be no one person who holds so much exclusive knowledge about an area of your business that it wouldn't be able to continue if they were to leave. Get other people involved in the project, emphasize cross-team collaboration and information sharing, and make it a requirement for everyone. Even your top performers shouldn't be exempt from the policies and procedures laid out for your organization.<br /> </p><p>Building in redundancies can also help guard against another contributing factor to the SB Nation debacle: the one person who would usually dictate whether or not a piece would be published, the editorial director, was on vacation. Two other editors were meant to be taking on his work in his absence, but neither was sure if they had the power to kill the article. If a core decision-maker is on vacation, there should be other team members conversant enough with his or her work to step in, and everyone should be on the same page about their power to make decisions in his or her absence.<br /> </p><p><img src="http://d1avok0lzls2w.cloudfront.net/uploads/blog/576247f942b486.09224777.jpg" alt="Quote: 'If important discussions are happening [on a tool like Slack], everyone needs to be using it.'" style="border:none;"><br /> </p><p>To combat isolation, be consistent in setting and reinforcing expectations for communication and collaboration. 
Don't just let everyone use communication tools to whatever extent they prefer; figure out which conversations need to be happening in which channel, and then nudge your team to make sure information is being communicated in the right way/place.<br /> </p><p>The informality and high interruption factor of a tool like Slack can turn people off, and it's easy to see participation as voluntary, but if important discussions are happening there, everyone needs to be using it. Set the expectation with your direct reports: “You don't have to look at it constantly, but I expect you to be checking Slack (x) times per day, participating in discussions that concern you, and responding promptly to direct messages.”<br /> </p><p>Speaking of tools:<br /> </p><h2>Pick a project management solution and stick with it<br /> </h2><p><img src="http://d1avok0lzls2w.cloudfront.net/uploads/blog/57624881646e74.40081261.jpg" alt="Quote: 'Adoption of a new tool or process has to be set at a cultural level.'" style="border:none;"><br /> </p><p>Growing companies tend to leave a trail of discarded project management systems in their wake. It can be difficult to get a whole team to fully adopt a new tool, and when you're small, it's pretty easy to keep track of everything that's going on. The challenge comes, once again, with growth - sooner or later a company gets big enough for things to start falling through the cracks. Finding the next new shiny project management tool and rolling it out to great fanfare are easy; getting your team to actually use it is the hard part.<br /> </p><p><strong>Adoption of a new tool or process has to be set at a cultural level.</strong> Build use of the new tool into existing processes in an explicit way, and then reinforce that that's how you expect them to be done from now on. 
I've found it helpful to structure touch-base meetings about a given project around the project management tool (we use <a target="_blank" href="https://trello.com">Trello</a>); that gives the people working on the project incentive to make sure everything's up-to-date before our meeting.<br /> </p><h2>Prioritize empathy<br /> </h2><p>When you're spending all your time thinking about and making something, it can be really hard to think about it in any different way. Most SEOs have encountered this with our clients; people get so used to industry terminology or niche product descriptions that they have a hard time taking a step back and asking, “What are our customers really looking for? What problems are they trying to solve?”<br /> </p><p><img src="http://d1avok0lzls2w.cloudfront.net/uploads/blog/576248c32fec65.60728671.jpg" alt="Quote: 'Empathy should be a required skill and consideration for anyone who is going to be communicating on behalf of your business.'" style="border:none;"><br /> </p><p>Empathy can come into play at many points during the marketing process - user testing, focus groups, persona building, user experience, accessibility concerns, etc. - but is of particular importance when dealing with sensitive subjects. There are countless examples of businesses who have, for example, <a href="http://blog.hubspot.com/blog/tabid/6307/bid/9286/Learning-From-Kenneth-Cole-s-Social-Media-Mistake.aspx" target="_blank">tried to take an “edgy” tone on social media and wound up in a media firestorm</a>. Empathy should be a required skill and consideration for anyone who is going to be communicating on behalf of your business.<br /> </p><p>As with empowering feedback, prioritizing empathy is something that can be best done at a cultural level. 
You may decide you want to <a href="https://moz.com/about/tagfee" target="_blank">make empathy part of your core values</a>, but even if you don't, you should clearly define your company's stance on addressing sensitive topics - and the tone and brand voice you've defined as part of your overall brand strategy should reflect that as well. Make empathy part of the training and onboarding process for anyone who will be communicating about your brand, and for all higher-up and senior staff, and you'll start to see the cultural shift.<br /> </p><h2>Fail fast, apologize faster<br /> </h2><p>The peer review covered a lot of “what went wrong,” but let's close on what went right. It took SB Nation less than 24 hours to realize their mistake, take down the piece, and issue an apology - not a “we're sorry if you were offended” apology, but a real, honest-to-goodness, “this was a mistake and we are heartily sorry” apology.<br /> </p><p><img src="http://d1avok0lzls2w.cloudfront.net/uploads/blog/576248de504233.76380569.jpg" alt="Quote: 'The rules are simple, and you probably learned them in elementary school: 1.) Say you're sorry. 2.) Show you understand why what you did was wrong or hurt somebody. 3.) Don't do it again.'" style="border:none;"><br /> </p><p>It would have been easy to beat a hasty retreat, engage in some quick and public firings, and sweep the whole thing under the rug. Instead, SB Nation and parent company Vox Media proved that their apology was sincere by taking concrete steps to figure out what had happened and how they could prevent it from happening again.<br /> </p><p>If you take nothing else away from this, learn from this master class in public apology. The rules are simple, and you probably learned them in elementary school: 1.) Say you're sorry. 2.) Show you understand why what you did was wrong or hurt somebody. 3.) Don't do it again.<br /> </p>What lessons did you take away from this peer review? 
In addition to everything I've outlined above, it also served as an important reminder for me that there are many places to find lessons to be learned, and that they're often outside the tech bubble we so often find ourselves in.<br /><div style="clear:both; padding-bottom:0.25em"></div> </p> </div> <p class="post-footer"> <em>posted by Unknown @ <a href="http://webleadsuk.blogspot.com/2016/06/a-complete-failure-what-tech-businesses.html" title="permanent link">16:56</a></em>   <a class="comment-link" href="http://webleadsuk.blogspot.com/2016/06/a-complete-failure-what-tech-businesses.html#comment-form"><span style="text-transform:lowercase">0 Comments</span></a> </p> </div> <!-- End .post --> </div></div></div> <!-- End #main --> </div> <!-- End #content --> </body> </html>