Category Archives: SEM & SEO

How to master copywriting for SEO

In 2018, you need to understand copywriting and SEO – and a whole lot more – to write content that will rank well and return a great ROI.

If you have a head for marketing, UX and research, too, you’ll be in a commanding position. As our discipline evolves in response to a changing search engine landscape, demarcation lines become blurred, and it’s been difficult not to venture into featured snippets, schema and other on-page aspects of SEO.

Instead, with proper focus, you'll need to know your audience and how they'll read your content; what they'll be looking for; the continuing role of high-quality, in-depth content; where historic offline copywriting skills still live on today; why you should still use key phrases; and why structure is important.

How will your audience read your content in 2018?

Google’s recent announcement of the first set of sites being migrated to mobile-first indexing reflects the fact that the majority of searches worldwide are carried out on mobile devices. My direct experience is that the move to mobile is very much in the B2C space; less so in B2B, where people are still at their desks with their laptops or desktops.

And then, we see the start of an explosion in voice search and devices – our smartphones and home devices from Google, Amazon and Apple – reading content to us.

Of course, we're still seeing how voice pans out, and its implications for SEO copywriting, but I'd say stick to simple language and shorter sentences within a well-structured piece, and make the main points right up front in case the listener's attention wanders.

High-quality, in-depth content

However your audience interacts with your work, it needs to be excellent. Make your content unique, high quality and written to professional standards, and Google will reward you. Buying 300-500-word spun monstrosities was never a great idea; today, it shouldn't even cross your mind. They'll kill your SEO and content marketing ambitions stone dead.

While we’re thinking about copy lengths, one popular strategy recently has been to write a longer piece than those above you in the rankings. Theirs is 2,000 words? Then leapfrog them by writing 2,500!

Of course, it’s not as simple as that. Take a look at the webpages above you in the SERPs. How good are they? Are they well-written? Do they answer the questions customers are asking? Do they understand searcher intent and how to respond to it?

If the 2,000-worder in your sights fails on any or all of these factors, you may be able to kick the ball out of the park with a shorter, tighter, laser-targeted 1,500-worder.

Writing shorter pieces for mobile’s smaller screens may be tempting. Don’t, though. You’ll lose out to those more extensive pieces, written without such an artificial restriction. Instead, leave it to your UX people, designers and developers to get the presentation right.

Write for people

Now that Google can understand the words on a page, you have to raise your writing game. Get your grammar and stylistic chops up with the best and Google should reward you for it. But don’t forget your audience. Deliver them precisely what they’re looking for.

Before you start writing, ask yourself:

  • Who is your audience?
  • Where is their pain?

Put yourself in their mind; imagine how they will react to your content.

You may want to go the whole hog and spend time developing Personas. Personally, I’m happy to use them if there’s the budget and someone else to do most of the donkey work. Otherwise, I find I can usually visualize the target group more easily than the series of sometimes-unconvincing individuals that can come out of the Persona-building exercise.

Bridging the offline past with the online present

Let’s see how the long-established rules of copywriting work in today’s SEO copywriting environment.

  • Do your research: Advertising industry king David Ogilvy stressed the fundamental importance of research in producing great copy some 50 years ago – decades before the age of keyword research or the internet. Don't forget the keyword research, though – more on that later.
  • Write an attention-grabbing headline based on related key phrases from your research.
  • Involve the reader further with subheads – don’t skimp on them, either.
  • Make it easy for the reader: In addition to inserting subheads, write in short paragraphs and short sentences. And ensure you put spaces between paragraphs.
  • Calls to action: No matter how good your copy, you’ll need a CTA to see the full return on your investment, through sign-ups, purchases or other goal fulfilments.
  • Treat editing as separate from writing: Get some time between the two processes and see your work with new eyes. If you’re writing more than a couple of screens of copy, consider printing out your work. You’ll see it entirely differently.
  • Get someone else to read your work: They’ll notice your mistakes and pick out where you’re unclear.
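The "short paragraphs and short sentences" advice is easy to sanity-check mechanically before you hand copy to an editor. A rough sketch (splitting on terminal punctuation is an approximation and will trip over abbreviations like "e.g."):

```python
# Rough check of average sentence length for the "short sentences"
# rule above. Splitting on . ! ? is an approximation - abbreviations
# will fool it.
import re

def avg_sentence_length(text):
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    if not sentences:
        return 0.0
    return sum(len(s.split()) for s in sentences) / len(sentences)

copy = "Keep sentences short. Readers skim. Long winding sentences lose them."
avg = avg_sentence_length(copy)
```

Anything averaging much above 20 words per sentence is a candidate for tightening, especially if the piece will be read aloud by a voice assistant.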

Ranker: How to make a Google algorithm-proof website

Any SEO or webmaster who has ever had a website affected by a Google algorithm change – or feared being affected by one – has probably wished that they could find a way to make their website “algorithm-proof”.

Still, surely there’s no such thing as a website that’s never impacted by Google algorithms, right? As long as your site is indexed by Google, it’s at the mercy of the algorithms that Google uses to determine website ranking, all the more so if you happen to rely heavily on organic search traffic for your business.

The art – or science – of search engine optimization is about determining as best you can what those algorithms are looking for, and giving it to them.

Yet one website believes it has found the formula for making its content “Google algorithm-proof”. Ranker is a website made up of dynamic, crowdsourced lists that users can vote on, about everything from pop culture to geography, history to sports, celebrities to science.

And according to its CEO, Clark Benson, Ranker has never suffered a negative effect from a Google algorithm change, growing its traffic steadily without interruption over the course of eight and a half years.

Search Engine Watch caught up with Benson to find out Ranker’s secret to success, and whether there is a formula for creating an algorithm-proof website.

Rankings, not review sites

So what is Ranker, exactly?

“Ranker’s primary reason for being is to crowdsource anything that makes sense to rank,” says Benson. “Any topic that people are really interested in.

“The unique angle that we’ve pursued is that instead of having this being one 23-year-old blogger’s opinion of the best new TV shows of the year, or whatever it happens to be, we would have a dynamic list that visitors could vote on, potentially add items to, and re-rank.

“The end result is a very wisdom-of-crowds-based answer which is always changing and dynamically moving along as tastes change, and as more people vote on things.”
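The mechanic Benson describes can be sketched in a few lines (the items and the net-vote scoring are illustrative assumptions; Ranker's actual algorithm is not public):

```python
# Minimal sketch of a crowd-voted, dynamically re-ranked list.
# The items and the net-score formula are illustrative assumptions;
# Ranker's real ranking algorithm is not public.

def rank(items):
    """Order items by net score (upvotes minus downvotes), best first."""
    return sorted(items, key=lambda i: i["up"] - i["down"], reverse=True)

shows = [
    {"name": "Show A", "up": 120, "down": 40},
    {"name": "Show B", "up": 95, "down": 10},
    {"name": "Show C", "up": 200, "down": 180},
]

ordered = rank(shows)  # "Show B" leads on net score

# A new batch of votes reorders the list the next time it renders.
shows[2]["up"] += 100
reordered = rank(shows)  # "Show C" now tops the list
```

The point is that the ranking is a pure function of the running vote totals, so the page's content shifts every time the crowd weighs in.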

Voting on a list of ‘Historical events you most want to go back and see’ on Ranker

Lists have been a time-honored draw for magazines and other print media over the years, but it was when the internet came along that they really exploded – spawning dozens of list-oriented viral websites and the much-mocked listicle, which became a staple of online journalism. However, Benson – a self-described “lifelong list nerd” – was frustrated by the fact that these lists only ever represented one person’s opinion.

In a similar vein, he found review websites unhelpful, as user-generated reviews represented a single person’s subjective opinion in a format that wasn’t conducive to making a decision.

“Part of the reason to build Ranker was my frustration with review sites, because when I’m looking for an answer to something, like which TV show to watch, I don’t want to read a lot of text reviews.

“I also feel that in typical five-star rating systems, everything tends to be clustered around three and a half to four stars, so you don’t get any true granularity on what is best.”

In a world increasingly “cluttered with choices”, therefore, Benson was convinced that rankings were “the simplest way to dissect a choice in a category, without losing the credibility of the answer”. And so he built Ranker as a website where the wisdom of the crowd could determine the ultimate ranking for any list of items, on any topic.

The secret to Ranker’s SEO success: Content freshness

Since Ranker’s launch in 2009, the site has amassed more than 100,000 rankings across dozens of broad categories, encompassing almost any topic that people could have a passion for.

When the website first launched, however, it had very few resources, and Benson explains that he had to learn SEO from scratch in order to give the website a strong foundation.

Luckily, earning traffic was never a problem for the site, because the type of content published on Ranker was uniquely suited to catering to Google’s algorithms.

“We’ve never been hit by any algorithm changes – we’ve always grown our organic search traffic year over year over year, steadily, for the eight and a half years we’ve been live.

“You never exactly know what works in SEO, because Google doesn’t tell you what works, but I’ve always believed that the best intelligence on what to do comes from the public statements Google makes – their best practices.

“And one of the key factors that Google says is in their index is freshness of content. Content has a lifespan. In our case, because our rankings are dynamic and always changing – people are adding things to them, voting things up and down – this makes for perpetually fresh content.

“We have a lot of content that is six, seven, even eight years old that is still doing as well as it was years ago, and in some cases it’s even growing in traffic.”

One of Ranker’s most evergreen pieces of content is a list ranking the ‘Best Movies of All Time’ – which is more than 5,000 items long.

“Obviously that’s a topic that there’s a lot of passion and a lot of competition for [in search rankings]. And in the last few years, we’ve been on the top three or so results on Google for that term.

“We’ve watched that page just grow in rankings over the span of seven or eight years. I can only guess it’s because the page is always changing.”

User-curated content

At the time of writing, Ranker's front page is spotlighting a list of best-dressed celebs at the 2018 Oscars, a ranking of the best TV episode names, and a list of possible game-changing deep-space observations to be made by the Webb Telescope.

Anyone can add an item to a list on Ranker, although Ranker’s content is not purely user-generated. Ranker has an editorial team which is made up of people who, in Benson’s words, “have a mind for cataloging things” rather than people who specialize in writing a lot of prose.

Lists are typically started off by one of Ranker’s editors, and when a user wants to add a new item to a list, it’s cross-referenced with Ranker’s database, a huge data set made up of more than 28 million people, places and things. If the item isn’t found in the database, it’s added to a moderation queue.
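That add-item flow can be sketched roughly like this (the entity set and function names are hypothetical stand-ins, not Ranker's code):

```python
# Hypothetical sketch of the add-item flow described above: a submitted
# item is checked against a known-entities database; unknown items go
# to a moderation queue instead of straight onto the list.

KNOWN_ENTITIES = {"The Godfather", "Citizen Kane", "Casablanca"}  # stand-in for the 28M-item database
moderation_queue = []

def add_item(list_items, name):
    if name in KNOWN_ENTITIES:
        list_items.append(name)
        return "added"
    moderation_queue.append(name)
    return "queued for moderation"

best_movies = []
add_item(best_movies, "Casablanca")      # known entity: goes straight on the list
add_item(best_movies, "My Home Movie")   # unknown: held for review
```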

Rather than UGC (user-generated content), therefore, Benson thinks of Ranker’s lists as something he terms UCC – user-curated content.

How did Ranker build such a huge data set? Beginning in 2007, a company called Metaweb ran an open source, collaborative knowledge base called Freebase, which contained data harvested from sources such as Wikipedia, the Notable Names Database, Fashion Model Directory and MusicBrainz, along with user-submitted wiki contributions.

This knowledge base made up a large part of Ranker's data set. What's interesting is that Metaweb – and Freebase with it – was later acquired by none other than Google, and Freebase became the foundation of Google's Knowledge Graph.

Additionally, not every list on Ranker is crowdsourced or voted on. Some lists, such as Everyone Who Has Been Fired Or Resigned From The Trump Administration So Far, don’t make sense to have users voting on them, but are kept fresh with the addition of new items whenever the topic is in the news.

Can other websites do ‘Ranker SEO’?

Benson acknowledges that Ranker’s setup is fairly unique, and so it isn’t necessarily possible to emulate its success with SEO by trying to do the same thing – unless you just happen to have your own crowdsourced, user-curated list website, of course.

With that said, there are still some practical lessons that website owners, particularly publishers, can take away from Ranker’s success and apply to their own SEO strategy.

Are keywords still relevant to SEO in 2018?

What a useless article! Anyone worth their salt in the SEO industry knows that a blinkered focus on keywords in 2018 is a recipe for disaster.

Sure, I couldn't agree more, but diving into the subject uncovers some interesting issues.

If you work in the industry you will no doubt have had the conversation with someone who knows nothing about SEO, who subsequently says something along the lines of:

“SEO? That’s search engine optimization. It’s where you put your keywords on your website, right?”

Extended dramatic sigh. Potentially a hint of aloof eye rolling.

It is worth noting that when we mention ‘keywords’ we are referring to exact match keywords, usually of the short tail variety and often high-priority transactional keywords.

To set the scene, I thought it would be useful to sketch out a polarized situation:

Side one:
Include your target keyword as many times as possible in your content. Google loves the keywords*. Watch your website languish in mid-table obscurity and scratch your head wondering why it ain't working – it all seemed so simple.

(*not really)

Side two:
You understand that Google is smarter than just counting the number of keywords that exactly match a search. So you write for the user… creatively, with almost excessive flair. Your content is renowned for its cryptic and subconscious messaging.

It’s so subconscious that a machine doesn’t have a clue what you’re talking about. Replicate results for Side One. Cue similar head scratching.

Let’s start with side one. White Hat (and successful) SEO is not about ‘gaming’ Google, or other search engines for that matter. You have to give Doc Brown a call and hop in the DeLorean back to the early 2000s if that’s the environment you’re after.

Search engines are focused on providing the most relevant and valuable results for their users. As a by-product, they have been – and are – actively shutting down opportunities for SEOs to manipulate the search results through underhanded tactics.

What are underhanded tactics? I define them as tactics that don't provide value to the user and are employed only to manipulate the search results.

Here’s why purely focusing on keywords is outdated
Simply put, Google's search algorithm does far more than count the number of keyword matches on a page. It's more advanced than assessing keyword density, too. The voracious digital Panda was the first really famous update to signal to the industry that Google would not accept keyword stuffing.

Panda was the first, but certainly not the last. Since 2011 there have been multiple updates that have herded the industry away from the dark days of keyword stuffing to the concept of user-centric content.
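To see how crude the old approach was, here is the kind of keyword-density calculation those early tools relied on – a simplified sketch that counts only exact surface matches and knows nothing about synonyms or intent:

```python
# Simplified keyword-density calculation of the kind Panda-era SEO
# tools relied on. It only counts exact surface matches - to this
# metric, "cheap flights" and "low-cost airfare" are unrelated.
import re

def keyword_density(text, keyword):
    words = re.findall(r"[a-z']+", text.lower())
    hits = sum(1 for w in words if w == keyword.lower())
    return hits / len(words) if words else 0.0

text = "Cheap flights here. Our cheap flights are the cheapest flights."
density = keyword_density(text, "flights")  # 3 of 10 words
```

A page "optimized" to maximize this number reads terribly to a human, which is precisely the behavior Panda set out to punish.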

I won’t go into heavy detail on each one, but have included links to more information if you so desire:

Hummingbird, Latent Semantic Indexing and Semantic Search
Google understands synonyms; that was relatively easy for them to do. They didn’t stop there, though. Hummingbird helps them to understand the real meaning behind a search term instead of the keywords or synonyms involved in the search.

RankBrain
Supposedly one of the three most important ranking factors for Google, RankBrain is machine learning that helps Google, once again, understand the true intent behind a search term.

All of the above factors have led to an industry that is focused more on the complete search term and satisfying the user intent behind the search term as opposed to focusing purely on the target keyword.

As a starting point, content should always be written for the user first. Focus on task completion for the user – or, as Moz described it in their Whiteboard Friday, 'Search Task Accomplishment'. Keywords (or search terms) and associated phrases can be included later if necessary; more on this below.

Writing user-centric content is about more than just ranking for keywords. For a lot of us, we want the user to complete an action, or at the very least return to our website in the future.

Even if keyword stuffing worked (it doesn’t), you might get more traffic but would struggle to convert your visitors due to the poor quality of your content.

So should we completely ignore keywords?
Well, no – and that's not me backtracking. All of the above advice is legitimate; the problem is that it just isn't that simple. The first point to make is that if your content is user-centric, your keywords (and related phrases) will more than likely occur naturally.

You may have to play a bit of a balancing act to make sure that you don't end up on 'Side Two' mentioned at the beginning of this article. Google's algorithm is very clever, but in the end it is still a machine.

If your content is a bit too weird and wonderful, it can hamper your ability to attract the appropriate traffic, simply because Google can't work out which search terms to rank your website for.

This balancing act can take time and experience. You don’t want to include keywords for the sake of it, but you don’t want to make Google’s life overly hard. Experiment, analyse, iterate.

Other considerations for this more 'cryptic' content are how it is applied to your page and its effect on user experience.

The rise of personal searches: How can content marketers take advantage?

As marketers in the ever-changing world of digital, success depends on knowing what consumers want and expect from us. After all, it’s the only way we can deliver.

So, it’s interesting to see that a recent data release from Google tells us that personalized search is becoming more and more prominent among internet users.

No longer are they turning to friends and family for personal advice and recommendations, but search engines too.

Of course, we already knew that… that’s why we work so hard at getting to know our audience and understanding their micro-moments and pain points, delivering the right content at the right time, in the right way.

But what Google is telling us is that rather than searching, “How often should you wash your hair?”, we are now searching “How often should I wash my hair?”. Changing those two little words is making the way that we use search engines far more personal than ever before.

And the data suggests that consumers now truly trust that their most specific needs can be answered by content on the web. In fact, Google reports that mobile searches using "…for me" have grown by a huge 60% over the last two years.

On top of this, they have also seen an 80% increase in mobile searches including “…should I?”. As a result, we really are treating search as one of our best, most trusted friends.

And that’s great news for content marketers.

For those of us working in motor, beauty, finance, fitness and pet care, it seems that this new insight is especially relevant – these are the industries in which users are most frequently turning to Google to solve their personal pain points.
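If you want to gauge how strong this trend is for your own audience, one option is to segment your search-query reports by these personal markers. A quick sketch (the marker list is an illustrative assumption, not an official taxonomy):

```python
# Illustrative sketch: flag queries that use the personal phrasing
# Google's data highlights ("...should I", "...for me"). The marker
# list is an assumption, not an official taxonomy.

PERSONAL_MARKERS = ("should i ", " for me", "how much do i")

def is_personal(query):
    q = " " + query.lower().strip() + " "
    return any(m in q for m in PERSONAL_MARKERS)

queries = [
    "how often should I wash my hair",
    "how often should you wash your hair",
    "best running shoes for me",
]
personal = [q for q in queries if is_personal(q)]
```

Tracking the personal share of your queries over time shows whether this shift is happening in your niche too.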

How can we prepare and optimize our content for these types of search?


Calculators and tools

Creating calculators and tools is a brilliant way of targeting personal search terms and providing our users with the personalized response they are looking for. Let's use a fitness example to demonstrate this:

This recent data release from Google suggests that users are now searching for something like "how much water should I drink each day?" in higher volumes than something like "how much water should you drink per day?".

Now, most of us know that the answer to this question will depend on a number of different factors including gender, body composition, activity level and so on.

What our audience is expecting from this search is a personalized answer that takes all of these things into consideration and tells them exactly how much water they should personally be drinking each day.

A water consumption calculator would do this well, and if the user wants the specificity of an individual result, they will be willing to fill in the necessary personal details to retrieve it. A blog post that simply states the average recommended fluid intake for a man or a woman as recommended by the NHS is no longer user focused enough.
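The core of such a calculator can be sketched in a few lines. The constants below are a commonly cited rule of thumb (roughly 33ml per kg of body weight, plus extra for exercise) – an illustration, not medical advice:

```python
# Hypothetical core of a daily water-intake calculator. The constants
# are a widely cited rule of thumb (about 33ml per kg of body weight,
# plus ~350ml per 30 minutes of exercise) - an illustration, not
# medical advice.

def daily_water_ml(weight_kg, exercise_minutes=0):
    base = weight_kg * 33
    extra = (exercise_minutes / 30) * 350
    return round(base + extra)

intake = daily_water_ml(weight_kg=70, exercise_minutes=45)
```

Wrap this in a simple form asking for weight and activity level, and the page delivers the individual answer the "…should I" searcher came for.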

Case studies and testimonials

Providing personalized content will not always be easy, and at times users may need encouragement to spend a little longer on a page to find the personalized answer they are looking for. In this instance, case studies and testimonials are a great way to push users further through their journey in the right direction.

For example, “How much money do I need to retire?” is a more complex question than our fitness example. There are so many variants that could alter the accurate and personalized response to this question, so it’s difficult to answer it quickly in a personalized way.

However, if we provide users with a testimonial or case study at the right stage in their journey – one that was created after a lot of persona research and uses someone or a situation that will resonate with them – they are likely to engage with the content.

Creating engagement via a case study will increase the likelihood that they’ll enquire with your brand for a more personalized answer, continuing their journey on their way to the personalized answer they are looking for.

Seasonal SEO and evergreen URLs: How to drive seasonal traffic year-round

Now that Christmas and the New Year are well and truly behind us, it’s time to think about next year!

While it might seem like an odd time to start planning for the holidays, this time of year is the perfect occasion to reflect on what went well during the last holiday season, how to build on it, and the steps you can take to drive seasonal traffic all year round.

Why is seasonal traffic so important?

Seasonal website traffic isn't a gimmick, or something to think about only a few months before the event. Many companies rely on these peak buying periods to balance their books and smooth out their revenue across the year – so it requires a dedicated strategy.

Interest around shopping online continues to increase year on year, with a greater swing towards mobile devices and shopping ‘on the go’. Connection speeds are faster and websites are optimizing for speed.

They’re prioritizing mobile viewing in many cases and the experience is often so rapid and easy that the concerns around clunkiness and security that once plagued online sales are quickly diminishing (if not non-existent for savvy users).

A blend of great discounts, quick deliveries, press coverage, advertising buzz and good timing has meant that events such as Black Friday and Cyber Monday (ironically both now dominated by online sales in the UK) are now cornerstones in many businesses’ revenue streams.

In this article, we’ll look into how some of the basics can help you slip ahead of competitors.

Permanent (evergreen) URLs

Staying active all year round plays a vital role in the success of many seasonal and time sensitive campaigns. We so often hear:

  • “Should I set up a new page for XYZ event?”
  • “We’re offering 20% off this weekend – do we need a new page?”
  • “Performance is up, so we thought… more categories!”

Well, there's rarely a quick answer: plenty of factors need to be taken into account to provide a considered (and correct) response. The trick is, this isn't just about SEO – it rarely ever is! You have to consider all the below factors (and more) when making a new URL:

  • Time taken to manage and tag products appropriately
  • What do you hope it will rank for?
  • Will it cannibalise other keyword targeting categories?
  • Does it need to be indexed or is it for PPC/Email campaigns?
  • Will you add internal links to it – where will they go post-season?
  • Is the page going to generate backlinks?
  • Can the page be used all year (for example /clearance instead of /2018-aw-sale)?
  • Will you be printing this URL on brochures/leaflets, etc?
  • Can it be short and snappy?

What is an evergreen URL?

An evergreen URL is an address on your site that doesn’t need to change – see it as a permanent addition to your site’s internal architecture. A good example of this is a /sale page. The associated event may not always be active – but the equity of the page is not sporadically redirected to other URLs on the site throughout the year.

The dreaded dated URL

Avoid dating the URL – fashion sites are often the worst offenders for /aw16 or /ss17 (with the abbreviations standing for Autumn/Winter and Spring/Summer respectively). How about just /new-arrivals, or going super short with /new-in?
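If you've inherited dated URLs, the cleanest fix is a permanent (301) redirect to the evergreen address. Here's a sketch of the mapping logic – the season-code patterns and the /clearance target are assumptions about a typical fashion site:

```python
# Sketch: map dated seasonal URLs (/aw16, /ss17, /2018-aw-sale ...) to
# a single evergreen path with a permanent (301) redirect. The URL
# patterns and the /clearance target are illustrative assumptions.
import re

DATED = re.compile(r"^/(?:(?:aw|ss)\d{2}|\d{4}-(?:aw|ss)-sale)$")

def redirect_for(path):
    """Return (target, status) for dated URLs, or None to serve as-is."""
    if DATED.match(path):
        return ("/clearance", 301)
    return None

redirect_for("/aw16")          # dated: permanently redirected
redirect_for("/new-in")        # already evergreen: served as-is
```

In practice you'd express the same mapping in your server or CMS redirect rules, so the link equity the old URLs earned is consolidated onto the page that never changes.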

But it’s not just category URLs that need attention and stability. There are a variety of pages that benefit from a carefully planned approach – next we’ll take a look at one of the most successful pieces of seasonal marketing (across multiple platforms) and how it impacts potential organic performance.

The search impact of Christmas adverts

Christmas adverts in the UK are a sign that the festive season is here… or they may just be a premature annoyance that definitely didn’t make me cry that one time!

Regardless, there are a few lone examples of where using a carefully considered (permanent) URL can be a viable source of generating natural links and help sell a story (…plus some merchandise).

John Lewis Christmas advert

The widely anticipated release of John Lewis’ Christmas advert is an annual event that is fast rivaling the Coca Cola lorries in terms of seasonal buzz. Other retailers have since latched onto its success and diluted the impact of these emotional shorts, but for the last three years John Lewis did something that really worked.

The below graph from Ahrefs shows how the URL received links from referring domains. Many of the links came from large influential sites including The Guardian, Huffington Post, BBC and HubSpot.

Naturally, these links occur shortly after the release of each year’s advert. This not only provided the site with authority and trust, but also provided a large amount of referral traffic.

The drop-off from these links is minimal and the pages themselves were well-crafted. What’s more, the URL itself never changed – no 404s, no redirects.

Something was missing this Christmas…

2017 saw a change to John Lewis’ approach with a separate sub-directory for content. The URL is far less marketable and the Christmas advert is less prominent. There seems to be a focus on the more commercial aspects of Christmas and event ideas, which is both a shame and a lost opportunity as the new URL has received far less buzz (as you might expect).

Competitors and other big brands have attempted a similar execution but are also being held back by inefficient URLs and a need for a little more magic. Some of the best near misses can be seen below (if any 404 or redirect to the homepage when you’re reading this, it only backs up my point!):

Everything you need to know about the Google Chrome ad blocker

Google launches a new version of its Chrome web browser today (February 15), which will include an in-built ad blocker to try and eradicate intrusive ads from the browsing experience.

There are some clear standards and some unanswered questions relating to this new approach, so what exactly do marketers need to know?

Google announced last year that certain ad types would be blocked automatically within Chrome. This seemingly seismic update is due to go live today in the latest upgrade to the world’s most popular web browser.

The integration of an ad blocker within Google Chrome is just a small part of a much bigger movement to improve the quality of online advertising, however.

This has been driven by consumers, who are increasingly frustrated with ads that interrupt and distract them from the content they want to view. As people spend more time on mobile devices and advertisers invest more in video, that tension has only heightened.

The survey results in the image above tally with the findings from Google’s own research. Axios revealed recently that Google has found two concerning trends when analyzing user behavior on Chrome:

  1. One-in-five Chrome feedback reports mentions annoying/unwanted ads
  2. There were 5+ billion mutes from people using Google’s “mute this ad” feature in 2017

Of course, this has led to huge growth in the adoption of ad blockers over the last few years. Consumers have found these to be an easy and convenient solution, but this is not a permanent stance.

There is a widespread acceptance that if advertisers can provide some value to consumers, the latter will be much more receptive to the messaging.


Worryingly for advertisers and publishers, the growth in mobile ad blocker usage has been very notable and that trend has been particularly marked in the Asia-Pacific region over the past 12 months.

Many publishers have implemented “ad block walls”, which do not allow access to their content for users with an ad blocker installed. That approach is only a stop-gap measure and does not strike at the heart of the issue, however.

It is pretty clear which way the wind is blowing, so Google is aiming to take a modicum of control over the prevailing trend rather than ignore it altogether. Third-party ad blockers, after all, might also end up blocking ads from the Google Display Network.

Moreover, Chrome accounts for 62% of the mobile browser market and 59% of desktop, so it certainly has the clout to make a difference.

And yet, there is a fine balance to strike here between permitting the ads that fuel so much of the digital economy, while precluding those that are overly intrusive. Google, of course, has much to lose if it adopts an overzealous approach, but much to gain if it can become the arbiter of the correct standards for digital advertising.

Which ads will be affected?

The standards by which the Chrome ad blocker will operate are based on the guidelines set by the Coalition for Better Ads. Google is on the board that sets these regulations, but so are many other influential bodies, including the Association of National Advertisers, Unilever, and Facebook.

This collective set out to pinpoint the ad experiences that consumers found to be overly negative when browsing. The research (which can be viewed here) revealed certain types of ad that are most typically tied to negative experiences.

The desktop web experiences that will be affected are:


While the mobile ad types that will be affected are:

Of course, these are broad categories and there are levels of sophistication within each. Google has added the stipulation that publishers have a 7.5% non-compliance threshold before their ads are blocked.

There is also an element of common sense to be applied here. We have all been subjected to the kinds of ads that this initiative targets, whether they are full-screen auto-play videos or pop-up ads that feel impossible to close.

How will Google enforce this?

Significantly, Google estimates that just 1% of publishers will be affected in the short-term by the new ad blocker. It would be fair to say that the approach to cutting out sub-par ads has more in common with a scalpel than an axe. After all, Google knows better than anyone that advertising supports the vast majority of what we see online.

Wes MacLaggan, SVP of Marketing at Marin Software, commented to Search Engine Watch that:

These new standards are meant to create a better user experience for consumers, and ultimately encourage fewer ad blocking installations. In the short term, we’ll see some ad formats and advertisers shut off. These advertisers and publishers will need to invest in more quality ads, while publishers will no longer be able to rely on monetizing with intrusive formats.

Google will also alert sites that are at the “warning” or “failing” level on its scale, to provide an opportunity to clean up their ads. The search giant reports that 37% of sites that were initially in violation of their standards have since made changes to improve the quality of their ads.

Websites that violate the new standards will be given 30 days to remove the offending ads from their sites or Google will block their ads.

5 Powerful Tips For SEO On A Budget


In my early days as a marketer, I used to dream about having an unlimited budget to implement all my ideas. OK, let me be honest: I still do that sometimes. I do it for my own digital marketing agency, Idunn, as well as for clients whose businesses I truly believe in.

But unlimited budgets are just that: a dream.

Even the biggest corporations in the world have a limited budget (albeit the limit is quite high).

So I snap out of it and work on coming up with the best strategies within the budget our clients or I have.

And you know what?

It’s actually quite rewarding!

I love looking back on how much we managed to achieve with so little. We work with a lot of bootstrapping startups, so we actually have a knack for making things work on a tight budget.

SEO on a budget is by far one of the most challenging and common issues for small and medium-sized companies. But that doesn't mean it can't be done.

5 tips for excellent SEO results on a budget

Let’s take a look at how we can maximize optimization even with budget constraints.

1. Take a close look at your keyword strategy

I wrote a lot about choosing the right keywords here, but let me summarize this for you: try to go for keywords that are both easy to rank for and relevant for your business.

Here’s an example: it’s hard to rank for “hotel in Paris”, but it’s much easier to rank for “hotel near the Eiffel Tower”. This comes with the added bonus of sending you qualified leads, i.e. the people who are most likely to book your hotel.

Granted, you will get fewer visitors than if you ranked for “hotel in Paris”. But the strategy above won’t cost you an arm and a leg. And, after all, why should you care about the visitors who don’t turn into customers anyway?

2. Make sure all your information is correct

This is vital for local businesses, but also very important for any type of company. Make sure that your address, phone number, email address, contact person and ZIP code are identical on every platform you use, from Yelp to Facebook and your own website.

Create a Google My Business listing for an added boost. This way, when people near you search for your products or services, Google can return your listing as a result.
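To illustrate the consistency check, here is a minimal sketch of how you might flag mismatched business details programmatically. The platforms and listing data below are entirely hypothetical:

```python
# Hypothetical example: platform names and listing details are made up.
# Normalizes each listing's NAP (name, address, phone) fields and flags
# any platform whose details differ from your website's.

import re

def normalize(value: str) -> str:
    """Lowercase, strip punctuation and collapse whitespace for comparison."""
    return re.sub(r"\s+", " ", re.sub(r"[^\w\s]", "", value.lower())).strip()

def find_mismatches(reference: dict, listings: dict) -> list:
    """Return (platform, field) pairs that differ from the reference listing."""
    mismatches = []
    for platform, listing in listings.items():
        for field, ref_value in reference.items():
            if normalize(listing.get(field, "")) != normalize(ref_value):
                mismatches.append((platform, field))
    return mismatches

website = {"name": "Acme Bakery", "phone": "555-0100", "zip": "10001"}
listings = {
    "Yelp":     {"name": "Acme Bakery",     "phone": "555-0100", "zip": "10001"},
    "Facebook": {"name": "Acme Bakery Inc", "phone": "555-0100", "zip": "10001"},
}

print(find_mismatches(website, listings))  # → [('Facebook', 'name')]
```

Anything the check flags is worth correcting by hand on the platform in question.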

3. Write for humans

Yes, keywords are important. But not as important as keeping your readers engaged. If you take a look at the 17 factors that impact ranking, you will see that most of them speak about a great user experience.

Bounce rate, source of traffic, time spent on page and many others indicate that an unnatural writing style will chase off your visitors.

This is not 2010. Google bots now understand user intent. And, thanks to innovations like Alexa and Siri, search has become more conversational.

A user is more likely to search “how do I make a chocolate cake from scratch” than “chocolate cake” today. That’s because they also know that the latter search may send them to a bakery shop. If users get specific, you have no reason not to.

4. Outsource SEO tasks

I know what you’re thinking: outsourcing means paying. And we’re on a tight budget, remember?

Of course I do!

But the kind of writing that gets you on the good side of search engines isn’t the 500-word blog post anymore. You need to go long form and in-depth. This means tons of research and a lot of time spent putting together memorable and informative pieces of 1,500+ words.

And time is money. If you get this done in house, you are still paying an employee for it.

Most of the copywriting clients we work with say the same thing: it’s much cheaper to outsource to a reliable agency than to pay a full-time employee for it. Plus, it’s more easily scalable. When your budget runs out, you can pull the plug or limit your investment in content – without firing anyone.

5. Optimize and link everything

It’s quite common to have a superbly optimized blog post and forget about the smaller things. Meta descriptions, alt tags, image tags and more are equally important.

They tell Google bots that your content is relevant for the keyword you chose more clearly than an extra paragraph in your copy.

The same goes for linking. If you’re on a budget and links from other domains are an issue, make sure you do a lot of internal linking.

It’s perfectly free and incredibly powerful. Whenever you write a new blog post, link to some of your previous ones. Ideally, the anchor text should be the same as the keyword of the article in question.

This is how you signal to search engines that your article is relevant for a certain keyword. The more links to it, the better its ranking.
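As a rough sketch of how you could audit existing posts for internal-link opportunities, the snippet below matches each article's target keyword against the body text of older posts. The URLs and keywords are invented for illustration:

```python
# Hypothetical example: URLs and keywords are invented for illustration.
# Given each article's target keyword, flag older posts that mention a
# keyword in their body text but don't yet link to the article ranking for it.

keyword_map = {
    "chocolate cake recipe": "/blog/chocolate-cake-recipe",
    "seo on a budget": "/blog/seo-on-a-budget",
}

posts = {
    "/blog/baking-tips": {
        "text": "Our chocolate cake recipe is a reader favorite.",
        "links": [],
    },
    "/blog/marketing-101": {
        "text": "Doing SEO on a budget is possible.",
        "links": ["/blog/seo-on-a-budget"],  # already linked, so skipped
    },
}

def link_opportunities(posts, keyword_map):
    """Yield (post_url, keyword, target_url) where a link could be added."""
    out = []
    for url, post in posts.items():
        for keyword, target in keyword_map.items():
            if keyword in post["text"].lower() and target not in post["links"]:
                out.append((url, keyword, target))
    return out

print(link_opportunities(posts, keyword_map))
# → [('/blog/baking-tips', 'chocolate cake recipe', '/blog/chocolate-cake-recipe')]
```

Each result is a post where you could add an internal link with the keyword as anchor text.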


Great SEO is not something that happens overnight. It’s something that you have to work on continuously. Even if you had an unlimited budget, you’d still have to constantly add new texts and review your links.

The key here is being patient. It may take you a while to see tangible results, but they will come if your work is up to par.

Whatever you do, don’t try black-hat techniques. It may be appealing to hire someone who promises to get you to the first position for the most competitive of keywords for a measly $200. But you won’t be ranking high for more than a week! After that, Google will bury your website so deep that you’ll have to buy a new domain in order to get another chance at visibility.

How to Set Up Custom Intent Audiences in AdWords


“Consumers are more curious, more demanding, and more impatient than ever. . . AdWords has been redesigned to help you reach these mobile-first consumers in faster and easier ways. Today, we’re introducing more innovations available only in the new experience.” –Anthony Chavez, Director of Product Management, AdWords

Last summer, Google added an array of new features to the AdWords platform including a new interface that Google noted was, “. . . the most powerful change [they’ve] made to how advertisers visualize and manage their campaigns in over 15 years.”

Following such bold changes, Google introduced exciting new AdWords features like promotion extensions, ad variations, and new opportunities to meet business goals.

What has many excited, however, is the new custom intent audiences.

In mid-November Google announced a variety of new sales-driving AdWords components, including custom intent audiences.

Custom intent audiences enable businesses to leverage the Google Display Network (GDN) to, in Google’s words, “…make it easy for you to reach people who want to buy the specific products you offer–based on data from your campaigns, website and YouTube channel.”

Google explained the effects of the new audience option as follows:

The system works by employing machine learning technology to analyze a user’s current or previous AdWords efforts to produce a custom audience to target.

The automatically generated audience is made up of the most frequently visited URLs and most frequently searched keywords for a given product or service.

While this may sound like a wholly automated marketing solution, users do have some sovereignty over the process: custom intent audiences can either be automatically created by Google or built manually by the advertiser.

Custom intent audiences give both novice and expert advertisers the tools to successfully expand beyond the bounds of Google Display Network’s canned audience groups.

No matter which option you feel more comfortable using, each presents the distinct potential for entering scads of new, prospective consumers into a business’s sales funnel.

Where to Find Custom Intent Audiences

Once you have navigated to the Display campaign portion of the interface, you can head to the audience page to see both types of custom intent audiences.

Start by creating or selecting an ad campaign to run. Next, select the “Targeting” button just below that.

From here, you will be able to select “Intent;” this can be found sandwiched between the “Affinity” and “Remarketing” options.

Now you will be asked to choose between the automatically generated custom intent audience or to create your own.

Auto-Generated Custom Audiences

While crafting a custom audience is within the wheelhouse of some marketers, others might not feel so confident in the process.

For these folks, utilizing the automatically created audience is likely to be more their speed.

After selecting “Custom intent audiences: auto-created,” users will be presented with a myriad of possible audience options.

This is the defining feature of custom intent audiences, as opposed to the topic or placement-based options Display Network users have had up until this point.

Creating A Custom Audience

If you have opted to craft your own audience, after selecting the “Intent” option, click the blue “+” icon found near the words, “New Custom Intent Audience.”

Enter keywords and URLs relevant to the products or services your ideal audience is researching. With all your URLs and keywords in place, select “Create.”

You will then be taken back to the previous screen; here you can analyze your campaign’s estimated reach.

Feel free to play with your audience criteria until you have generated a reach you find suitable.

This high level of audience detail and identification provides business owners with a much more refined method for reaching prospects.

Get familiar with this new feature now, as it can help your brand earn tons of new leads and sales.

Will your business opt to leverage custom intent audiences? If so, do you plan on creating your own, or will you let Google do the heavy lifting?


How to check your Domain Authority: 4 tools to use


Domain Authority (DA) is a metric that serves as a handy heuristic in the SEO industry. 

Put simply, it provides insight into how likely a site is to rank for specific keywords, based on the SEO authority it holds.

There are numerous tools that can help us arrive at these useful scores.

Below, we round up some of the most accurate and intuitive ways to see a site’s SEO equity.

With few insights into how Google’s algorithms really work for organic search, the lure of a metric like Domain Authority is self-evident.

It provides a glimpse into the SEO “strength” of a website, in a similar fashion to the now obsolete PageRank toolbar.

Google still makes use of some variation of the PR algorithm internally, but its scores are no longer visible to the public.

If anything, they encouraged some negative attempts to “game” Google’s rankings through link acquisition.

However, many SEOs make use of Domain Authority to sense-check the quality of their inbound links, and to understand how these are affecting their own site’s SEO health.

What is Domain Authority?

“Domain Authority (DA) is a search engine ranking score developed by Moz that predicts how well a website will rank on search engine result pages (SERPs).

A Domain Authority score ranges from one to 100, with higher scores corresponding to a greater ability to rank.

Domain Authority is calculated by evaluating multiple factors, including linking root domains, number of total links, MozRank and MozTrust, into a single DA score.

This score can then be used when comparing websites or tracking the “ranking strength” of a website over time.” – Moz.

Ultimately, this is a representative model of how Google decides which pages should rank for each query, and in what order they should rank.

As is the case with the term ‘relevance’, authority covers a very broad area of assessment that is open to interpretation.

Domain Authority aims to cut through that ambiguity by providing a metric that can compare the SEO strength of different websites based on a consistent methodology.

Although marketers are aware that DA has intrinsic limitations as a metric, it is at least a barometer of whether our SEO efforts are gaining traction or not.

When prospecting for new links, for example, it is helpful to check the DA of external sites before contacting the site about a potential partnership.

Combined with a range of other metrics – both qualitative and quantitative – Domain Authority can therefore guide brands towards more effective SEO decisions.

‘Domain Authority’ was devised by Moz and they have naturally taken ownership of this name.

Their suite of tools (some of which are discussed in this article) will reveal the authority of particular domains, but dozens of other free tools use Moz’s API to show these scores too.

However, a couple of other SEO software packages provide a slightly different view on a domain’s SEO strength.

Moz’s scores are based on the links contained within its own index, which is undoubtedly smaller than Google’s index of URLs.

Other SEO software companies, such as Majestic and Ahrefs, have their own index of URLs.

These indexes will largely overlap with each other, but there are still questions to pose to your chosen provider:

  • Index size: How many URLs are contained within the software’s index?
  • Frequency of index crawling: How often is the index refreshed?
  • Live links: Are there common instances of ‘false positives’, where inactive links are reported with 200 status codes?
  • Correlation with actual rankings: Simply, does a higher domain score equate to better rankings?

The importance of these questions, and the resultant significance of their answers, will depend on a brand’s context.

Nonetheless, these are points worth considering when assessing the scores your site receives.

Each of the main players in this space has subtle distinctions within its methodology, which will be important for most SEOs.

We will begin our round-up with the Moz tools (some of them free) that will show the Domain Authority for any site, before looking at a couple of alternatives that provide a valuable reference point.

Moz (MozBar, Open Site Explorer)

It should be clear that Moz is the major contender when it comes to checking a domain’s SEO authority.

We included MozBar on our list of the best Google Chrome extensions for SEO and it deserves its place in this list, too.

MozBar will highlight the Domain Authority of any site a user is browsing, along with the Page Authority (PA) of that particular URL.

As the name suggests, PA applies a similar methodology to DA, but localized to a particular URL rather than a domain.


This is also available in search results pages, making it possible to see whether a site’s Domain or Page Authority correlates with higher rankings for particular queries.

As such, these two metrics in combination are a great starting point for investigations into the quality and quantity of backlinks pointing to a domain.

Marketers should be aware, however, that these scores do fluctuate.

That should be viewed as a positive, as the scores are an increasingly accurate reflection of how Google is evaluating sites.

Moz employs machine learning algorithms to re-calibrate the authority scores based on link activity across its index, but also the impact that certain types of link have.

We can consider this an attempt to peg the Moz index to that of Google, and we know the latter is tweaked thousands of times a year.

Therefore, we should be careful about the causal links we infer from DA scores.

When tracking Domain Authority, always benchmark against similar sites to avoid viewing this as an absolute indication of how well you are performing.

By viewing it as a relative metric instead, we can gain a healthier insight into whether our strategy is working.

This is where another Moz-owned tool, Open Site Explorer, proves its worth.

Open Site Explorer uses a range of proprietary Moz metrics to highlight the areas in which specific sites under- or over-perform. The side-by-side comparisons it creates are an intuitive way to spot strengths and weaknesses in a site’s link profile on a broader scale.


Moz’s Domain Authority is undoubtedly useful – especially when used as an entry point into deeper investigation.

MozBar and Open Site Explorer provide access to this metric for all marketers, so they should be viewed as the go-to resources for anyone seeking a check on their site’s SEO ranking potential.


Ahrefs

Ahrefs boasts an index of over 12 trillion links and data on 200 million root domains, making it an invaluable repository for SEOs wanting to understand their site’s SEO performance.

The two metrics that matter within the scope of this article are URL Rating (UR) and Domain Rating (DR).

We can consider these Ahrefs’ equivalents to Page Authority and Domain Authority, respectively, at least in terms of their purpose.

The latter is defined by Ahrefs as “a proprietary metric that shows the strength of a target website’s total backlink profile (in terms of its size and quality).”

It appears frequently within the software interface, in examples like the one in the screenshot below:


So, why would you use the Ahrefs DR score over Moz’s DA calculation? Their definitions do seem strikingly similar, after all.

As always, the detail is critical. If we refer back to our initial points for consideration, it becomes possible to compare Ahrefs with Moz:

  • Index size
  • Frequency of index crawling
  • Live links
  • Correlation with actual rankings

Both Moz and Ahrefs have invested significantly in improving the size, quality and freshness of their link data.

Some SEOs have a preference for one over the other, and their scores do vary significantly on occasion.

Those that prefer Ahrefs typically do so for the freshness of its index and DR’s correlation with actual rankings.

The clarity of the Ahrefs methodology is also very welcome, right down to the number of links typically required.

In Ahrefs’ own words: “To put things simply, we calculate the DR of a given website the following way:”

  1. Look at how many unique domains have at least 1 dofollow link to the target website;
  2. Take into account the DR values of those linking domains;
  3. Take into account how many unique domains each of those websites link to;
  4. Apply some math and coding magic to calculate “raw” DR scores;
  5. Plot these scores on a 0–100 scale (which is dynamic in nature and will “stretch” over time).
Ahrefs also indicates the approximate number of linking domains typically required to reach each DR band:

  • DR 0–20: 20
  • DR 20–40: 603
  • DR 40–60: 4,212
  • DR 60–80: 25,638
  • DR 80–100: 335,717
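The five numbered steps above can be sketched as a toy link-graph calculation. This is only an illustration of the recipe's shape, not Ahrefs' actual math, and the domains below are invented:

```python
# Toy illustration of the DR recipe above; not Ahrefs' actual formula.
# Each domain's score is spread across the unique domains it links to,
# iterated until stable, then plotted onto a 0-100 scale.

def domain_ratings(links, iterations=20):
    """links: {domain: set of domains it links to with dofollow links}."""
    domains = set(links) | {d for targets in links.values() for d in targets}
    raw = {d: 1.0 for d in domains}
    for _ in range(iterations):
        nxt = {d: 0.1 for d in domains}  # small base score for every domain
        for source, targets in links.items():
            if targets:
                share = raw[source] / len(targets)  # step 3: split by outlinks
                for t in targets:
                    nxt[t] += share  # steps 1-2: weight by the linkers' scores
        raw = nxt
    top = max(raw.values())
    # step 5: plot "raw" scores on a 0-100 scale
    return {d: round(100 * score / top) for d, score in raw.items()}

graph = {
    "hub.com": {"a.com", "b.com"},
    "a.com": {"b.com"},
    "b.com": set(),
}
print(domain_ratings(graph))
```

In this tiny graph, `b.com` ends up with the top score because it is linked to by both other domains, including the relatively stronger `a.com`.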

Ahrefs requires a monthly licence to access its data; for those that do sign up, it provides a very useful sanity check for the domain strength scores seen elsewhere.

8 key Google Analytics reports for SEO

Any stellar SEO strategy should be meticulously tracked and heavily data-driven.

Gut feel is great when deciding on which new pair of shoes to buy, but it’s not the best foundation to base your SEO work upon.

Google Analytics is a treasure trove of insightful data. And it’s free! However, with so much data available at our fingertips, it can be a bit of a minefield, and most people only scratch the surface.

Keyword rankings are great for stroking your ego and making your client smile and nod, but they don’t tap into the bigger picture.

In order to continually build on and improve your campaign, you need to pay close attention to the nitty-gritty of your data. There’s a lot to take into account, but in this post we’ll provide an overview of the key Google Analytics reports and views to bolster your SEO campaigns.

Many of these reports can be created as custom reports, which is handy for tailoring your reporting to specific business needs and sharing with clients.

Read on and we’ll help you to track and measure your SEO efforts like the analytical guru you are.

1. Organic search

Where to find it: ‘Acquisition’ > ‘Overview’ > Click through to ‘Organic Search’

It’s an obvious one but a good place to start. Head to the ‘Overview’ tab under ‘Acquisition’ for a base level indication of your website’s primary traffic channels. This provides an immediate summary of your top channels and how each is performing in terms of traffic volume, behavior and conversions.

As well as showing a general overview of organic traffic, you can also dig deeper into the data by clicking on ‘Organic Search’ in the table and playing around with the filters. Consider the most popular organic landing pages, an overview of keywords, search engines sending the most traffic, exit pages, bounce rates, and more.

On the topic of bounce rates, it’s a good idea to pay particular attention to this metric with regards to individual pages. Identify those pages with a bounce rate that is above the average for your site. Take some time to review these pages and work out why that might be, subsequently applying any UX/UI or targeting amendments.
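As a quick sketch, once you have exported landing-page bounce rates from this report, flagging the pages that bounce worse than your site average takes only a few lines. The page paths and figures below are made-up sample data:

```python
# Illustrative only: page paths and bounce rates are made-up sample data,
# e.g. as exported from the Landing Pages report.

pages = {
    "/": 38.2,
    "/blog/seo-tips": 71.5,
    "/services": 44.0,
    "/contact": 80.3,
}

site_average = sum(pages.values()) / len(pages)

# Flag landing pages bouncing worse than the site-wide average, worst first.
to_review = sorted(
    (path for path, rate in pages.items() if rate > site_average),
    key=pages.get,
    reverse=True,
)

print(f"Site average: {site_average:.1f}%")  # → Site average: 58.5%
print(to_review)  # → ['/contact', '/blog/seo-tips']
```

The flagged pages are the ones worth reviewing for UX/UI or targeting amendments.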

This is all very well but wouldn’t it be handy if you could view only your organic traffic across the whole of your Google Analytics? It’s easier than you think. Simply click ‘Add Segment’ and check the box for organic traffic.

Leave the ‘All Users’ segment for a handy comparison, or remove this segment for a view of only your organic traffic.

2. Landing page and page titles

Where to find it: ‘Behavior’ > ‘Site Content’ > ‘Landing Pages’ > Add secondary dimension ‘Page Titles’

One of the most frustrating aspects of Google Analytics organic reports is the dreaded ‘(not provided)’ result which features under ‘Keyword’.

This unfortunate occurrence is the result of searches carried out securely: in other words, when the user searches over HTTPS, or is logged into a Google account and therefore protected by Google’s data privacy policies. In these scenarios, the search term used will not be passed to Google Analytics.

But how wonderful would it be to see a list of all the search terms people used to find your site? Unfortunately I’m not a magician and I can’t abracadabra these search phrases from the Google abyss. But I can offer an alternative solution that will at least give you an overview.

View your organic traffic by landing page and page title, as this will show which pages are performing best in organic search. By including the page title, you can see which keywords those pages are optimized for, and get a pretty good idea of the search phrases users are deploying and which of them perform best in terms of traffic and bounce rate.

This can also help you identify the pages which are not performing well in terms of organic traffic. You can then review whether the keywords need refining, the onsite optimization needs an overhaul, or the content needs revamping.

3. Conversion goals

Where to find it: ‘Conversions’ > ‘Goals’ > ‘Overview’

It’s all very well having a high volume of organic traffic but if it isn’t converting then there’s really not much point. To test the quality of your organic traffic, you need to be tracking conversions. There are two levels to this.

The first is your conversion goals. You can filter these by traffic channel to understand what percentage of a website’s conversions result from organic traffic.

To further improve this data, add monetary value to your conversions to better demonstrate the value that your SEO efforts are bringing. Some clients care only about keyword rankings, some care only about the dollar signs. Either way, it’s worth spending some time with your client to work out how much each conversion is worth and the data that they are most interested in.

For example, let’s say you sell kitchens. If you know the average cost of a sale and the percentage of kitchen brochure downloads which convert to a sale, then you can work out an approximate value for each conversion.
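The arithmetic behind the kitchen example looks like this; the figures are made up purely for illustration:

```python
# Hypothetical numbers for the kitchen example above.
average_sale_value = 8000.0   # average revenue from one kitchen sale ($)
download_to_sale_rate = 0.05  # 5% of brochure downloads become sales

# Approximate value to assign to each brochure-download conversion goal.
value_per_download = average_sale_value * download_to_sale_rate
print(value_per_download)  # → 400.0
```

You would then enter that figure as the goal value in Google Analytics, so organic conversions are reported in dollars rather than raw counts.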

4. Assisted conversions

Where to find it: ‘Conversions’ > ‘Multi-Channel Funnels’ > ‘Assisted Conversions’

Although useful, conversion goals only give a surface view of conversions. What if someone initially found your website via Google and didn’t convert, but then later returned to your website by typing in the URL direct and then converted?

It’s very common for users not to convert on their first visit to a website, especially if they are only in the awareness or consideration phase of the sales funnel. When returning the next time around to make a purchase, they are more likely to go direct, or perhaps they see a reminder via social media.

This is where assisted conversions can save the day. Find these by clicking on ‘Multi-Channel Funnels’ under ‘Conversions’, and then ‘Assisted Conversions’.

With this data, you can identify whether each channel featured on the conversion path of a user, therefore providing more accurate data in terms of the quality of your organic traffic.

Pay attention to any drops or surges in organic traffic in this section. If, for example, you have noticed a drop in organic assisted conversions yet your organic traffic has remained consistent, then it may indicate that the leads are no longer as qualified. This should prompt a review of your keyword and content strategy.