“The term minimalism is also used to describe a trend in design and architecture, wherein the subject is reduced to its necessary elements”.
CSS has been with us for over two decades, but its use properly kicked off in 2003 when Jeffrey Zeldman published his book "Designing with Web Standards". Since then, the use of Flash templates for web design started to slow down, and CSS became widely used.
By the time flat design was introduced in 2006 by Microsoft with their Zune player release, web designers were in a phase of showing off their skills with flashy illustrations and animations that, in their minds, were meant to wow website visitors. It was a minefield of colours, textures and simple animations that distracted people to no end. You can check out what the internet looked like in the early 2000s by visiting the Web Design Museum.
Flat design emphasises usability in its minimalistic approach toward the interface. It focuses on being clean and simple, emphasising open space, 2D illustrations, contrast and crisp edges. As a principle, the fewer stylistic elements seen as unnecessary clutter in web design, the better.
One can say that flat design goes back to the basics of web design in terms of being a functional tool. The focus shifts from how it looks to how well it works. Don't get me wrong, the looks are still important and need to please the eye, but at the same time, when your end goal is to implement flat design, it needs to be equally user-friendly.
On the other hand, the use of depth in web design has been equally popular. Designers who use it tend to say they like to "breathe life" into their work, and adding depth is a way to do it. By adding shadows, a very popular way to create a 3D perception, you can emphasise specific elements on the website and make them more realistic.
According to some, a realistic-looking interface can boost people's interaction with a website, which is always important in web design: it must be pretty but, more importantly, functional.
Some specific tools can be used to achieve depth in web design.
Skeuomorphism became popular thanks to Apple. Steve Jobs always emphasised that people should be able to use devices intuitively, without having to read manuals to enjoy the technology. But after a few years, the trend came to be seen as overused, and even Apple followed Microsoft's approach and introduced flat design in their devices.
These days everything seems to be mixing, and although we see flat design in the lead, it has some depth elements incorporated into it, like subtle shadows and micro-animations. It is the best of two worlds combined for better visuals and user experience. That's a win-win if you ask me.
A graphic artist of any kind is a person who may not necessarily possess the skills needed to be able to call themselves a graphic designer. On the other hand, not every graphic designer is an artist. You can learn the craft of design and techniques needed to create graphics that can carry one’s message. With the right set of tools and knowledge, the world is yours. The best of the best will have both and will be the leaders who inspire others; creating graphic designs that are admired and shared.
If only it were so easy to define a graphic artist. Some will say you need to be born an artist, that you need an artistic soul, but for the purpose of this article we will focus on a more down-to-earth approach. In short, a graphic artist is a creative person, such as an illustrator, photographer or typography designer, to list just a few: an artist who works with a visual medium.
A graphic designer will most often be part of a group of graphic artists, though not necessarily; many freelancers work in this area by themselves. A graphic designer specialises in supplying graphic illustrations for editorial pieces, web development or advertising, among many other purposes. Your job as a graphic designer is to produce a visual design (a digital one most of the time these days) that conveys a specific message for whatever purpose.
You would think that there are more important skills to have as a graphic designer. However, this is not always one of those jobs where you sit behind a desk all day and work on your computer. Your task as a graphic designer, especially in a large company, is to communicate the client's written or spoken message through visual means. That requires well-developed communication skills. It's not a one-man job, so you will also find yourself participating in brainstorming meetings where you and your colleagues will have to agree on the best marketing strategy. You will have to be able to understand the team leader's vision and challenge it if your experience tells you this or that won't work. And you need to be a team player, so some negotiation skills will come in handy when putting your thoughts out there for others to think through.
If you need a qualified Graphic Designer, please call Colorpeak at 0191 645 1645.
I could tell you there are plenty of tools you can use as a graphic designer, but only a few truly count and let you create without limitations. These days, knowing how to use (and use well!) the Adobe Creative Suite is a must. As a freelancer, you may want to dip into free software like Gimp, Inkscape or Krita. If you're looking for a job at a respected graphic design company, however, then having InDesign, Photoshop and Illustrator on your CV is a necessity.
In terms of "know-how", there are certain things that you need to know by heart and most importantly understand, like typography or branding. However, there are others that you only need to be aware of, like printing technicalities.
UI/UX design (User Interface/User Experience), typography and basic HTML/CSS knowledge are essential if you are considering working as a graphic designer in the web design field. One could argue that you don't need any coding skills to produce a website, but that is not true. You need to understand whether and how a web developer will be able to reproduce your design in code. I would say you can be a graphic artist without this knowledge, but to be a good and respected graphic designer you won't get around the necessity of understanding HTML and CSS. Studies show that 36% of companies prefer their graphic designers to know HTML. Typography is another of those skills you need to obtain, and we have a separate article that talks about typography in minimalistic design if you're interested.
When it comes to branding, there are specific guidelines that you need to be aware of. Fortunately, it usually lies within a team leader’s scope to decide what needs to be designed. It is good to have experience in creating a brand for a company. However, in the end, you simply need to produce what the client needs, so most of the time you learn as you go.
Need a brand design? We can do it all! Call 0191 645 1645 for more information.
You do not need to know how the printing press operates; frankly, you're not the one running it. Regardless, you should know how to prepare your design for print. When applying for a graphic designer position, trust me, you will not get it unless you know what bleed or CMYK is.
We’re living in the era of Skillshare, Treehouse and Udemy online courses as well as YouTube where you can find anything from instructions on how to clean the coffee spill from your carpet to creating a 3D logo for your client. Yet, if you browse available positions on Indeed, Jobsite or especially the Government Find a Job website you will hardly find an offer that does not require an educational background. In fact, in more than 80% of cases, you would not even be invited for an interview unless you have a degree. But then again, you can always become a freelancer and there are plenty of websites like Fiverr or PeoplePerHour where you can advertise your services.
Although it is not easy and takes time to acquire graphic design skills you can do it yourself especially if design is your passion. You can get a degree and end up being a part of a graphic artist team that works for large advertising agencies. You can also be a free spirit working on your own from the comfort of your home. Whichever path you take, remember there are certain skills you need to possess without which you can’t be successful in the field of design.
For any Graphic Design queries drop us an email, and we will be in touch shortly.
This isn't a trick question, or is it...? It used to be much easier to answer. Back in the early 2000s, web design revolved around 'Is that person capable of creating a website?' rather than 'Is that person capable of designing a beautiful website that is functional and makes people buy products?' And it is an important matter, especially when you're considering who should design your professional business website.
In the past, knowing how to code was necessary. Without understanding HTML or Flash, you could not build a website, so there was no question about whom to hire to design it. The problem was that tech people were not known for their exquisite sense of visual style. Making a professional business website that looks good was not a priority, if its appearance was considered at all. You can see for yourself by visiting the Wayback Machine to see what the internet looked like back in 2005! Very few had heard of good UI (User Interface) or UX (User Experience). These words may sound silly to a "full-stack-web-developer-engineer"; however, coding skills are no longer necessary just to put a functional website up.
That is not to say skilled coders and web developers aren't necessary. Not at all! The "no-coding" approach works if we think about a general site with a basic purpose, like displaying simple information. Having an online space that functions as a visual business card or a leaflet is useful for describing and promoting your company quickly. And there is a rapidly growing market for that type of web design. The reason? A website is often the first thing your potential customers will look for when they want to learn more about the product or service you sell. So, you won't get away with not having a website. At least not if you want to come across as a forward-thinking business owner.
If you need to outsource the creation of this type of website, an experienced graphic artist should be enough, allowing you to save some time and money on more elaborate development and design. But there is no question that thousands of entrepreneurs, start-ups and resourceful individuals benefit from available (often free) web design tools that enable non-techies to "DIY" their websites. You've probably heard of Squarespace, Wix or Weebly.
If you're on a budget, have lots of spare time, and you're determined enough, there is no reason why you shouldn't try to bootstrap your online portfolio! The disadvantage, however, is that there are a lot of technical, functional and visual limitations imposed by online website builders. Websites built with those tools are also almost impossible to expand beyond their generic purpose and are often locked to the system they've been designed with, so forget about easy migration and scaling! Moreover (and assuming you don't know CSS), your website won't be bespoke, and you will likely cringe when stumbling upon an identical one (your competitor's!). Therefore, your expectation of DIY website builders shouldn't be to create a professional business website that really stands out and takes you to the next level, which is...
If you’re a small or medium business owner that cares about how the website presents the company brand, and how good it is in actually making money (i.e. conversions) you’re probably better off hiring a professional web designer.
We get to live in the lucky times where a good web designer has a solid understanding of coding, so you no longer need to choose between functionality and looks. It's also often the case that an experienced web designer has a good understanding of graphic design, so they are undoubtedly qualified to pull off a bespoke, professional business website that will lift you above the rest of the "DIY crowd" and your competitors.
If you're treating your website as an active advertisement or a real business asset that can draw the attention of your customers - and you should! - then having it done by a professional web designer is without question the best way to go.
If you need your website to do more than the above, you will probably have to consider investing in a developer, especially if you're thinking about managing large databases of information through your website. If you have a bunch of money to spend on a large team of web developers, front-end designers, artists, project managers, etc., and plenty of time for everyone to team up, then you can deliver a great professional business website project that will look stylish and modern. If all goes well, it will also be more reliable and capable of handling more complicated tasks like bespoke e-commerce shopping carts and payment processing systems. In a sense, a well thought-out and properly executed high-level website/IT project is infinitely scalable and can handle billions of active users like this website or this one.
However, expect a bill through the roof and, as unfortunate as it sounds, a high risk of failure, as is often the case with large, over-budget IT projects.
It really depends on your goals. There will always be work for developers, who get higher paychecks and can pull off magical websites that do more than look good. Some people don't need to spend a fortune on online advertising or marketing if they can stitch up a basic site from a template for free. However, the sweet spot has never been as accessible as it is now: you can have a professional, beautiful and functional website with a reasonable price tag and a realistic ROI for your business at the same time.
I thought I'd drop a few words about Vistag, as we've tested the tool a little on two of our now-interactive websites (not e-commerce, yet...). The first website did not work, due to what we believe was Slider Revolution interference; the Vistag and Slider Revolution tandem rendered our website completely unusable. Therefore, we decided to switch it off there.
The second of our interactive websites, a static WordPress site, was (and is) much better. We're still testing it there, but performance is all good, and Pingdom/GTMetrix/PageSpeed did not register any meaningful drops in loading speed scores, even with 10+ tags thrown on two web pages.
To cut to the chase, I really like the visual presentation and interactive aspects of Vistag. Objectively looking at it - Vistag WILL make you and others hover, click, poke, fondle and (hopefully) buy-through those flashing gems. There is just no way around it given how our brains and eyes work.
However, at this point in time, Vistag isn't a finished tool yet, and many fundamental options are missing.
Some of the things we'd like to see rather sooner than later include:
The dashboard and interface would also benefit from some conscious cleanup and re-planning for easier navigation. The user experience (UX) is a bit all over the place on the admin side.
I've also looked at the new "Lookbook" feature, which allows you to add images to tags. I'm sure this is probably pure ignorance on my part, but I still don't get it. Again, it could be my lack of attention, but how is this better or more important than just feeding and tagging pictures from WP Media or external URLs? The link on the dashboard takes me to some vague documentation on how to make it work, but nothing else.
I've mentioned CSS customisations already, but let me reiterate that the default font size and colour (light grey on a white background?) is just unreadable. To change that, you need to do some CSS gymnastics on your website. It seems to be in development now, though, so fingers crossed we won't wait long for this must-have feature for interactive websites to work as intended.
Your images. Shoppable. Create visual content that sells like crazy.
At this point, I can't give it a 5-star review because Vistag is incomplete: clearly in beta stage, and an exorcism on their buggy code is yet to be performed by the developers. However, I can confidently score this tool 4 out of 5, because this brilliant yet simple idea serves you a ready-made link between human behaviour and computer science that you can use to your sales advantage this very evening. And all this without needing two PhDs in the respective fields while trying to sweat out something like that in your own basement. A fine marketing tool with great potential to turn your plain website into a truly interactive one.
You simply install Vistag, place tags over your e-commerce images and the job of selling fidget spinners and made-in-China "what-have-you's" is done for you.
1991 Before SEO, there was the beginning of the World Wide Web. In 1991, the first ever website - info.cern.ch - was launched by Tim Berners-Lee.
1994 Stanford University students Jerry Yang and David Filo start Yahoo, the first search engine to be used by the masses, followed by AltaVista, Excite, and Lycos.
1996 The Backrub search engine is created, ranking sites based on inbound link relevancy and popularity. Backrub would ultimately become Google.
1998 The King is born. Google launches in September, and so the story of SEO really begins. Before it, search engines ranked results based on on-page content, domain names, directories and breadcrumbs. Google introduced the PageRank algorithm, which also took into account the quantity and quality of links pointing to a website, as well as anchor text.
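The core idea behind PageRank can be sketched in a few lines: a page's score grows with the number and quality of pages linking to it. Here is a toy version under illustrative assumptions; the three-page link graph is made up, and the damping factor and iteration count are conventional textbook values, not Google's production settings:

```javascript
// Toy PageRank via power iteration. `links` maps each page to the
// pages it links out to. Not Google's real implementation.
function pageRank(links, iterations = 50, damping = 0.85) {
  const pages = Object.keys(links);
  const n = pages.length;
  // Start with an equal share of rank for every page.
  let rank = Object.fromEntries(pages.map(p => [p, 1 / n]));

  for (let i = 0; i < iterations; i++) {
    // Every page keeps a small baseline score...
    const next = Object.fromEntries(pages.map(p => [p, (1 - damping) / n]));
    for (const page of pages) {
      const outgoing = links[page];
      // ...and passes the rest of its rank, split equally,
      // to the pages it links to.
      for (const target of outgoing) {
        next[target] += damping * rank[page] / outgoing.length;
      }
    }
    rank = next;
  }
  return rank;
}

// Hypothetical three-page web: A and C both link to B,
// so B accumulates the most "authority".
const ranks = pageRank({ A: ["B"], B: ["C"], C: ["A", "B"] });
console.log(ranks);
```

The takeaway matches the history above: because rank flows along links, pages with many good inbound links rise, which is exactly the mechanism the link-building (and link-spamming) era tried to exploit.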
2003 The first named Google algorithm update, Florida, takes down a lot of websites in the rankings, especially those that stuffed keywords. Repeated keywords would be hidden at the bottom of a page in a font colour matching the background; that way the reader would not see them, but the bots would feed off them and rank the website higher. Tactics like that came to be known as black-hat SEO. Around this time, a new tactic of link building is born: a race to build as many backlinks as possible begins as savvy marketers quickly learn how to abuse the system.
2005 Google makes its first attempt to fight back against link exploitation and launches the "rel=nofollow" attribute, preventing the authority of websites from being passed on. Following this update, Google launches the Jagger and Big Daddy algorithms just before the end of 2005 to prevent link farming and other suspicious SEO tactics.
2006 YouTube is acquired by Google for a whopping $1.65 billion. Eventually, it would become the second most used search engine in the world. In the same year, Google also launches Google Analytics and Webmaster Tools, giving developers deep insight into how Google sees their websites.
2008 Google Suggest is finally launched after four years of development and testing. Continuing its path of improving the user experience, Google focuses on better understanding how we surf the web and interact with content. It may seem like the most obvious Google feature these days, but back then, having related searches automatically appear below the search box as you type was a major hit.
2009 Bing goes online or rather Microsoft gives a new name to its Live Search tool. By then Google has nearly 70% of the search engine market in the USA.
2010 Google announces that site speed is a ranking factor following the "Caffeine" update, dubbed a next-generation search architecture that is faster, more precise and provides more relevant results, all thanks to fast "spider bots" that can quickly crawl websites and cover larger parts of the internet.
2011 The "Panda" update has a notable impact on SEO that shapes the optimisation field to this day. In an attempt to clean up search results, 12% of them were affected. Websites with low-quality and irrelevant content (better known as "content farms") drop down in the rankings. A similar thing happens to websites with unoriginal, static and auto-generated content.
2012 The following "Penguin" update doubles down on eliminating aggressive, black-hat SEO spam tactics. Gone are the sites that violated Google's Webmaster Guidelines by buying links, stuffing keywords and matching anchor text to keywords to the dot. Eventually, Penguin and the previously mentioned Panda become part of Google's real-time search architecture.
2013 The "Hummingbird" release centres around the growing market of mobile users. It is the biggest update to Google's algorithm since 2001; it deals better with natural language questions and conversational search, and it lays the foundations of Voice Search. Original content becomes a major ranking factor, along with blogs. Google starts to reward websites that provide useful, unique and lengthy answers to visitors' queries. "Long-tail keywords", or in other words detailed and specific search queries, become a thing.
2014 The release of "Pigeon" is all about better local search results. Google improves location and distance ranking parameters to provide relevant results to users based on proximity. Local businesses with a strong organic presence show up higher in traditional search within the searcher's area. "Local SEO" finally gets its own genre, now distinct from general SEO.
2015 Some would say a breakthrough year, in which Google reported more mobile searches than desktop searches. "RankBrain", a self-learning search architecture, is introduced as part of the Hummingbird algorithm. It determines the most relevant results to search engine queries. At first, it runs only on the 15% of searches that the system had never encountered before, but eventually it applies to all of them.
2016 Google confirms that the search engine's top three ranking factors are: links, content, and RankBrain.
2017 The Google "Fred" update hits mostly websites with poor content. In general, Google tries to deal with aggressive monetisation, misleading and deceptive ads, poor mobile compatibility and poor content. Fred is not a stand-alone algorithm update, rather a catch-all name for every quality tweak to Google's system intended to improve it and get rid of content that violates the Webmaster Guidelines. It's been known for a while that Google makes quality updates on a regular basis, and most go unnoticed and unannounced.
2018 A year of Webmaster Tools modernisation. Google Search Console, Google My Business and, most notably for SEO, Google’s PageSpeed Insights tools receive their updates. There are of course multiple changes to Google's ranking algorithm done almost every day, but there were three Broad Core Algorithm Updates that were actually announced. PageSpeed Insights update becomes a significant player in ranking and slow sites with low optimisation score on mobiles are affected. Google announces nine factors that influence Optimization Score.
And so, the story continues. What will 2019 bring? Most likely further updates favouring mobile-friendly, AMP-enabled websites that load fast on slow 3G networks. A lot has changed since the first Google SEO revolution. An easily abused system based on keywords evolved into a sophisticated learning machine that thinks like a human. Content became king, and mobile accessibility will further shape the rankings.
This text was originally published on: Colorpeak Blog - How has SEO evolved throughout the years?
I have yet to meet a business owner who has never been in need of automation. Today startups, SMBs and large companies have one thing in common - they're all thrown in the game of pursuing the holy grail of business and marketing automation (The Automation Singularity).
As you're reading this, I wouldn't be surprised if you've already got your feet wet trying marketing automation software at some point. MailChimp, for example, or even chatbots...?
But maybe you have been doing all your marketing the "old school" way, and you want to get some of the piling work off your shoulders without hiring, training, paying and managing additional employees. Well, I share your viewpoint, because running a business means innovation. So, either you work on out-innovating your competition, or your practices become inefficient and your business gets outdated pretty quickly.
I will try my best in this article to help you get started with the lowest-hanging fruit of marketing autopilot platforms. You should finish the piece with a grasp of what marketing automation is, why it is the next big thing in the coming years, and how to use the power of automation platforms and tools to their fullest potential.
Remember, you can't turn your eyes away and pretend business innovations don't affect you. I will argue here that 2018/19 is the perfect time to jump on board and benefit from at least some of the cheap and accessible business automation out there. As you read on, pick one or two solutions, and over the next few weeks try to integrate them for the betterment of your business and your life.
Traditional marketing as a scientific field goes back to the early 1900s, or even earlier, with the development of modern capitalism and consumerism. In a nutshell, business owners and (modern) marketers discovered that by studying consumer behaviour they could predict and influence sales. Businesses then went on to develop step-by-step processes to anticipate the needs and wants of potential consumers, so they could satisfy them more effectively than their competitors. Several decades later, we use complex data and scientific methods to measure and design specific procedures to sell more to more people.
Marketing and sales funnels evolved; it's no longer just printed ads in papers and magazines or bulk mail. With the technological revolution, especially the Internet and social media's worldwide blitz, several channels have opened that enable us to communicate more effectively and build relevant relationships with our customers. What it means for marketers like you is that a more sophisticated and technologically enhanced approach, with complex, multivariate sales funnels, is required. Hence, marketing automation.
"Innovation distinguishes between a leader and a follower." - Steve Jobs
Despite the overblown claims and inflated marketing slogans, marketing automation is NOT artificial intelligence (AI). Real AI doesn't exist (yet), and we're likely far away from it. Some moderate that claim and use "semi-AI", but that's misleading, because the "semi-" isn't clearly defined and can mean anything. I just call it what it is for now. Marketing automation is a set of logical rules, processes and tactics that run on autopilot, making some tasks, or even segments of your business, autonomous and independent of human input. Besides the obvious benefits like saving time, money and energy, such marketing practices bring real and quantifiable added value to your business.
Although the philosophy of business management and marketing orientation remains intact and pretty solid, the tools and methodology seem to evolve and set new milestones almost every week.
Today, marketing automation software has never been more accessible. And in such high demand. Entrepreneurs keep coming up with creative ideas and cutting-edge technology to help businesses design their perfect strategies to predict the needs and personalise the copywriting.
So new marketing software is springing up like mushrooms.
But let's step back to something basic, like MailChimp, which is an email auto-responding system. Their automation software is free (to an extent) and allows for:
(did I mention it's free? ;)
Check this video on how to set one up with this step-by-step tutorial:
To summarise, here are the top 5 essential gains of automation in the area specific to marketing:
"In peace prepare for war, in war prepare for peace." - Sun Tzu.
The future of automation in 2019 and beyond looks bright. And it isn't just Tesla and their self-driving autonomous cars. Interestingly enough, I find those same principles that drive Musk and his robo-vehicles towards automation apply to any aspect of the business. Here's what we all are after:
See Marques driving a Yandex self-driving car around California
Let's bring it down to a business application and see what benefits businesses can expect from automated systems in 2019 moving forward.
The most common use of marketing automation software appears to be in sales, but this initially limited utility is expanding very quickly.
eMarketer reports that only 5% of companies have never benefited from marketing automation (I couldn't dig up their source, though, so take it with a grain of salt). But that makes sense, given the power of computing and how deeply technology has penetrated the economy. It is clear that customer journey maps are used extensively by the Googles, Amazons and eBays of this world to accurately map out how customers behave in relation to businesses. And the same applies to social media giants like Facebook, Twitter and LinkedIn. Their optimisation is so granular, and it developed so fast, that some people have stopped to consider its ethics and what we want as a civilisation.
Nevertheless, the value in sales is so obvious and tempting that it only makes sense to keep using it, provided your primary incentive is to quickly "help" prospects travelling the distance from bystander to raving fan. And that distance, no matter the length, is measurable and categorisable, and therefore prone to endless optimisation (and exploitation).
As of today, experts agree that learning the different ways customers interact with businesses pretty much guarantees big profits. For this reason, social scientists and marketers flock to all sorts of automation tools that help with just that. Meanwhile, marketing SaaS companies try to meet their prospects midway, offering software that is easy to understand yet vastly scalable in its application. In fact, lowering the barrier to entry is now the name of the game for most marketing automation software.
One such tool is FullStory. It works on simple principles, but the results it draws from different modes of user-session interpretation provide comprehensive and quite elaborate customer journey maps.
"See your site through your users' eyes. More than the sum of its clicks, FullStory replays your customer's journey – like a DVR for your website – so you can search, see, and understand your user experience."
There is certainly a learning curve associated with implementing any new system or technology. However, with much lowered entry-level bars the fruit has never hung so low.
Back in the day, gathering and crunching large data sets was done by businesses that could afford a full-time data analyst or "numbers person". It was usually reserved for enterprise-level companies and agencies, or outsourced and never looked at closely again. Today, with tools like Google Analytics, you have quick and easy access to software that, with the click of a button, displays an almost complete picture of your company's money flow.
It's free and offers a tremendous amount of actionable data that is automatically gathered and calculated from your website, ads and email campaigns, to mention just a few sources.
It's also worth mentioning that the Facebook, Pinterest or LinkedIn Ads platforms use sophisticated dashboards where data is calculated and presented to you any way you want. It takes time to figure it all out, though. I'd suggest starting with finding answers to simple questions like:
One of the most significant aspects of marketing automation is the sudden spread of bots and various auto-messengers. Apart from built-in Facebook Messenger bots, platforms like Quriobot and MobileMonkey are rapidly developing increasingly sophisticated methods of engaging visitors on your website. The key to great visitor engagement appears to be:
Speaking of "triggers": these can be heavily customised to respond to a customer's behaviour on your site and deliver visual, heavily targeted responses. For example, imagine someone checks your home page, then goes to "Pricing", spends two minutes there and decides to leave. This visitor could normally be classified as a prospective customer you may want to engage further. Therefore, at each stage, your bot can enhance the visitor's experience with targeted messages, additional content and offers. Even at the point of departure, a bot can be configured to fire an "exit intent" trigger and, hopefully, stop the visitor from leaving the website.
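To make the idea concrete, here is a rough sketch of such behaviour-based trigger rules in Python. The page names, time threshold and messages are made-up examples for illustration, not any particular bot platform's API:

```python
# Illustrative only: behaviour-based chatbot triggers like those
# described above, expressed as simple condition checks.
def pick_trigger(visit):
    """visit: dict with 'page', 'seconds_on_page' and 'exiting' keys."""
    if visit.get("exiting"):
        # Exit-intent trigger fires last, at the point of departure.
        return "Wait! Here's 10% off before you go."
    if visit["page"] == "Pricing" and visit["seconds_on_page"] >= 120:
        # Two minutes on Pricing suggests a prospective customer.
        return "Questions about pricing? Chat with us."
    if visit["page"] == "Home Page":
        return "Hi! Looking for anything in particular?"
    return None  # no trigger for this behaviour

print(pick_trigger({"page": "Pricing", "seconds_on_page": 130,
                    "exiting": False}))
```

Real platforms let you configure the same kind of rules visually, but the underlying logic is this simple: match a behaviour, show a targeted message.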
The "AI" in bots is also getting increasingly sophisticated. For example, tools like ActiveChat try to mimic human-like responses, and with their near-infinite visual building blocks you can make a bot as complicated as Lego's Star Wars Millennium Falcon set. ActiveChat can also integrate with e-commerce systems, so you can have an actual "digital counter" right on your website.
No wonder bots are the next big thing and we will hear about them more and more.
"I think that's the single best piece of advice: constantly think about how you could be doing things better, and questioning yourself." - Elon Musk
It may be hard to start learning all these platforms today. In the long run, however, it will pay off immensely. Also, know that 99% of your competitors aren't doing it.
So, it’s better to start giving marketing automation some attention before it is too late. Whatever industry you're in, it is driven by the same principle as any other: whoever puts in the hours to innovate will have better odds of success than the business owners who procrastinate and neglect to do their homework.
You may not have enough resources to go all in on learning the marketing tools, but there is no harm in trying. I have probably been guilty of procrastination more than anyone in the business. But eventually I learned to take little steps early on, and over time I realised I had made a quantum leap in growing my business and developing myself. So, if you are reading this article right now, stop wasting your time and jump right in.
These are obviously just a few tips, and the ocean of automation tools keeps filling with more and more fish. There are a few more things I have learned over the years, so pop back and check this article for updates. Most importantly, though, start building your first marketing automation funnel, no matter how big your goals are. In my experience, the right actions will pay off. Good luck with your marketing automation journey.
The internet kicked off 2019 with the #10yearchallenge. It started on Twitter and soon picked up speed there and on Instagram, with thousands of people following the hashtag. The majority of social media users began sharing photos taken a decade ago next to their most recent ones, showing how much they had changed. We thought we'd put another spin on it and jump on the 10 Year Challenge bandwagon by comparing how the most popular websites in the UK today looked in the past! So, here is the 10 Year Challenge in web design, according to Colorpeak's team.
We will start our 10 Year Challenge in web design, of course, with arguably the most important website: Google. Contrary to most of the sites on our list, it has not changed much, at least in terms of grid design. When you go to Google, you are welcomed by a white page with a logo and a search bar in the middle. And guess what? It was no different back in the day. Most noticeably, Google has changed its logo, which in 2019 fits the popular flat design trend. The logo in use ten years ago was also the longest-lasting to date (from May 31, 1999 to May 5, 2010). It used a serif typeface called Catull, designed in the '80s.
It seems Google wasn't happy with the first change, as the old 20-year-old logo was revamped twice in the last ten years. In 2015, Google finally arrived at what we see today: a geometric sans-serif typeface called Product Sans, created in-house at Google. It's also worth mentioning that the logo colours, as well as the remaining body font, Arial, haven't changed since 1998.
Since YouTube's acquisition by Google in November 2006, experiments with the website's UX and UI began. For the longest time, however, we just saw different versions of the classic view, with the menu at the top, followed by a horizontal Promoted Videos panel and a vertical list of featured videos. In 2010, the "Recommended for you" and "Videos being watched now" sections started popping up. Recommended videos are available to this day, but "Videos being watched now" eventually turned into "What's trending". A side panel menu replaced the top one in 2012. In the meantime, YouTube experimented with a newspaper-like grid showing categories such as sports, music, entertainment and many others. These, however, are now a thing of the past.
The current view hasn’t changed much since 2014. We choose whether we want to see the content in a grid or a list view. We have also seen some minor updates to the side panel menu structure and cosmetic changes to align with modern design.
Oh gosh, where do we start with Facebook's 10 Year Challenge in web design? Facebook has changed so much and yet so little over the years. Most noticeable is the huge photo replaced by a smaller, round profile picture and a profile banner that now sits at the top of our profile page. These days we also have a 2-column grid on the profile page (it used to be 3), with the left-hand side filled with our info and the feed filling in the rest. As for the page we interact with daily after logging in, it has remained mostly unchanged: a navigation panel on the left, the feed in the middle and some ads on the right, followed by suggestions of people you may know. Facebook Stories, which sit on top of the ads panel, were only introduced in 2018, though.
Amazon turned eCommerce selling into an art and Jeff Bezos, its founder, into the wealthiest man on earth. The landing page of Amazon has changed quite a bit. The good old category panel still exists in the left side panel as it did ten years ago, but it is no longer on the landing page. When you visit Amazon in 2019, you are welcomed by a dark navigation panel at the top of the page that lets you search for items and manage your account. The rest of the page is covered by an ever-changing slider showing you the most recent deals in a very pleasing, Instagram-like form of large photos with descriptions added here and there. Each slide is different and grabs your attention thanks to significant changes in colour. Amazon seems to know all the triggers that will make you want to purchase something… anything!
Look at the BBC website from 2009! Ten years ago, the BBC website would change its theme colour every day, but you won’t see that any more. In 2019, the BBC is pushing a minimalistic design. We can see more photos than text, in complete opposition to how it was before. This is evidently a sign of our times: the BBC recognised that internet users have a lower attention span and that a picture or video captures their attention more effectively than lines of text. This is one of the strongest examples in our 10 Year Challenge in web design.
One must appreciate how little Wikipedia has changed over the years. The main grid remains the same, with the logo and menu on the left, login options at the top and the rest of the page filled with definitions and descriptions. We can, however, notice subtle changes in the visual design. Flat design has definitely captured the attention of the people responsible for the layout, making it a tiny bit more modern, following Google’s direction. On a side note, I think Wikipedia would benefit from introducing a column layout. These days, with high-resolution screens, it’s hard to read from the left to the right of the screen without dropping a line once or twice when moving to the next row.
Reddit's highly criticised new design (beta) was first rolled out on the 1st of April 2018. It introduced a Facebook-like feed in the middle with an additional supporting column on the right and a significant amount (33%) of white space surrounding it, the latter especially prominent on high-resolution desktop screens. Notably, the new design removed the blue header fonts that resembled Google Search results and replaced them with a black font (IBM Plex Sans) that looks more like a news portal's.
According to Reddit's CEO Steve Huffman, the new design aims mostly at new users: “Many of us evangelise Reddit and tell people how awesome it is ... then when those new people decide to check out Reddit for the first time they’re greeted with dystopian Craigslist. We’d like to fix that.”
To mitigate some of the criticism, Reddit introduced three types of views that users can pick: Card, Classic and Compact. And, as of now (Jan 2019), the "Old-Redditers" can still switch back to the "old view" in their account preferences. Reddit's redesign heavily polarised the community and divided it into two camps, "Old-Redditers" and "New-Redditers"; who would have thought you could find an analogy to the political situation in the United States in something like the "10 Year Challenge in web design"...
Ebay seems to be following Amazon’s lead by introducing a slider to its landing page. It’s big, it’s colourful, it grabs your attention. They have learned to use white (negative) space to their benefit, contrary to the home page from a decade ago, which was packed with categories (who really uses them on the home page...? That’s right. No one!), banners and images; it seemed very crowded. There’s also the case of the logo. I would argue the old one was more recognisable. Now we have a simple, Google-like font, a popular trend in the past three years, but I think eBay’s logo used to be more distinct.
Complete transformation! That’s the first thing that comes to my mind when I compare the Twitter layout from 10 years ago to its current version. Is this the winner of Colorpeak's 10 Year Challenge in web design? Not quite.
This is yet another example of a website that takes simplicity and turns it into an advantage. A very intuitive layout encourages you to simply tweet, re-tweet, and consume other tweets. The blue background has been replaced with a white one, the logo became smaller and a “What’s trending” panel has been added on the right-hand side of the feed. Twitter understands that the vast majority of its users access their feed from mobile phones, so it seems they design with a small screen and quick load times in mind, then follow with a desktop design that extends a few elements a bit while refusing to include anything unnecessary. Five stars for UX and UI design, if you ask me!
Last but not least is my favourite example, which shows that sometimes one needs to start with an MVP ("Minimum Viable Product") to get the ball rolling; success will come later. It is also the winner of our 10 Year Challenge competition! When you look at the Netflix website from ten years ago, you will laugh. I did! But then we have to remember that back in the day there were no fast internet connections, and renting (RENTING!) a movie through a website to be delivered to your home was shockingly innovative. Since then, Netflix has made a huge leap, and the navigation page that lets you select movies is what created the binge-watching experience and trend. Whoever is behind the Netflix landing page is an excellent team of designers and developers who understand how our brains work and how to trigger them to consume more and more content. Bravo!
This concludes our 10 Year Challenge web design comparison. Of course, the top ten most visited websites in the UK change very often, so in a while you may find this list outdated, but if you're here in January 2019, these are the websites UK residents visit the most. We hope you enjoyed our take on the #10YearChallenge, so go ahead and follow our Facebook, Twitter and other social media accounts for more interesting articles from the Colorpeak team.
This article was posted originally on Colorpeak Blog
As a person who went from having no design skills to building websites for a living, I think it’s worth sharing some basic principles that helped me become a WordPress web designer. Today I will talk you through some critical skills that one should develop to become a full-time web designer.
If you’ve ever considered becoming a web designer but you’re not interested in learning to code, now is a great time to give it a try. Why, you may ask? The answer is simple: there are so many tools that allow you to design without back-end knowledge that you can become a WordPress web designer with a bit of imagination and some skills. But to become a successful web designer, I would advise you to start with research, because there is much more to web design than just translating your vision into a working, functional website using some software (which I will talk about as well).
First, to build a website you need to understand who it is for, what features are required and what your client’s expectations are. Don’t have a client yet? Perfect! You don’t want your first website, which is going to be a disaster anyway, to affect your confidence. Start small. Does your mum have a hobby? That’s how I started: I built a website where she could showcase her cat sweaters. I was not ready for e-commerce yet, to sell them to other animal lovers, but I could start with a nice portfolio showcasing her creations (and my basic WordPress web designer skills).
So, whether it’s your mum or a neighbour, sit down with them and ask all the questions you think will be important for building the website. I would discourage you from skipping this step and interviewing yourself. There are certain things you can learn only when you’re surprised, so an interaction with somebody else is the best way to do it.
Check out our basic web design questionnaire, which I used in the early days before Colorpeak was an agency. It doesn’t matter whether you’re a WordPress web designer or a developer: if you’re building a website, you need one. Below, I have also included some links to other questions that may turn out handy for your project specifically. Be cautious, though, as too many questions will bore your client. Be thorough, but don’t go overboard with three pages of questions. In a large marketing agency there will come a time for fancy-schmancy questions about the company’s vision or whether their solutions are better than their competitors’. For now, you need to know whether they have a domain, what colours they like (or hate), what the website is for, and whether they have content or you need to create it.
Remember to use the questionnaire as a guide for your conversation; don’t just send it to them point-blank. Conversation is key to landing a client: you will not get them through cold calls or emails. Therefore, practise this skill as often as you have the opportunity.
At the very beginning, though, the technical skills you need as a WordPress web designer are how to set up a website on a hosting platform and how to install WordPress on it.
Start your first website with one of the free WordPress templates. This will help you learn how WordPress works and the principles of working with the Gutenberg editor. To make your website look more appealing, some themes offer more flexibility and will allow you to be more creative: Divi from Elegant Themes, Elementor or Beaver Builder. You can find the pros and cons of each one in this YouTube video.
To state the obvious, as a WordPress web designer you will need some graphic design skills. Fortunately, these days there are many free repositories you can source images from. Combine them with basic skills in Photoshop or Gimp and you are set. Also, don’t forget to go through some User Experience (UX) guides; they will teach you the principles of good web design that help website visitors enjoy the experience.
When creating a website in 2019, some marketing skills are a must. A WordPress web designer won’t get away without knowing the basics of copywriting and Search Engine Optimisation (SEO), like keyword research and on-page optimisation. I know many people would argue with me on that point, but I believe a competent web designer should have this knowledge.
You may not have to write content for the website, as the client will provide it, but you need to recognise when it's crap. It's part of your job to advise your client to hire a better writer, or even better, to leave the copywriting to you (or your trusted freelance copywriter; you will find them on Fiverr and People Per Hour, although it takes a few orders before you find somebody trustworthy).
You will also need to skilfully describe what each page is about in a meta description of between 120 and 158 characters. That requires not only copywriting skills but also some keyword research and SEO knowledge. Fortunately, there are plugins for that! Yoast, for example, will not only tell you whether your meta description or page title is too short or too long, but will also score your keyword strength, suggest how to make your text more readable and point out the heading structure (H1-H3) in relation to your keywords. It’s a great tool. Optimising images is also important, and you need tools to decrease image size and speed up page loading. As a WordPress web designer, you need to know how to add alt tags and descriptions and how to geotag images, which helps with local SEO.
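As a quick illustration of the 120-158 character guideline, here is a tiny Python helper; the function name and messages are my own for this sketch, not anything from Yoast:

```python
# Hypothetical helper: flag meta descriptions outside the 120-158
# character range that tends to display fully in search snippets.
def check_meta_description(text):
    n = len(text)
    if n < 120:
        return f"too short ({n} chars) - add detail"
    if n > 158:
        return f"too long ({n} chars) - likely truncated"
    return f"ok ({n} chars)"

print(check_meta_description("A short description."))
print(check_meta_description(
    "Learn the key skills every WordPress web designer needs, from "
    "client research and theme building to on-page SEO basics."))
```

Plugins like Yoast apply the same kind of length check automatically as you type, alongside their readability and keyword scoring.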
In terms of keywords, clients will often tell you which keyword they want to rank for. Nine times out of ten they will be wrong: sometimes because the competition is too strong, and sometimes because nobody searches for that keyword, so there is no point in ranking for it. I have included some articles below that will give you a better grasp of keyword research. After that, use Google Trends or Answer the Public to judge the value of your keywords.
Tools that will help you do the keyword research
It seems like a lot, but do not feel overwhelmed. You will acquire all these skills simply by designing websites and being open to self-learning. I am a firm believer in learning on the job and picking up the abilities you need to solve problems one by one as you approach them. This is why the best way to learn WordPress web designer skills is to actually design some websites for fun and google your way through the obstacles. So go ahead, start now. Register your new site on WordPress today, and tomorrow… well, the sky is the limit. OK, if you think this is cringey, it is, but do it regardless. No postponing!
The above skills won’t make you a frontend web developer yet (we will get to that in another article), but they will get you started. Don’t miss the opportunity. Today is the day you choose the path to become a WordPress web designer.
This article was originally posted on Colorpeak Blog - 4 key skills of a WordPress web designer - a beginners guide
Update March 15, 2019: Although this Google Update initially came out under the name of Florida 2.0, Google themselves named this update Google March 2019 Core Update. According to officials, this name helps to avoid confusion as it tells you the type of update it was and when it happened.
Over the last few days, there has been a lot of buzz about possible Google algorithm updates. Yes, it’s true that Google updates its algorithm every single day, often multiple times per day, but this update seems to be significant. Moreover, it was confirmed by Google on March 13.
Number 13’s infamy will probably grow even more now, but you should know that the most recent Google update started on the 12th of March, continued in force on the 13th, and the ranking fluctuations seem to continue as you’re reading this.
Leaving modesty aside, we spotted the update before Google made the official announcement and before all the buzz in the SEO world. No, we do not have psychic powers, but we do have a super powerful tool: cognitiveSEO Signals.
While we are aware that search results differ around the world, across regions and languages, and that changing search results is almost a Google trademark, the ranking fluctuations were really high in lots of countries. The daily updates that typically occur in the SERPs don’t compare to this one in any way. And this time, Google officials confirmed what the data had already told us.
This week, we released a broad core algorithm update, as we do several times per year. Our guidance about such updates remains as we’ve covered before. Please see these tweets for more about that: https://t.co/uPlEdSLHoX https://t.co/tmfQkhdjPL
— Google SearchLiaison (@searchliaison) March 13, 2019
Of course, plenty of buzz soon appeared on social media channels, with people complaining about high volatility as well as dropped rankings and traffic.
I’m not sure if you’re familiar with the byword of Ian Fleming (the guy who wrote the James Bond novels and short stories), but it’s somewhat bitterly funny how it applies to our current situation.
Once is an accident. Twice is a coincidence. Three times is an enemy action.
It’s clear that we cannot talk about coincidences at this point, as people all over the place are facing ranking fluctuations.
Seeing less Positions, Impressions and Clicks cut in half from countries where my site shouldn’t show up. Surely will see less bounces from total site =)@JohnMu @rustybrick @dr_pete pic.twitter.com/dlkVQrOy3u
— amagnacco (@amagnacco) March 13, 2019
As we don’t like to take anything for granted, we researched and double-checked real case studies to see how the current update impacted businesses across regions. We used the cognitiveSEO Rank Tracker to easily check and analyze some keywords and their rankings.
We found some massive ranking drops on commercial keywords with high search volume. Below you can see screenshots with just some of the search position drops we’ve identified.
We couldn’t find any ranking increases for the moment, yet this doesn’t mean that increases didn’t occur. I’m sure that in the following days we’ll better understand how this update impacted Google ranks and, as the law of nature goes, when someone goes down, someone else goes up.
We’ve done the research on several countries and languages and found that Florida 2.0 is clearly a global update that affects websites regardless of country, region or language.
What is also important to notice is the number of keywords on which websites lost rankings. We found just a few situations where a website declined in the SERPs on only a few keywords; most of the analyzed websites faced drops on half or more of the keywords they were interested in ranking for.
For the moment we cannot draw a conclusion on what the March 12 Google update actually targeted: links, content, intent, all of them together, something else? There could be a commonality between the websites affected by this update, yet further research needs to be done. But sit tight: here at cognitiveSEO we are a bunch of geeks who like doing in-depth research, so most likely we’ll come back with more details on what Florida 2.0 is all about.
The truth is that Google doesn’t always disclose updates and algorithm changes. Or, being a bit nasty, we might say that Google rarely makes its ranking updates public. That’s why there is so much buzz and confusion every time someone in the SEO world notices something out of place in the SERPs.
How did we find out about the update before Google made any official statement?
It started with an email, or rather a notification, from cognitiveSEO Signals, cognitiveSEO’s tool that spots algorithm changes in real time.
Checking the tool across several countries, positions and changes, we realized that something big was going on: certainly the biggest fluctuation of the last five months for several countries, including the US, UK, Australia, India and several countries in Europe.
The short answer is: it depends on who you’re asking.
If you’re asking Google, the answer you’ll get is:
There is nothing in particular to “fix,” and we don’t want content owners to mistakenly try to change things that aren’t issues.
According to the same big G officials, a broad core update means that Google is not targeting any niche or any particular signal, such as quality. In a broad core algorithm update, Google is not targeting anything specific. There’s no “fix” for pages that may perform less well, other than to remain focused on building great content. Over time, your content may rise relative to other pages. And as with any update, some sites may note drops or gains. There’s nothing wrong with pages that may now perform less well; instead, the changes to Google’s systems are benefiting pages that were previously under-rewarded.
Of course, as with any update, some sites may note drops or gains. Yet, if you ask any SEO pro or webmaster who is very concerned about their website ranking high, it is hard to do nothing while your rankings are dropping.
We are not saying that you should look for issues on your website where there are none, or start changing things just for the sake of the algorithm update. Yet there are some actions you can take.
And yes, you can check everything by using only one tool: the cognitiveSEO toolset.
It is no secret that mobile searches on Google grow constantly from one year to the next, and Google’s interest in the mobile search market is increasing as well. What is very interesting to notice is that, when it comes to ranking fluctuations, the mobile SERPs are far more volatile than the desktop ones.
If the current algorithm update (named Florida 2.0) is the biggest Google fluctuation on desktop in the past five months, the mobile market is a completely different story: dozens of fluctuations and lots of volatility can easily be observed in the screenshots of the UK and USA mobile versus desktop search markets.
Where is all that coming from?
Well, Google Florida was the first major algorithm update, released in November 2003. It was the first significant update in what would become a decade filled with huge updates. Regardless of what it was called, Florida hit in the middle of the holiday shopping season, targeting highly commercial terms. As you can imagine, lots of pages and businesses were wiped out of Google’s SERPs.
You might think the two updates are similar in how they rolled out, but what they actually have in common is the name. Google updates were always named by the search industry; it was not Google’s officials who called their updates Fred, Penguin or Panda. The original Florida update was named that way because it was released around the time the Pubcon Florida SEO conference was taking place. History repeats itself, and with this update and the conference falling in the same month, “Florida 2.0” came naturally.
Yet the novelty came from Google this time: to everybody’s surprise, the officials decided to name the update themselves, which is why the current update is called the March 2019 Core Update.
There is a long history of naming Google updates, and WebmasterWorld used to do it. Will this be the end of an era, with Google naming its updates itself from now on? We’ll wait and see.
How about you? Did you encounter any ranking or traffic fluctuation? Let us know in the comments below!
301 redirects are one of the core elements SEO experts use on a regular basis. They are very useful but they can also be very dangerous. You can fix broken links with them, but you can also create redirect loops which can affect indexability.
In order to really take advantage of 301 redirects, you have to know how to set them up properly but also the scenarios in which they are required. Last but not least, you have to know what to avoid when using 301 redirects to deal with a problem.
In this article, you will learn when you should use 301 redirects to take advantage of them for SEO purposes and how to avoid mistakes that can affect your rankings.
Warning: Playing with URLs and 301 redirects on a large scale can have a massive negative impact on your site if done improperly. If you don’t feel 100% confident doing things yourself, it’s a better idea to get in touch with an SEO specialist who can review and help with the modifications.
Most people know what a 301 redirect is, but just in case, I feel the need to define it.
301 is an HTTP status code. Status codes indicate whether an HTTP request was successful or not; in other words, whether a web page works. A 301, specifically, is a permanent redirect from one URL to another. There are multiple status codes, some of which you probably already know: 500, for example, indicates a server error, while 404 indicates that a resource doesn’t exist. Status code 200 is the most common one, but you probably see it the least, because it indicates a successful request (you end up seeing the page instead of any status code, which is good).
The 301 status code states that a web resource can be found at a new address. So, for example, if I have page A and I 301 redirect it to page B, if you access page A the browser will automatically take you to page B.
In other words, it’s like moving a page from one address to another.
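If you want to see a 301 in action, here is a minimal, self-contained Python sketch: a throwaway local server where /page-a permanently redirects to /page-b, and a client that follows the redirect automatically, just as a browser or a search engine crawler would. The page names are invented for the demo:

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# A tiny demo server: /page-a permanently redirects to /page-b.
class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/page-a":
            self.send_response(301)                  # permanent redirect
            self.send_header("Location", "/page-b")  # the new address
            self.end_headers()
        elif self.path == "/page-b":
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"Moved here for good.")
        else:
            self.send_response(404)  # anything else is missing
            self.end_headers()

    def log_message(self, *args):  # keep the demo output quiet
        pass

server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# urllib follows the 301 automatically, just like a browser would.
url = f"http://127.0.0.1:{server.server_port}/page-a"
with urllib.request.urlopen(url) as resp:
    print(resp.status)  # 200: the final page, after the redirect
    print(resp.url)     # the URL ends with /page-b
server.shutdown()
```

The client never ends up on page A; it lands on page B with a normal 200 response, which is exactly why a 301 preserves the visitor's experience when content moves.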
Search engines try to provide the best experience for their users, so they don’t want to display bad resources in their search results. A missing resource is definitely a bad experience for users.
Due to the way search engines work, it is inevitable that some 404 pages end up in their results. The crawler finds new resources and brings them to the indexer. The indexer then indexes the page, and only after that does the ranking process begin.
Once a page has been indexed, it gets displayed when you search for it. Also, from time to time, it gets recrawled; Google does this to discover whether any modifications have been made to the page. However, a page can suddenly vanish: maybe the owner deleted it, or maybe something bad happened to the server.
During the period between two crawls, a page can still be indexed and ranked by Google, leading users to 404 pages.
At some point, Google will figure this out, and it will not be pleased. It will eventually remove the 404 pages from the index, but if the issue keeps repeating, it might even view the website as less reliable, because it wastes crawl budget.
But 404 pages don’t only affect search engines. They affect you: your website, your business and your revenue. If users search for your pages, click on your links and those links return a 404, you’ve just lost a client.
Not only that, but if you’re the one creating the 404 pages (by accidentally deleting content, for example), you’re also going to lose a lot of link equity if those pages have backlinks pointing to them, which is bad for search engine optimization. 301 redirects can help channel that equity to a new location so that your domain doesn’t lose authority.
Of course, it is inevitable that at some point a site will have 404 errors. In fact, I can create a 404 error right now by linking to a nonexistent page. But that would be bad for me as well. I don’t want to link to 404 pages, because Google will then think that I lead users and search engines to missing resources.
You see, it’s not actually the broken page that matters, but the link that’s pointing to it. As I previously said in another article (and I’m going to quote myself now):
A page doesn’t really exist until another page links to it.
You can find out if your site has 404 error issues by using the CognitiveSEO ToolSet. Use the Site Explorer and search for Broken Pages to find backlinks from other sites to your site that return a 404:
Use the Site Audit to find 404 issues within your own site. You can also view your entire set of 301s and check if they’re all good or pointing where they should.
So, if you find that your site has other sites pointing to 404 pages, you can try to contact the owner to replace the URL or simply do a redirect yourself.
The 301 redirect’s main purpose: to minimize the existence or appearance of missing or broken resources.
Now we could say that Google and other search engines love 301s, but that doesn’t mean that you should start redirecting everything. 301 redirects should be used with caution and only in specific and necessary cases, as messing things up can have devastating outcomes.
If you don’t find yourself in one of these situations, then you probably shouldn’t be playing with 301s.
When you launch a new website, one of the first things you should do is redirect all the domains to the preferred version.
There are 4 main versions of your site: http://yoursite.com, http://www.yoursite.com, https://yoursite.com and https://www.yoursite.com.
Naturally, in 2019 you’ll want to have SSL. This will probably become the default sometime in the future, who knows. In general, it doesn’t matter whether your site is www or non-www, but you can go with www to be safe (it can help with CDNs, images and cookies if your site gets bigger in the future).
In any case, let’s get back on track and state where 301s come in:
Every other version of your site should 301 redirect to the preferred version and it’s also preferable if the redirect is a 1 step process.
So if my preferred version is https://www.yoursite.com, I don’t want http://yoursite.com to redirect first to https://yoursite.com and then to https://www.yoursite.com, but directly to https://www.yoursite.com.
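The hop-counting logic behind this check can be sketched as follows. This is a minimal illustration using an in-memory redirect map rather than real HTTP requests (the function and URLs are hypothetical, not part of any tool mentioned here):

```python
def resolve(redirects: dict, url: str, max_hops: int = 10):
    """Follow a chain of redirects and return (final_url, hop_count)."""
    hops = 0
    seen = {url}
    while url in redirects and hops < max_hops:
        url = redirects[url]
        hops += 1
        if url in seen:  # A redirect loop would spin forever otherwise
            raise ValueError("redirect loop detected")
        seen.add(url)
    return url, hops

# Bad: two hops before reaching the preferred version
chained = {
    "http://yoursite.com": "https://yoursite.com",
    "https://yoursite.com": "https://www.yoursite.com",
}
print(resolve(chained, "http://yoursite.com"))  # ('https://www.yoursite.com', 2)

# Good: every variant redirects directly in one step
direct = {
    "http://yoursite.com": "https://www.yoursite.com",
    "http://www.yoursite.com": "https://www.yoursite.com",
    "https://yoursite.com": "https://www.yoursite.com",
}
print(resolve(direct, "http://yoursite.com"))  # ('https://www.yoursite.com', 1)
```

In practice you’d issue real requests and read the Location headers, but the goal is the same: every non-preferred version should reach the preferred one in a single hop.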
You can easily check this by running a Site Audit in the CognitiveSEO Toolset and implement the changes in Search Console:
Many websites still run on HTTP connections. This is risky, especially when dealing with personal data. For example, even a small contact form on your contact page could be susceptible to GDPR infringement if it’s not secure since the data could be intercepted by third parties.
If you’re planning to move your entire site from HTTP to HTTPS, you have to be very careful. I repeat: You have to be very careful.
This can have devastating negative effects if the transition isn’t done properly. By properly, I mean setting up 301 redirects from each HTTP URL to its HTTPS counterpart.
You can check this step by step HTTP to HTTPS migration guide if you want to make sure you get everything right.
Broken pages and broken links both end in 404 pages. You should constantly be looking for these types of errors, as they can appear at any time, for example, when someone misspells a URL.
Any link pointing from outside your site towards your site that reaches a 404 error should be dealt with. You can find broken links and pages using the CognitiveSEO Toolset, as mentioned above in the article.
The best scenario is to contact the owner and ask them to fix the link. However, this is time-consuming, sometimes inefficient and might even lead to them removing the link altogether.
You can redirect those 404 links (broken pages) to the most relevant page on your site to fix these issues.
When it comes to internal broken links (links that are broken within your site), ideally you should change those instead of 301 redirecting them. 301 redirects pass link equity, but some of it gets lost during the process, so a direct link is always better.
Take this with a grain of salt, though. Sometimes, you might actually want the resource to return a 404. For example, if someone links to some weird URL on your site and it looks like spam, it’s probably not a good idea to redirect that negative equity to one of your good pages.
Now, this is debatable. Some would say that you should always redirect any broken link or resource. Everyone will agree that the best type of redirect is to the most relevant resource possible.
So, for example, if I have a blog about animals and I delete a page about dogs, I don’t want to just redirect it to the homepage, and I definitely don’t want to redirect it to a page about cats.
The proper 301 redirect should always be towards the most relevant page on the website.
But what do you do in case you don’t have any relevant page? Well… the most commonly recommended thing is to redirect to the homepage. However, this comes with a problem:
By displaying a 404, you give the users an answer and also have the chance to show a call to action or at least make them laugh (via design). By redirecting to the homepage, you simply send the users somewhere they didn’t expect to land.
If the user’s intent was to read an article about a topic, he or she will be even more confused by ending up on the homepage of a site than landing on a 404.
So here, a 404 can be a lot more helpful, especially if you add a nice design to it and also a call to action. Here are a couple of examples, maybe even better than the one above if you have a big website with a lot of 404s:
Sorry, the information you were looking for isn’t here (can be personalized) but:
It’s not such a big deal if your website has 404 errors here and there, but if it’s on a large scale or if they are pages that quality sites link to, then you should redirect them to the most relevant resource.
Also, keep this in mind:
It’s better to return a full 404 than a soft 404, which is a 404-looking page that hides a 200 status code under it.
Soft 404s look like a bit of trickery to Google. On one hand, you’re telling the user that the resource they’re looking for isn’t there, but on the other hand, you’re telling Google that the page is OK.
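The mismatch that defines a soft 404 can be sketched as a simple heuristic: a 200 response whose body reads like an error page. This is only an illustration, and the phrases checked below are examples, not an exhaustive list:

```python
# Hypothetical phrases that suggest an "error page" body
NOT_FOUND_PHRASES = ("page not found", "doesn't exist", "nothing here")

def looks_like_soft_404(status_code: int, body: str) -> bool:
    """A 200 response whose body reads like an error page is a soft 404."""
    body = body.lower()
    return status_code == 200 and any(p in body for p in NOT_FOUND_PHRASES)

print(looks_like_soft_404(200, "Sorry, page not found!"))  # True  -> soft 404
print(looks_like_soft_404(404, "Sorry, page not found!"))  # False -> proper 404
```

The fix is simply to make the server return a real 404 (or 410) status code along with the friendly error page.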
In SEO, it’s usually a good idea to never change the URL of a specific resource.
However, if you do need to change it, you should always 301 redirect the old URL to the new one. Popular CMSs, such as WordPress, even do this automatically: if you change a URL, you’ll notice that the old one redirects to the new one.
When you change a URL, Google will have to first crawl it, then index it and rank it all over again. This can take time. Setting up a 301 redirect will tell Google that the page isn’t an entirely new page, but actually, an old one that has just moved its address.
For example, you might have a very old website with some very old pages that used to have underlines in the URLs. As you know, dashes are now preferred, so you might want to change https://www.yoursite.com/this_page/ to https://www.yoursite.com/this-page/.
If you do it, make sure to 301 redirect the old page to the new one.
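Generating the old-to-new pairs for an underscore cleanup like this is a one-liner. A tiny sketch (the URL is the example from above; in a real migration you’d only rewrite the path, not the domain):

```python
def dashify(url: str) -> str:
    """Replace underscores with dashes (this example URL has none in the domain)."""
    return url.replace("_", "-")

old = "https://www.yoursite.com/this_page/"
print(old, "->", dashify(old))
# https://www.yoursite.com/this_page/ -> https://www.yoursite.com/this-page/
```

Each resulting pair then becomes one 301 rule pointing the old URL at the new one.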
Redesigning or improving a website on a large scale can often end up in deleted pages, moved or rewritten content.
If you’ve removed any pages during a redesign process, make sure you redirect those pages accordingly to the most relevant resource on your site.
Again, big changes on websites can always have negative SEO impacts if certain aspects are not taken into account.
If you’re in the process of redesigning your website or are thinking of doing it in the future, you can always check our website SEO redesign checklist.
If you have a very big website, especially in the eCommerce field, you’re constantly dealing with duplicate content or dynamic URL issues.
For example, if you have a big set of products available in both red and yellow, the use of dynamic URLs might create duplicate or at least very similar content when filtering for either color.
301 redirects can help with this in certain scenarios, but you can also use Canonical Tags. You can read more details about canonical tags towards the end of this article.
You can check if you have duplicate or similar content issues using the CognitiveSEO Site Audit:
Recommendations regarding this issue can vary from one scenario to another. Due to the fact that this usually also happens on a large scale, with thousands of pages, it’s always better to contact an SEO specialist before making any modifications.
However, if you do have multiple URLs that are almost identical, you can redirect them accordingly to a final version. This can potentially strengthen that page, as it won’t be cannibalized and the link equity from different links, if any, will be sent to that page.
Are you planning to change your domain? Do you have two brand websites and would like to combine them? Then 301 is the way to go.
But don’t make the mistake of just setting a simple 301 from one domain to another. Each and every URL must redirect to its new location on the new domain.
Ok, now that we’ve covered the most common and important cases when you should take advantage of 301 redirects, let’s get into how exactly you can set up correct redirects from one URL to another and even from one domain to another.
Setting up 301 redirects is actually simple. That is… if you don’t have to set up thousands of them. You can set them up in different ways:
Via Plugins: Setting up redirects via CMS plugins is pretty easy. You can use any redirection plugin / extension / module. Usually, there are two fields, the one with the current URL and the one with the desired URL.
Via .htaccess: Setting up 301 redirects can be done via the .htaccess file on your server.
If you want to redirect from one URL to another, it’s pretty simple. You just have to add:
Redirect 301 /old-URL/ /new-URL/
You can read more on .htaccess redirections here.
Via cPanel: A cPanel redirect can also be used and it’s pretty easy to do on a small scale.
Via Domain Level Redirect: Last but not least, you can set up a domain level redirect from your domain registrar dashboard. This is a good way to redirect especially if you’re merging from one domain to another.
Set two redirect records, one with the host www and another with the host @, each pointing to the new domain, and make sure to add a trailing slash at the end of the domain. So, if I were to redirect cognitiveseo.com to brandmentions.com, it would look something like this:
Type > Host > Value
Redirect Record > www > brandmentions.com/
Redirect Record > @ > brandmentions.com/
This will redirect all pages to their new counterparts (for example, it will not only redirect cognitiveseo.com to brandmentions.com but also cognitiveseo.com/page to brandmentions.com/page).
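That path-preserving behavior amounts to rebuilding each URL on the new host while keeping everything after the domain intact. A minimal sketch (the helper is illustrative, not what the registrar actually runs):

```python
from urllib.parse import urlsplit, urlunsplit

def remap_domain(url: str, new_host: str) -> str:
    """Rebuild a URL on a new host, keeping path, query and fragment intact."""
    parts = urlsplit(url)
    return urlunsplit((parts.scheme, new_host, parts.path, parts.query, parts.fragment))

print(remap_domain("https://cognitiveseo.com/page?x=1", "brandmentions.com"))
# https://brandmentions.com/page?x=1
```

This is why a domain-level redirect is convenient: one rule covers every page instead of thousands of individual redirects.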
This is actually the question which led us to writing this article:
“When you are moving site or content and make proper 301 redirects (one to one), what is the safe period after which we can consider all possible page juice is passed to new pages and Google deleted it from its index and it’s safe to permanently kill redirects?”
The answer is pretty straightforward:
It is never safe to remove 301 redirects. The best case scenario is keeping them running FOREVER.
Sounds like an evil request, but you heard me right! You have to keep them for eternity. Why? Well, it depends. But it’s just safer if you never remove the redirects.
If it’s just a page with no backlinks and no traffic, you can simply check if the new URL has been indexed and the old URL has been deindexed. You could then remove the 301, as it’s no longer needed. However, if your page does have backlinks (even internal links within your own site), then removing the 301 will result in a 404.
It’s always better to keep them, as long as they don’t create any technical issues with the server, which they shouldn’t.
There are also some things that you must make sure you don’t do when working with redirects!
The worst thing you can do when working with 301 redirects is to create a redirect loop.
Or, you might have seen it before in the form of “This website redirected too many times”. A redirect loop is when Page A redirects to Page B and then Page B redirects back to Page A. I hope you can understand why Google would get frustrated.
It’s also recommended not to do multiple or chain redirects. You can have 2 or 3 if needed, but Google won’t usually follow more than 4 redirects.
Google tries to crawl the web as efficiently as possible. Each site gets a certain ‘crawl budget’. If you waste that on abandoned chain redirects or redirect loops, you can end up having less for other important pages.
So, instead of having:
Page A > 301 > Page B > 301 > Page C > 301 > Page D
You should have:
Page A > 301 > Page D
Page B > 301 > Page D
Page C > 301 > Page D
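The fix above amounts to flattening every chain so each source points straight at the final destination. A minimal sketch of that idea (an in-memory map standing in for your real redirect rules):

```python
def flatten_redirects(redirects: dict) -> dict:
    """Rewrite each redirect so it points directly at its final target."""
    flat = {}
    for src in redirects:
        dst, seen = src, {src}
        while dst in redirects:
            dst = redirects[dst]
            if dst in seen:  # guard against redirect loops
                raise ValueError(f"redirect loop involving {src}")
            seen.add(dst)
        flat[src] = dst
    return flat

chain = {"page-a": "page-b", "page-b": "page-c", "page-c": "page-d"}
print(flatten_redirects(chain))
# {'page-a': 'page-d', 'page-b': 'page-d', 'page-c': 'page-d'}
```

After flattening, every crawl of an old URL costs exactly one hop of crawl budget instead of three.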
Also, when you’re trying to fix broken pages, don’t just redirect everything to the homepage. It won’t necessarily do any harm, but you can maximize effectiveness if you redirect each page to something relevant.
People often get confused by canonical tags and 301 redirects because they are sort of similar. So, which one should you use and when?
The canonical tag’s purpose is to tell search engines which page to display without redirecting users to that page.
So, by using a rel=”canonical” tag, Google will see Page A but display Page B in the search results; when users access Page A (via the site navigation menu or direct URL), they will still see Page A.
Generally, it is better to use 301 redirects when dealing with missing or old content, but it’s probably a better idea to use canonical tags when dealing with duplicate content caused by dynamic URLs.
Going back to the example I previously gave (a big set of products in both red and yellow), using a canonical tag lets users browse the site undisturbed while telling the search engines which version of the page to display.
Please note that if users search for both the yellow and the red versions, you should keep both pages indexed. However, if users only search for the product and never for the colors, it’s a better idea not to cannibalize the results.
Please read this article about common canonical tag mistakes if you want to learn more about this topic.
301 redirects can be both very helpful and deadly (if used the wrong way). Make sure you properly 301 redirect when encountering any of the cases mentioned above for the best search engine optimization outcomes.
Important things to remember:
Have you ever used 301 redirects to fix SEO issues? How did that go for you or your client? Let us know in the comments section! We’re really curious to find out.
Within this cognitiveSEO Talks episode you’ll get the chance to get inspired by Lukasz Zelezny, a prolific keynote speaker, SEO consultant, and author. He started working in the SEO industry around 20 years ago while living in Poland. Every year he actively participates in 10 to 20 events as a keynote speaker, and he has worked for mid-sized and large companies such as HomeAway, Thomson Reuters and The Digital Property Group, to mention just a few.
As Lukasz himself mentions, he is a hands-on person, spending lots of his time keeping up to date with the changes in online marketing technology. He started his professional career in 2005 and has since been responsible for the organic performance of a number of companies, including HomeAway, Thomson Reuters, The Digital Property Group and Fleetway Travel.
Lukasz has traveled 75,000 km speaking at many SEO and social media conferences, including ClickZ Shanghai China, ClickZ Jakarta Indonesia, SiMGA Malta and SES London in the United Kingdom, as well as conferences held in Europe: Marketing Festival in Brno, Brighton SEO in Brighton and UnGagged in London.
Additionally, whenever he has the chance, he organizes workshops where he shares tips on SEO, social media and analytics. And speaking of SEO tips and tricks, we hope you enjoyed the list of SEO tips Lukasz shared with us in this interview.
The post How Long Does It Take to Rank in Google & How To Speed It Up appeared first on SEO Blog | cognitiveSEO Blog on SEO Tactics & Strategies.
Regardless if you’re just starting out with SEO, or you’re offering SEO services for a long time, everybody seems to ask you the same thing:
“How long does it take to rank in Google?”
Well, it’s time to understand exactly what factors impact ranking time so you can approximate (after an SEO audit) how long it would take to rank and, if it’s the case, sign a contract with your client accordingly.
This can be a tough one to answer, and it might even prevent you from successfully making the sale. A while ago, I addressed it in the How to Convince Clients for SEO article. The short answer there was that you can’t really tell how long it will take, because it depends on many factors, such as budget and competition.
As mentioned before, there are some factors that influence how long it will take until you can successfully rank a page for a keyword.
You can’t really tell how much time it will take to rank until you audit the website. How many keywords does it want to target? What’s the competition for those keywords? What’s the current state of the website? What’s the budget?
These are all factors that affect how quickly a website can get to the top.
Quick tip: One thing I like to do is to offer the audit for free if they eventually sign the SEO services contract (for the period of time you determine after the audit). If they decline, you bill only the audit. I do this because sometimes an audit can take less than one hour (if the site is small), but it can also take weeks or even months (if the site is huge).
The first thing that has to be taken into account is the website itself. If it has issues, fixing them will take time. This doesn’t mean that you can’t start working on keyword research or content creation; you should do that as soon as possible. But if your site isn’t mobile-friendly, for example, it just can’t make it to the top these days.
It is worth mentioning that a website which loads slowly and lacks authority will perform worse than a fast site with lots of quality backlinks.
Fixing technical issues doesn’t necessarily speed up the ranking process, but it is something most people tackle first because it’s very important. How are the URLs, canonical tags and hreflang tags? Does the site get indexed well? Is it mobile-friendly, or does it need a complete overhaul?
Depending on which technical problems the website has, it can take from 1 month to 1 year, so it’s really hard to estimate.
If you have a bigger budget, you might fix them faster, because you could outsource to multiple people. If not, it will just take more time.
There’s no time here to spend detailing what technical problems could arise and how to fix them, so we’ll just assume that you’ve taken care of them all using our technical SEO guide.
Generally speaking, an older website will perform a little better than a new one, provided, of course, that it has something useful to offer.
Either way, a website is definitely not trusted by Google when it’s first launched. It could be just another automated SPAM site, or just a really bad website. You’ll need to spend some time getting it to index well, showing Google that it’s well structured and fast and that users like it.
You’ll also have to take into account how long it is going to take to build the website, if it doesn’t exist yet. Is it a small one? Is it a big one? Does it have basic functionalities or is it a complex digital eCommerce store with hard to implement features?
Again, in general, bigger websites tend to perform better. Why? Because it also means they’re probably a little older and chances are they also have backlinks. However, while big websites might rank a page faster than a small site with no authority, it will also take more time to rank all the pages because they are so many.
If you decide to tell your client: “I think we could rank for these keywords in less than a year”, then you’d also have to assume you’re going to publish all the articles for them at the same time, right now, which is unlikely to happen.
Domain Performance is critical when trying to see how long it will take to rank. If your website has absolutely no backlinks pointing to it, then it’s probably going to take a lot longer to rank for a target keyword than if you had thousands of (quality) backlinks pointing to it.
You can see a quick estimation of your domain performance by entering your website URL in the CognitiveSEO Site Explorer.
If your domain Influence is low and your competitors’ domain influence is high, it’s going to take a lot longer to rank against them.
Now that you’ve established where your website or your client’s website stands, you can look at your competition. The secret here is to spot opportunities. If you want to establish a long-term strategy, I’d focus on creating high-level content and targeting medium to high competition keywords (be realistic though), while generating short-term profit via PPC.
When you’re looking to rank for a particular keyword, you’ll have to see what type of content is already ranking there.
You can see an average of your targeted keyword’s content performance using the CognitiveSEO Keyword Tool. You can also look at individual pages and see how they perform.
However, it’s not the only thing you should take into account. Is it something ugly? Is it really good looking? What about the information? Is it good? Can you find a gap?
Last but not least, how long is it? Does Google rank short articles, or “Ultimate Guides” that can be 10,000 words long? You can see that in the tool as well, next to each result.
If you’re going against websites with really strong quality signals (such as Wikipedia), it will take longer than if you go against websites that don’t have a strong link profile.
So, if you see that a results page is full of websites with low domain performance, it’s a good sign you can rank faster than usual. This often happens in local SEO niches, where there are higher chances of small sites being the only ones targeting those keywords.
Again, the Keyword Tool can help you identify if there’s an opportunity. You can see the average by looking at the Keyword Difficulty, which includes a mix of the average content performance and domain performance.
However, it’s also enough to spot a couple of low domain performance websites ranking to know that you can do it as well.
If the top websites don’t target the keywords directly in the title, there are 2 possible reasons:
Budget is really important when trying to determine a website’s ranking time frame. It directs everything, from how fast the technical fixes will be implemented, to how well the content will be written.
Depending on how well or bad a website is optimized, the required technical budget can be small or huge. There might even be situations where the best solution is to rebuild the website from scratch.
This adds budget, but also time. If you have to rebuild a website, you can’t start publishing content right away. It’s better to wait until it’s finished.
For example, I’ve been working with a client who has a custom-built website. This was a turnoff for me, because I knew we would run into issues. Not only did it take a lot more time, because the programmers responded slowly and we had to keep going back and forth to get things right, but it also cost the client more money.
For another client (actually my lowest budget client ever, which was also a very large site but didn’t really want to invest much) things didn’t go so well. They had a pretty serious hreflang content duplication issue which I did not know how to fix (from a programming perspective) and did not want to spend the enormous time required to learn how to do it, especially for that budget.
Either way, they knew from the beginning what I could do and what I couldn’t and how much effort I was willing to put in for the amount of money they invested.
Content creation is the most time consuming and expensive part on the long run. However, it’s the best investment you can make. Without content, you won’t have a foundation to play with. You can’t outreach or get links to something that doesn’t exist. Your clients can’t read the article you never published.
If you only have $10 to spend on an article, it’s probably not going to be the best one, at least not in the US market. Pay $50-$100 instead and you might have a chance.
Of course, you also have to consider the other factors mentioned above, such as competition. If their content looks like $5, then a $10 article might just do it.
Sometimes, there’s only so much you can do with content. If all the other competitors build links, you might feel like you’re not making any progress at all. It’s not an issue if you purchase publicity on other sites, but try as much as possible to make it worthwhile in terms of clicks as well. Don’t just buy a link that nobody will see.
Again, if someone asks you for $1000 for a backlink, you’re better off investing that in quality content, outreach and building relationships. It’s not easy to do everything yourself, but try as much as possible to develop an outreach strategy which you or someone else can easily execute.
You can also use the CognitiveSEO Tools to easily determine link opportunities for outreach. For example, you can go into the Competitive Analysis (once your campaign is ready) and look for Common Domains. Then, find the ones that your site (Site 1) doesn’t have yet. This saves time, because you won’t be blindly writing e-mails. Instead, you’ll be targeting webmasters you already know are interested in the topic.
Social Media is a great way to initially boost your content. If the signals are good and the content receives traffic, it means that it’s performing well and that people might also like to see it in the search results.
Managing a social media account isn’t easy. It’s not just about simply sharing your latest article on a page. Your main focus is engagement. If you’re interested, here’s an article on how to establish a great social media strategy. You’ll see that it’s no child’s play and that it must also be done properly in order to be effective.
Once you start ranking on the second or first page, you might notice that sometimes, Google throws your website to the top, on spot 1 or 2, but only for a while. Don’t celebrate yet, because chances are it’s only a Google Dance.
This happens when Google sees algorithmic signals such as quality content and backlinks, but they also have to check it with real time Rank Brain signals such as CTR. Will the users like your content? Can’t know without sending a little bit of traffic to it, right?
If the Google dance keeps pushing you up, it’s a good sign. This means that you’ll rank on position #25 for a while, then go on #2 for a day, then back to #13. After a week or so, you’ll be on #3 for a day, then go back to #9 and so on.
To give you a quick idea, I’ve created this graphic to showcase how long it will take to rank in different scenarios. The graphic is valid assuming your site is rather new and the piece of content you’re going to publish is of very high quality.
Note that these are just rough estimates and shouldn’t be taken as guarantees:
You have to sort of ‘add up’ the rows, meaning that if you have a poorly optimized website it will take you between 6-12 months to fix that before you can actually start writing quality content. Then you would focus on content and domain performance.
If your Domain Performance is high as well, then you can ‘cancel’ the Domain Performance metric and consider your competition ‘low’. This doesn’t apply to Content, though, as that rather involves UX and CTR metrics.
You really only have 3 options that can really speed up the process of ranking a page in Google. And no, they’re not quick and easy. It is what it is. SEO takes time. If you want short term profits, take advantage of PPC.
The real simple way of increasing your chances to rank is to increase budget. Better content, professional design, greater outreach efforts and social media boosting of the article once it’s published will definitely help you rank better.
However, the costs of publishing content like the above can go up to $500 if not more. But it will pay off if done properly. Kick-starting things now can be beneficial in the long run.
There’s also the option of buying backlinks. This can save a lot of time. However, it is risky and can get you into trouble. The safe way is to buy nofollow backlinks. You might think that it’s not helpful, but in fact, nofollow backlinks do help with SEO.
Either way, you should focus on finding quality backlinks that are relevant. The link should reach an interested audience. It’s ideal if it can also bring some traffic. If it costs more than $250, then you’re probably better off spending that on quality content creation or outreach.
Don’t waste money on BlackHat SEO tactics such as automated link building or PBNs. It is risky and it can even end your business if you mess up a big client’s rankings.
A bigger budget also means improving your sales skills. If your client keeps asking about the time frame, you should be ready to explain how you can speed things up and that doing so requires more budget.
When I say really good content, I don’t mean just “quality content”. I mean really, really good content, which you spend weeks or even months developing, planning outreach for and designing.
To be honest, this very piece of content doesn’t fit that category. Although it’s (hopefully) high quality, we haven’t spent months designing it and, while we do outreach (I’ll probably reach out to Nathan to let him know that I featured his video), I haven’t spent months developing a list.
A good example of a really good piece of content is what Nathan Gotch did with his “blogspot” article. He ranks for that keyword, although it’s a branded keyword and most people are directly looking for blogspot.com, not his article.
His twist was to come to the topic from another angle and write something that hasn’t been written yet. In this case, the results were full of “how tos” but, instead, he decided to write “11 Huge Reasons to AVOID Blogspot“.
Here’s his video about the process:
Writing this type of content can make it go viral, which means people will start sharing it and linking to it, which will get it ranked really fast on its own. However, you’ll probably need to kick-start its growth somewhere on social media.
Your final option is to only target low difficulty keywords, which means spending more time in the research phase.
In order to correctly position yourself in the market and know which keywords you can successfully target, you should check out and execute our SEO competitor analysis framework.
Even so, this only speeds up ranking, not traffic or profits. You can spend one year ranking for a higher-difficulty keyword with 1,000 searches per month, or you could spend one month ranking for a keyword with 100 searches per month, but you’ll need 10 of those to make up for the traffic you’d get from the first one.
If you’re starting off with a low performance website and a low budget, you should focus on low competition keywords.
Links to a single page will help that page rank faster, even though other sites have a greater domain performance. So, if you really want to rank on a particular keyword and you’re competing against a high Domain Performance website, you can compensate by getting links to that specific page.
You can check direct links using the Site Explorer. Simply paste the exact URL there and you’ll see how many backlinks a page has. This way you can also spot link opportunities.
Seeing results with SEO usually takes from 6 months up to one full year (even for experienced SEOs). It can also take longer, because it depends on many factors. Of course, it can also take less. There are many high quality websites out there that are poorly optimized for search; a small intervention can dramatically increase their traffic.
What’s your experience? How long does it generally take you to rank a website for its targeted keywords? What was your fastest one and which one took the longest? Tell us in the comments section below!
The post How Long Does It Take to Rank in Google & How To Speed It Up appeared first on SEO Blog | cognitiveSEO Blog on SEO Tactics & Strategies.
Relationship marketing is important in every business. Clients keep your business running. Getting them is one thing, but keeping them is a whole different story.
Customer loyalty can ensure a strong long-term relationship with your clients. Having the right tools to manage them and deliver what they need can lead you in that direction. We think that business success relies on the number of satisfied customers, plus other ingredients that make the strategy “spicier” – more elaborate and powerful.
Check out these effective relationship marketing strategies to maintain, engage and convert your clients.
Before you start building a relationship marketing strategy, you need to know your audience in order to connect with it. If you’re struggling to figure out your niche, follow these steps:
Your clients are an important asset to your business; they make your business work, and your business will exist as long as you have clients. More benefits come from your loyal customers, because they continue to bring monetary value to the company. You need to keep your customers engaged to keep them loyal.
According to Gallup’s customer database, half of all customers are satisfied and only 38% of them say they are engaged.
Customer engagement doesn’t automatically follow satisfaction.
Loyal customers need attention to keep their engagement rate high. According to Experience Matters, loyal customers are 5x as likely to repurchase, 5x as likely to forgive, 4x as likely to refer, and 7x as likely to try a new offering. Loyal customers are more likely to create customer lifetime value (CLV).
It all starts with building a team that delivers a quality experience. Helping your customers understand the value of your services is highly important, so make it easy for them to learn what you offer. It’s just like in UX design: users have a path and your website should follow that path. The picture below is a good representation:
As we all probably know by now, UX is very important for SEO, too. Many experts think UX is crucial to the future of SEO, especially with the evolution of machine learning technology. And UX goes beyond SEO: your team must see what users want, what they need, and how your services benefit them and help them achieve their goals.
Helping customers understand and value your product over your competitors’ is not a sales tactic.
Your support team is on the front line, talking to and keeping in contact with your customers. Building a great customer service team is priority number one in developing relationship marketing. Promote a customer-centric policy to your team and try to engage them in providing high-quality customer support. Your team should know how to deal with both negative and positive situations and offer satisfaction to the user.
To know how to deal with both negative and positive situations, you’ll need a Q&A template that can be modified depending on the user’s needs. There are lots of companies specialized in all sorts of activities, but every single one of them receives lots of questions about what they do and how they can help the user. A template offers great support.
There might be cases when conversations are nuanced and get a little tricky. That’s why it is good to have a plan: it gives you ideas, keeps you out of trouble and pleases the customer at the same time.
HelpScout created helpful customer service question-and-answer templates covering all sorts of situations.
For small businesses, it is easier to talk with customers and you don’t require an elaborate Q&A template. If you have a SaaS or a more complex business, it would be great to have some documentation. It could benefit both your team and your customers for future reference.
There are also situations when you shouldn’t play by the book and should let your team improvise. Skyscanner did an amazing job responding to a man who got stuck with a 47-year flight connection:
Short for Ask, Categorize, Act, and Follow-up, the ACAF Customer Feedback Loop is a business strategy that centers itself on the customer. It has 4 steps:
Feedback is very important in every business. It helps you evolve and correct your mistakes. Ask for valuable, personal feedback from your customers to see what you’re doing right and what you’re doing wrong.
Always thank the customer for the feedback and make them feel appreciated. Not to mention that feedback creates innovative opportunities: by hearing what your users need, you learn what you could develop in your product.
Having a plan to provide quality customer support doesn’t require advanced technologies. You need to set up a process: the user sends a message, you assign conversations to the right people, you think about how to engage with users and how to do the follow-up. After that, think about what to do with the email addresses, such as sending newsletters or special offers.
The customer support process tends to be more complex. You’ll need the following:
If you’re a small or mid-sized business, you can use some free options, but an enterprise needs more advanced technologies. Having an automated customer support process can ease your work a lot. In practice, it can be a service such as a help center, a chatbot or other artificial intelligence, which we’ll elaborate on at point 12.
Think of this: happy clients who had a good experience will return; unhappy clients who had an unsatisfactory experience won’t return and can spread the bad word about the company. Studies show that almost 13% of unhappy clients tell more than 20 people about their bad experience.
Knowing what to say to your customers can be a daunting task. You can always improve your customer experience by providing excellent customer service, quickly offering answers, solutions and trying to maintain a strong relationship with your customers.
It doesn’t really matter if your product or services are impeccable if the user experience is poor. It is highly important to resolve issues when they appear and provide a clean on-site experience without problems and errors.
More than that, you should actively engage customers on social media or blogs. It will strengthen the customer relationship and they can promote your business afterward.
Customer retention shows a company’s ability to keep its customers over a period of time. A high retention rate means users and customers continue to buy products and bring revenue to the company.
Studies show that acquiring a new customer is anywhere from 5 to 25 times more expensive than retaining an existing one. Retaining customers is a pretty important thing. Every single person thinks about the benefits they get when choosing a brand, a product or certain services, and the brand must think about how to keep those customers.
There are lots of customer retention strategies you can personalize for your clients to keep and bring ROI, but some of the most profitable are:
These actions can be developed into campaigns and turned into something really interesting, depending on your niche and business. They come in addition to the strategies we’ve elaborated on throughout this article, to help you set up a more specific customer retention plan.
Since we’ve mentioned brand loyalty, you should know it is a good strategy for relationship marketing. Customer loyalty works very well and can strengthen the relationship you have with your clients. Basically, a loyalty program shows you care about your customers. Loyal customers are hard to acquire, but once you have them, it costs the business about 5-25x less to sell to an existing client than to acquire a new customer.
Every business should aspire to customer loyalty; it is vital to its existence. There are lots of ideas you could generate by building a loyalty program. You can send the message by email, through an app (if your business has one), to the user’s account and so on. It can be a discount, a gift or something else, depending on your creativity. You can also find lots of ways to reward loyal customers in the digital marketing space.
Uber, for example, rewarded Gold members with some interesting perks not available to all members. The membership levels are Blue (0 points to qualify), Gold (500 points), Platinum (2,500 points), and Diamond (7,500 points). All members who hit 500 points and join Uber Gold get flexible cancellations that refund the $5 cancellation fee if they rebook within 15 minutes. In addition, members get priority support.
Platinum and Diamond members get a lot more services that make the ride more pleasant, such as:
Loyalty programs can boost your ROI and keep the customers that really matter.
Creating a referral program is born from customer satisfaction. People who are pleased with your services might tell other people about their experience, so why not take advantage of that? Encourage them and don’t leave room for second thoughts. Satisfied customers are willing to make referrals, and those who receive such referrals are more likely to pay attention to them than to the brand’s own messaging.
If you get your referred customers engaged, you’ll get some of the best new customers possible.
Let’s set aside our reasons why you should use a referral program and listen to the studies. R&G Technologies discovered that referral leads convert 30% better than leads generated from other marketing channels. More than that, referred customers have a 16% higher lifetime value.
There are lots of businesses that use referral programs. Booking, for example, gives you $15 if you recommend it to a friend. You invite your friends by sending them the referral link; they book and stay at the accommodation, and after their stay, you and your friend both get $15 toward your next booking.
Your customers can become true advocates, connecting with others and sharing your product when they have a good experience. Rewarding a satisfied customer through a referral marketing program doesn’t require much work, and it can bring lots of benefits for everybody involved.
Even if you focus on relationships, you shouldn’t miss customer data and numbers. In order to keep track of your business’s success, you’ll have to follow sales numbers and customer feedback metrics.
One of the most important metrics in this situation is customer lifetime value (CLV). The formula for the metric is:
Estimated Average Lifetime Value = (Average Sale) x (Estimated Number of times customers purchased)
There are free tools that calculate the metric. Google Analytics, for example, measures lifetime value for users acquired through different channels. You need to select two Google Analytics metrics and compare them to identify the date range during which you acquired users. These are the metrics available:
You can compare LTV in relation to the cost of customer acquisition (CAC) to measure how long it takes to regain the investment to get a new customer.
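As a minimal sketch, the lifetime value formula above and the LTV-to-CAC comparison look like this (all figures below are made-up examples, not benchmarks):

```python
# Estimated Average Lifetime Value = (Average Sale) x (Estimated Number of purchases)
# All numbers are made-up examples for illustration.

average_sale = 50.0          # average order value, in $
purchases_per_customer = 8   # estimated purchases over the customer's lifetime

clv = average_sale * purchases_per_customer   # 400.0

# Compare against the cost of customer acquisition (CAC)
cac = 100.0                  # cost to acquire one new customer, in $

ltv_to_cac_ratio = clv / cac            # 4.0 -> each customer returns 4x what they cost
payback_purchases = cac / average_sale  # 2.0 -> roughly the 2nd sale recoups the CAC

print(clv, ltv_to_cac_ratio, payback_purchases)
```

With numbers like these you can immediately see whether acquiring a new customer pays for itself, and how quickly.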
Marketers agree that growing CLV is essential to the health of their organization and a key success metric.
Conversion rate is another important metric that should be tracked. Conversions could be under lots of forms, depending on what business goals you’ve set. It can be:
There are lots of other metrics you could follow, and they can be tracked automatically with the many services available, regarding:
Businesses are more likely to become successful if they build long-term relationships with their customers. Having a customer relationship management (CRM) strategy in place will bring lots of benefits in the future. You can say goodbye to tangled messages, scattered customer history, hard customer management, missing customer support flows, bad organization and more. And you can create long-term customers more easily.
There are lots of customer relationship management tools that come in handy when you want efficient, easy management of your customer relationships. You can see a customer’s history, lots of data and conversion numbers. Below you can see an example from such a CRM tool.
These tools can save you a lot of time organizing and customizing your process. You are able to track and organize every stage of your sales pipeline, manage a large number of contacts and customize the steps as you wish.
For managing contacts and conversations you can use apps and services that include customer support technology. Intercom is an example. The messaging platform makes it very easy to:
The tool is great for engaging and retaining customers.
Customer feedback, questions and conversations can be a great starting point for a new documented article. Generating article ideas based on your customers’ questions will lead to a high open rate, because you’ll talk about something that lots of people search for. Imagine if all your articles answered your readers’ questions: the CTR would be very high.
There might be times when marketers write for the industry rather than for the customers, and they might lose a big part of the audience. If you write for your actual clients, you can also manage to acquire new customers.
Customers’ questions create content marketing gold.
When your current customers ask you questions, it’s like finding out their needs very easily. You can see if there’s confusion, feedback, room for improvement, something they want to learn or anything else. More than that, you can use that piece of content in the future as a reference in case other customers ask you similar questions.
Maintaining a strong customer relationship isn’t the job of a single department. Everybody should work in order to fulfill customers’ needs. Sales teams are not entirely responsible. It should be a synergy.
Relationship marketing is a strategy that will foster customer loyalty and ensure long term engagement. You need to master this in order to bring customer retention and satisfaction.
The post Best Internal Linking Structure & Strategy to Boost Your SEO appeared first on SEO Blog | cognitiveSEO Blog on SEO Tactics & Strategies.
It’s interesting for me to see that even experienced SEO specialists forget about the power of internal linking. As backlinks from other websites become harder and harder to obtain (because people focus on the wrong techniques), using an internal linking strategy with the right combination of anchor texts can bring great SEO results.
But what makes a good internal linking strategy? Well, the answer varies from site to site but, generally, it’s the foundation that matters. Build it right from the start and understand the basic concepts and you’ll be set for having a good internal linking structure forever.
In this article, you’ll learn everything you need to know, so keep reading!
Internal links are just like backlinks, but within your own website. They are links that go from one page on a web domain to another page on the same domain. They are most commonly observed in navigation menus, sidebars and footers, but also within the article body.
Search engines look at a lot of things when they are trying to determine which pages they should rank. One of the things they look at is internal links.
Through internal linking, your website vouches for its own pages. I know, it’s kind of narcissistic, but it’s really helpful for search engines and SEO!
If we go by the same rule as in off-page SEO (a page with more backlinks is more valuable in the eyes of other sites), then in on-page SEO a page with more internal links is more valuable in the eyes of your own website.
So if you said “My eyes are beautiful” 252 times per day and “My nose is beautiful” 9 times per day, people would naturally figure out that you REALLY like your eyes and that they’re very important to you.
But enough about how pretty I am, let’s get back to serious stuff!
Generally, it’s enough to just do internal linking in order to benefit from it. That’s because many people ignore it completely! However, it’s important to understand exactly how Google treats internal links if you really want to take advantage of them.
PageRank, although it sounds ancient, is still used. So when you link to a page from another page, be it internally or externally, you pass PageRank. It’s Google’s score for… ranking pages (actually, I think Larry Page really wanted his name in one of the algorithms).
When you add more than one link, the PageRank splits evenly. This means that if, hypothetically, the PR score was 100 and you added 3 external links, each would get a score of 33.3.
The way PageRank works has changed over time. A while ago, people would ‘sculpt’ PageRank by using rel=”nofollow” tags. This meant that you could link to 50 pages but follow only one link, passing the entire equity to it. However, when you use a nofollow tag today, that PageRank simply vanishes, so you won’t gain anything.
The PR algorithm is complicated, but it’s important to understand that it doesn’t apply only to backlinks, but also to internal links.
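The even split from the hypothetical 100-point example can be sketched in a few lines. Keep in mind this is only a toy illustration: real PageRank is iterative and uses a damping factor, which this sketch ignores.

```python
# Toy illustration of the even PageRank split described above.
# Real PageRank is iterative and uses a damping factor; this only
# shows how a page's score divides across its outgoing links.

def split_pagerank(page_score, outgoing_links):
    """Each followed link gets an equal share of the page's score."""
    if not outgoing_links:
        return {}
    share = page_score / len(outgoing_links)
    return {link: share for link in outgoing_links}

shares = split_pagerank(100, ["page-a", "page-b", "page-c"])
print(shares)  # each of the 3 links gets ~33.3
```

Note that a nofollowed link would still count toward the split today, so its share is simply lost rather than redistributed.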
In one of my last articles I mentioned how you can create really strong pages by linking them in boilerplate content.
This is very important when thinking about your internal linking structure, especially if you’re trying to rank for multiple keywords.
For example, it doesn’t really matter that I link to the domain analysis tool under a different anchor text now, because the navigation already links to it under the anchor text “Site Explorer”.
Don’t take this assumption for granted though! Many tests have been made and Google officials ‘kinda confirmed this, but as of today we aren’t really sure if that’s still the case.
In SEO, things might change over time. The truth is that it’s hard to believe Google only takes the first anchor text into account, especially because Google keeps endorsing contextual links found in the body. However, it’s safer to assume at this point that the first link is the one that matters, so make sure you use the most valuable anchor text if you plan on adding important pages to your navigation.
Quick Tip 1: If you want to avoid that, you can just link to a general services page (without any drop-downs to separate services) where you can then list each service and link to its specific page.
Quick Tip 2: If you do link to your important pages in the navigation section, consider diversifying your off-site links (backlinks) anchor texts in order to target multiple phrases. So if you secure a guest post, don’t link back to your article always using the same anchor text as in the navigation (although you should use it from time to time as well).
If you want to check your internal links’ anchor text distribution, the CognitiveSEO Site Audit makes it really easy. Just go to the Site Architecture section > Linking Structure and then go to Anchor Text Distribution. Make sure to view the Internal Links.
Google treats links differently depending on where they are located on the website. From what we know, Google values contextual links in the body of the page more.
It’s also important that the link is positioned higher in the content (but not necessarily in the Header section).
What also matters when you interlink between your pages is click depth. If a page is only found 27 levels deep in your website, chances are that Google will consider it less important.
The Site Audit makes it really easy for you to see the click depth of your pages in order to spot non-prioritized important pages.
In our case, those are mostly blog pagination pages (around page number 8), which are found 9 clicks away from the homepage. This is normal, and those pages aren’t actually important. However, if we found an article there, it would mean we should probably interlink it more so that Google can pick it up faster from more recent posts.
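Click depth is just the shortest number of clicks from the homepage, so if you have a crawl of your internal links you can compute it yourself with a breadth-first search (the link graph below is a made-up example):

```python
from collections import deque

# Compute click depth (shortest number of clicks from the homepage)
# with a breadth-first search over a crawled link graph.
# The pages and links below are a made-up example.

links = {
    "/": ["/blog", "/services"],
    "/blog": ["/blog/post-1", "/blog/page-2"],
    "/blog/page-2": ["/blog/post-2"],
    "/services": [],
    "/blog/post-1": [],
    "/blog/post-2": [],
}

def click_depths(graph, home="/"):
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in graph.get(page, []):
            if target not in depths:          # first visit = shortest path
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

print(click_depths(links))  # "/blog/post-2" is 3 clicks deep
```

Any important page that comes out with a high depth here is a candidate for more internal links from pages closer to the homepage.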
A thing that can also help you build a good interlinking strategy is your site URL path structure. We know that shorter URLs tend to rank better in Google.
However, when stuffing all the URLs immediately after the root domain, it’s harder to see the bigger picture when you’re trying to segment sections of your website.
Having a root only URL structure might work well for a blog, but having a hierarchy in your URL path might be more helpful for an eCommerce site.
A very important thing regarding your internal linking structure is taking care of your broken links & orphan pages.
Broken links are links pointing to 404 pages. They can be easily fixed by replacing them or by using 301 redirects. The CognitiveSEO Site Audit makes it easy to identify your broken links and resources:
Google doesn’t like broken links & pages because it sends users to an unsatisfying location.
Orphan pages are pages that aren’t linked to from anywhere in the site. The CognitiveSEO Tools can also help you find some orphan pages:
However, the truth is that it’s impossible to identify all orphan pages on a site because… there are no links to them. Sometimes there are backlinks from other sites pointing to them (but no internal links), or they might be in the sitemap but not in the site structure.
There are multiple types of internal links that you can use when improving your interlinking structure.
Contextual links are the most important ones. They are hyperlinks created with an <a> tag that wraps around a relevant anchor text.
So in HTML it would look like this: <a href="https://cognitiveseo.com">SEO Tools</a> and in the article itself it would look like this: SEO Tools.
When using contextual links to interlink between your articles, make sure you include keywords in your anchors to tell Google what the link is about. However, don’t make that your main focus. The purpose of a link is to be clicked on, so try to get the user to click your link.
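If you want a quick look at which anchor texts you’re using internally, a small sketch with Python’s standard html.parser can list a page’s internal links and their anchors (the sample HTML and domain here are hypothetical):

```python
from html.parser import HTMLParser

# Minimal sketch: list internal links and their anchor texts on a page.
# The sample HTML and the example.com domain are hypothetical.

class AnchorCollector(HTMLParser):
    def __init__(self, domain):
        super().__init__()
        self.domain = domain
        self.links = []          # (href, anchor text) pairs
        self._current = None

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href", "")
            # Treat relative links and same-domain links as internal
            if href.startswith("/") or self.domain in href:
                self._current = [href, ""]

    def handle_data(self, data):
        if self._current:
            self._current[1] += data

    def handle_endtag(self, tag):
        if tag == "a" and self._current:
            self.links.append((self._current[0], self._current[1].strip()))
            self._current = None

html = ('<p>Try our <a href="/tools/">SEO tools</a> or read the '
        '<a href="https://example.com/blog/">blog</a>.</p>')
parser = AnchorCollector("example.com")
parser.feed(html)
print(parser.links)  # [('/tools/', 'SEO tools'), ('https://example.com/blog/', 'blog')]
```

Run over a crawl of your site, a tally of these pairs gives you the same anchor text distribution the audit tools report.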
Image links are pretty simple to understand: you click an image and it takes you to a link. Here’s an example. Click it and it will take you somewhere nice.
The general consensus is that contextual links have greater value than image links. I agree. I rarely click on images to go to another article or read about something. I actually expect the image to enlarge if I’m clicking it so that I may view it better.
However, despite being less valuable, image links hide an important technique which you can take advantage of!
You see, the problem with contextual links is that you can’t really use the anchor texts exactly as you want. Sometimes, the queries people use don’t have verbs or don’t really make sense. Or the keyword you want to target might simply not fit in your sentence.
Well, in case you can’t fit your desired anchor text anywhere in your content, you can definitely use the keyword in the image alt tag, which will be treated as an anchor text. This is also a good way of adding hard-to-write keywords into your content, even without links.
I’m not necessarily recommending image links and definitely not recommending exploiting alt tags. Try to keep things useful and relevant.
However, keep in mind that blind users might get a bad experience, because screen readers often use image alt tags to describe an image. You can save your soul by at least describing the image in the image title tag, which screen readers might also pick up.
Navigational links mainly refer to the structure of the site, since they are kept within lists (<ul> & <li> tags). Make sure you structure your site properly.
Regarding footer links, the main rule would be not to spam too much. People have a bad habit of doing that.
Also, footer links don’t always have to be the same on every page! Kayak.com uses footer links to its advantage in the car rental section. You can see some cities in the following screenshot. However, those will change depending on the page the user is viewing, to show only the closest or most relevant cities.
The same goes for sidebars. Use them to your advantage, but don’t spam all your categories in there; include only the most important ones, or those most relevant to the page the user is currently viewing.
If you like to live your life on the edge… you might think this is a way of bypassing the “first link priority”. Well, I haven’t tested this so I can’t say for sure, but what I do know is that links placed after a nofollow link to the same page will be ignored as well. Also, it’s sneaky and can get you into trouble.
Developing a long term interlinking strategy is important because as your site grows, you have to make sure Google is able to find the pages easily.
The best internal linking strategy is to do internal linking.
Note that the following statement applies generally, not only for blogs and informational sites. It’s a foundation for any other strategy.
However, there are more specific cases in which different strategies work better. These are really entire topics for other articles, but I’ll touch on them briefly.
After following the boilerplate content tips mentioned above, the rest is pretty simple:
When you write new articles, always link to old ones. After you finish writing new ones, edit old ones and link to the new one.
Interlink between articles only when relevant and remember to use the proper anchor text.
The secret here is to make a habit of doing this. Without a habit, you’re always going to be frustrated. The truth is I don’t always edit old articles to interlink to the new ones I post, but I do remember to do it when I update old articles, as it’s a habit.
When your site is big and it has thousands of pages, things aren’t that easy. You can’t add all the pages in the Navigation.
A good strategy I always recommend for eCommerce websites (which few actually follow) is having a blog. This will not only enable the site to target more keywords as more content gets published, but it also opens the opportunity to link to subcategory and product pages that don’t fit in the navigation.
Take advantage of Breadcrumbs: Breadcrumbs are a great way of strengthening more important pages such as category pages. Why? Because each subcategory will link to its parent, but not to its child.
Considering the following pattern, you can see how the Category page is linked to 4 times, while the Product page only once. Naturally, I’ll assume that in most cases the Category page is the most important page, targeting a broader and more competitive keyword, which is the main anchor text used in the internal linking strategy.
Home > Category > Subcategory > Sub Subcategory > Product
Home > Category > Subcategory > Sub Subcategory
Home > Category > Subcategory
Home > Category
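You can verify the counting yourself: tallying how often each page appears across the four breadcrumb trails above shows how breadcrumbs naturally favor pages higher in the hierarchy (a quick sketch):

```python
from collections import Counter

# Count how many times each level is linked to across the breadcrumb
# trails shown above. Higher levels accumulate more internal links.

trails = [
    ["Home", "Category", "Subcategory", "Sub Subcategory", "Product"],
    ["Home", "Category", "Subcategory", "Sub Subcategory"],
    ["Home", "Category", "Subcategory"],
    ["Home", "Category"],
]

link_counts = Counter(page for trail in trails for page in trail)
print(link_counts)
# Category appears 4 times, Product only once: the Category page
# collects the most internal links from breadcrumbs.
```

This is exactly the asymmetry described above, and it compounds across every product in the category.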
Moreover, it’s very important to correctly implement canonical tags on your pages, because parameters & search filters often create links through which equity can leak if the canonical tag isn’t properly set.
Furthermore, consider using dynamic footer links depending on your categories instead of just using the same footer links on every page. Is the user on the Guitars page? Link to Effect Pedals in the footer. Is the user in the Drum Kits section? Link to Drum Sticks.
Having an internal linking strategy is crucial when you have a huge website, with hundreds of thousands of posts, products, categories and pages.
However, when your site is smaller, you shouldn’t stress too much over it.
If you have a small services website, internal linking shouldn’t really be an issue. Google will be able to crawl 25-50 pages pretty easily.
What you should focus on is developing a content strategy that will expand your website. Keep using the general rule of “just interlink”.
There’s one more point I want to touch on in this article. There are two main ways of structuring sites: you can either structure them hierarchically (silo) or use topic clusters, which are very useful for informational sites and blogs.
A silo structure site looks something like this:
This works really well for services websites & eCommerce sites. Then you have topic clusters which look something like this:
With topic clusters, you need to write what is known as a pillar piece of content, such as “SEO Guide” which is then surrounded by other less important topics such as “keyword research” or “link building”.
In the picture above, the arrows should be bidirectional. So the “SEO Guide” will mention a little bit about every topic, but link to a more detailed content about that topic which also links back to the main SEO Guide, strengthening it. The cluster can repeat, so “Keyword Research” might be surrounded by its own set of topics and so on.
You can always combine both silo structure (for your services section) with topic clusters (for your blog section).
And that wraps it up. The basics of internal linking, how it helps your site rank better and how to properly do it. How do you prefer to structure your site? Do you use internal linking to your SEO advantage? How? Let us know in the comments section, we’re curious to find out!
If you’re just getting started with SEO, I’m pretty confident you’re looking to get everything for free. While that’s not always possible, the truth is that there are a lot of great free SEO Tools out there.
To be honest, there are free tools that even the pros use every single day. And that’s happening for a good reason: some free SEO tools are great.
This list isn’t the most comprehensive one and it doesn’t include every free tool out there, but it captures a little bit of everything that is necessary in daily SEO tasks.
P.S. It’s impossible for me to cover them all, so if you know any free SEO tools that deserve to make it into this list, please share them in the comments section at the end.
No list of SEO Tools should miss Google Analytics. Sure, it’s probably not the most accurate but the truth is that no tool is 100% accurate. In fact, Google has an entire suite of tools you can use for free.
Let’s start with Google Tag Manager:
Are you tired of tracking codes? Using multiple tools is always a hassle to implement. If you want to remove one, you have to dig through plugins or templates, and maybe you’ve forgotten which plugin you used to add the tracking code or which template contains the scripts.
Well, with GTM you won’t have this issue anymore. You can keep them all in one place and enable or disable them at will. It does exactly what it says: it manages HTML tags which contain tracking codes or scripts.
You can add your Analytics code there. It supports a variety of Google Tools as well as custom scripts to add external tracking codes, such as Facebook Pixel.
I’m not going to walk you through the installation; there are hundreds of tutorials out there, so go watch one. Here’s a good one:
No website should miss out on Google Analytics. While not the most accurate tool, it’s definitely a ‘good value’ for the huge amount of information it offers and the filtering that it’s capable of doing.
If you don’t use it yet, you should. You can see a lot of things, such as how long users stay on your pages, whether they bounce back to the search results looking for something else, or where your traffic is coming from.
You can add it via the Google Tag Manager.
The Search Console is something that every SEO should use. Why? Because it tells you whether a page is indexed or not, which is indispensable for every website.
Previously known as Google Webmaster Tools, this free resource also helps you with:
If you don’t have the Google Search Console connected to your website yet, then do this as soon as possible. It’s pretty easy to add:
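For reference, one of the verification methods Google offers is an HTML meta tag placed in the head of your homepage; the content value below is just a placeholder, since Google generates a unique token for your property:

```html
<!-- Placed in the <head> of your homepage; the content token is a
     placeholder that Google generates for your property -->
<meta name="google-site-verification" content="YOUR-UNIQUE-TOKEN" />
```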
PageSpeed Insights is a great tool which will help you quickly identify the most vital issues your site is facing regarding speed.
However, don’t take the score there for granted. A website might have a low score but still load very fast, which is what actually matters.
In fact, the PageSpeed Insights tool even lowers the score for using external scripts such as Tag Manager, Analytics or Google Fonts. To avoid that, you would have to host them locally, which is both impractical and risky, and Google doesn’t recommend it.
That score isn’t in any way related to your SEO performance and getting it to 100/100 won’t guarantee you higher rankings. However, it will help you identify some issues that might be lurking in your website.
This is a single-page check, so make sure you don’t test just the homepage. You can’t realistically check every page out there, but you can at least check one category page, one product page and one blog post, just to cover each important page template.
If you want to check the pages in bulk, you can use the CognitiveSEO Site Audit Tool. It will check every page, but be prepared, it’s going to take a while!
The Google Mobile-Friendly tool is really useful because it helps you confirm if Google itself sees your website as mobile friendly. Why is it so important that your website is mobile friendly?
Well, for one, because Google has run a mobile-first index since 2018. More importantly, more than 50% of users now search the web on their mobile phones, and that number is only going to rise.
So, make sure your site is mobile friendly, otherwise you’ll be losing a lot of visitors and definitely some rankings.
Getting backlinks is hard. But you know what’s harder? Getting backlinks from relevant websites.
Now Google Alerts can’t do the outreach for you, but it can alert you when something new about a topic gets published on the web. So, if I write this article about free SEO tools and you use ‘free seo tools’ as a keyword in Google Alerts, you’ll get notified when my article gets indexed by Google.
You receive the notifications via e-mail. You can create multiple alerts. Make sure to create a filter, so that they don’t flood your inbox.
You can also use this to monitor your brand. This is very useful for building relationships and even links. If someone mentions you without giving a link, you can reach out and ask them for one.
However, we’ve found that Alerts doesn’t always get everything. So we’ve built our own tool: BrandMentions. You should check it out!
Google Trends is a great tool that will show you whether the interest in a particular topic is growing or falling.
This might be helpful when working on a new niche site. Maybe you think your idea is really cool, but if interest has suddenly declined, you should do some more research to see if it’s worth it in the long run.
You can also use it to see if interest in your brand is growing or not. Google wants to reward Branded sites, because they are more trustworthy. You can also compare your brand with other brands to see the difference.
Google autosuggest isn’t actually a tool, but a feature that everyone, including you, uses every day when searching on Google.
However, if you want to look for some new keyword ideas for which you can write awesome content, then it’s a good start. Just type a seed keyword there and Google will suggest what other people generally search for. Keep in mind that searches might be personalized, so it’s a good idea to do this in Incognito mode.
Keyword Shitter: If you want something that will quickly generate all those ideas you can try Keyword Shitter. Yeah, I know. It’s really called that.
Another cool thing you can do is scroll to the bottom of a search result to see some related keywords. You can use those keywords in your article to make it more relevant to the main keyword you searched for.
However, if you want a list of the most important keywords that you should include in your article to make it more relevant and help it rank better, you can try the CognitiveSEO Keyword Tool.
I was actually going to skip the Google Keyword Planner but decided to add it in the end. The reason I don’t really feel like including it is that it’s an AdWords-centered tool. It also doesn’t give as much data as it used to… unless you pay for ads, in which case it will give you more.
The truth is I have a better keyword tool for you, but it’s lower in the article, so keep reading.
Google Correlate extracts keywords with time-based or regional search patterns similar to the query you provide. It’s basically the brother of Google Trends, working in reverse: instead of showing the pattern for a keyword, it uses the pattern to point to keywords. You get a list of keywords that correlate with the search you’ve made.
Google Correlate uses the Pearson correlation coefficient to compare data, which is why each result shows the coefficient in front of the keyword. The keywords are listed in descending order, with the most strongly correlated ones in the first 10 positions; 1.0 represents a perfect correlation, and you won’t see any result below 0.6. Google Correlate is mostly used to study and predict human behavior and to understand user intent, but it can also be used very well in keyword research, to spot context and create contextual content.
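To make the metric concrete, here is a minimal sketch of the Pearson correlation coefficient that Google Correlate reports, computed in plain Python over two hypothetical weekly search-interest series (the numbers are made up for illustration):

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    std_x = math.sqrt(sum((x - mean_x) ** 2 for x in xs))
    std_y = math.sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (std_x * std_y)

# Two hypothetical weekly search-interest series that rise and fall together
seo_tools = [40, 55, 61, 70, 68, 82]
keyword_tools = [35, 50, 58, 66, 63, 77]
print(pearson(seo_tools, keyword_tools))  # close to 1.0: strongly correlated
```

A coefficient near 1.0 is what puts a keyword at the top of Correlate’s list; anything below 0.6 is dropped.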
In a world where security is becoming more and more important, having an SSL Certificate is indispensable. Your website should be secured, no matter what you’re doing with it.
Back in the day, SSL Certificates weren’t all that easy to get. Today however, with tools like CloudFlare, you can secure your website easily.
All you have to do is set up an account and have access to your domain registrar. You’ll have to add your host nameservers to CloudFlare, then point your Domain Registrar to CloudFlare’s nameservers. They will act as an intermediary, protecting your website from attacks and also adding SSL.
If you’re going to make the switch, make sure you check out our http to https migration guide. You don’t want to end up messing up all your rankings!
If you use Chrome as your default browser, you can use these extensions to aid you in your SEO journey.
WooRank will analyze your website from a technical point of view, highlighting the most important issues that you should fix.
It works both as a Chrome Extension and on their website.
Keywords Everywhere is an extension that you must have! This tool might be a little invasive, since it literally shows up everywhere, but you can easily turn it off from the extension’s shortcut (top right in Chrome).
I’m not sure how this happened, but in the screenshot above you can see an ad saying “This chrome ext is better than Keywords Everywhere”.
Well, I’m not sure if it’s better or more useful, because it does different things, but this SEO Minion extension is actually useful, so I’m going to feature it here as well.
SEOmofo Snippet Optimizer will help you create titles and meta descriptions. You can either write new ones directly there or check old ones to see if they’re OK.
There are a lot of alternatives here… but this is the one I use. It’s the first I found years ago and I still use it.
However, the tool appears to work by counting characters, while from what we know, Google measures pixels instead. You can use Serpsim as a pixel-based alternative.
GT Metrix is an alternative to PageSpeed Insights. It does pretty much the same thing, telling you which images aren’t optimized, whether you have caching problems, whether your CSS is minified, etc. I often have the impression that GT Metrix does a better job.
Focus on the following: fully loaded time, total page size and number of requests. Keep them all low. Another tip: if you register, you’ll wait less for a page to be analyzed.
Again, it only acts on one page at a time, so make sure you check multiple page templates to see issues on different sections of your website.
If you don’t fear Russians & Putin will steal your data and use it to rule the world, then I can’t stress enough how awesome Yandex Metrica is.
Although it’s dedicated to the Russian search giant Yandex, the cool features in Yandex Metrica are its heatmaps and user session recordings. It’s basically a free Hotjar.
Just make sure you are GDPR compliant when doing this: users must click accept before you record their session.
A redirect checker always comes in handy. You have to make sure that all versions of the site (http/https & www/non-www) point to a single version via 301 redirects.
It’s also useful to detect redirects in general, whether you want to see if it’s a 301 or a 302, or need to figure out through which other pages a URL redirects.
I usually just search for this on Google and end up choosing whichever app ranks first, but a popular redirect checker seems to be https://httpstatus.io/.
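If you want to be systematic about it, the four variants worth feeding into a redirect checker can be generated with a few lines of Python (a minimal sketch; example.com is a placeholder domain):

```python
def url_variants(domain):
    """The four http/https x www/non-www combinations that should all
    301-redirect to the single canonical version of a site."""
    return [
        f"{scheme}://{host}/"
        for scheme in ("http", "https")
        for host in (domain, "www." + domain)
    ]

for url in url_variants("example.com"):
    print(url)  # run each of these through your redirect checker of choice
```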
I’ve been using TextMechanic.co for a very long time (not .com, but the .co version). It has saved me a lot of time, because it’s very easy to manipulate text with it.
Here’s a list of some things that you can do with this tool:
If you’re not working with WordPress or a very popular CMS, it might be difficult to get all the Structured Data & Schema Markup right on your website.
Well, that’s why James Flynn created this Schema Markup Generator.
It’s a pretty simple process. You select what type of page you have, then input your data there and the tool will generate the markup for you. If you’re a programmer, you can use that to dynamically add things like titles and prices into your templates.
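As an illustration of what such a generator produces, here is a hypothetical Product markup in JSON-LD (all values are placeholders); it would be embedded in the page inside a script tag of type application/ld+json:

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Blue Running Sneakers",
  "image": "https://example.com/img/blue-sneakers.jpg",
  "description": "Lightweight running sneakers with breathable mesh.",
  "offers": {
    "@type": "Offer",
    "price": "49.99",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  }
}
```

Programmers can fill fields like name and price dynamically from their templates, as mentioned above.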
If you have a multilingual website, then you must definitely set up your hreflang tags correctly. If you don’t do this, your site will be an international mess!
Luckily, we have Aleyda Solis, a renowned international SEO consultant. She created a Hreflang Tags Generator Tool which you can use to generate these tags and set them up correctly.
You might also be interested in reading our article about common hreflang mistakes.
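For context, the generated tags end up in the head of each language version of a page, with every version listing all its alternates plus an x-default fallback (the URLs below are placeholders):

```html
<!-- On every language version of the page, list all alternates, including itself -->
<link rel="alternate" hreflang="en" href="https://example.com/en/" />
<link rel="alternate" hreflang="es" href="https://example.com/es/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/" />
```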
Answer The Public is a great keyword tool which will give you ideas and sort them both alphabetically and topically. It’s a great tool if you’re looking for a broader view on the keywords.
The best part about it is probably the visualization chart which gathers the keyword phrases around prepositions and connecting words.
Xenu is a website crawler. It will crawl your site and list all your URLs and resources. Its main purpose is to identify 404 pages. You can also use it to find which pages are pointing to those 404 pages, by right clicking a result and hitting URL Properties.
In a previous version of Screaming Frog you could upload a list of the URLs crawled by Xenu and you could analyze your entire site for free this way. Unfortunately, the most recent builds of Screaming Frog limit you to 500 URLs even if you upload your own list.
A good alternative is Site Audit by CognitiveSEO. It will tell you exactly which links are OK, which are 404s and which are 301s, and you have a lot of filters at your disposal, as well as reports.
Ubersuggest is my Google Keyword Planner alternative. It’s free and pretty generous for a free tool. We can all thank Neil Patel for that.
It even shows you the top 10 results ranking for that keyword, so that you may take a look at them to see what you’re competing against. Use it wisely to discover new keyword opportunities!
Since WordPress is the most popular CMS platform out there, I couldn’t have finished this list without listing 3 of the most important plugins you should have on your site.
The Yoast SEO plugin is a must have on any WordPress site. It does a basic thing, which is allowing you to add a Title and Meta description to your pages.
This is important, because you don’t always want your H1 and Blog Title to be the same as the <title> tag. Using Yoast SEO Plugin, you can actually target more keywords. You can also set a different title for Facebook / Twitter, where keywords aren’t that important, but the catchiness is.
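A quick sketch of what that looks like in practice, with hypothetical wording: the title tag can target the search phrase while the on-page H1 stays catchy for readers:

```html
<head>
  <!-- The <title> targets the keyword you want to rank for -->
  <title>Free SEO Tools: The Best Tools the Pros Use</title>
</head>
<body>
  <!-- The H1 / blog title can be different and catchier for readers -->
  <h1>Every Free SEO Tool I Actually Use Daily</h1>
</body>
```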
If you haven’t noticed, it also has a snippet preview, so you won’t have to check your snippets with other tools every time.
The plugin comes with a bunch of recommendations and best practices, but you don’t need to follow them to the letter. Sometimes even a good title can get a red light.
There are a lot of caching plugins out there. Unfortunately, W3 isn’t the easiest to use, but from my perspective, it seems to be the most powerful. It handles caching very well and can also minify your static resources.
Smush is a great tool for compressing images, which are one of the leading causes (if not the leading cause) of slow websites. The only disadvantage of the free version is that you can only optimize 50 images at a time. It might be a little inconvenient, but we could say that “one click per day keeps the SEO away”.
There are a lot of Free SEO Tools out there, way too many to list them all here. Many of them are probably really good, probably better than what I’ve listed before.
Which tools do you use? If you know any other tools that belong here, please share them below in the comments section.
We’re back with a new guest for our cognitiveSEO Talks series, Gerry White, an SEO expert with broad knowledge in analytics, digital and search marketing strategy, social marketing and a lot more. With almost 20 years’ experience in digital, Gerry is a technical marketing consultant who has worked with companies such as BBC, McDonald’s, Gordon Ramsay, Premier Inn and DirectGov. Gerry has been creating, promoting and analyzing websites, and more recently apps, for companies across the world, and he is also a speaker at international conferences. At the moment he is working at Just Eat.
Working with so many companies and big-name brands, Gerry has accumulated vast experience. As he mentioned in the cognitiveSEO talk, he’s always testing and running case studies on the go. For him and his team, it is very important to offer the user a great experience and to look at all the different ways people navigate a site. He gives a rule of thumb:
|Everything has to be super simple with site architecture. The bigger the sites, the more complicated things are. You need to make sure the SEO basics are right.|
|SEO Consultant at JUST EAT and Co-organiser of ConferenceTakeItOffline.co.uk @Dergal / TakeItOffline.co.uk|
In his spare time, Gerry helps organize the TakeItOffline digital round tables with businesses, consultants and agencies. You should listen to this interview with Gerry White, as you’ll get lots of insights and tips for creating better websites and making them loved by both users and Google. We wouldn’t want to spoil the talk for you, so go ahead and discover it yourself.
|Improving the title tag will have a significant impact on SERP.|
|SEO Consultant at JUST EAT and Co-organiser of ConferenceTakeItOffline.co.uk @Dergal / TakeItOffline.co.uk|
Top 10 Marketing Nuggets:
Call it what you want: plug-in, extension or add-on. Its purpose remains the same: to make whatever software you are using more feature-rich. There are a lot of great plug-ins, but the ones we’re going to focus on are those that will enhance your SEO efforts.
We’re not going to get terribly technical so enjoy checking the plugins below and use the ones that fit you best.
Over the years, we’ve searched for, tested and tried lots of plugins. The following SEO plugins will surely help you achieve better SEO results, but you need to choose the right ones for you.
If you know any other SEO plugins that should be listed here, please let us know in the comments section below.
Yoast SEO is probably one of the most used and most popular WordPress SEO plugins, installed on over five million websites. One of its best features is the XML sitemap management, which allows you to easily create your sitemaps. You don’t have to code anything, or fix it when something breaks, so you avoid the usual headaches.
For content lovers, there’s the content optimization snippet preview which allows you to add your keyword, meta description and meta title to preview them as they appear on search. Besides that, you get tips and indications whether your content needs more on-site optimization, or de-optimization in case of keyword stuffing.
Moreover, Yoast SEO helps you identify and avoid duplicate content so you don’t get penalized by Google.
SEO Framework is another great plugin, better suited to small businesses than to big companies. The interface looks like it’s integrated into WordPress, so it delivers fast SEO solutions and is time-efficient, leaving little room for error. Not to mention that interacting with it feels very natural.
The fact that it has AI built in makes it very interesting: it automatically optimizes your pages, which gives you lots of possibilities to create a better website. It comes preconfigured but also lets you change any settings you want. With SEO Framework you can improve both your search results and your social presence.
In the WordPress plugin gallery, there are options for all sorts of issues and problems your website might have, and Broken Link Checker is another example. The plugin, as the name points out, checks for broken links.
After you have installed it, the plugin will parse your whole website and show you how many broken links you have, similar to the screenshot above. You can find the list of broken links in a new tab of the WP admin panel – Tools -> Broken Links. Whenever you find broken links, there are some actions you can take: Edit link, Unlink, Not broken, Dismiss.
All In One Schema Rich Snippets can be used to improve the appearance in search engine results with rich snippets. The plugin can be used at its best for schema implementations, such as Recipes, Events, People, Products, Articles and so on.
Using the plugin will give more accurate information to the search engines about your website, help your results stand out in SERP and give you a competitive advantage.
Rank Math is a free WordPress plugin that has lots of cool features for every business. It is developed by MyThemeShop, one of the most famous WordPress theme providers. This WordPress SEO plugin helps you optimize your content and outrank your competitors. One of the coolest things is that it supports schema-based themes and also AMP pages.
With Rank Math you can check lots of errors and get a lot of information for your website:
As the name says, XML Sitemap was specially designed to create XML sitemaps that help search engines better index your site. It works with most major search engines, like Google, Bing, Yahoo and Ask.com. The great thing about it is that it notifies all the search engines every time you post new content or update existing content.
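For reference, the generated file follows the standard sitemap protocol; a minimal example with a single placeholder URL and date looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <!-- lastmod, changefreq and priority are optional hints -->
    <lastmod>2019-05-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>
```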
All in One SEO Pack is an easy WordPress plugin for beginners and for small businesses that want to improve their website and increase their rankings, but it also has advanced features and an API for developers.
This WordPress plugin will help you with the following:
|It is a simple, fast and very powerful SEO plugin for WordPress users. You can find loads of features that you can easily enable or disable as required. The premium version of the plugin adds even more.|
|Editor & Content team @copyproblogger|
SEOPress has lots of features and Jenelia shares some of the most important ones:
The SEO Suite Ultimate extension is a comprehensive solution for Magento websites that want to improve their rankings and traffic. Plus, it offers support for features that are not part of the default Magento setup. The extension is installed for you by the plugin provider’s team of experts.
Enriched with lots of features that offer a valuable user experience, the SEO extension was developed with lots of advantages in mind:
CreareSEO extension was created for Magento to help users solve their SEO problems by using a set of smart tools designed for them.
The Magento SEO plugin helps with:
Google integrations: you’ll get a list of Google services that are not integrated into your Magento 2 platform.
SEO Suite is a free extension designed by Emipro Technologies. The plugin spots any aspects that need to be fixed, such as duplicate content issues, and it lets you manage dynamic templates, improve indexation and even work on your internal linking.
It covers the most important needs and has lots of benefits, such as:
Magento Canonical URLs is a free extension that adds a canonical URL to the head of your web pages and also lets you manually set custom canonical links for products. The extension helps the search engines select the most relevant pages for specific searches.
Plus, it helps you detect and fix duplicate content issues and it provides correct URL from databases for your CMS pages.
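To illustrate, the tag such an extension adds is a single line in the page head; a filtered product URL and its clean version would both declare the same canonical (a hypothetical example):

```html
<!-- On https://example.com/shoes?color=blue&sort=price, point to the clean URL -->
<link rel="canonical" href="https://example.com/shoes" />
```

Because both variants name the same canonical, search engines treat them as one page instead of duplicate content.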
Advanced SEO by Activo is an SEO plugin for Magento 2 that adds the Organization schema.org rich snippet to the homepage of your store. We all know how critical it is to have an SEO advantage in your store so the plugin brings a lot of features to improve your store ranking and visibility.
There’s nothing too complicated, the extension guides you and it’s a great fit for your online store.
Advanced Sitemap is an SEO plugin designed for Magento 1; there’s also an option for Magento 2, named Ultimate SEO Optimizer. As the name says, Advanced Sitemap is an extension that generates the sitemap automatically. It is very simple: you install it and then generate as many sitemaps as you’d like.
It comes with lots of customizations, which allow you to set the priority and frequency for each field (even for the homepage), set the title for the sitemap, add more links to it, plus many others.
SEO-Generator is an SEO plugin for Joomla that generates keywords and descriptions for your webpages by pulling text from the title and/or the content. The plugin has an effective system to include the keyword and description in the articles. It requires you to review them and add only those that you want.
In the admin mode, on the right side of the article, you’ll see a section placed under “Metadata Information”, named “Plugin Parameter” where you get keyword ideas. In the screenshot below there’s an example.
SEO-Generator has lots of advantages for websites created on Joomla. It has support for lots of languages, making it very accessible for international businesses. The process of curating keywords has three steps:
Another cool advantage of the plugin is that it also works retroactively. For example, you can generate keywords for older articles.
Another great SEO plugin for Joomla is Easy Frontend SEO. Lots of people recommend it due to its ease of use. The plugin helps you add and edit any meta information you’d like on every page. Be it a title, description, keywords, generator or robots, you can change them from the frontend, so you don’t need advanced HTML knowledge. The plugin is available on every page of any Joomla website, so it’s very easy to change any piece of information you want.
The plugin does not change, alter or delete existing Joomla! data. One big advantage is the direct influence it gives you over the generated metadata, which leaves the user a lot of freedom.
JoomSEF can be used to make the URLs search engine friendly. Plus, you get a lot of interesting and helpful features, such as customized error pages, internal linking management, duplicate URL customization, and search engine friendly URL management as you can see in the JoomSEF panel.
The Joomla SEO extension offers support for URL translations at multilingual sites and tries to provide more comfort to the user. We all know how important search engine friendly URLs are and how needed they are for achieving better ranking with search engines.
Here are some of the main features of JoomSEF:
Tag Meta is another great SEO plugin for Joomla that allows you to efficiently manage your website’s meta information. It offers great advantages for improving website rankings through SEO.
Tag Meta makes it very easy to set the title tag, meta tags or the “canonical” link on any page by simply specifying the URL or just a part of it. You can apply multiple rules together and they will work just fine; if you set more than one rule for a URL, they apply in cascade in the order you set. The process stops only when a rule is declared as “the last one”.
Another great advantage is the fact that you can generate meta information dynamically for each page. For example, you can keep the title tag on each rule from the global settings.
sh404SEF is the longest developed and most popular SEO extension in Joomla! Extensions Directory (JED). It supports all the Joomla versions and it comes with lots of SEO features.
Here’s what you can do with sh404SEF:
Wix is a closed system, and its SEO plugin, named Wix SEO Wiz, can be used only on Wix websites. It is a simple plugin for optimizing your website to rank in the SERPs. While there are other platforms you could use for creating websites, Wix is a great fit for small businesses, blogs and personal websites.
The SEO plugin has three main features:
To optimize a page, select Page SEO, then add the title and description and preview your page as it will appear in the SERPs. Then click save and publish when the page is finished.
The plugin was created two years ago to help Wix websites get found online by users. It starts by creating a personalized SEO plan: you add your website, keywords and type of business, then follow the instructions to rank higher in search engines.
Plug in SEO will help you check your shop for SEO problems. This diagnosis tool checks all the essential areas of search engine optimization, such as titles, headings, meta descriptions, blog post structure, speed and much more.
You have to install the plugin and afterwards your store will be checked. Once the analysis is done, the app will display the details to let you know what improvements are necessary.
Plug in SEO constantly checks your shop for issues and emails you notifications when it finds something. The app has full support, plus directions and guidance for code snippets. You also get efficient title and description editing by following a template.
Plug in SEO is a very good way to improve your store’s SEO and get exceptional control over your customized SEO actions for Shopify.
ReloadSEO is an SEO extension that works great for e-commerce, and it can also be used with Magento, Joomla!, WordPress, Drupal and more. It promises to give you all the SEO tools you need to grow your search traffic and revenue.
ReloadSEO is made simple so that each user can improve the results for their website no matter how much SEO they know. It covers backlinks, content, keyword research, store monitoring and more.
Built as a native solution for major platforms, ReloadSEO carries a lot of benefits for e-commerce websites and focuses on lots of needs:
The Weebly SEO plugin is a great fit for websites designed on Weebly. Similar to WordPress, Weebly is a web hosting service, but one mostly oriented toward online shopping. Weebly websites are SEO-friendly and can rank in the SERPs if you follow SEO principles; there’s nothing different from other websites in terms of SEO.
Every website created with Weebly has SEO settings you can edit to rank higher on Google and other search engines. Both on-page and off-site SEO settings are available for you to edit.
Compared to other plugins, Weebly has some other advantages:
The Weebly CMS was named a Top Rated Content Management System for 2019 by TrustRadius.
Squarespace offers a simple, low-maintenance solution for those who want less customization and something easy to use: a way to publish content quickly and efficiently.
Some users criticize Squarespace for not having deeper SEO integration in the platform. I would say it is a better fit for beginners than for SEO pros. When people want to use Squarespace to create their website, SEO is by far the most frequently asked-about topic.
When you sign up for the Squarespace platform, your site is already generally optimized for SEO; even the founder of Squarespace mentions that:
|Squarespace is engineered to work properly without a sea of plugins, and you should not take the lack of plugins for this to mean that we didn’t actually just build it right from the start.|
|Anthony Casalena, founder of Squarespace|
Plus, you have other benefits included, such as an automatic XML sitemap, a free SSL certificate, user-friendly themes, and clean HTML and URLs.
MozBar is a free Chrome extension that delivers metrics on the go. If you install MozBar from the Chrome Web Store, you’ll see the ‘M’ icon in the toolbar at the top right-hand side of your browser. Whenever you search for something on the web, you’ll get information about every webpage that appears in the search results.
With just a simple click, you can see the page authority (PA), domain authority (DA), the number of links and referring domains. Plus you can see the spam score and get lots of insights on the go.
Whenever you enter a new website, you can see on-site elements, link metrics, markup data or highlight internal and external links, followed and nofollowed links and so on.
SEOquake is a Chrome plugin developed by SEMrush. It displays lots of information directly on your search results page: the number of links, referring domains, rankings data and display ads information. Plus, you get more insights through integrations with Alexa, Whois, Bing and the Wayback Machine.
You can edit settings and see results based on the location you set it from the Chrome extension, or order the results by a specific parameter from the left-side menu.
Moreover, for each individual page, you get a set of new data regarding links, content, specific metrics, internal and external links, ads, traffic and many more.
Keywords Everywhere is one of the best SEO plugins for Chrome. For every search you make, you get free search volume, CPC and competition data, plus lists of related keywords and other suggestions. The extension is easy to use and saves a lot of time, since all the information is in sight.
The Chrome extension supports lots of other websites where you can see the metrics under the search toolbox, such as YouTube, Google Search Console, Google Trends, Google Analytics, Soovle, Etsy, Bing, eBay and more.
WooRank is a score-based Chrome extension designed by the company behind the SEO tool of the same name. Once enabled, WooRank calculates a score based on the analysis it performs, checking content, links, technical information, mobile, local and usability. It provides various SEO tips to help your website rank number 1 in the SERPs.
For a more comprehensive analysis, you can click on any piece of information the extension gives you and check the pop-up with the full analysis.
SimilarWeb gives you traffic and key metrics for any website, such as search data, social, display ads, on-site content, audience and competitors. By collecting all this information you can gain insights into any website’s statistics and strategy as you’re browsing.
Similar to the other Chrome extensions, with a simple click you can get a more comprehensive analysis on the site you want. Find all the information you need in one place.
SEO Peek is a great Chrome extension for content marketers. It quickly checks the on-page SEO factors of any website. It simply checks the DOM of a page, so there’s no need to bother with the HTML source. Enable it from the top-right side of the page and see the page title, meta description, meta keywords, headings, HTTP Status, meta tags, canonicalization annotations, mobile annotation and more.
Intentional or unintentional, be it plagiarism or bad technical implementation, duplicate content is an issue that is affecting millions of websites around the web. If you’ve wondered what content duplication is exactly and how it affects SEO and Google rankings, then you’re in the right place.
Whether you think your site is affected by this issue or just want to learn about it, in this article you will find everything you need to know about duplicate content. From what it is to how you can fix it in specific cases, here you have it all, so keep reading.
Duplicate content is content that has already been written by someone, somewhere else. So, if you take a piece of content off one website with the infamous copy/paste and then publish it on your own, you have duplicate content.
Duplicate content has many sides and can be caused by many things, from technical difficulties or unintentional mistakes to deliberate action. Before we get into more technical aspects, we must first understand what content duplication actually is.
On the web, duplicate content is when the same (or very similar) content is found on two different URLs.
Another key thing here to remember is that the content is already indexed on Google. If Google doesn’t have the original version of the copied content in its index, then it can’t really consider it duplicate content, even though it is!
Around 5 years ago, I was actually contemplating scanning old news magazine pages, using software to turn the images into text, and then using it for PBNs or whatever worked at that time. While that might be illegal from a copyright point of view, it would probably pass Google’s duplication filters even today.
I would actually recommend that publications moving from print to digital repurpose the old content from their magazines on their websites.
We all know Google likes to see quality content on your site, and not thin content. If you have it but it’s not on Google yet, it still is new and original, so why not take this opportunity? Sure, some news might be irrelevant today, but I’m sure magazines also featured evergreen content such as “How to lose weight fast”.
An article could even be reworked into something like ‘How People Used to Do Fitness in the ’80s’. You can keep the content identical this way (although a short original introduction might be required).
However, things are a little bit more complex than that. There’s a big discussion on what exactly makes for duplicate content in Google’s eyes. Is a quote duplicate content?
Will my site be affected if I publish someone else’s content but cite the source?
Also, there isn’t one single fix for duplicate content issues. Why? Because there are many scenarios, each with multiple solutions, and one might work better than another. There are many things to discuss and, hopefully, by the end of this article you’ll have all your questions answered.
However, we must first get some other things clear to better understand the nature of duplicate content. Then we will analyze different scenarios and give solutions for each and every one of them.
There’s a lot of content out there in the world. Compared to that, Google knows only about a small part of it. To be able to truly say if the content on your site has been copied, Google would have to know every piece of paper that has ever been written, which is impossible.
When you publish something on your website, it takes a while for Google to crawl and index it. If your site is popular and you publish content often, Google will crawl it more often. This means it can index the content sooner.
If you publish rarely, Google will probably not crawl your site so often and it might not index the content very quickly. Once a piece of content is indexed, Google can then relate other content to it to see if it’s duplicate or not.
The date of the index is a good reference source for which content was the original version.
So what happens when Google identifies a piece of content as duplicate? Well, it has 2 choices:
Will you be penalized for duplicate content? No.
Is duplicate content hurting your site? Well, that’s another story.
Because Google doesn’t like duplicate content very much, people have assumed that it’s a bad practice which gets punished by Google. With a Penalty!
Despite popular belief, and although duplicate content does cause issues, there’s no such thing as a duplicate content penalty!
At least not in the same way we have other penalties, be they manual or algorithmic. Or at least that’s what Gary Illyes said in a tweet.
DYK Google doesn’t have a duplicate content penalty, but having many URLs serving the same content burns crawl budget and may dilute signals pic.twitter.com/3sW4PU8hTi
— Gary “鯨理” Illyes (@methode) February 13, 2017
This contradicts Google’s official page on duplicate content in the webmaster guidelines, which states that:
“In the rare cases in which Google perceives that duplicate content may be shown with intent to manipulate our rankings and deceive our users, we’ll also make appropriate adjustments in the indexing and ranking of the sites involved. As a result, the ranking of the site may suffer, or the site might be removed entirely from the Google index, in which case it will no longer appear in search results.” – Google
So while there’s no duplicate content penalty, if you ‘try to manipulate search results’ you might end up losing rankings or even getting deindexed. Here’s Google at its best again, contradicting itself, at least a little bit.
However, I tend to take Gary’s word for it. Duplicate content isn’t something you should avoid just because Google might hit you in the head; Google won’t hit you just because you have duplicate content.
It’s a different story with those who use content scrapers, deliberately steal content and try to get it ranked or use mass content syndication only for links. It’s not only about content duplication but actually about stealing content and filling the internet up with garbage.
The fact that there’s just so much ‘innocent’ duplicate content out there makes it even harder for Google to detect the evil-doers with a 100% success rate.
But even though Google won’t penalize you, it doesn’t mean that duplicate content can’t affect your website in a negative way.
Talking about duplicate content penalties, here’s what is written in the Google Search Quality Evaluator Guidelines from March 2017:
The Lowest rating is appropriate if all or almost all of the MC (main content) on the page is copied with little or no time, effort, expertise, manual curation, or added value for users. Such pages should be rated Lowest, even if the page assigns credit for the content to another source.
Also, you can check out the video below, where Andrey Lipattsev, senior Google search quality strategist, reiterated that a content duplication penalty doesn’t exist.
Here’s even more about the duplicate content penalty.
Well, the answer to that is very simple:
When you search something on Google, would you like to see the exact same thing 10 times? Of course not! You want different products, so that you may choose. You want different opinions, so that you can form your own.
Google wants to avoid SPAM and useless overload of its index and servers. It wants to serve its users the best content.
As a general rule of thumb, Google tries to display only 1 version of the same content.
However, sometimes Google fails to do this, and multiple very similar versions of the same page, often even from the exact same website, get shown.
For example, in Romania, the biggest eCommerce website, eMAG, generates pages dynamically from nearly all the searches that happen on their site. In the following image, you can see 3 top listings for keyword, keyword plural and keyword + preposition. All of these were searched internally on eMAG’s website so they automatically generated these pages and sent them to the index.
You can see the titles & descriptions are very similar and the content on those pages is identical.
Now this is a very smart move from this eCommerce site. Normally, Google shouldn’t allow this to happen. Multiple complaints are emerging in the Romanian eComm community regarding this issue but it seems to keep going (some requests even reached John Mueller).
Although I highly doubt it, it is possible that those results are actually the most relevant. But this doesn’t happen for every keyword out there. Some keyword searches are unique and, most of the time, Google only displays one page from eMAG’s website on the first page.
In my opinion, although this site could canonicalize these versions to a single page, it’s not their fault that they get 3 top listings. It’s Google’s job to rank the pages, not theirs.
This is a classic example of duplicate content issue. From a user’s perspective, this might not be a very good thing. Maybe the user wants to see other websites. Maybe they’ve had a bad experience with the site in the past. Maybe they just want to see if it’s cheaper somewhere else.
Google is still trying to figure out ways to detect when this is an issue and when not. It’s not quite there, but it’s getting better and better.
I’ve encountered some queries where the first 3 pages were all occupied by eMAG results. I can tell you, it’s a scary sight! I highly doubt that they were the only ones selling that type of product, because eMAG is actually a retailer. They sell other people’s products and most of them have their own websites.
According to Matt Cutts, about 25-30% of the entire internet is made up of duplicate content. That figure might have changed in recent years, since the video is pretty old. Considering the expansion of the internet and the growing number of new websites (especially eCommerce ones, where content duplication is thriving), it has likely increased.
So what we get from the video above is that not all duplicate content is bad. Sometimes people quote other people for a reason. They bring quality to their content by doing that and it isn’t something bad.
In essence, think about it like this:
Duplicate content is when content is identical or very similar to the original source.
Now of course, ‘very similar’ is open to interpretation. But that’s not the point. If you’re fussing over exact similarity percentages, you’re obviously up to something bad. If you’ve contemplated deliberately copying or stealing some content to claim it as your own, then it’s duplicate content. You can also get into legal copyright trouble.
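To build an intuition for ‘identical or very similar’, here’s a rough sketch using Python’s standard difflib. This is purely illustrative; Google’s actual duplicate detection is far more sophisticated and undisclosed.

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Rough 0..1 similarity ratio between two texts."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

original = "Duplicate content is content that already exists somewhere else on the web."
copied = "Duplicate content is content that already exists somewhere else on the web!"
rewritten = "Writing an original piece means covering the topic from your own angle."

print(similarity(original, copied))     # close to 1.0
print(similarity(original, rewritten))  # much lower
```

A near-verbatim copy scores close to 1.0, while a genuine rewrite scores far lower, which is the basic idea behind any near-duplicate detector.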
A popular type of duplicate content that is mostly harmless is eCommerce product descriptions.
Either because they’re lazy or because they have so many products to list, eCommerce site owners and editors simply copy-paste product descriptions. This creates a ton of duplicate content, but users might still like to see it on the web because of different prices or service quality.
What ultimately sells a product though is its copy. So don’t just list a bunch of technical specifications. Write a story that sells.
Many eCommerce website owners complain that other websites are stealing their description content. As long as they don’t outrank you, I’d see it as a good thing. If they do outrank you, you can sue them for copyright infringement; however, make sure you have an actual basis for the lawsuit. Some technical product specifications aren’t enough.
Another one is boilerplate content. Boilerplate content is content that repeats itself over and over again on multiple pages, such as the header, navigation, footer and sidebar content.
As long as you’re not trying to steal someone else’s content without their permission and claim it as your own, you’re mostly fine with using quotes or rewriting some phrases. However, if your page has 70-80% similarity and you only replace some verbs and subjects with synonyms… that’s not actually quoting.
Did You Know
Google Search Console no longer allows you to see your duplicate content issues. Some time ago, this was possible, but Google ‘let go’ of this old feature.
So how can you know if you have duplicate content issues?
You can use the cognitiveSEO Site Audit Tool for that. The tool has a special section for that, where it automatically identifies any duplicate content issues. Therefore, you can quickly take a look at your duplicate pages, duplicate titles, descriptions, etc.
More than that, the tool has a section that identifies near duplicate pages and tells you the level of similarity between them.
As Gary Illyes pointed out above, some of the actual issues caused by duplicate content are that it burns up crawl budget (especially on big sites) and dilutes link equity, because people will be linking to different pages that hold the same content.
Google has to spend a lot of resources to crawl your website. This includes servers, personnel, internet and electricity bills and many other costs. Although Google’s resources seem unlimited (and probably are), the crawler does stop at some point if a website is very, very big.
If Google crawls your pages and keeps finding the same thing over and over again, it will ‘get bored’ and stop crawling your site.
This might leave important pages uncrawled, so new content or changes might be ignored. Make sure all of your most important pages are crawled and indexed by reducing the number of irrelevant pages your site is feeding to Google.
Since duplicate content is usually generated by dynamic URLs from search filters, it ends up being duplicated not once, but thousands of times, depending on how many filter combinations there are.
One example is the one I gave above with eMAG. In reality, Google filtered out a lot more results, as a search for site:emag.ro + the keyword returns over 40,000 results. Many of those are probably very similar and some might be identical. For example, another variation is keyword + white. However, the landing page doesn’t only list white items, which also makes it irrelevant.
When you get backlinks, they point to a specific URL. That URL gets stronger and stronger the more links it gets. However…
If you have 10 versions of the same page and people can access all of them, different websites might link to different versions of that page.
While this is still helpful for your domain overall, it might not be the best solution for your website or for specific, important pages that you want to rank high.
We’ll talk soon about this issue, what causes it and how to fix it.
The URL examples I gave above are rather search engine optimization friendly, but some filters might not look so friendly. We all know Google recommends that you keep your URLs user friendly. Weird long URLs are associated with viruses, malware and scams.
A while ago, we even ran a study on thousands of websites, and the conclusion was that the more concise the URL, the greater the chance of ranking higher.
Example of not friendly URL: https://domain.com/category/default.html?uid=87YHG9347HG387H4G&action=register
Example of friendly URL: https://domain.com/account/register/
Try to keep your URLs short and easy to read, so that they would help and not hurt your sites. For example, people will figure out what those filters mean if you say order=asc&price=500&color=red. But, unless you’re a very big and trustworthy brand, like Google, they won’t be so sure what’s happening if the URL parameter extension is ei=NgfZXLizAuqErwTM6JWIDA (that’s a Google search parameter suffix).
As I said previously, sometimes the duplication of a page can result in bad user experience, which will harm your website in the long run.
If you end up ranking a page to the top of Google when it’s not actually relevant, users will notice that immediately (ex. indexing a search page with color xyz when you have no items with that color).
It’s finally time to list the most popular scenarios of how duplicate content gets created on the web. To check some SEO basics, let’s start with how it happens on websites internally, because it’s by far the most common issue.
If you have an SSL certificate on your website, then there are two versions of your website. One with HTTP and one with HTTPS.
They might look very similar, but in Google’s eyes they’re different. First of all, they’re on different URLs. And since it’s the same content, it results in duplicate content. Second, one is secure and the other one is not. It’s a big difference regarding security.
If you’re planning to move your site to a secure URL, make sure to check this HTTP to HTTPS migration guide.
There are also two more versions possible:
It’s the same thing as above, whether they’re running on HTTP or HTTPS. Two separate URLs containing the same content. You might not see a big difference between those two, but www is actually a subdomain. You’re just so used to seeing them as the same thing because they display the same content and usually redirect to a single preferred version.
While Google might know how to display a single version on its results pages, it doesn’t always pick the right one.
I’ve encountered this many times. It’s a technical SEO basic thing that every SEO should check, yet very many make this mistake and forget to set a preferred version. On some keywords Google displayed the HTTP version and on some other keywords it displayed the HTTPS version of the same page. Not very consistent.
So how can you fix this?
Solution: To solve this issue, make sure you’re redirecting all the other URL versions to your preferred version. This should be the case not only for the main domain but also for all the other pages. Each page of non-preferred versions should redirect to the proper page’s preferred version:
For example, if your preferred version is https://www.domain.com/page1/ then…
…should all 301 redirect to https://www.domain.com/page1/
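The mapping itself is simple, even though in production it belongs in your server config or CMS. Here’s a minimal Python sketch of the logic, assuming domain.com as a placeholder and HTTPS + WWW as the preferred version:

```python
from urllib.parse import urlsplit, urlunsplit

PREFERRED_SCHEME = "https"
PREFERRED_HOST = "www.domain.com"  # hypothetical preferred version

def preferred_url(url: str) -> str:
    """Map any scheme/host variant of one of our URLs to the preferred version."""
    parts = urlsplit(url)
    if parts.netloc.lower() in ("domain.com", "www.domain.com"):
        return urlunsplit((PREFERRED_SCHEME, PREFERRED_HOST,
                           parts.path, parts.query, parts.fragment))
    return url  # external URL, leave untouched

# All four variants collapse to a single canonical target:
for u in ("http://domain.com/page1/", "https://domain.com/page1/",
          "http://www.domain.com/page1/", "https://www.domain.com/page1/"):
    print(preferred_url(u))
```

Keep in mind the actual redirect must be a server-side 301 (in your web server, CDN or CMS), not application-level trickery, so that link signals are consolidated.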
Just in case you’re wondering, a WWW version will help you in the long term if your site gets really big and you want to serve cookieless images from a subdomain. If you’re just building your website, use WWW. If you’re already on the root domain, leave it like that; the switch isn’t worth the hassle.
You can also set the preferred version from the old Google Search Console:
However, 301 redirects are mandatory. Without them, the link signals will be diluted between the 4 versions. Some people might link to you using one of the other variants. Without 301 redirects, you won’t take full advantage of those links and all your previous link building efforts will vanish.
Also, we’re not yet sure of the course the GSC is taking with its new iteration, so it’s unclear if this feature will still be available in the future.
One common issue that leads to duplicate content is using hierarchical product URLs. What do I mean by this?
Well, let’s say you have an eCommerce store with very many products and categories or a blog with very many posts and categories.
On a hierarchical URL structure, the URLs would look like this:
At a first look, everything seems fine. The issue arises when you have the same product or article in multiple categories.
Now one solution:
As long as you are 100% certain that your product/article won’t be in two different categories, you’re safe using hierarchical URLs.
For example, if you have a page called services and have multiple unique services with categories and subcategories, there’s no issue in having hierarchical URLs.
Solution: If you think your articles or products will be in multiple categories, then it’s better to separate post types and taxonomies with their own prefixes:
Category pages can still remain hierarchical as long as a subcategory isn’t found in multiple root categories (one scenario would be /accessories, which can be in multiple categories and subcategories, but it’s only the name that’s the same, while the content is completely different, so it’s not duplicate content).
Another solution would be to specify a main category and then use canonical tags or 301 redirects to the main version, but it’s not such an elegant solution and it can still cause link signal dilution.
Warning: If you do plan on fixing this issue by changing your URL structure, make sure you set the proper 301 redirects! Each old duplicate version should 301 to the final and unique new one.
One of the most common causes of content duplication are URL variations. Parameters and URL extensions create multiple versions of the same content under different URLs.
They are especially popular on eCommerce websites, but can also be found on other types of sites, such as booking websites, rental services and even blog category pages.
For example, on an eCommerce store, if you have filters to sort items by ascending or descending price, you can get one of these two URLs:
These pages are called faceted pages. A facet is one side of an object with multiple sides. In the example above, the pages are very similar, but instead of being written A to Z they’re written Z to A.
Some people will link to the first variant, but others might link to the second, depending on which filter they were on last. And let’s not forget about the original version without any filters (domain.com/category/subcategory). On top of that, these are only two filters, but there might be a lot more (reviews, relevancy, popularity, etc.).
This results in link signal dilution, making every version a little bit stronger instead of making a single version of that page really strong. Eventually, this will lead to lower rankings overall.
Sure, you might argue that because of pagination, the pages will actually be completely different. That’s true if you have enough products in a category to fill multiple pages.
However, I could also argue that the first page of “?order=desc” is a duplicate of the last page of domain.com/category/subcategory?order=asc and vice versa. One of them is also a duplicate of the main version, unless the main version orders them randomly.
I could also argue that Google doesn’t really care about pagination anymore. In fact, it cares so little that it forgot to tell us that it doesn’t care about them.
Google still recommends using pagination the same way as you did before (either with parameters or subdirectories).
However, you should also make sure now that you properly interlink between these pages and that each page can ‘kind of’ stand on its own. Mihai Aperghis from Vertify asked John Mueller about this and this was his response:
Just because parameters create duplicate content issues it doesn’t mean you should never index any pages that contain parameters.
Sometimes it’s a good idea to index faceted pages, if users are using those filters as keywords in their search queries.
For example, some bad filters which you should not index could be sorting by price as shown above. However, if your users search for “best second hand car under 3000” then filters with price between X and Y might be relevant.
Another popular example are color filters. If you don’t have a specific color scheme for a product but the filter exists, you don’t want to index that. However, if filtering by the color black completely changes the content of the page, then it might be a relevant page to index, especially if your users also use queries such as “black winter coats”.
The examples above are for some general eCommerce store, but try to adapt them to your particular case. For example, if you have a car rental service, people might not necessarily search for color but they might search for diesel, petrol or electric, so you might want to index those.
One thing to mention is that anchor-like extensions & suffixes (#) are not seen as duplicate URLs. Google simply ignores fragments.
Ian Lurie from Portent talks about fixing a HUGE duplicate content issue (links with parameters to contact pages on every page of the site) like this: the solution was to use # instead of ? as an extension to the contact page URL, because Google completely ignores links with anchors.
However, in this article Ian mentions that he hasn’t even tried rel=canonical to fix the issue. While rel=canonical would probably not harm at all, in this case it might have not been helpful due to the scale of the issue.
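You can see why the fragment trick works: everything after the # never reaches the server, so to a crawler every fragment variant is the same URL. A quick illustration using Python’s standard library:

```python
from urllib.parse import urldefrag

# The fragment is client-side only; the crawler fetches the same resource
url, fragment = urldefrag("https://domain.com/page#contact-form")
print(url)       # https://domain.com/page
print(fragment)  # contact-form

# Compare with a query parameter, which DOES create a distinct URL
print(urldefrag("https://domain.com/page?popup=contact")[0])
```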
Solution: The best solution here is to actually avoid creating duplicate content issues in the first place. Don’t add parameters when it’s not necessary and don’t add parameters when the pages don’t create a unique facet, at least to some extent.
However, if the deed is done, the fix is to either use rel=canonical and canonicalize all the useless facets to the root of the URL or noindex those pages completely. Remember though that Google is the one to decide if it will take the ‘recommendation’ you give through robots.txt or noindex meta tags. This is also applicable to canonical tags, but from my experience, they work pretty well!
Remember to leave the important facets to be indexed (self referencing canonical), especially if they have searches. Make sure to also dynamically generate their titles.
The title of the facet should not be the same as the main category page’s title; it should be dynamically generated depending on the filters. So if my category is Smartphones and the title is “Best Smartphones You Can Buy in 2019” and the user filters by color and price, then the title of the facet should be something like “Best Blue Smartphones Under $500 You Can Buy in 2019”.
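That dynamic title generation can be sketched in a few lines. The function name and the filter set below are invented for illustration; a real store would map its own filters to title fragments:

```python
from typing import Optional

def facet_title(category: str, year: int, color: Optional[str] = None,
                max_price: Optional[int] = None) -> str:
    """Build a unique title for a filtered (faceted) category page."""
    pieces = ["Best"]
    if color:
        pieces.append(color.title())
    pieces.append(category)
    if max_price:
        pieces.append(f"Under ${max_price}")
    pieces.append(f"You Can Buy in {year}")
    return " ".join(pieces)

print(facet_title("Smartphones", 2019))
print(facet_title("Smartphones", 2019, color="blue", max_price=500))
```

The unfiltered category keeps its original title, while each indexable facet gets a distinct, keyword-relevant one.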
Another issue that can result in content duplication is a bad hreflang implementation.
Most multilingual websites have a bad hreflang implementation. That’s because most plugins out there implement the hreflang wrong.
Even I use some because I couldn’t find an alternative. I’ll present the issue:
When you have 2 languages and a page is translated to both languages, everything is fine. Each page has 2 hreflang tags pointing correctly to the other version. However, when a page is untranslated, the other language version points to the root of the other language, when it should not exist at all. This basically tells Google that the French language version of domain.com/en/untranslated-page/ is domain.com/fr/, which isn’t true.
Polylang has this issue. I know WPML also had it, but I’m not sure if they’ve addressed it yet.
However, it’s not the hreflang tag itself that causes duplicate content issues, but the links to these pages from the language selector.
The hreflang issue only confuses search engines into which page to display where. It doesn’t cause duplicate content issues. But while some plugins are smarter, others also create the pages and links to those other versions in the menu of the website. Now this is duplicate content.
When the pages aren’t translated, qTranslate (which has since been abandoned) creates links to the other variants but lists them as empty or with a warning message saying something like “This language is not available for this page”). This creates a flood of empty pages with similar URLs and titles (taken from the original language) that burn crawl budget and confuse search engines even more.
Now you might think a fix is easy, but merging from one plugin to another isn’t always the easiest thing to do. It takes a lot of time and effort to get it right.
Solution: The simple solution is to not create any links to untranslated variants. If you have 5 languages, a page which is translated to all 5 languages should include the other 4 links in the menu (under the flag drop down let’s say) and also have the appropriate hreflang tags implemented correctly.
However, if you have 5 languages but a particular page is only translated in 2 languages, the flags dropdown should only contain 1 link, to the other page (maybe 2 links to both pages, a self-referencing link isn’t really an issue). Also, only 2 hreflang tags should be present instead of all 5.
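The rule ‘only tag what actually exists’ is easy to express in code. A hedged sketch (the URLs and the helper function are invented for illustration; real plugins would pull translations from the CMS):

```python
def hreflang_tags(translations: dict) -> list:
    """Emit hreflang link tags only for languages this page is translated into."""
    return [
        f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
        for lang, url in sorted(translations.items())
    ]

# A page translated into only 2 of the site's 5 languages gets only 2 tags:
tags = hreflang_tags({
    "en": "https://domain.com/en/page/",
    "fr": "https://domain.com/fr/page/",
})
for tag in tags:
    print(tag)
```

Untranslated languages simply never appear in the dictionary, so no bogus tag pointing at the language root (or at an empty page) ever gets generated.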
If you want to know more about SEO & multilingual websites you should read this post on common hreflang mistakes.
While it’s good to have landing pages that are focused on conversions everywhere, sometimes they’re not the best for SEO. So it’s a very good idea to create customized pages only for ads.
The thing here is that many times, they are very similar and offer the same content. Why similar and not identical? Well, there can be many reasons. Maybe you have different goals for those pages, or different rules from the advertising platform.
For example, you might only change an image because Adwords rules don’t let you use waist measuring tools when talking about weight loss products. However, when it comes to organic search, that’s not really an issue.
Solution: If your landing pages have been created specifically for ads and provide no SEO value, use a noindex meta tag on them. You can also try to canonicalize them to the very similar version that is actually targeted to organic search.
Boilerplate content is the content that is found on multiple or every page of your site. Common examples are Headers, Navigation Menus, Footers and Sidebars. These are vital to a site’s functionality. We’re used to them and without them a site would be much harder to navigate.
However, it can sometimes cause duplicate content, for example when there is too little content. If you have only 30 words on 50 different pages, but the header, footer and sidebar have 250 words, then that’s about a 90% similarity. It’s mostly caused by the lack of content rather than the boilerplate.
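That 90% figure is simple arithmetic. A quick sketch with the numbers from the example above:

```python
boilerplate_words = 250  # header + footer + sidebar, identical on every page
unique_words = 30        # page-specific content

shared_share = boilerplate_words / (boilerplate_words + unique_words)
print(f"{shared_share:.0%}")  # about 89% of each page is identical
```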
Solution: Don’t bother too much with it. Just try to keep your pages content-rich and unique. If some facet pages generated by your filters list too few products, the boilerplate will make up most of the content. In that case, you want to use the solution mentioned above in the URL Variations section.
It’s also a good idea if you keep it a little bit dynamic. And by dynamic I don’t mean random. It should still be static on each page, just not the same on every page.
For example, Kayak uses a great internal linking strategy in its footer. Instead of displaying the same cities over and over again, it only displays the closest or most relevant ones. So while the homepage displays the most important cities in the US, the New York page only displays surrounding cities. This is very relevant for the user, and Google loves that!
Duplicate content can also occur across domains. Again, Google doesn’t want to show its users the same thing 6 times, so it has to pick one, most of the time the original article.
There are different scenarios where cross-domain content duplication occurs. Let’s take a look at each of them and see if we can come up with some solutions!
Generally, Google tries to reward the original creator of the content. However, sometimes it fails.
Contrary to popular belief, Google might not look at the publication date when trying to determine who was first, because that can be easily changed in the HTML. Instead, it looks at when it first indexed it.
Google figures out who published the content first by looking at when it indexed the first iteration of that content.
People often try to trick Google into thinking they published the content first. This has even happened to us, here at CognitiveSEO. Because we publish rather often, we don’t always bother to tell Google “Hey, look, new content, index it now!”
This means that we let the crawler do its job and fetch our content whenever it sees fit. But this allows others to steal the content and get it indexed quicker than us. Automatic blogs using content scrapers steal the content and then tell Google to index it immediately.
Sometimes the scraper takes the links as well, and if your internal linking is done well, Google can figure out the original source. But scrapers often strip all links, and sometimes even add links of their own.
Because our domain is authoritative in Google’s eyes and most of those blogs have weak domains, Google figures things out pretty quickly.
But if you have a rather new domain and some bigger site steals your content and gets it indexed first, there isn’t much you can do. This happened to me once with my personal blog in Romania. Some guys thought the piece was so awesome that they had to steal it. The problem was that they didn’t even link to the original source.
Solution: When someone steals your content the best way to protect yourself is to have it indexed first. Get your pages indexed as soon as possible using the Google Search Console.
This might be tricky if you have a huge website. Another thing you can do is to try and block the scrapers from crawling you from within your server. However, they might be using different IPs each time.
You can also file a DMCA report with Google to let them know you don’t agree with your content being stolen/scraped. The chances of this helping are slim, but you never know.
Well… I can’t say much about this. You shouldn’t be stealing other people’s content! It should be written somewhere in every beginner’s guide to SEO that stealing content is not a content marketing strategy.
In general, having only duplicate content on your website won’t give you great results with search engines. So using content scraping tools and automated blogs isn’t the way to go for SEO.
However, it’s not unheard of for sites to make a decent living out of scraping content. They usually promote it through ads and social media, but sometimes they also get decent search engine traffic, especially when they repost news stories.
Scraping can result in legal action, which might shut down your site and get you into bigger trouble, such as lawsuits and fines. However, if you have permission from the owners to scrape their content, I wouldn’t see an issue with that. We all do what we want, in the end.
While Google considers that scraped content adds no value to its search engine and tries to reward the original creator whenever possible, you can’t say that a news scraping site is never useful. Maybe a natural disaster warning reaches some people through that site and saves lives. You never know.
What if you don’t actually steal their content?
What about those cases in which you really find a piece of content interesting and want to republish it so that your audience can also read it? Or let’s say you publish a guest post on Forbes but also want to share it with your audience on your site?
Well, in that case there’s not really an issue as long as you get approval from the original source.
Solution: From an SEO point of view, it’s not really an issue if you publish someone else’s content. However, make sure you have approval for this, so you don’t run into any legal issues. In 99.9% of cases, they will want a backlink to their original post.
You can even use a canonical link to the original post. Don’t use a 301, as this will send people directly to the source, leaving you with no credit at all.
Don’t expect to rank with that content anywhere, especially if your overall domain performance is lower than the original source’s. That being said, don’t let other big websites repost your content without at least a backlink to your original source at the beginning of the article or, preferably, a full-fledged canonical tag in the HTML source.
Content curation is the process of gathering information relevant to a particular topic or area of interest. I didn’t write that. Wikipedia did.
Curating content rarely leads to duplicate content. The difference between content curation and plagiarism is that in plagiarism, people claim to be the original owner of the content.
However, the definition has its issues, as things can be very nuanced. What if I actually come up with an idea, but I don’t know that someone has written about it before? Is that still plagiarism?
In fact, what has anyone ever really invented? Can you imagine/invent a new color? Even Leonardo da Vinci probably drew the inspiration for his helicopter from the maple seed.
This post, for example, is 100% curated content. I’ve gathered the information from around the web and centralized it here. Brian Dean calls this The Skyscraper Technique. While people have done this for ages, he gave it a name and now he’s famous for that.
However, I didn’t actually steal anything. I gathered the information in my head first and then unloaded it here in this article. I didn’t copy-paste, and I didn’t claim to have invented these techniques or methods. I cited people and even linked to them. All I did was put all this information into one place by rewriting it from my own head.
Solution: When you write content using the Skyscraper Technique or by curating content or whatever you want to call it, make sure you don’t copy-paste.
Make sure you look at the topic from a new perspective, from a different angle. Make sure you add a personal touch. Make sure you add value. That’s when it’s going to help you reach the top.
After web scraping, content syndication is the second most common source of duplicate content around the web. The difference between the two is that syndication is a willful action by the content’s owner.
So the question arises: will syndicating my content over 10-20 sites affect my SEO?
In general, there’s no issue in syndicating content, as long as it’s not your main content generation method (which kind of looks like web scraping).
When syndicating content, it’s a great idea to get a rel=canonical to the original source, or at least a backlink.
Again, Google doesn’t like seeing the same thing over and over again. That’s actually the purpose of the canonical tag, so use it properly!
Solution: Make sure content syndication isn’t your main thing, as it might be picked up by Google as content scraping. If you want to syndicate your content, do your best to get a canonical tag pointing to the original URL on your site. This will ensure that other sites won’t be able to outrank you.
If rel=canonical isn’t an option, then at least get a backlink and try to get it as close to the beginning as possible.
If you’ve purposely reached this article, then you most probably already have a duplicate content issue. However, what if you don’t know you have one?
Some time ago, you could spot duplicate content in the old version of the Google Search Console. Unfortunately, this section now returns “This report is no longer available here.” In their official statement, Google said that they will ‘let go’ of this old feature.
Well, since that’s no longer available, the CognitiveSEO Site Audit Tool is a great way to easily identify duplicate content issues:
You can take a look at the content duplication or the Title, Heading and Meta Description duplication.
The tool also has an awesome advanced feature that identifies near duplicate pages and tells you the level of similarity between them!
Every page in the tool has hints on how you can fix different issues, so if you’re trying to fix any duplicate content issues, the Site Audit Tool can definitely help you out.
In the examples I gave above for each scenario, you’ve seen that there are different solutions, from not indexing the pages to 301 redirects and canonical URL tags.
Let’s take a look at what each solution does so that you may better understand why each one might be better for particular cases. This way, you’ll be able to make the right choice when faced with your unique scenario.
First of all, remember that:
The best way to fix duplicate content is to avoid having it in the first place.
Duplicate content issues can escalate very quickly and can be very difficult to fix. That’s why it’s always a good idea to talk to an SEO specialist before you even launch a website. Make sure the specialist gets in contact with the development team. Preventing an illness is always better and cheaper than treating it!
The 301 redirect can fix a duplicate content issue. However, this also means that the page will completely vanish and redirect to a new location.
If your users don’t need to access that page, the 301 is the best way of dealing with duplicate content. It passes link equity, and Google will always respect it. However, it’s not a good fit for facets, for example, because you want your users to be able to access those pages.
You can use a plugin to 301 redirect or redirect from the .htaccess file if you’re on an Apache server. There are also probably 1,000 other ways of setting up a 301 but you have Google for that.
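As a sketch, here is what the .htaccess approach might look like on an Apache server (the paths and domain are hypothetical placeholders):

```apacheconf
# Permanently redirect one duplicate URL to its canonical counterpart:
Redirect 301 /old-duplicate-page/ https://example.com/canonical-page/

# Or collapse the non-www duplicate of the whole site into the www version:
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```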
The canonical tag was actually introduced by Google as a solution to content duplication.
Whenever you need users to be able to access those pages (from within the website or anywhere else), but don’t want to confuse search engines about which page to rank, since the pages might be very similar, you can use the canonical tag to tell search engines which page should be displayed in the search results.
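For illustration, the tag goes in the `<head>` of the near-duplicate page (say, a filtered or faceted URL) and points at the version you want ranked; the URL below is hypothetical:

```html
<!-- On https://example.com/category/shoes/?sort=price -->
<link rel="canonical" href="https://example.com/category/shoes/" />
```

The filtered page stays fully usable for visitors, while search engines consolidate ranking signals onto the canonical URL.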
Not indexing some pages can also fix duplicate content issues. However, you have to make sure these pages are actually useless.
A big warning sign that you should not remove them from the index are backlinks. If these pages have backlinks, then a 301 or a canonical tag might be the better choice since they pass link signals.
You can either block the pages from being indexed through your robots.txt file:
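```text
# Hypothetical robots.txt rules blocking filter/facet URLs:
User-agent: *
Disallow: /filters/
Disallow: /*?sort=
```

(Strictly speaking, robots.txt blocks crawling rather than indexing, so pages that are already indexed or heavily linked may still show up in results.)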
Or you can add a noindex meta tag directly on the pages you don’t want Google to index:
<meta name="robots" content="noindex">
However, Google may choose to follow or ignore these directives, so make sure you test the effectiveness of your changes in your particular case.
Sometimes, you can’t fix them all the same way. You might want to noindex some facets of your filtered pages, while canonicalizing others. In some instances, you might even want to 301 them, since they no longer have value. This really depends on your particular case!
An awesome guide on how different duplicate content fixes affect pages and search engines can be found here. This is a screenshot of the table in that article presenting the methods and their effects:
Duplicate content is an issue affecting millions of websites and approximately 30% of the entire internet! If you’re suffering from duplicate content, hopefully now you know how you can fix it.
We tried to create a complete guide where you can find anything you want about the subject “duplicate content and SEO.” This advanced guide to SEO and duplicate content will surely help you whether you’re doing some technical SEO audits or if you’re planning an online marketing campaign.
Have you encountered content duplication issues in your SEO journey? Let us know in the comments section below!