There has been a fair amount of controversy in the last five years regarding how marketers should write web content so that Google elevates the search engine ranking of the site in question.
One camp of marketers insists that website content, such as web pages, product descriptions, service pages, and blog articles, should be written so simply, plainly, and concisely that anyone with a 6th grade reading comprehension level can understand it. This camp believes that content should be written for people, albeit people who aren't necessarily the sharpest tools in the toolbox.
Another camp of marketers operates as though the sole purpose of publishing written content online is to elevate website search engine ranking, and therefore all content should be composed to maximize the chances that Google bots will determine the website’s content is “valuable.” This camp believes that content should be written with as many popular SEO keywords as possible so that the content pleases Google bots and Google’s ever-changing algorithms even if doing so causes the actual verbiage to sound awkward, stilted, or downright grammatically incorrect.
I don’t know about you, but when I think of my ideal customer, neither a 6th grader nor a Google bot comes to mind… But I digress.
Does content that consumers love tend to rank poorly on Google?
Will content that’s written for the specific purpose of maximizing search engine ranking automatically repulse living, breathing web visitors?
Why do Google bots exist if they’ve been programmed to value what consumers despise and penalize what consumers value?
The answers to these questions can be summed up in a single statement: Google bots have been programmed to mimic consumer behavior and value what consumers value.
In other words, the entire “content controversy” is based on a faulty assumption.
You see, professional writers tend to insist that excellent writing renders SEO techniques unnecessary. On the other hand, professional webmasters and SEO strategists tend to presume that strategic SEO is so effective that it can elevate the search engine ranking of even the most poorly written drivel. These two diametrically opposed mentalities have caused marketers to split into the two camps I mentioned.
If you ask me, massive egos are to blame…
The fact of the matter is that content must be well-written and also utilize effective SEO techniques. You can’t favor one over the other and expect your website to positively impact your business revenue.
Let’s cut to the chase. You want to learn the characteristics of “well written” content, and you want to understand how you, as a marketer, can ensure that the content you write satisfies the quality criteria for both consumers and Google bots.
Step one is to free yourself from the mindset of those “camps” I’ve been referencing. The rest of the steps are laid out in this article, so keep reading.
Your business is unique, which means your customers are unique individuals with unique needs. This means the content you write should speak directly to your audience. You can identify that audience and understand who your ideal customer is by analyzing your customer relationship management (CRM) data along with your web traffic and analytics reports.
Do not become seduced by the prospect of writing content for a broad audience on the assumption that doing so will bring in new customers. It will backfire. If you write your content to cater to people with a low reading comprehension level even though your ideal client holds a graduate degree, earns a seven-figure salary, and places a high premium on credentials and expertise, you will only end up turning off the very people you want to do business with.
Instead, be polarizing.
There’s no real risk of excluding consumers who were never going to become your customers anyway. However, there’s a real benefit to composing your content in such a way that it exclusively addresses a niche audience. In fact, intentionally writing for a niche audience will organically boost your search engine ranking when it comes to consumers who use longtail keyword phrases to find business websites that offer very specific products, services, and solutions.
This type of consumer will be more likely to quickly convert into a customer once they land on your website, which is far more beneficial than having one hundred consumers land on your website who never buy.
Write your content as though you are speaking to your ideal customer, one-on-one. If your customer base is largely composed of ten-year-olds, then speak to them in their language with their vocabulary, remaining mindful of their limited attention span.
Use the following checklist as you write:
● Identify and use the best word choices
● Find and use high-ranking keywords
● Understand your niche audience
● Speak to your audience in their language
● Study your competition
● Write with confidence, challenge existing opinions, be polarizing
● Create a content template based on your high-ranking content for future use
If you’ve been keeping up with the FTx 360 blog, then you’ve already read several articles about SEO, including What Is SEO?, How to Spot Negative SEO, and Ways to Increase Web Traffic. For tips on how to determine which keywords and longtail keyword phrases to use, be sure to check out those articles.
Then, in order to compose effective content that Google bots will love just as much as consumers, keep the following pointers in mind as you write:
● Understand SEO copywriting
● Incorporate keywords in your headline
● Optimize for clarity
● Understand on-page SEO
● Build sentences and paragraphs that naturally utilize longtail keyword phrases
● Write long-form content; Google tends to boost articles of at least 2,400 words (a quick draft-check sketch follows this list)
● Encourage longer visitor engagement sessions, which Google rewards with higher rankings
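If you like to sanity-check a draft before you publish it, here is a minimal Python sketch along those lines. The headline, body text, and keyword list are hypothetical placeholders, and the 2,400-word target simply mirrors the pointer above rather than any official Google threshold:

# draft_check.py: a rough pre-publish sanity check for a draft article.
# The headline, body, and keywords below are placeholder examples.

def check_draft(headline, body, keywords, min_words=2400):
    word_count = len(body.split())
    print(f"Word count: {word_count} (target: {min_words}+)")
    for keyword in keywords:
        in_headline = keyword.lower() in headline.lower()
        mentions = body.lower().count(keyword.lower())
        print(f"'{keyword}': in headline = {in_headline}, mentions in body = {mentions}")

if __name__ == "__main__":
    check_draft(
        headline="How to Choose Organic Coffee Online",
        body="Your full draft text goes here...",
        keywords=["organic coffee", "buy organic coffee online"],
    )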
Ready to analyze the real technical side of your website to ensure that Google bots will love, love, love your website? It’s time to talk about Robots.txt.
The term "robots.txt" is shorthand for the "robots exclusion protocol." A robots.txt file is a plain text file that lives at the root of your website, and its sole purpose is to tell search engine robots, including Google bots, which parts of that site they do and do not have permission to "crawl."
What this means is that Google won’t automatically crawl your website by virtue of the fact that your site exists, in the same way that your neighbor won’t automatically enter your house just because it’s right across the street. Before Google crawls a given web page, its robots will first read the robots.txt file for permission and instructions.
Do you want to permit Google to crawl every page on your website? Maybe. Or maybe not. That's up to you. If, for instance, your site keeps archived copies of content that also appears elsewhere on the site, you wouldn't want Google to crawl both the featured content and its archived duplicate, because Google can treat the duplication as a quality problem and demote your ranking.
You can "disallow" Google bots from crawling specific pages or directories by setting the appropriate allow and disallow rules within your robots.txt file.
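For example, a minimal robots.txt that keeps crawlers out of a hypothetical /archive/ directory while leaving the rest of the site open would look like this (substitute whatever path actually holds your duplicate content):

User-agent: *
Disallow: /archive/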
Where can you find your robots.txt file in order to review and modify the permissions?
Simply type your website's URL into your browser's address bar, followed by "/robots.txt." Let's use FTx 360 as an example. Type the following URL into your address bar:
https://ftxdigital360.com/robots.txt
You will notice:
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
That exact information is what Google will adhere to prior to crawling our website pages.
What does yours include? Be sure to review your site's robots.txt file and make sure its rules reflect exactly what you want to allow and disallow for each section of your website.
In the rare event that there is no robots.txt file associated with your website, Google will proceed as though it has full permission to crawl and index your entire site. If that's okay with you, then there's nothing you need to do. Robots.txt will be your friend across the board.
But if you prefer that certain pages of your website not be crawled and indexed by Google and other search engines, then you'll need to create a robots.txt file and meticulously list the rules you want in place, blocking Google from any pages you don't want its bots to crawl.
Cleaning up your robots.txt file is an integral part of ensuring that the web page content you do want crawled and indexed is what rises in Google's search rankings, where visitors can discover and read it.
Are you familiar with the digital marketing term, “cloaking”? If not, you aren’t alone. I’ve been writing digital content for over eight years and only recently have I been introduced to the concept of “cloaking.”
Anyone who has ever dared venture into the online dating world has likely been the victim of some degree of cloaking. You chat with what you believe to be an attractive individual on the dating platform, only to discover when you finally meet them in person that they are 20 years older and 80 lbs heavier than their profile pictures indicated…
Yikes!
In a nutshell, "cloaking" can be defined as falsely representing your URL content to Google's search engine spiders for the purpose of luring online users to your website. Much like a washed-up 45-year-old who uses his nephew's handsome bodybuilding photos to score dates on OKCupid, "cloaking" involves giving Google an inaccurate portrayal of your website in order to elevate search engine ranking and drive web traffic.
There are many reasons why “cloaking” is a big no-no in the digital marketing world, but the biggest of all is that businesses that use cloaking as an SEO tactic ultimately end up turning potential customers into infuriated website visitors.
When a potential customer uses Google to search for the best mechanic in town and your website appears on the first page of the search results, you had better be a mechanic. If you're selling used cars instead, you're going to have one very irritated website visitor.
Now, chances are, you would have never heard of cloaking if not for reading this article. You might be thinking right now that you’re in no danger of “cloaking” your website in order to “game the Google system,” so why am I bothering to tell you about cloaking in the first place?
Because there are unethical digital marketing agencies out there who attempt to use cloaking without their clients’ knowledge in order to “deliver outstanding SEO results.” These agencies like to boast that they’ll get your website on the first page of Google’s search results, only to end up using unethical tactics to succeed. Worst of all, despite elevating their clients’ websites on Google, they provide virtually no ROI, because deceived website visitors obviously do not become online customers.
What are the “cloaking” red flags to look out for when you’re considering partnering with a digital marketing agency? Are there warning signs that can help you avoid these disreputable “cloaking” agencies? And if you’ve recently discovered that the digital marketing agency you’ve hired has “cloaked” your website, what can you do about it now?
Let’s take a look…
Cloaking is not only unethical, it is a direct violation of Google's Webmaster Guidelines. Ethical webmasters employ search engine optimization techniques as they index the websites they build. This is fine. But how can you tell whether a webmaster has used ethical SEO techniques or deceptive ones? You can find out for yourself what a digital agency has been up to by investigating their clients' websites. Here's what to look for:
● Check for AdWords cloaking scripts, PHP URL cloaking scripts, and URL cloaking scripts
● Review the SERP directly by turning on "Preserve Log" in Chrome DevTools, switching the user agent string to "Googlebot" in the Network conditions panel, and visiting the website directly from Google to compare the "Googlebot" content to the "online user" content (the sketch after this list shows a scripted way to make the same comparison)
● Use a “cloaking checker” tool, like the one offered at Dupli Checker
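If you'd rather check from the command line, here is a minimal Python sketch of the same comparison: request a page twice, once with a normal browser user-agent string and once with a Googlebot user-agent string, and see whether the responses match. The URL is a placeholder, and this is only a rough first-pass check, not a substitute for Google's own tools:

# cloak_check.py: rough comparison of content served to a browser vs. "Googlebot".
# Requires the third-party "requests" package. The URL below is a placeholder.
import requests

URL = "https://example.com/"  # replace with the page you want to inspect
BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"
GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

def fetch(user_agent):
    response = requests.get(URL, headers={"User-Agent": user_agent}, timeout=10)
    return response.text

browser_html = fetch(BROWSER_UA)
googlebot_html = fetch(GOOGLEBOT_UA)

if browser_html == googlebot_html:
    print("Identical responses: no obvious user-agent cloaking.")
else:
    print("Responses differ: worth a closer look (differences can also be benign).")
    print(f"Browser response: {len(browser_html)} characters")
    print(f"Googlebot response: {len(googlebot_html)} characters")

Keep in mind that dynamic pages legitimately vary between requests, so treat a mismatch as a prompt to investigate, not as proof of cloaking.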
Needless to say, if you discover any prospective marketing agency has used cloaking for their clients’ websites, run.
When Google discovers that a website has used “cloaking” to elevate SEO ranking, then Google will blacklist the website, period. It is very hard to bounce back once you’ve been blacklisted. Protect your business so that you don’t hire an agency that resorts to these unethical SEO tactics. Here are the questions to ask potential webmasters and digital marketers before you sign a contract:
● Will you use private blog networks (PBNs) to increase my website's rank on Google?
● Will you republish high-ranking web content for me in order to maintain a favorable SEO ranking?
● Will you maximize SEO and backlinks by assigning a team to leave comments under all my website’s blog articles?
If a marketing agency answers “yes” to any of the above questions, run. It’s a very good idea to familiarize yourself with Google’s Webmaster Guidelines prior to interviewing your agency candidates.
What if you recently discovered that the digital marketing agency you’ve been working with has used cloaking and other “black hat SEO” techniques, which have now backfired, resulting in your website being blacklisted by Google? Can you recover? It’s possible, but recovering won’t happen overnight. That being said, here are the steps you can take today to submit a “reconsideration request” to Google:
● First, use the URL Inspection tool in Google Search Console (the successor to the old Crawl > Fetch as Google feature) to fetch the web pages from the affected portions of your website
● Compare the web content Google fetches to the web content your online users see, and resolve all variations between the two so that the content is the same
● Finally, check all redirects on your site and remove the faulty ones that send users to an unexpected destination and/or ones that utilize "conditional redirects" (the sketch after this list shows a quick way to spot them)
● Once you’ve completed the above steps, submit your reconsideration request to Google
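For the redirect check in particular, a similar comparison works: request the same URL with different user-agent strings and see whether each visitor ends up at the same destination. Again, this is just a sketch with a placeholder URL, reusing the requests library from the earlier example:

# redirect_check.py: flag redirects that behave differently for different visitors.
# Requires the third-party "requests" package. The URL below is a placeholder.
import requests

URL = "https://example.com/old-page"  # replace with a URL you want to test

USER_AGENTS = {
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
}

final_urls = {}
for name, user_agent in USER_AGENTS.items():
    response = requests.get(URL, headers={"User-Agent": user_agent}, timeout=10)
    final_urls[name] = response.url
    hops = " -> ".join(r.url for r in response.history) or "(no redirects)"
    print(f"{name}: {hops} -> final destination {response.url}")

if len(set(final_urls.values())) > 1:
    print("Warning: the destination depends on the visitor, which suggests a conditional redirect.")
else:
    print("All visitors end up at the same destination.")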
If you do not want to attempt resolving the Google violations yourself, preferring instead to hire a reputable marketing agency, then just be sure to relay to the new digital marketing agency the specific Google penalties you’ve received. Did you receive a “partial match” penalty or a “site-wide match” penalty? Let the new agency know so that they can focus on fixing the specific violations.
Cloaking, one of the most notorious "black hat SEO" techniques, works by feeding Google misleading content for the specific purpose of optimizing SEO. In technical terms, this is accomplished with a server-side script that detects Google's bots and serves them a Google-friendly variation of the website that real visitors never see. How would a website know that a "visitor" is a search engine robot and not a real user? Rather than digress into a detailed and highly technical explanation of how that's done, let's put it this way: Artificial Intelligence has come a long way.
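To make the mechanism concrete without endorsing it, here is a deliberately simplified sketch of what user-agent-based cloaking looks like on the server side, written as a hypothetical Flask route. The route and page content are invented for illustration; this is the tactic Google penalizes, not something to deploy:

# cloaking_illustration.py: what NOT to do, shown only to illustrate the mechanism.
# Requires the third-party "flask" package; the route and content are hypothetical.
from flask import Flask, request

app = Flask(__name__)

@app.route("/")
def home():
    user_agent = request.headers.get("User-Agent", "")
    if "Googlebot" in user_agent:
        # The cloaker serves a keyword-stuffed page to the crawler...
        return "<h1>Best Mechanic in Town</h1><p>mechanic, auto repair, brake service...</p>"
    # ...while real visitors see something entirely different. This is what gets sites blacklisted.
    return "<h1>Used Cars for Sale</h1>"

if __name__ == "__main__":
    app.run()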
Originally, A.I. was not developed to help unethical marketers lie, cheat, and steal. Unfortunately, the same strategies that let marketers describe URL content to Google's bots are what fraudsters twisted into "cloaking" in the first place.
You see, web developers, SEO strategists, and digital marketers are supposed to provide Google bots with URL information, and that information has to be condensed. For example, if a website sells over 500 products, the URL description written specifically for Google is obviously not going to mention all 500 of them. Savvy SEO strategists will therefore highlight only the most SEO-friendly products within the Google-specific "scripts." If organic coffee is the most popular product a website sells, for instance, then the "scripts" used for Google will focus on organic coffee and won't mention the other 499 organic produce, meat, and tea products that are also for sale on the site.
Hopefully, by this point you see the distinction between acceptable SEO strategies and unethical “black hat SEO” cloaking.
The big takeaway to remember is that if real, living, breathing users love your web content, then Google bots will, too. Always write for people. Never write for the sake of pleasing algorithms if doing so comes at the expense of providing quality content to actual people. After all, Google bots are not your potential customers, people are, so be sure to keep the horse in front of the cart if you want to elevate your search engine ranking on Google.
Are you looking for a reputable digital marketing agency to develop your website, write your content, and handle your SEO? FTx 360 offers web design and development, content writing, and SEO marketing and optimization services for businesses of all industries. If you’ve been the victim of black hat SEO, we can fix it, repair your reputation, and put you on the fast track to building trust with consumers. Contact us to speak with our webmasters, content writers, and experienced marketing strategists today.
Want to read more articles like this? Enter your email below to subscribe to our mailing list and be the first to know about the latest marketing trends!