Learning all the ins and outs of SEO vocabulary and jargon can feel like learning another language.
On top of that, making your clients familiar with this SEO jargon is a task in itself.
Over the years, I have had many interesting conversations with my clients, which helped me develop my own storytelling approach to explaining various aspects of SEO.
Here I am sharing a few of those conversations.
1. Content Length:
Client: Ajit, why do I need to make sure my content length is 1,500 or 2,000 words? I read somewhere that lengthy content ranks at the top of Google.
Ajit: What are you more likely to sit through: a short documentary or a three-hour-long film?
Client: It all depends on how interesting the content is! Sometimes I enjoy short documentaries and sometimes I end up hooked on longer ones. I do not judge content based on its length.
Ajit: Exactly my point! The same is true for Google. Your ranking thrives on the quality of the content you share, not its length.
Client (making a self-realization face): Makes sense!
2. PageSpeed:
Client: Why do I need to achieve a better PageSpeed score? Why does a low PageSpeed affect my search rankings?
Ajit: Suppose you have to drive your car on a highway. What speed would you prefer? Your options are:
- Gear 1 or 2: 30-40 km/h
- Gear 3 to 5: 80-100 km/h
Client: I don’t like rash driving. I would opt for anything between 70-90 which is both safe and timely.
Ajit: You are right! Some cars have speedometers with an economy band showing that you are driving efficiently between 80 and 100, while a red bar warns that you are not. The same is true for Google PageSpeed: your optimal score is 70-80 or above, and anything below 50 is not viable.
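To mirror the speedometer analogy, here is a tiny Python sketch that sorts a PageSpeed score into bands. The thresholds come from the conversation above; the function and band names are my own illustration, not anything Google defines:

```python
def pagespeed_band(score: int) -> str:
    """Classify a PageSpeed score using the thresholds from the
    analogy above: below 50 is the red zone (not viable), 70 and
    above is the optimal 'economy' band, and anything in between
    needs improvement."""
    if score < 0 or score > 100:
        raise ValueError("PageSpeed scores range from 0 to 100")
    if score < 50:
        return "red zone: not viable"
    if score < 70:
        return "needs improvement"
    return "economy band: optimal"

print(pagespeed_band(85))  # economy band: optimal
print(pagespeed_band(45))  # red zone: not viable
```

Just as you glance at the speedometer's colored bands rather than the exact needle position, clients usually only need to know which band their score falls in.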
3. Robots.txt file:
Client: I have noticed that you keep mentioning robots.txt files and suggest keeping them updated. What are they, exactly?
Ajit: Think of a robots.txt file as the antivirus software installed on your laptop/PC. The moment you switch on your laptop and start surfing, it scans files in the background. The good ones are allowed through, while the ones that can harm your device are blocked.
Client: Yes, true. That is the purpose of having antivirus software in the first place.
Ajit: Exactly! A robots.txt file works in a similar way. It allows search bots to crawl your website and index the pages that benefit your SEO rankings, while blocking the other pages that could harm the site if Google found them.
So, it is like putting all your garbage and bad stuff in the garage and welcoming guests into a neat and clean drawing room.
Client: Ah, understand!
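To make this concrete, a minimal robots.txt might look like the sketch below. The paths and domain are hypothetical examples, not from any real site:

```txt
# Applies to all search bots
User-agent: *
# Keep crawlers out of pages that should not appear in search results
Disallow: /admin/
Disallow: /checkout/
# Everything else is open to crawling
Allow: /
# Point bots to the sitemap of pages you do want indexed
Sitemap: https://www.example.com/sitemap.xml
```

The `Disallow` lines are the "garage" from the analogy, and the `Sitemap` line is the tidy drawing room you invite guests into.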
4. Google Updates:
Client: You mentioned in your last report that our traffic improved because of the latest Google update. What do these updates mean?
Ajit: Think of it as tests that your examiner takes in a year. You have class tests, mid-semester tests, and final tests.
Google (the examiner) rolls out several small updates a year (class tests and mid-terms) and one major update (the final exam).
Just as it is okay to fail an internal class test but consequential to fail the final exam, it is okay to suffer a slight drop from a small Google update, but it can be disastrous not to survive that one major update.
Client: Haha. Interesting!
Have you ever faced a situation where clients come up with such queries and, even after a technical explanation, still don't grasp them well?
BTW, here is the link to an SEO glossary for reading and understanding.
What do you do in that case? Would love to hear your stories! 🙂