5 silly SEO mistakes that even professional marketers make

With the rise of comprehensive digital marketing, search engine optimization has somewhat faded into the background. When you have paid ads, social media, and content, why would you spend time on keywords, tags, or links? Yet, even if a bit neglected, SEO is still essential to a site's overall performance in search. By harnessing the power of on-page and off-page optimization, you can take your site to the next level. However, doing SEO the right way is much easier said than done: it is constantly evolving, and Google keeps rolling out algorithm updates one after another.

No wonder SEO professionals constantly scour the Web for that crucial bit of info about upcoming changes and updates. The problem is that, afraid of missing out on the new stuff, they often make some basic-level mistakes.

Here are those silly but harmful mistakes that even professionals make, pointed out by a top Phoenix SEO expert, each of which can ruin your entire digital marketing campaign.

#1. Block your site from indexing in .htaccess

.htaccess is a configuration file that stores directives which block or grant access to your site's document directories. If you know how to manage it, you can:

Create a more detailed sitemap

Generate cleaner URLs

Adjust caching to improve load time.

In short, .htaccess is crucial to your site's indexing process and, eventually, to earning higher positions in the SERPs.
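To make the last two points concrete, here is a minimal .htaccess sketch. The rewrite target (products.php), the URL pattern, and the one-month cache lifetime are hypothetical, and the directives only take effect if the corresponding Apache modules (mod_rewrite, mod_expires) are enabled on your server.

RewriteEngine On
# Hypothetical clean URL: serve /products/123 from products.php?id=123
RewriteRule ^products/([0-9]+)/?$ products.php?id=$1 [L,QSA]

<IfModule mod_expires.c>
# Hypothetical caching rule: let browsers keep images for one month
ExpiresActive On
ExpiresByType image/png "access plus 1 month"
</IfModule>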

However, you need to be a true professional to set up an .htaccess file correctly. A single mistake could lead to awful consequences. 

For example, search bots could be locked out of entire sections of your site, like this:

RewriteCond %{HTTP_USER_AGENT} ^Google.* [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Bing.*
RewriteRule ^dir/.*$ - [F]

If these lines are in your .htaccess file, search bots will be served a 403 Forbidden response and won't crawl or index the affected pages. Ask a developer to delete the code, or do it yourself.

Make it a habit to check .htaccess every time you begin a new project; a quick way to verify that bots aren't blocked is sketched below. Some SEOs promote and optimize sites for months without realizing that all their efforts are in vain.
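One quick verification, independent of reading the file itself, is to request a page while identifying as Googlebot and check the response code. Below is a minimal Python sketch assuming the requests library; the URL is a placeholder for a page from the section you suspect is blocked.

import requests

# Placeholder URL - use a page from the directory you want to verify
url = "https://example.com/dir/page.html"

# Identify as Googlebot; a blocked section typically answers 403 Forbidden
headers = {"User-Agent": "Googlebot/2.1 (+http://www.google.com/bot.html)"}
response = requests.get(url, headers=headers)

print(response.status_code)  # 403 here means search bots are locked out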

#2. Leave your robots.txt file entirely open for crawling

Never leave your robots.txt file entirely open for crawling: it can result in serious privacy issues, with pages that were never meant to be public ending up exposed in search and, in the worst case, contributing to a damaging data exposure.

Make sure you learn enough about setting up and managing robots.txt files, and act immediately if you see something like this in robots.txt:

User-Agent: *

Allow: /

This means that search bots can access and crawl every single page on your site, including admin, login, cart, and dynamic pages (search results and filters). You want to keep your customers' personal pages protected and closed off, and you don't want to be penalized for dozens of spammy dynamic pages either.

Ensure you disallow the pages that should be blocked and allow the pages that should be indexed, roughly as sketched below.
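As a rough illustration, a safer robots.txt might look like the following; the paths (/admin/, /cart/, /search) are placeholders to be replaced with your site's actual private and dynamic sections.

# Placeholder paths - adjust to your site's own private and dynamic sections
User-Agent: *
Disallow: /admin/
Disallow: /cart/
Disallow: /search
# Everything not disallowed stays open to crawlers by default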

#3. Forget to add the "nofollow" attribute to outbound links

SEO professionals know that links are vital for ranking. But while chasing backlinks, they completely forget that their own site passes link juice to other sites. What you should do is attract all the high-quality backlinks you can while keeping the power on your own site, which is exactly what the "nofollow" attribute is for (see the example below).
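For reference, marking an outbound link as nofollow is a one-attribute change in the HTML; the URL and anchor text below are placeholders.

<!-- Hypothetical outbound link that should not pass link juice -->
<a href="https://example.com/some-external-page" rel="nofollow">External resource</a>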

Your strategy is simple:

Scan your site using a site scanner tool (such as Xenu)

Sort links by address to locate outbound ones

Create an Excel file listing outbound links (or download a standard HTML report)

Go through every link in the list and add the "nofollow" attribute where required (a scripted approach is sketched below).
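If you prefer scripting this audit instead of (or alongside) a desktop crawler, here is a minimal Python sketch assuming the requests and beautifulsoup4 libraries; the page URL and domain are placeholders.

import requests
from bs4 import BeautifulSoup
from urllib.parse import urlparse

# Placeholder values - swap in your own page and domain
page_url = "https://example.com/blog/post"
own_domain = "example.com"

html = requests.get(page_url).text
soup = BeautifulSoup(html, "html.parser")

# List outbound links and whether they already carry rel="nofollow"
for link in soup.find_all("a", href=True):
    host = urlparse(link["href"]).netloc
    if host and own_domain not in host:
        rel = link.get("rel") or []
        print(link["href"], "nofollow" if "nofollow" in rel else "no nofollow")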

Don't become obsessed with the "nofollow" attribute either. If you hoard all the link juice for yourself, you only encourage other SEO professionals to do exactly the same to your links.

#4. Fail to check your code in a validator

The cleaner your website's code, the higher it can potentially rank, because neat, valid markup lets search crawlers scan and index your site more efficiently, without leaving a single page behind.

So every time a new project lands on your desk to optimize and promote, make sure you check the code. Tools like the W3C Markup Validation Service make this simple: just paste your website's URL into its address field, and if any errors turn up, ask the developer to fix them. If you'd rather run the check from a script, see the sketch below.
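The W3C's Nu HTML Checker behind that service also exposes an HTTP interface; the sketch below assumes its https://validator.w3.org/nu/ endpoint with doc and out=json parameters (check the current documentation before relying on it) and uses the Python requests library. The site URL is a placeholder.

import requests

# Placeholder: the page you want to validate
site_url = "https://example.com/"

# Assumed endpoint and parameters of the W3C Nu HTML Checker's JSON interface
response = requests.get(
    "https://validator.w3.org/nu/",
    params={"doc": site_url, "out": "json"},
    headers={"User-Agent": "markup-check-sketch/0.1"},  # hypothetical client name
)

# Each message has a type (e.g. error, info) and a human-readable description
for message in response.json().get("messages", []):
    print(message.get("type"), "-", message.get("message"))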

While Google doesn't penalize websites for invalid bits of HTML and CSS, it's still worth running the validator anyway. It doesn't take much time, and it improves your site's performance for both users and crawlers.

Working with experts from a San Francisco SEO company will help you achieve the desired results and ensure you don't make such silly mistakes.

