12 completely outdated SEO practices you should avoid by @searchmastergen

SEO has undergone profound changes over the years, and it continues to evolve every day.

While most traditional marketing tactics remain (for the most part) the same, the changes to SEO have radically transformed the landscape.

Most, if not all, of these changes have helped improve the web, and search in particular.

Yet some people remain attached to the "old ways" and try to use obsolete SEO practices to improve their brand's visibility and performance.

Some of these tactics worked well years ago.

Yet many marketing novices and/or small business owners still use these "zombie" SEO techniques (tactics that should be dead, but for some reason are not).

Not only are they ineffective, but many of the 12 obsolete SEO practices below are potentially harmful to the well-being of your brand, website, and other digital properties.

1. Abuse of Keywords

Webmasters and "marketers" continue to misunderstand the role of keywords in overall SEO initiatives and how to use them in day-to-day strategy.

Let's examine the specific types of keyword abuse and mismanagement in more detail, including irrelevant usage, writing for a specific keyword density, and keyword stuffing.

Irrelevant Keyword Targeting

Too often, SEO novices try to fit their content and messaging within the confines of their keyword research (and not much else).

These "marketers" shape content and its metadata to represent keywords it doesn't properly align with, ignoring the actual intent of the users searching for the high-volume keywords being targeted.

Brands may therefore lose readers' attention before ever having the chance to communicate their real message.

If the targeted keywords don't align with the content of the page, that disconnect will hinder the content's success, even if it is otherwise high quality.

Don't try to mislead users by directing them to content that is misrepresented by high-volume keywords just to increase visibility.

Google knows what this looks like, and it can truly be defined as an obsolete SEO practice.

Keyword Density

Writing to a specific keyword density, like many keyword-focused marketing tactics, simply misses the mark.

Google no longer depends on keyword density (the ratio of a specific keyword's usage to the page's overall copy) to determine whether a webpage is an effective source for answering a search query.

Search engines are far more advanced than simply looking for a keyword; Google uses a multitude of signals to determine search results.

Keywords remain important for the topics and ideas they represent, but they are not a lifeline for ranking on high-value search queries.


Quality content and quality message delivery are that lifeline.
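To see why chasing a density number is misguided, here is a minimal sketch (a hypothetical helper, not part of any SEO tool) that computes the metric Google no longer relies on; the sample copy is invented for illustration:

```python
import re
from collections import Counter

def keyword_density(text: str, keyword: str) -> float:
    """Return the keyword's share of total words, as a percentage."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    count = Counter(words)[keyword.lower()]
    return 100.0 * count / len(words)

copy = "SEO tips: good SEO means writing for readers, not chasing SEO scores."
print(round(keyword_density(copy, "SEO"), 1))  # 3 of 12 words -> 25.0
```

A page could hit any density target you like and still say nothing useful, which is exactly why the metric fell out of favor.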

Keyword Stuffing

This is probably the oldest trick in the book.

SEO is all about keywords, right?

So loading our web pages with keywords, especially the same high-value keyword we're aggressively targeting all over the site, will help us rank better in search and outpace the competition, right?

Absolutely not.

Search engines have long known what keyword stuffing is and what kinds of text combinations are unnatural. They flag these as attempts to manipulate search results and demote the content accordingly.

Yes, some otherwise valuable content that uses simple keyword stuffing, intentionally or not, may escape demotion because of its real value to users.

Back in the day, webmasters trying to game the system would stuff every variation of a high-value keyword into the site footer or, even more crudely, make those keywords the same color as the site's background, effectively hiding them from humans but not from search engine crawlers.

Webmasters also tried this with links. (Don't do any of this.)

Do not forget that you write for humans, not for search engines.

2. Writing for Robots

It is important to understand that unnatural writing is, well, unnatural.

And the search engines know it.

The old belief was that writing for the web meant repeating a topic by its proper name every time it was mentioned, working in variants and plural/singular versions of the word so that "all the bases are covered."

Crawlers would see the keyword repeated, and in several different versions, which allowed the page to rank well for the keyword variations used (again and again... and again).

It will not work anymore.


Search engines are advanced enough to recognize repeated keywords and their variations, and to recognize the poor experience that generally bad content creates.

Write for humans, not for search engine robots or any other robot.

3. Article Marketing & Article Directories

Any attempt to game the system usually doesn't work in the world of SEO.

But that does not stop people from trying.

Especially when those tactics promise significant improvements for a brand, its website, and/or associated digital properties.

Of course, article directories worked. And they worked very well for a long time.

Commonly regarded as one of the earliest forms of digital marketing, article syndication was an easy play for anyone in the know. And it made sense, since the idea was similar to channels like television and print media that already made regular use of syndicated content.

But Google eventually caught on and unveiled its Panda update in 2011.

Panda reshaped the search landscape by targeting content farms and directories, as well as other websites serving up low-quality content (whether poorly written, meaningless, or outright stolen).

The idea behind article marketing makes no sense in today's world, where your high-quality content must be original and demonstrate authority and trustworthiness.

4. Article Spinning

Usually done with software, article spinning is the black-hat tactic of trying to recreate quality content using different words, phrases, and organization.

The end result was essentially a jumbled mess of an article making the same points as the source material.

It is not surprising that this is no longer effective.

While artificial intelligence keeps getting better at creating content, machine-generated text still falls short of what a human can produce: something original, useful, and substantive.

5. Buying Links

This one continues to bite webmasters many years later.

Like most SEO tactics, if it sounds suspicious, you probably should not do it.

Once upon a time, it was common practice to quickly pay for a large number of links pointing to your site.

Backlink profiles now have to be maintained and optimized just like the websites we monitor, and low-quality domains sending too many backlinks to a website can be dangerous to its health.

Google can easily identify low-quality sites, and it can tell when those sites are sending a wealth of links they shouldn't be.

Today, if you legitimately want to strengthen your website's authority and visibility, you must earn links; don't pay someone to build them manually.

6. Anchor Text

Internal links are a staple of any good site structure and quality user experience.

This is usually accomplished with anchor text, the clickable wording of a link that tells users what kind of content to expect if they click.

There are different types of anchor text (branded, naked URL, exact match, website/brand name, page title and/or headline, etc.), and some have certainly become more favorable than others depending on usage and situation.

Previously, using exact-match, keyword-rich anchor text was considered an SEO best practice.

Since Penguin, Google has become much better at identifying over-optimized content.

This goes back to the golden rule of producing well-constructed, user-friendly, natural content.

If you optimize for search engines rather than for humans, you will probably fail.

7. Obsolete Tactics for Keyword Research

Keyword research has certainly undergone radical changes in the last five to ten years.

Marketers once had a wealth of keyword-level data, which let us see not only what was and wasn't working for our brands, but also better understand idea targeting and user intent.

Much of that went away with the arrival of "(not provided)" keyword data.

In the years since, tools have appeared that try to replicate that keyword data, but it is simply impossible to recreate it fully or accurately.

Yet even with that keyword data stripped away, marketers still need to do keyword research to fully understand their industry, competition, geographic region, and so on.

To do this, many marketers turn to Google's free Keyword Planner. Although its data has come under scrutiny over the years, it's a free Google-owned product that provides data we otherwise couldn't get, and many of us (myself included) continue to use it.

But it's important to remember what the data represents in terms of keywords.

In Keyword Planner, "competition" refers only to paid competition and paid traffic, making it virtually useless for building an organic search strategy.

Paid alternatives include Moz's Keyword Explorer and SEMrush's Keyword Magic Tool.

Google Trends is also useful for this type of competitive analysis, and it's free.

8. Pages for Every Keyword Variation

Once upon a time, creating a separate page for every high-value keyword variation your brand targeted was a useful ranking tactic.

Fortunately, algorithm updates such as Hummingbird, RankBrain, and others have helped Google understand that variations of the same word all relate to the same topic.

The best, most useful content around these entities should be the most visible, because of the value it offers users on the topic, not because of a particular word variant.

Aside from leading to brutal keyword self-cannibalization, this approach makes a website much harder to use and navigate, since the content will be incredibly similar.

The negative user experience alone is reason enough not to do it. But the fact that Google now knows better makes avoiding this practice obvious.

This tactic evolved and ultimately contributed to sprawling pools of content that targeted traffic purely for keyword value and visibility.

It was emblematic of the "old way" of optimizing a website: for keywords and search engines, rather than for users and their intent.

9. Targeting Exact-Match Search Queries

Targeting exact-match search queries in the hope of ranking for them purely for their traffic numbers, and not because the query or its answer was actually relevant to the business, was fairly common practice before the full rollout of the Google Knowledge Graph.

Marketers would strive to rank first for exact-match queries in hopes of sparking engagement and driving a higher click-through rate to their sites.

10. Exact-Match Domains

It makes sense to have high-value keywords in your URL. To a point.

But when it becomes confusing or misleading (that is, it results in a bad user experience), you must draw the line.

Best practice is for your domain to be consistent with your brand.

Brand names should be short, concise, and meaningful.

Why would not you want the same thing from your domain?

Google gave weight to exact-match domains long ago because, at the time, it made sense as a signal.

Behavioral data has since helped Google move past that.

Run a good business offering quality products and/or services under your brand, and Google will work to make it visible when it's relevant to those searching for it.

11. XML Sitemap Frequency

We should never try to manipulate search engine crawlers into crawling our website more often than others by making them believe new content has been published or substantial changes have been made.

But since webmasters did exactly that in the past, the sitemap is now treated very differently from how it was originally envisioned.

Previously, webmasters could assign a priority number to each page of a website listed in the sitemap, ranging from 0.0 to 1.0.

Because this was so rarely used correctly, crawlers no longer honor those priority and frequency ratings.

Instead, search engines simply crawl the content they decide they need to crawl.

Make sure you follow XML sitemap best practices; sitemaps are an extremely important element of any website.
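For reference, here is a minimal sketch of a sitemap entry showing the priority and change-frequency hints discussed above (the URL and date are placeholders, and as noted, crawlers now largely ignore these two values):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- Placeholder page for illustration only -->
    <loc>https://www.example.com/some-page/</loc>
    <lastmod>2018-08-01</lastmod>
    <!-- Hints crawlers no longer honor -->
    <changefreq>daily</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Keeping the loc and lastmod values accurate is the part that still matters; inflating changefreq or priority buys you nothing.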

12. Bad Content

Let's face it: there was a time when even miserable content could rank well.

Oh, how times have changed.

Stolen content, thin content, keyword-stuffed content, untrustworthy content: there was a time when all of it could get past search engine crawlers and be regurgitated back to users.

But nothing more.

We know what it takes to create quality content that search engines reward, because they tell us what's right and what's wrong.

If you want to succeed in SEO today, you must do what is right.
