Google corrects a technical problem resulting in the de-indexing of pages, by @MattGSouthern

Google is working to fix a widespread technical issue that has resulted in web pages being removed from its search index.

Webmasters and SEOs have been affected by this issue since last Thursday, but Google did not officially acknowledge the problem until Saturday.

On Saturday, Google's John Mueller reported, prematurely as it turned out, that the problem had been fixed:

Sorry – we had a technical issue on our side for a while. This should be resolved in the meantime, and the affected URLs should be reprocessed. Good to see that the URL Inspection tool is also useful in this kind of case!

– John 🍌 (@JohnMu) April 6, 2019

A day later, on Sunday, Danny Sullivan posted a follow-up tweet via the Search Liaison account.

According to the update, the indexing problems are mostly resolved and on their way to being fully fixed.

We are aware of indexing issues that affected some sites as of Friday. We believe that the problems are mostly solved and do not require any special effort on the part of the site owners. We will provide another update when the issues are considered fully resolved.

– Google SearchLiaison (@searchliaison) April 7, 2019

As stated in the tweet, site owners do not need to do anything to fix the problem. The issue is entirely on Google's end.

That said, if you absolutely need to get a few high-value pages re-indexed, you can still use the URL Inspection tool.

With the URL Inspection tool, site owners can ask Google to recrawl and reindex specific pages. The catch is that it can only process one URL at a time.

The URL Inspection tool is therefore not an ideal solution for getting a large number of pages back into Google's index, but it is a decent option for a handful of them.

Mueller adds that even when the problem is fully resolved, not every URL on every site will be reindexed.

One more thing to add here: we do not index all URLs on the web, so even once this is reprocessed, it would be normal for not all URLs of all sites to be indexed. Awesome sites with minimal duplication help us recognize the value of indexing more of your pages.

– John 🍌 (@JohnMu) April 7, 2019

Lastly, Mueller says things will eventually "settle down as before," which is certainly good news for those who fear that rankings could be affected by this problem.


Google has so far not provided specific details regarding what caused the error in the first place.