By using “Fetch as Google,” webmasters can check how different webpages appear to Googlebot. There are three main reasons to do this: to troubleshoot issues that may be affecting your search ranking, to determine which pages are affected after a malware attack, and to submit pages to Google for indexing (important after any major change to a page's content).
How to Fetch As Google
Fetching as Google is a relatively simple task, but you will need to have already verified your website in Webmaster Tools. Navigate to the section called “Crawl,” under which you will see the option “Fetch as Google.” Here, you can enter any webpage you want rendered by Google, in both mobile and desktop versions. Because the page is displayed exactly as Google sees it, you can confirm the search engine is viewing everything correctly.
As Webmaster Tools crawls the page, you can monitor the status, which will change from “Partial” to “Complete.” Throughout the process, you can view the results under “Fetch” or “Render.” If the fetched text is hard to analyze, rendering gives you a visual view of the page.
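Outside Webmaster Tools, you can approximate what Googlebot receives by requesting a page with Googlebot's published user-agent string. A minimal sketch using Python's standard library (the example URL is a placeholder; a server may still treat a real Googlebot differently, e.g. by verifying its IP address):

```python
from urllib.request import Request, urlopen

# Googlebot's documented desktop user-agent string.
GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

def googlebot_request(url):
    """Build a request that identifies itself as Googlebot, so the
    server responds roughly as it would to a real crawl."""
    return Request(url, headers={"User-Agent": GOOGLEBOT_UA})

# Example (requires network access):
# html = urlopen(googlebot_request("https://example.com/")).read()
```

Comparing the HTML returned this way with what your own browser receives is a quick manual version of the fetch/render comparison described above.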
If something is wrong with a page, you will see a status such as “Temporarily Unreachable,” “Redirected,” or “Failed.” In some instances, Webmaster Tools can help you resolve the error: clicking “Crawl Errors” opens an error details screen that lists each problem so you can troubleshoot appropriately.
The “Temporarily Unreachable” message often means the server is temporarily down or facing an issue outside your control. The best solution is simply to wait a few minutes and try again. If the message appears every time you attempt to crawl, it is likely because you are using a poor-quality or free hosting service; your best option is to switch to a premium service.
Another cause of this message may be a security plugin blocking Googlebot from accessing your site. In this case, disable your plugins and rerun “Fetch as Google” a couple of minutes later. If this fails to solve the issue, contact your hosting company.
If you receive a “Redirected” error, you have either entered the full URL (this is unnecessary; you only need to enter the part of the URL that follows the domain name) or you have omitted the “/” at the end of the URL.
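Since the fetch box only expects the part of the URL after the domain, a small helper (hypothetical, for illustration; Python's standard `urllib.parse` does the splitting) can reduce a full URL to the path you should enter:

```python
from urllib.parse import urlparse

def fetch_box_path(url):
    """Reduce a full URL to the part after the domain name, which is
    all the fetch box needs. An empty path becomes "/"."""
    parsed = urlparse(url)
    path = parsed.path or "/"
    if parsed.query:
        path += "?" + parsed.query
    return path

print(fetch_box_path("https://example.com/blog/post"))   # -> /blog/post
print(fetch_box_path("https://example.com"))             # -> /
print(fetch_box_path("https://example.com/search?q=x"))  # -> /search?q=x
```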
A “Failed” message means there is an issue with the content itself. To resolve this problem, first type the URL from the fetching test into your browser and check that the page appears. If it loads, run a Google PageSpeed Insights test to measure the page's performance; this test may locate the problem for you. However, if the issue is caused by a server firewall, you will need to contact your hosting company and ask customer support to fix it.
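As a rough mental model of the statuses above, the HTTP response your server sends maps onto the result the fetch tends to report. The mapping below is illustrative only, not Google's published logic:

```python
def fetch_outcome(status_code):
    """Map an HTTP status code to the rough Fetch as Google result it
    tends to produce (an illustrative mapping, not Google's logic)."""
    if 200 <= status_code < 300:
        return "Complete"
    if 300 <= status_code < 400:
        return "Redirected"
    if status_code in (500, 502, 503, 504):
        return "Temporarily Unreachable"
    return "Failed"

print(fetch_outcome(200))  # -> Complete
print(fetch_outcome(301))  # -> Redirected
print(fetch_outcome(503))  # -> Temporarily Unreachable
```

Checking your server logs for the status code returned to Googlebot is often the fastest way to tell which of these cases you are in.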
A page that continually refreshes is usually down to a browser issue, but it may also be due to a virus or malware. In this case, remove any software you are not sure you can trust and reset your browser. If the problem persists, switch to another browser, or even another computer, and run the test there.
Checking for Malware
Pages infected with malware tend to appear differently in a web browser than they do to Googlebot. For instance, they may appear in search results for unrelated keywords or perform badly for your target keywords. The site may also appear in search results with the warning “This site may harm your computer,” leading to a drop in visitors. When you “Fetch as Google,” Webmaster Tools can detect any malware present and point you to the action needed to remove it.
Submitting Your Site for Indexing
By fetching as Google, you can shortcut the normal indexing process and help your site reach search results pages much sooner. To submit your site as part of the “Fetch as Google” process, use the “Submit to Index” button.
You have two options for indexing your pages. The first is to select “Crawl only this URL.” This means Googlebot will only consider the page itself and ignore any links on the page. However, if you choose “Crawl this URL and its direct links,” Googlebot will crawl the page and every page it links to directly.
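To make “direct links” concrete: these are the href targets of the anchor tags on the page itself. A sketch using Python's built-in HTML parser to collect them from a fragment of markup:

```python
from html.parser import HTMLParser

class DirectLinkCollector(HTMLParser):
    """Collect the href of every <a> tag, i.e. the page's direct links,
    which "Crawl this URL and its direct links" would also fetch."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

page = '<p><a href="/about">About</a> and <a href="/contact">Contact</a></p>'
collector = DirectLinkCollector()
collector.feed(page)
print(collector.links)  # -> ['/about', '/contact']
```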
After you hit “Go,” Googlebot will crawl your page immediately. If you skip this step, there is no way to know how long you will wait before Googlebot crawls the page on its own.
Using “Fetch as Google”
It is important to bear in mind that the number of times you can use “Fetch as Google” and submit your URLs is limited. The quota is per account rather than per website, so if you manage several websites, take care not to run out. Every account receives 500 fetches per week, 500 URL submissions per month, and 10 chances per month to crawl a URL and its direct links.
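Because the quotas are shared across every site in the account, a simple counter (a hypothetical helper, not part of any Google tool) illustrates how usage adds up across sites:

```python
# Quotas stated above; they apply per account, not per website.
FETCH_LIMIT = 500   # fetches per week
SUBMIT_LIMIT = 500  # URL submissions per month
LINKED_LIMIT = 10   # "URL and its direct links" crawls per month

class QuotaTracker:
    """Track Fetch as Google usage across all sites in one account."""
    def __init__(self):
        self.fetches = 0
        self.submissions = 0
        self.linked_crawls = 0

    def record_fetch(self):
        self.fetches += 1
        return FETCH_LIMIT - self.fetches  # remaining this week

    def record_submission(self, with_links=False):
        self.submissions += 1
        if with_links:
            self.linked_crawls += 1
        return SUBMIT_LIMIT - self.submissions  # remaining this month

tracker = QuotaTracker()
for _ in range(3):
    tracker.record_fetch()          # e.g. three sites fetched once each
print(FETCH_LIMIT - tracker.fetches)  # -> 497
```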