
Thread: Some URLs in the Sitemap have a high response time

victor Offline

Posts: 638
Joined: Jun 2013
Reputation: 33

#1
Senior Member
Can anybody explain the meaning of this warning that my Google Webmaster Tools account shows?
Quote:Some URLs listed in this Sitemap have a high response time. This may indicate a problem with your server or with the content of the page.
chod Offline

Posts: 860
Joined: Jul 2013
Reputation: 85

#2
Posting Freak
It means pretty much exactly what it says. At the very least, post your sitemap or a URL to it. You have links in there that, when Googlebot hits them, take longer than what Google considers an acceptable load time.

What is on the pages it is hitting? Are there DB calls? Videos? Large images? What is different about those pages versus all the others? Have you tried Fetch as Googlebot? What have you done to see what the problem is on those pages? Provide more info, get more useful responses.
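For example, here is a rough command-line sketch for timing every URL in a sitemap, assuming GNU grep (for the -P flag) and a sitemap.xml at the site root; the domain is the same placeholder as above:
Code:
# pull each <loc> URL out of the sitemap and print its total load time
curl -s http://www.YourWebSite.tld/sitemap.xml | grep -oP '(?<=<loc>).*?(?=</loc>)' | while read -r url; do
  printf '%s ' "$url"
  curl -o /dev/null -s -w '%{time_total}s\n' "$url"
done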
www.AdminEmpire.com
www.BlimptonTech.com - Best Free Online JavaScript tool to minify JavaScript files.
invisibe_dude Offline

Posts: 820
Joined: May 2013
Reputation: 24

#3
Posting Freak
I think it's a bug or something. I get these warnings too, and the URLs that Google says have a high response time load fine.
chod Offline

Posts: 860
Joined: Jul 2013
Reputation: 85

#4
Posting Freak
(11-28-2013 05:26 PM)invisibe_dude Wrote:  I think it's a bug or something. I get these warnings too, and the URLs that Google says have a high response time load fine.

And do you ever take the time to actually crawl those URLs as Googlebot? Or do more than just load the page in your browser on your end to "test" it?

It is not a "bug"; Google is giving you that warning for a reason!
www.AdminEmpire.com
www.BlimptonTech.com - Best Free Online JavaScript tool to minify JavaScript files.
maya Offline

Posts: 1,058
Joined: May 2013
Reputation: 34

#5
Posting Freak
Chod, how do you suggest we crawl URLs as Googlebot?
chod Offline

Posts: 860
Joined: Jul 2013
Reputation: 85

#6
Posting Freak
(11-29-2013 09:24 PM)Maya Wrote:  Chod, how do you suggest we crawl URLs as Googlebot?

Use a crawler and set your user agent to whatever you want to pretend to be.

Since most people on here are lazy researchers, here is just a teaser for one of the thousands of ways to do it:
Code:
wget --user-agent="Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)" www.YourWebSite.tld

Look up the proper user agent; that one was from memory. You will need to do more to achieve what you want with wget, and there are other methods that could be easier. It all depends on how far you want to go into learning a valuable skill in the webmaster world.
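As one rough sketch of "doing more" with wget (same placeholder domain; --spider checks each link without saving anything):
Code:
# crawl two levels deep as Googlebot, logging the status and timing of every request
wget --spider -r -l 2 --user-agent="Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)" -o crawl.log www.YourWebSite.tld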

Also be aware that some servers check the Googlebot user agent against the IP it is coming from to see if it matches Google's range. It's not too common, but it is used out there. That is something you would have to learn how to "spoof" or "work around".
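For reference, the check those servers run is a double DNS lookup, something like this (the IP below is just an example of a real Googlebot address):
Code:
# reverse lookup: a genuine Googlebot IP resolves to a *.googlebot.com hostname
host 66.249.66.1
# forward lookup: the returned hostname should map back to the same IP
host crawl-66-249-66-1.googlebot.com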
www.AdminEmpire.com
www.BlimptonTech.com - Best Free Online JavaScript tool to minify JavaScript files.
marcus_avrelius Offline

Posts: 2,424
Joined: May 2013
Reputation: 102

#7
Support Team
Thanks for sharing, Chod. Listen, correct me if I am wrong, but it looks like Linux code to me, so I should be able to do it with BackTrack, right?
chod Offline

Posts: 860
Joined: Jul 2013
Reputation: 85

#8
Posting Freak
(11-30-2013 06:45 PM)marcus_avrelius Wrote:  Thanks for sharing, Chod. Listen, correct me if I am wrong, but it looks like Linux code to me, so I should be able to do it with BackTrack, right?

You don't use BackTrack as an everyday Linux OS, right? It's a very specific-use-case OS. And BackTrack is no longer the current one; it has moved on to become Kali.

If you want to run wget on Windows, you can find the Windows binary here:
http://www.gnu.org/software/wget/
www.AdminEmpire.com
www.BlimptonTech.com - Best Free Online JavaScript tool to minify JavaScript files.
invisibe_dude Offline

Posts: 820
Joined: May 2013
Reputation: 24

#9
Posting Freak
Yeah, Linux is king when it comes down to these kinds of things. Usually I ignore those errors because I know for sure that the URL loads fine, but now that you have posted a way to test it I will definitely give it a try. Thanks!
chod Offline

Posts: 860
Joined: Jul 2013
Reputation: 85

#10
Posting Freak
(11-30-2013 11:18 PM)invisibe_dude Wrote:  Yeah, Linux is king when it comes down to these kinds of things. Usually I ignore those errors because I know for sure that the URL loads fine, but now that you have posted a way to test it I will definitely give it a try. Thanks!
That is a bad habit to keep. As I stated, just because it loads fast for YOU does not mean it loads fast for THEM! Also, wget is NOT the best method, just A method, and that wget command will not give you all the data needed to see the full scope of the issue. It can also be a geo-related issue; there are lots of factors to take into account.
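As one sketch of getting fuller data than the wget one-liner gives (same placeholder domain), curl can split the response time into its parts:
Code:
# break the response time into DNS, connect, time-to-first-byte, and total
curl -o /dev/null -s -w 'dns: %{time_namelookup}s  connect: %{time_connect}s  ttfb: %{time_starttransfer}s  total: %{time_total}s\n' --user-agent "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)" http://www.YourWebSite.tld/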
www.AdminEmpire.com
www.BlimptonTech.com - Best Free Online JavaScript tool to minify JavaScript files.
victor Offline

Posts: 638
Joined: Jun 2013
Reputation: 33

#11
Senior Member
Thanks, everybody, and especially you, Chod, for your awesome support.




