My account was recently reset and the server is now back online. When I run a sitemap for a URL, zip files are included as if they were pages. I tried running the sitemap again, and only the index page was crawled. Does anyone know what the issue might be?
Sitemap
-
Re: Sitemap
That is exactly why manual Sitemap creation is preferred: once you save the file, you can open it in Notepad and edit it properly (removing entries such as zip files) before uploading it to your public_html/ directory. You should add a robots.txt file as well, thereby creating a complete set of rules for the search engines.
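For illustration, here is a minimal Google-compliant sitemap.xml sketch; the example.com URLs and page names are placeholders, not your actual site. Only real HTML pages are listed, so zip files and other downloads are simply left out, and the <changefreq> value is where a "Once A Week" instruction ends up:

<?xml version="1.0" encoding="UTF-8"?>
<!-- Hand-edited sitemap: list only real pages, never zip or other download files -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>http://www.example.com/contact.html</loc>
    <changefreq>weekly</changefreq>
  </url>
</urlset>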
The robots.txt file
Create Google-compliant Sitemaps Here
More Discussion on Sitemaps
Problems getting Google to verify
There are a LOT of threads already posted in the Google and Search Engine Topics forums that detail the core principles that will prove useful to you as well.
-
Re: Sitemap
Originally posted by contactsaturn: Before the account was reset, the sitemap ran fine. Might there be something wrong with the set-up of the account after the reset?
Try publishing a couple of additional pages (named for pages you intend to publish at some point) as test pages; they can be deleted later or left as blank 'placeholder' pages. Wait 3-4 hours after publishing them for the server to resolve, and THEN try the sitemap again.
If this is a new domain, it will take at least 24 hours before the account can be spidered (which is different from simply being visited by the SE bots), so give it time.
Also, set the search-engine instructions in the XML file to "Once A Week" for better results.
And add the robots.txt file as well, once your site is finished being updated. Fill the glass completely!
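As a sketch (the domain is a placeholder, and the wildcard pattern is a Google/Bing extension rather than part of the original robots.txt standard), a basic robots.txt uploaded to public_html/ could look like this; it keeps zip archives out of the index and points the bots at the sitemap:

# Allow all well-behaved bots to crawl the site
User-agent: *
# Keep zip archives out of the index (wildcard syntax understood by Google and Bing)
Disallow: /*.zip$
# Absolute URL of the sitemap so spiders can find it
Sitemap: http://www.example.com/sitemap.xml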