A site map, also written as sitemap, on Squarespace is a list of URLs that tells search engines about the structure of content on a site. Every Squarespace site comes with a site map in .xml format, so there’s no need to create one manually.
Your site map includes the URLs of all your pages, along with image metadata for SEO-friendly indexing. Squarespace automatically updates the sitemap whenever you add or remove pages.
Before you begin, keep in mind that site maps are not available for trial sites.
What’s the difference between an HTML and an XML site map
A site map plays an important role in your website’s SEO, but when it comes to choosing one, people are sometimes unsure which type to create. Squarespace doesn’t provide an HTML site map and only offers the .xml format, but it’s still good to know the difference.
The simplest explanation of the difference between these two formats is that XML is primarily used by search engines, while HTML is created for users. The XML sitemap protocol is aimed specifically at search engine crawlers. An XML site map lists not just the site’s main URL but all of its URLs, along with associated metadata: when each URL was last updated, how often it changes, and how important it is relative to the rest of the site.
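As an illustration, an XML site map entry with image metadata looks roughly like this (the URLs are placeholders, and the exact tags Squarespace emits may differ):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>https://example.com/about</loc>
    <lastmod>2023-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.75</priority>
    <image:image>
      <image:loc>https://example.com/images/team.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```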
An HTML sitemap is a simple overview of the site: just the pages and the information a user is interested in. If, for example, you’re looking for the shopping cart on a site and can’t find it, the HTML site map will quickly lead you to it. Along with being helpful for users, an HTML site map can also benefit your rankings by serving as a signal that your site is user-friendly.
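For comparison, an HTML site map is just an ordinary page of links a visitor can click. A sketch with placeholder URLs:

```html
<h1>Site map</h1>
<ul>
  <li><a href="/about">About us</a></li>
  <li><a href="/shop">Shop</a></li>
  <li><a href="/cart">Shopping cart</a></li>
  <li><a href="/blog">Blog</a></li>
</ul>
```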
At the markup level, the major difference between XML and HTML is that XML lets whoever writes the document decide what the tags mean, while HTML’s tags have fixed meanings. Many sites use the two types together. Also, XML site maps are usually not linked from the site itself; this type of sitemap is more often found at an address like example.com/sitemap.xml.
How to find your site map
You’ll find your site map by adding /sitemap.xml to the end of your domain. With a built-in domain, your site map URL will look like this (replace sitename with your own site’s name): https://sitename.squarespace.com/sitemap.xml
If you’re using a custom domain, your site map URL will look like this: https://www.example.com/sitemap.xml
Making changes to your site map
You can’t edit a site map directly. It updates automatically every time your site changes; these changes usually appear within an hour, though it can take up to 24 hours for them to show up in your Squarespace sitemap.
What your site map doesn’t include
There are some things that are not contained in your site map. This list includes:
- Disabled pages, password-protected pages, and pages hidden from search.
- Old URLs that were used to create URL redirects on your Squarespace site.
- Style information.
- The URLs of uploaded files.
- Code block and Embed block content.
- The individual URLs of pages within an Index page. The content of those pages will still appear, though: search engines can crawl all the content of the Index page.
Your site map should only include URLs that return a 200 status.
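If you want to check this yourself, you can pull the URLs out of a sitemap and test each one. Below is a minimal sketch using only Python’s standard library; the sample sitemap is made up, and in practice you would fetch your real sitemap (e.g. with urllib.request) and send a request to each URL to confirm it returns 200:

```python
import xml.etree.ElementTree as ET

# Namespace used by the standard sitemap protocol.
NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def extract_urls(sitemap_xml):
    """Return every <loc> URL found in a sitemap document."""
    root = ET.fromstring(sitemap_xml)
    return [loc.text for loc in root.iter(NS + "loc")]

# A made-up sitemap standing in for https://yourdomain/sitemap.xml.
sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/about</loc></url>
</urlset>"""

# Each of these should return HTTP 200 on a healthy site.
print(extract_urls(sample))
```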
Now that you know where to find your site map and what it contains, let’s move on to submitting it to Google. If you assume Squarespace will do this job for you, that’s fair: Google can discover your site on its own. But sometimes you’ll want your site indexed faster than that happens naturally, so let’s review how to help your Squarespace sitemap get noticed by Google sooner.
How to submit your site map to Google
Technically, there’s no urgent need to submit your Squarespace site map to Google Search Console manually, as Google can discover your site and sitemap on its own. But that mostly holds for small and medium-sized sites. Here’s when Google Search Console’s documentation says submitting your sitemap might be a good idea:
- Your site is really large. The larger it is, the more likely Googlebot will miss some of its pages, and this especially concerns big ecommerce sites. If your Squarespace site is an ecommerce store, submit your sitemap manually.
- Your site has a large archive of content pages that are isolated or not linked to one another.
- Your site is new and has few (or even no) backlinks. Googlebot discovers pages by following links, including links from other domains. If other sites haven’t begun linking to yours yet, it will take much longer for your sitemap to get noticed, so to avoid losing traffic, submit it to Google Search Console.
- Your site has a lot of multimedia content, such as video and images, or appears in Google News. Google will find these files eventually, but submitting a sitemap helps your Squarespace site get noticed faster.
To get an idea of how much of your site has already been indexed by Google, perform a “site:” search of your domain. Remember, though, that Google shows an approximate number of results, and this number is not perfectly reliable - another reason to submit your Squarespace sitemap to Google Search Console manually.
If submitting your site map to Google Search Console sounds right for you, follow these steps:
- Find your site map, as described above.
- Open your Google Search Console account and locate the correct property. Make sure it’s the property for the exact domain whose site map you’re submitting: http:// and https:// are not the same, and neither are example.com and www.example.com.
- Go to the “Sitemaps” section under “Index” in the left navigation panel.
- Remove any old or outdated site maps that have been submitted. To find this option, click the three stacked dots in the top right corner of the page.
- Enter your sitemap URL under “Add a new sitemap” and click “Submit”. You only need to enter the end of the sitemap URL, not the domain portion.
- Google Search Console will soon verify that your site map has been found. It may take some time before your sitemap is crawled.
Google Search Console isn’t the only place you may want to submit your Squarespace sitemap. Let’s see how to submit it to Bing and Yahoo.
How to submit your Squarespace site map to Bing and Yahoo search
In some cases, you might want to submit your sitemap to Bing or Yahoo so your Squarespace site gets noticed by users of those search engines. Follow these steps:
- Go to Bing Webmaster Tools dashboard, and then log in using your username and password.
- Find the “Sitemap” widget on the main screen of your dashboard, and after that open the Sitemap tool.
- Type the URL of your site map into the input box in the lower right corner, and click “Submit”. The URL will be submitted to Bing and then added to your list of site maps above the input box.
Keep in mind that if you want your Squarespace site to appear in Yahoo search, you’ll also need to submit your sitemap to Bing, since Yahoo Search results come from Yahoo’s web crawler (Slurp) and Bing’s web crawler. To manage how your site appears in Yahoo Search, use meta tags and robots.txt along with submitting the sitemap.
How to add your sitemap to robots.txt
A robots.txt file is a plain text file stored in the root directory of your Squarespace site. It tells search engines which parts of your site should be crawled and indexed and which shouldn’t, and it indicates which crawlers have permission to crawl your site. It’s important to have this file because it’s the first thing a search engine bot looks for before doing anything else with your sitemap.
Much like robots.txt, a site map lets search engines crawl and then index all the pages of your Squarespace site from one place. You can create a robots.txt file that includes your site map by following these steps:
Find your Squarespace site map URL
If you or the developer you work with have already created a site map, it’s probably located at http://www.example.com/sitemap.xml, where “example” stands for your Squarespace domain name. You can check by pasting this URL into your browser: you’ll either see your site map or a 404 error page, meaning the site map isn’t at that location.
You can also use Google search operators: simply type site:example.com filetype:xml into Google’s search bar.
If you still can’t find your site map, then probably it does not exist. In this case, generate the sitemap yourself or ask for help from a developer.
Find your robots.txt file
Just as with your site map, you can check whether a robots.txt file exists on your site by visiting http://www.example.com/robots.txt, replacing “example” with your domain name. If you don’t have this file, create one and make sure it’s added to the top-level (root) directory of your web server before adding your sitemap to it.
Add your Squarespace site map to your robots.txt file
Finally, add your site map’s location to your robots.txt file by editing the file and adding a Sitemap directive with the URL of your sitemap. As a result, your robots.txt file will look like this:
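A minimal example (example.com stands in for your own domain):

```
User-agent: *
Disallow:

Sitemap: https://example.com/sitemap.xml
```

An empty Disallow line means everything may be crawled; the Sitemap line points crawlers at your sitemap.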
What to do if your Squarespace site map contains URLs blocked by robots.txt
Sometimes you’ll see a warning for your site in Google Search Console: “Sitemap contains URLs which are blocked by robots.txt”. If you blocked those URLs intentionally, this warning can be ignored. But if you didn’t, you should check whether your sitemap is working correctly.
In most cases, blocked sitemap URLs are caused by an improperly configured robots.txt file. Every time you disallow anything, make sure you understand what you’re doing and how the change will be reflected in your sitemap. Otherwise, the warning will appear and web crawlers won’t be able to crawl parts of your Squarespace site.
So, before trying to solve the "sitemap contains URLs which are blocked by robots.txt" issue, check the following things:
- Check for any Disallow rules in your Squarespace site’s robots.txt file. The robots.txt file should live in your root directory, as in this example: https://example.com/robots.txt.
- If your Squarespace website recently migrated from HTTP to HTTPS, then make sure that you’ve created a new property for the HTTPS version and that the robots.txt file is available via HTTPS.
- Use the robots.txt Tester in Google Search Console to check what warnings or errors are being generated for your site.
- Your robots.txt may be cached, which means you should give Google some time to recrawl your Squarespace site. You can also re-submit your sitemap in Google Search Console once the issues have been addressed.
- You can manually ask Google to crawl your Squarespace site. In Search Console, go to the “Crawl” section and find “Fetch as Google”. Enter the URL path from your sitemap that Google warned you about and choose “Fetch”. Once everything has reloaded, click “Request Indexing” and choose “Crawl only this URL”.
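The first check above can be automated with Python’s standard urllib.robotparser. This sketch uses a made-up robots.txt and sitemap URL list to see which URLs a crawler would treat as blocked:

```python
from urllib.robotparser import RobotFileParser

# Made-up robots.txt content, standing in for https://example.com/robots.txt.
robots_txt = """User-agent: *
Disallow: /private/
"""

# Made-up URLs that might appear in the sitemap.
sitemap_urls = [
    "https://example.com/",
    "https://example.com/private/draft",
]

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Report any sitemap URL a "*" crawler would not be allowed to fetch.
for url in sitemap_urls:
    if not rp.can_fetch("*", url):
        print(url, "is blocked by robots.txt")
```

Any URL this flags will trigger the “blocked by robots.txt” warning until the conflicting Disallow rule is removed.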
Once you’ve made changes to your robots.txt file, it will take some time for Google to re-crawl your Squarespace site. If you’re sure all the conflicting Disallow rules have been removed, it’s only a matter of time before Google does its job on your Squarespace pages.
How Squarespace site maps relate to search engines
Your Squarespace site map shows search engines exactly which pages are available to crawl. Keep in mind that a page’s position within the sitemap does not affect its position in search results.
Squarespace sets standard priorities for pages and blog posts right in your site map: your homepage has a priority of 1, other pages have a priority of 0.75, and blog posts have a priority of 0.5. These priorities indicate the relative importance of pages within your Squarespace site, so the homepage’s priority of 1 marks it as the top page. Aside from this, priorities carry no other meaning for search engines and have no impact on your website’s SEO.
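In the sitemap itself, those defaults would appear roughly as priority tags like these (the URLs are placeholders, and the exact markup Squarespace generates may differ):

```xml
<url><loc>https://example.com/</loc><priority>1.0</priority></url>
<url><loc>https://example.com/about</loc><priority>0.75</priority></url>
<url><loc>https://example.com/blog/post</loc><priority>0.5</priority></url>
```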