Google Webmaster Tools - Part 1: Site Configuration
In this first part of the series of articles on Google Webmaster Tools, we look at the five menu items under Site Configuration:
- XML Sitemaps
- Crawler Access
- Site Links
- Change of Address
- Settings
XML Sitemaps
I have already described in detail how easily an XML sitemap can be created in the article on creating a sitemap with G Site Crawler, so I will skip further explanation here. Once you have created a Google Sitemap, use the option to submit a sitemap file. In the form field displayed there, you only need to enter the filename of the sitemap file (usually "sitemap.xml"). A short time later Google visits the site and fetches the sitemap. Once the Google Sitemap has been processed, Webmaster Tools shows how many URLs were submitted and how many of them ended up in the Google index. The last value in particular should be treated with caution, however: in my case, only about 50% of the page count shown there can actually be found in the index.
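Whether you use a tool like G Site Crawler or build the file yourself, the sitemap format itself is simple. The following sketch generates a minimal sitemap.xml following the public sitemaps.org protocol; the URLs are placeholders, not from the article:

```python
# Sketch: generate a minimal sitemap.xml for a list of pages.
# The example URLs are placeholders.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Return a sitemap XML string with one <url>/<loc> entry per page."""
    ET.register_namespace("", SITEMAP_NS)
    urlset = ET.Element("{%s}urlset" % SITEMAP_NS)
    for url in urls:
        entry = ET.SubElement(urlset, "{%s}url" % SITEMAP_NS)
        loc = ET.SubElement(entry, "{%s}loc" % SITEMAP_NS)
        loc.text = url
    return ET.tostring(urlset, encoding="unicode")

if __name__ == "__main__":
    print(build_sitemap(["https://www.example.com/",
                         "https://www.example.com/about"]))
```

The resulting string would be saved as sitemap.xml in the root directory of the domain before submitting it in Webmaster Tools.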
Crawler Access
This menu first shows you the robots.txt file as Google sees it. In the field below, you can test individual statements to see whether they are syntactically correct. When I tried this feature, however, it reported "syntax not understood" no matter what I entered, even for statements that had been created with the robots.txt generator in the same menu.
Under the item above, by the way, you can click together your own robots.txt. To block a single file for all search engines, for example, you only need to choose "Block", select "All robots" as the user agent, and enter the file name under directories and files. In this way you can assemble all the instructions by clicking, then download the generated robots.txt and upload it to the root directory of your domain.
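The generator ultimately produces ordinary robots.txt directives. Blocking a single file for all crawlers, as in the example above, yields something like the following (the file name is a placeholder):

```
# Block one file for all crawlers; everything else stays allowed.
User-agent: *
Disallow: /private-file.html
```

This file belongs in the root directory of the domain, e.g. at example.com/robots.txt.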
Under the link Remove URL you can specify files and directories that you want removed from the index. This makes sense, for example, if a page no longer exists. Google does notice on its own when a web page no longer exists; this function simply makes the removal a bit faster, which matters, for example, when a page containing confidential data needs to be removed quickly.
Site Links
Site links are links that are displayed in the search results below the site that was found.
These links are only displayed for the website that ranks No. 1 in the search results, and only if it is considerably more relevant than the site in second place. With the Google function "Site Links" you can now block unwanted site links, for example links that lead to the contact page or the FAQ. It is not possible, however, to define your own links to be displayed as site links.
Change of Address
Here you can tell Google when your website has moved to another domain. To do this, you just set up the new domain in Webmaster Tools and then assign it to the previous site via the selection offered there.
Settings
With the Geographic target function, a domain can be assigned to a specific country. If a site mainly wants to reach German users, you should set Germany as the geographic target.
With the Preferred domain function you can tell Google which fixed domain to use when indexing pages. Here you choose whether the domain is listed with or without "www". Which one you decide on is a matter of taste. The important thing is that the selected domain is also set up in Webmaster Tools; otherwise the selection jumps back to the originally set domain after pressing the Save button.
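Independently of the Webmaster Tools setting, many site owners also enforce the preferred domain on the server itself so that visitors and crawlers only ever see one variant. On an Apache server, a sketch of a .htaccess redirect to the "www" variant might look like this (example.com is a placeholder, and mod_rewrite must be enabled):

```
# Permanently (301) redirect the bare domain to the www variant.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```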
Under Crawl rate you can adjust the speed at which the Googlebot accesses your pages. With "Faster" the Googlebot crawls somewhat faster, but it also puts additional load on the server. If the server is overloaded, you should set the Googlebot's speed to slower. If there are no problems, it is advisable to let Google determine the rate automatically.
Under Parameter handling you can enter parameters that appear in URLs but should be ignored by Google. This makes sense for web shops, where it often happens that the same item can be reached under several URLs. By entering the parameters, which are separated from the URL path by the "?" separator, you can avoid duplicate URLs.
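To see why ignoring parameters collapses duplicates, here is a small sketch that normalizes shop URLs by dropping query parameters that carry no content. The parameter names ("sessionid", "sort") and the shop URL are hypothetical examples, not taken from the article:

```python
# Sketch: collapse duplicate shop URLs by dropping ignorable parameters.
# IGNORED_PARAMS is an assumed list of session/sort parameters.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

IGNORED_PARAMS = {"sessionid", "sort"}

def canonical_url(url):
    """Remove ignored query parameters so duplicate URLs become one URL."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k not in IGNORED_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), parts.fragment))
```

With this, "item?id=42&sessionid=abc" and "item?sort=price&id=42" would both reduce to the same canonical "item?id=42", which is exactly the deduplication that the parameter-handling setting asks Google to perform.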
Tags: Blogging tips, Google, SEO, Website optimization