XML sitemap or Google Webmaster?

(Bear with me, as I don't really know what I'm talking about - an ex-colleague has asked if I can find info about this for him.)

A friend has recently (Nov 09) had a website created for his business, and he thinks the site isn't getting indexed by Google quickly enough.

The web designers & SEO guys who have been working on the site are using 'Google Webmaster' and have said that it's the best tool to get the site indexed.

I do a lot of networking in my area, and when I've mentioned to others that the site isn't getting indexed quickly enough, they say that getting an XML sitemap is the way to go??

The web guys are saying Google Webmaster, and others are saying XML sitemap!

Can anyone shed any light on the topic? Who is right? Or am I even asking the right question?

HELP!
 

eddavishofbauer

Google Webmaster Centre is a resource and toolset to help people get their site indexed.

I'm sure the SEO people submitted a sitemap, as it's good practice. Why doesn't your friend ask them?

Google will find you eventually if you have a well-structured site and aren't doing anything stupid to stop them.
 
Can you clarify...

When you say they probably submitted a sitemap, would that be an XML sitemap, or is there another type? And where would they submit it to? Google Webmaster?

I'm catching up with him tomorrow, so I'll find out exactly what the web guys have done so far.

What's the best way to find out how many pages are actually indexed?

Thanks for your help - I have no idea about these things!
 
Eddie,

The sitemap is an XML file that you submit to Google through their Webmaster Tools - it basically tells Google what pages are on your site. I would say that if the site's been there since November and still isn't indexed, there's something wrong, especially if your friend has had this work done by a 'professional' company.
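To make the jargon concrete: a sitemap is just a plain XML file listing your site's URLs, which you then submit through Webmaster Tools. As a rough sketch (the example URLs are made up), one can be generated with a few lines of Python:

```python
from xml.etree import ElementTree as ET

def build_sitemap(urls):
    # The namespace is fixed by the sitemaps.org protocol.
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = loc
    return ET.tostring(urlset, encoding="unicode")

# Example with made-up URLs:
print(build_sitemap(["http://www.example.com/", "http://www.example.com/about"]))
```

The output is the `<urlset>` document you would save as sitemap.xml and submit.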

You can check which pages Google has indexed by typing "site:www.yourwebsite.com" into Google (changing 'yourwebsite' to the site in question :rolleyes:).

With almost no work at all, a site should be indexed by Google within a week or two at most - a day or two if you know what you're doing!

PM me if you need more info.
 
Thanks for all the comments.

After a three-hour meeting today with his SEO guys, it turns out that there are bigger problems on the site than first thought.

It turns out that back in December last year 7000 pages of his site had been indexed, but according to Google Webmaster, Google is now only indexing 1750!

The SEO guys haven't seen this happen before, so they're scratching their heads a bit!

My mate isn't happy!

I think I will put this in a post of its own and see if anyone can shed any light.

(PS: I've already told my mate he needs to get his own account on here!)
 
Evening all

(This is third-party info, as the guy who owns the site in question doesn't yet have an account on here. I have no idea about these things, so I told him I'd ask on here for him!)

A friend had a site created back in Nov 09. The SEO seemed to be going well, and within six weeks Google had indexed just over 7000 of his pages (according to Google Webmaster).

This week he has found out that Google now only has 1700 pages indexed!

The site's updated daily/weekly with new info, which I thought could only be a good thing, but the numbers are still going the wrong way!

His SEO guys haven't seen this before and have started looking into it.

Can anyone shed any light on this problem?
 


Google has something iffy going on - I did a thread on it.
I'll try and find the thread and link it.

You're not the only one who has noticed something's wrong with them ;)
 
For the last 12 months, google has steadily been de-indexing pages on larger sites, while leaving the 'apparent' number of indexed pages there.

I would lay money on the possibility that your friend's site has nowhere near 1700 pages indexed properly - more like a couple of hundred showing cached versions. The rest are what were known as 'supplemental' pages.
 
Cheers Pete

Thanks for that link; I've just had a good read. At least my mate's not the only one with problems!

I'm new to forums in general, but I can see myself getting hooked! It's so interesting to read everyone's different opinions on any given subject! I'll have to become a full member soon!
 
For the last 12 months, google has steadily been de-indexing pages on larger sites, while leaving the 'apparent' number of indexed pages there.

I would lay money on the possibility that your friend's site has nowhere near 1700 pages indexed properly - more like a couple of hundred showing cached versions. The rest are what were known as 'supplemental' pages.


Is there a way to find out if that is the case?
 

Codefixer

Are all the indexed pages on the site unique?

By that I mean: do they have unique page titles, descriptions and content?
If some of the pages are merely duplicates - possibly just different querystring parameters - then Google may deem these pages to be filler with little valuable content and choose not to index them.
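To illustrate the querystring point: URLs that differ only in their parameters all map to the same underlying path, and search engines may treat them as duplicates of one page. A rough Python sketch (the shop URLs are hypothetical) that groups URLs this way to spot likely duplicates:

```python
from collections import defaultdict
from urllib.parse import urlsplit

def group_by_path(urls):
    """Group URLs by (host, path), ignoring the querystring."""
    groups = defaultdict(list)
    for u in urls:
        parts = urlsplit(u)
        groups[(parts.netloc, parts.path)].append(u)
    return groups

# Hypothetical example URLs:
urls = [
    "http://shop.example.com/widgets?sort=price",
    "http://shop.example.com/widgets?sort=name",
    "http://shop.example.com/about",
]
for (host, path), variants in group_by_path(urls).items():
    if len(variants) > 1:
        print(path, "has", len(variants), "querystring variants")
# → /widgets has 2 querystring variants
```

Any path with many querystring variants is a candidate for consolidation (or a canonical URL).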
 

fisicx

Moderator
The answer to the reduced page count is often very simple. One of the main causes is a lack of original content: if too many pages are similar, Google may decide to drop them out of the main index. They will still be in the database, but tucked away at the back. The other reason may be the honeymoon period experienced while the site is initially indexed. Google often gives a bit of a boost to new sites while considering how to rank each page. It may be that you got a 7000-page boost in the early stages, but Google has now decided only 1750 are worth including in the index.

You also need to consider that a sitemap can actually hurt your ranking. If the 'importance' (the priority field) isn't set correctly, Google will assume that all the pages are 'average' and won't take into account some of the SEO work carried out.

If Google is only reporting 1750 out of 7000 indexed, then do a few checks. Search for a range of products and see if they appear in the SERPs. If they do, then all is well. If they don't, then the developer/SEO team need to do some work.
 
I have merged the two threads that were running, so apologies for any disjointedness now in the replies.

I have mentioned this previously, and the main issues I have found that contributed to it were:

Weak pages
By this I mean that the pages were pretty much duplicated, with only a small element changing on each page. This is especially a problem with e-commerce sites, as the header, nav and footer all stay the same. Visible content might look a little different, but when you look at the code, you see that out of hundreds of lines, maybe only 4 or 5 are unique.

Poor navigation/site structure
In this case I noticed that the actual site hierarchy was poorly designed. There was no clear structure for the search engines to apply weight to.

Poor linking structure
Here I saw poor linking, in as much as most pages were linked to from most other pages. This only served to water down PageRank (link juice) to the point where the pages were all seen as unimportant (again related to poor structure/architecture).

Too many links
This was a common theme, as many carts have pop-out or drop-down option selections, which look innocent enough but on further investigation can be seen to be causing problems. It is possible to have hundreds of links per page, and this isn't good.
While a flat file structure is OK for a small site, a clear linking hierarchy MUST be evident in a large site. This allows Google to apply its weight and trust to each page, pro rata.

All the above serve only to confuse the search engines as to the importance of pages within your site. Each site has a page saturation level, worked out from the two main elements in the Google algorithm (yes, there really are only two when it is all boiled down):
1. Importance - a measure of value in the eyes of Google, pretty much PageRank.
2. Relevance - a textual value.

The above are further split into the 250 or more sub-elements that make up the algorithm, but when all is said and done, it is those two that matter.

With most of a shopping cart's pages being near-duplicate content, and the page cross-linking structure being higgledy-piggledy at best, how is Google supposed to know what is important or relevant?
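As a rough illustration of the 'too many links' point above, a quick way to audit a page is simply to count its anchor tags. A minimal Python sketch (the sample HTML is made up):

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Count <a href=...> tags in an HTML document."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        # Only count anchors that actually link somewhere.
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.count += 1

def count_links(html):
    parser = LinkCounter()
    parser.feed(html)
    return parser.count

# Made-up sample page fragment:
html = '<nav><a href="/a">A</a><a href="/b">B</a></nav><p>no links here</p>'
print(count_links(html))  # 2
```

Run this over each page of a site and any page returning hundreds of links is worth a closer look.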
 
Both are essential for a website, but to get the website indexed in Google you have to submit it to the search engines like Google, Yahoo & Bing. Search for 'add URL to Google' in Google and submit your website to the search engines.


Sorry, but I disagree - if you have to submit, then you are doing it wrong.

The only time I have submitted a website to a search engine was way back when AltaVista allowed you to submit and would spider within minutes (this is how we all learned SEO) :)
 
I would lay money on the possibility that your friend's site has nowhere near 1700 pages indexed properly - more like a couple of hundred showing cached versions. The rest are what were known as 'supplemental' pages.

I agree - 'supplementals' still exist; they just don't tell you any more!

Even if you have a new site with 1700 pages of unique content, Google will not put them all in the 'main' index immediately.

Regards

Dotty
 

michellemoore

Sorry, but I disagree - if you have to submit, then you are doing it wrong.

The only time I have submitted a website to a search engine was way back when Altavista allowed you to submit and would spider within minutes (this is how we all learned SEO) :)
Bing wasn't indexing my new website, and when I submitted my URL to Bing it was crawled the next day?? So I am not sure :|
 

fisicx

Moderator
Bing wasn't indexing my new website, and when I submitted my URL to Bing it was crawled the next day?? So I am not sure :|
Probably a coincidence.
 
Quote
Poor linking structure
Here I saw poor linking, in as much as most pages were linked to from most other pages. This only served to water down PageRank (link juice) to the point where the pages were all seen as unimportant (again related to poor structure/architecture).



I thought I'd heard internal links were important? I didn't realise you could overdo it.


Thanks for all the comments - I'm forwarding all this to my mate. Hopefully it may help to get his site out of trouble.
 

Oracle simsim

It seems that I came too late.

In my opinion, Google does not need a sitemap - just let your site's pages link to each other for visitors (Google's spiders will follow them). If you are running vBulletin, it is not very important to submit the XML sitemap; Google does not need it to crawl your site.

Try browsing your site as an unregistered visitor, then you'll see what Google can see.

Other search engines are not worth considering - they behave just like a moody girl :D
 
