I've got kind of an interesting situation here. I've rebuilt a site (for SEO purposes) that is basically contained within a folder. The site has some decent search results for now, so I want those to keep directing traffic to it, but as I understand it, being on the second level like that limits how much the spiders will crawl your site.
So here are the options that I see:
Have the primary index page (on the first level) redirect to this folder. This is what I have in place now; I'm just not sure it helps the spiders out, though.
A variation of the above, create each page on the first level as a redirect. I'm thinking this would not work.
Recreate the site on the first level, swapping it in for the folder-level version and erasing the original site. But then you'd have a bunch of broken links coming from the engines, and I doubt the new pages could simply take their place in the results.
Duplicate the site as above, but keep the folder-level version too. Duplicated content might also hurt the site.
As for the .htaccess route that I've read about online, I don't have access to that file.
What is the correct step to take to maximize its searchability?
Search engines really couldn't care less about your internal structure. It doesn't matter to them whether it's all in the root folder, spread across a folder tree four layers deep, or sitting entirely in a single sub-sub-sub-sub-sub-folder. What matters is that you have a clear page structure and consistent internal navigation so that spiders can find their way around easily. What form the actual URL takes doesn't get a look-in.
Thanks for the response. That takes a little off my shoulders now.
Well, don't worry about that; just redirect the old URLs to the new ones and your new URLs will be indexed in no time with the same searchability.
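Since you can't use .htaccess, one common workaround is to replace each old top-level page with a small stub that sends visitors (and search engines) to the new location via an instant meta refresh plus a canonical hint. Here's a rough sketch in Python that generates such stub pages; the `/newsite/` folder name and page paths are just placeholders for your own:

```python
# Sketch: generate HTML redirect stubs for old top-level URLs when
# server-side (.htaccess) redirects aren't available. The target paths
# below are hypothetical examples.

STUB_TEMPLATE = """<!DOCTYPE html>
<html>
<head>
  <meta http-equiv="refresh" content="0; url={target}">
  <link rel="canonical" href="{target}">
  <title>Page moved</title>
</head>
<body>
  <p>This page has moved to <a href="{target}">{target}</a>.</p>
</body>
</html>
"""

def make_stub(target_url: str) -> str:
    """Return the HTML for a redirect stub pointing at target_url."""
    return STUB_TEMPLATE.format(target=target_url)

if __name__ == "__main__":
    # e.g. the old /about.html now lives at /newsite/about.html
    print(make_stub("/newsite/about.html"))
```

An instant (0-second) meta refresh isn't a true 301, but search engines generally treat it as a strong redirect signal, and the canonical link reinforces which URL should be indexed.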
Some people doing SEO believe that the further a document is from the root, the less important it is. IMO that factor may carry some weight, but it doesn't affect the crawl rate.