I’ve got kind of an interesting situation here. I’ve rebuilt a site (for SEO purposes) that lives entirely within a subfolder. The site has some decent search rankings for now, so I want those results to keep directing traffic to it, but as I understand it, sitting a level deep like that limits how much of the site the spiders will crawl.
So here are the options that I see:
Have the primary index page (on the first level) redirect to this folder. This is what I have in place now; I’m just not sure it actually helps the spiders.
A variation of the above: create each page on the first level as a redirect to its counterpart in the folder. I’m thinking this would not work.
Recreate the site on the first level and swap it in for the folder version, erasing the original site. But then you have a bunch of broken links in the engines, and I doubt the new pages would simply take over the old rankings.
Duplicate the site as above, but keep the folder version too. Though having the content in two places might also hurt the site.
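To make option 1 concrete, here’s roughly the kind of top-level index page I mean (the folder name here is just a placeholder for mine). A zero-second meta refresh with a canonical link is about the closest you can get to a proper redirect without server access, though from what I’ve read a server-side 301 is still preferred:

```html
<!-- Hypothetical top-level index.html; "folder/" stands in for the real folder name -->
<!DOCTYPE html>
<html>
<head>
  <!-- 0-second meta refresh: sends visitors (and, reportedly, spiders) into the folder -->
  <meta http-equiv="refresh" content="0; url=folder/">
  <!-- canonical hint: tells engines which URL is the "real" one -->
  <link rel="canonical" href="folder/">
  <title>Redirecting…</title>
</head>
<body>
  <!-- plain link as a fallback for anything that ignores the refresh -->
  <p>This page has moved to <a href="folder/">folder/</a>.</p>
</body>
</html>
```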
As for the .htaccess route that I’ve read about online, I don’t have access to that file.
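For reference, what I’ve read about would look something like this (folder and page names are placeholders), in case my host would be willing to add it for me. The 301s would tell the engines each folder page has moved permanently to the top level, so the rankings should transfer rather than break:

```apache
# Hypothetical .htaccess at the web root; "folder/" stands in for the real folder name.
# One 301 (permanent) redirect per page that currently ranks:
Redirect 301 /folder/index.html /index.html
Redirect 301 /folder/about.html /about.html
```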
What is the correct step to take to maximize its searchability?