
The problem is I need to do this at scale, quickly, and in a whitehat way. I thought about pinging, hiring freelancers on Fiverr to submit to social networks, or using Scrapebox's rapid indexer add-on, but I think all of those solutions will look mighty suspicious to Google and result in penalties.

Putting that many new URLs from a new domain into the system may trigger a manual review. I would probably start by submitting 50K at a time over the course of 1-3 months.
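That 50K batch size also lines up with the sitemap protocol's limit of 50,000 URLs per sitemap file, so a staged rollout maps naturally onto one sitemap file per batch. Here's a minimal Python sketch of that chunking; the filenames and the example.com host are placeholders, not anything from the original question:

    from xml.sax.saxutils import escape

    CHUNK_SIZE = 50_000  # sitemap protocol limit: 50,000 URLs per file

    def write_sitemaps(urls, prefix="sitemap"):
        """Split a URL list into 50K-URL sitemap files plus a sitemap index."""
        filenames = []
        for i in range(0, len(urls), CHUNK_SIZE):
            name = f"{prefix}-{i // CHUNK_SIZE + 1}.xml"
            with open(name, "w", encoding="utf-8") as f:
                f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
                f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
                for url in urls[i:i + CHUNK_SIZE]:
                    f.write(f"  <url><loc>{escape(url)}</loc></url>\n")
                f.write("</urlset>\n")
            filenames.append(name)

        # Index file pointing at each chunk. Submit the index once, then add
        # chunks to it gradually to stage the rollout over 1-3 months.
        with open(f"{prefix}-index.xml", "w", encoding="utf-8") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
            f.write('<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
            for name in filenames:
                f.write(f"  <sitemap><loc>https://example.com/{name}</loc></sitemap>\n")
            f.write("</sitemapindex>\n")
        return filenames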

Also, the best ways to get those deeper URLs indexed are:

1) Set up the XML sitemap in webmaster tools.
2) Set up easy-to-crawl HTML sitemaps: web pages with ~100 links per page that are easy to crawl and navigate through pagination (see the sketch after this list).
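
Point 2 amounts to a flat, paginated link tree. A rough sketch of generating ~100-link HTML sitemap pages connected by previous/next links; the page naming scheme is an assumption for illustration:

    import html

    LINKS_PER_PAGE = 100  # keep each HTML sitemap page small and crawlable

    def write_html_sitemap(urls, out_prefix="sitemap-page"):
        """Emit paginated HTML sitemap pages, each linking ~100 URLs."""
        pages = [urls[i:i + LINKS_PER_PAGE] for i in range(0, len(urls), LINKS_PER_PAGE)]
        for n, chunk in enumerate(pages, start=1):
            # Plain <a href> pagination so crawlers can walk the whole set.
            nav = []
            if n > 1:
                nav.append(f'<a href="{out_prefix}-{n - 1}.html">Previous</a>')
            if n < len(pages):
                nav.append(f'<a href="{out_prefix}-{n + 1}.html">Next</a>')
            links = "\n".join(
                f'  <li><a href="{html.escape(u)}">{html.escape(u)}</a></li>'
                for u in chunk
            )
            page = (
                "<!DOCTYPE html>\n"
                f"<html><head><title>Sitemap page {n}</title></head><body>\n"
                f"<ul>\n{links}\n</ul>\n"
                f'<p>{" | ".join(nav)}</p>\n'
                "</body></html>\n"
            )
            with open(f"{out_prefix}-{n}.html", "w", encoding="utf-8") as f:
                f.write(page)

Keeping the pagination as ordinary links (rather than JavaScript) matters here, since the whole point is that a crawler can reach every page by following hrefs.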


Answered 10 years ago
