Want Google to crawl your site faster? One of the quickest ways to make sure Google reaches the deeper pages of content in your site is to use Google Webmaster Tools.
For one thing, that's where you'll be able to submit a Google sitemap file. If you don't know what a sitemap file is, you might have just woken up from a 10,000-year sleep, so here's a quick refresher: a sitemap file is typically an XML-based file that lists your page URLs, each with a timestamp of when the page was last modified. That way, Google not only knows a page exists, but also when it changed and how frequently it's updated.
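To make that concrete, here's a minimal sketch of building such a sitemap file with Python's standard library. The URLs and dates are made-up placeholders, and `build_sitemap` is just a hypothetical helper name; real generators usually come built into your CMS or a sitemap plugin.

```python
import xml.etree.ElementTree as ET

# Namespace required by the sitemap protocol
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(entries):
    """Build a sitemap XML string from (url, lastmod) pairs."""
    urlset = ET.Element("urlset", xmlns=NS)
    for loc, lastmod in entries:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")

# Placeholder pages for illustration only
xml = build_sitemap([
    ("http://www.example.com/", "2009-06-01"),
    ("http://www.example.com/deep/page.html", "2009-05-20"),
])
print(xml)
```

Each `<url>` entry carries a `<loc>` and a `<lastmod>`, which is exactly the "page plus timestamp" pairing described above.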
Inside Google Webmaster Tools, you'll also be able to set the crawl rate for your website. The crawl rate is basically how slowly or quickly Googlebot requests pages from your site. Typically you can set it faster, but only for 90 days at a time, which should be long enough for Google to rip through your site and grab the content for indexing. If you have several million pages of content to get indexed, start off with several large sitemap files. A sitemap file should not exceed 50,000 URLs or 10MB, so compress your files with gzip if they are larger than 10MB. Link directly to pages on your site that are several clicks away from the homepage; by doing that, you force Google to take a look at that page, and the ones it links to.
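If you do have millions of URLs, the splitting can be automated. The sketch below, again using only Python's standard library, chops a URL list into gzipped sitemap files of at most 50,000 URLs each; `write_sitemaps` and the filename pattern are assumptions for illustration, not part of any Google tool.

```python
import gzip

MAX_URLS = 50000  # per-file limit from the sitemap protocol

def write_sitemaps(urls, prefix="sitemap"):
    """Split a URL list into gzipped sitemap files of at most MAX_URLS each."""
    filenames = []
    for i in range(0, len(urls), MAX_URLS):
        chunk = urls[i:i + MAX_URLS]
        body = "".join("<url><loc>%s</loc></url>" % u for u in chunk)
        xml = ('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
               + body + "</urlset>")
        name = "%s-%d.xml.gz" % (prefix, len(filenames) + 1)
        with gzip.open(name, "wt") as f:  # gzip keeps each file small on the wire
            f.write(xml)
        filenames.append(name)
    return filenames

# Placeholder run: 120,000 made-up URLs end up in three sitemap files
files = write_sitemaps(["http://www.example.com/page/%d" % n
                        for n in range(120000)])
print(files)
```

You'd then submit each of the resulting files in Google Webmaster Tools.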
Another method to get the deeper pages of your site indexed is simply to re-arrange your link structure, similar to what Amazon did: make the category pages link to sub-categories and even deeper pages in the hierarchy.