A few days ago I had to upgrade my hosting plan. I was on a $10 shared hosting plan and getting a good return on investment, until I started hitting the plan's CPU usage limits. The problem is that Googlebot constantly crawls my site at about 4 requests per second, which works out to roughly 350,000 pages a day, so I moved the site to a dedicated server. The site itself only gets about 5,000 page views a day, which is nothing compared to the Googlebot activity; that's why shared hosting had been enough for it. The site is also search-engine optimized, so every page is reachable and gets crawled. Each page is cheap to serve: a database query takes between 0.0004 and 0.002 seconds.
My options are to make the queries faster, to limit the crawl rate (requests per second), or to upgrade the hosting.
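For the crawl-rate option, one approach is to throttle at the application level: Google documents that Googlebot slows down when it receives 429/503 responses. Below is a minimal sketch of that idea using a token bucket keyed off the user agent. The names (`TokenBucket`, `handle_request`) and the rate of 2 requests/second are my own illustration, not anything from a specific framework:

```python
import time

class TokenBucket:
    """Allow at most `rate` requests per second, with short bursts up to `capacity`."""

    def __init__(self, rate, capacity=None, clock=time.monotonic):
        self.rate = float(rate)
        self.capacity = capacity if capacity is not None else rate
        self.tokens = self.capacity      # start full
        self.clock = clock               # injectable for testing
        self.last = clock()

    def allow(self):
        # Refill tokens based on elapsed time, then try to spend one.
        now = self.clock()
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

def handle_request(user_agent, bucket):
    """Return an HTTP status code: throttle the crawler when it exceeds its budget.

    Regular visitors are never throttled; only requests whose user agent
    mentions Googlebot draw from the bucket.
    """
    if "Googlebot" in user_agent and not bucket.allow():
        return 429  # "Too Many Requests" -- Googlebot backs off on 429/503
    return 200

if __name__ == "__main__":
    # Simulate a burst: with rate=2, the third back-to-back crawler hit is throttled.
    fake_time = [0.0]
    bucket = TokenBucket(rate=2, clock=lambda: fake_time[0])
    print(handle_request("Googlebot/2.1", bucket))  # 200
    print(handle_request("Googlebot/2.1", bucket))  # 200
    print(handle_request("Googlebot/2.1", bucket))  # 429
    fake_time[0] += 1.0                              # a second later, tokens refill
    print(handle_request("Googlebot/2.1", bucket))  # 200
    print(handle_request("Mozilla/5.0", bucket))    # 200 (normal visitors unaffected)
```

Whether this is worth doing depends on the answer to the question below: if the queries are already that fast, the CPU cost per crawl hit may be small enough that throttling is unnecessary.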
I don't know which of these to choose. Any advice?