One of the serious limits of the Force.com Batch Apex platform is the strict requirement that the start() call finish within two minutes. If you're dealing with a huge data set on a very large object, you might be in trouble. The trick is to loosen the start() criteria and let execute() do the finer filtering. The batch platform can handle up to 50 million records per job, but it's impatient if your scope query can't identify all of them within two minutes.
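As a rough illustration of this trick, here's a minimal sketch (object and field names like Invoice__c, Status__c, and Amount__c are hypothetical): start() returns a broad, cheap QueryLocator, and the expensive filtering moves into execute(), which doesn't face the two-minute scope-query limit.

```apex
global class BroadScopeBatch implements Database.Batchable<SObject> {

    global Database.QueryLocator start(Database.BatchableContext bc) {
        // Keep the scope query broad and cheap so it returns quickly;
        // avoid the complex, slow filters that make start() time out.
        return Database.getQueryLocator(
            'SELECT Id, Status__c, Amount__c FROM Invoice__c ' +
            'WHERE CreatedDate = LAST_N_DAYS:365');
    }

    global void execute(Database.BatchableContext bc, List<SObject> scope) {
        List<Invoice__c> targets = new List<Invoice__c>();
        for (Invoice__c inv : (List<Invoice__c>) scope) {
            // The fine-grained filtering that was too slow to run in start()
            if (inv.Status__c == 'Open' && inv.Amount__c > 1000) {
                targets.add(inv);
            }
        }
        // ... process targets ...
    }

    global void finish(Database.BatchableContext bc) {}
}
```

Records that fail the in-memory filter are simply skipped by each execute() chunk, so the job still touches every candidate row but never stalls in start().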
It's especially hard if you need to work on an aggregated basis, i.e., when the job scope is essentially a set of AggregateResults; reducing the query time then becomes more complex. Essentially the same trick still applies, but it needs additional support from a Database.Stateful data structure. I had an interesting discussion of this exact scenario on DeveloperForce and provided some sample code — check it out if you're interested. Details are in my posts at Here-n-now.
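To make the stateful variant concrete, here is a sketch under assumed, simplified requirements (summing Opportunity amounts per Account): rather than asking start() to run the aggregate query, start() scopes the raw rows, and the aggregation accumulates across execute() chunks in a member that Database.Stateful preserves between invocations.

```apex
global class StatefulAggregateBatch
        implements Database.Batchable<SObject>, Database.Stateful {

    // Survives across execute() invocations because of Database.Stateful
    private Map<Id, Decimal> totalByAccount = new Map<Id, Decimal>();

    global Database.QueryLocator start(Database.BatchableContext bc) {
        // Cheap, non-aggregated scope query instead of a slow GROUP BY
        return Database.getQueryLocator(
            'SELECT AccountId, Amount FROM Opportunity WHERE IsClosed = true');
    }

    global void execute(Database.BatchableContext bc, List<SObject> scope) {
        for (Opportunity opp : (List<Opportunity>) scope) {
            Decimal running = totalByAccount.get(opp.AccountId);
            totalByAccount.put(opp.AccountId,
                (running == null ? 0 : running) + opp.Amount);
        }
    }

    global void finish(Database.BatchableContext bc) {
        // totalByAccount now holds the complete aggregation; act on it here.
    }
}
```

The trade-off is that the stateful map is serialized between batches, so it must stay small enough (e.g., one entry per Account, not per Opportunity) to fit within heap limits.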