Google has confirmed that some Googlebot user-agent spiders are not properly passing its verification protocol.
Savvy webmasters noticed that Googlebot activity from the .249.70.0/24 IP range was not returning the proper reverse DNS verification details. The lookup came back with "no such host is known," yet the crawling behavior otherwise appeared to be that of a legitimate Google crawler.
Google’s John Mueller confirmed on my Google+ post that this was indeed an issue on Google’s end. Google has temporarily stopped Googlebot activity in those IP ranges and will fix the issue before resuming crawling from them.
The practical impact: if you block rogue spiders through reverse DNS verification, you may have inadvertently blocked Google from crawling your site when the DNS check failed.
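For context, the reverse DNS verification in question is the two-step check Google has long recommended: do a reverse lookup on the crawler's IP, confirm the hostname falls under googlebot.com or google.com, then do a forward lookup on that hostname and confirm it resolves back to the same IP. A minimal sketch in Python (function name and structure are my own illustration, not an official implementation):

```python
import socket

def is_verified_googlebot(ip):
    """Verify a claimed Googlebot IP via reverse-then-forward DNS.

    Returns True only if the reverse lookup yields a googlebot.com or
    google.com hostname AND that hostname resolves back to the same IP.
    """
    try:
        # Step 1: reverse DNS lookup on the crawler's IP.
        host = socket.gethostbyaddr(ip)[0]
    except socket.herror:
        # This is the failure webmasters saw: "no such host is known."
        return False
    # Step 2: the hostname must belong to Google's crawler domains.
    if not host.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        # Step 3: forward DNS on the hostname must include the original IP.
        return ip in socket.gethostbyname_ex(host)[2]
    except socket.gaierror:
        return False
```

During this incident the reverse lookup itself failed for real Googlebot IPs, so a strict check like this would have rejected genuine Google crawler traffic, which is exactly the problem the article describes.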
Since Google is working on a fix and has paused crawling from the affected ranges, you should not have to worry about blocking a real Googlebot at this time.