I had the same issue. Changing the robots.txt file didn't help for me. In the error log I found that crawling was blocked on the server side. I contacted my hosting company's support and got the following response:
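To see whether the block is server-side rather than robots.txt, you can grep the access log for Facebook's crawler user agent (`facebookexternalhit`) and look at the status codes. This is a hedged sketch: the log path and format vary by host, so it uses a small simulated log file instead of a real one.

```shell
# Simulate a few access-log lines (the real path, e.g.
# /var/log/apache2/access.log, depends on your hosting setup).
cat > /tmp/access.log.sample <<'EOF'
66.220.149.1 - - [10/May/2023:12:00:01 +0000] "GET / HTTP/1.1" 403 199 "-" "facebookexternalhit/1.1"
203.0.113.5 - - [10/May/2023:12:00:02 +0000] "GET / HTTP/1.1" 200 512 "-" "Mozilla/5.0"
EOF

# A 403 for the facebookexternalhit user agent means the server itself
# rejected the request; robots.txt blocking would still show 200s on
# robots.txt and simply no crawl of the blocked paths.
grep "facebookexternalhit" /tmp/access.log.sample
```

If you see 403 (or similar) responses only for the Facebook crawler, the block is at the server or firewall level and only your host can lift it, as in my case.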
"The Facebook crawler has been temporarily blocked for some servers because it showed a very aggressive crawling behavior (more than 100,000 hits per day to multiple websites). This was not only the case with our servers, but also with servers from other hosting companies. We contacted Facebook about this, but unfortunately none received a response to our email.
The Facebook crawler now has access again and we are keeping some things keep an eye on it."
After this, everything worked fine for me.