I was faced with this exact same problem; I'm going to guess you and I were doing the same application challenge! This question was also asked here, but I'll copy my solution over since it hasn't yet been marked as a duplicate. I'm also assuming that you, like me, did not want to use the Google API because you didn't want to deal with authentication. If authentication tokens are not an issue, that definitely seems like the tidiest way to go about it.
Using requests, I pulled down the raw HTML response from the page, and with BeautifulSoup I turned it into a workable, parseable object:
import requests
from bs4 import BeautifulSoup

def get_first_table(url):
    # Make request
    html_response = requests.get(url=url)
    # Parse html into a BeautifulSoup object
    soup = BeautifulSoup(html_response.text, 'html.parser')
    # Collect and return the first table (assuming the first table is what you want)
    return soup.find('table')
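Calling that helper is a one-liner; the URL below is just a placeholder for whatever page you're scraping:

table = get_first_table('https://example.com/page-with-a-table')  # placeholder URL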
From there, you can parse the table more precisely to pull out the data you want, for example by iterating over its rows and cells.
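As a rough sketch (not my exact solution, and assuming a plain table made of <tr> rows with <td>/<th> cells), you can walk the table row by row and collect the text from each cell:

# Walk the table row by row and collect each cell's text
rows = []
for tr in table.find_all('tr'):
    cells = [cell.get_text(strip=True) for cell in tr.find_all(['td', 'th'])]
    if cells:  # skip rows that contain no cells
        rows.append(cells)
# rows is now a list of lists, one inner list per table row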
I'm refraining from copy-pasting my exact solution because I know others will use this to fill out the same job application challenge, but this gets you everything you need as long as you have a Python foundation.