79413644

Date: 2025-02-05 04:30:17
Score: 1
Natty:
Report link

This approach may help:

  1. Use the BeautifulSoup library for static websites, snscrape for Twitter, and the LinkedIn API for LinkedIn data (see the first sketch after this list).
  2. Filter out the links you don't need with keyword matching or regular expressions (second sketch below).
  3. Store the extracted data in Google Sheets (it is free) and access it through the Google Sheets API, enabled in the Google Cloud Console (third sketch below).
  4. Automate the whole pipeline with Windows Task Scheduler (last sketch below).
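For step 1, a minimal sketch of the static-site case with requests and BeautifulSoup; the URL is a placeholder, and snscrape and the LinkedIn API have their own clients that are not shown here:

    # Step 1: pull every hyperlink from a static page.
    import requests
    from bs4 import BeautifulSoup

    def fetch_links(url: str) -> list[str]:
        """Download a static page and return every hyperlink it contains."""
        response = requests.get(url, timeout=30)
        response.raise_for_status()
        soup = BeautifulSoup(response.text, "html.parser")
        return [a["href"] for a in soup.find_all("a", href=True)]

    print(fetch_links("https://example.com"))  # placeholder URL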
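For step 2, a sketch of keyword / regex filtering; the keyword list and exclude pattern are made-up examples:

    # Step 2: keep only the links that look relevant.
    import re

    KEYWORDS = ("job", "hiring", "vacancy")                      # assumed keywords
    EXCLUDE = re.compile(r"login|signup|advert", re.IGNORECASE)  # assumed junk pattern

    def filter_links(links: list[str]) -> list[str]:
        """Keep links that contain a keyword and do not match the exclude pattern."""
        return [
            link for link in links
            if any(kw in link.lower() for kw in KEYWORDS) and not EXCLUDE.search(link)
        ]

    print(filter_links(["https://example.com/job/123", "https://example.com/login"]))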
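For step 3, a sketch that writes rows through the Google Sheets API using the gspread client; it assumes a service-account key file ("service_account.json") created in the Google Cloud Console and a spreadsheet named "scraped-links" shared with that service account, both of which are placeholder names:

    # Step 3: append each link as a row in a Google Sheet.
    import gspread

    def save_to_sheet(rows: list[str]) -> None:
        """Append one row per link to the first worksheet of the spreadsheet."""
        gc = gspread.service_account(filename="service_account.json")
        worksheet = gc.open("scraped-links").sheet1
        for link in rows:
            worksheet.append_row([link])

    save_to_sheet(["https://example.com/job/123"])  # placeholder data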
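For step 4, a sketch of a single entry point that Windows Task Scheduler can run on a schedule; fetch_links, filter_links and save_to_sheet are the helpers from the sketches above, and the task name and script path in the schtasks command are placeholders:

    # Step 4: one script for Task Scheduler to run, registered for example with:
    #   schtasks /create /tn "ScrapeLinks" /tr "python C:\scripts\scrape.py" /sc DAILY /st 09:00

    def main() -> None:
        links = fetch_links("https://example.com")  # step 1, placeholder URL
        wanted = filter_links(links)                # step 2
        save_to_sheet(wanted)                       # step 3

    if __name__ == "__main__":
        main()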
Reasons:
  • No code block (0.5):
  • Low reputation (0.5):
Posted by: Swati