I’m experiencing an issue where my website is being targeted by web-scraping bots. It appears the attackers are controlling Chrome browsers through the Chrome DevTools Protocol (CDP) directly, rather than through automation frameworks like Selenium or Puppeteer. As a result, traditional browser fingerprinting isn’t surfacing any anomalies.
I’ve Tried:
- Implementing standard browser fingerprinting techniques (e.g., checking `navigator.webdriver`, headless user-agent strings, and ChromeDriver’s injected globals), none of which flag these sessions
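For concreteness, here is a rough sketch of the kind of checks I already run (the helper name is mine, not a library API). Each signal is a classic automation tell, and all of them come back clean for these sessions:

```javascript
// Hypothetical helper: returns true if any classic automation signal is present.
// Takes navigator-like and window-like objects so it can be exercised outside a browser.
function hasClassicAutomationTells(nav, win) {
  return Boolean(
    nav.webdriver ||                                    // set by Selenium/Puppeteer by default
    /HeadlessChrome/.test(nav.userAgent || '') ||       // headless Chrome UA string
    Object.keys(win).some((k) => k.startsWith('cdc_'))  // ChromeDriver leaks cdc_* globals
  );
}
```

In the page I call it as `hasClassicAutomationTells(navigator, window)`; for the CDP-driven sessions it always returns `false`.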
Challenges:
- CDP-controlled browsers closely mimic regular user browsers, so conventional detection methods fail.
- The sessions show no distinctive fingerprints or behavioral anomalies.
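One CDP-specific idea I’ve seen discussed, but haven’t validated against these scrapers, exploits console-argument serialization: when a CDP client enables the Runtime domain, Chrome eagerly serializes arguments passed to console methods, which triggers property getters that an unattended page never touches. A minimal sketch (the function and callback names are mine, and I’m unsure whether it still fires for clients that avoid `Runtime.enable`):

```javascript
// Hypothetical probe: build an object whose `stack` getter fires only when
// something serializes it. With a CDP client's Runtime domain enabled (or
// DevTools open), Chrome serializes console arguments and reads the getter;
// a normal page load never does.
function makeCdpBait(onSerialized) {
  const bait = new Error('cdp-bait');
  Object.defineProperty(bait, 'stack', {
    get() {
      onSerialized(); // likely CDP attachment (or an open DevTools panel)
      return '';
    },
  });
  return bait;
}

// Usage in the page: harmless for real users, may fire under CDP.
// console.debug(makeCdpBait(() => reportSuspicion()));
```

I’d welcome confirmation on whether this probe is still reliable, and what its false-positive rate is for users who simply open DevTools.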
Question:
What strategies or techniques can I use to detect and mitigate scrapers that control Chrome via CDP directly, rather than through automation tools like Selenium or Puppeteer? Are there specific indicators or advanced methods for identifying such sophisticated scraping attempts?