WeSearch

I built a "polite scraper" Chrome extension instead of a server-side scraper. Here's why.

#web scraping · #chrome extension · #security · #automation · #privacy
⚡ TL;DR · AI summary

The author built SlotOwl, a Chrome extension that monitors government appointment portals for available slots. Instead of running the scraper on a server, it runs inside the user's own browser. This "polite scraper" approach improves security, avoids IP bans, eliminates server costs, and lets the user solve captchas manually. The trade-off is that the browser must stay open for monitoring to work.
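The in-browser approach described above can be sketched as a simple check-diff-notify loop. This is a hypothetical illustration, not SlotOwl's actual code: `fetchSlots` and `notify` stand in for a portal-specific parser and the browser's notification API, and the function names are invented for this sketch.

```javascript
// Return slots present in `current` but not in `previous`.
function findNewSlots(previous, current) {
  const seen = new Set(previous);
  return current.filter((slot) => !seen.has(slot));
}

// A "polite" interval: a base period plus random jitter, so many
// users don't hit the portal in lockstep at fixed times.
function politeDelayMs(baseMs = 5 * 60 * 1000, jitterMs = 60 * 1000) {
  return baseMs + Math.floor(Math.random() * jitterMs);
}

// One check cycle. `fetchSlots` and `notify` are injected so the
// same logic works with any portal parser and notifier.
async function checkOnce(state, fetchSlots, notify) {
  const current = await fetchSlots();
  const fresh = findNewSlots(state.known, current);
  if (fresh.length > 0) notify(fresh);
  state.known = current;
  return fresh;
}
```

Because the loop runs in the user's browser session, requests carry the user's own cookies and IP, which is what sidesteps the server-side IP-ban problem the article describes.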

Original article: DEV.to
Opening excerpt (first ~120 words)

Nik G · Posted on May 1 · I built a "polite scraper" Chrome extension instead of a server-side scraper. Here's why. #chrome #webdev #buildinpublic — Six weeks ago I started building SlotOwl — a Chrome extension that watches government appointment portals (visa, immigration, passport, Global Entry) and notifies you the moment a slot opens. This week I shipped it.

Excerpt limited to ~120 words for fair-use compliance; the full article is on DEV.to.


