# twscrape

X to Nostr scraper.
## installation
Clone the repo to `/usr/share/twscrape/`. If you use a different path, edit the path in `twitter.js`.

Create a file named `.sec` and put your hex private key in it.

Create a file named `queue` and paste x.com profile URLs into it.
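The two setup files above can be sketched like this — a minimal illustration only, using `openssl` as one way to generate a fresh 32-byte hex key, with a temp directory standing in for `/usr/share/twscrape/`:

```sh
# Illustrative setup only: a temp dir stands in for /usr/share/twscrape/.
dir=$(mktemp -d)
cd "$dir"

# .sec holds the hex private key; openssl shown as one way to
# generate a fresh 32-byte (64 hex characters) key.
openssl rand -hex 32 > .sec
chmod 600 .sec   # keep the key readable only by you

# queue holds the x.com profile URLs to scrape (example URLs).
cat > queue <<'EOF'
https://x.com/exampleuser1
https://x.com/exampleuser2
EOF
```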
Create `cookies.txt` by running:

```sh
./extract_cookies.sh
```
Optionally, install the systemd service:

```sh
sudo cp twitter.service /etc/systemd/system/twitter.service
sudo systemctl daemon-reload
sudo systemctl start twitter
```
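For reference, a unit along these lines would run the queue once at startup. This is a hypothetical sketch, not the `twitter.service` that ships with the repo; the paths and options are assumptions:

```ini
# Hypothetical sketch only; see the repo's twitter.service for the real unit.
[Unit]
Description=twscrape queue run
After=network-online.target

[Service]
Type=oneshot
WorkingDirectory=/usr/share/twscrape
ExecStart=/usr/share/twscrape/start-queue.sh

[Install]
WantedBy=multi-user.target
```

`Type=oneshot` fits a service that runs the queue once and then exits.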
## running
Scrape a single profile (`$p` holds the profile URL):

```sh
node /usr/share/twscrape/twitter.js "$p"
```
Or scrape all profiles in the queue:

```sh
./start-queue.sh
```
If using the systemd service:

```sh
sudo systemctl enable twitter
sudo systemctl start twitter
```
The systemd service simply runs the queue once, so it is not a continuous process. Add a while loop to `start-queue.sh` if you need one.
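Such a loop could look like this — a hypothetical sketch, not part of twscrape itself. `QUEUE_CMD` defaults to an `echo` stand-in so the snippet runs anywhere; point it at `./start-queue.sh` in real use, and set `MAX_PASSES=0` to loop forever:

```sh
#!/bin/sh
# Hypothetical continuous wrapper. QUEUE_CMD, INTERVAL and MAX_PASSES
# are illustrative knobs, not part of twscrape itself.
QUEUE_CMD=${QUEUE_CMD:-"echo queue pass"}   # real use: ./start-queue.sh
INTERVAL=${INTERVAL:-0}                     # seconds between passes
MAX_PASSES=${MAX_PASSES:-3}                 # 0 = loop forever

pass=0
while [ "$MAX_PASSES" -eq 0 ] || [ "$pass" -lt "$MAX_PASSES" ]; do
    $QUEUE_CMD
    pass=$((pass + 1))
    sleep "$INTERVAL"
done
echo "completed $pass passes"
```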
## config
You can change some settings in the config file.

To watch what the scraper is doing, set headless mode to `false` in `twitter.js`.