Torrentio

  • torrentio-addon - the Stremio addon that queries scraped entries and returns Stremio stream results.

Self-hosted quickstart

docker-compose up -d

Then open your browser to http://127.0.0.1:7000

If you'd like to enable crawling of RealDebridManager's shared hashlists, which will massively boost the number of cached entries in your database, set a read-only GitHub personal access token in 'env/producer.env' as the 'GithubSettings__PAT=<token_here>' value.
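For reference, this is roughly what the entry looks like, assuming 'env/producer.env' follows the usual KEY=value dotenv format (replace the placeholder with your own token):

```ini
# env/producer.env
# Read-only GitHub personal access token; requests to the
# RealDebridManager hashlists must be authenticated.
GithubSettings__PAT=<token_here>
```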

You can scale the number of consumers by changing the consumer deploy replica count in the compose file on line 87; it is currently set to 3. To adjust the number of concurrent ingestions processed per consumer, change the job concurrency setting in 'env/consumer.env'.
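A minimal sketch of those two knobs, assuming a standard Compose `deploy.replicas` field; the service name and the concurrency key name below are illustrative, so check the actual files for the real identifiers:

```yaml
# docker-compose.yml (excerpt, around line 87)
services:
  consumer:
    env_file:
      - env/consumer.env
    deploy:
      replicas: 3   # number of consumer instances; raise to scale ingestion out
```

```ini
# env/consumer.env
# Hypothetical key name for the job concurrency setting mentioned above:
# how many ingestions each consumer processes at once.
JobConcurrency=5
```

Replicas add whole consumer processes (horizontal scaling), while the concurrency setting controls parallel work inside each consumer, so tune them together against your available CPU and database throughput.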