Big rewrite: distributed consumers for ingestion/scraping (scalable), with a single producer written in C#.
Switched from page scraping to RSS XML scraping. Includes RealDebridManager hashlist decoding (requires a GitHub read-only PAT, as requests must be authenticated); this allows ingestion of 200k+ entries in a few hours. Also simplifies much of torrentio's handling of the new data.
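The hashlist step pulls from GitHub, which rejects unauthenticated API traffic, hence the read-only PAT. A minimal C# sketch of that authenticated fetch, assuming the GitHub contents API; the repo owner/name, path, and client name are illustrative assumptions, and only the PAT requirement comes from this commit message:

```csharp
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

public static class GitHubHashlistClient
{
    // Fetches one raw hashlist file from a GitHub repo via the contents API.
    // The repo and path below are hypothetical; substitute the real hashlist location.
    public static async Task<string> FetchHashlistAsync(string pat, string path)
    {
        using var client = new HttpClient();

        // GitHub's API requires a User-Agent and, for reliable access,
        // an Authorization header - the read-only PAT from the commit message.
        client.DefaultRequestHeaders.UserAgent.ParseAdd("producer/1.0");
        client.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Bearer", pat);
        client.DefaultRequestHeaders.Accept.Add(
            new MediaTypeWithQualityHeaderValue("application/vnd.github.raw+json"));

        var url = $"https://api.github.com/repos/debridmediamanager/hashlists/contents/{path}";
        var response = await client.GetAsync(url);
        response.EnsureSuccessStatusCode();
        return await response.Content.ReadAsStringAsync();
    }
}
```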
src/producer/Configuration/scrapers.json (new file, 32 lines)
@@ -0,0 +1,32 @@
{
  "ScrapeConfiguration": {
    "StorageConnectionString": "",
    "Scrapers": [
      {
        "Name": "SyncEzTvJob",
        "IntervalSeconds": 60,
        "Enabled": true
      },
      {
        "Name": "SyncTpbJob",
        "IntervalSeconds": 60,
        "Enabled": true
      },
      {
        "Name": "SyncYtsJob",
        "IntervalSeconds": 60,
        "Enabled": true
      },
      {
        "Name": "SyncTgxJob",
        "IntervalSeconds": 60,
        "Enabled": true
      },
      {
        "Name": "SyncDmmJob",
        "IntervalSeconds": 1800,
        "Enabled": true
      }
    ]
  }
}
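For reference, a minimal sketch of how the producer might bind this file with Microsoft.Extensions.Configuration; the POCO and property names are assumptions mirrored from the JSON keys above, not taken from the actual producer source:

```csharp
using System;
using System.Collections.Generic;
using Microsoft.Extensions.Configuration;

// Shapes inferred from scrapers.json: a connection string plus a list of
// named scraper jobs, each with a polling interval and an enabled flag.
public class ScrapeConfiguration
{
    public string StorageConnectionString { get; set; } = "";
    public List<ScraperSettings> Scrapers { get; set; } = new();
}

public class ScraperSettings
{
    public string Name { get; set; } = "";
    public int IntervalSeconds { get; set; }
    public bool Enabled { get; set; }
}

public static class Program
{
    public static void Main()
    {
        var config = new ConfigurationBuilder()
            .AddJsonFile("Configuration/scrapers.json")
            .Build();

        // Bind the "ScrapeConfiguration" section to the POCO above.
        var settings = config.GetSection("ScrapeConfiguration")
                             .Get<ScrapeConfiguration>();

        foreach (var scraper in settings!.Scrapers)
            Console.WriteLine(
                $"{scraper.Name}: every {scraper.IntervalSeconds}s (enabled: {scraper.Enabled})");
    }
}
```

Note how the per-job `IntervalSeconds` matches the file: the RSS jobs poll every 60 seconds, while the heavier DMM hashlist sync runs on a 1800-second (30-minute) cycle.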