Big rewrite - distributed consumers for ingestion / scraping (scalable) - single producer written in C#.

Changed from page scraping to RSS XML scraping.
Includes RealDebridManager hashlist decoding (requires a GitHub read-only PAT, as requests must be authenticated; see the sketch after this message) - this allows ingestion of 200k+ entries in a few hours.
Simplifies a lot of torrentio to deal with the new data.
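
A minimal sketch of what a PAT-authenticated GitHub fetch for the hashlists could look like; the repository path, file name, and the RealDebridManager decode step are assumptions for illustration, not taken from this commit:

const GITHUB_PAT = process.env.GITHUB_PAT; // read-only personal access token

// Fetch a hashlist file from a GitHub repo via the contents API, asking for
// the raw file body. repoPath (e.g. 'owner/repo') and fileName are
// hypothetical parameters for this sketch.
async function fetchHashlist(repoPath, fileName) {
  const response = await fetch(`https://api.github.com/repos/${repoPath}/contents/${fileName}`, {
    headers: {
      Authorization: `Bearer ${GITHUB_PAT}`,
      Accept: 'application/vnd.github.v3.raw'
    }
  });
  if (!response.ok) {
    throw new Error(`GitHub request failed: ${response.status}`);
  }
  return response.text();
}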
iPromKnight
2024-02-01 16:38:45 +00:00
parent 6fb4ddcf23
commit ab17ef81be
255 changed files with 18489 additions and 69074 deletions

src/node/addon/index.js Normal file

@@ -0,0 +1,14 @@
import express from 'express';
import serverless from './serverless.js';
import { initBestTrackers } from './lib/magnetHelper.js';
import { ipFilter } from './lib/ipFilter.js';
const app = express();
app.enable('trust proxy');
app.use(ipFilter);
app.use(express.static('static', { maxAge: '1y' }));
app.use((req, res, next) => serverless(req, res, next));
app.listen(process.env.PORT || 7000, () => {
initBestTrackers()
.then(() => console.log(`Started addon at: http://localhost:${process.env.PORT || 7000}`));
});
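
For context, a possible shape for the ./lib/ipFilter.js middleware this entry point imports; the environment variable name and the response body are assumptions, not part of this commit:

// Hypothetical sketch of ./lib/ipFilter.js: reject requests whose client IP
// is not on a comma-separated allow-list. With 'trust proxy' enabled in the
// entry point, req.ip reflects the X-Forwarded-For header set by the proxy.
const allowed = (process.env.ALLOWED_IPS || '').split(',').filter(Boolean);

export function ipFilter(req, res, next) {
  if (allowed.length === 0 || allowed.includes(req.ip)) {
    return next();
  }
  res.status(403).json({ error: 'Forbidden' });
}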