From f000ae6c12105f23645e2c3e98ed3af0add07b1c Mon Sep 17 00:00:00 2001
From: purple_emily
Date: Sun, 10 Mar 2024 11:40:58 +0000
Subject: [PATCH] WIP: Run Knight Crawler

---
 .github/workflows/build-docs.yaml            |  2 ++
 docs/Writerside/topics/Getting-started.md    |  5 ++-
 docs/Writerside/topics/Overview.md           | 10 +++---
 docs/Writerside/topics/Run-Knight-Crawler.md | 34 ++++++++++++--------
 4 files changed, 33 insertions(+), 18 deletions(-)

diff --git a/.github/workflows/build-docs.yaml b/.github/workflows/build-docs.yaml
index 06e35bb..2b7ac16 100644
--- a/.github/workflows/build-docs.yaml
+++ b/.github/workflows/build-docs.yaml
@@ -1,5 +1,7 @@
 name: Build documentation
 
+# TODO: Only run on ./docs folder change
+
 on:
   push:
     branches: ["master"]
diff --git a/docs/Writerside/topics/Getting-started.md b/docs/Writerside/topics/Getting-started.md
index 613a7ba..ed18f05 100644
--- a/docs/Writerside/topics/Getting-started.md
+++ b/docs/Writerside/topics/Getting-started.md
@@ -3,12 +3,15 @@
 Knight Crawler is provided as an all-in-one solution. This means that you can get started with very little setup.
 
 In its initial state Knight Crawler will only work on the machine that you host it on, and will not be accessible on
-your local network or from the internet. This is a Stremio limitation ([read Stremio's stance here](https://github.com/Stremio/stremio-features/issues/687)) and is beyond our control. We provide a guide
+your local network or from the internet. This is a Stremio
+limitation ([read Stremio's stance here](https://github.com/Stremio/stremio-features/issues/687)) and is beyond our
+control. We provide a guide on
 how to make Knight Crawler accessible from your local network or the internet [here]().
 ## Before you start
 
 Make sure that you have:
+
 - A place to host Knight Crawler
 - [Docker](https://docs.docker.com/get-docker/) and [Compose](https://docs.docker.com/compose/install/) installed
 - A [GitHub](https://github.com/) account _(optional)_
diff --git a/docs/Writerside/topics/Overview.md b/docs/Writerside/topics/Overview.md
index 0115ced..578c64f 100644
--- a/docs/Writerside/topics/Overview.md
+++ b/docs/Writerside/topics/Overview.md
@@ -2,14 +2,15 @@
 
 The image shows a Knight in silvery armour looking forwards.
 
-Knight Crawler is a self-hosted [Stremio](https://www.stremio.com/) addon for streaming torrents via a [Debrid](Supported-Debrid-services.md "Click for a list of Debrid services we support") service.
+Knight Crawler is a self-hosted [Stremio](https://www.stremio.com/) addon for streaming torrents via
+a [Debrid](Supported-Debrid-services.md "Click for a list of Debrid services we support") service.
 
 We are active on [Discord](https://discord.gg/8fQdxay9z2) for both support and casual conversation.
 
 > Knight Crawler is currently alpha software.
-> 
+>
 > Users are responsible for ensuring their data is backed up regularly.
-> 
+>
 > Please read the changelogs before updating to the latest version.
 > {style="warning"}
 
@@ -24,5 +25,6 @@ Knight Crawler is an addon for [Stremio](https://www.stremio.com/). It began as
 3. It then stores this information in a database for easy access.
 
 When you choose a film or tv show to watch on Stremio, a request will be sent to your installation of Knight Crawler.
-Knight Crawler will query the database and return a list of all the copies it has stored in the database as Debrid links.
+Knight Crawler will query the database and return a list of all the copies it has stored in the database as Debrid
+links.
 This enables playback to begin immediately for your chosen media.
\ No newline at end of file
diff --git a/docs/Writerside/topics/Run-Knight-Crawler.md b/docs/Writerside/topics/Run-Knight-Crawler.md
index cd5820f..ba6d623 100644
--- a/docs/Writerside/topics/Run-Knight-Crawler.md
+++ b/docs/Writerside/topics/Run-Knight-Crawler.md
@@ -1,6 +1,7 @@
 # Run Knight Crawler
 
-To run Knight Crawler you need two files, both can be found in the [deployment/docker](https://github.com/Gabisonfire/knightcrawler/tree/master/deployment/docker)
+To run Knight Crawler you need two files; both can be found in
+the [deployment/docker](https://github.com/Gabisonfire/knightcrawler/tree/master/deployment/docker)
 directory on GitHub:
 
 - deployment/docker/.env.example
@@ -25,7 +26,7 @@ Before we start the services, we need to change a few things in the .env
 If you are using an external database, configure it in the .env file. Don't forget to disable the ones
 > included in the docker-compose.yaml.
 
-### Your time zone.
+### Your time zone
 
 ```Bash
 TZ=Europe/London
 ```
@@ -43,21 +44,28 @@
 MAX_CONNECTIONS_PER_TORRENT=10
 CONSUMER_REPLICAS=3
 ```
 
-These are totally subjective to your machine and network capacity. The above default is pretty minimal and will work on most machines.
+These depend entirely on your machine and network capacity. The defaults above are fairly minimal and will work on
+most machines.
 
-`JOB_CONCURRENCY` is how many films and tv shows the consumers should process at once. As this affects every consumer this will likely cause exponential
- strain on your system. It's probably best to leave this at 5, but you can try experimenting with it if you wish.
+`JOB_CONCURRENCY` is how many films and tv shows each consumer should process at once. As this affects every consumer,
+raising it multiplies the
+strain on your system. It's probably best to leave this at 5, but you can experiment with it if you wish.
 
-`MAX_CONNECTIONS_PER_TORRENT` is how many peers the consumer will attempt to connect to when it is trying to collect metadata.
-Increasing this value can speed up processing, but you will eventually reach a point where more connections are being made than
-your router can handle. This will then cause a cascading fail where your internet stops working. If you are going to increase this value
+`MAX_CONNECTIONS_PER_TORRENT` is how many peers the consumer will attempt to connect to when it is trying to collect
+metadata.
+Increasing this value can speed up processing, but you will eventually reach a point where more connections are being
+made than
+your router can handle. This will then cause a cascading failure where your internet stops working. If you are going to
+increase this value
 then try increasing it by 10 at a time.
 
-> Increasing this value increases the max connections for every parallel job for every consumer. For example
-> with the default values above this means that Knight Crawler will be on average making `(5 x 3) x 10 = 150` connections at any one time.
+> Increasing this value increases the max connections for every parallel job, for every consumer. For example,
+> with the default values above this means that Knight Crawler will, on average, be making `(5 x 3) x 10 = 150`
+> connections at any one time.
 > {style="warning"}
 
-`CONSUMER_REPLICAS` is how many consumers should be started. This is the ultimate decider in how fast you will be able to
-add films and tv shows to your database. However, this is also going to be the most intensive service you will run.
-The default of 3 is a reasonable starting amount. It will work on almost every system.
\ No newline at end of file
+`CONSUMER_REPLICAS` is how many consumers should be started initially. This is best kept below 10, as GitHub rate limits
+how fast we can fetch the list of torrent trackers. You can increase or decrease the number of consumers whilst the service is running with the command `docker compose up --scale consumer=<number>`. This value is best increased by 5 at a time.
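As a sanity check before scaling, the peak-connection arithmetic from the warning above can be computed directly. A minimal sketch in shell, assuming the default values from this guide (substitute your own settings):

```shell
# Hypothetical values matching the defaults in this guide
JOB_CONCURRENCY=5
CONSUMER_REPLICAS=3
MAX_CONNECTIONS_PER_TORRENT=10

# Rough peak peer-connection estimate:
# jobs per consumer x consumer replicas x connections per torrent
echo $(( JOB_CONCURRENCY * CONSUMER_REPLICAS * MAX_CONNECTIONS_PER_TORRENT ))  # prints 150
```

Re-run the estimate whenever you change one of these values; if the total climbs past what your router can cope with, scale back before restarting the consumers.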
Repeat this process until you have reached the desired number of consumers.
+
+### GitHub personal access token