Compare commits

16 commits (author and date columns were not captured):

- 594320ed63
- a7d5944d25
- c053a5f8da
- 5611d3776f
- 833ac11a96
- 16d8707c48
- 6dfbaa4739
- 03b5617312
- 19cb42af77
- 9344531b34
- 723aa6b6a0
- e17b476801
- 2a414d8bc0
- 9b5f454e6e
- ad9549c695
- 1e85cb00ff
`.github/ISSUE_TEMPLATE/bug_report.md` (4 changes, vendored)

```diff
@@ -12,6 +12,9 @@ A clear and concise description of what the bug is.
 
 **To Reproduce**
 Steps to reproduce the behavior:
+1.
+2.
+3.
 
 **Expected behavior**
 A clear and concise description of what you expected to happen.
@@ -23,6 +26,7 @@ If the logs are short, make sure to triple backtick them, or use https://pastebi
 **Hardware:**
 - OS and distro: [e.g. Raspberry Pi OS, Ubuntu, Rocky]
 - Server: [e.g. VM, Baremetal, Pi]
+- Knightcrawler Version: [2.0.xx]
 
 **Additional context**
 Add any other context about the problem here.
```
`.github/workflows/git_cliff.yml` (new file, 39 lines; indentation reconstructed)

```yaml
on:
  push:
    branches:
      - main
  workflow_dispatch:

jobs:
  changelog:
    name: Generate changelog
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v4
        with:
          fetch-depth: 0

      - name: Generate a changelog
        uses: orhun/git-cliff-action@v3
        with:
          config: cliff.toml
          args: --verbose
        env:
          OUTPUT: CHANGELOG.md
          GITHUB_REPO: ${{ github.repository }}

      - name: Commit
        run: |
          git config user.name 'github-actions[bot]'
          git config user.email 'github-actions[bot]@users.noreply.github.com'
          set +e
          git checkout -b feat/changelog_$(date +"%d_%m")
          git add CHANGELOG.md
          git commit -m "[skip ci] Update changelog"
          git push https://${{ secrets.GITHUB_TOKEN }}@github.com/${{ github.repository }}.git feat/changelog_$(date +"%d_%m")

      - name: create pull request
        run: gh pr create -B main -H feat/changelog_$(date +"%d_%m") --title '[skip ci] Update changelog' --body 'Changelog update by git-cliff'
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
```
`CHANGELOG.md` (new file, 22 lines)

```markdown
# Changelog

All notable changes to this project will be documented in this file.

The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## [1.0.0] - 2024-03-25

### Details

#### Changed
- Change POSTGRES_USERNAME to POSTGRES_USER. Oops by @purple-emily
- Change POSTGRES_DATABASE to POSTGRES_DB by @purple-emily
- Two movie commands instead of movie and tv by @purple-emily
- Cleanup RabbitMQ env vars, and Github Pat by @iPromKnight

#### Fixed
- HRD -> HDR by @mplewis

## New Contributors
* @mplewis made their first contribution

<!-- generated by git-cliff -->
```
`CONTRIBUTING.md` (new file, 34 lines)

````markdown
We use [Meaningful commit messages](https://reflectoring.io/meaningful-commit-messages/).

Tl;dr:
1. It should answer the question: "What happens if the changes are applied?"
2. Use the imperative, present tense. It is easier to read and scan quickly:
   ```
   Right: Add feature to alert admin for new user registration
   Wrong: Added feature ... (past tense)
   ```
3. The summary should always be able to complete the following sentence:
   `If applied, this commit will… `

We use [git-cliff] for our changelog.

The breaking flag is set to true when the commit has an exclamation mark after the commit type and scope, e.g.:
`feat(scope)!: this is a breaking change`

Keywords (commit messages should start with these):
```
# Added
add
support

# Removed
remove
delete

# Fixed
test
fix
```

Any other commits will fall under the `Changed` category.

This project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html)
````
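The breaking-flag rule described above can be checked mechanically. A minimal Python sketch (the regex here is an illustration, not git-cliff's own parser):

```python
import re

# Conventional-commit header: type, optional (scope), optional "!" breaking marker.
HEADER = re.compile(r"^(?P<type>\w+)(\((?P<scope>[^)]*)\))?(?P<breaking>!)?: ")

def is_breaking(message: str) -> bool:
    """Return True when the commit header carries the '!' breaking-change flag."""
    m = HEADER.match(message)
    return bool(m and m.group("breaking"))

print(is_breaking("feat(scope)!: this is a breaking change"))  # True
print(is_breaking("fix: HRD -> HDR"))                          # False
```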
`README.md` (24 changes)

````diff
@@ -67,30 +67,6 @@ Then set any of the values you wouldd like to customize.
 
 By default, Knight Crawler is configured to be *relatively* conservative in its resource usage. If running on a decent machine (16GB RAM, i5+ or equivalent), you can increase some settings to increase consumer throughput. This is especially helpful if you have a large backlog from [importing databases](#importing-external-dumps).
 
-### DebridMediaManager setup (optional)
-
-There are some optional steps you should take to maximise the number of movies/tv shows we can find.
-
-We can search DebridMediaManager hash lists which are hosted on GitHub. This allows us to add hundreds of thousands of movies and tv shows, but it requires a Personal Access Token to be generated. The software only needs read access and only for public repositories. To generate one, please follow these steps:
-
-1. Navigate to GitHub settings -> Developer Settings -> Personal access tokens -> Fine-grained tokens (click [here](https://github.com/settings/tokens?type=beta) for a direct link)
-2. Press `Generate new token`
-3. Fill out the form (example data below):
-   ```
-   Token name:
-   KnightCrawler
-   Expiration:
-   90 days
-   Description:
-   <blank>
-   Repository access
-   (checked) Public Repositories (read-only)
-   ```
-4. Click `Generate token`
-5. Take the new token and add it to the bottom of the [stack.env](deployment/docker/stack.env) file
-   ```
-   GITHUB_PAT=<YOUR TOKEN HERE>
-   ```
 ### Configure external access
 
 Please choose which applies to you:
````
`cliff.toml` (new file, 112 lines; template indentation reconstructed)

```toml
# git-cliff ~ configuration file
# https://git-cliff.org/docs/configuration

[changelog]
# changelog header
header = """
# Changelog\n
All notable changes to this project will be documented in this file.

The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).\n
"""
# template for the changelog body
# https://keats.github.io/tera/docs/#introduction
body = """
{%- macro remote_url() -%}
https://github.com/{{ remote.github.owner }}/{{ remote.github.repo }}
{%- endmacro -%}

{% if version -%}
## [{{ version | trim_start_matches(pat="v") }}] - {{ timestamp | date(format="%Y-%m-%d") }}
{% else -%}
## [Unreleased]
{% endif -%}

### Details\

{% for group, commits in commits | group_by(attribute="group") %}
#### {{ group | upper_first }}
{%- for commit in commits %}
- {{ commit.message | upper_first | trim }}\
{% if commit.github.username %} by @{{ commit.github.username }}{%- endif -%}
{% if commit.github.pr_number %} in \
[#{{ commit.github.pr_number }}]({{ self::remote_url() }}/pull/{{ commit.github.pr_number }}) \
{%- endif -%}
{% endfor %}
{% endfor %}

{%- if github.contributors | filter(attribute="is_first_time", value=true) | length != 0 %}
## New Contributors
{%- endif -%}

{% for contributor in github.contributors | filter(attribute="is_first_time", value=true) %}
* @{{ contributor.username }} made their first contribution
{%- if contributor.pr_number %} in \
[#{{ contributor.pr_number }}]({{ self::remote_url() }}/pull/{{ contributor.pr_number }}) \
{%- endif %}
{%- endfor %}\n
"""
# template for the changelog footer
footer = """
{%- macro remote_url() -%}
https://github.com/{{ remote.github.owner }}/{{ remote.github.repo }}
{%- endmacro -%}

{% for release in releases -%}
{% if release.version -%}
{% if release.previous.version -%}
[{{ release.version | trim_start_matches(pat="v") }}]: \
{{ self::remote_url() }}/compare/{{ release.previous.version }}..{{ release.version }}
{% endif -%}
{% else -%}
[unreleased]: {{ self::remote_url() }}/compare/{{ release.previous.version }}..HEAD
{% endif -%}
{% endfor %}
<!-- generated by git-cliff -->
"""
# remove the leading and trailing whitespace from the templates
trim = true

[git]
# parse the commits based on https://www.conventionalcommits.org
conventional_commits = true
# filter out the commits that are not conventional
filter_unconventional = true
# process each line of a commit as an individual commit
split_commits = false
# regex for preprocessing the commit messages
commit_preprocessors = [
  # remove issue numbers from commits
  { pattern = '\((\w+\s)?#([0-9]+)\)', replace = "" },
]
# regex for parsing and grouping commits
commit_parsers = [
  { message = "^.*: add", group = "Added" },
  { message = "^add", group = "Added" },
  { message = "^.*: support", group = "Added" },
  { message = "^support", group = "Added" },
  { message = "^.*: remove", group = "Removed" },
  { message = "^remove", group = "Removed" },
  { message = "^.*: delete", group = "Removed" },
  { message = "^delete", group = "Removed" },
  { message = "^.*: test", group = "Fixed" },
  { message = "^test", group = "Fixed" },
  { message = "^.*: fix", group = "Fixed" },
  { message = "^fix", group = "Fixed" },
  { message = "^.*", group = "Changed" },
]
# protect breaking changes from being skipped due to matching a skipping commit_parser
protect_breaking_commits = false
# filter out the commits that are not matched by commit parsers
filter_commits = true
# regex for matching git tags
tag_pattern = "v[0-9].*"
# regex for skipping tags
skip_tags = "v0.1.0-beta.1"
# regex for ignoring tags
ignore_tags = ""
# sort the tags topologically
topo_order = false
# sort the commits inside sections by oldest/newest order
sort_commits = "oldest"
```
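The `commit_parsers` list is first-match-wins, so the catch-all `Changed` entry must come last. A rough Python sketch of how those regexes bucket commit messages (an illustration, not git-cliff itself):

```python
import re

# Ordered like commit_parsers in cliff.toml: the first matching pattern wins.
PARSERS = [
    (r"^.*: add|^add|^.*: support|^support", "Added"),
    (r"^.*: remove|^remove|^.*: delete|^delete", "Removed"),
    (r"^.*: test|^test|^.*: fix|^fix", "Fixed"),
    (r"^.*", "Changed"),  # catch-all, must stay last
]

def group_commit(message: str) -> str:
    for pattern, group in PARSERS:
        if re.match(pattern, message):
            return group
    return "Changed"

print(group_commit("fix: HRD -> HDR"))       # Fixed
print(group_commit("add qbit collector"))    # Added
print(group_commit("chore: bump versions"))  # Changed
```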
Docker Compose stack file (filename not captured; every service image bumped 2.0.24 → 2.0.26):

```diff
@@ -94,7 +94,7 @@ services:
         condition: service_healthy
     env_file: stack.env
     hostname: knightcrawler-addon
-    image: gabisonfire/knightcrawler-addon:2.0.24
+    image: gabisonfire/knightcrawler-addon:2.0.26
    labels:
       logging: promtail
     networks:
@@ -117,7 +117,7 @@ services:
       redis:
         condition: service_healthy
     env_file: stack.env
-    image: gabisonfire/knightcrawler-consumer:2.0.24
+    image: gabisonfire/knightcrawler-consumer:2.0.26
     labels:
       logging: promtail
     networks:
@@ -138,7 +138,7 @@ services:
       redis:
         condition: service_healthy
     env_file: stack.env
-    image: gabisonfire/knightcrawler-debrid-collector:2.0.24
+    image: gabisonfire/knightcrawler-debrid-collector:2.0.26
     labels:
       logging: promtail
     networks:
@@ -152,7 +152,7 @@ services:
       migrator:
         condition: service_completed_successfully
     env_file: stack.env
-    image: gabisonfire/knightcrawler-metadata:2.0.24
+    image: gabisonfire/knightcrawler-metadata:2.0.26
     networks:
       - knightcrawler-network
     restart: "no"
@@ -163,7 +163,7 @@ services:
       postgres:
         condition: service_healthy
     env_file: stack.env
-    image: gabisonfire/knightcrawler-migrator:2.0.24
+    image: gabisonfire/knightcrawler-migrator:2.0.26
     networks:
       - knightcrawler-network
     restart: "no"
@@ -182,7 +182,7 @@ services:
       redis:
         condition: service_healthy
     env_file: stack.env
-    image: gabisonfire/knightcrawler-producer:2.0.24
+    image: gabisonfire/knightcrawler-producer:2.0.26
     labels:
       logging: promtail
     networks:
@@ -207,7 +207,7 @@ services:
     deploy:
       replicas: ${QBIT_REPLICAS:-0}
     env_file: stack.env
-    image: gabisonfire/knightcrawler-qbit-collector:2.0.24
+    image: gabisonfire/knightcrawler-qbit-collector:2.0.26
     labels:
       logging: promtail
     networks:
```
Second Compose file (filename not captured; same 2.0.24 → 2.0.26 bumps):

```diff
@@ -20,7 +20,7 @@ x-depends: &knightcrawler-app-depends
 
 services:
   metadata:
-    image: gabisonfire/knightcrawler-metadata:2.0.24
+    image: gabisonfire/knightcrawler-metadata:2.0.26
    env_file: ../../.env
     networks:
       - knightcrawler-network
@@ -30,7 +30,7 @@ services:
         condition: service_completed_successfully
 
   migrator:
-    image: gabisonfire/knightcrawler-migrator:2.0.24
+    image: gabisonfire/knightcrawler-migrator:2.0.26
     env_file: ../../.env
     networks:
       - knightcrawler-network
@@ -40,7 +40,7 @@ services:
         condition: service_healthy
 
   addon:
-    image: gabisonfire/knightcrawler-addon:2.0.24
+    image: gabisonfire/knightcrawler-addon:2.0.26
     <<: [*knightcrawler-app, *knightcrawler-app-depends]
     restart: unless-stopped
     hostname: knightcrawler-addon
@@ -48,22 +48,22 @@ services:
       - "7000:7000"
 
   consumer:
-    image: gabisonfire/knightcrawler-consumer:2.0.24
+    image: gabisonfire/knightcrawler-consumer:2.0.26
     <<: [*knightcrawler-app, *knightcrawler-app-depends]
     restart: unless-stopped
 
   debridcollector:
-    image: gabisonfire/knightcrawler-debrid-collector:2.0.24
+    image: gabisonfire/knightcrawler-debrid-collector:2.0.26
     <<: [*knightcrawler-app, *knightcrawler-app-depends]
     restart: unless-stopped
 
   producer:
-    image: gabisonfire/knightcrawler-producer:2.0.24
+    image: gabisonfire/knightcrawler-producer:2.0.26
     <<: [*knightcrawler-app, *knightcrawler-app-depends]
     restart: unless-stopped
 
   qbitcollector:
-    image: gabisonfire/knightcrawler-qbit-collector:2.0.24
+    image: gabisonfire/knightcrawler-qbit-collector:2.0.26
     <<: [*knightcrawler-app, *knightcrawler-app-depends]
     restart: unless-stopped
     depends_on:
```
`src/addon/package-lock.json` (2 changes, generated)

```diff
@@ -12,7 +12,7 @@
         "@redis/client": "^1.5.14",
         "@redis/json": "^1.0.6",
         "@redis/search": "^1.1.6",
-        "all-debrid-api": "^1.1.0",
+        "all-debrid-api": "^1.2.0",
         "axios": "^1.6.1",
         "bottleneck": "^2.19.5",
         "cache-manager": "^3.4.4",
```

A second package manifest (filename not captured) gets the same dependency bump:

```diff
@@ -10,7 +10,7 @@
       },
       "dependencies": {
         "@putdotio/api-client": "^8.42.0",
-        "all-debrid-api": "^1.1.0",
+        "all-debrid-api": "^1.2.0",
         "axios": "^1.6.1",
         "bottleneck": "^2.19.5",
         "cache-manager": "^3.4.4",
```
The addon's stream handler gains a filtering step:

```diff
@@ -3,6 +3,7 @@ import { addonBuilder } from 'stremio-addon-sdk';
 import { cacheWrapStream } from './lib/cache.js';
 import { dummyManifest } from './lib/manifest.js';
 import * as repository from './lib/repository.js';
+import applyFilters from "./lib/filter.js";
 import applySorting from './lib/sort.js';
 import { toStreamInfo, applyStaticInfo } from './lib/streamInfo.js';
 import { Type } from './lib/types.js';
@@ -32,6 +33,7 @@ builder.defineStreamHandler((args) => {
     .then(records => records
       .sort((a, b) => b.torrent.seeders - a.torrent.seeders || b.torrent.uploadDate - a.torrent.uploadDate)
       .map(record => toStreamInfo(record)))))
+    .then(streams => applyFilters(streams, args.extra))
     .then(streams => applySorting(streams, args.extra))
     .then(streams => applyStaticInfo(streams))
     .then(streams => applyMochs(streams, args.extra))
```
The repository queries switch their Torrent include to a required (inner) join:

```diff
@@ -84,7 +84,7 @@ export function getImdbIdMovieEntries(imdbId) {
     where: {
       imdbId: { [Op.eq]: imdbId }
     },
-    include: [Torrent],
+    include: { model: Torrent, required: true },
     limit: 500,
     order: [
       [Torrent, 'size', 'DESC']
@@ -99,7 +99,7 @@ export function getImdbIdSeriesEntries(imdbId, season, episode) {
       imdbSeason: { [Op.eq]: season },
       imdbEpisode: { [Op.eq]: episode }
     },
-    include: [Torrent],
+    include: { model: Torrent, required: true },
     limit: 500,
     order: [
       [Torrent, 'size', 'DESC']
@@ -112,7 +112,7 @@ export function getKitsuIdMovieEntries(kitsuId) {
     where: {
       kitsuId: { [Op.eq]: kitsuId }
     },
-    include: [Torrent],
+    include: { model: Torrent, required: true },
     limit: 500,
     order: [
       [Torrent, 'size', 'DESC']
@@ -126,7 +126,7 @@ export function getKitsuIdSeriesEntries(kitsuId, episode) {
       kitsuId: { [Op.eq]: kitsuId },
       kitsuEpisode: { [Op.eq]: episode }
     },
-    include: [Torrent],
+    include: { model: Torrent, required: true },
     limit: 500,
     order: [
       [Torrent, 'size', 'DESC']
```
@@ -20,7 +20,7 @@ export function toStreamInfo(record) {
|
|||||||
const title = joinDetailParts(
|
const title = joinDetailParts(
|
||||||
[
|
[
|
||||||
joinDetailParts([record.torrent.title.replace(/[, ]+/g, ' ')]),
|
joinDetailParts([record.torrent.title.replace(/[, ]+/g, ' ')]),
|
||||||
joinDetailParts([!sameInfo && record.title || undefined]),
|
joinDetailParts([record.title || undefined]),
|
||||||
joinDetailParts([
|
joinDetailParts([
|
||||||
joinDetailParts([formatSize(record.size)], '💾 ')
|
joinDetailParts([formatSize(record.size)], '💾 ')
|
||||||
]),
|
]),
|
||||||
|
|||||||
A Dockerfile relaxes its Python pin from a patch-level Alpine package version to the minor series:

```diff
@@ -15,7 +15,7 @@ WORKDIR /app
 
 ENV PYTHONUNBUFFERED=1
 
-RUN apk add --update --no-cache python3=~3.11.8-r0 py3-pip && ln -sf python3 /usr/bin/python
+RUN apk add --update --no-cache python3=~3.11 py3-pip && ln -sf python3 /usr/bin/python
 
 COPY --from=build /src/out .
 
```
`PostgresConfiguration` gains a configurable command timeout:

```diff
@@ -8,12 +8,14 @@ public class PostgresConfiguration
     private const string PasswordVariable = "PASSWORD";
     private const string DatabaseVariable = "DB";
     private const string PortVariable = "PORT";
+    private const string CommandTimeoutVariable = "COMMAND_TIMEOUT_SEC"; // Seconds
 
     private string Host { get; init; } = Prefix.GetRequiredEnvironmentVariableAsString(HostVariable);
     private string Username { get; init; } = Prefix.GetRequiredEnvironmentVariableAsString(UsernameVariable);
     private string Password { get; init; } = Prefix.GetRequiredEnvironmentVariableAsString(PasswordVariable);
     private string Database { get; init; } = Prefix.GetRequiredEnvironmentVariableAsString(DatabaseVariable);
     private int PORT { get; init; } = Prefix.GetEnvironmentVariableAsInt(PortVariable, 5432);
+    private int CommandTimeout { get; init; } = Prefix.GetEnvironmentVariableAsInt(CommandTimeoutVariable, 300);
 
-    public string StorageConnectionString => $"Host={Host};Port={PORT};Username={Username};Password={Password};Database={Database};";
+    public string StorageConnectionString => $"Host={Host};Port={PORT};Username={Username};Password={Password};Database={Database};CommandTimeout={CommandTimeout}";
 }
```
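The same pattern — required env vars plus a defaulted timeout folded into the connection string — in a Python sketch. The `POSTGRES_` prefix and the `HOST`/`USER` variable names are assumptions (the diff only shows `PASSWORD`, `DB`, `PORT`, and `COMMAND_TIMEOUT_SEC`):

```python
import os

PREFIX = "POSTGRES_"  # assumed prefix; the C# reads it from a Prefix helper

def env_str(name: str) -> str:
    """Required variable: raise if missing, mirroring GetRequiredEnvironmentVariableAsString."""
    value = os.environ.get(PREFIX + name)
    if value is None:
        raise KeyError(f"missing required variable {PREFIX + name}")
    return value

def env_int(name: str, default: int) -> int:
    """Optional integer variable with a default, mirroring GetEnvironmentVariableAsInt."""
    raw = os.environ.get(PREFIX + name)
    return int(raw) if raw is not None else default

def storage_connection_string() -> str:
    return (
        f"Host={env_str('HOST')};Port={env_int('PORT', 5432)};"
        f"Username={env_str('USER')};Password={env_str('PASSWORD')};"
        f"Database={env_str('DB')};CommandTimeout={env_int('COMMAND_TIMEOUT_SEC', 300)}"
    )
```

Note the default of 300 seconds replaces the previously hard-coded 3000 in `CreateNpgsqlConnection` (removed in the next hunk).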
`ImdbDbService` drops its connection-string-builder helper (which hard-coded `CommandTimeout = 3000`) and uses the configured connection string directly:

```diff
@@ -134,7 +134,7 @@ public class ImdbDbService(PostgresConfiguration configuration, ILogger<ImdbDbSe
     {
         try
         {
-            await using var connection = CreateNpgsqlConnection();
+            await using var connection = new NpgsqlConnection(configuration.StorageConnectionString);
             await connection.OpenAsync();
 
             await operation(connection);
@@ -145,16 +145,6 @@ public class ImdbDbService(PostgresConfiguration configuration, ILogger<ImdbDbSe
         }
     }
 
-    private NpgsqlConnection CreateNpgsqlConnection()
-    {
-        var connectionStringBuilder = new NpgsqlConnectionStringBuilder(configuration.StorageConnectionString)
-        {
-            CommandTimeout = 3000,
-        };
-
-        return new(connectionStringBuilder.ConnectionString);
-    }
-
     private async Task ExecuteCommandWithTransactionAsync(Func<NpgsqlConnection, NpgsqlTransaction, Task> operation, NpgsqlTransaction transaction, string errorMessage)
     {
         try
```
```diff
@@ -13,7 +13,7 @@
     <PackageReference Include="Dapper" Version="2.1.35" />
     <PackageReference Include="Microsoft.Extensions.Hosting" Version="8.0.0" />
     <PackageReference Include="Microsoft.Extensions.Http" Version="8.0.0" />
-    <PackageReference Include="Npgsql" Version="8.0.2" />
+    <PackageReference Include="Npgsql" Version="8.0.3" />
     <PackageReference Include="Serilog" Version="3.1.1" />
     <PackageReference Include="Serilog.AspNetCore" Version="8.0.1" />
     <PackageReference Include="Serilog.Sinks.Console" Version="5.0.1" />
```
The EZTV sync job points at a new mirror:

```diff
@@ -5,7 +5,7 @@
       "Name": "SyncEzTvJob",
       "IntervalSeconds": 60,
       "Enabled": true,
-      "Url": "https://eztv1.xyz/ezrss.xml",
+      "Url": "https://eztvx.to/ezrss.xml",
       "XmlNamespace": "http://xmlns.ezrss.it/0.1/"
     },
     {
```
A second Dockerfile gets the same relaxed Python pin:

```diff
@@ -14,7 +14,7 @@ WORKDIR /app
 
 ENV PYTHONUNBUFFERED=1
 
-RUN apk add --update --no-cache python3=~3.11.8-r0 py3-pip && ln -sf python3 /usr/bin/python
+RUN apk add --update --no-cache python3=~3.11 py3-pip && ln -sf python3 /usr/bin/python
 
 COPY --from=build /src/out .
 
```
`DebridMediaManagerCrawler` makes its JSON parsing tolerant of two payload shapes and normalises brace spacing (some hunks below show only their captured context lines; the capture ends mid-hunk):

```diff
@@ -14,7 +14,7 @@ public partial class DebridMediaManagerCrawler(
     protected override string Source => "DMM";
 
     private const int ParallelismCount = 4;
 
     public override async Task Execute()
     {
         var tempDirectory = await dmmFileDownloader.DownloadFileToTempPath(CancellationToken.None);
@@ -24,7 +24,7 @@ public partial class DebridMediaManagerCrawler(
         logger.LogInformation("Found {Files} files to parse", files.Length);
 
         var options = new ParallelOptions { MaxDegreeOfParallelism = ParallelismCount };
 
         await Parallel.ForEachAsync(files, options, async (file, token) =>
         {
             var fileName = Path.GetFileName(file);
@@ -69,9 +69,9 @@ public partial class DebridMediaManagerCrawler(
             if (page.TryGetValue(infoHash, out var dmmContent) &&
                 successfulResponses.TryGetValue(dmmContent.Filename, out var parsedResponse))
             {
-                page[infoHash] = dmmContent with {ParseResponse = parsedResponse};
+                page[infoHash] = dmmContent with { ParseResponse = parsedResponse };
             }
 
             return ValueTask.CompletedTask;
         });
     }
@@ -86,7 +86,7 @@ public partial class DebridMediaManagerCrawler(
         }
 
         var pageSource = await File.ReadAllTextAsync(filePath);
 
         var match = HashCollectionMatcher().Match(pageSource);
 
         if (!match.Success)
@@ -106,9 +106,34 @@ public partial class DebridMediaManagerCrawler(
 
         var decodedJson = LZString.DecompressFromEncodedURIComponent(encodedJson.Value);
 
-        var json = JsonDocument.Parse(decodedJson);
-
-        var torrents = await json.RootElement.EnumerateArray()
+        JsonElement arrayToProcess;
+        try
+        {
+            var json = JsonDocument.Parse(decodedJson);
+
+            if (json.RootElement.ValueKind == JsonValueKind.Object &&
+                json.RootElement.TryGetProperty("torrents", out var torrentsProperty) &&
+                torrentsProperty.ValueKind == JsonValueKind.Array)
+            {
+                arrayToProcess = torrentsProperty;
+            }
+            else if (json.RootElement.ValueKind == JsonValueKind.Array)
+            {
+                arrayToProcess = json.RootElement;
+            }
+            else
+            {
+                logger.LogWarning("Unexpected JSON format in {Name}", name);
+                return [];
+            }
+        }
+        catch (Exception ex)
+        {
+            logger.LogError("Failed to parse JSON {decodedJson} for {Name}: {Exception}", decodedJson, name, ex);
+            return [];
+        }
+
+        var torrents = await arrayToProcess.EnumerateArray()
             .ToAsyncEnumerable()
             .Select(ParsePageContent)
             .Where(t => t is not null)
@@ -120,7 +145,7 @@ public partial class DebridMediaManagerCrawler(
             await Storage.MarkPageAsIngested(filenameOnly);
             return [];
         }
 
         var torrentDictionary = torrents
             .Where(x => x is not null)
             .GroupBy(x => x.InfoHash)
@@ -141,7 +166,7 @@ public partial class DebridMediaManagerCrawler(
         {
             var (infoHash, dmmContent) = kvp;
             var parsedTorrent = dmmContent.ParseResponse;
-            if (parsedTorrent is not {Success: true})
+            if (parsedTorrent is not { Success: true })
             {
                 return;
             }
@@ -192,7 +217,7 @@ public partial class DebridMediaManagerCrawler(
             Category = AssignCategory(result),
             RtnResponse = parsedTorrent.Response.ToJson(),
```
|
||||||
};
|
};
|
||||||
|
|
||||||
|
|
||||||
private Task AddToCache(string cacheKey, ImdbEntry best)
|
private Task AddToCache(string cacheKey, ImdbEntry best)
|
||||||
{
|
{
|
||||||
@@ -200,19 +225,19 @@ public partial class DebridMediaManagerCrawler(
|
|||||||
{
|
{
|
||||||
AbsoluteExpirationRelativeToNow = TimeSpan.FromDays(1),
|
AbsoluteExpirationRelativeToNow = TimeSpan.FromDays(1),
|
||||||
};
|
};
|
||||||
|
|
||||||
return cache.SetStringAsync(cacheKey, JsonSerializer.Serialize(best), cacheOptions);
|
return cache.SetStringAsync(cacheKey, JsonSerializer.Serialize(best), cacheOptions);
|
||||||
}
|
}
|
||||||
|
|
||||||
private async Task<(bool Success, ImdbEntry? Entry)> CheckIfInCacheAndReturn(string cacheKey)
|
private async Task<(bool Success, ImdbEntry? Entry)> CheckIfInCacheAndReturn(string cacheKey)
|
||||||
{
|
{
|
||||||
var cachedImdbId = await cache.GetStringAsync(cacheKey);
|
var cachedImdbId = await cache.GetStringAsync(cacheKey);
|
||||||
|
|
||||||
if (!string.IsNullOrEmpty(cachedImdbId))
|
if (!string.IsNullOrEmpty(cachedImdbId))
|
||||||
{
|
{
|
||||||
return (true, JsonSerializer.Deserialize<ImdbEntry>(cachedImdbId));
|
return (true, JsonSerializer.Deserialize<ImdbEntry>(cachedImdbId));
|
||||||
}
|
}
|
||||||
|
|
||||||
return (false, null);
|
return (false, null);
|
||||||
}
|
}
|
||||||
|
|
||||||
@@ -222,7 +247,7 @@ public partial class DebridMediaManagerCrawler(
|
|||||||
|
|
||||||
return (pageIngested, filename);
|
return (pageIngested, filename);
|
||||||
}
|
}
|
||||||
|
|
||||||
private static string AssignCategory(ImdbEntry entry) =>
|
private static string AssignCategory(ImdbEntry entry) =>
|
||||||
entry.Category.ToLower() switch
|
entry.Category.ToLower() switch
|
||||||
{
|
{
|
||||||
@@ -230,9 +255,9 @@ public partial class DebridMediaManagerCrawler(
|
|||||||
var category when string.Equals(category, "tvSeries", StringComparison.OrdinalIgnoreCase) => "tv",
|
var category when string.Equals(category, "tvSeries", StringComparison.OrdinalIgnoreCase) => "tv",
|
||||||
_ => "unknown",
|
_ => "unknown",
|
||||||
};
|
};
|
||||||
|
|
||||||
private static string GetCacheKey(string category, string title, int year) => $"{category.ToLowerInvariant()}:{year}:{title.ToLowerInvariant()}";
|
private static string GetCacheKey(string category, string title, int year) => $"{category.ToLowerInvariant()}:{year}:{title.ToLowerInvariant()}";
|
||||||
|
|
||||||
private static ExtractedDMMContent? ParsePageContent(JsonElement item)
|
private static ExtractedDMMContent? ParsePageContent(JsonElement item)
|
||||||
{
|
{
|
||||||
if (!item.TryGetProperty("filename", out var filenameElement) ||
|
if (!item.TryGetProperty("filename", out var filenameElement) ||
|
||||||
@@ -241,10 +266,10 @@ public partial class DebridMediaManagerCrawler(
|
|||||||
{
|
{
|
||||||
return null;
|
return null;
|
||||||
}
|
}
|
||||||
|
|
||||||
return new(filenameElement.GetString(), bytesElement.GetInt64(), hashElement.GetString());
|
return new(filenameElement.GetString(), bytesElement.GetInt64(), hashElement.GetString());
|
||||||
}
|
}
|
||||||
|
|
||||||
private record DmmContent(string Filename, long Bytes, ParseTorrentTitleResponse? ParseResponse);
|
private record DmmContent(string Filename, long Bytes, ParseTorrentTitleResponse? ParseResponse);
|
||||||
private record ExtractedDMMContent(string Filename, long Bytes, string InfoHash);
|
private record ExtractedDMMContent(string Filename, long Bytes, string InfoHash);
|
||||||
private record RtnBatchProcessable(string InfoHash, string Filename);
|
private record RtnBatchProcessable(string InfoHash, string Filename);
|
||||||
|
|||||||
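The crawler change above makes the hashlist parser tolerant of two layouts: a bare JSON array of torrents, or an object wrapping that array under a `torrents` key, with malformed JSON logged and skipped. A minimal Python sketch of the same fallback logic (the helper name `extract_torrents` is illustrative, not part of the C# code):

```python
import json


def extract_torrents(decoded_json: str) -> list:
    """Return the list of torrent entries, tolerating both layouts.

    Mirrors the C# change: {"torrents": [...]} or a bare [...] are
    accepted; parse failures and unexpected shapes yield an empty list,
    as the crawler skips such pages.
    """
    try:
        root = json.loads(decoded_json)
    except json.JSONDecodeError:
        return []  # parse failure -> skip the page

    if isinstance(root, dict) and isinstance(root.get("torrents"), list):
        return root["torrents"]  # object layout: {"torrents": [...]}
    if isinstance(root, list):
        return root  # legacy layout: bare array
    return []  # unexpected format
```

For example, `extract_torrents('{"torrents": [{"filename": "a"}]}')` and `extract_torrents('[{"filename": "a"}]')` both return the same inner list.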
@@ -9,6 +9,7 @@ public static class ServiceCollectionExtensions
         client.BaseAddress = new("https://github.com/debridmediamanager/hashlists/zipball/main/");
         client.DefaultRequestHeaders.Add("Accept-Encoding", "gzip");
         client.DefaultRequestHeaders.UserAgent.ParseAdd("curl");
+        client.Timeout = TimeSpan.FromMinutes(10); // 10 minute timeout, #217
     });

     return services;
@@ -15,7 +15,7 @@ WORKDIR /app

 ENV PYTHONUNBUFFERED=1

-RUN apk add --update --no-cache python3=~3.11.8-r0 py3-pip && ln -sf python3 /usr/bin/python
+RUN apk add --update --no-cache python3=~3.11 py3-pip && ln -sf python3 /usr/bin/python

 COPY --from=build /src/out .
@@ -8,12 +8,14 @@ public class PostgresConfiguration
     private const string PasswordVariable = "PASSWORD";
     private const string DatabaseVariable = "DB";
     private const string PortVariable = "PORT";
+    private const string CommandTimeoutVariable = "COMMAND_TIMEOUT_SEC"; // Seconds

     private string Host { get; init; } = Prefix.GetRequiredEnvironmentVariableAsString(HostVariable);
     private string Username { get; init; } = Prefix.GetRequiredEnvironmentVariableAsString(UsernameVariable);
     private string Password { get; init; } = Prefix.GetRequiredEnvironmentVariableAsString(PasswordVariable);
     private string Database { get; init; } = Prefix.GetRequiredEnvironmentVariableAsString(DatabaseVariable);
     private int PORT { get; init; } = Prefix.GetEnvironmentVariableAsInt(PortVariable, 5432);
+    private int CommandTimeout { get; init; } = Prefix.GetEnvironmentVariableAsInt(CommandTimeoutVariable, 300);

-    public string StorageConnectionString => $"Host={Host};Port={PORT};Username={Username};Password={Password};Database={Database};";
+    public string StorageConnectionString => $"Host={Host};Port={PORT};Username={Username};Password={Password};Database={Database};CommandTimeout={CommandTimeout}";
 }
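The configuration change above reads `COMMAND_TIMEOUT_SEC` from the environment with a 300-second default and appends it to the Npgsql connection string. A rough Python sketch of that lookup-with-default pattern (the `POSTGRES` prefix and `USER` variable name are assumptions standing in for the project's `Prefix` helper and `UsernameVariable` constant, which are not shown in this diff):

```python
import os


def build_connection_string(prefix: str = "POSTGRES") -> str:
    """Assemble an Npgsql-style connection string from environment variables.

    Required: HOST, USER, PASSWORD, DB. Optional with defaults:
    PORT (5432) and COMMAND_TIMEOUT_SEC (300).
    """
    def req(name: str) -> str:
        value = os.environ.get(f"{prefix}_{name}")
        if value is None:
            raise KeyError(f"missing required environment variable {prefix}_{name}")
        return value

    def opt_int(name: str, default: int) -> int:
        raw = os.environ.get(f"{prefix}_{name}")
        return int(raw) if raw else default

    host, user, password, db = req("HOST"), req("USER"), req("PASSWORD"), req("DB")
    port = opt_int("PORT", 5432)
    timeout = opt_int("COMMAND_TIMEOUT_SEC", 300)
    return (f"Host={host};Port={port};Username={user};Password={password};"
            f"Database={db};CommandTimeout={timeout}")
```

With only the required variables set, the string ends in `CommandTimeout=300`, matching the new default; setting `POSTGRES_COMMAND_TIMEOUT_SEC` overrides it, which is the point of the change for slow queries on large tables.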
@@ -15,7 +15,7 @@
     <PackageReference Include="Dapper" Version="2.1.35" />
     <PackageReference Include="MassTransit.Abstractions" Version="8.2.0" />
     <PackageReference Include="MassTransit.RabbitMQ" Version="8.2.0" />
-    <PackageReference Include="Npgsql" Version="8.0.2" />
+    <PackageReference Include="Npgsql" Version="8.0.3" />
     <PackageReference Include="pythonnet" Version="3.0.3" />
     <PackageReference Include="Serilog" Version="3.1.1" />
     <PackageReference Include="Serilog.Extensions.Hosting" Version="8.0.0" />
@@ -8,12 +8,14 @@ public class PostgresConfiguration
     private const string PasswordVariable = "PASSWORD";
     private const string DatabaseVariable = "DB";
     private const string PortVariable = "PORT";
+    private const string CommandTimeoutVariable = "COMMAND_TIMEOUT_SEC"; // Seconds

     private string Host { get; init; } = Prefix.GetRequiredEnvironmentVariableAsString(HostVariable);
     private string Username { get; init; } = Prefix.GetRequiredEnvironmentVariableAsString(UsernameVariable);
     private string Password { get; init; } = Prefix.GetRequiredEnvironmentVariableAsString(PasswordVariable);
     private string Database { get; init; } = Prefix.GetRequiredEnvironmentVariableAsString(DatabaseVariable);
     private int PORT { get; init; } = Prefix.GetEnvironmentVariableAsInt(PortVariable, 5432);
+    private int CommandTimeout { get; init; } = Prefix.GetEnvironmentVariableAsInt(CommandTimeoutVariable, 300);

-    public string StorageConnectionString => $"Host={Host};Port={PORT};Username={Username};Password={Password};Database={Database};";
+    public string StorageConnectionString => $"Host={Host};Port={PORT};Username={Username};Password={Password};Database={Database};CommandTimeout={CommandTimeout}";
 }
@@ -12,7 +12,7 @@
     <PackageReference Include="Dapper" Version="2.1.28" />
     <PackageReference Include="Microsoft.Extensions.Hosting" Version="8.0.0" />
     <PackageReference Include="Microsoft.Extensions.Http" Version="8.0.0" />
-    <PackageReference Include="Npgsql" Version="8.0.1" />
+    <PackageReference Include="Npgsql" Version="8.0.3" />
     <PackageReference Include="Serilog" Version="3.1.1" />
     <PackageReference Include="Serilog.AspNetCore" Version="8.0.1" />
     <PackageReference Include="Serilog.Sinks.Console" Version="5.0.1" />