Merge pull request #2069 from Kometa-Team/requests-update

created requests module to handle all outgoing requests
commit 67fc3cafc5 by meisnate12, 2024-05-28 17:01:04 -04:00 (committed by GitHub)
47 changed files with 991 additions and 817 deletions


@@ -2,6 +2,7 @@ AAC
 accessModes
 Addon
 Adlib
+AFI's
 Amblin
 analytics
 AniDB
@@ -19,6 +20,7 @@ Arrowverse
 Atmos
 Avenir
 BAFTA
+Bambara
 BBFC
 bearlikelion
 Berlinale
@@ -56,6 +58,7 @@ customizable
 customizations
 César
 dbader
+d'Or
 de
 deva
 DIIIVOY
@@ -177,6 +180,7 @@ microsoft
 mikenobbs
 minikube
 mnt
+Mojo's
 monetization
 Mossi
 MPAA
@@ -202,6 +206,7 @@ OMDb
 oscar
 OSX
 ozzy
+Palme
 pathing
 PCM
 PersistentVolumeClaim


@@ -10,12 +10,12 @@ Please include a summary of the changes.
 Please delete options that are not relevant.

-- [ ] Bug fix (non-breaking change which fixes an issue)
-- [ ] New feature (non-breaking change which adds functionality)
-- [ ] Breaking change (fix or feature that would cause existing functionality to not work as expected)
-- [ ] Documentation change (non-code changes affecting only the wiki)
-- [ ] Infrastructure change (changes related to the github repo, build process, or the like)
+- [] Bug fix (non-breaking change which fixes an issue)
+- [] New feature (non-breaking change which adds functionality)
+- [] Breaking change (fix or feature that would cause existing functionality to not work as expected)
+- [] Documentation change (non-code changes affecting only the wiki)
+- [] Infrastructure change (changes related to the github repo, build process, or the like)

 ## Checklist

-- [ ] My code was submitted to the nightly branch of the repository.
+- [] My code was submitted to the nightly branch of the repository.

.github/workflows/merge-develop.yml (vendored, new file, +27)

@@ -0,0 +1,27 @@
+name: Merge Nightly into Develop
+on:
+  workflow_dispatch:
+jobs:
+  merge-develop:
+    runs-on: ubuntu-latest
+    steps:
+      - name: Create App Token
+        uses: actions/create-github-app-token@v1
+        id: app-token
+        with:
+          app-id: ${{ vars.APP_ID }}
+          private-key: ${{ secrets.APP_TOKEN }}
+      - name: Check Out Repo
+        uses: actions/checkout@v4
+        with:
+          token: ${{ steps.app-token.outputs.token }}
+          ref: nightly
+          fetch-depth: 0
+      - name: Push Nightly into Develop
+        run: |
+          git push origin refs/heads/nightly:refs/heads/develop

.github/workflows/merge-master.yml (vendored, new file, +27)

@@ -0,0 +1,27 @@
+name: Merge Develop into Master
+on:
+  workflow_dispatch:
+jobs:
+  merge-master:
+    runs-on: ubuntu-latest
+    steps:
+      - name: Create App Token
+        uses: actions/create-github-app-token@v1
+        id: app-token
+        with:
+          app-id: ${{ vars.APP_ID }}
+          private-key: ${{ secrets.APP_TOKEN }}
+      - name: Check Out Repo
+        uses: actions/checkout@v4
+        with:
+          token: ${{ steps.app-token.outputs.token }}
+          ref: develop
+          fetch-depth: 0
+      - name: Push Develop into Master
+        run: |
+          git push origin refs/heads/develop:refs/heads/master
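Both merge workflows above run only on `workflow_dispatch`, so they must be fired explicitly from the Actions tab or via the API. As a sketch only (the token placeholder and the `Kometa-Team/Kometa` repository path are assumptions of this example, not part of the diff), the same trigger can be sent through the GitHub REST API:

```python
import requests

def dispatch_workflow(token, workflow_file, ref):
    # POST /repos/{owner}/{repo}/actions/workflows/{workflow_file}/dispatches
    # fires a workflow_dispatch event; GitHub replies 204 No Content on success.
    response = requests.post(
        f"https://api.github.com/repos/Kometa-Team/Kometa/actions/workflows/{workflow_file}/dispatches",
        headers={"Authorization": f"Bearer {token}", "Accept": "application/vnd.github+json"},
        json={"ref": ref},  # branch whose copy of the workflow file is run
    )
    response.raise_for_status()

# Promote nightly to develop, then develop to master.
dispatch_workflow("<personal-access-token>", "merge-develop.yml", "nightly")
dispatch_workflow("<personal-access-token>", "merge-master.yml", "develop")
```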


@@ -1,12 +0,0 @@
-name: Spellcheck Action
-on: pull_request
-jobs:
-  spellcheck:
-    runs-on: ubuntu-latest
-    steps:
-      - uses: actions/checkout@v4
-      - uses: rojopolis/spellcheck-github-actions@0.36.0

.github/workflows/validate.yml (vendored, new file, +27)

@@ -0,0 +1,27 @@
+name: Validate Pull Request
+on: pull_request
+jobs:
+  validate-pull:
+    runs-on: ubuntu-latest
+    steps:
+      - name: Display Refs
+        run: |
+          echo "Base Repo: ${{ github.event.pull_request.base.repo.full_name }}"
+          echo "Base Ref: ${{ github.base_ref }}"
+          echo "Head Repo: ${{ github.event.pull_request.head.repo.full_name }}"
+          echo "Head Ref: ${{ github.head_ref }}"
+      - name: Check Base Branch
+        if: github.base_ref == 'master' || github.base_ref == 'develop'
+        run: |
+          echo "ERROR: Pull Requests cannot be submitted to master or develop. Please submit the Pull Request to the nightly branch"
+          exit 1
+      - name: Checkout Repo
+        uses: actions/checkout@v4
+      - name: Run Spellcheck
+        uses: rojopolis/spellcheck-github-actions@0.36.0


@@ -1 +1 @@
-2.0.1-develop24
+2.0.1-develop26


@@ -934,31 +934,6 @@ The available setting attributes which can be set at each level are outlined below
         - metadata
     ```

-??? blank "`verify_ssl` - Turn SSL Verification on or off.<a class="headerlink" href="#verify-ssl" title="Permanent link"></a>"
-
-    <div id="verify-ssl" />Turn SSL Verification on or off.
-
-    ???+ note
-
-        set to false if your log file shows any errors similar to "SSL: CERTIFICATE_VERIFY_FAILED"
-
-    <hr style="margin: 0px;">
-
-    **Attribute:** `verify_ssl`
-
-    **Levels with this Attribute:** Global
-
-    **Accepted Values:** `true` or `false`
-
-    **Default Value:** `true`
-
-    ???+ example "Example"
-
-        ```yaml
-        settings:
-          verify_ssl: false
-        ```
-
 ??? blank "`custom_repo` - Used to set up the custom `repo` [file block type](files.md#location-types-and-paths).<a class="headerlink" href="#custom-repo" title="Permanent link"></a>"

     <div id="custom-repo" />Specify where the `repo` attribute's base is when defining `collection_files`, `metadata_files`, `playlist_file` and `overlay_files`.


@@ -39,6 +39,7 @@ These collections are applied by calling the below paths into the `collection_files`
 | [Trakt Charts](chart/trakt.md)<sup>2</sup> | `trakt` | Trakt Popular, Trakt Trending | :fontawesome-solid-circle-check:{ .green } | :fontawesome-solid-circle-check:{ .green } |
 | [AniList Charts](chart/anilist.md) | `anilist` | AniList Popular, AniList Season | :fontawesome-solid-circle-check:{ .green } | :fontawesome-solid-circle-check:{ .green } |
 | [MyAnimeList Charts](chart/myanimelist.md) | `myanimelist` | MyAnimeList Popular, MyAnimeList Top Rated | :fontawesome-solid-circle-check:{ .green } | :fontawesome-solid-circle-check:{ .green } |
+| [Letterboxd Charts](chart/letterboxd.md) | `letterboxd` | Letterboxd Top 250, Top 250 Most Fans | :fontawesome-solid-circle-check:{ .green } | :fontawesome-solid-circle-xmark:{ .red } |
 | [Other Charts](chart/other.md) | `other_chart` | AniDB Popular, Common Sense Selection | :fontawesome-solid-circle-check:{ .green } | :fontawesome-solid-circle-check:{ .green } |
 <sup>1</sup> Requires [Tautulli Authentication](../config/tautulli.md)


@@ -68,6 +68,7 @@ This is the default Kometa collection ordering:
 | `basic` | `010` |
 | `anilist` | `020` |
 | `imdb` | `020` |
+| `letterboxd` | `020` |
 | `myanimelist` | `020` |
 | `other_chart` | `020` |
 | `tautulli` | `020` |
@@ -211,4 +212,4 @@ libraries:
 {%
    include-markdown "./example.md"
 %}


@@ -229,6 +229,32 @@ different ways to specify these things.
         docker run -it -v "X:\Media\Kometa\config:/config:rw" kometateam/kometa --timeout 360
         ```

+??? blank "No Verify SSL&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;`-nv`/`--no-verify-ssl`&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;`KOMETA_NO_VERIFY_SSL`<a class="headerlink" href="#no-verify-ssl" title="Permanent link"></a>"
+
+    <div id="no-verify-ssl" />Turn SSL Verification off.
+
+    ???+ note
+
+        use this option if your log file shows any errors similar to "SSL: CERTIFICATE_VERIFY_FAILED"
+
+    <hr style="margin: 0px;">
+
+    **Accepted Values:** `true` or `false`
+
+    **Shell Flags:** `-nv` or `--no-verify-ssl` (ex. `--no-verify-ssl`)
+
+    **Environment Variable:** `KOMETA_NO_VERIFY_SSL` (ex. `KOMETA_NO_VERIFY_SSL=true`)
+
+    !!! example
+        === "Local Environment"
+            ```
+            python kometa.py --no-verify-ssl
+            ```
+        === "Docker Environment"
+            ```
+            docker run -it -v "X:\Media\Kometa\config:/config:rw" kometateam/kometa --no-verify-ssl
+            ```
??? blank "Collections Only&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;`-co`/`--collections-only`&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;`KOMETA_COLLECTIONS_ONLY`<a class="headerlink" href="#collections-only" title="Permanent link"></a>" ??? blank "Collections Only&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;`-co`/`--collections-only`&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;`KOMETA_COLLECTIONS_ONLY`<a class="headerlink" href="#collections-only" title="Permanent link"></a>"
<div id="collections-only" />Only run collection YAML files, skip library operations, metadata, overlays, and playlists. <div id="collections-only" />Only run collection YAML files, skip library operations, metadata, overlays, and playlists.

kometa.py (348 changed lines)

@@ -50,6 +50,7 @@ arguments = {
     "trace": {"args": "tr", "type": "bool", "help": "Run with extra Trace Debug Logs"},
     "log-requests": {"args": ["lr", "log-request"], "type": "bool", "help": "Run with all Requests printed"},
     "timeout": {"args": "ti", "type": "int", "default": 180, "help": "Kometa Global Timeout (Default: 180)"},
+    "no-verify-ssl": {"args": "nv", "type": "bool", "help": "Turns off Global SSL Verification"},
     "collections-only": {"args": ["co", "collection-only"], "type": "bool", "help": "Run only collection files"},
     "metadata-only": {"args": ["mo", "metadatas-only"], "type": "bool", "help": "Run only metadata files"},
     "playlists-only": {"args": ["po", "playlist-only"], "type": "bool", "help": "Run only playlist files"},
@@ -204,6 +205,7 @@ from modules import util
 util.logger = logger
 from modules.builder import CollectionBuilder
 from modules.config import ConfigFile
+from modules.request import Requests, parse_version
 from modules.util import Failed, FilterFailed, NonExisting, NotScheduled, Deleted

 def my_except_hook(exctype, value, tb):
@@ -223,15 +225,13 @@ def new_send(*send_args, **kwargs):
 requests.Session.send = new_send

-version = ("Unknown", "Unknown", 0)
+file_version = ("Unknown", "Unknown", 0)
 with open(os.path.join(os.path.dirname(os.path.abspath(__file__)), "VERSION")) as handle:
     for line in handle.readlines():
         line = line.strip()
         if len(line) > 0:
-            version = util.parse_version(line)
+            file_version = parse_version(line)
             break
-branch = util.guess_branch(version, env_version, git_branch)
-version = (version[0].replace("develop", branch), version[1].replace("develop", branch), version[2])

 uuid_file = os.path.join(default_dir, "UUID")
 uuid_num = None
@@ -255,179 +255,181 @@ def process(attrs):
     executor.submit(start, *[attrs])

 def start(attrs):
-    logger.add_main_handler()
-    logger.separator()
-    logger.info("")
-    logger.info_center(" __ ___ ______ ___ ___ _______ __________ ___ ")
-    logger.info_center("| |/ / / __ \\ | \\/ | | ____|| | / \\ ")
-    logger.info_center("| ' / | | | | | \\ / | | |__ `---| |---` / ^ \\ ")
-    logger.info_center("| < | | | | | |\\/| | | __| | | / /_\\ \\ ")
-    logger.info_center("| . \\ | `--` | | | | | | |____ | | / _____ \\ ")
-    logger.info_center("|__|\\__\\ \\______/ |__| |__| |_______| |__| /__/ \\__\\ ")
-    logger.info("")
-    if is_lxml:
-        system_ver = "lxml Docker"
-    elif is_linuxserver:
-        system_ver = "Linuxserver"
-    elif is_docker:
-        system_ver = "Docker"
-    else:
-        system_ver = f"Python {platform.python_version()}"
-    logger.info(f" Version: {version[0]} ({system_ver}){f' (Git: {git_branch})' if git_branch else ''}")
-    latest_version = util.current_version(version, branch=branch)
-    new_version = latest_version[0] if latest_version and (version[1] != latest_version[1] or (version[2] and version[2] < latest_version[2])) else None
-    if new_version:
-        logger.info(f" Newest Version: {new_version}")
-    logger.info(f" Platform: {platform.platform()}")
-    logger.info(f" Memory: {round(psutil.virtual_memory().total / (1024.0 ** 3))} GB")
-    if not is_docker and not is_linuxserver:
-        try:
-            with open(os.path.abspath(os.path.join(os.path.dirname(__file__), "requirements.txt")), "r") as file:
-                required_versions = {ln.split("==")[0]: ln.split("==")[1].strip() for ln in file.readlines()}
-            for req_name, sys_ver in system_versions.items():
-                if sys_ver and sys_ver != required_versions[req_name]:
-                    logger.info(f" {req_name} version: {sys_ver} requires an update to: {required_versions[req_name]}")
-        except FileNotFoundError:
-            logger.error(" File Error: requirements.txt not found")
-    if "time" in attrs and attrs["time"]: start_type = f"{attrs['time']} "
-    elif run_args["tests"]: start_type = "Test "
-    elif "collections" in attrs and attrs["collections"]: start_type = "Collections "
-    elif "libraries" in attrs and attrs["libraries"]: start_type = "Libraries "
-    else: start_type = ""
-    start_time = datetime.now()
-    if "time" not in attrs:
-        attrs["time"] = start_time.strftime("%H:%M")
-    attrs["time_obj"] = start_time
-    attrs["version"] = version
-    attrs["branch"] = branch
-    attrs["config_file"] = run_args["config"]
-    attrs["ignore_schedules"] = run_args["ignore-schedules"]
-    attrs["read_only"] = run_args["read-only-config"]
-    attrs["no_missing"] = run_args["no-missing"]
-    attrs["no_report"] = run_args["no-report"]
-    attrs["collection_only"] = run_args["collections-only"]
-    attrs["metadata_only"] = run_args["metadata-only"]
-    attrs["playlist_only"] = run_args["playlists-only"]
-    attrs["operations_only"] = run_args["operations-only"]
-    attrs["overlays_only"] = run_args["overlays-only"]
-    attrs["plex_url"] = plex_url
-    attrs["plex_token"] = plex_token
-    logger.separator(debug=True)
-    logger.debug(f"Run Command: {run_arg}")
-    for akey, adata in arguments.items():
-        if isinstance(adata["help"], str):
-            ext = '"' if adata["type"] == "str" and run_args[akey] not in [None, "None"] else ""
-            logger.debug(f"--{akey} (KOMETA_{akey.replace('-', '_').upper()}): {ext}{run_args[akey]}{ext}")
-    logger.debug("")
-    if secret_args:
-        logger.debug("Kometa Secrets Read:")
-        for sec in secret_args:
-            logger.debug(f"--kometa-{sec} (KOMETA_{sec.upper().replace('-', '_')}): (redacted)")
-        logger.debug("")
-    logger.separator(f"Starting {start_type}Run")
-    config = None
-    stats = {"created": 0, "modified": 0, "deleted": 0, "added": 0, "unchanged": 0, "removed": 0, "radarr": 0, "sonarr": 0, "names": []}
     try:
-        config = ConfigFile(default_dir, attrs, secret_args)
+        logger.add_main_handler()
+        logger.separator()
+        logger.info("")
+        logger.info_center(" __ ___ ______ ___ ___ _______ __________ ___ ")
+        logger.info_center("| |/ / / __ \\ | \\/ | | ____|| | / \\ ")
+        logger.info_center("| ' / | | | | | \\ / | | |__ `---| |---` / ^ \\ ")
+        logger.info_center("| < | | | | | |\\/| | | __| | | / /_\\ \\ ")
+        logger.info_center("| . \\ | `--` | | | | | | |____ | | / _____ \\ ")
+        logger.info_center("|__|\\__\\ \\______/ |__| |__| |_______| |__| /__/ \\__\\ ")
+        logger.info("")
+        if is_lxml:
+            system_ver = "lxml Docker"
+        elif is_linuxserver:
+            system_ver = "Linuxserver"
+        elif is_docker:
+            system_ver = "Docker"
+        else:
+            system_ver = f"Python {platform.python_version()}"
+        my_requests = Requests(file_version, env_version, git_branch, verify_ssl=False if run_args["no-verify-ssl"] else True)
+        logger.info(f" Version: {my_requests.version[0]} ({system_ver}){f' (Git: {git_branch})' if git_branch else ''}")
+        if my_requests.new_version:
+            logger.info(f" Newest Version: {my_requests.new_version}")
+        logger.info(f" Platform: {platform.platform()}")
+        logger.info(f" Total Memory: {round(psutil.virtual_memory().total / (1024.0 ** 3))} GB")
+        logger.info(f" Available Memory: {round(psutil.virtual_memory().available / (1024.0 ** 3))} GB")
+        if not is_docker and not is_linuxserver:
+            try:
+                with open(os.path.abspath(os.path.join(os.path.dirname(__file__), "requirements.txt")), "r") as file:
+                    required_versions = {ln.split("==")[0]: ln.split("==")[1].strip() for ln in file.readlines()}
+                for req_name, sys_ver in system_versions.items():
+                    if sys_ver and sys_ver != required_versions[req_name]:
+                        logger.info(f" {req_name} version: {sys_ver} requires an update to: {required_versions[req_name]}")
+            except FileNotFoundError:
+                logger.error(" File Error: requirements.txt not found")
+        if "time" in attrs and attrs["time"]: start_type = f"{attrs['time']} "
+        elif run_args["tests"]: start_type = "Test "
+        elif "collections" in attrs and attrs["collections"]: start_type = "Collections "
+        elif "libraries" in attrs and attrs["libraries"]: start_type = "Libraries "
+        else: start_type = ""
+        start_time = datetime.now()
+        if "time" not in attrs:
+            attrs["time"] = start_time.strftime("%H:%M")
+        attrs["time_obj"] = start_time
+        attrs["config_file"] = run_args["config"]
+        attrs["ignore_schedules"] = run_args["ignore-schedules"]
+        attrs["read_only"] = run_args["read-only-config"]
+        attrs["no_missing"] = run_args["no-missing"]
+        attrs["no_report"] = run_args["no-report"]
+        attrs["collection_only"] = run_args["collections-only"]
+        attrs["metadata_only"] = run_args["metadata-only"]
+        attrs["playlist_only"] = run_args["playlists-only"]
+        attrs["operations_only"] = run_args["operations-only"]
+        attrs["overlays_only"] = run_args["overlays-only"]
+        attrs["plex_url"] = plex_url
+        attrs["plex_token"] = plex_token
+        logger.separator(debug=True)
+        logger.debug(f"Run Command: {run_arg}")
+        for akey, adata in arguments.items():
+            if isinstance(adata["help"], str):
+                ext = '"' if adata["type"] == "str" and run_args[akey] not in [None, "None"] else ""
+                logger.debug(f"--{akey} (KOMETA_{akey.replace('-', '_').upper()}): {ext}{run_args[akey]}{ext}")
+        logger.debug("")
+        if secret_args:
+            logger.debug("Kometa Secrets Read:")
+            for sec in secret_args:
+                logger.debug(f"--kometa-{sec} (KOMETA_{sec.upper().replace('-', '_')}): (redacted)")
+            logger.debug("")
+        logger.separator(f"Starting {start_type}Run")
+        config = None
+        stats = {"created": 0, "modified": 0, "deleted": 0, "added": 0, "unchanged": 0, "removed": 0, "radarr": 0, "sonarr": 0, "names": []}
+        try:
+            config = ConfigFile(my_requests, default_dir, attrs, secret_args)
+        except Exception as e:
+            logger.stacktrace()
+            logger.critical(e)
+        else:
+            try:
+                stats = run_config(config, stats)
+            except Exception as e:
+                config.notify(e)
+                logger.stacktrace()
+                logger.critical(e)
+        logger.info("")
+        end_time = datetime.now()
+        run_time = str(end_time - start_time).split(".")[0]
+        if config:
+            try:
+                config.Webhooks.end_time_hooks(start_time, end_time, run_time, stats)
+            except Failed as e:
+                logger.stacktrace()
+                logger.error(f"Webhooks Error: {e}")
+        version_line = f"Version: {my_requests.version[0]}"
+        if my_requests.new_version:
+            version_line = f"{version_line} Newest Version: {my_requests.new_version}"
+        try:
+            log_data = {}
+            no_overlays = []
+            no_overlays_count = 0
+            convert_errors = {}
+            other_log_groups = [
+                ("No Items found for", r"No Items found for .* \(\d+\) (.*)"),
+                ("Convert Warning: No TVDb ID or IMDb ID found for AniDB ID:", r"Convert Warning: No TVDb ID or IMDb ID found for AniDB ID: (.*)"),
+                ("Convert Warning: No AniDB ID Found for AniList ID:", r"Convert Warning: No AniDB ID Found for AniList ID: (.*)"),
+                ("Convert Warning: No AniDB ID Found for MyAnimeList ID:", r"Convert Warning: No AniDB ID Found for MyAnimeList ID: (.*)"),
+                ("Convert Warning: No IMDb ID Found for TMDb ID:", r"Convert Warning: No IMDb ID Found for TMDb ID: (.*)"),
+                ("Convert Warning: No TMDb ID Found for IMDb ID:", r"Convert Warning: No TMDb ID Found for IMDb ID: (.*)"),
+                ("Convert Warning: No TVDb ID Found for TMDb ID:", r"Convert Warning: No TVDb ID Found for TMDb ID: (.*)"),
+                ("Convert Warning: No TMDb ID Found for TVDb ID:", r"Convert Warning: No TMDb ID Found for TVDb ID: (.*)"),
+                ("Convert Warning: No IMDb ID Found for TVDb ID:", r"Convert Warning: No IMDb ID Found for TVDb ID: (.*)"),
+                ("Convert Warning: No TVDb ID Found for IMDb ID:", r"Convert Warning: No TVDb ID Found for IMDb ID: (.*)"),
+                ("Convert Warning: No AniDB ID to Convert to MyAnimeList ID for Guid:", r"Convert Warning: No AniDB ID to Convert to MyAnimeList ID for Guid: (.*)"),
+                ("Convert Warning: No MyAnimeList Found for AniDB ID:", r"Convert Warning: No MyAnimeList Found for AniDB ID: (.*) of Guid: .*"),
+            ]
+            other_message = {}
+            with open(logger.main_log, encoding="utf-8") as f:
+                for log_line in f:
+                    for err_type in ["WARNING", "ERROR", "CRITICAL"]:
+                        if f"[{err_type}]" in log_line:
+                            log_line = log_line.split("|")[1].strip()
+                            other = False
+                            for key, reg in other_log_groups:
+                                if log_line.startswith(key):
+                                    other = True
+                                    _name = re.match(reg, log_line).group(1)
+                                    if key not in other_message:
+                                        other_message[key] = {"list": [], "count": 0}
+                                    other_message[key]["count"] += 1
+                                    if _name not in other_message[key]:
+                                        other_message[key]["list"].append(_name)
+                            if other is False:
+                                if err_type not in log_data:
+                                    log_data[err_type] = []
+                                log_data[err_type].append(log_line)
+            if "No Items found for" in other_message:
+                logger.separator(f"Overlay Errors Summary", space=False, border=False)
+                logger.info("")
+                logger.info(f"No Items found for {other_message['No Items found for']['count']} Overlays: {other_message['No Items found for']['list']}")
+                logger.info("")
+            convert_title = False
+            for key, _ in other_log_groups:
+                if key.startswith("Convert Warning") and key in other_message:
+                    if convert_title is False:
+                        logger.separator("Convert Summary", space=False, border=False)
+                        logger.info("")
+                        convert_title = True
+                    logger.info(f"{key[17:]}")
+                    logger.info(", ".join(other_message[key]["list"]))
+            if convert_title:
+                logger.info("")
+            for err_type in ["WARNING", "ERROR", "CRITICAL"]:
+                if err_type not in log_data:
+                    continue
+                logger.separator(f"{err_type.lower().capitalize()} Summary", space=False, border=False)
+                logger.info("")
+                logger.info("Count | Message")
+                logger.separator(f"{logger.separating_character * 5}|", space=False, border=False, side_space=False, left=True)
+                for k, v in Counter(log_data[err_type]).most_common():
+                    logger.info(f"{v:>5} | {k}")
+                logger.info("")
+        except Failed as e:
+            logger.stacktrace()
+            logger.error(f"Report Error: {e}")
+        logger.separator(f"Finished {start_type}Run\n{version_line}\nFinished: {end_time.strftime('%H:%M:%S %Y-%m-%d')} Run Time: {run_time}")
+        logger.remove_main_handler()
     except Exception as e:
         logger.stacktrace()
         logger.critical(e)
-    else:
-        try:
-            stats = run_config(config, stats)
-        except Exception as e:
-            config.notify(e)
-            logger.stacktrace()
-            logger.critical(e)
-    logger.info("")
-    end_time = datetime.now()
-    run_time = str(end_time - start_time).split(".")[0]
-    if config:
-        try:
-            config.Webhooks.end_time_hooks(start_time, end_time, run_time, stats)
-        except Failed as e:
-            logger.stacktrace()
-            logger.error(f"Webhooks Error: {e}")
-    version_line = f"Version: {version[0]}"
-    if new_version:
-        version_line = f"{version_line} Newest Version: {new_version}"
-    try:
-        log_data = {}
-        no_overlays = []
-        no_overlays_count = 0
-        convert_errors = {}
-        other_log_groups = [
-            ("No Items found for", r"No Items found for .* \(\d+\) (.*)"),
-            ("Convert Warning: No TVDb ID or IMDb ID found for AniDB ID:", r"Convert Warning: No TVDb ID or IMDb ID found for AniDB ID: (.*)"),
-            ("Convert Warning: No AniDB ID Found for AniList ID:", r"Convert Warning: No AniDB ID Found for AniList ID: (.*)"),
-            ("Convert Warning: No AniDB ID Found for MyAnimeList ID:", r"Convert Warning: No AniDB ID Found for MyAnimeList ID: (.*)"),
-            ("Convert Warning: No IMDb ID Found for TMDb ID:", r"Convert Warning: No IMDb ID Found for TMDb ID: (.*)"),
-            ("Convert Warning: No TMDb ID Found for IMDb ID:", r"Convert Warning: No TMDb ID Found for IMDb ID: (.*)"),
-            ("Convert Warning: No TVDb ID Found for TMDb ID:", r"Convert Warning: No TVDb ID Found for TMDb ID: (.*)"),
-            ("Convert Warning: No TMDb ID Found for TVDb ID:", r"Convert Warning: No TMDb ID Found for TVDb ID: (.*)"),
-            ("Convert Warning: No IMDb ID Found for TVDb ID:", r"Convert Warning: No IMDb ID Found for TVDb ID: (.*)"),
-            ("Convert Warning: No TVDb ID Found for IMDb ID:", r"Convert Warning: No TVDb ID Found for IMDb ID: (.*)"),
-            ("Convert Warning: No AniDB ID to Convert to MyAnimeList ID for Guid:", r"Convert Warning: No AniDB ID to Convert to MyAnimeList ID for Guid: (.*)"),
-            ("Convert Warning: No MyAnimeList Found for AniDB ID:", r"Convert Warning: No MyAnimeList Found for AniDB ID: (.*) of Guid: .*"),
-        ]
-        other_message = {}
-        with open(logger.main_log, encoding="utf-8") as f:
-            for log_line in f:
-                for err_type in ["WARNING", "ERROR", "CRITICAL"]:
-                    if f"[{err_type}]" in log_line:
-                        log_line = log_line.split("|")[1].strip()
-                        other = False
-                        for key, reg in other_log_groups:
-                            if log_line.startswith(key):
-                                other = True
-                                _name = re.match(reg, log_line).group(1)
-                                if key not in other_message:
-                                    other_message[key] = {"list": [], "count": 0}
-                                other_message[key]["count"] += 1
-                                if _name not in other_message[key]:
-                                    other_message[key]["list"].append(_name)
-                        if other is False:
-                            if err_type not in log_data:
-                                log_data[err_type] = []
-                            log_data[err_type].append(log_line)
-        if "No Items found for" in other_message:
-            logger.separator(f"Overlay Errors Summary", space=False, border=False)
-            logger.info("")
-            logger.info(f"No Items found for {other_message['No Items found for']['count']} Overlays: {other_message['No Items found for']['list']}")
-            logger.info("")
-        convert_title = False
-        for key, _ in other_log_groups:
-            if key.startswith("Convert Warning") and key in other_message:
-                if convert_title is False:
-                    logger.separator("Convert Summary", space=False, border=False)
-                    logger.info("")
-                    convert_title = True
-                logger.info(f"{key[17:]}")
-                logger.info(", ".join(other_message[key]["list"]))
-        if convert_title:
-            logger.info("")
-        for err_type in ["WARNING", "ERROR", "CRITICAL"]:
-            if err_type not in log_data:
-                continue
-            logger.separator(f"{err_type.lower().capitalize()} Summary", space=False, border=False)
-            logger.info("")
-            logger.info("Count | Message")
-            logger.separator(f"{logger.separating_character * 5}|", space=False, border=False, side_space=False, left=True)
-            for k, v in Counter(log_data[err_type]).most_common():
-                logger.info(f"{v:>5} | {k}")
-            logger.info("")
-    except Failed as e:
-        logger.stacktrace()
-        logger.error(f"Report Error: {e}")
-    logger.separator(f"Finished {start_type}Run\n{version_line}\nFinished: {end_time.strftime('%H:%M:%S %Y-%m-%d')} Run Time: {run_time}")
-    logger.remove_main_handler()

 def run_config(config, stats):
     library_status = run_libraries(config)
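`modules/request.py` itself does not appear in this excerpt, so the skeleton below is inferred purely from its call sites in the diff (`Requests(file_version, env_version, git_branch, verify_ssl=...)`, `version`, `latest_version`, `new_version`, `has_new_version()`, `no_verify_ssl()`, `file_yaml()`/`get_yaml()`, `get_html()`/`post_html()`, `post()`, `get_image()`, plus `parse_version`, `quote`, and `urlparse` imported from it in other files). Treat it as an assumption about the interface, not the actual implementation:

```python
# Hypothetical skeleton of modules/request.py, reconstructed from call sites
# in this diff; the real module may differ in names and behavior.
import requests
from lxml import html
from urllib.parse import quote, urlparse  # re-exported for other modules

def parse_version(line):
    """Turn a VERSION line such as '2.0.1-develop26' into the tuple shape
    the rest of the code compares, e.g. ('2.0.1-develop26', '2.0.1', 26)."""
    ...

class Requests:
    def __init__(self, file_version, env_version, git_branch, verify_ssl=True):
        self.version = file_version      # parsed from the VERSION file
        self.latest_version = ...        # looked up online for the current branch
        self.new_version = ...           # set when latest_version is newer
        self.session = requests.Session()
        if not verify_ssl:
            self.no_verify_ssl()

    def no_verify_ssl(self):
        # Disable certificate checks session-wide (and, presumably, suppress
        # urllib3's InsecureRequestWarning as the removed config.py code did).
        self.session.verify = False

    def has_new_version(self):
        return self.new_version is not None

    def get_html(self, url, params=None, language=None):
        return html.fromstring(self.session.get(url, params=params).content)

    def post_html(self, url, data=None, language=None):
        return html.fromstring(self.session.post(url, data=data).content)

    def post(self, url, data=None, json=None):
        return self.session.post(url, data=data, json=json)

    def get_image(self, url):
        """Fetch url; raise modules.util.Failed unless it is a usable image."""

    def file_yaml(self, path): ...  # YAML object loaded from disk
    def get_yaml(self, url): ...    # YAML object loaded over HTTP
```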


@@ -225,6 +225,7 @@ nav:
             - Basic Charts: defaults/chart/basic.md
             - AniList Charts: defaults/chart/anilist.md
             - IMDb Charts: defaults/chart/imdb.md
+            - Letterboxd Charts: defaults/chart/letterboxd.md
             - MyAnimeList Charts: defaults/chart/myanimelist.md
             - Tautulli Charts: defaults/chart/tautulli.md
             - TMDb Charts: defaults/chart/tmdb.md


@@ -89,8 +89,9 @@ class AniDBObj:

 class AniDB:
-    def __init__(self, config, data):
-        self.config = config
+    def __init__(self, requests, cache, data):
+        self.requests = requests
+        self.cache = cache
         self.language = data["language"]
         self.expiration = 60
         self.client = None

@@ -104,19 +105,19 @@ class AniDB:
         self.version = version
         self.expiration = expiration
         logger.secret(self.client)
-        if self.config.Cache:
-            value1, value2, success = self.config.Cache.query_testing("anidb_login")
+        if self.cache:
+            value1, value2, success = self.cache.query_testing("anidb_login")
             if str(value1) == str(client) and str(value2) == str(version) and success:
                 return
         try:
             self.get_anime(69, ignore_cache=True)
-            if self.config.Cache:
-                self.config.Cache.update_testing("anidb_login", self.client, self.version, "True")
+            if self.cache:
+                self.cache.update_testing("anidb_login", self.client, self.version, "True")
         except Failed:
             self.client = None
             self.version = None
-            if self.config.Cache:
-                self.config.Cache.update_testing("anidb_login", self.client, self.version, "False")
+            if self.cache:
+                self.cache.update_testing("anidb_login", self.client, self.version, "False")
             raise

     @property

@@ -137,9 +138,9 @@ class AniDB:
         if params:
             logger.trace(f"Params: {params}")
         if data:
-            return self.config.post_html(url, data=data, headers=util.header(self.language))
+            return self.requests.post_html(url, data=data, language=self.language)
         else:
-            return self.config.get_html(url, params=params, headers=util.header(self.language))
+            return self.requests.get_html(url, params=params, language=self.language)

     def _popular(self):
         response = self._request(urls["popular"])

@@ -184,8 +185,8 @@ class AniDB:
     def get_anime(self, anidb_id, ignore_cache=False):
         expired = None
         anidb_dict = None
-        if self.config.Cache and not ignore_cache:
-            anidb_dict, expired = self.config.Cache.query_anidb(anidb_id, self.expiration)
+        if self.cache and not ignore_cache:
+            anidb_dict, expired = self.cache.query_anidb(anidb_id, self.expiration)
         if expired or not anidb_dict:
             time_check = time.time()
             if self._delay is not None:

@@ -200,8 +201,8 @@ class AniDB:
             })
             self._delay = time.time()
         obj = AniDBObj(self, anidb_id, anidb_dict)
-        if self.config.Cache and not ignore_cache:
-            self.config.Cache.update_anidb(expired, anidb_id, obj, self.expiration)
+        if self.cache and not ignore_cache:
+            self.cache.update_anidb(expired, anidb_id, obj, self.expiration)
         return obj

     def get_anidb_ids(self, method, data):
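The `get_anime` hunks above keep the same cache-aside shape while swapping `self.config.Cache` for the injected `self.cache`: ask the cache first, hit the network only on a miss or expiry, then write the result back. Schematically (the `_fetch_from_anidb` helper is hypothetical shorthand for the rate-limited HTTP call elided in the hunk):

```python
def get_anime(self, anidb_id, ignore_cache=False):
    expired, anidb_dict = None, None
    # 1. Try the cache first, honoring the configured expiration window.
    if self.cache and not ignore_cache:
        anidb_dict, expired = self.cache.query_anidb(anidb_id, self.expiration)
    # 2. Only on a miss or an expired entry, go to AniDB itself.
    if expired or not anidb_dict:
        anidb_dict = self._fetch_from_anidb(anidb_id)  # hypothetical helper
    obj = AniDBObj(self, anidb_id, anidb_dict)
    # 3. Write the fresh object back so the next run is a cache hit.
    if self.cache and not ignore_cache:
        self.cache.update_anidb(expired, anidb_id, obj, self.expiration)
    return obj
```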


@@ -57,8 +57,8 @@ country_codes = [
 ]

 class AniList:
-    def __init__(self, config):
-        self.config = config
+    def __init__(self, requests):
+        self.requests = requests
         self._options = None

     @property

@@ -79,7 +79,7 @@ class AniList:
     def _request(self, query, variables, level=1):
         logger.trace(f"Query: {query}")
         logger.trace(f"Variables: {variables}")
-        response = self.config.post(base_url, json={"query": query, "variables": variables})
+        response = self.requests.post(base_url, json={"query": query, "variables": variables})
         json_obj = response.json()
         logger.trace(f"Response: {json_obj}")
         if "errors" in json_obj:


@@ -6,11 +6,10 @@ from modules import anidb, anilist, icheckmovies, imdb, letterboxd, mal, mojo, p
 from modules.util import Failed, FilterFailed, NonExisting, NotScheduled, NotScheduledRange, Deleted
 from modules.overlay import Overlay
 from modules.poster import KometaImage
+from modules.request import quote
 from plexapi.audio import Artist, Album, Track
 from plexapi.exceptions import NotFound
 from plexapi.video import Movie, Show, Season, Episode
-from requests.exceptions import ConnectionError
-from urllib.parse import quote

 logger = util.logger

@@ -559,9 +558,7 @@ class CollectionBuilder:
                     self.obj = getter(self.name)
                     break
                 except Failed as e:
-                    error = e
-            else:
-                logger.error(error)
+                    logger.error(e)
                 raise Deleted(self.delete())
         else:
             self.libraries.append(self.library)

@@ -1182,11 +1179,9 @@ class CollectionBuilder:
             if method_name == "url_poster":
                 try:
                     if not method_data.startswith("https://theposterdb.com/api/assets/"):
-                        image_response = self.config.get(method_data, headers=util.header())
-                        if image_response.status_code >= 400 or image_response.headers["Content-Type"] not in util.image_content_types:
-                            raise ConnectionError
+                        self.config.Requests.get_image(method_data)
                     self.posters[method_name] = method_data
-                except ConnectionError:
+                except Failed:
                     logger.warning(f"{self.Type} Warning: No Poster Found at {method_data}")
             elif method_name == "tmdb_list_poster":
                 self.posters[method_name] = self.config.TMDb.get_list(util.regex_first_int(method_data, "TMDb List ID")).poster_url

@@ -1209,11 +1204,9 @@ class CollectionBuilder:
     def _background(self, method_name, method_data):
         if method_name == "url_background":
             try:
-                image_response = self.config.get(method_data, headers=util.header())
-                if image_response.status_code >= 400 or image_response.headers["Content-Type"] not in util.image_content_types:
-                    raise ConnectionError
+                self.config.Requests.get_image(method_data)
                 self.backgrounds[method_name] = method_data
-            except ConnectionError:
+            except Failed:
                 logger.warning(f"{self.Type} Warning: No Background Found at {method_data}")
         elif method_name == "tmdb_background":
             self.backgrounds[method_name] = self.config.TMDb.get_movie_show_or_collection(util.regex_first_int(method_data, 'TMDb ID'), self.library.is_movie).backdrop_url

@@ -2875,7 +2868,7 @@ class CollectionBuilder:
             if self.details["changes_webhooks"]:
                 self.notification_removals.append(util.item_set(item, self.library.get_id_from_maps(item.ratingKey)))
         if self.playlist and items_removed:
-            self.library._reload(self.obj)
+            self.library.item_reload(self.obj)
             self.obj.removeItems(items_removed)
         elif items_removed:
             self.library.alter_collection(items_removed, self.name, smart_label_collection=self.smart_label_collection, add=False)

@@ -3328,7 +3321,7 @@ class CollectionBuilder:
                 logger.error("Metadata: Failed to Update Please delete the collection and run again")
                 logger.info("")
             else:
-                self.library._reload(self.obj)
+                self.library.item_reload(self.obj)
                 #self.obj.batchEdits()
                 batch_display = "Collection Metadata Edits"
                 if summary[1] and str(summary[1]) != str(self.obj.summary):

@@ -3449,8 +3442,8 @@ class CollectionBuilder:
         elif style_data and "tpdb_background" in style_data and style_data["tpdb_background"]:
             self.backgrounds["style_data"] = f"https://theposterdb.com/api/assets/{style_data['tpdb_background']}"

-        self.collection_poster = util.pick_image(self.obj.title, self.posters, self.library.prioritize_assets, self.library.download_url_assets, asset_location)
-        self.collection_background = util.pick_image(self.obj.title, self.backgrounds, self.library.prioritize_assets, self.library.download_url_assets, asset_location, is_poster=False)
+        self.collection_poster = self.library.pick_image(self.obj.title, self.posters, self.library.prioritize_assets, self.library.download_url_assets, asset_location)
+        self.collection_background = self.library.pick_image(self.obj.title, self.backgrounds, self.library.prioritize_assets, self.library.download_url_assets, asset_location, is_poster=False)

         clean_temp = False
         if isinstance(self.collection_poster, KometaImage):

@@ -3520,7 +3513,7 @@ class CollectionBuilder:
             logger.separator(f"Syncing {self.name} {self.Type} to Trakt List {self.sync_to_trakt_list}", space=False, border=False)
             logger.info("")
             if self.obj:
-                self.library._reload(self.obj)
+                self.library.item_reload(self.obj)
                 self.load_collection_items()
             current_ids = []
             for item in self.items:

@@ -3597,7 +3590,7 @@ class CollectionBuilder:
     def send_notifications(self, playlist=False):
         if self.obj and self.details["changes_webhooks"] and \
                 (self.created or len(self.notification_additions) > 0 or len(self.notification_removals) > 0):
-            self.library._reload(self.obj)
+            self.library.item_reload(self.obj)
             try:
                 self.library.Webhooks.collection_hooks(
                     self.details["changes_webhooks"],


@ -1,6 +1,5 @@
import base64, os, re, requests import os, re
from datetime import datetime from datetime import datetime
from lxml import html
from modules import util, radarr, sonarr, operations from modules import util, radarr, sonarr, operations
from modules.anidb import AniDB from modules.anidb import AniDB
from modules.anilist import AniList from modules.anilist import AniList
@ -27,9 +26,8 @@ from modules.tautulli import Tautulli
from modules.tmdb import TMDb from modules.tmdb import TMDb
from modules.trakt import Trakt from modules.trakt import Trakt
from modules.tvdb import TVDb from modules.tvdb import TVDb
from modules.util import Failed, NotScheduled, NotScheduledRange, YAML from modules.util import Failed, NotScheduled, NotScheduledRange
from modules.webhooks import Webhooks from modules.webhooks import Webhooks
from retrying import retry
logger = util.logger logger = util.logger
@ -142,7 +140,7 @@ library_operations = {
} }
class ConfigFile: class ConfigFile:
def __init__(self, default_dir, attrs, secrets): def __init__(self, in_request, default_dir, attrs, secrets):
logger.info("Locating config...") logger.info("Locating config...")
config_file = attrs["config_file"] config_file = attrs["config_file"]
if config_file and os.path.exists(config_file): self.config_path = os.path.abspath(config_file) if config_file and os.path.exists(config_file): self.config_path = os.path.abspath(config_file)
@ -153,10 +151,9 @@ class ConfigFile:
logger.clear_errors() logger.clear_errors()
self._mediastingers = None self._mediastingers = None
self.Requests = in_request
self.default_dir = default_dir self.default_dir = default_dir
self.secrets = secrets self.secrets = secrets
self.version = attrs["version"] if "version" in attrs else None
self.branch = attrs["branch"] if "branch" in attrs else None
self.read_only = attrs["read_only"] if "read_only" in attrs else False self.read_only = attrs["read_only"] if "read_only" in attrs else False
self.no_missing = attrs["no_missing"] if "no_missing" in attrs else None self.no_missing = attrs["no_missing"] if "no_missing" in attrs else None
self.no_report = attrs["no_report"] if "no_report" in attrs else None self.no_report = attrs["no_report"] if "no_report" in attrs else None
@ -196,7 +193,7 @@ class ConfigFile:
logger.debug(re.sub(r"(token|client.*|url|api_*key|secret|error|delete|run_start|run_end|version|changes|username|password): .+", r"\1: (redacted)", line.strip("\r\n"))) logger.debug(re.sub(r"(token|client.*|url|api_*key|secret|error|delete|run_start|run_end|version|changes|username|password): .+", r"\1: (redacted)", line.strip("\r\n")))
logger.debug("") logger.debug("")
self.data = YAML(self.config_path).data self.data = self.Requests.file_yaml(self.config_path).data
def replace_attr(all_data, in_attr, par): def replace_attr(all_data, in_attr, par):
if "settings" not in all_data: if "settings" not in all_data:
@ -364,7 +361,7 @@ class ConfigFile:
if data is None or attribute not in data: if data is None or attribute not in data:
message = f"{text} not found" message = f"{text} not found"
if parent and save is True: if parent and save is True:
yaml = YAML(self.config_path) yaml = self.Requests.file_yaml(self.config_path)
endline = f"\n{parent} sub-attribute {attribute} added to config" endline = f"\n{parent} sub-attribute {attribute} added to config"
if parent not in yaml.data or not yaml.data[parent]: yaml.data[parent] = {attribute: default} if parent not in yaml.data or not yaml.data[parent]: yaml.data[parent] = {attribute: default}
elif attribute not in yaml.data[parent]: yaml.data[parent][attribute] = default elif attribute not in yaml.data[parent]: yaml.data[parent][attribute] = default
@ -480,7 +477,7 @@ class ConfigFile:
"playlist_sync_to_users": check_for_attribute(self.data, "playlist_sync_to_users", parent="settings", default="all", default_is_none=True), "playlist_sync_to_users": check_for_attribute(self.data, "playlist_sync_to_users", parent="settings", default="all", default_is_none=True),
"playlist_exclude_users": check_for_attribute(self.data, "playlist_exclude_users", parent="settings", default_is_none=True), "playlist_exclude_users": check_for_attribute(self.data, "playlist_exclude_users", parent="settings", default_is_none=True),
"playlist_report": check_for_attribute(self.data, "playlist_report", parent="settings", var_type="bool", default=True), "playlist_report": check_for_attribute(self.data, "playlist_report", parent="settings", var_type="bool", default=True),
"verify_ssl": check_for_attribute(self.data, "verify_ssl", parent="settings", var_type="bool", default=True), "verify_ssl": check_for_attribute(self.data, "verify_ssl", parent="settings", var_type="bool", default=True, save=False),
"custom_repo": check_for_attribute(self.data, "custom_repo", parent="settings", default_is_none=True), "custom_repo": check_for_attribute(self.data, "custom_repo", parent="settings", default_is_none=True),
"overlay_artwork_filetype": check_for_attribute(self.data, "overlay_artwork_filetype", parent="settings", test_list=filetype_list, translations={"webp": "webp_lossy"}, default="jpg"), "overlay_artwork_filetype": check_for_attribute(self.data, "overlay_artwork_filetype", parent="settings", test_list=filetype_list, translations={"webp": "webp_lossy"}, default="jpg"),
"overlay_artwork_quality": check_for_attribute(self.data, "overlay_artwork_quality", parent="settings", var_type="int", default_is_none=True, int_min=1, int_max=100), "overlay_artwork_quality": check_for_attribute(self.data, "overlay_artwork_quality", parent="settings", var_type="int", default_is_none=True, int_min=1, int_max=100),
@ -492,7 +489,9 @@ class ConfigFile:
if "https://github.com/" in repo: if "https://github.com/" in repo:
repo = repo.replace("https://github.com/", "https://raw.githubusercontent.com/").replace("/tree/", "/") repo = repo.replace("https://github.com/", "https://raw.githubusercontent.com/").replace("/tree/", "/")
self.custom_repo = repo self.custom_repo = repo
self.latest_version = util.current_version(self.version, branch=self.branch)
if not self.general["verify_ssl"]:
self.Requests.no_verify_ssl()
add_operations = True if "operations" not in self.general["run_order"] else False add_operations = True if "operations" not in self.general["run_order"] else False
add_metadata = True if "metadata" not in self.general["run_order"] else False add_metadata = True if "metadata" not in self.general["run_order"] else False
@ -516,25 +515,20 @@ class ConfigFile:
new_run_order.append("overlays") new_run_order.append("overlays")
self.general["run_order"] = new_run_order self.general["run_order"] = new_run_order
yaml = YAML(self.config_path) config_yaml = self.Requests.file_yaml(self.config_path)
if "settings" not in yaml.data or not yaml.data["settings"]: if "settings" not in config_yaml.data or not config_yaml.data["settings"]:
yaml.data["settings"] = {} config_yaml.data["settings"] = {}
yaml.data["settings"]["run_order"] = new_run_order config_yaml.data["settings"]["run_order"] = new_run_order
yaml.save() config_yaml.save()
self.session = requests.Session()
if not self.general["verify_ssl"]:
self.session.verify = False
if self.session.verify is False:
import urllib3
urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)
if self.general["cache"]: if self.general["cache"]:
logger.separator() logger.separator()
self.Cache = Cache(self.config_path, self.general["cache_expiration"]) self.Cache = Cache(self.config_path, self.general["cache_expiration"])
else: else:
self.Cache = None self.Cache = None
self.GitHub = GitHub(self, {"token": check_for_attribute(self.data, "token", parent="github", default_is_none=True)}) self.GitHub = GitHub(self.Requests, {
"token": check_for_attribute(self.data, "token", parent="github", default_is_none=True)
})
logger.separator() logger.separator()
@ -542,7 +536,9 @@ class ConfigFile:
if "notifiarr" in self.data: if "notifiarr" in self.data:
logger.info("Connecting to Notifiarr...") logger.info("Connecting to Notifiarr...")
try: try:
self.NotifiarrFactory = Notifiarr(self, {"apikey": check_for_attribute(self.data, "apikey", parent="notifiarr", throw=True)}) self.NotifiarrFactory = Notifiarr(self.Requests, {
"apikey": check_for_attribute(self.data, "apikey", parent="notifiarr", throw=True)
})
except Failed as e: except Failed as e:
if str(e).endswith("is blank"): if str(e).endswith("is blank"):
logger.warning(e) logger.warning(e)
@ -557,7 +553,7 @@ class ConfigFile:
if "gotify" in self.data: if "gotify" in self.data:
logger.info("Connecting to Gotify...") logger.info("Connecting to Gotify...")
try: try:
self.GotifyFactory = Gotify(self, { self.GotifyFactory = Gotify(self.Requests, {
"url": check_for_attribute(self.data, "url", parent="gotify", throw=True), "url": check_for_attribute(self.data, "url", parent="gotify", throw=True),
"token": check_for_attribute(self.data, "token", parent="gotify", throw=True) "token": check_for_attribute(self.data, "token", parent="gotify", throw=True)
}) })
@ -582,8 +578,8 @@ class ConfigFile:
self.Webhooks = Webhooks(self, self.webhooks, notifiarr=self.NotifiarrFactory, gotify=self.GotifyFactory) self.Webhooks = Webhooks(self, self.webhooks, notifiarr=self.NotifiarrFactory, gotify=self.GotifyFactory)
try: try:
self.Webhooks.start_time_hooks(self.start_time) self.Webhooks.start_time_hooks(self.start_time)
if self.version[0] != "Unknown" and self.latest_version[0] != "Unknown" and self.version[1] != self.latest_version[1] or (self.version[2] and self.version[2] < self.latest_version[2]): if self.Requests.has_new_version():
self.Webhooks.version_hooks(self.version, self.latest_version) self.Webhooks.version_hooks(self.Requests.version, self.Requests.latest_version)
except Failed as e: except Failed as e:
logger.stacktrace() logger.stacktrace()
logger.error(f"Webhooks Error: {e}") logger.error(f"Webhooks Error: {e}")
@ -613,7 +609,7 @@ class ConfigFile:
if "omdb" in self.data: if "omdb" in self.data:
logger.info("Connecting to OMDb...") logger.info("Connecting to OMDb...")
try: try:
self.OMDb = OMDb(self, { self.OMDb = OMDb(self.Requests, self.Cache, {
"apikey": check_for_attribute(self.data, "apikey", parent="omdb", throw=True), "apikey": check_for_attribute(self.data, "apikey", parent="omdb", throw=True),
"expiration": check_for_attribute(self.data, "cache_expiration", parent="omdb", var_type="int", default=60, int_min=1) "expiration": check_for_attribute(self.data, "cache_expiration", parent="omdb", var_type="int", default=60, int_min=1)
}) })
@ -628,7 +624,7 @@ class ConfigFile:
logger.separator() logger.separator()
self.MDBList = MDBList(self) self.MDBList = MDBList(self.Requests, self.Cache)
if "mdblist" in self.data: if "mdblist" in self.data:
logger.info("Connecting to MDBList...") logger.info("Connecting to MDBList...")
try: try:
@ -652,7 +648,7 @@ class ConfigFile:
if "trakt" in self.data: if "trakt" in self.data:
logger.info("Connecting to Trakt...") logger.info("Connecting to Trakt...")
try: try:
self.Trakt = Trakt(self, { self.Trakt = Trakt(self.Requests, self.read_only, {
"client_id": check_for_attribute(self.data, "client_id", parent="trakt", throw=True), "client_id": check_for_attribute(self.data, "client_id", parent="trakt", throw=True),
"client_secret": check_for_attribute(self.data, "client_secret", parent="trakt", throw=True), "client_secret": check_for_attribute(self.data, "client_secret", parent="trakt", throw=True),
"pin": check_for_attribute(self.data, "pin", parent="trakt", default_is_none=True), "pin": check_for_attribute(self.data, "pin", parent="trakt", default_is_none=True),
@ -674,7 +670,7 @@ class ConfigFile:
if "mal" in self.data: if "mal" in self.data:
logger.info("Connecting to My Anime List...") logger.info("Connecting to My Anime List...")
try: try:
self.MyAnimeList = MyAnimeList(self, { self.MyAnimeList = MyAnimeList(self.Requests, self.Cache, self.read_only, {
"client_id": check_for_attribute(self.data, "client_id", parent="mal", throw=True), "client_id": check_for_attribute(self.data, "client_id", parent="mal", throw=True),
"client_secret": check_for_attribute(self.data, "client_secret", parent="mal", throw=True), "client_secret": check_for_attribute(self.data, "client_secret", parent="mal", throw=True),
"localhost_url": check_for_attribute(self.data, "localhost_url", parent="mal", default_is_none=True), "localhost_url": check_for_attribute(self.data, "localhost_url", parent="mal", default_is_none=True),
@ -691,7 +687,9 @@ class ConfigFile:
else: else:
logger.info("mal attribute not found") logger.info("mal attribute not found")
self.AniDB = AniDB(self, {"language": check_for_attribute(self.data, "language", parent="anidb", default="en")}) self.AniDB = AniDB(self.Requests, self.Cache, {
"language": check_for_attribute(self.data, "language", parent="anidb", default="en")
})
if "anidb" in self.data: if "anidb" in self.data:
logger.separator() logger.separator()
logger.info("Connecting to AniDB...") logger.info("Connecting to AniDB...")
@ -745,15 +743,15 @@ class ConfigFile:
logger.info("") logger.info("")
logger.separator(f"Skipping {e} Playlist File") logger.separator(f"Skipping {e} Playlist File")
self.TVDb = TVDb(self, self.general["tvdb_language"], self.general["cache_expiration"]) self.TVDb = TVDb(self.Requests, self.Cache, self.general["tvdb_language"], self.general["cache_expiration"])
self.IMDb = IMDb(self) self.IMDb = IMDb(self.Requests, self.Cache, self.default_dir)
self.Convert = Convert(self) self.Convert = Convert(self.Requests, self.Cache, self.TMDb)
self.AniList = AniList(self) self.AniList = AniList(self.Requests)
self.ICheckMovies = ICheckMovies(self) self.ICheckMovies = ICheckMovies(self.Requests)
self.Letterboxd = Letterboxd(self) self.Letterboxd = Letterboxd(self.Requests, self.Cache)
self.BoxOfficeMojo = BoxOfficeMojo(self) self.BoxOfficeMojo = BoxOfficeMojo(self.Requests, self.Cache)
self.Reciperr = Reciperr(self) self.Reciperr = Reciperr(self.Requests)
self.Ergast = Ergast(self) self.Ergast = Ergast(self.Requests, self.Cache)
logger.separator() logger.separator()
@ -1165,15 +1163,15 @@ class ConfigFile:
for attr in ["clean_bundles", "empty_trash", "optimize"]: for attr in ["clean_bundles", "empty_trash", "optimize"]:
try: try:
params["plex"][attr] = check_for_attribute(lib, attr, parent="plex", var_type="bool", save=False, throw=True) params["plex"][attr] = check_for_attribute(lib, attr, parent="plex", var_type="bool", save=False, throw=True)
except Failed as er: except Failed:
test = lib["plex"][attr] if "plex" in lib and attr in lib["plex"] and lib["plex"][attr] else self.general["plex"][attr] test_attr = lib["plex"][attr] if "plex" in lib and attr in lib["plex"] and lib["plex"][attr] else self.general["plex"][attr]
params["plex"][attr] = False params["plex"][attr] = False
if test is not True and test is not False: if test_attr is not True and test_attr is not False:
try: try:
util.schedule_check(attr, test, current_time, self.run_hour) util.schedule_check(attr, test_attr, current_time, self.run_hour)
params["plex"][attr] = True params["plex"][attr] = True
except NotScheduled: except NotScheduled:
logger.info(f"Skipping Operation Not Scheduled for {test}") logger.info(f"Skipping Operation Not Scheduled for {test_attr}")
if params["plex"]["url"].lower() == "env": if params["plex"]["url"].lower() == "env":
params["plex"]["url"] = self.env_plex_url params["plex"]["url"] = self.env_plex_url
@ -1201,7 +1199,7 @@ class ConfigFile:
logger.info(f"Connecting to {display_name} library's Radarr...") logger.info(f"Connecting to {display_name} library's Radarr...")
logger.info("") logger.info("")
try: try:
library.Radarr = Radarr(self, library, { library.Radarr = Radarr(self.Requests, self.Cache, library, {
"url": check_for_attribute(lib, "url", parent="radarr", var_type="url", default=self.general["radarr"]["url"], req_default=True, save=False), "url": check_for_attribute(lib, "url", parent="radarr", var_type="url", default=self.general["radarr"]["url"], req_default=True, save=False),
"token": check_for_attribute(lib, "token", parent="radarr", default=self.general["radarr"]["token"], req_default=True, save=False), "token": check_for_attribute(lib, "token", parent="radarr", default=self.general["radarr"]["token"], req_default=True, save=False),
"add_missing": check_for_attribute(lib, "add_missing", parent="radarr", var_type="bool", default=self.general["radarr"]["add_missing"], save=False), "add_missing": check_for_attribute(lib, "add_missing", parent="radarr", var_type="bool", default=self.general["radarr"]["add_missing"], save=False),
@@ -1231,7 +1229,7 @@ class ConfigFile:
                     logger.info(f"Connecting to {display_name} library's Sonarr...")
                     logger.info("")
                     try:
-                        library.Sonarr = Sonarr(self, library, {
+                        library.Sonarr = Sonarr(self.Requests, self.Cache, library, {
                             "url": check_for_attribute(lib, "url", parent="sonarr", var_type="url", default=self.general["sonarr"]["url"], req_default=True, save=False),
                             "token": check_for_attribute(lib, "token", parent="sonarr", default=self.general["sonarr"]["token"], req_default=True, save=False),
                             "add_missing": check_for_attribute(lib, "add_missing", parent="sonarr", var_type="bool", default=self.general["sonarr"]["add_missing"], save=False),
@@ -1264,7 +1262,7 @@ class ConfigFile:
                     logger.info(f"Connecting to {display_name} library's Tautulli...")
                     logger.info("")
                     try:
-                        library.Tautulli = Tautulli(self, library, {
+                        library.Tautulli = Tautulli(self.Requests, library, {
                             "url": check_for_attribute(lib, "url", parent="tautulli", var_type="url", default=self.general["tautulli"]["url"], req_default=True, save=False),
                             "apikey": check_for_attribute(lib, "apikey", parent="tautulli", default=self.general["tautulli"]["apikey"], req_default=True, save=False)
                         })
@@ -1315,44 +1313,8 @@ class ConfigFile:
             logger.stacktrace()
             logger.error(f"Webhooks Error: {e}")

-    def get_html(self, url, headers=None, params=None):
-        return html.fromstring(self.get(url, headers=headers, params=params).content)
-
-    def get_json(self, url, json=None, headers=None, params=None):
-        response = self.get(url, json=json, headers=headers, params=params)
-        try:
-            return response.json()
-        except ValueError:
-            logger.error(str(response.content))
-            raise
-
-    @retry(stop_max_attempt_number=6, wait_fixed=10000)
-    def get(self, url, json=None, headers=None, params=None):
-        return self.session.get(url, json=json, headers=headers, params=params)
-
-    def get_image_encoded(self, url):
-        return base64.b64encode(self.get(url).content).decode('utf-8')
-
-    def post_html(self, url, data=None, json=None, headers=None):
-        return html.fromstring(self.post(url, data=data, json=json, headers=headers).content)
-
-    def post_json(self, url, data=None, json=None, headers=None):
-        response = self.post(url, data=data, json=json, headers=headers)
-        try:
-            return response.json()
-        except ValueError:
-            logger.error(str(response.content))
-            raise
-
-    @retry(stop_max_attempt_number=6, wait_fixed=10000)
-    def post(self, url, data=None, json=None, headers=None):
-        return self.session.post(url, data=data, json=json, headers=headers)
-
-    def load_yaml(self, url):
-        return YAML(input_data=self.get(url).content).data
-
     @property
     def mediastingers(self):
         if self._mediastingers is None:
-            self._mediastingers = self.load_yaml(mediastingers_url)
+            self._mediastingers = self.Requests.get_yaml(mediastingers_url)
         return self._mediastingers
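
The helpers deleted above are consolidated behind the new Requests object that every call site in this diff now receives. modules/request.py itself is not part of this excerpt, so the following is only a minimal sketch of the surface these call sites assume, reconstructed from the deleted methods and the new calls; anything beyond the names used above is an assumption:

    import base64
    from lxml import html
    from requests import Session
    from retrying import retry

    class Requests:
        # Sketch only; the real module also exposes get_yaml, get_stream,
        # download_image, file_yaml, and version/branch info (assumed).
        def __init__(self):
            self.session = Session()

        @retry(stop_max_attempt_number=6, wait_fixed=10000)
        def get(self, url, json=None, headers=None, params=None):
            # Same retry policy as the deleted ConfigFile.get: 6 attempts, 10s apart.
            return self.session.get(url, json=json, headers=headers, params=params)

        def get_html(self, url, headers=None, params=None, language=None, header=False):
            # language/header would presumably build the Accept-Language header
            # that call sites previously passed via util.header(language).
            return html.fromstring(self.get(url, headers=headers, params=params).content)

        def get_json(self, url, json=None, headers=None, params=None):
            return self.get(url, json=json, headers=headers, params=params).json()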

@@ -1,15 +1,19 @@
-import re, requests
+import re
 from modules import util
 from modules.util import Failed, NonExisting
+from modules.request import urlparse
 from plexapi.exceptions import BadRequest
+from requests.exceptions import ConnectionError

 logger = util.logger

 anime_lists_url = "https://raw.githubusercontent.com/Kometa-Team/Anime-IDs/master/anime_ids.json"

 class Convert:
-    def __init__(self, config):
-        self.config = config
+    def __init__(self, requests, cache, tmdb):
+        self.requests = requests
+        self.cache = cache
+        self.tmdb = tmdb
         self._anidb_ids = {}
         self._mal_to_anidb = {}
         self._anidb_to_mal = {}
@@ -22,7 +26,7 @@ class Convert:
         self._tmdb_show_to_anidb = {}
         self._imdb_to_anidb = {}
         self._tvdb_to_anidb = {}
-        self._anidb_ids = self.config.get_json(anime_lists_url)
+        self._anidb_ids = self.requests.get_json(anime_lists_url)
         for anidb_id, ids in self._anidb_ids.items():
             anidb_id = int(anidb_id)
             if "mal_id" in ids:
@@ -78,6 +82,11 @@ class Convert:
         else:
             return None

+    def anidb_to_mal(self, anidb_id):
+        if anidb_id not in self._anidb_to_mal:
+            raise Failed(f"Convert Warning: No MyAnimeList Found for AniDB ID: {anidb_id}")
+        return self._anidb_to_mal[anidb_id]
+
     def anidb_to_ids(self, anidb_ids, library):
         ids = []
         anidb_list = anidb_ids if isinstance(anidb_ids, list) else [anidb_ids]
@@ -139,15 +148,15 @@ class Convert:
     def tmdb_to_imdb(self, tmdb_id, is_movie=True, fail=False):
         media_type = "movie" if is_movie else "show"
         expired = False
-        if self.config.Cache and is_movie:
-            cache_id, expired = self.config.Cache.query_imdb_to_tmdb_map(tmdb_id, imdb=False, media_type=media_type)
+        if self.cache and is_movie:
+            cache_id, expired = self.cache.query_imdb_to_tmdb_map(tmdb_id, imdb=False, media_type=media_type)
             if cache_id and not expired:
                 return cache_id
         try:
-            imdb_id = self.config.TMDb.convert_from(tmdb_id, "imdb_id", is_movie)
+            imdb_id = self.tmdb.convert_from(tmdb_id, "imdb_id", is_movie)
             if imdb_id:
-                if self.config.Cache:
-                    self.config.Cache.update_imdb_to_tmdb_map(media_type, expired, imdb_id, tmdb_id)
+                if self.cache:
+                    self.cache.update_imdb_to_tmdb_map(media_type, expired, imdb_id, tmdb_id)
                 return imdb_id
         except Failed:
             pass
@@ -158,15 +167,15 @@ class Convert:
     def imdb_to_tmdb(self, imdb_id, fail=False):
         expired = False
-        if self.config.Cache:
-            cache_id, cache_type, expired = self.config.Cache.query_imdb_to_tmdb_map(imdb_id, imdb=True, return_type=True)
+        if self.cache:
+            cache_id, cache_type, expired = self.cache.query_imdb_to_tmdb_map(imdb_id, imdb=True, return_type=True)
             if cache_id and not expired:
                 return cache_id, cache_type
         try:
-            tmdb_id, tmdb_type = self.config.TMDb.convert_imdb_to(imdb_id)
+            tmdb_id, tmdb_type = self.tmdb.convert_imdb_to(imdb_id)
             if tmdb_id:
-                if self.config.Cache:
-                    self.config.Cache.update_imdb_to_tmdb_map(tmdb_type, expired, imdb_id, tmdb_id)
+                if self.cache:
+                    self.cache.update_imdb_to_tmdb_map(tmdb_type, expired, imdb_id, tmdb_id)
                 return tmdb_id, tmdb_type
         except Failed:
             pass
@@ -177,15 +186,15 @@ class Convert:
     def tmdb_to_tvdb(self, tmdb_id, fail=False):
         expired = False
-        if self.config.Cache:
-            cache_id, expired = self.config.Cache.query_tmdb_to_tvdb_map(tmdb_id, tmdb=True)
+        if self.cache:
+            cache_id, expired = self.cache.query_tmdb_to_tvdb_map(tmdb_id, tmdb=True)
             if cache_id and not expired:
                 return cache_id
         try:
-            tvdb_id = self.config.TMDb.convert_from(tmdb_id, "tvdb_id", False)
+            tvdb_id = self.tmdb.convert_from(tmdb_id, "tvdb_id", False)
             if tvdb_id:
-                if self.config.Cache:
-                    self.config.Cache.update_tmdb_to_tvdb_map(expired, tmdb_id, tvdb_id)
+                if self.cache:
+                    self.cache.update_tmdb_to_tvdb_map(expired, tmdb_id, tvdb_id)
                 return tvdb_id
         except Failed:
             pass
@@ -196,15 +205,15 @@ class Convert:
     def tvdb_to_tmdb(self, tvdb_id, fail=False):
         expired = False
-        if self.config.Cache:
-            cache_id, expired = self.config.Cache.query_tmdb_to_tvdb_map(tvdb_id, tmdb=False)
+        if self.cache:
+            cache_id, expired = self.cache.query_tmdb_to_tvdb_map(tvdb_id, tmdb=False)
             if cache_id and not expired:
                 return cache_id
         try:
-            tmdb_id = self.config.TMDb.convert_tvdb_to(tvdb_id)
+            tmdb_id = self.tmdb.convert_tvdb_to(tvdb_id)
             if tmdb_id:
-                if self.config.Cache:
-                    self.config.Cache.update_tmdb_to_tvdb_map(expired, tmdb_id, tvdb_id)
+                if self.cache:
+                    self.cache.update_tmdb_to_tvdb_map(expired, tmdb_id, tvdb_id)
                 return tmdb_id
         except Failed:
             pass
@@ -215,15 +224,15 @@ class Convert:
     def tvdb_to_imdb(self, tvdb_id, fail=False):
         expired = False
-        if self.config.Cache:
-            cache_id, expired = self.config.Cache.query_imdb_to_tvdb_map(tvdb_id, imdb=False)
+        if self.cache:
+            cache_id, expired = self.cache.query_imdb_to_tvdb_map(tvdb_id, imdb=False)
             if cache_id and not expired:
                 return cache_id
         try:
             imdb_id = self.tmdb_to_imdb(self.tvdb_to_tmdb(tvdb_id, fail=True), is_movie=False, fail=True)
             if imdb_id:
-                if self.config.Cache:
-                    self.config.Cache.update_imdb_to_tvdb_map(expired, imdb_id, tvdb_id)
+                if self.cache:
+                    self.cache.update_imdb_to_tvdb_map(expired, imdb_id, tvdb_id)
                 return imdb_id
         except Failed:
             pass
@@ -234,8 +243,8 @@ class Convert:
     def imdb_to_tvdb(self, imdb_id, fail=False):
         expired = False
-        if self.config.Cache:
-            cache_id, expired = self.config.Cache.query_imdb_to_tvdb_map(imdb_id, imdb=True)
+        if self.cache:
+            cache_id, expired = self.cache.query_imdb_to_tvdb_map(imdb_id, imdb=True)
             if cache_id and not expired:
                 return cache_id
         try:
@@ -243,8 +252,8 @@ class Convert:
             if tmdb_type == "show":
                 tvdb_id = self.tmdb_to_tvdb(tmdb_id, fail=True)
                 if tvdb_id:
-                    if self.config.Cache:
-                        self.config.Cache.update_imdb_to_tvdb_map(expired, imdb_id, tvdb_id)
+                    if self.cache:
+                        self.cache.update_imdb_to_tvdb_map(expired, imdb_id, tvdb_id)
                     return tvdb_id
         except Failed:
             pass
@@ -258,8 +267,8 @@ class Convert:
         cache_id = None
         imdb_check = None
         expired = None
-        if self.config.Cache:
-            cache_id, imdb_check, media_type, expired = self.config.Cache.query_guid_map(guid)
+        if self.cache:
+            cache_id, imdb_check, media_type, expired = self.cache.query_guid_map(guid)
         if (cache_id or imdb_check) and not expired:
             media_id_type = "movie" if "movie" in media_type else "show"
             if item_type == "hama" and check_id.startswith("anidb"):
@@ -270,7 +279,7 @@ class Convert:
         return media_id_type, cache_id, imdb_check, expired

     def scan_guid(self, guid_str):
-        guid = requests.utils.urlparse(guid_str)
+        guid = urlparse(guid_str)
         return guid.scheme.split(".")[-1], guid.netloc

     def get_id(self, item, library):
@@ -288,13 +297,13 @@ class Convert:
         try:
             for guid_tag in item.guids:
                 try:
-                    url_parsed = requests.utils.urlparse(guid_tag.id)
+                    url_parsed = urlparse(guid_tag.id)
                     if url_parsed.scheme == "tvdb": tvdb_id.append(int(url_parsed.netloc))
                     elif url_parsed.scheme == "imdb": imdb_id.append(url_parsed.netloc)
                     elif url_parsed.scheme == "tmdb": tmdb_id.append(int(url_parsed.netloc))
                 except ValueError:
                     pass
-        except requests.exceptions.ConnectionError:
+        except ConnectionError:
             library.query(item.refresh)
             logger.stacktrace()
             raise Failed("No External GUIDs found")
@@ -375,12 +384,12 @@ class Convert:
                     imdb_id.append(imdb)

         def update_cache(cache_ids, id_type, imdb_in, guid_type):
-            if self.config.Cache:
+            if self.cache:
                 cache_ids = ",".join([str(c) for c in cache_ids])
                 imdb_in = ",".join([str(i) for i in imdb_in]) if imdb_in else None
                 ids = f"{item.guid:<46} | {id_type} ID: {cache_ids:<7} | IMDb ID: {str(imdb_in):<10}"
                 logger.info(f" Cache | {'^' if expired else '+'} | {ids} | {item.title}")
-                self.config.Cache.update_guid_map(item.guid, cache_ids, imdb_in, expired, guid_type)
+                self.cache.update_guid_map(item.guid, cache_ids, imdb_in, expired, guid_type)

         if (tmdb_id or imdb_id) and library.is_movie:
             update_cache(tmdb_id, "TMDb", imdb_id, "movie")
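
Every Convert lookup above shares one read-through cache shape, now written against the injected cache rather than config.Cache. A hypothetical condensation of that pattern (the query_*/update_* calls are the ones used above; this helper itself does not exist in the codebase):

    def cached_lookup(cache_query, cache_update, fetch):
        # 1. Fresh cache hit: serve it straight from the local database.
        cache_id, expired = cache_query() if cache_query else (None, False)
        if cache_id and not expired:
            return cache_id
        # 2. Miss or stale row: fetch from TMDb, then write the mapping back
        #    (expired tells the cache whether to update or insert the row).
        result = fetch()
        if result and cache_update:
            cache_update(expired, result)
        return result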

@@ -156,20 +156,21 @@ class Race:

 class Ergast:
-    def __init__(self, config):
-        self.config = config
+    def __init__(self, requests, cache):
+        self.requests = requests
+        self.cache = cache

     def get_races(self, year, language, ignore_cache=False):
         expired = None
-        if self.config.Cache and not ignore_cache:
-            race_list, expired = self.config.Cache.query_ergast(year, self.config.Cache.expiration)
+        if self.cache and not ignore_cache:
+            race_list, expired = self.cache.query_ergast(year, self.cache.expiration)
             if race_list and expired is False:
                 return [Race(r, language) for r in race_list]
-        response = self.config.get(f"{base_url}{year}.json")
+        response = self.requests.get(f"{base_url}{year}.json")
         if response.status_code < 400:
             races = [Race(r, language) for r in response.json()["MRData"]["RaceTable"]["Races"]]
-            if self.config.Cache and not ignore_cache:
-                self.config.Cache.update_ergast(expired, year, races, self.config.Cache.expiration)
+            if self.cache and not ignore_cache:
+                self.cache.update_ergast(expired, year, races, self.cache.expiration)
             return races
         else:
             raise Failed(f"Ergast Error: F1 Season: {year} Not found")

@@ -10,8 +10,8 @@ kometa_base = f"{base_url}/repos/Kometa-Team/Kometa"
 configs_raw_url = f"{raw_url}/Kometa-Team/Community-Configs"

 class GitHub:
-    def __init__(self, config, params):
-        self.config = config
+    def __init__(self, requests, params):
+        self.requests = requests
         self.token = params["token"]
         logger.secret(self.token)
         self.headers = {"Authorization": f"token {self.token}"} if self.token else None
@@ -22,19 +22,19 @@ class GitHub:
         self._translation_keys = []
         self._translations = {}

-    def _requests(self, url, err_msg=None, json=True, params=None):
-        response = self.config.get(url, headers=self.headers, params=params)
+    def _requests(self, url, err_msg=None, params=None, yaml=False):
         if not err_msg:
             err_msg = f"URL Not Found: {url}"
+        if yaml:
+            return self.requests.get_yaml(url, headers=self.headers, params=params)
+        response = self.requests.get(url, headers=self.headers, params=params)
         if response.status_code >= 400:
             raise Failed(f"Git Error: {err_msg}")
-        if json:
-            try:
-                return response.json()
-            except ValueError:
-                logger.error(str(response.content))
-                raise
-        return response
+        try:
+            return response.json()
+        except ValueError:
+            logger.error(str(response.content))
+            raise

     def get_top_tree(self, repo):
         if not str(repo).startswith("/"):
@@ -77,8 +77,8 @@ class GitHub:
     def configs_url(self):
         if self._configs_url is None:
             self._configs_url = f"{configs_raw_url}/master/"
-            if self.config.version[1] in self.config_tags and (self.config.latest_version[1] != self.config.version[1] or self.config.branch == "master"):
-                self._configs_url = f"{configs_raw_url}/v{self.config.version[1]}/"
+            if self.requests.version[1] in self.config_tags and (self.requests.latest_version[1] != self.requests.version[1] or self.requests.branch == "master"):
+                self._configs_url = f"{configs_raw_url}/v{self.requests.version[1]}/"
         return self._configs_url

     @property
@@ -90,8 +90,7 @@ class GitHub:
     def translation_yaml(self, translation_key):
         if translation_key not in self._translations:
-            url = f"{self.translation_url}{translation_key}.yml"
-            yaml = util.YAML(input_data=self._requests(url, json=False).content).data
+            yaml = self._requests(f"{self.translation_url}{translation_key}.yml", yaml=True).data
             output = {"collections": {}, "key_names": {}, "variables": {}}
             for k in output:
                 if k in yaml:
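
The reworked _requests now has two modes; a usage sketch based on the call sites above (URLs and err_msg values illustrative):

    # JSON mode (default): raises Failed(f"Git Error: {err_msg}") on HTTP >= 400,
    # then decodes the body with response.json().
    tree = self._requests(f"{kometa_base}/git/trees/master", err_msg="Top Tree Not Found")

    # YAML mode: delegates straight to Requests.get_yaml and returns its YAML
    # object, which is why translation_yaml reads `.data` off the result.
    translations = self._requests(f"{self.translation_url}en.yml", yaml=True).data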

@@ -5,8 +5,8 @@ from modules.util import Failed
 logger = util.logger

 class Gotify:
-    def __init__(self, config, params):
-        self.config = config
+    def __init__(self, requests, params):
+        self.requests = requests
         self.token = params["token"]
         self.url = params["url"].rstrip("/")
         logger.secret(self.url)
@@ -19,9 +19,9 @@ class Gotify:

     def _request(self, path="message", json=None, post=True):
         if post:
-            response = self.config.post(f"{self.url}/{path}", headers={"X-Gotify-Key": self.token}, json=json)
+            response = self.requests.post(f"{self.url}/{path}", headers={"X-Gotify-Key": self.token}, json=json)
         else:
-            response = self.config.get(f"{self.url}/{path}")
+            response = self.requests.get(f"{self.url}/{path}")
         try:
             response_json = response.json()
         except JSONDecodeError as e:

@@ -7,12 +7,12 @@ builders = ["icheckmovies_list", "icheckmovies_list_details"]
 base_url = "https://www.icheckmovies.com/lists/"

 class ICheckMovies:
-    def __init__(self, config):
-        self.config = config
+    def __init__(self, requests):
+        self.requests = requests

     def _request(self, url, language, xpath):
         logger.trace(f"URL: {url}")
-        return self.config.get_html(url, headers=util.header(language)).xpath(xpath)
+        return self.requests.get_html(url, language=language).xpath(xpath)

     def _parse_list(self, list_url, language):
         imdb_urls = self._request(list_url, language, "//a[@class='optionIcon optionIMDB external']/@href")

@@ -1,7 +1,7 @@
-import csv, gzip, json, math, os, re, requests, shutil, time
+import csv, gzip, json, math, os, re, shutil, time
 from modules import util
+from modules.request import parse_qs, urlparse
 from modules.util import Failed
-from urllib.parse import urlparse, parse_qs

 logger = util.logger
@@ -94,8 +94,10 @@ graphql_url = "https://api.graphql.imdb.com/"
 list_url = f"{base_url}/list/ls"

 class IMDb:
-    def __init__(self, config):
-        self.config = config
+    def __init__(self, requests, cache, default_dir):
+        self.requests = requests
+        self.cache = cache
+        self.default_dir = default_dir
         self._ratings = None
         self._genres = None
         self._episode_ratings = None
@@ -108,28 +110,27 @@ class IMDb:
         logger.trace(f"URL: {url}")
         if params:
             logger.trace(f"Params: {params}")
-        headers = util.header(language) if language else util.header()
-        response = self.config.get_html(url, headers=headers, params=params)
+        response = self.requests.get_html(url, params=params, header=True, language=language)
         return response.xpath(xpath) if xpath else response

     def _graph_request(self, json_data):
-        return self.config.post_json(graphql_url, headers={"content-type": "application/json"}, json=json_data)
+        return self.requests.post_json(graphql_url, headers={"content-type": "application/json"}, json=json_data)

     @property
     def hash(self):
         if self._hash is None:
-            self._hash = self.config.get(hash_url).text.strip()
+            self._hash = self.requests.get(hash_url).text.strip()
         return self._hash

     @property
     def events_validation(self):
         if self._events_validation is None:
-            self._events_validation = self.config.load_yaml(f"{git_base}/event_validation.yml")
+            self._events_validation = self.requests.get_yaml(f"{git_base}/event_validation.yml").data
         return self._events_validation

     def get_event(self, event_id):
         if event_id not in self._events:
-            self._events[event_id] = self.config.load_yaml(f"{git_base}/events/{event_id}.yml")
+            self._events[event_id] = self.requests.get_yaml(f"{git_base}/events/{event_id}.yml").data
         return self._events[event_id]

     def validate_imdb_lists(self, err_type, imdb_lists, language):
@@ -213,7 +214,7 @@ class IMDb:
     def _watchlist(self, user, language):
         imdb_url = f"{base_url}/user/{user}/watchlist"
-        for text in self._request(imdb_url, language=language , xpath="//div[@class='article']/script/text()")[0].split("\n"):
+        for text in self._request(imdb_url, language=language, xpath="//div[@class='article']/script/text()")[0].split("\n"):
             if text.strip().startswith("IMDbReactInitialState.push"):
                 jsonline = text.strip()
                 return [f for f in json.loads(jsonline[jsonline.find('{'):-2])["starbars"]]
@@ -450,8 +451,8 @@ class IMDb:
     def keywords(self, imdb_id, language, ignore_cache=False):
         imdb_keywords = {}
         expired = None
-        if self.config.Cache and not ignore_cache:
-            imdb_keywords, expired = self.config.Cache.query_imdb_keywords(imdb_id, self.config.Cache.expiration)
+        if self.cache and not ignore_cache:
+            imdb_keywords, expired = self.cache.query_imdb_keywords(imdb_id, self.cache.expiration)
             if imdb_keywords and expired is False:
                 return imdb_keywords
         keywords = self._request(f"{base_url}/title/{imdb_id}/keywords", language=language, xpath="//td[@class='soda sodavote']")
@@ -465,15 +466,15 @@ class IMDb:
                 imdb_keywords[name] = (int(result.group(1)), int(result.group(2)))
             else:
                 imdb_keywords[name] = (0, 0)
-        if self.config.Cache and not ignore_cache:
-            self.config.Cache.update_imdb_keywords(expired, imdb_id, imdb_keywords, self.config.Cache.expiration)
+        if self.cache and not ignore_cache:
+            self.cache.update_imdb_keywords(expired, imdb_id, imdb_keywords, self.cache.expiration)
         return imdb_keywords

     def parental_guide(self, imdb_id, ignore_cache=False):
         parental_dict = {}
         expired = None
-        if self.config.Cache and not ignore_cache:
-            parental_dict, expired = self.config.Cache.query_imdb_parental(imdb_id, self.config.Cache.expiration)
+        if self.cache and not ignore_cache:
+            parental_dict, expired = self.cache.query_imdb_parental(imdb_id, self.cache.expiration)
         if parental_dict and expired is False:
             return parental_dict
         response = self._request(f"{base_url}/title/{imdb_id}/parentalguide")
@@ -483,8 +484,8 @@ class IMDb:
             parental_dict[ptype] = results[0].strip()
         else:
             raise Failed(f"IMDb Error: No Item Found for IMDb ID: {imdb_id}")
-        if self.config.Cache and not ignore_cache:
-            self.config.Cache.update_imdb_parental(expired, imdb_id, parental_dict, self.config.Cache.expiration)
+        if self.cache and not ignore_cache:
+            self.cache.update_imdb_parental(expired, imdb_id, parental_dict, self.cache.expiration)
         return parental_dict

     def _ids_from_chart(self, chart, language):
@@ -542,26 +543,15 @@ class IMDb:
             raise Failed(f"IMDb Error: Method {method} not supported")

     def _interface(self, interface):
-        gz = os.path.join(self.config.default_dir, f"title.{interface}.tsv.gz")
-        tsv = os.path.join(self.config.default_dir, f"title.{interface}.tsv")
+        gz = os.path.join(self.default_dir, f"title.{interface}.tsv.gz")
+        tsv = os.path.join(self.default_dir, f"title.{interface}.tsv")
         if os.path.exists(gz):
             os.remove(gz)
         if os.path.exists(tsv):
             os.remove(tsv)
-        with requests.get(f"https://datasets.imdbws.com/title.{interface}.tsv.gz", stream=True) as r:
-            r.raise_for_status()
-            total_length = r.headers.get('content-length')
-            if total_length is not None:
-                total_length = int(total_length)
-            dl = 0
-            with open(gz, "wb") as f:
-                for chunk in r.iter_content(chunk_size=8192):
-                    dl += len(chunk)
-                    f.write(chunk)
-                    logger.ghost(f"Downloading IMDb Interface: {dl / total_length * 100:6.2f}%")
-            logger.exorcise()
+        self.requests.get_stream(f"https://datasets.imdbws.com/title.{interface}.tsv.gz", gz, "IMDb Interface")
         with open(tsv, "wb") as f_out:
             with gzip.open(gz, "rb") as f_in:
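
get_stream replaces the inline chunked download that previously lived in _interface. Its implementation is in modules/request.py, outside this excerpt; judging from the deleted loop and the call above, it presumably looks roughly like this (signature inferred from the call site, not confirmed):

    def get_stream(self, url, location, description):
        # Stream the response to disk in 8 KiB chunks, reporting percentage
        # progress the way the removed inline loop did.
        with self.session.get(url, stream=True) as r:
            r.raise_for_status()
            total_length = r.headers.get("content-length")
            total_length = int(total_length) if total_length is not None else None
            dl = 0
            with open(location, "wb") as f:
                for chunk in r.iter_content(chunk_size=8192):
                    dl += len(chunk)
                    f.write(chunk)
                    if total_length:
                        logger.ghost(f"Downloading {description}: {dl / total_length * 100:6.2f}%")
        logger.exorcise()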

@@ -8,14 +8,15 @@ builders = ["letterboxd_list", "letterboxd_list_details"]
 base_url = "https://letterboxd.com"

 class Letterboxd:
-    def __init__(self, config):
-        self.config = config
+    def __init__(self, requests, cache):
+        self.requests = requests
+        self.cache = cache

     def _parse_page(self, list_url, language):
         if "ajax" not in list_url:
             list_url = list_url.replace("https://letterboxd.com/films", "https://letterboxd.com/films/ajax")
         logger.trace(f"URL: {list_url}")
-        response = self.config.get_html(list_url, headers=util.header(language))
+        response = self.requests.get_html(list_url, language=language)
         letterboxd_ids = response.xpath("//li[contains(@class, 'poster-container') or contains(@class, 'film-detail')]/div/@data-film-id")
         items = []
         for letterboxd_id in letterboxd_ids:
@@ -44,7 +45,7 @@ class Letterboxd:
     def _tmdb(self, letterboxd_url, language):
         logger.trace(f"URL: {letterboxd_url}")
-        response = self.config.get_html(letterboxd_url, headers=util.header(language))
+        response = self.requests.get_html(letterboxd_url, language=language)
         ids = response.xpath("//a[@data-track-action='TMDb']/@href")
         if len(ids) > 0 and ids[0]:
             if "themoviedb.org/movie" in ids[0]:
@@ -54,7 +55,7 @@ class Letterboxd:
     def get_list_description(self, list_url, language):
         logger.trace(f"URL: {list_url}")
-        response = self.config.get_html(list_url, headers=util.header(language))
+        response = self.requests.get_html(list_url, language=language)
         descriptions = response.xpath("//meta[@property='og:description']/@content")
         return descriptions[0] if len(descriptions) > 0 and len(descriptions[0]) > 0 else None
@@ -106,16 +107,16 @@ class Letterboxd:
             logger.ghost(f"Finding TMDb ID {i}/{total_items}")
             tmdb_id = None
             expired = None
-            if self.config.Cache:
-                tmdb_id, expired = self.config.Cache.query_letterboxd_map(letterboxd_id)
+            if self.cache:
+                tmdb_id, expired = self.cache.query_letterboxd_map(letterboxd_id)
             if not tmdb_id or expired is not False:
                 try:
                     tmdb_id = self._tmdb(f"{base_url}{slug}", language)
                 except Failed as e:
                     logger.error(e)
                     continue
-                if self.config.Cache:
-                    self.config.Cache.update_letterboxd_map(expired, letterboxd_id, tmdb_id)
+                if self.cache:
+                    self.cache.update_letterboxd_map(expired, letterboxd_id, tmdb_id)
             ids.append((tmdb_id, "tmdb"))
         logger.info(f"Processed {total_items} TMDb IDs")
         if filtered_ids:

@@ -1,9 +1,10 @@
 import os, time
 from abc import ABC, abstractmethod
-from modules import util, operations
+from modules import util
 from modules.meta import MetadataFile, OverlayFile
 from modules.operations import Operations
-from modules.util import Failed, NotScheduled, YAML
+from modules.poster import ImageData
+from modules.util import Failed, NotScheduled
 from PIL import Image

 logger = util.logger
@@ -274,6 +275,36 @@ class Library(ABC):
     def image_update(self, item, image, tmdb=None, title=None, poster=True):
         pass

+    def pick_image(self, title, images, prioritize_assets, download_url_assets, item_dir, is_poster=True, image_name=None):
+        image_type = "poster" if is_poster else "background"
+        if image_name is None:
+            image_name = image_type
+        if images:
+            logger.debug(f"{len(images)} {image_type}{'s' if len(images) > 1 else ''} found:")
+            for i in images:
+                logger.debug(f"Method: {i} {image_type.capitalize()}: {images[i]}")
+        if prioritize_assets and "asset_directory" in images:
+            return images["asset_directory"]
+        for attr in ["style_data", f"url_{image_type}", f"file_{image_type}", f"tmdb_{image_type}", "tmdb_profile",
+                     "tmdb_list_poster", "tvdb_list_poster", f"tvdb_{image_type}", "asset_directory",
+                     f"pmm_{image_type}",
+                     "tmdb_person", "tmdb_collection_details", "tmdb_actor_details", "tmdb_crew_details",
+                     "tmdb_director_details",
+                     "tmdb_producer_details", "tmdb_writer_details", "tmdb_movie_details", "tmdb_list_details",
+                     "tvdb_list_details", "tvdb_movie_details", "tvdb_show_details", "tmdb_show_details"]:
+            if attr in images:
+                if attr in ["style_data", f"url_{image_type}"] and download_url_assets and item_dir:
+                    if "asset_directory" in images:
+                        return images["asset_directory"]
+                    else:
+                        try:
+                            return self.config.Requests.download_image(title, images[attr], item_dir, is_poster=is_poster, filename=image_name)
+                        except Failed as e:
+                            logger.error(e)
+                if attr in ["asset_directory", f"pmm_{image_type}"]:
+                    return images[attr]
+                return ImageData(attr, images[attr], is_poster=is_poster, is_url=attr != f"file_{image_type}")
+
     @abstractmethod
     def reload(self, item, force=False):
         pass
@@ -291,7 +322,7 @@ class Library(ABC):
         pass

     def check_image_for_overlay(self, image_url, image_path, remove=False):
-        image_path = util.download_image("", image_url, image_path).location
+        image_path = self.config.Requests.download_image("", image_url, image_path).location
         while util.is_locked(image_path):
             time.sleep(1)
         with Image.open(image_path) as image:
@@ -350,7 +381,7 @@ class Library(ABC):
                 self.report_data[collection][other] = []
             self.report_data[collection][other].append(title)

-        yaml = YAML(self.report_path, start_empty=True)
+        yaml = self.config.Requests.file_yaml(self.report_path, start_empty=True)
         yaml.data = self.report_data
         yaml.save()
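
The new pick_image centralizes artwork selection: asset-directory art wins outright when prioritize_assets is set; otherwise sources are tried in the fixed priority order of the attr list, with style/url art optionally downloaded into the item's asset folder first. A usage sketch (argument values are illustrative only):

    image = library.pick_image(
        "Dune (2021)",
        {"url_poster": "https://example.com/poster.jpg"},  # candidates keyed by method
        prioritize_assets=False,
        download_url_assets=True,
        item_dir="/assets/Dune (2021)",
    )
    # Returns an asset path (str) when the art lives in or is downloaded to the
    # asset directory, otherwise an ImageData record describing the chosen source.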

@@ -2,7 +2,7 @@ import re, secrets, time, webbrowser
 from datetime import datetime
 from json import JSONDecodeError
 from modules import util
-from modules.util import Failed, TimeoutExpired, YAML
+from modules.util import Failed, TimeoutExpired

 logger = util.logger
@@ -79,8 +79,10 @@ class MyAnimeListObj:

 class MyAnimeList:
-    def __init__(self, config, params):
-        self.config = config
+    def __init__(self, requests, cache, read_only, params):
+        self.requests = requests
+        self.cache = cache
+        self.read_only = read_only
         self.client_id = params["client_id"]
         self.client_secret = params["client_secret"]
         self.localhost_url = params["localhost_url"]
@@ -175,8 +177,8 @@ class MyAnimeList:
     def _save(self, authorization):
         if authorization is not None and "access_token" in authorization and authorization["access_token"] and self._check(authorization):
-            if self.authorization != authorization and not self.config.read_only:
-                yaml = YAML(self.config_path)
+            if self.authorization != authorization and not self.read_only:
+                yaml = self.requests.file_yaml(self.config_path)
                 yaml.data["mal"]["authorization"] = {
                     "access_token": authorization["access_token"],
                     "token_type": authorization["token_type"],
@@ -191,13 +193,13 @@ class MyAnimeList:
         return False

     def _oauth(self, data):
-        return self.config.post_json(urls["oauth_token"], data=data)
+        return self.requests.post_json(urls["oauth_token"], data=data)

     def _request(self, url, authorization=None):
         token = authorization["access_token"] if authorization else self.authorization["access_token"]
         logger.trace(f"URL: {url}")
         try:
-            response = self.config.get_json(url, headers={"Authorization": f"Bearer {token}"})
+            response = self.requests.get_json(url, headers={"Authorization": f"Bearer {token}"})
             logger.trace(f"Response: {response}")
             if "error" in response: raise Failed(f"MyAnimeList Error: {response['error']}")
             else: return response
@@ -211,7 +213,7 @@ class MyAnimeList:
         if self._delay is not None:
             while time_check - self._delay < 1:
                 time_check = time.time()
-        data = self.config.get_json(f"{jikan_base_url}{url}", params=params)
+        data = self.requests.get_json(f"{jikan_base_url}{url}", params=params)
         self._delay = time.time()
         return data
@@ -286,8 +288,8 @@ class MyAnimeList:
     def get_anime(self, mal_id):
         expired = None
-        if self.config.Cache:
-            mal_dict, expired = self.config.Cache.query_mal(mal_id, self.expiration)
+        if self.cache:
+            mal_dict, expired = self.cache.query_mal(mal_id, self.expiration)
             if mal_dict and expired is False:
                 return MyAnimeListObj(self, mal_id, mal_dict, cache=True)
         try:
@@ -297,8 +299,8 @@ class MyAnimeList:
             if "data" not in response:
                 raise Failed(f"MyAnimeList Error: No Anime found for MyAnimeList ID: {mal_id}")
             mal = MyAnimeListObj(self, mal_id, response["data"])
-            if self.config.Cache:
-                self.config.Cache.update_mal(expired, mal_id, mal, self.expiration)
+            if self.cache:
+                self.cache.update_mal(expired, mal_id, mal, self.expiration)
             return mal

     def get_mal_ids(self, method, data):

@@ -2,8 +2,8 @@ import time
 from datetime import datetime
 from json import JSONDecodeError
 from modules import util
+from modules.request import urlparse
 from modules.util import Failed, LimitReached
-from urllib.parse import urlparse

 logger = util.logger
@@ -72,8 +72,9 @@ class MDbObj:

 class MDBList:
-    def __init__(self, config):
-        self.config = config
+    def __init__(self, requests, cache):
+        self.requests = requests
+        self.cache = cache
         self.apikey = None
         self.expiration = 60
         self.limit = False
@@ -108,7 +109,7 @@ class MDBList:
                 final_params[k] = v
         try:
             time.sleep(0.2 if self.supporter else 1)
-            response = self.config.get_json(url, params=final_params)
+            response = self.requests.get_json(url, params=final_params)
         except JSONDecodeError:
             raise Failed("MDBList Error: JSON Decoding Failed")
         if "response" in response and (response["response"] is False or response["response"] == "False"):
@@ -134,14 +135,14 @@ class MDBList:
         else:
             raise Failed("MDBList Error: Either IMDb ID, TVDb ID, or TMDb ID and TMDb Type Required")
         expired = None
-        if self.config.Cache and not ignore_cache:
-            mdb_dict, expired = self.config.Cache.query_mdb(key, self.expiration)
+        if self.cache and not ignore_cache:
+            mdb_dict, expired = self.cache.query_mdb(key, self.expiration)
             if mdb_dict and expired is False:
                 return MDbObj(mdb_dict)
         logger.trace(f"ID: {key}")
         mdb = MDbObj(self._request(api_url, params=params))
-        if self.config.Cache and not ignore_cache:
-            self.config.Cache.update_mdb(expired, key, mdb, self.expiration)
+        if self.cache and not ignore_cache:
+            self.cache.update_mdb(expired, key, mdb, self.expiration)
         return mdb

     def get_imdb(self, imdb_id):
@@ -212,7 +213,7 @@ class MDBList:
             url_base = url_base if url_base.endswith("/") else f"{url_base}/"
             url_base = url_base if url_base.endswith("json/") else f"{url_base}json/"
         try:
-            response = self.config.get_json(url_base, headers=headers, params=params)
+            response = self.requests.get_json(url_base, headers=headers, params=params)
             if (isinstance(response, dict) and "error" in response) or (isinstance(response, list) and response and "error" in response[0]):
                 err = response["error"] if isinstance(response, dict) else response[0]["error"]
                 if err in ["empty", "empty or private list"]:

@@ -1,7 +1,8 @@
 import math, operator, os, re
 from datetime import datetime
 from modules import plex, ergast, util
-from modules.util import Failed, NotScheduled, YAML
+from modules.request import quote
+from modules.util import Failed, NotScheduled
 from plexapi.exceptions import NotFound, BadRequest

 logger = util.logger
@@ -128,10 +129,7 @@ class DataFile:
                 dir_path = content_path
                 if translation:
                     content_path = f"{content_path}/default.yml"
-                response = self.config.get(content_path)
-                if response.status_code >= 400:
-                    raise Failed(f"URL Error: No file found at {content_path}")
-                yaml = YAML(input_data=response.content, check_empty=True)
+                yaml = self.config.Requests.get_yaml(content_path, check_empty=True)
             else:
                 if file_type == "Default":
                     if not overlay and file_path.startswith(("movie/", "chart/", "award/")):
@@ -157,7 +155,7 @@ class DataFile:
                         raise Failed(f"File Error: Default does not exist {file_path}")
                 else:
                     raise Failed(f"File Error: File does not exist {content_path}")
-                yaml = YAML(path=content_path, check_empty=True)
+                yaml = self.config.Requests.file_yaml(content_path, check_empty=True)
             if not translation:
                 logger.debug(f"File Loaded From: {content_path}")
             return yaml.data
@@ -169,8 +167,11 @@ class DataFile:
             key_names = {}
             variables = {k: {"default": v[lib_type]} for k, v in yaml.data["variables"].items()}

-        def add_translation(yaml_path, yaml_key, data=None):
-            yaml_content = YAML(input_data=data, path=yaml_path if data is None else None, check_empty=True)
+        def add_translation(yaml_path, yaml_key, url=False):
+            if url:
+                yaml_content = self.config.Requests.get_yaml(yaml_path, check_empty=True)
+            else:
+                yaml_content = self.config.Requests.file_yaml(yaml_path, check_empty=True)
             if "variables" in yaml_content.data and yaml_content.data["variables"]:
                 for var_key, var_value in yaml_content.data["variables"].items():
                     if lib_type in var_value:
@@ -196,10 +197,9 @@ class DataFile:
             if file_type in ["URL", "Git", "Repo"]:
                 if "languages" in yaml.data and isinstance(yaml.data["language"], list):
                     for language in yaml.data["language"]:
-                        response = self.config.get(f"{dir_path}/{language}.yml")
-                        if response.status_code < 400:
-                            add_translation(f"{dir_path}/{language}.yml", language, data=response.content)
-                        else:
+                        try:
+                            add_translation(f"{dir_path}/{language}.yml", language, url=True)
+                        except Failed:
                             logger.error(f"URL Error: Language file not found at {dir_path}/{language}.yml")
                 else:
                     for file in os.listdir(dir_path):
@@ -343,7 +343,7 @@ class DataFile:
                     if "<<" in str(d_value):
                         default[f"{final_key}_encoded"] = re.sub(r'<<(.+)>>', r'<<\1_encoded>>', d_value)
                     else:
-                        default[f"{final_key}_encoded"] = util.quote(d_value)
+                        default[f"{final_key}_encoded"] = quote(d_value)

             if "optional" in template:
                 if template["optional"]:
@@ -434,7 +434,7 @@ class DataFile:
                                 condition_found = True
                                 if condition["value"] is not None:
                                     variables[final_key] = condition["value"]
-                                    variables[f"{final_key}_encoded"] = util.quote(condition["value"])
+                                    variables[f"{final_key}_encoded"] = quote(condition["value"])
                                 else:
                                     optional.append(final_key)
                                 break
@@ -442,7 +442,7 @@ class DataFile:
                         if "default" in con_value:
                             logger.trace(f'Conditional Variable: {final_key} defaults to "{con_value["default"]}"')
                             variables[final_key] = con_value["default"]
-                            variables[f"{final_key}_encoded"] = util.quote(con_value["default"])
+                            variables[f"{final_key}_encoded"] = quote(con_value["default"])
                         else:
                             logger.trace(f"Conditional Variable: {final_key} added as optional variable")
                             optional.append(str(final_key))
@@ -465,7 +465,7 @@ class DataFile:
                     if not sort_mapping and variables["mapping_name"].startswith(f"{op} "):
                         sort_mapping = f"{variables['mapping_name'][len(op):].strip()}, {op}"
                     if sort_name and sort_mapping:
                         break
             else:
                 raise Failed(f"{self.data_type} Error: template sub-attribute move_prefix is blank")
             variables[f"{self.data_type.lower()}_sort"] = sort_name if sort_name else variables[name_var]
@@ -482,7 +482,7 @@ class DataFile:
                 if key not in variables:
                     variables[key] = value
             for key, value in variables.copy().items():
-                variables[f"{key}_encoded"] = util.quote(value)
+                variables[f"{key}_encoded"] = quote(value)

             default = {k: v for k, v in default.items() if k not in variables}
             og_optional = optional
@@ -1374,7 +1374,7 @@ class MetadataFile(DataFile):
             if sub:
                 sub_str = ""
                 for folder in sub.split("/"):
-                    folder_encode = util.quote(folder)
+                    folder_encode = quote(folder)
                     sub_str += f"{folder_encode}/"
                     if folder not in top_tree:
                         raise Failed(f"Image Set Error: Subfolder {folder} Not Found at https://github.com{repo}tree/master/{sub_str}")
@@ -1385,21 +1385,21 @@ class MetadataFile(DataFile):
                 return f"https://raw.githubusercontent.com{repo}master/{sub}{u}"

             def from_repo(u):
-                return self.config.get(repo_url(u)).content.decode().strip()
+                return self.config.Requests.get(repo_url(u)).content.decode().strip()

             def check_for_definition(check_key, check_tree, is_poster=True, git_name=None):
                 attr_name = "poster" if is_poster and (git_name is None or "background" not in git_name) else "background"
                 if (git_name and git_name.lower().endswith(".tpdb")) or (not git_name and f"{attr_name}.tpdb" in check_tree):
-                    return f"tpdb_{attr_name}", from_repo(f"{check_key}/{util.quote(git_name) if git_name else f'{attr_name}.tpdb'}")
+                    return f"tpdb_{attr_name}", from_repo(f"{check_key}/{quote(git_name) if git_name else f'{attr_name}.tpdb'}")
                 elif (git_name and git_name.lower().endswith(".url")) or (not git_name and f"{attr_name}.url" in check_tree):
-                    return f"url_{attr_name}", from_repo(f"{check_key}/{util.quote(git_name) if git_name else f'{attr_name}.url'}")
+                    return f"url_{attr_name}", from_repo(f"{check_key}/{quote(git_name) if git_name else f'{attr_name}.url'}")
                 elif git_name:
                     if git_name in check_tree:
-                        return f"url_{attr_name}", repo_url(f"{check_key}/{util.quote(git_name)}")
+                        return f"url_{attr_name}", repo_url(f"{check_key}/{quote(git_name)}")
                     else:
                         for ct in check_tree:
                             if ct.lower().startswith(attr_name):
-                                return f"url_{attr_name}", repo_url(f"{check_key}/{util.quote(ct)}")
+                                return f"url_{attr_name}", repo_url(f"{check_key}/{quote(ct)}")
                 return None, None

             def init_set(check_key, check_tree):
@@ -1417,14 +1417,14 @@ class MetadataFile(DataFile):
                 if k not in top_tree:
                     logger.info(f"Image Set Warning: {k} not found at https://github.com{repo}tree/master/{sub}")
                     continue
-                k_encoded = util.quote(k)
+                k_encoded = quote(k)
                 item_folder = self.config.GitHub.get_tree(top_tree[k]["url"])
                 item_data = init_set(k_encoded, item_folder)
                 seasons = {}
                 for ik in item_folder:
                     match = re.search(r"(\d+)", ik)
                     if match:
-                        season_path = f"{k_encoded}/{util.quote(ik)}"
+                        season_path = f"{k_encoded}/{quote(ik)}"
                         season_num = int(match.group(1))
                         season_folder = self.config.GitHub.get_tree(item_folder[ik]["url"])
                         season_data = init_set(season_path, season_folder)
@@ -1770,7 +1770,6 @@ class MetadataFile(DataFile):
             nonlocal updated
             if updated:
                 try:
-                    #current_item.saveEdits()
                     logger.info(f"{description} Metadata Update Successful")
                 except BadRequest:
                     logger.error(f"{description} Metadata Update Failed")
@@ -1816,7 +1815,6 @@ class MetadataFile(DataFile):
                 summary = tmdb_item.overview
                 genres = tmdb_item.genres

-            #item.batchEdits()
             add_edit("title", item, meta, methods)
             add_edit("sort_title", item, meta, methods, key="titleSort")
             if self.library.is_movie:
@@ -1926,7 +1924,6 @@ class MetadataFile(DataFile):
                 season_methods = {sm.lower(): sm for sm in season_dict}
                 season_style_data = None
                 if update_seasons:
-                    #season.batchEdits()
                     add_edit("title", season, season_dict, season_methods)
                     add_edit("summary", season, season_dict, season_methods)
                     add_edit("user_rating", season, season_dict, season_methods, key="userRating", var_type="float")
@@ -1993,7 +1990,6 @@ class MetadataFile(DataFile):
                         logger.error(f"{self.type_str} Error: Episode {episode_id} in Season {season_id} not found")
                         continue
                     episode_methods = {em.lower(): em for em in episode_dict}
-                    #episode.batchEdits()
                     add_edit("title", episode, episode_dict, episode_methods)
                     add_edit("sort_title", episode, episode_dict, episode_methods, key="titleSort")
                     add_edit("content_rating", episode, episode_dict, episode_methods, key="contentRating")
@@ -2040,7 +2036,6 @@ class MetadataFile(DataFile):
                         logger.error(f"{self.type_str} Error: episode {episode_id} of season {season_id} not found")
                         continue
                     episode_methods = {em.lower(): em for em in episode_dict}
-                    #episode.batchEdits()
                     add_edit("title", episode, episode_dict, episode_methods)
                     add_edit("sort_title", episode, episode_dict, episode_methods, key="titleSort")
                     add_edit("content_rating", episode, episode_dict, episode_methods, key="contentRating")
@@ -2081,7 +2076,6 @@ class MetadataFile(DataFile):
             else:
                 logger.error(f"{self.type_str} Error: Album: {album_name} not found")
                 continue
-            #album.batchEdits()
             add_edit("title", album, album_dict, album_methods, value=title)
             add_edit("sort_title", album, album_dict, album_methods, key="titleSort")
             add_edit("critic_rating", album, album_dict, album_methods, key="rating", var_type="float")
@@ -2126,7 +2120,6 @@ class MetadataFile(DataFile):
                 logger.error(f"{self.type_str} Error: Track: {track_num} not found")
                 continue

-            #track.batchEdits()
             add_edit("title", track, track_dict, track_methods, value=title)
             add_edit("user_rating", track, track_dict, track_methods, key="userRating", var_type="float")
             add_edit("track", track, track_dict, track_methods, key="index", var_type="int")
@@ -2187,7 +2180,6 @@ class MetadataFile(DataFile):
             race = race_lookup[season.seasonNumber]
             title = race.format_name(round_prefix, shorten_gp)
             updated = False
-            #season.batchEdits()
             add_edit("title", season, value=title)
             finish_edit(season, f"Season: {title}")
             _, _, ups = self.library.item_images(season, {}, {}, asset_location=asset_location, title=title,
@@ -2198,7 +2190,6 @@ class MetadataFile(DataFile):
             for episode in season.episodes():
                 if len(episode.locations) > 0:
                     ep_title, session_date = race.session_info(episode.locations[0], sprint_weekend)
-                    #episode.batchEdits()
                    add_edit("title", episode, value=ep_title)
                    add_edit("originally_available", episode, key="originallyAvailableAt", var_type="date", value=session_date)
                    finish_edit(episode, f"Season: {season.seasonNumber} Episode: {episode.episodeNumber}")

View file

@@ -1,8 +1,8 @@
from datetime import datetime
from modules import util
+from modules.request import parse_qs, urlparse
from modules.util import Failed
from num2words import num2words
-from urllib.parse import urlparse, parse_qs
logger = util.logger
@@ -125,8 +125,9 @@ base_url = "https://www.boxofficemojo.com"
class BoxOfficeMojo:
-def __init__(self, config):
-self.config = config
+def __init__(self, requests, cache):
+self.requests = requests
+self.cache = cache
self._never_options = None
self._intl_options = None
self._year_options = None
@@ -161,7 +162,7 @@ class BoxOfficeMojo:
logger.trace(f"URL: {base_url}{url}")
if params:
logger.trace(f"Params: {params}")
-response = self.config.get_html(f"{base_url}{url}", headers=util.header(), params=params)
+response = self.requests.get_html(f"{base_url}{url}", header=True, params=params)
return response.xpath(xpath) if xpath else response
def _parse_list(self, url, params, limit):
@@ -258,16 +259,16 @@ class BoxOfficeMojo:
else:
imdb_id = None
expired = None
-if self.config.Cache:
-imdb_id, expired = self.config.Cache.query_letterboxd_map(item)
+if self.cache:
+imdb_id, expired = self.cache.query_letterboxd_map(item)
if not imdb_id or expired is not False:
try:
imdb_id = self._imdb(item)
except Failed as e:
logger.error(e)
continue
-if self.config.Cache:
-self.config.Cache.update_letterboxd_map(expired, item, imdb_id)
+if self.cache:
+self.cache.update_letterboxd_map(expired, item, imdb_id)
ids.append((imdb_id, "imdb"))
logger.info(f"Processed {total_items} IMDb IDs")
return ids
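
Note on the header=True flag above: it replaces the explicit headers=util.header() call by delegating to get_header in the new modules/request.py (shown later in this diff). A minimal sketch of what the flag resolves to when no explicit language is passed:

    from modules.request import get_header

    # header=True with language=None falls back to the default browser profile
    headers = get_header(None, True, None)
    assert headers["Accept-Language"] == "en-US,en;q=0.5"
    assert headers["User-Agent"].startswith("Mozilla/5.0")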

View file

@@ -9,8 +9,8 @@ base_url = "https://notifiarr.com/api/v1/"
class Notifiarr:
-def __init__(self, config, params):
-self.config = config
+def __init__(self, requests, params):
+self.requests = requests
self.apikey = params["apikey"]
self.header = {"X-API-Key": self.apikey}
logger.secret(self.apikey)
@@ -24,7 +24,7 @@ class Notifiarr:
@retry(stop_max_attempt_number=6, wait_fixed=10000, retry_on_exception=util.retry_if_not_failed)
def request(self, json=None, path="notification", params=None):
-response = self.config.get(f"{base_url}{path}/pmm/", json=json, headers=self.header, params=params)
+response = self.requests.get(f"{base_url}{path}/pmm/", json=json, headers=self.header, params=params)
try:
response_json = response.json()
except JSONDecodeError as e:

View file

@@ -43,8 +43,9 @@ class OMDbObj:
class OMDb:
-def __init__(self, config, params):
-self.config = config
+def __init__(self, requests, cache, params):
+self.requests = requests
+self.cache = cache
self.apikey = params["apikey"]
self.expiration = params["expiration"]
self.limit = False
@@ -53,16 +54,16 @@ class OMDb:
def get_omdb(self, imdb_id, ignore_cache=False):
expired = None
-if self.config.Cache and not ignore_cache:
-omdb_dict, expired = self.config.Cache.query_omdb(imdb_id, self.expiration)
+if self.cache and not ignore_cache:
+omdb_dict, expired = self.cache.query_omdb(imdb_id, self.expiration)
if omdb_dict and expired is False:
return OMDbObj(imdb_id, omdb_dict)
logger.trace(f"IMDb ID: {imdb_id}")
-response = self.config.get(base_url, params={"i": imdb_id, "apikey": self.apikey})
+response = self.requests.get(base_url, params={"i": imdb_id, "apikey": self.apikey})
if response.status_code < 400:
omdb = OMDbObj(imdb_id, response.json())
-if self.config.Cache and not ignore_cache:
-self.config.Cache.update_omdb(expired, omdb, self.expiration)
+if self.cache and not ignore_cache:
+self.cache.update_omdb(expired, omdb, self.expiration)
return omdb
else:
try:

View file

@@ -1,7 +1,7 @@
import os, re
from datetime import datetime, timedelta, timezone
from modules import plex, util, anidb
-from modules.util import Failed, LimitReached, YAML
+from modules.util import Failed, LimitReached
from plexapi.exceptions import NotFound
from plexapi.video import Movie, Show
@@ -296,10 +296,11 @@ class Operations:
mal_id = self.library.reverse_mal[item.ratingKey]
elif not anidb_id:
logger.warning(f"Convert Warning: No AniDB ID to Convert to MyAnimeList ID for Guid: {item.guid}")
-elif anidb_id not in self.config.Convert._anidb_to_mal:
-logger.warning(f"Convert Warning: No MyAnimeList Found for AniDB ID: {anidb_id} of Guid: {item.guid}")
else:
-mal_id = self.config.Convert._anidb_to_mal[anidb_id]
+try:
+mal_id = self.config.Convert.anidb_to_mal(anidb_id)
+except Failed as err:
+logger.warning(f"{err} of Guid: {item.guid}")
if mal_id:
try:
_mal_obj = self.config.MyAnimeList.get_anime(mal_id)
@@ -1134,7 +1135,7 @@ class Operations:
yaml = None
if os.path.exists(self.library.metadata_backup["path"]):
try:
-yaml = YAML(path=self.library.metadata_backup["path"])
+yaml = self.config.Requests.file_yaml(self.library.metadata_backup["path"])
except Failed as e:
logger.error(e)
filename, file_extension = os.path.splitext(self.library.metadata_backup["path"])
@@ -1144,7 +1145,7 @@ class Operations:
os.rename(self.library.metadata_backup["path"], f"{filename}{i}{file_extension}")
logger.error(f"Backup failed to load saving copy to {filename}{i}{file_extension}")
if not yaml:
-yaml = YAML(path=self.library.metadata_backup["path"], create=True)
+yaml = self.config.Requests.file_yaml(self.library.metadata_backup["path"], create=True)
if "metadata" not in yaml.data or not isinstance(yaml.data["metadata"], dict):
yaml.data["metadata"] = {}
special_names = {}
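
The warning path above now funnels through Convert.anidb_to_mal, which is expected to raise Failed when no mapping exists; modules/convert.py is not part of this excerpt, so the following is only the contract implied by the call sites, not the actual implementation:

    # Hypothetical sketch of the contract implied by the callers above; the
    # real method lives in modules/convert.py, outside this diff excerpt.
    def anidb_to_mal(self, anidb_id):
        if anidb_id not in self._anidb_to_mal:
            raise Failed(f"Convert Warning: No MyAnimeList Found for AniDB ID: {anidb_id}")
        return self._anidb_to_mal[anidb_id]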

View file

@@ -71,6 +71,8 @@ def get_canvas_size(item):
class Overlay:
def __init__(self, config, library, overlay_file, original_mapping_name, overlay_data, suppress, level):
self.config = config
+self.requests = self.config.Requests
+self.cache = self.config.Cache
self.library = library
self.overlay_file = overlay_file
self.original_mapping_name = original_mapping_name
@@ -159,7 +161,7 @@ class Overlay:
raise Failed(f"Overlay Error: horizontal_offset and vertical_offset are required when using a backdrop")
def get_and_save_image(image_url):
-response = self.config.get(image_url)
+response = self.requests.get(image_url)
if response.status_code == 404:
raise Failed(f"Overlay Error: Overlay Image not found at: {image_url}")
if response.status_code >= 400:
@@ -224,14 +226,14 @@ class Overlay:
self.addon_offset = util.parse("Overlay", "addon_offset", self.data["addon_offset"], datatype="int", parent="overlay") if "addon_offset" in self.data else 0
self.addon_position = util.parse("Overlay", "addon_position", self.data["addon_position"], parent="overlay", options=["left", "right", "top", "bottom"]) if "addon_position" in self.data else "left"
image_compare = None
-if self.config.Cache:
-_, image_compare, _ = self.config.Cache.query_image_map(self.mapping_name, f"{self.library.image_table_name}_overlays")
+if self.cache:
+_, image_compare, _ = self.cache.query_image_map(self.mapping_name, f"{self.library.image_table_name}_overlays")
overlay_size = os.stat(self.path).st_size
self.updated = not image_compare or str(overlay_size) != str(image_compare)
try:
self.image = Image.open(self.path).convert("RGBA")
-if self.config.Cache:
-self.config.Cache.update_image_map(self.mapping_name, f"{self.library.image_table_name}_overlays", self.name, overlay_size)
+if self.cache:
+self.cache.update_image_map(self.mapping_name, f"{self.library.image_table_name}_overlays", self.name, overlay_size)
except OSError:
raise Failed(f"Overlay Error: overlay image {self.path} failed to load")
match = re.search("\\(([^)]+)\\)", self.name)
@@ -308,16 +310,16 @@ class Overlay:
if not os.path.exists(self.path):
raise Failed(f"Overlay Error: Overlay Image not found at: {self.path}")
image_compare = None
-if self.config.Cache:
-_, image_compare, _ = self.config.Cache.query_image_map(self.mapping_name, f"{self.library.image_table_name}_overlays")
+if self.cache:
+_, image_compare, _ = self.cache.query_image_map(self.mapping_name, f"{self.library.image_table_name}_overlays")
overlay_size = os.stat(self.path).st_size
self.updated = not image_compare or str(overlay_size) != str(image_compare)
try:
self.image = Image.open(self.path).convert("RGBA")
if self.has_coordinates():
self.backdrop_box = self.image.size
-if self.config.Cache:
-self.config.Cache.update_image_map(self.mapping_name, f"{self.library.image_table_name}_overlays", self.mapping_name, overlay_size)
+if self.cache:
+self.cache.update_image_map(self.mapping_name, f"{self.library.image_table_name}_overlays", self.mapping_name, overlay_size)
except OSError:
raise Failed(f"Overlay Error: overlay image {self.path} failed to load")

View file

@@ -13,6 +13,7 @@ logger = util.logger
class Overlays:
def __init__(self, config, library):
self.config = config
+self.cache = self.config.Cache
self.library = library
self.overlays = []
@@ -88,8 +89,8 @@ class Overlays:
image_compare = None
overlay_compare = None
poster = None
-if self.config.Cache:
-image, image_compare, overlay_compare = self.config.Cache.query_image_map(item.ratingKey, f"{self.library.image_table_name}_overlays")
+if self.cache:
+image, image_compare, overlay_compare = self.cache.query_image_map(item.ratingKey, f"{self.library.image_table_name}_overlays")
self.library.reload(item, force=True)
overlay_compare = [] if overlay_compare is None else util.get_list(overlay_compare, split="|")
@@ -126,10 +127,10 @@ class Overlays:
if compare_name not in overlay_compare or properties[original_name].updated:
overlay_change = f"{compare_name} not in {overlay_compare} or {properties[original_name].updated}"
-if self.config.Cache:
+if self.cache:
for over_name in over_names:
if properties[over_name].name.startswith("text"):
-for cache_key, cache_value in self.config.Cache.query_overlay_special_text(item.ratingKey).items():
+for cache_key, cache_value in self.cache.query_overlay_special_text(item.ratingKey).items():
actual = plex.attribute_translation[cache_key] if cache_key in plex.attribute_translation else cache_key
if not hasattr(item, actual):
continue
@@ -369,10 +370,11 @@ class Overlays:
mal_id = self.library.reverse_mal[item.ratingKey]
elif not anidb_id:
raise Failed(f"Convert Warning: No AniDB ID to Convert to MyAnimeList ID for Guid: {item.guid}")
-elif anidb_id not in self.config.Convert._anidb_to_mal:
-raise Failed(f"Convert Warning: No MyAnimeList Found for AniDB ID: {anidb_id} of Guid: {item.guid}")
else:
-mal_id = self.config.Convert._anidb_to_mal[anidb_id]
+try:
+mal_id = self.config.Convert.anidb_to_mal(anidb_id)
+except Failed as errr:
+raise Failed(f"{errr} of Guid: {item.guid}")
if mal_id:
found_rating = self.config.MyAnimeList.get_anime(mal_id).score
except Failed as err:
@@ -394,9 +396,9 @@ class Overlays:
actual_value = getattr(item, actual_attr)
if format_var == "versions":
actual_value = len(actual_value)
-if self.config.Cache:
+if self.cache:
cache_store = actual_value.strftime("%Y-%m-%d") if format_var in overlay.date_vars else actual_value
-self.config.Cache.update_overlay_special_text(item.ratingKey, format_var, cache_store)
+self.cache.update_overlay_special_text(item.ratingKey, format_var, cache_store)
sub_value = None
if format_var == "originally_available":
if mod:
@@ -517,8 +519,8 @@ class Overlays:
else:
logger.info(" Overlay Update Not Needed")
-if self.config.Cache and poster_compare:
-self.config.Cache.update_image_map(item.ratingKey, f"{self.library.image_table_name}_overlays", item.thumb, poster_compare, overlay='|'.join(compare_names))
+if self.cache and poster_compare:
+self.cache.update_image_map(item.ratingKey, f"{self.library.image_table_name}_overlays", item.thumb, poster_compare, overlay='|'.join(compare_names))
except Failed as e:
logger.error(f" {e}\n Overlays Attempted on {item_title}: {', '.join(over_names)}")
except Exception as e:

View file

@@ -1,8 +1,10 @@
-import os, plexapi, re, requests, time
+import os, plexapi, re, time
from datetime import datetime, timedelta
from modules import builder, util
from modules.library import Library
-from modules.util import Failed, ImageData
+from modules.poster import ImageData
+from modules.request import parse_qs, quote_plus, urlparse
+from modules.util import Failed
from PIL import Image
from plexapi import utils
from plexapi.audio import Artist, Track, Album
@@ -12,8 +14,8 @@ from plexapi.library import Role, FilterChoice
from plexapi.playlist import Playlist
from plexapi.server import PlexServer
from plexapi.video import Movie, Show, Season, Episode
+from requests.exceptions import ConnectionError, ConnectTimeout
from retrying import retry
-from urllib import parse
from xml.etree.ElementTree import ParseError
logger = util.logger
@@ -445,16 +447,13 @@ class Plex(Library):
super().__init__(config, params)
self.plex = params["plex"]
self.url = self.plex["url"]
-plex_session = self.config.session
-if self.plex["verify_ssl"] is False and self.config.general["verify_ssl"] is True:
+plex_session = self.config.Requests.session
+if self.plex["verify_ssl"] is False and self.config.Requests.global_ssl is True:
logger.debug("Overriding verify_ssl to False for Plex connection")
-plex_session = requests.Session()
-plex_session.verify = False
-import urllib3
-urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)
-if self.plex["verify_ssl"] is True and self.config.general["verify_ssl"] is False:
+plex_session = self.config.Requests.create_session(verify_ssl=False)
+if self.plex["verify_ssl"] is True and self.config.Requests.global_ssl is False:
logger.debug("Overriding verify_ssl to True for Plex connection")
-plex_session = requests.Session()
+plex_session = self.config.Requests.create_session()
self.token = self.plex["token"]
self.timeout = self.plex["timeout"]
logger.secret(self.url)
@@ -493,13 +492,13 @@ class Plex(Library):
except Unauthorized:
logger.info(f"Plex Error: Plex connection attempt returned 'Unauthorized'")
raise Failed("Plex Error: Plex token is invalid")
-except requests.exceptions.ConnectTimeout:
+except ConnectTimeout:
raise Failed(f"Plex Error: Plex did not respond within the {self.timeout}-second timeout.")
except ValueError as e:
logger.info(f"Plex Error: Plex connection attempt returned 'ValueError'")
logger.stacktrace()
raise Failed(f"Plex Error: {e}")
-except (requests.exceptions.ConnectionError, ParseError):
+except (ConnectionError, ParseError):
logger.info(f"Plex Error: Plex connection attempt returned 'ConnectionError' or 'ParseError'")
logger.stacktrace()
raise Failed("Plex Error: Plex URL is probably invalid")
@@ -630,7 +629,7 @@ class Plex(Library):
def upload_theme(self, collection, url=None, filepath=None):
key = f"/library/metadata/{collection.ratingKey}/themes"
if url:
-self.PlexServer.query(f"{key}?url={parse.quote_plus(url)}", method=self.PlexServer._session.post)
+self.PlexServer.query(f"{key}?url={quote_plus(url)}", method=self.PlexServer._session.post)
elif filepath:
self.PlexServer.query(key, method=self.PlexServer._session.post, data=open(filepath, 'rb').read())
@@ -745,7 +744,7 @@ class Plex(Library):
raise Failed("Overlay Error: No Poster found to reset")
return image_url
-def _reload(self, item):
+def item_reload(self, item):
item.reload(checkFiles=False, includeAllConcerts=False, includeBandwidths=False, includeChapters=False,
includeChildren=False, includeConcerts=False, includeExternalMedia=False, includeExtras=False,
includeFields=False, includeGeolocation=False, includeLoudnessRamps=False, includeMarkers=False,
@@ -774,7 +773,7 @@ class Plex(Library):
item, is_full = self.cached_items[item.ratingKey]
try:
if not is_full or force:
-self._reload(item)
+self.item_reload(item)
self.cached_items[item.ratingKey] = (item, True)
except (BadRequest, NotFound) as e:
logger.stacktrace()
@@ -911,7 +910,7 @@ class Plex(Library):
if playlist.title not in playlists:
playlists[playlist.title] = []
playlists[playlist.title].append(username)
-except requests.exceptions.ConnectionError:
+except ConnectionError:
pass
scan_user(self.PlexServer, self.account.title)
for user in self.users:
@@ -990,7 +989,7 @@ class Plex(Library):
self._query(f"/library/collections{utils.joinArgs(args)}", post=True)
def get_smart_filter_from_uri(self, uri):
-smart_filter = parse.parse_qs(parse.urlparse(uri.replace("/#!/", "/")).query)["key"][0] # noqa
+smart_filter = parse_qs(urlparse(uri.replace("/#!/", "/")).query)["key"][0] # noqa
args = smart_filter[smart_filter.index("?"):]
return self.build_smart_filter(args), int(args[args.index("type=") + 5:args.index("type=") + 6])
@@ -1037,7 +1036,7 @@ class Plex(Library):
for playlist in self.PlexServer.switchUser(user).playlists():
if isinstance(playlist, Playlist) and playlist.title == playlist_title:
return playlist
-except requests.exceptions.ConnectionError:
+except ConnectionError:
pass
raise Failed(f"Plex Error: Playlist {playlist_title} not found")
@@ -1090,7 +1089,7 @@ class Plex(Library):
try:
fin = False
for guid_tag in item.guids:
-url_parsed = requests.utils.urlparse(guid_tag.id)
+url_parsed = urlparse(guid_tag.id)
if url_parsed.scheme == "tvdb":
if isinstance(item, Show):
ids.append((int(url_parsed.netloc), "tvdb"))
@@ -1106,7 +1105,7 @@ class Plex(Library):
break
if fin:
continue
-except requests.exceptions.ConnectionError:
+except ConnectionError:
continue
if imdb_id and not tmdb_id:
for imdb in imdb_id:
@@ -1329,8 +1328,8 @@ class Plex(Library):
asset_location = item_dir
except Failed as e:
logger.warning(e)
-poster = util.pick_image(title, posters, self.prioritize_assets, self.download_url_assets, asset_location, image_name=image_name)
-background = util.pick_image(title, backgrounds, self.prioritize_assets, self.download_url_assets, asset_location,
+poster = self.pick_image(title, posters, self.prioritize_assets, self.download_url_assets, asset_location, image_name=image_name)
+background = self.pick_image(title, backgrounds, self.prioritize_assets, self.download_url_assets, asset_location,
is_poster=False, image_name=f"{image_name}_background" if image_name else image_name)
updated = False
if poster or background:
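
The session handling at the top of this file now reduces to comparing the per-library verify_ssl flag against the new global Requests.global_ssl; a condensed sketch of that behavior, assuming the Requests wiring shown in modules/request.py below (the helper name here is illustrative, not from the code):

    def pick_plex_session(requests_obj, plex_verify_ssl):
        # Only build a dedicated session when the library disagrees with the global setting.
        if plex_verify_ssl is False and requests_obj.global_ssl is True:
            return requests_obj.create_session(verify_ssl=False)  # also silences urllib3 warnings
        if plex_verify_ssl is True and requests_obj.global_ssl is False:
            return requests_obj.create_session()  # verify defaults back to True
        return requests_obj.session  # otherwise share the global session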

View file

@@ -1,10 +1,24 @@
import os, time
from modules import util
-from modules.util import Failed, ImageData
+from modules.util import Failed
from PIL import Image, ImageFont, ImageDraw, ImageColor
logger = util.logger
+class ImageData:
+    def __init__(self, attribute, location, prefix="", is_poster=True, is_url=True, compare=None):
+        self.attribute = attribute
+        self.location = location
+        self.prefix = prefix
+        self.is_poster = is_poster
+        self.is_url = is_url
+        self.compare = compare if compare else location if is_url else os.stat(location).st_size
+        self.message = f"{prefix}{'poster' if is_poster else 'background'} to [{'URL' if is_url else 'File'}] {location}"
+
+    def __str__(self):
+        return str(self.__dict__)
+
class ImageBase:
def __init__(self, config, data):
self.config = config
@@ -48,10 +62,10 @@ class ImageBase:
else:
return None, None
-response = self.config.get(url)
+response = self.config.Requests.get(url)
if response.status_code >= 400:
raise Failed(f"Poster Error: {attr} not found at: {url}")
-if "Content-Type" not in response.headers or response.headers["Content-Type"] not in util.image_content_types:
+if "Content-Type" not in response.headers or response.headers["Content-Type"] not in self.config.Requests.image_content_types:
raise Failed(f"Poster Error: {attr} not a png, jpg, or webp: {url}")
if response.headers["Content-Type"] == "image/jpeg":
ext = "jpg"

View file

@@ -13,15 +13,16 @@ availability_descriptions = {"announced": "For Announced", "cinemas": "For In Ci
monitor_descriptions = {"movie": "Monitor Only the Movie", "collection": "Monitor the Movie and Collection", "none": "Do not Monitor"}
class Radarr:
-def __init__(self, config, library, params):
-self.config = config
+def __init__(self, requests, cache, library, params):
+self.requests = requests
+self.cache = cache
self.library = library
self.url = params["url"]
self.token = params["token"]
logger.secret(self.url)
logger.secret(self.token)
try:
-self.api = RadarrAPI(self.url, self.token, session=self.config.session)
+self.api = RadarrAPI(self.url, self.token, session=self.requests.session)
self.api.respect_list_exclusions_when_adding()
self.api._validate_add_options(params["root_folder_path"], params["quality_profile"]) # noqa
self.profiles = self.api.quality_profile()
@@ -102,8 +103,8 @@ class Radarr:
tmdb_id = item[0] if isinstance(item, tuple) else item
logger.ghost(f"Loading TMDb ID {i}/{len(tmdb_ids)} ({tmdb_id})")
try:
-if self.config.Cache and not ignore_cache:
-_id = self.config.Cache.query_radarr_adds(tmdb_id, self.library.original_mapping_name)
+if self.cache and not ignore_cache:
+_id = self.cache.query_radarr_adds(tmdb_id, self.library.original_mapping_name)
if _id:
skipped.append(item)
raise Continue
@@ -152,8 +153,8 @@ class Radarr:
logger.info("")
for movie in added:
logger.info(f"Added to Radarr | {movie.tmdbId:<7} | {movie.title}")
-if self.config.Cache:
-self.config.Cache.update_radarr_adds(movie.tmdbId, self.library.original_mapping_name)
+if self.cache:
+self.cache.update_radarr_adds(movie.tmdbId, self.library.original_mapping_name)
logger.info(f"{len(added)} Movie{'s' if len(added) > 1 else ''} added to Radarr")
if len(exists) > 0 or len(skipped) > 0:
@@ -169,8 +170,8 @@ class Radarr:
upgrade_qp.append(movie)
else:
logger.info(f"Already in Radarr | {movie.tmdbId:<7} | {movie.title}")
-if self.config.Cache:
-self.config.Cache.update_radarr_adds(movie.tmdbId, self.library.original_mapping_name)
+if self.cache:
+self.cache.update_radarr_adds(movie.tmdbId, self.library.original_mapping_name)
if upgrade_qp:
self.api.edit_multiple_movies(upgrade_qp, quality_profile=qp)
for movie in upgrade_qp:

View file

@@ -8,11 +8,11 @@ builders = ["reciperr_list", "stevenlu_popular"]
stevenlu_url = "https://s3.amazonaws.com/popular-movies/movies.json"
class Reciperr:
-def __init__(self, config):
-self.config = config
+def __init__(self, requests):
+self.requests = requests
def _request(self, url, name="Reciperr"):
-response = self.config.get(url)
+response = self.requests.get(url)
if response.status_code >= 400:
raise Failed(f"{name} Error: JSON not found at {url}")
return response.json()

242 modules/request.py Normal file
View file

@@ -0,0 +1,242 @@
import base64, os, ruamel.yaml, requests
from lxml import html
from modules import util
from modules.poster import ImageData
from modules.util import Failed
from requests.exceptions import ConnectionError
from retrying import retry
from urllib import parse

logger = util.logger

image_content_types = ["image/png", "image/jpeg", "image/webp"]

def get_header(headers, header, language):
    if headers:
        return headers
    else:
        if header and not language:
            language = "en-US,en;q=0.5"
        if language:
            return {
                "Accept-Language": "eng" if language == "default" else language,
                "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:109.0) Gecko/20100101 Firefox/113.0"
            }

def parse_version(version, text="develop"):
    version = version.replace("develop", text)
    split_version = version.split(f"-{text}")
    return version, split_version[0], int(split_version[1]) if len(split_version) > 1 else 0

def quote(data):
    return parse.quote(str(data))

def quote_plus(data):
    return parse.quote_plus(str(data))

def parse_qs(data):
    return parse.parse_qs(data)

def urlparse(data):
    return parse.urlparse(str(data))
class Requests:
    def __init__(self, file_version, env_version, git_branch, verify_ssl=True):
        self.file_version = file_version
        self.env_version = env_version
        self.git_branch = git_branch
        self.image_content_types = ["image/png", "image/jpeg", "image/webp"]
        self.nightly_version = None
        self.develop_version = None
        self.master_version = None
        self.session = self.create_session()
        self.global_ssl = verify_ssl
        if not self.global_ssl:
            self.no_verify_ssl()
        self.branch = self.guess_branch()
        self.version = (self.file_version[0].replace("develop", self.branch), self.file_version[1].replace("develop", self.branch), self.file_version[2])
        self.latest_version = self.current_version(self.version, branch=self.branch)
        self.new_version = self.latest_version[0] if self.latest_version and (self.version[1] != self.latest_version[1] or (self.version[2] and self.version[2] < self.latest_version[2])) else None

    def create_session(self, verify_ssl=True):
        session = requests.Session()
        if not verify_ssl:
            self.no_verify_ssl(session)
        return session

    def no_verify_ssl(self, session=None):
        if session is None:
            session = self.session
        session.verify = False
        if session.verify is False:
            import urllib3
            urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)

    def has_new_version(self):
        return self.version[0] != "Unknown" and self.latest_version[0] != "Unknown" and self.version[1] != self.latest_version[1] or (self.version[2] and self.version[2] < self.latest_version[2])
    def download_image(self, title, image_url, download_directory, is_poster=True, filename=None):
        response = self.get_image(image_url)
        new_image = os.path.join(download_directory, f"{filename}") if filename else download_directory
        if response.headers["Content-Type"] == "image/jpeg":
            new_image += ".jpg"
        elif response.headers["Content-Type"] == "image/webp":
            new_image += ".webp"
        else:
            new_image += ".png"
        with open(new_image, "wb") as handler:
            handler.write(response.content)
        return ImageData("asset_directory", new_image, prefix=f"{title}'s ", is_poster=is_poster, is_url=False)

    def file_yaml(self, path_to_file, check_empty=False, create=False, start_empty=False):
        return YAML(path=path_to_file, check_empty=check_empty, create=create, start_empty=start_empty)

    def get_yaml(self, url, check_empty=False):
        response = self.get(url)
        if response.status_code >= 400:
            raise Failed(f"URL Error: No file found at {url}")
        return YAML(input_data=response.content, check_empty=check_empty)

    def get_image(self, url):
        response = self.get(url, header=True)
        if response.status_code == 404:
            raise Failed(f"Image Error: Not Found on Image URL: {url}")
        if response.status_code >= 400:
            raise Failed(f"Image Error: {response.status_code} on Image URL: {url}")
        if "Content-Type" not in response.headers or response.headers["Content-Type"] not in self.image_content_types:
            raise Failed("Image Not PNG, JPG, or WEBP")
        return response
    def get_stream(self, url, location, info="Item"):
        with self.session.get(url, stream=True) as r:
            r.raise_for_status()
            total_length = r.headers.get('content-length')
            if total_length is not None:
                total_length = int(total_length)
            dl = 0
            with open(location, "wb") as f:
                for chunk in r.iter_content(chunk_size=8192):
                    dl += len(chunk)
                    f.write(chunk)
                    logger.ghost(f"Downloading {info}: {dl / total_length * 100:6.2f}%")
            logger.exorcise()

    def get_html(self, url, headers=None, params=None, header=None, language=None):
        return html.fromstring(self.get(url, headers=headers, params=params, header=header, language=language).content)

    def get_json(self, url, json=None, headers=None, params=None, header=None, language=None):
        response = self.get(url, json=json, headers=headers, params=params, header=header, language=language)
        try:
            return response.json()
        except ValueError:
            logger.error(str(response.content))
            raise

    @retry(stop_max_attempt_number=6, wait_fixed=10000)
    def get(self, url, json=None, headers=None, params=None, header=None, language=None):
        return self.session.get(url, json=json, headers=get_header(headers, header, language), params=params)

    def get_image_encoded(self, url):
        return base64.b64encode(self.get(url).content).decode('utf-8')

    def post_html(self, url, data=None, json=None, headers=None, header=None, language=None):
        return html.fromstring(self.post(url, data=data, json=json, headers=headers, header=header, language=language).content)

    def post_json(self, url, data=None, json=None, headers=None, header=None, language=None):
        response = self.post(url, data=data, json=json, headers=headers, header=header, language=language)
        try:
            return response.json()
        except ValueError:
            logger.error(str(response.content))
            raise

    @retry(stop_max_attempt_number=6, wait_fixed=10000)
    def post(self, url, data=None, json=None, headers=None, header=None, language=None):
        return self.session.post(url, data=data, json=json, headers=get_header(headers, header, language))
    def guess_branch(self):
        if self.git_branch:
            return self.git_branch
        elif self.env_version in ["nightly", "develop"]:
            return self.env_version
        elif self.file_version[2] > 0:
            dev_version = self.get_develop()
            if self.file_version[1] != dev_version[1] or self.file_version[2] <= dev_version[2]:
                return "develop"
            else:
                return "nightly"
        else:
            return "master"

    def current_version(self, version, branch=None):
        if branch == "nightly":
            return self.get_nightly()
        elif branch == "develop":
            return self.get_develop()
        elif version[2] > 0:
            new_version = self.get_develop()
            if version[1] != new_version[1] or new_version[2] >= version[2]:
                return new_version
            return self.get_nightly()
        else:
            return self.get_master()

    def get_nightly(self):
        if self.nightly_version is None:
            self.nightly_version = self.get_version("nightly")
        return self.nightly_version

    def get_develop(self):
        if self.develop_version is None:
            self.develop_version = self.get_version("develop")
        return self.develop_version

    def get_master(self):
        if self.master_version is None:
            self.master_version = self.get_version("master")
        return self.master_version

    def get_version(self, level):
        try:
            url = f"https://raw.githubusercontent.com/Kometa-Team/Kometa/{level}/VERSION"
            return parse_version(self.get(url).content.decode().strip(), text=level)
        except ConnectionError:
            return "Unknown", "Unknown", 0
class YAML:
    def __init__(self, path=None, input_data=None, check_empty=False, create=False, start_empty=False):
        self.path = path
        self.input_data = input_data
        self.yaml = ruamel.yaml.YAML()
        self.yaml.width = 100000
        self.yaml.indent(mapping=2, sequence=2)
        try:
            if input_data:
                self.data = self.yaml.load(input_data)
            else:
                if start_empty or (create and not os.path.exists(self.path)):
                    with open(self.path, 'w'):
                        pass
                    self.data = {}
                else:
                    with open(self.path, encoding="utf-8") as fp:
                        self.data = self.yaml.load(fp)
        except ruamel.yaml.error.YAMLError as e:
            e = str(e).replace("\n", "\n ")
            raise Failed(f"YAML Error: {e}")
        except Exception as e:
            raise Failed(f"YAML Error: {e}")
        if not self.data or not isinstance(self.data, dict):
            if check_empty:
                raise Failed("YAML Error: File is empty")
            self.data = {}

    def save(self):
        if self.path:
            with open(self.path, 'w', encoding="utf-8") as fp:
                self.yaml.dump(self.data, fp)
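
A minimal standalone sketch of the new entry point, for orientation; the version string and file path are illustrative, and constructing Requests reaches out to GitHub to resolve the latest VERSION file:

    from modules.request import Requests, parse_version

    # parse_version("2.0.2-develop10") -> ("2.0.2-develop10", "2.0.2", 10)
    file_version = parse_version("2.0.2-develop10")

    requests_obj = Requests(file_version, None, None)          # env_version, git_branch
    config_yaml = requests_obj.file_yaml("config/config.yml")  # local YAML, no HTTP
    if requests_obj.new_version:
        print(f"Update available: {requests_obj.new_version}")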

View file

@@ -29,15 +29,16 @@ monitor_descriptions = {
apply_tags_translation = {"": "add", "sync": "replace", "remove": "remove"}
class Sonarr:
-def __init__(self, config, library, params):
-self.config = config
+def __init__(self, requests, cache, library, params):
+self.requests = requests
+self.cache = cache
self.library = library
self.url = params["url"]
self.token = params["token"]
logger.secret(self.url)
logger.secret(self.token)
try:
-self.api = SonarrAPI(self.url, self.token, session=self.config.session)
+self.api = SonarrAPI(self.url, self.token, session=self.requests.session)
self.api.respect_list_exclusions_when_adding()
self.api._validate_add_options(params["root_folder_path"], params["quality_profile"], params["language_profile"]) # noqa
self.profiles = self.api.quality_profile()
@@ -126,8 +127,8 @@ class Sonarr:
tvdb_id = item[0] if isinstance(item, tuple) else item
logger.ghost(f"Loading TVDb ID {i}/{len(tvdb_ids)} ({tvdb_id})")
try:
-if self.config.Cache and not ignore_cache:
-_id = self.config.Cache.query_sonarr_adds(tvdb_id, self.library.original_mapping_name)
+if self.cache and not ignore_cache:
+_id = self.cache.query_sonarr_adds(tvdb_id, self.library.original_mapping_name)
if _id:
skipped.append(item)
raise Continue
@@ -176,8 +177,8 @@ class Sonarr:
logger.info("")
for series in added:
logger.info(f"Added to Sonarr | {series.tvdbId:<7} | {series.title}")
-if self.config.Cache:
-self.config.Cache.update_sonarr_adds(series.tvdbId, self.library.original_mapping_name)
+if self.cache:
+self.cache.update_sonarr_adds(series.tvdbId, self.library.original_mapping_name)
logger.info(f"{len(added)} Series added to Sonarr")
if len(exists) > 0 or len(skipped) > 0:
@@ -193,8 +194,8 @@ class Sonarr:
upgrade_qp.append(series)
else:
logger.info(f"Already in Sonarr | {series.tvdbId:<7} | {series.title}")
-if self.config.Cache:
-self.config.Cache.update_sonarr_adds(series.tvdbId, self.library.original_mapping_name)
+if self.cache:
+self.cache.update_sonarr_adds(series.tvdbId, self.library.original_mapping_name)
if upgrade_qp:
self.api.edit_multiple_series(upgrade_qp, quality_profile=qp)
for series in upgrade_qp:

View file

@@ -8,8 +8,8 @@ logger = util.logger
builders = ["tautulli_popular", "tautulli_watched"]
class Tautulli:
-def __init__(self, config, library, params):
-self.config = config
+def __init__(self, requests, library, params):
+self.requests = requests
self.library = library
self.url = params["url"]
self.apikey = params["apikey"]
@@ -69,4 +69,4 @@ class Tautulli:
if params:
for k, v in params.items():
final_params[k] = v
-return self.config.get_json(self.api, params=final_params)
+return self.requests.get_json(self.api, params=final_params)

View file

@@ -113,8 +113,8 @@ class TMDbMovie(TMDBObj):
super().__init__(tmdb, tmdb_id, ignore_cache=ignore_cache)
expired = None
data = None
-if self._tmdb.config.Cache and not ignore_cache:
-data, expired = self._tmdb.config.Cache.query_tmdb_movie(tmdb_id, self._tmdb.expiration)
+if self._tmdb.cache and not ignore_cache:
+data, expired = self._tmdb.cache.query_tmdb_movie(tmdb_id, self._tmdb.expiration)
if expired or not data:
data = self.load_movie()
super()._load(data)
@@ -125,8 +125,8 @@ class TMDbMovie(TMDBObj):
self.collection_id = data["collection_id"] if isinstance(data, dict) else data.collection.id if data.collection else None
self.collection_name = data["collection_name"] if isinstance(data, dict) else data.collection.name if data.collection else None
-if self._tmdb.config.Cache and not ignore_cache:
-self._tmdb.config.Cache.update_tmdb_movie(expired, self, self._tmdb.expiration)
+if self._tmdb.cache and not ignore_cache:
+self._tmdb.cache.update_tmdb_movie(expired, self, self._tmdb.expiration)
@retry(stop_max_attempt_number=6, wait_fixed=10000, retry_on_exception=util.retry_if_not_failed)
def load_movie(self):
@@ -144,8 +144,8 @@ class TMDbShow(TMDBObj):
super().__init__(tmdb, tmdb_id, ignore_cache=ignore_cache)
expired = None
data = None
-if self._tmdb.config.Cache and not ignore_cache:
-data, expired = self._tmdb.config.Cache.query_tmdb_show(tmdb_id, self._tmdb.expiration)
+if self._tmdb.cache and not ignore_cache:
+data, expired = self._tmdb.cache.query_tmdb_show(tmdb_id, self._tmdb.expiration)
if expired or not data:
data = self.load_show()
super()._load(data)
@@ -162,8 +162,8 @@ class TMDbShow(TMDBObj):
loop = data.seasons if not isinstance(data, dict) else data["seasons"].split("%|%") if data["seasons"] else [] # noqa
self.seasons = [TMDbSeason(s) for s in loop]
-if self._tmdb.config.Cache and not ignore_cache:
-self._tmdb.config.Cache.update_tmdb_show(expired, self, self._tmdb.expiration)
+if self._tmdb.cache and not ignore_cache:
+self._tmdb.cache.update_tmdb_show(expired, self, self._tmdb.expiration)
@retry(stop_max_attempt_number=6, wait_fixed=10000, retry_on_exception=util.retry_if_not_failed)
def load_show(self):
@@ -184,8 +184,8 @@ class TMDbEpisode:
self.ignore_cache = ignore_cache
expired = None
data = None
-if self._tmdb.config.Cache and not ignore_cache:
-data, expired = self._tmdb.config.Cache.query_tmdb_episode(self.tmdb_id, self.season_number, self.episode_number, self._tmdb.expiration)
+if self._tmdb.cache and not ignore_cache:
+data, expired = self._tmdb.cache.query_tmdb_episode(self.tmdb_id, self.season_number, self.episode_number, self._tmdb.expiration)
if expired or not data:
data = self.load_episode()
@@ -198,8 +198,8 @@ class TMDbEpisode:
self.imdb_id = data["imdb_id"] if isinstance(data, dict) else data.imdb_id
self.tvdb_id = data["tvdb_id"] if isinstance(data, dict) else data.tvdb_id
-if self._tmdb.config.Cache and not ignore_cache:
-self._tmdb.config.Cache.update_tmdb_episode(expired, self, self._tmdb.expiration)
+if self._tmdb.cache and not ignore_cache:
+self._tmdb.cache.update_tmdb_episode(expired, self, self._tmdb.expiration)
@retry(stop_max_attempt_number=6, wait_fixed=10000, retry_on_exception=util.retry_if_not_failed)
def load_episode(self):
@@ -215,13 +215,15 @@ class TMDbEpisode:
class TMDb:
def __init__(self, config, params):
self.config = config
+self.requests = self.config.Requests
+self.cache = self.config.Cache
self.apikey = params["apikey"]
self.language = params["language"]
self.region = None
self.expiration = params["expiration"]
logger.secret(self.apikey)
try:
-self.TMDb = TMDbAPIs(self.apikey, language=self.language, session=self.config.session)
+self.TMDb = TMDbAPIs(self.apikey, language=self.language, session=self.requests.session)
except TMDbException as e:
raise Failed(f"TMDb Error: {e}")
self.iso_3166_1 = {iso: i.name for iso, i in self.TMDb._iso_3166_1.items()} # noqa

View file

@ -1,6 +1,7 @@
import requests, time, webbrowser import time, webbrowser
from modules import util from modules import util
from modules.util import Failed, TimeoutExpired, YAML from modules.request import urlparse
from modules.util import Failed, TimeoutExpired
from retrying import retry from retrying import retry
logger = util.logger logger = util.logger
@ -36,8 +37,9 @@ id_types = {
} }
class Trakt: class Trakt:
def __init__(self, config, params): def __init__(self, requests, read_only, params):
self.config = config self.requests = requests
self.read_only = read_only
self.client_id = params["client_id"] self.client_id = params["client_id"]
self.client_secret = params["client_secret"] self.client_secret = params["client_secret"]
self.pin = params["pin"] self.pin = params["pin"]
@ -137,10 +139,9 @@ class Trakt:
"redirect_uri": redirect_uri, "redirect_uri": redirect_uri,
"grant_type": "authorization_code" "grant_type": "authorization_code"
} }
response = self.config.post(f"{base_url}/oauth/token", json=json_data, headers={"Content-Type": "application/json"}) response = self.requests.post(f"{base_url}/oauth/token", json=json_data, headers={"Content-Type": "application/json"})
if response.status_code != 200: if response.status_code != 200:
raise Failed(f"Trakt Error: ({response.status_code}) {response.reason}") raise Failed(f"Trakt Error: ({response.status_code}) {response.reason}")
#raise Failed("Trakt Error: Invalid trakt pin. If you're sure you typed it in correctly your client_id or client_secret may be invalid")
response_json = response.json() response_json = response.json()
logger.trace(response_json) logger.trace(response_json)
if not self._save(response_json): if not self._save(response_json):
@ -155,7 +156,7 @@ class Trakt:
"trakt-api-key": self.client_id "trakt-api-key": self.client_id
} }
logger.secret(token) logger.secret(token)
response = self.config.get(f"{base_url}/users/settings", headers=headers) response = self.requests.get(f"{base_url}/users/settings", headers=headers)
if response.status_code == 423: if response.status_code == 423:
raise Failed("Trakt Error: Account is Locked please Contact Trakt Support") raise Failed("Trakt Error: Account is Locked please Contact Trakt Support")
if response.status_code != 200: if response.status_code != 200:
@ -172,7 +173,7 @@ class Trakt:
"redirect_uri": redirect_uri, "redirect_uri": redirect_uri,
"grant_type": "refresh_token" "grant_type": "refresh_token"
} }
response = self.config.post(f"{base_url}/oauth/token", json=json_data, headers={"Content-Type": "application/json"}) response = self.requests.post(f"{base_url}/oauth/token", json=json_data, headers={"Content-Type": "application/json"})
if response.status_code != 200: if response.status_code != 200:
return False return False
return self._save(response.json()) return self._save(response.json())
@ -180,8 +181,8 @@ class Trakt:
def _save(self, authorization): def _save(self, authorization):
if authorization and self._check(authorization): if authorization and self._check(authorization):
if self.authorization != authorization and not self.config.read_only: if self.authorization != authorization and not self.read_only:
yaml = YAML(self.config_path) yaml = self.requests.file_yaml(self.config_path)
yaml.data["trakt"]["pin"] = None yaml.data["trakt"]["pin"] = None
yaml.data["trakt"]["authorization"] = { yaml.data["trakt"]["authorization"] = {
"access_token": authorization["access_token"], "access_token": authorization["access_token"],
@ -219,9 +220,9 @@ class Trakt:
if pages > 1: if pages > 1:
params["page"] = current params["page"] = current
if json_data is not None: if json_data is not None:
response = self.config.post(f"{base_url}{url}", json=json_data, headers=headers) response = self.requests.post(f"{base_url}{url}", json=json_data, headers=headers)
else: else:
response = self.config.get(f"{base_url}{url}", headers=headers, params=params) response = self.requests.get(f"{base_url}{url}", headers=headers, params=params)
if pages == 1 and "X-Pagination-Page-Count" in response.headers and not params: if pages == 1 and "X-Pagination-Page-Count" in response.headers and not params:
pages = int(response.headers["X-Pagination-Page-Count"]) pages = int(response.headers["X-Pagination-Page-Count"])
if response.status_code >= 400: if response.status_code >= 400:
@@ -251,7 +252,7 @@ class Trakt:

     def list_description(self, data):
         try:
-            return self._request(requests.utils.urlparse(data).path)["description"]
+            return self._request(urlparse(data).path)["description"]
         except Failed:
             raise Failed(data)
@@ -313,7 +314,7 @@ class Trakt:
         return data

     def sync_list(self, slug, ids):
-        current_ids = self._list(slug, urlparse=False, fail=False)
+        current_ids = self._list(slug, parse=False, fail=False)

         def read_result(data, obj_type, result_type, result_str=None):
             result_str = result_str if result_str else result_type.capitalize()
@@ -351,7 +352,7 @@ class Trakt:
                 read_not_found(results, "Remove")
                 time.sleep(1)
-        trakt_ids = self._list(slug, urlparse=False, trakt_ids=True)
+        trakt_ids = self._list(slug, parse=False, trakt_ids=True)
         trakt_lookup = {f"{ty}_{i_id}": t_id for t_id, i_id, ty in trakt_ids}
         rank_ids = [trakt_lookup[f"{ty}_{i_id}"] for i_id, ty in ids if f"{ty}_{i_id}" in trakt_lookup]
         self._request(f"/users/me/lists/{slug}/items/reorder", json_data={"rank": rank_ids})
@@ -376,9 +377,9 @@ class Trakt:
     def build_user_url(self, user, name):
         return f"{base_url.replace('api.', '')}/users/{user}/lists/{name}"

-    def _list(self, data, urlparse=True, trakt_ids=False, fail=True, ignore_other=False):
+    def _list(self, data, parse=True, trakt_ids=False, fail=True, ignore_other=False):
         try:
-            url = requests.utils.urlparse(data).path.replace("/official/", "/") if urlparse else f"/users/me/lists/{data}"
+            url = urlparse(data).path.replace("/official/", "/") if parse else f"/users/me/lists/{data}"
             items = self._request(f"{url}/items")
         except Failed:
             raise Failed(f"Trakt Error: List {data} not found")
@@ -417,7 +418,7 @@ class Trakt:
         return self._parse(items, typeless=chart_type == "popular", item_type="movie" if is_movie else "show", ignore_other=ignore_other)

     def get_people(self, data):
-        return {str(i[0][0]): i[0][1] for i in self._list(data) if i[1] == "tmdb_person"}
+        return {str(i[0][0]): i[0][1] for i in self._list(data) if i[1] == "tmdb_person"}  # noqa

     def validate_list(self, trakt_lists):
         values = util.get_list(trakt_lists, split=False)
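Every `self.config.get`/`self.config.post` in this file becomes `self.requests.get`/`self.requests.post`: the new requests module from this PR now owns all outgoing HTTP for Trakt. The module itself is outside this excerpt, so the following is only a minimal sketch of the interface these call sites assume (a shared `requests.Session` plus the helpers named in the diffs below; everything beyond those names is a guess):

```python
import base64
import requests
from lxml import html

class Requests:
    # Hypothetical shape of the shared wrapper; not the actual module.
    def __init__(self):
        self.session = requests.Session()

    def get(self, url, headers=None, params=None, language=None):
        # TVDb call sites below pass language= instead of building headers
        # themselves; assume the wrapper turns it into Accept-Language.
        if language:
            headers = {**(headers or {}), "Accept-Language": language}
        return self.session.get(url, headers=headers, params=params)

    def post(self, url, json=None, headers=None):
        return self.session.post(url, json=json, headers=headers)

    def get_html(self, url, **kwargs):
        # Used by TVDb.get_list_description and friends below.
        return html.fromstring(self.get(url, **kwargs).content)

    def get_image_encoded(self, url):
        # Used by Webhooks below to inline Plex artwork as base64.
        return base64.b64encode(self.get(url).content).decode()
```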
@@ -1,9 +1,10 @@
-import re, requests, time
+import re, time
 from datetime import datetime
 from lxml import html
 from lxml.etree import ParserError
 from modules import util
 from modules.util import Failed
+from requests.exceptions import MissingSchema
 from retrying import retry

 logger = util.logger
@@ -48,8 +49,8 @@ class TVDbObj:
         self.ignore_cache = ignore_cache
         expired = None
         data = None
-        if self._tvdb.config.Cache and not ignore_cache:
-            data, expired = self._tvdb.config.Cache.query_tvdb(tvdb_id, is_movie, self._tvdb.expiration)
+        if self._tvdb.cache and not ignore_cache:
+            data, expired = self._tvdb.cache.query_tvdb(tvdb_id, is_movie, self._tvdb.expiration)
         if expired or not data:
             item_url = f"{urls['movie_id' if is_movie else 'series_id']}{tvdb_id}"
             try:
@@ -100,12 +101,13 @@ class TVDbObj:
             self.genres = parse_page("//strong[text()='Genres']/parent::li/span/a/text()[normalize-space()]", is_list=True)

-        if self._tvdb.config.Cache and not ignore_cache:
-            self._tvdb.config.Cache.update_tvdb(expired, self, self._tvdb.expiration)
+        if self._tvdb.cache and not ignore_cache:
+            self._tvdb.cache.update_tvdb(expired, self, self._tvdb.expiration)

 class TVDb:
-    def __init__(self, config, tvdb_language, expiration):
-        self.config = config
+    def __init__(self, requests, cache, tvdb_language, expiration):
+        self.requests = requests
+        self.cache = cache
         self.language = tvdb_language
         self.expiration = expiration
@@ -115,7 +117,7 @@ class TVDb:
     @retry(stop_max_attempt_number=6, wait_fixed=10000, retry_on_exception=util.retry_if_not_failed)
     def get_request(self, tvdb_url):
-        response = self.config.get(tvdb_url, headers=util.header(self.language))
+        response = self.requests.get(tvdb_url, language=self.language)
         if response.status_code >= 400:
             raise Failed(f"({response.status_code}) {response.reason}")
         return html.fromstring(response.content)
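The `get_request` hunk swaps the transport but keeps the retry policy: `retrying`'s `@retry` re-invokes the function up to six times with a fixed 10-second wait, and `util.retry_if_not_failed` (visible later in this diff) returns `False` for `Failed`, so deliberate failures abort immediately instead of retrying. Standalone equivalent:

```python
from retrying import retry

class Failed(Exception):
    pass

def retry_if_not_failed(exception):
    # Retry on anything except the deliberate Failed sentinel.
    return not isinstance(exception, Failed)

@retry(stop_max_attempt_number=6, wait_fixed=10000, retry_on_exception=retry_if_not_failed)
def get_request(url):
    ...  # fetch and parse; raising Failed gives up, any other exception retries
```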
@@ -136,8 +138,8 @@ class TVDb:
         else:
             raise Failed(f"TVDb Error: {tvdb_url} must begin with {urls['movies']} or {urls['series']}")
         expired = None
-        if self.config.Cache and not ignore_cache and not is_movie:
-            tvdb_id, expired = self.config.Cache.query_tvdb_map(tvdb_url, self.expiration)
+        if self.cache and not ignore_cache and not is_movie:
+            tvdb_id, expired = self.cache.query_tvdb_map(tvdb_url, self.expiration)
         if tvdb_id and not expired:
             return tvdb_id, None, None
         logger.trace(f"URL: {tvdb_url}")
@@ -165,8 +167,8 @@ class TVDb:
                 pass
             if tmdb_id is None and imdb_id is None:
                 raise Failed(f"TVDb Error: No TMDb ID or IMDb ID found")
-            if self.config.Cache and not ignore_cache and not is_movie:
-                self.config.Cache.update_tvdb_map(expired, tvdb_url, tvdb_id, self.expiration)
+            if self.cache and not ignore_cache and not is_movie:
+                self.cache.update_tvdb_map(expired, tvdb_url, tvdb_id, self.expiration)
             return tvdb_id, tmdb_id, imdb_id
         elif tvdb_url.startswith(urls["movie_id"]):
             err_text = f"using TVDb Movie ID: {tvdb_url[len(urls['movie_id']):]}"
@@ -177,7 +179,7 @@ class TVDb:
             raise Failed(f"TVDb Error: Could not find a TVDb {media_type} {err_text}")

     def get_list_description(self, tvdb_url):
-        response = self.config.get_html(tvdb_url, headers=util.header(self.language))
+        response = self.requests.get_html(tvdb_url, language=self.language)
         description = response.xpath("//div[@class='block']/div[not(@style='display:none')]/p/text()")
         description = description[0] if len(description) > 0 and len(description[0]) > 0 else None
         poster = response.xpath("//div[@id='artwork']/div/div/a/@href")
@@ -190,7 +192,7 @@ class TVDb:
         logger.trace(f"URL: {tvdb_url}")
         if tvdb_url.startswith((urls["list"], urls["alt_list"])):
             try:
-                response = self.config.get_html(tvdb_url, headers=util.header(self.language))
+                response = self.requests.get_html(tvdb_url, language=self.language)
                 items = response.xpath("//div[@id='general']//div/div/h3/a")
                 for item in items:
                     title = item.xpath("text()")[0]
@@ -217,7 +219,7 @@ class TVDb:
                 if len(ids) > 0:
                     return ids
                 raise Failed(f"TVDb Error: No TVDb IDs found at {tvdb_url}")
-            except requests.exceptions.MissingSchema:
+            except MissingSchema:
                 logger.stacktrace()
                 raise Failed(f"TVDb Error: URL Lookup Failed for {tvdb_url}")
         else:
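Net effect for this file: `TVDb` no longer reaches through the whole config object; it is handed the shared requests wrapper and the cache directly. Hypothetical wiring under the new signature (the call site lives outside this excerpt, and the return shape of `get_list_description` is inferred from the hunk above, not shown in it):

```python
# Assumed construction; requests_obj and cache would come from the config layer.
tvdb = TVDb(requests_obj, cache, tvdb_language="eng", expiration=60)
description, poster = tvdb.get_list_description("https://www.thetvdb.com/lists/example")
```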
@@ -1,4 +1,4 @@
-import glob, os, re, requests, ruamel.yaml, signal, sys, time
+import glob, os, re, signal, sys, time
 from datetime import datetime, timedelta
 from modules.logs import MyLogger
 from num2words import num2words
@@ -43,19 +43,6 @@ class NotScheduled(Exception):
 class NotScheduledRange(NotScheduled):
     pass

-class ImageData:
-    def __init__(self, attribute, location, prefix="", is_poster=True, is_url=True, compare=None):
-        self.attribute = attribute
-        self.location = location
-        self.prefix = prefix
-        self.is_poster = is_poster
-        self.is_url = is_url
-        self.compare = compare if compare else location if is_url else os.stat(location).st_size
-        self.message = f"{prefix}{'poster' if is_poster else 'background'} to [{'URL' if is_url else 'File'}] {location}"
-
-    def __str__(self):
-        return str(self.__dict__)
-
 def retry_if_not_failed(exception):
     return not isinstance(exception, Failed)
@@ -108,88 +95,6 @@ parental_labels = [f"{t.capitalize()}:{v}" for t in parental_types for v in pare
 previous_time = None
 start_time = None

-def guess_branch(version, env_version, git_branch):
-    if git_branch:
-        return git_branch
-    elif env_version in ["nightly", "develop"]:
-        return env_version
-    elif version[2] > 0:
-        dev_version = get_develop()
-        if version[1] != dev_version[1] or version[2] <= dev_version[2]:
-            return "develop"
-        else:
-            return "nightly"
-    else:
-        return "master"
-
-def current_version(version, branch=None):
-    if branch == "nightly":
-        return get_nightly()
-    elif branch == "develop":
-        return get_develop()
-    elif version[2] > 0:
-        new_version = get_develop()
-        if version[1] != new_version[1] or new_version[2] >= version[2]:
-            return new_version
-        return get_nightly()
-    else:
-        return get_master()
-
-nightly_version = None
-
-def get_nightly():
-    global nightly_version
-    if nightly_version is None:
-        nightly_version = get_version("nightly")
-    return nightly_version
-
-develop_version = None
-
-def get_develop():
-    global develop_version
-    if develop_version is None:
-        develop_version = get_version("develop")
-    return develop_version
-
-master_version = None
-
-def get_master():
-    global master_version
-    if master_version is None:
-        master_version = get_version("master")
-    return master_version
-
-def get_version(level):
-    try:
-        url = f"https://raw.githubusercontent.com/Kometa-Team/Kometa/{level}/VERSION"
-        return parse_version(requests.get(url).content.decode().strip(), text=level)
-    except requests.exceptions.ConnectionError:
-        return "Unknown", "Unknown", 0
-
-def parse_version(version, text="develop"):
-    version = version.replace("develop", text)
-    split_version = version.split(f"-{text}")
-    return version, split_version[0], int(split_version[1]) if len(split_version) > 1 else 0
-
-def quote(data):
-    return requests.utils.quote(str(data))
-
-def download_image(title, image_url, download_directory, is_poster=True, filename=None):
-    response = requests.get(image_url, headers=header())
-    if response.status_code == 404:
-        raise Failed(f"Image Error: Not Found on Image URL: {image_url}")
-    if response.status_code >= 400:
-        raise Failed(f"Image Error: {response.status_code} on Image URL: {image_url}")
-    if "Content-Type" not in response.headers or response.headers["Content-Type"] not in image_content_types:
-        raise Failed("Image Not PNG, JPG, or WEBP")
-    new_image = os.path.join(download_directory, f"{filename}") if filename else download_directory
-    if response.headers["Content-Type"] == "image/jpeg":
-        new_image += ".jpg"
-    elif response.headers["Content-Type"] == "image/webp":
-        new_image += ".webp"
-    else:
-        new_image += ".png"
-    with open(new_image, "wb") as handler:
-        handler.write(response.content)
-    return ImageData("asset_directory", new_image, prefix=f"{title}'s ", is_poster=is_poster, is_url=False)
-
 def get_image_dicts(group, alias):
     posters = {}
     backgrounds = {}
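The version helpers deleted above are not dead code: with this PR they presumably move behind the new requests module along with the bare `requests.get` calls they contain. One subtlety worth keeping in mind when reading them: `parse_version` returns a 3-tuple of `(full string, base version, build number)`, so `version[2] > 0` means "this is a `-buildN` snapshot", and `guess_branch` compares build numbers against develop to pick a branch. A worked example under that assumption:

```python
# parse_version("2.0.1-develop12") -> ("2.0.1-develop12", "2.0.1", 12)
version = ("2.0.1-develop12", "2.0.1", 12)
dev_version = ("2.0.1-develop15", "2.0.1", 15)

# Build 12 already exists on develop (12 <= 15), so this is a develop build.
branch = "develop" if version[1] != dev_version[1] or version[2] <= dev_version[2] else "nightly"
print(branch)  # develop
```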
@@ -205,34 +110,6 @@ def get_image_dicts(group, alias):
             logger.error(f"Metadata Error: {attr} attribute is blank")
     return posters, backgrounds

-def pick_image(title, images, prioritize_assets, download_url_assets, item_dir, is_poster=True, image_name=None):
-    image_type = "poster" if is_poster else "background"
-    if image_name is None:
-        image_name = image_type
-    if images:
-        logger.debug(f"{len(images)} {image_type}{'s' if len(images) > 1 else ''} found:")
-        for i in images:
-            logger.debug(f"Method: {i} {image_type.capitalize()}: {images[i]}")
-    if prioritize_assets and "asset_directory" in images:
-        return images["asset_directory"]
-    for attr in ["style_data", f"url_{image_type}", f"file_{image_type}", f"tmdb_{image_type}", "tmdb_profile",
-                 "tmdb_list_poster", "tvdb_list_poster", f"tvdb_{image_type}", "asset_directory", f"pmm_{image_type}",
-                 "tmdb_person", "tmdb_collection_details", "tmdb_actor_details", "tmdb_crew_details", "tmdb_director_details",
-                 "tmdb_producer_details", "tmdb_writer_details", "tmdb_movie_details", "tmdb_list_details",
-                 "tvdb_list_details", "tvdb_movie_details", "tvdb_show_details", "tmdb_show_details"]:
-        if attr in images:
-            if attr in ["style_data", f"url_{image_type}"] and download_url_assets and item_dir:
-                if "asset_directory" in images:
-                    return images["asset_directory"]
-                else:
-                    try:
-                        return download_image(title, images[attr], item_dir, is_poster=is_poster, filename=image_name)
-                    except Failed as e:
-                        logger.error(e)
-            if attr in ["asset_directory", f"pmm_{image_type}"]:
-                return images[attr]
-            return ImageData(attr, images[attr], is_poster=is_poster, is_url=attr != f"file_{image_type}")
-
 def add_dict_list(keys, value, dict_map):
     for key in keys:
         if key in dict_map:
@@ -1012,36 +889,3 @@ def get_system_fonts():
         return dirs
     system_fonts = [n for d in dirs for _, _, ns in os.walk(d) for n in ns]
     return system_fonts
-
-class YAML:
-    def __init__(self, path=None, input_data=None, check_empty=False, create=False, start_empty=False):
-        self.path = path
-        self.input_data = input_data
-        self.yaml = ruamel.yaml.YAML()
-        self.yaml.width = 100000
-        self.yaml.indent(mapping=2, sequence=2)
-        try:
-            if input_data:
-                self.data = self.yaml.load(input_data)
-            else:
-                if start_empty or (create and not os.path.exists(self.path)):
-                    with open(self.path, 'w'):
-                        pass
-                    self.data = {}
-                else:
-                    with open(self.path, encoding="utf-8") as fp:
-                        self.data = self.yaml.load(fp)
-        except ruamel.yaml.error.YAMLError as e:
-            e = str(e).replace("\n", "\n ")
-            raise Failed(f"YAML Error: {e}")
-        except Exception as e:
-            raise Failed(f"YAML Error: {e}")
-        if not self.data or not isinstance(self.data, dict):
-            if check_empty:
-                raise Failed("YAML Error: File is empty")
-            self.data = {}
-
-    def save(self):
-        if self.path:
-            with open(self.path, 'w', encoding="utf-8") as fp:
-                self.yaml.dump(self.data, fp)
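The `YAML` wrapper removed here is what `Trakt._save` earlier in this commit (and `Webhooks` below) now reach through `self.requests.file_yaml(...)`. Assuming the class itself moved into the new module unchanged, the helper is probably no more than a factory; the name `file_yaml` comes from the call sites, the signature is a guess:

```python
def file_yaml(self, path, check_empty=False, create=False, start_empty=False):
    # Assumed helper on the shared requests object: hand back the relocated
    # YAML wrapper (the class deleted above) for a config file on disk.
    return YAML(path=path, check_empty=check_empty, create=create, start_empty=start_empty)
```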
@@ -1,12 +1,13 @@
 from json import JSONDecodeError
 from modules import util
-from modules.util import Failed, YAML
+from modules.util import Failed

 logger = util.logger

 class Webhooks:
     def __init__(self, config, system_webhooks, library=None, notifiarr=None, gotify=None):
         self.config = config
+        self.requests = self.config.Requests
         self.error_webhooks = system_webhooks["error"] if "error" in system_webhooks else []
         self.version_webhooks = system_webhooks["version"] if "version" in system_webhooks else []
         self.run_start_webhooks = system_webhooks["run_start"] if "run_start" in system_webhooks else []
@@ -39,7 +40,7 @@ class Webhooks:
                 json = self.discord(json)
             elif webhook.startswith("https://hooks.slack.com/services"):
                 json = self.slack(json)
-            response = self.config.post(webhook, json=json)
+            response = self.requests.post(webhook, json=json)
             if response is not None:
                 try:
                     response_json = response.json()
@@ -47,7 +48,7 @@ class Webhooks:
                     if webhook == "notifiarr" and self.notifiarr and response.status_code == 400:
                         def remove_from_config(text, hook_cat):
                             if response_json["details"]["response"] == text:
-                                yaml = YAML(self.config.config_path)
+                                yaml = self.requests.file_yaml(self.config.config_path)
                                 changed = False
                                 if hook_cat in yaml.data and yaml.data["webhooks"][hook_cat]:
                                     if isinstance(yaml.data["webhooks"][hook_cat], list) and "notifiarr" in yaml.data["webhooks"][hook_cat]:
@@ -83,7 +84,7 @@ class Webhooks:
         if version[1] != latest_version[1]:
             notes = self.config.GitHub.latest_release_notes()
         elif version[2] and version[2] < latest_version[2]:
-            notes = self.config.GitHub.get_commits(version[2], nightly=self.config.branch == "nightly")
+            notes = self.config.GitHub.get_commits(version[2], nightly=self.requests.branch == "nightly")
         self._request(self.version_webhooks, {"event": "version", "current": version[0], "latest": latest_version[0], "notes": notes})

     def end_time_hooks(self, start_time, end_time, run_time, stats):
@@ -124,10 +125,10 @@ class Webhooks:
         if self.library:
             thumb = None
             if not poster_url and collection.thumb and next((f for f in collection.fields if f.name == "thumb"), None):
-                thumb = self.config.get_image_encoded(f"{self.library.url}{collection.thumb}?X-Plex-Token={self.library.token}")
+                thumb = self.requests.get_image_encoded(f"{self.library.url}{collection.thumb}?X-Plex-Token={self.library.token}")
             art = None
             if not playlist and not background_url and collection.art and next((f for f in collection.fields if f.name == "art"), None):
-                art = self.config.get_image_encoded(f"{self.library.url}{collection.art}?X-Plex-Token={self.library.token}")
+                art = self.requests.get_image_encoded(f"{self.library.url}{collection.art}?X-Plex-Token={self.library.token}")
             self._request(webhooks, {
                 "event": "changes",
                 "server_name": self.library.PlexServer.friendlyName,
@@ -330,4 +331,3 @@ class Webhooks:
                 fields.append(field)
             new_json["embeds"][0]["fields"] = fields
         return new_json
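One pattern in this file worth spelling out: when Notifiarr rejects a hook with a 400, `remove_from_config` edits the config YAML in place and saves it. A self-contained sketch of the same remove-and-save flow using `ruamel.yaml` directly (the keys mirror the diff; the string-valued case and the surrounding scaffolding are assumptions):

```python
from ruamel.yaml import YAML

def remove_notifiarr_hook(config_path: str, hook_cat: str) -> bool:
    # Drop the "notifiarr" entry from webhooks[hook_cat] and save,
    # mirroring remove_from_config above; True if anything changed.
    yaml = YAML()
    with open(config_path, encoding="utf-8") as fp:
        data = yaml.load(fp)
    changed = False
    hooks = data.get("webhooks", {}).get(hook_cat)
    if isinstance(hooks, list) and "notifiarr" in hooks:
        hooks.remove("notifiarr")
        changed = True
    elif hooks == "notifiarr":  # assumed single-value form
        data["webhooks"][hook_cat] = None
        changed = True
    if changed:
        with open(config_path, "w", encoding="utf-8") as fp:
            yaml.dump(data, fp)
    return changed
```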
@@ -8,9 +8,9 @@ PlexAPI==4.15.13
 psutil==5.9.8
 python-dotenv==1.0.1
 python-dateutil==2.9.0.post0
-requests==2.32.1
+requests==2.32.2
 retrying==1.3.4
 ruamel.yaml==0.18.6
-schedule==1.2.1
-setuptools==69.5.1
+schedule==1.2.2
+setuptools==70.0.0
 tmdbapis==1.2.16