OneDrive Proxy #20
Open
BrutuZ wants to merge 5 commits into subject-f:develop from BrutuZ:onedrive
Commits (all by BrutuZ):
ed5f89d  OneDrive prototype
386240e  include `s!` back in result on home parser
110508a  format with black
98b9d76  chapter API returns all series pages, not just ch1
957b3f7  Try to load metadata from JSON file in the root
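The `s!` tokens in the commit messages are OneDrive share tokens. The `/shares/{id}` endpoint the proxy calls also accepts an encoded sharing URL: per Microsoft's documented sharing-URL scheme, a link is base64-encoded, stripped of padding, made URL-safe, and prefixed with `u!`. A minimal sketch (the function name and the share link are hypothetical, for illustration only):

```python
import base64


def encode_share_url(url: str) -> str:
    """Turn a OneDrive sharing URL into a share ID for /shares/{id}.

    Follows Microsoft's documented scheme: base64-encode the URL,
    drop '=' padding, map '/' -> '_' and '+' -> '-', then prefix 'u!'.
    """
    token = base64.b64encode(url.encode("utf-8")).decode("ascii")
    return "u!" + token.rstrip("=").replace("/", "_").replace("+", "-")


# Hypothetical share link:
share_id = encode_share_url("https://1drv.ms/f/s!ExampleShareToken")
print(share_id)
```

The resulting ID can be substituted directly into the `driveItem` request the proxy builds.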
@@ -0,0 +1,247 @@
from datetime import datetime

from django.shortcuts import redirect
from django.urls import re_path

from ..source import ProxySource
from ..source.data import ChapterAPI, ProxyException, SeriesAPI, SeriesPage
from ..source.helpers import api_cache, encode, get_wrapper
from json import JSONDecodeError
from requests import HTTPError, RequestException

import re


class OneDrive(ProxySource):
    """
    Receives a OneDrive share URL.
    Parses subfolders up to one level deep:
    "[Artist] Series Title/Ch. 01 - Chapter Title/images.ext" OR "Title/images.ext"
    Expects chapter folders to be prefixed by the chapter number; strips 'Ch.' and 'Chapter' prefixes.
    The series title is the top-most folder name.
    If the chapter number can't be guessed from the folder title, it is assumed to be 1,
    meaning unidentified subfolders will result in a single chapter with all images.
    The chapter title will be blank if it can't be parsed from the sub-folder name.
    The cover will be the first `cover.ext` found in the tree, or page 1 of chapter 1.
    Doesn't support volumes; always "Uncategorized".
    """

    def get_reader_prefix(self):
        return "onedrive"

    def shortcut_instantiator(self):
        def handler(request, series_id):
            print(request, series_id)
            return redirect(f"reader-{self.get_reader_prefix()}-chapter-page", series_id)

        return [
            re_path(r"(?:1drv)/(?P<series_id>[\d\w]+)/$", handler),
        ]

    @staticmethod
    def date_parser(timestamp: float):
        timestamp = int(timestamp)
        try:
            date = datetime.utcfromtimestamp(timestamp)
        except ValueError:
            # Fall back to interpreting the value as milliseconds.
            date = datetime.utcfromtimestamp(timestamp // 1000)
        return [
            date.year,
            date.month - 1,  # zero-based month, JavaScript Date style
            date.day,
            date.hour,
            date.minute,
            date.second,
        ]

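The date array built above uses a zero-based month, matching JavaScript's Date component convention. The same second/millisecond fallback can be sketched standalone with timezone-aware parsing (function name hypothetical):

```python
from datetime import datetime, timezone


def js_date_array(timestamp: float) -> list:
    ts = int(timestamp)
    try:
        date = datetime.fromtimestamp(ts, tz=timezone.utc)
    except (ValueError, OverflowError, OSError):
        # Millisecond timestamps overflow the datetime range; retry as seconds.
        date = datetime.fromtimestamp(ts // 1000, tz=timezone.utc)
    # Month is zero-based to match JavaScript's Date components.
    return [date.year, date.month - 1, date.day, date.hour, date.minute, date.second]


print(js_date_array(1672531200))     # 2023-01-01T00:00:00Z in seconds
print(js_date_array(1672531200000))  # the same instant, in milliseconds
```

Both calls yield `[2023, 0, 1, 0, 0, 0]`, so a frontend can feed the array straight into `new Date(...)`.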
    @api_cache(prefix="od_common_dt", time=300)
    def od_common(self, meta_id):
        def od_api(share_id: str) -> dict:
            map = {"folders": [], "files": []}
            od_series_api = (
                f"https://api.onedrive.com/v1.0/shares/{share_id}/driveItem?$expand=children"
            )
            resp = get_wrapper(od_series_api)
            print(f"Response code: {resp.status_code} {resp.url}")

            # Retry through the proxy if the direct request failed.
            if not resp.ok:
                resp = get_wrapper(od_series_api, use_proxy=True)
                print(f"Response code proxy: {resp.status_code} {resp.url}")

            try:
                resp.raise_for_status()
                resp = resp.json()
            except (HTTPError, JSONDecodeError, RequestException) as error:
                raise ProxyException(f"Could not parse OneDrive folder `{share_id}`: {error}")

            map["title"] = resp["name"]
            try:
                map["date"] = datetime.fromisoformat(
                    f"{resp.get('lastModifiedDateTime', '')[:19]}+00:00"
                ).timestamp()
            except ValueError:
                # Fall back to "now" when the timestamp can't be parsed.
                map["date"] = datetime.utcnow().timestamp()

            for contents in resp["children"]:
                if "file" in contents and (
                    "image" in contents or "image" in contents.get("file", {}).get("mimeType", "")
                ):
                    if contents["name"].startswith("cover."):
                        map["cover"] = contents["@content.downloadUrl"]
                        continue
                    map["files"].append(contents["@content.downloadUrl"])
                elif "file" in contents and contents["name"].endswith(".json"):
                    try:
                        map["metadata"] = get_wrapper(contents["@content.downloadUrl"]).json()
                    except (JSONDecodeError, RequestException):
                        continue
                if "folder" in contents:
                    map["folders"].append(contents.get("webUrl").split("/")[-1])
            if not map.get("cover") and map["files"]:
                map["cover"] = map["files"][0]
            return map

        chapters_dict = {
            "1": {
                "title": "",
                "last_updated": None,
                "groups": {
                    "OneDrive": [],
                },
            }
        }
        series_dict = {
            "title": "",
            "description": "",
            "artist": None,
            "author": None,
            "cover": None,
        }
        series = od_api(meta_id)
        series_dict["title"] = series.get("metadata", {}).get(
            "title", self.parse_title(series["title"])[1]
        )
        has_artist = re.search(r"^\[(.+?)\] ", series["title"], re.IGNORECASE)
        series_dict["description"] = series.get("metadata", {}).get("description", "")
        series_dict["artist"] = series.get("metadata", {}).get(
            "artist", has_artist.group(1) if has_artist else "Unknown"
        )
        series_dict["author"] = series.get("metadata", {}).get("author", series_dict["artist"])
        series_dict["alt_title"] = (
            series_dict["title"].replace(has_artist.group(), "") if has_artist else ""
        )
        series_dict["cover"] = series.get("metadata", {}).get("cover", series.get("cover"))

        if series.get("files"):
            chapters_dict["1"] = {
                "title": series["title"],
                "last_updated": series["date"],
                "groups": {"OneDrive": series["files"]},
            }

        for subfolder in series.get("folders", []):
            folder = od_api(subfolder)
            if not folder["files"]:
                continue
            if not series_dict["cover"]:
                series_dict["cover"] = folder["files"][0]
            title = self.parse_title(folder["title"])
            chapters_dict[title[0]] = {
                "title": title[1],
                "last_updated": folder["date"],
                "groups": {"OneDrive": folder["files"]},
            }
        series_dict["chapters"] = chapters_dict

        chapter_list = [
            [
                ch[0],  # Chapter Number
                ch[0],  # Chapter Number
                ch[1]["title"],  # Chapter Title
                ch[0].replace(".", "-"),  # Chapter Slug
                "OneDrive",  # Group
                self.date_parser(ch[1]["last_updated"]),  # Date
                "Uncategorized",  # Volume Number
            ]
            for ch in sorted(
                chapters_dict.items(),
                key=lambda m: float(m[0]),
                # reverse=True,
            )
        ]
        groups_dict = {str(key): "OneDrive" for key in chapters_dict}

        return {
            "slug": meta_id,
            "title": series_dict["title"],
            "alt_title": series_dict["alt_title"],
            "description": series_dict["description"],
            "artist": series_dict["artist"],
            "author": series_dict["author"],
            "cover": series_dict["cover"],
            "chapters": chapters_dict,
            "chapter_list": chapter_list,
            "groups": groups_dict,
            "timestamp": series["date"],
        }

    def parse_title(self, title: str) -> tuple:
        search = re.search(r"^(?:Ch\.? ?|Chapter )?0?([\d\.,]{1,5})(?: - )?", title, re.IGNORECASE)
        ch = search.group(1) if search else "1"
        ch_title = title if not search else title.replace(search.group(), "")
        return (ch, ch_title)

    @api_cache(prefix="od_series_dt", time=300)
    def series_api_handler(self, meta_id):
        data = self.od_common(meta_id)
        return (
            SeriesAPI(
                slug=meta_id,
                title=data["title"],
                description=data["description"],
                author=data["artist"],
                artist=data["artist"],
                groups=data["groups"],
                cover=data["cover"],
                chapters=data["chapters"],
            )
            if data
            else None
        )

    @api_cache(prefix="od_pages_dt", time=300)
    def chapter_api_handler(self, meta_id):
        data = self.od_common(meta_id)
        return (
            ChapterAPI(
                # Flatten every chapter's pages into a single page list.
                pages=[
                    page
                    for c in [ch["groups"]["OneDrive"] for ch in data["chapters"].values()]
                    for page in c
                ],
                series=data["slug"],
                chapter="1",
            )
            if data
            else None
        )

    @api_cache(prefix="od_series_page_dt", time=300)
    def series_page_handler(self, meta_id):
        data = self.od_common(meta_id)
        return (
            SeriesPage(
                series=data["title"],
                alt_titles=[data["alt_title"]],
                alt_titles_str=data["alt_title"],
                slug=data["slug"],
                cover_vol_url=data["cover"],
                metadata=[["Author", data["author"]], ["Artist", data["artist"]]],
                synopsis=data["description"],
                author=data["artist"],
                chapter_list=data["chapter_list"],
                original_url=f"https://1drv.ms/f/{meta_id}",
            )
            if data
            else None
        )
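The folder-naming convention from the class docstring can be exercised in isolation. A small sketch mirroring the `parse_title` regex above, run against hypothetical folder names:

```python
import re


def parse_title(title: str) -> tuple:
    # Mirrors OneDrive.parse_title: strip an optional "Ch."/"Chapter" prefix
    # and leading zero, capture the chapter number, return (number, title).
    search = re.search(r"^(?:Ch\.? ?|Chapter )?0?([\d\.,]{1,5})(?: - )?", title, re.IGNORECASE)
    ch = search.group(1) if search else "1"
    ch_title = title if not search else title.replace(search.group(), "")
    return (ch, ch_title)


print(parse_title("Ch. 01 - Chapter Title"))  # ('1', 'Chapter Title')
print(parse_title("Chapter 12.5 - Extras"))   # ('12.5', 'Extras')
print(parse_title("Bonus"))                   # ('1', 'Bonus')
```

The last case shows the documented fallback: a folder with no parseable number is treated as chapter 1 with its full name as the title.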
I have a little FUD about burying this in our "about" page wall of text when we're not the only ones who can read this information. The share URL should have a "source" button that reveals the original resource.
That original URL should in some way reveal the original source, which would also reveal the drive's owner.
I've seen gist owners not realizing that their GitHub accounts are publicly visible, so I'm still a bit wary about this.
I tried hijacking the function to add a new pop-up element exclusively for this disclaimer, but my frontend-fu wasn't strong enough. IIRC I managed to make it show up but broke the close button 😅.