v0.5.0 Release (#960)

* Bump versions by dotnet-bump-version.

* Color Theme applies to scrollbars (#793)

* Moved more SCSS to the @use syntax to reduce CSS size

* Hooked in color-scheme to help the browser render scrollbars appropriately depending on the color scheme the user selects

* Bump versions by dotnet-bump-version.

* UI Updates + New Events (#806)

* Implemented ability to see downloads users are performing on the events widget.

* Fixed a bug where version update task was calling wrong code

* Fixed a bug where when checking for updates, the event wouldn't be pushed to server with correct name.

Added update check to the event widget rather than opening a modal on the user.

* Relaxed password requirements to only require 6-32 characters and informed the user on the register form about the requirements

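A hedged sketch of what length-only validation looks like with ASP.NET Core Identity (option names are from Microsoft.AspNetCore.Identity; Kavita's exact wiring may differ):

```csharp
services.AddIdentityCore<AppUser>(options =>
{
    options.Password.RequiredLength = 6;              // lower bound of the 6-32 range
    options.Password.RequireDigit = false;
    options.Password.RequireLowercase = false;
    options.Password.RequireUppercase = false;
    options.Password.RequireNonAlphanumeric = false;
    // The 32-character upper bound would be enforced separately, e.g. with
    // [StringLength(32, MinimumLength = 6)] on the register DTO (illustrative).
});
```
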
* Removed a ton of duplicate logic for series cards where the logic was already defined in action service

* Fixed OPDS total items reporting a rounded number rather than the actual total item count.

* Fixed an off-by-one issue in OPDS pagination

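Both OPDS fixes come down to pagination arithmetic; a minimal illustration with hypothetical names:

```csharp
// Page count must round up (ceiling division), not truncate, and with
// 1-based page numbers the last page is exactly totalPages.
var totalPages = (totalItems + pageSize - 1) / pageSize;
var isLastPage = pageNumber == totalPages;
```
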
* Bump versions by dotnet-bump-version.

* Update GA .net version (#818)

* Local Metadata Integration Part 1 (#817)

* Started with some basic plumbing with comic info parsing updating Series/Volume.

* We can now get chapter title from comicInfo.xml

* Hooked in the ability to store people into the chapter metadata.

* Removed no longer used imports, fixed up some foreign key constraints on deleting series with person linked.

* Refactored Summary out of the UI for Series into SeriesMetadata. Updated application to .NET 6. There is a bug in metadata code for updating.

* Replaced the Parallel.ForEach with a normal foreach, which lets us use async. For I/O-heavy code, this shouldn't change much.

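The pattern in question, sketched under the assumption of an async per-file worker: Parallel.ForEach gives no way to await inside the loop body, while a plain foreach does.

```csharp
// Before: no way to await inside the body without sync-over-async.
// Parallel.ForEach(files, file => ProcessFile(file));

// After: each iteration can await I/O-bound work; for I/O-heavy code
// the loss of CPU parallelism is negligible.
foreach (var file in files)
{
    await ProcessFileAsync(file); // hypothetical async worker
}
```
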
* Refactored scan code to only check extensions that can contain ComicInfo, fixed a bug where scan events weren't using the correct method name, removed summary field (still buggy)

* Fixed a bug where on cancelling a metadata request in modal, underlying button would get stuck in a disabled state.

* Changed how metadata selects the first volume to read summary info from. It will now select the first non-special volume rather than Volume 1.

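Roughly the following selection (entity shape illustrative, not Kavita's actual model):

```csharp
// Prefer the first non-special volume's summary; specials are assumed
// here to be stored as volume 0, falling back to any volume if needed.
var volumeForSummary = volumes
    .OrderBy(v => v.Number)
    .FirstOrDefault(v => v.Number != 0) ?? volumes.FirstOrDefault();
```
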
* More debugging and found more bugs to fix

* Redid all the migrations as one single one. Fixed a bug with GetChapterInfo returning null when ChapterMetadata didn't exist for that Chapter.

Fixed an issue with mapper failing on GetChapterMetadata. Started work on adding people and a design for people.

* Fixed a bug in the file-modified check: it now takes into account whether the file has been processed at least once. Introduced a bug in saving people to series.

* Just made code compilable again

* Fixed up code. Now people for series and chapters add correctly without any db issues.

* Things are working, but I'm not happy with how the management of Person is. I need to take into account that 1 person needs to map to an image and role is arbitrary.

* Started adding UI code to showcase chapter metadata

* Updated workflow to be .NET 6

* WIP of updating card detail to show the information more clearly and without so many if statements

* Removed ChapterMetadata and now store the data on the Chapter itself. Much easier to use and fewer joins.

* Implemented Genre on SeriesMetadata level

* Genres and People are now removed from Series level if they are no longer on comicInfo

* PeopleHelper is done with unit tests. Everything is working.

* Unit tests in place for Genre Helper

* Starting on CacheHelper

* Finished tests for ShouldUpdateCoverImage. Fixed and added tests in ArchiveService/ScannerService.

* CacheHelper is fully tested

* Some DI cleanup

* Scanner Service now calls GetComicInfo for books. Added ability to update Series Sort name from metadata files (mainly epub as comicinfo doesn't have a field)

* Forgot to move a line of code

* SortName now populates from metadata (epub only, ComicInfo has no tags)

* Cards now show the chapter title on hover if it's set; otherwise they default back to the title.

* Fixed a major issue with how MangaFiles were being updated with LastModified, which messed up our logic for avoiding refreshes.

* Woohoo, more tests and some refactors to be able to test more services with a mock filesystem. Fixed an issue where SortName was getting set as the first chapter even though the Series was in a group.

* Refactored the MangaFile creation code into the DbFactory, where we also set up the first LastModified update.

* The has-file-changed bug is now finally fixed

* Remove dead genres, refactor genre to use title instead of name.

* Refactored out a directory from ShouldUpdateCoverImage() to keep the code clean

* Unit tests for ComicInfo on BookService.

* Refactored series detail into its own component

* Series-detail now receives refresh metadata events to refresh what's on screen

* Removed references to Artist on PersonRole as it has no metadata mapping

* Security audit

* Fixed a benchmark

* Updated JWT Token generator to use new methods in .NET 6

* Updated all the docker and build commands to use net6.0

* Commented out sonar scan since it's not set up for net6.0 yet.

* Don't rely on test (#819)

* Bump versions by dotnet-bump-version.

* Update monorepo-build.sh

Updated build to use .net6.0

* Bump versions by dotnet-bump-version.

* Update sonar-scan.yml

More updates to 6.0

* Update sonar-scan.yml

Please work

* Bump versions by dotnet-bump-version.

* Please work

* Bump versions by dotnet-bump-version.

* Local Metadata Integration Part 1 (#820)

* Started with some basic plumbing with comic info parsing updating Series/Volume.

* We can now get chapter title from comicInfo.xml

* Hooked in the ability to store people into the chapter metadata.

* Removed no longer used imports, fixed up some foreign key constraints on deleting series with person linked.

* Refactored Summary out of the UI for Series into SeriesMetadata. Updated application to .NET 6. There is a bug in metadata code for updating.

* Replaced the Parallel.ForEach with a normal foreach, which lets us use async. For I/O-heavy code, this shouldn't change much.

* Refactored scan code to only check extensions that can contain ComicInfo, fixed a bug where scan events weren't using the correct method name, removed summary field (still buggy)

* Fixed a bug where on cancelling a metadata request in modal, underlying button would get stuck in a disabled state.

* Changed how metadata selects the first volume to read summary info from. It will now select the first non-special volume rather than Volume 1.

* More debugging and found more bugs to fix

* Redid all the migrations as one single one. Fixed a bug with GetChapterInfo returning null when ChapterMetadata didn't exist for that Chapter.

Fixed an issue with mapper failing on GetChapterMetadata. Started work on adding people and a design for people.

* Fixed a bug in the file-modified check: it now takes into account whether the file has been processed at least once. Introduced a bug in saving people to series.

* Just made code compilable again

* Fixed up code. Now people for series and chapters add correctly without any db issues.

* Things are working, but I'm not happy with how the management of Person is. I need to take into account that 1 person needs to map to an image and role is arbitrary.

* Started adding UI code to showcase chapter metadata

* Updated workflow to be .NET 6

* WIP of updating card detail to show the information more clearly and without so many if statements

* Removed ChapterMetadata and now store the data on the Chapter itself. Much easier to use and fewer joins.

* Implemented Genre on SeriesMetadata level

* Genres and People are now removed from Series level if they are no longer on comicInfo

* PeopleHelper is done with unit tests. Everything is working.

* Unit tests in place for Genre Helper

* Starting on CacheHelper

* Finished tests for ShouldUpdateCoverImage. Fixed and added tests in ArchiveService/ScannerService.

* CacheHelper is fully tested

* Some DI cleanup

* Scanner Service now calls GetComicInfo for books. Added ability to update Series Sort name from metadata files (mainly epub as comicinfo doesn't have a field)

* Forgot to move a line of code

* SortName now populates from metadata (epub only, ComicInfo has no tags)

* Cards now show the chapter title on hover if it's set; otherwise they default back to the title.

* Fixed a major issue with how MangaFiles were being updated with LastModified, which messed up our logic for avoiding refreshes.

* Woohoo, more tests and some refactors to be able to test more services with a mock filesystem. Fixed an issue where SortName was getting set as the first chapter even though the Series was in a group.

* Refactored the MangaFile creation code into the DbFactory, where we also set up the first LastModified update.

* The has-file-changed bug is now finally fixed

* Remove dead genres, refactor genre to use title instead of name.

* Refactored out a directory from ShouldUpdateCoverImage() to keep the code clean

* Unit tests for ComicInfo on BookService.

* Refactored series detail into its own component

* Series-detail now receives refresh metadata events to refresh what's on screen

* Removed references to Artist on PersonRole as it has no metadata mapping

* Security audit

* Fixed a benchmark

* Updated JWT Token generator to use new methods in .NET 6

* Updated all the docker and build commands to use net6.0

* Commented out sonar scan since it's not set up for net6.0 yet.

* Removed some directives

* Removed my test db

* Bump versions by dotnet-bump-version.

* .NET 6 Coding Patterns + Unit Tests (#823)

* Refactored all files to have Interfaces within the same file. Started moving over to file-scoped namespaces.

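For reference, the C# 10 file-scoped form this migration targets, with the interface co-located in the same file (signature as it appears in this release's AccountService):

```csharp
namespace API.Services; // file-scoped: one statement, no braces, one less indent

public interface IAccountService
{
    Task<IEnumerable<ApiException>> ChangeUserPassword(AppUser user, string newPassword);
}

public class AccountService : IAccountService
{
    private readonly UserManager<AppUser> _userManager;
    // ...
}
```
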
* Refactored common methods for getting underlying file's cover, pages, and extracting into 1 interface.

* More refactoring around removing dependence on explicit filetype testing for getting information.

* Code is buildable, tests are broken. Huge refactor (not completed) which makes most of DirectoryService testable with a mock filesystem (and thus the services that utilize it).

* Finished porting DirectoryService to use mocked filesystem implementation.

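A sketch of what such a test setup looks like, assuming the System.IO.Abstractions testing helpers (paths illustrative):

```csharp
using System.Collections.Generic;
using System.IO.Abstractions.TestingHelpers;

// An in-memory filesystem seeded with fake library files; DirectoryService
// (and the services built on it) can run against this with no real disk I/O.
var fileSystem = new MockFileSystem(new Dictionary<string, MockFileData>
{
    { "/manga/Accel World/Accel World v01.cbz", new MockFileData(string.Empty) },
    { "/manga/Accel World/Accel World v02.cbz", new MockFileData(string.Empty) },
});

// e.g. fileSystem.Directory.GetFiles("/manga/Accel World") now returns
// the two fake entries above.
```
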
* Added a null check

* Added a null check

* Finished all unit tests for DirectoryService.

* Some misc cleanup on the code

* Fixed up some bugs from refactoring scan loop.

* Implemented CleanupService testing and refactored more of DirectoryService to be non-static.

Fixed a bug where cover file cleanup wasn't properly finding files due to a regex bug.

* Fixed an issue in CleanupBackup() where we weren't properly selecting database files older than 30 days. Finished CleanupService Tests.

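The corrected selection is essentially a LastWriteTime filter; a sketch (directory name and extension hypothetical):

```csharp
// Select backup archives whose last write is older than 30 days.
var threshold = DateTime.Now.AddDays(-30);
var oldBackups = new DirectoryInfo(backupDirectory)
    .GetFiles("*.zip")
    .Where(f => f.LastWriteTime < threshold);
```
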
* Refactored Flatten and RemoveNonImages to directory service to allow CacheService to be testable.

* Finally have CacheService tested. Rewrote GetCachedPagePath() to be much more straightforward & performant.

* Updated DefaultParserTests.cs to contain all existing tests and follow new test layout format.

* All tests fixed up

* Bump versions by dotnet-bump-version.

* Bump docnet version to support x64 ARM pdfium binaries. (#826)

* Bump versions by dotnet-bump-version.

* Fixed bad build (#828)

* Bump versions by dotnet-bump-version.

* Bugfix/bad build (#829)

* Fixed bad build

* More directory service issues in Program

* Bump versions by dotnet-bump-version.

* Feature/local metadata more tags (#832)

* Stashing code

* Removed some debug code on the series detail page. Detail is now collapsed by default.

* Added AgeRating

* Fixed a crash when NetVips tries to write a cover file and cover directory is not existing.

* When a card is selected for bulk actions, show an outline in addition to select box

* Added AgeRating into the metadata parsing. Added a hack where we use Number from ComicInfo rather than Volume. This is to test the effects on users' libraries.

* Added AgeRating and ReleaseDate to the metadata implementation.

* Bump versions by dotnet-bump-version.

* Misc Fixes (#839)

* Fixed a case where chapter was being parsed incorrectly when the series title ends in a number.

* Updated Kavita to support Tome/T notation found in French comics

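A sketch of the kind of pattern involved (illustrative, not Kavita's actual parser regex):

```csharp
using System.Text.RegularExpressions;

// Matches French volume notation such as "Elfes T05" or "Lanfeust Tome 3".
private static readonly Regex TomeRegex = new Regex(
    @"(?:Tome|T)\.?\s?(?<Volume>\d+)",
    RegexOptions.IgnoreCase | RegexOptions.Compiled);
```
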
* Added support for identifying European specials and expanded support for cleaning some tags used in European comics. During cleaning, if series starts with - or comma, remove it.

* Fixed an issue where add to collection for a single series wasn't calling the bulk action handler

* Fixed an NPE on AgeRating conversion. Fixed a bug where, when looking for the cover image, file extensions were throwing off the sort code.

* Refactored Natural Sort ordering to better follow how Windows behaves. This is a departure from how the original code executes.

* GetCachedPagePath now uses natural sorting to pick the images for reading in a more correct order.

* Updated parser to handle a case where there was more than one space as a separator

* Bump versions by dotnet-bump-version.

* Beefed up ToC generation to handle new ways of packing epub. (#840)

* Bump versions by dotnet-bump-version.

* Tachiyomi Enhancements (#845)

* Added a new endpoint to get all Series with Progress info.

* Fixed up some potential NPEs during scan

* Commented out filter code, not ready for it.

* Fixed up a parsing case for european comics

* Refactored FilterDto to allow for specifying multiple formats to return.

* Refactored FilterDto to allow for specifying multiple formats to return.

* Refactored the UI to show OPDS as 3rd Party Clients since Tachiyomi now uses OPDS url scheme for authentication.

* Bump versions by dotnet-bump-version.

* In-Depth Filtering (#850)

* Laying the foundation for the filter rework

* Filtering by Genre is now possible.

* Cleaned up code and preparing for People filtering

* People filtering is hooked up for the frontend

* Filtering now works. On Deck does not work with filtering currently due to a unique implementation.

* More cleanup

* Implemented the ability to reset the filters

* Added a mobile drawer for filtering

* Added some additional cases for NaturalSortComparer. Filter now uses a drawer on smaller screens.

* Fixed a bug where backup service was not pointing to the correct directory.

* Undid the fix, it's working as expected

* Bump versions by dotnet-bump-version.

* More Filtering and Support for ComicInfo v2.1 (draft) Tags (#851)

* Added a recurring task to clean up DB entries that might be abandoned. On the library page, the Library in question will be prepopulated.

* Laid out the foundation for customized sorting. Added an all-series page to the UI, reached by clicking the Libraries section header on the home page, so the user can apply any filtering they like.

* When filtering, the current library filter will now automatically filter out the options for people and genres.

* Implemented Sorting controls

* Clear now clears sorting and read progress. Sorting is disabled on On Deck and Recently Added.

* Fixed an issue where all-series page couldn't click to open series

* Don't let the user unselect the last read progress. Added new comicinfo v2.1 draft tags.

* Hooked in Translator tag into backend and UI.

* Fixed an issue where you could open multiple typeaheads at the same time

* Integrated Translator and Tags ComicInfo extension fields. Started work on a badge expander.

* Reworked the badge expander a bit more. Added the UI code for Age Rating and Tags

* Integrated backend for Tags, Translator, and Age Rating

* Metadata tags now collapse if more than 4 present

* Some code cleanup

* Made the not read badge slightly smaller

* Bump versions by dotnet-bump-version.

* More Filtering and Support for ComicInfo v2.1 (draft) Tags (#853)

* Added a recurring task to clean up DB entries that might be abandoned. On the library page, the Library in question will be prepopulated.

* Laid out the foundation for customized sorting. Added an all-series page to the UI, reached by clicking the Libraries section header on the home page, so the user can apply any filtering they like.

* When filtering, the current library filter will now automatically filter out the options for people and genres.

* Implemented Sorting controls

* Clear now clears sorting and read progress. Sorting is disabled on On Deck and Recently Added.

* Fixed an issue where all-series page couldn't click to open series

* Don't let the user unselect the last read progress. Added new comicinfo v2.1 draft tags.

* Hooked in Translator tag into backend and UI.

* Fixed an issue where you could open multiple typeaheads at the same time

* Integrated Translator and Tags ComicInfo extension fields. Started work on a badge expander.

* Reworked the badge expander a bit more. Added the UI code for Age Rating and Tags

* Integrated backend for Tags, Translator, and Age Rating

* Metadata tags now collapse if more than 4 present

* Some code cleanup

* Made the not read badge slightly smaller

* Implemented Language filter. Fixed up some logic around Release Year not being set when month is missing.

* Added a missing file and Updated Filter to use different design for layout

* Fixed up some alignment issues

* Refined the styles further

* Bump versions by dotnet-bump-version.

* Fixes v0.4.19! (#855)

* Fixed OPDS urls to work with new Filtering schema

* Fixed a rendering issue with Language tag when it's null

* Fixed a bug where locked covers were resetting during refresh metadata.

* Redid all the migrations and added some extra checks due to a bad migration from the previous release (EF Core was producing an error).

* Fixed a bug where sort direction wasn't taken into account when not changing the sort field

* Default installs now backup daily

* Bump versions by dotnet-bump-version.

* Fixed a bug in how to determine if the volume or series updates cover image. (#857)

* Bump versions by dotnet-bump-version.

* More fixes (again) (#858)

* Send stack trace to the UI in prod mode

* Pdfs will now generate cover images. I missed something a few releases ago.

* Bump versions by dotnet-bump-version.

* I can't believe it's more fixes! (#863)

* Send stack trace to the UI in prod mode

* Pdfs will now generate cover images. I missed something a few releases ago.

* Ignore @Recently-Snapshot directories for QNAP.

* Refactored Bitmap code to use ImageSharp so it's truly cross-platform.

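Under .NET 6, System.Drawing.Bitmap is effectively Windows-only, so raw pixel buffers go through ImageSharp instead; a sketch with illustrative names, dimensions, and pixel format:

```csharp
using SixLabors.ImageSharp;
using SixLabors.ImageSharp.PixelFormats;

// rawBytes is a BGRA pixel buffer, e.g. as produced by pdfium/docnet
// (buffer, dimensions, and output path are assumptions).
using var image = Image.LoadPixelData<Bgra32>(rawBytes, width, height);
image.SaveAsPng(outputPath);
```
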
* Updated pdf extraction to use a multi-threaded approach to greatly speed up pdf image extraction

* Hooked in Characters tag from ComicInfo.xml

* Bump versions by dotnet-bump-version.

* Added check to see if mount folder is empty (#871)

* Added a check for whether the mount folder is empty

* updating log message

* Adding unit test

* Bump versions by dotnet-bump-version.

* Reader Fixes and Enhancements (#880)

* Don't show an exception when bookmarking doesn't have anything to change.

* Cleaned up the bookmark code a bit.

* Implemented fullscreen mode in the web reader. Refactored User Settings to move Password and 3rd Party Clients to a tab rather than accordion. Removed color filters for web reader.

* Implemented fullscreen mode into book reader

* Added some code for toggling fullscreen, which re-renders the screen to ensure the fitting works optimally

* Fixed an issue where moving from FitToScreen -> Split (L/R) wouldn't render the screen correctly due to canvas not being reset.

* Fixed bad optimization and scaling when drawing fit to screen

* Removed left/right highlights on page direction change in favor of icons. The double arrow dictates the page change.

* Reduced overlay auto close time to 3 seconds

* Updated the paging direction overlay to use icons and colors. Added a blur effect on menus

* Removed debug flags

* Bump versions by dotnet-bump-version.

* Fixed a bug with OPDS feeds not returning back results due to improperly setting up FilterDto. (#882)

* Bump versions by dotnet-bump-version.

* Bookmark Refactor (#893)

* Fixed a bug where sort direction wasn't taken into account when not changing the sort field

* Added foundation for Bookmark refactor

* Code broken, need to take a break. The issue is that getting a bookmark image needs authentication, but the UI doesn't send it.

* Implemented the ability to send bookmarked files to the web. Implemented the ability to clear bookmarks on disk on a recurring basis.

* Updated the bookmark design to have its own card that is self contained. The view bookmarks modal has been updated to better lay out the cards.

* Refactored download bookmark codes to select files from bookmark directory directly rather than open underlying files.

* Wrote the basic logic to kick start the bookmark migration.

Added Installed Version into the DB to allow us to know more accurately when to run migrations

* Implemented the ability to change the bookmarks directory

* Updated all references to BookmarkDirectory to use setting from the DB.

Updated Server Settings page to use 2 col for some rows.

* Refactored some code to DirectoryService (hasWriteAccess) and fixed up some unit tests from a previous PR.

* Treat folders that start with ._ as blacklisted.

* Implemented Reset User preferences. Some extra code to prep for the migration.

* Implemented a migration for existing bookmarks to using new filesystem based bookmarks

* Bump versions by dotnet-bump-version.

* Only admins should be able to create new users (#896)

* Bump versions by dotnet-bump-version.

* Backup on Migrations (#898)

* Refactored how the migrations are run.

* A backup will be performed before any migrations. Added additional guards before a sub-module is loaded.

* Bump versions by dotnet-bump-version.

* Fixed a critical bug where registration was broken for the first-time flow. Refactored how backup before migrations occurred such that it now puts the db in temp. The db will be deleted automatically that night. (#900)

* Bump versions by dotnet-bump-version.

* Fixed lack of ability to scroll with fullscreen mode (#903)

* Bump versions by dotnet-bump-version.

* Book Reader Issues (#906)

* Refactored the Font Escaping Regex with new unit tests.

* Fonts are now properly escaped, somehow a regression was introduced.

* Refactored most of the book page loading for the reader into the service.

* Fixed a bug where going into fullscreen in non-dark mode would cause the background of the reader to go black. Fixed a rendering issue with margin left/right messing up the HTML. Fixed an issue where line-height: 100% would break a book's CSS; we now remove such styles when they add no value.

* Changed how I fixed the black mode in fullscreen

* Fixed an issue where anchors wouldn't be colored blue in white mode

* Fixed a bug in the code that checks if a filename is a cover where it would choose "backcover" as a cover, despite it not being a valid case.

* Validate if ReleaseYear is a valid year and if not, set it to 0 to disable it.

* Fixed an issue where some large images could blow out the screen when reading on mobile. Images are now forced to a max width of the browser.

* Put my hack back in for fullscreen putting background color to black

* Change forwarded headers from All to explicit names

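A sketch of the explicit configuration, using the ASP.NET Core ForwardedHeaders flags (the exact set chosen here is an assumption):

```csharp
using Microsoft.AspNetCore.HttpOverrides;

services.Configure<ForwardedHeadersOptions>(options =>
{
    // Trust only the headers actually needed instead of ForwardedHeaders.All,
    // so scheme/host resolution behind a reverse proxy stays predictable.
    options.ForwardedHeaders =
        ForwardedHeaders.XForwardedFor | ForwardedHeaders.XForwardedProto;
});
```
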
* Fixed an issue where Scheme was not https when it should have been. Now the browser will handle which scheme to request.

* Cleaned up the user preferences to stack multiple controls onto one row

* Fixed fullscreen scroll issue with progress, but now sticky top is missing.

* Corrected the element on which we fullscreen

* Bump versions by dotnet-bump-version.

* Fixed a bug where loading page on book reader wouldn't scroll to last position (#907)

* Bump versions by dotnet-bump-version.

* Metadata Optimizations (#910)

* Added a tooltip to inform the user that format and collection filter selections are not limited to the selected library.

* Refactored a lot of code around when we update chapter cover images. Applied an optimization for when we re-calculate volume/series covers, such that it only occurs when the first chapter's image updates.

* Updated code to ensure only LastModified gets refreshed in metadata since it always follows a scan

* Optimized how metadata is populated on the series. Instead of re-reading the ComicInfos, I read the data from the underlying chapter entities. This reduces N additional reads AND enables the ability in the future to show/edit chapter-level metadata.

* Spelling mistake

* Fixed a concurrency issue by not selecting Genres from the DB. Added a test for long paths.

* Fixed a bug in filter where collection tag wasn't populating on load

* Cleaned up the logic for the changelog to better compare against the installed version. For nightly users, show the last stable as installed.

* Removed some demo code

* Use SplitQuery to load tags much faster for the series metadata load.

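In EF Core terms this is AsSplitQuery(); a hedged sketch with illustrative entity and property names:

```csharp
// One SQL query per included collection instead of a single join that
// multiplies rows across Genres x Tags x People.
var metadata = await context.SeriesMetadata
    .Include(m => m.Genres)
    .Include(m => m.Tags)
    .Include(m => m.People)
    .AsSplitQuery()
    .SingleOrDefaultAsync(m => m.SeriesId == seriesId);
```
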
* Bump versions by dotnet-bump-version.

* Fixed the book reader off-by-one issue with loading the last page (#911)

* Bump versions by dotnet-bump-version.

* Misc Fixes (#914)

* Fixed the book reader off-by-one issue with loading the last page

* Fixed a case where scanner would not delete a series if another series with same name but different format was added in that same scan.

* Added some missing tag generation (chapter language and summary)

* Bump versions by dotnet-bump-version.

* Implemented Publication Status in SeriesMetadata and the ability to filter it. (#915)

* Bump versions by dotnet-bump-version.

* Book Reader Issue Take 2 (#916)

* Implemented Publication Status in SeriesMetadata and the ability to filter it.

* Updated the docs for Language on metadata to specify it's a BCP-47 code to match Anansi Project. Fixed a bug with reader from previous PR.

* Bump versions by dotnet-bump-version.

* Don't tag a series as completed if count is 0. (#917)

* Bump versions by dotnet-bump-version.

* Last Page Rendering Twice on Web Reader Fix (#920)

* Don't tag a series as completed if count is 0.

* Removed some dead code and added some spacers for when certain fields are disabled so filter section still looks good.

* Fixed a bug where last page of a manga reader would be rendered twice when paging backwards.

* Bump versions by dotnet-bump-version.

* Metadata Performance Scan (#921)

* Refactored updating chapter metadata from ComicInfo into the scan loop. This lets us avoid an additional N file reads (expensive) in the metadata service, as we already have to read them in the scan loop.

* Refactored Series level metadata aggregation into the scan loop. This allows for the batching of DB updates to be much smaller, thus faster without much overhead of GC.

* Refactored some of the code for ProcessFile to remove a few redundant if statements

* Fixed broken build (#922)

* Bump versions by dotnet-bump-version.

* Bump versions by dotnet-bump-version.

* Misc Fixes and Changes (#927)

* Cleaned up a ton of warnings/suggestions from the IDE.

* Fixed a bug where clearing the filters could undo some presets.

* Renamed a class in the OPDS spec

* Simplified logic for when Fit To Screen rendering logic occurs. It now works always rather than only on cover images.

* Give some additional info to the user on what the differences between Library Types are

* Don't scan .qpkg folders (QNAP devices)

* Refactored some code to enable ability to test CoverImage Test. This is a broken test, test.zip is waiting on an issue in NetVips.

* Fixed an issue where Extra might get flagged as special too early, if in a word like Extraordinary

* Cleaned up the regex for the extra issue to be more flexible

* Bump versions by dotnet-bump-version.

* Bump follow-redirects from 1.13.0 to 1.14.7 in /UI/Web (#929)

Bumps [follow-redirects](https://github.com/follow-redirects/follow-redirects) from 1.13.0 to 1.14.7.
- [Release notes](https://github.com/follow-redirects/follow-redirects/releases)
- [Commits](https://github.com/follow-redirects/follow-redirects/compare/v1.13.0...v1.14.7)

---
updated-dependencies:
- dependency-name: follow-redirects
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

* Fixed a bug in CleanupBookmarks where the Except was deleting all files because the path separators didn't match. (#932)

* Bump versions by dotnet-bump-version.

* Bump versions by dotnet-bump-version.

* Feature/parse scanned files tests (#934)

* Fixed a bug in CleanupBookmarks where the Except was deleting all files because the path separators didn't match.

* Added unit tests for ParseScannedFiles.cs.

* Fixed some unit tests. The parser will now clear out multiple spaces in a row and replace them with a single space.

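The whitespace cleanup is essentially a one-line regex replace (illustrative):

```csharp
// Collapse runs of two or more whitespace characters into a single space.
name = Regex.Replace(name, @"\s{2,}", " ");
```
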
* Bump versions by dotnet-bump-version.

* Performance Enhancements (#937)

* Bump versions by dotnet-bump-version.

* Unit Tests & New Natural Sort (#941)

* Added a lot of tests

* More tests! Added a Parser.NormalizePath to normalize all paths within Kavita.

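A minimal sketch of the idea (Kavita's actual Parser.NormalizePath may differ):

```csharp
// Unify separators so path comparisons (and set operations like Except)
// behave the same on Windows and Linux.
public static string NormalizePath(string path)
{
    return path.Replace('\\', '/');
}
```
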
* Fixed a bug where the MarkChaptersAsUnread implementation wasn't consistent between different files and led to extra row generation for no reason.

* Added more unit tests

* Found a better implementation for natural sorting. Added tests and validated it works. The next commit will swap out NaturalSortComparer for the new extension.

* Replaced NaturalSortComparer with OrderByNatural.

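A sketch of the common zero-padding approach such an extension uses, so "2.jpg" sorts before "10.jpg" (the real OrderByNatural may differ in details):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text.RegularExpressions;

public static class NaturalSortExtensions
{
    // Natural ordering by left-padding every digit run to the same width,
    // then sorting the padded strings ordinally.
    public static IOrderedEnumerable<T> OrderByNatural<T>(
        this IEnumerable<T> items, Func<T, string> selector)
    {
        var list = items.ToList();
        var maxDigits = list
            .SelectMany(i => Regex.Matches(selector(i) ?? string.Empty, @"\d+")
                .Select(m => (int?)m.Value.Length))
            .Max() ?? 0;

        return list.OrderBy(i => Regex.Replace(selector(i) ?? string.Empty,
            @"\d+", m => m.Value.PadLeft(maxDigits, '0')));
    }
}
```
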
* Drastically simplified and sped up FindFirstEntry for finding cover images in archives

* Initial fix for an epub bug where metadata defines a key as an absolute path but the document uses a relative path. We now have a hack to correct for the epub.

* Bump versions by dotnet-bump-version.

* Fix image aspect ratio in rare case (#935)

* Bump versions by dotnet-bump-version.

* Fixed missing handlers for adding a chapter to a reading list from card details modal and adding series to collection from series detail. (#942)

* Bump versions by dotnet-bump-version.

* Bump versions by dotnet-bump-version.

* Metadata Tags (#947)

* Implemented the ability to click a metadata tag (in series detail) and load a pre-filtered view. Apply still needs to be implemented (preset load is out of sync with external filter)

* Refactored people to properly use typeahead so duplicates don't happen and use an observable chain so we can update the screen correctly

* Many refactorings to ensure that the timing for filtering always works

* Bump versions by dotnet-bump-version.

* Removed a hack that was put in when users complained about a tool improperly tagging. This is not the case for most tools. (#949)

* Scanner not merging with series that has LocalizedName match (#950)

* When performing a scan, series should group if they share the same localized name as a pre-existing series.

* Fixed a bug where a series with a different name and localized name wasn't merging with a different set of files whose names matched the localized name.

* Bump versions by dotnet-bump-version.

* Bump versions by dotnet-bump-version.

* Bump versions by dotnet-bump-version.

* Reader Fixes (#951)

* Normalized paths on download controller and when scan is killed due to missing or empty folders, log a critical error.

* Tweaked the query for OnDeck to better promote recently added chapters in a series with read progress, but it's still not perfect.

* Fixed an issue where the up/down keys weren't working unless you clicked on the book explicitly

* Fixed an issue where infinite scroller was broken in fullscreen mode

* When toggling fullscreen mode on infinite scroller, the current page is retained as current position

* Fixed an issue where a double render would occur when we didn't need to render as fit split

* Stop showing loader when not using fit split

* Bump versions by dotnet-bump-version.

* Shakeout testing Fixes (#952)

* Cleaned up some old code in download bookmark that could create pointless temp folders.

* Fixed a bad http call on reading list remove read and cleaned up the messaging

* Undid an optimization in finding the cover image due to it performing depth-first rather than breadth-first search.

* Updated CleanComicInfo to have Translators and CoverArtists, which were previously missing.

* Renamed Refresh Metadata to Refresh Covers on the UI, given Metadata refresh is done in Scan.

* Library detail will now retain the search query in the UI. Reduced the amount of api calls to the backend on load.

* Reverted allowing the filter to reside in the UI (even though it does work).

* Updated the Age Rating to match the v2.1 spec.

* Fixed a bug where progress wasn't being saved

* Fixed line height not having any effect due to not applying to children elements in the reader

* Fixed some wording for Refresh Covers confirmation

* Delete Series will now send an event to the UI informing that series was deleted.

* Change Progress widget to show Refreshing Covers for

* When we exit early due to potential missing folders/drives in a scan, tell the UI that scan is 100% done.

* Fixed manage library not suppressing the scan loader when a complete event came in

* Fixed a spelling difference for Publication Status between filter and series detail

* Fixed a bug where collection detail page would flash on first load due to duplicate load events

* Added bookmarks to backups

* Fixed issues where fullscreen mode would break the infinite scroller continuous reader

* Bump versions by dotnet-bump-version.

* Fixed GetTags having wrong return type defined (#954)

* Bump versions by dotnet-bump-version.

* Missing Age Ratings (#955)

* Fixed GetTags having wrong return type defined

* Added missing Age Rating tags

* Bump versions by dotnet-bump-version.

* Version bump for release (#953)

Co-authored-by: Robbie Davis <robbie@therobbiedavis.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Andrew Mackrodt <andrewmackrodt@gmail.com>
This commit is contained in: commit 41096d6dc5 (parent 7fb41f0945) by Joseph Milazzo, 2022-01-18 15:31:34 -08:00, committed by GitHub. 360 changed files with 33,758 additions and 8,073 deletions.

View file

@ -3,12 +3,16 @@ using System.Linq;
using System.Threading.Tasks;
using API.Entities;
using API.Errors;
using API.Interfaces.Services;
using Microsoft.AspNetCore.Identity;
using Microsoft.Extensions.Logging;
namespace API.Services
{
public interface IAccountService
{
Task<IEnumerable<ApiException>> ChangeUserPassword(AppUser user, string newPassword);
}
public class AccountService : IAccountService
{
private readonly UserManager<AppUser> _userManager;

View file

@ -10,7 +10,6 @@ using API.Archive;
using API.Comparators;
using API.Data.Metadata;
using API.Extensions;
using API.Interfaces.Services;
using API.Services.Tasks;
using Kavita.Common;
using Microsoft.Extensions.Logging;
@ -19,6 +18,19 @@ using SharpCompress.Common;
namespace API.Services
{
public interface IArchiveService
{
void ExtractArchive(string archivePath, string extractPath);
int GetNumberOfPagesFromArchive(string archivePath);
string GetCoverImage(string archivePath, string fileName, string outputDirectory);
bool IsValidArchive(string archivePath);
ComicInfo GetComicInfo(string archivePath);
ArchiveLibrary CanOpen(string archivePath);
bool ArchiveNeedsFlattening(ZipArchive archive);
Task<Tuple<byte[], string>> CreateZipForDownload(IEnumerable<string> files, string tempFolder);
string FindCoverImageFilename(string archivePath, IList<string> entryNames);
}
/// <summary>
/// Responsible for manipulating Archive files. Used by <see cref="CacheService"/> and <see cref="ScannerService"/>
/// </summary>
@ -27,12 +39,14 @@ namespace API.Services
{
private readonly ILogger<ArchiveService> _logger;
private readonly IDirectoryService _directoryService;
private readonly IImageService _imageService;
private const string ComicInfoFilename = "comicinfo";
public ArchiveService(ILogger<ArchiveService> logger, IDirectoryService directoryService)
public ArchiveService(ILogger<ArchiveService> logger, IDirectoryService directoryService, IImageService imageService)
{
_logger = logger;
_directoryService = directoryService;
_imageService = imageService;
}
/// <summary>
@ -42,7 +56,10 @@ namespace API.Services
/// <returns></returns>
public virtual ArchiveLibrary CanOpen(string archivePath)
{
if (!(File.Exists(archivePath) && Parser.Parser.IsArchive(archivePath) || Parser.Parser.IsEpub(archivePath))) return ArchiveLibrary.NotSupported;
if (string.IsNullOrEmpty(archivePath) || !(File.Exists(archivePath) && Parser.Parser.IsArchive(archivePath) || Parser.Parser.IsEpub(archivePath))) return ArchiveLibrary.NotSupported;
var ext = _directoryService.FileSystem.Path.GetExtension(archivePath).ToUpper();
if (ext.Equals(".CBR") || ext.Equals(".RAR")) return ArchiveLibrary.SharpCompress;
try
{
@ -108,46 +125,47 @@ namespace API.Services
/// </summary>
/// <param name="entryFullNames"></param>
/// <returns>Entry name of match, null if no match</returns>
public string FindFolderEntry(IEnumerable<string> entryFullNames)
public static string FindFolderEntry(IEnumerable<string> entryFullNames)
{
var result = entryFullNames
.FirstOrDefault(x => !Path.EndsInDirectorySeparator(x) && !Parser.Parser.HasBlacklistedFolderInPath(x)
&& Parser.Parser.IsCoverImage(x)
&& !x.StartsWith(Parser.Parser.MacOsMetadataFileStartsWith));
.OrderByNatural(Path.GetFileNameWithoutExtension)
.Where(path => !(Path.EndsInDirectorySeparator(path) || Parser.Parser.HasBlacklistedFolderInPath(path) || path.StartsWith(Parser.Parser.MacOsMetadataFileStartsWith)))
.FirstOrDefault(Parser.Parser.IsCoverImage);
return string.IsNullOrEmpty(result) ? null : result;
}
/// <summary>
/// Returns first entry that is an image and is not in a blacklisted folder path. Uses <see cref="NaturalSortComparer"/> for ordering files
/// Returns first entry that is an image and is not in a blacklisted folder path. Uses <see cref="OrderByNatural"/> for ordering files
/// </summary>
/// <param name="entryFullNames"></param>
/// <returns>Entry name of match, null if no match</returns>
public static string FirstFileEntry(IEnumerable<string> entryFullNames, string archiveName)
public static string? FirstFileEntry(IEnumerable<string> entryFullNames, string archiveName)
{
// First check if there are any files that are not in a nested folder before just comparing by filename. This is needed
// because NaturalSortComparer does not work with paths and doesn't seem 001.jpg as before chapter 1/001.jpg.
var fullNames = entryFullNames.Where(x =>!Parser.Parser.HasBlacklistedFolderInPath(x)
&& Parser.Parser.IsImage(x)
&& !x.StartsWith(Parser.Parser.MacOsMetadataFileStartsWith)).ToList();
var fullNames = entryFullNames
.OrderByNatural(c => c.GetFullPathWithoutExtension())
.Where(path => !(Path.EndsInDirectorySeparator(path) || Parser.Parser.HasBlacklistedFolderInPath(path) || path.StartsWith(Parser.Parser.MacOsMetadataFileStartsWith)) && Parser.Parser.IsImage(path))
.ToList();
if (fullNames.Count == 0) return null;
var nonNestedFile = fullNames.Where(entry => (Path.GetDirectoryName(entry) ?? string.Empty).Equals(archiveName))
.OrderBy(Path.GetFullPath, new NaturalSortComparer())
.OrderByNatural(c => c.GetFullPathWithoutExtension())
.FirstOrDefault();
if (!string.IsNullOrEmpty(nonNestedFile)) return nonNestedFile;
// Check the first folder and sort within that to see if we can find a file, else fallback to first file with basic sort.
// Get first folder, then sort within that
var firstDirectoryFile = fullNames.OrderBy(Path.GetDirectoryName, new NaturalSortComparer()).FirstOrDefault();
var firstDirectoryFile = fullNames.OrderByNatural(Path.GetDirectoryName).FirstOrDefault();
if (!string.IsNullOrEmpty(firstDirectoryFile))
{
var firstDirectory = Path.GetDirectoryName(firstDirectoryFile);
if (!string.IsNullOrEmpty(firstDirectory))
{
var firstDirectoryResult = fullNames.Where(f => firstDirectory.Equals(Path.GetDirectoryName(f)))
.OrderBy(Path.GetFileName, new NaturalSortComparer())
.OrderByNatural(Path.GetFileNameWithoutExtension)
.FirstOrDefault();
if (!string.IsNullOrEmpty(firstDirectoryResult)) return firstDirectoryResult;
@ -155,7 +173,7 @@ namespace API.Services
}
var result = fullNames
.OrderBy(Path.GetFileName, new NaturalSortComparer())
.OrderByNatural(Path.GetFileNameWithoutExtension)
.FirstOrDefault();
return string.IsNullOrEmpty(result) ? null : result;
@ -173,7 +191,7 @@ namespace API.Services
/// <param name="archivePath"></param>
/// <param name="fileName">File name to use based on context of entity.</param>
/// <returns></returns>
public string GetCoverImage(string archivePath, string fileName)
public string GetCoverImage(string archivePath, string fileName, string outputDirectory)
{
if (archivePath == null || !IsValidArchive(archivePath)) return string.Empty;
try
@ -184,25 +202,24 @@ namespace API.Services
case ArchiveLibrary.Default:
{
using var archive = ZipFile.OpenRead(archivePath);
var entryNames = archive.Entries.Select(e => e.FullName).ToArray();
var entryNames = archive.Entries.Select(e => e.FullName).ToList();
var entryName = FindFolderEntry(entryNames) ?? FirstFileEntry(entryNames, Path.GetFileName(archivePath));
var entryName = FindCoverImageFilename(archivePath, entryNames);
var entry = archive.Entries.Single(e => e.FullName == entryName);
using var stream = entry.Open();
return CreateThumbnail(archivePath + " - " + entry.FullName, stream, fileName);
using var stream = entry.Open();
return _imageService.WriteCoverThumbnail(stream, fileName, outputDirectory);
}
case ArchiveLibrary.SharpCompress:
{
using var archive = ArchiveFactory.Open(archivePath);
var entryNames = archive.Entries.Where(archiveEntry => !archiveEntry.IsDirectory).Select(e => e.Key).ToList();
var entryName = FindFolderEntry(entryNames) ?? FirstFileEntry(entryNames, Path.GetFileName(archivePath));
var entryName = FindCoverImageFilename(archivePath, entryNames);
var entry = archive.Entries.Single(e => e.Key == entryName);
using var stream = entry.OpenEntryStream();
return CreateThumbnail(archivePath + " - " + entry.Key, stream, fileName);
return _imageService.WriteCoverThumbnail(stream, fileName, outputDirectory);
}
case ArchiveLibrary.NotSupported:
_logger.LogWarning("[GetCoverImage] This archive cannot be read: {ArchivePath}. Defaulting to no cover image", archivePath);
@ -220,6 +237,18 @@ namespace API.Services
return string.Empty;
}
/// <summary>
/// Given a list of image paths (assume within an archive), find the filename that corresponds to the cover
/// </summary>
/// <param name="archivePath"></param>
/// <param name="entryNames"></param>
/// <returns></returns>
public string FindCoverImageFilename(string archivePath, IList<string> entryNames)
{
var entryName = FindFolderEntry(entryNames) ?? FirstFileEntry(entryNames, Path.GetFileName(archivePath));
return entryName;
}
/// <summary>
/// Given an archive stream, will assess whether directory needs to be flattened so that the extracted archive files are directly
/// under extract path and not nested in subfolders. See <see cref="DirectoryInfoExtensions"/> Flatten method.
@ -235,18 +264,25 @@ namespace API.Services
}
// TODO: Refactor CreateZipForDownload to return the temp file so we can stream it from temp
/// <summary>
///
/// </summary>
/// <param name="files"></param>
/// <param name="tempFolder">Temp folder name to use for preparing the files. Will be created and deleted</param>
/// <returns></returns>
/// <exception cref="KavitaException"></exception>
public async Task<Tuple<byte[], string>> CreateZipForDownload(IEnumerable<string> files, string tempFolder)
{
var dateString = DateTime.Now.ToShortDateString().Replace("/", "_");
var tempLocation = Path.Join(DirectoryService.TempDirectory, $"{tempFolder}_{dateString}");
DirectoryService.ExistOrCreate(tempLocation);
var tempLocation = Path.Join(_directoryService.TempDirectory, $"{tempFolder}_{dateString}");
_directoryService.ExistOrCreate(tempLocation);
if (!_directoryService.CopyFilesToDirectory(files, tempLocation))
{
throw new KavitaException("Unable to copy files to temp directory archive download.");
}
var zipPath = Path.Join(DirectoryService.TempDirectory, $"kavita_{tempFolder}_{dateString}.zip");
var zipPath = Path.Join(_directoryService.TempDirectory, $"kavita_{tempFolder}_{dateString}.zip");
try
{
ZipFile.CreateFromDirectory(tempLocation, zipPath);
@ -260,25 +296,12 @@ namespace API.Services
var fileBytes = await _directoryService.ReadFileAsync(zipPath);
DirectoryService.ClearAndDeleteDirectory(tempLocation);
_directoryService.ClearAndDeleteDirectory(tempLocation); // NOTE: For sending back just zip, just schedule this to be called after the file is returned or let next temp storage cleanup take care of it
(new FileInfo(zipPath)).Delete();
return Tuple.Create(fileBytes, zipPath);
}
private string CreateThumbnail(string entryName, Stream stream, string fileName)
{
try
{
return ImageService.WriteCoverThumbnail(stream, fileName);
}
catch (Exception ex)
{
_logger.LogWarning(ex, "[GetCoverImage] There was an error and prevented thumbnail generation on {EntryName}. Defaulting to no cover image", entryName);
}
return string.Empty;
}
/// <summary>
/// Test if the archive path exists and an archive
@ -322,7 +345,12 @@ namespace API.Services
return null;
}
public ComicInfo GetComicInfo(string archivePath)
/// <summary>
/// This can be null if nothing is found or any errors occur during access
/// </summary>
/// <param name="archivePath"></param>
/// <returns></returns>
public ComicInfo? GetComicInfo(string archivePath)
{
if (!IsValidArchive(archivePath)) return null;
@ -336,7 +364,7 @@ namespace API.Services
case ArchiveLibrary.Default:
{
using var archive = ZipFile.OpenRead(archivePath);
var entry = archive.Entries.SingleOrDefault(x =>
var entry = archive.Entries.FirstOrDefault(x =>
!Parser.Parser.HasBlacklistedFolderInPath(x.FullName)
&& Path.GetFileNameWithoutExtension(x.Name)?.ToLower() == ComicInfoFilename
&& !Path.GetFileNameWithoutExtension(x.Name)
@ -346,7 +374,9 @@ namespace API.Services
{
using var stream = entry.Open();
var serializer = new XmlSerializer(typeof(ComicInfo));
return (ComicInfo) serializer.Deserialize(stream);
var info = (ComicInfo) serializer.Deserialize(stream);
ComicInfo.CleanComicInfo(info);
return info;
}
break;
@ -354,7 +384,7 @@ namespace API.Services
case ArchiveLibrary.SharpCompress:
{
using var archive = ArchiveFactory.Open(archivePath);
return FindComicInfoXml(archive.Entries.Where(entry => !entry.IsDirectory
var info = FindComicInfoXml(archive.Entries.Where(entry => !entry.IsDirectory
&& !Parser.Parser
.HasBlacklistedFolderInPath(
Path.GetDirectoryName(
@ -365,6 +395,9 @@ namespace API.Services
.Parser
.MacOsMetadataFileStartsWith)
&& Parser.Parser.IsXml(entry.Key)));
ComicInfo.CleanComicInfo(info);
return info;
}
case ArchiveLibrary.NotSupported:
_logger.LogWarning("[GetComicInfo] This archive cannot be read: {ArchivePath}", archivePath);
@ -385,9 +418,9 @@ namespace API.Services
}
private static void ExtractArchiveEntities(IEnumerable<IArchiveEntry> entries, string extractPath)
private void ExtractArchiveEntities(IEnumerable<IArchiveEntry> entries, string extractPath)
{
DirectoryService.ExistOrCreate(extractPath);
_directoryService.ExistOrCreate(extractPath);
foreach (var entry in entries)
{
entry.WriteToDirectory(extractPath, new ExtractionOptions()
@ -400,7 +433,7 @@ namespace API.Services
private void ExtractArchiveEntries(ZipArchive archive, string extractPath)
{
// NOTE: In cases where we try to extract, but there are InvalidPathChars, we need to inform the user
// TODO: In cases where we try to extract, but there are InvalidPathChars, we need to inform the user (throw exception, let middleware inform user)
var needsFlattening = ArchiveNeedsFlattening(archive);
if (!archive.HasFiles() && !needsFlattening) return;
@ -408,7 +441,7 @@ namespace API.Services
if (!needsFlattening) return;
_logger.LogDebug("Extracted archive is nested in root folder, flattening...");
new DirectoryInfo(extractPath).Flatten();
_directoryService.Flatten(extractPath);
}
/// <summary>
@ -447,10 +480,10 @@ namespace API.Services
break;
}
case ArchiveLibrary.NotSupported:
_logger.LogWarning("[ExtractArchive] This archive cannot be read: {ArchivePath}. Defaulting to 0 pages", archivePath);
_logger.LogWarning("[ExtractArchive] This archive cannot be read: {ArchivePath}", archivePath);
return;
default:
_logger.LogWarning("[ExtractArchive] There was an exception when reading archive stream: {ArchivePath}. Defaulting to 0 pages", archivePath);
_logger.LogWarning("[ExtractArchive] There was an exception when reading archive stream: {ArchivePath}", archivePath);
return;
}

View file

@ -1,17 +1,13 @@
using System;
using System.Collections.Generic;
using System.Drawing;
using System.Drawing.Imaging;
using System.IO;
using System.Linq;
using System.Runtime.InteropServices;
using System.Text;
using System.Text.RegularExpressions;
using System.Threading.Tasks;
using System.Web;
using API.Data.Metadata;
using API.Entities.Enums;
using API.Interfaces.Services;
using API.Parser;
using Docnet.Core;
using Docnet.Core.Converters;
@ -21,21 +17,54 @@ using ExCSS;
using HtmlAgilityPack;
using Microsoft.Extensions.Logging;
using Microsoft.IO;
using SixLabors.ImageSharp;
using SixLabors.ImageSharp.PixelFormats;
using VersOne.Epub;
using Image = SixLabors.ImageSharp.Image;
namespace API.Services
{
public interface IBookService
{
int GetNumberOfPages(string filePath);
string GetCoverImage(string fileFilePath, string fileName, string outputDirectory);
Task<Dictionary<string, int>> CreateKeyToPageMappingAsync(EpubBookRef book);
/// <summary>
/// Scopes styles to .reading-section and replaces img src to the passed apiBase
/// </summary>
/// <param name="stylesheetHtml"></param>
/// <param name="apiBase"></param>
/// <param name="filename">If the stylesheetHtml contains Import statements, when scoping the filename, scope needs to be wrt filepath.</param>
/// <param name="book">Book Reference, needed for if you expect Import statements</param>
/// <returns></returns>
Task<string> ScopeStyles(string stylesheetHtml, string apiBase, string filename, EpubBookRef book);
ComicInfo GetComicInfo(string filePath);
ParserInfo ParseInfo(string filePath);
/// <summary>
/// Extracts a PDF file's pages as images to an target directory
/// </summary>
/// <param name="fileFilePath"></param>
/// <param name="targetDirectory">Where the files will be extracted to. If doesn't exist, will be created.</param>
void ExtractPdfImages(string fileFilePath, string targetDirectory);
Task<string> ScopePage(HtmlDocument doc, EpubBookRef book, string apiBase, HtmlNode body, Dictionary<string, int> mappings, int page);
}
public class BookService : IBookService
{
private readonly ILogger<BookService> _logger;
private readonly IDirectoryService _directoryService;
private readonly IImageService _imageService;
private readonly StylesheetParser _cssParser = new ();
private static readonly RecyclableMemoryStreamManager StreamManager = new ();
private const string CssScopeClass = ".book-content";
public BookService(ILogger<BookService> logger)
public BookService(ILogger<BookService> logger, IDirectoryService directoryService, IImageService imageService)
{
_logger = logger;
_directoryService = directoryService;
_imageService = imageService;
}
private static bool HasClickableHrefPart(HtmlNode anchor)
@ -141,13 +170,10 @@ namespace API.Services
}
stylesheetHtml = stylesheetHtml.Insert(0, importBuilder.ToString());
var importMatches = Parser.Parser.CssImportUrlRegex.Matches(stylesheetHtml);
foreach (Match match in importMatches)
{
if (!match.Success) continue;
var importFile = match.Groups["Filename"].Value;
stylesheetHtml = stylesheetHtml.Replace(importFile, apiBase + prepend + importFile);
}
EscapeCSSImportReferences(ref stylesheetHtml, apiBase, prepend);
EscapeFontFamilyReferences(ref stylesheetHtml, apiBase, prepend);
// Check if there are any background images and rewrite those urls
EscapeCssImageReferences(ref stylesheetHtml, apiBase, book);
@ -174,6 +200,26 @@ namespace API.Services
return RemoveWhiteSpaceFromStylesheets(stylesheet.ToCss());
}
private static void EscapeCSSImportReferences(ref string stylesheetHtml, string apiBase, string prepend)
{
foreach (Match match in Parser.Parser.CssImportUrlRegex.Matches(stylesheetHtml))
{
if (!match.Success) continue;
var importFile = match.Groups["Filename"].Value;
stylesheetHtml = stylesheetHtml.Replace(importFile, apiBase + prepend + importFile);
}
}
private static void EscapeFontFamilyReferences(ref string stylesheetHtml, string apiBase, string prepend)
{
foreach (Match match in Parser.Parser.FontSrcUrlRegex.Matches(stylesheetHtml))
{
if (!match.Success) continue;
var importFile = match.Groups["Filename"].Value;
stylesheetHtml = stylesheetHtml.Replace(importFile, apiBase + prepend + importFile);
}
}
private static void EscapeCssImageReferences(ref string stylesheetHtml, string apiBase, EpubBookRef book)
{
var matches = Parser.Parser.CssImageUrlRegex.Matches(stylesheetHtml);
@ -189,6 +235,139 @@ namespace API.Services
}
}
private static void ScopeImages(HtmlDocument doc, EpubBookRef book, string apiBase)
{
var images = doc.DocumentNode.SelectNodes("//img");
if (images != null)
{
foreach (var image in images)
{
if (image.Name != "img") continue;
// Need to do for xlink:href
if (image.Attributes["src"] != null)
{
var imageFile = image.Attributes["src"].Value;
if (!book.Content.Images.ContainsKey(imageFile))
{
// TODO: Refactor the Key code to a method to allow the hacks to be tested
var correctedKey = book.Content.Images.Keys.SingleOrDefault(s => s.EndsWith(imageFile));
if (correctedKey != null)
{
imageFile = correctedKey;
} else if (imageFile.StartsWith(".."))
{
// There are cases where the key is defined static like OEBPS/Images/1-4.jpg but reference is ../Images/1-4.jpg
correctedKey = book.Content.Images.Keys.SingleOrDefault(s => s.EndsWith(imageFile.Replace("..", string.Empty)));
if (correctedKey != null)
{
imageFile = correctedKey;
}
}
}
image.Attributes.Remove("src");
image.Attributes.Add("src", $"{apiBase}" + imageFile);
}
}
}
images = doc.DocumentNode.SelectNodes("//image");
if (images != null)
{
foreach (var image in images)
{
if (image.Name != "image") continue;
if (image.Attributes["xlink:href"] != null)
{
var imageFile = image.Attributes["xlink:href"].Value;
if (!book.Content.Images.ContainsKey(imageFile))
{
var correctedKey = book.Content.Images.Keys.SingleOrDefault(s => s.EndsWith(imageFile));
if (correctedKey != null)
{
imageFile = correctedKey;
}
}
image.Attributes.Remove("xlink:href");
image.Attributes.Add("xlink:href", $"{apiBase}" + imageFile);
}
}
}
}
private static string PrepareFinalHtml(HtmlDocument doc, HtmlNode body)
{
// Check if any classes on the html node (some r2l books do this) and move them to body tag for scoping
var htmlNode = doc.DocumentNode.SelectSingleNode("//html");
if (htmlNode == null || !htmlNode.Attributes.Contains("class")) return body.InnerHtml;
var bodyClasses = body.Attributes.Contains("class") ? body.Attributes["class"].Value : string.Empty;
var classes = htmlNode.Attributes["class"].Value + " " + bodyClasses;
body.Attributes.Add("class", $"{classes}");
// I actually need the body tag itself for the classes, so i will create a div and put the body stuff there.
return $"<div class=\"{body.Attributes["class"].Value}\">{body.InnerHtml}</div>";
}
private static void RewriteAnchors(int page, HtmlDocument doc, Dictionary<string, int> mappings)
{
var anchors = doc.DocumentNode.SelectNodes("//a");
if (anchors != null)
{
foreach (var anchor in anchors)
{
BookService.UpdateLinks(anchor, mappings, page);
}
}
}
private async Task InlineStyles(HtmlDocument doc, EpubBookRef book, string apiBase, HtmlNode body)
{
var inlineStyles = doc.DocumentNode.SelectNodes("//style");
if (inlineStyles != null)
{
foreach (var inlineStyle in inlineStyles)
{
var styleContent = await ScopeStyles(inlineStyle.InnerHtml, apiBase, "", book);
body.PrependChild(HtmlNode.CreateNode($"<style>{styleContent}</style>"));
}
}
var styleNodes = doc.DocumentNode.SelectNodes("/html/head/link");
if (styleNodes != null)
{
foreach (var styleLinks in styleNodes)
{
var key = BookService.CleanContentKeys(styleLinks.Attributes["href"].Value);
// Some epubs are malformed the key in content.opf might be: content/resources/filelist_0_0.xml but the actual html links to resources/filelist_0_0.xml
// In this case, we will do a search for the key that ends with
if (!book.Content.Css.ContainsKey(key))
{
var correctedKey = book.Content.Css.Keys.SingleOrDefault(s => s.EndsWith(key));
if (correctedKey == null)
{
_logger.LogError("Epub is Malformed, key: {Key} is not matching OPF file", key);
continue;
}
key = correctedKey;
}
var styleContent = await ScopeStyles(await book.Content.Css[key].ReadContentAsync(), apiBase,
book.Content.Css[key].FileName, book);
if (styleContent != null)
{
body.PrependChild(HtmlNode.CreateNode($"<style>{styleContent}</style>"));
}
}
}
}
public ComicInfo GetComicInfo(string filePath)
{
if (!IsValidFile(filePath) || Parser.Parser.IsPdf(filePath)) return null;
@ -202,10 +381,13 @@ namespace API.Services
var info = new ComicInfo()
{
Summary = epubBook.Schema.Package.Metadata.Description,
Writer = string.Join(",", epubBook.Schema.Package.Metadata.Creators.Select(c => Parser.Parser.CleanAuthor(c.Creator))),
Publisher = string.Join(",", epubBook.Schema.Package.Metadata.Publishers),
Month = !string.IsNullOrEmpty(publicationDate) ? DateTime.Parse(publicationDate).Month : 0,
Year = !string.IsNullOrEmpty(publicationDate) ? DateTime.Parse(publicationDate).Year : 0,
Title = epubBook.Title,
Genre = string.Join(",", epubBook.Schema.Package.Metadata.Subjects.Select(s => s.ToLower().Trim())),
};
// Parse tags not exposed via Library
foreach (var metadataItem in epubBook.Schema.Package.Metadata.MetaItems)
@ -215,6 +397,9 @@ namespace API.Services
case "calibre:rating":
info.UserRating = float.Parse(metadataItem.Content);
break;
case "calibre:title_sort":
info.TitleSort = metadataItem.Content;
break;
}
}
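// Example (illustrative) of the OPF meta items consumed above:
//   <meta name="calibre:rating" content="4.5"/>
//   <meta name="calibre:title_sort" content="Wolves of the Calla"/>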
@ -305,8 +490,6 @@ namespace API.Services
{
using var epubBook = EpubReader.OpenBook(filePath);
// If the epub has the following tags, we can group the books as Volumes
// <meta content="5.0" name="calibre:series_index"/>
// <meta content="The Dark Tower" name="calibre:series"/>
// <meta content="Wolves of the Calla" name="calibre:title_sort"/>
// If all three are present, we can take that over dc:title and format as:
@ -323,6 +506,7 @@ namespace API.Services
var series = string.Empty;
var specialName = string.Empty;
var groupPosition = string.Empty;
var titleSort = string.Empty;
foreach (var metadataItem in epubBook.Schema.Package.Metadata.MetaItems)
@ -338,6 +522,7 @@ namespace API.Services
break;
case "calibre:title_sort":
specialName = metadataItem.Content;
titleSort = metadataItem.Content;
break;
}
@ -363,18 +548,26 @@ namespace API.Services
{
specialName = epubBook.Title;
}
var info = new ParserInfo()
{
Chapters = Parser.Parser.DefaultChapter,
Edition = string.Empty,
Format = MangaFormat.Epub,
Filename = Path.GetFileName(filePath),
Title = specialName?.Trim(),
FullFilePath = filePath,
IsSpecial = false,
Series = series.Trim(),
Volumes = seriesIndex
};
// Don't set titleSort if the book belongs to a group
if (!string.IsNullOrEmpty(titleSort) && string.IsNullOrEmpty(seriesIndex))
{
info.SeriesSort = titleSort;
}
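// Example (illustrative): a standalone epub with calibre:title_sort "Dark Tower, The" gets
// SeriesSort = "Dark Tower, The", while a book that carries a calibre:series_index is left
// to sort under its group instead.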
return info;
}
}
catch (Exception)
@ -392,7 +585,7 @@ namespace API.Services
FullFilePath = filePath,
IsSpecial = false,
Series = epubBook.Title.Trim(),
Volumes = Parser.Parser.DefaultVolume,
};
}
catch (Exception ex)
@ -403,31 +596,46 @@ namespace API.Services
return null;
}
private static void AddBytesToBitmap(Bitmap bmp, byte[] rawBytes)
{
var rect = new Rectangle(0, 0, bmp.Width, bmp.Height);
var bmpData = bmp.LockBits(rect, ImageLockMode.WriteOnly, bmp.PixelFormat);
var pNative = bmpData.Scan0;
Marshal.Copy(rawBytes, 0, pNative, rawBytes.Length);
bmp.UnlockBits(bmpData);
}
/// <summary>
/// Extracts a pdf into images in a target directory. Uses a multi-threaded implementation since docnet is normally slow.
/// </summary>
/// <param name="fileFilePath"></param>
/// <param name="targetDirectory"></param>
public void ExtractPdfImages(string fileFilePath, string targetDirectory)
{
_directoryService.ExistOrCreate(targetDirectory);
using var docReader = DocLib.Instance.GetDocReader(fileFilePath, new PageDimensions(1080, 1920));
var pages = docReader.GetPageCount();
Parallel.For(0, pages, pageNumber =>
{
    using var stream = StreamManager.GetStream("BookService.GetPdfPage");
    GetPdfPage(docReader, pageNumber, stream);
    using var fileStream = File.Create(Path.Combine(targetDirectory, "Page-" + pageNumber + ".png"));
    stream.Seek(0, SeekOrigin.Begin);
    stream.CopyTo(fileStream);
});
}
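// Note: each Parallel.For iteration rents its own RecyclableMemoryStream, so no state is shared
// between pages. Illustrative output for a 3-page PDF extracted to /cache/1:
//   /cache/1/Page-0.png, /cache/1/Page-1.png, /cache/1/Page-2.png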
/// <summary>
/// Responsible for scoping all the css, links, tags, etc. to prepare a self-contained html file for the page
/// </summary>
/// <param name="doc">Html Doc that will be appended to</param>
/// <param name="book">Underlying epub</param>
/// <param name="apiBase">API Url for file loading to pass through</param>
/// <param name="body">Body element from the epub</param>
/// <param name="mappings">Epub mappings</param>
/// <param name="page">Page number we are loading</param>
/// <returns></returns>
public async Task<string> ScopePage(HtmlDocument doc, EpubBookRef book, string apiBase, HtmlNode body, Dictionary<string, int> mappings, int page)
{
await InlineStyles(doc, book, apiBase, body);
RewriteAnchors(page, doc, mappings);
ScopeImages(doc, book, apiBase);
return PrepareFinalHtml(doc, body);
}
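// Note: the order above matters - styles are inlined and anchors rewritten against the body
// before PrepareFinalHtml serializes it into the scoped <div> that gets returned.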
/// <summary>
@ -436,13 +644,13 @@ namespace API.Services
/// <param name="fileFilePath"></param>
/// <param name="fileName">Name of the new file.</param>
/// <returns></returns>
public string GetCoverImage(string fileFilePath, string fileName, string outputDirectory)
{
if (!IsValidFile(fileFilePath)) return string.Empty;
if (Parser.Parser.IsPdf(fileFilePath))
{
return GetPdfCoverImage(fileFilePath, fileName, outputDirectory);
}
using var epubBook = EpubReader.OpenBook(fileFilePath);
@ -458,7 +666,7 @@ namespace API.Services
if (coverImageContent == null) return string.Empty;
using var stream = coverImageContent.GetContentStream();
return _imageService.WriteCoverThumbnail(stream, fileName, outputDirectory);
}
catch (Exception ex)
{
@ -469,7 +677,7 @@ namespace API.Services
}
private string GetPdfCoverImage(string fileFilePath, string fileName, string outputDirectory)
{
try
{
@ -479,7 +687,7 @@ namespace API.Services
using var stream = StreamManager.GetStream("BookService.GetPdfPage");
GetPdfPage(docReader, 0, stream);
return _imageService.WriteCoverThumbnail(stream, fileName, outputDirectory);
}
catch (Exception ex)
@ -498,15 +706,10 @@ namespace API.Services
var rawBytes = pageReader.GetImage(new NaiveTransparencyRemover());
var width = pageReader.GetPageWidth();
var height = pageReader.GetPageHeight();
var image = Image.LoadPixelData<Bgra32>(rawBytes, width, height);
stream.Seek(0, SeekOrigin.Begin);
image.SaveAsPng(stream);
stream.Seek(0, SeekOrigin.Begin);
}


@ -1,46 +1,54 @@
using System;
using System.Collections.Generic;
using System.Collections.Immutable;
using System.IO;
using System.Linq;
using System.Threading.Tasks;
using API.Comparators;
using API.Data;
using API.Entities;
using API.Entities.Enums;
using API.Extensions;
using API.Interfaces;
using API.Interfaces.Services;
using Microsoft.Extensions.Logging;
namespace API.Services
{
public interface ICacheService
{
/// <summary>
/// Ensures the cache exists for the given chapter and, if not, creates it. Should be called before any other
/// cache operations (except cleanup).
/// </summary>
/// <param name="chapterId"></param>
/// <returns>Chapter for the passed chapterId. Side-effect from ensuring cache.</returns>
Task<Chapter> Ensure(int chapterId);
/// <summary>
/// Clears cache directory of all volumes. This can be invoked from deleting a library or a series.
/// </summary>
/// <param name="chapterIds">Volumes that belong to that library. Assume the library might have been deleted before this invocation.</param>
void CleanupChapters(IEnumerable<int> chapterIds);
string GetCachedPagePath(Chapter chapter, int page);
string GetCachedEpubFile(int chapterId, Chapter chapter);
public void ExtractChapterFiles(string extractPath, IReadOnlyList<MangaFile> files);
}
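// Typical call pattern (a sketch, not from this PR): ensure the cache before asking for pages:
//   var chapter = await cacheService.Ensure(chapterId);
//   var pagePath = cacheService.GetCachedPagePath(chapter, 0);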
public class CacheService : ICacheService
{
private readonly ILogger<CacheService> _logger;
private readonly IUnitOfWork _unitOfWork;
private readonly IDirectoryService _directoryService;
private readonly IReadingItemService _readingItemService;
private readonly NumericComparer _numericComparer;
public CacheService(ILogger<CacheService> logger, IUnitOfWork unitOfWork, IArchiveService archiveService,
IDirectoryService directoryService, IBookService bookService)
public CacheService(ILogger<CacheService> logger, IUnitOfWork unitOfWork,
IDirectoryService directoryService, IReadingItemService readingItemService)
{
_logger = logger;
_unitOfWork = unitOfWork;
_directoryService = directoryService;
_readingItemService = readingItemService;
_numericComparer = new NumericComparer();
}
public void EnsureCacheDirectory()
{
if (!DirectoryService.ExistOrCreate(DirectoryService.CacheDirectory))
{
_logger.LogError("Cache directory {CacheDirectory} is not accessible or does not exist. Creating...", DirectoryService.CacheDirectory);
}
}
/// <summary>
/// Returns the full path to the cached epub file. If the file does not exist, will fall back to the original.
/// </summary>
@ -50,8 +58,8 @@ namespace API.Services
public string GetCachedEpubFile(int chapterId, Chapter chapter)
{
var extractPath = GetCachePath(chapterId);
var path = Path.Join(extractPath, _directoryService.FileSystem.Path.GetFileName(chapter.Files.First().FilePath));
if (!(_directoryService.FileSystem.FileInfo.FromFileName(path).Exists))
{
path = chapter.Files.First().FilePath;
}
@ -62,14 +70,14 @@ namespace API.Services
/// Caches the files for the given chapter to CacheDirectory
/// </summary>
/// <param name="chapterId"></param>
/// <returns>This will always return the Chapter for the chpaterId</returns>
/// <returns>This will always return the Chapter for the chapterId</returns>
public async Task<Chapter> Ensure(int chapterId)
{
_directoryService.ExistOrCreate(_directoryService.CacheDirectory);
var chapter = await _unitOfWork.ChapterRepository.GetChapterAsync(chapterId);
var extractPath = GetCachePath(chapterId);
if (!_directoryService.Exists(extractPath))
{
var files = chapter.Files.ToList();
ExtractChapterFiles(extractPath, files);
@ -90,22 +98,12 @@ namespace API.Services
var removeNonImages = true;
var fileCount = files.Count;
var extraPath = "";
var extractDi = _directoryService.FileSystem.DirectoryInfo.FromDirectoryName(extractPath);
if (files.Count > 0 && files[0].Format == MangaFormat.Image)
{
_readingItemService.Extract(files[0].FilePath, extractPath, MangaFormat.Image, files.Count);
_directoryService.Flatten(extractDi.FullName);
}
foreach (var file in files)
@ -117,63 +115,37 @@ namespace API.Services
if (file.Format == MangaFormat.Archive)
{
_readingItemService.Extract(file.FilePath, Path.Join(extractPath, extraPath), file.Format);
}
else if (file.Format == MangaFormat.Pdf)
{
_readingItemService.Extract(file.FilePath, Path.Join(extractPath, extraPath), file.Format);
}
else if (file.Format == MangaFormat.Epub)
{
removeNonImages = false;
_directoryService.ExistOrCreate(extractPath);
_directoryService.CopyFileToDirectory(files[0].FilePath, extractPath);
}
}
_directoryService.Flatten(extractDi.FullName);
if (removeNonImages)
{
_directoryService.RemoveNonImages(extractDi.FullName);
}
}
public void Cleanup()
{
_logger.LogInformation("Performing cleanup of Cache directory");
EnsureCacheDirectory();
try
{
DirectoryService.ClearDirectory(DirectoryService.CacheDirectory);
}
catch (Exception ex)
{
_logger.LogError(ex, "There was an issue deleting one or more folders/files during cleanup");
}
_logger.LogInformation("Cache directory purged");
}
/// <summary>
/// Removes the cached files and folders for a set of chapterIds
/// </summary>
/// <param name="chapterIds"></param>
public void CleanupChapters(IEnumerable<int> chapterIds)
{
_logger.LogInformation("Running Cache cleanup on Chapters");
foreach (var chapter in chapterIds)
{
_directoryService.ClearDirectory(GetCachePath(chapter));
}
_logger.LogInformation("Cache directory purged");
}
@ -184,46 +156,32 @@ namespace API.Services
/// <returns></returns>
private string GetCachePath(int chapterId)
{
return _directoryService.FileSystem.Path.GetFullPath(_directoryService.FileSystem.Path.Join(_directoryService.CacheDirectory, $"{chapterId}/"));
}
public async Task<(string path, MangaFile file)> GetCachedPagePath(Chapter chapter, int page)
/// <summary>
/// Returns the absolute path of a cached page.
/// </summary>
/// <param name="chapter">Chapter entity with Files populated.</param>
/// <param name="page">Page number to look for</param>
/// <returns>Page filepath or empty if no files found.</returns>
public string GetCachedPagePath(Chapter chapter, int page)
{
// Calculate what chapter the page belongs to
var pagesSoFar = 0;
var chapterFiles = chapter.Files ?? await _unitOfWork.ChapterRepository.GetFilesForChapterAsync(chapter.Id);
foreach (var mangaFile in chapterFiles)
var path = GetCachePath(chapter.Id);
var files = _directoryService.GetFilesWithExtension(path, Parser.Parser.ImageFileExtensions);
files = files
.AsEnumerable()
.OrderByNatural(Path.GetFileNameWithoutExtension)
.ToArray();
if (files.Length == 0)
{
if (page <= (mangaFile.Pages + pagesSoFar))
{
var path = GetCachePath(chapter.Id);
var files = DirectoryService.GetFilesWithExtension(path, Parser.Parser.ImageFileExtensions);
Array.Sort(files, _numericComparer);
if (files.Length == 0)
{
return (files.ElementAt(0), mangaFile);
}
// Since array is 0 based, we need to keep that in account (only affects last image)
if (page == files.Length)
{
return (files.ElementAt(page - 1 - pagesSoFar), mangaFile);
}
if (mangaFile.Format == MangaFormat.Image && mangaFile.Pages == 1)
{
// Each file is one page, meaning we should just get element at page
return (files.ElementAt(page), mangaFile);
}
return (files.ElementAt(page - pagesSoFar), mangaFile);
}
pagesSoFar += mangaFile.Pages;
return string.Empty;
}
return (string.Empty, null);
// Since array is 0 based, we need to keep that in account (only affects last image)
return page == files.Length ? files.ElementAt(page - 1) : files.ElementAt(page);
}
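// Example (illustrative): a cached chapter holding 000.png..009.png returns "000.png" for page 0;
// a request for page == files.Length (10) is clamped to the last image, "009.png".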
}
}


@ -1,31 +1,88 @@
using System;
using System.Collections;
using System.Collections.Generic;
using System.Collections.Immutable;
using System.IO;
using System.IO.Abstractions;
using System.Linq;
using System.Text.RegularExpressions;
using System.Threading.Tasks;
using API.Interfaces.Services;
using API.Comparators;
using API.Extensions;
using Microsoft.Extensions.Logging;
namespace API.Services
{
public interface IDirectoryService
{
IFileSystem FileSystem { get; }
string CacheDirectory { get; }
string CoverImageDirectory { get; }
string LogDirectory { get; }
string TempDirectory { get; }
string ConfigDirectory { get; }
/// <summary>
/// Original BookmarkDirectory. Only used for resetting the directory. Use <see cref="ServerSettings.BookmarkDirectory"/> for the actual path.
/// </summary>
string BookmarkDirectory { get; }
/// <summary>
/// Lists out top-level folders for a given directory. Filters out System and Hidden folders.
/// </summary>
/// <param name="rootPath">Absolute path of directory to scan.</param>
/// <returns>List of folder names</returns>
IEnumerable<string> ListDirectory(string rootPath);
Task<byte[]> ReadFileAsync(string path);
bool CopyFilesToDirectory(IEnumerable<string> filePaths, string directoryPath, string prepend = "");
bool Exists(string directory);
void CopyFileToDirectory(string fullFilePath, string targetDirectory);
int TraverseTreeParallelForEach(string root, Action<string> action, string searchPattern, ILogger logger);
bool IsDriveMounted(string path);
bool IsDirectoryEmpty(string path);
long GetTotalSize(IEnumerable<string> paths);
void ClearDirectory(string directoryPath);
void ClearAndDeleteDirectory(string directoryPath);
string[] GetFilesWithExtension(string path, string searchPatternExpression = "");
bool CopyDirectoryToDirectory(string sourceDirName, string destDirName, string searchPattern = "");
Dictionary<string, string> FindHighestDirectoriesFromFiles(IEnumerable<string> libraryFolders,
IList<string> filePaths);
IEnumerable<string> GetFoldersTillRoot(string rootPath, string fullPath);
IEnumerable<string> GetFiles(string path, string fileNameRegex = "", SearchOption searchOption = SearchOption.TopDirectoryOnly);
bool ExistOrCreate(string directoryPath);
void DeleteFiles(IEnumerable<string> files);
void RemoveNonImages(string directoryName);
void Flatten(string directoryName);
Task<bool> CheckWriteAccess(string directoryName);
}
public class DirectoryService : IDirectoryService
{
private readonly ILogger<DirectoryService> _logger;
private static readonly Regex ExcludeDirectories = new Regex(
@"@eaDir|\.DS_Store",
RegexOptions.Compiled | RegexOptions.IgnoreCase);
public static readonly string TempDirectory = Path.Join(Directory.GetCurrentDirectory(), "config", "temp");
public static readonly string LogDirectory = Path.Join(Directory.GetCurrentDirectory(), "config", "logs");
public static readonly string CacheDirectory = Path.Join(Directory.GetCurrentDirectory(), "config", "cache");
public static readonly string CoverImageDirectory = Path.Join(Directory.GetCurrentDirectory(), "config", "covers");
public static readonly string BackupDirectory = Path.Join(Directory.GetCurrentDirectory(), "config", "backups");
public static readonly string ConfigDirectory = Path.Join(Directory.GetCurrentDirectory(), "config");
public IFileSystem FileSystem { get; }
public string CacheDirectory { get; }
public string CoverImageDirectory { get; }
public string LogDirectory { get; }
public string TempDirectory { get; }
public string ConfigDirectory { get; }
public string BookmarkDirectory { get; }
private readonly ILogger<DirectoryService> _logger;
public DirectoryService(ILogger<DirectoryService> logger)
private static readonly Regex ExcludeDirectories = new Regex(
@"@eaDir|\.DS_Store|\.qpkg",
RegexOptions.Compiled | RegexOptions.IgnoreCase);
public static readonly string BackupDirectory = Path.Join(Directory.GetCurrentDirectory(), "config", "backups");
public DirectoryService(ILogger<DirectoryService> logger, IFileSystem fileSystem)
{
_logger = logger;
FileSystem = fileSystem;
CoverImageDirectory = FileSystem.Path.Join(FileSystem.Directory.GetCurrentDirectory(), "config", "covers");
CacheDirectory = FileSystem.Path.Join(FileSystem.Directory.GetCurrentDirectory(), "config", "cache");
LogDirectory = FileSystem.Path.Join(FileSystem.Directory.GetCurrentDirectory(), "config", "logs");
TempDirectory = FileSystem.Path.Join(FileSystem.Directory.GetCurrentDirectory(), "config", "temp");
ConfigDirectory = FileSystem.Path.Join(FileSystem.Directory.GetCurrentDirectory(), "config");
BookmarkDirectory = FileSystem.Path.Join(FileSystem.Directory.GetCurrentDirectory(), "config", "bookmarks");
}
/// <summary>
@ -36,16 +93,16 @@ namespace API.Services
/// <param name="searchPatternExpression">Regex version of search pattern (ie \.mp3|\.mp4). Defaults to * meaning all files.</param>
/// <param name="searchOption">SearchOption to use, defaults to TopDirectoryOnly</param>
/// <returns>List of file paths</returns>
private static IEnumerable<string> GetFilesWithCertainExtensions(string path,
private IEnumerable<string> GetFilesWithCertainExtensions(string path,
string searchPatternExpression = "",
SearchOption searchOption = SearchOption.TopDirectoryOnly)
{
if (!Directory.Exists(path)) return ImmutableList<string>.Empty;
if (!FileSystem.Directory.Exists(path)) return ImmutableList<string>.Empty;
var reSearchPattern = new Regex(searchPatternExpression, RegexOptions.IgnoreCase);
return FileSystem.Directory.EnumerateFiles(path, "*", searchOption)
.Where(file =>
reSearchPattern.IsMatch(FileSystem.Path.GetExtension(file)) && !FileSystem.Path.GetFileName(file).StartsWith(Parser.Parser.MacOsMetadataFileStartsWith));
}
@ -57,17 +114,17 @@ namespace API.Services
/// <param name="rootPath"></param>
/// <param name="fullPath"></param>
/// <returns></returns>
public static IEnumerable<string> GetFoldersTillRoot(string rootPath, string fullPath)
public IEnumerable<string> GetFoldersTillRoot(string rootPath, string fullPath)
{
var separator = FileSystem.Path.AltDirectorySeparatorChar;
if (fullPath.Contains(FileSystem.Path.DirectorySeparatorChar))
{
    fullPath = fullPath.Replace(FileSystem.Path.DirectorySeparatorChar, FileSystem.Path.AltDirectorySeparatorChar);
}
if (rootPath.Contains(Path.DirectorySeparatorChar))
{
rootPath = rootPath.Replace(FileSystem.Path.DirectorySeparatorChar, FileSystem.Path.AltDirectorySeparatorChar);
}
@ -76,14 +133,14 @@ namespace API.Services
var root = rootPath.EndsWith(separator) ? rootPath.Substring(0, rootPath.Length - 1) : rootPath;
var paths = new List<string>();
// If a file is at the end of the path, remove it before we start processing folders
if (FileSystem.Path.GetExtension(path) != string.Empty)
{
path = path.Substring(0, path.LastIndexOf(separator));
}
while (FileSystem.Path.GetDirectoryName(path) != Path.GetDirectoryName(root))
{
var folder = FileSystem.DirectoryInfo.FromDirectoryName(path).Name;
paths.Add(folder);
path = path.Substring(0, path.LastIndexOf(separator));
}
@ -91,35 +148,60 @@ namespace API.Services
return paths;
}
/// <summary>
/// Does Directory Exist
/// </summary>
/// <param name="directory"></param>
/// <returns></returns>
public bool Exists(string directory)
{
var di = FileSystem.DirectoryInfo.FromDirectoryName(directory);
return di.Exists;
}
public static IEnumerable<string> GetFiles(string path, string searchPatternExpression = "",
SearchOption searchOption = SearchOption.TopDirectoryOnly)
/// <summary>
/// Get files given a path.
/// </summary>
/// <remarks>This will automatically filter out restricted files, like MacOsMetadata files</remarks>
/// <param name="path"></param>
/// <param name="fileNameRegex">An optional regex string to search against. Will use file path to match against.</param>
/// <param name="searchOption">Defaults to top level directory only, can be given all to provide recursive searching</param>
/// <returns></returns>
public IEnumerable<string> GetFiles(string path, string fileNameRegex = "", SearchOption searchOption = SearchOption.TopDirectoryOnly)
{
if (!FileSystem.Directory.Exists(path)) return ImmutableList<string>.Empty;
if (fileNameRegex != string.Empty)
{
if (!Directory.Exists(path)) return ImmutableList<string>.Empty;
var reSearchPattern = new Regex(searchPatternExpression, RegexOptions.IgnoreCase);
return Directory.EnumerateFiles(path, "*", searchOption)
.Where(file =>
reSearchPattern.IsMatch(file) && !file.StartsWith(Parser.Parser.MacOsMetadataFileStartsWith));
{
var fileName = FileSystem.Path.GetFileName(file);
return reSearchPattern.IsMatch(fileName) &&
!fileName.StartsWith(Parser.Parser.MacOsMetadataFileStartsWith);
});
}
return !Directory.Exists(path) ? Array.Empty<string>() : Directory.GetFiles(path);
return FileSystem.Directory.EnumerateFiles(path, "*", searchOption).Where(file =>
!FileSystem.Path.GetFileName(file).StartsWith(Parser.Parser.MacOsMetadataFileStartsWith));
}
/// <summary>
/// Copies a file into a directory. Does not maintain parent folder of file.
/// Will create target directory if doesn't exist. Automatically overwrites what is there.
/// </summary>
/// <param name="fullFilePath"></param>
/// <param name="targetDirectory"></param>
public void CopyFileToDirectory(string fullFilePath, string targetDirectory)
{
try
{
var fileInfo = FileSystem.FileInfo.FromFileName(fullFilePath);
if (fileInfo.Exists)
{
ExistOrCreate(targetDirectory);
fileInfo.CopyTo(FileSystem.Path.Join(targetDirectory, fileInfo.Name), true);
}
}
catch (Exception ex)
@ -129,19 +211,19 @@ namespace API.Services
}
/// <summary>
/// Copies all files and subdirectories within a directory to a target location
/// </summary>
/// <param name="sourceDirName"></param>
/// <param name="destDirName"></param>
/// <param name="searchPattern">Defaults to *, meaning all files</param>
/// <returns></returns>
/// <exception cref="DirectoryNotFoundException"></exception>
public static bool CopyDirectoryToDirectory(string sourceDirName, string destDirName, string searchPattern = "")
/// <param name="sourceDirName">Directory to copy from. Does not copy the parent folder</param>
/// <param name="destDirName">Destination to copy to. Will be created if doesn't exist</param>
/// <param name="searchPattern">Defaults to all files</param>
/// <returns>If was successful</returns>
/// <exception cref="DirectoryNotFoundException">Thrown when source directory does not exist</exception>
public bool CopyDirectoryToDirectory(string sourceDirName, string destDirName, string searchPattern = "")
{
if (string.IsNullOrEmpty(sourceDirName)) return false;
// Get the subdirectories for the specified directory.
var dir = FileSystem.DirectoryInfo.FromDirectoryName(sourceDirName);
if (!dir.Exists)
{
@ -156,17 +238,17 @@ namespace API.Services
ExistOrCreate(destDirName);
// Get the files in the directory and copy them to the new location.
var files = GetFilesWithExtension(dir.FullName, searchPattern).Select(n => new FileInfo(n));
var files = GetFilesWithExtension(dir.FullName, searchPattern).Select(n => FileSystem.FileInfo.FromFileName(n));
foreach (var file in files)
{
var tempPath = FileSystem.Path.Combine(destDirName, file.Name);
file.CopyTo(tempPath, false);
}
// If copying subdirectories, copy them and their contents to new location.
foreach (var subDir in dirs)
{
var tempPath = FileSystem.Path.Combine(destDirName, subDir.Name);
CopyDirectoryToDirectory(subDir.FullName, tempPath);
}
@ -178,19 +260,30 @@ namespace API.Services
/// </summary>
/// <param name="path"></param>
/// <returns></returns>
public bool IsDriveMounted(string path)
{
return FileSystem.DirectoryInfo.FromDirectoryName(FileSystem.Path.GetPathRoot(path) ?? string.Empty).Exists;
}
/// <summary>
/// Checks if the given directory exists and contains no files or folders.
/// </summary>
/// <param name="path"></param>
/// <returns></returns>
public bool IsDirectoryEmpty(string path)
{
return FileSystem.Directory.Exists(path) && !FileSystem.Directory.EnumerateFileSystemEntries(path).Any();
}
public string[] GetFilesWithExtension(string path, string searchPatternExpression = "")
{
if (searchPatternExpression != string.Empty)
{
return GetFilesWithCertainExtensions(path, searchPatternExpression).ToArray();
}
return !FileSystem.Directory.Exists(path) ? Array.Empty<string>() : FileSystem.Directory.GetFiles(path);
}
/// <summary>
@ -198,9 +291,9 @@ namespace API.Services
/// </summary>
/// <param name="paths"></param>
/// <returns>Total bytes</returns>
public static long GetTotalSize(IEnumerable<string> paths)
public long GetTotalSize(IEnumerable<string> paths)
{
return paths.Sum(path => new FileInfo(path).Length);
return paths.Sum(path => FileSystem.FileInfo.FromFileName(path).Length);
}
/// <summary>
@ -208,13 +301,13 @@ namespace API.Services
/// </summary>
/// <param name="directoryPath"></param>
/// <returns></returns>
public bool ExistOrCreate(string directoryPath)
{
var di = FileSystem.DirectoryInfo.FromDirectoryName(directoryPath);
if (di.Exists) return true;
try
{
FileSystem.Directory.CreateDirectory(directoryPath);
}
catch (Exception)
{
@ -227,11 +320,11 @@ namespace API.Services
/// Deletes all files within the directory, then the directory itself.
/// </summary>
/// <param name="directoryPath"></param>
public void ClearAndDeleteDirectory(string directoryPath)
{
if (!FileSystem.Directory.Exists(directoryPath)) return;
var di = FileSystem.DirectoryInfo.FromDirectoryName(directoryPath);
ClearDirectory(directoryPath);
@ -239,13 +332,13 @@ namespace API.Services
}
/// <summary>
/// Deletes all files and folders within the directory path
/// </summary>
/// <param name="directoryPath"></param>
/// <returns></returns>
public void ClearDirectory(string directoryPath)
{
var di = FileSystem.DirectoryInfo.FromDirectoryName(directoryPath);
if (!di.Exists) return;
foreach (var file in di.EnumerateFiles())
@ -265,7 +358,7 @@ namespace API.Services
/// <param name="directoryPath"></param>
/// <param name="prepend">An optional string to prepend to the target file's name</param>
/// <returns></returns>
public static bool CopyFilesToDirectory(IEnumerable<string> filePaths, string directoryPath, string prepend = "", ILogger logger = null)
public bool CopyFilesToDirectory(IEnumerable<string> filePaths, string directoryPath, string prepend = "")
{
ExistOrCreate(directoryPath);
string currentFile = null;
@ -274,36 +367,36 @@ namespace API.Services
foreach (var file in filePaths)
{
currentFile = file;
var fileInfo = FileSystem.FileInfo.FromFileName(file);
if (fileInfo.Exists)
{
fileInfo.CopyTo(FileSystem.Path.Join(directoryPath, prepend + fileInfo.Name));
}
else
{
logger?.LogWarning("Tried to copy {File} but it doesn't exist", file);
_logger.LogWarning("Tried to copy {File} but it doesn't exist", file);
}
}
}
catch (Exception ex)
{
logger?.LogError(ex, "Unable to copy {File} to {DirectoryPath}", currentFile, directoryPath);
_logger.LogError(ex, "Unable to copy {File} to {DirectoryPath}", currentFile, directoryPath);
return false;
}
return true;
}
public bool CopyFilesToDirectory(IEnumerable<string> filePaths, string directoryPath, string prepend = "")
{
return CopyFilesToDirectory(filePaths, directoryPath, prepend, _logger);
}
/// <summary>
/// Lists all directories in a root path. Will exclude Hidden or System directories.
/// </summary>
/// <param name="rootPath"></param>
/// <returns></returns>
public IEnumerable<string> ListDirectory(string rootPath)
{
if (!Directory.Exists(rootPath)) return ImmutableList<string>.Empty;
if (!FileSystem.Directory.Exists(rootPath)) return ImmutableList<string>.Empty;
var di = new DirectoryInfo(rootPath);
var di = FileSystem.DirectoryInfo.FromDirectoryName(rootPath);
var dirs = di.GetDirectories()
.Where(dir => !(dir.Attributes.HasFlag(FileAttributes.Hidden) || dir.Attributes.HasFlag(FileAttributes.System)))
.Select(d => d.Name).ToImmutableList();
@ -311,20 +404,26 @@ namespace API.Services
return dirs;
}
/// <summary>
/// Reads a file into a byte[]. Returns an empty array if the file doesn't exist.
/// </summary>
/// <param name="path"></param>
/// <returns></returns>
public async Task<byte[]> ReadFileAsync(string path)
{
if (!File.Exists(path)) return Array.Empty<byte>();
return await File.ReadAllBytesAsync(path);
if (!FileSystem.File.Exists(path)) return Array.Empty<byte>();
return await FileSystem.File.ReadAllBytesAsync(path);
}
/// <summary>
/// Finds the highest directories from a set of file paths. Does not return the root path, will always select the highest non-root path.
/// </summary>
/// <remarks>If the file paths do not contain anything from libraryFolders, this returns an empty dictionary back</remarks>
/// <param name="libraryFolders">List of top level folders which files belong to</param>
/// <param name="filePaths">List of file paths that belong to libraryFolders</param>
/// <returns></returns>
public static Dictionary<string, string> FindHighestDirectoriesFromFiles(IEnumerable<string> libraryFolders, IList<string> filePaths)
public Dictionary<string, string> FindHighestDirectoriesFromFiles(IEnumerable<string> libraryFolders, IList<string> filePaths)
{
var stopLookingForDirectories = false;
var dirs = new Dictionary<string, string>();
@ -365,20 +464,19 @@ namespace API.Services
/// <param name="searchPattern">Regex pattern to search against</param>
/// <param name="logger"></param>
/// <exception cref="ArgumentException"></exception>
public static int TraverseTreeParallelForEach(string root, Action<string> action, string searchPattern, ILogger logger)
public int TraverseTreeParallelForEach(string root, Action<string> action, string searchPattern, ILogger logger)
{
// Count of files traversed and timer for diagnostic output
var fileCount = 0;
// Data structure to hold names of subfolders to be examined for files.
var dirs = new Stack<string>();
if (!FileSystem.Directory.Exists(root)) {
    throw new ArgumentException("The directory doesn't exist");
}
dirs.Push(root);
while (dirs.Count > 0) {
@ -387,7 +485,7 @@ namespace API.Services
string[] files;
try {
subDirs = FileSystem.Directory.GetDirectories(currentDir).Where(path => ExcludeDirectories.Matches(path).Count == 0);
}
// Thrown if we do not have discovery permission on the directory.
catch (UnauthorizedAccessException e) {
@ -403,7 +501,7 @@ namespace API.Services
}
try {
files = GetFilesWithCertainExtensions(currentDir, searchPattern)
.ToArray();
}
catch (UnauthorizedAccessException e) {
@ -423,22 +521,7 @@ namespace API.Services
// Otherwise, execute sequentially. Files are opened and processed
// synchronously but this could be modified to perform async I/O.
try {
// if (files.Length < procCount) {
// foreach (var file in files) {
// action(file);
// fileCount++;
// }
// }
// else {
// Parallel.ForEach(files, () => 0, (file, _, localCount) =>
// { action(file);
// return ++localCount;
// },
// (c) => {
// Interlocked.Add(ref fileCount, c);
// });
// }
foreach (var file in files) {
foreach (var file in files) {
action(file);
fileCount++;
}
@ -448,6 +531,7 @@ namespace API.Services
if (ex is UnauthorizedAccessException) {
// Here we just log a message and go on.
_logger.LogError(ex, "Unauthorized access on file");
return true;
}
// Handle other exceptions here if necessary...
@ -469,13 +553,13 @@ namespace API.Services
/// Attempts to delete the files passed to it. Swallows exceptions.
/// </summary>
/// <param name="files">Full path of files to delete</param>
public static void DeleteFiles(IEnumerable<string> files)
public void DeleteFiles(IEnumerable<string> files)
{
foreach (var file in files)
{
try
{
FileSystem.FileInfo.FromFileName(file).Delete();
}
catch (Exception)
{
@ -538,5 +622,101 @@ namespace API.Services
// Return formatted number with suffix
return readable.ToString("0.## ") + suffix;
}
/// <summary>
/// Removes all files except images from the directory. Includes sub directories.
/// </summary>
/// <param name="directoryName">Fully qualified directory</param>
public void RemoveNonImages(string directoryName)
{
DeleteFiles(GetFiles(directoryName, searchOption:SearchOption.AllDirectories).Where(file => !Parser.Parser.IsImage(file)));
}
/// <summary>
/// Flattens all files in subfolders to the passed directory recursively.
///
///
/// foo<para />
/// ├── 1.txt<para />
/// ├── 2.txt<para />
/// ├── 3.txt<para />
/// ├── 4.txt<para />
/// └── bar<para />
/// ├── 1.txt<para />
/// ├── 2.txt<para />
/// └── 5.txt<para />
///
/// becomes:<para />
/// foo<para />
/// ├── 1.txt<para />
/// ├── 2.txt<para />
/// ├── 3.txt<para />
/// ├── 4.txt<para />
/// ├── bar_1.txt<para />
/// ├── bar_2.txt<para />
/// └── bar_5.txt<para />
/// </summary>
/// <param name="directoryName">Fully qualified Directory name</param>
public void Flatten(string directoryName)
{
if (string.IsNullOrEmpty(directoryName) || !FileSystem.Directory.Exists(directoryName)) return;
var directory = FileSystem.DirectoryInfo.FromDirectoryName(directoryName);
var index = 0;
FlattenDirectory(directory, directory, ref index);
}
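// Usage sketch: CacheService calls _directoryService.Flatten(extractDi.FullName) after extraction
// so nested archive folders collapse into one naturally-ordered page list, as diagrammed above.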
/// <summary>
/// Checks whether a directory has write permissions
/// </summary>
/// <param name="directoryName">Fully qualified path</param>
/// <returns></returns>
public async Task<bool> CheckWriteAccess(string directoryName)
{
try
{
ExistOrCreate(directoryName);
await FileSystem.File.WriteAllTextAsync(
FileSystem.Path.Join(directoryName, "test.txt"),
string.Empty);
}
catch (Exception)
{
ClearAndDeleteDirectory(directoryName);
return false;
}
ClearAndDeleteDirectory(directoryName);
return true;
}
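// Note: this probe clears and deletes directoryName on both the success and failure paths, so it
// should only ever be pointed at a scratch directory created for the check, never at real content.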
private void FlattenDirectory(IDirectoryInfo root, IDirectoryInfo directory, ref int directoryIndex)
{
if (!root.FullName.Equals(directory.FullName))
{
var fileIndex = 1;
foreach (var file in directory.EnumerateFiles().OrderByNatural(file => file.FullName))
{
if (file.Directory == null) continue;
var paddedIndex = Parser.Parser.PadZeros(directoryIndex + "");
// We need to rename the files so that after flattening, they are in the order we found them
var newName = $"{paddedIndex}_{Parser.Parser.PadZeros(fileIndex + "")}{file.Extension}";
var newPath = Path.Join(root.FullName, newName);
if (!File.Exists(newPath)) file.MoveTo(newPath);
fileIndex++;
}
directoryIndex++;
}
foreach (var subDirectory in directory.EnumerateDirectories().OrderByNatural(d => d.FullName))
{
FlattenDirectory(root, subDirectory, ref directoryIndex);
}
}
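// Example (illustrative): files from a subfolder are renamed to "<dirIndex>_<fileIndex><ext>"
// (both zero-padded via Parser.Parser.PadZeros), e.g. bar/5.txt -> something like 001_003.txt,
// so the natural read order survives even when subfolders contain identically named files.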
}
}


@ -3,56 +3,54 @@ using System.IO;
using System.Linq;
using System.Threading.Tasks;
using API.Entities;
using Microsoft.AspNetCore.StaticFiles;

namespace API.Services;

public interface IDownloadService
{
    Task<(byte[], string, string)> GetFirstFileDownload(IEnumerable<MangaFile> files);
    string GetContentTypeFromFile(string filepath);
}

public class DownloadService : IDownloadService
{
    private readonly IDirectoryService _directoryService;
    private readonly FileExtensionContentTypeProvider _fileTypeProvider = new FileExtensionContentTypeProvider();

    public DownloadService(IDirectoryService directoryService)
    {
        _directoryService = directoryService;
    }

    /// <summary>
    /// Downloads the first file in the file enumerable for download
    /// </summary>
    /// <param name="files"></param>
    /// <returns></returns>
    public async Task<(byte[], string, string)> GetFirstFileDownload(IEnumerable<MangaFile> files)
    {
        var firstFile = files.Select(c => c.FilePath).First();
        return (await _directoryService.ReadFileAsync(firstFile), GetContentTypeFromFile(firstFile), Path.GetFileName(firstFile));
    }

    public string GetContentTypeFromFile(string filepath)
    {
        // Figures out what the content type should be based on the file name.
        if (!_fileTypeProvider.TryGetContentType(filepath, out var contentType))
        {
            contentType = Path.GetExtension(filepath).ToLowerInvariant() switch
            {
                ".cbz" => "application/zip",
                ".cbr" => "application/vnd.rar",
                ".cb7" => "application/x-compressed",
                ".epub" => "application/epub+zip",
                ".7z" => "application/x-7z-compressed",
                ".7zip" => "application/x-7z-compressed",
                ".pdf" => "application/pdf",
                _ => contentType
            };
        }
        return contentType;
    }
}
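// Example (illustrative): "file.cbz" is unknown to FileExtensionContentTypeProvider and falls
// through to the switch, yielding "application/zip"; "file.png" resolves via the provider directly.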

View file

@ -0,0 +1,46 @@
using System;
using System.IO.Abstractions;
using API.Extensions;
namespace API.Services;
public interface IFileService
{
IFileSystem GetFileSystem();
bool HasFileBeenModifiedSince(string filePath, DateTime time);
bool Exists(string filePath);
}
public class FileService : IFileService
{
private readonly IFileSystem _fileSystem;
public FileService(IFileSystem fileSystem)
{
_fileSystem = fileSystem;
}
public FileService() : this(fileSystem: new FileSystem()) { }
public IFileSystem GetFileSystem()
{
return _fileSystem;
}
/// <summary>
/// Returns true if the file on disk's last modified time is after the passed time
/// </summary>
/// <remarks>This has a resolution to the minute. Will ignore seconds and milliseconds</remarks>
/// <param name="filePath">Full qualified path of file</param>
/// <param name="time"></param>
/// <returns></returns>
public bool HasFileBeenModifiedSince(string filePath, DateTime time)
{
return !string.IsNullOrEmpty(filePath) && _fileSystem.File.GetLastWriteTime(filePath).Truncate(TimeSpan.TicksPerMinute) > time.Truncate(TimeSpan.TicksPerMinute);
}
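// Example (illustrative): a last write at 10:32:45 compared against 10:32:01 truncates both
// values to 10:32, so this returns false - changes within the same minute are ignored.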
public bool Exists(string filePath)
{
return _fileSystem.File.Exists(filePath);
}
}


@ -1,7 +1,6 @@
using System;
using System.Threading;
using System.Threading.Tasks;
using API.Interfaces;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;
@ -21,7 +20,7 @@ namespace API.Services.HostedServices
using var scope = _provider.CreateScope();
var taskScheduler = scope.ServiceProvider.GetRequiredService<ITaskScheduler>();
await taskScheduler.ScheduleTasks();
taskScheduler.ScheduleUpdaterTasks();
try


@ -1,21 +1,32 @@
using System;
using System.IO;
using System.Linq;
using API.Comparators;
using API.Entities;
using API.Interfaces.Services;
using Microsoft.Extensions.Logging;
using NetVips;
namespace API.Services;
public interface IImageService
{
void ExtractImages(string fileFilePath, string targetDirectory, int fileCount = 1);
string GetCoverImage(string path, string fileName, string outputDirectory);
/// <summary>
/// Creates a Thumbnail version of a base64 image
/// </summary>
/// <param name="encodedImage">base64 encoded image</param>
/// <returns>File name with extension of the file. This will always write to <see cref="DirectoryService.CoverImageDirectory"/></returns>
string CreateThumbnailFromBase64(string encodedImage, string fileName);
string WriteCoverThumbnail(Stream stream, string fileName, string outputDirectory);
}
public class ImageService : IImageService
{
private readonly ILogger<ImageService> _logger;
private readonly IDirectoryService _directoryService;
public const string ChapterCoverImageRegex = @"v\d+_c\d+";
public const string SeriesCoverImageRegex = @"seres\d+";
public const string CollectionTagCoverImageRegex = @"tag\d+";
public const string SeriesCoverImageRegex = @"series_\d+";
public const string CollectionTagCoverImageRegex = @"tag_\d+";
/// <summary>
@ -23,60 +34,40 @@ namespace API.Services
/// </summary>
private const int ThumbnailWidth = 320;
public ImageService(ILogger<ImageService> logger)
public ImageService(ILogger<ImageService> logger, IDirectoryService directoryService)
{
_logger = logger;
_directoryService = directoryService;
}
/// <summary>
/// Finds the first image in the directory of the first file. Does not check for "cover/folder".ext files to override.
/// </summary>
/// <param name="file"></param>
/// <returns></returns>
public string GetCoverFile(MangaFile file)
public void ExtractImages(string fileFilePath, string targetDirectory, int fileCount)
{
_directoryService.ExistOrCreate(targetDirectory);
if (fileCount == 1)
{
_directoryService.CopyFileToDirectory(fileFilePath, targetDirectory);
}
else
{
_directoryService.CopyDirectoryToDirectory(Path.GetDirectoryName(fileFilePath), targetDirectory,
Parser.Parser.ImageFileExtensions);
}
}
public string GetCoverImage(string path, string fileName, string outputDirectory)
{
    if (string.IsNullOrEmpty(path)) return string.Empty;

    try
    {
        using var thumbnail = Image.Thumbnail(path, ThumbnailWidth);
        var filename = fileName + ".png";
        thumbnail.WriteToFile(_directoryService.FileSystem.Path.Join(outputDirectory, filename));
        return filename;
    }
    catch (Exception ex)
    {
        _logger.LogWarning(ex, "[GetCoverImage] There was an error and prevented thumbnail generation on {ImageFile}. Defaulting to no cover image", path);
    }

    return string.Empty;
@ -89,11 +80,16 @@ namespace API.Services
/// <param name="stream">Stream to write to disk. Ensure this is rewinded.</param>
/// <param name="fileName">filename to save as without extension</param>
/// <returns>File name with extension of the file. This will always write to <see cref="DirectoryService.CoverImageDirectory"/></returns>
public string WriteCoverThumbnail(Stream stream, string fileName, string outputDirectory)
{
using var thumbnail = Image.ThumbnailStream(stream, ThumbnailWidth);
var filename = fileName + ".png";
thumbnail.WriteToFile(Path.Join(DirectoryService.CoverImageDirectory, fileName + ".png"));
_directoryService.ExistOrCreate(outputDirectory);
try
{
_directoryService.FileSystem.File.Delete(_directoryService.FileSystem.Path.Join(outputDirectory, filename));
} catch (Exception ex) {/* Swallow exception */}
thumbnail.WriteToFile(_directoryService.FileSystem.Path.Join(outputDirectory, filename));
return filename;
}
@ -105,7 +101,7 @@ namespace API.Services
{
using var thumbnail = Image.ThumbnailBuffer(Convert.FromBase64String(encodedImage), ThumbnailWidth);
var filename = fileName + ".png";
thumbnail.WriteToFile(Path.Join(DirectoryService.CoverImageDirectory, fileName + ".png"));
thumbnail.WriteToFile(_directoryService.FileSystem.Path.Join(_directoryService.CoverImageDirectory, fileName + ".png"));
return filename;
}
catch (Exception e)
@ -146,5 +142,4 @@ namespace API.Services
{
return $"tag{tagId}";
}
}


@ -5,329 +5,364 @@ using System.IO;
using System.Linq;
using System.Threading.Tasks;
using API.Comparators;
using API.Data;
using API.Data.Metadata;
using API.Data.Repositories;
using API.Data.Scanner;
using API.Entities;
using API.Entities.Enums;
using API.Extensions;
using API.Helpers;
using API.Interfaces;
using API.Interfaces.Services;
using API.SignalR;
using Microsoft.AspNetCore.SignalR;
using Microsoft.Extensions.Logging;
namespace API.Services;

public interface IMetadataService
{
/// <summary>
/// Recalculates metadata for all entities in a library.
/// </summary>
/// <param name="libraryId"></param>
/// <param name="forceUpdate"></param>
Task RefreshMetadata(int libraryId, bool forceUpdate = false);
/// <summary>
/// Performs a forced refresh of metadata just for a series and its nested entities
/// </summary>
/// <param name="libraryId"></param>
/// <param name="seriesId"></param>
Task RefreshMetadataForSeries(int libraryId, int seriesId, bool forceUpdate = false);
}
public class MetadataService : IMetadataService
{
private readonly IUnitOfWork _unitOfWork;
private readonly ILogger<MetadataService> _logger;
private readonly IHubContext<MessageHub> _messageHub;
private readonly ICacheHelper _cacheHelper;
private readonly IReadingItemService _readingItemService;
private readonly IDirectoryService _directoryService;
private readonly ChapterSortComparerZeroFirst _chapterSortComparerForInChapterSorting = new ChapterSortComparerZeroFirst();
public MetadataService(IUnitOfWork unitOfWork, ILogger<MetadataService> logger,
IHubContext<MessageHub> messageHub, ICacheHelper cacheHelper,
IReadingItemService readingItemService, IDirectoryService directoryService)
{
_unitOfWork = unitOfWork;
_logger = logger;
_messageHub = messageHub;
_cacheHelper = cacheHelper;
_readingItemService = readingItemService;
_directoryService = directoryService;
}
public MetadataService(IUnitOfWork unitOfWork, ILogger<MetadataService> logger,
IArchiveService archiveService, IBookService bookService, IImageService imageService, IHubContext<MessageHub> messageHub)
{
_unitOfWork = unitOfWork;
_logger = logger;
_archiveService = archiveService;
_bookService = bookService;
_imageService = imageService;
_messageHub = messageHub;
}
/// <summary>
/// Determines whether an entity should regenerate cover image.
/// </summary>
/// <remarks>If a cover image is locked but the underlying file has been deleted, this will allow regenerating. </remarks>
/// <param name="coverImage"></param>
/// <param name="firstFile"></param>
/// <param name="forceUpdate"></param>
/// <param name="isCoverLocked"></param>
/// <param name="coverImageDirectory">Directory where cover images are. Defaults to <see cref="DirectoryService.CoverImageDirectory"/></param>
/// <returns></returns>
public static bool ShouldUpdateCoverImage(string coverImage, MangaFile firstFile, bool forceUpdate = false,
bool isCoverLocked = false, string coverImageDirectory = null)
{
if (string.IsNullOrEmpty(coverImageDirectory))
{
coverImageDirectory = DirectoryService.CoverImageDirectory;
}
var fileExists = File.Exists(Path.Join(coverImageDirectory, coverImage));
if (isCoverLocked && fileExists) return false;
if (forceUpdate) return true;
return (firstFile != null && firstFile.HasFileBeenModified()) || !HasCoverImage(coverImage, fileExists);
}
private static bool HasCoverImage(string coverImage)
{
return HasCoverImage(coverImage, File.Exists(coverImage));
}
private static bool HasCoverImage(string coverImage, bool fileExists)
{
return !string.IsNullOrEmpty(coverImage) && fileExists;
}
private string GetCoverImage(MangaFile file, int volumeId, int chapterId)
{
file.UpdateLastModified();
switch (file.Format)
{
case MangaFormat.Pdf:
case MangaFormat.Epub:
return _bookService.GetCoverImage(file.FilePath, ImageService.GetChapterFormat(chapterId, volumeId));
case MangaFormat.Image:
var coverImage = _imageService.GetCoverFile(file);
return _imageService.GetCoverImage(coverImage, ImageService.GetChapterFormat(chapterId, volumeId));
case MangaFormat.Archive:
return _archiveService.GetCoverImage(file.FilePath, ImageService.GetChapterFormat(chapterId, volumeId));
default:
return string.Empty;
}
}
/// <summary>
/// Updates the metadata for a Chapter
/// </summary>
/// <param name="chapter"></param>
/// <param name="forceUpdate">Force updating cover image even if underlying file has not been modified or chapter already has a cover image</param>
public bool UpdateMetadata(Chapter chapter, bool forceUpdate)
{
var firstFile = chapter.Files.OrderBy(x => x.Chapter).FirstOrDefault();
if (ShouldUpdateCoverImage(chapter.CoverImage, firstFile, forceUpdate, chapter.CoverImageLocked))
{
_logger.LogDebug("[MetadataService] Generating cover image for {File}", firstFile?.FilePath);
chapter.CoverImage = GetCoverImage(firstFile, chapter.VolumeId, chapter.Id);
return true;
}
/// <summary>
/// Updates the metadata for a Chapter
/// </summary>
/// <param name="chapter"></param>
/// <param name="forceUpdate">Force updating cover image even if underlying file has not been modified or chapter already has a cover image</param>
private bool UpdateChapterCoverImage(Chapter chapter, bool forceUpdate)
{
var firstFile = chapter.Files.OrderBy(x => x.Chapter).FirstOrDefault();
if (!_cacheHelper.ShouldUpdateCoverImage(_directoryService.FileSystem.Path.Join(_directoryService.CoverImageDirectory, chapter.CoverImage), firstFile, chapter.Created, forceUpdate, chapter.CoverImageLocked))
return false;
}
/// <summary>
/// Updates the metadata for a Volume
/// </summary>
/// <param name="volume"></param>
/// <param name="forceUpdate">Force updating cover image even if underlying file has not been modified or chapter already has a cover image</param>
public bool UpdateMetadata(Volume volume, bool forceUpdate)
if (firstFile == null) return false;
_logger.LogDebug("[MetadataService] Generating cover image for {File}", firstFile?.FilePath);
chapter.CoverImage = _readingItemService.GetCoverImage(firstFile.FilePath, ImageService.GetChapterFormat(chapter.Id, chapter.VolumeId), firstFile.Format);
return true;
}
private void UpdateChapterLastModified(Chapter chapter, bool forceUpdate)
{
var firstFile = chapter.Files.OrderBy(x => x.Chapter).FirstOrDefault();
if (firstFile == null || _cacheHelper.HasFileNotChangedSinceCreationOrLastScan(chapter, forceUpdate, firstFile)) return;
firstFile.UpdateLastModified();
}
/// <summary>
/// Updates the cover image for a Volume
/// </summary>
/// <param name="volume"></param>
/// <param name="forceUpdate">Force updating cover image even if underlying file has not been modified or chapter already has a cover image</param>
private bool UpdateVolumeCoverImage(Volume volume, bool forceUpdate)
{
// We need to check if Volume coverImage matches first chapters if forceUpdate is false
if (volume == null || !_cacheHelper.ShouldUpdateCoverImage(
_directoryService.FileSystem.Path.Join(_directoryService.CoverImageDirectory, volume.CoverImage),
null, volume.Created, forceUpdate)) return false;
volume.Chapters ??= new List<Chapter>();
var firstChapter = volume.Chapters.OrderBy(x => double.Parse(x.Number), _chapterSortComparerForInChapterSorting).FirstOrDefault();
if (firstChapter == null) return false;
volume.CoverImage = firstChapter.CoverImage;
return true;
}
/// <summary>
/// Updates cover image for Series
/// </summary>
/// <param name="series"></param>
/// <param name="forceUpdate">Force updating cover image even if underlying file has not been modified or chapter already has a cover image</param>
private void UpdateSeriesCoverImage(Series series, bool forceUpdate)
{
if (series == null) return;
if (!_cacheHelper.ShouldUpdateCoverImage(_directoryService.FileSystem.Path.Join(_directoryService.CoverImageDirectory, series.CoverImage),
null, series.Created, forceUpdate, series.CoverImageLocked))
return;
series.Volumes ??= new List<Volume>();
var firstCover = series.Volumes.GetCoverImage(series.Format);
string coverImage = null;
if (firstCover == null && series.Volumes.Any())
{
// If firstCover is null and one volume, the whole series is Chapters under Vol 0.
if (series.Volumes.Count == 1)
{
coverImage = series.Volumes[0].Chapters.OrderBy(c => double.Parse(c.Number), _chapterSortComparerForInChapterSorting)
.FirstOrDefault(c => !c.IsSpecial)?.CoverImage;
}
if (!_cacheHelper.CoverImageExists(coverImage))
{
coverImage = series.Volumes[0].Chapters.OrderBy(c => double.Parse(c.Number), _chapterSortComparerForInChapterSorting)
.FirstOrDefault()?.CoverImage;
}
}
series.CoverImage = firstCover?.CoverImage ?? coverImage;
}
private bool UpdateSeriesSummary(Series series, bool forceUpdate)
{
// NOTE: This can be problematic when the file changes and a summary already exists, but it is likely
// better to let the user kick off a refresh metadata on an individual Series than having overhead of
// checking File last write time.
if (!string.IsNullOrEmpty(series.Summary) && !forceUpdate) return false;
var isBook = series.Library.Type == LibraryType.Book;
var firstVolume = series.Volumes.FirstWithChapters(isBook);
var firstChapter = firstVolume?.Chapters.GetFirstChapterWithFiles();
var firstFile = firstChapter?.Files.FirstOrDefault();
if (firstFile == null || (!forceUpdate && !firstFile.HasFileBeenModified())) return false;
if (Parser.Parser.IsPdf(firstFile.FilePath)) return false;
var comicInfo = GetComicInfo(series.Format, firstFile);
if (string.IsNullOrEmpty(comicInfo?.Summary)) return false;
series.Summary = comicInfo.Summary;
return true;
}
private ComicInfo GetComicInfo(MangaFormat format, MangaFile firstFile)
{
if (format is MangaFormat.Archive or MangaFormat.Epub)
{
return Parser.Parser.IsEpub(firstFile.FilePath) ? _bookService.GetComicInfo(firstFile.FilePath) : _archiveService.GetComicInfo(firstFile.FilePath);
}
return null;
}
/// <summary>
/// Updates the cover images for a Series and all of its Volumes and Chapters
/// </summary>
/// <param name="series"></param>
/// <param name="forceUpdate"></param>
private void ProcessSeriesMetadataUpdate(Series series, ICollection<Person> allPeople, ICollection<Genre> allGenres, ICollection<Tag> allTags, bool forceUpdate)
_logger.LogDebug("[MetadataService] Processing series {SeriesName}", series.OriginalName);
try
{
var library = await _unitOfWork.LibraryRepository.GetLibraryForIdAsync(libraryId, LibraryIncludes.None);
_logger.LogInformation("[MetadataService] Beginning metadata refresh of {LibraryName}", library.Name);
var chunkInfo = await _unitOfWork.SeriesRepository.GetChunkInfo(library.Id);
var stopwatch = Stopwatch.StartNew();
var totalTime = 0L;
_logger.LogInformation("[MetadataService] Refreshing Library {LibraryName}. Total Items: {TotalSize}. Total Chunks: {TotalChunks} with {ChunkSize} size", library.Name, chunkInfo.TotalSize, chunkInfo.TotalChunks, chunkInfo.ChunkSize);
await _messageHub.Clients.All.SendAsync(SignalREvents.RefreshMetadataProgress,
MessageFactory.RefreshMetadataProgressEvent(library.Id, 0F));
var i = 0;
for (var chunk = 1; chunk <= chunkInfo.TotalChunks; chunk++, i++)
{
if (chunkInfo.TotalChunks == 0) continue;
totalTime += stopwatch.ElapsedMilliseconds;
stopwatch.Restart();
_logger.LogInformation("[MetadataService] Processing chunk {ChunkNumber} / {TotalChunks} with size {ChunkSize}. Series ({SeriesStart} - {SeriesEnd}",
chunk, chunkInfo.TotalChunks, chunkInfo.ChunkSize, chunk * chunkInfo.ChunkSize, (chunk + 1) * chunkInfo.ChunkSize);
var nonLibrarySeries = await _unitOfWork.SeriesRepository.GetFullSeriesForLibraryIdAsync(library.Id,
new UserParams()
{
PageNumber = chunk,
PageSize = chunkInfo.ChunkSize
});
_logger.LogDebug("[MetadataService] Fetched {SeriesCount} series for refresh", nonLibrarySeries.Count);
Parallel.ForEach(nonLibrarySeries, series =>
{
try
{
_logger.LogDebug("[MetadataService] Processing series {SeriesName}", series.OriginalName);
var volumeUpdated = false;
foreach (var volume in series.Volumes)
{
var chapterUpdated = false;
foreach (var chapter in volume.Chapters)
{
chapterUpdated = UpdateMetadata(chapter, forceUpdate);
}
volumeUpdated = UpdateMetadata(volume, chapterUpdated || forceUpdate);
}
UpdateMetadata(series, volumeUpdated || forceUpdate);
}
catch (Exception)
{
/* Swallow exception */
}
});
if (_unitOfWork.HasChanges() && await _unitOfWork.CommitAsync())
{
_logger.LogInformation(
"[MetadataService] Processed {SeriesStart} - {SeriesEnd} out of {TotalSeries} series in {ElapsedScanTime} milliseconds for {LibraryName}",
chunk * chunkInfo.ChunkSize, (chunk * chunkInfo.ChunkSize) + nonLibrarySeries.Count, chunkInfo.TotalSize, stopwatch.ElapsedMilliseconds, library.Name);
foreach (var series in nonLibrarySeries)
{
await _messageHub.Clients.All.SendAsync(SignalREvents.RefreshMetadata, MessageFactory.RefreshMetadataEvent(library.Id, series.Id));
}
}
else
{
_logger.LogInformation(
"[MetadataService] Processed {SeriesStart} - {SeriesEnd} out of {TotalSeries} series in {ElapsedScanTime} milliseconds for {LibraryName}",
chunk * chunkInfo.ChunkSize, (chunk * chunkInfo.ChunkSize) + nonLibrarySeries.Count, chunkInfo.TotalSize, stopwatch.ElapsedMilliseconds, library.Name);
}
var progress = Math.Max(0F, Math.Min(1F, i * 1F / chunkInfo.TotalChunks));
await _messageHub.Clients.All.SendAsync(SignalREvents.RefreshMetadataProgress,
MessageFactory.RefreshMetadataProgressEvent(library.Id, progress));
}
await _messageHub.Clients.All.SendAsync(SignalREvents.RefreshMetadataProgress,
MessageFactory.RefreshMetadataProgressEvent(library.Id, 1F));
_logger.LogInformation("[MetadataService] Updated metadata for {SeriesNumber} series in library {LibraryName} in {ElapsedMilliseconds} milliseconds total", chunkInfo.TotalSize, library.Name, totalTime);
}
/// <summary>
/// Refreshes Metadata for a Series. Will always force updates.
/// </summary>
/// <param name="libraryId"></param>
/// <param name="seriesId"></param>
public async Task RefreshMetadataForSeries(int libraryId, int seriesId, bool forceUpdate = false)
{
var sw = Stopwatch.StartNew();
var series = await _unitOfWork.SeriesRepository.GetFullSeriesForSeriesIdAsync(seriesId);
if (series == null)
{
_logger.LogError("[MetadataService] Series {SeriesId} was not found on Library {LibraryId}", seriesId, libraryId);
return;
}
_logger.LogInformation("[MetadataService] Beginning metadata refresh of {SeriesName}", series.Name);
var volumeUpdated = false;
var volumeIndex = 0;
var firstVolumeUpdated = false;
foreach (var volume in series.Volumes)
{
var chapterUpdated = false;
var firstChapterUpdated = false; // This only needs to be FirstChapter updated
var index = 0;
foreach (var chapter in volume.Chapters)
{
chapterUpdated = UpdateMetadata(chapter, forceUpdate);
var chapterUpdated = UpdateChapterCoverImage(chapter, forceUpdate);
// If cover was update, either the file has changed or first scan and we should force a metadata update
UpdateChapterLastModified(chapter, forceUpdate || chapterUpdated);
if (index == 0 && chapterUpdated)
{
firstChapterUpdated = true;
}
index++;
}
volumeUpdated = UpdateMetadata(volume, chapterUpdated || forceUpdate);
var volumeUpdated = UpdateVolumeCoverImage(volume, firstChapterUpdated || forceUpdate);
if (volumeIndex == 0 && volumeUpdated)
{
firstVolumeUpdated = true;
}
volumeIndex++;
}
UpdateMetadata(series, volumeUpdated || forceUpdate);
if (_unitOfWork.HasChanges() && await _unitOfWork.CommitAsync())
{
await _messageHub.Clients.All.SendAsync(SignalREvents.RefreshMetadata, MessageFactory.RefreshMetadataEvent(series.LibraryId, series.Id));
}
_logger.LogInformation("[MetadataService] Updated metadata for {SeriesName} in {ElapsedMilliseconds} milliseconds", series.Name, sw.ElapsedMilliseconds);
UpdateSeriesCoverImage(series, firstVolumeUpdated || forceUpdate);
}
catch (Exception ex)
{
_logger.LogError(ex, "[MetadataService] There was an exception during updating metadata for {SeriesName} ", series.Name);
}
}
/// <summary>
/// Refreshes Metadata for a whole library
/// </summary>
/// <remarks>This can be heavy on memory first run</remarks>
/// <param name="libraryId"></param>
/// <param name="forceUpdate">Force updating cover image even if underlying file has not been modified or chapter already has a cover image</param>
public async Task RefreshMetadata(int libraryId, bool forceUpdate = false)
{
var library = await _unitOfWork.LibraryRepository.GetLibraryForIdAsync(libraryId, LibraryIncludes.None);
_logger.LogInformation("[MetadataService] Beginning metadata refresh of {LibraryName}", library.Name);
var chunkInfo = await _unitOfWork.SeriesRepository.GetChunkInfo(library.Id);
var stopwatch = Stopwatch.StartNew();
var totalTime = 0L;
_logger.LogInformation("[MetadataService] Refreshing Library {LibraryName}. Total Items: {TotalSize}. Total Chunks: {TotalChunks} with {ChunkSize} size", library.Name, chunkInfo.TotalSize, chunkInfo.TotalChunks, chunkInfo.ChunkSize);
await _messageHub.Clients.All.SendAsync(SignalREvents.RefreshMetadataProgress,
MessageFactory.RefreshMetadataProgressEvent(library.Id, 0F));
for (var chunk = 1; chunk <= chunkInfo.TotalChunks; chunk++)
{
if (chunkInfo.TotalChunks == 0) continue;
totalTime += stopwatch.ElapsedMilliseconds;
stopwatch.Restart();
_logger.LogInformation("[MetadataService] Processing chunk {ChunkNumber} / {TotalChunks} with size {ChunkSize}. Series ({SeriesStart} - {SeriesEnd}",
chunk, chunkInfo.TotalChunks, chunkInfo.ChunkSize, chunk * chunkInfo.ChunkSize, (chunk + 1) * chunkInfo.ChunkSize);
var nonLibrarySeries = await _unitOfWork.SeriesRepository.GetFullSeriesForLibraryIdAsync(library.Id,
new UserParams()
{
PageNumber = chunk,
PageSize = chunkInfo.ChunkSize
});
_logger.LogDebug("[MetadataService] Fetched {SeriesCount} series for refresh", nonLibrarySeries.Count);
var allPeople = await _unitOfWork.PersonRepository.GetAllPeople();
var allGenres = await _unitOfWork.GenreRepository.GetAllGenresAsync();
var allTags = await _unitOfWork.TagRepository.GetAllTagsAsync();
var seriesIndex = 0;
foreach (var series in nonLibrarySeries)
{
try
{
ProcessSeriesMetadataUpdate(series, allPeople, allGenres, allTags, forceUpdate);
}
catch (Exception ex)
{
_logger.LogError(ex, "[MetadataService] There was an exception during metadata refresh for {SeriesName}", series.Name);
}
var index = (chunk - 1) * chunkInfo.ChunkSize + seriesIndex;
var progress = Math.Max(0F, Math.Min(1F, index * 1F / chunkInfo.TotalSize));
await _messageHub.Clients.All.SendAsync(SignalREvents.RefreshMetadataProgress,
MessageFactory.RefreshMetadataProgressEvent(library.Id, progress));
seriesIndex++;
}
await _unitOfWork.CommitAsync();
foreach (var series in nonLibrarySeries)
{
await _messageHub.Clients.All.SendAsync(SignalREvents.RefreshMetadata, MessageFactory.RefreshMetadataEvent(library.Id, series.Id));
}
_logger.LogInformation(
"[MetadataService] Processed {SeriesStart} - {SeriesEnd} out of {TotalSeries} series in {ElapsedScanTime} milliseconds for {LibraryName}",
chunk * chunkInfo.ChunkSize, (chunk * chunkInfo.ChunkSize) + nonLibrarySeries.Count, chunkInfo.TotalSize, stopwatch.ElapsedMilliseconds, library.Name);
}
await _messageHub.Clients.All.SendAsync(SignalREvents.RefreshMetadataProgress,
MessageFactory.RefreshMetadataProgressEvent(library.Id, 1F));
await RemoveAbandonedMetadataKeys();
_logger.LogInformation("[MetadataService] Updated metadata for {SeriesNumber} series in library {LibraryName} in {ElapsedMilliseconds} milliseconds total", chunkInfo.TotalSize, library.Name, totalTime);
}
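/// <summary>
/// Removes any People, Genres, and Tags that are no longer associated with any entity
/// </summary>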
private async Task RemoveAbandonedMetadataKeys()
{
await _unitOfWork.TagRepository.RemoveAllTagNoLongerAssociated();
await _unitOfWork.PersonRepository.RemoveAllPeopleNoLongerAssociated();
await _unitOfWork.GenreRepository.RemoveAllGenreNoLongerAssociated();
}
// TODO: I can probably refactor RefreshMetadata and RefreshMetadataForSeries to be the same by utilizing chunk size of 1, so most of the code can be the same.
private async Task PerformScan(Library library, bool forceUpdate, Action<int, Chunk> action)
{
var chunkInfo = await _unitOfWork.SeriesRepository.GetChunkInfo(library.Id);
var stopwatch = Stopwatch.StartNew();
var totalTime = 0L;
_logger.LogInformation("[MetadataService] Refreshing Library {LibraryName}. Total Items: {TotalSize}. Total Chunks: {TotalChunks} with {ChunkSize} size", library.Name, chunkInfo.TotalSize, chunkInfo.TotalChunks, chunkInfo.ChunkSize);
await _messageHub.Clients.All.SendAsync(SignalREvents.RefreshMetadataProgress,
MessageFactory.RefreshMetadataProgressEvent(library.Id, 0F));
for (var chunk = 1; chunk <= chunkInfo.TotalChunks; chunk++)
{
if (chunkInfo.TotalChunks == 0) continue;
totalTime += stopwatch.ElapsedMilliseconds;
stopwatch.Restart();
action(chunk, chunkInfo);
// _logger.LogInformation("[MetadataService] Processing chunk {ChunkNumber} / {TotalChunks} with size {ChunkSize}. Series ({SeriesStart} - {SeriesEnd}",
// chunk, chunkInfo.TotalChunks, chunkInfo.ChunkSize, chunk * chunkInfo.ChunkSize, (chunk + 1) * chunkInfo.ChunkSize);
// var nonLibrarySeries = await _unitOfWork.SeriesRepository.GetFullSeriesForLibraryIdAsync(library.Id,
// new UserParams()
// {
// PageNumber = chunk,
// PageSize = chunkInfo.ChunkSize
// });
// _logger.LogDebug("[MetadataService] Fetched {SeriesCount} series for refresh", nonLibrarySeries.Count);
//
// var chapterIds = await _unitOfWork.SeriesRepository.GetChapterIdWithSeriesIdForSeriesAsync(nonLibrarySeries.Select(s => s.Id).ToArray());
// var allPeople = await _unitOfWork.PersonRepository.GetAllPeople();
// var allGenres = await _unitOfWork.GenreRepository.GetAllGenres();
//
//
// var seriesIndex = 0;
// foreach (var series in nonLibrarySeries)
// {
// try
// {
// ProcessSeriesMetadataUpdate(series, chapterIds, allPeople, allGenres, forceUpdate);
// }
// catch (Exception ex)
// {
// _logger.LogError(ex, "[MetadataService] There was an exception during metadata refresh for {SeriesName}", series.Name);
// }
// var index = chunk * seriesIndex;
// var progress = Math.Max(0F, Math.Min(1F, index * 1F / chunkInfo.TotalSize));
//
// await _messageHub.Clients.All.SendAsync(SignalREvents.RefreshMetadataProgress,
// MessageFactory.RefreshMetadataProgressEvent(library.Id, progress));
// seriesIndex++;
// }
await _unitOfWork.CommitAsync();
}
}
/// <summary>
/// Refreshes Metadata for a Series. Will always force updates.
/// </summary>
/// <param name="libraryId"></param>
/// <param name="seriesId"></param>
public async Task RefreshMetadataForSeries(int libraryId, int seriesId, bool forceUpdate = true)
{
var sw = Stopwatch.StartNew();
var series = await _unitOfWork.SeriesRepository.GetFullSeriesForSeriesIdAsync(seriesId);
if (series == null)
{
_logger.LogError("[MetadataService] Series {SeriesId} was not found on Library {LibraryId}", seriesId, libraryId);
return;
}
await _messageHub.Clients.All.SendAsync(SignalREvents.RefreshMetadataProgress,
MessageFactory.RefreshMetadataProgressEvent(libraryId, 0F));
var allPeople = await _unitOfWork.PersonRepository.GetAllPeople();
var allGenres = await _unitOfWork.GenreRepository.GetAllGenresAsync();
var allTags = await _unitOfWork.TagRepository.GetAllTagsAsync();
ProcessSeriesMetadataUpdate(series, allPeople, allGenres, allTags, forceUpdate);
await _messageHub.Clients.All.SendAsync(SignalREvents.RefreshMetadataProgress,
MessageFactory.RefreshMetadataProgressEvent(libraryId, 1F));
if (_unitOfWork.HasChanges() && await _unitOfWork.CommitAsync())
{
await _messageHub.Clients.All.SendAsync(SignalREvents.RefreshMetadata, MessageFactory.RefreshMetadataEvent(series.LibraryId, series.Id));
}
await RemoveAbandonedMetadataKeys();
_logger.LogInformation("[MetadataService] Updated metadata for {SeriesName} in {ElapsedMilliseconds} milliseconds", series.Name, sw.ElapsedMilliseconds);
}
}
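
For reference, a minimal standalone sketch (not Kavita code) of the chunked progress math the refresh loop above relies on; the ChunkInfo record here is a hypothetical stand-in for Kavita's chunk descriptor with the same TotalSize/TotalChunks/ChunkSize fields:

using System;
public record ChunkInfo(int TotalSize, int TotalChunks, int ChunkSize);
public static class ProgressMath
{
// chunk is 1-based, seriesIndex is 0-based within the chunk
public static float Progress(ChunkInfo info, int chunk, int seriesIndex)
{
var processed = (chunk - 1) * info.ChunkSize + seriesIndex;
// Clamp to [0, 1], mirroring the Math.Max/Math.Min guard in RefreshMetadata
return Math.Max(0F, Math.Min(1F, processed * 1F / info.TotalSize));
}
}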


@@ -0,0 +1,326 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Threading.Tasks;
using API.Comparators;
using API.Data;
using API.Data.Repositories;
using API.DTOs;
using API.Entities;
using API.Extensions;
using Kavita.Common;
using Microsoft.Extensions.Logging;
namespace API.Services;
public interface IReaderService
{
void MarkChaptersAsRead(AppUser user, int seriesId, IEnumerable<Chapter> chapters);
void MarkChaptersAsUnread(AppUser user, int seriesId, IEnumerable<Chapter> chapters);
Task<bool> SaveReadingProgress(ProgressDto progressDto, int userId);
Task<int> CapPageToChapter(int chapterId, int page);
Task<int> GetNextChapterIdAsync(int seriesId, int volumeId, int currentChapterId, int userId);
Task<int> GetPrevChapterIdAsync(int seriesId, int volumeId, int currentChapterId, int userId);
}
public class ReaderService : IReaderService
{
private readonly IUnitOfWork _unitOfWork;
private readonly ILogger<ReaderService> _logger;
private readonly IDirectoryService _directoryService;
private readonly ICacheService _cacheService;
private readonly ChapterSortComparer _chapterSortComparer = new ChapterSortComparer();
private readonly ChapterSortComparerZeroFirst _chapterSortComparerForInChapterSorting = new ChapterSortComparerZeroFirst();
public ReaderService(IUnitOfWork unitOfWork, ILogger<ReaderService> logger, IDirectoryService directoryService, ICacheService cacheService)
{
_unitOfWork = unitOfWork;
_logger = logger;
_directoryService = directoryService;
_cacheService = cacheService;
}
public static string FormatBookmarkFolderPath(string baseDirectory, int userId, int seriesId, int chapterId)
{
return Parser.Parser.NormalizePath(Path.Join(baseDirectory, $"{userId}", $"{seriesId}", $"{chapterId}"));
}
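// Example: FormatBookmarkFolderPath("/config/bookmarks", 1, 42, 301) returns "/config/bookmarks/1/42/301"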
/// <summary>
/// Marks all Chapters as Read by creating or updating UserProgress rows. Does not commit.
/// </summary>
/// <param name="user"></param>
/// <param name="seriesId"></param>
/// <param name="chapters"></param>
public void MarkChaptersAsRead(AppUser user, int seriesId, IEnumerable<Chapter> chapters)
{
foreach (var chapter in chapters)
{
var userProgress = GetUserProgressForChapter(user, chapter);
if (userProgress == null)
{
user.Progresses.Add(new AppUserProgress
{
PagesRead = chapter.Pages,
VolumeId = chapter.VolumeId,
SeriesId = seriesId,
ChapterId = chapter.Id
});
}
else
{
userProgress.PagesRead = chapter.Pages;
userProgress.SeriesId = seriesId;
userProgress.VolumeId = chapter.VolumeId;
}
}
}
/// <summary>
/// Marks all Chapters as Unread by creating or updating UserProgress rows. Does not commit.
/// </summary>
/// <param name="user"></param>
/// <param name="seriesId"></param>
/// <param name="chapters"></param>
public void MarkChaptersAsUnread(AppUser user, int seriesId, IEnumerable<Chapter> chapters)
{
foreach (var chapter in chapters)
{
var userProgress = GetUserProgressForChapter(user, chapter);
if (userProgress == null) continue;
userProgress.PagesRead = 0;
userProgress.SeriesId = seriesId;
userProgress.VolumeId = chapter.VolumeId;
}
}
/// <summary>
/// Gets the User Progress for a given Chapter. This will handle any duplicates that might have occurred in past versions and will delete them. Does not commit.
/// </summary>
/// <param name="user">Must have Progresses populated</param>
/// <param name="chapter"></param>
/// <returns></returns>
private static AppUserProgress GetUserProgressForChapter(AppUser user, Chapter chapter)
{
AppUserProgress userProgress = null;
if (user.Progresses == null)
{
throw new KavitaException("Progresses must exist on user");
}
try
{
userProgress =
user.Progresses.SingleOrDefault(x => x.ChapterId == chapter.Id && x.AppUserId == user.Id);
}
catch (Exception)
{
// There is a very rare chance that user progress rows have been duplicated for a chapter. If that happens, keep only the first row.
var progresses = user.Progresses.Where(x => x.ChapterId == chapter.Id && x.AppUserId == user.Id).ToList();
if (progresses.Count > 1)
{
user.Progresses = new List<AppUserProgress>()
{
user.Progresses.First()
};
userProgress = user.Progresses.First();
}
}
return userProgress;
}
/// <summary>
/// Saves progress to DB
/// </summary>
/// <param name="progressDto"></param>
/// <param name="userId"></param>
/// <returns></returns>
public async Task<bool> SaveReadingProgress(ProgressDto progressDto, int userId)
{
// Don't let user save past total pages.
progressDto.PageNum = await CapPageToChapter(progressDto.ChapterId, progressDto.PageNum);
try
{
var userProgress =
await _unitOfWork.AppUserProgressRepository.GetUserProgressAsync(progressDto.ChapterId, userId);
if (userProgress == null)
{
// Create a user object
var userWithProgress =
await _unitOfWork.UserRepository.GetUserByIdAsync(userId, AppUserIncludes.Progress);
userWithProgress.Progresses ??= new List<AppUserProgress>();
userWithProgress.Progresses.Add(new AppUserProgress
{
PagesRead = progressDto.PageNum,
VolumeId = progressDto.VolumeId,
SeriesId = progressDto.SeriesId,
ChapterId = progressDto.ChapterId,
BookScrollId = progressDto.BookScrollId,
LastModified = DateTime.Now
});
_unitOfWork.UserRepository.Update(userWithProgress);
}
else
{
userProgress.PagesRead = progressDto.PageNum;
userProgress.SeriesId = progressDto.SeriesId;
userProgress.VolumeId = progressDto.VolumeId;
userProgress.BookScrollId = progressDto.BookScrollId;
userProgress.LastModified = DateTime.Now;
_unitOfWork.AppUserProgressRepository.Update(userProgress);
}
if (await _unitOfWork.CommitAsync())
{
return true;
}
}
catch (Exception exception)
{
_logger.LogError(exception, "Could not save progress");
await _unitOfWork.RollbackAsync();
}
return false;
}
/// <summary>
/// Ensures that the page is within 0 and total pages for a chapter. Makes one DB call.
/// </summary>
/// <param name="chapterId"></param>
/// <param name="page"></param>
/// <returns></returns>
public async Task<int> CapPageToChapter(int chapterId, int page)
{
var totalPages = await _unitOfWork.ChapterRepository.GetChapterTotalPagesAsync(chapterId);
if (page > totalPages)
{
page = totalPages;
}
if (page < 0)
{
page = 0;
}
return page;
}
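// e.g. for a 100 page chapter: CapPageToChapter(chapterId, 250) returns 100 and CapPageToChapter(chapterId, -5) returns 0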
/// <summary>
/// Tries to find the next logical Chapter
/// </summary>
/// <example>
/// V1 → V2 → V3 chapter 0 → V3 chapter 10 → SP 01 → SP 02
/// </example>
/// <param name="seriesId"></param>
/// <param name="volumeId"></param>
/// <param name="currentChapterId"></param>
/// <param name="userId"></param>
/// <returns>-1 if nothing can be found</returns>
public async Task<int> GetNextChapterIdAsync(int seriesId, int volumeId, int currentChapterId, int userId)
{
var volumes = (await _unitOfWork.VolumeRepository.GetVolumesDtoAsync(seriesId, userId)).ToList();
var currentVolume = volumes.Single(v => v.Id == volumeId);
var currentChapter = currentVolume.Chapters.Single(c => c.Id == currentChapterId);
if (currentVolume.Number == 0)
{
// Handle specials by sorting on their Filename aka Range
var chapterId = GetNextChapterId(currentVolume.Chapters.OrderByNatural(x => x.Range), currentChapter.Number);
if (chapterId > 0) return chapterId;
}
foreach (var volume in volumes)
{
if (volume.Number == currentVolume.Number && volume.Chapters.Count > 1)
{
// Handle Chapters within current Volume
// In this case, chapter 0 comes first because 0 represents a full volume file.
var chapterId = GetNextChapterId(currentVolume.Chapters.OrderBy(x => double.Parse(x.Number), _chapterSortComparerForInChapterSorting), currentChapter.Number);
if (chapterId > 0) return chapterId;
}
if (volume.Number != currentVolume.Number + 1) continue;
// Handle Chapters within next Volume
// ! When selecting the chapter for the next volume, we need to make sure a c0 comes before a c1+
var chapters = volume.Chapters.OrderBy(x => double.Parse(x.Number), _chapterSortComparer).ToList();
if (currentChapter.Number.Equals("0") && chapters.Last().Number.Equals("0"))
{
return chapters.Last().Id;
}
var firstChapter = chapters.FirstOrDefault();
if (firstChapter == null) return -1;
return firstChapter.Id;
}
return -1;
}
/// <summary>
/// Tries to find the prev logical Chapter
/// </summary>
/// <example>
/// V1 ← V2 ← V3 chapter 0 ← V3 chapter 10 ← SP 01 ← SP 02
/// </example>
/// <param name="seriesId"></param>
/// <param name="volumeId"></param>
/// <param name="currentChapterId"></param>
/// <param name="userId"></param>
/// <returns>-1 if nothing can be found</returns>
public async Task<int> GetPrevChapterIdAsync(int seriesId, int volumeId, int currentChapterId, int userId)
{
var volumes = (await _unitOfWork.VolumeRepository.GetVolumesDtoAsync(seriesId, userId)).Reverse().ToList();
var currentVolume = volumes.Single(v => v.Id == volumeId);
var currentChapter = currentVolume.Chapters.Single(c => c.Id == currentChapterId);
if (currentVolume.Number == 0)
{
var chapterId = GetNextChapterId(currentVolume.Chapters.OrderByNatural(x => x.Range).Reverse(), currentChapter.Number);
if (chapterId > 0) return chapterId;
}
foreach (var volume in volumes)
{
if (volume.Number == currentVolume.Number)
{
var chapterId = GetNextChapterId(currentVolume.Chapters.OrderBy(x => double.Parse(x.Number), _chapterSortComparerForInChapterSorting).Reverse(), currentChapter.Number);
if (chapterId > 0) return chapterId;
}
if (volume.Number == currentVolume.Number - 1)
{
var lastChapter = volume.Chapters
.OrderBy(x => double.Parse(x.Number), _chapterSortComparerForInChapterSorting).LastOrDefault();
if (lastChapter == null) return -1;
return lastChapter.Id;
}
}
return -1;
}
private static int GetNextChapterId(IEnumerable<ChapterDto> chapters, string currentChapterNumber)
{
var next = false;
var chaptersList = chapters.ToList();
foreach (var chapter in chaptersList)
{
if (next)
{
return chapter.Id;
}
if (currentChapterNumber.Equals(chapter.Number)) next = true;
}
return -1;
}
}
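
A minimal standalone sketch (not Kavita code) of the linear scan GetNextChapterId performs over an already-ordered chapter sequence; the tuple shape here is hypothetical:

using System.Collections.Generic;
public static class NextIdSketch
{
public static int NextId(IEnumerable<(int Id, string Number)> chapters, string current)
{
var next = false;
foreach (var chapter in chapters)
{
if (next) return chapter.Id; // first element after the current chapter
if (chapter.Number == current) next = true;
}
return -1; // nothing follows the current chapter
}
}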


@@ -0,0 +1,141 @@
using System;
using API.Data.Metadata;
using API.Entities.Enums;
using API.Parser;
namespace API.Services;
public interface IReadingItemService
{
ComicInfo GetComicInfo(string filePath);
int GetNumberOfPages(string filePath, MangaFormat format);
string GetCoverImage(string filePath, string fileName, MangaFormat format);
void Extract(string fileFilePath, string targetDirectory, MangaFormat format, int imageCount = 1);
ParserInfo Parse(string path, string rootPath, LibraryType type);
}
public class ReadingItemService : IReadingItemService
{
private readonly IArchiveService _archiveService;
private readonly IBookService _bookService;
private readonly IImageService _imageService;
private readonly IDirectoryService _directoryService;
private readonly DefaultParser _defaultParser;
public ReadingItemService(IArchiveService archiveService, IBookService bookService, IImageService imageService, IDirectoryService directoryService)
{
_archiveService = archiveService;
_bookService = bookService;
_imageService = imageService;
_directoryService = directoryService;
_defaultParser = new DefaultParser(directoryService);
}
/// <summary>
/// Gets the ComicInfo for the file if it exists. Null otherwise.
/// </summary>
/// <param name="filePath">Fully qualified path of file</param>
/// <returns></returns>
public ComicInfo? GetComicInfo(string filePath)
{
if (Parser.Parser.IsEpub(filePath))
{
return _bookService.GetComicInfo(filePath);
}
if (Parser.Parser.IsComicInfoExtension(filePath))
{
return _archiveService.GetComicInfo(filePath);
}
return null;
}
/// <summary>
/// Returns the number of pages for the file, based on its format
/// </summary>
/// <param name="filePath"></param>
/// <param name="format"></param>
/// <returns></returns>
public int GetNumberOfPages(string filePath, MangaFormat format)
{
switch (format)
{
case MangaFormat.Archive:
{
return _archiveService.GetNumberOfPagesFromArchive(filePath);
}
case MangaFormat.Pdf:
case MangaFormat.Epub:
{
return _bookService.GetNumberOfPages(filePath);
}
case MangaFormat.Image:
{
return 1;
}
case MangaFormat.Unknown:
default:
return 0;
}
}
public string GetCoverImage(string filePath, string fileName, MangaFormat format)
{
if (string.IsNullOrEmpty(filePath) || string.IsNullOrEmpty(fileName))
{
return string.Empty;
}
return format switch
{
MangaFormat.Epub => _bookService.GetCoverImage(filePath, fileName, _directoryService.CoverImageDirectory),
MangaFormat.Archive => _archiveService.GetCoverImage(filePath, fileName, _directoryService.CoverImageDirectory),
MangaFormat.Image => _imageService.GetCoverImage(filePath, fileName, _directoryService.CoverImageDirectory),
MangaFormat.Pdf => _bookService.GetCoverImage(filePath, fileName, _directoryService.CoverImageDirectory),
_ => string.Empty
};
}
/// <summary>
/// Extracts the reading item to the target directory using the appropriate method
/// </summary>
/// <param name="fileFilePath">File to extract</param>
/// <param name="targetDirectory">Where to extract to. Will be created if does not exist</param>
/// <param name="format">Format of the File</param>
/// <param name="imageCount">If the file is of type image, pass number of files needed. If > 0, will copy the whole directory.</param>
/// <exception cref="ArgumentOutOfRangeException"></exception>
public void Extract(string fileFilePath, string targetDirectory, MangaFormat format, int imageCount = 1)
{
switch (format)
{
case MangaFormat.Pdf:
_bookService.ExtractPdfImages(fileFilePath, targetDirectory);
break;
case MangaFormat.Archive:
_archiveService.ExtractArchive(fileFilePath, targetDirectory);
break;
case MangaFormat.Image:
_imageService.ExtractImages(fileFilePath, targetDirectory, imageCount);
break;
case MangaFormat.Unknown:
case MangaFormat.Epub:
break;
default:
throw new ArgumentOutOfRangeException(nameof(format), format, null);
}
}
/// <summary>
/// Parses information out of a file. If file is a book (epub), it will use book metadata regardless of LibraryType
/// </summary>
/// <param name="path"></param>
/// <param name="rootPath"></param>
/// <param name="type"></param>
/// <returns></returns>
public ParserInfo Parse(string path, string rootPath, LibraryType type)
{
return Parser.Parser.IsEpub(path) ? _bookService.ParseInfo(path) : _defaultParser.Parse(path, rootPath, type);
}
}
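
A hypothetical usage sketch of the format dispatch above; the paths are illustrative and readingItemService would be resolved through DI:

// var pages = readingItemService.GetNumberOfPages("/manga/Vol 01.cbz", MangaFormat.Archive); // counts pages inside the archive
// readingItemService.Extract("/manga/Vol 01.cbz", "/cache/1", MangaFormat.Archive); // unpacks pages for the reader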


@@ -1,178 +1,187 @@
using System;
using System.IO;
using System.Threading;
using System.Threading.Tasks;
using API.Data;
using API.Entities.Enums;
using API.Helpers.Converters;
using API.Interfaces;
using API.Interfaces.Services;
using API.Services.Tasks;
using Hangfire;
using Microsoft.Extensions.Logging;
namespace API.Services;
public interface ITaskScheduler
{
Task ScheduleTasks();
Task ScheduleStatsTasks();
void ScheduleUpdaterTasks();
void ScanLibrary(int libraryId, bool forceUpdate = false);
void CleanupChapters(int[] chapterIds);
void RefreshMetadata(int libraryId, bool forceUpdate = true);
void RefreshSeriesMetadata(int libraryId, int seriesId, bool forceUpdate = false);
void ScanSeries(int libraryId, int seriesId, bool forceUpdate = false);
void CancelStatsTasks();
Task RunStatCollection();
}
public class TaskScheduler : ITaskScheduler
{
private readonly ICacheService _cacheService;
private readonly ILogger<TaskScheduler> _logger;
private readonly IScannerService _scannerService;
private readonly IUnitOfWork _unitOfWork;
private readonly IMetadataService _metadataService;
private readonly IBackupService _backupService;
private readonly ICleanupService _cleanupService;
private readonly IStatsService _statsService;
private readonly IVersionUpdaterService _versionUpdaterService;
private readonly IDirectoryService _directoryService;
public static BackgroundJobServer Client => new BackgroundJobServer();
private static readonly Random Rnd = new Random();
public TaskScheduler(ICacheService cacheService, ILogger<TaskScheduler> logger, IScannerService scannerService,
IUnitOfWork unitOfWork, IMetadataService metadataService, IBackupService backupService,
ICleanupService cleanupService, IStatsService statsService, IVersionUpdaterService versionUpdaterService,
IDirectoryService directoryService)
{
_cacheService = cacheService;
_logger = logger;
_scannerService = scannerService;
_unitOfWork = unitOfWork;
_metadataService = metadataService;
_backupService = backupService;
_cleanupService = cleanupService;
_statsService = statsService;
_versionUpdaterService = versionUpdaterService;
_directoryService = directoryService;
}
public async Task ScheduleTasks()
{
_logger.LogInformation("Scheduling reoccurring tasks");
var setting = (await _unitOfWork.SettingsRepository.GetSettingAsync(ServerSettingKey.TaskScan)).Value;
if (setting != null)
{
var scanLibrarySetting = setting;
_logger.LogDebug("Scheduling Scan Library Task for {Setting}", scanLibrarySetting);
RecurringJob.AddOrUpdate("scan-libraries", () => _scannerService.ScanLibraries(),
() => CronConverter.ConvertToCronNotation(scanLibrarySetting), TimeZoneInfo.Local);
}
else
{
RecurringJob.AddOrUpdate("scan-libraries", () => _scannerService.ScanLibraries(), Cron.Daily, TimeZoneInfo.Local);
}
setting = (await _unitOfWork.SettingsRepository.GetSettingAsync(ServerSettingKey.TaskBackup)).Value;
if (setting != null)
{
_logger.LogDebug("Scheduling Backup Task for {Setting}", setting);
RecurringJob.AddOrUpdate("backup", () => _backupService.BackupDatabase(), () => CronConverter.ConvertToCronNotation(setting), TimeZoneInfo.Local);
}
else
{
RecurringJob.AddOrUpdate("backup", () => _backupService.BackupDatabase(), Cron.Weekly, TimeZoneInfo.Local);
}
RecurringJob.AddOrUpdate("cleanup", () => _cleanupService.Cleanup(), Cron.Daily, TimeZoneInfo.Local);
RecurringJob.AddOrUpdate("cleanup-db", () => _cleanupService.CleanupDbEntries(), Cron.Daily, TimeZoneInfo.Local);
}
#region StatsTasks
public async Task ScheduleStatsTasks()
{
var allowStatCollection = (await _unitOfWork.SettingsRepository.GetSettingsDtoAsync()).AllowStatCollection;
if (!allowStatCollection)
{
_logger.LogDebug("User has opted out of stat collection, not registering tasks");
return;
}
_logger.LogDebug("Scheduling stat collection daily");
RecurringJob.AddOrUpdate("report-stats", () => _statsService.Send(), Cron.Daily(Rnd.Next(0, 22)), TimeZoneInfo.Local);
}
public void CancelStatsTasks()
{
_logger.LogDebug("Cancelling/Removing StatsTasks");
RecurringJob.RemoveIfExists("report-stats");
}
/// <summary>
/// First time run stat collection. Executes immediately on a background thread. Does not block.
/// </summary>
public async Task RunStatCollection()
{
var allowStatCollection = (await _unitOfWork.SettingsRepository.GetSettingsDtoAsync()).AllowStatCollection;
if (!allowStatCollection)
{
_logger.LogDebug("User has opted out of stat collection, not sending stats");
return;
}
BackgroundJob.Enqueue(() => _statsService.Send());
}
#endregion
#region UpdateTasks
public void ScheduleUpdaterTasks()
{
_logger.LogInformation("Scheduling Auto-Update tasks");
// Schedule update check between noon and 6pm local time
RecurringJob.AddOrUpdate("check-updates", () => CheckForUpdate(), Cron.Daily(Rnd.Next(12, 18)), TimeZoneInfo.Local);
}
#endregion
public void ScanLibrary(int libraryId, bool forceUpdate = false)
{
_logger.LogInformation("Enqueuing library scan for: {LibraryId}", libraryId);
BackgroundJob.Enqueue(() => _scannerService.ScanLibrary(libraryId));
// When we do a scan, force cache to re-unpack in case page numbers change
BackgroundJob.Enqueue(() => _cleanupService.CleanupCacheDirectory());
}
public void CleanupChapters(int[] chapterIds)
{
BackgroundJob.Enqueue(() => _cacheService.CleanupChapters(chapterIds));
}
public void RefreshMetadata(int libraryId, bool forceUpdate = true)
{
_logger.LogInformation("Enqueuing library metadata refresh for: {LibraryId}", libraryId);
BackgroundJob.Enqueue(() => _metadataService.RefreshMetadata(libraryId, forceUpdate));
}
public void RefreshSeriesMetadata(int libraryId, int seriesId, bool forceUpdate = true)
{
_logger.LogInformation("Enqueuing series metadata refresh for: {SeriesId}", seriesId);
BackgroundJob.Enqueue(() => _metadataService.RefreshMetadataForSeries(libraryId, seriesId, forceUpdate));
}
public void CleanupTemp()
{
BackgroundJob.Enqueue(() => DirectoryService.ClearDirectory(DirectoryService.TempDirectory));
}
public void ScanSeries(int libraryId, int seriesId, bool forceUpdate = false)
{
_logger.LogInformation("Enqueuing series scan for: {SeriesId}", seriesId);
BackgroundJob.Enqueue(() => _scannerService.ScanSeries(libraryId, seriesId, CancellationToken.None));
}
public void BackupDatabase()
{
BackgroundJob.Enqueue(() => _backupService.BackupDatabase());
}
/// <summary>
/// Not an external call. Only public so that we can call this for a Task
/// </summary>
// ReSharper disable once MemberCanBePrivate.Global
public async Task CheckForUpdate()
{
var update = await _versionUpdaterService.CheckForUpdate();
await _versionUpdaterService.PushUpdate(update);
}
}
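
A minimal Hangfire sketch (assuming Hangfire is already configured; the "nightly-example" job id is illustrative) of the recurring-job pattern TaskScheduler uses for "scan-libraries", "backup", and "cleanup":

using System;
using Hangfire;
public static class SchedulingSketch
{
public static void Register()
{
// AddOrUpdate registers the job, or replaces an existing job with the same id
RecurringJob.AddOrUpdate("nightly-example", () => Console.WriteLine("run"), Cron.Daily, TimeZoneInfo.Local);
// RemoveIfExists cancels it again, as CancelStatsTasks does for "report-stats"
RecurringJob.RemoveIfExists("nightly-example");
}
}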


@@ -4,204 +4,195 @@ using System.IO;
using System.IO.Compression;
using System.Linq;
using System.Threading.Tasks;
using API.Data;
using API.Entities.Enums;
using API.Extensions;
using API.Interfaces;
using API.Interfaces.Services;
using API.SignalR;
using Hangfire;
using Microsoft.AspNetCore.SignalR;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.Logging;
namespace API.Services.Tasks;
public interface IBackupService
{
Task BackupDatabase();
/// <summary>
/// Returns a list of full paths of the logs files detailed in <see cref="IConfiguration"/>.
/// </summary>
/// <param name="maxRollingFiles"></param>
/// <param name="logFileName"></param>
/// <returns></returns>
IEnumerable<string> GetLogFiles(int maxRollingFiles, string logFileName);
}
public class BackupService : IBackupService
{
private readonly IUnitOfWork _unitOfWork;
private readonly ILogger<BackupService> _logger;
private readonly IDirectoryService _directoryService;
private readonly IHubContext<MessageHub> _messageHub;
private readonly IList<string> _backupFiles;
public BackupService(ILogger<BackupService> logger, IUnitOfWork unitOfWork,
IDirectoryService directoryService, IConfiguration config, IHubContext<MessageHub> messageHub)
{
_unitOfWork = unitOfWork;
_logger = logger;
_directoryService = directoryService;
_messageHub = messageHub;
var maxRollingFiles = config.GetMaxRollingFiles();
var loggingSection = config.GetLoggingFileName();
var files = GetLogFiles(maxRollingFiles, loggingSection);
_backupFiles = new List<string>()
{
"appsettings.json",
"Hangfire.db", // This is not used atm
"Hangfire-log.db", // This is not used atm
"kavita.db",
"kavita.db-shm", // This won't always be there
"kavita.db-wal" // This won't always be there
};
foreach (var file in files.Select(f => (_directoryService.FileSystem.FileInfo.FromFileName(f)).Name).ToList())
{
_backupFiles.Add(file);
}
}
public IEnumerable<string> GetLogFiles(int maxRollingFiles, string logFileName)
{
var multipleFileRegex = maxRollingFiles > 0 ? @"\d*" : string.Empty;
var fi = _directoryService.FileSystem.FileInfo.FromFileName(logFileName);
var files = maxRollingFiles > 0
? _directoryService.GetFiles(_directoryService.LogDirectory,
$@"{_directoryService.FileSystem.Path.GetFileNameWithoutExtension(fi.Name)}{multipleFileRegex}\.log")
: new[] {"kavita.log"};
return files;
}
/// <summary>
/// Will backup anything that needs to be backed up. This includes logs, setting files, bare minimum cover images (just locked and first cover).
/// </summary>
[AutomaticRetry(Attempts = 3, LogEvents = false, OnAttemptsExceeded = AttemptsExceededAction.Fail)]
public async Task BackupDatabase()
{
_logger.LogInformation("Beginning backup of Database at {BackupTime}", DateTime.Now);
var backupDirectory = (await _unitOfWork.SettingsRepository.GetSettingAsync(ServerSettingKey.BackupDirectory)).Value;
_logger.LogDebug("Backing up to {BackupDirectory}", backupDirectory);
if (!_directoryService.ExistOrCreate(backupDirectory))
{
_logger.LogCritical("Could not write to {BackupDirectory}; aborting backup", backupDirectory);
return;
}
await SendProgress(0F);
var dateString = $"{DateTime.Now.ToShortDateString()}_{DateTime.Now.ToLongTimeString()}".Replace("/", "_").Replace(":", "_");
var zipPath = _directoryService.FileSystem.Path.Join(backupDirectory, $"kavita_backup_{dateString}.zip");
if (File.Exists(zipPath))
{
_logger.LogInformation("{ZipFile} already exists, aborting", zipPath);
return;
}
var tempDirectory = Path.Join(_directoryService.TempDirectory, dateString);
_directoryService.ExistOrCreate(tempDirectory);
_directoryService.ClearDirectory(tempDirectory);
_directoryService.CopyFilesToDirectory(
_backupFiles.Select(file => _directoryService.FileSystem.Path.Join(_directoryService.ConfigDirectory, file)).ToList(), tempDirectory);
await SendProgress(0.25F);
await CopyCoverImagesToBackupDirectory(tempDirectory);
await SendProgress(0.5F);
await CopyBookmarksToBackupDirectory(tempDirectory);
await SendProgress(0.75F);
try
{
ZipFile.CreateFromDirectory(tempDirectory, zipPath);
}
catch (AggregateException ex)
{
_logger.LogError(ex, "There was an issue when archiving library backup");
}
_directoryService.ClearAndDeleteDirectory(tempDirectory);
_logger.LogInformation("Database backup completed");
await SendProgress(1F);
}
private async Task CopyCoverImagesToBackupDirectory(string tempDirectory)
{
var outputTempDir = Path.Join(tempDirectory, "covers");
_directoryService.ExistOrCreate(outputTempDir);
try
{
var seriesImages = await _unitOfWork.SeriesRepository.GetLockedCoverImagesAsync();
_directoryService.CopyFilesToDirectory(
seriesImages.Select(s => _directoryService.FileSystem.Path.Join(_directoryService.CoverImageDirectory, s)), outputTempDir);
var collectionTags = await _unitOfWork.CollectionTagRepository.GetAllCoverImagesAsync();
_directoryService.CopyFilesToDirectory(
collectionTags.Select(s => _directoryService.FileSystem.Path.Join(_directoryService.CoverImageDirectory, s)), outputTempDir);
var chapterImages = await _unitOfWork.ChapterRepository.GetCoverImagesForLockedChaptersAsync();
_directoryService.CopyFilesToDirectory(
chapterImages.Select(s => _directoryService.FileSystem.Path.Join(_directoryService.CoverImageDirectory, s)), outputTempDir);
}
catch (IOException)
{
// Swallow the exception. This can happen when a duplicate cover is copied, as chapters and volumes can share the same file.
}
if (!_directoryService.GetFiles(outputTempDir, searchOption: SearchOption.AllDirectories).Any())
{
_directoryService.ClearAndDeleteDirectory(outputTempDir);
}
}
private async Task CopyBookmarksToBackupDirectory(string tempDirectory)
{
var bookmarkDirectory =
(await _unitOfWork.SettingsRepository.GetSettingAsync(ServerSettingKey.BookmarkDirectory)).Value;
var outputTempDir = Path.Join(tempDirectory, "bookmarks");
_directoryService.ExistOrCreate(outputTempDir);
try
{
_directoryService.CopyDirectoryToDirectory(bookmarkDirectory, outputTempDir);
}
catch (IOException)
{
// Swallow exception.
}
if (!_directoryService.GetFiles(outputTempDir, searchOption: SearchOption.AllDirectories).Any())
{
_directoryService.ClearAndDeleteDirectory(outputTempDir);
}
}
private async Task SendProgress(float progress)
{
await _messageHub.Clients.All.SendAsync(SignalREvents.BackupDatabaseProgress,
MessageFactory.BackupDatabaseProgressEvent(progress));
}
}
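For reference, BackupDatabase above is designed to run as a Hangfire job (note the [AutomaticRetry] attribute). A minimal scheduling sketch, assuming a typical Startup/Program registration; the job id and weekly cadence here are illustrative, not Kavita's actual configuration:

// Hypothetical registration; IBackupService is the interface this class implements.
RecurringJob.AddOrUpdate<IBackupService>(
    "backup-database",                   // stable job id, so re-registering updates rather than duplicates
    service => service.BackupDatabase(), // Hangfire serializes this call and applies [AutomaticRetry]
    Cron.Weekly());                      // illustrative cadence; any cron expression works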

@@ -1,7 +1,9 @@
using System.IO;
using System;
using System.IO;
using System.Linq;
using System.Threading.Tasks;
using API.Interfaces;
using API.Interfaces.Services;
using API.Data;
using API.Entities.Enums;
using API.SignalR;
using Hangfire;
using Microsoft.AspNetCore.SignalR;
@@ -9,32 +11,37 @@ using Microsoft.Extensions.Logging;
namespace API.Services.Tasks
{
public interface ICleanupService
{
Task Cleanup();
Task CleanupDbEntries();
void CleanupCacheDirectory();
Task DeleteSeriesCoverImages();
Task DeleteChapterCoverImages();
Task DeleteTagCoverImages();
Task CleanupBackups();
Task CleanupBookmarks();
}
/// <summary>
/// Cleans up after operations on a recurring basis
/// </summary>
public class CleanupService : ICleanupService
{
private readonly ICacheService _cacheService;
private readonly ILogger<CleanupService> _logger;
private readonly IBackupService _backupService;
private readonly IUnitOfWork _unitOfWork;
private readonly IHubContext<MessageHub> _messageHub;
private readonly IDirectoryService _directoryService;
public CleanupService(ICacheService cacheService, ILogger<CleanupService> logger,
IBackupService backupService, IUnitOfWork unitOfWork, IHubContext<MessageHub> messageHub)
public CleanupService(ILogger<CleanupService> logger,
IUnitOfWork unitOfWork, IHubContext<MessageHub> messageHub,
IDirectoryService directoryService)
{
_cacheService = cacheService;
_logger = logger;
_backupService = backupService;
_unitOfWork = unitOfWork;
_messageHub = messageHub;
_directoryService = directoryService;
}
public void CleanupCacheDirectory()
{
_logger.LogInformation("Cleaning cache directory");
_cacheService.Cleanup();
}
/// <summary>
/// Cleans up Temp, cache, deleted cover images, and old database backups
@@ -45,12 +52,12 @@ namespace API.Services.Tasks
_logger.LogInformation("Starting Cleanup");
await SendProgress(0F);
_logger.LogInformation("Cleaning temp directory");
DirectoryService.ClearDirectory(DirectoryService.TempDirectory);
_directoryService.ClearDirectory(_directoryService.TempDirectory);
await SendProgress(0.1F);
CleanupCacheDirectory();
await SendProgress(0.25F);
_logger.LogInformation("Cleaning old database backups");
_backupService.CleanupBackups();
await CleanupBackups();
await SendProgress(0.50F);
_logger.LogInformation("Cleaning deleted cover images");
await DeleteSeriesCoverImages();
@@ -58,49 +65,137 @@ namespace API.Services.Tasks
await DeleteChapterCoverImages();
await SendProgress(0.7F);
await DeleteTagCoverImages();
await SendProgress(0.8F);
_logger.LogInformation("Cleaning old bookmarks");
await CleanupBookmarks();
await SendProgress(1F);
_logger.LogInformation("Cleanup finished");
}
/// <summary>
/// Cleans up abandoned rows in the DB
/// </summary>
public async Task CleanupDbEntries()
{
await _unitOfWork.AppUserProgressRepository.CleanupAbandonedChapters();
await _unitOfWork.PersonRepository.RemoveAllPeopleNoLongerAssociated();
await _unitOfWork.GenreRepository.RemoveAllGenreNoLongerAssociated();
await _unitOfWork.CollectionTagRepository.RemoveTagsWithoutSeries();
}
private async Task SendProgress(float progress)
{
await _messageHub.Clients.All.SendAsync(SignalREvents.CleanupProgress,
MessageFactory.CleanupProgressEvent(progress));
}
private async Task DeleteSeriesCoverImages()
/// <summary>
/// Removes all series images that are not in the database. They must follow the <see cref="ImageService.SeriesCoverImageRegex"/> filename pattern.
/// </summary>
public async Task DeleteSeriesCoverImages()
{
var images = await _unitOfWork.SeriesRepository.GetAllCoverImagesAsync();
var files = DirectoryService.GetFiles(DirectoryService.CoverImageDirectory, ImageService.SeriesCoverImageRegex);
foreach (var file in files)
{
if (images.Contains(Path.GetFileName(file))) continue;
File.Delete(file);
}
var files = _directoryService.GetFiles(_directoryService.CoverImageDirectory, ImageService.SeriesCoverImageRegex);
_directoryService.DeleteFiles(files.Where(file => !images.Contains(_directoryService.FileSystem.Path.GetFileName(file))));
}
private async Task DeleteChapterCoverImages()
/// <summary>
/// Removes all chapter/volume images that are not in the database. They must follow the <see cref="ImageService.ChapterCoverImageRegex"/> filename pattern.
/// </summary>
public async Task DeleteChapterCoverImages()
{
var images = await _unitOfWork.ChapterRepository.GetAllCoverImagesAsync();
var files = DirectoryService.GetFiles(DirectoryService.CoverImageDirectory, ImageService.ChapterCoverImageRegex);
foreach (var file in files)
{
if (images.Contains(Path.GetFileName(file))) continue;
File.Delete(file);
}
var files = _directoryService.GetFiles(_directoryService.CoverImageDirectory, ImageService.ChapterCoverImageRegex);
_directoryService.DeleteFiles(files.Where(file => !images.Contains(_directoryService.FileSystem.Path.GetFileName(file))));
}
private async Task DeleteTagCoverImages()
/// <summary>
/// Removes all collection tag images that are not in the database. They must follow the <see cref="ImageService.CollectionTagCoverImageRegex"/> filename pattern.
/// </summary>
public async Task DeleteTagCoverImages()
{
var images = await _unitOfWork.CollectionTagRepository.GetAllCoverImagesAsync();
var files = DirectoryService.GetFiles(DirectoryService.CoverImageDirectory, ImageService.CollectionTagCoverImageRegex);
foreach (var file in files)
{
if (images.Contains(Path.GetFileName(file))) continue;
File.Delete(file);
var files = _directoryService.GetFiles(_directoryService.CoverImageDirectory, ImageService.CollectionTagCoverImageRegex);
_directoryService.DeleteFiles(files.Where(file => !images.Contains(_directoryService.FileSystem.Path.GetFileName(file))));
}
/// <summary>
/// Removes all files and directories in the cache directory
/// </summary>
public void CleanupCacheDirectory()
{
_logger.LogInformation("Performing cleanup of Cache directory");
_directoryService.ExistOrCreate(_directoryService.CacheDirectory);
try
{
_directoryService.ClearDirectory(_directoryService.CacheDirectory);
}
catch (Exception ex)
{
_logger.LogError(ex, "There was an issue deleting one or more folders/files during cleanup");
}
_logger.LogInformation("Cache directory purged");
}
/// <summary>
/// Removes Database backups older than 30 days. If all backups are older than 30 days, the latest is kept.
/// </summary>
public async Task CleanupBackups()
{
const int dayThreshold = 30;
_logger.LogInformation("Beginning cleanup of Database backups at {Time}", DateTime.Now);
var backupDirectory =
(await _unitOfWork.SettingsRepository.GetSettingAsync(ServerSettingKey.BackupDirectory)).Value;
if (!_directoryService.Exists(backupDirectory)) return;
var deltaTime = DateTime.Today.Subtract(TimeSpan.FromDays(dayThreshold));
var allBackups = _directoryService.GetFiles(backupDirectory).ToList();
var expiredBackups = allBackups.Select(filename => _directoryService.FileSystem.FileInfo.FromFileName(filename))
.Where(f => f.CreationTime < deltaTime)
.ToList();
if (expiredBackups.Count == allBackups.Count)
{
_logger.LogInformation("All expired backups are older than {Threshold} days. Removing all but last backup", dayThreshold);
var toDelete = expiredBackups.OrderByDescending(f => f.CreationTime).ToList();
_directoryService.DeleteFiles(toDelete.Skip(1).Select(f => f.FullName)); // Skip the newest backup so the latest one survives
}
else
{
_directoryService.DeleteFiles(expiredBackups.Select(f => f.FullName));
}
_logger.LogInformation("Finished cleanup of Database backups at {Time}", DateTime.Now);
}
/// <summary>
/// Removes all files in the BookmarkDirectory that don't currently have bookmarks in the Database
/// </summary>
public async Task CleanupBookmarks()
{
// Find all files under bookmarks/ that are not backed by a bookmark in the database and delete them
var bookmarkDirectory =
(await _unitOfWork.SettingsRepository.GetSettingAsync(ServerSettingKey.BookmarkDirectory)).Value;
var allBookmarkFiles = _directoryService.GetFiles(bookmarkDirectory, searchOption: SearchOption.AllDirectories).Select(Parser.Parser.NormalizePath);
var bookmarks = (await _unitOfWork.UserRepository.GetAllBookmarksAsync())
.Select(b => Parser.Parser.NormalizePath(_directoryService.FileSystem.Path.Join(bookmarkDirectory,
b.FileName)));
var filesToDelete = allBookmarkFiles.ToList().Except(bookmarks).ToList();
_logger.LogDebug("[Bookmarks] Bookmark cleanup wants to delete {Count} files", filesToDelete.Count());
_directoryService.DeleteFiles(filesToDelete);
// Clear all empty directories
foreach (var directory in _directoryService.FileSystem.Directory.GetDirectories(bookmarkDirectory))
{
if (_directoryService.FileSystem.Directory.GetFiles(directory).Length == 0 &&
_directoryService.FileSystem.Directory.GetDirectories(directory).Length == 0)
{
_directoryService.FileSystem.Directory.Delete(directory, false);
}
}
}
}
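As a sanity check on the backup-retention branch above, here is a self-contained sketch of the keep-the-newest selection using plain tuples in place of IFileInfo; the file names and dates are made up:

using System;
using System.Collections.Generic;
using System.Linq;

var backups = new List<(string Name, DateTime CreationTime)>
{
    ("kavita_backup_a.zip", DateTime.Today.AddDays(-90)),
    ("kavita_backup_b.zip", DateTime.Today.AddDays(-60)),
    ("kavita_backup_c.zip", DateTime.Today.AddDays(-45)),
};
// Newest first; Skip(1) keeps the most recent backup and marks the rest for deletion.
var toDelete = backups.OrderByDescending(b => b.CreationTime).Skip(1).ToList();
Console.WriteLine(string.Join(", ", toDelete.Select(b => b.Name)));
// prints: kavita_backup_b.zip, kavita_backup_a.zip (kavita_backup_c.zip survives)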

@@ -6,7 +6,7 @@ using System.IO;
using System.Linq;
using API.Entities;
using API.Entities.Enums;
using API.Interfaces.Services;
using API.Helpers;
using API.Parser;
using Microsoft.Extensions.Logging;
@@ -23,34 +23,46 @@ namespace API.Services.Tasks.Scanner
public class ParseScannedFiles
{
private readonly ConcurrentDictionary<ParsedSeries, List<ParserInfo>> _scannedSeries;
private readonly IBookService _bookService;
private readonly ILogger _logger;
private readonly IDirectoryService _directoryService;
private readonly IReadingItemService _readingItemService;
private readonly DefaultParser _defaultParser;
/// <summary>
/// An instance of a pipeline for processing files and returning a Map of Series -> ParserInfos.
/// Each instance is isolated from other threads, so there is no crossover between scans.
/// </summary>
/// <param name="bookService"></param>
/// <param name="logger"></param>
public ParseScannedFiles(IBookService bookService, ILogger logger)
/// <param name="logger">Logger of the parent class that invokes this</param>
/// <param name="directoryService">Directory Service</param>
/// <param name="readingItemService">ReadingItemService Service for extracting information on a number of formats</param>
public ParseScannedFiles(ILogger logger, IDirectoryService directoryService,
IReadingItemService readingItemService)
{
_bookService = bookService;
_logger = logger;
_directoryService = directoryService;
_readingItemService = readingItemService;
_scannedSeries = new ConcurrentDictionary<ParsedSeries, List<ParserInfo>>();
_defaultParser = new DefaultParser(_directoryService);
}
/// <summary>
/// Gets the list of parserInfos given a Series. If the series does not exist within, return empty list.
/// Gets the list of all parserInfos for a given Series (matching on Name, LocalizedName, and OriginalName). If the series does not exist within the map, an empty list is returned.
/// </summary>
/// <param name="parsedSeries"></param>
/// <param name="series"></param>
/// <returns></returns>
public static IList<ParserInfo> GetInfosByName(Dictionary<ParsedSeries, List<ParserInfo>> parsedSeries, Series series)
{
var existingKey = parsedSeries.Keys.FirstOrDefault(ps =>
ps.Format == series.Format && ps.NormalizedName.Equals(Parser.Parser.Normalize(series.OriginalName)));
var allKeys = parsedSeries.Keys.Where(ps =>
SeriesHelper.FindSeries(series, ps));
return existingKey != null ? parsedSeries[existingKey] : new List<ParserInfo>();
var infos = new List<ParserInfo>();
foreach (var key in allKeys)
{
infos.AddRange(parsedSeries[key]);
}
return infos;
}
/// <summary>
@@ -62,20 +74,12 @@ namespace API.Services.Tasks.Scanner
/// <param name="type">Library type to determine parsing to perform</param>
private void ProcessFile(string path, string rootPath, LibraryType type)
{
ParserInfo info;
// TODO: Emit event with what is being processed. It can look like Kavita isn't doing anything during file scan
if (Parser.Parser.IsEpub(path))
{
info = _bookService.ParseInfo(path);
}
else
{
info = Parser.Parser.Parse(path, rootPath, type);
}
// If we couldn't match, log. But don't log if the file parses as a cover image
var info = _readingItemService.Parse(path, rootPath, type);
if (info == null)
{
// If the file is an image and literally a cover image, skip processing.
if (!(Parser.Parser.IsImage(path) && Parser.Parser.IsCoverImage(path)))
{
_logger.LogWarning("[Scanner] Could not parse series from {Path}", path);
@@ -83,16 +87,36 @@ namespace API.Services.Tasks.Scanner
return;
}
if (Parser.Parser.IsEpub(path) && Parser.Parser.ParseVolume(info.Series) != Parser.Parser.DefaultVolume)
// This catches when original library type is Manga/Comic and when parsing with non
if (Parser.Parser.IsEpub(path) && Parser.Parser.ParseVolume(info.Series) != Parser.Parser.DefaultVolume) // Shouldn't this be info.Volume != DefaultVolume?
{
info = Parser.Parser.Parse(path, rootPath, type);
var info2 = _bookService.ParseInfo(path);
info = _defaultParser.Parse(path, rootPath, LibraryType.Book);
var info2 = _readingItemService.Parse(path, rootPath, type);
info.Merge(info2);
}
info.ComicInfo = _readingItemService.GetComicInfo(path);
if (info.ComicInfo != null)
{
if (!string.IsNullOrEmpty(info.ComicInfo.Volume))
{
info.Volumes = info.ComicInfo.Volume;
}
if (!string.IsNullOrEmpty(info.ComicInfo.Series))
{
info.Series = info.ComicInfo.Series;
}
if (!string.IsNullOrEmpty(info.ComicInfo.Number))
{
info.Chapters = info.ComicInfo.Number;
}
}
TrackSeries(info);
}
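For context, a minimal ComicInfo.xml that would exercise the three overrides above; the values are made up, and only Series, Number, and Volume are consulted in this code path:

<?xml version="1.0" encoding="utf-8"?>
<ComicInfo>
  <Series>Example Series</Series> <!-- overrides info.Series -->
  <Number>12</Number>             <!-- overrides info.Chapters -->
  <Volume>3</Volume>              <!-- overrides info.Volumes -->
</ComicInfo>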
/// <summary>
/// Attempts to either add a new series mapping to the _scannedSeries bag or add to an existing one.
/// This will check if the name matches an existing series name (multiple fields) <see cref="MergeName"/>
@@ -133,7 +157,7 @@ namespace API.Services.Tasks.Scanner
/// same normalized name, it merges into the existing one. This is important, as some manga names differ only slightly in punctuation or capitalization.
/// </summary>
/// <param name="info"></param>
/// <returns></returns>
/// <returns>Series Name to group this info into</returns>
public string MergeName(ParserInfo info)
{
var normalizedSeries = Parser.Parser.Normalize(info.Series);
@@ -161,12 +185,11 @@
{
var sw = Stopwatch.StartNew();
totalFiles = 0;
var searchPattern = GetLibrarySearchPattern();
foreach (var folderPath in folders)
{
try
{
totalFiles += DirectoryService.TraverseTreeParallelForEach(folderPath, (f) =>
totalFiles += _directoryService.TraverseTreeParallelForEach(folderPath, (f) =>
{
try
{
@@ -176,7 +199,7 @@
{
_logger.LogError(exception, "The file {Filename} could not be found", f);
}
}, searchPattern, _logger);
}, Parser.Parser.SupportedExtensions, _logger);
}
catch (ArgumentException ex)
{
@@ -191,11 +214,6 @@
return SeriesWithInfos();
}
private static string GetLibrarySearchPattern()
{
return Parser.Parser.SupportedExtensions;
}
/// <summary>
/// Returns any series where there were parsed infos
/// </summary>

File diff suppressed because it is too large

@@ -2,108 +2,111 @@
using System.Net.Http;
using System.Runtime.InteropServices;
using System.Threading.Tasks;
using API.Data;
using API.DTOs.Stats;
using API.Entities.Enums;
using API.Interfaces;
using API.Interfaces.Services;
using Flurl.Http;
using Kavita.Common.EnvironmentInfo;
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.Logging;
namespace API.Services.Tasks
namespace API.Services.Tasks;
public interface IStatsService
{
public class StatsService : IStatsService
Task Send();
Task<ServerInfoDto> GetServerInfo();
}
public class StatsService : IStatsService
{
private readonly ILogger<StatsService> _logger;
private readonly IUnitOfWork _unitOfWork;
private const string ApiUrl = "https://stats.kavitareader.com";
public StatsService(ILogger<StatsService> logger, IUnitOfWork unitOfWork)
{
private readonly ILogger<StatsService> _logger;
private readonly IUnitOfWork _unitOfWork;
private const string ApiUrl = "https://stats.kavitareader.com";
_logger = logger;
_unitOfWork = unitOfWork;
public StatsService(ILogger<StatsService> logger, IUnitOfWork unitOfWork)
FlurlHttp.ConfigureClient(ApiUrl, cli =>
cli.Settings.HttpClientFactory = new UntrustedCertClientFactory());
}
/// <summary>
/// Due to all instances firing this at the same time, we could DDoS our own server. When fired, this task schedules the send
/// to run at a random point over a 6-hour spread
/// </summary>
public async Task Send()
{
var allowStatCollection = (await _unitOfWork.SettingsRepository.GetSettingsDtoAsync()).AllowStatCollection;
if (!allowStatCollection)
{
_logger = logger;
_unitOfWork = unitOfWork;
FlurlHttp.ConfigureClient(ApiUrl, cli =>
cli.Settings.HttpClientFactory = new UntrustedCertClientFactory());
return;
}
/// <summary>
/// Due to all instances firing this at the same time, we could DDoS our own server. When fired, this task schedules the send
/// to run at a random point over a 6-hour spread
/// </summary>
public async Task Send()
await SendData();
}
/// <summary>
/// This must be public for Hangfire. Do not call this directly.
/// </summary>
// ReSharper disable once MemberCanBePrivate.Global
public async Task SendData()
{
var data = await GetServerInfo();
await SendDataToStatsServer(data);
}
private async Task SendDataToStatsServer(ServerInfoDto data)
{
var responseContent = string.Empty;
try
{
var allowStatCollection = (await _unitOfWork.SettingsRepository.GetSettingsDtoAsync()).AllowStatCollection;
if (!allowStatCollection)
var response = await (ApiUrl + "/api/v2/stats")
.WithHeader("Accept", "application/json")
.WithHeader("User-Agent", "Kavita")
.WithHeader("x-api-key", "MsnvA2DfQqxSK5jh")
.WithHeader("x-kavita-version", BuildInfo.Version)
.WithHeader("Content-Type", "application/json")
.WithTimeout(TimeSpan.FromSeconds(30))
.PostJsonAsync(data);
if (response.StatusCode != StatusCodes.Status200OK)
{
return;
}
await SendData();
}
/// <summary>
/// This must be public for Hangfire. Do not call this directly.
/// </summary>
// ReSharper disable once MemberCanBePrivate.Global
public async Task SendData()
{
var data = await GetServerInfo();
await SendDataToStatsServer(data);
}
private async Task SendDataToStatsServer(ServerInfoDto data)
{
var responseContent = string.Empty;
try
{
var response = await (ApiUrl + "/api/v2/stats")
.WithHeader("Accept", "application/json")
.WithHeader("User-Agent", "Kavita")
.WithHeader("x-api-key", "MsnvA2DfQqxSK5jh")
.WithHeader("x-kavita-version", BuildInfo.Version)
.WithHeader("Content-Type", "application/json")
.WithTimeout(TimeSpan.FromSeconds(30))
.PostJsonAsync(data);
if (response.StatusCode != StatusCodes.Status200OK)
{
_logger.LogError("KavitaStats did not respond successfully. {Content}", response);
}
}
catch (HttpRequestException e)
{
var info = new
{
dataSent = data,
response = responseContent
};
_logger.LogError(e, "KavitaStats did not respond successfully. {Content}", info);
}
catch (Exception e)
{
_logger.LogError(e, "An error happened during the request to KavitaStats");
_logger.LogError("KavitaStats did not respond successfully. {Content}", response);
}
}
public async Task<ServerInfoDto> GetServerInfo()
catch (HttpRequestException e)
{
var installId = await _unitOfWork.SettingsRepository.GetSettingAsync(ServerSettingKey.InstallId);
var serverInfo = new ServerInfoDto
var info = new
{
InstallId = installId.Value,
Os = RuntimeInformation.OSDescription,
KavitaVersion = BuildInfo.Version.ToString(),
DotnetVersion = Environment.Version.ToString(),
IsDocker = new OsInfo(Array.Empty<IOsVersionAdapter>()).IsDocker,
NumOfCores = Math.Max(Environment.ProcessorCount, 1)
dataSent = data,
response = responseContent
};
return serverInfo;
_logger.LogError(e, "KavitaStats did not respond successfully. {Content}", info);
}
catch (Exception e)
{
_logger.LogError(e, "An error happened during the request to KavitaStats");
}
}
public async Task<ServerInfoDto> GetServerInfo()
{
var installId = await _unitOfWork.SettingsRepository.GetSettingAsync(ServerSettingKey.InstallId);
var serverInfo = new ServerInfoDto
{
InstallId = installId.Value,
Os = RuntimeInformation.OSDescription,
KavitaVersion = BuildInfo.Version.ToString(),
DotnetVersion = Environment.Version.ToString(),
IsDocker = new OsInfo(Array.Empty<IOsVersionAdapter>()).IsDocker,
NumOfCores = Math.Max(Environment.ProcessorCount, 1)
};
return serverInfo;
}
}
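Since PostJsonAsync serializes the ServerInfoDto shown above, the stats payload on the wire looks roughly like the following; the values are illustrative and the property casing depends on the serializer settings:

{
  "installId": "a1b2c3d4-0000-0000-0000-000000000000",
  "os": "Linux 5.10.0",
  "kavitaVersion": "0.5.0.0",
  "dotnetVersion": "6.0.0",
  "isDocker": true,
  "numOfCores": 4
}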

@@ -4,7 +4,6 @@ using System.Linq;
using System.Net.Http;
using System.Threading.Tasks;
using API.DTOs.Update;
using API.Interfaces.Services;
using API.SignalR;
using API.SignalR.Presence;
using Flurl.Http;
@@ -15,159 +14,155 @@ using Microsoft.AspNetCore.SignalR;
using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.Logging;
namespace API.Services.Tasks
namespace API.Services.Tasks;
internal class GithubReleaseMetadata
{
internal class GithubReleaseMetadata
{
/// <summary>
/// Name of the Tag
/// <example>v0.4.3</example>
/// </summary>
// ReSharper disable once InconsistentNaming
public string Tag_Name { get; init; }
/// <summary>
/// Name of the Release
/// </summary>
public string Name { get; init; }
/// <summary>
/// Body of the Release
/// </summary>
public string Body { get; init; }
/// <summary>
/// Url of the release on Github
/// </summary>
// ReSharper disable once InconsistentNaming
public string Html_Url { get; init; }
/// <summary>
/// Date Release was Published
/// </summary>
// ReSharper disable once InconsistentNaming
public string Published_At { get; init; }
}
/// <summary>
/// Name of the Tag
/// <example>v0.4.3</example>
/// </summary>
// ReSharper disable once InconsistentNaming
public string Tag_Name { get; init; }
/// <summary>
/// Name of the Release
/// </summary>
public string Name { get; init; }
/// <summary>
/// Body of the Release
/// </summary>
public string Body { get; init; }
/// <summary>
/// Url of the release on Github
/// </summary>
// ReSharper disable once InconsistentNaming
public string Html_Url { get; init; }
/// <summary>
/// Date Release was Published
/// </summary>
// ReSharper disable once InconsistentNaming
public string Published_At { get; init; }
}
public class UntrustedCertClientFactory : DefaultHttpClientFactory
{
public override HttpMessageHandler CreateMessageHandler() {
return new HttpClientHandler {
ServerCertificateCustomValidationCallback = (_, _, _, _) => true
};
}
}
public class VersionUpdaterService : IVersionUpdaterService
{
private readonly ILogger<VersionUpdaterService> _logger;
private readonly IHubContext<MessageHub> _messageHub;
private readonly IPresenceTracker _tracker;
private readonly Markdown _markdown = new MarkdownDeep.Markdown();
#pragma warning disable S1075
private static readonly string GithubLatestReleasesUrl = "https://api.github.com/repos/Kareadita/Kavita/releases/latest";
private static readonly string GithubAllReleasesUrl = "https://api.github.com/repos/Kareadita/Kavita/releases";
#pragma warning restore S1075
public VersionUpdaterService(ILogger<VersionUpdaterService> logger, IHubContext<MessageHub> messageHub, IPresenceTracker tracker)
{
_logger = logger;
_messageHub = messageHub;
_tracker = tracker;
FlurlHttp.ConfigureClient(GithubLatestReleasesUrl, cli =>
cli.Settings.HttpClientFactory = new UntrustedCertClientFactory());
FlurlHttp.ConfigureClient(GithubAllReleasesUrl, cli =>
cli.Settings.HttpClientFactory = new UntrustedCertClientFactory());
}
/// <summary>
/// Fetches the latest release from Github
/// </summary>
public async Task<UpdateNotificationDto> CheckForUpdate()
{
var update = await GetGithubRelease();
return CreateDto(update);
}
/// <summary>
/// Fetches all releases from Github
/// </summary>
public async Task<IEnumerable<UpdateNotificationDto>> GetAllReleases()
{
var updates = await GetGithubReleases();
return updates.Select(CreateDto);
}
private UpdateNotificationDto CreateDto(GithubReleaseMetadata update)
{
if (update == null || string.IsNullOrEmpty(update.Tag_Name)) return null;
var updateVersion = new Version(update.Tag_Name.Replace("v", string.Empty));
var currentVersion = BuildInfo.Version.ToString();
if (updateVersion.Revision == -1)
{
currentVersion = currentVersion.Substring(0, currentVersion.LastIndexOf(".", StringComparison.Ordinal));
}
return new UpdateNotificationDto()
{
CurrentVersion = currentVersion,
UpdateVersion = updateVersion.ToString(),
UpdateBody = _markdown.Transform(update.Body.Trim()),
UpdateTitle = update.Name,
UpdateUrl = update.Html_Url,
IsDocker = new OsInfo(Array.Empty<IOsVersionAdapter>()).IsDocker,
PublishDate = update.Published_At
};
}
public async Task PushUpdate(UpdateNotificationDto update)
{
if (update == null) return;
var admins = await _tracker.GetOnlineAdmins();
var updateVersion = new Version(update.CurrentVersion);
if (BuildInfo.Version < updateVersion)
{
_logger.LogInformation("Server is out of date. Current: {CurrentVersion}. Available: {AvailableUpdate}", BuildInfo.Version, updateVersion);
await SendEvent(update, admins);
}
else if (Environment.GetEnvironmentVariable("ASPNETCORE_ENVIRONMENT") == Environments.Development)
{
_logger.LogInformation("Server is up to date. Current: {CurrentVersion}", BuildInfo.Version);
await SendEvent(update, admins);
}
}
private async Task SendEvent(UpdateNotificationDto update, IReadOnlyList<string> admins)
{
var connections = new List<string>();
foreach (var admin in admins)
{
connections.AddRange(await _tracker.GetConnectionsForUser(admin));
}
await _messageHub.Clients.Users(admins).SendAsync(SignalREvents.UpdateVersion, MessageFactory.UpdateVersionEvent(update));
}
private static async Task<GithubReleaseMetadata> GetGithubRelease()
{
var update = await GithubLatestReleasesUrl
.WithHeader("Accept", "application/json")
.WithHeader("User-Agent", "Kavita")
.GetJsonAsync<GithubReleaseMetadata>();
return update;
}
private static async Task<IEnumerable<GithubReleaseMetadata>> GetGithubReleases()
{
var update = await GithubAllReleasesUrl
.WithHeader("Accept", "application/json")
.WithHeader("User-Agent", "Kavita")
.GetJsonAsync<IEnumerable<GithubReleaseMetadata>>();
return update;
}
public class UntrustedCertClientFactory : DefaultHttpClientFactory
{
public override HttpMessageHandler CreateMessageHandler() {
return new HttpClientHandler {
ServerCertificateCustomValidationCallback = (_, _, _, _) => true
};
}
}
public interface IVersionUpdaterService
{
Task<UpdateNotificationDto> CheckForUpdate();
Task PushUpdate(UpdateNotificationDto update);
Task<IEnumerable<UpdateNotificationDto>> GetAllReleases();
}
public class VersionUpdaterService : IVersionUpdaterService
{
private readonly ILogger<VersionUpdaterService> _logger;
private readonly IHubContext<MessageHub> _messageHub;
private readonly IPresenceTracker _tracker;
private readonly Markdown _markdown = new MarkdownDeep.Markdown();
#pragma warning disable S1075
private static readonly string GithubLatestReleasesUrl = "https://api.github.com/repos/Kareadita/Kavita/releases/latest";
private static readonly string GithubAllReleasesUrl = "https://api.github.com/repos/Kareadita/Kavita/releases";
#pragma warning restore S1075
public VersionUpdaterService(ILogger<VersionUpdaterService> logger, IHubContext<MessageHub> messageHub, IPresenceTracker tracker)
{
_logger = logger;
_messageHub = messageHub;
_tracker = tracker;
FlurlHttp.ConfigureClient(GithubLatestReleasesUrl, cli =>
cli.Settings.HttpClientFactory = new UntrustedCertClientFactory());
FlurlHttp.ConfigureClient(GithubAllReleasesUrl, cli =>
cli.Settings.HttpClientFactory = new UntrustedCertClientFactory());
}
/// <summary>
/// Fetches the latest release from Github
/// </summary>
public async Task<UpdateNotificationDto> CheckForUpdate()
{
var update = await GetGithubRelease();
return CreateDto(update);
}
public async Task<IEnumerable<UpdateNotificationDto>> GetAllReleases()
{
var updates = await GetGithubReleases();
return updates.Select(CreateDto);
}
private UpdateNotificationDto CreateDto(GithubReleaseMetadata update)
{
if (update == null || string.IsNullOrEmpty(update.Tag_Name)) return null;
var updateVersion = new Version(update.Tag_Name.Replace("v", string.Empty));
var currentVersion = BuildInfo.Version.ToString();
if (updateVersion.Revision == -1)
{
currentVersion = currentVersion.Substring(0, currentVersion.LastIndexOf(".", StringComparison.Ordinal));
}
return new UpdateNotificationDto()
{
CurrentVersion = currentVersion,
UpdateVersion = updateVersion.ToString(),
UpdateBody = _markdown.Transform(update.Body.Trim()),
UpdateTitle = update.Name,
UpdateUrl = update.Html_Url,
IsDocker = new OsInfo(Array.Empty<IOsVersionAdapter>()).IsDocker,
PublishDate = update.Published_At
};
}
public async Task PushUpdate(UpdateNotificationDto update)
{
if (update == null) return;
var admins = await _tracker.GetOnlineAdmins();
var updateVersion = new Version(update.CurrentVersion);
if (BuildInfo.Version < updateVersion)
{
_logger.LogInformation("Server is out of date. Current: {CurrentVersion}. Available: {AvailableUpdate}", BuildInfo.Version, updateVersion);
await SendEvent(update, admins);
}
else if (Environment.GetEnvironmentVariable("ASPNETCORE_ENVIRONMENT") == Environments.Development)
{
_logger.LogInformation("Server is up to date. Current: {CurrentVersion}", BuildInfo.Version);
await SendEvent(update, admins);
}
}
private async Task SendEvent(UpdateNotificationDto update, IReadOnlyList<string> admins)
{
await _messageHub.Clients.Users(admins).SendAsync(SignalREvents.UpdateAvailable, MessageFactory.UpdateVersionEvent(update));
}
private static async Task<GithubReleaseMetadata> GetGithubRelease()
{
var update = await GithubLatestReleasesUrl
.WithHeader("Accept", "application/json")
.WithHeader("User-Agent", "Kavita")
.GetJsonAsync<GithubReleaseMetadata>();
return update;
}
private static async Task<IEnumerable<GithubReleaseMetadata>> GetGithubReleases()
{
var update = await GithubAllReleasesUrl
.WithHeader("Accept", "application/json")
.WithHeader("User-Agent", "Kavita")
.GetJsonAsync<IEnumerable<GithubReleaseMetadata>>();
return update;
}
}
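A small worked example of the Revision handling in CreateDto above: GitHub tags are three-part (e.g. v0.4.3), so Version.Revision parses as -1 and the four-part build version is trimmed before comparison. The values here are illustrative:

using System;

var updateVersion = new Version("v0.4.3".Replace("v", string.Empty)); // 0.4.3, Revision == -1
var currentVersion = "0.4.3.12";                                      // four-part assembly version
if (updateVersion.Revision == -1)
{
    // Drop the trailing ".12" so the strings compare like-for-like.
    currentVersion = currentVersion.Substring(0, currentVersion.LastIndexOf(".", StringComparison.Ordinal));
}
Console.WriteLine($"{currentVersion} vs {updateVersion}"); // prints: 0.4.3 vs 0.4.3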

@@ -6,51 +6,54 @@ using System.Security.Claims;
using System.Text;
using System.Threading.Tasks;
using API.Entities;
using API.Interfaces.Services;
using Microsoft.AspNetCore.Identity;
using Microsoft.Extensions.Configuration;
using Microsoft.IdentityModel.Tokens;
using JwtRegisteredClaimNames = Microsoft.IdentityModel.JsonWebTokens.JwtRegisteredClaimNames;
namespace API.Services
namespace API.Services;
public interface ITokenService
{
public class TokenService : ITokenService
Task<string> CreateToken(AppUser user);
}
public class TokenService : ITokenService
{
private readonly UserManager<AppUser> _userManager;
private readonly SymmetricSecurityKey _key;
public TokenService(IConfiguration config, UserManager<AppUser> userManager)
{
private readonly UserManager<AppUser> _userManager;
private readonly SymmetricSecurityKey _key;
public TokenService(IConfiguration config, UserManager<AppUser> userManager)
{
_userManager = userManager;
_key = new SymmetricSecurityKey(Encoding.UTF8.GetBytes(config["TokenKey"]));
}
public async Task<string> CreateToken(AppUser user)
{
var claims = new List<Claim>
{
new Claim(JwtRegisteredClaimNames.NameId, user.UserName)
};
var roles = await _userManager.GetRolesAsync(user);
claims.AddRange(roles.Select(role => new Claim(ClaimTypes.Role, role)));
var creds = new SigningCredentials(_key, SecurityAlgorithms.HmacSha512Signature);
var tokenDescriptor = new SecurityTokenDescriptor()
{
Subject = new ClaimsIdentity(claims),
Expires = DateTime.Now.AddDays(7),
SigningCredentials = creds
};
var tokenHandler = new JwtSecurityTokenHandler();
var token = tokenHandler.CreateToken(tokenDescriptor);
return tokenHandler.WriteToken(token);
}
_userManager = userManager;
_key = new SymmetricSecurityKey(Encoding.UTF8.GetBytes(config["TokenKey"]));
}
}
public async Task<string> CreateToken(AppUser user)
{
var claims = new List<Claim>
{
new Claim(JwtRegisteredClaimNames.NameId, user.UserName)
};
var roles = await _userManager.GetRolesAsync(user);
claims.AddRange(roles.Select(role => new Claim(ClaimTypes.Role, role)));
var creds = new SigningCredentials(_key, SecurityAlgorithms.HmacSha512Signature);
var tokenDescriptor = new SecurityTokenDescriptor()
{
Subject = new ClaimsIdentity(claims),
Expires = DateTime.Now.AddDays(7),
SigningCredentials = creds
};
var tokenHandler = new JwtSecurityTokenHandler();
var token = tokenHandler.CreateToken(tokenDescriptor);
return tokenHandler.WriteToken(token);
}
}
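For completeness, whatever consumes these tokens must validate with the same symmetric key and algorithm. A minimal sketch of matching TokenValidationParameters, assuming a standard JwtBearer setup (Kavita's actual Startup wiring is not part of this diff); config is an IConfiguration holding the same TokenKey:

using System.Text;
using Microsoft.IdentityModel.Tokens;

var validationParameters = new TokenValidationParameters
{
    ValidateIssuerSigningKey = true,
    IssuerSigningKey = new SymmetricSecurityKey(Encoding.UTF8.GetBytes(config["TokenKey"])),
    ValidateIssuer = false,   // CreateToken sets no issuer
    ValidateAudience = false  // and no audience
};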

@@ -1,43 +0,0 @@
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using API.Interfaces.Services;
using Microsoft.Extensions.DependencyInjection;
namespace API.Services
{
public class WarmupServicesStartupTask : IStartupTask
{
private readonly IServiceCollection _services;
private readonly IServiceProvider _provider;
public WarmupServicesStartupTask(IServiceCollection services, IServiceProvider provider)
{
_services = services;
_provider = provider;
}
public Task ExecuteAsync(CancellationToken cancellationToken = default)
{
using var scope = _provider.CreateScope();
foreach (var singleton in GetServices(_services))
{
Console.WriteLine("DI preloading of " + singleton.FullName);
scope.ServiceProvider.GetServices(singleton);
}
return Task.CompletedTask;
}
static IEnumerable<Type> GetServices(IServiceCollection services)
{
return services
.Where(descriptor => descriptor.ImplementationType != typeof(WarmupServicesStartupTask))
.Where(descriptor => !descriptor.ServiceType.ContainsGenericParameters)
.Select(descriptor => descriptor.ServiceType)
.Distinct();
}
}
}