v0.6.0 - Polish, Polish, Polish + Send To Support! (#1604)

* New Scan Loop (#1447)

* Staging the code for the new scan loop.

* Implemented a basic version of changes on drives triggering the scan loop. Issues: 1. Scan by folder does not work, 2. The queuing system is very hacky and needs a separate thread, 3. Performance degradation could be very real.

* Started writing unit tests for the new loop code

* Implemented a basic method to scan a folder path with ignore support (not yet implemented; code in place)

* Added some code to the parser to build out the idea of processing series in batches based on some top level folder.

* Scan Series now uses the new code (folder based parsing) and now handles the LocalizedSeries issue.

* Got library scan working with the new folder-based scan loop. Updated code to set FolderPath (for improved scan times and partial scan support).

* Wrote some notes on update library scan loop.

* Removed migration for merge

* Reapplied the SeriesFolder migration after merge

* Refactored a check that used multiple db calls into one.

* Made lots of progress on ignore support, but some confusion on underlying library. Ticket created. On hold till then.

* Updated Scan Library and Scan Series to exit early if no changes are on the underlying folders that need to be scanned.

* Implemented the ability to have .kavitaignore files within your directories and Kavita will parse them and ignore files and directories based on rules within them.

* Fixed an issue where ignore files nested wouldn't stack with higher level ignores
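
The stacking behavior can be pictured with a short sketch. This is illustrative only, assuming a gitignore-style rule matcher such as the `Ignore` NuGet package (the log above notes confusion about the underlying library, so treat the API as an assumption):

```csharp
using System.Collections.Generic;
using System.IO;
using System.Linq;

public static class KavitaIgnoreSketch
{
    // Collects .kavitaignore rules from the library root down to the target
    // folder, so nested ignore files stack on top of higher-level ones.
    public static Ignore.Ignore BuildIgnoreRules(string libraryRoot, string folder)
    {
        var rules = new Ignore.Ignore();
        var ignoreFiles = new Stack<string>();

        // Walk upward, remembering every .kavitaignore between folder and root.
        for (var dir = new DirectoryInfo(folder);
             dir != null && dir.FullName.StartsWith(libraryRoot);
             dir = dir.Parent)
        {
            var ignoreFile = Path.Combine(dir.FullName, ".kavitaignore");
            if (File.Exists(ignoreFile)) ignoreFiles.Push(ignoreFile);
        }

        // The stack pops root-most first, so deeper rules are added last and stack on top.
        foreach (var file in ignoreFiles)
        foreach (var rule in File.ReadLines(file).Where(l => !string.IsNullOrWhiteSpace(l)))
        {
            rules.Add(rule);
        }

        return rules;
    }
}
```

`rules.IsIgnored(relativePath)` would then decide whether the scanner skips a given file or directory.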

* Wrote out some basic code that showcases how we can scan series or library based on file events on the underlying system. Very buggy; needs lots of edge case testing, logging, and duplication checking.

* Things are working kinda. I'm getting lost in my own code and complexity. I'm not sure it's worth it.

* Refactored ScanFiles out to Directory Service.

* Refactored more code out to keep the code clean.

* More unit tests

* Refactored the signature of ParsedSeries to use IList. Started writing unit tests and reworked the UpdateLibrary to work how it used to with new scan loop code (note: using async update library/series does not work).

* Fixed the bug where processSeriesInfos was being invoked twice per series and made the code work very similarly to the old code (except loose-leaf files don't work), but with folder-based scanning.

* Prep for unit tests (updating broken ones with new implementations)

* Just some notes. Not sure I want to finish this work.

* Refactored the LibraryWatcher with some comments and state variables.

* Undid the migrations in case I don't move forward with this branch

* Started to clean the code and prepare for finishing this work.

* Fixed a bad merge

* Updated signatures to cleanup the code and commit to the new strategy for scanning.

* Swapped out the code with async processing of series on a small library

* The new scan loop is working in both Sync and Async methods. The code is slow and not optimized. This represents a good point to start profiling and applying optimizations.

* Refactored UpdateSeries out of Scanner and into a dedicated file.

* Refactored how ProcessTasks are awaited to allow more async

* Fixed an issue where side nav item wouldn't show correct highlight and migrated to OnPush

* Moved where we start the stopwatch to encapsulate the full scan

* Cleaned up SignalR events to report correctly (still needs a redesign)

* Remove the "remove" code until I figure it out

* Put in extremely expensive series deletion code for library scan.

* Have Genre and Tag update the DB immediately to avoid dup issues

* Taking a break

* Moving to a lock with People was successful. Need to apply to others.

* Refactored code for series level and tag and genre with new locking strategy.

* New scan loop works. Next up optimization

* Swapped out the Kavita logo with an SVG for faster load

* Refactored metadata updates to occur when the series are being updated.

* Code cleanup

* Added a new type of generic message (Info) to inform the user.

* Code cleanup

* Implemented an optimization which prevents any I/O (other than an attribute lookup) for Library/Series Scan. This can bring a recently updated library on network storage (650 series) to fully process in 2 seconds.

Fixed a bug where File Analysis was running every time for each non-epub file.
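
The optimization reduces a scan of an unchanged folder to a single attribute lookup. A minimal sketch of the idea (the method and parameter names are illustrative, not Kavita's actual fields):

```csharp
using System;
using System.IO;

public static class ScanSkipSketch
{
    // True when the folder was written to since the last scan. A single
    // attribute lookup; no directory enumeration, no file reads.
    public static bool HasFolderChangedSince(string folderPath, DateTime lastScanned)
    {
        return Directory.GetLastWriteTime(folderPath) > lastScanned;
    }
}
```

Later entries in this log refine this check (looking at all files and folders, and a separate path for non-NTFS drives).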

* Fixed ARM x64 builds not being able to view PDF cover images due to a bad update in DocNet.

* Some code cleanup

* Added experimental signalr update code to have a more natural refresh of library-detail page

* Hooked in ability to send new series events to UI

* Moved all scan (file scan only) tasks into the Scan Queue. Scheduled ScanLibrary tasks will now check if an existing task is running and reschedule for 3 hours later (10 minutes for Scan Series).

* Implemented the info event in the events widget and added a clear all button to dismiss all infos and errors.  Added --event-widget-info-bg-color

* Remove --drawer-background-color since it's not used

* When new series added, inject directly into the view.

* Some debug code cleanup

* Fixed up the unit tests

* Ensure all config directories exist on startup

* Disabled Library Watching (that will go in next build)

* Ensure update for series is admin only

* Lots of code changes, scan series kinda works, specials are splitting, optimizations are failing. Demotivated on this work again.

* Removed SeriesFolder migration

* Added the SeriesFolder migration

* Added a new pipe for dates so we can provide some nicer defaults. Added folder path to the series detail.

* The scan optimizations now work for NTFS systems.

* Removed a TODO

* Migrated all the times to use DateTime.Now and not Utc.

* Refactored some repo calls to use the includes flag pattern

* Implemented a check for the library scan optimization check to validate if the library was updated (type change, library rename, folder change, or series deleted) and let the optimization be bypassed.

* Added another optimization which will use just folder attribute of last write time if the drive is not NTFS.

* Fixed a unit test

* Some code cleanup

* Bump versions by dotnet-bump-version.

* Misc UI Fixes (#1450)

* Fixed collection cover images not rendering

* added a try/catch on sending email, so we fail silently if it doesn't send.

* Fixed Go Back not returning to last scroll position due to layoutmode change resetting, despite nothing changing.

* Fixed a bug where when turning between pages on default mode, the height calculations could get skewed.

* Fixed a missing case for card item where it wouldn't show tooltip title for series.

* Bump versions by dotnet-bump-version.

* New Scan Loop Fixes (#1452)

* Refactored ScanSeries to avoid a lot of extra work and fixed a bug where Scan Series would invoke the processing twice.

Refactored the series selection code during process such that we use Localized Name as well, for cases where the original name was changed.

Undid an optimization around Last Write time, since Linux file systems match how NTFS works.

* Fixed part of the query

* Added a NormalizedLocalizedName for quick searching when a series needs grouping. Reworked the scan loop code a bit to ensure we don't do extra work.

Tweaked the widget logic to help display better and not show "Nothing going on here".

* Fixed a bug where archives with ._ files would be counted as valid files, while they are actually just metadata files on Macs.

* Fixed a broken unit test

* Bump versions by dotnet-bump-version.

* Simplify parent lookup with Directory.GetParent (#1455)

* Simplify parent lookup with Directory.GetParent

* Address comments
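
For reference, a hedged example of what `Directory.GetParent` replaces, using a made-up path:

```csharp
using System;
using System.IO;

// Directory.GetParent replaces manual splitting on path separators.
// It returns null at a filesystem root, hence the null-conditional access.
var parent = Directory.GetParent(@"C:\library\series\volume")?.FullName;
Console.WriteLine(parent); // C:\library\series
```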

* Bump versions by dotnet-bump-version.

* Scan Loop Fixes (#1459)

* Added Last Folder Scanned time to series info modal.

Tweaked the info event detail modal to have a primary action and thus be auto-dismissible

* Added an error event when multiple series are found in processing a series.

* Fixed a bug where a series could get stuck with other series due to a bad select query.

Started adding the force flag hook for the UI and designing the confirm.

Confirm service now also has ability to hide the close button.

Updated error events and logging in the loop, to be more informative

* Fixed a bug where confirm service wasn't showing the proper body content.

* Hooked up force scan series

* refresh metadata now has force update

* Fixed up the messaging with the prompt on scan and hooked it up properly in the scan library flow, to avoid checking if the whole library even needs to be scanned. Fixed a bug where NormalizedLocalizedName wasn't being calculated on new entities.

Started adding unit tests for this problematic repo method.

* Fixed a bug where we updated NormalizedLocalizedName before we set it.

* Send an info to the UI when series are spread between multiple library level folders.

* Added some logger output when there are no files found in a folder. Return early if there are no files found, so we can avoid some small loops of code.

* Fixed an issue where multiple series in a folder with a localized series would cause unintended grouping. This is not supported, hence we will warn the user and allow the bad grouping.

* Added handling for the case where Scan Series fails due to the folder being removed. We will now log an error.

* Normalize paths when finding the highest directory up to the root.

* Fixed an issue with Scan Series where changing a series' folder to a different path, while the original series folder still existed with another series in it, would cause the series to not be deleted.

* Fixed some bugs around specials causing a series merge issue on scan series.

* Removed a bug marker

* Cleaned up some of the scan loop and removed a test I don't need.

* Remove any prompts for force flow, it doesn't work well. Leave the API as is though.

* Fixed up a check for duplicate ScanLibrary calls

* Bump versions by dotnet-bump-version.

* Scroll Resume (#1460)

* When we navigate from a page then back, resume back on the last scroll key (if clicked)

* Resume jump key position when navigating back to a page. Removed some extra blank space on collection detail when a collection doesn't have a summary or cover image.

* Ignore progress events on series cards

* Added a url to swagger for /, which could be reverse proxy url

* Bump versions by dotnet-bump-version.

* Misc UI fixes (#1461)

* Misc fixes

- Fixed modal being stretched when not needed.
- Fixed Logo vertical align
- Fixed drawer content scroll and it being squished by Bootstrap overrides.

* series detail cover image stretch fix

- Fixed: Series detail cover image being stretched on larger resolutions

* fixing empty lists scrollbar

* Fixing want to read error

* fixing unnecessary scrollbar

* Fixing recently updated tooltip

* Bump versions by dotnet-bump-version.

* Folder Watching (#1467)

* Hooked in a server setting to enable/disable folder watching

* Validated the file rename change event

* Validated delete file works

* Tweaked some logic to determine if a change occurs on a folder or a file.

* Added a note for an upcoming branch

* Some minor changes in the loop that just shift where code runs.

* Implemented ScanFolder api

* Ensure we restart watchers when we modify a library folder.

* Fixed a unit test

* Bump versions by dotnet-bump-version.

* More Scan Loop Bugfixes (#1471)

* Updated scan time for watcher to 30 seconds for non-dev. Moved ScanFolder off the Scan queue as it doesn't need to be there. Updated loggers

* Fixed jumpbar missing

* Tweaked the messaging for CoverGen

* When we return early due to nothing being done on library and series scan, make sure we kick off other tasks that need to occur.

* Fixed a foreign constraint issue on Volumes when we were adding to a new series.

* Fixed a case where, when picking normalized series, capitalization differences wouldn't stack when they should.

* Reduced the logging output on dev and prod settings.

* Fixed a bug in the code that finds the highest directory from a file, where we were not checking against a normalized path.

* Cleaned up some code

* Fixed broken unit tests

* Bump versions by dotnet-bump-version.

* More Scan Loop Fixes (#1473)

* Added a ToList() to avoid a bug where a person could be removed from a list while iterating over the list.
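
A small self-contained illustration of this class of bug and the `ToList()` fix:

```csharp
using System.Collections.Generic;
using System.Linq;

var people = new List<string> { "writer", "artist", "editor" };

// Buggy: mutating a List<T> while enumerating it throws
// InvalidOperationException ("Collection was modified").
//   foreach (var person in people)
//       if (person == "artist") people.Remove(person);

// Fixed: enumerate a snapshot, mutate the original safely.
foreach (var person in people.ToList())
{
    if (person == "artist") people.Remove(person);
}
```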

* When deleting a series, want to read page will now automatically remove that series from the view.

* Fixed a series lookup which was ignoring format

* Ignore XML comment warnings

* Removed a note since it was already working that way

* Fixed unit test

* Bump versions by dotnet-bump-version.

* Misc UI Fixes (#1477)

* Tweaked a Migration to log correctly only if something is going to be done.

* Refactored Reading List Controller code into a dedicated service and cleaned up some methods that aren't needed anymore.

* Fixed a bug where adding a new item to a reading list wasn't adding it at the end.

* Fixed an issue where collection page would re-render the same covers on multiple items.

* Fixed a missing margin-top which made the page extras drawer not render correctly and hence unclosable on small screens.

* Added some timeout on manage users screen to give data time to flush.

Added a dedicated token log for account flows, in case url encoding plays a part (but from testing it doesn't).

* Reverted back to building for ES6 instead of es2020 for old Safari 12.5.5 browsers (10MB difference in build size).

* Cleaned up the logic in removing series not found during scan loop.

* Tweaked the timings for Library Watcher to 1 min and reprocess queue every 30 seconds.

* Bump versions by dotnet-bump-version.

* Added fixes for libvips (#1479)

* Bump versions by dotnet-bump-version.

* Tachiyomi + Fixes (#1481)

* Fixed a bootstrap bug

* Fixed repeating images on collection detail

* Fixed up some logic in library watcher which wasn't processing all of the queue.

* When parsing non-epubs in Book library, use Manga parsing for Volume support to better support Light Novels

* Fixed some bugs with the tachiyomi plugin api's for progress tracking

* Bump versions by dotnet-bump-version.

* Adding Health controller (#1480)

* Adding Health controller

- Added: Added API endpoint for a health check to streamline docker healthy status.

* review comment fixes
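
A minimal sketch of such a health endpoint in ASP.NET Core; the route and return value here are assumptions, not necessarily Kavita's exact controller:

```csharp
using Microsoft.AspNetCore.Authorization;
using Microsoft.AspNetCore.Mvc;

[AllowAnonymous] // health probes must work without a logged-in user
[ApiController]
[Route("api/[controller]")]
public class HealthController : ControllerBase
{
    [HttpGet]
    public ActionResult GetHealth() => Ok("Ok");
}
```

Docker can then poll the endpoint via a HEALTHCHECK directive (the Dockerfile change appears later in this log).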

* Bump versions by dotnet-bump-version.

* Simplify Folder Watcher (#1484)

* Refactored Library Watcher to use Hangfire under the hood.

* Support .kavitaignore at root level.

* Refactored a lot of the library watching code to process faster and handle when FileSystemWatcher runs out of internal buffer space. It's still not perfect, but good enough for basic use.
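
The Hangfire-backed watcher described above can be sketched roughly as follows. This is a simplified illustration (it assumes Hangfire storage is already configured, and the method names are invented); deduplication against already-enqueued jobs is a separate check, sketched further down this log:

```csharp
using System;
using System.IO;
using Hangfire;

public class LibraryWatcherSketch
{
    private readonly FileSystemWatcher _watcher;

    public LibraryWatcherSketch(string libraryFolder)
    {
        _watcher = new FileSystemWatcher(libraryFolder) { IncludeSubdirectories = true };
        _watcher.Created += OnChange;
        _watcher.Changed += OnChange;
        _watcher.Deleted += OnChange;
        // FileSystemWatcher raises Error when its internal buffer overflows;
        // a real implementation would suspend and later restart watching here.
        _watcher.Error += (_, _) => Console.WriteLine("Buffer overflow; suspending watcher");
        _watcher.EnableRaisingEvents = true;
    }

    private static void OnChange(object sender, FileSystemEventArgs e)
    {
        // Hand the event to Hangfire as a delayed job instead of processing
        // inline, so persistence and retries are Hangfire's problem.
        BackgroundJob.Schedule<LibraryWatcherSketch>(
            w => w.ProcessChange(e.FullPath), TimeSpan.FromSeconds(30));
    }

    public void ProcessChange(string path) => Console.WriteLine($"Would scan: {path}");
}
```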

* Marked folder watching as experimental and defaulted it to off.

* Revert #1479

* Tweaked the messaging for OPDS to remove a note about download role.

Moved some code closer to where it's used.

* Cleaned up how the events widget reports

* Fixed a null issue when deleting series in the UI

* Cleaned up some debug code

* Added more information for when we skip a scan

* Cleaned up some logging messages in CoverGen tasks

* More log message tweaks

* Added some debug to help identify a rare issue

* Fixed a bug where save bookmarks as webp could get reset to false when saving other server settings

* Updated some documentation on library watcher.

* Make LibraryWatcher fire every 5 mins

* Bump versions by dotnet-bump-version.

* Sort series by chapter number only when some chapters have no volume (#1487)

* Sort series by chapter number only when some chapters have no volume information

* Implement a Default static instance of ChapterSortComparer

* Further use Default static Comparers
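
The `Default` static instance pattern looks like this; the zero-sorts-last rule is illustrative of how chapter 0 (volume-only files) is treated, not a verbatim copy of `ChapterSortComparer`:

```csharp
using System.Collections.Generic;

public class ChapterSortComparerSketch : IComparer<double>
{
    // One shared, stateless instance instead of a new allocation per sort.
    public static readonly ChapterSortComparerSketch Default = new();

    // Chapter 0 marks volume-only files, so it sorts after real chapter numbers.
    public int Compare(double x, double y)
    {
        if (x == 0.0 && y == 0.0) return 0;
        if (x == 0.0) return 1;
        if (y == 0.0) return -1;
        return x.CompareTo(y);
    }
}
```

Call sites then read `chapters.OrderBy(c => c.Number, ChapterSortComparerSketch.Default)` instead of newing up a comparer each time.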

* Add missing ToList() as per comments

* SQLite Hangfire  (#1488)

* Updated to use SQLite for Hangfire to retain information on tasks

* Updated all external links to have noopener noreferrer

* When watching folders, ensure the folders exist before creating watchers.

* Tweaked the messaging for Email Service and added link to the project.

* Bump versions by dotnet-bump-version.

* Bump versions by dotnet-bump-version.

* Fixed typeahead not working correctly (#1490)

* Bump versions by dotnet-bump-version.

* Release Testing Day 1 (#1491)

* Fixed a bug where typeahead wouldn't automatically show results on relationship screen without an additional click.

* Tweaked the code which checks if a modification occurred to check on seconds rather than minutes

* Clear cache will now clear temp/ directory as well.

* Fixed an issue where Chrome was caching API responses when it shouldn't have.

* Added some temp-directory cleanup code

* Ensure genres get removed during series scan when removed from metadata.

* Fixed a bug where all epubs with a volume would show as Volume 0 in reading list

* When a scan is in progress, don't let the user delete the library.

* Bump versions by dotnet-bump-version.

* Scan Loop Last Write Time Change (#1492)

* Refactored invite user flow to separate error handling on create user flow and email flow. This should help users that have unique situations.

* Switch to using files to check LastWriteTime. Debug code in for Robbie to test on rclone

* Updated Parser namespace. Changed the LastWriteTime to check all files and folders.

* Bump versions by dotnet-bump-version.

* Release Testing Day 2 (#1493)

* Added a no data section to collection detail.

* Remove an optimization for skipping the whole library scan as it wasn't reliable

* When resetting password, ensure the input is colored correctly

* Fixed setting a new password after a reset throwing an error despite actually succeeding.

Fixed incorrect messaging for Password Reset page.

* Fixed a bug where reset password would show the side nav button and skew the page.

Updated a lot of references to use Typed version for formcontrols.

* Removed a migration from 0.5.0, 6 releases ago.

* Added a null check so we don't throw an exception when connecting with signalR on unauthenticated users.

* Bump versions by dotnet-bump-version.

* Fixed a bug where a series with a relationship couldn't be deleted. (#1495)

* Bump versions by dotnet-bump-version.

* Release Testing Day 3 (#1496)

* Tweaked log messaging for library scan when no files were scanned.

* When a theme that is set gets removed due to a scan, inform the user to refresh.

* Fixed a typo and made Darkness -> Brightness

* Make download theme files allowed to be invoked by non-authenticated users, to allow new users to get the default theme.

* Hide all series side nav item if there are no libraries exposed to the user

* Fixed an API for Tachiyomi when syncing progress

* Fixed dashboard not responding to Series Removed and Added events.

Ensure we send SeriesRemoved events when they are deleted.

* Reverted Hangfire SQLite due to aborted jobs being resumed when they shouldn't be. Fixed some scan loop issues where cover gen wouldn't always be invoked on new libraries.

* Bump versions by dotnet-bump-version.

* Updating series detail cover style (#1498)

# Fixed
- Fixed: Fixed an issue with the series detail cover when scaled down.

* Bump versions by dotnet-bump-version.

* v0.5.6 Release (#1499)

* Bump versions by dotnet-bump-version.

* Bookmark RBS + Dynamic PGO (#1503)

* Allow .NET to optimize code as it's running.

* Implemented the ability to restrict users' Bookmark ability. Users will now need to opt in to get the bookmark role.

* Fixed a tachiyomi progress syncing logic bug

* Bump versions by dotnet-bump-version.

* Updating series detail cover (#1509)

* Updating series detail cover

# Fixed
- Fixed: Fixed an issue where the series detail cover would resize too large on ultra wide displays.

* Fixing typos

* Bump versions by dotnet-bump-version.

* Logging Enhancements (#1521)

* Recreated Kavita Logging with Serilog instead of the default. This needs to be moved out of appsettings now, to allow the auto updater to patch.

* Refactored the code to be completely configured via Code rather than appsettings.json. This is a required step for Auto Updating.
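
A hedged sketch of code-first Serilog configuration; the sink choices and paths are examples, not Kavita's exact setup. The `LoggingLevelSwitch` is what makes the minimum level tweakable at runtime:

```csharp
using Serilog;
using Serilog.Core;
using Serilog.Events;

public static class LogSetupSketch
{
    // Runtime-adjustable minimum level, replacing the old appsettings value.
    public static readonly LoggingLevelSwitch LevelSwitch = new(LogEventLevel.Information);

    public static Serilog.ILogger Create() =>
        new LoggerConfiguration()
            .MinimumLevel.ControlledBy(LevelSwitch)
            .WriteTo.Console()
            .WriteTo.File("config/logs/kavita.log", rollingInterval: RollingInterval.Day)
            .CreateLogger();
}
```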

* Added in the ability to send logs directly to the UI, only for users on the log route. Stopping the implementation here, as the Alerts page will handle the rest.

* Fixed up the backup service to not rely on Config from appsettings.json

* Tweaked the Logging levels available

* Moved everything over to File-scoped namespaces

* Moved everything over to File-scoped namespaces

* Code cleanup, removed an old migration and changed so debug logging doesn't print sensitive db data

* Removed dead code

* Bump versions by dotnet-bump-version.

* Misc Enhancements (#1525)

* Moved the data connection for the Database out of appsettings.json and hardcoded it. This will allow for more customization and cleaner update process.

* Removed unneeded code

* Updated pdf viewer to 15.0.0 (pdf 2.6), which now supports east-asian fonts

* Fixed up some regex parsing for volumes that have a float number.

* Fixed a bug where the tooltip for Publication Status wouldn't show

* Fixed some weird parsing rules where v1.1 would parse as volume 1 chapter 1

* Fixed a bug where bookmarking button was hidden for admins without bookmark role (due to migration)

* Unified the star rating component in series detail to match metadata filter.

* Fixed a bug in the bulk selection code when using shift selection, where the inverse of what was selected would be toggled.

* Fixed some old code where, on the All Series page, only English would be returned as a language. We now return all languages across all libraries.

* Updated api/metadata/languages documentation

* Refactored some bookmark api names: get-bookmarks -> chapter-bookmarks, get-all-bookmarks -> all-bookmarks, get-series-bookmarks -> series-bookmarks, etc.

* Refactored all cases of createSeriesFilter to filterUtiltityService.

Added ability to search for a series on Bookmarks page.

Fixed a bug where people filters wouldn't respect the disable flag from settings.

* Cleaned up a bit of the circular downloader code.

* Implemented Russian Parsing

* Fixed an issue where some users that had a missing theme entry wouldn't be able to update their user preferences.

* Refactored normalization to exclude !, thus allowing series with ! to be different from each other.
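
Roughly, the normalization change amounts to adding `!` to the kept character set. The exact character class below is an assumption based on the described behavior:

```csharp
using System.Text.RegularExpressions;

public static class NormalizeSketch
{
    // Keep letters, digits, '+', and now '!'; strip everything else.
    private static readonly Regex NonMatchChars = new(@"[^\p{L}0-9\+!]", RegexOptions.Compiled);

    public static string Normalize(string name) =>
        NonMatchChars.Replace(name, string.Empty).ToLowerInvariant();
}

// NormalizeSketch.Normalize("Given!") => "given!"  — no longer collides with
// NormalizeSketch.Normalize("Given")  => "given"
```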

* Fixed a migration exit case

* Fixed broken unit test

* Bump versions by dotnet-bump-version.

* Fixed a version issue with migration (#1526)

* Bump versions by dotnet-bump-version.

* Metadata Bugfixes (#1511)

* Fix XML deserialization of empty elements to integers

* Fix assumption that environment uses US time format

* Use series name as SeriesSort in epub

* Address some PR comments

* Add partial Equals() implementation to ComicInfo

* Update ComicInfo unittest. Revert previous version

* Bump versions by dotnet-bump-version.

* Configure Animation Module (#1504)

* Configure Animation Module

Configure the Animation Module of Angular to disable animations on older iOS devices (<14), where it causes 'animate is not defined' errors.

* Simplified disableAnimations

Removed the regex iOS version check as it seemed to return false on iOS 12.5.5 meaning that the `!('animate' in document.documentElement)` did the job already. This also allows users to enable the experimental feature Web Animations on iOS 12.5.5 to have them enabled again.

As a note, navigator.userAgent returned the following on an iPad running iOS 12.5.5:
`Mozilla/5.0 (iPad; CPU OS 12_5_5 like Mac OS X) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/12.1.2 Mobile/15E148 Safari/604.1`

* added console output on disabled

Added console.error if Web Animations have been disabled due to the browser not supporting this.

* Bump versions by dotnet-bump-version.

* Reader Bugs + New Features  (#1536)

* Fixed a typo in Manage Tasks: Reoccuring -> Recurring

* Fixed a bug in MinimumNumberFromRange where a regex wasn't properly constructed which could skew results.

* Fixed a bug where Volume numbers that were a float wouldn't render correctly in the manga reader menu.

* Added the ability to double click on the image to bookmark it. Optimized the bookmark and unbookmark flows to remove 2 DB calls and reworked some flow of calls to speed it up.

Fixed some logic where when using double (manga) flow, both of the images wouldn't show the bookmark effect, despite both of them being saved. Likewise, fixed a bug where both images weren't updating UI state, so switching from double (manga) to single, the second image wouldn't show as bookmarked without a refresh.

* Double click works perfectly for bookmarking

* Collection cover image chooser will now prompt with all series covers by default.

Reset button is now moved up to the first slot if applicable.

* When a Completed series is fully read by a user, a nightly task will now remove that series from their Want to Read list.

* Added ability to trigger Want to Read cleanup from Tasks page.

* Moved the brightness readout to the label line and fixed a bootstrap migration bug where small buttons weren't actually small.

* Implemented ability to filter against release year (min or max or both).

* Fixed a log message that wasn't properly formatted when a scan finished and no files changed.

* Cleaned up some code and merged some methods

* Implemented sort by Release year metadata filter.

* Fixed the code that finds ComicInfo.xml inside archives to only check the root and check explicitly for casing, so it must be ComicInfo.xml.
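
A sketch of that root-only, exact-casing lookup (later relaxed again by the "fallback to other locations" change further down this log):

```csharp
using System.IO;
using System.IO.Compression;
using System.Linq;

public static class ComicInfoLookupSketch
{
    public static string? ReadRootComicInfo(string archivePath)
    {
        using var archive = ZipFile.OpenRead(archivePath);
        var entry = archive.Entries.FirstOrDefault(e =>
            string.IsNullOrEmpty(Path.GetDirectoryName(e.FullName)) // at archive root
            && e.Name == "ComicInfo.xml");                          // exact casing
        if (entry == null) return null;

        using var reader = new StreamReader(entry.Open());
        return reader.ReadToEnd();
    }
}
```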

* Dependency updates

* Refactored some strings into consts and used TriggerJob rather than just enqueuing

* Fixed the prefetcher which wasn't properly loading in the correct order as it was designed.

* Cleaned up all traces of CircularArray from MangaReader

* Removed some debug code

* Fixed a bug with webtoon reader in fullscreen mode where continuous reader wouldn't trigger

* When cleaning up series from users' want to read lists, include both completed and cancelled.

* Fixed a bug where small images wouldn't have the pagination area extend to the bottom on manga reader

* Added a new method for hashing during prod builds and ensure we always use aot

* Fixed a bug where the save button wouldn't enable when a color change occurred.

* Cleaned up some issues in one contributor's PR.

* Bump versions by dotnet-bump-version.

* Misc Polish and Fixes (#1542)

* Moved LibraryWatcher to utilize a queue for calculating the change event to ensure the Watcher doesn't get overwhelmed on large moves.

* Fixed a security vulnerability (https://huntr.dev/bounties/8a3e652f-d6bf-436e-877e-0eaf5c69ef95/). This will be disclosed in Stable release changelog.

* Tweaked the log message template

* Removed some dead code from Configuration json patcher

* Fixed a bug with the ComicInfo finding to properly handle root level.

Fixed a bug where sometimes scanner wouldn't choose the first file with ComicInfo for filling out information.

* Added a new setting for managing how many log files are allowed, just like how backups work.
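
The retention rule can be sketched like so; the directory and file pattern are illustrative:

```csharp
using System.IO;
using System.Linq;

public static class LogCleanupSketch
{
    // Keep the newest maxLogFiles logs, delete the rest.
    public static void CleanupLogs(string logDirectory, int maxLogFiles)
    {
        var expired = new DirectoryInfo(logDirectory)
            .GetFiles("kavita*.log")
            .OrderByDescending(f => f.CreationTimeUtc)
            .Skip(maxLogFiles);

        foreach (var file in expired)
        {
            file.Delete();
        }
    }
}
```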

* Added unit tests for new CleanupLogs code

* Fixed a bug where manga reader background color wasn't actually sending from the UI

* Added new stats for tracking to help understand usage in the app and what features are used or not.

* Fixed Stats url

* Fixed a bug where volumes that had larger than 1 difference wouldn't properly return next/prev chapter (for continuous reader)

* Remove a redundant test step in build pipeline, since it's already done at PR stage.

* Updated the Dockerfile to use the new Health check endpoint

* Allow force to pass through to scan loop

* Removed some old config stuff from a safety check on config in entrypoint.sh

* Fixed broken unit tests due to new RBS check and how we setup mock data.

* Bump versions by dotnet-bump-version.

* Removed some debug code (#1543)

* Bump versions by dotnet-bump-version.

* Parser optimization part1 (#1531)

* Optimize CleanTitle

* Optimize MangaEditionRegex

* Optimize special regexes

* Refactor manga|comic special parsing into simple tests

* Word bind the special regexps. Support additional "special" use cases.

* Updates to address PR comments

* CleanTitle benchmarking

* Use a smaller Comics Data set for benchmarking

* Bump versions by dotnet-bump-version.

* Tachiyomi unit tests and fixes (#1549)

* Moved logic from TachiyomiController.cs to TachiyomiService.cs

* Added GetLatestChapter Unit Tests

* More Tachiyomi tests.
Implemented a test for yearly volumes

* MarkVolumesUntilAsRead unit test

* Registered tachiyomi service.
Added new test

* Fixed test pages

* Added missing check if it's a single-file volume

* Removed dead code

* Added method documentation and separated thousands with `_`

* Review details and renamed test method to be more descriptive

* Review changes
- Removed automapper
- Added spaces after commas
- Added class documentation (copied from controller)
- Made Culture static
- Added 'R' doc linking to docs.ms
- Added a try/catch to the service when saving progress, with logging
- Removed redundant qualifiers

* finishing touches

Co-authored-by: Joseph Milazzo <joseph.v.milazzo@gmail.com>

* Bump versions by dotnet-bump-version.

* Folder Watching Polish + Epub Fix (#1550)

* Fixed entrypoint writing bad json (from develop)

* Fixed a bug where log file could write out a crap ton of information (serializing Series object) when a db error occurs.

* Fixed an issue with scan loop where concurrency issues could occur on new series being added.

* Tweaked the logger to suppress some noisy logs when using Debug log level.

* Fixed a regression with epub parsing from v3.2 of Vers-One's release

* Fixed up folder watching to work more reliably. Validated in production.

* Code cleanup

* Bump versions by dotnet-bump-version.

* Fallback to other locations when ComicInfo.xml not at root of archive (#1551)

* Fallback to other locations when ComicInfo.xml not at root of archive

* Better ComicInfo test coverage and benchmarks

* Add a rar archive to the ComicInfo test cases

* Bump versions by dotnet-bump-version.

* Series title word wrapping (#1519)

* Added text-break class to series title.

Simply added the class text-break from bootstrap to break on words.
https://getbootstrap.com/docs/5.0/utilities/text/#word-break

This has no issue with languages that use spaces rarely or not at all, such as Japanese and Chinese. Used the following two series names to test:
- 今まで一度も女扱いされたことがない女騎士を女扱いする漫画
- Imamade Ichido mo Onnaatsukai sareta Koto ga Nai Onna Kishi wo Onnaatsukai suru

* Added text-break class to localized title

Also added the text-break bootstrap class to the localized title, removed the word-break rule from css as it is redundant.

* Enclosed LibraryName with span

Enclosed {{libraryName}} with a span to remove the added space before the title, aligning it again with the start of the subtitle. This mimics series-detail-component.html

* Bump versions by dotnet-bump-version.

* Nested Menus (#1554)

* added initial submenu

* added submenu - needs a bit more work

* removed admin and nonadmin action split

* the whole menu is built under the resetActions function

* removed download from seriesAction

* changed submenu layout
changed submenu toggle icon
fixed hovering of the submenu toggle

* moved the cdMarkForCheck in the subscribe block

* Bump versions by dotnet-bump-version.

* Send To Device Support (#1557)

* Tweaked the logging output

* Started implementing some basic idea for devices

* Updated Email Service with new API routes

* Implemented basic DB structure and some APIs to prep for the UI and flows.
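
A hedged sketch of the kind of entity this DB structure implies; property names are assumptions, not Kavita's exact schema:

```csharp
using System;

public enum DevicePlatform
{
    Custom = 0,
    Kindle = 1,
    Kobo = 2,
}

public class Device
{
    public int Id { get; set; }
    public string Name { get; set; }
    // Where files get emailed, e.g. a Kindle address.
    public string EmailAddress { get; set; }
    public DevicePlatform Platform { get; set; }
    public DateTime LastUsed { get; set; }
}
```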

* Added an abstract class to make Unit testing easier.

* Removed dependency we don't need

* Updated the UI to be able to show devices and add new devices. Email field will update the platform if the user hasn't interacted with it already.

* Added ability to delete a device as well

* Basic ability to send files to devices works

* Refactored Action code to pass ActionItem back and allow for dynamic children based on an Observable (api).

Hooked in ability to send a chapter to a device. There is no logic in the FE to validate type.

* Fixed a broken unit test

* Implemented the ability to edit a device

* Code cleanup

* Fixed a bad success message

* Fixed broken unit test from updating mock layer

* Bump versions by dotnet-bump-version.

* Fixed a bug where when no devices, the submenu item would still render. (#1558)

* Bump versions by dotnet-bump-version.

* Extended Korean Filename Parsing Support (#1556)

* Added Some Korean Volume Matches

* Fixed Typo And Added Test Cases

* Restore Chapter Decimal Support

* Added Decimal Volume Support to -권, -화, -회 and -장
Merged -권 Pattern to -화, -회, -장 Pattern
Added Decimal Test to ParseVolumeTest

* Grouped Korean Tests

* Fixed Regexp Comment

* Bump versions by dotnet-bump-version.

* Disable Animations + Lots of bugfixes and Polish (#1561)

* Fixed inputs not showing inline validation due to a missing class

* Fixed some checks

* Increased the button size on manga reader (develop)

* Migrated a type cast to a pure pipe

* Sped up the check for if SendTo should render on the menu

* Don't allow user to bookmark in bookmark mode

* Fixed a bug where Scan Series would skip over Specials due to how new scan loop works.

* Fixed scroll to top button persisting when navigating between pages

* Edit Series modal no longer has a lock field for Series, which can't be locked as it is inherently locked.

Added some validation to ensure Name and SortName are required.

* Fixed up some spacing

* Fixed actionable menu not opening submenu on mobile

* Cleaned up the layout of cover image on series detail

* Show all volume or chapters (if only one volume) for cover selection on series

* Don't open the submenu to the right if there is no space

* Fixed up cover image not allowing custom saves of existing series/chapter/volume images.

Fixed up logging so console output matches log file.

* Implemented the ability to turn off css transitions in the UI.

* Updated a note internally

* Code smells

* Added InstallId when pinging the email service to allow throughput tracking

* Bump versions by dotnet-bump-version.

* Auth Email Rework (#1567)

* Hooked up Send to for Series and volumes and fixed a bug where Email Service errors weren't propagating to the UI layer.

When performing actions on series detail, don't disable the button anymore.

* Added send to action to volumes

* Fixed a bug where .kavitaignore wasn't being applied at library root level

* Added a notification for when a device is being sent a file.

* Added a check in forgot password for users that do not have an email set or aren't confirmed.

* Added a new api for change email and moved change password directly into new Account tab (styling and logic needs testing)

* Save approx scroll position like with jump key, but on normal click of card.

* Implemented the ability to change your email address or set one. This requires a 2 step process using a confirmation token. This needs polishing and css.

* Removed an unused directive from codebase

* Fixed up some typos on publicly

* Updated query for Pending Invites to also check if the user account has not logged in at least once.

* Cleaned up the css for validate email change

* Hooked in an indicator to show that a user has an unconfirmed email

* Cleaned up code smells

* Bump versions by dotnet-bump-version.

* Misc Polish (#1569)

* Introduced a lock for DB work during the scan to hopefully reduce the concurrency issues
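
A sketch of that lock using `SemaphoreSlim` (a later bullet notes the scan was ultimately made synchronous and the semaphore removed):

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

public class SeriesProcessorSketch
{
    // One writer at a time for Genre/Tag/People updates during a scan.
    private static readonly SemaphoreSlim DbLock = new(1, 1);

    public static async Task RunDbWork(Func<Task> dbWork)
    {
        await DbLock.WaitAsync();
        try
        {
            await dbWork();
        }
        finally
        {
            DbLock.Release();
        }
    }
}
```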

* Don't allow multiple theme scans to occur

* Fixed bulk actions not having all actions due to nested actionable menu changes

* Refactored the Scan loop to be synchronous to avoid any issues. After first loop, no real performance issues.

* Updated the LibraryWatcher so that, when hit with many internal-buffer-full issues, it suspends watching for a full hour to allow whatever is downloading to complete.

* Removed Semaphore as it's not needed anymore

* Updated the output for logger to explicitly say from Kavita (if you're pushing to Seq)

* Fixed a broken test

* Fixed ReleaseYear not populating due to a change from a contributor around how to populate ReleaseYear.

* Ensure when scan folder runs, that we don't double enqueue the same tasks.

* Fixed user settings not loading the correct tab

* Changed the Release Year -> Release

* Added more refresh hooks in reader to hopefully ensure faster refreshes

* Reset images between chapter loads to help flush image faster. Don't show broken image icon when an image is still loading.

* Fixed the prefetcher not properly loading the correct images and hence, allowing a bit of lag between chapter loads.

* Code smells

* Bump versions by dotnet-bump-version.

* Scan Loop Fixes (#1572)

* Cleaned up some messaging in the scan loop to carry more context

* Added Response Caching to Series Detail for 1 min, due to the heavy nature of the call.
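
A minimal example of the built-in ASP.NET Core mechanism this uses; the route is illustrative:

```csharp
using Microsoft.AspNetCore.Mvc;

[ApiController]
[Route("api/[controller]")]
public class SeriesDetailControllerSketch : ControllerBase
{
    // Cache the heavy series-detail payload for 60 seconds.
    [HttpGet]
    [ResponseCache(Duration = 60)]
    public ActionResult GetSeriesDetail(int seriesId) => Ok(seriesId /* heavy query result */);
}
```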

* Refactored code to make it so that processing of series runs synchronously, as intended.

Added a log to inform the user of corrupted volumes from buggy code in v0.5.6.

* Moved folder watching out of experimental

* Fixed an issue where empty folders could break the scan loop

* Another fix for when dates aren't valid: the scanner wouldn't get the proper minimum and would throw an exception (develop)

* Implemented the ability to edit release year from the UI for a series.

* Added a unit test for some new logic

* Code smells

* Bump versions by dotnet-bump-version.

* Scan Loop Fortification (#1573)

* Cleaned up some messaging in the scan loop to carry more context

* Added Response Caching to Series Detail for 1 min, due to the heavy nature of the call.

* Refactored code to make it so that processing of series runs synchronously, as intended.

Added a log to inform the user of corrupted volumes from buggy code in v0.5.6.

* Moved folder watching out of experimental

* Fixed an issue where empty folders could break the scan loop

* Another fix for when dates aren't valid: the scanner wouldn't get the proper minimum and would throw an exception (develop)

* Implemented the ability to edit release year from the UI for a series.

* Added a unit test for some new logic

* Code smells

* Rewrote the handler for suspending watching to be more resilient and ensure no two threads have a race condition.

* More error handling for when ScanFolder is invoked but multiple series belong to that folder: log it to the user and default to a library scan.

* ScanSeries will now check for kavitaignores higher than its own folder and respect the library level.

* Fixed an issue where image series with a folder name containing the word "folder" could get ignored as it thought the image was a cover image.

When a series folder is moved or deleted, skip parent ignore finding.

* Removed some old files; added a check in scanFolder for whether the series found for a folder is in a book library and, if so, always do a library scan (as books are often nested into one folder with multiple series). Added some unit tests

* Refactored some scan loop logic into ComicInfo, wrote tests and updated some documentation to make the fields more clear.

* Added a test for GetLastWriteTime based on recent bug

* Cleaned up some redundant code

* Fixed a bad merge

* Code smells

* Removed a package that's no longer used.

* Ensure we check against ScanQueue on ScanFolder enqueuing

* Documentation and more bulletproofing to ensure Hangfire checks work more as expected
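
One way to express the "is this already queued or running?" check with Hangfire's monitoring API; the queue name, page size, and matching-by-method-name are simplifying assumptions:

```csharp
using System.Linq;
using Hangfire;

public static class JobCheckSketch
{
    public static bool HasAlreadyEnqueuedTask(string methodName, string queue = "default")
    {
        var api = JobStorage.Current.GetMonitoringApi();

        var enqueued = api.EnqueuedJobs(queue, 0, 50)
            .Any(j => j.Value.Job != null && j.Value.Job.Method.Name == methodName);
        var processing = api.ProcessingJobs(0, 50)
            .Any(j => j.Value.Job != null && j.Value.Job.Method.Name == methodName);

        return enqueued || processing;
    }
}
```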

* Bump versions by dotnet-bump-version.

* Restricted Profiles (#1581)

* Added ReadingList age rating calculated from all its series, and started on some unit tests for the new flows.

* Wrote more unit tests for Reading Lists

* Added ability to restrict user accounts to a given age rating via admin edit user modal and invite user. This commit contains all basic code, but no query modifications.

* When updating a reading list's title via UI, explicitly check if there is an existing RL with the same title.

* Refactored Reading List calculation to work properly in the flows it's invoked from.

* Cleaned up an unused method

* Promoted Collections no longer show tags where a Series exists within them that is above the user's age rating.

* Collection search now respects age restrictions

* Series Detail page now checks if the user has explicit access (as a user might bypass with direct url access)

* Hooked up age restriction for dashboard activity streams.

* Refactored some methods from Series Controller and Library Controller to a new Search Controller to keep things organized

* Updated Search to respect age restrictions

* Refactored all the Age Restriction queries to extensions
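
A sketch of the extension-method shape this describes; the entity and property names are illustrative:

```csharp
using System.Linq;

public enum AgeRating { Unknown = 0, Everyone = 1, Teen = 2, Mature = 3 }

public class SeriesMetadataSketch
{
    public AgeRating AgeRating { get; set; }
}

public static class RestrictionExtensions
{
    // One reusable filter applied by Search, Collections, Related Series, etc.
    public static IQueryable<SeriesMetadataSketch> RestrictAgainstAgeRestriction(
        this IQueryable<SeriesMetadataSketch> query, AgeRating userRating, bool includeUnknowns)
    {
        // Unknown is the default state of metadata, so it is opt-in
        // (a later entry in this log adds exactly that toggle).
        return query.Where(m => m.AgeRating <= userRating
                                && (includeUnknowns || m.AgeRating != AgeRating.Unknown));
    }
}
```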

* Related Series no longer show up if they are out of the age restriction

* Fixed a bad mapping for the update age restriction api

* Fixed a UI state change after updating age restriction

* Fixed unit test

* Added a migration for reading lists

* Code cleanup

* Bump versions by dotnet-bump-version.

* Misc Bugfixes (#1582)

* Fixed a bug with RBS on non-admin accounts

* Fixed a bug where get next/prev chapter wouldn't respect floating point volume numbers

* Fixed a bad migration version check

* When building kavita ignore exclusions, ignore blank lines.

* Hooked up the GetFullSeriesByAnyName to check against OriginalName exactly

* Refactored some code for building ignore from library root, to keep the code cleaner

* Tweaked some messaging

* Fixed a bad directory join when a change event occurs in a nested series folder.

* Fixed a bug where cover generation would prioritize a special if there were only chapters in the series.

* Fixed a bug where you couldn't update a series modal if there wasn't a release year present

* Fixed an issue where renaming the Series in Kavita wouldn't allow ScanSeries to see the files, and thus would delete the Series.

* Added an additional check with Hangfire to make sure ScanFolder doesn't kick off a change when a bunch of changes come through for the same directory, but a job is already running.

* Added more documentation

* Migrated more response caching to profiles and merged 2 apis into one, since they do the same thing.

* Fixed a bug where NotApplicable age ratings were breaking Recently Updated Series

* Cleaned up some cache profiles

* More caching

* Provide response caching on Get Next/Prev Chapter

* Code smells

* Bump versions by dotnet-bump-version.

* New Series Relation - Edition (#1583)

* Moved UpdateRelatedSeries from controller to SeriesService.cs

* Added 2 tests.
- UpdateRelatedSeries_ShouldDeletePrequelRelation
- UpdateRelatedSeries_ShouldNotAllowDuplicates

* Some docs and codestyle nitpicks

* Simplified tests and made easier to read

* Added 'Editions' series relation

* Missing code to properly show the relations in the UI

* Create Service for GetRelatedSeries

* Added unit test. Assert Edition, Prequel and Sequel do not return parent while others do

* fixed missing userRating

* Add requested changes:
- Rename one test
- Split one test into two tests

* Bump versions by dotnet-bump-version.

* Release Polish (#1586)

* Fixed a scaling issue in the epub reader, where images could scale when they shouldn't.

* Removed some caching on library/ api and added more output for a foreign key constraint

* Hooked in Restricted Profile stat collection

* Added a new boolean on age restrictions to explicitly allow unknowns or not. Since unknown is the default state of metadata, if users are allowed access to Unknown, age restricted content could leak.

* Fixed a bug where sometimes series cover generation could fail under conditions where only specials existed.

* Fixed foreign constraint issue when cleaning up series not seen at end of scan loop

* Removed an additional epub parse when scanning and handled merging differently

* Code smell

* Bump versions by dotnet-bump-version.

* Release Shakeout Day 1 (#1591)

* Fixed an issue where reading lists were not able to update their summary due to a duplicate title check.

* Misc code smell cleanup

* Updated .net dependencies and removed unneeded ones

* Fixed an issue where removing a series from want to read list page wouldn't update the page correctly

* Fixed age restriction not applied to Recommended page

* Ensure that Genres and Tags are age-restriction gated

* Persons are now age gated as well

* When you choose a cover, the new cover will properly be selected and will focus on it, in the cases there are many other covers available.

* Fixed caching profiles

* Added in a special hook when deleting a library to clear all series Relations before we delete

* Bump versions by dotnet-bump-version.

* Release Shakeout Day 2 (#1594)

* Fixed a bad color on the PWA titlebar

* Added more unit tests, cleaned up some dead code, and made it so when age restriction is Not Applicable, the Unknowns field disables

* Don't show an empty menu when user has no permissions

* Fixed deleting a library with relation causing library deleting to fail

* Consolidated some includes code into one method for Series Repo

* Small fixes

* Bump versions by dotnet-bump-version.

* Release Shakeout 3 (#1597)

* Fixed a bug where bulk selection on series detail wouldn't allow you to select the whole card, only the checkbox.

* Refactored the implementation of MarkChaptersAsRead to streamline it.

* Fixed a bug where volume cards weren't properly updating their read state based on events from backend.

* Added [ScannerService] to more loggers

* Fixed invite user flow

* Fixed broken edit user flow

* Fixed calling device service on unauthenticated screens causing redirection

* Fixed reset password via email not working when success message was sent back

* Fixed broken white theme on book reader

* Small tweaks to white theme

* More fixes

* Adjusted AutomaticRetries

* Bump versions by dotnet-bump-version.

* UI Polish (#1599)

* Make the positioning of "Library | Recommended" consistent

* Fix reading lists not navigating back to their library after deletion

* Bump versions by dotnet-bump-version.

* Release Shakeout 4 (#1600)

* Fixed a bug where bulk selection on series detail wouldn't allow you to select the whole card, only the checkbox.

* Refactored the implementation of MarkChaptersAsRead to streamline it.

* Fixed a bug where volume cards weren't properly updating their read state based on events from backend.

* Added [ScannerService] to more loggers

* Fixed invite user flow

* Fixed broken edit user flow

* Fixed calling device service on unauthenticated screens causing redirection

* Fixed reset password via email not working when success message was sent back

* Fixed broken white theme on book reader

* Small tweaks to white theme

* More fixes

* Adjusted AutomaticRetries

* When an auth change occurs, reset the devices in the service so devices don't leak between profiles

* Fixed a bug where sendTo for series wasn't properly taking into account specials (on series detail page)

* Dropped how long series detail caches for, to keep SignalR updates from refreshing the UI with stale data

* Close submenus when hovering over other items, not just other submenus

* Fixed a bug where scanning for themes would always report theme didn't exist

* Added Hangfire.db back in

* Fixed a bad build

* Bump versions by dotnet-bump-version.

* Fixed column layout on multiple components for the user settings (#1602)

# Fixed
- Fixed: Fixed an issue where the controls would extend outside of the container on the user account preferences page.

* Bump versions by dotnet-bump-version.

* v0.6 Release (#1603)

Co-authored-by: tjarls <tjarls@gmail.com>
Co-authored-by: Robbie Davis <robbie@therobbiedavis.com>
Co-authored-by: Chris Plaatjes <kizaing@gmail.com>
Co-authored-by: Ocgineer <rvanbeek1@gmail.com>
Co-authored-by: ThePromidius <thepromidiusyt@gmail.com>
Co-authored-by: Korakot Santiudommongkol <47130579+KorakotSanti@users.noreply.github.com>
Co-authored-by: DeltaLaboratory <delta@deltalab.dev>
Co-authored-by: TheIceCreamTroll <33820904+TheIceCreamTroll@users.noreply.github.com>

View file: API/Services/AccountService.cs (excerpt of the new version: converted to a file-scoped namespace, with new role-based permission checks)

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using API.Constants;
using API.Data;
using API.Entities;
using API.Errors;
using Microsoft.AspNetCore.Identity;
using Microsoft.EntityFrameworkCore;
using Microsoft.Extensions.Logging;

namespace API.Services;

public interface IAccountService
{
    Task<IEnumerable<ApiException>> ChangeUserPassword(AppUser user, string newPassword);
    Task<IEnumerable<ApiException>> ValidatePassword(AppUser user, string password);
    Task<IEnumerable<ApiException>> ValidateUsername(string username);
    Task<IEnumerable<ApiException>> ValidateEmail(string email);
    Task<bool> HasBookmarkPermission(AppUser user);
    Task<bool> HasDownloadPermission(AppUser user);
}

public class AccountService : IAccountService
{
    private readonly UserManager<AppUser> _userManager;
    private readonly ILogger<AccountService> _logger;
    private readonly IUnitOfWork _unitOfWork;
    public const string DefaultPassword = "[k.2@RZ!mxCQkJzE";

    public AccountService(UserManager<AppUser> userManager, ILogger<AccountService> logger, IUnitOfWork unitOfWork)
    {
        _userManager = userManager;
        _logger = logger;
        _unitOfWork = unitOfWork;
    }

    public async Task<IEnumerable<ApiException>> ChangeUserPassword(AppUser user, string newPassword)
    {
        var passwordValidationIssues = (await ValidatePassword(user, newPassword)).ToList();
        if (passwordValidationIssues.Any()) return passwordValidationIssues;

        var result = await _userManager.RemovePasswordAsync(user);
        if (!result.Succeeded)
        {
            _logger.LogError("Could not update password");
            return result.Errors.Select(e => new ApiException(400, e.Code, e.Description));
        }

        result = await _userManager.AddPasswordAsync(user, newPassword);
        if (!result.Succeeded)
        {
            _logger.LogError("Could not update password");
            return result.Errors.Select(e => new ApiException(400, e.Code, e.Description));
        }

        return Array.Empty<ApiException>();
    }

    public async Task<IEnumerable<ApiException>> ValidatePassword(AppUser user, string password)
    {
        foreach (var validator in _userManager.PasswordValidators)
        {
            var validationResult = await validator.ValidateAsync(_userManager, user, password);
            if (!validationResult.Succeeded)
            {
                return validationResult.Errors.Select(e => new ApiException(400, e.Code, e.Description));
            }
        }

        return Array.Empty<ApiException>();
    }

    public async Task<IEnumerable<ApiException>> ValidateUsername(string username)
    {
        if (await _userManager.Users.AnyAsync(x => x.NormalizedUserName == username.ToUpper()))
        {
            return new List<ApiException>()
            {
                new ApiException(400, "Username is already taken")
            };
        }

        return Array.Empty<ApiException>();
    }

    public async Task<IEnumerable<ApiException>> ValidateEmail(string email)
    {
        var user = await _unitOfWork.UserRepository.GetUserByEmailAsync(email);
        if (user == null) return Array.Empty<ApiException>();

        return new List<ApiException>()
        {
            new ApiException(400, "Email is already registered")
        };
    }

    /// <summary>
    /// Does the user have the Bookmark permission or admin rights
    /// </summary>
    /// <param name="user"></param>
    /// <returns></returns>
    public async Task<bool> HasBookmarkPermission(AppUser user)
    {
        var roles = await _userManager.GetRolesAsync(user);
        return roles.Contains(PolicyConstants.BookmarkRole) || roles.Contains(PolicyConstants.AdminRole);
    }

    /// <summary>
    /// Does the user have the Download permission or admin rights
    /// </summary>
    /// <param name="user"></param>
    /// <returns></returns>
    public async Task<bool> HasDownloadPermission(AppUser user)
    {
        var roles = await _userManager.GetRolesAsync(user);
        return roles.Contains(PolicyConstants.DownloadRole) || roles.Contains(PolicyConstants.AdminRole);
    }

    /// <summary>
    /// Does the user have Change Restriction permission or admin rights
    /// </summary>
    /// <param name="user"></param>
    /// <returns></returns>
    public async Task<bool> HasChangeRestrictionRole(AppUser user)
    {
        var roles = await _userManager.GetRolesAsync(user);
        return roles.Contains(PolicyConstants.ChangePasswordRole) || roles.Contains(PolicyConstants.AdminRole);
    }
}
```

View file

@ -14,479 +14,478 @@ using Microsoft.Extensions.Logging;
using SharpCompress.Archives;
using SharpCompress.Common;
namespace API.Services;

public interface IArchiveService
{
    void ExtractArchive(string archivePath, string extractPath);
    int GetNumberOfPagesFromArchive(string archivePath);
    string GetCoverImage(string archivePath, string fileName, string outputDirectory);
    bool IsValidArchive(string archivePath);
    ComicInfo GetComicInfo(string archivePath);
    ArchiveLibrary CanOpen(string archivePath);
    bool ArchiveNeedsFlattening(ZipArchive archive);
    /// <summary>
    /// Creates a zip file from the listed files and outputs to the temp folder.
    /// </summary>
    /// <param name="files">List of files to be zipped up. Should be full file paths.</param>
    /// <param name="tempFolder">Temp folder name to use for preparing the files. Will be created and deleted.</param>
    /// <returns>Path to the temp zip</returns>
    /// <exception cref="KavitaException"></exception>
    string CreateZipForDownload(IEnumerable<string> files, string tempFolder);
}

/// <summary>
/// Responsible for manipulating Archive files. Used by <see cref="CacheService"/> and <see cref="ScannerService"/>
/// </summary>
// ReSharper disable once ClassWithVirtualMembersNeverInherited.Global
public class ArchiveService : IArchiveService
{
    private readonly ILogger<ArchiveService> _logger;
    private readonly IDirectoryService _directoryService;
    private readonly IImageService _imageService;
    private const string ComicInfoFilename = "ComicInfo.xml";

    public ArchiveService(ILogger<ArchiveService> logger, IDirectoryService directoryService, IImageService imageService)
    {
        _logger = logger;
        _directoryService = directoryService;
        _imageService = imageService;
    }

    /// <summary>
    /// Checks if a File can be opened. Requires up to 2 opens of the filestream.
    /// </summary>
    /// <param name="archivePath"></param>
    /// <returns></returns>
    public virtual ArchiveLibrary CanOpen(string archivePath)
    {
        if (string.IsNullOrEmpty(archivePath) || !(File.Exists(archivePath) && Tasks.Scanner.Parser.Parser.IsArchive(archivePath) || Tasks.Scanner.Parser.Parser.IsEpub(archivePath))) return ArchiveLibrary.NotSupported;

        var ext = _directoryService.FileSystem.Path.GetExtension(archivePath).ToUpper();
        if (ext.Equals(".CBR") || ext.Equals(".RAR")) return ArchiveLibrary.SharpCompress;

        try
        {
            using var a2 = ZipFile.OpenRead(archivePath);
            return ArchiveLibrary.Default;
        }
        catch (Exception)
        {
            try
            {
                using var a1 = ArchiveFactory.Open(archivePath);
                return ArchiveLibrary.SharpCompress;
            }
            catch (Exception)
            {
                return ArchiveLibrary.NotSupported;
            }
        }
    }

    public int GetNumberOfPagesFromArchive(string archivePath)
    {
        if (!IsValidArchive(archivePath))
        {
            _logger.LogError("Archive {ArchivePath} could not be found", archivePath);
            return 0;
        }

        try
        {
            var libraryHandler = CanOpen(archivePath);
            switch (libraryHandler)
            {
                case ArchiveLibrary.Default:
                {
                    using var archive = ZipFile.OpenRead(archivePath);
                    return archive.Entries.Count(e => !Tasks.Scanner.Parser.Parser.HasBlacklistedFolderInPath(e.FullName) && Tasks.Scanner.Parser.Parser.IsImage(e.FullName));
                }
                case ArchiveLibrary.SharpCompress:
                {
                    using var archive = ArchiveFactory.Open(archivePath);
                    return archive.Entries.Count(entry => !entry.IsDirectory &&
                                                          !Tasks.Scanner.Parser.Parser.HasBlacklistedFolderInPath(Path.GetDirectoryName(entry.Key) ?? string.Empty)
                                                          && Tasks.Scanner.Parser.Parser.IsImage(entry.Key));
                }
                case ArchiveLibrary.NotSupported:
                    _logger.LogWarning("[GetNumberOfPagesFromArchive] This archive cannot be read: {ArchivePath}. Defaulting to 0 pages", archivePath);
                    return 0;
                default:
                    _logger.LogWarning("[GetNumberOfPagesFromArchive] There was an exception when reading archive stream: {ArchivePath}. Defaulting to 0 pages", archivePath);
                    return 0;
            }
        }
        catch (Exception ex)
        {
            _logger.LogWarning(ex, "[GetNumberOfPagesFromArchive] There was an exception when reading archive stream: {ArchivePath}. Defaulting to 0 pages", archivePath);
            return 0;
        }
    }

    /// <summary>
    /// Finds the first instance of a folder entry and returns it
    /// </summary>
    /// <param name="entryFullNames"></param>
    /// <returns>Entry name of match, null if no match</returns>
    public static string FindFolderEntry(IEnumerable<string> entryFullNames)
    {
        var result = entryFullNames
            .Where(path => !(Path.EndsInDirectorySeparator(path) || Tasks.Scanner.Parser.Parser.HasBlacklistedFolderInPath(path) || path.StartsWith(Tasks.Scanner.Parser.Parser.MacOsMetadataFileStartsWith)))
            .OrderByNatural(Path.GetFileNameWithoutExtension)
            .FirstOrDefault(Tasks.Scanner.Parser.Parser.IsCoverImage);

        return string.IsNullOrEmpty(result) ? null : result;
    }

    /// <summary>
    /// Returns first entry that is an image and is not in a blacklisted folder path. Uses <see cref="EnumerableExtensions.OrderByNatural"/> for ordering files
    /// </summary>
    /// <param name="entryFullNames"></param>
    /// <param name="archiveName"></param>
    /// <returns>Entry name of match, null if no match</returns>
    public static string? FirstFileEntry(IEnumerable<string> entryFullNames, string archiveName)
    {
        // First check if there are any files that are not in a nested folder before just comparing by filename. This is needed
        // because NaturalSortComparer does not work with paths and doesn't see 001.jpg as coming before chapter 1/001.jpg.
        var fullNames = entryFullNames
            .Where(path => !(Path.EndsInDirectorySeparator(path) || Tasks.Scanner.Parser.Parser.HasBlacklistedFolderInPath(path) || path.StartsWith(Tasks.Scanner.Parser.Parser.MacOsMetadataFileStartsWith)) && Tasks.Scanner.Parser.Parser.IsImage(path))
            .OrderByNatural(c => c.GetFullPathWithoutExtension())
            .ToList();
        if (fullNames.Count == 0) return null;

        var nonNestedFile = fullNames.Where(entry => (Path.GetDirectoryName(entry) ?? string.Empty).Equals(archiveName))
            .OrderByNatural(c => c.GetFullPathWithoutExtension())
            .FirstOrDefault();

        if (!string.IsNullOrEmpty(nonNestedFile)) return nonNestedFile;

        // Check the first folder and sort within that to see if we can find a file, else fallback to first file with basic sort.
        // Get first folder, then sort within that
        var firstDirectoryFile = fullNames.OrderByNatural(Path.GetDirectoryName).FirstOrDefault();
        if (!string.IsNullOrEmpty(firstDirectoryFile))
        {
            var firstDirectory = Path.GetDirectoryName(firstDirectoryFile);
            if (!string.IsNullOrEmpty(firstDirectory))
            {
                var firstDirectoryResult = fullNames.Where(f => firstDirectory.Equals(Path.GetDirectoryName(f)))
                    .OrderByNatural(Path.GetFileNameWithoutExtension)
                    .FirstOrDefault();
                if (!string.IsNullOrEmpty(firstDirectoryResult)) return firstDirectoryResult;
            }
        }

        var result = fullNames
            .OrderByNatural(Path.GetFileNameWithoutExtension)
            .FirstOrDefault();

        return string.IsNullOrEmpty(result) ? null : result;
    }

    /// <summary>
    /// Generates byte array of cover image.
    /// Given a path to a compressed file <see cref="Tasks.Scanner.Parser.Parser.ArchiveFileExtensions"/>, will ensure the first image (respects directory structure) is returned unless
    /// a folder/cover.(image extension) exists in the compressed file (if duplicate, the first is chosen)
    ///
    /// This skips over any __MACOSX folder/file iteration.
    /// </summary>
    /// <remarks>This always creates a thumbnail</remarks>
    /// <param name="archivePath"></param>
    /// <param name="fileName">File name to use based on context of entity.</param>
    /// <param name="outputDirectory">Where to output the file, defaults to covers directory</param>
    /// <returns></returns>
    public string GetCoverImage(string archivePath, string fileName, string outputDirectory)
    {
        if (archivePath == null || !IsValidArchive(archivePath)) return string.Empty;
        try
        {
            var libraryHandler = CanOpen(archivePath);
            switch (libraryHandler)
            {
                case ArchiveLibrary.Default:
                {
                    using var archive = ZipFile.OpenRead(archivePath);
                    var entryName = FindCoverImageFilename(archivePath, archive.Entries.Select(e => e.FullName));
                    var entry = archive.Entries.Single(e => e.FullName == entryName);

                    using var stream = entry.Open();
                    return _imageService.WriteCoverThumbnail(stream, fileName, outputDirectory);
                }
                case ArchiveLibrary.SharpCompress:
                {
                    using var archive = ArchiveFactory.Open(archivePath);
                    var entryNames = archive.Entries.Where(archiveEntry => !archiveEntry.IsDirectory).Select(e => e.Key).ToList();

                    var entryName = FindCoverImageFilename(archivePath, entryNames);
                    var entry = archive.Entries.Single(e => e.Key == entryName);

                    using var stream = entry.OpenEntryStream();
                    return _imageService.WriteCoverThumbnail(stream, fileName, outputDirectory);
                }
                case ArchiveLibrary.NotSupported:
                    _logger.LogWarning("[GetCoverImage] This archive cannot be read: {ArchivePath}. Defaulting to no cover image", archivePath);
                    return string.Empty;
                default:
                    _logger.LogWarning("[GetCoverImage] There was an exception when reading archive stream: {ArchivePath}. Defaulting to no cover image", archivePath);
                    return string.Empty;
            }
        }
        catch (Exception ex)
        {
            _logger.LogWarning(ex, "[GetCoverImage] There was an exception when reading archive stream: {ArchivePath}. Defaulting to no cover image", archivePath);
        }

        return string.Empty;
    }

    /// <summary>
    /// Given a list of image paths (assume within an archive), find the filename that corresponds to the cover
    /// </summary>
    /// <param name="archivePath"></param>
    /// <param name="entryNames"></param>
    /// <returns></returns>
    public static string FindCoverImageFilename(string archivePath, IEnumerable<string> entryNames)
    {
        var entryName = FindFolderEntry(entryNames) ?? FirstFileEntry(entryNames, Path.GetFileName(archivePath));
        return entryName;
    }

    /// <summary>
    /// Given an archive stream, will assess whether directory needs to be flattened so that the extracted archive files are directly
    /// under extract path and not nested in subfolders. See <see cref="DirectoryService"/> Flatten method.
    /// </summary>
    /// <param name="archive">An opened archive stream</param>
    /// <returns></returns>
    public bool ArchiveNeedsFlattening(ZipArchive archive)
    {
        // Sometimes ZipArchive will list the directory and others it will just keep it in the FullName
        return archive.Entries.Count > 0 &&
               !Path.HasExtension(archive.Entries.ElementAt(0).FullName) ||
               archive.Entries.Any(e => e.FullName.Contains(Path.AltDirectorySeparatorChar) && !Tasks.Scanner.Parser.Parser.HasBlacklistedFolderInPath(e.FullName));
    }

    /// <summary>
    /// Creates a zip file from the listed files and outputs to the temp folder.
    /// </summary>
    /// <param name="files">List of files to be zipped up. Should be full file paths.</param>
    /// <param name="tempFolder">Temp folder name to use for preparing the files. Will be created and deleted.</param>
    /// <returns>Path to the temp zip</returns>
    /// <exception cref="KavitaException"></exception>
    public string CreateZipForDownload(IEnumerable<string> files, string tempFolder)
    {
        var dateString = DateTime.Now.ToShortDateString().Replace("/", "_");

        var tempLocation = Path.Join(_directoryService.TempDirectory, $"{tempFolder}_{dateString}");
        var potentialExistingFile = _directoryService.FileSystem.FileInfo.FromFileName(Path.Join(_directoryService.TempDirectory, $"kavita_{tempFolder}_{dateString}.zip"));
        if (potentialExistingFile.Exists)
        {
            // A previous download exists, just return it immediately
            return potentialExistingFile.FullName;
        }

        _directoryService.ExistOrCreate(tempLocation);

        if (!_directoryService.CopyFilesToDirectory(files, tempLocation))
        {
            throw new KavitaException("Unable to copy files to temp directory archive download.");
        }

        var zipPath = Path.Join(_directoryService.TempDirectory, $"kavita_{tempFolder}_{dateString}.zip");
        try
        {
            ZipFile.CreateFromDirectory(tempLocation, zipPath);
        }
        catch (AggregateException ex)
        {
            _logger.LogError(ex, "There was an issue creating temp archive");
            throw new KavitaException("There was an issue creating temp archive");
        }

        return zipPath;
    }

    /// <summary>
    /// Tests whether the archive path exists and is an archive
    /// </summary>
    /// <param name="archivePath"></param>
    /// <returns></returns>
    public bool IsValidArchive(string archivePath)
    {
        if (!File.Exists(archivePath))
        {
            _logger.LogWarning("Archive {ArchivePath} could not be found", archivePath);
            return false;
        }

        if (Tasks.Scanner.Parser.Parser.IsArchive(archivePath) || Tasks.Scanner.Parser.Parser.IsEpub(archivePath)) return true;

        _logger.LogWarning("Archive {ArchivePath} is not a valid archive", archivePath);
        return false;
    }

    private static bool IsComicInfoArchiveEntry(string fullName, string name)
    {
        return !Tasks.Scanner.Parser.Parser.HasBlacklistedFolderInPath(fullName)
               && name.Equals(ComicInfoFilename, StringComparison.OrdinalIgnoreCase)
               && !name.StartsWith(Tasks.Scanner.Parser.Parser.MacOsMetadataFileStartsWith);
    }

    /// <summary>
    /// This can be null if nothing is found or any errors occur during access
    /// </summary>
    /// <param name="archivePath"></param>
    /// <returns></returns>
    public ComicInfo? GetComicInfo(string archivePath)
    {
        if (!IsValidArchive(archivePath)) return null;

        try
        {
            if (!File.Exists(archivePath)) return null;

            var libraryHandler = CanOpen(archivePath);
            switch (libraryHandler)
            {
                case ArchiveLibrary.Default:
                {
                    using var archive = ZipFile.OpenRead(archivePath);

                    var entry = archive.Entries.FirstOrDefault(x => (x.FullName ?? x.Name) == ComicInfoFilename) ??
                                archive.Entries.FirstOrDefault(x => IsComicInfoArchiveEntry(x.FullName, x.Name));
                    if (entry != null)
                    {
                        using var stream = entry.Open();
                        var serializer = new XmlSerializer(typeof(ComicInfo));
                        var info = (ComicInfo) serializer.Deserialize(stream);
                        ComicInfo.CleanComicInfo(info);
                        return info;
                    }

                    break;
                }
                case ArchiveLibrary.SharpCompress:
                {
                    using var archive = ArchiveFactory.Open(archivePath);
                    var entry = archive.Entries.FirstOrDefault(entry => entry.Key == ComicInfoFilename) ??
                                archive.Entries.FirstOrDefault(entry =>
                                    IsComicInfoArchiveEntry(Path.GetDirectoryName(entry.Key), entry.Key));

                    if (entry != null)
                    {
                        using var stream = entry.OpenEntryStream();
                        var serializer = new XmlSerializer(typeof(ComicInfo));
                        var info = (ComicInfo) serializer.Deserialize(stream);
                        ComicInfo.CleanComicInfo(info);
                        return info;
                    }

                    break;
                }
                case ArchiveLibrary.NotSupported:
                    _logger.LogWarning("[GetComicInfo] This archive cannot be read: {ArchivePath}", archivePath);
                    return null;
                default:
                    _logger.LogWarning(
                        "[GetComicInfo] There was an exception when reading archive stream: {ArchivePath}",
                        archivePath);
                    return null;
            }
        }
        catch (Exception ex)
        {
            _logger.LogWarning(ex, "[GetComicInfo] There was an exception when reading archive stream: {Filepath}", archivePath);
        }

        return null;
    }

    private void ExtractArchiveEntities(IEnumerable<IArchiveEntry> entries, string extractPath)
    {
        _directoryService.ExistOrCreate(extractPath);
        foreach (var entry in entries)
        {
            entry.WriteToDirectory(extractPath, new ExtractionOptions()
            {
                ExtractFullPath = true, // Don't flatten, let the flattener ensure correct order of nested folders
                Overwrite = false
            });
        }
    }

    private void ExtractArchiveEntries(ZipArchive archive, string extractPath)
    {
        var needsFlattening = ArchiveNeedsFlattening(archive);
        if (!archive.HasFiles() && !needsFlattening) return;

        archive.ExtractToDirectory(extractPath, true);
        if (!needsFlattening) return;

        _logger.LogDebug("Extracted archive is nested in root folder, flattening...");
        _directoryService.Flatten(extractPath);
    }

    /// <summary>
    /// Extracts an archive to a temp cache directory. Returns path to new directory. If temp cache directory already exists,
    /// will return that without performing an extraction. Returns empty string if there are any invalidations which would
    /// prevent operations to perform correctly (missing archivePath file, empty archive, etc).
    /// </summary>
    /// <param name="archivePath">A valid file to an archive file.</param>
    /// <param name="extractPath">Path to extract to</param>
    /// <returns></returns>
    public void ExtractArchive(string archivePath, string extractPath)
    {
        if (!IsValidArchive(archivePath)) return;

        if (Directory.Exists(extractPath)) return;

        if (!_directoryService.FileSystem.File.Exists(archivePath))
        {
            _logger.LogError("{Archive} does not exist on disk", archivePath);
            throw new KavitaException($"{archivePath} does not exist on disk");
        }

        var sw = Stopwatch.StartNew();

        try
        {
            var libraryHandler = CanOpen(archivePath);
            switch (libraryHandler)
            {
                case ArchiveLibrary.Default:
                {
                    using var archive = ZipFile.OpenRead(archivePath);
                    ExtractArchiveEntries(archive, extractPath);
                    break;
                }
                case ArchiveLibrary.SharpCompress:
                {
                    using var archive = ArchiveFactory.Open(archivePath);
                    ExtractArchiveEntities(archive.Entries.Where(entry => !entry.IsDirectory
                                                                          && !Tasks.Scanner.Parser.Parser.HasBlacklistedFolderInPath(Path.GetDirectoryName(entry.Key) ?? string.Empty)
                                                                          && Tasks.Scanner.Parser.Parser.IsImage(entry.Key)), extractPath);
                    break;
                }
                case ArchiveLibrary.NotSupported:
                    _logger.LogWarning("[ExtractArchive] This archive cannot be read: {ArchivePath}", archivePath);
                    return;
                default:
                    _logger.LogWarning("[ExtractArchive] There was an exception when reading archive stream: {ArchivePath}", archivePath);
                    return;
            }
        }
        catch (Exception e)
        {
            _logger.LogWarning(e, "[ExtractArchive] There was a problem extracting {ArchivePath} to {ExtractPath}", archivePath, extractPath);
            throw new KavitaException(
                $"There was an error when extracting {archivePath}. Check the file exists, has read permissions or the server OS can support all path characters.");
        }

        _logger.LogDebug("Extracted archive to {ExtractPath} in {ElapsedMilliseconds} milliseconds", extractPath, sw.ElapsedMilliseconds);
    }
}
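FindFolderEntry and FirstFileEntry are static and side-effect free, which makes the cover-selection rules easy to check in isolation. A small sketch with invented entry names (assumes it compiles inside the API project, next to ArchiveService):

    using System;
    using API.Services;

    public static class CoverSelectionExample
    {
        public static void Main()
        {
            // An explicit cover image wins over natural sort order.
            var entries = new[] { "Chapter 1/001.jpg", "Chapter 1/002.jpg", "cover.jpg" };
            Console.WriteLine(ArchiveService.FindCoverImageFilename("Series v01.cbz", entries)); // cover.jpg

            // Without one, the naturally-first image within the first folder is chosen.
            var noCover = new[] { "Series v01/010.jpg", "Series v01/002.jpg" };
            Console.WriteLine(ArchiveService.FindCoverImageFilename("Series v01.cbz", noCover)); // Series v01/002.jpg
        }
    }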

File diff suppressed because it is too large


@@ -3,6 +3,7 @@ using System.Collections.Generic;
 using System.IO;
 using System.Linq;
 using System.Threading.Tasks;
+using API.Constants;
 using API.Data;
 using API.DTOs.Reader;
 using API.Entities;
@@ -78,15 +79,14 @@ public class BookmarkService : IBookmarkService
     /// <returns>If the save to DB and copy was successful</returns>
     public async Task<bool> BookmarkPage(AppUser userWithBookmarks, BookmarkDto bookmarkDto, string imageToBookmark)
     {
         if (userWithBookmarks == null || userWithBookmarks.Bookmarks == null) return false;
         try
         {
-            var userBookmark =
-                await _unitOfWork.UserRepository.GetBookmarkForPage(bookmarkDto.Page, bookmarkDto.ChapterId, userWithBookmarks.Id);
+            var userBookmark = userWithBookmarks.Bookmarks.SingleOrDefault(b => b.Page == bookmarkDto.Page && b.ChapterId == bookmarkDto.ChapterId);
             if (userBookmark != null)
             {
                 _logger.LogError("Bookmark already exists for Series {SeriesId}, Volume {VolumeId}, Chapter {ChapterId}, Page {PageNum}", bookmarkDto.SeriesId, bookmarkDto.VolumeId, bookmarkDto.ChapterId, bookmarkDto.Page);
-                return false;
+                return true;
             }
 
             var fileInfo = _directoryService.FileSystem.FileInfo.FromFileName(imageToBookmark);
@@ -100,14 +100,13 @@ public class BookmarkService : IBookmarkService
                 VolumeId = bookmarkDto.VolumeId,
                 SeriesId = bookmarkDto.SeriesId,
                 ChapterId = bookmarkDto.ChapterId,
-                FileName = Path.Join(targetFolderStem, fileInfo.Name)
+                FileName = Path.Join(targetFolderStem, fileInfo.Name),
+                AppUserId = userWithBookmarks.Id
             };
 
             _directoryService.CopyFileToDirectory(imageToBookmark, targetFilepath);
-            userWithBookmarks.Bookmarks ??= new List<AppUserBookmark>();
-            userWithBookmarks.Bookmarks.Add(bookmark);
-            _unitOfWork.UserRepository.Update(userWithBookmarks);
+            _unitOfWork.UserRepository.Add(bookmark);
 
             await _unitOfWork.CommitAsync();
 
             if (settings.ConvertBookmarkToWebP)
@@ -135,15 +134,12 @@ public class BookmarkService : IBookmarkService
     public async Task<bool> RemoveBookmarkPage(AppUser userWithBookmarks, BookmarkDto bookmarkDto)
     {
         if (userWithBookmarks.Bookmarks == null) return true;
+        var bookmarkToDelete = userWithBookmarks.Bookmarks.SingleOrDefault(x =>
+            x.ChapterId == bookmarkDto.ChapterId && x.Page == bookmarkDto.Page);
         try
         {
-            var bookmarkToDelete = userWithBookmarks.Bookmarks.SingleOrDefault(x =>
-                x.ChapterId == bookmarkDto.ChapterId && x.AppUserId == userWithBookmarks.Id && x.Page == bookmarkDto.Page &&
-                x.SeriesId == bookmarkDto.SeriesId);
             if (bookmarkToDelete != null)
             {
-                await DeleteBookmarkFiles(new[] {bookmarkToDelete});
                 _unitOfWork.UserRepository.Delete(bookmarkToDelete);
             }
@@ -151,10 +147,10 @@ public class BookmarkService : IBookmarkService
         }
         catch (Exception)
        {
-            await _unitOfWork.RollbackAsync();
             return false;
         }
+        await DeleteBookmarkFiles(new[] {bookmarkToDelete});
         return true;
     }
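The practical upshot of the BookmarkPage change is that bookmarking is now idempotent from the caller's side. A sketch of the resulting contract (the surrounding method is hypothetical; _bookmarkService is an injected IBookmarkService):

    // First call creates the bookmark; a repeat call logs the duplicate and still reports success.
    public async Task BookmarkTwice(AppUser user, BookmarkDto bookmarkDto, string imagePath)
    {
        var first = await _bookmarkService.BookmarkPage(user, bookmarkDto, imagePath);  // true: bookmark created
        var second = await _bookmarkService.BookmarkPage(user, bookmarkDto, imagePath); // true: duplicate, nothing written
    }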


@@ -10,245 +10,244 @@ using API.Extensions;
using Kavita.Common;
using Microsoft.Extensions.Logging;

namespace API.Services;

public interface ICacheService
{
    /// <summary>
    /// Ensures the cache is created for the given chapter and if not, will create it. Should be called before any other
    /// cache operations (except cleanup).
    /// </summary>
    /// <param name="chapterId"></param>
    /// <returns>Chapter for the passed chapterId. Side-effect from ensuring cache.</returns>
    Task<Chapter> Ensure(int chapterId);
    /// <summary>
    /// Clears cache directory of all volumes. This can be invoked from deleting a library or a series.
    /// </summary>
    /// <param name="chapterIds">Volumes that belong to that library. Assume the library might have been deleted before this invocation.</param>
    void CleanupChapters(IEnumerable<int> chapterIds);
    void CleanupBookmarks(IEnumerable<int> seriesIds);
    string GetCachedPagePath(Chapter chapter, int page);
    string GetCachedBookmarkPagePath(int seriesId, int page);
    string GetCachedFile(Chapter chapter);
    public void ExtractChapterFiles(string extractPath, IReadOnlyList<MangaFile> files);
    Task<int> CacheBookmarkForSeries(int userId, int seriesId);
    void CleanupBookmarkCache(int seriesId);
}

public class CacheService : ICacheService
{
    private readonly ILogger<CacheService> _logger;
    private readonly IUnitOfWork _unitOfWork;
    private readonly IDirectoryService _directoryService;
    private readonly IReadingItemService _readingItemService;
    private readonly IBookmarkService _bookmarkService;

    public CacheService(ILogger<CacheService> logger, IUnitOfWork unitOfWork,
        IDirectoryService directoryService, IReadingItemService readingItemService,
        IBookmarkService bookmarkService)
    {
        _logger = logger;
        _unitOfWork = unitOfWork;
        _directoryService = directoryService;
        _readingItemService = readingItemService;
        _bookmarkService = bookmarkService;
    }

    public string GetCachedBookmarkPagePath(int seriesId, int page)
    {
        // Calculate what chapter the page belongs to
        var path = GetBookmarkCachePath(seriesId);
        var files = _directoryService.GetFilesWithExtension(path, Tasks.Scanner.Parser.Parser.ImageFileExtensions);
        files = files
            .AsEnumerable()
            .OrderByNatural(Path.GetFileNameWithoutExtension)
            .ToArray();

        if (files.Length == 0)
        {
            return string.Empty;
        }

        // Since array is 0 based, we need to keep that in account (only affects last image)
        return page == files.Length ? files.ElementAt(page - 1) : files.ElementAt(page);
    }

    /// <summary>
    /// Returns the full path to the cached file. If the file does not exist, will fallback to the original.
    /// </summary>
    /// <param name="chapter"></param>
    /// <returns></returns>
    public string GetCachedFile(Chapter chapter)
    {
        var extractPath = GetCachePath(chapter.Id);
        var path = Path.Join(extractPath, _directoryService.FileSystem.Path.GetFileName(chapter.Files.First().FilePath));
        if (!(_directoryService.FileSystem.FileInfo.FromFileName(path).Exists))
        {
            path = chapter.Files.First().FilePath;
        }
        return path;
    }

    /// <summary>
    /// Caches the files for the given chapter to CacheDirectory
    /// </summary>
    /// <param name="chapterId"></param>
    /// <returns>This will always return the Chapter for the chapterId</returns>
    public async Task<Chapter> Ensure(int chapterId)
    {
        _directoryService.ExistOrCreate(_directoryService.CacheDirectory);
        var chapter = await _unitOfWork.ChapterRepository.GetChapterAsync(chapterId);
        var extractPath = GetCachePath(chapterId);

        if (_directoryService.Exists(extractPath)) return chapter;
        var files = chapter.Files.ToList();
        ExtractChapterFiles(extractPath, files);

        return chapter;
    }

    /// <summary>
    /// This is an internal method for cache service for extracting chapter files to disk. The code is structured
    /// for cache service, but can be re-used (download bookmarks)
    /// </summary>
    /// <param name="extractPath"></param>
    /// <param name="files"></param>
    /// <returns></returns>
    public void ExtractChapterFiles(string extractPath, IReadOnlyList<MangaFile> files)
    {
        var removeNonImages = true;
        var fileCount = files.Count;
        var extraPath = "";
        var extractDi = _directoryService.FileSystem.DirectoryInfo.FromDirectoryName(extractPath);

        if (files.Count > 0 && files[0].Format == MangaFormat.Image)
        {
            _readingItemService.Extract(files[0].FilePath, extractPath, MangaFormat.Image, files.Count);
            _directoryService.Flatten(extractDi.FullName);
        }

        foreach (var file in files)
        {
            if (fileCount > 1)
            {
                extraPath = file.Id + string.Empty;
            }

            switch (file.Format)
            {
                case MangaFormat.Archive:
                    _readingItemService.Extract(file.FilePath, Path.Join(extractPath, extraPath), file.Format);
                    break;
                case MangaFormat.Epub:
                case MangaFormat.Pdf:
                {
                    removeNonImages = false;
                    if (!_directoryService.FileSystem.File.Exists(files[0].FilePath))
                    {
                        _logger.LogError("{File} does not exist on disk", files[0].FilePath);
                        throw new KavitaException($"{files[0].FilePath} does not exist on disk");
                    }

                    _directoryService.ExistOrCreate(extractPath);
                    _directoryService.CopyFileToDirectory(files[0].FilePath, extractPath);
                    break;
                }
            }
        }

        _directoryService.Flatten(extractDi.FullName);
        if (removeNonImages)
        {
            _directoryService.RemoveNonImages(extractDi.FullName);
        }
    }

    /// <summary>
    /// Removes the cached files and folders for a set of chapterIds
    /// </summary>
    /// <param name="chapterIds"></param>
    public void CleanupChapters(IEnumerable<int> chapterIds)
    {
        foreach (var chapter in chapterIds)
        {
            _directoryService.ClearAndDeleteDirectory(GetCachePath(chapter));
        }
    }

    /// <summary>
    /// Removes the cached bookmark files and folders for a set of seriesIds
    /// </summary>
    /// <param name="seriesIds"></param>
    public void CleanupBookmarks(IEnumerable<int> seriesIds)
    {
        foreach (var series in seriesIds)
        {
            _directoryService.ClearAndDeleteDirectory(GetBookmarkCachePath(series));
        }
    }

    /// <summary>
    /// Returns the cache path for a given Chapter. Should be cacheDirectory/{chapterId}/
    /// </summary>
    /// <param name="chapterId"></param>
    /// <returns></returns>
    private string GetCachePath(int chapterId)
    {
        return _directoryService.FileSystem.Path.GetFullPath(_directoryService.FileSystem.Path.Join(_directoryService.CacheDirectory, $"{chapterId}/"));
    }

    private string GetBookmarkCachePath(int seriesId)
    {
        return _directoryService.FileSystem.Path.GetFullPath(_directoryService.FileSystem.Path.Join(_directoryService.CacheDirectory, $"{seriesId}_bookmarks/"));
    }

    /// <summary>
    /// Returns the absolute path of a cached page.
    /// </summary>
    /// <param name="chapter">Chapter entity with Files populated.</param>
    /// <param name="page">Page number to look for</param>
    /// <returns>Page filepath or empty if no files found.</returns>
    public string GetCachedPagePath(Chapter chapter, int page)
    {
        // Calculate what chapter the page belongs to
        var path = GetCachePath(chapter.Id);
        // TODO: We can optimize this by extracting and renaming, so we don't need to scan for the files and can do a direct access
        var files = _directoryService.GetFilesWithExtension(path, Tasks.Scanner.Parser.Parser.ImageFileExtensions)
            .OrderByNatural(Path.GetFileNameWithoutExtension)
            .ToArray();

        if (files.Length == 0)
        {
            return string.Empty;
        }

        // Since array is 0 based, we need to keep that in account (only affects last image)
        return page == files.Length ? files.ElementAt(page - 1) : files.ElementAt(page);
    }

    public async Task<int> CacheBookmarkForSeries(int userId, int seriesId)
    {
        var destDirectory = _directoryService.FileSystem.Path.Join(_directoryService.CacheDirectory, seriesId + "_bookmarks");
        if (_directoryService.Exists(destDirectory)) return _directoryService.GetFiles(destDirectory).Count();

        var bookmarkDtos = await _unitOfWork.UserRepository.GetBookmarkDtosForSeries(userId, seriesId);
        var files = (await _bookmarkService.GetBookmarkFilesById(bookmarkDtos.Select(b => b.Id))).ToList();
        _directoryService.CopyFilesToDirectory(files, destDirectory);
        _directoryService.Flatten(destDirectory);
        return files.Count;
    }

    /// <summary>
    /// Clears the cached bookmarks folder for a given series
    /// </summary>
    /// <param name="seriesId"></param>
    public void CleanupBookmarkCache(int seriesId)
    {
        var destDirectory = _directoryService.FileSystem.Path.Join(_directoryService.CacheDirectory, seriesId + "_bookmarks");
        if (!_directoryService.Exists(destDirectory)) return;

        _directoryService.ClearAndDeleteDirectory(destDirectory);
    }
}
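GetCachedPagePath and GetCachedBookmarkPagePath share one indexing rule: the page number indexes a naturally-sorted file list 0-based, and a request for page == files.Length is clamped to the last image. A standalone sketch of just that rule, with invented file names:

    using System;

    public static class PageIndexExample
    {
        // Mirrors the lookup above: 0-based, with one-past-the-end clamped to the final image.
        private static string PageToFile(string[] sortedFiles, int page) =>
            page == sortedFiles.Length ? sortedFiles[page - 1] : sortedFiles[page];

        public static void Main()
        {
            var files = new[] { "0001.jpg", "0002.jpg", "0003.jpg" };
            Console.WriteLine(PageToFile(files, 0)); // 0001.jpg
            Console.WriteLine(PageToFile(files, 3)); // 0003.jpg (clamped)
        }
    }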


@@ -0,0 +1,125 @@
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using API.Data;
using API.DTOs.Device;
using API.DTOs.Email;
using API.Entities;
using API.Entities.Enums;
using API.SignalR;
using Kavita.Common;
using Microsoft.Extensions.Logging;

namespace API.Services;

public interface IDeviceService
{
    Task<Device> Create(CreateDeviceDto dto, AppUser userWithDevices);
    Task<Device> Update(UpdateDeviceDto dto, AppUser userWithDevices);
    Task<bool> Delete(AppUser userWithDevices, int deviceId);
    Task<bool> SendTo(IReadOnlyList<int> chapterIds, int deviceId);
}

public class DeviceService : IDeviceService
{
    private readonly IUnitOfWork _unitOfWork;
    private readonly ILogger<DeviceService> _logger;
    private readonly IEmailService _emailService;

    public DeviceService(IUnitOfWork unitOfWork, ILogger<DeviceService> logger, IEmailService emailService)
    {
        _unitOfWork = unitOfWork;
        _logger = logger;
        _emailService = emailService;
    }

#nullable enable
    public async Task<Device?> Create(CreateDeviceDto dto, AppUser userWithDevices)
    {
        try
        {
            userWithDevices.Devices ??= new List<Device>();
            var existingDevice = userWithDevices.Devices.SingleOrDefault(d => d.Name.Equals(dto.Name));
            if (existingDevice != null) throw new KavitaException("A device with this name already exists");

            existingDevice = DbFactory.Device(dto.Name);
            existingDevice.Platform = dto.Platform;
            existingDevice.EmailAddress = dto.EmailAddress;

            userWithDevices.Devices.Add(existingDevice);
            _unitOfWork.UserRepository.Update(userWithDevices);

            if (!_unitOfWork.HasChanges()) return existingDevice;
            if (await _unitOfWork.CommitAsync()) return existingDevice;
        }
        catch (Exception ex)
        {
            _logger.LogError(ex, "There was an error when creating your device");
            await _unitOfWork.RollbackAsync();
        }

        return null;
    }

    public async Task<Device?> Update(UpdateDeviceDto dto, AppUser userWithDevices)
    {
        try
        {
            var existingDevice = userWithDevices.Devices.SingleOrDefault(d => d.Id == dto.Id);
            if (existingDevice == null) throw new KavitaException("This device doesn't exist yet. Please create first");

            existingDevice.Name = dto.Name;
            existingDevice.Platform = dto.Platform;
            existingDevice.EmailAddress = dto.EmailAddress;

            if (!_unitOfWork.HasChanges()) return existingDevice;
            if (await _unitOfWork.CommitAsync()) return existingDevice;
        }
        catch (Exception ex)
        {
            _logger.LogError(ex, "There was an error when updating your device");
            await _unitOfWork.RollbackAsync();
        }

        return null;
    }
#nullable disable

    public async Task<bool> Delete(AppUser userWithDevices, int deviceId)
    {
        try
        {
            userWithDevices.Devices = userWithDevices.Devices.Where(d => d.Id != deviceId).ToList();
            _unitOfWork.UserRepository.Update(userWithDevices);

            if (!_unitOfWork.HasChanges()) return true;
            if (await _unitOfWork.CommitAsync()) return true;
        }
        catch (Exception ex)
        {
            _logger.LogError(ex, "There was an issue with deleting the device, {DeviceId} for user {UserName}", deviceId, userWithDevices.UserName);
        }

        return false;
    }

    public async Task<bool> SendTo(IReadOnlyList<int> chapterIds, int deviceId)
    {
        var files = await _unitOfWork.ChapterRepository.GetFilesForChaptersAsync(chapterIds);
        if (files.Any(f => f.Format is not (MangaFormat.Epub or MangaFormat.Pdf)))
            throw new KavitaException("Cannot Send non Epub or Pdf to devices as not supported");

        var device = await _unitOfWork.DeviceRepository.GetDeviceById(deviceId);
        if (device == null) throw new KavitaException("Device doesn't exist");

        device.LastUsed = DateTime.Now;
        _unitOfWork.DeviceRepository.Update(device);
        await _unitOfWork.CommitAsync();

        var success = await _emailService.SendFilesToEmail(new SendToDto()
        {
            DestinationEmail = device.EmailAddress,
            FilePaths = files.Select(m => m.FilePath)
        });
        return success;
    }
}
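DeviceService is the server half of the headline Send To feature: SendTo refuses anything that is not an EPUB or PDF, stamps LastUsed on the device, and forwards the file paths to EmailService. A hypothetical call site, to show the expected inputs (the ids and surrounding method are made up; _deviceService is an injected IDeviceService):

    // Illustrative only: push chapter 42 to device 7 and log on failure.
    public async Task SendChapterToDevice()
    {
        var sent = await _deviceService.SendTo(new List<int> { 42 }, 7);
        if (!sent) _logger.LogError("Send To failed for device {DeviceId}", 7);
    }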

File diff suppressed because it is too large


@@ -3,7 +3,6 @@ using System.Collections.Generic;
 using System.IO;
 using System.Linq;
 using System.Threading.Tasks;
-using API.Constants;
 using API.Entities;
 using Microsoft.AspNetCore.Identity;
 using Microsoft.AspNetCore.StaticFiles;
@@ -14,17 +13,12 @@ public interface IDownloadService
 {
     Tuple<string, string, string> GetFirstFileDownload(IEnumerable<MangaFile> files);
     string GetContentTypeFromFile(string filepath);
-    Task<bool> HasDownloadPermission(AppUser user);
 }

 public class DownloadService : IDownloadService
 {
-    private readonly UserManager<AppUser> _userManager;
     private readonly FileExtensionContentTypeProvider _fileTypeProvider = new FileExtensionContentTypeProvider();
-    public DownloadService(UserManager<AppUser> userManager)
-    {
-        _userManager = userManager;
-    }
+    public DownloadService() { }

     /// <summary>
     /// Downloads the first file in the file enumerable for download
@@ -62,9 +56,5 @@ public class DownloadService : IDownloadService
         return contentType;
     }

-    public async Task<bool> HasDownloadPermission(AppUser user)
-    {
-        var roles = await _userManager.GetRolesAsync(user);
-        return roles.Contains(PolicyConstants.DownloadRole) || roles.Contains(PolicyConstants.AdminRole);
-    }
 }


@@ -1,4 +1,5 @@
using System;
using System.Collections.Generic;
using System.Linq;
using System.Net;
using System.Threading.Tasks;
@ -20,23 +21,29 @@ public interface IEmailService
Task<bool> CheckIfAccessible(string host);
Task<bool> SendMigrationEmail(EmailMigrationDto data);
Task<bool> SendPasswordResetEmail(PasswordResetEmailDto data);
Task<bool> SendFilesToEmail(SendToDto data);
Task<EmailTestResultDto> TestConnectivity(string emailUrl);
Task<bool> IsDefaultEmailService();
Task SendEmailChangeEmail(ConfirmationEmailDto data);
}
public class EmailService : IEmailService
{
private readonly ILogger<EmailService> _logger;
private readonly IUnitOfWork _unitOfWork;
private readonly IDownloadService _downloadService;
/// <summary>
/// This is used to initially set or reset the ServerSettingKey. Do not access directly from code; access via UnitOfWork
/// </summary>
public const string DefaultApiUrl = "https://email.kavitareader.com";
public EmailService(ILogger<EmailService> logger, IUnitOfWork unitOfWork)
public EmailService(ILogger<EmailService> logger, IUnitOfWork unitOfWork, IDownloadService downloadService)
{
_logger = logger;
_unitOfWork = unitOfWork;
_downloadService = downloadService;
FlurlHttp.ConfigureClient(DefaultApiUrl, cli =>
cli.Settings.HttpClientFactory = new UntrustedCertClientFactory());
@@ -58,7 +65,7 @@ public class EmailService : IEmailService
result.Successful = false;
result.ErrorMessage = "This is a local IP address";
}
result.Successful = await SendEmailWithGet(emailUrl + "/api/email/test");
result.Successful = await SendEmailWithGet(emailUrl + "/api/test");
}
catch (KavitaException ex)
{
@@ -69,10 +76,26 @@ public class EmailService : IEmailService
return result;
}
public async Task<bool> IsDefaultEmailService()
{
return (await _unitOfWork.SettingsRepository.GetSettingAsync(ServerSettingKey.EmailServiceUrl)).Value
.Equals(DefaultApiUrl);
}
public async Task SendEmailChangeEmail(ConfirmationEmailDto data)
{
var emailLink = (await _unitOfWork.SettingsRepository.GetSettingAsync(ServerSettingKey.EmailServiceUrl)).Value;
var success = await SendEmailWithPost(emailLink + "/api/account/email-change", data);
if (!success)
{
_logger.LogError("There was a critical error sending Confirmation email");
}
}
public async Task SendConfirmationEmail(ConfirmationEmailDto data)
{
var emailLink = (await _unitOfWork.SettingsRepository.GetSettingAsync(ServerSettingKey.EmailServiceUrl)).Value;
var success = await SendEmailWithPost(emailLink + "/api/email/confirm", data);
var success = await SendEmailWithPost(emailLink + "/api/invite/confirm", data);
if (!success)
{
_logger.LogError("There was a critical error sending Confirmation email");
@@ -85,7 +108,7 @@ public class EmailService : IEmailService
try
{
if (IsLocalIpAddress(host)) return false;
return await SendEmailWithGet(DefaultApiUrl + "/api/email/reachable?host=" + host);
return await SendEmailWithGet(DefaultApiUrl + "/api/reachable?host=" + host);
}
catch (Exception)
{
@@ -96,24 +119,33 @@ public class EmailService : IEmailService
public async Task<bool> SendMigrationEmail(EmailMigrationDto data)
{
var emailLink = (await _unitOfWork.SettingsRepository.GetSettingAsync(ServerSettingKey.EmailServiceUrl)).Value;
return await SendEmailWithPost(emailLink + "/api/email/email-migration", data);
return await SendEmailWithPost(emailLink + "/api/invite/email-migration", data);
}
public async Task<bool> SendPasswordResetEmail(PasswordResetEmailDto data)
{
var emailLink = (await _unitOfWork.SettingsRepository.GetSettingAsync(ServerSettingKey.EmailServiceUrl)).Value;
return await SendEmailWithPost(emailLink + "/api/email/email-password-reset", data);
return await SendEmailWithPost(emailLink + "/api/invite/email-password-reset", data);
}
private static async Task<bool> SendEmailWithGet(string url, int timeoutSecs = 30)
public async Task<bool> SendFilesToEmail(SendToDto data)
{
if (await IsDefaultEmailService()) return false;
var emailLink = (await _unitOfWork.SettingsRepository.GetSettingAsync(ServerSettingKey.EmailServiceUrl)).Value;
return await SendEmailWithFiles(emailLink + "/api/sendto", data.FilePaths, data.DestinationEmail);
}
private async Task<bool> SendEmailWithGet(string url, int timeoutSecs = 30)
{
try
{
var settings = await _unitOfWork.SettingsRepository.GetSettingsDtoAsync();
var response = await (url)
.WithHeader("Accept", "application/json")
.WithHeader("User-Agent", "Kavita")
.WithHeader("x-api-key", "MsnvA2DfQqxSK5jh")
.WithHeader("x-kavita-version", BuildInfo.Version)
.WithHeader("x-kavita-installId", settings.InstallId)
.WithHeader("Content-Type", "application/json")
.WithTimeout(TimeSpan.FromSeconds(timeoutSecs))
.GetStringAsync();
@@ -131,26 +163,69 @@ public class EmailService : IEmailService
}
private static async Task<bool> SendEmailWithPost(string url, object data, int timeoutSecs = 30)
private async Task<bool> SendEmailWithPost(string url, object data, int timeoutSecs = 30)
{
try
{
var settings = await _unitOfWork.SettingsRepository.GetSettingsDtoAsync();
var response = await (url)
.WithHeader("Accept", "application/json")
.WithHeader("User-Agent", "Kavita")
.WithHeader("x-api-key", "MsnvA2DfQqxSK5jh")
.WithHeader("x-kavita-version", BuildInfo.Version)
.WithHeader("x-kavita-installId", settings.InstallId)
.WithHeader("Content-Type", "application/json")
.WithTimeout(TimeSpan.FromSeconds(timeoutSecs))
.PostJsonAsync(data);
if (response.StatusCode != StatusCodes.Status200OK)
{
return false;
var errorMessage = await response.GetStringAsync();
throw new KavitaException(errorMessage);
}
}
catch (Exception)
catch (FlurlHttpException ex)
{
_logger.LogError(ex, "There was an exception when interacting with Email Service");
return false;
}
return true;
}
private async Task<bool> SendEmailWithFiles(string url, IEnumerable<string> filePaths, string destEmail, int timeoutSecs = 300)
{
try
{
var settings = await _unitOfWork.SettingsRepository.GetSettingsDtoAsync();
var response = await (url)
.WithHeader("User-Agent", "Kavita")
.WithHeader("x-api-key", "MsnvA2DfQqxSK5jh")
.WithHeader("x-kavita-version", BuildInfo.Version)
.WithHeader("x-kavita-installId", settings.InstallId)
.WithTimeout(timeoutSecs)
.AllowHttpStatus("4xx")
.PostMultipartAsync(mp =>
{
mp.AddString("email", destEmail);
var index = 1;
foreach (var filepath in filePaths)
{
mp.AddFile("file" + index, filepath, _downloadService.GetContentTypeFromFile(filepath));
index++;
}
}
);
if (response.StatusCode != StatusCodes.Status200OK)
{
var errorMessage = await response.GetStringAsync();
throw new KavitaException(errorMessage);
}
}
catch (FlurlHttpException ex)
{
_logger.LogError(ex, "There was an exception when sending Email for SendTo");
return false;
}
return true;
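
All three senders (GET, POST, multipart) now attach the same identification headers. A possible consolidation, sketched here as a shared request builder (not something this diff does; ServerSettingDto is an assumed name for the settings DTO type):

// Sketch only: a shared Flurl request builder for the common Kavita headers.
private static IFlurlRequest BuildRequest(string url, ServerSettingDto settings) =>
    url.WithHeader("Accept", "application/json")
        .WithHeader("User-Agent", "Kavita")
        .WithHeader("x-api-key", "MsnvA2DfQqxSK5jh")
        .WithHeader("x-kavita-version", BuildInfo.Version)
        .WithHeader("x-kavita-installId", settings.InstallId);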


@@ -6,55 +6,54 @@ using API.Services.Tasks.Scanner;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;
namespace API.Services.HostedServices
namespace API.Services.HostedServices;
public class StartupTasksHostedService : IHostedService
{
public class StartupTasksHostedService : IHostedService
private readonly IServiceProvider _provider;
public StartupTasksHostedService(IServiceProvider serviceProvider)
{
private readonly IServiceProvider _provider;
public StartupTasksHostedService(IServiceProvider serviceProvider)
{
_provider = serviceProvider;
}
public async Task StartAsync(CancellationToken cancellationToken)
{
using var scope = _provider.CreateScope();
var taskScheduler = scope.ServiceProvider.GetRequiredService<ITaskScheduler>();
await taskScheduler.ScheduleTasks();
taskScheduler.ScheduleUpdaterTasks();
try
{
// These methods will automatically check if stat collection is disabled to prevent sending any data, regardless
// of when the setting was changed
await taskScheduler.ScheduleStatsTasks();
await taskScheduler.RunStatCollection();
}
catch (Exception)
{
// If stats startup fails, the user can keep using the app
}
try
{
var unitOfWork = scope.ServiceProvider.GetRequiredService<IUnitOfWork>();
if ((await unitOfWork.SettingsRepository.GetSettingsDtoAsync()).EnableFolderWatching)
{
var libraryWatcher = scope.ServiceProvider.GetRequiredService<ILibraryWatcher>();
await libraryWatcher.StartWatching();
}
}
catch (Exception)
{
// Fail silently
}
}
public Task StopAsync(CancellationToken cancellationToken) => Task.CompletedTask;
_provider = serviceProvider;
}
public async Task StartAsync(CancellationToken cancellationToken)
{
using var scope = _provider.CreateScope();
var taskScheduler = scope.ServiceProvider.GetRequiredService<ITaskScheduler>();
await taskScheduler.ScheduleTasks();
taskScheduler.ScheduleUpdaterTasks();
try
{
// These methods will automatically check if stat collection is disabled to prevent sending any data, regardless
// of when the setting was changed
await taskScheduler.ScheduleStatsTasks();
await taskScheduler.RunStatCollection();
}
catch (Exception)
{
// If stats startup fails, the user can keep using the app
}
try
{
var unitOfWork = scope.ServiceProvider.GetRequiredService<IUnitOfWork>();
if ((await unitOfWork.SettingsRepository.GetSettingsDtoAsync()).EnableFolderWatching)
{
var libraryWatcher = scope.ServiceProvider.GetRequiredService<ILibraryWatcher>();
await libraryWatcher.StartWatching();
}
}
catch (Exception)
{
// Fail silently
}
}
public Task StopAsync(CancellationToken cancellationToken) => Task.CompletedTask;
}
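
For completeness, a hosted service like this takes effect only once registered at startup; a one-line sketch (standard ASP.NET Core registration, assumed to live in this project's service configuration):

services.AddHostedService<StartupTasksHostedService>();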


@@ -128,7 +128,7 @@ public class ImageService : IImageService
return true;
}
catch (Exception ex)
catch (Exception)
{
/* Swallow Exception */
}


@@ -8,6 +8,7 @@ using API.Data;
using API.Data.Metadata;
using API.Data.Repositories;
using API.Data.Scanner;
using API.DTOs.Metadata;
using API.Entities;
using API.Entities.Enums;
using API.Extensions;
@@ -51,7 +52,6 @@ public class MetadataService : IMetadataService
private readonly ICacheHelper _cacheHelper;
private readonly IReadingItemService _readingItemService;
private readonly IDirectoryService _directoryService;
private readonly ChapterSortComparerZeroFirst _chapterSortComparerForInChapterSorting = new ChapterSortComparerZeroFirst();
private readonly IList<SignalRMessage> _updateEvents = new List<SignalRMessage>();
public MetadataService(IUnitOfWork unitOfWork, ILogger<MetadataService> logger,
IEventHub eventHub, ICacheHelper cacheHelper,
@@ -89,7 +89,7 @@ public class MetadataService : IMetadataService
private void UpdateChapterLastModified(Chapter chapter, bool forceUpdate)
{
var firstFile = chapter.Files.MinBy(x => x.Chapter);
if (firstFile == null || _cacheHelper.HasFileNotChangedSinceCreationOrLastScan(chapter, forceUpdate, firstFile)) return;
if (firstFile == null || _cacheHelper.IsFileUnmodifiedSinceCreationOrLastScan(chapter, forceUpdate, firstFile)) return;
firstFile.UpdateLastModified();
}
@@ -108,7 +108,7 @@ public class MetadataService : IMetadataService
volume.Chapters ??= new List<Chapter>();
var firstChapter = volume.Chapters.MinBy(x => double.Parse(x.Number), _chapterSortComparerForInChapterSorting);
var firstChapter = volume.Chapters.MinBy(x => double.Parse(x.Number), ChapterSortComparerZeroFirst.Default);
if (firstChapter == null) return Task.FromResult(false);
volume.CoverImage = firstChapter.CoverImage;
@@ -131,23 +131,8 @@ public class MetadataService : IMetadataService
return Task.CompletedTask;
series.Volumes ??= new List<Volume>();
var firstCover = series.Volumes.GetCoverImage(series.Format);
string coverImage = null;
if (firstCover == null && series.Volumes.Any())
{
// If firstCover is null and there is only one volume, the whole series is Chapters under Vol 0.
if (series.Volumes.Count == 1)
{
coverImage = series.Volumes[0].Chapters.OrderBy(c => double.Parse(c.Number), _chapterSortComparerForInChapterSorting)
.FirstOrDefault(c => !c.IsSpecial)?.CoverImage;
}
series.CoverImage = series.GetCoverImage();
if (!_cacheHelper.CoverImageExists(coverImage))
{
coverImage = series.Volumes[0].Chapters.MinBy(c => double.Parse(c.Number), _chapterSortComparerForInChapterSorting)?.CoverImage;
}
}
series.CoverImage = firstCover?.CoverImage ?? coverImage;
_updateEvents.Add(MessageFactory.CoverUpdateEvent(series.Id, MessageFactoryEntityTypes.Series));
return Task.CompletedTask;
}


@@ -21,10 +21,11 @@ public interface IReaderService
{
Task MarkSeriesAsRead(AppUser user, int seriesId);
Task MarkSeriesAsUnread(AppUser user, int seriesId);
Task MarkChaptersAsRead(AppUser user, int seriesId, IEnumerable<Chapter> chapters);
Task MarkChaptersAsUnread(AppUser user, int seriesId, IEnumerable<Chapter> chapters);
Task MarkChaptersAsRead(AppUser user, int seriesId, IList<Chapter> chapters);
Task MarkChaptersAsUnread(AppUser user, int seriesId, IList<Chapter> chapters);
Task<bool> SaveReadingProgress(ProgressDto progressDto, int userId);
Task<int> CapPageToChapter(int chapterId, int page);
int CapPageToChapter(Chapter chapter, int page);
Task<int> GetNextChapterIdAsync(int seriesId, int volumeId, int currentChapterId, int userId);
Task<int> GetPrevChapterIdAsync(int seriesId, int volumeId, int currentChapterId, int userId);
Task<ChapterDto> GetContinuePoint(int seriesId, int userId);
@@ -39,8 +40,8 @@ public class ReaderService : IReaderService
private readonly IUnitOfWork _unitOfWork;
private readonly ILogger<ReaderService> _logger;
private readonly IEventHub _eventHub;
private readonly ChapterSortComparer _chapterSortComparer = new ChapterSortComparer();
private readonly ChapterSortComparerZeroFirst _chapterSortComparerForInChapterSorting = new ChapterSortComparerZeroFirst();
private readonly ChapterSortComparer _chapterSortComparer = ChapterSortComparer.Default;
private readonly ChapterSortComparerZeroFirst _chapterSortComparerForInChapterSorting = ChapterSortComparerZeroFirst.Default;
private const float MinWordsPerHour = 10260F;
private const float MaxWordsPerHour = 30000F;
@@ -75,8 +76,6 @@ public class ReaderService : IReaderService
{
await MarkChaptersAsRead(user, seriesId, volume.Chapters);
}
_unitOfWork.UserRepository.Update(user);
}
/// <summary>
@@ -92,18 +91,18 @@ public class ReaderService : IReaderService
{
await MarkChaptersAsUnread(user, seriesId, volume.Chapters);
}
_unitOfWork.UserRepository.Update(user);
}
/// <summary>
/// Marks all Chapters as Read by creating or updating UserProgress rows. Does not commit.
/// </summary>
/// <remarks>Emits events to the UI for each chapter progress and one for each volume progress</remarks>
/// <param name="user"></param>
/// <param name="seriesId"></param>
/// <param name="chapters"></param>
public async Task MarkChaptersAsRead(AppUser user, int seriesId, IEnumerable<Chapter> chapters)
public async Task MarkChaptersAsRead(AppUser user, int seriesId, IList<Chapter> chapters)
{
var seenVolume = new Dictionary<int, bool>();
foreach (var chapter in chapters)
{
var userProgress = GetUserProgressForChapter(user, chapter);
@@ -117,19 +116,29 @@ public class ReaderService : IReaderService
SeriesId = seriesId,
ChapterId = chapter.Id
});
await _eventHub.SendMessageAsync(MessageFactory.UserProgressUpdate,
MessageFactory.UserProgressUpdateEvent(user.Id, user.UserName, seriesId, chapter.VolumeId, chapter.Id, chapter.Pages));
}
else
{
userProgress.PagesRead = chapter.Pages;
userProgress.SeriesId = seriesId;
userProgress.VolumeId = chapter.VolumeId;
await _eventHub.SendMessageAsync(MessageFactory.UserProgressUpdate,
MessageFactory.UserProgressUpdateEvent(user.Id, user.UserName, userProgress.SeriesId, userProgress.VolumeId, userProgress.ChapterId, chapter.Pages));
}
await _eventHub.SendMessageAsync(MessageFactory.UserProgressUpdate,
MessageFactory.UserProgressUpdateEvent(user.Id, user.UserName, seriesId, chapter.VolumeId, chapter.Id, chapter.Pages));
// Send out volume events for each distinct volume
if (!seenVolume.ContainsKey(chapter.VolumeId))
{
seenVolume[chapter.VolumeId] = true;
await _eventHub.SendMessageAsync(MessageFactory.UserProgressUpdate,
MessageFactory.UserProgressUpdateEvent(user.Id, user.UserName, seriesId,
chapter.VolumeId, 0, chapters.Where(c => c.VolumeId == chapter.VolumeId).Sum(c => c.Pages)));
}
}
_unitOfWork.UserRepository.Update(user);
}
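
The seenVolume dictionary above is used purely as a set so that exactly one volume-level progress event is emitted per distinct VolumeId. A HashSet<int> would express the same intent; a sketch of the equivalent guard (an alternative, not what this diff uses):

var seenVolume = new HashSet<int>();
// Add returns false on repeats, so the body runs once per distinct volume.
if (seenVolume.Add(chapter.VolumeId))
{
    // emit the volume-level progress event here
}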
/// <summary>
@@ -138,8 +147,9 @@ public class ReaderService : IReaderService
/// <param name="user"></param>
/// <param name="seriesId"></param>
/// <param name="chapters"></param>
public async Task MarkChaptersAsUnread(AppUser user, int seriesId, IEnumerable<Chapter> chapters)
public async Task MarkChaptersAsUnread(AppUser user, int seriesId, IList<Chapter> chapters)
{
var seenVolume = new Dictionary<int, bool>();
foreach (var chapter in chapters)
{
var userProgress = GetUserProgressForChapter(user, chapter);
@@ -152,7 +162,17 @@ public class ReaderService : IReaderService
await _eventHub.SendMessageAsync(MessageFactory.UserProgressUpdate,
MessageFactory.UserProgressUpdateEvent(user.Id, user.UserName, userProgress.SeriesId, userProgress.VolumeId, userProgress.ChapterId, 0));
// Send out volume events for each distinct volume
if (!seenVolume.ContainsKey(chapter.VolumeId))
{
seenVolume[chapter.VolumeId] = true;
await _eventHub.SendMessageAsync(MessageFactory.UserProgressUpdate,
MessageFactory.UserProgressUpdateEvent(user.Id, user.UserName, seriesId,
chapter.VolumeId, 0, 0));
}
}
_unitOfWork.UserRepository.Update(user);
}
/// <summary>
@@ -273,6 +293,21 @@ public class ReaderService : IReaderService
return page;
}
public int CapPageToChapter(Chapter chapter, int page)
{
if (page > chapter.Pages)
{
page = chapter.Pages;
}
if (page < 0)
{
page = 0;
}
return page;
}
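
Functionally the new overload just clamps page into the range [0, chapter.Pages]; an equivalent one-liner for illustration:

return Math.Clamp(page, 0, chapter.Pages);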
/// <summary>
/// Tries to find the next logical Chapter
/// </summary>
@@ -297,19 +332,29 @@ public class ReaderService : IReaderService
if (chapterId > 0) return chapterId;
}
var currentVolumeNumber = float.Parse(currentVolume.Name);
var next = false;
foreach (var volume in volumes)
{
if (volume.Number == currentVolume.Number && volume.Chapters.Count > 1)
var volumeNumbersMatch = Math.Abs(float.Parse(volume.Name) - currentVolumeNumber) < 0.00001f;
if (volumeNumbersMatch && volume.Chapters.Count > 1)
{
// Handle Chapters within current Volume
// In this case, we need 0 first because 0 represents a full volume file.
var chapterId = GetNextChapterId(currentVolume.Chapters.OrderBy(x => double.Parse(x.Number), _chapterSortComparer),
currentChapter.Range, dto => dto.Range);
if (chapterId > 0) return chapterId;
next = true;
continue;
}
if (volume.Number != currentVolume.Number + 1) continue;
if (volumeNumbersMatch)
{
next = true;
continue;
}
if (!next) continue;
// Handle Chapters within next Volume
// ! When selecting the chapter for the next volume, we need to make sure a c0 comes before a c1+
@@ -373,6 +418,7 @@ public class ReaderService : IReaderService
if (chapterId > 0) return chapterId;
}
var next = false;
foreach (var volume in volumes)
{
if (volume.Number == currentVolume.Number)
@@ -380,8 +426,10 @@ public class ReaderService : IReaderService
var chapterId = GetNextChapterId(currentVolume.Chapters.OrderBy(x => double.Parse(x.Number), _chapterSortComparerForInChapterSorting).Reverse(),
currentChapter.Range, dto => dto.Range);
if (chapterId > 0) return chapterId;
next = true; // When the gap between volume numbers is more than 1, we need to explicitly flag that the next volume we iterate over is the one we want
continue;
}
if (volume.Number == currentVolume.Number - 1)
if (next)
{
if (currentVolume.Number - 1 == 0) break; // If we have walked all the way back to the loose-leaf chapter volume (volume 0), break so the logic outside can handle it
var lastChapter = volume.Chapters.MaxBy(x => double.Parse(x.Number), _chapterSortComparerForInChapterSorting);
@@ -497,7 +545,7 @@ public class ReaderService : IReaderService
var chapters = volume.Chapters
.OrderBy(c => float.Parse(c.Number))
.Where(c => !c.IsSpecial && Tasks.Scanner.Parser.Parser.MaxNumberFromRange(c.Range) <= chapterNumber);
await MarkChaptersAsRead(user, volume.SeriesId, chapters);
await MarkChaptersAsRead(user, volume.SeriesId, chapters.ToList());
}
}
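
To make the new next-flag walk concrete, a small standalone sketch (hypothetical volume names) of the float matching it relies on:

// Volume names are stored as strings ("1", "1.5", "3"), so equality is
// checked by parsing and comparing within an epsilon rather than with ==.
var currentVolumeNumber = float.Parse("1");
var volumeNumbersMatch = Math.Abs(float.Parse("1") - currentVolumeNumber) < 0.00001f; // true
// With volumes [1, 3], once Vol 1 is exhausted the next flag lets Vol 3 be
// selected even though 3 != 1 + 1, so gaps in numbering no longer dead-end.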


@@ -2,6 +2,7 @@
using API.Data.Metadata;
using API.Entities.Enums;
using API.Parser;
using API.Services.Tasks.Scanner.Parser;
namespace API.Services;
@@ -71,8 +72,7 @@ public class ReadingItemService : IReadingItemService
// This catches when original library type is Manga/Comic and when parsing with non
if (Tasks.Scanner.Parser.Parser.IsEpub(path) && Tasks.Scanner.Parser.Parser.ParseVolume(info.Series) != Tasks.Scanner.Parser.Parser.DefaultVolume) // Shouldn't this be info.Volume != DefaultVolume?
{
info = _defaultParser.Parse(path, rootPath, LibraryType.Book);
var info2 = Parse(path, rootPath, type);
var info2 = _defaultParser.Parse(path, rootPath, LibraryType.Book);
info.Merge(info2);
}

View file

@@ -17,7 +17,7 @@ public interface IReadingListService
Task<bool> DeleteReadingListItem(UpdateReadingListPosition dto);
Task<AppUser?> UserHasReadingListAccess(int readingListId, string username);
Task<bool> DeleteReadingList(int readingListId, AppUser user);
Task CalculateReadingListAgeRating(ReadingList readingList);
Task<bool> AddChaptersToReadingList(int seriesId, IList<int> chapterIds,
ReadingList readingList);
}
@@ -41,7 +41,7 @@ public class ReadingListService : IReadingListService
/// <summary>
/// Removes all entries that are fully read from the reading list
/// Removes all entries that are fully read from the reading list. This commits the changes.
/// </summary>
/// <remarks>If called from API layer, expected for <see cref="UserHasReadingListAccess"/> to be called beforehand</remarks>
/// <param name="readingListId">Reading List Id</param>
@@ -62,10 +62,12 @@ public class ReadingListService : IReadingListService
itemIdsToRemove.Contains(r.Id));
_unitOfWork.ReadingListRepository.BulkRemove(listItems);
var readingList = await _unitOfWork.ReadingListRepository.GetReadingListByIdAsync(readingListId);
await CalculateReadingListAgeRating(readingList);
if (!_unitOfWork.HasChanges()) return true;
await _unitOfWork.CommitAsync();
return true;
return await _unitOfWork.CommitAsync();
}
catch
{
@@ -97,6 +99,11 @@ public class ReadingListService : IReadingListService
return await _unitOfWork.CommitAsync();
}
/// <summary>
/// Removes a certain reading list item from a reading list
/// </summary>
/// <param name="dto">Only ReadingListId and ReadingListItemId are used</param>
/// <returns></returns>
public async Task<bool> DeleteReadingListItem(UpdateReadingListPosition dto)
{
var readingList = await _unitOfWork.ReadingListRepository.GetReadingListByIdAsync(dto.ReadingListId);
@@ -109,11 +116,34 @@ public class ReadingListService : IReadingListService
index++;
}
await CalculateReadingListAgeRating(readingList);
if (!_unitOfWork.HasChanges()) return true;
return await _unitOfWork.CommitAsync();
}
/// <summary>
/// Calculates the highest Age Rating from each Reading List Item
/// </summary>
/// <param name="readingList"></param>
public async Task CalculateReadingListAgeRating(ReadingList readingList)
{
await CalculateReadingListAgeRating(readingList, readingList.Items.Select(i => i.SeriesId));
}
/// <summary>
/// Calculates the highest Age Rating from each Reading List Item
/// </summary>
/// <remarks>This method is used when the ReadingList doesn't have items yet</remarks>
/// <param name="readingList"></param>
/// <param name="seriesIds">The series ids of all the reading list items</param>
private async Task CalculateReadingListAgeRating(ReadingList readingList, IEnumerable<int> seriesIds)
{
var ageRating = await _unitOfWork.SeriesRepository.GetMaxAgeRatingFromSeriesAsync(seriesIds);
readingList.AgeRating = ageRating;
}
/// <summary>
/// Validates the user has access to the reading list to perform actions on it
/// </summary>
@@ -167,16 +197,18 @@ public class ReadingListService : IReadingListService
var existingChapterExists = readingList.Items.Select(rli => rli.ChapterId).ToHashSet();
var chaptersForSeries = (await _unitOfWork.ChapterRepository.GetChaptersByIdsAsync(chapterIds))
.OrderBy(c => Tasks.Scanner.Parser.Parser.MinNumberFromRange(c.Volume.Name))
.ThenBy(x => double.Parse(x.Number), _chapterSortComparerForInChapterSorting);
.ThenBy(x => double.Parse(x.Number), _chapterSortComparerForInChapterSorting)
.ToList();
var index = lastOrder + 1;
foreach (var chapter in chaptersForSeries)
foreach (var chapter in chaptersForSeries.Where(chapter => !existingChapterExists.Contains(chapter.Id)))
{
if (existingChapterExists.Contains(chapter.Id)) continue;
readingList.Items.Add(DbFactory.ReadingListItem(index, seriesId, chapter.VolumeId, chapter.Id));
index += 1;
}
await CalculateReadingListAgeRating(readingList, new []{ seriesId });
return index > lastOrder + 1;
}
}


@@ -5,14 +5,17 @@ using System.Linq;
using System.Threading.Tasks;
using API.Comparators;
using API.Data;
using API.Data.Repositories;
using API.DTOs;
using API.DTOs.CollectionTags;
using API.DTOs.Metadata;
using API.DTOs.SeriesDetail;
using API.Entities;
using API.Entities.Enums;
using API.Entities.Metadata;
using API.Helpers;
using API.SignalR;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Logging;
namespace API.Services;
@@ -24,7 +27,8 @@ public interface ISeriesService
Task<bool> UpdateSeriesMetadata(UpdateSeriesMetadataDto updateSeriesMetadataDto);
Task<bool> UpdateRating(AppUser user, UpdateSeriesRatingDto updateSeriesRatingDto);
Task<bool> DeleteMultipleSeries(IList<int> seriesIds);
Task<bool> UpdateRelatedSeries(UpdateRelatedSeriesDto dto);
Task<RelatedSeriesDto> GetRelatedSeries(int userId, int seriesId);
}
public class SeriesService : ISeriesService
@@ -75,6 +79,12 @@ public class SeriesService : ISeriesService
series.Metadata.AgeRatingLocked = true;
}
if (updateSeriesMetadataDto.SeriesMetadata.ReleaseYear > 1000 && series.Metadata.ReleaseYear != updateSeriesMetadataDto.SeriesMetadata.ReleaseYear)
{
series.Metadata.ReleaseYear = updateSeriesMetadataDto.SeriesMetadata.ReleaseYear;
series.Metadata.ReleaseYearLocked = true;
}
if (series.Metadata.PublicationStatus != updateSeriesMetadataDto.SeriesMetadata.PublicationStatus)
{
series.Metadata.PublicationStatus = updateSeriesMetadataDto.SeriesMetadata.PublicationStatus;
@@ -166,6 +176,7 @@ public class SeriesService : ISeriesService
series.Metadata.CoverArtistLocked = updateSeriesMetadataDto.SeriesMetadata.CoverArtistsLocked;
series.Metadata.WriterLocked = updateSeriesMetadataDto.SeriesMetadata.WritersLocked;
series.Metadata.SummaryLocked = updateSeriesMetadataDto.SeriesMetadata.SummaryLocked;
series.Metadata.ReleaseYearLocked = updateSeriesMetadataDto.SeriesMetadata.ReleaseYearLocked;
if (!_unitOfWork.HasChanges())
{
@@ -462,6 +473,17 @@ public class SeriesService : ISeriesService
public async Task<SeriesDetailDto> GetSeriesDetail(int seriesId, int userId)
{
var series = await _unitOfWork.SeriesRepository.GetSeriesDtoByIdAsync(seriesId, userId);
var libraryIds = (await _unitOfWork.LibraryRepository.GetLibraryIdsForUserIdAsync(userId));
if (!libraryIds.Contains(series.LibraryId))
throw new UnauthorizedAccessException("User does not have access to the library this series belongs to");
var user = await _unitOfWork.UserRepository.GetUserByIdAsync(userId);
if (user.AgeRestriction != AgeRating.NotApplicable)
{
var seriesMetadata = await _unitOfWork.SeriesRepository.GetSeriesMetadata(seriesId);
if (seriesMetadata.AgeRating > user.AgeRestriction)
throw new UnauthorizedAccessException("User is not allowed to view this series due to age restrictions");
}
var libraryType = await _unitOfWork.LibraryRepository.GetLibraryTypeAsync(series.LibraryId);
var volumes = (await _unitOfWork.VolumeRepository.GetVolumesDtoAsync(seriesId, userId))
@@ -605,4 +627,76 @@ public class SeriesService : ISeriesService
_ => "Chapter"
};
}
/// <summary>
/// Returns all related series against the passed series Id
/// </summary>
/// <param name="userId"></param>
/// <param name="seriesId"></param>
/// <returns></returns>
public async Task<RelatedSeriesDto> GetRelatedSeries(int userId, int seriesId)
{
return await _unitOfWork.SeriesRepository.GetRelatedSeries(userId, seriesId);
}
/// <summary>
/// Update the relations attached to the Series. Does not generate associated Sequel/Prequel pairs on target series.
/// </summary>
/// <param name="dto"></param>
/// <returns></returns>
public async Task<bool> UpdateRelatedSeries(UpdateRelatedSeriesDto dto)
{
var series = await _unitOfWork.SeriesRepository.GetSeriesByIdAsync(dto.SeriesId, SeriesIncludes.Related);
UpdateRelationForKind(dto.Adaptations, series.Relations.Where(r => r.RelationKind == RelationKind.Adaptation).ToList(), series, RelationKind.Adaptation);
UpdateRelationForKind(dto.Characters, series.Relations.Where(r => r.RelationKind == RelationKind.Character).ToList(), series, RelationKind.Character);
UpdateRelationForKind(dto.Contains, series.Relations.Where(r => r.RelationKind == RelationKind.Contains).ToList(), series, RelationKind.Contains);
UpdateRelationForKind(dto.Others, series.Relations.Where(r => r.RelationKind == RelationKind.Other).ToList(), series, RelationKind.Other);
UpdateRelationForKind(dto.SideStories, series.Relations.Where(r => r.RelationKind == RelationKind.SideStory).ToList(), series, RelationKind.SideStory);
UpdateRelationForKind(dto.SpinOffs, series.Relations.Where(r => r.RelationKind == RelationKind.SpinOff).ToList(), series, RelationKind.SpinOff);
UpdateRelationForKind(dto.AlternativeSettings, series.Relations.Where(r => r.RelationKind == RelationKind.AlternativeSetting).ToList(), series, RelationKind.AlternativeSetting);
UpdateRelationForKind(dto.AlternativeVersions, series.Relations.Where(r => r.RelationKind == RelationKind.AlternativeVersion).ToList(), series, RelationKind.AlternativeVersion);
UpdateRelationForKind(dto.Doujinshis, series.Relations.Where(r => r.RelationKind == RelationKind.Doujinshi).ToList(), series, RelationKind.Doujinshi);
UpdateRelationForKind(dto.Prequels, series.Relations.Where(r => r.RelationKind == RelationKind.Prequel).ToList(), series, RelationKind.Prequel);
UpdateRelationForKind(dto.Sequels, series.Relations.Where(r => r.RelationKind == RelationKind.Sequel).ToList(), series, RelationKind.Sequel);
UpdateRelationForKind(dto.Editions, series.Relations.Where(r => r.RelationKind == RelationKind.Edition).ToList(), series, RelationKind.Edition);
if (!_unitOfWork.HasChanges()) return true;
return await _unitOfWork.CommitAsync();
}
/// <summary>
/// Applies the provided list to the series. Adds new relations and removes deleted relations.
/// </summary>
/// <param name="dtoTargetSeriesIds"></param>
/// <param name="adaptations"></param>
/// <param name="series"></param>
/// <param name="kind"></param>
private void UpdateRelationForKind(ICollection<int> dtoTargetSeriesIds, IEnumerable<SeriesRelation> adaptations, Series series, RelationKind kind)
{
foreach (var adaptation in adaptations.Where(adaptation => !dtoTargetSeriesIds.Contains(adaptation.TargetSeriesId)))
{
// If the seriesId isn't in dto, it means we've removed or reclassified
series.Relations.Remove(adaptation);
}
// At this point, we only have things to add
foreach (var targetSeriesId in dtoTargetSeriesIds)
{
// This ensures we don't allow any duplicates to be added
if (series.Relations.SingleOrDefault(r =>
r.RelationKind == kind && r.TargetSeriesId == targetSeriesId) !=
null) continue;
series.Relations.Add(new SeriesRelation()
{
Series = series,
SeriesId = series.Id,
TargetSeriesId = targetSeriesId,
RelationKind = kind
});
_unitOfWork.SeriesRepository.Update(series);
}
}
}
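
UpdateRelationForKind is a standard set reconciliation made idempotent by the duplicate guard: anything in the DB but absent from the DTO is removed, and anything in the DTO but absent from the DB is added. A standalone sketch with hypothetical integer IDs:

var existing = new HashSet<int> { 2, 3 }; // current TargetSeriesIds for this kind
var desired = new HashSet<int> { 3, 4 };  // IDs supplied by the DTO
var toRemove = existing.Except(desired);  // { 2 } -> series.Relations.Remove(...)
var toAdd = desired.Except(existing);     // { 4 } -> series.Relations.Add(...)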


@@ -0,0 +1,158 @@
using System;
using API.DTOs;
using System.Threading.Tasks;
using API.Data;
using System.Collections.Immutable;
using System.Collections.Generic;
using System.Globalization;
using System.Linq;
using API.Comparators;
using API.Entities;
using AutoMapper;
using Microsoft.Extensions.Logging;
namespace API.Services;
public interface ITachiyomiService
{
Task<ChapterDto> GetLatestChapter(int seriesId, int userId);
Task<bool> MarkChaptersUntilAsRead(AppUser userWithProgress, int seriesId, float chapterNumber);
}
/// <summary>
/// All APIs are for the Tachiyomi extension and app. They contain hacks specific to our implementation and should not
/// be used for any other purpose.
/// </summary>
public class TachiyomiService : ITachiyomiService
{
private readonly IUnitOfWork _unitOfWork;
private readonly IMapper _mapper;
private readonly ILogger<ReaderService> _logger;
private readonly IReaderService _readerService;
private static readonly CultureInfo EnglishCulture = CultureInfo.CreateSpecificCulture("en-US");
public TachiyomiService(IUnitOfWork unitOfWork, IMapper mapper, ILogger<ReaderService> logger, IReaderService readerService)
{
_unitOfWork = unitOfWork;
_readerService = readerService;
_mapper = mapper;
_logger = logger;
}
/// <summary>
/// Gets the latest chapter/volume read.
/// </summary>
/// <param name="seriesId"></param>
/// <param name="userId"></param>
/// <returns>Due to how Tachiyomi works, we need a hack to properly return both chapters and volumes.
/// If it's a chapter, return the ChapterDto as is.
/// If it's a volume, the volume number is returned encoded in the 'Number' attribute of a ChapterDto.
/// The volume number gets divided by 10,000 because that's how Tachiyomi interprets volumes</returns>
public async Task<ChapterDto> GetLatestChapter(int seriesId, int userId)
{
var currentChapter = await _readerService.GetContinuePoint(seriesId, userId);
var prevChapterId =
await _readerService.GetPrevChapterIdAsync(seriesId, currentChapter.VolumeId, currentChapter.Id, userId);
// If prevChapterId is -1, this means either nothing is read or everything is read.
if (prevChapterId == -1)
{
var series = await _unitOfWork.SeriesRepository.GetSeriesDtoByIdAsync(seriesId, userId);
var userHasProgress = series.PagesRead != 0 && series.PagesRead <= series.Pages;
// If the user doesn't have progress, then return null, which the extension will catch as 204 (no content) and report nothing as read
if (!userHasProgress) return null;
// Else return the max chapter to Tachiyomi so it can consider everything read
var volumes = (await _unitOfWork.VolumeRepository.GetVolumes(seriesId)).ToImmutableList();
var looseLeafChapterVolume = volumes.FirstOrDefault(v => v.Number == 0);
if (looseLeafChapterVolume == null)
{
var volumeChapter = _mapper.Map<ChapterDto>(volumes.Last().Chapters.OrderBy(c => float.Parse(c.Number), ChapterSortComparerZeroFirst.Default).Last());
if (volumeChapter.Number == "0")
{
var volume = volumes.First(v => v.Id == volumeChapter.VolumeId);
return new ChapterDto()
{
// Use R to ensure that localization of underlying system doesn't affect the stringification
// https://docs.microsoft.com/en-us/globalization/locale/number-formatting-in-dotnet-framework
Number = (volume.Number / 10_000f).ToString("R", EnglishCulture)
};
}
return new ChapterDto()
{
Number = (int.Parse(volumeChapter.Number) / 10_000f).ToString("R", EnglishCulture)
};
}
var lastChapter = looseLeafChapterVolume.Chapters.OrderBy(c => float.Parse(c.Number), ChapterSortComparer.Default).Last();
return _mapper.Map<ChapterDto>(lastChapter);
}
// There is progress, we now need to figure out the highest volume or chapter and return that.
var prevChapter = await _unitOfWork.ChapterRepository.GetChapterDtoAsync(prevChapterId);
var volumeWithProgress = await _unitOfWork.VolumeRepository.GetVolumeDtoAsync(prevChapter.VolumeId, userId);
// We only encode for single-file volumes
if (volumeWithProgress.Number != 0 && volumeWithProgress.Chapters.Count == 1)
{
// The progress is on a volume, encode it as a fake chapterDTO
return new ChapterDto()
{
// Use R to ensure that localization of underlying system doesn't affect the stringification
// https://docs.microsoft.com/en-us/globalization/locale/number-formatting-in-dotnet-framework
Number = (volumeWithProgress.Number / 10_000f).ToString("R", EnglishCulture)
};
}
// Progress is just on a chapter, return as is
return prevChapter;
}
/// <summary>
/// Marks every chapter and volume that is sorted below the passed number as Read. This will not mark any specials as read.
/// The passed number will also be marked as read
/// </summary>
/// <param name="userWithProgress"></param>
/// <param name="seriesId"></param>
/// <param name="chapterNumber">Can also be a Tachiyomi encoded volume number</param>
public async Task<bool> MarkChaptersUntilAsRead(AppUser userWithProgress, int seriesId, float chapterNumber)
{
userWithProgress.Progresses ??= new List<AppUserProgress>();
switch (chapterNumber)
{
// When Tachiyomi syncs progress and there is no current progress in Tachiyomi, 0.0f is sent.
// Due to the encoding for volumes, this would mark all chapters in volume 0 (loose chapters) as read.
// Hence we catch this and return early, ignoring the request.
case 0.0f:
return true;
case < 1.0f:
{
// This is a hack to track the volume number. We need to map it back by multiplying by 10,000
var volumeNumber = int.Parse($"{(int)(chapterNumber * 10_000)}", EnglishCulture);
await _readerService.MarkVolumesUntilAsRead(userWithProgress, seriesId, volumeNumber);
break;
}
default:
await _readerService.MarkChaptersUntilAsRead(userWithProgress, seriesId, chapterNumber);
break;
}
try {
_unitOfWork.UserRepository.Update(userWithProgress);
if (!_unitOfWork.HasChanges()) return true;
if (await _unitOfWork.CommitAsync()) return true;
} catch (Exception ex) {
_logger.LogError(ex, "There was an error saving progress from tachiyomi");
await _unitOfWork.RollbackAsync();
}
return false;
}
}
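
The volume encoding is easiest to see as a round trip; a short sketch with illustrative values, reusing the service's EnglishCulture constant:

// Encoding: volume 3 is surfaced to Tachiyomi as chapter "0.0003".
var encoded = (3 / 10_000f).ToString("R", EnglishCulture);
// Decoding on sync: any chapterNumber below 1.0f maps back to a volume number.
var volumeNumber = (int)(float.Parse(encoded, EnglishCulture) * 10_000); // 3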


@@ -1,5 +1,6 @@
using System;
using System.Collections.Generic;
using System.Collections.Immutable;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
@@ -8,7 +9,6 @@ using API.Entities.Enums;
using API.Helpers.Converters;
using API.Services.Tasks;
using API.Services.Tasks.Metadata;
using API.Services.Tasks.Scanner;
using Hangfire;
using Microsoft.Extensions.Logging;
@@ -19,6 +19,7 @@ public interface ITaskScheduler
Task ScheduleTasks();
Task ScheduleStatsTasks();
void ScheduleUpdaterTasks();
void ScanFolder(string folderPath, TimeSpan delay);
void ScanFolder(string folderPath);
void ScanLibrary(int libraryId, bool force = false);
void CleanupChapters(int[] chapterIds);
@@ -49,9 +50,14 @@ public class TaskScheduler : ITaskScheduler
public static BackgroundJobServer Client => new BackgroundJobServer();
public const string ScanQueue = "scan";
public const string DefaultQueue = "default";
public const string RemoveFromWantToReadTaskId = "remove-from-want-to-read";
public const string CleanupDbTaskId = "cleanup-db";
public const string CleanupTaskId = "cleanup";
public const string BackupTaskId = "backup";
public const string ScanLibrariesTaskId = "scan-libraries";
public const string ReportStatsTaskId = "report-stats";
public static readonly IList<string> ScanTasks = new List<string>()
{"ScannerService", "ScanLibrary", "ScanLibraries", "ScanFolder", "ScanSeries"};
private static readonly ImmutableArray<string> ScanTasks = ImmutableArray.Create("ScannerService", "ScanLibrary", "ScanLibraries", "ScanFolder", "ScanSeries");
private static readonly Random Rnd = new Random();
@@ -83,27 +89,28 @@ public class TaskScheduler : ITaskScheduler
{
var scanLibrarySetting = setting;
_logger.LogDebug("Scheduling Scan Library Task for {Setting}", scanLibrarySetting);
RecurringJob.AddOrUpdate("scan-libraries", () => _scannerService.ScanLibraries(),
RecurringJob.AddOrUpdate(ScanLibrariesTaskId, () => _scannerService.ScanLibraries(),
() => CronConverter.ConvertToCronNotation(scanLibrarySetting), TimeZoneInfo.Local);
}
else
{
RecurringJob.AddOrUpdate("scan-libraries", () => ScanLibraries(), Cron.Daily, TimeZoneInfo.Local);
RecurringJob.AddOrUpdate(ScanLibrariesTaskId, () => ScanLibraries(), Cron.Daily, TimeZoneInfo.Local);
}
setting = (await _unitOfWork.SettingsRepository.GetSettingAsync(ServerSettingKey.TaskBackup)).Value;
if (setting != null)
{
_logger.LogDebug("Scheduling Backup Task for {Setting}", setting);
RecurringJob.AddOrUpdate("backup", () => _backupService.BackupDatabase(), () => CronConverter.ConvertToCronNotation(setting), TimeZoneInfo.Local);
RecurringJob.AddOrUpdate(BackupTaskId, () => _backupService.BackupDatabase(), () => CronConverter.ConvertToCronNotation(setting), TimeZoneInfo.Local);
}
else
{
RecurringJob.AddOrUpdate("backup", () => _backupService.BackupDatabase(), Cron.Weekly, TimeZoneInfo.Local);
RecurringJob.AddOrUpdate(BackupTaskId, () => _backupService.BackupDatabase(), Cron.Weekly, TimeZoneInfo.Local);
}
RecurringJob.AddOrUpdate("cleanup", () => _cleanupService.Cleanup(), Cron.Daily, TimeZoneInfo.Local);
RecurringJob.AddOrUpdate("cleanup-db", () => _cleanupService.CleanupDbEntries(), Cron.Daily, TimeZoneInfo.Local);
RecurringJob.AddOrUpdate(CleanupTaskId, () => _cleanupService.Cleanup(), Cron.Daily, TimeZoneInfo.Local);
RecurringJob.AddOrUpdate(CleanupDbTaskId, () => _cleanupService.CleanupDbEntries(), Cron.Daily, TimeZoneInfo.Local);
RecurringJob.AddOrUpdate(RemoveFromWantToReadTaskId, () => _cleanupService.CleanupWantToRead(), Cron.Daily, TimeZoneInfo.Local);
}
#region StatsTasks
@@ -119,7 +126,7 @@ public class TaskScheduler : ITaskScheduler
}
_logger.LogDebug("Scheduling stat collection daily");
RecurringJob.AddOrUpdate("report-stats", () => _statsService.Send(), Cron.Daily(Rnd.Next(0, 22)), TimeZoneInfo.Local);
RecurringJob.AddOrUpdate(ReportStatsTaskId, () => _statsService.Send(), Cron.Daily(Rnd.Next(0, 22)), TimeZoneInfo.Local);
}
public void AnalyzeFilesForLibrary(int libraryId, bool forceUpdate = false)
@@ -127,11 +134,14 @@ public class TaskScheduler : ITaskScheduler
BackgroundJob.Enqueue(() => _wordCountAnalyzerService.ScanLibrary(libraryId, forceUpdate));
}
/// <summary>
/// Upon cancelling stats, we report to the Stats service that we will no longer be reporting
/// </summary>
public void CancelStatsTasks()
{
_logger.LogDebug("Cancelling/Removing StatsTasks");
RecurringJob.RemoveIfExists("report-stats");
_logger.LogDebug("Stopping Stat collection as user has opted out");
RecurringJob.RemoveIfExists(ReportStatsTaskId);
_statsService.SendCancellation();
}
/// <summary>
@@ -150,11 +160,16 @@ public class TaskScheduler : ITaskScheduler
public void ScanSiteThemes()
{
_logger.LogInformation("Starting Site Theme scan");
if (HasAlreadyEnqueuedTask("ThemeService", "Scan", Array.Empty<object>(), ScanQueue))
{
_logger.LogInformation("A Theme Scan is already running");
return;
}
_logger.LogInformation("Enqueueing Site Theme scan");
BackgroundJob.Enqueue(() => _themeService.Scan());
}
#endregion
#region UpdateTasks
@@ -166,9 +181,32 @@ public class TaskScheduler : ITaskScheduler
RecurringJob.AddOrUpdate("check-updates", () => CheckForUpdate(), Cron.Daily(Rnd.Next(12, 18)), TimeZoneInfo.Local);
}
public void ScanFolder(string folderPath, TimeSpan delay)
{
var normalizedFolder = Tasks.Scanner.Parser.Parser.NormalizePath(folderPath);
if (HasAlreadyEnqueuedTask(ScannerService.Name, "ScanFolder", new object[] { normalizedFolder }))
{
_logger.LogInformation("Skipped scheduling ScanFolder for {Folder} as a job already queued",
normalizedFolder);
return;
}
_logger.LogInformation("Scheduling ScanFolder for {Folder}", normalizedFolder);
BackgroundJob.Schedule(() => _scannerService.ScanFolder(normalizedFolder), delay);
}
public void ScanFolder(string folderPath)
{
_scannerService.ScanFolder(Tasks.Scanner.Parser.Parser.NormalizePath(folderPath));
var normalizedFolder = Tasks.Scanner.Parser.Parser.NormalizePath(folderPath);
if (HasAlreadyEnqueuedTask(ScannerService.Name, "ScanFolder", new object[] {normalizedFolder}))
{
_logger.LogInformation("Skipped scheduling ScanFolder for {Folder} as a job already queued",
normalizedFolder);
return;
}
_logger.LogInformation("Scheduling ScanFolder for {Folder}", normalizedFolder);
_scannerService.ScanFolder(normalizedFolder);
}
#endregion
@@ -186,17 +224,14 @@ public class TaskScheduler : ITaskScheduler
public void ScanLibrary(int libraryId, bool force = false)
{
var alreadyEnqueued =
HasAlreadyEnqueuedTask("ScannerService", "ScanLibrary", new object[] {libraryId, true}, ScanQueue) ||
HasAlreadyEnqueuedTask("ScannerService", "ScanLibrary", new object[] {libraryId, false}, ScanQueue);
if (alreadyEnqueued)
if (HasScanTaskRunningForLibrary(libraryId))
{
_logger.LogInformation("A duplicate request to scan library for library occured. Skipping");
_logger.LogInformation("A duplicate request for Library Scan on library {LibraryId} occured. Skipping", libraryId);
return;
}
if (RunningAnyTasksByMethod(ScanTasks, ScanQueue))
{
_logger.LogInformation("A Scan is already running, rescheduling ScanLibrary in 3 hours");
_logger.LogInformation("A Library Scan is already running, rescheduling ScanLibrary in 3 hours");
BackgroundJob.Schedule(() => ScanLibrary(libraryId, force), TimeSpan.FromHours(3));
return;
}
@@ -204,7 +239,7 @@ public class TaskScheduler : ITaskScheduler
_logger.LogInformation("Enqueuing library scan for: {LibraryId}", libraryId);
BackgroundJob.Enqueue(() => _scannerService.ScanLibrary(libraryId, force));
// When we do a scan, force cache to re-unpack in case page numbers change
BackgroundJob.Enqueue(() => _cleanupService.CleanupCacheDirectory());
BackgroundJob.Enqueue(() => _cleanupService.CleanupCacheAndTempDirectories());
}
public void CleanupChapters(int[] chapterIds)
@@ -285,34 +320,84 @@ public class TaskScheduler : ITaskScheduler
await _versionUpdaterService.PushUpdate(update);
}
public static bool HasScanTaskRunningForLibrary(int libraryId)
/// <summary>
/// If there is an enqueued or scheduled task for the <see cref="ScannerService.ScanLibrary"/> method
/// </summary>
/// <param name="libraryId"></param>
/// <param name="checkRunningJobs">Checks against jobs currently executing as well</param>
/// <returns></returns>
public static bool HasScanTaskRunningForLibrary(int libraryId, bool checkRunningJobs = true)
{
return
HasAlreadyEnqueuedTask("ScannerService", "ScanLibrary", new object[] {libraryId, true}, ScanQueue) ||
HasAlreadyEnqueuedTask("ScannerService", "ScanLibrary", new object[] {libraryId, false}, ScanQueue);
HasAlreadyEnqueuedTask(ScannerService.Name, "ScanLibrary", new object[] {libraryId, true}, ScanQueue, checkRunningJobs) ||
HasAlreadyEnqueuedTask(ScannerService.Name, "ScanLibrary", new object[] {libraryId, false}, ScanQueue, checkRunningJobs);
}
/// <summary>
/// Checks if this same invocation is already enqueued
/// If there is an enqueued or scheduled task for the <see cref="ScannerService.ScanSeries"/> method
/// </summary>
/// <param name="seriesId"></param>
/// <param name="checkRunningJobs">Checks against jobs currently executing as well</param>
/// <returns></returns>
public static bool HasScanTaskRunningForSeries(int seriesId, bool checkRunningJobs = true)
{
return
HasAlreadyEnqueuedTask(ScannerService.Name, "ScanSeries", new object[] {seriesId, true}, ScanQueue, checkRunningJobs) ||
HasAlreadyEnqueuedTask(ScannerService.Name, "ScanSeries", new object[] {seriesId, false}, ScanQueue, checkRunningJobs);
}
/// <summary>
/// Checks if this same invocation is already enqueued or scheduled
/// </summary>
/// <param name="methodName">Method name that was enqueued</param>
/// <param name="className">Class name the method resides on</param>
/// <param name="args">object[] of arguments in the order they are passed to enqueued job</param>
/// <param name="queue">Queue to check against. Defaults to "default"</param>
/// <param name="checkRunningJobs">Check against running jobs. Defaults to false.</param>
/// <returns></returns>
public static bool HasAlreadyEnqueuedTask(string className, string methodName, object[] args, string queue = DefaultQueue)
public static bool HasAlreadyEnqueuedTask(string className, string methodName, object[] args, string queue = DefaultQueue, bool checkRunningJobs = false)
{
var enqueuedJobs = JobStorage.Current.GetMonitoringApi().EnqueuedJobs(queue, 0, int.MaxValue);
return enqueuedJobs.Any(j => j.Value.InEnqueuedState &&
var ret = enqueuedJobs.Any(j => j.Value.InEnqueuedState &&
j.Value.Job.Method.DeclaringType != null && j.Value.Job.Args.SequenceEqual(args) &&
j.Value.Job.Method.Name.Equals(methodName) &&
j.Value.Job.Method.DeclaringType.Name.Equals(className));
if (ret) return true;
var scheduledJobs = JobStorage.Current.GetMonitoringApi().ScheduledJobs(0, int.MaxValue);
ret = scheduledJobs.Any(j =>
j.Value.Job.Method.DeclaringType != null && j.Value.Job.Args.SequenceEqual(args) &&
j.Value.Job.Method.Name.Equals(methodName) &&
j.Value.Job.Method.DeclaringType.Name.Equals(className));
if (ret) return true;
if (checkRunningJobs)
{
var runningJobs = JobStorage.Current.GetMonitoringApi().ProcessingJobs(0, int.MaxValue);
return runningJobs.Any(j =>
j.Value.Job.Method.DeclaringType != null && j.Value.Job.Args.SequenceEqual(args) &&
j.Value.Job.Method.Name.Equals(methodName) &&
j.Value.Job.Method.DeclaringType.Name.Equals(className));
}
return false;
}
/// <summary>
/// Checks against any jobs that are running or about to run
/// </summary>
/// <param name="classNames"></param>
/// <param name="queue"></param>
/// <returns></returns>
public static bool RunningAnyTasksByMethod(IEnumerable<string> classNames, string queue = DefaultQueue)
{
var enqueuedJobs = JobStorage.Current.GetMonitoringApi().EnqueuedJobs(queue, 0, int.MaxValue);
return enqueuedJobs.Any(j => !j.Value.InEnqueuedState &&
var ret = enqueuedJobs.Any(j => !j.Value.InEnqueuedState &&
classNames.Contains(j.Value.Job.Method.DeclaringType?.Name));
if (ret) return true;
var runningJobs = JobStorage.Current.GetMonitoringApi().ProcessingJobs(0, int.MaxValue);
return runningJobs.Any(j => classNames.Contains(j.Value.Job.Method.DeclaringType?.Name));
}
}
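
With the new checkRunningJobs flag, call sites can guard against double-queuing even while a job is mid-execution. A hedged usage sketch (hypothetical call site, not from this diff):

// Skip enqueueing a library scan if one is queued, scheduled, or running.
if (!TaskScheduler.HasAlreadyEnqueuedTask(ScannerService.Name, "ScanLibrary",
        new object[] {libraryId, false}, TaskScheduler.ScanQueue, checkRunningJobs: true))
{
    BackgroundJob.Enqueue(() => scannerService.ScanLibrary(libraryId, false));
}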


@@ -7,6 +7,7 @@ using System.Threading.Tasks;
using API.Data;
using API.Entities.Enums;
using API.Extensions;
using API.Logging;
using API.SignalR;
using Hangfire;
using Microsoft.AspNetCore.SignalR;
@@ -19,30 +20,27 @@ public interface IBackupService
{
Task BackupDatabase();
/// <summary>
/// Returns a list of full paths of the log files detailed in <see cref="IConfiguration"/>.
/// Returns a list of all log files for Kavita
/// </summary>
/// <param name="maxRollingFiles"></param>
/// <param name="logFileName"></param>
/// <param name="rollFiles">If file rolling is enabled. Defaults to True.</param>
/// <returns></returns>
IEnumerable<string> GetLogFiles(int maxRollingFiles, string logFileName);
IEnumerable<string> GetLogFiles(bool rollFiles = LogLevelOptions.LogRollingEnabled);
}
public class BackupService : IBackupService
{
private readonly IUnitOfWork _unitOfWork;
private readonly ILogger<BackupService> _logger;
private readonly IDirectoryService _directoryService;
private readonly IConfiguration _config;
private readonly IEventHub _eventHub;
private readonly IList<string> _backupFiles;
public BackupService(ILogger<BackupService> logger, IUnitOfWork unitOfWork,
IDirectoryService directoryService, IConfiguration config, IEventHub eventHub)
IDirectoryService directoryService, IEventHub eventHub)
{
_unitOfWork = unitOfWork;
_logger = logger;
_directoryService = directoryService;
_config = config;
_eventHub = eventHub;
_backupFiles = new List<string>()
@@ -56,12 +54,17 @@ public class BackupService : IBackupService
};
}
public IEnumerable<string> GetLogFiles(int maxRollingFiles, string logFileName)
/// <summary>
/// Returns a list of all log files for Kavita
/// </summary>
/// <param name="rollFiles">If file rolling is enabled. Defaults to True.</param>
/// <returns></returns>
public IEnumerable<string> GetLogFiles(bool rollFiles = LogLevelOptions.LogRollingEnabled)
{
var multipleFileRegex = maxRollingFiles > 0 ? @"\d*" : string.Empty;
var fi = _directoryService.FileSystem.FileInfo.FromFileName(logFileName);
var multipleFileRegex = rollFiles ? @"\d*" : string.Empty;
var fi = _directoryService.FileSystem.FileInfo.FromFileName(LogLevelOptions.LogFile);
var files = maxRollingFiles > 0
var files = rollFiles
? _directoryService.GetFiles(_directoryService.LogDirectory,
$@"{_directoryService.FileSystem.Path.GetFileNameWithoutExtension(fi.Name)}{multipleFileRegex}\.log")
: new[] {_directoryService.FileSystem.Path.Join(_directoryService.LogDirectory, "kavita.log")};
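
For illustration (assuming LogLevelOptions.LogFile resolves to kavita.log), the two branches behave as follows:

// rollFiles == true  -> pattern kavita\d*\.log matches kavita.log, kavita1.log, kavita2.log, ...
// rollFiles == false -> only <LogDirectory>/kavita.log is returned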
@@ -137,9 +140,7 @@ public class BackupService : IBackupService
private void CopyLogsToBackupDirectory(string tempDirectory)
{
var maxRollingFiles = _config.GetMaxRollingFiles();
var loggingSection = _config.GetLoggingFileName();
var files = GetLogFiles(maxRollingFiles, loggingSection);
var files = GetLogFiles();
_directoryService.CopyFilesToDirectory(files, _directoryService.FileSystem.Path.Join(tempDirectory, "logs"));
}


@@ -1,199 +1,273 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Threading.Tasks;
using API.Data;
using API.Data.Repositories;
using API.DTOs.Filtering;
using API.Entities;
using API.Entities.Enums;
using API.Helpers;
using API.SignalR;
using Hangfire;
using Microsoft.AspNetCore.SignalR;
using Microsoft.Extensions.Logging;
namespace API.Services.Tasks
namespace API.Services.Tasks;
public interface ICleanupService
{
public interface ICleanupService
{
Task Cleanup();
Task CleanupDbEntries();
void CleanupCacheDirectory();
Task DeleteSeriesCoverImages();
Task DeleteChapterCoverImages();
Task DeleteTagCoverImages();
Task CleanupBackups();
void CleanupTemp();
}
Task Cleanup();
Task CleanupDbEntries();
void CleanupCacheAndTempDirectories();
Task DeleteSeriesCoverImages();
Task DeleteChapterCoverImages();
Task DeleteTagCoverImages();
Task CleanupBackups();
Task CleanupLogs();
void CleanupTemp();
/// <summary>
/// Cleans up after operations on a recurring basis
/// Responsible for removing Series from Want To Read when users have fully read the series and the series has a Publication Status of Completed or Cancelled.
/// </summary>
public class CleanupService : ICleanupService
/// <returns></returns>
Task CleanupWantToRead();
}
/// <summary>
/// Cleans up after operations on a recurring basis
/// </summary>
public class CleanupService : ICleanupService
{
private readonly ILogger<CleanupService> _logger;
private readonly IUnitOfWork _unitOfWork;
private readonly IEventHub _eventHub;
private readonly IDirectoryService _directoryService;
public CleanupService(ILogger<CleanupService> logger,
IUnitOfWork unitOfWork, IEventHub eventHub,
IDirectoryService directoryService)
{
private readonly ILogger<CleanupService> _logger;
private readonly IUnitOfWork _unitOfWork;
private readonly IEventHub _eventHub;
private readonly IDirectoryService _directoryService;
_logger = logger;
_unitOfWork = unitOfWork;
_eventHub = eventHub;
_directoryService = directoryService;
}
public CleanupService(ILogger<CleanupService> logger,
IUnitOfWork unitOfWork, IEventHub eventHub,
IDirectoryService directoryService)
/// <summary>
/// Cleans up Temp, cache, deleted cover images, and old database backups
/// </summary>
[AutomaticRetry(Attempts = 3, LogEvents = false, OnAttemptsExceeded = AttemptsExceededAction.Fail)]
public async Task Cleanup()
{
_logger.LogInformation("Starting Cleanup");
await SendProgress(0F, "Starting cleanup");
_logger.LogInformation("Cleaning temp directory");
_directoryService.ClearDirectory(_directoryService.TempDirectory);
await SendProgress(0.1F, "Cleaning temp directory");
CleanupCacheAndTempDirectories();
await SendProgress(0.25F, "Cleaning old database backups");
_logger.LogInformation("Cleaning old database backups");
await CleanupBackups();
await SendProgress(0.50F, "Cleaning deleted cover images");
_logger.LogInformation("Cleaning deleted cover images");
await DeleteSeriesCoverImages();
await SendProgress(0.6F, "Cleaning deleted cover images");
await DeleteChapterCoverImages();
await SendProgress(0.7F, "Cleaning deleted cover images");
await DeleteTagCoverImages();
await DeleteReadingListCoverImages();
await SendProgress(0.8F, "Cleaning old logs");
await CleanupLogs();
await SendProgress(1F, "Cleanup finished");
_logger.LogInformation("Cleanup finished");
}
/// <summary>
/// Cleans up abandoned rows in the DB
/// </summary>
public async Task CleanupDbEntries()
{
await _unitOfWork.AppUserProgressRepository.CleanupAbandonedChapters();
await _unitOfWork.PersonRepository.RemoveAllPeopleNoLongerAssociated();
await _unitOfWork.GenreRepository.RemoveAllGenreNoLongerAssociated();
await _unitOfWork.CollectionTagRepository.RemoveTagsWithoutSeries();
}
private async Task SendProgress(float progress, string subtitle)
{
await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress,
MessageFactory.CleanupProgressEvent(progress, subtitle));
}
/// <summary>
/// Removes all series images that are not in the database. They must follow <see cref="ImageService.SeriesCoverImageRegex"/> filename pattern.
/// </summary>
public async Task DeleteSeriesCoverImages()
{
var images = await _unitOfWork.SeriesRepository.GetAllCoverImagesAsync();
var files = _directoryService.GetFiles(_directoryService.CoverImageDirectory, ImageService.SeriesCoverImageRegex);
_directoryService.DeleteFiles(files.Where(file => !images.Contains(_directoryService.FileSystem.Path.GetFileName(file))));
}
/// <summary>
/// Removes all chapter/volume images that are not in the database. They must follow <see cref="ImageService.ChapterCoverImageRegex"/> filename pattern.
/// </summary>
public async Task DeleteChapterCoverImages()
{
var images = await _unitOfWork.ChapterRepository.GetAllCoverImagesAsync();
var files = _directoryService.GetFiles(_directoryService.CoverImageDirectory, ImageService.ChapterCoverImageRegex);
_directoryService.DeleteFiles(files.Where(file => !images.Contains(_directoryService.FileSystem.Path.GetFileName(file))));
}
/// <summary>
/// Removes all collection tag images that are not in the database. They must follow <see cref="ImageService.CollectionTagCoverImageRegex"/> filename pattern.
/// </summary>
public async Task DeleteTagCoverImages()
{
var images = await _unitOfWork.CollectionTagRepository.GetAllCoverImagesAsync();
var files = _directoryService.GetFiles(_directoryService.CoverImageDirectory, ImageService.CollectionTagCoverImageRegex);
_directoryService.DeleteFiles(files.Where(file => !images.Contains(_directoryService.FileSystem.Path.GetFileName(file))));
}
/// <summary>
/// Removes all reading list images that are not in the database. They must follow <see cref="ImageService.ReadingListCoverImageRegex"/> filename pattern.
/// </summary>
public async Task DeleteReadingListCoverImages()
{
var images = await _unitOfWork.ReadingListRepository.GetAllCoverImagesAsync();
var files = _directoryService.GetFiles(_directoryService.CoverImageDirectory, ImageService.ReadingListCoverImageRegex);
_directoryService.DeleteFiles(files.Where(file => !images.Contains(_directoryService.FileSystem.Path.GetFileName(file))));
}
/// <summary>
/// Removes all files and directories in the cache and temp directory
/// </summary>
public void CleanupCacheAndTempDirectories()
{
_logger.LogInformation("Performing cleanup of Cache & Temp directories");
_directoryService.ExistOrCreate(_directoryService.CacheDirectory);
_directoryService.ExistOrCreate(_directoryService.TempDirectory);
try
{
_directoryService.ClearDirectory(_directoryService.CacheDirectory);
_directoryService.ClearDirectory(_directoryService.TempDirectory);
}
catch (Exception ex)
{
_logger.LogError(ex, "There was an issue deleting one or more folders/files during cleanup");
}
_logger.LogInformation("Cache and temp directory purged");
}
/// <summary>
/// Removes database backups older than the configured number of backup days. If all backups are expired, only the latest is kept.
/// </summary>
public async Task CleanupBackups()
{
var dayThreshold = (await _unitOfWork.SettingsRepository.GetSettingsDtoAsync()).TotalBackups;
_logger.LogInformation("Beginning cleanup of Database backups at {Time}", DateTime.Now);
var backupDirectory =
(await _unitOfWork.SettingsRepository.GetSettingAsync(ServerSettingKey.BackupDirectory)).Value;
if (!_directoryService.Exists(backupDirectory)) return;
var deltaTime = DateTime.Today.Subtract(TimeSpan.FromDays(dayThreshold));
var allBackups = _directoryService.GetFiles(backupDirectory).ToList();
var expiredBackups = allBackups.Select(filename => _directoryService.FileSystem.FileInfo.FromFileName(filename))
.Where(f => f.CreationTime < deltaTime)
.ToList();
if (expiredBackups.Count == allBackups.Count)
{
_logger.LogInformation("All expired backups are older than {Threshold} days. Removing all but last backup", dayThreshold);
var toDelete = expiredBackups.OrderByDescending(f => f.CreationTime).ToList();
_directoryService.DeleteFiles(toDelete.Take(toDelete.Count - 1).Select(f => f.FullName));
}
else
{
_directoryService.DeleteFiles(expiredBackups.Select(f => f.FullName));
}
_logger.LogInformation("Finished cleanup of Database backups at {Time}", DateTime.Now);
}
public async Task CleanupLogs()
{
_logger.LogInformation("Performing cleanup of logs directory");
var dayThreshold = (await _unitOfWork.SettingsRepository.GetSettingsDtoAsync()).TotalLogs;
var deltaTime = DateTime.Today.Subtract(TimeSpan.FromDays(dayThreshold));
var allLogs = _directoryService.GetFiles(_directoryService.LogDirectory).ToList();
var expiredLogs = allLogs.Select(filename => _directoryService.FileSystem.FileInfo.FromFileName(filename))
.Where(f => f.CreationTime < deltaTime)
.ToList();
if (expiredLogs.Count == allLogs.Count)
{
_logger.LogInformation("All expired backups are older than {Threshold} days. Removing all but last backup", dayThreshold);
var toDelete = expiredLogs.OrderBy(f => f.CreationTime).ToList();
_directoryService.DeleteFiles(toDelete.Take(toDelete.Count - 1).Select(f => f.FullName));
}
else
{
_directoryService.DeleteFiles(expiredLogs.Select(f => f.FullName));
}
_logger.LogInformation("Finished cleanup of logs at {Time}", DateTime.Now);
}
public void CleanupTemp()
{
_logger.LogInformation("Performing cleanup of Temp directory");
_directoryService.ExistOrCreate(_directoryService.TempDirectory);
try
{
_directoryService.ClearDirectory(_directoryService.TempDirectory);
}
catch (Exception ex)
{
_logger.LogError(ex, "There was an issue deleting one or more folders/files during cleanup");
}
_logger.LogInformation("Temp directory purged");
}
public async Task CleanupWantToRead()
{
_logger.LogInformation("Performing cleanup of Series that are Completed and have been fully read that are in Want To Read list");
var libraryIds = (await _unitOfWork.LibraryRepository.GetLibrariesAsync()).Select(l => l.Id).ToList();
var filter = new FilterDto()
{
PublicationStatus = new List<PublicationStatus>()
{
PublicationStatus.Completed,
PublicationStatus.Cancelled
},
Libraries = libraryIds,
ReadStatus = new ReadStatus()
{
Read = true,
InProgress = false,
NotRead = false
}
};
foreach (var user in await _unitOfWork.UserRepository.GetAllUsersAsync(AppUserIncludes.WantToRead))
{
var series = await _unitOfWork.SeriesRepository.GetSeriesDtoForLibraryIdAsync(0, user.Id, new UserParams(), filter);
var seriesIds = series.Select(s => s.Id).ToList();
if (seriesIds.Count == 0) continue;
user.WantToRead ??= new List<Series>();
user.WantToRead = user.WantToRead.Where(s => !seriesIds.Contains(s.Id)).ToList();
_unitOfWork.UserRepository.Update(user);
}
if (_unitOfWork.HasChanges())
{
await _unitOfWork.CommitAsync();
}
_logger.LogInformation("Performing cleanup of Series that are Completed and have been fully read that are in Want To Read list, completed");
}
}
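// A minimal sketch of how a service like this is typically wired into Hangfire; the
// job ids, cadence, and the ICleanupService interface shape are assumptions for
// illustration, not taken from Kavita's actual TaskScheduler.
public static class CleanupJobRegistration
{
    public static void Register(ICleanupService cleanupService)
    {
        // [AutomaticRetry(Attempts = 3)] on Cleanup() gives each scheduled run three attempts.
        RecurringJob.AddOrUpdate("cleanup", () => cleanupService.Cleanup(), Cron.Daily());
        RecurringJob.AddOrUpdate("cleanup-db", () => cleanupService.CleanupDbEntries(), Cron.Daily());
    }
}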


@@ -11,52 +1,6 @@ using Microsoft.Extensions.Logging;
namespace API.Services.Tasks.Scanner;
/// <summary>
/// Change information
/// </summary>
public class Change
{
/// <summary>
/// Gets or sets the type of the change.
/// </summary>
/// <value>
/// The type of the change.
/// </value>
public WatcherChangeTypes ChangeType { get; set; }
/// <summary>
/// Gets or sets the full path.
/// </summary>
/// <value>
/// The full path.
/// </value>
public string FullPath { get; set; }
/// <summary>
/// Gets or sets the name.
/// </summary>
/// <value>
/// The name.
/// </value>
public string Name { get; set; }
/// <summary>
/// Gets or sets the old full path.
/// </summary>
/// <value>
/// The old full path.
/// </value>
public string OldFullPath { get; set; }
/// <summary>
/// Gets or sets the old name.
/// </summary>
/// <value>
/// The old name.
/// </value>
public string OldName { get; set; }
}
public interface ILibraryWatcher
{
/// <summary>
@@ -84,24 +38,35 @@ public class LibraryWatcher : ILibraryWatcher
private readonly IDirectoryService _directoryService;
private readonly IUnitOfWork _unitOfWork;
private readonly ILogger<LibraryWatcher> _logger;
private readonly ITaskScheduler _taskScheduler;
private static readonly Dictionary<string, IList<FileSystemWatcher>> WatcherDictionary = new ();
/// <summary>
/// This is just here to prevent GC from Disposing our watchers
/// </summary>
private static readonly IList<FileSystemWatcher> FileWatchers = new List<FileSystemWatcher>();
/// <summary>
/// The amount of time until the Schedule ScanFolder task should be executed
/// </summary>
/// <remarks>The Job will be enqueued instantly</remarks>
private readonly TimeSpan _queueWaitTime;
/// <summary>
/// Counts within a time frame how many times the buffer became full. Is used to reschedule LibraryWatcher to start monitoring much later rather than instantly
/// </summary>
private int _bufferFullCounter;
/// <summary>
/// Used to lock buffer Full Counter
/// </summary>
private static readonly object Lock = new ();
public LibraryWatcher(IDirectoryService directoryService, IUnitOfWork unitOfWork,
ILogger<LibraryWatcher> logger, IHostEnvironment environment, ITaskScheduler taskScheduler)
{
_directoryService = directoryService;
_unitOfWork = unitOfWork;
_logger = logger;
_taskScheduler = taskScheduler;
_queueWaitTime = environment.IsDevelopment() ? TimeSpan.FromSeconds(30) : TimeSpan.FromMinutes(5);
@@ -109,69 +74,75 @@ public class LibraryWatcher : ILibraryWatcher
public async Task StartWatching()
{
_logger.LogInformation("Starting file watchers");
_logger.LogInformation("[LibraryWatcher] Starting file watchers");
_libraryFolders = (await _unitOfWork.LibraryRepository.GetLibraryDtosAsync())
var libraryFolders = (await _unitOfWork.LibraryRepository.GetLibraryDtosAsync())
.SelectMany(l => l.Folders)
.Distinct()
.Select(Parser.Parser.NormalizePath)
.Where(_directoryService.Exists)
.ToList();
foreach (var libraryFolder in libraryFolders)
{
_logger.LogDebug("[LibraryWatcher] Watching {FolderPath}", libraryFolder);
var watcher = new FileSystemWatcher(libraryFolder);
watcher.Changed += OnChanged;
watcher.Created += OnCreated;
watcher.Deleted += OnDeleted;
watcher.Error += OnError;
watcher.Disposed += (_, _) =>
_logger.LogError("[LibraryWatcher] watcher was disposed when it shouldn't have been. Please report this to Kavita dev");
watcher.Filter = "*.*";
watcher.IncludeSubdirectories = true;
watcher.EnableRaisingEvents = true;
FileWatchers.Add(watcher);
if (!WatcherDictionary.ContainsKey(libraryFolder))
{
WatcherDictionary.Add(libraryFolder, new List<FileSystemWatcher>());
}
WatcherDictionary[libraryFolder].Add(watcher);
}
_logger.LogInformation("[LibraryWatcher] Watching {Count} folders", FileWatchers.Count);
}
public void StopWatching()
{
_logger.LogInformation("Stopping watching folders");
foreach (var fileSystemWatcher in _watcherDictionary.Values.SelectMany(watcher => watcher))
_logger.LogInformation("[LibraryWatcher] Stopping watching folders");
foreach (var fileSystemWatcher in WatcherDictionary.Values.SelectMany(watcher => watcher))
{
fileSystemWatcher.EnableRaisingEvents = false;
fileSystemWatcher.Changed -= OnChanged;
fileSystemWatcher.Created -= OnCreated;
fileSystemWatcher.Deleted -= OnDeleted;
fileSystemWatcher.Error -= OnError;
}
FileWatchers.Clear();
WatcherDictionary.Clear();
}
public async Task RestartWatching()
{
_logger.LogDebug("[LibraryWatcher] Restarting watcher");
StopWatching();
await StartWatching();
}
private void OnChanged(object sender, FileSystemEventArgs e)
{
_logger.LogDebug("[LibraryWatcher] Changed: {FullPath}, {Name}, {ChangeType}", e.FullPath, e.Name, e.ChangeType);
if (e.ChangeType != WatcherChangeTypes.Changed) return;
_logger.LogDebug("[LibraryWatcher] Changed: {FullPath}, {Name}", e.FullPath, e.Name);
ProcessChange(e.FullPath, string.IsNullOrEmpty(_directoryService.FileSystem.Path.GetExtension(e.Name)));
BackgroundJob.Enqueue(() => ProcessChange(e.FullPath, string.IsNullOrEmpty(_directoryService.FileSystem.Path.GetExtension(e.Name))));
}
private void OnCreated(object sender, FileSystemEventArgs e)
{
_logger.LogDebug("[LibraryWatcher] Created: {FullPath}, {Name}", e.FullPath, e.Name);
BackgroundJob.Enqueue(() => ProcessChange(e.FullPath, !_directoryService.FileSystem.File.Exists(e.Name)));
}
/// <summary>
@@ -183,14 +154,34 @@ public class LibraryWatcher : ILibraryWatcher
var isDirectory = string.IsNullOrEmpty(_directoryService.FileSystem.Path.GetExtension(e.Name));
if (!isDirectory) return;
_logger.LogDebug("[LibraryWatcher] Deleted: {FullPath}, {Name}", e.FullPath, e.Name);
BackgroundJob.Enqueue(() => ProcessChange(e.FullPath, true));
}
/// <summary>
/// On error, we count the number of errors that have occurred. If more than 2 errors occur within the last 10 minutes, we suspend listening for an hour
/// </summary>
/// <remarks>This will schedule jobs to decrement the buffer full counter</remarks>
/// <param name="sender"></param>
/// <param name="e"></param>
private void OnError(object sender, ErrorEventArgs e)
{
_logger.LogError(e.GetException(), "[LibraryWatcher] An error occurred, likely too many changes occurred at once or the folder being watched was deleted. Restarting Watchers");
bool condition;
lock (Lock)
{
_bufferFullCounter += 1;
condition = _bufferFullCounter >= 3;
}
if (condition)
{
_logger.LogInformation("[LibraryWatcher] Internal buffer has been overflown multiple times in past 10 minutes. Suspending file watching for an hour");
StopWatching();
BackgroundJob.Schedule(() => RestartWatching(), TimeSpan.FromHours(1));
return;
}
Task.Run(RestartWatching);
BackgroundJob.Schedule(() => UpdateLastBufferOverflow(), TimeSpan.FromMinutes(10));
}
@@ -198,53 +189,79 @@ public class LibraryWatcher : ILibraryWatcher
/// Processes the file or folder change. If the change is a file change and not from a supported extension, it will be ignored.
/// </summary>
/// <remarks>This will ignore image files that are added to the system. However, they may still trigger scans due to folder changes.</remarks>
/// <remarks>This is public only because Hangfire will invoke it. Do not call it from outside this class.</remarks>
/// <param name="filePath">File or folder that changed</param>
/// <param name="isDirectoryChange">If the change is on a directory and not a file</param>
// ReSharper disable once MemberCanBePrivate.Global
public async Task ProcessChange(string filePath, bool isDirectoryChange = false)
{
var sw = Stopwatch.StartNew();
_logger.LogDebug("[LibraryWatcher] Processing change of {FilePath}", filePath);
try
{
// If not a directory change AND file is not an archive or book, ignore
if (!isDirectoryChange &&
!(Parser.Parser.IsArchive(filePath) || Parser.Parser.IsBook(filePath)))
{
_logger.LogDebug("[LibraryWatcher] Change from {FilePath} is not an archive or book, ignoring change", filePath);
return;
}
var libraryFolders = (await _unitOfWork.LibraryRepository.GetLibraryDtosAsync())
.SelectMany(l => l.Folders)
.Distinct()
.Select(Parser.Parser.NormalizePath)
.Where(_directoryService.Exists)
.ToList();
var fullPath = GetFolder(filePath, libraryFolders);
_logger.LogDebug("Folder path: {FolderPath}", fullPath);
if (string.IsNullOrEmpty(fullPath))
{
_logger.LogDebug("[LibraryWatcher] Change from {FilePath} could not find root level folder, ignoring change", filePath);
return;
}
_taskScheduler.ScanFolder(fullPath, _queueWaitTime);
}
catch (Exception ex)
{
_logger.LogError(ex, "[LibraryWatcher] An error occurred when processing a watch event");
}
_logger.LogDebug("ProcessChange occured in {ElapsedMilliseconds}ms", sw.ElapsedMilliseconds);
_logger.LogDebug("[LibraryWatcher] ProcessChange completed in {ElapsedMilliseconds}ms", sw.ElapsedMilliseconds);
}
private string GetFolder(string filePath, IEnumerable<string> libraryFolders)
{
var parentDirectory = _directoryService.GetParentDirectoryName(filePath);
_logger.LogDebug("[LibraryWatcher] Parent Directory: {ParentDirectory}", parentDirectory);
if (string.IsNullOrEmpty(parentDirectory)) return string.Empty;
// We need to find the library this creation belongs to
// Multiple libraries can point to the same base folder. In this case, we need use FirstOrDefault
var libraryFolder = libraryFolders.FirstOrDefault(f => parentDirectory.Contains(f));
_logger.LogDebug("[LibraryWatcher] Library Folder: {LibraryFolder}", libraryFolder);
if (string.IsNullOrEmpty(libraryFolder)) return string.Empty;
var rootFolder = _directoryService.GetFoldersTillRoot(libraryFolder, filePath).ToList();
_logger.LogDebug("[LibraryWatcher] Root Folders: {RootFolders}", rootFolder);
if (!rootFolder.Any()) return string.Empty;
// Select the folder directly under the library root and join it with the library folder; this should give us the folder to scan.
return Parser.Parser.NormalizePath(_directoryService.FileSystem.Path.Join(libraryFolder, rootFolder.Last()));
}
/// <summary>
/// This is called via Hangfire to decrement the counter. Must work around a lock
/// </summary>
// ReSharper disable once MemberCanBePrivate.Global
public void UpdateLastBufferOverflow()
{
lock (Lock)
{
if (_bufferFullCounter == 0) return;
_bufferFullCounter -= 1;
}
}
}
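// The OnError flow above is a reusable backoff pattern for FileSystemWatcher's
// internal buffer overflowing. A condensed, framework-free sketch of the same idea
// (illustrative names; Kavita schedules Hangfire jobs instead of using Task.Delay):
public class WatcherBackoff
{
    private readonly object _lock = new ();
    private int _bufferFullCounter;

    // Returns true when watching should be suspended for a longer period.
    public bool RecordOverflow()
    {
        bool suspend;
        lock (_lock)
        {
            _bufferFullCounter += 1;
            suspend = _bufferFullCounter >= 3;
        }
        // Forget this overflow after 10 minutes, mirroring UpdateLastBufferOverflow above.
        _ = System.Threading.Tasks.Task.Run(async () =>
        {
            await System.Threading.Tasks.Task.Delay(System.TimeSpan.FromMinutes(10));
            lock (_lock) { if (_bufferFullCounter > 0) _bufferFullCounter -= 1; }
        });
        return suspend;
    }
}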


@@ -1,361 +1,417 @@
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Threading.Tasks;
using API.Entities.Enums;
using API.Extensions;
using API.Parser;
using API.SignalR;
using Kavita.Common.Helpers;
using Microsoft.Extensions.Logging;
namespace API.Services.Tasks.Scanner;
public class ParsedSeries
{
/// <summary>
/// Name of the Series
/// </summary>
public string Name { get; init; }
/// <summary>
/// Normalized Name of the Series
/// </summary>
public string NormalizedName { get; init; }
/// <summary>
/// Format of the Series
/// </summary>
public MangaFormat Format { get; init; }
}
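// Illustrative only (not in the Kavita source): Format is part of a series' identity,
// so the same name in two formats produces two distinct keys in the scan map.
// var epub    = new ParsedSeries { Name = "Accel World", NormalizedName = "accelworld", Format = MangaFormat.Epub };
// var archive = new ParsedSeries { Name = "Accel World", NormalizedName = "accelworld", Format = MangaFormat.Archive };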
public enum Modified
{
Modified = 1,
NotModified = 2
}
public class SeriesModified
{
public string FolderPath { get; set; }
public string SeriesName { get; set; }
public DateTime LastScanned { get; set; }
public MangaFormat Format { get; set; }
public IEnumerable<string> LibraryRoots { get; set; }
}
public class ParseScannedFiles
{
private readonly ILogger _logger;
private readonly IDirectoryService _directoryService;
private readonly IReadingItemService _readingItemService;
private readonly IEventHub _eventHub;
/// <summary>
/// An instance of a pipeline for processing files and returning a Map of Series -> ParserInfos.
/// Each instance is separate from other threads, allowing for no cross over.
/// </summary>
/// <param name="logger">Logger of the parent class that invokes this</param>
/// <param name="directoryService">Directory Service</param>
/// <param name="readingItemService">ReadingItemService Service for extracting information on a number of formats</param>
/// <param name="eventHub">For firing off SignalR events</param>
public ParseScannedFiles(ILogger logger, IDirectoryService directoryService,
IReadingItemService readingItemService, IEventHub eventHub)
{
_logger = logger;
_directoryService = directoryService;
_readingItemService = readingItemService;
_eventHub = eventHub;
}
/// <summary>
/// This will Scan all files in a folder path. For each folder within the folderPath, FolderAction will be invoked for all files contained
/// </summary>
/// <param name="scanDirectoryByDirectory">Scan directory by directory and for each, call folderAction</param>
/// <param name="seriesPaths">A dictionary mapping a normalized path to a list of <see cref="SeriesModified"/> to help scanner skip I/O</param>
/// <param name="folderPath">A library folder or series folder</param>
/// <param name="folderAction">A callback async Task to be called once all files for each folder path are found</param>
/// <param name="forceCheck">If we should bypass any folder last write time checks on the scan and force I/O</param>
public async Task ProcessFiles(string folderPath, bool scanDirectoryByDirectory,
IDictionary<string, IList<SeriesModified>> seriesPaths, Func<IList<string>, string,Task> folderAction, bool forceCheck = false)
{
string normalizedPath;
if (scanDirectoryByDirectory)
{
// This is used in library scan, so we should check first for an ignore file and use that here as well
var potentialIgnoreFile = _directoryService.FileSystem.Path.Join(folderPath, DirectoryService.KavitaIgnoreFile);
var matcher = _directoryService.CreateMatcherFromFile(potentialIgnoreFile);
var directories = _directoryService.GetDirectories(folderPath, matcher).ToList();
foreach (var directory in directories)
{
normalizedPath = Parser.Parser.NormalizePath(directory);
if (HasSeriesFolderNotChangedSinceLastScan(seriesPaths, normalizedPath, forceCheck))
{
await folderAction(new List<string>(), directory);
}
else
{
// For a scan, this is doing everything in the directory loop before the folder Action is called...which leads to no progress indication
await folderAction(_directoryService.ScanFiles(directory, matcher), directory);
}
}
return;
}
normalizedPath = Parser.Parser.NormalizePath(folderPath);
if (HasSeriesFolderNotChangedSinceLastScan(seriesPaths, normalizedPath, forceCheck))
{
await folderAction(new List<string>(), folderPath);
return;
}
// We need to calculate all folders till library root and see if any kavitaignores
var seriesMatcher = BuildIgnoreFromLibraryRoot(folderPath, seriesPaths);
await folderAction(_directoryService.ScanFiles(folderPath, seriesMatcher), folderPath);
}
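// Call shape for reference (illustrative; the real call sites are in ScannerService,
// which passes the ProcessFolder local function from ScanLibrariesForSeries below):
// await ProcessFiles("/library/Manga", scanDirectoryByDirectory: true, seriesPaths,
//     async (files, folder) => { /* parse the files found for this folder */ await Task.CompletedTask; },
//     forceCheck: false);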
/// <summary>
/// Used in ScanSeries, which enters at a lower level folder and hence needs a .kavitaignore from higher (up to root) to be built before
/// the scan takes place.
/// </summary>
/// <param name="folderPath"></param>
/// <param name="seriesPaths"></param>
/// <returns>A GlobMatcher. Empty if not applicable</returns>
private GlobMatcher BuildIgnoreFromLibraryRoot(string folderPath, IDictionary<string, IList<SeriesModified>> seriesPaths)
{
var seriesMatcher = new GlobMatcher();
try
{
var roots = seriesPaths[folderPath][0].LibraryRoots.Select(Parser.Parser.NormalizePath).ToList();
var libraryFolder = roots.SingleOrDefault(folderPath.Contains);
if (string.IsNullOrEmpty(libraryFolder) || !Directory.Exists(folderPath))
{
return seriesMatcher;
}
var allParents = _directoryService.GetFoldersTillRoot(libraryFolder, folderPath);
var path = libraryFolder;
// Apply the library root level kavitaignore
var potentialIgnoreFile = _directoryService.FileSystem.Path.Join(path, DirectoryService.KavitaIgnoreFile);
seriesMatcher.Merge(_directoryService.CreateMatcherFromFile(potentialIgnoreFile));
// Then apply kavitaignores for each folder down to where the series folder is
foreach (var folderPart in allParents.Reverse())
{
path = Parser.Parser.NormalizePath(Path.Join(libraryFolder, folderPart));
potentialIgnoreFile = _directoryService.FileSystem.Path.Join(path, DirectoryService.KavitaIgnoreFile);
seriesMatcher.Merge(_directoryService.CreateMatcherFromFile(potentialIgnoreFile));
}
}
catch (Exception ex)
{
_logger.LogError(ex,
"[ScannerService] There was an error trying to find and apply .kavitaignores above the Series Folder. Scanning without them present");
}
return seriesMatcher;
}
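// For reference, a hypothetical .kavitaignore as merged above; each non-comment line
// is a glob rule fed to the GlobMatcher, and nested ignore files stack with the
// library-root one (exact rule syntax depends on the GlobMatcher library):
//   # ignore working folders anywhere under this folder
//   **/temp/
//   # ignore loose pdf files at this level
//   *.pdf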
/// <summary>
/// Attempts to either add a new instance of a series mapping to the _scannedSeries bag or add to an existing one.
/// This will check if the name matches an existing series name (multiple fields) <see cref="MergeName"/>
/// </summary>
/// <param name="scannedSeries">A localized list of a series' parsed infos</param>
/// <param name="info"></param>
private void TrackSeries(ConcurrentDictionary<ParsedSeries, List<ParserInfo>> scannedSeries, ParserInfo info)
{
if (info.Series == string.Empty) return;
// Check if normalized info.Series already exists and if so, update info to use that name instead
info.Series = MergeName(scannedSeries, info);
var normalizedSeries = Parser.Parser.Normalize(info.Series);
var normalizedSortSeries = Parser.Parser.Normalize(info.SeriesSort);
var normalizedLocalizedSeries = Parser.Parser.Normalize(info.LocalizedSeries);
try
{
var existingKey = scannedSeries.Keys.SingleOrDefault(ps =>
ps.Format == info.Format && (ps.NormalizedName.Equals(normalizedSeries)
|| ps.NormalizedName.Equals(normalizedLocalizedSeries)
|| ps.NormalizedName.Equals(normalizedSortSeries)));
existingKey ??= new ParsedSeries()
{
Format = info.Format,
Name = info.Series,
NormalizedName = normalizedSeries
};
scannedSeries.AddOrUpdate(existingKey, new List<ParserInfo>() {info}, (_, oldValue) =>
{
oldValue ??= new List<ParserInfo>();
if (!oldValue.Contains(info))
{
oldValue.Add(info);
}
return oldValue;
});
}
catch (Exception ex)
{
_logger.LogCritical(ex, "[ScannerService] {SeriesName} matches against multiple series in the parsed series. This indicates a critical kavita issue. Key will be skipped", info.Series);
foreach (var seriesKey in scannedSeries.Keys.Where(ps =>
ps.Format == info.Format && (ps.NormalizedName.Equals(normalizedSeries)
|| ps.NormalizedName.Equals(normalizedLocalizedSeries)
|| ps.NormalizedName.Equals(normalizedSortSeries))))
{
_logger.LogCritical("[ScannerService] Matches: {SeriesName} matches on {SeriesKey}", info.Series, seriesKey.Name);
}
}
}
/// <summary>
/// Using a normalized name from the passed ParserInfo, this checks against all found series so far and if an existing one exists with
/// same normalized name, it merges into the existing one. This is important as some manga may have a slight difference with punctuation or capitalization.
/// </summary>
/// <param name="scannedSeries"></param>
/// <param name="info"></param>
/// <returns>Series Name to group this info into</returns>
private string MergeName(ConcurrentDictionary<ParsedSeries, List<ParserInfo>> scannedSeries, ParserInfo info)
{
var normalizedSeries = Parser.Parser.Normalize(info.Series);
var normalizedLocalSeries = Parser.Parser.Normalize(info.LocalizedSeries);
try
{
var existingName =
scannedSeries.SingleOrDefault(p =>
(Parser.Parser.Normalize(p.Key.NormalizedName).Equals(normalizedSeries) ||
Parser.Parser.Normalize(p.Key.NormalizedName).Equals(normalizedLocalSeries)) &&
p.Key.Format == info.Format)
.Key;
if (existingName != null && !string.IsNullOrEmpty(existingName.Name))
{
return existingName.Name;
}
}
catch (Exception ex)
{
_logger.LogCritical(ex, "[ScannerService] Multiple series detected for {SeriesName} ({File})! This is critical to fix! There should only be 1", info.Series, info.FullFilePath);
var values = scannedSeries.Where(p =>
(Parser.Parser.Normalize(p.Key.NormalizedName) == normalizedSeries ||
Parser.Parser.Normalize(p.Key.NormalizedName) == normalizedLocalSeries) &&
p.Key.Format == info.Format);
foreach (var pair in values)
{
_logger.LogCritical("[ScannerService] Duplicate Series in DB matches with {SeriesName}: {DuplicateName}", info.Series, pair.Key.Name);
}
}
return info.Series;
}
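// Worked example of the merge (hypothetical values; assumes Normalize lowercases and
// strips punctuation so both titles reduce to "hacksign"):
// var first  = new ParserInfo { Series = "Hack//Sign", Format = MangaFormat.Archive };
// var second = new ParserInfo { Series = "hack sign",  Format = MangaFormat.Archive };
// TrackSeries(scannedSeries, first);   // creates the ParsedSeries key
// TrackSeries(scannedSeries, second);  // MergeName returns "Hack//Sign", so both infos stack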
/// <summary>
/// This will process series by folder groups. This is used solely by ScanSeries
/// </summary>
/// <param name="libraryType"></param>
/// <param name="folders"></param>
/// <param name="libraryName"></param>
/// <param name="isLibraryScan">If true, does a directory scan first (resulting in folders being tackled in parallel), else does an immediate scan files</param>
/// <param name="seriesPaths">A map of Series names -> existing folder paths to handle skipping folders</param>
/// <param name="processSeriesInfos">Action which returns if the folder was skipped and the infos from said folder</param>
/// <param name="forceCheck">Defaults to false</param>
/// <returns></returns>
public async Task ScanLibrariesForSeries(LibraryType libraryType,
IEnumerable<string> folders, string libraryName, bool isLibraryScan,
IDictionary<string, IList<SeriesModified>> seriesPaths, Func<Tuple<bool, IList<ParserInfo>>, Task> processSeriesInfos, bool forceCheck = false)
{
await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.FileScanProgressEvent("File Scan Starting", libraryName, ProgressEventType.Started));
async Task ProcessFolder(IList<string> files, string folder)
{
var normalizedFolder = Parser.Parser.NormalizePath(folder);
if (HasSeriesFolderNotChangedSinceLastScan(seriesPaths, normalizedFolder, forceCheck))
{
var parsedInfos = seriesPaths[normalizedFolder].Select(fp => new ParserInfo()
{
Series = fp.SeriesName,
Format = fp.Format,
}).ToList();
await processSeriesInfos.Invoke(new Tuple<bool, IList<ParserInfo>>(true, parsedInfos));
_logger.LogDebug("[ScannerService] Skipped File Scan for {Folder} as it hasn't changed since last scan", folder);
return;
}
_logger.LogDebug("[ScannerService] Found {Count} files for {Folder}", files.Count, folder);
await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress,
MessageFactory.FileScanProgressEvent(folder, libraryName, ProgressEventType.Updated));
if (files.Count == 0)
{
_logger.LogInformation("[ScannerService] {Folder} is empty or is no longer in this location", folder);
return;
}
var scannedSeries = new ConcurrentDictionary<ParsedSeries, List<ParserInfo>>();
var infos = files
.Select(file => _readingItemService.ParseFile(file, folder, libraryType))
.Where(info => info != null)
.ToList();
MergeLocalizedSeriesWithSeries(infos);
foreach (var info in infos)
{
try
{
TrackSeries(scannedSeries, info);
}
catch (Exception ex)
{
_logger.LogError(ex, "The directory '{FolderPath}' does not exist", folderPath);
_logger.LogError(ex,
"[ScannerService] There was an exception that occurred during tracking {FilePath}. Skipping this file",
info.FullFilePath);
}
}
foreach (var series in scannedSeries.Keys)
{
if (scannedSeries[series].Count > 0 && processSeriesInfos != null)
{
await processSeriesInfos.Invoke(new Tuple<bool, IList<ParserInfo>>(false, scannedSeries[series]));
}
}
}
foreach (var folderPath in folders)
{
try
{
await ProcessFiles(folderPath, isLibraryScan, seriesPaths, ProcessFolder, forceCheck);
}
catch (ArgumentException ex)
{
_logger.LogError(ex, "[ScannerService] The directory '{FolderPath}' does not exist", folderPath);
}
}
await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.FileScanProgressEvent("File Scan Done", libraryName, ProgressEventType.Ended));
}
/// <summary>
/// Checks, for each known folder path, whether the last scan time is >= the directory's last write time (compared at one-second precision)
/// </summary>
/// <param name="seriesPaths"></param>
/// <param name="normalizedFolder"></param>
/// <param name="forceCheck"></param>
/// <returns></returns>
private bool HasSeriesFolderNotChangedSinceLastScan(IDictionary<string, IList<SeriesModified>> seriesPaths, string normalizedFolder, bool forceCheck = false)
{
if (forceCheck) return false;
return seriesPaths.ContainsKey(normalizedFolder) && seriesPaths[normalizedFolder].All(f => f.LastScanned.Truncate(TimeSpan.TicksPerSecond) >=
_directoryService.GetLastWriteTime(normalizedFolder).Truncate(TimeSpan.TicksPerSecond));
}
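// Truncate above is Kavita's DateTime extension (API.Extensions). A minimal equivalent,
// shown because the one-second truncation is what keeps the comparison stable across
// file systems with different timestamp resolutions (the actual implementation may differ):
public static class DateTimeTruncateSketch
{
    public static System.DateTime Truncate(System.DateTime date, long resolutionTicks) =>
        resolutionTicks <= 0 ? date : date.AddTicks(-(date.Ticks % resolutionTicks));
}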
/// <summary>
/// Checks if there are any ParserInfos that have a Series that matches the LocalizedSeries field in any other info. If so,
/// rewrites the infos with series name instead of the localized name, so they stack.
/// </summary>
/// <example>
/// Accel World v01.cbz has Series "Accel World" and Localized Series "World of Acceleration"
/// World of Acceleration v02.cbz has Series "World of Acceleration"
/// After running this code, we'd have:
/// World of Acceleration v02.cbz having Series "Accel World" and Localized Series of "World of Acceleration"
/// </example>
/// <param name="infos">A collection of ParserInfos</param>
private void MergeLocalizedSeriesWithSeries(IReadOnlyCollection<ParserInfo> infos)
{
var hasLocalizedSeries = infos.Any(i => !string.IsNullOrEmpty(i.LocalizedSeries));
if (!hasLocalizedSeries) return;
var localizedSeries = infos
.Where(i => !i.IsSpecial)
.Select(i => i.LocalizedSeries)
.Distinct()
.FirstOrDefault(i => !string.IsNullOrEmpty(i));
if (string.IsNullOrEmpty(localizedSeries)) return;
// NOTE: If we have multiple series in a folder with a localized title, then this will fail. It will group into one series. User needs to fix this themselves.
string nonLocalizedSeries;
// Normalize this, as many of these cases are just capitalization differences
var nonLocalizedSeriesFound = infos
.Where(i => !i.IsSpecial)
.Select(i => i.Series).DistinctBy(Parser.Parser.Normalize).ToList();
if (nonLocalizedSeriesFound.Count == 1)
{
nonLocalizedSeries = nonLocalizedSeriesFound.First();
}
else
{
// There can be a case where there are multiple series in a folder that causes merging.
if (nonLocalizedSeriesFound.Count > 2)
{
_logger.LogError("[ScannerService] There are multiple series within one folder that contain localized series. This will cause them to group incorrectly. Please separate series into their own dedicated folder or ensure there is only 2 potential series (localized and series): {LocalizedSeries}", string.Join(", ", nonLocalizedSeriesFound));
}
nonLocalizedSeries = nonLocalizedSeriesFound.FirstOrDefault(s => !s.Equals(localizedSeries));
}
if (string.IsNullOrEmpty(nonLocalizedSeries)) return;
var normalizedNonLocalizedSeries = Parser.Parser.Normalize(nonLocalizedSeries);
foreach (var infoNeedingMapping in infos.Where(i =>
!Parser.Parser.Normalize(i.Series).Equals(normalizedNonLocalizedSeries)))
{
infoNeedingMapping.Series = nonLocalizedSeries;
infoNeedingMapping.LocalizedSeries = localizedSeries;
}
}
}


@@ -1,9 +1,9 @@
using System.IO;
using System.Linq;
using API.Entities.Enums;
using API.Parser;
namespace API.Services.Tasks.Scanner.Parser;
public interface IDefaultParser
{
@@ -36,81 +36,81 @@ public class DefaultParser : IDefaultParser
var fileName = _directoryService.FileSystem.Path.GetFileNameWithoutExtension(filePath);
ParserInfo ret;
if (Parser.IsEpub(filePath))
{
ret = new ParserInfo
{
Chapters = Parser.ParseChapter(fileName) ?? Parser.ParseComicChapter(fileName),
Series = Parser.ParseSeries(fileName) ?? Parser.ParseComicSeries(fileName),
Volumes = Parser.ParseVolume(fileName) ?? Parser.ParseComicVolume(fileName),
Filename = Path.GetFileName(filePath),
Format = Parser.ParseFormat(filePath),
FullFilePath = filePath
};
}
else
{
ret = new ParserInfo
{
Chapters = type == LibraryType.Comic ? Parser.ParseComicChapter(fileName) : Parser.ParseChapter(fileName),
Series = type == LibraryType.Comic ? Parser.ParseComicSeries(fileName) : Parser.ParseSeries(fileName),
Volumes = type == LibraryType.Comic ? Parser.ParseComicVolume(fileName) : Parser.ParseVolume(fileName),
Filename = Path.GetFileName(filePath),
Format = Parser.ParseFormat(filePath),
Title = Path.GetFileNameWithoutExtension(fileName),
FullFilePath = filePath
};
}
if (Parser.IsCoverImage(_directoryService.FileSystem.Path.GetFileName(filePath))) return null;
if (Parser.IsImage(filePath))
{
// Reset Chapters, Volumes, and Series as images are not good to parse information out of. Better to use folders.
ret.Volumes = Parser.DefaultVolume;
ret.Chapters = Parser.DefaultChapter;
ret.Series = string.Empty;
}
if (ret.Series == string.Empty || Services.Tasks.Scanner.Parser.Parser.IsImage(filePath))
if (ret.Series == string.Empty || Parser.IsImage(filePath))
{
// Try to parse information out of each folder all the way to rootPath
ParseFromFallbackFolders(filePath, rootPath, type, ref ret);
}
var edition = Services.Tasks.Scanner.Parser.Parser.ParseEdition(fileName);
var edition = Parser.ParseEdition(fileName);
if (!string.IsNullOrEmpty(edition))
{
ret.Series = Services.Tasks.Scanner.Parser.Parser.CleanTitle(ret.Series.Replace(edition, ""), type is LibraryType.Comic);
ret.Series = Parser.CleanTitle(ret.Series.Replace(edition, ""), type is LibraryType.Comic);
ret.Edition = edition;
}
var isSpecial = type == LibraryType.Comic ? Services.Tasks.Scanner.Parser.Parser.ParseComicSpecial(fileName) : Services.Tasks.Scanner.Parser.Parser.ParseMangaSpecial(fileName);
var isSpecial = type == LibraryType.Comic ? Parser.IsComicSpecial(fileName) : Parser.IsMangaSpecial(fileName);
// We must ensure that we can only parse a special out. As some files will have v20 c171-180+Omake and that
// could cause a problem as Omake is a special term, but there is valid volume/chapter information.
if (ret.Chapters == Services.Tasks.Scanner.Parser.Parser.DefaultChapter && ret.Volumes == Services.Tasks.Scanner.Parser.Parser.DefaultVolume && !string.IsNullOrEmpty(isSpecial))
if (ret.Chapters == Parser.DefaultChapter && ret.Volumes == Parser.DefaultVolume && isSpecial)
{
ret.IsSpecial = true;
ParseFromFallbackFolders(filePath, rootPath, type, ref ret); // NOTE: This can cause some complications, we should try to be a bit less aggressive to fallback to folder
}
// If we are a special with marker, we need to ensure we use the correct series name. we can do this by falling back to Folder name
if (Services.Tasks.Scanner.Parser.Parser.HasSpecialMarker(fileName))
if (Parser.HasSpecialMarker(fileName))
{
ret.IsSpecial = true;
ret.Chapters = Services.Tasks.Scanner.Parser.Parser.DefaultChapter;
ret.Volumes = Services.Tasks.Scanner.Parser.Parser.DefaultVolume;
ret.Chapters = Parser.DefaultChapter;
ret.Volumes = Parser.DefaultVolume;
ParseFromFallbackFolders(filePath, rootPath, type, ref ret);
}
if (string.IsNullOrEmpty(ret.Series))
{
ret.Series = Services.Tasks.Scanner.Parser.Parser.CleanTitle(fileName, type is LibraryType.Comic);
ret.Series = Parser.CleanTitle(fileName, type is LibraryType.Comic);
}
// Pdfs may have .pdf in the series name, remove that
if (Services.Tasks.Scanner.Parser.Parser.IsPdf(filePath) && ret.Series.ToLower().EndsWith(".pdf"))
if (Parser.IsPdf(filePath) && ret.Series.ToLower().EndsWith(".pdf"))
{
ret.Series = ret.Series.Substring(0, ret.Series.Length - ".pdf".Length);
}
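One behavioral change in this hunk worth calling out: ParseComicSpecial/ParseMangaSpecial were replaced by boolean IsComicSpecial/IsMangaSpecial checks, and a special keyword alone no longer marks a file; chapter and volume must both have parsed to their defaults. A minimal illustration (the keyword check is a simplified stand-in for the real regex, and "0" stands in for Parser.DefaultChapter/DefaultVolume):

using System;

static class SpecialGuardDemo
{
    // Simplified stand-in for the regex-based Parser.IsMangaSpecial check.
    static bool HasSpecialKeyword(string fileName) =>
        fileName.Contains("Omake", StringComparison.OrdinalIgnoreCase);

    // Mirrors the guard above: special only when chapter AND volume are defaults.
    static bool WouldFlagAsSpecial(string chapters, string volumes, string fileName) =>
        chapters == "0" && volumes == "0" && HasSpecialKeyword(fileName);

    static void Main()
    {
        // Volume and chapter parsed, so the keyword must not win: not special.
        Console.WriteLine(WouldFlagAsSpecial("171-180", "20", "Series v20 c171-180+Omake")); // False
        // Nothing parsed, so the keyword wins: treated as a special.
        Console.WriteLine(WouldFlagAsSpecial("0", "0", "Series Omake")); // True
    }
}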
@@ -127,35 +127,55 @@ public class DefaultParser : IDefaultParser
/// <param name="ret">Expects a non-null ParserInfo which this method will populate</param>
public void ParseFromFallbackFolders(string filePath, string rootPath, LibraryType type, ref ParserInfo ret)
{
var fallbackFolders = _directoryService.GetFoldersTillRoot(rootPath, filePath).ToList();
var fallbackFolders = _directoryService.GetFoldersTillRoot(rootPath, filePath)
.Where(f => !Parser.IsMangaSpecial(f))
.ToList();
if (fallbackFolders.Count == 0)
{
var rootFolderName = _directoryService.FileSystem.DirectoryInfo.FromDirectoryName(rootPath).Name;
var series = Parser.ParseSeries(rootFolderName);
if (string.IsNullOrEmpty(series))
{
ret.Series = Parser.CleanTitle(rootFolderName, type is LibraryType.Comic);
return;
}
if (!string.IsNullOrEmpty(series) && (string.IsNullOrEmpty(ret.Series) || !rootFolderName.Contains(ret.Series)))
{
ret.Series = series;
return;
}
}
for (var i = 0; i < fallbackFolders.Count; i++)
{
var folder = fallbackFolders[i];
if (!string.IsNullOrEmpty(Services.Tasks.Scanner.Parser.Parser.ParseMangaSpecial(folder))) continue;
var parsedVolume = type is LibraryType.Manga ? Services.Tasks.Scanner.Parser.Parser.ParseVolume(folder) : Services.Tasks.Scanner.Parser.Parser.ParseComicVolume(folder);
var parsedChapter = type is LibraryType.Manga ? Services.Tasks.Scanner.Parser.Parser.ParseChapter(folder) : Services.Tasks.Scanner.Parser.Parser.ParseComicChapter(folder);
var parsedVolume = type is LibraryType.Manga ? Parser.ParseVolume(folder) : Parser.ParseComicVolume(folder);
var parsedChapter = type is LibraryType.Manga ? Parser.ParseChapter(folder) : Parser.ParseComicChapter(folder);
if (!parsedVolume.Equals(Services.Tasks.Scanner.Parser.Parser.DefaultVolume) || !parsedChapter.Equals(Services.Tasks.Scanner.Parser.Parser.DefaultChapter))
if (!parsedVolume.Equals(Parser.DefaultVolume) || !parsedChapter.Equals(Parser.DefaultChapter))
{
if ((string.IsNullOrEmpty(ret.Volumes) || ret.Volumes.Equals(Services.Tasks.Scanner.Parser.Parser.DefaultVolume)) && !parsedVolume.Equals(Services.Tasks.Scanner.Parser.Parser.DefaultVolume))
{
ret.Volumes = parsedVolume;
}
if ((string.IsNullOrEmpty(ret.Chapters) || ret.Chapters.Equals(Services.Tasks.Scanner.Parser.Parser.DefaultChapter)) && !parsedChapter.Equals(Services.Tasks.Scanner.Parser.Parser.DefaultChapter))
{
ret.Chapters = parsedChapter;
}
if ((string.IsNullOrEmpty(ret.Volumes) || ret.Volumes.Equals(Parser.DefaultVolume)) && !parsedVolume.Equals(Parser.DefaultVolume))
{
ret.Volumes = parsedVolume;
}
if ((string.IsNullOrEmpty(ret.Chapters) || ret.Chapters.Equals(Parser.DefaultChapter)) && !parsedChapter.Equals(Parser.DefaultChapter))
{
ret.Chapters = parsedChapter;
}
}
// Generally users group in series folders. Let's try to parse series from the top folder
if (!folder.Equals(ret.Series) && i == fallbackFolders.Count - 1)
{
var series = Services.Tasks.Scanner.Parser.Parser.ParseSeries(folder);
var series = Parser.ParseSeries(folder);
if (string.IsNullOrEmpty(series))
{
ret.Series = Services.Tasks.Scanner.Parser.Parser.CleanTitle(folder, type is LibraryType.Comic);
ret.Series = Parser.CleanTitle(folder, type is LibraryType.Comic);
break;
}
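To make the fallback concrete: GetFoldersTillRoot yields the folder names between the file and the library root, deepest first, and the loop above treats the last (top-level) folder as the series-name candidate. A rough, self-contained approximation (the real code goes through IDirectoryService and the parser regexes):

using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;

static class FallbackFolderDemo
{
    // Simplified stand-in for DirectoryService.GetFoldersTillRoot: walk from
    // the file's folder up to (but excluding) the library root.
    static IEnumerable<string> GetFoldersTillRoot(string rootPath, string filePath)
    {
        var dir = Path.GetDirectoryName(filePath)?.Replace('\\', '/');
        while (!string.IsNullOrEmpty(dir) && !dir.Equals(rootPath, StringComparison.OrdinalIgnoreCase))
        {
            yield return Path.GetFileName(dir);
            dir = Path.GetDirectoryName(dir)?.Replace('\\', '/');
        }
    }

    static void Main()
    {
        var folders = GetFoldersTillRoot("/manga", "/manga/Beastars/Specials/SP01.cbz").ToList();
        // "Specials < Beastars" - the last entry, Beastars, is the top-level
        // folder the loop above falls back to for the series name.
        Console.WriteLine(string.Join(" < ", folders));
    }
}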

File diff suppressed because it is too large


@@ -2,100 +2,99 @@
using API.Entities.Enums;
using API.Services.Tasks.Scanner.Parser;
namespace API.Parser
namespace API.Parser;
/// <summary>
/// This represents all parsed information from a single file
/// </summary>
public class ParserInfo
{
/// <summary>
/// Represents the parsed chapters from a file. By default, will be 0 which means nothing could be parsed.
/// <remarks>The chapters can only be a single float or a range of float ie) 1-2. Mainly floats should be multiples of 0.5 representing specials</remarks>
/// </summary>
public string Chapters { get; set; } = "";
/// <summary>
/// Represents the parsed series from the file or folder
/// </summary>
public string Series { get; set; } = string.Empty;
/// <summary>
/// This can be filled in from ComicInfo.xml/Epub during scanning. Will update the SortName field on <see cref="Entities.Series"/>
/// </summary>
public string SeriesSort { get; set; } = string.Empty;
/// <summary>
/// This can be filled in from ComicInfo.xml/Epub during scanning. Will update the LocalizedName field on <see cref="Entities.Series"/>
/// </summary>
public string LocalizedSeries { get; set; } = string.Empty;
/// <summary>
/// Represents the parsed volumes from a file. By default, will be 0 which means that nothing could be parsed.
/// If Volumes is 0 and Chapters is 0, the file is a special. If Chapters is non-zero, then no volume could be parsed.
/// <example>Beastars Vol 3-4 will map to "3-4"</example>
/// <remarks>The volumes can only be a single int or a range of ints ie) 1-2. Float based volumes are not supported.</remarks>
/// </summary>
public string Volumes { get; set; } = "";
/// <summary>
/// Filename of the underlying file
/// <example>Beastars v01 (digital).cbz</example>
/// </summary>
public string Filename { get; init; } = "";
/// <summary>
/// Full filepath of the underlying file
/// <example>C:/Manga/Beastars v01 (digital).cbz</example>
/// </summary>
public string FullFilePath { get; set; } = "";
/// <summary>
/// <see cref="MangaFormat"/> that represents the type of the file
/// <remarks>Mainly used to show in the UI and so caching service knows how to cache for reading.</remarks>
/// </summary>
public MangaFormat Format { get; set; } = MangaFormat.Unknown;
/// <summary>
/// This can potentially store things like "Omnibus, Color, Full Contact Edition, Extra, Final, etc"
/// </summary>
/// <remarks>Not Used in Database</remarks>
public string Edition { get; set; } = "";
/// <summary>
/// If the file contains no volume/chapter information or contains Special Keywords <see cref="Parser.MangaSpecialRegex"/>
/// </summary>
public bool IsSpecial { get; set; }
/// <summary>
/// Used for specials or books, stores what the UI should show.
/// <remarks>Manga does not use this field</remarks>
/// </summary>
public string Title { get; set; } = string.Empty;
/// <summary>
/// If the ParserInfo has the IsSpecial tag or both volumes and chapters are default aka 0
/// </summary>
/// <returns></returns>
public bool IsSpecialInfo()
{
return (IsSpecial || (Volumes == "0" && Chapters == "0"));
}
/// <summary>
/// This will contain any EXTRA comicInfo information parsed from the epub or archive. If there is an archive with comicInfo.xml AND it contains
/// series, volume information, that will override what we parsed.
/// </summary>
public ComicInfo ComicInfo { get; set; }
/// <summary>
/// Merges non empty/null properties from info2 into this entity.
/// </summary>
/// <remarks>This does not merge ComicInfo as they should always be the same</remarks>
/// <param name="info2"></param>
public void Merge(ParserInfo info2)
{
if (info2 == null) return;
Chapters = string.IsNullOrEmpty(Chapters) || Chapters == "0" ? info2.Chapters: Chapters;
Volumes = string.IsNullOrEmpty(Volumes) || Volumes == "0" ? info2.Volumes : Volumes;
Edition = string.IsNullOrEmpty(Edition) ? info2.Edition : Edition;
Title = string.IsNullOrEmpty(Title) ? info2.Title : Title;
Series = string.IsNullOrEmpty(Series) ? info2.Series : Series;
IsSpecial = IsSpecial || info2.IsSpecial;
}
}
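A quick usage sketch for Merge and IsSpecialInfo (values illustrative; assumes the ParserInfo class above):

// Non-empty/non-default fields win; info2 only fills the gaps.
var fromFilename = new ParserInfo { Series = "Beastars", Volumes = "1", Chapters = "0" };
var fromFolder = new ParserInfo { Series = "Beastars", Volumes = "0", Chapters = "5" };

fromFilename.Merge(fromFolder);
// fromFilename.Volumes == "1"  (kept: already non-default)
// fromFilename.Chapters == "5" (filled in: was the "0" default)
// fromFilename.IsSpecialInfo() == false (chapter information exists)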


@@ -3,6 +3,7 @@ using System.Collections.Generic;
using System.Collections.Immutable;
using System.Diagnostics;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using API.Data;
using API.Data.Metadata;
@@ -14,6 +15,7 @@ using API.Parser;
using API.Services.Tasks.Metadata;
using API.SignalR;
using Hangfire;
using Kavita.Common;
using Microsoft.Extensions.Logging;
namespace API.Services.Tasks.Scanner;
@@ -48,8 +50,6 @@ public class ProcessSeries : IProcessSeries
private IList<Person> _people;
private IList<Tag> _tags;
public ProcessSeries(IUnitOfWork unitOfWork, ILogger<ProcessSeries> logger, IEventHub eventHub,
IDirectoryService directoryService, ICacheHelper cacheHelper, IReadingItemService readingItemService,
IFileService fileService, IMetadataService metadataService, IWordCountAnalyzerService wordCountAnalyzerService)
@@ -108,6 +108,7 @@ public class ProcessSeries : IProcessSeries
{
seriesAdded = true;
series = DbFactory.Series(firstInfo.Series, firstInfo.LocalizedSeries);
_unitOfWork.SeriesRepository.Add(series);
}
if (series.LibraryId == 0) series.LibraryId = library.Id;
@@ -116,7 +117,8 @@
{
_logger.LogInformation("[ScannerService] Processing series {SeriesName}", series.OriginalName);
var firstParsedInfo = parsedInfos[0];
// parsedInfos[0] is not the first volume or chapter. We need to find it using a ComicInfo check (as it uses firstParsedInfo for series sort)
var firstParsedInfo = parsedInfos.FirstOrDefault(p => p.ComicInfo != null, firstInfo);
UpdateVolumes(series, parsedInfos);
series.Pages = series.Volumes.Sum(v => v.Pages);
@@ -155,7 +157,6 @@
await UpdateSeriesFolderPath(parsedInfos, library, series);
series.LastFolderScanned = DateTime.Now;
_unitOfWork.SeriesRepository.Attach(series);
if (_unitOfWork.HasChanges())
{
@@ -166,7 +167,9 @@
catch (Exception ex)
{
await _unitOfWork.RollbackAsync();
_logger.LogCritical(ex, "[ScannerService] There was an issue writing to the for series {@SeriesName}", series);
_logger.LogCritical(ex,
"[ScannerService] There was an issue writing to the database for series {@SeriesName}",
series.Name);
await _eventHub.SendMessageAsync(MessageFactory.Error,
MessageFactory.ErrorEvent($"There was an issue writing to the DB for Series {series}",
@@ -210,13 +213,13 @@
if (!library.Folders.Select(f => f.Path).Contains(seriesDirs.Keys.First()))
{
series.FolderPath = Parser.Parser.NormalizePath(seriesDirs.Keys.First());
_logger.LogDebug("Updating {Series} FolderPath to {FolderPath}", series.Name, series.FolderPath);
}
}
}
public void EnqueuePostSeriesProcessTasks(int libraryId, int seriesId, bool forceUpdate = false)
{
//BackgroundJob.Enqueue(() => _metadataService.GenerateCoversForSeries(libraryId, seriesId, forceUpdate));
BackgroundJob.Enqueue(() => _wordCountAnalyzerService.ScanSeries(libraryId, seriesId, forceUpdate));
}
@@ -233,12 +236,9 @@
var chapters = series.Volumes.SelectMany(volume => volume.Chapters).ToList();
// Update Metadata based on Chapter metadata
series.Metadata.ReleaseYear = chapters.Min(c => c.ReleaseDate.Year);
if (series.Metadata.ReleaseYear < 1000)
if (!series.Metadata.ReleaseYearLocked)
{
// Not a valid year, default to 0
series.Metadata.ReleaseYear = 0;
series.Metadata.ReleaseYear = chapters.MinimumReleaseYear();
}
// Set the AgeRating as highest in all the comicInfos
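MinimumReleaseYear itself is not shown in this diff; based on the inline logic it replaces (minimum chapter year, with implausible years treated as unknown), a plausible sketch:

using System;
using System.Collections.Generic;
using System.Linq;

// Plausible sketch only - the real extension lives elsewhere in the codebase.
// Chapter stands in for API.Entities.Chapter.
public static class ChapterExtensions
{
    public static int MinimumReleaseYear(this IList<Chapter> chapters)
    {
        if (chapters.Count == 0) return 0;
        var year = chapters.Min(c => c.ReleaseDate.Year);
        return year < 1000 ? 0 : year; // not a valid year -> default to 0
    }
}

The ReleaseYearLocked guard around the call is the new part: a user-locked release year is no longer overwritten by scans.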
@@ -438,7 +438,22 @@
_logger.LogDebug("[ScannerService] Updating {DistinctVolumes} volumes on {SeriesName}", distinctVolumes.Count, series.Name);
foreach (var volumeNumber in distinctVolumes)
{
var volume = series.Volumes.SingleOrDefault(s => s.Name == volumeNumber);
_logger.LogDebug("[ScannerService] Looking up volume for {VolumeNumber}", volumeNumber);
Volume volume;
try
{
volume = series.Volumes.SingleOrDefault(s => s.Name == volumeNumber);
}
catch (Exception ex)
{
if (ex.Message.Equals("Sequence contains more than one matching element"))
{
_logger.LogCritical("[ScannerService] Kavita found corrupted volume entries on {SeriesName}. Please delete the series from Kavita via UI and rescan", series.Name);
throw new KavitaException(
$"Kavita found corrupted volume entries on {series.Name}. Please delete the series from Kavita via UI and rescan");
}
throw;
}
if (volume == null)
{
volume = DbFactory.Volume(volumeNumber);
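The new try/catch exists because SingleOrDefault throws when more than one volume matches, where FirstOrDefault would silently pick one; surfacing the corruption as a KavitaException with a clear message beats an opaque scan failure. To see the difference:

using System;
using System.Linq;

var names = new[] { "1", "1", "2" }; // two corrupted "1" entries

Console.WriteLine(names.FirstOrDefault(n => n == "1")); // "1" - silently picks the first
try
{
    Console.WriteLine(names.SingleOrDefault(n => n == "1"));
}
catch (InvalidOperationException ex)
{
    // "Sequence contains more than one matching element" - the exact case the
    // scanner now converts into a user-facing KavitaException.
    Console.WriteLine(ex.Message);
}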
@@ -457,7 +472,7 @@
foreach (var chapter in volume.Chapters)
{
var firstFile = chapter.Files.MinBy(x => x.Chapter);
if (firstFile == null || _cacheHelper.HasFileNotChangedSinceCreationOrLastScan(chapter, false, firstFile)) continue;
if (firstFile == null || _cacheHelper.IsFileUnmodifiedSinceCreationOrLastScan(chapter, false, firstFile)) continue;
try
{
var firstChapterInfo = infos.SingleOrDefault(i => i.FullFilePath.Equals(firstFile.FilePath));
@@ -479,10 +494,10 @@
var deletedVolumes = series.Volumes.Except(nonDeletedVolumes);
foreach (var volume in deletedVolumes)
{
var file = volume.Chapters.FirstOrDefault()?.Files?.FirstOrDefault()?.FilePath ?? "";
var file = volume.Chapters.FirstOrDefault()?.Files?.FirstOrDefault()?.FilePath ?? string.Empty;
if (!string.IsNullOrEmpty(file) && _directoryService.FileSystem.File.Exists(file))
{
_logger.LogError(
_logger.LogInformation(
"[ScannerService] Volume cleanup code was trying to remove a volume with a file still existing on disk. File: {File}",
file);
}
@@ -493,7 +508,7 @@
series.Volumes = nonDeletedVolumes;
}
_logger.LogDebug("[ScannerService] Updated {SeriesName} volumes from {StartingVolumeCount} to {VolumeCount}",
_logger.LogDebug("[ScannerService] Updated {SeriesName} volumes from count of {StartingVolumeCount} to {VolumeCount}",
series.Name, startingVolumeCount, series.Volumes.Count);
}
@@ -582,7 +597,7 @@
{
var firstFile = chapter.Files.MinBy(x => x.Chapter);
if (firstFile == null ||
_cacheHelper.HasFileNotChangedSinceCreationOrLastScan(chapter, false, firstFile)) return;
_cacheHelper.IsFileUnmodifiedSinceCreationOrLastScan(chapter, false, firstFile)) return;
var comicInfo = info;
if (info == null)
@@ -616,14 +631,7 @@
}
// This needs to check against both Number and Volume to calculate Count
if (!string.IsNullOrEmpty(comicInfo.Number) && float.Parse(comicInfo.Number) > 0)
{
chapter.Count = (int) Math.Floor(float.Parse(comicInfo.Number));
}
if (!string.IsNullOrEmpty(comicInfo.Volume) && float.Parse(comicInfo.Volume) > 0)
{
chapter.Count = Math.Max(chapter.Count, (int) Math.Floor(float.Parse(comicInfo.Volume)));
}
chapter.Count = comicInfo.CalculatedCount();
void AddPerson(Person person)
{
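CalculatedCount is not shown in this diff, but the inline code it replaces pins down the intended behavior. A sketch reconstructed from those removed lines (assumes it is a member of ComicInfo with the Number and Volume string fields):

// Reconstruction for illustration - the actual implementation is elsewhere.
public int CalculatedCount()
{
    var count = 0;
    if (!string.IsNullOrEmpty(Number) && float.Parse(Number) > 0)
    {
        count = (int) Math.Floor(float.Parse(Number));
    }
    if (!string.IsNullOrEmpty(Volume) && float.Parse(Volume) > 0)
    {
        count = Math.Max(count, (int) Math.Floor(float.Parse(Volume)));
    }
    return count;
}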
@@ -632,13 +640,11 @@
void AddGenre(Genre genre)
{
//chapter.Genres.Add(genre);
GenreHelper.AddGenreIfNotExists(chapter.Genres, genre);
}
void AddTag(Tag tag, bool added)
{
//chapter.Tags.Add(tag);
TagHelper.AddTagIfNotExists(chapter.Tags, tag);
}
@@ -647,7 +653,7 @@
{
var day = Math.Max(comicInfo.Day, 1);
var month = Math.Max(comicInfo.Month, 1);
chapter.ReleaseDate = DateTime.Parse($"{month}/{day}/{comicInfo.Year}");
chapter.ReleaseDate = new DateTime(comicInfo.Year, month, day);
}
var people = GetTagValues(comicInfo.Colorist);
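A small but real bug fix in this hunk: DateTime.Parse over an interpolated month/day/year string is culture-sensitive, while the DateTime constructor is not. For example:

using System;
using System.Globalization;

int year = 2022, month = 7, day = 24;

// Old approach: a string round-trip, interpreted per culture. Under en-US this
// reads July 24; a d/M/y culture would throw here (no month 24), and ambiguous
// dates like 7/2 would silently swap day and month.
var parsed = DateTime.Parse($"{month}/{day}/{year}", CultureInfo.GetCultureInfo("en-US"));

// New approach: construct directly, no culture involved.
var constructed = new DateTime(year, month, day);

Console.WriteLine(parsed == constructed); // True, but only because en-US was pinned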
@@ -736,7 +742,6 @@
/// <param name="action"></param>
private void UpdatePeople(IEnumerable<string> names, PersonRole role, Action<Person> action)
{
var allPeopleTypeRole = _people.Where(p => p.Role == role).ToList();
foreach (var name in names)


@@ -8,7 +8,7 @@ using System.Threading.Tasks;
using API.Data;
using API.Data.Repositories;
using API.Entities;
using API.Extensions;
using API.Entities.Enums;
using API.Helpers;
using API.Parser;
using API.Services.Tasks.Metadata;
@@ -25,14 +25,15 @@ public interface IScannerService
/// cover images if forceUpdate is true.
/// </summary>
/// <param name="libraryId">Library to scan against</param>
/// <param name="forceUpdate">Don't perform optimization checks, defaults to false</param>
[Queue(TaskScheduler.ScanQueue)]
[DisableConcurrentExecution(60 * 60 * 60)]
[AutomaticRetry(Attempts = 0, OnAttemptsExceeded = AttemptsExceededAction.Delete)]
[AutomaticRetry(Attempts = 3, OnAttemptsExceeded = AttemptsExceededAction.Delete)]
Task ScanLibrary(int libraryId, bool forceUpdate = false);
[Queue(TaskScheduler.ScanQueue)]
[DisableConcurrentExecution(60 * 60 * 60)]
[AutomaticRetry(Attempts = 0, OnAttemptsExceeded = AttemptsExceededAction.Delete)]
[AutomaticRetry(Attempts = 3, OnAttemptsExceeded = AttemptsExceededAction.Delete)]
Task ScanLibraries();
[Queue(TaskScheduler.ScanQueue)]
@@ -96,18 +97,39 @@ public class ScannerService : IScannerService
_wordCountAnalyzerService = wordCountAnalyzerService;
}
/// <summary>
/// Given a generic folder path, will invoke a Series scan or Library scan.
/// </summary>
/// <remarks>This will Schedule the job to run 1 minute in the future to allow for any close-by duplicate requests to be dropped</remarks>
/// <param name="folder"></param>
public async Task ScanFolder(string folder)
{
var seriesId = await _unitOfWork.SeriesRepository.GetSeriesIdByFolder(folder);
if (seriesId > 0)
Series series = null;
try
{
BackgroundJob.Enqueue(() => ScanSeries(seriesId, true));
series = await _unitOfWork.SeriesRepository.GetSeriesByFolderPath(folder, SeriesIncludes.Library);
}
catch (InvalidOperationException ex)
{
if (ex.Message.Equals("Sequence contains more than one element."))
{
_logger.LogCritical("[ScannerService] Multiple series map to this folder. Library scan will be used for ScanFolder");
}
}
if (series != null && series.Library.Type != LibraryType.Book)
{
if (TaskScheduler.HasScanTaskRunningForSeries(series.Id))
{
_logger.LogInformation("[ScannerService] Scan folder invoked for {Folder} but a task is already queued for this series. Dropping request", folder);
return;
}
BackgroundJob.Schedule(() => ScanSeries(series.Id, true), TimeSpan.FromMinutes(1));
return;
}
// This is basically a rework of what's already done in Library Watcher but is needed if invoked via API
var parentDirectory = _directoryService.GetParentDirectoryName(folder);
if (string.IsNullOrEmpty(parentDirectory)) return; // This should never happen as it's calculated before enqueing
if (string.IsNullOrEmpty(parentDirectory)) return;
var libraries = (await _unitOfWork.LibraryRepository.GetLibraryDtosAsync()).ToList();
var libraryFolders = libraries.SelectMany(l => l.Folders);
@@ -118,12 +140,17 @@
var library = libraries.FirstOrDefault(l => l.Folders.Select(Scanner.Parser.Parser.NormalizePath).Contains(libraryFolder));
if (library != null)
{
BackgroundJob.Enqueue(() => ScanLibrary(library.Id, false));
if (TaskScheduler.HasScanTaskRunningForLibrary(library.Id))
{
_logger.LogInformation("[ScannerService] Scan folder invoked for {Folder} but a task is already queued for this library. Dropping request", folder);
return;
}
BackgroundJob.Schedule(() => ScanLibrary(library.Id, false), TimeSpan.FromMinutes(1));
}
}
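The net effect of the new ScanFolder: requests are deferred a minute and dropped when a matching scan is already queued, i.e. a debounce over file-system events. A self-contained sketch of that pattern (hypothetical names; the real gatekeeping is TaskScheduler.HasScanTaskRunningForSeries/Library plus Hangfire's BackgroundJob.Schedule):

using System;
using System.Collections.Concurrent;

class ScanDebouncer
{
    private readonly ConcurrentDictionary<int, DateTime> _queued = new();

    public bool TryQueue(int seriesId)
    {
        // Only one pending scan per series; near-duplicate requests (e.g. many
        // file events from a single copy operation) are dropped.
        if (!_queued.TryAdd(seriesId, DateTime.UtcNow)) return false;
        // Real code: BackgroundJob.Schedule(() => ScanSeries(seriesId, true),
        //            TimeSpan.FromMinutes(1));
        return true;
    }

    public void Complete(int seriesId) => _queued.TryRemove(seriesId, out _);
}

class Demo
{
    static void Main()
    {
        var debouncer = new ScanDebouncer();
        Console.WriteLine(debouncer.TryQueue(42)); // True  - scheduled
        Console.WriteLine(debouncer.TryQueue(42)); // False - duplicate dropped
    }
}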
/// <summary>
///
/// Scans just an existing Series for changes. If the series doesn't exist, will delete it.
/// </summary>
/// <param name="seriesId"></param>
/// <param name="bypassFolderOptimizationChecks">Not Used. Scan series will always force</param>
@@ -164,6 +191,7 @@
await _eventHub.SendMessageAsync(MessageFactory.Error, MessageFactory.ErrorEvent($"{series.Name} scan aborted", "Files for series are not in a nested folder under library path. Correct this and rescan."));
return;
}
}
if (string.IsNullOrEmpty(folderPath))
@@ -173,14 +201,13 @@
return;
}
// If the series path doesn't exist anymore, it was either moved or renamed. We need to essentially delete it
var parsedSeries = new Dictionary<ParsedSeries, IList<ParserInfo>>();
var processTasks = new List<Task>();
await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.LibraryScanProgressEvent(library.Name, ProgressEventType.Started, series.Name));
await _processSeries.Prime();
void TrackFiles(Tuple<bool, IList<ParserInfo>> parsedInfo)
async Task TrackFiles(Tuple<bool, IList<ParserInfo>> parsedInfo)
{
var parsedFiles = parsedInfo.Item2;
if (parsedFiles.Count == 0) return;
@@ -192,23 +219,24 @@
Format = parsedFiles.First().Format
};
if (!foundParsedSeries.NormalizedName.Equals(series.NormalizedName))
// For Scan Series, we need to filter out anything that isn't our Series
if (!foundParsedSeries.NormalizedName.Equals(series.NormalizedName) && !foundParsedSeries.NormalizedName.Equals(Scanner.Parser.Parser.Normalize(series.OriginalName)))
{
return;
}
processTasks.Add(_processSeries.ProcessSeriesAsync(parsedFiles, library));
await _processSeries.ProcessSeriesAsync(parsedFiles, library);
parsedSeries.Add(foundParsedSeries, parsedFiles);
}
_logger.LogInformation("Beginning file scan on {SeriesName}", series.Name);
var scanElapsedTime = await ScanFiles(library, new []{folderPath}, false, TrackFiles, true);
var scanElapsedTime = await ScanFiles(library, new []{ folderPath }, false, TrackFiles, true);
_logger.LogInformation("ScanFiles for {Series} took {Time}", series.Name, scanElapsedTime);
//await Task.WhenAll(processTasks);
await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.LibraryScanProgressEvent(library.Name, ProgressEventType.Ended, series.Name));
// Remove any parsedSeries keys that don't belong to our series. This can occur when users store 2 series in the same folder
RemoveParsedInfosNotForSeries(parsedSeries, series);
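RemoveParsedInfosNotForSeries is not shown in this excerpt; per the comment above it, a plausible shape:

// Plausible shape only - the real implementation is not in this excerpt.
// Drops any parsed bucket whose normalized name doesn't match the series being
// scanned, e.g. when two series live in one folder.
private static void RemoveParsedInfosNotForSeries(
    Dictionary<ParsedSeries, IList<ParserInfo>> parsedSeries, Series series)
{
    var keysToRemove = parsedSeries.Keys
        .Where(k => !k.NormalizedName.Equals(series.NormalizedName))
        .ToList();
    foreach (var key in keysToRemove) parsedSeries.Remove(key);
}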
@@ -378,7 +406,7 @@
[Queue(TaskScheduler.ScanQueue)]
[DisableConcurrentExecution(60 * 60 * 60)]
[AutomaticRetry(Attempts = 0, OnAttemptsExceeded = AttemptsExceededAction.Delete)]
[AutomaticRetry(Attempts = 3, OnAttemptsExceeded = AttemptsExceededAction.Delete)]
public async Task ScanLibraries()
{
_logger.LogInformation("Starting Scan of All Libraries");
@@ -396,9 +424,10 @@
/// ie) all entities will be rechecked for new cover images and comicInfo.xml changes
/// </summary>
/// <param name="libraryId"></param>
/// <param name="forceUpdate">Defaults to false</param>
[Queue(TaskScheduler.ScanQueue)]
[DisableConcurrentExecution(60 * 60 * 60)]
[AutomaticRetry(Attempts = 0, OnAttemptsExceeded = AttemptsExceededAction.Delete)]
[AutomaticRetry(Attempts = 3, OnAttemptsExceeded = AttemptsExceededAction.Delete)]
public async Task ScanLibrary(int libraryId, bool forceUpdate = false)
{
var sw = Stopwatch.StartNew();
@@ -423,12 +452,13 @@
await _processSeries.Prime();
var processTasks = new List<Task>();
void TrackFiles(Tuple<bool, IList<ParserInfo>> parsedInfo)
var processTasks = new List<Func<Task>>();
Task TrackFiles(Tuple<bool, IList<ParserInfo>> parsedInfo)
{
var skippedScan = parsedInfo.Item1;
var parsedFiles = parsedInfo.Item2;
if (parsedFiles.Count == 0) return;
if (parsedFiles.Count == 0) return Task.CompletedTask;
var foundParsedSeries = new ParsedSeries()
{
@@ -445,21 +475,23 @@
NormalizedName = Scanner.Parser.Parser.Normalize(pf.Series),
Format = pf.Format
}));
return;
return Task.CompletedTask;
}
totalFiles += parsedFiles.Count;
seenSeries.Add(foundParsedSeries);
processTasks.Add(_processSeries.ProcessSeriesAsync(parsedFiles, library));
processTasks.Add(async () => await _processSeries.ProcessSeriesAsync(parsedFiles, library));
return Task.CompletedTask;
}
var scanElapsedTime = await ScanFiles(library, libraryFolderPaths, shouldUseLibraryScan, TrackFiles, forceUpdate);
var scanElapsedTime = await ScanFiles(library, libraryFolderPaths, shouldUseLibraryScan, TrackFiles);
await Task.WhenAll(processTasks);
foreach (var task in processTasks)
{
await task();
}
await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.FileScanProgressEvent(string.Empty, library.Name, ProgressEventType.Ended));
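The TrackFiles rework above is the key behavioral change in this hunk: a Task is hot the moment it is created, while a Func<Task> runs only when invoked, so ProcessSeriesAsync calls now execute one at a time after the file scan instead of racing it. A minimal demonstration:

using System;
using System.Collections.Generic;
using System.Threading.Tasks;

// With List<Task>, all three operations are already running by the time
// WhenAll is reached - creating a Task starts it.
var eager = new List<Task> { Process("a"), Process("b"), Process("c") };
await Task.WhenAll(eager);

// With List<Func<Task>>, nothing has started yet; each delegate runs only when
// invoked, one at a time - the pattern the scan loop moved to.
var deferred = new List<Func<Task>> { () => Process("a"), () => Process("b"), () => Process("c") };
foreach (var task in deferred)
{
    await task();
}

static async Task Process(string name)
{
    Console.WriteLine($"processing {name}");
    await Task.Delay(10);
}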
@@ -473,9 +505,6 @@
library.LastScanned = time;
// Could I delete anything in a Library's Series where the LastScan date is before scanStart?
// NOTE: This implementation is expensive
var removedSeries = await _unitOfWork.SeriesRepository.RemoveSeriesNotInList(seenSeries, library.Id);
_unitOfWork.LibraryRepository.Update(library);
if (await _unitOfWork.CommitAsync())
@@ -484,7 +513,7 @@
{
_logger.LogInformation(
"[ScannerService] Finished library scan of {ParsedSeriesCount} series in {ElapsedScanTime} milliseconds for {LibraryName}. There were no changes",
totalFiles, seenSeries.Count, sw.ElapsedMilliseconds, library.Name);
seenSeries.Count, sw.ElapsedMilliseconds, library.Name);
}
else
{
@@ -493,10 +522,27 @@
totalFiles, seenSeries.Count, sw.ElapsedMilliseconds, library.Name);
}
foreach (var s in removedSeries)
try
{
await _eventHub.SendMessageAsync(MessageFactory.SeriesRemoved,
MessageFactory.SeriesRemovedEvent(s.Id, s.Name, s.LibraryId), false);
// Could I delete anything in a Library's Series where the LastScan date is before scanStart?
// NOTE: This implementation is expensive
_logger.LogDebug("[ScannerService] Removing Series that were not found during the scan");
var removedSeries = await _unitOfWork.SeriesRepository.RemoveSeriesNotInList(seenSeries, library.Id);
_logger.LogDebug("[ScannerService] Found {Count} series that needs to be removed: {SeriesList}",
removedSeries.Count, removedSeries.Select(s => s.Name));
_logger.LogDebug("[ScannerService] Removing Series that were not found during the scan - complete");
await _unitOfWork.CommitAsync();
foreach (var s in removedSeries)
{
await _eventHub.SendMessageAsync(MessageFactory.SeriesRemoved,
MessageFactory.SeriesRemovedEvent(s.Id, s.Name, s.LibraryId), false);
}
}
catch (Exception ex)
{
_logger.LogCritical(ex, "[ScannerService] There was an issue deleting series for cleanup. Please check logs and rescan");
}
}
else
@@ -512,7 +558,7 @@
}
private async Task<long> ScanFiles(Library library, IEnumerable<string> dirs,
bool isLibraryScan, Action<Tuple<bool, IList<ParserInfo>>> processSeriesInfos = null, bool forceChecks = false)
bool isLibraryScan, Func<Tuple<bool, IList<ParserInfo>>, Task> processSeriesInfos = null, bool forceChecks = false)
{
var scanner = new ParseScannedFiles(_logger, _directoryService, _readingItemService, _eventHub);
var scanWatch = Stopwatch.StartNew();
@@ -549,4 +595,5 @@
{
return existingSeries.Where(es => !ParserInfoHelpers.SeriesHasMatchingParserInfoFormat(es, parsedSeries));
}
}


@@ -1,4 +1,6 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Net.Http;
using System.Runtime.InteropServices;
@@ -21,6 +23,7 @@
{
Task Send();
Task<ServerInfoDto> GetServerInfo();
Task SendCancellation();
}
public class StatsService : IStatsService
{
@@ -127,6 +130,11 @@
MaxSeriesInALibrary = await MaxSeriesInAnyLibrary(),
MaxVolumesInASeries = await MaxVolumesInASeries(),
MaxChaptersInASeries = await MaxChaptersInASeries(),
MangaReaderBackgroundColors = await AllMangaReaderBackgroundColors(),
MangaReaderPageSplittingModes = await AllMangaReaderPageSplitting(),
MangaReaderLayoutModes = await AllMangaReaderLayoutModes(),
FileFormats = AllFormats(),
UsingRestrictedProfiles = await GetUsingRestrictedProfiles(),
};
var usersWithPref = (await _unitOfWork.UserRepository.GetAllUsersAsync(AppUserIncludes.UserPreferences)).ToList();
@@ -149,6 +157,39 @@
return serverInfo;
}
public async Task SendCancellation()
{
_logger.LogInformation("Informing KavitaStats that this instance is no longer sending stats");
var installId = (await _unitOfWork.SettingsRepository.GetSettingsDtoAsync()).InstallId;
var responseContent = string.Empty;
try
{
var response = await (ApiUrl + "/api/v2/stats/opt-out?installId=" + installId)
.WithHeader("Accept", "application/json")
.WithHeader("User-Agent", "Kavita")
.WithHeader("x-api-key", "MsnvA2DfQqxSK5jh")
.WithHeader("x-kavita-version", BuildInfo.Version)
.WithHeader("Content-Type", "application/json")
.WithTimeout(TimeSpan.FromSeconds(30))
.PostAsync();
if (response.StatusCode != StatusCodes.Status200OK)
{
_logger.LogError("KavitaStats did not respond successfully. {Content}", response);
}
}
catch (HttpRequestException e)
{
_logger.LogError(e, "KavitaStats did not respond successfully. {Response}", responseContent);
}
catch (Exception e)
{
_logger.LogError(e, "An error happened during the request to KavitaStats");
}
}
private Task<bool> GetIfUsingSeriesRelationship()
{
return _context.SeriesRelation.AnyAsync();
@@ -190,4 +231,40 @@
.SelectMany(v => v.Chapters)
.Count());
}
private async Task<IEnumerable<string>> AllMangaReaderBackgroundColors()
{
return await _context.AppUserPreferences.Select(p => p.BackgroundColor).Distinct().ToListAsync();
}
private async Task<IEnumerable<PageSplitOption>> AllMangaReaderPageSplitting()
{
return await _context.AppUserPreferences.Select(p => p.PageSplitOption).Distinct().ToListAsync();
}
private async Task<IEnumerable<LayoutMode>> AllMangaReaderLayoutModes()
{
return await _context.AppUserPreferences.Select(p => p.LayoutMode).Distinct().ToListAsync();
}
private IEnumerable<FileFormatDto> AllFormats()
{
var results = _context.MangaFile
.AsNoTracking()
.AsEnumerable()
.Select(m => new FileFormatDto()
{
Format = m.Format,
Extension = Path.GetExtension(m.FilePath)?.ToLowerInvariant()
})
.DistinctBy(f => f.Extension)
.ToList();
return results;
}
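Two details in AllFormats: the query drops to AsEnumerable(), presumably because Path.GetExtension cannot be translated to SQL, and DistinctBy (.NET 6+) keeps the first element per key. For instance:

using System;
using System.IO;
using System.Linq;

var files = new[] { "a.cbz", "b.CBZ", "c.epub" };

// One entry per lower-cased extension, first occurrence wins.
var formats = files
    .Select(f => new { File = f, Extension = Path.GetExtension(f)?.ToLowerInvariant() })
    .DistinctBy(x => x.Extension)
    .ToList();

foreach (var f in formats) Console.WriteLine($"{f.Extension} from {f.File}");
// .cbz from a.cbz
// .epub from c.epub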
private Task<bool> GetUsingRestrictedProfiles()
{
return _context.Users.AnyAsync(u => u.AgeRestriction > AgeRating.NotApplicable);
}
}


@@ -75,7 +75,7 @@ public class TokenService : ITokenService
var username = tokenContent.Claims.FirstOrDefault(q => q.Type == JwtRegisteredClaimNames.NameId)?.Value;
var user = await _userManager.FindByNameAsync(username);
if (user == null) return null; // This forces a logout
var isValid = await _userManager.VerifyUserTokenAsync(user, TokenOptions.DefaultProvider, "RefreshToken", request.RefreshToken);
await _userManager.VerifyUserTokenAsync(user, TokenOptions.DefaultProvider, "RefreshToken", request.RefreshToken);
await _userManager.UpdateSecurityStampAsync(user);