Kavita/API/Services/Tasks/Scanner/ParseScannedFiles.cs
Joseph Milazzo 0eac193248
New Scan Loop (#1447)
* Staging the code for the new scan loop.

* Implemented a basic idea of changes on drives triggering the scan loop. Issues: 1. Scan by folder does not work, 2. The queuing system is very hacky and needs a separate thread, 3. Performance degradation could be very real.

* Started writing unit tests for the new loop code

* Implemented a basic method to scan a folder path with ignore support (not yet implemented; code is in place)

* Added some code to the parser to build out the idea of processing series in batches based on some top-level folder.

* Scan Series now uses the new code (folder based parsing) and now handles the LocalizedSeries issue.

* Got library scan working with the new folder-based scan loop. Updated code to set FolderPath (for improved scan times and partial scan support).

* Wrote some notes on the update-library scan loop.

* Removed migration for merge

* Reapplied the SeriesFolder migration after merge

* Refactored a check that used multiple DB calls into one.

* Made lots of progress on ignore support, but there is some confusion in the underlying library. Ticket created; on hold until it's resolved.

* Updated Scan Library and Scan Series to exit early if there are no changes in the underlying folders that need to be scanned.

* Implemented the ability to have .kavitaignore files within your directories; Kavita will parse them and ignore files and directories based on the rules within them (a rough sketch of the idea follows this commit message).

* Fixed an issue where nested ignore files wouldn't stack with higher-level ignores

* Wrote out some basic code that showcases how we can scan a series or library based on file events on the underlying system. Very buggy; needs lots of edge-case testing, logging, and duplication checking.

* Things are kind of working. I'm getting lost in my own code and complexity, and I'm not sure it's worth it.

* Refactored ScanFiles out to Directory Service.

* Refactored more code out to keep the code clean.

* More unit tests

* Refactored the signature of ParsedSeries to use IList. Started writing unit tests and reworked UpdateLibrary to work how it used to with the new scan loop code (note: using async update library/series does not work).

* Fixed the bug where processSeriesInfos was being invoked twice per series and made the code work very similarly to the old code (except loose-leaf files don't work), but with folder-based scanning.

* Prep for unit tests (updating broken ones with new implementations)

* Just some notes. Not sure I want to finish this work.

* Refactored the LibraryWatcher with some comments and state variables.

* Undid the migrations in case I don't move forward with this branch

* Started to clean the code and prepare for finishing this work.

* Fixed a bad merge

* Updated signatures to cleanup the code and commit to the new strategy for scanning.

* Swapped out the code with async processing of series on a small library

* The new scan loop is working in both sync and async modes. The code is slow and not optimized. This represents a good point to start profiling and applying optimizations.

* Refactored UpdateSeries out of Scanner and into a dedicated file.

* Refactored how ProcessTasks are awaited to allow more async

* Fixed an issue where side nav item wouldn't show correct highlight and migrated to OnPush

* Moved where we start the stopwatch to encapsulate the full scan

* Cleaned up SignalR events to report correctly (still needs a redesign)

* Remove the "remove" code until I figure it out

* Put in extremely expensive series deletion code for library scan.

* Have Genre and Tag update the DB immediately to avoid duplication issues

* Taking a break

* Moving to a lock with People was successful. Need to apply to others.

* Refactored the series-level, Tag, and Genre code to the new locking strategy.

* New scan loop works. Next up: optimization

* Swapped out the Kavita logo with an SVG for faster load

* Refactored metadata updates to occur when the series are being updated.

* Code cleanup

* Added a new type of generic message (Info) to inform the user.

* Code cleanup

* Implemented an optimization which prevents any I/O (other than an attribute lookup) for Library/Series Scan. With this, a recently updated library on network storage (650 series) fully processes in 2 seconds.

Fixed a bug where File Analysis was running every time for each non-EPUB file.

* Fixed ARM x64 builds not being able to view PDF cover images due to a bad update in DocNet.

* Some code cleanup

* Added experimental SignalR update code for a more natural refresh of the library-detail page

* Hooked in ability to send new series events to UI

* Moved all scan (file scan only) tasks into the Scan Queue. Scheduled ScanLibraries will now check if any existing task is running and reschedule for 3 hours (10 minutes for Scan Series).

* Implemented the info event in the events widget and added a clear-all button to dismiss all infos and errors. Added --event-widget-info-bg-color

* Remove --drawer-background-color since it's not used

* When a new series is added, inject it directly into the view.

* Some debug code cleanup

* Fixed up the unit tests

* Ensure all config directories exist on startup

* Disabled Library Watching (that will go in next build)

* Ensure update for series is admin only

* Lots of code changes; Scan Series kind of works, specials are splitting, optimizations are failing. Demotivated by this work again.

* Removed SeriesFolder migration

* Added the SeriesFolder migration

* Added a new pipe for dates so we can provide some nicer defaults. Added folder path to the series detail.

* The scan optimizations now work for NTFS systems.

* Removed a TODO

* Migrated all times to use DateTime.Now and not UTC.

* Refactored some repo calls to use the includes flag pattern

* Implemented a check for the library scan optimization that validates whether the library was updated (type change, library rename, folder change, or series deletion) and lets the optimization be bypassed.

* Added another optimization which uses just the folder's last write time attribute if the drive is not NTFS.

* Fixed a unit test

* Some code cleanup
2022-08-19 05:42:38 -07:00
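
To make the .kavitaignore idea above concrete, here is a minimal sketch of rule matching, assuming simple glob-style rules. The KavitaIgnoreSketch class and its semantics are hypothetical illustrations, not the parser this commit ships (the commit itself notes the underlying library was still in question).

// Hypothetical illustration of .kavitaignore matching using simple glob rules;
// not the rule engine Kavita actually uses.
using System;
using System.Linq;
using System.Text.RegularExpressions;

public static class KavitaIgnoreSketch
{
    // Convert a glob-style rule (e.g. "*.pdf" or "extras") into an anchored regex
    private static Regex GlobToRegex(string rule) =>
        new Regex("^" + Regex.Escape(rule.TrimEnd('/')).Replace(@"\*", ".*").Replace(@"\?", ".") + "$",
            RegexOptions.IgnoreCase);

    // Returns true if any rule matches any segment of the relative path
    public static bool IsIgnored(string ignoreFileContents, string relativePath)
    {
        var rules = ignoreFileContents
            .Split('\n')
            .Select(l => l.Trim())
            .Where(l => l.Length > 0 && !l.StartsWith("#")) // blank lines and comments are skipped
            .Select(GlobToRegex)
            .ToList();
        return relativePath.Split('/', '\\').Any(segment => rules.Any(r => r.IsMatch(segment)));
    }
}

Nested ignore files stacking, as mentioned above, would then amount to concatenating the rule sets from each ancestor directory before calling IsIgnored.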

344 lines
16 KiB
C#

using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using API.Entities;
using API.Entities.Enums;
using API.Extensions;
using API.Helpers;
using API.Parser;
using API.SignalR;
using Microsoft.Extensions.Logging;
namespace API.Services.Tasks.Scanner
{
public class ParsedSeries
{
/// <summary>
/// Name of the Series
/// </summary>
public string Name { get; init; }
/// <summary>
/// Normalized Name of the Series
/// </summary>
public string NormalizedName { get; init; }
/// <summary>
/// Format of the Series
/// </summary>
public MangaFormat Format { get; init; }
}
public enum Modified
{
Modified = 1,
NotModified = 2
}
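/// <summary>
/// Records which series lives in a given folder and when that folder was last scanned.
/// Used as the value side of the seriesPaths maps to decide whether a folder can be skipped.
/// </summary>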
public class SeriesModified
{
public string FolderPath { get; set; }
public string SeriesName { get; set; }
public DateTime LastScanned { get; set; }
public MangaFormat Format { get; set; }
}
public class ParseScannedFiles
{
private readonly ILogger _logger;
private readonly IDirectoryService _directoryService;
private readonly IReadingItemService _readingItemService;
private readonly IEventHub _eventHub;
/// <summary>
/// An instance of a pipeline for processing files and returning a Map of Series -> ParserInfos.
/// Each instance is separate from other threads, allowing for no cross over.
/// </summary>
/// <param name="logger">Logger of the parent class that invokes this</param>
/// <param name="directoryService">Directory Service</param>
/// <param name="readingItemService">ReadingItemService Service for extracting information on a number of formats</param>
/// <param name="eventHub">For firing off SignalR events</param>
public ParseScannedFiles(ILogger logger, IDirectoryService directoryService,
IReadingItemService readingItemService, IEventHub eventHub)
{
_logger = logger;
_directoryService = directoryService;
_readingItemService = readingItemService;
_eventHub = eventHub;
}
/// <summary>
/// Gets the list of all parserInfos given a Series (Will match on Name, LocalizedName, OriginalName). If the series does not exist within, return empty list.
/// </summary>
/// <param name="parsedSeries"></param>
/// <param name="series"></param>
/// <returns></returns>
public static IList<ParserInfo> GetInfosByName(Dictionary<ParsedSeries, IList<ParserInfo>> parsedSeries, Series series)
{
var allKeys = parsedSeries.Keys.Where(ps =>
SeriesHelper.FindSeries(series, ps));
var infos = new List<ParserInfo>();
foreach (var key in allKeys)
{
infos.AddRange(parsedSeries[key]);
}
return infos;
}
/// <summary>
/// Scans all files in a folder path. For each folder within the folderPath, folderAction will be invoked for all files contained within it.
/// </summary>
/// <param name="folderPath">A library folder or series folder</param>
/// <param name="scanDirectoryByDirectory">Scan directory by directory and for each, call folderAction</param>
/// <param name="seriesPaths">Map of normalized folder path to the series known to live there, with their last scan times; used to skip unchanged folders</param>
/// <param name="folderAction">An async callback invoked once all files for each folder path are found</param>
/// <param name="forceCheck">If we should bypass any folder last write time checks on the scan and force I/O</param>
public async Task ProcessFiles(string folderPath, bool scanDirectoryByDirectory,
IDictionary<string, IList<SeriesModified>> seriesPaths, Func<IList<string>, string,Task> folderAction, bool forceCheck = false)
{
string normalizedPath;
if (scanDirectoryByDirectory)
{
var directories = _directoryService.GetDirectories(folderPath).ToList();
foreach (var directory in directories)
{
normalizedPath = Parser.Parser.NormalizePath(directory);
if (HasSeriesFolderNotChangedSinceLastScan(seriesPaths, normalizedPath, forceCheck))
{
await folderAction(new List<string>(), directory);
}
else
{
// For a scan, this is doing everything in the directory loop before the folder Action is called...which leads to no progress indication
await folderAction(_directoryService.ScanFiles(directory), directory);
}
}
return;
}
normalizedPath = Parser.Parser.NormalizePath(folderPath);
if (HasSeriesFolderNotChangedSinceLastScan(seriesPaths, normalizedPath, forceCheck))
{
await folderAction(new List<string>(), folderPath);
return;
}
await folderAction(_directoryService.ScanFiles(folderPath), folderPath);
}
/// <summary>
/// Attempts to either add a new series mapping to scannedSeries or add to an existing entry.
/// This will check if the name matches an existing series name (multiple fields) <see cref="MergeName"/>
/// </summary>
/// <param name="scannedSeries">The map of series to parsed infos accumulated so far</param>
/// <param name="info">The parsed info to track</param>
private void TrackSeries(ConcurrentDictionary<ParsedSeries, List<ParserInfo>> scannedSeries, ParserInfo info)
{
if (info.Series == string.Empty) return;
// Check if normalized info.Series already exists and if so, update info to use that name instead
info.Series = MergeName(scannedSeries, info);
var normalizedSeries = Parser.Parser.Normalize(info.Series);
var normalizedSortSeries = Parser.Parser.Normalize(info.SeriesSort);
var normalizedLocalizedSeries = Parser.Parser.Normalize(info.LocalizedSeries);
try
{
var existingKey = scannedSeries.Keys.SingleOrDefault(ps =>
ps.Format == info.Format && (ps.NormalizedName.Equals(normalizedSeries)
|| ps.NormalizedName.Equals(normalizedLocalizedSeries)
|| ps.NormalizedName.Equals(normalizedSortSeries)));
existingKey ??= new ParsedSeries()
{
Format = info.Format,
Name = info.Series,
NormalizedName = normalizedSeries
};
scannedSeries.AddOrUpdate(existingKey, new List<ParserInfo>() {info}, (_, oldValue) =>
{
oldValue ??= new List<ParserInfo>();
if (!oldValue.Contains(info))
{
oldValue.Add(info);
}
return oldValue;
});
}
catch (Exception ex)
{
_logger.LogCritical(ex, "{SeriesName} matches against multiple series in the parsed series. This indicates a critical Kavita issue. Key will be skipped", info.Series);
foreach (var seriesKey in scannedSeries.Keys.Where(ps =>
ps.Format == info.Format && (ps.NormalizedName.Equals(normalizedSeries)
|| ps.NormalizedName.Equals(normalizedLocalizedSeries)
|| ps.NormalizedName.Equals(normalizedSortSeries))))
{
_logger.LogCritical("Matches: {SeriesName} matches on {SeriesKey}", info.Series, seriesKey.Name);
}
}
}
/// <summary>
/// Using a normalized name from the passed ParserInfo, this checks against all found series so far and, if an existing one has the
/// same normalized name, merges into it. This is important as some manga may have slight differences in punctuation or capitalization.
/// </summary>
/// <param name="scannedSeries">The map of series to parsed infos accumulated so far</param>
/// <param name="info">The parsed info whose series name should be merged</param>
/// <returns>Series Name to group this info into</returns>
public string MergeName(ConcurrentDictionary<ParsedSeries, List<ParserInfo>> scannedSeries, ParserInfo info)
{
var normalizedSeries = Parser.Parser.Normalize(info.Series);
var normalizedLocalSeries = Parser.Parser.Normalize(info.LocalizedSeries);
try
{
var existingName =
scannedSeries.SingleOrDefault(p =>
(Parser.Parser.Normalize(p.Key.NormalizedName).Equals(normalizedSeries) ||
Parser.Parser.Normalize(p.Key.NormalizedName).Equals(normalizedLocalSeries)) &&
p.Key.Format == info.Format)
.Key;
if (existingName != null && !string.IsNullOrEmpty(existingName.Name))
{
return existingName.Name;
}
}
catch (Exception ex)
{
_logger.LogCritical(ex, "Multiple series detected for {SeriesName} ({File})! This is critical to fix! There should only be 1", info.Series, info.FullFilePath);
var values = scannedSeries.Where(p =>
(Parser.Parser.Normalize(p.Key.NormalizedName) == normalizedSeries ||
Parser.Parser.Normalize(p.Key.NormalizedName) == normalizedLocalSeries) &&
p.Key.Format == info.Format);
foreach (var pair in values)
{
_logger.LogCritical("Duplicate Series in DB matches with {SeriesName}: {DuplicateName}", info.Series, pair.Key.Name);
}
}
return info.Series;
}
/// <summary>
/// This is a new version which will process series by folder groups.
/// </summary>
/// <param name="libraryType">Type of library the folders belong to</param>
/// <param name="folders">Root folders to scan</param>
/// <param name="libraryName">Name of the library, used in progress events</param>
/// <param name="isLibraryScan">If true, each root folder is scanned directory by directory</param>
/// <param name="seriesPaths">Map of normalized folder path to known series and their last scan times, used to skip unchanged folders</param>
/// <param name="processSeriesInfos">Callback invoked per series; the bool is true when the folder was skipped and the infos were synthesized from seriesPaths</param>
/// <param name="forceCheck">Bypass the folder last write time optimization and force a full scan</param>
public async Task ScanLibrariesForSeries(LibraryType libraryType,
IEnumerable<string> folders, string libraryName, bool isLibraryScan,
IDictionary<string, IList<SeriesModified>> seriesPaths, Action<Tuple<bool, IList<ParserInfo>>> processSeriesInfos, bool forceCheck = false)
{
await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.FileScanProgressEvent("Starting file scan", libraryName, ProgressEventType.Started));
foreach (var folderPath in folders)
{
try
{
await ProcessFiles(folderPath, isLibraryScan, seriesPaths, async (files, folder) =>
{
var normalizedFolder = Parser.Parser.NormalizePath(folder);
if (HasSeriesFolderNotChangedSinceLastScan(seriesPaths, normalizedFolder, forceCheck))
{
var parsedInfos = seriesPaths[normalizedFolder].Select(fp => new ParserInfo()
{
Series = fp.SeriesName,
Format = fp.Format,
}).ToList();
processSeriesInfos.Invoke(new Tuple<bool, IList<ParserInfo>>(true, parsedInfos));
_logger.LogDebug("Skipped File Scan for {Folder} as it hasn't changed since last scan", folder);
return;
}
_logger.LogDebug("Found {Count} files for {Folder}", files.Count, folder);
await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.FileScanProgressEvent(folderPath, libraryName, ProgressEventType.Updated));
var scannedSeries = new ConcurrentDictionary<ParsedSeries, List<ParserInfo>>();
var infos = files.Select(file => _readingItemService.ParseFile(file, folderPath, libraryType)).Where(info => info != null).ToList();
MergeLocalizedSeriesWithSeries(infos);
foreach (var info in infos)
{
try
{
TrackSeries(scannedSeries, info);
}
catch (Exception ex)
{
_logger.LogError(ex, "There was an exception that occurred during tracking {FilePath}. Skipping this file", info.FullFilePath);
}
}
// It would be really cool if we could emit an event when a folder hasn't changed, so we don't parse everything but still emit the first item to ensure we don't delete it.
// Otherwise, we can do a last step in the DB where we validate all files on disk exist and, if not, delete them (easy but slow)
foreach (var series in scannedSeries.Keys)
{
if (scannedSeries[series].Count > 0 && processSeriesInfos != null)
{
processSeriesInfos.Invoke(new Tuple<bool, IList<ParserInfo>>(false, scannedSeries[series]));
}
}
}, forceCheck);
}
catch (ArgumentException ex)
{
_logger.LogError(ex, "The directory '{FolderPath}' does not exist", folderPath);
}
}
await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.FileScanProgressEvent(string.Empty, libraryName, ProgressEventType.Ended));
}
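/// <summary>
/// Returns true if the folder is tracked in seriesPaths and its last write time (truncated to the minute)
/// is not newer than the last scan time of every series within it, meaning the scan can be skipped.
/// </summary>
/// <param name="seriesPaths">Map of normalized folder path to known series and their last scan times</param>
/// <param name="normalizedFolder">Normalized folder path to check</param>
/// <param name="forceCheck">When true, always returns false so the folder is rescanned</param>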
private bool HasSeriesFolderNotChangedSinceLastScan(IDictionary<string, IList<SeriesModified>> seriesPaths, string normalizedFolder, bool forceCheck = false)
{
if (forceCheck) return false;
return seriesPaths.ContainsKey(normalizedFolder) && seriesPaths[normalizedFolder].All(f => f.LastScanned.Truncate(TimeSpan.TicksPerMinute) >=
_directoryService.GetLastWriteTime(normalizedFolder).Truncate(TimeSpan.TicksPerMinute));
}
/// <summary>
/// Checks if there are any ParserInfos that have a Series that matches the LocalizedSeries field in any other info. If so,
/// rewrites the infos with series name instead of the localized name, so they stack.
/// </summary>
/// <example>
/// Accel World v01.cbz has Series "Accel World" and Localized Series "World of Acceleration"
/// World of Acceleration v02.cbz has Series "World of Acceleration"
/// After running this code, we'd have:
/// World of Acceleration v02.cbz having Series "Accel World" and Localized Series of "World of Acceleration"
/// </example>
/// <param name="infos">A collection of ParserInfos</param>
private static void MergeLocalizedSeriesWithSeries(IReadOnlyCollection<ParserInfo> infos)
{
var hasLocalizedSeries = infos.Any(i => !string.IsNullOrEmpty(i.LocalizedSeries));
if (!hasLocalizedSeries) return;
var localizedSeries = infos.Select(i => i.LocalizedSeries).Distinct()
.FirstOrDefault(i => !string.IsNullOrEmpty(i));
if (string.IsNullOrEmpty(localizedSeries)) return;
var nonLocalizedSeries = infos.Select(i => i.Series).Distinct()
.FirstOrDefault(series => !series.Equals(localizedSeries));
// If every info already uses the localized name, there is nothing to remap
if (nonLocalizedSeries == null) return;
var normalizedNonLocalizedSeries = Parser.Parser.Normalize(nonLocalizedSeries);
foreach (var infoNeedingMapping in infos.Where(i =>
!Parser.Parser.Normalize(i.Series).Equals(normalizedNonLocalizedSeries)))
{
infoNeedingMapping.Series = nonLocalizedSeries;
infoNeedingMapping.LocalizedSeries = localizedSeries;
}
}
}
}
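
For orientation, here is a minimal sketch of how a caller might drive this pipeline end to end. The ScanExample class, the "/manga" folder, and the aggregation into a dictionary are hypothetical stand-ins rather than Kavita's actual scanner wiring; the four dependencies are assumed to be supplied by the host's dependency injection, and the usings match the file above.

// Hypothetical caller for ParseScannedFiles; assumes the same usings as the
// file above and dependency-injected services. Not Kavita's real scanner code.
public static class ScanExample
{
    public static async Task<Dictionary<ParsedSeries, IList<ParserInfo>>> ScanAsync(
        ILogger logger, IDirectoryService directoryService,
        IReadingItemService readingItemService, IEventHub eventHub)
    {
        var scanner = new ParseScannedFiles(logger, directoryService, readingItemService, eventHub);
        var result = new Dictionary<ParsedSeries, IList<ParserInfo>>();
        // No prior scan state, so no folder can be skipped via the last write time optimization
        var seriesPaths = new Dictionary<string, IList<SeriesModified>>();
        await scanner.ScanLibrariesForSeries(LibraryType.Manga, new[] {"/manga"},
            "Manga Library", isLibraryScan: true, seriesPaths, tuple =>
            {
                var infos = tuple.Item2;
                if (tuple.Item1 || infos.Count == 0) return; // Item1 == true means the folder was skipped
                // Key each batch by the first info's series identity, mirroring TrackSeries
                result[new ParsedSeries
                {
                    Name = infos[0].Series,
                    NormalizedName = Parser.Parser.Normalize(infos[0].Series),
                    Format = infos[0].Format
                }] = infos;
            });
        return result;
    }
}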