Change Detection: On Push aka UI Smoothness (#1369)

* Updated Series Info Cards to use OnPush and hooked in progress events when we mark entities as read/unread. These events update progress bars and now also trigger a recalculation of Read Time Left.
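
A minimal sketch of that pattern, assuming a hypothetical MessageHubService that exposes a progress event stream; the service, event, and field names are illustrative, not the exact Kavita API.

```ts
import { ChangeDetectionStrategy, ChangeDetectorRef, Component, Input, OnDestroy, OnInit } from '@angular/core';
import { Observable, Subject } from 'rxjs';
import { takeUntil } from 'rxjs/operators';

// Hypothetical shapes used only for this sketch
interface ProgressEvent { seriesId: number; pagesRead: number; }
export abstract class MessageHubService { abstract progressEvent$: Observable<ProgressEvent>; }

@Component({
  selector: 'app-series-info-cards',
  template: `<div>Read time left: {{ readTimeLeft }} hrs</div>`,
  changeDetection: ChangeDetectionStrategy.OnPush // re-render only on input changes or explicit markForCheck
})
export class SeriesInfoCardsComponent implements OnInit, OnDestroy {
  @Input() seriesId!: number;
  @Input() totalPages = 0;
  readTimeLeft = 0;

  private readonly onDestroy = new Subject<void>();

  constructor(private messageHub: MessageHubService, private readonly cdRef: ChangeDetectorRef) {}

  ngOnInit(): void {
    // Mark as read/unread fires a progress event; recalculate Read Time Left and nudge change detection
    this.messageHub.progressEvent$
      .pipe(takeUntil(this.onDestroy))
      .subscribe((event: ProgressEvent) => {
        if (event.seriesId !== this.seriesId) return;
        this.readTimeLeft = this.estimateTimeLeft(event.pagesRead);
        this.cdRef.markForCheck(); // required under OnPush: a plain field mutation won't re-render
      });
  }

  ngOnDestroy(): void {
    this.onDestroy.next();
    this.onDestroy.complete();
  }

  private estimateTimeLeft(pagesRead: number): number {
    const avgPagesPerHour = 60; // placeholder constant for the sketch
    return Math.max(0, Math.round((this.totalPages - pagesRead) / avgPagesPerHour));
  }
}
```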

* Removed Library Card Component

* Moved the manga reader title and subtitle calculation to the backend.

* Converted card actionables to onPush

* Series Card on push cleanup

* Updated edit collection tags for on push

* Update cover image chooser for on push

* Cleaned up carousel reel

* Updated cover image to allow for uploading gif and webp files
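
A small sketch of widening the accepted upload types; the component and property names here are assumptions, not the actual implementation.

```ts
// Hypothetical excerpt from a cover image chooser component
export class CoverImageChooserComponent {
  // Previously only jpeg/png were accepted; gif and webp are now allowed for cover uploads
  acceptableExtensions = ['.png', '.jpg', '.jpeg', '.gif', '.webp'];

  isAcceptedFile(file: File): boolean {
    const name = file.name.toLowerCase();
    return this.acceptableExtensions.some(ext => name.endsWith(ext));
  }
}
```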

* Bulk add to collection on push

* Updated bulk operations to use on push. Bulk operations now have explicit Mark as Read and Mark as Unread buttons, and the Add to Collection and Delete actions are visible.

Fixed a bug where the Manage Library component wasn't invoking the trackBy function; a minimal trackBy sketch follows below.
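
A minimal sketch of the trackBy wiring described above; the template, model, and field names are illustrative.

```ts
import { ChangeDetectionStrategy, Component } from '@angular/core';

// Illustrative model; the real Library DTO has more fields
interface Library { id: number; name: string; }

@Component({
  selector: 'app-manage-library',
  // The fix is making sure the *ngFor actually references the trackBy function
  template: `
    <div *ngFor="let library of libraries; trackBy: trackByLibrary">
      {{ library.name }}
    </div>
  `,
  changeDetection: ChangeDetectionStrategy.OnPush
})
export class ManageLibraryComponent {
  libraries: Library[] = [];

  // A stable identity keeps Angular from recreating rows when the array reference changes
  trackByLibrary = (index: number, item: Library) => item.id;
}
```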

* Updating entity title for on push

* Removed file info component

* Updated Manage Library for on push

* Entity info cards on push

* List item on push

* Updated icon and title for on push and fixed some missing change detection on series detail

* Restricted the typeahead interface to simplify the design

* Edit Series Relation now shows a value in the dropdown for Parent relationships and disables the field.

* Updated edit series relation to focus the new typeahead when adding a new relationship

* Added some documentation. When scanning a library, don't allow the user to enqueue the same job multiple times.

* Applied the no-enqueue-if-already-enqueued logic to other tasks

* Library detail on push

* Updated events widget to onpush

* Card detail drawer on push. The card detail cover chooser now shows all chapters' covers for selection.

* Chapter metadata detail on push

* Removed Card Detail modal

* All collections on push

* Removed some comments

* Updated bulk selection to use an observable rather than function calls so new on push strategy works
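
A sketch of moving the selection state behind an observable so OnPush consumers can bind with the async pipe instead of calling methods every cycle; the service shape is an assumption.

```ts
import { Injectable } from '@angular/core';
import { BehaviorSubject, Observable } from 'rxjs';
import { map } from 'rxjs/operators';

@Injectable({ providedIn: 'root' })
export class BulkSelectionService {
  private selectedIds = new Set<number>();
  private selectionSource = new BehaviorSubject<Set<number>>(new Set());

  // OnPush components bind to these streams (e.g. *ngIf="bulkSelection.hasSelections$ | async")
  selections$: Observable<Set<number>> = this.selectionSource.asObservable();
  hasSelections$: Observable<boolean> = this.selections$.pipe(map(ids => ids.size > 0));

  toggle(id: number): void {
    if (this.selectedIds.has(id)) {
      this.selectedIds.delete(id);
    } else {
      this.selectedIds.add(id);
    }
    // Emit a new Set so reference checks in OnPush components see the change
    this.selectionSource.next(new Set(this.selectedIds));
  }

  clear(): void {
    this.selectedIds.clear();
    this.selectionSource.next(new Set());
  }
}
```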

* collection detail now uses on push and scroller is placed on correct element

* Updated library recommended to on push. Ensure that when mark as read occurs, the appropriate streams are refreshed.

* Updated library detail to on push

* Updated metadata filter to onpush. Bugs found and reported to the project

* person badge on push

* Read more on push

* Updated tag badge to on push

* User login on push

* When initializing the side nav, don't call an authenticated API until we are sure a user is logged in
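
A sketch of gating the authenticated call on a current-user stream; the service shapes below are stand-ins for the real account and library services.

```ts
import { ChangeDetectionStrategy, Component, OnInit } from '@angular/core';
import { Observable } from 'rxjs';
import { filter, switchMap, take } from 'rxjs/operators';

// Stand-in shapes for this sketch only
interface User { username: string; }
export abstract class AccountService { abstract currentUser$: Observable<User | undefined>; }
export abstract class LibraryService { abstract getLibraries(): Observable<unknown[]>; }

@Component({
  selector: 'app-side-nav',
  template: `<nav *ngIf="libraries$ | async as libraries">{{ libraries.length }} libraries</nav>`,
  changeDetection: ChangeDetectionStrategy.OnPush
})
export class SideNavComponent implements OnInit {
  libraries$!: Observable<unknown[]>;

  constructor(private accountService: AccountService, private libraryService: LibraryService) {}

  ngOnInit(): void {
    // Only hit the authenticated endpoint once a user is actually logged in
    this.libraries$ = this.accountService.currentUser$.pipe(
      filter((user): user is User => user !== undefined),
      take(1),
      switchMap(() => this.libraryService.getLibraries())
    );
  }
}
```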

* Updated splash container to on push

* Dashboard on push

* Side nav slight refactor around some api calls

* Cleaned up series card on push to use same cdRef naming convention

* Updated Static Files to use caching

* Added width and height to logo image

* shortcuts modal on push

* reading lists on push

* Reading list detail on push

* draggable ordered list on push

* Refactored reading-list-detail to use a new item component, which drastically reduces renders during operations
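
A sketch of the extracted item component, assuming a simplified item model; with OnPush, only rows whose input reference changes re-render.

```ts
import { ChangeDetectionStrategy, Component, EventEmitter, Input, Output } from '@angular/core';

// Illustrative model; the real reading list item DTO has more fields
interface ReadingListItem { id: number; title: string; pagesRead: number; pagesTotal: number; }

@Component({
  selector: 'app-reading-list-item',
  template: `
    <div class="item">
      <span>{{ item.title }}</span>
      <span>{{ item.pagesRead }} / {{ item.pagesTotal }}</span>
      <button (click)="read.emit(item)">Read</button>
    </div>
  `,
  changeDetection: ChangeDetectionStrategy.OnPush
})
export class ReadingListItemComponent {
  // Only rows whose @Input reference changes are re-rendered, not the whole list
  @Input() item!: ReadingListItem;
  @Output() read = new EventEmitter<ReadingListItem>();
}
```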

* series format on push

* circular loader on push

* Badge Expander on push

* update notification modal on push

* drawer on push

* Edit Series Modal on push

* reset password on push

* review series modal on push

* series metadata detail on push

* theme manager on push

* confirm reset password on push

* register on push

* confirm migration email on push

* confirm email on push

* add email to account migration on push

* user preferences on push. Made global settings open by default

* edit series relation on push

* Fixed an edge case bug for next chapter where, if the current volume had a single chapter numbered 1 and the next volume had a chapter numbered 0, it would say there are no more chapters.

* Updated infinite scroller with on push support

* Moved some animations over to typeahead, not integrated yet.

* Manga reader is now on push

* Reader settings on push

* refactored how we close the book

* Updated table of contents for on push

* Updated book reader for on push. Fixed a bug where the table of contents wasn't showing the current page anchor due to a scroll calculation bug

* Small code tweak

* Icon and title on push

* nav header on push

* grouped typeahead on push

* typeahead on push and added a new trackBy identity function to allow even faster rendering of big lists
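
A minimal sketch of an identity trackBy function of the kind described; the name is illustrative.

```ts
import { TrackByFunction } from '@angular/core';

// Identity trackBy: uses the item itself as the key, which is enough for
// primitive or already-unique entries in large typeahead option lists
export const trackByIdentity: TrackByFunction<unknown> = (_index: number, item: unknown) => item;

// Template usage:
//   <li *ngFor="let option of options; trackBy: trackByIdentity">{{ option }}</li>
```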

* pdf reader on push

* code cleanup
Joseph Milazzo 2022-07-11 11:57:07 -04:00 committed by GitHub
parent f5be0fac58
commit 4e49aa47ce
126 changed files with 1658 additions and 1674 deletions


@ -28,32 +28,27 @@ namespace API.Controllers
private readonly ILogger<ReaderController> _logger;
private readonly IReaderService _readerService;
private readonly IBookmarkService _bookmarkService;
private readonly IEventHub _eventHub;
/// <inheritdoc />
public ReaderController(ICacheService cacheService,
IUnitOfWork unitOfWork, ILogger<ReaderController> logger,
IReaderService readerService, IBookmarkService bookmarkService,
IEventHub eventHub)
IReaderService readerService, IBookmarkService bookmarkService)
{
_cacheService = cacheService;
_unitOfWork = unitOfWork;
_logger = logger;
_readerService = readerService;
_bookmarkService = bookmarkService;
_eventHub = eventHub;
}
/// <summary>
/// Returns the PDF for the chapterId.
/// </summary>
/// <param name="apiKey">API Key for user to validate they have access</param>
/// <param name="chapterId"></param>
/// <returns></returns>
[HttpGet("pdf")]
public async Task<ActionResult> GetPdf(int chapterId)
{
var chapter = await _cacheService.Ensure(chapterId);
if (chapter == null) return BadRequest("There was an issue finding pdf file for reading");
@ -152,9 +147,9 @@ namespace API.Controllers
if (dto == null) return BadRequest("Please perform a scan on this series or library and try again");
var mangaFile = (await _unitOfWork.ChapterRepository.GetFilesForChapterAsync(chapterId)).First();
return Ok(new ChapterInfoDto()
var info = new ChapterInfoDto()
{
ChapterNumber = dto.ChapterNumber,
ChapterNumber = dto.ChapterNumber,
VolumeNumber = dto.VolumeNumber,
VolumeId = dto.VolumeId,
FileName = Path.GetFileName(mangaFile.FilePath),
@ -164,8 +159,33 @@ namespace API.Controllers
LibraryId = dto.LibraryId,
IsSpecial = dto.IsSpecial,
Pages = dto.Pages,
ChapterTitle = dto.ChapterTitle ?? string.Empty
});
ChapterTitle = dto.ChapterTitle ?? string.Empty,
Subtitle = string.Empty,
Title = dto.SeriesName
};
if (info.ChapterTitle is {Length: > 0}) {
info.Title += " - " + info.ChapterTitle;
}
if (info.IsSpecial && dto.VolumeNumber.Equals(Parser.Parser.DefaultVolume))
{
info.Subtitle = info.FileName;
} else if (!info.IsSpecial && info.VolumeNumber.Equals(Parser.Parser.DefaultVolume))
{
info.Subtitle = _readerService.FormatChapterName(info.LibraryType, true, true) + info.ChapterNumber;
}
else
{
info.Subtitle = "Volume " + info.VolumeNumber;
if (!info.ChapterNumber.Equals(Parser.Parser.DefaultChapter))
{
info.Subtitle += " " + _readerService.FormatChapterName(info.LibraryType, true, true) +
info.ChapterNumber;
}
}
return Ok(info);
}
/// <summary>
@ -235,7 +255,7 @@ namespace API.Controllers
var user = await _unitOfWork.UserRepository.GetUserByUsernameAsync(User.GetUsername(), AppUserIncludes.Progress);
var chapters = await _unitOfWork.ChapterRepository.GetChaptersAsync(markVolumeReadDto.VolumeId);
_readerService.MarkChaptersAsUnread(user, markVolumeReadDto.SeriesId, chapters);
await _readerService.MarkChaptersAsUnread(user, markVolumeReadDto.SeriesId, chapters);
_unitOfWork.UserRepository.Update(user);
@ -258,7 +278,7 @@ namespace API.Controllers
var user = await _unitOfWork.UserRepository.GetUserByUsernameAsync(User.GetUsername(), AppUserIncludes.Progress);
var chapters = await _unitOfWork.ChapterRepository.GetChaptersAsync(markVolumeReadDto.VolumeId);
_readerService.MarkChaptersAsRead(user, markVolumeReadDto.SeriesId, chapters);
await _readerService.MarkChaptersAsRead(user, markVolumeReadDto.SeriesId, chapters);
_unitOfWork.UserRepository.Update(user);
@ -288,7 +308,7 @@ namespace API.Controllers
chapterIds.Add(chapterId);
}
var chapters = await _unitOfWork.ChapterRepository.GetChaptersByIdsAsync(chapterIds);
_readerService.MarkChaptersAsRead(user, dto.SeriesId, chapters);
await _readerService.MarkChaptersAsRead(user, dto.SeriesId, chapters);
_unitOfWork.UserRepository.Update(user);
@ -317,7 +337,7 @@ namespace API.Controllers
chapterIds.Add(chapterId);
}
var chapters = await _unitOfWork.ChapterRepository.GetChaptersByIdsAsync(chapterIds);
_readerService.MarkChaptersAsUnread(user, dto.SeriesId, chapters);
await _readerService.MarkChaptersAsUnread(user, dto.SeriesId, chapters);
_unitOfWork.UserRepository.Update(user);
@ -343,7 +363,7 @@ namespace API.Controllers
var volumes = await _unitOfWork.VolumeRepository.GetVolumesForSeriesAsync(dto.SeriesIds.ToArray(), true);
foreach (var volume in volumes)
{
_readerService.MarkChaptersAsRead(user, volume.SeriesId, volume.Chapters);
await _readerService.MarkChaptersAsRead(user, volume.SeriesId, volume.Chapters);
}
_unitOfWork.UserRepository.Update(user);
@ -370,7 +390,7 @@ namespace API.Controllers
var volumes = await _unitOfWork.VolumeRepository.GetVolumesForSeriesAsync(dto.SeriesIds.ToArray(), true);
foreach (var volume in volumes)
{
_readerService.MarkChaptersAsUnread(user, volume.SeriesId, volume.Chapters);
await _readerService.MarkChaptersAsUnread(user, volume.SeriesId, volume.Chapters);
}
_unitOfWork.UserRepository.Update(user);


@ -73,8 +73,6 @@ namespace API.Controllers
var userId = await _unitOfWork.UserRepository.GetUserIdByUsernameAsync(User.GetUsername());
var items = await _unitOfWork.ReadingListRepository.GetReadingListItemDtosByIdAsync(readingListId, userId);
return Ok(items);
//return Ok(await _unitOfWork.ReadingListRepository.AddReadingProgressModifiers(userId, items.ToList()));
}
/// <summary>


@ -1,23 +1,68 @@
using System;
using API.Entities.Enums;
using API.Entities.Enums;
namespace API.DTOs.Reader
namespace API.DTOs.Reader;
/// <summary>
/// Information about the Chapter for the Reader to render
/// </summary>
public class ChapterInfoDto : IChapterInfoDto
{
public class ChapterInfoDto : IChapterInfoDto
{
/// <summary>
/// The Chapter Number
/// </summary>
public string ChapterNumber { get; set; }
/// <summary>
/// The Volume Number
/// </summary>
public string VolumeNumber { get; set; }
/// <summary>
/// Volume entity Id
/// </summary>
public int VolumeId { get; set; }
/// <summary>
/// Series Name
/// </summary>
public string SeriesName { get; set; }
/// <summary>
/// Series Format
/// </summary>
public MangaFormat SeriesFormat { get; set; }
/// <summary>
/// Series entity Id
/// </summary>
public int SeriesId { get; set; }
/// <summary>
/// Library entity Id
/// </summary>
public int LibraryId { get; set; }
/// <summary>
/// Library type
/// </summary>
public LibraryType LibraryType { get; set; }
/// <summary>
/// Chapter's title if set via ComicInfo.xml (Title field)
/// </summary>
public string ChapterTitle { get; set; } = string.Empty;
/// <summary>
/// Total Number of pages in this Chapter
/// </summary>
public int Pages { get; set; }
/// <summary>
/// File name of the chapter
/// </summary>
public string FileName { get; set; }
/// <summary>
/// If this is marked as a special in Kavita
/// </summary>
public bool IsSpecial { get; set; }
/// <summary>
/// The subtitle to render on the reader
/// </summary>
public string Subtitle { get; set; }
/// <summary>
/// Series Title
/// </summary>
/// <remarks>Usually just series name, but can include chapter title</remarks>
public string Title { get; set; }
public string ChapterNumber { get; set; }
public string VolumeNumber { get; set; }
public int VolumeId { get; set; }
public string SeriesName { get; set; }
public MangaFormat SeriesFormat { get; set; }
public int SeriesId { get; set; }
public int LibraryId { get; set; }
public LibraryType LibraryType { get; set; }
public string ChapterTitle { get; set; } = string.Empty;
public int Pages { get; set; }
public string FileName { get; set; }
public bool IsSpecial { get; set; }
}
}


@ -1,6 +1,5 @@
namespace API.Entities
{
//[Index(nameof(SeriesId), nameof(VolumeId), nameof(ChapterId), IsUnique = true)]
public class ReadingListItem
{
public int Id { get; init; }
@ -16,7 +15,7 @@
public ReadingList ReadingList { get; set; }
public int ReadingListId { get; set; }
// Idea, keep these for easy join statements
// Keep these for easy join statements
public Series Series { get; set; }
public Volume Volume { get; set; }
public Chapter Chapter { get; set; }


@ -66,9 +66,9 @@ public class CacheHelper : ICacheHelper
/// <summary>
/// Has the file been modified since last scan or is user forcing an update
/// </summary>
/// <param name="lastScan"></param>
/// <param name="forceUpdate"></param>
/// <param name="firstFile"></param>
/// <param name="lastScan">Last time the scan was performed on this file</param>
/// <param name="forceUpdate">Should we ignore any logic and force this to return true</param>
/// <param name="firstFile">The file in question</param>
/// <returns></returns>
public bool HasFileChangedSinceLastScan(DateTime lastScan, bool forceUpdate, MangaFile firstFile)
{
@ -76,10 +76,6 @@ public class CacheHelper : ICacheHelper
if (forceUpdate) return true;
return _fileService.HasFileBeenModifiedSince(firstFile.FilePath, lastScan)
|| _fileService.HasFileBeenModifiedSince(firstFile.FilePath, firstFile.LastModified);
// return firstFile != null &&
// (!forceUpdate &&
// !(_fileService.HasFileBeenModifiedSince(firstFile.FilePath, lastScan)
// || _fileService.HasFileBeenModifiedSince(firstFile.FilePath, firstFile.LastModified)));
}
/// <summary>


@ -189,6 +189,7 @@ namespace API.Services
/// <remarks>This always creates a thumbnail</remarks>
/// <param name="archivePath"></param>
/// <param name="fileName">File name to use based on context of entity.</param>
/// <param name="outputDirectory">Where to output the file, defaults to covers directory</param>
/// <returns></returns>
public string GetCoverImage(string archivePath, string fileName, string outputDirectory)
{
@ -261,16 +262,16 @@ namespace API.Services
archive.Entries.Any(e => e.FullName.Contains(Path.AltDirectorySeparatorChar) && !Parser.Parser.HasBlacklistedFolderInPath(e.FullName));
}
// TODO: Refactor CreateZipForDownload to return the temp file so we can stream it from temp
/// <summary>
///
/// Creates a zip file form the listed files and outputs to the temp folder.
/// </summary>
/// <param name="files"></param>
/// <param name="files">List of files to be zipped up. Should be full file paths.</param>
/// <param name="tempFolder">Temp folder name to use for preparing the files. Will be created and deleted</param>
/// <returns></returns>
/// <returns>All bytes for the given file in a Tuple</returns>
/// <exception cref="KavitaException"></exception>
public async Task<Tuple<byte[], string>> CreateZipForDownload(IEnumerable<string> files, string tempFolder)
{
// TODO: Refactor CreateZipForDownload to return the temp file so we can stream it from temp
var dateString = DateTime.Now.ToShortDateString().Replace("/", "_");
var tempLocation = Path.Join(_directoryService.TempDirectory, $"{tempFolder}_{dateString}");


@ -686,6 +686,7 @@ namespace API.Services
/// </summary>
/// <param name="fileFilePath"></param>
/// <param name="fileName">Name of the new file.</param>
/// <param name="outputDirectory">Where to output the file, defaults to covers directory</param>
/// <returns></returns>
public string GetCoverImage(string fileFilePath, string fileName, string outputDirectory)
{


@ -25,6 +25,7 @@ public interface IImageService
/// Converts the passed image to webP and outputs it in the same directory
/// </summary>
/// <param name="filePath">Full path to the image to convert</param>
/// <param name="outputPath">Where to output the file</param>
/// <returns>File of written webp image</returns>
Task<string> ConvertToWebP(string filePath, string outputPath);
}
@ -89,6 +90,7 @@ public class ImageService : IImageService
/// </summary>
/// <param name="stream">Stream to write to disk. Ensure this is rewinded.</param>
/// <param name="fileName">filename to save as without extension</param>
/// <param name="outputDirectory">Where to output the file, defaults to covers directory</param>
/// <returns>File name with extension of the file. This will always write to <see cref="DirectoryService.CoverImageDirectory"/></returns>
public string WriteCoverThumbnail(Stream stream, string fileName, string outputDirectory)
{


@ -35,6 +35,7 @@ public interface IMetadataService
/// </summary>
/// <param name="libraryId"></param>
/// <param name="seriesId"></param>
/// <param name="forceUpdate">Overrides any cache logic and forces execution</param>
Task RefreshMetadataForSeries(int libraryId, int seriesId, bool forceUpdate = true);
}
@ -278,6 +279,7 @@ public class MetadataService : IMetadataService
/// </summary>
/// <param name="libraryId"></param>
/// <param name="seriesId"></param>
/// <param name="forceUpdate">Overrides any cache logic and forces execution</param>
public async Task RefreshMetadataForSeries(int libraryId, int seriesId, bool forceUpdate = true)
{
var sw = Stopwatch.StartNew();


@ -9,6 +9,7 @@ using API.Data.Repositories;
using API.DTOs;
using API.DTOs.Reader;
using API.Entities;
using API.Entities.Enums;
using API.Extensions;
using API.SignalR;
using Kavita.Common;
@ -20,8 +21,8 @@ public interface IReaderService
{
Task MarkSeriesAsRead(AppUser user, int seriesId);
Task MarkSeriesAsUnread(AppUser user, int seriesId);
void MarkChaptersAsRead(AppUser user, int seriesId, IEnumerable<Chapter> chapters);
void MarkChaptersAsUnread(AppUser user, int seriesId, IEnumerable<Chapter> chapters);
Task MarkChaptersAsRead(AppUser user, int seriesId, IEnumerable<Chapter> chapters);
Task MarkChaptersAsUnread(AppUser user, int seriesId, IEnumerable<Chapter> chapters);
Task<bool> SaveReadingProgress(ProgressDto progressDto, int userId);
Task<int> CapPageToChapter(int chapterId, int page);
Task<int> GetNextChapterIdAsync(int seriesId, int volumeId, int currentChapterId, int userId);
@ -30,6 +31,7 @@ public interface IReaderService
Task MarkChaptersUntilAsRead(AppUser user, int seriesId, float chapterNumber);
Task MarkVolumesUntilAsRead(AppUser user, int seriesId, int volumeNumber);
HourEstimateRangeDto GetTimeEstimate(long wordCount, int pageCount, bool isEpub);
string FormatChapterName(LibraryType libraryType, bool includeHash = false, bool includeSpace = false);
}
public class ReaderService : IReaderService
@ -71,7 +73,7 @@ public class ReaderService : IReaderService
user.Progresses ??= new List<AppUserProgress>();
foreach (var volume in volumes)
{
MarkChaptersAsRead(user, seriesId, volume.Chapters);
await MarkChaptersAsRead(user, seriesId, volume.Chapters);
}
_unitOfWork.UserRepository.Update(user);
@ -88,7 +90,7 @@ public class ReaderService : IReaderService
user.Progresses ??= new List<AppUserProgress>();
foreach (var volume in volumes)
{
MarkChaptersAsUnread(user, seriesId, volume.Chapters);
await MarkChaptersAsUnread(user, seriesId, volume.Chapters);
}
_unitOfWork.UserRepository.Update(user);
@ -100,7 +102,7 @@ public class ReaderService : IReaderService
/// <param name="user"></param>
/// <param name="seriesId"></param>
/// <param name="chapters"></param>
public void MarkChaptersAsRead(AppUser user, int seriesId, IEnumerable<Chapter> chapters)
public async Task MarkChaptersAsRead(AppUser user, int seriesId, IEnumerable<Chapter> chapters)
{
foreach (var chapter in chapters)
{
@ -115,12 +117,17 @@ public class ReaderService : IReaderService
SeriesId = seriesId,
ChapterId = chapter.Id
});
await _eventHub.SendMessageAsync(MessageFactory.UserProgressUpdate,
MessageFactory.UserProgressUpdateEvent(user.Id, user.UserName, seriesId, chapter.VolumeId, chapter.Id, chapter.Pages));
}
else
{
userProgress.PagesRead = chapter.Pages;
userProgress.SeriesId = seriesId;
userProgress.VolumeId = chapter.VolumeId;
await _eventHub.SendMessageAsync(MessageFactory.UserProgressUpdate,
MessageFactory.UserProgressUpdateEvent(user.Id, user.UserName, userProgress.SeriesId, userProgress.VolumeId, userProgress.ChapterId, chapter.Pages));
}
}
}
@ -131,7 +138,7 @@ public class ReaderService : IReaderService
/// <param name="user"></param>
/// <param name="seriesId"></param>
/// <param name="chapters"></param>
public void MarkChaptersAsUnread(AppUser user, int seriesId, IEnumerable<Chapter> chapters)
public async Task MarkChaptersAsUnread(AppUser user, int seriesId, IEnumerable<Chapter> chapters)
{
foreach (var chapter in chapters)
{
@ -142,6 +149,9 @@ public class ReaderService : IReaderService
userProgress.PagesRead = 0;
userProgress.SeriesId = seriesId;
userProgress.VolumeId = chapter.VolumeId;
await _eventHub.SendMessageAsync(MessageFactory.UserProgressUpdate,
MessageFactory.UserProgressUpdateEvent(user.Id, user.UserName, userProgress.SeriesId, userProgress.VolumeId, userProgress.ChapterId, 0));
}
}
@ -321,6 +331,8 @@ public class ReaderService : IReaderService
currentChapter.Range, dto => dto.Range);
if (chapterId > 0) return chapterId;
} else if (double.Parse(firstChapter.Number) > double.Parse(currentChapter.Number)) return firstChapter.Id;
// If we are the last chapter and next volume is there, we should try to use it (unless it's volume 0)
else if (double.Parse(firstChapter.Number) == 0) return firstChapter.Id;
}
// If we are the last volume and we didn't find any next volume, loop back to volume 0 and give the first chapter
@ -485,7 +497,7 @@ public class ReaderService : IReaderService
var chapters = volume.Chapters
.OrderBy(c => float.Parse(c.Number))
.Where(c => !c.IsSpecial && Parser.Parser.MaxNumberFromRange(c.Range) <= chapterNumber);
MarkChaptersAsRead(user, volume.SeriesId, chapters);
await MarkChaptersAsRead(user, volume.SeriesId, chapters);
}
}
@ -494,7 +506,7 @@ public class ReaderService : IReaderService
var volumes = await _unitOfWork.VolumeRepository.GetVolumesForSeriesAsync(new List<int> { seriesId }, true);
foreach (var volume in volumes.OrderBy(v => v.Number).Where(v => v.Number <= volumeNumber && v.Number > 0))
{
MarkChaptersAsRead(user, volume.SeriesId, volume.Chapters);
await MarkChaptersAsRead(user, volume.SeriesId, volume.Chapters);
}
}
@ -540,4 +552,29 @@ public class ReaderService : IReaderService
AvgHours = (int) Math.Round((pageCount / AvgPagesPerMinute / 60F))
};
}
/// <summary>
/// Formats a Chapter name based on the library it's in
/// </summary>
/// <param name="libraryType"></param>
/// <param name="includeHash">For comics only, includes a # which is used for numbering on cards</param>
/// <param name="includeSpace">Add a space at the end of the string. if includeHash and includeSpace are true, only hash will be at the end.</param>
/// <returns></returns>
public string FormatChapterName(LibraryType libraryType, bool includeHash = false, bool includeSpace = false)
{
switch(libraryType)
{
case LibraryType.Manga:
return "Chapter" + (includeSpace ? " " : string.Empty);
case LibraryType.Comic:
if (includeHash) {
return "Issue #";
}
return "Issue" + (includeSpace ? " " : string.Empty);
case LibraryType.Book:
return "Book" + (includeSpace ? " " : string.Empty);
default:
throw new ArgumentOutOfRangeException(nameof(libraryType), libraryType, null);
}
}
}


@ -105,7 +105,6 @@ public class SeriesService : ISeriesService
series.Metadata.LanguageLocked = true;
}
series.Metadata.CollectionTags ??= new List<CollectionTag>();
UpdateRelatedList(updateSeriesMetadataDto.CollectionTags, series, allCollectionTags, (tag) =>
{
@ -200,10 +199,11 @@ public class SeriesService : ISeriesService
return false;
}
// TODO: Move this to a helper so we can easily test
private static void UpdateRelatedList(ICollection<CollectionTagDto> tags, Series series, IReadOnlyCollection<CollectionTag> allTags,
Action<CollectionTag> handleAdd)
{
// TODO: Move UpdateRelatedList to a helper so we can easily test
if (tags == null) return;
// I want a union of these 2 lists. Return only elements that are in both lists, but the list types are different
var existingTags = series.Metadata.CollectionTags.ToList();


@ -1,5 +1,6 @@
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using API.Data;
@ -162,8 +163,12 @@ public class TaskScheduler : ITaskScheduler
public void ScanLibrary(int libraryId)
{
if (HasAlreadyEnqueuedTask("ScannerService","ScanLibrary", new object[] {libraryId}))
{
_logger.LogInformation("A duplicate request to scan library for library occured. Skipping");
return;
}
_logger.LogInformation("Enqueuing library scan for: {LibraryId}", libraryId);
// TODO: If a library scan is already queued up for libraryId, don't do anything
BackgroundJob.Enqueue(() => _scannerService.ScanLibrary(libraryId));
// When we do a scan, force cache to re-unpack in case page numbers change
BackgroundJob.Enqueue(() => _cleanupService.CleanupCacheDirectory());
@ -176,24 +181,48 @@ public class TaskScheduler : ITaskScheduler
public void RefreshMetadata(int libraryId, bool forceUpdate = true)
{
if (HasAlreadyEnqueuedTask("MetadataService","RefreshMetadata", new object[] {libraryId, forceUpdate}))
{
_logger.LogInformation("A duplicate request to refresh metadata for library occured. Skipping");
return;
}
_logger.LogInformation("Enqueuing library metadata refresh for: {LibraryId}", libraryId);
BackgroundJob.Enqueue(() => _metadataService.RefreshMetadata(libraryId, forceUpdate));
}
public void RefreshSeriesMetadata(int libraryId, int seriesId, bool forceUpdate = false)
{
if (HasAlreadyEnqueuedTask("MetadataService","RefreshMetadataForSeries", new object[] {libraryId, seriesId, forceUpdate}))
{
_logger.LogInformation("A duplicate request to refresh metadata for library occured. Skipping");
return;
}
_logger.LogInformation("Enqueuing series metadata refresh for: {SeriesId}", seriesId);
BackgroundJob.Enqueue(() => _metadataService.RefreshMetadataForSeries(libraryId, seriesId, forceUpdate));
}
public void ScanSeries(int libraryId, int seriesId, bool forceUpdate = false)
{
if (HasAlreadyEnqueuedTask("ScannerService", "ScanSeries", new object[] {libraryId, seriesId, forceUpdate}))
{
_logger.LogInformation("A duplicate request to scan series occured. Skipping");
return;
}
_logger.LogInformation("Enqueuing series scan for: {SeriesId}", seriesId);
BackgroundJob.Enqueue(() => _scannerService.ScanSeries(libraryId, seriesId, CancellationToken.None));
}
public void AnalyzeFilesForSeries(int libraryId, int seriesId, bool forceUpdate = false)
{
if (HasAlreadyEnqueuedTask("WordCountAnalyzerService", "ScanSeries", new object[] {libraryId, seriesId, forceUpdate}))
{
_logger.LogInformation("A duplicate request to scan series occured. Skipping");
return;
}
_logger.LogInformation("Enqueuing analyze files scan for: {SeriesId}", seriesId);
BackgroundJob.Enqueue(() => _wordCountAnalyzerService.ScanSeries(libraryId, seriesId, forceUpdate));
}
@ -212,4 +241,21 @@ public class TaskScheduler : ITaskScheduler
var update = await _versionUpdaterService.CheckForUpdate();
await _versionUpdaterService.PushUpdate(update);
}
/// <summary>
/// Checks if this same invocation is already enqueued
/// </summary>
/// <param name="methodName">Method name that was enqueued</param>
/// <param name="className">Class name the method resides on</param>
/// <param name="args">object[] of arguments in the order they are passed to enqueued job</param>
/// <param name="queue">Queue to check against. Defaults to "default"</param>
/// <returns></returns>
private static bool HasAlreadyEnqueuedTask(string className, string methodName, object[] args, string queue = "default")
{
var enqueuedJobs = JobStorage.Current.GetMonitoringApi().EnqueuedJobs(queue, 0, int.MaxValue);
return enqueuedJobs.Any(j => j.Value.InEnqueuedState &&
j.Value.Job.Method.DeclaringType != null && j.Value.Job.Args.SequenceEqual(args) &&
j.Value.Job.Method.Name.Equals(methodName) &&
j.Value.Job.Method.DeclaringType.Name.Equals(className));
}
}


@ -40,6 +40,7 @@ namespace API.Services.Tasks.Scanner
/// <param name="logger">Logger of the parent class that invokes this</param>
/// <param name="directoryService">Directory Service</param>
/// <param name="readingItemService">ReadingItemService Service for extracting information on a number of formats</param>
/// <param name="eventHub">For firing off SignalR events</param>
public ParseScannedFiles(ILogger logger, IDirectoryService directoryService,
IReadingItemService readingItemService, IEventHub eventHub)
{
@ -251,6 +252,7 @@ namespace API.Services.Tasks.Scanner
/// </summary>
/// <param name="libraryType">Type of library. Used for selecting the correct file extensions to search for and parsing files</param>
/// <param name="folders">The folders to scan. By default, this should be library.Folders, however it can be overwritten to restrict folders</param>
/// <param name="libraryName">Name of the Library</param>
/// <returns></returns>
public async Task<Dictionary<ParsedSeries, List<ParserInfo>>> ScanLibrariesForSeries(LibraryType libraryType, IEnumerable<string> folders, string libraryName)
{


@ -22,6 +22,7 @@ using Kavita.Common.EnvironmentInfo;
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Hosting;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Http.Features;
using Microsoft.AspNetCore.HttpOverrides;
using Microsoft.AspNetCore.Identity;
using Microsoft.AspNetCore.ResponseCompression;
@ -30,6 +31,7 @@ using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.Logging;
using Microsoft.Net.Http.Headers;
using Microsoft.OpenApi.Models;
using TaskScheduler = API.Services.TaskScheduler;
@ -230,7 +232,13 @@ namespace API
app.UseStaticFiles(new StaticFileOptions
{
ContentTypeProvider = new FileExtensionContentTypeProvider()
ContentTypeProvider = new FileExtensionContentTypeProvider(),
HttpsCompression = HttpsCompressionMode.Compress,
OnPrepareResponse = ctx =>
{
const int durationInSeconds = 60 * 60 * 24;
ctx.Context.Response.Headers[HeaderNames.CacheControl] = "public,max-age=" + durationInSeconds;
}
});
app.Use(async (context, next) =>