Change Detection: On Push aka UI Smoothness (#1369)

* Updated Series Info Cards to use OnPush and hooked in progress events for mark as read/unread on entities. These events update progress bars and now also trigger a re-calculation of Read Time Left.
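
A minimal sketch of the pattern this PR applies across the app (the hub service and event shape below are illustrative stand-ins, not Kavita's exact API): the component opts into `OnPush`, subscribes to progress events, and calls `markForCheck()` so the view only re-renders when relevant data actually changed.

```ts
import {
  ChangeDetectionStrategy, ChangeDetectorRef, Component,
  Injectable, Input, OnDestroy, OnInit
} from '@angular/core';
import { Subject } from 'rxjs';
import { filter, takeUntil } from 'rxjs/operators';

export interface ProgressEvent { seriesId: number; pagesRead: number; }

// Hypothetical stand-in for the real SignalR-backed events service.
@Injectable({ providedIn: 'root' })
export class ProgressHubService {
  readonly progress$ = new Subject<ProgressEvent>();
}

@Component({
  selector: 'app-series-info-card',
  template: `<span>{{ readTimeLeft }} pages left</span>`,
  changeDetection: ChangeDetectionStrategy.OnPush // only re-render when marked dirty
})
export class SeriesInfoCardComponent implements OnInit, OnDestroy {
  @Input() seriesId!: number;
  @Input() totalPages = 0;
  readTimeLeft = 0;

  private readonly onDestroy = new Subject<void>();

  constructor(private readonly hub: ProgressHubService,
              private readonly cdRef: ChangeDetectorRef) {}

  ngOnInit(): void {
    this.hub.progress$.pipe(
      filter(evt => evt.seriesId === this.seriesId),
      takeUntil(this.onDestroy)
    ).subscribe(evt => {
      // Under OnPush, mutating a field alone no longer triggers a re-render,
      // so recalculate the derived value and explicitly flag the view.
      this.readTimeLeft = Math.max(0, this.totalPages - evt.pagesRead);
      this.cdRef.markForCheck();
    });
  }

  ngOnDestroy(): void {
    this.onDestroy.next();
    this.onDestroy.complete();
  }
}
```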

* Removed Library Card Component

* Refactored manga reader title and subtitle calculation to the backend.

* Converted card actionables to OnPush

* Series Card on push cleanup

* Updated edit collection tags for on push

* Update cover image chooser for on push

* Cleaned up carousel reel

* Updated cover image to allow for uploading gif and webp files

* Bulk add to collection on push

* Updated bulk operations to use OnPush. Bulk operations now have explicit mark as read and mark as unread buttons, and the add to collection and delete actions are visible.

Fixed a bug where manage library component wasn't invoking the trackBy function

* Updating entity title for on push

* Removed file info component

* Updated Manage Library for on push

* Entity info cards on push

* List item on push

* Updated icon and title for on push and fixed some missing change detection on series detail

* Restricted the typeahead interface to simplify the design

* Edit Series Relation now shows a value in the dropdown for Parent relationships and disables the field.

* Updated edit series relation to focus on new typeahead when adding a new relationship

* Added some documentation; when scanning a library, don't allow the user to enqueue the same job multiple times.

* Applied the no-enqueue-if-already-enqueued logic to other tasks

* Library detail on push

* Updated events widget to onpush

* Card detail drawer on push. The card detail cover chooser now shows all chapters' covers for selection.

* Chapter metadata detail on push

* Removed Card Detail modal

* All collections on push

* Removed some comments

* Updated bulk selection to use an observable rather than function calls so the new OnPush strategy works.
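
The shape of that change, sketched with illustrative names (the real service tracks more state): selection lives in a `BehaviorSubject`, and OnPush views bind to the stream with the `async` pipe instead of polling a function.

```ts
import { Injectable } from '@angular/core';
import { BehaviorSubject } from 'rxjs';
import { map } from 'rxjs/operators';

@Injectable({ providedIn: 'root' })
export class BulkSelectionService {
  private readonly selected = new BehaviorSubject<Set<number>>(new Set());

  /** Emits on every selection change; async-pipe friendly for OnPush views. */
  readonly selections$ = this.selected.asObservable();
  readonly hasSelections$ = this.selections$.pipe(map(ids => ids.size > 0));

  toggle(id: number): void {
    // Emit a *new* Set so OnPush reference equality sees the change.
    const next = new Set(this.selected.value);
    if (next.has(id)) next.delete(id);
    else next.add(id);
    this.selected.next(next);
  }

  clear(): void {
    this.selected.next(new Set());
  }
}
```

An OnPush toolbar can then render itself purely off `hasSelections$ | async` with no manual change-detection calls.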

* collection detail now uses on push and scroller is placed on correct element

* Updated library recommended to on push. Ensured that when a mark as read occurs, the appropriate streams are refreshed.

* Updated library detail to on push

* Updated metadata filter to OnPush. Bugs found and reported to the project.

* person badge on push

* Read more on push

* Updated tag badge to on push

* User login on push

* When initializing the side nav, don't call an authenticated API until we are sure a user is logged in
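
The pattern is a simple guard on the user stream (names below are assumed for illustration): skip the logged-out `null` state, then make the authenticated call exactly once.

```ts
import { BehaviorSubject, Observable, of } from 'rxjs';
import { filter, switchMap, take } from 'rxjs/operators';

interface User { username: string; }

// Assumed stand-ins for the account service's user stream and an authenticated API.
const currentUser$ = new BehaviorSubject<User | null>(null);
function getLibraries(): Observable<string[]> { return of(['Manga', 'Comics']); }

// Don't fire the authenticated call on init; wait for a real login.
const libraries$ = currentUser$.pipe(
  filter((user): user is User => user !== null), // ignore the logged-out state
  take(1),
  switchMap(() => getLibraries())
);

libraries$.subscribe(libs => console.log('loaded after login:', libs));
currentUser$.next({ username: 'joe' }); // simulate login; only now does the call go out
```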

* Updated splash container to on push

* Dashboard on push

* Side nav slight refactor around some api calls

* Cleaned up series card on push to use same cdRef naming convention

* Updated Static Files to use caching

* Added width and height to logo image

* shortcuts modal on push

* reading lists on push

* Reading list detail on push

* draggable ordered list on push

* Refactored reading-list-detail to use a new item component, which drastically reduces renders on operations
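
Pulling each row into its own OnPush child is what cuts the render count: a reorder or progress update only dirties the one row whose `@Input` reference changed. A sketch with illustrative names:

```ts
import { ChangeDetectionStrategy, Component, EventEmitter, Input, Output } from '@angular/core';

export interface ReadingListItem { id: number; title: string; pagesRead: number; }

@Component({
  selector: 'app-reading-list-item',
  template: `
    <span>{{ item.title }} ({{ item.pagesRead }} pages read)</span>
    <button (click)="read.emit(item.id)">Mark Read</button>
  `,
  changeDetection: ChangeDetectionStrategy.OnPush
})
export class ReadingListItemComponent {
  // With OnPush, this row re-renders only when the parent passes a new object
  // reference, so operations on one item leave its siblings untouched.
  @Input() item!: ReadingListItem;
  @Output() read = new EventEmitter<number>();
}
```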

* series format on push

* circular loader on push

* Badge Expander on push

* update notification modal on push

* drawer on push

* Edit Series Modal on push

* reset password on push

* review series modal on push

* series metadata detail on push

* theme manager on push

* confirm reset password on push

* register on push

* confirm migration email on push

* confirm email on push

* add email to account migration on push

* user preferences on push. Made global settings open by default

* edit series relation on push

* Fixed an edge-case bug for next chapter: if the current volume had a single chapter numbered 1 and the next volume's chapter was numbered 0, it would report that there were no more chapters.

* Updated infinite scroller with on push support

* Moved some animations over to typeahead, not integrated yet.

* Manga reader is now on push

* Reader settings on push

* refactored how we close the book

* Updated table of contents for on push

* Updated book reader for on push. Fixed a bug where the table of contents wasn't showing the current page anchor due to a scroll calculation bug

* Small code tweak

* Icon and title on push

* nav header on push

* grouped typeahead on push

* typeahead on push and added a new trackBy identity function to allow even faster rendering of big lists
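
A `trackBy` gives `*ngFor` a stable identity so Angular reuses DOM nodes instead of rebuilding the whole list when the array reference changes; a sketch of such an identity function (the real helper may key on a different field):

```ts
import { TrackByFunction } from '@angular/core';

interface Option { id: number; label: string; }

// Reuse a DOM node whenever the id is unchanged, even if the array was rebuilt.
export const trackById: TrackByFunction<Option> = (_index, option) => option.id;

// Template usage:
//   <li *ngFor="let option of options; trackBy: trackById">{{ option.label }}</li>
```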

* pdf reader on push

* code cleanup

Joseph Milazzo 2022-07-11 11:57:07 -04:00 committed by GitHub
parent f5be0fac58
commit 4e49aa47ce
126 changed files with 1658 additions and 1674 deletions


@@ -189,6 +189,7 @@ namespace API.Services
/// <remarks>This always creates a thumbnail</remarks>
/// <param name="archivePath"></param>
/// <param name="fileName">File name to use based on context of entity.</param>
/// <param name="outputDirectory">Where to output the file, defaults to covers directory</param>
/// <returns></returns>
public string GetCoverImage(string archivePath, string fileName, string outputDirectory)
{
@@ -261,16 +262,16 @@ namespace API.Services
archive.Entries.Any(e => e.FullName.Contains(Path.AltDirectorySeparatorChar) && !Parser.Parser.HasBlacklistedFolderInPath(e.FullName));
}
// TODO: Refactor CreateZipForDownload to return the temp file so we can stream it from temp
/// <summary>
///
/// Creates a zip file form the listed files and outputs to the temp folder.
/// </summary>
/// <param name="files"></param>
/// <param name="files">List of files to be zipped up. Should be full file paths.</param>
/// <param name="tempFolder">Temp folder name to use for preparing the files. Will be created and deleted</param>
/// <returns></returns>
/// <returns>All bytes for the given file in a Tuple</returns>
/// <exception cref="KavitaException"></exception>
public async Task<Tuple<byte[], string>> CreateZipForDownload(IEnumerable<string> files, string tempFolder)
{
// TODO: Refactor CreateZipForDownload to return the temp file so we can stream it from temp
var dateString = DateTime.Now.ToShortDateString().Replace("/", "_");
var tempLocation = Path.Join(_directoryService.TempDirectory, $"{tempFolder}_{dateString}");


@@ -686,6 +686,7 @@ namespace API.Services
/// </summary>
/// <param name="fileFilePath"></param>
/// <param name="fileName">Name of the new file.</param>
/// <param name="outputDirectory">Where to output the file, defaults to covers directory</param>
/// <returns></returns>
public string GetCoverImage(string fileFilePath, string fileName, string outputDirectory)
{


@@ -25,6 +25,7 @@ public interface IImageService
/// Converts the passed image to webP and outputs it in the same directory
/// </summary>
/// <param name="filePath">Full path to the image to convert</param>
/// <param name="outputPath">Where to output the file</param>
/// <returns>File of written webp image</returns>
Task<string> ConvertToWebP(string filePath, string outputPath);
}
@@ -89,6 +90,7 @@ public class ImageService : IImageService
/// </summary>
/// <param name="stream">Stream to write to disk. Ensure this is rewinded.</param>
/// <param name="fileName">filename to save as without extension</param>
/// <param name="outputDirectory">Where to output the file, defaults to covers directory</param>
/// <returns>File name with extension of the file. This will always write to <see cref="DirectoryService.CoverImageDirectory"/></returns>
public string WriteCoverThumbnail(Stream stream, string fileName, string outputDirectory)
{


@@ -35,6 +35,7 @@ public interface IMetadataService
/// </summary>
/// <param name="libraryId"></param>
/// <param name="seriesId"></param>
/// <param name="forceUpdate">Overrides any cache logic and forces execution</param>
Task RefreshMetadataForSeries(int libraryId, int seriesId, bool forceUpdate = true);
}
@@ -278,6 +279,7 @@ public class MetadataService : IMetadataService
/// </summary>
/// <param name="libraryId"></param>
/// <param name="seriesId"></param>
/// <param name="forceUpdate">Overrides any cache logic and forces execution</param>
public async Task RefreshMetadataForSeries(int libraryId, int seriesId, bool forceUpdate = true)
{
var sw = Stopwatch.StartNew();


@@ -9,6 +9,7 @@ using API.Data.Repositories;
using API.DTOs;
using API.DTOs.Reader;
using API.Entities;
using API.Entities.Enums;
using API.Extensions;
using API.SignalR;
using Kavita.Common;
@@ -20,8 +21,8 @@ public interface IReaderService
{
Task MarkSeriesAsRead(AppUser user, int seriesId);
Task MarkSeriesAsUnread(AppUser user, int seriesId);
void MarkChaptersAsRead(AppUser user, int seriesId, IEnumerable<Chapter> chapters);
void MarkChaptersAsUnread(AppUser user, int seriesId, IEnumerable<Chapter> chapters);
Task MarkChaptersAsRead(AppUser user, int seriesId, IEnumerable<Chapter> chapters);
Task MarkChaptersAsUnread(AppUser user, int seriesId, IEnumerable<Chapter> chapters);
Task<bool> SaveReadingProgress(ProgressDto progressDto, int userId);
Task<int> CapPageToChapter(int chapterId, int page);
Task<int> GetNextChapterIdAsync(int seriesId, int volumeId, int currentChapterId, int userId);
@@ -30,6 +31,7 @@ public interface IReaderService
Task MarkChaptersUntilAsRead(AppUser user, int seriesId, float chapterNumber);
Task MarkVolumesUntilAsRead(AppUser user, int seriesId, int volumeNumber);
HourEstimateRangeDto GetTimeEstimate(long wordCount, int pageCount, bool isEpub);
string FormatChapterName(LibraryType libraryType, bool includeHash = false, bool includeSpace = false);
}
public class ReaderService : IReaderService
@@ -71,7 +73,7 @@ public class ReaderService : IReaderService
user.Progresses ??= new List<AppUserProgress>();
foreach (var volume in volumes)
{
MarkChaptersAsRead(user, seriesId, volume.Chapters);
await MarkChaptersAsRead(user, seriesId, volume.Chapters);
}
_unitOfWork.UserRepository.Update(user);
@@ -88,7 +90,7 @@ public class ReaderService : IReaderService
user.Progresses ??= new List<AppUserProgress>();
foreach (var volume in volumes)
{
MarkChaptersAsUnread(user, seriesId, volume.Chapters);
await MarkChaptersAsUnread(user, seriesId, volume.Chapters);
}
_unitOfWork.UserRepository.Update(user);
@@ -100,7 +102,7 @@ public class ReaderService : IReaderService
/// <param name="user"></param>
/// <param name="seriesId"></param>
/// <param name="chapters"></param>
public void MarkChaptersAsRead(AppUser user, int seriesId, IEnumerable<Chapter> chapters)
public async Task MarkChaptersAsRead(AppUser user, int seriesId, IEnumerable<Chapter> chapters)
{
foreach (var chapter in chapters)
{
@@ -115,12 +117,17 @@ public class ReaderService : IReaderService
SeriesId = seriesId,
ChapterId = chapter.Id
});
await _eventHub.SendMessageAsync(MessageFactory.UserProgressUpdate,
MessageFactory.UserProgressUpdateEvent(user.Id, user.UserName, seriesId, chapter.VolumeId, chapter.Id, chapter.Pages));
}
else
{
userProgress.PagesRead = chapter.Pages;
userProgress.SeriesId = seriesId;
userProgress.VolumeId = chapter.VolumeId;
await _eventHub.SendMessageAsync(MessageFactory.UserProgressUpdate,
MessageFactory.UserProgressUpdateEvent(user.Id, user.UserName, userProgress.SeriesId, userProgress.VolumeId, userProgress.ChapterId, chapter.Pages));
}
}
}
@@ -131,7 +138,7 @@ public class ReaderService : IReaderService
/// <param name="user"></param>
/// <param name="seriesId"></param>
/// <param name="chapters"></param>
public void MarkChaptersAsUnread(AppUser user, int seriesId, IEnumerable<Chapter> chapters)
public async Task MarkChaptersAsUnread(AppUser user, int seriesId, IEnumerable<Chapter> chapters)
{
foreach (var chapter in chapters)
{
@@ -142,6 +149,9 @@ public class ReaderService : IReaderService
userProgress.PagesRead = 0;
userProgress.SeriesId = seriesId;
userProgress.VolumeId = chapter.VolumeId;
await _eventHub.SendMessageAsync(MessageFactory.UserProgressUpdate,
MessageFactory.UserProgressUpdateEvent(user.Id, user.UserName, userProgress.SeriesId, userProgress.VolumeId, userProgress.ChapterId, 0));
}
}
@@ -321,6 +331,8 @@ public class ReaderService : IReaderService
currentChapter.Range, dto => dto.Range);
if (chapterId > 0) return chapterId;
} else if (double.Parse(firstChapter.Number) > double.Parse(currentChapter.Number)) return firstChapter.Id;
// If we are the last chapter and next volume is there, we should try to use it (unless it's volume 0)
else if (double.Parse(firstChapter.Number) == 0) return firstChapter.Id;
}
// If we are the last volume and we didn't find any next volume, loop back to volume 0 and give the first chapter
@@ -485,7 +497,7 @@ public class ReaderService : IReaderService
var chapters = volume.Chapters
.OrderBy(c => float.Parse(c.Number))
.Where(c => !c.IsSpecial && Parser.Parser.MaxNumberFromRange(c.Range) <= chapterNumber);
MarkChaptersAsRead(user, volume.SeriesId, chapters);
await MarkChaptersAsRead(user, volume.SeriesId, chapters);
}
}
@@ -494,7 +506,7 @@ public class ReaderService : IReaderService
var volumes = await _unitOfWork.VolumeRepository.GetVolumesForSeriesAsync(new List<int> { seriesId }, true);
foreach (var volume in volumes.OrderBy(v => v.Number).Where(v => v.Number <= volumeNumber && v.Number > 0))
{
MarkChaptersAsRead(user, volume.SeriesId, volume.Chapters);
await MarkChaptersAsRead(user, volume.SeriesId, volume.Chapters);
}
}
@@ -540,4 +552,29 @@ public class ReaderService : IReaderService
AvgHours = (int) Math.Round((pageCount / AvgPagesPerMinute / 60F))
};
}
/// <summary>
/// Formats a Chapter name based on the library it's in
/// </summary>
/// <param name="libraryType"></param>
/// <param name="includeHash">For comics only, includes a # which is used for numbering on cards</param>
/// <param name="includeSpace">Add a space at the end of the string. if includeHash and includeSpace are true, only hash will be at the end.</param>
/// <returns></returns>
public string FormatChapterName(LibraryType libraryType, bool includeHash = false, bool includeSpace = false)
{
switch(libraryType)
{
case LibraryType.Manga:
return "Chapter" + (includeSpace ? " " : string.Empty);
case LibraryType.Comic:
if (includeHash) {
return "Issue #";
}
return "Issue" + (includeSpace ? " " : string.Empty);
case LibraryType.Book:
return "Book" + (includeSpace ? " " : string.Empty);
default:
throw new ArgumentOutOfRangeException(nameof(libraryType), libraryType, null);
}
}
}


@@ -105,7 +105,6 @@ public class SeriesService : ISeriesService
series.Metadata.LanguageLocked = true;
}
series.Metadata.CollectionTags ??= new List<CollectionTag>();
UpdateRelatedList(updateSeriesMetadataDto.CollectionTags, series, allCollectionTags, (tag) =>
{
@@ -200,10 +199,11 @@ public class SeriesService : ISeriesService
return false;
}
// TODO: Move this to a helper so we can easily test
private static void UpdateRelatedList(ICollection<CollectionTagDto> tags, Series series, IReadOnlyCollection<CollectionTag> allTags,
Action<CollectionTag> handleAdd)
{
// TODO: Move UpdateRelatedList to a helper so we can easily test
if (tags == null) return;
// I want a union of these 2 lists. Return only elements that are in both lists, but the list types are different
var existingTags = series.Metadata.CollectionTags.ToList();


@@ -1,5 +1,6 @@
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using API.Data;
@@ -162,8 +163,12 @@ public class TaskScheduler : ITaskScheduler
public void ScanLibrary(int libraryId)
{
if (HasAlreadyEnqueuedTask("ScannerService","ScanLibrary", new object[] {libraryId}))
{
_logger.LogInformation("A duplicate request to scan library for library occured. Skipping");
return;
}
_logger.LogInformation("Enqueuing library scan for: {LibraryId}", libraryId);
// TODO: If a library scan is already queued up for libraryId, don't do anything
BackgroundJob.Enqueue(() => _scannerService.ScanLibrary(libraryId));
// When we do a scan, force cache to re-unpack in case page numbers change
BackgroundJob.Enqueue(() => _cleanupService.CleanupCacheDirectory());
@@ -176,24 +181,48 @@ public class TaskScheduler : ITaskScheduler
public void RefreshMetadata(int libraryId, bool forceUpdate = true)
{
if (HasAlreadyEnqueuedTask("MetadataService","RefreshMetadata", new object[] {libraryId, forceUpdate}))
{
_logger.LogInformation("A duplicate request to refresh metadata for library occured. Skipping");
return;
}
_logger.LogInformation("Enqueuing library metadata refresh for: {LibraryId}", libraryId);
BackgroundJob.Enqueue(() => _metadataService.RefreshMetadata(libraryId, forceUpdate));
}
public void RefreshSeriesMetadata(int libraryId, int seriesId, bool forceUpdate = false)
{
if (HasAlreadyEnqueuedTask("MetadataService","RefreshMetadataForSeries", new object[] {libraryId, seriesId, forceUpdate}))
{
_logger.LogInformation("A duplicate request to refresh metadata for library occured. Skipping");
return;
}
_logger.LogInformation("Enqueuing series metadata refresh for: {SeriesId}", seriesId);
BackgroundJob.Enqueue(() => _metadataService.RefreshMetadataForSeries(libraryId, seriesId, forceUpdate));
}
public void ScanSeries(int libraryId, int seriesId, bool forceUpdate = false)
{
if (HasAlreadyEnqueuedTask("ScannerService", "ScanSeries", new object[] {libraryId, seriesId, forceUpdate}))
{
_logger.LogInformation("A duplicate request to scan series occured. Skipping");
return;
}
_logger.LogInformation("Enqueuing series scan for: {SeriesId}", seriesId);
BackgroundJob.Enqueue(() => _scannerService.ScanSeries(libraryId, seriesId, CancellationToken.None));
}
public void AnalyzeFilesForSeries(int libraryId, int seriesId, bool forceUpdate = false)
{
if (HasAlreadyEnqueuedTask("WordCountAnalyzerService", "ScanSeries", new object[] {libraryId, seriesId, forceUpdate}))
{
_logger.LogInformation("A duplicate request to scan series occured. Skipping");
return;
}
_logger.LogInformation("Enqueuing analyze files scan for: {SeriesId}", seriesId);
BackgroundJob.Enqueue(() => _wordCountAnalyzerService.ScanSeries(libraryId, seriesId, forceUpdate));
}
@@ -212,4 +241,21 @@ public class TaskScheduler : ITaskScheduler
var update = await _versionUpdaterService.CheckForUpdate();
await _versionUpdaterService.PushUpdate(update);
}
/// <summary>
/// Checks if this same invocation is already enqueued
/// </summary>
/// <param name="methodName">Method name that was enqueued</param>
/// <param name="className">Class name the method resides on</param>
/// <param name="args">object[] of arguments in the order they are passed to enqueued job</param>
/// <param name="queue">Queue to check against. Defaults to "default"</param>
/// <returns></returns>
private static bool HasAlreadyEnqueuedTask(string className, string methodName, object[] args, string queue = "default")
{
var enqueuedJobs = JobStorage.Current.GetMonitoringApi().EnqueuedJobs(queue, 0, int.MaxValue);
return enqueuedJobs.Any(j => j.Value.InEnqueuedState &&
j.Value.Job.Method.DeclaringType != null && j.Value.Job.Args.SequenceEqual(args) &&
j.Value.Job.Method.Name.Equals(methodName) &&
j.Value.Job.Method.DeclaringType.Name.Equals(className));
}
}


@@ -40,6 +40,7 @@ namespace API.Services.Tasks.Scanner
/// <param name="logger">Logger of the parent class that invokes this</param>
/// <param name="directoryService">Directory Service</param>
/// <param name="readingItemService">ReadingItemService Service for extracting information on a number of formats</param>
/// <param name="eventHub">For firing off SignalR events</param>
public ParseScannedFiles(ILogger logger, IDirectoryService directoryService,
IReadingItemService readingItemService, IEventHub eventHub)
{
@@ -251,6 +252,7 @@ namespace API.Services.Tasks.Scanner
/// </summary>
/// <param name="libraryType">Type of library. Used for selecting the correct file extensions to search for and parsing files</param>
/// <param name="folders">The folders to scan. By default, this should be library.Folders, however it can be overwritten to restrict folders</param>
/// <param name="libraryName">Name of the Library</param>
/// <returns></returns>
public async Task<Dictionary<ParsedSeries, List<ParserInfo>>> ScanLibrariesForSeries(LibraryType libraryType, IEnumerable<string> folders, string libraryName)
{