v0.4.8 Release (#720)

* Bump versions by dotnet-bump-version.

* Bump versions by dotnet-bump-version.

* Workflow updates (#658)

# Added
- Added: Added automatic character-limit handling for the Discord notifier. If the message is over the character limit, it is now trimmed and an appropriate link to the full changelog is appended. (Release for Stable, PR for Dev)
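
A minimal sketch of the trimming idea (the class, method names, and exact limit are illustrative assumptions, not Kavita's actual notifier code):

```csharp
public static class DiscordNotifier
{
    // Discord caps a single message at 2000 characters; assumed limit for this sketch.
    private const int CharacterLimit = 2000;

    public static string BuildMessage(string changelog, string fullChangelogUrl)
    {
        var suffix = $"... Full changelog: {fullChangelogUrl}";
        if (changelog.Length <= CharacterLimit) return changelog;
        // Trim so the body plus the appended link still fit under the limit.
        return changelog.Substring(0, CharacterLimit - suffix.Length) + suffix;
    }
}
```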

# Removed
- Removed: Removed Sentry map task from the workflow since Sentry is no longer used.

* Bump versions by dotnet-bump-version.

* Misc Updates (#665)

* Do not allow non-admins to change their passwords when authentication is disabled

* Clean up the login page so that input field text is black

* Cleaned up some resizing when typing a password while there are a lot of users

* Changed a user's LastActive to update not just on login, but also when they open an already authenticated session.
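
A purely illustrative sketch of the idea (record activity on any authenticated request, not only at login); the IUserStore interface and middleware wiring are assumptions, not Kavita's actual implementation:

```csharp
using System;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;

// Hypothetical middleware: bump LastActive on every authenticated request.
public class LastActiveMiddleware
{
    private readonly RequestDelegate _next;
    public LastActiveMiddleware(RequestDelegate next) => _next = next;

    public async Task InvokeAsync(HttpContext context, IUserStore users)
    {
        var username = context.User.Identity?.Name;
        if (!string.IsNullOrEmpty(username))
        {
            // Record activity for already-authenticated sessions, not just logins.
            await users.UpdateLastActiveAsync(username, DateTime.Now);
        }
        await _next(context);
    }
}

// Stand-in for whatever persistence layer actually backs this.
public interface IUserStore
{
    Task UpdateLastActiveAsync(string username, DateTime when);
}
```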

* Bump versions by dotnet-bump-version.

* Logging Cleanup (#668)

* Do not allow non-admins to change their passwords when authentication is disabled

* Clean up the login page so that input field text is black

* Cleaned up some resizing when typing a password while there are a lot of users

* Changed a user's LastActive to update not just on login, but also when they open an already authenticated session.

* Removed some verbose debugging statements and promoted some debug statements to information level so they are more prominent in logs for default installs.

* In Progress now sends progress information for the Series

* Added the ability to add cards to Recently Added when new series are added in the backend

* Implemented the ability to click the glasses icon to turn off incognito mode from within the reader so you can start tracking progress

* Don't warn the user about authentication when they don't touch that control

* Bump versions by dotnet-bump-version.

* Changed the stats that are sent back to stat server from installed server.

* Revert "Changed the stats that are sent back to stat server from installed server."

This reverts commit 644cb6d1f6.

* Bump versions by dotnet-bump-version.

* Bump versions by dotnet-bump-version.

* Bulk Add to Collection (#674)

* Fixed the typeahead not having the same size input box as other inputs

* Implemented the ability to add multiple series to a collection through the bulk operations flow. Updated the book parser to handle the "@import url('...');" syntax as well as "@import '...';"
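
For reference, the updated CssImportUrlRegex (visible in the Parser.cs diff further down) matches both forms. A quick standalone check using that same pattern:

```csharp
using System;
using System.Text.RegularExpressions;

class CssImportDemo
{
    // Same pattern as the updated Parser.CssImportUrlRegex in this release.
    static readonly Regex CssImportUrlRegex = new Regex(
        "(@import\\s([\"|']|url\\([\"|']))(?<Filename>[^'\"]+)([\"|']\\)?);",
        RegexOptions.IgnoreCase | RegexOptions.Multiline);

    static void Main()
    {
        foreach (var css in new[] { "@import url('../styles/style.css');", "@import 'style.css';" })
        {
            var match = CssImportUrlRegex.Match(css);
            Console.WriteLine($"{css} -> {match.Groups["Filename"].Value}");
        }
    }
}
```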

* Implemented the ability to create a new Collection tag via the bulk operations flow.
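
An illustrative client call against the new bulk endpoint; the route and DTO shape come from the CollectionController and CollectionTagBulkAddDto diffs below, while the base address and controller prefix are assumptions:

```csharp
using System;
using System.Net.Http;
using System.Net.Http.Json;
using System.Threading.Tasks;

class BulkAddToCollectionExample
{
    static async Task Main()
    {
        using var client = new HttpClient { BaseAddress = new Uri("http://localhost:5000/api/") };
        var payload = new
        {
            collectionTagId = 0,                // 0 = create a new tag from the title
            collectionTagTitle = "Read Later",  // hypothetical tag name
            seriesIds = new[] { 1, 2, 3 }
        };
        // Route from the diff: [HttpPost("update-for-series")].
        var response = await client.PostAsJsonAsync("collection/update-for-series", payload);
        Console.WriteLine(response.StatusCode);
    }
}
```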

* Bump versions by dotnet-bump-version.

* Bulk Operations for In Progress and Recently Added (#677)

* Don't log a message about bad match if the file is a cover image

* Enable bulk operations for In Progress and Recently Added

* Fixed a bad logic case

* Bump versions by dotnet-bump-version.

* Regression Fix (#680)

* Ensure we mount the backups directory for Docker users

* Fixed a huge logic bug that deleted files in users' libraries

* Bump versions by dotnet-bump-version.

* Change chunk size to a fixed 50 to validate whether it's causing issues with refresh. Added some try/catches to see if exceptions are causing issues. (#681)

* Bump versions by dotnet-bump-version.

* Fixed a bug where searching on localized name would fail to show in the search results. Fixed a bug where extra spaces would cause the search results not to show properly. (#682)

* Bump versions by dotnet-bump-version.

* When we have a special marker, ensure we fall back to folder parsing to try to group to the actual series correctly before just accepting what we parsed. (#684)

Fixed a missed parsing case where comic special parsing wasn't being called on comic libraries.

* Bump versions by dotnet-bump-version.

* iOS Admin page dropdown fix (#686)

# Fixed
- Fixed: Fixed an issue where the dropdown on the admin server page would not work on Safari or other iOS browsers.

* When the DB fails to save, log all the series the user should look into for constraint issues and push a message to the admins connected to the web UI. (#687)
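
A hedged sketch of the "push a message to the admins" half; Kavita pushes server events to the web UI over SignalR, but the hub, event name, and payload here are assumptions for illustration:

```csharp
using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.AspNetCore.SignalR;

public class MessageHub : Hub { } // hypothetical hub; Kavita's real hub may differ

public class SaveFailureNotifier
{
    private readonly IHubContext<MessageHub> _hub;
    public SaveFailureNotifier(IHubContext<MessageHub> hub) => _hub = hub;

    // Broadcast the list of problem series so connected admins can investigate.
    public Task NotifyAdmins(IEnumerable<string> seriesNames) =>
        _hub.Clients.All.SendAsync("SaveFailure", new { seriesNames });
}
```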

* Bump versions by dotnet-bump-version.

* Bump versions by dotnet-bump-version.

* Stat upload will now schedule itself between midnight and 6am server time. (#688)
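
A hedged sketch of how such a randomized schedule can be wired up with Hangfire (which Kavita uses for background jobs); the job id is illustrative, and IStatsService.Send() is the method registered in the TaskScheduler diff below:

```csharp
using System;
using Hangfire;
using API.Interfaces.Services; // IStatsService, from this release's diff

// Inside the task scheduler: pick a random hour between 00:00 and 05:00 server time
// so installs don't all upload at the same moment.
var rnd = new Random();
RecurringJob.AddOrUpdate<IStatsService>(
    "report-stats",                 // illustrative job id
    service => service.Send(),
    Cron.Daily(rnd.Next(0, 6)),     // daily, at a random hour in [0, 5]
    TimeZoneInfo.Local);
```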

* Bump versions by dotnet-bump-version.

* EPUB CSS Parsing Issues (#690)

* WIP. Rewrote some of the regexes to better support CSS escaping. We now escape background-image, border-image, and list-style-image URLs within CSS files.
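
The new image-reference regex (CssImageUrlRegex in the Parser.cs diff below) can be exercised standalone; the pattern here is copied from the diff:

```csharp
using System;
using System.Text.RegularExpressions;

class CssImageDemo
{
    // Same pattern as the new Parser.CssImageUrlRegex in this release.
    static readonly Regex CssImageUrlRegex = new Regex(
        @"(url\((?!data:).(?!data:))" + "(?<Filename>(?!data:)[^\"']*)" + @"(.\))");

    static void Main()
    {
        var css = "background-image: url(\"../Images/bg.png\");";
        // Prints "../Images/bg.png"; data: URIs are deliberately not matched.
        Console.WriteLine(CssImageUrlRegex.Match(css).Groups["Filename"].Value);
    }
}
```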

* Added position: relative to help with positioning on books that are just absolutely positioned elements.

* When there is absolute positioning, like in some EPUB-based comics, suppress the bottom action bar since it won't render in the correct location.

* Fixed tests

* Commented out tests

* Bump versions by dotnet-bump-version.

* More EPUB Scoping Fixes (#691)

* Added better handling for importing CSS files that are empty. Moved comment removal on CSS files to before the CSS whitespace cleanup to get better matches.

* Enhanced the checks that decide whether we need the bottom action bar in the reader. We no longer query the DOM and now have something that works more reliably.

* Bump versions by dotnet-bump-version.

* Fixed an issue where Docker users were not properly backing up the database. Removed an empty file that was created when covers/ had nothing in it. (#692)

* Bump versions by dotnet-bump-version.

* Fallback to Folder Parsing Issue (#694)

* Fixed a bug in the scanner where we fall back to parsing from folders for poorly named files. The code was exiting early if a chapter or volume could be parsed out.

* Fixed a unit test by tweaking a regex for fallback

* Bump versions by dotnet-bump-version.

* KavitaStats Cleanup (#695)

* Refactored Stats code to be much cleaner and use better naming.

* Cleaned up the actual HTTP code to use Flurl and to return whether the upload was successful, so we can delete the file where appropriate.
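
A sketch of what the Flurl-based upload looks like; the endpoint URL and "api-key" header value are taken from the removed StatsApiClient shown in the diff below, while the payload is a stand-in:

```csharp
using System;
using Flurl.Http;

// Stand-in for the collected stats DTO.
var data = new { installId = Guid.NewGuid(), kavitaVersion = "0.4.8" };

// Flurl collapses the HttpClient boilerplate into a fluent chain and makes it
// easy to report whether the upload succeeded.
var response = await "http://stats.kavitareader.com/api/InstallationStats"
    .WithHeader("api-key", "MsnvA2DfQqxSK5jh")
    .WithTimeout(TimeSpan.FromSeconds(30))
    .PostJsonAsync(data);

var succeeded = response.StatusCode == 200; // delete the local stats file only when true
```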

* More refactoring for the stats code to clean it up and keep it consistent with our standards.

* Removed a confusing log statement

* Added support for old api key header from original stat server

* Use the correct endpoint, not the new one.

* Code smell

* Bump versions by dotnet-bump-version.

* Bulk Deletion (#697)

* Implemented bulk deletion of series

* Don't show unauthorized exception on UI, just redirect to the login page.

* Bump versions by dotnet-bump-version.

* Cover Image Picking + Forwarding Headers with EPUBs (#700)

* Ensure Kavita knows about forwarded headers (fixes an issue with EPUB URLs not going through HTTPS behind a reverse proxy). Fixed a case where cover image selection preferred nested folders over files in the root directory.
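
A hedged sketch of the ASP.NET Core side of "accept all forwarded headers": this is the standard ForwardedHeaders middleware, not necessarily Kavita's exact code:

```csharp
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.HttpOverrides;

public static class ForwardingSetup
{
    // Called from Startup.Configure: trust X-Forwarded-For / X-Forwarded-Proto from a
    // reverse proxy so generated URLs (e.g. EPUB resource links) keep the original scheme and host.
    public static void UseAllForwardedHeaders(this IApplicationBuilder app)
    {
        var options = new ForwardedHeadersOptions { ForwardedHeaders = ForwardedHeaders.All };
        options.KnownNetworks.Clear(); // by default only loopback proxies are trusted
        options.KnownProxies.Clear();
        app.UseForwardedHeaders(options);
    }
}
```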

* Fixed broken unit test

* Added the bug that I fixed to the unit tests

* Cover Image Picking + Forwarding Headers with EPUBs (#702)

* Updating GA Bump version temporarily for fix (#703)

* Bump versions by dotnet-bump-version.

* Cover Image Picking + Forwarding Headers with EPUBs (GA Fix) (#704)

* Bump versions by dotnet-bump-version.

* Vacation Fixes (#709)

* Ignore system and hidden folders when performing directory scan.
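
A minimal sketch of the check using standard .NET file attributes (Kavita's actual filter may differ):

```csharp
using System.IO;

public static class ScanFilters
{
    // Skip hidden and system directories while performing a directory scan.
    public static bool ShouldIgnore(DirectoryInfo dir) =>
        (dir.Attributes & FileAttributes.Hidden) != 0 ||
        (dir.Attributes & FileAttributes.System) != 0;
}
```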

* Fixed the comic parser tests not using Comic mode for parsing.

* Accept all forwarded headers and use them.

* Ignore some changes from another branch

* Bump versions by dotnet-bump-version.

* Breaking Changes: Docker Parity (#698)

* Refactored all the config files for Kavita to be loaded from config/. This allows Docker to mount just one folder and makes the Update functionality trivial.

* Cleaned up documentation around new update method.

* Updated docker files to support config directory

* Removed entrypoint, no longer needed

* Update appsettings to point to config directory for logs

* Updated message for docker users that are upgrading

* Ensure that docker users that have not updated their mount points from upgrade cannot start the server

* Code smells

* More cleanup

* Added entrypoint to fix bind mount issues

* Updated README with new folder structure

* Fixed build system for new setup

* Updated string path if user is docker

* Updated the migration flow for Docker to work properly and fixed LogFile configuration updating.

* Migrating docker images is now working 100%

* Fixed config from bad code

* Code cleanup

Co-authored-by: Chris Plaatjes <kizaing@gmail.com>

* Bump versions by dotnet-bump-version.

* Feature/docker parity (#714)

* Refactored all the config files for Kavita to be loaded from config/. This allows Docker to mount just one folder and makes the Update functionality trivial.

* Cleaned up documentation around new update method.

* Updated docker files to support config directory

* Removed entrypoint, no longer needed

* Update appsettings to point to config directory for logs

* Updated message for docker users that are upgrading

* Ensure that docker users that have not updated their mount points from upgrade cannot start the server

* Code smells

* More cleanup

* Added entrypoint to fix bind mount issues

* Updated README with new folder structure

* Fixed build system for new setup

* Updated string path if user is docker

* Updated the migration flow for Docker to work properly and fixed LogFile configuration updating.

* Migrating docker images is now working 100%

* Fixed config from bad code

* Code cleanup

* Fixed monorepo-build.sh

Co-authored-by: Chris Plaatjes <kizaing@gmail.com>

* Breaking Changes: Docker Parity (#715)

* Fixed a bug in the directory-to-directory copy in the migration

* Somehow GetFiles lost its static modifier.

* Bump versions by dotnet-bump-version.

* Build issue (#716)

* Fixed a bug in the directory-to-directory copy in the migration

* Somehow GetFiles lost its static modifier.

* Please work

* Bump versions by dotnet-bump-version.

* Bump versions by dotnet-bump-version.

* Shakeout Changes (#717)

* Make the appsettings public on Configuration and change how we detect when to migrate for non-docker users.

* Fixed up non-docker copy command and removed duplicate check on source directory for a copy.

* Don't delete files unless we know we are successful

* Bump versions by dotnet-bump-version.

* Fixed a migration issue on Docker where it ran too many times or threw an exception when the source wasn't there. (#719)

* Bump versions by dotnet-bump-version.

* Version bump for release (#718)

* Bump versions by dotnet-bump-version.

Co-authored-by: Robbie Davis <robbie@therobbiedavis.com>
Co-authored-by: YEGCSharpDev <89283498+YEGCSharpDev@users.noreply.github.com>
Co-authored-by: Chris Plaatjes <kizaing@gmail.com>
Joseph Milazzo committed 2021-11-04 07:29:02 -05:00 (committed by GitHub)
parent cb9fa0dda8 · commit 33db123e81
115 changed files with 1818 additions and 910 deletions

View file

@@ -6,6 +6,10 @@ using static System.String;
namespace API.Comparators
{
/// <summary>
/// Attempts to emulate Windows explorer sorting
/// </summary>
/// <remarks>This is not thread-safe</remarks>
public sealed class NaturalSortComparer : IComparer<string>, IDisposable
{
private readonly bool _isAscending;
@@ -23,7 +27,6 @@ namespace API.Comparators
{
if (x == y) return 0;
// Should be fixed: Operations that change non-concurrent collections must have exclusive access. A concurrent update was performed on this collection and corrupted its state. The collection's state is no longer correct.
if (!_table.TryGetValue(x ?? Empty, out var x1))
{
x1 = Regex.Split(x ?? Empty, "([0-9]+)");
@@ -33,7 +36,6 @@ namespace API.Comparators
if (!_table.TryGetValue(y ?? Empty, out var y1))
{
y1 = Regex.Split(y ?? Empty, "([0-9]+)");
// Should be fixed: EXCEPTION: An item with the same key has already been added. Key: M:\Girls of the Wild's\Girls of the Wild's - Ep. 083 (Season 1) [LINE Webtoon].cbz
_table.Add(y ?? Empty, y1);
}
@@ -59,6 +61,7 @@ namespace API.Comparators
returnVal = 0;
}
return _isAscending ? returnVal : -returnVal;
}

View file

@@ -259,7 +259,10 @@ namespace API.Controllers
}
var styleContent = await _bookService.ScopeStyles(await book.Content.Css[key].ReadContentAsync(), apiBase, book.Content.Css[key].FileName, book);
body.PrependChild(HtmlNode.CreateNode($"<style>{styleContent}</style>"));
if (styleContent != null)
{
body.PrependChild(HtmlNode.CreateNode($"<style>{styleContent}</style>"));
}
}
}

View file

@@ -2,7 +2,9 @@
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using API.Data;
using API.DTOs;
using API.DTOs.CollectionTags;
using API.Entities;
using API.Extensions;
using API.Interfaces;
@@ -90,6 +92,40 @@ namespace API.Controllers
return BadRequest("Something went wrong, please try again");
}
/// <summary>
/// Adds a collection tag onto multiple Series. If tag id is 0, this will create a new tag.
/// </summary>
/// <param name="dto"></param>
/// <returns></returns>
[HttpPost("update-for-series")]
public async Task<ActionResult> AddToMultipleSeries(CollectionTagBulkAddDto dto)
{
var tag = await _unitOfWork.CollectionTagRepository.GetFullTagAsync(dto.CollectionTagId);
if (tag == null)
{
tag = DbFactory.CollectionTag(0, dto.CollectionTagTitle, String.Empty, false);
_unitOfWork.CollectionTagRepository.Add(tag);
}
var seriesMetadatas = await _unitOfWork.SeriesRepository.GetSeriesMetadataForIdsAsync(dto.SeriesIds);
foreach (var metadata in seriesMetadatas)
{
if (!metadata.CollectionTags.Any(t => t.Title.Equals(tag.Title, StringComparison.InvariantCulture)))
{
metadata.CollectionTags.Add(tag);
_unitOfWork.SeriesMetadataRepository.Update(metadata);
}
}
if (!_unitOfWork.HasChanges()) return Ok();
if (await _unitOfWork.CommitAsync())
{
return Ok();
}
return BadRequest("There was an issue updating series with collection tag");
}
/// <summary>
/// For a given tag, update the summary if summary has changed and remove a set of series from the tag.
/// </summary>

View file

@@ -164,7 +164,7 @@ namespace API.Controllers
case MangaFormat.Archive:
case MangaFormat.Pdf:
_cacheService.ExtractChapterFiles(chapterExtractPath, mangaFiles.ToList());
var originalFiles = _directoryService.GetFilesWithExtension(chapterExtractPath,
var originalFiles = DirectoryService.GetFilesWithExtension(chapterExtractPath,
Parser.Parser.ImageFileExtensions);
_directoryService.CopyFilesToDirectory(originalFiles, chapterExtractPath, $"{chapterId}_");
DirectoryService.DeleteFiles(originalFiles);
@@ -175,7 +175,7 @@ namespace API.Controllers
return BadRequest("Series is not in a valid format. Please rescan series and try again.");
}
var files = _directoryService.GetFilesWithExtension(chapterExtractPath, Parser.Parser.ImageFileExtensions);
var files = DirectoryService.GetFilesWithExtension(chapterExtractPath, Parser.Parser.ImageFileExtensions);
// Filter out images that aren't in bookmarks
Array.Sort(files, _numericComparer);
totalFilePaths.AddRange(files.Where((_, i) => chapterPages.Contains(i)));

View file

@@ -226,7 +226,7 @@ namespace API.Controllers
[HttpGet("search")]
public async Task<ActionResult<IEnumerable<SearchResultDto>>> Search(string queryString)
{
queryString = queryString.Trim().Replace(@"%", "");
queryString = Uri.UnescapeDataString(queryString).Trim().Replace(@"%", string.Empty);
var userId = await _unitOfWork.UserRepository.GetUserIdByUsernameAsync(User.GetUsername());
// Get libraries user has access to

View file

@@ -6,6 +6,7 @@ using System.Xml.Serialization;
using System.Xml.Serialization;
using API.Comparators;
using API.DTOs;
using API.DTOs.CollectionTags;
using API.DTOs.Filtering;
using API.DTOs.OPDS;
using API.Entities;
@@ -738,7 +739,7 @@ namespace API.Controllers
[HttpGet("{apiKey}/favicon")]
public async Task<ActionResult> GetFavicon(string apiKey)
{
var files = _directoryService.GetFilesWithExtension(Path.Join(Directory.GetCurrentDirectory(), ".."), @"\.ico");
var files = DirectoryService.GetFilesWithExtension(Path.Join(Directory.GetCurrentDirectory(), ".."), @"\.ico");
if (files.Length == 0) return BadRequest("Cannot find icon");
var path = files[0];
var content = await _directoryService.ReadFileAsync(path);

View file

@@ -78,8 +78,9 @@ namespace API.Controllers
public async Task<ActionResult<bool>> DeleteSeries(int seriesId)
{
var username = User.GetUsername();
var chapterIds = (await _unitOfWork.SeriesRepository.GetChapterIdsForSeriesAsync(new []{seriesId}));
_logger.LogInformation("Series {SeriesId} is being deleted by {UserName}", seriesId, username);
var chapterIds = (await _unitOfWork.SeriesRepository.GetChapterIdsForSeriesAsync(new []{seriesId}));
var result = await _unitOfWork.SeriesRepository.DeleteSeriesAsync(seriesId);
if (result)
@@ -92,6 +93,34 @@ namespace API.Controllers
return Ok(result);
}
[Authorize(Policy = "RequireAdminRole")]
[HttpPost("delete-multiple")]
public async Task<ActionResult> DeleteMultipleSeries(DeleteSeriesDto dto)
{
var username = User.GetUsername();
_logger.LogInformation("Series {SeriesId} is being deleted by {UserName}", dto.SeriesIds, username);
var chapterMappings =
await _unitOfWork.SeriesRepository.GetChapterIdWithSeriesIdForSeriesAsync(dto.SeriesIds.ToArray());
var allChapterIds = new List<int>();
foreach (var mapping in chapterMappings)
{
allChapterIds.AddRange(mapping.Value);
}
var series = await _unitOfWork.SeriesRepository.GetSeriesByIdsAsync(dto.SeriesIds);
_unitOfWork.SeriesRepository.Remove(series);
if (_unitOfWork.HasChanges() && await _unitOfWork.CommitAsync())
{
await _unitOfWork.AppUserProgressRepository.CleanupAbandonedChapters();
await _unitOfWork.CollectionTagRepository.RemoveTagsWithoutSeries();
_taskScheduler.CleanupChapters(allChapterIds.ToArray());
}
return Ok();
}
/// <summary>
/// Returns All volumes for a series with progress information and Chapters
/// </summary>
@@ -212,6 +241,8 @@ namespace API.Controllers
.Take(userParams.PageSize).ToList();
var pagedList = new PagedList<SeriesDto>(listResults, listResults.Count, userParams.PageNumber, userParams.PageSize);
await _unitOfWork.SeriesRepository.AddSeriesModifiers(userId, pagedList);
Response.AddPaginationHeader(pagedList.CurrentPage, pagedList.PageSize, pagedList.TotalCount, pagedList.TotalPages);
return Ok(pagedList);

View file

@@ -71,10 +71,10 @@ namespace API.Controllers
/// </summary>
/// <returns></returns>
[HttpPost("backup-db")]
public ActionResult BackupDatabase()
public async Task<ActionResult> BackupDatabase()
{
_logger.LogInformation("{UserName} is backing up database of server from admin dashboard", User.GetUsername());
_backupService.BackupDatabase();
await _backupService.BackupDatabase();
return Ok();
}

View file

@@ -140,7 +140,7 @@ namespace API.Controllers
}
}
if (!_unitOfWork.HasChanges()) return Ok("Nothing was updated");
if (!_unitOfWork.HasChanges()) return Ok(updateSettingsDto);
try
{

View file

@@ -25,7 +25,7 @@ namespace API.Controllers
{
try
{
await _statsService.PathData(clientInfoDto);
await _statsService.RecordClientInfo(clientInfoDto);
return Ok();
}

View file

@@ -0,0 +1,18 @@
using System.Collections.Generic;
namespace API.DTOs.CollectionTags
{
public class CollectionTagBulkAddDto
{
/// <summary>
/// Collection Tag Id
/// </summary>
/// <remarks>Can be 0 which then will use Title to create a tag</remarks>
public int CollectionTagId { get; init; }
public string CollectionTagTitle { get; init; }
/// <summary>
/// Series Ids to add onto Collection Tag
/// </summary>
public IEnumerable<int> SeriesIds { get; init; }
}
}

View file

@@ -1,4 +1,4 @@
namespace API.DTOs
namespace API.DTOs.CollectionTags
{
public class CollectionTagDto
{

View file

@@ -1,10 +1,10 @@
using System.Collections.Generic;
namespace API.DTOs
namespace API.DTOs.CollectionTags
{
public class UpdateSeriesForTagDto
{
public CollectionTagDto Tag { get; init; }
public ICollection<int> SeriesIdsToRemove { get; init; }
public IEnumerable<int> SeriesIdsToRemove { get; init; }
}
}
}

View file

@@ -0,0 +1,9 @@
using System.Collections.Generic;
namespace API.DTOs
{
public class DeleteSeriesDto
{
public IList<int> SeriesIds { get; set; }
}
}

View file

@@ -1,4 +1,5 @@
using System.Collections.Generic;
using API.DTOs.CollectionTags;
using API.Entities;
namespace API.DTOs

View file

@@ -1,4 +1,5 @@
using System.Collections.Generic;
using API.DTOs.CollectionTags;
namespace API.DTOs
{

View file

@@ -0,0 +1,166 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using API.Services;
using Kavita.Common;
namespace API.Data
{
public static class MigrateConfigFiles
{
private static readonly List<string> LooseLeafFiles = new List<string>()
{
"appsettings.json",
"appsettings.Development.json",
"kavita.db",
};
private static readonly List<string> AppFolders = new List<string>()
{
"covers",
"stats",
"logs",
"backups",
"cache",
"temp"
};
private static readonly string ConfigDirectory = Path.Join(Directory.GetCurrentDirectory(), "config");
/// <summary>
/// In v0.4.8 we moved all config files to config/ to match with how docker was setup. This will move all config files from current directory
/// to config/
/// </summary>
public static void Migrate(bool isDocker)
{
Console.WriteLine("Checking if migration to config/ is needed");
if (isDocker)
{
if (Configuration.LogPath.Contains("config"))
{
Console.WriteLine("Migration to config/ not needed");
return;
}
Console.WriteLine(
"Migrating files from pre-v0.4.8. All Kavita config files are now located in config/");
CopyAppFolders();
DeleteAppFolders();
UpdateConfiguration();
Console.WriteLine("Migration complete. All config files are now in config/ directory");
return;
}
if (new FileInfo(Configuration.AppSettingsFilename).Exists)
{
Console.WriteLine("Migration to config/ not needed");
return;
}
Console.WriteLine(
"Migrating files from pre-v0.4.8. All Kavita config files are now located in config/");
Console.WriteLine($"Creating {ConfigDirectory}");
DirectoryService.ExistOrCreate(ConfigDirectory);
try
{
CopyLooseLeafFiles();
CopyAppFolders();
// Then we need to update the config file to point to the new DB file
UpdateConfiguration();
}
catch (Exception)
{
Console.WriteLine("There was an exception during migration. Please move everything manually.");
return;
}
// Finally delete everything in the source directory
Console.WriteLine("Removing old files");
DeleteLooseFiles();
DeleteAppFolders();
Console.WriteLine("Removing old files...DONE");
Console.WriteLine("Migration complete. All config files are now in config/ directory");
}
private static void DeleteAppFolders()
{
foreach (var folderToDelete in AppFolders)
{
if (!new DirectoryInfo(Path.Join(Directory.GetCurrentDirectory(), folderToDelete)).Exists) continue;
DirectoryService.ClearAndDeleteDirectory(Path.Join(Directory.GetCurrentDirectory(), folderToDelete));
}
}
private static void DeleteLooseFiles()
{
var configFiles = LooseLeafFiles.Select(file => new FileInfo(Path.Join(Directory.GetCurrentDirectory(), file)))
.Where(f => f.Exists);
DirectoryService.DeleteFiles(configFiles.Select(f => f.FullName));
}
private static void CopyAppFolders()
{
Console.WriteLine("Moving folders to config");
foreach (var folderToMove in AppFolders)
{
if (new DirectoryInfo(Path.Join(ConfigDirectory, folderToMove)).Exists) continue;
try
{
DirectoryService.CopyDirectoryToDirectory(
Path.Join(Directory.GetCurrentDirectory(), folderToMove),
Path.Join(ConfigDirectory, folderToMove));
}
catch (Exception)
{
/* Swallow Exception */
}
}
Console.WriteLine("Moving folders to config...DONE");
}
private static void CopyLooseLeafFiles()
{
var configFiles = LooseLeafFiles.Select(file => new FileInfo(Path.Join(Directory.GetCurrentDirectory(), file)))
.Where(f => f.Exists);
// First step is to move all the files
Console.WriteLine("Moving files to config/");
foreach (var fileInfo in configFiles)
{
try
{
fileInfo.CopyTo(Path.Join(ConfigDirectory, fileInfo.Name));
}
catch (Exception)
{
/* Swallow exception when already exists */
}
}
Console.WriteLine("Moving files to config...DONE");
}
private static void UpdateConfiguration()
{
Console.WriteLine("Updating appsettings.json to new paths");
Configuration.DatabasePath = "config//kavita.db";
Configuration.LogPath = "config//logs/kavita.log";
Console.WriteLine("Updating appsettings.json to new paths...DONE");
}
}
}

View file

@@ -3,6 +3,7 @@ using System.IO;
using System.Linq;
using System.Threading.Tasks;
using API.DTOs;
using API.DTOs.CollectionTags;
using API.Entities;
using API.Interfaces.Repositories;
using AutoMapper;
@@ -22,6 +23,11 @@ namespace API.Data.Repositories
_mapper = mapper;
}
public void Add(CollectionTag tag)
{
_context.CollectionTag.Add(tag);
}
public void Remove(CollectionTag tag)
{
_context.CollectionTag.Remove(tag);

View file

@@ -0,0 +1,20 @@
using API.Entities;
using API.Interfaces.Repositories;
namespace API.Data.Repositories
{
public class SeriesMetadataRepository : ISeriesMetadataRepository
{
private readonly DataContext _context;
public SeriesMetadataRepository(DataContext context)
{
_context = context;
}
public void Update(SeriesMetadata seriesMetadata)
{
_context.SeriesMetadata.Update(seriesMetadata);
}
}
}

View file

@@ -4,6 +4,7 @@ using System.Linq;
using System.Threading.Tasks;
using API.Data.Scanner;
using API.DTOs;
using API.DTOs.CollectionTags;
using API.DTOs.Filtering;
using API.Entities;
using API.Extensions;
@@ -41,6 +42,11 @@ namespace API.Data.Repositories
_context.Series.Remove(series);
}
public void Remove(IEnumerable<Series> series)
{
_context.Series.RemoveRange(series);
}
public async Task<bool> DoesSeriesNameExistInLibrary(string name)
{
var libraries = _context.Series
@@ -171,6 +177,21 @@ namespace API.Data.Repositories
.SingleOrDefaultAsync();
}
/// <summary>
/// Returns Volumes, Metadata, and Collection Tags
/// </summary>
/// <param name="seriesIds"></param>
/// <returns></returns>
public async Task<IList<Series>> GetSeriesByIdsAsync(IList<int> seriesIds)
{
return await _context.Series
.Include(s => s.Volumes)
.Include(s => s.Metadata)
.ThenInclude(m => m.CollectionTags)
.Where(s => seriesIds.Contains(s.Id))
.ToListAsync();
}
public async Task<int[]> GetChapterIdsForSeriesAsync(int[] seriesIds)
{
var volumes = await _context.Volume
@@ -454,15 +475,15 @@ namespace API.Data.Repositories
// TODO: Think about making this bigger depending on number of files a user has in said library
// and number of cores and amount of memory. We can then make an optimal choice
var totalSeries = await GetSeriesCount(libraryId);
var procCount = Math.Max(Environment.ProcessorCount - 1, 1);
if (totalSeries < procCount * 2 || totalSeries < 50)
{
return new Tuple<int, int>(totalSeries, totalSeries);
}
return new Tuple<int, int>(totalSeries, Math.Max(totalSeries / procCount, 50));
// var procCount = Math.Max(Environment.ProcessorCount - 1, 1);
//
// if (totalSeries < procCount * 2 || totalSeries < 50)
// {
// return new Tuple<int, int>(totalSeries, totalSeries);
// }
//
// return new Tuple<int, int>(totalSeries, Math.Max(totalSeries / procCount, 50));
return new Tuple<int, int>(totalSeries, 50);
}
public async Task<Chunk> GetChunkInfo(int libraryId = 0)
@@ -485,5 +506,13 @@ namespace API.Data.Repositories
TotalChunks = totalChunks
};
}
public async Task<IList<SeriesMetadata>> GetSeriesMetadataForIdsAsync(IEnumerable<int> seriesIds)
{
return await _context.SeriesMetadata
.Where(sm => seriesIds.Contains(sm.SeriesId))
.Include(sm => sm.CollectionTags)
.ToListAsync();
}
}
}

View file

@@ -35,15 +35,6 @@ namespace API.Data.Repositories
return _mapper.Map<ServerSettingDto>(settings);
}
public ServerSettingDto GetSettingsDto()
{
var settings = _context.ServerSetting
.Select(x => x)
.AsNoTracking()
.ToList();
return _mapper.Map<ServerSettingDto>(settings);
}
public Task<ServerSetting> GetSettingAsync(ServerSettingKey key)
{
return _context.ServerSetting.SingleOrDefaultAsync(x => x.Key == key);

View file

@@ -41,11 +41,11 @@ namespace API.Data
IList<ServerSetting> defaultSettings = new List<ServerSetting>()
{
new() {Key = ServerSettingKey.CacheDirectory, Value = CacheService.CacheDirectory},
new() {Key = ServerSettingKey.CacheDirectory, Value = DirectoryService.CacheDirectory},
new () {Key = ServerSettingKey.TaskScan, Value = "daily"},
new () {Key = ServerSettingKey.LoggingLevel, Value = "Information"}, // Not used from DB, but DB is sync with appSettings.json
new () {Key = ServerSettingKey.TaskBackup, Value = "weekly"},
new () {Key = ServerSettingKey.BackupDirectory, Value = Path.GetFullPath(Path.Join(Directory.GetCurrentDirectory(), "backups/"))},
new () {Key = ServerSettingKey.BackupDirectory, Value = Path.GetFullPath(DirectoryService.BackupDirectory)},
new () {Key = ServerSettingKey.Port, Value = "5000"}, // Not used from DB, but DB is sync with appSettings.json
new () {Key = ServerSettingKey.AllowStatCollection, Value = "true"},
new () {Key = ServerSettingKey.EnableOpds, Value = "false"},
@@ -69,6 +69,8 @@ namespace API.Data
Configuration.Port + string.Empty;
context.ServerSetting.First(s => s.Key == ServerSettingKey.LoggingLevel).Value =
Configuration.LogLevel + string.Empty;
context.ServerSetting.First(s => s.Key == ServerSettingKey.CacheDirectory).Value =
DirectoryService.CacheDirectory + string.Empty;
await context.SaveChangesAsync();

View file

@@ -34,6 +34,7 @@ namespace API.Data
public IFileRepository FileRepository => new FileRepository(_context);
public IChapterRepository ChapterRepository => new ChapterRepository(_context, _mapper);
public IReadingListRepository ReadingListRepository => new ReadingListRepository(_context, _mapper);
public ISeriesMetadataRepository SeriesMetadataRepository => new SeriesMetadataRepository(_context);
/// <summary>
/// Commits changes to the DB. Completes the open transaction.

View file

@@ -8,6 +8,9 @@ namespace API.Entities
{
[Key]
public ServerSettingKey Key { get; set; }
/// <summary>
/// The value of the Setting. Converter knows how to convert to the correct type
/// </summary>
public string Value { get; set; }
/// <inheritdoc />

View file

@@ -1,24 +0,0 @@
using API.Interfaces.Services;
using API.Services.Clients;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;
namespace API.Extensions
{
public static class ServiceCollectionExtensions
{
public static IServiceCollection AddStartupTask<T>(this IServiceCollection services)
where T : class, IStartupTask
=> services.AddTransient<IStartupTask, T>();
public static IServiceCollection AddStatsClient(this IServiceCollection services, IConfiguration configuration)
{
services.AddHttpClient<StatsApiClient>(client =>
{
client.DefaultRequestHeaders.Add("api-key", "MsnvA2DfQqxSK5jh");
});
return services;
}
}
}

View file

@@ -1,6 +1,7 @@
using System.Collections.Generic;
using System.Linq;
using API.DTOs;
using API.DTOs.CollectionTags;
using API.DTOs.Reader;
using API.DTOs.ReadingLists;
using API.DTOs.Settings;

View file

@@ -15,6 +15,7 @@ namespace API.Interfaces
IFileRepository FileRepository { get; }
IChapterRepository ChapterRepository { get; }
IReadingListRepository ReadingListRepository { get; }
ISeriesMetadataRepository SeriesMetadataRepository { get; }
bool Commit();
Task<bool> CommitAsync();
bool HasChanges();

View file

@@ -1,12 +1,14 @@
using System.Collections.Generic;
using System.Threading.Tasks;
using API.DTOs;
using API.DTOs.CollectionTags;
using API.Entities;
namespace API.Interfaces.Repositories
{
public interface ICollectionTagRepository
{
void Add(CollectionTag tag);
void Remove(CollectionTag tag);
Task<IEnumerable<CollectionTagDto>> GetAllTagDtosAsync();
Task<IEnumerable<CollectionTagDto>> SearchTagDtosAsync(string searchQuery);

View file

@@ -0,0 +1,9 @@
using API.Entities;
namespace API.Interfaces.Repositories
{
public interface ISeriesMetadataRepository
{
void Update(SeriesMetadata seriesMetadata);
}
}

View file

@@ -13,6 +13,7 @@ namespace API.Interfaces.Repositories
void Attach(Series series);
void Update(Series series);
void Remove(Series series);
void Remove(IEnumerable<Series> series);
Task<bool> DoesSeriesNameExistInLibrary(string name);
/// <summary>
/// Adds user information like progress, ratings, etc
@@ -33,6 +34,7 @@ namespace API.Interfaces.Repositories
Task<SeriesDto> GetSeriesDtoByIdAsync(int seriesId, int userId);
Task<bool> DeleteSeriesAsync(int seriesId);
Task<Series> GetSeriesByIdAsync(int seriesId);
Task<IList<Series>> GetSeriesByIdsAsync(IList<int> seriesIds);
Task<int[]> GetChapterIdsForSeriesAsync(int[] seriesIds);
Task<IDictionary<int, IList<int>>> GetChapterIdWithSeriesIdForSeriesAsync(int[] seriesIds);
/// <summary>
@@ -54,5 +56,6 @@ namespace API.Interfaces.Repositories
Task<PagedList<Series>> GetFullSeriesForLibraryIdAsync(int libraryId, UserParams userParams);
Task<Series> GetFullSeriesForSeriesIdAsync(int seriesId);
Task<Chunk> GetChunkInfo(int libraryId = 0);
Task<IList<SeriesMetadata>> GetSeriesMetadataForIdsAsync(IEnumerable<int> seriesIds);
}
}

View file

@@ -10,7 +10,6 @@ namespace API.Interfaces.Repositories
{
void Update(ServerSetting settings);
Task<ServerSettingDto> GetSettingsDtoAsync();
ServerSettingDto GetSettingsDto();
Task<ServerSetting> GetSettingAsync(ServerSettingKey key);
Task<IEnumerable<ServerSetting>> GetSettingsAsync();

View file

@@ -12,21 +12,9 @@ namespace API.Interfaces.Services
/// <param name="rootPath">Absolute path of directory to scan.</param>
/// <returns>List of folder names</returns>
IEnumerable<string> ListDirectory(string rootPath);
/// <summary>
/// Gets files in a directory. If searchPatternExpression is passed, will match the regex against for filtering.
/// </summary>
/// <param name="path"></param>
/// <param name="searchPatternExpression"></param>
/// <returns></returns>
string[] GetFilesWithExtension(string path, string searchPatternExpression = "");
Task<byte[]> ReadFileAsync(string path);
bool CopyFilesToDirectory(IEnumerable<string> filePaths, string directoryPath, string prepend = "");
bool Exists(string directory);
IEnumerable<string> GetFiles(string path, string searchPatternExpression = "",
SearchOption searchOption = SearchOption.TopDirectoryOnly);
void CopyFileToDirectory(string fullFilePath, string targetDirectory);
public bool CopyDirectoryToDirectory(string sourceDirName, string destDirName, string searchPattern = "*");
}
}

View file

@@ -5,7 +5,7 @@ namespace API.Interfaces.Services
{
public interface IStatsService
{
Task PathData(ClientInfoDto clientInfoDto);
Task CollectAndSendStatsData();
Task RecordClientInfo(ClientInfoDto clientInfoDto);
Task Send();
}
}

View file

@@ -24,11 +24,25 @@ namespace API.Parser
private const RegexOptions MatchOptions =
RegexOptions.IgnoreCase | RegexOptions.Compiled | RegexOptions.CultureInvariant;
public static readonly Regex FontSrcUrlRegex = new Regex(@"(src:url\(.{1})" + "([^\"']*)" + @"(.{1}\))",
/// <summary>
/// Matches against font-family css syntax. Does not match if the url import starts with data:, as that is binary data
/// </summary>
/// <remarks>See here for some examples https://developer.mozilla.org/en-US/docs/Web/CSS/@font-face</remarks>
public static readonly Regex FontSrcUrlRegex = new Regex(@"(?<Start>(src:\s?)?url\((?!data:).(?!data:))" + "(?<Filename>(?!data:)[^\"']*)" + @"(?<End>.{1}\))",
MatchOptions, RegexTimeout);
public static readonly Regex CssImportUrlRegex = new Regex("(@import\\s[\"|'])(?<Filename>[\\w\\d/\\._-]+)([\"|'];?)",
/// <summary>
/// https://developer.mozilla.org/en-US/docs/Web/CSS/@import
/// </summary>
public static readonly Regex CssImportUrlRegex = new Regex("(@import\\s([\"|']|url\\([\"|']))(?<Filename>[^'\"]+)([\"|']\\)?);",
MatchOptions | RegexOptions.Multiline, RegexTimeout);
/// <summary>
/// Misc css image references, like background-image: url(), border-image, or list-style-image
/// </summary>
/// Original prepend: (background|border|list-style)-image:\s?)?
public static readonly Regex CssImageUrlRegex = new Regex(@"(url\((?!data:).(?!data:))" + "(?<Filename>(?!data:)[^\"']*)" + @"(.\))",
MatchOptions, RegexTimeout);
private static readonly string XmlRegexExtensions = @"\.xml";
private static readonly Regex ImageRegex = new Regex(ImageFileExtensions,
MatchOptions, RegexTimeout);
@@ -212,7 +226,7 @@ namespace API.Parser
MatchOptions, RegexTimeout),
// Baketeriya ch01-05.zip, Akiiro Bousou Biyori - 01.jpg, Beelzebub_172_RHS.zip, Cynthia the Mission 29.rar, A Compendium of Ghosts - 031 - The Third Story_ Part 12 (Digital) (Cobalt001)
new Regex(
@"^(?!Vol\.?)(?<Series>.+?)( |_|-)(?<!-)(ch)?\d+-?\d*",
@"^(?!Vol\.?)(?!Chapter)(?<Series>.+?)(\s|_|-)(?<!-)(ch|chapter)?\.?\d+-?\d*",
MatchOptions, RegexTimeout),
// [BAA]_Darker_than_Black_c1 (This is very greedy, make sure it's close to last)
new Regex(
@@ -533,14 +547,16 @@ namespace API.Parser
ret.Edition = edition;
}
var isSpecial = ParseMangaSpecial(fileName);
var isSpecial = type == LibraryType.Comic ? ParseComicSpecial(fileName) : ParseMangaSpecial(fileName);
// We must ensure that we can only parse a special out. As some files will have v20 c171-180+Omake and that
// could cause a problem as Omake is a special term, but there is valid volume/chapter information.
if (ret.Chapters == DefaultChapter && ret.Volumes == DefaultVolume && !string.IsNullOrEmpty(isSpecial))
{
ret.IsSpecial = true;
ParseFromFallbackFolders(filePath, rootPath, type, ref ret);
}
// If we are a special with marker, we need to ensure we use the correct series name. we can do this by falling back to Folder name
if (HasSpecialMarker(fileName))
{
ret.IsSpecial = true;
@@ -549,8 +565,6 @@ namespace API.Parser
ParseFromFallbackFolders(filePath, rootPath, type, ref ret);
}
// here is the issue. If we are a special with marker, we need to ensure we use the correct series name.
// we can do this by falling back
if (string.IsNullOrEmpty(ret.Series))
{
@@ -594,8 +608,6 @@ namespace API.Parser
{
ret.Chapters = parsedChapter;
}
continue;
}
var series = ParseSeries(folder);

View file

@@ -1,98 +1,127 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Security.Cryptography;
using System.Threading.Tasks;
using API.Data;
using API.Entities;
using API.Services;
using Kavita.Common;
using Kavita.Common.EnvironmentInfo;
using Microsoft.AspNetCore.Hosting;
using Microsoft.AspNetCore.Identity;
using Microsoft.AspNetCore.Server.Kestrel.Core;
using Microsoft.EntityFrameworkCore;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.Logging;
namespace API
{
public class Program
{
private static readonly int HttpPort = Configuration.Port;
public class Program
{
private static readonly int HttpPort = Configuration.Port;
protected Program()
{
}
protected Program()
{
}
public static async Task Main(string[] args)
{
Console.OutputEncoding = System.Text.Encoding.UTF8;
public static async Task Main(string[] args)
{
Console.OutputEncoding = System.Text.Encoding.UTF8;
var isDocker = new OsInfo(Array.Empty<IOsVersionAdapter>()).IsDocker;
// Before anything, check if JWT has been generated properly or if user still has default
if (!Configuration.CheckIfJwtTokenSet() &&
Environment.GetEnvironmentVariable("ASPNETCORE_ENVIRONMENT") != Environments.Development)
{
Console.WriteLine("Generating JWT TokenKey for encrypting user sessions...");
var rBytes = new byte[128];
using (var crypto = new RNGCryptoServiceProvider()) crypto.GetBytes(rBytes);
Configuration.JwtToken = Convert.ToBase64String(rBytes).Replace("/", string.Empty);
}
MigrateConfigFiles.Migrate(isDocker);
var host = CreateHostBuilder(args).Build();
// Before anything, check if JWT has been generated properly or if user still has default
if (!Configuration.CheckIfJwtTokenSet() &&
Environment.GetEnvironmentVariable("ASPNETCORE_ENVIRONMENT") != Environments.Development)
{
Console.WriteLine("Generating JWT TokenKey for encrypting user sessions...");
var rBytes = new byte[128];
using (var crypto = new RNGCryptoServiceProvider()) crypto.GetBytes(rBytes);
Configuration.JwtToken = Convert.ToBase64String(rBytes).Replace("/", string.Empty);
}
using var scope = host.Services.CreateScope();
var services = scope.ServiceProvider;
var host = CreateHostBuilder(args).Build();
try
{
var context = services.GetRequiredService<DataContext>();
var roleManager = services.GetRequiredService<RoleManager<AppRole>>();
using var scope = host.Services.CreateScope();
var services = scope.ServiceProvider;
var requiresCoverImageMigration = !Directory.Exists(DirectoryService.CoverImageDirectory);
try
{
// If this is a new install, tables wont exist yet
var context = services.GetRequiredService<DataContext>();
var roleManager = services.GetRequiredService<RoleManager<AppRole>>();
if (isDocker && new FileInfo("data/appsettings.json").Exists)
{
var logger = services.GetRequiredService<ILogger<Startup>>();
logger.LogCritical("WARNING! Mount point is incorrect, nothing here will persist. Please change your container mount from /kavita/data to /kavita/config");
return;
}
var requiresCoverImageMigration = !Directory.Exists(DirectoryService.CoverImageDirectory);
try
{
// If this is a new install, tables wont exist yet
if (requiresCoverImageMigration)
{
MigrateCoverImages.ExtractToImages(context);
}
}
catch (Exception)
{
requiresCoverImageMigration = false;
}
// Apply all migrations on startup
await context.Database.MigrateAsync();
if (requiresCoverImageMigration)
{
MigrateCoverImages.ExtractToImages(context);
await MigrateCoverImages.UpdateDatabaseWithImages(context);
}
await Seed.SeedRoles(roleManager);
await Seed.SeedSettings(context);
await Seed.SeedUserApiKeys(context);
}
catch (Exception )
catch (Exception ex)
{
requiresCoverImageMigration = false;
var logger = services.GetRequiredService<ILogger<Program>>();
logger.LogError(ex, "An error occurred during migration");
}
// Apply all migrations on startup
await context.Database.MigrateAsync();
await host.RunAsync();
}
if (requiresCoverImageMigration)
{
await MigrateCoverImages.UpdateDatabaseWithImages(context);
}
private static IHostBuilder CreateHostBuilder(string[] args) =>
Host.CreateDefaultBuilder(args)
.ConfigureAppConfiguration((hostingContext, config) =>
{
config.Sources.Clear();
await Seed.SeedRoles(roleManager);
await Seed.SeedSettings(context);
await Seed.SeedUserApiKeys(context);
}
catch (Exception ex)
{
var logger = services.GetRequiredService<ILogger<Program>>();
logger.LogError(ex, "An error occurred during migration");
}
var env = hostingContext.HostingEnvironment;
await host.RunAsync();
}
config.AddJsonFile("config/appsettings.json", optional: true, reloadOnChange: false)
.AddJsonFile($"config/appsettings.{env.EnvironmentName}.json",
optional: true, reloadOnChange: false);
})
.ConfigureWebHostDefaults(webBuilder =>
{
webBuilder.UseKestrel((opts) =>
{
opts.ListenAnyIP(HttpPort, options => { options.Protocols = HttpProtocols.Http1AndHttp2; });
});
private static IHostBuilder CreateHostBuilder(string[] args) =>
Host.CreateDefaultBuilder(args)
.ConfigureWebHostDefaults(webBuilder =>
{
webBuilder.UseKestrel((opts) =>
{
opts.ListenAnyIP(HttpPort, options => { options.Protocols = HttpProtocols.Http1AndHttp2; });
});
webBuilder.UseStartup<Startup>();
});
webBuilder.UseStartup<Startup>();
});
}
}
}

View file

@@ -123,12 +123,24 @@ namespace API.Services
/// </summary>
/// <param name="entryFullNames"></param>
/// <returns>Entry name of match, null if no match</returns>
public string FirstFileEntry(IEnumerable<string> entryFullNames)
public static string FirstFileEntry(IEnumerable<string> entryFullNames, string archiveName)
{
var result = entryFullNames.OrderBy(Path.GetFileName, new NaturalSortComparer())
.FirstOrDefault(x => !Parser.Parser.HasBlacklistedFolderInPath(x)
&& Parser.Parser.IsImage(x)
&& !x.StartsWith(Parser.Parser.MacOsMetadataFileStartsWith));
// First check if there are any files that are not in a nested folder before just comparing by filename. This is needed
// because NaturalSortComparer does not work with paths and doesn't sort 001.jpg before chapter 1/001.jpg.
var fullNames = entryFullNames.Where(x => !Parser.Parser.HasBlacklistedFolderInPath(x)
&& Parser.Parser.IsImage(x)
&& !x.StartsWith(Parser.Parser.MacOsMetadataFileStartsWith)).ToList();
if (fullNames.Count == 0) return null;
var nonNestedFile = fullNames.Where(entry => (Path.GetDirectoryName(entry) ?? string.Empty).Equals(archiveName))
.OrderBy(Path.GetFullPath, new NaturalSortComparer())
.FirstOrDefault();
if (!string.IsNullOrEmpty(nonNestedFile)) return nonNestedFile;
var result = fullNames
.OrderBy(Path.GetFileName, new NaturalSortComparer())
.FirstOrDefault();
return string.IsNullOrEmpty(result) ? null : result;
}
@@ -158,7 +170,7 @@ namespace API.Services
using var archive = ZipFile.OpenRead(archivePath);
var entryNames = archive.Entries.Select(e => e.FullName).ToArray();
var entryName = FindFolderEntry(entryNames) ?? FirstFileEntry(entryNames);
var entryName = FindFolderEntry(entryNames) ?? FirstFileEntry(entryNames, Path.GetFileName(archivePath));
var entry = archive.Entries.Single(e => e.FullName == entryName);
using var stream = entry.Open();
@@ -169,7 +181,7 @@ namespace API.Services
using var archive = ArchiveFactory.Open(archivePath);
var entryNames = archive.Entries.Where(archiveEntry => !archiveEntry.IsDirectory).Select(e => e.Key).ToList();
var entryName = FindFolderEntry(entryNames) ?? FirstFileEntry(entryNames);
var entryName = FindFolderEntry(entryNames) ?? FirstFileEntry(entryNames, Path.GetFileName(archivePath));
var entry = archive.Entries.Single(e => e.Key == entryName);
using var stream = entry.OpenEntryStream();

View file

@@ -140,15 +140,22 @@ namespace API.Services
}
stylesheetHtml = stylesheetHtml.Insert(0, importBuilder.ToString());
stylesheetHtml =
Parser.Parser.CssImportUrlRegex.Replace(stylesheetHtml, "$1" + apiBase + prepend + "$2" + "$3");
var importMatches = Parser.Parser.CssImportUrlRegex.Matches(stylesheetHtml);
foreach (Match match in importMatches)
{
if (!match.Success) continue;
var importFile = match.Groups["Filename"].Value;
stylesheetHtml = stylesheetHtml.Replace(importFile, apiBase + prepend + importFile);
}
// Check if there are any background images and rewrite those urls
EscapeCssImageReferences(ref stylesheetHtml, apiBase, book);
var styleContent = RemoveWhiteSpaceFromStylesheets(stylesheetHtml);
styleContent =
Parser.Parser.FontSrcUrlRegex.Replace(styleContent, "$1" + apiBase + "$2" + "$3");
styleContent = styleContent.Replace("body", ".reading-section");
if (string.IsNullOrEmpty(styleContent)) return string.Empty;
var stylesheet = await _cssParser.ParseAsync(styleContent);
foreach (var styleRule in stylesheet.StyleRules)
{
@@ -165,6 +172,21 @@ namespace API.Services
return RemoveWhiteSpaceFromStylesheets(stylesheet.ToCss());
}
private static void EscapeCssImageReferences(ref string stylesheetHtml, string apiBase, EpubBookRef book)
{
var matches = Parser.Parser.CssImageUrlRegex.Matches(stylesheetHtml);
foreach (Match match in matches)
{
if (!match.Success) continue;
var importFile = match.Groups["Filename"].Value;
var key = CleanContentKeys(importFile);
if (!book.Content.AllFiles.ContainsKey(key)) continue;
stylesheetHtml = stylesheetHtml.Replace(importFile, apiBase + key);
}
}
public ComicInfo GetComicInfo(string filePath)
{
if (!IsValidFile(filePath) || Parser.Parser.IsPdf(filePath)) return null;
@@ -488,15 +510,29 @@ namespace API.Services
private static string RemoveWhiteSpaceFromStylesheets(string body)
{
if (string.IsNullOrEmpty(body))
{
return string.Empty;
}
// Remove comments from CSS
body = Regex.Replace(body, @"/\*[\d\D]*?\*/", string.Empty);
body = Regex.Replace(body, @"[a-zA-Z]+#", "#");
body = Regex.Replace(body, @"[\n\r]+\s*", string.Empty);
body = Regex.Replace(body, @"\s+", " ");
body = Regex.Replace(body, @"\s?([:,;{}])\s?", "$1");
body = body.Replace(";}", "}");
try
{
body = body.Replace(";}", "}");
}
catch (Exception)
{
/* Swallow exception. Some css doesn't have style rules ending in ; */
}
body = Regex.Replace(body, @"([\s:]0)(px|pt|%|em)", "$1");
// Remove comments from CSS
body = Regex.Replace(body, @"/\*[\d\D]*?\*/", string.Empty);
return body;
}

View file

@@ -21,7 +21,6 @@ namespace API.Services
private readonly IDirectoryService _directoryService;
private readonly IBookService _bookService;
private readonly NumericComparer _numericComparer;
public static readonly string CacheDirectory = Path.GetFullPath(Path.Join(Directory.GetCurrentDirectory(), "cache/"));
public CacheService(ILogger<CacheService> logger, IUnitOfWork unitOfWork, IArchiveService archiveService,
IDirectoryService directoryService, IBookService bookService)
@@ -38,7 +37,7 @@ namespace API.Services
{
if (!DirectoryService.ExistOrCreate(DirectoryService.CacheDirectory))
{
_logger.LogError("Cache directory {CacheDirectory} is not accessible or does not exist. Creating...", CacheDirectory);
_logger.LogError("Cache directory {CacheDirectory} is not accessible or does not exist. Creating...", DirectoryService.CacheDirectory);
}
}
@@ -102,7 +101,7 @@ namespace API.Services
}
else
{
_directoryService.CopyDirectoryToDirectory(Path.GetDirectoryName(files[0].FilePath), extractPath,
DirectoryService.CopyDirectoryToDirectory(Path.GetDirectoryName(files[0].FilePath), extractPath,
Parser.Parser.ImageFileExtensions);
}
@@ -147,7 +146,7 @@ namespace API.Services
try
{
DirectoryService.ClearDirectory(CacheDirectory);
DirectoryService.ClearDirectory(DirectoryService.CacheDirectory);
}
catch (Exception ex)
{
@@ -198,7 +197,7 @@ namespace API.Services
if (page <= (mangaFile.Pages + pagesSoFar))
{
var path = GetCachePath(chapter.Id);
var files = _directoryService.GetFilesWithExtension(path, Parser.Parser.ImageFileExtensions);
var files = DirectoryService.GetFilesWithExtension(path, Parser.Parser.ImageFileExtensions);
Array.Sort(files, _numericComparer);
if (files.Length == 0)

View file

@@ -1,55 +0,0 @@
using System;
using System.Net.Http;
using System.Net.Http.Json;
using System.Threading.Tasks;
using API.DTOs.Stats;
using Microsoft.Extensions.Logging;
namespace API.Services.Clients
{
public class StatsApiClient
{
private readonly HttpClient _client;
private readonly ILogger<StatsApiClient> _logger;
#pragma warning disable S1075
private const string ApiUrl = "http://stats.kavitareader.com";
#pragma warning restore S1075
public StatsApiClient(HttpClient client, ILogger<StatsApiClient> logger)
{
_client = client;
_logger = logger;
_client.Timeout = TimeSpan.FromSeconds(30);
}
public async Task SendDataToStatsServer(UsageStatisticsDto data)
{
var responseContent = string.Empty;
try
{
using var response = await _client.PostAsJsonAsync(ApiUrl + "/api/InstallationStats", data);
responseContent = await response.Content.ReadAsStringAsync();
response.EnsureSuccessStatusCode();
}
catch (HttpRequestException e)
{
var info = new
{
dataSent = data,
response = responseContent
};
_logger.LogError(e, "KavitaStats did not respond successfully. {Content}", info);
throw;
}
catch (Exception e)
{
_logger.LogError(e, "An error happened during the request to KavitaStats");
throw;
}
}
}
}

View file

@@ -16,10 +16,12 @@ namespace API.Services
private static readonly Regex ExcludeDirectories = new Regex(
@"@eaDir|\.DS_Store",
RegexOptions.Compiled | RegexOptions.IgnoreCase);
public static readonly string TempDirectory = Path.Join(Directory.GetCurrentDirectory(), "temp");
public static readonly string LogDirectory = Path.Join(Directory.GetCurrentDirectory(), "logs");
public static readonly string CacheDirectory = Path.Join(Directory.GetCurrentDirectory(), "cache");
public static readonly string CoverImageDirectory = Path.Join(Directory.GetCurrentDirectory(), "covers");
public static readonly string TempDirectory = Path.Join(Directory.GetCurrentDirectory(), "config", "temp");
public static readonly string LogDirectory = Path.Join(Directory.GetCurrentDirectory(), "config", "logs");
public static readonly string CacheDirectory = Path.Join(Directory.GetCurrentDirectory(), "config", "cache");
public static readonly string CoverImageDirectory = Path.Join(Directory.GetCurrentDirectory(), "config", "covers");
public static readonly string BackupDirectory = Path.Join(Directory.GetCurrentDirectory(), "config", "backups");
public static readonly string StatsDirectory = Path.Join(Directory.GetCurrentDirectory(), "config", "stats");
public DirectoryService(ILogger<DirectoryService> logger)
{
@@ -95,7 +97,7 @@ namespace API.Services
return di.Exists;
}
public IEnumerable<string> GetFiles(string path, string searchPatternExpression = "",
public static IEnumerable<string> GetFiles(string path, string searchPatternExpression = "",
SearchOption searchOption = SearchOption.TopDirectoryOnly)
{
if (searchPatternExpression != string.Empty)
@@ -134,13 +136,10 @@ namespace API.Services
/// <param name="searchPattern">Defaults to *, meaning all files</param>
/// <returns></returns>
/// <exception cref="DirectoryNotFoundException"></exception>
public bool CopyDirectoryToDirectory(string sourceDirName, string destDirName, string searchPattern = "*")
public static bool CopyDirectoryToDirectory(string sourceDirName, string destDirName, string searchPattern = "")
{
if (string.IsNullOrEmpty(sourceDirName)) return false;
var di = new DirectoryInfo(sourceDirName);
if (!di.Exists) return false;
// Get the subdirectories for the specified directory.
var dir = new DirectoryInfo(sourceDirName);
@@ -154,7 +153,7 @@ namespace API.Services
var dirs = dir.GetDirectories();
// If the destination directory doesn't exist, create it.
Directory.CreateDirectory(destDirName);
ExistOrCreate(destDirName);
// Get the files in the directory and copy them to the new location.
var files = GetFilesWithExtension(dir.FullName, searchPattern).Select(n => new FileInfo(n));
@@ -176,7 +175,7 @@ namespace API.Services
public string[] GetFilesWithExtension(string path, string searchPatternExpression = "")
public static string[] GetFilesWithExtension(string path, string searchPatternExpression = "")
{
if (searchPatternExpression != string.Empty)
{

View file

@@ -13,7 +13,6 @@ namespace API.Services
public class ImageService : IImageService
{
private readonly ILogger<ImageService> _logger;
private readonly IDirectoryService _directoryService;
public const string ChapterCoverImageRegex = @"v\d+_c\d+";
public const string SeriesCoverImageRegex = @"seres\d+";
public const string CollectionTagCoverImageRegex = @"tag\d+";
@@ -24,10 +23,9 @@ namespace API.Services
/// </summary>
private const int ThumbnailWidth = 320;
public ImageService(ILogger<ImageService> logger, IDirectoryService directoryService)
public ImageService(ILogger<ImageService> logger)
{
_logger = logger;
_directoryService = directoryService;
}
/// <summary>
@@ -44,9 +42,9 @@ namespace API.Services
return null;
}
var firstImage = _directoryService.GetFilesWithExtension(directory, Parser.Parser.ImageFileExtensions)
var firstImage = DirectoryService.GetFilesWithExtension(directory, Parser.Parser.ImageFileExtensions)
.OrderBy(f => f, new NaturalSortComparer()).FirstOrDefault();
return firstImage;
}

View file

@@ -1,3 +1,4 @@
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.IO;
@@ -216,37 +217,45 @@ namespace API.Services
var chunkInfo = await _unitOfWork.SeriesRepository.GetChunkInfo(library.Id);
var stopwatch = Stopwatch.StartNew();
var totalTime = 0L;
_logger.LogDebug($"[MetadataService] Refreshing Library {library.Name}. Total Items: {chunkInfo.TotalSize}. Total Chunks: {chunkInfo.TotalChunks} with {chunkInfo.ChunkSize} size.");
_logger.LogInformation("[MetadataService] Refreshing Library {LibraryName}. Total Items: {TotalSize}. Total Chunks: {TotalChunks} with {ChunkSize} size", library.Name, chunkInfo.TotalSize, chunkInfo.TotalChunks, chunkInfo.ChunkSize);
// This technically does
for (var chunk = 1; chunk <= chunkInfo.TotalChunks; chunk++)
{
if (chunkInfo.TotalChunks == 0) continue;
totalTime += stopwatch.ElapsedMilliseconds;
stopwatch.Restart();
_logger.LogDebug($"[MetadataService] Processing chunk {chunk} / {chunkInfo.TotalChunks} with size {chunkInfo.ChunkSize} Series ({chunk * chunkInfo.ChunkSize} - {(chunk + 1) * chunkInfo.ChunkSize}");
_logger.LogInformation("[MetadataService] Processing chunk {ChunkNumber} / {TotalChunks} with size {ChunkSize}. Series ({SeriesStart} - {SeriesEnd}",
chunk, chunkInfo.TotalChunks, chunkInfo.ChunkSize, chunk * chunkInfo.ChunkSize, (chunk + 1) * chunkInfo.ChunkSize);
var nonLibrarySeries = await _unitOfWork.SeriesRepository.GetFullSeriesForLibraryIdAsync(library.Id,
new UserParams()
{
PageNumber = chunk,
PageSize = chunkInfo.ChunkSize
});
_logger.LogDebug($"[MetadataService] Fetched {nonLibrarySeries.Count} series for refresh");
_logger.LogDebug("[MetadataService] Fetched {SeriesCount} series for refresh", nonLibrarySeries.Count);
Parallel.ForEach(nonLibrarySeries, series =>
{
_logger.LogDebug("[MetadataService] Processing series {SeriesName}", series.OriginalName);
var volumeUpdated = false;
foreach (var volume in series.Volumes)
try
{
var chapterUpdated = false;
foreach (var chapter in volume.Chapters)
_logger.LogDebug("[MetadataService] Processing series {SeriesName}", series.OriginalName);
var volumeUpdated = false;
foreach (var volume in series.Volumes)
{
chapterUpdated = UpdateMetadata(chapter, forceUpdate);
var chapterUpdated = false;
foreach (var chapter in volume.Chapters)
{
chapterUpdated = UpdateMetadata(chapter, forceUpdate);
}
volumeUpdated = UpdateMetadata(volume, chapterUpdated || forceUpdate);
}
volumeUpdated = UpdateMetadata(volume, chapterUpdated || forceUpdate);
UpdateMetadata(series, volumeUpdated || forceUpdate);
}
catch (Exception)
{
/* Swallow exception */
}
UpdateMetadata(series, volumeUpdated || forceUpdate);
});
if (_unitOfWork.HasChanges() && await _unitOfWork.CommitAsync())
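
The chunk counts in these log lines follow from plain ceiling division, assuming `ChunkInfo` is computed the usual way for paginated work:

```csharp
using System;

// Worked example with assumed values: 120 series, chunk size 50.
const int totalSize = 120;
const int chunkSize = 50;
var totalChunks = (totalSize + chunkSize - 1) / chunkSize; // ceil(120 / 50) = 3
Console.WriteLine($"Total Chunks: {totalChunks} with {chunkSize} size");
```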

@@ -89,7 +89,7 @@ namespace API.Services
}
_logger.LogDebug("Scheduling stat collection daily");
RecurringJob.AddOrUpdate(SendDataTask, () => _statsService.CollectAndSendStatsData(), Cron.Daily, TimeZoneInfo.Local);
RecurringJob.AddOrUpdate(SendDataTask, () => _statsService.Send(), Cron.Daily, TimeZoneInfo.Local);
}
public void CancelStatsTasks()
@@ -102,7 +102,7 @@ namespace API.Services
public void RunStatCollection()
{
_logger.LogInformation("Enqueuing stat collection");
BackgroundJob.Enqueue(() => _statsService.CollectAndSendStatsData());
BackgroundJob.Enqueue(() => _statsService.Send());
}
#endregion
@@ -138,8 +137,7 @@ namespace API.Services
public void CleanupTemp()
{
var tempDirectory = Path.Join(Directory.GetCurrentDirectory(), "temp");
BackgroundJob.Enqueue(() => DirectoryService.ClearDirectory(tempDirectory));
BackgroundJob.Enqueue(() => DirectoryService.ClearDirectory(DirectoryService.TempDirectory));
}
public void RefreshSeriesMetadata(int libraryId, int seriesId, bool forceUpdate = true)

@@ -9,6 +9,7 @@ using API.Extensions;
using API.Interfaces;
using API.Interfaces.Services;
using Hangfire;
using Kavita.Common.EnvironmentInfo;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.Logging;
@@ -19,8 +20,8 @@ namespace API.Services.Tasks
private readonly IUnitOfWork _unitOfWork;
private readonly ILogger<BackupService> _logger;
private readonly IDirectoryService _directoryService;
private readonly string _tempDirectory = Path.Join(Directory.GetCurrentDirectory(), "temp");
private readonly string _logDirectory = Path.Join(Directory.GetCurrentDirectory(), "logs");
private readonly string _tempDirectory = DirectoryService.TempDirectory;
private readonly string _logDirectory = DirectoryService.LogDirectory;
private readonly IList<string> _backupFiles;
@@ -33,15 +34,32 @@ namespace API.Services.Tasks
var maxRollingFiles = config.GetMaxRollingFiles();
var loggingSection = config.GetLoggingFileName();
var files = LogFiles(maxRollingFiles, loggingSection);
_backupFiles = new List<string>()
if (new OsInfo(Array.Empty<IOsVersionAdapter>()).IsDocker)
{
"appsettings.json",
"Hangfire.db",
"Hangfire-log.db",
"kavita.db",
"kavita.db-shm", // This wont always be there
"kavita.db-wal", // This wont always be there
};
_backupFiles = new List<string>()
{
"data/appsettings.json",
"data/Hangfire.db",
"data/Hangfire-log.db",
"data/kavita.db",
"data/kavita.db-shm", // This wont always be there
"data/kavita.db-wal" // This wont always be there
};
}
else
{
_backupFiles = new List<string>()
{
"appsettings.json",
"Hangfire.db",
"Hangfire-log.db",
"kavita.db",
"kavita.db-shm", // This wont always be there
"kavita.db-wal" // This wont always be there
};
}
foreach (var file in files.Select(f => (new FileInfo(f)).Name).ToList())
{
_backupFiles.Add(file);
@@ -54,7 +72,7 @@ namespace API.Services.Tasks
var fi = new FileInfo(logFileName);
var files = maxRollingFiles > 0
? _directoryService.GetFiles(_logDirectory, $@"{Path.GetFileNameWithoutExtension(fi.Name)}{multipleFileRegex}\.log")
? DirectoryService.GetFiles(_logDirectory, $@"{Path.GetFileNameWithoutExtension(fi.Name)}{multipleFileRegex}\.log")
: new[] {"kavita.log"};
return files;
}
@@ -129,6 +147,11 @@ namespace API.Services.Tasks
{
// Swallow exception. This can be a duplicate cover being copied as chapter and volumes can share same file.
}
if (!DirectoryService.GetFiles(outputTempDir).Any())
{
DirectoryService.ClearAndDeleteDirectory(outputTempDir);
}
}
/// <summary>
@@ -141,7 +164,7 @@ namespace API.Services.Tasks
var backupDirectory = Task.Run(() => _unitOfWork.SettingsRepository.GetSettingAsync(ServerSettingKey.BackupDirectory)).Result.Value;
if (!_directoryService.Exists(backupDirectory)) return;
var deltaTime = DateTime.Today.Subtract(TimeSpan.FromDays(dayThreshold));
var allBackups = _directoryService.GetFiles(backupDirectory).ToList();
var allBackups = DirectoryService.GetFiles(backupDirectory).ToList();
var expiredBackups = allBackups.Select(filename => new FileInfo(filename))
.Where(f => f.CreationTime > deltaTime)
.ToList();

@@ -16,16 +16,14 @@ namespace API.Services.Tasks
private readonly ILogger<CleanupService> _logger;
private readonly IBackupService _backupService;
private readonly IUnitOfWork _unitOfWork;
private readonly IDirectoryService _directoryService;
public CleanupService(ICacheService cacheService, ILogger<CleanupService> logger,
IBackupService backupService, IUnitOfWork unitOfWork, IDirectoryService directoryService)
IBackupService backupService, IUnitOfWork unitOfWork)
{
_cacheService = cacheService;
_logger = logger;
_backupService = backupService;
_unitOfWork = unitOfWork;
_directoryService = directoryService;
}
public void CleanupCacheDirectory()
@@ -42,7 +40,7 @@ namespace API.Services.Tasks
{
_logger.LogInformation("Starting Cleanup");
_logger.LogInformation("Cleaning temp directory");
var tempDirectory = Path.Join(Directory.GetCurrentDirectory(), "temp");
var tempDirectory = DirectoryService.TempDirectory;
DirectoryService.ClearDirectory(tempDirectory);
CleanupCacheDirectory();
_logger.LogInformation("Cleaning old database backups");
@@ -57,7 +55,7 @@ namespace API.Services.Tasks
private async Task DeleteSeriesCoverImages()
{
var images = await _unitOfWork.SeriesRepository.GetAllCoverImagesAsync();
var files = _directoryService.GetFiles(DirectoryService.CoverImageDirectory, ImageService.SeriesCoverImageRegex);
var files = DirectoryService.GetFiles(DirectoryService.CoverImageDirectory, ImageService.SeriesCoverImageRegex);
foreach (var file in files)
{
if (images.Contains(Path.GetFileName(file))) continue;
@@ -69,7 +67,7 @@ namespace API.Services.Tasks
private async Task DeleteChapterCoverImages()
{
var images = await _unitOfWork.ChapterRepository.GetAllCoverImagesAsync();
var files = _directoryService.GetFiles(DirectoryService.CoverImageDirectory, ImageService.ChapterCoverImageRegex);
var files = DirectoryService.GetFiles(DirectoryService.CoverImageDirectory, ImageService.ChapterCoverImageRegex);
foreach (var file in files)
{
if (images.Contains(Path.GetFileName(file))) continue;
@@ -81,7 +79,7 @@ namespace API.Services.Tasks
private async Task DeleteTagCoverImages()
{
var images = await _unitOfWork.CollectionTagRepository.GetAllCoverImagesAsync();
var files = _directoryService.GetFiles(DirectoryService.CoverImageDirectory, ImageService.CollectionTagCoverImageRegex);
var files = DirectoryService.GetFiles(DirectoryService.CoverImageDirectory, ImageService.CollectionTagCoverImageRegex);
foreach (var file in files)
{
if (images.Contains(Path.GetFileName(file))) continue;

@@ -73,9 +73,13 @@ namespace API.Services.Tasks.Scanner
info = Parser.Parser.Parse(path, rootPath, type);
}
// If we couldn't match, log. But don't log if the file parses as a cover image
if (info == null)
{
_logger.LogWarning("[Scanner] Could not parse series from {Path}", path);
if (!(Parser.Parser.IsImage(path) && Parser.Parser.IsCoverImage(path)))
{
_logger.LogWarning("[Scanner] Could not parse series from {Path}", path);
}
return;
}
@@ -133,13 +137,11 @@ namespace API.Services.Tasks.Scanner
public string MergeName(ParserInfo info)
{
var normalizedSeries = Parser.Parser.Normalize(info.Series);
_logger.LogDebug("Checking if we can merge {NormalizedSeries}", normalizedSeries);
var existingName =
_scannedSeries.SingleOrDefault(p => Parser.Parser.Normalize(p.Key.NormalizedName) == normalizedSeries && p.Key.Format == info.Format)
.Key;
if (existingName != null && !string.IsNullOrEmpty(existingName.Name))
{
_logger.LogDebug("Found duplicate parsed infos, merged {Original} into {Merged}", info.Series, existingName.Name);
return existingName.Name;
}
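
`MergeName` can only merge what normalization collapses together. A rough stand-in for `Parser.Parser.Normalize`, assuming it strips non-alphanumerics and lowercases (the real implementation lives in the parser and may differ):

```csharp
using System.Text.RegularExpressions;

static class NormalizeSketch
{
    // Assumed behavior: "One-Punch Man" and "one punch man!" both reduce
    // to "onepunchman", so their ParserInfos merge into one series entry.
    public static string Normalize(string name) =>
        Regex.Replace(name, @"[^a-zA-Z0-9]", string.Empty).ToLower();
}
```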

@@ -261,13 +261,15 @@ namespace API.Services.Tasks
var totalTime = 0L;
// Update existing series
_logger.LogDebug("[ScannerService] Updating existing series");
_logger.LogInformation("[ScannerService] Updating existing series for {LibraryName}. Total Items: {TotalSize}. Total Chunks: {TotalChunks} with {ChunkSize} size",
library.Name, chunkInfo.TotalSize, chunkInfo.TotalChunks, chunkInfo.ChunkSize);
for (var chunk = 1; chunk <= chunkInfo.TotalChunks; chunk++)
{
if (chunkInfo.TotalChunks == 0) continue;
totalTime += stopwatch.ElapsedMilliseconds;
stopwatch.Restart();
_logger.LogDebug($"[ScannerService] Processing chunk {chunk} / {chunkInfo.TotalChunks} with size {chunkInfo.ChunkSize} Series ({chunk * chunkInfo.ChunkSize} - {(chunk + 1) * chunkInfo.ChunkSize}");
_logger.LogInformation("[ScannerService] Processing chunk {ChunkNumber} / {TotalChunks} with size {ChunkSize}. Series ({SeriesStart} - {SeriesEnd}",
chunk, chunkInfo.TotalChunks, chunkInfo.ChunkSize, chunk * chunkInfo.ChunkSize, (chunk + 1) * chunkInfo.ChunkSize);
var nonLibrarySeries = await _unitOfWork.SeriesRepository.GetFullSeriesForLibraryIdAsync(library.Id, new UserParams()
{
PageNumber = chunk,
@@ -299,7 +301,21 @@ namespace API.Services.Tasks
UpdateSeries(series, parsedSeries);
});
await _unitOfWork.CommitAsync();
try
{
await _unitOfWork.CommitAsync();
}
catch (Exception ex)
{
_logger.LogCritical(ex, "[ScannerService] There was an issue writing to the DB. Chunk {ChunkNumber} did not save to DB. If debug mode, series to check will be printed", chunk);
foreach (var series in nonLibrarySeries)
{
_logger.LogDebug("[ScannerService] There may be a constraint issue with {SeriesName}", series.OriginalName);
}
await _messageHub.Clients.All.SendAsync(SignalREvents.ScanLibraryError,
MessageFactory.ScanLibraryError(library.Id));
continue;
}
_logger.LogInformation(
"[ScannerService] Processed {SeriesStart} - {SeriesEnd} series in {ElapsedScanTime} milliseconds for {LibraryName}",
chunk * chunkInfo.ChunkSize, (chunk * chunkInfo.ChunkSize) + nonLibrarySeries.Count, totalTime, library.Name);
@@ -320,12 +336,14 @@ namespace API.Services.Tasks
_logger.LogDebug("[ScannerService] Adding new series");
var newSeries = new List<Series>();
var allSeries = (await _unitOfWork.SeriesRepository.GetSeriesForLibraryIdAsync(library.Id)).ToList();
_logger.LogDebug("[ScannerService] Fetched {AllSeriesCount} series for comparing new series with. There should be {DeltaToParsedSeries} new series",
allSeries.Count, parsedSeries.Count - allSeries.Count);
foreach (var (key, infos) in parsedSeries)
{
// Key is normalized already
Series existingSeries;
try
{
{// NOTE: Maybe use .Equals() here
existingSeries = allSeries.SingleOrDefault(s =>
(s.NormalizedName == key.NormalizedName || Parser.Parser.Normalize(s.OriginalName) == key.NormalizedName)
&& (s.Format == key.Format || s.Format == MangaFormat.Unknown));
@@ -386,7 +404,7 @@ namespace API.Services.Tasks
}
}
_logger.LogDebug(
_logger.LogInformation(
"[ScannerService] Added {NewSeries} series in {ElapsedScanTime} milliseconds for {LibraryName}",
newSeries.Count, stopwatch.ElapsedMilliseconds, library.Name);
}

@@ -1,6 +1,7 @@
using System;
using System.IO;
using System.Linq;
using System.Net.Http;
using System.Runtime.InteropServices;
using System.Text.Json;
using System.Threading;
@@ -9,9 +10,11 @@ using API.Data;
using API.DTOs.Stats;
using API.Interfaces;
using API.Interfaces.Services;
using API.Services.Clients;
using Flurl.Http;
using Hangfire;
using Kavita.Common;
using Kavita.Common.EnvironmentInfo;
using Microsoft.AspNetCore.Http;
using Microsoft.EntityFrameworkCore;
using Microsoft.Extensions.Logging;
@@ -19,32 +22,65 @@ namespace API.Services.Tasks
{
public class StatsService : IStatsService
{
private const string TempFilePath = "stats/";
private const string TempFileName = "app_stats.json";
private const string StatFileName = "app_stats.json";
private readonly StatsApiClient _client;
private readonly DataContext _dbContext;
private readonly ILogger<StatsService> _logger;
private readonly IUnitOfWork _unitOfWork;
public StatsService(StatsApiClient client, DataContext dbContext, ILogger<StatsService> logger,
#pragma warning disable S1075
private const string ApiUrl = "http://stats.kavitareader.com";
#pragma warning restore S1075
private static readonly string StatsFilePath = Path.Combine(DirectoryService.StatsDirectory, StatFileName);
private static bool FileExists => File.Exists(StatsFilePath);
public StatsService(DataContext dbContext, ILogger<StatsService> logger,
IUnitOfWork unitOfWork)
{
_client = client;
_dbContext = dbContext;
_logger = logger;
_unitOfWork = unitOfWork;
}
private static string FinalPath => Path.Combine(Directory.GetCurrentDirectory(), TempFilePath, TempFileName);
private static bool FileExists => File.Exists(FinalPath);
public async Task PathData(ClientInfoDto clientInfoDto)
/// <summary>
/// Because all installs fire this at the same time, we can DDoS our own stats server. When fired, this task schedules the actual
/// upload to run at a random offset of 0-5 hours (an offset of 0 sends immediately)
/// </summary>
public async Task Send()
{
_logger.LogDebug("Pathing client data to the file");
var allowStatCollection = (await _unitOfWork.SettingsRepository.GetSettingsDtoAsync()).AllowStatCollection;
if (!allowStatCollection)
{
return;
}
var rnd = new Random();
var offset = rnd.Next(0, 6);
if (offset == 0)
{
await SendData();
}
else
{
_logger.LogInformation("KavitaStats upload has been schedule to run in {Offset} hours", offset);
BackgroundJob.Schedule(() => SendData(), DateTimeOffset.Now.AddHours(offset));
}
}
/// <summary>
/// This must be public for Hangfire. Do not call this directly.
/// </summary>
// ReSharper disable once MemberCanBePrivate.Global
public async Task SendData()
{
await CollectRelevantData();
await FinalizeStats();
}
public async Task RecordClientInfo(ClientInfoDto clientInfoDto)
{
var statisticsDto = await GetData();
statisticsDto.AddClientInfo(clientInfoDto);
await SaveFile(statisticsDto);
@@ -52,12 +88,7 @@ namespace API.Services.Tasks
private async Task CollectRelevantData()
{
_logger.LogDebug("Collecting data from the server and database");
_logger.LogDebug("Collecting usage info");
var usageInfo = await GetUsageInfo();
_logger.LogDebug("Collecting server info");
var serverInfo = GetServerInfo();
await PathData(serverInfo, usageInfo);
@@ -67,39 +98,68 @@ namespace API.Services.Tasks
{
try
{
_logger.LogDebug("Finalizing Stats collection flow");
var data = await GetExistingData<UsageStatisticsDto>();
var successful = await SendDataToStatsServer(data);
_logger.LogDebug("Sending data to the Stats server");
await _client.SendDataToStatsServer(data);
_logger.LogDebug("Deleting the file from disk");
if (FileExists) File.Delete(FinalPath);
if (successful)
{
ResetStats();
}
}
catch (Exception ex)
{
_logger.LogError(ex, "Error Finalizing Stats collection flow");
throw;
_logger.LogError(ex, "There was an exception while sending data to KavitaStats");
}
}
public async Task CollectAndSendStatsData()
private async Task<bool> SendDataToStatsServer(UsageStatisticsDto data)
{
var allowStatCollection = (await _unitOfWork.SettingsRepository.GetSettingsDtoAsync()).AllowStatCollection;
if (!allowStatCollection)
var responseContent = string.Empty;
try
{
_logger.LogDebug("User has opted out of stat collection, not registering tasks");
return;
var response = await (ApiUrl + "/api/InstallationStats")
.WithHeader("Accept", "application/json")
.WithHeader("User-Agent", "Kavita")
.WithHeader("x-api-key", "MsnvA2DfQqxSK5jh")
.WithHeader("api-key", "MsnvA2DfQqxSK5jh")
.WithHeader("x-kavita-version", BuildInfo.Version)
.WithTimeout(TimeSpan.FromSeconds(30))
.PostJsonAsync(data);
if (response.StatusCode != StatusCodes.Status200OK)
{
_logger.LogError("KavitaStats did not respond successfully. {Content}", response);
return false;
}
return true;
}
await CollectRelevantData();
await FinalizeStats();
catch (HttpRequestException e)
{
var info = new
{
dataSent = data,
response = responseContent
};
_logger.LogError(e, "KavitaStats did not respond successfully. {Content}", info);
}
catch (Exception e)
{
_logger.LogError(e, "An error happened during the request to KavitaStats");
}
return false;
}
private static void ResetStats()
{
if (FileExists) File.Delete(StatsFilePath);
}
private async Task PathData(ServerInfoDto serverInfoDto, UsageInfoDto usageInfoDto)
{
_logger.LogDebug("Pathing server and usage info to the file");
var data = await GetData();
data.ServerInfo = serverInfoDto;
@@ -110,7 +170,7 @@ namespace API.Services.Tasks
await SaveFile(data);
}
private async ValueTask<UsageStatisticsDto> GetData()
private static async ValueTask<UsageStatisticsDto> GetData()
{
if (!FileExists) return new UsageStatisticsDto {InstallId = HashUtil.AnonymousToken()};
@@ -156,39 +216,17 @@ namespace API.Services.Tasks
return serverInfo;
}
private async Task<T> GetExistingData<T>()
private static async Task<T> GetExistingData<T>()
{
_logger.LogInformation("Fetching existing data from file");
var existingDataJson = await GetFileDataAsString();
_logger.LogInformation("Deserializing data from file to object");
var existingData = JsonSerializer.Deserialize<T>(existingDataJson);
return existingData;
var json = await File.ReadAllTextAsync(StatsFilePath);
return JsonSerializer.Deserialize<T>(json);
}
private async Task<string> GetFileDataAsString()
private static async Task SaveFile(UsageStatisticsDto statisticsDto)
{
_logger.LogInformation("Reading file from disk");
return await File.ReadAllTextAsync(FinalPath);
}
DirectoryService.ExistOrCreate(DirectoryService.StatsDirectory);
private async Task SaveFile(UsageStatisticsDto statisticsDto)
{
_logger.LogDebug("Saving file");
var finalDirectory = FinalPath.Replace(TempFileName, string.Empty);
if (!Directory.Exists(finalDirectory))
{
_logger.LogDebug("Creating tmp directory");
Directory.CreateDirectory(finalDirectory);
}
_logger.LogDebug("Serializing data to write");
var dataJson = JsonSerializer.Serialize(statisticsDto);
_logger.LogDebug("Writing file to the disk");
await File.WriteAllTextAsync(FinalPath, dataJson);
await File.WriteAllTextAsync(StatsFilePath, JsonSerializer.Serialize(statisticsDto));
}
}
}

@@ -96,5 +96,17 @@ namespace API.SignalR
}
};
}
public static SignalRMessage ScanLibraryError(int libraryId)
{
return new SignalRMessage
{
Name = SignalREvents.ScanLibraryError,
Body = new
{
LibraryId = libraryId,
}
};
}
}
}
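
Clients subscribe to this event by its name from `SignalREvents`. A minimal C# listener sketch using Microsoft.AspNetCore.SignalR.Client (the hub URL is hypothetical; Kavita's actual consumer is the web UI):

```csharp
using System;
using System.Threading.Tasks;
using Microsoft.AspNetCore.SignalR.Client;

class ScanErrorListener
{
    static async Task Main()
    {
        var connection = new HubConnectionBuilder()
            .WithUrl("http://localhost:5000/hubs/messages") // hypothetical route
            .Build();

        // Payload shape matches MessageFactory.ScanLibraryError's anonymous body
        connection.On<object>("ScanLibraryError", body =>
            Console.WriteLine($"Scan failed: {body}"));

        await connection.StartAsync();
        Console.ReadLine(); // keep the connection alive while listening
    }
}
```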

@@ -1,4 +1,5 @@
using System.Collections.Generic;
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using API.Interfaces;
@@ -27,7 +28,7 @@ namespace API.SignalR.Presence
_unitOfWork = unitOfWork;
}
public Task UserConnected(string username, string connectionId)
public async Task UserConnected(string username, string connectionId)
{
lock (OnlineUsers)
{
@@ -41,7 +42,10 @@ namespace API.SignalR.Presence
}
}
return Task.CompletedTask;
// Update the last active for the user
var user = await _unitOfWork.UserRepository.GetUserByUsernameAsync(username);
user.LastActive = DateTime.Now;
await _unitOfWork.CommitAsync();
}
public Task UserDisconnected(string username, string connectionId)

@@ -11,5 +11,6 @@
public const string ScanLibraryProgress = "ScanLibraryProgress";
public const string OnlineUsers = "OnlineUsers";
public const string SeriesAddedToCollection = "SeriesAddedToCollection";
public const string ScanLibraryError = "ScanLibraryError";
}
}

@@ -24,6 +24,7 @@ using Microsoft.AspNetCore.StaticFiles;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.Logging;
using Microsoft.OpenApi.Models;
namespace API
@@ -106,7 +107,11 @@ namespace API
services.AddResponseCaching();
services.AddStatsClient(_config);
services.Configure<ForwardedHeadersOptions>(options =>
{
options.ForwardedHeaders =
ForwardedHeaders.All;
});
services.AddHangfire(configuration => configuration
.UseSimpleAssemblyNameTypeSerializer()
@@ -139,7 +144,10 @@ namespace API
app.UseResponseCompression();
app.UseForwardedHeaders();
app.UseForwardedHeaders(new ForwardedHeadersOptions
{
ForwardedHeaders = ForwardedHeaders.All
});
app.UseRouting();
@@ -210,6 +218,15 @@ namespace API
applicationLifetime.ApplicationStopping.Register(OnShutdown);
applicationLifetime.ApplicationStarted.Register(() =>
{
try
{
var logger = serviceProvider.GetRequiredService<ILogger<Startup>>();
logger.LogInformation("Kavita - v{Version}", BuildInfo.Version);
}
catch (Exception)
{
/* Swallow Exception */
}
Console.WriteLine($"Kavita - v{BuildInfo.Version}");
});
}

@@ -1,21 +1,21 @@
{
"ConnectionStrings": {
"DefaultConnection": "Data source=kavita.db"
"DefaultConnection": "Data source=config//kavita.db"
},
"TokenKey": "super secret unguessable key",
"Logging": {
"LogLevel": {
"Default": "Debug",
"Default": "Information",
"Microsoft": "Information",
"Microsoft.Hosting.Lifetime": "Error",
"Hangfire": "Information",
"Microsoft.AspNetCore.Hosting.Internal.WebHost": "Information"
},
"File": {
"Path": "logs/kavita.log",
"Path": "config//logs/kavita.log",
"Append": "True",
"FileSizeLimitBytes": 10485760,
"MaxRollingFiles": 5
"FileSizeLimitBytes": 26214400,
"MaxRollingFiles": 2
}
},
"Port": 5000