v0.4.8 Release (#720)

* Bump versions by dotnet-bump-version.

* Bump versions by dotnet-bump-version.

* Workflow updates (#658)

# Added
- Added: Added automatic character parsing for the Discord notifier. Now if the PR text is over the character limit, it is trimmed and an appropriate link to the full changelog is appended (Release for Stable, PR for Dev); see the sketch below.

# Removed
- Removed: Removed Sentry map task from the workflow since Sentry is no longer used.
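
A minimal sketch of the trimming idea referenced above, assuming Discord's 2,000-character message cap; the class, names, and URLs are illustrative, not the workflow's actual code:

```csharp
public static class NotifierMessage
{
    private const int DiscordLimit = 2000; // assumed per-message cap

    public static string Build(string body, bool isStableRelease)
    {
        // Stable builds link to the release notes; dev builds link to the PR list.
        var link = isStableRelease
            ? "https://github.com/Kareadita/Kavita/releases" // illustrative URL
            : "https://github.com/Kareadita/Kavita/pulls";   // illustrative URL
        var suffix = "\n\nFull changelog: " + link;

        if (body.Length + suffix.Length <= DiscordLimit) return body + suffix;

        // Trim the body so the message plus the changelog link still fits the cap.
        return body.Substring(0, DiscordLimit - suffix.Length - 3) + "..." + suffix;
    }
}
```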

* Bump versions by dotnet-bump-version.

* Misc Updates (#665)

* Do not allow non-admins to change their passwords when authentication is disabled

* Clean up the login page so that input field text is black

* Cleaned up some resizing that occurred when typing a password while having a lot of users

* Changed a user's LastActive to update not just on login, but also whenever they open an already authenticated session.

* Bump versions by dotnet-bump-version.

* Logging Cleanup (#668)

* Removed some verbose debug statements and promoted some debug logging to information level so it is more prevalent in logs for default installs.

* In Progress now sends progress information on the Series

* Added the ability to add cards to Recently Added when new series are added in the backend

* Implemented the ability to click the glasses icon within the reader to turn off incognito mode, so you can start tracking progress

* Don't warn the user about authentication when they don't touch that control

* Bump versions by dotnet-bump-version.

* Changed the stats that are sent back to stat server from installed server.

* Revert "Changed the stats that are sent back to stat server from installed server."

This reverts commit 644cb6d1f6.

* Bump versions by dotnet-bump-version.

* Bump versions by dotnet-bump-version.

* Bulk Add to Collection (#674)

* Fixed the typeahead not having the same size input box as other inputs

* Implemented the ability to add multiple series to a collection through the bulk operations flow. Updated the book parser to handle both the @import url('...'); and @import '...'; syntaxes (see the regex sketch at the end of this list).

* Implemented the ability to create a new Collection tag via bulk operations flow.
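
A rough sketch of a regex that accepts both @import forms mentioned above; the actual pattern in Kavita's book parser may differ:

```csharp
using System.Collections.Generic;
using System.Text.RegularExpressions;

public static class CssImportMatcher
{
    // Matches @import url('file.css'); as well as @import 'file.css';
    // (single or double quotes, optional url(...) wrapper).
    private static readonly Regex CssImport = new Regex(
        @"@import\s+(?:url\()?['""](?<file>[^'""]+)['""]\)?\s*;",
        RegexOptions.IgnoreCase | RegexOptions.Compiled);

    public static IEnumerable<string> GetImports(string css)
    {
        foreach (Match match in CssImport.Matches(css))
        {
            yield return match.Groups["file"].Value;
        }
    }
}
```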

* Bump versions by dotnet-bump-version.

* Bulk Operations for In Progress and Recently Added (#677)

* Don't log a message about bad match if the file is a cover image

* Enable bulk operations for In Progress and Recently Added

* Fixed a bad logic case

* Bump versions by dotnet-bump-version.

* Regression Fix (#680)

* Ensure we mount the backups directory for Docker users

* Fixed a huge logic bug that deleted files in users' libraries

* Bump versions by dotnet-bump-version.

* Changed the chunk size to a fixed 50 to validate whether it's causing issues with refresh. Added some try/catches to see if exceptions are causing issues. (#681)

* Bump versions by dotnet-bump-version.

* Fixed a bug where searching on a localized name would fail to show results. Fixed a bug where extra spaces would cause search results not to display properly. (#682)

* Bump versions by dotnet-bump-version.

* When we have a special marker, ensure we fall back to folder parsing to try to group the file to the actual series before just accepting what we parsed (a rough sketch follows below). (#684)

Fixed a missed parsing case where comic special parsing wasn't being called on comic libraries.
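
Roughly, the fallback described above looks like this; ParseFromFolders is a hypothetical helper standing in for Kavita's folder-parsing path:

```csharp
var info = Parser.Parse(filePath, rootPath, type);
if (info != null && info.IsSpecial)
{
    // Specials rarely carry a usable series name in the filename, so consult
    // the parent folders before accepting whatever the filename produced.
    var folderInfo = Parser.ParseFromFolders(filePath, rootPath, type); // hypothetical
    if (folderInfo != null && !string.IsNullOrEmpty(folderInfo.Series))
    {
        info.Series = folderInfo.Series;
    }
}
```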

* Bump versions by dotnet-bump-version.

* iOS Admin page dropdown fix (#686)

# Fixed:
- Fixed: Fixed an issue where the dropdown on the admin server page would not work on Safari or other iOS browsers.

* When the DB fails to save, log all the series the user should look into for constraint issues and push a message to the admins connected to the web UI. (#687)

* Bump versions by dotnet-bump-version.

* Bump versions by dotnet-bump-version.

* Stat upload will now schedule itself between midnight and 6 AM server time (condensed sketch below). (#688)
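
Condensed from the StatsService diff at the end of this commit: the upload task picks a random hour offset and lets Hangfire schedule the actual send.

```csharp
// From StatsService.Send(): spread uploads over a 0-6 hour window so all
// installs don't hit the stats server at the same moment.
var offset = new Random().Next(0, 6);
if (offset == 0)
{
    await SendData();
}
else
{
    BackgroundJob.Schedule(() => SendData(), DateTimeOffset.Now.AddHours(offset));
}
```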

* Bump versions by dotnet-bump-version.

* EPUB CSS Parsing Issues (#690)

* WIP. Rewrote some of the regexes to better support CSS escaping. We now escape background-image, border-image, and list-style-image within CSS files (see the sketch after this list).

* Added position: relative to help with positioning in books that are built entirely from absolutely positioned elements.

* When there is absolute positioning, as in some EPUB-based comics, suppress the bottom action bar since it won't render in the correct location.

* Fixed tests

* Commented out tests
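
A sketch of the url() escaping idea from the first bullet above; the property list comes from that bullet, while the pattern and apiBase are illustrative rather than Kavita's exact regex:

```csharp
using System.Text.RegularExpressions;

public static class EpubCssRewriter
{
    // Capture url(...) values on the image-carrying properties named above.
    private static readonly Regex ImageUrl = new Regex(
        @"(?<prop>background-image|border-image|list-style-image)\s*:\s*url\(\s*['""]?(?<path>[^'"")]+?)['""]?\s*\)",
        RegexOptions.IgnoreCase | RegexOptions.Compiled);

    public static string RewriteImageUrls(string css, string apiBase)
    {
        // Re-point each image reference at the server's book-resource endpoint.
        return ImageUrl.Replace(css, m =>
            m.Groups["prop"].Value + ": url('" + apiBase + m.Groups["path"].Value + "')");
    }
}
```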

* Bump versions by dotnet-bump-version.

* More EPUB Scoping Fixes (#691)

* Added better handling for importing CSS files that are empty. Moved comment removal on CSS files to before some CSS whitespace cleanup to get better matches.

* Some enhancements to the checks that decide whether the reader needs the bottom action bar. We no longer query the DOM, and the new approach works more reliably.

* Bump versions by dotnet-bump-version.

* Fixed an issue where Docker users were not properly backing up the database. Removed an empty file that was created when covers/ had nothing in it. (#692)

* Bump versions by dotnet-bump-version.

* Fallback to Folder Parsing Issue (#694)

* Fixed a bug in the scanner where we fall back to parsing from folders for poorly named files. The code was exiting early if a chapter or volume could be parsed out.

* Fixed a unit test by tweaking a regex for fallback

* Bump versions by dotnet-bump-version.

* KavitaStats Cleanup (#695)

* Refactored the Stats code to be much cleaner and use better naming.

* Cleaned up the HTTP code to use Flurl and to return whether the upload was successful, so we can delete the file where appropriate.

* More refactoring for the stats code to clean it up and keep it consistent with our standards.

* Removed a confusing log statement

* Added support for old api key header from original stat server

* Use the correct endpoint, not the new one.

* Code smell

* Bump versions by dotnet-bump-version.

* Bulk Deletion (#697)

* Implemented bulk deletion of series

* Don't show an unauthorized exception on the UI; just redirect to the login page.

* Bump versions by dotnet-bump-version.

* Cover Image Picking + Forwarding Headers with EPUBs (#700)

* Ensure Kavita knows about forwarding headers (fixes an issue with EPUB URLs not going through HTTPS when behind a reverse proxy). Fixed a case where cover image selection preferred nested folders over files in the root directory. A sketch of the forwarded-headers setup follows this list.

* Fixed broken unit test

* Added a unit test covering the bug that was fixed
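
A minimal sketch of the forwarded-headers wiring in ASP.NET Core, matching the description above; trusting every proxy by clearing KnownNetworks/KnownProxies is an assumption based on the "accept all forwarded headers" note later in this log:

```csharp
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.HttpOverrides;

public static class ForwardedHeadersSetup
{
    public static void Configure(IApplicationBuilder app)
    {
        var options = new ForwardedHeadersOptions
        {
            ForwardedHeaders = ForwardedHeaders.All // X-Forwarded-For/-Proto/-Host
        };
        // Only loopback proxies are trusted by default; clear the lists so a
        // reverse proxy (nginx, Docker, etc.) can set the scheme seen by Kavita.
        options.KnownNetworks.Clear();
        options.KnownProxies.Clear();

        app.UseForwardedHeaders(options);
    }
}
```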

* Cover Image Picking + Forwarding Headers with EPUBs (#702)

* Updating GA Bump version temporarily for fix (#703)

* Bump versions by dotnet-bump-version.

* Cover Image Picking + Forwarding Headers with EPUBs (GA Fix) (#704)

* Bump versions by dotnet-bump-version.

* Vacation Fixes (#709)

* Ignore system and hidden folders when performing directory scan.

* Fixed the comic parser tests not using Comic mode for parsing.

* Accept all forwarded headers and use them.

* Ignore some changes from another branch

* Bump versions by dotnet-bump-version.

* Breaking Changes: Docker Parity (#698)

* Refactored all the config files for Kavita to be loaded from config/. This allows Docker to mount just one folder and makes the update functionality trivial (see the sketch after this list).

* Cleaned up documentation around new update method.

* Updated docker files to support config directory

* Removed entrypoint, no longer needed

* Updated appsettings to point to the config directory for logs

* Updated the message for Docker users who are upgrading

* Ensured that Docker users who have not updated their mount points after upgrading cannot start the server

* Code smells

* More cleanup

* Added entrypoint to fix bind mount issues

* Updated README with new folder structure

* Fixed build system for new setup

* Updated the string path if the user is on Docker

* Updated the migration flow for Docker to work properly and fixed LogFile configuration updating.

* Migrating Docker images is now working 100%

* Fixed config from bad code

* Code cleanup

Co-authored-by: Chris Plaatjes <kizaing@gmail.com>
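
The gist of the refactor, as a sketch: derive every writable path from a single config/ root so Docker needs only one bind mount. The names here are illustrative; the real constants live in Kavita's Configuration/DirectoryService code.

```csharp
using System.IO;

public static class ConfigPaths
{
    // One root that a Docker user mounts: -v /host/kavita:/kavita/config
    public static readonly string ConfigDirectory =
        Path.Join(Directory.GetCurrentDirectory(), "config");

    public static readonly string AppSettingsPath = Path.Join(ConfigDirectory, "appsettings.json");
    public static readonly string DatabasePath    = Path.Join(ConfigDirectory, "kavita.db");
    public static readonly string LogDirectory    = Path.Join(ConfigDirectory, "logs");
    public static readonly string CoverDirectory  = Path.Join(ConfigDirectory, "covers");
    public static readonly string BackupDirectory = Path.Join(ConfigDirectory, "backups");
}
```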

* Bump versions by dotnet-bump-version.

* Feature/docker parity (#714)

* Fixed monorepo-build.sh

Co-authored-by: Chris Plaatjes <kizaing@gmail.com>

* Breaking Changes: Docker Parity (#715)

* Fixed a bug in the directory-to-directory copy in the migration

* Somehow GetFiles lost its static modifier.

* Bump versions by dotnet-bump-version.

* Build issue (#716)

* Please work

* Bump versions by dotnet-bump-version.

* Bump versions by dotnet-bump-version.

* Shakeout Changes (#717)

* Made appsettings public on Configuration and changed how we detect when to migrate for non-Docker users.

* Fixed up the non-Docker copy command and removed a duplicate check on the source directory for a copy.

* Don't delete files unless we know we are successful

* Bump versions by dotnet-bump-version.

* Fixed a migration issue on Docker where it ran too many times or threw an exception when the source wasn't there. (#719)

* Bump versions by dotnet-bump-version.

* Version bump for release (#718)

* Bump versions by dotnet-bump-version.

Co-authored-by: Robbie Davis <robbie@therobbiedavis.com>
Co-authored-by: YEGCSharpDev <89283498+YEGCSharpDev@users.noreply.github.com>
Co-authored-by: Chris Plaatjes <kizaing@gmail.com>

Committed by Joseph Milazzo on 2021-11-04 (commit 33db123e81, parent cb9fa0dda8). 115 changed files with 1818 additions and 910 deletions. Excerpts from the diff follow.

API/Services/Tasks/BackupService.cs:
@@ -9,6 +9,7 @@ using API.Extensions;
 using API.Interfaces;
 using API.Interfaces.Services;
 using Hangfire;
+using Kavita.Common.EnvironmentInfo;
 using Microsoft.Extensions.Configuration;
 using Microsoft.Extensions.Logging;
@@ -19,8 +20,8 @@ namespace API.Services.Tasks
         private readonly IUnitOfWork _unitOfWork;
         private readonly ILogger<BackupService> _logger;
         private readonly IDirectoryService _directoryService;
-        private readonly string _tempDirectory = Path.Join(Directory.GetCurrentDirectory(), "temp");
-        private readonly string _logDirectory = Path.Join(Directory.GetCurrentDirectory(), "logs");
+        private readonly string _tempDirectory = DirectoryService.TempDirectory;
+        private readonly string _logDirectory = DirectoryService.LogDirectory;
         private readonly IList<string> _backupFiles;
@@ -33,15 +34,32 @@ namespace API.Services.Tasks
             var maxRollingFiles = config.GetMaxRollingFiles();
             var loggingSection = config.GetLoggingFileName();
             var files = LogFiles(maxRollingFiles, loggingSection);
-            _backupFiles = new List<string>()
-            {
-                "appsettings.json",
-                "Hangfire.db",
-                "Hangfire-log.db",
-                "kavita.db",
-                "kavita.db-shm", // This wont always be there
-                "kavita.db-wal", // This wont always be there
-            };
+            if (new OsInfo(Array.Empty<IOsVersionAdapter>()).IsDocker)
+            {
+                _backupFiles = new List<string>()
+                {
+                    "data/appsettings.json",
+                    "data/Hangfire.db",
+                    "data/Hangfire-log.db",
+                    "data/kavita.db",
+                    "data/kavita.db-shm", // This wont always be there
+                    "data/kavita.db-wal" // This wont always be there
+                };
+            }
+            else
+            {
+                _backupFiles = new List<string>()
+                {
+                    "appsettings.json",
+                    "Hangfire.db",
+                    "Hangfire-log.db",
+                    "kavita.db",
+                    "kavita.db-shm", // This wont always be there
+                    "kavita.db-wal" // This wont always be there
+                };
+            }
             foreach (var file in files.Select(f => (new FileInfo(f)).Name).ToList())
             {
                 _backupFiles.Add(file);
@@ -54,7 +72,7 @@ namespace API.Services.Tasks
             var fi = new FileInfo(logFileName);
             var files = maxRollingFiles > 0
-                ? _directoryService.GetFiles(_logDirectory, $@"{Path.GetFileNameWithoutExtension(fi.Name)}{multipleFileRegex}\.log")
+                ? DirectoryService.GetFiles(_logDirectory, $@"{Path.GetFileNameWithoutExtension(fi.Name)}{multipleFileRegex}\.log")
                 : new[] {"kavita.log"};
             return files;
         }
@@ -129,6 +147,11 @@ namespace API.Services.Tasks
             {
                 // Swallow exception. This can be a duplicate cover being copied as chapter and volumes can share same file.
             }
+
+            if (!DirectoryService.GetFiles(outputTempDir).Any())
+            {
+                DirectoryService.ClearAndDeleteDirectory(outputTempDir);
+            }
         }

         /// <summary>
@@ -141,7 +164,7 @@ namespace API.Services.Tasks
             var backupDirectory = Task.Run(() => _unitOfWork.SettingsRepository.GetSettingAsync(ServerSettingKey.BackupDirectory)).Result.Value;
             if (!_directoryService.Exists(backupDirectory)) return;
             var deltaTime = DateTime.Today.Subtract(TimeSpan.FromDays(dayThreshold));
-            var allBackups = _directoryService.GetFiles(backupDirectory).ToList();
+            var allBackups = DirectoryService.GetFiles(backupDirectory).ToList();
             var expiredBackups = allBackups.Select(filename => new FileInfo(filename))
                 .Where(f => f.CreationTime > deltaTime)
                 .ToList();
API/Services/Tasks/CleanupService.cs:

@@ -16,16 +16,14 @@ namespace API.Services.Tasks
         private readonly ILogger<CleanupService> _logger;
         private readonly IBackupService _backupService;
         private readonly IUnitOfWork _unitOfWork;
-        private readonly IDirectoryService _directoryService;

         public CleanupService(ICacheService cacheService, ILogger<CleanupService> logger,
-            IBackupService backupService, IUnitOfWork unitOfWork, IDirectoryService directoryService)
+            IBackupService backupService, IUnitOfWork unitOfWork)
         {
             _cacheService = cacheService;
             _logger = logger;
             _backupService = backupService;
             _unitOfWork = unitOfWork;
-            _directoryService = directoryService;
         }

         public void CleanupCacheDirectory()
@@ -42,7 +40,7 @@ namespace API.Services.Tasks
         {
             _logger.LogInformation("Starting Cleanup");
             _logger.LogInformation("Cleaning temp directory");
-            var tempDirectory = Path.Join(Directory.GetCurrentDirectory(), "temp");
+            var tempDirectory = DirectoryService.TempDirectory;
             DirectoryService.ClearDirectory(tempDirectory);
             CleanupCacheDirectory();
             _logger.LogInformation("Cleaning old database backups");
@@ -57,7 +55,7 @@ namespace API.Services.Tasks
         private async Task DeleteSeriesCoverImages()
         {
             var images = await _unitOfWork.SeriesRepository.GetAllCoverImagesAsync();
-            var files = _directoryService.GetFiles(DirectoryService.CoverImageDirectory, ImageService.SeriesCoverImageRegex);
+            var files = DirectoryService.GetFiles(DirectoryService.CoverImageDirectory, ImageService.SeriesCoverImageRegex);
             foreach (var file in files)
             {
                 if (images.Contains(Path.GetFileName(file))) continue;
@@ -69,7 +67,7 @@ namespace API.Services.Tasks
         private async Task DeleteChapterCoverImages()
         {
             var images = await _unitOfWork.ChapterRepository.GetAllCoverImagesAsync();
-            var files = _directoryService.GetFiles(DirectoryService.CoverImageDirectory, ImageService.ChapterCoverImageRegex);
+            var files = DirectoryService.GetFiles(DirectoryService.CoverImageDirectory, ImageService.ChapterCoverImageRegex);
             foreach (var file in files)
             {
                 if (images.Contains(Path.GetFileName(file))) continue;
@@ -81,7 +79,7 @@ namespace API.Services.Tasks
         private async Task DeleteTagCoverImages()
         {
             var images = await _unitOfWork.CollectionTagRepository.GetAllCoverImagesAsync();
-            var files = _directoryService.GetFiles(DirectoryService.CoverImageDirectory, ImageService.CollectionTagCoverImageRegex);
+            var files = DirectoryService.GetFiles(DirectoryService.CoverImageDirectory, ImageService.CollectionTagCoverImageRegex);
             foreach (var file in files)
             {
                 if (images.Contains(Path.GetFileName(file))) continue;

API/Services/Tasks/Scanner/ParseScannedFiles.cs:

@@ -73,9 +73,13 @@ namespace API.Services.Tasks.Scanner
                 info = Parser.Parser.Parse(path, rootPath, type);
             }

+            // If we couldn't match, log. But don't log if the file parses as a cover image
             if (info == null)
             {
-                _logger.LogWarning("[Scanner] Could not parse series from {Path}", path);
+                if (!(Parser.Parser.IsImage(path) && Parser.Parser.IsCoverImage(path)))
+                {
+                    _logger.LogWarning("[Scanner] Could not parse series from {Path}", path);
+                }
                 return;
             }
@@ -133,13 +137,11 @@ namespace API.Services.Tasks.Scanner
         public string MergeName(ParserInfo info)
         {
             var normalizedSeries = Parser.Parser.Normalize(info.Series);
-            _logger.LogDebug("Checking if we can merge {NormalizedSeries}", normalizedSeries);
             var existingName =
                 _scannedSeries.SingleOrDefault(p => Parser.Parser.Normalize(p.Key.NormalizedName) == normalizedSeries && p.Key.Format == info.Format)
                     .Key;
             if (existingName != null && !string.IsNullOrEmpty(existingName.Name))
             {
-                _logger.LogDebug("Found duplicate parsed infos, merged {Original} into {Merged}", info.Series, existingName.Name);
                 return existingName.Name;
             }

API/Services/Tasks/ScannerService.cs:

@@ -261,13 +261,15 @@ namespace API.Services.Tasks
             var totalTime = 0L;

             // Update existing series
-            _logger.LogDebug("[ScannerService] Updating existing series");
+            _logger.LogInformation("[ScannerService] Updating existing series for {LibraryName}. Total Items: {TotalSize}. Total Chunks: {TotalChunks} with {ChunkSize} size",
+                library.Name, chunkInfo.TotalSize, chunkInfo.TotalChunks, chunkInfo.ChunkSize);
             for (var chunk = 1; chunk <= chunkInfo.TotalChunks; chunk++)
             {
                 if (chunkInfo.TotalChunks == 0) continue;
                 totalTime += stopwatch.ElapsedMilliseconds;
                 stopwatch.Restart();
-                _logger.LogDebug($"[ScannerService] Processing chunk {chunk} / {chunkInfo.TotalChunks} with size {chunkInfo.ChunkSize} Series ({chunk * chunkInfo.ChunkSize} - {(chunk + 1) * chunkInfo.ChunkSize}");
+                _logger.LogInformation("[ScannerService] Processing chunk {ChunkNumber} / {TotalChunks} with size {ChunkSize}. Series ({SeriesStart} - {SeriesEnd}",
+                    chunk, chunkInfo.TotalChunks, chunkInfo.ChunkSize, chunk * chunkInfo.ChunkSize, (chunk + 1) * chunkInfo.ChunkSize);
                 var nonLibrarySeries = await _unitOfWork.SeriesRepository.GetFullSeriesForLibraryIdAsync(library.Id, new UserParams()
                 {
                     PageNumber = chunk,
@@ -299,7 +301,21 @@ namespace API.Services.Tasks
                     UpdateSeries(series, parsedSeries);
                 });

-                await _unitOfWork.CommitAsync();
+                try
+                {
+                    await _unitOfWork.CommitAsync();
+                }
+                catch (Exception ex)
+                {
+                    _logger.LogCritical(ex, "[ScannerService] There was an issue writing to the DB. Chunk {ChunkNumber} did not save to DB. If debug mode, series to check will be printed", chunk);
+                    foreach (var series in nonLibrarySeries)
+                    {
+                        _logger.LogDebug("[ScannerService] There may be a constraint issue with {SeriesName}", series.OriginalName);
+                    }
+                    await _messageHub.Clients.All.SendAsync(SignalREvents.ScanLibraryError,
+                        MessageFactory.ScanLibraryError(library.Id));
+                    continue;
+                }
                 _logger.LogInformation(
                     "[ScannerService] Processed {SeriesStart} - {SeriesEnd} series in {ElapsedScanTime} milliseconds for {LibraryName}",
                     chunk * chunkInfo.ChunkSize, (chunk * chunkInfo.ChunkSize) + nonLibrarySeries.Count, totalTime, library.Name);
@@ -320,12 +336,14 @@ namespace API.Services.Tasks
             _logger.LogDebug("[ScannerService] Adding new series");
             var newSeries = new List<Series>();
             var allSeries = (await _unitOfWork.SeriesRepository.GetSeriesForLibraryIdAsync(library.Id)).ToList();
+            _logger.LogDebug("[ScannerService] Fetched {AllSeriesCount} series for comparing new series with. There should be {DeltaToParsedSeries} new series",
+                allSeries.Count, parsedSeries.Count - allSeries.Count);
             foreach (var (key, infos) in parsedSeries)
             {
                 // Key is normalized already
                 Series existingSeries;
                 try
-                {
+                {// NOTE: Maybe use .Equals() here
                     existingSeries = allSeries.SingleOrDefault(s =>
                         (s.NormalizedName == key.NormalizedName || Parser.Parser.Normalize(s.OriginalName) == key.NormalizedName)
                         && (s.Format == key.Format || s.Format == MangaFormat.Unknown));
@@ -386,7 +404,7 @@ namespace API.Services.Tasks
                 }
             }

-            _logger.LogDebug(
+            _logger.LogInformation(
                 "[ScannerService] Added {NewSeries} series in {ElapsedScanTime} milliseconds for {LibraryName}",
                 newSeries.Count, stopwatch.ElapsedMilliseconds, library.Name);
         }

API/Services/Tasks/StatsService.cs:

@@ -1,6 +1,7 @@
 using System;
 using System.IO;
 using System.Linq;
+using System.Net.Http;
 using System.Runtime.InteropServices;
 using System.Text.Json;
 using System.Threading;
@@ -9,9 +10,11 @@ using API.Data;
 using API.DTOs.Stats;
 using API.Interfaces;
 using API.Interfaces.Services;
-using API.Services.Clients;
+using Flurl.Http;
 using Hangfire;
 using Kavita.Common;
+using Kavita.Common.EnvironmentInfo;
+using Microsoft.AspNetCore.Http;
 using Microsoft.EntityFrameworkCore;
 using Microsoft.Extensions.Logging;
@@ -19,32 +22,65 @@ namespace API.Services.Tasks
 {
     public class StatsService : IStatsService
     {
-        private const string TempFilePath = "stats/";
-        private const string TempFileName = "app_stats.json";
-        private readonly StatsApiClient _client;
+        private const string StatFileName = "app_stats.json";
         private readonly DataContext _dbContext;
         private readonly ILogger<StatsService> _logger;
         private readonly IUnitOfWork _unitOfWork;

-        public StatsService(StatsApiClient client, DataContext dbContext, ILogger<StatsService> logger,
+#pragma warning disable S1075
+        private const string ApiUrl = "http://stats.kavitareader.com";
+#pragma warning restore S1075
+        private static readonly string StatsFilePath = Path.Combine(DirectoryService.StatsDirectory, StatFileName);
+        private static bool FileExists => File.Exists(StatsFilePath);
+
+        public StatsService(DataContext dbContext, ILogger<StatsService> logger,
             IUnitOfWork unitOfWork)
         {
-            _client = client;
             _dbContext = dbContext;
             _logger = logger;
             _unitOfWork = unitOfWork;
         }

-        private static string FinalPath => Path.Combine(Directory.GetCurrentDirectory(), TempFilePath, TempFileName);
-        private static bool FileExists => File.Exists(FinalPath);
-
-        public async Task PathData(ClientInfoDto clientInfoDto)
+        /// <summary>
+        /// Due to all instances firing this at the same time, we can DDOS our server. This task when fired will schedule the task to be run
+        /// randomly over a 6 hour spread
+        /// </summary>
+        public async Task Send()
+        {
+            var allowStatCollection = (await _unitOfWork.SettingsRepository.GetSettingsDtoAsync()).AllowStatCollection;
+            if (!allowStatCollection)
+            {
+                return;
+            }
+
+            var rnd = new Random();
+            var offset = rnd.Next(0, 6);
+            if (offset == 0)
+            {
+                await SendData();
+            }
+            else
+            {
+                _logger.LogInformation("KavitaStats upload has been schedule to run in {Offset} hours", offset);
+                BackgroundJob.Schedule(() => SendData(), DateTimeOffset.Now.AddHours(offset));
+            }
+        }
+
+        /// <summary>
+        /// This must be public for Hangfire. Do not call this directly.
+        /// </summary>
+        // ReSharper disable once MemberCanBePrivate.Global
+        public async Task SendData()
+        {
+            await CollectRelevantData();
+            await FinalizeStats();
+        }
+
+        public async Task RecordClientInfo(ClientInfoDto clientInfoDto)
         {
-            _logger.LogDebug("Pathing client data to the file");
-
             var statisticsDto = await GetData();
             statisticsDto.AddClientInfo(clientInfoDto);

             await SaveFile(statisticsDto);
@@ -52,12 +88,7 @@ namespace API.Services.Tasks
         private async Task CollectRelevantData()
         {
-            _logger.LogDebug("Collecting data from the server and database");
-
-            _logger.LogDebug("Collecting usage info");
             var usageInfo = await GetUsageInfo();
-
-            _logger.LogDebug("Collecting server info");
             var serverInfo = GetServerInfo();

             await PathData(serverInfo, usageInfo);
@@ -67,39 +98,68 @@ namespace API.Services.Tasks
         {
             try
             {
-                _logger.LogDebug("Finalizing Stats collection flow");
-
                 var data = await GetExistingData<UsageStatisticsDto>();
+                var successful = await SendDataToStatsServer(data);

-                _logger.LogDebug("Sending data to the Stats server");
-                await _client.SendDataToStatsServer(data);
-
-                _logger.LogDebug("Deleting the file from disk");
-                if (FileExists) File.Delete(FinalPath);
+                if (successful)
+                {
+                    ResetStats();
+                }
             }
             catch (Exception ex)
             {
-                _logger.LogError(ex, "Error Finalizing Stats collection flow");
-                throw;
+                _logger.LogError(ex, "There was an exception while sending data to KavitaStats");
             }
         }

-        public async Task CollectAndSendStatsData()
+        private async Task<bool> SendDataToStatsServer(UsageStatisticsDto data)
         {
-            var allowStatCollection = (await _unitOfWork.SettingsRepository.GetSettingsDtoAsync()).AllowStatCollection;
-            if (!allowStatCollection)
+            var responseContent = string.Empty;
+
+            try
             {
-                _logger.LogDebug("User has opted out of stat collection, not registering tasks");
-                return;
+                var response = await (ApiUrl + "/api/InstallationStats")
+                    .WithHeader("Accept", "application/json")
+                    .WithHeader("User-Agent", "Kavita")
+                    .WithHeader("x-api-key", "MsnvA2DfQqxSK5jh")
+                    .WithHeader("api-key", "MsnvA2DfQqxSK5jh")
+                    .WithHeader("x-kavita-version", BuildInfo.Version)
+                    .WithTimeout(TimeSpan.FromSeconds(30))
+                    .PostJsonAsync(data);
+
+                if (response.StatusCode != StatusCodes.Status200OK)
+                {
+                    _logger.LogError("KavitaStats did not respond successfully. {Content}", response);
+                    return false;
+                }
+
+                return true;
             }
-
-            await CollectRelevantData();
-            await FinalizeStats();
+            catch (HttpRequestException e)
+            {
+                var info = new
+                {
+                    dataSent = data,
+                    response = responseContent
+                };
+
+                _logger.LogError(e, "KavitaStats did not respond successfully. {Content}", info);
+            }
+            catch (Exception e)
+            {
+                _logger.LogError(e, "An error happened during the request to KavitaStats");
+            }
+
+            return false;
+        }
+
+        private static void ResetStats()
+        {
+            if (FileExists) File.Delete(StatsFilePath);
         }

         private async Task PathData(ServerInfoDto serverInfoDto, UsageInfoDto usageInfoDto)
         {
             _logger.LogDebug("Pathing server and usage info to the file");

             var data = await GetData();
             data.ServerInfo = serverInfoDto;
@@ -110,7 +170,7 @@ namespace API.Services.Tasks
             await SaveFile(data);
         }

-        private async ValueTask<UsageStatisticsDto> GetData()
+        private static async ValueTask<UsageStatisticsDto> GetData()
         {
             if (!FileExists) return new UsageStatisticsDto {InstallId = HashUtil.AnonymousToken()};
@@ -156,39 +216,17 @@ namespace API.Services.Tasks
             return serverInfo;
         }

-        private async Task<T> GetExistingData<T>()
+        private static async Task<T> GetExistingData<T>()
         {
-            _logger.LogInformation("Fetching existing data from file");
-            var existingDataJson = await GetFileDataAsString();
-
-            _logger.LogInformation("Deserializing data from file to object");
-            var existingData = JsonSerializer.Deserialize<T>(existingDataJson);
-
-            return existingData;
+            var json = await File.ReadAllTextAsync(StatsFilePath);
+
+            return JsonSerializer.Deserialize<T>(json);
         }

-        private async Task<string> GetFileDataAsString()
+        private static async Task SaveFile(UsageStatisticsDto statisticsDto)
         {
-            _logger.LogInformation("Reading file from disk");
-            return await File.ReadAllTextAsync(FinalPath);
-        }
+            DirectoryService.ExistOrCreate(DirectoryService.StatsDirectory);

-        private async Task SaveFile(UsageStatisticsDto statisticsDto)
-        {
-            _logger.LogDebug("Saving file");
-
-            var finalDirectory = FinalPath.Replace(TempFileName, string.Empty);
-            if (!Directory.Exists(finalDirectory))
-            {
-                _logger.LogDebug("Creating tmp directory");
-                Directory.CreateDirectory(finalDirectory);
-            }
-
-            _logger.LogDebug("Serializing data to write");
-            var dataJson = JsonSerializer.Serialize(statisticsDto);
-
-            _logger.LogDebug("Writing file to the disk");
-            await File.WriteAllTextAsync(FinalPath, dataJson);
+            await File.WriteAllTextAsync(StatsFilePath, JsonSerializer.Serialize(statisticsDto));
         }
     }
 }