parent 59b01d7106
commit 343e12a94d
|
@ -7,6 +7,35 @@ title: Changelog
|
|||
!!! note
|
||||
This is the new changelog, only the most recent builds. For all versions, see the [old changelog](old_changelog.html).
|
||||
|
||||
## [Version 546](https://github.com/hydrusnetwork/hydrus/releases/tag/v546)
|
||||
|
||||
### misc
|
||||
|
||||
* fixed the recently messed-up colours in PSD thumbnail generation. I enthusiastically 'fixed' a problem with greyscale PSD thumbs at the last minute last week and accidentally swapped the RGB colour channels on coloured ones. I changed the badly named method that caused this mixup, and all existing PSD thumbs will be regenerated (issue #1448)
|
||||
* fixed up some borked button-enabling and status-displaying logic in the file history chart. the cancel process should work properly on repeat now
|
||||
* made two logical fixes to the archive count in the new file history chart when you have a specific search--archive times for files you deleted are now included properly, and files that are not eligible for archiving are excluded from the initial count. this _should_ make the inbox and archive lines, which were often way too high during specific searches, a little better behaved. let me know what you see!
|
||||
* added a checkbox to _options->thumbnails_ to turn off the new blurhash thumbnail fallback
|
||||
* 'this has exif data, the other does not' statements are now calculated from cached knowledge--loading pairs in the duplicate filter should be faster now
|
||||
* some larger image files with clever metadata should import just a little faster now
|
||||
* if the process isn't explicitly frozen into an executable or a macOS App, it is now considered 'running from source'. various unusual 'running from source' modes (e.g. booting from various scripts that mess with argv) should now be recognised better
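
A rough sketch of the frozen-executable check described above (the helper name and the exact attributes consulted are illustrative assumptions, not hydrus's actual code):

```python
import sys

def looks_like_running_from_source():
    
    # PyInstaller and similar freezers set sys.frozen on the running process;
    # a macOS App bundle can be spotted from the path of the running binary
    is_frozen = bool( getattr( sys, 'frozen', False ) )
    is_macos_app = '.app/Contents/MacOS' in sys.executable
    
    # anything not explicitly frozen or bundled counts as 'running from source'
    return not ( is_frozen or is_macos_app )
    
```

Anything booted through an unusual script wrapper still lands in the `return not ( … )` fallthrough, which is the point of the change.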
|
||||
|
||||
### boring code cleanup
|
||||
|
||||
* moved 'recent tags' code to a new client db module
|
||||
* moved ratings code to a new client db module
|
||||
* moved some db integrity checking code to the db maintenance module
|
||||
* moved the orphan table checking code to the db maintenance module
|
||||
* fixed the orphan table checking code, which was under-detecting orphan tables
|
||||
* moved some final references to sibling/parent tables from main db method to sibling and parent modules
|
||||
* moved most of the image metadata functions (exif, icc profile, human-readable, subsampling, quantization quality estimate) to a new `HydrusImageMetadata` file
|
||||
* moved the new blurhash methods to a new `HydrusBlurhash` file
|
||||
* moved various normalisation routines to a new `HydrusImageNormalisation` file
|
||||
* moved various channel scanning and adjusting code to a new `HydrusImageColours` file
|
||||
* moved the hydrus image files to the new 'hydrus.core.images' module
|
||||
* cleaned up some image loading code
|
||||
* deleted ancient and no-longer-used client db code regarding imageboard definitions, status texts, and more
|
||||
* removed the ancient `OPENCV_OK` fallback code, which was only used, superfluously, in a couple of final places. OpenCV is not optional to run hydrus, server or client
|
||||
|
||||
## [Version 545](https://github.com/hydrusnetwork/hydrus/releases/tag/v545)
|
||||
|
||||
### blurhash
|
||||
|
@ -382,37 +411,3 @@ title: Changelog
|
|||
* silenced the long-running logspam that often happens when generating flash thumbnails
|
||||
* fixed a stupid typo error in the routine that schedules downloading files from file repositories
|
||||
* `nose`, `six`, and `zope` are no longer in any of the requirements.txts. I think these were needed a million years ago as PyInstaller hacks, but the situation is much better these days
|
||||
|
||||
## [Version 536](https://github.com/hydrusnetwork/hydrus/releases/tag/v536)
|
||||
|
||||
### more new filetypes
|
||||
|
||||
* thanks to a user, we have XCF and gzip filetype support!
|
||||
* I rejiggered the new SVG support so there is a firmer server/client split. the new tech needs Qt, which broke the headless Docker server last week at the last minute--now the server has some sensible stubs that safely revert to the default svg thumb and give unknown resolution, and the client patches in full support dynamically
|
||||
* the new SVG code now supports the 'scale to fill' thumbnail option
|
||||
|
||||
### misc
|
||||
|
||||
* I fixed the issue that was causing tags to stay in the tag autocomplete lookup despite going to 0 count. it should not happen for new cases, and **on update, a database routine will run to remove all your existing orphans. if you have ever synced with the PTR, it will take several minutes to run!**
|
||||
* sending the command to set a file as the best in its duplicate group now presents a yes/no dialog to confirm
|
||||
* hitting the shortcut for 'set the focused file as better than the other(s)' when you only have one file now asks if you just want to set that file as the best of its group
|
||||
* fixed an erroneous 'cannot show the best quality file of this file's group here' label in the file relationships menu--a count was off
|
||||
* fixed the 'set up a hydrus.desktop file' setup script to point to the new hydrus_client.sh startup script name
|
||||
* thanks to a user, a situation where certain unhandled URLs that deliver JSON were being parsed as mpegs by ffmpeg and causing a weird loop is now caught and stopped. more investigation is needed to fix it properly
|
||||
|
||||
### boring stuff
|
||||
|
||||
* when a problem or file maintenance job causes a new file maintenance job to be queued (e.g. if the client in a metadata scan discovers the resolution of a file was not as expected, let's say it now recognises EXIF rotation, and starts a secondary thumbnail regen job), it now wakes the file maintenance manager immediately, which should help clear out and update for these jobs quickly when you are looking at the problem thumbnails
|
||||
* if you have an image type set to show as an 'open externally' button in the media viewer, then it is now no longer prefetched in the rendering system!
|
||||
* I added a very simple .editorconfig file for the project. since we have a variety of weird files in the directory tree, I've made it cautious and python-specific to start with. we'll expand as needed
|
||||
* I moved the similar files search tree and maintenance tracker from client.caches.db to client.db. while the former table is regeneratable, it isn't a cache or precomputation store, _per se_, so I finally agreed to move it to the main db. if you have a giganto database, it may take an extra minute to update
|
||||
* added a 'requirements_server.txt' to the advanced requirements.txts directory, just for future reference, and trimmed the Server Dockerfile down to reflect it
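
A cautious, python-only .editorconfig along the lines described might look like this (the exact keys chosen here are an assumption, not the file from the commit):

```ini
# top-most EditorConfig file for the repository
root = true

# start cautious: only constrain python files
[*.py]
charset = utf-8
indent_style = space
indent_size = 4
end_of_line = lf
insert_final_newline = true
```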
|
||||
|
||||
### client api
|
||||
|
||||
* thanks to a user, fixed a really stupid typo in the Client API when sending the 'file_id' parameter to set the file
|
||||
* wrote unit tests for the file_id and file_ids parameters to stop this sort of mistake in future
|
||||
* if you attempt to delete a file over the Client API when one of the given files is delete-locked (this is an advanced option that stops deletion of any archived file), the request now returns a 409 Conflict response, saying which hashes were bad, and does not delete anything
|
||||
* wrote a unit test to catch the new delete lock test
|
||||
* deleted the old-and-deprecated-in-one-week 'pair_rows' parameter-handling code in the set_file_relationships command
|
||||
* the client api version is now 49
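
A hedged sketch of driving the new delete-lock behaviour from a client: the endpoint path and access-key header follow the published Client API, while the default port and the exact shape of the 409 body are assumptions.

```python
import json
import urllib.request

API_ROOT = 'http://127.0.0.1:45869'  # the Client API's default port

def build_delete_request( access_key, hashes ):
    
    # POST /add_files/delete_files deletes files by sha256 hash
    return urllib.request.Request(
        API_ROOT + '/add_files/delete_files',
        data = json.dumps( { 'hashes' : hashes } ).encode( 'utf-8' ),
        headers = {
            'Hydrus-Client-API-Access-Key' : access_key,
            'Content-Type' : 'application/json'
        }
    )
    

def deletion_was_blocked( status_code ):
    
    # as of this version, a 409 Conflict means at least one of the given
    # files was delete-locked and nothing was deleted
    return status_code == 409
    
```

On a 409, the caller should re-read the response body for the offending hashes and retry without them, since the whole request is rejected atomically.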
|
||||
|
|
|
@ -34,6 +34,34 @@
|
|||
<div class="content">
|
||||
<h1 id="changelog"><a href="#changelog">changelog</a></h1>
|
||||
<ul>
|
||||
<li>
|
||||
<h2 id="version_546"><a href="#version_546">version 546</a></h2>
|
||||
<ul>
|
||||
<li><h3>misc</h3></li>
|
||||
<li>fixed the recently messed-up colours in PSD thumbnail generation. I enthusiastically 'fixed' a problem with greyscale PSD thumbs at the last minute last week and accidentally swapped the RGB colour channels on coloured ones. I changed the badly named method that caused this mixup, and all existing PSD thumbs will be regenerated (issue #1448)</li>
|
||||
<li>fixed up some borked button-enabling and status-displaying logic in the file history chart. the cancel process should work properly on repeat now</li>
|
||||
<li>made two logical fixes to the archive count in the new file history chart when you have a specific search--archive times for files you deleted are now included properly, and files that are not eligible for archiving are excluded from the initial count. this _should_ make the inbox and archive lines, which were often way too high during specific searches, a little better behaved. let me know what you see!</li>
|
||||
<li>added a checkbox to _options->thumbnails_ to turn off the new blurhash thumbnail fallback</li>
|
||||
<li>'this has exif data, the other does not' statements are now calculated from cached knowledge--loading pairs in the duplicate filter should be faster now</li>
|
||||
<li>some larger image files with clever metadata should import just a little faster now</li>
|
||||
<li>if the process isn't explicitly frozen into an executable or a macOS App, it is now considered 'running from source'. various unusual 'running from source' modes (e.g. booting from various scripts that mess with argv) should now be recognised better</li>
|
||||
<li><h3>boring code cleanup</h3></li>
|
||||
<li>moved 'recent tags' code to a new client db module</li>
|
||||
<li>moved ratings code to a new client db module</li>
|
||||
<li>moved some db integrity checking code to the db maintenance module</li>
|
||||
<li>moved the orphan table checking code to the db maintenance module</li>
|
||||
<li>fixed the orphan table checking code, which was under-detecting orphan tables</li>
|
||||
<li>moved some final references to sibling/parent tables from main db method to sibling and parent modules</li>
|
||||
<li>moved most of the image metadata functions (exif, icc profile, human-readable, subsampling, quantization quality estimate) to a new `HydrusImageMetadata` file</li>
|
||||
<li>moved the new blurhash methods to a new `HydrusBlurhash` file</li>
|
||||
<li>moved various normalisation routines to a new `HydrusImageNormalisation` file</li>
|
||||
<li>moved various channel scanning and adjusting code to a new `HydrusImageColours` file</li>
|
||||
<li>moved the hydrus image files to the new 'hydrus.core.images' module</li>
|
||||
<li>cleaned up some image loading code</li>
|
||||
<li>deleted ancient and no-longer-used client db code regarding imageboard definitions, status texts, and more</li>
|
||||
<li>removed the ancient `OPENCV_OK` fallback code, which was only used, superfluously, in a couple of final places. OpenCV is not optional to run hydrus, server or client</li>
|
||||
</ul>
|
||||
</li>
|
||||
<li>
|
||||
<h2 id="version_545"><a href="#version_545">version 545</a></h2>
|
||||
<ul>
|
||||
|
|
|
@ -7,10 +7,12 @@ from hydrus.core import HydrusConstants as HC
|
|||
from hydrus.core import HydrusData
|
||||
from hydrus.core import HydrusExceptions
|
||||
from hydrus.core import HydrusGlobals as HG
|
||||
from hydrus.core import HydrusImageHandling
|
||||
from hydrus.core import HydrusSerialisable
|
||||
from hydrus.core import HydrusTags
|
||||
from hydrus.core import HydrusTime
|
||||
from hydrus.core.images import HydrusImageHandling
|
||||
from hydrus.core.images import HydrusImageMetadata
|
||||
from hydrus.core.images import HydrusImageOpening
|
||||
|
||||
from hydrus.client import ClientConstants as CC
|
||||
from hydrus.client import ClientThreading
|
||||
|
@ -425,14 +427,36 @@ def GetDuplicateComparisonStatements( shown_media, comparison_media ):
|
|||
|
||||
path = HG.client_controller.client_files_manager.GetFilePath( s_hash, s_mime )
|
||||
|
||||
hashes_to_jpeg_quality[ s_hash ] = HydrusImageHandling.GetJPEGQuantizationQualityEstimate( path )
|
||||
try:
|
||||
|
||||
raw_pil_image = HydrusImageOpening.RawOpenPILImage( path )
|
||||
|
||||
result = HydrusImageMetadata.GetJPEGQuantizationQualityEstimate( raw_pil_image )
|
||||
|
||||
except:
|
||||
|
||||
result = ( 'unknown', None )
|
||||
|
||||
|
||||
hashes_to_jpeg_quality[ s_hash ] = result
|
||||
|
||||
|
||||
if c_hash not in hashes_to_jpeg_quality:
|
||||
|
||||
path = HG.client_controller.client_files_manager.GetFilePath( c_hash, c_mime )
|
||||
|
||||
hashes_to_jpeg_quality[ c_hash ] = HydrusImageHandling.GetJPEGQuantizationQualityEstimate( path )
|
||||
try:
|
||||
|
||||
raw_pil_image = HydrusImageOpening.RawOpenPILImage( path )
|
||||
|
||||
result = HydrusImageMetadata.GetJPEGQuantizationQualityEstimate( raw_pil_image )
|
||||
|
||||
except:
|
||||
|
||||
result = ( 'unknown', None )
|
||||
|
||||
|
||||
hashes_to_jpeg_quality[ c_hash ] = result
|
||||
|
||||
|
||||
( s_label, s_jpeg_quality ) = hashes_to_jpeg_quality[ s_hash ]
|
||||
|
@ -489,9 +513,9 @@ def GetDuplicateComparisonStatements( shown_media, comparison_media ):
|
|||
|
||||
path = HG.client_controller.client_files_manager.GetFilePath( hash, mime )
|
||||
|
||||
pil_image = HydrusImageHandling.RawOpenPILImage( path )
|
||||
raw_pil_image = HydrusImageOpening.RawOpenPILImage( path )
|
||||
|
||||
exif_dict = HydrusImageHandling.GetEXIFDict( pil_image )
|
||||
exif_dict = HydrusImageMetadata.GetEXIFDict( raw_pil_image )
|
||||
|
||||
if exif_dict is None:
|
||||
|
||||
|
@ -506,8 +530,8 @@ def GetDuplicateComparisonStatements( shown_media, comparison_media ):
|
|||
|
||||
|
||||
|
||||
s_has_exif = has_exif( shown_media )
|
||||
c_has_exif = has_exif( comparison_media )
|
||||
s_has_exif = shown_media.GetFileInfoManager().has_exif
|
||||
c_has_exif = comparison_media.GetFileInfoManager().has_exif
|
||||
|
||||
if s_has_exif ^ c_has_exif:
|
||||
|
||||
|
@ -523,8 +547,8 @@ def GetDuplicateComparisonStatements( shown_media, comparison_media ):
|
|||
statements_and_scores[ 'exif_data' ] = ( exif_statement, 0 )
|
||||
|
||||
|
||||
s_has_human_readable_embedded_metadata = shown_media.GetMediaResult().GetFileInfoManager().has_human_readable_embedded_metadata
|
||||
c_has_human_readable_embedded_metadata = comparison_media.GetMediaResult().GetFileInfoManager().has_human_readable_embedded_metadata
|
||||
s_has_human_readable_embedded_metadata = shown_media.GetFileInfoManager().has_human_readable_embedded_metadata
|
||||
c_has_human_readable_embedded_metadata = comparison_media.GetFileInfoManager().has_human_readable_embedded_metadata
|
||||
|
||||
if s_has_human_readable_embedded_metadata ^ c_has_human_readable_embedded_metadata:
|
||||
|
||||
|
|
|
@ -10,12 +10,15 @@ from hydrus.core import HydrusData
|
|||
from hydrus.core import HydrusExceptions
|
||||
from hydrus.core import HydrusFileHandling
|
||||
from hydrus.core import HydrusGlobals as HG
|
||||
from hydrus.core import HydrusImageHandling
|
||||
from hydrus.core import HydrusPSDHandling
|
||||
from hydrus.core import HydrusLists
|
||||
from hydrus.core import HydrusPaths
|
||||
from hydrus.core import HydrusThreading
|
||||
from hydrus.core import HydrusTime
|
||||
from hydrus.core.images import HydrusBlurhash
|
||||
from hydrus.core.images import HydrusImageHandling
|
||||
from hydrus.core.images import HydrusImageMetadata
|
||||
from hydrus.core.images import HydrusImageOpening
|
||||
from hydrus.core.networking import HydrusNetworking
|
||||
|
||||
from hydrus.client import ClientConstants as CC
|
||||
|
@ -1740,7 +1743,7 @@ class ClientFilesManager( object ):
|
|||
thumbnail_scale_type = self._controller.new_options.GetInteger( 'thumbnail_scale_type' )
|
||||
thumbnail_dpr_percent = HG.client_controller.new_options.GetInteger( 'thumbnail_dpr_percent' )
|
||||
|
||||
( clip_rect, ( expected_width, expected_height ) ) = HydrusImageHandling.GetThumbnailResolutionAndClipRegion( ( media_width, media_height ), bounding_dimensions, thumbnail_scale_type, thumbnail_dpr_percent )
|
||||
( clip_rect, ( expected_width, expected_height ) ) = HydrusImageHandling.GetThumbnailResolutionAndClipRegion( (media_width, media_height), bounding_dimensions, thumbnail_scale_type, thumbnail_dpr_percent )
|
||||
|
||||
if current_width != expected_width or current_height != expected_height:
|
||||
|
||||
|
@ -1812,7 +1815,16 @@ def HasHumanReadableEmbeddedMetadata( path, mime ):
|
|||
|
||||
else:
|
||||
|
||||
has_human_readable_embedded_metadata = HydrusImageHandling.HasHumanReadableEmbeddedMetadata( path )
|
||||
try:
|
||||
|
||||
pil_image = HydrusImageOpening.RawOpenPILImage( path )
|
||||
|
||||
except:
|
||||
|
||||
return False
|
||||
|
||||
|
||||
has_human_readable_embedded_metadata = HydrusImageMetadata.HasHumanReadableEmbeddedMetadata( pil_image )
|
||||
|
||||
|
||||
return has_human_readable_embedded_metadata
|
||||
|
@ -2213,7 +2225,16 @@ class FilesMaintenanceManager( object ):
|
|||
|
||||
path = self._controller.client_files_manager.GetFilePath( hash, mime )
|
||||
|
||||
has_exif = HydrusImageHandling.HasEXIF( path )
|
||||
try:
|
||||
|
||||
raw_pil_image = HydrusImageOpening.RawOpenPILImage( path )
|
||||
|
||||
has_exif = HydrusImageMetadata.HasEXIF( raw_pil_image )
|
||||
|
||||
except:
|
||||
|
||||
has_exif = False
|
||||
|
||||
|
||||
additional_data = has_exif
|
||||
|
||||
|
@ -2259,34 +2280,35 @@ class FilesMaintenanceManager( object ):
|
|||
if mime not in HC.FILES_THAT_CAN_HAVE_ICC_PROFILE:
|
||||
|
||||
return False
|
||||
|
||||
|
||||
try:
|
||||
|
||||
path = self._controller.client_files_manager.GetFilePath( hash, mime )
|
||||
|
||||
if mime == HC.APPLICATION_PSD:
|
||||
|
||||
|
||||
try:
|
||||
|
||||
has_icc_profile = HydrusPSDHandling.PSDHasICCProfile(path)
|
||||
|
||||
|
||||
has_icc_profile = HydrusPSDHandling.PSDHasICCProfile( path )
|
||||
|
||||
except:
|
||||
|
||||
|
||||
return None
|
||||
|
||||
|
||||
else:
|
||||
|
||||
|
||||
try:
|
||||
|
||||
pil_image = HydrusImageHandling.RawOpenPILImage( path )
|
||||
raw_pil_image = HydrusImageOpening.RawOpenPILImage( path )
|
||||
|
||||
except:
|
||||
|
||||
return None
|
||||
|
||||
|
||||
has_icc_profile = HydrusImageMetadata.HasICCProfile( raw_pil_image )
|
||||
|
||||
|
||||
has_icc_profile = HydrusImageHandling.HasICCProfile( pil_image )
|
||||
|
||||
|
||||
additional_data = has_icc_profile
|
||||
|
||||
|
@ -2494,7 +2516,7 @@ class FilesMaintenanceManager( object ):
|
|||
|
||||
numpy_image = ClientImageHandling.GenerateNumPyImage( thumbnail_path, thumbnail_mime )
|
||||
|
||||
return HydrusImageHandling.GetBlurhashFromNumPy( numpy_image )
|
||||
return HydrusBlurhash.GetBlurhashFromNumPy( numpy_image )
|
||||
|
||||
except:
|
||||
|
||||
|
|
|
@ -7,9 +7,8 @@ import cv2
|
|||
|
||||
from hydrus.client import ClientConstants as CC
|
||||
from hydrus.core import HydrusData
|
||||
from hydrus.core import HydrusImageHandling
|
||||
from hydrus.core import HydrusGlobals as HG
|
||||
from hydrus.core import HydrusTime
|
||||
from hydrus.core.images import HydrusImageHandling
|
||||
|
||||
cv_interpolation_enum_lookup = {}
|
||||
|
||||
|
|
|
@ -8,7 +8,6 @@ from hydrus.core import HydrusGlobals as HG
|
|||
from hydrus.core import HydrusData
|
||||
from hydrus.core import HydrusSerialisable
|
||||
from hydrus.core import HydrusTags
|
||||
from hydrus.core import HydrusTime
|
||||
|
||||
from hydrus.client import ClientConstants as CC
|
||||
from hydrus.client import ClientDefaults
|
||||
|
@ -295,6 +294,8 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):
|
|||
self._dictionary[ 'booleans' ][ 'hide_uninteresting_local_import_time' ] = True
|
||||
self._dictionary[ 'booleans' ][ 'hide_uninteresting_modified_time' ] = True
|
||||
|
||||
self._dictionary[ 'booleans' ][ 'allow_blurhash_fallback' ] = True
|
||||
|
||||
from hydrus.client.gui.canvas import ClientGUIMPV
|
||||
|
||||
self._dictionary[ 'booleans' ][ 'mpv_available_at_start' ] = ClientGUIMPV.MPV_IS_AVAILABLE
|
||||
|
@ -438,7 +439,7 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):
|
|||
self._dictionary[ 'integers' ][ 'max_connection_attempts_allowed' ] = 5
|
||||
self._dictionary[ 'integers' ][ 'max_request_attempts_allowed_get' ] = 5
|
||||
|
||||
from hydrus.core import HydrusImageHandling
|
||||
from hydrus.core.images import HydrusImageHandling
|
||||
|
||||
self._dictionary[ 'integers' ][ 'thumbnail_scale_type' ] = HydrusImageHandling.THUMBNAIL_SCALE_DOWN_ONLY
|
||||
|
||||
|
|
|
@ -22,8 +22,8 @@ from qtpy import QtCore as QC
|
|||
|
||||
from hydrus.core import HydrusData
|
||||
from hydrus.core import HydrusExceptions
|
||||
from hydrus.core import HydrusImageHandling
|
||||
from hydrus.core import HydrusPDFHandling
|
||||
from hydrus.core.images import HydrusImageHandling
|
||||
|
||||
from hydrus.client.gui import ClientGUIFunctions
|
||||
|
||||
|
|
|
@ -5,17 +5,16 @@ import time
|
|||
import typing
|
||||
|
||||
from qtpy import QtCore as QC
|
||||
from qtpy import QtWidgets as QW
|
||||
from qtpy import QtGui as QG
|
||||
|
||||
from hydrus.core import HydrusAnimationHandling
|
||||
from hydrus.core import HydrusCompression
|
||||
from hydrus.core import HydrusConstants as HC
|
||||
from hydrus.core import HydrusData
|
||||
from hydrus.core import HydrusImageHandling
|
||||
from hydrus.core import HydrusGlobals as HG
|
||||
from hydrus.core import HydrusTime
|
||||
from hydrus.core import HydrusVideoHandling
|
||||
from hydrus.core.images import HydrusImageColours
|
||||
from hydrus.core.images import HydrusImageHandling
|
||||
|
||||
from hydrus.client import ClientFiles
|
||||
from hydrus.client import ClientImageHandling
|
||||
|
@ -464,7 +463,7 @@ class ImageRenderer( ClientCachesBase.CacheableObject ):
|
|||
raise Exception( 'I cannot know this yet--the image is not ready!' )
|
||||
|
||||
|
||||
return HydrusImageHandling.NumPyImageHasAlphaChannel( self._numpy_image )
|
||||
return HydrusImageColours.NumPyImageHasAlphaChannel( self._numpy_image )
|
||||
|
||||
|
||||
def IsReady( self ):
|
||||
|
|
|
@ -5,8 +5,8 @@ from qtpy import QtGui as QG
|
|||
from qtpy import QtCore as QC
|
||||
|
||||
from hydrus.core import HydrusExceptions
|
||||
from hydrus.core import HydrusImageHandling
|
||||
from hydrus.core import HydrusSVGHandling
|
||||
from hydrus.core.images import HydrusImageHandling
|
||||
|
||||
from hydrus.client.gui import ClientGUIFunctions
|
||||
|
||||
|
|
|
@ -2,7 +2,6 @@ import collections
|
|||
import cv2
|
||||
import numpy
|
||||
import os
|
||||
import shutil
|
||||
import struct
|
||||
|
||||
from qtpy import QtCore as QC
|
||||
|
@ -12,15 +11,13 @@ from qtpy import QtWidgets as QW
|
|||
from hydrus.core import HydrusCompression
|
||||
from hydrus.core import HydrusData
|
||||
from hydrus.core import HydrusGlobals as HG
|
||||
from hydrus.core import HydrusImageHandling
|
||||
from hydrus.core import HydrusPaths
|
||||
from hydrus.core import HydrusSerialisable
|
||||
from hydrus.core import HydrusTemp
|
||||
from hydrus.core import HydrusTime
|
||||
from hydrus.core.images import HydrusImageHandling
|
||||
|
||||
from hydrus.client import ClientConstants as CC
|
||||
from hydrus.client.gui import ClientGUIFunctions
|
||||
from hydrus.client.gui import QtPorting as QP
|
||||
|
||||
# ok, the serialised png format is:
|
||||
|
||||
|
@ -268,7 +265,7 @@ def LoadFromPNG( path ):
|
|||
|
||||
try:
|
||||
|
||||
# dequantize = False because we don't want to convert to RGB
|
||||
# dequantize = False because we don't want to convert our greyscale bytes to RGB
|
||||
|
||||
pil_image = HydrusImageHandling.GeneratePILImage( temp_path, dequantize = False )
|
||||
|
||||
|
|
|
@ -1,12 +1,10 @@
|
|||
import numpy.core.multiarray # important this comes before cv!
|
||||
|
||||
import cv2
|
||||
|
||||
from hydrus.core import HydrusData
|
||||
from hydrus.core import HydrusExceptions
|
||||
from hydrus.core import HydrusGlobals as HG
|
||||
from hydrus.core import HydrusImageHandling
|
||||
from hydrus.core import HydrusTime
|
||||
from hydrus.core.images import HydrusImageHandling
|
||||
from hydrus.core.images import HydrusImageNormalisation
|
||||
|
||||
if cv2.__version__.startswith( '2' ):
|
||||
|
||||
|
@ -26,6 +24,7 @@ else:
|
|||
CAP_PROP_CONVERT_RGB = cv2.CAP_PROP_CONVERT_RGB
|
||||
CAP_PROP_POS_FRAMES = cv2.CAP_PROP_POS_FRAMES
|
||||
|
||||
|
||||
def GetCVVideoProperties( path ):
|
||||
|
||||
capture = cv2.VideoCapture( path )
|
||||
|
@ -92,7 +91,7 @@ class GIFRenderer( object ):
|
|||
|
||||
else:
|
||||
|
||||
current_frame = HydrusImageHandling.DequantizePILImage( self._pil_image )
|
||||
current_frame = HydrusImageNormalisation.DequantizePILImage( self._pil_image )
|
||||
|
||||
if current_frame.mode == 'RGBA':
|
||||
|
||||
|
|
|
@ -3,16 +3,16 @@ import json
|
|||
import os
|
||||
import threading
|
||||
import time
|
||||
import typing
|
||||
|
||||
from hydrus.core import HydrusConstants as HC
|
||||
from hydrus.core import HydrusExceptions
|
||||
from hydrus.core import HydrusFileHandling
|
||||
from hydrus.core import HydrusImageHandling
|
||||
from hydrus.core import HydrusThreading
|
||||
from hydrus.core import HydrusData
|
||||
from hydrus.core import HydrusGlobals as HG
|
||||
from hydrus.core import HydrusTime
|
||||
from hydrus.core.images import HydrusBlurhash
|
||||
from hydrus.core.images import HydrusImageHandling
|
||||
|
||||
from hydrus.client import ClientConstants as CC
|
||||
from hydrus.client import ClientFiles
|
||||
|
@ -443,6 +443,8 @@ class ThumbnailCache( object ):
|
|||
self._delayed_regeneration_queue_quick = set()
|
||||
self._delayed_regeneration_queue = []
|
||||
|
||||
self._allow_blurhash_fallback = self._controller.new_options.GetBoolean( 'allow_blurhash_fallback' )
|
||||
|
||||
self._waterfall_event = threading.Event()
|
||||
|
||||
self._special_thumbs = {}
|
||||
|
@ -458,29 +460,32 @@ class ThumbnailCache( object ):
|
|||
|
||||
def _GetBestRecoveryThumbnailHydrusBitmap( self, display_media ):
|
||||
|
||||
blurhash = display_media.GetFileInfoManager().blurhash
|
||||
|
||||
if blurhash is not None:
|
||||
if self._allow_blurhash_fallback:
|
||||
|
||||
try:
|
||||
blurhash = display_media.GetFileInfoManager().blurhash
|
||||
|
||||
if blurhash is not None:
|
||||
|
||||
( media_width, media_height ) = display_media.GetResolution()
|
||||
|
||||
bounding_dimensions = self._controller.options[ 'thumbnail_dimensions' ]
|
||||
thumbnail_scale_type = self._controller.new_options.GetInteger( 'thumbnail_scale_type' )
|
||||
thumbnail_dpr_percent = HG.client_controller.new_options.GetInteger( 'thumbnail_dpr_percent' )
|
||||
|
||||
( clip_rect, ( expected_width, expected_height ) ) = HydrusImageHandling.GetThumbnailResolutionAndClipRegion( ( media_width, media_height ), bounding_dimensions, thumbnail_scale_type, thumbnail_dpr_percent )
|
||||
|
||||
numpy_image = HydrusImageHandling.GetNumpyFromBlurhash( blurhash, expected_width, expected_height )
|
||||
|
||||
hydrus_bitmap = ClientRendering.GenerateHydrusBitmapFromNumPyImage( numpy_image )
|
||||
|
||||
return hydrus_bitmap
|
||||
|
||||
except:
|
||||
|
||||
pass
|
||||
try:
|
||||
|
||||
( media_width, media_height ) = display_media.GetResolution()
|
||||
|
||||
bounding_dimensions = self._controller.options[ 'thumbnail_dimensions' ]
|
||||
thumbnail_scale_type = self._controller.new_options.GetInteger( 'thumbnail_scale_type' )
|
||||
thumbnail_dpr_percent = HG.client_controller.new_options.GetInteger( 'thumbnail_dpr_percent' )
|
||||
|
||||
( clip_rect, ( expected_width, expected_height ) ) = HydrusImageHandling.GetThumbnailResolutionAndClipRegion( ( media_width, media_height ), bounding_dimensions, thumbnail_scale_type, thumbnail_dpr_percent )
|
||||
|
||||
numpy_image = HydrusBlurhash.GetNumpyFromBlurhash( blurhash, expected_width, expected_height )
|
||||
|
||||
hydrus_bitmap = ClientRendering.GenerateHydrusBitmapFromNumPyImage( numpy_image )
|
||||
|
||||
return hydrus_bitmap
|
||||
|
||||
except:
|
||||
|
||||
pass
|
||||
|
||||
|
||||
|
||||
|
||||
|
@ -953,6 +958,15 @@ class ThumbnailCache( object ):
|
|||
|
||||
self._data_cache.SetCacheSizeAndTimeout( cache_size, cache_timeout )
|
||||
|
||||
allow_blurhash_fallback = self._controller.new_options.GetBoolean( 'allow_blurhash_fallback' )
|
||||
|
||||
if allow_blurhash_fallback != self._allow_blurhash_fallback:
|
||||
|
||||
self._allow_blurhash_fallback = allow_blurhash_fallback
|
||||
|
||||
self.Clear()
|
||||
|
||||
|
||||
|
||||
def Waterfall( self, page_key, medias ):
|
||||
|
||||
|
|
|
@ -35,6 +35,7 @@ from hydrus.client import ClientServices
|
|||
from hydrus.client import ClientThreading
|
||||
from hydrus.client import ClientTime
|
||||
from hydrus.client.db import ClientDBDefinitionsCache
|
||||
from hydrus.client.db import ClientDBFileDeleteLock
|
||||
from hydrus.client.db import ClientDBFilesDuplicates
|
||||
from hydrus.client.db import ClientDBFilesInbox
|
||||
from hydrus.client.db import ClientDBFilesMaintenance
|
||||
|
@ -56,6 +57,7 @@ from hydrus.client.db import ClientDBMappingsCountsUpdate
|
|||
from hydrus.client.db import ClientDBMappingsStorage
|
||||
from hydrus.client.db import ClientDBMaster
|
||||
from hydrus.client.db import ClientDBNotesMap
|
||||
from hydrus.client.db import ClientDBRatings
|
||||
from hydrus.client.db import ClientDBRepositories
|
||||
from hydrus.client.db import ClientDBSerialisable
|
||||
from hydrus.client.db import ClientDBServicePaths
|
||||
|
@ -65,6 +67,7 @@ from hydrus.client.db import ClientDBTagDisplay
|
|||
from hydrus.client.db import ClientDBTagParents
|
||||
from hydrus.client.db import ClientDBTagSearch
|
||||
from hydrus.client.db import ClientDBTagSiblings
|
||||
from hydrus.client.db import ClientDBTagSuggestions
|
||||
from hydrus.client.db import ClientDBURLMap
|
||||
from hydrus.client.importing import ClientImportFiles
|
||||
from hydrus.client.media import ClientMediaManagers
|
||||
|
@ -1040,66 +1043,6 @@ class DB( HydrusDB.HydrusDB ):
|
|||
|
||||
|
||||
|
||||
def _CheckDBIntegrity( self ):
|
||||
|
||||
prefix_string = 'checking db integrity: '
|
||||
|
||||
job_key = ClientThreading.JobKey( cancellable = True )
|
||||
|
||||
try:
|
||||
|
||||
job_key.SetStatusTitle( prefix_string + 'preparing' )
|
||||
|
||||
self._controller.pub( 'modal_message', job_key )
|
||||
|
||||
num_errors = 0
|
||||
|
||||
job_key.SetStatusTitle( prefix_string + 'running' )
|
||||
job_key.SetStatusText( 'errors found so far: ' + HydrusData.ToHumanInt( num_errors ) )
|
||||
|
||||
db_names = [ name for ( index, name, path ) in self._Execute( 'PRAGMA database_list;' ) if name not in ( 'mem', 'temp', 'durable_temp' ) ]
|
||||
|
||||
for db_name in db_names:
|
||||
|
||||
for ( text, ) in self._Execute( 'PRAGMA ' + db_name + '.integrity_check;' ):
|
||||
|
||||
( i_paused, should_quit ) = job_key.WaitIfNeeded()
|
||||
|
||||
if should_quit:
|
||||
|
||||
job_key.SetStatusTitle( prefix_string + 'cancelled' )
|
||||
job_key.SetStatusText( 'errors found: ' + HydrusData.ToHumanInt( num_errors ) )
|
||||
|
||||
return
|
||||
|
||||
|
||||
if text != 'ok':
|
||||
|
||||
if num_errors == 0:
|
||||
|
||||
HydrusData.Print( 'During a db integrity check, these errors were discovered:' )
|
||||
|
||||
|
||||
HydrusData.Print( text )
|
||||
|
||||
num_errors += 1
|
||||
|
||||
|
||||
job_key.SetStatusText( 'errors found so far: ' + HydrusData.ToHumanInt( num_errors ) )
|
||||
|
||||
|
||||
|
||||
finally:
|
||||
|
||||
job_key.SetStatusTitle( prefix_string + 'completed' )
|
||||
job_key.SetStatusText( 'errors found: ' + HydrusData.ToHumanInt( num_errors ) )
|
||||
|
||||
HydrusData.Print( job_key.ToString() )
|
||||
|
||||
job_key.Finish()
|
||||
|
||||
|
||||
|
||||
def _CleanAfterJobWork( self ):
|
||||
|
||||
self._after_job_content_update_jobs = []
|
||||
|
@@ -1215,46 +1158,6 @@ class DB( HydrusDB.HydrusDB ):
    
    def _ClearOrphanTables( self ):
        
        all_table_names = set()
        
        db_names = [ name for ( index, name, path ) in self._Execute( 'PRAGMA database_list;' ) if name not in ( 'mem', 'temp', 'durable_temp' ) ]
        
        for db_name in db_names:
            
            table_names = self._STS( self._Execute( 'SELECT name FROM {}.sqlite_master WHERE type = ?;'.format( db_name ), ( 'table', ) ) )
            
            if db_name != 'main':
                
                table_names = { '{}.{}'.format( db_name, table_name ) for table_name in table_names }
                
            
            all_table_names.update( table_names )
            
        
        all_surplus_table_names = set()
        
        for module in self._modules:
            
            surplus_table_names = module.GetSurplusServiceTableNames( all_table_names )
            
            all_surplus_table_names.update( surplus_table_names )
            
        
        if len( surplus_table_names ) == 0:
            
            HydrusData.ShowText( 'No orphan tables!' )
            
        
        for table_name in surplus_table_names:
            
            HydrusData.ShowText( 'Dropping ' + table_name )
            
            self._Execute( 'DROP table ' + table_name + ';' )
            
        
    
    def _CreateDB( self ):
        
        # main
        
@@ -1270,24 +1173,12 @@ class DB( HydrusDB.HydrusDB ):
        
        #
        
        self._Execute( 'CREATE TABLE IF NOT EXISTS local_ratings ( service_id INTEGER, hash_id INTEGER, rating REAL, PRIMARY KEY ( service_id, hash_id ) );' )
        self._CreateIndex( 'local_ratings', [ 'hash_id' ] )
        self._CreateIndex( 'local_ratings', [ 'rating' ] )
        
        self._Execute( 'CREATE TABLE IF NOT EXISTS local_incdec_ratings ( service_id INTEGER, hash_id INTEGER, rating INTEGER, PRIMARY KEY ( service_id, hash_id ) );' )
        self._CreateIndex( 'local_incdec_ratings', [ 'hash_id' ] )
        self._CreateIndex( 'local_incdec_ratings', [ 'rating' ] )
        
        self._Execute( 'CREATE TABLE IF NOT EXISTS options ( options TEXT_YAML );', )
        
        self._Execute( 'CREATE TABLE IF NOT EXISTS recent_tags ( service_id INTEGER, tag_id INTEGER, timestamp INTEGER, PRIMARY KEY ( service_id, tag_id ) );' )
        
        self._Execute( 'CREATE TABLE IF NOT EXISTS remote_thumbnails ( service_id INTEGER, hash_id INTEGER, PRIMARY KEY ( service_id, hash_id ) );' )
        
        self._Execute( 'CREATE TABLE IF NOT EXISTS service_info ( service_id INTEGER, info_type INTEGER, info INTEGER, PRIMARY KEY ( service_id, info_type ) );' )
        
        self._Execute( 'CREATE TABLE IF NOT EXISTS statuses ( status_id INTEGER PRIMARY KEY, status TEXT UNIQUE );' )
        
        # inserts
        
        self.modules_files_physical_storage.Initialise()
        
@@ -1652,9 +1543,11 @@ class DB( HydrusDB.HydrusDB ):
        # however, this seemed to cause some immense temp drive space bloat when dropping the mapping tables, as there seems to be a trigger/foreign reference check for every row to be deleted
        # so now we just blat all tables and trust in the Lord that we don't forget to add any new ones in future
        
        self._Execute( 'DELETE FROM local_ratings WHERE service_id = ?;', ( service_id, ) )
        self._Execute( 'DELETE FROM local_incdec_ratings WHERE service_id = ?;', ( service_id, ) )
        self._Execute( 'DELETE FROM recent_tags WHERE service_id = ?;', ( service_id, ) )
        if service_type in HC.RATINGS_SERVICES:
            
            self.modules_ratings.Drop( service_id )
            
        
        self._Execute( 'DELETE FROM service_info WHERE service_id = ?;', ( service_id, ) )
        
        self._DeleteServiceDropFilesTables( service_id, service_type )
        
@@ -1731,6 +1624,8 @@ class DB( HydrusDB.HydrusDB ):
        self.modules_tag_display.RegenerateTagSiblingsAndParentsCache( only_these_service_ids = interested_service_ids )
        
        
        self.modules_recent_tags.Drop( service_id )
        
        self.modules_tag_search.Drop( self.modules_services.combined_file_service_id, service_id )
        
        file_service_ids = self.modules_services.GetServiceIds( HC.FILE_SERVICES_WITH_SPECIFIC_TAG_LOOKUP_CACHES )
        
@@ -2373,23 +2268,6 @@ class DB( HydrusDB.HydrusDB ):
        
    
    def _FilterForFileDeleteLock( self, service_id, hash_ids ):
        
        # TODO: like in the MediaSingleton object, eventually extend this to the metadata conditional object
        
        if HG.client_controller.new_options.GetBoolean( 'delete_lock_for_archived_files' ):
            
            service = self.modules_services.GetService( service_id )
            
            if service.GetServiceType() in HC.LOCAL_FILE_SERVICES:
                
                hash_ids = set( hash_ids ).intersection( self.modules_files_inbox.inbox_hash_ids )
                
            
        
        return hash_ids
        
    
    def _FixLogicallyInconsistentMappings( self, tag_service_key = None ):
        
        job_key = ClientThreading.JobKey( cancellable = True )
        
@@ -3066,7 +2944,10 @@ class DB( HydrusDB.HydrusDB ):
        
        # note also that we do not scrub archived time on a file delete, so this upcoming fetch is for all files ever. this is useful, so don't undo it m8
        archive_timestamps = self._STL( self._Execute( f'SELECT archived_timestamp FROM {current_files_table_name} CROSS JOIN archive_timestamps USING ( hash_id ) ORDER BY archived_timestamp ASC;' ) )
        archive_timestamps_current = self._STL( self._Execute( f'SELECT archived_timestamp FROM {current_files_table_name} CROSS JOIN archive_timestamps USING ( hash_id );' ) )
        archive_timestamps_deleted = self._STL( self._Execute( f'SELECT archived_timestamp FROM {deleted_files_table_name} CROSS JOIN archive_timestamps USING ( hash_id );' ) )
        
        archive_timestamps = sorted( archive_timestamps_current + archive_timestamps_deleted )
        
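The hunk above is the file history fix from the changelog: archive times for files that were later deleted are now included, so the old single ordered query over current files becomes two queries whose results are concatenated and sorted. A tiny sketch of the merge with invented timestamps:

```python
# Archive events observed on still-current files and on since-deleted files
# (hypothetical epoch-ish numbers; the real values come from the two queries above).
archive_timestamps_current = [ 100, 400, 250 ]
archive_timestamps_deleted = [ 300, 50 ]

# The chart wants one ascending series of archive events, regardless of whether
# the file still exists, so the lists are merged and sorted.
archive_timestamps = sorted( archive_timestamps_current + archive_timestamps_deleted )

print( archive_timestamps )
```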
        if job_key.IsCancelled():
            
@@ -3074,13 +2955,18 @@ class DB( HydrusDB.HydrusDB ):
            return file_history
            
        
        total_current_files = len( current_timestamps )
        media_current_files_table_name = ClientDBFilesStorage.GenerateFilesTableName( self.modules_services.combined_local_media_service_id, HC.CONTENT_STATUS_CURRENT )
        
        # I now exclude updates and trash when searching 'all my files'
        total_update_files = 0 #self.modules_files_storage.GetCurrentFilesCount( self.modules_services.local_update_service_id, HC.CONTENT_STATUS_CURRENT )
        total_trash_files = 0 #self.modules_files_storage.GetCurrentFilesCount( self.modules_services.trash_service_id, HC.CONTENT_STATUS_CURRENT )
        if current_files_table_name == media_current_files_table_name:
            
            total_archiveable_count = len( current_timestamps )
            
        else:
            
            ( total_archiveable_count, ) = self._Execute( f'SELECT COUNT( * ) FROM {current_files_table_name} CROSS JOIN {media_current_files_table_name} USING ( hash_id );' ).fetchone()
            
        
        total_archive_files = ( total_current_files - total_update_files - total_trash_files ) - total_inbox_files
        total_archive_files = total_archiveable_count - total_inbox_files
        
        if len( archive_timestamps ) > 0:
            
@@ -3628,20 +3514,7 @@ class DB( HydrusDB.HydrusDB ):
        
        hash_ids_to_service_ids_and_filenames = self.modules_service_paths.GetHashIdsToServiceIdsAndFilenames( temp_table_name )
        
        hash_ids_to_local_star_ratings = HydrusData.BuildKeyToListDict( ( ( hash_id, ( service_id, rating ) ) for ( service_id, hash_id, rating ) in self._Execute( 'SELECT service_id, hash_id, rating FROM {} CROSS JOIN local_ratings USING ( hash_id );'.format( temp_table_name ) ) ) )
        hash_ids_to_local_incdec_ratings = HydrusData.BuildKeyToListDict( ( ( hash_id, ( service_id, rating ) ) for ( service_id, hash_id, rating ) in self._Execute( 'SELECT service_id, hash_id, rating FROM {} CROSS JOIN local_incdec_ratings USING ( hash_id );'.format( temp_table_name ) ) ) )
        
        hash_ids_to_local_ratings = collections.defaultdict( list )
        
        for ( hash_id, info_list ) in hash_ids_to_local_star_ratings.items():
            
            hash_ids_to_local_ratings[ hash_id ].extend( info_list )
            
        
        for ( hash_id, info_list ) in hash_ids_to_local_incdec_ratings.items():
            
            hash_ids_to_local_ratings[ hash_id ].extend( info_list )
            
        
        hash_ids_to_local_ratings = self.modules_ratings.GetHashIdsToRatings( temp_table_name )
        
        hash_ids_to_names_and_notes = self.modules_notes_map.GetHashIdsToNamesAndNotes( temp_table_name )
        
@@ -4142,46 +4015,6 @@ class DB( HydrusDB.HydrusDB ):
        return paths
        
    
    def _GetRecentTags( self, service_key ):
        
        service_id = self.modules_services.GetServiceId( service_key )
        
        # we could be clever and do LIMIT and ORDER BY in the delete, but not all compilations of SQLite have that turned on, so let's KISS
        
        tag_ids_to_timestamp = { tag_id : timestamp for ( tag_id, timestamp ) in self._Execute( 'SELECT tag_id, timestamp FROM recent_tags WHERE service_id = ?;', ( service_id, ) ) }
        
        def sort_key( key ):
            
            return tag_ids_to_timestamp[ key ]
            
        
        newest_first = list( tag_ids_to_timestamp.keys() )
        
        newest_first.sort( key = sort_key, reverse = True )
        
        num_we_want = HG.client_controller.new_options.GetNoneableInteger( 'num_recent_tags' )
        
        if num_we_want is None:
            
            num_we_want = 20
            
        
        decayed = newest_first[ num_we_want : ]
        
        if len( decayed ) > 0:
            
            self._ExecuteMany( 'DELETE FROM recent_tags WHERE service_id = ? AND tag_id = ?;', ( ( service_id, tag_id ) for tag_id in decayed ) )
            
        
        sorted_recent_tag_ids = newest_first[ : num_we_want ]
        
        tag_ids_to_tags = self.modules_tags_local_cache.GetTagIdsToTags( tag_ids = sorted_recent_tag_ids )
        
        sorted_recent_tags = [ tag_ids_to_tags[ tag_id ] for tag_id in sorted_recent_tag_ids ]
        
        return sorted_recent_tags
        
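As the comment in `_GetRecentTags` notes, `DELETE ... ORDER BY ... LIMIT` is a compile-time SQLite option, so the trim is done in Python instead: sort tag ids newest-first by timestamp, keep the cap, delete the rest. A sketch of that decay logic with invented ids and timestamps:

```python
# tag_id -> last-used timestamp (hypothetical values)
tag_ids_to_timestamp = { 1 : 100, 2 : 300, 3 : 200, 4 : 50 }
num_we_want = 2

# newest first, by timestamp
newest_first = sorted( tag_ids_to_timestamp, key = tag_ids_to_timestamp.get, reverse = True )

kept = newest_first[ : num_we_want ]      # these feed the suggestion list
decayed = newest_first[ num_we_want : ]   # these rows get a plain DELETE each

print( kept, decayed )
```

Doing the ordering client-side keeps the SQL to vanilla statements every SQLite build supports, at the cost of loading the (small) recent-tags set into memory.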
    
    def _GetRelatedTagCountsForOneTag( self, tag_display_type, file_service_id, tag_service_id, search_tag_id, max_num_files_to_search, stop_time_for_finding_results = None ) -> typing.Tuple[ collections.Counter, bool ]:
        
        # a user provided the basic idea here
        
@@ -4761,33 +4594,33 @@ class DB( HydrusDB.HydrusDB ):
        
            elif info_type == HC.SERVICE_INFO_NUM_PENDING_TAG_SIBLINGS:
                
                ( info, ) = self._Execute( 'SELECT COUNT( * ) FROM tag_sibling_petitions WHERE service_id = ? AND status = ?;', ( service_id, HC.CONTENT_STATUS_PENDING ) ).fetchone()
                info = self.modules_tag_siblings.GetPendingSiblingsCount( service_id )
                
            elif info_type == HC.SERVICE_INFO_NUM_PETITIONED_TAG_SIBLINGS:
                
                ( info, ) = self._Execute( 'SELECT COUNT( * ) FROM tag_sibling_petitions WHERE service_id = ? AND status = ?;', ( service_id, HC.CONTENT_STATUS_PETITIONED ) ).fetchone()
                info = self.modules_tag_siblings.GetPetitionedSiblingsCount( service_id )
                
            elif info_type == HC.SERVICE_INFO_NUM_PENDING_TAG_PARENTS:
                
                ( info, ) = self._Execute( 'SELECT COUNT( * ) FROM tag_parent_petitions WHERE service_id = ? AND status = ?;', ( service_id, HC.CONTENT_STATUS_PENDING ) ).fetchone()
                info = self.modules_tag_parents.GetPendingParentsCount( service_id )
                
            elif info_type == HC.SERVICE_INFO_NUM_PETITIONED_TAG_PARENTS:
                
                ( info, ) = self._Execute( 'SELECT COUNT( * ) FROM tag_parent_petitions WHERE service_id = ? AND status = ?;', ( service_id, HC.CONTENT_STATUS_PETITIONED ) ).fetchone()
                info = self.modules_tag_parents.GetPetitionedParentsCount( service_id )
                
            
        elif service_type in HC.STAR_RATINGS_SERVICES:
            
            if info_type == HC.SERVICE_INFO_NUM_FILE_HASHES:
                
                ( info, ) = self._Execute( 'SELECT COUNT( * ) FROM local_ratings WHERE service_id = ?;', ( service_id, ) ).fetchone()
                info = self.modules_ratings.GetStarredServiceCount( service_id )
                
            
        elif service_type == HC.LOCAL_RATING_INCDEC:
            
            if info_type == HC.SERVICE_INFO_NUM_FILE_HASHES:
                
                ( info, ) = self._Execute( 'SELECT COUNT( * ) FROM local_incdec_ratings WHERE service_id = ?;', ( service_id, ) ).fetchone()
                info = self.modules_ratings.GetIncDecServiceCount( service_id )
                
            
        elif service_type == HC.LOCAL_BOORU:
            
@@ -4815,24 +4648,6 @@ class DB( HydrusDB.HydrusDB ):
        return results
        
    
    def _GetSiteId( self, name ):
        
        result = self._Execute( 'SELECT site_id FROM imageboard_sites WHERE name = ?;', ( name, ) ).fetchone()
        
        if result is None:
            
            self._Execute( 'INSERT INTO imageboard_sites ( name ) VALUES ( ? );', ( name, ) )
            
            site_id = self._GetLastRowId()
            
        else:
            
            ( site_id, ) = result
            
        
        return site_id
        
    
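`_GetSiteId` is the classic get-or-create pattern: select by name, insert on a miss, and take the last inserted row id. A self-contained sketch with a simplified schema (plain `sqlite3`, not hydrus code):

```python
import sqlite3

con = sqlite3.connect( ':memory:' )
con.execute( 'CREATE TABLE imageboard_sites ( site_id INTEGER PRIMARY KEY, name TEXT );' )

def get_site_id( name ):
    
    result = con.execute( 'SELECT site_id FROM imageboard_sites WHERE name = ?;', ( name, ) ).fetchone()
    
    if result is None:
        
        # miss: insert the name and use the rowid sqlite just assigned
        cursor = con.execute( 'INSERT INTO imageboard_sites ( name ) VALUES ( ? );', ( name, ) )
        
        return cursor.lastrowid
        
    
    ( site_id, ) = result
    
    return site_id

first = get_site_id( 'example' )
again = get_site_id( 'example' )
print( first, again )
```

Both calls return the same id: the first inserts, the second hits the select.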
    def _GetTrashHashes( self, limit = None, minimum_age = None ):
        
        if limit is None:
            
@@ -4844,6 +4659,8 @@ class DB( HydrusDB.HydrusDB ):
            limit_phrase = ' LIMIT ' + str( limit )
            
        
        timestamp_cutoff = 0
        
        if minimum_age is None:
            
            age_phrase = ' ORDER BY timestamp ASC' # when deleting until trash is small enough, let's delete oldest first
            
@@ -4859,7 +4676,7 @@ class DB( HydrusDB.HydrusDB ):
        
        hash_ids = self._STS( self._Execute( 'SELECT hash_id FROM {}{}{};'.format( current_files_table_name, age_phrase, limit_phrase ) ) )
        
        hash_ids = self._FilterForFileDeleteLock( self.modules_services.trash_service_id, hash_ids )
        hash_ids = self.modules_file_delete_lock.FilterForFileDeleteLock( self.modules_services.trash_service_id, hash_ids )
        
        if HG.db_report_mode:
            
@@ -5078,15 +4895,6 @@ class DB( HydrusDB.HydrusDB ):
            self._AddFiles( self.modules_services.local_update_service_id, [ ( hash_id, now ) ] )
            
        
    
    def _InitCaches( self ):
        
        # this occurs after db update, so is safe to reference things in there but also cannot be relied upon in db update
        
        HG.client_controller.frame_splash_status.SetText( 'preparing db caches' )
        
        HG.client_controller.frame_splash_status.SetSubtext( 'inbox' )
        
    
    def _InitExternalDatabases( self ):
        
        self._db_filenames[ 'external_caches' ] = 'client.caches.db'
        
@@ -5132,7 +4940,7 @@ class DB( HydrusDB.HydrusDB ):
    
    def _LoadModules( self ):
        
        self.modules_db_maintenance = ClientDBMaintenance.ClientDBMaintenance( self._c, self._db_dir, self._db_filenames, self._cursor_transaction_wrapper )
        self.modules_db_maintenance = ClientDBMaintenance.ClientDBMaintenance( self._c, self._db_dir, self._db_filenames, self._cursor_transaction_wrapper, self._modules )
        
        self._modules.append( self.modules_db_maintenance )
        
@@ -5204,6 +5012,12 @@ class DB( HydrusDB.HydrusDB ):
        
        #
        
        self.modules_file_delete_lock = ClientDBFileDeleteLock.ClientDBFileDeleteLock( self._c, self.modules_services, self.modules_files_inbox )
        
        self._modules.append( self.modules_file_delete_lock )
        
        #
        
        self.modules_mappings_counts = ClientDBMappingsCounts.ClientDBMappingsCounts( self._c, self.modules_db_maintenance, self.modules_services )
        
        self._modules.append( self.modules_mappings_counts )
        
@@ -5220,6 +5034,18 @@ class DB( HydrusDB.HydrusDB ):
        
        #
        
        self.modules_recent_tags = ClientDBTagSuggestions.ClientDBRecentTags( self._c, self.modules_tags, self.modules_services, self.modules_tags_local_cache )
        
        self._modules.append( self.modules_recent_tags )
        
        #
        
        self.modules_ratings = ClientDBRatings.ClientDBRatings( self._c, self.modules_services )
        
        self._modules.append( self.modules_ratings )
        
        #
        
        self.modules_service_paths = ClientDBServicePaths.ClientDBServicePaths( self._c, self.modules_services, self.modules_texts, self.modules_hashes_local_cache )
        
        self._modules.append( self.modules_service_paths )
        
@@ -5798,7 +5624,7 @@ class DB( HydrusDB.HydrusDB ):
        
        elif action == HC.CONTENT_UPDATE_DELETE:
            
            actual_delete_hash_ids = self._FilterForFileDeleteLock( service_id, hash_ids )
            actual_delete_hash_ids = self.modules_file_delete_lock.FilterForFileDeleteLock( service_id, hash_ids )
            
            if len( actual_delete_hash_ids ) < len( hash_ids ):
                
@@ -6316,40 +6142,7 @@ class DB( HydrusDB.HydrusDB ):
        
            hash_ids = self.modules_hashes_local_cache.GetHashIds( hashes )
            
            if service_type in HC.STAR_RATINGS_SERVICES:
                
                ratings_added = 0
                
                self._ExecuteMany( 'DELETE FROM local_ratings WHERE service_id = ? AND hash_id = ?;', ( ( service_id, hash_id ) for hash_id in hash_ids ) )
                
                ratings_added -= self._GetRowCount()
                
                if rating is not None:
                    
                    self._ExecuteMany( 'INSERT INTO local_ratings ( service_id, hash_id, rating ) VALUES ( ?, ?, ? );', [ ( service_id, hash_id, rating ) for hash_id in hash_ids ] )
                    
                    ratings_added += self._GetRowCount()
                    
                
                self._Execute( 'UPDATE service_info SET info = info + ? WHERE service_id = ? AND info_type = ?;', ( ratings_added, service_id, HC.SERVICE_INFO_NUM_FILE_HASHES ) )
                
            elif service_type == HC.LOCAL_RATING_INCDEC:
                
                ratings_added = 0
                
                self._ExecuteMany( 'DELETE FROM local_incdec_ratings WHERE service_id = ? AND hash_id = ?;', ( ( service_id, hash_id ) for hash_id in hash_ids ) )
                
                ratings_added -= self._GetRowCount()
                
                if rating != 0:
                    
                    self._ExecuteMany( 'INSERT INTO local_incdec_ratings ( service_id, hash_id, rating ) VALUES ( ?, ?, ? );', [ ( service_id, hash_id, rating ) for hash_id in hash_ids ] )
                    
                    ratings_added += self._GetRowCount()
                    
                
                self._Execute( 'UPDATE service_info SET info = info + ? WHERE service_id = ? AND info_type = ?;', ( ratings_added, service_id, HC.SERVICE_INFO_NUM_FILE_HASHES ) )
                
            self.modules_ratings.SetRating( service_id, hash_ids, rating )
            
        elif action == HC.CONTENT_UPDATE_ADVANCED:
            
@@ -6809,24 +6602,6 @@ class DB( HydrusDB.HydrusDB ):
        return num_rows_processed
        
    
    def _PushRecentTags( self, service_key, tags ):
        
        service_id = self.modules_services.GetServiceId( service_key )
        
        if tags is None:
            
            self._Execute( 'DELETE FROM recent_tags WHERE service_id = ?;', ( service_id, ) )
            
        else:
            
            now = HydrusTime.GetNow()
            
            tag_ids = [ self.modules_tags.GetTagId( tag ) for tag in tags ]
            
            self._ExecuteMany( 'REPLACE INTO recent_tags ( service_id, tag_id, timestamp ) VALUES ( ?, ?, ? );', ( ( service_id, tag_id, now ) for tag_id in tag_ids ) )
            
        
    
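`_PushRecentTags` relies on `REPLACE INTO` against the `( service_id, tag_id )` primary key: pushing a tag that is already recent just refreshes its timestamp instead of adding a duplicate row. A standalone sketch with invented ids and timestamps (plain `sqlite3`, not hydrus code):

```python
import sqlite3

con = sqlite3.connect( ':memory:' )
con.execute( 'CREATE TABLE recent_tags ( service_id INTEGER, tag_id INTEGER, timestamp INTEGER, PRIMARY KEY ( service_id, tag_id ) );' )

# first push: two tags land with timestamp 1000
con.executemany( 'REPLACE INTO recent_tags ( service_id, tag_id, timestamp ) VALUES ( ?, ?, ? );', [ ( 1, 5, 1000 ), ( 1, 6, 1000 ) ] )

# pushing tag 5 again later replaces its row, bumping the timestamp to 2000
con.execute( 'REPLACE INTO recent_tags ( service_id, tag_id, timestamp ) VALUES ( ?, ?, ? );', ( 1, 5, 2000 ) )

rows = sorted( con.execute( 'SELECT tag_id, timestamp FROM recent_tags;' ) )
print( rows )
```

The table still holds one row per tag, with tag 5 carrying the newer timestamp.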
    def _Read( self, action, *args, **kwargs ):
        
        if action == 'autocomplete_predicates': result = self.modules_tag_search.GetAutocompletePredicates( *args, **kwargs )
        
@@ -6874,7 +6649,7 @@ class DB( HydrusDB.HydrusDB ):
        elif action == 'options': result = self._GetOptions( *args, **kwargs )
        elif action == 'pending': result = self._GetPending( *args, **kwargs )
        elif action == 'random_potential_duplicate_hashes': result = self._DuplicatesGetRandomPotentialDuplicateHashes( *args, **kwargs )
        elif action == 'recent_tags': result = self._GetRecentTags( *args, **kwargs )
        elif action == 'recent_tags': result = self.modules_recent_tags.GetRecentTags( *args, **kwargs )
        elif action == 'repository_progress': result = self.modules_repositories.GetRepositoryProgress( *args, **kwargs )
        elif action == 'repository_update_hashes_to_process': result = self.modules_repositories.GetRepositoryUpdateHashesICanProcess( *args, **kwargs )
        elif action == 'serialisable': result = self.modules_serialisable.GetJSONDump( *args, **kwargs )
        
@@ -10162,6 +9937,30 @@ class DB( HydrusDB.HydrusDB ):
        
        
        if version == 545:
            
            try:
                
                self._controller.frame_splash_status.SetSubtext( f'scheduling some maintenance work' )
                
                all_local_hash_ids = self.modules_files_storage.GetCurrentHashIdsList( self.modules_services.combined_local_file_service_id )
                
                with self._MakeTemporaryIntegerTable( all_local_hash_ids, 'hash_id' ) as temp_hash_ids_table_name:
                    
                    hash_ids = self._STS( self._Execute( f'SELECT hash_id FROM {temp_hash_ids_table_name} CROSS JOIN files_info USING ( hash_id ) WHERE mime = ?;', ( HC.APPLICATION_PSD, ) ) )
                    self.modules_files_maintenance_queue.AddJobs( hash_ids, ClientFiles.REGENERATE_FILE_DATA_JOB_FORCE_THUMBNAIL )
                    
                
            except Exception as e:
                
                HydrusData.PrintException( e )
                
                message = 'Some file updates failed to schedule! This is not super important, but hydev would be interested in seeing the error that was printed to the log.'
                
                self.pub_initial_message( message )
                
            
        
        self._controller.frame_splash_status.SetTitleText( 'updated db to v{}'.format( HydrusData.ToHumanInt( version + 1 ) ) )
        
        self._Execute( 'UPDATE version SET version = ?;', ( version + 1, ) )
        
@@ -10643,10 +10442,10 @@ class DB( HydrusDB.HydrusDB ):
        elif action == 'clear_false_positive_relations': self.modules_files_duplicates.ClearAllFalsePositiveRelationsFromHashes( *args, **kwargs )
        elif action == 'clear_false_positive_relations_between_groups': self.modules_files_duplicates.ClearFalsePositiveRelationsBetweenGroupsFromHashes( *args, **kwargs )
        elif action == 'clear_orphan_file_records': self._ClearOrphanFileRecords( *args, **kwargs )
        elif action == 'clear_orphan_tables': self._ClearOrphanTables( *args, **kwargs )
        elif action == 'clear_orphan_tables': self.modules_db_maintenance.ClearOrphanTables( *args, **kwargs )
        elif action == 'content_updates': self._ProcessContentUpdates( *args, **kwargs )
        elif action == 'cull_file_viewing_statistics': self.modules_files_viewing_stats.CullFileViewingStatistics( *args, **kwargs )
        elif action == 'db_integrity': self._CheckDBIntegrity( *args, **kwargs )
        elif action == 'db_integrity': self.modules_db_maintenance.CheckDBIntegrity( *args, **kwargs )
        elif action == 'delete_local_booru_share': self.modules_serialisable.DeleteYAMLDump( ClientDBSerialisable.YAML_DUMP_ID_LOCAL_BOORU, *args, **kwargs )
        elif action == 'delete_pending': self._DeletePending( *args, **kwargs )
        elif action == 'delete_serialisable_named': self.modules_serialisable.DeleteJSONDumpNamed( *args, **kwargs )
        
@@ -10675,7 +10474,7 @@ class DB( HydrusDB.HydrusDB ):
        elif action == 'migration_start_pairs_job': self._MigrationStartPairsJob( *args, **kwargs )
        elif action == 'process_repository_content': result = self._ProcessRepositoryContent( *args, **kwargs )
        elif action == 'process_repository_definitions': result = self.modules_repositories.ProcessRepositoryDefinitions( *args, **kwargs )
        elif action == 'push_recent_tags': self._PushRecentTags( *args, **kwargs )
        elif action == 'push_recent_tags': self.modules_recent_tags.PushRecentTags( *args, **kwargs )
        elif action == 'regenerate_local_hash_cache': self._RegenerateLocalHashCache( *args, **kwargs )
        elif action == 'regenerate_local_tag_cache': self._RegenerateLocalTagCache( *args, **kwargs )
        elif action == 'regenerate_similar_files': self.modules_similar_files.RegenerateTree( *args, **kwargs )
        
@@ -234,6 +234,7 @@ class ClientDBCacheLocalHashes( ClientDBModule.ClientDBModule ):
    
    
class ClientDBCacheLocalTags( ClientDBModule.ClientDBModule ):
    
    CAN_REPOPULATE_ALL_MISSING_DATA = True
    
@@ -0,0 +1,44 @@
import sqlite3
import typing

from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusGlobals as HG

from hydrus.client.db import ClientDBFilesInbox
from hydrus.client.db import ClientDBModule
from hydrus.client.db import ClientDBServices

class ClientDBFileDeleteLock( ClientDBModule.ClientDBModule ):
    
    def __init__( self, cursor: sqlite3.Cursor, modules_services: ClientDBServices.ClientDBMasterServices, modules_files_inbox: ClientDBFilesInbox.ClientDBFilesInbox ):
        
        self.modules_services = modules_services
        self.modules_files_inbox = modules_files_inbox
        
        ClientDBModule.ClientDBModule.__init__( self, 'client file delete lock', cursor )
        
    
    def FilterForFileDeleteLock( self, service_id, hash_ids ):
        
        # TODO: like in the MediaSingleton object, eventually extend this to the metadata conditional object
        
        if HG.client_controller.new_options.GetBoolean( 'delete_lock_for_archived_files' ):
            
            service = self.modules_services.GetService( service_id )
            
            if service.GetServiceType() in HC.LOCAL_FILE_SERVICES:
                
                hash_ids = set( hash_ids ).intersection( self.modules_files_inbox.inbox_hash_ids )
                
            
        
        return hash_ids
        
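The delete-lock filter above reduces to a set intersection: when 'delete_lock_for_archived_files' is on, only files still in the inbox survive the filter, so archived files cannot be deleted. A minimal sketch with invented ids:

```python
# stand-ins for the option flag and the cached inbox set (hypothetical values)
delete_lock_for_archived_files = True
inbox_hash_ids = { 1, 2, 3 }

# the hash_ids a delete was requested for; 4 and 5 are archived
hash_ids = [ 2, 3, 4, 5 ]

if delete_lock_for_archived_files:
    
    # keep only files still in the inbox
    hash_ids = set( hash_ids ).intersection( inbox_hash_ids )
    

print( sorted( hash_ids ) )
```

The archived ids 4 and 5 drop out, leaving only the inbox files eligible for deletion.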
    
    def GetTablesAndColumnsThatUseDefinitions( self, content_type: int ) -> typing.List[ typing.Tuple[ str, str ] ]:
        
        tables_and_columns = []
        
        return tables_and_columns
        
    
@@ -7,6 +7,7 @@ import typing
from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusData
from hydrus.core import HydrusDBBase
from hydrus.core import HydrusDBModule
from hydrus.core import HydrusGlobals as HG
from hydrus.core import HydrusTime

@@ -15,13 +16,14 @@ from hydrus.client.db import ClientDBModule

class ClientDBMaintenance( ClientDBModule.ClientDBModule ):
    
    def __init__( self, cursor: sqlite3.Cursor, db_dir: str, db_filenames: typing.Collection[ str ], cursor_transaction_wrapper: HydrusDBBase.DBCursorTransactionWrapper ):
    def __init__( self, cursor: sqlite3.Cursor, db_dir: str, db_filenames: typing.Collection[ str ], cursor_transaction_wrapper: HydrusDBBase.DBCursorTransactionWrapper, modules: typing.List[ HydrusDBModule.HydrusDBModule ] ):
        
        ClientDBModule.ClientDBModule.__init__( self, 'client db maintenance', cursor )
        
        self._db_dir = db_dir
        self._db_filenames = db_filenames
        self._cursor_transaction_wrapper = cursor_transaction_wrapper
        self._modules = modules
        
    
    def _DropTable( self, deletee_table_name: str ):
        
@@ -301,6 +303,8 @@ class ClientDBMaintenance( ClientDBModule.ClientDBModule ):
    
    def AnalyzeTable( self, name ):
        
        num_rows = 0
        
        do_it = True
        
        result = self._Execute( 'SELECT num_rows FROM analyze_timestamps WHERE name = ?;', ( name, ) ).fetchone()
        
@@ -328,6 +332,106 @@ class ClientDBMaintenance( ClientDBModule.ClientDBModule ):
        self._Execute( 'INSERT OR IGNORE INTO analyze_timestamps ( name, num_rows, timestamp ) VALUES ( ?, ?, ? );', ( name, num_rows, HydrusTime.GetNow() ) )
        
    
    def CheckDBIntegrity( self ):
        
        prefix_string = 'checking db integrity: '
        
        job_key = ClientThreading.JobKey( cancellable = True )
        
        num_errors = 0
        
        try:
            
            job_key.SetStatusTitle( prefix_string + 'preparing' )
            
            HG.client_controller.pub( 'modal_message', job_key )
            
            job_key.SetStatusTitle( prefix_string + 'running' )
            job_key.SetStatusText( 'errors found so far: ' + HydrusData.ToHumanInt( num_errors ) )
            
            db_names = [ name for ( index, name, path ) in self._Execute( 'PRAGMA database_list;' ) if name not in ( 'mem', 'temp', 'durable_temp' ) ]
            
            for db_name in db_names:
                
                for ( text, ) in self._Execute( 'PRAGMA ' + db_name + '.integrity_check;' ):
                    
                    ( i_paused, should_quit ) = job_key.WaitIfNeeded()
                    
                    if should_quit:
                        
                        job_key.SetStatusTitle( prefix_string + 'cancelled' )
                        job_key.SetStatusText( 'errors found: ' + HydrusData.ToHumanInt( num_errors ) )
                        
                        return
                        
                    
                    if text != 'ok':
                        
                        if num_errors == 0:
                            
                            HydrusData.Print( 'During a db integrity check, these errors were discovered:' )
                            
                        
                        HydrusData.Print( text )
                        
                        num_errors += 1
                        
                    
                    job_key.SetStatusText( 'errors found so far: ' + HydrusData.ToHumanInt( num_errors ) )
                    
                
            
        finally:
            
            job_key.SetStatusTitle( prefix_string + 'completed' )
            job_key.SetStatusText( 'errors found: ' + HydrusData.ToHumanInt( num_errors ) )
            
            HydrusData.Print( job_key.ToString() )
            
            job_key.Finish()
            
        
    
    def ClearOrphanTables( self ):
        
        all_table_names = set()
        
        db_names = [ name for ( index, name, path ) in self._Execute( 'PRAGMA database_list;' ) if name not in ( 'mem', 'temp', 'durable_temp' ) ]
        
        for db_name in db_names:
            
            table_names = self._STS( self._Execute( 'SELECT name FROM {}.sqlite_master WHERE type = ?;'.format( db_name ), ( 'table', ) ) )
            
            if db_name != 'main':
                
                table_names = { f'{db_name}.{table_name}' for table_name in table_names }
                
            
            all_table_names.update( table_names )
            
        
        all_surplus_table_names = set()
        
        for module in self._modules:
            
            surplus_table_names = module.GetSurplusServiceTableNames( all_table_names )
            
            all_surplus_table_names.update( surplus_table_names )
            
        
        if len( all_surplus_table_names ) == 0:
            
            HydrusData.ShowText( 'No orphan tables!' )
            
        
        for table_name in all_surplus_table_names:
            
            HydrusData.ShowText( f'Dropping {table_name}' )
            
            self._Execute( f'DROP table {table_name};' )
            
        
    
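The orphan-table scan works by listing every table in `sqlite_master` and diffing that against the set the modules claim. A standalone sketch of the idea with an invented schema (plain `sqlite3`, not hydrus code; in hydrus the expected set comes from each module's `GetSurplusServiceTableNames`):

```python
import sqlite3

con = sqlite3.connect( ':memory:' )
con.execute( 'CREATE TABLE current_files ( hash_id INTEGER );' )
con.execute( 'CREATE TABLE orphan_mappings_17 ( tag_id INTEGER );' )  # left behind by a hypothetical deleted service

# everything sqlite knows about
all_table_names = { name for ( name, ) in con.execute( "SELECT name FROM main.sqlite_master WHERE type = 'table';" ) }

# what the modules say should exist
expected_table_names = { 'current_files' }

surplus_table_names = all_table_names - expected_table_names

for table_name in surplus_table_names:
    
    con.execute( f'DROP TABLE {table_name};' )
    

print( surplus_table_names )
```

Only the unclaimed table is flagged and dropped; the claimed one survives.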
    def DeferredDropTable( self, table_name: str ):
        
        try:
            
@@ -158,3 +158,4 @@ class ClientDBMappingsCountsUpdate( ClientDBModule.ClientDBModule ):
        self.ReduceCounts( tag_display_type, file_service_id, tag_service_id, reduce_ac_cache_changes )
        
    
@@ -0,0 +1,135 @@
import collections
import sqlite3
import typing

from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusData
from hydrus.core import HydrusGlobals as HG
from hydrus.core import HydrusTime

from hydrus.client.db import ClientDBDefinitionsCache
from hydrus.client.db import ClientDBMaster
from hydrus.client.db import ClientDBModule
from hydrus.client.db import ClientDBServices

class ClientDBRatings( ClientDBModule.ClientDBModule ):
    
    def __init__( self, cursor: sqlite3.Cursor, modules_services: ClientDBServices.ClientDBMasterServices ):
        
        self.modules_services = modules_services
        
        ClientDBModule.ClientDBModule.__init__( self, 'client ratings', cursor )
        
    
    def _GetInitialIndexGenerationDict( self ) -> dict:
        
        index_generation_dict = {}
        
        index_generation_dict[ 'main.local_ratings' ] = [
            ( [ 'hash_id' ], False, 400 ),
            ( [ 'rating' ], False, 400 )
        ]
        
        index_generation_dict[ 'main.local_incdec_ratings' ] = [
            ( [ 'hash_id' ], False, 400 ),
            ( [ 'rating' ], False, 400 )
        ]
        
        return index_generation_dict
        
    
    def _GetInitialTableGenerationDict( self ) -> dict:
        
        return {
            'main.local_ratings' : ( 'CREATE TABLE IF NOT EXISTS {} ( service_id INTEGER, hash_id INTEGER, rating REAL, PRIMARY KEY ( service_id, hash_id ) );', 400 ),
            'main.local_incdec_ratings' : ( 'CREATE TABLE IF NOT EXISTS {} ( service_id INTEGER, hash_id INTEGER, rating INTEGER, PRIMARY KEY ( service_id, hash_id ) );', 400 )
        }
        
    
    def Drop( self, service_id: int ):
        
        self._Execute( 'DELETE FROM local_ratings WHERE service_id = ?;', ( service_id, ) )
        self._Execute( 'DELETE FROM local_incdec_ratings WHERE service_id = ?;', ( service_id, ) )
        
    
def GetHashIdsToRatings( self, hash_ids_table_name ):
|
||||
|
||||
hash_ids_to_local_star_ratings = HydrusData.BuildKeyToListDict( ( ( hash_id, ( service_id, rating ) ) for ( service_id, hash_id, rating ) in self._Execute( 'SELECT service_id, hash_id, rating FROM {} CROSS JOIN local_ratings USING ( hash_id );'.format( hash_ids_table_name ) ) ) )
|
||||
hash_ids_to_local_incdec_ratings = HydrusData.BuildKeyToListDict( ( ( hash_id, ( service_id, rating ) ) for ( service_id, hash_id, rating ) in self._Execute( 'SELECT service_id, hash_id, rating FROM {} CROSS JOIN local_incdec_ratings USING ( hash_id );'.format( hash_ids_table_name ) ) ) )
|
||||
|
||||
hash_ids_to_local_ratings = collections.defaultdict( list )
|
||||
|
||||
for ( hash_id, info_list ) in hash_ids_to_local_star_ratings.items():
|
||||
|
||||
hash_ids_to_local_ratings[ hash_id ].extend( info_list )
|
||||
|
||||
|
||||
for ( hash_id, info_list ) in hash_ids_to_local_incdec_ratings.items():
|
||||
|
||||
hash_ids_to_local_ratings[ hash_id ].extend( info_list )
|
||||
|
||||
|
||||
return hash_ids_to_local_ratings
|
||||
|
||||
|
||||
def GetIncDecServiceCount( self, service_id: int ):
|
||||
|
||||
( info, ) = self._Execute( 'SELECT COUNT( * ) FROM local_incdec_ratings WHERE service_id = ?;', ( service_id, ) ).fetchone()
|
||||
|
||||
return info
|
||||
|
||||
|
||||
def GetStarredServiceCount( self, service_id: int ):
|
||||
|
||||
( info, ) = self._Execute( 'SELECT COUNT( * ) FROM local_ratings WHERE service_id = ?;', ( service_id, ) ).fetchone()
|
||||
|
||||
return info
|
||||
|
||||
|
||||
def GetTablesAndColumnsThatUseDefinitions( self, content_type: int ) -> typing.List[ typing.Tuple[ str, str ] ]:
|
||||
|
||||
tables_and_columns = []
|
||||
|
||||
return tables_and_columns
|
||||
|
||||
|
||||
def SetRating( self, service_id, hash_ids, rating ):
|
||||
|
||||
service_type = self.modules_services.GetServiceType( service_id )
|
||||
|
||||
if service_type in HC.STAR_RATINGS_SERVICES:
|
||||
|
||||
ratings_added = 0
|
||||
|
||||
self._ExecuteMany( 'DELETE FROM local_ratings WHERE service_id = ? AND hash_id = ?;', ( ( service_id, hash_id ) for hash_id in hash_ids ) )
|
||||
|
||||
ratings_added -= self._GetRowCount()
|
||||
|
||||
if rating is not None:
|
||||
|
||||
self._ExecuteMany( 'INSERT INTO local_ratings ( service_id, hash_id, rating ) VALUES ( ?, ?, ? );', [ ( service_id, hash_id, rating ) for hash_id in hash_ids ] )
|
||||
|
||||
ratings_added += self._GetRowCount()
|
||||
|
||||
|
||||
self._Execute( 'UPDATE service_info SET info = info + ? WHERE service_id = ? AND info_type = ?;', ( ratings_added, service_id, HC.SERVICE_INFO_NUM_FILE_HASHES ) )
|
||||
|
||||
elif service_type == HC.LOCAL_RATING_INCDEC:
|
||||
|
||||
ratings_added = 0
|
||||
|
||||
self._ExecuteMany( 'DELETE FROM local_incdec_ratings WHERE service_id = ? AND hash_id = ?;', ( ( service_id, hash_id ) for hash_id in hash_ids ) )
|
||||
|
||||
ratings_added -= self._GetRowCount()
|
||||
|
||||
if rating != 0:
|
||||
|
||||
self._ExecuteMany( 'INSERT INTO local_incdec_ratings ( service_id, hash_id, rating ) VALUES ( ?, ?, ? );', [ ( service_id, hash_id, rating ) for hash_id in hash_ids ] )
|
||||
|
||||
ratings_added += self._GetRowCount()
|
||||
|
||||
|
||||
self._Execute( 'UPDATE service_info SET info = info + ? WHERE service_id = ? AND info_type = ?;', ( ratings_added, service_id, HC.SERVICE_INFO_NUM_FILE_HASHES ) )
|
||||
|
||||
|
||||
|
|
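The SetRating flow in the new ratings module deletes any existing row, inserts the replacement, and folds the two affected-row counts into one net delta for the cached service count. A minimal standalone sketch of that pattern with the stdlib sqlite3 module (illustrative only; the bare `set_rating` helper and table are simplifications, not hydrus's module API):

```python
import sqlite3

def set_rating( db, service_id, hash_ids, rating ):
    
    # delete-then-insert, tracking the net row change so a cached
    # per-service count can be updated with a single delta
    cur = db.executemany(
        'DELETE FROM local_ratings WHERE service_id = ? AND hash_id = ?;',
        ( ( service_id, h ) for h in hash_ids )
    )
    
    net = -cur.rowcount
    
    if rating is not None:
        
        cur = db.executemany(
            'INSERT INTO local_ratings ( service_id, hash_id, rating ) VALUES ( ?, ?, ? );',
            [ ( service_id, h, rating ) for h in hash_ids ]
        )
        
        net += cur.rowcount
        
    
    return net # caller adds this to its cached count

db = sqlite3.connect( ':memory:' )
db.execute( 'CREATE TABLE local_ratings ( service_id INTEGER, hash_id INTEGER, rating REAL, PRIMARY KEY ( service_id, hash_id ) );' )

print( set_rating( db, 1, [ 1, 2 ], 0.5 ) )  # two new ratings: +2
print( set_rating( db, 1, [ 1, 2 ], None ) ) # ratings removed: -2
```

Re-rating an already-rated file nets to zero, which is why the delete's row count is subtracted before the insert's is added.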
@@ -448,6 +448,20 @@ class ClientDBTagParents( ClientDBModule.ClientDBModule ):
         
         return self._service_ids_to_interested_service_ids[ tag_service_id ]
         
     
+    def GetPendingParentsCount( self, service_id: int ):
+        
+        ( info, ) = self._Execute( 'SELECT COUNT( * ) FROM tag_parent_petitions WHERE service_id = ? AND status = ?;', ( service_id, HC.CONTENT_STATUS_PENDING ) ).fetchone()
+        
+        return info
+        
+    
+    def GetPetitionedParentsCount( self, service_id: int ):
+        
+        ( info, ) = self._Execute( 'SELECT COUNT( * ) FROM tag_parent_petitions WHERE service_id = ? AND status = ?;', ( service_id, HC.CONTENT_STATUS_PETITIONED ) ).fetchone()
+        
+        return info
+        
+    
     def GetTablesAndColumnsThatUseDefinitions( self, content_type: int ) -> typing.List[ typing.Tuple[ str, str ] ]:
         
         if content_type == HC.CONTENT_TYPE_TAG:
@@ -581,6 +581,20 @@ class ClientDBTagSiblings( ClientDBModule.ClientDBModule ):
         
         return self._service_ids_to_interested_service_ids[ tag_service_id ]
         
     
+    def GetPendingSiblingsCount( self, service_id: int ):
+        
+        ( info, ) = self._Execute( 'SELECT COUNT( * ) FROM tag_sibling_petitions WHERE service_id = ? AND status = ?;', ( service_id, HC.CONTENT_STATUS_PENDING ) ).fetchone()
+        
+        return info
+        
+    
+    def GetPetitionedSiblingsCount( self, service_id: int ):
+        
+        ( info, ) = self._Execute( 'SELECT COUNT( * ) FROM tag_sibling_petitions WHERE service_id = ? AND status = ?;', ( service_id, HC.CONTENT_STATUS_PETITIONED ) ).fetchone()
+        
+        return info
+        
+    
     def GetTablesAndColumnsThatUseDefinitions( self, content_type: int ) -> typing.List[ typing.Tuple[ str, str ] ]:
         
         if content_type == HC.CONTENT_TYPE_TAG:
@@ -0,0 +1,105 @@
import sqlite3
import typing

from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusGlobals as HG
from hydrus.core import HydrusTime

from hydrus.client.db import ClientDBDefinitionsCache
from hydrus.client.db import ClientDBMaster
from hydrus.client.db import ClientDBModule
from hydrus.client.db import ClientDBServices

class ClientDBRecentTags( ClientDBModule.ClientDBModule ):
    
    def __init__( self, cursor: sqlite3.Cursor, modules_tags: ClientDBMaster.ClientDBMasterTags, modules_services: ClientDBServices.ClientDBMasterServices, modules_tags_local_cache: ClientDBDefinitionsCache.ClientDBCacheLocalTags ):
        
        self.modules_tags = modules_tags
        self.modules_services = modules_services
        self.modules_tags_local_cache = modules_tags_local_cache
        
        ClientDBModule.ClientDBModule.__init__( self, 'client recent tags', cursor )
        
    
    def _GetInitialTableGenerationDict( self ) -> dict:
        
        return {
            'main.recent_tags' : ( 'CREATE TABLE IF NOT EXISTS {} ( service_id INTEGER, tag_id INTEGER, timestamp INTEGER, PRIMARY KEY ( service_id, tag_id ) );', 546 )
        }
        
    
    def Drop( self, service_id ):
        
        self._Execute( 'DELETE FROM recent_tags WHERE service_id = ?;', ( service_id, ) )
        
    
    def GetRecentTags( self, service_key ):
        
        service_id = self.modules_services.GetServiceId( service_key )
        
        # we could be clever and do LIMIT and ORDER BY in the delete, but not all compilations of SQLite have that turned on, so let's KISS
        
        tag_ids_to_timestamp = { tag_id : timestamp for ( tag_id, timestamp ) in self._Execute( 'SELECT tag_id, timestamp FROM recent_tags WHERE service_id = ?;', ( service_id, ) ) }
        
        def sort_key( key ):
            
            return tag_ids_to_timestamp[ key ]
            
        
        newest_first = list( tag_ids_to_timestamp.keys() )
        
        newest_first.sort( key = sort_key, reverse = True )
        
        num_we_want = HG.client_controller.new_options.GetNoneableInteger( 'num_recent_tags' )
        
        if num_we_want is None:
            
            num_we_want = 20
            
        
        decayed = newest_first[ num_we_want : ]
        
        if len( decayed ) > 0:
            
            self._ExecuteMany( 'DELETE FROM recent_tags WHERE service_id = ? AND tag_id = ?;', ( ( service_id, tag_id ) for tag_id in decayed ) )
            
        
        sorted_recent_tag_ids = newest_first[ : num_we_want ]
        
        tag_ids_to_tags = self.modules_tags_local_cache.GetTagIdsToTags( tag_ids = sorted_recent_tag_ids )
        
        sorted_recent_tags = [ tag_ids_to_tags[ tag_id ] for tag_id in sorted_recent_tag_ids ]
        
        return sorted_recent_tags
        
    
    def GetTablesAndColumnsThatUseDefinitions( self, content_type: int ) -> typing.List[ typing.Tuple[ str, str ] ]:
        
        tables_and_columns = []
        
        if content_type == HC.CONTENT_TYPE_TAG:
            
            tables_and_columns.append( ( 'recent_tags', 'tag_id' ) )
            
        
        return tables_and_columns
        
    
    def PushRecentTags( self, service_key, tags ):
        
        service_id = self.modules_services.GetServiceId( service_key )
        
        if tags is None:
            
            self._Execute( 'DELETE FROM recent_tags WHERE service_id = ?;', ( service_id, ) )
            
        else:
            
            now = HydrusTime.GetNow()
            
            tag_ids = [ self.modules_tags.GetTagId( tag ) for tag in tags ]
            
            self._ExecuteMany( 'REPLACE INTO recent_tags ( service_id, tag_id, timestamp ) VALUES ( ?, ?, ? );', ( ( service_id, tag_id, now ) for tag_id in tag_ids ) )
            
        
    
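GetRecentTags above keeps the newest `num_we_want` tag ids and deletes the overflow in Python rather than with a LIMITed DELETE, since SQLite only supports that under a compile-time option. A standalone sketch of that trim with the stdlib sqlite3 module (the table layout mirrors the module, but `get_recent_tags` here is a simplified illustration, not hydrus's API):

```python
import sqlite3

def get_recent_tags( db, service_id, num_we_want = 20 ):
    
    rows = db.execute(
        'SELECT tag_id, timestamp FROM recent_tags WHERE service_id = ?;',
        ( service_id, )
    ).fetchall()
    
    tag_ids_to_timestamp = dict( rows )
    
    # sort tag ids newest-first by their timestamps
    newest_first = sorted( tag_ids_to_timestamp, key = tag_ids_to_timestamp.get, reverse = True )
    
    # anything beyond the cap has 'decayed' and is deleted row by row
    decayed = newest_first[ num_we_want : ]
    
    db.executemany(
        'DELETE FROM recent_tags WHERE service_id = ? AND tag_id = ?;',
        ( ( service_id, tag_id ) for tag_id in decayed )
    )
    
    return newest_first[ : num_we_want ]

db = sqlite3.connect( ':memory:' )
db.execute( 'CREATE TABLE recent_tags ( service_id INTEGER, tag_id INTEGER, timestamp INTEGER, PRIMARY KEY ( service_id, tag_id ) );' )
db.executemany( 'INSERT INTO recent_tags VALUES ( ?, ?, ? );', [ ( 1, 10, 100 ), ( 1, 11, 300 ), ( 1, 12, 200 ) ] )

print( get_recent_tags( db, 1, num_we_want = 2 ) )  # [11, 12]: the two newest; tag 10 is trimmed
```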
@@ -186,7 +186,7 @@ class ClientDBURLMap( ClientDBModule.ClientDBModule ):
        
        return hash_ids_to_urls
        
    
    
    def GetTablesAndColumnsThatUseDefinitions( self, content_type: int ) -> typing.List[ typing.Tuple[ str, str ] ]:
        
@@ -1,5 +1,3 @@
import collections
import gc
import hashlib
import os
import random

@@ -15,8 +13,6 @@ import cv2
import PIL
import sqlite3

import qtpy

from qtpy import QtCore as QC
from qtpy import QtWidgets as QW
from qtpy import QtGui as QG

@@ -28,7 +24,6 @@ from hydrus.core import HydrusEncryption
from hydrus.core import HydrusExceptions
from hydrus.core import HydrusFileHandling
from hydrus.core import HydrusGlobals as HG
from hydrus.core import HydrusImageHandling
from hydrus.core import HydrusMemory
from hydrus.core import HydrusPaths
from hydrus.core import HydrusProfiling

@@ -39,6 +34,7 @@ from hydrus.core import HydrusText
from hydrus.core import HydrusTime
from hydrus.core import HydrusVideoHandling
from hydrus.core import HydrusPSDHandling
from hydrus.core.images import HydrusImageHandling
from hydrus.core.networking import HydrusNetwork
from hydrus.core.networking import HydrusNetworking

@@ -54,7 +50,6 @@ from hydrus.client import ClientTime
from hydrus.client.exporting import ClientExportingFiles
from hydrus.client.gui import ClientGUIAsync
from hydrus.client.gui import ClientGUICharts
from hydrus.client.gui import ClientGUICore as CGC
from hydrus.client.gui import ClientGUIDialogs
from hydrus.client.gui import ClientGUIDialogsManage
from hydrus.client.gui import ClientGUIDialogsQuick

@@ -86,13 +81,10 @@ from hydrus.client.gui import QtInit
from hydrus.client.gui import QtPorting as QP
from hydrus.client.gui.canvas import ClientGUIMPV
from hydrus.client.gui.exporting import ClientGUIExport
from hydrus.client.gui.importing import ClientGUIImport
from hydrus.client.gui.importing import ClientGUIImportFolders
from hydrus.client.gui.importing import ClientGUIImportOptions
from hydrus.client.gui.networking import ClientGUIHydrusNetwork
from hydrus.client.gui.networking import ClientGUINetwork
from hydrus.client.gui.pages import ClientGUIManagementController
from hydrus.client.gui.pages import ClientGUIManagementPanels
from hydrus.client.gui.pages import ClientGUIPages
from hydrus.client.gui.pages import ClientGUISession
from hydrus.client.gui.parsing import ClientGUIParsing

@@ -102,7 +94,6 @@ from hydrus.client.gui.services import ClientGUIServersideServices
from hydrus.client.gui.widgets import ClientGUICommon
from hydrus.client.media import ClientMediaResult
from hydrus.client.metadata import ClientTags
from hydrus.client.search import ClientSearch

MENU_ORDER = [ 'file', 'undo', 'pages', 'database', 'network', 'services', 'tags', 'pending', 'help' ]
@@ -8,10 +8,12 @@ from hydrus.core import HydrusConstants as HC
 from hydrus.core import HydrusData
 from hydrus.core import HydrusExceptions
 from hydrus.core import HydrusGlobals as HG
-from hydrus.core import HydrusImageHandling
 from hydrus.core import HydrusLists
 from hydrus.core import HydrusThreading
 from hydrus.core import HydrusTime
+from hydrus.core.images import HydrusImageHandling
+from hydrus.core.images import HydrusImageMetadata
+from hydrus.core.images import HydrusImageOpening
 
 from hydrus.client import ClientApplicationCommand as CAC
 from hydrus.client import ClientConstants as CC

@@ -697,23 +699,23 @@ def ShowFileEmbeddedMetadata( win: QW.QWidget, media: ClientMedia.MediaSingleton
     
     else:
         
-        pil_image = HydrusImageHandling.RawOpenPILImage( path )
+        raw_pil_image = HydrusImageOpening.RawOpenPILImage( path )
         
     
     if mime in HC.FILES_THAT_CAN_HAVE_EXIF:
         
-        exif_dict = HydrusImageHandling.GetEXIFDict( pil_image )
+        exif_dict = HydrusImageMetadata.GetEXIFDict( raw_pil_image )
         
     
     if mime in HC.FILES_THAT_CAN_HAVE_HUMAN_READABLE_EMBEDDED_METADATA:
         
-        file_text = HydrusImageHandling.GetEmbeddedFileText( pil_image )
+        file_text = HydrusImageMetadata.GetEmbeddedFileText( raw_pil_image )
         
     
     if mime == HC.IMAGE_JPEG:
         
-        extra_rows.append( ( 'progressive', 'yes' if 'progression' in pil_image.info else 'no' ) )
+        extra_rows.append( ( 'progressive', 'yes' if 'progression' in raw_pil_image.info else 'no' ) )
         
-        extra_rows.append( ( 'subsampling', HydrusImageHandling.GetJpegSubsampling( pil_image ) ) )
+        extra_rows.append( ( 'subsampling', HydrusImageMetadata.GetJpegSubsampling( raw_pil_image ) ) )
         
     
@@ -12,12 +12,11 @@ from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusData
from hydrus.core import HydrusExceptions
from hydrus.core import HydrusGlobals as HG
from hydrus.core import HydrusImageHandling
from hydrus.core import HydrusPaths
from hydrus.core import HydrusSerialisable
from hydrus.core import HydrusTags
from hydrus.core import HydrusText
from hydrus.core import HydrusTime
from hydrus.core.images import HydrusImageHandling

from hydrus.client import ClientApplicationCommand as CAC
from hydrus.client import ClientConstants as CC

@@ -4530,6 +4529,12 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
         self._thumbnail_visibility_scroll_percent = ClientGUICommon.BetterSpinBox( self, min=1, max=99 )
         self._thumbnail_visibility_scroll_percent.setToolTip( 'Lower numbers will cause fewer scrolls, higher numbers more.' )
         
+        self._allow_blurhash_fallback = QW.QCheckBox( self )
+        
+        tt = 'If hydrus does not have a thumbnail for a file (e.g. you are looking at a deleted file, or one unexpectedly missing), but it does know its blurhash, it will generate a blurry thumbnail based off that blurhash. Turning this behaviour off here will make it always show the default "hydrus" thumbnail.'
+        
+        self._allow_blurhash_fallback.setToolTip( tt )
+        
         self._focus_preview_on_ctrl_click = QW.QCheckBox( self )
         self._focus_preview_on_ctrl_click_only_static = QW.QCheckBox( self )
         self._focus_preview_on_shift_click = QW.QCheckBox( self )

@@ -4554,6 +4559,8 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
         
         self._video_thumbnail_percentage_in.setValue( self._new_options.GetInteger( 'video_thumbnail_percentage_in' ) )
         
+        self._allow_blurhash_fallback.setChecked( self._new_options.GetBoolean( 'allow_blurhash_fallback' ) )
+        
         self._focus_preview_on_ctrl_click.setChecked( self._new_options.GetBoolean( 'focus_preview_on_ctrl_click' ) )
         self._focus_preview_on_ctrl_click_only_static.setChecked( self._new_options.GetBoolean( 'focus_preview_on_ctrl_click_only_static' ) )
         self._focus_preview_on_shift_click.setChecked( self._new_options.GetBoolean( 'focus_preview_on_shift_click' ) )

@@ -4585,6 +4592,7 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
         rows.append( ( 'On shift-click, focus thumbnails in the preview window: ', self._focus_preview_on_shift_click ) )
         rows.append( ( ' Only on files with no duration: ', self._focus_preview_on_shift_click_only_static ) )
         rows.append( ( 'Generate video thumbnails this % in: ', self._video_thumbnail_percentage_in ) )
+        rows.append( ( 'Use blurhash missing thumbnail fallback: ', self._allow_blurhash_fallback ) )
         rows.append( ( 'Do not scroll down on key navigation if thumbnail at least this % visible: ', self._thumbnail_visibility_scroll_percent ) )
         rows.append( ( 'EXPERIMENTAL: Scroll thumbnails at this rate per scroll tick: ', self._thumbnail_scroll_rate ) )
         rows.append( ( 'EXPERIMENTAL: Image path for thumbnail panel background image (set blank to clear): ', self._media_background_bmp_path ) )

@@ -4625,6 +4633,8 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
         self._new_options.SetBoolean( 'focus_preview_on_shift_click', self._focus_preview_on_shift_click.isChecked() )
         self._new_options.SetBoolean( 'focus_preview_on_shift_click_only_static', self._focus_preview_on_shift_click_only_static.isChecked() )
         
+        self._new_options.SetBoolean( 'allow_blurhash_fallback', self._allow_blurhash_fallback.isChecked() )
+        
         try:
             
             thumbnail_scroll_rate = self._thumbnail_scroll_rate.text()
@@ -2558,6 +2558,10 @@ class ReviewFileHistory( ClientGUIScrolledPanels.ReviewPanel ):
         
         num_steps = 7680
         
+        self._cancel_button.setEnabled( True )
+        self._refresh_button.setEnabled( False )
+        
+        self._status_st.setText( 'loading' + HC.UNICODE_ELLIPSIS )
         self._status_st.setVisible( True )
         
         self._flip_deleted.setVisible( False )
@@ -8,11 +8,11 @@ from hydrus.core import HydrusConstants as HC
 from hydrus.core import HydrusData
 from hydrus.core import HydrusExceptions
 from hydrus.core import HydrusGlobals as HG
-from hydrus.core import HydrusImageHandling
 from hydrus.core import HydrusLists
 from hydrus.core import HydrusPaths
 from hydrus.core import HydrusTags
 from hydrus.core import HydrusTime
+from hydrus.core.images import HydrusImageHandling
 
 from hydrus.client import ClientApplicationCommand as CAC
 from hydrus.client import ClientConstants as CC

@@ -21,10 +21,10 @@ from hydrus.core import HydrusConstants as HC
 from hydrus.core import HydrusData
 from hydrus.core import HydrusFileHandling
 from hydrus.core import HydrusGlobals as HG
-from hydrus.core import HydrusImageHandling
 from hydrus.core import HydrusPSDHandling
 from hydrus.core import HydrusPaths
 from hydrus.core import HydrusTime
+from hydrus.core.images import HydrusImageHandling
 
 from hydrus.client import ClientApplicationCommand as CAC
 from hydrus.client import ClientConstants as CC

@@ -10,9 +10,7 @@ from hydrus.core import HydrusAnimationHandling
from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusData
from hydrus.core import HydrusGlobals as HG
from hydrus.core import HydrusImageHandling
from hydrus.core import HydrusPaths
from hydrus.core import HydrusVideoHandling

from hydrus.client import ClientApplicationCommand as CAC
from hydrus.client import ClientConstants as CC

@@ -12,9 +12,9 @@ from hydrus.core import HydrusConstants as HC
 from hydrus.core import HydrusData
 from hydrus.core import HydrusExceptions
 from hydrus.core import HydrusGlobals as HG
-from hydrus.core import HydrusImageHandling
 from hydrus.core import HydrusPaths
 from hydrus.core import HydrusTime
+from hydrus.core.images import HydrusImageHandling
 from hydrus.core.networking import HydrusNetwork
 
 from hydrus.client import ClientApplicationCommand as CAC

@@ -10,7 +10,7 @@ from hydrus.core import HydrusData
 from hydrus.core import HydrusExceptions
 from hydrus.core import HydrusGlobals as HG
 from hydrus.core import HydrusFileHandling
-from hydrus.core import HydrusImageHandling
+from hydrus.core.images import HydrusImageHandling
 
 from hydrus.client import ClientConstants as CC
 from hydrus.client import ClientImageHandling
@@ -1,13 +1,14 @@
import typing

from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusPSDHandling
from hydrus.core import HydrusData
from hydrus.core import HydrusExceptions
from hydrus.core import HydrusFileHandling
from hydrus.core import HydrusImageHandling
from hydrus.core import HydrusGlobals as HG
from hydrus.core import HydrusTime
from hydrus.core.images import HydrusBlurhash
from hydrus.core.images import HydrusImageHandling
from hydrus.core.images import HydrusImageMetadata
from hydrus.core.images import HydrusImageOpening

from hydrus.client import ClientConstants as CC
from hydrus.client import ClientFiles

@@ -361,7 +362,7 @@ class FileImportJob( object ):
         
         try:
             
-            self._blurhash = HydrusImageHandling.GetBlurhashFromNumPy( thumbnail_numpy )
+            self._blurhash = HydrusBlurhash.GetBlurhashFromNumPy( thumbnail_numpy )
             
         except:
             

@@ -406,11 +407,18 @@ class FileImportJob( object ):
         
         has_exif = False
         
+        raw_pil_image = None
+        
         if mime in HC.FILES_THAT_CAN_HAVE_EXIF:
             
             try:
                 
-                has_exif = HydrusImageHandling.HasEXIF( self._temp_path )
+                if raw_pil_image is None:
+                    
+                    raw_pil_image = HydrusImageOpening.RawOpenPILImage( self._temp_path )
+                    
+                
+                has_exif = HydrusImageMetadata.HasEXIF( raw_pil_image )
                 
             except:
                 

@@ -429,14 +437,18 @@ class FileImportJob( object ):
         try:
             
             if mime == HC.APPLICATION_PSD:
                 
-                has_icc_profile = HydrusPSDHandling.PSDHasICCProfile( self._temp_path )
-                
-            else:
-                
-                pil_image = HydrusImageHandling.RawOpenPILImage( self._temp_path )
-                
-                has_icc_profile = HydrusImageHandling.HasICCProfile( pil_image )
+                has_icc_profile = HydrusPSDHandling.PSDHasICCProfile( self._temp_path )
+                
+            else:
+                
+                if raw_pil_image is None:
+                    
+                    raw_pil_image = HydrusImageOpening.RawOpenPILImage( self._temp_path )
+                    
+                
+                has_icc_profile = HydrusImageMetadata.HasICCProfile( raw_pil_image )
                 
             
         except:
             
@@ -25,12 +25,12 @@ from hydrus.core import HydrusConstants as HC
 from hydrus.core import HydrusData
 from hydrus.core import HydrusExceptions
 from hydrus.core import HydrusFileHandling
-from hydrus.core import HydrusImageHandling
 from hydrus.core import HydrusGlobals as HG
 from hydrus.core import HydrusPaths
 from hydrus.core import HydrusTags
 from hydrus.core import HydrusTemp
 from hydrus.core import HydrusTime
+from hydrus.core.images import HydrusImageHandling
 from hydrus.core.networking import HydrusNetworkVariableHandling
 from hydrus.core.networking import HydrusServerRequest
 from hydrus.core.networking import HydrusServerResources

@@ -7,7 +7,8 @@ from PIL import Image as PILImage
 
 from hydrus.core import HydrusConstants as HC
 from hydrus.core import HydrusExceptions
-from hydrus.core import HydrusImageHandling
+from hydrus.core.images import HydrusImageHandling
+from hydrus.core.images import HydrusImageOpening
 
 def GetAnimationProperties( path, mime ):
     

@@ -200,7 +201,7 @@ def GetAPNGDurationAndNumFrames( path ):
 
 def GetFrameDurationsPILAnimation( path ):
     
-    pil_image = HydrusImageHandling.RawOpenPILImage( path )
+    pil_image = HydrusImageOpening.RawOpenPILImage( path )
     
     times_to_play = GetTimesToPlayPILAnimationFromPIL( pil_image )
     

@@ -265,7 +266,7 @@ def GetTimesToPlayPILAnimation( path ) -> int:
     
     try:
         
-        pil_image = HydrusImageHandling.RawOpenPILImage( path )
+        pil_image = HydrusImageOpening.RawOpenPILImage( path )
        
    except HydrusExceptions.UnsupportedFileException:
        
@@ -56,9 +56,12 @@ elif PLATFORM_LINUX:
 elif PLATFORM_HAIKU:
     
     NICE_PLATFORM_STRING = 'Haiku'
     
-RUNNING_FROM_SOURCE = sys.argv[0].endswith( '.py' ) or sys.argv[0].endswith( '.pyw' )
 RUNNING_FROM_MACOS_APP = os.path.exists( os.path.join( BASE_DIR, 'running_from_app' ) )
 
+# I used to check argv[0], but it is unreliable
+# sys.argv[0].endswith( '.py' ) or sys.argv[0].endswith( '.pyw' )
+RUNNING_FROM_SOURCE = not ( RUNNING_FROM_FROZEN_BUILD or RUNNING_FROM_MACOS_APP )
+
 if RUNNING_FROM_SOURCE:
     
     NICE_RUNNING_AS_STRING = 'from source'
     
 elif RUNNING_FROM_FROZEN_BUILD:

@@ -100,7 +103,7 @@ options = {}
 # Misc
 
 NETWORK_VERSION = 20
-SOFTWARE_VERSION = 545
+SOFTWARE_VERSION = 546
 CLIENT_API_VERSION = 53
 
 SERVER_THUMBNAIL_DIMENSIONS = ( 200, 200 )

@@ -956,6 +956,7 @@ def RestartProcess():
     
     time.sleep( 1 ) # time for ports to unmap
     
+    # note argv is unreliable in weird script-launching situations, but there we go
     exe = sys.executable
     me = sys.argv[0]
     
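The HydrusConstants change above stops sniffing argv[0] for '.py'/'.pyw' and instead treats anything not explicitly frozen or flagged as a macOS App as running from source. A hedged sketch of that decision order (`detect_running_mode` is an illustrative helper, not a hydrus function; the marker-file name comes from the diff):

```python
import os
import sys
import tempfile

def detect_running_mode( base_dir: str, frozen: bool ) -> str:
    
    # frozen executables set sys.frozen; the macOS App ships a marker file;
    # everything else now counts as 'running from source'
    if frozen:
        
        return 'from frozen build'
        
    
    if os.path.exists( os.path.join( base_dir, 'running_from_app' ) ):
        
        return 'from macOS App'
        
    
    return 'from source'

with tempfile.TemporaryDirectory() as d:
    
    print( detect_running_mode( d, frozen = getattr( sys, 'frozen', False ) ) )
```

The negation is the point of the fix: unusual source launches (wrapper scripts that rewrite argv) no longer get misclassified just because argv[0] lacks a .py suffix.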
@@ -2,7 +2,6 @@ import hashlib
import os

from hydrus.core import HydrusAnimationHandling
from hydrus.core import HydrusAudioHandling
from hydrus.core import HydrusPSDHandling
from hydrus.core import HydrusClipHandling
from hydrus.core import HydrusArchiveHandling

@@ -11,7 +10,6 @@ from hydrus.core import HydrusData
 from hydrus.core import HydrusDocumentHandling
 from hydrus.core import HydrusExceptions
 from hydrus.core import HydrusFlashHandling
-from hydrus.core import HydrusImageHandling
 from hydrus.core import HydrusKritaHandling
 from hydrus.core import HydrusProcreateHandling
 from hydrus.core import HydrusPaths

@@ -21,6 +19,7 @@ from hydrus.core import HydrusPDFHandling
 from hydrus.core import HydrusTemp
 from hydrus.core import HydrusText
 from hydrus.core import HydrusVideoHandling
+from hydrus.core.images import HydrusImageHandling
 from hydrus.core.networking import HydrusNetwork
 
 try:
File diff suppressed because it is too large
@@ -4,7 +4,7 @@ import typing
 from PIL import Image as PILImage
 
 from hydrus.core import HydrusExceptions
-from hydrus.core import HydrusImageHandling
+from hydrus.core.images import HydrusImageHandling
 
 try:

@@ -34,7 +34,9 @@ def MergedPILImageFromPSD( path: str ) -> PILImage:
         
         raise HydrusExceptions.UnsupportedFileException( 'psd_tools unavailable' )
         
     
-    return HydrusPSDTools.MergedPILImageFromPSD( path )
+    pil_image = HydrusPSDTools.MergedPILImageFromPSD( path )
+    
+    return pil_image
     
 
 def GenerateThumbnailNumPyFromPSDPath( path: str, target_resolution: typing.Tuple[int, int], clip_rect = None ) -> bytes:

@@ -44,12 +46,11 @@ def GenerateThumbnailNumPyFromPSDPath( path: str, target_resolution: typing.Tuple[int, int], clip_rect = None ) -> bytes:
     if clip_rect is not None:
         
         pil_image = HydrusImageHandling.ClipPILImage( pil_image, clip_rect )
         
     
-    pil_image = HydrusImageHandling.DequantizePILImage( pil_image )
-    
     thumbnail_pil_image = pil_image.resize( target_resolution, PILImage.LANCZOS )
     
-    numpy_image = HydrusImageHandling.GenerateNumPyImageFromPILImage(thumbnail_pil_image)
+    numpy_image = HydrusImageHandling.GenerateNumPyImageFromPILImage( thumbnail_pil_image )
     
     return numpy_image
     
@@ -1,12 +1,13 @@
 from PIL import Image as PILImage
 
-from hydrus.core import HydrusExceptions
-
 from psd_tools import PSDImage
-from psd_tools.constants import Resource, ColorMode, Resource
+from psd_tools.constants import Resource, ColorMode
 from psd_tools.api.numpy_io import has_transparency, get_transparency_index
 from psd_tools.api.pil_io import get_pil_mode, get_pil_channels, _create_image
 
+from hydrus.core import HydrusExceptions
+from hydrus.core.images import HydrusImageNormalisation
+
 def PSDHasICCProfile( path: str ):
     
     psd = PSDImage.open( path )

@@ -20,26 +21,16 @@ def MergedPILImageFromPSD( path: str ) -> PILImage:
     
     #pil_image = psd.topil( apply_icc = False )
     
-    if psd.has_preview():
-        
-        pil_image = convert_image_data_to_pil(psd)
-        
-    else:
-        
+    if not psd.has_preview():
         
         raise HydrusExceptions.UnsupportedFileException('PSD file has no embedded preview!')
         
     
-    if Resource.ICC_PROFILE in psd.image_resources:
-        
-        icc = psd.image_resources.get_data( Resource.ICC_PROFILE )
-        
-        pil_image.info[ 'icc_profile' ] = icc
-        
+    pil_image = convert_image_data_to_pil( psd )
     
     return pil_image
     
 
 def GetPSDResolution( path: str ):
     
     psd = PSDImage.open( path )

@@ -89,7 +80,18 @@ def convert_image_data_to_pil( psd: PSDImage ):
         
         return None
         
     
-    return post_process(image, alpha)
+    pil_image = post_process(image, alpha)
+    
+    if Resource.ICC_PROFILE in psd.image_resources:
+        
+        icc = psd.image_resources.get_data( Resource.ICC_PROFILE )
+        
+        pil_image.info[ 'icc_profile' ] = icc
+        
+    
+    pil_image = HydrusImageNormalisation.DequantizePILImage( pil_image )
+    
+    return pil_image
     
 
 def post_process(image, alpha):
@@ -0,0 +1,61 @@
import numpy
import cv2

from hydrus.external import blurhash as external_blurhash

from hydrus.core.images import HydrusImageHandling

def GetBlurhashFromNumPy( numpy_image: numpy.array ) -> str:
    
    media_height = numpy_image.shape[0]
    media_width = numpy_image.shape[1]
    
    if media_width == 0 or media_height == 0:
        
        return ''
        
    
    ratio = media_width / media_height
    
    if ratio > 4 / 3:
        
        components_x = 5
        components_y = 3
        
    elif ratio < 3 / 4:
        
        components_x = 3
        components_y = 5
        
    else:
        
        components_x = 4
        components_y = 4
        
    
    CUTOFF_DIMENSION = 100
    
    if numpy_image.shape[0] > CUTOFF_DIMENSION or numpy_image.shape[1] > CUTOFF_DIMENSION:
        
        numpy_image = HydrusImageHandling.ResizeNumPyImage( numpy_image, ( CUTOFF_DIMENSION, CUTOFF_DIMENSION ), forced_interpolation = cv2.INTER_LINEAR )
        
    
    return external_blurhash.blurhash_encode( numpy_image, components_x, components_y )
    

def GetNumpyFromBlurhash( blurhash, width, height ) -> numpy.array:
    
    # this thing is super slow, they recommend even in the documentation to render small and scale up
    if width > 32 or height > 32:
        
        numpy_image = numpy.array( external_blurhash.blurhash_decode( blurhash, 32, 32 ), dtype = 'uint8' )
        
        numpy_image = HydrusImageHandling.ResizeNumPyImage( numpy_image, ( width, height ) )
        
    else:
        
        numpy_image = numpy.array( external_blurhash.blurhash_decode( blurhash, width, height ), dtype = 'uint8' )
        
    
    return numpy_image
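The component-count branching above gives wide images more horizontal blurhash components and tall images more vertical ones. As a minimal pure-Python sketch of just that branch logic (the helper name is illustrative, not a hydrus function, and no numpy or blurhash dependency is needed):

```python
def choose_blurhash_components( width: int, height: int ):
    
    # mirrors the ratio branching in GetBlurhashFromNumPy: wide images get
    # more x-components, tall images more y-components, squarish get 4x4
    
    if width == 0 or height == 0:
        
        return None
        
    
    ratio = width / height
    
    if ratio > 4 / 3:
        
        return ( 5, 3 )
        
    elif ratio < 3 / 4:
        
        return ( 3, 5 )
        
    else:
        
        return ( 4, 4 )
        
    

# e.g. a 16:9 image is wide, so it gets ( 5, 3 )
```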
@@ -0,0 +1,90 @@
import numpy

from PIL import Image as PILImage

def NumPyImageHasAllCellsTheSame( numpy_image: numpy.array, value: int ):
    
    # I looked around for ways to do this iteratively at the c++ level but didn't have huge luck.
    # unless some magic is going on, the '==' actually creates the bool array
    # it's ok for now!
    return numpy.all( numpy_image == value )
    
    # old way, which makes a third array:
    # ( alpha_channel == numpy.full( ( shape[0], shape[1] ), 255, dtype = 'uint8' ) ).all()
    

def NumPyImageHasUselessAlphaChannel( numpy_image: numpy.array ) -> bool:
    
    if not NumPyImageHasAlphaChannel( numpy_image ):
        
        return False
        
    
    # RGBA image
    
    alpha_channel = numpy_image[:,:,3].copy()
    
    if NumPyImageHasAllCellsTheSame( alpha_channel, 255 ): # all opaque
        
        return True
        
    
    if NumPyImageHasAllCellsTheSame( alpha_channel, 0 ): # all transparent
        
        underlying_image_is_black = NumPyImageHasAllCellsTheSame( numpy_image, 0 )
        
        return not underlying_image_is_black
        
    
    return False
    

def NumPyImageHasOpaqueAlphaChannel( numpy_image: numpy.array ) -> bool:
    
    if not NumPyImageHasAlphaChannel( numpy_image ):
        
        return False
        
    
    # RGBA image
    # opaque means 255
    
    alpha_channel = numpy_image[:,:,3].copy()
    
    return NumPyImageHasAllCellsTheSame( alpha_channel, 255 )
    

def NumPyImageHasAlphaChannel( numpy_image: numpy.array ) -> bool:
    
    # note this does not test how useful the channel is, just if it exists
    
    shape = numpy_image.shape
    
    if len( shape ) <= 2:
        
        return False
        
    
    # 2 for LA? think this works
    return shape[2] in ( 2, 4 )
    

def NumPyImageHasTransparentAlphaChannel( numpy_image: numpy.array ) -> bool:
    
    if not NumPyImageHasAlphaChannel( numpy_image ):
        
        return False
        
    
    # RGBA image
    # transparent means 0
    
    alpha_channel = numpy_image[:,:,3].copy()
    
    return NumPyImageHasAllCellsTheSame( alpha_channel, 0 )
    

def PILImageHasTransparency( pil_image: PILImage.Image ) -> bool:
    
    return pil_image.mode in ( 'LA', 'RGBA' ) or ( pil_image.mode == 'P' and 'transparency' in pil_image.info )
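The 'useless alpha' test above treats a fully opaque channel as droppable, and a fully transparent channel as droppable only when the underlying pixels are not all black (an all-black, all-transparent image is assumed to be a genuinely transparent image). A pure-Python mirror of that decision, over a flat list of RGBA tuples rather than a numpy array (the function name is illustrative):

```python
def rgba_alpha_is_useless( pixels ):
    
    # pixels is a flat list of ( r, g, b, a ) tuples
    # mirrors NumPyImageHasUselessAlphaChannel's branching
    
    alphas = [ a for ( r, g, b, a ) in pixels ]
    
    if all( a == 255 for a in alphas ): # all opaque
        
        return True
        
    
    if all( a == 0 for a in alphas ): # all transparent
        
        underlying_is_black = all( ( r, g, b ) == ( 0, 0, 0 ) for ( r, g, b, a ) in pixels )
        
        return not underlying_is_black
        
    
    return False
```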
@@ -0,0 +1,650 @@
from hydrus.core.images import HydrusImageInit # right up top

import cv2
import hashlib
import io
import numpy
import typing
import warnings

from PIL import ImageFile as PILImageFile
from PIL import Image as PILImage

try:
    
    from pillow_heif import register_heif_opener
    from pillow_heif import register_avif_opener
    
    register_heif_opener(thumbnails=False)
    register_avif_opener(thumbnails=False)
    
    HEIF_OK = True
    
except:
    
    HEIF_OK = False
    

from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusData
from hydrus.core import HydrusExceptions
from hydrus.core import HydrusGlobals as HG
from hydrus.core import HydrusPSDHandling
from hydrus.core.images import HydrusImageColours
from hydrus.core.images import HydrusImageMetadata
from hydrus.core.images import HydrusImageNormalisation
from hydrus.core.images import HydrusImageOpening

def EnableLoadTruncatedImages():
    
    if hasattr( PILImageFile, 'LOAD_TRUNCATED_IMAGES' ):
        
        # this can now cause load hangs due to the trunc load code adding infinite fake EOFs to the file stream, wew lad
        # hence debug only
        PILImageFile.LOAD_TRUNCATED_IMAGES = True
        
        return True
        
    else:
        
        return False
        
    

OLD_PIL_MAX_IMAGE_PIXELS = PILImage.MAX_IMAGE_PIXELS
PILImage.MAX_IMAGE_PIXELS = None # this turns off decomp check entirely, wew

if cv2.__version__.startswith( '2' ):
    
    CV_IMREAD_FLAGS_PNG = cv2.CV_LOAD_IMAGE_UNCHANGED
    CV_IMREAD_FLAGS_JPEG = CV_IMREAD_FLAGS_PNG
    CV_IMREAD_FLAGS_WEIRD = CV_IMREAD_FLAGS_PNG
    
    CV_JPEG_THUMBNAIL_ENCODE_PARAMS = []
    CV_PNG_THUMBNAIL_ENCODE_PARAMS = []
    
else:
    
    # allows alpha channel
    CV_IMREAD_FLAGS_PNG = cv2.IMREAD_UNCHANGED
    # this preserves colour info but does EXIF reorientation and flipping
    CV_IMREAD_FLAGS_JPEG = cv2.IMREAD_ANYDEPTH | cv2.IMREAD_ANYCOLOR
    # this seems to allow weirdass tiffs to load as non greyscale, although the LAB conversion 'whitepoint' or whatever can be wrong
    CV_IMREAD_FLAGS_WEIRD = CV_IMREAD_FLAGS_PNG
    
    CV_JPEG_THUMBNAIL_ENCODE_PARAMS = [ cv2.IMWRITE_JPEG_QUALITY, 92 ]
    CV_PNG_THUMBNAIL_ENCODE_PARAMS = [ cv2.IMWRITE_PNG_COMPRESSION, 9 ]
    

PIL_ONLY_MIMETYPES = { HC.ANIMATION_GIF, HC.IMAGE_ICON, HC.IMAGE_WEBP, HC.IMAGE_QOI, HC.IMAGE_BMP }.union( HC.PIL_HEIF_MIMES )

def MakeClipRectFit( image_resolution, clip_rect ):
    
    ( im_width, im_height ) = image_resolution
    ( x, y, clip_width, clip_height ) = clip_rect
    
    x = max( 0, x )
    y = max( 0, y )
    
    clip_width = min( clip_width, im_width )
    clip_height = min( clip_height, im_height )
    
    if x + clip_width > im_width:
        
        x = im_width - clip_width
        
    
    if y + clip_height > im_height:
        
        y = im_height - clip_height
        
    
    return ( x, y, clip_width, clip_height )
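`MakeClipRectFit` is a pure clamp with no hydrus dependencies, so its behaviour is easy to exercise in isolation. A standalone copy (same logic, lowercase name to mark it as a sketch) showing how out-of-bounds requests get pulled back inside the image:

```python
def make_clip_rect_fit( image_resolution, clip_rect ):
    
    # standalone copy of MakeClipRectFit: clamp the requested clip region
    # to the image bounds, preserving its size where possible
    
    ( im_width, im_height ) = image_resolution
    ( x, y, clip_width, clip_height ) = clip_rect
    
    x = max( 0, x )
    y = max( 0, y )
    
    clip_width = min( clip_width, im_width )
    clip_height = min( clip_height, im_height )
    
    if x + clip_width > im_width:
        
        x = im_width - clip_width
        
    
    if y + clip_height > im_height:
        
        y = im_height - clip_height
        
    
    return ( x, y, clip_width, clip_height )
```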
def ClipNumPyImage( numpy_image: numpy.array, clip_rect ):
    
    if len( numpy_image.shape ) == 3:
        
        ( im_height, im_width, depth ) = numpy_image.shape
        
    else:
        
        ( im_height, im_width ) = numpy_image.shape
        
    
    ( x, y, clip_width, clip_height ) = MakeClipRectFit( ( im_width, im_height ), clip_rect )
    
    return numpy_image[ y : y + clip_height, x : x + clip_width ]
    

def ClipPILImage( pil_image: PILImage.Image, clip_rect ):
    
    ( x, y, clip_width, clip_height ) = MakeClipRectFit( pil_image.size, clip_rect )
    
    return pil_image.crop( box = ( x, y, x + clip_width, y + clip_height ) )
    

def GenerateNumPyImage( path, mime, force_pil = False ) -> numpy.array:
    
    if HG.media_load_report_mode:
        
        HydrusData.ShowText( 'Loading media: ' + path )
        
    
    if mime == HC.APPLICATION_PSD:
        
        if HG.media_load_report_mode:
            
            HydrusData.ShowText( 'Loading PSD' )
            
        
        pil_image = HydrusPSDHandling.MergedPILImageFromPSD( path )
        
        numpy_image = GenerateNumPyImageFromPILImage( pil_image )
        
        return HydrusImageNormalisation.StripOutAnyUselessAlphaChannel( numpy_image )
        
    
    if mime in PIL_ONLY_MIMETYPES:
        
        force_pil = True
        
    
    if not force_pil:
        
        try:
            
            pil_image = HydrusImageOpening.RawOpenPILImage( path )
            
            try:
                
                pil_image.verify()
                
            except:
                
                raise HydrusExceptions.UnsupportedFileException()
                
            
            # I and F are some sort of 32-bit monochrome or whatever, doesn't seem to work in PIL well, with or without ICC
            if pil_image.mode not in ( 'I', 'F' ):
                
                if pil_image.mode == 'LAB':
                    
                    force_pil = True
                    
                
                if HydrusImageMetadata.HasICCProfile( pil_image ):
                    
                    if HG.media_load_report_mode:
                        
                        HydrusData.ShowText( 'Image has ICC, so switching to PIL' )
                        
                    
                    force_pil = True
                    
                
            
        except HydrusExceptions.UnsupportedFileException:
            
            # pil had trouble, let's cross our fingers cv can do it
            pass
            
        
    
    if force_pil:
        
        if HG.media_load_report_mode:
            
            HydrusData.ShowText( 'Loading with PIL' )
            
        
        pil_image = GeneratePILImage( path )
        
        numpy_image = GenerateNumPyImageFromPILImage( pil_image )
        
    else:
        
        if HG.media_load_report_mode:
            
            HydrusData.ShowText( 'Loading with OpenCV' )
            
        
        if mime in ( HC.IMAGE_JPEG, HC.IMAGE_TIFF ):
            
            flags = CV_IMREAD_FLAGS_JPEG
            
        elif mime == HC.IMAGE_PNG:
            
            flags = CV_IMREAD_FLAGS_PNG
            
        else:
            
            flags = CV_IMREAD_FLAGS_WEIRD
            
        
        numpy_image = cv2.imread( path, flags = flags )
        
        if numpy_image is None: # doesn't support some random stuff
            
            if HG.media_load_report_mode:
                
                HydrusData.ShowText( 'OpenCV Failed, loading with PIL' )
                
            
            pil_image = GeneratePILImage( path )
            
            numpy_image = GenerateNumPyImageFromPILImage( pil_image )
            
        else:
            
            numpy_image = HydrusImageNormalisation.DequantizeFreshlyLoadedNumPyImage( numpy_image )
            
        
    
    numpy_image = HydrusImageNormalisation.StripOutAnyUselessAlphaChannel( numpy_image )
    
    return numpy_image
    
def GenerateNumPyImageFromPILImage( pil_image: PILImage.Image ) -> numpy.array:
    
    # this seems to magically work, I guess asarray either has a match for Image or Image provides some common shape/datatype properties that it can hook into
    return numpy.asarray( pil_image )
    
    # old method:
    '''
    ( w, h ) = pil_image.size
    
    try:
        
        s = pil_image.tobytes()
        
    except OSError as e: # e.g. OSError: unrecognized data stream contents when reading image file
        
        raise HydrusExceptions.UnsupportedFileException( str( e ) )
        
    
    depth = len( s ) // ( w * h )
    
    return numpy.fromstring( s, dtype = 'uint8' ).reshape( ( h, w, depth ) )
    '''
    

def GeneratePILImage( path, dequantize = True ) -> PILImage.Image:
    
    pil_image = HydrusImageOpening.RawOpenPILImage( path )
    
    if pil_image is None:
        
        raise Exception( 'The file at {} could not be rendered!'.format( path ) )
        
    
    pil_image = HydrusImageNormalisation.RotateEXIFPILImage( pil_image )
    
    if dequantize:
        
        # note this destroys animated gifs atm, it collapses down to one frame
        pil_image = HydrusImageNormalisation.DequantizePILImage( pil_image )
        
    
    return pil_image
    

def GeneratePILImageFromNumPyImage( numpy_image: numpy.array ) -> PILImage.Image:
    
    # I'll leave this here as a neat artifact, but I really shouldn't ever be making a PIL from a cv2 image. the only PIL benefits are the .info dict, which this won't generate
    
    if len( numpy_image.shape ) == 2:
        
        ( h, w ) = numpy_image.shape
        
        format = 'L'
        
    else:
        
        ( h, w, depth ) = numpy_image.shape
        
        if depth == 1:
            
            format = 'L'
            
        elif depth == 2:
            
            format = 'LA'
            
        elif depth == 3:
            
            format = 'RGB'
            
        elif depth == 4:
            
            format = 'RGBA'
            
        
    
    pil_image = PILImage.frombytes( format, ( w, h ), numpy_image.data.tobytes() )
    
    return pil_image
    
def GenerateThumbnailNumPyFromStaticImagePath( path, target_resolution, mime, clip_rect = None ):
    
    numpy_image = GenerateNumPyImage( path, mime )
    
    if clip_rect is not None:
        
        numpy_image = ClipNumPyImage( numpy_image, clip_rect )
        
    
    thumbnail_numpy_image = ResizeNumPyImage( numpy_image, target_resolution )
    
    return thumbnail_numpy_image
    

def GenerateThumbnailBytesFromNumPy( numpy_image ) -> bytes:
    
    if len( numpy_image.shape ) == 2:
        
        depth = 3
        
        convert = cv2.COLOR_GRAY2RGB
        
    else:
        
        ( im_height, im_width, depth ) = numpy_image.shape
        
        numpy_image = HydrusImageNormalisation.StripOutAnyUselessAlphaChannel( numpy_image )
        
        if depth == 4:
            
            convert = cv2.COLOR_RGBA2BGRA
            
        else:
            
            convert = cv2.COLOR_RGB2BGR
            
        
    
    numpy_image = cv2.cvtColor( numpy_image, convert )
    
    ( im_height, im_width, depth ) = numpy_image.shape
    
    if depth == 4:
        
        ext = '.png'
        
        params = CV_PNG_THUMBNAIL_ENCODE_PARAMS
        
    else:
        
        ext = '.jpg'
        
        params = CV_JPEG_THUMBNAIL_ENCODE_PARAMS
        
    
    ( result_success, result_byte_array ) = cv2.imencode( ext, numpy_image, params )
    
    if result_success:
        
        thumbnail_bytes = result_byte_array.tobytes() # tostring is deprecated in modern numpy
        
        return thumbnail_bytes
        
    else:
        
        raise HydrusExceptions.CantRenderWithCVException( 'Thumb failed to encode!' )
        
    

def GenerateThumbnailBytesFromPIL( pil_image: PILImage.Image ) -> bytes:
    
    f = io.BytesIO()
    
    if HydrusImageColours.PILImageHasTransparency( pil_image ):
        
        pil_image.save( f, 'PNG' )
        
    else:
        
        pil_image.save( f, 'JPEG', quality = 92 )
        
    
    f.seek( 0 )
    
    thumbnail_bytes = f.read()
    
    f.close()
    
    return thumbnail_bytes
    

def GeneratePNGBytesNumPy( numpy_image ) -> bytes:
    
    ( im_height, im_width, depth ) = numpy_image.shape
    
    ext = '.png'
    
    if depth == 4:
        
        convert = cv2.COLOR_RGBA2BGRA
        
    else:
        
        convert = cv2.COLOR_RGB2BGR
        
    
    numpy_image = cv2.cvtColor( numpy_image, convert )
    
    ( result_success, result_byte_array ) = cv2.imencode( ext, numpy_image )
    
    if result_success:
        
        return result_byte_array.tobytes() # tostring is deprecated in modern numpy
        
    else:
        
        raise HydrusExceptions.CantRenderWithCVException( 'Image failed to encode!' )
        
    
def GetImagePixelHash( path, mime ) -> bytes:
    
    numpy_image = GenerateNumPyImage( path, mime )
    
    return GetImagePixelHashNumPy( numpy_image )
    

def GetImagePixelHashNumPy( numpy_image ):
    
    return hashlib.sha256( numpy_image.data.tobytes() ).digest()
    

def GetImageResolution( path, mime ):
    
    # PIL first here, rather than numpy, as it loads image headers real quick
    try:
        
        pil_image = GeneratePILImage( path, dequantize = False )
        
        ( width, height ) = pil_image.size
        
    except HydrusExceptions.DamagedOrUnusualFileException:
        
        # desperate situation
        numpy_image = GenerateNumPyImage( path, mime )
        
        if len( numpy_image.shape ) == 3:
            
            ( height, width, depth ) = numpy_image.shape
            
        else:
            
            ( height, width ) = numpy_image.shape
            
        
    
    width = max( width, 1 )
    height = max( height, 1 )
    
    return ( width, height )
    

def GetResolutionNumPy( numpy_image ):
    
    ( image_height, image_width, depth ) = numpy_image.shape
    
    return ( image_width, image_height )
    
THUMBNAIL_SCALE_DOWN_ONLY = 0
THUMBNAIL_SCALE_TO_FIT = 1
THUMBNAIL_SCALE_TO_FILL = 2

thumbnail_scale_str_lookup = {
    THUMBNAIL_SCALE_DOWN_ONLY : 'scale down only',
    THUMBNAIL_SCALE_TO_FIT : 'scale to fit',
    THUMBNAIL_SCALE_TO_FILL : 'scale to fill'
}

def GetThumbnailResolutionAndClipRegion( image_resolution: typing.Tuple[ int, int ], bounding_dimensions: typing.Tuple[ int, int ], thumbnail_scale_type: int, thumbnail_dpr_percent: int ):
    
    clip_rect = None
    
    ( im_width, im_height ) = image_resolution
    ( bounding_width, bounding_height ) = bounding_dimensions
    
    if thumbnail_dpr_percent != 100:
        
        thumbnail_dpr = thumbnail_dpr_percent / 100
        
        bounding_height = int( bounding_height * thumbnail_dpr )
        bounding_width = int( bounding_width * thumbnail_dpr )
        
    
    if im_width is None:
        
        im_width = bounding_width
        
    
    if im_height is None:
        
        im_height = bounding_height
        
    
    # TODO: SVG thumbs should always scale up to the bounding dimensions
    
    if thumbnail_scale_type == THUMBNAIL_SCALE_DOWN_ONLY:
        
        if bounding_width >= im_width and bounding_height >= im_height:
            
            return ( clip_rect, ( im_width, im_height ) )
            
        
    
    width_ratio = im_width / bounding_width
    height_ratio = im_height / bounding_height
    
    thumbnail_width = bounding_width
    thumbnail_height = bounding_height
    
    if thumbnail_scale_type in ( THUMBNAIL_SCALE_DOWN_ONLY, THUMBNAIL_SCALE_TO_FIT ):
        
        if width_ratio > height_ratio:
            
            thumbnail_height = im_height / width_ratio
            
        elif height_ratio > width_ratio:
            
            thumbnail_width = im_width / height_ratio
            
        
    elif thumbnail_scale_type == THUMBNAIL_SCALE_TO_FILL:
        
        if width_ratio == height_ratio:
            
            # we have something that fits the bounding region perfectly, no clip region required
            
            pass
            
        else:
            
            clip_x = 0
            clip_y = 0
            clip_width = im_width
            clip_height = im_height
            
            if width_ratio > height_ratio:
                
                clip_width = max( int( im_width * height_ratio / width_ratio ), 1 )
                clip_x = ( im_width - clip_width ) // 2
                
            elif height_ratio > width_ratio:
                
                clip_height = max( int( im_height * width_ratio / height_ratio ), 1 )
                clip_y = ( im_height - clip_height ) // 2
                
            
            clip_rect = ( clip_x, clip_y, clip_width, clip_height )
            
        
    
    thumbnail_width = max( int( thumbnail_width ), 1 )
    thumbnail_height = max( int( thumbnail_height ), 1 )
    
    return ( clip_rect, ( thumbnail_width, thumbnail_height ) )
    
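The scale-to-fit branch above shrinks (or grows) toward the bounding box on the tighter axis while preserving aspect ratio. A minimal pure-Python sketch of just that arithmetic, without the DPR, clip, or scale-down-only handling (the helper name is illustrative):

```python
def scale_to_fit( image_resolution, bounding_dimensions ):
    
    # simplified sketch of the THUMBNAIL_SCALE_TO_FIT branch: the axis with
    # the larger image/bounds ratio is pinned to the bounds, the other shrinks
    
    ( im_width, im_height ) = image_resolution
    ( bounding_width, bounding_height ) = bounding_dimensions
    
    width_ratio = im_width / bounding_width
    height_ratio = im_height / bounding_height
    
    thumbnail_width = bounding_width
    thumbnail_height = bounding_height
    
    if width_ratio > height_ratio:
        
        thumbnail_height = im_height / width_ratio
        
    elif height_ratio > width_ratio:
        
        thumbnail_width = im_width / height_ratio
        
    
    return ( max( int( thumbnail_width ), 1 ), max( int( thumbnail_height ), 1 ) )
```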
def IsDecompressionBomb( path ) -> bool:
    
    # there are two errors here, the 'Warning' and the 'Error', which atm is just a test vs a test x 2 for number of pixels
    # 256MB bmp by default, ( 1024 ** 3 ) // 4 // 3
    # we'll set it at 512MB, and now catching the error should be about 1GB
    
    PILImage.MAX_IMAGE_PIXELS = ( 512 * ( 1024 ** 2 ) ) // 3
    
    warnings.simplefilter( 'error', PILImage.DecompressionBombError )
    
    try:
        
        HydrusImageOpening.RawOpenPILImage( path )
        
    except PILImage.DecompressionBombError:
        
        return True
        
    except:
        
        # pil was unable to load it, which does not mean it was a decomp bomb
        return False
        
    finally:
        
        PILImage.MAX_IMAGE_PIXELS = None
        
        warnings.simplefilter( 'ignore', PILImage.DecompressionBombError )
        
    
    return False
    
def ResizeNumPyImage( numpy_image: numpy.array, target_resolution, forced_interpolation = None ) -> numpy.array:
    
    ( target_width, target_height ) = target_resolution
    ( image_width, image_height ) = GetResolutionNumPy( numpy_image )
    
    if target_width == image_width and target_height == image_height:
        
        return numpy_image
        
    elif target_width > image_width or target_height > image_height:
        
        interpolation = cv2.INTER_LANCZOS4
        
    else:
        
        interpolation = cv2.INTER_AREA
        
    
    if forced_interpolation is not None:
        
        interpolation = forced_interpolation
        
    
    return cv2.resize( numpy_image, ( target_width, target_height ), interpolation = interpolation )
    
@@ -0,0 +1,50 @@
import numpy
import numpy.core.multiarray # important this comes before cv!

import cv2
from PIL import Image as PILImage
import warnings

try:
    
    # more hidden imports for pyinstaller
    
    import numpy.random.common # pylint: disable=E0401
    import numpy.random.bounded_integers # pylint: disable=E0401
    import numpy.random.entropy # pylint: disable=E0401
    
except:
    
    pass # old version of numpy, screw it
    

if not hasattr( PILImage, 'DecompressionBombError' ):
    
    # super old versions don't have this, so let's just make a stub, wew
    
    class DBEStub( Exception ):
        
        pass
        
    
    PILImage.DecompressionBombError = DBEStub
    

if not hasattr( PILImage, 'DecompressionBombWarning' ):
    
    # super old versions don't have this, so let's just make a stub, wew
    
    class DBWStub( Exception ):
        
        pass
        
    
    PILImage.DecompressionBombWarning = DBWStub
    

warnings.simplefilter( 'ignore', PILImage.DecompressionBombWarning )
warnings.simplefilter( 'ignore', PILImage.DecompressionBombError )

# PIL moaning about weirdo TIFFs
warnings.filterwarnings( "ignore", "(Possibly )?corrupt EXIF data", UserWarning )
warnings.filterwarnings( "ignore", "Metadata Warning", UserWarning )
@@ -0,0 +1,206 @@
import os
import typing

from PIL import Image as PILImage

from hydrus.core import HydrusExceptions

def GetEmbeddedFileText( pil_image: PILImage.Image ) -> typing.Optional[ str ]:
    
    def render_dict( d, prefix ):
        
        texts = []
        
        keys = sorted( d.keys() )
        
        for key in keys:
            
            if key in ( 'exif', 'icc_profile' ):
                
                continue
                
            
            value = d[ key ]
            
            if isinstance( value, bytes ):
                
                continue
                
            
            if isinstance( value, dict ):
                
                value_string = render_dict( value, prefix = '    ' + prefix )
                
                if value_string is None:
                    
                    continue
                    
                
            else:
                
                value_string = '    {}{}'.format( prefix, value )
                
            
            row_text = '{}{}:'.format( prefix, key )
            row_text += os.linesep
            row_text += value_string
            
            texts.append( row_text )
            
        
        if len( texts ) > 0:
            
            return os.linesep.join( texts )
            
        else:
            
            return None
            
        
    
    if hasattr( pil_image, 'info' ):
        
        try:
            
            return render_dict( pil_image.info, '' )
            
        except:
            
            pass
            
        
    
    return None
    
def GetEXIFDict( pil_image: PILImage.Image ) -> typing.Optional[ dict ]:
    
    if pil_image.format in ( 'JPEG', 'TIFF', 'PNG', 'WEBP', 'HEIF', 'AVIF', 'MPO' ):
        
        try:
            
            exif_dict = pil_image.getexif()._get_merged_dict()
            
            if len( exif_dict ) > 0:
                
                return exif_dict
                
            
        except:
            
            pass
            
        
    
    return None
    

def GetICCProfileBytes( pil_image: PILImage.Image ) -> bytes:
    
    if HasICCProfile( pil_image ):
        
        return pil_image.info[ 'icc_profile' ]
        
    
    raise HydrusExceptions.DataMissing( 'This image has no ICC profile!' )
    

# bigger number is worse quality
# this is very rough and misses some finesse
def GetJPEGQuantizationQualityEstimate( pil_image: PILImage.Image ):
    
    if hasattr( pil_image, 'quantization' ):
        
        table_arrays = list( pil_image.quantization.values() )
        
        if len( table_arrays ) == 0:
            
            return ( 'unknown', None )
            
        
        quality = sum( ( sum( table_array ) for table_array in table_arrays ) )
        
        quality /= len( table_arrays )
        
        if quality >= 3400:
            
            label = 'very low'
            
        elif quality >= 2000:
            
            label = 'low'
            
        elif quality >= 1400:
            
            label = 'medium low'
            
        elif quality >= 1000:
            
            label = 'medium'
            
        elif quality >= 700:
            
            label = 'medium high'
            
        elif quality >= 400:
            
            label = 'high'
            
        elif quality >= 200:
            
            label = 'very high'
            
        else:
            
            label = 'extremely high'
            
        
        return ( label, quality )
        
    
    return ( 'unknown', None )
    

def GetJpegSubsampling( pil_image: PILImage.Image ) -> str:
    
    from PIL import JpegImagePlugin
    
    result = JpegImagePlugin.get_sampling( pil_image )
    
    subsampling_str_lookup = {
        0 : '4:4:4',
        1 : '4:2:2',
        2 : '4:2:0'
    }
    
    return subsampling_str_lookup.get( result, 'unknown' )
    

def HasEXIF( pil_image: PILImage.Image ) -> bool:
    
    result = GetEXIFDict( pil_image )
    
    return result is not None
    

def HasHumanReadableEmbeddedMetadata( pil_image: PILImage.Image ) -> bool:
    
    result = GetEmbeddedFileText( pil_image )
    
    return result is not None
    

def HasICCProfile( pil_image: PILImage.Image ) -> bool:
    
    if 'icc_profile' in pil_image.info:
        
        icc_profile = pil_image.info[ 'icc_profile' ]
        
        if isinstance( icc_profile, bytes ) and len( icc_profile ) > 0:
            
            return True
            
        
    
    return False
    
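The quality estimate above sums each JPEG quantization table and averages across tables; larger values mean coarser quantization, i.e. lower quality. The threshold ladder can be restated as a data table (a sketch; the function name is illustrative and the input is the averaged table sum, not a PIL image):

```python
def jpeg_quality_label( quality: float ) -> str:
    
    # the threshold ladder from GetJPEGQuantizationQualityEstimate,
    # expressed as data: first cutoff met wins, bigger sums = worse quality
    
    thresholds = [
        ( 3400, 'very low' ),
        ( 2000, 'low' ),
        ( 1400, 'medium low' ),
        ( 1000, 'medium' ),
        ( 700, 'medium high' ),
        ( 400, 'high' ),
        ( 200, 'very high' )
    ]
    
    for ( cutoff, label ) in thresholds:
        
        if quality >= cutoff:
            
            return label
            
        
    
    return 'extremely high'
```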
@@ -0,0 +1,237 @@
import io

import numpy

import cv2

from PIL import Image as PILImage
from PIL import ImageCms as PILImageCms

from hydrus.core import HydrusData
from hydrus.core import HydrusExceptions
from hydrus.core.images import HydrusImageColours
from hydrus.core.images import HydrusImageMetadata

PIL_SRGB_PROFILE = PILImageCms.createProfile( 'sRGB' )

def DequantizeFreshlyLoadedNumPyImage( numpy_image: numpy.array ) -> numpy.array:
    
    # OpenCV loads images in BGR, and we want to normalise to RGB in general
    
    if numpy_image.dtype == 'uint16':
        
        numpy_image = numpy.array( numpy_image // 256, dtype = 'uint8' )
        
    
    shape = numpy_image.shape
    
    if len( shape ) == 2:
        
        # monochrome image
        
        convert = cv2.COLOR_GRAY2RGB
        
    else:
        
        ( im_y, im_x, depth ) = shape
        
        if depth == 4:
            
            convert = cv2.COLOR_BGRA2RGBA
            
        else:
            
            convert = cv2.COLOR_BGR2RGB
            
        
    
    numpy_image = cv2.cvtColor( numpy_image, convert )
    
    return numpy_image
    

def DequantizePILImage( pil_image: PILImage.Image ) -> PILImage.Image:
    
    if HydrusImageMetadata.HasICCProfile( pil_image ):
        
        try:
            
            pil_image = NormaliseICCProfilePILImageToSRGB( pil_image )
            
        except Exception as e:
            
            HydrusData.ShowException( e )
            
            HydrusData.ShowText( 'Failed to normalise image ICC profile.' )
            
        
    
    pil_image = NormalisePILImageToRGB( pil_image )
    
    return pil_image
    
def NormaliseICCProfilePILImageToSRGB( pil_image: PILImage.Image ) -> PILImage.Image:
    
    try:
        
        icc_profile_bytes = HydrusImageMetadata.GetICCProfileBytes( pil_image )
        
    except HydrusExceptions.DataMissing:
        
        return pil_image
        
    
    try:
        
        f = io.BytesIO( icc_profile_bytes )
        
        src_profile = PILImageCms.ImageCmsProfile( f )
        
        if pil_image.mode in ( 'L', 'LA' ):
            
            # had a bunch of LA pngs that turned pure white on RGBA ICC conversion
            # but they seem to work fine if we keep the colourspace the same for now
            # it is a mystery, I guess a PIL bug, but presumably L and LA are technically sRGB so it is still ok to do this
            
            outputMode = pil_image.mode
            
        else:
            
            if HydrusImageColours.PILImageHasTransparency( pil_image ):
                
                outputMode = 'RGBA'
                
            else:
                
                outputMode = 'RGB'
                
            
        
        pil_image = PILImageCms.profileToProfile( pil_image, src_profile, PIL_SRGB_PROFILE, outputMode = outputMode )
        
    except ( PILImageCms.PyCMSError, OSError ):
        
        # 'cannot build transform' and presumably some other fun errors
        # way more advanced than we can deal with, so we'll just no-op
        
        # OSError is due to an "OSError: cannot open profile from string" a user got
        # no idea, but that seems to be an ImageCms issue doing byte handling and ending up with an odd OSError?
        # or maybe somehow my PIL reader or BytesIO is sending a string for some reason?
        # in any case, nuke it for now
        
        pass
        
    
    pil_image = NormalisePILImageToRGB( pil_image )
    
    return pil_image
    

def NormalisePILImageToRGB( pil_image: PILImage.Image ) -> PILImage.Image:
    
    if HydrusImageColours.PILImageHasTransparency( pil_image ):
        
        desired_mode = 'RGBA'
        
    else:
        
        desired_mode = 'RGB'
        
    
    if pil_image.mode != desired_mode:
        
        if pil_image.mode == 'LAB':
            
            pil_image = PILImageCms.profileToProfile( pil_image, PILImageCms.createProfile( 'LAB' ), PIL_SRGB_PROFILE, outputMode = 'RGB' )
            
        else:
            
            pil_image = pil_image.convert( desired_mode )
            
        
    
    return pil_image
    
def RotateEXIFPILImage( pil_image: PILImage.Image ) -> PILImage.Image:
    
    exif_dict = HydrusImageMetadata.GetEXIFDict( pil_image )
    
    if exif_dict is not None:
        
        EXIF_ORIENTATION = 274
        
        if EXIF_ORIENTATION in exif_dict:
            
            orientation = exif_dict[ EXIF_ORIENTATION ]
            
            if orientation == 1:
                
                pass # normal
                
            elif orientation == 2:
                
                # mirrored horizontal
                
                pil_image = pil_image.transpose( PILImage.FLIP_LEFT_RIGHT )
                
            elif orientation == 3:
                
                # 180
                
                pil_image = pil_image.transpose( PILImage.ROTATE_180 )
                
            elif orientation == 4:
                
                # mirrored vertical
                
                pil_image = pil_image.transpose( PILImage.FLIP_TOP_BOTTOM )
                
            elif orientation == 5:
                
                # seems like these 90 degree rotations are wrong, but flipping them works for my posh example images, so I guess the PIL constants are odd
                
                # mirrored horizontal, then 90 CCW
                
                pil_image = pil_image.transpose( PILImage.FLIP_LEFT_RIGHT ).transpose( PILImage.ROTATE_90 )
                
            elif orientation == 6:
                
                # 90 CW
                
                pil_image = pil_image.transpose( PILImage.ROTATE_270 )
                
            elif orientation == 7:
                
                # mirrored horizontal, then 90 CCW
                
                pil_image = pil_image.transpose( PILImage.FLIP_LEFT_RIGHT ).transpose( PILImage.ROTATE_270 )
                
            elif orientation == 8:
                
                # 90 CCW
                
                pil_image = pil_image.transpose( PILImage.ROTATE_90 )
                
            
        
    
    return pil_image
    
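For reference, Pillow ships the same EXIF tag 274 handling as a one-call convenience helper. A minimal sketch, assuming a reasonably recent Pillow with `ImageOps.exif_transpose` available (the throwaway image here is hypothetical and carries no EXIF, so it passes through with its size unchanged):

```python
from PIL import Image, ImageOps

# exif_transpose applies the same orientation transposes as the
# ladder above, returning a copy with the orientation tag cleared
img = Image.new( 'RGB', ( 10, 20 ) )

transposed = ImageOps.exif_transpose( img )

# no EXIF data means no transpose, so dimensions are preserved
print( transposed.size )
```

The hand-rolled ladder above keeps the behaviour explicit per orientation value, which also makes it easy to log or special-case the odd-looking 90-degree constants.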
def StripOutAnyUselessAlphaChannel( numpy_image: numpy.ndarray ) -> numpy.ndarray:
    
    if HydrusImageColours.NumPyImageHasUselessAlphaChannel( numpy_image ):
        
        numpy_image = numpy_image[:,:,:3].copy()
        
        # old way, which doesn't actually remove the channel lmao lmao lmao
        '''
        convert = cv2.COLOR_RGBA2RGB
        
        numpy_image = cv2.cvtColor( numpy_image, convert )
        '''
        
    
    return numpy_image
    
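A quick sketch of the slicing trick above, with a hypothetical tiny RGBA array, just to show that `[:,:,:3]` plus `.copy()` yields an independent, contiguous three-channel array rather than a view into the original RGBA buffer:

```python
import numpy

# a small RGBA image whose alpha channel is fully opaque, i.e. useless
rgba = numpy.zeros( ( 4, 4, 4 ), dtype = numpy.uint8 )
rgba[ :, :, 3 ] = 255

# slice off the fourth channel; .copy() materialises the result so it
# no longer aliases the RGBA buffer
rgb = rgba[ :, :, :3 ].copy()

print( rgb.shape )  # (4, 4, 3)
```

Without the `.copy()`, the slice would be a strided view that still pins the full RGBA array in memory.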
@@ -0,0 +1,17 @@
from PIL import Image as PILImage

from hydrus.core import HydrusExceptions

def RawOpenPILImage( path ) -> PILImage.Image:
    
    try:
        
        pil_image = PILImage.open( path )
        
    except Exception as e:
        
        raise HydrusExceptions.DamagedOrUnusualFileException( 'Could not load the image--it was likely malformed!' ) from e
        
    
    return pil_image
    
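The wrapper above narrows PIL's varied open-time failures into one domain exception. A minimal sketch of the underlying behaviour using plain Pillow (the in-memory PNG and the garbage bytes are throwaway test data, not anything from the hydrus codebase):

```python
import io

from PIL import Image

# a valid in-memory PNG opens fine
buf = io.BytesIO()
Image.new( 'RGB', ( 2, 2 ) ).save( buf, format = 'PNG' )
buf.seek( 0 )

img = Image.open( buf )
print( img.size )  # (2, 2)

# garbage bytes raise at open time (UnidentifiedImageError), which the
# wrapper above converts into DamagedOrUnusualFileException
try:
    
    Image.open( io.BytesIO( b'not an image' ) )
    raised = False
    
except Exception:
    
    raised = True
    

print( raised )  # True
```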
@@ -16,8 +16,8 @@ except:
from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusExceptions
from hydrus.core import HydrusFileHandling
from hydrus.core import HydrusImageHandling
from hydrus.core import HydrusSerialisable
from hydrus.core.images import HydrusImageHandling
from hydrus.core.networking import HydrusNetwork

INT_PARAMS = { 'expires', 'num', 'since', 'content_type', 'action', 'status' }
@@ -1,22 +1,3 @@
import collections
import os
import random
import traceback

from twisted.internet import reactor, defer
from twisted.internet.threads import deferToThread
from twisted.protocols import amp

from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusData
from hydrus.core import HydrusDocumentHandling
from hydrus.core import HydrusExceptions
from hydrus.core import HydrusFileHandling
from hydrus.core import HydrusFlashHandling
from hydrus.core import HydrusGlobals as HG
from hydrus.core import HydrusImageHandling
from hydrus.core.networking import HydrusServerResources

# this is all just some old experiment for non-http network comms, never used
'''
class HydrusAMPCommand( amp.Command ):
@@ -16,10 +16,10 @@ from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusData
from hydrus.core import HydrusExceptions
from hydrus.core import HydrusGlobals as HG
from hydrus.core import HydrusImageHandling
from hydrus.core import HydrusTags
from hydrus.core import HydrusText
from hydrus.core import HydrusTime
from hydrus.core.images import HydrusImageHandling

from hydrus.client import ClientConstants as CC
from hydrus.client import ClientAPI
@@ -33,7 +33,6 @@ from hydrus.client.networking import ClientLocalServer
from hydrus.client.networking import ClientLocalServerResources
from hydrus.client.networking import ClientNetworkingContexts
from hydrus.client.search import ClientSearch
from hydrus.client.search import ClientSearchParseSystemPredicates

from hydrus.test import HelperFunctions
@@ -5,9 +5,9 @@ import unittest
from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusData
from hydrus.core import HydrusGlobals as HG
from hydrus.core import HydrusImageHandling
from hydrus.core import HydrusSerialisable
from hydrus.core import HydrusTime
from hydrus.core.images import HydrusImageHandling
from hydrus.core.networking import HydrusNetwork

from hydrus.client import ClientConstants as CC