Version 551

Hydrus Network Developer 2023-11-08 15:42:59 -06:00
parent 919c0ecacf
commit 75775a2e4d
No known key found for this signature in database
GPG Key ID: 76249F053212133C
40 changed files with 581 additions and 435 deletions

View File

@ -11,9 +11,12 @@ Short for <b>P</b>ublic <b>T</b>ag <b>R</b>epository, a now community managed r
Most of the things in this document also apply to [self-hosted servers](server.md), except for tag guidelines.
## Connecting to the PTR
The easiest method is to use the built in function, found under `help -> add the public tag repository`. For adding it manually, if you so desire, read the Hydrus help document on [access keys](access_keys.md).
~~If you are starting out completely fresh you can also download a fully synced client [here](https://koto.reisen/quicksync/)~~ After losing the processed client in a crash or something similar, Koto has decided against maintaining a quicksync; you will have to download and process it yourself. Though possibly a bit (a couple of days or so, usually) out of date, it will nonetheless save time. Some settings may differ from the defaults of an official installation.
Once you are connected Hydrus will proceed to download and then process the update files. The progress of this can be seen under `services -> review services -> remote -> tag repositories -> public tag repository`. Here you can view its status, your account (the default account is a shared public account. Currently only janitors and the administrator have personal accounts), tag status, and how synced you are. Being behind on the sync by a certain amount makes you unable to push tags and petitions until you are caught up again.
The easiest method is to use the built in function, found under `help -> add the public tag repository`. For adding it manually, if you so desire, read the Hydrus help document on [access keys](access_keys.md).
Once you are connected, Hydrus will proceed to download and then process the update files. The progress of this can be seen under `services -> review services -> remote -> tag repositories -> public tag repository`. Here you can view its status, your account (the default account is a shared public account. Currently only janitors and the administrator have personal accounts), tag status, and how synced you are. Being behind on the sync by a certain amount makes you unable to push tags and petitions until you are caught up again.
!!! note "QuickSync 2"
If you are starting out with a completely fresh client, you can instead download a fully pre-synced client [here](https://breadthread.gay/). Though a little out of date, it will nonetheless save time. Some settings may differ from the defaults of an official installation.
## How does it work?
For something to end up on the PTR it has to be pushed there. Tags can either be entered into the tag service manually by the user through the `manage tags` window, or be routed there by a parser when downloading files. See [parsing tags](getting_started_downloading.md). Once tags have been entered into the PTR tag service they are pending until pushed. This is indicated by the `pending ()` that will appear between `tags` and `help` in the menu bar. Here you can choose to either push your changes to the PTR or discard them.

View File

@ -7,6 +7,52 @@ title: Changelog
!!! note
This is the new changelog, only the most recent builds. For all versions, see the [old changelog](old_changelog.html).
## [Version 551](https://github.com/hydrusnetwork/hydrus/releases/tag/v551)
### misc
* thanks to a user, we have a new checkbox under _options->thumbnails_ that disables thumbnail fading. they'll just blink into place in one frame as soon as ready
* after looking at this code myself, I gave it a full clean. the actual thumbnail fade animation is now handled with some proper objects rather than a scatter of variables passed around
* I also doubled the default fade time to 500ms. I expect I'll add an option for it, especially if we rework all this into the proper Qt animation engine and get it performing better
* fixed the crashes users on PyQt were seeing! I made one tiny change (1->1.0) last week, and PyQt didn't like it, so any view of Mr Bones or 'open externally' panels, or the media viewer top-right ratings hover was leading to program instability
* the system predicates for 'has/no duration', 'has/no frames', 'has/no notes', 'has/no words' (i.e. the respective 'num x' system pred, but either = 0 or >0) are now aware that they are each others' inverse, so if you ctrl+double-click or do similar edit actions, they'll flip
* updated the 'PTR for dummies' page to link to a new QuickSync source, kindly maintained and hosted by a user
### code cleanup and misc bug fixes
* sped up some random iteration across the program (e.g. when deciding which order to waterfall thumbnails in, which can suffer from overhead if you do a fast giganto-scroll)
* cleaned up the code that does image alpha channel (transparency) detection, comparison, and stripping
* unified how the variety of image loads and conversions perform the 'strip this image of useless transparency data' normalisation step. thumbnails from krita, svg, and pdf are now stripped of useless alpha. also, all 'import this serialised object png' avenues now handle pngs with spurious alpha
* I think I fixed the alpha channel stripping code to handle 'LA' (greyscale with transparency) files. if you try to import a hydrus serialised object png file that is for some crazy reason now LA, I think it'll work!
* when a files popup message filters its current files and the count goes to 0 (happens if you re-click the button after deleting everything it has to show), the message now auto-dismisses itself (previously it was nuking the button but staying as a thin strip of null panel space)
* fixed a bug where `system:date` predicates were displaying labels an hour off (usually midnight -> 11pm, thus cycling back to the previous day) thanks to the clocks changing (in the USA) last weekend. I suspect there is more of this, here and there, so let me know what you see
* fixed a counting typo error with the delete files code when you delete the last file in a domain but the domain thinks it already has 0 files
* fixed up similar code across the database to forestall future typos on SQLite SUMs
* improved and unified the 'hydrus temp dir' management code. if the specific per-process hydrus temp dir is cleared out by an external factor (I'm guessing just the OS cleaning up during a long running client session), hydrus should just simply make a new folder as needed. with luck, this will fix a problem with drag and drop export that ran into this
* fingers crossed (I have little idea what I am doing and no convenient test platform!), the Docker build of the client will now have PDF and Charts support
### many file move/copy error handling improvements
* _tl;dr: if hydrus can't put a file somewhere, it deals with that better now_
* improved how the file move/merge function reports its errors, and how all its callers handle them
* the 'rename a file's file extension when its filetype changes' job now correctly recognises when it fails to rename a file due to a reason other than the file being currently in use
* import folders now correctly detect when they fail to 'move' action a file out after processing
* the check file integrity routine now correctly detects when it fails to move a damaged file from file storage to a landing zone in the main db directory. this failure now cancels the job properly and prints a nicer error to the log
* improved how the file copy/mirror function reports its errors, and how all its callers handle them
* saving a serialised object png now properly catches a 'transfer from temp dir to dest location' move error
* the internal database backup and restore routines now detect file copy errors better
* a drag and drop export operation that wants to put the files in the temp dir and also fails to collect its files nicely now correctly raises an error
* failing to set the mpv file on options save (and the subsequent mpv-load action) now reports its error correctly
* exporting update files now handles a missing update file more gracefully
* mergedirectory and mirrordirectory now fail instantly after any single error, rather than several
* added some more file/directory pre-checks to all the merge/mirror functions
* deleted some old unused code here
### client api
* thanks to a user, the Client API now has a 'generate_hashes' endpoint that returns the sha256 hash (and pixel hash and perceptual hashes of any appropriate image file) of any file you give it
* the client api version is now 55
## [Version 550](https://github.com/hydrusnetwork/hydrus/releases/tag/v550)
### misc
@ -23,7 +69,7 @@ title: Changelog
* after vacillating and talking about it for months, I finally reworked how 'scale to fill' thumbnails work. as sometimes happens, I only had to change about six critical lines of code to get the core functionality changed and nothing seems to have exploded
* the main change here is KISS--'fill' thumbnail image files on disk are no longer clipped to just the viewable area, but the whole image scaled to fill the thumbnail space (with exceptions for extreme cases). this change gives us some simplicity and flexibility behind the scenes, saves some regeneration work when the user only changes one thumbnail dimension setting, improves maintenance tasks based off the thumbnail (like blurhash), and means that the Client API can fetch your thumbs and still have something useful to display
* if you have 'scale to fit' set, hydrus will regenerate your thumbnails naturally as you browse the client. fingers crossed, you won't notice any visual difference through the transition
* if you have 'scale to fill' set, hydrus will regenerate your thumbnails naturally as you browse the client. fingers crossed, you won't notice any visual difference through the transition
* 'open externally' button panels now display their thumbnails with more reasonable maximum dimensions, and when things are gonk for whatever reason, they should nonetheless be centered correctly
* as a side thing, this change allowed me to finally purge all the clipping tech from the thumbnail pipeline, where it had obtusely sunk in to every possible filetype thumbgen
@ -344,51 +390,3 @@ title: Changelog
* refactored many of my hardcoded special unicode characters to constants in HC. not sure I really like all the spammed `{HC.UNICODE_ELLIPSIS}` though, so might revisit
* fixed an issue with last week's update code that was affecting users with a history of certain database damage
* I may have improved import support for some damaged or generally very strange image files by falling back to OpenCV for resolution parsing when Pillow fails
## [Version 541](https://github.com/hydrusnetwork/hydrus/releases/tag/v541)
### misc
* fixed the gallery downloader and thread watcher loading with the 'clear highlight' button enabled despite there being nothing currently highlighted
* to fix the darkmode tooltips on the new Qt 6.5.2 on Windows (the text is stuck on a dark grey, which is unreadable in darkmodes), all the default darkmode styles now have an 'alternate-tooltip-colour' variant, which swaps out the tooltip background colour for the much brighter normal widget text colour
* rewrote the apng parser to work much faster on large files. someone encountered a 200MB giga apng that locked up the client for minutes. now it takes a second or two (unfortunately it looks like that huge apng breaks mpv, but there we go)
* the 'media' options page has two new checkboxes--'hide uninteresting import/modified times'--which allow you to turn off the media viewer behaviour where import and modified times similar to the 'added to my files xxx days ago' are hidden
* reworked the layout of the 'media' options page. everything is in sections now and re-ordered a bit
* the 'other file is a pixel-for-pixel duplicate png!' statements will now only show if the complement is a jpeg, gif, or webp. this statement isn't so appropriate for formats like PSD
* a variety of tricky tags like `:>=` are now searchable in normal autocomplete lookup. a test that determined whether to use a slower but more capable search was misfiring
* the client api key editing window has a new 'check all permissions' button
* fixed the updates I made last week to the missing-master-file-id recovery system. I made a stupid typo and didn't test it properly, fixed now. sorry for the trouble!
* thanks to a user, the help has a bunch of updated screenshots and fixed references to old concepts
* did a little more reformatting and cleanup of 'getting started with downloading' help document and added a short section on note import options
* cleaned up some of the syntax in our various batch files. fingers crossed, the setup_venv.bat script will absolutely retain the trailing space after its questions now, no matter what whitespace my IDE and github want to trim
### string joiner
* the parsing system has a new String Processor object--the 'String Joiner'. this is a simple concatenator that takes the list of strings and joins them together. it has two variables: what joining text to use, e.g. ', ', or '-', or empty string '' for simple concatenation; and an optional 'group size', which lets you join every two or three or n strings in 1-2-3, 1-2-3, 1-2-3 style patterns
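* as a rough sketch of that joining behaviour in plain Python (the function name and signature here are purely illustrative, not the parsing system's actual String Joiner object):

```python
def join_strings( strings, joiner = ', ', group_size = None ):
    
    # with no group size, just concatenate everything with the joining text
    if group_size is None:
        
        return [ joiner.join( strings ) ]
        
    
    # otherwise join every 'group_size' strings together, 1-2-3, 1-2-3 style
    return [ joiner.join( strings[ i : i + group_size ] ) for i in range( 0, len( strings ), group_size ) ]
    

join_strings( [ 'a', 'b', 'c', 'd', 'e', 'f' ], joiner = '-', group_size = 3 ) # [ 'a-b-c', 'd-e-f' ]
```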
### new file types
* thanks to a user, we now have support for QOI (a png-like lossless image type) and procreate (Apple image project file) files. the former has full support; the latter has thumbnails
* QOI needs Pillow 9.5 at least, so if you are on a super old 'running from source' version, try rebuilding your venv; or cope with your QOI-lessness
### client api
* thanks to a user, we now have `/add_tags/get_siblings_and_parents`, which, given a set of tags, shows their sibling and parent display rules for each service
* I wrote some help and unit tests for this
* client api version is now 51
### file storage (mostly boring)
* the file storage system is creaky and ugly to use. I have prepped some longer-term upgrades, mostly by writing new tools and cleaning and reworking existing code. I am nowhere near to done, but I'd like us to have four new features in the nearish future:
- dynamic-length subfolders (where instead of a fixed set of 256 x00-xff folders, we can bump up to 4096 x000-xfff, and beyond, based on total number of files)
- setting fixed space limits on particular database locations (e.g. 'no more than 200GB of files here') to complement the current weight system
- permitting multiple valid locations for a particular subfolder prefix
- slow per-file background migration between valid subfolders, rather than the giganto folder-atomic program-blocking 'move files now' button in database maintenance
* so, it is pretty boring so far, but I did the following:
* wrote a new class to handle a specific file storage subfolder and spammed it everywhere, replacing previous location and prefix juggling
* wrote some new tools to scan and check the coverage of multiple locations and dynamic-length subfolders
* rewrote the file location database initialisation, storage, testing, updating, and repair to support multiple valid locations
* updated the database to hold 'max num bytes' per file storage location
* the feature to migrate the SQLite database files and then restart is removed from the 'migrate database' dialog. it was always ultrajank in a place that really shouldn't be, and it was completely user-unfriendly. just move things manually, while the client is closed
* the old 'recover and merge surplus database locations into the correct position' side feature in 'move files now' is removed. it was always a little jank, was very rarely actually helpful, and had zero reporting. it will return in the new system as a better one-shot maintenance job
* touched up the migrated database help a little

View File

@ -34,6 +34,47 @@
<div class="content">
<h1 id="changelog"><a href="#changelog">changelog</a></h1>
<ul>
<li>
<h2 id="version_551"><a href="#version_551">version 551</a></h2>
<ul>
<li><h3>misc</h3></li>
<li>thanks to a user, we have a new checkbox under _options->thumbnails_ that disables thumbnail fading. they'll just blink into place in one frame as soon as ready</li>
<li>after looking at this code myself, I gave it a full clean. the actual thumbnail fade animation is now handled with some proper objects rather than a scatter of variables passed around</li>
<li>I also doubled the default fade time to 500ms. I expect I'll add an option for it, especially if we rework all this into the proper Qt animation engine and get it performing better</li>
<li>fixed the crashes users on PyQt were seeing! I made one tiny change (1->1.0) last week, and PyQt didn't like it, so any view of Mr Bones or 'open externally' panels, or the media viewer top-right ratings hover was leading to program instability</li>
<li>the system predicates for 'has/no duration', 'has/no frames', 'has/no notes', 'has/no words' (i.e. the respective 'num x' system pred, but either = 0 or >0) are now aware that they are each others' inverse, so if you ctrl+double-click or do similar edit actions, they'll flip</li>
<li>updated the 'PTR for dummies' page to link to a new QuickSync source, kindly maintained and hosted by a user</li>
<li><h3>code cleanup and misc bug fixes</h3></li>
<li>sped up some random iteration across the program (e.g. when deciding which order to waterfall thumbnails in, which can suffer from overhead if you do a fast giganto-scroll)</li>
<li>cleaned up the code that does image alpha channel (transparency) detection, comparison, and stripping</li>
<li>unified how the variety of image loads and conversions perform the 'strip this image of useless transparency data' normalisation step. thumbnails from krita, svg, and pdf are now stripped of useless alpha. also, all 'import this serialised object png' avenues now handle pngs with spurious alpha</li>
<li>I think I fixed the alpha channel stripping code to handle 'LA' (greyscale with transparency) files. if you try to import a hydrus serialised object png file that is for some crazy reason now LA, I think it'll work!</li>
<li>when a files popup message filters its current files and the count goes to 0 (happens if you re-click the button after deleting everything it has to show), the message now auto-dismisses itself (previously it was nuking the button but staying as a thin strip of null panel space)</li>
<li>fixed a bug where `system:date` predicates were displaying labels an hour off (usually midnight -> 11pm, thus cycling back to the previous day) thanks to the clocks changing (in the USA) last weekend. I suspect there is more of this, here and there, so let me know what you see</li>
<li>fixed a counting typo error with the delete files code when you delete the last file in a domain but the domain thinks it already has 0 files</li>
<li>fixed up similar code across the database to forestall future typos on SQLite SUMs</li>
<li>improved and unified the 'hydrus temp dir' management code. if the specific per-process hydrus temp dir is cleared out by an external factor (I'm guessing just the OS cleaning up during a long running client session), hydrus should just simply make a new folder as needed. with luck, this will fix a problem with drag and drop export that ran into this</li>
<li>fingers crossed (I have little idea what I am doing and no convenient test platform!), the Docker build of the client will now have PDF and Charts support</li>
<li><h3>many file move/copy error handling improvements</h3></li>
<li>_tl;dr: if hydrus can't put a file somewhere, it deals with that better now_</li>
<li>improved how the file move/merge function reports its errors, and how all its callers handle them</li>
<li>the 'rename a file's file extension when its filetype changes' job now correctly recognises when it fails to rename a file due to a reason other than the file being currently in use</li>
<li>import folders now correctly detect when they fail to 'move' action a file out after processing</li>
<li>the check file integrity routine now correctly detects when it fails to move a damaged file from file storage to a landing zone in the main db directory. this failure now cancels the job properly and prints a nicer error to the log</li>
<li>improved how the file copy/mirror function reports its errors, and how all its callers handle them</li>
<li>saving a serialised object png now properly catches a 'transfer from temp dir to dest location' move error</li>
<li>the internal database backup and restore routines now detect file copy errors better</li>
<li>a drag and drop export operation that wants to put the files in the temp dir and also fails to collect its files nicely now correctly raises an error</li>
<li>failing to set the mpv file on options save (and the subsequent mpv-load action) now reports its error correctly</li>
<li>exporting update files now handles a missing update file more gracefully</li>
<li>mergedirectory and mirrordirectory now fail instantly after any single error, rather than several</li>
<li>added some more file/directory pre-checks to all the merge/mirror functions</li>
<li>deleted some old unused code here</li>
<li><h3>client api</h3></li>
<li>thanks to a user, the Client API now has a 'generate_hashes' endpoint that returns the sha256 hash (and pixel hash and perceptual hashes of any appropriate image file) of any file you give it</li>
<li>the client api version is now 55</li>
</ul>
</li>
<li>
<h2 id="version_550"><a href="#version_550">version 550</a></h2>
<ul>
@ -48,7 +89,7 @@
<li><h3>thumbnail fill</h3></li>
<li>after vacillating and talking about it for months, I finally reworked how 'scale to fill' thumbnails work. as sometimes happens, I only had to change about six critical lines of code to get the core functionality changed and nothing seems to have exploded</li>
<li>the main change here is KISS--'fill' thumbnail image files on disk are no longer clipped to just the viewable area, but the whole image scaled to fill the thumbnail space (with exceptions for extreme cases). this change gives us some simplicity and flexibility behind the scenes, saves some regeneration work when the user only changes one thumbnail dimension setting, improves maintenance tasks based off the thumbnail (like blurhash), and means that the Client API can fetch your thumbs and still have something useful to display</li>
<li>if you have 'scale to fit' set, hydrus will regenerate your thumbnails naturally as you browse the client. fingers crossed, you won't notice any visual difference through the transition</li>
<li>if you have 'scale to fill' set, hydrus will regenerate your thumbnails naturally as you browse the client. fingers crossed, you won't notice any visual difference through the transition</li>
<li>'open externally' button panels now display their thumbnails with more reasonable maximum dimensions, and when things are gonk for whatever reason, they should nonetheless be centered correctly</li>
<li>as a side thing, this change allowed me to finally purge all the clipping tech from the thumbnail pipeline, where it had obtusely sunk in to every possible filetype thumbgen</li>
<li><h3>eager login system</h3></li>

View File

@ -387,17 +387,19 @@ class ClientFilesManager( object ):
raise Exception( message )
successful = HydrusPaths.MirrorFile( source_path, dest_path )
if not successful:
try:
message = 'Copying the file from "{}" to "{}" failed! Details should be shown and other import queues should be paused. You should shut the client down now and fix this!'.format( source_path, dest_path )
HydrusPaths.MirrorFile( source_path, dest_path )
except Exception as e:
message = f'Copying the file from "{source_path}" to "{dest_path}" failed! Details should be shown and other import queues should be paused. You should shut the client down now and fix this!'
HydrusData.ShowText( message )
self._HandleCriticalDriveError()
raise Exception( 'There was a problem copying the file from ' + source_path + ' to ' + dest_path + '!' )
raise Exception( message ) from e
@ -1273,8 +1275,6 @@ class ClientFilesManager( object ):
try:
is_an_orphan = False
( directory, filename ) = os.path.split( path )
should_be_a_hex_hash = filename[:64]
@ -1300,7 +1300,20 @@ class ClientFilesManager( object ):
HydrusData.Print( 'Moving the orphan ' + path + ' to ' + dest )
HydrusPaths.MergeFile( path, dest )
try:
HydrusPaths.MergeFile( path, dest )
except Exception as e:
HydrusData.ShowText( f'Had trouble moving orphan from {path} to {dest}! Abandoning job!' )
HydrusData.ShowException( e, do_wait = False )
job_key.Cancel()
return
orphan_paths.append( path )
@ -1745,6 +1758,11 @@ class ClientFilesManager( object ):
path = self._GenerateExpectedThumbnailPath( hash )
if not os.path.exists( path ):
raise Exception()
thumbnail_mime = HydrusFileHandling.GetThumbnailMime( path )
numpy_image = ClientImageHandling.GenerateNumPyImage( path, thumbnail_mime )
@ -1773,6 +1791,7 @@ class ClientFilesManager( object ):
return do_it
def Reinit( self ):
@ -2110,7 +2129,14 @@ class FilesMaintenanceManager( object ):
dest_path = os.path.join( error_dir, os.path.basename( path ) )
HydrusPaths.MergeFile( path, dest_path )
try:
HydrusPaths.MergeFile( path, dest_path )
except Exception as e:
raise Exception( f'Could not move the damaged file "{path}" to "{dest_path}"!' ) from e
if not self._pubbed_message_about_invalid_file_export:

View File

@ -1343,10 +1343,7 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):
def GetKey( self, name ):
with self._lock:
return bytes.fromhex( self.GetKeyHex( name ) )
return bytes.fromhex( self.GetKeyHex( name ) )
def GetAllKeysHex( self ):

View File

@ -172,7 +172,7 @@ def DumpToPNG( width, payload_bytes, title, payload_description, text, path ):
HydrusData.ShowException( e )
raise Exception( 'Could not save the png!' )
raise Exception( 'Could not save the png!' ) from e
finally:
@ -242,6 +242,7 @@ def LoadFromQtImage( qt_image: QG.QImage ):
return LoadFromNumPyImage( numpy_image )
def LoadFromPNG( path ):
# this is to deal with unicode paths, which cv2 can't handle
@ -269,13 +270,15 @@ def LoadFromPNG( path ):
pil_image = HydrusImageHandling.GeneratePILImage( temp_path, dequantize = False )
# leave strip_useless_alpha = True in here just to catch the very odd LA situation
numpy_image = HydrusImageHandling.GenerateNumPyImageFromPILImage( pil_image )
except Exception as e:
HydrusData.ShowException( e )
raise Exception( '"{}" did not appear to be a valid image!'.format( path ) )
raise Exception( '"{}" did not appear to be a valid image!'.format( path ) ) from e
@ -286,6 +289,7 @@ def LoadFromPNG( path ):
return LoadFromNumPyImage( numpy_image )
def LoadFromNumPyImage( numpy_image: numpy.array ):
try:

View File

@ -109,7 +109,7 @@ class GIFRenderer( object ):
self._pil_canvas = current_frame
numpy_image = HydrusImageHandling.GenerateNumPyImageFromPILImage( self._pil_canvas )
numpy_image = HydrusImageHandling.GenerateNumPyImageFromPILImage( self._pil_canvas, strip_useless_alpha = False )
self._next_render_index = ( self._next_render_index + 1 ) % self._num_frames

View File

@ -1896,7 +1896,7 @@ class DB( HydrusDB.HydrusDB ):
chosen_media_id = None
chosen_hash_id = None
for potential_media_id in potential_media_ids:
for potential_media_id in HydrusData.IterateListRandomlyAndFast( potential_media_ids ):
best_king_hash_id = self.modules_files_duplicates.GetBestKingId( potential_media_id, db_location_context, allowed_hash_ids = chosen_allowed_hash_ids, preferred_hash_ids = chosen_preferred_hash_ids )
@ -5823,14 +5823,7 @@ class DB( HydrusDB.HydrusDB ):
result = self._Execute( 'SELECT SUM( size ) FROM files_info WHERE hash_id IN ' + HydrusData.SplayListForDB( hash_ids ) + ';' ).fetchone()
if result is None:
total_size = 0
else:
( total_size, ) = result
total_size = self._GetSumResult( result )
self.modules_service_paths.SetServiceDirectory( service_id, hash_ids, dirname, total_size, note )
@ -10468,10 +10461,10 @@ class DB( HydrusDB.HydrusDB ):
else:
# if someone backs up with an older version that does not have as many db files as this version, we get conflict
# don't want to delete just in case, but we will move it out the way
# if the current database (and thus software) is newer and has a spare client.wew.db file, we get a confusing conflict on restart that tries to create a fresh wew file
# it is useless without the other stuff we are overwriting anyway, so delete it
HydrusPaths.MergeFile( dest, dest + '.old' )
HydrusPaths.DeletePath( dest )

View File

@ -193,12 +193,7 @@ class ClientDBFilesMetadataBasic( ClientDBModule.ClientDBModule ):
if result is None:
return 0
( total_size, ) = result
total_size = self._GetSumResult( result )
return total_size

View File

@ -830,7 +830,7 @@ class ClientDBFilesStorage( ClientDBModule.ClientDBModule ):
# hashes to size
result = self._Execute( 'SELECT SUM( size ) FROM {} CROSS JOIN files_info USING ( hash_id );'.format( current_files_table_name ) ).fetchone()
( count, ) = result
count = self._GetSumResult( result )
return count

View File

@ -257,9 +257,7 @@ class ClientDBMaintenance( ClientDBModule.ClientDBModule ):
HG.client_controller.pub( 'modal_message', job_key )
random.shuffle( names_to_analyze )
for name in names_to_analyze:
for name in HydrusData.IterateListRandomlyAndFast( names_to_analyze ):
HG.client_controller.frame_splash_status.SetText( 'analyzing ' + name )
job_key.SetStatusText( 'analyzing ' + name )
@ -434,17 +432,11 @@ class ClientDBMaintenance( ClientDBModule.ClientDBModule ):
def DeferredDropTable( self, table_name: str ):
try:
if not self._TableExists( table_name ):
self._Execute( f'SELECT 1 FROM {table_name};' ).fetchone()
except:
# table doesn't exist I guess!
return
schema = 'main'
table_name_without_schema = table_name
if '.' in table_name:
@ -590,7 +582,7 @@ class ClientDBMaintenance( ClientDBModule.ClientDBModule ):
return last_shutdown_work_time
def GetTableNamesDueAnalysis( self, force_reanalyze = False ):
def GetTableNamesDueAnalysis( self, force_reanalyze = False ) -> typing.List:
db_names = [ name for ( index, name, path ) in self._Execute( 'PRAGMA database_list;' ) if name not in ( 'mem', 'temp', 'durable_temp' ) ]

View File

@ -533,14 +533,7 @@ class ClientDBMappingsCounts( ClientDBModule.ClientDBModule ):
result = self._Execute( 'SELECT SUM( current_count ) FROM {};'.format( counts_cache_table_name ) ).fetchone()
if result is None or result[0] is None:
count = 0
else:
( count, ) = result
count = self._GetSumResult( result )
return count

View File

@ -536,7 +536,7 @@ class ExportFolder( HydrusSerialisable.SerialisableBaseNamed ):
client_files_manager = HG.client_controller.client_files_manager
num_copied = 0
num_actually_copied = 0
for ( i, media_result ) in enumerate( media_results ):
@ -599,21 +599,21 @@ class ExportFolder( HydrusSerialisable.SerialisableBaseNamed ):
copied = True
actually_copied = True
else:
copied = False
actually_copied = False
else:
copied = HydrusPaths.MirrorFile( source_path, dest_path )
actually_copied = HydrusPaths.MirrorFile( source_path, dest_path )
if copied:
if actually_copied:
num_copied += 1
num_actually_copied += 1
HydrusPaths.TryToGiveFileNicePermissionBits( dest_path )
@ -627,9 +627,9 @@ class ExportFolder( HydrusSerialisable.SerialisableBaseNamed ):
sync_paths.add( dest_path )
if num_copied > 0:
if num_actually_copied > 0:
HydrusData.Print( 'Export folder ' + self._name + ' exported ' + HydrusData.ToHumanInt( num_copied ) + ' files.' )
HydrusData.Print( 'Export folder ' + self._name + ' exported ' + HydrusData.ToHumanInt( num_actually_copied ) + ' files.' )
if self._export_type == HC.EXPORT_FOLDER_TYPE_SYNCHRONISE:

View File

@ -123,10 +123,7 @@ def DoFileExportDragDrop( window, page_key, media, alt_down ):
dnd_path = os.path.join( dnd_temp_dir, filename )
if not os.path.exists( dnd_path ):
HydrusPaths.MirrorFile( original_path, dnd_path )
HydrusPaths.MirrorFile( original_path, dnd_path )
dnd_paths.append( dnd_path )

View File

@ -12,6 +12,7 @@ from hydrus.core import HydrusText
from hydrus.client.gui import QtInit
from hydrus.client.gui import QtPorting as QP
from hydrus.core.images import HydrusImageNormalisation
def ClientToScreen( win: QW.QWidget, pos: QC.QPoint ) -> QC.QPoint:
@ -68,7 +69,7 @@ def ConvertPixelsToTextWidth( window, pixels, round_down = False ) -> int:
return round( pixels / one_char_width )
def ConvertQtImageToNumPy( qt_image: QG.QImage ):
def ConvertQtImageToNumPy( qt_image: QG.QImage, strip_useless_alpha = True ):
# _ _ _ _
# _ | | (_) _ | | | |
@ -149,6 +150,11 @@ def ConvertQtImageToNumPy( qt_image: QG.QImage ):
numpy_image = numpy_padded[ :, : -excess_bytes_to_trim ].reshape( ( height, width, depth ) )
if strip_useless_alpha:
numpy_image = HydrusImageNormalisation.StripOutAnyUselessAlphaChannel( numpy_image )
return numpy_image
def ConvertTextToPixels( window, char_dimensions ) -> typing.Tuple[ int, int ]:

View File

@ -322,6 +322,14 @@ class PopupMessage( PopupWindow ):
self._job_key.SetFiles( presented_hashes, attached_files_label )
if len( presented_hashes ) == 0:
if self._job_key.IsDone():
self.TryToDismiss()
if len( presented_hashes ) > 0:

View File

@ -4769,11 +4769,9 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
# I tried <100%, but Qt seems to cap it to 1.0. Sad!
self._thumbnail_dpr_percentage = ClientGUICommon.BetterSpinBox( self, min = 100, max = 800 )
tt = 'If your OS runs at an UI scale greater than 100%, mirror it here and your thumbnails will look crisp. If you have multiple monitors at different UI scales, or you change UI scale regularly, set it to the largest one you use.'
tt += os.linesep * 2
tt += 'I believe the UI scale on the monitor this dialog opened on was {}'.format( HydrusData.ConvertFloatToPercentage( self.devicePixelRatio() ) )
self._thumbnail_dpr_percentage.setToolTip( tt )
self._video_thumbnail_percentage_in = ClientGUICommon.BetterSpinBox( self, min=0, max=100 )
@ -4782,12 +4780,12 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
self._thumbnail_visibility_scroll_percent.setToolTip( 'Lower numbers will cause fewer scrolls, higher numbers more.' )
self._allow_blurhash_fallback = QW.QCheckBox( self )
tt = 'If hydrus does not have a thumbnail for a file (e.g. you are looking at a deleted file, or one unexpectedly missing), but it does know its blurhash, it will generate a blurry thumbnail based off that blurhash. Turning this behaviour off here will make it always show the default "hydrus" thumbnail.'
self._allow_blurhash_fallback.setToolTip( tt )
self._fade_thumbnails = QW.QCheckBox( self )
tt = 'Whenever thumbnails change (appearing on a page, selecting, an icon or tag banner changes), they normally fade from the old to the new. If you would rather they change instantly, in one frame, uncheck this.'
self._fade_thumbnails.setToolTip( tt )
self._focus_preview_on_ctrl_click = QW.QCheckBox( self )
self._focus_preview_on_ctrl_click_only_static = QW.QCheckBox( self )
@ -4849,7 +4847,7 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
rows.append( ( ' Only on files with no duration: ', self._focus_preview_on_shift_click_only_static ) )
rows.append( ( 'Generate video thumbnails this % in: ', self._video_thumbnail_percentage_in ) )
rows.append( ( 'Use blurhash missing thumbnail fallback: ', self._allow_blurhash_fallback ) )
rows.append( ( 'Fade in thumbnails: ', self._fade_thumbnails ) )
rows.append( ( 'Fade thumbnails: ', self._fade_thumbnails ) )
rows.append( ( 'Do not scroll down on key navigation if thumbnail at least this % visible: ', self._thumbnail_visibility_scroll_percent ) )
rows.append( ( 'EXPERIMENTAL: Scroll thumbnails at this rate per scroll tick: ', self._thumbnail_scroll_rate ) )
rows.append( ( 'EXPERIMENTAL: Image path for thumbnail panel background image (set blank to clear): ', self._media_background_bmp_path ) )

View File

@ -1054,7 +1054,7 @@ class MPVWidget( CAC.ApplicationCommandProcessorMixin, QW.QWidget ):
self._previous_conf_content_bytes = conf_content_bytes
#To load an existing config file (by default it doesn't load the user/global config like standalone mpv does):
# To load an existing config file (by default it doesn't load the user/global config like standalone mpv does):
load_f = getattr( mpv, '_mpv_load_config_file', None )

View File

@ -3,6 +3,7 @@ import itertools
import os
import random
import time
import typing
from qtpy import QtCore as QC
from qtpy import QtWidgets as QW
@ -50,6 +51,93 @@ from hydrus.client.media import ClientMediaFileFilter
from hydrus.client.metadata import ClientTags
from hydrus.client.search import ClientSearch
FRAME_DURATION_60FPS = 1.0 / 60
class ThumbnailWaitingToBeDrawn( object ):
def __init__( self, hash, thumbnail, thumbnail_index, bitmap ):
self.hash = hash
self.thumbnail = thumbnail
self.thumbnail_index = thumbnail_index
self.bitmap = bitmap
self._draw_complete = False
def DrawComplete( self ) -> bool:
return self._draw_complete
def DrawDue( self ) -> bool:
return True
def DrawToPainter( self, x: int, y: int, painter: QG.QPainter ):
painter.drawImage( x, y, self.bitmap )
self._draw_complete = True
class ThumbnailWaitingToBeDrawnAnimated( ThumbnailWaitingToBeDrawn ):
FADE_DURATION_S = 0.5
def __init__( self, hash, thumbnail, thumbnail_index, bitmap ):
ThumbnailWaitingToBeDrawn.__init__( self, hash, thumbnail, thumbnail_index, bitmap )
self.num_frames_drawn = 0
self.num_frames_to_draw = max( int( self.FADE_DURATION_S // FRAME_DURATION_60FPS ), 1 )
opacity_factor = max( 0.05, 1 / ( self.num_frames_to_draw / 3 ) )
self.alpha_bmp = QP.AdjustOpacity( self.bitmap, opacity_factor )
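# each repeated draw of this low-opacity copy composites over the last, so the thumbnail builds towards full opacity across the fade; the final frame paints the full-strength bitmap to finish cleanly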
self.animation_started = HydrusTime.GetNowPrecise()
def _GetNumFramesOutstanding( self ):
now = HydrusTime.GetNowPrecise()
num_frames_to_now = int( ( now - self.animation_started ) // FRAME_DURATION_60FPS )
return min( num_frames_to_now, self.num_frames_to_draw - self.num_frames_drawn )
def DrawDue( self ) -> bool:
return self._GetNumFramesOutstanding() > 0
def DrawToPainter( self, x: int, y: int, painter: QG.QPainter ):
num_frames_to_draw = self._GetNumFramesOutstanding()
if self.num_frames_drawn + num_frames_to_draw >= self.num_frames_to_draw:
painter.drawImage( x, y, self.bitmap )
self.num_frames_drawn = self.num_frames_to_draw
self._draw_complete = True
else:
for i in range( num_frames_to_draw ):
painter.drawImage( x, y, self.alpha_bmp )
self.num_frames_drawn += num_frames_to_draw
class MediaPanel( CAC.ApplicationCommandProcessorMixin, ClientMedia.ListeningMediaList, QW.QScrollArea ):
selectedMediaTagPresentationChanged = QC.Signal( list, bool )
@ -2602,7 +2690,7 @@ class MediaPanelThumbnails( MediaPanel ):
self._drag_init_coordinates = None
self._drag_prefire_event_count = 0
self._thumbnails_being_faded_in = {}
self._hashes_to_thumbnails_waiting_to_be_drawn: typing.Dict[ bytes, ThumbnailWaitingToBeDrawn ] = {}
self._hashes_faded = set()
MediaPanel.__init__( self, parent, page_key, management_controller, media_results )
@ -2731,7 +2819,7 @@ class MediaPanelThumbnails( MediaPanel ):
painter.setCompositionMode( comp_mode )
else:
painter.setBackground( QG.QBrush( bg_colour ) )
painter.eraseRect( painter.viewport() )
@ -2833,11 +2921,20 @@ class MediaPanelThumbnails( MediaPanel ):
self._StopFading( hash )
bmp = thumbnail.GetQtImage( self.devicePixelRatio() )
bitmap = thumbnail.GetQtImage( self.devicePixelRatio() )
alpha_bmp = QP.AdjustOpacity( bmp, 0.20 )
fade_thumbnails = HG.client_controller.new_options.GetBoolean( 'fade_thumbnails' )
self._thumbnails_being_faded_in[ hash ] = ( bmp, alpha_bmp, thumbnail_index, thumbnail, now_precise, 0 )
if fade_thumbnails:
thumbnail_draw_object = ThumbnailWaitingToBeDrawnAnimated( hash, thumbnail, thumbnail_index, bitmap )
else:
thumbnail_draw_object = ThumbnailWaitingToBeDrawn( hash, thumbnail, thumbnail_index, bitmap )
self._hashes_to_thumbnails_waiting_to_be_drawn[ hash ] = thumbnail_draw_object
HG.client_controller.gui.RegisterAnimationUpdateWindow( self )
@ -3277,11 +3374,9 @@ class MediaPanelThumbnails( MediaPanel ):
def _StopFading( self, hash ):
if hash in self._thumbnails_being_faded_in:
if hash in self._hashes_to_thumbnails_waiting_to_be_drawn:
( bmp, alpha_bmp, thumbnail_index, thumbnail, animation_started, num_frames ) = self._thumbnails_being_faded_in[ hash ]
del self._thumbnails_being_faded_in[ hash ]
del self._hashes_to_thumbnails_waiting_to_be_drawn[ hash ]
@ -4438,7 +4533,7 @@ class MediaPanelThumbnails( MediaPanel ):
self.verticalScrollBar().setSingleStep( int( round( thumbnail_span_height * thumbnail_scroll_rate ) ) )
self._thumbnails_being_faded_in = {}
self._hashes_to_thumbnails_waiting_to_be_drawn = {}
self._hashes_faded = set()
self._ReinitialisePageCacheIfNeeded()
@ -4450,120 +4545,81 @@ class MediaPanelThumbnails( MediaPanel ):
def TIMERAnimationUpdate( self ):
FRAME_DURATION = 1.0 / 60
fade_thumbnails = HG.client_controller.new_options.GetBoolean( 'fade_thumbnails' )
NUM_FRAMES_TO_FILL_IN = 15 if fade_thumbnails else 0
loop_started = HydrusTime.GetNowPrecise()
loop_should_break_time = loop_started + ( FRAME_DURATION / 2 )
loop_should_break_time = HydrusTime.GetNowPrecise() + ( FRAME_DURATION_60FPS / 2 )
( thumbnail_span_width, thumbnail_span_height ) = self._GetThumbnailSpanDimensions()
thumbnail_margin = HG.client_controller.new_options.GetInteger( 'thumbnail_margin' )
hashes = list( self._thumbnails_being_faded_in.keys() )
hashes = list( self._hashes_to_thumbnails_waiting_to_be_drawn.keys() )
random.shuffle( hashes )
dcs = {}
y_start = self._GetYStart()
earliest_y = y_start
page_indices_to_painters = {}
page_height = self._num_rows_per_canvas_page * thumbnail_span_height
for hash in hashes:
for hash in HydrusData.IterateListRandomlyAndFast( hashes ):
( original_bmp, alpha_bmp, thumbnail_index, thumbnail, animation_started, num_frames_rendered ) = self._thumbnails_being_faded_in[ hash ]
num_frames_supposed_to_be_rendered = int( ( loop_started - animation_started ) / FRAME_DURATION )
num_frames_to_render = num_frames_supposed_to_be_rendered - num_frames_rendered
if num_frames_to_render == 0:
continue
thumbnail_draw_object = self._hashes_to_thumbnails_waiting_to_be_drawn[ hash ]
delete_entry = False
try:
if thumbnail_draw_object.DrawDue():
expected_thumbnail = self._sorted_media[ thumbnail_index ]
thumbnail_index = thumbnail_draw_object.thumbnail_index
except:
expected_thumbnail = None
page_index = self._GetPageIndexFromThumbnailIndex( thumbnail_index )
if expected_thumbnail != thumbnail:
delete_entry = True
elif page_index not in self._clean_canvas_pages:
delete_entry = True
else:
times_to_draw = 1
if num_frames_supposed_to_be_rendered >= NUM_FRAMES_TO_FILL_IN:
try:
bmp_to_use = original_bmp
expected_thumbnail = self._sorted_media[ thumbnail_index ]
except:
expected_thumbnail = None
page_index = self._GetPageIndexFromThumbnailIndex( thumbnail_index )
if expected_thumbnail != thumbnail_draw_object.thumbnail:
delete_entry = True
elif page_index not in self._clean_canvas_pages:
delete_entry = True
else:
times_to_draw = num_frames_to_render
thumbnail_col = thumbnail_index % self._num_columns
bmp_to_use = alpha_bmp
thumbnail_row = thumbnail_index // self._num_columns
num_frames_rendered += times_to_draw
x = thumbnail_col * thumbnail_span_width + thumbnail_margin
self._thumbnails_being_faded_in[ hash ] = ( original_bmp, alpha_bmp, thumbnail_index, thumbnail, animation_started, num_frames_rendered )
y = ( thumbnail_row - ( page_index * self._num_rows_per_canvas_page ) ) * thumbnail_span_height + thumbnail_margin
thumbnail_col = thumbnail_index % self._num_columns
thumbnail_row = thumbnail_index // self._num_columns
x = thumbnail_col * thumbnail_span_width + thumbnail_margin
y = ( thumbnail_row - ( page_index * self._num_rows_per_canvas_page ) ) * thumbnail_span_height + thumbnail_margin
if page_index not in dcs:
if page_index not in page_indices_to_painters:
canvas_page = self._clean_canvas_pages[ page_index ]
painter = QG.QPainter( canvas_page )
page_indices_to_painters[ page_index ] = painter
canvas_page = self._clean_canvas_pages[ page_index ]
painter = page_indices_to_painters[ page_index ]
painter = QG.QPainter( canvas_page )
thumbnail_draw_object.DrawToPainter( x, y, painter )
dcs[ page_index ] = painter
#
painter = dcs[ page_index ]
for i in range( times_to_draw ):
page_virtual_y = page_height * page_index
painter.drawImage( x, y, bmp_to_use )
self.widget().update( QC.QRect( x, page_virtual_y + y, thumbnail_span_width - thumbnail_margin, thumbnail_span_height - thumbnail_margin ) )
#
page_virtual_y = page_height * page_index
self.widget().update( QC.QRect( x, page_virtual_y + y, thumbnail_span_width - thumbnail_margin, thumbnail_span_height - thumbnail_margin ) )
if delete_entry:
if thumbnail_draw_object.DrawComplete() or delete_entry:
del self._thumbnails_being_faded_in[ hash ]
del self._hashes_to_thumbnails_waiting_to_be_drawn[ hash ]
if HydrusTime.TimeHasPassedPrecise( loop_should_break_time ):
@ -4572,7 +4628,7 @@ class MediaPanelThumbnails( MediaPanel ):
if len( self._thumbnails_being_faded_in ) == 0:
if len( self._hashes_to_thumbnails_waiting_to_be_drawn ) == 0:
HG.client_controller.gui.UnregisterAnimationUpdateWindow( self )
@ -4650,8 +4706,6 @@ class MediaPanelThumbnails( MediaPanel ):
y_start = self._parent._GetYStart()
earliest_y = y_start
bg_colour = HG.client_controller.new_options.GetColour( CC.COLOUR_THUMBGRID_BACKGROUND )
painter.setBackground( QG.QBrush( bg_colour ) )

View File

@ -11,7 +11,6 @@ from hydrus.core import HydrusExceptions
from hydrus.core import HydrusGlobals as HG
from hydrus.core import HydrusFileHandling
from hydrus.core.images import HydrusImageHandling
from hydrus.core.images import HydrusImageNormalisation
from hydrus.client import ClientConstants as CC
from hydrus.client import ClientImageHandling
@ -2138,8 +2137,6 @@ class PanelPredicateSystemSimilarToData( PanelPredicateSystemSingle ):
numpy_image = ClientGUIFunctions.ConvertQtImageToNumPy( qt_image )
numpy_image = HydrusImageNormalisation.StripOutAnyUselessAlphaChannel( numpy_image )
pixel_hash = HydrusImageHandling.GetImagePixelHashNumPy( numpy_image )
perceptual_hashes = ClientImageHandling.GenerateShapePerceptualHashesNumPy( numpy_image )

View File

@ -2933,7 +2933,7 @@ class ReviewServiceRepositorySubPanel( QW.QWidget ):
try:
update_path = client_files_manager.GetFilePath( update_hash, HC.APPLICATION_HYDRUS_UPDATE_CONTENT, check_file_exists = False )
update_path = client_files_manager.GetFilePath( update_hash, HC.APPLICATION_HYDRUS_UPDATE_CONTENT )
dest_path = os.path.join( dest_dir, update_hash.hex() )

View File

@ -786,8 +786,8 @@ class BufferedWindowIcon( BufferedWindow ):
painter.setRenderHint( QG.QPainter.SmoothPixmapTransform, True ) # makes any scaling here due to jank thumbs look good
x_offset = ( self.width() - self._pixmap.width() ) / 2
y_offset = ( self.height() - self._pixmap.height() ) / 2
x_offset = int( ( self.width() - self._pixmap.width() ) / 2 )
y_offset = int( ( self.height() - self._pixmap.height() ) / 2 )
if isinstance( self._pixmap, QG.QImage ):

View File

@ -1035,12 +1035,7 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):
status_hook( 'copying file to temp location' )
copied = HydrusPaths.MirrorFile( path, temp_path )
if not copied:
raise Exception( 'File failed to copy to temp path--see log for error.' )
HydrusPaths.MirrorFile( path, temp_path )
self.Import( temp_path, file_import_options, status_hook = status_hook )

View File

@ -661,7 +661,7 @@ class ImportFolder( HydrusSerialisable.SerialisableBaseNamed ):
except Exception as e:
HydrusData.ShowText( 'Import folder tried to move ' + path + ', but could not:' )
HydrusData.ShowText( f'Import folder tried to move "{path}", but it encountered an error:' )
HydrusData.ShowException( e )
@ -690,7 +690,7 @@ class ImportFolder( HydrusSerialisable.SerialisableBaseNamed ):
except Exception as e:
raise Exception( 'Tried to check existence of "{}", but could not.'.format( path ) )
raise Exception( 'Tried to check existence of "{}", but could not.'.format( path ) ) from e

View File

@ -123,6 +123,8 @@ def PublishPresentationHashes( publishing_label, hashes, publish_to_popup_button
files_job_key.SetVariable( 'attached_files_mergable', True )
files_job_key.SetFiles( list( hashes ), publishing_label )
files_job_key.Finish() # important to later make it auto-dismiss on all files disappearing
HG.client_controller.pub( 'message', files_job_key )

View File

@ -2137,6 +2137,25 @@ class Predicate( HydrusSerialisable.SerialisableBase ):
return Predicate( self._predicate_type, not self._value )
elif self._predicate_type in ( PREDICATE_TYPE_SYSTEM_NUM_NOTES, PREDICATE_TYPE_SYSTEM_NUM_WORDS, PREDICATE_TYPE_SYSTEM_NUM_FRAMES, PREDICATE_TYPE_SYSTEM_DURATION ):
( operator, value ) = self._value
number_test = NumberTest.STATICCreateFromCharacters( operator, value )
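# 'x = 0' and 'x > 0' are each other's inverse; any more specific numeric test has no clean inverse, so we fall through to None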
if number_test.IsZero():
return Predicate( self._predicate_type, ( '>', 0 ) )
elif number_test.IsAnythingButZero():
return Predicate( self._predicate_type, ( '=', 0 ) )
else:
return None
elif self._predicate_type == PREDICATE_TYPE_SYSTEM_FILE_RELATIONSHIPS_KING:
return Predicate( self._predicate_type, not self._value )
@ -2656,8 +2675,6 @@ class Predicate( HydrusSerialisable.SerialisableBase ):
dt = datetime.datetime( year, month, day, hour, minute )
timestamp = HydrusTime.DateTimeToTimestamp( dt )
if operator == '<':
pretty_operator = 'before '
@ -2677,7 +2694,7 @@ class Predicate( HydrusSerialisable.SerialisableBase ):
include_24h_time = operator != '=' and ( hour > 0 or minute > 0 )
base += ': ' + pretty_operator + HydrusTime.TimestampToPrettyTime( timestamp, include_24h_time = include_24h_time )
base += ': ' + pretty_operator + HydrusTime.DateTimeToPrettyTime( dt, include_24h_time = include_24h_time )
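# formatting from the datetime itself, rather than round-tripping through a unix timestamp, stops the label drifting an hour (and sometimes a day) over a DST change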

View File

@ -103,8 +103,8 @@ options = {}
# Misc
NETWORK_VERSION = 20
SOFTWARE_VERSION = 550
CLIENT_API_VERSION = 54
SOFTWARE_VERSION = 551
CLIENT_API_VERSION = 55
SERVER_THUMBNAIL_DIMENSIONS = ( 200, 200 )

View File

@ -162,9 +162,9 @@ class HydrusController( HydrusControllerInterface.HydrusControllerInterface ):
raise NotImplementedError()
def _InitTempDir( self ):
def _InitHydrusTempDir( self ):
self.temp_dir = HydrusTemp.GetHydrusTempDir()
self._hydrus_temp_dir = HydrusTemp.InitialiseHydrusTempDir()
def _MaintainCallToThreads( self ):
@ -455,6 +455,16 @@ class HydrusController( HydrusControllerInterface.HydrusControllerInterface ):
return self._caches[ name ]
def GetHydrusTempDir( self ):
if not os.path.exists( self._hydrus_temp_dir ):
self._InitHydrusTempDir()
return self._hydrus_temp_dir
def GetJobSchedulerSnapshot( self, scheduler_name ):
if scheduler_name == 'fast':
@ -548,7 +558,7 @@ class HydrusController( HydrusControllerInterface.HydrusControllerInterface ):
try:
self._InitTempDir()
self._InitHydrusTempDir()
except:
@ -813,9 +823,9 @@ class HydrusController( HydrusControllerInterface.HydrusControllerInterface ):
HydrusTemp.CleanUpOldTempPaths()
if hasattr( self, 'temp_dir' ):
if hasattr( self, '_hydrus_temp_dir' ):
HydrusPaths.DeletePath( self.temp_dir )
HydrusPaths.DeletePath( self._hydrus_temp_dir )
with self._call_to_thread_lock:

View File

@ -355,6 +355,20 @@ class DBBase( object ):
def _GetSumResult( self, result: typing.Optional[ typing.Tuple[ typing.Optional[ int ] ] ] ) -> int:
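# a SELECT SUM( ... ) always returns one row; when nothing matched, the summed value is NULL (None), so normalise that (and a missing row) to 0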
if result is None or result[0] is None:
sum_value = 0
else:
( sum_value, ) = result
return sum_value
def _ActuaIndexExists( self, index_name ):
if '.' in index_name:

View File

@ -3,6 +3,7 @@ import decimal
import fractions
import itertools
import os
import numpy
import psutil
import random
import re
@ -762,6 +763,17 @@ def IterateHexPrefixes():
yield prefix
def IterateListRandomlyAndFast( xs: typing.List ):
# do this instead of a pre-for-loop shuffle on big lists
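# numpy.random.permutation builds the shuffled index array in C, skipping random.shuffle's python-level work on huge lists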
for i in numpy.random.permutation( len( xs ) ):
yield xs[ i ]
def LastShutdownWasBad( db_path, instance ):
path = os.path.join( db_path, instance + '_running' )

View File

@ -129,51 +129,6 @@ def ConvertPortablePathToAbsPath( portable_path, base_dir_override = None ):
return abs_path
def CopyAndMergeTree( source, dest ):
pauser = HydrusThreading.BigJobPauser()
MakeSureDirectoryExists( dest )
num_errors = 0
for ( root, dirnames, filenames ) in os.walk( source ):
dest_root = root.replace( source, dest )
for dirname in dirnames:
pauser.Pause()
source_path = os.path.join( root, dirname )
dest_path = os.path.join( dest_root, dirname )
MakeSureDirectoryExists( dest_path )
shutil.copystat( source_path, dest_path )
for filename in filenames:
if num_errors > 5:
raise Exception( 'Too many errors, directory copy abandoned.' )
pauser.Pause()
source_path = os.path.join( root, filename )
dest_path = os.path.join( dest_root, filename )
ok = MirrorFile( source_path, dest_path )
if not ok:
num_errors += 1
def CopyFileLikeToFileLike( f_source, f_dest ):
@ -565,43 +520,55 @@ def safe_copy2( source, dest ):
def MergeFile( source, dest ):
def MergeFile( source, dest ) -> bool:
"""
Moves a file unless it already exists with same size and modified date, in which case it simply deletes the source.
# this can merge a file, but if it is given a dir it will just straight up overwrite not merge
:return: Whether an actual move happened.
"""
if os.path.isdir( source ):
raise Exception( f'Cannot file-merge "{source}" to "{dest}"--the source is a directory!' )
if os.path.isdir( dest ):
raise Exception( f'Cannot file-merge "{source}" to "{dest}"--the destination is a directory!' )
if os.path.exists( source ) and os.path.exists( dest ) and os.path.samefile( source, dest ):
raise Exception( f'Woah, "{source}" and "{dest}" are the same file!' )
if not os.path.isdir( source ):
if PathsHaveSameSizeAndDate( source, dest ):
if PathsHaveSameSizeAndDate( source, dest ):
DeletePath( source )
return True
try:
# this overwrites on conflict without hassle
shutil.move( source, dest, copy_function = safe_copy2 )
except Exception as e:
HydrusData.ShowText( 'Trying to move ' + source + ' to ' + dest + ' caused the following problem:' )
HydrusData.ShowException( e )
DeletePath( source )
return False
# this overwrites on conflict without hassle
shutil.move( source, dest, copy_function = safe_copy2 )
return True
def MergeTree( source, dest, text_update_hook = None ):
"""
Moves everything in the source to the dest using fast MergeFile tech.
"""
if not os.path.isdir( source ):
raise Exception( f'Cannot directory-merge "{source}" to "{dest}"--the source is not a directory!' )
if not os.path.isdir( dest ):
raise Exception( f'Cannot directory-merge "{source}" to "{dest}"--the destination is not a directory!' )
if os.path.exists( source ) and os.path.exists( dest ) and os.path.samefile( source, dest ):
@ -630,8 +597,6 @@ def MergeTree( source, dest, text_update_hook = None ):
# I had a thing here that tried to optimise if dest existed but was empty, but it wasn't neat
num_errors = 0
for ( root, dirnames, filenames ) in os.walk( source ):
if text_update_hook is not None:
@ -655,40 +620,53 @@ def MergeTree( source, dest, text_update_hook = None ):
for filename in filenames:
if num_errors > 5:
raise Exception( 'Too many errors, directory move abandoned.' )
pauser.Pause()
source_path = os.path.join( root, filename )
dest_path = os.path.join( dest_root, filename )
ok = MergeFile( source_path, dest_path )
if not ok:
try:
num_errors += 1
MergeFile( source_path, dest_path )
except Exception as e:
raise Exception( f'While trying to merge "{source}" into the already-existing "{dest}", moving "{source_path}" to "{dest_path}" failed!' ) from e
if num_errors == 0:
DeletePath( source )
DeletePath( source )
def MirrorFile( source, dest ):
def MirrorFile( source, dest ) -> bool:
"""
Copies a file unless it already exists with same date and size.
:return: Whether an actual file copy/overwrite happened.
"""
if os.path.isdir( source ):
raise Exception( f'Cannot file-mirror "{source}" to "{dest}"--the source is a directory!' )
if os.path.isdir( dest ):
raise Exception( f'Cannot file-mirror "{source}" to "{dest}"--the destination is a directory!' )
if os.path.exists( source ) and os.path.exists( dest ) and os.path.samefile( source, dest ):
return True
return False
if not PathsHaveSameSizeAndDate( source, dest ):
if PathsHaveSameSizeAndDate( source, dest ):
return False
else:
try:
@ -698,15 +676,13 @@ def MirrorFile( source, dest ):
except Exception as e:
HydrusData.ShowText( 'Trying to copy ' + source + ' to ' + dest + ' caused the following problem:' )
from hydrus.core import HydrusTemp
if isinstance( e, OSError ) and 'Errno 28' in str( e ) and dest.startswith( HydrusTemp.GetCurrentTempDir() ):
message = 'It looks like I failed to copy a file into your temporary folder because I ran out of disk space!'
message = 'The recent failed file copy looks like it was because your temporary folder ran out of disk space!'
message += '\n' * 2
message += 'This folder is on your system drive, so either free up space on that or use the "--temp_dir" launch command to tell hydrus to use a different location for the temporary folder. (Check the advanced help for more info!)'
message += 'This folder is normally on your system drive, so either free up space on that or use the "--temp_dir" launch command to tell hydrus to use a different location for the temporary folder. (Check the advanced help for more info!)'
message += '\n' * 2
message += 'If your system drive appears to have space but your temp folder is still maxed out, then there are probably special rules about how big a file we are allowed to put in there. Use --temp_dir.'
@@ -718,15 +694,28 @@ def MirrorFile( source, dest ):
HydrusData.ShowText( message )
HydrusData.ShowException( e )
return False
raise
return True
return True
def MirrorTree( source, dest, text_update_hook = None, is_cancelled_hook = None ):
"""
Makes the destination directory look exactly like the source using fast MirrorFile tech.
It deletes surplus stuff in the dest!
"""
if not os.path.isdir( source ):
raise Exception( f'Cannot directory-mirror "{source}" to "{dest}"--the source is not a directory!' )
if not os.path.isdir( dest ):
raise Exception( f'Cannot directory-mirror "{source}" to "{dest}"--the destination is not a directory!' )
if os.path.exists( source ) and os.path.exists( dest ) and os.path.samefile( source, dest ):
@@ -737,8 +726,6 @@ def MirrorTree( source, dest, text_update_hook = None, is_cancelled_hook = None
MakeSureDirectoryExists( dest )
num_errors = 0
for ( root, dirnames, filenames ) in os.walk( source ):
if is_cancelled_hook is not None and is_cancelled_hook():
@@ -771,11 +758,6 @@ def MirrorTree( source, dest, text_update_hook = None, is_cancelled_hook = None
for filename in filenames:
if num_errors > 5:
raise Exception( 'Too many errors, directory copy abandoned.' )
pauser.Pause()
source_path = os.path.join( root, filename )
@@ -784,11 +766,13 @@ def MirrorTree( source, dest, text_update_hook = None, is_cancelled_hook = None
surplus_dest_paths.discard( dest_path )
ok = MirrorFile( source_path, dest_path )
if not ok:
try:
num_errors += 1
MirrorFile( source_path, dest_path )
except Exception as e:
raise Exception( f'While trying to mirror "{source}" into "{dest}", moving "{source_path}" to "{dest_path}" failed!' ) from e
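
The "it deletes surplus stuff in the dest" part of MirrorTree works by treating everything already present under each destination directory as a deletion candidate, discarding each path the source also provides, and removing whatever is left. A simplified sketch of that bookkeeping, with shutil.copy2 standing in for the copy-if-different MirrorFile (not the Hydrus implementation):

```python
import os
import shutil

def mirror_tree_sketch( source: str, dest: str ) -> None:
    for ( root, dirnames, filenames ) in os.walk( source ):
        dest_root = os.path.join( dest, os.path.relpath( root, source ) )
        os.makedirs( dest_root, exist_ok = True )
        # everything currently in dest_root is surplus until the source claims it
        surplus_dest_paths = { os.path.join( dest_root, name ) for name in os.listdir( dest_root ) }
        for dirname in dirnames:
            surplus_dest_paths.discard( os.path.join( dest_root, dirname ) )
        for filename in filenames:
            source_path = os.path.join( root, filename )
            dest_path = os.path.join( dest_root, filename )
            surplus_dest_paths.discard( dest_path )
            shutil.copy2( source_path, dest_path )  # stand-in for MirrorFile
        for surplus_path in surplus_dest_paths:
            # present in dest but not in source, so it has to go
            if os.path.isdir( surplus_path ):
                shutil.rmtree( surplus_path )
            else:
                os.remove( surplus_path )
```
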

View File

@@ -3,6 +3,7 @@ import tempfile
import threading
from hydrus.core import HydrusData
from hydrus.core import HydrusGlobals as HG
from hydrus.core import HydrusPaths
from hydrus.core import HydrusTime
@@ -77,20 +78,9 @@ def GetCurrentTempDir():
return tempfile.gettempdir()
HYDRUS_TEMP_DIR = None
def GetHydrusTempDir():
def InitialiseHydrusTempDir():
path = tempfile.mkdtemp( prefix = 'hydrus' )
global HYDRUS_TEMP_DIR
if HYDRUS_TEMP_DIR is None:
HYDRUS_TEMP_DIR = path
return path
return tempfile.mkdtemp( prefix = 'hydrus' )
def SetEnvTempDir( path ):
@@ -127,25 +117,16 @@ def SetEnvTempDir( path ):
def GetSubTempDir( prefix = '' ):
global HYDRUS_TEMP_DIR
hydrus_temp_dir = HG.client_controller.GetHydrusTempDir()
return tempfile.mkdtemp( prefix = prefix, dir = HYDRUS_TEMP_DIR )
return tempfile.mkdtemp( prefix = prefix, dir = hydrus_temp_dir )
def GetTempPath( suffix = '', dir = None ):
global HYDRUS_TEMP_DIR
if dir is None and HYDRUS_TEMP_DIR is not None:
if dir is None:
dir = HYDRUS_TEMP_DIR
if not os.path.exists( dir ):
HYDRUS_TEMP_DIR = None
dir = GetHydrusTempDir()
dir = HG.client_controller.GetHydrusTempDir()
return tempfile.mkstemp( suffix = suffix, prefix = 'hydrus', dir = dir )
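
This change moves ownership of the per-boot hydrus temp directory from a module-level global into the controller: the controller calls InitialiseHydrusTempDir once, keeps the path, and GetSubTempDir/GetTempPath now ask the controller for it rather than consulting HYDRUS_TEMP_DIR. A rough sketch of that shape, with the controller reduced to a stub and a guessed recreate-if-missing fallback (assumptions, not the real controller):

```python
import os
import tempfile

def initialise_hydrus_temp_dir() -> str:
    # one private directory per boot; sub-dirs and temp files live inside it
    return tempfile.mkdtemp( prefix = 'hydrus' )

class ControllerStub:

    def __init__( self ):
        self._hydrus_temp_dir = initialise_hydrus_temp_dir()

    def GetHydrusTempDir( self ) -> str:
        if not os.path.exists( self._hydrus_temp_dir ):
            # someone cleaned it up under us, so make a fresh one (guessed behaviour)
            self._hydrus_temp_dir = initialise_hydrus_temp_dir()
        return self._hydrus_temp_dir

controller = ControllerStub()

def get_sub_temp_dir( prefix = '' ) -> str:
    return tempfile.mkdtemp( prefix = prefix, dir = controller.GetHydrusTempDir() )

def get_temp_path( suffix = '' ):
    # returns ( os_file_handle, path ), like tempfile.mkstemp
    return tempfile.mkstemp( suffix = suffix, prefix = 'hydrus', dir = controller.GetHydrusTempDir() )
```
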

View File

@@ -5,6 +5,27 @@ import time
from hydrus.core import HydrusData
from hydrus.core import HydrusConstants as HC
def DateTimeToPrettyTime( dt: datetime.datetime, include_24h_time = True ):
if include_24h_time:
phrase = '%Y-%m-%d %H:%M:%S'
else:
phrase = '%Y-%m-%d'
try:
return dt.strftime( phrase )
except:
return f'unknown time {dt}'
def DateTimeToTimestamp( dt: datetime.datetime ) -> int:
try:
@@ -397,15 +418,6 @@ def TimestampToPrettyTime( timestamp, in_utc = False, include_24h_time = True ):
return 'unknown time'
if include_24h_time:
phrase = '%Y-%m-%d %H:%M:%S'
else:
phrase = '%Y-%m-%d'
if in_utc:
timezone = HC.TIMEZONE_UTC
@@ -415,17 +427,21 @@ def TimestampToPrettyTime( timestamp, in_utc = False, include_24h_time = True ):
timezone = HC.TIMEZONE_LOCAL
# ok this timezone fails when the date of the timestamp we are actually talking about is in summer time and we are in standard time, or _vice versa_
# might be able to predict timezone better by recreating the dt using our year, month, day tuple and then pulling _that_ TZ, which I am pretty sure is corrected
# OR just don't convert back and forth so much when handling this garbage, which was the original fix to a system:date predicate shifting by an hour through two conversions
try:
dt = TimestampToDateTime( timestamp, timezone = timezone )
return dt.strftime( phrase )
except:
return 'unparseable time {}'.format( timestamp )
return DateTimeToPrettyTime( dt, include_24h_time = include_24h_time )
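
The strftime handling is consolidated into DateTimeToPrettyTime, so TimestampToPrettyTime only has to produce a datetime in the right timezone and hand it over. A rough standalone equivalent, using datetime.fromtimestamp in place of Hydrus's TimestampToDateTime (so local-time only, no in_utc switch):

```python
import datetime

def datetime_to_pretty_time( dt: datetime.datetime, include_24h_time = True ) -> str:
    phrase = '%Y-%m-%d %H:%M:%S' if include_24h_time else '%Y-%m-%d'
    try:
        return dt.strftime( phrase )
    except:
        return f'unknown time {dt}'

def timestamp_to_pretty_time( timestamp, include_24h_time = True ) -> str:
    try:
        # fromtimestamp applies the DST rules in force on that date, not today's offset
        dt = datetime.datetime.fromtimestamp( timestamp )
    except:
        return f'unparseable time {timestamp}'
    return datetime_to_pretty_time( dt, include_24h_time = include_24h_time )

print( timestamp_to_pretty_time( 1699000000 ) )
```
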
def BaseTimestampToPrettyTimeDelta( timestamp, just_now_string = 'now', just_now_threshold = 3, history_suffix = ' ago', show_seconds = True, no_prefix = False ):

View File

@@ -2,6 +2,33 @@ import numpy
from PIL import Image as PILImage
def GetNumPyAlphaChannel( numpy_image: numpy.array ) -> numpy.array:
if not NumPyImageHasAlphaChannel( numpy_image ):
raise Exception( 'Does not have an alpha channel!' )
channel_number = GetNumPyAlphaChannelNumber( numpy_image )
alpha_channel = numpy_image[:,:,channel_number].copy()
return alpha_channel
def GetNumPyAlphaChannelNumber( numpy_image: numpy.array ):
shape = numpy_image.shape
if len( shape ) <= 2:
raise Exception( 'Greyscale image, does not have an alpha channel!' )
# 1 for LA, 3 for RGBA
return shape[2] - 1
def NumPyImageHasAllCellsTheSame( numpy_image: numpy.array, value: int ):
# I looked around for ways to do this iteratively at the c++ level but didn't have huge luck.
@@ -22,7 +49,7 @@ def NumPyImageHasUselessAlphaChannel( numpy_image: numpy.array ) -> bool:
# RGBA image
alpha_channel = numpy_image[:,:,3].copy()
alpha_channel = GetNumPyAlphaChannel( numpy_image )
if NumPyImageHasAllCellsTheSame( alpha_channel, 255 ): # all opaque
@@ -49,7 +76,7 @@ def NumPyImageHasOpaqueAlphaChannel( numpy_image: numpy.array ) -> bool:
# RGBA image
# opaque means 255
alpha_channel = numpy_image[:,:,3].copy()
alpha_channel = GetNumPyAlphaChannel( numpy_image )
return NumPyImageHasAllCellsTheSame( alpha_channel, 255 )
@@ -79,7 +106,7 @@ def NumPyImageHasTransparentAlphaChannel( numpy_image: numpy.array ) -> bool:
# RGBA image
# transparent means 0
alpha_channel = numpy_image[:,:,3].copy()
alpha_channel = GetNumPyAlphaChannel( numpy_image )
return NumPyImageHasAllCellsTheSame( alpha_channel, 0 )
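
The new helpers lean on the fact that, for the two shapes involved here, the alpha channel is simply the last one: index 1 for a greyscale-plus-alpha ('LA') array and index 3 for 'RGBA', i.e. shape[2] - 1 in both cases. A quick illustrative check:

```python
import numpy

# an 'LA' image has shape ( h, w, 2 ); an 'RGBA' image has shape ( h, w, 4 )
la_image = numpy.zeros( ( 4, 4, 2 ), dtype = numpy.uint8 )
rgba_image = numpy.full( ( 4, 4, 4 ), 255, dtype = numpy.uint8 )

for numpy_image in ( la_image, rgba_image ):
    channel_number = numpy_image.shape[2] - 1  # 1 for LA, 3 for RGBA
    alpha_channel = numpy_image[ :, :, channel_number ]
    print( numpy_image.shape, 'alpha index:', channel_number, 'all opaque:', bool( ( alpha_channel == 255 ).all() ) )
```
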

View File

@@ -141,11 +141,9 @@ def GenerateNumPyImage( path, mime, force_pil = False ) -> numpy.array:
pil_image = HydrusPSDHandling.MergedPILImageFromPSD( path )
numpy_image = GenerateNumPyImageFromPILImage( pil_image )
return GenerateNumPyImageFromPILImage( pil_image )
return HydrusImageNormalisation.StripOutAnyUselessAlphaChannel( numpy_image )
if mime == HC.APPLICATION_KRITA:
if HG.media_load_report_mode:
@@ -252,16 +250,23 @@ def GenerateNumPyImage( path, mime, force_pil = False ) -> numpy.array:
numpy_image = HydrusImageNormalisation.DequantizeFreshlyLoadedNumPyImage( numpy_image )
numpy_image = HydrusImageNormalisation.StripOutAnyUselessAlphaChannel( numpy_image )
numpy_image = HydrusImageNormalisation.StripOutAnyUselessAlphaChannel( numpy_image )
return numpy_image
def GenerateNumPyImageFromPILImage( pil_image: PILImage.Image ) -> numpy.array:
def GenerateNumPyImageFromPILImage( pil_image: PILImage.Image, strip_useless_alpha = True ) -> numpy.array:
# this seems to magically work, I guess asarray either has a match for Image or Image provides some common shape/datatype properties that it can hook into
return numpy.asarray( pil_image )
numpy_image = numpy.asarray( pil_image )
if strip_useless_alpha:
numpy_image = HydrusImageNormalisation.StripOutAnyUselessAlphaChannel( numpy_image )
return numpy_image
# old method:
'''

View File

@@ -222,7 +222,9 @@ def StripOutAnyUselessAlphaChannel( numpy_image: numpy.array ) -> numpy.array:
if HydrusImageColours.NumPyImageHasUselessAlphaChannel( numpy_image ):
numpy_image = numpy_image[:,:,:3].copy()
channel_number = HydrusImageColours.GetNumPyAlphaChannelNumber( numpy_image )
numpy_image = numpy_image[:,:,:channel_number].copy()
# old way, which doesn't actually remove the channel lmao lmao lmao
'''
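
With the channel index computed rather than hard-coded, the strip also covers two-channel greyscale-plus-alpha arrays, where the old [:,:,:3] slice would not actually have dropped anything. A minimal sketch of the overall check-then-slice idea (not the Hydrus function, which routes through the HydrusImageColours helpers shown earlier):

```python
import numpy

def strip_useless_alpha_sketch( numpy_image: numpy.ndarray ) -> numpy.ndarray:
    if len( numpy_image.shape ) <= 2 or numpy_image.shape[2] not in ( 2, 4 ):
        return numpy_image  # greyscale or no alpha channel present
    channel_number = numpy_image.shape[2] - 1
    alpha_channel = numpy_image[ :, :, channel_number ]
    if ( alpha_channel == 255 ).all():
        # a fully opaque alpha channel carries no information, so drop it
        return numpy_image[ :, :, :channel_number ].copy()
    return numpy_image

rgba = numpy.full( ( 2, 2, 4 ), 255, dtype = numpy.uint8 )
print( strip_useless_alpha_sketch( rgba ).shape )  # ( 2, 2, 3 )
```
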

View File

@@ -309,8 +309,6 @@ class DB( HydrusDB.HydrusDB ):
names_to_analyze = [ name for name in all_names if name not in existing_names_to_timestamps or HydrusTime.TimeHasPassed( existing_names_to_timestamps[ name ] + stale_time_delta ) ]
random.shuffle( names_to_analyze )
if len( names_to_analyze ) > 0:
locked = HG.server_busy.acquire( False ) # pylint: disable=E1111
@@ -322,7 +320,7 @@ class DB( HydrusDB.HydrusDB ):
try:
for name in names_to_analyze:
for name in HydrusData.IterateListRandomlyAndFast( names_to_analyze ):
started = HydrusTime.GetNowPrecise()
@@ -1949,9 +1947,7 @@ class DB( HydrusDB.HydrusDB ):
petition_account_ids = list( petitioner_account_ids_to_reason_ids.keys() )
random.shuffle( petition_account_ids )
for petition_account_id in petition_account_ids:
for petition_account_id in HydrusData.IterateListRandomlyAndFast( petition_account_ids ):
reason_ids = petitioner_account_ids_to_reason_ids[ petition_account_id ]
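
Several shuffle-then-loop spots are replaced with HydrusData.IterateListRandomlyAndFast. Its implementation is not part of this diff; one plausible shape for such a helper is to shuffle once and then pop from the tail, which is O(1) per item and lets the list release references as it is consumed (a guess, and note this version consumes the list it is given):

```python
import random

def iterate_list_randomly_and_fast( xs: list ):
    # speculative sketch: shuffle in place, then yield from the end
    random.shuffle( xs )
    while len( xs ) > 0:
        yield xs.pop()

for name in iterate_list_randomly_and_fast( [ 'a', 'b', 'c', 'd' ] ):
    print( name )
```
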
@@ -3018,33 +3014,14 @@ class DB( HydrusDB.HydrusDB ):
contents = []
petition_namespace = None
petition_pairs = list( tag_ids_to_hash_ids.items() )
random.shuffle( petition_pairs )
for ( service_tag_id, service_hash_ids ) in petition_pairs:
for ( service_tag_id, service_hash_ids ) in HydrusData.IterateListRandomlyAndFast( petition_pairs ):
master_tag_id = self._RepositoryGetMasterTagId( service_id, service_tag_id )
tag = self._GetTag( master_tag_id )
# taking this out for now. it is confusing when you look at counts
'''
( namespace, subtag ) = HydrusTags.SplitTag( tag )
if petition_namespace is None:
petition_namespace = namespace
if namespace != petition_namespace:
continue
'''
master_hash_ids = self._RepositoryGetMasterHashIds( service_id, service_hash_ids )
hashes = self._GetHashes( master_hash_ids )
@@ -4187,21 +4164,15 @@ class DB( HydrusDB.HydrusDB ):
table_join = self._RepositoryGetFilesInfoFilesTableJoin( service_id, HC.CONTENT_STATUS_CURRENT )
( total_current_storage, ) = self._Execute( 'SELECT SUM( size ) FROM ' + table_join + ';' ).fetchone()
result = self._Execute( 'SELECT SUM( size ) FROM ' + table_join + ';' ).fetchone()
if total_current_storage is None:
total_current_storage = 0
total_current_storage = self._GetSumResult( result )
table_join = self._RepositoryGetFilesInfoFilesTableJoin( service_id, HC.CONTENT_STATUS_PENDING )
( total_pending_storage, ) = self._Execute( 'SELECT SUM( size ) FROM ' + table_join + ';' ).fetchone()
result = self._Execute( 'SELECT SUM( size ) FROM ' + table_join + ';' ).fetchone()
if total_pending_storage is None:
total_pending_storage = 0
total_pending_storage = self._GetSumResult( result )
if total_current_storage + total_pending_storage + file_dict[ 'size' ] > max_storage:
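
Both SELECT SUM( size ) reads now go through a _GetSumResult helper instead of repeating the None check inline. SQLite returns a single NULL for SUM over zero rows, which fetchone() delivers as ( None, ), so presumably the helper does something along these lines (a sketch, not the actual method):

```python
import sqlite3

def get_sum_result( result ) -> int:
    # SUM over an empty set arrives as ( None, ); coalesce that to 0
    ( value, ) = result
    return 0 if value is None else value

db = sqlite3.connect( ':memory:' )
db.execute( 'CREATE TABLE files_info ( size INTEGER );' )
result = db.execute( 'SELECT SUM( size ) FROM files_info;' ).fetchone()
print( get_sum_result( result ) )  # 0, rather than a TypeError further down
```
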

View File

@@ -20,7 +20,7 @@ class TestDaemons( unittest.TestCase ):
def test_import_folders_daemon( self ):
test_dir = HydrusTemp.GetHydrusTempDir()
test_dir = HydrusTemp.GetSubTempDir( 'import_test' )
try:

View File

@@ -17,6 +17,7 @@ from hydrus.core import HydrusGlobals as HG
from hydrus.core import HydrusPaths
from hydrus.core import HydrusPubSub
from hydrus.core import HydrusSessions
from hydrus.core import HydrusTemp
from hydrus.core import HydrusThreading
from hydrus.client import ClientAPI
@@ -185,6 +186,8 @@ class Controller( object ):
self.db_dir = tempfile.mkdtemp()
self._hydrus_temp_dir = HydrusTemp.InitialiseHydrusTempDir()
global DB_DIR
DB_DIR = self.db_dir
@@ -622,6 +625,11 @@ class Controller( object ):
return read
def GetHydrusTempDir( self ):
return self._hydrus_temp_dir
def GetWrite( self, name ):
write = self._write_call_args[ name ]

View File

@@ -9,9 +9,9 @@ LABEL git="https://github.com/hydrusnetwork/hydrus"
RUN apk --no-cache add fvwm x11vnc xvfb supervisor opencv mpv mpv-libs ffmpeg jq \
openssl nodejs patch font-noto font-noto-emoji font-noto-cjk \
py3-pyside6 py3-beautifulsoup4 py3-pillow py3-numpy py3-openssl py3-cryptography py3-pip py3-opencv py3-lxml py3-chardet \
py3-beautifulsoup4 py3-pillow py3-numpy py3-openssl py3-cryptography py3-pip py3-opencv py3-lxml py3-chardet \
py3-psutil py3-pysocks py3-requests py3-twisted py3-yaml py3-lz4 py3-html5lib py3-dateutil
RUN pip install qtpy Send2Trash python-mpv cloudscraper pyparsing cbor2 Pympler
RUN pip install qtpy Send2Trash python-mpv cloudscraper pyparsing cbor2 Pympler PySide6
RUN set -xe \
&& mkdir -p /opt/hydrus \