@@ -7,6 +7,39 @@ title: Changelog

!!! note

    This is the new changelog, only the most recent builds. For all versions, see the [old changelog](old_changelog.html).

## [Version 575](https://github.com/hydrusnetwork/hydrus/releases/tag/v575)

### misc

* the new 'children' tab now sorts its results by count, and it only shows the top n (default 40) results. you can edit the n under _options->tags_. let me know how this works IRL, as this new count-sorting needs a bit of extra CPU
* when you ask subscriptions to 'check now', either in the 'edit subscription' or 'edit subscriptions' dialogs, if there is a mix of DEAD and ALIVE subs, it now pops up a quick question dialog asking whether you want to check now for all/alive/dead
* fixed the (do not) 'alphabetise GET query parameters' URL Class checkbox, which I broke in v569. sorry for the trouble--the new URL encoding handling was accidentally alphabetising all URLs on ingestion. a new unit test will catch this in future, so it shouldn't happen again (issue #1551)
* thanks to a user, I think we have fixed ICC profile processing when your system ICC Profile is non-sRGB
* fixed a logical test that was disallowing thumbnail regen on files with no resolution (certain svg, for instance). all un-resolutioned files will now (re)render a thumb to the max bounding thumbnail resolution setting. fingers crossed we'll be able to figure out a ratio solution in future
* added a _debug->help->gui actions->reload current stylesheet_ menu action. it unloads and reloads the current QSS
* added a _debug->help->gui actions->reload current gui session_ menu action. it saves the current session and reloads it
* fixed the rendering of some 16-bit pngs that seem to be getting a slightly different image mode on the new version of PIL
* the debug 'gui report mode' now reports extensive info about virtual taglist heights. if I have been working with you on taglists, mostly on the manage tags dialog, that spawn without a scrollbar even though they should, please run this mode and then try to capture the error. hit me up and we'll see if the numbers explain what's going on. I may have also simply fixed the bug
* I think I sped up adding tags to a local tag service that has a lot of siblings/parents
* updated the default danbooru parsers to get the original and/or translated artist notes. I don't know if a user did this or I did, but my dev machine somehow already had the tech while the defaults did not--if you did this, thanks!
* added more tweet URL Classes for the default downloader. you should now be able to drag and drop a vxtwitter or fxtwitter URL on the client and it'll work

### auto-duplicate resolution

* I have nothing real to show today, but I have a skeleton of code and a good plan on how to get the client resolving easy duplicate pairs by itself. so far, it looks easier than I feared, but, as always, there will be a lot to do. I will keep chipping away at this and will release features in tentative waves for advanced users to play with
* with this system, I will be launching the very first version of the 'Metadata Conditional' object I have been talking about for a few years. fingers crossed, we'll be able to spam it to all sorts of other places to do 'if the file has x property, then do y' in a standardised way

### boring stuff

* refactored the new tag children autocomplete tab to its own class so it can handle its new predicate gubbins and sorted/culled search separately. it is also now aware of the current file location context to give file-domain-sensitive suggestions (falling back to 'all known files' for fast search if things are complicated)
* fixed a layout issue on the file import options panel when a sister page caused it to be taller than it wanted; the help button ended up being the expanding widget jej
* non-menubar menus and submenus across the program now remove a hanging final separator item, making the logic of forming menu groups a little easier in future
* the core 'Load image in PIL' method has some better error reporting, and many calls now explicitly tell it a human-readable source description so we can avoid repeats of `DamagedOrUnusualFileException: Could not load the image at "<_io.BytesIO object at 0x000001F60CE45620>"--it was likely malformed!`
* cleaned up some dict instantiations in `ClientOptions`
* moved `ClientDuplicates` up to a new `duplicates` module and migrated some duplicate enums over to it from `ClientConstants`
* removed an old method-wrapper hack that applied the 'load images with PIL' option. I just moved to a global that I set on init and update on options change
* cleaned some duplicate checking code

## [Version 574](https://github.com/hydrusnetwork/hydrus/releases/tag/v574)

### local hashes cache
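The 'Metadata Conditional' mentioned in the auto-duplicate resolution notes above is still unreleased, so here is only a speculative sketch of the 'if the file has x property, then do y' shape it describes. Every name in this block (`MetadataConditional`, `FakeMediaResult`, the fields) is invented for illustration and is not hydrus's actual class:

```python
# Hypothetical sketch of a 'Metadata Conditional': pair a predicate over file
# metadata with an action to run when it matches. Names are invented.
from dataclasses import dataclass
from typing import Callable

@dataclass
class FakeMediaResult:
    mime: str
    width: int
    height: int

@dataclass
class MetadataConditional:
    predicate: Callable[[FakeMediaResult], bool]
    action: Callable[[FakeMediaResult], None]

    def apply(self, media_result: FakeMediaResult) -> bool:
        # run the action only when the predicate matches
        if self.predicate(media_result):
            self.action(media_result)
            return True
        return False

matched = []

# 'if the file is a big jpeg, then collect it'
rule = MetadataConditional(
    predicate=lambda m: m.mime == 'image/jpeg' and m.width * m.height >= 4_000_000,
    action=lambda m: matched.append(m),
)

rule.apply(FakeMediaResult('image/jpeg', 4000, 3000))  # matches
rule.apply(FakeMediaResult('image/png', 100, 100))     # does not match
print(len(matched))  # 1
```

The value of standardising this would be that the same predicate objects used in search could drive duplicate resolution, import filtering, and so on.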
@@ -381,46 +414,3 @@ title: Changelog

* just a small thing, but the under-documented `/manage_database/get_client_options` call now says the four types of default tag sort. I left the old key, `default_tag_sort`, in so as not to break stuff, but it is just a copy of the `search_page` variant in the new `default_tag_sort_xxx` foursome
* client api version is now 62

## [Version 564](https://github.com/hydrusnetwork/hydrus/releases/tag/v564)

### more macOS work

* thanks to a user, we have more macOS features:
* macOS users get a new shortcut action, default Space, that uses Quick Look to preview a thumbnail like you can in Finder. **all existing users will get the new shortcut!**
* the hydrus .app now has the version number in Get Info
* **macOS users who run from source should rebuild their venvs this week!** if you don't, then trying this new Quick Look feature will just give you an error notification

### new fuzzy operator math in system predicates

* the system predicates for width, height, num_notes, num_words, num_urls, num_frames, duration, and framerate now support two different kinds of approximate equals, ≈: absolute (±x) and percentage (±x%). previously, the ≈ secretly just did ±15% in all cases (issue #1468)
* all `system:framerate=x` searches are now converted to `±5%`, which is what they were behind the scenes. `!=` framerate stuff is no longer supported, so if you happened to use it, it is now converted to `<` just as a valid fallback
* `system:duration` gets the same thing, `±5%`. it wasn't doing this behind the scenes before, but it should have been!
* `system:duration` also now allows hours and minutes input, if you need longer!
* for now, the parsing system is not updated to specify the % or absolute ± values. it will remain the same as the old system, with ±15% as the default for a `~=` input
* there's still a little borked logic in these combined types. if you search `< 3 URLs`, that will return files with 0 URLs, and same for `num_notes`, but if you search `< 200px width` or any of the others I changed this week, that won't return a PDF that has no width (although it will return a damaged file that reports 0 width specifically). I am going to think about this, since there isn't an easy one-size-fits-all solution to marry what is technically correct with what is actually convenient. I'll probably add a checkbox that says whether to include 'Null' values or not and default that True/False depending on the situation; let me know what you think!

### misc

* I have taken out Space as the default for archive/delete filter 'keep' and duplicate filter 'this is better, delete other'. Space is now exclusively, by default, media pause/play. **I am going to set this to existing users too, deleting/overwriting what Space does for you, if you are still set to the defaults**
* integer percentages are now rendered without the trailing `.0`. `15%`, not `15.0%`
* when you 'open externally', 'open in web browser', or 'open path' from a thumbnail, the preview viewer now pauses rather than clears completely
* fixed the edit shortcut panel ALWAYS showing the new (home/end/left/right/to focus) dropdown for thumbnail dropdown, arrgh
* I fixed a stupid typo that was breaking file repository file deletes
* `help->about` now shows the Qt platformName
* added a note about bad Wayland support to the Linux 'installing' help document
* the guy who wrote the `Fixing_Hydrus_Random_Crashes_Under_Linux` document has updated it with new information, particularly related to running hydrus fast using virtual memory on small, underpowered computers

### client api

* thanks to a user, the undocumented API call that returns info on importer pages now includes the sha256 file hash in each import object
* although it is a tiny change, let's nonetheless update the Client API version to 61

### boring predicate overhaul work

* updated the `NumberTest` object to hold specific percentage and absolute ± values
* updated the `NumberTest` object to render itself to any number format, for instance pixels vs kilobytes vs a time delta
* updated the `Predicate` object for system preds width, height, num_notes, num_words, num_urls, num_frames, duration, and framerate to store their operator and value as a `NumberTest`, and updated predicate string rendering, parsing, editing, and database-level predicate handling
* wrote new widgets to edit `NumberTest`s of various sorts and spammed them to these (operator, value) system predicate UI panels. we are finally clearing out some 8+-year-old jank here
* rewrote the `num_notes` database search logic to use `NumberTest`s
* the system preds for height, width, and framerate now say 'has x' and 'no x' when set to `>0` or `=0`, although what these really mean is not perfectly defined
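The absolute (±x) versus percentage (±x%) approximate-equals described above can be sketched in a few lines. This is illustrative only (`ApproxNumberTest` is an invented name, not the real `NumberTest` class from the hydrus codebase), but it shows why the old hidden ±15% and a tight absolute ±2 disagree on the same input:

```python
# An approximate-equals test that supports both absolute (±x) and
# percentage (±x%) tolerances. Invented class, for illustration.
APPROX_ABSOLUTE = 0
APPROX_PERCENT = 1

class ApproxNumberTest:

    def __init__(self, value, approx_type, approx_value):
        self.value = value
        self.approx_type = approx_type
        self.approx_value = approx_value

    def test(self, actual) -> bool:
        if self.approx_type == APPROX_ABSOLUTE:
            allowed = self.approx_value
        else:
            # percentage tolerance is relative to the target value
            allowed = self.value * (self.approx_value / 100)
        return abs(actual - self.value) <= allowed

# 'framerate ~= 30' with the old hidden ±15% accepts 25.5-34.5 fps
old_style = ApproxNumberTest(30, APPROX_PERCENT, 15)
# an absolute ±2 test is much tighter: 28-32 fps
absolute = ApproxNumberTest(30, APPROX_ABSOLUTE, 2)

print(old_style.test(26))  # True: 26 is within ±15% of 30
print(absolute.test(26))   # False: 26 is outside ±2 of 30
```

Rendering such a test to different number formats (pixels, kilobytes, time deltas) would then just be a formatting concern layered on top of the same comparison.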
@@ -83,7 +83,7 @@ By default, hydrus stores all its data—options, files, subscriptions, _everything_

!!! danger "Bad Locations"

    **Do not install to a network location!** (i.e. on a different computer's hard drive) The SQLite database is sensitive to interruption and requires good file locking, which network interfaces often fake. There are [ways of splitting your client up](database_migration.md) so the database is on a local SSD but the files are on a network--this is fine--but you really should not put the database on a remote machine unless you know what you are doing and have a backup in case things go wrong.

-    **Do not install to a location with filesystem-level compression enabled!** It may work ok to start, but when the SQLite database grows to large size, this can cause extreme access latency and I/O errors and corruption.
+    **Do not install to a location with filesystem-level compression enabled! (e.g. BTRFS)** It may work ok to start, but when the SQLite database grows to large size, this can cause extreme access latency and I/O errors and corruption.

!!! info "For macOS users"

    The Hydrus App is **non-portable** and puts your database in `~/Library/Hydrus` (i.e. `/Users/[You]/Library/Hydrus`). You can update simply by replacing the old App with the new, but if you wish to backup, you should be looking at `~/Library/Hydrus`, not the App itself.
@@ -34,6 +34,36 @@

<div class="content">
<h1 id="changelog"><a href="#changelog">changelog</a></h1>
<ul>
<li>
<h2 id="version_575"><a href="#version_575">version 575</a></h2>
<ul>
<li><h3>misc</h3></li>
<li>the new 'children' tab now sorts its results by count, and it only shows the top n (default 40) results. you can edit the n under _options->tags_. let me know how this works IRL, as this new count-sorting needs a bit of extra CPU</li>
<li>when you ask subscriptions to 'check now', either in the 'edit subscription' or 'edit subscriptions' dialogs, if there is a mix of DEAD and ALIVE subs, it now pops up a quick question dialog asking whether you want to check now for all/alive/dead</li>
<li>fixed the (do not) 'alphabetise GET query parameters' URL Class checkbox, which I broke in v569. sorry for the trouble--the new URL encoding handling was accidentally alphabetising all URLs on ingestion. a new unit test will catch this in future, so it shouldn't happen again (issue #1551)</li>
<li>thanks to a user, I think we have fixed ICC profile processing when your system ICC Profile is non-sRGB</li>
<li>fixed a logical test that was disallowing thumbnail regen on files with no resolution (certain svg, for instance). all un-resolutioned files will now (re)render a thumb to the max bounding thumbnail resolution setting. fingers crossed we'll be able to figure out a ratio solution in future</li>
<li>added a _debug->help->gui actions->reload current stylesheet_ menu action. it unloads and reloads the current QSS</li>
<li>added a _debug->help->gui actions->reload current gui session_ menu action. it saves the current session and reloads it</li>
<li>fixed the rendering of some 16-bit pngs that seem to be getting a slightly different image mode on the new version of PIL</li>
<li>the debug 'gui report mode' now reports extensive info about virtual taglist heights. if I have been working with you on taglists, mostly on the manage tags dialog, that spawn without a scrollbar even though they should, please run this mode and then try to capture the error. hit me up and we'll see if the numbers explain what's going on. I may have also simply fixed the bug</li>
<li>I think I sped up adding tags to a local tag service that has a lot of siblings/parents</li>
<li>updated the default danbooru parsers to get the original and/or translated artist notes. I don't know if a user did this or I did, but my dev machine somehow already had the tech while the defaults did not--if you did this, thanks!</li>
<li>added more tweet URL Classes for the default downloader. you should now be able to drag and drop a vxtwitter or fxtwitter URL on the client and it'll work</li>
<li><h3>auto-duplicate resolution</h3></li>
<li>I have nothing real to show today, but I have a skeleton of code and a good plan on how to get the client resolving easy duplicate pairs by itself. so far, it looks easier than I feared, but, as always, there will be a lot to do. I will keep chipping away at this and will release features in tentative waves for advanced users to play with</li>
<li>with this system, I will be launching the very first version of the 'Metadata Conditional' object I have been talking about for a few years. fingers crossed, we'll be able to spam it to all sorts of other places to do 'if the file has x property, then do y' in a standardised way</li>
<li><h3>boring stuff</h3></li>
<li>refactored the new tag children autocomplete tab to its own class so it can handle its new predicate gubbins and sorted/culled search separately. it is also now aware of the current file location context to give file-domain-sensitive suggestions (falling back to 'all known files' for fast search if things are complicated)</li>
<li>fixed a layout issue on file import options panel when a sister page caused it to be taller than it wanted; the help button ended up being the expanding widget jej</li>
<li>non-menubar menus and submenus across the program now remove a hanging final separator item, making the logic of forming menu groups a little easier in future</li>
<li>the core 'Load image in PIL' method has some better error reporting, and many calls now explicitly tell it a human-readable source description so we can avoid repeats of `DamagedOrUnusualFileException: Could not load the image at "<_io.BytesIO object at 0x000001F60CE45620>"--it was likely malformed!`</li>
<li>cleaned up some dict instantiations in `ClientOptions`</li>
<li>moved `ClientDuplicates` up to a new `duplicates` module and migrated some duplicate enums over to it from `ClientConstants`</li>
<li>removed an old method-wrapper hack that applied the 'load images with PIL' option. I just moved to a global that I set on init and update on options change</li>
<li>cleaned some duplicate checking code</li>
</ul>
</li>
<li>
<h2 id="version_574"><a href="#version_574">version 574</a></h2>
<ul>
@@ -53,7 +83,7 @@

<li>when setting up an import folder, the dialog will now refuse to OK if you set a path that is 1) above the install dir or db dir or 2) above or below any of your file storage locations. shouldn't be possible to set up an import from your own file storage folder by accident any more</li>
<li>added a new 'apply image ICC Profile colour adjustments' checkbox to _options->media_. this simply turns off ICC profile loading and application, for debug purposes</li>
<li><h3>boring cleanup</h3></li>
-<li>the default SQLite page size is now 4096 bytes on Linux, the SQLite default. it was 1024 previously, but SQLite now recommend 4096 for all platforms. the next time Linux users vacuum any of their databases, they will get fixed. I do not think this is a big deal, so don't rush to force this</li>
+<li>the default SQLite page size is now 4096 bytes on Linux and macOS, the SQLite default. it was 1024 previously, but SQLite now recommend 4096 for all platforms. the next time Linux users vacuum any of their databases, they will get fixed. I do not think this is a big deal, so don't rush to force this</li>
<li>fixed the last couple dozen missing layout flags across the program, which were ancient artifacts from the wx->Qt conversion</li>
<li>fixed the WTFPL licence to be my copyright, lol</li>
<li>deleted the local booru service management/UI code</li>
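The page-size change in the hunk above works the way SQLite documents it: `page_size` can only change on an empty database or via a rebuild, so existing databases pick up the new value on their next `VACUUM`. A standalone demonstration against a throwaway database (do not run pragmas against a live `client.db` while hydrus is open):

```python
# Demonstrate that PRAGMA page_size only takes effect on an existing
# database after VACUUM rebuilds it.
import os
import sqlite3
import tempfile

path = os.path.join(tempfile.mkdtemp(), 'test.db')

con = sqlite3.connect(path)
con.execute('PRAGMA page_size = 1024;')  # the old hydrus default on Linux
con.execute('CREATE TABLE t ( i INTEGER );')
con.commit()

old_size = con.execute('PRAGMA page_size;').fetchone()[0]

# setting a new page_size on a non-empty database does nothing by itself;
# VACUUM rewrites the file with the pending page size
con.execute('PRAGMA page_size = 4096;')
con.execute('VACUUM;')

new_size = con.execute('PRAGMA page_size;').fetchone()[0]
con.close()

print(old_size, new_size)  # 1024 4096
```

This matches the changelog note: nothing forces the change, it simply happens whenever the user next vacuums.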
@@ -65,10 +65,6 @@ directions_alignment_string_lookup = {

    DIRECTION_DOWN : 'bottom'
}

-DUPE_SEARCH_ONE_FILE_MATCHES_ONE_SEARCH = 0
-DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH = 1
-DUPE_SEARCH_BOTH_FILES_MATCH_DIFFERENT_SEARCHES = 2
-
FIELD_VERIFICATION_RECAPTCHA = 0
FIELD_COMMENT = 1
FIELD_TEXT = 2
@@ -146,16 +142,6 @@ hamming_string_lookup = {

    HAMMING_SPECULATIVE : 'speculative'
}

-SIMILAR_FILES_PIXEL_DUPES_REQUIRED = 0
-SIMILAR_FILES_PIXEL_DUPES_ALLOWED = 1
-SIMILAR_FILES_PIXEL_DUPES_EXCLUDED = 2
-
-similar_files_pixel_dupes_string_lookup = {
-    SIMILAR_FILES_PIXEL_DUPES_REQUIRED : 'must be pixel dupes',
-    SIMILAR_FILES_PIXEL_DUPES_ALLOWED : 'can be pixel dupes',
-    SIMILAR_FILES_PIXEL_DUPES_EXCLUDED : 'must not be pixel dupes'
-}
-
IDLE_NOT_ON_SHUTDOWN = 0
IDLE_ON_SHUTDOWN = 1
IDLE_ON_SHUTDOWN_ASK_FIRST = 2
@@ -1023,6 +1023,8 @@ class Controller( ClientControllerInterface.ClientControllerInterface, HydrusCon

        HydrusImageHandling.SetEnableLoadTruncatedImages( self.new_options.GetBoolean( 'enable_truncated_images_pil' ) )
        HydrusImageNormalisation.SetDoICCProfileNormalisation( self.new_options.GetBoolean( 'do_icc_profile_normalisation' ) )

+        HydrusImageHandling.FORCE_PIL_ALWAYS = self.new_options.GetBoolean( 'load_images_with_pil' )
+
    def InitModel( self ):
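The hunk above is the 'global set on init and updated on options change' pattern the changelog mentions, replacing the old method-wrapper hack. A minimal sketch of the idea (`hydrus_image_handling` here is a stand-in namespace, not the real `HydrusImageHandling` module):

```python
# Instead of wrapping the image loader so it reads the options object on
# every call, set a module-level flag once at init and again whenever the
# options change. Stand-in names throughout.
import types

hydrus_image_handling = types.SimpleNamespace(FORCE_PIL_ALWAYS=False)

def generate_numpy_image(path, mime):
    # the loader consults the module-level flag directly
    if hydrus_image_handling.FORCE_PIL_ALWAYS:
        return f'loaded {path} with PIL'
    return f'loaded {path} with cv2'

def on_options_change(new_options: dict):
    # called once at boot and again when the user saves the options dialog
    hydrus_image_handling.FORCE_PIL_ALWAYS = new_options.get('load_images_with_pil', False)

on_options_change({'load_images_with_pil': True})
result = generate_numpy_image('example.png', 'image/png')
print(result)  # loaded example.png with PIL
```

The trade-off is a tiny window of staleness between saving options and the flag updating, in exchange for removing an options lookup (and an import dependency) from a hot path.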
@@ -1490,7 +1492,7 @@ class Controller( ClientControllerInterface.ClientControllerInterface, HydrusCon

        if work_done:

-            from hydrus.client import ClientDuplicates
+            from hydrus.client.duplicates import ClientDuplicates

            ClientDuplicates.DuplicatesManager.instance().RefreshMaintenanceNumbers()
@@ -334,7 +334,7 @@ class QuickDownloadManager( object ):

        file_import_options.SetPreImportURLCheckLooksForNeighbours( preimport_url_check_looks_for_neighbours )
        file_import_options.SetPostImportOptions( automatic_archive, associate_primary_urls, associate_source_urls )

-        file_import_job = ClientImportFiles.FileImportJob( temp_path, file_import_options )
+        file_import_job = ClientImportFiles.FileImportJob( temp_path, file_import_options, human_file_description = f'Downloaded File - {hash.hex()}' )

        file_import_job.DoWork()
@@ -1792,7 +1792,7 @@ class ClientFilesManager( object ):

        thumbnail_mime = HydrusFileHandling.GetThumbnailMime( path )

-        numpy_image = ClientImageHandling.GenerateNumPyImage( path, thumbnail_mime )
+        numpy_image = HydrusImageHandling.GenerateNumPyImage( path, thumbnail_mime )

        ( current_width, current_height ) = HydrusImageHandling.GetResolutionNumPy( numpy_image )
@@ -1863,7 +1863,7 @@ class ClientFilesManager( object ):


-def HasHumanReadableEmbeddedMetadata( path, mime ):
+def HasHumanReadableEmbeddedMetadata( path, mime, human_file_description = None ):

    if mime not in HC.FILES_THAT_CAN_HAVE_HUMAN_READABLE_EMBEDDED_METADATA:

@@ -1878,7 +1878,7 @@ def HasHumanReadableEmbeddedMetadata( path, mime ):

    try:

-        pil_image = HydrusImageOpening.RawOpenPILImage( path )
+        pil_image = HydrusImageOpening.RawOpenPILImage( path, human_file_description = human_file_description )

    except:
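The `human_file_description` threading above exists so that load failures name their source instead of printing a `BytesIO` repr. A self-contained sketch of the error path (`raw_open_image` is a stand-in that always fails, not the real `HydrusImageOpening.RawOpenPILImage`):

```python
# Show how a human-readable source description improves a low-level load
# error. The opener below is a stand-in that always fails.
import io

class DamagedOrUnusualFileException(Exception):
    pass

def raw_open_image(path_or_stream, human_file_description=None):
    try:
        raise OSError('cannot identify image file')
    except OSError:
        # prefer the caller-supplied description over repr(), which for a
        # stream is just '<_io.BytesIO object at 0x...>'
        source = human_file_description if human_file_description is not None else repr(path_or_stream)
        raise DamagedOrUnusualFileException(
            f'Could not load the image at "{source}"--it was likely malformed!'
        )

try:
    raw_open_image(io.BytesIO(b'not an image'), human_file_description='Downloaded File - deadbeef')
except DamagedOrUnusualFileException as e:
    message = str(e)

print(message)  # Could not load the image at "Downloaded File - deadbeef"--it was likely malformed!
```

With the description plumbed through, the user-facing error finally says which download or import produced the broken file.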
@@ -2051,7 +2051,7 @@ class FilesMaintenanceManager( object ):

        ( width, height ) = media_result.GetResolution()

-        if width is None or height is None:
+        if mime in HC.MIMES_THAT_ALWAYS_HAVE_GOOD_RESOLUTION and ( width is None or height is None ):

            # this guy is probably pending a metadata regen but the user forced thumbnail regen now
            # we'll wait for metadata regen to notice the new dimensions and schedule this job again
@@ -2420,7 +2420,7 @@ class FilesMaintenanceManager( object ):

        path = self._controller.client_files_manager.GetFilePath( hash, mime )

        if mime == HC.APPLICATION_PSD:

            try:

                has_icc_profile = HydrusPSDHandling.PSDHasICCProfile( path )
@@ -2672,7 +2672,7 @@ class FilesMaintenanceManager( object ):

        thumbnail_mime = HydrusFileHandling.GetThumbnailMime( thumbnail_path )

-        numpy_image = ClientImageHandling.GenerateNumPyImage( thumbnail_path, thumbnail_mime )
+        numpy_image = HydrusImageHandling.GenerateNumPyImage( thumbnail_path, thumbnail_mime )

        return HydrusBlurhash.GetBlurhashFromNumPy( numpy_image )
@@ -64,6 +64,10 @@ def GetMissingPrefixes( merge_target: str, prefixes: typing.Collection[ str ], m

    return missing_prefixes


+# TODO: A 'FilePath' or 'FileLocation' or similar that holds the path or IO stream, and/or temp_path to use for import calcs, and hash once known, and the human description like 'this came from blah URL'
+# then we spam that all over the import pipeline and when we need a nice error, we ask that guy to describe himself
+# search up 'human_file_description' to see what we'd be replacing
+
class FilesStorageBaseLocation( object ):

    def __init__( self, path: str, ideal_weight: int, max_num_bytes = None ):
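One possible shape for the 'FilePath'/'FileLocation' object the TODO above describes: carry the path or stream, the temp path, the hash once known, and the human description, and let it describe itself for error messages. This is entirely speculative (the class, fields, and `describe` method are invented, not hydrus code):

```python
# Speculative sketch of the 'FileLocation' object proposed in the TODO.
from dataclasses import dataclass
from typing import Optional

@dataclass
class FileLocation:
    path: str
    temp_path: Optional[str] = None
    sha256_hash: Optional[bytes] = None
    human_file_description: Optional[str] = None

    def describe(self) -> str:
        # prefer the human description, fall back to the hash, then the raw path
        if self.human_file_description is not None:
            return self.human_file_description
        if self.sha256_hash is not None:
            return f'file {self.sha256_hash.hex()}'
        return self.path

loc = FileLocation(path='/tmp/abc123', human_file_description='this came from blah URL')
print(loc.describe())  # this came from blah URL
```

Spamming one such object through the import pipeline would replace the loose `human_file_description` keyword argument now threaded through several call signatures.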
@@ -28,13 +28,6 @@ def DiscardBlankPerceptualHashes( perceptual_hashes ):

    return perceptual_hashes


-def GenerateNumPyImage( path, mime ):
-
-    force_pil = CG.client_controller.new_options.GetBoolean( 'load_images_with_pil' )
-
-    return HydrusImageHandling.GenerateNumPyImage( path, mime, force_pil = force_pil )
-
-
def GenerateShapePerceptualHashes( path, mime ):

    if HG.phash_generation_report_mode:

@@ -44,7 +37,7 @@ def GenerateShapePerceptualHashes( path, mime ):

    try:

-        numpy_image = GenerateNumPyImage( path, mime )
+        numpy_image = HydrusImageHandling.GenerateNumPyImage( path, mime )

        return GenerateShapePerceptualHashesNumPy( numpy_image )
@@ -11,8 +11,8 @@ from hydrus.core import HydrusTags

from hydrus.client import ClientConstants as CC
from hydrus.client import ClientDefaults
-from hydrus.client import ClientDuplicates
from hydrus.client import ClientGlobals as CG
+from hydrus.client.duplicates import ClientDuplicates
from hydrus.client.importing.options import FileImportOptions

class ClientOptions( HydrusSerialisable.SerialisableBase ):
@@ -122,206 +122,134 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):

    def _InitialiseDefaults( self ):

        self._dictionary[ 'booleans' ] = {}

        self._dictionary[ 'booleans' ][ 'advanced_mode' ] = False

        self._dictionary[ 'booleans' ][ 'remove_filtered_files_even_when_skipped' ] = False

        self._dictionary[ 'booleans' ][ 'filter_inbox_and_archive_predicates' ] = False

        self._dictionary[ 'booleans' ][ 'discord_dnd_fix' ] = False
        self._dictionary[ 'booleans' ][ 'secret_discord_dnd_fix' ] = False

        self._dictionary[ 'booleans' ][ 'show_unmatched_urls_in_media_viewer' ] = False

        self._dictionary[ 'booleans' ][ 'set_search_focus_on_page_change' ] = False

        self._dictionary[ 'booleans' ][ 'allow_remove_on_manage_tags_input' ] = True
        self._dictionary[ 'booleans' ][ 'yes_no_on_remove_on_manage_tags' ] = True

        self._dictionary[ 'booleans' ][ 'activate_window_on_tag_search_page_activation' ] = False

        self._dictionary[ 'booleans' ][ 'show_related_tags' ] = True
        self._dictionary[ 'booleans' ][ 'show_file_lookup_script_tags' ] = False

        self._dictionary[ 'booleans' ][ 'use_native_menubar' ] = HC.PLATFORM_MACOS

        self._dictionary[ 'booleans' ][ 'shortcuts_merge_non_number_numpad' ] = True

        self._dictionary[ 'booleans' ][ 'disable_get_safe_position_test' ] = False

        self._dictionary[ 'booleans' ][ 'freeze_message_manager_when_mouse_on_other_monitor' ] = False
        self._dictionary[ 'booleans' ][ 'freeze_message_manager_when_main_gui_minimised' ] = False

        self._dictionary[ 'booleans' ][ 'load_images_with_pil' ] = True

        self._dictionary[ 'booleans' ][ 'only_show_delete_from_all_local_domains_when_filtering' ] = False

        self._dictionary[ 'booleans' ][ 'use_system_ffmpeg' ] = False

        self._dictionary[ 'booleans' ][ 'elide_page_tab_names' ] = True

        self._dictionary[ 'booleans' ][ 'maintain_similar_files_duplicate_pairs_during_idle' ] = False

        self._dictionary[ 'booleans' ][ 'show_namespaces' ] = True
        self._dictionary[ 'booleans' ][ 'show_number_namespaces' ] = True
        self._dictionary[ 'booleans' ][ 'show_subtag_number_namespaces' ] = True
        self._dictionary[ 'booleans' ][ 'replace_tag_underscores_with_spaces' ] = False
        self._dictionary[ 'booleans' ][ 'replace_tag_emojis_with_boxes' ] = False

        self._dictionary[ 'booleans' ][ 'verify_regular_https' ] = True

        self._dictionary[ 'booleans' ][ 'page_drop_chase_normally' ] = True
        self._dictionary[ 'booleans' ][ 'page_drop_chase_with_shift' ] = False
        self._dictionary[ 'booleans' ][ 'page_drag_change_tab_normally' ] = True
        self._dictionary[ 'booleans' ][ 'page_drag_change_tab_with_shift' ] = True
        self._dictionary[ 'booleans' ][ 'wheel_scrolls_tab_bar' ] = False

        self._dictionary[ 'booleans' ][ 'remove_local_domain_moved_files' ] = False

        self._dictionary[ 'booleans' ][ 'anchor_and_hide_canvas_drags' ] = HC.PLATFORM_WINDOWS
        self._dictionary[ 'booleans' ][ 'touchscreen_canvas_drags_unanchor' ] = False

        self._dictionary[ 'booleans' ][ 'import_page_progress_display' ] = True

        self._dictionary[ 'booleans' ][ 'process_subs_in_random_order' ] = True

        self._dictionary[ 'booleans' ][ 'ac_select_first_with_count' ] = False

        self._dictionary[ 'booleans' ][ 'saving_sash_positions_on_exit' ] = True

        self._dictionary[ 'booleans' ][ 'database_deferred_delete_maintenance_during_idle' ] = True
        self._dictionary[ 'booleans' ][ 'database_deferred_delete_maintenance_during_active' ] = True

        self._dictionary[ 'booleans' ][ 'file_maintenance_during_idle' ] = True
        self._dictionary[ 'booleans' ][ 'file_maintenance_during_active' ] = True

        self._dictionary[ 'booleans' ][ 'tag_display_maintenance_during_idle' ] = True
        self._dictionary[ 'booleans' ][ 'tag_display_maintenance_during_active' ] = True

        self._dictionary[ 'booleans' ][ 'save_page_sort_on_change' ] = False
        self._dictionary[ 'booleans' ][ 'disable_page_tab_dnd' ] = False
        self._dictionary[ 'booleans' ][ 'force_hide_page_signal_on_new_page' ] = False

        self._dictionary[ 'booleans' ][ 'pause_export_folders_sync' ] = False
        self._dictionary[ 'booleans' ][ 'pause_import_folders_sync' ] = False
        self._dictionary[ 'booleans' ][ 'pause_repo_sync' ] = False
        self._dictionary[ 'booleans' ][ 'pause_subs_sync' ] = False

        self._dictionary[ 'booleans' ][ 'pause_all_new_network_traffic' ] = False
        self._dictionary[ 'booleans' ][ 'boot_with_network_traffic_paused' ] = False
        self._dictionary[ 'booleans' ][ 'pause_all_file_queues' ] = False
        self._dictionary[ 'booleans' ][ 'pause_all_watcher_checkers' ] = False
        self._dictionary[ 'booleans' ][ 'pause_all_gallery_searches' ] = False

        self._dictionary[ 'booleans' ][ 'popup_message_force_min_width' ] = False

        self._dictionary[ 'booleans' ][ 'always_show_iso_time' ] = False

        self._dictionary[ 'booleans' ][ 'confirm_multiple_local_file_services_move' ] = True
        self._dictionary[ 'booleans' ][ 'confirm_multiple_local_file_services_copy' ] = True

        self._dictionary[ 'booleans' ][ 'use_advanced_file_deletion_dialog' ] = False

        self._dictionary[ 'booleans' ][ 'show_new_on_file_seed_short_summary' ] = False
        self._dictionary[ 'booleans' ][ 'show_deleted_on_file_seed_short_summary' ] = False

        self._dictionary[ 'booleans' ][ 'only_save_last_session_during_idle' ] = False

        self._dictionary[ 'booleans' ][ 'do_human_sort_on_hdd_file_import_paths' ] = True

        self._dictionary[ 'booleans' ][ 'highlight_new_watcher' ] = True
        self._dictionary[ 'booleans' ][ 'highlight_new_query' ] = True

        self._dictionary[ 'booleans' ][ 'delete_files_after_export' ] = False

        self._dictionary[ 'booleans' ][ 'file_viewing_statistics_active' ] = True
        self._dictionary[ 'booleans' ][ 'file_viewing_statistics_active_on_archive_delete_filter' ] = True
        self._dictionary[ 'booleans' ][ 'file_viewing_statistics_active_on_dupe_filter' ] = False

        self._dictionary[ 'booleans' ][ 'prefix_hash_when_copying' ] = False
        self._dictionary[ 'booleans' ][ 'file_system_waits_on_wakeup' ] = False

        self._dictionary[ 'booleans' ][ 'always_show_system_everything' ] = False

        self._dictionary[ 'booleans' ][ 'watch_clipboard_for_watcher_urls' ] = False
        self._dictionary[ 'booleans' ][ 'watch_clipboard_for_other_recognised_urls' ] = False

        self._dictionary[ 'booleans' ][ 'default_search_synchronised' ] = True
        self._dictionary[ 'booleans' ][ 'autocomplete_float_main_gui' ] = True

        self._dictionary[ 'booleans' ][ 'global_audio_mute' ] = False
        self._dictionary[ 'booleans' ][ 'media_viewer_audio_mute' ] = False
        self._dictionary[ 'booleans' ][ 'media_viewer_uses_its_own_audio_volume' ] = False
        self._dictionary[ 'booleans' ][ 'preview_audio_mute' ] = False
        self._dictionary[ 'booleans' ][ 'preview_uses_its_own_audio_volume' ] = True

        self._dictionary[ 'booleans' ][ 'always_loop_gifs' ] = True

        self._dictionary[ 'booleans' ][ 'always_show_system_tray_icon' ] = False
        self._dictionary[ 'booleans' ][ 'minimise_client_to_system_tray' ] = False
        self._dictionary[ 'booleans' ][ 'close_client_to_system_tray' ] = False
        self._dictionary[ 'booleans' ][ 'start_client_in_system_tray' ] = False

        self._dictionary[ 'booleans' ][ 'use_qt_file_dialogs' ] = False

        self._dictionary[ 'booleans' ][ 'notify_client_api_cookies' ] = False

        self._dictionary[ 'booleans' ][ 'expand_parents_on_storage_taglists' ] = True
        self._dictionary[ 'booleans' ][ 'expand_parents_on_storage_autocomplete_taglists' ] = True

        self._dictionary[ 'booleans' ][ 'show_parent_decorators_on_storage_taglists' ] = True
        self._dictionary[ 'booleans' ][ 'show_parent_decorators_on_storage_autocomplete_taglists' ] = True

        self._dictionary[ 'booleans' ][ 'show_sibling_decorators_on_storage_taglists' ] = True
        self._dictionary[ 'booleans' ][ 'show_sibling_decorators_on_storage_autocomplete_taglists' ] = True

        self._dictionary[ 'booleans' ][ 'show_session_size_warnings' ] = True

        self._dictionary[ 'booleans' ][ 'delete_lock_for_archived_files' ] = False

        self._dictionary[ 'booleans' ][ 'remember_last_advanced_file_deletion_reason' ] = True
        self._dictionary[ 'booleans' ][ 'remember_last_advanced_file_deletion_special_action' ] = False

        self._dictionary[ 'booleans' ][ 'do_macos_debug_dialog_menus' ] = False

        self._dictionary[ 'booleans' ][ 'save_default_tag_service_tab_on_change' ] = True
|
||||
|
||||
self._dictionary[ 'booleans' ][ 'force_animation_scanbar_show' ] = False
|
||||
|
||||
self._dictionary[ 'booleans' ][ 'call_mouse_buttons_primary_secondary' ] = False
|
||||
|
||||
self._dictionary[ 'booleans' ][ 'start_note_editing_at_end' ] = True
|
||||
|
||||
self._dictionary[ 'booleans' ][ 'draw_transparency_checkerboard_media_canvas' ] = False
|
||||
self._dictionary[ 'booleans' ][ 'draw_transparency_checkerboard_media_canvas_duplicates' ] = True
|
||||
|
||||
self._dictionary[ 'booleans' ][ 'menu_choice_buttons_can_mouse_scroll' ] = True
|
||||
|
||||
self._dictionary[ 'booleans' ][ 'focus_preview_on_ctrl_click' ] = False
|
||||
self._dictionary[ 'booleans' ][ 'focus_preview_on_ctrl_click_only_static' ] = False
|
||||
self._dictionary[ 'booleans' ][ 'focus_preview_on_shift_click' ] = False
|
||||
self._dictionary[ 'booleans' ][ 'focus_preview_on_shift_click_only_static' ] = False
|
||||
|
||||
self._dictionary[ 'booleans' ][ 'fade_sibling_connector' ] = True
|
||||
self._dictionary[ 'booleans' ][ 'use_custom_sibling_connector_colour' ] = False
|
||||
|
||||
self._dictionary[ 'booleans' ][ 'hide_uninteresting_local_import_time' ] = True
|
||||
self._dictionary[ 'booleans' ][ 'hide_uninteresting_modified_time' ] = True
|
||||
|
||||
self._dictionary[ 'booleans' ][ 'allow_blurhash_fallback' ] = True
|
||||
|
||||
self._dictionary[ 'booleans' ][ 'fade_thumbnails' ] = True
|
||||
|
||||
self._dictionary[ 'booleans' ][ 'slideshow_always_play_duration_media_once_through' ] = False
|
||||
|
||||
self._dictionary[ 'booleans' ][ 'enable_truncated_images_pil' ] = True
|
||||
self._dictionary[ 'booleans' ][ 'do_icc_profile_normalisation' ] = True
|
||||
|
||||
from hydrus.client.gui.canvas import ClientGUIMPV
|
||||
|
||||
self._dictionary[ 'booleans' ][ 'mpv_available_at_start' ] = ClientGUIMPV.MPV_IS_AVAILABLE
|
||||
        self._dictionary[ 'booleans' ] = {
            'advanced_mode' : False,
            'remove_filtered_files_even_when_skipped' : False,
            'filter_inbox_and_archive_predicates' : False,
            'discord_dnd_fix' : False,
            'secret_discord_dnd_fix' : False,
            'show_unmatched_urls_in_media_viewer' : False,
            'set_search_focus_on_page_change' : False,
            'allow_remove_on_manage_tags_input' : True,
            'yes_no_on_remove_on_manage_tags' : True,
            'activate_window_on_tag_search_page_activation' : False,
            'show_related_tags' : True,
            'show_file_lookup_script_tags' : False,
            'use_native_menubar' : HC.PLATFORM_MACOS,
            'shortcuts_merge_non_number_numpad' : True,
            'disable_get_safe_position_test' : False,
            'freeze_message_manager_when_mouse_on_other_monitor' : False,
            'freeze_message_manager_when_main_gui_minimised' : False,
            'load_images_with_pil' : True,
            'only_show_delete_from_all_local_domains_when_filtering' : False,
            'use_system_ffmpeg' : False,
            'elide_page_tab_names' : True,
            'maintain_similar_files_duplicate_pairs_during_idle' : False,
            'show_namespaces' : True,
            'show_number_namespaces' : True,
            'show_subtag_number_namespaces' : True,
            'replace_tag_underscores_with_spaces' : False,
            'replace_tag_emojis_with_boxes' : False,
            'verify_regular_https' : True,
            'page_drop_chase_normally' : True,
            'page_drop_chase_with_shift' : False,
            'page_drag_change_tab_normally' : True,
            'page_drag_change_tab_with_shift' : True,
            'wheel_scrolls_tab_bar' : False,
            'remove_local_domain_moved_files' : False,
            'anchor_and_hide_canvas_drags' : HC.PLATFORM_WINDOWS,
            'touchscreen_canvas_drags_unanchor' : False,
            'import_page_progress_display' : True,
            'process_subs_in_random_order' : True,
            'ac_select_first_with_count' : False,
            'saving_sash_positions_on_exit' : True,
            'database_deferred_delete_maintenance_during_idle' : True,
            'database_deferred_delete_maintenance_during_active' : True,
            'file_maintenance_during_idle' : True,
            'file_maintenance_during_active' : True,
            'tag_display_maintenance_during_idle' : True,
            'tag_display_maintenance_during_active' : True,
            'save_page_sort_on_change' : False,
            'disable_page_tab_dnd' : False,
            'force_hide_page_signal_on_new_page' : False,
            'pause_export_folders_sync' : False,
            'pause_import_folders_sync' : False,
            'pause_repo_sync' : False,
            'pause_subs_sync' : False,
            'pause_all_new_network_traffic' : False,
            'boot_with_network_traffic_paused' : False,
            'pause_all_file_queues' : False,
            'pause_all_watcher_checkers' : False,
            'pause_all_gallery_searches' : False,
            'popup_message_force_min_width' : False,
            'always_show_iso_time' : False,
            'confirm_multiple_local_file_services_move' : True,
            'confirm_multiple_local_file_services_copy' : True,
            'use_advanced_file_deletion_dialog' : False,
            'show_new_on_file_seed_short_summary' : False,
            'show_deleted_on_file_seed_short_summary' : False,
            'only_save_last_session_during_idle' : False,
            'do_human_sort_on_hdd_file_import_paths' : True,
            'highlight_new_watcher' : True,
            'highlight_new_query' : True,
            'delete_files_after_export' : False,
            'file_viewing_statistics_active' : True,
            'file_viewing_statistics_active_on_archive_delete_filter' : True,
            'file_viewing_statistics_active_on_dupe_filter' : False,
            'prefix_hash_when_copying' : False,
            'file_system_waits_on_wakeup' : False,
            'always_show_system_everything' : False,
            'watch_clipboard_for_watcher_urls' : False,
            'watch_clipboard_for_other_recognised_urls' : False,
            'default_search_synchronised' : True,
            'autocomplete_float_main_gui' : True,
            'global_audio_mute' : False,
            'media_viewer_audio_mute' : False,
            'media_viewer_uses_its_own_audio_volume' : False,
            'preview_audio_mute' : False,
            'preview_uses_its_own_audio_volume' : True,
            'always_loop_gifs' : True,
            'always_show_system_tray_icon' : False,
            'minimise_client_to_system_tray' : False,
            'close_client_to_system_tray' : False,
            'start_client_in_system_tray' : False,
            'use_qt_file_dialogs' : False,
            'notify_client_api_cookies' : False,
            'expand_parents_on_storage_taglists' : True,
            'expand_parents_on_storage_autocomplete_taglists' : True,
            'show_parent_decorators_on_storage_taglists' : True,
            'show_parent_decorators_on_storage_autocomplete_taglists' : True,
            'show_sibling_decorators_on_storage_taglists' : True,
            'show_sibling_decorators_on_storage_autocomplete_taglists' : True,
            'show_session_size_warnings' : True,
            'delete_lock_for_archived_files' : False,
            'remember_last_advanced_file_deletion_reason' : True,
            'remember_last_advanced_file_deletion_special_action' : False,
            'do_macos_debug_dialog_menus' : False,
            'save_default_tag_service_tab_on_change' : True,
            'force_animation_scanbar_show' : False,
            'call_mouse_buttons_primary_secondary' : False,
            'start_note_editing_at_end' : True,
            'draw_transparency_checkerboard_media_canvas' : False,
            'draw_transparency_checkerboard_media_canvas_duplicates' : True,
            'menu_choice_buttons_can_mouse_scroll' : True,
            'focus_preview_on_ctrl_click' : False,
            'focus_preview_on_ctrl_click_only_static' : False,
            'focus_preview_on_shift_click' : False,
            'focus_preview_on_shift_click_only_static' : False,
            'fade_sibling_connector' : True,
            'use_custom_sibling_connector_colour' : False,
            'hide_uninteresting_local_import_time' : True,
            'hide_uninteresting_modified_time' : True,
            'allow_blurhash_fallback' : True,
            'fade_thumbnails' : True,
            'slideshow_always_play_duration_media_once_through' : False,
            'enable_truncated_images_pil' : True,
            'do_icc_profile_normalisation' : True,
            'mpv_available_at_start' : ClientGUIMPV.MPV_IS_AVAILABLE
        }
        
        #
        
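Editorial note, not part of the patch: the hunk above replaces a long run of repeated `self._dictionary[ 'booleans' ][ key ] = value` assignments with a single dict literal. A minimal sketch of why the two forms produce the same defaults table (the keys here are a tiny illustrative subset, not the full hydrus option set):

```python
# Two ways to build the same defaults map. The literal form also makes the
# preceding "initialise to {}" line unnecessary, which is why it disappears
# from the refactored code.

def build_defaults_assignments():
    
    d = { 'booleans' : {} }
    
    d[ 'booleans' ][ 'advanced_mode' ] = False
    d[ 'booleans' ][ 'always_loop_gifs' ] = True
    
    return d

def build_defaults_literal():
    
    return {
        'booleans' : {
            'advanced_mode' : False,
            'always_loop_gifs' : True
        }
    }

assert build_defaults_assignments() == build_defaults_literal()
```

The literal form keeps every key visible in one expression, which makes duplicate or missing defaults easier to spot in review.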
@@ -415,207 +343,144 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):
        
        #
        
        self._dictionary[ 'integers' ] = {}
        
        self._dictionary[ 'integers' ][ 'notebook_tab_alignment' ] = CC.DIRECTION_UP
        
        self._dictionary[ 'integers' ][ 'video_buffer_size' ] = 96 * 1024 * 1024
        
        self._dictionary[ 'integers' ][ 'related_tags_search_1_duration_ms' ] = 250
        self._dictionary[ 'integers' ][ 'related_tags_search_2_duration_ms' ] = 2000
        self._dictionary[ 'integers' ][ 'related_tags_search_3_duration_ms' ] = 6000
        self._dictionary[ 'integers' ][ 'related_tags_concurrence_threshold_percent' ] = 6
        
        self._dictionary[ 'integers' ][ 'suggested_tags_width' ] = 300
        
        self._dictionary[ 'integers' ][ 'similar_files_duplicate_pairs_search_distance' ] = 0
        
        self._dictionary[ 'integers' ][ 'default_new_page_goes' ] = CC.NEW_PAGE_GOES_FAR_RIGHT
        
        self._dictionary[ 'integers' ][ 'num_recent_petition_reasons' ] = 5
        
        self._dictionary[ 'integers' ][ 'max_page_name_chars' ] = 20
        self._dictionary[ 'integers' ][ 'page_file_count_display' ] = CC.PAGE_FILE_COUNT_DISPLAY_ALL
        
        self._dictionary[ 'integers' ][ 'network_timeout' ] = 10
        self._dictionary[ 'integers' ][ 'connection_error_wait_time' ] = 15
        self._dictionary[ 'integers' ][ 'serverside_bandwidth_wait_time' ] = 60
        
        self._dictionary[ 'integers' ][ 'thumbnail_visibility_scroll_percent' ] = 75
        self._dictionary[ 'integers' ][ 'ideal_tile_dimension' ] = 768
        
        self._dictionary[ 'integers' ][ 'wake_delay_period' ] = 15
        
        from hydrus.client.gui.canvas import ClientGUICanvasMedia
        
        self._dictionary[ 'integers' ][ 'media_viewer_zoom_center' ] = ClientGUICanvasMedia.ZOOM_CENTERPOINT_MOUSE
        
        self._dictionary[ 'integers' ][ 'last_session_save_period_minutes' ] = 5
        
        self._dictionary[ 'integers' ][ 'shutdown_work_period' ] = 86400
        
        self._dictionary[ 'integers' ][ 'max_network_jobs' ] = 15
        self._dictionary[ 'integers' ][ 'max_network_jobs_per_domain' ] = 3
        
        self._dictionary[ 'integers' ][ 'max_connection_attempts_allowed' ] = 5
        self._dictionary[ 'integers' ][ 'max_request_attempts_allowed_get' ] = 5
        
        from hydrus.core.files.images import HydrusImageHandling
        
        self._dictionary[ 'integers' ][ 'thumbnail_scale_type' ] = HydrusImageHandling.THUMBNAIL_SCALE_DOWN_ONLY
        
        self._dictionary[ 'integers' ][ 'max_simultaneous_subscriptions' ] = 1
        
        self._dictionary[ 'integers' ][ 'gallery_page_wait_period_pages' ] = 15
        self._dictionary[ 'integers' ][ 'gallery_page_wait_period_subscriptions' ] = 5
        self._dictionary[ 'integers' ][ 'watcher_page_wait_period' ] = 5
        
        self._dictionary[ 'integers' ][ 'popup_message_character_width' ] = 56
        
        self._dictionary[ 'integers' ][ 'duplicate_filter_max_batch_size' ] = 250
        
        self._dictionary[ 'integers' ][ 'video_thumbnail_percentage_in' ] = 35
        
        self._dictionary[ 'integers' ][ 'global_audio_volume' ] = 70
        self._dictionary[ 'integers' ][ 'media_viewer_audio_volume' ] = 70
        self._dictionary[ 'integers' ][ 'preview_audio_volume' ] = 70
        
        self._dictionary[ 'integers' ][ 'duplicate_comparison_score_higher_jpeg_quality' ] = 10
        self._dictionary[ 'integers' ][ 'duplicate_comparison_score_much_higher_jpeg_quality' ] = 20
        self._dictionary[ 'integers' ][ 'duplicate_comparison_score_higher_filesize' ] = 10
        self._dictionary[ 'integers' ][ 'duplicate_comparison_score_much_higher_filesize' ] = 20
        self._dictionary[ 'integers' ][ 'duplicate_comparison_score_higher_resolution' ] = 20
        self._dictionary[ 'integers' ][ 'duplicate_comparison_score_much_higher_resolution' ] = 50
        self._dictionary[ 'integers' ][ 'duplicate_comparison_score_more_tags' ] = 8
        self._dictionary[ 'integers' ][ 'duplicate_comparison_score_older' ] = 4
        self._dictionary[ 'integers' ][ 'duplicate_comparison_score_nicer_ratio' ] = 10
        self._dictionary[ 'integers' ][ 'duplicate_comparison_score_has_audio' ] = 20
        
        self._dictionary[ 'integers' ][ 'thumbnail_cache_size' ] = 1024 * 1024 * 32
        self._dictionary[ 'integers' ][ 'image_cache_size' ] = 1024 * 1024 * 384
        self._dictionary[ 'integers' ][ 'image_tile_cache_size' ] = 1024 * 1024 * 256
        
        self._dictionary[ 'integers' ][ 'thumbnail_cache_timeout' ] = 86400
        self._dictionary[ 'integers' ][ 'image_cache_timeout' ] = 600
        self._dictionary[ 'integers' ][ 'image_tile_cache_timeout' ] = 300
        
        self._dictionary[ 'integers' ][ 'image_cache_storage_limit_percentage' ] = 25
        self._dictionary[ 'integers' ][ 'image_cache_prefetch_limit_percentage' ] = 10
        
        self._dictionary[ 'integers' ][ 'media_viewer_prefetch_delay_base_ms' ] = 100
        self._dictionary[ 'integers' ][ 'media_viewer_prefetch_num_previous' ] = 2
        self._dictionary[ 'integers' ][ 'media_viewer_prefetch_num_next' ] = 3
        
        self._dictionary[ 'integers' ][ 'thumbnail_border' ] = 1
        self._dictionary[ 'integers' ][ 'thumbnail_margin' ] = 2
        
        self._dictionary[ 'integers' ][ 'thumbnail_dpr_percent' ] = 100
        
        self._dictionary[ 'integers' ][ 'file_maintenance_idle_throttle_files' ] = 1
        self._dictionary[ 'integers' ][ 'file_maintenance_idle_throttle_time_delta' ] = 2
        
        self._dictionary[ 'integers' ][ 'file_maintenance_active_throttle_files' ] = 1
        self._dictionary[ 'integers' ][ 'file_maintenance_active_throttle_time_delta' ] = 20
        
        self._dictionary[ 'integers' ][ 'subscription_network_error_delay' ] = 12 * 3600
        self._dictionary[ 'integers' ][ 'subscription_other_error_delay' ] = 36 * 3600
        self._dictionary[ 'integers' ][ 'downloader_network_error_delay' ] = 90 * 60
        
        self._dictionary[ 'integers' ][ 'file_viewing_stats_menu_display' ] = CC.FILE_VIEWING_STATS_MENU_DISPLAY_MEDIA_AND_PREVIEW_IN_SUBMENU
        
        self._dictionary[ 'integers' ][ 'number_of_gui_session_backups' ] = 10
        
        self._dictionary[ 'integers' ][ 'animated_scanbar_height' ] = 20
        self._dictionary[ 'integers' ][ 'animated_scanbar_nub_width' ] = 10
        
        self._dictionary[ 'integers' ][ 'domain_network_infrastructure_error_number' ] = 3
        self._dictionary[ 'integers' ][ 'domain_network_infrastructure_error_time_delta' ] = 600
        
        self._dictionary[ 'integers' ][ 'ac_read_list_height_num_chars' ] = 21
        self._dictionary[ 'integers' ][ 'ac_write_list_height_num_chars' ] = 11
        
        self._dictionary[ 'integers' ][ 'system_busy_cpu_percent' ] = 50
        
        self._dictionary[ 'integers' ][ 'human_bytes_sig_figs' ] = 3
        
        self._dictionary[ 'integers' ][ 'ms_to_wait_between_physical_file_deletes' ] = 250
        
        self._dictionary[ 'integers' ][ 'potential_duplicates_search_work_time_ms' ] = 500
        self._dictionary[ 'integers' ][ 'potential_duplicates_search_rest_percentage' ] = 100
        
        self._dictionary[ 'integers' ][ 'repository_processing_work_time_ms_very_idle' ] = 30000
        self._dictionary[ 'integers' ][ 'repository_processing_rest_percentage_very_idle' ] = 3
        
        self._dictionary[ 'integers' ][ 'repository_processing_work_time_ms_idle' ] = 10000
        self._dictionary[ 'integers' ][ 'repository_processing_rest_percentage_idle' ] = 5
        
        self._dictionary[ 'integers' ][ 'repository_processing_work_time_ms_normal' ] = 500
        self._dictionary[ 'integers' ][ 'repository_processing_rest_percentage_normal' ] = 10
        
        self._dictionary[ 'integers' ][ 'tag_display_processing_work_time_ms_idle' ] = 15000
        self._dictionary[ 'integers' ][ 'tag_display_processing_rest_percentage_idle' ] = 3
        
        self._dictionary[ 'integers' ][ 'tag_display_processing_work_time_ms_normal' ] = 100
        self._dictionary[ 'integers' ][ 'tag_display_processing_rest_percentage_normal' ] = 9900
        
        self._dictionary[ 'integers' ][ 'tag_display_processing_work_time_ms_work_hard' ] = 5000
        self._dictionary[ 'integers' ][ 'tag_display_processing_rest_percentage_work_hard' ] = 5
        
        self._dictionary[ 'integers' ][ 'deferred_table_delete_work_time_ms_idle' ] = 20000
        self._dictionary[ 'integers' ][ 'deferred_table_delete_rest_percentage_idle' ] = 10
        
        self._dictionary[ 'integers' ][ 'deferred_table_delete_work_time_ms_normal' ] = 250
        self._dictionary[ 'integers' ][ 'deferred_table_delete_rest_percentage_normal' ] = 1000
        
        self._dictionary[ 'integers' ][ 'deferred_table_delete_work_time_ms_work_hard' ] = 5000
        self._dictionary[ 'integers' ][ 'deferred_table_delete_rest_percentage_work_hard' ] = 10
        self._dictionary[ 'integers' ] = {
            'notebook_tab_alignment' : CC.DIRECTION_UP,
            'video_buffer_size' : 96 * 1024 * 1024,
            'related_tags_search_1_duration_ms' : 250,
            'related_tags_search_2_duration_ms' : 2000,
            'related_tags_search_3_duration_ms' : 6000,
            'related_tags_concurrence_threshold_percent' : 6,
            'suggested_tags_width' : 300,
            'similar_files_duplicate_pairs_search_distance' : 0,
            'default_new_page_goes' : CC.NEW_PAGE_GOES_FAR_RIGHT,
            'num_recent_petition_reasons' : 5,
            'max_page_name_chars' : 20,
            'page_file_count_display' : CC.PAGE_FILE_COUNT_DISPLAY_ALL,
            'network_timeout' : 10,
            'connection_error_wait_time' : 15,
            'serverside_bandwidth_wait_time' : 60,
            'thumbnail_visibility_scroll_percent' : 75,
            'ideal_tile_dimension' : 768,
            'wake_delay_period' : 15,
            'media_viewer_zoom_center' : ClientGUICanvasMedia.ZOOM_CENTERPOINT_MOUSE,
            'last_session_save_period_minutes' : 5,
            'shutdown_work_period' : 86400,
            'max_network_jobs' : 15,
            'max_network_jobs_per_domain' : 3,
            'max_connection_attempts_allowed' : 5,
            'max_request_attempts_allowed_get' : 5,
            'thumbnail_scale_type' : HydrusImageHandling.THUMBNAIL_SCALE_DOWN_ONLY,
            'max_simultaneous_subscriptions' : 1,
            'gallery_page_wait_period_pages' : 15,
            'gallery_page_wait_period_subscriptions' : 5,
            'watcher_page_wait_period' : 5,
            'popup_message_character_width' : 56,
            'duplicate_filter_max_batch_size' : 250,
            'video_thumbnail_percentage_in' : 35,
            'global_audio_volume' : 70,
            'media_viewer_audio_volume' : 70,
            'preview_audio_volume' : 70,
            'duplicate_comparison_score_higher_jpeg_quality' : 10,
            'duplicate_comparison_score_much_higher_jpeg_quality' : 20,
            'duplicate_comparison_score_higher_filesize' : 10,
            'duplicate_comparison_score_much_higher_filesize' : 20,
            'duplicate_comparison_score_higher_resolution' : 20,
            'duplicate_comparison_score_much_higher_resolution' : 50,
            'duplicate_comparison_score_more_tags' : 8,
            'duplicate_comparison_score_older' : 4,
            'duplicate_comparison_score_nicer_ratio' : 10,
            'duplicate_comparison_score_has_audio' : 20,
            'thumbnail_cache_size' : 1024 * 1024 * 32,
            'image_cache_size' : 1024 * 1024 * 384,
            'image_tile_cache_size' : 1024 * 1024 * 256,
            'thumbnail_cache_timeout' : 86400,
            'image_cache_timeout' : 600,
            'image_tile_cache_timeout' : 300,
            'image_cache_storage_limit_percentage' : 25,
            'image_cache_prefetch_limit_percentage' : 10,
            'media_viewer_prefetch_delay_base_ms' : 100,
            'media_viewer_prefetch_num_previous' : 2,
            'media_viewer_prefetch_num_next' : 3,
            'thumbnail_border' : 1,
            'thumbnail_margin' : 2,
            'thumbnail_dpr_percent' : 100,
            'file_maintenance_idle_throttle_files' : 1,
            'file_maintenance_idle_throttle_time_delta' : 2,
            'file_maintenance_active_throttle_files' : 1,
            'file_maintenance_active_throttle_time_delta' : 20,
            'subscription_network_error_delay' : 12 * 3600,
            'subscription_other_error_delay' : 36 * 3600,
            'downloader_network_error_delay' : 90 * 60,
            'file_viewing_stats_menu_display' : CC.FILE_VIEWING_STATS_MENU_DISPLAY_MEDIA_AND_PREVIEW_IN_SUBMENU,
            'number_of_gui_session_backups' : 10,
            'animated_scanbar_height' : 20,
            'animated_scanbar_nub_width' : 10,
            'domain_network_infrastructure_error_number' : 3,
            'domain_network_infrastructure_error_time_delta' : 600,
            'ac_read_list_height_num_chars' : 21,
            'ac_write_list_height_num_chars' : 11,
            'system_busy_cpu_percent' : 50,
            'human_bytes_sig_figs' : 3,
            'ms_to_wait_between_physical_file_deletes' : 250,
            'potential_duplicates_search_work_time_ms' : 500,
            'potential_duplicates_search_rest_percentage' : 100,
            'repository_processing_work_time_ms_very_idle' : 30000,
            'repository_processing_rest_percentage_very_idle' : 3,
            'repository_processing_work_time_ms_idle' : 10000,
            'repository_processing_rest_percentage_idle' : 5,
            'repository_processing_work_time_ms_normal' : 500,
            'repository_processing_rest_percentage_normal' : 10,
            'tag_display_processing_work_time_ms_idle' : 15000,
            'tag_display_processing_rest_percentage_idle' : 3,
            'tag_display_processing_work_time_ms_normal' : 100,
            'tag_display_processing_rest_percentage_normal' : 9900,
            'tag_display_processing_work_time_ms_work_hard' : 5000,
            'tag_display_processing_rest_percentage_work_hard' : 5,
            'deferred_table_delete_work_time_ms_idle' : 20000,
            'deferred_table_delete_rest_percentage_idle' : 10,
            'deferred_table_delete_work_time_ms_normal' : 250,
            'deferred_table_delete_rest_percentage_normal' : 1000,
            'deferred_table_delete_work_time_ms_work_hard' : 5000,
            'deferred_table_delete_rest_percentage_work_hard' : 10
        }
        
        #
        
        self._dictionary[ 'keys' ] = {}
        
        self._dictionary[ 'keys' ][ 'default_tag_service_tab' ] = CC.DEFAULT_LOCAL_TAG_SERVICE_KEY.hex()
        self._dictionary[ 'keys' ][ 'default_tag_service_search_page' ] = CC.COMBINED_TAG_SERVICE_KEY.hex()
        self._dictionary[ 'keys' ][ 'default_gug_key' ] = HydrusData.GenerateKey().hex()
        self._dictionary[ 'keys' ] = {
            'default_tag_service_tab' : CC.DEFAULT_LOCAL_TAG_SERVICE_KEY.hex(),
            'default_tag_service_search_page' : CC.COMBINED_TAG_SERVICE_KEY.hex(),
            'default_gug_key' : HydrusData.GenerateKey().hex()
        }
        
        self._dictionary[ 'key_list' ] = {}
        
        #
        
        self._dictionary[ 'noneable_integers' ] = {}
        
        self._dictionary[ 'noneable_integers' ][ 'forced_search_limit' ] = None
        
        self._dictionary[ 'noneable_integers' ][ 'num_recent_tags' ] = 20
        
        self._dictionary[ 'noneable_integers' ][ 'duplicate_background_switch_intensity_a' ] = 0
        self._dictionary[ 'noneable_integers' ][ 'duplicate_background_switch_intensity_b' ] = 3
        
        self._dictionary[ 'noneable_integers' ][ 'last_review_bandwidth_search_distance' ] = 7 * 86400
        
        self._dictionary[ 'noneable_integers' ][ 'file_viewing_statistics_media_min_time' ] = 2
        self._dictionary[ 'noneable_integers' ][ 'file_viewing_statistics_media_max_time' ] = 600
        self._dictionary[ 'noneable_integers' ][ 'file_viewing_statistics_preview_min_time' ] = 5
        self._dictionary[ 'noneable_integers' ][ 'file_viewing_statistics_preview_max_time' ] = 60
        
        self._dictionary[ 'noneable_integers' ][ 'subscription_file_error_cancel_threshold' ] = 5
        
        self._dictionary[ 'noneable_integers' ][ 'media_viewer_cursor_autohide_time_ms' ] = 700
        
        self._dictionary[ 'noneable_integers' ][ 'idle_mode_client_api_timeout' ] = None
        
        self._dictionary[ 'noneable_integers' ][ 'system_busy_cpu_count' ] = 1
        
        self._dictionary[ 'noneable_integers' ][ 'animated_scanbar_hide_height' ] = 5
        
        self._dictionary[ 'noneable_integers' ][ 'last_backup_time' ] = None
        
        self._dictionary[ 'noneable_integers' ][ 'slideshow_short_duration_loop_percentage' ] = 20
        self._dictionary[ 'noneable_integers' ][ 'slideshow_short_duration_loop_seconds' ] = 10
        
        self._dictionary[ 'noneable_integers' ][ 'slideshow_short_duration_cutoff_percentage' ] = 75
        
        self._dictionary[ 'noneable_integers' ][ 'slideshow_long_duration_overspill_percentage' ] = 50
        self._dictionary[ 'noneable_integers' ] = {
            'forced_search_limit' : None,
            'num_recent_tags' : 20,
            'duplicate_background_switch_intensity_a' : 0,
            'duplicate_background_switch_intensity_b' : 3,
            'last_review_bandwidth_search_distance' : 7 * 86400,
            'file_viewing_statistics_media_min_time' : 2,
            'file_viewing_statistics_media_max_time' : 600,
            'file_viewing_statistics_preview_min_time' : 5,
            'file_viewing_statistics_preview_max_time' : 60,
            'subscription_file_error_cancel_threshold' : 5,
            'media_viewer_cursor_autohide_time_ms' : 700,
            'idle_mode_client_api_timeout' : None,
            'system_busy_cpu_count' : 1,
            'animated_scanbar_hide_height' : 5,
            'last_backup_time' : None,
            'slideshow_short_duration_loop_percentage' : 20,
            'slideshow_short_duration_loop_seconds' : 10,
            'slideshow_short_duration_cutoff_percentage' : 75,
            'slideshow_long_duration_overspill_percentage' : 50,
            'num_to_show_in_ac_dropdown_children_tab' : 40
        }
        
        #
        
@@ -623,50 +488,50 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):
        
        #
        
self._dictionary[ 'noneable_strings' ] = {}
|
||||
self._dictionary[ 'noneable_strings' ] = {
|
||||
'favourite_file_lookup_script' : 'gelbooru md5',
|
||||
'suggested_tags_layout' : 'notebook',
|
||||
'backup_path' : None,
|
||||
'web_browser_path' : None,
|
||||
'last_png_export_dir' : None,
|
||||
'media_background_bmp_path' : None,
|
||||
'http_proxy' : None,
|
||||
'https_proxy' : None,
|
||||
'no_proxy' : '127.0.0.1',
|
||||
'qt_style_name' : None,
|
||||
'qt_stylesheet_name' : None,
|
||||
'last_advanced_file_deletion_reason' : None,
|
||||
'last_advanced_file_deletion_special_action' : None,
|
||||
'sibling_connector_custom_namespace_colour' : 'system',
|
||||
'or_connector_custom_namespace_colour' : 'system'
|
||||
}
|
||||
|
||||
self._dictionary[ 'noneable_strings' ][ 'favourite_file_lookup_script' ] = 'gelbooru md5'
|
||||
self._dictionary[ 'noneable_strings' ][ 'suggested_tags_layout' ] = 'notebook'
|
||||
self._dictionary[ 'noneable_strings' ][ 'backup_path' ] = None
|
||||
self._dictionary[ 'noneable_strings' ][ 'web_browser_path' ] = None
|
||||
self._dictionary[ 'noneable_strings' ][ 'last_png_export_dir' ] = None
|
||||
self._dictionary[ 'noneable_strings' ][ 'media_background_bmp_path' ] = None
|
||||
self._dictionary[ 'noneable_strings' ][ 'http_proxy' ] = None
|
||||
self._dictionary[ 'noneable_strings' ][ 'https_proxy' ] = None
|
||||
self._dictionary[ 'noneable_strings' ][ 'no_proxy' ] = '127.0.0.1'
|
||||
self._dictionary[ 'noneable_strings' ][ 'qt_style_name' ] = None
|
||||
self._dictionary[ 'noneable_strings' ][ 'qt_stylesheet_name' ] = None
|
||||
self._dictionary[ 'noneable_strings' ][ 'last_advanced_file_deletion_reason' ] = None
|
||||
self._dictionary[ 'noneable_strings' ][ 'last_advanced_file_deletion_special_action' ] = None
|
||||
self._dictionary[ 'noneable_strings' ][ 'sibling_connector_custom_namespace_colour' ] = 'system'
|
||||
self._dictionary[ 'noneable_strings' ][ 'or_connector_custom_namespace_colour' ] = 'system'
|
||||
self._dictionary[ 'strings' ] = {
|
||||
-self._dictionary[ 'strings' ] = {}
-
-self._dictionary[ 'strings' ][ 'app_display_name' ] = 'hydrus client'
-self._dictionary[ 'strings' ][ 'namespace_connector' ] = ':'
-self._dictionary[ 'strings' ][ 'sibling_connector' ] = ' \u2192 '
-self._dictionary[ 'strings' ][ 'or_connector' ] = ' OR '
-self._dictionary[ 'strings' ][ 'export_phrase' ] = '{hash}'
-self._dictionary[ 'strings' ][ 'current_colourset' ] = 'default'
-self._dictionary[ 'strings' ][ 'favourite_simple_downloader_formula' ] = 'all files linked by images in page'
-self._dictionary[ 'strings' ][ 'thumbnail_scroll_rate' ] = '1.0'
-self._dictionary[ 'strings' ][ 'pause_character' ] = '\u23F8'
-self._dictionary[ 'strings' ][ 'stop_character' ] = '\u23F9'
-self._dictionary[ 'strings' ][ 'default_gug_name' ] = 'safebooru tag search'
-self._dictionary[ 'strings' ][ 'has_audio_label' ] = '\U0001F50A'
-self._dictionary[ 'strings' ][ 'has_duration_label' ] = ' \u23F5 '
-self._dictionary[ 'strings' ][ 'discord_dnd_filename_pattern' ] = '{hash}'
-self._dictionary[ 'strings' ][ 'default_suggested_tags_notebook_page' ] = 'related'
-self._dictionary[ 'strings' ][ 'last_incremental_tagging_namespace' ] = 'page'
-self._dictionary[ 'strings' ][ 'last_incremental_tagging_prefix' ] = ''
-self._dictionary[ 'strings' ][ 'last_incremental_tagging_suffix' ] = ''
+self._dictionary[ 'strings' ] = {
+    'app_display_name' : 'hydrus client',
+    'namespace_connector' : ':',
+    'sibling_connector' : ' \u2192 ',
+    'or_connector' : ' OR ',
+    'export_phrase' : '{hash}',
+    'current_colourset' : 'default',
+    'favourite_simple_downloader_formula' : 'all files linked by images in page',
+    'thumbnail_scroll_rate' : '1.0',
+    'pause_character' : '\u23F8',
+    'stop_character' : '\u23F9',
+    'default_gug_name' : 'safebooru tag search',
+    'has_audio_label' : '\U0001F50A',
+    'has_duration_label' : ' \u23F5 ',
+    'discord_dnd_filename_pattern' : '{hash}',
+    'default_suggested_tags_notebook_page' : 'related',
+    'last_incremental_tagging_namespace' : 'page',
+    'last_incremental_tagging_prefix' : '',
+    'last_incremental_tagging_suffix' : ''
+}

-self._dictionary[ 'string_list' ] = {}
-
-self._dictionary[ 'string_list' ][ 'default_media_viewer_custom_shortcuts' ] = []
-self._dictionary[ 'string_list' ][ 'favourite_tags' ] = []
-self._dictionary[ 'string_list' ][ 'advanced_file_deletion_reasons' ] = [ 'I do not like it.', 'It is bad quality.', 'It is not appropriate for this client.', 'Temporary delete--I want to bring it back later.' ]
+self._dictionary[ 'string_list' ] = {
+    'default_media_viewer_custom_shortcuts' : [],
+    'favourite_tags' : [],
+    'advanced_file_deletion_reasons' : [ 'I do not like it.', 'It is bad quality.', 'It is not appropriate for this client.', 'Temporary delete--I want to bring it back later.' ]
+}

#
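The hunk above swaps a run of per-key default assignments for a single dict literal. A minimal sketch of the two equivalent styles, using two real keys from the defaults above (the function names are hypothetical, not hydrus's):

```python
# Both styles build the same mapping; the literal is one expression, which is
# easier to scan and cannot accidentally leave behind a stale key from an
# earlier default set.

def build_defaults_assignments():
    d = {}
    d[ 'app_display_name' ] = 'hydrus client'
    d[ 'namespace_connector' ] = ':'
    return d

def build_defaults_literal():
    return {
        'app_display_name' : 'hydrus client',
        'namespace_connector' : ':'
    }

assert build_defaults_assignments() == build_defaults_literal()
```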
@ -45,12 +45,14 @@ def FrameIndexOutOfRange( index, range_start, range_end ):
return False


def GenerateHydrusBitmap( path, mime, compressed = True ):

-    numpy_image = ClientImageHandling.GenerateNumPyImage( path, mime )
+    numpy_image = HydrusImageHandling.GenerateNumPyImage( path, mime )

    return GenerateHydrusBitmapFromNumPyImage( numpy_image, compressed = compressed )


def GenerateHydrusBitmapFromNumPyImage( numpy_image, compressed = True ):

    ( y, x, depth ) = numpy_image.shape
@ -247,7 +249,7 @@ class ImageRenderer( ClientCachesBase.CacheableObject ):
try:

-    self._numpy_image = ClientImageHandling.GenerateNumPyImage( self._path, self._mime )
+    self._numpy_image = HydrusImageHandling.GenerateNumPyImage( self._path, self._mime )

except Exception as e:
@ -374,7 +374,7 @@ class ThumbnailCache( object ):
thumbnail_mime = HydrusFileHandling.GetThumbnailMime( thumbnail_path )

-numpy_image = ClientImageHandling.GenerateNumPyImage( thumbnail_path, thumbnail_mime )
+numpy_image = HydrusImageHandling.GenerateNumPyImage( thumbnail_path, thumbnail_mime )

except Exception as e:
@ -394,7 +394,7 @@ class ThumbnailCache( object ):
try:

-    numpy_image = ClientImageHandling.GenerateNumPyImage( thumbnail_path, thumbnail_mime )
+    numpy_image = HydrusImageHandling.GenerateNumPyImage( thumbnail_path, thumbnail_mime )

except Exception as e:
@ -645,7 +645,7 @@ class ThumbnailCache( object ):
for ( mime, thumbnail_path ) in HydrusFileHandling.mimes_to_default_thumbnail_paths.items():

-    numpy_image = ClientImageHandling.GenerateNumPyImage( thumbnail_path, HC.IMAGE_PNG )
+    numpy_image = HydrusImageHandling.GenerateNumPyImage( thumbnail_path, HC.IMAGE_PNG )

    numpy_image_resolution = HydrusImageHandling.GetResolutionNumPy( numpy_image )
@ -69,6 +69,7 @@ from hydrus.client.db import ClientDBTagSearch
from hydrus.client.db import ClientDBTagSiblings
from hydrus.client.db import ClientDBTagSuggestions
from hydrus.client.db import ClientDBURLMap
+from hydrus.client.duplicates import ClientDuplicates
from hydrus.client.importing import ClientImportFiles
from hydrus.client.interfaces import ClientControllerInterface
from hydrus.client.media import ClientMediaManagers
@ -1829,7 +1830,7 @@ class DB( HydrusDB.HydrusDB ):
with self._MakeTemporaryIntegerTable( [], 'hash_id' ) as temp_table_name_2:

-    if dupe_search_type == CC.DUPE_SEARCH_BOTH_FILES_MATCH_DIFFERENT_SEARCHES:
+    if dupe_search_type == ClientDuplicates.DUPE_SEARCH_BOTH_FILES_MATCH_DIFFERENT_SEARCHES:

        query_hash_ids_1 = set( self.modules_files_query.PopulateSearchIntoTempTable( file_search_context_1, temp_table_name_1 ) )
        query_hash_ids_2 = set( self.modules_files_query.PopulateSearchIntoTempTable( file_search_context_2, temp_table_name_2 ) )
@ -1850,7 +1851,7 @@ class DB( HydrusDB.HydrusDB ):
query_hash_ids = set( self.modules_files_query.PopulateSearchIntoTempTable( file_search_context_1, temp_table_name_1 ) )

-if dupe_search_type == CC.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH:
+if dupe_search_type == ClientDuplicates.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH:

    chosen_allowed_hash_ids = query_hash_ids
    comparison_allowed_hash_ids = query_hash_ids
@ -1967,7 +1968,7 @@ class DB( HydrusDB.HydrusDB ):
with self._MakeTemporaryIntegerTable( [], 'hash_id' ) as temp_table_name_2:

-    if dupe_search_type == CC.DUPE_SEARCH_BOTH_FILES_MATCH_DIFFERENT_SEARCHES:
+    if dupe_search_type == ClientDuplicates.DUPE_SEARCH_BOTH_FILES_MATCH_DIFFERENT_SEARCHES:

        query_hash_ids_1 = set( self.modules_files_query.PopulateSearchIntoTempTable( file_search_context_1, temp_table_name_1 ) )
        query_hash_ids_2 = set( self.modules_files_query.PopulateSearchIntoTempTable( file_search_context_2, temp_table_name_2 ) )
@ -1988,7 +1989,7 @@ class DB( HydrusDB.HydrusDB ):
query_hash_ids = set( self.modules_files_query.PopulateSearchIntoTempTable( file_search_context_1, temp_table_name_1 ) )

-if dupe_search_type == CC.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH:
+if dupe_search_type == ClientDuplicates.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH:

    # both chosen and comparison must be in the search, no king selection nonsense allowed
    chosen_allowed_hash_ids = query_hash_ids
@ -2155,7 +2156,7 @@ class DB( HydrusDB.HydrusDB ):
with self._MakeTemporaryIntegerTable( [], 'hash_id' ) as temp_table_name_2:

-    if dupe_search_type == CC.DUPE_SEARCH_BOTH_FILES_MATCH_DIFFERENT_SEARCHES:
+    if dupe_search_type == ClientDuplicates.DUPE_SEARCH_BOTH_FILES_MATCH_DIFFERENT_SEARCHES:

        self.modules_files_query.PopulateSearchIntoTempTable( file_search_context_1, temp_table_name_1 )
        self.modules_files_query.PopulateSearchIntoTempTable( file_search_context_2, temp_table_name_2 )
@ -2172,7 +2173,7 @@ class DB( HydrusDB.HydrusDB ):
self.modules_files_query.PopulateSearchIntoTempTable( file_search_context_1, temp_table_name_1 )

-if dupe_search_type == CC.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH:
+if dupe_search_type == ClientDuplicates.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH:

    table_join = self.modules_files_duplicates.GetPotentialDuplicatePairsTableJoinOnSearchResultsBothFiles( temp_table_name_1, pixel_dupes_preference, max_hamming_distance )
@ -2791,7 +2792,7 @@ class DB( HydrusDB.HydrusDB ):
return boned_stats

# TODO: fix this, it takes ages sometimes IRL
-table_join = self.modules_files_duplicates.GetPotentialDuplicatePairsTableJoinOnSearchResults( db_location_context, current_files_table_name, CC.SIMILAR_FILES_PIXEL_DUPES_ALLOWED, max_hamming_distance = 8 )
+table_join = self.modules_files_duplicates.GetPotentialDuplicatePairsTableJoinOnSearchResults( db_location_context, current_files_table_name, ClientDuplicates.SIMILAR_FILES_PIXEL_DUPES_ALLOWED, max_hamming_distance = 8 )

( total_potential_pairs, ) = self._Execute( f'SELECT COUNT( * ) FROM ( SELECT DISTINCT smaller_media_id, larger_media_id FROM {table_join} );' ).fetchone()
@ -6807,6 +6808,7 @@ class DB( HydrusDB.HydrusDB ):
elif action == 'tag_display_application': result = self.modules_tag_display.GetApplication( *args, **kwargs )
elif action == 'tag_display_maintenance_status': result = self._CacheTagDisplayGetApplicationStatusNumbers( *args, **kwargs )
elif action == 'tag_parents': result = self.modules_tag_parents.GetTagParents( *args, **kwargs )
+elif action == 'tag_predicates': result = self.modules_tag_search.GetTagPredicates( *args, **kwargs )
elif action == 'tag_siblings': result = self.modules_tag_siblings.GetTagSiblings( *args, **kwargs )
elif action == 'tag_siblings_all_ideals': result = self.modules_tag_siblings.GetTagSiblingsIdeals( *args, **kwargs )
elif action == 'tag_display_decorators': result = self.modules_tag_display.GetUIDecorators( *args, **kwargs )
@ -10321,6 +10323,55 @@ class DB( HydrusDB.HydrusDB ):
if version == 574:

    try:

        domain_manager = self.modules_serialisable.GetJSONDump( HydrusSerialisable.SERIALISABLE_TYPE_NETWORK_DOMAIN_MANAGER )

        domain_manager.Initialise()

        domain_manager.OverwriteDefaultParsers( [
            'danbooru file page parser - get webm ugoira',
            'danbooru file page parser'
        ] )

        parsers = domain_manager.GetParsers()

        parser_names = { parser.GetName() for parser in parsers }

        # checking for floog's downloader
        if 'fxtwitter api status parser' not in parser_names and 'vxtwitter api status parser' not in parser_names:

            domain_manager.OverwriteDefaultURLClasses( [
                'vxtwitter tweet',
                'vxtwitter api status',
                'vxtwitter api status (with username)',
                'fixvx tweet',
                'fixupx tweet',
                'fxtwitter tweet',
                'x post'
            ] )

        #

        domain_manager.TryToLinkURLClassesAndParsers()

        #

        self.modules_serialisable.SetJSONDump( domain_manager )

    except Exception as e:

        HydrusData.PrintException( e )

        message = 'Trying to update some downloaders failed! Please let hydrus dev know!'

        self.pub_initial_message( message )

self._controller.frame_splash_status.SetTitleText( 'updated db to v{}'.format( HydrusData.ToHumanInt( version + 1 ) ) )

self._Execute( 'UPDATE version SET version = ?;', ( version + 1, ) )
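The update block above follows the client's per-version migration pattern: each block guards on the current stored version, does its work inside a try/except that only warns on failure, and then the version row is bumped by one. A minimal sketch of that pattern over an in-memory sqlite database (the schema and version numbers here mirror the hunk but the migration body is a hypothetical stand-in):

```python
import sqlite3

# One migration block per version step; the version row is bumped after each
# step so a client several versions behind replays every step in order.

def update_db( db, target_version ):
    ( version, ) = db.execute( 'SELECT version FROM version;' ).fetchone()
    while version < target_version:
        if version == 574:
            try:
                pass  # real code overwrites default parsers/URL classes here
            except Exception:
                print( 'Trying to update some downloaders failed!' )
        version += 1
        db.execute( 'UPDATE version SET version = ?;', ( version, ) )
    db.commit()

db = sqlite3.connect( ':memory:' )
db.execute( 'CREATE TABLE version ( version INTEGER );' )
db.execute( 'INSERT INTO version VALUES ( 573 );' )
update_db( db, 575 )
print( db.execute( 'SELECT version FROM version;' ).fetchone()[ 0 ] )  # 575
```

Because a failed step still bumps the version, a broken migration degrades to a warning rather than blocking boot, which matches the hedged error handling in the hunk.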
@ -12,6 +12,7 @@ from hydrus.client.db import ClientDBDefinitionsCache
from hydrus.client.db import ClientDBFilesStorage
from hydrus.client.db import ClientDBModule
from hydrus.client.db import ClientDBSimilarFiles
+from hydrus.client.duplicates import ClientDuplicates

class ClientDBFilesDuplicates( ClientDBModule.ClientDBModule ):
@ -1013,16 +1014,16 @@ class ClientDBFilesDuplicates( ClientDBModule.ClientDBModule ):
join_predicates = [ 'smaller_media_id = duplicate_files_smaller.media_id AND larger_media_id = duplicate_files_larger.media_id' ]

-if pixel_dupes_preference != CC.SIMILAR_FILES_PIXEL_DUPES_REQUIRED:
+if pixel_dupes_preference != ClientDuplicates.SIMILAR_FILES_PIXEL_DUPES_REQUIRED:

    join_predicates.append( 'distance <= {}'.format( max_hamming_distance ) )

-if pixel_dupes_preference in ( CC.SIMILAR_FILES_PIXEL_DUPES_REQUIRED, CC.SIMILAR_FILES_PIXEL_DUPES_EXCLUDED ):
+if pixel_dupes_preference in ( ClientDuplicates.SIMILAR_FILES_PIXEL_DUPES_REQUIRED, ClientDuplicates.SIMILAR_FILES_PIXEL_DUPES_EXCLUDED ):

    join_predicate_pixel_dupes = 'duplicate_files_smaller.king_hash_id = pixel_hash_map_smaller.hash_id AND duplicate_files_larger.king_hash_id = pixel_hash_map_larger.hash_id AND pixel_hash_map_smaller.pixel_hash_id = pixel_hash_map_larger.pixel_hash_id'

-    if pixel_dupes_preference == CC.SIMILAR_FILES_PIXEL_DUPES_REQUIRED:
+    if pixel_dupes_preference == ClientDuplicates.SIMILAR_FILES_PIXEL_DUPES_REQUIRED:

        tables.extend( [
            'pixel_hash_map AS pixel_hash_map_smaller',
@ -1031,7 +1032,7 @@ class ClientDBFilesDuplicates( ClientDBModule.ClientDBModule ):
join_predicates.append( join_predicate_pixel_dupes )

-elif pixel_dupes_preference == CC.SIMILAR_FILES_PIXEL_DUPES_EXCLUDED:
+elif pixel_dupes_preference == ClientDuplicates.SIMILAR_FILES_PIXEL_DUPES_EXCLUDED:

    # can't do "AND NOT {}", or the join will just give you the million rows where it isn't true. we want 'AND NEVER {}', and quick
@ -693,6 +693,9 @@ class ClientDBMappingsCacheSpecificDisplay( ClientDBModule.ClientDBModule ):
def RescindPendingMappings( self, file_service_id, tag_service_id, storage_tag_id, hash_ids ):

+    # other things imply this tag on display, so we need to check storage to see what else has it
+    statuses_to_table_names = self.modules_mappings_storage.GetFastestStorageMappingTableNames( file_service_id, tag_service_id )

    ( cache_display_current_mappings_table_name, cache_display_pending_mappings_table_name ) = ClientDBMappingsStorage.GenerateSpecificDisplayMappingsCacheTableNames( file_service_id, tag_service_id )

    implies_tag_ids = self.modules_tag_display.GetImplies( ClientTags.TAG_DISPLAY_DISPLAY_ACTUAL, tag_service_id, storage_tag_id )
@ -718,9 +721,6 @@ class ClientDBMappingsCacheSpecificDisplay( ClientDBModule.ClientDBModule ):
else:

-    # other things imply this tag on display, so we need to check storage to see what else has it
-    statuses_to_table_names = self.modules_mappings_storage.GetFastestStorageMappingTableNames( file_service_id, tag_service_id )

    mappings_table_name = statuses_to_table_names[ HC.CONTENT_STATUS_PENDING ]

    with self._MakeTemporaryIntegerTable( other_implied_by_tag_ids, 'tag_id' ) as temp_table_name:
@ -263,16 +263,25 @@ class ClientDBMappingsCacheSpecificStorage( ClientDBModule.ClientDBModule ):
def AddMappings( self, tag_service_id, tag_id, hash_ids, filtered_hashes_generator: FilteredHashesGenerator ):

+    is_local = self.modules_services.GetServiceType( tag_service_id ) == HC.LOCAL_TAG

    for ( file_service_id, filtered_hash_ids ) in filtered_hashes_generator.IterateHashes( hash_ids ):

        ( cache_current_mappings_table_name, cache_deleted_mappings_table_name, cache_pending_mappings_table_name ) = ClientDBMappingsStorage.GenerateSpecificMappingsCacheTableNames( file_service_id, tag_service_id )

-        # we have to interleave this into the iterator so that if two siblings with the same ideal are pend->currented at once, we remain logic consistent for soletag lookups!
-        self.modules_mappings_cache_specific_display.RescindPendingMappings( file_service_id, tag_service_id, tag_id, filtered_hash_ids )
-
-        self._ExecuteMany( 'DELETE FROM ' + cache_pending_mappings_table_name + ' WHERE hash_id = ? AND tag_id = ?;', ( ( hash_id, tag_id ) for hash_id in filtered_hash_ids ) )
-
-        num_pending_rescinded = self._GetRowCount()
+        if is_local:
+
+            num_pending_rescinded = 0
+
+        else:
+
+            # we have to interleave this into the iterator so that if two siblings with the same ideal are pend->currented at once, we remain logic consistent for soletag lookups!
+            self.modules_mappings_cache_specific_display.RescindPendingMappings( file_service_id, tag_service_id, tag_id, filtered_hash_ids )
+
+            self._ExecuteMany( 'DELETE FROM ' + cache_pending_mappings_table_name + ' WHERE hash_id = ? AND tag_id = ?;', ( ( hash_id, tag_id ) for hash_id in filtered_hash_ids ) )
+
+            num_pending_rescinded = self._GetRowCount()

        #
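The hunk above adds an `is_local` fast path: a local tag service never holds 'pending' mappings, so the rescind-and-delete work can be skipped for it. A toy sketch of the same guard, with plain strings standing in for hydrus's `HC` service-type constants and a list standing in for the pending-mappings table:

```python
# Hypothetical stand-ins for HC.LOCAL_TAG etc.; the point is only the branch:
# local services short-circuit to zero rescinds, remote services do the work.

LOCAL_TAG = 'local tag'
TAG_REPOSITORY = 'tag repository'

def add_mapping( service_type, pending_rows, row ):
    if service_type == LOCAL_TAG:
        num_pending_rescinded = 0
    else:
        # a remote service may already have this row pending; rescind it first
        num_pending_rescinded = pending_rows.count( row )
        pending_rows[:] = [ r for r in pending_rows if r != row ]
    return num_pending_rescinded

pending = [ ( 1, 'tag' ), ( 2, 'tag' ) ]

assert add_mapping( TAG_REPOSITORY, pending, ( 1, 'tag' ) ) == 1
assert add_mapping( LOCAL_TAG, pending, ( 2, 'tag' ) ) == 0
assert pending == [ ( 2, 'tag' ) ]
```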
@ -595,8 +604,6 @@ class ClientDBMappingsCacheSpecificStorage( ClientDBModule.ClientDBModule ):
( cache_current_mappings_table_name, cache_deleted_mappings_table_name, cache_pending_mappings_table_name ) = ClientDBMappingsStorage.GenerateSpecificMappingsCacheTableNames( file_service_id, tag_service_id )

ac_counts = collections.Counter()

self.modules_mappings_cache_specific_display.RescindPendingMappings( file_service_id, tag_service_id, tag_id, filtered_hash_ids )

self._ExecuteMany( 'DELETE FROM ' + cache_pending_mappings_table_name + ' WHERE hash_id = ? AND tag_id = ?;', ( ( hash_id, tag_id ) for hash_id in filtered_hash_ids ) )
@ -1295,6 +1295,84 @@ class ClientDBTagSearch( ClientDBModule.ClientDBModule ):
return final_result_tag_ids


def GetTagIdPredicates(
    self,
    tag_display_type: int,
    file_search_context: ClientSearch.FileSearchContext,
    tag_ids: typing.Collection[ int ],
    inclusive = True,
    zero_count_ok = False,
    job_status = None
):

    all_predicates = []

    tag_context = file_search_context.GetTagContext()

    display_tag_service_id = self.modules_services.GetServiceId( tag_context.display_service_key )

    include_current = tag_context.include_current_tags
    include_pending = tag_context.include_pending_tags

    file_search_context_branch = self.modules_services.GetFileSearchContextBranch( file_search_context )

    for leaf in file_search_context_branch.IterateLeaves():

        domain_is_cross_referenced = leaf.file_service_id != self.modules_services.combined_deleted_file_service_id

        for group_of_tag_ids in HydrusData.SplitIteratorIntoChunks( tag_ids, 1000 ):

            if job_status is not None and job_status.IsCancelled():

                return []

            ids_to_count = self.modules_mappings_counts.GetCounts( tag_display_type, leaf.tag_service_id, leaf.file_service_id, group_of_tag_ids, include_current, include_pending, domain_is_cross_referenced = domain_is_cross_referenced, zero_count_ok = zero_count_ok, job_status = job_status )

            if len( ids_to_count ) == 0:

                continue

            #

            predicates = self.modules_tag_display.GeneratePredicatesFromTagIdsAndCounts( tag_display_type, display_tag_service_id, ids_to_count, inclusive, job_status = job_status )

            all_predicates.extend( predicates )

    if job_status is not None and job_status.IsCancelled():

        return []

    predicates = ClientSearch.MergePredicates( all_predicates )

    return predicates


def GetTagPredicates(
    self,
    tag_display_type: int,
    file_search_context: ClientSearch.FileSearchContext,
    tags: typing.Collection[ str ],
    inclusive = True,
    zero_count_ok = False,
    job_status = None
):

    tag_ids = set( self.modules_tags.GetTagIdsToTags( tags = tags ).keys() )

    return self.GetTagIdPredicates(
        tag_display_type,
        file_search_context,
        tag_ids,
        inclusive = inclusive,
        zero_count_ok = zero_count_ok,
        job_status = job_status )


def GetTagsTableName( self, file_service_id, tag_service_id ):

    if file_service_id == self.modules_services.combined_file_service_id:
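`GetTagIdPredicates` above processes tag ids in chunks of 1000 and checks for cancellation before each chunk, so a long count can be abandoned cheaply. A self-contained sketch of that chunk-and-check loop (`split_into_chunks` is a hypothetical stand-in for hydrus's `HydrusData.SplitIteratorIntoChunks`):

```python
# Process ids in fixed-size chunks; bail out with an empty result as soon as
# the job reports cancellation, rather than finishing the whole pass.

def split_into_chunks( items, chunk_size ):
    items = list( items )
    for i in range( 0, len( items ), chunk_size ):
        yield items[ i : i + chunk_size ]

def gather_in_chunks( tag_ids, is_cancelled ):
    results = []
    for chunk in split_into_chunks( tag_ids, 1000 ):
        if is_cancelled():
            return []
        results.extend( chunk )  # real code counts mappings per chunk here
    return results

assert list( split_into_chunks( range( 5 ), 2 ) ) == [ [ 0, 1 ], [ 2, 3 ], [ 4 ] ]
assert gather_in_chunks( range( 2500 ), lambda: False ) == list( range( 2500 ) )
assert gather_in_chunks( range( 2500 ), lambda: True ) == []
```

Chunking keeps each SQL `IN` clause bounded while the per-chunk cancellation check keeps the UI responsive; the same structure appears throughout the hydrus db modules.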
@ -0,0 +1,199 @@
import random
import threading

from hydrus.core import HydrusSerialisable

from hydrus.client import ClientConstants as CC
from hydrus.client.duplicates import ClientDuplicates

DUPLICATE_STATUS_DOES_NOT_MATCH_SEARCH = 0
DUPLICATE_STATUS_MATCHES_SEARCH_BUT_NOT_TESTED = 1
DUPLICATE_STATUS_MATCHES_SEARCH_FAILED_TEST = 2
DUPLICATE_STATUS_MATCHES_SEARCH_PASSED_TEST = 3 # presumably this will not be needed much since we'll delete the duplicate pair soon after, but we may as well be careful

class PairComparatorRule( HydrusSerialisable.SerialisableBase ):

    def Test( self, media_result_better, media_result_worse ):

        raise NotImplementedError()


LOOKING_AT_BETTER_CANDIDATE = 0
LOOKING_AT_WORSE_CANDIDATE = 1

class PairComparatorRuleOneFile( PairComparatorRule ):

    SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_AUTO_DUPLICATES_PAIR_COMPARATOR_RULE_ONE_FILE
    SERIALISABLE_NAME = 'Auto-Duplicates Pair Comparator Rule - One File'
    SERIALISABLE_VERSION = 1

    def __init__( self ):

        PairComparatorRule.__init__( self )

        self._looking_at = LOOKING_AT_BETTER_CANDIDATE

        # ok bro time to get metadata conditional working. first draft will be filetype test for jpeg/png. no need for UI yet
        self._metadata_conditional = None
        # what are we testing?
        # this would be a great place to insert MetadataConditional
        # mime is jpeg
        # has icc profile
        # maybe stuff like filesize > 200KB

    # serialisable gubbins
    # get/set

    def Test( self, media_result_better, media_result_worse ):

        if self._looking_at == LOOKING_AT_BETTER_CANDIDATE:

            return self._metadata_conditional.Test( media_result_better )

        else:

            return self._metadata_conditional.Test( media_result_worse )


HydrusSerialisable.SERIALISABLE_TYPES_TO_OBJECT_TYPES[ HydrusSerialisable.SERIALISABLE_AUTO_DUPLICATES_PAIR_COMPARATOR_RULE_ONE_FILE ] = PairComparatorRuleOneFile

class PairComparatorRuleTwoFiles( PairComparatorRule ):

    SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_AUTO_DUPLICATES_PAIR_COMPARATOR_RULE_TWO_FILES
    SERIALISABLE_NAME = 'Auto-Duplicates Pair Comparator Rule - Two Files'
    SERIALISABLE_VERSION = 1

    def __init__( self ):

        PairComparatorRule.__init__( self )

        # if I am feeling big brain, isn't this just a dynamic one-file metadata conditional?
        # if we want 4x size, then we just pull the size of A and ask if B is <0.25x that or whatever. we don't need a clever two-file MetadataConditional test
        # so, this guy should yeah just store two or three simple enums to handle type, operator, and quantity

        # property
        # width
        # filesize
        # age
        # etc..
        # operator
        # is more than 4x larger
        # is at least x absolute value larger?

    # serialisable gubbins
    # get/set

    def Test( self, media_result_better, media_result_worse ):

        pass


HydrusSerialisable.SERIALISABLE_TYPES_TO_OBJECT_TYPES[ HydrusSerialisable.SERIALISABLE_AUTO_DUPLICATES_PAIR_COMPARATOR_RULE_TWO_FILES ] = PairComparatorRuleTwoFiles
class PairSelectorAndComparator( HydrusSerialisable.SerialisableBase ):

    SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_AUTO_DUPLICATES_PAIR_SELECTOR_AND_COMPARATOR
    SERIALISABLE_NAME = 'Auto-Duplicates Pair Selector and Comparator'
    SERIALISABLE_VERSION = 1

    def __init__( self ):

        HydrusSerialisable.SerialisableBase.__init__( self )

        self._rules = HydrusSerialisable.SerialisableList()

    # serialisable gubbins
    # get/set

    def GetMatchingMedia( self, media_result_1, media_result_2 ):

        pair = [ media_result_1, media_result_2 ]

        # just in case both match
        random.shuffle( pair )

        ( media_result_1, media_result_2 ) = pair

        if False not in ( rule.Test( media_result_1, media_result_2 ) for rule in self._rules ):

            return media_result_1

        elif False not in ( rule.Test( media_result_2, media_result_1 ) for rule in self._rules ):

            return media_result_2

        else:

            return None


HydrusSerialisable.SERIALISABLE_TYPES_TO_OBJECT_TYPES[ HydrusSerialisable.SERIALISABLE_AUTO_DUPLICATES_PAIR_SELECTOR_AND_COMPARATOR ] = PairSelectorAndComparator
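`GetMatchingMedia` above tries the pair both ways round, requiring every rule to pass with one particular file in the 'better' slot, and shuffles first so neither input is favoured when both orderings would match. A runnable sketch with a hypothetical size rule in place of the serialisable comparator rules:

```python
import random

# Every rule must agree on which file is 'better'; if neither ordering
# satisfies all rules, there is no automatic decision and we return None.

class LargerFileRule:
    def test( self, better, worse ):
        return better[ 'size' ] > worse[ 'size' ]

def get_matching_media( rules, a, b ):
    pair = [ a, b ]
    random.shuffle( pair )  # just in case both orderings match
    ( a, b ) = pair
    if all( rule.test( a, b ) for rule in rules ):
        return a
    elif all( rule.test( b, a ) for rule in rules ):
        return b
    return None

big = { 'size' : 500 }
small = { 'size' : 100 }

assert get_matching_media( [ LargerFileRule() ], big, small ) is big
assert get_matching_media( [ LargerFileRule() ], small, small ) is None
```

Returning `None` rather than guessing is the conservative default for an automatic duplicate resolver: only unambiguous pairs get actioned.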
class AutoDuplicatesRule( HydrusSerialisable.SerialisableBaseNamed ):

    SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_AUTO_DUPLICATES_RULE
    SERIALISABLE_NAME = 'Auto-Duplicates Rule'
    SERIALISABLE_VERSION = 1

    def __init__( self, name ):

        HydrusSerialisable.SerialisableBaseNamed.__init__( self, name )

        self._id = -1

        # maybe make this search part into its own object? in ClientDuplicates
        # could wangle duplicate pages and client api dupe stuff to work in the same guy, great idea
        self._file_search_context_1 = None
        self._file_search_context_2 = None
        self._dupe_search_type = ClientDuplicates.DUPE_SEARCH_ONE_FILE_MATCHES_ONE_SEARCH
        self._pixel_dupes_preference = ClientDuplicates.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
        self._max_hamming_distance = 4

        self._selector_and_comparator = None

        # action info
        # set as better
        # delete the other one
        # optional custom merge options

    # serialisable gubbins
    # get/set
    # 'here's a pair of media results, pass/fail?'


HydrusSerialisable.SERIALISABLE_TYPES_TO_OBJECT_TYPES[ HydrusSerialisable.SERIALISABLE_AUTO_DUPLICATES_RULE ] = AutoDuplicatesRule

class AutoDuplicatesManager( object ):

    my_instance = None

    def __init__( self ):

        AutoDuplicatesManager.my_instance = self

        # my rules, start with empty and then load from db or whatever on controller init

        self._lock = threading.Lock()

    @staticmethod
    def instance() -> 'AutoDuplicatesManager':

        if AutoDuplicatesManager.my_instance is None:

            AutoDuplicatesManager()

        return AutoDuplicatesManager.my_instance
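`AutoDuplicatesManager` uses the class-attribute singleton pattern seen elsewhere in hydrus: the constructor caches itself on the class, and `instance()` lazily constructs on first use. A stripped-down sketch of just that wiring (class name hypothetical):

```python
import threading

# First instance() call constructs the object and caches it on the class;
# every later call returns the same object.

class Manager:

    my_instance = None

    def __init__( self ):
        Manager.my_instance = self
        self._lock = threading.Lock()

    @staticmethod
    def instance() -> 'Manager':
        if Manager.my_instance is None:
            Manager()
        return Manager.my_instance

assert Manager.instance() is Manager.instance()
```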
@ -23,6 +23,24 @@ from hydrus.client.media import ClientMediaFileFilter
from hydrus.client.metadata import ClientContentUpdates
from hydrus.client.metadata import ClientTags

DUPE_SEARCH_ONE_FILE_MATCHES_ONE_SEARCH = 0
DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH = 1
DUPE_SEARCH_BOTH_FILES_MATCH_DIFFERENT_SEARCHES = 2

SIMILAR_FILES_PIXEL_DUPES_REQUIRED = 0
SIMILAR_FILES_PIXEL_DUPES_ALLOWED = 1
SIMILAR_FILES_PIXEL_DUPES_EXCLUDED = 2

similar_files_pixel_dupes_string_lookup = {
    SIMILAR_FILES_PIXEL_DUPES_REQUIRED : 'must be pixel dupes',
    SIMILAR_FILES_PIXEL_DUPES_ALLOWED : 'can be pixel dupes',
    SIMILAR_FILES_PIXEL_DUPES_EXCLUDED : 'must not be pixel dupes'
}

SYNC_ARCHIVE_NONE = 0
SYNC_ARCHIVE_IF_ONE_DO_BOTH = 1
SYNC_ARCHIVE_DO_BOTH_REGARDLESS = 2

hashes_to_jpeg_quality = {}

def GetDuplicateComparisonScore( shown_media, comparison_media ):
@ -423,40 +441,25 @@ def GetDuplicateComparisonStatements( shown_media, comparison_media ):
global hashes_to_jpeg_quality

-if s_hash not in hashes_to_jpeg_quality:
-
-    path = CG.client_controller.client_files_manager.GetFilePath( s_hash, s_mime )
-
-    try:
-
-        raw_pil_image = HydrusImageOpening.RawOpenPILImage( path )
-
-        result = HydrusImageMetadata.GetJPEGQuantizationQualityEstimate( raw_pil_image )
-
-    except:
-
-        result = ( 'unknown', None )
-
-    hashes_to_jpeg_quality[ s_hash ] = result
-
-if c_hash not in hashes_to_jpeg_quality:
-
-    path = CG.client_controller.client_files_manager.GetFilePath( c_hash, c_mime )
-
-    try:
-
-        raw_pil_image = HydrusImageOpening.RawOpenPILImage( path )
-
-        result = HydrusImageMetadata.GetJPEGQuantizationQualityEstimate( raw_pil_image )
-
-    except:
-
-        result = ( 'unknown', None )
-
-    hashes_to_jpeg_quality[ c_hash ] = result
+for jpeg_hash in ( s_hash, c_hash ):
+
+    if jpeg_hash not in hashes_to_jpeg_quality:
+
+        path = CG.client_controller.client_files_manager.GetFilePath( jpeg_hash, HC.IMAGE_JPEG )
+
+        try:
+
+            raw_pil_image = HydrusImageOpening.RawOpenPILImage( path )
+
+            result = HydrusImageMetadata.GetJPEGQuantizationQualityEstimate( raw_pil_image )
+
+        except:
+
+            result = ( 'unknown', None )
+
+        hashes_to_jpeg_quality[ jpeg_hash ] = result

( s_label, s_jpeg_quality ) = hashes_to_jpeg_quality[ s_hash ]
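The hunk above folds two near-identical cache-fill blocks into one loop over both hashes, backed by a module-level cache. A minimal sketch of that lazy memoisation shape (`estimate_quality` is a hypothetical stand-in for the PIL quantization-table estimator):

```python
# One module-level cache keyed by file hash; each hash is estimated at most
# once, and failures are cached too, as ( 'unknown', None ).

hashes_to_jpeg_quality = {}

def estimate_quality( file_hash ):
    try:
        # real code opens the file and inspects its JPEG quantization tables
        return ( 'high', 92 )
    except Exception:
        return ( 'unknown', None )

def get_qualities( s_hash, c_hash ):
    for jpeg_hash in ( s_hash, c_hash ):
        if jpeg_hash not in hashes_to_jpeg_quality:
            hashes_to_jpeg_quality[ jpeg_hash ] = estimate_quality( jpeg_hash )
    return ( hashes_to_jpeg_quality[ s_hash ], hashes_to_jpeg_quality[ c_hash ] )

assert get_qualities( 'abc', 'def' ) == ( ( 'high', 92 ), ( 'high', 92 ) )
assert 'abc' in hashes_to_jpeg_quality
```

Caching the failure sentinel alongside successes means a broken file is only opened once per session, the same trade-off the real code makes.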
@ -785,10 +788,6 @@ class DuplicatesManager( object ):
-SYNC_ARCHIVE_NONE = 0
-SYNC_ARCHIVE_IF_ONE_DO_BOTH = 1
-SYNC_ARCHIVE_DO_BOTH_REGARDLESS = 2

def get_updated_domain_modified_timestamp_datas( destination_media: ClientMedia.MediaSingleton, source_media: ClientMedia.MediaSingleton, urls: typing.Collection[ str ] ):

    from hydrus.client.networking import ClientNetworkingFunctions
@ -3459,6 +3459,8 @@ class FrameGUI( CAC.ApplicationCommandProcessorMixin, ClientGUITopLevelWindows.M
ClientGUIMenus.AppendMenuItem( gui_actions, 'make some popups', 'Throw some varied popups at the message manager, just to check it is working.', self._DebugMakeSomePopups )
ClientGUIMenus.AppendMenuItem( gui_actions, 'publish some sub files in five seconds', 'Publish some files like a subscription would.', self._controller.CallLater, 5, lambda: CG.client_controller.pub( 'imported_files_to_page', [ HydrusData.GenerateKey() for i in range( 5 ) ], 'example sub files' ) )
ClientGUIMenus.AppendMenuItem( gui_actions, 'refresh pages menu in five seconds', 'Delayed refresh the pages menu, giving you time to minimise or otherwise alter the client before it arrives.', self._controller.CallLater, 5, self._menu_updater_pages.update )
+ClientGUIMenus.AppendMenuItem( gui_actions, 'reload current gui session', 'Save and reload the current GUI session.', self._ReloadCurrentGUISession )
+ClientGUIMenus.AppendMenuItem( gui_actions, 'reload current stylesheet', 'Reload the current QSS stylesheet.', ClientGUIStyle.ReloadStyleSheet )
ClientGUIMenus.AppendMenuItem( gui_actions, 'reset multi-column list settings to default', 'Reset all multi-column list widths and other display settings to default.', self._DebugResetColumnListManager )
ClientGUIMenus.AppendMenuItem( gui_actions, 'save \'last session\' gui session', 'Make an immediate save of the \'last session\' gui session. Mostly for testing crashes, where last session is not saved correctly.', self.ProposeSaveGUISession, CC.LAST_SESSION_SESSION_NAME )
@ -5437,6 +5439,42 @@ class FrameGUI( CAC.ApplicationCommandProcessorMixin, ClientGUITopLevelWindows.M
def _ReloadCurrentGUISession( self ):

    name = 'temp_session_slot_for_reload_if_you_see_this_you_can_delete_it'
    only_changed_page_data = True
    about_to_save = True

    session = self._notebook.GetCurrentGUISession( name, only_changed_page_data, about_to_save )

    self._FleshOutSessionWithCleanDataIfNeeded( self._notebook, name, session )

    def qt_load():

        while self._notebook.count() > 0:

            self._notebook.CloseCurrentPage( polite = False )

        self._notebook.LoadGUISession( name )

        self._controller.Write( 'delete_serialisable_named', HydrusSerialisable.SERIALISABLE_TYPE_GUI_SESSION_CONTAINER, name )

        self._controller.pub( 'notify_new_sessions' )

    def do_save():

        CG.client_controller.SaveGUISession( session )

        CG.client_controller.CallBlockingToQt( self, qt_load )

    self._controller.CallToThread( do_save )


def _RepairInvalidTags( self ):

    message = 'This will scan all your tags and repair any that are invalid. This might mean taking out unrenderable characters or cleaning up improper whitespace. If there is a tag collision once cleaned, it may add a (1)-style number on the end.'
@@ -53,6 +53,8 @@ class GUICore( QC.QObject ):
     def PopupMenu( self, widget: QW.QWidget, menu: QW.QMenu ):
         
+        ClientGUIMenus.RemoveFinalSeparator( menu )
+        
         if HC.PLATFORM_MACOS and widget.window().isModal():
             
             # Ok, seems like Big Sur can't do menus at the moment lmao. it shows the menu but the mouse can't interact with it
@@ -17,6 +17,8 @@ from hydrus.client.gui import ClientGUIFunctions
 def AppendMenu( menu, submenu, label ):
     
+    RemoveFinalSeparator( submenu )
+    
     label = SanitiseLabel( label )
     
     submenu.setTitle( label )
@@ -234,6 +236,26 @@ def GetEventCallable( callable, *args, **kwargs ):
     return event_callable
     
 
+def RemoveFinalSeparator( menu: QW.QMenu ):
+    
+    num_items = len( menu.actions() )
+    
+    if num_items > 0:
+        
+        last_item = menu.actions()[-1]
+        
+        # got this once, who knows what happened, so we test for QAction now
+        # 'PySide2.QtGui.QStandardItem' object has no attribute 'isSeparator'
+        if isinstance( last_item, QW.QAction ):
+            
+            if last_item.isSeparator():
+                
+                menu.removeAction( last_item )
+                
+            
+        
+    
+
 def SanitiseLabel( label: str ) -> str:
     
     if label == '':
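The trimming logic above is easy to get subtly wrong (only the very last entry should ever be removed, and only if it is a separator). A pure-Python sketch of the same idea, using a stand-in action type instead of Qt; the `Action` class and function names here are illustrative, not hydrus API:

```python
class Action:
    """Stand-in for QAction: a menu entry that may be a separator."""

    def __init__(self, is_separator=False):
        self._is_separator = is_separator

    def isSeparator(self):
        return self._is_separator


def remove_final_separator(actions):
    """Mirror RemoveFinalSeparator: drop the last entry only if it is a separator."""
    if len(actions) > 0 and actions[-1].isSeparator():
        actions.pop()
```

Calling it twice is safe: the second call finds a non-separator (or an empty list) at the end and does nothing.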
@@ -19,9 +19,9 @@ from hydrus.core import HydrusText
 from hydrus.client import ClientApplicationCommand as CAC
 from hydrus.client import ClientConstants as CC
-from hydrus.client import ClientDuplicates
 from hydrus.client import ClientGlobals as CG
 from hydrus.client import ClientTime
+from hydrus.client.duplicates import ClientDuplicates
 from hydrus.client.gui import ClientGUIDialogs
 from hydrus.client.gui import ClientGUIDialogsMessage
 from hydrus.client.gui import ClientGUIDialogsQuick
@@ -4119,6 +4119,15 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
         #
         
+        children_panel = ClientGUICommon.StaticBox( self, 'children tags' )
+        
+        self._num_to_show_in_ac_dropdown_children_tab = ClientGUICommon.NoneableSpinCtrl( children_panel, none_phrase = 'show all', min = 1 )
+        tt = 'The "children" tab will show children of the current tag context (usually the list of tags above the autocomplete), ordered by file count. This can quickly get spammy, so I recommend you cull it to a reasonable size.'
+        self._num_to_show_in_ac_dropdown_children_tab.setToolTip( tt )
+        self._num_to_show_in_ac_dropdown_children_tab.SetValue( 20 ) # init default
+        
+        #
+        
         self._expand_parents_on_storage_taglists.setChecked( self._new_options.GetBoolean( 'expand_parents_on_storage_taglists' ) )
         self._expand_parents_on_storage_taglists.setToolTip( ClientGUIFunctions.WrapToolTip( 'This affects taglists in places like the manage tags dialog, where you edit tags as they actually are, and implied parents hang below tags.' ) )
@@ -4147,7 +4156,9 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
         #
         
-        vbox = QP.VBoxLayout()
+        self._num_to_show_in_ac_dropdown_children_tab.SetValue( self._new_options.GetNoneableInteger( 'num_to_show_in_ac_dropdown_children_tab' ) )
         
         #
         
         rows = []
@@ -4164,18 +4175,30 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
         general_panel.Add( gridbox, CC.FLAGS_EXPAND_SIZER_PERPENDICULAR )
         
-        QP.AddToLayout( vbox, general_panel, CC.FLAGS_EXPAND_PERPENDICULAR )
-        
         #
         
         favourites_panel.Add( favourites_st, CC.FLAGS_EXPAND_PERPENDICULAR )
         favourites_panel.Add( self._favourites, CC.FLAGS_EXPAND_BOTH_WAYS )
         favourites_panel.Add( self._favourites_input )
         
-        QP.AddToLayout( vbox, favourites_panel, CC.FLAGS_EXPAND_BOTH_WAYS )
-        
         #
         
+        rows = []
+        
+        rows.append( ( 'How many tags to show in the children tab: ', self._num_to_show_in_ac_dropdown_children_tab ) )
+        
+        gridbox = ClientGUICommon.WrapInGrid( children_panel, rows )
+        
+        children_panel.Add( gridbox, CC.FLAGS_EXPAND_SIZER_PERPENDICULAR )
+        
+        #
+        
+        vbox = QP.VBoxLayout()
+        
+        QP.AddToLayout( vbox, general_panel, CC.FLAGS_EXPAND_PERPENDICULAR )
+        QP.AddToLayout( vbox, favourites_panel, CC.FLAGS_EXPAND_BOTH_WAYS )
+        QP.AddToLayout( vbox, children_panel, CC.FLAGS_EXPAND_PERPENDICULAR )
+        
         self.setLayout( vbox )
         
         #
@@ -4212,6 +4235,10 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
         self._new_options.SetStringList( 'favourite_tags', list( self._favourites.GetTags() ) )
         
+        #
+        
+        self._new_options.SetNoneableInteger( 'num_to_show_in_ac_dropdown_children_tab', self._num_to_show_in_ac_dropdown_children_tab.GetValue() )
+        
     
 
 class _TagPresentationPanel( QW.QWidget ):
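The new option is a 'noneable' integer: `None` means 'show all', and any other value caps the count-sorted children list at the top n. A sketch of how such a cap behaves; the helper name is illustrative, not the hydrus widget:

```python
def cap_children(predicates_sorted_by_count, num_to_show):
    """None means show everything; otherwise keep only the top n entries."""
    if num_to_show is None:
        return list(predicates_sorted_by_count)
    return predicates_sorted_by_count[:num_to_show]
```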
@@ -15,6 +15,7 @@ ORIGINAL_STYLE_NAME = None
 CURRENT_STYLE_NAME = None
 ORIGINAL_STYLESHEET = None
 CURRENT_STYLESHEET = None
+CURRENT_STYLESHEET_FILENAME = None
 
 def ClearStylesheet():
@@ -88,6 +89,18 @@ def InitialiseDefaults():
     CURRENT_STYLESHEET = ORIGINAL_STYLESHEET
     
 
+def ReloadStyleSheet():
+    
+    ClearStylesheet()
+    
+    if CURRENT_STYLESHEET_FILENAME is not None:
+        
+        ClearStylesheet()
+        
+        SetStylesheetFromPath( CURRENT_STYLESHEET_FILENAME )
+        
+    
+
 def SetStyleFromName( name: str ):
     
     if QtInit.WE_ARE_QT5:
@@ -116,7 +129,6 @@ def SetStyleFromName( name: str ):
     except Exception as e:
         
         raise HydrusExceptions.DataMissing( 'Style "{}" could not be generated/applied. If this is the default, perhaps a third-party custom style, you may have to restart the client to re-set it. Extra error info: {}'.format( name, e ) )
         
-    
     
@@ -143,6 +155,10 @@ def SetStyleSheet( stylesheet, prepend_hydrus = True ):
 def SetStylesheetFromPath( filename ):
     
+    global CURRENT_STYLESHEET_FILENAME
+    
+    CURRENT_STYLESHEET_FILENAME = filename
+    
     path = os.path.join( STYLESHEET_DIR, filename )
     
     if not os.path.exists( path ):
@@ -1,4 +1,5 @@
 import collections
+import itertools
 import os
 import threading
 import time
@@ -39,6 +40,36 @@ from hydrus.client.importing import ClientImportSubscriptions
 from hydrus.client.importing import ClientImportSubscriptionQuery
 from hydrus.client.importing import ClientImportSubscriptionLegacy # keep this here so the serialisable stuff is registered, it has to be imported somewhere
 
+def DoAliveOrDeadCheck( win: QW.QWidget, query_headers: typing.Collection[ ClientImportSubscriptionQuery.SubscriptionQueryHeader ] ):
+    
+    do_alive = True
+    do_dead = True
+    
+    num_dead = sum( ( 1 for query_header in query_headers if query_header.IsDead() ) )
+    
+    if 0 < num_dead < len( query_headers ):
+        
+        message = f'Of the {HydrusData.ToHumanInt(len(query_headers))} selected queries, {HydrusData.ToHumanInt(num_dead)} are DEAD. Which queries do you want to check?'
+        
+        choice_tuples = [
+            ( f'all of them', ( True, True ), 'Resuscitate the DEAD queries and check everything.' ),
+            ( f'the {HydrusData.ToHumanInt(len(query_headers)-num_dead)} ALIVE', ( True, False ), 'Check the ALIVE queries.' ),
+            ( f'the {HydrusData.ToHumanInt(num_dead)} DEAD', ( False, True ), 'Resuscitate the DEAD queries and check them.' )
+        ]
+        
+        try:
+            
+            ( do_alive, do_dead ) = ClientGUIDialogsQuick.SelectFromListButtons( win, 'Check which?', choice_tuples, message = message )
+            
+        except HydrusExceptions.CancelledException:
+            
+            raise
+            
+        
+    
+    return ( do_alive, do_dead )
+
 def GetQueryHeadersQualityInfo( query_headers: typing.Iterable[ ClientImportSubscriptionQuery.SubscriptionQueryHeader ] ):
     
     data = []
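The interesting part of `DoAliveOrDeadCheck` is when it bothers the user at all: only when the selection is a genuine mix of ALIVE and DEAD queries; an all-alive or all-dead selection just checks everything. A pure-Python sketch of that decision, with the dialog replaced by a plain callback (the names here are illustrative, not hydrus API):

```python
def do_alive_or_dead_check(query_is_dead, ask_user):
    """query_is_dead: list of bools, one per selected query.
    ask_user(): stands in for the choice dialog, returns (do_alive, do_dead).
    Only asks when the selection mixes ALIVE and DEAD."""
    num_dead = sum(1 for is_dead in query_is_dead if is_dead)
    if 0 < num_dead < len(query_is_dead):
        return ask_user()
    # uniform selection: check everything without bothering the user
    return (True, True)
```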
@@ -403,14 +434,38 @@ class EditSubscriptionPanel( ClientGUIScrolledPanels.EditPanel ):
     def _CheckNow( self ):
         
-        selected_queries = self._query_headers.GetData( only_selected = True )
+        selected_query_headers = self._query_headers.GetData( only_selected = True )
         
-        for query_header in selected_queries:
+        try:
+            
+            ( do_alive, do_dead ) = DoAliveOrDeadCheck( self, selected_query_headers )
+            
+        except HydrusExceptions.CancelledException:
+            
+            return
+            
+        
+        for query_header in selected_query_headers:
             
+            if query_header.IsDead():
+                
+                if not do_dead:
+                    
+                    continue
+                    
+                
+            else:
+                
+                if not do_alive:
+                    
+                    continue
+                    
+                
+            
             query_header.CheckNow()
             
         
-        self._query_headers.UpdateDatas( selected_queries )
+        self._query_headers.UpdateDatas( selected_query_headers )
         
         self._query_headers.Sort()
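Given the (do_alive, do_dead) answer, the loop above just filters which query headers get `CheckNow`. The same filter as a pure function over (name, is_dead) pairs; this helper is illustrative, not the hydrus objects:

```python
def queries_to_check(queries, do_alive, do_dead):
    """queries: iterable of (name, is_dead) pairs.
    Returns the names that should be checked, preserving order."""
    return [
        name
        for (name, is_dead) in queries
        if (do_dead if is_dead else do_alive)  # dead queries need do_dead, alive need do_alive
    ]
```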
@@ -2060,9 +2115,49 @@ class EditSubscriptionsPanel( ClientGUIScrolledPanels.EditPanel ):
         subscriptions = self._subscriptions.GetData( only_selected = True )
         
+        query_headers = HydrusData.MassExtend( ( subscription.GetQueryHeaders() for subscription in subscriptions ) )
+        
+        try:
+            
+            ( do_alive, do_dead ) = DoAliveOrDeadCheck( self, query_headers )
+            
+        except HydrusExceptions.CancelledException:
+            
+            return
+            
+        
         for subscription in subscriptions:
             
-            subscription.CheckNow()
+            we_did_some = False
+            
+            query_headers = subscription.GetQueryHeaders()
+            
+            for query_header in query_headers:
+                
+                if query_header.IsDead():
+                    
+                    if not do_dead:
+                        
+                        continue
+                        
+                    
+                else:
+                    
+                    if not do_alive:
+                        
+                        continue
+                        
+                    
+                
+                query_header.CheckNow()
+                
+                we_did_some = True
+                
+            
+            if we_did_some:
+                
+                subscription.ScrubDelay()
+                
+            
         
         self._subscriptions.UpdateDatas( subscriptions )
@@ -14,9 +14,9 @@ from hydrus.core import HydrusTime
 from hydrus.client import ClientApplicationCommand as CAC
 from hydrus.client import ClientConstants as CC
 from hydrus.client import ClientData
-from hydrus.client import ClientDuplicates
 from hydrus.client import ClientGlobals as CG
 from hydrus.client import ClientLocation
+from hydrus.client.duplicates import ClientDuplicates
 from hydrus.client.gui import ClientGUICore as CGC
 from hydrus.client.gui import ClientGUIDialogs
 from hydrus.client.gui import ClientGUIDialogsManage
@@ -12,8 +12,8 @@ from hydrus.core import HydrusSerialisable
 from hydrus.client import ClientApplicationCommand as CAC
 from hydrus.client import ClientConstants as CC
 from hydrus.client import ClientData
-from hydrus.client import ClientDuplicates
 from hydrus.client import ClientGlobals as CG
+from hydrus.client.duplicates import ClientDuplicates
 from hydrus.client.gui import ClientGUIDragDrop
 from hydrus.client.gui import ClientGUICore as CGC
 from hydrus.client.gui import ClientGUIFunctions
@@ -312,6 +312,8 @@ class EditFileImportOptionsPanel( ClientGUIScrolledPanels.EditPanel ):
         QP.AddToLayout( vbox, self._load_default_options, CC.FLAGS_EXPAND_PERPENDICULAR )
         QP.AddToLayout( vbox, self._specific_options_panel, CC.FLAGS_EXPAND_BOTH_WAYS )
         
+        vbox.addStretch( 1 )
+        
         self.widget().setLayout( vbox )
         
         self._destination_location_context.locationChanged.connect( self._UpdateLocationText )
@@ -1989,6 +1989,8 @@ class ListBox( QW.QScrollArea ):
     def _SetVirtualSize( self ):
         
+        # this triggers an update of the scrollbars, maybe important if this is the first time the thing is shown, let's see if it helps our missing scrollbar issue
+        # I think this is needed here for PySide2 and a/c dropdowns, help
+        self.setWidgetResizable( True )
+        
         my_size = self.widget().size()
@@ -1997,10 +1999,18 @@ class ListBox( QW.QScrollArea ):
         ideal_virtual_size = QC.QSize( my_size.width(), text_height * self._total_positional_rows )
         
+        if HG.gui_report_mode:
+            
+            HydrusData.ShowText( f'Setting a virtual size on {self}. Num terms: {len( self._ordered_terms)}, Text height: {text_height}, Total Positional Rows: {self._total_positional_rows}, My Height: {my_size.height()}, Ideal Height: {ideal_virtual_size.height()}' )
+            
+        
         if ideal_virtual_size != my_size:
             
             self.widget().setMinimumSize( ideal_virtual_size )
-            
-            # this triggers an update of the scrollbars, maybe important if this is the first time the thing is shown, let's see if it helps our missing scrollbar issue
-            self.setWidgetResizable( True )
             
         
     
     def _Sort( self ):
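The height maths the new `gui_report_mode` line prints is simple: the ideal virtual height is one text row per positional row, and a resize is only needed when that disagrees with the current size. A sketch of the comparison being debugged, using plain tuples instead of QSize (helper names are illustrative):

```python
def ideal_virtual_size(width, text_height, total_positional_rows):
    """Mirror _SetVirtualSize: ideal height is one text row per positional row."""
    return (width, text_height * total_positional_rows)


def needs_resize(current_size, width, text_height, total_positional_rows):
    """True when the widget's minimum size should be bumped to the ideal size."""
    return ideal_virtual_size(width, text_height, total_positional_rows) != current_size
```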
@@ -9,6 +9,7 @@ from hydrus.core import HydrusTime
 from hydrus.client import ClientConstants as CC
 from hydrus.client import ClientGlobals as CG
 from hydrus.client import ClientLocation
+from hydrus.client.duplicates import ClientDuplicates
 from hydrus.client.importing import ClientImportGallery
 from hydrus.client.importing import ClientImportLocal
 from hydrus.client.importing import ClientImportSimpleURLs
@@ -74,8 +75,8 @@ def CreateManagementControllerDuplicateFilter(
     management_controller.SetVariable( 'file_search_context_1', file_search_context )
     management_controller.SetVariable( 'file_search_context_2', file_search_context.Duplicate() )
-    management_controller.SetVariable( 'dupe_search_type', CC.DUPE_SEARCH_ONE_FILE_MATCHES_ONE_SEARCH )
-    management_controller.SetVariable( 'pixel_dupes_preference', CC.SIMILAR_FILES_PIXEL_DUPES_ALLOWED )
+    management_controller.SetVariable( 'dupe_search_type', ClientDuplicates.DUPE_SEARCH_ONE_FILE_MATCHES_ONE_SEARCH )
+    management_controller.SetVariable( 'pixel_dupes_preference', ClientDuplicates.SIMILAR_FILES_PIXEL_DUPES_ALLOWED )
     management_controller.SetVariable( 'max_hamming_distance', 4 )
     
     return management_controller
@@ -506,13 +507,13 @@ class ManagementController( HydrusSerialisable.SerialisableBase ):
         if management_type == MANAGEMENT_TYPE_DUPLICATE_FILTER:
             
-            value = CC.DUPE_SEARCH_ONE_FILE_MATCHES_ONE_SEARCH
+            value = ClientDuplicates.DUPE_SEARCH_ONE_FILE_MATCHES_ONE_SEARCH
             
             if 'both_files_match' in variables:
                 
                 if variables[ 'both_files_match' ]:
                     
-                    value = CC.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH
+                    value = ClientDuplicates.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH
                     
                 
                 del variables[ 'both_files_match' ]
@@ -17,7 +17,6 @@ from hydrus.core.networking import HydrusNetwork
 from hydrus.client import ClientConstants as CC
 from hydrus.client import ClientDefaults
-from hydrus.client import ClientDuplicates
 from hydrus.client import ClientGlobals as CG
 from hydrus.client import ClientLocation
 from hydrus.client import ClientParsing
@@ -25,6 +24,7 @@ from hydrus.client import ClientPaths
 from hydrus.client import ClientServices
 from hydrus.client import ClientThreading
 from hydrus.client import ClientTime
+from hydrus.client.duplicates import ClientDuplicates
 from hydrus.client.gui import ClientGUIAsync
 from hydrus.client.gui import ClientGUICore as CGC
 from hydrus.client.gui import ClientGUIDialogs
@@ -549,15 +549,15 @@ class ManagementPanelDuplicateFilter( ManagementPanel ):
         self._dupe_search_type = ClientGUICommon.BetterChoice( self._filtering_panel )
         
-        self._dupe_search_type.addItem( 'at least one file matches the search', CC.DUPE_SEARCH_ONE_FILE_MATCHES_ONE_SEARCH )
-        self._dupe_search_type.addItem( 'both files match the search', CC.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH )
-        self._dupe_search_type.addItem( 'both files match different searches', CC.DUPE_SEARCH_BOTH_FILES_MATCH_DIFFERENT_SEARCHES )
+        self._dupe_search_type.addItem( 'at least one file matches the search', ClientDuplicates.DUPE_SEARCH_ONE_FILE_MATCHES_ONE_SEARCH )
+        self._dupe_search_type.addItem( 'both files match the search', ClientDuplicates.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH )
+        self._dupe_search_type.addItem( 'both files match different searches', ClientDuplicates.DUPE_SEARCH_BOTH_FILES_MATCH_DIFFERENT_SEARCHES )
         
         self._pixel_dupes_preference = ClientGUICommon.BetterChoice( self._filtering_panel )
         
-        for p in ( CC.SIMILAR_FILES_PIXEL_DUPES_REQUIRED, CC.SIMILAR_FILES_PIXEL_DUPES_ALLOWED, CC.SIMILAR_FILES_PIXEL_DUPES_EXCLUDED ):
+        for p in ( ClientDuplicates.SIMILAR_FILES_PIXEL_DUPES_REQUIRED, ClientDuplicates.SIMILAR_FILES_PIXEL_DUPES_ALLOWED, ClientDuplicates.SIMILAR_FILES_PIXEL_DUPES_EXCLUDED ):
             
-            self._pixel_dupes_preference.addItem( CC.similar_files_pixel_dupes_string_lookup[ p ], p )
+            self._pixel_dupes_preference.addItem( ClientDuplicates.similar_files_pixel_dupes_string_lookup[ p ], p )
             
         
         self._max_hamming_distance_for_filter = ClientGUICommon.BetterSpinBox( self._filtering_panel, min = 0, max = 64 )
@@ -592,7 +592,7 @@ class ManagementPanelDuplicateFilter( ManagementPanel ):
         if not management_controller.HasVariable( 'pixel_dupes_preference' ):
             
-            management_controller.SetVariable( 'pixel_dupes_preference', CC.SIMILAR_FILES_PIXEL_DUPES_ALLOWED )
+            management_controller.SetVariable( 'pixel_dupes_preference', ClientDuplicates.SIMILAR_FILES_PIXEL_DUPES_ALLOWED )
             
         
         self._pixel_dupes_preference.SetValue( management_controller.GetVariable( 'pixel_dupes_preference' ) )
@@ -759,11 +759,11 @@ class ManagementPanelDuplicateFilter( ManagementPanel ):
         if optimise_for_search:
             
-            if dupe_search_type == CC.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH and ( file_search_context_1.IsJustSystemEverything() or file_search_context_1.HasNoPredicates() ):
+            if dupe_search_type == ClientDuplicates.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH and ( file_search_context_1.IsJustSystemEverything() or file_search_context_1.HasNoPredicates() ):
                 
-                dupe_search_type = CC.DUPE_SEARCH_ONE_FILE_MATCHES_ONE_SEARCH
+                dupe_search_type = ClientDuplicates.DUPE_SEARCH_ONE_FILE_MATCHES_ONE_SEARCH
                 
-            elif dupe_search_type == CC.DUPE_SEARCH_BOTH_FILES_MATCH_DIFFERENT_SEARCHES:
+            elif dupe_search_type == ClientDuplicates.DUPE_SEARCH_BOTH_FILES_MATCH_DIFFERENT_SEARCHES:
                 
                 if file_search_context_1.IsJustSystemEverything() or file_search_context_1.HasNoPredicates():
@@ -771,11 +771,11 @@ class ManagementPanelDuplicateFilter( ManagementPanel ):
                     file_search_context_1 = file_search_context_2
                     file_search_context_2 = f
                     
-                    dupe_search_type = CC.DUPE_SEARCH_ONE_FILE_MATCHES_ONE_SEARCH
+                    dupe_search_type = ClientDuplicates.DUPE_SEARCH_ONE_FILE_MATCHES_ONE_SEARCH
                     
                 elif file_search_context_2.IsJustSystemEverything() or file_search_context_2.HasNoPredicates():
                     
-                    dupe_search_type = CC.DUPE_SEARCH_ONE_FILE_MATCHES_ONE_SEARCH
+                    dupe_search_type = ClientDuplicates.DUPE_SEARCH_ONE_FILE_MATCHES_ONE_SEARCH
@@ -1027,9 +1027,9 @@ class ManagementPanelDuplicateFilter( ManagementPanel ):
         ( file_search_context_1, file_search_context_2, dupe_search_type, pixel_dupes_preference, max_hamming_distance ) = self._GetDuplicateFileSearchData( optimise_for_search = False )
         
-        self._tag_autocomplete_2.setVisible( dupe_search_type == CC.DUPE_SEARCH_BOTH_FILES_MATCH_DIFFERENT_SEARCHES )
+        self._tag_autocomplete_2.setVisible( dupe_search_type == ClientDuplicates.DUPE_SEARCH_BOTH_FILES_MATCH_DIFFERENT_SEARCHES )
         
-        self._max_hamming_distance_for_filter.setEnabled( self._pixel_dupes_preference.GetValue() != CC.SIMILAR_FILES_PIXEL_DUPES_REQUIRED )
+        self._max_hamming_distance_for_filter.setEnabled( self._pixel_dupes_preference.GetValue() != ClientDuplicates.SIMILAR_FILES_PIXEL_DUPES_REQUIRED )
         
     
     def FilterDupeSearchTypeChanged( self ):
@@ -1973,9 +1973,13 @@ class PagesNotebook( QP.TabWidgetWithDnD ):
         ClientGUIMenus.AppendMenuItem( menu, 'send pages to the right to a new page of pages', 'Make a new page of pages and put all the pages to the right into it.', self._SendRightPagesToNewNotebook, tab_index )
         
-        if click_over_page_of_pages and page.count() > 0:
-            
-            ClientGUIMenus.AppendSeparator( menu )
+        ClientGUIMenus.AppendSeparator( menu )
+        
+        if not click_over_page_of_pages:
+            
+            ClientGUIMenus.AppendMenuItem( menu, 'refresh this page', 'Command this page to refresh.', page.RefreshQuery )
+            
+        elif click_over_page_of_pages and page.count() > 0:
             
             ClientGUIMenus.AppendMenuItem( menu, 'refresh all this page\'s pages', 'Command every page below this one to refresh.', page.RefreshAllPages )
@@ -3286,7 +3286,7 @@ class MediaPanelThumbnails( MediaPanel ):
     def _UpdateScrollBars( self ):
         
         # The following call is officially a no-op since this property is already true, but it also triggers an update
         # of the scroll area's scrollbars which we need.
         # We need this since we are intercepting & doing work in resize events which causes
@@ -599,13 +599,13 @@ def WriteFetch(
 class ListBoxTagsPredicatesAC( ClientGUIListBoxes.ListBoxTagsPredicates ):
     
-    def __init__( self, parent, callable, service_key, float_mode, **kwargs ):
+    def __init__( self, parent, callable, float_mode, service_key, **kwargs ):
         
         ClientGUIListBoxes.ListBoxTagsPredicates.__init__( self, parent, **kwargs )
         
         self._callable = callable
-        self._service_key = service_key
         self._float_mode = float_mode
+        self._service_key = service_key
         
         self._predicates = {}
@@ -676,7 +676,6 @@ class ListBoxTagsPredicatesAC( ClientGUIListBoxes.ListBoxTagsPredicates ):
     if not they_are_the_same:
         
         previously_selected_predicate = None
-        previously_selected_predicate_had_count = False
         
         if len( self._selected_terms ) == 1:
@@ -1534,6 +1533,155 @@ class AutoCompleteDropdown( CAC.ApplicationCommandProcessorMixin, QW.QWidget ):
         self._DropdownHideShow()
         
     
 
+class ChildrenTab( ListBoxTagsPredicatesAC ):
+    
+    def __init__( self, parent: QW.QWidget, broadcast_call, float_mode: bool, location_context: ClientLocation.LocationContext, tag_service_key: bytes, tag_display_type: int = ClientTags.TAG_DISPLAY_DISPLAY_ACTUAL, height_num_chars: int = 4 ):
+        
+        self._location_context = location_context
+        self._tags_to_child_predicates_cache = dict()
+        self._children_need_updating = True
+        
+        ListBoxTagsPredicatesAC.__init__( self, parent, broadcast_call, float_mode, tag_service_key, tag_display_type = tag_display_type, height_num_chars = height_num_chars )
+        
+    
+    def NotifyNeedsUpdating( self ):
+        
+        self._children_need_updating = True
+        
+    
+    def SetLocationContext( self, location_context: ClientLocation.LocationContext ):
+        
+        self._location_context = location_context
+        
+    
+    def SetTagServiceKey( self, service_key: bytes ):
+        
+        ListBoxTagsPredicatesAC.SetTagServiceKey( self, service_key )
+        
+        self._tags_to_child_predicates_cache = dict()
+        
+    
+    def UpdateChildrenIfNeeded( self, context_tags: typing.Collection[ str ] ):
+        
+        if self._children_need_updating:
+            
+            context_tags = set( context_tags )
+            
+            tag_display_type = self._tag_display_type
+            location_context = self._location_context
+            tag_service_key = self._service_key
+            tags_to_child_predicates_cache = dict( self._tags_to_child_predicates_cache )
+            
+            if location_context.IsOneDomain():
+                
+                search_location_context = location_context
+                
+            else:
+                
+                # let's not blat the db on some crazy multi-domain just for this un-numbered list
+                search_location_context = ClientLocation.LocationContext.STATICCreateSimple( CC.COMBINED_TAG_SERVICE_KEY )
+                
+            
+            tag_context = ClientSearch.TagContext( service_key = tag_service_key )
+            
+            file_search_context = ClientSearch.FileSearchContext(
+                location_context = search_location_context,
+                tag_context = tag_context
+            )
+            
+            def work_callable():
+                
+                uncached_context_tags = { tag for tag in context_tags if tag not in tags_to_child_predicates_cache }
+                
+                if len( uncached_context_tags ) > 0:
+                    
+                    new_tags_to_child_tags = CG.client_controller.Read( 'tag_descendants_lookup', tag_service_key, uncached_context_tags )
+                    
+                    new_child_tags = HydrusData.MassUnion( new_tags_to_child_tags.values() )
+                    
+                    child_predicates = CG.client_controller.Read(
+                        'tag_predicates',
+                        tag_display_type,
+                        file_search_context,
+                        new_child_tags,
+                        zero_count_ok = True
+                    )
+                    
+                    child_tags_to_child_predicates = { predicate.GetValue() : predicate for predicate in child_predicates }
+                    
+                    new_tags_to_child_predicates = { tag : { child_tags_to_child_predicates[ child_tag ] for child_tag in child_tags if child_tag in child_tags_to_child_predicates } for ( tag, child_tags ) in new_tags_to_child_tags.items() }
+                    
+                else:
+                    
+                    new_tags_to_child_predicates = dict()
+                    
+                
+                child_predicates = set()
+                
+                for tag in context_tags:
+                    
+                    if tag in tags_to_child_predicates_cache:
+                        
+                        child_predicates.update( tags_to_child_predicates_cache[ tag ] )
+                        
+                    elif tag in new_tags_to_child_predicates:
+                        
+                        child_predicates.update( new_tags_to_child_predicates[ tag ] )
+                        
+                    
+                
+                child_predicates = [ predicate for predicate in child_predicates if predicate.GetValue() not in context_tags ]
+                
+                ClientSearch.SortPredicates( child_predicates )
+                
+                child_predicates = [ predicate.GetCountlessCopy() for predicate in child_predicates ]
+                
+                num_to_show_in_ac_dropdown_children_tab = CG.client_controller.new_options.GetNoneableInteger( 'num_to_show_in_ac_dropdown_children_tab' )
+                
+                if num_to_show_in_ac_dropdown_children_tab is not None:
+                    
+                    child_predicates = child_predicates[ : num_to_show_in_ac_dropdown_children_tab ]
+                    
+                
+                return ( location_context, tag_service_key, child_predicates, new_tags_to_child_predicates )
+                
+            
+            def publish_callable( result ):
+                
+                ( job_location_context, job_tag_service_key, child_predicates, new_tags_to_children ) = result
+                
+                if job_location_context != self._location_context or job_tag_service_key != self._service_key:
+                    
+                    self.SetPredicates( [] )
+                    
+                    return
+                    
+                
+                self._tags_to_child_predicates_cache.update( new_tags_to_children )
+                
+                self.SetPredicates( child_predicates, preserve_single_selection = True )
+                
+                self._children_need_updating = False
+                
+            
+            def errback_callable( etype, value, tb ):
+                
+                self.SetPredicates( [] )
+                
+                self._children_need_updating = False
+                
+                HydrusData.ShowText( 'Trying to load some child tags failed, please send this to hydev:' )
+                HydrusData.ShowExceptionTuple( etype, value, tb, do_wait = False )
+                
+            
+            job = ClientGUIAsync.AsyncQtJob( self, work_callable, publish_callable, errback_callable = errback_callable )
+            
+            job.start()
+            
+        
+    
+
 class AutoCompleteDropdownTags( AutoCompleteDropdown ):
     
     locationChanged = QC.Signal( ClientLocation.LocationContext )
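ChildrenTab's `work_callable` only hits the database for tags it has not already resolved, then merges cached and fresh children and drops the context tags themselves from the result. A pure-Python sketch of that cache-merge step, where the `lookup` callback stands in for the 'tag_descendants_lookup' db read (the helper and its signature are illustrative, not hydrus API):

```python
def get_child_tags(context_tags, cache, lookup):
    """cache: dict of tag -> set of child tags already resolved.
    lookup(tags): fetches child-tag mappings for the uncached tags only.
    Returns (child_tags, new_entries): the merged children minus the context tags."""
    context_tags = set(context_tags)
    uncached = {tag for tag in context_tags if tag not in cache}
    # only go to the 'db' for tags the cache has never seen
    new_entries = lookup(uncached) if uncached else {}
    child_tags = set()
    for tag in context_tags:
        if tag in cache:
            child_tags.update(cache[tag])
        elif tag in new_entries:
            child_tags.update(new_entries[tag])
    # a context tag is never its own suggestion
    child_tags.difference_update(context_tags)
    return child_tags, new_entries
```

The caller would merge `new_entries` back into the cache, which is what the `publish_callable` above does with `self._tags_to_child_predicates_cache.update(...)`.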
@@ -1569,9 +1717,7 @@ class AutoCompleteDropdownTags( AutoCompleteDropdown ):
         self._dropdown_notebook.addTab( self._favourites_list, 'favourites' )
         
-        self._children_list = ListBoxTagsPredicatesAC( self._dropdown_notebook, self.BroadcastChoices, self._float_mode, self._tag_service_key, tag_display_type = ClientTags.TAG_DISPLAY_DISPLAY_ACTUAL, height_num_chars = 4 )
-        self._tags_to_children_cache = dict()
-        self._children_list_needs_updating = True
+        self._children_list = ChildrenTab( self._dropdown_notebook, self.BroadcastChoices, self._float_mode, self._location_context_button.GetValue(), self._tag_service_key, tag_display_type = ClientTags.TAG_DISPLAY_DISPLAY_ACTUAL, height_num_chars = 4 )
         
         self._dropdown_notebook.addTab( self._children_list, 'children' )
@@ -1628,6 +1774,10 @@ class AutoCompleteDropdownTags( AutoCompleteDropdown ):
             self._SetTagService( top_local_tag_service_key )
             
         
+        self._children_list.SetLocationContext( location_context )
+        
+        self._NotifyChildrenListNeedsUpdating()
+        
         self.locationChanged.emit( location_context )
         
         self._SetListDirty()
@@ -1635,7 +1785,7 @@ class AutoCompleteDropdownTags( AutoCompleteDropdown ):
     def _NotifyChildrenListNeedsUpdating( self ):
         
-        self._children_list_needs_updating = True
+        self._children_list.NotifyNeedsUpdating()
         
         self._UpdateChildrenListIfNeeded()
@@ -1737,7 +1887,6 @@ class AutoCompleteDropdownTags( AutoCompleteDropdown ):
         self._favourites_list.SetTagServiceKey( self._tag_service_key )
         self._children_list.SetTagServiceKey( self._tag_service_key )
         
-        self._tags_to_children_cache = dict()
         self._NotifyChildrenListNeedsUpdating()
         
         self.tagServiceChanged.emit( self._tag_service_key )
@@ -1754,83 +1903,9 @@ class AutoCompleteDropdownTags( AutoCompleteDropdown ):
     def _UpdateChildrenListIfNeeded( self ):
         
-        if self._children_list_needs_updating and self._dropdown_notebook.currentWidget() == self._children_list:
+        if self._dropdown_notebook.currentWidget() == self._children_list:
             
-            tag_service_key = self._tag_service_key
-            context_tags = set( self._current_context_tags )
-            tags_to_children_cache = dict( self._tags_to_children_cache )
-            
-            def work_callable():
-                
-                uncached_context_tags = { tag for tag in context_tags if tag not in tags_to_children_cache }
-                
-                if len( uncached_context_tags ) > 0:
-                    
-                    new_tags_to_children = CG.client_controller.Read( 'tag_descendants_lookup', tag_service_key, uncached_context_tags )
-                    
-                else:
-                    
-                    new_tags_to_children = dict()
-                    
-                
-                child_tags = set()
-                
-                for tag in context_tags:
-                    
-                    if tag in tags_to_children_cache:
-                        
-                        child_tags.update( tags_to_children_cache[ tag ] )
-                        
-                    elif tag in new_tags_to_children:
-                        
-                        child_tags.update( new_tags_to_children[ tag ] )
-                        
-                    
-                
-                child_tags.difference_update( context_tags )
-                
-                return ( tag_service_key, child_tags, new_tags_to_children )
-                
-            
-            def publish_callable( result ):
-                
-                ( job_tag_service_key, child_tags, new_tags_to_children ) = result
-                
-                if job_tag_service_key != self._tag_service_key:
-                    
-                    self._children_list.SetPredicates( [] )
-                    
-                    return
-                    
-                
-                child_tags = list( child_tags )
-                
-                self._tags_to_children_cache.update( new_tags_to_children )
-                
-                tag_sort = ClientTagSorting.TagSort( sort_type = ClientTagSorting.SORT_BY_HUMAN_TAG, sort_order = CC.SORT_ASC )
-                
-                ClientTagSorting.SortTags( tag_sort, child_tags )
-                
-                predicates = [ ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, value = tag ) for tag in child_tags ]
-                
-                self._children_list.SetPredicates( predicates, preserve_single_selection = True )
-                
-                self._children_list_needs_updating = False
-                
-            
-            def errback_callable( etype, value, tb ):
-                
-                self._children_list.SetPredicates( [] )
-                
-                self._children_list_needs_updating = False
-                
-                HydrusData.ShowText( 'Trying to load some child tags failed, please send this to hydev:' )
-                HydrusData.ShowExceptionTuple( etype, value, tb, do_wait = False )
-                
-            
-            job = ClientGUIAsync.AsyncQtJob( self, work_callable, publish_callable, errback_callable = errback_callable )
-            
-            job.start()
+            self._children_list.UpdateChildrenIfNeeded( set( self._current_context_tags ) )
@@ -2292,7 +2367,7 @@ class AutoCompleteDropdownTagsRead( AutoCompleteDropdownTags ):

 height_num_chars = self._fixed_results_list_height

-return ListBoxTagsPredicatesAC( self._dropdown_notebook, self.BroadcastChoices, self._tag_service_key, self._float_mode, tag_display_type = ClientTags.TAG_DISPLAY_DISPLAY_ACTUAL, height_num_chars = height_num_chars )
+return ListBoxTagsPredicatesAC( self._dropdown_notebook, self.BroadcastChoices, self._float_mode, self._tag_service_key, tag_display_type = ClientTags.TAG_DISPLAY_DISPLAY_ACTUAL, height_num_chars = height_num_chars )

 def _LocationContextJustChanged( self, location_context: ClientLocation.LocationContext ):
@@ -3061,7 +3136,7 @@ class AutoCompleteDropdownTagsWrite( AutoCompleteDropdownTags ):

 height_num_chars = CG.client_controller.new_options.GetInteger( 'ac_write_list_height_num_chars' )

-preds_list = ListBoxTagsPredicatesAC( self._dropdown_notebook, self.BroadcastChoices, self._display_tag_service_key, self._float_mode, tag_display_type = ClientTags.TAG_DISPLAY_STORAGE, height_num_chars = height_num_chars )
+preds_list = ListBoxTagsPredicatesAC( self._dropdown_notebook, self.BroadcastChoices, self._float_mode, self._display_tag_service_key, tag_display_type = ClientTags.TAG_DISPLAY_STORAGE, height_num_chars = height_num_chars )

 preds_list.SetExtraParentRowsAllowed( CG.client_controller.new_options.GetBoolean( 'expand_parents_on_storage_autocomplete_taglists' ) )
 preds_list.SetParentDecoratorsAllowed( CG.client_controller.new_options.GetBoolean( 'show_parent_decorators_on_storage_autocomplete_taglists' ) )
@@ -1023,7 +1023,7 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):

 file_import_options = FileImportOptions.GetRealFileImportOptions( file_import_options, FileImportOptions.IMPORT_TYPE_LOUD )

-file_import_job = ClientImportFiles.FileImportJob( temp_path, file_import_options )
+file_import_job = ClientImportFiles.FileImportJob( temp_path, file_import_options, human_file_description = self.file_seed_data )

 file_import_status = file_import_job.DoWork( status_hook = status_hook )
@@ -106,7 +106,7 @@ def CheckFileImportStatus( file_import_status: FileImportStatus ) -> FileImportS

 class FileImportJob( object ):

-    def __init__( self, temp_path: str, file_import_options: FileImportOptions.FileImportOptions ):
+    def __init__( self, temp_path: str, file_import_options: FileImportOptions.FileImportOptions, human_file_description = None ):

         if HG.file_import_report_mode:
@@ -120,6 +120,7 @@ class FileImportJob( object ):

 self._temp_path = temp_path
 self._file_import_options = file_import_options
+self._human_file_description = human_file_description

 self._pre_import_file_status = FileImportStatus.STATICGetUnknownStatus()
 self._post_import_file_status = FileImportStatus.STATICGetUnknownStatus()
@@ -422,7 +423,7 @@ class FileImportJob( object ):

 if raw_pil_image is None:

-    raw_pil_image = HydrusImageOpening.RawOpenPILImage( self._temp_path )
+    raw_pil_image = HydrusImageOpening.RawOpenPILImage( self._temp_path, human_file_description = self._human_file_description )

 has_exif = HydrusImageMetadata.HasEXIF( raw_pil_image )
@@ -451,7 +452,7 @@ class FileImportJob( object ):

 if raw_pil_image is None:

-    raw_pil_image = HydrusImageOpening.RawOpenPILImage( self._temp_path )
+    raw_pil_image = HydrusImageOpening.RawOpenPILImage( self._temp_path, human_file_description = self._human_file_description )

 has_icc_profile = HydrusImageMetadata.HasICCProfile( raw_pil_image )
@@ -45,6 +45,7 @@ from hydrus.client import ClientThreading

 from hydrus.client import ClientTime
 from hydrus.client import ClientRendering
 from hydrus.client import ClientImageHandling
+from hydrus.client.duplicates import ClientDuplicates
 from hydrus.client.importing import ClientImportFiles
 from hydrus.client.importing.options import FileImportOptions
 from hydrus.client.media import ClientMedia
@@ -625,8 +626,8 @@ def ParseDuplicateSearch( request: HydrusServerRequest.HydrusRequest ):

 file_search_context_1 = ClientSearch.FileSearchContext( location_context = location_context, tag_context = tag_context_1, predicates = predicates_1 )
 file_search_context_2 = ClientSearch.FileSearchContext( location_context = location_context, tag_context = tag_context_2, predicates = predicates_2 )

-dupe_search_type = request.parsed_request_args.GetValue( 'potentials_search_type', int, default_value = CC.DUPE_SEARCH_ONE_FILE_MATCHES_ONE_SEARCH )
-pixel_dupes_preference = request.parsed_request_args.GetValue( 'pixel_duplicates', int, default_value = CC.SIMILAR_FILES_PIXEL_DUPES_ALLOWED )
+dupe_search_type = request.parsed_request_args.GetValue( 'potentials_search_type', int, default_value = ClientDuplicates.DUPE_SEARCH_ONE_FILE_MATCHES_ONE_SEARCH )
+pixel_dupes_preference = request.parsed_request_args.GetValue( 'pixel_duplicates', int, default_value = ClientDuplicates.SIMILAR_FILES_PIXEL_DUPES_ALLOWED )
 max_hamming_distance = request.parsed_request_args.GetValue( 'max_hamming_distance', int, default_value = 4 )

 return (
@@ -1434,7 +1435,7 @@ class HydrusResourceClientAPIRestrictedAddFilesAddFile( HydrusResourceClientAPIR

 file_import_options = CG.client_controller.new_options.GetDefaultFileImportOptions( FileImportOptions.IMPORT_TYPE_QUIET )

-file_import_job = ClientImportFiles.FileImportJob( temp_path, file_import_options )
+file_import_job = ClientImportFiles.FileImportJob( temp_path, file_import_options, human_file_description = f'API POSTed File' )

 body_dict = {}
@@ -519,7 +519,7 @@ def EnsureURLIsEncoded( url: str, keep_fragment = True ) -> str:

 single_value_parameters = [ ensure_param_component_is_encoded( single_value_parameter ) for single_value_parameter in single_value_parameters ]

 path = '/' + '/'.join( path_components )
-query = ConvertQueryDictToText( query_dict, single_value_parameters )
+query = ConvertQueryDictToText( query_dict, single_value_parameters, param_order = param_order )

 if not keep_fragment:
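The hunk above threads `param_order` through `ConvertQueryDictToText` so URL normalisation stops alphabetising query parameters on ingestion. A hypothetical standalone sketch of the same idea (the `ensure_url_is_encoded` helper and its `safe` sets are illustrative, not hydrus's actual code), percent-encoding each component while preserving the original parameter order:

```python
from urllib.parse import urlsplit, urlunsplit, quote

def ensure_url_is_encoded( url: str ) -> str:
    
    # encode the path and each query name/value individually; '%' stays in
    # the safe set so an already-encoded URL passes through unchanged
    parts = urlsplit( url )
    
    path = quote( parts.path, safe = '/%' )
    
    # critically, iterate the pairs in their original order--no sorting
    pairs = []
    
    for pair in ( parts.query.split( '&' ) if parts.query else [] ):
        
        ( name, sep, value ) = pair.partition( '=' )
        
        pairs.append( quote( name, safe = '%' ) + sep + quote( value, safe = '%' ) )
        
    
    query = '&'.join( pairs )
    
    return urlunsplit( ( parts.scheme, parts.netloc, path, query, parts.fragment ) )
    
```

Run against the unit test's own fixtures, `'https://grunky.site/post?b=5 5&a=1 1'` encodes to `'https://grunky.site/post?b=5%205&a=1%201'` with `b` still ahead of `a`, and the encoded form is a fixed point.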
@@ -105,7 +105,7 @@ options = {}

 # Misc

 NETWORK_VERSION = 20
-SOFTWARE_VERSION = 574
+SOFTWARE_VERSION = 575
 CLIENT_API_VERSION = 64

 SERVER_THUMBNAIL_DIMENSIONS = ( 200, 200 )
@@ -992,6 +992,9 @@ APPLICATIONS_WITH_THUMBNAILS = { IMAGE_SVG, APPLICATION_PDF, APPLICATION_FLASH,

 MIMES_WITH_THUMBNAILS = set( IMAGES ).union( ANIMATIONS ).union( VIDEO ).union( APPLICATIONS_WITH_THUMBNAILS )

+# basically a flash or a clip or a svg or whatever can normally just have some janked out resolution, so when testing such for thumbnail gen etc.., we'll ignore applications
+MIMES_THAT_ALWAYS_HAVE_GOOD_RESOLUTION = set( IMAGES ).union( ANIMATIONS ).union( VIDEO )

 FILES_THAT_CAN_HAVE_ICC_PROFILE = { IMAGE_BMP, IMAGE_JPEG, IMAGE_PNG, IMAGE_GIF, IMAGE_TIFF, APPLICATION_PSD }.union( PIL_HEIF_MIMES )

 FILES_THAT_CAN_HAVE_EXIF = { IMAGE_JPEG, IMAGE_TIFF, IMAGE_PNG, IMAGE_WEBP }.union( PIL_HEIF_MIMES )
@@ -845,10 +845,16 @@ def LastShutdownWasBad( db_path, instance ):

 return False

+def MassExtend( iterables ):
+
+    return [ item for item in itertools.chain.from_iterable( iterables ) ]
+

 def MassUnion( iterables ):

     return { item for item in itertools.chain.from_iterable( iterables ) }

 def MedianPop( population ):

     # assume it has at least one and comes sorted
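The new `MassExtend` mirrors the existing `MassUnion`: both are thin wrappers over `itertools.chain.from_iterable`, differing only in whether order and duplicates survive. A quick self-contained illustration of the pair (lowercase names to mark it as a sketch rather than hydrus's code):

```python
import itertools

def mass_extend( iterables ):
    
    # flatten a sequence of iterables into one list, keeping order and duplicates
    return [ item for item in itertools.chain.from_iterable( iterables ) ]
    

def mass_union( iterables ):
    
    # flatten into a set, discarding duplicates
    return { item for item in itertools.chain.from_iterable( iterables ) }
    

# mass_extend( [ [ 1, 2 ], [ 2, 3 ] ] ) -> [ 1, 2, 2, 3 ]
# mass_union( [ [ 1, 2 ], [ 2, 3 ] ] ) -> { 1, 2, 3 }
```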
@@ -142,6 +142,10 @@ SERIALISABLE_TYPE_PETITION_HEADER = 124

 SERIALISABLE_TYPE_STRING_JOINER = 125
 SERIALISABLE_TYPE_FILE_FILTER = 126
 SERIALISABLE_TYPE_URL_CLASS_PARAMETER_FIXED_NAME = 127
+SERIALISABLE_AUTO_DUPLICATES_RULE = 128
+SERIALISABLE_AUTO_DUPLICATES_PAIR_SELECTOR_AND_COMPARATOR = 129
+SERIALISABLE_AUTO_DUPLICATES_PAIR_COMPARATOR_RULE_ONE_FILE = 130
+SERIALISABLE_AUTO_DUPLICATES_PAIR_COMPARATOR_RULE_TWO_FILES = 131

 SERIALISABLE_TYPES_TO_OBJECT_TYPES = {}
@@ -199,9 +199,9 @@ def GetAPNGDurationAndNumFrames( path ):

 return ( duration_in_ms, num_frames )

-def GetFrameDurationsPILAnimation( path ):
+def GetFrameDurationsPILAnimation( path, human_file_description = None ):

-    pil_image = HydrusImageOpening.RawOpenPILImage( path )
+    pil_image = HydrusImageOpening.RawOpenPILImage( path, human_file_description = human_file_description )

     times_to_play = GetTimesToPlayPILAnimationFromPIL( pil_image )
@@ -301,11 +301,11 @@ def GetTimesToPlayAPNG( path: str ) -> int:

 return num_plays

-def GetTimesToPlayPILAnimation( path ) -> int:
+def GetTimesToPlayPILAnimation( path, human_file_description = None ) -> int:

     try:

-        pil_image = HydrusImageOpening.RawOpenPILImage( path )
+        pil_image = HydrusImageOpening.RawOpenPILImage( path, human_file_description = human_file_description )

     except HydrusExceptions.UnsupportedFileException:
@@ -93,7 +93,7 @@ def PowerPointResolution( path: str ):

 file = GetZipAsPath( path, 'ppt/presentation.xml' ).open( 'rb' )

 root = ET.parse( file )

 sldSz = root.find('./p:sldSz', {'p': 'http://schemas.openxmlformats.org/presentationml/2006/main'})

 x_emu = int(sldSz.get('cx'))
@@ -142,7 +142,11 @@ def ClipPILImage( pil_image: PILImage.Image, clip_rect ):

 return pil_image.crop( box = ( x, y, x + clip_width, y + clip_height ) )

-def GenerateNumPyImage( path, mime, force_pil = False ) -> numpy.array:
+FORCE_PIL_ALWAYS = True
+
+def GenerateNumPyImage( path, mime, force_pil = False, human_file_description = None ) -> numpy.array:
+
+    force_pil = force_pil or FORCE_PIL_ALWAYS

     if HG.media_load_report_mode:
@@ -180,7 +184,7 @@ def GenerateNumPyImage( path, mime, force_pil = False ) -> numpy.array:

 if not force_pil:

-    pil_image = HydrusImageOpening.RawOpenPILImage( path )
+    pil_image = HydrusImageOpening.RawOpenPILImage( path, human_file_description = human_file_description )

     if pil_image.mode == 'LAB':
@@ -277,9 +281,9 @@ def GenerateNumPyImageFromPILImage( pil_image: PILImage.Image, strip_useless_alp

 return numpy_image

-def GeneratePILImage( path: typing.Union[ str, typing.BinaryIO ], dequantize = True ) -> PILImage.Image:
+def GeneratePILImage( path: typing.Union[ str, typing.BinaryIO ], dequantize = True, human_file_description = None ) -> PILImage.Image:

-    pil_image = HydrusImageOpening.RawOpenPILImage( path )
+    pil_image = HydrusImageOpening.RawOpenPILImage( path, human_file_description = human_file_description )

     try:
@@ -287,7 +291,7 @@ def GeneratePILImage( path: typing.Union[ str, typing.BinaryIO ], dequantize = T

 if dequantize:

-    if pil_image.mode in ( 'I', 'F' ):
+    if pil_image.mode in ( 'I', 'I;16', 'I;16L', 'I;16B', 'I;16N', 'F' ):

         # 'I' = greyscale, uint16
         # 'F' = float, np.float32
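The widened tuple above catches the endianness-specific 16-bit greyscale modes that the newer PIL can hand back for some 16-bit pngs, alongside the old `'I'` and `'F'`. A tiny hypothetical helper showing the same membership test in isolation (the constant and function names are illustrative, not hydrus's):

```python
# greyscale modes carrying more than 8 bits per pixel, per the hunk above;
# images in these modes need squashing down to 8-bit before normal handling
HIGH_BIT_DEPTH_GREYSCALE_MODES = ( 'I', 'I;16', 'I;16L', 'I;16B', 'I;16N', 'F' )

def needs_depth_conversion( pil_mode: str ) -> bool:
    
    # True for 16-bit/float greyscale, False for ordinary 8-bit modes like 'L' or 'RGB'
    return pil_mode in HIGH_BIT_DEPTH_GREYSCALE_MODES
    
```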
@@ -622,7 +626,7 @@ def GetThumbnailResolution( image_resolution: typing.Tuple[ int, int ], bounding

 return ( thumbnail_width, thumbnail_height )

-def IsDecompressionBomb( path ) -> bool:
+def IsDecompressionBomb( path, human_file_description = None ) -> bool:

     # there are two errors here, the 'Warning' and the 'Error', which atm is just a test vs a test x 2 for number of pixels
     # 256MB bmp by default, ( 1024 ** 3 ) // 4 // 3
@@ -634,7 +638,7 @@ def IsDecompressionBomb( path ) -> bool:

 try:

-    HydrusImageOpening.RawOpenPILImage( path )
+    HydrusImageOpening.RawOpenPILImage( path, human_file_description = human_file_description )

 except ( PILImage.DecompressionBombError ):
@@ -155,7 +155,7 @@ def NormaliseICCProfilePILImageToSRGB( pil_image: PILImage.Image ) -> PILImage.I

 src_profile = PILImageCms.ImageCmsProfile( f )

-if pil_image.mode in ( 'I', 'F', 'L', 'LA', 'P' ):
+if pil_image.mode in ( 'I', 'I;16', 'I;16L', 'I;16B', 'I;16N', 'F', 'L', 'LA', 'P' ):

     # had a bunch of LA pngs that turned pure white on RGBA ICC conversion
     # but seem to work fine if keep colourspace the same for now
@@ -3,20 +3,33 @@ from PIL import Image as PILImage

 from hydrus.core import HydrusExceptions

-def RawOpenPILImage( path: typing.Union[ str, typing.BinaryIO ] ) -> PILImage.Image:
+def RawOpenPILImage( path: typing.Union[ str, typing.BinaryIO ], human_file_description = None ) -> PILImage.Image:

     try:

         pil_image = PILImage.open( path )

+        if pil_image is None:
+
+            raise Exception( 'PIL returned None.' )

     except Exception as e:

-        raise HydrusExceptions.DamagedOrUnusualFileException( f'Could not load the image at "{path}"--it was likely malformed!' ) from e
+        if human_file_description is not None:
+
+            message = f'Could not load the image at "{human_file_description}"--it was likely malformed!'
+
+        elif isinstance( path, str ):
+
+            message = f'Could not load the image at "{path}"--it was likely malformed!'
+
+        else:
+
+            message = f'Could not load the image, which had no path (so was probably from inside another file?)--it was likely malformed!'
+
-    if pil_image is None:
-
-        raise HydrusExceptions.DamagedOrUnusualFileException( f'Could not load the image at "{path}"--it was likely malformed!' )
+        raise HydrusExceptions.DamagedOrUnusualFileException( message ) from e

     return pil_image
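`RawOpenPILImage`'s new failure path picks the most human-friendly description available: an explicit `human_file_description` (for example a source URL) wins, then a plain filesystem path, then a generic note for path-less file-like objects. A minimal sketch of just that fallback chain (the `describe_load_failure` helper name is hypothetical; the message strings are copied from the hunk above):

```python
import typing
import io

def describe_load_failure( path: typing.Union[ str, typing.BinaryIO ], human_file_description = None ) -> str:
    
    # an explicit human description, when the caller has one, beats a temp path
    if human_file_description is not None:
        
        return f'Could not load the image at "{human_file_description}"--it was likely malformed!'
        
    elif isinstance( path, str ):
        
        return f'Could not load the image at "{path}"--it was likely malformed!'
        
    else:
        
        # a BytesIO or similar stream has no useful path to report
        return 'Could not load the image, which had no path (so was probably from inside another file?)--it was likely malformed!'
        
    
```

The win in practice is that an import error can now name the original download URL instead of an opaque `temp_path`, which is what the `human_file_description = self.file_seed_data` change in the FileSeed hunk feeds in.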
@@ -26,6 +26,7 @@ from hydrus.client import ClientGlobals as CG

 from hydrus.client import ClientLocation
 from hydrus.client import ClientServices
 from hydrus.client import ClientTime
+from hydrus.client.duplicates import ClientDuplicates
 from hydrus.client.importing import ClientImportFiles
 from hydrus.client.media import ClientMediaManagers
 from hydrus.client.media import ClientMediaResult
@@ -4172,8 +4173,8 @@ class TestClientAPI( unittest.TestCase ):

 default_file_search_context = ClientSearch.FileSearchContext( location_context = default_location_context, tag_context = tag_context, predicates = predicates )

-default_potentials_search_type = CC.DUPE_SEARCH_ONE_FILE_MATCHES_ONE_SEARCH
-default_pixel_duplicates = CC.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
+default_potentials_search_type = ClientDuplicates.DUPE_SEARCH_ONE_FILE_MATCHES_ONE_SEARCH
+default_pixel_duplicates = ClientDuplicates.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
 default_max_hamming_distance = 4

 test_tag_service_key_1 = CC.DEFAULT_LOCAL_TAG_SERVICE_KEY
@@ -4192,8 +4193,8 @@ class TestClientAPI( unittest.TestCase ):

 test_file_search_context_2 = ClientSearch.FileSearchContext( location_context = default_location_context, tag_context = test_tag_context_2, predicates = test_predicates_2 )

-test_potentials_search_type = CC.DUPE_SEARCH_BOTH_FILES_MATCH_DIFFERENT_SEARCHES
-test_pixel_duplicates = CC.SIMILAR_FILES_PIXEL_DUPES_EXCLUDED
+test_potentials_search_type = ClientDuplicates.DUPE_SEARCH_BOTH_FILES_MATCH_DIFFERENT_SEARCHES
+test_pixel_duplicates = ClientDuplicates.SIMILAR_FILES_PIXEL_DUPES_EXCLUDED
 test_max_hamming_distance = 8

 # get count
@@ -10,6 +10,7 @@ from hydrus.core import HydrusTime

 from hydrus.client import ClientConstants as CC
 from hydrus.client import ClientLocation
 from hydrus.client.db import ClientDB
+from hydrus.client.duplicates import ClientDuplicates
 from hydrus.client.importing import ClientImportFiles
 from hydrus.client.importing.options import FileImportOptions
 from hydrus.client.metadata import ClientContentUpdates
@@ -122,9 +123,9 @@ class TestClientDBDuplicates( unittest.TestCase ):

 def _test_initial_state( self ):

-    pixel_dupes_preference = CC.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
+    pixel_dupes_preference = ClientDuplicates.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
     max_hamming_distance = 4
-    dupe_search_type = CC.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH
+    dupe_search_type = ClientDuplicates.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH

     num_potentials = self._read( 'potential_duplicates_count', self._file_search_context_1, self._file_search_context_2, dupe_search_type, pixel_dupes_preference, max_hamming_distance )

@@ -175,9 +176,9 @@ class TestClientDBDuplicates( unittest.TestCase ):

 self._our_main_dupe_group_hashes.add( self._dupe_hashes[2] )

-pixel_dupes_preference = CC.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
+pixel_dupes_preference = ClientDuplicates.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
 max_hamming_distance = 4
-dupe_search_type = CC.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH
+dupe_search_type = ClientDuplicates.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH

 num_potentials = self._read( 'potential_duplicates_count', self._file_search_context_1, self._file_search_context_2, dupe_search_type, pixel_dupes_preference, max_hamming_distance )

@@ -263,9 +264,9 @@ class TestClientDBDuplicates( unittest.TestCase ):

 self._our_main_dupe_group_hashes.add( self._king_hash )

-pixel_dupes_preference = CC.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
+pixel_dupes_preference = ClientDuplicates.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
 max_hamming_distance = 4
-dupe_search_type = CC.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH
+dupe_search_type = ClientDuplicates.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH

 num_potentials = self._read( 'potential_duplicates_count', self._file_search_context_1, self._file_search_context_2, dupe_search_type, pixel_dupes_preference, max_hamming_distance )

@@ -330,9 +331,9 @@ class TestClientDBDuplicates( unittest.TestCase ):

 self._our_main_dupe_group_hashes.add( self._dupe_hashes[5] )

-pixel_dupes_preference = CC.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
+pixel_dupes_preference = ClientDuplicates.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
 max_hamming_distance = 4
-dupe_search_type = CC.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH
+dupe_search_type = ClientDuplicates.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH

 num_potentials = self._read( 'potential_duplicates_count', self._file_search_context_1, self._file_search_context_2, dupe_search_type, pixel_dupes_preference, max_hamming_distance )

@@ -508,9 +509,9 @@ class TestClientDBDuplicates( unittest.TestCase ):

 self._write( 'duplicate_pair_status', [ row ] )

-pixel_dupes_preference = CC.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
+pixel_dupes_preference = ClientDuplicates.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
 max_hamming_distance = 4
-dupe_search_type = CC.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH
+dupe_search_type = ClientDuplicates.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH

 num_potentials = self._read( 'potential_duplicates_count', self._file_search_context_1, self._file_search_context_2, dupe_search_type, pixel_dupes_preference, max_hamming_distance )

@@ -592,9 +593,9 @@ class TestClientDBDuplicates( unittest.TestCase ):

 self._write( 'duplicate_pair_status', rows )

-pixel_dupes_preference = CC.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
+pixel_dupes_preference = ClientDuplicates.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
 max_hamming_distance = 4
-dupe_search_type = CC.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH
+dupe_search_type = ClientDuplicates.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH

 num_potentials = self._read( 'potential_duplicates_count', self._file_search_context_1, self._file_search_context_2, dupe_search_type, pixel_dupes_preference, max_hamming_distance )

@@ -636,9 +637,9 @@ class TestClientDBDuplicates( unittest.TestCase ):

 self._write( 'duplicate_pair_status', rows )

-pixel_dupes_preference = CC.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
+pixel_dupes_preference = ClientDuplicates.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
 max_hamming_distance = 4
-dupe_search_type = CC.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH
+dupe_search_type = ClientDuplicates.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH

 num_potentials = self._read( 'potential_duplicates_count', self._file_search_context_1, self._file_search_context_2, dupe_search_type, pixel_dupes_preference, max_hamming_distance )

@@ -656,9 +657,9 @@ class TestClientDBDuplicates( unittest.TestCase ):

 self._write( 'duplicate_pair_status', [ row ] )

-pixel_dupes_preference = CC.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
+pixel_dupes_preference = ClientDuplicates.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
 max_hamming_distance = 4
-dupe_search_type = CC.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH
+dupe_search_type = ClientDuplicates.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH

 num_potentials = self._read( 'potential_duplicates_count', self._file_search_context_1, self._file_search_context_2, dupe_search_type, pixel_dupes_preference, max_hamming_distance )

@@ -712,9 +713,9 @@ class TestClientDBDuplicates( unittest.TestCase ):

 self._write( 'duplicate_pair_status', rows )

-pixel_dupes_preference = CC.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
+pixel_dupes_preference = ClientDuplicates.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
 max_hamming_distance = 4
-dupe_search_type = CC.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH
+dupe_search_type = ClientDuplicates.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH

 num_potentials = self._read( 'potential_duplicates_count', self._file_search_context_1, self._file_search_context_2, dupe_search_type, pixel_dupes_preference, max_hamming_distance )

@@ -732,9 +733,9 @@ class TestClientDBDuplicates( unittest.TestCase ):

 self._write( 'duplicate_pair_status', [ row ] )

-pixel_dupes_preference = CC.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
+pixel_dupes_preference = ClientDuplicates.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
 max_hamming_distance = 4
-dupe_search_type = CC.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH
+dupe_search_type = ClientDuplicates.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH

 num_potentials = self._read( 'potential_duplicates_count', self._file_search_context_1, self._file_search_context_2, dupe_search_type, pixel_dupes_preference, max_hamming_distance )

@@ -794,9 +795,9 @@ class TestClientDBDuplicates( unittest.TestCase ):

 self._write( 'duplicate_pair_status', rows )

-pixel_dupes_preference = CC.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
+pixel_dupes_preference = ClientDuplicates.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
 max_hamming_distance = 4
-dupe_search_type = CC.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH
+dupe_search_type = ClientDuplicates.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH

 num_potentials = self._read( 'potential_duplicates_count', self._file_search_context_1, self._file_search_context_2, dupe_search_type, pixel_dupes_preference, max_hamming_distance )

@@ -857,9 +858,9 @@ class TestClientDBDuplicates( unittest.TestCase ):

 self._write( 'duplicate_pair_status', rows )

-pixel_dupes_preference = CC.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
+pixel_dupes_preference = ClientDuplicates.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
 max_hamming_distance = 4
-dupe_search_type = CC.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH
+dupe_search_type = ClientDuplicates.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH

 num_potentials = self._read( 'potential_duplicates_count', self._file_search_context_1, self._file_search_context_2, dupe_search_type, pixel_dupes_preference, max_hamming_distance )
@@ -357,6 +357,13 @@ class TestURLClasses( unittest.TestCase ):

 self.assertEqual( ClientNetworkingFunctions.EnsureURLIsEncoded( human_url_with_mix ), encoded_url_with_mix )
 self.assertEqual( ClientNetworkingFunctions.EnsureURLIsEncoded( encoded_url_with_mix ), encoded_url_with_mix )

+# double-check we don't auto-alphabetise params in this early stage! we screwed this up before and broke that option
+human_url_with_mix = 'https://grunky.site/post?b=5 5&a=1 1'
+encoded_url_with_mix = 'https://grunky.site/post?b=5%205&a=1%201'
+
+self.assertEqual( ClientNetworkingFunctions.EnsureURLIsEncoded( human_url_with_mix ), encoded_url_with_mix )
+self.assertEqual( ClientNetworkingFunctions.EnsureURLIsEncoded( encoded_url_with_mix ), encoded_url_with_mix )

 def test_defaults( self ):
@@ -8,9 +8,8 @@ from hydrus.core import HydrusTime

 from hydrus.client import ClientApplicationCommand as CAC
 from hydrus.client import ClientConstants as CC
 from hydrus.client import ClientData
 from hydrus.client import ClientDefaults
-from hydrus.client import ClientDuplicates
+from hydrus.client.duplicates import ClientDuplicates
 from hydrus.client.gui import ClientGUIShortcuts
 from hydrus.client.importing import ClientImportSubscriptions
 from hydrus.client.importing import ClientImportSubscriptionQuery
[image diffs: 3 images modified (3.2 KiB → 3.3 KiB, 2.3 KiB → 2.9 KiB, 1.8 KiB → 1.8 KiB), 6 images added (1.9, 1.9, 2.0, 2.7, 2.2, 2.0 KiB)]