Version 511

This commit is contained in:
Hydrus Network Developer 2022-12-21 16:00:27 -06:00
parent 4b20e3e466
commit 0733e41133
No known key found for this signature in database
GPG Key ID: 76249F053212133C
47 changed files with 1006 additions and 496 deletions

View File

@ -7,6 +7,51 @@ title: Changelog
!!! note
This is the new changelog, only the most recent builds. For all versions, see the [old changelog](old_changelog.html).
## [Version 511](https://github.com/hydrusnetwork/hydrus/releases/tag/v511)
### thumbnail UI scaling
* thumbnails can finally look good at high UI scales! a new setting in _options->thumbnails_, 'Thumbnail UI scale supersampling %', lets you tell hydrus to generate thumbnails at a particular UI scale. match it to your monitor, and your thumbnails should regenerate to look crisp
* some users have complicated multi-monitor setups, or they change their UI scale regularly, so I'm not auto-setting this _yet_. let me know how it goes
* sadly, values below 100% for super-crunchy-mode don't work--Qt seems to cap the ratio at 1.0. a small sketch of the supersampling arithmetic follows this section
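As a rough illustration, here is the supersampling arithmetic as I understand it (a minimal sketch--the real work happens in `HydrusImageHandling.GetThumbnailResolutionAndClipRegion`, and this standalone function is an illustration, not the actual hydrus code):

```python
def get_supersampled_dimensions( bounding_dimensions, thumbnail_dpr_percent ):
    
    # at 200% UI scale, a nominal 200x200 thumbnail is painted from a 400x400
    # source, so we generate the larger image and let Qt scale it down crisply
    ( width, height ) = bounding_dimensions
    
    dpr = thumbnail_dpr_percent / 100
    
    return ( int( width * dpr ), int( height * dpr ) )

print( get_supersampled_dimensions( ( 200, 200 ), 200 ) )  # (400, 400)
```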
### unnamespaced search tags
* _I am not really happy with this solution, since it doesn't neatly restore the old behaviour, but it does make things easier in the new system and I've fixed a related bug_
* a new option in _services->manage tag display and search_, 'Unnamespaced input gives (any namespace) wildcard results', now lets you quickly search `*:sam*` by typing `sam` (the conversion is sketched after this section)
* fixed an issue where an autocomplete input with a total wildcard namespace, like `*:sam`, was not matching unnamespaced tags when preparing the list of tag results
* wildcards with `*` namespace now have a special `(any namespace)` suffix, and they show with unnamespaced namespace colour
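The promotion itself is simple. This sketch mirrors the new `_GetSearchText` logic from this commit, simplified to stand alone (the function name here is illustrative):

```python
def promote_unnamespaced_input( text: str ) -> str:
    
    # mirrors the new ClientSearch behaviour: a bare 'sam' becomes the
    # any-namespace wildcard '*:sam'; already-namespaced input is untouched
    if ':' not in text:
        
        if text == '':
            
            return ''
            
        
        text = '*:{}'.format( text )
        
    
    return text

print( promote_unnamespaced_input( 'sam' ) )         # *:sam
print( promote_unnamespaced_input( 'series:sam' ) )  # series:sam
```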
### misc
* fixed the client-server communication problem related to last week's SerialisableDictionary update. I messed up and forgot this object is used in network comms, which meant >=v510 clients couldn't talk to <=v509 servers and _vice versa_. now the server always kicks out an old SerialisableDictionary serialisation. I plan to remove the patch in 26 weeks, giving us more buffer time for users to update naturally
* the recent option to turn off mouse-scroll-changes-menu-button-value is improved--now the wheel event is correctly passed up to the parent panel, so you'll scroll right through one of these buttons, not halt on it. the file sort control now also obeys this option
* if you try to zoom a media in so that its virtual size would be >32,000px on a side, the canvas now zooms to 32k exactly. this is the max allowed zoom for technical reasons atm (I'll fix it in a future rewrite). this also fixes the 'zoom max' command, which previously did nothing if the max zoom created a virtual canvas bigger than this. also, 'zoom max' is now shown on the media viewer right-click menu
* the 'max zoom' dimension for mpv windows and my native animation window is now 8k. seems like there are smaller technical limits for mpv, and my animation window isn't tiled, so this is to be extra safe for now
* fixed a bug where it was possible to send the 'undelete file' signal to a file that was physically deleted (and therefore viewed in a special 'deleted files' domain). the file would obediently return to its original local file service and then throw 'missing file' warnings when the thumb tried to show. now these files are discarded from undelete consideration
* if you are looking at physically deleted files, the thumbnail view now provides a 'clear deletion record' menu action! this is the same command as the button in _services->review services->all local files_, but just on the selection (the underlying content update is sketched after this list)
* fixed several taglists across the program that were displaying tags in the wrong display context and/or not sorting correctly. this mostly went wrong by setting sorted storage taglists (which normally show sibling/parent flare) as unsorted display taglists
* file lookup script tag suggestions (as fetched from some external source) are now set to be sorted
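For reference, the 'clear deletion record' action boils down to batching CLEAR_DELETE_RECORD content updates against the combined local file domain, just like the code elsewhere in this commit. A standalone sketch of the batching (the constants are modelled as plain strings here; in hydrus they are the real `HC.CONTENT_UPDATE_CLEAR_DELETE_RECORD` and `CC.COMBINED_LOCAL_FILE_SERVICE_KEY`):

```python
from collections import defaultdict

COMBINED_LOCAL_FILE_SERVICE_KEY = 'all local files'        # stand-in for CC.COMBINED_LOCAL_FILE_SERVICE_KEY
CONTENT_UPDATE_CLEAR_DELETE_RECORD = 'clear delete record' # stand-in for HC.CONTENT_UPDATE_CLEAR_DELETE_RECORD

def build_clear_delete_record_updates( hashes, chunk_size = 64 ):
    
    # the client chunks big selections so the db is not hit with one giant transaction
    for i in range( 0, len( hashes ), chunk_size ):
        
        chunk = hashes[ i : i + chunk_size ]
        
        service_keys_to_content_updates = defaultdict( list )
        
        service_keys_to_content_updates[ COMBINED_LOCAL_FILE_SERVICE_KEY ].append( ( CONTENT_UPDATE_CLEAR_DELETE_RECORD, chunk ) )
        
        yield service_keys_to_content_updates
    

for update in build_clear_delete_record_updates( [ 'hash_{}'.format( i ) for i in range( 100 ) ] ):
    
    print( len( update[ COMBINED_LOCAL_FILE_SERVICE_KEY ][ 0 ][ 1 ] ) )  # 64, then 36
    
```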
### file import options pre-import checking
* _this stuff is for advanced users only. normal users can rest assured that the way the client skips downloads for 'already in db/previously deleted' files now has fewer false negatives and false positives_
* the awkwardly named advanced 'do not check url/hash to see if file already in db/previously deleted' checkboxes in file import options have been overhauled. they are now phrased in the positive ('check x to determine already in db/previously deleted?') and offer 'do not check', 'check', and the new 'check - and matches are dispositive'. the tooltip has been updated to talk about what they do. 'dispositive' basically means 'if this one hits, trust it over the other', and by default the 'hash' check remains dispositive over the URLs (this was previously hardcoded; now you can choose urls to rule in some cases). a decision sketch follows this section
* there is also a new checkbox to optionally disable a component of the url checking that looks at neighbouring urls on the same file to determine url-mapping trustworthiness. this will solve or help explore some weird multi-url-mapping situations
* also, novel SHA256 hashes no longer count as 'matches', just like a novel MD5 hash would not. this helps keep useful dispositive behaviour for known hashes but also automatically defers to urls when a site is being CDN-optimised and transfer hashes are different to api-reported ones. this fixes some watchers that have been using excess bandwidth on repeated downloads
* fixed several problems with the url-lookup logic, particularly with the method that checks for 'file-neighbour' urls (simply, when a file-url match should be distrusted because that file has multiple urls of the same url class). it was also too aggressive on file/unknown url classes, which can legitimately have tokenised neighbours, and it was getting confused by http/https dupes
* the neighbour test now remembers untrustworthy domains across different url checks for a file, which helps some subsequent direct-file-url checks where neighbours aren't a marker of file-url mapping reliability
* the overall logic behind the hash and url lookup is cleaned up significantly
* if you are an advanced user who has been working with me on this stuff, let me know how it goes. we erected this rats' nest through years of patches, and now I have cleaned it out. I'm confident it works better overall, but I may have missed one of your complicated situations. at the least, these new options should help us figure out quicker fixes in future
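To make 'dispositive' concrete, here is my reading of the decision order as a minimal sketch (the three constant names are the real `FileImportOptions` values; their integer values and this exact function are assumptions for illustration, not the actual hydrus code):

```python
DO_NOT_CHECK = 0                            # assumed values; the names are the real ones
DO_CHECK = 1
DO_CHECK_AND_MATCHES_ARE_DISPOSITIVE = 2

def decide_preimport_status( hash_check_type, hash_match, url_check_type, url_match ):
    
    # a dispositive hit is trusted outright and the other check is not consulted;
    # the hash check is dispositive by default, so it goes first
    if hash_check_type == DO_CHECK_AND_MATCHES_ARE_DISPOSITIVE and hash_match is not None:
        
        return hash_match
        
    
    if url_check_type == DO_CHECK_AND_MATCHES_ARE_DISPOSITIVE and url_match is not None:
        
        return url_match
        
    
    # otherwise take whatever a plain check found
    for ( check_type, match ) in ( ( hash_check_type, hash_match ), ( url_check_type, url_match ) ):
        
        if check_type != DO_NOT_CHECK and match is not None:
            
            return match
            
        
    
    return 'unknown - do the download'

# a novel sha256 no longer counts as a hash match, so the url check decides here
print( decide_preimport_status( DO_CHECK_AND_MATCHES_ARE_DISPOSITIVE, None, DO_CHECK, 'already in db' ) )
```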
### boring code cleanup
* removed some old 'subject_identifier' arg parsing from various account-modification calls in the server code. as previously planned, for simplicity and security, the only identifier for these actions is now 'subject_account_key', and subject_identifier is only used for account lookups
* improved the error handling around serialised object loading. the messages now explain what happened and state the object type and the versions involved
* cleaned up some tag sort code
* cleaned up how advanced file delete content updates work
* fixed yet another duplicate potentials count unit test that was sometimes failing due to complex count perspective
## [Version 510](https://github.com/hydrusnetwork/hydrus/releases/tag/v510)
### notes
@ -431,43 +476,3 @@ title: Changelog
* shuffled some page widget and layout code to make the embedded a/c dropdown work
* deleted a bunch of a/c event handling and forced layout and other garbage code
* worked on some linter warnings
## [Version 501](https://github.com/hydrusnetwork/hydrus/releases/tag/v501)
### misc
* the Linux build gets the same 'cannot boot' setuptools version hotfix as last week's Windows build. sorry if you could not boot v500 on Linux! macOS never got the problem, I think because it uses pyoxidizer instead of pyinstaller
* fixed the error/crash when clients running with PyQt6 (rather than the default Qt6 library, PySide6) tried to open file or directory selection dialogs. there was a slight method name discrepancy between the two libraries in Qt6 that we had missed, and it was sufficiently core that it was causing errors at best and crashes at worst
* fixed a common crash caused after several options-saving events such as pausing/resuming subscriptions, repositories, import/export folders. thank you very much to the users who reported this, I was finally able to reproduce it an hour before the release was due. the collect control was causing the crash--its ability to update itself without a client restart is disabled for now
* unfortunately, it seems Deviant Art have locked off the API we were using to get nice data, so I am reverting the DA downloader this week to the old html parser, which nonetheless still seems to work well. I expect we'll have to revisit this when we rediscover bad nsfw support or similar--let me know how things go, and you might like to hit your DA subs and 'retry ignored'
* fixed a bad bug where manage rating dialogs that were launched on multiple files with disagreeing numerical ratings (where it shows the stars in dark grey), if okayed on that 'mixed' rating, rather than leaving them untouched, were resetting all those files back to the minimum allowed star value. I do not know when this bug came in, it is unusual, but I did do some rating state work a few weeks ago, so I am hoping it was then. I regret this and the inconvenience it has caused
* if you manually navigate while the media viewer slideshow is running, the slideshow timer now resets (e.g. if you go 'back' on an image 7 seconds into a 10 second slideshow, it will show the previous image for 10 seconds, not 3, before moving on again)
* fixed a type bug in PyQt hydrus when you tried to seek an mpv video when no file was loaded (usually happens when a seek event arrives late)
* when you drop a hydrus serialised png of assorted objects onto a multi-column list, the little error where it says 'this list does not take objects of type x' now only shows once! previously, if your png was a list of objects, it could make a separate type error for each in turn. it should now all be merged properly
* this import function also now presents a summary of how many objects were successfully imported
* updated all ui-level ipfs multihash fetching across the program. this is now a little less laggy and uses no extra db in most cases
* misc code and linter warning cleanup
* .
* tag right-click:
* the 'edit x' entry in the tag right-click menu is now moved to the 'search' submenu with the other search-changing 'exclude'/'remove' etc. actions
* the 'edit x' entry no longer appears when you only select invertible, non-editable predicates
* if you right-click on a -negated tag, the 'search' menu's action label now says 'require samus aran' instead of the awkward 'exclude -samus aran'. it will also say the neutral 'invert selection' if things get complicated
### notes logic improvements
* if you set notes to append on conflict and the existing note already contains the new note, now no changes will be made (i.e. repeatedly parsing the same conflicting note won't append it multiple times)
* if you set notes to rename on conflict and the note already exists on another name, now no changes will be made (i.e. repeatedly parsing the same conflicting note won't create (1), (2), (3)... rename dupes)
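A minimal sketch of the append rule (my own standalone model of the described behaviour, not hydrus code; the rename rule is the same idea, doing nothing when the note text already exists under another name):

```python
def append_note_on_conflict( existing_note: str, new_note: str ) -> str:
    
    # re-parsing the same note is now a no-op instead of appending again
    if new_note in existing_note:
        
        return existing_note
        
    
    return existing_note + '\n\n' + new_note

note = 'original text'
note = append_note_on_conflict( note, 'parsed addition' )
note = append_note_on_conflict( note, 'parsed addition' )  # second parse changes nothing
print( note )
```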
### client api
* /add_tags/search_tags gets a new parameter, 'tag_display_type', which lets you either keep searching the raw 'storage' tags (as you see in edit contexts like the 'manage tags' dialog), or the prettier sibling-processed 'display' tags (as you see in read contexts like a normal file search page)--see the example call after this section
* /get_files/file_metadata now returns 'ipfs_multihashes' structure, which gives ipfs service key(s) and multihashes
* if you run /get_files/search_files with no search predicates, or with only tags that do not parse correctly so you end up with no tags, the search now returns nothing, rather than system:everything. I will likely make this call raise errors on bad tags in future
* the client api help is updated to talk about these
* there's also unit tests for them
* client api version is now 33
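As an example, fetching display-context tag results from a script might look like this (a sketch assuming a local client on the default port 45869 and a key with search permission; check the client api help for the authoritative parameter list and response shape):

```python
import requests

API_URL = 'http://127.0.0.1:45869'
HEADERS = { 'Hydrus-Client-API-Access-Key' : '0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef' }

# 'storage' keeps the raw tags; 'display' applies sibling processing
response = requests.get(
    API_URL + '/add_tags/search_tags',
    headers = HEADERS,
    params = { 'search' : 'sam', 'tag_display_type' : 'display' }
)

print( response.json()[ 'tags' ] )
```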
### popup messages
* the background workings of the popup toaster are rewritten. it looks the same, but instead of technically being its own window, it is now embedded into the main gui as a raised widget. this should clear up a whole heap of jank this window has caused over the years. for instance, in some OSes/Window Managers, when a new subscription popup appeared, the main window would activate and steal focus. this annoying thing should, fingers crossed, no longer happen
* I have significantly rewritten the layout routine of the popup toaster. beyond a general iteration of code cleanup, popup messages should size their width more sensibly, expand to available space, and retract better after needing to grow wide
* unfortunately, some layout jank does remain, mostly in popup messages that change height significantly, like error tracebacks. they can sometimes take two frames to resize correctly, which can look flickery. I am still doing something 'bad' here, in Qt terms, and have to hack part of the layout update routine. let me know what else breaks for you, and I will revisit this in future
* the 'BUGFIX: Hide the popup toaster when the main gui is minimised/loses focus' checkboxes under _options->popups_ are retired. since the toaster is now embedded into the main gui just like any search page, these issues no longer apply. I am leaving the two 'freeze the popup toaster' checkboxes in place, just so we can play around with some virtual desktop issues I know some users are having, but they may soon go too
* the popup toaster components are updated to use Qt signals rather than borked object callables
* as a side thing, the popup toaster can no longer grow taller than the main window size

View File

@ -164,7 +164,7 @@ As a result, if you get a failure on trying to do a big update, try cutting the
If you narrow the gap down to just one version and still get an error, please let me know. I am very interested in these sorts of problems and will be happy to help figure out a fix with you (and everyone else who might be affected).
_All that said, and while updating is complex and every client is different, one user recently did a giant multi-year update and found this route worked and was efficient: 204 > 238 > 246 > 291 > 328 > 335 > 376 > 421 > 466 > 474_
_All that said, and while updating is complex and every client is different, various user reports over the years suggest this route works and is efficient: 204 > 238 > 246 > 291 > 328 > 335 > 376 > 421 > 466 > 474 > 494 > 509_
## Backing up

View File

@ -712,10 +712,10 @@ class ThumbnailCache( object ):
( media_width, media_height ) = display_media.GetResolution()
bounding_dimensions = self._controller.options[ 'thumbnail_dimensions' ]
thumbnail_scale_type = self._controller.new_options.GetInteger( 'thumbnail_scale_type' )
thumbnail_dpr_percent = HG.client_controller.new_options.GetInteger( 'thumbnail_dpr_percent' )
( clip_rect, ( expected_width, expected_height ) ) = HydrusImageHandling.GetThumbnailResolutionAndClipRegion( ( media_width, media_height ), bounding_dimensions, thumbnail_scale_type )
( clip_rect, ( expected_width, expected_height ) ) = HydrusImageHandling.GetThumbnailResolutionAndClipRegion( ( media_width, media_height ), bounding_dimensions, thumbnail_scale_type, thumbnail_dpr_percent )
exactly_as_expected = current_width == expected_width and current_height == expected_height
@ -733,7 +733,7 @@ class ThumbnailCache( object ):
if HG.file_report_mode:
HydrusData.ShowText( 'Thumbnail {} too small, scheduling regeneration from source.'.format( hash.hex() ) )
HydrusData.ShowText( 'Thumbnail {} wrong size ({}x{} instead of {}x{}), scheduling regeneration from source.'.format( hash.hex(), current_width, current_height, expected_width, expected_height ) )
delayed_item = display_media.GetMediaResult()
@ -754,7 +754,7 @@ class ThumbnailCache( object ):
if HG.file_report_mode:
HydrusData.ShowText( 'Stored thumbnail {} was the wrong size, only scaling due to no local source.'.format( hash.hex() ) )
HydrusData.ShowText( 'Thumbnail {} wrong size ({}x{} instead of {}x{}), only scaling due to no local source.'.format( hash.hex(), current_width, current_height, expected_width, expected_height ) )
@ -933,6 +933,7 @@ class ThumbnailCache( object ):
bounding_dimensions = self._controller.options[ 'thumbnail_dimensions' ]
thumbnail_scale_type = self._controller.new_options.GetInteger( 'thumbnail_scale_type' )
thumbnail_dpr_percent = HG.client_controller.new_options.GetInteger( 'thumbnail_dpr_percent' )
# it would be ideal to replace this with mimes_to_default_thumbnail_paths at a convenient point
@ -944,7 +945,7 @@ class ThumbnailCache( object ):
numpy_image_resolution = HydrusImageHandling.GetResolutionNumPy( numpy_image )
( clip_rect, target_resolution ) = HydrusImageHandling.GetThumbnailResolutionAndClipRegion( numpy_image_resolution, bounding_dimensions, thumbnail_scale_type )
( clip_rect, target_resolution ) = HydrusImageHandling.GetThumbnailResolutionAndClipRegion( numpy_image_resolution, bounding_dimensions, thumbnail_scale_type, thumbnail_dpr_percent )
if clip_rect is not None:

View File

@ -314,8 +314,9 @@ class QuickDownloadManager( object ):
file_repository.Request( HC.GET, 'file', { 'hash' : hash }, temp_path = temp_path )
exclude_deleted = False # this is the important part here
do_not_check_known_urls_before_importing = False
do_not_check_hashes_before_importing = False
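# the hash check is dispositive by default: a known sha256 hit is trusted over the url check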
preimport_hash_check_type = FileImportOptions.DO_CHECK_AND_MATCHES_ARE_DISPOSITIVE
preimport_url_check_type = FileImportOptions.DO_CHECK
preimport_url_check_looks_for_neighbours = True
allow_decompression_bombs = True
min_size = None
max_size = None
@ -328,7 +329,8 @@ class QuickDownloadManager( object ):
file_import_options = FileImportOptions.FileImportOptions()
file_import_options.SetPreImportOptions( exclude_deleted, do_not_check_known_urls_before_importing, do_not_check_hashes_before_importing, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
file_import_options.SetPreImportOptions( exclude_deleted, preimport_hash_check_type, preimport_url_check_type, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
file_import_options.SetPreImportURLCheckLooksForNeighbours( preimport_url_check_looks_for_neighbours )
file_import_options.SetPostImportOptions( automatic_archive, associate_primary_urls, associate_source_urls )
file_import_job = ClientImportFiles.FileImportJob( temp_path, file_import_options )

View File

@ -567,8 +567,9 @@ class ClientFilesManager( object ):
bounding_dimensions = self._controller.options[ 'thumbnail_dimensions' ]
thumbnail_scale_type = self._controller.new_options.GetInteger( 'thumbnail_scale_type' )
thumbnail_dpr_percent = HG.client_controller.new_options.GetInteger( 'thumbnail_dpr_percent' )
( clip_rect, target_resolution ) = HydrusImageHandling.GetThumbnailResolutionAndClipRegion( ( width, height ), bounding_dimensions, thumbnail_scale_type )
( clip_rect, target_resolution ) = HydrusImageHandling.GetThumbnailResolutionAndClipRegion( ( width, height ), bounding_dimensions, thumbnail_scale_type, thumbnail_dpr_percent )
percentage_in = self._controller.new_options.GetInteger( 'video_thumbnail_percentage_in' )
@ -1477,8 +1478,9 @@ class ClientFilesManager( object ):
bounding_dimensions = self._controller.options[ 'thumbnail_dimensions' ]
thumbnail_scale_type = self._controller.new_options.GetInteger( 'thumbnail_scale_type' )
thumbnail_dpr_percent = HG.client_controller.new_options.GetInteger( 'thumbnail_dpr_percent' )
( clip_rect, ( expected_width, expected_height ) ) = HydrusImageHandling.GetThumbnailResolutionAndClipRegion( ( media_width, media_height ), bounding_dimensions, thumbnail_scale_type )
( clip_rect, ( expected_width, expected_height ) ) = HydrusImageHandling.GetThumbnailResolutionAndClipRegion( ( media_width, media_height ), bounding_dimensions, thumbnail_scale_type, thumbnail_dpr_percent )
if current_width != expected_width or current_height != expected_height:
@ -1783,7 +1785,7 @@ class FilesMaintenanceManager( object ):
if not leave_deletion_record:
content_update = HydrusData.ContentUpdate( HC.CONTENT_TYPE_FILES, HC.CONTENT_UPDATE_ADVANCED, ( 'delete_deleted', ( hash, ) ) )
content_update = HydrusData.ContentUpdate( HC.CONTENT_TYPE_FILES, HC.CONTENT_UPDATE_CLEAR_DELETE_RECORD, ( hash, ) )
service_keys_to_content_updates = { CC.COMBINED_LOCAL_FILE_SERVICE_KEY : [ content_update ] }

View File

@ -417,7 +417,7 @@ class UndoManager( object ):
if data_type == HC.CONTENT_TYPE_FILES:
if action in ( HC.CONTENT_UPDATE_ADD, HC.CONTENT_UPDATE_DELETE, HC.CONTENT_UPDATE_UNDELETE, HC.CONTENT_UPDATE_RESCIND_PETITION, HC.CONTENT_UPDATE_ADVANCED ):
if action in ( HC.CONTENT_UPDATE_ADD, HC.CONTENT_UPDATE_DELETE, HC.CONTENT_UPDATE_UNDELETE, HC.CONTENT_UPDATE_RESCIND_PETITION, HC.CONTENT_UPDATE_CLEAR_DELETE_RECORD ):
continue

View File

@ -453,6 +453,8 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):
self._dictionary[ 'integers' ][ 'thumbnail_border' ] = 1
self._dictionary[ 'integers' ][ 'thumbnail_margin' ] = 2
self._dictionary[ 'integers' ][ 'thumbnail_dpr_percent' ] = 100
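# 100 means no supersampling; matching this to the OS UI scale gives crisp thumbs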
self._dictionary[ 'integers' ][ 'file_maintenance_idle_throttle_files' ] = 1
self._dictionary[ 'integers' ][ 'file_maintenance_idle_throttle_time_delta' ] = 2
@ -649,9 +651,12 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):
self._dictionary[ 'default_file_import_options' ] = HydrusSerialisable.SerialisableDictionary()
from hydrus.client.importing.options import FileImportOptions
exclude_deleted = True
do_not_check_known_urls_before_importing = False
do_not_check_hashes_before_importing = False
preimport_hash_check_type = FileImportOptions.DO_CHECK_AND_MATCHES_ARE_DISPOSITIVE
preimport_url_check_type = FileImportOptions.DO_CHECK
preimport_url_check_looks_for_neighbours = True
allow_decompression_bombs = True
min_size = None
max_size = None
@ -669,11 +674,10 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):
presentation_import_options.SetPresentationStatus( PresentationImportOptions.PRESENTATION_STATUS_NEW_ONLY )
from hydrus.client.importing.options import FileImportOptions
quiet_file_import_options = FileImportOptions.FileImportOptions()
quiet_file_import_options.SetPreImportOptions( exclude_deleted, do_not_check_known_urls_before_importing, do_not_check_hashes_before_importing, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
quiet_file_import_options.SetPreImportOptions( exclude_deleted, preimport_hash_check_type, preimport_url_check_type, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
quiet_file_import_options.SetPreImportURLCheckLooksForNeighbours( preimport_url_check_looks_for_neighbours )
quiet_file_import_options.SetPostImportOptions( automatic_archive, associate_primary_urls, associate_source_urls )
quiet_file_import_options.SetPresentationImportOptions( presentation_import_options )
@ -681,7 +685,8 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):
loud_file_import_options = FileImportOptions.FileImportOptions()
loud_file_import_options.SetPreImportOptions( exclude_deleted, do_not_check_known_urls_before_importing, do_not_check_hashes_before_importing, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
loud_file_import_options.SetPreImportOptions( exclude_deleted, preimport_hash_check_type, preimport_url_check_type, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
loud_file_import_options.SetPreImportURLCheckLooksForNeighbours( preimport_url_check_looks_for_neighbours )
loud_file_import_options.SetPostImportOptions( automatic_archive, associate_primary_urls, associate_source_urls )
self._dictionary[ 'default_file_import_options' ][ 'loud' ] = loud_file_import_options

View File

@ -1012,14 +1012,14 @@ class HydrusBitmap( object ):
return self._depth
def GetQtImage( self ):
def GetQtImage( self ) -> QG.QImage:
( width, height ) = self._size
return HG.client_controller.bitmap_manager.GetQtImageFromBuffer( width, height, self._depth * 8, self._GetData() )
def GetQtPixmap( self ):
def GetQtPixmap( self ) -> QG.QPixmap:
( width, height ) = self._size

View File

@ -1619,14 +1619,7 @@ class Predicate( HydrusSerialisable.SerialisableBase ):
self._parent_key = None
if predicate_type == PREDICATE_TYPE_TAG:
self._matchable_search_texts = { self._value }
else:
self._matchable_search_texts = set()
self._RecalculateMatchableSearchTexts()
#
@ -1787,6 +1780,23 @@ class Predicate( HydrusSerialisable.SerialisableBase ):
self._RecalcPythonHash()
def _RecalculateMatchableSearchTexts( self ):
if self._predicate_type == PREDICATE_TYPE_TAG:
self._matchable_search_texts = { self._value }
if self._siblings is not None:
self._matchable_search_texts.update( self._siblings )
else:
self._matchable_search_texts = set()
def _UpdateSerialisableInfo( self, version, old_serialisable_info ):
if version == 1:
@ -1895,7 +1905,14 @@ class Predicate( HydrusSerialisable.SerialisableBase ):
( namespace, subtag ) = HydrusTags.SplitTag( tag_analogue )
return namespace
if namespace == '*':
return ''
else:
return namespace
else:
@ -2168,14 +2185,7 @@ class Predicate( HydrusSerialisable.SerialisableBase ):
self._siblings = siblings
if self._predicate_type == PREDICATE_TYPE_TAG:
self._matchable_search_texts = { self._value }.union( self._siblings )
else:
self._matchable_search_texts = set()
self._RecalculateMatchableSearchTexts()
def ToString( self, with_count: bool = True, tag_display_type: int = ClientTags.TAG_DISPLAY_ACTUAL, render_for_user: bool = False, or_under_construction: bool = False ) -> str:
@ -2856,7 +2866,16 @@ class Predicate( HydrusSerialisable.SerialisableBase ):
elif self._predicate_type == PREDICATE_TYPE_WILDCARD:
wildcard = self._value + ' (wildcard search)'
if self._value.startswith( '*:' ):
( any_namespace, subtag ) = HydrusTags.SplitTag( self._value )
wildcard = '{} (any namespace)'.format( subtag )
else:
wildcard = self._value + ' (wildcard search)'
if not self._inclusive:
@ -2924,11 +2943,18 @@ def FilterPredicatesBySearchText( service_key, search_text, predicates: typing.C
if ':' in s:
beginning = r'\A'
( namespace, subtag ) = s.split( ':', 1 )
s = r'{}:(.*\s)?{}'.format( namespace, subtag )
if namespace == '.*':
beginning = r'(\A|:|\s)'
s = subtag
else:
beginning = r'\A'
s = r'{}:(.*\s)?{}'.format( namespace, subtag )
elif s.startswith( '.*' ):
@ -3074,15 +3100,38 @@ class ParsedAutocompleteText( object ):
return 'AC Tag Text: {}'.format( self.raw_input )
def _GetSearchText( self, always_autocompleting: bool, force_do_not_collapse: bool = False ):
def _GetSearchText( self, always_autocompleting: bool, force_do_not_collapse: bool = False, allow_unnamespaced_search_gives_any_namespace_wildcards: bool = True ) -> str:
text = CollapseWildcardCharacters( self.raw_content )
if len( text ) == 0:
return ''
if self._collapse_search_characters and not force_do_not_collapse:
text = ConvertTagToSearchable( text )
if allow_unnamespaced_search_gives_any_namespace_wildcards and self._tag_autocomplete_options.UnnamespacedSearchGivesAnyNamespaceWildcards():
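# a bare subtag like 'sam' is promoted here to the '*:sam' (any namespace) wildcard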
if ':' not in text:
( namespace, subtag ) = HydrusTags.SplitTag( text )
if namespace == '':
if subtag == '':
return ''
text = '*:{}'.format( subtag )
if always_autocompleting:
( namespace, subtag ) = HydrusTags.SplitTag( text )
@ -3135,7 +3184,24 @@ class ParsedAutocompleteText( object ):
elif self.IsExplicitWildcard():
search_texts = [ self._GetSearchText( True, force_do_not_collapse = True ), self._GetSearchText( False, force_do_not_collapse = True ) ]
search_texts = []
allow_unnamespaced_search_gives_any_namespace_wildcards_values = [ True ]
always_autocompleting_values = [ True, False ]
if '*' in self.raw_content:
# don't spam users who type something with this setting turned on
allow_unnamespaced_search_gives_any_namespace_wildcards_values.append( False )
for allow_unnamespaced_search_gives_any_namespace_wildcards in allow_unnamespaced_search_gives_any_namespace_wildcards_values:
for always_autocompleting in always_autocompleting_values:
search_texts.append( self._GetSearchText( always_autocompleting, allow_unnamespaced_search_gives_any_namespace_wildcards = allow_unnamespaced_search_gives_any_namespace_wildcards, force_do_not_collapse = True ) )
for s in list( search_texts ):
@ -3224,7 +3290,7 @@ class ParsedAutocompleteText( object ):
def IsExplicitWildcard( self ):
# user has intentionally put a '*' in
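# ...or the 'unnamespaced input gives any-namespace wildcards' option has synthesised a '*:' wildcard for them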
return '*' in self.raw_content
return '*' in self.raw_content or self._GetSearchText( False ).startswith( '*:' )
def IsNamespaceSearch( self ):

View File

@ -6147,27 +6147,22 @@ class DB( HydrusDB.HydrusDB ):
if data_type == HC.CONTENT_TYPE_FILES:
if action == HC.CONTENT_UPDATE_ADVANCED:
if action == HC.CONTENT_UPDATE_CLEAR_DELETE_RECORD:
( sub_action, sub_row ) = row
hashes = row
if sub_action == 'delete_deleted':
if hashes is None:
hashes = sub_row
service_ids_to_nums_cleared = self.modules_files_storage.ClearLocalDeleteRecord()
if hashes is None:
service_ids_to_nums_cleared = self.modules_files_storage.ClearLocalDeleteRecord()
else:
hash_ids = self.modules_hashes_local_cache.GetHashIds( hashes )
service_ids_to_nums_cleared = self.modules_files_storage.ClearLocalDeleteRecord( hash_ids )
else:
self._ExecuteMany( 'UPDATE service_info SET info = info + ? WHERE service_id = ? AND info_type = ?;', ( ( -num_cleared, clear_service_id, HC.SERVICE_INFO_NUM_DELETED_FILES ) for ( clear_service_id, num_cleared ) in service_ids_to_nums_cleared.items() ) )
hash_ids = self.modules_hashes_local_cache.GetHashIds( hashes )
service_ids_to_nums_cleared = self.modules_files_storage.ClearLocalDeleteRecord( hash_ids )
self._ExecuteMany( 'UPDATE service_info SET info = info + ? WHERE service_id = ? AND info_type = ?;', ( ( -num_cleared, clear_service_id, HC.SERVICE_INFO_NUM_DELETED_FILES ) for ( clear_service_id, num_cleared ) in service_ids_to_nums_cleared.items() ) )
elif action == HC.CONTENT_UPDATE_ADD:

View File

@ -296,11 +296,9 @@ class ClientDBFilesMetadataRich( ClientDBModule.ClientDBModule ):
if not self.modules_hashes.HasHash( hash ):
f = ClientImportFiles.FileImportStatus.STATICGetUnknownStatus()
# this used to set the fis.hash = hash here, but that's unhelpful for the callers, who already know the hash and really want to know if there was a good match
f.hash = hash
return f
return ClientImportFiles.FileImportStatus.STATICGetUnknownStatus()
else:

View File

@ -517,7 +517,7 @@ class DialogInputTags( Dialog ):
self._service_key = service_key
self._tags = ClientGUIListBoxes.ListBoxTagsStringsAddRemove( self, service_key, tag_display_type )
self._tags = ClientGUIListBoxes.ListBoxTagsStringsAddRemove( self, service_key, tag_display_type = tag_display_type )
default_location_context = HG.client_controller.new_options.GetDefaultLocalLocationContext()

View File

@ -644,7 +644,7 @@ class EditFileSeedCachePanel( ClientGUIScrolledPanels.EditPanel ):
ClientGUIMediaActions.UndeleteFiles( deletee_hashes )
content_update_erase_record = HydrusData.ContentUpdate( HC.CONTENT_TYPE_FILES, HC.CONTENT_UPDATE_ADVANCED, ( 'delete_deleted', deletee_hashes ) )
content_update_erase_record = HydrusData.ContentUpdate( HC.CONTENT_TYPE_FILES, HC.CONTENT_UPDATE_CLEAR_DELETE_RECORD, deletee_hashes )
service_keys_to_content_updates = { CC.COMBINED_LOCAL_FILE_SERVICE_KEY : [ content_update_erase_record ] }

View File

@ -98,6 +98,34 @@ def ApplyContentApplicationCommandToMedia( parent: QW.QWidget, command: CAC.Appl
return True
def ClearDeleteRecord( win, media ):
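# only files whose deletion record sits in the combined local file domain can be cleared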
clearable_media = [ m for m in media if CC.COMBINED_LOCAL_FILE_SERVICE_KEY in m.GetLocationsManager().GetDeleted() ]
if len( clearable_media ) == 0:
return
result = ClientGUIDialogsQuick.GetYesNo( win, 'Clear the deletion record for {} previously deleted files?'.format( HydrusData.ToHumanInt( len( clearable_media ) ) ) )
if result == QW.QDialog.Accepted:
for chunk_of_media in HydrusData.SplitIteratorIntoChunks( clearable_media, 64 ):
service_keys_to_content_updates = collections.defaultdict( list )
clearee_hashes = [ m.GetHash() for m in chunk_of_media ]
content_update = HydrusData.ContentUpdate( HC.CONTENT_TYPE_FILES, HC.CONTENT_UPDATE_CLEAR_DELETE_RECORD, clearee_hashes )
service_keys_to_content_updates[ CC.COMBINED_LOCAL_FILE_SERVICE_KEY ] = [ content_update ]
HG.client_controller.Write( 'content_updates', service_keys_to_content_updates )
def EditFileNotes( win: QW.QWidget, media: ClientMedia.MediaSingleton, name_to_start_on: typing.Optional[ str ] = None ):
names_to_notes = media.GetNotesManager().GetNamesToNotes()
@ -635,7 +663,14 @@ def UndeleteFiles( hashes ):
def UndeleteMedia( win, media ):
media_deleted_service_keys = HydrusData.MassUnion( ( m.GetLocationsManager().GetDeleted() for m in media ) )
undeletable_media = [ m for m in media if m.GetLocationsManager().IsLocal() ]
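# physically deleted files are no longer local, so they drop out of undelete consideration here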
if len( undeletable_media ) == 0:
return
media_deleted_service_keys = HydrusData.MassUnion( ( m.GetLocationsManager().GetDeleted() for m in undeletable_media ) )
local_file_services = HG.client_controller.services_manager.GetServices( ( HC.LOCAL_FILE_DOMAIN, ) )
@ -673,7 +708,7 @@ def UndeleteMedia( win, media ):
else:
( undelete_service, ) = undeletable_services
if HC.options[ 'confirm_trash' ]:
@ -693,7 +728,7 @@ def UndeleteMedia( win, media ):
if do_it:
for chunk_of_media in HydrusData.SplitIteratorIntoChunks( media, 64 ):
for chunk_of_media in HydrusData.SplitIteratorIntoChunks( undeletable_media, 64 ):
service_keys_to_content_updates = collections.defaultdict( list )

View File

@ -999,7 +999,7 @@ class EditDeleteFilesPanel( ClientGUIScrolledPanels.EditPanel ):
list_of_service_keys_to_content_updates.extend( [ { CC.COMBINED_LOCAL_FILE_SERVICE_KEY : [ content_update ] } for content_update in content_updates ] )
content_updates = [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_FILES, HC.CONTENT_UPDATE_ADVANCED, ( 'delete_deleted', chunk_of_hashes ) ) for chunk_of_hashes in chunks_of_hashes ]
content_updates = [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_FILES, HC.CONTENT_UPDATE_CLEAR_DELETE_RECORD, chunk_of_hashes ) for chunk_of_hashes in chunks_of_hashes ]
list_of_service_keys_to_content_updates.extend( [ { CC.COMBINED_LOCAL_FILE_SERVICE_KEY : [ content_update ] } for content_update in content_updates ] )

View File

@ -3442,7 +3442,7 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
default_location_context = HG.client_controller.new_options.GetDefaultLocalLocationContext()
self._favourites = ClientGUIListBoxes.ListBoxTagsStringsAddRemove( favourites_panel, CC.COMBINED_TAG_SERVICE_KEY, ClientTags.TAG_DISPLAY_STORAGE )
self._favourites = ClientGUIListBoxes.ListBoxTagsStringsAddRemove( favourites_panel, CC.COMBINED_TAG_SERVICE_KEY, tag_display_type = ClientTags.TAG_DISPLAY_STORAGE )
self._favourites_input = ClientGUIACDropdown.AutoCompleteDropdownTagsWrite( favourites_panel, self._favourites.AddTags, default_location_context, CC.COMBINED_TAG_SERVICE_KEY, show_paste_button = True )
#
@ -3782,7 +3782,7 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
self._suggested_favourites_services.addItem( tag_service.GetName(), tag_service.GetServiceKey() )
self._suggested_favourites = ClientGUIListBoxes.ListBoxTagsStringsAddRemove( suggested_tags_favourites_panel, CC.COMBINED_TAG_SERVICE_KEY, ClientTags.TAG_DISPLAY_STORAGE )
self._suggested_favourites = ClientGUIListBoxes.ListBoxTagsStringsAddRemove( suggested_tags_favourites_panel, CC.COMBINED_TAG_SERVICE_KEY, tag_display_type = ClientTags.TAG_DISPLAY_STORAGE )
self._current_suggested_favourites_service = None
@ -4005,13 +4005,22 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
self._thumbnail_scale_type = ClientGUICommon.BetterChoice( self )
self._video_thumbnail_percentage_in = ClientGUICommon.BetterSpinBox( self, min=0, max=100 )
for t in ( HydrusImageHandling.THUMBNAIL_SCALE_DOWN_ONLY, HydrusImageHandling.THUMBNAIL_SCALE_TO_FIT, HydrusImageHandling.THUMBNAIL_SCALE_TO_FILL ):
self._thumbnail_scale_type.addItem( HydrusImageHandling.thumbnail_scale_str_lookup[ t ], t )
# I tried <100%, but Qt seems to cap it to 1.0. Sad!
self._thumbnail_dpr_percentage = ClientGUICommon.BetterSpinBox( self, min = 100, max = 800 )
tt = 'If your OS runs at a UI scale greater than 100%, mirror it here, and your thumbnails will look crisp. If you have multiple monitors at different UI scales, set it to the one you will be looking at hydrus thumbnails on more often. Setting this to anything other than the UI scale of the monitor hydrus is currently on will cause thumbs to look pixellated and/or muddy.'
tt += os.linesep * 2
tt += 'I believe your UI scale is {}'.format( HydrusData.ConvertFloatToPercentage( self.devicePixelRatio() ) )
self._thumbnail_dpr_percentage.setToolTip( tt )
self._video_thumbnail_percentage_in = ClientGUICommon.BetterSpinBox( self, min=0, max=100 )
self._thumbnail_visibility_scroll_percent = ClientGUICommon.BetterSpinBox( self, min=1, max=99 )
self._thumbnail_visibility_scroll_percent.setToolTip( 'Lower numbers will cause fewer scrolls, higher numbers more.' )
@ -4035,6 +4044,7 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
self._thumbnail_margin.setValue( self._new_options.GetInteger( 'thumbnail_margin' ) )
self._thumbnail_scale_type.SetValue( self._new_options.GetInteger( 'thumbnail_scale_type' ) )
self._thumbnail_dpr_percentage.setValue( self._new_options.GetInteger( 'thumbnail_dpr_percent' ) )
self._video_thumbnail_percentage_in.setValue( self._new_options.GetInteger( 'video_thumbnail_percentage_in' ) )
@ -4063,6 +4073,7 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
rows.append( ( 'Thumbnail border: ', self._thumbnail_border ) )
rows.append( ( 'Thumbnail margin: ', self._thumbnail_margin ) )
rows.append( ( 'Thumbnail scaling: ', self._thumbnail_scale_type ) )
rows.append( ( 'Thumbnail UI scale supersampling %: ', self._thumbnail_dpr_percentage ) )
rows.append( ( 'Focus thumbnails in the preview window on ctrl-click: ', self._focus_preview_on_ctrl_click ) )
rows.append( ( ' Only on files with no duration: ', self._focus_preview_on_ctrl_click_only_static ) )
rows.append( ( 'Focus thumbnails in the preview window on shift-click: ', self._focus_preview_on_shift_click ) )
@ -4099,6 +4110,7 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
self._new_options.SetInteger( 'thumbnail_margin', self._thumbnail_margin.value() )
self._new_options.SetInteger( 'thumbnail_scale_type', self._thumbnail_scale_type.GetValue() )
self._new_options.SetInteger( 'thumbnail_dpr_percent', self._thumbnail_dpr_percentage.value() )
self._new_options.SetInteger( 'video_thumbnail_percentage_in', self._video_thumbnail_percentage_in.value() )
@ -4156,8 +4168,9 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
res_changed = HC.options[ 'thumbnail_dimensions' ] != self._original_options[ 'thumbnail_dimensions' ]
type_changed = test_new_options.GetInteger( 'thumbnail_scale_type' ) != self._original_new_options.GetInteger( 'thumbnail_scale_type' )
dpr_changed = test_new_options.GetInteger( 'thumbnail_dpr_percent' ) != self._original_new_options.GetInteger( 'thumbnail_dpr_percent' )
if res_changed or type_changed:
if res_changed or type_changed or dpr_changed:
HG.client_controller.pub( 'reset_thumbnail_cache' )

View File

@ -10,7 +10,6 @@ from hydrus.core import HydrusSerialisable
from hydrus.client import ClientApplicationCommand as CAC
from hydrus.client import ClientConstants as CC
from hydrus.client.media import ClientMedia
from hydrus.client import ClientParsing
from hydrus.client import ClientSearch
from hydrus.client import ClientThreading
@ -20,6 +19,7 @@ from hydrus.client.gui.lists import ClientGUIListBoxes
from hydrus.client.gui.lists import ClientGUIListBoxesData
from hydrus.client.gui.parsing import ClientGUIParsingLegacy
from hydrus.client.gui.widgets import ClientGUICommon
from hydrus.client.media import ClientMedia
from hydrus.client.metadata import ClientTags
from hydrus.client.metadata import ClientTagSorting
@ -628,7 +628,11 @@ class FileLookupScriptTagsPanel( QW.QWidget ):
parse_results = script.DoQuery( job_key, file_identifier )
tags = ClientParsing.GetTagsFromParseResults( parse_results )
tags = list( ClientParsing.GetTagsFromParseResults( parse_results ) )
tag_sort = ClientTagSorting.TagSort( ClientTagSorting.SORT_BY_HUMAN_TAG, sort_order = CC.SORT_ASC )
ClientTagSorting.SortTags( tag_sort, tags )
QP.CallAfter( qt_code, tags )

View File

@ -143,7 +143,10 @@ class EditTagAutocompleteOptionsPanel( ClientGUIScrolledPanels.EditPanel ):
self._write_autocomplete_location_context.SetAllKnownFilesAllowed( True, False )
self._search_namespaces_into_full_tags = QW.QCheckBox( self )
self._search_namespaces_into_full_tags.setToolTip( 'If on, a search for "ser" will return all "series:" results such as "series:metrod". On large tag services, these searches are extremely slow.' )
self._search_namespaces_into_full_tags.setToolTip( 'If on, a search for "ser" will return all "series:" results such as "series:metroid". On large tag services, these searches are extremely slow.' )
self._unnamespaced_search_gives_any_namespace_wildcards = QW.QCheckBox( self )
self._unnamespaced_search_gives_any_namespace_wildcards.setToolTip( 'If on, an unnamespaced search like "sam" will return special wildcards for "sam* (any namespace)" and "sam (any namespace)", just as if you had typed "*:sam".' )
self._namespace_bare_fetch_all_allowed = QW.QCheckBox( self )
self._namespace_bare_fetch_all_allowed.setToolTip( 'If on, a search for "series:" will return all "series:" results. On large tag services, these searches are extremely slow.' )
@ -165,6 +168,7 @@ class EditTagAutocompleteOptionsPanel( ClientGUIScrolledPanels.EditPanel ):
self._write_autocomplete_tag_domain.SetValue( tag_autocomplete_options.GetWriteAutocompleteTagDomain() )
self._override_write_autocomplete_location_context.setChecked( tag_autocomplete_options.OverridesWriteAutocompleteLocationContext() )
self._search_namespaces_into_full_tags.setChecked( tag_autocomplete_options.SearchNamespacesIntoFullTags() )
self._unnamespaced_search_gives_any_namespace_wildcards.setChecked( tag_autocomplete_options.UnnamespacedSearchGivesAnyNamespaceWildcards() )
self._namespace_bare_fetch_all_allowed.setChecked( tag_autocomplete_options.NamespaceBareFetchAllAllowed() )
self._namespace_fetch_all_allowed.setChecked( tag_autocomplete_options.NamespaceFetchAllAllowed() )
self._fetch_all_allowed.setChecked( tag_autocomplete_options.FetchAllAllowed() )
@ -192,6 +196,7 @@ class EditTagAutocompleteOptionsPanel( ClientGUIScrolledPanels.EditPanel ):
rows.append( ( 'Search namespaces with normal input: ', self._search_namespaces_into_full_tags ) )
rows.append( ( 'Unnamespaced input gives (any namespace) wildcard results: ', self._unnamespaced_search_gives_any_namespace_wildcards ) )
rows.append( ( 'Allow "namespace:": ', self._namespace_bare_fetch_all_allowed ) )
rows.append( ( 'Allow "namespace:*": ', self._namespace_fetch_all_allowed ) )
rows.append( ( 'Allow "*": ', self._fetch_all_allowed ) )
@ -213,7 +218,8 @@ class EditTagAutocompleteOptionsPanel( ClientGUIScrolledPanels.EditPanel ):
self._UpdateControls()
self._override_write_autocomplete_location_context.stateChanged.connect( self._UpdateControls )
self._search_namespaces_into_full_tags.stateChanged.connect( self._UpdateControls )
self._search_namespaces_into_full_tags.stateChanged.connect( self._UpdateControlsFromSearchNamespacesIntoFullTags )
self._unnamespaced_search_gives_any_namespace_wildcards.stateChanged.connect( self._UpdateControlsFromUnnamespacedSearchGivesAnyNamespaceWildcards )
self._namespace_bare_fetch_all_allowed.stateChanged.connect( self._UpdateControls )
@ -221,11 +227,31 @@ class EditTagAutocompleteOptionsPanel( ClientGUIScrolledPanels.EditPanel ):
self._write_autocomplete_location_context.setEnabled( self._override_write_autocomplete_location_context.isChecked() )
for c in ( self._namespace_bare_fetch_all_allowed, self._namespace_fetch_all_allowed ):
if not c.isEnabled():
c.blockSignals( True )
c.setChecked( True )
c.blockSignals( False )
def _UpdateControlsFromSearchNamespacesIntoFullTags( self ):
if self._search_namespaces_into_full_tags.isChecked():
self._namespace_bare_fetch_all_allowed.setEnabled( False )
self._namespace_fetch_all_allowed.setEnabled( False )
if self._unnamespaced_search_gives_any_namespace_wildcards.isChecked():
self._unnamespaced_search_gives_any_namespace_wildcards.setChecked( False )
else:
self._namespace_bare_fetch_all_allowed.setEnabled( True )
@ -240,18 +266,21 @@ class EditTagAutocompleteOptionsPanel( ClientGUIScrolledPanels.EditPanel ):
for c in ( self._namespace_bare_fetch_all_allowed, self._namespace_fetch_all_allowed ):
self._UpdateControls()
def _UpdateControlsFromUnnamespacedSearchGivesAnyNamespaceWildcards( self ):
if self._unnamespaced_search_gives_any_namespace_wildcards.isChecked():
if not c.isEnabled():
if self._search_namespaces_into_full_tags.isChecked():
c.blockSignals( True )
c.setChecked( True )
c.blockSignals( False )
self._search_namespaces_into_full_tags.setChecked( False )
self._UpdateControls()
def GetValue( self ):
@ -277,6 +306,7 @@ class EditTagAutocompleteOptionsPanel( ClientGUIScrolledPanels.EditPanel ):
tag_autocomplete_options.SetFetchResultsAutomatically( self._fetch_results_automatically.isChecked() )
tag_autocomplete_options.SetExactMatchCharacterThreshold( self._exact_match_character_threshold.GetValue() )
tag_autocomplete_options.SetUnnamespacedSearchGivesAnyNamespaceWildcards( self._unnamespaced_search_gives_any_namespace_wildcards.isChecked() )
return tag_autocomplete_options
@ -3002,8 +3032,8 @@ class ManageTagParents( ClientGUIScrolledPanels.ManagePanel ):
self._show_all = QW.QCheckBox( self )
# leave up here since other things have updates based on them
self._children = ClientGUIListBoxes.ListBoxTagsStringsAddRemove( self, self._service_key, ClientTags.TAG_DISPLAY_ACTUAL )
self._parents = ClientGUIListBoxes.ListBoxTagsStringsAddRemove( self, self._service_key, ClientTags.TAG_DISPLAY_ACTUAL )
self._children = ClientGUIListBoxes.ListBoxTagsStringsAddRemove( self, self._service_key, tag_display_type = ClientTags.TAG_DISPLAY_ACTUAL )
self._parents = ClientGUIListBoxes.ListBoxTagsStringsAddRemove( self, self._service_key, tag_display_type = ClientTags.TAG_DISPLAY_ACTUAL )
self._listctrl_panel = ClientGUIListCtrl.BetterListCtrlPanel( self )
@ -4010,7 +4040,7 @@ class ManageTagSiblings( ClientGUIScrolledPanels.ManagePanel ):
self._show_all = QW.QCheckBox( self )
# leave up here since other things have updates based on them
self._old_siblings = ClientGUIListBoxes.ListBoxTagsStringsAddRemove( self, self._service_key, ClientTags.TAG_DISPLAY_ACTUAL )
self._old_siblings = ClientGUIListBoxes.ListBoxTagsStringsAddRemove( self, self._service_key, tag_display_type = ClientTags.TAG_DISPLAY_ACTUAL )
self._new_sibling = ClientGUICommon.BetterStaticText( self )
self._listctrl_panel = ClientGUIListCtrl.BetterListCtrlPanel( self )

View File

@ -4263,15 +4263,22 @@ class CanvasMediaListBrowser( CanvasMediaListNavigable ):
ClientGUIMenus.AppendMenuItem( zoom_menu, 'zoom in', 'Zoom the media in.', self._media_container.ZoomIn )
ClientGUIMenus.AppendMenuItem( zoom_menu, 'zoom out', 'Zoom the media out.', self._media_container.ZoomOut )
if self._media_container.GetCurrentZoom() != 1.0:
current_zoom = self._media_container.GetCurrentZoom()
if current_zoom != 1.0:
ClientGUIMenus.AppendMenuItem( zoom_menu, 'zoom to 100%', 'Set the zoom to 100%.', self._media_container.ZoomSwitch )
elif self._media_container.GetCurrentZoom() != self._media_container.GetCanvasZoom():
elif current_zoom != self._media_container.GetCanvasZoom():
ClientGUIMenus.AppendMenuItem( zoom_menu, 'zoom fit', 'Set the zoom so the media fits the canvas.', self._media_container.ZoomSwitch )
if not self._media_container.IsAtMaxZoom():
ClientGUIMenus.AppendMenuItem( zoom_menu, 'zoom to max', 'Set the zoom to the maximum possible.', self._media_container.ZoomMax )
ClientGUIMenus.AppendMenu( menu, zoom_menu, 'current zoom: {}'.format( ClientData.ConvertZoomToPercentage( self._media_container.GetCurrentZoom() ) ) )

View File

@ -195,8 +195,9 @@ def CalculateMediaContainerSize( media, device_pixel_ratio: float, zoom, show_ac
bounding_dimensions = HG.client_controller.options[ 'thumbnail_dimensions' ]
thumbnail_scale_type = HG.client_controller.new_options.GetInteger( 'thumbnail_scale_type' )
thumbnail_dpr_percent = HG.client_controller.new_options.GetInteger( 'thumbnail_dpr_percent' )
( clip_rect, ( thumb_width, thumb_height ) ) = HydrusImageHandling.GetThumbnailResolutionAndClipRegion( media.GetResolution(), bounding_dimensions, thumbnail_scale_type )
( clip_rect, ( thumb_width, thumb_height ) ) = HydrusImageHandling.GetThumbnailResolutionAndClipRegion( media.GetResolution(), bounding_dimensions, thumbnail_scale_type, thumbnail_dpr_percent )
height = height + thumb_height
@ -1416,6 +1417,18 @@ class MediaContainer( QW.QWidget ):
def _GetMaxZoomDimension( self ):
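# mpv and the untiled native animation window hit smaller technical limits than the general 32k canvas cap, so they get 8k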
if self._show_action == CC.MEDIA_VIEWER_ACTION_SHOW_WITH_MPV or isinstance( self._media_window, Animation ):
return 8000
else:
return 32000
def _MakeMediaWindow( self ):
old_media_window = self._media_window
@ -1586,11 +1599,6 @@ class MediaContainer( QW.QWidget ):
return
if new_zoom == self._current_zoom:
return
my_size = self.size()
my_width = my_size.width()
@ -1603,7 +1611,21 @@ class MediaContainer( QW.QWidget ):
new_my_width = new_media_window_size.width()
new_my_height = new_media_window_size.height()
if new_my_width > 32000 or new_my_height > 32000:
max_zoom_dimension = self._GetMaxZoomDimension()
if new_my_width > max_zoom_dimension or new_my_height > max_zoom_dimension:
( limit_max_normal_zoom, limit_max_canvas_zoom ) = CalculateCanvasZooms( QC.QSize( max_zoom_dimension, max_zoom_dimension ), self._canvas_type, my_dpr, self._media, CC.MEDIA_VIEWER_ACTION_SHOW_WITH_NATIVE )
new_zoom = limit_max_canvas_zoom
new_media_window_size = CalculateMediaContainerSize( self._media, my_dpr, new_zoom, CC.MEDIA_VIEWER_ACTION_SHOW_WITH_NATIVE )
new_my_width = new_media_window_size.width()
new_my_height = new_media_window_size.height()
if new_zoom == self._current_zoom:
return
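The clamp above is simple arithmetic: the largest legal zoom is whatever keeps both virtual dimensions inside the window type's maximum. A minimal sketch of that limiting step, with the hypothetical clamp_zoom standing in for the CalculateCanvasZooms call:

def clamp_zoom( media_width, media_height, requested_zoom, max_dimension ):
    
    # the largest zoom that keeps both virtual dimensions within the limit
    max_zoom = min( max_dimension / media_width, max_dimension / media_height )
    
    return min( requested_zoom, max_zoom )
    

clamp_zoom( 4000, 3000, 100.0, 32000 ) # 8.0 for a native window
clamp_zoom( 4000, 3000, 100.0, 8000 ) # 2.0 under mpv or the native animation window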
@@ -1857,6 +1879,17 @@ class MediaContainer( QW.QWidget ):
def IsAtMaxZoom( self ):
possible_zooms = HG.client_controller.new_options.GetMediaZooms()
max_zoom = max( possible_zooms )
max_zoom_dimension = self._GetMaxZoomDimension()
return self._current_zoom == max_zoom or self.width() == max_zoom_dimension or self.height() == max_zoom_dimension
def IsPaused( self ):
if isinstance( self._media_window, ( Animation, ClientGUIMPV.MPVWidget ) ):

View File

@@ -81,17 +81,45 @@ class EditFileImportOptionsPanel( ClientGUIScrolledPanels.EditPanel ):
self._exclude_deleted.setToolTip( tt )
self._do_not_check_known_urls_before_importing = QW.QCheckBox( pre_import_panel )
self._do_not_check_hashes_before_importing = QW.QCheckBox( pre_import_panel )
#
tt = 'DO NOT SET THESE EXPENSIVE OPTIONS UNLESS YOU KNOW YOU NEED THEM FOR THIS ONE JOB'
tt += os.linesep * 2
tt += 'If hydrus recognises a file\'s URL or hash and is confident it already has it or previously deleted it, it will normally skip the download, saving a huge amount of time and bandwidth. The logic behind this gets quite complicated, and it is usually best to let it work normally.'
tt += os.linesep * 2
tt += 'However, if you believe the clientside url mappings or serverside hashes are inaccurate and the file is being wrongly skipped, turn these on to force a download. Only ever do this for one-time manually fired jobs. Do not turn this on for a normal download or a subscription! You do not need to turn these on for a file maintenance job that is filling in missing files, as missing files are automatically detected and essentially turn these on for you on a per-file basis.'
self._preimport_hash_check_type = ClientGUICommon.BetterChoice( default_panel )
self._preimport_url_check_type = ClientGUICommon.BetterChoice( default_panel )
self._do_not_check_known_urls_before_importing.setToolTip( tt )
self._do_not_check_hashes_before_importing.setToolTip( tt )
jobs = [
( 'do not check', FileImportOptions.DO_NOT_CHECK ),
( 'check', FileImportOptions.DO_CHECK ),
( 'check - and matches are dispositive', FileImportOptions.DO_CHECK_AND_MATCHES_ARE_DISPOSITIVE )
]
for ( display_string, client_data ) in jobs:
self._preimport_hash_check_type.addItem( display_string, client_data )
self._preimport_url_check_type.addItem( display_string, client_data )
tt = 'DO NOT SET THESE AS THE EXPENSIVE "DO NOT CHECK" UNLESS YOU KNOW YOU NEED THEM FOR THIS ONE JOB'
tt += os.linesep * 2
tt += 'If hydrus recognises a file\'s URL or hash, it can determine that it is "already in db" or "previously deleted" and skip the download entirely, saving a huge amount of time and bandwidth. The logic behind this can get quite complicated, and it is usually best to let it work normally.'
tt += os.linesep * 2
tt += 'If the checking is set to "dispositive", then if a match is found, that match will be trusted and the other match type is not consulted. Note that, for now, SHA256 hashes your client has never seen before will never count as "matches", just like an MD5 it has not seen before, so in all cases the import will defer to any set url check that says "already in db/previously deleted". (This is to deal with some cloud-storage in-transfer optimisation hash-changing. Novel SHA256 hashes are not always trustworthy.)'
tt += os.linesep * 2
tt += 'If you believe your clientside parser or url mappings are completely broken, and these logical tests are producing false positive "deleted" or "already in db" results, then set one or both of these to "do not check". Only ever do this for one-time manually fired jobs. Do not turn this on for a normal download or a subscription! You do not need to switch off checking for a file maintenance job that is filling in missing files, as missing files are automatically detected in the logic.'
self._preimport_hash_check_type.setToolTip( tt )
self._preimport_url_check_type.setToolTip( tt )
self._preimport_url_check_looks_for_neighbours = QW.QCheckBox( pre_import_panel )
tt = 'When a file-url mapping is found, an additional check can be performed to see if it is trustworthy.'
tt += os.linesep * 2
tt += 'If the URL has a Post URL Class, and the file has multiple other URLs with the same domain & URL Class (basically the file has multiple URLs on the same site), then the mapping is assumed to be some parse spam and not trustworthy (leading to more "this file looks new" results in the pre-check).'
tt += os.linesep * 2
tt += 'This test is best left on unless you are doing a single job that is messed up by the logic.'
self._preimport_url_check_looks_for_neighbours.setToolTip( tt )
#
self._allow_decompression_bombs = QW.QCheckBox( pre_import_panel )
@@ -192,13 +220,15 @@ class EditFileImportOptionsPanel( ClientGUIScrolledPanels.EditPanel ):
if show_downloader_options and HG.client_controller.new_options.GetBoolean( 'advanced_mode' ):
rows.append( ( 'force file downloading even if url recognised and already in db/deleted: ', self._do_not_check_known_urls_before_importing ) )
rows.append( ( 'force file downloading even if hash recognised and already in db/deleted: ', self._do_not_check_hashes_before_importing ) )
rows.append( ( 'check hashes to determine "already in db/previously deleted"?: ', self._preimport_hash_check_type ) )
rows.append( ( 'check URLs to determine "already in db/previously deleted"?: ', self._preimport_url_check_type ) )
rows.append( ( 'during URL check, check for neighbour-spam?: ', self._preimport_url_check_looks_for_neighbours ) )
else:
self._do_not_check_known_urls_before_importing.setVisible( False )
self._do_not_check_hashes_before_importing.setVisible( False )
self._preimport_hash_check_type.setVisible( False )
self._preimport_url_check_type.setVisible( False )
self._preimport_url_check_looks_for_neighbours.setVisible( False )
rows.append( ( 'allowed filetypes: ', self._mimes ) )
@@ -273,6 +303,12 @@ class EditFileImportOptionsPanel( ClientGUIScrolledPanels.EditPanel ):
self._UpdateIsDefault()
self._preimport_hash_check_type.currentIndexChanged.connect( self._UpdateDispositiveFromHash )
self._preimport_url_check_type.currentIndexChanged.connect( self._UpdateDispositiveFromURL )
self._UpdateDispositiveFromHash()
self._UpdateDispositiveFromURL()
def _LoadDefaultOptions( self ):
@@ -305,15 +341,18 @@ class EditFileImportOptionsPanel( ClientGUIScrolledPanels.EditPanel ):
self._use_default_dropdown.SetValue( file_import_options.IsDefault() )
( exclude_deleted, do_not_check_known_urls_before_importing, do_not_check_hashes_before_importing, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution ) = file_import_options.GetPreImportOptions()
( exclude_deleted, preimport_hash_check_type, preimport_url_check_type, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution ) = file_import_options.GetPreImportOptions()
preimport_url_check_looks_for_neighbours = file_import_options.PreImportURLCheckLooksForNeighbours()
mimes = file_import_options.GetAllowedSpecificFiletypes()
self._mimes.SetValue( mimes )
self._exclude_deleted.setChecked( exclude_deleted )
self._do_not_check_known_urls_before_importing.setChecked( do_not_check_known_urls_before_importing )
self._do_not_check_hashes_before_importing.setChecked( do_not_check_hashes_before_importing )
self._preimport_hash_check_type.SetValue( preimport_hash_check_type )
self._preimport_url_check_type.SetValue( preimport_url_check_type )
self._preimport_url_check_looks_for_neighbours.setChecked( preimport_url_check_looks_for_neighbours )
self._allow_decompression_bombs.setChecked( allow_decompression_bombs )
self._min_size.SetValue( min_size )
self._max_size.SetValue( max_size )
@@ -377,6 +416,30 @@ If you have a very large (10k+ files) file import page, consider hiding some or
QW.QMessageBox.information( self, 'Information', help_message )
def _UpdateDispositiveFromHash( self ):
preimport_hash_check_type = self._preimport_hash_check_type.GetValue()
preimport_url_check_type = self._preimport_url_check_type.GetValue()
if preimport_hash_check_type == FileImportOptions.DO_CHECK_AND_MATCHES_ARE_DISPOSITIVE and preimport_url_check_type == FileImportOptions.DO_CHECK_AND_MATCHES_ARE_DISPOSITIVE:
self._preimport_url_check_type.SetValue( FileImportOptions.DO_CHECK )
def _UpdateDispositiveFromURL( self ):
preimport_hash_check_type = self._preimport_hash_check_type.GetValue()
preimport_url_check_type = self._preimport_url_check_type.GetValue()
if preimport_hash_check_type == FileImportOptions.DO_CHECK_AND_MATCHES_ARE_DISPOSITIVE and preimport_url_check_type == FileImportOptions.DO_CHECK_AND_MATCHES_ARE_DISPOSITIVE:
self._preimport_hash_check_type.SetValue( FileImportOptions.DO_CHECK )
self._preimport_url_check_looks_for_neighbours.setEnabled( preimport_url_check_type != FileImportOptions.DO_NOT_CHECK )
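# the two handlers above keep the pair of dropdowns mutually exclusive on 'dispositive'--whichever
# control the user just changed wins, and the other is demoted to plain 'check'. roughly, assuming
# plain values rather than the BetterChoice widgets:

def demote_other_if_both_dispositive( changed_value, other_value ):
    
    if changed_value == FileImportOptions.DO_CHECK_AND_MATCHES_ARE_DISPOSITIVE and other_value == FileImportOptions.DO_CHECK_AND_MATCHES_ARE_DISPOSITIVE:
        
        other_value = FileImportOptions.DO_CHECK
        
    
    return other_value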
def _UpdateIsDefault( self ):
is_default = self._use_default_dropdown.GetValue()
@@ -428,8 +491,9 @@ If you have a very large (10k+ files) file import page, consider hiding some or
else:
exclude_deleted = self._exclude_deleted.isChecked()
do_not_check_known_urls_before_importing = self._do_not_check_known_urls_before_importing.isChecked()
do_not_check_hashes_before_importing = self._do_not_check_hashes_before_importing.isChecked()
preimport_hash_check_type = self._preimport_hash_check_type.GetValue()
preimport_url_check_type = self._preimport_url_check_type.GetValue()
preimport_url_check_looks_for_neighbours = self._preimport_url_check_looks_for_neighbours.isChecked()
allow_decompression_bombs = self._allow_decompression_bombs.isChecked()
min_size = self._min_size.GetValue()
max_size = self._max_size.GetValue()
@@ -445,7 +509,8 @@ If you have a very large (10k+ files) file import page, consider hiding some or
destination_location_context = self._destination_location_context.GetValue()
file_import_options.SetPreImportOptions( exclude_deleted, do_not_check_known_urls_before_importing, do_not_check_hashes_before_importing, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
file_import_options.SetPreImportOptions( exclude_deleted, preimport_hash_check_type, preimport_url_check_type, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
file_import_options.SetPreImportURLCheckLooksForNeighbours( preimport_url_check_looks_for_neighbours )
file_import_options.SetAllowedSpecificFiletypes( self._mimes.GetValue() )
file_import_options.SetDestinationLocationContext( destination_location_context )
file_import_options.SetPostImportOptions( automatic_archive, associate_primary_urls, associate_source_urls )

View File

@@ -1,17 +1,10 @@
import collections
import typing
from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusData
from hydrus.core import HydrusExceptions
from hydrus.core import HydrusGlobals as HG
from hydrus.core import HydrusSerialisable
from hydrus.core import HydrusTags
from hydrus.client import ClientConstants as CC
from hydrus.client import ClientSearch
from hydrus.client.gui.search import ClientGUISearch
from hydrus.client.media import ClientMedia
from hydrus.client.metadata import ClientTags
class ListBoxItem( object ):
@@ -193,6 +186,16 @@ class ListBoxItemTextTag( ListBoxItem ):
return self._tag.__hash__()
def __lt__( self, other ):
if isinstance( other, ListBoxItemTextTag ):
return HydrusTags.ConvertTagToSortable( self.GetCopyableText() ) < HydrusTags.ConvertTagToSortable( other.GetCopyableText() )
return NotImplemented
def _AppendIdealTagTextWithNamespace( self, texts_with_namespaces, render_for_user ):
( namespace, subtag ) = HydrusTags.SplitTag( self._ideal_tag )
@@ -334,6 +337,16 @@ class ListBoxItemTextTagWithCounts( ListBoxItemTextTag ):
return self._tag.__hash__()
def __lt__( self, other ):
if isinstance( other, ListBoxItemTextTagWithCounts ):
return HydrusTags.ConvertTagToSortable( self.GetCopyableText( with_counts = False ) ) < HydrusTags.ConvertTagToSortable( other.GetCopyableText( with_counts = False ) )
return NotImplemented
def GetCopyableText( self, with_counts: bool = False ) -> str:
if with_counts:
@@ -440,6 +453,16 @@ class ListBoxItemPredicate( ListBoxItem ):
return self._predicate.__hash__()
def __lt__( self, other ):
if isinstance( other, ListBoxItem ):
return HydrusTags.ConvertTagToSortable( self.GetCopyableText() ) < HydrusTags.ConvertTagToSortable( other.GetCopyableText() )
return NotImplemented
def GetCopyableText( self, with_counts: bool = False ) -> str:
if self._predicate.GetType() == ClientSearch.PREDICATE_TYPE_NAMESPACE:

View File

@@ -362,8 +362,9 @@ class ManagementController( HydrusSerialisable.SerialisableBase ):
paths = [ path_info for ( path_type, path_info ) in paths_info if path_type != 'zip' ]
exclude_deleted = advanced_import_options[ 'exclude_deleted' ]
do_not_check_known_urls_before_importing = False
do_not_check_hashes_before_importing = False
preimport_hash_check_type = FileImportOptions.DO_CHECK_AND_MATCHES_ARE_DISPOSITIVE
preimport_url_check_type = FileImportOptions.DO_CHECK
preimport_url_check_looks_for_neighbours = True
allow_decompression_bombs = False
min_size = advanced_import_options[ 'min_size' ]
max_size = None
@@ -377,7 +378,8 @@ class ManagementController( HydrusSerialisable.SerialisableBase ):
file_import_options = FileImportOptions.FileImportOptions()
file_import_options.SetPreImportOptions( exclude_deleted, do_not_check_known_urls_before_importing, do_not_check_hashes_before_importing, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
file_import_options.SetPreImportOptions( exclude_deleted, preimport_hash_check_type, preimport_url_check_type, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
file_import_options.SetPreImportURLCheckLooksForNeighbours( preimport_url_check_looks_for_neighbours )
file_import_options.SetPostImportOptions( automatic_archive, associate_primary_urls, associate_source_urls )
paths_to_tags = { path : { bytes.fromhex( service_key ) : tags for ( service_key, tags ) in additional_service_keys_to_tags } for ( path, additional_service_keys_to_tags ) in paths_to_tags.items() }

View File

@@ -154,6 +154,13 @@ class MediaPanel( ClientMedia.ListeningMediaList, QW.QScrollArea, CAC.Applicatio
def _ClearDeleteRecord( self ):
media = self._GetSelectedFlatMedia()
ClientGUIMediaActions.ClearDeleteRecord( self, media )
def _CopyBMPToClipboard( self ):
copied = False
@@ -3584,6 +3591,7 @@ class MediaPanelThumbnails( MediaPanel ):
selection_has_trash = True in ( locations_manager.IsTrashed() for locations_manager in selected_locations_managers )
selection_has_inbox = True in ( media.HasInbox() for media in self._selected_media )
selection_has_archive = True in ( media.HasArchive() and media.GetLocationsManager().IsLocal() for media in self._selected_media )
selection_has_deletion_record = True in ( CC.COMBINED_LOCAL_FILE_SERVICE_KEY in locations_manager.GetDeleted() for locations_manager in selected_locations_managers )
all_file_domains = HydrusData.MassUnion( locations_manager.GetCurrent() for locations_manager in all_locations_managers )
all_specific_file_domains = all_file_domains.difference( { CC.COMBINED_FILE_SERVICE_KEY, CC.COMBINED_LOCAL_FILE_SERVICE_KEY } )
@@ -3668,6 +3676,7 @@ class MediaPanelThumbnails( MediaPanel ):
local_delete_phrase = 'delete selected'
delete_physically_phrase = 'delete selected physically now'
undelete_phrase = 'undelete selected'
clear_deletion_phrase = 'clear deletion record for selected'
export_phrase = 'files'
copy_phrase = 'files'
@@ -3692,6 +3701,7 @@ class MediaPanelThumbnails( MediaPanel ):
local_delete_phrase = 'delete'
delete_physically_phrase = 'delete physically now'
undelete_phrase = 'undelete'
clear_deletion_phrase = 'clear deletion record'
export_phrase = 'file'
copy_phrase = 'file'
@@ -3989,6 +3999,11 @@ class MediaPanelThumbnails( MediaPanel ):
ClientGUIMenus.AppendMenuItem( menu, undelete_phrase, 'Restore the selected files back to \'my files\'.', self._Undelete )
if selection_has_deletion_record:
ClientGUIMenus.AppendMenuItem( menu, clear_deletion_phrase, 'Clear the deletion record for these files, allowing them to reimport even if previously deleted files are set to be discarded.', self._ClearDeleteRecord )
#
ClientGUIMenus.AppendSeparator( menu )
@@ -4665,7 +4680,7 @@ class Thumbnail( Selectable ):
self._last_lower_summary = None
def GetQtImage( self, device_pixel_ratio ):
def GetQtImage( self, device_pixel_ratio ) -> QG.QImage:
# we probably don't really want to say DPR as a param here, but instead ask for a qt_image in a certain resolution?
# or just give the qt_image to be drawn to?
@@ -4768,13 +4783,27 @@ class Thumbnail( Selectable ):
painter.fillRect( thumbnail_border, thumbnail_border, width - ( thumbnail_border * 2 ), height - ( thumbnail_border * 2 ), new_options.GetColour( background_colour_type ) )
( thumb_width, thumb_height ) = thumbnail_hydrus_bmp.GetSize()
raw_thumbnail_qt_image = thumbnail_hydrus_bmp.GetQtImage()
x_offset = ( width - thumb_width ) // 2
thumbnail_dpr_percent = HG.client_controller.new_options.GetInteger( 'thumbnail_dpr_percent' )
y_offset = ( height - thumb_height ) // 2
if thumbnail_dpr_percent != 100:
thumbnail_dpr = thumbnail_dpr_percent / 100
raw_thumbnail_qt_image.setDevicePixelRatio( thumbnail_dpr )
# qt_image.deviceIndependentSize isn't supported in Qt5 lmao
device_independent_thumb_size = raw_thumbnail_qt_image.size() / thumbnail_dpr
else:
device_independent_thumb_size = raw_thumbnail_qt_image.size()
x_offset = ( width - device_independent_thumb_size.width() ) // 2
y_offset = ( height - device_independent_thumb_size.height() ) // 2
painter.drawImage( x_offset, y_offset, raw_thumbnail_qt_image )
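To see why the offsets now use the device-independent size: a thumbnail generated at 150% has more pixels than its on-screen footprint, and centring on the raw pixel size would misplace it. A worked sketch, assuming a 300x300 pixel thumbnail in a hypothetical 220x220 cell:

thumbnail_dpr = 150 / 100

( pixel_width, pixel_height ) = ( 300, 300 )

di_width = pixel_width / thumbnail_dpr # 200.0 logical pixels
di_height = pixel_height / thumbnail_dpr # 200.0

( cell_width, cell_height ) = ( 220, 220 )

x_offset = ( cell_width - di_width ) // 2 # 10.0, where the raw pixel width would give -40
y_offset = ( cell_height - di_height ) // 2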

View File

@@ -922,31 +922,38 @@ class MediaSortControl( QW.QWidget ):
def wheelEvent( self, event ):
if self._sort_type_button.rect().contains( self._sort_type_button.mapFromGlobal( QG.QCursor.pos() ) ):
if HG.client_controller.new_options.GetBoolean( 'menu_choice_buttons_can_mouse_scroll' ):
if event.angleDelta().y() > 0:
if self._sort_type_button.rect().contains( self._sort_type_button.mapFromGlobal( QG.QCursor.pos() ) ):
index_delta = -1
if event.angleDelta().y() > 0:
index_delta = -1
else:
index_delta = 1
else:
sort_types = self._PopulateSortMenuOrList()
index_delta = 1
if self._sort_type in sort_types:
index = sort_types.index( self._sort_type )
new_index = ( index + index_delta ) % len( sort_types )
new_sort_type = sort_types[ new_index ]
self._SetSortTypeFromUser( new_sort_type )
sort_types = self._PopulateSortMenuOrList()
event.accept()
if self._sort_type in sort_types:
index = sort_types.index( self._sort_type )
new_index = ( index + index_delta ) % len( sort_types )
new_sort_type = sort_types[ new_index ]
self._SetSortTypeFromUser( new_sort_type )
else:
event.ignore()
event.accept()
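The shape of this fix is the standard Qt pattern: accept() a wheel event only when you actually consume it, and ignore() it otherwise so it bubbles up to the parent scroll area. A minimal standalone sketch with hypothetical names:

from qtpy import QtWidgets as QW

class ScrollFriendlyButton( QW.QPushButton ):
    
    def __init__( self, parent, can_mouse_scroll ):
        
        QW.QPushButton.__init__( self, parent )
        
        self._can_mouse_scroll = can_mouse_scroll
        
    
    def wheelEvent( self, event ):
        
        if self._can_mouse_scroll:
            
            # change our value from event.angleDelta() here
            
            event.accept()
            
        else:
            
            event.ignore() # the parent panel gets it, so the user scrolls straight through
            
        
    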

View File

@@ -2109,7 +2109,7 @@ class ReviewServiceCombinedLocalFilesSubPanel( ClientGUICommon.StaticBox ):
hashes = None
content_update = HydrusData.ContentUpdate( HC.CONTENT_TYPE_FILES, HC.CONTENT_UPDATE_ADVANCED, ( 'delete_deleted', hashes ) )
content_update = HydrusData.ContentUpdate( HC.CONTENT_TYPE_FILES, HC.CONTENT_UPDATE_CLEAR_DELETE_RECORD, hashes )
service_keys_to_content_updates = { CC.COMBINED_LOCAL_FILE_SERVICE_KEY : [ content_update ] }

View File

@@ -231,22 +231,29 @@ class MenuChoiceButton( MenuMixin, ClientGUICommon.BetterButton ):
can_do_it = HG.client_controller.new_options.GetBoolean( 'menu_choice_buttons_can_mouse_scroll' )
if self._value_index is not None and can_do_it:
if can_do_it:
if event.angleDelta().y() > 0:
if self._value_index is not None and can_do_it:
index_delta = -1
if event.angleDelta().y() > 0:
index_delta = -1
else:
index_delta = 1
else:
new_value_index = ( self._value_index + index_delta ) % len( self._choice_tuples )
index_delta = 1
self._SetValueIndex( new_value_index )
new_value_index = ( self._value_index + index_delta ) % len( self._choice_tuples )
event.accept()
self._SetValueIndex( new_value_index )
else:
event.ignore()
event.accept()

View File

@@ -36,6 +36,78 @@ from hydrus.client.networking import ClientNetworkingFunctions
FILE_SEED_TYPE_HDD = 0
FILE_SEED_TYPE_URL = 1
def FileURLMappingHasUntrustworthyNeighbours( hash: bytes, url: str ) -> bool:
# let's see if the file that has this url has any other interesting urls
# if the file has another url with the same url class, then this is prob an unreliable 'alternate' source url attribution, and untrustworthy
try:
url = HG.client_controller.network_engine.domain_manager.NormaliseURL( url )
except HydrusExceptions.URLClassException:
# this url is so borked it doesn't parse. can't make neighbour inferences about it
return False
url_class = HG.client_controller.network_engine.domain_manager.GetURLClass( url )
# direct file URLs do not care about neighbours, since that can mean tokenised or different CDN URLs
url_is_worried_about_neighbours = url_class is not None and url_class.GetURLType() not in ( HC.URL_TYPE_FILE, HC.URL_TYPE_UNKNOWN )
if url_is_worried_about_neighbours:
media_result = HG.client_controller.Read( 'media_result', hash )
file_urls = media_result.GetLocationsManager().GetURLs()
# normalise to collapse http/https dupes
file_urls = HG.client_controller.network_engine.domain_manager.NormaliseURLs( file_urls )
for file_url in file_urls:
if file_url == url:
# obviously when we find ourselves, that's not a dupe
continue
if ClientNetworkingFunctions.ConvertURLIntoDomain( file_url ) != ClientNetworkingFunctions.ConvertURLIntoDomain( url ):
# checking here for the day when url classes can refer to multiple domains
continue
try:
file_url_class = HG.client_controller.network_engine.domain_manager.GetURLClass( file_url )
except HydrusExceptions.URLClassException:
# this is borked text, not matchable
continue
if file_url_class is None or file_url_class.GetURLType() in ( HC.URL_TYPE_FILE, HC.URL_TYPE_UNKNOWN ):
# being slightly superfluous here, but this file url can't be an untrustworthy neighbour
continue
if file_url_class == url_class:
# oh no, the file this source url refers to has a different known url in this same domain
# it is more likely that an edit on this site points to the original elsewhere
return True
return False
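# to make the heuristic concrete, a hypothetical example (these urls and the startswith test
# stand in for the domain manager and url class matching):

file_urls = [
    'https://example.booru.net/post/12345', # the file's own post url
    'https://example.booru.net/post/99999' # an 'alternate source' attribution someone parsed
]

lookup_url = 'https://example.booru.net/post/99999'

same_class_neighbours = [ url for url in file_urls if url != lookup_url and url.startswith( 'https://example.booru.net/post/' ) ]

# one neighbour with the same domain and url class--this mapping smells like parse spam, so the
# pre-check says 'looks new' and the file is downloaded to be sure
url_mapping_is_untrustworthy = len( same_class_neighbours ) > 0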
class FileSeed( HydrusSerialisable.SerialisableBase ):
SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_FILE_SEED
@@ -652,13 +724,19 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):
return dict( self._hashes )
def GetPreImportStatusPredictionHash( self, file_import_options: FileImportOptions.FileImportOptions ) -> typing.Tuple[ bool, ClientImportFiles.FileImportStatus ]:
def GetPreImportStatusPredictionHash( self, file_import_options: FileImportOptions.FileImportOptions ) -> typing.Tuple[ bool, bool, ClientImportFiles.FileImportStatus ]:
hash_match_found = False
# TODO: a user raised the spectre of multiple hash parses on some site that somehow provides both the pre- and post-optimised versions of a file
# so I guess we should support multiple hashes at some point, maybe, or figure out a different solution, or draw a harder line in parsing about one-hash-per-parse
if file_import_options.DoNotCheckHashesBeforeImporting() or len( self._hashes ) == 0:
preimport_hash_check_type = file_import_options.GetPreImportHashCheckType()
match_found = False
matches_are_dispositive = preimport_hash_check_type == FileImportOptions.DO_CHECK_AND_MATCHES_ARE_DISPOSITIVE
if len( self._hashes ) == 0 or preimport_hash_check_type == FileImportOptions.DO_NOT_CHECK:
return ( hash_match_found, ClientImportFiles.FileImportStatus.STATICGetUnknownStatus() )
return ( match_found, matches_are_dispositive, ClientImportFiles.FileImportStatus.STATICGetUnknownStatus() )
# hashes
@@ -680,43 +758,42 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):
jobs.append( ( hash_type, found_hash ) )
first_result = None
for ( hash_type, found_hash ) in jobs:
file_import_status = HG.client_controller.Read( 'hash_status', hash_type, found_hash, prefix = '{} hash recognised'.format( hash_type ) )
hash_match_found = True
# there's some subtle gubbins going on here
# an sha256 'haven't seen this before' result will not set the hash here and so will not count as a match
# this is the same as if we do an md5 lookup and get no sha256 result back. we just aren't trusting a novel sha256 as a 'match'
# this is _useful_ to reduce the dispositivity of this lad in this specific case
if file_import_status.hash is None:
continue
match_found = True
file_import_status = ClientImportFiles.CheckFileImportStatus( file_import_status )
if first_result is None:
first_result = file_import_status
if not file_import_status.ShouldImport( file_import_options ):
return ( hash_match_found, file_import_status )
return ( match_found, matches_are_dispositive, file_import_status )
# we do first_result gubbins rather than generating a fresh unknown one to capture correct sha256 hash and mime if db provided it
if first_result is None:
return ( hash_match_found, ClientImportFiles.FileImportStatus.STATICGetUnknownStatus() )
else:
return ( hash_match_found, first_result )
return ( match_found, matches_are_dispositive, ClientImportFiles.FileImportStatus.STATICGetUnknownStatus() )
def GetPreImportStatusPredictionURL( self, file_import_options: FileImportOptions.FileImportOptions, file_url = None ) -> ClientImportFiles.FileImportStatus:
def GetPreImportStatusPredictionURL( self, file_import_options: FileImportOptions.FileImportOptions, file_url = None ) -> typing.Tuple[ bool, bool, ClientImportFiles.FileImportStatus ]:
if file_import_options.DoNotCheckKnownURLsBeforeImporting():
preimport_url_check_type = file_import_options.GetPreImportURLCheckType()
preimport_url_check_looks_for_neighbours = file_import_options.PreImportURLCheckLooksForNeighbours()
match_found = False
matches_are_dispositive = preimport_url_check_type == FileImportOptions.DO_CHECK_AND_MATCHES_ARE_DISPOSITIVE
if preimport_url_check_type == FileImportOptions.DO_NOT_CHECK:
return ClientImportFiles.FileImportStatus.STATICGetUnknownStatus()
return ( match_found, matches_are_dispositive, ClientImportFiles.FileImportStatus.STATICGetUnknownStatus() )
# urls
@@ -743,15 +820,20 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):
# now discard gallery pages or post urls that can hold multiple files
urls = [ url for url in urls if not HG.client_controller.network_engine.domain_manager.URLCanReferToMultipleFiles( url ) ]
unrecognised_url_results = set()
lookup_urls = HG.client_controller.network_engine.domain_manager.NormaliseURLs( urls )
first_result = None
untrustworthy_domains = set()
for url in urls:
for lookup_url in lookup_urls:
results = HG.client_controller.Read( 'url_statuses', url )
if ClientNetworkingFunctions.ConvertURLIntoDomain( lookup_url ) in untrustworthy_domains:
continue
if len( results ) == 0: # if no match found, no useful data discovered
results = HG.client_controller.Read( 'url_statuses', lookup_url )
if len( results ) == 0: # if no match found, this is a new URL, no useful data discovered
continue
@@ -759,86 +841,30 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):
continue
else: # i.e. 1 match found
else: # this url is matched to one known file--sounds good!
file_import_status = results[0]
file_import_status = ClientImportFiles.CheckFileImportStatus( file_import_status )
if first_result is None:
if preimport_url_check_looks_for_neighbours and FileURLMappingHasUntrustworthyNeighbours( file_import_status.hash, lookup_url ):
first_result = file_import_status
untrustworthy_domains.add( ClientNetworkingFunctions.ConvertURLIntoDomain( lookup_url ) )
continue
if not file_import_status.ShouldImport( file_import_options ):
hash = file_import_status.hash
# a known one-file url has given a single clear result. sounds good
we_have_a_match = True
if self.file_seed_type == FILE_SEED_TYPE_URL:
# to double-check, let's see if the file that claims that url has any other interesting urls
# if the file has another url with the same url class as ours, then this is prob an unreliable 'alternate' source url attribution, and untrustworthy
my_url = self.file_seed_data
if url != my_url:
my_url_class = HG.client_controller.network_engine.domain_manager.GetURLClass( my_url )
media_result = HG.client_controller.Read( 'media_result', hash )
this_files_urls = media_result.GetLocationsManager().GetURLs()
for this_files_url in this_files_urls:
if this_files_url != my_url:
try:
this_url_class = HG.client_controller.network_engine.domain_manager.GetURLClass( this_files_url )
except HydrusExceptions.URLClassException:
continue
if my_url_class == this_url_class:
# oh no, the file this source url refers to has a different known url in this same domain
# it is more likely that an edit on this site points to the original elsewhere
we_have_a_match = False
break
if we_have_a_match:
# if a known one-file url gives a single clear result, that result is reliable
return file_import_status
match_found = True
# we have discovered a single-file match with a hash and no controversial urls; we have a result
# this may be a 'needs to be imported' result, but that's fine. probably a record of a previously deleted file that is now ok to import
return ( match_found, matches_are_dispositive, file_import_status )
# we do first_result gubbins rather than generating a fresh unknown one to capture correct sha256 hash and mime if db provided it
if first_result is None:
return ClientImportFiles.FileImportStatus.STATICGetUnknownStatus()
else:
return first_result
# no good matches found
return ( match_found, matches_are_dispositive, ClientImportFiles.FileImportStatus.STATICGetUnknownStatus() )
def GetSearchFileSeeds( self ):
@@ -1036,22 +1062,31 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):
def PredictPreImportStatus( self, file_import_options: FileImportOptions.FileImportOptions, tag_import_options: TagImportOptions.TagImportOptions, note_import_options: NoteImportOptions.NoteImportOptions, file_url = None ):
( hash_match_found, hash_file_import_status ) = self.GetPreImportStatusPredictionHash( file_import_options )
( hash_match_found, hash_matches_are_dispositive, hash_file_import_status ) = self.GetPreImportStatusPredictionHash( file_import_options )
( url_match_found, url_matches_are_dispositive, url_file_import_status ) = self.GetPreImportStatusPredictionURL( file_import_options, file_url = file_url )
# now let's set the prediction
url_file_import_status = None
if hash_match_found: # trust hashes over urls m8
if hash_match_found and hash_matches_are_dispositive:
file_import_status = hash_file_import_status
else:
url_file_import_status = self.GetPreImportStatusPredictionURL( file_import_options, file_url = file_url )
elif url_match_found and url_matches_are_dispositive:
file_import_status = url_file_import_status
else:
# prefer the one that says already in db/previously deleted
if hash_file_import_status.ShouldImport( file_import_options ):
file_import_status = url_file_import_status
else:
file_import_status = hash_file_import_status
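# the cascade above boils down to a short priority list. a sketch of the same decision, assuming
# simple results with a should_import flag (note the UI never allows both checks to be
# dispositive at once):

def predict_status( hash_found, hash_dispositive, hash_status, url_found, url_dispositive, url_status ):
    
    if hash_found and hash_dispositive:
        
        return hash_status
        
    
    if url_found and url_dispositive:
        
        return url_status
        
    
    # no dispositive match: prefer whichever result says 'already in db/previously deleted'
    if hash_status.should_import:
        
        return url_status
        
    
    return hash_status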
# and make some recommendations
@@ -1062,21 +1097,7 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):
# but if we otherwise still want to force some tags, let's do it
if not should_download_metadata and tag_import_options.WorthFetchingTags():
url_override = False
if tag_import_options.ShouldFetchTagsEvenIfURLKnownAndFileAlreadyInDB():
if url_file_import_status is None:
url_file_import_status = self.GetPreImportStatusPredictionURL( file_import_options, file_url = file_url )
if url_file_import_status.AlreadyInDB():
url_override = True
url_override = url_file_import_status.AlreadyInDB() and tag_import_options.ShouldFetchTagsEvenIfURLKnownAndFileAlreadyInDB()
hash_override = hash_file_import_status.AlreadyInDB() and tag_import_options.ShouldFetchTagsEvenIfHashKnownAndFileAlreadyInDB()
if url_override or hash_override:

View File

@@ -72,7 +72,7 @@ class FileImportStatus( object ):
return FileImportStatus( CC.STATUS_UNKNOWN, None )
def CheckFileImportStatus( file_import_status: FileImportStatus ):
def CheckFileImportStatus( file_import_status: FileImportStatus ) -> FileImportStatus:
if file_import_status.AlreadyInDB():
@@ -257,8 +257,10 @@ class FileImportJob( object ):
self._pre_import_file_status = HG.client_controller.Read( 'hash_status', 'sha256', hash, prefix = 'file recognised' )
# just in case
self._pre_import_file_status.hash = hash
if self._pre_import_file_status.hash is None:
self._pre_import_file_status.hash = hash
self._pre_import_file_status = CheckFileImportStatus( self._pre_import_file_status )
@@ -339,8 +341,9 @@ class FileImportJob( object ):
bounding_dimensions = HG.client_controller.options[ 'thumbnail_dimensions' ]
thumbnail_scale_type = HG.client_controller.new_options.GetInteger( 'thumbnail_scale_type' )
thumbnail_dpr_percent = HG.client_controller.new_options.GetInteger( 'thumbnail_dpr_percent' )
( clip_rect, target_resolution ) = HydrusImageHandling.GetThumbnailResolutionAndClipRegion( ( width, height ), bounding_dimensions, thumbnail_scale_type )
( clip_rect, target_resolution ) = HydrusImageHandling.GetThumbnailResolutionAndClipRegion( ( width, height ), bounding_dimensions, thumbnail_scale_type, thumbnail_dpr_percent )
percentage_in = HG.client_controller.new_options.GetInteger( 'video_thumbnail_percentage_in' )

View File

@@ -32,19 +32,24 @@ def GetRealPresentationImportOptions( file_import_options: "FileImportOptions",
return real_file_import_options.GetPresentationImportOptions()
DO_NOT_CHECK = 0
DO_CHECK = 1
DO_CHECK_AND_MATCHES_ARE_DISPOSITIVE = 2
class FileImportOptions( HydrusSerialisable.SerialisableBase ):
SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_FILE_IMPORT_OPTIONS
SERIALISABLE_NAME = 'File Import Options'
SERIALISABLE_VERSION = 8
SERIALISABLE_VERSION = 9
def __init__( self ):
HydrusSerialisable.SerialisableBase.__init__( self )
self._exclude_deleted = True
self._do_not_check_known_urls_before_importing = False
self._do_not_check_hashes_before_importing = False
self._preimport_hash_check_type = DO_CHECK_AND_MATCHES_ARE_DISPOSITIVE
self._preimport_url_check_type = DO_CHECK
self._preimport_url_check_looks_for_neighbours = True
self._allow_decompression_bombs = True
self._filetype_filter_predicate = ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_MIME, value = set( HC.GENERAL_FILETYPES ) )
self._min_size = None
@@ -67,7 +72,7 @@ class FileImportOptions( HydrusSerialisable.SerialisableBase ):
serialisable_filetype_filter_predicate = self._filetype_filter_predicate.GetSerialisableTuple()
pre_import_options = ( self._exclude_deleted, self._do_not_check_known_urls_before_importing, self._do_not_check_hashes_before_importing, self._allow_decompression_bombs, serialisable_filetype_filter_predicate, self._min_size, self._max_size, self._max_gif_size, self._min_resolution, self._max_resolution, serialisable_import_destination_location_context )
pre_import_options = ( self._exclude_deleted, self._preimport_hash_check_type, self._preimport_url_check_type, self._preimport_url_check_looks_for_neighbours, self._allow_decompression_bombs, serialisable_filetype_filter_predicate, self._min_size, self._max_size, self._max_gif_size, self._min_resolution, self._max_resolution, serialisable_import_destination_location_context )
post_import_options = ( self._automatic_archive, self._associate_primary_urls, self._associate_source_urls )
serialisable_presentation_import_options = self._presentation_import_options.GetSerialisableTuple()
@@ -78,7 +83,7 @@ class FileImportOptions( HydrusSerialisable.SerialisableBase ):
( pre_import_options, post_import_options, serialisable_presentation_import_options, self._is_default ) = serialisable_info
( self._exclude_deleted, self._do_not_check_known_urls_before_importing, self._do_not_check_hashes_before_importing, self._allow_decompression_bombs, serialisable_filetype_filter_predicate, self._min_size, self._max_size, self._max_gif_size, self._min_resolution, self._max_resolution, serialisable_import_destination_location_context ) = pre_import_options
( self._exclude_deleted, self._preimport_hash_check_type, self._preimport_url_check_type, self._preimport_url_check_looks_for_neighbours, self._allow_decompression_bombs, serialisable_filetype_filter_predicate, self._min_size, self._max_size, self._max_gif_size, self._min_resolution, self._max_resolution, serialisable_import_destination_location_context ) = pre_import_options
( self._automatic_archive, self._associate_primary_urls, self._associate_source_urls ) = post_import_options
self._presentation_import_options = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_presentation_import_options )
@@ -229,6 +234,39 @@ class FileImportOptions( HydrusSerialisable.SerialisableBase ):
return ( 8, new_serialisable_info )
if version == 8:
( pre_import_options, post_import_options, serialisable_presentation_import_options, is_default ) = old_serialisable_info
( exclude_deleted, do_not_check_known_urls_before_importing, do_not_check_hashes_before_importing, allow_decompression_bombs, serialisable_filetype_filter_predicate, min_size, max_size, max_gif_size, min_resolution, max_resolution, serialisable_import_destination_location_context ) = pre_import_options
if do_not_check_hashes_before_importing:
preimport_hash_check_type = DO_NOT_CHECK
else:
preimport_hash_check_type = DO_CHECK_AND_MATCHES_ARE_DISPOSITIVE
if do_not_check_known_urls_before_importing:
preimport_url_check_type = DO_NOT_CHECK
else:
preimport_url_check_type = DO_CHECK
preimport_url_check_looks_for_neighbours = True
pre_import_options = ( exclude_deleted, preimport_hash_check_type, preimport_url_check_type, preimport_url_check_looks_for_neighbours, allow_decompression_bombs, serialisable_filetype_filter_predicate, min_size, max_size, max_gif_size, min_resolution, max_resolution, serialisable_import_destination_location_context )
new_serialisable_info = ( pre_import_options, post_import_options, serialisable_presentation_import_options, is_default )
return ( 9, new_serialisable_info )
def AllowsDecompressionBombs( self ):
@@ -332,16 +370,6 @@ class FileImportOptions( HydrusSerialisable.SerialisableBase ):
def DoNotCheckHashesBeforeImporting( self ):
return self._do_not_check_hashes_before_importing
def DoNotCheckKnownURLsBeforeImporting( self ):
return self._do_not_check_known_urls_before_importing
def ExcludesDeleted( self ):
return self._exclude_deleted
@@ -362,9 +390,19 @@ class FileImportOptions( HydrusSerialisable.SerialisableBase ):
return self._presentation_import_options
def GetPreImportHashCheckType( self ):
return self._preimport_hash_check_type
def GetPreImportURLCheckType( self ):
return self._preimport_url_check_type
def GetPreImportOptions( self ):
pre_import_options = ( self._exclude_deleted, self._do_not_check_known_urls_before_importing, self._do_not_check_hashes_before_importing, self._allow_decompression_bombs, self._min_size, self._max_size, self._max_gif_size, self._min_resolution, self._max_resolution )
pre_import_options = ( self._exclude_deleted, self._preimport_hash_check_type, self._preimport_url_check_type, self._allow_decompression_bombs, self._min_size, self._max_size, self._max_gif_size, self._min_resolution, self._max_resolution )
return pre_import_options
@@ -442,9 +480,9 @@ class FileImportOptions( HydrusSerialisable.SerialisableBase ):
return self._is_default
def SetDestinationLocationContext( self, location_context: ClientLocation.LocationContext ):
def PreImportURLCheckLooksForNeighbours( self ) -> bool:
self._import_destination_location_context = location_context.Duplicate()
return self._preimport_url_check_looks_for_neighbours
def SetAllowedSpecificFiletypes( self, mimes ) -> None:
@@ -454,6 +492,11 @@ class FileImportOptions( HydrusSerialisable.SerialisableBase ):
self._filetype_filter_predicate = ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_MIME, value = mimes )
def SetDestinationLocationContext( self, location_context: ClientLocation.LocationContext ):
self._import_destination_location_context = location_context.Duplicate()
def SetIsDefault( self, value: bool ) -> None:
self._is_default = value
@@ -466,16 +509,21 @@ class FileImportOptions( HydrusSerialisable.SerialisableBase ):
self._associate_source_urls = associate_source_urls
def SetPreImportURLCheckLooksForNeighbours( self, preimport_url_check_looks_for_neighbours: bool ):
self._preimport_url_check_looks_for_neighbours = preimport_url_check_looks_for_neighbours
def SetPresentationImportOptions( self, presentation_import_options: PresentationImportOptions.PresentationImportOptions ):
self._presentation_import_options = presentation_import_options
def SetPreImportOptions( self, exclude_deleted, do_not_check_known_urls_before_importing, do_not_check_hashes_before_importing, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution ):
def SetPreImportOptions( self, exclude_deleted, preimport_hash_check_type, preimport_url_check_type, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution ):
self._exclude_deleted = exclude_deleted
self._do_not_check_known_urls_before_importing = do_not_check_known_urls_before_importing
self._do_not_check_hashes_before_importing = do_not_check_hashes_before_importing
self._preimport_hash_check_type = preimport_hash_check_type
self._preimport_url_check_type = preimport_url_check_type
self._allow_decompression_bombs = allow_decompression_bombs
self._min_size = min_size
self._max_size = max_size

View File

@@ -156,6 +156,9 @@ class NoteImportOptions( HydrusSerialisable.SerialisableBase ):
return updatee_names_to_notes
# TODO: Add options to noteimportoptions to say whether we discard dupes in names_to_notes and empty notes and any other complex logic here
# we can have a complex-ish default, but someone is going to want different, so add options
existing_names_to_notes = dict( existing_names_to_notes )
names_and_notes = sorted( names_to_notes.items() )

View File

@@ -782,30 +782,25 @@ class LocationsManager( object ):
if data_type == HC.CONTENT_TYPE_FILES:
if action == HC.CONTENT_UPDATE_ADVANCED:
if action == HC.CONTENT_UPDATE_CLEAR_DELETE_RECORD:
( sub_action, hashes ) = row
if sub_action == 'delete_deleted':
if service_key in self._deleted:
if CC.TRASH_SERVICE_KEY not in self._current:
if service_key == CC.COMBINED_LOCAL_FILE_SERVICE_KEY:
if service_key == CC.COMBINED_LOCAL_FILE_SERVICE_KEY:
service_keys = HG.client_controller.services_manager.GetServiceKeys( ( HC.LOCAL_FILE_DOMAIN, HC.COMBINED_LOCAL_FILE ) )
else:
service_keys = ( service_key, )
service_keys = HG.client_controller.services_manager.GetServiceKeys( ( HC.LOCAL_FILE_DOMAIN, HC.COMBINED_LOCAL_FILE ) )
for service_key in service_keys:
else:
service_keys = ( service_key, )
for service_key in service_keys:
if service_key in self._deleted_to_timestamps:
if service_key in self._deleted_to_timestamps:
del self._deleted_to_timestamps[ service_key ]
self._deleted.discard( service_key )
del self._deleted_to_timestamps[ service_key ]
self._deleted.discard( service_key )

View File

@@ -1,3 +1,5 @@
import typing
from hydrus.core import HydrusSerialisable
from hydrus.core import HydrusTags
@@ -82,33 +84,61 @@ class TagSort( HydrusSerialisable.SerialisableBase ):
HydrusSerialisable.SERIALISABLE_TYPES_TO_OBJECT_TYPES[ HydrusSerialisable.SERIALISABLE_TYPE_TAG_SORT ] = TagSort
def SortTags( tag_sort: TagSort, list_of_tag_items, tag_items_to_count = None, item_to_tag_key_wrapper = None, item_to_sibling_key_wrapper = None ):
def lexicographic_key( tag ):
def lexicographic_key( tag ):
( namespace, subtag ) = HydrusTags.SplitTag( tag )
comparable_namespace = HydrusTags.ConvertTagToSortable( namespace )
comparable_subtag = HydrusTags.ConvertTagToSortable( subtag )
if namespace == '':
( namespace, subtag ) = HydrusTags.SplitTag( tag )
return ( comparable_subtag, comparable_subtag )
comparable_namespace = HydrusTags.ConvertTagToSortable( namespace )
comparable_subtag = HydrusTags.ConvertTagToSortable( subtag )
else:
if namespace == '':
return ( comparable_subtag, comparable_subtag )
else:
return ( comparable_namespace, comparable_subtag )
return ( comparable_namespace, comparable_subtag )
def subtag_lexicographic_key( tag ):
def subtag_lexicographic_key( tag ):
( namespace, subtag ) = HydrusTags.SplitTag( tag )
comparable_subtag = HydrusTags.ConvertTagToSortable( subtag )
return comparable_subtag
def namespace_key( tag ):
( namespace, subtag ) = HydrusTags.SplitTag( tag )
if namespace == '':
( namespace, subtag ) = HydrusTags.SplitTag( tag )
namespace = '{' # '{' is above 'z' in ascii, so this works for most situations
comparable_subtag = HydrusTags.ConvertTagToSortable( subtag )
return namespace
def namespace_lexicographic_key( tag ):
# '{' is above 'z' in ascii, so this works for most situations
( namespace, subtag ) = HydrusTags.SplitTag( tag )
if namespace == '':
return comparable_subtag
return ( '{', HydrusTags.ConvertTagToSortable( subtag ) )
else:
return ( namespace, HydrusTags.ConvertTagToSortable( subtag ) )
def SortTags( tag_sort: TagSort, list_of_tag_items: typing.List, tag_items_to_count = None, item_to_tag_key_wrapper = None, item_to_sibling_key_wrapper = None ):
def incidence_key( tag_item ):
@@ -122,34 +152,6 @@ def SortTags( tag_sort: TagSort, list_of_tag_items, tag_items_to_count = None, i
def namespace_key( tag ):
( namespace, subtag ) = HydrusTags.SplitTag( tag )
if namespace == '':
namespace = '{' # '{' is above 'z' in ascii, so this works for most situations
return namespace
def namespace_lexicographic_key( tag ):
# '{' is above 'z' in ascii, so this works for most situations
( namespace, subtag ) = HydrusTags.SplitTag( tag )
if namespace == '':
return ( '{', HydrusTags.ConvertTagToSortable( subtag ) )
else:
return ( namespace, HydrusTags.ConvertTagToSortable( subtag ) )
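# the '{' trick above is worth a worked example: '{' is 0x7B and 'z' is 0x7A, so mapping an empty
# namespace to '{' sends unnamespaced tags below namespaced ones in an ascending sort. a
# simplified sketch of the key (the real ones run through HydrusTags.SplitTag and
# ConvertTagToSortable):

def simple_namespace_lexicographic_key( tag ):
    
    if ':' in tag:
        
        return tuple( tag.split( ':', 1 ) )
        
    
    return ( '{', tag )
    

sorted( [ 'zebra', 'series:zelda', 'apple', 'character:link' ], key = simple_namespace_lexicographic_key )
# -> [ 'character:link', 'series:zelda', 'apple', 'zebra' ]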
sorts_to_do = []
if tag_sort.sort_type == SORT_BY_COUNT:

View File

@@ -18,7 +18,7 @@ class TagAutocompleteOptions( HydrusSerialisable.SerialisableBase ):
SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_TAG_AUTOCOMPLETE_OPTIONS
SERIALISABLE_NAME = 'Tag Autocomplete Options'
SERIALISABLE_VERSION = 4
SERIALISABLE_VERSION = 5
def __init__( self, service_key: typing.Optional[ bytes ] = None ):
@@ -45,6 +45,7 @@ class TagAutocompleteOptions( HydrusSerialisable.SerialisableBase ):
self._search_namespaces_into_full_tags = False
self._unnamespaced_search_gives_any_namespace_wildcards = False
self._namespace_bare_fetch_all_allowed = False
self._namespace_fetch_all_allowed = False
self._fetch_all_allowed = False
@@ -65,6 +66,7 @@ class TagAutocompleteOptions( HydrusSerialisable.SerialisableBase ):
self._override_write_autocomplete_location_context,
serialisable_write_autocomplete_location_context,
self._search_namespaces_into_full_tags,
self._unnamespaced_search_gives_any_namespace_wildcards,
self._namespace_bare_fetch_all_allowed,
self._namespace_fetch_all_allowed,
self._fetch_all_allowed,
@@ -83,6 +85,7 @@ class TagAutocompleteOptions( HydrusSerialisable.SerialisableBase ):
self._override_write_autocomplete_location_context,
serialisable_write_autocomplete_location_context,
self._search_namespaces_into_full_tags,
self._unnamespaced_search_gives_any_namespace_wildcards,
self._namespace_bare_fetch_all_allowed,
self._namespace_fetch_all_allowed,
self._fetch_all_allowed,
@@ -194,6 +197,40 @@ class TagAutocompleteOptions( HydrusSerialisable.SerialisableBase ):
return ( 4, new_serialisable_info )
if version == 4:
[
serialisable_service_key,
serialisable_write_autocomplete_tag_domain,
override_write_autocomplete_location_context,
serialisable_write_autocomplete_location_context,
search_namespaces_into_full_tags,
namespace_bare_fetch_all_allowed,
namespace_fetch_all_allowed,
fetch_all_allowed,
fetch_results_automatically,
exact_match_character_threshold
] = old_serialisable_info
unnamespaced_search_gives_any_namespace_wildcards = False
new_serialisable_info = [
serialisable_service_key,
serialisable_write_autocomplete_tag_domain,
override_write_autocomplete_location_context,
serialisable_write_autocomplete_location_context,
search_namespaces_into_full_tags,
unnamespaced_search_gives_any_namespace_wildcards,
namespace_bare_fetch_all_allowed,
namespace_fetch_all_allowed,
fetch_all_allowed,
fetch_results_automatically,
exact_match_character_threshold
]
return ( 5, new_serialisable_info )
def FetchAllAllowed( self ):
@@ -267,6 +304,11 @@ class TagAutocompleteOptions( HydrusSerialisable.SerialisableBase ):
return self._search_namespaces_into_full_tags
def UnnamespacedSearchGivesAnyNamespaceWildcards( self ) -> bool:
return self._unnamespaced_search_gives_any_namespace_wildcards
def SetExactMatchCharacterThreshold( self, exact_match_character_threshold: typing.Optional[ int ] ):
self._exact_match_character_threshold = exact_match_character_threshold
@@ -277,6 +319,11 @@ class TagAutocompleteOptions( HydrusSerialisable.SerialisableBase ):
self._fetch_results_automatically = fetch_results_automatically
def SetUnnamespacedSearchGivesAnyNamespaceWildcards( self, unnamespaced_search_gives_any_namespace_wildcards: bool ):
self._unnamespaced_search_gives_any_namespace_wildcards = unnamespaced_search_gives_any_namespace_wildcards
def SetTuple( self,
write_autocomplete_tag_domain: bytes,
override_write_autocomplete_location_context: bool,

View File

@@ -2583,8 +2583,8 @@ class HydrusResourceClientAPIRestrictedGetFilesFileMetadata( HydrusResourceClien
ipfs_service_keys = services_manager.GetServiceKeys( ( HC.IPFS, ) )
thumbnail_bounding_dimensions = HG.client_controller.options[ 'thumbnail_dimensions' ]
thumbnail_scale_type = HG.client_controller.new_options.GetInteger( 'thumbnail_scale_type' )
thumbnail_dpr_percent = HG.client_controller.new_options.GetInteger( 'thumbnail_dpr_percent' )
for media_result in media_results:
@@ -2612,7 +2612,7 @@ class HydrusResourceClientAPIRestrictedGetFilesFileMetadata( HydrusResourceClien
if width is not None and height is not None and width > 0 and height > 0:
( clip_rect, ( expected_thumbnail_width, expected_thumbnail_height ) ) = HydrusImageHandling.GetThumbnailResolutionAndClipRegion( ( width, height ), thumbnail_bounding_dimensions, thumbnail_scale_type )
( clip_rect, ( expected_thumbnail_width, expected_thumbnail_height ) ) = HydrusImageHandling.GetThumbnailResolutionAndClipRegion( ( width, height ), thumbnail_bounding_dimensions, thumbnail_scale_type, thumbnail_dpr_percent )
metadata_row[ 'thumbnail_width' ] = expected_thumbnail_width
metadata_row[ 'thumbnail_height' ] = expected_thumbnail_height

View File

@@ -2,6 +2,7 @@ import collections
import os
import threading
import time
import typing
import urllib.parse
from hydrus.core import HydrusConstants as HC
@@ -1470,6 +1471,29 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):
def NormaliseURLs( self, urls: typing.Collection[ str ] ) -> typing.List[ str ]:
normalised_urls = []
for url in urls:
try:
normalised_url = self.NormaliseURL( url )
except HydrusExceptions.URLClassException:
continue
normalised_urls.append( normalised_url )
normalised_urls = HydrusData.DedupeList( normalised_urls )
return normalised_urls
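# usage-wise, this collapses scheme and parameter dupes before any db lookups. a hypothetical
# run, assuming NormaliseURL maps the first two entries to the same https form and raises
# URLClassException on the third:
#
# NormaliseURLs( [ 'http://example.net/post/123?utm_source=feed', 'https://example.net/post/123', 'not a url' ] )
# -> [ 'https://example.net/post/123' ]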
def OverwriteDefaultGUGs( self, gug_names ):
with self._lock:

View File

@@ -83,7 +83,7 @@ options = {}
# Misc
NETWORK_VERSION = 20
SOFTWARE_VERSION = 510
SOFTWARE_VERSION = 511
CLIENT_API_VERSION = 38
SERVER_THUMBNAIL_DIMENSIONS = ( 200, 200 )

View File

@@ -1861,20 +1861,7 @@ class ContentUpdate( object ):
if self._data_type == HC.CONTENT_TYPE_FILES:
if self._action == HC.CONTENT_UPDATE_ADVANCED:
( sub_action, possible_hashes ) = self._row
if possible_hashes is None:
hashes = set()
else:
hashes = possible_hashes
elif self._action == HC.CONTENT_UPDATE_ADD:
if self._action == HC.CONTENT_UPDATE_ADD:
( file_info_manager, timestamp ) = self._row
@@ -1884,6 +1871,11 @@ class ContentUpdate( object ):
hashes = self._row
if hashes is None:
hashes = set()
elif self._data_type == HC.CONTENT_TYPE_DIRECTORIES:

View File

@@ -843,13 +843,21 @@ thumbnail_scale_str_lookup = {
THUMBNAIL_SCALE_TO_FILL : 'scale to fill'
}
def GetThumbnailResolutionAndClipRegion( image_resolution, bounding_dimensions, thumbnail_scale_type: int ):
def GetThumbnailResolutionAndClipRegion( image_resolution: typing.Tuple[ int, int ], bounding_dimensions: typing.Tuple[ int, int ], thumbnail_scale_type: int, thumbnail_dpr_percent: int ):
clip_rect = None
( im_width, im_height ) = image_resolution
( bounding_width, bounding_height ) = bounding_dimensions
if thumbnail_dpr_percent != 100:
thumbnail_dpr = thumbnail_dpr_percent / 100
bounding_height = int( bounding_height * thumbnail_dpr )
bounding_width = int( bounding_width * thumbnail_dpr )
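# a quick worked example of the scaling step, assuming the default 200x200 bounding box and the
# new supersampling option at 200%:
#
# effective bounding box: 400x400
# a 350x500 source under THUMBNAIL_SCALE_DOWN_ONLY:
#   100% dpr -> 140x200 thumbnail
#   200% dpr -> 280x400 thumbnail, still drawn at 140x200 logical pixels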
if thumbnail_scale_type == THUMBNAIL_SCALE_DOWN_ONLY:
if bounding_width >= im_width and bounding_height >= im_height:

View File

@@ -268,32 +268,57 @@ class SerialisableBase( object ):
return ( self.SERIALISABLE_TYPE, self.SERIALISABLE_VERSION, serialisable_info )
def InitialiseFromSerialisableInfo( self, version, serialisable_info, raise_error_on_future_version = False ):
def InitialiseFromSerialisableInfo( self, original_version, serialisable_info, raise_error_on_future_version = False ):
if version > self.SERIALISABLE_VERSION:
object_is_newer = original_version > self.SERIALISABLE_VERSION
if object_is_newer:
if raise_error_on_future_version:
message = 'Unfortunately, an object of type {} could not be loaded because it was created in a client that uses an updated version of that object! This client supports versions up to {}, but the object was version {}.'.format( self.SERIALISABLE_NAME, self.SERIALISABLE_VERSION, version )
message = 'Unfortunately, an object of type {} could not be loaded because it was created in a client/server that uses an updated version of that object! We support up to version {}, but the object was version {}.'.format( self.SERIALISABLE_NAME, self.SERIALISABLE_VERSION, original_version )
message += os.linesep * 2
message += 'Please update your client to import this object.'
message += 'Please update your client/server to import this object.'
raise HydrusExceptions.SerialisationException( message )
else:
message = 'An object of type {} was created in a client that uses an updated version of that object! This client supports versions up to {}, but the object was version {}. For now, the client will try to continue work, but things may break. If you know why this has occurred, please correct it. If you do not, please let hydrus dev know.'.format( self.SERIALISABLE_NAME, self.SERIALISABLE_VERSION, version )
message = 'An object of type {} was created in a client/server that uses an updated version of that object! We support versions up to {}, but the object was version {}. For now, we will try to continue work, but things may break. If you know why this has occurred, please correct it. If you do not, please let hydrus dev know.'.format( self.SERIALISABLE_NAME, self.SERIALISABLE_VERSION, original_version )
HydrusData.ShowText( message )
while version < self.SERIALISABLE_VERSION:
try:
( version, serialisable_info ) = self._UpdateSerialisableInfo( version, serialisable_info )
current_version = original_version
while current_version < self.SERIALISABLE_VERSION:
( current_version, serialisable_info ) = self._UpdateSerialisableInfo( current_version, serialisable_info )
except:
raise HydrusExceptions.SerialisationException( 'Could not update this object of type {} from version {} to {}!'.format( self.SERIALISABLE_NAME, original_version, self.SERIALISABLE_VERSION ) )
self._InitialiseFromSerialisableInfo( serialisable_info )
try:
self._InitialiseFromSerialisableInfo( serialisable_info )
except:
if object_is_newer:
raise HydrusExceptions.SerialisationException( 'An object of type {} was created in a client/server that uses an updated version of that object! We support versions up to {}, but the object was version {}. I tried to load it, but the initialisation failed. You probably need to update your client/server.'.format( self.SERIALISABLE_NAME, self.SERIALISABLE_VERSION, original_version ) )
else:
raise HydrusExceptions.SerialisationException( 'Could not initialise this object of type {}!'.format( self.SERIALISABLE_NAME ) )
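The while loop above is the whole migration story: an object saved at any older version is stepped forward one version at a time until it matches the current class. A toy sketch of a subclass participating in that protocol, with hypothetical fields:

class ToyOptions( SerialisableBase ):
    
    SERIALISABLE_NAME = 'Toy Options'
    SERIALISABLE_VERSION = 2
    
    def _GetSerialisableInfo( self ):
        
        return ( self._some_flag, self._new_field )
        
    
    def _InitialiseFromSerialisableInfo( self, serialisable_info ):
        
        ( self._some_flag, self._new_field ) = serialisable_info
        
    
    def _UpdateSerialisableInfo( self, version, old_serialisable_info ):
        
        if version == 1:
            
            ( some_flag, ) = old_serialisable_info
            
            new_field = False # back-fill a sensible default, like the v8->v9 import options update
            
            return ( 2, ( some_flag, new_field ) )
            
        
    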

View File

@@ -191,7 +191,7 @@ def ParseFileArguments( path, decompression_bombs_ok = False ):
     bounding_dimensions = HC.SERVER_THUMBNAIL_DIMENSIONS
     
-    ( clip_rect, target_resolution ) = HydrusImageHandling.GetThumbnailResolutionAndClipRegion( ( width, height ), bounding_dimensions, HydrusImageHandling.THUMBNAIL_SCALE_DOWN_ONLY )
+    ( clip_rect, target_resolution ) = HydrusImageHandling.GetThumbnailResolutionAndClipRegion( ( width, height ), bounding_dimensions, HydrusImageHandling.THUMBNAIL_SCALE_DOWN_ONLY, 100 )
     
     thumbnail_bytes = HydrusFileHandling.GenerateThumbnailBytes( path, target_resolution, mime, duration, num_frames, clip_rect = clip_rect )
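
This is the server's call site for the new fourth parameter: the server has no UI and hence no UI scale, so it passes a flat 100, i.e. 100% device pixel ratio. As a rough sketch of what that percentage plausibly does (the helper name below is hypothetical, and the real `GetThumbnailResolutionAndClipRegion` also handles clipping and several scale types), the bounding box gets multiplied up before the thumbnail is generated:

```python
# hypothetical helper illustrating thumbnail supersampling by a DPR percent;
# not the real GetThumbnailResolutionAndClipRegion
def get_supersampled_bounding_dimensions( bounding_dimensions, dpr_percent ):
    
    ( width, height ) = bounding_dimensions
    
    # at 100% this is a no-op; at 200% a (200, 200) bounding box becomes
    # (400, 400), so the thumbnail stays crisp after the UI scales it back up
    multiplier = dpr_percent / 100
    
    return ( int( width * multiplier ), int( height * multiplier ) )
    

print( get_supersampled_bounding_dimensions( ( 200, 200 ), 100 ) ) # (200, 200)
print( get_supersampled_bounding_dimensions( ( 200, 200 ), 250 ) ) # (500, 500)
```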

View File

@ -584,19 +584,6 @@ class HydrusResourceRestrictedAccountModifyExpires( HydrusResourceRestrictedAcco
             subject_account_key = request.parsed_request_args[ 'subject_account_key' ]
             
-        elif 'subject_identifier' in request.parsed_request_args:
-            
-            subject_identifier = request.parsed_request_args[ 'subject_identifier' ]
-            
-            if subject_identifier.HasAccountKey():
-                
-                subject_account_key = subject_identifier.GetAccountKey()
-                
-            else:
-                
-                raise HydrusExceptions.BadRequestException( 'The subject\'s account identifier did not include an account id!' )
-                
         else:
             
             raise HydrusExceptions.BadRequestException( 'I was expecting an account id, but did not get one!' )
@ -634,19 +621,6 @@ class HydrusResourceRestrictedAccountModifySetMessage( HydrusResourceRestrictedA
             subject_account_key = request.parsed_request_args[ 'subject_account_key' ]
             
-        elif 'subject_identifier' in request.parsed_request_args:
-            
-            subject_identifier = request.parsed_request_args[ 'subject_identifier' ]
-            
-            if subject_identifier.HasAccountKey():
-                
-                subject_account_key = subject_identifier.GetAccountKey()
-                
-            else:
-                
-                raise HydrusExceptions.BadRequestException( 'The subject\'s account identifier did not include an account id!' )
-                
         else:
             
             raise HydrusExceptions.BadRequestException( 'I was expecting an account id, but did not get one!' )
@ -679,19 +653,6 @@ class HydrusResourceRestrictedAccountModifyUnban( HydrusResourceRestrictedAccoun
             subject_account_key = request.parsed_request_args[ 'subject_account_key' ]
             
-        elif 'subject_identifier' in request.parsed_request_args:
-            
-            subject_identifier = request.parsed_request_args[ 'subject_identifier' ]
-            
-            if subject_identifier.HasAccountKey():
-                
-                subject_account_key = subject_identifier.GetAccountKey()
-                
-            else:
-                
-                raise HydrusExceptions.BadRequestException( 'The subject\'s account identifier did not include an account id!' )
-                
         else:
             
             raise HydrusExceptions.BadRequestException( 'I was expecting an account id, but did not get one!' )
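
The same block is deleted from all three account-modify resources: the content-based `subject_identifier` fallback is gone, and each endpoint now accepts only an explicit account key. Reduced to plain Python, the surviving parse logic in each handler is just the following sketch, with a plain dict standing in for the request's parsed args:

```python
# a sketch of the surviving argument parsing, shared by all three handlers;
# a plain dict stands in for request.parsed_request_args
def get_subject_account_key( parsed_request_args ):
    
    if 'subject_account_key' in parsed_request_args:
        
        return parsed_request_args[ 'subject_account_key' ]
        
    else:
        
        raise ValueError( 'I was expecting an account id, but did not get one!' )
        

print( get_subject_account_key( { 'subject_account_key' : 'abcd1234' } ) )
# abcd1234
```

With three copies of the old branch removed, the natural follow-up dedup would be to hoist a helper like this into the shared superclass, though the diff shown here is pure deletion.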

View File

@ -3354,10 +3354,10 @@ class TestClientAPI( unittest.TestCase ):
         if file_info_manager.mime in HC.MIMES_WITH_THUMBNAILS:
             
             bounding_dimensions = HG.test_controller.options[ 'thumbnail_dimensions' ]
             thumbnail_scale_type = HG.test_controller.new_options.GetInteger( 'thumbnail_scale_type' )
+            thumbnail_dpr_percent = HG.client_controller.new_options.GetInteger( 'thumbnail_dpr_percent' )
             
-            ( clip_rect, ( thumbnail_expected_width, thumbnail_expected_height ) ) = HydrusImageHandling.GetThumbnailResolutionAndClipRegion( ( file_info_manager.width, file_info_manager.height ), bounding_dimensions, thumbnail_scale_type )
+            ( clip_rect, ( thumbnail_expected_width, thumbnail_expected_height ) ) = HydrusImageHandling.GetThumbnailResolutionAndClipRegion( ( file_info_manager.width, file_info_manager.height ), bounding_dimensions, thumbnail_scale_type, thumbnail_dpr_percent )
             
             metadata_row[ 'thumbnail_width' ] = thumbnail_expected_width
             metadata_row[ 'thumbnail_height' ] = thumbnail_expected_height

View File

@ -673,7 +673,9 @@ class TestClientDBDuplicates( unittest.TestCase ):
         self.assertEqual( len( file_duplicate_types_to_counts ), 3 )
         
-        self.assertEqual( file_duplicate_types_to_counts[ HC.DUPLICATE_POTENTIAL ], self._get_group_potential_count( file_duplicate_types_to_counts ) )
+        expected = self._get_group_potential_count( file_duplicate_types_to_counts )
+        self.assertIn( file_duplicate_types_to_counts[ HC.DUPLICATE_POTENTIAL ], ( expected, expected - 1 ) )
         
         self.assertEqual( file_duplicate_types_to_counts[ HC.DUPLICATE_MEMBER ], len( self._our_main_dupe_group_hashes ) - 1 )
         self.assertEqual( file_duplicate_types_to_counts[ HC.DUPLICATE_FALSE_POSITIVE ], 1 )
@ -685,7 +687,9 @@ class TestClientDBDuplicates( unittest.TestCase ):
         self.assertEqual( len( file_duplicate_types_to_counts ), 3 )
         
-        self.assertEqual( file_duplicate_types_to_counts[ HC.DUPLICATE_POTENTIAL ], self._get_group_potential_count( file_duplicate_types_to_counts ) )
+        expected = self._get_group_potential_count( file_duplicate_types_to_counts )
+        self.assertIn( file_duplicate_types_to_counts[ HC.DUPLICATE_POTENTIAL ], ( expected, expected - 1 ) )
         
         self.assertEqual( file_duplicate_types_to_counts[ HC.DUPLICATE_MEMBER ], len( self._our_fp_dupe_group_hashes ) - 1 )
         self.assertEqual( file_duplicate_types_to_counts[ HC.DUPLICATE_FALSE_POSITIVE ], 1 )
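
Both assertions move from pinning an exact potential-pair count to accepting an off-by-one range, presumably because the count can legitimately land on either value depending on how the potential pairs are grouped. In isolation, the unittest idiom is simply membership in the accepted set:

```python
import unittest

# the tolerance idiom above in isolation: accept either of two adjacent counts
# rather than pinning one exact number
class ToleranceExample( unittest.TestCase ):
    
    def test_count_within_tolerance( self ):
        
        expected = 5 # stand-in for self._get_group_potential_count( ... )
        actual = 4 # stand-in for file_duplicate_types_to_counts[ HC.DUPLICATE_POTENTIAL ]
        
        self.assertIn( actual, ( expected, expected - 1 ) )
        
    

if __name__ == '__main__':
    
    unittest.main()
```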

View File

@ -562,8 +562,6 @@ class TestClientDBTags( unittest.TestCase ):
         cls._db = ClientDB.DB( HG.test_controller, TestController.DB_DIR, 'client' )
         
-        HG.test_controller.SetRead( 'hash_status', ClientImportFiles.FileImportStatus.STATICGetUnknownStatus() )
         
     @classmethod
     def tearDownClass( cls ):
@ -1057,6 +1055,8 @@ class TestClientDBTags( unittest.TestCase ):
         # import a file
         
+        HG.test_controller.SetRead( 'hash_status', ClientImportFiles.FileImportStatus.STATICGetUnknownStatus() )
         
         path = os.path.join( HC.STATIC_DIR, 'testing', 'muh_jpg.jpg' )
         
         file_import_options = FileImportOptions.FileImportOptions()
@ -1156,6 +1156,8 @@ class TestClientDBTags( unittest.TestCase ):
         # import a file
         
+        HG.test_controller.SetRead( 'hash_status', ClientImportFiles.FileImportStatus.STATICGetUnknownStatus() )
         
         path = os.path.join( HC.STATIC_DIR, 'testing', 'muh_jpg.jpg' )
         
         file_import_options = FileImportOptions.FileImportOptions()
@ -1256,6 +1258,8 @@ class TestClientDBTags( unittest.TestCase ):
         # import a file
         
+        HG.test_controller.SetRead( 'hash_status', ClientImportFiles.FileImportStatus.STATICGetUnknownStatus() )
         
         path = os.path.join( HC.STATIC_DIR, 'testing', 'muh_jpg.jpg' )
         
         file_import_options = FileImportOptions.FileImportOptions()
@ -1358,6 +1362,8 @@ class TestClientDBTags( unittest.TestCase ):
         # import a file
         
+        HG.test_controller.SetRead( 'hash_status', ClientImportFiles.FileImportStatus.STATICGetUnknownStatus() )
         
         path = os.path.join( HC.STATIC_DIR, 'testing', 'muh_jpg.jpg' )
         
         file_import_options = FileImportOptions.FileImportOptions()
@ -1459,6 +1465,8 @@ class TestClientDBTags( unittest.TestCase ):
         # import a file
         
+        HG.test_controller.SetRead( 'hash_status', ClientImportFiles.FileImportStatus.STATICGetUnknownStatus() )
         
         path = os.path.join( HC.STATIC_DIR, 'testing', 'muh_jpg.jpg' )
         
         file_import_options = FileImportOptions.FileImportOptions()
@ -2580,6 +2588,8 @@ class TestClientDBTags( unittest.TestCase ):
         for filename in ( 'muh_jpg.jpg', 'muh_png.png', 'muh_apng.png' ):
             
+            HG.test_controller.SetRead( 'hash_status', ClientImportFiles.FileImportStatus.STATICGetUnknownStatus() )
             
             path = os.path.join( HC.STATIC_DIR, 'testing', filename )
             
             file_import_job = ClientImportFiles.FileImportJob( path, file_import_options )
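
All of these hunks are the same move: the 'hash_status' read stub leaves setUpClass and is registered inside each test right before it imports a file, so every import starts from a fresh 'unknown file' answer regardless of what earlier tests did to the stub. A toy version of this SetRead-style stubbing, with illustrative names rather than the real TestController:

```python
# a toy version of the SetRead stubbing used above; illustrative, not the real
# TestController
class StubController:
    
    def __init__( self ):
        
        self._canned_reads = {}
        
    
    def SetRead( self, name, value ):
        
        # whatever next asks to 'read' this name from the db gets this answer
        self._canned_reads[ name ] = value
        
    
    def Read( self, name, *args, **kwargs ):
        
        return self._canned_reads[ name ]
        
    

controller = StubController()

controller.SetRead( 'hash_status', 'unknown file' ) # stand-in for FileImportStatus.STATICGetUnknownStatus()

print( controller.Read( 'hash_status' ) ) # unknown file
```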

View File

@ -211,8 +211,8 @@ class TestFileImportOptions( unittest.TestCase ):
         file_import_options = FileImportOptions.FileImportOptions()
         
         exclude_deleted = False
-        do_not_check_known_urls_before_importing = False
-        do_not_check_hashes_before_importing = False
+        preimport_hash_check_type = FileImportOptions.DO_CHECK_AND_MATCHES_ARE_DISPOSITIVE
+        preimport_url_check_type = FileImportOptions.DO_CHECK
         allow_decompression_bombs = False
         min_size = None
         max_size = None
@ -220,7 +220,7 @@ class TestFileImportOptions( unittest.TestCase ):
         min_resolution = None
         max_resolution = None
         
-        file_import_options.SetPreImportOptions( exclude_deleted, do_not_check_known_urls_before_importing, do_not_check_hashes_before_importing, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
+        file_import_options.SetPreImportOptions( exclude_deleted, preimport_hash_check_type, preimport_url_check_type, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
         
         automatic_archive = False
         associate_primary_urls = False
@ -243,7 +243,7 @@ class TestFileImportOptions( unittest.TestCase ):
         exclude_deleted = True
         
-        file_import_options.SetPreImportOptions( exclude_deleted, do_not_check_known_urls_before_importing, do_not_check_hashes_before_importing, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
+        file_import_options.SetPreImportOptions( exclude_deleted, preimport_hash_check_type, preimport_url_check_type, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
         
         self.assertTrue( file_import_options.ExcludesDeleted() )
         self.assertFalse( file_import_options.AllowsDecompressionBombs() )
@ -253,7 +253,7 @@ class TestFileImportOptions( unittest.TestCase ):
         allow_decompression_bombs = True
         
-        file_import_options.SetPreImportOptions( exclude_deleted, do_not_check_known_urls_before_importing, do_not_check_hashes_before_importing, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
+        file_import_options.SetPreImportOptions( exclude_deleted, preimport_hash_check_type, preimport_url_check_type, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
         
         self.assertTrue( file_import_options.ExcludesDeleted() )
         self.assertTrue( file_import_options.AllowsDecompressionBombs() )
@ -277,7 +277,7 @@ class TestFileImportOptions( unittest.TestCase ):
         min_size = 4096
         
-        file_import_options.SetPreImportOptions( exclude_deleted, do_not_check_known_urls_before_importing, do_not_check_hashes_before_importing, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
+        file_import_options.SetPreImportOptions( exclude_deleted, preimport_hash_check_type, preimport_url_check_type, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
         
         file_import_options.CheckFileIsValid( 65536, HC.IMAGE_JPEG, 640, 480 )
@ -291,7 +291,7 @@ class TestFileImportOptions( unittest.TestCase ):
         min_size = None
         max_size = 2000
         
-        file_import_options.SetPreImportOptions( exclude_deleted, do_not_check_known_urls_before_importing, do_not_check_hashes_before_importing, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
+        file_import_options.SetPreImportOptions( exclude_deleted, preimport_hash_check_type, preimport_url_check_type, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
         
         file_import_options.CheckFileIsValid( 1800, HC.IMAGE_JPEG, 640, 480 )
@ -305,7 +305,7 @@ class TestFileImportOptions( unittest.TestCase ):
         max_size = None
         max_gif_size = 2000
         
-        file_import_options.SetPreImportOptions( exclude_deleted, do_not_check_known_urls_before_importing, do_not_check_hashes_before_importing, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
+        file_import_options.SetPreImportOptions( exclude_deleted, preimport_hash_check_type, preimport_url_check_type, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
         
         file_import_options.CheckFileIsValid( 1800, HC.IMAGE_JPEG, 640, 480 )
         file_import_options.CheckFileIsValid( 2200, HC.IMAGE_JPEG, 640, 480 )
@ -322,7 +322,7 @@ class TestFileImportOptions( unittest.TestCase ):
         max_gif_size = None
         min_resolution = ( 200, 100 )
         
-        file_import_options.SetPreImportOptions( exclude_deleted, do_not_check_known_urls_before_importing, do_not_check_hashes_before_importing, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
+        file_import_options.SetPreImportOptions( exclude_deleted, preimport_hash_check_type, preimport_url_check_type, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
         
         file_import_options.CheckFileIsValid( 65536, HC.IMAGE_JPEG, 640, 480 )
@ -343,7 +343,7 @@ class TestFileImportOptions( unittest.TestCase ):
         min_resolution = None
         max_resolution = ( 3000, 4000 )
         
-        file_import_options.SetPreImportOptions( exclude_deleted, do_not_check_known_urls_before_importing, do_not_check_hashes_before_importing, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
+        file_import_options.SetPreImportOptions( exclude_deleted, preimport_hash_check_type, preimport_url_check_type, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
         
         file_import_options.CheckFileIsValid( 65536, HC.IMAGE_JPEG, 640, 480 )
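
Throughout this file, the two old booleans are replaced by check-type constants, so the pre-import hash and url checks can each be off, advisory, or dispositive rather than simply on or off. `DO_CHECK` and `DO_CHECK_AND_MATCHES_ARE_DISPOSITIVE` come straight from the diff; the `DO_NOT_CHECK` constant, the values, and the dispatch below are assumptions sketched for illustration, not the real `FileImportOptions` logic:

```python
# illustrative constants and dispatch; DO_CHECK and
# DO_CHECK_AND_MATCHES_ARE_DISPOSITIVE appear in the diff, the rest is assumed
DO_NOT_CHECK = 0
DO_CHECK = 1
DO_CHECK_AND_MATCHES_ARE_DISPOSITIVE = 2

def match_settles_the_import( check_type, match_found ):
    
    if check_type == DO_NOT_CHECK:
        
        return False
        
    # under DO_CHECK a match is advisory and other checks still run; under
    # 'dispositive', a match decides the question on its own
    return match_found and check_type == DO_CHECK_AND_MATCHES_ARE_DISPOSITIVE
    

print( match_settles_the_import( DO_CHECK, True ) ) # False
print( match_settles_the_import( DO_CHECK_AND_MATCHES_ARE_DISPOSITIVE, True ) ) # True
```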