Version 512

@@ -48,6 +48,8 @@ crash.log
/db/*.key
/db/*.log
/db/*.conf
/db/client_running
/db/server_running
/db/client_files/
/db/server_files/
/db/missing_and_invalid_files/
@@ -7,6 +7,61 @@ title: Changelog
!!! note
    This is the new changelog, only the most recent builds. For all versions, see the [old changelog](old_changelog.html).

## [Version 512](https://github.com/hydrusnetwork/hydrus/releases/tag/v512)

### two searches in duplicates

* the duplicate filter page now lets you search 'one file is in this search, the other is in this search'! the only real limitation is both searches are locked to the same file domain
* the main neat thing is you can now search 'pngs vs jpegs, and must be pixel dupes' super easy. this is the first concrete step towards my plan to introduce an optional duplicate auto resolution system (png/jpeg pixel dupes is easy--the jpeg is 99.9999% always better)
* the database tech to get this working was actually simpler than 'one file matches the search', and in testing it works at _ok_ speed, so we'll see how this goes IRL
* duplicate calculations should be faster in some simple cases, usually when you set a search to system:everything. this extends to the new two-search mode too (e.g. a two-search with one as system:everything is just a one-search, and the system optimises for this), however I also search complicated domains much more precisely now, which may make some duplicate search stuff work real slow. again, let me know!

### sidecars

* the txt importer/exporter sidecars now allow custom 'separators', so if you don't want newlines, you can use ', ' or whatever format you need

### misc

* when you right-click on a selection of thumbs, the 'x files' can now be 'x videos' or 'x pngs' etc., as you see on the status bar
* when you select or right-click on a selection of thumbs that all have duration, the status bar and menu now show the total duration of your selection. same deal on the status bar if you have no selection on a page of only duration-having media
* thanks to the user who figured out the correct render flag, the new 'thumbnail ui-scale supersampling %' option now draws non-pixelly thumbs on 100% monitors when it is set higher (e.g. 200% thumbs drawing on a 100% monitor), so users with unusual multi-monitor setups etc. should have a nicer experience. as the tooltip now says, this setting should now be set to the largest UI scale you have
* I removed the newgrounds downloader from the defaults (this only affects new users). the downloader has been busted for a while, and last time I looked, it was not trivial to figure out, so I am removing myself from the question
* the 'manage where tag siblings and parents apply' dialog now explicitly points users to the 'review current sync' panel

### client api

* a new command, /manage_pages/refresh_page, refreshes the specified page
* the help is updated to talk about this
* client api version is now 39

### server management

* in the 'modify accounts' dialog, if the null account is checked when you try to do an action, it will be unchecked. this should stop the annoying 400 Errors when you accidentally try to set it to something
* also, if you do 'add to expires', any accounts that currently do not expire will be deselected before the action too, with a brief dialog note about it

### other duplicates improvements

* I reworked a ton of code here, fixing a heap of logic and general 'that isn't quite what you'd expect' comparison selection issues. ideally, the system will just make more obvious human sense more often, but this tech gets a little complicated as it tries to select comparison kings from larger groups, and we might have some situations where it says '3 pairs', but when you load it in the filter it says 'no pairs found m8', so let me know how it goes!
* first, most importantly, the 'show some random potential pairs' button is vastly improved. it is now much better about limiting the group of presented files to what you specifically have searched, and the 'pixel dupes' and 'search distance' settings are obeyed properly (previously it was fetching too many potentials, not always limiting to the search you set, and choosing candidates from larger groups too liberally)
* while it shows smaller groups now, since they are all culled better, it _should_ select larger groups more often than before
* when you say 'show some random potential pairs' with 'at least one file matches the search', the first file displayed, which is the 'master' that the other file(s) are paired against, now always matches the search. when you are set to the new two-search 'files match different searches', the master will always match the first search, and the others of the pairs will always match the second search. in the filter itself, some similar logic applies, so the files selected for actual comparison should match the search you inputted better
* setting duplicates with 'custom options' from the thumbnail menu and selecting 'this is better' now correctly sets the focused media as the best. previously it set the first file as the best
* also, in the duplicate merge options, you can now set notes to 'move' from worse to better
* as a side thing, the 'search distance' number control is now disabled if you select 'must be pixel dupes'. duh!

### boring cleanup

* refactored the duplicate comparison statement generation code from ClientMedia to ClientDuplicates
* significantly refactored all the duplicate files calculation pipelines to deal with two file search contexts
* cleaned up a bunch of the 'find potential duplicate pairs in this file domain' master table join code. less hardcoding, more dynamic assembly
* refactored the duplicated 'figure out pixel dupes table join gubbins' code in the file duplicates database module into a single separate method, and rolled in the base initialisation and hamming distance part into it too, clearing out more duplicated code
* split up the 'both files match' search code into separate methods to further clean the logic here
* updated the main object that handles page data to the new serialisable dictionary, combining its hardcoded key/primitive/serialisable storage into one clean dict that looks after itself
* cleaned up the type definitions of the main database file search and fixed the erroneous empty set returns
* I added a couple unit tests for the new .txt sidecar separator
* fixed a bad sidecar unit test
* 'client_running' and 'server_running' are now in the .gitignore

## [Version 511](https://github.com/hydrusnetwork/hydrus/releases/tag/v511)

### thumbnail UI scaling
@@ -429,50 +484,3 @@ title: Changelog
* updated most of the actions in the build script to use updated node16 versions. node12 just started getting deprecation warnings. there is more work to do
* replaced the node12 pip installer action with a manual command on the reworked requirements.txts
* replaced most of the build script's uses of 'set-output', which just started getting deprecation warnings. there is more work to do

## [Version 502](https://github.com/hydrusnetwork/hydrus/releases/tag/v502)

### autocomplete dropdown

* the floating version of the autocomplete dropdown gets the same backend treatment the media hovers and the popup toaster recently received--it is no longer its own window, but now a normal widget floating inside its parent. it should look pretty much the same, but a variety of bugs are eliminated. clients with many search pages open now only have one top level window, rather than potentially hundreds of hidden ones
* if you have turned off floating a/c windows because of graphical bugs, please try turning them back on today. the checkbox is under _options->search_
* as an additional consequence, I have decided to no longer allow 'floating' autocomplete windows in dialogs. I never liked how this worked or looked, overlapping the apply/cancel buttons, and it is not technically possible to make this work with the new tech, so they are always embedded in dialogs now. the related checkbox in _options->search_ is gone as a result
* if you ok or cancel on the 'OR' buttons, focus is now preserved back to the dropdown
* a bunch of weird interwindow-focus-juggling and 'what happens if the user's window manager allows them to close a floating a/c dropdown'-style code is cleared out. with simpler logic, some flicker jank is simply eliminated
* if you move the window around, any displaying floating a/c dropdowns now glide along with it; previously they updated at 10fps
* the way the client swaps a new thumbnail grid in when results are loaded or dismissed is faster and more atomic. there is less focus-cludge, and as a result the autocomplete is better at retaining focus and staying displayed as changes to the search state occur
* the way scroll events are caught is also improved, so the floating dropdown should fix its position on scroll more smoothly and capably

### date system predicates

* _this affects system:import time, :modified time, and :last viewed_
* updated the system:time UI for time delta so you are choosing 'before', 'since', and '+/- 15% of'
* updated the system:time UI for calendar date so you are choosing 'before', 'since', 'the day of', and '+/- a month of' rather than the ugly and awkward '<' stuff
* updated the calendar calculations for calendar time-based system predicates, so the '~=' operator now does plus or minus one month to the same calendar day, no matter how many days were in that month (previously it did +/- 30 days)
* the system predicate parser now reassigns the '=' in a given 'system:time_type = time_delta' to '~='

### misc

* 'sort files by import time' now sorts files correctly even when two files were imported in the same second. thanks to the user who thought of the solution here!
* the 'recent' system predicates you see listed in the 'flesh out system pred' dialogs now have an 'X' button that lets you remove them from the recent/favourites
* fixed the crash that I disabled some code for last week, and reactivated the code. the collect-by dropdown is back to refreshing itself whenever you change the settings in _options->sort/collect_. furthermore, this guy now spams less behind the scenes, only reinitialising if there are actual changes to the sort/collect settings
* brushed up some network content-range checking logic. this data is tracked better, and now any time a given 206 range response has insufficient data for what its header said, this is noted in the log. it doesn't raise an error, and the network job will still try to resume from the truncated point, but let's see how widespread this is. if a server delivers _more_ data than specified, this now does raise an error
* fixed a tiny bit of logic in how the server calculates changes in sibling and parent petition counts. I am not sure if I fixed the miscount the janitors have seen
* if a janitor asks for a petition and the current petition count for that type is miscounted, leading to a 404, the server now quickly recalculates that number for the next request
* updated the system predicate parser to replace all underscores with whitespace, so it can accept system predicates that use_underscores_instead_of_whitespace. I don't _think_ this messes up any of the parsing except in an odd case where a file service might have an underscore'd name, but we'll cross that bridge if and when we get to it
* added information about 'PRAGMA quick_check;' to 'help my db is broke.txt'
* patched a unit test that would rarely fail because of random data (issue #1217)

### client api

* /get_files/search_files:
* fixed the recent bug where an empty tag input with 'search all' permission would raise an error. entering no search predicates now returns an empty list in all cases, no matter your permissions (issue #1250)
* entering invalid tags now raises a 400 error
* improved the tag permissions check. only non-wildcard tags are now tested against the filter
* updated my unit tests to catch these cases
* /add_tags/search_tags:
* a unit test now explicitly tests that empty autocomplete input results in no tags
* the Client API now responds with Access-Control-Max-Age=86400 on OPTIONS checks, which should reduce some CORS pre-flight spam
* client api version is now 34

### misc cleanup

* cleaned up the signalling code in the 'recent system predicate' buttons
* shuffled some page widget and layout code to make the embedded a/c dropdown work
* deleted a bunch of a/c event handling and forced layout and other garbage code
* worked on some linter warnings
@@ -1329,6 +1329,33 @@ Response:
:   200 with no content. If the page key is not found, this will 404.

### **POST `/manage_pages/refresh_page`** { id="manage_pages_refresh_page" }

_Refresh a page in the main GUI. Like hitting F5 in the client, this obviously makes file search pages perform their search again, but for other page types it will force the currently in-view files to be re-sorted._

Restricted access:
:   YES. Manage Pages permission needed.

Required Headers:
:
    * `Content-Type`: application/json

Arguments (in JSON):
:
    * `page_key`: (the page key for the page you wish to refresh)

The page key is the same as fetched in the [/manage\_pages/get\_pages](#manage_pages_get_pages) call. If a file search page is not set to 'searching immediately', a 'refresh' command does nothing.

```json title="Example request body"
{
    "page_key" : "af98318b6eece15fef3cf0378385ce759bfe056916f6e12157cd928eb56c1f18"
}
```

Response:
:   200 with no content. If the page key is not found, this will 404.
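For reference, a minimal sketch of calling this endpoint from Python using only the standard library. The port is the Client API default; the access key is a placeholder you must replace with your own:

```python
import json
import urllib.request

API_URL = 'http://127.0.0.1:45869'  # default Client API address
ACCESS_KEY = 'replace-with-your-own-access-key'  # placeholder

def build_refresh_request(page_key: str) -> urllib.request.Request:
    # the body is a JSON object holding the page_key fetched from /manage_pages/get_pages
    body = json.dumps({'page_key': page_key}).encode('utf-8')
    return urllib.request.Request(
        API_URL + '/manage_pages/refresh_page',
        data=body,
        headers={
            'Hydrus-Client-API-Access-Key': ACCESS_KEY,
            'Content-Type': 'application/json'
        },
        method='POST'
    )

# with a client running, send it; expect 200, or 404 for an unknown page key:
# urllib.request.urlopen(build_refresh_request('af98318b6eece15fef3cf0378385ce759bfe056916f6e12157cd928eb56c1f18'))
```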
## Searching Files

File search in hydrus is not paginated like a booru--all searches return all results in one go. In order to keep this fast, search is split into two steps--fetching file identifiers with a search, and then fetching file metadata in batches. You may have noticed that the client itself performs searches like this--thinking a bit about a search and then bundling results in batches of 256 files before eventually throwing all the thumbnails on screen.
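That two-step flow can be sketched like so, again with the standard library only. The endpoint names are from this help; the access key is a placeholder, and `batched` is a small helper invented here for illustration:

```python
import json
import urllib.parse
import urllib.request

API_URL = 'http://127.0.0.1:45869'  # default Client API address
ACCESS_KEY = 'replace-with-your-own-access-key'  # placeholder

def batched(seq, size=256):
    # split a list of file ids into the 256-file batches described above
    return [seq[i:i + size] for i in range(0, len(seq), size)]

def api_get(endpoint, params):
    # GET an endpoint with percent-encoded, JSON-valued parameters
    query = urllib.parse.urlencode(params)
    request = urllib.request.Request(
        API_URL + endpoint + '?' + query,
        headers={'Hydrus-Client-API-Access-Key': ACCESS_KEY}
    )
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read())

def search_and_fetch_metadata(tags):
    # step one: a single call returns every matching file id
    file_ids = api_get('/get_files/search_files', {'tags': json.dumps(tags)})['file_ids']
    # step two: fetch metadata in batches, as the client itself does
    metadata = []
    for batch in batched(file_ids):
        metadata.extend(api_get('/get_files/file_metadata', {'file_ids': json.dumps(batch)})['metadata'])
    return metadata
```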
@@ -34,6 +34,50 @@
<div class="content">
<h1 id="changelog"><a href="#changelog">changelog</a></h1>
<ul>
<li>
<h2 id="version_512"><a href="#version_512">version 512</a></h2>
<ul>
<li><h3>two searches in duplicates</h3></li>
<li>the duplicate filter page now lets you search 'one file is in this search, the other is in this search'! the only real limitation is both searches are locked to the same file domain</li>
<li>the main neat thing is you can now search 'pngs vs jpegs, and must be pixel dupes' super easy. this is the first concrete step towards my plan to introduce an optional duplicate auto resolution system (png/jpeg pixel dupes is easy--the jpeg is 99.9999% always better)</li>
<li>the database tech to get this working was actually simpler than 'one file matches the search', and in testing it works at _ok_ speed, so we'll see how this goes IRL</li>
<li>duplicate calculations should be faster in some simple cases, usually when you set a search to system:everything. this extends to the new two-search mode too (e.g. a two-search with one as system:everything is just a one-search, and the system optimises for this), however I also search complicated domains much more precisely now, which may make some duplicate search stuff work real slow. again, let me know!</li>
<li><h3>sidecars</h3></li>
<li>the txt importer/exporter sidecars now allow custom 'separators', so if you don't want newlines, you can use ', ' or whatever format you need</li>
<li><h3>misc</h3></li>
<li>when you right-click on a selection of thumbs, the 'x files' can now be 'x videos' or 'x pngs' etc., as you see on the status bar</li>
<li>when you select or right-click on a selection of thumbs that all have duration, the status bar and menu now show the total duration of your selection. same deal on the status bar if you have no selection on a page of only duration-having media</li>
<li>thanks to the user who figured out the correct render flag, the new 'thumbnail ui-scale supersampling %' option now draws non-pixelly thumbs on 100% monitors when it is set higher (e.g. 200% thumbs drawing on a 100% monitor), so users with unusual multi-monitor setups etc. should have a nicer experience. as the tooltip now says, this setting should now be set to the largest UI scale you have</li>
<li>I removed the newgrounds downloader from the defaults (this only affects new users). the downloader has been busted for a while, and last time I looked, it was not trivial to figure out, so I am removing myself from the question</li>
<li>the 'manage where tag siblings and parents apply' dialog now explicitly points users to the 'review current sync' panel</li>
<li><h3>client api</h3></li>
<li>a new command, /manage_pages/refresh_page, refreshes the specified page</li>
<li>the help is updated to talk about this</li>
<li>client api version is now 39</li>
<li><h3>server management</h3></li>
<li>in the 'modify accounts' dialog, if the null account is checked when you try to do an action, it will be unchecked. this should stop the annoying 400 Errors when you accidentally try to set it to something</li>
<li>also, if you do 'add to expires', any accounts that currently do not expire will be deselected before the action too, with a brief dialog note about it</li>
<li><h3>other duplicates improvements</h3></li>
<li>I reworked a ton of code here, fixing a heap of logic and general 'that isn't quite what you'd expect' comparison selection issues. ideally, the system will just make more obvious human sense more often, but this tech gets a little complicated as it tries to select comparison kings from larger groups, and we might have some situations where it says '3 pairs', but when you load it in the filter it says 'no pairs found m8', so let me know how it goes!</li>
<li>first, most importantly, the 'show some random potential pairs' button is vastly improved. it is now much better about limiting the group of presented files to what you specifically have searched, and the 'pixel dupes' and 'search distance' settings are obeyed properly (previously it was fetching too many potentials, not always limiting to the search you set, and choosing candidates from larger groups too liberally)</li>
<li>while it shows smaller groups now, since they are all culled better, it _should_ select larger groups more often than before</li>
<li>when you say 'show some random potential pairs' with 'at least one file matches the search', the first file displayed, which is the 'master' that the other file(s) are paired against, now always matches the search. when you are set to the new two-search 'files match different searches', the master will always match the first search, and the others of the pairs will always match the second search. in the filter itself, some similar logic applies, so the files selected for actual comparison should match the search you inputted better.</li>
<li>setting duplicates with 'custom options' from the thumbnail menu and selecting 'this is better' now correctly sets the focused media as the best. previously it set the first file as the best</li>
<li>also, in the duplicate merge options, you can now set notes to 'move' from worse to better</li>
<li>as a side thing, the 'search distance' number control is now disabled if you select 'must be pixel dupes'. duh!</li>
<li><h3>boring cleanup</h3></li>
<li>refactored the duplicate comparison statement generation code from ClientMedia to ClientDuplicates</li>
<li>significantly refactored all the duplicate files calculation pipelines to deal with two file search contexts</li>
<li>cleaned up a bunch of the 'find potential duplicate pairs in this file domain' master table join code. less hardcoding, more dynamic assembly</li>
<li>refactored the duplicated 'figure out pixel dupes table join gubbins' code in the file duplicates database module into a single separate method, and rolled in the base initialisation and hamming distance part into it too, clearing out more duplicated code</li>
<li>split up the 'both files match' search code into separate methods to further clean the logic here</li>
<li>updated the main object that handles page data to the new serialisable dictionary, combining its hardcoded key/primitive/serialisable storage into one clean dict that looks after itself</li>
<li>cleaned up the type definitions of the main database file search and fixed the erroneous empty set returns</li>
<li>I added a couple unit tests for the new .txt sidecar separator</li>
<li>fixed a bad sidecar unit test</li>
<li>'client_running' and 'server_running' are now in the .gitignore</li>
</ul>
</li>
<li>
<h2 id="version_511"><a href="#version_511">version 511</a></h2>
<ul>
@@ -63,6 +63,10 @@ directions_alignment_string_lookup = {
    DIRECTION_DOWN : 'bottom'
}

DUPE_SEARCH_ONE_FILE_MATCHES_ONE_SEARCH = 0
DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH = 1
DUPE_SEARCH_BOTH_FILES_MATCH_DIFFERENT_SEARCHES = 2

FIELD_VERIFICATION_RECAPTCHA = 0
FIELD_COMMENT = 1
FIELD_TEXT = 2
@@ -7,15 +7,525 @@ from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusData
from hydrus.core import HydrusExceptions
from hydrus.core import HydrusGlobals as HG
from hydrus.core import HydrusImageHandling
from hydrus.core import HydrusSerialisable
from hydrus.core import HydrusTags

from hydrus.client import ClientConstants as CC
from hydrus.client import ClientData
from hydrus.client import ClientThreading
from hydrus.client.importing.options import NoteImportOptions
from hydrus.client.media import ClientMedia
from hydrus.client.metadata import ClientTags

hashes_to_jpeg_quality = {}
hashes_to_pixel_hashes = {}

def GetDuplicateComparisonScore( shown_media, comparison_media ):
    
    statements_and_scores = GetDuplicateComparisonStatements( shown_media, comparison_media )
    
    total_score = sum( ( score for ( statement, score ) in statements_and_scores.values() ) )
    
    return total_score
    

# TODO: ok, let's make an enum here at some point and a DuplicateComparisonSetting serialisable object
# Then we can attach 'show/hide' boolean and allow editable scores and whatnot in a nice class that will one day evolve the enum to an editable MetadataConditional/MetadataComparison object
# also have banding so we can have 'at this filesize difference, score 10, at this, score 15'
# show it in a listctrl or whatever in the options, ditch the hardcoding
# metadatacomparison needs to handle 'if one is a png and one is a jpeg', and then orient to A/B and give it a score
def GetDuplicateComparisonStatements( shown_media, comparison_media ):
    
    new_options = HG.client_controller.new_options
    
    duplicate_comparison_score_higher_jpeg_quality = new_options.GetInteger( 'duplicate_comparison_score_higher_jpeg_quality' )
    duplicate_comparison_score_much_higher_jpeg_quality = new_options.GetInteger( 'duplicate_comparison_score_much_higher_jpeg_quality' )
    duplicate_comparison_score_higher_filesize = new_options.GetInteger( 'duplicate_comparison_score_higher_filesize' )
    duplicate_comparison_score_much_higher_filesize = new_options.GetInteger( 'duplicate_comparison_score_much_higher_filesize' )
    duplicate_comparison_score_higher_resolution = new_options.GetInteger( 'duplicate_comparison_score_higher_resolution' )
    duplicate_comparison_score_much_higher_resolution = new_options.GetInteger( 'duplicate_comparison_score_much_higher_resolution' )
    duplicate_comparison_score_more_tags = new_options.GetInteger( 'duplicate_comparison_score_more_tags' )
    duplicate_comparison_score_older = new_options.GetInteger( 'duplicate_comparison_score_older' )
    duplicate_comparison_score_nicer_ratio = new_options.GetInteger( 'duplicate_comparison_score_nicer_ratio' )
    
    #
    
    statements_and_scores = {}
    
    s_hash = shown_media.GetHash()
    c_hash = comparison_media.GetHash()
    
    s_mime = shown_media.GetMime()
    c_mime = comparison_media.GetMime()
    
    # size
    
    s_size = shown_media.GetSize()
    c_size = comparison_media.GetSize()
    
    is_a_pixel_dupe = False
    
    if shown_media.IsStaticImage() and comparison_media.IsStaticImage() and shown_media.GetResolution() == comparison_media.GetResolution():
        
        global hashes_to_pixel_hashes
        
        if s_hash not in hashes_to_pixel_hashes:
            
            path = HG.client_controller.client_files_manager.GetFilePath( s_hash, s_mime )
            
            hashes_to_pixel_hashes[ s_hash ] = HydrusImageHandling.GetImagePixelHash( path, s_mime )
            
        
        if c_hash not in hashes_to_pixel_hashes:
            
            path = HG.client_controller.client_files_manager.GetFilePath( c_hash, c_mime )
            
            hashes_to_pixel_hashes[ c_hash ] = HydrusImageHandling.GetImagePixelHash( path, c_mime )
            
        
        s_pixel_hash = hashes_to_pixel_hashes[ s_hash ]
        c_pixel_hash = hashes_to_pixel_hashes[ c_hash ]
        
        if s_pixel_hash == c_pixel_hash:
            
            is_a_pixel_dupe = True
            
            if s_mime == HC.IMAGE_PNG and c_mime != HC.IMAGE_PNG:
                
                statement = 'this is a pixel-for-pixel duplicate png!'
                
                score = -100
                
            elif s_mime != HC.IMAGE_PNG and c_mime == HC.IMAGE_PNG:
                
                statement = 'other file is a pixel-for-pixel duplicate png!'
                
                score = 100
                
            else:
                
                statement = 'images are pixel-for-pixel duplicates!'
                
                score = 0
                
            
            statements_and_scores[ 'pixel_duplicates' ] = ( statement, score )
            
        
    
    if s_size != c_size:
        
        absolute_size_ratio = max( s_size, c_size ) / min( s_size, c_size )
        
        if absolute_size_ratio > 2.0:
            
            if s_size > c_size:
                
                operator = '>>'
                score = duplicate_comparison_score_much_higher_filesize
                
            else:
                
                operator = '<<'
                score = -duplicate_comparison_score_much_higher_filesize
                
            
        elif absolute_size_ratio > 1.05:
            
            if s_size > c_size:
                
                operator = '>'
                score = duplicate_comparison_score_higher_filesize
                
            else:
                
                operator = '<'
                score = -duplicate_comparison_score_higher_filesize
                
            
        else:
            
            operator = CC.UNICODE_ALMOST_EQUAL_TO
            score = 0
            
        
        if s_size > c_size:
            
            sign = '+'
            
        else:
            
            sign = ''
            
        
        # the ratio is negative when the shown file is smaller, so the bare value carries its own minus sign
        percentage_difference = ( s_size / c_size ) - 1.0
        
        percentage_different_string = ' ({}{})'.format( sign, HydrusData.ConvertFloatToPercentage( percentage_difference ) )
        
        if is_a_pixel_dupe:
            
            score = 0
            
        
        statement = '{} {} {}{}'.format( HydrusData.ToHumanBytes( s_size ), operator, HydrusData.ToHumanBytes( c_size ), percentage_different_string )
        
        statements_and_scores[ 'filesize' ] = ( statement, score )
        
    
    # higher/same res
    
    s_resolution = shown_media.GetResolution()
    c_resolution = comparison_media.GetResolution()
    
    if s_resolution != c_resolution:
        
        ( s_w, s_h ) = s_resolution
        ( c_w, c_h ) = c_resolution
        
        all_measurements_are_good = None not in ( s_w, s_h, c_w, c_h ) and True not in ( d <= 0 for d in ( s_w, s_h, c_w, c_h ) )
        
        if all_measurements_are_good:
            
            resolution_ratio = ( s_w * s_h ) / ( c_w * c_h )
            
            if resolution_ratio == 1.0:
                
                operator = '!='
                score = 0
                
            elif resolution_ratio > 2.0:
                
                operator = '>>'
                score = duplicate_comparison_score_much_higher_resolution
                
            elif resolution_ratio > 1.00:
                
                operator = '>'
                score = duplicate_comparison_score_higher_resolution
                
            elif resolution_ratio < 0.5:
                
                operator = '<<'
                score = -duplicate_comparison_score_much_higher_resolution
                
            else:
                
                operator = '<'
                score = -duplicate_comparison_score_higher_resolution
                
            
            if s_resolution in HC.NICE_RESOLUTIONS:
                
                s_string = HC.NICE_RESOLUTIONS[ s_resolution ]
                
            else:
                
                s_string = HydrusData.ConvertResolutionToPrettyString( s_resolution )
                
                if s_w % 2 == 1 or s_h % 2 == 1:
                    
                    s_string += ' (unusual)'
                    
                
            
            if c_resolution in HC.NICE_RESOLUTIONS:
                
                c_string = HC.NICE_RESOLUTIONS[ c_resolution ]
                
            else:
                
                c_string = HydrusData.ConvertResolutionToPrettyString( c_resolution )
                
                if c_w % 2 == 1 or c_h % 2 == 1:
                    
                    c_string += ' (unusual)'
                    
                
            
            statement = '{} {} {}'.format( s_string, operator, c_string )
            
            statements_and_scores[ 'resolution' ] = ( statement, score )
            
            #
            
            s_ratio = s_w / s_h
            c_ratio = c_w / c_h
            
            s_nice = s_ratio in HC.NICE_RATIOS
            c_nice = c_ratio in HC.NICE_RATIOS
            
            if s_nice or c_nice:
                
                if s_nice:
                    
                    s_string = HC.NICE_RATIOS[ s_ratio ]
                    
                else:
                    
                    s_string = 'unusual'
                    
                
                if c_nice:
                    
                    c_string = HC.NICE_RATIOS[ c_ratio ]
                    
                else:
                    
                    c_string = 'unusual'
                    
                
                if s_nice and c_nice:
                    
                    operator = '-'
                    score = 0
                    
                elif s_nice:
                    
                    operator = '>'
                    score = duplicate_comparison_score_nicer_ratio
                    
                elif c_nice:
                    
                    operator = '<'
                    score = -duplicate_comparison_score_nicer_ratio
                    
                
                if s_string == c_string:
                    
                    statement = 'both {}'.format( s_string )
                    
                else:
                    
                    statement = '{} {} {}'.format( s_string, operator, c_string )
                    
                
                statements_and_scores[ 'ratio' ] = ( statement, score )
                
            
        
    
    # same/diff mime
    
    if s_mime != c_mime:
        
        statement = '{} vs {}'.format( HC.mime_string_lookup[ s_mime ], HC.mime_string_lookup[ c_mime ] )
        score = 0
        
        statements_and_scores[ 'mime' ] = ( statement, score )
        
    
    # more tags
    
    s_num_tags = len( shown_media.GetTagsManager().GetCurrentAndPending( CC.COMBINED_TAG_SERVICE_KEY, ClientTags.TAG_DISPLAY_ACTUAL ) )
    c_num_tags = len( comparison_media.GetTagsManager().GetCurrentAndPending( CC.COMBINED_TAG_SERVICE_KEY, ClientTags.TAG_DISPLAY_ACTUAL ) )
    
    if s_num_tags != c_num_tags:
        
        if s_num_tags > 0 and c_num_tags > 0:
            
            if s_num_tags > c_num_tags:
                
                operator = '>'
                score = duplicate_comparison_score_more_tags
                
            else:
                
                operator = '<'
                score = -duplicate_comparison_score_more_tags
                
            
        elif s_num_tags > 0:
            
            operator = '>>'
            score = duplicate_comparison_score_more_tags
            
        elif c_num_tags > 0:
            
            operator = '<<'
            score = -duplicate_comparison_score_more_tags
            
        
        statement = '{} tags {} {} tags'.format( HydrusData.ToHumanInt( s_num_tags ), operator, HydrusData.ToHumanInt( c_num_tags ) )
        
        statements_and_scores[ 'num_tags' ] = ( statement, score )
        
    
    # older
    
    s_ts = shown_media.GetLocationsManager().GetCurrentTimestamp( CC.COMBINED_LOCAL_FILE_SERVICE_KEY )
    c_ts = comparison_media.GetLocationsManager().GetCurrentTimestamp( CC.COMBINED_LOCAL_FILE_SERVICE_KEY )
    
    one_month = 86400 * 30
    
    if s_ts is not None and c_ts is not None and abs( s_ts - c_ts ) > one_month:
        
        if s_ts < c_ts:
            
            operator = 'older than'
            score = duplicate_comparison_score_older
            
        else:
            
            operator = 'newer than'
            score = -duplicate_comparison_score_older
            
        
        if is_a_pixel_dupe:
            
            score = 0
            
        
        statement = '{}, {} {}'.format( ClientData.TimestampToPrettyTimeDelta( s_ts, history_suffix = ' old' ), operator, ClientData.TimestampToPrettyTimeDelta( c_ts, history_suffix = ' old' ) )
        
        statements_and_scores[ 'time_imported' ] = ( statement, score )
|
||||
|
||||
|
||||
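The import-time comparison above only speaks up when the two timestamps differ by more than a rough month, and it zeroes the score for pixel duplicates. A minimal standalone sketch of that decision (the function name and score argument are illustrative, not the hydrus API):

```python
# Sketch of the 'older than' check: only comment on age when the two import
# timestamps differ by more than ~one month, and zero the score for pixel dupes.

ONE_MONTH = 86400 * 30  # seconds, the same rough month used above

def compare_import_times( s_ts, c_ts, score_older, is_a_pixel_dupe ):
    
    # returns ( operator, score ), or None when the gap is too small to matter
    
    if s_ts is None or c_ts is None or abs( s_ts - c_ts ) <= ONE_MONTH:
        
        return None
        
    
    ( operator, score ) = ( 'older than', score_older ) if s_ts < c_ts else ( 'newer than', -score_older )
    
    if is_a_pixel_dupe:
        
        score = 0  # age matters little when the files are pixel-for-pixel identical
        
    
    return ( operator, score )
```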
    if s_mime == HC.IMAGE_JPEG and c_mime == HC.IMAGE_JPEG:
        
        global hashes_to_jpeg_quality
        
        if s_hash not in hashes_to_jpeg_quality:
            
            path = HG.client_controller.client_files_manager.GetFilePath( s_hash, s_mime )
            
            hashes_to_jpeg_quality[ s_hash ] = HydrusImageHandling.GetJPEGQuantizationQualityEstimate( path )
            
        
        if c_hash not in hashes_to_jpeg_quality:
            
            path = HG.client_controller.client_files_manager.GetFilePath( c_hash, c_mime )
            
            hashes_to_jpeg_quality[ c_hash ] = HydrusImageHandling.GetJPEGQuantizationQualityEstimate( path )
            
        
        ( s_label, s_jpeg_quality ) = hashes_to_jpeg_quality[ s_hash ]
        ( c_label, c_jpeg_quality ) = hashes_to_jpeg_quality[ c_hash ]
        
        score = 0
        
        if s_label != c_label:
            
            if c_jpeg_quality is None or s_jpeg_quality is None:
                
                score = 0
                
            else:
                
                # other way around, low score is good here
                quality_ratio = c_jpeg_quality / s_jpeg_quality
                
                if quality_ratio > 2.0:
                    
                    score = duplicate_comparison_score_much_higher_jpeg_quality
                    
                elif quality_ratio > 1.0:
                    
                    score = duplicate_comparison_score_higher_jpeg_quality
                    
                elif quality_ratio < 0.5:
                    
                    score = -duplicate_comparison_score_much_higher_jpeg_quality
                    
                else:
                    
                    score = -duplicate_comparison_score_higher_jpeg_quality
                    
                
            
        
        statement = '{} vs {} jpeg quality'.format( s_label, c_label )
        
        statements_and_scores[ 'jpeg_quality' ] = ( statement, score )
        
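As the "low score is good here" comment hints, the quantization estimate behaves like an error term (a lower number means better quality), so a comparison/shown ratio above 1.0 favours the shown file. A sketch of just that scoring step, with illustrative stand-in score constants:

```python
# Sketch of the jpeg quality scoring above. The quality estimate is
# lower-is-better, so ratio = comparison / shown, and a ratio over 1.0
# means the shown file has the better quality estimate.

def jpeg_quality_score( s_quality, c_quality, higher = 10, much_higher = 20 ):
    
    if s_quality is None or c_quality is None:
        
        return 0
        
    
    quality_ratio = c_quality / s_quality
    
    if quality_ratio > 2.0:
        
        return much_higher
        
    elif quality_ratio > 1.0:
        
        return higher
        
    elif quality_ratio < 0.5:
        
        return -much_higher
        
    else:
        
        return -higher
```

In the real code this only runs when the two quality labels differ, so the exact-tie case never scores.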
    def has_exif( m ):
        
        try:
            
            hash = m.GetHash()
            mime = m.GetMime()
            
            if mime not in ( HC.IMAGE_JPEG, HC.IMAGE_TIFF ):
                
                return False
                
            
            path = HG.client_controller.client_files_manager.GetFilePath( hash, mime )
            
            pil_image = HydrusImageHandling.RawOpenPILImage( path )
            
            exif_dict = HydrusImageHandling.GetEXIFDict( pil_image )
            
            if exif_dict is None:
                
                return False
                
            
            return len( exif_dict ) > 0
            
        except:
            
            return False
            
        
    
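The exif, embedded-metadata, and icc statements below all share one pattern: a statement is emitted only when exactly one file has the property (the `^` xor), and it never moves the score. A small sketch of that pattern (the function name is illustrative):

```python
# Sketch of the one-sided 'has X, the other does not' statements below:
# xor means a statement only appears when exactly one file has the property,
# and these statements always carry score 0.

def one_sided_statement( s_has, c_has, name ):
    
    if s_has ^ c_has:
        
        if s_has:
            
            return ( 'has {}, the other does not'.format( name ), 0 )
            
        else:
            
            return ( 'the other has {}, this does not'.format( name ), 0 )
            
        
    
    return None
```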
    s_has_exif = has_exif( shown_media )
    c_has_exif = has_exif( comparison_media )
    
    if s_has_exif ^ c_has_exif:
        
        if s_has_exif:
            
            exif_statement = 'has exif data, the other does not'
            
        else:
            
            exif_statement = 'the other has exif data, this does not'
            
        
        statements_and_scores[ 'exif_data' ] = ( exif_statement, 0 )
        
    
    s_has_human_readable_embedded_metadata = shown_media.GetMediaResult().GetFileInfoManager().has_human_readable_embedded_metadata
    c_has_human_readable_embedded_metadata = comparison_media.GetMediaResult().GetFileInfoManager().has_human_readable_embedded_metadata
    
    if s_has_human_readable_embedded_metadata ^ c_has_human_readable_embedded_metadata:
        
        if s_has_human_readable_embedded_metadata:
            
            embedded_metadata_statement = 'has embedded metadata, the other does not'
            
        else:
            
            embedded_metadata_statement = 'the other has embedded metadata, this does not'
            
        
        statements_and_scores[ 'embedded_metadata' ] = ( embedded_metadata_statement, 0 )
        
    
    s_has_icc = shown_media.GetMediaResult().GetFileInfoManager().has_icc_profile
    c_has_icc = comparison_media.GetMediaResult().GetFileInfoManager().has_icc_profile
    
    if s_has_icc ^ c_has_icc:
        
        if s_has_icc:
            
            icc_statement = 'has icc profile, the other does not'
            
        else:
            
            icc_statement = 'the other has icc profile, this does not'
            
        
        statements_and_scores[ 'icc_profile' ] = ( icc_statement, 0 )
        
    
    return statements_and_scores
    

class DuplicatesManager( object ):
    
    my_instance = None
    
@@ -423,7 +933,7 @@ class DuplicateContentMergeOptions( HydrusSerialisable.SerialisableBase ):
        self._sync_urls_action = sync_urls_action
        
    
-    def ProcessPairIntoContentUpdates( self, first_media, second_media, delete_first = False, delete_second = False, file_deletion_reason = None, do_not_do_deletes = False ):
+    def ProcessPairIntoContentUpdates( self, first_media: ClientMedia.MediaSingleton, second_media: ClientMedia.MediaSingleton, delete_first = False, delete_second = False, file_deletion_reason = None, do_not_do_deletes = False ):
        
        if file_deletion_reason is None:
@@ -432,8 +942,13 @@ class DuplicateContentMergeOptions( HydrusSerialisable.SerialisableBase ):
        
        service_keys_to_content_updates = collections.defaultdict( list )
        
-        first_hashes = first_media.GetHashes()
-        second_hashes = second_media.GetHashes()
+        first_hash = first_media.GetHash()
+        second_hash = second_media.GetHash()
+        
+        first_hashes = { first_hash }
+        second_hashes = { second_hash }
+        
+        first_media_result = first_media.GetMediaResult()
+        second_media_result = second_media.GetMediaResult()
        
        #
        
@@ -580,18 +1095,28 @@ class DuplicateContentMergeOptions( HydrusSerialisable.SerialisableBase ):
        
        if self._sync_notes_action == HC.CONTENT_MERGE_ACTION_TWO_WAY_MERGE:
            
-            first_service_keys_to_content_updates = self._sync_note_import_options.GetServiceKeysToContentUpdates( first_media, second_names_and_notes )
-            second_service_keys_to_content_updates = self._sync_note_import_options.GetServiceKeysToContentUpdates( second_media, first_names_and_notes )
+            first_service_keys_to_content_updates = self._sync_note_import_options.GetServiceKeysToContentUpdates( first_media_result, second_names_and_notes )
+            second_service_keys_to_content_updates = self._sync_note_import_options.GetServiceKeysToContentUpdates( second_media_result, first_names_and_notes )
            
            content_updates.extend( first_service_keys_to_content_updates[ CC.LOCAL_NOTES_SERVICE_KEY ] )
            content_updates.extend( second_service_keys_to_content_updates[ CC.LOCAL_NOTES_SERVICE_KEY ] )
            
        elif self._sync_notes_action == HC.CONTENT_MERGE_ACTION_COPY:
            
-            first_service_keys_to_content_updates = self._sync_note_import_options.GetServiceKeysToContentUpdates( first_media, second_names_and_notes )
+            first_service_keys_to_content_updates = self._sync_note_import_options.GetServiceKeysToContentUpdates( first_media_result, second_names_and_notes )
            
            content_updates.extend( first_service_keys_to_content_updates[ CC.LOCAL_NOTES_SERVICE_KEY ] )
            
+        elif self._sync_notes_action == HC.CONTENT_MERGE_ACTION_MOVE:
+            
+            first_service_keys_to_content_updates = self._sync_note_import_options.GetServiceKeysToContentUpdates( first_media_result, second_names_and_notes )
+            
+            content_updates.extend( first_service_keys_to_content_updates[ CC.LOCAL_NOTES_SERVICE_KEY ] )
+            
+            content_updates.extend(
+                [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_NOTES, HC.CONTENT_UPDATE_DELETE, ( second_hash, name ) ) for ( name, note ) in second_names_and_notes ]
+            )
            
        if len( content_updates ) > 0:
            
@@ -1831,141 +1831,201 @@ class DB( HydrusDB.HydrusDB ):
        HydrusDB.HydrusDB._DoAfterJobWork( self )
        
    
-    def _DuplicatesGetRandomPotentialDuplicateHashes( self, file_search_context: ClientSearch.FileSearchContext, both_files_match, pixel_dupes_preference, max_hamming_distance ):
+    def _DuplicatesGetRandomPotentialDuplicateHashes(
+        self,
+        file_search_context_1: ClientSearch.FileSearchContext,
+        file_search_context_2: ClientSearch.FileSearchContext,
+        dupe_search_type: int,
+        pixel_dupes_preference,
+        max_hamming_distance
+    ) -> typing.List[ bytes ]:
        
-        db_location_context = self.modules_files_storage.GetDBLocationContext( file_search_context.GetLocationContext() )
+        db_location_context = self.modules_files_storage.GetDBLocationContext( file_search_context_1.GetLocationContext() )
        
-        is_complicated_search = False
+        chosen_allowed_hash_ids = None
+        chosen_preferred_hash_ids = None
+        comparison_allowed_hash_ids = None
+        comparison_preferred_hash_ids = None
        
-        with self._MakeTemporaryIntegerTable( [], 'hash_id' ) as temp_table_name:
-            
-            # first we get a sample of current potential pairs in the db, given our limiting search context
-            
-            allowed_hash_ids = None
-            preferred_hash_ids = None
-            
-            if file_search_context.IsJustSystemEverything() or file_search_context.HasNoPredicates():
-                
-                table_join = self.modules_files_duplicates.DuplicatesGetPotentialDuplicatePairsTableJoinOnEverythingSearchResults( db_location_context, pixel_dupes_preference, max_hamming_distance )
-                
-            else:
-                
-                is_complicated_search = True
-                
-                query_hash_ids = self._GetHashIdsFromQuery( file_search_context, apply_implicit_limit = False )
-                
-                if both_files_match:
-                    
-                    allowed_hash_ids = query_hash_ids
-                    
-                else:
-                    
-                    preferred_hash_ids = query_hash_ids
-                    
-                
-                self._ExecuteMany( 'INSERT OR IGNORE INTO {} ( hash_id ) VALUES ( ? );'.format( temp_table_name ), ( ( hash_id, ) for hash_id in query_hash_ids ) )
-                
-                self._AnalyzeTempTable( temp_table_name )
-                
-                table_join = self.modules_files_duplicates.DuplicatesGetPotentialDuplicatePairsTableJoinOnSearchResults( db_location_context, temp_table_name, both_files_match, pixel_dupes_preference, max_hamming_distance )
-                
-            
-            potential_media_ids = set()
-            
-            # distinct important here for the search results table join
-            for ( smaller_media_id, larger_media_id ) in self._Execute( 'SELECT DISTINCT smaller_media_id, larger_media_id FROM {};'.format( table_join ) ):
-                
-                potential_media_ids.add( smaller_media_id )
-                potential_media_ids.add( larger_media_id )
-                
-                if len( potential_media_ids ) >= 1000:
-                    
-                    break
-                    
-                
-            
-            # now let's randomly select a file in these medias
-            
-            potential_media_ids = list( potential_media_ids )
-            
-            random.shuffle( potential_media_ids )
-            
-            chosen_hash_id = None
-            
-            for potential_media_id in potential_media_ids:
-                
-                best_king_hash_id = self.modules_files_duplicates.DuplicatesGetBestKingId( potential_media_id, db_location_context, allowed_hash_ids = allowed_hash_ids, preferred_hash_ids = preferred_hash_ids )
-                
-                if best_king_hash_id is not None:
-                    
-                    chosen_hash_id = best_king_hash_id
-                    
-                    break
-                    
-                
-            
-        
-        if chosen_hash_id is None:
-            
-            return []
-            
-        
-        hash = self.modules_hashes_local_cache.GetHash( chosen_hash_id )
-        
-        if is_complicated_search and both_files_match:
-            
-            allowed_hash_ids = query_hash_ids
-            
-        else:
-            
-            allowed_hash_ids = None
-            
-        
-        location_context = file_search_context.GetLocationContext()
-        
-        return self.modules_files_duplicates.DuplicatesGetFileHashesByDuplicateType( location_context, hash, HC.DUPLICATE_POTENTIAL, allowed_hash_ids = allowed_hash_ids, preferred_hash_ids = preferred_hash_ids )
+        with self._MakeTemporaryIntegerTable( [], 'hash_id' ) as temp_table_name_1:
+            
+            with self._MakeTemporaryIntegerTable( [], 'hash_id' ) as temp_table_name_2:
+                
+                # first we get a sample of current potential pairs in the db, given our limiting search context
+                
+                if dupe_search_type == CC.DUPE_SEARCH_BOTH_FILES_MATCH_DIFFERENT_SEARCHES:
+                    
+                    query_hash_ids_1 = set( self._PopulateSearchIntoTempTable( file_search_context_1, temp_table_name_1 ) )
+                    query_hash_ids_2 = set( self._PopulateSearchIntoTempTable( file_search_context_2, temp_table_name_2 ) )
+                    
+                    # we are going to say our 'master' king for the pair(s) returned here is always from search 1
+                    chosen_allowed_hash_ids = query_hash_ids_1
+                    comparison_allowed_hash_ids = query_hash_ids_2
+                    
+                    table_join = self.modules_files_duplicates.DuplicatesGetPotentialDuplicatePairsTableJoinOnSeparateSearchResults( temp_table_name_1, temp_table_name_2, pixel_dupes_preference, max_hamming_distance )
+                    
+                else:
+                    
+                    if file_search_context_1.IsJustSystemEverything() or file_search_context_1.HasNoPredicates():
+                        
+                        table_join = self.modules_files_duplicates.DuplicatesGetPotentialDuplicatePairsTableJoinOnEverythingSearchResults( db_location_context, pixel_dupes_preference, max_hamming_distance )
+                        
+                    else:
+                        
+                        query_hash_ids = set( self._PopulateSearchIntoTempTable( file_search_context_1, temp_table_name_1 ) )
+                        
+                        if dupe_search_type == CC.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH:
+                            
+                            chosen_allowed_hash_ids = query_hash_ids
+                            comparison_allowed_hash_ids = query_hash_ids
+                            
+                            table_join = self.modules_files_duplicates.DuplicatesGetPotentialDuplicatePairsTableJoinOnSearchResultsBothFiles( temp_table_name_1, pixel_dupes_preference, max_hamming_distance )
+                            
+                        else:
+                            
+                            # the master will always be one that matches the search, the comparison can be whatever
+                            chosen_allowed_hash_ids = query_hash_ids
+                            
+                            table_join = self.modules_files_duplicates.DuplicatesGetPotentialDuplicatePairsTableJoinOnSearchResults( db_location_context, temp_table_name_1, pixel_dupes_preference, max_hamming_distance )
+                            
+                        
+                    
+                
+                # ok let's not use a set here, since that un-weights medias that appear a lot, and we want to see common stuff more often
+                potential_media_ids = []
+                
+                # distinct important here for the search results table join
+                for ( smaller_media_id, larger_media_id ) in self._Execute( 'SELECT DISTINCT smaller_media_id, larger_media_id FROM {};'.format( table_join ) ):
+                    
+                    potential_media_ids.append( smaller_media_id )
+                    potential_media_ids.append( larger_media_id )
+                    
+                    if len( potential_media_ids ) >= 1000:
+                        
+                        break
+                        
+                    
+                
+                # now let's randomly select a file in these medias
+                
+                random.shuffle( potential_media_ids )
+                
+                chosen_media_id = None
+                chosen_hash_id = None
+                
+                for potential_media_id in potential_media_ids:
+                    
+                    best_king_hash_id = self.modules_files_duplicates.DuplicatesGetBestKingId( potential_media_id, db_location_context, allowed_hash_ids = chosen_allowed_hash_ids, preferred_hash_ids = chosen_preferred_hash_ids )
+                    
+                    if best_king_hash_id is not None:
+                        
+                        chosen_media_id = potential_media_id
+                        chosen_hash_id = best_king_hash_id
+                        
+                        break
+                        
+                    
+                
+                if chosen_hash_id is None:
+                    
+                    return []
+                    
+                
+                # I used to do self.modules_files_duplicates.DuplicatesGetFileHashesByDuplicateType here, but that gets _all_ potentials in the db context, even with allowed_hash_ids doing work it won't capture pixel hashes or duplicate distance that we searched above
+                # so, let's search and make the list manually!
+                
+                comparison_hash_ids = []
+                
+                # distinct important here for the search results table join
+                matching_pairs = self._Execute( 'SELECT DISTINCT smaller_media_id, larger_media_id FROM {} AND ( smaller_media_id = ? OR larger_media_id = ? );'.format( table_join ), ( chosen_media_id, chosen_media_id ) ).fetchall()
+                
+                for ( smaller_media_id, larger_media_id ) in matching_pairs:
+                    
+                    if smaller_media_id == chosen_media_id:
+                        
+                        potential_media_id = larger_media_id
+                        
+                    else:
+                        
+                        potential_media_id = smaller_media_id
+                        
+                    
+                    best_king_hash_id = self.modules_files_duplicates.DuplicatesGetBestKingId( potential_media_id, db_location_context, allowed_hash_ids = comparison_allowed_hash_ids, preferred_hash_ids = comparison_preferred_hash_ids )
+                    
+                    if best_king_hash_id is not None:
+                        
+                        comparison_hash_ids.append( best_king_hash_id )
+                        
+                    
+                
+                # might as well have some kind of order
+                comparison_hash_ids.sort()
+                
+                results_hash_ids = [ chosen_hash_id ] + comparison_hash_ids
+                
+                return self.modules_hashes_local_cache.GetHashes( results_hash_ids )
-    def _DuplicatesGetPotentialDuplicatePairsForFiltering( self, file_search_context: ClientSearch.FileSearchContext, both_files_match, pixel_dupes_preference, max_hamming_distance ):
+    def _DuplicatesGetPotentialDuplicatePairsForFiltering( self, file_search_context_1: ClientSearch.FileSearchContext, file_search_context_2: ClientSearch.FileSearchContext, dupe_search_type: int, pixel_dupes_preference, max_hamming_distance ):
        
        # we need to batch non-intersecting decisions here to keep it simple at the gui-level
        # we also want to maximise per-decision value
        
        # now we will fetch some unknown pairs
        
-        db_location_context = self.modules_files_storage.GetDBLocationContext( file_search_context.GetLocationContext() )
+        db_location_context = self.modules_files_storage.GetDBLocationContext( file_search_context_1.GetLocationContext() )
        
-        with self._MakeTemporaryIntegerTable( [], 'hash_id' ) as temp_table_name:
-            
-            allowed_hash_ids = None
-            preferred_hash_ids = None
-            
-            if file_search_context.IsJustSystemEverything() or file_search_context.HasNoPredicates():
-                
-                table_join = self.modules_files_duplicates.DuplicatesGetPotentialDuplicatePairsTableJoinOnEverythingSearchResults( db_location_context, pixel_dupes_preference, max_hamming_distance )
-                
-            else:
-                
-                query_hash_ids = self._GetHashIdsFromQuery( file_search_context, apply_implicit_limit = False )
-                
-                if both_files_match:
-                    
-                    allowed_hash_ids = query_hash_ids
-                    
-                else:
-                    
-                    preferred_hash_ids = query_hash_ids
-                    
-                
-                self._ExecuteMany( 'INSERT OR IGNORE INTO {} ( hash_id ) VALUES ( ? );'.format( temp_table_name ), ( ( hash_id, ) for hash_id in query_hash_ids ) )
-                
-                self._AnalyzeTempTable( temp_table_name )
-                
-                table_join = self.modules_files_duplicates.DuplicatesGetPotentialDuplicatePairsTableJoinOnSearchResults( db_location_context, temp_table_name, both_files_match, pixel_dupes_preference, max_hamming_distance )
-                
-            
-            # distinct important here for the search results table join
-            result = self._Execute( 'SELECT DISTINCT smaller_media_id, larger_media_id, distance FROM {} LIMIT 2500;'.format( table_join ) ).fetchall()
+        chosen_allowed_hash_ids = None
+        chosen_preferred_hash_ids = None
+        comparison_allowed_hash_ids = None
+        comparison_preferred_hash_ids = None
+        
+        with self._MakeTemporaryIntegerTable( [], 'hash_id' ) as temp_table_name_1:
+            
+            with self._MakeTemporaryIntegerTable( [], 'hash_id' ) as temp_table_name_2:
+                
+                if dupe_search_type == CC.DUPE_SEARCH_BOTH_FILES_MATCH_DIFFERENT_SEARCHES:
+                    
+                    query_hash_ids_1 = set( self._PopulateSearchIntoTempTable( file_search_context_1, temp_table_name_1 ) )
+                    query_hash_ids_2 = set( self._PopulateSearchIntoTempTable( file_search_context_2, temp_table_name_2 ) )
+                    
+                    # we always want pairs where one is in one and the other is in the other, we don't want king-selection-trickery giving us a jpeg vs a jpeg
+                    chosen_allowed_hash_ids = query_hash_ids_1
+                    comparison_allowed_hash_ids = query_hash_ids_2
+                    
+                    table_join = self.modules_files_duplicates.DuplicatesGetPotentialDuplicatePairsTableJoinOnSeparateSearchResults( temp_table_name_1, temp_table_name_2, pixel_dupes_preference, max_hamming_distance )
+                    
+                else:
+                    
+                    if file_search_context_1.IsJustSystemEverything() or file_search_context_1.HasNoPredicates():
+                        
+                        table_join = self.modules_files_duplicates.DuplicatesGetPotentialDuplicatePairsTableJoinOnEverythingSearchResults( db_location_context, pixel_dupes_preference, max_hamming_distance )
+                        
+                    else:
+                        
+                        query_hash_ids = set( self._PopulateSearchIntoTempTable( file_search_context_1, temp_table_name_1 ) )
+                        
+                        if dupe_search_type == CC.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH:
+                            
+                            # both chosen and comparison must be in the search, no king selection nonsense allowed
+                            chosen_allowed_hash_ids = query_hash_ids
+                            comparison_allowed_hash_ids = query_hash_ids
+                            
+                            table_join = self.modules_files_duplicates.DuplicatesGetPotentialDuplicatePairsTableJoinOnSearchResultsBothFiles( temp_table_name_1, pixel_dupes_preference, max_hamming_distance )
+                            
+                        else:
+                            
+                            # the chosen must be in the search, but we don't care about the comparison as long as it is viewable
+                            chosen_preferred_hash_ids = query_hash_ids
+                            
+                            table_join = self.modules_files_duplicates.DuplicatesGetPotentialDuplicatePairsTableJoinOnSearchResults( db_location_context, temp_table_name_1, pixel_dupes_preference, max_hamming_distance )
+                            
+                        
+                    
+                
+                # distinct important here for the search results table join
+                result = self._Execute( 'SELECT DISTINCT smaller_media_id, larger_media_id, distance FROM {} LIMIT 2500;'.format( table_join ) ).fetchall()
        
        MAX_BATCH_SIZE = HG.client_controller.new_options.GetInteger( 'duplicate_filter_max_batch_size' )
@@ -2051,21 +2111,51 @@ class DB( HydrusDB.HydrusDB ):
        
        seen_hash_ids = set()
        
-        media_ids_to_best_king_ids = {}
+        batch_of_pairs_of_hash_ids = []
        
-        for media_id in seen_media_ids:
+        if chosen_allowed_hash_ids == comparison_allowed_hash_ids and chosen_preferred_hash_ids == comparison_preferred_hash_ids:
            
-            best_king_hash_id = self.modules_files_duplicates.DuplicatesGetBestKingId( media_id, db_location_context, allowed_hash_ids = allowed_hash_ids, preferred_hash_ids = preferred_hash_ids )
-            
-            if best_king_hash_id is not None:
-                
-                seen_hash_ids.add( best_king_hash_id )
-                
-                media_ids_to_best_king_ids[ media_id ] = best_king_hash_id
-                
-            
-        
-        batch_of_pairs_of_hash_ids = [ ( media_ids_to_best_king_ids[ smaller_media_id ], media_ids_to_best_king_ids[ larger_media_id ] ) for ( smaller_media_id, larger_media_id ) in batch_of_pairs_of_media_ids if smaller_media_id in media_ids_to_best_king_ids and larger_media_id in media_ids_to_best_king_ids ]
+            # which file was 'chosen' vs 'comparison' is irrelevant. the user is expecting to see a mix, so we want the best kings possible. this is probably 'system:everything' or similar
+            
+            for ( smaller_media_id, larger_media_id ) in batch_of_pairs_of_media_ids:
+                
+                best_smaller_king_hash_id = self.modules_files_duplicates.DuplicatesGetBestKingId( smaller_media_id, db_location_context, allowed_hash_ids = chosen_allowed_hash_ids, preferred_hash_ids = chosen_preferred_hash_ids )
+                best_larger_king_hash_id = self.modules_files_duplicates.DuplicatesGetBestKingId( larger_media_id, db_location_context, allowed_hash_ids = chosen_allowed_hash_ids, preferred_hash_ids = chosen_preferred_hash_ids )
+                
+                if best_smaller_king_hash_id is not None and best_larger_king_hash_id is not None:
+                    
+                    batch_of_pairs_of_hash_ids.append( ( best_smaller_king_hash_id, best_larger_king_hash_id ) )
+                    
+                    seen_hash_ids.update( ( best_smaller_king_hash_id, best_larger_king_hash_id ) )
+                    
+                
+            
+        else:
+            
+            # we want to enforce that our pairs seem human. if the user said 'A is in search 1 and B is in search 2', we don't want king selection going funny and giving us two from 1
+            # previously, we did this on media_ids on their own, but we have to do it in pairs. we choose the 'chosen' and 'comparison' of our pair and filter accordingly
+            
+            for ( smaller_media_id, larger_media_id ) in batch_of_pairs_of_media_ids:
+                
+                best_smaller_king_hash_id = self.modules_files_duplicates.DuplicatesGetBestKingId( smaller_media_id, db_location_context, allowed_hash_ids = chosen_allowed_hash_ids, preferred_hash_ids = chosen_preferred_hash_ids )
+                best_larger_king_hash_id = self.modules_files_duplicates.DuplicatesGetBestKingId( larger_media_id, db_location_context, allowed_hash_ids = comparison_allowed_hash_ids, preferred_hash_ids = comparison_preferred_hash_ids )
+                
+                if best_smaller_king_hash_id is None or best_larger_king_hash_id is None:
+                    
+                    # ok smaller was probably the comparison, let's see if that produces a better king hash
+                    
+                    best_smaller_king_hash_id = self.modules_files_duplicates.DuplicatesGetBestKingId( smaller_media_id, db_location_context, allowed_hash_ids = comparison_allowed_hash_ids, preferred_hash_ids = comparison_preferred_hash_ids )
+                    best_larger_king_hash_id = self.modules_files_duplicates.DuplicatesGetBestKingId( larger_media_id, db_location_context, allowed_hash_ids = chosen_allowed_hash_ids, preferred_hash_ids = chosen_preferred_hash_ids )
+                    
+                
+                if best_smaller_king_hash_id is not None and best_larger_king_hash_id is not None:
+                    
+                    batch_of_pairs_of_hash_ids.append( ( best_smaller_king_hash_id, best_larger_king_hash_id ) )
+                    
+                    seen_hash_ids.update( ( best_smaller_king_hash_id, best_larger_king_hash_id ) )
+                    
+                
+            
        
        media_results = self._GetMediaResults( seen_hash_ids )
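The two-search branch above tries one chosen/comparison role assignment for each pair and, if either side fails to produce a king, retries with the roles swapped. A standalone sketch of that fallback, with a hypothetical stub standing in for `DuplicatesGetBestKingId`:

```python
# Sketch of the pair-filtering fallback: try smaller=chosen / larger=comparison,
# and if either side yields no king, swap the roles and try again.
# get_king is a stand-in for DuplicatesGetBestKingId, not the real signature.

def resolve_pair( smaller, larger, chosen_ids, comparison_ids, get_king ):
    
    best_smaller = get_king( smaller, chosen_ids )
    best_larger = get_king( larger, comparison_ids )
    
    if best_smaller is None or best_larger is None:
        
        # ok smaller was probably the comparison, let's see if the swap works
        
        best_smaller = get_king( smaller, comparison_ids )
        best_larger = get_king( larger, chosen_ids )
        
    
    if best_smaller is not None and best_larger is not None:
        
        return ( best_smaller, best_larger )
        
    
    return None

def get_king_stub( media_id, allowed_ids ):
    
    # pretend the king hash of media_id is media_id * 10, valid only if allowed
    king = media_id * 10
    
    return king if king in allowed_ids else None
```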
@@ -2076,29 +2166,45 @@ class DB( HydrusDB.HydrusDB ):
        return batch_of_pairs_of_media_results
        
    
-    def _DuplicatesGetPotentialDuplicatesCount( self, file_search_context, both_files_match, pixel_dupes_preference, max_hamming_distance ):
+    def _DuplicatesGetPotentialDuplicatesCount( self, file_search_context_1, file_search_context_2, dupe_search_type, pixel_dupes_preference, max_hamming_distance ):
        
-        db_location_context = self.modules_files_storage.GetDBLocationContext( file_search_context.GetLocationContext() )
+        db_location_context = self.modules_files_storage.GetDBLocationContext( file_search_context_1.GetLocationContext() )
        
-        with self._MakeTemporaryIntegerTable( [], 'hash_id' ) as temp_table_name:
-            
-            if file_search_context.IsJustSystemEverything() or file_search_context.HasNoPredicates():
-                
-                table_join = self.modules_files_duplicates.DuplicatesGetPotentialDuplicatePairsTableJoinOnEverythingSearchResults( db_location_context, pixel_dupes_preference, max_hamming_distance )
-                
-            else:
-                
-                query_hash_ids = self._GetHashIdsFromQuery( file_search_context, apply_implicit_limit = False )
-                
-                self._ExecuteMany( 'INSERT OR IGNORE INTO {} ( hash_id ) VALUES ( ? );'.format( temp_table_name ), ( ( hash_id, ) for hash_id in query_hash_ids ) )
-                
-                self._AnalyzeTempTable( temp_table_name )
-                
-                table_join = self.modules_files_duplicates.DuplicatesGetPotentialDuplicatePairsTableJoinOnSearchResults( db_location_context, temp_table_name, both_files_match, pixel_dupes_preference, max_hamming_distance )
-                
-            
-            # distinct important here for the search results table join
-            ( potential_duplicates_count, ) = self._Execute( 'SELECT COUNT( * ) FROM ( SELECT DISTINCT smaller_media_id, larger_media_id FROM {} );'.format( table_join ) ).fetchone()
+        with self._MakeTemporaryIntegerTable( [], 'hash_id' ) as temp_table_name_1:
+            
+            with self._MakeTemporaryIntegerTable( [], 'hash_id' ) as temp_table_name_2:
+                
+                if dupe_search_type == CC.DUPE_SEARCH_BOTH_FILES_MATCH_DIFFERENT_SEARCHES:
+                    
+                    self._PopulateSearchIntoTempTable( file_search_context_1, temp_table_name_1 )
+                    self._PopulateSearchIntoTempTable( file_search_context_2, temp_table_name_2 )
+                    
+                    table_join = self.modules_files_duplicates.DuplicatesGetPotentialDuplicatePairsTableJoinOnSeparateSearchResults( temp_table_name_1, temp_table_name_2, pixel_dupes_preference, max_hamming_distance )
+                    
+                else:
+                    
+                    if file_search_context_1.IsJustSystemEverything() or file_search_context_1.HasNoPredicates():
+                        
+                        table_join = self.modules_files_duplicates.DuplicatesGetPotentialDuplicatePairsTableJoinOnEverythingSearchResults( db_location_context, pixel_dupes_preference, max_hamming_distance )
+                        
+                    else:
+                        
+                        self._PopulateSearchIntoTempTable( file_search_context_1, temp_table_name_1 )
+                        
+                        if dupe_search_type == CC.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH:
+                            
+                            table_join = self.modules_files_duplicates.DuplicatesGetPotentialDuplicatePairsTableJoinOnSearchResultsBothFiles( temp_table_name_1, pixel_dupes_preference, max_hamming_distance )
+                            
+                        else:
+                            
+                            table_join = self.modules_files_duplicates.DuplicatesGetPotentialDuplicatePairsTableJoinOnSearchResults( db_location_context, temp_table_name_1, pixel_dupes_preference, max_hamming_distance )
+                            
+                        
+                    
+                
+                # distinct important here for the search results table join
+                ( potential_duplicates_count, ) = self._Execute( 'SELECT COUNT( * ) FROM ( SELECT DISTINCT smaller_media_id, larger_media_id FROM {} );'.format( table_join ) ).fetchone()
        
        return potential_duplicates_count
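The "distinct important here" count wraps the join in `SELECT DISTINCT` because the search-results join can emit the same (smaller, larger) pair more than once. A self-contained in-memory demonstration of why the dedupe matters (the table here is a toy, not the real hydrus join):

```python
import sqlite3

# The COUNT over SELECT DISTINCT dedupes pairs the join can emit repeatedly.
# Toy schema, illustrative only.

db = sqlite3.connect( ':memory:' )

db.execute( 'CREATE TABLE potential_duplicate_pairs ( smaller_media_id INTEGER, larger_media_id INTEGER );' )

# the same pair can appear twice, e.g. once per matching search-result row
db.executemany(
    'INSERT INTO potential_duplicate_pairs VALUES ( ?, ? );',
    [ ( 1, 2 ), ( 1, 2 ), ( 2, 3 ) ]
)

( count, ) = db.execute( 'SELECT COUNT( * ) FROM ( SELECT DISTINCT smaller_media_id, larger_media_id FROM potential_duplicate_pairs );' ).fetchone()

print( count )  # the duplicated ( 1, 2 ) row only counts once
```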
@@ -2916,7 +3022,15 @@ class DB( HydrusDB.HydrusDB ):
        return ( storage_tag_data, display_tag_data )
        
    
-    def _GetHashIdsFromQuery( self, file_search_context: ClientSearch.FileSearchContext, job_key = None, query_hash_ids: typing.Optional[ set ] = None, apply_implicit_limit = True, sort_by = None, limit_sort_by = None ):
+    def _GetHashIdsFromQuery(
+        self,
+        file_search_context: ClientSearch.FileSearchContext,
+        job_key = None,
+        query_hash_ids: typing.Optional[ set ] = None,
+        apply_implicit_limit = True,
+        sort_by = None,
+        limit_sort_by = None
+    ) -> typing.List[ int ]:
        
        if job_key is None:
@@ -2944,7 +3058,7 @@ class DB( HydrusDB.HydrusDB ):
  if location_context.IsEmpty():
- return set()
+ return []
  current_file_service_ids = set()

@@ -2959,7 +3073,7 @@ class DB( HydrusDB.HydrusDB ):
  HydrusData.ShowText( 'A file search query was run for a file service that does not exist! If you just removed a service, you might want to try checking the search and/or restarting the client.' )
- return set()
+ return []
  current_file_service_ids.add( current_file_service_id )

@@ -2977,7 +3091,7 @@ class DB( HydrusDB.HydrusDB ):
  HydrusData.ShowText( 'A file search query was run for a file service that does not exist! If you just removed a service, you might want to try checking the search and/or restarting the client.' )
- return set()
+ return []
  deleted_file_service_ids.add( deleted_file_service_id )

@@ -2993,7 +3107,7 @@ class DB( HydrusDB.HydrusDB ):
  HydrusData.ShowText( 'A file search query was run for a tag service that does not exist! If you just removed a service, you might want to try checking the search and/or restarting the client.' )
- return set()
+ return []
  tags_to_include = file_search_context.GetTagsToInclude()

@@ -3619,7 +3733,7 @@ class DB( HydrusDB.HydrusDB ):
  if len( query_hash_ids ) == 0:
- return query_hash_ids
+ return []

@@ -3649,7 +3763,7 @@ class DB( HydrusDB.HydrusDB ):
  if len( query_hash_ids ) == 0:
- return query_hash_ids
+ return []

@@ -3679,7 +3793,7 @@ class DB( HydrusDB.HydrusDB ):
  if len( query_hash_ids ) == 0:
- return query_hash_ids
+ return []

@@ -3883,7 +3997,7 @@ class DB( HydrusDB.HydrusDB ):
  if job_key.IsCancelled():
- return set()
+ return []
  #

@@ -3921,7 +4035,7 @@ class DB( HydrusDB.HydrusDB ):
  if len( query_hash_ids ) == 0:
- return query_hash_ids
+ return []
  self._ExecuteMany( 'DELETE FROM {} WHERE hash_id = ?;'.format( temp_table_name ), ( ( hash_id, ) for hash_id in unwanted_hash_ids ) )

@@ -3935,7 +4049,7 @@ class DB( HydrusDB.HydrusDB ):
  if len( query_hash_ids ) == 0:
- return query_hash_ids
+ return []
  self._ExecuteMany( 'DELETE FROM {} WHERE hash_id = ?;'.format( temp_table_name ), ( ( hash_id, ) for hash_id in unwanted_hash_ids ) )

@@ -3949,7 +4063,7 @@ class DB( HydrusDB.HydrusDB ):
  if len( query_hash_ids ) == 0:
- return query_hash_ids
+ return []
  self._ExecuteMany( 'DELETE FROM {} WHERE hash_id = ?;'.format( temp_table_name ), ( ( hash_id, ) for hash_id in unwanted_hash_ids ) )

@@ -3959,7 +4073,7 @@ class DB( HydrusDB.HydrusDB ):
  if job_key.IsCancelled():
- return set()
+ return []
  #

@@ -4127,7 +4241,7 @@ class DB( HydrusDB.HydrusDB ):
  if job_key.IsCancelled():
- return set()
+ return []
  #

@@ -4270,7 +4384,7 @@ class DB( HydrusDB.HydrusDB ):
  if job_key.IsCancelled():
- return set()
+ return []
  #

@@ -4305,7 +4419,7 @@ class DB( HydrusDB.HydrusDB ):
  if job_key.IsCancelled():
- return set()
+ return []
  #
@@ -6101,6 +6215,17 @@ class DB( HydrusDB.HydrusDB ):
  return ( still_work_to_do, num_done )
+ def _PopulateSearchIntoTempTable( self, file_search_context: ClientSearch.FileSearchContext, temp_table_name: str ) -> typing.List[ int ]:
+     query_hash_ids = self._GetHashIdsFromQuery( file_search_context, apply_implicit_limit = False )
+     self._ExecuteMany( 'INSERT OR IGNORE INTO {} ( hash_id ) VALUES ( ? );'.format( temp_table_name ), ( ( hash_id, ) for hash_id in query_hash_ids ) )
+     self._AnalyzeTempTable( temp_table_name )
+     return query_hash_ids
  def _ProcessContentUpdates( self, service_keys_to_content_updates, publish_content_updates = True ):
  notify_new_downloads = False
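The new `_PopulateSearchIntoTempTable` helper factors out a pattern the old code inlined: dump search results into a temp table with `INSERT OR IGNORE`, then `ANALYZE` it so SQLite's planner has statistics for the table before the big pair join runs against it. A standalone sketch of that pattern; the table layout here is an assumption for illustration, not hydrus's real schema:

```python
import sqlite3

def populate_search_into_temp_table( con, temp_table_name, query_hash_ids ):
    
    # the PRIMARY KEY makes INSERT OR IGNORE silently skip duplicate ids
    con.execute( 'CREATE TEMP TABLE IF NOT EXISTS {} ( hash_id INTEGER PRIMARY KEY );'.format( temp_table_name ) )
    
    con.executemany( 'INSERT OR IGNORE INTO {} ( hash_id ) VALUES ( ? );'.format( temp_table_name ), ( ( hash_id, ) for hash_id in query_hash_ids ) )
    
    # gather row statistics so later joins against this table get a sensible query plan
    con.execute( 'ANALYZE {};'.format( temp_table_name ) )
    
    return query_hash_ids

con = sqlite3.connect( ':memory:' )

ids = populate_search_into_temp_table( con, 'temp_search_results', [ 1, 2, 2, 3 ] )

count = con.execute( 'SELECT COUNT( * ) FROM temp_search_results;' ).fetchone()[0]
```

The duplicate `2` in the input is dropped by `INSERT OR IGNORE`, so `count` ends up 3 while the caller still gets the original id list back.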
@@ -7854,6 +7979,8 @@ class DB( HydrusDB.HydrusDB ):
  job_key.Delete( 5 )
  HydrusData.ShowText( 'Now the mappings cache regen is done, you might want to restart the program.' )
+ self._cursor_transaction_wrapper.pub_after_job( 'notify_new_tag_display_application' )
+ self._cursor_transaction_wrapper.pub_after_job( 'notify_new_force_refresh_tags_data' )
@@ -581,7 +581,7 @@ class ClientDBFilesDuplicates( ClientDBModule.ClientDBModule ):
  return result_dict
- def DuplicatesGetFileHashesByDuplicateType( self, location_context: ClientLocation.LocationContext, hash, duplicate_type, allowed_hash_ids = None, preferred_hash_ids = None ):
+ def DuplicatesGetFileHashesByDuplicateType( self, location_context: ClientLocation.LocationContext, hash: bytes, duplicate_type: int, allowed_hash_ids = None, preferred_hash_ids = None ) -> typing.List[ bytes ]:
  hash_id = self.modules_hashes_local_cache.GetHashId( hash )
@@ -924,18 +924,19 @@ class ClientDBFilesDuplicates( ClientDBModule.ClientDBModule ):
  return media_id
- def DuplicatesGetPotentialDuplicatePairsTableJoinOnEverythingSearchResults( self, db_location_context: ClientDBFilesStorage.DBLocationContext, pixel_dupes_preference: int, max_hamming_distance: int ):
+ def DuplicatesGetPotentialDuplicatePairsTableJoinGetInitialTablesAndPreds( self, pixel_dupes_preference: int, max_hamming_distance: int ):
- tables = 'potential_duplicate_pairs, duplicate_files AS duplicate_files_smaller, duplicate_files AS duplicate_files_larger'
- join_predicate = 'smaller_media_id = duplicate_files_smaller.media_id AND larger_media_id = duplicate_files_larger.media_id AND distance <= {}'.format( max_hamming_distance )
+ tables = [
+     'potential_duplicate_pairs',
+     'duplicate_files AS duplicate_files_smaller',
+     'duplicate_files AS duplicate_files_larger'
+ ]
- if not db_location_context.location_context.IsAllKnownFiles():
+ join_predicates = [ 'smaller_media_id = duplicate_files_smaller.media_id AND larger_media_id = duplicate_files_larger.media_id' ]
- if pixel_dupes_preference != CC.SIMILAR_FILES_PIXEL_DUPES_REQUIRED:
- files_table_name = db_location_context.GetSingleFilesTableName()
- tables = '{}, {} AS current_files_smaller, {} AS current_files_larger'.format( tables, files_table_name, files_table_name )
- join_predicate = '{} AND duplicate_files_smaller.king_hash_id = current_files_smaller.hash_id AND duplicate_files_larger.king_hash_id = current_files_larger.hash_id'.format( join_predicate )
+ join_predicates.append( 'distance <= {}'.format( max_hamming_distance ) )
  if pixel_dupes_preference in ( CC.SIMILAR_FILES_PIXEL_DUPES_REQUIRED, CC.SIMILAR_FILES_PIXEL_DUPES_EXCLUDED ):

@@ -944,9 +945,12 @@ class ClientDBFilesDuplicates( ClientDBModule.ClientDBModule ):
  if pixel_dupes_preference == CC.SIMILAR_FILES_PIXEL_DUPES_REQUIRED:
- tables = '{}, pixel_hash_map AS pixel_hash_map_smaller, pixel_hash_map AS pixel_hash_map_larger'.format( tables )
+ tables.extend( [
+     'pixel_hash_map AS pixel_hash_map_smaller',
+     'pixel_hash_map AS pixel_hash_map_larger'
+ ] )
- join_predicate = '{} AND {}'.format( join_predicate, join_predicate_pixel_dupes )
+ join_predicates.append( join_predicate_pixel_dupes )
  elif pixel_dupes_preference == CC.SIMILAR_FILES_PIXEL_DUPES_EXCLUDED:

@@ -954,11 +958,30 @@ class ClientDBFilesDuplicates( ClientDBModule.ClientDBModule ):
  select_statement = 'SELECT 1 FROM pixel_hash_map AS pixel_hash_map_smaller, pixel_hash_map as pixel_hash_map_larger ON ( {} )'.format( join_predicate_pixel_dupes )
- join_predicate = '{} AND NOT EXISTS ( {} )'.format( join_predicate, select_statement )
+ join_predicates.append( 'NOT EXISTS ( {} )'.format( select_statement ) )
- table_join = '{} ON ( {} )'.format( tables, join_predicate )
+ return ( tables, join_predicates )
+ def DuplicatesGetPotentialDuplicatePairsTableJoinOnEverythingSearchResults( self, db_location_context: ClientDBFilesStorage.DBLocationContext, pixel_dupes_preference: int, max_hamming_distance: int ):
+     ( tables, join_predicates ) = self.DuplicatesGetPotentialDuplicatePairsTableJoinGetInitialTablesAndPreds( pixel_dupes_preference, max_hamming_distance )
+     if not db_location_context.location_context.IsAllKnownFiles():
+         files_table_name = db_location_context.GetSingleFilesTableName()
+         tables.extend( [
+             '{} AS current_files_smaller'.format( files_table_name ),
+             '{} AS current_files_larger'.format( files_table_name )
+         ] )
+         join_predicates.append( 'duplicate_files_smaller.king_hash_id = current_files_smaller.hash_id AND duplicate_files_larger.king_hash_id = current_files_larger.hash_id' )
+     table_join = '{} ON ( {} )'.format( ', '.join( tables ), ' AND '.join( join_predicates ) )
  return table_join
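The `SIMILAR_FILES_PIXEL_DUPES_EXCLUDED` branch above uses a `NOT EXISTS` subquery rather than simply negating the pixel-match predicate inside the join: a negated join predicate still enumerates every row combination where the match fails, while `NOT EXISTS` rejects a pair outright as soon as any pixel-hash match is found. A toy sqlite3 demonstration with hypothetical tables standing in for `potential_duplicate_pairs` and `pixel_hash_map`:

```python
import sqlite3

con = sqlite3.connect( ':memory:' )

con.execute( 'CREATE TABLE pairs ( smaller_id INTEGER, larger_id INTEGER );' )
con.execute( 'CREATE TABLE pixel_hashes ( file_id INTEGER, pixel_hash INTEGER );' )

con.executemany( 'INSERT INTO pairs VALUES ( ?, ? );', [ ( 1, 2 ), ( 3, 4 ) ] )

# files 1 and 2 share a pixel hash (they are pixel duplicates); 3 and 4 do not
con.executemany( 'INSERT INTO pixel_hashes VALUES ( ?, ? );', [ ( 1, 77 ), ( 2, 77 ), ( 3, 88 ), ( 4, 99 ) ] )

# the correlated NOT EXISTS prunes a pair the moment one pixel-hash match is found,
# instead of producing every row combination where the hashes merely differ
non_pixel_pairs = con.execute(
    'SELECT smaller_id, larger_id FROM pairs WHERE NOT EXISTS ( '
    'SELECT 1 FROM pixel_hashes AS ph_s, pixel_hashes AS ph_l '
    'WHERE ph_s.file_id = smaller_id AND ph_l.file_id = larger_id '
    'AND ph_s.pixel_hash = ph_l.pixel_hash );'
).fetchall()
```

Only the ( 3, 4 ) pair survives, because ( 1, 2 ) has a pixel-hash match that the subquery finds immediately.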
@@ -979,11 +1002,27 @@ class ClientDBFilesDuplicates( ClientDBModule.ClientDBModule ):
  return table_join
- def DuplicatesGetPotentialDuplicatePairsTableJoinOnSearchResults( self, db_location_context: ClientDBFilesStorage.DBLocationContext, results_table_name: str, both_files_match: bool, pixel_dupes_preference: int, max_hamming_distance: int ):
+ def DuplicatesGetPotentialDuplicatePairsTableJoinOnSearchResultsBothFiles( self, results_table_name: str, pixel_dupes_preference: int, max_hamming_distance: int ):
+     ( tables, join_predicates ) = self.DuplicatesGetPotentialDuplicatePairsTableJoinGetInitialTablesAndPreds( pixel_dupes_preference, max_hamming_distance )
+     tables.extend( [
+         '{} AS results_smaller'.format( results_table_name ),
+         '{} AS results_larger'.format( results_table_name )
+     ] )
+     join_predicates.append( 'duplicate_files_smaller.king_hash_id = results_smaller.hash_id AND duplicate_files_larger.king_hash_id = results_larger.hash_id' )
+     table_join = '{} ON ( {} )'.format( ', '.join( tables ), ' AND '.join( join_predicates ) )
+     return table_join
+ def DuplicatesGetPotentialDuplicatePairsTableJoinOnSearchResults( self, db_location_context: ClientDBFilesStorage.DBLocationContext, results_table_name: str, pixel_dupes_preference: int, max_hamming_distance: int ):
  # why yes this is a seven table join that involves a mix of duplicated tables, temporary tables, and duplicated temporary tables
  #
- # main thing is, give this guy a search in duplicate filter UI, it'll give you a fast table join that returns potential dupes that match that
+ # main thing is, give this guy a search from duplicate filter UI, it'll give you a fast table join that returns potential dupes that match that
  #
  # ████████████████████████████████████████████████████████████████████████

@@ -1030,61 +1069,58 @@ class ClientDBFilesDuplicates( ClientDBModule.ClientDBModule ):
  # ████████████████████████████████████████████████████████████████████████
  #
- base_tables = 'potential_duplicate_pairs, duplicate_files AS duplicate_files_smaller, duplicate_files AS duplicate_files_larger'
+ ( tables, join_predicates ) = self.DuplicatesGetPotentialDuplicatePairsTableJoinGetInitialTablesAndPreds( pixel_dupes_preference, max_hamming_distance )
- join_predicate_media_to_hashes = 'smaller_media_id = duplicate_files_smaller.media_id AND larger_media_id = duplicate_files_larger.media_id AND distance <= {}'.format( max_hamming_distance )
- if both_files_match:
+ if db_location_context.location_context.IsAllKnownFiles():
- tables = '{}, {} AS results_smaller, {} AS results_larger'.format( base_tables, results_table_name, results_table_name )
+ tables.append( '{} AS results_table_for_this_query'.format( results_table_name ) )
- join_predicate_hashes_to_allowed_results = 'duplicate_files_smaller.king_hash_id = results_smaller.hash_id AND duplicate_files_larger.king_hash_id = results_larger.hash_id'
+ join_predicates.append( '( duplicate_files_smaller.king_hash_id = results_table_for_this_query.hash_id OR duplicate_files_larger.king_hash_id = results_table_for_this_query.hash_id )' )
  else:
- if db_location_context.location_context.IsAllKnownFiles():
- tables = '{}, {} AS results_table_for_this_query'.format( base_tables, results_table_name )
- join_predicate_hashes_to_allowed_results = '( duplicate_files_smaller.king_hash_id = results_table_for_this_query.hash_id OR duplicate_files_larger.king_hash_id = results_table_for_this_query.hash_id )'
- else:
- files_table_name = db_location_context.GetSingleFilesTableName()
- tables = '{}, {} AS results_table_for_this_query, {} AS current_files_for_this_query'.format( base_tables, results_table_name, files_table_name )
- join_predicate_smaller_matches = '( duplicate_files_smaller.king_hash_id = results_table_for_this_query.hash_id AND duplicate_files_larger.king_hash_id = current_files_for_this_query.hash_id )'
- join_predicate_larger_matches = '( duplicate_files_smaller.king_hash_id = current_files_for_this_query.hash_id AND duplicate_files_larger.king_hash_id = results_table_for_this_query.hash_id )'
- join_predicate_hashes_to_allowed_results = '( {} OR {} )'.format( join_predicate_smaller_matches, join_predicate_larger_matches )
+ files_table_name = db_location_context.GetSingleFilesTableName()
+ tables.extend( [
+     '{} AS results_table_for_this_query'.format( results_table_name ),
+     '{} AS current_files_for_this_query'.format( files_table_name )
+ ] )
+ join_predicate_smaller_matches = '( duplicate_files_smaller.king_hash_id = results_table_for_this_query.hash_id AND duplicate_files_larger.king_hash_id = current_files_for_this_query.hash_id )'
+ join_predicate_larger_matches = '( duplicate_files_smaller.king_hash_id = current_files_for_this_query.hash_id AND duplicate_files_larger.king_hash_id = results_table_for_this_query.hash_id )'
+ join_predicates.append( '( {} OR {} )'.format( join_predicate_smaller_matches, join_predicate_larger_matches ) )
- if pixel_dupes_preference in ( CC.SIMILAR_FILES_PIXEL_DUPES_REQUIRED, CC.SIMILAR_FILES_PIXEL_DUPES_EXCLUDED ):
- join_predicate_pixel_dupes = 'duplicate_files_smaller.king_hash_id = pixel_hash_map_smaller.hash_id AND duplicate_files_larger.king_hash_id = pixel_hash_map_larger.hash_id AND pixel_hash_map_smaller.pixel_hash_id = pixel_hash_map_larger.pixel_hash_id'
- if pixel_dupes_preference == CC.SIMILAR_FILES_PIXEL_DUPES_REQUIRED:
- tables = '{}, pixel_hash_map AS pixel_hash_map_smaller, pixel_hash_map AS pixel_hash_map_larger'.format( tables )
- join_predicate_hashes_to_allowed_results = '{} AND {}'.format( join_predicate_hashes_to_allowed_results, join_predicate_pixel_dupes )
- elif pixel_dupes_preference == CC.SIMILAR_FILES_PIXEL_DUPES_EXCLUDED:
- # can't do "AND NOT {}", or the join will just give you the million rows where it isn't true. we want 'AND NEVER {}', and quick
- select_statement = 'SELECT 1 FROM pixel_hash_map AS pixel_hash_map_smaller, pixel_hash_map as pixel_hash_map_larger ON ( {} )'.format( join_predicate_pixel_dupes )
- join_predicate_hashes_to_allowed_results = '{} AND NOT EXISTS ( {} )'.format( join_predicate_hashes_to_allowed_results, select_statement )
+ table_join = '{} ON ( {} )'.format( ', '.join( tables ), ' AND '.join( join_predicates ) )
- join_predicate = '{} AND {}'.format( join_predicate_media_to_hashes, join_predicate_hashes_to_allowed_results )
  return table_join
- table_join = '{} ON ( {} )'.format( tables, join_predicate )
+ def DuplicatesGetPotentialDuplicatePairsTableJoinOnSeparateSearchResults( self, results_table_name_1: str, results_table_name_2: str, pixel_dupes_preference: int, max_hamming_distance: int ):
+     #
+     # And taking the above to its logical conclusion with two results sets, one file in xor either
+     #
+     ( tables, join_predicates ) = self.DuplicatesGetPotentialDuplicatePairsTableJoinGetInitialTablesAndPreds( pixel_dupes_preference, max_hamming_distance )
+     # we don't have to do any db_location_context jibber-jabber here as long as we stipulate that the two results sets have the same location context, which we'll enforce in UI
+     # just like above when 'both files match', we know we are db_location_context cross-referenced since we are intersecting with file searches performed on that search domain
+     # so, this is actually a bit simpler than the non-both-files-match one search case!!
+     tables.extend( [
+         '{} AS results_table_for_this_query_1'.format( results_table_name_1 ),
+         '{} AS results_table_for_this_query_2'.format( results_table_name_2 )
+     ] )
+     one_two = '( duplicate_files_smaller.king_hash_id = results_table_for_this_query_1.hash_id AND duplicate_files_larger.king_hash_id = results_table_for_this_query_2.hash_id )'
+     two_one = '( duplicate_files_smaller.king_hash_id = results_table_for_this_query_2.hash_id AND duplicate_files_larger.king_hash_id = results_table_for_this_query_1.hash_id )'
+     join_predicates.append( '( {} OR {} )'.format( one_two, two_one ) )
+     table_join = '{} ON ( {} )'.format( ', '.join( tables ), ' AND '.join( join_predicates ) )
+     return table_join
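The new `...OnSeparateSearchResults` join encodes 'one file is in this search, the other is in this search' by testing both orientations of the stored ( smaller, larger ) pair, since pair storage is ordered but the membership question is not. A toy version of that `( one_two OR two_one )` predicate with made-up ids, where `results_1` and `results_2` stand in for the two temp search-result tables:

```python
import sqlite3

con = sqlite3.connect( ':memory:' )

con.execute( 'CREATE TABLE pairs ( smaller_id INTEGER, larger_id INTEGER );' )
con.execute( 'CREATE TABLE results_1 ( hash_id INTEGER );' )
con.execute( 'CREATE TABLE results_2 ( hash_id INTEGER );' )

con.executemany( 'INSERT INTO pairs VALUES ( ?, ? );', [ ( 1, 2 ), ( 3, 4 ), ( 5, 6 ) ] )

con.executemany( 'INSERT INTO results_1 VALUES ( ? );', [ ( 1, ), ( 4, ) ] )   # e.g. the pngs
con.executemany( 'INSERT INTO results_2 VALUES ( ? );', [ ( 2, ), ( 3, ) ] )   # e.g. the jpegs

# pair (1, 2) has its search-1 file on the 'smaller' side, pair (3, 4) on the 'larger' side,
# so the predicate must accept either orientation; pair (5, 6) is in neither search
matching = con.execute(
    'SELECT DISTINCT smaller_id, larger_id FROM pairs, results_1 AS r1, results_2 AS r2 '
    'WHERE ( smaller_id = r1.hash_id AND larger_id = r2.hash_id ) '
    'OR ( smaller_id = r2.hash_id AND larger_id = r1.hash_id ) '
    'ORDER BY smaller_id;'
).fetchall()
```

Both ( 1, 2 ) and ( 3, 4 ) come back; without the second `OR` clause, the ( 3, 4 ) pair would be missed.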
@@ -3545,7 +3545,7 @@ class FrameGUI( ClientGUITopLevelWindows.MainFrameThatResizes, CAC.ApplicationCo
  ClientGUIMenus.AppendSeparator( menu )
- ClientGUIMenus.AppendMenuItem( menu, 'refresh', 'If the current page has a search, refresh it.', self._Refresh )
+ ClientGUIMenus.AppendMenuItem( menu, 'refresh', 'If the current page has a search, refresh it.', self._RefreshCurrentPage )
  splitter_menu = QW.QMenu( menu )

@@ -3677,7 +3677,7 @@ class FrameGUI( ClientGUITopLevelWindows.MainFrameThatResizes, CAC.ApplicationCo
  tag_display_maintenance_menu = QW.QMenu( menu )
- ClientGUIMenus.AppendMenuItem( tag_display_maintenance_menu, 'review tag sibling/parent maintenance', 'See how siblings and parents are currently applied.', self._ReviewTagDisplayMaintenance )
+ ClientGUIMenus.AppendMenuItem( tag_display_maintenance_menu, 'review current sync', 'See how siblings and parents are currently applied.', self._ReviewTagDisplayMaintenance )
  ClientGUIMenus.AppendSeparator( tag_display_maintenance_menu )
  check_manager = ClientGUICommon.CheckboxManagerOptions( 'tag_display_maintenance_during_idle' )

@@ -3685,14 +3685,14 @@ class FrameGUI( ClientGUITopLevelWindows.MainFrameThatResizes, CAC.ApplicationCo
  current_value = check_manager.GetCurrentValue()
  func = check_manager.Invert
- ClientGUIMenus.AppendMenuCheckItem( tag_display_maintenance_menu, 'sync tag display during idle time', 'Control whether tag display maintenance can work during idle time.', current_value, func )
+ ClientGUIMenus.AppendMenuCheckItem( tag_display_maintenance_menu, 'sync tag display during idle time', 'Control whether tag display processing can work during idle time.', current_value, func )
  check_manager = ClientGUICommon.CheckboxManagerOptions( 'tag_display_maintenance_during_active' )
  current_value = check_manager.GetCurrentValue()
  func = check_manager.Invert
- ClientGUIMenus.AppendMenuCheckItem( tag_display_maintenance_menu, 'sync tag display during normal time', 'Control whether tag display maintenance can work during normal time.', current_value, func )
+ ClientGUIMenus.AppendMenuCheckItem( tag_display_maintenance_menu, 'sync tag display during normal time', 'Control whether tag display processing can work during normal time.', current_value, func )
  ClientGUIMenus.AppendMenu( menu, tag_display_maintenance_menu, 'sibling/parent sync' )

@@ -5056,7 +5056,7 @@ class FrameGUI( ClientGUITopLevelWindows.MainFrameThatResizes, CAC.ApplicationCo
  self._controller.Write( 'save_options', HC.options )
- def _Refresh( self ):
+ def _RefreshCurrentPage( self ):
  page = self._notebook.GetCurrentMediaPage()

@@ -5582,7 +5582,7 @@ class FrameGUI( ClientGUITopLevelWindows.MainFrameThatResizes, CAC.ApplicationCo
  def _ReviewTagDisplayMaintenance( self ):
- frame = ClientGUITopLevelWindowsPanels.FrameThatTakesScrollablePanel( self, 'tag display maintenance' )
+ frame = ClientGUITopLevelWindowsPanels.FrameThatTakesScrollablePanel( self, 'tag display sync' )
  panel = ClientGUITags.ReviewTagDisplayMaintenancePanel( frame )

@@ -7619,7 +7619,7 @@ The password is cleartext here but obscured in the entry dialog. Enter a blank p
  elif action == CAC.SIMPLE_REFRESH:
- self._Refresh()
+ self._RefreshCurrentPage()
  elif action == CAC.SIMPLE_REFRESH_ALL_PAGES:

@@ -7820,6 +7820,18 @@ The password is cleartext here but obscured in the entry dialog. Enter a blank p
  self._controller.CallToThread( self._controller.SaveGUISession, session )
+ def RefreshPage( self, page_key: bytes ):
+     page = self._notebook.GetPageFromPageKey( page_key )
+     if page is None:
+         raise HydrusExceptions.DataMissing( 'Could not find that page!' )
+     page.RefreshQuery()
  def RefreshStatusBar( self ):
  self._RefreshStatusBar()
@@ -1225,6 +1225,7 @@ class EditDuplicateContentMergeOptionsPanel( ClientGUIScrolledPanels.EditPanel ):
  self._sync_urls_action.addItem( HC.content_merge_string_lookup[ HC.CONTENT_MERGE_ACTION_COPY ], HC.CONTENT_MERGE_ACTION_COPY )
  self._sync_notes_action.addItem( HC.content_merge_string_lookup[ HC.CONTENT_MERGE_ACTION_COPY ], HC.CONTENT_MERGE_ACTION_COPY )
+ self._sync_notes_action.addItem( HC.content_merge_string_lookup[ HC.CONTENT_MERGE_ACTION_MOVE ], HC.CONTENT_MERGE_ACTION_MOVE )
  self._sync_urls_action.addItem( HC.content_merge_string_lookup[ HC.CONTENT_MERGE_ACTION_TWO_WAY_MERGE ], HC.CONTENT_MERGE_ACTION_TWO_WAY_MERGE )
@@ -687,7 +687,9 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
  self._duplicate_comparison_score_nicer_ratio.setToolTip( 'For instance, 16:9 vs 640:357.')
- self._duplicate_filter_max_batch_size = ClientGUICommon.BetterSpinBox( self, min = 5, max = 1024 )
+ batches_panel = ClientGUICommon.StaticBox( self, 'duplicate filter batches' )
+ self._duplicate_filter_max_batch_size = ClientGUICommon.BetterSpinBox( batches_panel, min = 5, max = 1024 )
  #

@@ -739,9 +741,11 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
  rows.append( ( 'Max size of duplicate filter pair batches:', self._duplicate_filter_max_batch_size ) )
- gridbox = ClientGUICommon.WrapInGrid( self, rows )
+ gridbox = ClientGUICommon.WrapInGrid( batches_panel, rows )
- QP.AddToLayout( vbox, gridbox, CC.FLAGS_EXPAND_SIZER_PERPENDICULAR )
+ batches_panel.Add( gridbox, CC.FLAGS_EXPAND_SIZER_PERPENDICULAR )
+ QP.AddToLayout( vbox, batches_panel, CC.FLAGS_EXPAND_PERPENDICULAR )
  vbox.addStretch( 1 )
  self.setLayout( vbox )
@@ -4013,9 +4017,9 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
  # I tried <100%, but Qt seems to cap it to 1.0. Sad!
  self._thumbnail_dpr_percentage = ClientGUICommon.BetterSpinBox( self, min = 100, max = 800 )
- tt = 'If your OS runs at an UI scale greater than 100%, mirror it here, and your thumbnails will look crisp. If you have multiple monitors at different UI scales, set it to the one you will be looking at hydrus thumbnails on more often. Setting this value to anything other than the monitor hydrus is currently on will cause thumbs to look pixellated and/or muddy.'
+ tt = 'If your OS runs at an UI scale greater than 100%, mirror it here and your thumbnails will look crisp. If you have multiple monitors at different UI scales, or you change UI scale regularly, set it to the largest one you use.'
  tt += os.linesep * 2
- tt += 'I believe your UI scale is {}'.format( HydrusData.ConvertFloatToPercentage( self.devicePixelRatio() ) )
+ tt += 'I believe the UI scale on the monitor this dialog opened on was {}'.format( HydrusData.ConvertFloatToPercentage( self.devicePixelRatio() ) )
  self._thumbnail_dpr_percentage.setToolTip( tt )

@@ -4073,10 +4077,10 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
  rows.append( ( 'Thumbnail border: ', self._thumbnail_border ) )
  rows.append( ( 'Thumbnail margin: ', self._thumbnail_margin ) )
  rows.append( ( 'Thumbnail scaling: ', self._thumbnail_scale_type ) )
- rows.append( ( 'Thumbnail UI scale supersampling %: ', self._thumbnail_dpr_percentage ) )
- rows.append( ( 'Focus thumbnails in the preview window on ctrl-click: ', self._focus_preview_on_ctrl_click ) )
+ rows.append( ( 'Thumbnail UI-scale supersampling %: ', self._thumbnail_dpr_percentage ) )
+ rows.append( ( 'On ctrl-click, focus thumbnails in the preview window: ', self._focus_preview_on_ctrl_click ) )
  rows.append( ( ' Only on files with no duration: ', self._focus_preview_on_ctrl_click_only_static ) )
- rows.append( ( 'Focus thumbnails in the preview window on shift-click: ', self._focus_preview_on_shift_click ) )
+ rows.append( ( 'On shift-click, focus thumbnails in the preview window: ', self._focus_preview_on_shift_click ) )
  rows.append( ( ' Only on files with no duration: ', self._focus_preview_on_shift_click_only_static ) )
  rows.append( ( 'Generate video thumbnails this % in: ', self._video_thumbnail_percentage_in ) )
  rows.append( ( 'Do not scroll down on key navigation if thumbnail at least this % visible: ', self._thumbnail_visibility_scroll_percent ) )
@@ -364,6 +364,8 @@ class EditTagDisplayApplication( ClientGUIScrolledPanels.EditPanel ):
  message = 'While a tag service normally only applies its own siblings and parents to itself, it does not have to. You can have other services\' rules apply (e.g. putting the PTR\'s siblings on your "my tags"), or no siblings/parents at all.'
  message += os.linesep * 2
  message += 'If you apply multiple services and there are conflicts (e.g. disagreements on where siblings go, or loops), the services at the top of the list have precedence. If you want to overwrite some PTR rules, then make what you want on a local service and then put it above the PTR here. Also, siblings apply first, then parents.'
+ message += os.linesep * 2
+ message += 'If you make big changes here, it will take a long time for the client to recalculate everything. Check the sync progress panel under _tags->sibling/parent sync_ to see how it is going. If your client gets laggy doing the recalc, turn it off during "normal time".'
  self._message = ClientGUICommon.BetterStaticText( self, label = message )
  self._message.setWordWrap( True )

@@ -5118,7 +5120,8 @@ class ReviewTagDisplayMaintenancePanel( ClientGUIScrolledPanels.ReviewPanel ):
  vbox = QP.VBoxLayout()
- message = 'Figuring out how tags should appear according to sibling and parent application rules takes time. When you set new rules, the changes do not happen immediately--the client catches up in the background. You can review current progress and force faster sync here.'
+ message = 'Figuring out how tags should appear according to sibling and parent application rules takes time. When you set new rules, the changes do not happen immediately--the client catches up in the background. This work takes a lot of math and can be laggy.'
  self._message = ClientGUICommon.BetterStaticText( self, label = message )
  self._message.setWordWrap( True )
@@ -2219,9 +2219,9 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
  showPairInPage = QC.Signal( list )
- def __init__( self, parent, file_search_context: ClientSearch.FileSearchContext, both_files_match, pixel_dupes_preference, max_hamming_distance ):
+ def __init__( self, parent, file_search_context_1: ClientSearch.FileSearchContext, file_search_context_2: ClientSearch.FileSearchContext, dupe_search_type, pixel_dupes_preference, max_hamming_distance ):
- location_context = file_search_context.GetLocationContext()
+ location_context = file_search_context_1.GetLocationContext()
  CanvasWithHovers.__init__( self, parent, location_context )

@@ -2234,8 +2234,9 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
  self._my_shortcuts_handler.AddWindowToFilter( hover )
- self._file_search_context = file_search_context
- self._both_files_match = both_files_match
+ self._file_search_context_1 = file_search_context_1
+ self._file_search_context_2 = file_search_context_2
+ self._dupe_search_type = dupe_search_type
  self._pixel_dupes_preference = pixel_dupes_preference
  self._max_hamming_distance = max_hamming_distance

@@ -2623,7 +2624,7 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
  self._currently_fetching_pairs = True
- HG.client_controller.CallToThread( self.THREADFetchPairs, self._file_search_context, self._both_files_match, self._pixel_dupes_preference, self._max_hamming_distance )
+ HG.client_controller.CallToThread( self.THREADFetchPairs, self._file_search_context_1, self._file_search_context_2, self._dupe_search_type, self._pixel_dupes_preference, self._max_hamming_distance )
  self.update()

@@ -2819,7 +2820,7 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
  first_media = ClientMedia.MediaSingleton( first_media_result )
  second_media = ClientMedia.MediaSingleton( second_media_result )
- score = ClientMedia.GetDuplicateComparisonScore( first_media, second_media )
+ score = ClientDuplicates.GetDuplicateComparisonScore( first_media, second_media )
  if score > 0:

@@ -3057,8 +3058,8 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
  HG.client_controller.pub( 'new_similar_files_potentials_search_numbers' )
- ClientMedia.hashes_to_jpeg_quality = {} # clear the cache
- ClientMedia.hashes_to_pixel_hashes = {} # clear the cache
+ ClientDuplicates.hashes_to_jpeg_quality = {} # clear the cache
+ ClientDuplicates.hashes_to_pixel_hashes = {} # clear the cache
  CanvasWithHovers.CleanBeforeDestroy( self )

@@ -3244,7 +3245,7 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
- def THREADFetchPairs( self, file_search_context, both_files_match, pixel_dupes_preference, max_hamming_distance ):
+ def THREADFetchPairs( self, file_search_context_1, file_search_context_2, dupe_search_type, pixel_dupes_preference, max_hamming_distance ):
  def qt_close():

@@ -3273,7 +3274,7 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
  self._ShowCurrentPair()
- result = HG.client_controller.Read( 'duplicate_pairs_for_filtering', file_search_context, both_files_match, pixel_dupes_preference, max_hamming_distance )
+ result = HG.client_controller.Read( 'duplicate_pairs_for_filtering', file_search_context_1, file_search_context_2, dupe_search_type, pixel_dupes_preference, max_hamming_distance )
  if len( result ) == 0:
|
|
@@ -12,6 +12,7 @@ from hydrus.core import HydrusSerialisable

from hydrus.client import ClientApplicationCommand as CAC
from hydrus.client import ClientConstants as CC
from hydrus.client import ClientData
+ from hydrus.client import ClientDuplicates
from hydrus.client import ClientGUIDragDrop
from hydrus.client import ClientGUICore as CGC
from hydrus.client import ClientGUIFunctions

@@ -1884,7 +1885,7 @@ class CanvasHoverFrameRightDuplicates( CanvasHoverFrame ):

def _ResetComparisonStatements( self ):

- statements_and_scores = ClientMedia.GetDuplicateComparisonStatements( self._current_media, self._comparison_media )
+ statements_and_scores = ClientDuplicates.GetDuplicateComparisonStatements( self._current_media, self._comparison_media )

for name in self._comparison_statement_names:

@@ -127,3 +127,72 @@ class EditSidecarDetailsPanel( ClientGUICommon.StaticBox ):

self._filename_string_converter.SetValue( filename_string_converter )

+ SEPARATOR_NEWLINE = 0
+ SEPARATOR_CUSTOM = 1
+
+ class EditSidecarTXTSeparator( ClientGUICommon.StaticBox ):
+
+     def __init__( self, parent: QW.QWidget ):
+
+         ClientGUICommon.StaticBox.__init__( self, parent, 'sidecar txt separator' )
+
+         self._choice = ClientGUICommon.BetterChoice( self )
+
+         self._choice.addItem( 'newline', SEPARATOR_NEWLINE )
+         self._choice.addItem( 'custom text', SEPARATOR_CUSTOM )
+
+         tt = 'You can separate the "rows" of tags by something other than newlines if you like. If you have/want a CSV list, try a separator of "," or ", ".'
+
+         self._choice.setToolTip( tt )
+
+         self._custom_input = QW.QLineEdit( self )
+
+         rows = []
+
+         rows.append( ( 'separator: ', self._choice ) )
+         rows.append( ( 'custom: ', self._custom_input ) )
+
+         gridbox = ClientGUICommon.WrapInGrid( self, rows )
+
+         self.Add( gridbox, CC.FLAGS_EXPAND_SIZER_PERPENDICULAR )
+
+         self._choice.currentIndexChanged.connect( self._UpdateControls )
+
+     def _UpdateControls( self ):
+
+         value = self._choice.GetValue()
+
+         self._custom_input.setEnabled( value == SEPARATOR_CUSTOM )
+
+     def GetValue( self ):
+
+         value = self._choice.GetValue()
+
+         if value == SEPARATOR_NEWLINE:
+
+             return '\n'
+
+         else:
+
+             return self._custom_input.text()
+
+     def SetValue( self, value: str ):
+
+         if value == '\n':
+
+             self._choice.SetValue( SEPARATOR_NEWLINE )
+
+         else:
+
+             self._choice.SetValue( SEPARATOR_CUSTOM )
+             self._custom_input.setText( value )
+
+         self._UpdateControls()

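The new separator setting ultimately controls how a .txt sidecar's tag "rows" are joined on export and split back apart on import. A minimal standalone sketch of that round-trip, independent of the Qt panel above (the helper names here are illustrative, not hydrus API):

```python
# Illustrative sketch: join tag rows with a configurable separator on export,
# split them again on import. A newline separator reproduces the old behaviour.

def rows_to_text( rows, separator = '\n' ):
    
    return separator.join( rows )

def text_to_rows( text, separator = '\n' ):
    
    # discard empty entries, e.g. from a trailing separator
    return [ row for row in text.split( separator ) if row != '' ]

tags = [ 'blue eyes', 'smile', 'skirt' ]

csv_text = rows_to_text( tags, ', ' )  # 'blue eyes, smile, skirt'

assert text_to_rows( csv_text, ', ' ) == tags
```

The same separator value must be used on both sides, which is why the panel is shared between the importer and exporter edit dialogs below.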
@@ -96,6 +96,10 @@ class EditSingleFileMetadataExporterPanel( ClientGUIScrolledPanels.EditPanel ):

#

+ self._txt_separator_panel = ClientGUIMetadataMigrationCommon.EditSidecarTXTSeparator( self )
+
+ #

self._sidecar_panel = ClientGUIMetadataMigrationCommon.EditSidecarDetailsPanel( self )

#

@@ -106,6 +110,7 @@ class EditSingleFileMetadataExporterPanel( ClientGUIScrolledPanels.EditPanel ):

QP.AddToLayout( vbox, self._service_selection_panel, CC.FLAGS_EXPAND_PERPENDICULAR )
QP.AddToLayout( vbox, self._sidecar_help_button, CC.FLAGS_ON_RIGHT )
QP.AddToLayout( vbox, self._nested_object_names_panel, CC.FLAGS_EXPAND_BOTH_WAYS )
+ QP.AddToLayout( vbox, self._txt_separator_panel, CC.FLAGS_EXPAND_PERPENDICULAR )
QP.AddToLayout( vbox, self._sidecar_panel, CC.FLAGS_EXPAND_PERPENDICULAR )

vbox.addStretch( 1 )

@@ -148,7 +153,7 @@ class EditSingleFileMetadataExporterPanel( ClientGUIScrolledPanels.EditPanel ):

exporter = exporter_class()

- # it is nice to preserve old values as we flip from one type to another. more pleasant that making the user cancel and re-open
+ # it is nice to preserve old values as we flip from one type to another. more pleasant than making the user cancel and re-open

if isinstance( exporter, ClientMetadataMigrationExporters.SingleFileMetadataExporterSidecar ):

@@ -171,7 +176,7 @@ class EditSingleFileMetadataExporterPanel( ClientGUIScrolledPanels.EditPanel ):

elif isinstance( exporter, ClientMetadataMigrationExporters.SingleFileMetadataExporterTXT ):

- pass
+ exporter.SetSeparator( self._txt_separator_panel.GetValue() )

elif isinstance( exporter, ClientMetadataMigrationExporters.SingleFileMetadataExporterJSON ):

@@ -233,8 +238,9 @@ class EditSingleFileMetadataExporterPanel( ClientGUIScrolledPanels.EditPanel ):

remove_actual_filename_ext = self._sidecar_panel.GetRemoveActualFilenameExt()
suffix = self._sidecar_panel.GetSuffix()
filename_string_converter = self._sidecar_panel.GetFilenameStringConverter()
+ separator = self._txt_separator_panel.GetValue()

- exporter = ClientMetadataMigrationExporters.SingleFileMetadataExporterTXT( remove_actual_filename_ext = remove_actual_filename_ext, suffix = suffix, filename_string_converter = filename_string_converter )
+ exporter = ClientMetadataMigrationExporters.SingleFileMetadataExporterTXT( remove_actual_filename_ext = remove_actual_filename_ext, suffix = suffix, filename_string_converter = filename_string_converter, separator = separator )

elif self._current_exporter_class == ClientMetadataMigrationExporters.SingleFileMetadataExporterJSON:

@@ -277,6 +283,7 @@ class EditSingleFileMetadataExporterPanel( ClientGUIScrolledPanels.EditPanel ):

self._service_selection_panel.setVisible( False )
self._sidecar_help_button.setVisible( False )
self._nested_object_names_panel.setVisible( False )
+ self._txt_separator_panel.setVisible( False )
self._sidecar_panel.setVisible( False )

if isinstance( exporter, ClientMetadataMigrationExporters.SingleFileMetadataExporterSidecar ):

@@ -318,6 +325,10 @@ class EditSingleFileMetadataExporterPanel( ClientGUIScrolledPanels.EditPanel ):

self._sidecar_panel.SetSidecarExt( 'txt' )
self._sidecar_panel.SetExampleInput( '01234564789abcdef.jpg' )

+ self._txt_separator_panel.SetValue( exporter.GetSeparator() )
+
+ self._txt_separator_panel.setVisible( True )

elif isinstance( exporter, ClientMetadataMigrationExporters.SingleFileMetadataExporterJSON ):

self._sidecar_panel.SetSidecarExt( 'json' )

@@ -87,6 +87,10 @@ class EditSingleFileMetadataImporterPanel( ClientGUIScrolledPanels.EditPanel ):

#

+ self._txt_separator_panel = ClientGUIMetadataMigrationCommon.EditSidecarTXTSeparator( self )
+
+ #

self._sidecar_panel = ClientGUIMetadataMigrationCommon.EditSidecarDetailsPanel( self )

#

@@ -111,6 +115,7 @@ class EditSingleFileMetadataImporterPanel( ClientGUIScrolledPanels.EditPanel ):

QP.AddToLayout( vbox, self._change_type_button, CC.FLAGS_EXPAND_PERPENDICULAR )
QP.AddToLayout( vbox, self._service_selection_panel, CC.FLAGS_EXPAND_PERPENDICULAR )
QP.AddToLayout( vbox, self._json_parsing_formula_panel, CC.FLAGS_EXPAND_PERPENDICULAR )
+ QP.AddToLayout( vbox, self._txt_separator_panel, CC.FLAGS_EXPAND_PERPENDICULAR )
QP.AddToLayout( vbox, self._sidecar_panel, CC.FLAGS_EXPAND_PERPENDICULAR )
QP.AddToLayout( vbox, self._string_processor_panel, CC.FLAGS_EXPAND_PERPENDICULAR )

@@ -172,7 +177,7 @@ class EditSingleFileMetadataImporterPanel( ClientGUIScrolledPanels.EditPanel ):

elif isinstance( importer, ClientMetadataMigrationImporters.SingleFileMetadataImporterTXT ):

- pass
+ importer.SetSeparator( self._txt_separator_panel.GetValue() )

elif isinstance( importer, ClientMetadataMigrationImporters.SingleFileMetadataImporterJSON ):

@@ -231,8 +236,9 @@ class EditSingleFileMetadataImporterPanel( ClientGUIScrolledPanels.EditPanel ):

remove_actual_filename_ext = self._sidecar_panel.GetRemoveActualFilenameExt()
suffix = self._sidecar_panel.GetSuffix()
filename_string_converter = self._sidecar_panel.GetFilenameStringConverter()
+ separator = self._txt_separator_panel.GetValue()

- importer = ClientMetadataMigrationImporters.SingleFileMetadataImporterTXT( string_processor = string_processor, remove_actual_filename_ext = remove_actual_filename_ext, suffix = suffix, filename_string_converter = filename_string_converter )
+ importer = ClientMetadataMigrationImporters.SingleFileMetadataImporterTXT( string_processor = string_processor, remove_actual_filename_ext = remove_actual_filename_ext, suffix = suffix, filename_string_converter = filename_string_converter, separator = separator )

elif self._current_importer_class == ClientMetadataMigrationImporters.SingleFileMetadataImporterJSON:

@@ -276,6 +282,7 @@ class EditSingleFileMetadataImporterPanel( ClientGUIScrolledPanels.EditPanel ):

self._service_selection_panel.setVisible( False )
self._json_parsing_formula_panel.setVisible( False )
+ self._txt_separator_panel.setVisible( False )
self._sidecar_panel.setVisible( False )

if isinstance( importer, ClientMetadataMigrationImporters.SingleFileMetadataImporterSidecar ):

@@ -308,6 +315,10 @@ class EditSingleFileMetadataImporterPanel( ClientGUIScrolledPanels.EditPanel ):

self._sidecar_panel.SetSidecarExt( 'txt' )
self._sidecar_panel.SetExampleInput( 'my_image.jpg' )

+ self._txt_separator_panel.SetValue( importer.GetSeparator() )
+
+ self._txt_separator_panel.setVisible( True )

elif isinstance( importer, ClientMetadataMigrationImporters.SingleFileMetadataImporterJSON ):

self._sidecar_panel.SetSidecarExt( 'json' )

@@ -808,6 +808,42 @@ class ReviewAccountsPanel( QW.QWidget ):

self._RefreshAccounts()

+ def UncheckAccountKey( self, account_key: bytes ):
+
+     for i in range( self._account_list.count() ):
+
+         item = self._account_list.item( i )
+
+         checked_account_key = item.data( QC.Qt.UserRole )
+
+         if checked_account_key == account_key:
+
+             item.setCheckState( QC.Qt.Unchecked )
+
+             return
+
+ def UncheckNullAccount( self ):
+
+     for i in range( self._account_list.count() ):
+
+         item = self._account_list.item( i )
+
+         account_key = item.data( QC.Qt.UserRole )
+
+         account = self._account_keys_to_accounts[ account_key ]
+
+         if account.IsNullAccount():
+
+             item.setCheckState( QC.Qt.Unchecked )
+
+             return

class ModifyAccountsPanel( ClientGUIScrolledPanels.ReviewPanel ):

@@ -968,12 +1004,28 @@ class ModifyAccountsPanel( ClientGUIScrolledPanels.ReviewPanel ):

expires_delta = self._add_to_expires.GetValue()

+ self._account_panel.UncheckNullAccount()
+
+ subject_accounts = self._account_panel.GetCheckedAccounts()
+
+ num_unchecked = 0
+
+ for subject_account in subject_accounts:
+
+     if subject_account.GetExpires() is None:
+
+         self._account_panel.UncheckAccountKey( subject_account.GetAccountKey() )
+
+         num_unchecked += 1
+
+ QW.QMessageBox.information( self, 'Information', '{} accounts do not expire, so could not have time added!'.format( HydrusData.ToHumanInt( num_unchecked ) ) )

subject_accounts = self._account_panel.GetCheckedAccounts()

subject_account_keys_and_current_expires = [ ( subject_account.GetAccountKey(), subject_account.GetExpires() ) for subject_account in subject_accounts ]

+ subject_account_keys_and_current_expires = [ pair for pair in subject_account_keys_and_current_expires if pair[1] is not None ]

subject_account_keys_and_new_expires = [ ( subject_account_key, current_expires + expires_delta ) for ( subject_account_key, current_expires ) in subject_account_keys_and_current_expires ]

self._DoExpires( subject_account_keys_and_new_expires )

@@ -995,6 +1047,8 @@ class ModifyAccountsPanel( ClientGUIScrolledPanels.ReviewPanel ):

def _DoAccountType( self ):

+ self._account_panel.UncheckNullAccount()

subject_account_keys = self._account_panel.GetCheckedAccountKeys()

if len( subject_account_keys ) == 0:

@@ -1036,6 +1090,8 @@ class ModifyAccountsPanel( ClientGUIScrolledPanels.ReviewPanel ):

def _DoBan( self ):

+ self._account_panel.UncheckNullAccount()

subject_accounts = self._account_panel.GetCheckedAccounts()

if len( subject_accounts ) == 0:

@@ -1149,6 +1205,8 @@ class ModifyAccountsPanel( ClientGUIScrolledPanels.ReviewPanel ):

def _DoSetMessage( self ):

+ self._account_panel.UncheckNullAccount()

subject_accounts = self._account_panel.GetCheckedAccounts()

if len( subject_accounts ) == 0:

@@ -1206,6 +1264,8 @@ class ModifyAccountsPanel( ClientGUIScrolledPanels.ReviewPanel ):

def _DoUnban( self ):

+ self._account_panel.UncheckNullAccount()

subject_accounts = self._account_panel.GetCheckedAccounts()

if len( subject_accounts ) == 0:

@@ -1299,6 +1359,8 @@ class ModifyAccountsPanel( ClientGUIScrolledPanels.ReviewPanel ):

def _SetExpires( self ):

+ self._account_panel.UncheckNullAccount()

expires = self._set_expires.GetValue()

if expires is not None:

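The 'add time to expiry' flow above first unchecks accounts whose expiry is None (accounts that never expire), then applies the delta only to the remaining pairs. The core filtering arithmetic can be sketched standalone (the function and variable names here are illustrative, not hydrus API):

```python
# Illustrative sketch of the expiry-extension logic: accounts whose expiry
# is None never expire, so they are dropped (as the panel now does by
# unchecking them); every other account gets the delta added.

def extend_expiries( account_keys_to_expires, expires_delta ):
    
    keys_and_current = list( account_keys_to_expires.items() )
    
    # skip non-expiring accounts
    keys_and_current = [ pair for pair in keys_and_current if pair[1] is not None ]
    
    return [ ( key, current + expires_delta ) for ( key, current ) in keys_and_current ]

accounts = { b'aaa' : 1000, b'bbb' : None, b'ccc' : 2000 }

new_expiries = extend_expiries( accounts, 500 )

assert new_expiries == [ ( b'aaa', 1500 ), ( b'ccc', 2500 ) ]
```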
@@ -146,6 +146,7 @@ def CreateManagementController( page_name, management_type, location_context = N

return management_controller

def CreateManagementControllerDuplicateFilter():

default_location_context = HG.client_controller.new_options.GetDefaultLocalLocationContext()

@@ -158,8 +159,9 @@ def CreateManagementControllerDuplicateFilter():

management_controller.SetVariable( 'synchronised', synchronised )

- management_controller.SetVariable( 'file_search_context', file_search_context )
- management_controller.SetVariable( 'both_files_match', False )
+ management_controller.SetVariable( 'file_search_context_1', file_search_context )
+ management_controller.SetVariable( 'file_search_context_2', file_search_context.Duplicate() )
+ management_controller.SetVariable( 'dupe_search_type', CC.DUPE_SEARCH_ONE_FILE_MATCHES_ONE_SEARCH )
management_controller.SetVariable( 'pixel_dupes_preference', CC.SIMILAR_FILES_PIXEL_DUPES_ALLOWED )
management_controller.SetVariable( 'max_hamming_distance', 4 )

@@ -265,7 +267,7 @@ def CreateManagementControllerPetitions( petition_service_key ):

management_controller = CreateManagementController( page_name, MANAGEMENT_TYPE_PETITIONS, location_context = location_context )

- management_controller.SetKey( 'petition_service', petition_service_key )
+ management_controller.SetVariable( 'petition_service_key', petition_service_key )

return management_controller

@@ -287,7 +289,7 @@ class ManagementController( HydrusSerialisable.SerialisableBase ):

SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_MANAGEMENT_CONTROLLER
SERIALISABLE_NAME = 'Client Page Management Controller'
- SERIALISABLE_VERSION = 11
+ SERIALISABLE_VERSION = 12

def __init__( self, page_name = 'page' ):

@@ -299,9 +301,7 @@ class ManagementController( HydrusSerialisable.SerialisableBase ):

self._last_serialisable_change_timestamp = 0

- self._keys = {}
- self._simples = {}
- self._serialisables = {}
+ self._variables = HydrusSerialisable.SerialisableDictionary()

def __repr__( self ):

@@ -314,31 +314,23 @@ class ManagementController( HydrusSerialisable.SerialisableBase ):

# don't save these
TRANSITORY_KEYS = { 'page' }

- serialisable_keys = { name : value.hex() for ( name, value ) in self._keys.items() if name not in TRANSITORY_KEYS }
- serialisable_simples = dict( self._simples )
- serialisable_serialisables = { name : value.GetSerialisableTuple() for ( name, value ) in self._serialisables.items() }
+ serialisable_variables = self._variables.GetSerialisableTuple()

- return ( self._page_name, self._management_type, serialisable_keys, serialisable_simples, serialisable_serialisables )
+ return ( self._page_name, self._management_type, serialisable_variables )

def _InitialiseDefaults( self ):

- self._serialisables[ 'media_sort' ] = ClientMedia.MediaSort( ( 'system', CC.SORT_FILES_BY_FILESIZE ), CC.SORT_ASC )
+ self._variables[ 'media_sort' ] = ClientMedia.MediaSort( ( 'system', CC.SORT_FILES_BY_FILESIZE ), CC.SORT_ASC )

def _InitialiseFromSerialisableInfo( self, serialisable_info ):

- ( self._page_name, self._management_type, serialisable_keys, serialisable_simples, serialisable_serialisables ) = serialisable_info
+ ( self._page_name, self._management_type, serialisable_variables ) = serialisable_info

self._InitialiseDefaults()

- self._keys.update( { name : bytes.fromhex( key ) for ( name, key ) in serialisable_keys.items() } )
- self._simples.update( dict( serialisable_simples ) )
- self._serialisables.update( { name : HydrusSerialisable.CreateFromSerialisableTuple( value ) for ( name, value ) in list(serialisable_serialisables.items()) } )
+ self._variables.update( HydrusSerialisable.CreateFromSerialisableTuple( serialisable_variables ) )

def _SerialisableChangeMade( self ):

@@ -588,6 +580,62 @@ class ManagementController( HydrusSerialisable.SerialisableBase ):

return ( 11, new_serialisable_info )

+ if version == 11:
+
+     ( page_name, management_type, serialisable_keys, serialisable_simples, serialisable_serialisables ) = old_serialisable_info
+
+     # notice I rename them to _key here!
+     # we had 'page' and 'petition_service' before, so adding the key brings us in line with elsewhere
+     keys = { name + '_key' : bytes.fromhex( key ) for ( name, key ) in serialisable_keys.items() }
+
+     simples = dict( serialisable_simples )
+
+     serialisables = { name : HydrusSerialisable.CreateFromSerialisableTuple( value ) for ( name, value ) in list(serialisable_serialisables.items()) }
+
+     variables = HydrusSerialisable.SerialisableDictionary()
+
+     variables.update( keys )
+     variables.update( simples )
+     variables.update( serialisables )
+
+     if management_type == MANAGEMENT_TYPE_DUPLICATE_FILTER:
+
+         value = CC.DUPE_SEARCH_ONE_FILE_MATCHES_ONE_SEARCH
+
+         if 'both_files_match' in variables:
+
+             if variables[ 'both_files_match' ]:
+
+                 value = CC.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH
+
+             del variables[ 'both_files_match' ]
+
+         variables[ 'dupe_search_type' ] = value
+
+         default_location_context = HG.client_controller.new_options.GetDefaultLocalLocationContext()
+
+         file_search_context = ClientSearch.FileSearchContext( location_context = default_location_context, predicates = [ ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_EVERYTHING ) ] )
+
+         variables[ 'file_search_context_1' ] = file_search_context
+         variables[ 'file_search_context_2' ] = file_search_context.Duplicate()
+
+         if 'file_search_context' in variables:
+
+             variables[ 'file_search_context_1' ] = variables[ 'file_search_context' ]
+
+             del variables[ 'file_search_context' ]
+
+     serialisable_variables = variables.GetSerialisableTuple()
+
+     new_serialisable_info = ( page_name, management_type, serialisable_variables )
+
+     return ( 12, new_serialisable_info )

def GetAPIInfoDict( self, simple ):

@@ -595,31 +643,31 @@ class ManagementController( HydrusSerialisable.SerialisableBase ):

if self._management_type == MANAGEMENT_TYPE_IMPORT_HDD:

- hdd_import = self._serialisables[ 'hdd_import' ]
+ hdd_import = self._variables[ 'hdd_import' ]

d[ 'hdd_import' ] = hdd_import.GetAPIInfoDict( simple )

elif self._management_type == MANAGEMENT_TYPE_IMPORT_SIMPLE_DOWNLOADER:

- simple_downloader_import = self._serialisables[ 'simple_downloader_import' ]
+ simple_downloader_import = self._variables[ 'simple_downloader_import' ]

d[ 'simple_downloader_import' ] = simple_downloader_import.GetAPIInfoDict( simple )

elif self._management_type == MANAGEMENT_TYPE_IMPORT_MULTIPLE_GALLERY:

- multiple_gallery_import = self._serialisables[ 'multiple_gallery_import' ]
+ multiple_gallery_import = self._variables[ 'multiple_gallery_import' ]

d[ 'multiple_gallery_import' ] = multiple_gallery_import.GetAPIInfoDict( simple )

elif self._management_type == MANAGEMENT_TYPE_IMPORT_MULTIPLE_WATCHER:

- multiple_watcher_import = self._serialisables[ 'multiple_watcher_import' ]
+ multiple_watcher_import = self._variables[ 'multiple_watcher_import' ]

d[ 'multiple_watcher_import' ] = multiple_watcher_import.GetAPIInfoDict( simple )

elif self._management_type == MANAGEMENT_TYPE_IMPORT_URLS:

- urls_import = self._serialisables[ 'urls_import' ]
+ urls_import = self._variables[ 'urls_import' ]

d[ 'urls_import' ] = urls_import.GetAPIInfoDict( simple )

@@ -627,42 +675,37 @@ class ManagementController( HydrusSerialisable.SerialisableBase ):

return d

- def GetKey( self, name ):
-
-     return self._keys[ name ]

def GetNumSeeds( self ):

try:

if self._management_type == MANAGEMENT_TYPE_IMPORT_HDD:

- hdd_import = self._serialisables[ 'hdd_import' ]
+ hdd_import = self._variables[ 'hdd_import' ]

return hdd_import.GetNumSeeds()

elif self._management_type == MANAGEMENT_TYPE_IMPORT_SIMPLE_DOWNLOADER:

- simple_downloader_import = self._serialisables[ 'simple_downloader_import' ]
+ simple_downloader_import = self._variables[ 'simple_downloader_import' ]

return simple_downloader_import.GetNumSeeds()

elif self._management_type == MANAGEMENT_TYPE_IMPORT_MULTIPLE_GALLERY:

- multiple_gallery_import = self._serialisables[ 'multiple_gallery_import' ]
+ multiple_gallery_import = self._variables[ 'multiple_gallery_import' ]

return multiple_gallery_import.GetNumSeeds()

elif self._management_type == MANAGEMENT_TYPE_IMPORT_MULTIPLE_WATCHER:

- multiple_watcher_import = self._serialisables[ 'multiple_watcher_import' ]
+ multiple_watcher_import = self._variables[ 'multiple_watcher_import' ]

return multiple_watcher_import.GetNumSeeds()

elif self._management_type == MANAGEMENT_TYPE_IMPORT_URLS:

- urls_import = self._serialisables[ 'urls_import' ]
+ urls_import = self._variables[ 'urls_import' ]

return urls_import.GetNumSeeds()

@@ -693,23 +736,27 @@ class ManagementController( HydrusSerialisable.SerialisableBase ):

if self._management_type == MANAGEMENT_TYPE_IMPORT_HDD:

- importer = self._serialisables[ 'hdd_import' ]
+ importer = self._variables[ 'hdd_import' ]

elif self._management_type == MANAGEMENT_TYPE_IMPORT_SIMPLE_DOWNLOADER:

- importer = self._serialisables[ 'simple_downloader_import' ]
+ importer = self._variables[ 'simple_downloader_import' ]

elif self._management_type == MANAGEMENT_TYPE_IMPORT_MULTIPLE_GALLERY:

- importer = self._serialisables[ 'multiple_gallery_import' ]
+ importer = self._variables[ 'multiple_gallery_import' ]

elif self._management_type == MANAGEMENT_TYPE_IMPORT_MULTIPLE_WATCHER:

- importer = self._serialisables[ 'multiple_watcher_import' ]
+ importer = self._variables[ 'multiple_watcher_import' ]

elif self._management_type == MANAGEMENT_TYPE_IMPORT_URLS:

- importer = self._serialisables[ 'urls_import' ]
+ importer = self._variables[ 'urls_import' ]

else:

raise KeyError()

return importer.GetValueRange()

@@ -727,14 +774,7 @@ class ManagementController( HydrusSerialisable.SerialisableBase ):

def GetVariable( self, name ):

- if name in self._simples:
-
-     return self._simples[ name ]
-
- else:
-
-     return self._serialisables[ name ]
+ return self._variables[ name ]

def HasSerialisableChangesSince( self, since_timestamp ):

@@ -743,23 +783,23 @@ class ManagementController( HydrusSerialisable.SerialisableBase ):

if self._management_type == MANAGEMENT_TYPE_IMPORT_HDD:

- importer = self._serialisables[ 'hdd_import' ]
+ importer = self._variables[ 'hdd_import' ]

elif self._management_type == MANAGEMENT_TYPE_IMPORT_SIMPLE_DOWNLOADER:

- importer = self._serialisables[ 'simple_downloader_import' ]
+ importer = self._variables[ 'simple_downloader_import' ]

elif self._management_type == MANAGEMENT_TYPE_IMPORT_MULTIPLE_GALLERY:

- importer = self._serialisables[ 'multiple_gallery_import' ]
+ importer = self._variables[ 'multiple_gallery_import' ]

elif self._management_type == MANAGEMENT_TYPE_IMPORT_MULTIPLE_WATCHER:

- importer = self._serialisables[ 'multiple_watcher_import' ]
+ importer = self._variables[ 'multiple_watcher_import' ]

elif self._management_type == MANAGEMENT_TYPE_IMPORT_URLS:

- importer = self._serialisables[ 'urls_import' ]
+ importer = self._variables[ 'urls_import' ]

if importer.HasSerialisableChangesSince( since_timestamp ):

@@ -773,7 +813,7 @@ class ManagementController( HydrusSerialisable.SerialisableBase ):

def HasVariable( self, name ):

- return name in self._simples or name in self._serialisables
+ return name in self._variables

def IsImporter( self ):

@@ -781,13 +821,6 @@ class ManagementController( HydrusSerialisable.SerialisableBase ):

return self._management_type in ( MANAGEMENT_TYPE_IMPORT_HDD, MANAGEMENT_TYPE_IMPORT_SIMPLE_DOWNLOADER, MANAGEMENT_TYPE_IMPORT_MULTIPLE_GALLERY, MANAGEMENT_TYPE_IMPORT_MULTIPLE_WATCHER, MANAGEMENT_TYPE_IMPORT_URLS )

- def SetKey( self, name, key ):
-
-     self._keys[ name ] = key
-
-     self._SerialisableChangeMade()

def SetPageName( self, name ):

if name != self._page_name:

@@ -809,26 +842,30 @@ class ManagementController( HydrusSerialisable.SerialisableBase ):

def SetVariable( self, name, value ):

- if isinstance( value, HydrusSerialisable.SerialisableBase ):
-
-     if name not in self._serialisables or value.DumpToString() != self._serialisables[ name ].DumpToString():
-
-         self._serialisables[ name ] = value
-
-         self._SerialisableChangeMade()
-
- else:
-
-     if name not in self._simples or value != self._simples[ name ]:
-
-         self._simples[ name ] = value
-
-         self._SerialisableChangeMade()
+ if name in self._variables:
+
+     if type( value ) == type( self._variables[ name ] ):
+
+         if isinstance( value, HydrusSerialisable.SerialisableBase ):
+
+             if value.GetSerialisableTuple() == self._variables[ name ].GetSerialisableTuple():
+
+                 return
+
+         elif value == self._variables[ name ]:
+
+             return
+
+ self._variables[ name ] = value
+
+ self._SerialisableChangeMade()

HydrusSerialisable.SERIALISABLE_TYPES_TO_OBJECT_TYPES[ HydrusSerialisable.SERIALISABLE_TYPE_MANAGEMENT_CONTROLLER ] = ManagementController

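The rewritten SetVariable above only fires a change notification when the stored value actually differs: serialisable objects are compared by their serialised tuple, plain values by `==`, and a type mismatch always counts as a change. That equal-by-serialisation idea can be sketched generically (the FakeSerialisable class and set_variable helper are stand-ins, not hydrus's HydrusSerialisable API):

```python
# Illustrative sketch: only record a change when the value really differs.
# Serialisable objects compare by their serialised tuple, plain values by ==.

class FakeSerialisable:
    
    def __init__( self, data ):
        
        self._data = data
    
    def GetSerialisableTuple( self ):
        
        return ( 'fake', self._data )

def set_variable( variables, name, value ):
    
    # returns True if a change was made, mirroring the early-return structure above
    if name in variables and type( value ) == type( variables[ name ] ):
        
        if isinstance( value, FakeSerialisable ):
            
            if value.GetSerialisableTuple() == variables[ name ].GetSerialisableTuple():
                
                return False
            
        elif value == variables[ name ]:
            
            return False
    
    variables[ name ] = value
    
    return True

v = {}

assert set_variable( v, 'n', 5 ) is True                        # first set is a change
assert set_variable( v, 'n', 5 ) is False                       # same value, no change
assert set_variable( v, 's', FakeSerialisable( 1 ) ) is True
assert set_variable( v, 's', FakeSerialisable( 1 ) ) is False   # serialises identically
assert set_variable( v, 's', FakeSerialisable( 2 ) ) is True
```

Comparing serialised tuples avoids spurious dirty flags from distinct-but-equivalent objects, which matters because every real change bumps the session's save timestamp.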
class ListBoxTagsMediaManagementPanel( ClientGUIListBoxes.ListBoxTagsMedia ):

@@ -957,7 +994,7 @@ class ManagementPanel( QW.QScrollArea ):

self._management_controller = management_controller

self._page = page
- self._page_key = self._management_controller.GetKey( 'page' )
+ self._page_key = self._management_controller.GetVariable( 'page_key' )

self._current_selection_tags_list = None

@@ -1172,10 +1209,10 @@ class ManagementPanelDuplicateFilter( ManagementPanel ):

menu_items.append( ( 'normal', 'similar', 'Search for similar files.', HydrusData.Call( self._SetSearchDistance, CC.HAMMING_SIMILAR ) ) )
menu_items.append( ( 'normal', 'speculative', 'Search for files that are probably similar.', HydrusData.Call( self._SetSearchDistance, CC.HAMMING_SPECULATIVE ) ) )

- self._search_distance_button = ClientGUIMenuButton.MenuButton( self._searching_panel, 'similarity', menu_items )
+ self._max_hamming_distance_for_potential_discovery_button = ClientGUIMenuButton.MenuButton( self._searching_panel, 'similarity', menu_items )

- self._search_distance_spinctrl = ClientGUICommon.BetterSpinBox( self._searching_panel, min=0, max=64, width = 50 )
- self._search_distance_spinctrl.setSingleStep( 2 )
+ self._max_hamming_distance_for_potential_discovery_spinctrl = ClientGUICommon.BetterSpinBox( self._searching_panel, min=0, max=64, width = 50 )
+ self._max_hamming_distance_for_potential_discovery_spinctrl.setSingleStep( 2 )

self._num_searched = ClientGUICommon.TextAndGauge( self._searching_panel )

@@ -1199,9 +1236,11 @@ class ManagementPanelDuplicateFilter( ManagementPanel ):

self._filtering_panel = ClientGUICommon.StaticBox( self._main_right_panel, 'duplicate filter' )

- file_search_context = management_controller.GetVariable( 'file_search_context' )
+ file_search_context_1 = management_controller.GetVariable( 'file_search_context_1' )
+ file_search_context_2 = management_controller.GetVariable( 'file_search_context_2' )

- file_search_context.FixMissingServices( HG.client_controller.services_manager.FilterValidServiceKeys )
+ file_search_context_1.FixMissingServices( HG.client_controller.services_manager.FilterValidServiceKeys )
+ file_search_context_2.FixMissingServices( HG.client_controller.services_manager.FilterValidServiceKeys )

if self._management_controller.HasVariable( 'synchronised' ):

@@ -1212,9 +1251,14 @@ class ManagementPanelDuplicateFilter( ManagementPanel ):

synchronised = True

- self._tag_autocomplete = ClientGUIACDropdown.AutoCompleteDropdownTagsRead( self._filtering_panel, self._page_key, file_search_context, media_sort_widget = self._media_sort, media_collect_widget = self._media_collect, allow_all_known_files = False, synchronised = synchronised, force_system_everything = True )
+ self._tag_autocomplete_1 = ClientGUIACDropdown.AutoCompleteDropdownTagsRead( self._filtering_panel, self._page_key, file_search_context_1, media_sort_widget = self._media_sort, media_collect_widget = self._media_collect, allow_all_known_files = False, synchronised = synchronised, force_system_everything = True )
+ self._tag_autocomplete_2 = ClientGUIACDropdown.AutoCompleteDropdownTagsRead( self._filtering_panel, self._page_key, file_search_context_2, media_sort_widget = self._media_sort, media_collect_widget = self._media_collect, allow_all_known_files = False, synchronised = synchronised, force_system_everything = True )

|
||||
self._both_files_match = QW.QCheckBox( self._filtering_panel )
|
||||
self._dupe_search_type = ClientGUICommon.BetterChoice( self._filtering_panel )
|
||||
|
||||
self._dupe_search_type.addItem( 'at least one file matches the search', CC.DUPE_SEARCH_ONE_FILE_MATCHES_ONE_SEARCH )
|
||||
self._dupe_search_type.addItem( 'both files match the search', CC.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH )
|
||||
self._dupe_search_type.addItem( 'both files match different searches', CC.DUPE_SEARCH_BOTH_FILES_MATCH_DIFFERENT_SEARCHES )
|
||||
|
||||
self._pixel_dupes_preference = ClientGUICommon.BetterChoice( self._filtering_panel )
|
||||
|
||||
|
@ -1223,8 +1267,8 @@ class ManagementPanelDuplicateFilter( ManagementPanel ):
|
|||
self._pixel_dupes_preference.addItem( CC.similar_files_pixel_dupes_string_lookup[ p ], p )
|
||||
|
||||
|
||||
self._max_hamming_distance = ClientGUICommon.BetterSpinBox( self._filtering_panel, min = 0, max = 64 )
|
||||
self._max_hamming_distance.setSingleStep( 2 )
|
||||
self._max_hamming_distance_for_filter = ClientGUICommon.BetterSpinBox( self._filtering_panel, min = 0, max = 64 )
|
||||
self._max_hamming_distance_for_filter.setSingleStep( 2 )
|
||||
|
||||
self._num_potential_duplicates = ClientGUICommon.BetterStaticText( self._filtering_panel, ellipsize_end = True )
|
||||
self._refresh_dupe_counts_button = ClientGUICommon.BetterBitmapButton( self._filtering_panel, CC.global_pixmaps().refresh, self.RefreshDuplicateNumbers )
|
||||
|
@ -1249,13 +1293,9 @@ class ManagementPanelDuplicateFilter( ManagementPanel ):
|
|||
|
||||
#
|
||||
|
||||
self._search_distance_spinctrl.setValue( new_options.GetInteger( 'similar_files_duplicate_pairs_search_distance' ) )
|
||||
self._max_hamming_distance_for_potential_discovery_spinctrl.setValue( new_options.GetInteger( 'similar_files_duplicate_pairs_search_distance' ) )
|
||||
|
||||
self._both_files_match.setChecked( management_controller.GetVariable( 'both_files_match' ) )
|
||||
|
||||
self._both_files_match.clicked.connect( self.EventSearchDomainChanged )
|
||||
|
||||
self._UpdateBothFilesMatchButton()
|
||||
self._dupe_search_type.SetValue( management_controller.GetVariable( 'dupe_search_type' ) )
|
||||
|
||||
if not management_controller.HasVariable( 'pixel_dupes_preference' ):
|
||||
|
||||
|
@ -1264,16 +1304,20 @@ class ManagementPanelDuplicateFilter( ManagementPanel ):
|
|||
|
||||
self._pixel_dupes_preference.SetValue( management_controller.GetVariable( 'pixel_dupes_preference' ) )
|
||||
|
||||
self._pixel_dupes_preference.currentIndexChanged.connect( self.EventSearchDomainChanged )
|
||||
self._pixel_dupes_preference.currentIndexChanged.connect( self.FilterSearchDomainChanged )
|
||||
|
||||
if not management_controller.HasVariable( 'max_hamming_distance' ):
|
||||
|
||||
management_controller.SetVariable( 'max_hamming_distance', 4 )
|
||||
|
||||
|
||||
self._max_hamming_distance.setValue( management_controller.GetVariable( 'max_hamming_distance' ) )
|
||||
self._max_hamming_distance_for_filter.setValue( management_controller.GetVariable( 'max_hamming_distance' ) )
|
||||
|
||||
self._max_hamming_distance.valueChanged.connect( self.EventSearchDomainChanged )
|
||||
self._max_hamming_distance_for_filter.valueChanged.connect( self.FilterSearchDomainChanged )
|
||||
|
||||
#
|
||||
|
||||
self._UpdateFilterSearchControls()
|
||||
|
||||
#
|
||||
|
||||
|
@ -1282,8 +1326,8 @@ class ManagementPanelDuplicateFilter( ManagementPanel ):
|
|||
distance_hbox = QP.HBoxLayout()
|
||||
|
||||
QP.AddToLayout( distance_hbox, ClientGUICommon.BetterStaticText(self._searching_panel,label='search distance: '), CC.FLAGS_CENTER_PERPENDICULAR )
|
||||
QP.AddToLayout( distance_hbox, self._search_distance_button, CC.FLAGS_EXPAND_BOTH_WAYS )
|
||||
QP.AddToLayout( distance_hbox, self._search_distance_spinctrl, CC.FLAGS_CENTER_PERPENDICULAR )
|
||||
QP.AddToLayout( distance_hbox, self._max_hamming_distance_for_potential_discovery_button, CC.FLAGS_EXPAND_BOTH_WAYS )
|
||||
QP.AddToLayout( distance_hbox, self._max_hamming_distance_for_potential_discovery_spinctrl, CC.FLAGS_CENTER_PERPENDICULAR )
|
||||
|
||||
gridbox_2 = QP.GridLayout( cols = 2 )
|
||||
|
||||
|
@ -1322,12 +1366,13 @@ class ManagementPanelDuplicateFilter( ManagementPanel ):
|
|||
|
||||
rows = []
|
||||
|
||||
rows.append( ( 'both files of pair match in search: ', self._both_files_match ) )
|
||||
rows.append( ( 'maximum search distance of pair: ', self._max_hamming_distance ) )
|
||||
rows.append( ( 'maximum search distance of pair: ', self._max_hamming_distance_for_filter ) )
|
||||
|
||||
gridbox = ClientGUICommon.WrapInGrid( self._filtering_panel, rows )
|
||||
|
||||
self._filtering_panel.Add( self._tag_autocomplete, CC.FLAGS_EXPAND_PERPENDICULAR )
|
||||
self._filtering_panel.Add( self._dupe_search_type, CC.FLAGS_EXPAND_PERPENDICULAR )
|
||||
self._filtering_panel.Add( self._tag_autocomplete_1, CC.FLAGS_EXPAND_PERPENDICULAR )
|
||||
self._filtering_panel.Add( self._tag_autocomplete_2, CC.FLAGS_EXPAND_PERPENDICULAR )
|
||||
self._filtering_panel.Add( gridbox, CC.FLAGS_EXPAND_SIZER_PERPENDICULAR )
|
||||
self._filtering_panel.Add( self._pixel_dupes_preference, CC.FLAGS_EXPAND_PERPENDICULAR )
|
||||
self._filtering_panel.Add( text_and_button_hbox, CC.FLAGS_EXPAND_PERPENDICULAR )
|
||||
|
@ -1358,8 +1403,12 @@ class ManagementPanelDuplicateFilter( ManagementPanel ):
|
|||
self._controller.sub( self, 'NotifyNewMaintenanceNumbers', 'new_similar_files_maintenance_numbers' )
|
||||
self._controller.sub( self, 'NotifyNewPotentialsSearchNumbers', 'new_similar_files_potentials_search_numbers' )
|
||||
|
||||
self._tag_autocomplete.searchChanged.connect( self.SearchChanged )
|
||||
self._search_distance_spinctrl.valueChanged.connect( self.EventSearchDistanceChanged )
|
||||
self._tag_autocomplete_1.searchChanged.connect( self.Search1Changed )
|
||||
self._tag_autocomplete_2.searchChanged.connect( self.Search2Changed )
|
||||
|
||||
self._dupe_search_type.currentIndexChanged.connect( self.FilterDupeSearchTypeChanged )
|
||||
|
||||
self._max_hamming_distance_for_potential_discovery_spinctrl.valueChanged.connect( self.MaxHammingDistanceForPotentialDiscoveryChanged )
|
||||
|
||||
|
||||
def _EditMergeOptions( self, duplicate_type ):
|
||||
|
@ -1383,26 +1432,75 @@ class ManagementPanelDuplicateFilter( ManagementPanel ):
|
|||
|
||||
|
||||
|
||||
def _GetDuplicateFileSearchData( self ) -> typing.Tuple[ ClientSearch.FileSearchContext, bool, int, int ]:
|
||||
def _FilterSearchDomainUpdated( self ):
|
||||
|
||||
file_search_context = self._tag_autocomplete.GetFileSearchContext()
|
||||
( file_search_context_1, file_search_context_2, dupe_search_type, pixel_dupes_preference, max_hamming_distance ) = self._GetDuplicateFileSearchData( optimise_for_search = False )
|
||||
|
||||
both_files_match = self._both_files_match.isChecked()
|
||||
self._management_controller.SetVariable( 'file_search_context_1', file_search_context_1 )
|
||||
self._management_controller.SetVariable( 'file_search_context_2', file_search_context_2 )
|
||||
|
||||
synchronised = self._tag_autocomplete_1.IsSynchronised()
|
||||
|
||||
self._management_controller.SetVariable( 'synchronised', synchronised )
|
||||
|
||||
self._management_controller.SetVariable( 'dupe_search_type', dupe_search_type )
|
||||
self._management_controller.SetVariable( 'pixel_dupes_preference', pixel_dupes_preference )
|
||||
self._management_controller.SetVariable( 'max_hamming_distance', max_hamming_distance )
|
||||
|
||||
self._SetLocationContext( file_search_context_1.GetLocationContext() )
|
||||
|
||||
self._UpdateFilterSearchControls()
|
||||
|
||||
if self._tag_autocomplete_1.IsSynchronised():
|
||||
|
||||
self._dupe_count_numbers_dirty = True
|
||||
|
||||
|
||||
|
||||
def _GetDuplicateFileSearchData( self, optimise_for_search = True ) -> typing.Tuple[ ClientSearch.FileSearchContext, ClientSearch.FileSearchContext, int, int, int ]:
|
||||
|
||||
file_search_context_1 = self._tag_autocomplete_1.GetFileSearchContext()
|
||||
file_search_context_2 = self._tag_autocomplete_2.GetFileSearchContext()
|
||||
|
||||
dupe_search_type = self._dupe_search_type.GetValue()
|
||||
|
||||
if optimise_for_search:
|
||||
|
||||
if dupe_search_type == CC.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH and ( file_search_context_1.IsJustSystemEverything() or file_search_context_1.HasNoPredicates() ):
|
||||
|
||||
dupe_search_type = CC.DUPE_SEARCH_ONE_FILE_MATCHES_ONE_SEARCH
|
||||
|
||||
elif dupe_search_type == CC.DUPE_SEARCH_BOTH_FILES_MATCH_DIFFERENT_SEARCHES:
|
||||
|
||||
if file_search_context_1.IsJustSystemEverything() or file_search_context_1.HasNoPredicates():
|
||||
|
||||
f = file_search_context_1
|
||||
file_search_context_1 = file_search_context_2
|
||||
file_search_context_2 = f
|
||||
|
||||
dupe_search_type = CC.DUPE_SEARCH_ONE_FILE_MATCHES_ONE_SEARCH
|
||||
|
||||
elif file_search_context_2.IsJustSystemEverything() or file_search_context_2.HasNoPredicates():
|
||||
|
||||
dupe_search_type = CC.DUPE_SEARCH_ONE_FILE_MATCHES_ONE_SEARCH
|
||||
|
||||
|
||||
|
||||
|
||||
pixel_dupes_preference = self._pixel_dupes_preference.GetValue()
|
||||
|
||||
max_hamming_distance = self._max_hamming_distance.value()
|
||||
max_hamming_distance = self._max_hamming_distance_for_filter.value()
|
||||
|
||||
return ( file_search_context, both_files_match, pixel_dupes_preference, max_hamming_distance )
|
||||
return ( file_search_context_1, file_search_context_2, dupe_search_type, pixel_dupes_preference, max_hamming_distance )
|
||||
|
||||
|
||||
def _LaunchFilter( self ):
|
||||
|
||||
( file_search_context, both_files_match, pixel_dupes_preference, max_hamming_distance ) = self._GetDuplicateFileSearchData()
|
||||
( file_search_context_1, file_search_context_2, dupe_search_type, pixel_dupes_preference, max_hamming_distance ) = self._GetDuplicateFileSearchData()
|
||||
|
||||
canvas_frame = ClientGUICanvasFrame.CanvasFrame( self.window() )
|
||||
|
||||
canvas_window = ClientGUICanvas.CanvasFilterDuplicates( canvas_frame, file_search_context, both_files_match, pixel_dupes_preference, max_hamming_distance )
|
||||
canvas_window = ClientGUICanvas.CanvasFilterDuplicates( canvas_frame, file_search_context_1, file_search_context_2, dupe_search_type, pixel_dupes_preference, max_hamming_distance )
|
||||
|
||||
canvas_window.showPairInPage.connect( self._ShowPairInPage )
|
||||
|
||||
|
@ -1427,9 +1525,9 @@ class ManagementPanelDuplicateFilter( ManagementPanel ):
|
|||
self._UpdatePotentialDuplicatesCount( potential_duplicates_count )
|
||||
|
||||
|
||||
def thread_do_it( file_search_context, both_files_match, pixel_dupes_preference, max_hamming_distance ):
|
||||
def thread_do_it( file_search_context_1, file_search_context_2, dupe_search_type, pixel_dupes_preference, max_hamming_distance ):
|
||||
|
||||
potential_duplicates_count = HG.client_controller.Read( 'potential_duplicates_count', file_search_context, both_files_match, pixel_dupes_preference, max_hamming_distance )
|
||||
potential_duplicates_count = HG.client_controller.Read( 'potential_duplicates_count', file_search_context_1, file_search_context_2, dupe_search_type, pixel_dupes_preference, max_hamming_distance )
|
||||
|
||||
QP.CallAfter( qt_code, potential_duplicates_count )
|
||||
|
||||
|
@ -1442,9 +1540,9 @@ class ManagementPanelDuplicateFilter( ManagementPanel ):
|
|||
|
||||
self._num_potential_duplicates.setText( 'updating\u2026' )
|
||||
|
||||
( file_search_context, both_files_match, pixel_dupes_preference, max_hamming_distance ) = self._GetDuplicateFileSearchData()
|
||||
( file_search_context_1, file_search_context_2, dupe_search_type, pixel_dupes_preference, max_hamming_distance ) = self._GetDuplicateFileSearchData()
|
||||
|
||||
HG.client_controller.CallToThread( thread_do_it, file_search_context, both_files_match, pixel_dupes_preference, max_hamming_distance )
|
||||
HG.client_controller.CallToThread( thread_do_it, file_search_context_1, file_search_context_2, dupe_search_type, pixel_dupes_preference, max_hamming_distance )
|
||||
|
||||
|
||||
|
||||
|
@ -1464,30 +1562,6 @@ class ManagementPanelDuplicateFilter( ManagementPanel ):
|
|||
|
||||
|
||||
|
||||
def _SearchDomainUpdated( self ):
|
||||
|
||||
( file_search_context, both_files_match, pixel_dupes_preference, max_hamming_distance ) = self._GetDuplicateFileSearchData()
|
||||
|
||||
self._management_controller.SetVariable( 'file_search_context', file_search_context )
|
||||
|
||||
synchronised = self._tag_autocomplete.IsSynchronised()
|
||||
|
||||
self._management_controller.SetVariable( 'synchronised', synchronised )
|
||||
|
||||
self._management_controller.SetVariable( 'both_files_match', both_files_match )
|
||||
self._management_controller.SetVariable( 'pixel_dupes_preference', pixel_dupes_preference )
|
||||
self._management_controller.SetVariable( 'max_hamming_distance', max_hamming_distance )
|
||||
|
||||
self._SetLocationContext( file_search_context.GetLocationContext() )
|
||||
|
||||
self._UpdateBothFilesMatchButton()
|
||||
|
||||
if self._tag_autocomplete.IsSynchronised():
|
||||
|
||||
self._dupe_count_numbers_dirty = True
|
||||
|
||||
|
||||
|
||||
def _SetCurrentMediaAs( self, duplicate_type ):
|
||||
|
||||
media_panel = self._page.GetMediaPanel()
|
||||
|
@ -1511,7 +1585,7 @@ class ManagementPanelDuplicateFilter( ManagementPanel ):
|
|||
|
||||
def _SetSearchDistance( self, value ):
|
||||
|
||||
self._search_distance_spinctrl.setValue( value )
|
||||
self._max_hamming_distance_for_potential_discovery_spinctrl.setValue( value )
|
||||
|
||||
self._UpdateMaintenanceStatus()
|
||||
|
||||
|
@ -1525,9 +1599,9 @@ class ManagementPanelDuplicateFilter( ManagementPanel ):
|
|||
|
||||
def _ShowPotentialDupes( self, hashes ):
|
||||
|
||||
( file_search_context, both_files_match, pixel_dupes_preference, max_hamming_distance ) = self._GetDuplicateFileSearchData()
|
||||
( file_search_context_1, file_search_context_2, dupe_search_type, pixel_dupes_preference, max_hamming_distance ) = self._GetDuplicateFileSearchData()
|
||||
|
||||
location_context = file_search_context.GetLocationContext()
|
||||
location_context = file_search_context_1.GetLocationContext()
|
||||
|
||||
self._SetLocationContext( location_context )
|
||||
|
||||
|
@ -1549,9 +1623,9 @@ class ManagementPanelDuplicateFilter( ManagementPanel ):
|
|||
|
||||
def _ShowRandomPotentialDupes( self ):
|
||||
|
||||
( file_search_context, both_files_match, pixel_dupes_preference, max_hamming_distance ) = self._GetDuplicateFileSearchData()
|
||||
( file_search_context_1, file_search_context_2, dupe_search_type, pixel_dupes_preference, max_hamming_distance ) = self._GetDuplicateFileSearchData()
|
||||
|
||||
hashes = self._controller.Read( 'random_potential_duplicate_hashes', file_search_context, both_files_match, pixel_dupes_preference, max_hamming_distance )
|
||||
hashes = self._controller.Read( 'random_potential_duplicate_hashes', file_search_context_1, file_search_context_2, dupe_search_type, pixel_dupes_preference, max_hamming_distance )
|
||||
|
||||
if len( hashes ) == 0:
|
||||
|
||||
|
@ -1580,17 +1654,17 @@ class ManagementPanelDuplicateFilter( ManagementPanel ):
|
|||
|
||||
self._eligible_files.setText( '{} eligible files in the system.'.format(HydrusData.ToHumanInt(total_num_files)) )
|
||||
|
||||
self._search_distance_button.setEnabled( True )
|
||||
self._search_distance_spinctrl.setEnabled( True )
|
||||
self._max_hamming_distance_for_potential_discovery_button.setEnabled( True )
|
||||
self._max_hamming_distance_for_potential_discovery_spinctrl.setEnabled( True )
|
||||
|
||||
options_search_distance = self._controller.new_options.GetInteger( 'similar_files_duplicate_pairs_search_distance' )
|
||||
|
||||
if self._search_distance_spinctrl.value() != options_search_distance:
|
||||
if self._max_hamming_distance_for_potential_discovery_spinctrl.value() != options_search_distance:
|
||||
|
||||
self._search_distance_spinctrl.setValue( options_search_distance )
|
||||
self._max_hamming_distance_for_potential_discovery_spinctrl.setValue( options_search_distance )
|
||||
|
||||
|
||||
search_distance = self._search_distance_spinctrl.value()
|
||||
search_distance = self._max_hamming_distance_for_potential_discovery_spinctrl.value()
|
||||
|
||||
if search_distance in CC.hamming_string_lookup:
|
||||
|
||||
|
@ -1601,7 +1675,7 @@ class ManagementPanelDuplicateFilter( ManagementPanel ):
|
|||
button_label = 'custom'
|
||||
|
||||
|
||||
self._search_distance_button.setText( button_label )
|
||||
self._max_hamming_distance_for_potential_discovery_button.setText( button_label )
|
||||
|
||||
num_searched = sum( ( count for ( value, count ) in searched_distances_to_count.items() if value is not None and value >= search_distance ) )
|
||||
|
||||
|
@ -1641,20 +1715,6 @@ class ManagementPanelDuplicateFilter( ManagementPanel ):
|
|||
self._have_done_first_maintenance_numbers_show = True
|
||||
|
||||
|
||||
def _UpdateBothFilesMatchButton( self ):
|
||||
|
||||
( file_search_context, both_files_match, pixel_dupes_preference, max_hamming_distance ) = self._GetDuplicateFileSearchData()
|
||||
|
||||
if file_search_context.IsJustSystemEverything() or file_search_context.HasNoPredicates():
|
||||
|
||||
self._both_files_match.setEnabled( False )
|
||||
|
||||
else:
|
||||
|
||||
self._both_files_match.setEnabled( True )
|
||||
|
||||
|
||||
|
||||
def _UpdatePotentialDuplicatesCount( self, potential_duplicates_count ):
|
||||
|
||||
self._potential_duplicates_count = potential_duplicates_count
|
||||
|
@ -1673,14 +1733,28 @@ class ManagementPanelDuplicateFilter( ManagementPanel ):
|
|||
|
||||
|
||||
|
||||
def EventSearchDomainChanged( self ):
|
||||
def _UpdateFilterSearchControls( self ):
|
||||
|
||||
self._SearchDomainUpdated()
|
||||
( file_search_context_1, file_search_context_2, dupe_search_type, pixel_dupes_preference, max_hamming_distance ) = self._GetDuplicateFileSearchData( optimise_for_search = False )
|
||||
|
||||
self._tag_autocomplete_2.setVisible( dupe_search_type == CC.DUPE_SEARCH_BOTH_FILES_MATCH_DIFFERENT_SEARCHES )
|
||||
|
||||
self._max_hamming_distance_for_filter.setEnabled( self._pixel_dupes_preference.GetValue() != CC.SIMILAR_FILES_PIXEL_DUPES_REQUIRED )
|
||||
|
||||
|
||||
def EventSearchDistanceChanged( self ):
|
||||
def FilterDupeSearchTypeChanged( self ):
|
||||
|
||||
search_distance = self._search_distance_spinctrl.value()
|
||||
self._FilterSearchDomainUpdated()
|
||||
|
||||
|
||||
def FilterSearchDomainChanged( self ):
|
||||
|
||||
self._FilterSearchDomainUpdated()
|
||||
|
||||
|
||||
def MaxHammingDistanceForPotentialDiscoveryChanged( self ):
|
||||
|
||||
search_distance = self._max_hamming_distance_for_potential_discovery_spinctrl.value()
|
||||
|
||||
self._controller.new_options.SetInteger( 'similar_files_duplicate_pairs_search_distance', search_distance )
|
||||
|
||||
|
@ -1703,14 +1777,14 @@ class ManagementPanelDuplicateFilter( ManagementPanel ):
|
|||
|
||||
ManagementPanel.PageHidden( self )
|
||||
|
||||
self._tag_autocomplete.SetForceDropdownHide( True )
|
||||
self._tag_autocomplete_1.SetForceDropdownHide( True )
|
||||
|
||||
|
||||
def PageShown( self ):
|
||||
|
||||
ManagementPanel.PageShown( self )
|
||||
|
||||
self._tag_autocomplete.SetForceDropdownHide( False )
|
||||
self._tag_autocomplete_1.SetForceDropdownHide( False )
|
||||
|
||||
|
||||
def RefreshDuplicateNumbers( self ):
|
||||
|
@ -1720,7 +1794,7 @@ class ManagementPanelDuplicateFilter( ManagementPanel ):
|
|||
|
||||
def RefreshQuery( self ):
|
||||
|
||||
self._SearchDomainUpdated()
|
||||
self._FilterSearchDomainUpdated()
|
||||
|
||||
|
||||
def REPEATINGPageUpdate( self ):
|
||||
|
@ -1740,11 +1814,31 @@ class ManagementPanelDuplicateFilter( ManagementPanel ):
|
|||
|
||||
|
||||
|
||||
def SearchChanged( self, file_search_context: ClientSearch.FileSearchContext ):
|
||||
def Search1Changed( self, file_search_context: ClientSearch.FileSearchContext ):
|
||||
|
||||
self._SearchDomainUpdated()
|
||||
self._tag_autocomplete_2.blockSignals( True )
|
||||
|
||||
self._tag_autocomplete_2.SetLocationContext( self._tag_autocomplete_1.GetLocationContext() )
|
||||
self._tag_autocomplete_2.SetSynchronised( self._tag_autocomplete_1.IsSynchronised() )
|
||||
|
||||
self._tag_autocomplete_2.blockSignals( False )
|
||||
|
||||
self._FilterSearchDomainUpdated()
|
||||
|
||||
|
||||
def Search2Changed( self, file_search_context: ClientSearch.FileSearchContext ):
|
||||
|
||||
self._tag_autocomplete_1.blockSignals( True )
|
||||
|
||||
self._tag_autocomplete_1.SetLocationContext( self._tag_autocomplete_2.GetLocationContext() )
|
||||
self._tag_autocomplete_1.SetSynchronised( self._tag_autocomplete_2.IsSynchronised() )
|
||||
|
||||
self._tag_autocomplete_1.blockSignals( False )
|
||||
|
||||
self._FilterSearchDomainUpdated()
|
||||
|
||||
|
||||
|
||||
management_panel_types_to_classes[ MANAGEMENT_TYPE_DUPLICATE_FILTER ] = ManagementPanelDuplicateFilter
|
||||
|
||||
class ManagementPanelImporter( ManagementPanel ):
|
||||
|
@ -4254,7 +4348,7 @@ class ManagementPanelPetitions( ManagementPanel ):
|
|||
|
||||
def __init__( self, parent, page, controller, management_controller ):
|
||||
|
||||
self._petition_service_key = management_controller.GetKey( 'petition_service' )
|
||||
self._petition_service_key = management_controller.GetVariable( 'petition_service_key' )
|
||||
|
||||
ManagementPanel.__init__( self, parent, page, controller, management_controller )
|
||||
|
||||
|
|
|
@@ -438,7 +438,7 @@ class Page( QW.QWidget ):
         self._initial_hashes = initial_hashes
         
-        self._management_controller.SetKey( 'page', self._page_key )
+        self._management_controller.SetVariable( 'page_key', self._page_key )
         
         self._initialised = len( initial_hashes ) == 0
         self._pre_initialisation_media_results = []

@@ -2487,7 +2487,7 @@ class PagesNotebook( QP.TabWidgetWithDnD ):
-    def GetPageFromPageKey( self, page_key ):
+    def GetPageFromPageKey( self, page_key ) -> typing.Optional[ Page ]:
         
         if self._page_key == page_key:
@ -493,16 +493,7 @@ class MediaPanel( ClientMedia.ListeningMediaList, QW.QScrollArea, CAC.Applicatio
|
|||
|
||||
( num_files_descriptor, selected_files_descriptor ) = self._GetSortedSelectedMimeDescriptors()
|
||||
|
||||
if num_files == 1:
|
||||
|
||||
num_files_string = '1 ' + num_files_descriptor
|
||||
|
||||
else:
|
||||
|
||||
suffix = '' if num_files_descriptor.endswith( 's' ) else 's'
|
||||
|
||||
num_files_string = '{} {}{}'.format( HydrusData.ToHumanInt( num_files ), num_files_descriptor, suffix )
|
||||
|
||||
num_files_string = '{} {}'.format( HydrusData.ToHumanInt( num_files ), num_files_descriptor )
|
||||
|
||||
s = num_files_string # 23 files
|
||||
|
||||
|
@ -514,20 +505,26 @@ class MediaPanel( ClientMedia.ListeningMediaList, QW.QScrollArea, CAC.Applicatio
|
|||
|
||||
s += ' - totalling ' + pretty_total_size
|
||||
|
||||
pretty_total_duration = self._GetPrettyTotalDuration()
|
||||
|
||||
if pretty_total_duration != '':
|
||||
|
||||
s += ', {}'.format( pretty_total_duration )
|
||||
|
||||
|
||||
|
||||
else:
|
||||
|
||||
s += ' - '
|
||||
|
||||
# if 1 selected, we show the whole mime string, so no need to specify
|
||||
if num_selected == 1 or selected_files_descriptor == num_files_descriptor:
|
||||
|
||||
selected_files_string = HydrusData.ToHumanInt( num_selected )
|
||||
|
||||
else:
|
||||
|
||||
suffix = '' if selected_files_descriptor.endswith( 's' ) else 's'
|
||||
|
||||
selected_files_string = '{} {}{}'.format( HydrusData.ToHumanInt( num_selected ), selected_files_descriptor, suffix )
|
||||
selected_files_string = '{} {}'.format( HydrusData.ToHumanInt( num_selected ), selected_files_descriptor )
|
||||
|
||||
|
||||
if num_selected == 1: # 23 files - 1 video selected, file_info
|
||||
|
@ -544,26 +541,54 @@ class MediaPanel( ClientMedia.ListeningMediaList, QW.QScrollArea, CAC.Applicatio
|
|||
|
||||
if num_inbox == num_selected:
|
||||
|
||||
inbox_phrase = 'all in inbox, '
|
||||
inbox_phrase = 'all in inbox'
|
||||
|
||||
elif num_inbox == 0:
|
||||
|
||||
inbox_phrase = 'all archived, '
|
||||
inbox_phrase = 'all archived'
|
||||
|
||||
else:
|
||||
|
||||
inbox_phrase = '{} in inbox and {} archived, '.format( HydrusData.ToHumanInt( num_inbox ), HydrusData.ToHumanInt( num_selected - num_inbox ) )
|
||||
inbox_phrase = '{} in inbox and {} archived'.format( HydrusData.ToHumanInt( num_inbox ), HydrusData.ToHumanInt( num_selected - num_inbox ) )
|
||||
|
||||
|
||||
pretty_total_size = self._GetPrettyTotalSize( only_selected = True )
|
||||
|
||||
s += '{} selected, {}totalling {}'.format( selected_files_string, inbox_phrase, pretty_total_size )
|
||||
s += '{} selected, {}, totalling {}'.format( selected_files_string, inbox_phrase, pretty_total_size )
|
||||
|
||||
pretty_total_duration = self._GetPrettyTotalDuration( only_selected = True )
|
||||
|
||||
if pretty_total_duration != '':
|
||||
|
||||
s += ', {}'.format( pretty_total_duration )
|
||||
|
||||
|
||||
|
||||
|
||||
return s
|
||||
|
||||
|
||||
def _GetPrettyTotalDuration( self, only_selected = False ):
|
||||
|
||||
if only_selected:
|
||||
|
||||
media_source = self._selected_media
|
||||
|
||||
else:
|
||||
|
||||
media_source = self._sorted_media
|
||||
|
||||
|
||||
if len( media_source ) == 0 or False in ( media.HasDuration() for media in media_source ):
|
||||
|
||||
return ''
|
||||
|
||||
|
||||
total_duration = sum( ( media.GetDuration() for media in media_source ) )
|
||||
|
||||
return HydrusData.ConvertMillisecondsToPrettyTime( total_duration )
|
||||
|
||||
|
||||
def _GetPrettyTotalSize( self, only_selected = False ):
|
||||
|
||||
if only_selected:
|
||||
|
@ -686,11 +711,13 @@ class MediaPanel( ClientMedia.ListeningMediaList, QW.QScrollArea, CAC.Applicatio
|
|||
|
||||
def _GetSortedSelectedMimeDescriptors( self ):
|
||||
|
||||
def GetDescriptor( classes, num_collections ):
|
||||
def GetDescriptor( plural, classes, num_collections ):
|
||||
|
||||
suffix = 's' if plural else ''
|
||||
|
||||
if len( classes ) == 0:
|
||||
|
||||
return 'file'
|
||||
return 'file' + suffix
|
||||
|
||||
|
||||
if len( classes ) == 1:
|
||||
|
@ -699,39 +726,41 @@ class MediaPanel( ClientMedia.ListeningMediaList, QW.QScrollArea, CAC.Applicatio
|
|||
|
||||
if mime == HC.APPLICATION_HYDRUS_CLIENT_COLLECTION:
|
||||
|
||||
return 'files in {} collections'.format( HydrusData.ToHumanInt( num_collections ) )
|
||||
collections_suffix = 's' if num_collections > 1 else ''
|
||||
|
||||
return 'file{} in {} collection{}'.format( suffix, HydrusData.ToHumanInt( num_collections ), collections_suffix )
|
||||
|
||||
else:
|
||||
|
||||
return HC.mime_string_lookup[ mime ]
|
||||
return HC.mime_string_lookup[ mime ] + suffix
|
||||
|
||||
|
||||
|
||||
if len( classes.difference( HC.IMAGES ) ) == 0:
|
||||
|
||||
return 'image'
|
||||
return 'image' + suffix
|
||||
|
||||
elif len( classes.difference( HC.ANIMATIONS ) ) == 0:
|
||||
|
||||
return 'animation'
|
||||
return 'animation' + suffix
|
||||
|
||||
elif len( classes.difference( HC.VIDEO ) ) == 0:
|
||||
|
||||
return 'video'
|
||||
return 'video' + suffix
|
||||
|
||||
elif len( classes.difference( HC.AUDIO ) ) == 0:
|
||||
|
||||
return 'audio file'
|
||||
return 'audio file' + suffix
|
||||
|
||||
else:
|
||||
|
||||
return 'file'
|
||||
return 'file' + suffix
|
||||
|
||||
|
||||
|
||||
if len( self._sorted_media ) > 1000:
|
||||
|
||||
sorted_mime_descriptor = 'file'
|
||||
sorted_mime_descriptor = 'files'
|
||||
|
||||
else:
|
||||
|
||||
|
@ -746,12 +775,14 @@ class MediaPanel( ClientMedia.ListeningMediaList, QW.QScrollArea, CAC.Applicatio
|
|||
num_collections = 0
|
||||
|
||||
|
||||
sorted_mime_descriptor = GetDescriptor( sorted_mimes, num_collections )
|
||||
plural = len( self._sorted_media ) > 1 or sum( ( m.GetNumFiles() for m in self._sorted_media ) ) > 1
|
||||
|
||||
sorted_mime_descriptor = GetDescriptor( plural, sorted_mimes, num_collections )
|
||||
|
||||
|
||||
if len( self._selected_media ) > 1000:
|
||||
|
||||
selected_mime_descriptor = 'file'
|
||||
selected_mime_descriptor = 'files'
|
||||
|
||||
else:
|
||||
|
||||
|
@ -766,7 +797,9 @@ class MediaPanel( ClientMedia.ListeningMediaList, QW.QScrollArea, CAC.Applicatio
|
|||
num_collections = 0
|
||||
|
||||
|
||||
selected_mime_descriptor = GetDescriptor( selected_mimes, num_collections )
|
||||
plural = len( self._selected_media ) > 1 or sum( ( m.GetNumFiles() for m in self._selected_media ) ) > 1
|
||||
|
||||
selected_mime_descriptor = GetDescriptor( plural, selected_mimes, num_collections )
|
||||
|
||||
|
||||
return ( sorted_mime_descriptor, selected_mime_descriptor )
|
||||
|
@ -1697,7 +1730,14 @@ class MediaPanel( ClientMedia.ListeningMediaList, QW.QScrollArea, CAC.Applicatio
|
|||
|
||||
duplicate_content_merge_options = panel.GetValue()
|
||||
|
||||
self._SetDuplicates( duplicate_type, duplicate_content_merge_options = duplicate_content_merge_options )
|
||||
if duplicate_type == HC.DUPLICATE_BETTER:
|
||||
|
||||
self._SetDuplicatesFocusedBetter( duplicate_content_merge_options = duplicate_content_merge_options )
|
||||
|
||||
else:
|
||||
|
||||
self._SetDuplicates( duplicate_type, duplicate_content_merge_options = duplicate_content_merge_options )
|
||||
|
||||
|
||||
|
||||
|
||||
|
@ -1729,7 +1769,7 @@ class MediaPanel( ClientMedia.ListeningMediaList, QW.QScrollArea, CAC.Applicatio
|
|||
|
||||
if result == QW.QDialog.Accepted:
|
||||
|
||||
self._SetDuplicates( HC.DUPLICATE_BETTER, media_pairs = media_pairs, silent = True )
|
||||
self._SetDuplicates( HC.DUPLICATE_BETTER, media_pairs = media_pairs, silent = True, duplicate_content_merge_options = duplicate_content_merge_options )
|
||||
|
||||
|
||||
else:
|
||||
|
@@ -3839,7 +3879,16 @@ class MediaPanelThumbnails( MediaPanel ):
        if multiple_selected:
            
-            selection_info_menu_label = '{} files, {}'.format( HydrusData.ToHumanInt( num_selected ), self._GetPrettyTotalSize( only_selected = True ) )
+            ( num_files_descriptor, selected_files_descriptor ) = self._GetSortedSelectedMimeDescriptors()
+            
+            selection_info_menu_label = '{} {}, {}'.format( HydrusData.ToHumanInt( num_selected ), selected_files_descriptor, self._GetPrettyTotalSize( only_selected = True ) )
+            
+            pretty_total_duration = self._GetPrettyTotalDuration( only_selected = True )
+            
+            if pretty_total_duration != '':
+                
+                selection_info_menu_label += ', {}'.format( pretty_total_duration )
+                
        else:
@@ -4736,16 +4785,15 @@ class Thumbnail( Selectable ):
        # EDIT 2: I think it may only look weird when the thumb banner has opacity. Maybe I need to learn about CompositionModes
        #
        # EDIT 3: Appalently Qt 6.4.0 may fix the basic 100% UI scale QImage init bug!
        #
+        # UPDATE 3a: Qt 6.4.x did not magically fix it. It draws much nicer, but still a different font weight/metrics compared to media viewer background, say.
+        # The PreferAntialias flag on 6.4.x seems to draw very very close to our ideal, so let's be happy with it for now.
        
        painter = QG.QPainter( qt_image )
        
        painter.setRenderHint( QG.QPainter.TextAntialiasing, True ) # is true already in tests, is supposed to be 'the way' to fix the ugly text issue
        painter.setRenderHint( QG.QPainter.Antialiasing, True ) # seems to do nothing, it only affects primitives?
        
-        if device_pixel_ratio > 1.0:
-            
-            painter.setRenderHint( QG.QPainter.SmoothPixmapTransform, True ) # makes the thumb scale up prettily and expensively when we need it
-            
+        painter.setRenderHint( QG.QPainter.SmoothPixmapTransform, True ) # makes the thumb QImage scale up and down prettily when we need it, either because it is too small or DPR gubbins
        
        new_options = HG.client_controller.new_options
@@ -373,7 +373,7 @@ class MediaCollectControl( QW.QWidget ):
        self._management_controller.SetVariable( 'media_collect', self._media_collect )
        
-        page_key = self._management_controller.GetKey( 'page' )
+        page_key = self._management_controller.GetVariable( 'page_key' )
        
        HG.client_controller.pub( 'collect_media', page_key, self._media_collect )
        HG.client_controller.pub( 'a_collect_happened', page_key )

@@ -472,7 +472,7 @@ class MediaCollectControl( QW.QWidget ):
    def SetCollectFromPage( self, page_key, media_collect ):
        
-        if page_key == self._management_controller.GetKey( 'page' ):
+        if page_key == self._management_controller.GetVariable( 'page_key' ):
            
            self.SetCollect( media_collect )
@@ -847,7 +847,7 @@ class MediaSortControl( QW.QWidget ):
        if self._management_controller is not None:
            
-            my_page_key = self._management_controller.GetKey( 'page' )
+            my_page_key = self._management_controller.GetVariable( 'page_key' )
            
            if page_key == my_page_key:

@@ -858,7 +858,7 @@ class MediaSortControl( QW.QWidget ):
    def BroadcastSort( self, page_key = None ):
        
-        if page_key is not None and page_key != self._management_controller.GetKey( 'page' ):
+        if page_key is not None and page_key != self._management_controller.GetVariable( 'page_key' ):
            
            return
@@ -1580,6 +1580,11 @@ class AutoCompleteDropdownTags( AutoCompleteDropdown ):
        raise NotImplementedError()
        
    
+    def GetLocationContext( self ) -> ClientLocation.LocationContext:
+        
+        return self._location_context_button.GetValue()
+        
+    
    def NotifyNewServices( self ):
        
        self._SetLocationContext( self._location_context_button.GetValue() )
@@ -20,9 +20,6 @@ from hydrus.client.media import ClientMediaManagers
from hydrus.client.media import ClientMediaResult
from hydrus.client.metadata import ClientTags

-hashes_to_jpeg_quality = {}
-hashes_to_pixel_hashes = {}
-
def FilterServiceKeysToContentUpdates( full_service_keys_to_content_updates, hashes ):
    
    if not isinstance( hashes, set ):

@@ -70,503 +67,6 @@ def FlattenMedia( media_list ):
    
    return flat_media
    
-def GetDuplicateComparisonScore( shown_media, comparison_media ):
-    
-    statements_and_scores = GetDuplicateComparisonStatements( shown_media, comparison_media )
-    
-    total_score = sum( ( score for ( statement, score ) in statements_and_scores.values() ) )
-    
-    return total_score
-    
-def GetDuplicateComparisonStatements( shown_media, comparison_media ):
-    
-    new_options = HG.client_controller.new_options
-    
-    duplicate_comparison_score_higher_jpeg_quality = new_options.GetInteger( 'duplicate_comparison_score_higher_jpeg_quality' )
-    duplicate_comparison_score_much_higher_jpeg_quality = new_options.GetInteger( 'duplicate_comparison_score_much_higher_jpeg_quality' )
-    duplicate_comparison_score_higher_filesize = new_options.GetInteger( 'duplicate_comparison_score_higher_filesize' )
-    duplicate_comparison_score_much_higher_filesize = new_options.GetInteger( 'duplicate_comparison_score_much_higher_filesize' )
-    duplicate_comparison_score_higher_resolution = new_options.GetInteger( 'duplicate_comparison_score_higher_resolution' )
-    duplicate_comparison_score_much_higher_resolution = new_options.GetInteger( 'duplicate_comparison_score_much_higher_resolution' )
-    duplicate_comparison_score_more_tags = new_options.GetInteger( 'duplicate_comparison_score_more_tags' )
-    duplicate_comparison_score_older = new_options.GetInteger( 'duplicate_comparison_score_older' )
-    duplicate_comparison_score_nicer_ratio = new_options.GetInteger( 'duplicate_comparison_score_nicer_ratio' )
-    
-    #
-    
-    statements_and_scores = {}
-    
-    s_hash = shown_media.GetHash()
-    c_hash = comparison_media.GetHash()
-    
-    s_mime = shown_media.GetMime()
-    c_mime = comparison_media.GetMime()
-    
-    # size
-    
-    s_size = shown_media.GetSize()
-    c_size = comparison_media.GetSize()
-    
-    is_a_pixel_dupe = False
-    
-    if shown_media.IsStaticImage() and comparison_media.IsStaticImage() and shown_media.GetResolution() == comparison_media.GetResolution():
-        
-        global hashes_to_pixel_hashes
-        
-        if s_hash not in hashes_to_pixel_hashes:
-            
-            path = HG.client_controller.client_files_manager.GetFilePath( s_hash, s_mime )
-            
-            hashes_to_pixel_hashes[ s_hash ] = HydrusImageHandling.GetImagePixelHash( path, s_mime )
-            
-        
-        if c_hash not in hashes_to_pixel_hashes:
-            
-            path = HG.client_controller.client_files_manager.GetFilePath( c_hash, c_mime )
-            
-            hashes_to_pixel_hashes[ c_hash ] = HydrusImageHandling.GetImagePixelHash( path, c_mime )
-            
-        
-        s_pixel_hash = hashes_to_pixel_hashes[ s_hash ]
-        c_pixel_hash = hashes_to_pixel_hashes[ c_hash ]
-        
-        if s_pixel_hash == c_pixel_hash:
-            
-            is_a_pixel_dupe = True
-            
-            if s_mime == HC.IMAGE_PNG and c_mime != HC.IMAGE_PNG:
-                
-                statement = 'this is a pixel-for-pixel duplicate png!'
-                
-                score = -100
-                
-            elif s_mime != HC.IMAGE_PNG and c_mime == HC.IMAGE_PNG:
-                
-                statement = 'other file is a pixel-for-pixel duplicate png!'
-                
-                score = 100
-                
-            else:
-                
-                statement = 'images are pixel-for-pixel duplicates!'
-                
-                score = 0
-                
-            
-            statements_and_scores[ 'pixel_duplicates' ] = ( statement, score )
-            
-        
-    
-    if s_size != c_size:
-        
-        absolute_size_ratio = max( s_size, c_size ) / min( s_size, c_size )
-        
-        if absolute_size_ratio > 2.0:
-            
-            if s_size > c_size:
-                
-                operator = '>>'
-                score = duplicate_comparison_score_much_higher_filesize
-                
-            else:
-                
-                operator = '<<'
-                score = -duplicate_comparison_score_much_higher_filesize
-                
-            
-        elif absolute_size_ratio > 1.05:
-            
-            if s_size > c_size:
-                
-                operator = '>'
-                score = duplicate_comparison_score_higher_filesize
-                
-            else:
-                
-                operator = '<'
-                score = -duplicate_comparison_score_higher_filesize
-                
-            
-        else:
-            
-            operator = CC.UNICODE_ALMOST_EQUAL_TO
-            score = 0
-            
-        
-        if s_size > c_size:
-            
-            sign = '+'
-            percentage_difference = ( s_size / c_size ) - 1.0
-            
-        else:
-            
-            sign = ''
-            percentage_difference = ( s_size / c_size ) - 1.0
-            
-        
-        percentage_different_string = ' ({}{})'.format( sign, HydrusData.ConvertFloatToPercentage( percentage_difference ) )
-        
-        if is_a_pixel_dupe:
-            
-            score = 0
-            
-        
-        statement = '{} {} {}{}'.format( HydrusData.ToHumanBytes( s_size ), operator, HydrusData.ToHumanBytes( c_size ), percentage_different_string )
-        
-        statements_and_scores[ 'filesize' ] = ( statement, score )
-        
-    
-    # higher/same res
-    
-    s_resolution = shown_media.GetResolution()
-    c_resolution = comparison_media.GetResolution()
-    
-    if s_resolution != c_resolution:
-        
-        ( s_w, s_h ) = s_resolution
-        ( c_w, c_h ) = c_resolution
-        
-        all_measurements_are_good = None not in ( s_w, s_h, c_w, c_h ) and True not in ( d <= 0 for d in ( s_w, s_h, c_w, c_h ) )
-        
-        if all_measurements_are_good:
-            
-            resolution_ratio = ( s_w * s_h ) / ( c_w * c_h )
-            
-            if resolution_ratio == 1.0:
-                
-                operator = '!='
-                score = 0
-                
-            elif resolution_ratio > 2.0:
-                
-                operator = '>>'
-                score = duplicate_comparison_score_much_higher_resolution
-                
-            elif resolution_ratio > 1.00:
-                
-                operator = '>'
-                score = duplicate_comparison_score_higher_resolution
-                
-            elif resolution_ratio < 0.5:
-                
-                operator = '<<'
-                score = -duplicate_comparison_score_much_higher_resolution
-                
-            else:
-                
-                operator = '<'
-                score = -duplicate_comparison_score_higher_resolution
-                
-            
-            if s_resolution in HC.NICE_RESOLUTIONS:
-                
-                s_string = HC.NICE_RESOLUTIONS[ s_resolution ]
-                
-            else:
-                
-                s_string = HydrusData.ConvertResolutionToPrettyString( s_resolution )
-                
-                if s_w % 2 == 1 or s_h % 2 == 1:
-                    
-                    s_string += ' (unusual)'
-                    
-                
-            
-            if c_resolution in HC.NICE_RESOLUTIONS:
-                
-                c_string = HC.NICE_RESOLUTIONS[ c_resolution ]
-                
-            else:
-                
-                c_string = HydrusData.ConvertResolutionToPrettyString( c_resolution )
-                
-                if c_w % 2 == 1 or c_h % 2 == 1:
-                    
-                    c_string += ' (unusual)'
-                    
-                
-            
-            statement = '{} {} {}'.format( s_string, operator, c_string )
-            
-            statements_and_scores[ 'resolution' ] = ( statement, score )
-            
-            #
-            
-            s_ratio = s_w / s_h
-            c_ratio = c_w / c_h
-            
-            s_nice = s_ratio in HC.NICE_RATIOS
-            c_nice = c_ratio in HC.NICE_RATIOS
-            
-            if s_nice or c_nice:
-                
-                if s_nice:
-                    
-                    s_string = HC.NICE_RATIOS[ s_ratio ]
-                    
-                else:
-                    
-                    s_string = 'unusual'
-                    
-                
-                if c_nice:
-                    
-                    c_string = HC.NICE_RATIOS[ c_ratio ]
-                    
-                else:
-                    
-                    c_string = 'unusual'
-                    
-                
-                if s_nice and c_nice:
-                    
-                    operator = '-'
-                    score = 0
-                    
-                elif s_nice:
-                    
-                    operator = '>'
-                    score = duplicate_comparison_score_nicer_ratio
-                    
-                elif c_nice:
-                    
-                    operator = '<'
-                    score = -duplicate_comparison_score_nicer_ratio
-                    
-                
-                if s_string == c_string:
-                    
-                    statement = 'both {}'.format( s_string )
-                    
-                else:
-                    
-                    statement = '{} {} {}'.format( s_string, operator, c_string )
-                    
-                
-                statements_and_scores[ 'ratio' ] = ( statement, score )
-                
-            
-        
-    
-    # same/diff mime
-    
-    if s_mime != c_mime:
-        
-        statement = '{} vs {}'.format( HC.mime_string_lookup[ s_mime ], HC.mime_string_lookup[ c_mime ] )
-        score = 0
-        
-        statements_and_scores[ 'mime' ] = ( statement, score )
-        
-    
-    # more tags
-    
-    s_num_tags = len( shown_media.GetTagsManager().GetCurrentAndPending( CC.COMBINED_TAG_SERVICE_KEY, ClientTags.TAG_DISPLAY_ACTUAL ) )
-    c_num_tags = len( comparison_media.GetTagsManager().GetCurrentAndPending( CC.COMBINED_TAG_SERVICE_KEY, ClientTags.TAG_DISPLAY_ACTUAL ) )
-    
-    if s_num_tags != c_num_tags:
-        
-        if s_num_tags > 0 and c_num_tags > 0:
-            
-            if s_num_tags > c_num_tags:
-                
-                operator = '>'
-                score = duplicate_comparison_score_more_tags
-                
-            else:
-                
-                operator = '<'
-                score = -duplicate_comparison_score_more_tags
-                
-            
-        elif s_num_tags > 0:
-            
-            operator = '>>'
-            score = duplicate_comparison_score_more_tags
-            
-        elif c_num_tags > 0:
-            
-            operator = '<<'
-            score = -duplicate_comparison_score_more_tags
-            
-        
-        statement = '{} tags {} {} tags'.format( HydrusData.ToHumanInt( s_num_tags ), operator, HydrusData.ToHumanInt( c_num_tags ) )
-        
-        statements_and_scores[ 'num_tags' ] = ( statement, score )
-        
-    
-    # older
-    
-    s_ts = shown_media.GetLocationsManager().GetCurrentTimestamp( CC.COMBINED_LOCAL_FILE_SERVICE_KEY )
-    c_ts = comparison_media.GetLocationsManager().GetCurrentTimestamp( CC.COMBINED_LOCAL_FILE_SERVICE_KEY )
-    
-    one_month = 86400 * 30
-    
-    if s_ts is not None and c_ts is not None and abs( s_ts - c_ts ) > one_month:
-        
-        if s_ts < c_ts:
-            
-            operator = 'older than'
-            score = duplicate_comparison_score_older
-            
-        else:
-            
-            operator = 'newer than'
-            score = -duplicate_comparison_score_older
-            
-        
-        if is_a_pixel_dupe:
-            
-            score = 0
-            
-        
-        statement = '{}, {} {}'.format( ClientData.TimestampToPrettyTimeDelta( s_ts, history_suffix = ' old' ), operator, ClientData.TimestampToPrettyTimeDelta( c_ts, history_suffix = ' old' ) )
-        
-        statements_and_scores[ 'time_imported' ] = ( statement, score )
-        
-    
-    if s_mime == HC.IMAGE_JPEG and c_mime == HC.IMAGE_JPEG:
-        
-        global hashes_to_jpeg_quality
-        
-        if s_hash not in hashes_to_jpeg_quality:
-            
-            path = HG.client_controller.client_files_manager.GetFilePath( s_hash, s_mime )
-            
-            hashes_to_jpeg_quality[ s_hash ] = HydrusImageHandling.GetJPEGQuantizationQualityEstimate( path )
-            
-        
-        if c_hash not in hashes_to_jpeg_quality:
-            
-            path = HG.client_controller.client_files_manager.GetFilePath( c_hash, c_mime )
-            
-            hashes_to_jpeg_quality[ c_hash ] = HydrusImageHandling.GetJPEGQuantizationQualityEstimate( path )
-            
-        
-        ( s_label, s_jpeg_quality ) = hashes_to_jpeg_quality[ s_hash ]
-        ( c_label, c_jpeg_quality ) = hashes_to_jpeg_quality[ c_hash ]
-        
-        score = 0
-        
-        if s_label != c_label:
-            
-            if c_jpeg_quality is None or s_jpeg_quality is None:
-                
-                score = 0
-                
-            else:
-                
-                # other way around, low score is good here
-                quality_ratio = c_jpeg_quality / s_jpeg_quality
-                
-                if quality_ratio > 2.0:
-                    
-                    score = duplicate_comparison_score_much_higher_jpeg_quality
-                    
-                elif quality_ratio > 1.0:
-                    
-                    score = duplicate_comparison_score_higher_jpeg_quality
-                    
-                elif quality_ratio < 0.5:
-                    
-                    score = -duplicate_comparison_score_much_higher_jpeg_quality
-                    
-                else:
-                    
-                    score = -duplicate_comparison_score_higher_jpeg_quality
-                    
-                
-            
-            statement = '{} vs {} jpeg quality'.format( s_label, c_label )
-            
-            statements_and_scores[ 'jpeg_quality' ] = ( statement, score )
-            
-        
-    
-    def has_exif( m ):
-        
-        try:
-            
-            hash = m.GetHash()
-            mime = m.GetMime()
-            
-            if mime not in ( HC.IMAGE_JPEG, HC.IMAGE_TIFF ):
-                
-                return False
-                
-            
-            path = HG.client_controller.client_files_manager.GetFilePath( hash, mime )
-            
-            pil_image = HydrusImageHandling.RawOpenPILImage( path )
-            
-            exif_dict = HydrusImageHandling.GetEXIFDict( pil_image )
-            
-            if exif_dict is None:
-                
-                return False
-                
-            
-            return len( exif_dict ) > 0
-            
-        except:
-            
-            return False
-            
-        
-    
-    s_has_exif = has_exif( shown_media )
-    c_has_exif = has_exif( comparison_media )
-    
-    if s_has_exif ^ c_has_exif:
-        
-        if s_has_exif:
-            
-            exif_statement = 'has exif data, the other does not'
-            
-        else:
-            
-            exif_statement = 'the other has exif data, this does not'
-            
-        
-        statements_and_scores[ 'exif_data' ] = ( exif_statement, 0 )
-        
-    
-    s_has_human_readable_embedded_metadata = shown_media.GetMediaResult().GetFileInfoManager().has_human_readable_embedded_metadata
-    c_has_human_readable_embedded_metadata = comparison_media.GetMediaResult().GetFileInfoManager().has_human_readable_embedded_metadata
-    
-    if s_has_human_readable_embedded_metadata ^ c_has_human_readable_embedded_metadata:
-        
-        if s_has_human_readable_embedded_metadata:
-            
-            embedded_metadata_statement = 'has embedded metadata, the other does not'
-            
-        else:
-            
-            embedded_metadata_statement = 'the other has embedded metadata, this does not'
-            
-        
-        statements_and_scores[ 'embedded_metadata' ] = ( embedded_metadata_statement, 0 )
-        
-    
-    s_has_icc = shown_media.GetMediaResult().GetFileInfoManager().has_icc_profile
-    c_has_icc = comparison_media.GetMediaResult().GetFileInfoManager().has_icc_profile
-    
-    if s_has_icc ^ c_has_icc:
-        
-        if s_has_icc:
-            
-            icc_statement = 'has icc profile, the other does not'
-            
-        else:
-            
-            icc_statement = 'the other has icc profile, this does not'
-            
-        
-        statements_and_scores[ 'icc_profile' ] = ( icc_statement, 0 )
-        
-    
-    return statements_and_scores
-    
def GetMediasTags( pool, tag_service_key, tag_display_type, content_statuses ):
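The comparison statements removed above score each axis with a symmetric ratio threshold: a file more than 2x the size of the other counts as much bigger (`>>`), more than 1.05x as bigger (`>`), anything closer as roughly equal. A minimal stand-alone sketch of that pattern (the function name and default scores here are illustrative, not hydrus's actual API):

```python
# Illustrative sketch of the symmetric ratio-threshold scoring used by the
# filesize comparison above. Positive scores favour the shown file, negative
# scores favour the comparison file.
def compare_filesizes( s_size, c_size, higher_score = 10, much_higher_score = 20 ):
    
    # the ratio is symmetric: max over min, so it is always >= 1.0
    ratio = max( s_size, c_size ) / min( s_size, c_size )
    
    if ratio > 2.0:
        
        ( operator, score ) = ( '>>', much_higher_score ) if s_size > c_size else ( '<<', -much_higher_score )
        
    elif ratio > 1.05:
        
        ( operator, score ) = ( '>', higher_score ) if s_size > c_size else ( '<', -higher_score )
        
    else:
        
        # within 5% either way counts as roughly equal
        ( operator, score ) = ( '~', 0 )
        
    
    return ( operator, score )
```

The same shape repeats for resolution and jpeg quality in the code above, just with different thresholds and option-driven score values.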
@@ -80,7 +80,7 @@ class SingleFileMetadataRouter( HydrusSerialisable.SerialisableBase ):
        suffix = actually_an_importer.GetSuffix()
        
-        exporter = ClientMetadataMigrationExporters.SingleFileMetadataExporterTXT( suffix )
+        exporter = ClientMetadataMigrationExporters.SingleFileMetadataExporterTXT( suffix = suffix )
        
    elif isinstance( actually_an_importer, ClientMetadataMigrationImporters.SingleFileMetadataImporterMediaTags ):
@@ -395,9 +395,9 @@ class SingleFileMetadataExporterTXT( HydrusSerialisable.SerialisableBase, Single
    SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_METADATA_SINGLE_FILE_EXPORTER_TXT
    SERIALISABLE_NAME = 'Metadata Single File Exporter TXT'
-    SERIALISABLE_VERSION = 2
+    SERIALISABLE_VERSION = 3
    
-    def __init__( self, remove_actual_filename_ext = None, suffix = None, filename_string_converter = None ):
+    def __init__( self, remove_actual_filename_ext = None, suffix = None, filename_string_converter = None, separator = None ):
        
        if remove_actual_filename_ext is None:

@@ -414,20 +414,27 @@
            filename_string_converter = ClientStrings.StringConverter( example_string = '0123456789abcdef.jpg.txt' )
            
        
+        if separator is None:
+            
+            separator = '\n'
+            
+        
        HydrusSerialisable.SerialisableBase.__init__( self )
        SingleFileMetadataExporterSidecar.__init__( self, remove_actual_filename_ext, suffix, filename_string_converter )
        
+        self._separator = separator
+        
    
    def _GetSerialisableInfo( self ):
        
        serialisable_filename_string_converter = self._filename_string_converter.GetSerialisableTuple()
        
-        return ( self._remove_actual_filename_ext, self._suffix, serialisable_filename_string_converter )
+        return ( self._remove_actual_filename_ext, self._suffix, serialisable_filename_string_converter, self._separator )
        
    
    def _InitialiseFromSerialisableInfo( self, serialisable_info ):
        
-        ( self._remove_actual_filename_ext, self._suffix, serialisable_filename_string_converter ) = serialisable_info
+        ( self._remove_actual_filename_ext, self._suffix, serialisable_filename_string_converter, self._separator ) = serialisable_info
        
        self._filename_string_converter = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_filename_string_converter )

@@ -448,6 +455,17 @@
            return ( 2, new_serialisable_info )
            
        
+        if version == 2:
+            
+            ( remove_actual_filename_ext, suffix, serialisable_filename_string_converter ) = old_serialisable_info
+            
+            separator = '\n'
+            
+            new_serialisable_info = ( remove_actual_filename_ext, suffix, serialisable_filename_string_converter, separator )
+            
+            return ( 3, new_serialisable_info )
+            
+        
    def Export( self, actual_file_path: str, rows: typing.Collection[ str ] ):

@@ -460,10 +478,20 @@
        with open( path, 'w', encoding = 'utf-8' ) as f:
            
-            f.write( '\n'.join( rows ) )
+            f.write( self._separator.join( rows ) )
            
        
    
+    def GetSeparator( self ) -> str:
+        
+        return self._separator
+        
+    
+    def SetSeparator( self, separator: str ):
+        
+        self._separator = separator
+        
+    
    def ToString( self ) -> str:
        
        suffix_s = '' if self._suffix == '' else '.{}'.format( self._suffix )
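The `_UpdateSerialisableInfo` hunk above shows the versioned-tuple migration pattern these serialisable classes use: a stored tuple is upgraded one version at a time until it reaches the current shape. A hypothetical stand-alone version of the version 2 to 3 step (names simplified, not the real hydrus method):

```python
# Hypothetical sketch of the serialisable version-migration step above:
# a version 2 exporter tuple gains a default '\n' separator and becomes
# version 3; already-current tuples pass through unchanged.
def update_serialisable_info( version, old_serialisable_info ):
    
    if version == 2:
        
        ( remove_actual_filename_ext, suffix, serialisable_filename_string_converter ) = old_serialisable_info
        
        separator = '\n'
        
        new_serialisable_info = ( remove_actual_filename_ext, suffix, serialisable_filename_string_converter, separator )
        
        return ( 3, new_serialisable_info )
        
    
    return ( version, old_serialisable_info )
```

In the real class the framework calls this repeatedly, so an object saved at any old version is walked forward step by step on load.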
@@ -437,9 +437,9 @@ class SingleFileMetadataImporterTXT( HydrusSerialisable.SerialisableBase, Single
    SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_METADATA_SINGLE_FILE_IMPORTER_TXT
    SERIALISABLE_NAME = 'Metadata Single File Importer TXT'
-    SERIALISABLE_VERSION = 3
+    SERIALISABLE_VERSION = 4
    
-    def __init__( self, string_processor = None, remove_actual_filename_ext = None, suffix = None, filename_string_converter = None ):
+    def __init__( self, string_processor = None, remove_actual_filename_ext = None, suffix = None, filename_string_converter = None, separator = None ):
        
        if remove_actual_filename_ext is None:

@@ -461,6 +461,13 @@
            string_processor = ClientStrings.StringProcessor()
            
        
+        if separator is None:
+            
+            separator = '\n'
+            
+        
+        self._separator = separator
+        
        HydrusSerialisable.SerialisableBase.__init__( self )
        SingleFileMetadataImporterSidecar.__init__( self, string_processor, remove_actual_filename_ext, suffix, filename_string_converter )

@@ -470,12 +477,12 @@
        serialisable_string_processor = self._string_processor.GetSerialisableTuple()
        serialisable_filename_string_converter = self._filename_string_converter.GetSerialisableTuple()
        
-        return ( serialisable_string_processor, self._remove_actual_filename_ext, self._suffix, serialisable_filename_string_converter )
+        return ( serialisable_string_processor, self._remove_actual_filename_ext, self._suffix, serialisable_filename_string_converter, self._separator )
        
    
    def _InitialiseFromSerialisableInfo( self, serialisable_info ):
        
-        ( serialisable_string_processor, self._remove_actual_filename_ext, self._suffix, serialisable_filename_string_converter ) = serialisable_info
+        ( serialisable_string_processor, self._remove_actual_filename_ext, self._suffix, serialisable_filename_string_converter, self._separator ) = serialisable_info
        
        self._string_processor = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_string_processor )
        self._filename_string_converter = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_filename_string_converter )

@@ -510,12 +517,28 @@
            return ( 3, new_serialisable_info )
            
        
+        if version == 3:
+            
+            ( serialisable_string_processor, remove_actual_filename_ext, suffix, serialisable_filename_string_converter ) = old_serialisable_info
+            
+            separator = '\n'
+            
+            new_serialisable_info = ( serialisable_string_processor, remove_actual_filename_ext, suffix, serialisable_filename_string_converter, separator )
+            
+            return ( 4, new_serialisable_info )
+            
+        
    def GetExpectedSidecarPath( self, actual_file_path: str ):
        
        return ClientMetadataMigrationCore.GetSidecarPath( actual_file_path, self._remove_actual_filename_ext, self._suffix, self._filename_string_converter, 'txt' )
        
    
+    def GetSeparator( self ) -> str:
+        
+        return self._separator
+        
+    
    def Import( self, actual_file_path: str ) -> typing.Collection[ str ]:
        
        path = self.GetExpectedSidecarPath( actual_file_path )

@@ -539,6 +562,14 @@
        rows = HydrusText.DeserialiseNewlinedTexts( raw_text )
        
+        if self._separator != '\n':
+            
+            # don't want any newlines, so this 'undo' is correct
+            rejoined_text = ''.join( rows )
+            
+            rows = rejoined_text.split( self._separator )
+            
+        
        if self._string_processor.MakesChanges():
            
            rows = self._string_processor.ProcessStrings( rows )

@@ -547,6 +578,11 @@
        return rows
        
    
+    def SetSeparator( self, separator: str ):
+        
+        self._separator = separator
+        
+    
    def ToString( self ) -> str:
        
        if self._string_processor.MakesChanges():
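The exporter and importer hunks above form a round-trip: rows are joined with the custom separator on export, and on import the raw text is newline-split first (the pre-existing behaviour), then, for a non-newline separator, rejoined and re-split. A sketch of that logic with plain functions (these helper names are illustrative, not hydrus's real classes):

```python
# Sketch of the new sidecar separator round-trip shown in the diffs above.
def export_rows( rows, separator ):
    
    # exporter side: join the rows with the configured separator
    return separator.join( rows )

def import_rows( raw_text, separator ):
    
    # stand-in for HydrusText.DeserialiseNewlinedTexts: newline-split, drop empties
    rows = [ line for line in raw_text.splitlines() if line.strip() != '' ]
    
    if separator != '\n':
        
        # don't want any newlines, so this 'undo' is correct
        rows = ''.join( rows ).split( separator )
        
    
    return rows
```

Note the assumption this encodes: a non-newline separator never legitimately contains newlines in its rows, so stripping them before the real split loses nothing.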
@@ -541,7 +541,7 @@ class TagDisplayMaintenanceManager( object ):
    def GetName( self ):
        
-        return 'tag display maintenance'
+        return 'tag display sync'
        
    
    def IsShutdown( self ):

@@ -569,7 +569,9 @@ class TagDisplayMaintenanceManager( object ):
            except HydrusExceptions.NotFoundException:
                
-                time.sleep( 5 )
+                self._wake_event.wait( 5 )
+                
+                self._wake_event.clear()
                
                continue
@@ -113,6 +113,7 @@ class HydrusServiceClientAPI( HydrusClientService ):
        manage_pages.putChild( b'focus_page', ClientLocalServerResources.HydrusResourceClientAPIRestrictedManagePagesFocusPage( self._service, self._client_requests_domain ) )
        manage_pages.putChild( b'get_pages', ClientLocalServerResources.HydrusResourceClientAPIRestrictedManagePagesGetPages( self._service, self._client_requests_domain ) )
        manage_pages.putChild( b'get_page_info', ClientLocalServerResources.HydrusResourceClientAPIRestrictedManagePagesGetPageInfo( self._service, self._client_requests_domain ) )
+        manage_pages.putChild( b'refresh_page', ClientLocalServerResources.HydrusResourceClientAPIRestrictedManagePagesRefreshPage( self._service, self._client_requests_domain ) )
        
        manage_database = NoResource()
@@ -3228,6 +3228,7 @@ class HydrusResourceClientAPIRestrictedManagePagesGetPages( HydrusResourceClient
        return response_context
        
    

class HydrusResourceClientAPIRestrictedManagePagesGetPageInfo( HydrusResourceClientAPIRestrictedManagePages ):
    
    def _threadDoGETJob( self, request: HydrusServerRequest.HydrusRequest ):

@@ -3257,3 +3258,29 @@ class HydrusResourceClientAPIRestrictedManagePagesGetPageInfo( HydrusResourceCli
        return response_context
        
    
+class HydrusResourceClientAPIRestrictedManagePagesRefreshPage( HydrusResourceClientAPIRestrictedManagePages ):
+    
+    def _threadDoPOSTJob( self, request: HydrusServerRequest.HydrusRequest ):
+        
+        def do_it( page_key ):
+            
+            return HG.client_controller.gui.RefreshPage( page_key )
+            
+        
+        page_key = request.parsed_request_args.GetValue( 'page_key', bytes )
+        
+        try:
+            
+            HG.client_controller.CallBlockingToQt( HG.client_controller.gui, do_it, page_key )
+            
+        except HydrusExceptions.DataMissing as e:
+            
+            raise HydrusExceptions.NotFoundException( 'Could not find that page!' )
+            
+        
+        response_context = HydrusServerResources.ResponseContext( 200 )
+        
+        return response_context
@@ -83,8 +83,8 @@ options = {}
# Misc

NETWORK_VERSION = 20
-SOFTWARE_VERSION = 511
-CLIENT_API_VERSION = 38
+SOFTWARE_VERSION = 512
+CLIENT_API_VERSION = 39

SERVER_THUMBNAIL_DIMENSIONS = ( 200, 200 )
@@ -117,7 +117,8 @@ CONTENT_MERGE_ACTION_NONE = 3
content_merge_string_lookup = {
    CONTENT_MERGE_ACTION_COPY : 'copy from worse to better',
    CONTENT_MERGE_ACTION_MOVE : 'move from worse to better',
-    CONTENT_MERGE_ACTION_TWO_WAY_MERGE : 'copy in both directions'
+    CONTENT_MERGE_ACTION_TWO_WAY_MERGE : 'copy in both directions',
+    CONTENT_MERGE_ACTION_NONE : 'do nothing'
}

CONTENT_STATUS_CURRENT = 0
@@ -211,6 +211,8 @@ class SerialisableBase( object ):
    SERIALISABLE_NAME = 'Base Serialisable Object'
    SERIALISABLE_VERSION = 1
    
+    # don't make an __eq__ here without more testing and research, it messes a bunch of things up in sets and hashing and stuff
+    
    def _GetSerialisableInfo( self ):
        
        raise NotImplementedError()
@@ -2554,7 +2554,33 @@ class TestClientAPI( unittest.TestCase ):
        self.assertEqual( response.status, 200 )
        
-        result = HG.test_controller.GetWrite( 'show_page' )
+        result = HG.test_controller.GetWrite( 'show_page' ) # a fake hook in the controller handles this
        
        expected_result = [ ( ( page_key, ), {} ) ]
        
        self.assertEqual( result, expected_result )
        
+        #
+        
+        headers = { 'Hydrus-Client-API-Access-Key' : access_key_hex, 'Content-Type' : HC.mime_mimetype_string_lookup[ HC.APPLICATION_JSON ] }
+        
+        path = '/manage_pages/refresh_page'
+        
+        page_key = os.urandom( 32 )
+        
+        request_dict = { 'page_key' : page_key.hex() }
+        
+        request_body = json.dumps( request_dict )
+        
+        connection.request( 'POST', path, body = request_body, headers = headers )
+        
+        response = connection.getresponse()
+        
+        data = response.read()
+        
+        self.assertEqual( response.status, 200 )
+        
+        result = HG.test_controller.GetWrite( 'refresh_page' ) # a fake hook in the controller handles this
+        
+        expected_result = [ ( ( page_key, ), {} ) ]
@@ -123,19 +123,19 @@ class TestClientDBDuplicates( unittest.TestCase ):
 pixel_dupes_preference = CC.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
 max_hamming_distance = 4
-both_files_match = True
+dupe_search_type = CC.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH
 
-num_potentials = self._read( 'potential_duplicates_count', self._file_search_context, both_files_match, pixel_dupes_preference, max_hamming_distance )
+num_potentials = self._read( 'potential_duplicates_count', self._file_search_context_1, self._file_search_context_2, dupe_search_type, pixel_dupes_preference, max_hamming_distance )
 
 self.assertEqual( num_potentials, self._expected_num_potentials )
 
-result = self._read( 'random_potential_duplicate_hashes', self._file_search_context, both_files_match, pixel_dupes_preference, max_hamming_distance )
+result = self._read( 'random_potential_duplicate_hashes', self._file_search_context_1, self._file_search_context_2, dupe_search_type, pixel_dupes_preference, max_hamming_distance )
 
 self.assertEqual( len( result ), len( self._all_hashes ) )
 
 self.assertEqual( set( result ), self._all_hashes )
 
-filtering_pairs = self._read( 'duplicate_pairs_for_filtering', self._file_search_context, both_files_match, pixel_dupes_preference, max_hamming_distance )
+filtering_pairs = self._read( 'duplicate_pairs_for_filtering', self._file_search_context_1, self._file_search_context_2, dupe_search_type, pixel_dupes_preference, max_hamming_distance )
 
 for ( a, b ) in filtering_pairs:

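The change repeated through these tests is the API migration itself: the old single `file_search_context` plus a `both_files_match` bool becomes two contexts plus a `dupe_search_type`. Only `CC.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH` appears in this diff; the other enum name and the integer values below are assumptions for illustration. A sketch of how old parameters map onto the new triple:

```python
# assumed enum values standing in for hydrus's ClientConstants; only
# DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH is confirmed by the diff above
DUPE_SEARCH_ONE_FILE_MATCHES_ONE_SEARCH = 0
DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH = 1

def convert_old_dupe_search_params( file_search_context, both_files_match ):
    """Map the old ( context, bool ) pair onto the new ( context_1, context_2, type ) triple."""

    if both_files_match:
        dupe_search_type = DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH
    else:
        dupe_search_type = DUPE_SEARCH_ONE_FILE_MATCHES_ONE_SEARCH

    # with a single search, the second context simply repeats the first
    return ( file_search_context, file_search_context, dupe_search_type )

print( convert_old_dupe_search_params( 'system:everything', True ) )
```

A genuinely two-search query ('pngs vs jpegs' from the changelog notes) would instead pass two distinct contexts and a both-match-different-searches type.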
@@ -176,9 +176,9 @@ class TestClientDBDuplicates( unittest.TestCase ):
 pixel_dupes_preference = CC.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
 max_hamming_distance = 4
-both_files_match = True
+dupe_search_type = CC.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH
 
-num_potentials = self._read( 'potential_duplicates_count', self._file_search_context, both_files_match, pixel_dupes_preference, max_hamming_distance )
+num_potentials = self._read( 'potential_duplicates_count', self._file_search_context_1, self._file_search_context_2, dupe_search_type, pixel_dupes_preference, max_hamming_distance )
 
 self._num_free_agents -= 1

@@ -264,9 +264,9 @@ class TestClientDBDuplicates( unittest.TestCase ):
 pixel_dupes_preference = CC.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
 max_hamming_distance = 4
-both_files_match = True
+dupe_search_type = CC.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH
 
-num_potentials = self._read( 'potential_duplicates_count', self._file_search_context, both_files_match, pixel_dupes_preference, max_hamming_distance )
+num_potentials = self._read( 'potential_duplicates_count', self._file_search_context_1, self._file_search_context_2, dupe_search_type, pixel_dupes_preference, max_hamming_distance )
 
 self._num_free_agents -= 1

@@ -331,9 +331,9 @@ class TestClientDBDuplicates( unittest.TestCase ):
 pixel_dupes_preference = CC.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
 max_hamming_distance = 4
-both_files_match = True
+dupe_search_type = CC.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH
 
-num_potentials = self._read( 'potential_duplicates_count', self._file_search_context, both_files_match, pixel_dupes_preference, max_hamming_distance )
+num_potentials = self._read( 'potential_duplicates_count', self._file_search_context_1, self._file_search_context_2, dupe_search_type, pixel_dupes_preference, max_hamming_distance )
 
 self._num_free_agents -= 1

@@ -509,9 +509,9 @@ class TestClientDBDuplicates( unittest.TestCase ):
 pixel_dupes_preference = CC.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
 max_hamming_distance = 4
-both_files_match = True
+dupe_search_type = CC.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH
 
-num_potentials = self._read( 'potential_duplicates_count', self._file_search_context, both_files_match, pixel_dupes_preference, max_hamming_distance )
+num_potentials = self._read( 'potential_duplicates_count', self._file_search_context_1, self._file_search_context_2, dupe_search_type, pixel_dupes_preference, max_hamming_distance )
 
 self.assertLess( num_potentials, self._expected_num_potentials )

@@ -593,9 +593,9 @@ class TestClientDBDuplicates( unittest.TestCase ):
 pixel_dupes_preference = CC.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
 max_hamming_distance = 4
-both_files_match = True
+dupe_search_type = CC.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH
 
-num_potentials = self._read( 'potential_duplicates_count', self._file_search_context, both_files_match, pixel_dupes_preference, max_hamming_distance )
+num_potentials = self._read( 'potential_duplicates_count', self._file_search_context_1, self._file_search_context_2, dupe_search_type, pixel_dupes_preference, max_hamming_distance )
 
 self.assertLess( num_potentials, self._expected_num_potentials )

@@ -637,9 +637,9 @@ class TestClientDBDuplicates( unittest.TestCase ):
 pixel_dupes_preference = CC.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
 max_hamming_distance = 4
-both_files_match = True
+dupe_search_type = CC.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH
 
-num_potentials = self._read( 'potential_duplicates_count', self._file_search_context, both_files_match, pixel_dupes_preference, max_hamming_distance )
+num_potentials = self._read( 'potential_duplicates_count', self._file_search_context_1, self._file_search_context_2, dupe_search_type, pixel_dupes_preference, max_hamming_distance )
 
 self.assertLess( num_potentials, self._expected_num_potentials )

@@ -657,9 +657,9 @@ class TestClientDBDuplicates( unittest.TestCase ):
 pixel_dupes_preference = CC.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
 max_hamming_distance = 4
-both_files_match = True
+dupe_search_type = CC.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH
 
-num_potentials = self._read( 'potential_duplicates_count', self._file_search_context, both_files_match, pixel_dupes_preference, max_hamming_distance )
+num_potentials = self._read( 'potential_duplicates_count', self._file_search_context_1, self._file_search_context_2, dupe_search_type, pixel_dupes_preference, max_hamming_distance )
 
 self.assertLess( num_potentials, self._expected_num_potentials )

@@ -713,9 +713,9 @@ class TestClientDBDuplicates( unittest.TestCase ):
 pixel_dupes_preference = CC.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
 max_hamming_distance = 4
-both_files_match = True
+dupe_search_type = CC.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH
 
-num_potentials = self._read( 'potential_duplicates_count', self._file_search_context, both_files_match, pixel_dupes_preference, max_hamming_distance )
+num_potentials = self._read( 'potential_duplicates_count', self._file_search_context_1, self._file_search_context_2, dupe_search_type, pixel_dupes_preference, max_hamming_distance )
 
 self.assertLess( num_potentials, self._expected_num_potentials )

@@ -733,9 +733,9 @@ class TestClientDBDuplicates( unittest.TestCase ):
 pixel_dupes_preference = CC.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
 max_hamming_distance = 4
-both_files_match = True
+dupe_search_type = CC.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH
 
-num_potentials = self._read( 'potential_duplicates_count', self._file_search_context, both_files_match, pixel_dupes_preference, max_hamming_distance )
+num_potentials = self._read( 'potential_duplicates_count', self._file_search_context_1, self._file_search_context_2, dupe_search_type, pixel_dupes_preference, max_hamming_distance )
 
 self.assertLess( num_potentials, self._expected_num_potentials )

@@ -791,9 +791,9 @@ class TestClientDBDuplicates( unittest.TestCase ):
 pixel_dupes_preference = CC.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
 max_hamming_distance = 4
-both_files_match = True
+dupe_search_type = CC.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH
 
-num_potentials = self._read( 'potential_duplicates_count', self._file_search_context, both_files_match, pixel_dupes_preference, max_hamming_distance )
+num_potentials = self._read( 'potential_duplicates_count', self._file_search_context_1, self._file_search_context_2, dupe_search_type, pixel_dupes_preference, max_hamming_distance )
 
 self.assertLess( num_potentials, self._expected_num_potentials )

@@ -850,9 +850,9 @@ class TestClientDBDuplicates( unittest.TestCase ):
 pixel_dupes_preference = CC.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
 max_hamming_distance = 4
-both_files_match = True
+dupe_search_type = CC.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH
 
-num_potentials = self._read( 'potential_duplicates_count', self._file_search_context, both_files_match, pixel_dupes_preference, max_hamming_distance )
+num_potentials = self._read( 'potential_duplicates_count', self._file_search_context_1, self._file_search_context_2, dupe_search_type, pixel_dupes_preference, max_hamming_distance )
 
 self.assertLess( num_potentials, self._expected_num_potentials )

@@ -1059,10 +1059,12 @@ class TestClientDBDuplicates( unittest.TestCase ):
 self._expected_num_potentials = int( n * ( n - 1 ) / 2 )
 
 size_pred = ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_SIZE, ( '=', 65535, HydrusData.ConvertUnitToInt( 'B' ) ) )
+png_pred = ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_MIME, ( HC.IMAGE_PNG, ) )
 
 location_context = ClientLocation.LocationContext.STATICCreateSimple( CC.LOCAL_FILE_SERVICE_KEY )
 
-self._file_search_context = ClientSearch.FileSearchContext( location_context = location_context, predicates = [ size_pred ] )
+self._file_search_context_1 = ClientSearch.FileSearchContext( location_context = location_context, predicates = [ size_pred ] )
+self._file_search_context_2 = ClientSearch.FileSearchContext( location_context = location_context, predicates = [ png_pred ] )
 
 self._import_and_find_dupes()

@@ -344,6 +344,25 @@ class TestSingleFileMetadataImporters( unittest.TestCase ):
 self.assertEqual( set( result ), set( rows ) )
 
+# diff separator
+
+separator = ', '
+
+expected_input_path = actual_file_path + '.txt'
+
+with open( expected_input_path, 'w', encoding = 'utf-8' ) as f:
+
+    f.write( separator.join( rows ) )
+
+importer = ClientMetadataMigrationImporters.SingleFileMetadataImporterTXT( separator = separator )
+
+result = importer.Import( actual_file_path )
+
+os.unlink( expected_input_path )
+
+self.assertEqual( set( result ), set( rows ) )
+
 # with suffix and processing
 
 string_processor = ClientStrings.StringProcessor()

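Underneath the new `separator` option, the sidecar round trip is just a join on export and a split on import. A sketch of the string handling involved (these helper names are illustrative, not the real importer/exporter API):

```python
def serialise_rows( rows, separator = '\n' ):
    # exporter side: join the metadata rows with the chosen separator
    return separator.join( rows )

def deserialise_rows( text, separator = '\n' ):
    # importer side: split on the separator, dropping empty entries
    return [ row for row in text.split( separator ) if row != '' ]

rows = [ 'blue eyes', 'blonde hair', 'skirt' ]

# the default newline behaviour and a custom ', ' separator both round-trip
assert deserialise_rows( serialise_rows( rows ) ) == rows
assert deserialise_rows( serialise_rows( rows, ', ' ), ', ' ) == rows
```

One caveat this makes visible: rows that themselves contain the separator string will not round-trip, so the separator should be chosen to never appear in the data.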
@@ -623,17 +642,35 @@ class TestSingleFileMetadataExporters( unittest.TestCase ):
 self.assertEqual( set( rows ), set( HydrusText.DeserialiseNewlinedTexts( text ) ) )
 
+# diff separator
+
+separator = ', '
+
+exporter = ClientMetadataMigrationExporters.SingleFileMetadataExporterTXT( suffix = 'tags', separator = separator )
+
+exporter.Export( actual_file_path, rows )
+
+expected_output_path = actual_file_path + '.tags.txt'
+
+self.assertTrue( os.path.exists( expected_output_path ) )
+
+with open( expected_output_path, 'r', encoding = 'utf-8' ) as f:
+
+    text = f.read()
+
+os.unlink( expected_output_path )
+
+self.assertEqual( set( rows ), set( text.split( separator ) ) )
+
 # with filename remove ext and string conversion
 
 expected_output_path = os.path.join( HG.test_controller.db_dir, 'file.jpg'[1:].rsplit( '.', 1 )[0] ) + '.txt'
 
 with open( expected_output_path, 'w', encoding = 'utf-8' ) as f:
 
     f.write( os.linesep.join( rows ) )
 
 exporter = ClientMetadataMigrationExporters.SingleFileMetadataExporterTXT( remove_actual_filename_ext = True, filename_string_converter = ClientStrings.StringConverter( conversions = [ ( ClientStrings.STRING_CONVERSION_REMOVE_TEXT_FROM_BEGINNING, 1 ) ] ) )
 
 exporter.Export( actual_file_path, rows )
 
 with open( expected_output_path, 'r', encoding = 'utf-8' ) as f:
 
     text = f.read()

@@ -944,6 +944,11 @@ class Controller( object ):
 return False
 
+def RefreshPage( self, page_key ):
+
+    self.Write( 'refresh_page', page_key )
+
 def ShowPage( self, page_key ):
 
     self.Write( 'show_page', page_key )

@@ -294,6 +294,8 @@ class TestSerialisables( unittest.TestCase ):
 def assertSCUEqual( one, two ):
 
+    self.maxDiff = None
+
     self.assertEqual( TC.ConvertServiceKeysToContentUpdatesToComparable( one ), TC.ConvertServiceKeysToContentUpdatesToComparable( two ) )

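The `self.maxDiff = None` added above is a standard `unittest.TestCase` attribute: set to `None`, it stops unittest from truncating long failure diffs, which matters when `assertEqual` compares large content-update structures. A self-contained example of the mechanism:

```python
import unittest

class Example( unittest.TestCase ):

    def test_big_compare( self ):

        # None disables the default diff-length cap, so a failing
        # assertEqual on a large structure would print the whole diff
        self.maxDiff = None

        self.assertEqual( list( range( 100 ) ), list( range( 100 ) ) )

result = unittest.TextTestRunner( verbosity = 0 ).run(
    unittest.TestLoader().loadTestsFromTestCase( Example )
)

print( result.wasSuccessful() )
```

Without it, a mismatch in a hundred-element comparison reports only a truncated diff plus a hint to set `maxDiff`.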
(15 image files removed in this diff, 1.7–2.6 KiB each; image previews omitted)