Version 354

This commit is contained in:
Hydrus Network Developer 2019-05-29 16:34:43 -05:00
parent 4dcdc1e324
commit e556dbadf5
30 changed files with 1462 additions and 502 deletions


@ -8,6 +8,52 @@
<div class="content">
<h3>changelog</h3>
<ul>
<li><h3>version 354</h3></li>
<ul>
<li>duplicates important:</li>
<li>duplicates 'false positive' and 'alternates' pairs are now stored in a new more efficient structure that is better suited for larger groups of files</li>
<li>alternate relationships are now implicitly transitive--if A is alternate B and A is alternate C, B is now alternate C</li>
<li>false positive relationships remain correctly non-transitive, but they are now implicitly shared amongst alternates--if A is alternate B and A is false positive with C, B is now false positive with C. and further, if C is alternate D, then A and B are implicitly false positive with D as well!</li>
<li>your existing false positive and alternates relationships will be migrated on update. alternates will apply first, so in the case of conflicts due to previous non-excellent filtering workflow, formerly invalid false positives (i.e. false positives between now-transitive alternates) will be discarded. invalid potentials will also be cleared out</li>
<li>attempting to set a 'false positives' or 'alternates' relationship to files that already have a conflicting relation (e.g. setting false positive to two files that already have alternates) now does nothing. in future, this will have graceful failure reporting</li>
<li>the false positive and alternate transitivity clears out potential dupes at a faster rate than previously, speeding up duplicate filter workflow and reducing redundancy on the human end</li>
<li>unfortunately, as potential and better/worse/same pairs have yet to be updated, the system may report that a file's same-quality partner is also its alternate. this will be automatically corrected in the coming weeks</li>
<li>when selecting 'view this file's duplicates' from thumbnail right-click, the focus file will now be the first file displayed in the next page</li>
<li>.</li>
<li>duplicates boring details:</li>
<li>setting 'false positive' and 'alternates' status now accounts for the new data storage, and a variety of follow-on assumptions and transitive properties (such as implying other false positive relationships or clearing out potential dupes between two groups of merging alternates) are now dealt with more rigorously (and more so when I move the true 'duplicate' file relationships over)</li>
<li>fetching file duplicate status counts, file duplicate status hashes, and searching for system:num_dupes now accounts for the new data storage regarding false positives and alternates</li>
<li>new potential dupes are culled when they conflict with the new transitive alternate and false positive relationships</li>
<li>removed the code that fudges explicit transitive 'false positive' and 'alternate' relationships based on existing same/better/worse pairs when setting new dupe pairs. this temporary gap will be filled back in in the coming weeks (clearing out way more potentials too)</li>
<li>several specific advanced duplicate actions are now cleared out to make way for future streamlining of the filter workflow:</li>
<li>removed the 'duplicate_media_set_false_positive' shortcut, which is an action only appropriate when viewing confirmed potentials through the duplicate filter (or after the 'show random pairs' button)</li>
<li>removed the 'duplicate_media_remove_relationships' shortcut and menu action ('remove x pairs ... from the dupes system'), which will return as multiple more precise and reliable 'dissolve' actions in the coming weeks</li>
<li>removed the 'duplicate_media_reset_to_potential' shortcut and menu action ('send the x pairs ... to be compared in the duplicates filter') as it was always buggy and led to bloating of the filter queue. it is likely to return as part of the 'dissolve'-style reset commands as above</li>
<li>fixed an issue where hitting 'duplicate_media_set_focused_better' shortcut with no focused thumb would throw an error</li>
<li>started proper unit tests for the duplicates system and filled in the phash search, basic current better/worse, and false positive and alternate components</li>
<li>various incidences of duplicate 'action options' and similar phrasing are now unified to 'metadata merge options'</li>
<li>cleaned up 'unknown/potential' phrasing in duplicate pair code and some related duplicate filter code</li>
<li>cleaned up wording and layout of the thumbnail duplicates menu</li>
<li>.</li>
<li>the rest:</li>
<li>tag blacklists in downloaders' tag import options now apply to the parsed tags both before and after a tag sibling collapse. it uses the combined tag sibling rules, so feedback on how well this works irl would be appreciated</li>
<li>I believe I fixed the annoying issue where a handful of thumbnails would sometimes inexplicably not fade in during thumbgrid scrolling (typically on first thumb load--this problem was aggravated by the scroll/thumb-render speed ratio)</li>
<li>when to-be-regenerated thumbnails are taken off the thumbnail waterfall queue due to fast scrolling or page switching, they are now queued up in the new file maintenance system for idle-time work!</li>
<li>the main gui menus will now no longer try to update while they are open! uploading pending tags while lots of new tags are coming in is now much more reliable. let me know if you discover a way to get stuck in this frozen state!</li>
<li>cleaned up some main gui menu regeneration code, reducing the total number of stub objects created and deleted, particularly when the 'pending' menu refreshes its label frequently while uploading many pending tags. should be a bit more stable for some linux flavours</li>
<li>the 'fix siblings and parents' button on manage tags is now a menu button with two options--for fixing according to the 'all services combined' siblings and parents or just for the current panel's service. this overrides the 'apply sibs/parents across all services' options. this will be revisited in future when more complicated sibling application rules are added</li>
<li>the 'hide and anchor mouse' check under 'options->media' is no longer windows-only, if you want to test it, and the previous touchscreen-detecting override (which unhid and unanchored on vigorous movement) is now optional, defaulting to off</li>
<li>greatly reduced typical and max repository pre-processing disk cache time and reworked stop calculations to ensure some work always gets done</li>
<li>fixed an issue with 'show some random dupes' thumbnails not hiding on manual trashing, if that option is set. 'show some random dupes' thumbnail panels will now inherit their file service from the current duplicate search domain</li>
<li>repository processing will now never run for more than an hour at once. this mitigates some edge-case disastrous ui-hanging outcomes and generally gives a chance for hydrus-level jobs like subscriptions and even other programs like defraggers to run even when there is a gigantic backlog of processing to do</li>
<li>added yet another CORS header to improve Client API CORS compatibility, and fixed an overauthentication problem</li>
<li>setting a blank string on the new local booru external port override option will now forego the host:port colon in the resultant external url. a tooltip on the control repeats this</li>
<li>reworded and coloured the pause/play sync button in review services repository panel to be more clear about current paused status</li>
<li>fixed a problem when closing the gui when the popup message manager is already closed by clever OS-specific means</li>
<li>misc code cleanup</li>
<li>updated sqlite on windows to 3.28.0</li>
<li>updated upnpc exe on windows to 2.1</li>
</ul>
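The new alternates/false-positive storage described above behaves like a union-find of file groups. This is a hypothetical sketch of the transitivity rules, not hydrus's actual database schema--`AlternateGroups` and its members are stand-in names:

```python
# Hypothetical sketch (not hydrus's real schema): alternates merge files
# into groups, and false positives are stored between group roots, so
# every member of a group shares them implicitly.

class AlternateGroups:

    def __init__(self):
        self._parent = {}  # file -> parent file (union-find forest)
        self._false_positive_pairs = set()  # frozensets of two group roots

    def _root(self, f):
        self._parent.setdefault(f, f)
        while self._parent[f] != f:
            self._parent[f] = self._parent[self._parent[f]]  # path halving
            f = self._parent[f]
        return f

    def set_alternates(self, a, b):
        (ra, rb) = (self._root(a), self._root(b))
        if frozenset((ra, rb)) in self._false_positive_pairs:
            return False  # conflicting relationship already exists--do nothing
        if ra != rb:
            self._parent[rb] = ra
            # rewrite rb's false positive pairs to point at the merged root
            self._false_positive_pairs = {
                frozenset(ra if r == rb else r for r in pair)
                for pair in self._false_positive_pairs
            }
        return True

    def set_false_positive(self, a, b):
        (ra, rb) = (self._root(a), self._root(b))
        if ra == rb:
            return False  # already alternates--conflict, do nothing
        self._false_positive_pairs.add(frozenset((ra, rb)))
        return True

    def are_alternates(self, a, b):
        return self._root(a) == self._root(b)

    def are_false_positive(self, a, b):
        return frozenset((self._root(a), self._root(b))) in self._false_positive_pairs
```

With this structure, 'A alt B, A alt C' makes B and C alternates, and a false positive between A and D is automatically shared by B and C, matching the behaviour listed above.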
<li><h3>version 353</h3></li>
<ul>
<li>duplicate filter:</li>


@ -1,23 +0,0 @@
<html>
<head>
<title>message depots</title>
<link href="hydrus.ico" rel="shortcut icon" />
<link href="style.css" rel="stylesheet" type="text/css" />
</head>
<body>
<div class="content">
<h3>how the hydrus network sends messages</h3>
<p>Message depots mix hydrus's standard access key authentication with cryptographic principles to store clients' messages privately. They work a little like repositories, except anyone can upload data, and they do so anonymously.</p>
<p>All a message depot knows about its users are their public keys and which encrypted messages are for them. It does not know their private keys, and cannot decrypt the messages it stores.</p>
<p>I have made the encryption work as best I can, but it is a very difficult problem to get cryptography 100% correct. I use AES-256 and RSA-2048 with a simple mostly-random-byte padding scheme and OAEP respectively, along with python's os.urandom() for the PRNG. I am fairly certain I have made no major errors, but I cannot guarantee that a dedicated and well-financed attacker cannot defeat it. Please feel free to check my source code (HydrusMessageHandling.py) if you are so interested. If you would like to know more about cryptography, go check out <a href="http://en.wikipedia.org/wiki/Cryptography" title="See you in four hours!">wikipedia</a>. Hydrus uses both public-key and symmetric-key cryptography.</p>
<p>Contact keys are just sha256( PEM( public_key ) ).</p>
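The contact key derivation above is simple enough to sketch directly--the SHA-256 digest of the public key's PEM text. The PEM string here is a truncated stand-in, not a real key:

```python
# A contact key is just sha256( PEM( public_key ) ), as stated above.
# The PEM string below is an illustrative stand-in, not a real key.
import hashlib

def contact_key(pem_public_key: str) -> bytes:
    return hashlib.sha256(pem_public_key.encode('utf-8')).digest()

pem = '-----BEGIN PUBLIC KEY-----\nMIIBIjANBg...\n-----END PUBLIC KEY-----\n'
key = contact_key(pem)  # 32 bytes, i.e. 256 bits
```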
<p>I plan to extend this service to better guarantee anonymity; at the moment, it is trivial for someone to alter their server's source code to record IP addresses, so I will test onion-routing algorithms or similar in future.</p>
<p>Here are some diagrams:</p>
<p><img src="message_sync_1.png" /></p>
<p><img src="message_sync_2.png" /></p>
<p><img src="message_sync_3.png" /></p>
<p><img src="message_sync_4.png" /></p>
<p>There is a little more to it (applying statuses to a message), but I have only half-implemented it. I shall flesh out the description once it is done.</p>
</div>
</body>
</html>


@ -6,7 +6,7 @@
</head>
<body>
<div class="content">
<p class="warning">This help is slightly out of date--you can now refine your current duplicate 'search domain' with regular file searches. I will update this help in the coming weeks to reflect final changes.</p>
<p class="warning"><b>This is currently out of date! The duplicates system is being reworked right now at the database level to better support large groups of duplicates. The UI and workflow are simultaneously being streamlined. The concepts behind this help remain valid, but it will be properly updated once the work is complete to reflect the changes.</b></p>
<h3>duplicates</h3>
<p>As files are shared on the internet, they are often resized, cropped, converted to a different format, subsequently altered by the original or a new artist, or turned into a template and reinterpreted over and over and over. Even if you have a very restrictive importing workflow, your client is almost certainly going to get some <i>duplicates</i>. Some will be interesting alternate versions that you want to keep, and others will be thumbnails and other low-quality garbage you accidentally imported and would rather delete. Along the way, it would be nice to harmonise your ratings and tags to the better files so you don't lose any work.</p>
<p>Finding and processing duplicates within a large collection is impossible to do by hand, so I have written a system to do the heavy lifting for you. It is all on--</p>
@ -26,7 +26,7 @@
<h3>discovery</h3>
<p>Once the database is ready to search, you actually have to do it! You can set a 'search distance', which represents how 'fuzzy' or imprecise a match the database will consider a duplicate. I recommend you start with 'exact match', which looks for files that are as similar as it can understand. The smaller the search distance, the faster and better and fewer the results will be. I do not recommend you go above 8--the 'speculative' option--as you will be inundated with false positives.</p>
<p>Like the preparation step, this is very CPU intensive and will lock your db. Either leave it alone while it works or let the client handle everything automatically during idle time.</p>
<p>If you are interested, the current version of this system uses a <i>phash</i> (a 64-bit binary string 'perceptual hash' based on whether the values of an 8x8 DCT of a 32x32 greyscale version of the image are above or below the average value) to represent the image shape and a VPTree to search different files' phashes' relative <a href="https://en.wikipedia.org/wiki/Hamming_distance">hamming distance</a>. I expect to extend it in future with multiple phash generation (flips, rotations, crops on interesting parts of the image) and most-common colour comparisons.</p>
<p>If you are interested, the current version of this system uses a <a href="https://jenssegers.com/61/perceptual-image-hashes">phash</a> to represent the image shape and a <a href="https://en.wikipedia.org/wiki/VP-tree">VPTree</a> to search different files' phashes' relative <a href="https://en.wikipedia.org/wiki/Hamming_distance">hamming distance</a>. I expect to extend it in future with multiple phash generation (flips, rotations, and 'interesting' image crops and video frames) and most-common colour comparisons.</p>
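The comparison step above reduces to hamming distance between 64-bit phashes. A minimal sketch, using a linear scan as a stand-in for the VPTree (same answers, just slower):

```python
# Phashes are 64-bit ints; similarity is the hamming distance between
# them. A VPTree accelerates the search; a plain linear scan shows the idea.

def hamming_distance(phash_a: int, phash_b: int) -> int:
    # count the bit positions where the two hashes differ
    return bin(phash_a ^ phash_b).count('1')

def find_potentials(query: int, phashes, max_distance: int):
    # 'exact match' corresponds to max_distance == 0
    return [p for p in phashes if hamming_distance(query, p) <= max_distance]
```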
</li>
<li>
<h3>processing</h3>
@ -110,7 +110,7 @@
<p><a href="dupe_alternates_progress.png"><img src="dupe_alternates_progress.png" /></a></p>
<p>And a costume change:</p>
<p><a href="dupe_alternates_costume.png"><img src="dupe_alternates_costume.png" /></a></p>
<p>None of these are exact duplicates, but they are obviously related. The duplicate search will notice they are similar, so we should let it know they are 'alternate'.</p>
<p>None of these are strictly duplicates, but they are obviously related. The duplicate search will notice they are similar, so we should let it know they are 'alternate'.</p>
<p>Here's a subtler case:</p>
<p><a href="dupe_alternate_boxer_a.jpg"><img src="dupe_alternate_boxer_a.jpg" /></a> <a href="dupe_alternate_boxer_b.jpg"><img src="dupe_alternate_boxer_b.jpg" /></a></p>
<p>These two files are very similar, but try opening both in separate tabs and then flicking back and forth: the second has the glove-string further into the mouth, improved chin shading, a more refined eye shape, and shaved pubic hair. It is simple to spot these differences in the client's duplicate filter when you flick back and forth.</p>
@ -123,13 +123,13 @@
<p><b>The default action here is to do nothing but record the alternate status. A future version of the client will support revisiting the large unsorted archive you build here and adding file relationship metadata, but creating that will be a complicated job that was not in the scope of this initial duplicate management system.</b></p>
</li>
<li>
<h3>not duplicates</h3>
<p>The duplicate finder sometimes has false positives, so this status is to tell the client that the potential pair are actually not duplicates of any kind. This usually happens when two images have a similar shape by accident.</p>
<h3>not related/false positive</h3>
<p>The duplicate finder sometimes has false positives, so this status is to tell the client that the potential pair are not related in any way. This usually happens when two images have a similar shape by accident.</p>
<p>Here are two such files:</p>
<p><a href="dupe_not_dupes_1.png"><img style="max-width: 100%;" src="dupe_not_dupes_1.png" /></a></p>
<p><a href="dupe_not_dupes_2.jpg"><img style="max-width: 100%;" src="dupe_not_dupes_2.jpg" /></a></p>
<p>Despite their similarity, they are neither duplicates nor of even the same topic. The only commonality is the medium. I would not consider them close enough to be alternates--just adding something like 'screenshot' and 'imageboard' as tags to both is probably the closest connection they have.</p>
<p>The default action here is obviously to do nothing but record the status and move on.</p>
<p>The default action here is obviously to do nothing but record the status and move on. Recording the 'false positive' relationship is important to make sure the comparison does not come up again.</p>
<p>The incidence of false positives increases as you broaden the search distance--the less precise your search, the less likely it is to be correct. At distance 14, these files all match, but uselessly:</p>
<p><a href="dupe_garbage.png"><img style="max-width: 100%;" src="dupe_garbage.png" /></a></p>
</li>


@ -1595,17 +1595,20 @@ class TagParentsManager( object ):
# first collapse siblings
sibling_manager = self._controller.tag_siblings_manager
siblings_manager = self._controller.tag_siblings_manager
collapsed_service_keys_to_statuses_to_pairs = collections.defaultdict( HydrusData.default_dict_set )
for ( service_key, statuses_to_pairs ) in list(service_keys_to_statuses_to_pairs.items()):
for ( service_key, statuses_to_pairs ) in service_keys_to_statuses_to_pairs.items():
if service_key == CC.COMBINED_TAG_SERVICE_KEY: continue
for ( status, pairs ) in list(statuses_to_pairs.items()):
if service_key == CC.COMBINED_TAG_SERVICE_KEY:
pairs = sibling_manager.CollapsePairs( service_key, pairs )
continue
for ( status, pairs ) in statuses_to_pairs.items():
pairs = siblings_manager.CollapsePairs( service_key, pairs )
collapsed_service_keys_to_statuses_to_pairs[ service_key ][ status ] = pairs
@ -1640,9 +1643,9 @@ class TagParentsManager( object ):
self._service_keys_to_children_to_parents = BuildServiceKeysToChildrenToParents( service_keys_to_simple_children_to_parents )
def ExpandPredicates( self, service_key, predicates ):
def ExpandPredicates( self, service_key, predicates, service_strict = False ):
if self._controller.new_options.GetBoolean( 'apply_all_parents_to_all_services' ):
if not service_strict and self._controller.new_options.GetBoolean( 'apply_all_parents_to_all_services' ):
service_key = CC.COMBINED_TAG_SERVICE_KEY
@ -1674,9 +1677,9 @@ class TagParentsManager( object ):
def ExpandTags( self, service_key, tags ):
def ExpandTags( self, service_key, tags, service_strict = False ):
if self._controller.new_options.GetBoolean( 'apply_all_parents_to_all_services' ):
if not service_strict and self._controller.new_options.GetBoolean( 'apply_all_parents_to_all_services' ):
service_key = CC.COMBINED_TAG_SERVICE_KEY
@ -1694,9 +1697,9 @@ class TagParentsManager( object ):
def GetParents( self, service_key, tag ):
def GetParents( self, service_key, tag, service_strict = False ):
if self._controller.new_options.GetBoolean( 'apply_all_parents_to_all_services' ):
if not service_strict and self._controller.new_options.GetBoolean( 'apply_all_parents_to_all_services' ):
service_key = CC.COMBINED_TAG_SERVICE_KEY
@ -1815,9 +1818,9 @@ class TagSiblingsManager( object ):
self._controller.pub( 'new_siblings_gui' )
def CollapsePredicates( self, service_key, predicates ):
def CollapsePredicates( self, service_key, predicates, service_strict = False ):
if self._controller.new_options.GetBoolean( 'apply_all_siblings_to_all_services' ):
if not service_strict and self._controller.new_options.GetBoolean( 'apply_all_siblings_to_all_services' ):
service_key = CC.COMBINED_TAG_SERVICE_KEY
@ -1872,9 +1875,9 @@ class TagSiblingsManager( object ):
def CollapsePairs( self, service_key, pairs ):
def CollapsePairs( self, service_key, pairs, service_strict = False ):
if self._controller.new_options.GetBoolean( 'apply_all_siblings_to_all_services' ):
if not service_strict and self._controller.new_options.GetBoolean( 'apply_all_siblings_to_all_services' ):
service_key = CC.COMBINED_TAG_SERVICE_KEY
@ -1904,9 +1907,9 @@ class TagSiblingsManager( object ):
def CollapseStatusesToTags( self, service_key, statuses_to_tags ):
def CollapseStatusesToTags( self, service_key, statuses_to_tags, service_strict = False ):
if self._controller.new_options.GetBoolean( 'apply_all_siblings_to_all_services' ):
if not service_strict and self._controller.new_options.GetBoolean( 'apply_all_siblings_to_all_services' ):
service_key = CC.COMBINED_TAG_SERVICE_KEY
@ -1926,9 +1929,9 @@ class TagSiblingsManager( object ):
def CollapseTag( self, service_key, tag ):
def CollapseTag( self, service_key, tag, service_strict = False ):
if self._controller.new_options.GetBoolean( 'apply_all_siblings_to_all_services' ):
if not service_strict and self._controller.new_options.GetBoolean( 'apply_all_siblings_to_all_services' ):
service_key = CC.COMBINED_TAG_SERVICE_KEY
@ -1948,9 +1951,9 @@ class TagSiblingsManager( object ):
def CollapseTags( self, service_key, tags ):
def CollapseTags( self, service_key, tags, service_strict = False ):
if self._controller.new_options.GetBoolean( 'apply_all_siblings_to_all_services' ):
if not service_strict and self._controller.new_options.GetBoolean( 'apply_all_siblings_to_all_services' ):
service_key = CC.COMBINED_TAG_SERVICE_KEY
@ -1961,9 +1964,9 @@ class TagSiblingsManager( object ):
def CollapseTagsToCount( self, service_key, tags_to_count ):
def CollapseTagsToCount( self, service_key, tags_to_count, service_strict = False ):
if self._controller.new_options.GetBoolean( 'apply_all_siblings_to_all_services' ):
if not service_strict and self._controller.new_options.GetBoolean( 'apply_all_siblings_to_all_services' ):
service_key = CC.COMBINED_TAG_SERVICE_KEY
@ -1988,9 +1991,9 @@ class TagSiblingsManager( object ):
def GetSibling( self, service_key, tag ):
def GetSibling( self, service_key, tag, service_strict = False ):
if self._controller.new_options.GetBoolean( 'apply_all_siblings_to_all_services' ):
if not service_strict and self._controller.new_options.GetBoolean( 'apply_all_siblings_to_all_services' ):
service_key = CC.COMBINED_TAG_SERVICE_KEY
@ -2010,9 +2013,9 @@ class TagSiblingsManager( object ):
def GetAllSiblings( self, service_key, tag ):
def GetAllSiblings( self, service_key, tag, service_strict = False ):
if self._controller.new_options.GetBoolean( 'apply_all_siblings_to_all_services' ):
if not service_strict and self._controller.new_options.GetBoolean( 'apply_all_siblings_to_all_services' ):
service_key = CC.COMBINED_TAG_SERVICE_KEY
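The hunks above all repeat one pattern: each manager method grows an optional `service_strict` flag that bypasses the 'apply to all services' override. A minimal sketch of that pattern, with `COMBINED_KEY` and the options dict as stand-ins for hydrus's real objects:

```python
# Sketch of the repeated change: service_strict=True skips the
# 'apply_all_siblings_to_all_services' redirect to the combined service.
# COMBINED_KEY and the plain dicts are stand-ins, not hydrus's real API.

COMBINED_KEY = 'combined'

class SiblingsManagerSketch:

    def __init__(self, options, per_service_siblings):
        self._options = options  # e.g. {'apply_all_siblings_to_all_services': True}
        self._siblings = per_service_siblings  # service_key -> {tag: sibling}

    def get_sibling(self, service_key, tag, service_strict=False):
        if not service_strict and self._options['apply_all_siblings_to_all_services']:
            service_key = COMBINED_KEY
        return self._siblings.get(service_key, {}).get(tag)
```

This is what lets the new 'fix siblings and parents' menu button offer both the combined-services answer and a single-service answer from the same code path.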
@ -2412,7 +2415,16 @@ class ThumbnailCache( object ):
self._waterfall_queue_quick.difference_update( ( ( page_key, media ) for media in medias ) )
# don't cancel regen--that's useful and not time sensitive
cancelled_media_results = { media.GetMediaResult() for media in medias }
outstanding_delayed_hashes = { media_result.GetHash() for media_result in cancelled_media_results if media_result in self._delayed_regeneration_queue_quick }
if len( outstanding_delayed_hashes ) > 0:
self._controller.files_maintenance_manager.ScheduleJob( outstanding_delayed_hashes, ClientFiles.REGENERATE_FILE_DATA_JOB_FORCE_THUMBNAIL )
self._delayed_regeneration_queue_quick.difference_update( cancelled_media_results )
self._RecalcQueues()
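The hunk above takes cancelled entries off the waterfall but hands any outstanding thumbnail-regeneration work to the idle-time maintenance queue instead of dropping it. A stand-in sketch with sets in place of hydrus's media objects:

```python
# Stand-in sketch of the cancellation path: fast scrolling cancels
# waterfall entries, but thumbnails still awaiting regeneration are
# rescheduled as idle-time maintenance work rather than forgotten.

def cancel_waterfall(waterfall_queue, delayed_regen, schedule_idle_job, cancelled):
    waterfall_queue.difference_update(cancelled)
    outstanding = cancelled & delayed_regen
    if outstanding:
        schedule_idle_job(outstanding)  # e.g. force-thumbnail regen at idle
    delayed_regen.difference_update(cancelled)
```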


@ -349,7 +349,7 @@ SHORTCUTS_RESERVED_NAMES = [ 'archive_delete_filter', 'duplicate_filter', 'media
# shortcut commands
SHORTCUTS_MEDIA_ACTIONS = [ 'manage_file_tags', 'manage_file_ratings', 'manage_file_urls', 'manage_file_notes', 'archive_file', 'inbox_file', 'delete_file', 'export_files', 'export_files_quick_auto_export', 'remove_file_from_view', 'open_file_in_external_program', 'open_selection_in_new_page', 'launch_the_archive_delete_filter', 'copy_bmp', 'copy_file', 'copy_path', 'copy_sha256_hash', 'get_similar_to_exact', 'get_similar_to_very_similar', 'get_similar_to_similar', 'get_similar_to_speculative', 'duplicate_media_remove_relationships', 'duplicate_media_reset_to_potential', 'duplicate_media_set_alternate', 'duplicate_media_set_alternate_collections', 'duplicate_media_set_custom', 'duplicate_media_set_focused_better', 'duplicate_media_set_false_positive', 'duplicate_media_set_same_quality', 'open_known_url' ]
SHORTCUTS_MEDIA_ACTIONS = [ 'manage_file_tags', 'manage_file_ratings', 'manage_file_urls', 'manage_file_notes', 'archive_file', 'inbox_file', 'delete_file', 'export_files', 'export_files_quick_auto_export', 'remove_file_from_view', 'open_file_in_external_program', 'open_selection_in_new_page', 'launch_the_archive_delete_filter', 'copy_bmp', 'copy_file', 'copy_path', 'copy_sha256_hash', 'get_similar_to_exact', 'get_similar_to_very_similar', 'get_similar_to_similar', 'get_similar_to_speculative', 'duplicate_media_set_alternate', 'duplicate_media_set_alternate_collections', 'duplicate_media_set_custom', 'duplicate_media_set_focused_better', 'duplicate_media_set_same_quality', 'open_known_url' ]
SHORTCUTS_MEDIA_VIEWER_ACTIONS = [ 'move_animation_to_previous_frame', 'move_animation_to_next_frame', 'switch_between_fullscreen_borderless_and_regular_framed_window', 'pan_up', 'pan_down', 'pan_left', 'pan_right', 'zoom_in', 'zoom_out', 'switch_between_100_percent_and_canvas_zoom', 'flip_darkmode' ]
SHORTCUTS_MEDIA_VIEWER_BROWSER_ACTIONS = [ 'view_next', 'view_first', 'view_last', 'view_previous' ]
SHORTCUTS_MAIN_GUI_ACTIONS = [ 'refresh', 'new_page', 'new_page_of_pages', 'new_duplicate_filter_page', 'new_gallery_downloader_page', 'new_url_downloader_page', 'new_simple_downloader_page', 'new_watcher_downloader_page', 'synchronised_wait_switch', 'set_media_focus', 'show_hide_splitters', 'set_search_focus', 'unclose_page', 'close_page', 'redo', 'undo', 'flip_darkmode', 'check_all_import_folders', 'flip_debug_force_idle_mode_do_not_set_this' ]


@ -992,6 +992,16 @@ class Controller( HydrusController.HydrusController ):
def MenubarMenuIsOpen( self ):
self._menu_open = True
def MenubarMenuIsClosed( self ):
self._menu_open = False
def MenuIsOpen( self ):
return self._menu_open
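The new menubar-state tracking above is what lets the refresh loop defer while a menu is open (the 'menus no longer update while open' changelog item). A minimal sketch with stand-in names:

```python
# Sketch of the menu-open guard: the controller records menubar state,
# and the refresh loop defers while a menu is open. Names are stand-ins
# for the Controller/FrameGUI methods in the diffs.

class MenuStateSketch:

    def __init__(self):
        self._menu_open = False

    def menubar_menu_is_open(self):
        self._menu_open = True

    def menubar_menu_is_closed(self):
        self._menu_open = False

    def menu_is_open(self):
        return self._menu_open

def should_defer_refresh(db_busy, controller):
    # mirrors: if db_going_to_hang_if_we_hit_it or menu_open: retry later
    return db_busy or controller.menu_is_open()
```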
@ -1072,6 +1082,8 @@ class Controller( HydrusController.HydrusController ):
try:
time.sleep( 1 )
if HydrusNetworking.LocalPortInUse( port ):
text = 'The client\'s {} could not start because something was already bound to port {}.'.format( name, port )
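`HydrusNetworking.LocalPortInUse` is internal to hydrus, but the usual way to test whether a local port is already bound is to attempt a connection to it. A sketch under that assumption:

```python
# Assumed-equivalent sketch of a local port-in-use check (hydrus's own
# helper is internal): try connecting; success means something is bound.
import socket

def local_port_in_use(port: int) -> bool:
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(0.2)
        return s.connect_ex(('127.0.0.1', port)) == 0
```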

File diff suppressed because it is too large


@ -72,21 +72,21 @@ MENU_ORDER = [ 'file', 'undo', 'pages', 'database', 'pending', 'network', 'servi
def THREADUploadPending( service_key ):
service = HG.client_controller.services_manager.GetService( service_key )
service_name = service.GetName()
service_type = service.GetServiceType()
nums_pending = HG.client_controller.Read( 'nums_pending' )
info = nums_pending[ service_key ]
initial_num_pending = sum( info.values() )
result = HG.client_controller.Read( 'pending', service_key )
try:
service = HG.client_controller.services_manager.GetService( service_key )
service_name = service.GetName()
service_type = service.GetServiceType()
nums_pending = HG.client_controller.Read( 'nums_pending' )
info = nums_pending[ service_key ]
initial_num_pending = sum( info.values() )
result = HG.client_controller.Read( 'pending', service_key )
job_key = ClientThreading.JobKey( pausable = True, cancellable = True )
job_key.SetVariable( 'popup_title', 'uploading pending to ' + service_name )
@ -204,6 +204,14 @@ def THREADUploadPending( service_key ):
result = HG.client_controller.Read( 'pending', service_key )
job_key.DeleteVariable( 'popup_gauge_1' )
job_key.SetVariable( 'popup_text_1', 'upload done!' )
HydrusData.Print( job_key.ToString() )
job_key.Finish()
job_key.Delete( 5 )
except Exception as e:
r = re.search( '[a-fA-F0-9]{64}', str( e ) )
@ -223,17 +231,12 @@ def THREADUploadPending( service_key ):
raise
finally:
HG.currently_uploading_pending = False
HG.client_controller.pub( 'notify_new_pending' )
job_key.DeleteVariable( 'popup_gauge_1' )
job_key.SetVariable( 'popup_text_1', 'upload done!' )
HydrusData.Print( job_key.ToString() )
job_key.Finish()
job_key.Delete( 5 )
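The restructure above moves the whole upload body inside the try block, so the global 'currently uploading' flag is always cleared and listeners notified even if the upload raises. The shape of that guard, with stand-in names:

```python
# Sketch of the try/finally guard in THREADUploadPending: the flag and
# the 'notify_new_pending' publish now live in finally, so a failed
# upload can no longer leave the pending menu stuck. Names are stand-ins.

def upload_pending(do_upload, state, notify):
    state['currently_uploading_pending'] = True
    try:
        do_upload()
    finally:
        state['currently_uploading_pending'] = False
        notify('notify_new_pending')
```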
class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
def __init__( self, controller ):
@ -1418,12 +1421,10 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
def _GenerateMenuInfo( self, name ):
menu = wx.Menu()
menu.hydrus_menubar_name = name
def file():
menu = wx.Menu()
ClientGUIMenus.AppendMenuItem( self, menu, 'import files', 'Add new files to the database.', self._ImportFiles )
ClientGUIMenus.AppendSeparator( menu )
@ -1518,7 +1519,7 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
ClientGUIMenus.AppendMenuItem( self, menu, 'exit', 'Shut the client down.', self.Exit )
return ( menu, '&file', True )
return ( menu, '&file' )
def undo():
@ -1533,7 +1534,7 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
if have_closed_pages or have_undo_stuff:
show = True
menu = wx.Menu()
if undo_string is not None:
@ -1576,14 +1577,16 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
else:
show = False
menu = None
return ( menu, '&undo', show )
return ( menu, '&undo' )
def pages():
menu = wx.Menu()
if self._controller.new_options.GetBoolean( 'advanced_mode' ):
( total_active_page_count, total_closed_page_count ) = self.GetTotalPageCounts()
@ -1788,11 +1791,13 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
#
return ( menu, '&pages', True )
return ( menu, '&pages' )
def database():
menu = wx.Menu()
ClientGUIMenus.AppendMenuItem( self, menu, 'set a password', 'Set a simple password for the database so only you can open it in the client.', self._SetPassword )
ClientGUIMenus.AppendSeparator( menu )
@ -1860,7 +1865,7 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
ClientGUIMenus.AppendMenu( menu, submenu, 'regenerate' )
return ( menu, '&database', True )
return ( menu, '&database' )
def pending():
@ -1869,7 +1874,11 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
total_num_pending = 0
for ( service_key, info ) in list(nums_pending.items()):
menu = None
can_do_a_menu = not HG.currently_uploading_pending
for ( service_key, info ) in nums_pending.items():
service = self._controller.services_manager.GetService( service_key )
@ -1903,7 +1912,12 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
num_petitioned = info[ HC.SERVICE_INFO_NUM_PETITIONED_FILES ]
if num_pending + num_petitioned > 0:
if can_do_a_menu and num_pending + num_petitioned > 0:
if menu is None:
menu = wx.Menu()
submenu = wx.Menu()
@ -1930,13 +1944,13 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
total_num_pending += num_pending + num_petitioned
show = total_num_pending > 0
return ( menu, '&pending (' + HydrusData.ToHumanInt( total_num_pending ) + ')', show )
return ( menu, '&pending (' + HydrusData.ToHumanInt( total_num_pending ) + ')' )
def network():
menu = wx.Menu()
submenu = wx.Menu()
pause_all_new_network_traffic = self._controller.new_options.GetBoolean( 'pause_all_new_network_traffic' )
@ -2036,11 +2050,13 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
#
return ( menu, '&network', True )
return ( menu, '&network' )
def services():
menu = wx.Menu()
tag_services = self._controller.services_manager.GetServices( ( HC.TAG_REPOSITORY, ) )
file_services = self._controller.services_manager.GetServices( ( HC.FILE_REPOSITORY, ) )
@ -2157,11 +2173,13 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
ClientGUIMenus.AppendMenuItem( self, menu, 'manage tag siblings', 'Set certain tags to be automatically replaced with other tags.', self._ManageTagSiblings )
ClientGUIMenus.AppendMenuItem( self, menu, 'manage tag parents', 'Set certain tags to be automatically added with other tags.', self._ManageTagParents )
return ( menu, '&services', True )
return ( menu, '&services' )
def help():
menu = wx.Menu()
ClientGUIMenus.AppendMenuItem( self, menu, 'help and getting started guide', 'Open hydrus\'s local help in your web browser.', ClientPaths.LaunchPathInWebBrowser, os.path.join( HC.HELP_DIR, 'index.html' ) )
links = wx.Menu()
@ -2284,7 +2302,7 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
ClientGUIMenus.AppendMenuItem( self, menu, 'hardcoded shortcuts', 'Review some currently hardcoded shortcuts.', wx.MessageBox, CC.SHORTCUT_HELP )
ClientGUIMenus.AppendMenuItem( self, menu, 'about', 'See this client\'s version and other information.', self._AboutWindow )
return ( menu, '&help', True )
return ( menu, '&help' )
if name == 'file': result = file()
@ -2297,23 +2315,21 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
elif name == 'help': result = help()
# hackery dackery doo
( menu, label, show ) = result
( menu_or_none, label ) = result
if show:
if menu_or_none is not None:
menu = menu_or_none
menu.hydrus_menubar_name = name
if HC.PLATFORM_OSX:
menu.SetTitle( label ) # causes bugs in os x if this is not here
else:
ClientGUIMenus.DestroyMenu( self, menu )
menu = None
return ( menu, label, show )
return ( menu_or_none, label )
def _GenerateNewAccounts( self, service_key ):
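The hunks above replace the menu builders' `( menu, label, show )` triple with a `( menu_or_none, label )` pair, using `None` to mean "do not show". A minimal sketch of that convention, with a hypothetical builder and a dict standing in for `wx.Menu`:

```python
# Hypothetical, simplified builder illustrating the refactor: returning
# ( menu_or_none, label ) instead of ( menu, label, show ). None replaces the
# old show = False flag, so there is no hidden menu object left to destroy.

def build_pending_menu( total_num_pending ):
    
    label = '&pending ({})'.format( total_num_pending )
    
    if total_num_pending > 0:
        
        menu = { 'name' : 'pending' } # stand-in for wx.Menu()
        
        return ( menu, label )
        
    
    return ( None, label )
```

Callers then test `menu_or_none is not None` rather than carrying a separate flag, which removes the `DestroyMenu` cleanup branch for hidden menus.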
@@ -2543,9 +2559,9 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
for name in MENU_ORDER:
( menu_or_none, label, show ) = self._GenerateMenuInfo( name )
( menu_or_none, label ) = self._GenerateMenuInfo( name )
if show:
if menu_or_none is not None:
menu = menu_or_none
@@ -4105,6 +4121,8 @@ The password is cleartext here but obscured in the entry dialog. Enter a blank p
return
HG.currently_uploading_pending = True
self._controller.CallToThread( THREADUploadPending, service_key )
@@ -4515,9 +4533,12 @@ The password is cleartext here but obscured in the entry dialog. Enter a blank p
self.DeleteAllClosedPages() # wx crashes if any are left in here, wew
self._message_manager.CleanBeforeDestroy()
self._message_manager.Hide()
if self._message_manager:
self._message_manager.CleanBeforeDestroy()
self._message_manager.Hide()
self._notebook.CleanBeforeDestroy()
@@ -4953,8 +4974,9 @@ The password is cleartext here but obscured in the entry dialog. Enter a blank p
db_going_to_hang_if_we_hit_it = HG.client_controller.DBCurrentlyDoingJob()
menu_open = HG.client_controller.MenuIsOpen()
if db_going_to_hang_if_we_hit_it:
if db_going_to_hang_if_we_hit_it or menu_open:
self._controller.CallLaterWXSafe( self, 0.5, self.RefreshMenu )
@@ -4965,13 +4987,13 @@ The password is cleartext here but obscured in the entry dialog. Enter a blank p
name = self._dirty_menus.pop()
( menu_or_none, label, show ) = self._GenerateMenuInfo( name )
( menu_or_none, label ) = self._GenerateMenuInfo( name )
old_menu_index = self._FindMenuBarIndex( name )
if old_menu_index == wx.NOT_FOUND:
if show:
if menu_or_none is not None:
menu = menu_or_none
@@ -4998,20 +5020,27 @@ The password is cleartext here but obscured in the entry dialog. Enter a blank p
else:
old_menu = self._menubar.GetMenu( old_menu_index )
if show:
if name == 'pending' and HG.currently_uploading_pending:
menu = menu_or_none
self._menubar.Replace( old_menu_index, menu, label )
self._menubar.SetMenuLabel( old_menu_index, label )
else:
self._menubar.Remove( old_menu_index )
old_menu = self._menubar.GetMenu( old_menu_index )
if menu_or_none is not None:
menu = menu_or_none
self._menubar.Replace( old_menu_index, menu, label )
else:
self._menubar.Remove( old_menu_index )
ClientGUIMenus.DestroyMenu( self, old_menu )
ClientGUIMenus.DestroyMenu( self, old_menu )
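The `RefreshMenu` hunk above gains a special case for the 'pending' menu while an upload is running. A sketch of the decision, with stand-in menubar methods for `wx.MenuBar`'s `Replace`/`Remove`/`SetMenuLabel`:

```python
# Simplified sketch of the menubar refresh decision: a regenerated menu
# replaces the old entry, a None menu removes it, and the 'pending' menu is
# left alive mid-upload (only its label is freshened) so it is not destroyed
# under the upload thread. FakeMenuBar is a test stand-in, not the real API.

class FakeMenuBar:
    
    def __init__( self ):
        
        self.calls = []
        
    
    def set_label( self, name, label ): self.calls.append( ( 'set_label', name, label ) )
    def replace( self, name, menu, label ): self.calls.append( ( 'replace', name, label ) )
    def remove( self, name ): self.calls.append( ( 'remove', name ) )
    

def refresh_menu_entry( menubar, name, menu_or_none, label, currently_uploading_pending ):
    
    if name == 'pending' and currently_uploading_pending:
        
        # leave the existing menu in place; just freshen its label
        menubar.set_label( name, label )
        
    elif menu_or_none is not None:
        
        menubar.replace( name, menu_or_none, label )
        
    else:
        
        menubar.remove( name )
```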

@@ -3006,6 +3006,8 @@ class CanvasWithHovers( CanvasWithDetails ):
if delta_distance > 0:
touchscreen_canvas_drags_unanchor = HG.client_controller.new_options.GetBoolean( 'touchscreen_canvas_drags_unanchor' )
if not self._current_drag_is_touch and delta_distance > 50:
# if user is able to generate such a large distance, they are almost certainly touching
@@ -3013,11 +3015,12 @@ class CanvasWithHovers( CanvasWithDetails ):
self._current_drag_is_touch = True
# touch events obviously don't mix with warping well. the touch just warps it back and again and we get a massive delta!
touch_anchor_override = touchscreen_canvas_drags_unanchor and self._current_drag_is_touch
anchor_and_hide_canvas_drags = HG.client_controller.new_options.GetBoolean( 'anchor_and_hide_canvas_drags' )
if HC.PLATFORM_WINDOWS and anchor_and_hide_canvas_drags and not self._current_drag_is_touch:
# touch events obviously don't mix with warping well. the touch just warps it back and again and we get a massive delta!
if anchor_and_hide_canvas_drags and not touch_anchor_override:
show_mouse = False
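The drag hunks above infer touchscreen input from an implausibly large single-event delta and, when 'touchscreen_canvas_drags_unanchor' is set, exempt touch drags from the hide-and-anchor behaviour. A simplified sketch of that logic (function name and signature are illustrative):

```python
# Illustrative reduction of the canvas drag logic changed above. Returns the
# updated touch flag and whether to hide-and-anchor the cursor this event.

def should_anchor_and_hide( delta_distance, current_drag_is_touch, anchor_and_hide_canvas_drags, touchscreen_canvas_drags_unanchor ):
    
    if delta_distance > 50:
        
        # a mouse drag cannot normally produce such a large single-event delta;
        # warping a touch 'cursor' back would just generate another huge delta
        current_drag_is_touch = True
        
    
    touch_anchor_override = touchscreen_canvas_drags_unanchor and current_drag_is_touch
    
    do_anchor = anchor_and_hide_canvas_drags and not touch_anchor_override
    
    return ( current_drag_is_touch, do_anchor )
```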
@@ -3173,7 +3176,7 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
for ( hash_pair, duplicate_type, first_media, second_media, service_keys_to_content_updates, was_auto_skipped ) in self._processed_pairs:
if duplicate_type == HC.DUPLICATE_UNKNOWN or was_auto_skipped:
if duplicate_type is None or was_auto_skipped:
continue # it was a 'skip' decision
@@ -3424,7 +3427,7 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
def _GetNumCommittableDecisions( self ):
return len( [ 1 for ( hash_pair, duplicate_type, first_media, second_media, service_keys_to_content_updates, was_auto_skipped ) in self._processed_pairs if duplicate_type != HC.DUPLICATE_UNKNOWN ] )
return len( [ 1 for ( hash_pair, duplicate_type, first_media, second_media, service_keys_to_content_updates, was_auto_skipped ) in self._processed_pairs if duplicate_type is not None ] )
def _GoBack( self ):
@@ -3622,7 +3625,7 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
was_auto_skipped = True
self._processed_pairs.append( ( potential_pair, HC.DUPLICATE_UNKNOWN, None, None, {}, was_auto_skipped ) )
self._processed_pairs.append( ( potential_pair, None, None, None, {}, was_auto_skipped ) )
if len( self._unprocessed_pairs ) == 0:
@@ -3693,7 +3696,7 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
was_auto_skipped = False
self._processed_pairs.append( ( self._current_pair, HC.DUPLICATE_UNKNOWN, None, None, {}, was_auto_skipped ) )
self._processed_pairs.append( ( self._current_pair, None, None, None, {}, was_auto_skipped ) )
self._ShowNewPair()

@@ -266,8 +266,8 @@ class FullscreenHoverFrameRightDuplicates( FullscreenHoverFrame ):
menu_items = []
menu_items.append( ( 'normal', 'edit duplicate action options for \'this is better\'', 'edit what content is merged when you filter files', HydrusData.Call( self._EditMergeOptions, HC.DUPLICATE_BETTER ) ) )
menu_items.append( ( 'normal', 'edit duplicate action options for \'same quality\'', 'edit what content is merged when you filter files', HydrusData.Call( self._EditMergeOptions, HC.DUPLICATE_SAME_QUALITY ) ) )
menu_items.append( ( 'normal', 'edit duplicate metadata merge options for \'this is better\'', 'edit what content is merged when you filter files', HydrusData.Call( self._EditMergeOptions, HC.DUPLICATE_BETTER ) ) )
menu_items.append( ( 'normal', 'edit duplicate metadata merge options for \'same quality\'', 'edit what content is merged when you filter files', HydrusData.Call( self._EditMergeOptions, HC.DUPLICATE_SAME_QUALITY ) ) )
menu_items.append( ( 'separator', None, None, None ) )
menu_items.append( ( 'normal', 'edit background lighten/darken switch intensity', 'edit how much the background will brighten or darken as you switch between the pair', self._EditBackgroundSwitchIntensity ) )

@@ -1047,10 +1047,10 @@ class ManagementPanelDuplicateFilter( ManagementPanel ):
menu_items = []
menu_items.append( ( 'normal', 'edit duplicate action options for \'this is better\'', 'edit what content is merged when you filter files', HydrusData.Call( self._EditMergeOptions, HC.DUPLICATE_BETTER ) ) )
menu_items.append( ( 'normal', 'edit duplicate action options for \'same quality\'', 'edit what content is merged when you filter files', HydrusData.Call( self._EditMergeOptions, HC.DUPLICATE_SAME_QUALITY ) ) )
menu_items.append( ( 'normal', 'edit duplicate metadata merge options for \'this is better\'', 'edit what content is merged when you filter files', HydrusData.Call( self._EditMergeOptions, HC.DUPLICATE_BETTER ) ) )
menu_items.append( ( 'normal', 'edit duplicate metadata merge options for \'same quality\'', 'edit what content is merged when you filter files', HydrusData.Call( self._EditMergeOptions, HC.DUPLICATE_SAME_QUALITY ) ) )
self._edit_merge_options = ClientGUICommon.MenuButton( self._main_right_panel, 'edit default duplicate action options', menu_items )
self._edit_merge_options = ClientGUICommon.MenuButton( self._main_right_panel, 'edit default duplicate metadata merge options', menu_items )
#
@@ -1066,7 +1066,7 @@ class ManagementPanelDuplicateFilter( ManagementPanel ):
self._both_files_match = wx.CheckBox( self._filtering_panel )
self._num_unknown_duplicates = ClientGUICommon.BetterStaticText( self._filtering_panel )
self._num_potential_duplicates = ClientGUICommon.BetterStaticText( self._filtering_panel )
self._refresh_dupe_counts_button = ClientGUICommon.BetterBitmapButton( self._filtering_panel, CC.GlobalBMPs.refresh, self._RefreshDuplicateCounts )
self._launch_filter = ClientGUICommon.BetterButton( self._filtering_panel, 'launch the filter', self._LaunchFilter )
@@ -1075,7 +1075,7 @@ class ManagementPanelDuplicateFilter( ManagementPanel ):
random_filtering_panel = ClientGUICommon.StaticBox( self._main_right_panel, 'quick and dirty processing' )
self._show_some_dupes = ClientGUICommon.BetterButton( random_filtering_panel, 'show some random potential pairs', self._ShowSomeDupes )
self._show_some_dupes = ClientGUICommon.BetterButton( random_filtering_panel, 'show some random potential pairs', self._ShowRandomPotentialDupes )
self._set_random_as_same_quality_button = ClientGUICommon.BetterButton( random_filtering_panel, 'set current media as duplicates of the same quality', self._SetCurrentMediaAs, HC.DUPLICATE_SAME_QUALITY )
self._set_random_as_alternates_button = ClientGUICommon.BetterButton( random_filtering_panel, 'set current media as all related alternates', self._SetCurrentMediaAs, HC.DUPLICATE_ALTERNATE )
@@ -1155,7 +1155,7 @@ class ManagementPanelDuplicateFilter( ManagementPanel ):
text_and_button_hbox = wx.BoxSizer( wx.HORIZONTAL )
text_and_button_hbox.Add( self._num_unknown_duplicates, CC.FLAGS_VCENTER_EXPAND_DEPTH_ONLY )
text_and_button_hbox.Add( self._num_potential_duplicates, CC.FLAGS_VCENTER_EXPAND_DEPTH_ONLY )
text_and_button_hbox.Add( self._refresh_dupe_counts_button, CC.FLAGS_VCENTER )
rows = []
@@ -1258,7 +1258,7 @@ class ManagementPanelDuplicateFilter( ManagementPanel ):
def _RefreshDuplicateCounts( self ):
def wx_code( unknown_duplicates_count ):
def wx_code( potential_duplicates_count ):
if not self:
@@ -1269,14 +1269,14 @@ class ManagementPanelDuplicateFilter( ManagementPanel ):
self._refresh_dupe_counts_button.Enable()
self._UpdateUnknownDuplicatesCount( unknown_duplicates_count )
self._UpdatePotentialDuplicatesCount( potential_duplicates_count )
def thread_do_it( file_search_context, both_files_match ):
unknown_duplicates_count = HG.client_controller.Read( 'unknown_duplicates_count', file_search_context, both_files_match )
potential_duplicates_count = HG.client_controller.Read( 'potential_duplicates_count', file_search_context, both_files_match )
wx.CallAfter( wx_code, unknown_duplicates_count )
wx.CallAfter( wx_code, potential_duplicates_count )
if not self._currently_refreshing_dupe_count_numbers:
@@ -1285,7 +1285,7 @@ class ManagementPanelDuplicateFilter( ManagementPanel ):
self._refresh_dupe_counts_button.Disable()
self._num_unknown_duplicates.SetLabelText( 'updating\u2026' )
self._num_potential_duplicates.SetLabelText( 'updating\u2026' )
( file_search_context, both_files_match ) = self._GetFileSearchContextAndBothFilesMatch()
@@ -1355,7 +1355,7 @@ class ManagementPanelDuplicateFilter( ManagementPanel ):
if dlg.ShowModal() == wx.ID_YES:
self._controller.Write( 'delete_unknown_duplicate_pairs' )
self._controller.Write( 'delete_potential_duplicate_pairs' )
self._RefreshMaintenanceStatus()
@@ -1402,7 +1402,7 @@ class ManagementPanelDuplicateFilter( ManagementPanel ):
self._RefreshDuplicateCounts()
self._ShowSomeDupes()
self._ShowRandomPotentialDupes()
@@ -1413,11 +1413,13 @@ class ManagementPanelDuplicateFilter( ManagementPanel ):
self._UpdateMaintenanceStatus()
def _ShowSomeDupes( self ):
def _ShowRandomPotentialDupes( self ):
( file_search_context, both_files_match ) = self._GetFileSearchContextAndBothFilesMatch()
hashes = self._controller.Read( 'random_unknown_duplicate_hashes', file_search_context, both_files_match )
file_service_key = file_search_context.GetFileServiceKey()
hashes = self._controller.Read( 'random_potential_duplicate_hashes', file_search_context, both_files_match )
if len( hashes ) == 0:
@@ -1435,7 +1437,7 @@ class ManagementPanelDuplicateFilter( ManagementPanel ):
return
panel = ClientGUIMedia.MediaPanelThumbnails( self._page, self._page_key, CC.COMBINED_LOCAL_FILE_SERVICE_KEY, media_results )
panel = ClientGUIMedia.MediaPanelThumbnails( self._page, self._page_key, file_service_key, media_results )
self._page.SwapMediaPanel( panel )
@@ -1552,11 +1554,11 @@ class ManagementPanelDuplicateFilter( ManagementPanel ):
def _UpdateUnknownDuplicatesCount( self, unknown_duplicates_count ):
def _UpdatePotentialDuplicatesCount( self, potential_duplicates_count ):
self._num_unknown_duplicates.SetLabelText( HydrusData.ToHumanInt( unknown_duplicates_count ) + ' potential pairs.' )
self._num_potential_duplicates.SetLabelText( HydrusData.ToHumanInt( potential_duplicates_count ) + ' potential pairs.' )
if unknown_duplicates_count > 0:
if potential_duplicates_count > 0:
self._show_some_dupes.Enable()
self._launch_filter.Enable()

@@ -1837,40 +1837,26 @@ class MediaPanel( ClientMedia.ListeningMediaList, wx.ScrolledCanvas ):
for collection in collections:
collection_pairs = list( itertools.combinations( collection.GetFlatMedia(), 2 ) )
media_group = collection.GetFlatMedia()
self._SetDuplicates( HC.DUPLICATE_ALTERNATE, media_pairs = collection_pairs, silent = True )
self._SetDuplicates( HC.DUPLICATE_ALTERNATE, media_group = media_group, silent = True )
def _SetDuplicates( self, duplicate_type, media_pairs = None, duplicate_action_options = None, silent = False ):
def _SetDuplicates( self, duplicate_type, media_pairs = None, media_group = None, duplicate_action_options = None, silent = False ):
yes_no_text = 'unknown duplicate action'
if duplicate_type is None or duplicate_type == HC.DUPLICATE_UNKNOWN:
if duplicate_type is None:
yes_no_text = 'completely delete all pair duplicate relationships'
elif duplicate_type == HC.DUPLICATE_UNKNOWN:
yes_no_text = 'set all pair duplicate relationships to unknown/potential'
duplicate_action_options = None
elif duplicate_action_options is None:
if duplicate_action_options is None:
yes_no_text = 'set all pair relationships to ' + HC.duplicate_type_string_lookup[ duplicate_type ]
if duplicate_type in [ HC.DUPLICATE_BETTER, HC.DUPLICATE_SAME_QUALITY ]:
yes_no_text += ' (with default duplicate action/merge options)'
yes_no_text += ' (with default duplicate metadata merge options)'
new_options = HG.client_controller.new_options
@@ -1883,21 +1869,52 @@ class MediaPanel( ClientMedia.ListeningMediaList, wx.ScrolledCanvas ):
else:
yes_no_text = 'set all pair relationships to ' + HC.duplicate_type_string_lookup[ duplicate_type ] + ' (with custom duplicate action/merge options)'
yes_no_text = 'set all pair relationships to ' + HC.duplicate_type_string_lookup[ duplicate_type ] + ' (with custom duplicate metadata merge options)'
file_deletion_reason = 'Deleted from duplicate action on Media Page ({}).'.format( yes_no_text )
if media_pairs is None:
flat_media = self._GetSelectedFlatMedia()
if media_group is None:
flat_media = self._GetSelectedFlatMedia()
else:
flat_media = ClientMedia.FlattenMedia( media_group )
if len( flat_media ) < 2:
return False
media_pairs = list( itertools.combinations( flat_media, 2 ) )
first_media = flat_media[0]
if duplicate_type == HC.DUPLICATE_FALSE_POSITIVE:
media_pairs = list( itertools.combinations( flat_media, 2 ) )
if len( media_pairs ) > 100 and not silent:
message = 'False positive records are complicated, and setting that relationship for many files at once is likely a mistake.'
message += os.linesep * 2
message += 'Are you sure all of these files are all potential duplicates and that they are all false positive matches with each other? If not, I recommend you step back for now.'
with ClientGUIDialogs.DialogYesNo( self, message, yes_label = 'I know what I am doing', no_label = 'step back for now' ) as dlg:
if dlg.ShowModal() != wx.ID_YES:
return False
else:
media_pairs = [ ( first_media, other_media ) for other_media in flat_media if other_media != first_media ]
if len( media_pairs ) == 0:
@@ -1905,21 +1922,6 @@ class MediaPanel( ClientMedia.ListeningMediaList, wx.ScrolledCanvas ):
return False
if len( media_pairs ) > 100 and not silent:
message = 'The duplicate system does not yet work well for large groups of duplicates. This is about to ask if you want to apply a dupe status for more than 100 pairs.'
message += os.linesep * 2
message += 'Unless you are testing the system or have another good reason to try this, I recommend you step back for now.'
with ClientGUIDialogs.DialogYesNo( self, message, yes_label = 'I know what I am doing', no_label = 'step back for now' ) as dlg:
if dlg.ShowModal() != wx.ID_YES:
return False
if silent:
do_it = True
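The `_SetDuplicates` hunks above change how pairs are generated: false positives still record every pairwise combination, while other relationships now pair the first file against the rest, which the new transitive alternates structure makes sufficient. A sketch of the difference (function name is illustrative):

```python
# Illustrative pair generation: n * ( n - 1 ) / 2 combinations for the
# non-transitive false positive status, n - 1 'star' pairs otherwise.
import itertools

def generate_media_pairs( flat_media, is_false_positive ):
    
    if is_false_positive:
        
        # false positives are not transitive, so every pair must be recorded
        return list( itertools.combinations( flat_media, 2 ) )
        
    
    first_media = flat_media[0]
    
    return [ ( first_media, other_media ) for other_media in flat_media if other_media != first_media ]
```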
@@ -2007,10 +2009,20 @@ class MediaPanel( ClientMedia.ListeningMediaList, wx.ScrolledCanvas ):
def _SetDuplicatesFocusedBetter( self, duplicate_action_options = None ):
focused_hash = self._focussed_media.GetDisplayMedia().GetHash()
flat_media = self._GetSelectedFlatMedia()
if self._focussed_media is None:
if len( flat_media ) > 1:
wx.MessageBox( 'No file is focused, so cannot set the focused file as better!' )
return
focused_hash = self._focussed_media.GetDisplayMedia().GetHash()
( better_media, ) = [ media for media in flat_media if media.GetHash() == focused_hash ]
worse_flat_media = [ media for media in flat_media if media.GetHash() != focused_hash ]
@@ -2101,7 +2113,7 @@ class MediaPanel( ClientMedia.ListeningMediaList, wx.ScrolledCanvas ):
if hashes is not None and len( hashes ) > 0:
HG.client_controller.pub( 'new_page_query', self._file_service_key, initial_hashes = hashes, do_sort = True )
HG.client_controller.pub( 'new_page_query', self._file_service_key, initial_hashes = hashes )
@@ -2274,14 +2286,6 @@ class MediaPanel( ClientMedia.ListeningMediaList, wx.ScrolledCanvas ):
self._CopyHashesToClipboard( 'sha256' )
elif action == 'duplicate_media_remove_relationships':
self._SetDuplicates( None )
elif action == 'duplicate_media_reset_to_potential':
self._SetDuplicates( HC.DUPLICATE_UNKNOWN )
elif action == 'duplicate_media_set_alternate':
self._SetDuplicates( HC.DUPLICATE_ALTERNATE )
@@ -2298,10 +2302,6 @@ class MediaPanel( ClientMedia.ListeningMediaList, wx.ScrolledCanvas ):
self._SetDuplicatesFocusedBetter()
elif action == 'duplicate_media_set_false_positive':
self._SetDuplicates( HC.DUPLICATE_FALSE_POSITIVE )
elif action == 'duplicate_media_set_same_quality':
self._SetDuplicates( HC.DUPLICATE_SAME_QUALITY )
@@ -2465,11 +2465,9 @@ class MediaPanel( ClientMedia.ListeningMediaList, wx.ScrolledCanvas ):
def SetDuplicateStatusForAll( self, duplicate_type ):
flat_media = ClientMedia.FlattenMedia( self._sorted_media )
media_group = ClientMedia.FlattenMedia( self._sorted_media )
media_pairs = list( itertools.combinations( flat_media, 2 ) )
return self._SetDuplicates( duplicate_type, media_pairs = media_pairs )
return self._SetDuplicates( duplicate_type, media_group = media_group )
def SetFocussedMedia( self, page_key, media ):
@@ -2639,7 +2637,10 @@ class MediaPanelThumbnails( MediaPanel ):
thumbnails = [ thumbnail for ( thumbnail_index, thumbnail ) in self._GetThumbnailsFromPageIndex( clean_index ) ]
HG.client_controller.GetCache( 'thumbnail' ).CancelWaterfall( self._page_key, thumbnails )
if len( thumbnails ) > 0:
HG.client_controller.GetCache( 'thumbnail' ).CancelWaterfall( self._page_key, thumbnails )
self._dirty_canvas_pages.append( bmp )
@@ -2735,7 +2736,7 @@ class MediaPanelThumbnails( MediaPanel ):
if self._GetPageIndexFromThumbnailIndex( thumbnail_index ) not in self._clean_canvas_pages:
return
continue
hash = thumbnail.GetDisplayMedia().GetHash()
@@ -4171,9 +4172,11 @@ class MediaPanelThumbnails( MediaPanel ):
ClientGUIMenus.AppendMenuItem( self, duplicates_action_submenu, 'set all selected as same quality', 'Set all the selected files as same quality duplicates.', self.ProcessApplicationCommand, ClientData.ApplicationCommand( CC.APPLICATION_COMMAND_TYPE_SIMPLE, 'duplicate_media_set_same_quality' ) )
ClientGUIMenus.AppendMenuItem( self, duplicates_action_submenu, 'set all selected as alternates', 'Set all the selected files as alternates.', self.ProcessApplicationCommand, ClientData.ApplicationCommand( CC.APPLICATION_COMMAND_TYPE_SIMPLE, 'duplicate_media_set_alternate' ) )
ClientGUIMenus.AppendMenuItem( self, duplicates_action_submenu, 'set as duplicates with custom metadata merge options', 'Choose which duplicates status to set to this selection and customise non-default duplicate metadata merge options.', self.ProcessApplicationCommand, ClientData.ApplicationCommand( CC.APPLICATION_COMMAND_TYPE_SIMPLE, 'duplicate_media_set_custom' ) )
ClientGUIMenus.AppendMenuItem( self, duplicates_action_submenu, 'make a custom duplicates action', 'Choose which duplicates status to set to this selection and customise non-default merge options.', self.ProcessApplicationCommand, ClientData.ApplicationCommand( CC.APPLICATION_COMMAND_TYPE_SIMPLE, 'duplicate_media_set_custom' ) )
ClientGUIMenus.AppendSeparator( duplicates_action_submenu )
ClientGUIMenus.AppendMenuItem( self, duplicates_action_submenu, 'set all selected as alternates', 'Set all the selected files as alternates.', self.ProcessApplicationCommand, ClientData.ApplicationCommand( CC.APPLICATION_COMMAND_TYPE_SIMPLE, 'duplicate_media_set_alternate' ) )
if collections_selected:
@@ -4182,13 +4185,7 @@ class MediaPanelThumbnails( MediaPanel ):
ClientGUIMenus.AppendMenuItem( self, duplicates_action_submenu, 'set selected collections as groups of alternates', 'Set files in the selection which are collected together as alternates.', self.ProcessApplicationCommand, ClientData.ApplicationCommand( CC.APPLICATION_COMMAND_TYPE_SIMPLE, 'duplicate_media_set_alternate_collections' ) )
ClientGUIMenus.AppendSeparator( duplicates_action_submenu )
ClientGUIMenus.AppendMenuItem( self, duplicates_action_submenu, 'send the ' + num_pairs_text + ' in this selection to be compared in the duplicates filter', 'Set all the possible pairs in the selection as unknown/potential duplicate pairs.', self.ProcessApplicationCommand, ClientData.ApplicationCommand( CC.APPLICATION_COMMAND_TYPE_SIMPLE, 'duplicate_media_reset_to_potential' ) )
ClientGUIMenus.AppendMenuItem( self, duplicates_action_submenu, 'remove the ' + num_pairs_text + ' in this selection from the duplicates system', 'Remove all duplicates relationships from all the pairs in this selection.', self.ProcessApplicationCommand, ClientData.ApplicationCommand( CC.APPLICATION_COMMAND_TYPE_SIMPLE, 'duplicate_media_remove_relationships' ) )
ClientGUIMenus.AppendMenu( duplicates_menu, duplicates_action_submenu, 'set duplicate relationships' )
ClientGUIMenus.AppendMenu( duplicates_menu, duplicates_action_submenu, 'set relationship' )
duplicates_edit_action_submenu = wx.Menu()
@@ -4197,7 +4194,7 @@ class MediaPanelThumbnails( MediaPanel ):
ClientGUIMenus.AppendMenuItem( self, duplicates_edit_action_submenu, 'for ' + HC.duplicate_type_string_lookup[ duplicate_type ], 'Edit what happens when you set this status.', self._EditDuplicateActionOptions, duplicate_type )
ClientGUIMenus.AppendMenu( duplicates_menu, duplicates_edit_action_submenu, 'edit default merge options' )
ClientGUIMenus.AppendMenu( duplicates_menu, duplicates_edit_action_submenu, 'edit default duplicate metadata merge options' )
ClientGUIMenus.AppendMenu( menu, duplicates_menu, 'duplicates' )
@@ -4214,7 +4211,7 @@ class MediaPanelThumbnails( MediaPanel ):
duplicates_view_menu = wx.Menu()
for duplicate_type in ( HC.DUPLICATE_BETTER_OR_WORSE, HC.DUPLICATE_BETTER, HC.DUPLICATE_WORSE, HC.DUPLICATE_SAME_QUALITY, HC.DUPLICATE_ALTERNATE, HC.DUPLICATE_FALSE_POSITIVE, HC.DUPLICATE_UNKNOWN ):
for duplicate_type in ( HC.DUPLICATE_BETTER_OR_WORSE, HC.DUPLICATE_BETTER, HC.DUPLICATE_WORSE, HC.DUPLICATE_SAME_QUALITY, HC.DUPLICATE_ALTERNATE, HC.DUPLICATE_FALSE_POSITIVE, HC.DUPLICATE_POTENTIAL ):
if duplicate_type in file_duplicate_types_to_counts:
@@ -4226,7 +4223,7 @@ class MediaPanelThumbnails( MediaPanel ):
ClientGUIMenus.AppendMenu( duplicates_menu, duplicates_view_menu, 'view this file\'s duplicates' )
ClientGUIMenus.AppendMenu( duplicates_menu, duplicates_view_menu, 'view this file\'s relations' )

@@ -701,7 +701,7 @@ class ReviewServicePanel( wx.Panel ):
def __init__( self, parent, service ):
ClientGUICommon.StaticBox.__init__( self, parent, 'hydrus service account' )
ClientGUICommon.StaticBox.__init__( self, parent, 'hydrus service account - shared by all clients using the same access key' )
self._service = service
@@ -1047,11 +1047,13 @@ class ReviewServicePanel( wx.Panel ):
if service_paused:
self._pause_play_button.SetLabelText( 'unpause' )
self._pause_play_button.SetLabelText( 'paused' )
self._pause_play_button.SetForegroundColour( ( 128, 0, 0 ) )
else:
self._pause_play_button.SetLabelText( 'pause' )
self._pause_play_button.SetLabelText( 'working' )
self._pause_play_button.SetForegroundColour( ( 0, 128, 0 ) )
self._metadata_st.SetLabelText( self._service.GetNextUpdateDueString() )

@@ -319,7 +319,7 @@ class PanelPredicateSystemDuplicateRelationships( PanelPredicateSystem ):
self._num = wx.SpinCtrl( self, min = 0, max = 65535 )
choices = [ ( HC.duplicate_type_string_lookup[ status ], status ) for status in ( HC.DUPLICATE_BETTER_OR_WORSE, HC.DUPLICATE_BETTER, HC.DUPLICATE_WORSE, HC.DUPLICATE_SAME_QUALITY, HC.DUPLICATE_ALTERNATE, HC.DUPLICATE_FALSE_POSITIVE, HC.DUPLICATE_UNKNOWN ) ]
choices = [ ( HC.duplicate_type_string_lookup[ status ], status ) for status in ( HC.DUPLICATE_BETTER_OR_WORSE, HC.DUPLICATE_BETTER, HC.DUPLICATE_WORSE, HC.DUPLICATE_SAME_QUALITY, HC.DUPLICATE_ALTERNATE, HC.DUPLICATE_FALSE_POSITIVE, HC.DUPLICATE_POTENTIAL ) ]
self._dupe_type = ClientGUICommon.BetterRadioBox( self, choices = choices, style = wx.RA_SPECIFY_ROWS )

@@ -1079,6 +1079,8 @@ class ManageClientServicesPanel( ClientGUIScrolledPanels.ManagePanel ):
self._external_host_override = ClientGUICommon.NoneableTextCtrl( self._client_server_options_panel, message = 'host override when copying external links' )
self._external_port_override = ClientGUICommon.NoneableTextCtrl( self._client_server_options_panel, message = 'port override when copying external links' )
self._external_port_override.SetToolTip( 'Setting this to a non-none empty string will forego the \':\' in the URL.' )
if service_type != HC.LOCAL_BOORU:
self._external_scheme_override.Hide()
@@ -2993,6 +2995,7 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
self._file_viewing_stats_menu_display.Append( 'show media and preview combined', CC.FILE_VIEWING_STATS_MENU_DISPLAY_MEDIA_AND_PREVIEW_SUMMED )
self._anchor_and_hide_canvas_drags = wx.CheckBox( self )
self._touchscreen_canvas_drags_unanchor = wx.CheckBox( self )
self._media_zooms = wx.TextCtrl( self )
self._media_zooms.Bind( wx.EVT_TEXT, self.EventZoomsChanged )
@@ -3012,6 +3015,7 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
self._use_system_ffmpeg.SetValue( self._new_options.GetBoolean( 'use_system_ffmpeg' ) )
self._file_viewing_stats_menu_display.SelectClientData( self._new_options.GetInteger( 'file_viewing_stats_menu_display' ) )
self._anchor_and_hide_canvas_drags.SetValue( self._new_options.GetBoolean( 'anchor_and_hide_canvas_drags' ) )
self._touchscreen_canvas_drags_unanchor.SetValue( self._new_options.GetBoolean( 'touchscreen_canvas_drags_unanchor' ) )
media_zooms = self._new_options.GetMediaZooms()
@@ -3038,13 +3042,14 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
rows = []
rows.append( ( 'Start animations this % in: ', self._animation_start_position ) )
rows.append( ( 'Prefer system FFMPEG: ', self._use_system_ffmpeg ) )
rows.append( ( 'Media zooms: ', self._media_zooms ) )
rows.append( ( 'Show media/preview viewing stats or media right-click menus?: ', self._file_viewing_stats_menu_display ) )
rows.append( ( 'WINDOWS ONLY: Hide and anchor mouse cursor on slow canvas drags: ', self._anchor_and_hide_canvas_drags ) )
rows.append( ( 'BUGFIX: Load images with PIL (slower): ', self._load_images_with_pil ) )
rows.append( ( 'BUGFIX: Load gifs with PIL instead of OpenCV (slower, bad transparency): ', self._disable_cv_for_gifs ) )
rows.append( ( 'Start animations this % in:', self._animation_start_position ) )
rows.append( ( 'Prefer system FFMPEG:', self._use_system_ffmpeg ) )
rows.append( ( 'Media zooms:', self._media_zooms ) )
rows.append( ( 'Show media/preview viewing stats or media right-click menus?:', self._file_viewing_stats_menu_display ) )
rows.append( ( 'RECOMMEND WINDOWS ONLY: Hide and anchor mouse cursor on media viewer drags:', self._anchor_and_hide_canvas_drags ) )
rows.append( ( 'RECOMMEND WINDOWS ONLY: If set to hide and anchor, undo on apparent touchscreen drag:', self._touchscreen_canvas_drags_unanchor ) )
rows.append( ( 'BUGFIX: Load images with PIL (slower):', self._load_images_with_pil ) )
rows.append( ( 'BUGFIX: Load gifs with PIL instead of OpenCV (slower, bad transparency):', self._disable_cv_for_gifs ) )
gridbox = ClientGUICommon.WrapInGrid( self, rows )
@@ -3143,6 +3148,7 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
self._new_options.SetBoolean( 'load_images_with_pil', self._load_images_with_pil.GetValue() )
self._new_options.SetBoolean( 'use_system_ffmpeg', self._use_system_ffmpeg.GetValue() )
self._new_options.SetBoolean( 'anchor_and_hide_canvas_drags', self._anchor_and_hide_canvas_drags.GetValue() )
self._new_options.SetBoolean( 'touchscreen_canvas_drags_unanchor', self._touchscreen_canvas_drags_unanchor.GetValue() )
try:

@@ -1240,7 +1240,17 @@ class ManageTagsPanel( ClientGUIScrolledPanels.ManagePanel ):
self._remove_tags = ClientGUICommon.BetterButton( self._tags_box_sorter, text, self._RemoveTagsButton )
self._do_siblings_and_parents = ClientGUICommon.BetterBitmapButton( self._tags_box_sorter, CC.GlobalBMPs.family, self._DoSiblingsAndParents )
menu_items = []
call = HydrusData.Call( self._DoSiblingsAndParents, self._tag_service_key )
menu_items.append( ( 'normal', 'Hard-replace all applicable tags with their siblings and add missing parents. (Just this service\'s siblings and parents)', 'Fix siblings and parents.', call ) )
call = HydrusData.Call( self._DoSiblingsAndParents, CC.COMBINED_TAG_SERVICE_KEY )
menu_items.append( ( 'normal', 'Hard-replace all applicable tags with their siblings and add missing parents. (All service siblings and parents)', 'Fix siblings and parents.', call ) )
self._do_siblings_and_parents = ClientGUICommon.MenuBitmapButton( self._tags_box_sorter, CC.GlobalBMPs.family, menu_items )
self._do_siblings_and_parents.SetToolTip( 'Hard-replace all applicable tags with their siblings and add missing parents.' )
self._copy_button = ClientGUICommon.BetterBitmapButton( self._tags_box_sorter, CC.GlobalBMPs.copy, self._Copy )
@@ -1683,7 +1693,7 @@ class ManageTagsPanel( ClientGUIScrolledPanels.ManagePanel ):
def _DoSiblingsAndParents( self ):
def _DoSiblingsAndParents( self, service_key ):
try:
@@ -1702,9 +1712,9 @@ class ManageTagsPanel( ClientGUIScrolledPanels.ManagePanel ):
tags = tags_manager.GetCurrent( self._tag_service_key ).union( tags_manager.GetPending( self._tag_service_key ) )
sibling_correct_tags = tag_siblings_manager.CollapseTags( self._tag_service_key, tags )
sibling_correct_tags = tag_siblings_manager.CollapseTags( service_key, tags, service_strict = True )
sibling_and_parent_correct_tags = tag_parents_manager.ExpandTags( self._tag_service_key, sibling_correct_tags )
sibling_and_parent_correct_tags = tag_parents_manager.ExpandTags( service_key, sibling_correct_tags, service_strict = True )
removee_tags = tags.difference( sibling_and_parent_correct_tags )
addee_tags = sibling_and_parent_correct_tags.difference( tags )

@@ -798,6 +798,11 @@ class Frame( wx.Frame ):
if menu is not None and menu in self._menu_stack:
if hasattr( menu, 'hydrus_menubar_name' ):
HG.client_controller.MenubarMenuIsClosed()
index = self._menu_stack.index( menu )
del self._menu_stack[ index ]
@@ -849,6 +854,11 @@ class Frame( wx.Frame ):
if menu is not None:
if hasattr( menu, 'hydrus_menubar_name' ):
HG.client_controller.MenubarMenuIsOpen()
status_bar = HG.client_controller.GetGUI().GetStatusBar()
previous_text = status_bar.GetStatusText()

View File

@@ -1066,15 +1066,20 @@ class TagImportOptions( HydrusSerialisable.SerialisableBase ):
def CheckBlacklist( self, tags ):
ok_tags = self._tag_blacklist.Filter( tags )
sibling_tags = HG.client_controller.tag_siblings_manager.CollapseTags( CC.COMBINED_TAG_SERVICE_KEY, tags )
if len( ok_tags ) < len( tags ):
for test_tags in ( tags, sibling_tags ):
bad_tags = set( tags ).difference( ok_tags )
ok_tags = self._tag_blacklist.Filter( test_tags )
bad_tags = HydrusTags.SortNumericTags( bad_tags )
raise HydrusExceptions.VetoException( ', '.join( bad_tags ) + ' is blacklisted!' )
if len( ok_tags ) < len( test_tags ):
bad_tags = set( test_tags ).difference( ok_tags )
bad_tags = HydrusTags.SortNumericTags( bad_tags )
raise HydrusExceptions.VetoException( ', '.join( bad_tags ) + ' is blacklisted!' )
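The reworked blacklist check above can be sketched in plain Python. This is a simplified stand-in: `blacklist` is a bare set rather than the real tag-filter object, and `VetoException` is modelled as `ValueError`; only the two-pass raw-tags/sibling-tags logic matches the diff.

```python
def check_blacklist(blacklist, tags, sibling_tags):
    # Both the raw tags and their sibling-collapsed forms are filtered;
    # any tag the blacklist removes vetoes the whole import.
    for test_tags in (tags, sibling_tags):
        ok_tags = {tag for tag in test_tags if tag not in blacklist}

        if len(ok_tags) < len(test_tags):
            bad_tags = sorted(set(test_tags).difference(ok_tags))

            raise ValueError(', '.join(bad_tags) + ' is blacklisted!')

check_blacklist({'bad'}, {'ok_tag'}, {'ok_sibling'})  # passes silently
```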

View File

@@ -637,9 +637,12 @@ class HydrusResourceClientAPIRestricted( HydrusResourceClientAPI ):
HydrusResourceClientAPI._callbackCheckRestrictions( self, request )
self._EstablishAPIPermissions( request )
self._CheckAPIPermissions( request )
if request.method != b'OPTIONS':
self._EstablishAPIPermissions( request )
self._CheckAPIPermissions( request )
return request

View File

@@ -1254,15 +1254,19 @@ class MediaList( object ):
non_trash_local_file_services = list( local_file_domains ) + [ CC.COMBINED_LOCAL_FILE_SERVICE_KEY ]
local_file_services = list( non_trash_local_file_services ) + [ CC.TRASH_SERVICE_KEY ]
all_local_file_services = list( non_trash_local_file_services ) + [ CC.TRASH_SERVICE_KEY ]
deleted_from_trash_and_local_view = service_key == CC.TRASH_SERVICE_KEY and self._file_service_key in local_file_services
physically_deleted = service_key in ( CC.TRASH_SERVICE_KEY, CC.COMBINED_LOCAL_FILE_SERVICE_KEY )
trashed = service_key in local_file_domains
deleted_from_our_domain = service_key == self._file_service_key
trashed_from_our_local_file_domain = HC.options[ 'remove_trashed_files' ] and service_key in local_file_domains and self._file_service_key == service_key
physically_deleted_and_local_view = physically_deleted and self._file_service_key in all_local_file_services
deleted_from_repo_and_repo_view = service_key not in local_file_services and self._file_service_key == service_key
user_says_remove_and_trashed_from_our_local_file_domain = HC.options[ 'remove_trashed_files' ] and trashed and deleted_from_our_domain
if deleted_from_trash_and_local_view or trashed_from_our_local_file_domain or deleted_from_repo_and_repo_view:
deleted_from_repo_and_repo_view = service_key not in all_local_file_services and deleted_from_our_domain
if physically_deleted_and_local_view or user_says_remove_and_trashed_from_our_local_file_domain or deleted_from_repo_and_repo_view:
self._RemoveMediaByHashes( hashes )
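The reworked removal predicate above can be sketched as a standalone function (hypothetical, simplified names; the service-key sets are passed in explicitly rather than read from `HC.options` and class state):

```python
def should_remove(service_key, view_service_key, local_file_domains,
                  all_local_file_services, remove_trashed_files,
                  physically_deleting_services):
    # A media list drops hashes when: a physical delete happens while we are
    # looking at a local view; the user opts to remove trashed files and the
    # trash came from the domain we are viewing; or a repository deletion is
    # seen from that repository's own view.
    physically_deleted = service_key in physically_deleting_services
    trashed = service_key in local_file_domains
    deleted_from_our_domain = service_key == view_service_key

    physically_deleted_and_local_view = (
        physically_deleted and view_service_key in all_local_file_services)
    user_remove_and_trashed_here = (
        remove_trashed_files and trashed and deleted_from_our_domain)
    deleted_from_repo_and_repo_view = (
        service_key not in all_local_file_services and deleted_from_our_domain)

    return (physically_deleted_and_local_view
            or user_remove_and_trashed_here
            or deleted_from_repo_and_repo_view)
```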

View File

@@ -79,6 +79,7 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):
self._dictionary[ 'booleans' ][ 'reverse_page_shift_drag_behaviour' ] = False
self._dictionary[ 'booleans' ][ 'anchor_and_hide_canvas_drags' ] = HC.PLATFORM_WINDOWS
self._dictionary[ 'booleans' ][ 'touchscreen_canvas_drags_unanchor' ] = False
self._dictionary[ 'booleans' ][ 'thumbnail_fill' ] = False

View File

@@ -107,7 +107,7 @@ def FilterTagsBySearchText( service_key, search_text, tags, search_siblings = Tr
re_predicate = compile_re( search_text )
sibling_manager = HG.client_controller.tag_siblings_manager
siblings_manager = HG.client_controller.tag_siblings_manager
result = []
@@ -115,7 +115,7 @@ def FilterTagsBySearchText( service_key, search_text, tags, search_siblings = Tr
if search_siblings:
possible_tags = sibling_manager.GetAllSiblings( service_key, tag )
possible_tags = siblings_manager.GetAllSiblings( service_key, tag )
else:

View File

@@ -459,19 +459,24 @@ class ServiceLocalBooru( ServiceLocalServerService ):
if self._upnp_port is None:
port = self._port
port = ':' + self._port
else:
port = self._upnp_port
port = ':' + self._upnp_port
else:
port = self._external_port_override
if port != '':
port = ':' + port
url = '{}://{}:{}/gallery?share_key={}'.format( scheme, host, port, share_key.hex() )
url = '{}://{}{}/gallery?share_key={}'.format( scheme, host, port, share_key.hex() )
return url
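The corrected URL assembly above can be sketched like so (assumption: by the time the format string runs, the port segment is a string that already carries its own leading `:`, or is empty when an external override says to omit it):

```python
def share_url(scheme, host, port_segment, share_key_hex):
    # The port segment brings its own ':' (or is ''), so the format string
    # no longer hard-codes one between host and port.
    return '{}://{}{}/gallery?share_key={}'.format(
        scheme, host, port_segment, share_key_hex)
```

This is what the diff fixes: the old `'{}://{}:{}/...'` template would have produced a stray colon whenever the override left the port empty.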

View File

@@ -67,7 +67,7 @@ options = {}
# Misc
NETWORK_VERSION = 18
SOFTWARE_VERSION = 353
SOFTWARE_VERSION = 354
CLIENT_API_VERSION = 6
SERVER_THUMBNAIL_DIMENSIONS = ( 200, 200 )
@@ -195,7 +195,7 @@ content_update_string_lookup[ CONTENT_UPDATE_FLIP ] = 'flip on/off'
DEFINITIONS_TYPE_HASHES = 0
DEFINITIONS_TYPE_TAGS = 1
DUPLICATE_UNKNOWN = 0
DUPLICATE_POTENTIAL = 0
DUPLICATE_FALSE_POSITIVE = 1
DUPLICATE_SAME_QUALITY = 2
DUPLICATE_ALTERNATE = 3
@@ -207,7 +207,7 @@ DUPLICATE_BETTER_OR_WORSE = 8
duplicate_type_string_lookup = {}
duplicate_type_string_lookup[ DUPLICATE_UNKNOWN ] = 'potential duplicates'
duplicate_type_string_lookup[ DUPLICATE_POTENTIAL ] = 'potential duplicates'
duplicate_type_string_lookup[ DUPLICATE_FALSE_POSITIVE ] = 'not related/false positive'
duplicate_type_string_lookup[ DUPLICATE_SAME_QUALITY ] = 'same quality'
duplicate_type_string_lookup[ DUPLICATE_ALTERNATE ] = 'alternates'

View File

@@ -35,6 +35,7 @@ force_idle_mode = False
no_page_limit_mode = False
thumbnail_debug_mode = False
server_busy = False
currently_uploading_pending = False
do_idle_shutdown_work = False
shutdown_complete = False

View File

@@ -683,6 +683,7 @@ class HydrusResource( Resource ):
if self._service.SupportsCORS():
request.setHeader( 'Access-Control-Allow-Headers', 'Hydrus-Client-API-Access-Key' )
request.setHeader( 'Access-Control-Allow-Origin', '*' )
request.setHeader( 'Access-Control-Allow-Methods', allowed_methods_string )
@@ -700,7 +701,7 @@
# 204 No Content
response_context = ResponseContext( 204 )
response_context = ResponseContext( 200, mime = HC.TEXT_PLAIN )
return response_context

View File

@@ -181,6 +181,8 @@ class Controller( HydrusController.HydrusController ):
try:
time.sleep( 1 )
port = service.GetPort()
if HydrusNetworking.LocalPortInUse( port ):

View File

@@ -174,9 +174,12 @@ class HydrusResourceRestricted( HydrusResourceHydrusNetwork ):
HydrusResourceHydrusNetwork._callbackCheckRestrictions( self, request )
self._checkSession( request )
self._checkAccount( request )
if request.method != b'OPTIONS':
self._checkSession( request )
self._checkAccount( request )
return request

View File

@@ -232,7 +232,7 @@ class TestClientAPI( unittest.TestCase ):
data = response.read()
self.assertEqual( response.status, 204 )
self.assertEqual( response.status, 200 )
self.assertEqual( response.getheader( 'Allow' ), 'GET' )
@@ -255,7 +255,7 @@ class TestClientAPI( unittest.TestCase ):
data = response.read()
self.assertEqual( response.status, 204 )
self.assertEqual( response.status, 200 )
self.assertEqual( response.getheader( 'Allow' ), 'GET' )
@@ -267,9 +267,10 @@ class TestClientAPI( unittest.TestCase ):
data = response.read()
self.assertEqual( response.status, 204 )
self.assertEqual( response.status, 200 )
self.assertEqual( response.getheader( 'Access-Control-Allow-Methods' ), 'GET' )
self.assertEqual( response.getheader( 'Access-Control-Allow-Headers' ), 'Hydrus-Client-API-Access-Key' )
self.assertEqual( response.getheader( 'Access-Control-Allow-Origin' ), '*' )
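The OPTIONS behaviour these tests pin down can be sketched as a plain function (hypothetical helper, not the Twisted resource itself; the header names and the 204-to-200 change are the ones in the diffs above):

```python
def preflight_headers(allowed_methods, supports_cors):
    # OPTIONS now answers 200 rather than 204, always advertising 'Allow';
    # a CORS-enabled service additionally advertises its allowed methods,
    # headers and origin for browser preflight requests.
    headers = {'Allow': ', '.join(allowed_methods)}

    if supports_cors:
        headers['Access-Control-Allow-Headers'] = 'Hydrus-Client-API-Access-Key'
        headers['Access-Control-Allow-Origin'] = '*'
        headers['Access-Control-Allow-Methods'] = ', '.join(allowed_methods)

    return (200, headers)
```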

View File

@@ -192,6 +192,252 @@ class TestClientDB( unittest.TestCase ):
self.assertEqual( result, [] )
def test_duplicates( self ):
dupe_hashes = [ HydrusData.GenerateKey() for i in range( 5 ) ]
similar_looking_alternate_hashes = [ HydrusData.GenerateKey() for i in range( 3 ) ]
similar_looking_false_positive_hashes = [ HydrusData.GenerateKey() for i in range( 3 ) ]
all_hashes = set()
all_hashes.update( dupe_hashes )
all_hashes.update( similar_looking_alternate_hashes )
all_hashes.update( similar_looking_false_positive_hashes )
phash = os.urandom( 8 )
# fake-import the files with the phash
( size, mime, width, height, duration, num_frames, num_words ) = ( 65535, HC.IMAGE_JPEG, 640, 480, None, None, None )
for hash in all_hashes:
fake_file_import_job = ClientImportFileSeeds.FileImportJob( 'fake path' )
fake_file_import_job._hash = hash
fake_file_import_job._file_info = ( size, mime, width, height, duration, num_frames, num_words )
fake_file_import_job._extra_hashes = ( b'abcd', b'abcd', b'abcd' )
fake_file_import_job._phashes = [ phash ]
fake_file_import_job._file_import_options = ClientImportOptions.FileImportOptions()
self._write( 'import_file', fake_file_import_job )
# run search maintenance
self._write( 'maintain_similar_files_tree' )
self._write( 'maintain_similar_files_search_for_potential_duplicates', 0 )
# get filter counts, random potential hashes, and dupe filter hashes
size_pred = ClientSearch.Predicate( HC.PREDICATE_TYPE_SYSTEM_SIZE, ( '=', 65535, HydrusData.ConvertUnitToInt( 'B' ) ) )
file_search_context = ClientSearch.FileSearchContext( file_service_key = CC.LOCAL_FILE_SERVICE_KEY, predicates = [ size_pred ] )
both_files_match = True
num_potentials = self._read( 'potential_duplicates_count', file_search_context, both_files_match )
n = len( all_hashes )
# number pair combinations is (n(n-1))/2
expected_num_potentials = n * ( n - 1 ) // 2
self.assertEqual( num_potentials, expected_num_potentials )
result = self._read( 'random_potential_duplicate_hashes', file_search_context, both_files_match )
self.assertEqual( len( result ), n )
self.assertEqual( set( result ), all_hashes )
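The expected-potentials arithmetic here is just the unordered-pair count over the phash-identical files; a quick sanity check with the test's n = 11 (5 dupes + 3 alternates + 3 false positives):

```python
import math

n = 11  # 5 dupe hashes + 3 alternates + 3 false positives

# every unordered pair of the n phash-identical files is a potential duplicate
expected_num_potentials = n * (n - 1) // 2

assert expected_num_potentials == math.comb(n, 2) == 55
```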
filtering_pairs = self._read( 'duplicate_pairs_for_filtering', file_search_context, both_files_match )
for ( a, b ) in filtering_pairs:
self.assertIn( a, all_hashes )
self.assertIn( b, all_hashes )
result = self._read( 'file_duplicate_types_to_counts', CC.LOCAL_FILE_SERVICE_KEY, dupe_hashes[0] )
self.assertEqual( len( result ), 1 )
self.assertEqual( result[ HC.DUPLICATE_POTENTIAL ], 10 )
result = self._read( 'file_duplicate_hashes', CC.LOCAL_FILE_SERVICE_KEY, dupe_hashes[0], HC.DUPLICATE_POTENTIAL )
self.assertEqual( result[0], dupe_hashes[0] )
self.assertEqual( set( result ), all_hashes )
# applying better/worse, test dup counts and hashes and king hash
king_hash = dupe_hashes[0]
row = ( HC.DUPLICATE_BETTER, dupe_hashes[0], dupe_hashes[1], {} )
self._write( 'duplicate_pair_status', [ row ] )
num_potentials = self._read( 'potential_duplicates_count', file_search_context, both_files_match )
self.assertEqual( num_potentials, expected_num_potentials - 1 )
result = self._read( 'file_duplicate_types_to_counts', CC.LOCAL_FILE_SERVICE_KEY, king_hash )
self.assertEqual( len( result ), 2 )
self.assertEqual( result[ HC.DUPLICATE_POTENTIAL ], 9 )
self.assertEqual( result[ HC.DUPLICATE_BETTER ], 1 )
result = self._read( 'file_duplicate_types_to_counts', CC.LOCAL_FILE_SERVICE_KEY, dupe_hashes[1] )
self.assertEqual( len( result ), 2 )
self.assertEqual( result[ HC.DUPLICATE_POTENTIAL ], 9 )
self.assertEqual( result[ HC.DUPLICATE_WORSE ], 1 )
result = self._read( 'file_duplicate_hashes', CC.LOCAL_FILE_SERVICE_KEY, king_hash, HC.DUPLICATE_BETTER )
self.assertEqual( result, [ king_hash, dupe_hashes[1] ] )
result = self._read( 'file_duplicate_hashes', CC.LOCAL_FILE_SERVICE_KEY, king_hash, HC.DUPLICATE_WORSE )
self.assertEqual( result, [ king_hash ] )
result = self._read( 'file_duplicate_hashes', CC.LOCAL_FILE_SERVICE_KEY, king_hash, HC.DUPLICATE_BETTER_OR_WORSE )
self.assertEqual( result, [ king_hash, dupe_hashes[1] ] )
result = self._read( 'file_duplicate_hashes', CC.LOCAL_FILE_SERVICE_KEY, dupe_hashes[1], HC.DUPLICATE_BETTER )
self.assertEqual( result, [ dupe_hashes[1] ] )
result = self._read( 'file_duplicate_hashes', CC.LOCAL_FILE_SERVICE_KEY, dupe_hashes[1], HC.DUPLICATE_WORSE )
self.assertEqual( result, [ dupe_hashes[1], king_hash ] )
result = self._read( 'file_duplicate_hashes', CC.LOCAL_FILE_SERVICE_KEY, dupe_hashes[1], HC.DUPLICATE_BETTER_OR_WORSE )
self.assertEqual( result, [ dupe_hashes[1], king_hash ] )
# applying better/worse for new king, test dup counts and hashes and king hash
# king_hash = dupe_hashes[2]
# applying same quality, test dup counts and hashes and king hash
# applying false positive, test dup counts and hashes and that potentials are cut/split as a result
false_positive_king_hash = similar_looking_false_positive_hashes[0]
row = ( HC.DUPLICATE_FALSE_POSITIVE, king_hash, false_positive_king_hash, {} )
self._write( 'duplicate_pair_status', [ row ] )
num_potentials = self._read( 'potential_duplicates_count', file_search_context, both_files_match )
self.assertEqual( num_potentials, expected_num_potentials - 2 )
result = self._read( 'file_duplicate_types_to_counts', CC.LOCAL_FILE_SERVICE_KEY, king_hash )
self.assertEqual( len( result ), 3 )
self.assertEqual( result[ HC.DUPLICATE_POTENTIAL ], 8 )
self.assertEqual( result[ HC.DUPLICATE_BETTER ], 1 )
self.assertEqual( result[ HC.DUPLICATE_FALSE_POSITIVE ], 1 )
result = self._read( 'file_duplicate_types_to_counts', CC.LOCAL_FILE_SERVICE_KEY, false_positive_king_hash )
self.assertEqual( len( result ), 2 )
self.assertEqual( result[ HC.DUPLICATE_POTENTIAL ], 9 ) # if potentials are being reduced right in the new system, this will be less
self.assertEqual( result[ HC.DUPLICATE_FALSE_POSITIVE ], 1 )
result = self._read( 'file_duplicate_hashes', CC.LOCAL_FILE_SERVICE_KEY, king_hash, HC.DUPLICATE_FALSE_POSITIVE )
self.assertEqual( result, [ king_hash, false_positive_king_hash ] )
result = self._read( 'file_duplicate_hashes', CC.LOCAL_FILE_SERVICE_KEY, false_positive_king_hash, HC.DUPLICATE_FALSE_POSITIVE )
self.assertEqual( result, [ false_positive_king_hash, king_hash ] )
# applying alternate, test dup counts and hashes and that potentials are cut/split as a result
alternate_king_hash = similar_looking_alternate_hashes[0]
row = ( HC.DUPLICATE_ALTERNATE, king_hash, alternate_king_hash, {} )
self._write( 'duplicate_pair_status', [ row ] )
num_potentials = self._read( 'potential_duplicates_count', file_search_context, both_files_match )
self.assertEqual( num_potentials, expected_num_potentials - 3 )
result = self._read( 'file_duplicate_types_to_counts', CC.LOCAL_FILE_SERVICE_KEY, king_hash )
self.assertEqual( len( result ), 4 )
self.assertEqual( result[ HC.DUPLICATE_POTENTIAL ], 7 )
self.assertEqual( result[ HC.DUPLICATE_BETTER ], 1 )
self.assertEqual( result[ HC.DUPLICATE_FALSE_POSITIVE ], 1 )
self.assertEqual( result[ HC.DUPLICATE_ALTERNATE ], 1 )
result = self._read( 'file_duplicate_types_to_counts', CC.LOCAL_FILE_SERVICE_KEY, alternate_king_hash )
self.assertEqual( len( result ), 3 )
self.assertEqual( result[ HC.DUPLICATE_POTENTIAL ], 9 ) # if potentials are being reduced right in the new system, this will be less
self.assertEqual( result[ HC.DUPLICATE_FALSE_POSITIVE ], 1 )
self.assertEqual( result[ HC.DUPLICATE_ALTERNATE ], 1 )
result = self._read( 'file_duplicate_hashes', CC.LOCAL_FILE_SERVICE_KEY, king_hash, HC.DUPLICATE_ALTERNATE )
self.assertEqual( result, [ king_hash, alternate_king_hash ] )
result = self._read( 'file_duplicate_hashes', CC.LOCAL_FILE_SERVICE_KEY, alternate_king_hash, HC.DUPLICATE_ALTERNATE )
self.assertEqual( result, [ alternate_king_hash, king_hash ] )
# applying better/worse to false positive, test dup counts and hashes and that potentials are cut/split as a result
# applying better/worse to alternate, test dup counts and hashes and that potentials are cut/split as a result
# applying same quality to false positive, test dup counts and hashes and that potentials are cut/split as a result
# applying same quality to alternate, test dup counts and hashes and that potentials are cut/split as a result
# now we have some set up, try adding some impossible relationships and make sure they don't change anything
king_hash_counts = self._read( 'file_duplicate_types_to_counts', CC.LOCAL_FILE_SERVICE_KEY, king_hash )
false_positive_king_hash_counts = self._read( 'file_duplicate_types_to_counts', CC.LOCAL_FILE_SERVICE_KEY, false_positive_king_hash )
alternate_king_hash_counts = self._read( 'file_duplicate_types_to_counts', CC.LOCAL_FILE_SERVICE_KEY, alternate_king_hash )
for duplicate_type in [ HC.DUPLICATE_FALSE_POSITIVE, HC.DUPLICATE_ALTERNATE ]: # [ HC.DUPLICATE_BETTER, HC.DUPLICATE_WORSE, HC.DUPLICATE_SAME_QUALITY, HC.DUPLICATE_FALSE_POSITIVE, HC.DUPLICATE_ALTERNATE ]:
for ( hash_a, hash_b ) in itertools.combinations( [ king_hash, false_positive_king_hash, alternate_king_hash ], 2 ):
row = ( duplicate_type, hash_a, hash_b, {} )
self._write( 'duplicate_pair_status', [ row ] )
king_hash_counts_after = self._read( 'file_duplicate_types_to_counts', CC.LOCAL_FILE_SERVICE_KEY, king_hash )
false_positive_king_hash_counts_after = self._read( 'file_duplicate_types_to_counts', CC.LOCAL_FILE_SERVICE_KEY, false_positive_king_hash )
alternate_king_hash_counts_after = self._read( 'file_duplicate_types_to_counts', CC.LOCAL_FILE_SERVICE_KEY, alternate_king_hash )
self.assertEqual( king_hash_counts, king_hash_counts_after )
self.assertEqual( false_positive_king_hash_counts, false_positive_king_hash_counts_after )
self.assertEqual( alternate_king_hash_counts, alternate_king_hash_counts_after )
# test dissolve actions will reset counts to 0
def test_export_folders( self ):
file_search_context = ClientSearch.FileSearchContext(file_service_key = HydrusData.GenerateKey(), tag_service_key = HydrusData.GenerateKey(), predicates = [ ClientSearch.Predicate( HC.PREDICATE_TYPE_TAG, 'test' ) ] )
@@ -788,16 +1034,16 @@ class TestClientDB( unittest.TestCase ):
test_files = []
test_files.append( ( 'muh_swf.swf', 'edfef9905fdecde38e0752a5b6ab7b6df887c3968d4246adc9cffc997e168cdf', 456774, HC.APPLICATION_FLASH, 400, 400, 33, 1, None ) )
test_files.append( ( 'muh_mp4.mp4', '2fa293907144a046d043d74e9570b1c792cbfd77ee3f5c93b2b1a1cb3e4c7383', 570534, HC.VIDEO_MP4, 480, 480, 'mp4_duration', 151, None ) )
test_files.append( ( 'muh_mpeg.mpeg', 'aebb10aaf3b27a5878fd2732ea28aaef7bbecef7449eaa759421c4ba4efff494', 772096, HC.VIDEO_MPEG, 720, 480, 3500, 105, None ) )
test_files.append( ( 'muh_webm.webm', '55b6ce9d067326bf4b2fbe66b8f51f366bc6e5f776ba691b0351364383c43fcb', 84069, HC.VIDEO_WEBM, 640, 360, 4010, 120, None ) )
test_files.append( ( 'muh_jpg.jpg', '5d884d84813beeebd59a35e474fa3e4742d0f2b6679faa7609b245ddbbd05444', 42296, HC.IMAGE_JPEG, 392, 498, None, None, None ) )
test_files.append( ( 'muh_png.png', 'cdc67d3b377e6e1397ffa55edc5b50f6bdf4482c7a6102c6f27fa351429d6f49', 31452, HC.IMAGE_PNG, 191, 196, None, None, None ) )
test_files.append( ( 'muh_apng.png', '9e7b8b5abc7cb11da32db05671ce926a2a2b701415d1b2cb77a28deea51010c3', 616956, HC.IMAGE_APNG, 500, 500, 'apng_duration', 47, None ) )
test_files.append( ( 'muh_gif.gif', '00dd9e9611ebc929bfc78fde99a0c92800bbb09b9d18e0946cea94c099b211c2', 15660, HC.IMAGE_GIF, 329, 302, 600, 5, None ) )
test_files.append( ( 'muh_swf.swf', 'edfef9905fdecde38e0752a5b6ab7b6df887c3968d4246adc9cffc997e168cdf', 456774, HC.APPLICATION_FLASH, 400, 400, { 33 }, { 1 }, None ) )
test_files.append( ( 'muh_mp4.mp4', '2fa293907144a046d043d74e9570b1c792cbfd77ee3f5c93b2b1a1cb3e4c7383', 570534, HC.VIDEO_MP4, 480, 480, { 6266, 6290 }, { 151 }, None ) )
test_files.append( ( 'muh_mpeg.mpeg', 'aebb10aaf3b27a5878fd2732ea28aaef7bbecef7449eaa759421c4ba4efff494', 772096, HC.VIDEO_MPEG, 720, 480, { 3500 }, { 105 }, None ) )
test_files.append( ( 'muh_webm.webm', '55b6ce9d067326bf4b2fbe66b8f51f366bc6e5f776ba691b0351364383c43fcb', 84069, HC.VIDEO_WEBM, 640, 360, { 4010 }, { 120 }, None ) )
test_files.append( ( 'muh_jpg.jpg', '5d884d84813beeebd59a35e474fa3e4742d0f2b6679faa7609b245ddbbd05444', 42296, HC.IMAGE_JPEG, 392, 498, { None }, { None }, None ) )
test_files.append( ( 'muh_png.png', 'cdc67d3b377e6e1397ffa55edc5b50f6bdf4482c7a6102c6f27fa351429d6f49', 31452, HC.IMAGE_PNG, 191, 196, { None }, { None }, None ) )
test_files.append( ( 'muh_apng.png', '9e7b8b5abc7cb11da32db05671ce926a2a2b701415d1b2cb77a28deea51010c3', 616956, HC.IMAGE_APNG, 500, 500, { 3133, 1880, 1125, 1800 }, { 27, 47 }, None ) )
test_files.append( ( 'muh_gif.gif', '00dd9e9611ebc929bfc78fde99a0c92800bbb09b9d18e0946cea94c099b211c2', 15660, HC.IMAGE_GIF, 329, 302, { 600 }, { 5 }, None ) )
for ( filename, hex_hash, size, mime, width, height, duration, num_frames, num_words ) in test_files:
for ( filename, hex_hash, size, mime, width, height, durations, nums_frame, num_words ) in test_files:
path = os.path.join( HC.STATIC_DIR, 'testing', filename )
@@ -846,21 +1092,8 @@ class TestClientDB( unittest.TestCase ):
self.assertEqual( mr_mime, mime )
self.assertEqual( mr_width, width )
self.assertEqual( mr_height, height )
if duration == 'apng_duration': # diff ffmpeg versions report differently
self.assertIn( mr_duration, ( 3133, 1880, 1125, 1800 ) )
elif duration == 'mp4_duration':
self.assertIn( mr_duration, ( 6266, 6290 ) )
else:
self.assertEqual( mr_duration, duration )
self.assertEqual( mr_num_frames, num_frames )
self.assertIn( mr_duration, durations )
self.assertIn( mr_num_frames, nums_frame )
self.assertEqual( mr_num_words, num_words )