Version 259

Hydrus Network Developer 2017-06-07 17:05:15 -05:00
parent 490b7e4ab8
commit a73d18d6c4
38 changed files with 1393 additions and 343 deletions

View File

@ -8,6 +8,8 @@
<div class="content">
<h3>this is non-comprehensive</h3>
<p>I am always changing and adding little things. The best way to learn is just to look around. If you think a shortcut should probably do something, try it out! If you can't find something, let me know and I'll try to add it!</p>
<h3>advanced mode</h3>
<p>To avoid confusing clutter, some advanced features are hidden by default. When you are comfortable with the program, hit <i>help->advanced mode</i> to reveal these buttons and menu options!</p>
<h3>searching with wildcards</h3>
<p>The autocomplete tag dropdown supports wildcard searching with '*'.</p>
<p><img src="wildcard_gelion.png"/></p>

View File

@ -8,6 +8,39 @@
<div class="content">
<h3>changelog</h3>
<ul>
<li><h3>version 259</h3></li>
<ul>
<li>planned out new networking engine and started the principal objects</li>
<li>renamed the 'exact match' duplicate status to 'same quality', to reduce confusion on what is appropriate for this status</li>
<li>the duplicate filter, on hitting the delete key, now offers the option of deleting both files</li>
<li>the duplicate system now combines duplicate status setting and the consequent batch of content updates into the same database transaction, speeding things up</li>
<li>the duplicate system now batches multiple duplicate status sets into a single transaction, massively speeding up large filters or thumbnail status set actions (a rough sketch of the batched write follows this version's list)</li>
<li>misc duplicate help tweaks</li>
<li>you can now edit the default duplicate merge options from the new thumbnail duplicate menu</li>
<li>the duplicates page's jobs are less demanding on gui time and take better breaks if something else happens</li>
<li>renamed the new dupe system predicate to 'system:num duplicate relationships' to clarify what it searches</li>
<li>for normal queries, current and pending mappings are now fetched from a faster mappings cache. you should see faster result building across the board, particularly on fresh boots or otherwise slow-disk-access systems</li>
<li>added a prototype 'advanced mode' (defaulting to off, so experienced users will want to turn it on) under the help menu that will enable menu items that are often not helpful to new users. I will add more things to this in future, suggestions welcome</li>
<li>the new thumbnail menu dupe relationship set stuff is now considered advanced</li>
<li>thumbnail menu find similar files is now considered advanced</li>
<li>the thumbnail menu copy hash entries are now considered advanced</li>
<li>advanced content update buttons (on manage tags and review services) are now considered advanced</li>
<li>added an advanced mode 'open file location' entry to the thumbnail share menu, which will open the file in your OS's file explorer (not available on Linux)</li>
<li>added an advanced mode 'correct video frame count' thumbnail menu entry that will force-apply last week's more accurate video frame counter to correct any videos that render too fast and then cut off</li>
<li>fixed many entries on the media viewer menus, which were being blocked by an over-eager 'can continue' test and hence silently failing</li>
<li>fixed the youtube downloader on Linux and OS X--both now use youtube-dl</li>
<li>fixed an issue where the GetLighterDarkerColour function was producing very bright alternates to very dark colours (meaning dark dupe filters were having their bright background text rendered unreadable)</li>
<li>improved video frame number parsing accuracy</li>
<li>improved the more accurate (manual frame counting) version of video frame number parsing, particularly for longer videos</li>
<li>the network engine now reports 5xx http status codes as ServerException to better contextualise to the user what went wrong</li>
<li>adminside mapping petitions are now sub-ordered by tag</li>
<li>adminside sibling/parent petitions are now ordered by the 'older' tag and sub-ordered by the 'newer' tag</li>
<li>censorship taglists are now roughly sorted</li>
<li>reduced default shutdown work max time to five minutes</li>
<li>improved how subprocesses are started</li>
<li>misc cleanup</li>
<li>misc refactoring</li>
</ul>
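Several of the duplicate items above refer to the same change: duplicate decisions are now written to the database as one batch. The shape of that batch can be read off the ClientGUICanvas and MediaPanel hunks further down this commit; the following is only a rough sketch (decided_pairs and the media objects are hypothetical stand-ins, the other names are taken from the diff):

# rough sketch of the new batched duplicate write -- not a real call site
pair_info = []

for ( duplicate_type, first_media, second_media ) in decided_pairs: # decided_pairs is illustrative
    
    # the merge/delete content updates for this decision, as configured in the duplicate action options
    list_of_service_keys_to_content_updates = duplicate_action_options.ProcessPairIntoContentUpdates( first_media, second_media )
    
    pair_info.append( ( duplicate_type, first_media.GetHash(), second_media.GetHash(), list_of_service_keys_to_content_updates ) )
    

if len( pair_info ) > 0:
    
    # one synchronous db job now sets every status and applies every content update in the same transaction
    HG.client_controller.WriteSynchronous( 'duplicate_pair_status', pair_info )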
<li><h3>version 258</h3></li>
<ul>
<li>added a duplicate entry to the thumbnail right-click menu</li>

View File

@ -31,7 +31,7 @@
<h3>processing</h3>
<p>After you have searched your files, you should have a few dozen to a few tens of thousands of 'potential' pairs. The number may be frighteningly high, but you will be able to cut it down more quickly than you expect. There are several mathematical optimisations at the database level that can use one of your decisions to resolve multiple unknown relationships--there is a short conceptual sketch of this just below.</p>
<p>If you like, you can review some of these groups of potential pairs as thumbnails by hitting the 'show some random pairs' button. It is often surprising and interesting to discover what it has found.</p>
<p>You can do some manual filtering on these thumbnails--merging tags and deleting bad quality files, or even setting duplicate statuses manually through the right-click menu (PROTIP: This last bit isn't done yet!)--but like archiving and deleting from your inbox, this is much more quickly done through a specialised filter:</p>
<p>You can do some manual filtering on these thumbnails--merging tags and deleting bad quality files, or even setting duplicate statuses manually through the right-click menu, if you have <i>help->advanced mode</i> active--but like archiving and deleting from your inbox, most of these operations are done much more quickly through a specialised filter:</p>
</li>
</ul>
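As a conceptual sketch of that resolution (the real work happens in SQL inside _CacheSimilarFilesSetDuplicatePairStatus, visible further down this commit; the set used here is hypothetical): when you decide 'A is better than B', anything already known to be better than A can also be recorded as better than B, and anything already known to be worse than B as worse than A.

# conceptual sketch only -- better_pairs is a hypothetical set of ( better, worse ) tuples
def propagate_better_decision( a, b, better_pairs ):
    
    better_pairs.add( ( a, b ) )
    
    # anything better than A is now better than B
    better_pairs.update( ( x, b ) for ( x, y ) in set( better_pairs ) if y == a )
    
    # anything worse than B is now worse than A
    better_pairs.update( ( a, x ) for ( y, x ) in set( better_pairs ) if y == b )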
<h3>the duplicates filter</h3>
@ -84,14 +84,14 @@
<p>The default action on setting a better/worse pair is to move all <i>local tags</i> from the worse file to the best (i.e. adding them to the better file and then deleting them from the worse) and then sending the worse file to the trash.</p>
</li>
<li>
<h3>exact duplicates</h3>
<p>This is the same as better/worse, except that you absolutely cannot discern a better file. This is typically a rare decision.</p>
<p>Here are two exact matches:</p>
<h3>same quality duplicates</h3>
<p>This tells the client that the pairs of files represent the exact same thing and that you cannot tell which is clearly better.</p>
<p>Here are two same quality duplicates:</p>
<p><a href="dupe_exact_match_1.png"><img src="dupe_exact_match_1.png" /></a></p>
<p><a href="dupe_exact_match_2.png"><img src="dupe_exact_match_2.png" /></a></p>
<p>I cannot tell any difference, although the filesize is significantly different, so I suspect the smaller is a lossless png optimisation. Many of the big content providers--Facebook, Google, Cloudflare--automatically 'optimise' the data that goes through their networks in order to save bandwidth. With pngs it is usually mostly harmless, but jpegs are often a slaughterhouse.</p>
<p>Given the filesize, you might decide that these are actually a better/worse pair--but if the larger image had tags and was the 'canonical' version on most boorus, the decision might not be so clear. Sometimes you just want to keep both without a firm decision on which is best, in which case you can just set this 'exact duplicates' status and move on.</p>
<p>The default action on setting an exact duplicates pair is to copy all <i>local tags</i> between the two files in both directions.</p>
<p>There is no obvious difference between those two. The filesize is significantly different, so I suspect the smaller is a lossless png optimisation. Many of the big content providers--Facebook, Google, Cloudflare--automatically 'optimise' the data that goes through their networks in order to save bandwidth. With pngs it is usually mostly harmless, but jpegs are often a slaughterhouse.</p>
<p>Given the filesize, you might decide that these are actually a better/worse pair--but if the larger image had tags and was the 'canonical' version on most boorus, the decision might not be so clear. Sometimes you just want to keep both without a firm decision on which is best, in which case you can just set this 'same quality' status and move on.</p>
<p>The default action on setting a same quality pair is to copy all <i>local tags</i> between the two files in both directions.</p>
</li>
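For reference, the two default actions just described (better/worse and same quality) correspond to the DuplicateActionOptions defaults set in the ClientData.py hunk later in this commit, reproduced here verbatim (the trailing positional arguments are not documented in this diff, so treat this as a pointer rather than a full specification):

self._dictionary[ 'duplicate_action_options' ][ HC.DUPLICATE_BETTER ] = DuplicateActionOptions( [ ( CC.LOCAL_TAG_SERVICE_KEY, HC.CONTENT_MERGE_ACTION_MOVE, TagCensor() ) ], [], True, True )
self._dictionary[ 'duplicate_action_options' ][ HC.DUPLICATE_SAME_QUALITY ] = DuplicateActionOptions( [ ( CC.LOCAL_TAG_SERVICE_KEY, HC.CONTENT_MERGE_ACTION_TWO_WAY_MERGE, TagCensor() ) ], [], False, True )

CONTENT_MERGE_ACTION_MOVE moves tags from the worse file to the better, while CONTENT_MERGE_ACTION_TWO_WAY_MERGE copies them in both directions.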
<li>
<h3>alternates</h3>

View File

@ -1396,7 +1396,7 @@ class LocalBooruCache( object ):
def _CheckDataUsage( self ):
if not self._local_booru_service.BandwidthOk():
if not self._local_booru_service.BandwidthOK():
raise HydrusExceptions.ForbiddenException( 'This booru has used all its monthly data. Please try again next month.' )

View File

@ -483,6 +483,11 @@ class Controller( HydrusController.HydrusController ):
return self._app
def GetBandwidthManager( self ):
raise NotImplementedError()
def GetClientFilesManager( self ):
return self._client_files_manager

View File

@ -23,6 +23,7 @@ import HydrusPaths
import HydrusSerialisable
import HydrusTagArchive
import HydrusTags
import HydrusVideoHandling
import ClientConstants as CC
import os
import psutil
@ -1229,7 +1230,7 @@ class DB( HydrusDB.HydrusDB ):
existing_node_counter = collections.Counter()
# note this doesn't use the table_join
result = self._c.execute( 'SELECT smaller_hash_id, larger_hash_id FROM duplicate_pairs WHERE duplicate_type IN ( ?, ?, ? ) ORDER BY RANDOM() LIMIT 10000;', ( HC.DUPLICATE_SMALLER_BETTER, HC.DUPLICATE_LARGER_BETTER, HC.DUPLICATE_SAME_FILE ) ).fetchall()
result = self._c.execute( 'SELECT smaller_hash_id, larger_hash_id FROM duplicate_pairs WHERE duplicate_type IN ( ?, ?, ? ) ORDER BY RANDOM() LIMIT 10000;', ( HC.DUPLICATE_SMALLER_BETTER, HC.DUPLICATE_LARGER_BETTER, HC.DUPLICATE_SAME_QUALITY ) ).fetchall()
for ( smaller_hash_id, larger_hash_id ) in result:
@ -1367,12 +1368,12 @@ class DB( HydrusDB.HydrusDB ):
return
text = 'searched ' + HydrusData.ConvertValueRangeToPrettyString( total_done_previously + i, total_num_hash_ids_in_cache ) + ' files, found ' + HydrusData.ConvertIntToPrettyString( pairs_found ) + ' potential duplicate pairs'
job_key.SetVariable( 'popup_text_1', text )
job_key.SetVariable( 'popup_gauge_1', ( total_done_previously + i, total_num_hash_ids_in_cache ) )
if i % 100 == 0:
if i % 25 == 0:
text = 'searched ' + HydrusData.ConvertValueRangeToPrettyString( total_done_previously + i, total_num_hash_ids_in_cache ) + ' files'
job_key.SetVariable( 'popup_text_1', text )
job_key.SetVariable( 'popup_gauge_1', ( total_done_previously + i, total_num_hash_ids_in_cache ) )
HG.client_controller.pub( 'splash_set_status_text', text )
@ -1910,68 +1911,76 @@ class DB( HydrusDB.HydrusDB ):
return similar_hash_ids
def _CacheSimilarFilesSetDuplicatePairStatus( self, duplicate_type, hash_a, hash_b ):
def _CacheSimilarFilesSetDuplicatePairStatus( self, pair_info ):
if duplicate_type is None:
for ( duplicate_type, hash_a, hash_b, list_of_service_keys_to_content_updates ) in pair_info:
for service_keys_to_content_updates in list_of_service_keys_to_content_updates:
self._ProcessContentUpdates( service_keys_to_content_updates )
if duplicate_type is None:
hash_id_a = self._GetHashId( hash_a )
hash_id_b = self._GetHashId( hash_b )
smaller_hash_id = min( hash_id_a, hash_id_b )
larger_hash_id = max( hash_id_a, hash_id_b )
self._c.execute( 'DELETE FROM duplicate_pairs WHERE smaller_hash_id = ? AND larger_hash_id = ?;', ( smaller_hash_id, larger_hash_id ) )
return
if duplicate_type == HC.DUPLICATE_WORSE:
( hash_a, hash_b ) = ( hash_b, hash_a )
duplicate_type = HC.DUPLICATE_BETTER
hash_id_a = self._GetHashId( hash_a )
hash_id_b = self._GetHashId( hash_b )
smaller_hash_id = min( hash_id_a, hash_id_b )
larger_hash_id = max( hash_id_a, hash_id_b )
self._CacheSimilarFilesSetDuplicatePairStatusSingleRow( duplicate_type, hash_id_a, hash_id_b )
self._c.execute( 'DELETE FROM duplicate_pairs WHERE smaller_hash_id = ? AND larger_hash_id = ?;', ( smaller_hash_id, larger_hash_id ) )
return
if duplicate_type == HC.DUPLICATE_WORSE:
( hash_a, hash_b ) = ( hash_b, hash_a )
duplicate_type = HC.DUPLICATE_BETTER
hash_id_a = self._GetHashId( hash_a )
hash_id_b = self._GetHashId( hash_b )
self._CacheSimilarFilesSetDuplicatePairStatusSingleRow( duplicate_type, hash_id_a, hash_id_b )
if duplicate_type == HC.DUPLICATE_BETTER:
# anything better than A is now better than B
# i.e. for all X for which X > A, set X > B
# anything worse than B is now worse than A
# i.e. for all X for which B > X, set A > X
better_than_a = set()
better_than_a.update( self._STI( self._c.execute( 'SELECT smaller_hash_id FROM duplicate_pairs WHERE duplicate_type = ? AND larger_hash_id = ?;', ( HC.DUPLICATE_SMALLER_BETTER, hash_id_a ) ) ) )
better_than_a.update( self._STI( self._c.execute( 'SELECT larger_hash_id FROM duplicate_pairs WHERE duplicate_type = ? AND smaller_hash_id = ?;', ( HC.DUPLICATE_LARGER_BETTER, hash_id_a ) ) ) )
for better_than_a_hash_id in better_than_a:
if duplicate_type == HC.DUPLICATE_BETTER:
self._CacheSimilarFilesSetDuplicatePairStatusSingleRow( HC.DUPLICATE_BETTER, better_than_a_hash_id, hash_id_b )
# anything better than A is now better than B
# i.e. for all X for which X > A, set X > B
# anything worse than B is now worse than A
# i.e. for all X for which B > X, set A > X
better_than_a = set()
better_than_a.update( self._STI( self._c.execute( 'SELECT smaller_hash_id FROM duplicate_pairs WHERE duplicate_type = ? AND larger_hash_id = ?;', ( HC.DUPLICATE_SMALLER_BETTER, hash_id_a ) ) ) )
better_than_a.update( self._STI( self._c.execute( 'SELECT larger_hash_id FROM duplicate_pairs WHERE duplicate_type = ? AND smaller_hash_id = ?;', ( HC.DUPLICATE_LARGER_BETTER, hash_id_a ) ) ) )
for better_than_a_hash_id in better_than_a:
self._CacheSimilarFilesSetDuplicatePairStatusSingleRow( HC.DUPLICATE_BETTER, better_than_a_hash_id, hash_id_b )
worse_than_b = set()
worse_than_b.update( self._STI( self._c.execute( 'SELECT smaller_hash_id FROM duplicate_pairs WHERE duplicate_type = ? AND larger_hash_id = ?;', ( HC.DUPLICATE_LARGER_BETTER, hash_id_b ) ) ) )
worse_than_b.update( self._STI( self._c.execute( 'SELECT larger_hash_id FROM duplicate_pairs WHERE duplicate_type = ? AND smaller_hash_id = ?;', ( HC.DUPLICATE_SMALLER_BETTER, hash_id_b ) ) ) )
for worse_than_b_hash_id in worse_than_b:
self._CacheSimilarFilesSetDuplicatePairStatusSingleRow( HC.DUPLICATE_BETTER, hash_id_a, worse_than_b_hash_id )
worse_than_b = set()
worse_than_b.update( self._STI( self._c.execute( 'SELECT smaller_hash_id FROM duplicate_pairs WHERE duplicate_type = ? AND larger_hash_id = ?;', ( HC.DUPLICATE_LARGER_BETTER, hash_id_b ) ) ) )
worse_than_b.update( self._STI( self._c.execute( 'SELECT larger_hash_id FROM duplicate_pairs WHERE duplicate_type = ? AND smaller_hash_id = ?;', ( HC.DUPLICATE_SMALLER_BETTER, hash_id_b ) ) ) )
for worse_than_b_hash_id in worse_than_b:
if duplicate_type != HC.DUPLICATE_UNKNOWN:
self._CacheSimilarFilesSetDuplicatePairStatusSingleRow( HC.DUPLICATE_BETTER, hash_id_a, worse_than_b_hash_id )
self._CacheSimilarFilesSyncSameQualityDuplicates( hash_id_a )
self._CacheSimilarFilesSyncSameQualityDuplicates( hash_id_b )
self._CacheSimilarFilesSyncBetterWorseDuplicates( hash_id_a )
self._CacheSimilarFilesSyncBetterWorseDuplicates( hash_id_b )
if duplicate_type != HC.DUPLICATE_UNKNOWN:
self._CacheSimilarFilesSyncSameFileDuplicates( hash_id_a )
self._CacheSimilarFilesSyncSameFileDuplicates( hash_id_b )
self._CacheSimilarFilesSyncBetterWorseDuplicates( hash_id_a )
self._CacheSimilarFilesSyncBetterWorseDuplicates( hash_id_b )
@ -2054,7 +2063,7 @@ class DB( HydrusDB.HydrusDB ):
def _CacheSimilarFilesSyncSameFileDuplicates( self, hash_id ):
def _CacheSimilarFilesSyncSameQualityDuplicates( self, hash_id ):
# exactly similar files should have exactly the same relationships with other files
# so, replicate all of our file's known relationships to all of its 'same file' siblings
@ -2089,12 +2098,12 @@ class DB( HydrusDB.HydrusDB ):
all_relationships.add( ( larger_hash_id, duplicate_type ) )
all_same_file_siblings = set()
all_same_quality_siblings = set()
all_same_file_siblings.update( self._STI( self._c.execute( 'SELECT smaller_hash_id FROM duplicate_pairs WHERE duplicate_type = ? AND larger_hash_id = ?;', ( HC.DUPLICATE_SAME_FILE, hash_id ) ) ) )
all_same_file_siblings.update( self._STI( self._c.execute( 'SELECT larger_hash_id FROM duplicate_pairs WHERE duplicate_type = ? AND smaller_hash_id = ?;', ( HC.DUPLICATE_SAME_FILE, hash_id ) ) ) )
all_same_quality_siblings.update( self._STI( self._c.execute( 'SELECT smaller_hash_id FROM duplicate_pairs WHERE duplicate_type = ? AND larger_hash_id = ?;', ( HC.DUPLICATE_SAME_QUALITY, hash_id ) ) ) )
all_same_quality_siblings.update( self._STI( self._c.execute( 'SELECT larger_hash_id FROM duplicate_pairs WHERE duplicate_type = ? AND smaller_hash_id = ?;', ( HC.DUPLICATE_SAME_QUALITY, hash_id ) ) ) )
for sibling_hash_id in all_same_file_siblings:
for sibling_hash_id in all_same_quality_siblings:
for ( other_hash_id, duplicate_type ) in all_relationships:
@ -4775,6 +4784,34 @@ class DB( HydrusDB.HydrusDB ):
hash_ids_to_local_ratings = HydrusData.BuildKeyToListDict( ( ( hash_id, ( service_id, rating ) ) for ( service_id, hash_id, rating ) in self._SelectFromList( 'SELECT service_id, hash_id, rating FROM local_ratings WHERE hash_id IN %s;', hash_ids ) ) )
#
# Let's figure out if there is a common specific file service to this batch
file_service_id_counter = collections.Counter()
for file_service_ids_and_timestamps in hash_ids_to_current_file_service_ids_and_timestamps.values():
for ( file_service_id, timestamp ) in file_service_ids_and_timestamps:
file_service_id_counter[ file_service_id ] += 1
common_file_service_id = None
for ( file_service_id, count ) in file_service_id_counter.items():
if count == len( hash_ids ): # i.e. every hash has this file service
common_file_service_id = file_service_id
break
#
tag_data = []
tag_service_ids = self._GetServiceIds( HC.TAG_SERVICES )
@ -4783,9 +4820,20 @@ class DB( HydrusDB.HydrusDB ):
( current_mappings_table_name, deleted_mappings_table_name, pending_mappings_table_name, petitioned_mappings_table_name ) = GenerateMappingsTableNames( tag_service_id )
tag_data.extend( ( hash_id, ( tag_service_id, HC.CONTENT_STATUS_CURRENT, tag_id ) ) for ( hash_id, tag_id ) in self._SelectFromList( 'SELECT hash_id, tag_id FROM ' + current_mappings_table_name + ' WHERE hash_id IN %s;', hash_ids ) )
if common_file_service_id is None:
tag_data.extend( ( hash_id, ( tag_service_id, HC.CONTENT_STATUS_CURRENT, tag_id ) ) for ( hash_id, tag_id ) in self._SelectFromList( 'SELECT hash_id, tag_id FROM ' + current_mappings_table_name + ' WHERE hash_id IN %s;', hash_ids ) )
tag_data.extend( ( hash_id, ( tag_service_id, HC.CONTENT_STATUS_PENDING, tag_id ) ) for ( hash_id, tag_id ) in self._SelectFromList( 'SELECT hash_id, tag_id FROM ' + pending_mappings_table_name + ' WHERE hash_id IN %s;', hash_ids ) )
else:
( cache_files_table_name, cache_current_mappings_table_name, cache_pending_mappings_table_name, ac_cache_table_name ) = GenerateSpecificMappingsCacheTableNames( common_file_service_id, tag_service_id )
tag_data.extend( ( hash_id, ( tag_service_id, HC.CONTENT_STATUS_CURRENT, tag_id ) ) for ( hash_id, tag_id ) in self._SelectFromList( 'SELECT hash_id, tag_id FROM ' + cache_current_mappings_table_name + ' WHERE hash_id IN %s;', hash_ids ) )
tag_data.extend( ( hash_id, ( tag_service_id, HC.CONTENT_STATUS_PENDING, tag_id ) ) for ( hash_id, tag_id ) in self._SelectFromList( 'SELECT hash_id, tag_id FROM ' + cache_pending_mappings_table_name + ' WHERE hash_id IN %s;', hash_ids ) )
tag_data.extend( ( hash_id, ( tag_service_id, HC.CONTENT_STATUS_DELETED, tag_id ) ) for ( hash_id, tag_id ) in self._SelectFromList( 'SELECT hash_id, tag_id FROM ' + deleted_mappings_table_name + ' WHERE hash_id IN %s;', hash_ids ) )
tag_data.extend( ( hash_id, ( tag_service_id, HC.CONTENT_STATUS_PENDING, tag_id ) ) for ( hash_id, tag_id ) in self._SelectFromList( 'SELECT hash_id, tag_id FROM ' + pending_mappings_table_name + ' WHERE hash_id IN %s;', hash_ids ) )
tag_data.extend( ( hash_id, ( tag_service_id, HC.CONTENT_STATUS_PETITIONED, tag_id ) ) for ( hash_id, tag_id ) in self._SelectFromList( 'SELECT hash_id, tag_id FROM ' + petitioned_mappings_table_name + ' WHERE hash_id IN %s;', hash_ids ) )
@ -7668,6 +7716,46 @@ class DB( HydrusDB.HydrusDB ):
return result
def _RecheckVideoMetadata( self, hashes ):
job_key = ClientThreading.JobKey()
job_key.SetVariable( 'popup_title', 'rechecking video metadata' )
self._controller.pub( 'message', job_key )
client_files_manager = self._controller.GetClientFilesManager()
num_to_do = len( hashes )
for ( i, hash ) in enumerate( hashes ):
job_key.SetVariable( 'popup_text_1', 'processing ' + HydrusData.ConvertValueRangeToPrettyString( i + 1, num_to_do ) )
job_key.SetVariable( 'popup_gauge_1', ( i + 1, num_to_do ) )
hash_id = self._GetHashId( hash )
try:
mime = self._GetMime( hash_id )
except HydrusExceptions.FileMissingException:
continue
path = client_files_manager.LocklessGetFilePath( hash, mime )
( ( w, h ), duration, num_frames ) = HydrusVideoHandling.GetFFMPEGVideoProperties( path, count_frames_manually = True )
self._c.execute( 'UPDATE files_info SET width = ?, height = ?, duration = ?, num_frames = ? WHERE hash_id = ?;', ( w, h, duration, num_frames, hash_id ) )
job_key.SetVariable( 'popup_text_1', 'done!' )
job_key.DeleteVariable( 'popup_gauge_1' )
def _RelocateClientFiles( self, prefix, source, dest ):
full_source = os.path.join( source, prefix )
@ -9806,6 +9894,7 @@ class DB( HydrusDB.HydrusDB ):
elif action == 'maintain_similar_files_tree': result = self._CacheSimilarFilesMaintainTree( *args, **kwargs )
elif action == 'process_repository': result = self._ProcessRepositoryUpdates( *args, **kwargs )
elif action == 'push_recent_tags': result = self._PushRecentTags( *args, **kwargs )
elif action == 'recheck_video_metadata': result = self._RecheckVideoMetadata( *args, **kwargs )
elif action == 'regenerate_ac_cache': result = self._RegenerateACCache( *args, **kwargs )
elif action == 'regenerate_similar_files': result = self._CacheSimilarFilesRegenerateTree( *args, **kwargs )
elif action == 'relocate_client_files': result = self._RelocateClientFiles( *args, **kwargs )

View File

@ -289,6 +289,12 @@ def GetLighterDarkerColour( colour, intensity = 3 ):
else:
( r, g, b ) = colour.Get()
( r, g, b ) = [ max( value, 32 ) for value in ( r, g, b ) ]
colour = wx.Colour( r, g, b )
return wx.lib.colourutils.AdjustColour( colour, 5 * intensity )
@ -753,6 +759,8 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):
self._dictionary[ 'booleans' ] = {}
self._dictionary[ 'booleans' ][ 'advanced_mode' ] = False
self._dictionary[ 'booleans' ][ 'apply_all_parents_to_all_services' ] = False
self._dictionary[ 'booleans' ][ 'apply_all_siblings_to_all_services' ] = False
self._dictionary[ 'booleans' ][ 'filter_inbox_and_archive_predicates' ] = False
@ -788,7 +796,7 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):
self._dictionary[ 'duplicate_action_options' ] = HydrusSerialisable.SerialisableDictionary()
self._dictionary[ 'duplicate_action_options' ][ HC.DUPLICATE_BETTER ] = DuplicateActionOptions( [ ( CC.LOCAL_TAG_SERVICE_KEY, HC.CONTENT_MERGE_ACTION_MOVE, TagCensor() ) ], [], True, True )
self._dictionary[ 'duplicate_action_options' ][ HC.DUPLICATE_SAME_FILE ] = DuplicateActionOptions( [ ( CC.LOCAL_TAG_SERVICE_KEY, HC.CONTENT_MERGE_ACTION_TWO_WAY_MERGE, TagCensor() ) ], [], False, True )
self._dictionary[ 'duplicate_action_options' ][ HC.DUPLICATE_SAME_QUALITY ] = DuplicateActionOptions( [ ( CC.LOCAL_TAG_SERVICE_KEY, HC.CONTENT_MERGE_ACTION_TWO_WAY_MERGE, TagCensor() ) ], [], False, True )
self._dictionary[ 'duplicate_action_options' ][ HC.DUPLICATE_ALTERNATE ] = DuplicateActionOptions( [], [], False )
self._dictionary[ 'duplicate_action_options' ][ HC.DUPLICATE_NOT_DUPLICATE ] = DuplicateActionOptions( [], [], False )
@ -1878,7 +1886,7 @@ class Imageboard( HydrusData.HydrusYAMLBase ):
self._restrictions = restrictions
def IsOkToPost( self, media_result ):
def IsOKToPost( self, media_result ):
# deleted old code due to deprecation
@ -2497,7 +2505,7 @@ class TagCensor( HydrusSerialisable.SerialisableBase ):
self._tag_slices_to_rules = dict( serialisable_info )
def _TagOk( self, tag ):
def _TagOK( self, tag ):
rules = self._GetRulesForTag( tag )
@ -2519,7 +2527,7 @@ class TagCensor( HydrusSerialisable.SerialisableBase ):
with self._lock:
return { tag for tag in tags if self._TagOk( tag ) }
return { tag for tag in tags if self._TagOK( tag ) }

View File

@ -33,7 +33,7 @@ def GetClientDefaultOptions():
options[ 'idle_cpu_max' ] = 50
options[ 'idle_normal' ] = True
options[ 'idle_shutdown' ] = CC.IDLE_ON_SHUTDOWN_ASK_FIRST
options[ 'idle_shutdown_max_minutes' ] = 30
options[ 'idle_shutdown_max_minutes' ] = 5
options[ 'maintenance_delete_orphans_period' ] = 86400 * 3
options[ 'trash_max_age' ] = 72
options[ 'trash_max_size' ] = 512

View File

@ -214,7 +214,10 @@ def GetSoup( html ):
def GetYoutubeFormats( youtube_url ):
try: p = pafy.new( youtube_url )
try:
p = pafy.new( youtube_url )
except Exception as e:
raise Exception( 'Could not fetch video info from youtube!' + os.linesep + HydrusData.ToUnicode( e ) )

View File

@ -36,6 +36,7 @@ import HydrusTagArchive
import HydrusVideoHandling
import os
import PIL
import shlex
import sqlite3
import ssl
import subprocess
@ -446,7 +447,7 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
HydrusData.ShowText( u'Starting server\u2026' )
db_dir = '-d=' + self._controller.GetDBDir()
db_param = '-d="' + self._controller.GetDBDir() + '"'
if HC.PLATFORM_WINDOWS:
@ -458,8 +459,8 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
if os.path.exists( server_frozen_path ):
subprocess.Popen( [ server_frozen_path, db_dir ] )
cmd = '"' + server_frozen_path + '" ' + db_param
else:
@ -475,8 +476,12 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
python_executable = python_executable.replace( 'pythonw', 'python' )
subprocess.Popen( [ python_executable, os.path.join( HC.BASE_DIR, 'server.py' ), db_dir ] )
server_script_path = os.path.join( HC.BASE_DIR, 'server.py' )
cmd = '"' + python_executable + '" "' + server_script_path + '" ' + db_param
subprocess.Popen( shlex.split( cmd ) )
time_waited = 0
@ -1504,6 +1509,13 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
ClientGUIMenus.AppendMenuItem( self, menu, 'help', 'Open hydrus\'s local help in your web browser.', webbrowser.open, 'file://' + HC.HELP_DIR + '/index.html' )
check_manager = ClientGUICommon.CheckboxManagerOptions( 'advanced_mode' )
current_value = check_manager.GetCurrentValue()
func = check_manager.Invert
ClientGUIMenus.AppendMenuCheckItem( self, menu, 'advanced mode', 'Turn on advanced menu options and buttons.', current_value, func )
dont_know = wx.Menu()
ClientGUIMenus.AppendMenuItem( self, dont_know, 'just set up some repositories for me, please', 'This will add the hydrus dev\'s two repositories to your client.', self._AutoRepoSetup )

View File

@ -2375,7 +2375,7 @@ class CanvasPanel( Canvas ):
def EventMenu( self, event ):
# is None bit means this is prob from a keydown->menu event
if event.GetEventObject() is None or not self._CanProcessInput():
if event.GetEventObject() is None:
event.Skip()
@ -3042,6 +3042,8 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
def _CommitProcessed( self ):
pair_info = []
for ( hash_pair, duplicate_type, first_media, second_media, duplicate_action_options, was_auto_skipped ) in self._processed_pairs:
if duplicate_type == HC.DUPLICATE_UNKNOWN:
@ -3049,17 +3051,17 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
continue # it was a 'skip' decision
list_of_service_keys_to_content_updates = duplicate_action_options.ProcessPairIntoContentUpdates( first_media, second_media )
for service_keys_to_content_updates in list_of_service_keys_to_content_updates:
HG.client_controller.Write( 'content_updates', service_keys_to_content_updates )
first_hash = first_media.GetHash()
second_hash = second_media.GetHash()
HG.client_controller.WriteSynchronous( 'duplicate_pair_status', duplicate_type, first_hash, second_hash )
list_of_service_keys_to_content_updates = duplicate_action_options.ProcessPairIntoContentUpdates( first_media, second_media )
pair_info.append( ( duplicate_type, first_hash, second_hash, list_of_service_keys_to_content_updates ) )
if len( pair_info ) > 0:
HG.client_controller.WriteSynchronous( 'duplicate_pair_status', pair_info )
self._processed_pairs = []
@ -3071,6 +3073,67 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
self._ProcessPair( HC.DUPLICATE_BETTER )
def _Delete( self, service_key = None ):
if self._current_media is None:
return
if service_key is None:
locations_manager = self._current_media.GetLocationsManager()
if CC.LOCAL_FILE_SERVICE_KEY in locations_manager.GetCurrent():
service_key = CC.LOCAL_FILE_SERVICE_KEY
elif CC.TRASH_SERVICE_KEY in locations_manager.GetCurrent():
service_key = CC.TRASH_SERVICE_KEY
else:
return
if service_key == CC.LOCAL_FILE_SERVICE_KEY:
text = 'Send just this file to the trash, or both?'
elif service_key == CC.TRASH_SERVICE_KEY:
text = 'Permanently delete just this file, or both?'
yes_tuples = []
yes_tuples.append( ( 'delete just this one', 'current' ) )
yes_tuples.append( ( 'delete both', 'both' ) )
with ClientGUIDialogs.DialogYesYesNo( self, text, yes_tuples = yes_tuples, no_label = 'forget it' ) as dlg:
if dlg.ShowModal() == wx.ID_YES:
value = dlg.GetValue()
if value == 'current':
hashes = { self._current_media.GetHash() }
elif value == 'both':
hashes = { self._current_media.GetHash(), self._media_list.GetNext( self._current_media ).GetHash() }
HG.client_controller.Write( 'content_updates', { service_key : [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_FILES, HC.CONTENT_UPDATE_DELETE, hashes ) ] } )
self.SetFocus() # annoying bug because of the modal dialog
def _DoCustomAction( self ):
if self._current_media is None:
@ -3078,7 +3141,7 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
return
duplicate_types = [ HC.DUPLICATE_BETTER, HC.DUPLICATE_SAME_FILE, HC.DUPLICATE_ALTERNATE, HC.DUPLICATE_NOT_DUPLICATE ]
duplicate_types = [ HC.DUPLICATE_BETTER, HC.DUPLICATE_SAME_QUALITY, HC.DUPLICATE_ALTERNATE, HC.DUPLICATE_NOT_DUPLICATE ]
choice_tuples = [ ( HC.duplicate_type_string_lookup[ duplicate_type ], duplicate_type ) for duplicate_type in duplicate_types ]
@ -3251,7 +3314,7 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
def _MediaAreTheSame( self ):
self._ProcessPair( HC.DUPLICATE_SAME_FILE )
self._ProcessPair( HC.DUPLICATE_SAME_QUALITY )
def _ProcessApplicationCommand( self, command ):
@ -3604,7 +3667,6 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
wx.CallLater( 100, catch_up )
@ -4218,39 +4280,32 @@ class CanvasMediaListFilterArchiveDelete( CanvasMediaList ):
def EventMenu( self, event ):
if self._CanProcessInput():
action = ClientCaches.MENU_EVENT_ID_TO_ACTION_CACHE.GetAction( event.GetId() )
if action is not None:
action = ClientCaches.MENU_EVENT_ID_TO_ACTION_CACHE.GetAction( event.GetId() )
( command, data ) = action
if action is not None:
if command == 'archive_file': self._Keep()
elif command == 'back': self._Back()
elif command == 'close': self._Close()
elif command == 'delete_file': self.EventDelete( event )
elif command == 'switch_between_fullscreen_borderless_and_regular_framed_window': self.GetParent().FullscreenSwitch()
elif command == 'launch_the_archive_delete_filter': self._Close()
elif command == 'move_animation_to_previous_frame': self._media_container.GotoPreviousOrNextFrame( -1 )
elif command == 'move_animation_to_next_frame': self._media_container.GotoPreviousOrNextFrame( 1 )
elif command == 'manage_file_ratings': self._ManageRatings()
elif command == 'manage_file_tags': wx.CallAfter( self._ManageTags )
elif command in ( 'pan_up', 'pan_down', 'pan_left', 'pan_right' ):
( command, data ) = action
if command == 'pan_up': self._DoManualPan( 0, -1 )
elif command == 'pan_down': self._DoManualPan( 0, 1 )
elif command == 'pan_left': self._DoManualPan( -1, 0 )
elif command == 'pan_right': self._DoManualPan( 1, 0 )
if command == 'archive_file': self._Keep()
elif command == 'back': self._Back()
elif command == 'close': self._Close()
elif command == 'delete_file': self.EventDelete( event )
elif command == 'switch_between_fullscreen_borderless_and_regular_framed_window': self.GetParent().FullscreenSwitch()
elif command == 'launch_the_archive_delete_filter': self._Close()
elif command == 'move_animation_to_previous_frame': self._media_container.GotoPreviousOrNextFrame( -1 )
elif command == 'move_animation_to_next_frame': self._media_container.GotoPreviousOrNextFrame( 1 )
elif command == 'manage_file_ratings': self._ManageRatings()
elif command == 'manage_file_tags': wx.CallAfter( self._ManageTags )
elif command in ( 'pan_up', 'pan_down', 'pan_left', 'pan_right' ):
if command == 'pan_up': self._DoManualPan( 0, -1 )
elif command == 'pan_down': self._DoManualPan( 0, 1 )
elif command == 'pan_left': self._DoManualPan( -1, 0 )
elif command == 'pan_right': self._DoManualPan( 1, 0 )
elif command == 'zoom_in': self._ZoomIn()
elif command == 'zoom_out': self._ZoomOut()
else: event.Skip()
else:
event.Skip()
elif command == 'zoom_in': self._ZoomIn()
elif command == 'zoom_out': self._ZoomOut()
else: event.Skip()
@ -4549,7 +4604,7 @@ class CanvasMediaListBrowser( CanvasMediaListNavigable ):
def EventMenu( self, event ):
# is None bit means this is prob from a keydown->menu event
if event.GetEventObject() is None or not self._CanProcessInput():
if event.GetEventObject() is None:
event.Skip()
@ -4688,7 +4743,7 @@ class CanvasMediaListBrowser( CanvasMediaListNavigable ):
ClientGUIMenus.AppendMenuItem( self, menu, 'return to inbox', 'Put this file back in the inbox.', self._Inbox )
menu.Append( ClientCaches.MENU_EVENT_ID_TO_ACTION_CACHE.GetTemporaryId( 'remove_file_from_view' ), '&remove' )
ClientGUIMenus.AppendMenuItem( self, menu, 'remove', 'Remove this file from the list you are viewing.', self._Remove )
if CC.LOCAL_FILE_SERVICE_KEY in locations_manager.GetCurrent():

View File

@ -361,7 +361,7 @@ class DialogGenerateNewAccounts( Dialog ):
self._lifetime = ClientGUICommon.BetterChoice( self )
self._ok = wx.Button( self, label = 'Ok' )
self._ok = wx.Button( self, label = 'OK' )
self._ok.Bind( wx.EVT_BUTTON, self.EventOK )
self._ok.SetForegroundColour( ( 0, 128, 0 ) )
@ -627,7 +627,7 @@ class DialogInputFileSystemPredicates( Dialog ):
self._predicate_panel = predicate_class( self )
self._ok = wx.Button( self, id = wx.ID_OK, label = 'Ok' )
self._ok = wx.Button( self, id = wx.ID_OK, label = 'OK' )
self._ok.Bind( wx.EVT_BUTTON, self.EventOK )
self._ok.SetForegroundColour( ( 0, 128, 0 ) )
@ -1241,7 +1241,7 @@ class DialogInputNamespaceRegex( Dialog ):
self._regex_intro_link = wx.HyperlinkCtrl( self, id = -1, label = 'a good regex introduction', url = 'http://www.aivosto.com/vbtips/regex.html' )
self._regex_practise_link = wx.HyperlinkCtrl( self, id = -1, label = 'regex practise', url = 'http://regexr.com/3cvmf' )
self._ok = wx.Button( self, id = wx.ID_OK, label = 'Ok' )
self._ok = wx.Button( self, id = wx.ID_OK, label = 'OK' )
self._ok.Bind( wx.EVT_BUTTON, self.EventOK )
self._ok.SetForegroundColour( ( 0, 128, 0 ) )
@ -1340,7 +1340,7 @@ class DialogInputNewFormField( Dialog ):
self._editable = wx.CheckBox( self )
self._ok = wx.Button( self, id = wx.ID_OK, label = 'Ok' )
self._ok = wx.Button( self, id = wx.ID_OK, label = 'OK' )
self._ok.SetForegroundColour( ( 0, 128, 0 ) )
self._cancel = wx.Button( self, id = wx.ID_CANCEL, label = 'Cancel' )
@ -1413,9 +1413,9 @@ class DialogInputTags( Dialog ):
expand_parents = True
self._tag_box = ClientGUIACDropdown.AutoCompleteDropdownTagsWrite( self, self.EnterTags, expand_parents, CC.LOCAL_FILE_SERVICE_KEY, service_key, null_entry_callable = self.Ok )
self._tag_box = ClientGUIACDropdown.AutoCompleteDropdownTagsWrite( self, self.EnterTags, expand_parents, CC.LOCAL_FILE_SERVICE_KEY, service_key, null_entry_callable = self.OK )
self._ok = wx.Button( self, id= wx.ID_OK, label = 'Ok' )
self._ok = wx.Button( self, id= wx.ID_OK, label = 'OK' )
self._ok.SetForegroundColour( ( 0, 128, 0 ) )
self._cancel = wx.Button( self, id = wx.ID_CANCEL, label = 'Cancel' )
@ -1474,7 +1474,7 @@ class DialogInputTags( Dialog ):
return self._tags.GetTags()
def Ok( self ):
def OK( self ):
self.EndModal( wx.ID_OK )
@ -1487,7 +1487,7 @@ class DialogInputTimeDelta( Dialog ):
self._time_delta = ClientGUICommon.TimeDeltaCtrl( self, min = min, days = days, hours = hours, minutes = minutes, seconds = seconds, monthly_allowed = monthly_allowed )
self._ok = wx.Button( self, id = wx.ID_OK, label = 'Ok' )
self._ok = wx.Button( self, id = wx.ID_OK, label = 'OK' )
self._ok.SetForegroundColour( ( 0, 128, 0 ) )
self._cancel = wx.Button( self, id = wx.ID_CANCEL, label = 'Cancel' )
@ -1538,7 +1538,7 @@ class DialogInputUPnPMapping( Dialog ):
self._description = wx.TextCtrl( self )
self._duration = wx.SpinCtrl( self, min = 0, max = 86400 )
self._ok = wx.Button( self, id = wx.ID_OK, label = 'Ok' )
self._ok = wx.Button( self, id = wx.ID_OK, label = 'OK' )
self._ok.SetForegroundColour( ( 0, 128, 0 ) )
self._cancel = wx.Button( self, id = wx.ID_CANCEL, label = 'Cancel' )
@ -1617,7 +1617,7 @@ class DialogModifyAccounts( Dialog ):
self._account_types = wx.Choice( self._account_types_panel )
self._account_types_ok = wx.Button( self._account_types_panel, label = 'Ok' )
self._account_types_ok = wx.Button( self._account_types_panel, label = 'OK' )
self._account_types_ok.Bind( wx.EVT_BUTTON, self.EventChangeAccountType )
#
@ -1626,12 +1626,12 @@ class DialogModifyAccounts( Dialog ):
self._add_to_expires = wx.Choice( self._expiration_panel )
self._add_to_expires_ok = wx.Button( self._expiration_panel, label = 'Ok' )
self._add_to_expires_ok = wx.Button( self._expiration_panel, label = 'OK' )
self._add_to_expires_ok.Bind( wx.EVT_BUTTON, self.EventAddToExpires )
self._set_expires = wx.Choice( self._expiration_panel )
self._set_expires_ok = wx.Button( self._expiration_panel, label = 'Ok' )
self._set_expires_ok = wx.Button( self._expiration_panel, label = 'OK' )
self._set_expires_ok.Bind( wx.EVT_BUTTON, self.EventSetExpires )
#
@ -3874,3 +3874,74 @@ class DialogYesNo( Dialog ):
wx.CallAfter( self._yes.SetFocus )
class DialogYesYesNo( Dialog ):
def __init__( self, parent, message, title = 'Are you sure?', yes_tuples = None, no_label = 'no' ):
if yes_tuples is None:
yes_tuples = [ ( 'yes', 'yes' ) ]
Dialog.__init__( self, parent, title, position = 'center' )
self._value = None
yes_buttons = []
for ( label, data ) in yes_tuples:
yes_button = ClientGUICommon.BetterButton( self, label, self._DoYes, data )
yes_button.SetForegroundColour( ( 0, 128, 0 ) )
yes_buttons.append( yes_button )
self._no = wx.Button( self, id = wx.ID_NO )
self._no.SetForegroundColour( ( 128, 0, 0 ) )
self._no.SetLabelText( no_label )
self._hidden_cancel = wx.Button( self, id = wx.ID_CANCEL, size = ( 0, 0 ) )
#
hbox = wx.BoxSizer( wx.HORIZONTAL )
for yes_button in yes_buttons:
hbox.AddF( yes_button, CC.FLAGS_SMALL_INDENT )
hbox.AddF( self._no, CC.FLAGS_SMALL_INDENT )
vbox = wx.BoxSizer( wx.VERTICAL )
text = ClientGUICommon.BetterStaticText( self, message )
text.Wrap( 480 )
vbox.AddF( text, CC.FLAGS_BIG_INDENT )
vbox.AddF( hbox, CC.FLAGS_BUTTON_SIZER )
self.SetSizer( vbox )
( x, y ) = self.GetEffectiveMinSize()
x = max( x, 250 )
self.SetInitialSize( ( x, y ) )
wx.CallAfter( yes_buttons[0].SetFocus )
def _DoYes( self, value ):
self._value = value
self.EndModal( wx.ID_YES )
def GetValue( self ):
return self._value
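A usage sketch for the new dialog, lifted from the duplicate filter's delete prompt earlier in this commit (parent and text are whatever the caller supplies):

yes_tuples = []

yes_tuples.append( ( 'delete just this one', 'current' ) )
yes_tuples.append( ( 'delete both', 'both' ) )

with ClientGUIDialogs.DialogYesYesNo( parent, text, yes_tuples = yes_tuples, no_label = 'forget it' ) as dlg:
    
    if dlg.ShowModal() == wx.ID_YES:
        
        value = dlg.GetValue() # 'current' or 'both', depending on which yes button was hit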

View File

@ -88,7 +88,7 @@ class DialogManage4chanPass( ClientGUIDialogs.Dialog ):
self._reauthenticate = wx.Button( self, label = 'reauthenticate' )
self._reauthenticate.Bind( wx.EVT_BUTTON, self.EventReauthenticate )
self._ok = wx.Button( self, id = wx.ID_OK, label = 'Ok' )
self._ok = wx.Button( self, id = wx.ID_OK, label = 'OK' )
self._ok.Bind( wx.EVT_BUTTON, self.EventOK )
self._ok.SetForegroundColour( ( 0, 128, 0 ) )
@ -2918,7 +2918,7 @@ class DialogManagePixivAccount( ClientGUIDialogs.Dialog ):
self._test = wx.Button( self, label = 'test' )
self._test.Bind( wx.EVT_BUTTON, self.EventTest )
self._ok = wx.Button( self, id = wx.ID_OK, label = 'Ok' )
self._ok = wx.Button( self, id = wx.ID_OK, label = 'OK' )
self._ok.Bind( wx.EVT_BUTTON, self.EventOK )
self._ok.SetForegroundColour( ( 0, 128, 0 ) )

View File

@ -557,7 +557,7 @@ class FullscreenHoverFrameTopDuplicatesFilter( FullscreenHoverFrameTopNavigable
menu_items = []
menu_items.append( ( 'normal', 'edit duplicate action options for \'this is better\'', 'edit what content is merged when you filter files', HydrusData.Call( self._EditMergeOptions, HC.DUPLICATE_BETTER ) ) )
menu_items.append( ( 'normal', 'edit duplicate action options for \'exact duplicates\'', 'edit what content is merged when you filter files', HydrusData.Call( self._EditMergeOptions, HC.DUPLICATE_SAME_FILE ) ) )
menu_items.append( ( 'normal', 'edit duplicate action options for \'same quality\'', 'edit what content is merged when you filter files', HydrusData.Call( self._EditMergeOptions, HC.DUPLICATE_SAME_QUALITY ) ) )
menu_items.append( ( 'normal', 'edit duplicate action options for \'alternates\'', 'edit what content is merged when you filter files', HydrusData.Call( self._EditMergeOptions, HC.DUPLICATE_ALTERNATE ) ) )
menu_items.append( ( 'normal', 'edit duplicate action options for \'not duplicates\'', 'edit what content is merged when you filter files', HydrusData.Call( self._EditMergeOptions, HC.DUPLICATE_NOT_DUPLICATE ) ) )
menu_items.append( ( 'separator', None, None, None ) )
@ -572,7 +572,7 @@ class FullscreenHoverFrameTopDuplicatesFilter( FullscreenHoverFrameTopNavigable
dupe_commands = []
dupe_commands.append( ( 'this is better', 'Set that the current file you are looking at is better than the other in the pair.', ClientData.ApplicationCommand( CC.APPLICATION_COMMAND_TYPE_SIMPLE, 'duplicate_filter_this_is_better' ) ) )
dupe_commands.append( ( 'exact duplicates', 'Set that the two files are, as far as you can tell, exactly the same.', ClientData.ApplicationCommand( CC.APPLICATION_COMMAND_TYPE_SIMPLE, 'duplicate_filter_exactly_the_same' ) ) )
dupe_commands.append( ( 'same quality', 'Set that the two files are duplicates of very similar quality.', ClientData.ApplicationCommand( CC.APPLICATION_COMMAND_TYPE_SIMPLE, 'duplicate_filter_exactly_the_same' ) ) )
dupe_commands.append( ( 'alternates', 'Set that the files are not duplicates, but that one is derived from the other or that they are both descendants of a common ancestor.', ClientData.ApplicationCommand( CC.APPLICATION_COMMAND_TYPE_SIMPLE, 'duplicate_filter_alternates' ) ) )
dupe_commands.append( ( 'not duplicates', 'Set that the files are not duplicates or otherwise related--that this pair is a false-positive match.', ClientData.ApplicationCommand( CC.APPLICATION_COMMAND_TYPE_SIMPLE, 'duplicate_filter_not_dupes' ) ) )
dupe_commands.append( ( 'custom action', 'Choose one of the other actions but customise the merge and delete options for this specific decision.', ClientData.ApplicationCommand( CC.APPLICATION_COMMAND_TYPE_SIMPLE, 'duplicate_filter_custom_action' ) ) )

View File

@ -1345,6 +1345,8 @@ class ListBoxTagsCensorship( ListBoxTags ):
self._RemoveTerm( tag )
self._ordered_terms.sort()
self._DataHasChanged()
@ -1379,6 +1381,8 @@ class ListBoxTagsCensorship( ListBoxTags ):
self._AppendTerm( tag )
self._ordered_terms.sort()
self._DataHasChanged()
@ -1396,6 +1400,8 @@ class ListBoxTagsCensorship( ListBoxTags ):
self._ordered_terms.sort()
self._DataHasChanged()
@ -1406,6 +1412,8 @@ class ListBoxTagsCensorship( ListBoxTags ):
self._RemoveTerm( tag )
self._ordered_terms.sort()
self._DataHasChanged()

View File

@ -718,6 +718,7 @@ class ManagementPanelDuplicateFilter( ManagementPanel ):
self._job = None
self._job_key = None
self._in_break = False
menu_items = []
@ -778,7 +779,7 @@ class ManagementPanelDuplicateFilter( ManagementPanel ):
self._num_unknown_duplicates = wx.StaticText( self._filtering_panel )
self._num_better_duplicates = wx.StaticText( self._filtering_panel )
self._num_better_duplicates.SetToolTipString( 'If this stays at 0, it is likely because your \'worse\' files are being deleted and so are leaving this file domain!' )
self._num_same_file_duplicates = wx.StaticText( self._filtering_panel )
self._num_same_quality_duplicates = wx.StaticText( self._filtering_panel )
self._num_alternate_duplicates = wx.StaticText( self._filtering_panel )
self._show_some_dupes = ClientGUICommon.BetterButton( self._filtering_panel, 'show some random pairs', self._ShowSomeDupes )
self._launch_filter = ClientGUICommon.BetterButton( self._filtering_panel, 'launch the filter', self._LaunchFilter )
@ -834,7 +835,7 @@ class ManagementPanelDuplicateFilter( ManagementPanel ):
self._filtering_panel.AddF( self._file_domain_button, CC.FLAGS_EXPAND_PERPENDICULAR )
self._filtering_panel.AddF( self._num_unknown_duplicates, CC.FLAGS_EXPAND_PERPENDICULAR )
self._filtering_panel.AddF( self._num_better_duplicates, CC.FLAGS_EXPAND_PERPENDICULAR )
self._filtering_panel.AddF( self._num_same_file_duplicates, CC.FLAGS_EXPAND_PERPENDICULAR )
self._filtering_panel.AddF( self._num_same_quality_duplicates, CC.FLAGS_EXPAND_PERPENDICULAR )
self._filtering_panel.AddF( self._num_alternate_duplicates, CC.FLAGS_EXPAND_PERPENDICULAR )
self._filtering_panel.AddF( self._show_some_dupes, CC.FLAGS_EXPAND_PERPENDICULAR )
self._filtering_panel.AddF( self._launch_filter, CC.FLAGS_EXPAND_PERPENDICULAR )
@ -986,11 +987,11 @@ class ManagementPanelDuplicateFilter( ManagementPanel ):
message += os.linesep * 2
message += 'potential - This is the default state newly discovered pairs are assigned. They will be loaded in the filter for you to look at.'
message += os.linesep * 2
message += 'better/worse - This tells the client that the pair of files are exactly the same--except that the one you are looking at has better image quality or resolution or lacks an annoying watermark and so on.'
message += 'better/worse - This tells the client that the pair of files are duplicates--but the one you are looking at has better image quality or resolution or lacks an annoying watermark, and so on.'
message += os.linesep * 2
message += 'exact duplicates - This tells the client that the pair of files are exactly the same, and that you cannot discern any quality difference.'
message += 'same quality - This tells the client that the pair of files are duplicates, and that you cannot discern an obvious quality difference.'
message += os.linesep * 2
message += 'alternates - This tells the client that the pair of files are not exactly the same but that they are related--perhaps they are a recolour or are an artist\'s different versions of a particular scene. A future version of the client will allow you to further process these alternate groups into family structures and so on.'
message += 'alternates - This tells the client that the pair of files are not duplicates but that they are related--perhaps they are a recolour or are an artist\'s different versions of a particular scene. A future version of the client will allow you to further process these alternate groups into family structures and so on.'
message += os.linesep * 2
message += 'not duplicates - This tells the client that the discovered pair is a false positive--they are not the same and are not otherwise related. This usually happens when the same part of two files have a similar shape by accident, such as if a hair fringe and a mountain range happen to line up.'
@ -1059,13 +1060,29 @@ class ManagementPanelDuplicateFilter( ManagementPanel ):
def _UpdateJob( self ):
if self._job_key.TimeRunning() > 30:
if self._in_break:
if HG.client_controller.DBCurrentlyDoingJob():
return
else:
self._in_break = False
self._StartStopDBJob()
return
if self._job_key.TimeRunning() > 10:
self._job_key.Cancel()
self._job_key = None
self._StartStopDBJob()
self._in_break = True
return
@ -1199,7 +1216,7 @@ class ManagementPanelDuplicateFilter( ManagementPanel ):
self._num_unknown_duplicates.SetLabelText( HydrusData.ConvertIntToPrettyString( num_unknown ) + ' potential pairs.' )
self._num_better_duplicates.SetLabelText( HydrusData.ConvertIntToPrettyString( duplicate_types_to_count[ HC.DUPLICATE_BETTER ] ) + ' better/worse pairs.' )
self._num_same_file_duplicates.SetLabelText( HydrusData.ConvertIntToPrettyString( duplicate_types_to_count[ HC.DUPLICATE_SAME_FILE ] ) + ' exact duplicate pairs.' )
self._num_same_quality_duplicates.SetLabelText( HydrusData.ConvertIntToPrettyString( duplicate_types_to_count[ HC.DUPLICATE_SAME_QUALITY ] ) + ' same quality pairs.' )
self._num_alternate_duplicates.SetLabelText( HydrusData.ConvertIntToPrettyString( duplicate_types_to_count[ HC.DUPLICATE_ALTERNATE ] ) + ' alternate pairs.' )
if num_unknown > 0:
@ -3056,10 +3073,27 @@ class ManagementPanelPetitions( ManagementPanel ):
def key( c ):
return c.GetVirtualWeight()
if c.GetContentType() in ( HC.CONTENT_TYPE_TAG_SIBLINGS, HC.CONTENT_TYPE_TAG_PARENTS ):
( part_two, part_one ) = c.GetContentData()
elif c.GetContentType() == HC.CONTENT_TYPE_MAPPINGS:
( tag, hashes ) = c.GetContentData()
part_one = tag
part_two = None
else:
part_one = None
part_two = None
return ( -c.GetVirtualWeight(), part_one, part_two )
contents.sort( key = key, reverse = True )
contents.sort( key = key )
self._contents.Clear()
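A minimal illustration of the new ordering (stand-in tuples rather than real petition objects): because the key already negates the weight, the reverse flag is dropped, and petitions come out heaviest first, then alphabetically by the tag parts.

# stand-in tuples: ( weight, part_one, part_two )
petitions = [ ( 5, 'tag b', None ), ( 9, 'tag a', None ), ( 5, 'tag a', None ) ]

petitions.sort( key = lambda p: ( -p[0], p[1], p[2] ) )

# -> [ ( 9, 'tag a', None ), ( 5, 'tag a', None ), ( 5, 'tag b', None ) ]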

View File

@ -446,6 +446,28 @@ class MediaPanel( ClientMedia.ListeningMediaList, wx.ScrolledWindow ):
HG.client_controller.Write( 'content_updates', { CC.COMBINED_LOCAL_FILE_SERVICE_KEY : [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_FILES, HC.CONTENT_UPDATE_PEND, hashes ) ] } )
def _EditDuplicateActionOptions( self, duplicate_type ):
new_options = HG.client_controller.GetNewOptions()
duplicate_action_options = new_options.GetDuplicateActionOptions( duplicate_type )
with ClientGUITopLevelWindows.DialogEdit( self, 'edit duplicate merge options' ) as dlg_2:
panel = ClientGUIScrolledPanelsEdit.EditDuplicateActionOptionsPanel( dlg_2, duplicate_type, duplicate_action_options )
dlg_2.SetPanel( panel )
if dlg_2.ShowModal() == wx.ID_OK:
duplicate_action_options = panel.GetValue()
new_options.SetDuplicateActionOptions( duplicate_type, duplicate_action_options )
def _FullScreen( self, first_media = None ):
if self._focussed_media is not None:
@ -897,6 +919,26 @@ class MediaPanel( ClientMedia.ListeningMediaList, wx.ScrolledWindow ):
def _OpenFileLocation( self ):
if self._focussed_media is not None:
if self._focussed_media.GetLocationsManager().IsLocal():
hash = self._focussed_media.GetHash()
mime = self._focussed_media.GetMime()
client_files_manager = HG.client_controller.GetClientFilesManager()
path = client_files_manager.GetFilePath( hash, mime )
self._SetFocussedMedia( None )
HydrusPaths.OpenFileLocation( path )
def _PetitionFiles( self, remote_service_key ):
hashes = self._GetSelectedHashes()
@ -1034,6 +1076,30 @@ class MediaPanel( ClientMedia.ListeningMediaList, wx.ScrolledWindow ):
def _RecalculateVirtualSize( self ): pass
def _RecheckVideoMetadata( self ):
flat_media = self._GetSelectedFlatMedia()
hashes = { media.GetHash() for media in flat_media if media.GetMime() in HC.VIDEO }
if len( hashes ) > 0:
text = 'This will reparse the ' + HydrusData.ConvertIntToPrettyString( len( hashes ) ) + ' selected videos using a slower but more accurate routine.'
text += os.linesep * 2
text += 'If you see videos that seem to render too fast or cut off half way through, this may fix it.'
text += os.linesep * 2
text += 'It may take some time to reparse the files, and you will need to refresh your search to see the updated videos.'
with ClientGUIDialogs.DialogYesNo( self, text ) as dlg:
if dlg.ShowModal() == wx.ID_YES:
HG.client_controller.Write( 'recheck_video_metadata', hashes )
def _RedrawMedia( self, media ): pass
def _Remove( self ):
@ -1161,22 +1227,28 @@ class MediaPanel( ClientMedia.ListeningMediaList, wx.ScrolledWindow ):
if dlg.ShowModal() == wx.ID_YES:
pair_info = []
for ( first_media, second_media ) in media_pairs:
if duplicate_action_options is not None:
list_of_service_keys_to_content_updates = duplicate_action_options.ProcessPairIntoContentUpdates( first_media, second_media )
for service_keys_to_content_updates in list_of_service_keys_to_content_updates:
HG.client_controller.Write( 'content_updates', service_keys_to_content_updates )
first_hash = first_media.GetHash()
second_hash = second_media.GetHash()
HG.client_controller.WriteSynchronous( 'duplicate_pair_status', duplicate_type, first_hash, second_hash )
if duplicate_action_options is None:
list_of_service_keys_to_content_updates = []
else:
list_of_service_keys_to_content_updates = duplicate_action_options.ProcessPairIntoContentUpdates( first_media, second_media )
pair_info.append( ( duplicate_type, first_hash, second_hash, list_of_service_keys_to_content_updates ) )
if len( pair_info ) > 0:
HG.client_controller.WriteSynchronous( 'duplicate_pair_status', pair_info )
@ -1184,7 +1256,7 @@ class MediaPanel( ClientMedia.ListeningMediaList, wx.ScrolledWindow ):
def _SetDuplicatesCustom( self ):
duplicate_types = [ HC.DUPLICATE_BETTER, HC.DUPLICATE_SAME_FILE, HC.DUPLICATE_ALTERNATE, HC.DUPLICATE_NOT_DUPLICATE ]
duplicate_types = [ HC.DUPLICATE_BETTER, HC.DUPLICATE_SAME_QUALITY, HC.DUPLICATE_ALTERNATE, HC.DUPLICATE_NOT_DUPLICATE ]
choice_tuples = [ ( HC.duplicate_type_string_lookup[ duplicate_type ], duplicate_type ) for duplicate_type in duplicate_types ]
@ -2463,6 +2535,10 @@ class MediaPanelThumbnails( MediaPanel ):
def EventShowMenu( self, event ):
new_options = HG.client_controller.GetNewOptions()
advanced_mode = new_options.GetBoolean( 'advanced_mode' )
services_manager = HG.client_controller.GetServicesManager()
thumbnail = self._GetThumbnailUnderMouse( event )
@ -2568,7 +2644,7 @@ class MediaPanelThumbnails( MediaPanel ):
i_can_post_ratings = len( local_ratings_services ) > 0
focussed_is_local = CC.LOCAL_FILE_SERVICE_KEY in self._focussed_media.GetLocationsManager().GetCurrent()
focussed_is_local = CC.COMBINED_LOCAL_FILE_SERVICE_KEY in self._focussed_media.GetLocationsManager().GetCurrent()
file_service_keys = { repository.GetServiceKey() for repository in file_repositories }
upload_permission_file_service_keys = { repository.GetServiceKey() for repository in file_repositories if repository.HasPermission( HC.CONTENT_TYPE_FILES, HC.PERMISSION_ACTION_CREATE ) }
@ -2986,7 +3062,7 @@ class MediaPanelThumbnails( MediaPanel ):
ClientGUIMenus.AppendSeparator( menu )
if selection_has_local:
if focussed_is_local:
ClientGUIMenus.AppendMenuItem( self, menu, 'open externally', 'Launch this file with your OS\'s default program for it.', self._OpenExternally )
@ -3024,40 +3100,58 @@ class MediaPanelThumbnails( MediaPanel ):
#
if advanced_mode:
if not HC.PLATFORM_LINUX and focussed_is_local:
open_menu = wx.Menu()
ClientGUIMenus.AppendMenuItem( self, open_menu, 'in file browser', 'Show this file in your OS\'s file browser.', self._OpenFileLocation )
ClientGUIMenus.AppendMenu( share_menu, open_menu, 'open' )
copy_menu = wx.Menu()
if selection_has_local:
ClientGUIMenus.AppendMenuItem( self, copy_menu, copy_phrase, 'Copy the selected files to the clipboard.', self._CopyFilesToClipboard )
copy_hash_menu = wx.Menu()
ClientGUIMenus.AppendMenuItem( self, copy_hash_menu, 'sha256 (hydrus default)', 'Copy the selected file\'s SHA256 hash to the clipboard.', self._CopyHashToClipboard, 'sha256' )
ClientGUIMenus.AppendMenuItem( self, copy_hash_menu, 'md5', 'Copy the selected file\'s MD5 hash to the clipboard.', self._CopyHashToClipboard, 'md5' )
ClientGUIMenus.AppendMenuItem( self, copy_hash_menu, 'sha1', 'Copy the selected file\'s SHA1 hash to the clipboard.', self._CopyHashToClipboard, 'sha1' )
ClientGUIMenus.AppendMenuItem( self, copy_hash_menu, 'sha512', 'Copy the selected file\'s SHA512 hash to the clipboard.', self._CopyHashToClipboard, 'sha512' )
ClientGUIMenus.AppendMenu( copy_menu, copy_hash_menu, 'hash' )
if multiple_selected:
if advanced_mode:
copy_hash_menu = wx.Menu()
ClientGUIMenus.AppendMenuItem( self, copy_hash_menu, 'sha256 (hydrus default)', 'Copy the selected files\' SHA256 hashes to the clipboard.', self._CopyHashesToClipboard, 'sha256' )
ClientGUIMenus.AppendMenuItem( self, copy_hash_menu, 'md5', 'Copy the selected files\' MD5 hashes to the clipboard.', self._CopyHashesToClipboard, 'md5' )
ClientGUIMenus.AppendMenuItem( self, copy_hash_menu, 'sha1', 'Copy the selected files\' SHA1 hashes to the clipboard.', self._CopyHashesToClipboard, 'sha1' )
ClientGUIMenus.AppendMenuItem( self, copy_hash_menu, 'sha512', 'Copy the selected files\' SHA512 hashes to the clipboard.', self._CopyHashesToClipboard, 'sha512' )
ClientGUIMenus.AppendMenuItem( self, copy_hash_menu, 'sha256 (hydrus default)', 'Copy the selected file\'s SHA256 hash to the clipboard.', self._CopyHashToClipboard, 'sha256' )
ClientGUIMenus.AppendMenuItem( self, copy_hash_menu, 'md5', 'Copy the selected file\'s MD5 hash to the clipboard.', self._CopyHashToClipboard, 'md5' )
ClientGUIMenus.AppendMenuItem( self, copy_hash_menu, 'sha1', 'Copy the selected file\'s SHA1 hash to the clipboard.', self._CopyHashToClipboard, 'sha1' )
ClientGUIMenus.AppendMenuItem( self, copy_hash_menu, 'sha512', 'Copy the selected file\'s SHA512 hash to the clipboard.', self._CopyHashToClipboard, 'sha512' )
ClientGUIMenus.AppendMenu( copy_menu, copy_hash_menu, 'hashes' )
ClientGUIMenus.AppendMenu( copy_menu, copy_hash_menu, 'hash' )
if multiple_selected:
copy_hash_menu = wx.Menu()
ClientGUIMenus.AppendMenuItem( self, copy_hash_menu, 'sha256 (hydrus default)', 'Copy the selected files\' SHA256 hashes to the clipboard.', self._CopyHashesToClipboard, 'sha256' )
ClientGUIMenus.AppendMenuItem( self, copy_hash_menu, 'md5', 'Copy the selected files\' MD5 hashes to the clipboard.', self._CopyHashesToClipboard, 'md5' )
ClientGUIMenus.AppendMenuItem( self, copy_hash_menu, 'sha1', 'Copy the selected files\' SHA1 hashes to the clipboard.', self._CopyHashesToClipboard, 'sha1' )
ClientGUIMenus.AppendMenuItem( self, copy_hash_menu, 'sha512', 'Copy the selected files\' SHA512 hashes to the clipboard.', self._CopyHashesToClipboard, 'sha512' )
ClientGUIMenus.AppendMenu( copy_menu, copy_hash_menu, 'hashes' )
else:
ClientGUIMenus.AppendMenuItem( self, copy_menu, 'sha256 hash', 'Copy the selected file\'s SHA256 hash to the clipboard.', self._CopyHashToClipboard, 'sha256' )
if multiple_selected:
if advanced_mode:
ClientGUIMenus.AppendMenuItem( self, copy_menu, 'sha256 hashes', 'Copy the selected files\' SHA256 hashes to the clipboard.', self._CopyHashesToClipboard, 'sha256' )
ClientGUIMenus.AppendMenuItem( self, copy_menu, 'sha256 hash', 'Copy the selected file\'s SHA256 hash to the clipboard.', self._CopyHashToClipboard, 'sha256' )
if multiple_selected:
ClientGUIMenus.AppendMenuItem( self, copy_menu, 'sha256 hashes', 'Copy the selected files\' SHA256 hashes to the clipboard.', self._CopyHashesToClipboard, 'sha256' )
@ -3100,7 +3194,11 @@ class MediaPanelThumbnails( MediaPanel ):
export_menu = wx.Menu()
ClientGUIMenus.AppendMenuItem( self, export_menu, export_phrase, 'Export the selected files to an external folder.', self._ExportFiles )
ClientGUIMenus.AppendMenuItem( self, export_menu, 'tags', 'Export the selected files\' tags to an external database.', self._ExportTags )
if advanced_mode:
ClientGUIMenus.AppendMenuItem( self, export_menu, 'tags', 'Export the selected files\' tags to an external database.', self._ExportTags )
share_menu.AppendMenu( CC.ID_NULL, 'export', export_menu )
@ -3171,85 +3269,114 @@ class MediaPanelThumbnails( MediaPanel ):
ClientGUIMenus.AppendMenuItem( self, menu, 'open selection in a new page', 'Copy your current selection into a simple new page.', self._ShowSelectionInNewPage )
duplicates_menu = menu # this is important to make the menu flexible if not multiple selected
focussed_hash = self._focussed_media.GetDisplayMedia().GetHash()
if multiple_selected:
if advanced_mode:
duplicates_menu = wx.Menu()
duplicates_menu = menu # this is important to make the menu flexible if not multiple selected
duplicates_action_submenu = wx.Menu()
focussed_hash = self._focussed_media.GetDisplayMedia().GetHash()
label = 'set this file as better than the ' + HydrusData.ConvertIntToPrettyString( num_selected - 1 ) + ' other selected'
ClientGUIMenus.AppendMenuItem( self, duplicates_action_submenu, label, 'Set the focused media to be better than the other selected files.', self._SetDuplicatesFocusedBetter )
num_files = self._GetNumSelected()
num_pairs = num_files * ( num_files - 1 ) / 2 # combinations -- n! / ( 2! * ( n - 2 )! )
num_pairs_text = HydrusData.ConvertIntToPrettyString( num_pairs ) + ' pairs'
ClientGUIMenus.AppendMenuItem( self, duplicates_action_submenu, 'set all selected as exact duplicates', 'Set all the selected files as exact same duplicates.', self._SetDuplicates, HC.DUPLICATE_SAME_FILE )
ClientGUIMenus.AppendMenuItem( self, duplicates_action_submenu, 'set all selected as alternates', 'Set all the selected files as alternates.', self._SetDuplicates, HC.DUPLICATE_ALTERNATE )
ClientGUIMenus.AppendMenuItem( self, duplicates_action_submenu, 'set all selected as not duplicates', 'Set all the selected files as not duplicates.', self._SetDuplicates, HC.DUPLICATE_NOT_DUPLICATE )
ClientGUIMenus.AppendMenuItem( self, duplicates_action_submenu, 'make a custom duplicates action', 'Choose which duplicates status to set to this selection and customise non-default merge options.', self._SetDuplicatesCustom )
ClientGUIMenus.AppendSeparator( duplicates_action_submenu )
ClientGUIMenus.AppendMenuItem( self, duplicates_action_submenu, 'queue the ' + num_pairs_text + ' in this selection up for the duplicates filter', 'Set all the possible pairs in the selection as unknown/potential duplicate pairs.', self._SetDuplicates, HC.DUPLICATE_UNKNOWN )
ClientGUIMenus.AppendMenuItem( self, duplicates_action_submenu, 'remove the ' + num_pairs_text + ' in this selection from the duplicates system', 'Remove all duplicates relationships from all the pairs in this selection.', self._SetDuplicates, None )
ClientGUIMenus.AppendMenu( menu, duplicates_menu, 'duplicates' )
ClientGUIMenus.AppendMenu( duplicates_menu, duplicates_action_submenu, 'set duplicate relationships' )
if HG.client_controller.DBCurrentlyDoingJob():
ClientGUIMenus.AppendMenuLabel( duplicates_menu, 'Could not fetch duplicates (db currently locked)' )
else:
duplicate_types_to_counts = HG.client_controller.Read( 'duplicate_types_to_counts', self._file_service_key, focussed_hash )
if len( duplicate_types_to_counts ) > 0:
if multiple_selected:
duplicates_view_menu = wx.Menu()
duplicates_menu = wx.Menu()
for duplicate_type in ( HC.DUPLICATE_BETTER_OR_WORSE, HC.DUPLICATE_BETTER, HC.DUPLICATE_WORSE, HC.DUPLICATE_SAME_FILE, HC.DUPLICATE_ALTERNATE, HC.DUPLICATE_NOT_DUPLICATE, HC.DUPLICATE_UNKNOWN ):
duplicates_action_submenu = wx.Menu()
label = 'set this file as better than the ' + HydrusData.ConvertIntToPrettyString( num_selected - 1 ) + ' other selected'
ClientGUIMenus.AppendMenuItem( self, duplicates_action_submenu, label, 'Set the focused media to be better than the other selected files.', self._SetDuplicatesFocusedBetter )
num_files = self._GetNumSelected()
num_pairs = num_files * ( num_files - 1 ) / 2 # combinations -- n! / ( 2! * ( n - 2 )! )
num_pairs_text = HydrusData.ConvertIntToPrettyString( num_pairs ) + ' pairs'
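# Worked example of the pair maths above, for illustration: selecting five thumbnails gives
# num_pairs = 5 * 4 / 2 = 10, i.e. C( 5, 2 ) potential pairs to queue up or clear out.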
ClientGUIMenus.AppendMenuItem( self, duplicates_action_submenu, 'set all selected as same quality', 'Set all the selected files as same quality duplicates.', self._SetDuplicates, HC.DUPLICATE_SAME_QUALITY )
ClientGUIMenus.AppendMenuItem( self, duplicates_action_submenu, 'set all selected as alternates', 'Set all the selected files as alternates.', self._SetDuplicates, HC.DUPLICATE_ALTERNATE )
ClientGUIMenus.AppendMenuItem( self, duplicates_action_submenu, 'set all selected as not duplicates', 'Set all the selected files as not duplicates.', self._SetDuplicates, HC.DUPLICATE_NOT_DUPLICATE )
ClientGUIMenus.AppendMenuItem( self, duplicates_action_submenu, 'make a custom duplicates action', 'Choose which duplicates status to set to this selection and customise non-default merge options.', self._SetDuplicatesCustom )
ClientGUIMenus.AppendSeparator( duplicates_action_submenu )
ClientGUIMenus.AppendMenuItem( self, duplicates_action_submenu, 'send the ' + num_pairs_text + ' in this selection to be compared in the duplicates filter', 'Set all the possible pairs in the selection as unknown/potential duplicate pairs.', self._SetDuplicates, HC.DUPLICATE_UNKNOWN )
ClientGUIMenus.AppendMenuItem( self, duplicates_action_submenu, 'remove the ' + num_pairs_text + ' in this selection from the duplicates system', 'Remove all duplicates relationships from all the pairs in this selection.', self._SetDuplicates, None )
ClientGUIMenus.AppendSeparator( duplicates_action_submenu )
duplicates_edit_action_submenu = wx.Menu()
for duplicate_type in ( HC.DUPLICATE_BETTER, HC.DUPLICATE_SAME_QUALITY, HC.DUPLICATE_ALTERNATE, HC.DUPLICATE_NOT_DUPLICATE ):
if duplicate_type in duplicate_types_to_counts:
count = duplicate_types_to_counts[ duplicate_type ]
label = HydrusData.ConvertIntToPrettyString( count ) + ' ' + HC.duplicate_type_string_lookup[ duplicate_type ]
ClientGUIMenus.AppendMenuItem( self, duplicates_view_menu, label, 'Show these duplicates in a new page.', self._ShowDuplicatesInNewPage, focussed_hash, duplicate_type )
ClientGUIMenus.AppendMenuItem( self, duplicates_edit_action_submenu, 'for ' + HC.duplicate_type_string_lookup[ duplicate_type ], 'Edit what happens when you set this status.', self._EditDuplicateActionOptions, duplicate_type )
ClientGUIMenus.AppendMenu( duplicates_menu, duplicates_view_menu, 'view this file\'s duplicates' )
ClientGUIMenus.AppendMenu( duplicates_action_submenu, duplicates_edit_action_submenu, 'edit default merge options' )
ClientGUIMenus.AppendMenu( menu, duplicates_menu, 'duplicates' )
ClientGUIMenus.AppendMenu( duplicates_menu, duplicates_action_submenu, 'set duplicate relationships' )
if HG.client_controller.DBCurrentlyDoingJob():
ClientGUIMenus.AppendMenuLabel( duplicates_menu, 'Could not fetch duplicates (db currently locked)' )
else:
duplicate_types_to_counts = HG.client_controller.Read( 'duplicate_types_to_counts', self._file_service_key, focussed_hash )
if len( duplicate_types_to_counts ) > 0:
duplicates_view_menu = wx.Menu()
for duplicate_type in ( HC.DUPLICATE_BETTER_OR_WORSE, HC.DUPLICATE_BETTER, HC.DUPLICATE_WORSE, HC.DUPLICATE_SAME_QUALITY, HC.DUPLICATE_ALTERNATE, HC.DUPLICATE_NOT_DUPLICATE, HC.DUPLICATE_UNKNOWN ):
if duplicate_type in duplicate_types_to_counts:
count = duplicate_types_to_counts[ duplicate_type ]
label = HydrusData.ConvertIntToPrettyString( count ) + ' ' + HC.duplicate_type_string_lookup[ duplicate_type ]
ClientGUIMenus.AppendMenuItem( self, duplicates_view_menu, label, 'Show these duplicates in a new page.', self._ShowDuplicatesInNewPage, focussed_hash, duplicate_type )
ClientGUIMenus.AppendMenu( duplicates_menu, duplicates_view_menu, 'view this file\'s duplicates' )
if self._focussed_media.HasImages():
if advanced_mode:
ClientGUIMenus.AppendSeparator( menu )
if self._focussed_media.HasImages():
ClientGUIMenus.AppendSeparator( menu )
similar_menu = wx.Menu()
ClientGUIMenus.AppendMenuItem( self, similar_menu, 'exact match', 'Search the database for files that look precisely like this one.', self._GetSimilarTo, HC.HAMMING_EXACT_MATCH )
ClientGUIMenus.AppendMenuItem( self, similar_menu, 'very similar', 'Search the database for files that look just like this one.', self._GetSimilarTo, HC.HAMMING_VERY_SIMILAR )
ClientGUIMenus.AppendMenuItem( self, similar_menu, 'similar', 'Search the database for files that look generally like this one.', self._GetSimilarTo, HC.HAMMING_SIMILAR )
ClientGUIMenus.AppendMenuItem( self, similar_menu, 'speculative', 'Search the database for files that probably look like this one. This is sometimes useful for symbols with sharp edges or lines.', self._GetSimilarTo, HC.HAMMING_SPECULATIVE )
ClientGUIMenus.AppendMenu( menu, similar_menu, 'find similar files' )
similar_menu = wx.Menu()
if advanced_mode:
ClientGUIMenus.AppendMenuItem( self, similar_menu, 'exact match', 'Search the database for files that look precisely like this one.', self._GetSimilarTo, HC.HAMMING_EXACT_MATCH )
ClientGUIMenus.AppendMenuItem( self, similar_menu, 'very similar', 'Search the database for files that look just like this one.', self._GetSimilarTo, HC.HAMMING_VERY_SIMILAR )
ClientGUIMenus.AppendMenuItem( self, similar_menu, 'similar', 'Search the database for files that look generally like this one.', self._GetSimilarTo, HC.HAMMING_SIMILAR )
ClientGUIMenus.AppendMenuItem( self, similar_menu, 'speculative', 'Search the database for files that probably look like this one. This is sometimes useful for symbols with sharp edges or lines.', self._GetSimilarTo, HC.HAMMING_SPECULATIVE )
ClientGUIMenus.AppendMenu( menu, similar_menu, 'find similar files' )
if focussed_is_local and self._focussed_media.GetMime() in HC.VIDEO:
advanced_menu = wx.Menu()
ClientGUIMenus.AppendMenuItem( self, advanced_menu, 'attempt to correct video frame count', 'Recalculate this video\'s metadata using a slower but more accurate video parsing routine.', self._RecheckVideoMetadata )
ClientGUIMenus.AppendMenu( menu, advanced_menu, 'advanced' )

View File

@ -1315,6 +1315,15 @@ class ReviewServicePanel( wx.Panel ):
#
new_options = HG.client_controller.GetNewOptions()
advanced_mode = new_options.GetBoolean( 'advanced_mode' )
if not advanced_mode:
self._advanced_content_update.Hide()
self._Refresh()
#

View File

@ -93,7 +93,7 @@ class PanelPredicateSystemDuplicateRelationships( PanelPredicateSystem ):
self._num = wx.SpinCtrl( self, min = 0, max = 65535 )
choices = [ ( HC.duplicate_type_string_lookup[ status ], status ) for status in ( HC.DUPLICATE_BETTER_OR_WORSE, HC.DUPLICATE_BETTER, HC.DUPLICATE_WORSE, HC.DUPLICATE_SAME_FILE, HC.DUPLICATE_ALTERNATE, HC.DUPLICATE_NOT_DUPLICATE, HC.DUPLICATE_UNKNOWN ) ]
choices = [ ( HC.duplicate_type_string_lookup[ status ], status ) for status in ( HC.DUPLICATE_BETTER_OR_WORSE, HC.DUPLICATE_BETTER, HC.DUPLICATE_WORSE, HC.DUPLICATE_SAME_QUALITY, HC.DUPLICATE_ALTERNATE, HC.DUPLICATE_NOT_DUPLICATE, HC.DUPLICATE_UNKNOWN ) ]
self._dupe_type = ClientGUICommon.BetterRadioBox( self, choices = choices, style = wx.RA_SPECIFY_ROWS )
@ -106,7 +106,7 @@ class PanelPredicateSystemDuplicateRelationships( PanelPredicateSystem ):
hbox = wx.BoxSizer( wx.HORIZONTAL )
hbox.AddF( ClientGUICommon.BetterStaticText( self, 'system:duplicate relationships' ), CC.FLAGS_VCENTER )
hbox.AddF( ClientGUICommon.BetterStaticText( self, 'system:num duplicate relationships' ), CC.FLAGS_VCENTER )
hbox.AddF( self._sign, CC.FLAGS_VCENTER )
hbox.AddF( self._num, CC.FLAGS_VCENTER )
hbox.AddF( self._dupe_type, CC.FLAGS_VCENTER )

View File

@ -5328,7 +5328,7 @@ class ManageTagsPanel( ClientGUIScrolledPanels.ManagePanel ):
expand_parents = True
self._add_tag_box = ClientGUIACDropdown.AutoCompleteDropdownTagsWrite( self, self.EnterTags, expand_parents, self._file_service_key, self._tag_service_key, null_entry_callable = self.Ok )
self._add_tag_box = ClientGUIACDropdown.AutoCompleteDropdownTagsWrite( self, self.EnterTags, expand_parents, self._file_service_key, self._tag_service_key, null_entry_callable = self.OK )
self._advanced_content_update_button = wx.Button( self, label = 'advanced operation' )
self._advanced_content_update_button.Bind( wx.EVT_BUTTON, self.EventAdvancedContentUpdate )
@ -5372,6 +5372,11 @@ class ManageTagsPanel( ClientGUIScrolledPanels.ManagePanel ):
if not self._new_options.GetBoolean( 'advanced_mode' ):
self._advanced_content_update_button.Hide()
copy_paste_hbox = wx.BoxSizer( wx.HORIZONTAL )
copy_paste_hbox.AddF( self._copy_tags, CC.FLAGS_VCENTER )
@ -5716,9 +5721,9 @@ class ManageTagsPanel( ClientGUIScrolledPanels.ManagePanel ):
parent = self.GetTopLevelParent().GetParent()
self.Ok()
self.OK()
# do this because of the Ok() call, which doesn't want to happen in the dialog event loop
# do this because of the OK() call, which doesn't want to happen in the dialog event loop
def do_it():
with ClientGUITopLevelWindows.DialogNullipotent( parent, 'advanced content update' ) as dlg:
@ -5832,7 +5837,7 @@ class ManageTagsPanel( ClientGUIScrolledPanels.ManagePanel ):
return len( self._content_updates ) > 0
def Ok( self ):
def OK( self ):
wx.PostEvent( self, wx.CommandEvent( commandType = wx.wxEVT_COMMAND_MENU_SELECTED, winid = ClientCaches.MENU_EVENT_ID_TO_ACTION_CACHE.GetTemporaryId( 'ok' ) ) )

View File

@ -418,7 +418,7 @@ class DialogThatTakesScrollablePanel( DialogThatResizes ):
if command == 'ok':
self.EventOk( None )
self.EventOK( None )
else:
@ -427,7 +427,7 @@ class DialogThatTakesScrollablePanel( DialogThatResizes ):
def EventOk( self, event ):
def EventOK( self, event ):
raise NotImplementedError()
@ -464,7 +464,7 @@ class DialogThatTakesScrollablePanelClose( DialogThatTakesScrollablePanel ):
def _InitialiseButtons( self ):
self._close = wx.Button( self, id = wx.ID_OK, label = 'close' )
self._close.Bind( wx.EVT_BUTTON, self.EventOk )
self._close.Bind( wx.EVT_BUTTON, self.EventOK )
self._cancel = wx.Button( self, id = wx.ID_CANCEL )
self._cancel.Hide()
@ -477,7 +477,7 @@ class DialogNullipotent( DialogThatTakesScrollablePanelClose ):
DialogThatTakesScrollablePanelClose.__init__( self, parent, title )
def EventOk( self, event ):
def EventOK( self, event ):
SaveTLWSizeAndPosition( self, self._frame_key )
@ -499,7 +499,7 @@ class DialogThatTakesScrollablePanelApplyCancel( DialogThatTakesScrollablePanel
def _InitialiseButtons( self ):
self._apply = wx.Button( self, id = wx.ID_OK, label = 'apply' )
self._apply.Bind( wx.EVT_BUTTON, self.EventOk )
self._apply.Bind( wx.EVT_BUTTON, self.EventOK )
self._apply.SetForegroundColour( ( 0, 128, 0 ) )
self._cancel = wx.Button( self, id = wx.ID_CANCEL, label = 'cancel' )
@ -513,7 +513,7 @@ class DialogEdit( DialogThatTakesScrollablePanelApplyCancel ):
DialogThatTakesScrollablePanelApplyCancel.__init__( self, parent, title )
def EventOk( self, event ):
def EventOK( self, event ):
try:
@ -531,7 +531,7 @@ class DialogEdit( DialogThatTakesScrollablePanelApplyCancel ):
class DialogManage( DialogThatTakesScrollablePanelApplyCancel ):
def EventOk( self, event ):
def EventOK( self, event ):
try:

View File

@ -28,7 +28,7 @@ class HydrusResourceBooru( HydrusServerResources.HydrusResource ):
HydrusServerResources.HydrusResource._checkService( self, request )
if not self._service.BandwidthOk():
if not self._service.BandwidthOK():
raise HydrusExceptions.BandwidthException( 'This service has run out of bandwidth. Please try again later.' )

View File

@ -1,6 +1,8 @@
import collections
import HydrusConstants as HC
import HydrusExceptions
import HydrusNetwork
import HydrusNetworking
import HydrusPaths
import HydrusSerialisable
import errno
@ -76,6 +78,23 @@ def CheckHydrusVersion( service_key, service_type, response_headers ):
raise HydrusExceptions.NetworkVersionException( 'Network version mismatch! The server\'s network version was ' + str( network_version ) + ', whereas your client\'s is ' + str( HC.NETWORK_VERSION ) + '! ' + message )
def ConvertURLIntoDomains( url ):
domains = []
parser_result = urlparse.urlparse( url )
domain = parser_result.netloc
while domain.count( '.' ) > 0:
domains.append( domain )
domain = '.'.join( domain.split( '.' )[1:] ) # i.e. strip off the leftmost subdomain maps.google.com -> google.com
return domains
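# Usage sketch (illustrative): for 'http://maps.google.com/mymaps?q=x', urlparse gives a netloc
# of 'maps.google.com', so this returns [ 'maps.google.com', 'google.com' ] -- every level down
# to, but not including, the bare TLD.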
def RequestsGet( url, params = None, stream = False, headers = None ):
if headers is None:
@ -171,6 +190,10 @@ def RequestsCheckResponse( response ):
eclass = HydrusExceptions.NetworkVersionException
elif response.status_code >= 500:
eclass = HydrusExceptions.ServerException
else:
eclass = HydrusExceptions.NetworkException
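# e.g. a 503 from the remote server maps to ServerException here, while a status this chain
# does not recognise falls through to the generic NetworkException.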
@ -222,6 +245,18 @@ def ParseURL( url ):
return ( location, path, query )
def SerialiseSession( session ):
# move this to the new sessionmanager
cookies = session.cookies.copy()
items = requests.utils.dict_from_cookiejar( cookies )
# apply these to something serialisable
# do the reverse, add_dict_to_cookiejar, to set them back again in a new session
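# A minimal sketch of that round trip, assuming a requests.Session called 'session' and a plain
# dict as the serialisable form (names here are illustrative only):
#
#   items = requests.utils.dict_from_cookiejar( session.cookies.copy() )
#   # ...store 'items' somewhere serialisable, then later, with a fresh session:
#   requests.utils.add_dict_to_cookiejar( new_session.cookies, items )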
def SetProxy( proxytype, host, port, username = None, password = None ):
if proxytype == 'http': proxytype = socks.PROXY_TYPE_HTTP
@ -555,10 +590,13 @@ class HTTPConnection( object ):
else:
raise Exception( parsed_response )
raise HydrusExceptions.ServerException( parsed_response )
else: raise Exception( parsed_response )
else:
raise HydrusExceptions.NetworkException( parsed_response )
@ -980,4 +1018,418 @@ class HTTPConnection( object ):
return self._DealWithResponse( method, response, parsed_response, size_of_response )
class BandwidthManager( HydrusSerialisable.SerialisableBase ):
SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_BANDWIDTH_MANAGER
SERIALISABLE_VERSION = 1
def __init__( self ):
HydrusSerialisable.SerialisableBase.__init__( self )
self._lock = threading.Lock()
self._global_bandwidth_tracker = HydrusNetworking.BandwidthTracker()
self._global_bandwidth_rules = HydrusNetworking.BandwidthRules()
self._domains_to_bandwidth_trackers = collections.defaultdict( HydrusNetworking.BandwidthTracker )
self._domains_to_bandwidth_rules = {}
def _GetApplicableTrackersAndRules( self, url = None ):
result = []
if url is not None:
domains = ConvertURLIntoDomains( url )
for domain in domains:
if domain in self._domains_to_bandwidth_rules:
bandwidth_tracker = self._domains_to_bandwidth_trackers[ domain ]
bandwidth_rules = self._domains_to_bandwidth_rules[ domain ]
result.append( ( bandwidth_tracker, bandwidth_rules ) )
result.append( ( self._global_bandwidth_tracker, self._global_bandwidth_rules ) )
return result
def GetEstimateInfo( self, domain = None ):
with self._lock:
# something that returns ( 'about a minute until you can request again', 60 )
pass
def GetDomainsAndTrackers( self ):
with self._lock:
result = list( self._domains_to_bandwidth_trackers.items() )
result.sort()
result.insert( 0, ( 'global', self._global_bandwidth_tracker ) )
return result
def OK( self, url = None ):
with self._lock:
for ( bandwidth_tracker, bandwidth_rules ) in self._GetApplicableTrackersAndRules( url ):
if not bandwidth_rules.OK( bandwidth_tracker ):
return False
return True
def RequestMade( self, url, num_bytes ):
with self._lock:
for ( bandwidth_tracker, bandwidth_rules ) in self._GetApplicableTrackersAndRules( url ):
bandwidth_tracker.RequestMade( num_bytes )
def SetRules( self, domain, bandwidth_rules ):
with self._lock:
if domain is None:
self._global_bandwidth_rules = bandwidth_rules
else:
self._domains_to_bandwidth_rules[ domain ] = bandwidth_rules
HydrusSerialisable.SERIALISABLE_TYPES_TO_OBJECT_TYPES[ HydrusSerialisable.SERIALISABLE_TYPE_BANDWIDTH_MANAGER ] = BandwidthManager
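# A hedged usage sketch for the manager above (how limits are added to BandwidthRules is not
# shown here, so that part is assumed):
#
#   manager = BandwidthManager()
#
#   rules = HydrusNetworking.BandwidthRules()
#
#   manager.SetRules( 'example.com', rules ) # passing None instead would replace the global rules
#
#   url = 'http://cdn.example.com/file.jpg'
#
#   if manager.OK( url ): # consults cdn.example.com (if it had rules), example.com, and the global pair
#       
#       manager.RequestMade( url, 512 * 1024 )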
class NetworkEngine( object ):
def __init__( self, controller ):
self._controller = controller
self._lock = threading.Lock()
self._new_work_to_do = threading.Event()
self._new_network_jobs = []
self._throttled_jobs = []
self._local_shutdown = False
# start main loop
def AddJob( self, job ):
with self._lock:
self._new_network_jobs.append( job )
self._new_work_to_do.set()
def MainLoop( self ):
while not ( self._local_shutdown or self._controller.ModelIsShutdown() ):
with self._lock:
self._throttled_jobs.extend( self._new_network_jobs )
self._new_network_jobs = []
#
ready_to_start = []
throttled_jobs = self._throttled_jobs
self._throttled_jobs = []
for job in throttled_jobs:
if job.ReadyToWork():
ready_to_start.append( job )
elif not job.IsCancelled():
self._throttled_jobs.append( job )
#
for job in ready_to_start:
self._controller.CallToThread( job.Start )
# have this hold on to jobs until they are done, so the user can look at all the current ones in a review panel somewhere
# this also lets us max out the num active connections at an optional value
self._new_work_to_do.wait( 1 )
self._new_work_to_do.clear()
def Shutdown( self ):
self._local_shutdown = True
class NetworkJob( object ):
def __init__( self, method, url, body = None, referral_url = None, temp_path = None ):
self._method = method
self._url = url
self._body = body
self._referral_url = referral_url
self._temp_path = temp_path
self._response = None
self._speed_tracker = HydrusNetworking.TransferSpeedTracker()
self._time_ready_to_work = 0
self._has_error = False
# a way to hold error traceback and a way to fetch it
self._is_done = False
self._is_cancelled = False
self._bandwidth_override = False
self._text = 'initialising'
self._value_range = ( None, None )
self._lock = threading.Lock()
def _IsCancelled( self ):
if self._is_cancelled:
return True
if HG.client_controller.ModelIsShutdown():
return True
return False
def _GetSession( self ):
pass # fetch the regular session from the sessionmanager
def _ReadResponse( self, f = None ):
# get the content-length, if any, to use for range
bytes_read = 0
for chunk in self._response.iter_content( chunk_size = 8192 ):
if self._IsCancelled():
return
if f is not None:
f.write( chunk )
chunk_length = len( chunk )
bytes_read += chunk_length
self._speed_tracker.DataTransferred( chunk_length )
# update the status
num_bytes_used = bytes_read
if self._body is not None:
num_bytes_used += len( self._body )
self._ReportBandwidth( num_bytes_used )
self._is_done = True
def _ReportBandwidth( self, num_bytes ):
bandwidth_manager = HG.client_controller.GetBandwidthManager()
bandwidth_manager.RequestMade( self._url, num_bytes )
def BandwidthOverride( self ):
self._bandwidth_override = True
def Cancel( self ):
self._is_cancelled = True
def GetContent( self ):
return self._response.content
def GetJSON( self ):
return self._response.json()
def GetStatus( self ):
with self._lock:
return ( self._text, self._speed_tracker, self._value_range )
def HasError( self ):
return self._has_error
def IsCancelled( self ):
return self._IsCancelled()
def IsDone( self ):
return self._is_done
def ReadyToWork( self ):
if not HydrusData.TimeHasPassed( self._time_ready_to_work ):
return False
if not self._bandwidth_override:
pass
# make sure bandwidth domain is ok
# report to status if not with how long to expect to wait
# set ready to work ahead an appropriate time
# try
# make sure login domain is ok
# this can do the actual login here. a little delay is fine for this job, and it is good it is here where the engine works serially
# except
# abandon and set status and set ready to work ahead a bit
return True
def SetStatus( self, text, value, range ):
with self._lock:
self._text = text
self._value_range = ( value, range )
def Start( self ):
# set status throughout this
session = self._GetSession()
headers = {}
if self._referral_url is not None:
headers = { 'referer' : self._referral_url }
self._response = session.request( self._method, self._url, headers = headers, stream = True )
# check the response here using requestscheckresponse above
# if no error:
# deal with reading errors here gracefully, whatever form they occur in
if self._temp_path is None:
self._ReadResponse()
else:
with open( self._temp_path, 'wb' ) as f:
self._ReadResponse( f )
class HydrusNetworkJob( NetworkJob ):
def __init__( self, service_key, method, url, body = None, referral_url = None, temp_path = None ):
NetworkJob.__init__( self, method, url, body, referral_url, temp_path )
self._service_key = service_key
def _ReportBandwidth( self, num_bytes ):
pass # fetch my service, report requestmade
def _GetSession( self ):
pass # fetch the hydrus (ssl verify=False) session, which should have the keys as cookies, right?
# this will ultimately be a job for the login engine step, earlier
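# A rough end-to-end sketch of how the prototype engine and a job might be wired up
# (illustrative only -- the client does not construct these yet):
#
#   engine = NetworkEngine( HG.client_controller )
#
#   HG.client_controller.CallToThread( engine.MainLoop )
#
#   job = NetworkJob( 'GET', 'http://example.com/file.jpg' )
#
#   engine.AddJob( job )
#
# MainLoop will then hand the job to a worker thread via Start() once ReadyToWork() reports
# that its bandwidth and login checks have passed.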

View File

@ -1170,7 +1170,7 @@ class Predicate( HydrusSerialisable.SerialisableBase ):
elif self._predicate_type == HC.PREDICATE_TYPE_SYSTEM_DUPLICATE_RELATIONSHIPS:
base = 'duplicate relationships'
base = 'num duplicate relationships'
if self._value is not None:

View File

@ -266,7 +266,7 @@ class ServiceLocalBooru( Service ):
def _GetFunctionalStatus( self ):
if not self._bandwidth_rules.Ok( self._bandwidth_tracker ):
if not self._bandwidth_rules.OK( self._bandwidth_tracker ):
return ( False, 'bandwidth exceeded' )
@ -298,11 +298,11 @@ class ServiceLocalBooru( Service ):
# this should support the same serverservice interface so we can just toss it at the regular serverengine and all the bandwidth will work ok
def BandwidthOk( self ):
def BandwidthOK( self ):
with self._lock:
return self._bandwidth_rules.Ok( self._bandwidth_tracker )
return self._bandwidth_rules.OK( self._bandwidth_tracker )
@ -464,7 +464,7 @@ class ServiceRemote( Service ):
return ( False, self._no_requests_reason + ' - next request ' + HydrusData.ConvertTimestampToPrettyPending( self._no_requests_until ) )
if not self._bandwidth_rules.Ok( self._bandwidth_tracker ):
if not self._bandwidth_rules.OK( self._bandwidth_tracker ):
return ( False, 'bandwidth exceeded' )

View File

@ -49,7 +49,7 @@ options = {}
# Misc
NETWORK_VERSION = 18
SOFTWARE_VERSION = 258
SOFTWARE_VERSION = 259
UNSCALED_THUMBNAIL_DIMENSIONS = ( 200, 200 )
@ -166,7 +166,7 @@ DEFINITIONS_TYPE_TAGS = 1
DUPLICATE_UNKNOWN = 0
DUPLICATE_NOT_DUPLICATE = 1
DUPLICATE_SAME_FILE = 2
DUPLICATE_SAME_QUALITY = 2
DUPLICATE_ALTERNATE = 3
DUPLICATE_BETTER = 4
DUPLICATE_SMALLER_BETTER = 5
@ -178,7 +178,7 @@ duplicate_type_string_lookup = {}
duplicate_type_string_lookup[ DUPLICATE_UNKNOWN ] = 'unknown relationship'
duplicate_type_string_lookup[ DUPLICATE_NOT_DUPLICATE ] = 'not duplicates'
duplicate_type_string_lookup[ DUPLICATE_SAME_FILE ] = 'exact same files'
duplicate_type_string_lookup[ DUPLICATE_SAME_QUALITY ] = 'same quality'
duplicate_type_string_lookup[ DUPLICATE_ALTERNATE ] = 'alternates'
duplicate_type_string_lookup[ DUPLICATE_BETTER ] = 'this is better'
duplicate_type_string_lookup[ DUPLICATE_SMALLER_BETTER ] = 'smaller hash_id is better'

View File

@ -615,12 +615,6 @@ def GenerateKey():
return os.urandom( HC.HYDRUS_KEY_LENGTH )
def GetEmptyDataDict():
data = collections.defaultdict( default_dict_list )
return data
def Get64BitHammingDistance( phash1, phash2 ):
# old way of doing this was:
@ -644,6 +638,29 @@ def Get64BitHammingDistance( phash1, phash2 ):
return n
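# For reference only: assuming the phashes arrive as 64-bit integers, a compact Hamming
# distance is an XOR followed by a popcount, e.g.
#
#   def HammingSketch( a, b ):
#       
#       return bin( a ^ b ).count( '1' )
#
# This is an illustrative sketch, not necessarily the routine the function above uses.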
def GetEmptyDataDict():
data = collections.defaultdict( default_dict_list )
return data
def GetHideTerminalSubprocessStartupInfo():
if HC.PLATFORM_WINDOWS:
# This suppresses the terminal window that tends to pop up when calling ffmpeg or whatever
startupinfo = subprocess.STARTUPINFO()
startupinfo.dwFlags |= subprocess.STARTF_USESHOWWINDOW
else:
startupinfo = None
return startupinfo
def GetNow(): return int( time.time() )
def GetNowPrecise():
@ -701,23 +718,6 @@ def GetSiblingProcessPorts( db_path, instance ):
return None
def GetSubprocessStartupInfo():
if HC.PLATFORM_WINDOWS:
# This suppresses the terminal window that tends to pop up when calling ffmpeg or whatever
startupinfo = subprocess.STARTUPINFO()
startupinfo.dwFlags |= subprocess.STARTF_USESHOWWINDOW
else:
startupinfo = None
return startupinfo
def IntelligentMassIntersect( sets_to_reduce ):
answer = None

View File

@ -30,6 +30,7 @@ class NotModifiedException( NetworkException ): pass
class PermissionException( NetworkException ): pass
class RedirectionException( NetworkException ): pass
class ServerBusyException( NetworkException ): pass
class ServerException( NetworkException ): pass
class SessionException( NetworkException ): pass
class WrongServiceTypeException( NetworkException ): pass
class ShouldReattemptNetworkException( NetworkException ): pass

View File

@ -2,6 +2,7 @@ import hexagonitswfheader
import HydrusConstants as HC
import HydrusData
import os
import shlex
import subprocess
import time
import traceback
@ -39,11 +40,11 @@ def GetFlashProperties( path ):
def RenderPageToFile( path, temp_path, page_index ):
cmd = [ SWFRENDER_PATH, path, '-o', temp_path, '-p', str( page_index ) ]
cmd = '"' + SWFRENDER_PATH + '" "' + path + '" -o "' + temp_path + '" -p ' + str( page_index )
timeout = HydrusData.GetNow() + 60
p = subprocess.Popen( cmd, startupinfo = HydrusData.GetSubprocessStartupInfo() )
p = subprocess.Popen( shlex.split( cmd ), startupinfo = HydrusData.GetHideTerminalSubprocessStartupInfo() )
while p.poll() is None:
@ -58,4 +59,4 @@ def RenderPageToFile( path, temp_path, page_index ):
p.communicate()

View File

@ -2,6 +2,7 @@ import HydrusConstants as HC
import HydrusData
import HydrusExceptions
import os
import shlex
import socket
import subprocess
import threading
@ -12,9 +13,18 @@ from twisted.python import log
# new stuff starts here
if HC.PLATFORM_LINUX: upnpc_path = os.path.join( HC.BIN_DIR, 'upnpc_linux' )
elif HC.PLATFORM_OSX: upnpc_path = os.path.join( HC.BIN_DIR, 'upnpc_osx' )
elif HC.PLATFORM_WINDOWS: upnpc_path = os.path.join( HC.BIN_DIR, 'upnpc_win32.exe' )
if HC.PLATFORM_LINUX:
upnpc_path = os.path.join( HC.BIN_DIR, 'upnpc_linux' )
elif HC.PLATFORM_OSX:
upnpc_path = os.path.join( HC.BIN_DIR, 'upnpc_osx' )
elif HC.PLATFORM_WINDOWS:
upnpc_path = os.path.join( HC.BIN_DIR, 'upnpc_win32.exe' )
EXTERNAL_IP = {}
EXTERNAL_IP[ 'ip' ] = None
@ -29,9 +39,9 @@ def GetExternalIP():
if HydrusData.TimeHasPassed( EXTERNAL_IP[ 'time' ] + ( 3600 * 24 ) ):
cmd = [ upnpc_path, '-l' ]
cmd = '"' + upnpc_path + '" -l'
p = subprocess.Popen( cmd, stdin = subprocess.PIPE, stdout = subprocess.PIPE, stderr = subprocess.PIPE, startupinfo = HydrusData.GetSubprocessStartupInfo() )
p = subprocess.Popen( shlex.split( cmd ), stdin = subprocess.PIPE, stdout = subprocess.PIPE, stderr = subprocess.PIPE, startupinfo = HydrusData.GetHideTerminalSubprocessStartupInfo() )
HydrusData.WaitForProcessToFinish( p, 30 )
@ -71,9 +81,9 @@ def GetLocalIP(): return socket.gethostbyname( socket.gethostname() )
def AddUPnPMapping( internal_client, internal_port, external_port, protocol, description, duration = 3600 ):
cmd = [ upnpc_path, '-e', description, '-a', internal_client, str( internal_port ), str( external_port ), protocol, str( duration ) ]
cmd = '"' + upnpc_path + '" -e "' + description + '" -a ' + internal_client + ' ' + str( internal_port ) + ' ' + str( external_port ) + ' ' + protocol + ' ' + str( duration )
p = subprocess.Popen( cmd, stdin = subprocess.PIPE, stdout = subprocess.PIPE, stderr = subprocess.PIPE, startupinfo = HydrusData.GetSubprocessStartupInfo() )
p = subprocess.Popen( shlex.split( cmd ), stdin = subprocess.PIPE, stdout = subprocess.PIPE, stderr = subprocess.PIPE, startupinfo = HydrusData.GetHideTerminalSubprocessStartupInfo() )
HydrusData.WaitForProcessToFinish( p, 30 )
@ -105,9 +115,9 @@ def GetUPnPMappings():
external_ip_address = GetExternalIP()
cmd = [ upnpc_path, '-l' ]
cmd = '"' + upnpc_path + '" -l'
p = subprocess.Popen( cmd, stdin = subprocess.PIPE, stdout = subprocess.PIPE, stderr = subprocess.PIPE, startupinfo = HydrusData.GetSubprocessStartupInfo() )
p = subprocess.Popen( shlex.split( cmd ), stdin = subprocess.PIPE, stdout = subprocess.PIPE, stderr = subprocess.PIPE, startupinfo = HydrusData.GetHideTerminalSubprocessStartupInfo() )
HydrusData.WaitForProcessToFinish( p, 30 )
@ -176,9 +186,9 @@ def GetUPnPMappings():
def RemoveUPnPMapping( external_port, protocol ):
cmd = [ upnpc_path, '-d', str( external_port ), protocol ]
cmd = '"' + upnpc_path + '" -d ' + str( external_port ) + ' ' + protocol
p = subprocess.Popen( cmd, stdin = subprocess.PIPE, stdout = subprocess.PIPE, stderr = subprocess.PIPE, startupinfo = HydrusData.GetSubprocessStartupInfo() )
p = subprocess.Popen( shlex.split( cmd ), stdin = subprocess.PIPE, stdout = subprocess.PIPE, stderr = subprocess.PIPE, startupinfo = HydrusData.GetHideTerminalSubprocessStartupInfo() )
HydrusData.WaitForProcessToFinish( p, 30 )
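# Illustrative note on the quoting change above: shlex.split() turns the quoted command string
# back into an argument list, so, for example,
#
#   shlex.split( '"/path with spaces/upnpc_linux" -l' )
#
# yields [ '/path with spaces/upnpc_linux', '-l' ], keeping paths with spaces intact.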

View File

@ -391,7 +391,7 @@ class Account( object ):
return ( False, self._GetExpiresString() )
if not self._account_type.BandwidthOk( self._bandwidth_tracker ):
if not self._account_type.BandwidthOK( self._bandwidth_tracker ):
return ( False, 'account has exceeded bandwidth' )
@ -470,7 +470,7 @@ class Account( object ):
raise HydrusExceptions.PermissionException( 'This account has expired.' )
if not self._account_type.BandwidthOk( self._bandwidth_tracker ):
if not self._account_type.BandwidthOK( self._bandwidth_tracker ):
raise HydrusExceptions.PermissionException( 'This account has no remaining bandwidth.' )
@ -733,9 +733,9 @@ class AccountType( object ):
self._bandwidth_rules = dictionary[ 'bandwidth_rules' ]
def BandwidthOk( self, bandwidth_tracker ):
def BandwidthOK( self, bandwidth_tracker ):
return self._bandwidth_rules.Ok( bandwidth_tracker )
return self._bandwidth_rules.OK( bandwidth_tracker )
def HasPermission( self, content_type, permission ):
@ -2045,7 +2045,7 @@ class ServerService( object ):
def BandwidthOk( self ):
def BandwidthOK( self ):
with self._lock:
@ -2133,11 +2133,11 @@ class ServerServiceRestricted( ServerService ):
self._bandwidth_rules = dictionary[ 'bandwidth_rules' ]
def BandwidthOk( self ):
def BandwidthOK( self ):
with self._lock:
return self._bandwidth_rules.Ok( self._bandwidth_tracker )
return self._bandwidth_rules.OK( self._bandwidth_tracker )
@ -2279,11 +2279,11 @@ class ServerServiceAdmin( ServerServiceRestricted ):
self._server_bandwidth_rules = dictionary[ 'server_bandwidth_rules' ]
def ServerBandwidthOk( self ):
def ServerBandwidthOK( self ):
with self._lock:
return self._server_bandwidth_rules.Ok( self._server_bandwidth_tracker )
return self._server_bandwidth_rules.OK( self._server_bandwidth_tracker )

View File

@ -156,7 +156,7 @@ class BandwidthRules( HydrusSerialisable.SerialisableBase ):
def Ok( self, bandwidth_tracker ):
def OK( self, bandwidth_tracker ):
with self._lock:
@ -432,3 +432,83 @@ class BandwidthTracker( HydrusSerialisable.SerialisableBase ):
HydrusSerialisable.SERIALISABLE_TYPES_TO_OBJECT_TYPES[ HydrusSerialisable.SERIALISABLE_TYPE_BANDWIDTH_TRACKER ] = BandwidthTracker
class TransferSpeedTracker( object ):
CLEAN_PERIOD = 30
LONG_DELTA = 15
SHORT_DELTA = 3
SHORT_WEIGHT = 3
def __init__( self ):
self._lock = threading.Lock()
self._current_speed = 0
self._current_speed_timestamp = 0
self._current_speed_dirty = False
self._timestamps_to_amounts = collections.Counter()
self._next_clean_time = HydrusData.GetNow() + self.CLEAN_PERIOD
def _CleanHistory( self ):
if HydrusData.TimeHasPassed( self._next_clean_time ):
now = HydrusData.GetNow()
invalid_indices = [ timestamp for timestamp in self._timestamps_to_amounts.keys() if timestamp < now - self.LONG_DELTA ]
for timestamp in invalid_indices:
del self._timestamps_to_amounts[ timestamp ]
self._next_clean_time = HydrusData.GetNow() + self.CLEAN_PERIOD
def DataTransferred( self, num_bytes ):
with self._lock:
self._CleanHistory()
now = HydrusData.GetNow()
self._timestamps_to_amounts[ now ] += num_bytes
self._current_speed_dirty = True
def GetCurrentSpeed( self ):
with self._lock:
self._CleanHistory()
now = HydrusData.GetNow()
if self._current_speed_dirty or self._current_speed_timestamp != now:
total_bytes = sum( self._timestamps_to_amounts[ timestamp ] for timestamp in range( now - self.LONG_DELTA, now ) )
total_bytes += sum( self._timestamps_to_amounts[ timestamp ] * self.SHORT_WEIGHT for timestamp in range( now - self.SHORT_DELTA, now ) )
total_weight = self.LONG_DELTA + ( self.SHORT_DELTA * self.SHORT_WEIGHT )
self._current_speed = total_bytes // total_weight # since this is in bytes, an int is fine and proper
self._current_speed_timestamp = now
self._current_speed_dirty = False
return self._current_speed
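# Worked example of the weighting above, for illustration: with LONG_DELTA = 15, SHORT_DELTA = 3
# and SHORT_WEIGHT = 3, total_weight is 15 + 3 * 3 = 24. A steady r bytes/s over the whole window
# sums to 15r + 9r = 24r, so the reported speed is r, while bytes transferred in the last three
# seconds count four times as heavily as earlier bytes, biasing the figure towards recent activity.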

View File

@ -5,6 +5,7 @@ import HydrusGlobals as HG
import os
import psutil
import send2trash
import shlex
import shutil
import stat
import subprocess
@ -315,14 +316,20 @@ def LaunchDirectory( path ):
else:
if HC.PLATFORM_OSX: cmd = [ 'open' ]
elif HC.PLATFORM_LINUX: cmd = [ 'xdg-open' ]
if HC.PLATFORM_OSX:
cmd = 'open'
elif HC.PLATFORM_LINUX:
cmd = 'xdg-open'
cmd.append( path )
cmd += ' "' + path + '"'
# setsid call un-childs this new process
process = subprocess.Popen( cmd, preexec_fn = os.setsid, startupinfo = HydrusData.GetSubprocessStartupInfo() )
process = subprocess.Popen( shlex.split( cmd ), preexec_fn = os.setsid, startupinfo = HydrusData.GetHideTerminalSubprocessStartupInfo() )
process.wait()
@ -346,14 +353,20 @@ def LaunchFile( path ):
else:
if HC.PLATFORM_OSX: cmd = [ 'open' ]
elif HC.PLATFORM_LINUX: cmd = [ 'xdg-open' ]
if HC.PLATFORM_OSX:
cmd = 'open'
elif HC.PLATFORM_LINUX:
cmd = 'xdg-open'
cmd.append( path )
cmd += ' "' + path + '"'
# setsid call un-childs this new process
process = subprocess.Popen( cmd, preexec_fn = os.setsid, startupinfo = HydrusData.GetSubprocessStartupInfo() )
process = subprocess.Popen( shlex.split( cmd ), preexec_fn = os.setsid, startupinfo = HydrusData.GetHideTerminalSubprocessStartupInfo() )
process.wait()
@ -560,6 +573,36 @@ def MirrorTree( source, dest ):
def OpenFileLocation( path ):
def do_it():
if HC.PLATFORM_WINDOWS:
cmd = 'explorer /select,"' + path + '"'
elif HC.PLATFORM_OSX:
cmd = 'open -R "' + path + '"'
elif HC.PLATFORM_LINUX:
raise NotImplementedError()
process = subprocess.Popen( shlex.split( cmd ) )
process.wait()
process.communicate()
thread = threading.Thread( target = do_it )
thread.daemon = True
thread.start()
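# For illustration, the commands built above come out like (paths hypothetical):
#
#   explorer /select,"C:\hydrus\client_files\f0a\example.png"   (Windows)
#   open -R "/Users/you/hydrus/example.png"                     (OS X)
#
# Linux deliberately raises NotImplementedError, which is why the new 'open file location'
# thumbnail menu entry is hidden on that platform.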
def PathsHaveSameSizeAndDate( path1, path2 ):
if os.path.exists( path1 ) and os.path.exists( path2 ):

View File

@ -47,6 +47,7 @@ SERIALISABLE_TYPE_SHORTCUT = 41
SERIALISABLE_TYPE_APPLICATION_COMMAND = 42
SERIALISABLE_TYPE_DUPLICATE_ACTION_OPTIONS = 43
SERIALISABLE_TYPE_TAG_CENSOR = 44
SERIALISABLE_TYPE_BANDWIDTH_MANAGER = 45
SERIALISABLE_TYPES_TO_OBJECT_TYPES = {}

View File

@ -34,7 +34,7 @@ def GetFFMPEGVersion():
try:
proc = subprocess.Popen( cmd, bufsize=10**5, stdout=subprocess.PIPE, stderr=subprocess.PIPE, startupinfo = HydrusData.GetSubprocessStartupInfo() )
proc = subprocess.Popen( cmd, bufsize=10**5, stdout=subprocess.PIPE, stderr=subprocess.PIPE, startupinfo = HydrusData.GetHideTerminalSubprocessStartupInfo() )
except Exception as e:
@ -78,9 +78,9 @@ def GetFFMPEGVersion():
return 'unknown'
def GetFFMPEGVideoProperties( path ):
def GetFFMPEGVideoProperties( path, count_frames_manually = False ):
info = Hydrusffmpeg_parse_infos( path )
info = Hydrusffmpeg_parse_infos( path, count_frames_manually = count_frames_manually )
( w, h ) = info[ 'video_size' ]
@ -258,7 +258,7 @@ def Hydrusffmpeg_parse_infos(filename, print_infos=False, count_frames_manually
try:
proc = subprocess.Popen( cmd, bufsize=10**5, stdout=subprocess.PIPE, stderr=subprocess.PIPE, startupinfo = HydrusData.GetSubprocessStartupInfo() )
proc = subprocess.Popen( cmd, bufsize=10**5, stdout=subprocess.PIPE, stderr=subprocess.PIPE, startupinfo = HydrusData.GetHideTerminalSubprocessStartupInfo() )
except:
@ -274,7 +274,9 @@ def Hydrusffmpeg_parse_infos(filename, print_infos=False, count_frames_manually
infos = proc.stderr.read().decode('utf8')
proc.terminate()
proc.wait()
proc.communicate()
del proc
@ -351,7 +353,7 @@ def Hydrusffmpeg_parse_infos(filename, print_infos=False, count_frames_manually
if len( frame_lines ) > 0:
l = frame_lines[0]
l = frame_lines[-1] # there will be several of these, counting up as the file renders. we hence want the final one
while ' ' in l:
@ -406,7 +408,7 @@ def Hydrusffmpeg_parse_infos(filename, print_infos=False, count_frames_manually
fps = line[match.start():match.end()].split(' ')[1]
if fps.endswith( 'k' ):
if fps.endswith( 'k' ) or float( fps ) > 60:
if not doing_manual_frame_count:
@ -536,7 +538,7 @@ class VideoRendererFFMPEG( object ):
try:
self.process = subprocess.Popen( cmd, bufsize = self.bufsize, stdout=subprocess.PIPE, stderr=subprocess.PIPE, startupinfo = HydrusData.GetSubprocessStartupInfo() )
self.process = subprocess.Popen( cmd, bufsize = self.bufsize, stdout=subprocess.PIPE, stderr=subprocess.PIPE, startupinfo = HydrusData.GetHideTerminalSubprocessStartupInfo() )
except:

View File

@ -428,9 +428,9 @@ class Controller( HydrusController.HydrusController ):
def ServerBandwidthOk( self ):
def ServerBandwidthOK( self ):
return self._admin_service.ServerBandwidthOk()
return self._admin_service.ServerBandwidthOK()
def SetServices( self, services ):

View File

@ -1269,7 +1269,6 @@ class DB( HydrusDB.HydrusDB ):
elif action == 'petition': result = self._RepositoryGetPetition( *args, **kwargs )
elif action == 'registration_keys': result = self._GenerateRegistrationKeysFromAccount( *args, **kwargs )
elif action == 'service_has_file': result = self._RepositoryHasFile( *args, **kwargs )
elif action == 'service_info': result = self._GetServiceInfo( *args, **kwargs )
elif action == 'service_keys': result = self._GetServiceKeys( *args, **kwargs )
elif action == 'services': result = self._GetServices( *args, **kwargs )
elif action == 'services_from_account': result = self._GetServicesFromAccount( *args, **kwargs )

View File

@ -111,12 +111,12 @@ class HydrusResourceRestricted( HydrusServerResources.HydrusResource ):
def _checkBandwidth( self, request ):
if not self._service.BandwidthOk():
if not self._service.BandwidthOK():
raise HydrusExceptions.BandwidthException( 'This service has run out of bandwidth. Please try again later.' )
if not HG.server_controller.ServerBandwidthOk():
if not HG.server_controller.ServerBandwidthOK():
raise HydrusExceptions.BandwidthException( 'This server has run out of bandwidth. Please try again later.' )