Version 256

Hydrus Network Developer 2017-05-17 16:53:02 -05:00
parent de39e34867
commit a45ba372a7
23 changed files with 1114 additions and 318 deletions

View File

@ -8,6 +8,41 @@
<div class="content">
<h3>changelog</h3>
<ul>
<li><h3>version 256</h3></li>
<ul>
<li>the duplicate filter now loads new pairs off the gui thread. it will display 'loading pairs...' during this time</li>
<li>media viewers of all kinds are now more comfortable displaying no media (when this occurs, it is usually a frame or two during startup/shutdown)</li>
<li>the duplicate filter now responds to any media_viewer_browser navigation commands (like view_next) with a media switch action</li>
<li>you can now alter the duplicate filter's background lighten/darken switch intensity from its top hover window's cog icon</li>
<li>fixed a bug in the new dupe pair selection algorithm that was preventing pairs from being presented as groups</li>
<li>the duplicate filter will now speed up workflow by automatically skipping pairs when you have previously chosen to delete one of the files in the current batch</li>
<li>auto-skipped pairs _should_ be auto-reverse-skipped on a 'go back' action</li>
<li>added a |< 'go back' index navigation button to the duplicate filter top hover window</li>
<li>the duplicate filter now displays several 'this file has larger resolution'-type statements about the currently viewed file. it lists them on the top hover window and in the background details text underneath</li>
<li>the duplicate filter _roughly_ attempts to put the better file of the two first. this will always be indexed 'A'</li>
<li>the duplicate filter now shows done/total batch progress in its index string--not sure how clear/helpful this ultimately is, so may need to revisit</li>
<li>an unusual bug where Linux would spam the 'All pairs have been filtered!' duplicate filter message over and over and then crash _should_ be fixed--the filter no longer waits for that message to be OKed before closing itself</li>
<li>drag-and-dropping text onto the client will now a) open a url import page if none already exists and b) put the dropped text into the input box of the first open url import page (and focus it, so you can quickly hit enter)! This works when dragging text links from browsers, as well</li>
<li>you can now 'append' gui sessions, which will just append that session's tabs to whatever is already open--say, if you have several 'favourites' pages you want to be able to quickly load up without having to break your existing workflow</li>
<li>ipfs services now have a 'check daemon' button on their review services panel which will test that the daemon is running and accessible and report its version</li>
<li>fixed the 'test address' button for ipfs services on their manage services panel</li>
<li>the client can now automatically download files it wants and knows are on an ipfs service</li>
<li>middle-click on an 'all known files' domain thumbnail will now correctly start a download (as long as a specific remote file service is known)</li>
<li>the multihash prefix option is reinstated on ipfs manage services panels</li>
<li>the gelbooru parser now discovers the correct page url to associate with its files</li>
<li>wrote some redirect fetching code to fix the gelbooru bad urls issue</li>
<li>discovered a quicker fix for the gelbooru issue--the redirect location is the garbage in the original url in base64</li>
<li>all downloader/subscription url caches will purge any old gelbooru 'redirect.php' urls on update</li>
<li>fixed an issue where 'previously deleted' gallery/thread imports were returning 'fail'</li>
<li>fixed a problem that was causing some redundant laggy work in admin-side petition processing</li>
<li>thread watchers will now remember their file and tag import options through a session save even when no thread url has yet been entered</li>
<li>fixed an issue where media 'removed' from a media viewer view of a collection resulted in the entire collection being removed at the thumbnail level</li>
<li>fixed an issue where media deleted from a media viewer view of a collection resulted in the media not being correctly removed from selection tags</li>
<li>tag, namespace, and wildcard searches on a specific file domain (i.e. other than 'all known files') now take advantage of an optimisation in the autocomplete cache and so run significantly faster</li>
<li>fixed a hover window coordinate calculation issue after minimising the media viewer on some platforms</li>
<li>removed some 'all files failed to download' spam that could sometimes occur</li>
<li>misc fixes</li>
</ul>
<li><h3>version 255</h3></li>
<ul>
<li>the duplicate filter now supports shift+left-click to drag, like the archive/delete filter (this remains hardcoded for now)</li>

View File

@ -1071,6 +1071,8 @@ class DB( HydrusDB.HydrusDB ):
continue
seen_hash_ids_for_this_master_hash_id = set()
for pair in master_hash_ids_to_groups[ master_hash_id ]:
( smaller_hash_id, larger_hash_id ) = pair
@ -1080,8 +1082,8 @@ class DB( HydrusDB.HydrusDB ):
continue
seen_hash_ids.add( smaller_hash_id )
seen_hash_ids.add( larger_hash_id )
seen_hash_ids_for_this_master_hash_id.add( smaller_hash_id )
seen_hash_ids_for_this_master_hash_id.add( larger_hash_id )
pairs_of_hash_ids.append( pair )
@ -1091,6 +1093,8 @@ class DB( HydrusDB.HydrusDB ):
seen_hash_ids.update( seen_hash_ids_for_this_master_hash_id )
if len( pairs_of_hash_ids ) >= MAX_BATCH_SIZE:
break
@ -1706,7 +1710,7 @@ class DB( HydrusDB.HydrusDB ):
return similar_hash_ids
def _CacheSimilarFilesSetDuplicatePairStatus( self, duplicate_status, hash_a, hash_b, merge_options = None ):
def _CacheSimilarFilesSetDuplicatePairStatus( self, duplicate_status, hash_a, hash_b ):
if duplicate_status == HC.DUPLICATE_WORSE:
@ -1720,7 +1724,7 @@ class DB( HydrusDB.HydrusDB ):
hash_id_a = self._GetHashId( hash_a )
hash_id_b = self._GetHashId( hash_b )
self._CacheSimilarFilesSetDuplicatePairStatusSingleRow( duplicate_status, hash_id_a, hash_id_b, merge_options )
self._CacheSimilarFilesSetDuplicatePairStatusSingleRow( duplicate_status, hash_id_a, hash_id_b )
if duplicate_status == HC.DUPLICATE_BETTER:
@ -1736,7 +1740,7 @@ class DB( HydrusDB.HydrusDB ):
for better_than_a_hash_id in better_than_a:
self._CacheSimilarFilesSetDuplicatePairStatusSingleRow( HC.DUPLICATE_BETTER, better_than_a_hash_id, hash_id_b, merge_options )
self._CacheSimilarFilesSetDuplicatePairStatusSingleRow( HC.DUPLICATE_BETTER, better_than_a_hash_id, hash_id_b )
worse_than_b = set()
@ -1746,17 +1750,17 @@ class DB( HydrusDB.HydrusDB ):
for worse_than_b_hash_id in worse_than_b:
self._CacheSimilarFilesSetDuplicatePairStatusSingleRow( HC.DUPLICATE_BETTER, hash_id_a, worse_than_b_hash_id, merge_options )
self._CacheSimilarFilesSetDuplicatePairStatusSingleRow( HC.DUPLICATE_BETTER, hash_id_a, worse_than_b_hash_id )
# do a sync for better dupes that applies not_dupe and alternate relationships across better-than groups
self._CacheSimilarFilesSyncSameFileDuplicates( hash_id_a, merge_options )
self._CacheSimilarFilesSyncSameFileDuplicates( hash_id_b, merge_options )
self._CacheSimilarFilesSyncSameFileDuplicates( hash_id_a )
self._CacheSimilarFilesSyncSameFileDuplicates( hash_id_b )
def _CacheSimilarFilesSetDuplicatePairStatusSingleRow( self, duplicate_status, hash_id_a, hash_id_b, merge_options, only_update_given_previous_status = None ):
def _CacheSimilarFilesSetDuplicatePairStatusSingleRow( self, duplicate_status, hash_id_a, hash_id_b, only_update_given_previous_status = None ):
smaller_hash_id = min( hash_id_a, hash_id_b )
larger_hash_id = max( hash_id_a, hash_id_b )
@ -1784,8 +1788,6 @@ class DB( HydrusDB.HydrusDB ):
change_occured = False
if only_update_given_previous_status is None:
result = self._c.execute( 'SELECT 1 FROM duplicate_pairs WHERE smaller_hash_id = ? AND larger_hash_id = ? AND duplicate_type = ?;', ( smaller_hash_id, larger_hash_id, duplicate_status ) ).fetchone()
@ -1794,42 +1796,14 @@ class DB( HydrusDB.HydrusDB ):
self._c.execute( 'REPLACE INTO duplicate_pairs ( smaller_hash_id, larger_hash_id, duplicate_type ) VALUES ( ?, ?, ? );', ( smaller_hash_id, larger_hash_id, duplicate_status ) )
change_occured = True
else:
self._c.execute( 'UPDATE duplicate_pairs SET duplicate_type = ? WHERE smaller_hash_id = ? AND larger_hash_id = ? AND duplicate_type = ?;', ( duplicate_status, smaller_hash_id, larger_hash_id, only_update_given_previous_status ) )
if self._GetRowCount() > 0:
change_occured = True
if change_occured and merge_options is not None:
# follow merge_options
# if better:
# do tags
# do ratings
# delete file
# if same:
# do tags
# do ratings
pass
def _CacheSimilarFilesSyncSameFileDuplicates( self, hash_id, merge_options ):
def _CacheSimilarFilesSyncSameFileDuplicates( self, hash_id ):
# for every known relationship our file has, that should be replicated to all of its 'same file' siblings
@ -1877,7 +1851,7 @@ class DB( HydrusDB.HydrusDB ):
continue
self._CacheSimilarFilesSetDuplicatePairStatusSingleRow( duplicate_status, sibling_hash_id, other_hash_id, merge_options )
self._CacheSimilarFilesSetDuplicatePairStatusSingleRow( duplicate_status, sibling_hash_id, other_hash_id )
@ -3505,17 +3479,19 @@ class DB( HydrusDB.HydrusDB ):
for search_tag_service_id in search_tag_service_ids:
( current_mappings_table_name, deleted_mappings_table_name, pending_mappings_table_name, petitioned_mappings_table_name ) = GenerateMappingsTableNames( search_tag_service_id )
if file_service_key == CC.COMBINED_FILE_SERVICE_KEY:
( current_mappings_table_name, deleted_mappings_table_name, pending_mappings_table_name, petitioned_mappings_table_name ) = GenerateMappingsTableNames( search_tag_service_id )
current_selects.append( 'SELECT hash_id FROM ' + current_mappings_table_name + ' NATURAL JOIN tags WHERE namespace_id = ' + str( namespace_id ) + ';' )
pending_selects.append( 'SELECT hash_id FROM ' + pending_mappings_table_name + ' NATURAL JOIN tags WHERE namespace_id = ' + str( namespace_id ) + ';' )
else:
current_selects.append( 'SELECT hash_id FROM ' + current_mappings_table_name + ' NATURAL JOIN current_files NATURAL JOIN tags WHERE service_id = ' + str( file_service_id ) + ' AND namespace_id = ' + str( namespace_id ) + ';' )
pending_selects.append( 'SELECT hash_id FROM ' + pending_mappings_table_name + ' NATURAL JOIN current_files NATURAL JOIN tags WHERE service_id = ' + str( file_service_id ) + ' AND namespace_id = ' + str( namespace_id ) + ';' )
( cache_files_table_name, cache_current_mappings_table_name, cache_pending_mappings_table_name, ac_cache_table_name ) = GenerateSpecificMappingsCacheTableNames( file_service_id, search_tag_service_id )
current_selects.append( 'SELECT hash_id FROM ' + cache_current_mappings_table_name + ' NATURAL JOIN tags WHERE namespace_id = ' + str( namespace_id ) + ';' )
pending_selects.append( 'SELECT hash_id FROM ' + cache_pending_mappings_table_name + ' NATURAL JOIN tags WHERE namespace_id = ' + str( namespace_id ) + ';' )
@ -4023,17 +3999,19 @@ class DB( HydrusDB.HydrusDB ):
for search_tag_service_id in search_tag_service_ids:
( current_mappings_table_name, deleted_mappings_table_name, pending_mappings_table_name, petitioned_mappings_table_name ) = GenerateMappingsTableNames( search_tag_service_id )
if file_service_key == CC.COMBINED_FILE_SERVICE_KEY:
( current_mappings_table_name, deleted_mappings_table_name, pending_mappings_table_name, petitioned_mappings_table_name ) = GenerateMappingsTableNames( search_tag_service_id )
current_selects.append( 'SELECT hash_id FROM ' + current_mappings_table_name + ' NATURAL JOIN tags WHERE namespace_id = ' + str( namespace_id ) + ' AND subtag_id = ' + str( subtag_id ) + ';' )
pending_selects.append( 'SELECT hash_id FROM ' + pending_mappings_table_name + ' NATURAL JOIN tags WHERE namespace_id = ' + str( namespace_id ) + ' AND subtag_id = ' + str( subtag_id ) + ';' )
else:
current_selects.append( 'SELECT hash_id FROM ' + current_mappings_table_name + ' NATURAL JOIN current_files NATURAL JOIN tags WHERE current_files.service_id = ' + str( file_service_id ) + ' AND namespace_id = ' + str( namespace_id ) + ' AND subtag_id = ' + str( subtag_id ) + ';' )
pending_selects.append( 'SELECT hash_id FROM ' + pending_mappings_table_name + ' NATURAL JOIN current_files NATURAL JOIN tags WHERE current_files.service_id = ' + str( file_service_id ) + ' AND namespace_id = ' + str( namespace_id ) + ' AND subtag_id = ' + str( subtag_id ) + ';' )
( cache_files_table_name, cache_current_mappings_table_name, cache_pending_mappings_table_name, ac_cache_table_name ) = GenerateSpecificMappingsCacheTableNames( file_service_id, search_tag_service_id )
current_selects.append( 'SELECT hash_id FROM ' + cache_current_mappings_table_name + ' NATURAL JOIN tags WHERE namespace_id = ' + str( namespace_id ) + ' AND subtag_id = ' + str( subtag_id ) + ';' )
pending_selects.append( 'SELECT hash_id FROM ' + cache_pending_mappings_table_name + ' NATURAL JOIN tags WHERE namespace_id = ' + str( namespace_id ) + ' AND subtag_id = ' + str( subtag_id ) + ';' )
@ -4048,17 +4026,19 @@ class DB( HydrusDB.HydrusDB ):
for search_tag_service_id in search_tag_service_ids:
( current_mappings_table_name, deleted_mappings_table_name, pending_mappings_table_name, petitioned_mappings_table_name ) = GenerateMappingsTableNames( search_tag_service_id )
if file_service_key == CC.COMBINED_FILE_SERVICE_KEY:
( current_mappings_table_name, deleted_mappings_table_name, pending_mappings_table_name, petitioned_mappings_table_name ) = GenerateMappingsTableNames( search_tag_service_id )
current_selects.append( 'SELECT hash_id FROM ' + current_mappings_table_name + ' NATURAL JOIN tags WHERE subtag_id = ' + str( subtag_id ) + ';' )
pending_selects.append( 'SELECT hash_id FROM ' + pending_mappings_table_name + ' NATURAL JOIN tags WHERE subtag_id = ' + str( subtag_id ) + ';' )
else:
current_selects.append( 'SELECT hash_id FROM ' + current_mappings_table_name + ' NATURAL JOIN current_files NATURAL JOIN tags WHERE current_files.service_id = ' + str( file_service_id ) + ' AND subtag_id = ' + str( subtag_id ) + ';' )
pending_selects.append( 'SELECT hash_id FROM ' + pending_mappings_table_name + ' NATURAL JOIN current_files NATURAL JOIN tags WHERE current_files.service_id = ' + str( file_service_id ) + ' AND subtag_id = ' + str( subtag_id ) + ';' )
( cache_files_table_name, cache_current_mappings_table_name, cache_pending_mappings_table_name, ac_cache_table_name ) = GenerateSpecificMappingsCacheTableNames( file_service_id, search_tag_service_id )
current_selects.append( 'SELECT hash_id FROM ' + cache_current_mappings_table_name + ' NATURAL JOIN tags WHERE subtag_id = ' + str( subtag_id ) + ';' )
pending_selects.append( 'SELECT hash_id FROM ' + cache_pending_mappings_table_name + ' NATURAL JOIN tags WHERE subtag_id = ' + str( subtag_id ) + ';' )
@ -4154,17 +4134,19 @@ class DB( HydrusDB.HydrusDB ):
for search_tag_service_id in search_tag_service_ids:
( current_mappings_table_name, deleted_mappings_table_name, pending_mappings_table_name, petitioned_mappings_table_name ) = GenerateMappingsTableNames( search_tag_service_id )
if file_service_key == CC.COMBINED_FILE_SERVICE_KEY:
( current_mappings_table_name, deleted_mappings_table_name, pending_mappings_table_name, petitioned_mappings_table_name ) = GenerateMappingsTableNames( search_tag_service_id )
current_selects.append( 'SELECT hash_id FROM ' + current_mappings_table_name + ' NATURAL JOIN tags WHERE namespace_id IN ' + HydrusData.SplayListForDB( possible_namespace_ids ) + ' AND subtag_id IN ' + HydrusData.SplayListForDB( possible_subtag_ids ) + ';' )
pending_selects.append( 'SELECT hash_id FROM ' + pending_mappings_table_name + ' NATURAL JOIN tags WHERE namespace_id IN ' + HydrusData.SplayListForDB( possible_namespace_ids ) + ' AND subtag_id IN ' + HydrusData.SplayListForDB( possible_subtag_ids ) + ';' )
else:
current_selects.append( 'SELECT hash_id FROM ' + current_mappings_table_name + ' NATURAL JOIN current_files NATURAL JOIN tags WHERE current_files.service_id = ' + str( file_service_id ) + ' AND namespace_id IN ' + HydrusData.SplayListForDB( possible_namespace_ids ) + ' AND subtag_id IN ' + HydrusData.SplayListForDB( possible_subtag_ids ) + ';' )
pending_selects.append( 'SELECT hash_id FROM ' + pending_mappings_table_name + ' NATURAL JOIN current_files NATURAL JOIN tags WHERE current_files.service_id = ' + str( file_service_id ) + ' AND namespace_id IN ' + HydrusData.SplayListForDB( possible_namespace_ids ) + ' AND subtag_id IN ' + HydrusData.SplayListForDB( possible_subtag_ids ) + ';' )
( cache_files_table_name, cache_current_mappings_table_name, cache_pending_mappings_table_name, ac_cache_table_name ) = GenerateSpecificMappingsCacheTableNames( file_service_id, search_tag_service_id )
current_selects.append( 'SELECT hash_id FROM ' + cache_current_mappings_table_name + ' NATURAL JOIN tags WHERE namespace_id IN ' + HydrusData.SplayListForDB( possible_namespace_ids ) + ' AND subtag_id IN ' + HydrusData.SplayListForDB( possible_subtag_ids ) + ';' )
pending_selects.append( 'SELECT hash_id FROM ' + cache_pending_mappings_table_name + ' NATURAL JOIN tags WHERE namespace_id IN ' + HydrusData.SplayListForDB( possible_namespace_ids ) + ' AND subtag_id IN ' + HydrusData.SplayListForDB( possible_subtag_ids ) + ';' )
@ -4174,17 +4156,19 @@ class DB( HydrusDB.HydrusDB ):
for search_tag_service_id in search_tag_service_ids:
( current_mappings_table_name, deleted_mappings_table_name, pending_mappings_table_name, petitioned_mappings_table_name ) = GenerateMappingsTableNames( search_tag_service_id )
if file_service_key == CC.COMBINED_FILE_SERVICE_KEY:
( current_mappings_table_name, deleted_mappings_table_name, pending_mappings_table_name, petitioned_mappings_table_name ) = GenerateMappingsTableNames( search_tag_service_id )
current_selects.append( 'SELECT hash_id FROM ' + current_mappings_table_name + ' NATURAL JOIN tags WHERE subtag_id IN ' + HydrusData.SplayListForDB( possible_subtag_ids ) + ';' )
pending_selects.append( 'SELECT hash_id FROM ' + pending_mappings_table_name + ' NATURAL JOIN tags WHERE subtag_id IN ' + HydrusData.SplayListForDB( possible_subtag_ids ) + ';' )
else:
current_selects.append( 'SELECT hash_id FROM ' + current_mappings_table_name + ' NATURAL JOIN current_files NATURAL JOIN tags WHERE current_files.service_id = ' + str( file_service_id ) + ' AND subtag_id IN ' + HydrusData.SplayListForDB( possible_subtag_ids ) + ';' )
pending_selects.append( 'SELECT hash_id FROM ' + pending_mappings_table_name + ' NATURAL JOIN current_files NATURAL JOIN tags WHERE current_files.service_id = ' + str( file_service_id ) + ' AND subtag_id IN ' + HydrusData.SplayListForDB( possible_subtag_ids ) + ';' )
( cache_files_table_name, cache_current_mappings_table_name, cache_pending_mappings_table_name, ac_cache_table_name ) = GenerateSpecificMappingsCacheTableNames( file_service_id, search_tag_service_id )
current_selects.append( 'SELECT hash_id FROM ' + cache_current_mappings_table_name + ' NATURAL JOIN tags WHERE subtag_id IN ' + HydrusData.SplayListForDB( possible_subtag_ids ) + ';' )
pending_selects.append( 'SELECT hash_id FROM ' + cache_pending_mappings_table_name + ' NATURAL JOIN tags WHERE subtag_id IN ' + HydrusData.SplayListForDB( possible_subtag_ids ) + ';' )
@ -4355,14 +4339,18 @@ class DB( HydrusDB.HydrusDB ):
if result is not None:
return ( CC.STATUS_DELETED, None )
hash = self._GetHash( hash_id )
return ( CC.STATUS_DELETED, hash )
result = self._c.execute( 'SELECT 1 FROM current_files WHERE service_id = ? AND hash_id = ?;', ( self._trash_service_id, hash_id ) ).fetchone()
if result is not None:
return ( CC.STATUS_DELETED, None )
hash = self._GetHash( hash_id )
return ( CC.STATUS_DELETED, hash )
result = self._c.execute( 'SELECT 1 FROM current_files WHERE service_id = ? AND hash_id = ?;', ( self._combined_local_file_service_id, hash_id ) ).fetchone()
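The repeated hunks above all make the same change behind the changelog item about specific-domain tag, namespace, and wildcard searches running faster: when the file domain is specific (not 'all known files'), the query now reads the per-(file service, tag service) mappings cache table directly instead of joining the full per-tag-service mappings table against current_files. A minimal sketch of the two query shapes, with illustrative table names standing in for the GenerateMappingsTableNames / GenerateSpecificMappingsCacheTableNames helpers:

def namespace_search_select( namespace_id, file_service_id, use_specific_cache ):
    """Sketch of the before/after SELECT shapes in the hunks above.
    Table names are placeholders for the generator helpers in the real schema."""
    
    if use_specific_cache:
        # after: the cache table only holds rows for this file domain already,
        # so no join against current_files is needed at query time
        cache_current_mappings = 'specific_current_mappings_cache'  # assumed name
        return ( 'SELECT hash_id FROM ' + cache_current_mappings +
                 ' NATURAL JOIN tags WHERE namespace_id = ' + str( namespace_id ) + ';' )
    else:
        # before: filter the full mappings table down to the file domain with a join
        current_mappings = 'current_mappings'  # assumed name
        return ( 'SELECT hash_id FROM ' + current_mappings +
                 ' NATURAL JOIN current_files NATURAL JOIN tags WHERE service_id = ' +
                 str( file_service_id ) + ' AND namespace_id = ' + str( namespace_id ) + ';' )

# e.g. the optimised form for namespace_id 7 in file service 2:
print( namespace_search_select( 7, 2, True ) )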

View File

@ -2,6 +2,7 @@ import ClientThreading
import HydrusConstants as HC
import HydrusData
import HydrusExceptions
import HydrusGlobals as HG
import HydrusNATPunch
import HydrusPaths
import HydrusSerialisable
@ -95,52 +96,64 @@ def DAEMONDownloadFiles( controller ):
continue
if service.GetServiceType() != HC.FILE_REPOSITORY:
if service.GetServiceType() == HC.FILE_REPOSITORY:
continue
file_repository = service
file_repository = service
if file_repository.IsFunctional():
try:
( os_file_handle, temp_path ) = HydrusPaths.GetTempPath()
if file_repository.IsFunctional():
try:
file_repository.Request( HC.GET, 'file', { 'hash' : hash }, temp_path = temp_path )
( os_file_handle, temp_path ) = HydrusPaths.GetTempPath()
controller.WaitUntilPubSubsEmpty()
try:
file_repository.Request( HC.GET, 'file', { 'hash' : hash }, temp_path = temp_path )
controller.WaitUntilPubSubsEmpty()
client_files_manager.ImportFile( temp_path, override_deleted = True )
successful_hashes.add( hash )
break
finally:
HydrusPaths.CleanUpTempPath( os_file_handle, temp_path )
client_files_manager.ImportFile( temp_path, override_deleted = True )
except HydrusExceptions.ServerBusyException:
successful_hashes.add( hash )
job_key.SetVariable( 'popup_text_1', file_repository.GetName() + ' was busy. waiting 30s before trying again' )
break
time.sleep( 30 )
finally:
job_key.Delete()
HydrusPaths.CleanUpTempPath( os_file_handle, temp_path )
controller.pub( 'notify_new_downloads' )
return
except Exception as e:
HydrusData.ShowText( 'Error downloading file!' )
HydrusData.ShowException( e )
except HydrusExceptions.ServerBusyException:
elif service.GetServiceType() == HC.IPFS:
multihashes = HG.client_controller.Read( 'service_filenames', service_key, { hash } )
if len( multihashes ) > 0:
job_key.SetVariable( 'popup_text_1', file_repository.GetName() + ' was busy. waiting 30s before trying again' )
multihash = multihashes[0]
time.sleep( 30 )
# this actually calls to a thread that can launch gui 'select from tree' stuff, so let's just break at this point
service.ImportFile( multihash )
job_key.Delete()
controller.pub( 'notify_new_downloads' )
return
except Exception as e:
HydrusData.ShowText( 'Error downloading file!' )
HydrusData.ShowException( e )
break
@ -155,10 +168,6 @@ def DAEMONDownloadFiles( controller ):
job_key.SetVariable( 'popup_text_1', HydrusData.ConvertIntToPrettyString( len( successful_hashes ) ) + ' files downloaded' )
else:
job_key.SetVariable( 'popup_text_1', 'all files failed to download' )
job_key.Delete()
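The DAEMONDownloadFiles hunk above is hard to read with its removed and added lines interleaved; the net change behind 'the client can now automatically download files it wants and knows are on an ipfs service' is a dispatch on service type. Below is a condensed sketch of the new per-hash control flow, where the HC, HydrusPaths, controller, and service calls are the Hydrus objects from the diff and the surrounding server-busy/retry handling is omitted:

import HydrusConstants as HC
import HydrusPaths

def download_one_hash( hash, services_to_try, controller, client_files_manager, successful_hashes ):
    # condensed sketch of the per-hash loop in DAEMONDownloadFiles above
    for ( service_key, service ) in services_to_try:
        
        service_type = service.GetServiceType()
        
        if service_type == HC.FILE_REPOSITORY and service.IsFunctional():
            
            ( os_file_handle, temp_path ) = HydrusPaths.GetTempPath()
            
            try:
                # pull the file over the network into a temp path, then import it
                service.Request( HC.GET, 'file', { 'hash' : hash }, temp_path = temp_path )
                
                controller.WaitUntilPubSubsEmpty()
                
                client_files_manager.ImportFile( temp_path, override_deleted = True )
                
                successful_hashes.add( hash )
                
                break
                
            finally:
                HydrusPaths.CleanUpTempPath( os_file_handle, temp_path )
            
        elif service_type == HC.IPFS:
            
            multihashes = controller.Read( 'service_filenames', service_key, { hash } )
            
            if len( multihashes ) > 0:
                # the ipfs service object runs its own threaded import of the multihash
                # (it can raise gui 'select from tree' dialogs), so the daemon just
                # kicks it off and moves on
                service.ImportFile( multihashes[0] )
            
            break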

View File

@ -217,10 +217,10 @@ def DeletePath( path ):
def GetDifferentLighterDarkerColour( colour, intensity = 3 ):
( r, g, b ) = colour.Get()
if ColourIsGreyish( colour ):
( r, g, b ) = colour.Get()
if ColourIsBright( colour ):
colour = wx.Colour( int( g * ( 1 - 0.05 * intensity ) ), b, r )
@ -239,6 +239,11 @@ def GetDifferentLighterDarkerColour( colour, intensity = 3 ):
def GetLighterDarkerColour( colour, intensity = 3 ):
if intensity is None or intensity == 0:
return colour
if ColourIsBright( colour ):
return wx.lib.colourutils.AdjustColour( colour, -5 * intensity )
@ -781,6 +786,8 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):
self._dictionary[ 'noneable_integers' ][ 'maintenance_vacuum_period_days' ] = 30
self._dictionary[ 'noneable_integers' ][ 'duplicate_background_switch_intensity' ] = 3
#
self._dictionary[ 'noneable_strings' ] = {}
@ -1497,6 +1504,21 @@ class DuplicateActionOptions( HydrusSerialisable.SerialisableBase ):
self._service_actions = [ ( serialisable_service_key.decode( 'hex' ), action ) for ( serialisable_service_key, action ) in serialisable_service_actions ]
def GetDeletedHashes( self, first_media, second_media ):
first_hashes = first_media.GetHashes()
second_hashes = second_media.GetHashes()
if self._delete_second_file:
return second_hashes
else:
return set()
def SetTuple( self, service_actions, delete_second_file ):
self._service_actions = service_actions
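The ClientData hunks above add the 'duplicate_background_switch_intensity' noneable integer (default 3) and thread an intensity parameter through the colour helpers: None or 0 means leave the background alone, and for a bright colour GetLighterDarkerColour passes -5 * intensity to wx.lib.colourutils.AdjustColour (the branch for dark colours falls outside this excerpt, presumably the opposite sign). A toolkit-free approximation of that scaling rule, where the luma test is an assumption standing in for the ColourIsBright helper that is not shown here:

def lighter_darker( rgb, intensity = 3 ):
    """Shift a colour by roughly 5% per intensity step; None/0 means 'do not change'."""
    
    if intensity is None or intensity == 0:
        return rgb
    
    ( r, g, b ) = rgb
    
    # assumed luma threshold standing in for ClientData.ColourIsBright
    is_bright = ( 0.299 * r + 0.587 * g + 0.114 * b ) > 127
    
    percent = 5 * intensity
    factor = ( 100 - percent ) / 100.0 if is_bright else ( 100 + percent ) / 100.0
    
    return tuple( max( 0, min( 255, int( c * factor ) ) ) for c in ( r, g, b ) )

# the default intensity of 3 darkens a bright duplicate-filter background by about 15%
print( lighter_darker( ( 240, 240, 240 ) ) )  # (204, 204, 204)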

View File

@ -9,7 +9,9 @@ import lxml # to force import for later bs4 stuff
import os
import pafy
import re
import requests
import threading
import time
import urllib
import urlparse
import HydrusData
@ -852,6 +854,52 @@ class GalleryBooru( Gallery ):
if 'gelbooru.com' in url_base:
# they now use redirect urls for thumbs, wew lad
bad_urls = urls
urls = []
session = requests.Session()
for bad_url in bad_urls:
# turns out the garbage after the redirect is the redirect in base64, so let's not waste time doing this
#url = ClientNetworking.RequestsGetRedirectURL( bad_url, session )
#
#urls.append( url )
#
#time.sleep( 0.5 )
# https://gelbooru.com/redirect.php?s=Ly9nZWxib29ydS5jb20vaW5kZXgucGhwP3BhZ2U9cG9zdCZzPXZpZXcmaWQ9MzY5NDEyMg==
try:
encoded_location = bad_url.split( '?s=' )[1]
location = encoded_location.decode( 'base64' )
url = urlparse.urljoin( bad_url, location )
urls.append( url )
except Exception as e:
HydrusData.ShowText( 'gelbooru parsing problem!' )
HydrusData.ShowException( e )
url = ClientNetworking.RequestsGetRedirectURL( bad_url, session )
urls.append( url )
time.sleep( 0.5 )
return ( urls, definitely_no_more_pages )
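The gelbooru block above is the 'quicker fix' from the changelog: rather than following every redirect.php thumb link over the network (the commented-out RequestsGetRedirectURL path, which also slept 0.5s per url), the real destination is recovered locally, because everything after '?s=' is just the target url base64-encoded. A standalone version using the example url from the comment, with Python 3 equivalents of the Python 2 calls in the diff:

import base64
from urllib.parse import urljoin

def fix_gelbooru_redirect_url( bad_url ):
    
    # everything after '?s=' is the redirect destination, base64-encoded
    encoded_location = bad_url.split( '?s=' )[1]
    
    location = base64.b64decode( encoded_location ).decode( 'utf-8' )
    
    # location is scheme-relative ('//gelbooru.com/...'), so join it against the original url
    return urljoin( bad_url, location )

bad_url = 'https://gelbooru.com/redirect.php?s=Ly9nZWxib29ydS5jb20vaW5kZXgucGhwP3BhZ2U9cG9zdCZzPXZpZXcmaWQ9MzY5NDEyMg=='

print( fix_gelbooru_redirect_url( bad_url ) )
# https://gelbooru.com/index.php?page=post&s=view&id=3694122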

View File

@ -1,23 +1,27 @@
import HydrusGlobals as HG
import wx
class FileDropTarget( wx.PyDropTarget ):
def __init__( self, filenames_callable ):
def __init__( self, filenames_callable = None, url_callable = None ):
wx.PyDropTarget.__init__( self )
self._filenames_callable = filenames_callable
self._url_callable = url_callable
self._receiving_data_object = wx.DataObjectComposite()
self._hydrus_media_data_object = wx.CustomDataObject( 'application/hydrus-media' )
self._file_data_object = wx.FileDataObject()
self._text_data_object = wx.TextDataObject()
self._receiving_data_object.Add( self._hydrus_media_data_object, True )
self._receiving_data_object.Add( self._file_data_object )
self._receiving_data_object.Add( self._text_data_object )
self.SetDataObject( self._receiving_data_object )
self._filenames_callable = filenames_callable
def OnData( self, x, y, result ):
@ -25,12 +29,20 @@ class FileDropTarget( wx.PyDropTarget ):
received_format = self._receiving_data_object.GetReceivedFormat()
if received_format.GetType() == wx.DF_FILENAME:
received_format_type = received_format.GetType()
if received_format_type == wx.DF_FILENAME and self._filenames_callable is not None:
paths = self._file_data_object.GetFilenames()
wx.CallAfter( self._filenames_callable, paths )
elif received_format_type in ( wx.DF_TEXT, wx.DF_UNICODETEXT ) and self._url_callable is not None:
text = self._text_data_object.GetText()
self._url_callable( text )
else:
try:

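The reworked FileDropTarget above registers a composite data object holding hydrus-media, file, and text formats, then dispatches in OnData on the received format: filename drops go to filenames_callable and text drops go to the new url_callable, either of which may be None. The following is a cut-down standalone version of the same pattern; wx.DropTarget is the modern wxPython name for the PyDropTarget base used in the diff, and the custom 'application/hydrus-media' format is left out:

import wx

class TextOrFileDropTarget( wx.DropTarget ):
    
    def __init__( self, filenames_callable = None, url_callable = None ):
        
        wx.DropTarget.__init__( self )
        
        self._filenames_callable = filenames_callable
        self._url_callable = url_callable
        
        # one composite object accepts several formats; wx reports which one arrived
        self._receiving_data_object = wx.DataObjectComposite()
        self._file_data_object = wx.FileDataObject()
        self._text_data_object = wx.TextDataObject()
        
        self._receiving_data_object.Add( self._file_data_object, True )
        self._receiving_data_object.Add( self._text_data_object )
        
        self.SetDataObject( self._receiving_data_object )
    
    def OnData( self, x, y, result ):
        
        if self.GetData():
            
            received_type = self._receiving_data_object.GetReceivedFormat().GetType()
            
            if received_type == wx.DF_FILENAME and self._filenames_callable is not None:
                
                wx.CallAfter( self._filenames_callable, self._file_data_object.GetFilenames() )
                
            elif received_type in ( wx.DF_TEXT, wx.DF_UNICODETEXT ) and self._url_callable is not None:
                
                self._url_callable( self._text_data_object.GetText() )
        
        return result

if __name__ == '__main__':
    
    app = wx.App()
    frame = wx.Frame( None, title = 'drop a file or a link here' )
    frame.SetDropTarget( TextOrFileDropTarget( lambda paths: print( 'files:', paths ), lambda url: print( 'url:', url ) ) )
    frame.Show()
    app.MainLoop()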
View File

@ -66,7 +66,7 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
ClientGUITopLevelWindows.FrameThatResizes.__init__( self, None, title, 'main_gui', float_on_parent = False )
self.SetDropTarget( ClientDragDrop.FileDropTarget( self.ImportFiles ) )
self.SetDropTarget( ClientDragDrop.FileDropTarget( self.ImportFiles, self.ImportURL ) )
self._statusbar = self.CreateStatusBar()
self._statusbar.SetFieldsCount( 4 )
@ -269,6 +269,89 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
def _AppendGUISession( self, name ):
def do_it( session ):
try:
if not HC.PLATFORM_LINUX:
# on linux, this stops session pages from accepting keyboard input, wew
wx.CallAfter( self._notebook.Disable )
for ( page_name, management_controller, initial_hashes ) in session.IteratePages():
try:
if len( initial_hashes ) > 0:
initial_media_results = []
for group_of_inital_hashes in HydrusData.SplitListIntoChunks( initial_hashes, 256 ):
more_media_results = self._controller.Read( 'media_results', group_of_inital_hashes )
initial_media_results.extend( more_media_results )
self._media_status_override = u'Loading session page \'' + page_name + u'\'\u2026 ' + HydrusData.ConvertValueRangeToPrettyString( len( initial_media_results ), len( initial_hashes ) )
self._controller.pub( 'refresh_status' )
else:
initial_media_results = []
wx.CallAfter( self._NewPage, page_name, management_controller, initial_media_results = initial_media_results )
except Exception as e:
HydrusData.ShowException( e )
finally:
self._loading_session = False
self._media_status_override = None
if not HC.PLATFORM_LINUX:
wx.CallAfter( self._notebook.Enable )
if self._loading_session:
HydrusData.ShowText( 'Sorry, currently loading a session. Please wait.' )
return
self._loading_session = True
try:
session = self._controller.Read( 'serialisable_named', HydrusSerialisable.SERIALISABLE_TYPE_GUI_SESSION, name )
except Exception as e:
HydrusData.ShowText( 'While trying to load session ' + name + ', this error happened:' )
HydrusData.ShowException( e )
self._NewPageQuery( CC.LOCAL_FILE_SERVICE_KEY )
return
self._controller.CallToThread( do_it, session )
def _AutoRepoSetup( self ):
def do_it():
@ -700,7 +783,10 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
self._notebook.RemovePage( selection )
if self._notebook.GetPageCount() == 0: self._focus_holder.SetFocus()
if self._notebook.GetPageCount() == 0:
self._focus_holder.SetFocus()
self._controller.pub( 'notify_new_undo' )
@ -1017,6 +1103,15 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
ClientGUIMenus.AppendMenu( sessions, load, 'load' )
append = wx.Menu()
for name in gui_session_names:
ClientGUIMenus.AppendMenuItem( self, append, name, 'Append this session to whatever pages are already open.', self._AppendGUISession, name )
ClientGUIMenus.AppendMenu( sessions, append, 'append' )
ClientGUIMenus.AppendMenuItem( self, sessions, 'save current', 'Save the existing open pages as a session.', self._SaveGUISession )
@ -1613,22 +1708,6 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
return
self._loading_session = True
try:
session = self._controller.Read( 'serialisable_named', HydrusSerialisable.SERIALISABLE_TYPE_GUI_SESSION, name )
except Exception as e:
HydrusData.ShowText( 'While trying to load session ' + name + ', this error happened:' )
HydrusData.ShowException( e )
self._NewPageQuery( CC.LOCAL_FILE_SERVICE_KEY )
return
for page in [ self._notebook.GetPage( i ) for i in range( self._notebook.GetPageCount() ) ]:
try:
@ -1646,62 +1725,7 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
self._CloseCurrentPage( polite = False )
def do_it():
try:
if not HC.PLATFORM_LINUX:
# on linux, this stops session pages from accepting keyboard input, wew
wx.CallAfter( self._notebook.Disable )
for ( page_name, management_controller, initial_hashes ) in session.IteratePages():
try:
if len( initial_hashes ) > 0:
initial_media_results = []
for group_of_inital_hashes in HydrusData.SplitListIntoChunks( initial_hashes, 256 ):
more_media_results = self._controller.Read( 'media_results', group_of_inital_hashes )
initial_media_results.extend( more_media_results )
self._media_status_override = u'Loading session page \'' + page_name + u'\'\u2026 ' + HydrusData.ConvertValueRangeToPrettyString( len( initial_media_results ), len( initial_hashes ) )
self._controller.pub( 'refresh_status' )
else:
initial_media_results = []
wx.CallAfter( self._NewPage, page_name, management_controller, initial_media_results = initial_media_results )
except Exception as e:
HydrusData.ShowException( e )
finally:
self._loading_session = False
self._media_status_override = None
if not HC.PLATFORM_LINUX:
wx.CallAfter( self._notebook.Enable )
self._controller.CallToThread( do_it )
self._AppendGUISession( name )
def _ManageAccountTypes( self, service_key ):
@ -3153,6 +3177,31 @@ The password is cleartext here but obscured in the entry dialog. Enter a blank p
self._ImportFiles( paths )
def ImportURL( self, url ):
if True not in ( page.IsURLImportPage() for page in [ self._notebook.GetPage( i ) for i in range( self._notebook.GetPageCount() ) ] ):
self._NewPageImportURLs()
for ( page, i ) in [ ( self._notebook.GetPage( i ), i ) for i in range( self._notebook.GetPageCount() ) ]:
if page.IsURLImportPage():
if page != self._notebook.GetCurrentPage():
self._notebook.SetSelection( i )
page_key = page.GetPageKey()
HG.client_controller.pub( 'set_page_url_input', page_key, url )
break
def NewPageDuplicateFilter( self ):
self._NewPageDuplicateFilter()
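_AppendGUISession above (which the old load-session path now calls after closing its pages) follows a wx pattern worth isolating from the diff: the session's media results are read on a worker thread in chunks of 256 hashes, and each finished page is handed back to the GUI thread with wx.CallAfter, since only that thread may touch the notebook. A minimal self-contained sketch of that shape, with the callables standing in for the controller's Read and FrameGUI._NewPage:

import threading
import wx

def append_session_pages( page_specs, read_media_results, new_page_callable ):
    """Sketch of the _AppendGUISession pattern: heavy reads off the GUI thread,
    results marshalled back with wx.CallAfter.
    page_specs: iterable of ( page_name, initial_hashes )
    read_media_results: stands in for controller.Read( 'media_results', chunk )
    new_page_callable: GUI-side page constructor, stands in for FrameGUI._NewPage"""
    
    def do_it():
        
        for ( page_name, initial_hashes ) in page_specs:
            
            initial_media_results = []
            
            # mirrors HydrusData.SplitListIntoChunks( initial_hashes, 256 )
            for i in range( 0, len( initial_hashes ), 256 ):
                
                initial_media_results.extend( read_media_results( initial_hashes[ i : i + 256 ] ) )
            
            # only the GUI thread may create notebook pages
            wx.CallAfter( new_page_callable, page_name, initial_media_results )
    
    # stands in for controller.CallToThread( do_it, session )
    threading.Thread( target = do_it, daemon = True ).start()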

View File

@ -1189,12 +1189,18 @@ class Canvas( wx.Window ):
def _Archive( self ):
HG.client_controller.Write( 'content_updates', { CC.COMBINED_LOCAL_FILE_SERVICE_KEY : [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_FILES, HC.CONTENT_UPDATE_ARCHIVE, ( self._current_media.GetHash(), ) ) ] } )
if self._current_media is not None:
HG.client_controller.Write( 'content_updates', { CC.COMBINED_LOCAL_FILE_SERVICE_KEY : [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_FILES, HC.CONTENT_UPDATE_ARCHIVE, ( self._current_media.GetHash(), ) ) ] } )
def _CopyBMPToClipboard( self ):
HG.client_controller.pub( 'clipboard', 'bmp', self._current_media )
if self._current_media is not None:
HG.client_controller.pub( 'clipboard', 'bmp', self._current_media )
def _CopyHashToClipboard( self, hash_type ):
@ -1226,24 +1232,35 @@ class Canvas( wx.Window ):
def _CopyFileToClipboard( self ):
client_files_manager = HG.client_controller.GetClientFilesManager()
paths = [ client_files_manager.GetFilePath( self._current_media.GetHash(), self._current_media.GetMime() ) ]
HG.client_controller.pub( 'clipboard', 'paths', paths )
if self._current_media is not None:
client_files_manager = HG.client_controller.GetClientFilesManager()
paths = [ client_files_manager.GetFilePath( self._current_media.GetHash(), self._current_media.GetMime() ) ]
HG.client_controller.pub( 'clipboard', 'paths', paths )
def _CopyPathToClipboard( self ):
client_files_manager = HG.client_controller.GetClientFilesManager()
path = client_files_manager.GetFilePath( self._current_media.GetHash(), self._current_media.GetMime() )
HG.client_controller.pub( 'clipboard', 'text', path )
if self._current_media is not None:
client_files_manager = HG.client_controller.GetClientFilesManager()
path = client_files_manager.GetFilePath( self._current_media.GetHash(), self._current_media.GetMime() )
HG.client_controller.pub( 'clipboard', 'text', path )
def _Delete( self, service_key = None ):
if self._current_media is None:
return
do_it = False
if service_key is None:
@ -1301,6 +1318,11 @@ class Canvas( wx.Window ):
def _DoManualPan( self, delta_x_step, delta_y_step ):
if self._current_media is None:
return
( my_x, my_y ) = self.GetClientSize()
( media_x, media_y ) = self._media_container.GetClientSize()
@ -1330,18 +1352,23 @@ class Canvas( wx.Window ):
self._dirty = False
def _DrawBackgroundDetails( self, dc ): pass
def _DrawBackgroundDetails( self, dc ):
pass
def _DrawCurrentMedia( self ):
if self._current_media is None:
return
( my_width, my_height ) = self.GetClientSize()
if my_width > 0 and my_height > 0:
if self._current_media is not None:
self._SizeAndPositionMediaContainer()
self._SizeAndPositionMediaContainer()
@ -1439,16 +1466,31 @@ class Canvas( wx.Window ):
def _Inbox( self ):
if self._current_media is None:
return
HG.client_controller.Write( 'content_updates', { CC.COMBINED_LOCAL_FILE_SERVICE_KEY : [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_FILES, HC.CONTENT_UPDATE_INBOX, ( self._current_media.GetHash(), ) ) ] } )
def _IsZoomable( self ):
if self._current_media is None:
return False
return self._GetShowAction( self._current_media ) not in ( CC.MEDIA_VIEWER_ACTION_SHOW_OPEN_EXTERNALLY_BUTTON, CC.MEDIA_VIEWER_ACTION_DO_NOT_SHOW_ON_ACTIVATION_OPEN_EXTERNALLY, CC.MEDIA_VIEWER_ACTION_DO_NOT_SHOW )
def _MaintainZoom( self, previous_media ):
if self._current_media is None:
return
if previous_media.GetResolution() == self._current_media.GetResolution():
previous_zoom = self._current_zoom
@ -1466,7 +1508,12 @@ class Canvas( wx.Window ):
def _ManageRatings( self ):
if self._current_media is None:
return
if len( HG.client_controller.GetServicesManager().GetServices( HC.RATINGS_SERVICES ) ) > 0:
if self._current_media is not None:
@ -1478,6 +1525,11 @@ class Canvas( wx.Window ):
def _ManageTags( self ):
if self._current_media is None:
return
if self._manage_tags_panel:
self._manage_tags_panel.SetFocus()
@ -1502,21 +1554,23 @@ class Canvas( wx.Window ):
def _OpenExternally( self ):
if self._current_media is not None:
if self._current_media is None:
hash = self._current_media.GetHash()
mime = self._current_media.GetMime()
return
client_files_manager = HG.client_controller.GetClientFilesManager()
hash = self._current_media.GetHash()
mime = self._current_media.GetMime()
client_files_manager = HG.client_controller.GetClientFilesManager()
path = client_files_manager.GetFilePath( hash, mime )
HydrusPaths.LaunchFile( path )
if self._current_media.HasDuration() and mime != HC.APPLICATION_FLASH:
path = client_files_manager.GetFilePath( hash, mime )
HydrusPaths.LaunchFile( path )
if self._current_media.HasDuration() and mime != HC.APPLICATION_FLASH:
self._media_container.Pause()
self._media_container.Pause()
@ -1599,6 +1653,11 @@ class Canvas( wx.Window ):
elif command_type == CC.APPLICATION_COMMAND_TYPE_CONTENT:
if self._current_media is None:
return
( service_key, content_type, action, value ) = data
try:
@ -1752,6 +1811,11 @@ class Canvas( wx.Window ):
def _ReinitZoom( self ):
if self._current_media is None:
return
show_action = self._GetShowAction( self._current_media )
( self._current_zoom, self._canvas_zoom ) = CalculateCanvasZooms( self, self._current_media, show_action )
@ -1774,6 +1838,11 @@ class Canvas( wx.Window ):
def _SizeAndPositionMediaContainer( self ):
if self._current_media is None:
return
( new_size, new_position ) = self._GetMediaContainerSizeAndPosition()
if new_size != self._media_container.GetSize(): self._media_container.SetSize( new_size )
@ -1784,7 +1853,12 @@ class Canvas( wx.Window ):
def _TryToChangeZoom( self, new_zoom ):
if self._current_media is None:
return
if self._current_media.GetMime() == HC.APPLICATION_FLASH:
# we want to preserve whitespace around flash
@ -2446,9 +2520,27 @@ class CanvasWithDetails( Canvas ):
BORDER = wx.NO_BORDER
def _DrawAdditionalTopMiddleInfo( self, dc, current_y ):
pass
def _DrawBackgroundDetails( self, dc ):
if self._current_media is not None:
if self._current_media is None:
text = 'No media to display'
( width, height ) = dc.GetTextExtent( text )
( my_width, my_height ) = self.GetClientSize()
x = ( my_width - width ) // 2
y = ( my_height - height ) // 2
dc.DrawText( text, x, y )
else:
( client_width, client_height ) = self.GetClientSize()
@ -2642,6 +2734,10 @@ class CanvasWithDetails( Canvas ):
dc.DrawText( info_string, ( client_width - x ) / 2, current_y )
current_y += y + 3
self._DrawAdditionalTopMiddleInfo( dc, current_y )
# bottom-right index
index_string = self._GetIndexString()
@ -2836,12 +2932,16 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
self._file_service_key = file_service_key
self._currently_fetching_pairs = False
self._unprocessed_pairs = []
self._current_pair = None
self._processed_pairs = []
self._batch_skip_hashes = set()
self._media_list = ClientMedia.ListeningMediaList( self._file_service_key, [] )
self._reserved_shortcut_names.append( 'media_viewer_browser' )
self._reserved_shortcut_names.append( 'duplicate_filter' )
self._hover_commands.AddCommand( 'this is better', self._CurrentMediaIsBetter )
@ -2855,14 +2955,14 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
# add support for 'f' to borderless
# add support for F4 and other general shortcuts so people can do edits before processing
wx.CallAfter( self._ShowNewPair ) # don't set this until we have a size > (20, 20)!
HG.client_controller.sub( self, 'ProcessContentUpdates', 'content_updates_gui' )
HG.client_controller.sub( self, 'Delete', 'canvas_delete' )
HG.client_controller.sub( self, 'Undelete', 'canvas_undelete' )
HG.client_controller.sub( self, 'SwitchMedia', 'canvas_show_next' )
HG.client_controller.sub( self, 'SwitchMedia', 'canvas_show_previous' )
wx.CallAfter( self._ShowNewPair )
def _Close( self ):
@ -2901,7 +3001,7 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
def _CommitProcessed( self ):
for ( hash_pair, duplicate_status, first_media, second_media, duplicate_action_options ) in self._processed_pairs:
for ( hash_pair, duplicate_status, first_media, second_media, duplicate_action_options, was_auto_skipped ) in self._processed_pairs:
if duplicate_status == HC.DUPLICATE_UNKNOWN:
@ -2918,10 +3018,11 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
first_hash = first_media.GetHash()
second_hash = second_media.GetHash()
HG.client_controller.WriteSynchronous( 'duplicate_pair_status', duplicate_status, first_hash, second_hash, duplicate_action_options )
HG.client_controller.WriteSynchronous( 'duplicate_pair_status', duplicate_status, first_hash, second_hash )
self._processed_pairs = []
self._batch_skip_hashes = set()
def _CurrentMediaIsBetter( self ):
@ -2931,6 +3032,11 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
def _DoCustomAction( self ):
if self._current_media is None:
return
duplicate_statuses = [ HC.DUPLICATE_BETTER, HC.DUPLICATE_SAME_FILE, HC.DUPLICATE_ALTERNATE, HC.DUPLICATE_NOT_DUPLICATE ]
choice_tuples = [ ( HC.duplicate_status_string_lookup[ duplicate_status ], duplicate_status ) for duplicate_status in duplicate_statuses ]
@ -2962,6 +3068,54 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
def _DrawAdditionalTopMiddleInfo( self, dc, current_y ):
if self._current_media is not None:
shown_media = self._current_media
comparison_media = self._media_list.GetNext( shown_media )
if shown_media != comparison_media:
( statements, score ) = ClientMedia.GetDuplicateComparisonStatements( shown_media, comparison_media )
( client_width, client_height ) = self.GetClientSize()
for statement in statements:
( width, height ) = dc.GetTextExtent( statement )
dc.DrawText( statement, ( client_width - width ) / 2, current_y )
current_y += height + 3
return current_y
def _DrawBackgroundDetails( self, dc ):
if self._currently_fetching_pairs:
text = u'Loading pairs\u2026'
( width, height ) = dc.GetTextExtent( text )
( my_width, my_height ) = self.GetClientSize()
x = ( my_width - width ) // 2
y = ( my_height - height ) // 2
dc.DrawText( text, x, y )
else:
CanvasWithHovers._DrawBackgroundDetails( self, dc )
def _GenerateHoverTopFrame( self ):
return ClientGUIHoverFrames.FullscreenHoverFrameTopDuplicatesFilter( self, self._canvas_key )
@ -2983,7 +3137,11 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
else:
return ClientData.GetLighterDarkerColour( normal_colour )
new_options = HG.client_controller.GetNewOptions()
duplicate_intensity = new_options.GetNoneableInteger( 'duplicate_background_switch_intensity' )
return ClientData.GetLighterDarkerColour( normal_colour, duplicate_intensity )
@ -2996,20 +3154,25 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
else:
progress = len( self._processed_pairs ) + 1 # +1 here actually counts for the one currently displayed
total = progress + len( self._unprocessed_pairs )
index_string = HydrusData.ConvertValueRangeToPrettyString( progress, total )
if self._current_media == self._media_list.GetFirst():
return 'A'
return 'A - ' + index_string
else:
return 'B'
return 'B - ' + index_string
def _GetNumCommittableDecisions( self ):
return len( [ 1 for ( hash_pair, duplicate_status, first_media, second_media, duplicate_action_options ) in self._processed_pairs if duplicate_status != HC.DUPLICATE_UNKNOWN ] )
return len( [ 1 for ( hash_pair, duplicate_status, first_media, second_media, duplicate_action_options, was_auto_skipped ) in self._processed_pairs if duplicate_status != HC.DUPLICATE_UNKNOWN ] )
def _GoBack( self ):
@ -3018,10 +3181,19 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
self._unprocessed_pairs.append( self._current_pair )
( hash_pair, duplicate_status, first_media, second_media, duplicate_action_options ) = self._processed_pairs.pop()
( hash_pair, duplicate_status, first_media, second_media, duplicate_action_options, was_auto_skipped ) = self._processed_pairs.pop()
self._unprocessed_pairs.append( hash_pair )
while was_auto_skipped:
( hash_pair, duplicate_status, first_media, second_media, duplicate_action_options, was_auto_skipped ) = self._processed_pairs.pop()
self._unprocessed_pairs.append( hash_pair )
self._batch_skip_hashes.difference_update( hash_pair )
self._ShowNewPair()
@ -3080,6 +3252,10 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
self._GoBack()
elif action in ( 'view_first', 'view_last', 'view_previous', 'view_next' ):
self._SwitchMedia()
else:
command_processed = False
@ -3100,6 +3276,11 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
def _ProcessPair( self, duplicate_status, duplicate_action_options = None ):
if self._current_media is None:
return
if duplicate_action_options is None:
new_options = HG.client_controller.GetNewOptions()
@ -3109,16 +3290,27 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
other_media = self._media_list.GetNext( self._current_media )
self._processed_pairs.append( ( self._current_pair, duplicate_status, self._current_media, other_media, duplicate_action_options ) )
deleted_hashes = duplicate_action_options.GetDeletedHashes( self._current_media, other_media )
self._batch_skip_hashes.update( deleted_hashes )
was_auto_skipped = False
self._processed_pairs.append( ( self._current_pair, duplicate_status, self._current_media, other_media, duplicate_action_options, was_auto_skipped ) )
self._ShowNewPair()
def _ShowNewPair( self ):
if self._currently_fetching_pairs:
return
num_committable = self._GetNumCommittableDecisions()
if self._unprocessed_pairs == [] and num_committable > 0:
if len( self._unprocessed_pairs ) == 0 and num_committable > 0:
label = 'commit ' + HydrusData.ConvertIntToPrettyString( num_committable ) + ' decisions and continue?'
@ -3132,47 +3324,94 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
else:
( hash_pair, duplicate_status, first_media, second_media, duplicate_action_options ) = self._processed_pairs.pop()
( hash_pair, duplicate_status, first_media, second_media, duplicate_action_options, was_auto_skipped ) = self._processed_pairs.pop()
self._unprocessed_pairs.append( hash_pair )
while was_auto_skipped:
( hash_pair, duplicate_status, first_media, second_media, duplicate_action_options, was_auto_skipped ) = self._processed_pairs.pop()
self._unprocessed_pairs.append( hash_pair )
self._batch_skip_hashes.difference_update( hash_pair )
if self._unprocessed_pairs == []:
if len( self._unprocessed_pairs ) == 0:
self._batch_skip_hashes = set()
self._processed_pairs = [] # just in case someone 'skip'ed everything in the last batch, so this never got cleared above
result = HG.client_controller.Read( 'unique_duplicate_pairs', self._file_service_key, HC.DUPLICATE_UNKNOWN )
self.SetMedia( None )
self._media_list = ClientMedia.ListeningMediaList( self._file_service_key, [] )
if len( result ) == 0:
self._currently_fetching_pairs = True
HG.client_controller.CallToThread( self.THREADFetchPairs )
self._SetDirty()
else:
potential_pair = self._unprocessed_pairs.pop()
( first_hash, second_hash ) = potential_pair
while first_hash in self._batch_skip_hashes or second_hash in self._batch_skip_hashes:
wx.MessageBox( 'All pairs have been filtered!' )
was_auto_skipped = True
self._Close()
self._processed_pairs.append( ( potential_pair, HC.DUPLICATE_UNKNOWN, None, None, None, was_auto_skipped ) )
return
if len( self._unprocessed_pairs ) == 0:
self._ShowNewPair() # there are no useful decisions left in the queue, so let's reset
return
potential_pair = self._unprocessed_pairs.pop()
( first_hash, second_hash ) = potential_pair
self._current_pair = potential_pair
( first_media_result, second_media_result ) = HG.client_controller.Read( 'media_results', self._current_pair )
first_media = ClientMedia.MediaSingleton( first_media_result )
second_media = ClientMedia.MediaSingleton( second_media_result )
( statements, score ) = ClientMedia.GetDuplicateComparisonStatements( first_media, second_media )
if score > 0:
media_results_with_better_first = ( first_media_result, second_media_result )
else:
self._unprocessed_pairs = result
media_results_with_better_first = ( second_media_result, first_media_result )
self._current_pair = self._unprocessed_pairs.pop()
media_results = HG.client_controller.Read( 'media_results', self._current_pair )
self._media_list = ClientMedia.ListeningMediaList( self._file_service_key, media_results )
self.SetMedia( self._media_list.GetFirst() )
self._media_list = ClientMedia.ListeningMediaList( self._file_service_key, media_results_with_better_first )
self.SetMedia( self._media_list.GetFirst() )
def _SkipPair( self ):
other_media = self._media_list.GetNext( self._current_media )
if self._current_media is None:
return
self._processed_pairs.append( ( self._current_pair, HC.DUPLICATE_UNKNOWN, self._current_media, other_media, None ) )
was_auto_skipped = False
self._processed_pairs.append( ( self._current_pair, HC.DUPLICATE_UNKNOWN, None, None, None, was_auto_skipped ) )
self._ShowNewPair()
@ -3317,6 +3556,22 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
wx.CallLater( 100, catch_up )
def SetMedia( self, media, maintain_pan_and_zoom = False ):
CanvasWithHovers.SetMedia( self, media, maintain_pan_and_zoom )
if media is not None:
shown_media = self._current_media
comparison_media = self._media_list.GetNext( shown_media )
if shown_media != comparison_media:
HG.client_controller.pub( 'canvas_new_duplicate_pair', self._canvas_key, shown_media, comparison_media )
def SwitchMedia( self, canvas_key ):
if canvas_key == self._canvas_key:
@ -3333,6 +3588,42 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
def THREADFetchPairs( self ):
def wx_close():
if self:
wx.CallAfter( wx.MessageBox, 'All pairs have been filtered!' )
self._Close()
def wx_continue( unprocessed_pairs ):
if self:
self._unprocessed_pairs = unprocessed_pairs
self._currently_fetching_pairs = False
self._ShowNewPair()
result = HG.client_controller.Read( 'unique_duplicate_pairs', self._file_service_key, HC.DUPLICATE_UNKNOWN )
if len( result ) == 0:
wx.CallAfter( wx_close )
else:
wx.CallAfter( wx_continue, result )
class CanvasMediaList( ClientMedia.ListeningMediaList, CanvasWithHovers ):
def __init__( self, parent, page_key, media_results ):
@ -3468,7 +3759,7 @@ class CanvasMediaList( ClientMedia.ListeningMediaList, CanvasWithHovers ):
singleton_media = { self._current_media }
ClientMedia.ListeningMediaList._RemoveMedia( self, singleton_media, {} )
ClientMedia.ListeningMediaList._RemoveMediaDirectly( self, singleton_media, {} )
if self.HasNoMedia():

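The duplicate filter changes above are the bulk of the commit. Two mechanisms are worth pulling out of the interleaved diff: pairs are now fetched off the GUI thread by THREADFetchPairs, which uses wx.CallAfter either to close the canvas or to continue (this is also what stops the Linux 'All pairs have been filtered!' message spam), and _ShowNewPair auto-skips any pair touching a hash already deleted this batch, recording was_auto_skipped so that 'go back' can unwind those entries too. A small sketch of just the auto-skip bookkeeping, with a plain constant standing in for HC.DUPLICATE_UNKNOWN:

HC_DUPLICATE_UNKNOWN = 0  # stand-in for HC.DUPLICATE_UNKNOWN

def pop_next_showable_pair( unprocessed_pairs, processed_pairs, batch_skip_hashes ):
    """Sketch of the auto-skip bookkeeping in _ShowNewPair above: pairs touching a hash
    already deleted this batch are recorded as auto-skipped 'unknown' decisions so that
    'go back' can unwind them, and the next pair still needing a human decision is returned."""
    
    while len( unprocessed_pairs ) > 0:
        
        potential_pair = unprocessed_pairs.pop()
        
        ( first_hash, second_hash ) = potential_pair
        
        if first_hash in batch_skip_hashes or second_hash in batch_skip_hashes:
            
            # mirrors ( pair, HC.DUPLICATE_UNKNOWN, None, None, None, was_auto_skipped = True )
            processed_pairs.append( ( potential_pair, HC_DUPLICATE_UNKNOWN, None, None, None, True ) )
            
            continue
        
        return potential_pair
    
    return None  # nothing left needing a decision; the caller resets the batch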
View File

@ -790,7 +790,7 @@ class DialogInputLocalFiles( Dialog ):
Dialog.__init__( self, parent, 'importing files' )
self.SetDropTarget( ClientDragDrop.FileDropTarget( self._AddPathsToList ) )
self.SetDropTarget( ClientDragDrop.FileDropTarget( self._AddPathsToList, None ) )
self._paths_list = ClientGUICommon.SaneListCtrl( self, 120, [ ( 'path', -1 ), ( 'guessed mime', 110 ), ( 'size', 60 ) ], delete_key_callback = self.RemovePaths )

View File

@ -234,7 +234,7 @@ class DialogManageBoorus( ClientGUIDialogs.Dialog ):
self.SetSizer( vbox )
self.SetDropTarget( ClientDragDrop.FileDropTarget( self.Import ) )
self.SetDropTarget( ClientDragDrop.FileDropTarget( self.Import, None ) )
( x, y ) = self.GetEffectiveMinSize()
@ -731,7 +731,7 @@ class DialogManageContacts( ClientGUIDialogs.Dialog ):
self.SetInitialSize( ( 980, y ) )
self.SetDropTarget( ClientDragDrop.FileDropTarget( self.Import ) )
self.SetDropTarget( ClientDragDrop.FileDropTarget( self.Import, None ) )
self.EventContactChanged( None )
@ -1631,7 +1631,7 @@ class DialogManageImageboards( ClientGUIDialogs.Dialog ):
self.SetInitialSize( ( 980, y ) )
self.SetDropTarget( ClientDragDrop.FileDropTarget( self.Import ) )
self.SetDropTarget( ClientDragDrop.FileDropTarget( self.Import, None ) )
wx.CallAfter( self._ok.SetFocus )

View File

@ -7,6 +7,7 @@ import ClientGUIListBoxes
import ClientGUITopLevelWindows
import ClientGUIScrolledPanelsEdit
import ClientGUIScrolledPanelsManagement
import ClientMedia
import HydrusConstants as HC
import HydrusData
import HydrusGlobals as HG
@ -56,6 +57,11 @@ class FullscreenHoverFrame( wx.Frame ):
def _SizeAndPosition( self ):
if not self.GetParent().IsShown():
return
( should_resize, my_ideal_size, my_ideal_position ) = self._GetIdealSizeAndPosition()
if should_resize:
@ -92,7 +98,7 @@ class FullscreenHoverFrame( wx.Frame ):
if self._current_media is None:
if self._current_media is None or not self.GetParent().IsShown():
self.Hide()
@ -175,6 +181,7 @@ class FullscreenHoverFrameTop( FullscreenHoverFrame ):
self._top_hbox = wx.BoxSizer( wx.HORIZONTAL )
self._title_text = ClientGUICommon.BetterStaticText( self, 'title' )
self._info_text = ClientGUICommon.BetterStaticText( self, 'info' )
self._additional_info_text = ClientGUICommon.BetterStaticText( self, '', style = wx.ALIGN_CENTER )
self._button_hbox = wx.BoxSizer( wx.HORIZONTAL )
self._PopulateLeftButtons()
@ -188,6 +195,7 @@ class FullscreenHoverFrameTop( FullscreenHoverFrame ):
vbox.AddF( self._top_hbox, CC.FLAGS_EXPAND_PERPENDICULAR )
vbox.AddF( self._title_text, CC.FLAGS_CENTER )
vbox.AddF( self._info_text, CC.FLAGS_CENTER )
vbox.AddF( self._additional_info_text, CC.FLAGS_CENTER )
vbox.AddF( self._button_hbox, CC.FLAGS_CENTER )
self.SetSizer( vbox )
@ -386,6 +394,15 @@ class FullscreenHoverFrameTop( FullscreenHoverFrame ):
self._info_text.Show()
if self._additional_info_text.GetLabelText() == '':
self._additional_info_text.Hide()
else:
self._additional_info_text.Show()
def _SetDefaultShortcuts( self ):
@ -521,6 +538,13 @@ class FullscreenHoverFrameTopArchiveDeleteFilter( FullscreenHoverFrameTop ):
class FullscreenHoverFrameTopDuplicatesFilter( FullscreenHoverFrameTop ):
def __init__( self, parent, canvas_key ):
FullscreenHoverFrameTop.__init__( self, parent, canvas_key )
HG.client_controller.sub( self, 'SetDuplicatePair', 'canvas_new_duplicate_pair' )
def _PopulateCenterButtons( self ):
menu_items = []
@ -529,6 +553,8 @@ class FullscreenHoverFrameTopDuplicatesFilter( FullscreenHoverFrameTop ):
menu_items.append( ( 'normal', 'edit duplicate action options for \'exact duplicates\'', 'edit what content is merged when you filter files', HydrusData.Call( self._EditMergeOptions, HC.DUPLICATE_SAME_FILE ) ) )
menu_items.append( ( 'normal', 'edit duplicate action options for \'alternates\'', 'edit what content is merged when you filter files', HydrusData.Call( self._EditMergeOptions, HC.DUPLICATE_ALTERNATE ) ) )
menu_items.append( ( 'normal', 'edit duplicate action options for \'not duplicates\'', 'edit what content is merged when you filter files', HydrusData.Call( self._EditMergeOptions, HC.DUPLICATE_NOT_DUPLICATE ) ) )
menu_items.append( ( 'separator', None, None, None ) )
menu_items.append( ( 'normal', 'edit background lighten/darken switch intensity', 'edit how much the background will brighten or darken as you switch between the pair', self._EditBackgroundSwitchIntensity ) )
cog_button = ClientGUICommon.MenuBitmapButton( self, CC.GlobalBMPs.cog, menu_items )
@ -537,6 +563,27 @@ class FullscreenHoverFrameTopDuplicatesFilter( FullscreenHoverFrameTop ):
FullscreenHoverFrameTop._PopulateCenterButtons( self )
def _EditBackgroundSwitchIntensity( self ):
new_options = HG.client_controller.GetNewOptions()
value = new_options.GetNoneableInteger( 'duplicate_background_switch_intensity' )
with ClientGUITopLevelWindows.DialogEdit( self, 'edit lighten/darken intensity' ) as dlg:
panel = ClientGUIScrolledPanelsEdit.EditNoneableIntegerPanel( dlg, value, message = 'intensity: ', none_phrase = 'do not change', min = 1, max = 9 )
dlg.SetPanel( panel )
if dlg.ShowModal() == wx.ID_OK:
new_value = panel.GetValue()
new_options.SetNoneableInteger( 'duplicate_background_switch_intensity', new_value )
def _EditMergeOptions( self, duplicate_status ):
new_options = HG.client_controller.GetNewOptions()
@ -560,6 +607,11 @@ class FullscreenHoverFrameTopDuplicatesFilter( FullscreenHoverFrameTop ):
def _PopulateLeftButtons( self ):
self._first_button = ClientGUICommon.BetterBitmapButton( self, CC.GlobalBMPs.first, HG.client_controller.pub, 'canvas_application_command', self._canvas_key, ClientData.ApplicationCommand( CC.APPLICATION_COMMAND_TYPE_SIMPLE, 'duplicate_filter_back' ) )
self._first_button.SetToolTipString( 'go back a pair' )
self._top_hbox.AddF( self._first_button, CC.FLAGS_VCENTER )
FullscreenHoverFrameTop._PopulateLeftButtons( self )
self._last_button = ClientGUICommon.BetterBitmapButton( self, CC.GlobalBMPs.last, HG.client_controller.pub, 'canvas_application_command', self._canvas_key, ClientData.ApplicationCommand( CC.APPLICATION_COMMAND_TYPE_SIMPLE, 'duplicate_filter_skip' ) )
@ -568,6 +620,33 @@ class FullscreenHoverFrameTopDuplicatesFilter( FullscreenHoverFrameTop ):
self._top_hbox.AddF( self._last_button, CC.FLAGS_VCENTER )
def SetDisplayMedia( self, canvas_key, media ):
if canvas_key == self._canvas_key:
if media is None:
self._additional_info_text.SetLabelText( '' )
FullscreenHoverFrameTop.SetDisplayMedia( self, canvas_key, media )
def SetDuplicatePair( self, canvas_key, shown_media, comparison_media ):
if canvas_key == self._canvas_key:
( statements, score ) = ClientMedia.GetDuplicateComparisonStatements( shown_media, comparison_media )
self._additional_info_text.SetLabelText( os.linesep.join( statements ) )
self._ResetText()
self._ResetButtons()
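For context, the 'canvas_new_duplicate_pair' subscription above is the client's usual publish/subscribe wiring: the hover frame registers a bound method at construction and the canvas later publishes the new pair to it. A minimal sketch of that general mechanism follows; the names are illustrative only, and the real controller does more than this (for instance, dispatching the calls on the gui thread).

import collections

class MiniPubSub( object ):
    
    def __init__( self ):
        
        # topic name -> list of bound methods to call on publish
        self._topics_to_methods = collections.defaultdict( list )
        
    
    def sub( self, subscriber, method_name, topic ):
        
        self._topics_to_methods[ topic ].append( getattr( subscriber, method_name ) )
        
    
    def pub( self, topic, *args, **kwargs ):
        
        for method in self._topics_to_methods[ topic ]:
            
            method( *args, **kwargs )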
class FullscreenHoverFrameTopNavigableList( FullscreenHoverFrameTop ):
def _PopulateLeftButtons( self ):

View File

@@ -787,6 +787,7 @@ class ManagementPanelDuplicateFilter( ManagementPanel ):
self._file_domain_button = ClientGUICommon.BetterButton( self._filtering_panel, 'file domain', self._FileDomainButtonHit )
self._num_unknown_duplicates = wx.StaticText( self._filtering_panel )
self._num_better_duplicates = wx.StaticText( self._filtering_panel )
self._num_better_duplicates.SetToolTipString( 'If this stays at 0, it is likely because your \'worse\' files are being deleted and so are leaving this file domain!' )
self._num_same_file_duplicates = wx.StaticText( self._filtering_panel )
self._num_alternate_duplicates = wx.StaticText( self._filtering_panel )
self._show_some_dupes = ClientGUICommon.BetterButton( self._filtering_panel, 'show some random pairs', self._ShowSomeDupes )
@@ -1661,7 +1662,10 @@ class ManagementPanelImporterGallery( ManagementPanelImporter ):
self._gallery_import.SetImportTagOptions( import_tag_options )
else: event.Skip()
else:
event.Skip()
@@ -2338,19 +2342,19 @@ class ManagementPanelImporterThreadWatcher( ManagementPanelImporter ):
self._thread_watcher_import.SetDownloadHook( file_download_hook )
( thread_url, import_file_options, import_tag_options, times_to_check, check_period ) = self._thread_watcher_import.GetOptions()
self._thread_input.SetValue( thread_url )
self._thread_input.SetEditable( False )
self._import_file_options.SetOptions( import_file_options )
self._import_tag_options.SetOptions( import_tag_options )
self._thread_times_to_check.SetValue( times_to_check )
self._thread_check_period.SetValue( check_period )
if self._thread_watcher_import.HasThread():
( thread_url, import_file_options, import_tag_options, times_to_check, check_period ) = self._thread_watcher_import.GetOptions()
self._thread_input.SetValue( thread_url )
self._thread_input.SetEditable( False )
self._import_file_options.SetOptions( import_file_options )
self._import_tag_options.SetOptions( import_tag_options )
self._thread_times_to_check.SetValue( times_to_check )
self._thread_check_period.SetValue( check_period )
self._thread_watcher_import.Start( self._page_key )
@@ -2675,6 +2679,8 @@ class ManagementPanelImporterURLs( ManagementPanelImporter ):
self._urls_import.Start( self._page_key )
HG.client_controller.sub( self, 'SetURLInput', 'set_page_url_input' )
def _SeedCache( self ):
@@ -2826,7 +2832,20 @@ class ManagementPanelImporterURLs( ManagementPanelImporter ):
def SetSearchFocus( self, page_key ):
if page_key == self._page_key: self._url_input.SetFocus()
if page_key == self._page_key:
self._url_input.SetFocus()
def SetURLInput( self, page_key, url ):
if page_key == self._page_key:
self._url_input.SetValue( url )
self._url_input.SetFocus()
management_panel_types_to_classes[ MANAGEMENT_TYPE_IMPORT_URLS ] = ManagementPanelImporterURLs
@@ -2967,10 +2986,12 @@ class ManagementPanelPetitions( ManagementPanel ):
weight += content.GetVirtualWeight()
if weight > 200:
if weight > 50:
chunks_of_approved_contents.append( chunk_of_approved_contents )
chunk_of_approved_contents = []
weight = 0
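The hunk above lowers the virtual-weight threshold at which approved petition contents are split into chunks from 200 to 50. As a generic, self-contained illustration of that chunk-by-weight pattern (the function and its names are illustrative, not the client's code):

def chunk_by_weight( items, get_weight, max_weight = 50 ):
    
    # group items into chunks, closing a chunk once its accumulated weight passes max_weight
    chunks = []
    
    current_chunk = []
    current_weight = 0
    
    for item in items:
        
        current_chunk.append( item )
        
        current_weight += get_weight( item )
        
        if current_weight > max_weight:
            
            chunks.append( current_chunk )
            
            current_chunk = []
            current_weight = 0
            
        
    
    if len( current_chunk ) > 0:
        
        chunks.append( current_chunk )
        
    
    return chunks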

View File

@@ -1019,7 +1019,7 @@ class MediaPanel( ClientMedia.ListeningMediaList, wx.ScrolledWindow ):
collections = [ media for media in self._selected_media if media.IsCollection() ]
self._RemoveMedia( singletons, collections )
self._RemoveMediaDirectly( singletons, collections )
def _RescindDownloadSelected( self ):
@@ -1386,9 +1386,7 @@ class MediaPanel( ClientMedia.ListeningMediaList, wx.ScrolledWindow ):
if page_key == self._page_key:
media = self._GetMedia( hashes )
self._RemoveMedia( media, {} )
self._RemoveMediaByHashes( hashes )
@@ -1921,7 +1919,7 @@ class MediaPanelThumbnails( MediaPanel ):
def _RemoveMedia( self, singleton_media, collected_media ):
def _RemoveMediaDirectly( self, singleton_media, collected_media ):
if self._focussed_media is not None:
@@ -1931,7 +1929,7 @@ class MediaPanelThumbnails( MediaPanel ):
MediaPanel._RemoveMedia( self, singleton_media, collected_media )
MediaPanel._RemoveMediaDirectly( self, singleton_media, collected_media )
self._selected_media.difference_update( singleton_media )
self._selected_media.difference_update( collected_media )
@@ -2216,13 +2214,13 @@ class MediaPanelThumbnails( MediaPanel ):
locations_manager = t.GetLocationsManager()
if locations_manager.IsLocal(): self._FullScreen( t )
elif self._file_service_key != CC.COMBINED_FILE_SERVICE_KEY:
if locations_manager.IsLocal():
if len( locations_manager.GetCurrentRemote() ) > 0:
self._DownloadHashes( t.GetHashes() )
self._FullScreen( t )
elif len( locations_manager.GetCurrentRemote() ) > 0:
self._DownloadHashes( t.GetHashes() )
@@ -2580,7 +2578,7 @@ class MediaPanelThumbnails( MediaPanel ):
if not locations_manager.IsLocal() and not locations_manager.IsDownloading():
downloadable_file_service_keys.update( file_service_keys & locations_manager.GetCurrentRemote() )
downloadable_file_service_keys.update( ipfs_service_keys.union( file_service_keys ) & locations_manager.GetCurrentRemote() )
# we can petition when we have permission and a file is current and it is not already petitioned

View File

@@ -134,6 +134,11 @@ class Page( wx.SplitterWindow ):
return ( x, y )
def IsURLImportPage( self ):
return self._management_controller.GetType() == ClientGUIManagement.MANAGEMENT_TYPE_IMPORT_URLS
def PageHidden( self ):
self._controller.pub( 'page_hidden', self._page_key )
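IsURLImportPage above pairs with the 'set_page_url_input' subscription added to ManagementPanelImporterURLs earlier in this commit. A rough sketch of how a caller might use the two to route dropped text to the first open url import page; the page list, the GetPageKey accessor, and the loop itself are assumptions for illustration.

def route_dropped_text_to_url_page( controller, pages, text ):
    
    # hand the dropped text to the first page that can take a url
    for page in pages:
        
        if page.IsURLImportPage():
            
            controller.pub( 'set_page_url_input', page.GetPageKey(), text )
            
            return True
            
        
    
    return False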

View File

@@ -985,6 +985,8 @@ class ReviewServicePanel( wx.Panel ):
self._my_updater = ClientGUICommon.ThreadToGUIUpdater( self, self._Refresh )
self._check_running_button = ClientGUICommon.BetterButton( self, 'check daemon', self._CheckRunning )
self._ipfs_shares = ClientGUICommon.SaneListCtrl( self, 200, [ ( 'multihash', 120 ), ( 'num files', 80 ), ( 'total size', 80 ), ( 'note', -1 ) ], delete_key_callback = self._Unpin, activation_callback = self._SetNotes )
self._copy_multihash_button = ClientGUICommon.BetterButton( self, 'copy multihashes', self._CopyMultihashes )
@@ -1005,12 +1007,50 @@ class ReviewServicePanel( wx.Panel ):
button_box.AddF( self._set_notes_button, CC.FLAGS_VCENTER )
button_box.AddF( self._unpin_button, CC.FLAGS_VCENTER )
self.AddF( self._check_running_button, CC.FLAGS_LONE_BUTTON )
self.AddF( self._ipfs_shares, CC.FLAGS_EXPAND_BOTH_WAYS )
self.AddF( button_box, CC.FLAGS_BUTTON_SIZER )
HG.client_controller.sub( self, 'ServiceUpdated', 'service_updated' )
def _CheckRunning( self ):
def wx_clean_up():
if self:
self._check_running_button.Enable()
def do_it():
try:
version = self._service.GetDaemonVersion()
message = 'Everything looks ok! Daemon reports version: ' + version
wx.CallAfter( wx.MessageBox, message )
except Exception as e:
HydrusData.ShowException( e )
message = 'There was a problem! Check your popup messages for the error.'
wx.CallAfter( wx.MessageBox, message )
finally:
wx.CallAfter( wx_clean_up )
self._check_running_button.Disable()
HG.client_controller.CallToThread( do_it )
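_CheckRunning above shows the panel's usual pattern for blocking calls: disable the button, do the network work on a worker thread, and report back to the user via wx.CallAfter. A stripped-down sketch of the same pattern, with every name illustrative rather than the client's API:

import threading

import wx

def run_blocking_job_and_report( job, describe_result ):
    
    # run the blocking callable off the gui thread, then show the outcome from the gui thread
    def work():
        
        try:
            
            message = describe_result( job() )
            
        except Exception as e:
            
            message = 'There was a problem: ' + str( e )
            
        
        wx.CallAfter( wx.MessageBox, message )
        
    
    t = threading.Thread( target = work )
    
    t.daemon = True
    
    t.start()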
def _CopyMultihashes( self ):
multihashes = [ multihash for ( multihash, num_files, total_size, note ) in self._ipfs_shares.GetSelectedClientData() ]

View File

@@ -626,6 +626,28 @@ class EditMediaViewOptionsPanel( ClientGUIScrolledPanels.EditPanel ):
return ( self._mime, media_show_action, preview_show_action, ( media_scale_up, media_scale_down, preview_scale_up, preview_scale_down, exact_zooms_only, scale_up_quality, scale_down_quality ) )
class EditNoneableIntegerPanel( ClientGUIScrolledPanels.EditPanel ):
def __init__( self, parent, value, message = '', none_phrase = 'no limit', min = 0, max = 1000000, unit = None, multiplier = 1, num_dimensions = 1 ):
ClientGUIScrolledPanels.EditPanel.__init__( self, parent )
self._value = ClientGUICommon.NoneableSpinCtrl( self, message = message, none_phrase = none_phrase, min = min, max = max, unit = unit, multiplier = multiplier, num_dimensions = num_dimensions )
self._value.SetValue( value )
vbox = wx.BoxSizer( wx.VERTICAL )
vbox.AddF( self._value, CC.FLAGS_EXPAND_PERPENDICULAR )
self.SetSizer( vbox )
def GetValue( self ):
return self._value.GetValue()
class EditSeedCachePanel( ClientGUIScrolledPanels.EditPanel ):
def __init__( self, parent, controller, seed_cache ):

View File

@@ -404,7 +404,7 @@ class ManageClientServicesPanel( ClientGUIScrolledPanels.ManagePanel ):
if self._service_type in HC.REMOTE_SERVICES:
remote_panel = self._ServiceRemotePanel( self, self._dictionary )
remote_panel = self._ServiceRemotePanel( self, self._service_type, self._dictionary )
self._panels.append( remote_panel )
@@ -553,24 +553,6 @@ class ManageClientServicesPanel( ClientGUIScrolledPanels.ManagePanel ):
def EventCheckIPFS( self, event ):
service = self.GetValue()
try:
version = service.GetDaemonVersion()
wx.MessageBox( 'Everything looks ok! Connected to IPFS Daemon with version: ' + version )
except Exception as e:
HydrusData.ShowException( e )
wx.MessageBox( 'Could not connect!' )
def GetValue( self ):
name = self._service_panel.GetValue()
@@ -621,10 +603,12 @@ class ManageClientServicesPanel( ClientGUIScrolledPanels.ManagePanel ):
class _ServiceRemotePanel( ClientGUICommon.StaticBox ):
def __init__( self, parent, dictionary ):
def __init__( self, parent, service_type, dictionary ):
ClientGUICommon.StaticBox.__init__( self, parent, 'clientside network' )
self._service_type = service_type
credentials = dictionary[ 'credentials' ]
bandwidth_rules = dictionary[ 'bandwidth_rules' ]
@@ -670,11 +654,24 @@ class ManageClientServicesPanel( ClientGUIScrolledPanels.ManagePanel ):
( host, port ) = credentials.GetAddress()
url = 'https://' + host + ':' + str( port ) + '/'
if self._service_type == HC.IPFS:
scheme = 'http://'
hydrus_network = False
request = 'api/v0/version'
else:
scheme = 'https://'
hydrus_network = True
request = ''
url = scheme + host + ':' + str( port ) + '/' + request
try:
result = HG.client_controller.DoHTTP( HC.GET, url, hydrus_network = True )
result = HG.client_controller.DoHTTP( HC.GET, url, hydrus_network = hydrus_network )
wx.MessageBox( 'Got an ok response!' )
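The test above hits the IPFS daemon's HTTP API version endpoint over plain http instead of the hydrus network scheme. For reference, a standalone equivalent using the requests library might look like the sketch below; the host and port are assumptions (5001 is the daemon's usual API port), while 'api/v0/version' matches the request string above.

import requests

def get_ipfs_daemon_version( host = '127.0.0.1', port = 5001 ):
    
    # contemporary go-ipfs daemons answer this GET with json that includes a 'Version' field
    response = requests.get( 'http://' + host + ':' + str( port ) + '/api/v0/version' )
    
    response.raise_for_status()
    
    return response.json()[ 'Version' ]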
@@ -1171,14 +1168,7 @@ class ManageClientServicesPanel( ClientGUIScrolledPanels.ManagePanel ):
ClientGUICommon.StaticBox.__init__( self, parent, 'ipfs' )
# test creds and fetch version
# multihash_prefix
'''
if service_type == HC.IPFS:
self._ipfs_panel = ClientGUICommon.StaticBox( self, 'ipfs settings' )
self._multihash_prefix = wx.TextCtrl( self._ipfs_panel, value = info[ 'multihash_prefix' ] )
self._multihash_prefix = wx.TextCtrl( self )
tts = 'When you tell the client to copy the ipfs multihash to your clipboard, it will prefix it with this.'
tts += os.linesep * 2
@@ -1190,22 +1180,21 @@ class ManageClientServicesPanel( ClientGUIScrolledPanels.ManagePanel ):
self._multihash_prefix.SetToolTipString( tts )
'''
self._st = ClientGUICommon.BetterStaticText( self )
#
self._multihash_prefix.SetValue( dictionary[ 'multihash_prefix' ] )
#
self._st.SetLabelText( 'This is an IPFS service. This box will get regain IPFS options in a future update.' )
#
self.AddF( self._st, CC.FLAGS_EXPAND_PERPENDICULAR )
self.AddF( ClientGUICommon.WrapInText( self._multihash_prefix, self, 'multihash prefix: ' ), CC.FLAGS_EXPAND_PERPENDICULAR )
def GetValue( self ):
dictionary_part = {}
dictionary_part[ 'multihash_prefix' ] = self._multihash_prefix.GetValue()
return dictionary_part

View File

@@ -1766,7 +1766,7 @@ HydrusSerialisable.SERIALISABLE_TYPES_TO_OBJECT_TYPES[ HydrusSerialisable.SERIAL
class SeedCache( HydrusSerialisable.SerialisableBase ):
SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_SEED_CACHE
SERIALISABLE_VERSION = 2
SERIALISABLE_VERSION = 3
def __init__( self ):
@@ -1839,6 +1839,26 @@ class SeedCache( HydrusSerialisable.SerialisableBase ):
return ( 2, new_serialisable_info )
if version == 2:
# gelbooru replaced their thumbnail links with this redirect spam
# 'https://gelbooru.com/redirect.php?s=Ly9nZWxib29ydS5jb20vaW5kZXgucGhwP3BhZ2U9cG9zdCZzPXZpZXcmaWQ9MzY4ODA1OA=='
new_serialisable_info = []
for ( seed, seed_info ) in old_serialisable_info:
if seed.startswith( 'https://gelbooru.com/redirect.php' ):
continue
new_serialisable_info.append( ( seed, seed_info ) )
return ( 3, new_serialisable_info )
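The version 3 update above simply discards any cached redirect.php seeds. The 's' parameter in the example url is base64 of a protocol-relative gelbooru path, so recovering the real url from such a seed could look like the sketch below; this is an illustrative helper only, not the client's parsing code.

import base64

import urlparse # urllib.parse under python 3

def convert_gelbooru_redirect( redirect_url ):
    
    # pull the base64 payload out of the 's' query parameter and decode it
    query = urlparse.urlparse( redirect_url ).query
    
    encoded = urlparse.parse_qs( query )[ 's' ][ 0 ]
    
    decoded = base64.b64decode( encoded )
    
    if decoded.startswith( '//' ):
        
        # the decoded url is protocol-relative
        decoded = 'https:' + decoded
        
    
    return decoded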
def AddSeed( self, seed ):
@@ -2147,7 +2167,7 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):
self._initial_file_limit = min( 200, HC.options[ 'gallery_file_limit' ] )
self._periodic_file_limit = None
self._periodic_file_limit = 50
self._paused = False
self._import_file_options = ClientDefaults.GetDefaultImportFileOptions()

View File

@@ -36,6 +36,126 @@ def FlattenMedia( media_list ):
return flat_media
def GetDuplicateComparisonStatements( shown_media, comparison_media ):
statements = []
score = 0
# higher/same res
s_resolution = shown_media.GetResolution()
c_resolution = comparison_media.GetResolution()
if s_resolution is not None and c_resolution is not None:
( s_w, s_h ) = shown_media.GetResolution()
( c_w, c_h ) = comparison_media.GetResolution()
resolution_ratio = float( s_w * s_h ) / float( c_w * c_h )
if resolution_ratio == 1.0:
if s_resolution == c_resolution:
statements.append( 'Both have the same resolution.' )
else:
statements.append( 'Both have the same number of pixels but different resolution.' )
elif resolution_ratio > 3.0:
statements.append( 'This has much higher resolution.' )
score += 2
elif resolution_ratio > 1.0:
statements.append( 'This has higher resolution.' )
score += 1
elif resolution_ratio < 0.33:
statements.append( 'This has much lower resolution.' )
score -= 2
elif resolution_ratio < 1.0:
statements.append( 'This has lower resolution.' )
score -= 1
# same/diff mime
s_mime = shown_media.GetMime()
c_mime = comparison_media.GetMime()
if s_mime != c_mime:
statements.append( 'This is ' + HC.mime_string_lookup[ s_mime ] + ', the other is ' + HC.mime_string_lookup[ c_mime ] + '.' )
# more tags
s_num_tags = len( shown_media.GetTagsManager().GetCurrent() )
c_num_tags = len( comparison_media.GetTagsManager().GetCurrent() )
if s_num_tags == 0 and c_num_tags == 0:
statements.append( 'Neither have any tags.' )
elif s_num_tags > 0 and c_num_tags > 0:
if s_num_tags > c_num_tags:
statements.append( 'Both have tags, but this has more.' )
score += 1
elif s_num_tags < c_num_tags:
statements.append( 'Both have tags, but this has fewer.' )
score -= 1
else:
statements.append( 'Both have the same number of tags.' )
elif s_num_tags > 0:
statements.append( 'This has tags, the other does not.' )
elif c_num_tags > 0:
statements.append( 'The other has tags, this does not.' )
# older
s_ts = shown_media.GetLocationsManager().GetTimestamp( CC.COMBINED_LOCAL_FILE_SERVICE_KEY )
c_ts = comparison_media.GetLocationsManager().GetTimestamp( CC.COMBINED_LOCAL_FILE_SERVICE_KEY )
if s_ts is not None and c_ts is not None:
if s_ts < c_ts - 86400 * 30:
statements.append( 'This is older.' )
score += 0.5
elif c_ts < s_ts - 86400 * 30:
statements.append( 'The other is older.' )
score -= 0.5
return ( statements, score )
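A brief sketch of how a caller might consume the ( statements, score ) pair returned above; the media objects and the choice to flip the pair on a negative score are illustrative assumptions, not the filter's actual logic.

import os

def summarise_pair( shown_media, comparison_media ):
    
    ( statements, score ) = GetDuplicateComparisonStatements( shown_media, comparison_media )
    
    if score < 0:
        
        # a negative score suggests the comparison file is the better one, so flip and recompute
        ( shown_media, comparison_media ) = ( comparison_media, shown_media )
        
        ( statements, score ) = GetDuplicateComparisonStatements( shown_media, comparison_media )
        
    
    return ( shown_media, comparison_media, os.linesep.join( statements ) )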
def MergeTagsManagers( tags_managers ):
def CurrentAndPendingFilter( items ):
@@ -638,7 +758,26 @@ class MediaList( object ):
def _RemoveMedia( self, singleton_media, collected_media ):
def _RemoveMediaByHashes( self, hashes ):
if not isinstance( hashes, set ):
hashes = set( hashes )
affected_singleton_media = self._GetMedia( hashes, discriminator = 'singletons' )
for media in self._collected_media:
media._RemoveMediaByHashes( hashes )
affected_collected_media = [ media for media in self._collected_media if media.HasNoMedia() ]
self._RemoveMediaDirectly( affected_singleton_media, affected_collected_media )
def _RemoveMediaDirectly( self, singleton_media, collected_media ):
if not isinstance( singleton_media, set ):
@@ -965,10 +1104,7 @@ class MediaList( object ):
if deleted_from_trash_and_local_view or trashed_and_non_trash_local_view or deleted_from_repo_and_repo_view:
affected_singleton_media = self._GetMedia( hashes, 'singletons' )
affected_collected_media = [ media for media in self._collected_media if media.HasNoMedia() ]
self._RemoveMedia( affected_singleton_media, affected_collected_media )
self._RemoveMediaByHashes( hashes )
@@ -1009,7 +1145,7 @@ class MediaList( object ):
if service_key == self._file_service_key:
self._RemoveMedia( self._singleton_media, self._collected_media )
self._RemoveMediaDirectly( self._singleton_media, self._collected_media )
else:

View File

@@ -91,6 +91,28 @@ def RequestsGet( url, params = None, stream = False, headers = None ):
return response
def RequestsGetRedirectURL( url, session = None ):
if session is None:
session = requests.Session()
response = session.get( url, allow_redirects = False )
if 'location' in response.headers:
location_header = response.headers[ 'location' ]
new_url = urlparse.urljoin( url, location_header )
return new_url
else:
return url
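A quick usage note for the helper above; the url here is hypothetical.

import requests

session = requests.Session()

# resolves one hop of redirection without following it: a relative Location header
# is joined against the original url, and a non-redirect response returns the original url unchanged
final_url = RequestsGetRedirectURL( 'https://example.com/redirect.php?s=abc', session = session )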
def RequestsPost( url, data = None, files = None, headers = None ):
if headers is None:
@@ -161,6 +183,11 @@ def ParseURL( url ):
try:
if url.startswith( '//' ):
url = url[2:]
starts_http = url.startswith( 'http://' )
starts_https = url.startswith( 'https://' )

View File

@@ -49,7 +49,7 @@ options = {}
# Misc
NETWORK_VERSION = 18
SOFTWARE_VERSION = 255
SOFTWARE_VERSION = 256
UNSCALED_THUMBNAIL_DIMENSIONS = ( 200, 200 )

View File

@@ -825,7 +825,7 @@ class TestClientDB( unittest.TestCase ):
result = self._read( 'md5_status', md5 )
self.assertEqual( result, ( CC.STATUS_DELETED, None ) )
self.assertEqual( result, ( CC.STATUS_DELETED, hash ) )
def test_media_results( self ):

View File

@@ -242,6 +242,11 @@ class Controller( object ):
return write
def IsBooted( self ):
return True
def IsFirstStart( self ):
return True