Version 306

This commit is contained in:
Hydrus Network Developer 2018-05-09 15:23:00 -05:00
parent f6163609be
commit 3b0f013357
43 changed files with 1423 additions and 573 deletions

View File

@ -8,6 +8,58 @@
<div class="content">
<h3>changelog</h3>
<ul>
<li><h3>version 306</h3></li>
<ul>
<li>the file import status list now has 'open selected import files in a new page', which should show up where it is possible. this is a bit prototype and ugly--it'll show _all_ files, including in-trash and permanently deleted (which will show up with the hydrus thumbnail)</li>
<li>the file import status list now prefixes the already in db/deleted notes with 'url' or the hash type that led to the recognition</li>
<li>these redundant/deleted notes now also propagate up from the 'during import' recognition phase</li>
<li>the 'delete seeds of type x' entries on the file import status button's right-click menu are now split into three smaller individual types and are more explicit about exactly which status types they will remove</li>
<li>like import folders, subscriptions can now optionally publish their files to pages as well as popup buttons. also, subscriptions can optionally publish their files separately for each query instead of all merged together</li>
<li>sped up multiple tag queries significantly</li>
<li>sped up simple (file size, mime, etc...) system predicate queries that also include a tag/namespace/wildcard predicate significantly</li>
<li>added a pixiv parser that pulls the japanese tags to the defaults--users can switch to this if they prefer under network->manage url class links</li>
<li>fixed the 4chan parser to get part of the comment as a backup subject/page title</li>
<li>removed the 'newgrounds' entry from the normal gallery page creation ui, as the basic gallery parser no longer works due to a dynamic loading change on their end. I hope to have it back with the new gallery parsing system I will soon be writing</li>
<li>the edit url classes panel now has a little text box to put in example urls and see which class, if any, that they match to</li>
<li>improved layout of edit url class links panel</li>
<li>all url types are now displayable in the media viewer--only post url classes are default on</li>
<li>the new (x/y) import page page_name progress count is now updated on all alterations to this value (previously, this was not updating when a user interacted with the import queue, only when the natural downloader loop cycled)</li>
<li>added 'can produce multiple files' option to post url url classes, which informs client url-checking logic whether the url can be relied upon for 'already in db/deleted' calculations</li>
<li>the pixiv file page url class now has 'can produce multiple files' checked, meaning some bad pixiv url association logic due to other sites referencing it as a source url is now fixed</li>
<li>added a 'twitter tweet' url class, which is also a 'can produce multiple files' post url</li>
<li>added a 'sync known urls?' action choice to the duplicate merge options panel, which governs whether urls should be copied from worse to better or in both directions</li>
<li>gave the edit duplicate merge options panel a layout pass</li>
<li>the edit duplicate merge options panel will now disable pointless/over-complicated choices on non-custom actions, let me know if this is a pain for your workflow</li>
<li>added a 'manual' web browser path override to the 'files and trash' options panel, which fixes the new share->open->in web browser option for Windows and also fixes some #anchor link propagation</li>
<li>consolidated all URL/Path web browser launching code to one location</li>
<li>'open in web browser' is now available for non-advanced_mode users and the 'open' submenu of the share menu is available in the preview window and the media viewer</li>
<li>fixed a bug that was causing import folders to publish incorrect file identifiers, which was poisoning popup buttons and import page destinations</li>
<li>gui sessions that fail to load a page will recover and continue to attempt loading the rest of their pages. some popups detailing the page's serialised data and error will be presented</li>
<li>gui sessions that fail to save a page will recover and continue to attempt saving the rest of their pages. some popups detailing the page's rough info and error will be presented</li>
<li>the core controller inside all media pages will now present itself in a more beautiful way when asked to dump itself to a log (which should beautify the above save error a bit)</li>
<li>added a subsidiary database->check->'just repo update files' action that tests the integrity of repository update files only</li>
<li>fixed an issue where default tag import options were sometimes not being saved from the new dialog in the networking menu</li>
<li>wrote a couple of layers of bad tag protection to help the new downloader deal with some occasional bad output from the old downloader</li>
<li>network jobs can now reattempt connection attempts up to three times on POST requests (if you ever got inexplicable immediate 'could not connect' errors on repository uploads, this should now be fixed)</li>
<li>replaced some archaic misc old import code with the new system, cleaning up a bunch of stuff and making space for further refactoring along the way</li>
<li>fixed tags blacklist not being inherited in the old (through options dialog) system</li>
<li>improved some invalid domain error handling</li>
<li>fixed an animation update issue that would pause naturally updating controls on non-main-gui frames when there were no regular media pages open on the main gui</li>
<li>added a BUGFIX option to 'files and trash' option page to override the default temp path for almost all client temp path requests</li>
<li>the minimum value for the 'vacuum period' in maintenance and processing options is now 28 days. the control also has a little explanatory tooltip</li>
<li>the 'try to auto-link url classes and parsers' function now always prefers parsers in alphabetical order</li>
<li>fixed a typo in the string transformations prettification code that incorrectly summarised 'take the last x characters' as the opposite</li>
<li>misc fix to file hash generation and status checking code</li>
<li>the 'export tags to .txt files' checkbox on the export files panel will no longer bother you with a dialog as you uncheck it</li>
<li>wrote some code to make it easier and more fool-proof to update the domain manager with new url classes and parsers on my end</li>
<li>improved some popup message manager ok-to-alter-ui logic when the main ui is minimised and so on</li>
<li>fixed some potential crash conditions (affecting linux mostly, seems like) in the service credential testing and access key fetching ui code</li>
<li>fixed a bug when 'stopping' a gallery parse during a long error pause (like when it holds on '404')</li>
<li>sped up some old set intersection code</li>
<li>some import file presentation refactoring</li>
<li>some url content application pipeline cleanup</li>
<li>misc cleanup</li>
</ul>
<li><h3>version 305</h3></li>
<ul>
<li>fixed the pixiv url class, which was unintentionally removing a parameter</li>

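The changelog's note that network jobs can now reattempt connection attempts up to three times on POST requests can be sketched roughly like this. This is an illustrative retry loop, not the client's actual network job code; the names and the retry delay are assumptions.

```python
import time

MAX_CONNECTION_ATTEMPTS = 3  # per the changelog: up to three attempts on POST

class ConnectionFailed(Exception):
    pass

def post_with_reattempts(do_post, delay=0.0):
    """Call do_post(), reattempting on connection failure up to the limit."""
    for attempt in range(1, MAX_CONNECTION_ATTEMPTS + 1):
        try:
            return do_post()
        except ConnectionFailed:
            if attempt == MAX_CONNECTION_ATTEMPTS:
                raise  # out of attempts; surface the error to the caller
            time.sleep(delay)  # brief pause before retrying

# example: a post that fails twice, then succeeds on the third attempt
attempts = []
def flaky_post():
    attempts.append(1)
    if len(attempts) < 3:
        raise ConnectionFailed('could not connect')
    return 'ok'
```

With this shape, a transient 'could not connect' on upload is absorbed silently unless all three attempts fail.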
View File

@ -1,6 +1,7 @@
import ClientDefaults
import ClientDownloading
import ClientParsing
import ClientPaths
import ClientRendering
import ClientSearch
import ClientServices
@ -977,9 +978,7 @@ class ClientFilesManager( object ):
def ImportFile( self, file_import_job ):
file_import_job.GenerateHashAndStatus()
hash = file_import_job.GetHash()
( pre_import_status, hash, note ) = file_import_job.GenerateHashAndStatus()
if file_import_job.IsNewToDB():
@ -987,12 +986,12 @@ class ClientFilesManager( object ):
file_import_job.CheckIsGoodToImport()
( temp_path, thumbnail ) = file_import_job.GetTempPathAndThumbnail()
mime = file_import_job.GetMime()
with self._lock:
( temp_path, thumbnail ) = file_import_job.GetTempPathAndThumbnail()
mime = file_import_job.GetMime()
self.LocklessAddFile( hash, mime, temp_path )
if thumbnail is not None:
@ -1000,17 +999,17 @@ class ClientFilesManager( object ):
self.LocklessAddFullSizeThumbnail( hash, thumbnail )
import_status = self._controller.WriteSynchronous( 'import_file', file_import_job )
( import_status, note ) = self._controller.WriteSynchronous( 'import_file', file_import_job )
else:
file_import_job.PubsubContentUpdates()
import_status = file_import_job.GetPreImportStatus()
import_status = pre_import_status
return ( import_status, hash )
file_import_job.PubsubContentUpdates()
return ( import_status, hash, note )
def LocklessGetFilePath( self, hash, mime = None ):
@ -1964,7 +1963,7 @@ class ThumbnailCache( object ):
names = [ 'hydrus', 'flash', 'pdf', 'audio', 'video', 'zip' ]
( os_file_handle, temp_path ) = HydrusPaths.GetTempPath()
( os_file_handle, temp_path ) = ClientPaths.GetTempPath()
try:

View File

@ -414,7 +414,7 @@ status_string_lookup[ STATUS_DELETED ] = 'deleted'
status_string_lookup[ STATUS_ERROR ] = 'error'
status_string_lookup[ STATUS_NEW ] = 'new'
status_string_lookup[ STATUS_PAUSED ] = 'paused'
status_string_lookup[ STATUS_VETOED ] = 'vetoed'
status_string_lookup[ STATUS_VETOED ] = 'ignored'
status_string_lookup[ STATUS_SKIPPED ] = 'skipped'
SUCCESSFUL_IMPORT_STATES = { STATUS_SUCCESSFUL_AND_NEW, STATUS_SUCCESSFUL_BUT_REDUNDANT }

View File

@ -22,6 +22,7 @@ import ClientNetworkingBandwidth
import ClientNetworkingDomain
import ClientNetworkingLogin
import ClientNetworkingSessions
import ClientPaths
import ClientThreading
import hashlib
import HydrusConstants as HC
@ -547,6 +548,8 @@ class Controller( HydrusController.HydrusController ):
self.pub( 'splash_set_status_subtext', u'client files' )
self.temp_dir = ClientPaths.GetTempDir()
self.InitClientFilesManager()
#
@ -1112,6 +1115,11 @@ class Controller( HydrusController.HydrusController ):
self.SaveDirtyObjects()
if hasattr( self, 'temp_dir' ):
HydrusPaths.DeletePath( self.temp_dir )
HydrusController.HydrusController.ShutdownModel( self )

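The Controller diff above creates a per-boot temp dir via ClientPaths.GetTempDir and deletes it on ShutdownModel, which pairs with the new 'temp_path_override' BUGFIX option mentioned in the changelog. A minimal sketch of that lifecycle, assuming the override is a base directory to create temp dirs under (the helper names here are illustrative):

```python
import os
import shutil
import tempfile

def get_temp_dir(override_path=None):
    # honour a user-set override base dir if present; otherwise use the OS default
    if override_path is not None:
        if not os.path.exists(override_path):
            os.makedirs(override_path)
        return tempfile.mkdtemp(dir=override_path)
    return tempfile.mkdtemp(prefix='hydrus_')

def cleanup_temp_dir(path):
    # mirror of the shutdown step: delete the per-boot temp dir if it exists
    if os.path.exists(path):
        shutil.rmtree(path, ignore_errors=True)
```

Creating one dir at boot and removing it at shutdown keeps stray temp files from accumulating across sessions.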
View File

@ -2568,7 +2568,7 @@ class DB( HydrusDB.HydrusDB ):
def _CheckFileIntegrity( self, mode, move_location = None ):
def _CheckFileIntegrity( self, mode, allowed_mimes = None, move_location = None ):
prefix_string = 'checking file integrity: '
@ -2580,7 +2580,16 @@ class DB( HydrusDB.HydrusDB ):
self._controller.pub( 'modal_message', job_key )
info = self._c.execute( 'SELECT hash_id, mime FROM current_files NATURAL JOIN files_info WHERE service_id = ?;', ( self._combined_local_file_service_id, ) ).fetchall()
if allowed_mimes is None:
select = 'SELECT hash_id, mime FROM current_files NATURAL JOIN files_info WHERE service_id = ?;'
else:
select = 'SELECT hash_id, mime FROM current_files NATURAL JOIN files_info WHERE service_id = ? AND mime IN ' + HydrusData.SplayListForDB( allowed_mimes ) + ';'
info = self._c.execute( select, ( self._combined_local_file_service_id, ) ).fetchall()
missing_count = 0
deletee_hash_ids = []
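The allowed_mimes branch above splays the mime values directly into the SQL string via HydrusData.SplayListForDB. A parameterised equivalent of the same optional-filter pattern, against a simplified single table rather than the real current_files/files_info join, looks like this (a sketch, not the client's implementation):

```python
import sqlite3

def select_current_files(cursor, service_id, allowed_mimes=None):
    # optionally narrow the integrity check to certain mimes, as in _CheckFileIntegrity
    query = 'SELECT hash_id, mime FROM files_info WHERE service_id = ?'
    params = [service_id]
    if allowed_mimes is not None:
        placeholders = ','.join('?' * len(allowed_mimes))
        query += ' AND mime IN (' + placeholders + ')'
        params.extend(allowed_mimes)
    return cursor.execute(query + ';', params).fetchall()

db = sqlite3.connect(':memory:')
c = db.cursor()
c.execute('CREATE TABLE files_info ( hash_id INTEGER, mime TEXT, service_id INTEGER )')
c.executemany('INSERT INTO files_info VALUES ( ?, ?, ? )',
              [(1, 'image/png', 5), (2, 'video/mp4', 5), (3, 'image/png', 6)])
```

This is what lets the new 'just repo update files' check reuse the general integrity code with allowed_mimes set to the update-file mimes.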
@ -4301,12 +4310,26 @@ class DB( HydrusDB.HydrusDB ):
if len( tags_to_include ) > 0 or len( namespaces_to_include ) > 0 or len( wildcards_to_include ) > 0:
if len( tags_to_include ) > 0:
def sort_longest_first_key( s ):
tag_query_hash_ids = HydrusData.IntelligentMassIntersect( ( self._GetHashIdsFromTag( file_service_key, tag_service_key, tag, include_current_tags, include_pending_tags ) for tag in tags_to_include ) )
return -len( s )
tags_to_include = list( tags_to_include )
tags_to_include.sort( key = sort_longest_first_key )
for tag in tags_to_include:
tag_query_hash_ids = self._GetHashIdsFromTag( file_service_key, tag_service_key, tag, include_current_tags, include_pending_tags, allowed_hash_ids = query_hash_ids )
query_hash_ids = update_qhi( query_hash_ids, tag_query_hash_ids )
if query_hash_ids == set():
return query_hash_ids
if len( namespaces_to_include ) > 0:
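The reworked tag search above sorts the included tags longest-first and narrows the running hash_id set tag by tag, bailing out as soon as it empties, instead of fetching every tag's full result set and mass-intersecting. The idea in isolation, with plain sets standing in for the database lookups (the longest-first heuristic presumably assumes longer tags tend to match fewer files):

```python
def intersect_tag_results(tags_to_hash_ids, tags_to_include):
    # search the longest tags first so the running set shrinks quickly,
    # letting later lookups be restricted to the surviving ids
    ordered = sorted(tags_to_include, key=lambda s: -len(s))
    query_hash_ids = None
    for tag in ordered:
        found = tags_to_hash_ids.get(tag, set())
        query_hash_ids = found if query_hash_ids is None else query_hash_ids & found
        if not query_hash_ids:
            return set()  # early exit: no file can match all tags
    return query_hash_ids if query_hash_ids is not None else set()
```

The early exit is the key saving: once any tag yields nothing in common, the remaining (potentially expensive) tag lookups are skipped entirely.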
@ -4324,15 +4347,17 @@ class DB( HydrusDB.HydrusDB ):
if len( files_info_predicates ) > 0:
files_info_predicates.append( 'hash_id IN %s' )
if file_service_key == CC.COMBINED_FILE_SERVICE_KEY:
query_hash_ids.intersection_update( self._STI( self._c.execute( 'SELECT hash_id FROM files_info WHERE ' + ' AND '.join( files_info_predicates ) + ';' ) ) )
query_hash_ids.intersection_update( self._STI( self._SelectFromList( 'SELECT hash_id FROM files_info WHERE ' + ' AND '.join( files_info_predicates ) + ';', query_hash_ids ) ) )
else:
files_info_predicates.insert( 0, 'service_id = ' + str( file_service_id ) )
query_hash_ids.intersection_update( self._STI( self._c.execute( 'SELECT hash_id FROM current_files NATURAL JOIN files_info WHERE ' + ' AND '.join( files_info_predicates ) + ';' ) ) )
query_hash_ids.intersection_update( self._STI( self._SelectFromList( 'SELECT hash_id FROM current_files NATURAL JOIN files_info WHERE ' + ' AND '.join( files_info_predicates ) + ';', query_hash_ids ) ) )
@ -4432,7 +4457,7 @@ class DB( HydrusDB.HydrusDB ):
for tag in tags_to_exclude:
exclude_query_hash_ids.update( self._GetHashIdsFromTag( file_service_key, tag_service_key, tag, include_current_tags, include_pending_tags ) )
exclude_query_hash_ids.update( self._GetHashIdsFromTag( file_service_key, tag_service_key, tag, include_current_tags, include_pending_tags, allowed_hash_ids = query_hash_ids ) )
for namespace in namespaces_to_exclude:
@ -4798,7 +4823,7 @@ class DB( HydrusDB.HydrusDB ):
return hash_ids
def _GetHashIdsFromTag( self, file_service_key, tag_service_key, tag, include_current_tags, include_pending_tags ):
def _GetHashIdsFromTag( self, file_service_key, tag_service_key, tag, include_current_tags, include_pending_tags, allowed_hash_ids = None ):
siblings_manager = self._controller.GetManager( 'tag_siblings' )
@ -4815,8 +4840,6 @@ class DB( HydrusDB.HydrusDB ):
search_tag_service_ids = [ self._GetServiceId( tag_service_key ) ]
hash_ids = set()
for tag in tags:
current_selects = []
@ -4880,20 +4903,35 @@ class DB( HydrusDB.HydrusDB ):
if include_current_tags:
hash_ids = set()
selects = []
if include_current_tags:
selects.extend( current_selects )
if include_pending_tags:
selects.extend( pending_selects )
if allowed_hash_ids is None:
for select in selects:
for current_select in current_selects:
hash_ids.update( ( id for ( id, ) in self._c.execute( current_select ) ) )
hash_ids.update( self._STI( self._c.execute( select ) ) )
if include_pending_tags:
else:
selects = [ select.replace( ';', ' AND hash_id IN %s;' ) for select in selects ]
for select in selects:
for pending_select in pending_selects:
hash_ids.update( ( id for ( id, ) in self._c.execute( pending_select ) ) )
hash_ids.update( self._STI( self._SelectFromList( select, allowed_hash_ids ) ) )
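The hunks above route 'hash_id IN %s' queries through a _SelectFromList helper. A plausible sketch of what such a helper does: substitute the '%s' placeholder with bounded chunks of the allowed id list, since a single statement cannot take an arbitrarily long IN clause. This is a hypothetical reconstruction, not the client's actual implementation, and the chunk size here is deliberately tiny for demonstration:

```python
import sqlite3

CHUNK_SIZE = 3  # tiny for demonstration; real code would use a much larger bound

def select_from_list(cursor, select_with_placeholder, ids):
    # run the query over bounded chunks of ids, yielding every matching row
    ids = list(ids)
    for i in range(0, len(ids), CHUNK_SIZE):
        chunk = ids[i:i + CHUNK_SIZE]
        splayed = '(' + ','.join(str(int(j)) for j in chunk) + ')'
        for row in cursor.execute(select_with_placeholder % splayed):
            yield row

db = sqlite3.connect(':memory:')
c = db.cursor()
c.execute('CREATE TABLE files_info ( hash_id INTEGER PRIMARY KEY, size INTEGER )')
c.executemany('INSERT INTO files_info VALUES ( ?, ? )', [(i, i * 100) for i in range(10)])
```

Restricting each SELECT to the already-narrowed id set is what makes the combined tag-plus-system-predicate queries in the changelog significantly faster: the database only inspects rows that are still candidates.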
@ -5155,47 +5193,48 @@ class DB( HydrusDB.HydrusDB ):
return results
def _GetHashIdStatus( self, hash_id ):
def _GetHashIdStatus( self, hash_id, prefix = '' ):
hash = self._GetHash( hash_id )
result = self._c.execute( 'SELECT 1 FROM deleted_files WHERE service_id = ? AND hash_id = ?;', ( self._combined_local_file_service_id, hash_id ) ).fetchone()
if result is not None:
hash = self._GetHash( hash_id )
return ( CC.STATUS_DELETED, hash, '' )
return ( CC.STATUS_DELETED, hash, prefix )
result = self._c.execute( 'SELECT timestamp FROM current_files WHERE service_id = ? AND hash_id = ?;', ( self._trash_service_id, hash_id ) ).fetchone()
if result is not None:
hash = self._GetHash( hash_id )
( timestamp, ) = result
note = 'Currently in trash. Sent there at ' + HydrusData.ConvertTimestampToPrettyTime( timestamp ) + ', which was ' + HydrusData.ConvertTimestampToPrettyAgo( timestamp ) + ' before this check.'
return ( CC.STATUS_DELETED, hash, note )
return ( CC.STATUS_DELETED, hash, prefix + ': ' + note )
result = self._c.execute( 'SELECT timestamp FROM current_files WHERE service_id = ? AND hash_id = ?;', ( self._combined_local_file_service_id, hash_id ) ).fetchone()
if result is not None:
hash = self._GetHash( hash_id )
( timestamp, ) = result
note = 'Imported at ' + HydrusData.ConvertTimestampToPrettyTime( timestamp ) + ', which was ' + HydrusData.ConvertTimestampToPrettyAgo( timestamp ) + ' before this check.'
return ( CC.STATUS_SUCCESSFUL_BUT_REDUNDANT, hash, note )
return ( CC.STATUS_SUCCESSFUL_BUT_REDUNDANT, hash, prefix + ': ' + note )
return ( CC.STATUS_UNKNOWN, None, '' )
return ( CC.STATUS_UNKNOWN, hash, '' )
def _GetHashStatus( self, hash_type, hash ):
def _GetHashStatus( self, hash_type, hash, prefix = None ):
if prefix is None:
prefix = hash_type + ' recognised'
if hash_type == 'sha256':
@ -5207,7 +5246,7 @@ class DB( HydrusDB.HydrusDB ):
hash_id = self._GetHashId( hash )
return self._GetHashIdStatus( hash_id )
return self._GetHashIdStatus( hash_id, prefix = prefix )
else:
@ -5233,7 +5272,7 @@ class DB( HydrusDB.HydrusDB ):
( hash_id, ) = result
return self._GetHashIdStatus( hash_id )
return self._GetHashIdStatus( hash_id, prefix = prefix )
@ -6624,7 +6663,7 @@ class DB( HydrusDB.HydrusDB ):
hash_ids.update( results )
results = [ self._GetHashIdStatus( hash_id ) for hash_id in hash_ids ]
results = [ self._GetHashIdStatus( hash_id, prefix = 'url recognised' ) for hash_id in hash_ids ]
return results
@ -6703,7 +6742,7 @@ class DB( HydrusDB.HydrusDB ):
hash_id = self._GetHashId( hash )
( status, status_hash, note ) = self._GetHashIdStatus( hash_id )
( status, status_hash, note ) = self._GetHashIdStatus( hash_id, prefix = 'recognised during import' )
if status != CC.STATUS_SUCCESSFUL_BUT_REDUNDANT:
@ -6771,7 +6810,7 @@ class DB( HydrusDB.HydrusDB ):
return status
return ( status, note )
def _ImportUpdate( self, update_network_string, update_hash, mime ):
@ -7273,19 +7312,25 @@ class DB( HydrusDB.HydrusDB ):
if action == HC.CONTENT_UPDATE_ADD:
( hash, urls ) = row
( urls, hashes ) = row
hash_id = self._GetHashId( hash )
self._c.executemany( 'INSERT OR IGNORE INTO urls ( hash_id, url ) VALUES ( ?, ? );', ( ( hash_id, url ) for url in urls ) )
for hash in hashes:
hash_id = self._GetHashId( hash )
self._c.executemany( 'INSERT OR IGNORE INTO urls ( hash_id, url ) VALUES ( ?, ? );', ( ( hash_id, url ) for url in urls ) )
elif action == HC.CONTENT_UPDATE_DELETE:
( hash, urls ) = row
( urls, hashes ) = row
hash_id = self._GetHashId( hash )
self._c.executemany( 'DELETE FROM urls WHERE hash_id = ? AND url = ?;', ( ( hash_id, url ) for url in urls ) )
for hash in hashes:
hash_id = self._GetHashId( hash )
self._c.executemany( 'DELETE FROM urls WHERE hash_id = ? AND url = ?;', ( ( hash_id, url ) for url in urls ) )
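The URL content update above changes the row shape from ( hash, urls ) to ( urls, hashes ), looping per hash with an executemany over the urls. The same pattern in a self-contained form (the get_hash_id lookup is a stand-in for the client's _GetHashId):

```python
import sqlite3

def add_urls(cursor, urls, hashes, get_hash_id):
    # mirror of the new CONTENT_UPDATE_ADD shape: one update row covers many hashes
    for h in hashes:
        hash_id = get_hash_id(h)
        cursor.executemany(
            'INSERT OR IGNORE INTO urls ( hash_id, url ) VALUES ( ?, ? );',
            ((hash_id, url) for url in urls))

db = sqlite3.connect(':memory:')
c = db.cursor()
c.execute('CREATE TABLE urls ( hash_id INTEGER, url TEXT, UNIQUE ( hash_id, url ) )')
hash_ids = {'aa': 1, 'bb': 2}
add_urls(c, ['http://example.com/x'], ['aa', 'bb'], hash_ids.get)
```

Accepting many hashes per row is what the new 'can produce multiple files' post urls need: one source url can now be associated with every file it produced in a single update.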
@ -8719,7 +8764,7 @@ class DB( HydrusDB.HydrusDB ):
if value is None:
self._c.execute( 'DELET FROM json_dict WHERE name = ?;', ( name, ) )
self._c.execute( 'DELETE FROM json_dict WHERE name = ?;', ( name, ) )
else:
@ -10086,6 +10131,44 @@ class DB( HydrusDB.HydrusDB ):
if version == 305:
try:
domain_manager = self._GetJSONDump( HydrusSerialisable.SERIALISABLE_TYPE_NETWORK_DOMAIN_MANAGER )
domain_manager.Initialise()
#
domain_manager.OverwriteDefaultURLMatches( ( 'pixiv file page', 'twitter tweet' ) )
#
domain_manager.OverwriteDefaultParsers( ( '4chan thread api parser', 'pixiv single file page parser - japanese tags' ) )
#
domain_manager.TryToLinkURLMatchesAndParsers()
#
self._SetJSONDump( domain_manager )
except Exception as e:
HydrusData.PrintException( e )
message = 'Trying to update some url classes and parsers failed! Please let hydrus dev know!'
self.pub_initial_message( message )
message = 'The Newgrounds gallery parser no longer works! I hope to have it back once the new downloader engine\'s gallery work is done. You may want to pause any Newgrounds subscriptions.'
self.pub_initial_message( message )
self._controller.pub( 'splash_set_title_text', 'updated db to v' + str( version + 1 ) )
self._c.execute( 'UPDATE version SET version = ?;', ( version + 1, ) )

View File

@ -1,6 +1,7 @@
import ClientData
import ClientImporting
import ClientImportOptions
import ClientPaths
import ClientThreading
import HydrusConstants as HC
import HydrusData
@ -122,7 +123,7 @@ def DAEMONDownloadFiles( controller ):
try:
( os_file_handle, temp_path ) = HydrusPaths.GetTempPath()
( os_file_handle, temp_path ) = ClientPaths.GetTempPath()
try:

View File

@ -1013,8 +1013,8 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):
import ClientTags
self._dictionary[ 'duplicate_action_options' ][ HC.DUPLICATE_BETTER ] = DuplicateActionOptions( [ ( CC.LOCAL_TAG_SERVICE_KEY, HC.CONTENT_MERGE_ACTION_MOVE, ClientTags.TagFilter() ) ], [], True, True )
self._dictionary[ 'duplicate_action_options' ][ HC.DUPLICATE_SAME_QUALITY ] = DuplicateActionOptions( [ ( CC.LOCAL_TAG_SERVICE_KEY, HC.CONTENT_MERGE_ACTION_TWO_WAY_MERGE, ClientTags.TagFilter() ) ], [], False, True )
self._dictionary[ 'duplicate_action_options' ][ HC.DUPLICATE_BETTER ] = DuplicateActionOptions( [ ( CC.LOCAL_TAG_SERVICE_KEY, HC.CONTENT_MERGE_ACTION_MOVE, ClientTags.TagFilter() ) ], [], True, True, sync_urls_action = HC.CONTENT_MERGE_ACTION_COPY )
self._dictionary[ 'duplicate_action_options' ][ HC.DUPLICATE_SAME_QUALITY ] = DuplicateActionOptions( [ ( CC.LOCAL_TAG_SERVICE_KEY, HC.CONTENT_MERGE_ACTION_TWO_WAY_MERGE, ClientTags.TagFilter() ) ], [], False, True, sync_urls_action = HC.CONTENT_MERGE_ACTION_TWO_WAY_MERGE )
self._dictionary[ 'duplicate_action_options' ][ HC.DUPLICATE_ALTERNATE ] = DuplicateActionOptions( [], [], False )
self._dictionary[ 'duplicate_action_options' ][ HC.DUPLICATE_NOT_DUPLICATE ] = DuplicateActionOptions( [], [], False )
@ -1093,6 +1093,8 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):
self._dictionary[ 'noneable_strings' ][ 'thread_watcher_not_found_page_string' ] = '[404]'
self._dictionary[ 'noneable_strings' ][ 'thread_watcher_dead_page_string' ] = '[DEAD]'
self._dictionary[ 'noneable_strings' ][ 'thread_watcher_paused_page_string' ] = u'\u23F8'
self._dictionary[ 'noneable_strings' ][ 'temp_path_override' ] = None
self._dictionary[ 'noneable_strings' ][ 'web_browser_path' ] = None
self._dictionary[ 'strings' ] = {}
@ -1572,6 +1574,8 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):
fetch_tags_even_if_url_known_and_file_already_in_db = False
tag_blacklist = None
service_keys_to_namespaces = {}
service_keys_to_additional_tags = {}
@ -1579,6 +1583,8 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):
fetch_tags_even_if_url_known_and_file_already_in_db = guidance_tag_import_options.ShouldFetchTagsEvenIfURLKnownAndFileAlreadyInDB()
tag_blacklist = guidance_tag_import_options.GetTagBlacklist()
( namespaces, search_value ) = ClientDefaults.GetDefaultNamespacesAndSearchValue( gallery_identifier )
guidance_service_keys_to_namespaces = guidance_tag_import_options.GetServiceKeysToNamespaces()
@ -1600,7 +1606,7 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):
import ClientImportOptions
tag_import_options = ClientImportOptions.TagImportOptions( fetch_tags_even_if_url_known_and_file_already_in_db = fetch_tags_even_if_url_known_and_file_already_in_db, service_keys_to_namespaces = service_keys_to_namespaces, service_keys_to_additional_tags = service_keys_to_additional_tags )
tag_import_options = ClientImportOptions.TagImportOptions( fetch_tags_even_if_url_known_and_file_already_in_db = fetch_tags_even_if_url_known_and_file_already_in_db, tag_blacklist = tag_blacklist, service_keys_to_namespaces = service_keys_to_namespaces, service_keys_to_additional_tags = service_keys_to_additional_tags )
return tag_import_options
@ -2126,9 +2132,9 @@ class DuplicateActionOptions( HydrusSerialisable.SerialisableBase ):
SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_DUPLICATE_ACTION_OPTIONS
SERIALISABLE_NAME = 'Duplicate Action Options'
SERIALISABLE_VERSION = 2
SERIALISABLE_VERSION = 3
def __init__( self, tag_service_actions = None, rating_service_actions = None, delete_second_file = False, sync_archive = False, delete_both_files = False ):
def __init__( self, tag_service_actions = None, rating_service_actions = None, delete_second_file = False, sync_archive = False, delete_both_files = False, sync_urls_action = None ):
if tag_service_actions is None:
@ -2147,6 +2153,7 @@ class DuplicateActionOptions( HydrusSerialisable.SerialisableBase ):
self._delete_second_file = delete_second_file
self._sync_archive = sync_archive
self._delete_both_files = delete_both_files
self._sync_urls_action = sync_urls_action
def _GetSerialisableInfo( self ):
@ -2162,12 +2169,12 @@ class DuplicateActionOptions( HydrusSerialisable.SerialisableBase ):
serialisable_tag_service_actions = [ ( service_key.encode( 'hex' ), action, tag_filter.GetSerialisableTuple() ) for ( service_key, action, tag_filter ) in self._tag_service_actions ]
serialisable_rating_service_actions = [ ( service_key.encode( 'hex' ), action ) for ( service_key, action ) in self._rating_service_actions ]
return ( serialisable_tag_service_actions, serialisable_rating_service_actions, self._delete_second_file, self._sync_archive, self._delete_both_files )
return ( serialisable_tag_service_actions, serialisable_rating_service_actions, self._delete_second_file, self._sync_archive, self._delete_both_files, self._sync_urls_action )
def _InitialiseFromSerialisableInfo( self, serialisable_info ):
( serialisable_tag_service_actions, serialisable_rating_service_actions, self._delete_second_file, self._sync_archive, self._delete_both_files ) = serialisable_info
( serialisable_tag_service_actions, serialisable_rating_service_actions, self._delete_second_file, self._sync_archive, self._delete_both_files, self._sync_urls_action ) = serialisable_info
self._tag_service_actions = [ ( serialisable_service_key.decode( 'hex' ), action, HydrusSerialisable.CreateFromSerialisableTuple( serialisable_tag_filter ) ) for ( serialisable_service_key, action, serialisable_tag_filter ) in serialisable_tag_service_actions ]
self._rating_service_actions = [ ( serialisable_service_key.decode( 'hex' ), action ) for ( serialisable_service_key, action ) in serialisable_rating_service_actions ]
@ -2208,6 +2215,17 @@ class DuplicateActionOptions( HydrusSerialisable.SerialisableBase ):
return ( 2, new_serialisable_info )
if version == 2:
( serialisable_tag_service_actions, serialisable_rating_service_actions, delete_second_file, sync_archive, delete_both_files ) = old_serialisable_info
sync_urls_action = None
new_serialisable_info = ( serialisable_tag_service_actions, serialisable_rating_service_actions, delete_second_file, sync_archive, delete_both_files, sync_urls_action )
return ( 3, new_serialisable_info )
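The version-2-to-3 block above follows the serialisable update pattern: each step unpacks the old tuple, appends the new field with a safe default, and returns the next version, so old dumps upgrade one version at a time. A minimal sketch of that chain (simplified names, not the real _UpdateSerialisableInfo machinery):

```python
def update_serialisable_info(version, info):
    # each step appends new fields with safe defaults and bumps the version
    if version == 2:
        (tag_actions, rating_actions, delete_second, sync_archive, delete_both) = info
        sync_urls_action = None  # new in v3; None means 'do not sync urls'
        info = (tag_actions, rating_actions, delete_second, sync_archive,
                delete_both, sync_urls_action)
        return (3, info)
    return (version, info)

def load(version, info, current_version=3):
    # walk the chain until the dump is at the current version
    while version < current_version:
        (version, info) = update_serialisable_info(version, info)
    return info
```

Because the default is None, duplicate merge options saved before this release simply load with url syncing switched off.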
def GetDeletedHashes( self, first_media, second_media ):
@ -2228,18 +2246,19 @@ class DuplicateActionOptions( HydrusSerialisable.SerialisableBase ):
def SetTuple( self, tag_service_actions, rating_service_actions, delete_second_file, sync_archive, delete_both_files ):
def SetTuple( self, tag_service_actions, rating_service_actions, delete_second_file, sync_archive, delete_both_files, sync_urls_action ):
self._tag_service_actions = tag_service_actions
self._rating_service_actions = rating_service_actions
self._delete_second_file = delete_second_file
self._sync_archive = sync_archive
self._delete_both_files = delete_both_files
self._sync_urls_action = sync_urls_action
def ToTuple( self ):
return ( self._tag_service_actions, self._rating_service_actions, self._delete_second_file, self._sync_archive, self._delete_both_files )
return ( self._tag_service_actions, self._rating_service_actions, self._delete_second_file, self._sync_archive, self._delete_both_files, self._sync_urls_action )
def ProcessPairIntoContentUpdates( self, first_media, second_media ):
@ -2406,6 +2425,38 @@ class DuplicateActionOptions( HydrusSerialisable.SerialisableBase ):
list_of_service_keys_to_content_updates.append( service_keys_to_content_updates )
#
if self._sync_urls_action is not None:
first_urls = set( first_media.GetLocationsManager().GetURLs() )
second_urls = set( second_media.GetLocationsManager().GetURLs() )
content_updates = []
if self._sync_urls_action == HC.CONTENT_MERGE_ACTION_TWO_WAY_MERGE:
first_needs = second_urls.difference( first_urls )
second_needs = first_urls.difference( second_urls )
content_updates.append( HydrusData.ContentUpdate( HC.CONTENT_TYPE_URLS, HC.CONTENT_UPDATE_ADD, ( first_needs, first_hashes ) ) )
content_updates.append( HydrusData.ContentUpdate( HC.CONTENT_TYPE_URLS, HC.CONTENT_UPDATE_ADD, ( second_needs, second_hashes ) ) )
elif self._sync_urls_action == HC.CONTENT_MERGE_ACTION_COPY:
first_needs = second_urls.difference( first_urls )
content_updates.append( HydrusData.ContentUpdate( HC.CONTENT_TYPE_URLS, HC.CONTENT_UPDATE_ADD, ( first_needs, first_hashes ) ) )
if len( content_updates ) > 0:
service_keys_to_content_updates = { CC.COMBINED_LOCAL_FILE_SERVICE_KEY : content_updates }
list_of_service_keys_to_content_updates.append( service_keys_to_content_updates )
#
service_keys_to_content_updates = {}
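The sync_urls_action logic above reduces to two set differences: which urls each file is missing from the other. Isolated from the content-update plumbing, with string stand-ins for the HC.CONTENT_MERGE_ACTION_* constants:

```python
# stand-ins for the real constants in HydrusConstants
CONTENT_MERGE_ACTION_COPY = 'copy'
CONTENT_MERGE_ACTION_TWO_WAY_MERGE = 'two_way_merge'

def sync_urls(first_urls, second_urls, action):
    """Return ( urls to add to the first file, urls to add to the second )."""
    first_urls = set(first_urls)
    second_urls = set(second_urls)
    if action == CONTENT_MERGE_ACTION_TWO_WAY_MERGE:
        return (second_urls - first_urls, first_urls - second_urls)
    elif action == CONTENT_MERGE_ACTION_COPY:
        # copy: only the better (first) file gains the worse file's urls
        return (second_urls - first_urls, set())
    return (set(), set())
```

This matches the new 'sync known urls?' choice in the duplicate merge options panel: 'copy' moves urls from worse to better, 'two way merge' fills both in.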

View File

@ -22,6 +22,7 @@ import ClientGUITopLevelWindows
import ClientDownloading
import ClientMedia
import ClientNetworkingContexts
import ClientPaths
import ClientSearch
import ClientServices
import ClientThreading
@ -49,7 +50,6 @@ import threading
import time
import traceback
import types
import webbrowser
import wx
import wx.adv
@ -217,7 +217,7 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
library_versions.append( ( 'sqlite', sqlite3.sqlite_version ) )
library_versions.append( ( 'wx', wx.version() ) )
library_versions.append( ( 'temp dir', HydrusPaths.tempfile.gettempdir() ) )
library_versions.append( ( 'temp dir', ClientPaths.GetCurrentTempDir() ) )
import locale
@ -636,11 +636,20 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
def _CheckFileIntegrity( self ):
def _CheckFileIntegrity( self, allowed_mimes = None ):
client_files_manager = self._controller.client_files_manager
message = 'This will go through all the files the database thinks it has and check that they actually exist. Any files that are missing will be deleted from the internal record.'
if allowed_mimes is None:
file_desc = 'files'
else:
file_desc = os.linesep * 2 + os.linesep.join( ( HC.mime_string_lookup[ mime ] for mime in allowed_mimes ) ) + os.linesep * 2 + 'files'
message = 'This will go through all the ' + file_desc + ' the database thinks it has and check that they actually exist. Any files that are missing will be deleted from the internal record.'
message += os.linesep * 2
message += 'You can perform a quick existence check, which will only look to see if a file exists, or a thorough content check, which will also make sure existing files are not corrupt or otherwise incorrect.'
message += os.linesep * 2
@ -652,7 +661,7 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
if result == wx.ID_YES:
self._controller.CallToThread( client_files_manager.CheckFileIntegrity, 'quick' )
self._controller.CallToThread( client_files_manager.CheckFileIntegrity, 'quick', allowed_mimes = allowed_mimes )
elif result == wx.ID_NO:
@ -670,19 +679,24 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
path = HydrusData.ToUnicode( dlg_3.GetPath() )
self._controller.CallToThread( client_files_manager.CheckFileIntegrity, 'thorough', path )
self._controller.CallToThread( client_files_manager.CheckFileIntegrity, 'thorough', allowed_mimes = allowed_mimes, move_location = path )
elif result == wx.ID_NO:
self._controller.CallToThread( client_files_manager.CheckFileIntegrity, 'thorough' )
self._controller.CallToThread( client_files_manager.CheckFileIntegrity, 'thorough', allowed_mimes = allowed_mimes )
def _CheckFileIntegrityRepositoryUpdates( self ):
self._CheckFileIntegrity( allowed_mimes = HC.HYDRUS_UPDATE_FILES )
def _CheckImportFolder( self, name = None ):
if self._controller.options[ 'pause_import_folders_sync' ]:
@ -1386,7 +1400,7 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
ClientGUIMenus.AppendMenu( gallery_menu, hf_submenu, 'hentai foundry' )
ClientGUIMenus.AppendMenuItem( self, gallery_menu, 'newgrounds', 'Open a new tab to download files from Newgrounds.', self._notebook.NewPageImportGallery, ClientDownloading.GalleryIdentifier( HC.SITE_TYPE_NEWGROUNDS ), on_deepest_notebook = True )
#ClientGUIMenus.AppendMenuItem( self, gallery_menu, 'newgrounds', 'Open a new tab to download files from Newgrounds.', self._notebook.NewPageImportGallery, ClientDownloading.GalleryIdentifier( HC.SITE_TYPE_NEWGROUNDS ), on_deepest_notebook = True )
result = self._controller.Read( 'serialisable_simple', 'pixiv_account' )
@ -1476,6 +1490,7 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
ClientGUIMenus.AppendMenuItem( self, submenu, 'database integrity', 'Have the database examine all its records for internal consistency.', self._CheckDBIntegrity )
ClientGUIMenus.AppendMenuItem( self, submenu, 'file integrity', 'Have the database check if it truly has the files it thinks it does, and remove records when not.', self._CheckFileIntegrity )
ClientGUIMenus.AppendMenuItem( self, submenu, 'file integrity - only repository updates', 'Have the database check if it truly has the repository update files it thinks it does, and remove records when not.', self._CheckFileIntegrityRepositoryUpdates )
ClientGUIMenus.AppendMenu( menu, submenu, 'check' )
@ -1762,20 +1777,20 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
def help():
ClientGUIMenus.AppendMenuItem( self, menu, 'help', 'Open hydrus\'s local help in your web browser.', webbrowser.open, 'file://' + HC.HELP_DIR + '/index.html' )
ClientGUIMenus.AppendMenuItem( self, menu, 'help', 'Open hydrus\'s local help in your web browser.', ClientPaths.LaunchPathInWebBrowser, os.path.join( HC.HELP_DIR, 'index.html' ) )
links = wx.Menu()
site = ClientGUIMenus.AppendMenuBitmapItem( self, links, 'site', 'Open hydrus\'s website, which is mostly a mirror of the local help.', CC.GlobalBMPs.file_repository, webbrowser.open, 'https://hydrusnetwork.github.io/hydrus/' )
site = ClientGUIMenus.AppendMenuBitmapItem( self, links, '8chan board', 'Open hydrus dev\'s 8chan board, where he makes release posts and other status updates. Much other discussion also occurs.', CC.GlobalBMPs.eight_chan, webbrowser.open, 'https://8ch.net/hydrus/index.html' )
site = ClientGUIMenus.AppendMenuBitmapItem( self, links, 'twitter', 'Open hydrus dev\'s twitter, where he makes general progress updates and emergency notifications.', CC.GlobalBMPs.twitter, webbrowser.open, 'https://twitter.com/hydrusnetwork' )
site = ClientGUIMenus.AppendMenuBitmapItem( self, links, 'tumblr', 'Open hydrus dev\'s tumblr, where he makes release posts and other status updates.', CC.GlobalBMPs.tumblr, webbrowser.open, 'http://hydrus.tumblr.com/' )
site = ClientGUIMenus.AppendMenuBitmapItem( self, links, 'discord', 'Open a discord channel where many hydrus users congregate. Hydrus dev visits regularly.', CC.GlobalBMPs.discord, webbrowser.open, 'https://discord.gg/vy8CUB4' )
site = ClientGUIMenus.AppendMenuBitmapItem( self, links, 'patreon', 'Open hydrus dev\'s patreon, which lets you support development.', CC.GlobalBMPs.patreon, webbrowser.open, 'https://www.patreon.com/hydrus_dev' )
site = ClientGUIMenus.AppendMenuBitmapItem( self, links, 'site', 'Open hydrus\'s website, which is mostly a mirror of the local help.', CC.GlobalBMPs.file_repository, ClientPaths.LaunchURLInWebBrowser, 'https://hydrusnetwork.github.io/hydrus/' )
site = ClientGUIMenus.AppendMenuBitmapItem( self, links, '8chan board', 'Open hydrus dev\'s 8chan board, where he makes release posts and other status updates. Much other discussion also occurs.', CC.GlobalBMPs.eight_chan, ClientPaths.LaunchURLInWebBrowser, 'https://8ch.net/hydrus/index.html' )
site = ClientGUIMenus.AppendMenuBitmapItem( self, links, 'twitter', 'Open hydrus dev\'s twitter, where he makes general progress updates and emergency notifications.', CC.GlobalBMPs.twitter, ClientPaths.LaunchURLInWebBrowser, 'https://twitter.com/hydrusnetwork' )
site = ClientGUIMenus.AppendMenuBitmapItem( self, links, 'tumblr', 'Open hydrus dev\'s tumblr, where he makes release posts and other status updates.', CC.GlobalBMPs.tumblr, ClientPaths.LaunchURLInWebBrowser, 'http://hydrus.tumblr.com/' )
site = ClientGUIMenus.AppendMenuBitmapItem( self, links, 'discord', 'Open a discord channel where many hydrus users congregate. Hydrus dev visits regularly.', CC.GlobalBMPs.discord, ClientPaths.LaunchURLInWebBrowser, 'https://discord.gg/vy8CUB4' )
site = ClientGUIMenus.AppendMenuBitmapItem( self, links, 'patreon', 'Open hydrus dev\'s patreon, which lets you support development.', CC.GlobalBMPs.patreon, ClientPaths.LaunchURLInWebBrowser, 'https://www.patreon.com/hydrus_dev' )
ClientGUIMenus.AppendMenu( menu, links, 'links' )
ClientGUIMenus.AppendMenuItem( self, menu, 'changelog', 'Open hydrus\'s local changelog in your web browser.', webbrowser.open, 'file://' + HC.HELP_DIR + '/changelog.html' )
ClientGUIMenus.AppendMenuItem( self, menu, 'changelog', 'Open hydrus\'s local changelog in your web browser.', ClientPaths.LaunchPathInWebBrowser, os.path.join( HC.HELP_DIR, 'changelog.html' ) )
ClientGUIMenus.AppendSeparator( menu )
@ -3903,16 +3918,19 @@ The password is cleartext here but obscured in the entry dialog. Enter a blank p
current_page = self.GetCurrentPage()
if current_page is None:
if current_page is not None:
return False
in_current_page = ClientGUICommon.IsWXAncestor( window, current_page )
if in_current_page:
return True
in_current_page = ClientGUICommon.IsWXAncestor( window, current_page )
in_other_window = ClientGUICommon.GetTLP( window ) != self
return in_current_page or in_other_window
return in_other_window
def ImportFiles( self, paths ):
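The replacements above funnel every `webbrowser.open` call through new `ClientPaths` helpers. A minimal sketch of what such helpers might look like — the names follow the diff, but the internals are an assumption, since the real `ClientPaths` module is not shown here:

```python
import webbrowser

def PathToFileURL( path ):
    # Local help files need a file:// scheme before a browser will open them.
    return 'file://' + path

def LaunchURLInWebBrowser( url ):
    # Plain URLs are passed straight through to the default browser.
    webbrowser.open( url )

def LaunchPathInWebBrowser( path ):
    # Centralising the 'file://' + path glue here keeps the string
    # concatenation out of every menu call site.
    LaunchURLInWebBrowser( PathToFileURL( path ) )
```

Call sites can then pass an `os.path.join( HC.HELP_DIR, 'index.html' )` style path directly, as the diff does.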

View File

@ -16,6 +16,7 @@ import ClientGUIScrolledPanelsManagement
import ClientGUIShortcuts
import ClientGUITopLevelWindows
import ClientMedia
import ClientPaths
import ClientRatings
import ClientRendering
import ClientTags
@ -27,7 +28,6 @@ import HydrusSerialisable
import HydrusTags
import os
import urlparse
import webbrowser
import wx
FLASHWIN_OK = False
@ -1694,6 +1694,40 @@ class Canvas( wx.Window ):
self._MediaFocusWentToExternalProgram()
def _OpenFileInWebBrowser( self ):
if self._current_media is not None:
hash = self._current_media.GetHash()
mime = self._current_media.GetMime()
client_files_manager = HG.client_controller.client_files_manager
path = client_files_manager.GetFilePath( hash, mime )
ClientPaths.LaunchPathInWebBrowser( path )
self._MediaFocusWentToExternalProgram()
def _OpenFileLocation( self ):
if self._current_media is not None:
hash = self._current_media.GetHash()
mime = self._current_media.GetMime()
client_files_manager = HG.client_controller.client_files_manager
path = client_files_manager.GetFilePath( hash, mime )
HydrusPaths.OpenFileLocation( path )
self._MediaFocusWentToExternalProgram()
def _PauseCurrentMedia( self ):
if self._current_media is None:
@ -2381,6 +2415,10 @@ class CanvasPanel( Canvas ):
if self._current_media is not None:
new_options = HG.client_controller.new_options
advanced_mode = new_options.GetBoolean( 'advanced_mode' )
services = HG.client_controller.services_manager.GetServices()
locations_manager = self._current_media.GetLocationsManager()
@ -2455,7 +2493,7 @@ class CanvasPanel( Canvas ):
for url in urls:
ClientGUIMenus.AppendMenuItem( self, urls_visit_menu, url, 'Open this url in your web browser.', webbrowser.open, url )
ClientGUIMenus.AppendMenuItem( self, urls_visit_menu, url, 'Open this url in your web browser.', ClientPaths.LaunchURLInWebBrowser, url )
ClientGUIMenus.AppendMenuItem( self, urls_copy_menu, url, 'Copy this url to your clipboard.', HG.client_controller.pub, 'clipboard', 'text', url )
@ -2467,6 +2505,26 @@ class CanvasPanel( Canvas ):
share_menu = wx.Menu()
show_open_in_web = True
show_open_in_explorer = advanced_mode and not HC.PLATFORM_LINUX
if show_open_in_web or show_open_in_explorer:
open_menu = wx.Menu()
if show_open_in_web:
ClientGUIMenus.AppendMenuItem( self, open_menu, 'in web browser', 'Show this file in your OS\'s web browser.', self._OpenFileInWebBrowser )
if show_open_in_explorer:
ClientGUIMenus.AppendMenuItem( self, open_menu, 'in file browser', 'Show this file in your OS\'s file browser.', self._OpenFileLocation )
ClientGUIMenus.AppendMenu( share_menu, open_menu, 'open' )
copy_menu = wx.Menu()
ClientGUIMenus.AppendMenuItem( self, copy_menu, 'file', 'Copy this file to your clipboard.', self._CopyFileToClipboard )
@ -3133,7 +3191,7 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
with ClientGUITopLevelWindows.DialogEdit( self, 'edit duplicate merge options' ) as dlg_2:
panel = ClientGUIScrolledPanelsEdit.EditDuplicateActionOptionsPanel( dlg_2, duplicate_type, duplicate_action_options )
panel = ClientGUIScrolledPanelsEdit.EditDuplicateActionOptionsPanel( dlg_2, duplicate_type, duplicate_action_options, for_custom_action = True )
dlg_2.SetPanel( panel )
@ -4683,6 +4741,10 @@ class CanvasMediaListBrowser( CanvasMediaListNavigable ):
if self._current_media is not None:
new_options = HG.client_controller.new_options
advanced_mode = new_options.GetBoolean( 'advanced_mode' )
services = HG.client_controller.services_manager.GetServices()
local_ratings_services = [ service for service in services if service.GetServiceType() in ( HC.LOCAL_RATING_LIKE, HC.LOCAL_RATING_NUMERICAL ) ]
@ -4780,7 +4842,7 @@ class CanvasMediaListBrowser( CanvasMediaListNavigable ):
for url in urls:
ClientGUIMenus.AppendMenuItem( self, urls_visit_menu, url, 'Open this url in your web browser.', webbrowser.open, url )
ClientGUIMenus.AppendMenuItem( self, urls_visit_menu, url, 'Open this url in your web browser.', ClientPaths.LaunchURLInWebBrowser, url )
ClientGUIMenus.AppendMenuItem( self, urls_copy_menu, url, 'Copy this url to your clipboard.', HG.client_controller.pub, 'clipboard', 'text', url )
@ -4792,6 +4854,26 @@ class CanvasMediaListBrowser( CanvasMediaListNavigable ):
share_menu = wx.Menu()
show_open_in_web = True
show_open_in_explorer = advanced_mode and not HC.PLATFORM_LINUX
if show_open_in_web or show_open_in_explorer:
open_menu = wx.Menu()
if show_open_in_web:
ClientGUIMenus.AppendMenuItem( self, open_menu, 'in web browser', 'Show this file in your OS\'s web browser.', self._OpenFileInWebBrowser )
if show_open_in_explorer:
ClientGUIMenus.AppendMenuItem( self, open_menu, 'in file browser', 'Show this file in your OS\'s file browser.', self._OpenFileLocation )
ClientGUIMenus.AppendMenu( share_menu, open_menu, 'open' )
copy_menu = wx.Menu()
ClientGUIMenus.AppendMenuItem( self, copy_menu, 'file', 'Copy this file to your clipboard.', self._CopyFileToClipboard )

View File

@ -1232,11 +1232,26 @@ class Gauge( wx.Gauge ):
wx.Gauge.__init__( self, *args, **kwargs )
self._actual_value = None
self._actual_range = None
self._is_pulsing = False
def GetValueRange( self ):
if self._actual_range is None:
range = self.GetRange()
else:
range = self._actual_range
return ( self._actual_value, range )
def SetRange( self, range ):
if range is None:
@ -1271,6 +1286,8 @@ class Gauge( wx.Gauge ):
def SetValue( self, value ):
self._actual_value = value
if not self._is_pulsing:
if value is None:
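The `Gauge` changes above track the caller's real value and range separately from what wx is displaying, so `GetValueRange` can report actual progress even while the bar is pulsing. A standalone sketch of that bookkeeping, with plain attributes standing in for the `wx.Gauge` calls (the class name and method casing here are illustrative only):

```python
class GaugeState:

    def __init__( self, wx_range = 100 ):
        # wx_range stands in for wx.Gauge.GetRange().
        self._wx_range = wx_range
        self._actual_value = None
        self._actual_range = None
        self._is_pulsing = False

    def set_range( self, r ):
        self._actual_range = r

    def set_value( self, v ):
        self._actual_value = v

    def get_value_range( self ):
        # Fall back to the widget's own range when no explicit one was recorded.
        if self._actual_range is None:
            r = self._wx_range
        else:
            r = self._actual_range
        return ( self._actual_value, r )
```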

View File

@ -1808,36 +1808,49 @@ class DialogModifyAccounts( Dialog ):
def EventAddToExpires( self, event ):
self._DoModification( HC.ADD_TO_EXPIRES, timespan = self._add_to_expires.GetClientData( self._add_to_expires.GetSelection() ) )
raise NotImplementedError()
#self._DoModification( HC.ADD_TO_EXPIRES, timespan = self._add_to_expires.GetClientData( self._add_to_expires.GetSelection() ) )
def EventBan( self, event ):
with DialogTextEntry( self, 'Enter reason for the ban.' ) as dlg:
if dlg.ShowModal() == wx.ID_OK: self._DoModification( HC.BAN, reason = dlg.GetValue() )
if dlg.ShowModal() == wx.ID_OK:
raise NotImplementedError()
#self._DoModification( HC.BAN, reason = dlg.GetValue() )
def EventChangeAccountType( self, event ):
self._DoModification( HC.CHANGE_ACCOUNT_TYPE, account_type_key = self._account_types.GetChoice() )
raise NotImplementedError()
#self._DoModification( HC.CHANGE_ACCOUNT_TYPE, account_type_key = self._account_types.GetChoice() )
def EventSetExpires( self, event ):
expires = self._set_expires.GetClientData( self._set_expires.GetSelection() )
if expires is not None: expires += HydrusData.GetNow()
if expires is not None:
expires += HydrusData.GetNow()
self._DoModification( HC.SET_EXPIRES, expires = expires )
raise NotImplementedError()
#self._DoModification( HC.SET_EXPIRES, expires = expires )
def EventSuperban( self, event ):
with DialogTextEntry( self, 'Enter reason for the superban.' ) as dlg:
if dlg.ShowModal() == wx.ID_OK: self._DoModification( HC.SUPERBAN, reason = dlg.GetValue() )
if dlg.ShowModal() == wx.ID_OK:
raise NotImplementedError()
#self._DoModification( HC.SUPERBAN, reason = dlg.GetValue() )

View File

@ -28,6 +28,7 @@ import ClientImporting
import ClientImportOptions
import ClientMedia
import ClientParsing
import ClientPaths
import ClientRendering
import ClientSearch
import ClientThreading
@ -42,7 +43,6 @@ import threading
import time
import traceback
import urlparse
import webbrowser
import wx
import wx.lib.scrolledpanel
@ -431,7 +431,7 @@ def GenerateDumpMultipartFormDataCTAndBody( fields ):
jpeg = self._controller.DoHTTP( HC.GET, 'https://www.google.com/recaptcha/api/image?c=' + self._captcha_challenge )
( os_file_handle, temp_path ) = HydrusPaths.GetTempPath()
( os_file_handle, temp_path ) = ClientPaths.GetTempPath()
try:
@ -561,6 +561,11 @@ class ManagementController( HydrusSerialisable.SerialisableBase ):
self._serialisables = {}
def __repr__( self ):
return HydrusData.ToByteString( 'Management Controller: ' + self._management_type + ' - ' + self._page_name )
def _GetSerialisableInfo( self ):
serialisable_keys = { name : value.encode( 'hex' ) for ( name, value ) in self._keys.items() }
@ -894,7 +899,7 @@ class ManagementPanelDuplicateFilter( ManagementPanel ):
menu_items = []
page_func = HydrusData.Call( webbrowser.open, 'file://' + HC.HELP_DIR + '/duplicates.html' )
page_func = HydrusData.Call( ClientPaths.LaunchPathInWebBrowser, os.path.join( HC.HELP_DIR, 'duplicates.html' ) )
menu_items.append( ( 'normal', 'show some simpler help here', 'Throw up a message box with some simple help.', self._ShowSimpleHelp ) )
menu_items.append( ( 'normal', 'open the html duplicates help', 'Open the help page for duplicates processing in your web browser.', page_func ) )
@ -1397,7 +1402,7 @@ class ManagementPanelImporterGallery( ManagementPanelImporter ):
self._import_queue_panel = ClientGUICommon.StaticBox( self._gallery_downloader_panel, 'imports' )
self._current_action = ClientGUICommon.BetterStaticText( self._import_queue_panel )
self._seed_cache_control = ClientGUISeedCache.SeedCacheStatusControl( self._import_queue_panel, self._controller )
self._seed_cache_control = ClientGUISeedCache.SeedCacheStatusControl( self._import_queue_panel, self._controller, self._page_key )
self._file_download_control = ClientGUIControls.NetworkJobControl( self._import_queue_panel )
self._files_pause_button = wx.BitmapButton( self._import_queue_panel, bitmap = CC.GlobalBMPs.pause )
@ -1720,7 +1725,7 @@ class ManagementPanelImporterHDD( ManagementPanelImporter ):
self._import_queue_panel = ClientGUICommon.StaticBox( self, 'import summary' )
self._current_action = ClientGUICommon.BetterStaticText( self._import_queue_panel )
self._seed_cache_control = ClientGUISeedCache.SeedCacheStatusControl( self._import_queue_panel, self._controller )
self._seed_cache_control = ClientGUISeedCache.SeedCacheStatusControl( self._import_queue_panel, self._controller, self._page_key )
self._pause_button = wx.BitmapButton( self._import_queue_panel, bitmap = CC.GlobalBMPs.pause )
self._pause_button.Bind( wx.EVT_BUTTON, self.EventPause )
@ -1829,7 +1834,7 @@ class ManagementPanelImporterSimpleDownloader( ManagementPanelImporter ):
self._pause_files_button.Bind( wx.EVT_BUTTON, self.EventPauseFiles )
self._current_action = ClientGUICommon.BetterStaticText( self._import_queue_panel )
self._seed_cache_control = ClientGUISeedCache.SeedCacheStatusControl( self._import_queue_panel, self._controller )
self._seed_cache_control = ClientGUISeedCache.SeedCacheStatusControl( self._import_queue_panel, self._controller, self._page_key )
self._file_download_control = ClientGUIControls.NetworkJobControl( self._import_queue_panel )
#
@ -2267,7 +2272,7 @@ class ManagementPanelImporterThreadWatcher( ManagementPanelImporter ):
self._files_pause_button.Bind( wx.EVT_BUTTON, self.EventPauseFiles )
self._current_action = ClientGUICommon.BetterStaticText( imports_panel )
self._seed_cache_control = ClientGUISeedCache.SeedCacheStatusControl( imports_panel, self._controller )
self._seed_cache_control = ClientGUISeedCache.SeedCacheStatusControl( imports_panel, self._controller, self._page_key )
self._file_download_control = ClientGUIControls.NetworkJobControl( imports_panel )
#

View File

@ -16,6 +16,7 @@ import ClientGUIScrolledPanelsReview
import ClientGUIShortcuts
import ClientGUITopLevelWindows
import ClientMedia
import ClientPaths
import ClientSearch
import ClientTags
import collections
@ -36,7 +37,6 @@ import wx
import yaml
import HydrusData
import HydrusGlobals as HG
import webbrowser
def AddServiceKeyLabelsToMenu( menu, service_keys, phrase ):
@ -1110,7 +1110,7 @@ class MediaPanel( ClientMedia.ListeningMediaList, wx.ScrolledWindow ):
self._SetFocussedMedia( None )
webbrowser.open( 'file://' + path )
ClientPaths.LaunchPathInWebBrowser( path )
@ -1472,7 +1472,7 @@ class MediaPanel( ClientMedia.ListeningMediaList, wx.ScrolledWindow ):
with ClientGUITopLevelWindows.DialogEdit( self, 'edit duplicate merge options' ) as dlg_2:
panel = ClientGUIScrolledPanelsEdit.EditDuplicateActionOptionsPanel( dlg_2, duplicate_type, duplicate_action_options )
panel = ClientGUIScrolledPanelsEdit.EditDuplicateActionOptionsPanel( dlg_2, duplicate_type, duplicate_action_options, for_custom_action = True )
dlg_2.SetPanel( panel )
@ -3352,7 +3352,7 @@ class MediaPanelThumbnails( MediaPanel ):
for url in urls:
ClientGUIMenus.AppendMenuItem( self, urls_visit_menu, url, 'Open this url in your web browser.', webbrowser.open, url )
ClientGUIMenus.AppendMenuItem( self, urls_visit_menu, url, 'Open this url in your web browser.', ClientPaths.LaunchURLInWebBrowser, url )
ClientGUIMenus.AppendMenuItem( self, urls_copy_menu, url, 'Copy this url to your clipboard.', HG.client_controller.pub, 'clipboard', 'text', url )
@ -3370,7 +3370,7 @@ class MediaPanelThumbnails( MediaPanel ):
if focussed_is_local:
show_open_in_web = not HC.PLATFORM_WINDOWS or advanced_mode # let's turn it on for any Winlads who wants to try
show_open_in_web = True
show_open_in_explorer = advanced_mode and not HC.PLATFORM_LINUX
if show_open_in_web or show_open_in_explorer:
@ -3379,7 +3379,7 @@ class MediaPanelThumbnails( MediaPanel ):
if show_open_in_web:
ClientGUIMenus.AppendMenuItem( self, open_menu, 'in web browser (prototype)', 'Show this file in your OS\'s web browser.', self._OpenFileInWebBrowser )
ClientGUIMenus.AppendMenuItem( self, open_menu, 'in web browser', 'Show this file in your OS\'s web browser.', self._OpenFileInWebBrowser )
if show_open_in_explorer:
@ -3616,6 +3616,8 @@ class MediaPanelThumbnails( MediaPanel ):
ClientGUIMenus.AppendSeparator( duplicates_action_submenu )
ClientGUIMenus.AppendMenu( duplicates_menu, duplicates_action_submenu, 'set duplicate relationships' )
duplicates_edit_action_submenu = wx.Menu()
for duplicate_type in ( HC.DUPLICATE_BETTER, HC.DUPLICATE_SAME_QUALITY, HC.DUPLICATE_ALTERNATE, HC.DUPLICATE_NOT_DUPLICATE ):
@ -3623,12 +3625,10 @@ class MediaPanelThumbnails( MediaPanel ):
ClientGUIMenus.AppendMenuItem( self, duplicates_edit_action_submenu, 'for ' + HC.duplicate_type_string_lookup[ duplicate_type ], 'Edit what happens when you set this status.', self._EditDuplicateActionOptions, duplicate_type )
ClientGUIMenus.AppendMenu( duplicates_action_submenu, duplicates_edit_action_submenu, 'edit default merge options' )
ClientGUIMenus.AppendMenu( duplicates_menu, duplicates_edit_action_submenu, 'edit default merge options' )
ClientGUIMenus.AppendMenu( menu, duplicates_menu, 'duplicates' )
ClientGUIMenus.AppendMenu( duplicates_menu, duplicates_action_submenu, 'set duplicate relationships' )
if HG.client_controller.DBCurrentlyDoingJob():

View File

@ -269,7 +269,7 @@ class DialogPageChooser( ClientGUIDialogs.Dialog ):
entries.append( ( 'page_import_booru', None ) )
entries.append( ( 'page_import_gallery', HC.SITE_TYPE_DEVIANT_ART ) )
entries.append( ( 'menu', 'hentai foundry' ) )
entries.append( ( 'page_import_gallery', HC.SITE_TYPE_NEWGROUNDS ) )
#entries.append( ( 'page_import_gallery', HC.SITE_TYPE_NEWGROUNDS ) )
result = HG.client_controller.Read( 'serialisable_simple', 'pixiv_account' )
@ -2856,6 +2856,15 @@ class GUISession( HydrusSerialisable.SerialisableBaseNamed ):
def _GetSerialisableInfo( self ):
def handle_e( e ):
HydrusData.ShowText( 'Attempting to save a page to the session failed! Its data tuple and error follow! Please close it or see if you can clear any potentially invalid data from it!' )
HydrusData.ShowText( page_tuple )
HydrusData.ShowException( e )
def GetSerialisablePageTuple( page_tuple ):
( page_type, page_data ) = page_tuple
@ -2864,7 +2873,19 @@ class GUISession( HydrusSerialisable.SerialisableBaseNamed ):
( name, page_tuples ) = page_data
serialisable_page_tuples = [ GetSerialisablePageTuple( pt ) for pt in page_tuples ]
serialisable_page_tuples = []
for pt in page_tuples:
try:
serialisable_page_tuples.append( GetSerialisablePageTuple( pt ) )
except Exception as e:
handle_e( e )
serialisable_page_data = ( name, serialisable_page_tuples )
@ -2888,9 +2909,16 @@ class GUISession( HydrusSerialisable.SerialisableBaseNamed ):
for page_tuple in self._pages:
serialisable_page_tuple = GetSerialisablePageTuple( page_tuple )
serialisable_info.append( serialisable_page_tuple )
try:
serialisable_page_tuple = GetSerialisablePageTuple( page_tuple )
serialisable_info.append( serialisable_page_tuple )
except Exception as e:
handle_e( e )
return serialisable_info
@ -2898,6 +2926,15 @@ class GUISession( HydrusSerialisable.SerialisableBaseNamed ):
def _InitialiseFromSerialisableInfo( self, serialisable_info ):
def handle_e( e ):
HydrusData.ShowText( 'A page failed to load! Its serialised data and error follow!' )
HydrusData.ShowText( serialisable_page_tuple )
HydrusData.ShowException( e )
def GetPageTuple( serialisable_page_tuple ):
( page_type, serialisable_page_data ) = serialisable_page_tuple
@ -2906,7 +2943,19 @@ class GUISession( HydrusSerialisable.SerialisableBaseNamed ):
( name, serialisable_page_tuples ) = serialisable_page_data
page_tuples = [ GetPageTuple( spt ) for spt in serialisable_page_tuples ]
page_tuples = []
for spt in serialisable_page_tuples:
try:
page_tuples.append( GetPageTuple( spt ) )
except Exception as e:
handle_e( e )
page_data = ( name, page_tuples )
@ -2928,9 +2977,16 @@ class GUISession( HydrusSerialisable.SerialisableBaseNamed ):
for serialisable_page_tuple in serialisable_info:
page_tuple = GetPageTuple( serialisable_page_tuple )
self._pages.append( page_tuple )
try:
page_tuple = GetPageTuple( serialisable_page_tuple )
self._pages.append( page_tuple )
except Exception as e:
handle_e( e )
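The `GUISession` changes above replace the list comprehensions with per-item try/except loops, so one corrupt page is reported and skipped instead of aborting the whole session save or load. A minimal sketch of the pattern — the function names are illustrative, not the real hydrus API:

```python
def serialise_pages( page_tuples, serialise_one, report_error ):
    # Each page is converted independently; a failure on one page is
    # handed to report_error and the remaining pages still get saved.
    serialisable = []
    for page_tuple in page_tuples:
        try:
            serialisable.append( serialise_one( page_tuple ) )
        except Exception as e:
            report_error( page_tuple, e )
    return serialisable
```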

View File

@ -13,6 +13,7 @@ import ClientGUISerialisable
import ClientGUITopLevelWindows
import ClientNetworkingJobs
import ClientParsing
import ClientPaths
import ClientSerialisable
import ClientThreading
import HydrusConstants as HC
@ -27,7 +28,6 @@ import sys
import threading
import traceback
import time
import webbrowser
import wx
( StringConverterEvent, EVT_STRING_CONVERTER ) = wx.lib.newevent.NewCommandEvent()
@ -154,7 +154,7 @@ class EditCompoundFormulaPanel( ClientGUIScrolledPanels.EditPanel ):
menu_items = []
page_func = HydrusData.Call( webbrowser.open, 'file://' + HC.HELP_DIR + '/downloader_parsers_formulae.html#compound_formula' )
page_func = HydrusData.Call( ClientPaths.LaunchPathInWebBrowser, os.path.join( HC.HELP_DIR, 'downloader_parsers_formulae.html#compound_formula' ) )
menu_items.append( ( 'normal', 'open the compound formula help', 'Open the help page for compound formulae in your web browser.', page_func ) )
@ -384,7 +384,7 @@ class EditContextVariableFormulaPanel( ClientGUIScrolledPanels.EditPanel ):
menu_items = []
page_func = HydrusData.Call( webbrowser.open, 'file://' + HC.HELP_DIR + '/downloader_parsers_formulae.html#context_variable_formula' )
page_func = HydrusData.Call( ClientPaths.LaunchPathInWebBrowser, os.path.join( HC.HELP_DIR, 'downloader_parsers_formulae.html#context_variable_formula' ) )
menu_items.append( ( 'normal', 'open the context variable formula help', 'Open the help page for context variable formulae in your web browser.', page_func ) )
@ -802,7 +802,7 @@ class EditHTMLFormulaPanel( ClientGUIScrolledPanels.EditPanel ):
menu_items = []
page_func = HydrusData.Call( webbrowser.open, 'file://' + HC.HELP_DIR + '/downloader_parsers_formulae.html#html_formula' )
page_func = HydrusData.Call( ClientPaths.LaunchPathInWebBrowser, os.path.join( HC.HELP_DIR, 'downloader_parsers_formulae.html#html_formula' ) )
menu_items.append( ( 'normal', 'open the html formula help', 'Open the help page for html formulae in your web browser.', page_func ) )
@ -1170,7 +1170,7 @@ class EditJSONFormulaPanel( ClientGUIScrolledPanels.EditPanel ):
menu_items = []
page_func = HydrusData.Call( webbrowser.open, 'file://' + HC.HELP_DIR + '/downloader_parsers_formulae.html#json_formula' )
page_func = HydrusData.Call( ClientPaths.LaunchPathInWebBrowser, os.path.join( HC.HELP_DIR, 'downloader_parsers_formulae.html#json_formula' ) )
menu_items.append( ( 'normal', 'open the json formula help', 'Open the help page for json formulae in your web browser.', page_func ) )
@ -1400,7 +1400,7 @@ class EditContentParserPanel( ClientGUIScrolledPanels.EditPanel ):
menu_items = []
page_func = HydrusData.Call( webbrowser.open, 'file://' + HC.HELP_DIR + '/downloader_parsers_content_parsers.html#content_parsers' )
page_func = HydrusData.Call( ClientPaths.LaunchPathInWebBrowser, os.path.join( HC.HELP_DIR, 'downloader_parsers_content_parsers.html#content_parsers' ) )
menu_items.append( ( 'normal', 'open the content parsers help', 'Open the help page for content parsers in your web browser.', page_func ) )
@ -2410,7 +2410,7 @@ class EditPageParserPanel( ClientGUIScrolledPanels.EditPanel ):
menu_items = []
page_func = HydrusData.Call( webbrowser.open, 'file://' + HC.HELP_DIR + '/downloader_parsers_page_parsers.html#page_parsers' )
page_func = HydrusData.Call( ClientPaths.LaunchPathInWebBrowser, os.path.join( HC.HELP_DIR, 'downloader_parsers_page_parsers.html#page_parsers' ) )
menu_items.append( ( 'normal', 'open the page parser help', 'Open the help page for page parsers in your web browser.', page_func ) )
@ -4516,7 +4516,7 @@ class ScriptManagementControl( wx.Panel ):
for url in urls:
ClientGUIMenus.AppendMenuItem( self, menu, url, 'launch this url in your browser', webbrowser.open, url )
ClientGUIMenus.AppendMenuItem( self, menu, url, 'launch this url in your browser', ClientPaths.LaunchURLInWebBrowser, url )
HG.client_controller.PopupMenu( self, menu )

View File

@ -615,6 +615,11 @@ class PopupMessageManager( wx.Frame ):
size_and_position_needed = True
if not self.IsShown():
size_and_position_needed = True
if size_and_position_needed:
self._SizeAndPositionAndShow()
@ -640,6 +645,47 @@ class PopupMessageManager( wx.Frame ):
return False
def _DoDebugHide( self ):
parent = self.GetParent()
parent_iconized = parent.IsIconized()
# changing show status while parent iconised in Windows leads to grey window syndrome
windows_and_iconised = HC.PLATFORM_WINDOWS and parent_iconized
possibly_on_hidden_virtual_desktop = not ClientGUITopLevelWindows.MouseIsOnMyDisplay( parent )
going_to_bug_out_at_hide_or_show = windows_and_iconised or possibly_on_hidden_virtual_desktop
new_options = HG.client_controller.new_options
if new_options.GetBoolean( 'hide_message_manager_on_gui_iconise' ):
if parent.IsIconized():
self.Hide()
return
current_focus_tlp = wx.GetTopLevelParent( wx.Window.FindFocus() )
main_gui_is_active = current_focus_tlp in ( self, parent )
if new_options.GetBoolean( 'hide_message_manager_on_gui_deactive' ):
if not main_gui_is_active:
if not going_to_bug_out_at_hide_or_show:
self.Hide()
def _SizeAndPositionAndShow( self ):
try:
@ -653,18 +699,6 @@ class PopupMessageManager( wx.Frame ):
going_to_bug_out_at_hide_or_show = windows_and_iconised or possibly_on_hidden_virtual_desktop
new_options = HG.client_controller.new_options
if new_options.GetBoolean( 'hide_message_manager_on_gui_iconise' ):
if parent.IsIconized():
self.Hide()
return
current_focus_tlp = wx.GetTopLevelParent( wx.Window.FindFocus() )
main_gui_is_active = current_focus_tlp in ( self, parent )
@ -681,27 +715,6 @@ class PopupMessageManager( wx.Frame ):
if new_options.GetBoolean( 'hide_message_manager_on_gui_deactive' ):
if main_gui_is_active:
# gui can have focus even while minimised to the taskbar--let's not show in this case
if not self.IsShown() and parent.IsIconized():
return
else:
if not going_to_bug_out_at_hide_or_show:
self.Hide()
return
num_messages_displayed = self._message_vbox.GetItemCount()
there_is_stuff_to_display = num_messages_displayed > 0
@ -797,6 +810,17 @@ class PopupMessageManager( wx.Frame ):
return job_keys
def _OKToAlterUI( self ):
main_gui = self.GetParent()
not_on_hidden_or_virtual_display = ClientGUITopLevelWindows.MouseIsOnMyDisplay( main_gui )
main_gui_up = not main_gui.IsIconized()
return not_on_hidden_or_virtual_display and main_gui_up
def _TryToMergeMessage( self, job_key ):
if not job_key.HasVariable( 'popup_files_mergable' ):
@ -874,8 +898,6 @@ class PopupMessageManager( wx.Frame ):
self._SizeAndPositionAndShow()
def AddMessage( self, job_key ):
@ -890,7 +912,7 @@ class PopupMessageManager( wx.Frame ):
self._pending_job_keys.append( job_key )
if ClientGUITopLevelWindows.MouseIsOnMyDisplay( self.GetParent() ):
if self._OKToAlterUI():
self._CheckPending()
@ -937,9 +959,12 @@ class PopupMessageManager( wx.Frame ):
# OS X segfaults if this is instant
wx.CallAfter( window.Destroy )
self._SizeAndPositionAndShow()
self._CheckPending()
if self._OKToAlterUI():
self._SizeAndPositionAndShow()
self._CheckPending()
def DismissAll( self ):
@ -960,21 +985,29 @@ class PopupMessageManager( wx.Frame ):
def EventMove( self, event ):
self._SizeAndPositionAndShow()
if self._OKToAlterUI():
self._SizeAndPositionAndShow()
event.Skip()
def MakeSureEverythingFits( self ):
self._SizeAndPositionAndShow()
if self._OKToAlterUI():
self._SizeAndPositionAndShow()
def REPEATINGUpdate( self ):
try:
if ClientGUITopLevelWindows.MouseIsOnMyDisplay( self.GetParent() ):
self._DoDebugHide()
if self._OKToAlterUI():
self._Update()
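The `PopupMessageManager` changes above gather the scattered show/hide safety checks into a single `_OKToAlterUI` predicate that gates every reposition. A toy sketch of that guard, with booleans standing in for the actual wx display and iconise queries:

```python
def ok_to_alter_ui( mouse_on_my_display, main_gui_iconised ):
    # Only show or reposition the popup frame when the main gui is up and
    # we are on a visible display; hiding/showing while the parent is
    # iconised on Windows can grey out the window.
    return mouse_on_my_display and not main_gui_iconised
```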

View File

@ -19,6 +19,7 @@ import ClientImportOptions
import ClientNetworkingContexts
import ClientNetworkingDomain
import ClientParsing
import ClientPaths
import ClientSerialisable
import ClientTags
import collections
@ -31,7 +32,6 @@ import HydrusSerialisable
import HydrusTags
import HydrusText
import os
import webbrowser
import wx
class EditAccountTypePanel( ClientGUIScrolledPanels.EditPanel ):
@ -513,7 +513,7 @@ class EditDomainManagerInfoPanel( ClientGUIScrolledPanels.EditPanel ):
class EditDuplicateActionOptionsPanel( ClientGUIScrolledPanels.EditPanel ):
def __init__( self, parent, duplicate_action, duplicate_action_options ):
def __init__( self, parent, duplicate_action, duplicate_action_options, for_custom_action = False ):
ClientGUIScrolledPanels.EditPanel.__init__( self, parent )
@ -540,18 +540,36 @@ class EditDuplicateActionOptionsPanel( ClientGUIScrolledPanels.EditPanel ):
self._rating_service_actions.SetMinSize( ( 380, 120 ) )
add_rating_button = ClientGUICommon.BetterButton( rating_services_panel, 'add', self._AddRating )
edit_rating_button = ClientGUICommon.BetterButton( rating_services_panel, 'edit', self._EditRating )
if self._duplicate_action == HC.DUPLICATE_BETTER: # because there is only one valid action otherwise
edit_rating_button = ClientGUICommon.BetterButton( rating_services_panel, 'edit', self._EditRating )
delete_rating_button = ClientGUICommon.BetterButton( rating_services_panel, 'delete', self._DeleteRating )
#
self._delete_second_file = wx.CheckBox( self, label = 'delete worse file' )
self._sync_archive = wx.CheckBox( self, label = 'if one file is archived, archive the other as well' )
self._delete_both_files = wx.CheckBox( self, label = 'delete both files' )
self._delete_second_file = wx.CheckBox( self )
self._sync_archive = wx.CheckBox( self )
self._delete_both_files = wx.CheckBox( self )
self._delete_both_files.SetToolTip( 'This is only enabled on custom actions.' )
self._sync_urls_action = ClientGUICommon.BetterChoice( self )
self._sync_urls_action.Append( 'sync nothing', None )
if self._duplicate_action == HC.DUPLICATE_BETTER:
self._sync_urls_action.Append( HC.content_merge_string_lookup[ HC.CONTENT_MERGE_ACTION_COPY ], HC.CONTENT_MERGE_ACTION_COPY )
self._sync_urls_action.Append( HC.content_merge_string_lookup[ HC.CONTENT_MERGE_ACTION_TWO_WAY_MERGE ], HC.CONTENT_MERGE_ACTION_TWO_WAY_MERGE )
#
( tag_service_options, rating_service_options, delete_second_file, sync_archive, delete_both_files ) = duplicate_action_options.ToTuple()
( tag_service_options, rating_service_options, delete_second_file, sync_archive, delete_both_files, sync_urls_action ) = duplicate_action_options.ToTuple()
services_manager = HG.client_controller.services_manager
@ -585,14 +603,26 @@ class EditDuplicateActionOptionsPanel( ClientGUIScrolledPanels.EditPanel ):
#
if self._duplicate_action == HC.DUPLICATE_BETTER:
if not for_custom_action:
self._delete_both_files.Hide()
self._delete_both_files.Disable()
if self._duplicate_action in ( HC.DUPLICATE_ALTERNATE, HC.DUPLICATE_NOT_DUPLICATE ) and not for_custom_action:
self._sync_archive.Disable()
self._sync_urls_action.Disable()
self._sync_urls_action.SelectClientData( None )
else:
self._delete_second_file.Hide()
edit_rating_button.Hide() # because there is only one valid action in this case, and no tag censor to edit
self._sync_urls_action.SelectClientData( sync_urls_action )
if self._duplicate_action != HC.DUPLICATE_BETTER:
self._delete_second_file.Disable()
#
@ -611,7 +641,10 @@ class EditDuplicateActionOptionsPanel( ClientGUIScrolledPanels.EditPanel ):
button_hbox = wx.BoxSizer( wx.HORIZONTAL )
button_hbox.Add( add_rating_button, CC.FLAGS_VCENTER )
button_hbox.Add( edit_rating_button, CC.FLAGS_VCENTER )
if self._duplicate_action == HC.DUPLICATE_BETTER:
button_hbox.Add( edit_rating_button, CC.FLAGS_VCENTER )
button_hbox.Add( delete_rating_button, CC.FLAGS_VCENTER )
rating_services_panel.Add( self._rating_service_actions, CC.FLAGS_EXPAND_BOTH_WAYS )
@ -623,9 +656,17 @@ class EditDuplicateActionOptionsPanel( ClientGUIScrolledPanels.EditPanel ):
vbox.Add( tag_services_panel, CC.FLAGS_EXPAND_BOTH_WAYS )
vbox.Add( rating_services_panel, CC.FLAGS_EXPAND_BOTH_WAYS )
vbox.Add( self._delete_second_file, CC.FLAGS_LONE_BUTTON )
vbox.Add( self._sync_archive, CC.FLAGS_LONE_BUTTON )
vbox.Add( self._delete_both_files, CC.FLAGS_LONE_BUTTON )
rows = []
rows.append( ( 'delete worse file: ', self._delete_second_file ) )
rows.append( ( 'delete both files: ', self._delete_both_files ) )
rows.append( ( 'if one file is archived, archive the other as well: ', self._sync_archive ) )
rows.append( ( 'sync known urls?: ', self._sync_urls_action ) )
gridbox = ClientGUICommon.WrapInGrid( self, rows )
vbox.Add( gridbox, CC.FLAGS_EXPAND_SIZER_PERPENDICULAR )
self.SetSizer( vbox )
@ -957,8 +998,9 @@ class EditDuplicateActionOptionsPanel( ClientGUIScrolledPanels.EditPanel ):
delete_second_file = self._delete_second_file.GetValue()
sync_archive = self._sync_archive.GetValue()
delete_both_files = self._delete_both_files.GetValue()
sync_urls_action = self._sync_urls_action.GetChoice()
duplicate_action_options = ClientData.DuplicateActionOptions( tag_service_actions, rating_service_actions, delete_second_file, sync_archive, delete_both_files )
duplicate_action_options = ClientData.DuplicateActionOptions( tag_service_actions, rating_service_actions, delete_second_file, sync_archive, delete_both_files, sync_urls_action )
return duplicate_action_options
@ -2196,6 +2238,14 @@ class EditSubscriptionPanel( ClientGUIScrolledPanels.EditPanel ):
self._periodic_file_limit = ClientGUICommon.NoneableSpinCtrl( self._options_panel, '', none_phrase = 'get everything', min = 1, max = 1000000 )
self._periodic_file_limit.SetToolTip( 'If set, normal syncs will add no more than this many files. Otherwise, they will get everything up until they find a file they have seen before.' )
self._publish_files_to_popup_button = wx.CheckBox( self._options_panel )
self._publish_files_to_page = wx.CheckBox( self._options_panel )
self._merge_query_publish_events = wx.CheckBox( self._options_panel )
tt = 'If unchecked, each query will produce its own \'subscription_name: query\' button or page.'
self._merge_query_publish_events.SetToolTip( tt )
#
self._control_panel = ClientGUICommon.StaticBox( self, 'control' )
@ -2243,6 +2293,12 @@ class EditSubscriptionPanel( ClientGUIScrolledPanels.EditPanel ):
self._initial_file_limit.SetValue( initial_file_limit )
self._periodic_file_limit.SetValue( periodic_file_limit )
( publish_files_to_popup_button, publish_files_to_page, merge_query_publish_events ) = subscription.GetPresentationOptions()
self._publish_files_to_popup_button.SetValue( publish_files_to_popup_button )
self._publish_files_to_page.SetValue( publish_files_to_page )
self._merge_query_publish_events.SetValue( merge_query_publish_events )
self._paused.SetValue( paused )
#
@ -2258,6 +2314,9 @@ class EditSubscriptionPanel( ClientGUIScrolledPanels.EditPanel ):
rows.append( ( 'on first check, get at most this many files: ', self._initial_file_limit ) )
rows.append( ( 'on normal checks, get at most this many newer files: ', self._periodic_file_limit ) )
rows.append( ( 'if new files imported, publish them to a popup button: ', self._publish_files_to_popup_button ) )
rows.append( ( 'if new files imported, publish them to a page: ', self._publish_files_to_page ) )
rows.append( ( 'publish all queries\' new files to the same page/popup button: ', self._merge_query_publish_events ) )
gridbox = ClientGUICommon.WrapInGrid( self._options_panel, rows )
@ -2766,6 +2825,12 @@ class EditSubscriptionPanel( ClientGUIScrolledPanels.EditPanel ):
subscription.SetTuple( gallery_identifier, gallery_stream_identifiers, queries, self._checker_options, initial_file_limit, periodic_file_limit, paused, file_import_options, tag_import_options, self._no_work_until )
publish_files_to_popup_button = self._publish_files_to_popup_button.GetValue()
publish_files_to_page = self._publish_files_to_page.GetValue()
merge_query_publish_events = self._merge_query_publish_events.GetValue()
subscription.SetPresentationOptions( publish_files_to_popup_button, publish_files_to_page, merge_query_publish_events )
return subscription
@ -2870,7 +2935,7 @@ class EditSubscriptionsPanel( ClientGUIScrolledPanels.EditPanel ):
menu_items = []
page_func = HydrusData.Call( webbrowser.open, 'file://' + HC.HELP_DIR + '/getting_started_subscriptions.html' )
page_func = HydrusData.Call( ClientPaths.LaunchPathInWebBrowser, os.path.join( HC.HELP_DIR, 'getting_started_subscriptions.html' ) )
menu_items.append( ( 'normal', 'open the html subscriptions help', 'Open the help page for subscriptions in your web browser.', page_func ) )
@ -4503,13 +4568,21 @@ class EditURLMatchPanel( ClientGUIScrolledPanels.EditPanel ):
self._keep_matched_subdomains.SetToolTip( tt )
self._can_produce_multiple_files = wx.CheckBox( self )
tt = 'If checked, the client will not rely on instances of this URL class to predetermine \'already in db\' or \'previously deleted\' outcomes. This is important for post types like pixiv pages (which can ultimately be manga, and represent many pages) and tweets (which can have multiple images).'
tt += os.linesep * 2
tt += 'Most booru-type Post URLs only produce one file per URL and should not have this checked. Checking this avoids some bad logic where the client would falsely think that if it had seen one file at the URL, it had seen them all, but it then means the client has to download those pages\' content again whenever it sees them (so it can check against the direct File URLs, which are always considered one-file each).'
self._can_produce_multiple_files.SetToolTip( tt )
self._should_be_associated_with_files = wx.CheckBox( self )
tt = 'If checked, the client will try to remember this url with any files it ends up importing. It will present this url in \'known urls\' ui across the program.'
tt += os.linesep * 2
tt += 'If this URL is a File or Post URL and the client comes across it after having already downloaded it once, it can skip the redundant download since it knows it already has (or has already deleted) the file once before.'
tt += os.linesep * 2
tt += 'Turning this on is only useful if the URL is non-ephemeral (i.e. the URL will produce the exact same file(s) in six months\' time). It is usually not appropriate for gallery or thread urls, which alter regularly, but is for static Post URLs or some gallery exceptions such as multi-image tweets.'
tt += 'Turning this on is only useful if the URL is non-ephemeral (i.e. the URL will produce the exact same file(s) in six months\' time). It is usually not appropriate for booru gallery or thread urls, which alter regularly, but is for static Post URLs or some fixed doujin galleries.'
self._should_be_associated_with_files.SetToolTip( tt )
@ -4547,7 +4620,7 @@ class EditURLMatchPanel( ClientGUIScrolledPanels.EditPanel ):
self._normalised_url.SetToolTip( tt )
( url_type, preferred_scheme, netloc, match_subdomains, keep_matched_subdomains, path_components, parameters, api_lookup_converter, should_be_associated_with_files, example_url ) = url_match.ToTuple()
( url_type, preferred_scheme, netloc, match_subdomains, keep_matched_subdomains, path_components, parameters, api_lookup_converter, can_produce_multiple_files, should_be_associated_with_files, example_url ) = url_match.ToTuple()
self._api_lookup_converter = ClientGUIParsing.StringConverterButton( self, api_lookup_converter )
@ -4567,6 +4640,7 @@ class EditURLMatchPanel( ClientGUIScrolledPanels.EditPanel ):
self._match_subdomains.SetValue( match_subdomains )
self._keep_matched_subdomains.SetValue( keep_matched_subdomains )
self._can_produce_multiple_files.SetValue( can_produce_multiple_files )
self._should_be_associated_with_files.SetValue( should_be_associated_with_files )
self._path_components.AddDatas( path_components )
@ -4601,6 +4675,7 @@ class EditURLMatchPanel( ClientGUIScrolledPanels.EditPanel ):
rows.append( ( 'network location: ', self._netloc ) )
rows.append( ( 'match subdomains?: ', self._match_subdomains ) )
rows.append( ( 'keep matched subdomains?: ', self._keep_matched_subdomains ) )
rows.append( ( 'can produce multiple files: ', self._can_produce_multiple_files ) )
rows.append( ( 'should associate a \'known url\' with resulting files: ', self._should_be_associated_with_files ) )
gridbox_1 = ClientGUICommon.WrapInGrid( self, rows )
@ -4826,13 +4901,14 @@ class EditURLMatchPanel( ClientGUIScrolledPanels.EditPanel ):
netloc = self._netloc.GetValue()
match_subdomains = self._match_subdomains.GetValue()
keep_matched_subdomains = self._keep_matched_subdomains.GetValue()
can_produce_multiple_files = self._can_produce_multiple_files.GetValue()
should_be_associated_with_files = self._should_be_associated_with_files.GetValue()
path_components = self._path_components.GetData()
parameters = dict( self._parameters.GetData() )
api_lookup_converter = self._api_lookup_converter.GetValue()
example_url = self._example_url.GetValue()
url_match = ClientNetworkingDomain.URLMatch( name, url_match_key = url_match_key, url_type = url_type, preferred_scheme = preferred_scheme, netloc = netloc, match_subdomains = match_subdomains, keep_matched_subdomains = keep_matched_subdomains, path_components = path_components, parameters = parameters, api_lookup_converter = api_lookup_converter, should_be_associated_with_files = should_be_associated_with_files, example_url = example_url )
url_match = ClientNetworkingDomain.URLMatch( name, url_match_key = url_match_key, url_type = url_type, preferred_scheme = preferred_scheme, netloc = netloc, match_subdomains = match_subdomains, keep_matched_subdomains = keep_matched_subdomains, path_components = path_components, parameters = parameters, api_lookup_converter = api_lookup_converter, can_produce_multiple_files = can_produce_multiple_files, should_be_associated_with_files = should_be_associated_with_files, example_url = example_url )
return url_match
@ -4843,6 +4919,15 @@ class EditURLMatchPanel( ClientGUIScrolledPanels.EditPanel ):
url_type = url_match.GetURLType()
if url_type == HC.URL_TYPE_POST:
self._can_produce_multiple_files.Enable()
else:
self._can_produce_multiple_files.Disable()
if url_match.NormalisationIsAppropriate():
if self._match_subdomains.GetValue():
@ -4913,7 +4998,7 @@ class EditURLMatchPanel( ClientGUIScrolledPanels.EditPanel ):
if self._url_type.GetChoice() in ( HC.URL_TYPE_GALLERY, HC.URL_TYPE_WATCHABLE ):
message = 'Please note that it is only appropriate to associate a Gallery or Watchable URL with a file if that URL is non-ephemeral. It is only appropriate if the exact same URL will definitely give the same files in six months\' time (like a tweet or a fixed doujin chapter gallery).'
message = 'Please note that it is only appropriate to associate a Gallery or Watchable URL with a file if that URL is non-ephemeral. It is only appropriate if the exact same URL will definitely give the same files in six months\' time (like a fixed doujin chapter gallery).'
message += os.linesep * 2
message += 'If you are not sure what this means, turn this back off.'
@ -4942,7 +5027,9 @@ class EditURLMatchPanel( ClientGUIScrolledPanels.EditPanel ):
def EventURLTypeUpdate( self, event ):
if self._url_type.GetChoice() in ( HC.URL_TYPE_FILE, HC.URL_TYPE_POST ):
url_type = self._url_type.GetChoice()
if url_type in ( HC.URL_TYPE_FILE, HC.URL_TYPE_POST ):
self._should_be_associated_with_files.SetValue( True )
@ -4978,7 +5065,7 @@ class EditURLMatchesPanel( ClientGUIScrolledPanels.EditPanel ):
menu_items = []
page_func = HydrusData.Call( webbrowser.open, 'file://' + HC.HELP_DIR + '/downloader_url_classes.html' )
page_func = HydrusData.Call( ClientPaths.LaunchPathInWebBrowser, os.path.join( HC.HELP_DIR, 'downloader_url_classes.html' ) )
menu_items.append( ( 'normal', 'open the url classes help', 'Open the help page for url classes in your web browser.', page_func ) )
@ -4986,6 +5073,11 @@ class EditURLMatchesPanel( ClientGUIScrolledPanels.EditPanel ):
help_hbox = ClientGUICommon.WrapInText( help_button, self, 'help for this panel -->', wx.Colour( 0, 0, 255 ) )
self._url_class_checker = wx.TextCtrl( self )
self._url_class_checker.Bind( wx.EVT_TEXT, self.EventURLClassCheckerText )
self._url_class_checker_st = ClientGUICommon.BetterStaticText( self )
self._list_ctrl_panel = ClientGUIListCtrl.BetterListCtrlPanel( self )
self._list_ctrl = ClientGUIListCtrl.BetterListCtrl( self._list_ctrl_panel, 'url_matches', 15, 40, [ ( 'name', 36 ), ( 'type', 20 ), ( 'example (normalised) url', -1 ) ], self._ConvertDataToListCtrlTuples, delete_key_callback = self._Delete, activation_callback = self._Edit )
@ -5008,13 +5100,23 @@ class EditURLMatchesPanel( ClientGUIScrolledPanels.EditPanel ):
#
url_hbox = wx.BoxSizer( wx.HORIZONTAL )
url_hbox.Add( self._url_class_checker, CC.FLAGS_EXPAND_BOTH_WAYS )
url_hbox.Add( self._url_class_checker_st, CC.FLAGS_EXPAND_BOTH_WAYS )
vbox = wx.BoxSizer( wx.VERTICAL )
vbox.Add( help_hbox, CC.FLAGS_BUTTON_SIZER )
vbox.Add( url_hbox, CC.FLAGS_EXPAND_PERPENDICULAR )
vbox.Add( self._list_ctrl_panel, CC.FLAGS_EXPAND_BOTH_WAYS )
self.SetSizer( vbox )
#
self._UpdateURLClassCheckerText()
def _Add( self ):
@ -5112,6 +5214,51 @@ class EditURLMatchesPanel( ClientGUIScrolledPanels.EditPanel ):
return names
def _UpdateURLClassCheckerText( self ):
url = self._url_class_checker.GetValue()
if url == '':
text = '<-- Enter a URL here to see which url class it currently matches!'
else:
url_matches = self.GetValue()
domain_manager = ClientNetworkingDomain.NetworkDomainManager()
domain_manager.Initialise()
domain_manager.SetURLMatches( url_matches )
try:
url_match = domain_manager.GetURLMatch( url )
if url_match is None:
text = 'No match!'
else:
text = 'Matches "' + url_match.GetName() + '"'
except HydrusExceptions.URLMatchException as e:
text = HydrusData.ToUnicode( e )
self._url_class_checker_st.SetLabelText( text )
def EventURLClassCheckerText( self, event ):
self._UpdateURLClassCheckerText()
def GetValue( self ):
url_matches = self._list_ctrl.GetData()
@ -5133,19 +5280,29 @@ class EditURLMatchLinksPanel( ClientGUIScrolledPanels.EditPanel ):
self._network_engine = network_engine
self._display_list_ctrl_panel = ClientGUIListCtrl.BetterListCtrlPanel( self )
#
self._display_list_ctrl = ClientGUIListCtrl.BetterListCtrl( self._display_list_ctrl_panel, 'url_match_keys_to_display', 15, 36, [ ( 'url class', -1 ), ( 'display on media viewer?', 36 ) ], self._ConvertDisplayDataToListCtrlTuples, activation_callback = self._EditDisplay )
self._notebook = wx.Notebook( self )
#
self._display_list_ctrl_panel = ClientGUIListCtrl.BetterListCtrlPanel( self._notebook )
self._display_list_ctrl = ClientGUIListCtrl.BetterListCtrl( self._display_list_ctrl_panel, 'url_match_keys_to_display', 15, 36, [ ( 'url class', -1 ), ( 'url type', 20 ), ( 'display on media viewer?', 36 ) ], self._ConvertDisplayDataToListCtrlTuples, activation_callback = self._EditDisplay )
self._display_list_ctrl_panel.SetListCtrl( self._display_list_ctrl )
self._display_list_ctrl_panel.AddButton( 'edit', self._EditDisplay, enabled_only_on_selection = True )
self._api_pairs_list_ctrl = ClientGUIListCtrl.BetterListCtrl( self, 'url_match_api_pairs', 10, 36, [ ( 'url class', -1 ), ( 'api url class', 36 ) ], self._ConvertAPIPairDataToListCtrlTuples )
#
self._parser_list_ctrl_panel = ClientGUIListCtrl.BetterListCtrlPanel( self )
self._api_pairs_list_ctrl = ClientGUIListCtrl.BetterListCtrl( self._notebook, 'url_match_api_pairs', 10, 36, [ ( 'url class', -1 ), ( 'api url class', 36 ) ], self._ConvertAPIPairDataToListCtrlTuples )
self._parser_list_ctrl = ClientGUIListCtrl.BetterListCtrl( self._parser_list_ctrl_panel, 'url_match_keys_to_parser_keys', 15, 36, [ ( 'url class', -1 ), ( 'url type', 20 ), ( 'parser', 36 ) ], self._ConvertParserDataToListCtrlTuples, activation_callback = self._EditParser )
#
self._parser_list_ctrl_panel = ClientGUIListCtrl.BetterListCtrlPanel( self._notebook )
self._parser_list_ctrl = ClientGUIListCtrl.BetterListCtrl( self._parser_list_ctrl_panel, 'url_match_keys_to_parser_keys', 24, 36, [ ( 'url class', -1 ), ( 'url type', 20 ), ( 'parser', 36 ) ], self._ConvertParserDataToListCtrlTuples, activation_callback = self._EditParser )
self._parser_list_ctrl_panel.SetListCtrl( self._parser_list_ctrl )
@ -5159,11 +5316,6 @@ class EditURLMatchLinksPanel( ClientGUIScrolledPanels.EditPanel ):
for url_match in url_matches:
if not url_match.IsPostURL():
continue
url_match_key = url_match.GetMatchKey()
display = url_match_key in url_match_keys_to_display
@ -5173,7 +5325,7 @@ class EditURLMatchLinksPanel( ClientGUIScrolledPanels.EditPanel ):
self._display_list_ctrl.AddDatas( listctrl_data )
self._display_list_ctrl.Sort( 0 )
self._display_list_ctrl.Sort( 1 )
#
@ -5225,33 +5377,37 @@ class EditURLMatchLinksPanel( ClientGUIScrolledPanels.EditPanel ):
#
self._notebook.AddPage( self._parser_list_ctrl_panel, 'parser links' )
self._notebook.AddPage( self._api_pairs_list_ctrl, 'api link review' )
self._notebook.AddPage( self._display_list_ctrl_panel, 'media viewer display' )
#
vbox = wx.BoxSizer( wx.VERTICAL )
vbox.Add( self._display_list_ctrl_panel, CC.FLAGS_EXPAND_BOTH_WAYS )
vbox.Add( self._api_pairs_list_ctrl, CC.FLAGS_EXPAND_PERPENDICULAR )
vbox.Add( self._parser_list_ctrl_panel, CC.FLAGS_EXPAND_BOTH_WAYS )
vbox.Add( self._notebook, CC.FLAGS_EXPAND_BOTH_WAYS )
self.SetSizer( vbox )
def _ClearParser( self ):
with ClientGUIDialogs.DialogYesNo( self, 'Clear all the linked parsers?' ) as dlg:
with ClientGUIDialogs.DialogYesNo( self, 'Clear all the selected linked parsers?' ) as dlg:
if dlg.ShowModal() == wx.ID_YES:
for data in self._parser_list_ctrl.GetData( only_selected = True ):
( url_match_key, parser_key ) = data
self._parser_list_ctrl.DeleteDatas( ( data, ) )
( url_match_key, parser_key ) = data
new_data = ( url_match_key, None )
self._parser_list_ctrl.AddDatas( ( new_data, ) )
self._parser_list_ctrl.Sort()
self._parser_list_ctrl.Sort()
@ -5276,9 +5432,13 @@ class EditURLMatchLinksPanel( ClientGUIScrolledPanels.EditPanel ):
( url_match_key, display ) = data
url_match_name = self._url_match_keys_to_url_matches[ url_match_key ].GetName()
url_match = self._url_match_keys_to_url_matches[ url_match_key ]
url_match_name = url_match.GetName()
url_type = url_match.GetURLType()
pretty_name = url_match_name
pretty_url_type = HC.url_type_string_lookup[ url_type ]
if display:
@ -5289,8 +5449,8 @@ class EditURLMatchLinksPanel( ClientGUIScrolledPanels.EditPanel ):
pretty_display = 'no'
display_tuple = ( pretty_name, pretty_display )
sort_tuple = ( url_match_name, display )
display_tuple = ( pretty_name, pretty_url_type, pretty_display )
sort_tuple = ( url_match_name, pretty_url_type, display )
return ( display_tuple, sort_tuple )
@ -5323,7 +5483,7 @@ class EditURLMatchLinksPanel( ClientGUIScrolledPanels.EditPanel ):
pretty_parser_name = parser_name
display_tuple = ( pretty_url_match_name, pretty_url_type, pretty_parser_name )
sort_tuple = ( url_match_name, url_type, parser_name )
sort_tuple = ( url_match_name, pretty_url_type, parser_name )
return ( display_tuple, sort_tuple )
@ -5397,15 +5557,13 @@ class EditURLMatchLinksPanel( ClientGUIScrolledPanels.EditPanel ):
self._parser_list_ctrl.Sort()
self._parser_list_ctrl.Sort()
def _GapsExist( self ):
parser_keys = [ parser_key for ( url_match_key, parser_key ) in self._parser_list_ctrl.GetData() ]
return None in parser_keys
return None in ( parser_key for ( url_match_key, parser_key ) in self._parser_list_ctrl.GetData() )
def _LinksOnCurrentSelection( self ):

View File

@ -42,7 +42,6 @@ import os
import random
import traceback
import urlparse
import webbrowser
import wx
class ManageAccountTypesPanel( ClientGUIScrolledPanels.ManagePanel ):
@ -851,13 +850,13 @@ class ManageClientServicesPanel( ClientGUIScrolledPanels.ManagePanel ):
wx.CallAfter( wx_setkey, access_key_encoded )
wx.MessageBox( 'Looks good!' )
wx.CallAfter( wx.MessageBox, 'Looks good!' )
except Exception as e:
HydrusData.PrintException( e )
wx.MessageBox( 'Had a problem: ' + HydrusData.ToUnicode( e ) )
wx.CallAfter( wx.MessageBox, 'Had a problem: ' + HydrusData.ToUnicode( e ) )
finally:
@ -939,23 +938,23 @@ class ManageClientServicesPanel( ClientGUIScrolledPanels.ManagePanel ):
if not response[ 'verified' ]:
wx.MessageBox( 'That access key was not recognised!' )
wx.CallAfter( wx.MessageBox, 'That access key was not recognised!' )
else:
wx.MessageBox( 'Everything looks ok!' )
wx.CallAfter( wx.MessageBox, 'Everything looks ok!' )
except HydrusExceptions.WrongServiceTypeException:
wx.MessageBox( 'Connection was made, but the service was not a ' + HC.service_string_lookup[ self._service_type ] + '.' )
wx.CallAfter( wx.MessageBox, 'Connection was made, but the service was not a ' + HC.service_string_lookup[ self._service_type ] + '.' )
return
except HydrusExceptions.NetworkException as e:
wx.MessageBox( 'Network problem: ' + HydrusData.ToUnicode( e ) )
wx.CallAfter( wx.MessageBox, 'Network problem: ' + HydrusData.ToUnicode( e ) )
return
@ -2047,7 +2046,11 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
#
self._maintenance_vacuum_period_days = ClientGUICommon.NoneableSpinCtrl( self._maintenance_panel, '', min = 1, max = 365, none_phrase = 'do not automatically vacuum' )
self._maintenance_vacuum_period_days = ClientGUICommon.NoneableSpinCtrl( self._maintenance_panel, '', min = 28, max = 365, none_phrase = 'do not automatically vacuum' )
tts = 'Vacuuming is a kind of full defrag of the database\'s internal page table. It can take a long time (1MB/s) on a slow drive and does not need to be done often, so feel free to set this at 90 days+.'
self._maintenance_vacuum_period_days.SetToolTip( tts )
#
@ -2261,6 +2264,8 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
wx.Panel.__init__( self, parent )
self._new_options = HG.client_controller.new_options
self._export_location = wx.DirPickerCtrl( self, style = wx.DIRP_USE_TEXTCTRL )
self._delete_to_recycle_bin = wx.CheckBox( self, label = '' )
@ -2271,8 +2276,12 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
self._trash_max_age = ClientGUICommon.NoneableSpinCtrl( self, '', none_phrase = 'no age limit', min = 0, max = 8640 )
self._trash_max_size = ClientGUICommon.NoneableSpinCtrl( self, '', none_phrase = 'no size limit', min = 0, max = 20480 )
self._temp_path_override = wx.DirPickerCtrl( self, style = wx.DIRP_USE_TEXTCTRL )
mime_panel = ClientGUICommon.StaticBox( self, '\'open externally\' launch paths' )
self._web_browser_path = wx.TextCtrl( mime_panel )
self._mime_launch_listctrl = ClientGUIListCtrl.BetterListCtrl( mime_panel, 'mime_launch', 15, 30, [ ( 'mime', 20 ), ( 'launch path', -1 ) ], self._ConvertMimeToListCtrlTuples, activation_callback = self._EditMimeLaunch )
#
@ -2293,7 +2302,19 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
self._trash_max_age.SetValue( HC.options[ 'trash_max_age' ] )
self._trash_max_size.SetValue( HC.options[ 'trash_max_size' ] )
self._new_options = HG.client_controller.new_options
temp_path_override = self._new_options.GetNoneableString( 'temp_path_override' )
if temp_path_override is not None:
self._temp_path_override.SetPath( temp_path_override )
web_browser_path = self._new_options.GetNoneableString( 'web_browser_path' )
if web_browser_path is not None:
self._web_browser_path.SetValue( web_browser_path )
for mime in HC.SEARCHABLE_MIMES:
@ -2308,6 +2329,10 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
vbox = wx.BoxSizer( wx.VERTICAL )
text = 'If you set the default export directory blank, the client will use \'hydrus_export\' under the current user\'s home directory.'
vbox.Add( ClientGUICommon.BetterStaticText( self, text ), CC.FLAGS_CENTER )
rows = []
rows.append( ( 'Default export directory: ', self._export_location ) )
@ -2316,15 +2341,29 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
rows.append( ( 'Remove files from view when they are sent to the trash: ', self._remove_trashed_files ) )
rows.append( ( 'Number of hours a file can be in the trash before being deleted: ', self._trash_max_age ) )
rows.append( ( 'Maximum size of trash (MB): ', self._trash_max_size ) )
rows.append( ( 'BUGFIX: Temp folder override (set blank for OS default): ', self._temp_path_override ) )
gridbox = ClientGUICommon.WrapInGrid( self, rows )
vbox.Add( gridbox, CC.FLAGS_EXPAND_SIZER_PERPENDICULAR )
text = 'Setting a specific web browser path here--like \'C:\\program files\\firefox\\firefox.exe "%path%"\'--can help with the \'share->open->in web browser\' command, which can be buggy when relying on OS defaults, particularly on Windows. It also fixes #anchors, which some OSes drop when using the default launch method. Use the same %path% format as the \'open externally\' commands below.'
st = ClientGUICommon.BetterStaticText( mime_panel, text )
st.Wrap( 800 )
mime_panel.Add( st, CC.FLAGS_EXPAND_PERPENDICULAR )
rows = []
rows.append( ( 'Manual web browser launch path: ', self._web_browser_path ) )
gridbox = ClientGUICommon.WrapInGrid( mime_panel, rows )
mime_panel.Add( gridbox, CC.FLAGS_EXPAND_PERPENDICULAR )
mime_panel.Add( self._mime_launch_listctrl, CC.FLAGS_EXPAND_BOTH_WAYS )
text = 'If you set the default export directory blank, the client will use \'hydrus_export\' under the current user\'s home directory.'
vbox.Add( ClientGUICommon.BetterStaticText( self, text ), CC.FLAGS_CENTER )
vbox.Add( gridbox, CC.FLAGS_EXPAND_SIZER_PERPENDICULAR )
vbox.Add( mime_panel, CC.FLAGS_EXPAND_BOTH_WAYS )
self.SetSizer( vbox )
@ -2408,6 +2447,24 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
HC.options[ 'trash_max_age' ] = self._trash_max_age.GetValue()
HC.options[ 'trash_max_size' ] = self._trash_max_size.GetValue()
temp_path_override = self._temp_path_override.GetPath()
if temp_path_override == '':
temp_path_override = None
self._new_options.SetNoneableString( 'temp_path_override', temp_path_override )
web_browser_path = self._web_browser_path.GetValue()
if web_browser_path == '':
web_browser_path = None
self._new_options.SetNoneableString( 'web_browser_path', web_browser_path )
for ( mime, launch_path ) in self._mime_launch_listctrl.GetData():
self._new_options.SetMimeLaunch( mime, launch_path )
@ -5996,12 +6053,12 @@ class ManageURLsPanel( ClientGUIScrolledPanels.ManagePanel ):
if len( self._urls_to_add ) > 0:
content_updates.append( HydrusData.ContentUpdate( HC.CONTENT_TYPE_URLS, HC.CONTENT_UPDATE_ADD, ( hash, self._urls_to_add ) ) )
content_updates.append( HydrusData.ContentUpdate( HC.CONTENT_TYPE_URLS, HC.CONTENT_UPDATE_ADD, ( self._urls_to_add, ( hash, ) ) ) )
if len( self._urls_to_remove ) > 0:
content_updates.append( HydrusData.ContentUpdate( HC.CONTENT_TYPE_URLS, HC.CONTENT_UPDATE_DELETE, ( hash, self._urls_to_remove ) ) )
content_updates.append( HydrusData.ContentUpdate( HC.CONTENT_TYPE_URLS, HC.CONTENT_UPDATE_DELETE, ( self._urls_to_remove, ( hash, ) ) ) )
if len( content_updates ) > 0:

View File

@ -16,6 +16,7 @@ import ClientGUITime
import ClientGUITopLevelWindows
import ClientNetworking
import ClientNetworkingContexts
import ClientPaths
import ClientTags
import ClientThreading
import collections
@ -32,7 +33,6 @@ import sys
import threading
import time
import traceback
import webbrowser
import wx
try:
@ -327,7 +327,7 @@ class MigrateDatabasePanel( ClientGUIScrolledPanels.ReviewPanel ):
menu_items = []
page_func = HydrusData.Call( webbrowser.open, 'file://' + HC.HELP_DIR + '/database_migration.html' )
page_func = HydrusData.Call( ClientPaths.LaunchPathInWebBrowser, os.path.join( HC.HELP_DIR, 'database_migration.html' ) )
menu_items.append( ( 'normal', 'open the html migration help', 'Open the help page for database migration in your web browser.', page_func ) )
@ -1757,33 +1757,40 @@ class ReviewExportFilesPanel( ClientGUIScrolledPanels.ReviewPanel ):
def EventExportTagTxtsChanged( self, event ):
services_manager = HG.client_controller.services_manager
tag_services = services_manager.GetServices( HC.TAG_SERVICES )
list_of_tuples = [ ( service.GetName(), service.GetServiceKey(), service.GetServiceKey() in self._neighbouring_txt_tag_service_keys ) for service in tag_services ]
list_of_tuples.sort()
with ClientGUIDialogs.DialogCheckFromList( self, 'select tag services', list_of_tuples ) as dlg:
if self._export_tag_txts.GetValue() == True:
if dlg.ShowModal() == wx.ID_OK:
services_manager = HG.client_controller.services_manager
tag_services = services_manager.GetServices( HC.TAG_SERVICES )
list_of_tuples = [ ( service.GetName(), service.GetServiceKey(), service.GetServiceKey() in self._neighbouring_txt_tag_service_keys ) for service in tag_services ]
list_of_tuples.sort()
with ClientGUIDialogs.DialogCheckFromList( self, 'select tag services', list_of_tuples ) as dlg:
self._neighbouring_txt_tag_service_keys = dlg.GetChecked()
if len( self._neighbouring_txt_tag_service_keys ) == 0:
if dlg.ShowModal() == wx.ID_OK:
self._export_tag_txts.SetValue( False )
self._neighbouring_txt_tag_service_keys = dlg.GetChecked()
if len( self._neighbouring_txt_tag_service_keys ) == 0:
self._export_tag_txts.SetValue( False )
else:
self._export_tag_txts.SetValue( True )
else:
self._export_tag_txts.SetValue( True )
self._export_tag_txts.SetValue( False )
else:
self._export_tag_txts.SetValue( False )
else:
self._neighbouring_txt_tag_service_keys = []

View File

@ -7,6 +7,7 @@ import ClientGUISerialisable
import ClientGUIScrolledPanels
import ClientGUITopLevelWindows
import ClientImporting
import ClientPaths
import ClientSerialisable
import ClientThreading
import HydrusConstants as HC
@ -15,7 +16,6 @@ import HydrusGlobals as HG
import HydrusPaths
import HydrusText
import os
import webbrowser
import wx
class EditSeedCachePanel( ClientGUIScrolledPanels.EditPanel ):
@ -175,7 +175,7 @@ class EditSeedCachePanel( ClientGUIScrolledPanels.EditPanel ):
for seed in seeds:
webbrowser.open( seed.seed_data )
ClientPaths.LaunchURLInWebBrowser( seed.seed_data )
else:
@ -209,10 +209,21 @@ class EditSeedCachePanel( ClientGUIScrolledPanels.EditPanel ):
def _ShowMenuIfNeeded( self ):
if self._list_ctrl.HasSelected() > 0:
selected_seeds = self._list_ctrl.GetData( only_selected = True )
if len( selected_seeds ) > 0:
menu = wx.Menu()
can_show_files_in_new_page = True in ( seed.HasHash() for seed in selected_seeds )
if can_show_files_in_new_page:
ClientGUIMenus.AppendMenuItem( self, menu, 'open selected import files in a new page', 'Show all the known selected files in a new thumbnail page. This is complicated, so cannot always be guaranteed, even if the import says \'success\'.', self._ShowSelectionInNewPage )
ClientGUIMenus.AppendSeparator( menu )
ClientGUIMenus.AppendMenuItem( self, menu, 'copy sources', 'Copy all the selected sources to clipboard.', self._CopySelectedSeedData )
ClientGUIMenus.AppendMenuItem( self, menu, 'copy notes', 'Copy all the selected notes to clipboard.', self._CopySelectedNotes )
@ -230,6 +241,24 @@ class EditSeedCachePanel( ClientGUIScrolledPanels.EditPanel ):
def _ShowSelectionInNewPage( self ):
hashes = []
for seed in self._list_ctrl.GetData( only_selected = True ):
if seed.HasHash():
hashes.append( seed.GetHash() )
if len( hashes ) > 0:
HG.client_controller.pub( 'new_page_query', CC.LOCAL_FILE_SERVICE_KEY, initial_hashes = hashes )
def _UpdateListCtrl( self, seeds ):
seeds_to_add = []
@ -310,9 +339,9 @@ class SeedCacheButton( ClientGUICommon.BetterBitmapButton ):
self.Bind( wx.EVT_RIGHT_DOWN, self.EventShowMenu )
def _ClearProcessed( self ):
def _ClearSeeds( self, statuses_to_remove ):
message = 'Are you sure you want to delete all the processed (i.e. anything with a non-blank status in the larger window) file imports? This is useful for cleaning up and de-laggifying a very large list, but not much else.'
message = 'Are you sure you want to delete all the ' + '/'.join( ( CC.status_string_lookup[ status ] for status in statuses_to_remove ) ) + ' file import items? This is useful for cleaning up and de-laggifying a very large list, but be careful you aren\'t removing something you would want to revisit or something a watcher/subscription may be using for future check time calculations.'
with ClientGUIDialogs.DialogYesNo( self, message ) as dlg:
@ -320,22 +349,7 @@ class SeedCacheButton( ClientGUICommon.BetterBitmapButton ):
seed_cache = self._seed_cache_get_callable()
seed_cache.RemoveProcessedSeeds()
def _ClearSuccessful( self ):
message = 'Are you sure you want to delete all the successful/already in db file imports? This is useful for cleaning up and de-laggifying a very large list and leaving only failed and otherwise skipped entries.'
with ClientGUIDialogs.DialogYesNo( self, message ) as dlg:
if dlg.ShowModal() == wx.ID_YES:
seed_cache = self._seed_cache_get_callable()
seed_cache.RemoveSuccessfulSeeds()
seed_cache.RemoveSeedsByStatus( statuses_to_remove )
@ -513,27 +527,36 @@ class SeedCacheButton( ClientGUICommon.BetterBitmapButton ):
seed_cache = self._seed_cache_get_callable()
num_seeds = len( seed_cache )
num_successful = seed_cache.GetSeedCount( CC.STATUS_SUCCESSFUL_AND_NEW ) + seed_cache.GetSeedCount( CC.STATUS_SUCCESSFUL_BUT_REDUNDANT )
num_deleted_and_vetoed = seed_cache.GetSeedCount( CC.STATUS_DELETED ) + seed_cache.GetSeedCount( CC.STATUS_VETOED )
num_errors = seed_cache.GetSeedCount( CC.STATUS_ERROR )
num_skipped = seed_cache.GetSeedCount( CC.STATUS_SKIPPED )
if num_errors > 0:
ClientGUIMenus.AppendMenuItem( self, menu, 'retry ' + HydrusData.ConvertIntToPrettyString( num_errors ) + ' error failures', 'Tell this cache to reattempt all its error failures.', self._RetryErrors )
num_unknown = seed_cache.GetSeedCount( CC.STATUS_UNKNOWN )
num_successful = seed_cache.GetSeedCount( CC.STATUS_SUCCESSFUL_AND_NEW ) + seed_cache.GetSeedCount( CC.STATUS_SUCCESSFUL_BUT_REDUNDANT )
if num_successful > 0:
ClientGUIMenus.AppendMenuItem( self, menu, 'delete ' + HydrusData.ConvertIntToPrettyString( num_successful ) + ' \'successful\' file imports from the queue', 'Tell this cache to clear out successful/already in db files, reducing the size of the queue.', self._ClearSuccessful )
num_deletees = num_successful
ClientGUIMenus.AppendMenuItem( self, menu, 'delete ' + HydrusData.ConvertIntToPrettyString( num_deletees ) + ' successful file import items from the queue', 'Tell this cache to clear out successful files, reducing the size of the queue.', self._ClearSeeds, ( CC.STATUS_SUCCESSFUL_AND_NEW, CC.STATUS_SUCCESSFUL_BUT_REDUNDANT ) )
num_processed = len( seed_cache ) - num_unknown
if num_processed > 0 and num_processed != num_successful:
if num_deleted_and_vetoed > 0:
ClientGUIMenus.AppendMenuItem( self, menu, 'delete ' + HydrusData.ConvertIntToPrettyString( num_processed ) + ' \'processed\' file imports from the queue', 'Tell this cache to clear out processed files, reducing the size of the queue.', self._ClearProcessed )
num_deletees = num_deleted_and_vetoed
ClientGUIMenus.AppendMenuItem( self, menu, 'delete ' + HydrusData.ConvertIntToPrettyString( num_deletees ) + ' deleted/ignored file import items from the queue', 'Tell this cache to clear out deleted and ignored files, reducing the size of the queue.', self._ClearSeeds, ( CC.STATUS_DELETED, CC.STATUS_VETOED ) )
if num_errors + num_skipped > 0:
num_deletees = num_errors + num_skipped
ClientGUIMenus.AppendMenuItem( self, menu, 'delete ' + HydrusData.ConvertIntToPrettyString( num_deletees ) + ' error/skipped file import items from the queue', 'Tell this cache to clear out error and skipped files, reducing the size of the queue.', self._ClearSeeds, ( CC.STATUS_ERROR, CC.STATUS_SKIPPED ) )
ClientGUIMenus.AppendSeparator( menu )
@ -560,11 +583,12 @@ class SeedCacheButton( ClientGUICommon.BetterBitmapButton ):
class SeedCacheStatusControl( wx.Panel ):
def __init__( self, parent, controller ):
def __init__( self, parent, controller, page_key = None ):
wx.Panel.__init__( self, parent, style = wx.BORDER_DOUBLE )
self._controller = controller
self._page_key = page_key
self._seed_cache = None
@ -655,7 +679,26 @@ class SeedCacheStatusControl( wx.Panel ):
def TIMERUIUpdate( self ):
if self._controller.gui.IShouldRegularlyUpdate( self ):
do_it_anyway = False
if self._seed_cache is not None:
( import_summary, ( num_done, num_to_do ) ) = self._seed_cache.GetStatus()
( old_num_done, old_num_to_do ) = self._progress_gauge.GetValueRange()
if old_num_done != num_done or old_num_to_do != num_to_do:
if self._page_key is not None:
do_it_anyway = True # to update the gauge
HG.client_controller.pub( 'refresh_page_name', self._page_key )
if self._controller.gui.IShouldRegularlyUpdate( self ) or do_it_anyway:
self._Update()

View File

@ -10,6 +10,7 @@ import ClientNetworkingContexts
import ClientNetworkingDomain
import ClientNetworkingJobs
import ClientParsing
import ClientPaths
import ClientTags
import ClientThreading
import collections
@ -52,20 +53,7 @@ def GenerateDownloaderNetworkJobFactory( page_key ):
return network_job_factory
def GenerateSubscriptionNetworkJobFactory( subscription_key ):
def network_job_factory( *args, **kwargs ):
network_job = ClientNetworkingJobs.NetworkJobSubscription( subscription_key, *args, **kwargs )
network_job.OverrideBandwidth( 30 )
return network_job
return network_job_factory
def GenerateSubscriptionNetworkJobPresentationContextFactory( job_key ):
def GenerateMultiplePopupNetworkJobPresentationContextFactory( job_key ):
def network_job_presentation_context_factory( network_job ):
@ -84,6 +72,38 @@ def GenerateSubscriptionNetworkJobPresentationContextFactory( job_key ):
return network_job_presentation_context_factory
def GenerateSinglePopupNetworkJobPresentationContextFactory( job_key ):
def network_job_presentation_context_factory( network_job ):
def enter_call():
job_key.SetVariable( 'popup_network_job', network_job )
def exit_call():
job_key.DeleteVariable( 'popup_network_job' )
return NetworkJobPresentationContext( enter_call, exit_call )
return network_job_presentation_context_factory
def GenerateSubscriptionNetworkJobFactory( subscription_key ):
def network_job_factory( *args, **kwargs ):
network_job = ClientNetworkingJobs.NetworkJobSubscription( subscription_key, *args, **kwargs )
network_job.OverrideBandwidth( 30 )
return network_job
return network_job_factory
def GenerateWatcherNetworkJobFactory( thread_key ):
def network_job_factory( *args, **kwargs ):
@ -95,67 +115,78 @@ def GenerateWatcherNetworkJobFactory( thread_key ):
return network_job_factory
def PublishPresentationHashes( name, hashes, publish_to_popup_button, publish_files_to_page ):
if publish_to_popup_button:
files_job_key = ClientThreading.JobKey()
files_job_key.SetVariable( 'popup_files_mergable', True )
files_job_key.SetVariable( 'popup_files', ( list( hashes ), name ) )
HG.client_controller.pub( 'message', files_job_key )
if publish_files_to_page:
HG.client_controller.pub( 'imported_files_to_page', list( hashes ), name )
def THREADDownloadURL( job_key, url, url_string ):
job_key.SetVariable( 'popup_title', url_string )
job_key.SetVariable( 'popup_text_1', 'initialising' )
job_key.SetVariable( 'popup_text_1', 'downloading and importing' )
( os_file_handle, temp_path ) = HydrusPaths.GetTempPath()
#
try:
file_import_options = HG.client_controller.new_options.GetDefaultFileImportOptions( 'loud' )
def network_job_factory( *args, **kwargs ):
network_job = ClientNetworkingJobs.NetworkJob( 'GET', url, temp_path = temp_path )
network_job = ClientNetworkingJobs.NetworkJob( *args, **kwargs )
network_job.OverrideBandwidth()
HG.client_controller.network_engine.AddJob( network_job )
return network_job
job_key.SetVariable( 'popup_network_job', network_job )
network_job_presentation_context_factory = GenerateSinglePopupNetworkJobPresentationContextFactory( job_key )
seed = Seed( SEED_TYPE_URL, url )
#
try:
try:
seed.DownloadAndImportRawFile( url, file_import_options, network_job_factory, network_job_presentation_context_factory )
status = seed.status
if status in CC.SUCCESSFUL_IMPORT_STATES:
network_job.WaitUntilDone()
if status == CC.STATUS_SUCCESSFUL_AND_NEW:
job_key.SetVariable( 'popup_text_1', 'successful!' )
elif status == CC.STATUS_SUCCESSFUL_BUT_REDUNDANT:
job_key.SetVariable( 'popup_text_1', 'was already in the database!' )
except ( HydrusExceptions.ShutdownException, HydrusExceptions.CancelledException, HydrusExceptions.NetworkException ):
hash = seed.GetHash()
job_key.Cancel()
job_key.SetVariable( 'popup_files', ( [ hash ], 'download' ) )
raise
elif status == CC.STATUS_DELETED:
job_key.SetVariable( 'popup_text_1', 'had already been deleted!' )
job_key.DeleteVariable( 'popup_network_job' )
job_key.SetVariable( 'popup_text_1', 'importing' )
file_import_job = FileImportJob( temp_path )
( status, hash ) = HG.client_controller.client_files_manager.ImportFile( file_import_job )
finally:
HydrusPaths.CleanUpTempPath( os_file_handle, temp_path )
job_key.Finish()
if status in CC.SUCCESSFUL_IMPORT_STATES:
if status == CC.STATUS_SUCCESSFUL_AND_NEW:
job_key.SetVariable( 'popup_text_1', 'successful!' )
elif status == CC.STATUS_SUCCESSFUL_BUT_REDUNDANT:
job_key.SetVariable( 'popup_text_1', 'was already in the database!' )
job_key.SetVariable( 'popup_files', ( [ hash ], 'download' ) )
elif status == CC.STATUS_DELETED:
job_key.SetVariable( 'popup_text_1', 'had already been deleted!' )
job_key.Finish()
def THREADDownloadURLs( job_key, urls, title ):
job_key.SetVariable( 'popup_title', title )
@ -169,6 +200,19 @@ def THREADDownloadURLs( job_key, urls, title ):
presentation_hashes = []
presentation_hashes_fast = set()
file_import_options = HG.client_controller.new_options.GetDefaultFileImportOptions( 'loud' )
def network_job_factory( *args, **kwargs ):
network_job = ClientNetworkingJobs.NetworkJob( *args, **kwargs )
network_job.OverrideBandwidth()
return network_job
network_job_presentation_context_factory = GenerateMultiplePopupNetworkJobPresentationContextFactory( job_key )
for ( i, url ) in enumerate( urls ):
( i_paused, should_quit ) = job_key.WaitIfNeeded()
@ -181,73 +225,45 @@ def THREADDownloadURLs( job_key, urls, title ):
job_key.SetVariable( 'popup_text_1', HydrusData.ConvertValueRangeToPrettyString( i + 1, len( urls ) ) )
job_key.SetVariable( 'popup_gauge_1', ( i + 1, len( urls ) ) )
( os_file_handle, temp_path ) = HydrusPaths.GetTempPath()
seed = Seed( SEED_TYPE_URL, url )
try:
network_job = ClientNetworkingJobs.NetworkJob( 'GET', url, temp_path = temp_path )
seed.DownloadAndImportRawFile( url, file_import_options, network_job_factory, network_job_presentation_context_factory )
network_job.OverrideBandwidth()
status = seed.status
HG.client_controller.network_engine.AddJob( network_job )
job_key.SetVariable( 'popup_network_job', network_job )
try:
if status in CC.SUCCESSFUL_IMPORT_STATES:
network_job.WaitUntilDone()
if status == CC.STATUS_SUCCESSFUL_AND_NEW:
num_successful += 1
elif status == CC.STATUS_SUCCESSFUL_BUT_REDUNDANT:
num_redundant += 1
except ( HydrusExceptions.ShutdownException, HydrusExceptions.CancelledException, HydrusExceptions.NetworkException ):
hash = seed.GetHash()
break
if hash not in presentation_hashes_fast:
presentation_hashes.append( hash )
presentation_hashes_fast.add( hash )
elif status == CC.STATUS_DELETED:
num_deleted += 1
try:
job_key.SetVariable( 'popup_text_2', 'importing' )
file_import_job = FileImportJob( temp_path )
( status, hash ) = HG.client_controller.client_files_manager.ImportFile( file_import_job )
except Exception as e:
job_key.DeleteVariable( 'popup_text_2' )
HydrusData.Print( url + ' failed to import!' )
HydrusData.PrintException( e )
num_failed += 1
continue
except Exception as e:
finally:
num_failed += 1
HydrusPaths.CleanUpTempPath( os_file_handle, temp_path )
if status in CC.SUCCESSFUL_IMPORT_STATES:
if status == CC.STATUS_SUCCESSFUL_AND_NEW:
num_successful += 1
elif status == CC.STATUS_SUCCESSFUL_BUT_REDUNDANT:
num_redundant += 1
if hash not in presentation_hashes_fast:
presentation_hashes.append( hash )
presentation_hashes_fast.add( hash )
elif status == CC.STATUS_DELETED:
num_deleted += 1
HydrusData.Print( url + ' failed to import!' )
HydrusData.PrintException( e )
@ -283,7 +299,6 @@ def THREADDownloadURLs( job_key, urls, title ):
job_key.DeleteVariable( 'popup_gauge_1' )
job_key.DeleteVariable( 'popup_text_2' )
job_key.Finish()
@ -435,7 +450,9 @@ class FileImportJob( object ):
self._hash = HydrusFileHandling.GetHashFromPath( self._temp_path )
( self._pre_import_status, hash, note ) = HG.client_controller.Read( 'hash_status', 'sha256', self._hash )
( self._pre_import_status, hash, note ) = HG.client_controller.Read( 'hash_status', 'sha256', self._hash, prefix = 'recognised during import' )
return ( self._pre_import_status, self._hash, note )
def GenerateInfo( self ):
@ -725,7 +742,7 @@ class GalleryImport( HydrusSerialisable.SerialisableBase ):
elif status == CC.STATUS_UNKNOWN:
( os_file_handle, temp_path ) = HydrusPaths.GetTempPath()
( os_file_handle, temp_path ) = ClientPaths.GetTempPath()
try:
@ -806,8 +823,6 @@ class GalleryImport( HydrusSerialisable.SerialisableBase ):
self._seed_cache.NotifySeedsUpdated( ( seed, ) )
HG.client_controller.pub( 'refresh_page_name', page_key )
wx.CallAfter( self._download_control_file_clear )
@ -1009,14 +1024,12 @@ class GalleryImport( HydrusSerialisable.SerialisableBase ):
finally:
HG.client_controller.pub( 'refresh_page_name', page_key )
wx.CallAfter( self._download_control_gallery_clear )
with self._lock:
status = self._current_query + ': ' + HydrusData.ConvertIntToPrettyString( len( new_seeds ) ) + ' new urls found'
status = query + ': ' + HydrusData.ConvertIntToPrettyString( len( new_seeds ) ) + ' new urls found'
if num_already_in_seed_cache > 0:
@ -1471,8 +1484,6 @@ class HDDImport( HydrusSerialisable.SerialisableBase ):
self._seed_cache.NotifySeedsUpdated( ( seed, ) )
HG.client_controller.pub( 'refresh_page_name', page_key )
with self._lock:
self._current_action = ''
@ -1867,6 +1878,8 @@ class ImportFolder( HydrusSerialisable.SerialisableBaseNamed ):
seed.ImportPath( self._file_import_options )
hash = seed.GetHash()
if seed.status in CC.SUCCESSFUL_IMPORT_STATES:
downloaded_tags = []
@ -1956,20 +1969,7 @@ class ImportFolder( HydrusSerialisable.SerialisableBaseNamed ):
if len( presentation_hashes ) > 0:
if self._publish_files_to_popup_button:
job_key = ClientThreading.JobKey()
job_key.SetVariable( 'popup_files_mergable', True )
job_key.SetVariable( 'popup_files', ( list( presentation_hashes ), self._name ) )
HG.client_controller.pub( 'message', job_key )
if self._publish_files_to_page:
HG.client_controller.pub( 'imported_files_to_page', list( presentation_hashes ), self._name )
PublishPresentationHashes( self._name, presentation_hashes, self._publish_files_to_popup_button, self._publish_files_to_page )
@ -2174,7 +2174,7 @@ class ImportFolder( HydrusSerialisable.SerialisableBaseNamed ):
if set( mimes ) != set( self._mimes ):
self._seed_cache.RemoveSeedsByStatus( CC.STATUS_VETOED )
self._seed_cache.RemoveSeedsByStatus( ( CC.STATUS_VETOED, ) )
self._name = name
@ -2275,7 +2275,7 @@ class Seed( HydrusSerialisable.SerialisableBase ):
serialisable_urls = list( self._urls )
serialisable_tags = list( self._tags )
serialisable_hashes = [ ( hash_type, hash.encode( 'hex' ) ) for ( hash_type, hash ) in self._hashes.items() ]
serialisable_hashes = [ ( hash_type, hash.encode( 'hex' ) ) for ( hash_type, hash ) in self._hashes.items() if hash is not None ]
return ( self.seed_type, self.seed_data, self.created, self.modified, self.source_time, self.status, self.note, serialisable_urls, serialisable_tags, serialisable_hashes )
@ -2286,7 +2286,7 @@ class Seed( HydrusSerialisable.SerialisableBase ):
self._urls = set( serialisable_urls )
self._tags = set( serialisable_tags )
self._hashes = { hash_type : encoded_hash.decode( 'hex' ) for ( hash_type, encoded_hash ) in serialisable_hashes }
self._hashes = { hash_type : encoded_hash.decode( 'hex' ) for ( hash_type, encoded_hash ) in serialisable_hashes if encoded_hash is not None }
def _NormaliseAndFilterAssociableURLs( self, urls ):
@ -2337,6 +2337,8 @@ class Seed( HydrusSerialisable.SerialisableBase ):
def AddTags( self, tags ):
tags = HydrusTags.CleanTags( tags )
self._tags.update( tags )
self._UpdateModified()
@ -2360,7 +2362,7 @@ class Seed( HydrusSerialisable.SerialisableBase ):
self.AddURL( file_url )
( os_file_handle, temp_path ) = HydrusPaths.GetTempPath()
( os_file_handle, temp_path ) = ClientPaths.GetTempPath()
try:
@ -2444,7 +2446,7 @@ class Seed( HydrusSerialisable.SerialisableBase ):
for url in urls:
if HG.client_controller.network_engine.domain_manager.URLDefinitelyRefersToMultipleFiles( url ):
if HG.client_controller.network_engine.domain_manager.URLCanReferToMultipleFiles( url ):
continue
@ -2564,13 +2566,18 @@ class Seed( HydrusSerialisable.SerialisableBase ):
return search_seeds
def HasHash( self ):
return self.GetHash() is not None
def Import( self, temp_path, file_import_options ):
file_import_job = FileImportJob( temp_path, file_import_options )
( status, hash ) = HG.client_controller.client_files_manager.ImportFile( file_import_job )
( status, hash, note ) = HG.client_controller.client_files_manager.ImportFile( file_import_job )
self.SetStatus( status )
self.SetStatus( status, note = note )
self.SetHash( hash )
@ -2581,7 +2588,7 @@ class Seed( HydrusSerialisable.SerialisableBase ):
raise Exception( 'Attempted to import as a path, but I do not think I am a path!' )
( os_file_handle, temp_path ) = HydrusPaths.GetTempPath()
( os_file_handle, temp_path ) = ClientPaths.GetTempPath()
try:
@ -2624,7 +2631,10 @@ class Seed( HydrusSerialisable.SerialisableBase ):
def SetHash( self, hash ):
self._hashes[ 'sha256' ] = hash
if hash is not None:
self._hashes[ 'sha256' ] = hash
def SetStatus( self, status, note = '', exception = None ):
@ -2672,14 +2682,9 @@ class Seed( HydrusSerialisable.SerialisableBase ):
def ShouldPresent( self, file_import_options ):
if 'sha256' not in self._hashes:
return False
hash = self.GetHash()
hash = self._hashes[ 'sha256' ]
if self.status in CC.SUCCESSFUL_IMPORT_STATES:
if hash is not None and self.status in CC.SUCCESSFUL_IMPORT_STATES:
if file_import_options.ShouldPresentIgnorantOfInbox( self.status ):
@ -2812,6 +2817,7 @@ class Seed( HydrusSerialisable.SerialisableBase ):
parse_results = all_parse_results[0]
# this now needs to deal with multiple file post urls cleverly, which I think means no longer associating file_urls at this point--do that url association in DownloadAndImportRawFile only
self.AddParseResults( parse_results )
self.CheckPreFetchMetadata( tag_import_options )
@ -2827,13 +2833,31 @@ class Seed( HydrusSerialisable.SerialisableBase ):
raise HydrusExceptions.VetoException( 'Could not find a file URL!' )
file_url = file_urls[0]
status_hook( 'downloading file' )
self.DownloadAndImportRawFile( file_url, file_import_options, network_job_factory, network_job_presentation_context_factory )
did_substantial_work = True
if len( file_urls ) == 1 or HG.client_controller.network_engine.domain_manager.URLDefinitelyRefersToOneFile( post_url ) or True: # leave this mandatory for now
file_url = file_urls[0]
status_hook( 'downloading file' )
self.DownloadAndImportRawFile( file_url, file_import_options, network_job_factory, network_job_presentation_context_factory )
did_substantial_work = True
else:
# we have a tweet with multiple images
# seeds can't represent more than one file
# so, spawn a bunch more seeds via duplication, each with a sub-index and the file url associated
# insert them into the seed cache
# set my own note as 'generated 10 sub jobs' and ignored (or successful, and alter all the gethash stuff to deal with hash being None off a success result)
# then alter seeds so:
# if the sub-index is set and we have a file url, just go straight to the DownloadAndImportRawFile in this method.
# alter seed presentation in the file import cache column
# sub-index should be in seed.__hash__ as well
pass
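The TODO comment above sketches a plan: a seed represents one file, so a post with several file URLs (e.g. a multi-image tweet) would spawn duplicate seeds that differ only by a sub-index, keeping (url, sub-index) pairs unique in the seed cache. A minimal, hypothetical sketch of that idea (the `Seed` here is a stand-in, not hydrus's real class):

```python
class Seed:
    # Stand-in for hydrus's Seed: just enough state to show how a
    # sub_index keeps duplicated seeds distinct in a seed cache.
    def __init__( self, seed_data, sub_index = None, file_url = None ):
        self.seed_data = seed_data
        self.sub_index = sub_index
        self.file_url = file_url

    def __hash__( self ):
        # sub_index participates in the hash, so two sub-seeds for the
        # same post url do not collide in a set/dict-backed cache
        return hash( ( self.seed_data, self.sub_index ) )

def spawn_sub_seeds( post_seed, file_urls ):
    # one sub-seed per parsed file url, each carrying its file url directly
    # so it can skip straight to DownloadAndImportRawFile
    return [ Seed( post_seed.seed_data, sub_index = i, file_url = url ) for ( i, url ) in enumerate( file_urls ) ]
```

This is only an illustration of the comment's design, under the assumption that the seed cache deduplicates by hash.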
@ -2907,13 +2931,13 @@ class Seed( HydrusSerialisable.SerialisableBase ):
return did_work
if 'sha256' not in self._hashes:
hash = self.GetHash()
if hash is None:
return did_work
hash = self._hashes[ 'sha256' ]
service_keys_to_content_updates = collections.defaultdict( list )
urls = set( self._urls )
@ -2927,7 +2951,7 @@ class Seed( HydrusSerialisable.SerialisableBase ):
if len( associable_urls ) > 0:
content_update = HydrusData.ContentUpdate( HC.CONTENT_TYPE_URLS, HC.CONTENT_UPDATE_ADD, ( hash, associable_urls ) )
content_update = HydrusData.ContentUpdate( HC.CONTENT_TYPE_URLS, HC.CONTENT_UPDATE_ADD, ( associable_urls, ( hash, ) ) )
service_keys_to_content_updates[ CC.COMBINED_LOCAL_FILE_SERVICE_KEY ].append( content_update )
@ -3568,16 +3592,6 @@ class SeedCache( HydrusSerialisable.SerialisableBase ):
HG.client_controller.pub( 'seed_cache_seeds_updated', self._seed_cache_key, seeds )
def RemoveProcessedSeeds( self ):
with self._lock:
seeds_to_delete = [ seed for seed in self._seeds if seed.status != CC.STATUS_UNKNOWN ]
self.RemoveSeeds( seeds_to_delete )
def RemoveSeeds( self, seeds ):
with self._lock:
@ -3594,21 +3608,21 @@ class SeedCache( HydrusSerialisable.SerialisableBase ):
self.NotifySeedsUpdated( seeds_to_delete )
def RemoveSeedsByStatus( self, status ):
def RemoveSeedsByStatus( self, statuses_to_remove ):
with self._lock:
seeds_to_delete = [ seed for seed in self._seeds if seed.status == status ]
seeds_to_delete = [ seed for seed in self._seeds if seed.status in statuses_to_remove ]
self.RemoveSeeds( seeds_to_delete )
def RemoveSuccessfulSeeds( self ):
def RemoveAllButUnknownSeeds( self ):
with self._lock:
seeds_to_delete = [ seed for seed in self._seeds if seed.status in CC.SUCCESSFUL_IMPORT_STATES ]
seeds_to_delete = [ seed for seed in self._seeds if seed.status != CC.STATUS_UNKNOWN ]
self.RemoveSeeds( seeds_to_delete )
@ -3836,8 +3850,6 @@ class SimpleDownloaderImport( HydrusSerialisable.SerialisableBase ):
self._seed_cache.NotifySeedsUpdated( ( seed, ) )
HG.client_controller.pub( 'refresh_page_name', page_key )
with self._lock:
self._current_action = ''
@ -3927,10 +3939,6 @@ class SimpleDownloaderImport( HydrusSerialisable.SerialisableBase ):
parser_status = HydrusData.ToUnicode( e )
finally:
HG.client_controller.pub( 'refresh_page_name', page_key )
with self._lock:
@ -4198,7 +4206,7 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):
SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_SUBSCRIPTION
SERIALISABLE_NAME = 'Subscription'
SERIALISABLE_VERSION = 5
SERIALISABLE_VERSION = 6
def __init__( self, name ):
@ -4237,6 +4245,10 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):
self._no_work_until = 0
self._no_work_until_reason = ''
self._publish_files_to_popup_button = True
self._publish_files_to_page = False
self._merge_query_publish_events = True
def _DelayWork( self, time_delta, reason ):
@ -4300,12 +4312,12 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):
serialisable_file_options = self._file_import_options.GetSerialisableTuple()
serialisable_tag_options = self._tag_import_options.GetSerialisableTuple()
return ( serialisable_gallery_identifier, serialisable_gallery_stream_identifiers, serialisable_queries, serialisable_checker_options, self._initial_file_limit, self._periodic_file_limit, self._paused, serialisable_file_options, serialisable_tag_options, self._no_work_until, self._no_work_until_reason )
return ( serialisable_gallery_identifier, serialisable_gallery_stream_identifiers, serialisable_queries, serialisable_checker_options, self._initial_file_limit, self._periodic_file_limit, self._paused, serialisable_file_options, serialisable_tag_options, self._no_work_until, self._no_work_until_reason, self._publish_files_to_popup_button, self._publish_files_to_page, self._merge_query_publish_events )
def _InitialiseFromSerialisableInfo( self, serialisable_info ):
( serialisable_gallery_identifier, serialisable_gallery_stream_identifiers, serialisable_queries, serialisable_checker_options, self._initial_file_limit, self._periodic_file_limit, self._paused, serialisable_file_options, serialisable_tag_options, self._no_work_until, self._no_work_until_reason ) = serialisable_info
( serialisable_gallery_identifier, serialisable_gallery_stream_identifiers, serialisable_queries, serialisable_checker_options, self._initial_file_limit, self._periodic_file_limit, self._paused, serialisable_file_options, serialisable_tag_options, self._no_work_until, self._no_work_until_reason, self._publish_files_to_popup_button, self._publish_files_to_page, self._merge_query_publish_events ) = serialisable_info
self._gallery_identifier = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_gallery_identifier )
self._gallery_stream_identifiers = [ HydrusSerialisable.CreateFromSerialisableTuple( serialisable_gallery_stream_identifier ) for serialisable_gallery_stream_identifier in serialisable_gallery_stream_identifiers ]
@ -4409,6 +4421,19 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):
return ( 5, new_serialisable_info )
if version == 5:
( serialisable_gallery_identifier, serialisable_gallery_stream_identifiers, serialisable_queries, serialisable_checker_options, initial_file_limit, periodic_file_limit, paused, serialisable_file_options, serialisable_tag_options, no_work_until, no_work_until_reason ) = old_serialisable_info
publish_files_to_popup_button = True
publish_files_to_page = False
merge_query_publish_events = True
new_serialisable_info = ( serialisable_gallery_identifier, serialisable_gallery_stream_identifiers, serialisable_queries, serialisable_checker_options, initial_file_limit, periodic_file_limit, paused, serialisable_file_options, serialisable_tag_options, no_work_until, no_work_until_reason, publish_files_to_popup_button, publish_files_to_page, merge_query_publish_events )
return ( 6, new_serialisable_info )
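The version 5 to 6 step above follows the serialisable upgrade pattern used throughout these classes: each step unpacks the old tuple, appends defaults for the newly added options, and returns the bumped version. A trimmed, hypothetical sketch of that pattern (a tiny tuple instead of the real subscription fields):

```python
def update_serialisable_info( version, old_serialisable_info ):
    # Each branch converts a version-N tuple into a version-(N+1) tuple,
    # supplying sensible defaults for fields that did not exist before.
    if version == 5:
        ( name, paused ) = old_serialisable_info
        # defaults matching the old behaviour: popup button on, page off, merged events
        publish_files_to_popup_button = True
        publish_files_to_page = False
        merge_query_publish_events = True
        new_serialisable_info = ( name, paused, publish_files_to_popup_button, publish_files_to_page, merge_query_publish_events )
        return ( 6, new_serialisable_info )
    return ( version, old_serialisable_info )
```

Because every step only knows about its own version pair, a stored version-1 object can be walked forward one step at a time until it reaches the current SERIALISABLE_VERSION.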
def _WorkOnFiles( self, job_key ):
@ -4457,12 +4482,12 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):
gallery.SetNetworkJobFactory( network_job_factory )
text_1 = 'downloading files'
file_popup_text = self._name
query_summary_name = self._name
if query_text != self._name:
text_1 += ' for "' + query_text + '"'
file_popup_text += ': ' + query_text
query_summary_name += ': ' + query_text
job_key.SetVariable( 'popup_text_1', text_1 )
@ -4526,7 +4551,7 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):
seed.WorkOnPostURL( self._file_import_options, self._tag_import_options, status_hook, GenerateSubscriptionNetworkJobFactory( self._GetNetworkJobSubscriptionKey( query ) ), GenerateSubscriptionNetworkJobPresentationContextFactory( job_key ) )
seed.WorkOnPostURL( self._file_import_options, self._tag_import_options, status_hook, GenerateSubscriptionNetworkJobFactory( self._GetNetworkJobSubscriptionKey( query ) ), GenerateMultiplePopupNetworkJobPresentationContextFactory( job_key ) )
if seed.ShouldPresent( self._file_import_options ):
@ -4569,7 +4594,7 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):
elif status == CC.STATUS_UNKNOWN:
( os_file_handle, temp_path ) = HydrusPaths.GetTempPath()
( os_file_handle, temp_path ) = ClientPaths.GetTempPath()
try:
@ -4674,7 +4699,7 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):
if len( presentation_hashes ) > 0:
job_key.SetVariable( 'popup_files', ( list( presentation_hashes ), file_popup_text ) )
job_key.SetVariable( 'popup_files', ( list( presentation_hashes ), query_summary_name ) )
time.sleep( DID_SUBSTANTIAL_FILE_WORK_MINIMUM_SLEEP_TIME )
@ -4682,15 +4707,15 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):
HG.client_controller.WaitUntilViewFree()
if not self._merge_query_publish_events and len( presentation_hashes ) > 0:
PublishPresentationHashes( query_summary_name, presentation_hashes, self._publish_files_to_popup_button, self._publish_files_to_page )
if len( all_presentation_hashes ) > 0:
if self._merge_query_publish_events and len( all_presentation_hashes ) > 0:
files_job_key = ClientThreading.JobKey()
files_job_key.SetVariable( 'popup_files_mergable', True )
files_job_key.SetVariable( 'popup_files', ( all_presentation_hashes, self._name ) )
HG.client_controller.pub( 'message', files_job_key )
PublishPresentationHashes( self._name, all_presentation_hashes, self._publish_files_to_popup_button, self._publish_files_to_page )
job_key.DeleteVariable( 'popup_files' )
@ -5060,6 +5085,11 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):
return self._queries
def GetPresentationOptions( self ):
return ( self._publish_files_to_popup_button, self._publish_files_to_page, self._merge_query_publish_events )
def GetTagImportOptions( self ):
return self._tag_import_options
@ -5179,6 +5209,13 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):
def SetPresentationOptions( self, publish_files_to_popup_button, publish_files_to_page, merge_query_publish_events ):
self._publish_files_to_popup_button = publish_files_to_popup_button
self._publish_files_to_page = publish_files_to_page
self._merge_query_publish_events = merge_query_publish_events
def SetTuple( self, gallery_identifier, gallery_stream_identifiers, queries, checker_options, initial_file_limit, periodic_file_limit, paused, file_import_options, tag_import_options, no_work_until ):
self._gallery_identifier = gallery_identifier
@ -5768,8 +5805,6 @@ class ThreadWatcherImport( HydrusSerialisable.SerialisableBase ):
HG.client_controller.pub( 'refresh_page_name', page_key )
def _GetSerialisableInfo( self ):
@ -6024,8 +6059,6 @@ class ThreadWatcherImport( HydrusSerialisable.SerialisableBase ):
self._seed_cache.NotifySeedsUpdated( ( seed, ) )
HG.client_controller.pub( 'refresh_page_name', page_key )
with self._lock:
self._current_action = ''
@ -6456,8 +6489,6 @@ class URLsImport( HydrusSerialisable.SerialisableBase ):
self._seed_cache.NotifySeedsUpdated( ( seed, ) )
HG.client_controller.pub( 'refresh_page_name', page_key )
with self._lock:
self._RegenerateSeedCacheStatus()


@ -524,13 +524,13 @@ class LocationsManager( object ):
if action == HC.CONTENT_UPDATE_ADD:
( hash, urls ) = row
( urls, hashes ) = row
self._urls.update( urls )
elif action == HC.CONTENT_UPDATE_DELETE:
( hash, urls ) = row
( urls, hashes ) = row
self._urls.difference_update( urls )


@ -60,7 +60,14 @@ def ConvertDomainIntoAllApplicableDomains( domain ):
def ConvertDomainIntoSecondLevelDomain( domain ):
return ConvertDomainIntoAllApplicableDomains( domain )[-1]
domains = ConvertDomainIntoAllApplicableDomains( domain )
if len( domains ) == 0:
raise HydrusExceptions.URLMatchException( 'That url or domain did not seem to be valid!' )
return domains[-1]
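The hunk above adds a guard so an invalid or bare domain raises instead of indexing into an empty list. A minimal standalone sketch of the idea (hypothetical helper names, not the hydrus implementation verbatim):

```python
def convert_domain_into_all_applicable_domains(domain):
    # walk up the domain, dropping one subdomain label at a time, e.g.
    # 'a.b.example.com' -> ['a.b.example.com', 'b.example.com', 'example.com']
    domains = []
    while domain.count('.') > 0:
        domains.append(domain)
        domain = domain.split('.', 1)[1]
    return domains

def convert_domain_into_second_level_domain(domain):
    domains = convert_domain_into_all_applicable_domains(domain)
    if len(domains) == 0:
        # mirrors the new guard: a bare label like 'localhost' yields nothing
        raise ValueError('That url or domain did not seem to be valid!')
    return domains[-1]
```

The last (shortest) entry in the applicable-domains list is the second-level domain, which is why the guarded `domains[-1]` works.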
def ConvertHTTPSToHTTP( url ):
@ -608,7 +615,7 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):
url_match_key = url_match.GetMatchKey()
if url_match.IsPostURL() and url_match_key in self._url_match_keys_to_display:
if url_match_key in self._url_match_keys_to_display:
url_match_name = url_match.GetName()
@ -753,6 +760,14 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):
def GetURLMatch( self, url ):
with self._lock:
return self._GetURLMatch( url )
def GetURLMatches( self ):
with self._lock:
@ -913,6 +928,40 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):
def OverwriteDefaultParsers( self, parser_names ):
with self._lock:
import ClientDefaults
default_parsers = ClientDefaults.GetDefaultParsers()
existing_parsers = list( self._parsers )
new_parsers = [ parser for parser in existing_parsers if parser.GetName() not in parser_names ]
new_parsers.extend( [ parser for parser in default_parsers if parser.GetName() in parser_names ] )
self.SetParsers( new_parsers )
def OverwriteDefaultURLMatches( self, url_match_names ):
with self._lock:
import ClientDefaults
default_url_matches = ClientDefaults.GetDefaultURLMatches()
existing_url_matches = list( self._url_matches )
new_url_matches = [ url_match for url_match in existing_url_matches if url_match.GetName() not in url_match_names ]
new_url_matches.extend( [ url_match for url_match in default_url_matches if url_match.GetName() in url_match_names ] )
self.SetURLMatches( new_url_matches )
def SetClean( self ):
with self._lock:
@ -930,6 +979,8 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):
self._url_match_keys_to_default_tag_import_options = url_match_keys_to_tag_import_options
self._SetDirty()
def SetHeaderValidation( self, network_context, key, approved ):
@ -1003,14 +1054,14 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):
with self._lock:
# add new post url matches to the yes display set
# by default, we will show post urls
old_url_match_keys = { url_match.GetMatchKey() for url_match in self._url_matches if url_match.IsPostURL() }
url_match_keys = { url_match.GetMatchKey() for url_match in url_matches if url_match.IsPostURL() }
old_post_url_match_keys = { url_match.GetMatchKey() for url_match in self._url_matches if url_match.IsPostURL() }
post_url_match_keys = { url_match.GetMatchKey() for url_match in url_matches if url_match.IsPostURL() }
added_url_match_keys = url_match_keys.difference( old_url_match_keys )
added_post_url_match_keys = post_url_match_keys.difference( old_post_url_match_keys )
self._url_match_keys_to_display.update( added_url_match_keys )
self._url_match_keys_to_display.update( added_post_url_match_keys )
#
@ -1094,7 +1145,7 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):
def URLDefinitelyRefersToMultipleFiles( self, url ):
def URLCanReferToMultipleFiles( self, url ):
with self._lock:
@ -1105,7 +1156,7 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):
return False
return url_match.RefersToMultipleFiles()
return url_match.CanReferToMultipleFiles()
@ -1127,6 +1178,10 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):
@staticmethod
def STATICLinkURLMatchesAndParsers( url_matches, parsers, existing_url_match_keys_to_parser_keys ):
parsers = list( parsers )
parsers.sort( key = lambda p: p.GetName() )
new_url_match_keys_to_parser_keys = {}
for url_match in url_matches:
@ -1257,9 +1312,9 @@ class URLMatch( HydrusSerialisable.SerialisableBaseNamed ):
SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_URL_MATCH
SERIALISABLE_NAME = 'URL Match'
SERIALISABLE_VERSION = 3
SERIALISABLE_VERSION = 4
def __init__( self, name, url_match_key = None, url_type = None, preferred_scheme = 'https', netloc = 'hostname.com', match_subdomains = False, keep_matched_subdomains = False, path_components = None, parameters = None, api_lookup_converter = None, should_be_associated_with_files = True, example_url = 'https://hostname.com/post/page.php?id=123456&s=view' ):
def __init__( self, name, url_match_key = None, url_type = None, preferred_scheme = 'https', netloc = 'hostname.com', match_subdomains = False, keep_matched_subdomains = False, path_components = None, parameters = None, api_lookup_converter = None, can_produce_multiple_files = False, should_be_associated_with_files = True, example_url = 'https://hostname.com/post/page.php?id=123456&s=view' ):
if url_match_key is None:
@ -1308,6 +1363,7 @@ class URLMatch( HydrusSerialisable.SerialisableBaseNamed ):
self._path_components = path_components
self._parameters = parameters
self._api_lookup_converter = api_lookup_converter
self._can_produce_multiple_files = can_produce_multiple_files
self._should_be_associated_with_files = should_be_associated_with_files
self._example_url = example_url
@ -1384,12 +1440,12 @@ class URLMatch( HydrusSerialisable.SerialisableBaseNamed ):
serialisable_parameters = self._parameters.GetSerialisableTuple()
serialisable_api_lookup_converter = self._api_lookup_converter.GetSerialisableTuple()
return ( serialisable_url_match_key, self._url_type, self._preferred_scheme, self._netloc, self._match_subdomains, self._keep_matched_subdomains, serialisable_path_components, serialisable_parameters, serialisable_api_lookup_converter, self._should_be_associated_with_files, self._example_url )
return ( serialisable_url_match_key, self._url_type, self._preferred_scheme, self._netloc, self._match_subdomains, self._keep_matched_subdomains, serialisable_path_components, serialisable_parameters, serialisable_api_lookup_converter, self._can_produce_multiple_files, self._should_be_associated_with_files, self._example_url )
def _InitialiseFromSerialisableInfo( self, serialisable_info ):
( serialisable_url_match_key, self._url_type, self._preferred_scheme, self._netloc, self._match_subdomains, self._keep_matched_subdomains, serialisable_path_components, serialisable_parameters, serialisable_api_lookup_converter, self._should_be_associated_with_files, self._example_url ) = serialisable_info
( serialisable_url_match_key, self._url_type, self._preferred_scheme, self._netloc, self._match_subdomains, self._keep_matched_subdomains, serialisable_path_components, serialisable_parameters, serialisable_api_lookup_converter, self._can_produce_multiple_files, self._should_be_associated_with_files, self._example_url ) = serialisable_info
self._url_match_key = serialisable_url_match_key.decode( 'hex' )
self._path_components = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_path_components )
@ -1434,6 +1490,26 @@ class URLMatch( HydrusSerialisable.SerialisableBaseNamed ):
return ( 3, new_serialisable_info )
if version == 3:
( serialisable_url_match_key, url_type, preferred_scheme, netloc, match_subdomains, keep_matched_subdomains, serialisable_path_components, serialisable_parameters, serialisable_api_lookup_converter, should_be_associated_with_files, example_url ) = old_serialisable_info
can_produce_multiple_files = False
new_serialisable_info = ( serialisable_url_match_key, url_type, preferred_scheme, netloc, match_subdomains, keep_matched_subdomains, serialisable_path_components, serialisable_parameters, serialisable_api_lookup_converter, can_produce_multiple_files, should_be_associated_with_files, example_url )
return ( 4, new_serialisable_info )
def CanReferToMultipleFiles( self ):
is_a_gallery_page = self._url_type in ( HC.URL_TYPE_GALLERY, HC.URL_TYPE_WATCHABLE )
is_a_multipost_post_page = self._url_type == HC.URL_TYPE_POST and self._can_produce_multiple_files
return is_a_gallery_page or is_a_multipost_post_page
def GetAPIURL( self, url ):
@ -1527,14 +1603,13 @@ class URLMatch( HydrusSerialisable.SerialisableBaseNamed ):
return r.geturl()
def RefersToMultipleFiles( self ):
return self._url_type in ( HC.URL_TYPE_GALLERY, HC.URL_TYPE_WATCHABLE )
def RefersToOneFile( self ):
return self._url_type in ( HC.URL_TYPE_FILE, HC.URL_TYPE_POST )
is_a_direct_file_page = self._url_type == HC.URL_TYPE_FILE
is_a_single_file_post_page = self._url_type == HC.URL_TYPE_POST and not self._can_produce_multiple_files
return is_a_direct_file_page or is_a_single_file_post_page
def RegenMatchKey( self ):
@ -1598,7 +1673,7 @@ class URLMatch( HydrusSerialisable.SerialisableBaseNamed ):
if len( url_parameters ) < len( self._parameters ):
raise HydrusExceptions.URLMatchException( p.query + ' did not have ' + str( len( self._parameters ) ) + ' value pairs' )
raise HydrusExceptions.URLMatchException( p.query + ' did not have ' + str( len( self._parameters ) ) + ' parameters' )
for ( key, string_match ) in self._parameters.items():
@ -1623,7 +1698,7 @@ class URLMatch( HydrusSerialisable.SerialisableBaseNamed ):
def ToTuple( self ):
return ( self._url_type, self._preferred_scheme, self._netloc, self._match_subdomains, self._keep_matched_subdomains, self._path_components, self._parameters, self._api_lookup_converter, self._should_be_associated_with_files, self._example_url )
return ( self._url_type, self._preferred_scheme, self._netloc, self._match_subdomains, self._keep_matched_subdomains, self._path_components, self._parameters, self._api_lookup_converter, self._can_produce_multiple_files, self._should_be_associated_with_files, self._example_url )
def UsesAPIURL( self ):


@ -138,6 +138,13 @@ class NetworkJob( object ):
( self._session_network_context, self._login_network_context ) = self._GenerateSpecificNetworkContexts()
def _CanReattemptConnection( self ):
max_attempts_allowed = 3
return self._current_connection_attempt_number <= max_attempts_allowed
def _CanReattemptRequest( self ):
if self._method == 'GET':
@ -824,7 +831,7 @@ class NetworkJob( object ):
self._current_connection_attempt_number += 1
if not self._CanReattemptRequest():
if not self._CanReattemptConnection():
raise HydrusExceptions.ConnectionException( 'Could not connect!' )
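This hunk splits connection retries from general request retries, giving connection failures their own fixed budget of three attempts. A self-contained sketch of that pattern (class and method names are assumptions for illustration):

```python
class NetworkJobRetry:
    # connection errors get their own, stricter retry budget, separate
    # from the GET/POST reattempt logic elsewhere in the job
    MAX_CONNECTION_ATTEMPTS = 3

    def __init__(self):
        self.current_connection_attempt_number = 1

    def can_reattempt_connection(self):
        return self.current_connection_attempt_number <= self.MAX_CONNECTION_ATTEMPTS

    def note_connection_failure(self):
        # bump the counter, then bail out once the budget is exhausted
        self.current_connection_attempt_number += 1
        if not self.can_reattempt_connection():
            raise ConnectionError('Could not connect!')
```

With attempts starting at 1, two failures still leave room to retry; the third failure pushes the counter past the cap and raises.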


@ -2579,7 +2579,7 @@ class StringConverter( HydrusSerialisable.SerialisableBase ):
elif transformation_type == STRING_TRANSFORMATION_CLIP_TEXT_FROM_END:
return 'take the first ' + HydrusData.ConvertIntToPrettyString( data ) + ' characters'
return 'take the last ' + HydrusData.ConvertIntToPrettyString( data ) + ' characters'
elif transformation_type == STRING_TRANSFORMATION_PREPEND_TEXT:

include/ClientPaths.py (new file, +46 lines)

@ -0,0 +1,46 @@
import HydrusGlobals as HG
import HydrusPaths
import webbrowser
def GetCurrentTempDir():
temp_path_override = HG.client_controller.new_options.GetNoneableString( 'temp_path_override' )
if temp_path_override is None:
return HydrusPaths.tempfile.gettempdir()
else:
return temp_path_override
def GetTempDir():
temp_path_override = HG.client_controller.new_options.GetNoneableString( 'temp_path_override' )
return HydrusPaths.GetTempDir( dir = temp_path_override ) # none means default
def GetTempPath( suffix = '' ):
temp_path_override = HG.client_controller.new_options.GetNoneableString( 'temp_path_override' )
return HydrusPaths.GetTempPath( suffix = suffix, dir = temp_path_override )
def LaunchPathInWebBrowser( path ):
LaunchURLInWebBrowser( 'file:///' + path )
def LaunchURLInWebBrowser( url ):
web_browser_path = HG.client_controller.new_options.GetNoneableString( 'web_browser_path' )
if web_browser_path is None:
webbrowser.open( url )
else:
HydrusPaths.LaunchFile( url, launch_path = web_browser_path )
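The new ClientPaths module threads an optional user-set temp directory ('temp_path_override') down into the tempfile calls. A minimal sketch of the override pattern, with the option lookup stubbed out (the real module reads it from the client options):

```python
import os
import tempfile

def get_temp_path(suffix='', dir=None):
    # dir=None falls through to the platform default temp directory,
    # mirroring how ClientPaths passes the override down to HydrusPaths
    return tempfile.mkstemp(suffix=suffix, prefix='hydrus', dir=dir)

# stands in for the user's 'temp_path_override' option
override = tempfile.mkdtemp(prefix='hydrus')

(os_file_handle, temp_path) = get_temp_path(suffix='.png', dir=override)
os.close(os_file_handle)
```

`tempfile.mkstemp` and `mkdtemp` both accept `dir=None` to mean "use the default", so a single code path handles both the overridden and default cases.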


@ -2,6 +2,7 @@ import ClientConstants as CC
import ClientImageHandling
import ClientImporting
import ClientParsing
import ClientPaths
import cv2
import HydrusConstants as HC
import HydrusData
@ -166,7 +167,7 @@ def DumpToPng( width, payload, title, payload_description, text, path ):
finished_image = numpy.concatenate( ( top_image, payload_image ) )
# this is to deal with unicode paths, which cv2 can't handle
( os_file_handle, temp_path ) = HydrusPaths.GetTempPath( suffix = '.png' )
( os_file_handle, temp_path ) = ClientPaths.GetTempPath( suffix = '.png' )
try:
@ -226,7 +227,7 @@ def GetPayloadDescriptionAndString( payload_obj ):
def LoadFromPng( path ):
# this is to deal with unicode paths, which cv2 can't handle
( os_file_handle, temp_path ) = HydrusPaths.GetTempPath()
( os_file_handle, temp_path ) = ClientPaths.GetTempPath()
try:


@ -1325,7 +1325,7 @@ class ServiceRepository( ServiceRestricted ):
self._DealWithFundamentalNetworkError()
message = 'While downloading updates for the ' + self._name + ' repository, an update failed to import! The error follows:'
message = 'While downloading updates for the ' + self._name + ' repository, one failed to import! The error follows:'
HydrusData.ShowText( message )
@ -1428,7 +1428,7 @@ class ServiceRepository( ServiceRestricted ):
with self._lock:
message = 'While processing updates for the ' + self._name + ' repository, an update failed to import! The error follows:'
message = 'While processing updates for the ' + self._name + ' repository, one failed! The error follows:'
HydrusData.ShowText( message )


@ -49,7 +49,7 @@ options = {}
# Misc
NETWORK_VERSION = 18
SOFTWARE_VERSION = 305
SOFTWARE_VERSION = 306
UNSCALED_THUMBNAIL_DIMENSIONS = ( 200, 200 )
@ -466,6 +466,8 @@ ARCHIVES = ( APPLICATION_ZIP, APPLICATION_HYDRUS_ENCRYPTED_ZIP, APPLICATION_RAR,
MIMES_WITH_THUMBNAILS = ( APPLICATION_FLASH, IMAGE_JPEG, IMAGE_PNG, IMAGE_APNG, IMAGE_GIF, IMAGE_BMP, VIDEO_AVI, VIDEO_FLV, VIDEO_MOV, VIDEO_MP4, VIDEO_WMV, VIDEO_MKV, VIDEO_WEBM, VIDEO_MPEG )
HYDRUS_UPDATE_FILES = ( APPLICATION_HYDRUS_UPDATE_DEFINITIONS, APPLICATION_HYDRUS_UPDATE_CONTENT )
MIMES_WE_CAN_PHASH = ( IMAGE_JPEG, IMAGE_PNG )
# mp3 header is complicated


@ -358,8 +358,6 @@ class HydrusController( object ):
def InitModel( self ):
self.temp_dir = HydrusPaths.GetTempDir()
self._job_scheduler = HydrusThreading.JobScheduler( self )
self._job_scheduler.start()
@ -478,11 +476,6 @@ class HydrusController( object ):
self._job_scheduler = None
if hasattr( self, 'temp_dir' ):
HydrusPaths.DeletePath( self.temp_dir )
def ShutdownView( self ):


@ -12,7 +12,6 @@ import Queue
import random
import sqlite3
import sys
import tempfile
import threading
import traceback
import time


@ -878,30 +878,43 @@ def IntelligentMassIntersect( sets_to_reduce ):
answer = None
sets_to_reduce = list( sets_to_reduce )
def get_len( item ):
return len( item )
sets_to_reduce.sort( key = get_len )
for set_to_reduce in sets_to_reduce:
if len( set_to_reduce ) == 0: return set()
if len( set_to_reduce ) == 0:
return set()
if answer is None: answer = set( set_to_reduce )
if answer is None:
answer = set( set_to_reduce )
else:
# in-place intersection (same thing as &=); use the method call to stay quick
if len( answer ) == 0: return set()
else: answer.intersection_update( set_to_reduce )
if len( answer ) == 0:
return set()
else:
answer.intersection_update( set_to_reduce )
if answer is None: return set()
else: return answer
if answer is None:
return set()
else:
return answer
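The hunk above mostly reflows one-line conditionals, but the underlying algorithm is worth seeing whole: intersect the sets smallest-first and return early the moment the running answer empties. A sketch of the same idea (not the hydrus code verbatim, and it sorts rather than preserving input order):

```python
def intelligent_mass_intersect(sets_to_reduce):
    # smallest set first: the running answer can only shrink, so starting
    # small keeps every intersection_update call cheap
    answer = None
    for set_to_reduce in sorted(sets_to_reduce, key=len):
        if len(set_to_reduce) == 0:
            return set()
        if answer is None:
            answer = set(set_to_reduce)
        else:
            answer.intersection_update(set_to_reduce)
            if len(answer) == 0:
                # early exit: nothing can re-enter an empty intersection
                return set()
    return set() if answer is None else answer
```

The early exits matter for the multi-tag query speedups mentioned in the changelog: one empty operand settles the whole intersection without touching the remaining sets.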
def IsAlreadyRunning( db_path, instance ):
@ -1586,9 +1599,7 @@ class ContentUpdate( object ):
elif self._data_type == HC.CONTENT_TYPE_URLS:
( hash, urls ) = self._row
hashes = { hash }
( urls, hashes ) = self._row
elif self._data_type == HC.CONTENT_TYPE_MAPPINGS:


@ -9,7 +9,6 @@ import HydrusImageHandling
import HydrusPaths
import HydrusVideoHandling
import os
import tempfile
import threading
import traceback
import cStringIO


@ -320,21 +320,13 @@ def GetFreeSpace( path ):
return disk_usage.free
def GetTempFile():
def GetTempDir( dir = None ):
return tempfile.TemporaryFile()
return tempfile.mkdtemp( prefix = 'hydrus', dir = dir )
def GetTempFileQuick():
def GetTempPath( suffix = '', dir = None ):
return tempfile.SpooledTemporaryFile( max_size = 1024 * 1024 * 4 )
def GetTempDir():
return tempfile.mkdtemp( prefix = 'hydrus' )
def GetTempPath( suffix = '' ):
return tempfile.mkstemp( suffix = suffix, prefix = 'hydrus' )
return tempfile.mkstemp( suffix = suffix, prefix = 'hydrus', dir = dir )
def HasSpaceForDBTransaction( db_dir, num_bytes ):


@ -182,6 +182,11 @@ def CleanTag( tag ):
try:
if tag is None:
raise Exception()
tag = tag[:1024]
tag = tag.lower()
@ -227,6 +232,11 @@ def CleanTags( tags ):
for tag in tags:
if tag is None:
continue
tag = CleanTag( tag )
try:


@ -10,7 +10,6 @@ import os
import re
import subprocess
import sys
import tempfile
import traceback
import threading
import time


@ -1,12 +1,12 @@
import ClientDaemons
import ClientImporting
import ClientPaths
import collections
import HydrusConstants as HC
import os
import shutil
import stat
import TestConstants
import tempfile
import unittest
import HydrusData
import ClientConstants as CC
@ -21,7 +21,7 @@ class TestDaemons( unittest.TestCase ):
def test_import_folders_daemon( self ):
test_dir = HydrusPaths.GetTempDir()
test_dir = ClientPaths.GetTempDir()
try:


@ -27,7 +27,6 @@ import shutil
import sqlite3
import stat
import TestConstants
import tempfile
import time
import threading
import unittest
@ -321,9 +320,10 @@ class TestClientDB( unittest.TestCase ):
file_import_job.GenerateInfo()
written_result = self._write( 'import_file', file_import_job )
( written_status, written_note ) = self._write( 'import_file', file_import_job )
self.assertEqual( written_result, CC.STATUS_SUCCESSFUL_AND_NEW )
self.assertEqual( written_status, CC.STATUS_SUCCESSFUL_AND_NEW )
self.assertEqual( written_note, '' )
self.assertEqual( file_import_job.GetHash(), hash )
time.sleep( 1 )
@ -780,9 +780,10 @@ class TestClientDB( unittest.TestCase ):
file_import_job.GenerateInfo()
written_result = self._write( 'import_file', file_import_job )
( written_status, written_note ) = self._write( 'import_file', file_import_job )
self.assertEqual( written_result, CC.STATUS_SUCCESSFUL_AND_NEW )
self.assertEqual( written_status, CC.STATUS_SUCCESSFUL_AND_NEW )
self.assertEqual( written_note, '' )
self.assertEqual( file_import_job.GetHash(), hash )
file_import_job = ClientImporting.FileImportJob( path )
@ -791,9 +792,10 @@ class TestClientDB( unittest.TestCase ):
file_import_job.GenerateInfo()
written_result = self._write( 'import_file', file_import_job )
( written_status, written_note ) = self._write( 'import_file', file_import_job )
self.assertEqual( written_result, CC.STATUS_SUCCESSFUL_BUT_REDUNDANT )
self.assertEqual( written_status, CC.STATUS_SUCCESSFUL_BUT_REDUNDANT )
self.assertTrue( len( written_note ) > 0 )
self.assertEqual( file_import_job.GetHash(), hash )
written_hash = file_import_job.GetHash()



@ -532,7 +532,7 @@ class Controller( object ):
else:
return CC.STATUS_SUCCESSFUL_AND_NEW
return ( CC.STATUS_SUCCESSFUL_AND_NEW, '' )