Version 458

This commit is contained in:
parent 61ea185821
commit 18517d8a73
@ -8,6 +8,38 @@

<div class="content">
<h3 id="changelog"><a href="#changelog">changelog</a></h3>
<ul>
<li><h3 id="version_458"><a href="#version_458">version 458</a></h3></li>
<ul>
<li>quality of life:</li>
<li>under _options->files and trash_, you can now govern whether the advanced file deletion dialog remembers action and reason. being able to save action (trash, physical delete, or physical delete and clear history) is new, default off for now, and won't _always_ save (if you are currently set to trash, but the next dialog doesn't have trash as an option, then it won't overwrite to physical delete). if you try it out, let me know how you like it</li>
<li>a new option under 'network->pause' now lets you always boot the client with paused network traffic</li>
<li>the main file import object now stores primary urls (such as post and file url) separately from source urls (which are produced by many parsers and typically refer to another website). a new checkbox in 'file import options' (when in advanced mode) now allows you to not associate primary urls separately to source urls (which is useful in some one-time technical jobs that talk to some unusual proxy etc...)</li>
<li>the new import object right-click menu that shows urls now separates primary and source urls, and also shows any referral url</li>
<li>when you flip between two images in the dupe filter, the zoom preservation calculation, which previously only locked to the same width, now tries to choose width or height based on the relative ratios of the two images to keep both images completely in view on a canvas zoom start. it should ensure that lower watermark banners stay in view and don't accidentally spill over the bottom of your monitor</li>
<li>moved popup toaster options from the 'gui' options page to the new 'popup' page</li>
<li>added options for whether the popup toaster should update while the client is minimised and while the mouse is on a different monitor than the main gui window. these options now default to false, so if you have any trouble, please try turning them back on</li>
<li>a new shortcut action in the 'global' set now flips profile mode on and off. please note for now 'global' only works on main gui and media viewer--I will add a hook to every window in the future!</li>
<li>.</li>
<li>bug fixes:</li>
<li>you now cannot start an 'upload pending' job for a service more than once at a time. the menu is now disabled for each service while uploading is happening</li>
<li>fixed a bug in media load where if the file was not in a specific domain (i.e. somewhere in all known files), its tags would not show implied parents. for non-specific files, this calculation happens on the fly, and previously it was only doing siblings</li>
<li>fixed a bug from the somewhat recent file deletion update that meant many files' custom deletion reasons were being overwritten to 'No reason given' when trash was clearing. I am sorry for the inconvenience!</li>
<li>fixed an issue with parsing 'string' from html 'script' tags (and perhaps some other 'meta' tag types) on recent versions of the built hydrus release. this should fix the newgrounds gallery parser</li>
<li>fixed some gallery parsing error handling, for when the actually fetched url differs from the original and cannot be parsed, and when the actually fetched url redirects straight to a file page from a 1-length search result</li>
<li>.</li>
<li>update file handling bug fixes:</li>
<li>when repository processing runs into an update file problem, it now specifies if the file was content or definitions type</li>
<li>when the database gathers updates to process, it discriminates between definitions and content updates more carefully</li>
<li>when a hydrus update file goes through file maintenance and changes filetype, the repository update progress tracker now 'reregisters' that file, updating what content types it can expect from it and clearing out incorrect data</li>
<li>during normal update file registration, incorrect old data is now cleared out</li>
<li>.</li>
<li>boring cleanup:</li>
<li>cleaned some of the positioning code that places icons and text on thumbnails, removing hardcoded numbers and letting custom icons position better</li>
<li>cleaned some import url tracking, checking, and association code</li>
<li>refactored profile mode and query planner mode switching up out of UI code into the controller</li>
<li>added a hefty unit test to test that siblings and parents are transitively applied to mappings correctly for files outside and inside specific file services, and for tag sync and the normal tag pipeline</li>
<li>refactored some database file maintenance code to decouple the queue from the worker</li>
</ul>
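The dupe filter zoom change above picks width or height as the locked dimension so that both images stay fully in view. A minimal sketch of that idea in Python; the function name and the simple min-of-fits rule are assumptions for illustration, not hydrus's exact code:

```python
def choose_locked_zoom( canvas_size, image_size ):
    """Pick the largest zoom that keeps the whole image in view: lock to
    width or height, whichever is the tighter fit for this canvas."""
    
    ( canvas_width, canvas_height ) = canvas_size
    ( image_width, image_height ) = image_size
    
    width_fit = canvas_width / image_width
    height_fit = canvas_height / image_height
    
    # a tall image (e.g. one with a lower watermark banner) is height-limited,
    # so the height fit wins and nothing spills past the bottom of the canvas
    return min( width_fit, height_fit )
```

With a wide canvas and a tall image, the height fit is chosen; with a tall canvas and a wide image, the width fit wins.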
<li><h3 id="version_457"><a href="#version_457">version 457</a></h3></li>
<ul>
<li>smoother menubar updates:</li>
@ -139,6 +139,7 @@ SIMPLE_AUTOCOMPLETE_IF_EMPTY_PAGE_RIGHT = 131

SIMPLE_AUTOCOMPLETE_IF_EMPTY_MEDIA_PREVIOUS = 132
SIMPLE_AUTOCOMPLETE_IF_EMPTY_MEDIA_NEXT = 133
SIMPLE_MEDIA_SEEK_DELTA = 134
SIMPLE_GLOBAL_PROFILE_MODE_FLIP = 135

simple_enum_to_str_lookup = {
    SIMPLE_ARCHIVE_DELETE_FILTER_BACK : 'archive/delete filter: back',
@ -275,7 +276,8 @@ simple_enum_to_str_lookup = {

    SIMPLE_AUTOCOMPLETE_IF_EMPTY_PAGE_RIGHT : 'if input & results list are empty, move to right one service page',
    SIMPLE_AUTOCOMPLETE_IF_EMPTY_MEDIA_PREVIOUS : 'if input & results list are empty and in media viewer manage tags dialog, move to previous media',
    SIMPLE_AUTOCOMPLETE_IF_EMPTY_MEDIA_NEXT : 'if input & results list are empty and in media viewer manage tags dialog, move to next media',
    SIMPLE_MEDIA_SEEK_DELTA : 'seek media'
    SIMPLE_MEDIA_SEEK_DELTA : 'seek media',
    SIMPLE_GLOBAL_PROFILE_MODE_FLIP : 'flip profile mode on/off'
}

legacy_simple_str_to_enum_lookup = {
@ -45,7 +45,6 @@ from hydrus.client.gui import ClientGUITopLevelWindowsPanels

from hydrus.client.gui import QtPorting as QP
from hydrus.client.gui.lists import ClientGUIListManager
from hydrus.client.importing import ClientImportSubscriptions
from hydrus.client.metadata import ClientTags
from hydrus.client.metadata import ClientTagsHandling
from hydrus.client.networking import ClientNetworking
from hydrus.client.networking import ClientNetworkingBandwidth
@ -792,6 +791,60 @@ class Controller( HydrusController.HydrusController ):

    def FlipQueryPlannerMode( self ):
        
        if not HG.query_planner_mode:
            
            now = HydrusData.GetNow()
            
            HG.query_planner_start_time = now
            HG.query_planner_query_count = 0
            
            HG.query_planner_mode = True
            
            HydrusData.ShowText( 'Query Planner mode on!' )
            
        else:
            
            HG.query_planner_mode = False
            
            HG.queries_planned = set()
            
            HydrusData.ShowText( 'Query Planning done: {} queries analyzed'.format( HydrusData.ToHumanInt( HG.query_planner_query_count ) ) )
            
        
    
    def FlipProfileMode( self ):
        
        if not HG.profile_mode:
            
            now = HydrusData.GetNow()
            
            with HG.profile_counter_lock:
                
                HG.profile_start_time = now
                HG.profile_slow_count = 0
                HG.profile_fast_count = 0
                
            
            HG.profile_mode = True
            
            HydrusData.ShowText( 'Profile mode on!' )
            
        else:
            
            HG.profile_mode = False
            
            with HG.profile_counter_lock:
                
                ( slow, fast ) = ( HG.profile_slow_count, HG.profile_fast_count )
                
            
            HydrusData.ShowText( 'Profiling done: {} slow jobs, {} fast jobs'.format( HydrusData.ToHumanInt( slow ), HydrusData.ToHumanInt( fast ) ) )
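The flip pattern above (reset counters under a lock on the way in, snapshot them under the lock on the way out) can be sketched as a small self-contained class. `ProfileState` is hypothetical; hydrus's real counters live on `HydrusGlobals` behind `profile_counter_lock`, and it formats counts with `ToHumanInt`:

```python
import threading

class ProfileState:
    """Toggleable profiling state guarded by a lock, mirroring the flip pattern."""
    
    def __init__( self ):
        
        self._lock = threading.Lock()
        self.profile_mode = False
        self.slow_count = 0
        self.fast_count = 0
        
    
    def flip( self ):
        
        if not self.profile_mode:
            
            # reset the counters under the lock before turning the mode on
            with self._lock:
                
                self.slow_count = 0
                self.fast_count = 0
                
            
            self.profile_mode = True
            
            return 'Profile mode on!'
            
        else:
            
            self.profile_mode = False
            
            # snapshot the counters under the lock so the report is consistent
            with self._lock:
                
                ( slow, fast ) = ( self.slow_count, self.fast_count )
                
            
            return 'Profiling done: {} slow jobs, {} fast jobs'.format( slow, fast )
```

Turning the mode off before taking the snapshot means jobs that start after the flip are not counted into the report.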
    def GetClipboardText( self ):
        
        clipboard_text = QW.QApplication.clipboard().text()
@ -941,6 +994,11 @@ class Controller( HydrusController.HydrusController ):

        self.frame_splash_status.SetSubtext( 'network' )
        
        if self.new_options.GetBoolean( 'boot_with_network_traffic_paused' ):
            
            HG.client_controller.new_options.SetBoolean( 'pause_all_new_network_traffic', True )
            
        
        self.parsing_cache = ClientCaches.ParsingCache()
        
        client_api_manager = self.Read( 'serialisable', HydrusSerialisable.SERIALISABLE_TYPE_CLIENT_API_MANAGER )
@ -323,12 +323,13 @@ class QuickDownloadManager( object ):

        min_resolution = None
        max_resolution = None
        automatic_archive = False
        associate_primary_urls = True
        associate_source_urls = True
        
        file_import_options = FileImportOptions.FileImportOptions()
        
        file_import_options.SetPreImportOptions( exclude_deleted, do_not_check_known_urls_before_importing, do_not_check_hashes_before_importing, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
        file_import_options.SetPostImportOptions( automatic_archive, associate_source_urls )
        file_import_options.SetPostImportOptions( automatic_archive, associate_primary_urls, associate_source_urls )
        
        file_import_job = ClientImportFiles.FileImportJob( temp_path, file_import_options )
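The `SetPostImportOptions` change above adds a separate flag for primary urls (post/file url) alongside the existing source-url flag. A hedged sketch of the split's semantics; the `PostImportOptions` dataclass and `urls_to_associate` helper are hypothetical stand-ins, not hydrus's real `FileImportOptions` API:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class PostImportOptions:
    """Sketch: primary urls (the post/file url actually fetched) are now
    toggleable separately from parser-produced source urls."""
    
    automatic_archive: bool = False
    associate_primary_urls: bool = True
    associate_source_urls: bool = True
    

def urls_to_associate( options: PostImportOptions, primary_urls: List[ str ], source_urls: List[ str ] ) -> List[ str ]:
    
    # gather only the url classes the options permit
    urls = []
    
    if options.associate_primary_urls:
        
        urls.extend( primary_urls )
        
    
    if options.associate_source_urls:
        
        urls.extend( source_urls )
        
    
    return urls
```

Turning `associate_primary_urls` off is the changelog's 'unusual proxy' case: the fetched url is not a meaningful record, but parser-reported source urls still are.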
@ -134,8 +134,11 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):

        self._dictionary[ 'booleans' ][ 'show_related_tags' ] = False
        self._dictionary[ 'booleans' ][ 'show_file_lookup_script_tags' ] = False
        
        self._dictionary[ 'booleans' ][ 'hide_message_manager_on_gui_iconise' ] = HC.PLATFORM_MACOS
        self._dictionary[ 'booleans' ][ 'hide_message_manager_on_gui_deactive' ] = False
        self._dictionary[ 'booleans' ][ 'freeze_message_manager_when_mouse_on_other_monitor' ] = False
        self._dictionary[ 'booleans' ][ 'freeze_message_manager_when_main_gui_minimised' ] = False
        
        self._dictionary[ 'booleans' ][ 'load_images_with_pil' ] = False

@ -174,6 +177,7 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):

        self._dictionary[ 'booleans' ][ 'save_page_sort_on_change' ] = False
        
        self._dictionary[ 'booleans' ][ 'pause_all_new_network_traffic' ] = False
        self._dictionary[ 'booleans' ][ 'boot_with_network_traffic_paused' ] = False
        self._dictionary[ 'booleans' ][ 'pause_all_file_queues' ] = False
        self._dictionary[ 'booleans' ][ 'pause_all_watcher_checkers' ] = False
        self._dictionary[ 'booleans' ][ 'pause_all_gallery_searches' ] = False

@ -234,6 +238,9 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):

        self._dictionary[ 'booleans' ][ 'delete_lock_for_archived_files' ] = False
        
        self._dictionary[ 'booleans' ][ 'remember_last_advanced_file_deletion_reason' ] = True
        self._dictionary[ 'booleans' ][ 'remember_last_advanced_file_deletion_special_action' ] = False
        
        #
        
        self._dictionary[ 'colours' ] = HydrusSerialisable.SerialisableDictionary()

@ -448,6 +455,7 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):

        self._dictionary[ 'noneable_strings' ][ 'qt_style_name' ] = None
        self._dictionary[ 'noneable_strings' ][ 'qt_stylesheet_name' ] = None
        self._dictionary[ 'noneable_strings' ][ 'last_advanced_file_deletion_reason' ] = None
        self._dictionary[ 'noneable_strings' ][ 'last_advanced_file_deletion_special_action' ] = None
        
        self._dictionary[ 'strings' ] = {}

@ -560,6 +568,7 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):

        max_resolution = None
        
        automatic_archive = False
        associate_primary_urls = True
        associate_source_urls = True
        
        present_new_files = True

@ -571,7 +580,7 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):

        quiet_file_import_options = FileImportOptions.FileImportOptions()
        
        quiet_file_import_options.SetPreImportOptions( exclude_deleted, do_not_check_known_urls_before_importing, do_not_check_hashes_before_importing, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
        quiet_file_import_options.SetPostImportOptions( automatic_archive, associate_source_urls )
        quiet_file_import_options.SetPostImportOptions( automatic_archive, associate_primary_urls, associate_source_urls )
        quiet_file_import_options.SetPresentationOptions( present_new_files, present_already_in_inbox_files, present_already_in_archive_files )
        
        self._dictionary[ 'default_file_import_options' ][ 'quiet' ] = quiet_file_import_options

@ -583,7 +592,7 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):

        loud_file_import_options = FileImportOptions.FileImportOptions()
        
        loud_file_import_options.SetPreImportOptions( exclude_deleted, do_not_check_known_urls_before_importing, do_not_check_hashes_before_importing, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
        loud_file_import_options.SetPostImportOptions( automatic_archive, associate_source_urls )
        loud_file_import_options.SetPostImportOptions( automatic_archive, associate_primary_urls, associate_source_urls )
        loud_file_import_options.SetPresentationOptions( present_new_files, present_already_in_inbox_files, present_already_in_archive_files )
        
        self._dictionary[ 'default_file_import_options' ][ 'loud' ] = loud_file_import_options
@ -320,27 +320,23 @@ def GetHashesFromParseResults( results ):

    return hash_results

def GetHTMLTagString( tag ):
def GetHTMLTagString( tag: bs4.Tag ):
    
    # don't mess about with tag.string, tag.strings or tag.get_text
    # on a version update, these suddenly went semi bonkers and wouldn't pull text unless the types of the subtag were explicitly set
    # so we'll just do it ourselves
    
    try:
        
        all_strings = [ s for s in tag.strings if len( s ) > 0 ]
        all_strings = [ str( c ) for c in tag.descendants if isinstance( c, ( bs4.NavigableString, bs4.CData ) ) ]
        all_strings = [ s for s in all_strings if len( s ) > 0 ]
        
    except:
        
        return ''
        all_strings = []
        
    
    if len( all_strings ) == 0:
        
        result = ''
        
    else:
        
        result = all_strings[0]
        
    
    return result
    return ''.join( all_strings )
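The fix above walks every descendant text node explicitly (covering `bs4.CData`, which is how script contents surface) and joins them all, where the old code returned only the first string. A stdlib analogue of the same join-all-text-nodes idea, using `html.parser` instead of BeautifulSoup purely for illustration:

```python
from html.parser import HTMLParser

class TextCollector( HTMLParser ):
    """Collect every non-empty text node in the document, in order."""
    
    def __init__( self ):
        
        super().__init__()
        
        self.pieces = []
        
    
    def handle_data( self, data ):
        
        if len( data ) > 0:
            
            self.pieces.append( data )
            
        
    

def get_tag_string( html: str ) -> str:
    
    # join all text nodes, like the new GetHTMLTagString, rather than
    # returning only the first one as the old code did
    parser = TextCollector()
    parser.feed( html )
    
    return ''.join( parser.pieces )
```

`html.parser` delivers `<script>` bodies through `handle_data` too, which is the case the newgrounds parser fix cares about.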
def GetNamespacesFromParsableContent( parsable_content ):
@ -1886,7 +1886,7 @@ class ServiceRepository( ServiceRestricted ):

            HG.client_controller.WriteSynchronous( 'schedule_repository_update_file_maintenance', self._service_key, ClientFiles.REGENERATE_FILE_DATA_JOB_FILE_INTEGRITY_PRESENCE )
            
            raise Exception( 'An unusual error has occured during repository processing: an update file ({}) was missing. Your repository should be paused, and all update files have been scheduled for a presence check. Please permit file maintenance to check them, or tell it to do so manually, before unpausing your repository.'.format( definition_hash.hex() ) )
            raise Exception( 'An unusual error has occured during repository processing: a definition update file ({}) was missing. Your repository should be paused, and all update files have been scheduled for a presence check. Please permit file maintenance under _database->file maintenance->review_ to finish its new work, which should fix this, before unpausing your repository.'.format( definition_hash.hex() ) )
            
        
        with open( update_path, 'rb' ) as f:

@ -1902,14 +1902,14 @@ class ServiceRepository( ServiceRestricted ):

            HG.client_controller.WriteSynchronous( 'schedule_repository_update_file_maintenance', self._service_key, ClientFiles.REGENERATE_FILE_DATA_JOB_FILE_INTEGRITY_DATA )
            
            raise Exception( 'An unusual error has occured during repository processing: an update file ({}) was invalid. Your repository should be paused, and all update files have been scheduled for an integrity check. Please permit file maintenance to check them, or tell it to do so manually, before unpausing your repository.'.format( definition_hash.hex() ) )
            raise Exception( 'An unusual error has occured during repository processing: a definition update file ({}) was invalid. Your repository should be paused, and all update files have been scheduled for an integrity check. Please permit file maintenance under _database->file maintenance->review_ to finish its new work, which should fix this, before unpausing your repository.'.format( definition_hash.hex() ) )
            
        
        if not isinstance( definition_update, HydrusNetwork.DefinitionsUpdate ):
            
            HG.client_controller.WriteSynchronous( 'schedule_repository_update_file_maintenance', self._service_key, ClientFiles.REGENERATE_FILE_DATA_JOB_FILE_METADATA )
            
            raise Exception( 'An unusual error has occured during repository processing: an update file ({}) has incorrect metadata. Your repository should be paused, and all update files have been scheduled for a metadata rescan. Please permit file maintenance to fix them, or tell it to do so manually, before unpausing your repository.'.format( definition_hash.hex() ) )
            raise Exception( 'An unusual error has occured during repository processing: a definition update file ({}) has incorrect metadata. Your repository should be paused, and all update files have been scheduled for a metadata rescan. Please permit file maintenance under _database->file maintenance->review_ to finish its new work, which should fix this, before unpausing your repository.'.format( definition_hash.hex() ) )
            
        
        rows_in_this_update = definition_update.GetNumRows()
@ -2013,7 +2013,7 @@ class ServiceRepository( ServiceRestricted ):

            HG.client_controller.WriteSynchronous( 'schedule_repository_update_file_maintenance', self._service_key, ClientFiles.REGENERATE_FILE_DATA_JOB_FILE_INTEGRITY_PRESENCE )
            
            raise Exception( 'An unusual error has occured during repository processing: an update file ({}) was missing. Your repository should be paused, and all update files have been scheduled for a presence check. Please permit file maintenance to check them, or tell it to do so manually, before unpausing your repository.'.format( content_hash.hex() ) )
            raise Exception( 'An unusual error has occured during repository processing: a content update file ({}) was missing. Your repository should be paused, and all update files have been scheduled for a presence check. Please permit file maintenance under _database->file maintenance->review_ to finish its new work, which should fix this, before unpausing your repository.'.format( content_hash.hex() ) )
            
        
        with open( update_path, 'rb' ) as f:

@ -2029,14 +2029,14 @@ class ServiceRepository( ServiceRestricted ):

            HG.client_controller.WriteSynchronous( 'schedule_repository_update_file_maintenance', self._service_key, ClientFiles.REGENERATE_FILE_DATA_JOB_FILE_INTEGRITY_DATA )
            
            raise Exception( 'An unusual error has occured during repository processing: an update file ({}) was invalid. Your repository should be paused, and all update files have been scheduled for an integrity check. Please permit file maintenance to check them, or tell it to do so manually, before unpausing your repository.'.format( content_hash.hex() ) )
            raise Exception( 'An unusual error has occured during repository processing: a content update file ({}) was invalid. Your repository should be paused, and all update files have been scheduled for an integrity check. Please permit file maintenance under _database->file maintenance->review_ to finish its new work, which should fix this, before unpausing your repository.'.format( content_hash.hex() ) )
            
        
        if not isinstance( content_update, HydrusNetwork.ContentUpdate ):
            
            HG.client_controller.WriteSynchronous( 'schedule_repository_update_file_maintenance', self._service_key, ClientFiles.REGENERATE_FILE_DATA_JOB_FILE_METADATA )
            
            raise Exception( 'An unusual error has occured during repository processing: an update file ({}) has incorrect metadata. Your repository should be paused, and all update files have been scheduled for a metadata rescan. Please permit file maintenance to fix them, or tell it to do so manually, before unpausing your repository.'.format( content_hash.hex() ) )
            raise Exception( 'An unusual error has occured during repository processing: a content update file ({}) has incorrect metadata. Your repository should be paused, and all update files have been scheduled for a metadata rescan. Please permit file maintenance under _database->file maintenance->review_ to finish its new work, which should fix this, before unpausing your repository.'.format( content_hash.hex() ) )
            
        
        rows_in_this_update = content_update.GetNumRows( content_types )
@ -33,6 +33,7 @@ from hydrus.client import ClientServices

from hydrus.client import ClientThreading
from hydrus.client.db import ClientDBDefinitionsCache
from hydrus.client.db import ClientDBFilesMaintenance
from hydrus.client.db import ClientDBFilesMaintenanceQueue
from hydrus.client.db import ClientDBFilesMetadataBasic
from hydrus.client.db import ClientDBFilesStorage
from hydrus.client.db import ClientDBMaintenance
@ -2943,7 +2944,7 @@ class DB( HydrusDB.HydrusDB ):

            possibly_affected_tag_ids.update( self._CacheTagDisplayGetImpliedBy( ClientTags.TAG_DISPLAY_ACTUAL, tag_service_id, a ) )
            possibly_affected_tag_ids.update( self._CacheTagDisplayGetImpliedBy( ClientTags.TAG_DISPLAY_ACTUAL, tag_service_id, b ) )
            possibly_affected_tag_ids.update( self._CacheTagDisplayGetImplies( ClientTags.TAG_DISPLAY_ACTUAL, tag_service_id, a ) )
            possibly_affected_tag_ids.update( self._CacheTagDisplayGetImplies( ClientTags.TAG_DISPLAY_ACTUAL, tag_service_id, a ) )
            possibly_affected_tag_ids.update( self._CacheTagDisplayGetImplies( ClientTags.TAG_DISPLAY_ACTUAL, tag_service_id, b ) )
            
            previous_chain_tag_ids_to_implied_by = self._CacheTagDisplayGetTagsToImpliedBy( ClientTags.TAG_DISPLAY_ACTUAL, tag_service_id, possibly_affected_tag_ids )

@ -2965,7 +2966,7 @@ class DB( HydrusDB.HydrusDB ):

            possibly_affected_tag_ids.update( self._CacheTagDisplayGetImpliedBy( ClientTags.TAG_DISPLAY_ACTUAL, tag_service_id, a ) )
            possibly_affected_tag_ids.update( self._CacheTagDisplayGetImpliedBy( ClientTags.TAG_DISPLAY_ACTUAL, tag_service_id, b ) )
            possibly_affected_tag_ids.update( self._CacheTagDisplayGetImplies( ClientTags.TAG_DISPLAY_ACTUAL, tag_service_id, a ) )
            possibly_affected_tag_ids.update( self._CacheTagDisplayGetImplies( ClientTags.TAG_DISPLAY_ACTUAL, tag_service_id, a ) )
            possibly_affected_tag_ids.update( self._CacheTagDisplayGetImplies( ClientTags.TAG_DISPLAY_ACTUAL, tag_service_id, b ) )
            
            previous_chain_tag_ids_to_implied_by = self._CacheTagDisplayGetTagsToImpliedBy( ClientTags.TAG_DISPLAY_ACTUAL, tag_service_id, possibly_affected_tag_ids )

@ -3025,7 +3026,7 @@ class DB( HydrusDB.HydrusDB ):

            possibly_affected_tag_ids.update( self._CacheTagDisplayGetImpliedBy( ClientTags.TAG_DISPLAY_ACTUAL, tag_service_id, a ) )
            possibly_affected_tag_ids.update( self._CacheTagDisplayGetImpliedBy( ClientTags.TAG_DISPLAY_ACTUAL, tag_service_id, b ) )
            possibly_affected_tag_ids.update( self._CacheTagDisplayGetImplies( ClientTags.TAG_DISPLAY_ACTUAL, tag_service_id, a ) )
            possibly_affected_tag_ids.update( self._CacheTagDisplayGetImplies( ClientTags.TAG_DISPLAY_ACTUAL, tag_service_id, a ) )
            possibly_affected_tag_ids.update( self._CacheTagDisplayGetImplies( ClientTags.TAG_DISPLAY_ACTUAL, tag_service_id, b ) )
            
            previous_chain_tag_ids_to_implied_by = self._CacheTagDisplayGetTagsToImpliedBy( ClientTags.TAG_DISPLAY_ACTUAL, tag_service_id, possibly_affected_tag_ids )

@ -3047,7 +3048,7 @@ class DB( HydrusDB.HydrusDB ):

            possibly_affected_tag_ids.update( self._CacheTagDisplayGetImpliedBy( ClientTags.TAG_DISPLAY_ACTUAL, tag_service_id, a ) )
            possibly_affected_tag_ids.update( self._CacheTagDisplayGetImpliedBy( ClientTags.TAG_DISPLAY_ACTUAL, tag_service_id, b ) )
            possibly_affected_tag_ids.update( self._CacheTagDisplayGetImplies( ClientTags.TAG_DISPLAY_ACTUAL, tag_service_id, a ) )
            possibly_affected_tag_ids.update( self._CacheTagDisplayGetImplies( ClientTags.TAG_DISPLAY_ACTUAL, tag_service_id, a ) )
            possibly_affected_tag_ids.update( self._CacheTagDisplayGetImplies( ClientTags.TAG_DISPLAY_ACTUAL, tag_service_id, b ) )
            
            previous_chain_tag_ids_to_implied_by = self._CacheTagDisplayGetTagsToImpliedBy( ClientTags.TAG_DISPLAY_ACTUAL, tag_service_id, possibly_affected_tag_ids )
@ -4191,7 +4192,7 @@ class DB( HydrusDB.HydrusDB ):

            self.modules_similar_files.StopSearchingFile( hash_id )
            
        
        self.modules_files_maintenance.CancelFiles( hash_ids )
        self.modules_files_maintenance_queue.CancelFiles( hash_ids )
        
        pending_upload_hash_ids = self.modules_files_storage.FilterAllPendingHashIds( hash_ids )
@ -7356,13 +7357,18 @@ class DB( HydrusDB.HydrusDB ):

            # this is likely a 'all known files' query, which means we are in deep water without a cache
            # time to compute manually, which is semi hell mode, but not dreadful
            
            display_tag_data = [ ( hash_id, ( tag_service_id, status, tag_id ) ) for ( hash_id, ( tag_service_id, status, tag_id ) ) in storage_tag_data if status in ( HC.CONTENT_STATUS_CURRENT, HC.CONTENT_STATUS_PENDING ) ]
            current_and_pending_storage_tag_data = [ ( hash_id, ( tag_service_id, status, tag_id ) ) for ( hash_id, ( tag_service_id, status, tag_id ) ) in storage_tag_data if status in ( HC.CONTENT_STATUS_CURRENT, HC.CONTENT_STATUS_PENDING ) ]
            
            seen_service_ids_to_seen_tag_ids = HydrusData.BuildKeyToSetDict( ( ( tag_service_id, tag_id ) for ( hash_id, ( tag_service_id, status, tag_id ) ) in display_tag_data ) )
            seen_service_ids_to_seen_tag_ids = HydrusData.BuildKeyToSetDict( ( ( tag_service_id, tag_id ) for ( hash_id, ( tag_service_id, status, tag_id ) ) in current_and_pending_storage_tag_data ) )
            
            seen_service_ids_to_tag_ids_to_ideal_tag_ids = { tag_service_id : self.modules_tag_siblings.GetTagsToIdeals( ClientTags.TAG_DISPLAY_ACTUAL, tag_service_id, tag_ids ) for ( tag_service_id, tag_ids ) in seen_service_ids_to_seen_tag_ids.items() }
            seen_service_ids_to_tag_ids_to_implied_tag_ids = { tag_service_id : self._CacheTagDisplayGetTagsToImplies( ClientTags.TAG_DISPLAY_ACTUAL, tag_service_id, tag_ids ) for ( tag_service_id, tag_ids ) in seen_service_ids_to_seen_tag_ids.items() }
            
            display_tag_data = [ ( hash_id, ( tag_service_id, status, seen_service_ids_to_tag_ids_to_ideal_tag_ids[ tag_service_id ][ tag_id ] ) ) for ( hash_id, ( tag_service_id, status, tag_id ) ) in display_tag_data ]
            display_tag_data = []
            
            for ( hash_id, ( tag_service_id, status, tag_id ) ) in current_and_pending_storage_tag_data:
                
                display_tag_data.extend( ( ( hash_id, ( tag_service_id, status, implied_tag_id ) ) for implied_tag_id in seen_service_ids_to_tag_ids_to_implied_tag_ids[ tag_service_id ][ tag_id ] ) )
                
            
        
        return ( storage_tag_data, display_tag_data )
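The fix above switches the on-the-fly display calculation from sibling ideals only to the full set of implied tags (ideals plus parents), expanding each stored row into one row per implied tag. The core expansion step can be sketched with plain dicts; the ids below are hypothetical, and the real code batches the mapping per tag service:

```python
def expand_to_display( storage_tag_data, tag_ids_to_implied_tag_ids ):
    """Expand each stored (current/pending) tag row into one display row per
    implied tag, i.e. the tag's sibling ideal plus any parents it implies."""
    
    display_tag_data = []
    
    for ( hash_id, ( service_id, status, tag_id ) ) in storage_tag_data:
        
        for implied_tag_id in tag_ids_to_implied_tag_ids[ tag_id ]:
            
            display_tag_data.append( ( hash_id, ( service_id, status, implied_tag_id ) ) )
            
        
    
    return display_tag_data
```

The old list comprehension mapped each tag to exactly one ideal id, which is why parents were missing for 'all known files' results.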
@ -11718,16 +11724,22 @@ class DB( HydrusDB.HydrusDB ):

        #
        
        self.modules_files_maintenance = ClientDBFilesMaintenance.ClientDBFilesMaintenance( self._c, self.modules_hashes, self.modules_hashes_local_cache, self.modules_files_metadata_basic, self.modules_similar_files, self._weakref_media_result_cache )
        self.modules_files_maintenance_queue = ClientDBFilesMaintenanceQueue.ClientDBFilesMaintenanceQueue( self._c, self.modules_hashes_local_cache )
        
        self._modules.append( self.modules_files_maintenance )
        self._modules.append( self.modules_files_maintenance_queue )
        
        #
        
        self.modules_repositories = ClientDBRepositories.ClientDBRepositories( self._c, self._cursor_transaction_wrapper, self.modules_services, self.modules_files_storage, self.modules_files_metadata_basic, self.modules_hashes_local_cache, self.modules_tags_local_cache, self.modules_files_maintenance )
        self.modules_repositories = ClientDBRepositories.ClientDBRepositories( self._c, self._cursor_transaction_wrapper, self.modules_services, self.modules_files_storage, self.modules_files_metadata_basic, self.modules_hashes_local_cache, self.modules_tags_local_cache, self.modules_files_maintenance_queue )
        
        self._modules.append( self.modules_repositories )
        
        #
        
        self.modules_files_maintenance = ClientDBFilesMaintenance.ClientDBFilesMaintenance( self._c, self.modules_files_maintenance_queue, self.modules_hashes, self.modules_hashes_local_cache, self.modules_files_metadata_basic, self.modules_similar_files, self.modules_repositories, self._weakref_media_result_cache )
        
        self._modules.append( self.modules_files_maintenance )
    
    def _ManageDBError( self, job, e ):
@ -12194,9 +12206,9 @@ class DB( HydrusDB.HydrusDB ):

            if service_type in ( HC.LOCAL_FILE_DOMAIN, HC.COMBINED_LOCAL_FILE ):
                
                reason = content_update.GetReason()
                
                if reason is not None:
                if content_update.HasReason():
                    
                    reason = content_update.GetReason()
                    
                    self.modules_files_storage.SetFileDeletionReason( hash_ids, reason )
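The fix above swaps an unconditional `GetReason` (which presumably fell back to a default string, causing the 'No reason given' overwrite on trash clearing) for an explicit `HasReason` check before reading. A minimal sketch of the pattern; the `ContentUpdate` class and the default string here are hypothetical stand-ins for hydrus's real objects:

```python
class ContentUpdate:
    """Sketch: GetReason falls back to a default, so callers that want to
    preserve an existing custom reason must check HasReason first."""
    
    def __init__( self, reason = None ):
        
        self._reason = reason
        
    
    def HasReason( self ):
        
        return self._reason is not None
        
    
    def GetReason( self ):
        
        if self._reason is None:
            
            return 'No reason given'
            
        
        return self._reason
        
    

def apply_deletion_reason( content_update, stored_reason ):
    
    # only overwrite the stored reason when the update actually carries one
    if content_update.HasReason():
        
        return content_update.GetReason()
        
    
    return stored_reason
```

Calling `GetReason` first, as the old code did, made `reason is not None` always true and clobbered custom reasons with the default.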
@ -13123,8 +13135,8 @@ class DB( HydrusDB.HydrusDB ):

        elif action == 'file_duplicate_hashes': result = self._DuplicatesGetFileHashesByDuplicateType( *args, **kwargs )
        elif action == 'file_duplicate_info': result = self._DuplicatesGetFileDuplicateInfo( *args, **kwargs )
        elif action == 'file_hashes': result = self.modules_hashes.GetFileHashes( *args, **kwargs )
        elif action == 'file_maintenance_get_job': result = self.modules_files_maintenance.GetJob( *args, **kwargs )
        elif action == 'file_maintenance_get_job_counts': result = self.modules_files_maintenance.GetJobCounts( *args, **kwargs )
        elif action == 'file_maintenance_get_job': result = self.modules_files_maintenance_queue.GetJob( *args, **kwargs )
        elif action == 'file_maintenance_get_job_counts': result = self.modules_files_maintenance_queue.GetJobCounts( *args, **kwargs )
        elif action == 'file_query_ids': result = self._GetHashIdsFromQuery( *args, **kwargs )
        elif action == 'file_system_predicates': result = self._GetFileSystemPredicates( *args, **kwargs )
        elif action == 'filter_existing_tags': result = self._FilterExistingTags( *args, **kwargs )
@ -16263,7 +16275,7 @@ class DB( HydrusDB.HydrusDB ):

            from hydrus.client import ClientFiles
            
            self.modules_files_maintenance.AddJobs( hash_ids, ClientFiles.REGENERATE_FILE_DATA_JOB_FORCE_THUMBNAIL )
            self.modules_files_maintenance_queue.AddJobs( hash_ids, ClientFiles.REGENERATE_FILE_DATA_JOB_FORCE_THUMBNAIL )
            
        except:
@ -16986,9 +16998,9 @@ class DB( HydrusDB.HydrusDB ):

        elif action == 'dissolve_duplicates_group': self._DuplicatesDissolveMediaIdFromHashes( *args, **kwargs )
        elif action == 'duplicate_pair_status': self._DuplicatesSetDuplicatePairStatus( *args, **kwargs )
        elif action == 'duplicate_set_king': self._DuplicatesSetKingFromHash( *args, **kwargs )
        elif action == 'file_maintenance_add_jobs': self.modules_files_maintenance.AddJobs( *args, **kwargs )
        elif action == 'file_maintenance_add_jobs_hashes': self.modules_files_maintenance.AddJobsHashes( *args, **kwargs )
        elif action == 'file_maintenance_cancel_jobs': self.modules_files_maintenance.CancelJobs( *args, **kwargs )
        elif action == 'file_maintenance_add_jobs': self.modules_files_maintenance_queue.AddJobs( *args, **kwargs )
        elif action == 'file_maintenance_add_jobs_hashes': self.modules_files_maintenance_queue.AddJobsHashes( *args, **kwargs )
        elif action == 'file_maintenance_cancel_jobs': self.modules_files_maintenance_queue.CancelJobs( *args, **kwargs )
        elif action == 'file_maintenance_clear_jobs': self.modules_files_maintenance.ClearJobs( *args, **kwargs )
        elif action == 'fix_logically_inconsistent_mappings': self._FixLogicallyInconsistentMappings( *args, **kwargs )
        elif action == 'imageboard': self.modules_serialisable.SetYAMLDump( ClientDBSerialisable.YAML_DUMP_ID_IMAGEBOARD, *args, **kwargs )
@@ -7,9 +7,11 @@ from hydrus.core import HydrusGlobals as HG
 from hydrus.client import ClientFiles
 from hydrus.client.db import ClientDBDefinitionsCache
+from hydrus.client.db import ClientDBFilesMaintenanceQueue
 from hydrus.client.db import ClientDBFilesMetadataBasic
 from hydrus.client.db import ClientDBMaster
 from hydrus.client.db import ClientDBModule
 from hydrus.client.db import ClientDBRepositories
 from hydrus.client.db import ClientDBSimilarFiles
 from hydrus.client.media import ClientMediaResultCache
@@ -18,60 +20,26 @@ class ClientDBFilesMaintenance( ClientDBModule.ClientDBModule ):
     def __init__(
         self,
         cursor: sqlite3.Cursor,
+        modules_files_maintenance_queue: ClientDBFilesMaintenanceQueue.ClientDBFilesMaintenanceQueue,
         modules_hashes: ClientDBMaster.ClientDBMasterHashes,
         modules_hashes_local_cache: ClientDBDefinitionsCache.ClientDBCacheLocalHashes,
         modules_files_metadata_basic: ClientDBFilesMetadataBasic.ClientDBFilesMetadataBasic,
         modules_similar_files: ClientDBSimilarFiles.ClientDBSimilarFiles,
         modules_repositories: ClientDBRepositories.ClientDBRepositories,
         weakref_media_result_cache: ClientMediaResultCache.MediaResultCache
     ):
         
         ClientDBModule.ClientDBModule.__init__( self, 'client files maintenance', cursor )
         
+        self.modules_files_maintenance_queue = modules_files_maintenance_queue
         self.modules_hashes = modules_hashes
         self.modules_hashes_local_cache = modules_hashes_local_cache
         self.modules_files_metadata_basic = modules_files_metadata_basic
         self.modules_similar_files = modules_similar_files
         self.modules_repositories = modules_repositories
         self._weakref_media_result_cache = weakref_media_result_cache
         
     
-    def _GetInitialTableGenerationDict( self ) -> dict:
-        
-        return {
-            'external_caches.file_maintenance_jobs' : ( 'CREATE TABLE IF NOT EXISTS {} ( hash_id INTEGER, job_type INTEGER, time_can_start INTEGER, PRIMARY KEY ( hash_id, job_type ) );', 400 )
-        }
-        
-    
-    def AddJobs( self, hash_ids, job_type, time_can_start = 0 ):
-        
-        deletee_job_types = ClientFiles.regen_file_enum_to_overruled_jobs[ job_type ]
-        
-        for deletee_job_type in deletee_job_types:
-            
-            self._ExecuteMany( 'DELETE FROM file_maintenance_jobs WHERE hash_id = ? AND job_type = ?;', ( ( hash_id, deletee_job_type ) for hash_id in hash_ids ) )
-            
-        
-        #
-        
-        self._ExecuteMany( 'REPLACE INTO file_maintenance_jobs ( hash_id, job_type, time_can_start ) VALUES ( ?, ?, ? );', ( ( hash_id, job_type, time_can_start ) for hash_id in hash_ids ) )
-        
-    
-    def AddJobsHashes( self, hashes, job_type, time_can_start = 0 ):
-        
-        hash_ids = self.modules_hashes_local_cache.GetHashIds( hashes )
-        
-        self.AddJobs( hash_ids, job_type, time_can_start = time_can_start )
-        
-    
-    def CancelFiles( self, hash_ids ):
-        
-        self._ExecuteMany( 'DELETE FROM file_maintenance_jobs WHERE hash_id = ?;', ( ( hash_id, ) for hash_id in hash_ids ) )
-        
-    
-    def CancelJobs( self, job_type ):
-        
-        self._Execute( 'DELETE FROM file_maintenance_jobs WHERE job_type = ?;', ( job_type, ) )
-        
-    
     def ClearJobs( self, cleared_job_tuples ):
         
         new_file_info = set()
@@ -85,6 +53,7 @@ class ClientDBFilesMaintenance( ClientDBModule.ClientDBModule ):
         if job_type == ClientFiles.REGENERATE_FILE_DATA_JOB_FILE_METADATA:
             
             original_resolution = self.modules_files_metadata_basic.GetResolution( hash_id )
+            original_mime = self.modules_files_metadata_basic.GetMime( hash_id )
             
             ( size, mime, width, height, duration, num_frames, has_audio, num_words ) = additional_data
@@ -100,20 +69,25 @@ class ClientDBFilesMaintenance( ClientDBModule.ClientDBModule ):
             if not self.modules_hashes.HasExtraHashes( hash_id ):
                 
-                self.AddJobs( { hash_id }, ClientFiles.REGENERATE_FILE_DATA_JOB_OTHER_HASHES )
+                self.modules_files_maintenance_queue.AddJobs( { hash_id }, ClientFiles.REGENERATE_FILE_DATA_JOB_OTHER_HASHES )
                 
             
             result = self._Execute( 'SELECT 1 FROM file_modified_timestamps WHERE hash_id = ?;', ( hash_id, ) ).fetchone()
             
             if result is None:
                 
-                self.AddJobs( { hash_id }, ClientFiles.REGENERATE_FILE_DATA_JOB_FILE_MODIFIED_TIMESTAMP )
+                self.modules_files_maintenance_queue.AddJobs( { hash_id }, ClientFiles.REGENERATE_FILE_DATA_JOB_FILE_MODIFIED_TIMESTAMP )
                 
             
+            if mime != original_mime and ( mime in HC.HYDRUS_UPDATE_FILES or original_mime in HC.HYDRUS_UPDATE_FILES ):
+                
+                self.modules_repositories.NotifyUpdatesChanged( ( hash_id, ) )
+                
+            
             if mime in HC.MIMES_WITH_THUMBNAILS and resolution_changed:
                 
-                self.AddJobs( { hash_id }, ClientFiles.REGENERATE_FILE_DATA_JOB_FORCE_THUMBNAIL )
+                self.modules_files_maintenance_queue.AddJobs( { hash_id }, ClientFiles.REGENERATE_FILE_DATA_JOB_FORCE_THUMBNAIL )
                 
             
         elif job_type == ClientFiles.REGENERATE_FILE_DATA_JOB_OTHER_HASHES:
@@ -144,7 +118,7 @@ class ClientDBFilesMaintenance( ClientDBModule.ClientDBModule ):
             
             if not self.modules_similar_files.FileIsInSystem( hash_id ):
                 
-                self.AddJobs( ( hash_id, ), ClientFiles.REGENERATE_FILE_DATA_JOB_SIMILAR_FILES_METADATA )
+                self.modules_files_maintenance_queue.AddJobs( ( hash_id, ), ClientFiles.REGENERATE_FILE_DATA_JOB_SIMILAR_FILES_METADATA )
                 
             
         else:
@@ -189,50 +163,3 @@ class ClientDBFilesMaintenance( ClientDBModule.ClientDBModule ):
     
-    def GetJob( self, job_types = None ):
-        
-        if job_types is None:
-            
-            possible_job_types = ClientFiles.ALL_REGEN_JOBS_IN_PREFERRED_ORDER
-            
-        else:
-            
-            possible_job_types = job_types
-            
-        
-        for job_type in possible_job_types:
-            
-            hash_ids = self._STL( self._Execute( 'SELECT hash_id FROM file_maintenance_jobs WHERE job_type = ? AND time_can_start < ? LIMIT ?;', ( job_type, HydrusData.GetNow(), 256 ) ) )
-            
-            if len( hash_ids ) > 0:
-                
-                hashes = self.modules_hashes_local_cache.GetHashes( hash_ids )
-                
-                return ( hashes, job_type )
-                
-            
-        
-        return None
-        
-    
-    def GetJobCounts( self ):
-        
-        result = self._Execute( 'SELECT job_type, COUNT( * ) FROM file_maintenance_jobs WHERE time_can_start < ? GROUP BY job_type;', ( HydrusData.GetNow(), ) ).fetchall()
-        
-        job_types_to_count = dict( result )
-        
-        return job_types_to_count
-        
-    
-    def GetTablesAndColumnsThatUseDefinitions( self, content_type: int ) -> typing.List[ typing.Tuple[ str, str ] ]:
-        
-        tables_and_columns = []
-        
-        if HC.CONTENT_TYPE_HASH:
-            
-            tables_and_columns.append( ( 'file_maintenance_jobs', 'hash_id' ) )
-            
-        
-        return tables_and_columns
-        
-    
@@ -0,0 +1,113 @@
+import sqlite3
+import typing
+
+from hydrus.core import HydrusConstants as HC
+from hydrus.core import HydrusData
+from hydrus.core import HydrusGlobals as HG
+
+from hydrus.client import ClientFiles
+from hydrus.client.db import ClientDBDefinitionsCache
+from hydrus.client.db import ClientDBFilesMetadataBasic
+from hydrus.client.db import ClientDBMaster
+from hydrus.client.db import ClientDBModule
+from hydrus.client.db import ClientDBSimilarFiles
+from hydrus.client.media import ClientMediaResultCache
+
+class ClientDBFilesMaintenanceQueue( ClientDBModule.ClientDBModule ):
+    
+    def __init__(
+        self,
+        cursor: sqlite3.Cursor,
+        modules_hashes_local_cache: ClientDBDefinitionsCache.ClientDBCacheLocalHashes,
+    ):
+        
+        ClientDBModule.ClientDBModule.__init__( self, 'client files maintenance', cursor )
+        
+        self.modules_hashes_local_cache = modules_hashes_local_cache
+        
+    
+    def _GetInitialTableGenerationDict( self ) -> dict:
+        
+        return {
+            'external_caches.file_maintenance_jobs' : ( 'CREATE TABLE IF NOT EXISTS {} ( hash_id INTEGER, job_type INTEGER, time_can_start INTEGER, PRIMARY KEY ( hash_id, job_type ) );', 400 )
+        }
+        
+    
+    def AddJobs( self, hash_ids, job_type, time_can_start = 0 ):
+        
+        deletee_job_types = ClientFiles.regen_file_enum_to_overruled_jobs[ job_type ]
+        
+        for deletee_job_type in deletee_job_types:
+            
+            self._ExecuteMany( 'DELETE FROM file_maintenance_jobs WHERE hash_id = ? AND job_type = ?;', ( ( hash_id, deletee_job_type ) for hash_id in hash_ids ) )
+            
+        
+        #
+        
+        self._ExecuteMany( 'REPLACE INTO file_maintenance_jobs ( hash_id, job_type, time_can_start ) VALUES ( ?, ?, ? );', ( ( hash_id, job_type, time_can_start ) for hash_id in hash_ids ) )
+        
+    
+    def AddJobsHashes( self, hashes, job_type, time_can_start = 0 ):
+        
+        hash_ids = self.modules_hashes_local_cache.GetHashIds( hashes )
+        
+        self.AddJobs( hash_ids, job_type, time_can_start = time_can_start )
+        
+    
+    def CancelFiles( self, hash_ids ):
+        
+        self._ExecuteMany( 'DELETE FROM file_maintenance_jobs WHERE hash_id = ?;', ( ( hash_id, ) for hash_id in hash_ids ) )
+        
+    
+    def CancelJobs( self, job_type ):
+        
+        self._Execute( 'DELETE FROM file_maintenance_jobs WHERE job_type = ?;', ( job_type, ) )
+        
+    
+    def GetJob( self, job_types = None ):
+        
+        if job_types is None:
+            
+            possible_job_types = ClientFiles.ALL_REGEN_JOBS_IN_PREFERRED_ORDER
+            
+        else:
+            
+            possible_job_types = job_types
+            
+        
+        for job_type in possible_job_types:
+            
+            hash_ids = self._STL( self._Execute( 'SELECT hash_id FROM file_maintenance_jobs WHERE job_type = ? AND time_can_start < ? LIMIT ?;', ( job_type, HydrusData.GetNow(), 256 ) ) )
+            
+            if len( hash_ids ) > 0:
+                
+                hashes = self.modules_hashes_local_cache.GetHashes( hash_ids )
+                
+                return ( hashes, job_type )
+                
+            
+        
+        return None
+        
+    
+    def GetJobCounts( self ):
+        
+        result = self._Execute( 'SELECT job_type, COUNT( * ) FROM file_maintenance_jobs WHERE time_can_start < ? GROUP BY job_type;', ( HydrusData.GetNow(), ) ).fetchall()
+        
+        job_types_to_count = dict( result )
+        
+        return job_types_to_count
+        
+    
+    def GetTablesAndColumnsThatUseDefinitions( self, content_type: int ) -> typing.List[ typing.Tuple[ str, str ] ]:
+        
+        tables_and_columns = []
+        
+        if HC.CONTENT_TYPE_HASH:
+            
+            tables_and_columns.append( ( 'file_maintenance_jobs', 'hash_id' ) )
+            
+        
+        return tables_and_columns
+        
+    
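The new queue module's `AddJobs` follows a two-step pattern: first delete any queued jobs that the incoming job type overrules, then `REPLACE INTO` the queue so `( hash_id, job_type )` stays unique. A minimal standalone sketch of that pattern follows; the table name matches the diff, but the job-type constants and the overrule map here are illustrative stand-ins, not Hydrus's real `ClientFiles` values:

```python
import sqlite3

# illustrative job types and overrule map, standing in for ClientFiles constants
JOB_THUMBNAIL = 0
JOB_FORCE_THUMBNAIL = 1
overruled_jobs = { JOB_THUMBNAIL : [], JOB_FORCE_THUMBNAIL : [ JOB_THUMBNAIL ] }

db = sqlite3.connect( ':memory:' )
db.execute( 'CREATE TABLE file_maintenance_jobs ( hash_id INTEGER, job_type INTEGER, time_can_start INTEGER, PRIMARY KEY ( hash_id, job_type ) );' )

def add_jobs( hash_ids, job_type, time_can_start = 0 ):
    
    # a stronger job first deletes any weaker jobs it overrules
    for deletee_job_type in overruled_jobs[ job_type ]:
        
        db.executemany( 'DELETE FROM file_maintenance_jobs WHERE hash_id = ? AND job_type = ?;', ( ( hash_id, deletee_job_type ) for hash_id in hash_ids ) )
        
    
    # REPLACE respects the composite primary key and refreshes time_can_start
    db.executemany( 'REPLACE INTO file_maintenance_jobs ( hash_id, job_type, time_can_start ) VALUES ( ?, ?, ? );', ( ( hash_id, job_type, time_can_start ) for hash_id in hash_ids ) )
    

add_jobs( [ 1, 2 ], JOB_THUMBNAIL )
add_jobs( [ 2 ], JOB_FORCE_THUMBNAIL )

rows = sorted( db.execute( 'SELECT hash_id, job_type FROM file_maintenance_jobs;' ) )
```

After the two calls, file 1 keeps its plain thumbnail job while file 2's has been replaced by the force-thumbnail job, so duplicate or superseded work never accumulates in the queue.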
@@ -12,7 +12,7 @@ from hydrus.core.networking import HydrusNetwork
 from hydrus.client import ClientFiles
 from hydrus.client.db import ClientDBDefinitionsCache
-from hydrus.client.db import ClientDBFilesMaintenance
+from hydrus.client.db import ClientDBFilesMaintenanceQueue
 from hydrus.client.db import ClientDBFilesMetadataBasic
 from hydrus.client.db import ClientDBFilesStorage
 from hydrus.client.db import ClientDBModule
@@ -58,7 +58,7 @@ class ClientDBRepositories( ClientDBModule.ClientDBModule ):
         modules_files_metadata_basic: ClientDBFilesMetadataBasic.ClientDBFilesMetadataBasic,
         modules_hashes_local_cache: ClientDBDefinitionsCache.ClientDBCacheLocalHashes,
         modules_tags_local_cache: ClientDBDefinitionsCache.ClientDBCacheLocalTags,
-        modules_files_maintenance: ClientDBFilesMaintenance.ClientDBFilesMaintenance
+        modules_files_maintenance_queue: ClientDBFilesMaintenanceQueue.ClientDBFilesMaintenanceQueue
     ):
         
         # since we'll mostly be talking about hashes and tags we don't have locally, I think we shouldn't use the local caches
@@ -69,7 +69,7 @@ class ClientDBRepositories( ClientDBModule.ClientDBModule ):
         self.modules_services = modules_services
         self.modules_files_storage = modules_files_storage
         self.modules_files_metadata_basic = modules_files_metadata_basic
-        self.modules_files_maintenance = modules_files_maintenance
+        self.modules_files_maintenance_queue = modules_files_maintenance_queue
         self.modules_hashes_local_cache = modules_hashes_local_cache
         self.modules_tags_local_cache = modules_tags_local_cache
@@ -217,7 +217,32 @@ class ClientDBRepositories( ClientDBModule.ClientDBModule ):
         
         update_hash_ids = self._STL( self._Execute( 'SELECT hash_id FROM {};'.format( table_join ) ) )
         
-        self.modules_files_maintenance.AddJobs( update_hash_ids, job_type )
+        self.modules_files_maintenance_queue.AddJobs( update_hash_ids, job_type )
         
     
+    def _UnregisterUpdates( self, service_id, hash_ids = None ):
+        
+        ( repository_updates_table_name, repository_unregistered_updates_table_name, repository_updates_processed_table_name ) = GenerateRepositoryUpdatesTableNames( service_id )
+        
+        if hash_ids is None:
+            
+            hash_ids = self._STS( self._Execute( 'SELECT hash_id FROM {};'.format( repository_updates_processed_table_name ) ) )
+            
+        else:
+            
+            with self._MakeTemporaryIntegerTable( hash_ids, 'hash_id' ) as temp_hash_ids_table_name:
+                
+                hash_ids = self._STS( self._Execute( 'SELECT hash_id FROM {} CROSS JOIN {} USING ( hash_id );'.format( temp_hash_ids_table_name, repository_updates_processed_table_name ) ) )
+                
+            
+        
+        if len( hash_ids ) > 0:
+            
+            self._ClearOutstandingWorkCache( service_id )
+            
+            self._ExecuteMany( 'DELETE FROM {} WHERE hash_id = ?;'.format( repository_updates_processed_table_name ), ( ( hash_id, ) for hash_id in hash_ids ) )
+            self._ExecuteMany( 'INSERT OR IGNORE INTO {} ( hash_id ) VALUES ( ? );'.format( repository_unregistered_updates_table_name ), ( ( hash_id, ) for hash_id in hash_ids ) )
+            
+        
+    
     def AssociateRepositoryUpdateHashes( self, service_key: bytes, metadata_slice: HydrusNetwork.Metadata ):
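When the caller supplies specific hash_ids, `_UnregisterUpdates` intersects them with the processed-updates table by loading them into a temporary integer table and `CROSS JOIN ... USING ( hash_id )`-ing against it, rather than building a long `IN (...)` clause. A minimal sketch of that intersection idiom, with illustrative table names in place of the generated repository table names:

```python
import sqlite3

db = sqlite3.connect( ':memory:' )

# stand-in for a generated repository_updates_processed table
db.execute( 'CREATE TABLE repository_updates_processed ( hash_id INTEGER PRIMARY KEY );' )
db.executemany( 'INSERT INTO repository_updates_processed ( hash_id ) VALUES ( ? );', ( ( i, ) for i in ( 1, 2, 3 ) ) )

candidate_hash_ids = [ 2, 3, 99 ]

# the temp-table idiom: load the candidate ids, then join to intersect
db.execute( 'CREATE TEMPORARY TABLE temp_hash_ids ( hash_id INTEGER PRIMARY KEY );' )
db.executemany( 'INSERT INTO temp_hash_ids ( hash_id ) VALUES ( ? );', ( ( i, ) for i in candidate_hash_ids ) )

# only ids present in both tables survive; 99 is filtered out
hash_ids = { hash_id for ( hash_id, ) in db.execute( 'SELECT hash_id FROM temp_hash_ids CROSS JOIN repository_updates_processed USING ( hash_id );' ) }
```

The temp table keeps the query plan stable for arbitrarily large id sets, which is presumably why the module wraps the same idiom in `_MakeTemporaryIntegerTable`.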
@@ -385,15 +410,13 @@ class ClientDBRepositories( ClientDBModule.ClientDBModule ):
         definition_hashes_and_content_types = []
         content_hashes_and_content_types = []
         
-        definitions_content_types = { HC.CONTENT_TYPE_DEFINITIONS }
-        
         if len( update_indices_to_unprocessed_hash_ids ) > 0:
             
             for update_index in sorted( update_indices_to_unprocessed_hash_ids.keys() ):
                 
                 unprocessed_hash_ids = update_indices_to_unprocessed_hash_ids[ update_index ]
                 
-                definition_hash_ids = { hash_id for hash_id in unprocessed_hash_ids if hash_ids_to_content_types_to_process[ hash_id ] == definitions_content_types }
+                definition_hash_ids = { hash_id for hash_id in unprocessed_hash_ids if HC.CONTENT_TYPE_DEFINITIONS in hash_ids_to_content_types_to_process[ hash_id ] }
                 content_hash_ids = { hash_id for hash_id in unprocessed_hash_ids if hash_id not in definition_hash_ids }
                 
                 for ( hash_ids, hashes_and_content_types ) in [
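The changed predicate in this hunk moves from set equality to membership: under the old test, an update whose content types included definitions plus anything else failed the `==` comparison and was misclassified as a content update. A tiny illustration of the difference, with plain ints standing in for the `HC` constants:

```python
# stand-ins for HC constants, not the real Hydrus values
CONTENT_TYPE_DEFINITIONS = 0
CONTENT_TYPE_MAPPINGS = 1

definitions_content_types = { CONTENT_TYPE_DEFINITIONS }

# an update that carries definitions alongside another content type
content_types_to_process = { CONTENT_TYPE_DEFINITIONS, CONTENT_TYPE_MAPPINGS }

# old test: exact set equality misses the mixed set
old_result = content_types_to_process == definitions_content_types

# new test: membership catches any set that includes definitions
new_result = CONTENT_TYPE_DEFINITIONS in content_types_to_process
```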
@@ -549,6 +572,15 @@ class ClientDBRepositories( ClientDBModule.ClientDBModule ):
         return tag_id
         
     
+    def NotifyUpdatesChanged( self, hash_ids ):
+        
+        for service_id in self.modules_services.GetServiceIds( HC.REPOSITORIES ):
+            
+            self._UnregisterUpdates( service_id, hash_ids )
+            self._RegisterUpdates( service_id, hash_ids )
+            
+        
+    
     def NotifyUpdatesImported( self, hash_ids ):
         
         for service_id in self.modules_services.GetServiceIds( HC.REPOSITORIES ):
@@ -427,7 +427,7 @@ def THREADUploadPending( service_key ):
         HG.client_controller.Write( 'delete_service_info', service_key, types_to_delete )
         
     
    HG.client_controller.pub( 'notify_new_pending' )
+    HG.client_controller.pub( 'notify_pending_upload_finished', service_key )
    
class FrameGUI( ClientGUITopLevelWindows.MainFrameThatResizes ):
@@ -479,6 +479,8 @@ class FrameGUI( ClientGUITopLevelWindows.MainFrameThatResizes ):
         self._garbage_snapshot = collections.Counter()
         
+        self._currently_uploading_pending = set()
+        
         self._last_clipboard_watched_text = ''
         self._clipboard_watcher_destination_page_watcher = None
         self._clipboard_watcher_destination_page_urls = None
@@ -517,6 +519,7 @@ class FrameGUI( ClientGUITopLevelWindows.MainFrameThatResizes ):
         self._controller.sub( self, 'NotifyNewServices', 'notify_new_services_gui' )
         self._controller.sub( self, 'NotifyNewSessions', 'notify_new_sessions' )
         self._controller.sub( self, 'NotifyNewUndo', 'notify_new_undo' )
+        self._controller.sub( self, 'NotifyPendingUploadFinished', 'notify_pending_upload_finished' )
         self._controller.sub( self, 'PresentImportedFilesToPage', 'imported_files_to_page' )
         self._controller.sub( self, 'SetDBLockedStatus', 'db_locked_status' )
         self._controller.sub( self, 'SetStatusBarDirty', 'set_status_bar_dirty' )
@@ -2630,19 +2633,28 @@ class FrameGUI( ClientGUITopLevelWindows.MainFrameThatResizes ):
             if num_pending + num_petitioned > 0:
                 
-                submessages = []
-                
-                if num_pending > 0:
+                if service_key in self._currently_uploading_pending:
                     
-                    submessages.append( '{} {}'.format( HydrusData.ToHumanInt( num_pending ), pending_phrase ) )
+                    title = '{}: currently uploading {}'.format( name, HydrusData.ToHumanInt( num_pending + num_petitioned ) )
                     
+                else:
+                    
+                    submessages = []
+                    
+                    if num_pending > 0:
+                        
+                        submessages.append( '{} {}'.format( HydrusData.ToHumanInt( num_pending ), pending_phrase ) )
+                        
+                    
+                    if num_petitioned > 0:
+                        
+                        submessages.append( '{} {}'.format( HydrusData.ToHumanInt( num_petitioned ), petitioned_phrase ) )
+                        
+                    
+                    title = '{}: {}'.format( name, ', '.join( submessages ) )
+                    
                 
-                if num_petitioned > 0:
-                    
-                    submessages.append( '{} {}'.format( HydrusData.ToHumanInt( num_petitioned ), petitioned_phrase ) )
-                    
-                
-                title = '{}: {}'.format( name, ', '.join( submessages ) )
+                submenu.setEnabled( service_key not in self._currently_uploading_pending )
                 
                 ClientGUIMenus.SetMenuTitle( submenu, title )
@@ -3097,8 +3109,8 @@ class FrameGUI( ClientGUITopLevelWindows.MainFrameThatResizes ):
             profile_mode_message += 'More information is available in the help, under \'reducing program lag\'.'
             
             ClientGUIMenus.AppendMenuItem( profiling, 'what is this?', 'Show profile info.', QW.QMessageBox.information, self, 'Profile modes', profile_mode_message )
-            ClientGUIMenus.AppendMenuCheckItem( profiling, 'profile mode', 'Run detailed \'profiles\'.', HG.profile_mode, self._SwitchBoolean, 'profile_mode' )
-            ClientGUIMenus.AppendMenuCheckItem( profiling, 'query planner mode', 'Run detailed \'query plans\'.', HG.query_planner_mode, self._SwitchBoolean, 'query_planner_mode' )
+            ClientGUIMenus.AppendMenuCheckItem( profiling, 'profile mode', 'Run detailed \'profiles\'.', HG.profile_mode, HG.client_controller.FlipProfileMode )
+            ClientGUIMenus.AppendMenuCheckItem( profiling, 'query planner mode', 'Run detailed \'query plans\'.', HG.query_planner_mode, HG.client_controller.FlipQueryPlannerMode )
            
            ClientGUIMenus.AppendMenu( debug, profiling, 'profiling' )
@@ -3201,6 +3213,7 @@ class FrameGUI( ClientGUITopLevelWindows.MainFrameThatResizes ):
         pause_all_new_network_traffic = self._controller.new_options.GetBoolean( 'pause_all_new_network_traffic' )
         
         self._menubar_network_all_traffic_paused = ClientGUIMenus.AppendMenuCheckItem( submenu, 'all new network traffic', 'Stop any new network jobs from sending data.', pause_all_new_network_traffic, self.FlipNetworkTrafficPaused )
+        ClientGUIMenus.AppendMenuCheckItem( submenu, 'always boot the client with paused network traffic', 'Always start the program with network traffic paused.', self._controller.new_options.GetBoolean( 'boot_with_network_traffic_paused' ), self._controller.new_options.FlipBoolean, 'boot_with_network_traffic_paused' )
         
         ClientGUIMenus.AppendSeparator( submenu )
@@ -6186,58 +6199,6 @@ The password is cleartext here but obscured in the entry dialog. Enter a blank p
             HG.phash_generation_report_mode = not HG.phash_generation_report_mode
             
-        elif name == 'profile_mode':
-            
-            if not HG.profile_mode:
-                
-                now = HydrusData.GetNow()
-                
-                with HG.profile_counter_lock:
-                    
-                    HG.profile_start_time = now
-                    HG.profile_slow_count = 0
-                    HG.profile_fast_count = 0
-                    
-                
-                HG.profile_mode = True
-                
-                HydrusData.ShowText( 'Profile mode on!' )
-                
-            else:
-                
-                HG.profile_mode = False
-                
-                with HG.profile_counter_lock:
-                    
-                    ( slow, fast ) = ( HG.profile_slow_count, HG.profile_fast_count )
-                    
-                
-                HydrusData.ShowText( 'Profiling done: {} slow jobs, {} fast jobs'.format( HydrusData.ToHumanInt( slow ), HydrusData.ToHumanInt( fast ) ) )
-                
-            
-        elif name == 'query_planner_mode':
-            
-            if not HG.query_planner_mode:
-                
-                now = HydrusData.GetNow()
-                
-                HG.query_planner_start_time = now
-                HG.query_planner_query_count = 0
-                
-                HG.query_planner_mode = True
-                
-                HydrusData.ShowText( 'Query Planner mode on!' )
-                
-            else:
-                
-                HG.query_planner_mode = False
-                
-                HG.queries_planned = set()
-                
-                HydrusData.ShowText( 'Query Planning done: {} queries analyzed'.format( HydrusData.ToHumanInt( HG.query_planner_query_count ) ) )
-                
-            
         elif name == 'pubsub_report_mode':
             
             HG.pubsub_report_mode = not HG.pubsub_report_mode
@@ -6349,6 +6310,10 @@ The password is cleartext here but obscured in the entry dialog. Enter a blank p
             return
             
         
+        self._currently_uploading_pending.add( service_key )
+        
+        self._menu_updater_pending.update()
+        
         self._controller.CallToThread( THREADUploadPending, service_key )
@@ -7104,6 +7069,13 @@ The password is cleartext here but obscured in the entry dialog. Enter a blank p
         self._menu_updater_undo.update()
         
     
+    def NotifyPendingUploadFinished( self, service_key: bytes ):
+        
+        self._currently_uploading_pending.discard( service_key )
+        
+        self._menu_updater_pending.update()
+        
+    
     def PresentImportedFilesToPage( self, hashes, page_name ):
         
         self._notebook.PresentImportedFilesToPage( hashes, page_name )
@@ -7228,6 +7200,10 @@ The password is cleartext here but obscured in the entry dialog. Enter a blank p
             
             ClientGUIMediaControls.FlipMute( ClientGUIMediaControls.AUDIO_GLOBAL )
             
+        elif action == CAC.SIMPLE_GLOBAL_PROFILE_MODE_FLIP:
+            
+            HG.client_controller.FlipProfileMode()
+            
         elif action == CAC.SIMPLE_SHOW_HIDE_SPLITTERS:
             
             self._ShowHideSplitters()
@@ -233,22 +233,49 @@ class EditFileSeedCachePanel( ClientGUIScrolledPanels.EditPanel ):
             if selected_file_seed.IsURLFileImport():
                 
-                urls = sorted( selected_file_seed.GetURLs() )
+                referral_url = selected_file_seed.GetReferralURL()
+                primary_urls = sorted( selected_file_seed.GetPrimaryURLs() )
+                source_urls = sorted( selected_file_seed.GetSourceURLs() )
                 
-                if len( urls ) == 0:
+                if referral_url is None and len( primary_urls ) + len( source_urls ) == 0:
                     
-                    ClientGUIMenus.AppendMenuLabel( menu, 'no parsed urls' )
+                    ClientGUIMenus.AppendMenuLabel( menu, 'no additional urls' )
                     
                 else:
                     
                     url_submenu = QW.QMenu( menu )
                     
-                    for url in urls:
+                    if referral_url is not None:
                         
-                        ClientGUIMenus.AppendMenuLabel( url_submenu, url )
+                        ClientGUIMenus.AppendMenuLabel( url_submenu, 'referral url:' )
+                        ClientGUIMenus.AppendMenuLabel( url_submenu, referral_url )
                         
                     
-                    ClientGUIMenus.AppendMenu( menu, url_submenu, 'parsed urls' )
+                    if len( primary_urls ) > 0:
+                        
+                        ClientGUIMenus.AppendSeparator( url_submenu )
+                        
+                        ClientGUIMenus.AppendMenuLabel( url_submenu, 'primary urls:' )
+                        
+                        for url in primary_urls:
+                            
+                            ClientGUIMenus.AppendMenuLabel( url_submenu, url )
+                            
+                        
+                    
+                    if len( source_urls ) > 0:
+                        
+                        ClientGUIMenus.AppendSeparator( url_submenu )
+                        
+                        ClientGUIMenus.AppendMenuLabel( url_submenu, 'source urls:' )
+                        
+                        for url in source_urls:
+                            
+                            ClientGUIMenus.AppendMenuLabel( url_submenu, url )
+                            
+                        
+                    
+                    ClientGUIMenus.AppendMenu( menu, url_submenu, 'additional urls' )
                     
                 
                 #
@@ -1747,7 +1747,7 @@ class GalleryImportPanel( ClientGUICommon.StaticBox ):
         HG.client_controller.gui.RegisterUIUpdateWindow( self )
         
     
-    def _SetFileImportOptions( self, file_import_options ):
+    def _SetFileImportOptions( self, file_import_options: FileImportOptions.FileImportOptions ):
         
         if self._gallery_import is not None:
@@ -1755,7 +1755,7 @@ class GalleryImportPanel( ClientGUICommon.StaticBox ):
         
     
-    def _SetTagImportOptions( self, tag_import_options ):
+    def _SetTagImportOptions( self, tag_import_options: TagImportOptions.TagImportOptions ):
         
         if self._gallery_import is not None:
@@ -765,11 +765,9 @@ class PopupMessageManager( QW.QWidget ):
         gui_frame = self.parentWidget()
         
-        possibly_on_hidden_virtual_desktop = not ClientGUIFunctions.MouseIsOnMyDisplay( gui_frame )
-        
         gui_is_hidden = not gui_frame.isVisible()
         
-        going_to_bug_out_at_hide_or_show = possibly_on_hidden_virtual_desktop or gui_is_hidden
+        going_to_bug_out_at_hide_or_show = gui_is_hidden
         
         current_focus_tlw = QW.QApplication.activeWindow()
@@ -864,13 +862,37 @@ class PopupMessageManager( QW.QWidget ):
         main_gui = self.parentWidget()
         
-        # test both because when user uses a shortcut to send gui to a diff monitor, we can't chase it
-        # this may need a better test for virtual display dismissal
-        not_on_hidden_or_virtual_display = ClientGUIFunctions.MouseIsOnMyDisplay( main_gui ) or ClientGUIFunctions.MouseIsOnMyDisplay( self )
+        gui_is_hidden = not main_gui.isVisible()
         
-        main_gui_up = not main_gui.isMinimized()
+        if gui_is_hidden:
+            
+            return False
+            
         
-        return not_on_hidden_or_virtual_display and main_gui_up
+        if HG.client_controller.new_options.GetBoolean( 'freeze_message_manager_when_mouse_on_other_monitor' ):
+            
+            # test both because when user uses a shortcut to send gui to a diff monitor, we can't chase it
+            # this may need a better test for virtual display dismissal
+            # this is also a proxy for hidden/virtual displays, which is really what it is going on about
+            on_my_monitor = ClientGUIFunctions.MouseIsOnMyDisplay( main_gui ) or ClientGUIFunctions.MouseIsOnMyDisplay( self )
+            
+            if not on_my_monitor:
+                
+                return False
+                
+            
+        
+        if HG.client_controller.new_options.GetBoolean( 'freeze_message_manager_when_main_gui_minimised' ):
+            
+            main_gui_up = not main_gui.isMinimized()
+            
+            if not main_gui_up:
+                
+                return False
+                
+            
+        
+        return True
         
     
     def _TryToMergeMessage( self, job_key ):
@@ -354,6 +354,7 @@ class EditDeleteFilesPanel( ClientGUIScrolledPanels.EditPanel ):
         self._simple_description = ClientGUICommon.BetterStaticText( self, label = 'init' )
         
         self._permitted_action_choices = []
+        self._this_dialog_includes_service_keys = False
         
         self._InitialisePermittedActionChoices( suggested_file_service_key = suggested_file_service_key )
@@ -361,13 +362,44 @@ class EditDeleteFilesPanel( ClientGUIScrolledPanels.EditPanel ):
         
         self._action_radio.Select( 0 )
         
+        if HG.client_controller.new_options.GetBoolean( 'remember_last_advanced_file_deletion_special_action' ):
+            
+            last_advanced_file_deletion_special_action = HG.client_controller.new_options.GetNoneableString( 'last_advanced_file_deletion_special_action' )
+            
+        else:
+            
+            last_advanced_file_deletion_special_action = None
+            
+        
+        if last_advanced_file_deletion_special_action is not None:
+            
+            for ( i, choice ) in enumerate( self._permitted_action_choices ):
+                
+                deletee_file_service_key = choice[1][0]
+                
+                if deletee_file_service_key == last_advanced_file_deletion_special_action:
+                    
+                    self._action_radio.Select( i )
+                    
+                    break
+                    
+                
+            
+        
         self._reason_panel = ClientGUICommon.StaticBox( self, 'reason' )
         
         permitted_reason_choices = []
         
         permitted_reason_choices.append( ( default_reason, default_reason ) )
         
-        last_advanced_file_deletion_reason = HG.client_controller.new_options.GetNoneableString( 'last_advanced_file_deletion_reason' )
+        if HG.client_controller.new_options.GetBoolean( 'remember_last_advanced_file_deletion_reason' ):
+            
+            last_advanced_file_deletion_reason = HG.client_controller.new_options.GetNoneableString( 'last_advanced_file_deletion_reason' )
+            
+        else:
+            
+            last_advanced_file_deletion_reason = None
+            
+        
         if last_advanced_file_deletion_reason is None:
@@ -534,6 +566,8 @@ class EditDeleteFilesPanel( ClientGUIScrolledPanels.EditPanel ):
             
             if deletee_file_service_key == CC.LOCAL_FILE_SERVICE_KEY:
                 
+                self._this_dialog_includes_service_keys = True
+                
                 if not HC.options[ 'confirm_trash' ]:
                     
                     # this dialog will never show
@@ -562,6 +596,8 @@ class EditDeleteFilesPanel( ClientGUIScrolledPanels.EditPanel ):
             
             else:
                 
+                self._this_dialog_includes_service_keys = True
+                
                 if num_to_delete == 1: text = 'Admin-delete this file?'
                 else: text = 'Admin-delete these ' + HydrusData.ToHumanInt( num_to_delete ) + ' files?'
@@ -686,7 +722,31 @@ class EditDeleteFilesPanel( ClientGUIScrolledPanels.EditPanel ):
     jobs = [ { file_service_key : content_updates } ]
     
-    if save_reason:
+    save_action = True
+    
+    if isinstance( file_service_key, bytes ):
+        
+        last_advanced_file_deletion_special_action = None
+        
+    else:
+        
+        previous_last_advanced_file_deletion_special_action = HG.client_controller.new_options.GetNoneableString( 'last_advanced_file_deletion_special_action' )
+        
+        # if there is nothing to do but physically delete, then we don't want to overwrite an existing 'use service' setting
+        if previous_last_advanced_file_deletion_special_action is None and not self._this_dialog_includes_service_keys:
+            
+            save_action = False
+            
+        
+        last_advanced_file_deletion_special_action = file_service_key
+        
+    
+    if save_action and HG.client_controller.new_options.GetBoolean( 'remember_last_advanced_file_deletion_special_action' ):
+        
+        HG.client_controller.new_options.SetNoneableString( 'last_advanced_file_deletion_special_action', last_advanced_file_deletion_special_action )
+        
+    
+    if save_reason and HG.client_controller.new_options.GetBoolean( 'remember_last_advanced_file_deletion_reason' ):
         
         if self._reason_radio.GetCurrentIndex() <= 0:
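The save_action guard in the hunk above is easy to misread. Restated as a standalone predicate — a hypothetical helper, not part of the hydrus codebase — reflecting the intent the changelog describes (a remembered trash preference should not be overwritten by a dialog that only offered physical deletion):

```python
def should_save_special_action( previous_saved_action, dialog_had_service_choices ):
    
    # mirrors the guard above verbatim: a remembered None (the default trash
    # preference) is kept when this dialog only offered physical deletion
    if previous_saved_action is None and not dialog_had_service_choices:
        
        return False
    
    return True
```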
@@ -1201,8 +1261,15 @@ class EditFileImportOptions( ClientGUIScrolledPanels.EditPanel ):
     post_import_panel = ClientGUICommon.StaticBox( self, 'post-import actions' )
     
     self._auto_archive = QW.QCheckBox( post_import_panel )
+    self._associate_primary_urls = QW.QCheckBox( post_import_panel )
     self._associate_source_urls = QW.QCheckBox( post_import_panel )
     
+    tt = 'Any URL in the \'chain\' to the file will be linked to it as a \'known url\' unless that URL has a matching URL Class that is set otherwise. Normally, since Gallery URL Classes are by default set not to associate, this means the file will get a visible Post URL and a less prominent direct File URL.'
+    tt += os.linesep * 2
+    tt += 'If you are doing a one-off job and do not want to associate these URLs, disable it here. Do not unset this unless you have a reason to!'
+    
+    self._associate_primary_urls.setToolTip( tt )
+    
     tt = 'If the parser discovers an additional source URL for another site (e.g. "This file on wewbooru was originally posted to Bixiv [here]."), should that URL be associated with the final URL? Should it be trusted to make \'already in db/previously deleted\' determinations?'
     tt += os.linesep * 2
     tt += 'You should turn this off if the site supplies bad (incorrect or imprecise or malformed) source urls.'
@@ -1233,9 +1300,12 @@ class EditFileImportOptions( ClientGUIScrolledPanels.EditPanel ):
     #
     
-    ( automatic_archive, associate_source_urls ) = file_import_options.GetPostImportOptions()
+    automatic_archive = file_import_options.AutomaticallyArchives()
+    associate_primary_urls = file_import_options.ShouldAssociatePrimaryURLs()
+    associate_source_urls = file_import_options.ShouldAssociateSourceURLs()
     
     self._auto_archive.setChecked( automatic_archive )
+    self._associate_primary_urls.setChecked( associate_primary_urls )
     self._associate_source_urls.setChecked( associate_source_urls )
     
     #
@@ -1282,10 +1352,12 @@ class EditFileImportOptions( ClientGUIScrolledPanels.EditPanel ):
     if show_downloader_options and HG.client_controller.new_options.GetBoolean( 'advanced_mode' ):
         
+        rows.append( ( 'associate primary urls: ', self._associate_primary_urls ) )
         rows.append( ( 'associate (and trust) additional source urls: ', self._associate_source_urls ) )
         
     else:
         
+        self._associate_primary_urls.setVisible( False )
        self._associate_source_urls.setVisible( False )
@@ -1357,6 +1429,7 @@ If you have a very large (10k+ files) file import page, consider hiding some or
     max_resolution = self._max_resolution.GetValue()
     
     automatic_archive = self._auto_archive.isChecked()
+    associate_primary_urls = self._associate_primary_urls.isChecked()
     associate_source_urls = self._associate_source_urls.isChecked()
     
     present_new_files = self._present_new_files.isChecked()
@@ -1366,7 +1439,7 @@ If you have a very large (10k+ files) file import page, consider hiding some or
     file_import_options = FileImportOptions.FileImportOptions()
     
     file_import_options.SetPreImportOptions( exclude_deleted, do_not_check_known_urls_before_importing, do_not_check_hashes_before_importing, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
-    file_import_options.SetPostImportOptions( automatic_archive, associate_source_urls )
+    file_import_options.SetPostImportOptions( automatic_archive, associate_primary_urls, associate_source_urls )
     file_import_options.SetPresentationOptions( present_new_files, present_already_in_inbox_files, present_already_in_archive_files )
     
     return file_import_options
@@ -67,6 +67,7 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
     self._listbook.AddPage( 'system tray', 'system tray', self._SystemTrayPanel( self._listbook, self._new_options ) )
     self._listbook.AddPage( 'search', 'search', self._SearchPanel( self._listbook, self._new_options ) )
     self._listbook.AddPage( 'colours', 'colours', self._ColoursPanel( self._listbook ) )
+    self._listbook.AddPage( 'popups', 'popups', self._PopupPanel( self._listbook, self._new_options ) )
     self._listbook.AddPage( 'regex favourites', 'regex favourites', self._RegexPanel( self._listbook ) )
     self._listbook.AddPage( 'sort/collect', 'sort/collect', self._SortCollectPanel( self._listbook ) )
     self._listbook.AddPage( 'downloading', 'downloading', self._DownloadingPanel( self._listbook, self._new_options ) )
@@ -956,6 +957,12 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
     self._use_advanced_file_deletion_dialog = QW.QCheckBox( advanced_file_deletion_panel )
     self._use_advanced_file_deletion_dialog.setToolTip( 'If this is set, the client will present a more complicated file deletion confirmation dialog that will permit you to set your own deletion reason and perform \'clean\' deletes that leave no deletion record (making later re-import easier).' )
     
+    self._remember_last_advanced_file_deletion_special_action = QW.QCheckBox( advanced_file_deletion_panel )
+    self._remember_last_advanced_file_deletion_special_action.setToolTip( 'This will try to remember and restore the last action you set, whether that was trash, physical delete, or physical delete and clear history.' )
+    
+    self._remember_last_advanced_file_deletion_reason = QW.QCheckBox( advanced_file_deletion_panel )
+    self._remember_last_advanced_file_deletion_reason.setToolTip( 'This will remember and restore the last reason you set for a delete.' )
+    
     self._advanced_file_deletion_reasons = ClientGUIListBoxes.QueueListBox( advanced_file_deletion_panel, 5, str, add_callable = self._AddAFDR, edit_callable = self._EditAFDR )
     
     #
|
@ -989,6 +996,9 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
|
|||
|
||||
self._use_advanced_file_deletion_dialog.clicked.connect( self._UpdateAdvancedControls )
|
||||
|
||||
self._remember_last_advanced_file_deletion_special_action.setChecked( HG.client_controller.new_options.GetBoolean( 'remember_last_advanced_file_deletion_special_action' ) )
|
||||
self._remember_last_advanced_file_deletion_reason.setChecked( HG.client_controller.new_options.GetBoolean( 'remember_last_advanced_file_deletion_reason' ) )
|
||||
|
||||
self._advanced_file_deletion_reasons.AddDatas( self._new_options.GetStringList( 'advanced_file_deletion_reasons' ) )
|
||||
|
||||
self._UpdateAdvancedControls()
|
||||
|
@@ -1034,6 +1044,8 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
     rows = []
     
     rows.append( ( 'Use the advanced file deletion dialog: ', self._use_advanced_file_deletion_dialog ) )
+    rows.append( ( 'Remember the last action: ', self._remember_last_advanced_file_deletion_special_action ) )
+    rows.append( ( 'Remember the last reason: ', self._remember_last_advanced_file_deletion_reason ) )
     
     gridbox = ClientGUICommon.WrapInGrid( advanced_file_deletion_panel, rows )
@@ -1073,14 +1085,11 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
     def _UpdateAdvancedControls( self ):
         
-        if self._use_advanced_file_deletion_dialog.isChecked():
-            
-            self._advanced_file_deletion_reasons.setEnabled( True )
-            
-        else:
-            
-            self._advanced_file_deletion_reasons.setEnabled( False )
+        advanced_enabled = self._use_advanced_file_deletion_dialog.isChecked()
+        
+        self._remember_last_advanced_file_deletion_special_action.setEnabled( advanced_enabled )
+        self._remember_last_advanced_file_deletion_reason.setEnabled( advanced_enabled )
+        self._advanced_file_deletion_reasons.setEnabled( advanced_enabled )
        
    
    def UpdateOptions( self ):
@@ -1195,20 +1204,6 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
     #
     
-    self._popup_panel = ClientGUICommon.StaticBox( self, 'popup window toaster' )
-    
-    self._popup_message_character_width = QP.MakeQSpinBox( self._popup_panel, min = 16, max = 256 )
-    
-    self._popup_message_force_min_width = QW.QCheckBox( self._popup_panel )
-    
-    self._hide_message_manager_on_gui_iconise = QW.QCheckBox( self._popup_panel )
-    self._hide_message_manager_on_gui_iconise.setToolTip( 'If your message manager does not automatically minimise with your main gui, try this. It can lead to unusual show and positioning behaviour on window managers that do not support it, however.' )
-    
-    self._hide_message_manager_on_gui_deactive = QW.QCheckBox( self._popup_panel )
-    self._hide_message_manager_on_gui_deactive.setToolTip( 'If your message manager stays up after you minimise the program to the system tray using a custom window manager, try this out! It hides the popup messages as soon as the main gui loses focus.' )
-    
-    #
-    
     self._misc_panel = ClientGUICommon.StaticBox( self, 'misc' )
     
     self._always_show_iso_time = QW.QCheckBox( self._misc_panel )
@@ -1227,9 +1222,6 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
     self._secret_discord_dnd_fix = QW.QCheckBox( self._misc_panel )
     self._secret_discord_dnd_fix.setToolTip( 'This saves the lag but is potentially dangerous, as it (may) treat the from-db-files-drag as a move rather than a copy and hence only works when the drop destination will not consume the files. It requires an additional secret Alternate key to unlock.' )
     
-    self._notify_client_api_cookies = QW.QCheckBox( self._misc_panel )
-    self._notify_client_api_cookies.setToolTip( 'This will make a short-lived popup message every time you get new cookie information over the Client API.' )
-    
     self._use_qt_file_dialogs = QW.QCheckBox( self._misc_panel )
     self._use_qt_file_dialogs.setToolTip( 'If you get crashes opening file/directory dialogs, try this.' )
@@ -1256,21 +1248,12 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
     self._human_bytes_sig_figs.setValue( self._new_options.GetInteger( 'human_bytes_sig_figs' ) )
     
-    self._popup_message_character_width.setValue( self._new_options.GetInteger( 'popup_message_character_width' ) )
-    
-    self._popup_message_force_min_width.setChecked( self._new_options.GetBoolean( 'popup_message_force_min_width' ) )
-    
     self._discord_dnd_fix.setChecked( self._new_options.GetBoolean( 'discord_dnd_fix' ) )
     
     self._discord_dnd_filename_pattern.setText( self._new_options.GetString( 'discord_dnd_filename_pattern' ) )
     
     self._secret_discord_dnd_fix.setChecked( self._new_options.GetBoolean( 'secret_discord_dnd_fix' ) )
     
-    self._hide_message_manager_on_gui_iconise.setChecked( self._new_options.GetBoolean( 'hide_message_manager_on_gui_iconise' ) )
-    self._hide_message_manager_on_gui_deactive.setChecked( self._new_options.GetBoolean( 'hide_message_manager_on_gui_deactive' ) )
-    
-    self._notify_client_api_cookies.setChecked( self._new_options.GetBoolean( 'notify_client_api_cookies' ) )
-    
     self._use_qt_file_dialogs.setChecked( self._new_options.GetBoolean( 'use_qt_file_dialogs' ) )
     
     for ( name, info ) in self._new_options.GetFrameLocations():
@@ -1296,19 +1279,7 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
     rows = []
     
-    rows.append( ( 'Approximate max width of popup messages (in characters): ', self._popup_message_character_width ) )
-    rows.append( ( 'Make a short-lived popup on cookie updates through the Client API: ', self._notify_client_api_cookies ) )
-    rows.append( ( 'BUGFIX: Hide the popup toaster when the main gui is minimised: ', self._hide_message_manager_on_gui_iconise ) )
-    rows.append( ( 'BUGFIX: Hide the popup toaster when the main gui loses focus: ', self._hide_message_manager_on_gui_deactive ) )
-    
-    gridbox = ClientGUICommon.WrapInGrid( self._popup_panel, rows )
-    
-    self._popup_panel.Add( gridbox, CC.FLAGS_EXPAND_SIZER_PERPENDICULAR )
-    
-    rows = []
-    
     rows.append( ( 'Prefer ISO time ("2018-03-01 12:40:23") to "5 days ago": ', self._always_show_iso_time ) )
-    rows.append( ( 'BUGFIX: Force this width as the minimum width for all popup messages: ', self._popup_message_force_min_width ) )
     rows.append( ( 'BUGFIX: Discord file drag-and-drop fix (works for <=25, <200MB file DnDs): ', self._discord_dnd_fix ) )
     rows.append( ( 'Discord drag-and-drop filename pattern: ', self._discord_dnd_filename_pattern ) )
     rows.append( ( 'Export pattern shortcuts: ', ClientGUICommon.ExportPatternButton( self ) ) )
@@ -1331,7 +1302,6 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
     vbox = QP.VBoxLayout()
     
     QP.AddToLayout( vbox, self._main_gui_panel, CC.FLAGS_EXPAND_SIZER_PERPENDICULAR )
-    QP.AddToLayout( vbox, self._popup_panel, CC.FLAGS_EXPAND_SIZER_PERPENDICULAR )
     QP.AddToLayout( vbox, self._misc_panel, CC.FLAGS_EXPAND_SIZER_PERPENDICULAR )
     QP.AddToLayout( vbox, frame_locations_panel, CC.FLAGS_EXPAND_BOTH_WAYS )
@@ -1382,22 +1352,15 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
     self._new_options.SetBoolean( 'activate_window_on_tag_search_page_activation', self._activate_window_on_tag_search_page_activation.isChecked() )
     
-    self._new_options.SetInteger( 'popup_message_character_width', self._popup_message_character_width.value() )
-    
-    self._new_options.SetBoolean( 'popup_message_force_min_width', self._popup_message_force_min_width.isChecked() )
-    
     title = self._main_gui_title.text()
     
     self._new_options.SetString( 'main_gui_title', title )
     
     HG.client_controller.pub( 'main_gui_title', title )
     
-    self._new_options.SetBoolean( 'notify_client_api_cookies', self._notify_client_api_cookies.isChecked() )
     self._new_options.SetBoolean( 'discord_dnd_fix', self._discord_dnd_fix.isChecked() )
     self._new_options.SetString( 'discord_dnd_filename_pattern', self._discord_dnd_filename_pattern.text() )
     self._new_options.SetBoolean( 'secret_discord_dnd_fix', self._secret_discord_dnd_fix.isChecked() )
-    self._new_options.SetBoolean( 'hide_message_manager_on_gui_iconise', self._hide_message_manager_on_gui_iconise.isChecked() )
-    self._new_options.SetBoolean( 'hide_message_manager_on_gui_deactive', self._hide_message_manager_on_gui_deactive.isChecked() )
     self._new_options.SetBoolean( 'use_qt_file_dialogs', self._use_qt_file_dialogs.isChecked() )
     
     for listctrl_list in self._frame_locations.GetData():
@@ -2370,6 +2333,91 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
     
+    class _PopupPanel( QW.QWidget ):
+        
+        def __init__( self, parent, new_options ):
+            
+            QW.QWidget.__init__( self, parent )
+            
+            self._new_options = new_options
+            
+            #
+            
+            self._popup_panel = ClientGUICommon.StaticBox( self, 'popup window toaster' )
+            
+            self._popup_message_character_width = QP.MakeQSpinBox( self._popup_panel, min = 16, max = 256 )
+            
+            self._popup_message_force_min_width = QW.QCheckBox( self._popup_panel )
+            
+            self._freeze_message_manager_when_mouse_on_other_monitor = QW.QCheckBox( self._popup_panel )
+            self._freeze_message_manager_when_mouse_on_other_monitor.setToolTip( 'This is useful if you have a virtual desktop and find the popup manager restores strangely when you hop back to the hydrus display.' )
+            
+            self._freeze_message_manager_when_main_gui_minimised = QW.QCheckBox( self._popup_panel )
+            self._freeze_message_manager_when_main_gui_minimised.setToolTip( 'This is useful if the popup toaster restores strangely after the client is minimised and restored.' )
+            
+            self._hide_message_manager_on_gui_iconise = QW.QCheckBox( self._popup_panel )
+            self._hide_message_manager_on_gui_iconise.setToolTip( 'If your message manager does not automatically minimise with your main gui, try this. It can lead to unusual show and positioning behaviour on window managers that do not support it, however.' )
+            
+            self._hide_message_manager_on_gui_deactive = QW.QCheckBox( self._popup_panel )
+            self._hide_message_manager_on_gui_deactive.setToolTip( 'If your message manager stays up after you minimise the program to the system tray using a custom window manager, try this out! It hides the popup messages as soon as the main gui loses focus.' )
+            
+            self._notify_client_api_cookies = QW.QCheckBox( self._popup_panel )
+            self._notify_client_api_cookies.setToolTip( 'This will make a short-lived popup message every time you get new cookie information over the Client API.' )
+            
+            #
+            
+            self._popup_message_character_width.setValue( self._new_options.GetInteger( 'popup_message_character_width' ) )
+            
+            self._popup_message_force_min_width.setChecked( self._new_options.GetBoolean( 'popup_message_force_min_width' ) )
+            
+            self._freeze_message_manager_when_mouse_on_other_monitor.setChecked( self._new_options.GetBoolean( 'freeze_message_manager_when_mouse_on_other_monitor' ) )
+            self._freeze_message_manager_when_main_gui_minimised.setChecked( self._new_options.GetBoolean( 'freeze_message_manager_when_main_gui_minimised' ) )
+            
+            self._hide_message_manager_on_gui_iconise.setChecked( self._new_options.GetBoolean( 'hide_message_manager_on_gui_iconise' ) )
+            self._hide_message_manager_on_gui_deactive.setChecked( self._new_options.GetBoolean( 'hide_message_manager_on_gui_deactive' ) )
+            
+            self._notify_client_api_cookies.setChecked( self._new_options.GetBoolean( 'notify_client_api_cookies' ) )
+            
+            #
+            
+            rows = []
+            
+            rows.append( ( 'Approximate max width of popup messages (in characters): ', self._popup_message_character_width ) )
+            rows.append( ( 'BUGFIX: Force this width as the minimum width for all popup messages: ', self._popup_message_force_min_width ) )
+            rows.append( ( 'Freeze the popup toaster when mouse is on another display: ', self._freeze_message_manager_when_mouse_on_other_monitor ) )
+            rows.append( ( 'Freeze the popup toaster when the main gui is minimised: ', self._freeze_message_manager_when_main_gui_minimised ) )
+            rows.append( ( 'BUGFIX: Hide the popup toaster when the main gui is minimised: ', self._hide_message_manager_on_gui_iconise ) )
+            rows.append( ( 'BUGFIX: Hide the popup toaster when the main gui loses focus: ', self._hide_message_manager_on_gui_deactive ) )
+            rows.append( ( 'Make a short-lived popup on cookie updates through the Client API: ', self._notify_client_api_cookies ) )
+            
+            gridbox = ClientGUICommon.WrapInGrid( self._popup_panel, rows )
+            
+            self._popup_panel.Add( gridbox, CC.FLAGS_EXPAND_SIZER_PERPENDICULAR )
+            
+            vbox = QP.VBoxLayout()
+            
+            QP.AddToLayout( vbox, self._popup_panel, CC.FLAGS_EXPAND_SIZER_PERPENDICULAR )
+            vbox.addStretch( 1 )
+            
+            self.setLayout( vbox )
+            
+        
+        def UpdateOptions( self ):
+            
+            self._new_options.SetInteger( 'popup_message_character_width', self._popup_message_character_width.value() )
+            
+            self._new_options.SetBoolean( 'popup_message_force_min_width', self._popup_message_force_min_width.isChecked() )
+            
+            self._new_options.SetBoolean( 'freeze_message_manager_when_mouse_on_other_monitor', self._freeze_message_manager_when_mouse_on_other_monitor.isChecked() )
+            self._new_options.SetBoolean( 'freeze_message_manager_when_main_gui_minimised', self._freeze_message_manager_when_main_gui_minimised.isChecked() )
+            
+            self._new_options.SetBoolean( 'hide_message_manager_on_gui_iconise', self._hide_message_manager_on_gui_iconise.isChecked() )
+            self._new_options.SetBoolean( 'hide_message_manager_on_gui_deactive', self._hide_message_manager_on_gui_deactive.isChecked() )
+            
+            self._new_options.SetBoolean( 'notify_client_api_cookies', self._notify_client_api_cookies.isChecked() )
+            
+        
+    
     class _RegexPanel( QW.QWidget ):
         
         def __init__( self, parent ):
@@ -219,7 +219,7 @@ shortcut_names_to_descriptions[ 'preview_media_window' ] = 'Actions for any vide
 
 SHORTCUTS_RESERVED_NAMES = [ 'global', 'archive_delete_filter', 'duplicate_filter', 'media', 'tags_autocomplete', 'main_gui', 'media_viewer_browser', 'media_viewer', 'media_viewer_media_window', 'preview_media_window' ]
 
-SHORTCUTS_GLOBAL_ACTIONS = [ CAC.SIMPLE_GLOBAL_AUDIO_MUTE, CAC.SIMPLE_GLOBAL_AUDIO_UNMUTE, CAC.SIMPLE_GLOBAL_AUDIO_MUTE_FLIP, CAC.SIMPLE_EXIT_APPLICATION, CAC.SIMPLE_EXIT_APPLICATION_FORCE_MAINTENANCE, CAC.SIMPLE_RESTART_APPLICATION, CAC.SIMPLE_HIDE_TO_SYSTEM_TRAY ]
+SHORTCUTS_GLOBAL_ACTIONS = [ CAC.SIMPLE_GLOBAL_AUDIO_MUTE, CAC.SIMPLE_GLOBAL_AUDIO_UNMUTE, CAC.SIMPLE_GLOBAL_AUDIO_MUTE_FLIP, CAC.SIMPLE_EXIT_APPLICATION, CAC.SIMPLE_EXIT_APPLICATION_FORCE_MAINTENANCE, CAC.SIMPLE_RESTART_APPLICATION, CAC.SIMPLE_HIDE_TO_SYSTEM_TRAY, CAC.SIMPLE_GLOBAL_PROFILE_MODE_FLIP ]
 SHORTCUTS_MEDIA_ACTIONS = [ CAC.SIMPLE_MANAGE_FILE_TAGS, CAC.SIMPLE_MANAGE_FILE_RATINGS, CAC.SIMPLE_MANAGE_FILE_URLS, CAC.SIMPLE_MANAGE_FILE_NOTES, CAC.SIMPLE_ARCHIVE_FILE, CAC.SIMPLE_INBOX_FILE, CAC.SIMPLE_DELETE_FILE, CAC.SIMPLE_UNDELETE_FILE, CAC.SIMPLE_EXPORT_FILES, CAC.SIMPLE_EXPORT_FILES_QUICK_AUTO_EXPORT, CAC.SIMPLE_REMOVE_FILE_FROM_VIEW, CAC.SIMPLE_OPEN_FILE_IN_EXTERNAL_PROGRAM, CAC.SIMPLE_OPEN_SELECTION_IN_NEW_PAGE, CAC.SIMPLE_LAUNCH_THE_ARCHIVE_DELETE_FILTER, CAC.SIMPLE_COPY_BMP, CAC.SIMPLE_COPY_BMP_OR_FILE_IF_NOT_BMPABLE, CAC.SIMPLE_COPY_FILE, CAC.SIMPLE_COPY_PATH, CAC.SIMPLE_COPY_SHA256_HASH, CAC.SIMPLE_COPY_MD5_HASH, CAC.SIMPLE_COPY_SHA1_HASH, CAC.SIMPLE_COPY_SHA512_HASH, CAC.SIMPLE_GET_SIMILAR_TO_EXACT, CAC.SIMPLE_GET_SIMILAR_TO_VERY_SIMILAR, CAC.SIMPLE_GET_SIMILAR_TO_SIMILAR, CAC.SIMPLE_GET_SIMILAR_TO_SPECULATIVE, CAC.SIMPLE_DUPLICATE_MEDIA_SET_ALTERNATE, CAC.SIMPLE_DUPLICATE_MEDIA_SET_ALTERNATE_COLLECTIONS, CAC.SIMPLE_DUPLICATE_MEDIA_SET_CUSTOM, CAC.SIMPLE_DUPLICATE_MEDIA_SET_FOCUSED_BETTER, CAC.SIMPLE_DUPLICATE_MEDIA_SET_FOCUSED_KING, CAC.SIMPLE_DUPLICATE_MEDIA_SET_SAME_QUALITY, CAC.SIMPLE_OPEN_KNOWN_URL ]
 SHORTCUTS_MEDIA_VIEWER_ACTIONS = [ CAC.SIMPLE_PAUSE_MEDIA, CAC.SIMPLE_PAUSE_PLAY_MEDIA, CAC.SIMPLE_MEDIA_SEEK_DELTA, CAC.SIMPLE_MOVE_ANIMATION_TO_PREVIOUS_FRAME, CAC.SIMPLE_MOVE_ANIMATION_TO_NEXT_FRAME, CAC.SIMPLE_SWITCH_BETWEEN_FULLSCREEN_BORDERLESS_AND_REGULAR_FRAMED_WINDOW, CAC.SIMPLE_PAN_UP, CAC.SIMPLE_PAN_DOWN, CAC.SIMPLE_PAN_LEFT, CAC.SIMPLE_PAN_RIGHT, CAC.SIMPLE_PAN_TOP_EDGE, CAC.SIMPLE_PAN_BOTTOM_EDGE, CAC.SIMPLE_PAN_LEFT_EDGE, CAC.SIMPLE_PAN_RIGHT_EDGE, CAC.SIMPLE_PAN_VERTICAL_CENTER, CAC.SIMPLE_PAN_HORIZONTAL_CENTER, CAC.SIMPLE_ZOOM_IN, CAC.SIMPLE_ZOOM_OUT, CAC.SIMPLE_SWITCH_BETWEEN_100_PERCENT_AND_CANVAS_ZOOM, CAC.SIMPLE_FLIP_DARKMODE, CAC.SIMPLE_CLOSE_MEDIA_VIEWER ]
 SHORTCUTS_MEDIA_VIEWER_BROWSER_ACTIONS = [ CAC.SIMPLE_VIEW_NEXT, CAC.SIMPLE_VIEW_FIRST, CAC.SIMPLE_VIEW_LAST, CAC.SIMPLE_VIEW_PREVIOUS, CAC.SIMPLE_PAUSE_PLAY_SLIDESHOW, CAC.SIMPLE_SHOW_MENU, CAC.SIMPLE_CLOSE_MEDIA_VIEWER ]
@@ -727,13 +727,33 @@ class Canvas( QW.QWidget ):
     ( gumpf_current_zoom, self._canvas_zoom ) = CalculateCanvasZooms( self, self._current_media, media_show_action )
     
-    # for init zoom, we want the _width_ to stay the same as previous
+    # previously, we always matched width, but this causes a problem in dupe viewer when the alternate has a little watermark on the bottom of the B file, spilling below bottom of screen
+    # we want to preserve zoom so that if user scrolls on canvas zoom we won't have overlap
     
     ( previous_width, previous_height ) = CalculateMediaSize( previous_media, self._current_zoom )
     
     ( current_media_100_width, current_media_100_height ) = self._current_media.GetResolution()
     
-    self._current_zoom = previous_width / current_media_100_width
+    potential_zooms = ( previous_width / current_media_100_width, previous_height / current_media_100_height )
+    
+    both_smaller = True not in ( self._current_zoom > potential_zoom for potential_zoom in potential_zooms )
+    both_bigger = True not in ( self._current_zoom < potential_zoom for potential_zoom in potential_zooms )
+    
+    if both_smaller:
+        
+        # keep the bigger dimension change in view
+        self._current_zoom = max( potential_zooms )
+        
+    elif both_bigger:
+        
+        # do the reverse of the above
+        self._current_zoom = min( potential_zooms )
+        
+    else:
+        
+        # fallback to width in weird situation
+        self._current_zoom = potential_zooms[0]
+        
     
     HG.client_controller.pub( 'canvas_new_zoom', self._canvas_key, self._current_zoom )
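The width/height decision above is a pure calculation, so it can be checked on plain numbers. A sketch with a hypothetical function name (the real code works on `self._current_zoom` and the media objects):

```python
def choose_preserved_zoom( previous_zoom, previous_size, new_resolution ):
    
    ( previous_width, previous_height ) = previous_size
    ( new_100_width, new_100_height ) = new_resolution
    
    # the zooms that would give the new image the old image's on-screen width or height
    potential_zooms = ( previous_width / new_100_width, previous_height / new_100_height )
    
    both_smaller = True not in ( previous_zoom > potential_zoom for potential_zoom in potential_zooms )
    both_bigger = True not in ( previous_zoom < potential_zoom for potential_zoom in potential_zooms )
    
    if both_smaller:
        
        # keep the bigger dimension change in view
        return max( potential_zooms )
        
    elif both_bigger:
        
        return min( potential_zooms )
        
    else:
        
        # fall back to matching width
        return potential_zooms[0]
```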
@@ -134,6 +134,10 @@ class CanvasFrame( ClientGUITopLevelWindows.FrameThatResizesWithHovers ):
         ClientGUIMediaControls.FlipMute( ClientGUIMediaControls.AUDIO_GLOBAL )
         
+    elif action == CAC.SIMPLE_GLOBAL_PROFILE_MODE_FLIP:
+        
+        HG.client_controller.FlipProfileMode()
+        
     else:
         
         command_processed = False
@@ -306,12 +306,13 @@ class ManagementController( HydrusSerialisable.SerialisableBase ):
     max_resolution = None
     
     automatic_archive = advanced_import_options[ 'automatic_archive' ]
+    associate_primary_urls = True
     associate_source_urls = True
     
     file_import_options = FileImportOptions.FileImportOptions()
     
     file_import_options.SetPreImportOptions( exclude_deleted, do_not_check_known_urls_before_importing, do_not_check_hashes_before_importing, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
-    file_import_options.SetPostImportOptions( automatic_archive, associate_source_urls )
+    file_import_options.SetPostImportOptions( automatic_archive, associate_primary_urls, associate_source_urls )
     
     paths_to_tags = { path : { bytes.fromhex( service_key ) : tags for ( service_key, tags ) in additional_service_keys_to_tags } for ( path, additional_service_keys_to_tags ) in paths_to_tags.items() }
@@ -4772,7 +4772,7 @@ class Thumbnail( Selectable ):
     painter.setBrush( QG.QBrush( new_options.GetColour( background_colour_type ) ) )
     
-    painter.drawRect( thumbnail_border, thumbnail_border, width-(thumbnail_border*2), height-(thumbnail_border*2) )
+    painter.drawRect( thumbnail_border, thumbnail_border, width - ( thumbnail_border * 2 ), height - ( thumbnail_border * 2 ) )
     
     thumbnail_fill = HG.client_controller.new_options.GetBoolean( 'thumbnail_fill' )
@@ -4948,6 +4948,8 @@ class Thumbnail( Selectable ):
     painter.drawRects( rectangles )
     
+    ICON_MARGIN = 1
+    
     locations_manager = self.GetLocationsManager()
     
     icons_to_draw = []
@@ -4974,19 +4976,26 @@ class Thumbnail( Selectable ):
     if len( icons_to_draw ) > 0:
         
-        icon_x = -thumbnail_border
+        icon_x = - ( thumbnail_border + ICON_MARGIN )
         
         for icon in icons_to_draw:
            
-            painter.drawPixmap( width + icon_x - 18, thumbnail_border, icon )
+            icon_x -= icon.width()
             
-            icon_x -= 18
+            painter.drawPixmap( width + icon_x, thumbnail_border, icon )
+            
+            icon_x -= 2 * ICON_MARGIN
            
        
    
     if self.IsCollection():
         
-        painter.drawPixmap( 1, height-17, CC.global_pixmaps().collection )
+        icon = CC.global_pixmaps().collection
+        
+        icon_x = thumbnail_border + ICON_MARGIN
+        icon_y = ( height - 1 ) - thumbnail_border - ICON_MARGIN - icon.height()
+        
+        painter.drawPixmap( icon_x, icon_y, icon )
         
         num_files_str = HydrusData.ToHumanInt( self.GetNumFiles() )
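The right-edge icon walk in this hunk can be verified in isolation. A sketch with hypothetical names that returns the x coordinates the loop would draw each icon at, walking right to left with an ICON_MARGIN pair between neighbours:

```python
def right_edge_icon_xs( width, icon_widths, thumbnail_border = 1, icon_margin = 1 ):
    
    xs = []
    
    icon_x = - ( thumbnail_border + icon_margin )
    
    for icon_width in icon_widths:
        
        # step left by the icon's width, record the draw position, then leave a margin pair
        icon_x -= icon_width
        
        xs.append( width + icon_x )
        
        icon_x -= 2 * icon_margin
    
    return xs
```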
@@ -5001,11 +5010,19 @@ class Thumbnail( Selectable ):
     painter.setPen( QC.Qt.NoPen )
     
-    painter.drawRect( 17, height - text_height - 3, text_width + 2, text_height + 2 )
+    box_width = text_width + ( ICON_MARGIN * 2 )
+    box_x = icon_x + icon.width() + ICON_MARGIN
+    box_height = text_height + ( ICON_MARGIN * 2 )
+    box_y = ( height - 1 ) - box_height
+    
+    painter.drawRect( box_x, height - text_height - 3, box_width, box_height )
     
     painter.setPen( QG.QPen( CC.COLOUR_SELECTED_DARK ) )
     
-    ClientGUIFunctions.DrawText( painter, 18, height - text_height - 2, num_files_str )
+    text_x = box_x + ICON_MARGIN
+    text_y = box_y + ICON_MARGIN
+    
+    ClientGUIFunctions.DrawText( painter, text_x, text_y, num_files_str )
     
     # top left icons
@@ -5073,16 +5090,13 @@ class Thumbnail( Selectable ):
     icons_to_draw.append( CC.global_pixmaps().ipfs_petitioned )
     
-    ICON_MARGIN = 1
-    ICON_SPACING = 2
-    
     top_left_x = thumbnail_border + ICON_MARGIN
     
     for icon_to_draw in icons_to_draw:
         
         painter.drawPixmap( top_left_x, thumbnail_border + ICON_MARGIN, icon_to_draw )
         
-        top_left_x += icon_to_draw.width() + ICON_SPACING
+        top_left_x += icon_to_draw.width() + ( ICON_MARGIN * 2 )
        
    
     return qt_image
@@ -35,7 +35,7 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):
     
     SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_FILE_SEED
     SERIALISABLE_NAME = 'File Import'
-    SERIALISABLE_VERSION = 4
+    SERIALISABLE_VERSION = 5
     
     def __init__( self, file_seed_type: int = None, file_seed_data: str = None ):
|
@ -65,7 +65,8 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):
|
|||
self._external_filterable_tags = set()
|
||||
self._external_additional_service_keys_to_tags = ClientTags.ServiceKeysToTags()
|
||||
|
||||
self._urls = set()
|
||||
self._primary_urls = set()
|
||||
self._source_urls = set()
|
||||
self._tags = set()
|
||||
self._hashes = {}
|
||||
|
||||
|
@ -90,6 +91,28 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):
|
|||
return self.__hash__() != other.__hash__()
|
||||
|
||||
|
||||
def _AddPrimaryURLs( self, urls ):
|
||||
|
||||
urls = ClientNetworkingDomain.NormaliseAndFilterAssociableURLs( urls )
|
||||
|
||||
urls.discard( self.file_seed_data )
|
||||
urls.discard( self._referral_url )
|
||||
|
||||
self._primary_urls.update( urls )
|
||||
self._source_urls.difference_update( urls )
|
||||
|
||||
|
||||
def _AddSourceURLs( self, urls ):
|
||||
|
||||
urls = ClientNetworkingDomain.NormaliseAndFilterAssociableURLs( urls )
|
||||
|
||||
urls.discard( self.file_seed_data )
|
||||
urls.discard( self._referral_url )
|
||||
urls.difference_update( self._primary_urls )
|
||||
|
||||
self._source_urls.update( urls )
|
||||
|
||||
|
||||
def _CheckTagsVeto( self, tags, tag_import_options: TagImportOptions.TagImportOptions ):
|
||||
|
||||
if len( tags ) > 0:
|
||||
|
@ -107,48 +130,57 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):
|
|||
serialisable_external_filterable_tags = list( self._external_filterable_tags )
|
||||
serialisable_external_additional_service_keys_to_tags = self._external_additional_service_keys_to_tags.GetSerialisableTuple()
|
||||
|
||||
serialisable_urls = list( self._urls )
|
||||
serialisable_primary_urls = list( self._primary_urls )
|
||||
serialisable_source_urls = list( self._source_urls )
|
||||
serialisable_tags = list( self._tags )
|
||||
serialisable_hashes = [ ( hash_type, hash.hex() ) for ( hash_type, hash ) in list(self._hashes.items()) if hash is not None ]
|
||||
|
||||
return ( self.file_seed_type, self.file_seed_data, self.created, self.modified, self.source_time, self.status, self.note, self._referral_url, serialisable_external_filterable_tags, serialisable_external_additional_service_keys_to_tags, serialisable_urls, serialisable_tags, serialisable_hashes )
|
||||
return (
|
||||
self.file_seed_type,
|
||||
self.file_seed_data,
|
||||
self.created,
|
||||
self.modified,
|
||||
self.source_time,
|
||||
self.status,
|
||||
self.note,
|
||||
self._referral_url,
|
||||
serialisable_external_filterable_tags,
|
||||
serialisable_external_additional_service_keys_to_tags,
|
||||
serialisable_primary_urls,
|
||||
serialisable_source_urls,
|
||||
serialisable_tags,
|
||||
serialisable_hashes
|
||||
)
|
||||
|
||||
|
||||
def _InitialiseFromSerialisableInfo( self, serialisable_info ):
|
||||
|
||||
( self.file_seed_type, self.file_seed_data, self.created, self.modified, self.source_time, self.status, self.note, self._referral_url, serialisable_external_filterable_tags, serialisable_external_additional_service_keys_to_tags, serialisable_urls, serialisable_tags, serialisable_hashes ) = serialisable_info
|
||||
(
|
||||
self.file_seed_type,
|
||||
self.file_seed_data,
|
||||
self.created,
|
||||
self.modified,
|
||||
self.source_time,
|
||||
self.status,
|
||||
self.note,
|
||||
self._referral_url,
|
||||
serialisable_external_filterable_tags,
|
||||
serialisable_external_additional_service_keys_to_tags,
|
||||
serialisable_primary_urls,
|
||||
serialisable_source_urls,
|
||||
serialisable_tags,
|
||||
serialisable_hashes
|
||||
) = serialisable_info
|
||||
|
||||
self._external_filterable_tags = set( serialisable_external_filterable_tags )
|
||||
self._external_additional_service_keys_to_tags = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_external_additional_service_keys_to_tags )
|
||||
|
||||
self._urls = set( serialisable_urls )
|
||||
self._primary_urls = set( serialisable_primary_urls )
|
||||
self._source_urls = set( serialisable_source_urls )
|
||||
self._tags = set( serialisable_tags )
|
||||
self._hashes = { hash_type : bytes.fromhex( encoded_hash ) for ( hash_type, encoded_hash ) in serialisable_hashes if encoded_hash is not None }
|
||||
|
||||
|
||||
def _NormaliseAndFilterAssociableURLs( self, urls ):
|
||||
|
||||
normalised_urls = set()
|
||||
|
||||
for url in urls:
|
||||
|
||||
try:
|
||||
|
||||
url = HG.client_controller.network_engine.domain_manager.NormaliseURL( url )
|
||||
|
||||
except HydrusExceptions.URLClassException:
|
||||
|
||||
continue # not a url--something like "file:///C:/Users/Tall%20Man/Downloads/maxresdefault.jpg" ha ha ha
|
||||
|
||||
|
||||
normalised_urls.add( url )
|
||||
|
||||
|
||||
associable_urls = { url for url in normalised_urls if HG.client_controller.network_engine.domain_manager.ShouldAssociateURLWithFiles( url ) }
|
||||
|
||||
return associable_urls
|
||||
|
||||
|
||||
def _SetupTagImportOptions( self, given_tag_import_options: TagImportOptions.TagImportOptions ) -> TagImportOptions.TagImportOptions:
|
||||
|
||||
if given_tag_import_options.IsDefault():
|
||||
|
@ -223,6 +255,47 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):
|
|||
return ( 4, new_serialisable_info )
|
||||
|
||||
|
||||
if version == 4:
|
||||
|
||||
(
|
||||
file_seed_type,
|
||||
file_seed_data,
|
||||
created,
|
||||
modified,
|
||||
source_time,
|
||||
status,
|
||||
note,
|
||||
referral_url,
|
||||
serialisable_external_filterable_tags,
|
||||
serialisable_external_additional_service_keys_to_tags,
|
||||
serialisable_urls,
|
||||
serialisable_tags,
|
||||
serialisable_hashes
|
||||
) = old_serialisable_info
|
||||
|
||||
serialisable_primary_urls = serialisable_urls
|
||||
serialisable_source_urls = []
|
||||
|
||||
new_serialisable_info = (
|
||||
file_seed_type,
|
||||
file_seed_data,
|
||||
created,
|
||||
modified,
|
||||
source_time,
|
||||
status,
|
||||
note,
|
||||
referral_url,
|
||||
serialisable_external_filterable_tags,
|
||||
serialisable_external_additional_service_keys_to_tags,
|
||||
serialisable_primary_urls,
|
||||
serialisable_source_urls,
|
||||
serialisable_tags,
|
||||
serialisable_hashes
|
||||
)
|
||||
|
||||
return ( 5, new_serialisable_info )
|
||||
|
||||
|
||||
|
||||
def AddParseResults( self, parse_results, file_import_options: FileImportOptions.FileImportOptions ):
|
||||
|
||||
|
@ -234,16 +307,9 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):
|
|||
|
||||
|
||||
|
||||
if file_import_options.ShouldAssociateSourceURLs():
|
||||
|
||||
source_urls = ClientParsing.GetURLsFromParseResults( parse_results, ( HC.URL_TYPE_SOURCE, ) )
|
||||
|
||||
associable_urls = self._NormaliseAndFilterAssociableURLs( source_urls )
|
||||
|
||||
associable_urls.discard( self.file_seed_data )
|
||||
|
||||
self._urls.update( associable_urls )
|
||||
|
||||
source_urls = ClientParsing.GetURLsFromParseResults( parse_results, ( HC.URL_TYPE_SOURCE, ) )
|
||||
|
||||
self._AddSourceURLs( source_urls )
|
||||
|
||||
tags = ClientParsing.GetTagsFromParseResults( parse_results )
|
||||
|
||||
|
@ -270,15 +336,14 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):
|
|||
self._UpdateModified()
|
||||
|
||||
|
||||
def AddURL( self, url: str ):
|
||||
def AddPrimaryURLs( self, urls ):
|
||||
|
||||
urls = ( url, )
|
||||
self._AddPrimaryURLs( urls )
|
||||
|
||||
associable_urls = self._NormaliseAndFilterAssociableURLs( urls )
|
||||
|
||||
def AddSourceURLs( self, urls ):
|
||||
|
||||
associable_urls.discard( self.file_seed_data )
|
||||
|
||||
self._urls.update( associable_urls )
|
||||
self._AddSourceURLs( urls )
|
||||
|
||||
|
||||
def CheckPreFetchMetadata( self, tag_import_options: TagImportOptions.TagImportOptions ):
|
||||
|
@ -288,7 +353,7 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):
|
|||
|
||||
def DownloadAndImportRawFile( self, file_url: str, file_import_options, network_job_factory, network_job_presentation_context_factory, status_hook, override_bandwidth = False ):
|
||||
|
||||
self.AddURL( file_url )
|
||||
self.AddPrimaryURLs( ( file_url, ) )
|
||||
|
||||
( os_file_handle, temp_path ) = HydrusPaths.GetTempPath()
|
||||
|
||||
|
@ -471,11 +536,14 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):
|
|||
urls.append( file_url )
|
||||
|
||||
|
||||
# we now only trust url-matched single urls and the post/file urls
|
||||
# trusting unmatched source urls was too much of a hassle with too many boorus providing bad source urls like user account pages
|
||||
urls.extend( self._primary_urls )
|
||||
|
||||
urls.extend( ( url for url in self._urls if HG.client_controller.network_engine.domain_manager.URLDefinitelyRefersToOneFile( url ) ) )
|
||||
# now that we store primary and source urls separately, we'll trust any primary but be careful about source
|
||||
# trusting classless source urls was too much of a hassle with too many boorus providing bad source urls like user account pages
|
||||
|
||||
urls.extend( ( url for url in self._source_urls if HG.client_controller.network_engine.domain_manager.URLDefinitelyRefersToOneFile( url ) ) )
|
||||
|
||||
# now discard gallery pages or post urls that can hold multiple files
|
||||
urls = [ url for url in urls if not HG.client_controller.network_engine.domain_manager.URLCanReferToMultipleFiles( url ) ]
|
||||
|
||||
unrecognised_url_results = set()
|
||||
|
@ -600,9 +668,19 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):
|
|||
return t
|
||||
|
||||
|
||||
def GetURLs( self ):
|
||||
def GetPrimaryURLs( self ):
|
||||
|
||||
return set( self._urls )
|
||||
return set( self._primary_urls )
|
||||
|
||||
|
||||
def GetReferralURL( self ):
|
||||
|
||||
return self._referral_url
|
||||
|
||||
|
||||
def GetSourceURLs( self ):
|
||||
|
||||
return set( self._source_urls )
|
||||
|
||||
|
||||
def HasHash( self ):
|
||||
|
@ -676,7 +754,7 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):
|
|||
HydrusPaths.CleanUpTempPath( os_file_handle, temp_path )
|
||||
|
||||
|
||||
self.WriteContentUpdates()
|
||||
self.WriteContentUpdates( file_import_options = file_import_options )
|
||||
|
||||
except HydrusExceptions.UnsupportedFileException as e:
|
||||
|
||||
|
@ -1082,8 +1160,11 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):
|
|||
file_seed.SetExternalFilterableTags( self._external_filterable_tags )
|
||||
file_seed.SetExternalAdditionalServiceKeysToTags( self._external_additional_service_keys_to_tags )
|
||||
|
||||
file_seed._urls.update( self._urls )
|
||||
file_seed._tags.update( self._tags )
|
||||
file_seed.AddPrimaryURLs( set( self._primary_urls ) )
|
||||
|
||||
file_seed.AddSourceURLs( set( self._source_urls ) )
|
||||
|
||||
file_seed.AddTags( set( self._tags ) )
|
||||
|
||||
|
||||
try:
|
||||
|
@ -1176,7 +1257,7 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):
|
|||
|
||||
if self._referral_url is not None:
|
||||
|
||||
duplicate_file_seed.AddURL( self._referral_url )
|
||||
duplicate_file_seed.AddSourceURLs( ( self._referral_url, ) )
|
||||
|
||||
|
||||
child_file_seeds.append( duplicate_file_seed )
|
||||
|
@ -1217,7 +1298,7 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):
|
|||
|
||||
|
||||
|
||||
did_substantial_work |= self.WriteContentUpdates( tag_import_options )
|
||||
did_substantial_work |= self.WriteContentUpdates( file_import_options = file_import_options, tag_import_options = tag_import_options )
|
||||
|
||||
except HydrusExceptions.ShutdownException:
|
||||
|
||||
|
@ -1278,7 +1359,7 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):
|
|||
return did_substantial_work
|
||||
|
||||
|
||||
def WriteContentUpdates( self, tag_import_options: typing.Optional[ TagImportOptions.TagImportOptions ] = None ):
|
||||
def WriteContentUpdates( self, file_import_options: typing.Optional[ FileImportOptions.FileImportOptions ] = None, tag_import_options: typing.Optional[ TagImportOptions.TagImportOptions ] = None ):
|
||||
|
||||
did_work = False
|
||||
|
||||
|
@ -1299,19 +1380,32 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):
|
|||
|
||||
service_keys_to_content_updates = collections.defaultdict( list )
|
||||
|
||||
urls = set( self._urls )
|
||||
potentially_associable_urls = set()
|
||||
|
||||
if self.file_seed_type == FILE_SEED_TYPE_URL:
|
||||
if file_import_options is not None:
|
||||
|
||||
urls.add( self.file_seed_data )
|
||||
if file_import_options.ShouldAssociatePrimaryURLs():
|
||||
|
||||
potentially_associable_urls.update( self._primary_urls )
|
||||
|
||||
if self.file_seed_type == FILE_SEED_TYPE_URL:
|
||||
|
||||
potentially_associable_urls.add( self.file_seed_data )
|
||||
|
||||
|
||||
if self._referral_url is not None:
|
||||
|
||||
potentially_associable_urls.add( self._referral_url )
|
||||
|
||||
|
||||
|
||||
if file_import_options.ShouldAssociateSourceURLs():
|
||||
|
||||
potentially_associable_urls.update( self._source_urls )
|
||||
|
||||
|
||||
|
||||
if self._referral_url is not None:
|
||||
|
||||
urls.add( self._referral_url )
|
||||
|
||||
|
||||
associable_urls = self._NormaliseAndFilterAssociableURLs( urls )
|
||||
associable_urls = ClientNetworkingDomain.NormaliseAndFilterAssociableURLs( potentially_associable_urls )
|
||||
|
||||
if len( associable_urls ) > 0:
|
||||
|
||||
|
|
|
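The paired `_AddPrimaryURLs` / `_AddSourceURLs` helpers in the FileSeed diff above keep the two url sets disjoint, with primary urls always winning over source urls. A minimal standalone sketch of that precedence rule (the `UrlSets` class and its method names are illustrative, not hydrus code, and the normalise/discard steps are omitted):

```python
class UrlSets:
    """Toy model of the FileSeed primary/source url split: a url lives in
    at most one of the two sets, and primary always beats source."""

    def __init__(self):
        self.primary_urls = set()
        self.source_urls = set()

    def add_primary_urls(self, urls):
        self.primary_urls.update(urls)
        # a url promoted to primary is no longer tracked as a mere source
        self.source_urls.difference_update(urls)

    def add_source_urls(self, urls):
        # never demote: anything already known as primary stays primary
        self.source_urls.update(set(urls) - self.primary_urls)


s = UrlSets()
s.add_source_urls({'https://example.com/post/1'})
s.add_primary_urls({'https://example.com/post/1'})
print(s.source_urls)  # set()
```

This ordering independence is why the real diff applies `difference_update` in `_AddPrimaryURLs` and subtracts `self._primary_urls` in `_AddSourceURLs`: whichever call happens first, the url ends up counted as primary only.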
@@ -282,7 +282,7 @@ class FileImportJob( object ):

 status_hook( 'generating file metadata' )

-self._file_info = HydrusFileHandling.GetFileInfo( self._temp_path, mime )
+self._file_info = HydrusFileHandling.GetFileInfo( self._temp_path, mime = mime )

 ( size, mime, width, height, duration, num_frames, has_audio, num_words ) = self._file_info
@@ -384,6 +384,8 @@ class GallerySeed( HydrusSerialisable.SerialisableBase ):

 else:

     do_parse = False

+    status = CC.STATUS_ERROR
+
     note = 'Could not parse {}: {}'.format( match_name, cannot_parse_reason )

@@ -401,7 +403,7 @@ class GallerySeed( HydrusSerialisable.SerialisableBase ):

-file_seeds = [ file_seed ]
+all_parse_results = []

-file_seeds_callable( ( file_seed, ) )
+status = CC.STATUS_SUCCESSFUL_AND_NEW
@@ -11,7 +11,7 @@ class FileImportOptions( HydrusSerialisable.SerialisableBase ):

 SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_FILE_IMPORT_OPTIONS
 SERIALISABLE_NAME = 'File Import Options'
-SERIALISABLE_VERSION = 4
+SERIALISABLE_VERSION = 5

 def __init__( self ):

@@ -27,6 +27,7 @@ class FileImportOptions( HydrusSerialisable.SerialisableBase ):

 self._min_resolution = None
 self._max_resolution = None
 self._automatic_archive = False
+self._associate_primary_urls = True
 self._associate_source_urls = True
 self._present_new_files = True
 self._present_already_in_inbox_files = True

@@ -36,7 +37,7 @@ class FileImportOptions( HydrusSerialisable.SerialisableBase ):

 def _GetSerialisableInfo( self ):

 pre_import_options = ( self._exclude_deleted, self._do_not_check_known_urls_before_importing, self._do_not_check_hashes_before_importing, self._allow_decompression_bombs, self._min_size, self._max_size, self._max_gif_size, self._min_resolution, self._max_resolution )
-post_import_options = ( self._automatic_archive, self._associate_source_urls )
+post_import_options = ( self._automatic_archive, self._associate_primary_urls, self._associate_source_urls )
 presentation_options = ( self._present_new_files, self._present_already_in_inbox_files, self._present_already_in_archive_files )

 return ( pre_import_options, post_import_options, presentation_options )

@@ -47,7 +48,7 @@ class FileImportOptions( HydrusSerialisable.SerialisableBase ):

 ( pre_import_options, post_import_options, presentation_options ) = serialisable_info

 ( self._exclude_deleted, self._do_not_check_known_urls_before_importing, self._do_not_check_hashes_before_importing, self._allow_decompression_bombs, self._min_size, self._max_size, self._max_gif_size, self._min_resolution, self._max_resolution ) = pre_import_options
-( self._automatic_archive, self._associate_source_urls ) = post_import_options
+( self._automatic_archive, self._associate_primary_urls, self._associate_source_urls ) = post_import_options
 ( self._present_new_files, self._present_already_in_inbox_files, self._present_already_in_archive_files ) = presentation_options

@@ -106,13 +107,28 @@ class FileImportOptions( HydrusSerialisable.SerialisableBase ):

 return ( 4, new_serialisable_info )

+if version == 4:
+
+    ( pre_import_options, post_import_options, presentation_options ) = old_serialisable_info
+
+    ( automatic_archive, associate_source_urls ) = post_import_options
+
+    associate_primary_urls = True
+
+    post_import_options = ( automatic_archive, associate_primary_urls, associate_source_urls )
+
+    new_serialisable_info = ( pre_import_options, post_import_options, presentation_options )
+
+    return ( 5, new_serialisable_info )
+
 def AllowsDecompressionBombs( self ):

     return self._allow_decompression_bombs

-def AutomaticallyArchives( self ):
+def AutomaticallyArchives( self ) -> bool:

     return self._automatic_archive

@@ -199,13 +215,6 @@ class FileImportOptions( HydrusSerialisable.SerialisableBase ):

 return self._exclude_deleted

-def GetPostImportOptions( self ):
-
-    post_import_options = ( self._automatic_archive, self._associate_source_urls )
-
-    return post_import_options
-
 def GetPresentationOptions( self ):

     presentation_options = ( self._present_new_files, self._present_already_in_inbox_files, self._present_already_in_archive_files )

@@ -307,9 +316,10 @@ class FileImportOptions( HydrusSerialisable.SerialisableBase ):

 return summary

-def SetPostImportOptions( self, automatic_archive, associate_source_urls ):
+def SetPostImportOptions( self, automatic_archive: bool, associate_primary_urls: bool, associate_source_urls: bool ):

     self._automatic_archive = automatic_archive
+    self._associate_primary_urls = associate_primary_urls
     self._associate_source_urls = associate_source_urls

@@ -333,7 +343,12 @@ class FileImportOptions( HydrusSerialisable.SerialisableBase ):

 self._max_resolution = max_resolution

-def ShouldAssociateSourceURLs( self ):
+def ShouldAssociatePrimaryURLs( self ) -> bool:
+
+    return self._associate_primary_urls
+
+def ShouldAssociateSourceURLs( self ) -> bool:

     return self._associate_source_urls
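The FileImportOptions 4 to 5 update above has to invent a value for the new `associate_primary_urls` flag when loading options saved by older clients, and it defaults it to True so existing behaviour is preserved. A minimal sketch of just that migration step (the function name is illustrative; the tuple shapes mirror the diff):

```python
def update_post_import_options_4_to_5(old_serialisable_info):
    # version 4 stored ( automatic_archive, associate_source_urls );
    # version 5 inserts the new associate_primary_urls flag, defaulted on
    (pre_import_options, post_import_options, presentation_options) = old_serialisable_info

    (automatic_archive, associate_source_urls) = post_import_options

    associate_primary_urls = True

    post_import_options = (automatic_archive, associate_primary_urls, associate_source_urls)

    return (5, (pre_import_options, post_import_options, presentation_options))


print(update_post_import_options_4_to_5(((), (True, False), ()))[1][1])  # (True, True, False)
```

Defaulting the new flag on, rather than reading it from anywhere, is the standard move in this serialisation scheme: every older payload passes through each `if version == n:` step in turn until it reaches the current version.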
@@ -329,13 +329,6 @@ def ConvertURLIntoSecondLevelDomain( url ):

 return ConvertDomainIntoSecondLevelDomain( domain )

-def DomainEqualsAnotherForgivingWWW( test_domain, wwwable_domain ):
-
-    # domain is either the same or starts with www. or www2. or something
-    rule = r'^(www[^\.]*\.)?' + re.escape( wwwable_domain ) + '$'
-
-    return re.search( rule, test_domain ) is not None
-
 def CookieDomainMatches( cookie, search_domain ):

     cookie_domain = cookie.domain

@@ -351,6 +344,13 @@ def CookieDomainMatches( cookie, search_domain ):

 return matches_exactly or matches_dot or valid_subdomain

+def DomainEqualsAnotherForgivingWWW( test_domain, wwwable_domain ):
+
+    # domain is either the same or starts with www. or www2. or something
+    rule = r'^(www[^\.]*\.)?' + re.escape( wwwable_domain ) + '$'
+
+    return re.search( rule, test_domain ) is not None
+
 def GetCookie( cookies, search_domain, cookie_name_string_match ):

     for cookie in cookies:

@@ -438,6 +438,28 @@ def GetSearchURLs( url ):

 return search_urls

+def NormaliseAndFilterAssociableURLs( urls ):
+
+    normalised_urls = set()
+
+    for url in urls:
+
+        try:
+
+            url = HG.client_controller.network_engine.domain_manager.NormaliseURL( url )
+
+        except HydrusExceptions.URLClassException:
+
+            continue # not a url--something like "file:///C:/Users/Tall%20Man/Downloads/maxresdefault.jpg" ha ha ha
+
+        normalised_urls.add( url )
+
+    associable_urls = { url for url in normalised_urls if HG.client_controller.network_engine.domain_manager.ShouldAssociateURLWithFiles( url ) }
+
+    return associable_urls
+
 def ParseURL( url: str ) -> urllib.parse.ParseResult:

     url = url.strip()
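The `DomainEqualsAnotherForgivingWWW` helper relocated in the diff above matches a domain against another while forgiving a single www-style prefix such as `www.` or `www2.`. A self-contained sketch of the same regex idea, outside of hydrus:

```python
import re

def domain_equals_another_forgiving_www(test_domain, wwwable_domain):
    # the test domain is either identical to the wwwable domain or carries
    # one leading www-style label (www., www2., wwwanything.)
    rule = r'^(www[^\.]*\.)?' + re.escape(wwwable_domain) + '$'
    return re.search(rule, test_domain) is not None


print(domain_equals_another_forgiving_www('www2.example.com', 'example.com'))  # True
print(domain_equals_another_forgiving_www('cdn.example.com', 'example.com'))   # False
```

The `re.escape` call matters: without it, the dots in the target domain would act as regex wildcards and `exampleXcom` style strings could match.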
@@ -81,7 +81,7 @@ options = {}

 # Misc

 NETWORK_VERSION = 20
-SOFTWARE_VERSION = 457
+SOFTWARE_VERSION = 458
 CLIENT_API_VERSION = 20

 SERVER_THUMBNAIL_DIMENSIONS = ( 200, 200 )
@@ -1927,6 +1927,11 @@ class ContentUpdate( object ):

 return len( self.GetHashes() )

+def HasReason( self ):
+
+    return self._reason is not None
+
 def IsInboxRelated( self ):

     return self._action in ( HC.CONTENT_UPDATE_ARCHIVE, HC.CONTENT_UPDATE_INBOX )
@@ -157,7 +157,7 @@ def ParseFileArguments( path, decompression_bombs_ok = False ):

-( size, mime, width, height, duration, num_frames, has_audio, num_words ) = HydrusFileHandling.GetFileInfo( path, mime )
+( size, mime, width, height, duration, num_frames, has_audio, num_words ) = HydrusFileHandling.GetFileInfo( path, mime = mime )

 except Exception as e:
@@ -734,6 +734,205 @@ class TestClientDBTags( unittest.TestCase ):

 } ) )

+def test_display_pairs_sync_transitive( self ):
+
+    # ok, so say we have the situation where Sa -> Sb, and Sb -> P, all files with Sa should get P, right? let's check
+    # we're trying to reproduce a particular reported bug here, so forgive the stochastic design
+
+    pre_combined_file_1 = os.urandom( 32 )
+    pre_combined_file_2 = os.urandom( 32 )
+    pre_combined_file_3 = os.urandom( 32 )
+
+    post_combined_file_1 = os.urandom( 32 )
+    post_combined_file_2 = os.urandom( 32 )
+    post_combined_file_3 = os.urandom( 32 )
+
+    child_tag_1 = 'samus_aran'
+    child_tag_2 = 'samus aran'
+    child_tag_3 = 'character:samus aran'
+
+    parent_1 = 'series:metroid'
+    parent_2 = 'series:nintendo'
+
+    def do_specific_imports():
+
+        import_hashes = []
+        filenames = [ 'muh_gif.gif', 'muh_jpg.jpg', 'muh_mp4.mp4', 'muh_mpeg.mpeg', 'muh_png.png', 'muh_webm.webm' ]
+
+        for filename in filenames:
+
+            HG.test_controller.SetRead( 'hash_status', ClientImportFiles.FileImportStatus.STATICGetUnknownStatus() )
+
+            path = os.path.join( HC.STATIC_DIR, 'testing', filename )
+
+            file_import_options = HG.client_controller.new_options.GetDefaultFileImportOptions( 'loud' )
+
+            file_import_job = ClientImportFiles.FileImportJob( path, file_import_options )
+
+            file_import_job.GeneratePreImportHashAndStatus()
+
+            file_import_job.GenerateInfo()
+
+            self._write( 'import_file', file_import_job )
+
+            import_hashes.append( file_import_job.GetHash() )
+
+        return import_hashes
+
+    def get_display_content_updates_in_random_order():
+
+        content_updates = []
+
+        child_tags = [ child_tag_1, child_tag_2, child_tag_3 ]
+
+        random.shuffle( child_tags )
+
+        for tag in child_tags[ 1 : ]:
+
+            content_updates.append( HydrusData.ContentUpdate( HC.CONTENT_TYPE_TAG_SIBLINGS, HC.CONTENT_UPDATE_ADD, ( tag, child_tags[ 0 ] ) ) )
+
+        parent_tags = [ parent_1, parent_2 ]
+
+        random.shuffle( parent_tags )
+
+        random.shuffle( child_tags )
+
+        content_updates.append( HydrusData.ContentUpdate( HC.CONTENT_TYPE_TAG_PARENTS, HC.CONTENT_UPDATE_ADD, ( child_tags[0], parent_tags[0] ) ) )
+
+        child_tags.append( parent_tags[0] )
+
+        random.shuffle( child_tags )
+
+        content_updates.append( HydrusData.ContentUpdate( HC.CONTENT_TYPE_TAG_PARENTS, HC.CONTENT_UPDATE_ADD, ( child_tags[0], parent_tags[1] ) ) )
+
+        return content_updates
+
+    self._clear_db()
+
+    (
+        pre_specific_file_1,
+        pre_specific_file_2,
+        pre_specific_file_3,
+        post_specific_file_1,
+        post_specific_file_2,
+        post_specific_file_3
+    ) = do_specific_imports()
+
+    def get_post_mapping_content_updates():
+
+        rows = [
+            ( child_tag_1, ( post_combined_file_1, ) ),
+            ( child_tag_2, ( post_combined_file_2, ) ),
+            ( child_tag_3, ( post_combined_file_3, ) ),
+            ( child_tag_1, ( post_specific_file_1, ) ),
+            ( child_tag_2, ( post_specific_file_2, ) ),
+            ( child_tag_3, ( post_specific_file_3, ) )
+        ]
+
+        content_updates = [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_MAPPINGS, HC.CONTENT_UPDATE_ADD, row ) for row in rows ]
+
+        return content_updates
+
+    def get_pre_mapping_content_updates():
+
+        rows = [
+            ( child_tag_1, ( pre_combined_file_1, ) ),
+            ( child_tag_2, ( pre_combined_file_2, ) ),
+            ( child_tag_3, ( pre_combined_file_3, ) ),
+            ( child_tag_1, ( pre_specific_file_1, ) ),
+            ( child_tag_2, ( pre_specific_file_2, ) ),
+            ( child_tag_3, ( pre_specific_file_3, ) )
+        ]
+
+        content_updates = [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_MAPPINGS, HC.CONTENT_UPDATE_ADD, row ) for row in rows ]
+
+        return content_updates
+
+    services = self._read( 'services' )
+
+    other_service_keys = [ HydrusData.GenerateKey() for i in range( 32 ) ]
+
+    for other_service_key in other_service_keys:
+
+        services.append( ClientServices.GenerateService( other_service_key, HC.LOCAL_TAG, other_service_key.hex() ) )
+
+    self._write( 'update_services', services )
+
+    # let's do it a bunch of times in different orders with different structures
+
+    for other_service_key in other_service_keys:
+
+        self._write( 'content_updates', { other_service_key : get_pre_mapping_content_updates() } )
+
+        content_updates = get_display_content_updates_in_random_order()
+
+        # let's test a mix of atomic and complete sync
+        block_size = random.choice( [ 1, 3, 5 ] )
+
+        for block_of_content_updates in HydrusData.SplitListIntoChunks( content_updates, block_size ):
+
+            self._write( 'content_updates', { other_service_key : block_of_content_updates } )
+
+            still_work_to_do = True
+
+            while still_work_to_do:
+
+                still_work_to_do = self._write( 'sync_tag_display_maintenance', other_service_key, 1 )
+
+        self._write( 'content_updates', { other_service_key : get_post_mapping_content_updates() } )
+
+        ( siblings, ideal_sibling, descendants, ancestors ) = self._read( 'tag_siblings_and_parents_lookup', ( child_tag_1, ) )[ child_tag_1 ][ other_service_key ]
+        # get ideal from here too
+
+        self.assertEqual( siblings, { child_tag_1, child_tag_2, child_tag_3 } )
+        self.assertEqual( ancestors, { parent_1, parent_2 } )
+
+        for ( storage_tag, file_hash ) in [
+            ( child_tag_1, pre_combined_file_1 ),
+            ( child_tag_2, pre_combined_file_2 ),
+            ( child_tag_3, pre_combined_file_3 ),
+            ( child_tag_1, pre_specific_file_1 ),
+            ( child_tag_2, pre_specific_file_2 ),
+            ( child_tag_3, pre_specific_file_3 ),
+            ( child_tag_1, post_combined_file_1 ),
+            ( child_tag_2, post_combined_file_2 ),
+            ( child_tag_3, post_combined_file_3 ),
+            ( child_tag_1, post_specific_file_1 ),
+            ( child_tag_2, post_specific_file_2 ),
+            ( child_tag_3, post_specific_file_3 )
+        ]:
+
+            # fetch the mappings of all six files, should be the same, with whatever ideal in place, and both parents
+            # we do combined and specific to test cached specific values and on-the-fly calculated combined
+            # we do pre and post to test synced vs content updates
+
+            ( media_result, ) = self._read( 'media_results', ( file_hash, ) )
+
+            tags_manager = media_result.GetTagsManager()
+
+            self.assertIn( storage_tag, tags_manager.GetCurrent( other_service_key, ClientTags.TAG_DISPLAY_STORAGE ) )
+
+            if storage_tag != ideal_sibling:
+
+                self.assertNotIn( storage_tag, tags_manager.GetCurrent( other_service_key, ClientTags.TAG_DISPLAY_ACTUAL ) )
+
+            self.assertIn( ideal_sibling, tags_manager.GetCurrent( other_service_key, ClientTags.TAG_DISPLAY_ACTUAL ) )
+            self.assertIn( parent_1, tags_manager.GetCurrent( other_service_key, ClientTags.TAG_DISPLAY_ACTUAL ) )
+            self.assertIn( parent_2, tags_manager.GetCurrent( other_service_key, ClientTags.TAG_DISPLAY_ACTUAL ) )
+
 def test_display_pairs_lookup_bonkers( self ):

     self._clear_db()
@@ -223,9 +223,10 @@ class TestFileImportOptions( unittest.TestCase ):

 file_import_options.SetPreImportOptions( exclude_deleted, do_not_check_known_urls_before_importing, do_not_check_hashes_before_importing, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )

 automatic_archive = False
+associate_primary_urls = False
 associate_source_urls = False

-file_import_options.SetPostImportOptions( automatic_archive, associate_source_urls )
+file_import_options.SetPostImportOptions( automatic_archive, associate_primary_urls, associate_source_urls )

 present_new_files = True
 present_already_in_inbox_files = True

@@ -238,6 +239,7 @@ class TestFileImportOptions( unittest.TestCase ):

 self.assertFalse( file_import_options.ExcludesDeleted() )
 self.assertFalse( file_import_options.AllowsDecompressionBombs() )
 self.assertFalse( file_import_options.AutomaticallyArchives() )
+self.assertFalse( file_import_options.ShouldAssociatePrimaryURLs() )
 self.assertFalse( file_import_options.ShouldAssociateSourceURLs() )

 file_import_options.CheckFileIsValid( 65536, HC.IMAGE_JPEG, 640, 480 )

@@ -273,13 +275,15 @@ class TestFileImportOptions( unittest.TestCase ):

 #

 automatic_archive = True
+associate_primary_urls = True
 associate_source_urls = True

-file_import_options.SetPostImportOptions( automatic_archive, associate_source_urls )
+file_import_options.SetPostImportOptions( automatic_archive, associate_primary_urls, associate_source_urls )

 self.assertTrue( file_import_options.ExcludesDeleted() )
 self.assertTrue( file_import_options.AllowsDecompressionBombs() )
 self.assertTrue( file_import_options.AutomaticallyArchives() )
+self.assertTrue( file_import_options.ShouldAssociatePrimaryURLs() )
 self.assertTrue( file_import_options.ShouldAssociateSourceURLs() )

 #
@@ -832,6 +832,10 @@ class Controller( object ):

 TestClientDBDuplicates
 ]

+module_lookup[ 'db_tags' ] = [
+    TestClientDBTags
+]
+
 module_lookup[ 'nat' ] = [
     TestHydrusNATPunch
 ]