Version 463
This commit is contained in:
parent ca2f5f1612
commit acc17e18bd

@@ -8,6 +8,45 @@
<div class="content">
<h3 id="changelog"><a href="#changelog">changelog</a></h3>
<ul>
<li><h3 id="version_463"><a href="#version_463">version 463</a></h3></li>
<ul>
<li>misc:</li>
<li>ogv files (ogg with a video stream) are now recognised by the client! they will get resolution, duration, num frames and now show in the media viewer correctly as resizeable videos. all your existing ogg files will be scheduled for a rescan on update</li>
<li>wrote new downloader objects to fix deviant art tag search. all clients will automatically update and should with luck just be working again with the same old 'deviant art tag search' downloader</li>
<li>added prototype copy/paste buttons to the manage ratings dialog. the copy button also grabs 'null' ratings, let me know how you get on here and we'll tweak as needed</li>
<li>file searches with namespaced and unnamespaced tags should now run just a little faster</li>
<li>most file searches with multiple search predicates that include normal tags should now run just a little faster</li>
<li>the file log right-click menu now shows 'delete x yyy files from the queue' for deleted, ignored, failed, and skipped states separately</li>
<li>the tag siblings + parents display sync manager now forces more wait time before it does work. it now waits for the database and gui to be free of pending or current background work. this _should_ stop slower clients getting into hangs when the afterwork updates pile up and overwhelm the main work</li>
<li>the option 'warn at this many pages' under _options->gui pages_ now has a max value of 65535, up from 500. if you are a madman or you have very page-spammy subscriptions, feel free to try boosting this super high. be warned this may lead to resource limit crashes</li>
<li>the 'max pages' value that triggers a full yes/no dialog on page open is now set as the maximum value of 500 and 2 x the 'warn at this many pages' value</li>
<li>the 'max pages' dialog trigger now only fires if there are no dialogs currently open (this should fix a nested dialog crash when page-publishing subscriptions goes bananas)</li>
<li>improved error reporting for unsolvable cloudflare captcha errors</li>
<li>added clarification text to the edit subscription query dialog regarding the tag import options there</li>
<li>added/updated a bunch of file import options tooltips</li>
<li>.</li>
<li>new presentation import options:</li>
<li>the 'presentation' section of 'file import options' has been completely overhauled. it can do more, and is more human-friendly</li>
<li>rather than the old three checkboxes of new/already-in-inbox/already-in-archive, you now choose from three dropdowns--do you want all/new/none, do you want all/only-inbox/inbox-too, and do you want my-files/and-trash-too. it is now possible to show 'in inbox' exclusively, at the time of publish (e.g. when you highlight)</li>
<li>added a little help UI text around the places presentation is used</li>
<li>the downloader and watcher page's list right-click menu entries for 'show xxx files' is now a submenu, uses the new presentation import options, shows 'show inbox files', and if you click on one row it says what the default is and excludes other entries if they are duplicates</li>
<li>.</li>
<li>boring presentation import options stuff:</li>
<li>presentation options are now in their own object and will be easier to update in future</li>
<li>the 'should I present' code is radically cleaned up and refactored to a single central object</li>
<li>presentation filtering in general has more sophisticated logic and is faster when used on a list (e.g. when you highlight a decent sized downloader and it has to figure out which thumbs to show). it is now careful about only checking for inbox status on outstanding files</li>
<li>presentation now always checks file domain, whereas before this was ad-hoc and scattered around (and in some buggy cases led to long-deleted files appearing in import views)</li>
<li>added a full suite of unit tests to ensure the presentation import options object is making the right decisions and filtering efficiently at each stage</li>
<li>.</li>
<li>boring multiple local file services work:</li>
<li>I basically moved a bunch of file search code from 1 file service to n file services:</li>
<li>the file storage module can now filter file ids to a complex location search context</li>
<li>namespace:anything searches of various sorts now use complex location search contexts</li>
<li>wildcard tag searches now use complex location search contexts</li>
<li>counted tag searches now use complex location search contexts</li>
<li>search code that uses complex location search contexts now cross-references its file results in all cases</li>
<li>I now have a great plan to add deleted files search and keep it working quick. this will be the next step, and then I can do efficient complex-location regular tag search and hopefully just switch over the autocomplete control to allow deleted files search</li>
</ul>
<li><h3 id="version_462"><a href="#version_462">version 462</a></h3></li>
<ul>
<li>misc:</li>
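The overhauled presentation logic the changelog describes (three dropdowns replacing three checkboxes) can be sketched as a pure decision function. The enum names and `should_present` helper below are illustrative assumptions, not hydrus's actual `PresentationImportOptions` API:

```python
# hypothetical stand-ins for the three-dropdown model described above
PRESENTATION_STATUS_ANY_GOOD = 0   # show all successful imports
PRESENTATION_STATUS_NEW_ONLY = 1   # show only files new to the client
PRESENTATION_STATUS_NONE = 2       # show nothing

PRESENTATION_INBOX_AGNOSTIC = 0    # inbox state does not matter
PRESENTATION_INBOX_REQUIRE = 1     # show only files currently in the inbox
PRESENTATION_INBOX_INCLUDE = 2     # also show already-in files if they sit in the inbox

def should_present( is_new, in_inbox, status, inbox_rule ):
    
    if status == PRESENTATION_STATUS_NONE:
        
        return False
        
    
    if inbox_rule == PRESENTATION_INBOX_REQUIRE:
        
        # 'in inbox' exclusively, checked at the time of publish
        return in_inbox
        
    
    if status == PRESENTATION_STATUS_NEW_ONLY:
        
        if is_new:
            
            return True
            
        
        # 'inbox-too': an already-in file still shows if it is in the inbox
        return inbox_rule == PRESENTATION_INBOX_INCLUDE and in_inbox
        
    
    return True
```

The point of the object-based design is that this one function can be applied lazily over a list of outstanding files, only fetching inbox status when the rule actually needs it.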
@@ -573,9 +573,11 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):
        associate_primary_urls = True
        associate_source_urls = True
        
-       present_new_files = True
-       present_already_in_inbox_files = False
-       present_already_in_archive_files = False
+       from hydrus.client.importing.options import PresentationImportOptions
+       
+       presentation_import_options = PresentationImportOptions.PresentationImportOptions()
+       
+       presentation_import_options.SetPresentationStatus( PresentationImportOptions.PRESENTATION_STATUS_NEW_ONLY )
        
        from hydrus.client.importing.options import FileImportOptions
@@ -583,19 +585,14 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):
        
        quiet_file_import_options.SetPreImportOptions( exclude_deleted, do_not_check_known_urls_before_importing, do_not_check_hashes_before_importing, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
        quiet_file_import_options.SetPostImportOptions( automatic_archive, associate_primary_urls, associate_source_urls )
-       quiet_file_import_options.SetPresentationOptions( present_new_files, present_already_in_inbox_files, present_already_in_archive_files )
+       quiet_file_import_options.SetPresentationImportOptions( presentation_import_options )
        
        self._dictionary[ 'default_file_import_options' ][ 'quiet' ] = quiet_file_import_options
        
-       present_new_files = True
-       present_already_in_inbox_files = True
-       present_already_in_archive_files = True
-       
        loud_file_import_options = FileImportOptions.FileImportOptions()
        
        loud_file_import_options.SetPreImportOptions( exclude_deleted, do_not_check_known_urls_before_importing, do_not_check_hashes_before_importing, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
        loud_file_import_options.SetPostImportOptions( automatic_archive, associate_primary_urls, associate_source_urls )
-       loud_file_import_options.SetPresentationOptions( present_new_files, present_already_in_inbox_files, present_already_in_archive_files )
        
        self._dictionary[ 'default_file_import_options' ][ 'loud' ] = loud_file_import_options
@@ -38,8 +38,6 @@ def filetype_pred_generator( v ):
    
    # v is a list of non-hydrus-standard filetype strings
    
-   HC.ALLOWED_MIMES
-   
    mimes = ( 1, )
    
    return ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_MIME, mimes )
@@ -6045,9 +6045,9 @@ class DB( HydrusDB.HydrusDB ):
        
        return ( storage_tag_data, display_tag_data )
        
    
-   def _GetHashIdsAndNonZeroTagCounts( self, tag_display_type: int, file_service_key, tag_search_context: ClientSearch.TagSearchContext, hash_ids, namespace_wildcard = None, job_key = None ):
+   def _GetHashIdsAndNonZeroTagCounts( self, tag_display_type: int, location_search_context: ClientSearch.LocationSearchContext, tag_search_context: ClientSearch.TagSearchContext, hash_ids, namespace_wildcard = None, job_key = None ):
        
-       if namespace_wildcard == '*':
+       if namespace_wildcard in ( '*', '' ):
            
            namespace_wildcard = None
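The widened condition above treats `'*'` and `''` as the same 'any namespace' request. As a standalone helper, the normalisation looks like this (the function name is illustrative; in the diff the check is inlined):

```python
def normalise_namespace_wildcard( namespace_wildcard ):
    
    # '*' and '' both mean 'any namespace', so collapse them to None,
    # which downstream search code treats as 'no namespace filter'
    if namespace_wildcard in ( '*', '' ):
        
        namespace_wildcard = None
        
    
    return namespace_wildcard
```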
@@ -6063,18 +6063,25 @@ class DB( HydrusDB.HydrusDB ):
        
        with self._MakeTemporaryIntegerTable( namespace_ids, 'namespace_id' ) as temp_namespace_ids_table_name:
            
-           # reason why I JOIN each table rather than join just the UNION is based on previous hell with having query planner figure out a "( UNION ) NATURAL JOIN stuff" situation
+           # reason why I JOIN each table rather than join just the UNION is based on previous hell with having query planner figure out a "( a UNION b UNION c ) NATURAL JOIN stuff" situation
            # although this sometimes makes certifiable 2KB ( 6 UNION * 4-table ) queries, it actually works fast
            
+           # OK, a new problem is mass UNION leads to terrible cancelability because the first row cannot be fetched until the first n - 1 union queries are done
+           # I tried some gubbins to try to do a pseudo table-union rather than query union and do 'get files->distinct tag count for this union of tables, and fetch hash_ids first on the union', but did not have luck
+           # so we are just going to do it in bits mate. this also reduces memory use from the distinct-making UNION with large numbers of hash_ids
+           
-           mapping_and_tag_table_names = self._GetMappingAndTagTables( tag_display_type, file_service_key, tag_search_context )
+           ( file_and_tag_domain_pairs, file_location_is_cross_referenced ) = self.modules_tag_search.ConvertLocationAndTagContextToCoveringCachePairs( location_search_context, tag_search_context )
+           
+           mapping_and_tag_table_names = set()
+           
+           for ( file_service_key, tag_search_context ) in file_and_tag_domain_pairs:
+               
+               mapping_and_tag_table_names.update( self._GetMappingAndTagTables( tag_display_type, file_service_key, tag_search_context ) )
+               
+           
+           results = []
            
-           BLOCK_SIZE = int( len( hash_ids ) ** 0.5 ) # go for square root for now
+           BLOCK_SIZE = max( 64, int( len( hash_ids ) ** 0.5 ) ) # go for square root for now
            
            for group_of_hash_ids in HydrusData.SplitIteratorIntoChunks( hash_ids, BLOCK_SIZE ):
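The comments in the hunk above explain the move from one mass UNION to block-wise work: a cancel check can then run between blocks, and memory stays bounded. A minimal standalone sketch of that chunking (the helper mirrors the role of `HydrusData.SplitIteratorIntoChunks`, but this version is an assumption, not hydrus's code):

```python
def split_into_chunks( items, chunk_size ):
    
    # yield successive blocks so a job can be cancelled between them
    chunk = []
    
    for item in items:
        
        chunk.append( item )
        
        if len( chunk ) == chunk_size:
            
            yield chunk
            
            chunk = []
            
        
    
    if len( chunk ) > 0:
        
        yield chunk
        
    

hash_ids = list( range( 1000 ) )

# square root block size with a floor of 64, as in the diff above
BLOCK_SIZE = max( 64, int( len( hash_ids ) ** 0.5 ) )

chunks = list( split_into_chunks( hash_ids, BLOCK_SIZE ) )
```

Square-root sizing keeps both the number of blocks and the per-block query cost sublinear in the input size; the floor of 64 stops tiny inputs from degenerating into one-row queries.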
@@ -6826,7 +6833,7 @@ class DB( HydrusDB.HydrusDB ):
        
        def sort_longest_first_key( s ):
            
-           return -len( s )
+           return ( 1 if HydrusTags.IsUnnamespaced( s ) else 0, -len( s ) )
            
        
        tags_to_include = list( tags_to_include )
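The new sort key above runs namespaced tags before unnamespaced ones, longest first within each group, so the most selective tag tends to be searched first and can shrink the candidate set early. A standalone sketch, approximating `HydrusTags.IsUnnamespaced` with a simple `':'` check:

```python
def is_unnamespaced( tag ):
    
    # hydrus namespaced tags look like 'namespace:subtag'
    return ':' not in tag

def sort_longest_first_key( s ):
    
    # namespaced (0) sorts before unnamespaced (1), then longest first
    return ( 1 if is_unnamespaced( s ) else 0, -len( s ) )

tags = [ 'blue eyes', 'creator:somebody', 'smile', 'series:some long series name' ]

tags.sort( key = sort_longest_first_key )
```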
@@ -6837,19 +6844,17 @@ class DB( HydrusDB.HydrusDB ):
        
        if query_hash_ids is None:
            
-           tag_query_hash_ids = self._GetHashIdsFromTag( ClientTags.TAG_DISPLAY_ACTUAL, file_service_key, tag_search_context, tag, job_key = job_key )
+           tag_query_hash_ids = self._GetHashIdsFromTag( ClientTags.TAG_DISPLAY_ACTUAL, location_search_context, tag_search_context, tag, job_key = job_key )
            
        elif is_inbox and len( query_hash_ids ) == len( self.modules_files_metadata_basic.inbox_hash_ids ):
            
-           tag_query_hash_ids = self._GetHashIdsFromTag( ClientTags.TAG_DISPLAY_ACTUAL, file_service_key, tag_search_context, tag, hash_ids = self.modules_files_metadata_basic.inbox_hash_ids, hash_ids_table_name = 'file_inbox', job_key = job_key )
+           tag_query_hash_ids = self._GetHashIdsFromTag( ClientTags.TAG_DISPLAY_ACTUAL, location_search_context, tag_search_context, tag, hash_ids = self.modules_files_metadata_basic.inbox_hash_ids, hash_ids_table_name = 'file_inbox', job_key = job_key )
            
        else:
            
            with self._MakeTemporaryIntegerTable( query_hash_ids, 'hash_id' ) as temp_table_name:
                
                self._AnalyzeTempTable( temp_table_name )
                
-               tag_query_hash_ids = self._GetHashIdsFromTag( ClientTags.TAG_DISPLAY_ACTUAL, file_service_key, tag_search_context, tag, hash_ids = query_hash_ids, hash_ids_table_name = temp_table_name, job_key = job_key )
+               tag_query_hash_ids = self._GetHashIdsFromTag( ClientTags.TAG_DISPLAY_ACTUAL, location_search_context, tag_search_context, tag, hash_ids = query_hash_ids, hash_ids_table_name = temp_table_name, job_key = job_key )
@@ -6867,7 +6872,7 @@ class DB( HydrusDB.HydrusDB ):
        
        if query_hash_ids is None or ( is_inbox and len( query_hash_ids ) == len( self.modules_files_metadata_basic.inbox_hash_ids ) ):
            
-           namespace_query_hash_ids = self._GetHashIdsThatHaveTags( ClientTags.TAG_DISPLAY_ACTUAL, file_service_key, tag_search_context, namespace_wildcard = namespace, job_key = job_key )
+           namespace_query_hash_ids = self._GetHashIdsThatHaveTagsComplexLocation( ClientTags.TAG_DISPLAY_ACTUAL, location_search_context, tag_search_context, namespace_wildcard = namespace, job_key = job_key )
            
        else:
@@ -6875,7 +6880,7 @@ class DB( HydrusDB.HydrusDB ):
                
                self._AnalyzeTempTable( temp_table_name )
                
-               namespace_query_hash_ids = self._GetHashIdsThatHaveTags( ClientTags.TAG_DISPLAY_ACTUAL, file_service_key, tag_search_context, namespace_wildcard = namespace, hash_ids_table_name = temp_table_name, job_key = job_key )
+               namespace_query_hash_ids = self._GetHashIdsThatHaveTagsComplexLocation( ClientTags.TAG_DISPLAY_ACTUAL, location_search_context, tag_search_context, namespace_wildcard = namespace, hash_ids_table_name = temp_table_name, job_key = job_key )
@@ -6893,7 +6898,7 @@ class DB( HydrusDB.HydrusDB ):
        
        if query_hash_ids is None:
            
-           wildcard_query_hash_ids = self._GetHashIdsFromWildcard( ClientTags.TAG_DISPLAY_ACTUAL, file_service_key, tag_search_context, wildcard, job_key = job_key )
+           wildcard_query_hash_ids = self._GetHashIdsFromWildcardComplexLocation( ClientTags.TAG_DISPLAY_ACTUAL, location_search_context, tag_search_context, wildcard, job_key = job_key )
            
        else:
@@ -6901,7 +6906,7 @@ class DB( HydrusDB.HydrusDB ):
                
                self._AnalyzeTempTable( temp_table_name )
                
-               wildcard_query_hash_ids = self._GetHashIdsFromWildcard( ClientTags.TAG_DISPLAY_ACTUAL, file_service_key, tag_search_context, wildcard, hash_ids = query_hash_ids, hash_ids_table_name = temp_table_name, job_key = job_key )
+               wildcard_query_hash_ids = self._GetHashIdsFromWildcardComplexLocation( ClientTags.TAG_DISPLAY_ACTUAL, location_search_context, tag_search_context, wildcard, hash_ids = query_hash_ids, hash_ids_table_name = temp_table_name, job_key = job_key )
@@ -6939,7 +6944,7 @@ class DB( HydrusDB.HydrusDB ):
        
        if location_search_context.IsAllKnownFiles():
            
-           query_hash_ids = intersection_update_qhi( query_hash_ids, self._GetHashIdsThatHaveTags( ClientTags.TAG_DISPLAY_ACTUAL, file_service_key, tag_search_context, job_key = job_key ) )
+           query_hash_ids = intersection_update_qhi( query_hash_ids, self._GetHashIdsThatHaveTagsComplexLocation( ClientTags.TAG_DISPLAY_ACTUAL, location_search_context, tag_search_context, job_key = job_key ) )
            
        else:
@@ -7040,7 +7045,7 @@ class DB( HydrusDB.HydrusDB ):
                
                self._AnalyzeTempTable( temp_table_name )
                
-               unwanted_hash_ids = self._GetHashIdsFromTag( ClientTags.TAG_DISPLAY_ACTUAL, file_service_key, tag_search_context, tag, hash_ids = query_hash_ids, hash_ids_table_name = temp_table_name, job_key = job_key )
+               unwanted_hash_ids = self._GetHashIdsFromTag( ClientTags.TAG_DISPLAY_ACTUAL, location_search_context, tag_search_context, tag, hash_ids = query_hash_ids, hash_ids_table_name = temp_table_name, job_key = job_key )
                
                query_hash_ids.difference_update( unwanted_hash_ids )
@@ -7057,7 +7062,7 @@ class DB( HydrusDB.HydrusDB ):
                
                self._AnalyzeTempTable( temp_table_name )
                
-               unwanted_hash_ids = self._GetHashIdsThatHaveTags( ClientTags.TAG_DISPLAY_ACTUAL, file_service_key, tag_search_context, namespace_wildcard = namespace, hash_ids_table_name = temp_table_name, job_key = job_key )
+               unwanted_hash_ids = self._GetHashIdsThatHaveTagsComplexLocation( ClientTags.TAG_DISPLAY_ACTUAL, location_search_context, tag_search_context, namespace_wildcard = namespace, hash_ids_table_name = temp_table_name, job_key = job_key )
                
                query_hash_ids.difference_update( unwanted_hash_ids )
@@ -7074,7 +7079,7 @@ class DB( HydrusDB.HydrusDB ):
                
                self._AnalyzeTempTable( temp_table_name )
                
-               unwanted_hash_ids = self._GetHashIdsFromWildcard( ClientTags.TAG_DISPLAY_ACTUAL, file_service_key, tag_search_context, wildcard, hash_ids = query_hash_ids, hash_ids_table_name = temp_table_name, job_key = job_key )
+               unwanted_hash_ids = self._GetHashIdsFromWildcardComplexLocation( ClientTags.TAG_DISPLAY_ACTUAL, location_search_context, tag_search_context, wildcard, hash_ids = query_hash_ids, hash_ids_table_name = temp_table_name, job_key = job_key )
                
                query_hash_ids.difference_update( unwanted_hash_ids )
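The three hunks above all follow the same shape for negated search predicates: fetch the hash_ids that do match the excluded tag, namespace, or wildcard, then subtract them from the running result set. In miniature:

```python
# the running result set of a multi-predicate search
query_hash_ids = { 1, 2, 3, 4, 5 }

# pretend these files carry the excluded tag
unwanted_hash_ids = { 2, 4 }

# negation is just set difference on the running result
query_hash_ids.difference_update( unwanted_hash_ids )
```

Because the exclusion search is scoped to the current `query_hash_ids` (via the temp table in the diff), a negated predicate never has to scan the whole mappings cache.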
@@ -7363,7 +7368,7 @@ class DB( HydrusDB.HydrusDB ):
        
        if is_zero or is_anything_but_zero:
            
-           nonzero_tag_query_hash_ids = self._GetHashIdsThatHaveTags( ClientTags.TAG_DISPLAY_ACTUAL, file_service_key, tag_search_context, hash_ids_table_name = temp_table_name, namespace_wildcard = namespace, job_key = job_key )
+           nonzero_tag_query_hash_ids = self._GetHashIdsThatHaveTagsComplexLocation( ClientTags.TAG_DISPLAY_ACTUAL, location_search_context, tag_search_context, hash_ids_table_name = temp_table_name, namespace_wildcard = namespace, job_key = job_key )
            nonzero_tag_query_hash_ids_populated = True
            
        if is_zero:
@@ -7380,7 +7385,7 @@ class DB( HydrusDB.HydrusDB ):
        
        if len( specific_number_tests ) > 0:
            
-           hash_id_tag_counts = self._GetHashIdsAndNonZeroTagCounts( ClientTags.TAG_DISPLAY_ACTUAL, file_service_key, tag_search_context, query_hash_ids, namespace_wildcard = namespace, job_key = job_key )
+           hash_id_tag_counts = self._GetHashIdsAndNonZeroTagCounts( ClientTags.TAG_DISPLAY_ACTUAL, location_search_context, tag_search_context, query_hash_ids, namespace_wildcard = namespace, job_key = job_key )
            
            good_tag_count_hash_ids = { hash_id for ( hash_id, count ) in hash_id_tag_counts if megalambda( count ) }
@@ -7416,7 +7421,7 @@ class DB( HydrusDB.HydrusDB ):
                
                self._AnalyzeTempTable( temp_table_name )
                
-               ( good_hash_ids, was_file_location_cross_referenced ) = self._GetHashIdsThatHaveTagAsNumComplexLocation( ClientTags.TAG_DISPLAY_ACTUAL, location_search_context, tag_search_context, namespace, num, '>', hash_ids = query_hash_ids, hash_ids_table_name = temp_table_name, job_key = job_key )
+               good_hash_ids = self._GetHashIdsThatHaveTagAsNumComplexLocation( ClientTags.TAG_DISPLAY_ACTUAL, location_search_context, tag_search_context, namespace, num, '>', hash_ids = query_hash_ids, hash_ids_table_name = temp_table_name, job_key = job_key )
            
            query_hash_ids = intersection_update_qhi( query_hash_ids, good_hash_ids )
@@ -7430,7 +7435,7 @@ class DB( HydrusDB.HydrusDB ):
                
                self._AnalyzeTempTable( temp_table_name )
                
-               ( good_hash_ids, was_file_location_cross_referenced ) = self._GetHashIdsThatHaveTagAsNumComplexLocation( ClientTags.TAG_DISPLAY_ACTUAL, location_search_context, tag_search_context, namespace, num, '<', hash_ids = query_hash_ids, hash_ids_table_name = temp_table_name, job_key = job_key )
+               good_hash_ids = self._GetHashIdsThatHaveTagAsNumComplexLocation( ClientTags.TAG_DISPLAY_ACTUAL, location_search_context, tag_search_context, namespace, num, '<', hash_ids = query_hash_ids, hash_ids_table_name = temp_table_name, job_key = job_key )
            
            query_hash_ids = intersection_update_qhi( query_hash_ids, good_hash_ids )
@@ -7500,7 +7505,10 @@ class DB( HydrusDB.HydrusDB ):
            
            return self._GetHashIdsFromTagIds( tag_display_type, file_service_key, tag_search_context, tag_ids, hash_ids = hash_ids, hash_ids_table_name = hash_ids_table_name, job_key = job_key )
            
        
-   def _GetHashIdsFromTag( self, tag_display_type: int, file_service_key, tag_search_context: ClientSearch.TagSearchContext, tag, hash_ids = None, hash_ids_table_name = None, allow_unnamespaced_to_fetch_namespaced = True, job_key = None ):
+   def _GetHashIdsFromTag( self, tag_display_type: int, location_search_context: ClientSearch.LocationSearchContext, tag_search_context: ClientSearch.TagSearchContext, tag, hash_ids = None, hash_ids_table_name = None, allow_unnamespaced_to_fetch_namespaced = True, job_key = None ):
        
+       # we'll replace this with file_service_ids from the ConvertLocationAndTagContextToCoveringCachePairs replacement and then filter later
+       file_service_key = location_search_context.GetFileServiceKey()
+       
        ( namespace, subtag ) = HydrusTags.SplitTag( tag )
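The location-aware calls above share one pattern: a complex location context is converted to a set of covering (file domain, tag domain) cache pairs, each pair is searched with the existing single-domain code, the per-pair results are unioned, and, if the covering caches were broader than the requested location, the union is cross-referenced back down afterwards. A hypothetical standalone sketch of that flow (all names here are illustrative, not hydrus's API):

```python
def search_complex_location( covering_pairs, simple_search, location_filter, cross_referenced ):
    
    # union the per-domain results from the existing single-domain search
    results = set()
    
    for ( file_domain, tag_domain ) in covering_pairs:
        
        results.update( simple_search( file_domain, tag_domain ) )
        
    
    # if the covering caches may include files outside the requested
    # location, cross-reference the union against it afterwards
    if not cross_referenced:
        
        results = location_filter( results )
        
    
    return results

# toy data standing in for two covering cache domains
covering_pairs = [ ( 'my files', 'all known tags' ), ( 'trash', 'all known tags' ) ]

per_domain_results = { 'my files': { 1, 2 }, 'trash': { 2, 3 } }

def simple_search( file_domain, tag_domain ):
    
    return per_domain_results[ file_domain ]

def location_filter( results ):
    
    # pretend hash_id 3 is outside the requested location
    return { hash_id for hash_id in results if hash_id != 3 }

filtered = search_complex_location( covering_pairs, simple_search, location_filter, cross_referenced = False )
```

This is why the diff threads a `file_location_is_cross_referenced` flag around: when the caches already match the requested location exactly, the final filter pass can be skipped.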
@ -7733,161 +7741,219 @@ class DB( HydrusDB.HydrusDB ):
|
|||
|
||||
|
||||
|
||||
def _GetHashIdsFromWildcard( self, tag_display_type: int, file_service_key, tag_search_context: ClientSearch.TagSearchContext, wildcard, hash_ids = None, hash_ids_table_name = None, job_key = None ):
|
||||
def _GetHashIdsFromWildcardComplexLocation( self, tag_display_type: int, location_search_context: ClientSearch.LocationSearchContext, tag_search_context: ClientSearch.TagSearchContext, wildcard, hash_ids = None, hash_ids_table_name = None, job_key = None ):
|
||||
|
||||
( namespace_wildcard, subtag_wildcard ) = HydrusTags.SplitTag( wildcard )
|
||||
|
||||
if namespace_wildcard == '*':
|
||||
if namespace_wildcard in ( '*', '' ):
|
||||
|
||||
namespace_wildcard = ''
|
||||
namespace_wildcard = None
|
||||
|
||||
|
||||
if subtag_wildcard == '*':
|
||||
|
||||
if namespace_wildcard == '':
|
||||
|
||||
namespace_wildcard = None
|
||||
|
||||
|
||||
return self._GetHashIdsThatHaveTags( tag_display_type, file_service_key, tag_search_context, namespace_wildcard = namespace_wildcard, hash_ids_table_name = hash_ids_table_name, job_key = job_key )
|
||||
return self._GetHashIdsThatHaveTagsComplexLocation( tag_display_type, location_search_context, tag_search_context, namespace_wildcard = namespace_wildcard, hash_ids_table_name = hash_ids_table_name, job_key = job_key )
|
||||
|
||||
|
||||
file_service_id = self.modules_services.GetServiceId( file_service_key )
|
||||
tag_service_id = self.modules_services.GetServiceId( tag_search_context.service_key )
|
||||
results = set()
|
||||
|
||||
with self._MakeTemporaryIntegerTable( [], 'subtag_id' ) as temp_subtag_ids_table_name:
|
||||
( file_and_tag_domain_pairs, file_location_is_cross_referenced ) = self.modules_tag_search.ConvertLocationAndTagContextToCoveringCachePairs( location_search_context, tag_search_context )
|
||||
|
||||
if namespace_wildcard is None:
|
||||
|
||||
self.modules_tag_search.GetSubtagIdsFromWildcardIntoTable( file_service_id, tag_service_id, subtag_wildcard, temp_subtag_ids_table_name, job_key = job_key )
|
||||
possible_namespace_ids = []
|
||||
|
||||
if namespace_wildcard != '':
|
||||
else:
|
||||
|
||||
possible_namespace_ids = self.modules_tag_search.GetNamespaceIdsFromWildcard( namespace_wildcard )
|
||||
|
||||
if len( possible_namespace_ids ) == 0:
|
||||
|
||||
possible_namespace_ids = self.modules_tag_search.GetNamespaceIdsFromWildcard( namespace_wildcard )
|
||||
return set()
|
||||
|
||||
with self._MakeTemporaryIntegerTable( possible_namespace_ids, 'namespace_id' ) as temp_namespace_ids_table_name:
|
||||
|
||||
return self._GetHashIdsFromNamespaceIdsSubtagIdsTables( tag_display_type, file_service_key, tag_search_context, temp_namespace_ids_table_name, temp_subtag_ids_table_name, hash_ids = hash_ids, hash_ids_table_name = hash_ids_table_name, job_key = job_key )
|
||||
|
||||
|
||||
|
||||
with self._MakeTemporaryIntegerTable( possible_namespace_ids, 'namespace_id' ) as temp_namespace_ids_table_name:
|
||||
|
||||
if namespace_wildcard is None:
|
||||
|
||||
namespace_ids_table_name = None
|
||||
|
||||
else:
|
||||
|
||||
namespace_ids_table_name = temp_namespace_ids_table_name
|
||||
|
||||
|
||||
for ( file_service_key, tag_search_context ) in file_and_tag_domain_pairs:
|
||||
|
||||
some_results = self._GetHashIdsFromWildcardSimpleLocation( tag_display_type, file_service_key, tag_search_context, subtag_wildcard, namespace_ids_table_name = namespace_ids_table_name, hash_ids = hash_ids, hash_ids_table_name = hash_ids_table_name, job_key = job_key )
|
||||
|
||||
if len( results ) == 0:
|
||||
|
||||
results = some_results
|
||||
|
||||
else:
|
||||
|
||||
results.update( some_results )
|
||||
|
||||
|
||||
|
||||
|
||||
if not file_location_is_cross_referenced:
|
||||
|
||||
results = self.modules_files_storage.FilterHashIds( location_search_context, results )
|
||||
|
||||
|
||||
return results
|
||||
|
||||
|
||||
def _GetHashIdsFromWildcardSimpleLocation( self, tag_display_type: int, file_service_key: bytes, tag_search_context: ClientSearch.TagSearchContext, subtag_wildcard, namespace_ids_table_name = None, hash_ids = None, hash_ids_table_name = None, job_key = None ):
|
||||
|
||||
with self._MakeTemporaryIntegerTable( [], 'subtag_id' ) as temp_subtag_ids_table_name:
|
||||
|
||||
file_service_id = self.modules_services.GetServiceId( file_service_key )
|
||||
tag_service_id = self.modules_services.GetServiceId( tag_search_context.service_key )
|
||||
|
||||
# when ConvertLocationAndTagContextToCoveringCachePairs no longer needs tag search context, convert this to location search context and bump it up to the complexlocation method!
|
||||
|
||||
self.modules_tag_search.GetSubtagIdsFromWildcardIntoTable( file_service_id, tag_service_id, subtag_wildcard, temp_subtag_ids_table_name, job_key = job_key )
|
||||
|
||||
if namespace_ids_table_name is None:
|
||||
|
||||
return self._GetHashIdsFromSubtagIdsTable( tag_display_type, file_service_key, tag_search_context, temp_subtag_ids_table_name, hash_ids = hash_ids, hash_ids_table_name = hash_ids_table_name, job_key = job_key )
|
||||
|
||||
else:
|
||||
|
||||
return self._GetHashIdsFromNamespaceIdsSubtagIdsTables( tag_display_type, file_service_key, tag_search_context, namespace_ids_table_name, temp_subtag_ids_table_name, hash_ids = hash_ids, hash_ids_table_name = hash_ids_table_name, job_key = job_key )
|
||||
|
||||
|
||||
|
||||
|
||||
def _GetHashIdsThatHaveTags( self, tag_display_type: int, file_service_key, tag_search_context: ClientSearch.TagSearchContext, namespace_wildcard = None, hash_ids_table_name = None, job_key = None ):
|
||||
def _GetHashIdsThatHaveTagsComplexLocation( self, tag_display_type: int, location_search_context: ClientSearch.LocationSearchContext, tag_search_context: ClientSearch.TagSearchContext, namespace_wildcard = None, hash_ids_table_name = None, job_key = None ):
|
||||
|
||||
if namespace_wildcard == '*':
|
||||
if not location_search_context.SearchesAnything():
|
||||
|
||||
return set()
|
||||
|
||||
|
||||
if namespace_wildcard in ( '*', '' ):
|
||||
|
||||
namespace_wildcard = None
|
||||
|
||||
|
||||
if namespace_wildcard is None:
|
||||
|
||||
namespace_ids = []
|
||||
possible_namespace_ids = []
|
||||
|
||||
else:
|
||||
|
||||
namespace_ids = self.modules_tag_search.GetNamespaceIdsFromWildcard( namespace_wildcard )
|
||||
possible_namespace_ids = self.modules_tag_search.GetNamespaceIdsFromWildcard( namespace_wildcard )
|
||||
|
||||
|
||||
file_service_id = self.modules_services.GetServiceId( file_service_key )
|
||||
tag_service_id = self.modules_services.GetServiceId( tag_search_context.service_key )
|
||||
|
||||
with self._MakeTemporaryIntegerTable( namespace_ids, 'namespace_id' ) as temp_namespace_ids_table_name:
|
||||
|
||||
mapping_and_tag_table_names = self._GetMappingAndTagTables( tag_display_type, file_service_key, tag_search_context )
|
||||
|
||||
if hash_ids_table_name is None:
|
||||
if len( possible_namespace_ids ) == 0:
|
||||
|
||||
if namespace_wildcard is None:
|
||||
|
||||
# hellmode
|
||||
queries = [ 'SELECT DISTINCT hash_id FROM {};'.format( mappings_table_name ) for ( mappings_table_name, tags_table_name ) in mapping_and_tag_table_names ]
|
||||
|
||||
else:
|
||||
|
||||
# temp namespaces to tags to mappings
|
||||
queries = [ 'SELECT DISTINCT hash_id FROM {} CROSS JOIN {} USING ( namespace_id ) CROSS JOIN {} USING ( tag_id );'.format( temp_namespace_ids_table_name, tags_table_name, mappings_table_name ) for ( mappings_table_name, tags_table_name ) in mapping_and_tag_table_names ]
|
||||
|
||||
return set()
|
||||
|
||||
|
||||
|
||||
results = set()
|
||||
|
||||
with self._MakeTemporaryIntegerTable( possible_namespace_ids, 'namespace_id' ) as temp_namespace_ids_table_name:
|
||||
|
||||
if namespace_wildcard is None:
|
||||
|
||||
namespace_ids_table_name = None
|
||||
|
||||
else:
|
||||
|
||||
if namespace_wildcard is None:
|
||||
namespace_ids_table_name = temp_namespace_ids_table_name
|
||||
|
||||
|
||||
( file_and_tag_domain_pairs, file_location_is_cross_referenced ) = self.modules_tag_search.ConvertLocationAndTagContextToCoveringCachePairs( location_search_context, tag_search_context )
|
||||
|
||||
if not file_location_is_cross_referenced and hash_ids_table_name is not None:
|
||||
|
||||
file_location_is_cross_referenced = True
|
||||
|
||||
|
||||
for ( file_service_key, tag_search_context ) in file_and_tag_domain_pairs:
|
||||
|
||||
some_results = self._GetHashIdsThatHaveTagsSimpleLocation( tag_display_type, file_service_key, tag_search_context, namespace_ids_table_name = namespace_ids_table_name, hash_ids_table_name = hash_ids_table_name, job_key = job_key )
|
||||
|
||||
if len( results ) == 0:
|
||||
|
||||
queries = [ 'SELECT hash_id FROM {} WHERE EXISTS ( SELECT 1 FROM {} WHERE {}.hash_id = {}.hash_id );'.format( hash_ids_table_name, mappings_table_name, mappings_table_name, hash_ids_table_name ) for ( mappings_table_name, tags_table_name ) in mapping_and_tag_table_names ]
|
||||
results = some_results
|
||||
|
||||
else:
|
||||
|
||||
# temp hashes to mappings to tags to temp namespaces
|
||||
# this was originally a 'WHERE EXISTS' thing, but doing that on a three way cross join is too complex for that to work well
|
||||
# let's hope DISTINCT can save time too
|
||||
queries = [ 'SELECT DISTINCT hash_id FROM {} CROSS JOIN {} USING ( hash_id ) CROSS JOIN {} USING ( tag_id ) CROSS JOIN {} USING ( namespace_id );'.format( hash_ids_table_name, mappings_table_name, tags_table_name, temp_namespace_ids_table_name ) for ( mappings_table_name, tags_table_name ) in mapping_and_tag_table_names ]
|
||||
results.update( some_results )
|
||||
|
||||
|
||||
|
||||
cancelled_hook = None
|
||||
|
||||
if not file_location_is_cross_referenced:
|
||||
|
||||
if job_key is not None:
|
||||
results = self.modules_files_storage.FilterHashIds( location_search_context, results )
|
||||
|
||||
|
||||
return results
|
||||
|
||||
|
||||
    def _GetHashIdsThatHaveTagsSimpleLocation( self, tag_display_type: int, file_service_key: bytes, tag_search_context: ClientSearch.TagSearchContext, namespace_ids_table_name = None, hash_ids_table_name = None, job_key = None ):
        
        mapping_and_tag_table_names = self._GetMappingAndTagTables( tag_display_type, file_service_key, tag_search_context )
        
        if hash_ids_table_name is None:
            
            if namespace_ids_table_name is None:
                
                # hellmode
                queries = [ 'SELECT DISTINCT hash_id FROM {};'.format( mappings_table_name ) for ( mappings_table_name, tags_table_name ) in mapping_and_tag_table_names ]
                
            else:
                
                # temp namespaces to tags to mappings
                queries = [ 'SELECT DISTINCT hash_id FROM {} CROSS JOIN {} USING ( namespace_id ) CROSS JOIN {} USING ( tag_id );'.format( namespace_ids_table_name, tags_table_name, mappings_table_name ) for ( mappings_table_name, tags_table_name ) in mapping_and_tag_table_names ]
                
            
        else:
            
            if namespace_ids_table_name is None:
                
                queries = [ 'SELECT hash_id FROM {} WHERE EXISTS ( SELECT 1 FROM {} WHERE {}.hash_id = {}.hash_id );'.format( hash_ids_table_name, mappings_table_name, mappings_table_name, hash_ids_table_name ) for ( mappings_table_name, tags_table_name ) in mapping_and_tag_table_names ]
                
            else:
                
                # temp hashes to mappings to tags to temp namespaces
                # this was originally a 'WHERE EXISTS' thing, but doing that on a three way cross join is too complex for that to work well
                # let's hope DISTINCT can save time too
                queries = [ 'SELECT DISTINCT hash_id FROM {} CROSS JOIN {} USING ( hash_id ) CROSS JOIN {} USING ( tag_id ) CROSS JOIN {} USING ( namespace_id );'.format( hash_ids_table_name, mappings_table_name, tags_table_name, namespace_ids_table_name ) for ( mappings_table_name, tags_table_name ) in mapping_and_tag_table_names ]
                
            
        
        cancelled_hook = None
        
        if job_key is not None:
            
            cancelled_hook = job_key.IsCancelled
            
        
        nonzero_tag_hash_ids = set()
        
        for query in queries:
            
            cursor = self._Execute( query )
            
            nonzero_tag_hash_ids.update( self._STI( HydrusDB.ReadFromCancellableCursor( cursor, 10240, cancelled_hook ) ) )
            
            if job_key is not None and job_key.IsCancelled():
                
                return set()
                
            
        
        return nonzero_tag_hash_ids
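The branching above picks one of four SQL shapes depending on which temp tables are available. A minimal standalone sketch of the same decision, with made-up table names and a hypothetical `build_query` helper (an illustration, not the client's actual module):

```python
import sqlite3

# Hypothetical helper mirroring the four query shapes above: the presence of a
# temp hash table and/or temp namespace table picks the join strategy.
def build_query( mappings, tags, hash_ids_table = None, namespace_ids_table = None ):
    
    if hash_ids_table is None:
        
        if namespace_ids_table is None:
            
            # 'hellmode': full scan of the mappings table
            return 'SELECT DISTINCT hash_id FROM {};'.format( mappings )
            
        
        # temp namespaces to tags to mappings
        return 'SELECT DISTINCT hash_id FROM {} CROSS JOIN {} USING ( namespace_id ) CROSS JOIN {} USING ( tag_id );'.format( namespace_ids_table, tags, mappings )
        
    
    if namespace_ids_table is None:
        
        # cheap existence test against the temp hash table
        return 'SELECT hash_id FROM {} WHERE EXISTS ( SELECT 1 FROM {} WHERE {}.hash_id = {}.hash_id );'.format( hash_ids_table, mappings, mappings, hash_ids_table )
        
    
    # three-way cross join; DISTINCT collapses duplicate hash_ids
    return 'SELECT DISTINCT hash_id FROM {} CROSS JOIN {} USING ( hash_id ) CROSS JOIN {} USING ( tag_id ) CROSS JOIN {} USING ( namespace_id );'.format( hash_ids_table, mappings, tags, namespace_ids_table )

# exercise the 'hellmode' shape against a toy in-memory db
db = sqlite3.connect( ':memory:' )
db.execute( 'CREATE TABLE current_mappings ( tag_id INTEGER, hash_id INTEGER );' )
db.executemany( 'INSERT INTO current_mappings VALUES ( ?, ? );', [ ( 1, 10 ), ( 1, 10 ), ( 2, 11 ) ] )

hellmode = build_query( 'current_mappings', 'tags' )
hash_ids = sorted( hash_id for ( hash_id, ) in db.execute( hellmode ) )
```

`hash_ids` comes back as `[ 10, 11 ]`: DISTINCT folds the duplicate mapping row.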
    def _ConvertLocationAndTagContextToCoveringCachePairs( self, location_search_context: ClientSearch.LocationSearchContext, tag_search_context: ClientSearch.TagSearchContext ):
        
        # a way to support deleted efficiently is to have deleted file tag caches
        # but that is so silly, given how rare deleted file tag searches will be, that we should just swallow 'all known files' searches
        
        tag_search_contexts = [ tag_search_context ]
        
        if location_search_context.SearchesCurrent() and not location_search_context.SearchesDeleted():
            
            file_location_needs_cross_referencing = not location_search_context.IsAllKnownFiles()
            
            file_service_keys = list( location_search_context.current_service_keys )
            
        else:
            
            file_location_needs_cross_referencing = False
            
            file_service_keys = [ CC.COMBINED_FILE_SERVICE_KEY ]
            
        
        if tag_search_context.IsAllKnownTags():
            
            tag_search_contexts = []
            
            for tag_service in self.modules_services.GetServices( HC.REAL_TAG_SERVICES ):
                
                tsc = tag_search_context.Duplicate()
                
                tsc.service_key = tag_service.GetServiceKey()
                
                tag_search_contexts.append( tsc )
                
            
        
        return ( list( itertools.product( file_service_keys, tag_search_contexts ) ), file_location_needs_cross_referencing )
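The covering pairs above come straight from `itertools.product`: every candidate file domain is crossed with every candidate tag context. A toy illustration with plain strings standing in for the real service keys and contexts (all names made up):

```python
import itertools

# Toy stand-ins for file service keys and duplicated tag search contexts.
file_service_keys = [ 'my files', 'trash' ]
tag_search_contexts = [ 'tag service a', 'tag service b' ]

# Every file domain paired with every tag domain, exactly as the
# itertools.product call in the method above does.
covering_pairs = list( itertools.product( file_service_keys, tag_search_contexts ) )
```

Two file domains crossed with two tag contexts yields four covering pairs to search.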
    def _GetHashIdsThatHaveTagAsNumComplexLocation( self, tag_display_type: int, location_search_context: ClientSearch.LocationSearchContext, tag_search_context: ClientSearch.TagSearchContext, namespace, num, operator, hash_ids = None, hash_ids_table_name = None, job_key = None ):
        
        if not location_search_context.SearchesAnything():

@@ -7895,7 +7961,7 @@ class DB( HydrusDB.HydrusDB ):
            
            return set()
            
        
-        ( file_and_tag_domain_pairs, file_location_is_cross_referenced ) = self._ConvertLocationAndTagContextToCoveringCachePairs( location_search_context, tag_search_context )
+        ( file_and_tag_domain_pairs, file_location_is_cross_referenced ) = self.modules_tag_search.ConvertLocationAndTagContextToCoveringCachePairs( location_search_context, tag_search_context )
        
        if not file_location_is_cross_referenced and hash_ids_table_name is not None:

@@ -7918,7 +7984,12 @@ class DB( HydrusDB.HydrusDB ):
        
-        return ( results, file_location_is_cross_referenced )
+        if not file_location_is_cross_referenced:
+            
+            results = self.modules_files_storage.FilterHashIds( location_search_context, results )
+            
+        
+        return results
        
    
    def _GetHashIdsThatHaveTagAsNumSimpleLocation( self, tag_display_type: int, file_service_key: bytes, tag_search_context: ClientSearch.TagSearchContext, namespace, num, operator, hash_ids = None, hash_ids_table_name = None, job_key = None ):

@@ -8180,6 +8251,8 @@ class DB( HydrusDB.HydrusDB ):
        
        if file_service_id == self.modules_services.combined_file_service_id:
            
            # yo this does not support tag_display_actual--big tricky problem
            
            ( current_mappings_table_name, deleted_mappings_table_name, pending_mappings_table_name, petitioned_mappings_table_name ) = ClientDBMappingsStorage.GenerateMappingsTableNames( tag_service_id )
            
            current_tables.append( ( current_mappings_table_name, tags_table_name ) )
@@ -15060,6 +15133,62 @@ class DB( HydrusDB.HydrusDB ):
        
        if version == 462:
            
            try:
                
                domain_manager = self.modules_serialisable.GetJSONDump( HydrusSerialisable.SERIALISABLE_TYPE_NETWORK_DOMAIN_MANAGER )
                
                domain_manager.Initialise()
                
                #
                
                domain_manager.OverwriteDefaultGUGs( ( 'deviant art tag search', ) )
                
                domain_manager.OverwriteDefaultParsers( ( 'deviant gallery page api parser (new cursor)', ) )
                
                domain_manager.OverwriteDefaultURLClasses( ( 'deviant art tag gallery page api (cursor navigation)', ) )
                
                #
                
                domain_manager.TryToLinkURLClassesAndParsers()
                
                #
                
                self.modules_serialisable.SetJSONDump( domain_manager )
                
            except Exception as e:
                
                HydrusData.PrintException( e )
                
                message = 'Trying to update some parsers failed! Please let hydrus dev know!'
                
                self.pub_initial_message( message )
                
            
            try:
                
                self._controller.frame_splash_status.SetSubtext( 'scheduling ogg files for regen' )
                
                table_join = self.modules_files_storage.GetTableJoinLimitedByFileDomain( self.modules_services.combined_local_file_service_id, 'files_info', HC.CONTENT_STATUS_CURRENT )
                
                from hydrus.client import ClientFiles
                
                hash_ids = self._STL( self._Execute( 'SELECT hash_id FROM {} WHERE mime = ?;'.format( table_join ), ( HC.AUDIO_OGG, ) ) )
                
                self.modules_files_maintenance_queue.AddJobs( hash_ids, ClientFiles.REGENERATE_FILE_DATA_JOB_FILE_METADATA )
                self.modules_files_maintenance_queue.AddJobs( hash_ids, ClientFiles.REGENERATE_FILE_DATA_JOB_REFIT_THUMBNAIL )
                
            except Exception as e:
                
                HydrusData.PrintException( e )
                
                message = 'Trying to schedule ogg files for maintenance failed! Please let hydrus dev know!'
                
                self.pub_initial_message( message )
                
            
        
        self._controller.frame_splash_status.SetTitleText( 'updated db to v{}'.format( HydrusData.ToHumanInt( version + 1 ) ) )
        
        self._Execute( 'UPDATE version SET version = ?;', ( version + 1, ) )
@@ -324,6 +324,43 @@ class ClientDBFilesStorage( ClientDBModule.ClientDBModule ):
        return current_hash_ids
        
    
    def FilterHashIds( self, location_search_context: ClientSearch.LocationSearchContext, hash_ids ) -> set:
        
        if not location_search_context.SearchesAnything():
            
            return set()
            
        
        filtered_hash_ids = set()
        
        with self._MakeTemporaryIntegerTable( hash_ids, 'hash_id' ) as temp_hash_ids_table_name:
            
            for file_service_key in location_search_context.current_service_keys:
                
                service_id = self.modules_services.GetServiceId( file_service_key )
                
                current_files_table_name = GenerateFilesTableName( service_id, HC.CONTENT_STATUS_CURRENT )
                
                hash_id_iterator = self._STI( self._Execute( 'SELECT hash_id FROM {} CROSS JOIN {} USING ( hash_id );'.format( temp_hash_ids_table_name, current_files_table_name ) ) )
                
                filtered_hash_ids.update( hash_id_iterator )
                
            
            for file_service_key in location_search_context.deleted_service_keys:
                
                service_id = self.modules_services.GetServiceId( file_service_key )
                
                deleted_files_table_name = GenerateFilesTableName( service_id, HC.CONTENT_STATUS_DELETED )
                
                hash_id_iterator = self._STI( self._Execute( 'SELECT hash_id FROM {} CROSS JOIN {} USING ( hash_id );'.format( temp_hash_ids_table_name, deleted_files_table_name ) ) )
                
                filtered_hash_ids.update( hash_id_iterator )
                
            
        
        return filtered_hash_ids
        
    
    def FilterPendingHashIds( self, service_id, hash_ids ):
        
        if service_id == self.modules_services.combined_file_service_id:
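`FilterHashIds` above cross-references candidate ids against each domain's files table through a temporary integer table. A minimal sqlite3 sketch of that pattern, with hypothetical table names standing in for the generated ones:

```python
import sqlite3

db = sqlite3.connect( ':memory:' )

# A 'current files' table for one service, mirroring GenerateFilesTableName output.
db.execute( 'CREATE TABLE current_files_1 ( hash_id INTEGER PRIMARY KEY );' )
db.executemany( 'INSERT INTO current_files_1 VALUES ( ? );', [ ( 10, ), ( 11, ) ] )

candidate_hash_ids = [ 10, 12 ]

# Stand-in for _MakeTemporaryIntegerTable: dump the candidates into a temp table.
db.execute( 'CREATE TEMP TABLE temp_hash_ids ( hash_id INTEGER );' )
db.executemany( 'INSERT INTO temp_hash_ids VALUES ( ? );', [ ( h, ) for h in candidate_hash_ids ] )

# The same cross-reference join the method runs per file domain: only candidates
# that exist in the domain's files table survive.
filtered = { hash_id for ( hash_id, ) in db.execute( 'SELECT hash_id FROM temp_hash_ids CROSS JOIN current_files_1 USING ( hash_id );' ) }
```

Of the candidates `{10, 12}`, only `10` is in the domain, so `filtered` is `{10}`.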
@@ -1,3 +1,4 @@
import itertools
import sqlite3
import typing

@@ -7,6 +8,7 @@ from hydrus.core import HydrusDB
from hydrus.core import HydrusDBBase
from hydrus.core import HydrusGlobals as HG

from hydrus.client import ClientConstants as CC
from hydrus.client import ClientSearch
from hydrus.client.db import ClientDBMaster
from hydrus.client.db import ClientDBModule

@@ -315,6 +317,50 @@ class ClientDBTagSearch( ClientDBModule.ClientDBModule ):
    
    def ConvertLocationAndTagContextToCoveringCachePairs( self, location_search_context: ClientSearch.LocationSearchContext, tag_search_context: ClientSearch.TagSearchContext ):
        
        # a GREAT way to support fast deleted file search is to have a domain that covers all files ever deleted in all domains mate
        # adding on trash or full delete and only removing files from it on clear delete record for all services
        # then it just needs the cross-reference after search
        # if I do this, then I think we'll be covering all specific file services again, even if not perfectly
        # which means I can ditch the COMBINED_TAG_SERVICE_KEY gubbins here
        # and therefore ditch this being 'TagContext'-CoveringPairs. tag context will always be the same!
        
        # furthermore, if we are moving to imperfect file caches, potentially we could go to just 'all local files' cache for current files, too
        # the only real drawback would be when searching a tiny local file service, like trash
        
        tag_search_contexts = [ tag_search_context ]
        
        if location_search_context.SearchesCurrent() and not location_search_context.SearchesDeleted():
            
            file_location_is_cross_referenced = not location_search_context.IsAllKnownFiles()
            
            file_service_keys = list( location_search_context.current_service_keys )
            
        else:
            
            file_location_is_cross_referenced = False
            
            file_service_keys = [ CC.COMBINED_FILE_SERVICE_KEY ]
            
        
        if tag_search_context.IsAllKnownTags():
            
            tag_search_contexts = []
            
            for tag_service in self.modules_services.GetServices( HC.REAL_TAG_SERVICES ):
                
                tsc = tag_search_context.Duplicate()
                
                tsc.service_key = tag_service.GetServiceKey()
                
                tag_search_contexts.append( tsc )
                
            
        
        return ( list( itertools.product( file_service_keys, tag_search_contexts ) ), file_location_is_cross_referenced )
        
    
    def DeleteTags( self, file_service_id, tag_service_id, tag_ids ):
        
        if len( tag_ids ) == 0:
@@ -1,10 +1,12 @@
import itertools
import json
import os

from qtpy import QtWidgets as QW

from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusData
from hydrus.core import HydrusExceptions
from hydrus.core import HydrusGlobals as HG
from hydrus.core.networking import HydrusNATPunch
@@ -53,6 +55,12 @@ class DialogManageRatings( ClientGUIDialogs.Dialog ):
            self._panels.append( self._NumericalPanel( self, numerical_services, media ) )
            
        
        self._copy_button = ClientGUICommon.BetterBitmapButton( self, CC.global_pixmaps().copy, self._Copy )
        self._copy_button.setToolTip( 'Copy ratings to the clipboard.' )
        
        self._paste_button = ClientGUICommon.BetterBitmapButton( self, CC.global_pixmaps().paste, self._Paste )
        self._paste_button.setToolTip( 'Paste ratings from the clipboard.' )
        
        self._apply = QW.QPushButton( 'apply', self )
        self._apply.clicked.connect( self.EventOK )
        self._apply.setObjectName( 'HydrusAccept' )

@@ -63,11 +71,6 @@ class DialogManageRatings( ClientGUIDialogs.Dialog ):
        
        #
        
        buttonbox = QP.HBoxLayout()
        
        QP.AddToLayout( buttonbox, self._apply, CC.FLAGS_CENTER_PERPENDICULAR )
        QP.AddToLayout( buttonbox, self._cancel, CC.FLAGS_CENTER_PERPENDICULAR )
        
        vbox = QP.VBoxLayout()
        
        for panel in self._panels:

@@ -75,6 +78,18 @@ class DialogManageRatings( ClientGUIDialogs.Dialog ):
            QP.AddToLayout( vbox, panel, CC.FLAGS_EXPAND_PERPENDICULAR )
            
        
        buttonbox = QP.HBoxLayout()
        
        QP.AddToLayout( buttonbox, self._copy_button, CC.FLAGS_CENTER_PERPENDICULAR )
        QP.AddToLayout( buttonbox, self._paste_button, CC.FLAGS_CENTER_PERPENDICULAR )
        
        QP.AddToLayout( vbox, buttonbox, CC.FLAGS_ON_RIGHT )
        
        buttonbox = QP.HBoxLayout()
        
        QP.AddToLayout( buttonbox, self._apply, CC.FLAGS_CENTER_PERPENDICULAR )
        QP.AddToLayout( buttonbox, self._cancel, CC.FLAGS_CENTER_PERPENDICULAR )
        
        QP.AddToLayout( vbox, buttonbox, CC.FLAGS_ON_RIGHT )
        
        self.setLayout( vbox )

@@ -88,6 +103,52 @@ class DialogManageRatings( ClientGUIDialogs.Dialog ):
        self._my_shortcut_handler = ClientGUIShortcuts.ShortcutsHandler( self, [ 'global', 'media' ] )
    def _Copy( self ):
        
        rating_clipboard_pairs = []
        
        for panel in self._panels:
            
            rating_clipboard_pairs.extend( panel.GetRatingClipboardPairs() )
            
        
        text = json.dumps( [ ( service_key.hex(), rating ) for ( service_key, rating ) in rating_clipboard_pairs ] )
        
        HG.client_controller.pub( 'clipboard', 'text', text )
        
    
    def _Paste( self ):
        
        try:
            
            raw_text = HG.client_controller.GetClipboardText()
            
        except HydrusExceptions.DataMissing as e:
            
            QW.QMessageBox.critical( self, 'Error', str( e ) )
            
            return
            
        
        try:
            
            rating_clipboard_pairs_encoded = json.loads( raw_text )
            
            rating_clipboard_pairs = [ ( bytes.fromhex( service_key_encoded ), rating ) for ( service_key_encoded, rating ) in rating_clipboard_pairs_encoded ]
            
        except:
            
            QW.QMessageBox.critical( self, 'Error', 'Did not understand what was in the clipboard!' )
            
            return
            
        
        for panel in self._panels:
            
            panel.SetRatingClipboardPairs( rating_clipboard_pairs )
            
        
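`_Copy` and `_Paste` above round-trip ratings through JSON with hex-encoded service keys. A small standalone sketch of that wire format (the 32-byte key is made up; `1`/`0` stand for like/dislike and `None` for a null rating):

```python
import json

# A made-up 32-byte service key, hex-encoded for the clipboard.
service_key = bytes.fromhex( 'ab' * 32 )

rating_clipboard_pairs = [ ( service_key, 1 ), ( service_key, None ) ]

# Serialise as _Copy does: hex-encode the key, keep the rating as-is
# (1/0 for like/dislike, a 0.0-1.0 float for numerical, None for null).
text = json.dumps( [ ( key.hex(), rating ) for ( key, rating ) in rating_clipboard_pairs ] )

# Decode as _Paste does.
decoded = [ ( bytes.fromhex( key_hex ), rating ) for ( key_hex, rating ) in json.loads( text ) ]
```

The decode reproduces the original pairs, including the 'null' rating, which is why the copy button can grab unset ratings too.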
    def EventOK( self ):
        
        try:

@@ -202,6 +263,67 @@ class DialogManageRatings( ClientGUIDialogs.Dialog ):
            return service_keys_to_content_updates
        def GetRatingClipboardPairs( self ):
            
            rating_clipboard_pairs = []
            
            for ( service_key, control ) in self._service_keys_to_controls.items():
                
                rating_state = control.GetRatingState()
                
                if rating_state == ClientRatings.LIKE:
                    
                    rating = 1
                    
                elif rating_state == ClientRatings.DISLIKE:
                    
                    rating = 0
                    
                elif rating_state == ClientRatings.NULL:
                    
                    rating = None
                    
                else:
                    
                    continue
                    
                
                rating_clipboard_pairs.append( ( service_key, rating ) )
                
            
            return rating_clipboard_pairs
            
        
        def SetRatingClipboardPairs( self, rating_clipboard_pairs ):
            
            for ( service_key, rating ) in rating_clipboard_pairs:
                
                if rating == 1:
                    
                    rating_state = ClientRatings.LIKE
                    
                elif rating == 0:
                    
                    rating_state = ClientRatings.DISLIKE
                    
                elif rating is None:
                    
                    rating_state = ClientRatings.NULL
                    
                else:
                    
                    continue
                    
                
                if service_key in self._service_keys_to_controls:
                    
                    control = self._service_keys_to_controls[ service_key ]
                    
                    control.SetRatingState( rating_state )
                    
                
            
        
    class _NumericalPanel( QW.QWidget ):

@@ -280,6 +402,53 @@ class DialogManageRatings( ClientGUIDialogs.Dialog ):
            return service_keys_to_content_updates
        def GetRatingClipboardPairs( self ):
            
            rating_clipboard_pairs = []
            
            for ( service_key, control ) in self._service_keys_to_controls.items():
                
                rating_state = control.GetRatingState()
                
                if rating_state == ClientRatings.NULL:
                    
                    rating = None
                    
                elif rating_state == ClientRatings.SET:
                    
                    rating = control.GetRating()
                    
                else:
                    
                    continue
                    
                
                rating_clipboard_pairs.append( ( service_key, rating ) )
                
            
            return rating_clipboard_pairs
            
        
        def SetRatingClipboardPairs( self, rating_clipboard_pairs ):
            
            for ( service_key, rating ) in rating_clipboard_pairs:
                
                if service_key in self._service_keys_to_controls:
                    
                    control = self._service_keys_to_controls[ service_key ]
                    
                    if rating is None:
                        
                        control.SetRatingState( ClientRatings.NULL )
                        
                    elif isinstance( rating, ( int, float ) ) and 0 <= rating <= 1:
                        
                        control.SetRating( rating )
                        
                    
                
            
        
    
class DialogManageUPnP( ClientGUIDialogs.Dialog ):
@@ -26,7 +26,7 @@ from hydrus.client.gui.lists import ClientGUIListConstants as CGLC
from hydrus.client.gui.lists import ClientGUIListCtrl
from hydrus.client.gui.widgets import ClientGUICommon
from hydrus.client.importing import ClientImportFileSeeds
from hydrus.client.importing.options import FileImportOptions
from hydrus.client.importing.options import PresentationImportOptions
from hydrus.client.metadata import ClientTagSorting

def GetRetryIgnoredParam( window ):

@@ -712,11 +712,11 @@ class FileSeedCacheButton( ClientGUICommon.BetterButton ):
        
        elif show == 'new':
            
-            file_import_options = FileImportOptions.FileImportOptions()
+            presentation_import_options = PresentationImportOptions.PresentationImportOptions()
            
-            file_import_options.SetPresentationOptions( True, False, False )
+            presentation_import_options.SetPresentationStatus( PresentationImportOptions.PRESENTATION_STATUS_NEW_ONLY )
            
-            hashes = file_seed_cache.GetPresentedHashes( file_import_options )
+            hashes = file_seed_cache.GetPresentedHashes( presentation_import_options )
            
        
        if len( hashes ) > 0:
@@ -751,16 +751,15 @@ class FileSeedCacheButton( ClientGUICommon.BetterButton ):
        
        file_seed_cache = self._file_seed_cache_get_callable()
        
        num_file_seeds = len( file_seed_cache )
        num_successful = file_seed_cache.GetFileSeedCount( CC.STATUS_SUCCESSFUL_AND_NEW ) + file_seed_cache.GetFileSeedCount( CC.STATUS_SUCCESSFUL_BUT_REDUNDANT )
        num_vetoed = file_seed_cache.GetFileSeedCount( CC.STATUS_VETOED )
-        num_deleted_and_vetoed = file_seed_cache.GetFileSeedCount( CC.STATUS_DELETED ) + num_vetoed
+        num_deleted = file_seed_cache.GetFileSeedCount( CC.STATUS_DELETED )
        num_errors = file_seed_cache.GetFileSeedCount( CC.STATUS_ERROR )
        num_skipped = file_seed_cache.GetFileSeedCount( CC.STATUS_SKIPPED )
        
        if num_errors > 0:
            
-            ClientGUIMenus.AppendMenuItem( menu, 'retry ' + HydrusData.ToHumanInt( num_errors ) + ' error failures', 'Tell this cache to reattempt all its error failures.', self._RetryErrors )
+            ClientGUIMenus.AppendMenuItem( menu, 'retry ' + HydrusData.ToHumanInt( num_errors ) + ' failures', 'Tell this cache to reattempt all its error failures.', self._RetryErrors )
            
        
        if num_vetoed > 0:

@@ -772,23 +771,27 @@ class FileSeedCacheButton( ClientGUICommon.BetterButton ):
        
        if num_successful > 0:
            
-            num_deletees = num_successful
-            
-            ClientGUIMenus.AppendMenuItem( menu, 'delete ' + HydrusData.ToHumanInt( num_deletees ) + ' successful file import items from the queue', 'Tell this cache to clear out successful files, reducing the size of the queue.', self._ClearFileSeeds, ( CC.STATUS_SUCCESSFUL_AND_NEW, CC.STATUS_SUCCESSFUL_BUT_REDUNDANT ) )
+            ClientGUIMenus.AppendMenuItem( menu, 'delete {} \'successful\' file import items from the queue'.format( HydrusData.ToHumanInt( num_successful ) ), 'Tell this cache to clear out successful files, reducing the size of the queue.', self._ClearFileSeeds, ( CC.STATUS_SUCCESSFUL_AND_NEW, CC.STATUS_SUCCESSFUL_BUT_REDUNDANT ) )
            
        
-        if num_deleted_and_vetoed > 0:
+        if num_deleted > 0:
            
-            num_deletees = num_deleted_and_vetoed
-            
-            ClientGUIMenus.AppendMenuItem( menu, 'delete ' + HydrusData.ToHumanInt( num_deletees ) + ' deleted/ignored file import items from the queue', 'Tell this cache to clear out deleted and ignored files, reducing the size of the queue.', self._ClearFileSeeds, ( CC.STATUS_DELETED, CC.STATUS_VETOED ) )
+            ClientGUIMenus.AppendMenuItem( menu, 'delete {} \'previously deleted\' file import items from the queue'.format( HydrusData.ToHumanInt( num_deleted ) ), 'Tell this cache to clear out deleted files, reducing the size of the queue.', self._ClearFileSeeds, ( CC.STATUS_DELETED, ) )
            
        
-        if num_errors + num_skipped > 0:
+        if num_errors > 0:
            
-            num_deletees = num_errors + num_skipped
-            
-            ClientGUIMenus.AppendMenuItem( menu, 'delete ' + HydrusData.ToHumanInt( num_deletees ) + ' error/skipped file import items from the queue', 'Tell this cache to clear out errored and skipped files, reducing the size of the queue.', self._ClearFileSeeds, ( CC.STATUS_ERROR, CC.STATUS_SKIPPED ) )
+            ClientGUIMenus.AppendMenuItem( menu, 'delete {} \'failed\' file import items from the queue'.format( HydrusData.ToHumanInt( num_errors ) ), 'Tell this cache to clear out errored files, reducing the size of the queue.', self._ClearFileSeeds, ( CC.STATUS_ERROR, ) )
            
        
+        if num_vetoed > 0:
+            
+            ClientGUIMenus.AppendMenuItem( menu, 'delete {} \'ignored\' file import items from the queue'.format( HydrusData.ToHumanInt( num_vetoed ) ), 'Tell this cache to clear out ignored files, reducing the size of the queue.', self._ClearFileSeeds, ( CC.STATUS_VETOED, ) )
+            
+        
+        if num_skipped > 0:
+            
+            ClientGUIMenus.AppendMenuItem( menu, 'delete {} \'skipped\' file import items from the queue'.format( HydrusData.ToHumanInt( num_skipped ) ), 'Tell this cache to clear out skipped files, reducing the size of the queue.', self._ClearFileSeeds, ( CC.STATUS_SKIPPED, ) )
+            
+        
        ClientGUIMenus.AppendSeparator( menu )
@@ -1086,8 +1086,8 @@ class EditImportFolderPanel( ClientGUIScrolledPanels.EditPanel ):
        rows.append( ( 'check period: ', self._period ) )
        rows.append( ( 'check on manage dialog ok: ', self._check_now ) )
        rows.append( ( 'show a popup while working: ', self._show_working_popup ) )
-        rows.append( ( 'publish new files to a popup button: ', self._publish_files_to_popup_button ) )
-        rows.append( ( 'publish new files to a page: ', self._publish_files_to_page ) )
+        rows.append( ( 'publish presented files to a popup button: ', self._publish_files_to_popup_button ) )
+        rows.append( ( 'publish presented files to a page: ', self._publish_files_to_page ) )
        rows.append( ( 'review currently cached import paths: ', self._file_seed_cache_button ) )
        
        gridbox = ClientGUICommon.WrapInGrid( self._folder_box, rows )
@@ -34,6 +34,7 @@ from hydrus.client.gui.widgets import ClientGUICommon
from hydrus.client.gui.widgets import ClientGUIControls
from hydrus.client.gui.widgets import ClientGUIMenuButton
from hydrus.client.importing.options import FileImportOptions
from hydrus.client.importing.options import PresentationImportOptions
from hydrus.client.importing.options import TagImportOptions
from hydrus.client.media import ClientMedia
from hydrus.client.metadata import ClientTags

@@ -1229,10 +1230,18 @@ class EditFileImportOptions( ClientGUIScrolledPanels.EditPanel ):
        
        self._exclude_deleted = QW.QCheckBox( pre_import_panel )
        
        tt = 'By default, the client will not try to reimport files that it knows were deleted before. This is a good setting and should be left on in general.'
        tt += os.linesep * 2
        tt += 'However, you might like to turn it off for a one-time job where you want to force an import of previously deleted files.'
        
        self._exclude_deleted.setToolTip( tt )
        
        self._do_not_check_known_urls_before_importing = QW.QCheckBox( pre_import_panel )
        self._do_not_check_hashes_before_importing = QW.QCheckBox( pre_import_panel )
        
-        tt = 'If hydrus recognises a file\'s URL or hash, it can decide to skip downloading it if it believes it already has it or previously deleted it. The logic behind this gets quite complicated, and it is usually best to let it work normally. It saves a huge amount of bandwidth.'
+        tt = 'DO NOT SET THESE EXPENSIVE OPTIONS UNLESS YOU KNOW YOU NEED THEM FOR THIS ONE JOB'
+        tt += os.linesep * 2
+        tt += 'If hydrus recognises a file\'s URL or hash, if it is confident it already has it or previously deleted it, it will normally skip the download, saving a huge amount of time and bandwidth. The logic behind this gets quite complicated, and it is usually best to let it work normally.'
+        tt += os.linesep * 2
+        tt += 'However, if you believe the clientside url mappings or serverside hashes are inaccurate and the file is being wrongly skipped, turn these on to force a download. Only ever do this for one-time manually fired jobs. Do not turn this on for a normal download or a subscription! You do not need to turn these on for a file maintenance job that is filling in missing files, as missing files are automatically detected and essentially turn these on for you on a per-file basis.'

@@ -1241,6 +1250,10 @@ class EditFileImportOptions( ClientGUIScrolledPanels.EditPanel ):
        
        self._allow_decompression_bombs = QW.QCheckBox( pre_import_panel )
        
        tt = 'This is an old setting, it basically just rejects all jpegs and pngs with more than a 1GB bitmap, or about 250-350 Megapixels. It can be useful if you have an older computer that will die at a 16,000x22,000 png.'
        
        self._allow_decompression_bombs.setToolTip( tt )
        
        self._min_size = ClientGUIControls.NoneableBytesControl( pre_import_panel )
        self._min_size.SetValue( 5 * 1024 )

@@ -1250,12 +1263,21 @@ class EditFileImportOptions( ClientGUIScrolledPanels.EditPanel ):
        self._max_gif_size = ClientGUIControls.NoneableBytesControl( pre_import_panel )
        self._max_gif_size.SetValue( 32 * 1024 * 1024 )
        
        tt = 'This catches most of those gif conversions of webms. These files are low quality but huge and mostly a waste of storage and bandwidth.'
        
        self._max_gif_size.setToolTip( tt )
        
        self._min_resolution = ClientGUICommon.NoneableSpinCtrl( pre_import_panel, num_dimensions = 2 )
        self._min_resolution.SetValue( ( 50, 50 ) )
        
        self._max_resolution = ClientGUICommon.NoneableSpinCtrl( pre_import_panel, num_dimensions = 2 )
        self._max_resolution.SetValue( ( 8192, 8192 ) )
        
        tt = 'If either width or height is violated, the file will fail this test and be ignored. It does not have to be both.'
        
        self._min_resolution.setToolTip( tt )
        self._max_resolution.setToolTip( tt )
        
        #
post_import_panel = ClientGUICommon.StaticBox( self, 'post-import actions' )
|
||||
|
@ -1278,11 +1300,11 @@ class EditFileImportOptions( ClientGUIScrolledPanels.EditPanel ):
|
|||
|
||||
#
|
||||
|
||||
presentation_panel = ClientGUICommon.StaticBox( self, 'presentation options' )
|
||||
presentation_static_box = ClientGUICommon.StaticBox( self, 'presentation options' )
|
||||
|
||||
self._present_new_files = QW.QCheckBox( presentation_panel )
|
||||
self._present_already_in_inbox_files = QW.QCheckBox( presentation_panel )
|
||||
self._present_already_in_archive_files = QW.QCheckBox( presentation_panel )
|
||||
presentation_import_options = file_import_options.GetPresentationImportOptions()
|
||||
|
||||
self._presentation_import_options_edit_panel = EditPresentationImportOptions( presentation_static_box, presentation_import_options )
|
||||
|
||||
#
|
||||
|
||||
|
@ -1310,14 +1332,6 @@ class EditFileImportOptions( ClientGUIScrolledPanels.EditPanel ):
|
|||
|
||||
#
|
||||
|
||||
( present_new_files, present_already_in_inbox_files, present_already_in_archive_files ) = file_import_options.GetPresentationOptions()
|
||||
|
||||
self._present_new_files.setChecked( present_new_files )
|
||||
self._present_already_in_inbox_files.setChecked( present_already_in_inbox_files )
|
||||
self._present_already_in_archive_files.setChecked( present_already_in_archive_files )
|
||||
|
||||
#
|
||||
|
||||
rows = []
|
||||
|
||||
rows.append( ( 'exclude previously deleted files: ', self._exclude_deleted ) )
|
||||
|
@ -1367,15 +1381,7 @@ class EditFileImportOptions( ClientGUIScrolledPanels.EditPanel ):
|
|||
|
||||
#
|
||||
|
||||
rows = []
|
||||
|
||||
rows.append( ( 'present new files', self._present_new_files ) )
|
||||
rows.append( ( 'present \'already in db\' files in inbox', self._present_already_in_inbox_files ) )
|
||||
rows.append( ( 'present \'already in db\' files in archive', self._present_already_in_archive_files ) )
|
||||
|
||||
gridbox = ClientGUICommon.WrapInGrid( presentation_panel, rows )
|
||||
|
||||
presentation_panel.Add( gridbox, CC.FLAGS_EXPAND_SIZER_PERPENDICULAR )
|
||||
presentation_static_box.Add( self._presentation_import_options_edit_panel, CC.FLAGS_EXPAND_SIZER_PERPENDICULAR )
|
||||
|
||||
#
|
||||
|
||||
|
@@ -1384,7 +1390,9 @@ class EditFileImportOptions( ClientGUIScrolledPanels.EditPanel ):

        QP.AddToLayout( vbox, help_hbox, CC.FLAGS_ON_RIGHT )
        QP.AddToLayout( vbox, pre_import_panel, CC.FLAGS_EXPAND_PERPENDICULAR )
        QP.AddToLayout( vbox, post_import_panel, CC.FLAGS_EXPAND_PERPENDICULAR )
        QP.AddToLayout( vbox, presentation_panel, CC.FLAGS_EXPAND_PERPENDICULAR )
        QP.AddToLayout( vbox, presentation_static_box, CC.FLAGS_EXPAND_PERPENDICULAR )

        vbox.addStretch( 1 )

        self.widget().setLayout( vbox )

@@ -1432,15 +1440,13 @@ If you have a very large (10k+ files) file import page, consider hiding some or

        associate_primary_urls = self._associate_primary_urls.isChecked()
        associate_source_urls = self._associate_source_urls.isChecked()

        present_new_files = self._present_new_files.isChecked()
        present_already_in_inbox_files = self._present_already_in_inbox_files.isChecked()
        present_already_in_archive_files = self._present_already_in_archive_files.isChecked()
        presentation_import_options = self._presentation_import_options_edit_panel.GetValue()

        file_import_options = FileImportOptions.FileImportOptions()

        file_import_options.SetPreImportOptions( exclude_deleted, do_not_check_known_urls_before_importing, do_not_check_hashes_before_importing, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
        file_import_options.SetPostImportOptions( automatic_archive, associate_primary_urls, associate_source_urls )
        file_import_options.SetPresentationOptions( present_new_files, present_already_in_inbox_files, present_already_in_archive_files )
        file_import_options.SetPresentationImportOptions( presentation_import_options )

        return file_import_options

@@ -2046,6 +2052,152 @@ class EditNoneableIntegerPanel( ClientGUIScrolledPanels.EditPanel ):

        return self._value.GetValue()


class EditPresentationImportOptions( ClientGUIScrolledPanels.EditPanel ):

    def __init__( self, parent: QW.QWidget, presentation_import_options: PresentationImportOptions.PresentationImportOptions ):

        ClientGUIScrolledPanels.EditPanel.__init__( self, parent )

        #

        self._presentation_status = ClientGUICommon.BetterChoice( self )

        for value in ( PresentationImportOptions.PRESENTATION_STATUS_ANY_GOOD, PresentationImportOptions.PRESENTATION_STATUS_NEW_ONLY, PresentationImportOptions.PRESENTATION_STATUS_NONE ):

            self._presentation_status.addItem( PresentationImportOptions.presentation_status_enum_str_lookup[ value ], value )

        tt = 'All files means \'successful\' and \'already in db\'.'
        tt += os.linesep * 2
        tt += 'New means only \'successful\'.'
        tt += os.linesep * 2
        tt += 'None means this is a silent importer. This is rarely useful.'

        self._presentation_status.setToolTip( tt )

        self._presentation_inbox = ClientGUICommon.BetterChoice( self )

        tt = 'Inbox or archive means all files.'
        tt += os.linesep * 2
        tt += 'Must be in inbox means only inbox files _at the time of the presentation_. This can be neat as you process and revisit currently watched threads.'
        tt += os.linesep * 2
        tt += 'Or in inbox (which only shows if you are set to only see new files) allows already in db results if they are currently in the inbox. Essentially you are just excluding already-in-archive files.'

        self._presentation_inbox.setToolTip( tt )

        self._presentation_location = ClientGUICommon.BetterChoice( self )

        tt = 'This is mostly for technical purposes on hydev\'s end, but if you want you can show what is currently in the trash as well.'

        self._presentation_location.setToolTip( tt )

        for value in ( PresentationImportOptions.PRESENTATION_LOCATION_IN_LOCAL_FILES, PresentationImportOptions.PRESENTATION_LOCATION_IN_TRASH_TOO ):

            self._presentation_location.addItem( PresentationImportOptions.presentation_location_enum_str_lookup[ value ], value )

        #

        self._presentation_status.SetValue( presentation_import_options.GetPresentationStatus() )

        self._UpdateInboxChoices()

        self._presentation_inbox.SetValue( presentation_import_options.GetPresentationInbox() )
        self._presentation_location.SetValue( presentation_import_options.GetPresentationLocation() )

        #

        vbox = QP.VBoxLayout()

        label = 'An importer will try to import everything in its queue, but it does not have to _show_ all the successful results in the import page or subscription button/page. Many advanced users will set this to only show _new_ results to skip seeing \'already in db\' files.'

        st = ClientGUICommon.BetterStaticText( self, label = label )

        st.setWordWrap( True )

        hbox = QP.HBoxLayout()

        QP.AddToLayout( hbox, self._presentation_status, CC.FLAGS_EXPAND_BOTH_WAYS )
        QP.AddToLayout( hbox, self._presentation_inbox, CC.FLAGS_EXPAND_BOTH_WAYS )
        QP.AddToLayout( hbox, self._presentation_location, CC.FLAGS_EXPAND_BOTH_WAYS )

        #

        QP.AddToLayout( vbox, st, CC.FLAGS_EXPAND_PERPENDICULAR )
        QP.AddToLayout( vbox, hbox, CC.FLAGS_EXPAND_PERPENDICULAR )

        self.widget().setLayout( vbox )

        #

        self._presentation_status.currentIndexChanged.connect( self._UpdateInboxChoices )
        self._presentation_status.currentIndexChanged.connect( self._UpdateEnabled )

    def _UpdateEnabled( self ):

        enabled = self._presentation_status.GetValue() != PresentationImportOptions.PRESENTATION_STATUS_NONE

        self._presentation_inbox.setEnabled( enabled )
        self._presentation_location.setEnabled( enabled )

    def _UpdateInboxChoices( self ):

        do_it = False

        previous_presentation_inbox = self._presentation_inbox.GetValue()

        presentation_status = self._presentation_status.GetValue()

        if presentation_status == PresentationImportOptions.PRESENTATION_STATUS_NEW_ONLY:

            if self._presentation_inbox.count() != 3:

                do_it = True

                allowed_values = ( PresentationImportOptions.PRESENTATION_INBOX_AGNOSTIC, PresentationImportOptions.PRESENTATION_INBOX_REQUIRE_INBOX, PresentationImportOptions.PRESENTATION_INBOX_INCLUDE_INBOX )

        else:

            if self._presentation_inbox.count() != 2:

                do_it = True

                allowed_values = ( PresentationImportOptions.PRESENTATION_INBOX_AGNOSTIC, PresentationImportOptions.PRESENTATION_INBOX_REQUIRE_INBOX )

                if previous_presentation_inbox == PresentationImportOptions.PRESENTATION_INBOX_INCLUDE_INBOX:

                    previous_presentation_inbox = PresentationImportOptions.PRESENTATION_INBOX_AGNOSTIC

        if do_it:

            self._presentation_inbox.clear()

            for value in allowed_values:

                self._presentation_inbox.addItem( PresentationImportOptions.presentation_inbox_enum_str_lookup[ value ], value )

            self._presentation_inbox.SetValue( previous_presentation_inbox )

    def GetValue( self ) -> PresentationImportOptions.PresentationImportOptions:

        presentation_import_options = PresentationImportOptions.PresentationImportOptions()

        presentation_import_options.SetPresentationStatus( self._presentation_status.GetValue() )
        presentation_import_options.SetPresentationInbox( self._presentation_inbox.GetValue() )
        presentation_import_options.SetPresentationLocation( self._presentation_location.GetValue() )

        return presentation_import_options


class EditRegexFavourites( ClientGUIScrolledPanels.EditPanel ):

    def __init__( self, parent: QW.QWidget, regex_favourites ):

@@ -1394,7 +1394,11 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):

            self._notebook_tab_alignment.addItem( CC.directions_alignment_string_lookup[ value ], value )

        self._total_pages_warning = QP.MakeQSpinBox( self._pages_panel, min=5, max=500 )
        self._total_pages_warning = QP.MakeQSpinBox( self._pages_panel, min=5, max=65535 )

        tt = 'If you have a gigantic session, or you have very page-spammy subscriptions, you can try boosting this, but be warned it may lead to resource limit crashes. The best solution to a large session is to make it smaller!'

        self._total_pages_warning.setToolTip( tt )

        self._reverse_page_shift_drag_behaviour = QW.QCheckBox( self._pages_panel )
        self._reverse_page_shift_drag_behaviour.setToolTip( 'By default, holding down shift when you drop off a page tab means the client will not \'chase\' the page tab. This makes this behaviour default, with shift-drop meaning to chase.' )

@@ -208,7 +208,7 @@ class EditSubscriptionPanel( ClientGUIScrolledPanels.EditPanel ):

        self._checker_options = ClientGUIImport.CheckerOptionsButton( self._file_limits_panel, checker_options, update_callable = self._CheckerOptionsUpdated )

        self._file_presentation_panel = ClientGUICommon.StaticBox( self, 'presentation' )
        self._file_presentation_panel = ClientGUICommon.StaticBox( self, 'file publication' )

        self._show_a_popup_while_working = QW.QCheckBox( self._file_presentation_panel )
        self._show_a_popup_while_working.setToolTip( 'Careful with this! Leave it on to begin with, just in case it goes wrong!' )

@@ -282,11 +282,19 @@ class EditSubscriptionPanel( ClientGUIScrolledPanels.EditPanel ):

        #

        label = 'If you like, the subscription can send its files to popup buttons or pages directly. The files it sends are shaped by the \'presentation\' options in _file import options_.'

        st = ClientGUICommon.BetterStaticText( self._file_presentation_panel, label = label )

        st.setWordWrap( True )

        self._file_presentation_panel.Add( st, CC.FLAGS_EXPAND_PERPENDICULAR )

        rows = []

        rows.append( ( 'show a popup while working: ', self._show_a_popup_while_working ) )
        rows.append( ( 'publish new files to a popup button: ', self._publish_files_to_popup_button ) )
        rows.append( ( 'publish new files to a page: ', self._publish_files_to_page ) )
        rows.append( ( 'publish presented files to a popup button: ', self._publish_files_to_popup_button ) )
        rows.append( ( 'publish presented files to a page: ', self._publish_files_to_page ) )
        rows.append( ( 'publish to a specific label: ', self._publish_label_override ) )
        rows.append( ( 'publish all queries to the same page/popup button: ', self._merge_query_publish_events ) )

@@ -1209,8 +1217,18 @@ class EditSubscriptionQueryPanel( ClientGUIScrolledPanels.EditPanel ):

        QP.AddToLayout( vbox, self._file_seed_cache_control, CC.FLAGS_EXPAND_PERPENDICULAR )
        QP.AddToLayout( vbox, self._gallery_seed_log_control, CC.FLAGS_EXPAND_PERPENDICULAR )
        QP.AddToLayout( vbox, gridbox, CC.FLAGS_EXPAND_SIZER_PERPENDICULAR )

        label = 'The tag import options here is only for setting \'additional tags\' for this single query! If you want to change the parsed tags or do subscription-wide \'additional tags\', jump up a level to the edit subscriptions dialog.'

        st = ClientGUICommon.BetterStaticText( self, label = label )

        st.setWordWrap( True )

        QP.AddToLayout( vbox, st, CC.FLAGS_EXPAND_PERPENDICULAR )
        QP.AddToLayout( vbox, self._tag_import_options, CC.FLAGS_EXPAND_PERPENDICULAR )

        vbox.addStretch( 1 )

        self.widget().setLayout( vbox )

        #

@@ -50,9 +50,10 @@ from hydrus.client.gui.widgets import ClientGUIMenuButton

from hydrus.client.importing import ClientImporting
from hydrus.client.importing import ClientImportGallery
from hydrus.client.importing import ClientImportLocal
from hydrus.client.importing.options import FileImportOptions
from hydrus.client.importing import ClientImportSimpleURLs
from hydrus.client.importing import ClientImportWatchers
from hydrus.client.importing.options import FileImportOptions
from hydrus.client.importing.options import PresentationImportOptions
from hydrus.client.media import ClientMedia
from hydrus.client.metadata import ClientTags

@@ -70,6 +71,64 @@ MANAGEMENT_TYPE_PAGE_OF_PAGES = 10

management_panel_types_to_classes = {}

def AddPresentationSubmenu( menu: QW.QMenu, importer_name: str, single_selected_presentation_import_options: typing.Optional[ PresentationImportOptions.PresentationImportOptions ], callable ):

    submenu = QW.QMenu( menu )

    # inbox only
    # detect single_selected_presentation_import_options and deal with it

    description = 'Gather these files for the selected importers and show them.'

    if single_selected_presentation_import_options is None:

        ClientGUIMenus.AppendMenuItem( submenu, 'default presented files', description, callable )

    else:

        ClientGUIMenus.AppendMenuItem( submenu, 'default presented files ({})'.format( single_selected_presentation_import_options.GetSummary() ), description, callable )

    sets_of_options = []

    presentation_import_options = PresentationImportOptions.PresentationImportOptions()

    presentation_import_options.SetPresentationStatus( PresentationImportOptions.PRESENTATION_STATUS_NEW_ONLY )

    sets_of_options.append( presentation_import_options )

    presentation_import_options = PresentationImportOptions.PresentationImportOptions()

    presentation_import_options.SetPresentationInbox( PresentationImportOptions.PRESENTATION_INBOX_REQUIRE_INBOX )

    sets_of_options.append( presentation_import_options )

    presentation_import_options = PresentationImportOptions.PresentationImportOptions()

    sets_of_options.append( presentation_import_options )

    presentation_import_options = PresentationImportOptions.PresentationImportOptions()

    presentation_import_options.SetPresentationLocation( PresentationImportOptions.PRESENTATION_LOCATION_IN_TRASH_TOO )

    sets_of_options.append( presentation_import_options )

    for presentation_import_options in sets_of_options:

        if single_selected_presentation_import_options is not None and presentation_import_options == single_selected_presentation_import_options:

            continue

        ClientGUIMenus.AppendMenuItem( submenu, presentation_import_options.GetSummary(), description, callable, presentation_import_options = presentation_import_options )

    importer_label_template = '{}s\'' if single_selected_presentation_import_options is None else '{}\'s'

    importer_label = importer_label_template.format( importer_name )

    ClientGUIMenus.AppendMenu( menu, submenu, 'show {} files'.format( importer_label ) )

def CreateManagementController( page_name, management_type, file_service_key = None ):

    if file_service_key is None:

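The menu-building hunk above registers four preset `PresentationImportOptions` and skips the preset that equals the single selected importer's current options, so the 'default' entry is not duplicated. A minimal plain-Python sketch of that selection logic, using a hypothetical `Options` value class (not the hydrus types) with value equality:

```python
from dataclasses import dataclass

# Hypothetical stand-in for PresentationImportOptions: frozen dataclasses
# compare by value, so a preset equal to the currently selected importer's
# options can be detected and skipped.
@dataclass(frozen=True)
class Options:
    status: str = 'any good'
    inbox: str = 'agnostic'
    location: str = 'local'

    def summary(self) -> str:
        return f'{self.status} / {self.inbox} / {self.location}'

def build_menu_labels(current):
    # The four preset option sets from the diff, in the same order.
    presets = [
        Options(status='new only'),
        Options(inbox='require inbox'),
        Options(),                        # the defaults themselves
        Options(location='trash too'),
    ]

    if current is None:
        labels = ['default presented files']
    else:
        labels = [f'default presented files ({current.summary()})']

    for preset in presets:
        # Skip the preset already covered by the 'default' entry.
        if current is not None and preset == current:
            continue
        labels.append(preset.summary())

    return labels
```

With a single importer selected whose options match a preset, that preset collapses into the 'default' entry; with no single selection, all four presets are listed.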
@@ -2020,9 +2079,9 @@ class ManagementPanelImporterMultipleGallery( ManagementPanelImporter ):

    def _GetListCtrlMenu( self ):

        selected_watchers = self._gallery_importers_listctrl.GetData( only_selected = True )
        selected_importers = self._gallery_importers_listctrl.GetData( only_selected = True )

        if len( selected_watchers ) == 0:
        if len( selected_importers ) == 0:

            raise HydrusExceptions.DataMissing()

@@ -2033,10 +2092,16 @@ class ManagementPanelImporterMultipleGallery( ManagementPanelImporter ):

        ClientGUIMenus.AppendSeparator( menu )

        ClientGUIMenus.AppendMenuItem( menu, 'show all importers\' presented files', 'Gather the presented files for the selected importers and show them in a new page.', self._ShowSelectedImportersFiles, show='presented' )
        ClientGUIMenus.AppendMenuItem( menu, 'show all importers\' new files', 'Gather the presented files for the selected importers and show them in a new page.', self._ShowSelectedImportersFiles, show='new' )
        ClientGUIMenus.AppendMenuItem( menu, 'show all importers\' files', 'Gather the presented files for the selected importers and show them in a new page.', self._ShowSelectedImportersFiles, show='all' )
        ClientGUIMenus.AppendMenuItem( menu, 'show all importers\' files (including trash)', 'Gather the presented files (including trash) for the selected importers and show them in a new page.', self._ShowSelectedImportersFiles, show='all_and_trash' )
        single_selected_presentation_import_options = None

        if len( selected_importers ) == 1:

            ( importer, ) = selected_importers

            single_selected_presentation_import_options = importer.GetFileImportOptions().GetPresentationImportOptions()

        AddPresentationSubmenu( menu, 'downloader', single_selected_presentation_import_options, self._ShowSelectedImportersFiles )

        ClientGUIMenus.AppendSeparator( menu )

@@ -2090,8 +2155,6 @@ class ManagementPanelImporterMultipleGallery( ManagementPanelImporter ):

        if len( hashes ) > 0:

            hashes = HG.client_controller.Read( 'filter_hashes', CC.LOCAL_FILE_SERVICE_KEY, hashes )

            media_results = HG.client_controller.Read( 'media_results', hashes )

        else:
@@ -2330,7 +2393,7 @@ class ManagementPanelImporterMultipleGallery( ManagementPanelImporter ):

        frame.SetPanel( panel )

    def _ShowSelectedImportersFiles( self, show = 'presented' ):
    def _ShowSelectedImportersFiles( self, presentation_import_options = None ):

        gallery_imports = self._gallery_importers_listctrl.GetData( only_selected = True )

@@ -2344,18 +2407,7 @@ class ManagementPanelImporterMultipleGallery( ManagementPanelImporter ):

        for gallery_import in gallery_imports:

            if show == 'presented':

                gallery_hashes = gallery_import.GetPresentedHashes()

            elif show == 'new':

                gallery_hashes = gallery_import.GetNewHashes()

            elif show in ( 'all', 'all_and_trash' ):

                gallery_hashes = gallery_import.GetHashes()

            gallery_hashes = gallery_import.GetPresentedHashes( presentation_import_options = presentation_import_options )

            new_hashes = [ hash for hash in gallery_hashes if hash not in seen_hashes ]

@@ -2363,17 +2415,6 @@ class ManagementPanelImporterMultipleGallery( ManagementPanelImporter ):

            seen_hashes.update( new_hashes )

        if show == 'all_and_trash':

            filter_file_service_key = CC.COMBINED_LOCAL_FILE_SERVICE_KEY

        else:

            filter_file_service_key = CC.LOCAL_FILE_SERVICE_KEY

        hashes = HG.client_controller.Read( 'filter_hashes', filter_file_service_key, hashes )

        if len( hashes ) > 0:

            self._ClearExistingHighlightAndPanel()

@@ -2872,10 +2913,16 @@ class ManagementPanelImporterMultipleWatcher( ManagementPanelImporter ):

        ClientGUIMenus.AppendSeparator( menu )

        ClientGUIMenus.AppendMenuItem( menu, 'show all watchers\' presented files', 'Gather the presented files for the selected watchers and show them in a new page.', self._ShowSelectedImportersFiles, show='presented' )
        ClientGUIMenus.AppendMenuItem( menu, 'show all watchers\' new files', 'Gather the presented files for the selected watchers and show them in a new page.', self._ShowSelectedImportersFiles, show='new' )
        ClientGUIMenus.AppendMenuItem( menu, 'show all watchers\' files', 'Gather the presented files for the selected watchers and show them in a new page.', self._ShowSelectedImportersFiles, show='all' )
        ClientGUIMenus.AppendMenuItem( menu, 'show all watchers\' files (including trash)', 'Gather the presented files (including trash) for the selected watchers and show them in a new page.', self._ShowSelectedImportersFiles, show='all_and_trash' )
        single_selected_presentation_import_options = None

        if len( selected_watchers ) == 1:

            ( watcher, ) = selected_watchers

            single_selected_presentation_import_options = watcher.GetFileImportOptions().GetPresentationImportOptions()

        AddPresentationSubmenu( menu, 'watcher', single_selected_presentation_import_options, self._ShowSelectedImportersFiles )

        ClientGUIMenus.AppendSeparator( menu )

|
@ -2925,8 +2972,6 @@ class ManagementPanelImporterMultipleWatcher( ManagementPanelImporter ):
|
|||
|
||||
if len( hashes ) > 0:
|
||||
|
||||
hashes = HG.client_controller.Read( 'filter_hashes', CC.LOCAL_FILE_SERVICE_KEY, hashes )
|
||||
|
||||
media_results = HG.client_controller.Read( 'media_results', hashes )
|
||||
|
||||
else:
|
||||
|
@@ -3165,7 +3210,7 @@ class ManagementPanelImporterMultipleWatcher( ManagementPanelImporter ):

        frame.SetPanel( panel )

    def _ShowSelectedImportersFiles( self, show = 'presented' ):
    def _ShowSelectedImportersFiles( self, presentation_import_options = None ):

        watchers = self._watchers_listctrl.GetData( only_selected = True )

@@ -3179,18 +3224,7 @@ class ManagementPanelImporterMultipleWatcher( ManagementPanelImporter ):

        for watcher in watchers:

            if show == 'presented':

                watcher_hashes = watcher.GetPresentedHashes()

            elif show == 'new':

                watcher_hashes = watcher.GetNewHashes()

            elif show in ( 'all', 'all_and_trash' ):

                watcher_hashes = watcher.GetHashes()

            watcher_hashes = watcher.GetPresentedHashes( presentation_import_options = presentation_import_options )

            new_hashes = [ hash for hash in watcher_hashes if hash not in seen_hashes ]

@@ -3198,17 +3232,6 @@ class ManagementPanelImporterMultipleWatcher( ManagementPanelImporter ):

            seen_hashes.update( new_hashes )

        if show == 'all_and_trash':

            filter_file_service_key = CC.COMBINED_LOCAL_FILE_SERVICE_KEY

        else:

            filter_file_service_key = CC.LOCAL_FILE_SERVICE_KEY

        hashes = HG.client_controller.Read( 'filter_hashes', filter_file_service_key, hashes )

        if len( hashes ) > 0:

            self._ClearExistingHighlightAndPanel()

@@ -2906,7 +2906,7 @@ class PagesNotebook( QP.TabWidgetWithDnD ):

        WARNING_TOTAL_PAGES = self._controller.new_options.GetInteger( 'total_pages_warning' )
        MAX_TOTAL_PAGES = 500
        MAX_TOTAL_PAGES = max( 500, WARNING_TOTAL_PAGES * 2 )

        (
            total_active_page_count,
@@ -2924,7 +2924,7 @@ class PagesNotebook( QP.TabWidgetWithDnD ):

        if not HG.no_page_limit_mode:

            if total_active_page_count >= MAX_TOTAL_PAGES:
            if total_active_page_count >= MAX_TOTAL_PAGES and not ClientGUIFunctions.DialogIsOpen():

                message = 'The client should not have more than ' + str( MAX_TOTAL_PAGES ) + ' pages open, as it leads to program instability! Are you sure you want to open more pages?'

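The two PagesNotebook hunks above change the hard page limit from a flat 500 to one that scales with the user's warning value. A minimal sketch of the combined clamping and scaling, assuming the spinbox bounds shown in the diff (min 5, max 65535):

```python
def get_page_limits(total_pages_warning: int):
    # The spinbox clamps the user value to its min/max (assumption: 5..65535
    # as in the options dialog hunk above).
    warning = max(5, min(65535, total_pages_warning))

    # The yes/no dialog threshold is never below 500, but grows to twice
    # the warning value for page-heavy sessions.
    max_total_pages = max(500, warning * 2)

    return warning, max_total_pages
```

So the default warning of 165 keeps the old hard limit of 500, while a warning of 400 raises the dialog threshold to 800.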
@@ -26,6 +26,7 @@ from hydrus.client import ClientParsing

from hydrus.client.importing import ClientImportFiles
from hydrus.client.importing import ClientImporting
from hydrus.client.importing.options import FileImportOptions
from hydrus.client.importing.options import PresentationImportOptions
from hydrus.client.importing.options import TagImportOptions
from hydrus.client.metadata import ClientTags
from hydrus.client.networking import ClientNetworkingDomain

@@ -1027,34 +1028,18 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):

        self._UpdateModified()

    def ShouldPresent( self, file_import_options: FileImportOptions.FileImportOptions, in_inbox = None ):
    def ShouldPresent( self, presentation_import_options: PresentationImportOptions.PresentationImportOptions ):

        hash = self.GetHash()

        if hash is not None and self.status in CC.SUCCESSFUL_IMPORT_STATES:
        if not self.HasHash():

            if in_inbox is None:

                if file_import_options.ShouldPresentIgnorantOfInbox( self.status ):

                    return True

                if file_import_options.ShouldNotPresentIgnorantOfInbox( self.status ):

                    return False

                in_inbox = hash in HG.client_controller.Read( 'inbox_hashes', ( hash, ) )

            if file_import_options.ShouldPresent( self.status, in_inbox ):

                return True

            return False

        return False

        was_just_imported = not HydrusData.TimeHasPassed( self.modified + 5 )

        should_check_location = not was_just_imported

        return presentation_import_options.ShouldPresentHashAndStatus( self.GetHash(), self.status, should_check_location = should_check_location )

    def WorksInNewSystem( self ):

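The new `ShouldPresent` body above skips the location check for files whose `modified` timestamp is within the last five seconds, since a just-imported file may not yet be visible in the file service. A minimal plain-Python sketch of that time gate (the function name and parameters are illustrative, not hydrus API):

```python
import time

def should_check_location(modified_timestamp, now=None, grace=5.0):
    # Mirror of the diff's logic: was_just_imported = not TimeHasPassed(modified + 5).
    # A freshly imported file gets a short grace period before the presentation
    # code starts trusting the location check.
    if now is None:
        now = time.time()

    was_just_imported = now < modified_timestamp + grace

    return not was_just_imported
```

Within five seconds of import the location check is skipped; after that it applies normally.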
@@ -2489,7 +2474,7 @@ class FileSeedCache( HydrusSerialisable.SerialisableBase ):

        return latest_timestamp

    def GetNextFileSeed( self, status: int ):
    def GetNextFileSeed( self, status: int ) -> typing.Optional[ FileSeed ]:

        with self._lock:

@@ -2517,46 +2502,14 @@ class FileSeedCache( HydrusSerialisable.SerialisableBase ):

        return num_files

    def GetPresentedHashes( self, file_import_options: FileImportOptions.FileImportOptions ):
    def GetPresentedHashes( self, presentation_import_options: PresentationImportOptions.PresentationImportOptions ):

        with self._lock:

            eligible_file_seeds = [ file_seed for file_seed in self._file_seeds if file_seed.HasHash() ]
            hashes_and_statuses = [ ( file_seed.GetHash(), file_seed.status ) for file_seed in self._file_seeds if file_seed.HasHash() ]

        file_seed_hashes = [ file_seed.GetHash() for file_seed in eligible_file_seeds ]

        if len( file_seed_hashes ) > 0:

            inbox_hashes = HG.client_controller.Read( 'inbox_hashes', file_seed_hashes )

        else:

            inbox_hashes = set()

        hashes = []
        hashes_seen = set()

        for file_seed in eligible_file_seeds:

            hash = file_seed.GetHash()

            if hash in hashes_seen:

                continue

            in_inbox = hash in inbox_hashes

            if file_seed.ShouldPresent( file_import_options, in_inbox = in_inbox ):

                hashes.append( hash )
                hashes_seen.add( hash )

        return hashes
        return presentation_import_options.GetPresentedHashes( hashes_and_statuses )

    def GetStatus( self ):

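The old `GetPresentedHashes` body above walks the seeds, deduping by hash and filtering by status, before the diff delegates that work to `PresentationImportOptions`. A minimal plain-Python sketch of the order-preserving dedupe-and-filter idea (the status names and `new_only` flag are illustrative simplifications, not the hydrus enums):

```python
# Hypothetical status names standing in for hydrus's status constants.
GOOD_STATUSES = {'successful', 'already in db'}

def presented_hashes(hashes_and_statuses, new_only=False):
    # Order-preserving dedupe, like the hashes_seen set in the old code:
    # the first occurrence of a hash decides whether it is presented.
    seen = set()
    out = []

    wanted = {'successful'} if new_only else GOOD_STATUSES

    for h, status in hashes_and_statuses:

        if h in seen:
            continue

        seen.add(h)

        if status in wanted:
            out.append(h)

    return out
```

A file seed queue can contain the same hash more than once (e.g. a retry after a failure), so the dedupe step matters for keeping the presented page stable.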
@@ -13,6 +13,7 @@ from hydrus.client.importing import ClientImportFileSeeds

from hydrus.client.importing import ClientImportGallerySeeds
from hydrus.client.importing import ClientImporting
from hydrus.client.importing.options import FileImportOptions
from hydrus.client.importing.options import PresentationImportOptions
from hydrus.client.importing.options import TagImportOptions
from hydrus.client.networking import ClientNetworkingJobs

@@ -241,7 +242,7 @@ class GalleryImport( HydrusSerialisable.SerialisableBase ):

        with self._lock:

            should_present = self._publish_to_page and file_seed.ShouldPresent( self._file_import_options )
            should_present = self._publish_to_page and file_seed.ShouldPresent( self._file_import_options.GetPresentationImportOptions() )

            page_key = self._page_key

@@ -560,20 +561,6 @@ class GalleryImport( HydrusSerialisable.SerialisableBase ):

    def GetNewHashes( self ):

        with self._lock:

            fsc = self._file_seed_cache

        file_import_options = FileImportOptions.FileImportOptions()

        file_import_options.SetPresentationOptions( True, False, False )

        return fsc.GetPresentedHashes( file_import_options )

    def GetNumSeeds( self ):

        with self._lock:

@@ -590,15 +577,19 @@ class GalleryImport( HydrusSerialisable.SerialisableBase ):

    def GetPresentedHashes( self ):
    def GetPresentedHashes( self, presentation_import_options = None ):

        with self._lock:

            fsc = self._file_seed_cache
            fio = self._file_import_options

            if presentation_import_options is None:

                presentation_import_options = self._file_import_options.GetPresentationImportOptions()

        return fsc.GetPresentedHashes( fio )
        return fsc.GetPresentedHashes( presentation_import_options )

    def GetQueryText( self ):

@@ -169,7 +169,7 @@ class HDDImport( HydrusSerialisable.SerialisableBase ):

        if file_seed.status in CC.SUCCESSFUL_IMPORT_STATES:

            if file_seed.ShouldPresent( self._file_import_options ):
            if file_seed.ShouldPresent( self._file_import_options.GetPresentationImportOptions() ):

                file_seed.PresentToPage( page_key )

@@ -606,7 +606,7 @@ class ImportFolder( HydrusSerialisable.SerialisableBaseNamed ):

        action_pairs = list(self._actions.items())
        action_location_pairs = list(self._action_locations.items())

        return ( self._path, self._mimes, serialisable_file_import_options, serialisable_tag_import_options, serialisable_tag_service_keys_to_filename_tagging_options, action_pairs, action_location_pairs, self._period, self._check_regularly, serialisable_file_seed_cache, self._last_checked, self._paused, self._check_now, self._show_working_popup, self._publish_files_to_popup_button, self._publish_files_to_page )
        return ( self._path, list( self._mimes ), serialisable_file_import_options, serialisable_tag_import_options, serialisable_tag_service_keys_to_filename_tagging_options, action_pairs, action_location_pairs, self._period, self._check_regularly, serialisable_file_seed_cache, self._last_checked, self._paused, self._check_now, self._show_working_popup, self._publish_files_to_popup_button, self._publish_files_to_page )

    def _ImportFiles( self, job_key ):

@@ -714,7 +714,7 @@ class ImportFolder( HydrusSerialisable.SerialisableBaseNamed ):

            if hash not in presentation_hashes_fast:

                if file_seed.ShouldPresent( self._file_import_options ):
                if file_seed.ShouldPresent( self._file_import_options.GetPresentationImportOptions() ):

                    presentation_hashes.append( hash )

@@ -752,7 +752,9 @@ class ImportFolder( HydrusSerialisable.SerialisableBaseNamed ):

    def _InitialiseFromSerialisableInfo( self, serialisable_info ):

        ( self._path, self._mimes, serialisable_file_import_options, serialisable_tag_import_options, serialisable_tag_service_keys_to_filename_tagging_options, action_pairs, action_location_pairs, self._period, self._check_regularly, serialisable_file_seed_cache, self._last_checked, self._paused, self._check_now, self._show_working_popup, self._publish_files_to_popup_button, self._publish_files_to_page ) = serialisable_info
        ( self._path, mimes, serialisable_file_import_options, serialisable_tag_import_options, serialisable_tag_service_keys_to_filename_tagging_options, action_pairs, action_location_pairs, self._period, self._check_regularly, serialisable_file_seed_cache, self._last_checked, self._paused, self._check_now, self._show_working_popup, self._publish_files_to_popup_button, self._publish_files_to_page ) = serialisable_info

        self._mimes = set( mimes )

        self._actions = dict( action_pairs )
        self._action_locations = dict( action_location_pairs )

@@ -965,7 +967,9 @@ class ImportFolder( HydrusSerialisable.SerialisableBaseNamed ):
            self._file_seed_cache = ClientImportFileSeeds.FileSeedCache()
            
-        if set( mimes ) != set( self._mimes ):
+        mimes = set( mimes )
+        
+        if mimes != self._mimes:
            
            self._file_seed_cache.RemoveFileSeedsByStatus( ( CC.STATUS_VETOED, ) )
@@ -214,7 +214,7 @@ class SimpleDownloaderImport( HydrusSerialisable.SerialisableBase ):
        
        did_substantial_work = file_seed.WorkOnURL( self._file_seed_cache, status_hook, self._NetworkJobFactory, self._FileNetworkJobPresentationContextFactory, self._file_import_options, tag_import_options )
        
-        if file_seed.ShouldPresent( self._file_import_options ):
+        if file_seed.ShouldPresent( self._file_import_options.GetPresentationImportOptions() ):
            
            file_seed.PresentToPage( page_key )
@@ -890,7 +890,7 @@ class URLsImport( HydrusSerialisable.SerialisableBase ):
        
        did_substantial_work = file_seed.WorkOnURL( self._file_seed_cache, status_hook, self._NetworkJobFactory, self._FileNetworkJobPresentationContextFactory, self._file_import_options, self._tag_import_options )
        
-        if file_seed.ShouldPresent( self._file_import_options ):
+        if file_seed.ShouldPresent( self._file_import_options.GetPresentationImportOptions() ):
            
            file_seed.PresentToPage( page_key )
@@ -1018,7 +1018,7 @@ class SubscriptionLegacy( HydrusSerialisable.SerialisableBaseNamed ):
        
-        if file_seed.ShouldPresent( self._file_import_options ):
+        if file_seed.ShouldPresent( self._file_import_options.GetPresentationImportOptions() ):
            
            hash = file_seed.GetHash()
@@ -1029,7 +1029,7 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):
        
-        if file_seed.ShouldPresent( self._file_import_options ):
+        if file_seed.ShouldPresent( self._file_import_options.GetPresentationImportOptions() ):
            
            hash = file_seed.GetHash()
@@ -13,6 +13,7 @@ from hydrus.client.importing import ClientImportFileSeeds
from hydrus.client.importing import ClientImportGallerySeeds
from hydrus.client.importing.options import ClientImportOptions
from hydrus.client.importing.options import FileImportOptions
+from hydrus.client.importing.options import PresentationImportOptions
from hydrus.client.importing.options import TagImportOptions
from hydrus.client.metadata import ClientTags
from hydrus.client.networking import ClientNetworkingJobs
@@ -1070,7 +1071,7 @@ class WatcherImport( HydrusSerialisable.SerialisableBase ):
        with self._lock:
            
-            should_present = self._publish_to_page and file_seed.ShouldPresent( self._file_import_options )
+            should_present = self._publish_to_page and file_seed.ShouldPresent( self._file_import_options.GetPresentationImportOptions() )
            
            page_key = self._page_key
@@ -1252,20 +1253,6 @@ class WatcherImport( HydrusSerialisable.SerialisableBase ):
        
-    def GetNewHashes( self ):
-        
-        with self._lock:
-            
-            fsc = self._file_seed_cache
-            
-            file_import_options = FileImportOptions.FileImportOptions()
-            
-            file_import_options.SetPresentationOptions( True, False, False )
-            
-            return fsc.GetPresentedHashes( file_import_options )
-            
    def GetNextCheckTime( self ):
        
        with self._lock:
@@ -1290,15 +1277,19 @@ class WatcherImport( HydrusSerialisable.SerialisableBase ):
        
-    def GetPresentedHashes( self ):
+    def GetPresentedHashes( self, presentation_import_options = None ):
        
        with self._lock:
            
            fsc = self._file_seed_cache
-            fio = self._file_import_options
            
+            if presentation_import_options is None:
+                
+                presentation_import_options = self._file_import_options.GetPresentationImportOptions()
+                
-            return fsc.GetPresentedHashes( fio )
+            return fsc.GetPresentedHashes( presentation_import_options )
            
    def GetSimpleStatus( self ):
@@ -18,52 +18,6 @@ def FilterDeletedTags( service_key: bytes, media_result: ClientMediaResult.Media
    
    return tags
    
-def NewInboxArchiveMatch( new_files, inbox_files, archive_files, status, inbox ):
-    
-    if status == CC.STATUS_SUCCESSFUL_AND_NEW and new_files:
-        
-        return True
-        
-    elif status == CC.STATUS_SUCCESSFUL_BUT_REDUNDANT:
-        
-        if inbox and inbox_files:
-            
-            return True
-            
-        elif not inbox and archive_files:
-            
-            return True
-            
-    return False
-    
-def NewInboxArchiveMatchIgnorantOfInbox( new_files, inbox_files, archive_files, status ):
-    
-    if status == CC.STATUS_SUCCESSFUL_AND_NEW and new_files:
-        
-        return True
-        
-    elif status == CC.STATUS_SUCCESSFUL_BUT_REDUNDANT and archive_files and inbox_files:
-        
-        return True
-        
-    return False
-    
-def NewInboxArchiveNonMatchIgnorantOfInbox( new_files, inbox_files, archive_files, status ):
-    
-    if status == CC.STATUS_SUCCESSFUL_AND_NEW and not new_files:
-        
-        return True
-        
-    elif status == CC.STATUS_SUCCESSFUL_BUT_REDUNDANT and not ( archive_files or inbox_files ):
-        
-        return True
-        
-    return False
-    
class CheckerOptions( HydrusSerialisable.SerialisableBase ):
    
    SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_CHECKER_OPTIONS
@@ -6,12 +6,13 @@ from hydrus.core import HydrusExceptions
from hydrus.core import HydrusSerialisable

from hydrus.client.importing.options import ClientImportOptions
+from hydrus.client.importing.options import PresentationImportOptions

class FileImportOptions( HydrusSerialisable.SerialisableBase ):
    
    SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_FILE_IMPORT_OPTIONS
    SERIALISABLE_NAME = 'File Import Options'
-    SERIALISABLE_VERSION = 5
+    SERIALISABLE_VERSION = 6
    
    def __init__( self ):
@@ -29,27 +30,25 @@ class FileImportOptions( HydrusSerialisable.SerialisableBase ):
        self._automatic_archive = False
        self._associate_primary_urls = True
        self._associate_source_urls = True
-        self._present_new_files = True
-        self._present_already_in_inbox_files = True
-        self._present_already_in_archive_files = True
+        self._presentation_import_options = PresentationImportOptions.PresentationImportOptions()
        
    def _GetSerialisableInfo( self ):
        
        pre_import_options = ( self._exclude_deleted, self._do_not_check_known_urls_before_importing, self._do_not_check_hashes_before_importing, self._allow_decompression_bombs, self._min_size, self._max_size, self._max_gif_size, self._min_resolution, self._max_resolution )
        post_import_options = ( self._automatic_archive, self._associate_primary_urls, self._associate_source_urls )
-        presentation_options = ( self._present_new_files, self._present_already_in_inbox_files, self._present_already_in_archive_files )
+        serialisable_presentation_import_options = self._presentation_import_options.GetSerialisableTuple()
        
-        return ( pre_import_options, post_import_options, presentation_options )
+        return ( pre_import_options, post_import_options, serialisable_presentation_import_options )
        
    def _InitialiseFromSerialisableInfo( self, serialisable_info ):
        
-        ( pre_import_options, post_import_options, presentation_options ) = serialisable_info
+        ( pre_import_options, post_import_options, serialisable_presentation_import_options ) = serialisable_info
        
        ( self._exclude_deleted, self._do_not_check_known_urls_before_importing, self._do_not_check_hashes_before_importing, self._allow_decompression_bombs, self._min_size, self._max_size, self._max_gif_size, self._min_resolution, self._max_resolution ) = pre_import_options
        ( self._automatic_archive, self._associate_primary_urls, self._associate_source_urls ) = post_import_options
-        ( self._present_new_files, self._present_already_in_inbox_files, self._present_already_in_archive_files ) = presentation_options
+        self._presentation_import_options = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_presentation_import_options )
        
    def _UpdateSerialisableInfo( self, version, old_serialisable_info ):
@@ -122,6 +121,42 @@ class FileImportOptions( HydrusSerialisable.SerialisableBase ):
            return ( 5, new_serialisable_info )
            
+        if version == 5:
+            
+            ( pre_import_options, post_import_options, presentation_options ) = old_serialisable_info
+            
+            ( present_new_files, present_already_in_inbox_files, present_already_in_archive_files ) = presentation_options
+            
+            presentation_import_options = PresentationImportOptions.PresentationImportOptions()
+            
+            if not present_new_files:
+                
+                presentation_import_options.SetPresentationStatus( PresentationImportOptions.PRESENTATION_STATUS_NONE )
+                
+            else:
+                
+                if present_already_in_archive_files and present_already_in_inbox_files:
+                    
+                    presentation_import_options.SetPresentationStatus( PresentationImportOptions.PRESENTATION_STATUS_ANY_GOOD )
+                    
+                else:
+                    
+                    presentation_import_options.SetPresentationStatus( PresentationImportOptions.PRESENTATION_STATUS_NEW_ONLY )
+                    
+                    if present_already_in_inbox_files:
+                        
+                        presentation_import_options.SetPresentationInbox( PresentationImportOptions.PRESENTATION_INBOX_INCLUDE_INBOX )
+                        
+            serialisable_presentation_import_options = presentation_import_options.GetSerialisableTuple()
+            
+            new_serialisable_info = ( pre_import_options, post_import_options, serialisable_presentation_import_options )
+            
+            return ( 6, new_serialisable_info )
+            
    def AllowsDecompressionBombs( self ):
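The version 5 to 6 update in this hunk folds the three old presentation booleans into the new status/inbox enum pair. The following is a hypothetical stand-alone sketch of that mapping, not hydrus code; the constants mirror the `PresentationImportOptions` enums introduced in this commit, and `convert_legacy_presentation` is an illustrative name.

```python
# Sketch of the v5 -> v6 presentation-options migration. The enum values mirror
# PresentationImportOptions; the function itself is illustrative only.
PRESENTATION_STATUS_ANY_GOOD = 0
PRESENTATION_STATUS_NEW_ONLY = 1
PRESENTATION_STATUS_NONE = 2

PRESENTATION_INBOX_AGNOSTIC = 0
PRESENTATION_INBOX_INCLUDE_INBOX = 2

def convert_legacy_presentation( new_files, inbox_files, archive_files ):
    
    # default: present everything, inbox-agnostic
    status = PRESENTATION_STATUS_ANY_GOOD
    inbox = PRESENTATION_INBOX_AGNOSTIC
    
    if not new_files:
        
        # old options did not even show new files, so show nothing
        status = PRESENTATION_STATUS_NONE
        
    elif not ( archive_files and inbox_files ):
        
        # not all already-in-db files wanted, so fall back to new-only
        status = PRESENTATION_STATUS_NEW_ONLY
        
        if inbox_files:
            
            # but redundant files still in the inbox remain welcome
            inbox = PRESENTATION_INBOX_INCLUDE_INBOX
            
    return ( status, inbox )
```

Note that the old triple never maps to `PRESENTATION_INBOX_REQUIRE_INBOX`; that stricter mode only becomes reachable through the new UI.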
@@ -215,11 +250,9 @@ class FileImportOptions( HydrusSerialisable.SerialisableBase ):
        return self._exclude_deleted
        
-    def GetPresentationOptions( self ):
+    def GetPresentationImportOptions( self ):
        
-        presentation_options = ( self._present_new_files, self._present_already_in_inbox_files, self._present_already_in_archive_files )
-        
-        return presentation_options
+        return self._presentation_import_options
        
    def GetPreImportOptions( self ):
@@ -281,35 +314,9 @@ class FileImportOptions( HydrusSerialisable.SerialisableBase ):
        
        #
        
-        presentation_statements = []
+        statements.append( self._presentation_import_options.GetSummary() )
        
-        if self._present_new_files:
-            
-            presentation_statements.append( 'new' )
-            
-        if self._present_already_in_inbox_files:
-            
-            presentation_statements.append( 'already in inbox' )
-            
-        if self._present_already_in_archive_files:
-            
-            presentation_statements.append( 'already in archive' )
-            
-        if len( presentation_statements ) == 0:
-            
-            statements.append( 'not presenting any files' )
-            
-        elif len( presentation_statements ) == 3:
-            
-            statements.append( 'presenting all files' )
-            
-        else:
-            
-            statements.append( 'presenting ' + ', '.join( presentation_statements ) + ' files' )
-            
        #
        
        summary = os.linesep.join( statements )
@@ -323,11 +330,9 @@ class FileImportOptions( HydrusSerialisable.SerialisableBase ):
        self._associate_source_urls = associate_source_urls
        
-    def SetPresentationOptions( self, present_new_files, present_already_in_inbox_files, present_already_in_archive_files ):
+    def SetPresentationImportOptions( self, presentation_import_options: PresentationImportOptions.PresentationImportOptions ):
        
-        self._present_new_files = present_new_files
-        self._present_already_in_inbox_files = present_already_in_inbox_files
-        self._present_already_in_archive_files = present_already_in_archive_files
+        self._presentation_import_options = presentation_import_options
        
    def SetPreImportOptions( self, exclude_deleted, do_not_check_known_urls_before_importing, do_not_check_hashes_before_importing, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution ):
@@ -363,19 +368,4 @@ class FileImportOptions( HydrusSerialisable.SerialisableBase ):
        return self._do_not_check_known_urls_before_importing
        
-    def ShouldNotPresentIgnorantOfInbox( self, status ):
-        
-        return ClientImportOptions.NewInboxArchiveNonMatchIgnorantOfInbox( self._present_new_files, self._present_already_in_inbox_files, self._present_already_in_archive_files, status )
-        
-    def ShouldPresent( self, status, inbox ):
-        
-        return ClientImportOptions.NewInboxArchiveMatch( self._present_new_files, self._present_already_in_inbox_files, self._present_already_in_archive_files, status, inbox )
-        
-    def ShouldPresentIgnorantOfInbox( self, status ):
-        
-        return ClientImportOptions.NewInboxArchiveMatchIgnorantOfInbox( self._present_new_files, self._present_already_in_inbox_files, self._present_already_in_archive_files, status )
-        
HydrusSerialisable.SERIALISABLE_TYPES_TO_OBJECT_TYPES[ HydrusSerialisable.SERIALISABLE_TYPE_FILE_IMPORT_OPTIONS ] = FileImportOptions
@@ -0,0 +1,325 @@
+from hydrus.core import HydrusExceptions
+from hydrus.core import HydrusGlobals as HG
+from hydrus.core import HydrusSerialisable
+
+from hydrus.client import ClientConstants as CC
+
+PRESENTATION_LOCATION_IN_LOCAL_FILES = 0
+PRESENTATION_LOCATION_IN_TRASH_TOO = 1
+
+presentation_location_enum_str_lookup = {
+    PRESENTATION_LOCATION_IN_LOCAL_FILES : 'in my collection',
+    PRESENTATION_LOCATION_IN_TRASH_TOO : 'in my collection or trash'
+}
+
+PRESENTATION_STATUS_ANY_GOOD = 0
+PRESENTATION_STATUS_NEW_ONLY = 1
+PRESENTATION_STATUS_NONE = 2
+
+presentation_status_enum_str_lookup = {
+    PRESENTATION_STATUS_ANY_GOOD : 'all files',
+    PRESENTATION_STATUS_NEW_ONLY : 'new files',
+    PRESENTATION_STATUS_NONE : 'do not show anything'
+}
+
+PRESENTATION_INBOX_AGNOSTIC = 0
+PRESENTATION_INBOX_REQUIRE_INBOX = 1
+PRESENTATION_INBOX_INCLUDE_INBOX = 2
+
+presentation_inbox_enum_str_lookup = {
+    PRESENTATION_INBOX_AGNOSTIC : 'inbox or archive',
+    PRESENTATION_INBOX_REQUIRE_INBOX : 'must be in inbox',
+    PRESENTATION_INBOX_INCLUDE_INBOX : 'or in inbox'
+}
+
+class PresentationImportOptions( HydrusSerialisable.SerialisableBase ):
+    
+    SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_PRESENTATION_IMPORT_OPTIONS
+    SERIALISABLE_NAME = 'Presentation Import Options'
+    SERIALISABLE_VERSION = 1
+    
+    def __init__( self ):
+        
+        HydrusSerialisable.SerialisableBase.__init__( self )
+        
+        self._presentation_location = PRESENTATION_LOCATION_IN_LOCAL_FILES
+        self._presentation_status = PRESENTATION_STATUS_ANY_GOOD
+        self._presentation_inbox = PRESENTATION_INBOX_AGNOSTIC
+        
+    def __eq__( self, other ):
+        
+        if isinstance( other, PresentationImportOptions ):
+            
+            return self.__hash__() == other.__hash__()
+            
+        return NotImplemented
+        
+    def __hash__( self ):
+        
+        return ( self._presentation_location, self._presentation_status, self._presentation_inbox ).__hash__()
+        
+    def _DefinitelyShouldNotPresentIgnorantOfInbox( self, status ):
+        
+        if self._presentation_status == PRESENTATION_STATUS_NONE:
+            
+            return True
+            
+        if self._presentation_status == PRESENTATION_STATUS_NEW_ONLY:
+            
+            if status == CC.STATUS_SUCCESSFUL_BUT_REDUNDANT and self._presentation_inbox != PRESENTATION_INBOX_INCLUDE_INBOX:
+                
+                # only want new, and this is already in db
+                
+                return True
+                
+        return False
+        
+    def _DefinitelyShouldPresentIgnorantOfInbox( self, status ):
+        
+        if self._presentation_status == PRESENTATION_STATUS_NONE:
+            
+            return False
+            
+        if self._presentation_inbox == PRESENTATION_INBOX_REQUIRE_INBOX:
+            
+            # we can't know either way
+            
+            return False
+            
+        if self._presentation_status == PRESENTATION_STATUS_ANY_GOOD:
+            
+            # we accept all
+            
+            return True
+            
+        elif self._presentation_status == PRESENTATION_STATUS_NEW_ONLY:
+            
+            if status == CC.STATUS_SUCCESSFUL_AND_NEW:
+                
+                # we accept all new and this is new
+                
+                return True
+                
+        return False
+        
+    def _GetSerialisableInfo( self ):
+        
+        return ( self._presentation_location, self._presentation_status, self._presentation_inbox )
+        
+    def _InitialiseFromSerialisableInfo( self, serialisable_info ):
+        
+        ( self._presentation_location, self._presentation_status, self._presentation_inbox ) = serialisable_info
+        
+    def _ShouldPresentGivenStatusAndInbox( self, status, inbox ):
+        
+        if self._presentation_status == PRESENTATION_STATUS_NONE:
+            
+            return False
+            
+        if self._presentation_inbox == PRESENTATION_INBOX_REQUIRE_INBOX:
+            
+            if not inbox:
+                
+                return False
+                
+        #
+        
+        if self._presentation_status == PRESENTATION_STATUS_NEW_ONLY:
+            
+            if status == CC.STATUS_SUCCESSFUL_BUT_REDUNDANT:
+                
+                if self._presentation_inbox == PRESENTATION_INBOX_AGNOSTIC:
+                    
+                    # only want new, and this is already in db
+                    
+                    return False
+                    
+                elif self._presentation_inbox == PRESENTATION_INBOX_INCLUDE_INBOX:
+                    
+                    if not inbox:
+                        
+                        # only want new or inbox, and this is already in db and archived
+                        
+                        return False
+                        
+        return True
+        
+    def GetPresentationInbox( self ) -> int:
+        
+        return self._presentation_inbox
+        
+    def GetPresentationLocation( self ) -> int:
+        
+        return self._presentation_location
+        
+    def GetPresentationStatus( self ) -> int:
+        
+        return self._presentation_status
+        
+    def GetSummary( self ):
+        
+        if self._presentation_status == PRESENTATION_STATUS_NONE:
+            
+            return 'not presenting any files'
+            
+        summary = presentation_status_enum_str_lookup[ self._presentation_status ]
+        
+        if self._presentation_inbox == PRESENTATION_INBOX_REQUIRE_INBOX:
+            
+            if self._presentation_status == PRESENTATION_STATUS_ANY_GOOD:
+                
+                summary = 'inbox files'
+                
+            else:
+                
+                summary = 'new inbox files'
+                
+        elif self._presentation_inbox == PRESENTATION_INBOX_INCLUDE_INBOX:
+            
+            if self._presentation_status == PRESENTATION_STATUS_NEW_ONLY:
+                
+                summary = 'new or inbox files'
+                
+        if self._presentation_location == PRESENTATION_LOCATION_IN_TRASH_TOO:
+            
+            summary = '{}, including trash'.format( summary )
+            
+        return summary
+        
+    def SetPresentationInbox( self, presentation_inbox: int ):
+        
+        self._presentation_inbox = presentation_inbox
+        
+    def SetPresentationLocation( self, presentation_location: int ):
+        
+        self._presentation_location = presentation_location
+        
+    def SetPresentationStatus( self, presentation_status: int ):
+        
+        self._presentation_status = presentation_status
+        
+    def GetPresentedHashes( self, hashes_and_statuses, should_check_location = True ):
+        
+        if self._presentation_status == PRESENTATION_STATUS_NONE:
+            
+            return []
+            
+        hashes_handled = set()
+        needs_inbox_lookup = set()
+        desired_hashes = set()
+        
+        for ( hash, status ) in hashes_and_statuses:
+            
+            if hash in hashes_handled:
+                
+                continue
+                
+            if status not in CC.SUCCESSFUL_IMPORT_STATES:
+                
+                hashes_handled.add( hash )
+                
+                continue
+                
+            if self._DefinitelyShouldNotPresentIgnorantOfInbox( status ):
+                
+                hashes_handled.add( hash )
+                
+                continue
+                
+            if self._DefinitelyShouldPresentIgnorantOfInbox( status ):
+                
+                hashes_handled.add( hash )
+                desired_hashes.add( hash )
+                
+                continue
+                
+            needs_inbox_lookup.add( hash )
+            
+        if len( needs_inbox_lookup ) > 0:
+            
+            inbox_hashes = HG.client_controller.Read( 'inbox_hashes', needs_inbox_lookup )
+            
+            for ( hash, status ) in hashes_and_statuses:
+                
+                if hash in hashes_handled:
+                    
+                    continue
+                    
+                in_inbox = hash in inbox_hashes
+                
+                if self._ShouldPresentGivenStatusAndInbox( status, in_inbox ):
+                    
+                    desired_hashes.add( hash )
+                    
+                hashes_handled.add( hash )
+                
+        presented_hashes = [ hash for ( hash, status ) in hashes_and_statuses if hash in desired_hashes ]
+        
+        if len( presented_hashes ) > 0 and should_check_location:
+            
+            file_service_key = CC.LOCAL_FILE_SERVICE_KEY
+            
+            if self._presentation_location == PRESENTATION_LOCATION_IN_TRASH_TOO:
+                
+                file_service_key = CC.COMBINED_LOCAL_FILE_SERVICE_KEY
+                
+            presented_hashes = HG.client_controller.Read( 'filter_hashes', file_service_key, presented_hashes )
+            
+        return presented_hashes
+        
+    def ShouldPresentHashAndStatus( self, hash, status, should_check_location = True ):
+        
+        hashes = self.GetPresentedHashes( [ ( hash, status ) ], should_check_location = should_check_location )
+        
+        return len( hashes ) > 0
+        
+HydrusSerialisable.SERIALISABLE_TYPES_TO_OBJECT_TYPES[ HydrusSerialisable.SERIALISABLE_TYPE_PRESENTATION_IMPORT_OPTIONS ] = PresentationImportOptions
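The decision table at the heart of the new file above can be exercised in isolation. A minimal sketch (illustrative only; the status strings are stand-ins for the `CC.STATUS_...` constants, and `should_present` is a hypothetical free-function rendering of `_ShouldPresentGivenStatusAndInbox`):

```python
# Stand-ins for CC.STATUS_SUCCESSFUL_AND_NEW / CC.STATUS_SUCCESSFUL_BUT_REDUNDANT
STATUS_SUCCESSFUL_AND_NEW = 'new'
STATUS_SUCCESSFUL_BUT_REDUNDANT = 'redundant'

PRESENTATION_STATUS_ANY_GOOD = 0
PRESENTATION_STATUS_NEW_ONLY = 1
PRESENTATION_STATUS_NONE = 2

PRESENTATION_INBOX_AGNOSTIC = 0
PRESENTATION_INBOX_REQUIRE_INBOX = 1
PRESENTATION_INBOX_INCLUDE_INBOX = 2

def should_present( presentation_status, presentation_inbox, status, in_inbox ):
    
    # 'do not show anything' wins outright
    if presentation_status == PRESENTATION_STATUS_NONE:
        
        return False
        
    # 'must be in inbox' filters out everything archived
    if presentation_inbox == PRESENTATION_INBOX_REQUIRE_INBOX and not in_inbox:
        
        return False
        
    # 'new files' mode rejects already-in-db files, unless 'or in inbox'
    # rescues those still sitting in the inbox
    if presentation_status == PRESENTATION_STATUS_NEW_ONLY and status == STATUS_SUCCESSFUL_BUT_REDUNDANT:
        
        if presentation_inbox == PRESENTATION_INBOX_AGNOSTIC:
            
            return False
            
        if presentation_inbox == PRESENTATION_INBOX_INCLUDE_INBOX and not in_inbox:
            
            return False
            
    return True
```

`GetPresentedHashes` only falls back to this full test for hashes whose verdict genuinely depends on inbox state; the two `_Definitely...` pre-filters let it skip the database inbox lookup for everything else.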
@@ -10,6 +10,7 @@ from hydrus.core import HydrusSerialisable
from hydrus.core import HydrusTags
from hydrus.core import HydrusText

from hydrus.client import ClientConstants as CC
from hydrus.client import ClientData
from hydrus.client.importing.options import ClientImportOptions
from hydrus.client.media import ClientMediaResult
@@ -697,6 +698,26 @@ class TagImportOptions( HydrusSerialisable.SerialisableBase ):
    
HydrusSerialisable.SERIALISABLE_TYPES_TO_OBJECT_TYPES[ HydrusSerialisable.SERIALISABLE_TYPE_TAG_IMPORT_OPTIONS ] = TagImportOptions

+def NewInboxArchiveMatch( new_files, inbox_files, archive_files, status, inbox ):
+    
+    if status == CC.STATUS_SUCCESSFUL_AND_NEW and new_files:
+        
+        return True
+        
+    elif status == CC.STATUS_SUCCESSFUL_BUT_REDUNDANT:
+        
+        if inbox and inbox_files:
+            
+            return True
+            
+        elif not inbox and archive_files:
+            
+            return True
+            
+    return False
+
class ServiceTagImportOptions( HydrusSerialisable.SerialisableBase ):
    
    SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_SERVICE_TAG_IMPORT_OPTIONS
@@ -845,7 +866,7 @@ class ServiceTagImportOptions( HydrusSerialisable.SerialisableBase ):
        
        in_inbox = media_result.GetInbox()
        
-        if ClientImportOptions.NewInboxArchiveMatch( self._to_new_files, self._to_already_in_inbox, self._to_already_in_archive, status, in_inbox ):
+        if NewInboxArchiveMatch( self._to_new_files, self._to_already_in_inbox, self._to_already_in_archive, status, in_inbox ):
            
            if self._get_tags:
@@ -461,6 +461,8 @@ class TagDisplayMaintenanceManager( object ):
        
        while not ( HG.view_shutdown or self._shutdown ):
            
+            self._controller.WaitUntilViewFree()
+            
            if self._WorkPermitted() and self._WorkToDo():
                
                try:
@@ -796,31 +796,7 @@ class NetworkJob( object ):
        
        except Exception as e:
            
-            if hasattr( cloudscraper.exceptions, 'CloudflareReCaptchaProvider' ):
-                
-                e_type_test = getattr( cloudscraper.exceptions, 'CloudflareReCaptchaProvider' )
-                
-            elif hasattr( cloudscraper.exceptions, 'CloudflareCaptchaProvider' ):
-                
-                e_type_test = getattr( cloudscraper.exceptions, 'CloudflareCaptchaProvider' )
-                
-            else:
-                
-                e_type_test = int
-                
-            if isinstance( e, e_type_test ):
-                
-                message = 'The page had a captcha, and hydrus does not yet plug cloudscraper into a captcha-solving service.'
-                
-            else:
-                
-                message = str( e )
-                
-            HydrusData.PrintException( e )
-            
-            raise HydrusExceptions.CloudFlareException( 'Looks like an unsolvable CloudFlare issue: {}'.format( message ) )
+            raise HydrusExceptions.CloudFlareException( 'This looks like an unsolvable CloudFlare captcha! Best solution we know of is to copy cookies and User-Agent header from your web browser to hydrus!' )
            
        raise HydrusExceptions.ShouldReattemptNetworkException( 'CloudFlare needed solving.' )
@@ -81,7 +81,7 @@ options = {}
# Misc

NETWORK_VERSION = 20
-SOFTWARE_VERSION = 462
+SOFTWARE_VERSION = 463
CLIENT_API_VERSION = 22

SERVER_THUMBNAIL_DIMENSIONS = ( 200, 200 )
@@ -553,28 +553,29 @@ GENERAL_APPLICATION = 43
GENERAL_ANIMATION = 44
APPLICATION_CLIP = 45
AUDIO_WAVE = 46
+VIDEO_OGV = 47
APPLICATION_OCTET_STREAM = 100
APPLICATION_UNKNOWN = 101

GENERAL_FILETYPES = { GENERAL_APPLICATION, GENERAL_AUDIO, GENERAL_IMAGE, GENERAL_VIDEO, GENERAL_ANIMATION }

-SEARCHABLE_MIMES = { IMAGE_JPEG, IMAGE_PNG, IMAGE_APNG, IMAGE_GIF, IMAGE_WEBP, IMAGE_TIFF, IMAGE_ICON, APPLICATION_FLASH, VIDEO_AVI, VIDEO_FLV, VIDEO_MOV, VIDEO_MP4, VIDEO_MKV, VIDEO_REALMEDIA, VIDEO_WEBM, VIDEO_MPEG, APPLICATION_CLIP, APPLICATION_PSD, APPLICATION_PDF, APPLICATION_ZIP, APPLICATION_RAR, APPLICATION_7Z, AUDIO_M4A, AUDIO_MP3, AUDIO_REALMEDIA, AUDIO_OGG, AUDIO_FLAC, AUDIO_WAVE, AUDIO_TRUEAUDIO, AUDIO_WMA, VIDEO_WMV }
+SEARCHABLE_MIMES = { IMAGE_JPEG, IMAGE_PNG, IMAGE_APNG, IMAGE_GIF, IMAGE_WEBP, IMAGE_TIFF, IMAGE_ICON, APPLICATION_FLASH, VIDEO_AVI, VIDEO_FLV, VIDEO_MOV, VIDEO_MP4, VIDEO_MKV, VIDEO_REALMEDIA, VIDEO_WEBM, VIDEO_OGV, VIDEO_MPEG, APPLICATION_CLIP, APPLICATION_PSD, APPLICATION_PDF, APPLICATION_ZIP, APPLICATION_RAR, APPLICATION_7Z, AUDIO_M4A, AUDIO_MP3, AUDIO_REALMEDIA, AUDIO_OGG, AUDIO_FLAC, AUDIO_WAVE, AUDIO_TRUEAUDIO, AUDIO_WMA, VIDEO_WMV }

STORABLE_MIMES = set( SEARCHABLE_MIMES ).union( { APPLICATION_HYDRUS_UPDATE_CONTENT, APPLICATION_HYDRUS_UPDATE_DEFINITIONS } )

ALLOWED_MIMES = set( STORABLE_MIMES ).union( { IMAGE_BMP } )

-DECOMPRESSION_BOMB_IMAGES = ( IMAGE_JPEG, IMAGE_PNG )
+DECOMPRESSION_BOMB_IMAGES = { IMAGE_JPEG, IMAGE_PNG }

-IMAGES = ( IMAGE_JPEG, IMAGE_PNG, IMAGE_BMP, IMAGE_WEBP, IMAGE_TIFF, IMAGE_ICON )
+IMAGES = { IMAGE_JPEG, IMAGE_PNG, IMAGE_BMP, IMAGE_WEBP, IMAGE_TIFF, IMAGE_ICON }

-ANIMATIONS = ( IMAGE_GIF, IMAGE_APNG )
+ANIMATIONS = { IMAGE_GIF, IMAGE_APNG }

-AUDIO = ( AUDIO_M4A, AUDIO_MP3, AUDIO_OGG, AUDIO_FLAC, AUDIO_WAVE, AUDIO_WMA, AUDIO_REALMEDIA, AUDIO_TRUEAUDIO )
+AUDIO = { AUDIO_M4A, AUDIO_MP3, AUDIO_OGG, AUDIO_FLAC, AUDIO_WAVE, AUDIO_WMA, AUDIO_REALMEDIA, AUDIO_TRUEAUDIO }

-VIDEO = ( VIDEO_AVI, VIDEO_FLV, VIDEO_MOV, VIDEO_MP4, VIDEO_WMV, VIDEO_MKV, VIDEO_REALMEDIA, VIDEO_WEBM, VIDEO_MPEG )
+VIDEO = { VIDEO_AVI, VIDEO_FLV, VIDEO_MOV, VIDEO_MP4, VIDEO_WMV, VIDEO_MKV, VIDEO_REALMEDIA, VIDEO_WEBM, VIDEO_OGV, VIDEO_MPEG }

-APPLICATIONS = ( APPLICATION_FLASH, APPLICATION_PSD, APPLICATION_CLIP, APPLICATION_PDF, APPLICATION_ZIP, APPLICATION_RAR, APPLICATION_7Z )
+APPLICATIONS = { APPLICATION_FLASH, APPLICATION_PSD, APPLICATION_CLIP, APPLICATION_PDF, APPLICATION_ZIP, APPLICATION_RAR, APPLICATION_7Z }

general_mimetypes_to_mime_groups = {}
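Switching these mime groups from tuples to sets keeps `mime in HC.VIDEO` style membership tests behaviourally identical while making them O(1) hash lookups, and it enables building derived groups with set unions. A small sketch with hypothetical stand-in values (only `VIDEO_OGV = 47` matches the real constants above):

```python
# Stand-in mime constants; VIDEO_OGV mirrors the real value, the rest are arbitrary.
VIDEO_MP4 = 25
VIDEO_OGV = 47
IMAGE_PNG = 2
APPLICATION_FLASH = 10

# As sets, membership tests are hash lookups rather than linear scans...
VIDEO = { VIDEO_MP4, VIDEO_OGV }
IMAGES = { IMAGE_PNG }

# ...and derived groups compose via union instead of hand-maintained tuples.
MIMES_WITH_THUMBNAILS = IMAGES.union( VIDEO ).union( { APPLICATION_FLASH } )

def has_thumbnail( mime ):
    
    return mime in MIMES_WITH_THUMBNAILS
```

This is why adding ogv only required touching `VIDEO` here: union-built groups such as the thumbnail set pick the new mime up automatically.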
@@ -597,9 +598,9 @@ for ( general_mime_type, mimes_in_type ) in general_mimetypes_to_mime_groups.ite
MIMES_THAT_DEFINITELY_HAVE_AUDIO = tuple( [ APPLICATION_FLASH ] + list( AUDIO ) )
MIMES_THAT_MAY_HAVE_AUDIO = tuple( list( MIMES_THAT_DEFINITELY_HAVE_AUDIO ) + list( VIDEO ) )

-ARCHIVES = ( APPLICATION_ZIP, APPLICATION_HYDRUS_ENCRYPTED_ZIP, APPLICATION_RAR, APPLICATION_7Z )
+ARCHIVES = { APPLICATION_ZIP, APPLICATION_HYDRUS_ENCRYPTED_ZIP, APPLICATION_RAR, APPLICATION_7Z }

-MIMES_WITH_THUMBNAILS = ( APPLICATION_FLASH, APPLICATION_CLIP, IMAGE_JPEG, IMAGE_PNG, IMAGE_APNG, IMAGE_GIF, IMAGE_BMP, IMAGE_WEBP, IMAGE_TIFF, IMAGE_ICON, APPLICATION_PSD, VIDEO_AVI, VIDEO_FLV, VIDEO_MOV, VIDEO_MP4, VIDEO_WMV, VIDEO_MKV, VIDEO_REALMEDIA, VIDEO_WEBM, VIDEO_MPEG )
+MIMES_WITH_THUMBNAILS = set( IMAGES ).union( ANIMATIONS ).union( VIDEO ).union( { APPLICATION_FLASH, APPLICATION_CLIP, APPLICATION_PSD } )

HYDRUS_UPDATE_FILES = ( APPLICATION_HYDRUS_UPDATE_DEFINITIONS, APPLICATION_HYDRUS_UPDATE_CONTENT )
@@ -656,6 +657,7 @@ mime_enum_lookup[ 'video/mp4' ] = VIDEO_MP4
mime_enum_lookup[ 'video/mpeg' ] = VIDEO_MPEG
mime_enum_lookup[ 'video/x-ms-wmv' ] = VIDEO_WMV
mime_enum_lookup[ 'video/x-matroska' ] = VIDEO_MKV
+mime_enum_lookup[ 'video/ogg' ] = VIDEO_OGV
mime_enum_lookup[ 'video/vnd.rn-realvideo' ] = VIDEO_REALMEDIA
mime_enum_lookup[ 'application/vnd.rn-realmedia' ] = VIDEO_REALMEDIA
mime_enum_lookup[ 'video/webm' ] = VIDEO_WEBM
@@ -703,6 +705,7 @@ mime_string_lookup[ VIDEO_MP4 ] = 'mp4'
mime_string_lookup[ VIDEO_MPEG ] = 'mpeg'
mime_string_lookup[ VIDEO_WMV ] = 'wmv'
mime_string_lookup[ VIDEO_MKV ] = 'matroska'
+mime_string_lookup[ VIDEO_OGV ] = 'ogv'
mime_string_lookup[ VIDEO_REALMEDIA ] = 'realvideo'
mime_string_lookup[ VIDEO_WEBM ] = 'webm'
mime_string_lookup[ UNDETERMINED_WM ] = 'wma or wmv'
@@ -754,6 +757,7 @@ mime_mimetype_string_lookup[ VIDEO_MP4 ] = 'video/mp4'
mime_mimetype_string_lookup[ VIDEO_MPEG ] = 'video/mpeg'
mime_mimetype_string_lookup[ VIDEO_WMV ] = 'video/x-ms-wmv'
mime_mimetype_string_lookup[ VIDEO_MKV ] = 'video/x-matroska'
+mime_mimetype_string_lookup[ VIDEO_OGV ] = 'video/ogg'
mime_mimetype_string_lookup[ VIDEO_REALMEDIA ] = 'video/vnd.rn-realvideo'
mime_mimetype_string_lookup[ VIDEO_WEBM ] = 'video/webm'
mime_mimetype_string_lookup[ UNDETERMINED_WM ] = 'audio/x-ms-wma or video/x-ms-wmv'
@@ -805,6 +809,7 @@ mime_ext_lookup[ VIDEO_MP4 ] = '.mp4'

mime_ext_lookup[ VIDEO_MPEG ] = '.mpeg'
mime_ext_lookup[ VIDEO_WMV ] = '.wmv'
mime_ext_lookup[ VIDEO_MKV ] = '.mkv'
+ mime_ext_lookup[ VIDEO_OGV ] = '.ogv'
mime_ext_lookup[ VIDEO_REALMEDIA ] = '.rm'
mime_ext_lookup[ VIDEO_WEBM ] = '.webm'
mime_ext_lookup[ APPLICATION_UNKNOWN ] = ''
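The four lookup tables patched above must stay in sync: each new mime (here VIDEO_OGV) needs an enum mapping, a pretty string, a mimetype string, and an extension. A minimal sketch of that invariant, using stand-in values rather than the real HydrusConstants integers (the `59` is an assumption for illustration only):

```python
# Stand-in for the real integer enum in HydrusConstants (value is an assumption).
VIDEO_OGV = 59

# The same four tables the commit extends, reduced to the new entry.
mime_enum_lookup = { 'video/ogg': VIDEO_OGV }
mime_string_lookup = { VIDEO_OGV: 'ogv' }
mime_mimetype_string_lookup = { VIDEO_OGV: 'video/ogg' }
mime_ext_lookup = { VIDEO_OGV: '.ogv' }

# Every mime registered in the enum table should appear in all the others.
for mime in mime_enum_lookup.values():
    assert mime in mime_string_lookup
    assert mime in mime_mimetype_string_lookup
    assert mime in mime_ext_lookup

# The mimetype string should round-trip back to the same enum value.
assert mime_enum_lookup[ mime_mimetype_string_lookup[ VIDEO_OGV ] ] == VIDEO_OGV

print( 'lookups consistent for', mime_string_lookup[ VIDEO_OGV ] )
```

Missing any one of the four entries is the kind of wiring bug that shows up later as a blank thumbnail or a download saved without an extension.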
@@ -259,7 +259,7 @@ def ConvertResolutionToPrettyString( resolution ):

    except:

-        'broken resolution'
+        return 'broken resolution'
@@ -244,10 +244,6 @@ def GetFileInfo( path, mime = None, ok_to_look_for_hydrus_updates = False ):

    ( ( width, height ), duration, num_frames, has_audio ) = HydrusVideoHandling.GetFFMPEGAPNGProperties( path )

-    elif mime in ( HC.VIDEO_AVI, HC.VIDEO_FLV, HC.VIDEO_WMV, HC.VIDEO_MOV, HC.VIDEO_MP4, HC.VIDEO_MKV, HC.VIDEO_REALMEDIA, HC.VIDEO_WEBM, HC.VIDEO_MPEG ):
-
-        ( ( width, height ), duration, num_frames, has_audio ) = HydrusVideoHandling.GetFFMPEGVideoProperties( path )

    elif mime == HC.APPLICATION_PDF:

        num_words = HydrusDocumentHandling.GetPDFNumWords( path ) # this now give None until a better solution can be found
@@ -256,6 +252,10 @@ def GetFileInfo( path, mime = None, ok_to_look_for_hydrus_updates = False ):

    ( width, height ) = HydrusImageHandling.GetPSDResolution( path )

+    elif mime in HC.VIDEO:
+
+        ( ( width, height ), duration, num_frames, has_audio ) = HydrusVideoHandling.GetFFMPEGVideoProperties( path )

    elif mime in HC.AUDIO:

        ffmpeg_lines = HydrusVideoHandling.GetFFMPEGInfoLines( path )
@@ -128,6 +128,7 @@ SERIALISABLE_TYPE_GUI_SESSION_CONTAINER = 104

SERIALISABLE_TYPE_GUI_SESSION_PAGE_DATA = 105
SERIALISABLE_TYPE_GUI_SESSION_CONTAINER_PAGE_NOTEBOOK = 106
SERIALISABLE_TYPE_GUI_SESSION_CONTAINER_PAGE_SINGLE = 107
+ SERIALISABLE_TYPE_PRESENTATION_IMPORT_OPTIONS = 108

SERIALISABLE_TYPES_TO_OBJECT_TYPES = {}
|
@ -291,6 +291,10 @@ def ConvertTagSliceToString( tag_slice ):
|
|||
return tag_slice
|
||||
|
||||
|
||||
def IsUnnamespaced( tag ):
|
||||
|
||||
return SplitTag( tag )[0] == ''
|
||||
|
||||
def SplitTag( tag ):
|
||||
|
||||
if ':' in tag:
|
||||
|
|
|
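The new `IsUnnamespaced` helper leans entirely on `SplitTag`, whose body is truncated in the hunk above. A self-contained sketch of the pair; the `SplitTag` completion here is an assumption consistent with hydrus's 'namespace:subtag' convention, not the verbatim source:

```python
def SplitTag(tag):
    # A namespaced tag is 'namespace:subtag'; unnamespaced tags get '' as namespace.
    # Only the first colon splits, so 'title:a:b' keeps 'a:b' as the subtag.
    if ':' in tag:
        return tuple(tag.split(':', 1))
    return ('', tag)

def IsUnnamespaced(tag):
    # Unnamespaced means the namespace half of the split is empty.
    return SplitTag(tag)[0] == ''

print(SplitTag('character:samus aran'))  # -> ('character', 'samus aran')
print(IsUnnamespaced('samus aran'))      # -> True
print(IsUnnamespaced('series:metroid'))  # -> False
```

Having this as a named predicate is what lets the namespaced/unnamespaced search optimisation mentioned in the changelog branch cheaply on tag shape.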
@@ -448,8 +448,14 @@ def GetMime( path ):

    elif mime_text == 'ogg':

-        return HC.AUDIO_OGG
+        if has_video:
+
+            return HC.VIDEO_OGV
+
+        else:
+
+            return HC.AUDIO_OGG

    elif 'rm' in mime_text:
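The branch above is the heart of the ogv change: an ogg container is classed as video only when a video stream is present. A minimal sketch of that check, using a hypothetical helper name and hand-written sample lines; the real client derives `has_video` from `ffmpeg -i` output via HydrusVideoHandling:

```python
import re

def lines_have_video_stream(ffmpeg_info_lines):
    # ffmpeg prints one "Stream #0:N ...: Video: ..." line per video stream,
    # so theora inside an ogg container marks the file as ogv.
    return any(re.search(r'Stream\s.+:\s*Video:', line) for line in ffmpeg_info_lines)

# Hand-written samples shaped like ffmpeg's stream summary (not real output).
ogv_lines = [
    "Input #0, ogg, from 'clip.ogv':",
    '  Stream #0:0: Video: theora, yuv420p, 640x360, 25 fps',
    '  Stream #0:1: Audio: vorbis, 44100 Hz, stereo',
]
ogg_lines = [
    "Input #0, ogg, from 'song.ogg':",
    '  Stream #0:0: Audio: vorbis, 44100 Hz, stereo',
]

print(lines_have_video_stream(ogv_lines))  # video stream present -> True (ogv)
print(lines_have_video_stream(ogg_lines))  # audio only -> False (ogg)
```

The same container-vs-stream distinction is why the changelog schedules a rescan of existing ogg files: their bytes have not changed, only the classification rule has.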
@@ -3,7 +3,7 @@ import os

import random
import unittest

- from mock import patch
+ from unittest.mock import patch

from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusData
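This import swap drops the third-party `mock` backport for the standard library's `unittest.mock` (included since Python 3.3), removing a dependency with no behaviour change. A small illustration that `patch` works the same way from its stdlib home:

```python
import os
from unittest.mock import patch  # stdlib replacement for `from mock import patch`

# patch swaps the target for a MagicMock inside the with-block only.
with patch('os.getcwd', return_value='/fake/dir') as mock_getcwd:
    print(os.getcwd())  # -> /fake/dir

# Outside the block the real function is restored and the mock records the call.
mock_getcwd.assert_called_once()
print(os.getcwd() != '/fake/dir')  # -> True
```

Because the API is identical, the rest of the test suite needs no edits beyond this one import line.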
@@ -16,6 +16,7 @@ from hydrus.client.importing import ClientImportFileSeeds

from hydrus.client.importing.options import ClientImportOptions
from hydrus.client.importing.options import FileImportOptions
from hydrus.client.importing.options import NoteImportOptions
+ from hydrus.client.importing.options import PresentationImportOptions
from hydrus.client.importing.options import TagImportOptions
from hydrus.client.media import ClientMedia
from hydrus.client.media import ClientMediaManagers
@@ -228,12 +229,6 @@ class TestFileImportOptions( unittest.TestCase ):

        file_import_options.SetPostImportOptions( automatic_archive, associate_primary_urls, associate_source_urls )

-        present_new_files = True
-        present_already_in_inbox_files = True
-        present_already_in_archive_files = True
-
-        file_import_options.SetPresentationOptions( present_new_files, present_already_in_inbox_files, present_already_in_archive_files )

        #

        self.assertFalse( file_import_options.ExcludesDeleted() )
@@ -245,13 +240,6 @@ class TestFileImportOptions( unittest.TestCase ):

        file_import_options.CheckFileIsValid( 65536, HC.IMAGE_JPEG, 640, 480 )
        file_import_options.CheckFileIsValid( 65536, HC.APPLICATION_7Z, None, None )

-        self.assertTrue( file_import_options.ShouldPresent( CC.STATUS_SUCCESSFUL_AND_NEW, False ) )
-        self.assertTrue( file_import_options.ShouldPresent( CC.STATUS_SUCCESSFUL_AND_NEW, True ) )
-        self.assertTrue( file_import_options.ShouldPresent( CC.STATUS_SUCCESSFUL_BUT_REDUNDANT, False ) )
-        self.assertTrue( file_import_options.ShouldPresent( CC.STATUS_SUCCESSFUL_BUT_REDUNDANT, True ) )
-
-        self.assertFalse( file_import_options.ShouldPresent( CC.STATUS_DELETED, False ) )

        #

        exclude_deleted = True
@@ -372,41 +360,6 @@ class TestFileImportOptions( unittest.TestCase ):

        file_import_options.CheckFileIsValid( 65536, HC.IMAGE_JPEG, 2800, 3800 )

-        #
-
-        present_new_files = False
-
-        file_import_options.SetPresentationOptions( present_new_files, present_already_in_inbox_files, present_already_in_archive_files )
-
-        self.assertFalse( file_import_options.ShouldPresent( CC.STATUS_SUCCESSFUL_AND_NEW, False ) )
-        self.assertFalse( file_import_options.ShouldPresent( CC.STATUS_SUCCESSFUL_AND_NEW, True ) )
-        self.assertTrue( file_import_options.ShouldPresent( CC.STATUS_SUCCESSFUL_BUT_REDUNDANT, False ) )
-        self.assertTrue( file_import_options.ShouldPresent( CC.STATUS_SUCCESSFUL_BUT_REDUNDANT, True ) )
-
-        #
-
-        present_new_files = True
-        present_already_in_inbox_files = False
-
-        file_import_options.SetPresentationOptions( present_new_files, present_already_in_inbox_files, present_already_in_archive_files )
-
-        self.assertTrue( file_import_options.ShouldPresent( CC.STATUS_SUCCESSFUL_AND_NEW, False ) )
-        self.assertTrue( file_import_options.ShouldPresent( CC.STATUS_SUCCESSFUL_AND_NEW, True ) )
-        self.assertTrue( file_import_options.ShouldPresent( CC.STATUS_SUCCESSFUL_BUT_REDUNDANT, False ) )
-        self.assertFalse( file_import_options.ShouldPresent( CC.STATUS_SUCCESSFUL_BUT_REDUNDANT, True ) )
-
-        #
-
-        present_already_in_inbox_files = True
-        present_already_in_archive_files = False
-
-        file_import_options.SetPresentationOptions( present_new_files, present_already_in_inbox_files, present_already_in_archive_files )
-
-        self.assertTrue( file_import_options.ShouldPresent( CC.STATUS_SUCCESSFUL_AND_NEW, False ) )
-        self.assertTrue( file_import_options.ShouldPresent( CC.STATUS_SUCCESSFUL_AND_NEW, True ) )
-        self.assertFalse( file_import_options.ShouldPresent( CC.STATUS_SUCCESSFUL_BUT_REDUNDANT, False ) )
-        self.assertTrue( file_import_options.ShouldPresent( CC.STATUS_SUCCESSFUL_BUT_REDUNDANT, True ) )

def GetNotesMediaResult( hash, names_to_notes ):
@@ -571,6 +524,337 @@ def GetTagsMediaResult( hash, in_inbox, service_key, deleted_tags ):

    return media_result

class TestPresentationImportOptions( unittest.TestCase ):

    def test_presentation_import_options( self ):

        new_and_inboxed_hash = HydrusData.GenerateKey()
        new_and_archived_hash = HydrusData.GenerateKey()
        already_in_and_inboxed_hash = HydrusData.GenerateKey()
        already_in_and_archived_hash = HydrusData.GenerateKey()
        new_and_inboxed_but_trashed_hash = HydrusData.GenerateKey()
        skipped_hash = HydrusData.GenerateKey()
        deleted_hash = HydrusData.GenerateKey()
        failed_hash = HydrusData.GenerateKey()

        hashes_and_statuses = [
            ( new_and_inboxed_hash, CC.STATUS_SUCCESSFUL_AND_NEW ),
            ( new_and_archived_hash, CC.STATUS_SUCCESSFUL_AND_NEW ),
            ( already_in_and_inboxed_hash, CC.STATUS_SUCCESSFUL_BUT_REDUNDANT ),
            ( already_in_and_archived_hash, CC.STATUS_SUCCESSFUL_BUT_REDUNDANT ),
            ( new_and_inboxed_but_trashed_hash, CC.STATUS_SUCCESSFUL_AND_NEW ),
            ( skipped_hash, CC.STATUS_SKIPPED ),
            ( deleted_hash, CC.STATUS_DELETED ),
            ( failed_hash, CC.STATUS_ERROR )
        ]

        # all good

        HG.test_controller.ClearReads( 'inbox_hashes' )
        HG.test_controller.ClearReads( 'file_query_ids' )

        presentation_import_options = PresentationImportOptions.PresentationImportOptions()

        presentation_import_options.SetPresentationStatus( PresentationImportOptions.PRESENTATION_STATUS_ANY_GOOD )
        presentation_import_options.SetPresentationInbox( PresentationImportOptions.PRESENTATION_INBOX_AGNOSTIC )
        presentation_import_options.SetPresentationLocation( PresentationImportOptions.PRESENTATION_LOCATION_IN_LOCAL_FILES )

        pre_filter_expected_result = [
            new_and_inboxed_hash,
            new_and_archived_hash,
            already_in_and_inboxed_hash,
            already_in_and_archived_hash,
            new_and_inboxed_but_trashed_hash
        ]

        expected_result = [
            new_and_inboxed_hash,
            new_and_archived_hash,
            already_in_and_inboxed_hash,
            already_in_and_archived_hash
        ]

        HG.test_controller.SetRead( 'inbox_hashes', 'not used' )
        HG.test_controller.SetRead( 'filter_hashes', expected_result )

        result = presentation_import_options.GetPresentedHashes( hashes_and_statuses )

        [ ( args, kwargs ) ] = HG.test_controller.GetRead( 'filter_hashes' )

        self.assertEqual( args, ( CC.LOCAL_FILE_SERVICE_KEY, pre_filter_expected_result ) )

        self.assertEqual( result, expected_result )

        # all good and trash too

        HG.test_controller.ClearReads( 'inbox_hashes' )
        HG.test_controller.ClearReads( 'file_query_ids' )

        presentation_import_options = PresentationImportOptions.PresentationImportOptions()

        presentation_import_options.SetPresentationStatus( PresentationImportOptions.PRESENTATION_STATUS_ANY_GOOD )
        presentation_import_options.SetPresentationInbox( PresentationImportOptions.PRESENTATION_INBOX_AGNOSTIC )
        presentation_import_options.SetPresentationLocation( PresentationImportOptions.PRESENTATION_LOCATION_IN_TRASH_TOO )

        pre_filter_expected_result = [
            new_and_inboxed_hash,
            new_and_archived_hash,
            already_in_and_inboxed_hash,
            already_in_and_archived_hash,
            new_and_inboxed_but_trashed_hash
        ]

        expected_result = [
            new_and_inboxed_hash,
            new_and_archived_hash,
            already_in_and_inboxed_hash,
            already_in_and_archived_hash,
            new_and_inboxed_but_trashed_hash
        ]

        HG.test_controller.SetRead( 'inbox_hashes', 'not used' )
        HG.test_controller.SetRead( 'filter_hashes', expected_result )

        result = presentation_import_options.GetPresentedHashes( hashes_and_statuses )

        [ ( args, kwargs ) ] = HG.test_controller.GetRead( 'filter_hashes' )

        self.assertEqual( args, ( CC.COMBINED_LOCAL_FILE_SERVICE_KEY, pre_filter_expected_result ) )

        self.assertEqual( result, expected_result )

        # silent

        HG.test_controller.ClearReads( 'inbox_hashes' )
        HG.test_controller.ClearReads( 'file_query_ids' )

        presentation_import_options = PresentationImportOptions.PresentationImportOptions()

        presentation_import_options.SetPresentationStatus( PresentationImportOptions.PRESENTATION_STATUS_NONE )
        presentation_import_options.SetPresentationInbox( PresentationImportOptions.PRESENTATION_INBOX_AGNOSTIC )
        presentation_import_options.SetPresentationLocation( PresentationImportOptions.PRESENTATION_LOCATION_IN_LOCAL_FILES )

        expected_result = []

        HG.test_controller.SetRead( 'inbox_hashes', 'not used' )
        HG.test_controller.SetRead( 'filter_hashes', 'not used' )

        result = presentation_import_options.GetPresentedHashes( hashes_and_statuses )

        self.assertEqual( result, expected_result )

        # new files only

        HG.test_controller.ClearReads( 'inbox_hashes' )
        HG.test_controller.ClearReads( 'file_query_ids' )

        presentation_import_options = PresentationImportOptions.PresentationImportOptions()

        presentation_import_options.SetPresentationStatus( PresentationImportOptions.PRESENTATION_STATUS_NEW_ONLY )
        presentation_import_options.SetPresentationInbox( PresentationImportOptions.PRESENTATION_INBOX_AGNOSTIC )
        presentation_import_options.SetPresentationLocation( PresentationImportOptions.PRESENTATION_LOCATION_IN_LOCAL_FILES )

        pre_filter_expected_result = [
            new_and_inboxed_hash,
            new_and_archived_hash,
            new_and_inboxed_but_trashed_hash
        ]

        expected_result = [
            new_and_inboxed_hash,
            new_and_archived_hash
        ]

        HG.test_controller.SetRead( 'inbox_hashes', 'not used' )
        HG.test_controller.SetRead( 'filter_hashes', expected_result )

        result = presentation_import_options.GetPresentedHashes( hashes_and_statuses )

        [ ( args, kwargs ) ] = HG.test_controller.GetRead( 'filter_hashes' )

        self.assertEqual( args, ( CC.LOCAL_FILE_SERVICE_KEY, pre_filter_expected_result ) )

        self.assertEqual( result, expected_result )

        # inbox only

        HG.test_controller.ClearReads( 'inbox_hashes' )
        HG.test_controller.ClearReads( 'file_query_ids' )

        presentation_import_options = PresentationImportOptions.PresentationImportOptions()

        presentation_import_options.SetPresentationStatus( PresentationImportOptions.PRESENTATION_STATUS_ANY_GOOD )
        presentation_import_options.SetPresentationInbox( PresentationImportOptions.PRESENTATION_INBOX_REQUIRE_INBOX )
        presentation_import_options.SetPresentationLocation( PresentationImportOptions.PRESENTATION_LOCATION_IN_LOCAL_FILES )

        pre_inbox_filter_expected_result = {
            new_and_inboxed_hash,
            new_and_archived_hash,
            already_in_and_inboxed_hash,
            already_in_and_archived_hash,
            new_and_inboxed_but_trashed_hash
        }

        inbox_filter_answer = {
            new_and_inboxed_hash,
            already_in_and_inboxed_hash,
            new_and_inboxed_but_trashed_hash
        }

        pre_filter_expected_result = [
            new_and_inboxed_hash,
            already_in_and_inboxed_hash,
            new_and_inboxed_but_trashed_hash
        ]

        expected_result = [
            new_and_inboxed_hash,
            already_in_and_inboxed_hash
        ]

        HG.test_controller.SetRead( 'inbox_hashes', inbox_filter_answer )
        HG.test_controller.SetRead( 'filter_hashes', expected_result )

        result = presentation_import_options.GetPresentedHashes( hashes_and_statuses )

        [ ( args, kwargs ) ] = HG.test_controller.GetRead( 'inbox_hashes' )

        self.assertEqual( args, ( pre_inbox_filter_expected_result, ) )

        [ ( args, kwargs ) ] = HG.test_controller.GetRead( 'filter_hashes' )

        self.assertEqual( args, ( CC.LOCAL_FILE_SERVICE_KEY, pre_filter_expected_result ) )

        self.assertEqual( result, expected_result )

        # new only

        HG.test_controller.ClearReads( 'inbox_hashes' )
        HG.test_controller.ClearReads( 'file_query_ids' )

        presentation_import_options = PresentationImportOptions.PresentationImportOptions()

        presentation_import_options.SetPresentationStatus( PresentationImportOptions.PRESENTATION_STATUS_NEW_ONLY )
        presentation_import_options.SetPresentationInbox( PresentationImportOptions.PRESENTATION_INBOX_AGNOSTIC )
        presentation_import_options.SetPresentationLocation( PresentationImportOptions.PRESENTATION_LOCATION_IN_LOCAL_FILES )

        pre_filter_expected_result = [
            new_and_inboxed_hash,
            new_and_archived_hash,
            new_and_inboxed_but_trashed_hash
        ]

        expected_result = [
            new_and_inboxed_hash,
            new_and_archived_hash
        ]

        HG.test_controller.SetRead( 'inbox_hashes', 'not used' )
        HG.test_controller.SetRead( 'filter_hashes', expected_result )

        result = presentation_import_options.GetPresentedHashes( hashes_and_statuses )

        [ ( args, kwargs ) ] = HG.test_controller.GetRead( 'filter_hashes' )

        self.assertEqual( args, ( CC.LOCAL_FILE_SERVICE_KEY, pre_filter_expected_result ) )

        self.assertEqual( result, expected_result )

        # new and inbox only

        HG.test_controller.ClearReads( 'inbox_hashes' )
        HG.test_controller.ClearReads( 'file_query_ids' )

        presentation_import_options = PresentationImportOptions.PresentationImportOptions()

        presentation_import_options.SetPresentationStatus( PresentationImportOptions.PRESENTATION_STATUS_NEW_ONLY )
        presentation_import_options.SetPresentationInbox( PresentationImportOptions.PRESENTATION_INBOX_REQUIRE_INBOX )
        presentation_import_options.SetPresentationLocation( PresentationImportOptions.PRESENTATION_LOCATION_IN_LOCAL_FILES )

        pre_inbox_filter_expected_result = {
            new_and_inboxed_hash,
            new_and_archived_hash,
            new_and_inboxed_but_trashed_hash
        }

        inbox_filter_answer = {
            new_and_inboxed_hash,
            new_and_inboxed_but_trashed_hash
        }

        pre_filter_expected_result = [
            new_and_inboxed_hash,
            new_and_inboxed_but_trashed_hash
        ]

        expected_result = [
            new_and_inboxed_hash
        ]

        HG.test_controller.SetRead( 'inbox_hashes', inbox_filter_answer )
        HG.test_controller.SetRead( 'filter_hashes', expected_result )

        result = presentation_import_options.GetPresentedHashes( hashes_and_statuses )

        [ ( args, kwargs ) ] = HG.test_controller.GetRead( 'inbox_hashes' )

        self.assertEqual( args, ( pre_inbox_filter_expected_result, ) )

        [ ( args, kwargs ) ] = HG.test_controller.GetRead( 'filter_hashes' )

        self.assertEqual( args, ( CC.LOCAL_FILE_SERVICE_KEY, pre_filter_expected_result ) )

        self.assertEqual( result, expected_result )

        # new or inbox only

        HG.test_controller.ClearReads( 'inbox_hashes' )
        HG.test_controller.ClearReads( 'file_query_ids' )

        presentation_import_options = PresentationImportOptions.PresentationImportOptions()

        presentation_import_options.SetPresentationStatus( PresentationImportOptions.PRESENTATION_STATUS_NEW_ONLY )
        presentation_import_options.SetPresentationInbox( PresentationImportOptions.PRESENTATION_INBOX_INCLUDE_INBOX )
        presentation_import_options.SetPresentationLocation( PresentationImportOptions.PRESENTATION_LOCATION_IN_LOCAL_FILES )

        pre_inbox_filter_expected_result = {
            already_in_and_inboxed_hash,
            already_in_and_archived_hash
        }

        inbox_filter_answer = {
            already_in_and_inboxed_hash
        }

        pre_filter_expected_result = [
            new_and_inboxed_hash,
            new_and_archived_hash,
            already_in_and_inboxed_hash,
            new_and_inboxed_but_trashed_hash
        ]

        expected_result = [
            new_and_inboxed_hash,
            new_and_archived_hash,
            already_in_and_inboxed_hash
        ]

        HG.test_controller.SetRead( 'inbox_hashes', inbox_filter_answer )
        HG.test_controller.SetRead( 'filter_hashes', expected_result )

        result = presentation_import_options.GetPresentedHashes( hashes_and_statuses )

        [ ( args, kwargs ) ] = HG.test_controller.GetRead( 'inbox_hashes' )

        self.assertEqual( args, ( pre_inbox_filter_expected_result, ) )

        [ ( args, kwargs ) ] = HG.test_controller.GetRead( 'filter_hashes' )

        self.assertEqual( args, ( CC.LOCAL_FILE_SERVICE_KEY, pre_filter_expected_result ) )

        self.assertEqual( result, expected_result )

class TestTagImportOptions( unittest.TestCase ):

    def test_basics( self ):
Binary file not shown. Before: 2.4 KiB. After: 1.7 KiB.
Binary file not shown. After: 2.7 KiB.
Binary file not shown. After: 2.5 KiB.