Version 469

closes #1042
Hydrus Network Developer 2022-01-12 16:14:50 -06:00
parent fa5ebd9c22
commit be79406f1f
29 changed files with 2625 additions and 2145 deletions

View File

@ -8,6 +8,34 @@
<div class="content">
<h3 id="changelog"><a href="#changelog">changelog</a></h3>
<ul>
<li><h3 id="version_469"><a href="#version_469">version 469</a></h3></li>
<ul>
<li>misc:</li>
<li>the 'search log' button and the window panel now let you delete log entries. you can delete by completion status from the menu or specifically by row in the panel (just like the file log)</li>
<li>fixed the new 'file is writable' checks for Linux/macOS, which were testing permissions overbroadly and messing with users with user-only permissions set. the code now hands off specific user/group negotiation to the OS. thanks to the maintainer of the AUR package for helping me out here (issue #1042)</li>
<li>the various places where a file's permission bits are set are also cleaned up--hydrus now makes a distinction between double-checking a file is set user-writable before deleting/overwriting vs making a file's permission bits (which were potentially messed up in the past) 'nice' for human use after export. in the latter case, which still defaults to 644 on linux/macOS, the user's umask is now applied, so it should be 600 if you prefer that</li>
<li>fixed a bug where the media viewer could have trouble initialising video when the player window instantiation was delayed (e.g. with embed button)</li>
<li>.</li>
<li>client api:</li>
<li>added 'return_hashes' boolean parameter to GET /get_files/search_files, which makes the command return hashes instead of file ids. this is documented in the help and has a new unit test</li>
<li>client api version is now 25</li>
<li>.</li>
<li>multiple local file services work:</li>
<li>I rewrote a lot of code this week, but it proved more complex than I expected. I also discovered I'll have to switch the pages and canvases over too before I can nicely switch the top level UI over to allow multiple search. rather than release a borked feature, I decided not to rush the final phase, so this remains boring for now! the good news is that it works well when I hack it in, so I just need to keep pushing</li>
<li>rewrote the caller side of tag autocomplete lookup to work on the new multiple file search domain</li>
<li>rewrote the main database level tag lookup code to work on the new multiple file search domain</li>
<li>certain types of complicated tag autocomplete lookup, particularly on all known tags and any client with lots of siblings, will be faster now</li>
<li>fixed an unusual and complicated case where the sibling lookup could be too expansive during autocomplete lookups on 'all known tags'</li>
<li>.</li>
<li>boring cleanup and refactoring:</li>
<li>predicate counts are now managed by a new object. predicates also support 0 minimum count for x-y count ranges, which is now possible when fetching count results from non-cross-referenced file domains (for now this means searching deleted files)</li>
<li>cleaned up a ton of predicate instantiation code</li>
<li>updated autocomplete, predicate, and pred count unit tests to handle the new objects and bug fixes</li>
<li>wrote new classes to cover a convenient multiple file search domain at the database level and updated a bunch of tag autocomplete search code to use it</li>
<li>misc cleanup and refactoring for file domain search code</li>
<li>purged more single file service inspection code from file search systems</li>
<li>refactored most duplicate files storage code (about 70KB) to a new client db module</li>
</ul>
<li><h3 id="version_468"><a href="#version_468">version 468</a></h3></li>
<ul>
<li>misc:</li>

View File

@ -1158,6 +1158,7 @@
<li>tag_service_key : (optional, selective, hexadecimal, the tag domain on which to search)</li>
<li>file_sort_type : (optional, integer, the results sort method)</li>
<li>file_sort_asc : true or false (optional, the results sort order)</li>
<li>return_hashes : true or false (optional, default false, returns hex hashes instead of file ids)</li>
<li><i>system_inbox : true or false (obsolete, use tags)</i></li>
<li><i>system_archive : true or false (obsolete, use tags)</i></li>
</ul>
@ -1287,6 +1288,18 @@
<li>
<pre>{
"file_ids" : [ 125462, 4852415, 123, 591415 ]
}</pre>
</li>
</ul>
<p>Example response with return_hashes=true:</p>
<ul>
<li>
<pre>{
"hashes": [
"1b04c4df7accd5a61c5d02b36658295686b0abfebdc863110e7d7249bba3f9ad",
"fe416723c731d679aa4d20e9fd36727f4a38cd0ac6d035431f0f452fad54563f",
"b53505929c502848375fbc4dab2f40ad4ae649d34ef72802319a348f81b52bad"
]
}</pre>
</li>
</ul>
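<p>For illustration, here is a minimal Python sketch of a request using the new parameter. The access key and tags below are placeholders, and the default Client API port is assumed:</p>
<ul>
<li>
<pre>import json
import requests

access_key = '0123...'  # hypothetical Client API access key
tags = json.dumps( [ 'blue eyes', 'system:archive' ] )  # tags go up as a JSON-encoded list

response = requests.get(
    'http://127.0.0.1:45869/get_files/search_files',
    params = { 'tags' : tags, 'return_hashes' : 'true' },
    headers = { 'Hydrus-Client-API-Access-Key' : access_key }
)

print( response.json()[ 'hashes' ] )</pre>
</li>
</ul>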

View File

@ -178,96 +178,19 @@ def ConvertZoomToPercentage( zoom ):
return pretty_zoom
def MergeCounts( min_a, max_a, min_b, max_b ):
def MergeCounts( min_a: int, max_a: int, min_b: int, max_b: int ):
# 100-None and 100-None returns 100-200
# 1-None and 4-5 returns 5-6
# 1-2, and 5-7 returns 6, 9
# this no longer takes 'None' maxes, and it is now comfortable with 0-5 ranges
if min_a == 0:
( min_answer, max_answer ) = ( min_b, max_b )
elif min_b == 0:
( min_answer, max_answer ) = ( min_a, max_a )
else:
if max_a is None:
max_a = min_a
if max_b is None:
max_b = min_b
min_answer = max( min_a, min_b )
max_answer = max_a + max_b
# 100-100 and 100-100 returns 100-200
# 1-1 and 4-5 returns 4-6
# 1-2, and 5-7 returns 5-9
min_answer = max( min_a, min_b )
max_answer = max_a + max_b
return ( min_answer, max_answer )
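# a quick sanity sketch (not part of this commit) matching the comments above:
# counts from two file domains overlap on the larger minimum and sum on the maximum
assert MergeCounts( 100, 100, 100, 100 ) == ( 100, 200 )
assert MergeCounts( 1, 1, 4, 5 ) == ( 4, 6 )
assert MergeCounts( 1, 2, 5, 7 ) == ( 5, 9 )
# a zero minimum means that side found nothing, so the other side passes through untouched
assert MergeCounts( 0, 0, 5, 7 ) == ( 5, 7 )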
def MergePredicates( predicates, add_namespaceless = False ):
master_predicate_dict = {}
for predicate in predicates:
# this works because predicate.__hash__ exists
if predicate in master_predicate_dict:
master_predicate_dict[ predicate ].AddCounts( predicate )
else:
master_predicate_dict[ predicate ] = predicate
if add_namespaceless:
# we want to include the count for namespaced tags in the namespaceless version when:
# there exists more than one instance of the subtag with different namespaces, including '', that has nonzero count
unnamespaced_predicate_dict = {}
subtag_nonzero_instance_counter = collections.Counter()
for predicate in master_predicate_dict.values():
if predicate.HasNonZeroCount():
unnamespaced_predicate = predicate.GetUnnamespacedCopy()
subtag_nonzero_instance_counter[ unnamespaced_predicate ] += 1
if unnamespaced_predicate in unnamespaced_predicate_dict:
unnamespaced_predicate_dict[ unnamespaced_predicate ].AddCounts( unnamespaced_predicate )
else:
unnamespaced_predicate_dict[ unnamespaced_predicate ] = unnamespaced_predicate
for ( unnamespaced_predicate, count ) in subtag_nonzero_instance_counter.items():
# if there were indeed several instances of this subtag, overwrite the master dict's instance with our new count total
if count > 1:
master_predicate_dict[ unnamespaced_predicate ] = unnamespaced_predicate_dict[ unnamespaced_predicate ]
return list( master_predicate_dict.values() )
def OrdIsSensibleASCII( o ):
return 32 <= o and o <= 127

View File

@ -470,7 +470,7 @@ class ExportFolder( HydrusSerialisable.SerialisableBaseNamed ):
num_copied += 1
HydrusPaths.MakeFileWriteable( dest_path )
HydrusPaths.TryToGiveFileNicePermissionBits( dest_path )

View File

@ -217,7 +217,7 @@ class ClientFilesManager( object ):
try:
HydrusPaths.MakeFileWriteable( dest_path )
HydrusPaths.TryToGiveFileNicePermissionBits( dest_path )
with open( dest_path, 'wb' ) as f:
@ -760,7 +760,7 @@ class ClientFilesManager( object ):
HydrusData.ShowText( 'Adding file from string: ' + str( ( len( file_bytes ), dest_path ) ) )
HydrusPaths.MakeFileWriteable( dest_path )
HydrusPaths.TryToGiveFileNicePermissionBits( dest_path )
with open( dest_path, 'wb' ) as f:
@ -1631,7 +1631,7 @@ class FilesMaintenanceManager( object ):
path = self._controller.client_files_manager.GetFilePath( hash, mime )
HydrusPaths.MakeFileWriteable( path )
HydrusPaths.TryToGiveFileNicePermissionBits( path )
except HydrusExceptions.FileMissingException:

View File

@ -164,7 +164,7 @@ def IsComplexWildcard( search_text ):
def SortPredicates( predicates ):
key = lambda p: p.GetCount()
key = lambda p: p.GetCount().GetMinCount()
predicates.sort( key = key, reverse = True )
@ -976,18 +976,6 @@ class FileSearchContext( HydrusSerialisable.SerialisableBase ):
self._tag_search_context.FixMissingServices( services_manager )
def GetFileServiceKey( self ):
if self._location_search_context.SearchesAnything():
return self._location_search_context.GetFileServiceKey()
else:
return CC.COMBINED_FILE_SERVICE_KEY
def GetLocationSearchContext( self ) -> "LocationSearchContext":
return self._location_search_context
@ -1073,7 +1061,14 @@ class LocationSearchContext( HydrusSerialisable.SerialisableBase ):
if current_service_keys is None:
current_service_keys = [ CC.COMBINED_LOCAL_FILE_SERVICE_KEY ]
if deleted_service_keys is None:
current_service_keys = [ CC.COMBINED_LOCAL_FILE_SERVICE_KEY ]
else:
current_service_keys = []
if deleted_service_keys is None:
@ -1081,8 +1076,8 @@ class LocationSearchContext( HydrusSerialisable.SerialisableBase ):
deleted_service_keys = []
self.current_service_keys = current_service_keys
self.deleted_service_keys = deleted_service_keys
self.current_service_keys = set( current_service_keys )
self.deleted_service_keys = set( deleted_service_keys )
def _GetSerialisableInfo( self ):
@ -1097,8 +1092,8 @@ class LocationSearchContext( HydrusSerialisable.SerialisableBase ):
( serialisable_current_service_keys, serialisable_deleted_service_keys ) = serialisable_info
self.current_service_keys = [ bytes.fromhex( service_key ) for service_key in serialisable_current_service_keys ]
self.deleted_service_keys = [ bytes.fromhex( service_key ) for service_key in serialisable_deleted_service_keys ]
self.current_service_keys = { bytes.fromhex( service_key ) for service_key in serialisable_current_service_keys }
self.deleted_service_keys = { bytes.fromhex( service_key ) for service_key in serialisable_deleted_service_keys }
def FixMissingServices( self, services_manager ):
@ -1107,20 +1102,31 @@ class LocationSearchContext( HydrusSerialisable.SerialisableBase ):
self.deleted_service_keys = services_manager.FilterValidServiceKeys( self.deleted_service_keys )
def GetFileServiceKey( self ):
def GetCoveringCurrentFileServiceKeys( self ):
if not self.IsOneDomain():
file_location_is_cross_referenced = not ( self.IsAllKnownFiles() or self.SearchesDeleted() )
file_service_keys = list( self.current_service_keys )
if self.SearchesDeleted():
raise Exception( 'Location context was asked for specific file domain, but it did not have a single domain' )
file_service_keys.append( CC.COMBINED_DELETED_FILE_SERVICE_KEY )
if len( self.current_service_keys ) > 0:
return ( file_service_keys, file_location_is_cross_referenced )
def GetBestSingleFileServiceKey( self ):
# this could be improved to check multiple local lads -> all local files
if self.IsOneDomain() and len( self.current_service_keys ) == 1:
( service_key, ) = self.current_service_keys
service_key = list( self.current_service_keys )[0]
else:
( service_key, ) = self.deleted_service_keys
service_key = CC.COMBINED_FILE_SERVICE_KEY
return service_key
@ -1381,6 +1387,138 @@ class FavouriteSearchManager( HydrusSerialisable.SerialisableBase ):
HydrusSerialisable.SERIALISABLE_TYPES_TO_OBJECT_TYPES[ HydrusSerialisable.SERIALISABLE_TYPE_FAVOURITE_SEARCH_MANAGER ] = FavouriteSearchManager
class PredicateCount( object ):
def __init__(
self,
min_current_count: int,
min_pending_count: int,
max_current_count: typing.Optional[ int ],
max_pending_count: typing.Optional[ int ]
):
self.min_current_count = min_current_count
self.min_pending_count = min_pending_count
self.max_current_count = max_current_count if max_current_count is not None else min_current_count
self.max_pending_count = max_pending_count if max_pending_count is not None else min_pending_count
def __eq__( self, other ):
if isinstance( other, PredicateCount ):
return self.__hash__() == other.__hash__()
return NotImplemented
def __hash__( self ):
return (
self.min_current_count,
self.min_pending_count,
self.max_current_count,
self.max_pending_count
).__hash__()
def __repr__( self ):
return 'Predicate Count: {}-{} +{}-{}'.format( self.min_current_count, self.max_current_count, self.min_pending_count, self.max_pending_count )
def AddCounts( self, count: "PredicateCount" ):
( self.min_current_count, self.max_current_count ) = ClientData.MergeCounts( self.min_current_count, self.max_current_count, count.min_current_count, count.max_current_count )
( self.min_pending_count, self.max_pending_count) = ClientData.MergeCounts( self.min_pending_count, self.max_pending_count, count.min_pending_count, count.max_pending_count )
def Duplicate( self ):
return PredicateCount(
self.min_current_count,
self.min_pending_count,
self.max_current_count,
self.max_pending_count
)
def GetMinCount( self, current_or_pending = None ):
if current_or_pending is None:
return self.min_current_count + self.min_pending_count
elif current_or_pending == HC.CONTENT_STATUS_CURRENT:
return self.min_current_count
elif current_or_pending == HC.CONTENT_STATUS_PENDING:
return self.min_pending_count
def GetSuffixString( self ) -> str:
suffix_components = []
if self.min_current_count > 0 or self.max_current_count > 0:
number_text = HydrusData.ToHumanInt( self.min_current_count )
if self.max_current_count > self.min_current_count:
number_text = '{}-{}'.format( number_text, HydrusData.ToHumanInt( self.max_current_count ) )
suffix_components.append( '({})'.format( number_text ) )
if self.min_pending_count > 0 or self.max_pending_count > 0:
number_text = HydrusData.ToHumanInt( self.min_pending_count )
if self.max_pending_count > self.min_pending_count:
number_text = '{}-{}'.format( number_text, HydrusData.ToHumanInt( self.max_pending_count ) )
suffix_components.append( '(+{})'.format( number_text ) )
return ' '.join( suffix_components )
def HasNonZeroCount( self ):
return self.min_current_count > 0 or self.min_pending_count > 0 or self.max_current_count > 0 or self.max_pending_count > 0
def HasZeroCount( self ):
return not self.HasNonZeroCount()
@staticmethod
def STATICCreateCurrentCount( current_count ) -> "PredicateCount":
return PredicateCount( current_count, 0, None, None )
@staticmethod
def STATICCreateNullCount() -> "PredicateCount":
return PredicateCount( 0, 0, None, None )
@staticmethod
def STATICCreateStaticCount( current_count, pending_count ) -> "PredicateCount":
return PredicateCount( current_count, pending_count, None, None )
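# illustrative only, not in the commit: building counts and merging them with the class above
count_a = PredicateCount.STATICCreateStaticCount( 3, 1 )   # 3 current, 1 pending
count_b = PredicateCount( 2, 0, 10, None )                  # a 2-10 current range from another domain
count_a.AddCounts( count_b )
# current is now the 3-13 range and pending stays at 1, so the label suffix reads '(3-13) (+1)'
print( count_a.GetSuffixString() )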
class Predicate( HydrusSerialisable.SerialisableBase ):
SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_PREDICATE
@ -1392,10 +1530,7 @@ class Predicate( HydrusSerialisable.SerialisableBase ):
predicate_type: int = None,
value: object = None,
inclusive: bool = True,
min_current_count: HC.noneable_int = 0,
min_pending_count: HC.noneable_int = 0,
max_current_count: HC.noneable_int = None,
max_pending_count: HC.noneable_int = None
count = None
):
if isinstance( value, ( list, set ) ):
@ -1403,15 +1538,17 @@ class Predicate( HydrusSerialisable.SerialisableBase ):
value = tuple( value )
if count is None:
count = PredicateCount.STATICCreateNullCount()
self._predicate_type = predicate_type
self._value = value
self._inclusive = inclusive
self._min_current_count = min_current_count
self._min_pending_count = min_pending_count
self._max_current_count = max_current_count
self._max_pending_count = max_pending_count
self._count = count
self._count_text_suffix = ''
@ -1465,7 +1602,7 @@ class Predicate( HydrusSerialisable.SerialisableBase ):
def __repr__( self ):
return 'Predicate: ' + str( ( self._predicate_type, self._value, self._inclusive, self.GetCount() ) )
return 'Predicate: ' + str( ( self._predicate_type, self._value, self._inclusive, self._count.GetMinCount() ) )
def _RecalcPythonHash( self ):
@ -1654,30 +1791,14 @@ class Predicate( HydrusSerialisable.SerialisableBase ):
def AddCounts( self, predicate ):
( min_current_count, max_current_count, min_pending_count, max_pending_count ) = predicate.GetAllCounts()
( self._min_current_count, self._max_current_count ) = ClientData.MergeCounts( self._min_current_count, self._max_current_count, min_current_count, max_current_count )
( self._min_pending_count, self._max_pending_count) = ClientData.MergeCounts( self._min_pending_count, self._max_pending_count, min_pending_count, max_pending_count )
def ClearCounts( self ):
self._min_current_count = 0
self._min_pending_count = 0
self._max_current_count = None
self._max_pending_count = None
def GetAllCounts( self ):
return ( self._min_current_count, self._max_current_count, self._min_pending_count, self._max_pending_count )
def GetCopy( self ):
return Predicate( self._predicate_type, self._value, self._inclusive, self._min_current_count, self._min_pending_count, self._max_current_count, self._max_pending_count )
return Predicate( self._predicate_type, self._value, self._inclusive, count = self._count.Duplicate() )
def GetCount( self ):
return self._count
def GetCountlessCopy( self ):
@ -1685,22 +1806,6 @@ class Predicate( HydrusSerialisable.SerialisableBase ):
return Predicate( self._predicate_type, self._value, self._inclusive )
def GetCount( self, current_or_pending = None ):
if current_or_pending is None:
return self._min_current_count + self._min_pending_count
elif current_or_pending == HC.CONTENT_STATUS_CURRENT:
return self._min_current_count
elif current_or_pending == HC.CONTENT_STATUS_PENDING:
return self._min_pending_count
def GetNamespace( self ):
if self._predicate_type in SYSTEM_PREDICATE_TYPES:
@ -1858,7 +1963,7 @@ class Predicate( HydrusSerialisable.SerialisableBase ):
( namespace, subtag ) = HydrusTags.SplitTag( self._value )
return Predicate( self._predicate_type, subtag, self._inclusive, self._min_current_count, self._min_pending_count, self._max_current_count, self._max_pending_count )
return Predicate( self._predicate_type, subtag, self._inclusive, count = self._count.Duplicate() )
return self.GetCopy()
@ -1869,11 +1974,6 @@ class Predicate( HydrusSerialisable.SerialisableBase ):
return self._value
def HasNonZeroCount( self ):
return self._min_current_count > 0 or self._min_pending_count > 0
def HasIdealSibling( self ):
return self._ideal_sibling is not None
@ -2008,28 +2108,11 @@ class Predicate( HydrusSerialisable.SerialisableBase ):
if with_count:
if self._min_current_count > 0:
number_text = HydrusData.ToHumanInt( self._min_current_count )
if self._max_current_count is not None:
number_text += '-' + HydrusData.ToHumanInt( self._max_current_count )
count_text += ' ({})'.format( number_text )
suffix = self._count.GetSuffixString()
if self._min_pending_count > 0:
if len( suffix ) > 0:
number_text = HydrusData.ToHumanInt( self._min_pending_count )
if self._max_pending_count is not None:
number_text += '-' + HydrusData.ToHumanInt( self._max_pending_count )
count_text += ' (+{})'.format( number_text )
count_text += ' {}'.format( suffix )
if self._count_text_suffix != '':
@ -2793,6 +2876,64 @@ def FilterPredicatesBySearchText( service_key, search_text, predicates: typing.C
return matches
def MergePredicates( predicates, add_namespaceless = False ):
master_predicate_dict = {}
for predicate in predicates:
# this works because predicate.__hash__ exists
if predicate in master_predicate_dict:
master_predicate_dict[ predicate ].GetCount().AddCounts( predicate.GetCount() )
else:
master_predicate_dict[ predicate ] = predicate
if add_namespaceless:
# we want to include the count for namespaced tags in the namespaceless version when:
# there exists more than one instance of the subtag with different namespaces, including '', that has nonzero count
unnamespaced_predicate_dict = {}
subtag_nonzero_instance_counter = collections.Counter()
for predicate in master_predicate_dict.values():
if predicate.GetCount().HasNonZeroCount():
unnamespaced_predicate = predicate.GetUnnamespacedCopy()
subtag_nonzero_instance_counter[ unnamespaced_predicate ] += 1
if unnamespaced_predicate in unnamespaced_predicate_dict:
unnamespaced_predicate_dict[ unnamespaced_predicate ].GetCount().AddCounts( unnamespaced_predicate.GetCount() )
else:
unnamespaced_predicate_dict[ unnamespaced_predicate ] = unnamespaced_predicate
for ( unnamespaced_predicate, count ) in subtag_nonzero_instance_counter.items():
# if there were indeed several instances of this subtag, overwrite the master dict's instance with our new count total
if count > 1:
master_predicate_dict[ unnamespaced_predicate ] = unnamespaced_predicate_dict[ unnamespaced_predicate ]
return list( master_predicate_dict.values() )
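# illustrative only: merging two copies of the same tag predicate combines their counts
# through the new PredicateCount plumbing; the tag and counts here are made up
p_a = Predicate( PREDICATE_TYPE_TAG, value = 'blue eyes', count = PredicateCount.STATICCreateCurrentCount( 5 ) )
p_b = Predicate( PREDICATE_TYPE_TAG, value = 'blue eyes', count = PredicateCount.STATICCreateCurrentCount( 3 ) )
( merged, ) = MergePredicates( [ p_a, p_b ] )
print( merged.GetCount().GetSuffixString() )  # '(5-8)': min is the larger overlap, max is the sum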
def SearchTextIsFetchAll( search_text: str ):
( namespace, subtag ) = HydrusTags.SplitTag( search_text )

File diff suppressed because it is too large

File diff suppressed because it is too large

View File

@ -54,36 +54,51 @@ class DBLocationSearchContext( object ):
self.location_search_context = location_search_context
self.files_table_name = None
def GetLocationSearchContext( self ) -> ClientSearch.LocationSearchContext:
return self.location_search_context
def GetTableJoinIteratedByFileDomain( self, table_phrase: str ):
def GetTableJoinIteratedByFileDomain( self, table_phrase: str ) -> str:
if self.location_search_context.IsAllKnownFiles():
return table_phrase
else:
return '{} CROSS JOIN {} USING ( hash_id )'.format( self.files_table_name, table_phrase )
raise NotImplementedError()
def GetTableJoinLimitedByFileDomain( self, table_phrase: str ):
def GetTableJoinLimitedByFileDomain( self, table_phrase: str ) -> str:
if self.location_search_context.IsAllKnownFiles():
return table_phrase
else:
return '{} CROSS JOIN {} USING ( hash_id )'.format( table_phrase, self.files_table_name )
raise NotImplementedError()
class DBLocationSearchContextAllKnownFiles( DBLocationSearchContext ):
def GetTableJoinIteratedByFileDomain( self, table_phrase: str ) -> str:
return table_phrase
def GetTableJoinLimitedByFileDomain( self, table_phrase: str ) -> str:
return table_phrase
class DBLocationSearchContextBranch( DBLocationSearchContext ):
def __init__( self, location_search_context: ClientSearch.LocationSearchContext, files_table_name: str ):
DBLocationSearchContext.__init__( self, location_search_context )
self.files_table_name = files_table_name
def GetTableJoinIteratedByFileDomain( self, table_phrase: str ) -> str:
return '{} CROSS JOIN {} USING ( hash_id )'.format( self.files_table_name, table_phrase )
def GetTableJoinLimitedByFileDomain( self, table_phrase: str ) -> str:
return '{} CROSS JOIN {} USING ( hash_id )'.format( table_phrase, self.files_table_name )
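# rough sketch (not in this commit) of the join phrases the two subclasses hand back
# 'current_files_6' and 'mappings_7' are made-up table names
lsc = ClientSearch.LocationSearchContext()  # whatever domain the caller actually built
all_known = DBLocationSearchContextAllKnownFiles( lsc )
branch = DBLocationSearchContextBranch( lsc, 'current_files_6' )
print( all_known.GetTableJoinLimitedByFileDomain( 'mappings_7' ) )
# mappings_7
print( branch.GetTableJoinLimitedByFileDomain( 'mappings_7' ) )
# mappings_7 CROSS JOIN current_files_6 USING ( hash_id )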
class ClientDBFilesStorage( ClientDBModule.ClientDBModule ):
@ -273,20 +288,6 @@ class ClientDBFilesStorage( ClientDBModule.ClientDBModule ):
return service_ids_to_nums_cleared
def ConvertLocationToCoveringFileServiceKeys( self, location_search_context: ClientSearch.LocationSearchContext ):
file_location_is_cross_referenced = not ( location_search_context.IsAllKnownFiles() or location_search_context.SearchesDeleted() )
file_service_keys = list( location_search_context.current_service_keys )
if location_search_context.SearchesDeleted():
file_service_keys.append( CC.COMBINED_DELETED_FILE_SERVICE_KEY )
return ( file_service_keys, file_location_is_cross_referenced )
def DeferFilesDeleteIfNowOrphan( self, hash_ids, definitely_no_thumbnails = False, ignore_service_id = None ):
orphan_hash_ids = self.FilterOrphanFileHashIds( hash_ids, ignore_service_id = ignore_service_id )
@ -775,13 +776,11 @@ class ClientDBFilesStorage( ClientDBModule.ClientDBModule ):
location_search_context = ClientSearch.LocationSearchContext( current_service_keys = [ CC.COMBINED_FILE_SERVICE_KEY ] )
db_location_search_context = DBLocationSearchContext( location_search_context )
if location_search_context.IsAllKnownFiles():
# no table set, obviously
return db_location_search_context
return DBLocationSearchContextAllKnownFiles( location_search_context )
table_names = []
@ -804,7 +803,7 @@ class ClientDBFilesStorage( ClientDBModule.ClientDBModule ):
table_name = table_names[0]
db_location_search_context.files_table_name = table_name
files_table_name = table_name
else:
@ -829,10 +828,10 @@ class ClientDBFilesStorage( ClientDBModule.ClientDBModule ):
self._Execute( 'INSERT OR IGNORE INTO {} ( hash_id ) SELECT hash_id FROM {};'.format( self.temp_file_storage_table_name, select_query ) )
db_location_search_context.files_table_name = self.temp_file_storage_table_name
files_table_name = self.temp_file_storage_table_name
return db_location_search_context
return DBLocationSearchContextBranch( location_search_context, files_table_name )
def GetHashIdsToCurrentServiceIds( self, temp_hash_ids_table_name ):

View File

@ -217,7 +217,7 @@ class ClientDBMappingsCounts( ClientDBModule.ClientDBModule ):
return self._STS( self._Execute( 'SELECT tag_id FROM {} CROSS JOIN {} USING ( tag_id );'.format( tag_ids_table_name, counts_cache_table_name ) ) )
def GetCounts( self, tag_display_type, tag_service_id, file_service_id, tag_ids, include_current, include_pending, zero_count_ok = False, job_key = None, tag_ids_table_name = None ):
def GetCounts( self, tag_display_type, tag_service_id, file_service_id, tag_ids, include_current, include_pending, domain_is_cross_referenced = True, zero_count_ok = False, job_key = None, tag_ids_table_name = None ):
if len( tag_ids ) == 0:
@ -303,16 +303,30 @@ class ClientDBMappingsCounts( ClientDBModule.ClientDBModule ):
continue
if tag_id in ids_to_count:
current_max = current_count
pending_max = pending_count
if domain_is_cross_referenced:
( current_min, current_max, pending_min, pending_max ) = ids_to_count[ tag_id ]
# file counts are perfectly accurate
( current_min, current_max ) = ClientData.MergeCounts( current_min, current_max, current_count, None )
( pending_min, pending_max ) = ClientData.MergeCounts( pending_min, pending_max, pending_count, None )
current_min = current_count
pending_min = pending_count
else:
( current_min, current_max, pending_min, pending_max ) = ( current_count, None, pending_count, None )
# for instance this is a search for 'my files' deleted files, but we are searching on 'all deleted files' domain
current_min = 0
pending_min = 0
if tag_id in ids_to_count:
( existing_current_min, existing_current_max, existing_pending_min, existing_pending_max ) = ids_to_count[ tag_id ]
( current_min, current_max ) = ClientData.MergeCounts( existing_current_min, existing_current_max, current_min, current_max )
( pending_min, pending_max ) = ClientData.MergeCounts( existing_pending_min, existing_pending_max, pending_min, pending_max )
ids_to_count[ tag_id ] = ( current_min, current_max, pending_min, pending_max )
@ -324,7 +338,7 @@ class ClientDBMappingsCounts( ClientDBModule.ClientDBModule ):
if tag_id not in ids_to_count:
ids_to_count[ tag_id ] = ( 0, None, 0, None )
ids_to_count[ tag_id ] = ( 0, 0, 0, 0 )

View File

@ -1,3 +1,4 @@
import itertools
import sqlite3
import typing
@ -11,6 +12,56 @@ from hydrus.client import ClientSearch
from hydrus.client import ClientServices
from hydrus.client.db import ClientDBModule
class FileSearchContextLeaf( object ):
def __init__( self, file_service_id: int, tag_service_id: int ):
# special thing about a leaf is that it has a specific current domain in the caches
# no all known files or deleted files here. leaf might not be file cross-referenced, but it does cover something we can search fast
# it should get tag display type at some point, maybe current/pending too
self.file_service_id = file_service_id
self.tag_service_id = tag_service_id
class FileSearchContextBranch( object ):
def __init__( self, file_search_context: ClientSearch.FileSearchContext, file_service_ids: typing.Collection[ int ], tag_service_ids: typing.Collection[ int ], file_location_is_cross_referenced: bool ):
self.file_search_context = file_search_context
self.file_service_ids = file_service_ids
self.tag_service_ids = tag_service_ids
self.file_location_is_cross_referenced = file_location_is_cross_referenced
def FileLocationIsCrossReferenced( self ) -> bool:
return self.file_location_is_cross_referenced
def GetFileSearchContext( self ) -> ClientSearch.FileSearchContext:
return self.file_search_context
def IterateLeaves( self ):
for ( file_service_id, tag_service_id ) in itertools.product( self.file_service_ids, self.tag_service_ids ):
yield FileSearchContextLeaf( file_service_id, tag_service_id )
def IterateTableIdPairs( self ):
for ( file_service_id, tag_service_id ) in itertools.product( self.file_service_ids, self.tag_service_ids ):
yield ( file_service_id, tag_service_id )
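# quick illustration, not part of the commit: a branch over two file services and two
# tag services expands into four leaves, one per ( file, tag ) service pair
fsc = None  # placeholder; a real ClientSearch.FileSearchContext in practice
branch = FileSearchContextBranch( fsc, ( 2, 5 ), ( 3, 7 ), True )
for leaf in branch.IterateLeaves():
    print( leaf.file_service_id, leaf.tag_service_id )
# 2 3
# 2 7
# 5 3
# 5 7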
class ClientDBMasterServices( ClientDBModule.ClientDBModule ):
def __init__( self, cursor: sqlite3.Cursor ):

View File

@ -240,7 +240,7 @@ class ClientDBTagSiblings( ClientDBModule.ClientDBModule ):
# temp tags to lookup
self._Execute( 'INSERT OR IGNORE INTO {} SELECT ideal_tag_id FROM {} CROSS JOIN {} ON ( bad_tag_id = tag_id );'.format( results_table_name, tag_ids_table_name, cache_tag_siblings_lookup_table_name ) )
self._STI( self._Execute( 'INSERT OR IGNORE INTO {} SELECT tag_id FROM {} CROSS JOIN {} ON ( ideal_tag_id = tag_id );'.format( results_table_name, tag_ids_table_name, cache_tag_siblings_lookup_table_name ) ) )
self._STI( self._Execute( 'INSERT OR IGNORE INTO {} SELECT ideal_tag_id FROM {} CROSS JOIN {} ON ( ideal_tag_id = tag_id );'.format( results_table_name, tag_ids_table_name, cache_tag_siblings_lookup_table_name ) ) )
def FilterChainedIntoTable( self, display_type, tag_service_id, tag_ids_table_name, results_table_name ):

View File

@ -873,7 +873,7 @@ class ReviewExportFilesPanel( ClientGUIScrolledPanels.ReviewPanel ):
HydrusPaths.MirrorFile( source_path, path )
HydrusPaths.MakeFileWriteable( path )
HydrusPaths.TryToGiveFileNicePermissionBits( path )
except:

View File

@ -167,7 +167,7 @@ class EditFileSeedCachePanel( ClientGUIScrolledPanels.EditPanel ):
if len( file_seeds_to_delete ) > 0:
message = 'Are you sure you want to delete all the selected entries?'
message = 'Are you sure you want to delete the {} selected entries?'.format( HydrusData.ToHumanInt( len( file_seeds_to_delete ) ) )
result = ClientGUIDialogsQuick.GetYesNo( self, message )
@ -514,39 +514,39 @@ class FileSeedCacheButton( ClientGUICommon.ButtonWithMenuArrow ):
if num_errors > 0:
ClientGUIMenus.AppendMenuItem( menu, 'retry ' + HydrusData.ToHumanInt( num_errors ) + ' failures', 'Tell this cache to reattempt all its error failures.', self._RetryErrors )
ClientGUIMenus.AppendMenuItem( menu, 'retry ' + HydrusData.ToHumanInt( num_errors ) + ' failures', 'Tell this log to reattempt all its error failures.', self._RetryErrors )
if num_vetoed > 0:
ClientGUIMenus.AppendMenuItem( menu, 'retry ' + HydrusData.ToHumanInt( num_vetoed ) + ' ignored', 'Tell this cache to reattempt all its ignored/vetoed results.', self._RetryIgnored )
ClientGUIMenus.AppendMenuItem( menu, 'retry ' + HydrusData.ToHumanInt( num_vetoed ) + ' ignored', 'Tell this log to reattempt all its ignored/vetoed results.', self._RetryIgnored )
ClientGUIMenus.AppendSeparator( menu )
if num_successful > 0:
ClientGUIMenus.AppendMenuItem( menu, 'delete {} \'successful\' file import items from the queue'.format( HydrusData.ToHumanInt( num_successful ) ), 'Tell this cache to clear out successful files, reducing the size of the queue.', self._ClearFileSeeds, ( CC.STATUS_SUCCESSFUL_AND_NEW, CC.STATUS_SUCCESSFUL_BUT_REDUNDANT ) )
ClientGUIMenus.AppendMenuItem( menu, 'delete {} \'successful\' file import items from the queue'.format( HydrusData.ToHumanInt( num_successful ) ), 'Tell this log to clear out successful files, reducing the size of the queue.', self._ClearFileSeeds, ( CC.STATUS_SUCCESSFUL_AND_NEW, CC.STATUS_SUCCESSFUL_BUT_REDUNDANT ) )
if num_deleted > 0:
ClientGUIMenus.AppendMenuItem( menu, 'delete {} \'previously deleted\' file import items from the queue'.format( HydrusData.ToHumanInt( num_deleted ) ), 'Tell this cache to clear out deleted files, reducing the size of the queue.', self._ClearFileSeeds, ( CC.STATUS_DELETED, ) )
ClientGUIMenus.AppendMenuItem( menu, 'delete {} \'previously deleted\' file import items from the queue'.format( HydrusData.ToHumanInt( num_deleted ) ), 'Tell this log to clear out deleted files, reducing the size of the queue.', self._ClearFileSeeds, ( CC.STATUS_DELETED, ) )
if num_errors > 0:
ClientGUIMenus.AppendMenuItem( menu, 'delete {} \'failed\' file import items from the queue'.format( HydrusData.ToHumanInt( num_errors ) ), 'Tell this cache to clear out errored and skipped files, reducing the size of the queue.', self._ClearFileSeeds, ( CC.STATUS_ERROR, ) )
ClientGUIMenus.AppendMenuItem( menu, 'delete {} \'failed\' file import items from the queue'.format( HydrusData.ToHumanInt( num_errors ) ), 'Tell this log to clear out errored files, reducing the size of the queue.', self._ClearFileSeeds, ( CC.STATUS_ERROR, ) )
if num_vetoed > 0:
ClientGUIMenus.AppendMenuItem( menu, 'delete {} \'ignored\' file import items from the queue'.format( HydrusData.ToHumanInt( num_vetoed ) ), 'Tell this cache to clear out ignored files, reducing the size of the queue.', self._ClearFileSeeds, ( CC.STATUS_VETOED, ) )
ClientGUIMenus.AppendMenuItem( menu, 'delete {} \'ignored\' file import items from the queue'.format( HydrusData.ToHumanInt( num_vetoed ) ), 'Tell this log to clear out ignored files, reducing the size of the queue.', self._ClearFileSeeds, ( CC.STATUS_VETOED, ) )
if num_skipped > 0:
ClientGUIMenus.AppendMenuItem( menu, 'delete {} \'skipped\' file import items from the queue'.format( HydrusData.ToHumanInt( num_skipped ) ), 'Tell this cache to clear out errored and skipped files, reducing the size of the queue.', self._ClearFileSeeds, ( CC.STATUS_SKIPPED, ) )
ClientGUIMenus.AppendMenuItem( menu, 'delete {} \'skipped\' file import items from the queue'.format( HydrusData.ToHumanInt( num_skipped ) ), 'Tell this log to clear out skipped files, reducing the size of the queue.', self._ClearFileSeeds, ( CC.STATUS_SKIPPED, ) )
ClientGUIMenus.AppendSeparator( menu )

View File

@ -40,7 +40,7 @@ class EditGallerySeedLogPanel( ClientGUIScrolledPanels.EditPanel ):
# add index control row here, hide it if needed and hook into showing/hiding and postsizechangedevent on gallery_seed add/remove
self._list_ctrl = ClientGUIListCtrl.BetterListCtrl( self, CGLC.COLUMN_LIST_GALLERY_SEED_LOG.ID, 30, self._ConvertGallerySeedToListCtrlTuples )
self._list_ctrl = ClientGUIListCtrl.BetterListCtrl( self, CGLC.COLUMN_LIST_GALLERY_SEED_LOG.ID, 30, self._ConvertGallerySeedToListCtrlTuples, delete_key_callback = self._DeleteSelected )
#
@ -132,6 +132,23 @@ class EditGallerySeedLogPanel( ClientGUIScrolledPanels.EditPanel ):
def _DeleteSelected( self ):
gallery_seeds_to_delete = self._list_ctrl.GetData( only_selected = True )
if len( gallery_seeds_to_delete ) > 0:
message = 'Are you sure you want to delete the {} selected entries? This is only useful if you have a really really huge list.'.format( HydrusData.ToHumanInt( len( gallery_seeds_to_delete ) ) )
result = ClientGUIDialogsQuick.GetYesNo( self, message )
if result == QW.QDialog.Accepted:
self._gallery_seed_log.RemoveGallerySeeds( gallery_seeds_to_delete )
def _GetListCtrlMenu( self ):
selected_gallery_seeds = self._list_ctrl.GetData( only_selected = True )
@ -306,6 +323,33 @@ class GallerySeedLogButton( ClientGUICommon.ButtonWithMenuArrow ):
gallery_seed_log = self._gallery_seed_log_get_callable()
num_successful = gallery_seed_log.GetGallerySeedCount( CC.STATUS_SUCCESSFUL_AND_NEW )
num_vetoed = gallery_seed_log.GetGallerySeedCount( CC.STATUS_VETOED )
num_errors = gallery_seed_log.GetGallerySeedCount( CC.STATUS_ERROR )
num_skipped = gallery_seed_log.GetGallerySeedCount( CC.STATUS_SKIPPED )
if num_successful > 0:
ClientGUIMenus.AppendMenuItem( menu, 'delete {} \'successful\' gallery log entries from the log'.format( HydrusData.ToHumanInt( num_successful ) ), 'Tell this log to clear out successful records, reducing the size of the queue.', self._ClearGallerySeeds, ( CC.STATUS_SUCCESSFUL_AND_NEW, ) )
if num_errors > 0:
ClientGUIMenus.AppendMenuItem( menu, 'delete {} \'failed\' gallery log entries from the log'.format( HydrusData.ToHumanInt( num_errors ) ), 'Tell this log to clear out errored records, reducing the size of the queue.', self._ClearGallerySeeds, ( CC.STATUS_ERROR, ) )
if num_vetoed > 0:
ClientGUIMenus.AppendMenuItem( menu, 'delete {} \'ignored\' gallery log entries from the log'.format( HydrusData.ToHumanInt( num_vetoed ) ), 'Tell this log to clear out ignored records, reducing the size of the queue.', self._ClearGallerySeeds, ( CC.STATUS_VETOED, ) )
if num_skipped > 0:
ClientGUIMenus.AppendMenuItem( menu, 'delete {} \'skipped\' gallery log entries from the log'.format( HydrusData.ToHumanInt( num_skipped ) ), 'Tell this log to clear out skipped records, reducing the size of the queue.', self._ClearGallerySeeds, ( CC.STATUS_SKIPPED, ) )
ClientGUIMenus.AppendSeparator( menu )
if len( gallery_seed_log ) > 0:
if not self._read_only and gallery_seed_log.CanRestartFailedSearch():

View File

@ -139,7 +139,7 @@ class ListBoxTagsSuggestionsRelated( ClientGUIListBoxes.ListBoxTagsPredicates ):
def _GenerateTermFromPredicate( self, predicate: ClientSearch.Predicate ) -> ClientGUIListBoxesData.ListBoxItemPredicate:
predicate.ClearCounts()
predicate = predicate.GetCountlessCopy()
return ClientGUIListBoxesData.ListBoxItemPredicate( predicate )

View File

@ -2648,7 +2648,7 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
def __init__( self, parent, file_search_context: ClientSearch.FileSearchContext, both_files_match, pixel_dupes_preference, max_hamming_distance ):
file_service_key = file_search_context.GetFileServiceKey()
file_service_key = file_search_context.GetLocationSearchContext().GetBestSingleFileServiceKey()
CanvasWithHovers.__init__( self, parent, file_service_key )
@ -2677,8 +2677,6 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
self._hashes_processed_in_this_batch = set()
file_service_key = self._file_search_context.GetFileServiceKey()
self._media_list = ClientMedia.ListeningMediaList( file_service_key, [] )
self._my_shortcuts_handler.AddShortcuts( 'media_viewer_browser' )
@ -3145,7 +3143,7 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
file_service_key = self._file_search_context.GetFileServiceKey()
file_service_key = self._file_search_context.GetLocationSearchContext().GetBestSingleFileServiceKey()
if len( self._unprocessed_pairs ) == 0:

View File

@ -1168,8 +1168,11 @@ class MediaContainer( QW.QWidget ):
self._embed_button.setFixedSize( self.size() )
self._embed_button.move( QC.QPoint( 0, 0 ) )
self._media_window.setFixedSize( self.size() )
self._media_window.move( QC.QPoint( 0, 0 ) )
if self._media_window is not None:
self._media_window.setFixedSize( self.size() )
self._media_window.move( QC.QPoint( 0, 0 ) )
controls_bar_rect = self.GetIdealControlsBarRect()

View File

@ -242,7 +242,7 @@ class ListBoxItemTextTag( ListBoxItem ):
def GetSearchPredicates( self ) -> typing.List[ ClientSearch.Predicate ]:
return [ ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, self._tag ) ]
return [ ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, value = self._tag ) ]
def GetRowCount( self, child_rows_allowed: bool ):
@ -350,7 +350,7 @@ class ListBoxItemTextTagWithCounts( ListBoxItemTextTag ):
# with counts? or just merge this into texttag???
return [ ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, self._tag ) ]
return [ ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, value = self._tag ) ]
def GetRowsOfPresentationTextsWithNamespaces( self, render_for_user: bool, sibling_decoration_allowed: bool, child_rows_allowed: bool ) -> typing.List[ typing.List[ typing.Tuple[ str, str ] ] ]:

View File

@ -260,7 +260,7 @@ def CreateManagementControllerPetitions( petition_service_key ):
def CreateManagementControllerQuery( page_name, file_search_context: ClientSearch.FileSearchContext, search_enabled ):
file_service_key = file_search_context.GetFileServiceKey()
file_service_key = file_search_context.GetLocationSearchContext().GetBestSingleFileServiceKey()
management_controller = CreateManagementController( page_name, MANAGEMENT_TYPE_QUERY, file_service_key = file_service_key )
@ -823,7 +823,7 @@ class ListBoxTagsMediaManagementPanel( ClientGUIListBoxes.ListBoxTagsMedia ):
if shift_down and len( predicates ) > 1:
predicates = ( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_OR_CONTAINER, predicates ), )
predicates = ( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_OR_CONTAINER, value = predicates ), )
HG.client_controller.pub( 'enter_predicates', self._page_key, predicates )
@ -1432,7 +1432,7 @@ class ManagementPanelDuplicateFilter( ManagementPanel ):
self._management_controller.SetVariable( 'pixel_dupes_preference', pixel_dupes_preference )
self._management_controller.SetVariable( 'max_hamming_distance', max_hamming_distance )
self._SetFileServiceKey( file_search_context.GetFileServiceKey() )
self._SetFileServiceKey( file_search_context.GetLocationSearchContext().GetBestSingleFileServiceKey() )
self._UpdateBothFilesMatchButton()
@ -1474,7 +1474,7 @@ class ManagementPanelDuplicateFilter( ManagementPanel ):
( file_search_context, both_files_match, pixel_dupes_preference, max_hamming_distance ) = self._GetDuplicateFileSearchData()
file_service_key = file_search_context.GetFileServiceKey()
file_service_key = file_search_context.GetLocationSearchContext().GetBestSingleFileServiceKey()
if len( hashes ) > 0:
@ -4955,7 +4955,7 @@ class ManagementPanelQuery( ManagementPanel ):
file_search_context = self._tag_autocomplete.GetFileSearchContext()
file_service_key = file_search_context.GetFileServiceKey()
file_service_key = file_search_context.GetLocationSearchContext().GetBestSingleFileServiceKey()
panel = ClientGUIResults.MediaPanelThumbnails( self._page, self._page_key, file_service_key, [] )
@ -5015,8 +5015,10 @@ class ManagementPanelQuery( ManagementPanel ):
file_search_context = self._tag_autocomplete.GetFileSearchContext()
display_file_service_key = file_search_context.GetLocationSearchContext().GetBestSingleFileServiceKey()
self._management_controller.SetVariable( 'file_search_context', file_search_context )
self._SetFileServiceKey( file_search_context.GetFileServiceKey() )
self._SetFileServiceKey( display_file_service_key )
synchronised = self._tag_autocomplete.IsSynchronised()
@ -5024,8 +5026,6 @@ class ManagementPanelQuery( ManagementPanel ):
if synchronised:
file_service_key = file_search_context.GetFileServiceKey()
if len( file_search_context.GetPredicates() ) > 0:
self._query_job_key = ClientThreading.JobKey()
@ -5034,11 +5034,11 @@ class ManagementPanelQuery( ManagementPanel ):
self._controller.CallToThread( self.THREADDoQuery, self._controller, self._page_key, self._query_job_key, file_search_context, sort_by )
panel = ClientGUIResults.MediaPanelLoading( self._page, self._page_key, file_service_key )
panel = ClientGUIResults.MediaPanelLoading( self._page, self._page_key, display_file_service_key )
else:
panel = ClientGUIResults.MediaPanelThumbnails( self._page, self._page_key, file_service_key, [] )
panel = ClientGUIResults.MediaPanelThumbnails( self._page, self._page_key, display_file_service_key, [] )
panel.SetEmptyPageStatusOverride( 'no search' )

View File

@ -118,7 +118,6 @@ def ReadFetch(
force_system_everything
):
file_service_key = file_search_context.GetFileServiceKey()
tag_search_context = file_search_context.GetTagSearchContext()
tag_service_key = tag_search_context.service_key
@ -199,7 +198,7 @@ def ReadFetch(
if not results_cache.CanServeTagResults( parsed_autocomplete_text, True ):
predicates = HG.client_controller.Read( 'autocomplete_predicates', ClientTags.TAG_DISPLAY_ACTUAL, tag_search_context, file_service_key, search_text = strict_search_text, exact_match = True, inclusive = parsed_autocomplete_text.inclusive, add_namespaceless = add_namespaceless, job_key = job_key )
predicates = HG.client_controller.Read( 'autocomplete_predicates', ClientTags.TAG_DISPLAY_ACTUAL, file_search_context, search_text = strict_search_text, exact_match = True, inclusive = parsed_autocomplete_text.inclusive, add_namespaceless = add_namespaceless, job_key = job_key )
results_cache = ClientSearch.PredicateResultsCacheTag( predicates, strict_search_text, True )
@ -225,7 +224,7 @@ def ReadFetch(
search_namespaces_into_full_tags = parsed_autocomplete_text.GetTagAutocompleteOptions().SearchNamespacesIntoFullTags()
predicates = HG.client_controller.Read( 'autocomplete_predicates', ClientTags.TAG_DISPLAY_ACTUAL, tag_search_context, file_service_key, search_text = autocomplete_search_text, inclusive = parsed_autocomplete_text.inclusive, add_namespaceless = add_namespaceless, job_key = job_key, search_namespaces_into_full_tags = search_namespaces_into_full_tags )
predicates = HG.client_controller.Read( 'autocomplete_predicates', ClientTags.TAG_DISPLAY_ACTUAL, file_search_context, search_text = autocomplete_search_text, inclusive = parsed_autocomplete_text.inclusive, add_namespaceless = add_namespaceless, job_key = job_key, search_namespaces_into_full_tags = search_namespaces_into_full_tags )
if job_key.IsCancelled():
@ -328,7 +327,7 @@ def ReadFetch(
return
predicates = ClientData.MergePredicates( predicates, add_namespaceless = add_namespaceless )
predicates = ClientSearch.MergePredicates( predicates, add_namespaceless = add_namespaceless )
matches = predicates
@ -410,7 +409,9 @@ def ShouldDoExactSearch( parsed_autocomplete_text: ClientSearch.ParsedAutocomple
return len( test_text ) <= exact_match_character_threshold
def WriteFetch( win, job_key, results_callable, parsed_autocomplete_text: ClientSearch.ParsedAutocompleteText, tag_search_context: ClientSearch.TagSearchContext, file_service_key: bytes, results_cache: ClientSearch.PredicateResultsCache ):
def WriteFetch( win, job_key, results_callable, parsed_autocomplete_text: ClientSearch.ParsedAutocompleteText, file_search_context: ClientSearch.FileSearchContext, results_cache: ClientSearch.PredicateResultsCache ):
tag_search_context = file_search_context.GetTagSearchContext()
display_tag_service_key = tag_search_context.display_service_key
@ -431,7 +432,7 @@ def WriteFetch( win, job_key, results_callable, parsed_autocomplete_text: Client
if not results_cache.CanServeTagResults( parsed_autocomplete_text, True ):
predicates = HG.client_controller.Read( 'autocomplete_predicates', ClientTags.TAG_DISPLAY_STORAGE, tag_search_context, file_service_key, search_text = strict_search_text, exact_match = True, add_namespaceless = False, job_key = job_key )
predicates = HG.client_controller.Read( 'autocomplete_predicates', ClientTags.TAG_DISPLAY_STORAGE, file_search_context, search_text = strict_search_text, exact_match = True, add_namespaceless = False, job_key = job_key )
results_cache = ClientSearch.PredicateResultsCacheTag( predicates, strict_search_text, True )
@ -457,7 +458,7 @@ def WriteFetch( win, job_key, results_callable, parsed_autocomplete_text: Client
search_namespaces_into_full_tags = parsed_autocomplete_text.GetTagAutocompleteOptions().SearchNamespacesIntoFullTags()
predicates = HG.client_controller.Read( 'autocomplete_predicates', ClientTags.TAG_DISPLAY_STORAGE, tag_search_context, file_service_key, search_text = autocomplete_search_text, add_namespaceless = False, job_key = job_key, search_namespaces_into_full_tags = search_namespaces_into_full_tags )
predicates = HG.client_controller.Read( 'autocomplete_predicates', ClientTags.TAG_DISPLAY_STORAGE, file_search_context, search_text = autocomplete_search_text, add_namespaceless = False, job_key = job_key, search_namespaces_into_full_tags = search_namespaces_into_full_tags )
if is_explicit_wildcard:
@ -478,7 +479,7 @@ def WriteFetch( win, job_key, results_callable, parsed_autocomplete_text: Client
# we always do this, because results cache will not have current text input data
input_text_predicates = HG.client_controller.Read( 'autocomplete_predicates', ClientTags.TAG_DISPLAY_STORAGE, tag_search_context, file_service_key, search_text = strict_search_text, exact_match = True, add_namespaceless = False, zero_count_ok = True, job_key = job_key )
input_text_predicates = HG.client_controller.Read( 'autocomplete_predicates', ClientTags.TAG_DISPLAY_STORAGE, file_search_context, search_text = strict_search_text, exact_match = True, add_namespaceless = False, zero_count_ok = True, job_key = job_key )
for input_text_predicate in input_text_predicates:
@ -600,7 +601,7 @@ class ListBoxTagsPredicatesAC( ClientGUIListBoxes.ListBoxTagsPredicates ):
skip_ors = True
some_preds_have_count = True in ( predicate.GetCount() > 0 for predicate in predicates )
some_preds_have_count = True in ( predicate.GetCount().HasNonZeroCount() for predicate in predicates )
skip_countless = HG.client_controller.new_options.GetBoolean( 'ac_select_first_with_count' ) and some_preds_have_count
for ( index, predicate ) in enumerate( predicates ):
@ -612,7 +613,7 @@ class ListBoxTagsPredicatesAC( ClientGUIListBoxes.ListBoxTagsPredicates ):
continue
if skip_countless and predicate.GetType() in ( ClientSearch.PREDICATE_TYPE_PARENT, ClientSearch.PREDICATE_TYPE_TAG ) and predicate.GetCount() == 0:
if skip_countless and predicate.GetType() in ( ClientSearch.PREDICATE_TYPE_PARENT, ClientSearch.PREDICATE_TYPE_TAG ) and predicate.GetCount().HasZeroCount():
continue
@ -1546,7 +1547,7 @@ class AutoCompleteDropdownTags( AutoCompleteDropdown ):
favourite_tags = sorted( HG.client_controller.new_options.GetStringList( 'favourite_tags' ) )
predicates = [ ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, tag ) for tag in favourite_tags ]
predicates = [ ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, value = tag ) for tag in favourite_tags ]
self._favourites_list.SetPredicates( predicates )
@ -1600,7 +1601,7 @@ class AutoCompleteDropdownTagsRead( AutoCompleteDropdownTags ):
self._under_construction_or_predicate = None
file_service_key = file_search_context.GetFileServiceKey()
file_service_key = file_search_context.GetLocationSearchContext().GetBestSingleFileServiceKey()
tag_search_context = file_search_context.GetTagSearchContext()
self._include_unusual_predicate_types = include_unusual_predicate_types
@ -1732,7 +1733,7 @@ class AutoCompleteDropdownTagsRead( AutoCompleteDropdownTags ):
if self._under_construction_or_predicate is None:
self._under_construction_or_predicate = ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_OR_CONTAINER, predicates )
self._under_construction_or_predicate = ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_OR_CONTAINER, value = predicates )
else:
@ -1745,7 +1746,7 @@ class AutoCompleteDropdownTagsRead( AutoCompleteDropdownTags ):
or_preds.extend( [ predicate for predicate in predicates if predicate not in or_preds ] )
self._under_construction_or_predicate = ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_OR_CONTAINER, or_preds )
self._under_construction_or_predicate = ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_OR_CONTAINER, value = or_preds )
else:
@ -1767,7 +1768,7 @@ class AutoCompleteDropdownTagsRead( AutoCompleteDropdownTags ):
or_preds.extend( [ predicate for predicate in predicates if predicate not in or_preds ] )
predicates = { ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_OR_CONTAINER, or_preds ) }
predicates = { ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_OR_CONTAINER, value = or_preds ) }
self._under_construction_or_predicate = None
@ -1984,7 +1985,7 @@ class AutoCompleteDropdownTagsRead( AutoCompleteDropdownTags ):
or_preds = or_preds[:-1]
self._under_construction_or_predicate = ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_OR_CONTAINER, or_preds )
self._under_construction_or_predicate = ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_OR_CONTAINER, value = or_preds )
self._UpdateORButtons()
@ -2200,7 +2201,7 @@ class AutoCompleteDropdownTagsRead( AutoCompleteDropdownTags ):
self._predicates_listbox.SetPredicates( self._file_search_context.GetPredicates() )
self._ChangeFileService( self._file_search_context.GetFileServiceKey() )
self._ChangeFileService( self._file_search_context.GetLocationSearchContext().GetBestSingleFileServiceKey() )
self._ChangeTagService( self._file_search_context.GetTagSearchContext().service_key )
self._SignalNewSearchState()
@ -2639,7 +2640,7 @@ class AutoCompleteDropdownTagsWrite( AutoCompleteDropdownTags ):
tags = HydrusTags.CleanTags( tags )
entry_predicates = [ ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, tag ) for tag in tags ]
entry_predicates = [ ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, value = tag ) for tag in tags ]
if len( entry_predicates ) > 0:
@ -2690,9 +2691,12 @@ class AutoCompleteDropdownTagsWrite( AutoCompleteDropdownTags ):
HG.client_controller.CallLaterQtSafe( self, 0.2, 'set stub predicates', self.SetStubPredicates, job_key, stub_predicates, parsed_autocomplete_text )
location_search_context = ClientSearch.LocationSearchContext( current_service_keys = [ self._file_service_key ] )
tag_search_context = ClientSearch.TagSearchContext( service_key = self._tag_service_key, display_service_key = self._display_tag_service_key )
HG.client_controller.CallToThread( WriteFetch, self, job_key, self.SetFetchedResults, parsed_autocomplete_text, tag_search_context, self._file_service_key, self._results_cache )
file_search_context = ClientSearch.FileSearchContext( location_search_context = location_search_context, tag_search_context = tag_search_context )
HG.client_controller.CallToThread( WriteFetch, self, job_key, self.SetFetchedResults, parsed_autocomplete_text, file_search_context, self._results_cache )
def _TakeResponsibilityForEnter( self, shift_down ):
@ -2824,16 +2828,16 @@ class EditAdvancedORPredicates( ClientGUIScrolledPanels.EditPanel ):
if len( namespace ) > 0 and subtag == '*':
row_pred = ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_NAMESPACE, namespace, inclusive )
row_pred = ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_NAMESPACE, value = namespace, inclusive = inclusive )
else:
row_pred = ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_WILDCARD, tag_string, inclusive )
row_pred = ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_WILDCARD, value = tag_string, inclusive = inclusive )
else:
row_pred = ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, tag_string, inclusive )
row_pred = ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, value = tag_string, inclusive = inclusive )
row_preds.append( row_pred )
@ -2845,7 +2849,7 @@ class EditAdvancedORPredicates( ClientGUIScrolledPanels.EditPanel ):
else:
self._current_predicates.append( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_OR_CONTAINER, row_preds ) )
self._current_predicates.append( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_OR_CONTAINER, value = row_preds ) )

View File

@ -41,7 +41,7 @@ LOCAL_BOORU_JSON_BYTE_LIST_PARAMS = set()
CLIENT_API_INT_PARAMS = { 'file_id', 'file_sort_type' }
CLIENT_API_BYTE_PARAMS = { 'hash', 'destination_page_key', 'page_key', 'Hydrus-Client-API-Access-Key', 'Hydrus-Client-API-Session-Key', 'tag_service_key', 'file_service_key' }
CLIENT_API_STRING_PARAMS = { 'name', 'url', 'domain', 'file_service_name', 'tag_service_name' }
CLIENT_API_JSON_PARAMS = { 'basic_permissions', 'system_inbox', 'system_archive', 'tags', 'file_ids', 'only_return_identifiers', 'detailed_url_information', 'hide_service_names_tags', 'simple', 'file_sort_asc' }
CLIENT_API_JSON_PARAMS = { 'basic_permissions', 'system_inbox', 'system_archive', 'tags', 'file_ids', 'only_return_identifiers', 'detailed_url_information', 'hide_service_names_tags', 'simple', 'file_sort_asc', 'return_hashes' }
CLIENT_API_JSON_BYTE_LIST_PARAMS = { 'hashes' }
CLIENT_API_JSON_BYTE_DICT_PARAMS = { 'service_keys_to_tags', 'service_keys_to_actions_to_tags', 'service_keys_to_additional_tags' }
@ -1925,11 +1925,28 @@ class HydrusResourceClientAPIRestrictedGetFilesSearchFiles( HydrusResourceClient
# newest first
sort_by = ClientMedia.MediaSort( sort_type = ( 'system', file_sort_type ), sort_order = sort_order )
return_hashes = False
if 'return_hashes' in request.parsed_request_args:
return_hashes = request.parsed_request_args.GetValue( 'return_hashes', bool )
hash_ids = HG.client_controller.Read( 'file_query_ids', file_search_context, sort_by = sort_by, apply_implicit_limit = False )
request.client_api_permissions.SetLastSearchResults( hash_ids )
body_dict = { 'file_ids' : list( hash_ids ) }
if return_hashes:
hash_ids_to_hashes = HG.client_controller.Read( 'hash_ids_to_hashes', hash_ids = hash_ids )
# maintain sort
body_dict = { 'hashes' : [ hash_ids_to_hashes[ hash_id ].hex() for hash_id in hash_ids ] }
else:
body_dict = { 'file_ids' : list( hash_ids ) }
body = json.dumps( body_dict )

View File

@ -81,8 +81,8 @@ options = {}
# Misc
NETWORK_VERSION = 20
SOFTWARE_VERSION = 468
CLIENT_API_VERSION = 24
SOFTWARE_VERSION = 469
CLIENT_API_VERSION = 25
SERVER_THUMBNAIL_DIMENSIONS = ( 200, 200 )

View File

@ -152,7 +152,7 @@ def DeletePath( path ):
if os.path.exists( path ):
MakeFileWriteable( path )
TryToMakeFileWriteable( path )
try:
@ -183,9 +183,6 @@ def DeletePath( path ):
def DirectoryIsWriteable( path ):
# testing access bits on directories to see if we can make new files is multiplatform hellmode
# so, just try it and see what happens
while not os.path.exists( path ):
try:
@ -198,6 +195,13 @@ def DirectoryIsWriteable( path ):
return os.access( path, os.W_OK | os.X_OK )
# old crazy method:
'''
# testing access bits on directories to see if we can make new files is multiplatform hellmode
# so, just try it and see what happens
temp_path = os.path.join( path, 'hydrus_temp_test_top_jej' )
if os.path.exists( temp_path ):
@ -229,18 +233,10 @@ def DirectoryIsWriteable( path ):
return False
'''
def FileisWriteable( path: str ):
stat_result = os.stat( path )
current_bits = stat_result.st_mode
desired_bits = GetFileWritePermissionBits()
file_is_writeable = ( desired_bits & current_bits ) == desired_bits
return file_is_writeable
return os.access( path, os.W_OK )
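
As a quick illustration of what the simplified checks above now defer to (the paths are examples only): os.access lets the OS evaluate the effective user and group, instead of hydrus re-deriving writability from the raw mode bits.

import os

# a directory is usable for new files if we can both write into it and traverse it
print( os.access( '/tmp', os.W_OK | os.X_OK ) )

# a regular file passes the writable check if the effective user can write to it
print( os.access( '/tmp/example.txt', os.W_OK ) )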
def FilterFreePaths( paths ):
@ -310,21 +306,6 @@ def GetDevice( path ):
return None
def GetFileWritePermissionBits():
if HC.PLATFORM_WINDOWS:
# this is actually the same value as S_IWUSR, but let's not try to second guess ourselves
write_bits = stat.S_IREAD | stat.S_IWRITE
else:
# guarantee 644 for regular files m8
write_bits = stat.S_IRUSR | stat.S_IWUSR | stat.S_IRGRP | stat.S_IROTH
return write_bits
def GetFreeSpace( path ):
disk_usage = psutil.disk_usage( path )
@ -460,31 +441,6 @@ def MakeSureDirectoryExists( path ):
os.makedirs( path, exist_ok = True )
def MakeFileWriteable( path ):
if not os.path.exists( path ):
return
try:
stat_result = os.stat( path )
current_bits = stat_result.st_mode
desired_bits = GetFileWritePermissionBits()
if not FileisWriteable( path ):
os.chmod( path, current_bits | desired_bits )
except Exception as e:
HydrusData.Print( 'Wanted to add write permission to "{}", but had an error: {}'.format( path, str( e ) ) )
def safe_copy2( source, dest ):
copy_metadata = True
@ -516,8 +472,6 @@ def MergeFile( source, dest ):
if not os.path.isdir( source ):
MakeFileWriteable( source )
if PathsHaveSameSizeAndDate( source, dest ):
DeletePath( source )
@ -622,7 +576,7 @@ def MirrorFile( source, dest ):
try:
MakeFileWriteable( dest )
TryToMakeFileWriteable( dest )
safe_copy2( source, dest )
@ -806,7 +760,7 @@ def RecyclePath( path ):
if os.path.exists( path ):
MakeFileWriteable( path )
TryToMakeFileWriteable( path )
try:
@ -838,3 +792,87 @@ def SanitizeFilename( filename ):
return filename
def TryToGiveFileNicePermissionBits( path ):
if not os.path.exists( path ):
return
try:
stat_result = os.stat( path )
current_bits = stat_result.st_mode
if HC.PLATFORM_WINDOWS:
# this is actually the same value as S_IWUSR, but let's not try to second guess ourselves
desired_bits = stat.S_IREAD | stat.S_IWRITE
else:
# typically guarantee 644 for regular files m8, but now we also take umask into account
try:
umask = os.umask( 0o022 )
os.umask( umask )
except:
umask = 0o022
desired_bits = ( stat.S_IRUSR | stat.S_IWUSR | stat.S_IRGRP | stat.S_IROTH ) & ~umask
if not ( desired_bits & current_bits ) == desired_bits:
os.chmod( path, current_bits | desired_bits )
except Exception as e:
HydrusData.Print( 'Wanted to add read and write permission to "{}", but had an error: {}'.format( path, str( e ) ) )
def TryToMakeFileWriteable( path ):
if not os.path.exists( path ):
return
if FileisWriteable( path ):
return
try:
stat_result = os.stat( path )
current_bits = stat_result.st_mode
if HC.PLATFORM_WINDOWS:
# this is actually the same value as S_IWUSR, but let's not try to second guess ourselves
desired_bits = stat.S_IREAD | stat.S_IWRITE
else:
# this only does what we want if we own the file, but only owners can non-sudo change permission anyway
desired_bits = stat.S_IWUSR
if not ( desired_bits & current_bits ) == desired_bits:
os.chmod( path, current_bits | desired_bits )
except Exception as e:
HydrusData.Print( 'Wanted to add user write permission to "{}", but had an error: {}'.format( path, str( e ) ) )
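
A small standalone sketch of the umask round-trip used in TryToGiveFileNicePermissionBits above; the 0o022 fallback mirrors that code and the example umasks are illustrative.

import os
import stat

def nice_regular_file_bits():
    
    # os.umask sets a new mask and returns the previous one, so set-then-restore is how you read it
    try:
        
        umask = os.umask( 0o022 )
        
        os.umask( umask )
        
    except:
        
        umask = 0o022
        
    
    base_bits = stat.S_IRUSR | stat.S_IWUSR | stat.S_IRGRP | stat.S_IROTH # 0o644
    
    return base_bits & ~umask
    

# a default 0o022 umask keeps 0o644; a stricter 0o077 umask yields 0o600
print( oct( nice_regular_file_bits() ) )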

View File

@ -1946,7 +1946,7 @@ class TestClientAPI( unittest.TestCase ):
( file_search_context, ) = args
self.assertEqual( file_search_context.GetFileServiceKey(), CC.LOCAL_FILE_SERVICE_KEY )
self.assertEqual( file_search_context.GetLocationSearchContext().current_service_keys, { CC.LOCAL_FILE_SERVICE_KEY } )
self.assertEqual( file_search_context.GetTagSearchContext().service_key, CC.COMBINED_TAG_SERVICE_KEY )
self.assertEqual( set( file_search_context.GetPredicates() ), { ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, tag ) for tag in tags } )
@ -1961,6 +1961,63 @@ class TestClientAPI( unittest.TestCase ):
self.assertEqual( kwargs[ 'apply_implicit_limit' ], False )
# search files and get hashes
HG.test_controller.ClearReads( 'file_query_ids' )
sample_hash_ids = set( random.sample( hash_ids, 3 ) )
hash_ids_to_hashes = { hash_id : os.urandom( 32 ) for hash_id in sample_hash_ids }
HG.test_controller.SetRead( 'file_query_ids', set( sample_hash_ids ) )
HG.test_controller.SetRead( 'hash_ids_to_hashes', hash_ids_to_hashes )
tags = [ 'kino', 'green' ]
path = '/get_files/search_files?tags={}&return_hashes=true'.format( urllib.parse.quote( json.dumps( tags ) ) )
connection.request( 'GET', path, headers = headers )
response = connection.getresponse()
data = response.read()
text = str( data, 'utf-8' )
self.assertEqual( response.status, 200 )
d = json.loads( text )
expected_hashes_set = { hash.hex() for hash in hash_ids_to_hashes.values() }
self.assertEqual( set( d[ 'hashes' ] ), expected_hashes_set )
[ ( args, kwargs ) ] = HG.test_controller.GetRead( 'file_query_ids' )
( file_search_context, ) = args
self.assertEqual( file_search_context.GetLocationSearchContext().current_service_keys, { CC.LOCAL_FILE_SERVICE_KEY } )
self.assertEqual( file_search_context.GetTagSearchContext().service_key, CC.COMBINED_TAG_SERVICE_KEY )
self.assertEqual( set( file_search_context.GetPredicates() ), { ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, tag ) for tag in tags } )
self.assertIn( 'sort_by', kwargs )
sort_by = kwargs[ 'sort_by' ]
self.assertEqual( sort_by.sort_type, ( 'system', CC.SORT_FILES_BY_IMPORT_TIME ) )
self.assertEqual( sort_by.sort_order, CC.SORT_DESC )
self.assertIn( 'apply_implicit_limit', kwargs )
self.assertEqual( kwargs[ 'apply_implicit_limit' ], False )
[ ( args, kwargs ) ] = HG.test_controller.GetRead( 'hash_ids_to_hashes' )
hash_ids = kwargs[ 'hash_ids' ]
self.assertEqual( set( hash_ids ), sample_hash_ids )
# sort
# this just tests if it parses, we don't have a full test for read params yet
@ -1989,7 +2046,7 @@ class TestClientAPI( unittest.TestCase ):
( file_search_context, ) = args
self.assertEqual( file_search_context.GetFileServiceKey(), CC.LOCAL_FILE_SERVICE_KEY )
self.assertEqual( file_search_context.GetLocationSearchContext().current_service_keys, { CC.LOCAL_FILE_SERVICE_KEY } )
self.assertEqual( file_search_context.GetTagSearchContext().service_key, CC.COMBINED_TAG_SERVICE_KEY )
self.assertEqual( set( file_search_context.GetPredicates() ), { ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, tag ) for tag in tags } )
@ -2030,7 +2087,7 @@ class TestClientAPI( unittest.TestCase ):
( file_search_context, ) = args
self.assertEqual( file_search_context.GetFileServiceKey(), CC.LOCAL_FILE_SERVICE_KEY )
self.assertEqual( file_search_context.GetLocationSearchContext().current_service_keys, { CC.LOCAL_FILE_SERVICE_KEY } )
self.assertEqual( file_search_context.GetTagSearchContext().service_key, CC.COMBINED_TAG_SERVICE_KEY )
self.assertEqual( set( file_search_context.GetPredicates() ), { ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, tag ) for tag in tags } )
@ -2076,7 +2133,7 @@ class TestClientAPI( unittest.TestCase ):
( file_search_context, ) = args
self.assertEqual( file_search_context.GetFileServiceKey(), CC.TRASH_SERVICE_KEY )
self.assertEqual( file_search_context.GetLocationSearchContext().current_service_keys, { CC.TRASH_SERVICE_KEY } )
self.assertEqual( file_search_context.GetTagSearchContext().service_key, CC.COMBINED_TAG_SERVICE_KEY )
self.assertEqual( set( file_search_context.GetPredicates() ), { ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, tag ) for tag in tags } )
@ -2123,7 +2180,7 @@ class TestClientAPI( unittest.TestCase ):
( file_search_context, ) = args
self.assertEqual( file_search_context.GetFileServiceKey(), CC.TRASH_SERVICE_KEY )
self.assertEqual( file_search_context.GetLocationSearchContext().current_service_keys, { CC.TRASH_SERVICE_KEY } )
self.assertEqual( file_search_context.GetTagSearchContext().service_key, CC.COMBINED_TAG_SERVICE_KEY )
self.assertEqual( set( file_search_context.GetPredicates() ), { ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, tag ) for tag in tags } )

View File

@ -84,15 +84,19 @@ class TestClientDB( unittest.TestCase ):
file_import_options = HG.client_controller.new_options.GetDefaultFileImportOptions( 'loud' )
location_search_context = ClientSearch.LocationSearchContext( current_service_keys = [ CC.COMBINED_FILE_SERVICE_KEY ] )
tag_search_context = ClientSearch.TagSearchContext( service_key = CC.DEFAULT_LOCAL_TAG_SERVICE_KEY )
file_search_context = ClientSearch.FileSearchContext( location_search_context = location_search_context, tag_search_context = tag_search_context )
TestClientDB._clear_db()
result = self._read( 'autocomplete_predicates', ClientTags.TAG_DISPLAY_STORAGE, tag_search_context, CC.COMBINED_FILE_SERVICE_KEY, search_text = 'c*' )
result = self._read( 'autocomplete_predicates', ClientTags.TAG_DISPLAY_STORAGE, file_search_context, search_text = 'c*' )
self.assertEqual( result, [] )
result = self._read( 'autocomplete_predicates', ClientTags.TAG_DISPLAY_STORAGE, tag_search_context, CC.COMBINED_FILE_SERVICE_KEY, search_text = 'series:*' )
result = self._read( 'autocomplete_predicates', ClientTags.TAG_DISPLAY_STORAGE, file_search_context, search_text = 'series:*' )
self.assertEqual( result, [] )
@ -126,114 +130,114 @@ class TestClientDB( unittest.TestCase ):
# cars
result = self._read( 'autocomplete_predicates', ClientTags.TAG_DISPLAY_STORAGE, tag_search_context, CC.COMBINED_FILE_SERVICE_KEY, search_text = 'c*', add_namespaceless = True )
result = self._read( 'autocomplete_predicates', ClientTags.TAG_DISPLAY_STORAGE, file_search_context, search_text = 'c*', add_namespaceless = True )
preds = set()
preds.add( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, 'car', min_current_count = 1 ) )
preds.add( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, 'series:cars', min_current_count = 1 ) )
preds.add( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, 'car', count = ClientSearch.PredicateCount.STATICCreateCurrentCount( 1 ) ) )
preds.add( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, 'series:cars', count = ClientSearch.PredicateCount.STATICCreateCurrentCount( 1 ) ) )
for p in result: self.assertEqual( p.GetCount( HC.CONTENT_STATUS_CURRENT ), 1 )
for p in result: self.assertEqual( p.GetCount().GetMinCount( HC.CONTENT_STATUS_CURRENT ), 1 )
self.assertEqual( set( result ), preds )
# cars
result = self._read( 'autocomplete_predicates', ClientTags.TAG_DISPLAY_STORAGE, tag_search_context, CC.COMBINED_FILE_SERVICE_KEY, search_text = 'c*', add_namespaceless = False )
result = self._read( 'autocomplete_predicates', ClientTags.TAG_DISPLAY_STORAGE, file_search_context, search_text = 'c*', add_namespaceless = False )
preds = set()
preds.add( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, 'series:cars', min_current_count = 1 ) )
preds.add( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, 'car', min_current_count = 1 ) )
preds.add( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, 'series:cars', count = ClientSearch.PredicateCount.STATICCreateCurrentCount( 1 ) ) )
preds.add( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, 'car', count = ClientSearch.PredicateCount.STATICCreateCurrentCount( 1 ) ) )
for p in result: self.assertEqual( p.GetCount( HC.CONTENT_STATUS_CURRENT ), 1 )
for p in result: self.assertEqual( p.GetCount().GetMinCount( HC.CONTENT_STATUS_CURRENT ), 1 )
self.assertEqual( set( result ), preds )
#
result = self._read( 'autocomplete_predicates', ClientTags.TAG_DISPLAY_STORAGE, tag_search_context, CC.COMBINED_FILE_SERVICE_KEY, search_text = 'ser*' )
result = self._read( 'autocomplete_predicates', ClientTags.TAG_DISPLAY_STORAGE, file_search_context, search_text = 'ser*' )
self.assertEqual( result, [] )
#
result = self._read( 'autocomplete_predicates', ClientTags.TAG_DISPLAY_STORAGE, tag_search_context, CC.COMBINED_FILE_SERVICE_KEY, search_text = 'series:c*' )
result = self._read( 'autocomplete_predicates', ClientTags.TAG_DISPLAY_STORAGE, file_search_context, search_text = 'series:c*' )
pred = ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, 'series:cars', min_current_count = 1 )
pred = ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, 'series:cars', count = ClientSearch.PredicateCount.STATICCreateCurrentCount( 1 ) )
( read_pred, ) = result
self.assertEqual( read_pred.GetCount( HC.CONTENT_STATUS_CURRENT ), 1 )
self.assertEqual( read_pred.GetCount().GetMinCount( HC.CONTENT_STATUS_CURRENT ), 1 )
self.assertEqual( pred, read_pred )
#
result = self._read( 'autocomplete_predicates', ClientTags.TAG_DISPLAY_STORAGE, tag_search_context, CC.COMBINED_FILE_SERVICE_KEY, search_text = 'car', exact_match = True )
result = self._read( 'autocomplete_predicates', ClientTags.TAG_DISPLAY_STORAGE, file_search_context, search_text = 'car', exact_match = True )
pred = ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, 'car', min_current_count = 1 )
pred = ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, 'car', count = ClientSearch.PredicateCount.STATICCreateCurrentCount( 1 ) )
( read_pred, ) = result
self.assertEqual( read_pred.GetCount( HC.CONTENT_STATUS_CURRENT ), 1 )
self.assertEqual( read_pred.GetCount().GetMinCount( HC.CONTENT_STATUS_CURRENT ), 1 )
self.assertEqual( pred, read_pred )
#
result = self._read( 'autocomplete_predicates', ClientTags.TAG_DISPLAY_STORAGE, tag_search_context, CC.COMBINED_FILE_SERVICE_KEY, search_text = 'c', exact_match = True )
result = self._read( 'autocomplete_predicates', ClientTags.TAG_DISPLAY_STORAGE, file_search_context, search_text = 'c', exact_match = True )
self.assertEqual( result, [] )
#
result = self._read( 'autocomplete_predicates', ClientTags.TAG_DISPLAY_STORAGE, tag_search_context, CC.COMBINED_FILE_SERVICE_KEY, search_text = '*' )
result = self._read( 'autocomplete_predicates', ClientTags.TAG_DISPLAY_STORAGE, file_search_context, search_text = '*' )
preds = set()
preds.add( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, 'car', min_current_count = 1 ) )
preds.add( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, 'series:cars', min_current_count = 1 ) )
preds.add( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, 'maker:ford', min_current_count = 1 ) )
preds.add( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, 'car', count = ClientSearch.PredicateCount.STATICCreateCurrentCount( 1 ) ) )
preds.add( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, 'series:cars', count = ClientSearch.PredicateCount.STATICCreateCurrentCount( 1 ) ) )
preds.add( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, 'maker:ford', count = ClientSearch.PredicateCount.STATICCreateCurrentCount( 1 ) ) )
for p in result: self.assertEqual( p.GetCount( HC.CONTENT_STATUS_CURRENT ), 1 )
for p in result: self.assertEqual( p.GetCount().GetMinCount( HC.CONTENT_STATUS_CURRENT ), 1 )
self.assertEqual( set( result ), preds )
#
result = self._read( 'autocomplete_predicates', ClientTags.TAG_DISPLAY_STORAGE, tag_search_context, CC.COMBINED_FILE_SERVICE_KEY, search_text = 'series:*' )
result = self._read( 'autocomplete_predicates', ClientTags.TAG_DISPLAY_STORAGE, file_search_context, search_text = 'series:*' )
preds = set()
preds.add( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, 'series:cars', min_current_count = 1 ) )
preds.add( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, 'series:cars', count = ClientSearch.PredicateCount.STATICCreateCurrentCount( 1 ) ) )
for p in result: self.assertEqual( p.GetCount( HC.CONTENT_STATUS_CURRENT ), 1 )
for p in result: self.assertEqual( p.GetCount().GetMinCount( HC.CONTENT_STATUS_CURRENT ), 1 )
self.assertEqual( set( result ), preds )
#
result = self._read( 'autocomplete_predicates', ClientTags.TAG_DISPLAY_STORAGE, tag_search_context, CC.COMBINED_FILE_SERVICE_KEY, search_text = 'c*r*' )
result = self._read( 'autocomplete_predicates', ClientTags.TAG_DISPLAY_STORAGE, file_search_context, search_text = 'c*r*' )
preds = set()
preds.add( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, 'car', min_current_count = 1 ) )
preds.add( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, 'series:cars', min_current_count = 1 ) )
preds.add( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, 'car', count = ClientSearch.PredicateCount.STATICCreateCurrentCount( 1 ) ) )
preds.add( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, 'series:cars', count = ClientSearch.PredicateCount.STATICCreateCurrentCount( 1 ) ) )
for p in result: self.assertEqual( p.GetCount( HC.CONTENT_STATUS_CURRENT ), 1 )
for p in result: self.assertEqual( p.GetCount().GetMinCount( HC.CONTENT_STATUS_CURRENT ), 1 )
self.assertEqual( set( result ), preds )
#
result = self._read( 'autocomplete_predicates', ClientTags.TAG_DISPLAY_STORAGE, tag_search_context, CC.COMBINED_FILE_SERVICE_KEY, search_text = 'ser*', search_namespaces_into_full_tags = True )
result = self._read( 'autocomplete_predicates', ClientTags.TAG_DISPLAY_STORAGE, file_search_context, search_text = 'ser*', search_namespaces_into_full_tags = True )
preds = set()
preds.add( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, 'series:cars', min_current_count = 1 ) )
preds.add( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, 'series:cars', count = ClientSearch.PredicateCount.STATICCreateCurrentCount( 1 ) ) )
for p in result: self.assertEqual( p.GetCount( HC.CONTENT_STATUS_CURRENT ), 1 )
for p in result: self.assertEqual( p.GetCount().GetMinCount( HC.CONTENT_STATUS_CURRENT ), 1 )
self.assertEqual( set( result ), preds )
@ -768,14 +772,14 @@ class TestClientDB( unittest.TestCase ):
predicates = []
predicates.append( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_EVERYTHING, min_current_count = 1 ) )
predicates.append( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_INBOX, min_current_count = 1 ) )
predicates.append( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_ARCHIVE, min_current_count = 0 ) )
predicates.append( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_EVERYTHING, count = ClientSearch.PredicateCount.STATICCreateCurrentCount( 1 ) ) )
predicates.append( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_INBOX, count = ClientSearch.PredicateCount.STATICCreateCurrentCount( 1 ) ) )
predicates.append( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_ARCHIVE, count = ClientSearch.PredicateCount.STATICCreateCurrentCount( 0 ) ) )
predicates.extend( [ ClientSearch.Predicate( predicate_type ) for predicate_type in [ ClientSearch.PREDICATE_TYPE_SYSTEM_NUM_TAGS, ClientSearch.PREDICATE_TYPE_SYSTEM_LIMIT, ClientSearch.PREDICATE_TYPE_SYSTEM_SIZE, ClientSearch.PREDICATE_TYPE_SYSTEM_AGE, ClientSearch.PREDICATE_TYPE_SYSTEM_MODIFIED_TIME, ClientSearch.PREDICATE_TYPE_SYSTEM_KNOWN_URLS, ClientSearch.PREDICATE_TYPE_SYSTEM_HAS_AUDIO, ClientSearch.PREDICATE_TYPE_SYSTEM_HAS_ICC_PROFILE, ClientSearch.PREDICATE_TYPE_SYSTEM_HASH, ClientSearch.PREDICATE_TYPE_SYSTEM_DIMENSIONS, ClientSearch.PREDICATE_TYPE_SYSTEM_DURATION, ClientSearch.PREDICATE_TYPE_SYSTEM_NOTES, ClientSearch.PREDICATE_TYPE_SYSTEM_NUM_WORDS, ClientSearch.PREDICATE_TYPE_SYSTEM_MIME, ClientSearch.PREDICATE_TYPE_SYSTEM_RATING, ClientSearch.PREDICATE_TYPE_SYSTEM_SIMILAR_TO, ClientSearch.PREDICATE_TYPE_SYSTEM_FILE_SERVICE, ClientSearch.PREDICATE_TYPE_SYSTEM_TAG_AS_NUMBER, ClientSearch.PREDICATE_TYPE_SYSTEM_FILE_RELATIONSHIPS, ClientSearch.PREDICATE_TYPE_SYSTEM_FILE_VIEWING_STATS ] ] )
self.assertEqual( set( result ), set( predicates ) )
for i in range( len( predicates ) ): self.assertEqual( result[i].GetCount(), predicates[i].GetCount() )
for i in range( len( predicates ) ): self.assertEqual( result[i].GetCount().GetMinCount(), predicates[i].GetCount().GetMinCount() )
def test_file_updates( self ):
@ -1160,7 +1164,7 @@ class TestClientDB( unittest.TestCase ):
location_search_context = ClientSearch.LocationSearchContext( current_service_keys = [ CC.LOCAL_FILE_SERVICE_KEY ] )
fsc = ClientSearch.FileSearchContext( location_search_context = location_search_context, predicates = [ ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, 'tag', min_current_count = 1, min_pending_count = 3 ) ] )
fsc = ClientSearch.FileSearchContext( location_search_context = location_search_context, predicates = [ ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, 'tag', count = ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 3 ) ) ] )
management_controller = ClientGUIManagement.CreateManagementControllerQuery( 'wew lad', fsc, True )
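
For orientation amid the test churn, here is the old-versus-new shape of a counted tag predicate as it appears in these hunks; the import path is an assumption and only calls visible in this diff are used.

# sketch only; module location assumed
from hydrus.client import ClientSearch

# old style: loose min_* keyword counts directly on the predicate
# pred = ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, 'tag', min_current_count = 1, min_pending_count = 3 )

# new style: counts live in a PredicateCount object, ( current, pending ) per the calls above
count = ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 3 )

pred = ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, 'tag', count = count )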

View File

@ -636,19 +636,22 @@ class TestClientDBTags( unittest.TestCase ):
def _test_ac( self, search_text, tag_service_key, file_service_key, expected_storage_tags_to_counts, expected_display_tags_to_counts ):
location_search_context = ClientSearch.LocationSearchContext( current_service_keys = [ file_service_key ] )
tag_search_context = ClientSearch.TagSearchContext( tag_service_key )
preds = self._read( 'autocomplete_predicates', ClientTags.TAG_DISPLAY_STORAGE, tag_search_context, file_service_key, search_text = search_text )
file_search_context = ClientSearch.FileSearchContext( location_search_context = location_search_context, tag_search_context = tag_search_context )
tags_to_counts = { pred.GetValue() : pred.GetAllCounts() for pred in preds }
preds = self._read( 'autocomplete_predicates', ClientTags.TAG_DISPLAY_STORAGE, file_search_context, search_text = search_text )
self.assertEqual( expected_storage_tags_to_counts, tags_to_counts )
tags_to_counts = { pred.GetValue() : pred.GetCount() for pred in preds }
preds = self._read( 'autocomplete_predicates', ClientTags.TAG_DISPLAY_ACTUAL, tag_search_context, file_service_key, search_text = search_text )
self.assertDictEqual( expected_storage_tags_to_counts, tags_to_counts )
tags_to_counts = { pred.GetValue() : pred.GetAllCounts() for pred in preds }
preds = self._read( 'autocomplete_predicates', ClientTags.TAG_DISPLAY_ACTUAL, file_search_context, search_text = search_text )
self.assertEqual( expected_display_tags_to_counts, tags_to_counts )
tags_to_counts = { pred.GetValue() : pred.GetCount() for pred in preds }
self.assertDictEqual( expected_display_tags_to_counts, tags_to_counts )
def test_display_pairs_lookup_web_parents( self ):
@ -1089,8 +1092,8 @@ class TestClientDBTags( unittest.TestCase ):
# and a/c results, both specific and combined
self._test_ac( 'samu*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, { bad_samus_tag_1 : ( 0, None, 1, None ), bad_samus_tag_2 : ( 0, None, 1, None ) }, { good_samus_tag : ( 0, None, 1, None ) } )
self._test_ac( 'samu*', self._public_service_key, CC.COMBINED_FILE_SERVICE_KEY, { bad_samus_tag_1 : ( 0, None, 1, None ), bad_samus_tag_2 : ( 0, None, 1, None ) }, { good_samus_tag : ( 0, None, 1, None ) } )
self._test_ac( 'samu*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, { bad_samus_tag_1 : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 1 ), bad_samus_tag_2 : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 1 ) }, { good_samus_tag : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 1 ) } )
self._test_ac( 'samu*', self._public_service_key, CC.COMBINED_FILE_SERVICE_KEY, { bad_samus_tag_1 : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 1 ), bad_samus_tag_2 : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 1 ) }, { good_samus_tag : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 1 ) } )
# now we'll currentify the tags in one action
@ -1118,8 +1121,8 @@ class TestClientDBTags( unittest.TestCase ):
# and a/c results, both specific and combined
self._test_ac( 'samu*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, { bad_samus_tag_1 : ( 1, None, 0, None ), bad_samus_tag_2 : ( 1, None, 0, None ) }, { good_samus_tag : ( 1, None, 0, None ) } )
self._test_ac( 'samu*', self._public_service_key, CC.COMBINED_FILE_SERVICE_KEY, { bad_samus_tag_1 : ( 1, None, 0, None ), bad_samus_tag_2 : ( 1, None, 0, None ) }, { good_samus_tag : ( 1, None, 0, None ) } )
self._test_ac( 'samu*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, { bad_samus_tag_1 : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ), bad_samus_tag_2 : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ) }, { good_samus_tag : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ) } )
self._test_ac( 'samu*', self._public_service_key, CC.COMBINED_FILE_SERVICE_KEY, { bad_samus_tag_1 : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ), bad_samus_tag_2 : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ) }, { good_samus_tag : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ) } )
def test_display_pending_to_current_bug_non_ideal_and_ideal( self ):
@ -1187,8 +1190,8 @@ class TestClientDBTags( unittest.TestCase ):
# and a/c results, both specific and combined
self._test_ac( 'samu*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, { bad_samus_tag_1 : ( 0, None, 1, None ), good_samus_tag : ( 0, None, 1, None ) }, { good_samus_tag : ( 0, None, 1, None ) } )
self._test_ac( 'samu*', self._public_service_key, CC.COMBINED_FILE_SERVICE_KEY, { bad_samus_tag_1 : ( 0, None, 1, None ), good_samus_tag : ( 0, None, 1, None ) }, { good_samus_tag : ( 0, None, 1, None ) } )
self._test_ac( 'samu*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, { bad_samus_tag_1 : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 1 ), good_samus_tag : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 1 ) }, { good_samus_tag : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 1 ) } )
self._test_ac( 'samu*', self._public_service_key, CC.COMBINED_FILE_SERVICE_KEY, { bad_samus_tag_1 : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 1 ), good_samus_tag : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 1 ) }, { good_samus_tag : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 1 ) } )
# now we'll currentify the tags in one action
@ -1216,8 +1219,8 @@ class TestClientDBTags( unittest.TestCase ):
# and a/c results, both specific and combined
self._test_ac( 'samu*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, { bad_samus_tag_1 : ( 1, None, 0, None ), good_samus_tag : ( 1, None, 0, None ) }, { good_samus_tag : ( 1, None, 0, None ) } )
self._test_ac( 'samu*', self._public_service_key, CC.COMBINED_FILE_SERVICE_KEY, { bad_samus_tag_1 : ( 1, None, 0, None ), good_samus_tag : ( 1, None, 0, None ) }, { good_samus_tag : ( 1, None, 0, None ) } )
self._test_ac( 'samu*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, { bad_samus_tag_1 : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ), good_samus_tag : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ) }, { good_samus_tag : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ) } )
self._test_ac( 'samu*', self._public_service_key, CC.COMBINED_FILE_SERVICE_KEY, { bad_samus_tag_1 : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ), good_samus_tag : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ) }, { good_samus_tag : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ) } )
def test_display_pending_to_current_merge_bug_both_non_ideal( self ):
@ -1289,8 +1292,8 @@ class TestClientDBTags( unittest.TestCase ):
# and a/c results, both specific and combined
self._test_ac( 'samu*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, { bad_samus_tag_1 : ( 1, None, 0, None ), bad_samus_tag_2 : ( 0, None, 1, None ) }, { good_samus_tag : ( 1, None, 1, None ) } )
self._test_ac( 'samu*', self._public_service_key, CC.COMBINED_FILE_SERVICE_KEY, { bad_samus_tag_1 : ( 1, None, 0, None ), bad_samus_tag_2 : ( 0, None, 1, None ) }, { good_samus_tag : ( 1, None, 1, None ) } )
self._test_ac( 'samu*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, { bad_samus_tag_1 : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ), bad_samus_tag_2 : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 1 ) }, { good_samus_tag : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 1 ) } )
self._test_ac( 'samu*', self._public_service_key, CC.COMBINED_FILE_SERVICE_KEY, { bad_samus_tag_1 : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ), bad_samus_tag_2 : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 1 ) }, { good_samus_tag : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 1 ) } )
# now we'll currentify the tags in one action
@ -1318,8 +1321,8 @@ class TestClientDBTags( unittest.TestCase ):
# and a/c results, both specific and combined
self._test_ac( 'samu*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, { bad_samus_tag_1 : ( 1, None, 0, None ), bad_samus_tag_2 : ( 1, None, 0, None ) }, { good_samus_tag : ( 1, None, 0, None ) } )
self._test_ac( 'samu*', self._public_service_key, CC.COMBINED_FILE_SERVICE_KEY, { bad_samus_tag_1 : ( 1, None, 0, None ), bad_samus_tag_2 : ( 1, None, 0, None ) }, { good_samus_tag : ( 1, None, 0, None ) } )
self._test_ac( 'samu*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, { bad_samus_tag_1 : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ), bad_samus_tag_2 : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ) }, { good_samus_tag : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ) } )
self._test_ac( 'samu*', self._public_service_key, CC.COMBINED_FILE_SERVICE_KEY, { bad_samus_tag_1 : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ), bad_samus_tag_2 : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ) }, { good_samus_tag : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ) } )
def test_display_pending_to_current_merge_bug_non_ideal_and_ideal( self ):
@ -1390,8 +1393,8 @@ class TestClientDBTags( unittest.TestCase ):
# and a/c results, both specific and combined
self._test_ac( 'samu*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, { bad_samus_tag_1 : ( 0, None, 1, None ), good_samus_tag : ( 1, None, 0, None ) }, { good_samus_tag : ( 1, None, 1, None ) } )
self._test_ac( 'samu*', self._public_service_key, CC.COMBINED_FILE_SERVICE_KEY, { bad_samus_tag_1 : ( 0, None, 1, None ), good_samus_tag : ( 1, None, 0, None ) }, { good_samus_tag : ( 1, None, 1, None ) } )
self._test_ac( 'samu*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, { bad_samus_tag_1 : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 1 ), good_samus_tag : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ) }, { good_samus_tag : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 1 ) } )
self._test_ac( 'samu*', self._public_service_key, CC.COMBINED_FILE_SERVICE_KEY, { bad_samus_tag_1 : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 1 ), good_samus_tag : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ) }, { good_samus_tag : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 1 ) } )
# now we'll currentify the tags in one action
@ -1419,8 +1422,8 @@ class TestClientDBTags( unittest.TestCase ):
# and a/c results, both specific and combined
self._test_ac( 'samu*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, { bad_samus_tag_1 : ( 1, None, 0, None ), good_samus_tag : ( 1, None, 0, None ) }, { good_samus_tag : ( 1, None, 0, None ) } )
self._test_ac( 'samu*', self._public_service_key, CC.COMBINED_FILE_SERVICE_KEY, { bad_samus_tag_1 : ( 1, None, 0, None ), good_samus_tag : ( 1, None, 0, None ) }, { good_samus_tag : ( 1, None, 0, None ) } )
self._test_ac( 'samu*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, { bad_samus_tag_1 : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ), good_samus_tag : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ) }, { good_samus_tag : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ) } )
self._test_ac( 'samu*', self._public_service_key, CC.COMBINED_FILE_SERVICE_KEY, { bad_samus_tag_1 : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ), good_samus_tag : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ) }, { good_samus_tag : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ) } )
def test_display_pending_regen( self ):
@ -1491,11 +1494,11 @@ class TestClientDBTags( unittest.TestCase ):
# and a/c results, both specific and combined
self._test_ac( 'samu*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, { bad_samus_tag_1 : ( 0, None, 1, None ), bad_samus_tag_2 : ( 0, None, 1, None ) }, { good_samus_tag : ( 0, None, 1, None ) } )
self._test_ac( 'samu*', self._public_service_key, CC.COMBINED_FILE_SERVICE_KEY, { bad_samus_tag_1 : ( 0, None, 1, None ), bad_samus_tag_2 : ( 0, None, 1, None ) }, { good_samus_tag : ( 0, None, 1, None ) } )
self._test_ac( 'samu*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, { bad_samus_tag_1 : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 1 ), bad_samus_tag_2 : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 1 ) }, { good_samus_tag : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 1 ) } )
self._test_ac( 'samu*', self._public_service_key, CC.COMBINED_FILE_SERVICE_KEY, { bad_samus_tag_1 : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 1 ), bad_samus_tag_2 : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 1 ) }, { good_samus_tag : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 1 ) } )
self._test_ac( 'lara*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, { lara_tag : ( 0, None, 1, None ) }, { lara_tag : ( 0, None, 1, None ) } )
self._test_ac( 'lara*', self._public_service_key, CC.COMBINED_FILE_SERVICE_KEY, { lara_tag : ( 0, None, 1, None ) }, { lara_tag : ( 0, None, 1, None ) } )
self._test_ac( 'lara*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, { lara_tag : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 1 ) }, { lara_tag : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 1 ) } )
self._test_ac( 'lara*', self._public_service_key, CC.COMBINED_FILE_SERVICE_KEY, { lara_tag : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 1 ) }, { lara_tag : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 1 ) } )
# now we'll currentify the tags in one action
@ -1515,11 +1518,11 @@ class TestClientDBTags( unittest.TestCase ):
# and a/c results, both specific and combined
self._test_ac( 'samu*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, { bad_samus_tag_1 : ( 0, None, 1, None ), bad_samus_tag_2 : ( 0, None, 1, None ) }, { good_samus_tag : ( 0, None, 1, None ) } )
self._test_ac( 'samu*', self._public_service_key, CC.COMBINED_FILE_SERVICE_KEY, { bad_samus_tag_1 : ( 0, None, 1, None ), bad_samus_tag_2 : ( 0, None, 1, None ) }, { good_samus_tag : ( 0, None, 1, None ) } )
self._test_ac( 'samu*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, { bad_samus_tag_1 : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 1 ), bad_samus_tag_2 : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 1 ) }, { good_samus_tag : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 1 ) } )
self._test_ac( 'samu*', self._public_service_key, CC.COMBINED_FILE_SERVICE_KEY, { bad_samus_tag_1 : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 1 ), bad_samus_tag_2 : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 1 ) }, { good_samus_tag : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 1 ) } )
self._test_ac( 'lara*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, { lara_tag : ( 0, None, 1, None ) }, { lara_tag : ( 0, None, 1, None ) } )
self._test_ac( 'lara*', self._public_service_key, CC.COMBINED_FILE_SERVICE_KEY, { lara_tag : ( 0, None, 1, None ) }, { lara_tag : ( 0, None, 1, None ) } )
self._test_ac( 'lara*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, { lara_tag : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 1 ) }, { lara_tag : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 1 ) } )
self._test_ac( 'lara*', self._public_service_key, CC.COMBINED_FILE_SERVICE_KEY, { lara_tag : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 1 ) }, { lara_tag : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 1 ) } )
def test_parents_pairs_lookup( self ):
@ -2430,23 +2433,6 @@ class TestClientDBTags( unittest.TestCase ):
# this sucks big time and should really be broken into specific scenarios to test add_file with tags and sibs etc...
def test_ac( search_text, tag_service_key, file_service_key, expected_storage_tags_to_counts, expected_display_tags_to_counts ):
tag_search_context = ClientSearch.TagSearchContext( tag_service_key )
preds = self._read( 'autocomplete_predicates', ClientTags.TAG_DISPLAY_STORAGE, tag_search_context, file_service_key, search_text = search_text )
tags_to_counts = { pred.GetValue() : pred.GetAllCounts() for pred in preds }
self.assertEqual( expected_storage_tags_to_counts, tags_to_counts )
preds = self._read( 'autocomplete_predicates', ClientTags.TAG_DISPLAY_ACTUAL, tag_search_context, file_service_key, search_text = search_text )
tags_to_counts = { pred.GetValue() : pred.GetAllCounts() for pred in preds }
self.assertEqual( expected_display_tags_to_counts, tags_to_counts )
for on_local_files in ( False, True ):
def test_no_sibs( force_no_local_files = False ):
@ -2492,44 +2478,44 @@ class TestClientDBTags( unittest.TestCase ):
self.assertEqual( hash_ids_to_tags_managers[ self._samus_good_hash_id ].GetCurrentAndPending( CC.COMBINED_TAG_SERVICE_KEY, ClientTags.TAG_DISPLAY_ACTUAL ), { 'mc good', 'process these', 'pc good', 'pp good', 'character:samus aran' } )
test_ac( 'mc bad*', self._my_service_key, CC.COMBINED_FILE_SERVICE_KEY, { 'mc bad' : ( 2, None, 0, None ) }, { 'mc bad' : ( 2, None, 0, None ) } )
test_ac( 'pc bad*', self._public_service_key, CC.COMBINED_FILE_SERVICE_KEY, { 'pc bad' : ( 2, None, 0, None ) }, { 'pc bad' : ( 2, None, 0, None ) } )
test_ac( 'pp bad*', self._public_service_key, CC.COMBINED_FILE_SERVICE_KEY, { 'pp bad' : ( 0, None, 2, None ) }, { 'pp bad' : ( 0, None, 2, None ) } )
test_ac( 'sameus aran*', self._my_service_key, CC.COMBINED_FILE_SERVICE_KEY, { 'sameus aran' : ( 1, None, 0, None ) }, { 'sameus aran' : ( 1, None, 0, None ) } )
test_ac( 'samus metroid*', self._public_service_key, CC.COMBINED_FILE_SERVICE_KEY, { 'samus metroid' : ( 1, None, 0, None ) }, { 'samus metroid' : ( 1, None, 0, None ) } )
test_ac( 'samus aran*', self._public_service_key, CC.COMBINED_FILE_SERVICE_KEY, { 'character:samus aran' : ( 0, None, 1, None ) }, { 'character:samus aran' : ( 0, None, 1, None ) } )
self._test_ac( 'mc bad*', self._my_service_key, CC.COMBINED_FILE_SERVICE_KEY, { 'mc bad' : ClientSearch.PredicateCount.STATICCreateStaticCount( 2, 0 ) }, { 'mc bad' : ClientSearch.PredicateCount.STATICCreateStaticCount( 2, 0 ) } )
self._test_ac( 'pc bad*', self._public_service_key, CC.COMBINED_FILE_SERVICE_KEY, { 'pc bad' : ClientSearch.PredicateCount.STATICCreateStaticCount( 2, 0 ) }, { 'pc bad' : ClientSearch.PredicateCount.STATICCreateStaticCount( 2, 0 ) } )
self._test_ac( 'pp bad*', self._public_service_key, CC.COMBINED_FILE_SERVICE_KEY, { 'pp bad' : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 2 ) }, { 'pp bad' : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 2 ) } )
self._test_ac( 'sameus aran*', self._my_service_key, CC.COMBINED_FILE_SERVICE_KEY, { 'sameus aran' : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ) }, { 'sameus aran' : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ) } )
self._test_ac( 'samus metroid*', self._public_service_key, CC.COMBINED_FILE_SERVICE_KEY, { 'samus metroid' : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ) }, { 'samus metroid' : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ) } )
self._test_ac( 'samus aran*', self._public_service_key, CC.COMBINED_FILE_SERVICE_KEY, { 'character:samus aran' : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 1 ) }, { 'character:samus aran' : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 1 ) } )
if on_local_files and not force_no_local_files:
test_ac( 'mc bad*', self._my_service_key, CC.LOCAL_FILE_SERVICE_KEY, { 'mc bad' : ( 2, None, 0, None ) }, { 'mc bad' : ( 2, None, 0, None ) } )
test_ac( 'pc bad*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, { 'pc bad' : ( 2, None, 0, None ) }, { 'pc bad' : ( 2, None, 0, None ) } )
test_ac( 'pp bad*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, { 'pp bad' : ( 0, None, 2, None ) }, { 'pp bad' : ( 0, None, 2, None ) } )
test_ac( 'sameus aran*', self._my_service_key, CC.LOCAL_FILE_SERVICE_KEY, { 'sameus aran' : ( 1, None, 0, None ) }, { 'sameus aran' : ( 1, None, 0, None ) } )
test_ac( 'samus metroid*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, { 'samus metroid' : ( 1, None, 0, None ) }, { 'samus metroid' : ( 1, None, 0, None ) } )
test_ac( 'samus aran*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, { 'character:samus aran' : ( 0, None, 1, None ) }, { 'character:samus aran' : ( 0, None, 1, None ) } )
self._test_ac( 'mc bad*', self._my_service_key, CC.LOCAL_FILE_SERVICE_KEY, { 'mc bad' : ClientSearch.PredicateCount.STATICCreateStaticCount( 2, 0 ) }, { 'mc bad' : ClientSearch.PredicateCount.STATICCreateStaticCount( 2, 0 ) } )
self._test_ac( 'pc bad*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, { 'pc bad' : ClientSearch.PredicateCount.STATICCreateStaticCount( 2, 0 ) }, { 'pc bad' : ClientSearch.PredicateCount.STATICCreateStaticCount( 2, 0 ) } )
self._test_ac( 'pp bad*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, { 'pp bad' : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 2 ) }, { 'pp bad' : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 2 ) } )
self._test_ac( 'sameus aran*', self._my_service_key, CC.LOCAL_FILE_SERVICE_KEY, { 'sameus aran' : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ) }, { 'sameus aran' : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ) } )
self._test_ac( 'samus metroid*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, { 'samus metroid' : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ) }, { 'samus metroid' : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ) } )
self._test_ac( 'samus aran*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, { 'character:samus aran' : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 1 ) }, { 'character:samus aran' : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 1 ) } )
test_ac( 'mc bad*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, { 'mc bad' : ( 2, None, 0, None ) }, { 'mc bad' : ( 2, None, 0, None ) } )
test_ac( 'pc bad*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, { 'pc bad' : ( 2, None, 0, None ) }, { 'pc bad' : ( 2, None, 0, None ) } )
test_ac( 'pp bad*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, { 'pp bad' : ( 0, None, 2, None ) }, { 'pp bad' : ( 0, None, 2, None ) } )
test_ac( 'sameus aran*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, { 'sameus aran' : ( 1, None, 0, None ) }, { 'sameus aran' : ( 1, None, 0, None ) } )
test_ac( 'samus metroid*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, { 'samus metroid' : ( 1, None, 0, None ) }, { 'samus metroid' : ( 1, None, 0, None ) } )
test_ac( 'samus aran*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, { 'character:samus aran' : ( 0, None, 1, None ) }, { 'character:samus aran' : ( 0, None, 1, None ) } )
self._test_ac( 'mc bad*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, { 'mc bad' : ClientSearch.PredicateCount.STATICCreateStaticCount( 2, 0 ) }, { 'mc bad' : ClientSearch.PredicateCount.STATICCreateStaticCount( 2, 0 ) } )
self._test_ac( 'pc bad*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, { 'pc bad' : ClientSearch.PredicateCount.STATICCreateStaticCount( 2, 0 ) }, { 'pc bad' : ClientSearch.PredicateCount.STATICCreateStaticCount( 2, 0 ) } )
self._test_ac( 'pp bad*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, { 'pp bad' : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 2 ) }, { 'pp bad' : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 2 ) } )
self._test_ac( 'sameus aran*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, { 'sameus aran' : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ) }, { 'sameus aran' : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ) } )
self._test_ac( 'samus metroid*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, { 'samus metroid' : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ) }, { 'samus metroid' : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ) } )
self._test_ac( 'samus aran*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, { 'character:samus aran' : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 1 ) }, { 'character:samus aran' : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 1 ) } )
else:
test_ac( 'mc bad*', self._my_service_key, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
test_ac( 'pc bad*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
test_ac( 'pp bad*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
test_ac( 'sameus aran*', self._my_service_key, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
test_ac( 'samus metroid*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
test_ac( 'samus aran*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
self._test_ac( 'mc bad*', self._my_service_key, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
self._test_ac( 'pc bad*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
self._test_ac( 'pp bad*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
self._test_ac( 'sameus aran*', self._my_service_key, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
self._test_ac( 'samus metroid*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
self._test_ac( 'samus aran*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
test_ac( 'mc bad*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
test_ac( 'pc bad*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
test_ac( 'pp bad*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
test_ac( 'sameus aran*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
test_ac( 'samus metroid*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
test_ac( 'samus aran*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
self._test_ac( 'mc bad*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
self._test_ac( 'pc bad*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
self._test_ac( 'pp bad*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
self._test_ac( 'sameus aran*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
self._test_ac( 'samus metroid*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
self._test_ac( 'samus aran*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
@ -2722,50 +2708,48 @@ class TestClientDBTags( unittest.TestCase ):
self.assertEqual( hash_ids_to_tags_managers[ self._samus_good_hash_id ].GetCurrentAndPending( CC.COMBINED_TAG_SERVICE_KEY, ClientTags.TAG_DISPLAY_ACTUAL ), { 'mc good', 'process these', 'pc good', 'pp good', 'character:samus aran' } )
# now we get more write a/c suggestions, and accurately merged read a/c values
test_ac( 'mc bad*', self._my_service_key, CC.COMBINED_FILE_SERVICE_KEY, { 'mc bad' : ( 2, None, 0, None ), 'mc good' : ( 2, None, 0, None ) }, { 'mc good' : ( 3, None, 0, None ) } )
test_ac( 'pc bad*', self._public_service_key, CC.COMBINED_FILE_SERVICE_KEY, { 'pc bad' : ( 2, None, 0, None ), 'pc good' : ( 2, None, 0, None ) }, { 'pc good' : ( 3, None, 0, None ) } )
test_ac( 'pp bad*', self._public_service_key, CC.COMBINED_FILE_SERVICE_KEY, { 'pp bad' : ( 0, None, 2, None ), 'pp good' : ( 0, None, 2, None ) }, { 'pp good' : ( 0, None, 3, None ) } )
test_ac( 'sameus aran*', self._my_service_key, CC.COMBINED_FILE_SERVICE_KEY, { 'sameus aran' : ( 1, None, 0, None ) }, { 'samus metroid' : ( 1, None, 0, None ) } )
test_ac( 'samus metroid*', self._public_service_key, CC.COMBINED_FILE_SERVICE_KEY, { 'samus metroid' : ( 1, None, 0, None ), 'character:samus aran' : ( 0, None, 1, None ) }, { 'character:samus aran' : ( 1, None, 1, None ) } )
test_ac( 'samus aran*', self._public_service_key, CC.COMBINED_FILE_SERVICE_KEY, { 'samus metroid' : ( 1, None, 0, None ), 'character:samus aran' : ( 0, None, 1, None ) }, { 'character:samus aran' : ( 1, None, 1, None ) } )
self._test_ac( 'mc bad*', self._my_service_key, CC.COMBINED_FILE_SERVICE_KEY, { 'mc bad' : ClientSearch.PredicateCount.STATICCreateStaticCount( 2, 0 ), 'mc good' : ClientSearch.PredicateCount.STATICCreateStaticCount( 2, 0 ) }, { 'mc good' : ClientSearch.PredicateCount.STATICCreateStaticCount( 3, 0 ) } )
self._test_ac( 'pc bad*', self._public_service_key, CC.COMBINED_FILE_SERVICE_KEY, { 'pc bad' : ClientSearch.PredicateCount.STATICCreateStaticCount( 2, 0 ), 'pc good' : ClientSearch.PredicateCount.STATICCreateStaticCount( 2, 0 ) }, { 'pc good' : ClientSearch.PredicateCount.STATICCreateStaticCount( 3, 0 ) } )
self._test_ac( 'pp bad*', self._public_service_key, CC.COMBINED_FILE_SERVICE_KEY, { 'pp bad' : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 2 ), 'pp good' : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 2 ) }, { 'pp good' : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 3 ) } )
self._test_ac( 'sameus aran*', self._my_service_key, CC.COMBINED_FILE_SERVICE_KEY, { 'sameus aran' : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ) }, { 'samus metroid' : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ) } )
self._test_ac( 'samus metroid*', self._public_service_key, CC.COMBINED_FILE_SERVICE_KEY, { 'samus metroid' : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ), 'character:samus aran' : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 1 ) }, { 'character:samus aran' : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 1 ) } )
self._test_ac( 'samus aran*', self._public_service_key, CC.COMBINED_FILE_SERVICE_KEY, { 'samus metroid' : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ), 'character:samus aran' : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 1 ) }, { 'character:samus aran' : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 1 ) } )
if on_local_files:
# same deal, just smaller file domain
test_ac( 'mc bad*', self._my_service_key, CC.LOCAL_FILE_SERVICE_KEY, { 'mc bad' : ( 2, None, 0, None ), 'mc good' : ( 2, None, 0, None ) }, { 'mc good' : ( 3, None, 0, None ) } )
test_ac( 'pc bad*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, { 'pc bad' : ( 2, None, 0, None ), 'pc good' : ( 2, None, 0, None ) }, { 'pc good' : ( 3, None, 0, None ) } )
test_ac( 'pp bad*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, { 'pp bad' : ( 0, None, 2, None ), 'pp good' : ( 0, None, 2, None ) }, { 'pp good' : ( 0, None, 3, None ) } )
test_ac( 'sameus aran*', self._my_service_key, CC.LOCAL_FILE_SERVICE_KEY, { 'sameus aran' : ( 1, None, 0, None ) }, { 'samus metroid' : ( 1, None, 0, None ) } )
test_ac( 'samus metroid*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, { 'samus metroid' : ( 1, None, 0, None ), 'character:samus aran' : ( 0, None, 1, None ) }, { 'character:samus aran' : ( 1, None, 1, None ) } )
test_ac( 'samus aran*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, { 'samus metroid' : ( 1, None, 0, None ), 'character:samus aran' : ( 0, None, 1, None ) }, { 'character:samus aran' : ( 1, None, 1, None ) } )
self._test_ac( 'mc bad*', self._my_service_key, CC.LOCAL_FILE_SERVICE_KEY, { 'mc bad' : ClientSearch.PredicateCount.STATICCreateStaticCount( 2, 0 ), 'mc good' : ClientSearch.PredicateCount.STATICCreateStaticCount( 2, 0 ) }, { 'mc good' : ClientSearch.PredicateCount.STATICCreateStaticCount( 3, 0 ) } )
self._test_ac( 'pc bad*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, { 'pc bad' : ClientSearch.PredicateCount.STATICCreateStaticCount( 2, 0 ), 'pc good' : ClientSearch.PredicateCount.STATICCreateStaticCount( 2, 0 ) }, { 'pc good' : ClientSearch.PredicateCount.STATICCreateStaticCount( 3, 0 ) } )
self._test_ac( 'pp bad*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, { 'pp bad' : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 2 ), 'pp good' : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 2 ) }, { 'pp good' : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 3 ) } )
self._test_ac( 'sameus aran*', self._my_service_key, CC.LOCAL_FILE_SERVICE_KEY, { 'sameus aran' : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ) }, { 'samus metroid' : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ) } )
self._test_ac( 'samus metroid*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, { 'samus metroid' : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ), 'character:samus aran' : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 1 ) }, { 'character:samus aran' : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 1 ) } )
self._test_ac( 'samus aran*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, { 'samus metroid' : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ), 'character:samus aran' : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 1 ) }, { 'character:samus aran' : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 1 ) } )
test_ac( 'mc bad*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, { 'mc bad' : ( 2, None, 0, None ), 'mc good' : ( 2, None, 0, None ) }, { 'mc good' : ( 3, None, 0, None ) } )
test_ac( 'pc bad*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, { 'pc bad' : ( 2, None, 0, None ), 'pc good' : ( 2, None, 0, None ) }, { 'pc good' : ( 3, None, 0, None ) } )
test_ac( 'pp bad*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, { 'pp bad' : ( 0, None, 2, None ), 'pp good' : ( 0, None, 2, None ) }, { 'pp good' : ( 0, None, 3, None ) } )
# here the write a/c gets funky because of all known tags. finding counts for disjoint yet now merged sibling suggestions even though not on same tag domain
# slightly odd situation, but we'll want to clear it up
# this is cleared up UI side when it does sibling_tag_id filtering based on the tag service we are pending to, but it shows that a/c fetch needs an optional sibling_tag_service_key
# this is a job for tag search context
# read a/c counts are fine
test_ac( 'sameus aran*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, { 'sameus aran' : ( 1, None, 0, None ), 'samus metroid' : ( 1, None, 0, None ) }, { 'samus metroid' : ( 1, None, 0, None ) } )
test_ac( 'samus metroid*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, { 'sameus aran' : ( 1, None, 0, None ), 'samus metroid' : ( 1, None, 0, None ), 'character:samus aran' : ( 0, None, 1, None ) }, { 'samus metroid' : ( 1, None, 0, None ), 'character:samus aran' : ( 1, None, 1, None ) } )
test_ac( 'samus aran*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, { 'samus metroid' : ( 1, None, 0, None ), 'character:samus aran' : ( 0, None, 1, None ) }, { 'samus metroid' : ( 1, None, 0, None ), 'character:samus aran' : ( 1, None, 1, None ) } )
self._test_ac( 'mc bad*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, { 'mc bad' : ClientSearch.PredicateCount.STATICCreateStaticCount( 2, 0 ), 'mc good' : ClientSearch.PredicateCount.STATICCreateStaticCount( 2, 0 ) }, { 'mc good' : ClientSearch.PredicateCount.STATICCreateStaticCount( 3, 0 ) } )
self._test_ac( 'pc bad*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, { 'pc bad' : ClientSearch.PredicateCount.STATICCreateStaticCount( 2, 0 ), 'pc good' : ClientSearch.PredicateCount.STATICCreateStaticCount( 2, 0 ) }, { 'pc good' : ClientSearch.PredicateCount.STATICCreateStaticCount( 3, 0 ) } )
self._test_ac( 'pp bad*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, { 'pp bad' : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 2 ), 'pp good' : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 2 ) }, { 'pp good' : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 3 ) } )
# the storage/write a/c used to get funky here because it was tricky to do siblings over all known tags. it merged all sibling lookups for all fetched tags
# but now it is fixed I am pretty sure! each set of positive count tag results is siblinged in a service leaf silo
# I basically just fixed these tests to the new results. it seems good in UI. this is more reason that these unit tests are way too complicated and need to be redone
self._test_ac( 'sameus aran*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, { 'sameus aran' : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ) }, { 'samus metroid' : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ) } )
self._test_ac( 'samus metroid*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, { 'sameus aran' : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ), 'samus metroid' : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ), 'character:samus aran' : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 1 ) }, { 'character:samus aran' : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 1 ), 'samus metroid' : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ) } )
self._test_ac( 'samus aran*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, { 'samus metroid' : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ), 'character:samus aran' : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 1 ) }, { 'character:samus aran' : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 1 ) } )
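# illustration only, not part of the original commit: with the sibling chains this test sets up (sameus aran -> samus metroid on 'my tags', samus metroid -> character:samus aran on the public service), the 'character:samus aran' display row above behaves as if the public service's storage counts were folded into the ideal tag with PredicateCount.AddCounts. the merge semantics are inferred from the expected values, not asserted by the original tests
samus_display = ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 1 ) # character:samus aran's own pending mapping
samus_display.AddCounts( ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ) ) # its sibling 'samus metroid' is current on the public service
# samus_display should now equal ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 1 ), matching the expected display entry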
else:
test_ac( 'mc bad*', self._my_service_key, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
test_ac( 'pc bad*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
test_ac( 'pp bad*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
test_ac( 'sameus aran*', self._my_service_key, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
test_ac( 'samus metroid*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
test_ac( 'samus aran*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
self._test_ac( 'mc bad*', self._my_service_key, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
self._test_ac( 'pc bad*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
self._test_ac( 'pp bad*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
self._test_ac( 'sameus aran*', self._my_service_key, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
self._test_ac( 'samus metroid*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
self._test_ac( 'samus aran*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
test_ac( 'mc bad*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
test_ac( 'pc bad*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
test_ac( 'pp bad*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
test_ac( 'sameus aran*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
test_ac( 'samus metroid*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
test_ac( 'samus aran*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
self._test_ac( 'mc bad*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
self._test_ac( 'pc bad*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
self._test_ac( 'pp bad*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
self._test_ac( 'sameus aran*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
self._test_ac( 'samus metroid*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
self._test_ac( 'samus aran*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
@@ -2840,50 +2824,48 @@ class TestClientDBTags( unittest.TestCase ):
self.assertEqual( hash_ids_to_tags_managers[ self._samus_good_hash_id ].GetCurrentAndPending( CC.COMBINED_TAG_SERVICE_KEY, ClientTags.TAG_DISPLAY_ACTUAL ), { 'mc good', 'process these', 'pc good', 'pp good', 'character:samus aran' } )
# now we get more write a/c suggestions, and accurate merged read a/c values
test_ac( 'mc bad*', self._my_service_key, CC.COMBINED_FILE_SERVICE_KEY, { 'mc bad' : ( 2, None, 0, None ), 'mc good' : ( 2, None, 0, None ) }, { 'mc good' : ( 3, None, 0, None ) } )
test_ac( 'pc bad*', self._public_service_key, CC.COMBINED_FILE_SERVICE_KEY, { 'pc bad' : ( 2, None, 0, None ), 'pc good' : ( 2, None, 0, None ) }, { 'pc good' : ( 3, None, 0, None ) } )
test_ac( 'pp bad*', self._public_service_key, CC.COMBINED_FILE_SERVICE_KEY, { 'pp bad' : ( 0, None, 2, None ), 'pp good' : ( 0, None, 2, None ) }, { 'pp good' : ( 0, None, 3, None ) } )
test_ac( 'sameus aran*', self._my_service_key, CC.COMBINED_FILE_SERVICE_KEY, { 'sameus aran' : ( 1, None, 0, None ) }, { 'character:samus aran' : ( 1, None, 0, None ) } )
test_ac( 'samus metroid*', self._public_service_key, CC.COMBINED_FILE_SERVICE_KEY, { 'samus metroid' : ( 1, None, 0, None ), 'character:samus aran' : ( 0, None, 1, None ) }, { 'character:samus aran' : ( 1, None, 1, None ) } )
test_ac( 'samus aran*', self._public_service_key, CC.COMBINED_FILE_SERVICE_KEY, { 'samus metroid' : ( 1, None, 0, None ), 'character:samus aran' : ( 0, None, 1, None ) }, { 'character:samus aran' : ( 1, None, 1, None ) } )
self._test_ac( 'mc bad*', self._my_service_key, CC.COMBINED_FILE_SERVICE_KEY, { 'mc bad' : ClientSearch.PredicateCount.STATICCreateStaticCount( 2, 0 ), 'mc good' : ClientSearch.PredicateCount.STATICCreateStaticCount( 2, 0 ) }, { 'mc good' : ClientSearch.PredicateCount.STATICCreateStaticCount( 3, 0 ) } )
self._test_ac( 'pc bad*', self._public_service_key, CC.COMBINED_FILE_SERVICE_KEY, { 'pc bad' : ClientSearch.PredicateCount.STATICCreateStaticCount( 2, 0 ), 'pc good' : ClientSearch.PredicateCount.STATICCreateStaticCount( 2, 0 ) }, { 'pc good' : ClientSearch.PredicateCount.STATICCreateStaticCount( 3, 0 ) } )
self._test_ac( 'pp bad*', self._public_service_key, CC.COMBINED_FILE_SERVICE_KEY, { 'pp bad' : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 2 ), 'pp good' : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 2 ) }, { 'pp good' : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 3 ) } )
self._test_ac( 'sameus aran*', self._my_service_key, CC.COMBINED_FILE_SERVICE_KEY, { 'sameus aran' : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ) }, { 'character:samus aran' : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ) } )
self._test_ac( 'samus metroid*', self._public_service_key, CC.COMBINED_FILE_SERVICE_KEY, { 'samus metroid' : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ), 'character:samus aran' : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 1 ) }, { 'character:samus aran' : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 1 ) } )
self._test_ac( 'samus aran*', self._public_service_key, CC.COMBINED_FILE_SERVICE_KEY, { 'samus metroid' : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ), 'character:samus aran' : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 1 ) }, { 'character:samus aran' : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 1 ) } )
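# reading aid, inferred from usage rather than stated in the original commit: self._test_ac appears to take ( search_text, tag_service_key, file_service_key, expected_storage_counts, expected_display_counts ), with the first dict holding the raw 'write' autocomplete rows and the second the sibling-merged 'read' rows, both keyed by tag
# e.g. the 'samus metroid*' case above expects the two raw rows to collapse into one display row:
expected_write = { 'samus metroid' : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ), 'character:samus aran' : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 1 ) }
expected_read = { 'character:samus aran' : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 1 ) }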
if on_local_files:
# same deal, just smaller file domain
test_ac( 'mc bad*', self._my_service_key, CC.LOCAL_FILE_SERVICE_KEY, { 'mc bad' : ( 2, None, 0, None ), 'mc good' : ( 2, None, 0, None ) }, { 'mc good' : ( 3, None, 0, None ) } )
test_ac( 'pc bad*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, { 'pc bad' : ( 2, None, 0, None ), 'pc good' : ( 2, None, 0, None ) }, { 'pc good' : ( 3, None, 0, None ) } )
test_ac( 'pp bad*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, { 'pp bad' : ( 0, None, 2, None ), 'pp good' : ( 0, None, 2, None ) }, { 'pp good' : ( 0, None, 3, None ) } )
test_ac( 'sameus aran*', self._my_service_key, CC.LOCAL_FILE_SERVICE_KEY, { 'sameus aran' : ( 1, None, 0, None ) }, { 'character:samus aran' : ( 1, None, 0, None ) } )
test_ac( 'samus metroid*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, { 'samus metroid' : ( 1, None, 0, None ), 'character:samus aran' : ( 0, None, 1, None ) }, { 'character:samus aran' : ( 1, None, 1, None ) } )
test_ac( 'samus aran*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, { 'samus metroid' : ( 1, None, 0, None ), 'character:samus aran' : ( 0, None, 1, None ) }, { 'character:samus aran' : ( 1, None, 1, None ) } )
self._test_ac( 'mc bad*', self._my_service_key, CC.LOCAL_FILE_SERVICE_KEY, { 'mc bad' : ClientSearch.PredicateCount.STATICCreateStaticCount( 2, 0 ), 'mc good' : ClientSearch.PredicateCount.STATICCreateStaticCount( 2, 0 ) }, { 'mc good' : ClientSearch.PredicateCount.STATICCreateStaticCount( 3, 0 ) } )
self._test_ac( 'pc bad*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, { 'pc bad' : ClientSearch.PredicateCount.STATICCreateStaticCount( 2, 0 ), 'pc good' : ClientSearch.PredicateCount.STATICCreateStaticCount( 2, 0 ) }, { 'pc good' : ClientSearch.PredicateCount.STATICCreateStaticCount( 3, 0 ) } )
self._test_ac( 'pp bad*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, { 'pp bad' : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 2 ), 'pp good' : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 2 ) }, { 'pp good' : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 3 ) } )
self._test_ac( 'sameus aran*', self._my_service_key, CC.LOCAL_FILE_SERVICE_KEY, { 'sameus aran' : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ) }, { 'character:samus aran' : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ) } )
self._test_ac( 'samus metroid*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, { 'samus metroid' : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ), 'character:samus aran' : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 1 ) }, { 'character:samus aran' : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 1 ) } )
self._test_ac( 'samus aran*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, { 'samus metroid' : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ), 'character:samus aran' : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 1 ) }, { 'character:samus aran' : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 1 ) } )
test_ac( 'mc bad*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, { 'mc bad' : ( 2, None, 0, None ), 'mc good' : ( 2, None, 0, None ) }, { 'mc good' : ( 3, None, 0, None ) } )
test_ac( 'pc bad*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, { 'pc bad' : ( 2, None, 0, None ), 'pc good' : ( 2, None, 0, None ) }, { 'pc good' : ( 3, None, 0, None ) } )
test_ac( 'pp bad*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, { 'pp bad' : ( 0, None, 2, None ), 'pp good' : ( 0, None, 2, None ) }, { 'pp good' : ( 0, None, 3, None ) } )
# here the write a/c gets funky because of all known tags. it finds counts for disjoint but now-merged sibling suggestions even though they are not on the same tag domain
# slightly odd situation, but we'll want to clear it up
# this is cleared up UI side when it does sibling_tag_id filtering based on the tag service we are pending to, but it shows that a/c fetch needs an optional sibling_tag_service_key
# this is a job for tag search context
# read a/c counts are fine
test_ac( 'sameus aran*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, { 'sameus aran' : ( 1, None, 0, None ), 'samus metroid' : ( 1, None, 0, None ), 'character:samus aran' : ( 0, None, 1, None ) }, { 'character:samus aran' : ( 1, 2, 1, None ) } )
test_ac( 'samus metroid*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, { 'sameus aran' : ( 1, None, 0, None ), 'samus metroid' : ( 1, None, 0, None ), 'character:samus aran' : ( 0, None, 1, None ) }, { 'character:samus aran' : ( 1, 2, 1, None ) } )
test_ac( 'samus aran*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, { 'sameus aran' : ( 1, None, 0, None ), 'samus metroid' : ( 1, None, 0, None ), 'character:samus aran' : ( 0, None, 1, None ) }, { 'character:samus aran' : ( 1, 2, 1, None ) } )
self._test_ac( 'mc bad*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, { 'mc bad' : ClientSearch.PredicateCount.STATICCreateStaticCount( 2, 0 ), 'mc good' : ClientSearch.PredicateCount.STATICCreateStaticCount( 2, 0 ) }, { 'mc good' : ClientSearch.PredicateCount.STATICCreateStaticCount( 3, 0 ) } )
self._test_ac( 'pc bad*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, { 'pc bad' : ClientSearch.PredicateCount.STATICCreateStaticCount( 2, 0 ), 'pc good' : ClientSearch.PredicateCount.STATICCreateStaticCount( 2, 0 ) }, { 'pc good' : ClientSearch.PredicateCount.STATICCreateStaticCount( 3, 0 ) } )
self._test_ac( 'pp bad*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, { 'pp bad' : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 2 ), 'pp good' : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 2 ) }, { 'pp good' : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 3 ) } )
# the storage/write a/c used to get funky here because it was tricky to do siblings over all known tags. it merged all sibling lookups for all fetched tags
# but now it is fixed I am pretty sure! each set of positive count tag results is siblinged in a service leaf silo
# I basically just fixed these tests to the new results. it seems good in UI. this is more reason that these unit tests are way too complicated and need to be redone
self._test_ac( 'sameus aran*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, { 'sameus aran' : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ), 'samus metroid' : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ), 'character:samus aran' : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 1 ) }, { 'character:samus aran' : ClientSearch.PredicateCount( 1, 1, 2, 1 ) } )
self._test_ac( 'samus metroid*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, { 'sameus aran' : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ), 'samus metroid' : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ), 'character:samus aran' : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 1 ) }, { 'character:samus aran' : ClientSearch.PredicateCount( 1, 1, 2, 1 ) } )
self._test_ac( 'samus aran*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, { 'sameus aran' : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ), 'samus metroid' : ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 ), 'character:samus aran' : ClientSearch.PredicateCount.STATICCreateStaticCount( 0, 1 ) }, { 'character:samus aran' : ClientSearch.PredicateCount( 1, 1, 2, 1 ) } )
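# illustration only, not part of the original commit: the PredicateCount( 1, 1, 2, 1 ) expectation above is a current-count range. two separate current rows each counting 1 collapse into character:samus aran, and without cross-referencing the underlying files the merged current count can only be bounded between 1 and 2, while the pending count stays exactly 1. under the count arithmetic assumed here that is:
merged = ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 0 )
merged.AddCounts( ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 1 ) )
# merged should now equal ClientSearch.PredicateCount( 1, 1, 2, 1 ), i.e. '(1-2) (+1)' per GetSuffixString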
else:
test_ac( 'mc bad*', self._my_service_key, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
test_ac( 'pc bad*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
test_ac( 'pp bad*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
test_ac( 'sameus aran*', self._my_service_key, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
test_ac( 'samus metroid*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
test_ac( 'samus aran*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
self._test_ac( 'mc bad*', self._my_service_key, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
self._test_ac( 'pc bad*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
self._test_ac( 'pp bad*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
self._test_ac( 'sameus aran*', self._my_service_key, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
self._test_ac( 'samus metroid*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
self._test_ac( 'samus aran*', self._public_service_key, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
test_ac( 'mc bad*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
test_ac( 'pc bad*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
test_ac( 'pp bad*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
test_ac( 'sameus aran*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
test_ac( 'samus metroid*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
test_ac( 'samus aran*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
self._test_ac( 'mc bad*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
self._test_ac( 'pc bad*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
self._test_ac( 'pp bad*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
self._test_ac( 'sameus aran*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
self._test_ac( 'samus metroid*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
self._test_ac( 'samus aran*', CC.COMBINED_TAG_SERVICE_KEY, CC.LOCAL_FILE_SERVICE_KEY, {}, {} )
@@ -3031,19 +3013,19 @@ class TestTagParents( unittest.TestCase ):
predicates = []
predicates.append( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, 'grandmother', min_current_count = 10 ) )
predicates.append( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, 'grandfather', min_current_count = 15 ) )
predicates.append( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, 'not_exist', min_current_count = 20 ) )
predicates.append( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, 'grandmother', ClientSearch.PredicateCount.STATICCreateCurrentCount( 10 ) ) )
predicates.append( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, 'grandfather', ClientSearch.PredicateCount.STATICCreateCurrentCount( 15 ) ) )
predicates.append( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, 'not_exist', ClientSearch.PredicateCount.STATICCreateCurrentCount( 20 ) ) )
self.assertEqual( self._tag_parents_manager.ExpandPredicates( CC.COMBINED_TAG_SERVICE_KEY, predicates ), predicates )
predicates = []
predicates.append( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, 'child', min_current_count = 10 ) )
predicates.append( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, 'child', ClientSearch.PredicateCount.STATICCreateCurrentCount( 10 ) ) )
results = []
results.append( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, 'child', min_current_count = 10 ) )
results.append( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, 'child', ClientSearch.PredicateCount.STATICCreateCurrentCount( 10 ) ) )
results.append( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_PARENT, 'mother' ) )
results.append( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_PARENT, 'father' ) )
results.append( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_PARENT, 'grandmother' ) )
@@ -3054,18 +3036,18 @@ class TestTagParents( unittest.TestCase ):
predicates = []
predicates.append( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_NAMESPACE, 'series' ) )
predicates.append( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, 'child', min_current_count = 10 ) )
predicates.append( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, 'cousin', min_current_count = 5 ) )
predicates.append( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, 'child', ClientSearch.PredicateCount.STATICCreateCurrentCount( 10 ) ) )
predicates.append( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, 'cousin', ClientSearch.PredicateCount.STATICCreateCurrentCount( 5 ) ) )
results = []
results.append( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_NAMESPACE, 'series' ) )
results.append( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, 'child', min_current_count = 10 ) )
results.append( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, 'child', ClientSearch.PredicateCount.STATICCreateCurrentCount( 10 ) ) )
results.append( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_PARENT, 'mother' ) )
results.append( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_PARENT, 'father' ) )
results.append( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_PARENT, 'grandmother' ) )
results.append( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_PARENT, 'grandfather' ) )
results.append( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, 'cousin', min_current_count = 5 ) )
results.append( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, 'cousin', ClientSearch.PredicateCount.STATICCreateCurrentCount( 5 ) ) )
results.append( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_PARENT, 'aunt' ) )
results.append( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_PARENT, 'uncle' ) )
results.append( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_PARENT, 'grandmother' ) )

View File

@@ -1570,6 +1570,115 @@ class TestTagObjects( unittest.TestCase ):
self.assertEqual( set( predicate_results_cache.FilterPredicates( CC.COMBINED_TAG_SERVICE_KEY, 'samus aran*' ) ), { samus_aran, character_samus_aran } )
def test_predicate_counts( self ):
# quick test for counts and __hash__
p_c = ClientSearch.PredicateCount( 1, 2, 3, 4 )
self.assertEqual( p_c.min_current_count, 1 )
self.assertEqual( p_c.min_pending_count, 2 )
self.assertEqual( p_c.max_current_count, 3 )
self.assertEqual( p_c.max_pending_count, 4 )
self.assertNotEqual( p_c, ClientSearch.PredicateCount( 1, 2, 3, 5 ) )
self.assertNotEqual( p_c, ClientSearch.PredicateCount( 1, 5, 3, 4 ) )
self.assertEqual( p_c, ClientSearch.PredicateCount( 1, 2, 3, 4 ) )
#
null = ClientSearch.PredicateCount.STATICCreateNullCount()
self.assertEqual( null, ClientSearch.PredicateCount( 0, 0, 0, 0 ) )
self.assertEqual( null.GetMinCount(), 0 )
self.assertEqual( null.GetMinCount( HC.CONTENT_STATUS_CURRENT ), 0 )
self.assertEqual( null.GetMinCount( HC.CONTENT_STATUS_PENDING ), 0 )
self.assertEqual( null.HasZeroCount(), True )
self.assertEqual( null.HasNonZeroCount(), False )
self.assertEqual( null.GetSuffixString(), '' )
#
p_c = ClientSearch.PredicateCount( 3, 0, 3, 0 )
self.assertEqual( p_c, ClientSearch.PredicateCount( 3, 0, 3, 0 ) )
self.assertEqual( p_c.GetMinCount(), 3 )
self.assertEqual( p_c.GetMinCount( HC.CONTENT_STATUS_CURRENT ), 3 )
self.assertEqual( p_c.GetMinCount( HC.CONTENT_STATUS_PENDING ), 0 )
self.assertEqual( p_c.HasZeroCount(), False )
self.assertEqual( p_c.HasNonZeroCount(), True )
self.assertEqual( p_c.GetSuffixString(), '(3)' )
#
p_c = ClientSearch.PredicateCount( 0, 5, 0, 5 )
self.assertEqual( p_c, ClientSearch.PredicateCount( 0, 5, 0, 5 ) )
self.assertEqual( p_c.GetMinCount(), 5 )
self.assertEqual( p_c.GetMinCount( HC.CONTENT_STATUS_CURRENT ), 0 )
self.assertEqual( p_c.GetMinCount( HC.CONTENT_STATUS_PENDING ), 5 )
self.assertEqual( p_c.HasZeroCount(), False )
self.assertEqual( p_c.HasNonZeroCount(), True )
self.assertEqual( p_c.GetSuffixString(), '(+5)' )
#
p_c = ClientSearch.PredicateCount( 100, 0, 150, 0 )
self.assertEqual( p_c, ClientSearch.PredicateCount( 100, 0, 150, 0 ) )
self.assertEqual( p_c.GetMinCount(), 100 )
self.assertEqual( p_c.GetMinCount( HC.CONTENT_STATUS_CURRENT ), 100 )
self.assertEqual( p_c.GetMinCount( HC.CONTENT_STATUS_PENDING ), 0 )
self.assertEqual( p_c.HasZeroCount(), False )
self.assertEqual( p_c.HasNonZeroCount(), True )
self.assertEqual( p_c.GetSuffixString(), '(100-150)' )
#
p_c = ClientSearch.PredicateCount( 0, 80, 0, 85 )
self.assertEqual( p_c, ClientSearch.PredicateCount( 0, 80, 0, 85 ) )
self.assertEqual( p_c.GetMinCount(), 80 )
self.assertEqual( p_c.GetMinCount( HC.CONTENT_STATUS_CURRENT ), 0 )
self.assertEqual( p_c.GetMinCount( HC.CONTENT_STATUS_PENDING ), 80 )
self.assertEqual( p_c.HasZeroCount(), False )
self.assertEqual( p_c.HasNonZeroCount(), True )
self.assertEqual( p_c.GetSuffixString(), '(+80-85)' )
#
p_c = ClientSearch.PredicateCount( 0, 0, 1500, 0 )
self.assertEqual( p_c, ClientSearch.PredicateCount( 0, 0, 1500, 0 ) )
self.assertEqual( p_c.GetMinCount(), 0 )
self.assertEqual( p_c.GetMinCount( HC.CONTENT_STATUS_CURRENT ), 0 )
self.assertEqual( p_c.GetMinCount( HC.CONTENT_STATUS_PENDING ), 0 )
self.assertEqual( p_c.HasZeroCount(), False )
self.assertEqual( p_c.HasNonZeroCount(), True )
self.assertEqual( p_c.GetSuffixString(), '(0-1,500)' )
#
p_c = ClientSearch.PredicateCount( 1, 2, 3, 4 )
self.assertEqual( p_c, ClientSearch.PredicateCount( 1, 2, 3, 4 ) )
self.assertEqual( p_c.GetMinCount(), 3 )
self.assertEqual( p_c.GetMinCount( HC.CONTENT_STATUS_CURRENT ), 1 )
self.assertEqual( p_c.GetMinCount( HC.CONTENT_STATUS_PENDING ), 2 )
self.assertEqual( p_c.HasZeroCount(), False )
self.assertEqual( p_c.HasNonZeroCount(), True )
self.assertEqual( p_c.GetSuffixString(), '(1-3) (+2-4)' )
#
p_c_1 = ClientSearch.PredicateCount( 10, 2, 12, 4 )
p_c_2 = ClientSearch.PredicateCount( 1, 0, 2, 4 )
p_c_1.AddCounts( p_c_2 )
self.assertEqual( p_c_1, ClientSearch.PredicateCount( 10, 2, 14, 8 ) )
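# for orientation, inferred from how the tests in this commit use them rather than asserted directly: the STATICCreate helpers look like shorthand for exact, zero-width ranges
# ClientSearch.PredicateCount.STATICCreateStaticCount( c, p ) appears to equal ClientSearch.PredicateCount( c, p, c, p )
# ClientSearch.PredicateCount.STATICCreateCurrentCount( c ) appears to equal ClientSearch.PredicateCount( c, 0, c, 0 )
# ClientSearch.PredicateCount.STATICCreateNullCount() equals ClientSearch.PredicateCount( 0, 0, 0, 0 ), as asserted at the top of this test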
def test_predicate_strings_and_namespaces( self ):
render_for_user = False
@@ -1580,7 +1689,7 @@ class TestTagObjects( unittest.TestCase ):
self.assertEqual( p.GetNamespace(), '' )
self.assertEqual( p.GetTextsAndNamespaces( render_for_user ), [ ( p.ToString(), p.GetNamespace() ) ] )
p = ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, 'tag', min_current_count = 1, min_pending_count = 2 )
p = ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, 'tag', True, count = ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 2 ) )
self.assertEqual( p.ToString( with_count = False ), 'tag' )
self.assertEqual( p.ToString( with_count = True ), 'tag (1) (+2)' )
@@ -1593,7 +1702,7 @@ class TestTagObjects( unittest.TestCase ):
self.assertEqual( p.GetNamespace(), '' )
self.assertEqual( p.GetTextsAndNamespaces( render_for_user ), [ ( p.ToString(), p.GetNamespace() ) ] )
p = ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, 'tag', False, 1, 2 )
p = ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, 'tag', False, count = ClientSearch.PredicateCount.STATICCreateStaticCount( 1, 2 ) )
self.assertEqual( p.ToString( with_count = False ), '-tag' )
self.assertEqual( p.ToString( with_count = True ), '-tag (1) (+2)' )
@@ -1620,7 +1729,7 @@ class TestTagObjects( unittest.TestCase ):
self.assertEqual( p.GetNamespace(), 'system' )
self.assertEqual( p.GetTextsAndNamespaces( render_for_user ), [ ( p.ToString(), p.GetNamespace() ) ] )
p = ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_ARCHIVE, min_current_count = 1000 )
p = ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_ARCHIVE, count = ClientSearch.PredicateCount.STATICCreateCurrentCount( 1000 ) )
self.assertEqual( p.ToString(), 'system:archive (1,000)' )
self.assertEqual( p.GetNamespace(), 'system' )
@@ -1632,7 +1741,7 @@ class TestTagObjects( unittest.TestCase ):
self.assertEqual( p.GetNamespace(), 'system' )
self.assertEqual( p.GetTextsAndNamespaces( render_for_user ), [ ( p.ToString(), p.GetNamespace() ) ] )
p = ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_EVERYTHING, min_current_count = 2000 )
p = ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_EVERYTHING, count = ClientSearch.PredicateCount.STATICCreateCurrentCount( 2000 ) )
self.assertEqual( p.ToString(), 'system:everything (2,000)' )
self.assertEqual( p.GetNamespace(), 'system' )
@@ -1698,7 +1807,7 @@ class TestTagObjects( unittest.TestCase ):
self.assertEqual( p.GetNamespace(), 'system' )
self.assertEqual( p.GetTextsAndNamespaces( render_for_user ), [ ( p.ToString(), p.GetNamespace() ) ] )
p = ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_INBOX, min_current_count = 1000 )
p = ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_INBOX, count = ClientSearch.PredicateCount.STATICCreateCurrentCount( 1000 ) )
self.assertEqual( p.ToString(), 'system:inbox (1,000)' )
self.assertEqual( p.GetNamespace(), 'system' )
@@ -1710,7 +1819,7 @@ class TestTagObjects( unittest.TestCase ):
self.assertEqual( p.GetNamespace(), 'system' )
self.assertEqual( p.GetTextsAndNamespaces( render_for_user ), [ ( p.ToString(), p.GetNamespace() ) ] )
p = ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_LOCAL, min_current_count = 100 )
p = ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_LOCAL, count = ClientSearch.PredicateCount.STATICCreateCurrentCount( 100 ) )
self.assertEqual( p.ToString(), 'system:local (100)' )
self.assertEqual( p.GetNamespace(), 'system' )
@@ -1734,7 +1843,7 @@ class TestTagObjects( unittest.TestCase ):
self.assertEqual( p.GetNamespace(), 'system' )
self.assertEqual( p.GetTextsAndNamespaces( render_for_user ), [ ( p.ToString(), p.GetNamespace() ) ] )
p = ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_NOT_LOCAL, min_current_count = 100 )
p = ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_NOT_LOCAL, count = ClientSearch.PredicateCount.STATICCreateCurrentCount( 100 ) )
self.assertEqual( p.ToString(), 'system:not local (100)' )
self.assertEqual( p.GetNamespace(), 'system' )

View File

@@ -108,7 +108,7 @@ class TestServer( unittest.TestCase ):
for path in ( cls._ssl_cert_path, cls._ssl_key_path ):
HydrusPaths.MakeFileWriteable( path )
HydrusPaths.TryToMakeFileWriteable( path )
os.unlink( path )
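# a minimal sketch of what a helper like TryToMakeFileWriteable might do, assuming it asks the OS about writability and, failing that, tries to add the user-write bit; this is illustrative only and not the actual HydrusPaths implementation
import os
import stat

def try_to_make_file_writeable( path ):
    
    # let the OS negotiate user/group permissions rather than inspecting mode bits ourselves
    if os.access( path, os.W_OK ):
        
        return True
        
    
    try:
        
        os.chmod( path, os.stat( path ).st_mode | stat.S_IWUSR )
        
    except OSError:
        
        return False
        
    
    return os.access( path, os.W_OK )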