Version 487

closes #1155
This commit is contained in:
Hydrus Network Developer 2022-06-01 16:19:26 -05:00
parent 5db6daa416
commit a39462c4bc
17 changed files with 612 additions and 165 deletions

View File

@ -3,6 +3,30 @@
!!! note
This is the new changelog, only the most recent builds. For all versions, see the [old changelog](old_changelog.html).
## [Version 487](https://github.com/hydrusnetwork/hydrus/releases/tag/v487)
### misc
* updated the duplicate filter 'show next pair' logic again, mostly simplification and merging of decision making. it _should_ be even more resistant to weird problems at the end of batches, particularly if you have deleted files manually
* a new button on the duplicate filter right hover window now appends the current pair to the parent duplicate media page (for if you want to do more processing to them later)
* if you manually delete a file in the duplicate filter, if that file appears again in the current batch of pairs, those will be auto-skipped
* if you manually delete a file in the duplicate filter, the actual delete is now deferred to when you commit the batch! it is also undone if you go back!
* fixed a bug when editing the external program launch paths in the options
* fixed an annoying delay-and-error-popup when clearing the separator field when editing a String Splitter. now the field just turns red and vetoes an OK with a nicer error text
* also improved how string splitters report actual split errors
* if you are in advanced mode, the _review services_ panels now have an 'id' button that lets you fetch the database service id
* wrote a new database maintenance routine under _database->check and repair->resync tag mappings cache files_, which is a lightweight way of fixing ghost files or situations where files with a tag are neither counted nor appear in file results. this fixes these problems in a couple of minutes, so for this it is much better than a full regen of the cache (a sketch of the underlying check is below, after this list)
### cleanup and other boring stuff
* the archive/delete filter now says which file domain it will be deleting from
* if an archive/delete filter is launched on a 'multiple locations' file domain, it is now careful to only make a delete record for each deleted file in the file services that actually contain it
* renamed the 'default local file search location' option to 'fallback' and updated its tooltip a bit. this was really a hacky thing I needed to fill some gaps while rewriting from 'my files' to multiple local file services. the whole thing needs more attention to become more useful. I also fixed an issue where it could become invalid 'nothing' if you deleted a file service it was referring to (issue #1155)
* I think I fixed a rare 'did not find info for that file' style problem when highlighting some watchers/downloaders
* I think I have silenced some unhelpful BeautifulSoup (html parser) warnings that were spamming to the log in some situations
* updated last week's big update to work with TRUNCATE journalling mode. I will be doing this for other big updates going forwards, since multi-GB WAL transactions cause problems for some users
* last week's update also gives a time estimate in its pre-popup, based on 60k files per minute
* removed some old database cache data that wasn't cleared in a previous update
* a variety of misc UI text fixes and cleanup
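
As a rough illustration of the resync check described in the list above, here is a minimal sketch of the two set comparisons involved. The names here are illustrative, not the real module API:

```python
# hypothetical sketch: the two set-difference checks behind 'resync tag mappings cache files'
def find_cache_problems( cache_hash_ids, service_hash_ids, hash_ids_with_tags ):
    
    # surplus: the cache lists a file the file service no longer contains -> remove from cache
    surplus = cache_hash_ids - service_hash_ids
    
    # missing: the service contains a tagged file the cache never picked up -> add to cache
    missing = ( service_hash_ids - cache_hash_ids ) & hash_ids_with_tags
    
    return ( surplus, missing )

( surplus, missing ) = find_cache_problems( { 1, 2, 3 }, { 2, 3, 4 }, { 4 } )

assert surplus == { 1 } and missing == { 4 }
```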
## [Version 486](https://github.com/hydrusnetwork/hydrus/releases/tag/v486)
* **This week's release is for advanced users only! I make a big change, and I want to make sure the update is fast and there are no unusual problems before rolling it out to all users.**
@ -334,27 +358,3 @@
* I integrated the guy's unit tests for the new notes support into the main hydrus test suite
* the client api version is now 27
* I added links to the client api help to this new list of hydrus-related projects on github, which was helpfully compiled by another user: https://github.com/stars/hydrusnetwork/lists/hydrus-related-projects
## [Version 476](https://github.com/hydrusnetwork/hydrus/releases/tag/v476)
### domain modified times
* the downloader now saves the 'source time' (or, if none was parsed, 'creation time') for each file import object to the database when a file import is completed. separate timestamps are tracked for every domain you download from, and a file's timestamp can update to an earlier time if a new one comes in for that domain
* I overhauled how hydrus stores timestamps in each media object and added these domain timestamps to it. now, when you see 'modified time', it is the minimum of the file modified time and all recorded domain modified times (a small sketch of this aggregation is at the end of this version's notes). this aggregated modified time works for sorting in the UI and when sorting before applying system:limit, and it also works for system:modified time search. the search may be slow in some situations--let me know
* I also added the very recent 'archived' timestamps into this new object and added sort for archived time too. 'archived 3 minutes ago' style text will appear in thumbnail right-click menus and the media viewer top status text
* in future, I will add search for archive time; more display, search, and sort for modified time (for specific domains); and also figure out a dialog so you can manually edit these timestamps in case of problems
* I also expect to write an optional 'fill in dummy data' routine for the archived timestamps for files archived before I started tracking these timestamps. something like 'for all archived files, put in an archive time 20% between import time and now', but maybe there is a better way of doing it, let me know if you have any ideas. we'll only get one shot at this, so maybe we can do a better estimate with closer analysis
* in the longer future, I expect import/export support for this data and maintenance routines to retroactively populate the domain data based on hitting up known urls again, so all us long-time users can backfill in nicer post times for all our downloaded files
### searching tags on client api
* a user has helped me out by writing autocomplete tag search for the client api, under /add_tags/search_tags. I normally do not accept pull requests like this, but the guy did a great job and I have not been able to fit this in myself despite wanting it a lot
* I added some bells and whistles--py 3.8 support, tag sorting, filtering results according to any api permissions, and some unit tests
* at the moment, it searches the 'storage' domain that you see in a manage tags dialog, i.e. without siblings collapsed. I can and will expand it to support more options in future. please give it a go and let me know what you think (a sketch of a call is just below)
* client api version is now 26
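
For illustration, a call to the new endpoint might look like the following. The exact parameter and response key names here are assumptions, so check the client api help for the real contract:

```python
import requests

# hypothetical sketch of hitting the new autocomplete endpoint
response = requests.get(
    'http://127.0.0.1:45869/add_tags/search_tags',
    params = { 'search' : 'samu' },
    headers = { 'Hydrus-Client-API-Access-Key' : 'YOUR_ACCESS_KEY_HERE' }
)

print( response.json() ) # presumably something like { 'tags' : [ { 'value' : 'character:samus aran', 'count' : 123 } ] }
```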
### misc
* when you edit something in a multi-column list, I think I have updated every single one so the selection is preserved through the edit. annoyingly and confusingly on most of the old lists, for instance subscriptions, the 'ghost' of the selection focus would bump up one position after an edit. now it should stay the same even if you rename etc... and if you have multiple selected/edited
* I _think_ I fixed a bug in the selected files taglist where, in some combination of changing the tag service of the page and then loading up a favourite search, the taglist could get stuck on the previous tag domain. typically this would look as if the page's taglist had nothing in it no matter what files were selected
* if you set some files as 'alternates' when they are already 'duplicates', this now works (previously it did nothing). the non-kings of the group will be extracted from the duplicate group and applied as new alts
* added a 'BUGFIX' checkbox to 'gui pages' options page that forces a 'hide page' signal to the current page when creating a new page. we'll see if this patches a weird error or if more work is needed
* added some protections against viewing files when the image/video file has (incorrectly) 0 width or height
* added support for viewing non-image/video files in the duplicate filter. there are advanced ways to get unusual files in here, and until now a pdf or something would throw an error about having 0 width
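
To make the modified-time aggregation above concrete, here is a minimal sketch of the 'minimum of all known times' rule, with illustrative names only:

```python
# hypothetical sketch of the aggregated modified time: the earliest of the file's own
# modified time and every recorded domain modified time
def get_aggregated_modified_time( file_modified_timestamp, domain_timestamps ):
    
    return min( [ file_modified_timestamp ] + list( domain_timestamps.values() ) )

timestamps = { 'safebooru.org' : 1640000000, 'gelbooru.com' : 1650000000 }

print( get_aggregated_modified_time( 1655000000, timestamps ) ) # 1640000000, the earliest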

View File

@ -33,6 +33,30 @@
<div class="content">
<h3 id="changelog"><a href="#changelog">changelog</a></h3>
<ul>
<li><h3 id="version_487"><a href="#version_487">version 487</a></h3></li>
<ul>
<li>misc:</li>
<li>updated the duplicate filter 'show next pair' logic again, mostly simplification and merging of decision making. it _should_ be even more resistant to weird problems at the end of batches, particularly if you have deleted files manually</li>
<li>a new button on the duplicate filter right hover window now appends the current pair to the parent duplicate media page (for if you want to do more processing to them later)</li>
<li>if you manually delete a file in the duplicate filter, if that file appears again in the current batch of pairs, those will be auto-skipped</li>
<li>if you manually delete a file in the duplicate filter, the actual delete is now deferred to when you commit the batch! it is also undone if you go back!</li>
<li>fixed a bug when editing the external program launch paths in the options</li>
<li>fixed an annoying delay-and-error-popup when clearing the separator field when editing a String Splitter. now the field just turns red and vetoes an OK with a nicer error text</li>
<li>also improved how string splitters report actual split errors</li>
<li>if you are in advanced mode, the _review services_ panels now have an 'id' button that lets you fetch the database service id</li>
<li>wrote a new database maintenance routine under _database->check and repair->resync tag mappings cache files_, which is a lightweight way of fixing ghost files or situations where files with a tag are neither counted nor appear in file results. this fixes these problems in a couple of minutes, so for this it is much better than a full regen of the cache</li>
<li>.</li>
<li>cleanup and other boring stuff:</li>
<li>the archive/delete filter now says which file domain it will be deleting from</li>
<li>if an archive/delete filter is launched on a 'multiple locations' file domain, it is now careful to only make a delete record for each deleted file in the file services that actually contain it</li>
<li>renamed the 'default local file search location' option to 'fallback' and updated its tooltip a bit. this was really a hacky thing I needed to fill some gaps while rewriting from 'my files' to multiple local file services. the whole thing needs more attention to become more useful. I also fixed an issue where it could become invalid 'nothing' if you deleted a file service it was referring to (issue #1155)</li>
<li>I think I fixed a rare 'did not find info for that file' style problem when highlighting some watchers/downloaders</li>
<li>I think I have silenced some unhelpful BeautifulSoup (html parser) warnings that were spamming to the log in some situations</li>
<li>updated last week's big update to work with TRUNCATE journalling mode. I will be doing this for other big updates going forwards, since multi-GB WAL transactions cause problems for some users</li>
<li>last week's update also gives a time estimate in its pre-popup, based on 60k files per minute</li>
<li>removed some old database cache data that wasn't cleared in a previous update</li>
<li>a variety of misc UI text fixes and cleanup</li>
</ul>
<li><h3 id="version_486"><a href="#version_486">version 486</a></h3></li>
<ul>
<li>**This week's release is for advanced users only! I make a big change, and I want to make sure the update is fast and there are no unusual problems before rolling it out to all users.**</li>

View File

@ -971,7 +971,25 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):
with self._lock:
return self._dictionary[ 'default_local_location_context' ]
location_context = self._dictionary[ 'default_local_location_context' ]
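# the saved context may still reference services that have since been deleted, so repair it and, if it empties out entirely, fall back to 'all my files'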
try:
location_context.FixMissingServices( HG.client_controller.services_manager.FilterValidServiceKeys )
if location_context.IsEmpty():
from hydrus.client import ClientLocation
location_context = ClientLocation.LocationContext.STATICCreateSimple( CC.COMBINED_LOCAL_MEDIA_SERVICE_KEY )
except:
pass
return location_context

View File

@ -6,6 +6,7 @@ import os
import re
import time
import urllib.parse
import warnings
from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusData
@ -363,7 +364,14 @@ def GetSoup( html ):
raise HydrusExceptions.ParseException( message )
return bs4.BeautifulSoup( html, parser )
with warnings.catch_warnings():
# bs4 goes bananas with MarkupResemblesLocatorWarning warnings to the log at times, basically when you throw something that looks like a file at it, which I presume sometimes means stuff like '/'
warnings.simplefilter( 'ignore' )
return bs4.BeautifulSoup( html, parser )
def GetTagsFromParseResults( results ):

View File

@ -1014,13 +1014,20 @@ class StringSplitter( StringProcessingStep ):
raise HydrusExceptions.StringSplitterException( 'Got a bytes value in a string splitter!' )
-if self._max_splits is None:
-    results = text.split( self._separator )
-else:
-    results = text.split( self._separator, self._max_splits )
+try:
+    if self._max_splits is None:
+        results = text.split( self._separator )
+    else:
+        results = text.split( self._separator, self._max_splits )
+except Exception as e:
+    raise HydrusExceptions.StringSplitterException( 'Problem when splitting text: {}'.format( e ) )
return [ result for result in results if result != '' ]

View File

@ -2165,9 +2165,14 @@ class DB( HydrusDB.HydrusDB ):
def _DuplicatesSetDuplicatePairStatus( self, pair_info ):
-for ( duplicate_type, hash_a, hash_b, service_keys_to_content_updates ) in pair_info:
-    if len( service_keys_to_content_updates ) > 0:
+for ( duplicate_type, hash_a, hash_b, list_of_service_keys_to_content_updates ) in pair_info:
+    if isinstance( list_of_service_keys_to_content_updates, dict ):
+        list_of_service_keys_to_content_updates = [ list_of_service_keys_to_content_updates ]
+    for service_keys_to_content_updates in list_of_service_keys_to_content_updates:
        self._ProcessContentUpdates( service_keys_to_content_updates )
@ -7990,6 +7995,7 @@ class DB( HydrusDB.HydrusDB ):
elif action == 'service_directories': result = self._GetServiceDirectoriesInfo( *args, **kwargs )
elif action == 'service_filenames': result = self._GetServiceFilenames( *args, **kwargs )
elif action == 'service_info': result = self._GetServiceInfo( *args, **kwargs )
elif action == 'service_id': result = self.modules_services.GetServiceId( *args, **kwargs )
elif action == 'services': result = self.modules_services.GetServices( *args, **kwargs )
elif action == 'similar_files_maintenance_status': result = self.modules_similar_files.GetMaintenanceStatus( *args, **kwargs )
elif action == 'related_tags': result = self._GetRelatedTags( *args, **kwargs )
@ -9472,6 +9478,116 @@ class DB( HydrusDB.HydrusDB ):
def _ResyncTagMappingsCacheFiles( self, tag_service_key = None ):
job_key = ClientThreading.JobKey( cancellable = True )
try:
job_key.SetStatusTitle( 'resyncing tag mappings cache files' )
self._controller.pub( 'modal_message', job_key )
if tag_service_key is None:
tag_service_ids = self.modules_services.GetServiceIds( HC.REAL_TAG_SERVICES )
else:
tag_service_ids = ( self.modules_services.GetServiceId( tag_service_key ), )
problems_found = False
file_service_ids = self.modules_services.GetServiceIds( HC.FILE_SERVICES_WITH_SPECIFIC_MAPPING_CACHES )
for file_service_id in file_service_ids:
file_service_key = self.modules_services.GetServiceKey( file_service_id )
location_context = ClientLocation.LocationContext.STATICCreateSimple( file_service_key )
for tag_service_id in tag_service_ids:
message = 'resyncing caches for {}_{}'.format( file_service_id, tag_service_id )
job_key.SetVariable( 'popup_text_1', message )
self._controller.frame_splash_status.SetSubtext( message )
if job_key.IsCancelled():
break
( cache_current_mappings_table_name, cache_deleted_mappings_table_name, cache_pending_mappings_table_name ) = ClientDBMappingsStorage.GenerateSpecificMappingsCacheTableNames( file_service_id, tag_service_id )
hash_ids_in_this_cache = self._STS( self._Execute( 'SELECT DISTINCT hash_id FROM {};'.format( cache_current_mappings_table_name ) ) )
hash_ids_in_this_cache.update( self._STL( self._Execute( 'SELECT DISTINCT hash_id FROM {};'.format( cache_pending_mappings_table_name ) ) ) )
hash_ids_in_this_cache_and_in_file_service = self.modules_files_storage.FilterHashIds( location_context, hash_ids_in_this_cache )
# for every file in cache, if it is not in current files, remove it
hash_ids_in_this_cache_but_not_in_file_service = hash_ids_in_this_cache.difference( hash_ids_in_this_cache_and_in_file_service )
if len( hash_ids_in_this_cache_but_not_in_file_service ) > 0:
problems_found = True
HydrusData.ShowText( '{} surplus files in {}_{}!'.format( HydrusData.ToHumanInt( len( hash_ids_in_this_cache_but_not_in_file_service ) ), file_service_id, tag_service_id ) )
with self._MakeTemporaryIntegerTable( hash_ids_in_this_cache_but_not_in_file_service, 'hash_id' ) as temp_hash_id_table_name:
self.modules_mappings_cache_specific_storage.DeleteFiles( file_service_id, tag_service_id, hash_ids_in_this_cache_but_not_in_file_service, temp_hash_id_table_name )
# for every file in current files, if it is not in cache, add it
hash_ids_in_file_service = set( self.modules_files_storage.GetCurrentHashIdsList( file_service_id ) )
hash_ids_in_file_service_and_not_in_cache = hash_ids_in_file_service.difference( hash_ids_in_this_cache )
( current_mappings_table_name, deleted_mappings_table_name, pending_mappings_table_name, petitioned_mappings_table_name ) = ClientDBMappingsStorage.GenerateMappingsTableNames( tag_service_id )
with self._MakeTemporaryIntegerTable( hash_ids_in_file_service_and_not_in_cache, 'hash_id' ) as temp_hash_id_table_name:
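# the EXISTS keeps this to a quick index probe per hash--we only want files that have at least one mapping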
hash_ids_in_file_service_and_not_in_cache_that_have_tags = self._STS( self._Execute( 'SELECT hash_id FROM {} WHERE EXISTS ( SELECT 1 FROM {} WHERE {}.hash_id = {}.hash_id );'.format( temp_hash_id_table_name, current_mappings_table_name, current_mappings_table_name, temp_hash_id_table_name ) ) )
hash_ids_in_file_service_and_not_in_cache_that_have_tags.update( self._STL( self._Execute( 'SELECT hash_id FROM {} WHERE EXISTS ( SELECT 1 FROM {} WHERE {}.hash_id = {}.hash_id );'.format( temp_hash_id_table_name, pending_mappings_table_name, pending_mappings_table_name, temp_hash_id_table_name ) ) ) )
if len( hash_ids_in_file_service_and_not_in_cache_that_have_tags ) > 0:
problems_found = True
HydrusData.ShowText( '{} missing files in {}_{}!'.format( HydrusData.ToHumanInt( len( hash_ids_in_file_service_and_not_in_cache_that_have_tags ) ), file_service_id, tag_service_id ) )
with self._MakeTemporaryIntegerTable( hash_ids_in_file_service_and_not_in_cache_that_have_tags, 'hash_id' ) as temp_hash_id_table_name:
self.modules_mappings_cache_specific_storage.AddFiles( file_service_id, tag_service_id, hash_ids_in_file_service_and_not_in_cache_that_have_tags, temp_hash_id_table_name )
if not problems_found:
HydrusData.ShowText( 'All checks ok--no desynced mapping caches!' )
finally:
job_key.SetVariable( 'popup_text_1', 'done!' )
job_key.Finish()
job_key.Delete( 5 )
self._cursor_transaction_wrapper.pub_after_job( 'notify_new_tag_display_application' )
self._cursor_transaction_wrapper.pub_after_job( 'notify_new_force_refresh_tags_data' )
def _SaveDirtyServices( self, dirty_services ):
# if allowed to save objects
@ -11547,7 +11663,38 @@ class DB( HydrusDB.HydrusDB ):
if result is None:
message = 'Your database is going to calculate some new data so it can refer to multiple local services more efficiently. If you have a large client, this may take a few minutes.'
warning_ptr_text = 'After looking at your database, I think this will be quick, maybe a couple minutes at most.'
nums_mappings = self._STL( self._Execute( 'SELECT info FROM service_info WHERE info_type = ?;', ( HC.SERVICE_INFO_NUM_MAPPINGS, ) ) )
if len( nums_mappings ) > 0:
we_ptr = max( nums_mappings ) > 1000000000
if we_ptr:
result = self._Execute( 'SELECT info FROM service_info WHERE info_type = ? AND service_id = ?;', ( HC.SERVICE_INFO_NUM_FILES, self.modules_services.combined_local_file_service_id ) ).fetchone()
if result is not None:
( num_files, ) = result
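# assume roughly 60k files per minute, the conservative middle of the 25-100k range observed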
warning_ptr_text = 'For most users, this update works at about 25-100k files per minute, so with {} files I expect it to take ~{} minutes for you.'.format( HydrusData.ToHumanInt( num_files ), max( 1, int( num_files / 60000 ) ) )
else:
we_ptr = False
else:
we_ptr = False
message = 'Your database is going to calculate some new data so it can refer to multiple local services more efficiently. It could take a while.'
message += os.linesep * 2
message += warning_ptr_text
message += os.linesep * 2
message += 'If you do not have the time at the moment, please force kill the hydrus process now. Otherwise, continue!'
@ -11573,6 +11720,12 @@ class DB( HydrusDB.HydrusDB ):
self._controller.frame_splash_status.SetText( 'creating "all my files" virtual service' )
self._controller.frame_splash_status.SetSubtext( 'gathering current file records' )
self._cursor_transaction_wrapper.Commit()
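# switch to TRUNCATE journalling for this big update transaction--a multi-GB WAL file causes problems for some users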
self._Execute( 'PRAGMA journal_mode = TRUNCATE;' )
self._cursor_transaction_wrapper.BeginImmediate()
dictionary = ClientServices.GenerateDefaultServiceDictionary( HC.COMBINED_LOCAL_MEDIA )
self._AddService( CC.COMBINED_LOCAL_MEDIA_SERVICE_KEY, HC.COMBINED_LOCAL_MEDIA, 'all my files', dictionary )
@ -11651,6 +11804,8 @@ class DB( HydrusDB.HydrusDB ):
deleted_files_table_name = ClientDBFilesStorage.GenerateFilesTableName( self.modules_services.combined_local_media_service_id, HC.CONTENT_STATUS_DELETED )
num_to_do = len( all_media_hash_ids )
for ( i, hash_id ) in enumerate( all_media_hash_ids ):
# no need to fake the service info number updates--that will calculate from raw on next review services open
@ -11681,10 +11836,41 @@ class DB( HydrusDB.HydrusDB ):
self._cursor_transaction_wrapper.Commit()
self._Execute( 'PRAGMA journal_mode = {};'.format( HG.db_journal_mode ) )
self._cursor_transaction_wrapper.BeginImmediate()
self._controller.frame_splash_status.SetSubtext( '' )
if version == 486:
file_service_ids = self.modules_services.GetServiceIds( HC.FILE_SERVICES_WITH_SPECIFIC_MAPPING_CACHES )
tag_service_ids = self.modules_services.GetServiceIds( HC.REAL_TAG_SERVICES )
for ( file_service_id, tag_service_id ) in itertools.product( file_service_ids, tag_service_ids ):
# some users still have a few of these floating around, they are not needed
suffix = '{}_{}'.format( file_service_id, tag_service_id )
cache_files_table_name = 'external_caches.specific_files_cache_{}'.format( suffix )
result = self._Execute( 'SELECT 1 FROM external_caches.sqlite_master WHERE name = ?;', ( cache_files_table_name.split( '.', 1 )[1], ) ).fetchone()
if result is None:
continue
self._Execute( 'DROP TABLE {};'.format( cache_files_table_name ) )
self._controller.frame_splash_status.SetTitleText( 'updated db to v{}'.format( HydrusData.ToHumanInt( version + 1 ) ) )
self._Execute( 'UPDATE version SET version = ?;', ( version + 1, ) )
@ -12222,6 +12408,7 @@ class DB( HydrusDB.HydrusDB ):
elif action == 'reset_repository': self._ResetRepository( *args, **kwargs )
elif action == 'reset_repository_processing': self._ResetRepositoryProcessing( *args, **kwargs )
elif action == 'reset_potential_search_status': self._PerceptualHashesResetSearchFromHashes( *args, **kwargs )
elif action == 'resync_tag_mappings_cache_files': self._ResyncTagMappingsCacheFiles( *args, **kwargs )
elif action == 'save_options': self._SaveOptions( *args, **kwargs )
elif action == 'serialisable': self.modules_serialisable.SetJSONDump( *args, **kwargs )
elif action == 'serialisable_atomic': self.modules_serialisable.SetJSONComplex( *args, **kwargs )

View File

@ -2988,9 +2988,9 @@ class FrameGUI( ClientGUITopLevelWindows.MainFrameThatResizes, CAC.ApplicationCo
self._menubar_database_restore_backup = ClientGUIMenus.AppendMenuItem( menu, 'restore from a database backup', 'Restore the database from an external location.', self._controller.RestoreDatabase )
message = 'Your database is stored across multiple locations, which disables my internal backup routine. To back up, please use a third-party program that will work better than anything I can write.'
message = 'Your database is stored across multiple locations. The in-client backup routine can only handle simple databases (in one location), so the menu commands to backup have been hidden. To back up, please use a third-party program that will work better than anything I can write.'
message += os.linesep * 2
message += 'Please check the help for more info on how best to backup manually.'
message += 'Check the help for more info on how best to backup manually.'
self._menubar_database_multiple_location_label = ClientGUIMenus.AppendMenuItem( menu, 'database is stored in multiple locations', 'The database is migrated.', HydrusData.ShowText, message )
@ -3041,6 +3041,7 @@ class FrameGUI( ClientGUITopLevelWindows.MainFrameThatResizes, CAC.ApplicationCo
ClientGUIMenus.AppendMenuItem( check_submenu, 'database integrity', 'Have the database examine all its records for internal consistency.', self._CheckDBIntegrity )
ClientGUIMenus.AppendMenuItem( check_submenu, 'repopulate truncated mappings tables', 'Use the mappings cache to try to repair a previously damaged mappings file.', self._RepopulateMappingsTables )
ClientGUIMenus.AppendMenuItem( check_submenu, 'resync tag mappings cache files', 'Check the tag mappings cache for surplus or missing files.', self._ResyncTagMappingsCacheFiles )
ClientGUIMenus.AppendMenuItem( check_submenu, 'fix logically inconsistent mappings', 'Remove tags that are occupying two mutually exclusive states.', self._FixLogicallyInconsistentMappings )
ClientGUIMenus.AppendMenuItem( check_submenu, 'fix invalid tags', 'Scan the database for invalid tags.', self._RepairInvalidTags )
@ -5309,6 +5310,31 @@ class FrameGUI( ClientGUITopLevelWindows.MainFrameThatResizes, CAC.ApplicationCo
self._controller.pub( 'set_splitter_positions', HC.options[ 'hpos' ], HC.options[ 'vpos' ] )
def _ResyncTagMappingsCacheFiles( self ):
message = 'This will scan your mappings cache for surplus or missing files and correct them. This is useful if you see ghost files or if searches miss files that have the tag.'
message += os.linesep * 2
message += 'If you have a lot of tags and files, it can take a long time, during which the gui may hang. It should be much faster than the full regen options though!'
message += os.linesep * 2
message += 'If you do not have a specific reason to run this, it is pointless.'
result = ClientGUIDialogsQuick.GetYesNo( self, message, yes_label = 'do it--now choose which service', no_label = 'forget it' )
if result == QW.QDialog.Accepted:
try:
tag_service_key = GetTagServiceKeyForMaintenance( self )
except HydrusExceptions.CancelledException:
return
self._controller.Write( 'resync_tag_mappings_cache_files', tag_service_key = tag_service_key )
def _STARTReviewAllAccounts( self, service_key ):
admin_service = HG.client_controller.services_manager.GetService( service_key )

View File

@ -672,7 +672,7 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
self._duplicate_comparison_score_nicer_ratio.setToolTip( 'For instance, 16:9 vs 640:357.')
self._duplicate_filter_max_batch_size = ClientGUICommon.BetterSpinBox( self, min = 10, max = 1024 )
self._duplicate_filter_max_batch_size = ClientGUICommon.BetterSpinBox( self, min = 5, max = 1024 )
#
@ -864,9 +864,9 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
self._mime_launch_listctrl.DeleteDatas( [ ( mime, launch_path ) ] )
edited_data = [ ( mime, new_launch_path ) ]
edited_data = ( mime, new_launch_path )
self._mime_launch_listctrl.AddDatas( edited_data )
self._mime_launch_listctrl.AddDatas( [ edited_data ] )
edited_datas.append( edited_data )
@ -914,7 +914,7 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
location_context = self._new_options.GetDefaultLocalLocationContext()
self._default_local_location_context = ClientGUILocation.LocationSearchContextButton( self, location_context )
self._default_local_location_context.setToolTip( 'This initialised into a bunch of dialogs across the program. You can probably leave it alone forever, but if you delete or move away from \'my files\' as your main place to do work, please update it here.' )
self._default_local_location_context.setToolTip( 'This initialised into a bunch of dialogs across the program as a fallback. You can probably leave it alone forever, but if you delete or move away from \'my files\' as your main place to do work, please update it here.' )
self._default_local_location_context.SetOnlyImportableDomainsAllowed( True )
@ -997,7 +997,7 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
rows = []
rows.append( ( 'Default local file search location: ', self._default_local_location_context ) )
rows.append( ( 'Fallback local file search location: ', self._default_local_location_context ) )
rows.append( ( 'When copying file hashes, prefix with booru-friendly hash type: ', self._prefix_hash_when_copying ) )
rows.append( ( 'Confirm sending files to trash: ', self._confirm_trash ) )
rows.append( ( 'Confirm sending more than one file to archive or inbox: ', self._confirm_archive ) )

View File

@ -1744,9 +1744,20 @@ class EditStringSplitterPanel( ClientGUIScrolledPanels.EditPanel ):
def _UpdateControls( self ):
string_splitter = self._GetValue()
results = string_splitter.Split( self._example_string.text() )
if self._separator.text() == '':
self._separator.setObjectName( 'HydrusInvalid' )
results = []
else:
self._separator.setObjectName( '' )
string_splitter = self._GetValue()
results = string_splitter.Split( self._example_string.text() )
self._example_string_splits.clear()
@ -1755,9 +1766,16 @@ class EditStringSplitterPanel( ClientGUIScrolledPanels.EditPanel ):
self._example_string_splits.addItem( result )
self._separator.style().polish( self._separator )
def GetValue( self ):
if self._separator.text() == '':
raise HydrusExceptions.VetoException( 'Sorry, you have to have a value in the separator field!' )
string_splitter = self._GetValue()
return string_splitter

View File

@ -491,7 +491,7 @@ class Canvas( QW.QWidget, CAC.ApplicationCommandProcessorMixin ):
def _Delete( self, media = None, default_reason = None, file_service_key = None ):
def _Delete( self, media = None, default_reason = None, file_service_key = None, just_get_jobs = False ):
if media is None:
@ -538,9 +538,16 @@ class Canvas( QW.QWidget, CAC.ApplicationCommandProcessorMixin ):
HG.client_controller.CallToThread( do_it, jobs )
return True
if just_get_jobs:
return jobs
else:
HG.client_controller.CallToThread( do_it, jobs )
return True
def _DoEdgePan( self, pan_type ):
@ -2880,6 +2887,8 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
CANVAS_TYPE = CC.CANVAS_MEDIA_VIEWER_DUPLICATES
showPairInPage = QC.Signal( list )
def __init__( self, parent, file_search_context: ClientSearch.FileSearchContext, both_files_match, pixel_dupes_preference, max_hamming_distance ):
location_context = file_search_context.GetLocationContext()
@ -2888,6 +2897,7 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
hover = ClientGUICanvasHoverFrames.CanvasHoverFrameRightDuplicates( self, self, self._right_notes_hover, self._canvas_key )
hover.showPairInPage.connect( self._ShowPairInPage )
hover.sendApplicationCommand.connect( self.ProcessApplicationCommand )
self._hovers.append( hover )
@ -2933,9 +2943,29 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
pair_info = []
-for ( media_result_pair, duplicate_type, first_media, second_media, service_keys_to_content_updates, was_auto_skipped ) in self._processed_pairs:
-    if duplicate_type is None or was_auto_skipped:
-        continue
+for ( media_result_pair, duplicate_type, first_media, second_media, list_of_service_keys_to_content_updates, was_auto_skipped ) in self._processed_pairs:
+    if duplicate_type is None:
+        if len( list_of_service_keys_to_content_updates ) > 0:
+            for service_keys_to_content_updates in list_of_service_keys_to_content_updates:
+                if blocking:
+                    HG.client_controller.WriteSynchronous( 'content_updates', service_keys_to_content_updates )
+                else:
+                    HG.client_controller.Write( 'content_updates', service_keys_to_content_updates )
+        continue
+    if was_auto_skipped:
+        continue # it was a 'skip' decision
@ -2943,7 +2973,7 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
first_hash = first_media.GetHash()
second_hash = second_media.GetHash()
pair_info.append( ( duplicate_type, first_hash, second_hash, service_keys_to_content_updates ) )
pair_info.append( ( duplicate_type, first_hash, second_hash, list_of_service_keys_to_content_updates ) )
if len( pair_info ) > 0:
@ -2975,6 +3005,9 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
return
first_media = self._current_media
second_media = self._media_list.GetNext( self._current_media )
text = 'Delete just this file, or both?'
yes_tuples = []
@ -2990,13 +3023,13 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
if value == 'current':
media = [ self._current_media ]
media = [ first_media ]
default_reason = 'Deleted manually in Duplicate Filter.'
elif value == 'both':
media = [ self._current_media, self._media_list.GetNext( self._current_media ) ]
media = [ first_media, second_media ]
default_reason = 'Deleted manually in Duplicate Filter, along with its potential duplicate.'
@ -3011,14 +3044,25 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
-deleted = CanvasWithHovers._Delete( self, media = media, default_reason = default_reason, file_service_key = file_service_key )
+jobs = CanvasWithHovers._Delete( self, media = media, default_reason = default_reason, file_service_key = file_service_key, just_get_jobs = True )
+deleted = isinstance( jobs, list ) and len( jobs ) > 0
if deleted:
-    self._SkipPair()
+    for m in media:
+        self._hashes_due_to_be_deleted_in_this_batch.update( m.GetHashes() )
+    was_auto_skipped = False
+    self._processed_pairs.append( ( self._batch_of_pairs_to_process[ self._current_pair_index ], None, None, None, jobs, was_auto_skipped ) )
+    self._ShowNextPair()
-return True
+return deleted
def _DoCustomAction( self ):
@ -3190,7 +3234,7 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
def _GetNumCommittableDecisions( self ):
return len( [ 1 for ( media_result_pair, duplicate_type, first_media, second_media, service_keys_to_content_updates, was_auto_skipped ) in self._processed_pairs if duplicate_type is not None and not was_auto_skipped ] )
return len( [ 1 for ( media_result_pair, duplicate_type, first_media, second_media, list_of_service_keys_to_content_updates, was_auto_skipped ) in self._processed_pairs if len( list_of_service_keys_to_content_updates ) > 0 ] )
def _GetNumRemainingDecisions( self ):
@ -3201,7 +3245,7 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
number_of_decisions_after_the_current = last_decision_index - self._current_pair_index
return 1 + number_of_decisions_after_the_current
return max( 0, 1 + number_of_decisions_after_the_current )
def _GoBack( self ):
@ -3352,9 +3396,9 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
file_deletion_reason = None
service_keys_to_content_updates = duplicate_action_options.ProcessPairIntoContentUpdates( first_media, second_media, delete_first = delete_first, delete_second = delete_second, delete_both = delete_both, file_deletion_reason = file_deletion_reason )
list_of_service_keys_to_content_updates = [ duplicate_action_options.ProcessPairIntoContentUpdates( first_media, second_media, delete_first = delete_first, delete_second = delete_second, delete_both = delete_both, file_deletion_reason = file_deletion_reason ) ]
self._processed_pairs.append( ( self._batch_of_pairs_to_process[ self._current_pair_index ], duplicate_type, first_media, second_media, service_keys_to_content_updates, was_auto_skipped ) )
self._processed_pairs.append( ( self._batch_of_pairs_to_process[ self._current_pair_index ], duplicate_type, first_media, second_media, list_of_service_keys_to_content_updates, was_auto_skipped ) )
self._ShowNextPair()
@ -3363,7 +3407,7 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
if self._current_pair_index > 0:
( media_result_pair, duplicate_type, first_media, second_media, service_keys_to_content_updates, was_auto_skipped ) = self._processed_pairs.pop()
( media_result_pair, duplicate_type, first_media, second_media, list_of_service_keys_to_content_updates, was_auto_skipped ) = self._processed_pairs.pop()
self._current_pair_index -= 1
@ -3382,7 +3426,7 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
return False
( media_result_pair, duplicate_type, first_media, second_media, service_keys_to_content_updates, was_auto_skipped ) = self._processed_pairs.pop()
( media_result_pair, duplicate_type, first_media, second_media, list_of_service_keys_to_content_updates, was_auto_skipped ) = self._processed_pairs.pop()
self._current_pair_index -= 1
@ -3469,75 +3513,81 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
#
-num_committable = self._GetNumCommittableDecisions()
-num_remaining = self._GetNumRemainingDecisions()
-if num_remaining == 0 and num_committable > 0:
-    label = 'commit ' + HydrusData.ToHumanInt( num_committable ) + ' decisions and continue?'
-    result = ClientGUIDialogsQuick.GetInterstitialFilteringAnswer( self, label )
-    if result == QW.QDialog.Accepted:
-        self._CommitProcessed( blocking = True )
-    else:
-        it_went_ok = self._RewindProcessing()
-        if not it_went_ok:
-            return
-    num_remaining = self._GetNumRemainingDecisions()
-if num_remaining == 0:
-    self._LoadNextBatchOfPairs()
-else:
-    def pair_is_good( pair ):
-        ( first_media_result, second_media_result ) = pair
-        first_hash = first_media_result.GetHash()
-        second_hash = second_media_result.GetHash()
-        if first_hash in self._hashes_processed_in_this_batch or second_hash in self._hashes_processed_in_this_batch:
-            return False
-        if first_hash in self._hashes_due_to_be_deleted_in_this_batch or second_hash in self._hashes_due_to_be_deleted_in_this_batch:
-            return False
-        first_media = ClientMedia.MediaSingleton( first_media_result )
-        second_media = ClientMedia.MediaSingleton( second_media_result )
-        if not self._CanDisplayMedia( first_media ) or not self._CanDisplayMedia( second_media ):
-            return False
-        return True
-    while not pair_is_good( self._batch_of_pairs_to_process[ self._current_pair_index ] ):
-        was_auto_skipped = True
-        self._processed_pairs.append( ( self._batch_of_pairs_to_process[ self._current_pair_index ], None, None, None, {}, was_auto_skipped ) )
-        if self._GetNumRemainingDecisions() == 1: # we are at the end of the queue, this decision we just appended is the last
-            if self._GetNumCommittableDecisions() == 0:
+def pair_is_good( pair ):
+    ( first_media_result, second_media_result ) = pair
+    first_hash = first_media_result.GetHash()
+    second_hash = second_media_result.GetHash()
+    if first_hash in self._hashes_processed_in_this_batch or second_hash in self._hashes_processed_in_this_batch:
+        return False
+    if first_hash in self._hashes_due_to_be_deleted_in_this_batch or second_hash in self._hashes_due_to_be_deleted_in_this_batch:
+        return False
+    first_media = ClientMedia.MediaSingleton( first_media_result )
+    second_media = ClientMedia.MediaSingleton( second_media_result )
+    if not self._CanDisplayMedia( first_media ) or not self._CanDisplayMedia( second_media ):
+        return False
+    return True
+while True:
+    num_remaining = self._GetNumRemainingDecisions()
+    if num_remaining == 0:
+        num_committable = self._GetNumCommittableDecisions()
+        if num_committable > 0:
+            label = 'commit ' + HydrusData.ToHumanInt( num_committable ) + ' decisions and continue?'
+            result = ClientGUIDialogsQuick.GetInterstitialFilteringAnswer( self, label )
+            if result == QW.QDialog.Accepted:
+                self._CommitProcessed( blocking = True )
+            else:
+                it_went_ok = self._RewindProcessing()
+                if it_went_ok:
+                    self._ShowCurrentPair()
+                return
+        else:
+            # nothing to commit, so let's see if we have a big problem here or if user just skipped all
+            we_saw_a_non_auto_skip = False
+            for ( media_result_pair, duplicate_type, first_media, second_media, list_of_service_keys_to_content_updates, was_auto_skipped ) in self._processed_pairs:
+                if not was_auto_skipped:
+                    we_saw_a_non_auto_skip = True
+                    break
+            if not we_saw_a_non_auto_skip:
HG.client_controller.pub( 'new_similar_files_potentials_search_numbers' )
@ -3547,21 +3597,40 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
return
-    else:
-        self._ShowNextPair() # there are no useful decisions left in the queue, so let's reset
-        return
-else:
-    self._current_pair_index += 1
-self._ShowCurrentPair()
+        self._LoadNextBatchOfPairs()
+        return
+    current_pair = self._batch_of_pairs_to_process[ self._current_pair_index ]
+    if pair_is_good( current_pair ):
+        self._ShowCurrentPair()
+        return
+    else:
+        was_auto_skipped = True
+        self._processed_pairs.append( ( current_pair, None, None, None, [], was_auto_skipped ) )
+        self._current_pair_index += 1
def _ShowPairInPage( self ):
if self._current_media is None:
return
self.showPairInPage.emit( [ self._current_media, self._media_list.GetNext( self._current_media ) ] )
def _SkipPair( self ):
@ -3573,7 +3642,7 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
was_auto_skipped = False
-self._processed_pairs.append( ( self._batch_of_pairs_to_process[ self._current_pair_index ], None, None, None, {}, was_auto_skipped ) )
+self._processed_pairs.append( ( self._batch_of_pairs_to_process[ self._current_pair_index ], None, None, None, [], was_auto_skipped ) )
self._ShowNextPair()
@ -4047,28 +4116,24 @@ class CanvasMediaList( ClientMedia.ListeningMediaList, CanvasWithHovers ):
-def CommitArchiveDelete( page_key: bytes, location_context: ClientLocation.LocationContext, kept_hashes: typing.Collection[ bytes ], deleted_hashes: typing.Collection[ bytes ] ):
+def CommitArchiveDelete( page_key: bytes, location_context: ClientLocation.LocationContext, kept: typing.Collection[ ClientMedia.MediaSingleton ], deleted: typing.Collection[ ClientMedia.MediaSingleton ] ):
+    kept = list( kept )
+    deleted = list( deleted )
+    kept_hashes = [ m.GetHash() for m in kept ]
+    deleted_hashes = [ m.GetHash() for m in deleted ]
    if HC.options[ 'remove_filtered_files' ]:
        all_hashes = set()
-        all_hashes.update( deleted_hashes )
        all_hashes.update( kept_hashes )
+        all_hashes.update( deleted_hashes )
        HG.client_controller.pub( 'remove_media', page_key, all_hashes )
-    if not isinstance( deleted_hashes, list ):
-        deleted_hashes = list( deleted_hashes )
-    if not isinstance( kept_hashes, list ):
-        kept_hashes = list( kept_hashes )
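# work on a duplicate so we can strip any now-deleted services out of the context before building delete records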
location_context = location_context.Duplicate()
location_context.FixMissingServices( ClientLocation.ValidLocalDomainsFilter )
@ -4080,10 +4145,10 @@ def CommitArchiveDelete( page_key: bytes, location_context: ClientLocation.Locat
else:
# if we are in a weird search domain, then just say 'delete from all local'
deletee_file_service_keys = HG.client_controller.services_manager.GetServiceKeys( ( HC.LOCAL_FILE_DOMAIN, ) )
deletee_file_service_keys = [ CC.COMBINED_LOCAL_MEDIA_SERVICE_KEY ]
for block_of_deleted_hashes in HydrusData.SplitListIntoChunks( deleted_hashes, 64 ):
for block_of_deleted in HydrusData.SplitListIntoChunks( deleted, 64 ):
service_keys_to_content_updates = {}
@ -4091,6 +4156,8 @@ def CommitArchiveDelete( page_key: bytes, location_context: ClientLocation.Locat
for deletee_file_service_key in deletee_file_service_keys:
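# only make a delete record for the file services this file is actually in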
block_of_deleted_hashes = [ m.GetHash() for m in block_of_deleted if deletee_file_service_key in m.GetLocationsManager().GetCurrent() ]
service_keys_to_content_updates[ deletee_file_service_key ] = [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_FILES, HC.CONTENT_UPDATE_DELETE, block_of_deleted_hashes, reason = reason ) ]
@ -4100,6 +4167,8 @@ def CommitArchiveDelete( page_key: bytes, location_context: ClientLocation.Locat
if HC.options[ 'remove_filtered_files' ]:
block_of_deleted_hashes = [ m.GetHash() for m in block_of_deleted ]
HG.client_controller.pub( 'remove_media', page_key, block_of_deleted_hashes )
@ -4161,22 +4230,46 @@ class CanvasMediaListFilterArchiveDelete( CanvasMediaList ):
def TryToDoPreClose( self ):
-kept_hashes = [ media.GetHash() for media in self._kept ]
+kept = list( self._kept )
delete_lock_for_archived_files = HG.client_controller.new_options.GetBoolean( 'delete_lock_for_archived_files' )
if delete_lock_for_archived_files:
-    deleted_hashes = [ media.GetHash() for media in self._deleted if not media.HasArchive() ]
+    deleted = [ media for media in self._deleted if not media.HasArchive() ]
else:
-    deleted_hashes = [ media.GetHash() for media in self._deleted ]
+    deleted = list( self._deleted )
-if len( kept_hashes ) > 0 or len( deleted_hashes ) > 0:
+if len( kept ) > 0 or len( deleted ) > 0:
label = 'keep ' + HydrusData.ToHumanInt( len( self._kept ) ) + ' and delete ' + HydrusData.ToHumanInt( len( self._deleted ) ) + ' files?'
label_components = []
if len( kept ) > 0:
label_components.append( 'keep {}'.format( HydrusData.ToHumanInt( len( kept ) ) ) )
if len( deleted ) > 0:
if self._location_context.IncludesCurrent():
location_context_label = self._location_context.ToString( HG.client_controller.services_manager.GetName )
else:
location_context_label = 'all possible local file services'
label_components.append( 'delete (from {}) {}'.format( location_context_label, HydrusData.ToHumanInt( len( self._deleted ) ) ) )
label = '{} files?'.format( ' and '.join( label_components ) )
# TODO: ok so ideally we should total up the deleteds' actual file services and give users UI to select what they want to delete from
# so like '23 files in my files, 2 in favourites' and then a 'yes' button for all or just my files or favourites
( result, cancelled ) = ClientGUIDialogsQuick.GetFinishFilteringAnswer( self, label )
@ -4201,7 +4294,7 @@ class CanvasMediaListFilterArchiveDelete( CanvasMediaList ):
self._current_media = self._GetFirst() # so the pubsub on close is better
HG.client_controller.CallToThread( CommitArchiveDelete, self._page_key, self._location_context, kept_hashes, deleted_hashes )
HG.client_controller.CallToThread( CommitArchiveDelete, self._page_key, self._location_context, kept, deleted )

View File

@ -1549,6 +1549,8 @@ class CanvasHoverFrameRightNotes( CanvasHoverFrame ):
class CanvasHoverFrameRightDuplicates( CanvasHoverFrame ):
showPairInPage = QC.Signal()
def __init__( self, parent: QW.QWidget, my_canvas: QW.QWidget, right_notes_hover: CanvasHoverFrameRightNotes, canvas_key: bytes ):
CanvasHoverFrame.__init__( self, parent, my_canvas, canvas_key )
@ -1561,6 +1563,10 @@ class CanvasHoverFrameRightDuplicates( CanvasHoverFrame ):
self._comparison_media = None
self._show_in_a_page_button = ClientGUICommon.BetterBitmapButton( self, CC.global_pixmaps().fullscreen_switch, self.showPairInPage.emit )
self._show_in_a_page_button.setToolTip( 'send pair to the duplicates media page, for later processing' )
self._show_in_a_page_button.setFocusPolicy( QC.Qt.TabFocus )
self._trash_button = ClientGUICommon.BetterBitmapButton( self, CC.global_pixmaps().delete, HG.client_controller.pub, 'canvas_delete', self._canvas_key )
self._trash_button.setToolTip( 'send to trash' )
self._trash_button.setFocusPolicy( QC.Qt.TabFocus )
@ -1665,6 +1671,7 @@ class CanvasHoverFrameRightDuplicates( CanvasHoverFrame ):
top_button_hbox = QP.HBoxLayout()
QP.AddToLayout( top_button_hbox, self._next_button, CC.FLAGS_EXPAND_BOTH_WAYS )
QP.AddToLayout( top_button_hbox, self._show_in_a_page_button, CC.FLAGS_CENTER_PERPENDICULAR )
QP.AddToLayout( top_button_hbox, self._trash_button, CC.FLAGS_CENTER_PERPENDICULAR )
QP.AddToLayout( top_button_hbox, self._cog_button, CC.FLAGS_CENTER_PERPENDICULAR )
QP.AddToLayout( top_button_hbox, close_button, CC.FLAGS_CENTER_PERPENDICULAR )

View File

@ -1396,6 +1396,8 @@ class ManagementPanelDuplicateFilter( ManagementPanel ):
canvas_window = ClientGUICanvas.CanvasFilterDuplicates( canvas_frame, file_search_context, both_files_match, pixel_dupes_preference, max_hamming_distance )
canvas_window.showPairInPage.connect( self._ShowPairInPage )
canvas_frame.SetCanvas( canvas_window )
@ -1501,6 +1503,13 @@ class ManagementPanelDuplicateFilter( ManagementPanel ):
self._UpdateMaintenanceStatus()
def _ShowPairInPage( self, media: typing.Collection[ ClientMedia.MediaSingleton ] ):
media_results = [ m.GetMediaResult() for m in media ]
self._page.GetMediaPanel().AddMediaResults( self._page_key, media_results )
def _ShowPotentialDupes( self, hashes ):
( file_search_context, both_files_match, pixel_dupes_preference, max_hamming_distance ) = self._GetDuplicateFileSearchData()
@ -2249,23 +2258,18 @@ class ManagementPanelImporterMultipleGallery( ManagementPanelImporter ):
if len( hashes ) > 0:
media_results = HG.client_controller.Read( 'media_results', hashes )
media_results = HG.client_controller.Read( 'media_results', hashes, sorted = True )
else:
hashes = []
media_results = []
hashes_to_media_results = { media_result.GetHash() : media_result for media_result in media_results }
sorted_media_results = [ hashes_to_media_results[ hash ] for hash in hashes ]
location_context = self._highlighted_gallery_import.GetFileImportOptions().GetDestinationLocationContext()
self._SetLocationContext( location_context )
panel = ClientGUIResults.MediaPanelThumbnails( self._page, self._page_key, location_context, sorted_media_results )
panel = ClientGUIResults.MediaPanelThumbnails( self._page, self._page_key, location_context, media_results )
panel.SetEmptyPageStatusOverride( 'no files for this query and its publishing settings' )
@ -3111,23 +3115,18 @@ class ManagementPanelImporterMultipleWatcher( ManagementPanelImporter ):
if len( hashes ) > 0:
media_results = HG.client_controller.Read( 'media_results', hashes )
media_results = HG.client_controller.Read( 'media_results', hashes, sorted = True )
else:
hashes = []
media_results = []
hashes_to_media_results = { media_result.GetHash() : media_result for media_result in media_results }
sorted_media_results = [ hashes_to_media_results[ hash ] for hash in hashes ]
location_context = self._highlighted_watcher.GetFileImportOptions().GetDestinationLocationContext()
self._SetLocationContext( location_context )
panel = ClientGUIResults.MediaPanelThumbnails( self._page, self._page_key, location_context, sorted_media_results )
panel = ClientGUIResults.MediaPanelThumbnails( self._page, self._page_key, location_context, media_results )
panel.SetEmptyPageStatusOverride( 'no files for this watcher and its publishing settings' )

View File

@ -574,6 +574,8 @@ class Page( QW.QSplitter ):
return
old_panel.CleanBeforeDestroy()
old_panel.deleteLater()
@ -610,6 +612,8 @@ class Page( QW.QSplitter ):
self._preview_canvas.CleanBeforeDestroy()
self._media_panel.CleanBeforeDestroy()
self._controller.ReleasePageKey( self._page_key )

View File

@ -1867,14 +1867,14 @@ class MediaPanel( ClientMedia.ListeningMediaList, QW.QScrollArea, CAC.Applicatio
if duplicate_action_options is None:
service_keys_to_content_updates = {}
list_of_service_keys_to_content_updates = []
else:
service_keys_to_content_updates = duplicate_action_options.ProcessPairIntoContentUpdates( first_media, second_media, file_deletion_reason = file_deletion_reason )
list_of_service_keys_to_content_updates = [ duplicate_action_options.ProcessPairIntoContentUpdates( first_media, second_media, file_deletion_reason = file_deletion_reason ) ]
pair_info.append( ( duplicate_type, first_hash, second_hash, service_keys_to_content_updates ) )
pair_info.append( ( duplicate_type, first_hash, second_hash, list_of_service_keys_to_content_updates ) )
if len( pair_info ) > 0:
@ -2149,6 +2149,11 @@ class MediaPanel( ClientMedia.ListeningMediaList, QW.QScrollArea, CAC.Applicatio
def CleanBeforeDestroy( self ):
self.Clear()
def ClearPageKey( self ):
self._page_key = 'dead media panel page key'
@ -3220,9 +3225,6 @@ class MediaPanelThumbnails( MediaPanel ):
MediaPanel._RemoveMediaDirectly( self, singleton_media, collected_media )
self._selected_media.difference_update( singleton_media )
self._selected_media.difference_update( collected_media )
self._EndShiftSelect()
self._RecalculateVirtualSize()

View File

@ -1510,6 +1510,13 @@ class ReviewServicePanel( QW.QWidget ):
self._service = service
self._id_button = ClientGUICommon.BetterButton( self, 'id', self._GetAndShowID )
self._id_button.setToolTip( 'Click to fetch your service\'s database id.' )
width = ClientGUIFunctions.ConvertTextToPixelWidth( self._id_button, 4 )
self._id_button.setFixedWidth( width )
self._refresh_button = ClientGUICommon.BetterBitmapButton( self, CC.global_pixmaps().refresh, self._RefreshButton )
service_type = self._service.GetServiceType()
@ -1575,9 +1582,19 @@ class ReviewServicePanel( QW.QWidget ):
#
if not HG.client_controller.new_options.GetBoolean( 'advanced_mode' ):
self._id_button.hide()
vbox = QP.VBoxLayout()
QP.AddToLayout( vbox, self._refresh_button, CC.FLAGS_ON_RIGHT )
hbox = QP.HBoxLayout( margin = 0 )
QP.AddToLayout( hbox, self._id_button, CC.FLAGS_CENTER )
QP.AddToLayout( hbox, self._refresh_button, CC.FLAGS_CENTER )
QP.AddToLayout( vbox, hbox, CC.FLAGS_ON_RIGHT )
saw_both_ways = False
@ -1599,6 +1616,29 @@ class ReviewServicePanel( QW.QWidget ):
self.setLayout( vbox )
def _GetAndShowID( self ):
service_key = self._service.GetServiceKey()
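# async pattern: work_callable runs the db read off the Qt thread, publish_callable shows the result back on it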
def work_callable():
service_id = HG.client_controller.Read( 'service_id', service_key )
return service_id
def publish_callable( service_id ):
message = 'The service id is: {}'.format( service_id )
QP.CallAfter( QW.QMessageBox.information, None, 'Service ID', message )
job = ClientGUIAsync.AsyncQtJob( self, work_callable, publish_callable )
job.start()
def _RefreshButton( self ):
HG.client_controller.pub( 'service_updated', self._service )

View File

@ -1044,6 +1044,9 @@ class MediaList( object ):
self._singleton_media.difference_update( singleton_media )
self._collected_media.difference_update( collected_media )
self._selected_media.difference_update( singleton_media )
self._selected_media.difference_update( collected_media )
self._sorted_media.remove_items( singleton_media.union( collected_media ) )
self._RecalcAfterMediaRemove()
@ -1078,6 +1081,17 @@ class MediaList( object ):
return new_media
def Clear( self ):
self._singleton_media = set()
self._collected_media = set()
self._selected_media = set()
self._sorted_media = []
self._RecalcAfterMediaRemove()
def Collect( self, media_collect = None ):
if media_collect == None:
@ -2500,7 +2514,7 @@ class MediaSingleton( Media ):
timestamp = locations_manager.GetCurrentTimestamp( local_file_service.GetServiceKey() )
lines.append( ( True, 'added to {} {}'.format( local_file_service.GetName(), ClientData.TimestampToPrettyTimeDelta( timestamp ) ) ) )
lines.append( ( True, 'added to {}: {}'.format( local_file_service.GetName(), ClientData.TimestampToPrettyTimeDelta( timestamp ) ) ) )
seen_local_file_service_timestamps.add( timestamp )
@ -2513,7 +2527,7 @@ class MediaSingleton( Media ):
# if we haven't already printed this timestamp somewhere
line_is_interesting = False not in ( timestamp_is_interesting( t, import_timestamp ) for t in seen_local_file_service_timestamps )
lines.append( ( line_is_interesting, 'imported {}'.format( ClientData.TimestampToPrettyTimeDelta( import_timestamp ) ) ) )
lines.append( ( line_is_interesting, 'imported: {}'.format( ClientData.TimestampToPrettyTimeDelta( import_timestamp ) ) ) )
if line_is_interesting:

View File

@ -80,7 +80,7 @@ options = {}
# Misc
NETWORK_VERSION = 20
SOFTWARE_VERSION = 486
SOFTWARE_VERSION = 487
CLIENT_API_VERSION = 31
SERVER_THUMBNAIL_DIMENSIONS = ( 200, 200 )