Version 329
|
@ -8,6 +8,49 @@
|
|||
<div class="content">
|
||||
<h3>changelog</h3>
|
||||
<ul>
|
||||
<li><h3>version 329</h3></li>
|
||||
<ul>
|
||||
<li>login:</li>
|
||||
<li>the login manager is fully turned on! hentai-foundry click-through and pixiv login now occur fully on the new system</li>
|
||||
<li>wrote a Deviant Art login script for NSFW downloading--however, it only seems to work on a client that has done some logged-out downloading first (otherwise it thinks you are a robot)</li>
|
||||
<li>updated the DA file page parser to only NSFW-veto if the user is currently logged out</li>
|
||||
<li>wrote a danbooru login script for user prefs and special files if you have a gold account</li>
|
||||
<li>wrote a gelbooru 0.2.x login script for user prefs</li>
|
||||
<li>pixiv recently(?) allowed non-logged-in users to see sfw content, so the login script is updated to reflect this. the login script doesn't detect a failed login any more, so I will revisit this</li>
|
||||
<li>logging in in the regular order of things now makes a temporary popup message with the overall login status and final result. it is cancellable--and if cancelled, future login attempts will be delayed</li>
|
||||
<li>logging in in the regular order of things now prints simple started/result lines to the log</li>
|
||||
<li>deleted old network->login menu and related code such as the custom pixiv login management. gdpr click-through is now under downloaders</li>
|
||||
<li>subscription login errors will now specify the given login failure reason</li>
|
||||
<li>subscription login tests will now occur at a better time, guaranteeing the sub will be correctly saved paused if the test fails</li>
|
||||
<li>login errors will now always specify the domain for which they failed</li>
|
||||
<li>testing a login script on a fresh edit login script dialog now pre-fills the alphabetically first example domain</li>
|
||||
<li>the login script test ui now restores its 'run test' button correctly if the test is abandoned early</li>
|
||||
<li>misc improvements to login error handling and reporting</li>
|
||||
<li>.</li>
|
||||
<li>other:</li>
|
||||
<li>any texts across the program that ellipsize when they are too thin to display what they have will now tooltip their text (this most importantly includes the status on the network job control, which will now display full login problem info)</li>
|
||||
<li>the copy button on manage tags goes back to copying all if no tags are selected</li>
|
||||
<li>the remove button on manage tags now removes only selected if some tags are selected. it still removes all if none are selected</li>
|
||||
<li>the remove button on manage tags is now wrapped in a yes/no dialog (as is hitting the delete key on the list's selection). this can be turned off under the cog button</li>
|
||||
<li>filename tagging panels now support directory tagging for the last, second last, and third last directories. the related code for handling directory tagging is cleaned up significantly</li>
|
||||
<li>the export files panel now lets you delete the files from the client after export. this value will be remembered, and if on will prompt a capital letters warning on export, either via the button or the quick-export shortcut</li>
|
||||
<li>in manage tag parents, where there are multiple parents in a pending action (either by importing via clipboard/file or by putting multiple parents in the right-hand box), the action will now be treated as one transaction with one 'enter a reason' confirmation!</li>
|
||||
<li>in manage tag siblings, when multiple 'better' values are pended in one action via a clipboard/file import, they will now be treated as one transaction with one 'enter a reason' confirmation!</li>
|
||||
<li>.</li>
|
||||
<li>misc:</li>
|
||||
<li>added a new url class that api-links .gifv-style imgur links so they are downloadable like regular imgur single media pages</li>
|
||||
<li>the pixiv manga page url class now redirects to the new api, so mode=manga pages should now be drag-and-drop importable and generally downloadable if you have any still hanging around in any queues</li>
|
||||
<li>clients now come with an additional danbooru parser that fetches the webm version of ugoiras</li>
|
||||
<li>after discovering a pdf that ate indefinite 100% CPU while trying to parse, I have decided to stop pulling num_words for pdfs. it was always a super inaccurate number, so let's wait for a better solution at a later date. hydrus hence no longer requires pypdf2</li>
|
||||
<li>fixed an issue with monthly bandwidth estimates rolling over to the new year incorrectly</li>
|
||||
<li>in an attempt to chase down a duplicate files content move/copy bug, the duplicate action content updates got a bit of cleanup work. if you have noticed duplicate actions not copying tags/urls, please let me know the exact process you went through in the ui, including services and merge options</li>
|
||||
<li>tag lists should now update their sibling appearance correctly after a tag siblings dialog ok--previously, they were checking for new sibs too early</li>
|
||||
<li>tag siblings and parents should now refresh their data more efficiently when spammed with new data notifications (this usually happens janitor-side, when approving dozens at once)</li>
|
||||
<li>the 'copy queries/watcher urls' commands on the download pages' lists' right-click menus no longer double-space the copied texts (they now use single spaces)</li>
|
||||
<li>fixed an issue where certain initialised watchers were erroring out when asked to provide next-check time estimates--in all cases, null timestamps will be dealt with better here</li>
|
||||
<li>misc tag parents/siblings ui code cleanup</li>
|
||||
<li>wrote some code to catch and report on an unusual dialog dismissal error</li>
|
||||
</ul>
|
||||
<li><h3>version 328</h3></li>
|
||||
<ul>
|
||||
<li>wrote test ui for edit login script panel</li>
|
||||
|
|
|
@ -27,7 +27,7 @@
|
|||
<p>That '. venv/bin/activate' line turns your venv on, and will be needed every time you run the client.pyw/server.py files. You can easily tuck it into a launch script.</p>
|
||||
<p>After that, you can go nuts with pip. I think this will do for most systems:</p>
|
||||
<ul>
|
||||
<li>pip install beautifulsoup4 html5lib lxml lz4 nose numpy opencv-python six pafy Pillow psutil pycrypto pylzma PyOpenSSL PyPDF2 PyYAML requests Send2Trash service_identity twisted youtube-dl</li>
|
||||
</ul>
|
||||
<p>You may want to do all that in smaller batches.</p>
|
||||
<p>Depending on your OS, you might need something else. Ultimately, the best way to figure out if you have enough for hydrus is to just keep running client.pyw and see what it complains about missing. If you are not familiar with pip and get an error about a library already existing, know that you can update an existing library with the --upgrade switch, like so:</p>
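<p>Rather than repeatedly launching client.pyw to discover missing libraries one at a time, you can probe them all in one go. A rough sketch (the pip-package-to-module-name mapping here is an assumption, not an official hydrus list):</p>

```python
import importlib

# pip package name -> importable module name (these mappings are assumptions)
PACKAGES = {
    'beautifulsoup4' : 'bs4',
    'lxml' : 'lxml',
    'lz4' : 'lz4',
    'numpy' : 'numpy',
    'opencv-python' : 'cv2',
    'Pillow' : 'PIL',
    'psutil' : 'psutil',
    'PyYAML' : 'yaml',
    'requests' : 'requests',
    'Send2Trash' : 'send2trash',
    'twisted' : 'twisted'
}

def missing_packages():
    
    # try each import and collect the pip names that failed
    missing = []
    
    for ( package, module ) in PACKAGES.items():
        
        try:
            
            importlib.import_module( module )
            
        except ImportError:
            
            missing.append( package )
            
        
    
    return missing

if __name__ == '__main__':
    
    print( 'still missing: ' + ', '.join( missing_packages() ) )
    
```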
|
||||
|
|
|
@ -2595,6 +2595,7 @@ class TagParentsManager( object ):
|
|||
self._controller = controller
|
||||
|
||||
self._dirty = False
|
||||
self._refresh_job = None
|
||||
|
||||
self._service_keys_to_children_to_parents = collections.defaultdict( HydrusData.default_dict_list )
|
||||
|
||||
|
@ -2729,7 +2730,12 @@ class TagParentsManager( object ):
|
|||
|
||||
self._dirty = True
|
||||
|
||||
self._controller.CallLater( 1.0, self.RefreshParentsIfDirty )
|
||||
if self._refresh_job is not None:
|
||||
|
||||
self._refresh_job.Cancel()
|
||||
|
||||
|
||||
self._refresh_job = self._controller.CallLater( 1.0, self.RefreshParentsIfDirty )
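The pattern above--cancel any pending job, then schedule a fresh one--is a standard debounce, so a burst of dirty notifications collapses into a single refresh. A minimal standalone sketch of the same idea, using threading.Timer in place of the controller's CallLater (which is not shown in this diff):

```python
import threading

class DebouncedRefresher( object ):
    
    def __init__( self, delay, callback ):
        
        self._delay = delay
        self._callback = callback
        self._refresh_job = None
        
    
    def NotifyDirty( self ):
        
        # cancelling the previous timer means a burst of notifications
        # results in one refresh, 'delay' seconds after the last notification
        if self._refresh_job is not None:
            
            self._refresh_job.cancel()
            
        
        self._refresh_job = threading.Timer( self._delay, self._callback )
        
        self._refresh_job.start()
        
    
```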
|
||||
|
||||
|
||||
|
||||
|
@ -2753,6 +2759,7 @@ class TagSiblingsManager( object ):
|
|||
self._controller = controller
|
||||
|
||||
self._dirty = False
|
||||
self._refresh_job = None
|
||||
|
||||
self._service_keys_to_siblings = collections.defaultdict( dict )
|
||||
self._service_keys_to_reverse_lookup = collections.defaultdict( dict )
|
||||
|
@ -3059,7 +3066,12 @@ class TagSiblingsManager( object ):
|
|||
|
||||
self._dirty = True
|
||||
|
||||
self._controller.CallLater( 1.0, self.RefreshSiblingsIfDirty )
|
||||
if self._refresh_job is not None:
|
||||
|
||||
self._refresh_job.Cancel()
|
||||
|
||||
|
||||
self._refresh_job = self._controller.CallLater( 1.0, self.RefreshSiblingsIfDirty )
|
||||
|
||||
|
||||
|
||||
|
@ -3073,6 +3085,8 @@ class TagSiblingsManager( object ):
|
|||
|
||||
self._dirty = False
|
||||
|
||||
self._controller.pub( 'notify_new_siblings_gui' )
|
||||
|
||||
|
||||
|
||||
|
||||
|
|
|
@ -1957,9 +1957,9 @@ class DB( HydrusDB.HydrusDB ):
|
|||
|
||||
def _CacheSimilarFilesSetDuplicatePairStatus( self, pair_info ):
|
||||
|
||||
for ( duplicate_type, hash_a, hash_b, list_of_service_keys_to_content_updates ) in pair_info:
|
||||
for ( duplicate_type, hash_a, hash_b, service_keys_to_content_updates ) in pair_info:
|
||||
|
||||
for service_keys_to_content_updates in list_of_service_keys_to_content_updates:
|
||||
if len( service_keys_to_content_updates ) > 0:
|
||||
|
||||
self._ProcessContentUpdates( service_keys_to_content_updates )
|
||||
|
||||
|
@ -3166,7 +3166,6 @@ class DB( HydrusDB.HydrusDB ):
|
|||
|
||||
self.pub_after_job( 'notify_new_pending' )
|
||||
self.pub_after_job( 'notify_new_siblings_data' )
|
||||
self.pub_after_job( 'notify_new_siblings_gui' )
|
||||
self.pub_after_job( 'notify_new_parents' )
|
||||
|
||||
self.pub_service_updates_after_commit( { service_key : [ HydrusData.ServiceUpdate( HC.SERVICE_UPDATE_DELETE_PENDING ) ] } )
|
||||
|
@ -8127,7 +8126,6 @@ class DB( HydrusDB.HydrusDB ):
|
|||
if notify_new_siblings:
|
||||
|
||||
self.pub_after_job( 'notify_new_siblings_data' )
|
||||
self.pub_after_job( 'notify_new_siblings_gui' )
|
||||
self.pub_after_job( 'notify_new_parents' )
|
||||
|
||||
elif notify_new_parents:
|
||||
|
@ -10382,7 +10380,7 @@ class DB( HydrusDB.HydrusDB ):
|
|||
|
||||
#
|
||||
|
||||
message = 'If you are an EU/EEA user and are subject to GDPR, the tumblr downloader has likely broken for you. If so, please hit _network->logins->DEBUG: misc->do tumblr GDPR click-through_ to restore tumblr downloader functionality.'
|
||||
message = 'If you are an EU/EEA user and are subject to GDPR, the tumblr downloader has likely broken for you. If so, please hit _network->downloaders->DEBUG->do tumblr GDPR click-through_ to restore tumblr downloader functionality.'
|
||||
|
||||
self.pub_initial_message( message )
|
||||
|
||||
|
@ -11209,6 +11207,56 @@ class DB( HydrusDB.HydrusDB ):
|
|||
|
||||
|
||||
|
||||
if version == 328:
|
||||
|
||||
try:
|
||||
|
||||
domain_manager = self._GetJSONDump( HydrusSerialisable.SERIALISABLE_TYPE_NETWORK_DOMAIN_MANAGER )
|
||||
|
||||
domain_manager.Initialise()
|
||||
|
||||
#
|
||||
|
||||
domain_manager.OverwriteDefaultURLMatches( [ 'imgur single media page gifv format', 'pixiv manga page' ] )
|
||||
|
||||
#
|
||||
|
||||
domain_manager.OverwriteDefaultParsers( [ 'danbooru file page parser', 'danbooru file page parser - get webm ugoira', 'deviant art file page parser' ] )
|
||||
|
||||
#
|
||||
|
||||
domain_manager.TryToLinkURLMatchesAndParsers()
|
||||
|
||||
#
|
||||
|
||||
self._SetJSONDump( domain_manager )
|
||||
|
||||
#
|
||||
|
||||
login_manager = self._GetJSONDump( HydrusSerialisable.SERIALISABLE_TYPE_NETWORK_LOGIN_MANAGER )
|
||||
|
||||
login_manager.Initialise()
|
||||
|
||||
#
|
||||
|
||||
login_manager.OverwriteDefaultLoginScripts( [ 'pixiv login', 'danbooru login', 'gelbooru 0.2.x login', 'deviant art login (only works on a client that has already done some downloading)' ] )
|
||||
|
||||
#
|
||||
|
||||
self._SetJSONDump( login_manager )
|
||||
|
||||
self._c.execute( 'DELETE FROM json_dict WHERE name = ?;', ( 'pixiv_account', ) )
|
||||
|
||||
except Exception as e:
|
||||
|
||||
HydrusData.PrintException( e )
|
||||
|
||||
message = 'Trying to update some url classes and parsers failed! Please let hydrus dev know!'
|
||||
|
||||
self.pub_initial_message( message )
|
||||
|
||||
|
||||
|
||||
self._controller.pub( 'splash_set_title_text', 'updated db to v' + str( version + 1 ) )
|
||||
|
||||
self._c.execute( 'UPDATE version SET version = ?;', ( version + 1, ) )
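The update step above follows hydrus's usual versioned-migration shape: gate on the current version, wrap the risky work in a try/except that reports rather than aborts, then bump the version row. A generic sketch of that shape against a bare sqlite3 database (the migration content here is illustrative; only the version table mirrors the real schema):

```python
import sqlite3

def update_database( db, migrations ):
    
    # migrations maps { version : callable taking the cursor }
    c = db.cursor()
    
    ( version, ) = c.execute( 'SELECT version FROM version;' ).fetchone()
    
    while version in migrations:
        
        try:
            
            migrations[ version ]( c )
            
        except Exception as e:
            
            # like hydrus, report the failure but still advance the version
            print( 'migration from {} failed: {}'.format( version, e ) )
            
        
        c.execute( 'UPDATE version SET version = ?;', ( version + 1, ) )
        
        db.commit()
        
        version += 1
        
    
    return version
```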
|
||||
|
|
|
@ -1,4 +1,5 @@
|
|||
import ClientConstants as CC
|
||||
import collections
|
||||
import HydrusConstants as HC
|
||||
import HydrusData
|
||||
import HydrusExceptions
|
||||
|
@ -140,15 +141,13 @@ class DuplicateActionOptions( HydrusSerialisable.SerialisableBase ):
|
|||
|
||||
def ProcessPairIntoContentUpdates( self, first_media, second_media ):
|
||||
|
||||
list_of_service_keys_to_content_updates = []
|
||||
service_keys_to_content_updates = collections.defaultdict( list )
|
||||
|
||||
first_hashes = first_media.GetHashes()
|
||||
second_hashes = second_media.GetHashes()
|
||||
|
||||
#
|
||||
|
||||
service_keys_to_content_updates = {}
|
||||
|
||||
services_manager = HG.client_controller.services_manager
|
||||
|
||||
for ( service_key, action, tag_filter ) in self._tag_service_actions:
|
||||
|
@ -205,7 +204,7 @@ class DuplicateActionOptions( HydrusSerialisable.SerialisableBase ):
|
|||
|
||||
if len( content_updates ) > 0:
|
||||
|
||||
service_keys_to_content_updates[ service_key ] = content_updates
|
||||
service_keys_to_content_updates[ service_key ].extend( content_updates )
|
||||
|
||||
|
||||
|
||||
|
@ -268,40 +267,28 @@ class DuplicateActionOptions( HydrusSerialisable.SerialisableBase ):
|
|||
|
||||
if len( content_updates ) > 0:
|
||||
|
||||
service_keys_to_content_updates[ service_key ] = content_updates
|
||||
service_keys_to_content_updates[ service_key ].extend( content_updates )
|
||||
|
||||
|
||||
|
||||
if len( service_keys_to_content_updates ) > 0:
|
||||
|
||||
list_of_service_keys_to_content_updates.append( service_keys_to_content_updates )
|
||||
|
||||
|
||||
#
|
||||
|
||||
service_keys_to_content_updates = {}
|
||||
|
||||
if self._sync_archive:
|
||||
|
||||
if first_media.HasInbox() and second_media.HasArchive():
|
||||
|
||||
content_update = HydrusData.ContentUpdate( HC.CONTENT_TYPE_FILES, HC.CONTENT_UPDATE_ARCHIVE, first_hashes )
|
||||
|
||||
service_keys_to_content_updates[ CC.COMBINED_LOCAL_FILE_SERVICE_KEY ] = [ content_update ]
|
||||
service_keys_to_content_updates[ CC.COMBINED_LOCAL_FILE_SERVICE_KEY ].append( content_update )
|
||||
|
||||
elif first_media.HasArchive() and second_media.HasInbox():
|
||||
|
||||
content_update = HydrusData.ContentUpdate( HC.CONTENT_TYPE_FILES, HC.CONTENT_UPDATE_ARCHIVE, second_hashes )
|
||||
|
||||
service_keys_to_content_updates[ CC.COMBINED_LOCAL_FILE_SERVICE_KEY ] = [ content_update ]
|
||||
service_keys_to_content_updates[ CC.COMBINED_LOCAL_FILE_SERVICE_KEY ].append( content_update )
|
||||
|
||||
|
||||
|
||||
if len( service_keys_to_content_updates ) > 0:
|
||||
|
||||
list_of_service_keys_to_content_updates.append( service_keys_to_content_updates )
|
||||
|
||||
|
||||
#
|
||||
|
||||
if self._sync_urls_action is not None:
|
||||
|
@ -338,16 +325,12 @@ class DuplicateActionOptions( HydrusSerialisable.SerialisableBase ):
|
|||
|
||||
if len( content_updates ) > 0:
|
||||
|
||||
service_keys_to_content_updates = { CC.COMBINED_LOCAL_FILE_SERVICE_KEY : content_updates }
|
||||
|
||||
list_of_service_keys_to_content_updates.append( service_keys_to_content_updates )
|
||||
service_keys_to_content_updates[ CC.COMBINED_LOCAL_FILE_SERVICE_KEY ].extend( content_updates )
|
||||
|
||||
|
||||
|
||||
#
|
||||
|
||||
service_keys_to_content_updates = {}
|
||||
|
||||
deletee_media = []
|
||||
|
||||
if self._delete_second_file or self._delete_both_files:
|
||||
|
@ -379,25 +362,15 @@ class DuplicateActionOptions( HydrusSerialisable.SerialisableBase ):
|
|||
|
||||
if deletee_service_key is not None:
|
||||
|
||||
if deletee_service_key not in service_keys_to_content_updates:
|
||||
|
||||
service_keys_to_content_updates[ deletee_service_key ] = []
|
||||
|
||||
|
||||
content_update = HydrusData.ContentUpdate( HC.CONTENT_TYPE_FILES, HC.CONTENT_UPDATE_DELETE, media.GetHashes() )
|
||||
|
||||
service_keys_to_content_updates[ deletee_service_key ].append( content_update )
|
||||
|
||||
|
||||
|
||||
if len( service_keys_to_content_updates ) > 0:
|
||||
|
||||
list_of_service_keys_to_content_updates.append( service_keys_to_content_updates )
|
||||
|
||||
|
||||
#
|
||||
|
||||
return list_of_service_keys_to_content_updates
|
||||
return service_keys_to_content_updates
|
||||
|
||||
|
||||
HydrusSerialisable.SERIALISABLE_TYPES_TO_OBJECT_TYPES[ HydrusSerialisable.SERIALISABLE_TYPE_DUPLICATE_ACTION_OPTIONS ] = DuplicateActionOptions
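The net effect of this refactor is that ProcessPairIntoContentUpdates now accumulates everything into a single service_keys_to_content_updates dict rather than a list of per-step dicts, so callers can apply one transaction. A toy sketch of the accumulation pattern with collections.defaultdict (the service keys and update values here are placeholders, not real hydrus objects):

```python
import collections

def merge_updates( update_batches ):
    
    # each batch maps service_key -> list of content updates;
    # extend() rather than assignment means later batches for the
    # same service no longer overwrite earlier ones
    service_keys_to_content_updates = collections.defaultdict( list )
    
    for batch in update_batches:
        
        for ( service_key, content_updates ) in batch.items():
            
            service_keys_to_content_updates[ service_key ].extend( content_updates )
            
        
    
    return service_keys_to_content_updates
```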
|
||||
|
|
|
@ -24,6 +24,7 @@ import ClientGUITopLevelWindows
|
|||
import ClientDownloading
|
||||
import ClientMedia
|
||||
import ClientNetworkingContexts
|
||||
import ClientNetworkingJobs
|
||||
import ClientParsing
|
||||
import ClientPaths
|
||||
import ClientRendering
|
||||
|
@ -1742,7 +1743,13 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
|
|||
|
||||
ClientGUIMenus.AppendSeparator( submenu )
|
||||
|
||||
ClientGUIMenus.AppendMenuItem( self, submenu, 'UNDER CONSTRUCTION: manage logins', 'Edit which domains you wish to log in to.', self._ManageLogins )
|
||||
ClientGUIMenus.AppendMenuItem( self, submenu, 'READY FOR CAUTIOUS USE: manage logins', 'Edit which domains you wish to log in to.', self._ManageLogins )
|
||||
|
||||
debug_menu = wx.Menu()
|
||||
|
||||
ClientGUIMenus.AppendMenuItem( self, debug_menu, 'do tumblr GDPR click-through', 'Do a manual click-through for the tumblr GDPR page.', self._controller.CallLater, 0.0, self._controller.network_engine.login_manager.LoginTumblrGDPR )
|
||||
|
||||
ClientGUIMenus.AppendMenu( submenu, debug_menu, 'DEBUG' )
|
||||
|
||||
ClientGUIMenus.AppendMenu( menu, submenu, 'downloaders' )
|
||||
|
||||
|
@ -1754,6 +1761,8 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
|
|||
ClientGUIMenus.AppendMenuItem( self, submenu, 'manage url classes', 'Configure which URLs the client can recognise.', self._ManageURLMatches )
|
||||
ClientGUIMenus.AppendMenuItem( self, submenu, 'manage parsers', 'Manage the client\'s parsers, which convert URL content into hydrus metadata.', self._ManageParsers )
|
||||
|
||||
ClientGUIMenus.AppendMenuItem( self, submenu, 'manage login scripts', 'Manage the client\'s login scripts, which define how to log in to different sites.', self._ManageLoginScripts )
|
||||
|
||||
ClientGUIMenus.AppendSeparator( submenu )
|
||||
|
||||
ClientGUIMenus.AppendMenuItem( self, submenu, 'manage url class links', 'Configure how URLs present across the client.', self._ManageURLMatchLinks )
|
||||
|
@ -1761,37 +1770,12 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
|
|||
|
||||
ClientGUIMenus.AppendSeparator( submenu )
|
||||
|
||||
ClientGUIMenus.AppendMenuItem( self, submenu, 'UNDER CONSTRUCTION: manage login scripts', 'Manage the client\'s login scripts, which define how to log in to different sites.', self._ManageLoginScripts )
|
||||
|
||||
ClientGUIMenus.AppendSeparator( submenu )
|
||||
|
||||
ClientGUIMenus.AppendMenuItem( self, submenu, 'SEMI-LEGACY: manage file lookup scripts', 'Manage how the client parses different types of web content.', self._ManageParsingScripts )
|
||||
|
||||
ClientGUIMenus.AppendMenu( menu, submenu, 'downloader definitions' )
|
||||
|
||||
#
|
||||
|
||||
submenu = wx.Menu()
|
||||
|
||||
ClientGUIMenus.AppendMenuItem( self, submenu, 'manage pixiv account', 'Set up your pixiv username and password.', self._ManagePixivAccount )
|
||||
|
||||
reset_login_menu = wx.Menu()
|
||||
|
||||
ClientGUIMenus.AppendMenuItem( self, reset_login_menu, 'pixiv', 'Reset pixiv session.', self._controller.network_engine.session_manager.ClearSession, ClientNetworkingContexts.NetworkContext( CC.NETWORK_CONTEXT_DOMAIN, 'pixiv.net' ) )
|
||||
ClientGUIMenus.AppendMenuItem( self, reset_login_menu, 'hentai foundry', 'Reset HF session.', self._controller.network_engine.session_manager.ClearSession, ClientNetworkingContexts.NetworkContext( CC.NETWORK_CONTEXT_DOMAIN, 'hentai-foundry.com' ) )
|
||||
|
||||
ClientGUIMenus.AppendMenu( submenu, reset_login_menu, 'DEBUG: reset login' )
|
||||
|
||||
debug_menu = wx.Menu()
|
||||
|
||||
ClientGUIMenus.AppendMenuItem( self, debug_menu, 'do tumblr GDPR click-through', 'Do a manual click-through for the tumblr GDPR page.', self._controller.CallLater, 0.0, self._controller.network_engine.login_manager.LoginTumblrGDPR )
|
||||
|
||||
ClientGUIMenus.AppendMenu( submenu, debug_menu, 'DEBUG: misc' )
|
||||
|
||||
ClientGUIMenus.AppendMenu( menu, submenu, 'logins' )
|
||||
|
||||
#
|
||||
|
||||
return ( menu, '&network', True )
|
||||
|
||||
|
||||
|
@ -2625,11 +2609,6 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
|
|||
|
||||
|
||||
|
||||
def _ManagePixivAccount( self ):
|
||||
|
||||
with ClientGUIDialogsManage.DialogManagePixivAccount( self ) as dlg: dlg.ShowModal()
|
||||
|
||||
|
||||
def _ManageServer( self, service_key ):
|
||||
|
||||
title = 'manage server services'
|
||||
|
|
|
@ -3008,7 +3008,7 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
|
|||
self._unprocessed_pairs = []
|
||||
self._current_pair = None
|
||||
self._processed_pairs = []
|
||||
self._batch_skip_hashes = set()
|
||||
self._hashes_due_to_be_deleted_in_this_batch = set()
|
||||
|
||||
self._media_list = ClientMedia.ListeningMediaList( self._file_service_key, [] )
|
||||
|
||||
|
@ -3078,9 +3078,9 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
|
|||
first_hash = first_media.GetHash()
|
||||
second_hash = second_media.GetHash()
|
||||
|
||||
list_of_service_keys_to_content_updates = duplicate_action_options.ProcessPairIntoContentUpdates( first_media, second_media )
|
||||
service_keys_to_content_updates = duplicate_action_options.ProcessPairIntoContentUpdates( first_media, second_media )
|
||||
|
||||
pair_info.append( ( duplicate_type, first_hash, second_hash, list_of_service_keys_to_content_updates ) )
|
||||
pair_info.append( ( duplicate_type, first_hash, second_hash, service_keys_to_content_updates ) )
|
||||
|
||||
|
||||
if len( pair_info ) > 0:
|
||||
|
@ -3089,7 +3089,7 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
|
|||
|
||||
|
||||
self._processed_pairs = []
|
||||
self._batch_skip_hashes = set()
|
||||
self._hashes_due_to_be_deleted_in_this_batch = set()
|
||||
|
||||
|
||||
def _CurrentMediaIsBetter( self ):
|
||||
|
@ -3325,7 +3325,7 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
|
|||
self._unprocessed_pairs.append( hash_pair )
|
||||
|
||||
|
||||
self._batch_skip_hashes.difference_update( hash_pair )
|
||||
self._hashes_due_to_be_deleted_in_this_batch.difference_update( hash_pair )
|
||||
|
||||
self._ShowNewPair()
|
||||
|
||||
|
@ -3364,7 +3364,7 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
|
|||
|
||||
deleted_hashes = duplicate_action_options.GetDeletedHashes( self._current_media, other_media )
|
||||
|
||||
self._batch_skip_hashes.update( deleted_hashes )
|
||||
self._hashes_due_to_be_deleted_in_this_batch.update( deleted_hashes )
|
||||
|
||||
was_auto_skipped = False
|
||||
|
||||
|
@ -3407,14 +3407,14 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
|
|||
self._unprocessed_pairs.append( hash_pair )
|
||||
|
||||
|
||||
self._batch_skip_hashes.difference_update( hash_pair )
|
||||
self._hashes_due_to_be_deleted_in_this_batch.difference_update( hash_pair )
|
||||
|
||||
|
||||
|
||||
|
||||
if len( self._unprocessed_pairs ) == 0:
|
||||
|
||||
self._batch_skip_hashes = set()
|
||||
self._hashes_due_to_be_deleted_in_this_batch = set()
|
||||
self._processed_pairs = [] # just in case someone 'skip'ed everything in the last batch, so this never got cleared above
|
||||
|
||||
self.SetMedia( None )
|
||||
|
@ -3432,7 +3432,7 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
|
|||
|
||||
( first_hash, second_hash ) = pair
|
||||
|
||||
if first_hash in self._batch_skip_hashes or second_hash in self._batch_skip_hashes:
|
||||
if first_hash in self._hashes_due_to_be_deleted_in_this_batch or second_hash in self._hashes_due_to_be_deleted_in_this_batch:
|
||||
|
||||
return False
|
||||
|
||||
|
|
|
@ -871,10 +871,17 @@ class BetterRadioBox( wx.RadioBox ):
|
|||
|
||||
class BetterStaticText( wx.StaticText ):
|
||||
|
||||
def __init__( self, parent, label = None, **kwargs ):
|
||||
def __init__( self, parent, label = None, tooltip_label = False, **kwargs ):
|
||||
|
||||
wx.StaticText.__init__( self, parent, **kwargs )
|
||||
|
||||
self._tooltip_label = tooltip_label
|
||||
|
||||
if 'style' in kwargs and kwargs[ 'style' ] & wx.ST_ELLIPSIZE_END:
|
||||
|
||||
self._tooltip_label = True
|
||||
|
||||
|
||||
self._last_set_text = '' # we want a separate copy since the one we'll send to the st will be wrapped and have additional '\n's
|
||||
|
||||
self._wrap_width = None
|
||||
|
@ -901,6 +908,11 @@ class BetterStaticText( wx.StaticText ):
|
|||
self.Wrap( self._wrap_width )
|
||||
|
||||
|
||||
if self._tooltip_label:
|
||||
|
||||
self.SetToolTip( text )
|
||||
|
||||
|
||||
|
||||
|
||||
def SetWrapWidth( self, wrap_width ):
|
||||
|
|
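BetterStaticText keeps a separate copy of the unwrapped text because wx's Wrap() inserts newlines into the displayed label, and the tooltip should show the original. A wx-free sketch of that bookkeeping, with the wx calls replaced by textwrap and plain attributes (class and attribute names here are illustrative):

```python
import textwrap

class WrappedLabel( object ):
    
    def __init__( self, wrap_width = None, tooltip_label = False ):
        
        self._wrap_width = wrap_width
        self._tooltip_label = tooltip_label
        
        self._last_set_text = ''
        self._display_text = ''
        self._tooltip = None
        
    
    def SetText( self, text ):
        
        self._last_set_text = text # separate unwrapped copy, as in BetterStaticText
        
        if self._wrap_width is None:
            
            self._display_text = text
            
        else:
            
            # the displayed text gains '\n's here, so it is no good for the tooltip
            self._display_text = '\n'.join( textwrap.wrap( text, self._wrap_width ) )
            
        
        if self._tooltip_label:
            
            self._tooltip = text # the tooltip gets the clean, unwrapped text
            
        
    
```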
|
@ -2392,126 +2392,6 @@ class DialogManageImportFoldersEdit( ClientGUIDialogs.Dialog ):
|
|||
return self._import_folder
|
||||
|
||||
|
||||
class DialogManagePixivAccount( ClientGUIDialogs.Dialog ):
|
||||
|
||||
def __init__( self, parent ):
|
||||
|
||||
ClientGUIDialogs.Dialog.__init__( self, parent, 'manage pixiv account' )
|
||||
|
||||
self._id = wx.TextCtrl( self )
|
||||
self._password = wx.TextCtrl( self )
|
||||
|
||||
self._status = ClientGUICommon.BetterStaticText( self )
|
||||
|
||||
self._test = wx.Button( self, label = 'test' )
|
||||
self._test.Bind( wx.EVT_BUTTON, self.EventTest )
|
||||
|
||||
self._ok = wx.Button( self, id = wx.ID_OK, label = 'OK' )
|
||||
self._ok.Bind( wx.EVT_BUTTON, self.EventOK )
|
||||
self._ok.SetForegroundColour( ( 0, 128, 0 ) )
|
||||
|
||||
self._cancel = wx.Button( self, id = wx.ID_CANCEL, label = 'Cancel' )
|
||||
self._cancel.SetForegroundColour( ( 128, 0, 0 ) )
|
||||
|
||||
#
|
||||
|
||||
result = HG.client_controller.Read( 'serialisable_simple', 'pixiv_account' )
|
||||
|
||||
if result is None:
|
||||
|
||||
result = ( '', '' )
|
||||
|
||||
|
||||
( id, password ) = result
|
||||
|
||||
self._id.SetValue( id )
|
||||
self._password.SetValue( password )
|
||||
|
||||
#
|
||||
|
||||
rows = []
|
||||
|
||||
rows.append( ( 'id/email: ', self._id ) )
|
||||
rows.append( ( 'password: ', self._password ) )
|
||||
|
||||
gridbox = ClientGUICommon.WrapInGrid( self, rows )
|
||||
|
||||
b_box = wx.BoxSizer( wx.HORIZONTAL )
|
||||
b_box.Add( self._ok, CC.FLAGS_VCENTER )
|
||||
b_box.Add( self._cancel, CC.FLAGS_VCENTER )
|
||||
|
||||
vbox = wx.BoxSizer( wx.VERTICAL )
|
||||
|
||||
text = 'In order to search and download from Pixiv, the client needs to log in to it.'
|
||||
text += os.linesep
|
||||
text += 'Until you put something in here, you will not see the option to download from Pixiv.'
|
||||
text += os.linesep
|
||||
text += 'You can use a throwaway account if you want--this only needs to log in.'
|
||||
|
||||
vbox.Add( ClientGUICommon.BetterStaticText( self, text ), CC.FLAGS_EXPAND_PERPENDICULAR )
|
||||
vbox.Add( gridbox, CC.FLAGS_EXPAND_SIZER_PERPENDICULAR )
|
||||
vbox.Add( self._status, CC.FLAGS_EXPAND_PERPENDICULAR )
|
||||
vbox.Add( self._test, CC.FLAGS_EXPAND_PERPENDICULAR )
|
||||
vbox.Add( b_box, CC.FLAGS_BUTTON_SIZER )
|
||||
|
||||
self.SetSizer( vbox )
|
||||
|
||||
( x, y ) = self.GetEffectiveMinSize()
|
||||
|
||||
x = max( x, 240 )
|
||||
|
||||
self.SetInitialSize( ( x, y ) )
|
||||
|
||||
wx.CallAfter( self._ok.SetFocus )
|
||||
|
||||
|
||||
def EventOK( self, event ):
|
||||
|
||||
pixiv_id = self._id.GetValue()
|
||||
password = self._password.GetValue()
|
||||
|
||||
if pixiv_id == '' and password == '':
|
||||
|
||||
HG.client_controller.Write( 'serialisable_simple', 'pixiv_account', None )
|
||||
|
||||
else:
|
||||
|
||||
HG.client_controller.Write( 'serialisable_simple', 'pixiv_account', ( pixiv_id, password ) )
|
||||
|
||||
|
||||
self.EndModal( wx.ID_OK )
|
||||
|
||||
|
||||
def EventTest( self, event ):
|
||||
|
||||
pixiv_id = self._id.GetValue()
|
||||
password = self._password.GetValue()
|
||||
|
||||
try:
|
||||
|
||||
manager = HG.client_controller.network_engine.login_manager
|
||||
|
||||
( result, message ) = manager.TestPixiv( pixiv_id, password )
|
||||
|
||||
if result:
|
||||
|
||||
self._status.SetLabelText( 'OK!' )
|
||||
|
||||
HG.client_controller.CallLaterWXSafe( self._status, 5, self._status.SetLabel, '' )
|
||||
|
||||
else:
|
||||
|
||||
self._status.SetLabelText( message )
|
||||
|
||||
|
||||
except HydrusExceptions.ForbiddenException as e:
|
||||
|
||||
HydrusData.ShowException( e )
|
||||
|
||||
self._status.SetLabelText( 'Did not work! ' + repr( e ) )
|
||||
|
||||
|
||||
|
||||
class DialogManageRatings( ClientGUIDialogs.Dialog ):
|
||||
|
||||
def __init__( self, parent, media ):
|
||||
|
|
|
@ -547,24 +547,30 @@ class FilenameTaggingOptionsPanel( wx.Panel ):
|
|||
|
||||
self._filename_checkbox = wx.CheckBox( self._checkboxes_panel, label = 'add filename? [namespace]' )
|
||||
|
||||
self._dir_namespace_1 = wx.TextCtrl( self._checkboxes_panel )
|
||||
self._dir_namespace_1.SetMinSize( ( 100, -1 ) )
|
||||
self._directory_namespace_controls = {}
|
||||
|
||||
self._dir_checkbox_1 = wx.CheckBox( self._checkboxes_panel, label = 'add first directory? [namespace]' )
|
||||
directory_items = []
|
||||
|
||||
self._dir_namespace_2 = wx.TextCtrl( self._checkboxes_panel )
|
||||
self._dir_namespace_2.SetMinSize( ( 100, -1 ) )
|
||||
directory_items.append( ( 0, 'first' ) )
|
||||
directory_items.append( ( 1, 'second' ) )
|
||||
directory_items.append( ( 2, 'third' ) )
|
||||
directory_items.append( ( -3, 'third last' ) )
|
||||
directory_items.append( ( -2, 'second last' ) )
|
||||
directory_items.append( ( -1, 'last' ) )
|
||||
|
||||
self._dir_checkbox_2 = wx.CheckBox( self._checkboxes_panel, label = 'add second directory? [namespace]' )
|
||||
|
||||
self._dir_namespace_3 = wx.TextCtrl( self._checkboxes_panel )
|
||||
self._dir_namespace_3.SetMinSize( ( 100, -1 ) )
|
||||
|
||||
self._dir_checkbox_3 = wx.CheckBox( self._checkboxes_panel, label = 'add third directory? [namespace]' )
|
||||
for ( index, phrase ) in directory_items:
|
||||
|
||||
dir_checkbox = wx.CheckBox( self._checkboxes_panel, label = 'add ' + phrase + ' directory? [namespace]' )
|
||||
|
||||
dir_namespace_textctrl = wx.TextCtrl( self._checkboxes_panel )
|
||||
dir_namespace_textctrl.SetMinSize( ( 100, -1 ) )
|
||||
|
||||
self._directory_namespace_controls[ index ] = ( dir_checkbox, dir_namespace_textctrl )
|
||||
|
||||
|
||||
#
|
||||
|
||||
( tags_for_all, load_from_neighbouring_txt_files, add_filename, add_first_directory, add_second_directory, add_third_directory ) = filename_tagging_options.SimpleToTuple()
|
||||
( tags_for_all, load_from_neighbouring_txt_files, add_filename, directory_dict ) = filename_tagging_options.SimpleToTuple()
|
||||
|
||||
self._tags.AddTags( tags_for_all )
|
||||
self._load_from_txt_files_checkbox.SetValue( load_from_neighbouring_txt_files )
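The indices 0, 1, 2, -3, -2, -1 used above map straight onto python sequence indexing of the path's directory components, which is what makes 'third last' and friends cheap to support. A sketch of how such an index plus a namespace could turn a file path into a tag (the function name and 'namespace:value' tag format are assumptions for illustration):

```python
import os

def directory_tag( path, index, namespace ):
    
    # split off the filename, then take the indexed directory component
    directories = os.path.dirname( path ).split( os.sep )
    
    if index >= len( directories ) or index < -len( directories ):
        
        return None # the path is too shallow for this index
        
    
    directory = directories[ index ]
    
    if namespace == '':
        
        return directory
        
    
    return namespace + ':' + directory
```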
|
||||
|
@ -574,20 +580,16 @@ class FilenameTaggingOptionsPanel( wx.Panel ):
self._filename_checkbox.SetValue( add_filename_boolean )
self._filename_namespace.SetValue( add_filename_namespace )

( dir_1_boolean, dir_1_namespace ) = add_first_directory

self._dir_checkbox_1.SetValue( dir_1_boolean )
self._dir_namespace_1.SetValue( dir_1_namespace )

( dir_2_boolean, dir_2_namespace ) = add_second_directory

self._dir_checkbox_2.SetValue( dir_2_boolean )
self._dir_namespace_2.SetValue( dir_2_namespace )

( dir_3_boolean, dir_3_namespace ) = add_third_directory

self._dir_checkbox_3.SetValue( dir_3_boolean )
self._dir_namespace_3.SetValue( dir_3_namespace )
for ( index, ( dir_boolean, dir_namespace ) ) in directory_dict.items():

( dir_checkbox, dir_namespace_textctrl ) = self._directory_namespace_controls[ index ]

dir_checkbox.SetValue( dir_boolean )
dir_namespace_textctrl.SetValue( dir_namespace )

dir_checkbox.Bind( wx.EVT_CHECKBOX, self.EventRefresh )
dir_namespace_textctrl.Bind( wx.EVT_TEXT, self.EventRefresh )

#
@ -609,26 +611,20 @@ class FilenameTaggingOptionsPanel( wx.Panel ):
filename_hbox.Add( self._filename_checkbox, CC.FLAGS_VCENTER )
filename_hbox.Add( self._filename_namespace, CC.FLAGS_EXPAND_BOTH_WAYS )

dir_hbox_1 = wx.BoxSizer( wx.HORIZONTAL )

dir_hbox_1.Add( self._dir_checkbox_1, CC.FLAGS_VCENTER )
dir_hbox_1.Add( self._dir_namespace_1, CC.FLAGS_EXPAND_BOTH_WAYS )

dir_hbox_2 = wx.BoxSizer( wx.HORIZONTAL )

dir_hbox_2.Add( self._dir_checkbox_2, CC.FLAGS_VCENTER )
dir_hbox_2.Add( self._dir_namespace_2, CC.FLAGS_EXPAND_BOTH_WAYS )

dir_hbox_3 = wx.BoxSizer( wx.HORIZONTAL )

dir_hbox_3.Add( self._dir_checkbox_3, CC.FLAGS_VCENTER )
dir_hbox_3.Add( self._dir_namespace_3, CC.FLAGS_EXPAND_BOTH_WAYS )

self._checkboxes_panel.Add( txt_hbox, CC.FLAGS_EXPAND_SIZER_PERPENDICULAR )
self._checkboxes_panel.Add( filename_hbox, CC.FLAGS_EXPAND_SIZER_PERPENDICULAR )
self._checkboxes_panel.Add( dir_hbox_1, CC.FLAGS_EXPAND_SIZER_PERPENDICULAR )
self._checkboxes_panel.Add( dir_hbox_2, CC.FLAGS_EXPAND_SIZER_PERPENDICULAR )
self._checkboxes_panel.Add( dir_hbox_3, CC.FLAGS_EXPAND_SIZER_PERPENDICULAR )

for index in ( 0, 1, 2, -3, -2, -1 ):

hbox = wx.BoxSizer( wx.HORIZONTAL )

( dir_checkbox, dir_namespace_textctrl ) = self._directory_namespace_controls[ index ]

hbox.Add( dir_checkbox, CC.FLAGS_VCENTER )
hbox.Add( dir_namespace_textctrl, CC.FLAGS_EXPAND_BOTH_WAYS )

self._checkboxes_panel.Add( hbox, CC.FLAGS_EXPAND_SIZER_PERPENDICULAR )

hbox = wx.BoxSizer( wx.HORIZONTAL )
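The fixed first/second/third directory controls above become a dict of controls keyed on path-component index, where 0, 1, 2 address the first directories of a path and -3, -2, -1 the last ones. A minimal sketch of that indexing convention (function and variable names here are illustrative, not from the hydrus codebase):

```python
import os

def get_directory_tags( path, directories_dict ):
    
    # split off the drive, then break the containing directory into components
    ( drive, directories ) = os.path.splitdrive( os.path.dirname( path ) )
    
    directories = [ d for d in directories.split( os.path.sep ) if d != '' ]
    
    tags = set()
    
    for ( index, ( dir_boolean, dir_namespace ) ) in directories_dict.items():
        
        if not dir_boolean:
            
            continue
            
        
        # 0, 1, 2 count from the front; -3, -2, -1 count from the back
        try:
            
            component = directories[ index ]
            
        except IndexError:
            
            continue # path too shallow for this slot
            
        
        if dir_namespace == '':
            
            tags.add( component )
            
        else:
            
            tags.add( dir_namespace + ':' + component )
            
        
    
    return tags
```

On a shallow path, a positive and a negative index can select the same component; since the result is a set, the tag is simply added once.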
@ -643,12 +639,6 @@ class FilenameTaggingOptionsPanel( wx.Panel ):
self._load_from_txt_files_checkbox.Bind( wx.EVT_CHECKBOX, self.EventRefresh )
self._filename_namespace.Bind( wx.EVT_TEXT, self.EventRefresh )
self._filename_checkbox.Bind( wx.EVT_CHECKBOX, self.EventRefresh )
self._dir_namespace_1.Bind( wx.EVT_TEXT, self.EventRefresh )
self._dir_checkbox_1.Bind( wx.EVT_CHECKBOX, self.EventRefresh )
self._dir_namespace_2.Bind( wx.EVT_TEXT, self.EventRefresh )
self._dir_checkbox_2.Bind( wx.EVT_CHECKBOX, self.EventRefresh )
self._dir_namespace_3.Bind( wx.EVT_TEXT, self.EventRefresh )
self._dir_checkbox_3.Bind( wx.EVT_CHECKBOX, self.EventRefresh )

def _GetTagsFromClipboard( self ):
@ -835,27 +825,16 @@ class FilenameTaggingOptionsPanel( wx.Panel ):
tags_for_all = self._tags.GetTags()
load_from_neighbouring_txt_files = self._load_from_txt_files_checkbox.GetValue()

add_filename_boolean = self._filename_checkbox.GetValue()
add_filename_namespace = self._filename_namespace.GetValue()
add_filename = ( self._filename_checkbox.GetValue(), self._filename_namespace.GetValue() )

add_filename = ( add_filename_boolean, add_filename_namespace )
directories_dict = {}

dir_1_boolean = self._dir_checkbox_1.GetValue()
dir_1_namespace = self._dir_namespace_1.GetValue()
for ( index, ( dir_checkbox, dir_namespace_textctrl ) ) in self._directory_namespace_controls.items():

directories_dict[ index ] = ( dir_checkbox.GetValue(), dir_namespace_textctrl.GetValue() )

add_first_directory = ( dir_1_boolean, dir_1_namespace )

dir_2_boolean = self._dir_checkbox_2.GetValue()
dir_2_namespace = self._dir_namespace_2.GetValue()

add_second_directory = ( dir_2_boolean, dir_2_namespace )

dir_3_boolean = self._dir_checkbox_3.GetValue()
dir_3_namespace = self._dir_namespace_3.GetValue()

add_third_directory = ( dir_3_boolean, dir_3_namespace )

filename_tagging_options.SimpleSetTuple( tags_for_all, load_from_neighbouring_txt_files, add_filename, add_first_directory, add_second_directory, add_third_directory )
filename_tagging_options.SimpleSetTuple( tags_for_all, load_from_neighbouring_txt_files, add_filename, directories_dict )
@ -518,10 +518,16 @@ class EditLoginsPanel( ClientGUIScrolledPanels.EditPanel ):
sort_login_script = login_script.GetName()

network_context = ClientNetworkingContexts.NetworkContext( context_type = CC.NETWORK_CONTEXT_DOMAIN, context_data = login_domain )

logged_in = login_script.IsLoggedIn( self._engine, network_context )

except HydrusExceptions.DataMissing:

sort_login_script = 'login script not found'

logged_in = False

access = ClientNetworkingLogin.login_access_type_str_lookup[ login_access_type ] + ' - ' + login_access_text

@ -541,10 +547,6 @@ class EditLoginsPanel( ClientGUIScrolledPanels.EditPanel ):
sort_validity += ' - ' + validity_error_text

network_context = ClientNetworkingContexts.NetworkContext( context_type = CC.NETWORK_CONTEXT_DOMAIN, context_data = login_domain )

logged_in = login_script.IsLoggedIn( self._engine, network_context )

if logged_in:

sort_logged_in = 'yes'
@ -1257,26 +1259,6 @@ class EditLoginScriptPanel( ClientGUIScrolledPanels.EditPanel ):
def _DoTest( self ):

if self._currently_testing:

wx.MessageBox( 'Currently testing already! Please cancel current job!' )

return

try:

login_script = self.GetValue()

except HydrusExceptions.VetoException:

return

self._test_listctrl.DeleteDatas( self._test_listctrl.GetData() )

self._test_button.Disable()

def wx_add_result( test_result ):

if not self:

@ -1344,6 +1326,34 @@ class EditLoginScriptPanel( ClientGUIScrolledPanels.EditPanel ):
if self._currently_testing:

wx.MessageBox( 'Currently testing already! Please cancel current job!' )

return

try:

login_script = self.GetValue()

except HydrusExceptions.VetoException:

return

if self._test_domain == '':

example_domains = list( login_script.GetExampleDomains() )

example_domains.sort()

if len( example_domains ) > 0:

self._test_domain = example_domains[0]

with ClientGUIDialogs.DialogTextEntry( self, 'edit the domain', default = self._test_domain, allow_blank = False ) as dlg:

if dlg.ShowModal() == wx.ID_OK:

@ -1381,6 +1391,10 @@ class EditLoginScriptPanel( ClientGUIScrolledPanels.EditPanel ):
self._test_credentials = {}

self._test_listctrl.DeleteDatas( self._test_listctrl.GetData() )

self._test_button.Disable()

network_job_presentation_context_factory = GenerateTestNetworkJobPresentationContextFactory( self, self._test_network_job_control )

self._currently_testing = True
@ -1784,9 +1784,7 @@ class ManagementPanelImporterMultipleGallery( ManagementPanelImporter ):
if len( gallery_importers ) > 0:

separator = os.linesep * 2

text = separator.join( ( gallery_importer.GetQueryText() for gallery_importer in gallery_importers ) )
text = os.linesep.join( ( gallery_importer.GetQueryText() for gallery_importer in gallery_importers ) )

HG.client_controller.pub( 'clipboard', 'text', text )

@ -2444,9 +2442,7 @@ class ManagementPanelImporterMultipleWatcher( ManagementPanelImporter ):
if len( watchers ) > 0:

separator = os.linesep * 2

text = separator.join( ( watcher.GetURL() for watcher in watchers ) )
text = os.linesep.join( ( watcher.GetURL() for watcher in watchers ) )

HG.client_controller.pub( 'clipboard', 'text', text )
@ -1687,14 +1687,14 @@ class MediaPanel( ClientMedia.ListeningMediaList, wx.ScrolledCanvas ):
if duplicate_action_options is None:

list_of_service_keys_to_content_updates = []
service_keys_to_content_updates = {}

else:

list_of_service_keys_to_content_updates = duplicate_action_options.ProcessPairIntoContentUpdates( first_media, second_media )
service_keys_to_content_updates = duplicate_action_options.ProcessPairIntoContentUpdates( first_media, second_media )

pair_info.append( ( duplicate_type, first_hash, second_hash, list_of_service_keys_to_content_updates ) )
pair_info.append( ( duplicate_type, first_hash, second_hash, service_keys_to_content_updates ) )

if len( pair_info ) > 0:
@ -1900,6 +1900,9 @@ class ReviewExportFilesPanel( ClientGUIScrolledPanels.ReviewPanel ):
self._examples = ClientGUICommon.ExportPatternButton( self._filenames_box )

self._delete_files_after_export = wx.CheckBox( self, label = 'delete files from client after export?' )
self._delete_files_after_export.SetForegroundColour( wx.Colour( 127, 0, 0 ) )

text = 'This will export all the files\' tags, newline separated, into .txts beside the files themselves.'

self._export_tag_txts = wx.CheckBox( self, label = 'export tags to .txt files?' )

@ -1926,6 +1929,9 @@ class ReviewExportFilesPanel( ClientGUIScrolledPanels.ReviewPanel ):
self._paths.SetData( list( enumerate( flat_media ) ) )

self._delete_files_after_export.SetValue( HG.client_controller.new_options.GetBoolean( 'delete_files_after_export' ) )
self._delete_files_after_export.Bind( wx.EVT_CHECKBOX, self.EventDeleteFilesChanged )

#

top_hbox = wx.BoxSizer( wx.HORIZONTAL )

@ -1953,6 +1959,7 @@ class ReviewExportFilesPanel( ClientGUIScrolledPanels.ReviewPanel ):
vbox.Add( top_hbox, CC.FLAGS_EXPAND_SIZER_BOTH_WAYS )
vbox.Add( self._export_path_box, CC.FLAGS_EXPAND_PERPENDICULAR )
vbox.Add( self._filenames_box, CC.FLAGS_EXPAND_PERPENDICULAR )
vbox.Add( self._delete_files_after_export, CC.FLAGS_LONE_BUTTON )
vbox.Add( self._export_tag_txts, CC.FLAGS_LONE_BUTTON )
vbox.Add( self._export, CC.FLAGS_LONE_BUTTON )
@ -1991,9 +1998,19 @@ class ReviewExportFilesPanel( ClientGUIScrolledPanels.ReviewPanel ):
def _DoExport( self, quit_afterwards = False ):

delete_afterwards = self._delete_files_after_export.GetValue()

if quit_afterwards:

with ClientGUIDialogs.DialogYesNo( self, 'Export as shown?' ) as dlg:
message = 'Export as shown?'

if delete_afterwards:

message += os.linesep * 2
message += 'THE FILES WILL BE DELETED FROM THE CLIENT AFTERWARDS'

with ClientGUIDialogs.DialogYesNo( self, message ) as dlg:

if dlg.ShowModal() != wx.ID_YES:

@ -2003,6 +2020,18 @@ class ReviewExportFilesPanel( ClientGUIScrolledPanels.ReviewPanel ):
elif delete_afterwards:

message = 'THE FILES WILL BE DELETED FROM THE CLIENT AFTERWARDS'

with ClientGUIDialogs.DialogYesNo( self, message ) as dlg:

if dlg.ShowModal() != wx.ID_YES:

return

self._RefreshPaths()
@ -2055,7 +2084,7 @@ class ReviewExportFilesPanel( ClientGUIScrolledPanels.ReviewPanel ):
def do_it( neighbouring_txt_tag_service_keys, quit_afterwards ):
def do_it( neighbouring_txt_tag_service_keys, delete_afterwards, quit_afterwards ):

for ( index, ( ordering_index, media ) ) in enumerate( to_do ):

@ -2120,6 +2149,22 @@ class ReviewExportFilesPanel( ClientGUIScrolledPanels.ReviewPanel ):
if delete_afterwards:

wx.CallAfter( wx_update_label, 'deleting' )

deletee_hashes = { media.GetHash() for ( ordering_index, media ) in to_do }

chunks_of_hashes = HydrusData.SplitListIntoChunks( deletee_hashes, 64 )

content_updates = [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_FILES, HC.CONTENT_UPDATE_DELETE, chunk_of_hashes ) for chunk_of_hashes in chunks_of_hashes ]

for content_update in content_updates:

HG.client_controller.WriteSynchronous( 'content_updates', { CC.LOCAL_FILE_SERVICE_KEY : [ content_update ] } )

wx.CallAfter( wx_update_label, 'done!' )

time.sleep( 1 )

@ -2129,7 +2174,7 @@ class ReviewExportFilesPanel( ClientGUIScrolledPanels.ReviewPanel ):
wx.CallAfter( wx_done, quit_afterwards )

HG.client_controller.CallToThread( do_it, self._neighbouring_txt_tag_service_keys, quit_afterwards )
HG.client_controller.CallToThread( do_it, self._neighbouring_txt_tag_service_keys, delete_afterwards, quit_afterwards )

def _GetPath( self, media ):

@ -2204,6 +2249,11 @@ class ReviewExportFilesPanel( ClientGUIScrolledPanels.ReviewPanel ):
self._DoExport()

def EventDeleteFilesChanged( self, event ):

HG.client_controller.new_options.SetBoolean( 'delete_files_after_export', self._delete_files_after_export.GetValue() )

def EventExportTagTxtsChanged( self, event ):

if self._export_tag_txts.GetValue() == True:
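The delete-after-export path above batches the hashes into chunks of 64 before issuing content updates, so no single write is too large. A rough stand-in for what a helper like HydrusData.SplitListIntoChunks might do (the real helper's behaviour may differ):

```python
def split_list_into_chunks( items, chunk_size ):
    
    # yield successive chunk_size-sized slices of the input
    items = list( items )
    
    for i in range( 0, len( items ), chunk_size ):
        
        yield items[ i : i + chunk_size ]
        
    

# e.g. ten items in chunks of three gives 3 + 3 + 3 + 1
chunks = list( split_list_into_chunks( range( 10 ), 3 ) )
```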
@ -1233,18 +1233,17 @@ class ManageTagsPanel( ClientGUIScrolledPanels.ManagePanel ):
if self._i_am_local_tag_service:

text = 'remove all tags'
text = 'remove all/selected tags'

else:

text = 'petition all tags'
text = 'petition all/selected tags'

self._remove_tags = wx.Button( self._tags_box_sorter, label = text )
self._remove_tags.Bind( wx.EVT_BUTTON, self.EventRemoveTags )
self._remove_tags = ClientGUICommon.BetterButton( self._tags_box_sorter, text, self._RemoveTagsButton )

self._copy_button = ClientGUICommon.BetterBitmapButton( self._tags_box_sorter, CC.GlobalBMPs.copy, self._Copy )
self._copy_button.SetToolTip( 'Copy the selected tags to the clipboard.' )
self._copy_button.SetToolTip( 'Copy selected tags to the clipboard. If none are selected, copies all.' )

self._paste_button = ClientGUICommon.BetterBitmapButton( self._tags_box_sorter, CC.GlobalBMPs.paste, self._Paste )
self._paste_button.SetToolTip( 'Paste newline-separated tags from the clipboard into here.' )

@ -1261,6 +1260,10 @@ class ManageTagsPanel( ClientGUIScrolledPanels.ManagePanel ):
menu_items.append( ( 'check', 'auto-replace entered siblings', 'If checked, adding any tag that has a sibling will instead add that sibling.', check_manager ) )

check_manager = ClientGUICommon.CheckboxManagerOptions( 'yes_no_on_remove_on_manage_tags' )

menu_items.append( ( 'check', 'confirm remove/petition tags', 'If checked, clicking the remove/petition tags button (or hitting the delete key on the list) will first confirm the action with a yes/no dialog.', check_manager ) )

check_manager = ClientGUICommon.CheckboxManagerCalls( self._FlipShowDeleted, lambda: self._show_deleted )

menu_items.append( ( 'check', 'show deleted', 'Show deleted tags, if any.', check_manager ) )
@ -1665,11 +1668,21 @@ class ManageTagsPanel( ClientGUIScrolledPanels.ManagePanel ):
tags = list( self._tags_box.GetSelectedTags() )

tags = HydrusTags.SortNumericTags( tags )
if len( tags ) == 0:

( current_tags_to_count, deleted_tags_to_count, pending_tags_to_count, petitioned_tags_to_count ) = ClientData.GetMediasTagCount( self._media, tag_service_key = self._tag_service_key, collapse_siblings = False )

tags = set( current_tags_to_count.keys() ).union( pending_tags_to_count.keys() )

text = os.linesep.join( tags )

HG.client_controller.pub( 'clipboard', 'text', text )
if len( tags ) > 0:

tags = HydrusTags.SortNumericTags( tags )

text = os.linesep.join( tags )

HG.client_controller.pub( 'clipboard', 'text', text )

def _FlipShowDeleted( self ):
@ -1720,11 +1733,39 @@ class ManageTagsPanel( ClientGUIScrolledPanels.ManagePanel ):
self.EnterTags( tags, only_add = True )

except Exception as e:
HydrusData.ShowException( e )

wx.MessageBox( 'I could not understand what was in the clipboard' )

def _RemoveTagsButton( self ):

tag_managers = [ m.GetTagsManager() for m in self._media ]

removable_tags = set()

for tag_manager in tag_managers:

removable_tags.update( tag_manager.GetCurrent( self._tag_service_key ) )
removable_tags.update( tag_manager.GetPending( self._tag_service_key ) )

selected_tags = list( self._tags_box.GetSelectedTags() )

if len( selected_tags ) == 0:

tags_to_remove = list( removable_tags )

else:

tags_to_remove = [ tag for tag in selected_tags if tag in removable_tags ]

tags_to_remove = HydrusTags.SortNumericTags( tags_to_remove )

self.RemoveTags( tags_to_remove )

def AddTags( self, tags, only_add = False ):

if len( tags ) > 0:
@ -1748,21 +1789,6 @@ class ManageTagsPanel( ClientGUIScrolledPanels.ManagePanel ):
def EventRemoveTags( self, event ):

tag_managers = [ m.GetTagsManager() for m in self._media ]

removable_tags = set()

for tag_manager in tag_managers:

removable_tags.update( tag_manager.GetCurrent( self._tag_service_key ) )
removable_tags.update( tag_manager.GetPending( self._tag_service_key ) )

self._AddTags( removable_tags, only_remove = True )

def GetGroupsOfContentUpdates( self ):

return ( self._tag_service_key, self._groups_of_content_updates )
@ -1805,6 +1831,28 @@ class ManageTagsPanel( ClientGUIScrolledPanels.ManagePanel ):
if len( tags ) > 0:

if self._new_options.GetBoolean( 'yes_no_on_remove_on_manage_tags' ):

if len( tags ) < 10:

message = 'Are you sure you want to remove these tags:'
message += os.linesep * 2
message += os.linesep.join( tags )

else:

message = 'Are you sure you want to remove these ' + HydrusData.ToHumanInt( len( tags ) ) + ' tags?'

with ClientGUIDialogs.DialogYesNo( self, message ) as dlg:

if dlg.ShowModal() != wx.ID_YES:

return

self._AddTags( tags, only_remove = True )
@ -2174,30 +2222,18 @@ class ManageTagParents( ClientGUIScrolledPanels.ManagePanel ):
HG.client_controller.CallToThread( self.THREADInitialise, tags, self._service_key )

def _AddFlatPairs( self, pairs, add_only = False ):
def _AddPairs( self, pairs, add_only = False ):

parents_to_children = HydrusData.BuildKeyToSetDict( ( ( parent, child ) for ( child, parent ) in pairs ) )
pairs = list( pairs )

for ( parent, children ) in parents_to_children.items():

self._AddPairs( children, parent, add_only = add_only )

self._UpdateListCtrlData()

self._SetButtonStatus()

def _AddPairs( self, children, parent, add_only = False ):
pairs.sort( key = lambda ( c, p ): HydrusTags.ConvertTagToSortable( p ) )

new_pairs = []
current_pairs = []
petitioned_pairs = []
pending_pairs = []

for child in children:

pair = ( child, parent )
for pair in pairs:

if pair in self._current_statuses_to_pairs[ HC.CONTENT_STATUS_PENDING ]:
@ -2542,7 +2578,9 @@ class ManageTagParents( ClientGUIScrolledPanels.ManagePanel ):
pairs = self._DeserialiseImportString( import_string )

self._AddFlatPairs( pairs, add_only )
self._AddPairs( pairs, add_only = add_only )

self._UpdateListCtrlData()

def _ImportFromTXT( self, add_only = False ):

@ -2566,7 +2604,9 @@ class ManageTagParents( ClientGUIScrolledPanels.ManagePanel ):
pairs = self._DeserialiseImportString( import_string )

self._AddFlatPairs( pairs, add_only )
self._AddPairs( pairs, add_only = add_only )

self._UpdateListCtrlData()

def _ListCtrlActivated( self ):
@ -2575,17 +2615,9 @@ class ManageTagParents( ClientGUIScrolledPanels.ManagePanel ):
pairs = self._tag_parents.GetData( only_selected = True )

for ( child, parent ) in pairs:
if len( pairs ) > 0:

parents_to_children[ parent ].add( child )

if len( parents_to_children ) > 0:

for ( parent, children ) in parents_to_children.items():

self._AddPairs( children, parent )

self._AddPairs( pairs )

@ -2686,10 +2718,9 @@ class ManageTagParents( ClientGUIScrolledPanels.ManagePanel ):
children = self._children.GetTags()
parents = self._parents.GetTags()

for parent in parents:

self._AddPairs( children, parent )

pairs = list( itertools.product( children, parents ) )

self._AddPairs( pairs )

self._children.SetTags( [] )
self._parents.SetTags( [] )
@ -3020,32 +3051,18 @@ class ManageTagSiblings( ClientGUIScrolledPanels.ManagePanel ):
HG.client_controller.CallToThread( self.THREADInitialise, tags, self._service_key )

def _AddFlatPairs( self, pairs, add_only = False ):
def _AddPairs( self, pairs, add_only = False, remove_only = False, default_reason = None ):

news_to_olds = HydrusData.BuildKeyToSetDict( ( ( new, old ) for ( old, new ) in pairs ) )
pairs = list( pairs )

for ( new, olds ) in news_to_olds.items():

self._AutoPetitionConflicts( olds, new )

self._AddPairs( olds, new, add_only = add_only )

self._UpdateListCtrlData()

self._SetButtonStatus()

def _AddPairs( self, olds, new, add_only = False, remove_only = False, default_reason = None ):
pairs.sort( key = lambda ( c, p ): HydrusTags.ConvertTagToSortable( p ) )

new_pairs = []
current_pairs = []
petitioned_pairs = []
pending_pairs = []

for old in olds:

pair = ( old, new )
for pair in pairs:

if pair in self._current_statuses_to_pairs[ HC.CONTENT_STATUS_PENDING ]:
@ -3242,27 +3259,38 @@ class ManageTagSiblings( ClientGUIScrolledPanels.ManagePanel ):
def _AutoPetitionConflicts( self, olds, new ):
def _AutoPetitionConflicts( self, pairs ):

current_pairs = self._current_statuses_to_pairs[ HC.CONTENT_STATUS_CURRENT ].union( self._current_statuses_to_pairs[ HC.CONTENT_STATUS_PENDING ] ).difference( self._current_statuses_to_pairs[ HC.CONTENT_STATUS_PETITIONED ] )

olds_to_news = dict( current_pairs )
current_olds_to_news = dict( current_pairs )

current_olds = { current_old for ( current_old, current_new ) in current_pairs }

for old in olds:
pairs_to_auto_petition = set()

for ( old, new ) in pairs:

if old in current_olds:

conflicting_new = olds_to_news[ old ]
conflicting_new = current_olds_to_news[ old ]

if conflicting_new != new:

self._AddPairs( [ old ], conflicting_new, remove_only = True, default_reason = 'AUTO-PETITION TO REASSIGN TO: ' + new )
conflicting_pair = ( old, conflicting_new )

pairs_to_auto_petition.add( conflicting_pair )

if len( pairs_to_auto_petition ) > 0:

pairs_to_auto_petition = list( pairs_to_auto_petition )

self._AddPairs( pairs_to_auto_petition, remove_only = True, default_reason = 'AUTO-PETITION TO REASSIGN TO: ' + new )

def _CanAdd( self, potential_pair ):
@ -3415,7 +3443,11 @@ class ManageTagSiblings( ClientGUIScrolledPanels.ManagePanel ):
pairs = self._DeserialiseImportString( import_string )

self._AddFlatPairs( pairs, add_only )
self._AutoPetitionConflicts( pairs )

self._AddPairs( pairs, add_only = add_only )

self._UpdateListCtrlData()

def _ImportFromTXT( self, add_only = False ):

@ -3439,26 +3471,20 @@ class ManageTagSiblings( ClientGUIScrolledPanels.ManagePanel ):
pairs = self._DeserialiseImportString( import_string )

self._AddFlatPairs( pairs, add_only )
self._AutoPetitionConflicts( pairs )

self._AddPairs( pairs, add_only = add_only )

self._UpdateListCtrlData()

def _ListCtrlActivated( self ):

news_to_olds = collections.defaultdict( set )

pairs = self._tag_siblings.GetData( only_selected = True )

for ( old, new ) in pairs:
if len( pairs ) > 0:

news_to_olds[ new ].add( old )

if len( news_to_olds ) > 0:

for ( new, olds ) in news_to_olds.items():

self._AddPairs( olds, new )

self._AddPairs( pairs )

self._UpdateListCtrlData()

@ -3552,9 +3578,11 @@ class ManageTagSiblings( ClientGUIScrolledPanels.ManagePanel ):
olds = self._old_siblings.GetTags()

self._AutoPetitionConflicts( olds, self._current_new )
pairs = [ ( old, self._current_new ) for old in olds ]

self._AddPairs( olds, self._current_new )
self._AutoPetitionConflicts( pairs )

self._AddPairs( pairs )

self._old_siblings.SetTags( set() )
self.SetNew( set() )
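The reworked _AutoPetitionConflicts above operates on flat ( old, new ) pairs: any old tag that already maps to a different new is collected so the stale mapping can be petitioned away. The core conflict detection can be sketched in isolation like this (function name is illustrative; the real method also filters by petition status and calls back into the panel):

```python
def find_conflicting_pairs( current_pairs, incoming_pairs ):
    
    # the current sibling mapping, old tag -> new tag
    current_olds_to_news = dict( current_pairs )
    
    pairs_to_auto_petition = set()
    
    for ( old, new ) in incoming_pairs:
        
        if old in current_olds_to_news:
            
            conflicting_new = current_olds_to_news[ old ]
            
            # this old already points somewhere else; that mapping conflicts
            if conflicting_new != new:
                
                pairs_to_auto_petition.add( ( old, conflicting_new ) )
                
            
        
    
    return pairs_to_auto_petition
```

The conflicting pairs would then be petitioned (removed) before the incoming pairs are added, so each old tag keeps a single sibling target.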
@ -444,6 +444,42 @@ class DialogThatTakesScrollablePanel( DialogThatResizes ):
raise NotImplementedError()

def _TryEndModal( self, value ):

try:

self.EndModal( value )

except Exception as e:

HydrusData.ShowText( 'This dialog seems to have been unable to close for some reason. I am printing the stack to the log. The dialog may have already closed, or may attempt to close now. Please inform hydrus dev of this situation. I recommend you restart the client if you can. If the UI is locked, you will have to kill it via task manager.' )

HydrusData.PrintException( e )

import traceback

HydrusData.DebugPrint( ''.join( traceback.format_stack() ) )

try:

self.Close()

except:

HydrusData.ShowText( 'The dialog would not close on command.' )

try:

self.Destroy()

except:

HydrusData.ShowText( 'The dialog would not destroy on command.' )

def DoOK( self ):

raise NotImplementedError()

@ -535,7 +571,7 @@ class DialogNullipotent( DialogThatTakesScrollablePanelClose ):
SaveTLWSizeAndPosition( self, self._frame_key )

self.EndModal( wx.ID_OK )
self._TryEndModal( wx.ID_OK )

class DialogNullipotentVetoable( DialogThatTakesScrollablePanelClose ):

@ -577,7 +613,7 @@ class DialogNullipotentVetoable( DialogThatTakesScrollablePanelClose ):
SaveTLWSizeAndPosition( self, self._frame_key )

self.EndModal( wx.ID_OK )
self._TryEndModal( wx.ID_OK )

class DialogThatTakesScrollablePanelApplyCancel( DialogThatTakesScrollablePanel ):

@ -634,7 +670,7 @@ class DialogEdit( DialogThatTakesScrollablePanelApplyCancel ):
SaveTLWSizeAndPosition( self, self._frame_key )

self.EndModal( wx.ID_OK )
self._TryEndModal( wx.ID_OK )

class DialogManage( DialogThatTakesScrollablePanelApplyCancel ):

@ -664,7 +700,7 @@ class DialogManage( DialogThatTakesScrollablePanelApplyCancel ):
SaveTLWSizeAndPosition( self, self._frame_key )

self.EndModal( wx.ID_OK )
self._TryEndModal( wx.ID_OK )

class Frame( wx.Frame ):
@ -274,7 +274,7 @@ class FilenameTaggingOptions( HydrusSerialisable.SerialisableBase ):
|
|||
|
||||
SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_FILENAME_TAGGING_OPTIONS
|
||||
SERIALISABLE_NAME = 'Filename Tagging Options'
|
||||
SERIALISABLE_VERSION = 1
|
||||
SERIALISABLE_VERSION = 2
|
||||
|
||||
def __init__( self ):
|
||||
|
||||
|
@ -285,9 +285,13 @@ class FilenameTaggingOptions( HydrusSerialisable.SerialisableBase ):
|
|||
self._load_from_neighbouring_txt_files = False
|
||||
|
||||
self._add_filename = ( False, '' )
|
||||
self._add_first_directory = ( False, '' )
|
||||
self._add_second_directory = ( False, '' )
|
||||
self._add_third_directory = ( False, '' )
|
||||
|
||||
self._directories_dict = {}
|
||||
|
||||
for index in ( 0, 1, 2, -3, -2, -1 ):
|
||||
|
||||
self._directories_dict[ index ] = ( False, '' )
|
||||
|
||||
|
||||
self._quick_namespaces = []
|
||||
self._regexes = []
|
||||
|
@@ -295,18 +299,47 @@ class FilenameTaggingOptions( HydrusSerialisable.SerialisableBase ):
         
     
     def _GetSerialisableInfo( self ):
         
-        return ( list( self._tags_for_all ), self._load_from_neighbouring_txt_files, self._add_filename, self._add_first_directory, self._add_second_directory, self._add_third_directory, self._quick_namespaces, self._regexes )
+        serialisable_directories_dict = self._directories_dict.items()
+        
+        return ( list( self._tags_for_all ), self._load_from_neighbouring_txt_files, self._add_filename, serialisable_directories_dict, self._quick_namespaces, self._regexes )
         
     
     def _InitialiseFromSerialisableInfo( self, serialisable_info ):
         
-        ( tags_for_all_list, self._load_from_neighbouring_txt_files, self._add_filename, self._add_first_directory, self._add_second_directory, self._add_third_directory, self._quick_namespaces, self._regexes ) = serialisable_info
+        ( tags_for_all_list, self._load_from_neighbouring_txt_files, self._add_filename, serialisable_directories_dict, self._quick_namespaces, self._regexes ) = serialisable_info
+        
+        self._directories_dict = dict( serialisable_directories_dict )
         
         # converting [ namespace, regex ] to ( namespace, regex ) for listctrl et al to handle better
         self._quick_namespaces = [ tuple( item ) for item in self._quick_namespaces ]
         self._tags_for_all = set( tags_for_all_list )
         
     
+    def _UpdateSerialisableInfo( self, version, old_serialisable_info ):
+        
+        if version == 1:
+            
+            ( tags_for_all_list, load_from_neighbouring_txt_files, add_filename, add_first_directory, add_second_directory, add_third_directory, quick_namespaces, regexes ) = old_serialisable_info
+            
+            directories_dict = {}
+            
+            directories_dict[ 0 ] = add_first_directory
+            directories_dict[ 1 ] = add_second_directory
+            directories_dict[ 2 ] = add_third_directory
+            
+            for index in ( -3, -2, -1 ):
+                
+                directories_dict[ index ] = ( False, '' )
+                
+            
+            serialisable_directories_dict = directories_dict.items()
+            
+            new_serialisable_info = ( tags_for_all_list, load_from_neighbouring_txt_files, add_filename, serialisable_directories_dict, quick_namespaces, regexes )
+            
+            return ( 2, new_serialisable_info )
+            
+        
+    
     def AdvancedSetTuple( self, quick_namespaces, regexes ):
         
         self._quick_namespaces = quick_namespaces
         
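The hunk above migrates three fixed `add_*_directory` tuples into one index-keyed dict during deserialisation. A minimal standalone sketch of that version-1-to-2 conversion (the function name and plain tuples are illustrative, not the Hydrus serialisation API):

```python
def update_v1_to_v2(old_info):
    # old layout: three fixed directory slots in the serialised tuple
    (tags, txt, add_filename, first, second, third, namespaces, regexes) = old_info

    # the fixed slots become dict entries keyed 0, 1, 2...
    directories = {0: first, 1: second, 2: third}

    # ...and the new from-the-end slots default to disabled
    for index in (-3, -2, -1):
        directories[index] = (False, '')

    new_info = (tags, txt, add_filename, list(directories.items()), namespaces, regexes)

    return (2, new_info)
```

A migration like this must keep the old slot order stable, since the serialised tuple carries no field names.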
@@ -379,58 +412,40 @@ class FilenameTaggingOptions( HydrusSerialisable.SerialisableBase ):
             tags.add( tag )
             
        
-        ( drive, dirs ) = os.path.splitdrive( base )
+        ( drive, directories ) = os.path.splitdrive( base )
        
-        while dirs.startswith( os.path.sep ):
+        while directories.startswith( os.path.sep ):
            
-            dirs = dirs[1:]
+            directories = directories[1:]
            
        
-        dirs = dirs.split( os.path.sep )
+        directories = directories.split( os.path.sep )
        
-        ( dir_1_boolean, dir_1_namespace ) = self._add_first_directory
-        
-        if len( dirs ) > 0 and dir_1_boolean:
-            
-            if dir_1_namespace != '':
-                
-                tag = dir_1_namespace + ':' + dirs[0]
-                
-            else:
-                
-                tag = dirs[0]
-                
-            
-            tags.add( tag )
-            
-        
-        ( dir_2_boolean, dir_2_namespace ) = self._add_second_directory
-        
-        if len( dirs ) > 1 and dir_2_boolean:
-            
-            if dir_2_namespace != '':
-                
-                tag = dir_2_namespace + ':' + dirs[1]
-                
-            else:
-                
-                tag = dirs[1]
-                
-            
-            tags.add( tag )
-            
-        
-        ( dir_3_boolean, dir_3_namespace ) = self._add_third_directory
-        
-        if len( dirs ) > 2 and dir_3_boolean:
-            
-            if dir_3_namespace != '':
-                
-                tag = dir_3_namespace + ':' + dirs[2]
-                
-            else:
-                
-                tag = dirs[2]
-                
-            
-            tags.add( tag )
-            
+        for ( index, ( dir_boolean, dir_namespace ) ) in self._directories_dict.items():
+            
+            # we are talking -3 through 2 here
+            
+            if not dir_boolean:
+                
+                continue
+                
+            
+            try:
+                
+                directory = directories[ index ]
+                
+            except IndexError:
+                
+                continue
+                
+            
+            if dir_namespace != '':
+                
+                tag = dir_namespace + ':' + directory
+                
+            else:
+                
+                tag = directory
+                
+            
+            tags.add( tag )
+            
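The new loop keys the dict by 0, 1, 2 and -3, -2, -1, leaning on Python's negative list indexing to address the last three path components; indices that fall off the end of a shallow path are simply skipped via `IndexError`. A small sketch of that lookup (a hypothetical helper, not the Hydrus method itself):

```python
def directory_tags(directories, directories_dict):
    # directories: path components; directories_dict: index -> (enabled, namespace)
    tags = set()

    for index, (enabled, namespace) in directories_dict.items():

        if not enabled:
            continue

        try:
            # negative indices count from the end of the path
            directory = directories[index]
        except IndexError:
            # path too shallow for this slot, nothing to tag
            continue

        tags.add(namespace + ':' + directory if namespace != '' else directory)

    return tags
```

One subtlety worth noting: for very short paths a positive and a negative index can land on the same component, so both slots would emit a tag for it.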
@@ -507,19 +522,17 @@ class FilenameTaggingOptions( HydrusSerialisable.SerialisableBase ):
         return tags
         
     
-    def SimpleSetTuple( self, tags_for_all, load_from_neighbouring_txt_files, add_filename, add_first_directory, add_second_directory, add_third_directory ):
+    def SimpleSetTuple( self, tags_for_all, load_from_neighbouring_txt_files, add_filename, directories_dict ):
         
         self._tags_for_all = tags_for_all
         self._load_from_neighbouring_txt_files = load_from_neighbouring_txt_files
         self._add_filename = add_filename
-        self._add_first_directory = add_first_directory
-        self._add_second_directory = add_second_directory
-        self._add_third_directory = add_third_directory
+        self._directories_dict = directories_dict
         
     
     def SimpleToTuple( self ):
         
-        return ( self._tags_for_all, self._load_from_neighbouring_txt_files, self._add_filename, self._add_first_directory, self._add_second_directory, self._add_third_directory )
+        return ( self._tags_for_all, self._load_from_neighbouring_txt_files, self._add_filename, self._directories_dict )
         
     
 HydrusSerialisable.SERIALISABLE_TYPES_TO_OBJECT_TYPES[ HydrusSerialisable.SERIALISABLE_TYPE_FILENAME_TAGGING_OPTIONS ] = FilenameTaggingOptions
@@ -218,9 +218,35 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):
         
         nj.engine = HG.client_controller.network_engine
         
-        if nj.NeedsLogin() and not nj.CanLogin():
+        if nj.NeedsLogin():
             
-            result = False
+            try:
+                
+                nj.CheckCanLogin()
+                
+                result = True
+                
+            except Exception as e:
+                
+                result = False
+                
+                if not self._paused:
+                    
+                    login_fail_reason = HydrusData.ToUnicode( e )
+                    
+                    message = 'Query "' + query.GetHumanName() + '" for subscription "' + self._name + '" seemed to have an invalid login for one of its file imports. The reason was:'
+                    message += os.linesep * 2
+                    message += login_fail_reason
+                    message += os.linesep * 2
+                    message += 'The subscription has paused. Please see if you can fix the problem and then unpause. Hydrus dev would like feedback on this process.'
+                    
+                    HydrusData.ShowText( message )
+                    
+                    self._DelayWork( 300, login_fail_reason )
+                    
+                    self._paused = True
+                    
+                
+            
         else:
             
@@ -233,13 +259,6 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):
             HydrusData.ShowText( 'Query "' + query.GetHumanName() + '" pre-work file login test. Login ok: ' + str( result ) + '.' )
             
         
-        if not result and not self._paused:
-            
-            HydrusData.ShowText( 'Query "' + query.GetHumanName() + '" for subscription "' + self._name + '" seemed to have an invalid login for one of its file imports. The subscription has paused. Please see if you can fix the problem and then unpause. Hydrus dev would like feedback on this process.' )
-            
-            self._paused = True
-            
-        
         return result
         
     
@@ -259,9 +278,35 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):
         
         nj.engine = HG.client_controller.network_engine
         
-        if nj.NeedsLogin() and not nj.CanLogin():
+        if nj.NeedsLogin():
             
-            result = False
+            try:
+                
+                nj.CheckCanLogin()
+                
+                result = True
+                
+            except Exception as e:
+                
+                result = False
+                
+                if not self._paused:
+                    
+                    login_fail_reason = HydrusData.ToUnicode( e )
+                    
+                    message = 'Query "' + query.GetHumanName() + '" for subscription "' + self._name + '" seemed to have an invalid login. The reason was:'
+                    message += os.linesep * 2
+                    message += login_fail_reason
+                    message += os.linesep * 2
+                    message += 'The subscription has paused. Please see if you can fix the problem and then unpause. Hydrus dev would like feedback on this process.'
+                    
+                    HydrusData.ShowText( message )
+                    
+                    self._DelayWork( 300, login_fail_reason )
+                    
+                    self._paused = True
+                    
+                
+            
         else:
             
@@ -274,13 +319,6 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):
             HydrusData.ShowText( 'Query "' + query.GetHumanName() + '" pre-work sync login test. Login ok: ' + str( result ) + '.' )
             
        
-        if not result and not self._paused:
-            
-            HydrusData.ShowText( 'Query "' + query.GetHumanName() + '" for subscription "' + self._name + '" seemed to have an invalid login for one of its sync queries. The subscription has paused. Please see if you can fix the problem and then unpause. Hydrus dev would like feedback on this process.' )
-            
-            self._paused = True
-            
-        
         return result
         
     
@@ -634,7 +672,7 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):
         
         if query.CanWorkOnFiles():
             
-            if self._QueryBandwidthIsOK( query ) and self._QueryFileLoginIsOK( query ):
+            if self._QueryBandwidthIsOK( query ):
                 
                 return True
                 
@@ -530,20 +530,6 @@ class NetworkJob( object ):
         
     
-    def CanLogin( self ):
-        
-        try:
-            
-            self.CheckCanLogin()
-            
-            return True
-            
-        except:
-            
-            return False
-            
-        
-    
     def CanValidateInPopup( self ):
         
         with self._lock:
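`CanLogin` is deleted here in favour of calling `CheckCanLogin` directly, so callers can surface the failure reason instead of a bare boolean. The swallow-to-boolean pattern being removed looks like this in isolation (a generic sketch, not the NetworkJob API):

```python
class Job:
    def __init__(self, reason=None):
        # reason is None when login is possible; otherwise it explains why not
        self._reason = reason

    def check_can_login(self):
        # raises with a human-readable reason, mirroring CheckCanLogin
        if self._reason is not None:
            raise RuntimeError(self._reason)

    def can_login(self):
        # the removed boolean wrapper: it discards the reason on the way out
        try:
            self.check_can_login()
            return True
        except Exception:
            return False
```

Dropping the wrapper forces each call site to handle the exception, which is exactly what lets the subscription code quote the login failure reason in its pause message.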
@@ -4,6 +4,7 @@ import ClientNetworkingContexts
 import ClientNetworkingDomain
 import ClientNetworkingJobs
 import ClientParsing
+import ClientThreading
 import cPickle
 import HydrusConstants as HC
 import HydrusGlobals as HG
@@ -180,6 +181,15 @@ class NetworkLoginManager( HydrusSerialisable.SerialisableBase ):
                     raise
                     
                 
+                if validity == VALIDITY_UNTESTED and validity_error_text != '':
+                    
+                    # cleaning up the 'restart dialog to test validity' issue in cases where it is valid
+                    
+                    validity_error_text = ''
+                    
+                    self._domains_to_login_info[ login_domain ] = ( login_script_key_and_name, credentials, login_access_type, login_access_text, active, validity, validity_error_text, no_work_until, no_work_until_reason )
+                    
+                
                 return ( login_script, credentials )
                 
             else:
@@ -257,7 +267,7 @@ class NetworkLoginManager( HydrusSerialisable.SerialisableBase ):
         with self._lock:
             
             if network_context.context_type == CC.NETWORK_CONTEXT_DOMAIN:
                 
+                '''
                 domain = network_context.context_data
                 
                 if 'pixiv.net' in domain:
@@ -285,16 +295,16 @@ class NetworkLoginManager( HydrusSerialisable.SerialisableBase ):
                     
                     return
                     
+                '''
                 ( login_domain, login_expected, login_possible, login_error_text ) = self._GetLoginDomainStatus( network_context )
                 
                 if login_domain is None or not login_expected:
                     
-                    raise HydrusExceptions.ValidationException( 'This domain has no active login script--has it just been turned off?' )
+                    raise HydrusExceptions.ValidationException( 'The domain ' + login_domain + ' has no active login script--has it just been turned off?' )
                     
                 elif not login_possible:
                     
-                    raise HydrusExceptions.ValidationException( login_error_text )
+                    raise HydrusExceptions.ValidationException( 'The domain ' + login_domain + ' cannot log in: ' + login_error_text )
                     
                 
             elif network_context.context_type == CC.NETWORK_CONTEXT_HYDRUS:
@@ -358,7 +368,7 @@ class NetworkLoginManager( HydrusSerialisable.SerialisableBase ):
         with self._lock:
             
             if network_context.context_type == CC.NETWORK_CONTEXT_DOMAIN:
                 
+                '''
                 domain = network_context.context_data
                 
                 if 'pixiv.net' in domain:
@@ -369,16 +379,16 @@ class NetworkLoginManager( HydrusSerialisable.SerialisableBase ):
                     
                     return LoginProcessLegacy( self.engine, HENTAI_FOUNDRY_NETWORK_CONTEXT, 'hentai foundry' )
                     
+                '''
                 ( login_domain, login_expected, login_possible, login_error_text ) = self._GetLoginDomainStatus( network_context )
                 
                 if login_domain is None or not login_expected:
                     
-                    raise HydrusExceptions.ValidationException( 'This domain has no active login script--has it just been turned off?' )
+                    raise HydrusExceptions.ValidationException( 'The domain ' + login_domain + ' has no active login script--has it just been turned off?' )
                     
                 elif not login_possible:
                     
-                    raise HydrusExceptions.ValidationException( login_error_text )
+                    raise HydrusExceptions.ValidationException( 'The domain ' + login_domain + ' cannot log in: ' + login_error_text )
                     
                 else:
                     
@@ -463,7 +473,7 @@ class NetworkLoginManager( HydrusSerialisable.SerialisableBase ):
         with self._lock:
             
             if network_context.context_type == CC.NETWORK_CONTEXT_DOMAIN:
                 
+                '''
                 domain = network_context.context_data
                 
                 if 'pixiv.net' in domain:
@@ -478,7 +488,7 @@ class NetworkLoginManager( HydrusSerialisable.SerialisableBase ):
                     
                     return not self._IsLoggedIn( HENTAI_FOUNDRY_NETWORK_CONTEXT, required_cookies )
                     
+                '''
                 ( login_domain, login_expected, login_possible, login_error_text ) = self._GetLoginDomainStatus( network_context )
                 
                 if login_domain is None or not login_expected:
@@ -493,7 +503,7 @@ class NetworkLoginManager( HydrusSerialisable.SerialisableBase ):
         
         except HydrusExceptions.ValidationException:
             
-            # couldn't find the script or something. assume we need a login to move errors forward to canlogin trigger phase
+            # couldn't find the script or something. assume we need a login to move errors forward to checkcanlogin trigger phase
             
             return True
             
@@ -510,6 +520,28 @@ class NetworkLoginManager( HydrusSerialisable.SerialisableBase ):
         
     
+    def OverwriteDefaultLoginScripts( self, login_script_names ):
+        
+        with self._lock:
+            
+            import ClientDefaults
+            
+            default_login_scripts = ClientDefaults.GetDefaultLoginScripts()
+            
+            for login_script in default_login_scripts:
+                
+                login_script.RegenerateLoginScriptKey()
+                
+            
+            existing_login_scripts = list( self._login_scripts )
+            
+            new_login_scripts = [ login_script for login_script in existing_login_scripts if login_script.GetName() not in login_script_names ]
+            new_login_scripts.extend( [ login_script for login_script in default_login_scripts if login_script.GetName() in login_script_names ] )
+            
+        
+        self.SetLoginScripts( new_login_scripts )
+        
+    
     def SetClean( self ):
         
         with self._lock:
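`OverwriteDefaultLoginScripts` replaces only the named scripts while leaving the user's other scripts untouched: filter the existing list by name, then append the fresh default copies of the named ones. The merge reduces to this list operation (a generic sketch using plain dicts in place of login script objects):

```python
def overwrite_defaults(existing, defaults, names_to_overwrite):
    # keep existing entries whose name is not being overwritten...
    merged = [s for s in existing if s['name'] not in names_to_overwrite]

    # ...then pull in the fresh default copies of the named ones
    merged.extend(s for s in defaults if s['name'] in names_to_overwrite)

    return merged
```

Note this is an overwrite, not a union: a default whose name is not in `names_to_overwrite` is never added, and a user script with a matching name is always replaced.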
@@ -1075,7 +1107,25 @@ class LoginProcessDomain( LoginProcess ):
     
     def _Start( self ):
         
-        self.login_script.Start( self.engine, self.network_context, self.credentials )
+        login_domain = self.network_context.context_data
+        
+        job_key = ClientThreading.JobKey( cancellable = True )
+        
+        job_key.SetVariable( 'popup_title', 'Logging in ' + login_domain )
+        
+        HG.client_controller.pub( 'message', job_key )
+        
+        HydrusData.Print( 'Starting login for ' + login_domain )
+        
+        result = self.login_script.Start( self.engine, self.network_context, self.credentials, job_key = job_key )
+        
+        HydrusData.Print( 'Finished login for ' + self.network_context.context_data + '. Result was: ' + result )
+        
+        job_key.SetVariable( 'popup_text_1', result )
+        
+        job_key.Finish()
+        
+        job_key.Delete( 4 )
+        
     
 class LoginProcessHydrus( LoginProcess ):
@@ -1551,14 +1601,10 @@ class LoginScriptDomain( HydrusSerialisable.SerialisableBaseNamed ):
         self._login_script_key = login_script_key
         
     
-    def Start( self, engine, network_context, given_credentials, network_job_presentation_context_factory = None, test_result_callable = None ):
+    def Start( self, engine, network_context, given_credentials, network_job_presentation_context_factory = None, test_result_callable = None, job_key = None ):
         
         # don't mess with the domain--assume that we are given precisely the right domain
         
-        # this maybe takes some job_key or something so it can present to the user login process status
-        # this will be needed in the dialog where we test this. we need good feedback on how it is going
-        # irl, this could be a 'login popup' message as well, just to inform the user on the progress of any delay
-        
         login_domain = network_context.context_data
         
         temp_variables = {}
@@ -1567,6 +1613,20 @@ class LoginScriptDomain( HydrusSerialisable.SerialisableBaseNamed ):
         
         for login_step in self._login_steps:
             
+            if job_key is not None:
+                
+                if job_key.IsCancelled():
+                    
+                    message = 'User cancelled the login process.'
+                    
+                    engine.login_manager.DelayLoginScript( login_domain, self._login_script_key, message )
+                    
+                    return message
+                    
+                
+                job_key.SetVariable( 'popup_text_1', login_step.GetName() )
+                
+            
             try:
                 
                 last_url_used = login_step.Start( engine, login_domain, given_credentials, temp_variables, referral_url = last_url_used, network_job_presentation_context_factory = network_job_presentation_context_factory, test_result_callable = test_result_callable )
|
@ -63,6 +63,7 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):
|
|||
|
||||
self._dictionary[ 'booleans' ][ 'add_parents_on_manage_tags' ] = True
|
||||
self._dictionary[ 'booleans' ][ 'replace_siblings_on_manage_tags' ] = True
|
||||
self._dictionary[ 'booleans' ][ 'yes_no_on_remove_on_manage_tags' ] = True
|
||||
|
||||
self._dictionary[ 'booleans' ][ 'show_related_tags' ] = False
|
||||
self._dictionary[ 'booleans' ][ 'show_file_lookup_script_tags' ] = False
|
||||
|
@@ -114,6 +115,8 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):
         self._dictionary[ 'booleans' ][ 'highlight_new_watcher' ] = True
         self._dictionary[ 'booleans' ][ 'highlight_new_query' ] = True
         
+        self._dictionary[ 'booleans' ][ 'delete_files_after_export' ] = False
+        
         #
         
         self._dictionary[ 'colours' ] = HydrusSerialisable.SerialisableDictionary()
|
@ -49,7 +49,7 @@ options = {}
|
|||
# Misc
|
||||
|
||||
NETWORK_VERSION = 18
|
||||
SOFTWARE_VERSION = 328
|
||||
SOFTWARE_VERSION = 329
|
||||
|
||||
UNSCALED_THUMBNAIL_DIMENSIONS = ( 200, 200 )
|
||||
|
||||
|
|
|
@ -379,6 +379,11 @@ def ConvertTimestampToPrettyTime( timestamp, in_gmt = False, include_24h_time =
|
|||
|
||||
def TimestampToPrettyTimeDelta( timestamp, just_now_string = 'now', just_now_threshold = 3 ):
|
||||
|
||||
if timestamp is None:
|
||||
|
||||
timestamp = 0
|
||||
|
||||
|
||||
if HG.client_controller.new_options.GetBoolean( 'always_show_iso_time' ):
|
||||
|
||||
return ConvertTimestampToPrettyTime( timestamp )
|
||||
|
|
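The new guard in `TimestampToPrettyTimeDelta` coerces a missing timestamp to 0 before any formatting happens, so a `None` from upstream cannot crash the arithmetic below. The guard-then-delegate shape, in isolation (the formatter body is a hypothetical stand-in, not the Hydrus one):

```python
import time

def pretty_time_delta(timestamp, just_now_string='now', just_now_threshold=3):
    # coerce a missing time to the epoch rather than crashing on None - time.time()
    if timestamp is None:
        timestamp = 0

    delta = time.time() - timestamp

    # very recent times collapse to a friendly constant
    if abs(delta) <= just_now_threshold:
        return just_now_string

    return '%d seconds ago' % delta
```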
|
@ -1,5 +1,5 @@
|
|||
import HydrusConstants as HC
|
||||
import PyPDF2
|
||||
#import PyPDF2
|
||||
import re
|
||||
import time
|
||||
import traceback
|
||||
|
@@ -14,8 +14,15 @@ def GetNumWordsFromString( s ):
     
 def GetPDFNumWords( path ):
     
+    # I discovered a pdf that pulled this into an infinite loop due to malformed header.
+    # This gives bunk data anyway, so let's just cut it out until we have a better solution here all around
+    
+    return None
+    
     try:
         
+        pass
+        '''
         with open( path, 'rb' ) as f:
             
             pdf_object = PyPDF2.PdfFileReader( f, strict = False )
@@ -27,7 +34,7 @@ def GetPDFNumWords( path ):
             
             return pdf_object.numPages * 350
             
-        
+        '''
     except:
         
         num_words = 0
|
@ -229,7 +229,7 @@ def GetFileInfo( path, mime = None ):
|
|||
|
||||
elif mime == HC.APPLICATION_PDF:
|
||||
|
||||
num_words = HydrusDocumentHandling.GetPDFNumWords( path )
|
||||
num_words = HydrusDocumentHandling.GetPDFNumWords( path ) # this now give None until a better solution can be found
|
||||
|
||||
elif mime in HC.AUDIO:
|
||||
|
||||
|
|
|
@ -639,7 +639,7 @@ class BandwidthTracker( HydrusSerialisable.SerialisableBase ):
|
|||
next_month_year += 1
|
||||
|
||||
|
||||
next_month = ( month + 1 ) % 12
|
||||
next_month = ( month % 12 ) + 1
|
||||
|
||||
next_month_dt = datetime.datetime( next_month_year, next_month, 1 )
|
||||
|
||||
|
|
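The `BandwidthTracker` fix above corrects an off-by-one: `datetime.month` runs 1 through 12, so `( month + 1 ) % 12` maps November (11) to 0, which `datetime.datetime` rejects with a `ValueError`, while `( month % 12 ) + 1` maps November to 12 and wraps December to 1. Verifying both expressions:

```python
import datetime

def next_month_buggy(month):
    # the old expression: November (11) -> 0, which is not a valid month
    return (month + 1) % 12

def next_month_fixed(month):
    # the fixed expression: November -> 12, December wraps to 1
    return (month % 12) + 1

# mid-year both agree: June -> July
assert next_month_buggy(6) == next_month_fixed(6) == 7

# November exposes the bug
assert next_month_buggy(11) == 0
assert next_month_fixed(11) == 12

# only the fixed form yields a constructible datetime at that boundary
try:
    datetime.datetime(2019, next_month_buggy(11), 1)
    raise AssertionError('should have rejected month 0')
except ValueError:
    pass
```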
@@ -279,7 +279,7 @@ class TestSerialisables( unittest.TestCase ):
         
         scu[ CC.LOCAL_FILE_SERVICE_KEY ] = [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_FILES, HC.CONTENT_UPDATE_DELETE, { local_hash_empty } ) ]
         
-        assertSCUEqual( result[0], scu )
+        assertSCUEqual( result, scu )
         
         #
         
@@ -289,13 +289,13 @@ class TestSerialisables( unittest.TestCase ):
         
         scu[ CC.TRASH_SERVICE_KEY ] = [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_FILES, HC.CONTENT_UPDATE_DELETE, { trashed_hash_empty } ) ]
         
-        assertSCUEqual( result[0], scu )
+        assertSCUEqual( result, scu )
         
         #
         
         result = duplicate_action_options_delete_and_move.ProcessPairIntoContentUpdates( local_media_has_values, deleted_media_empty )
         
-        self.assertEqual( result, [] )
+        self.assertEqual( result, {} )
         
         #
         
@@ -306,14 +306,9 @@ class TestSerialisables( unittest.TestCase ):
         scu[ CC.LOCAL_TAG_SERVICE_KEY ] = [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_MAPPINGS, HC.CONTENT_UPDATE_DELETE, ( 'test tag', { other_local_hash_has_values } ) ), HydrusData.ContentUpdate( HC.CONTENT_TYPE_MAPPINGS, HC.CONTENT_UPDATE_DELETE, ( 'series:namespaced test tag', { other_local_hash_has_values } ) ) ]
         scu[ TC.LOCAL_RATING_LIKE_SERVICE_KEY ] = [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_RATINGS, HC.CONTENT_UPDATE_ADD, ( None, { other_local_hash_has_values } ) ) ]
         scu[ TC.LOCAL_RATING_NUMERICAL_SERVICE_KEY ] = [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_RATINGS, HC.CONTENT_UPDATE_ADD, ( None, { other_local_hash_has_values } ) ) ]
         
-        assertSCUEqual( result[0], scu )
-        
-        scu = {}
-        
         scu[ CC.LOCAL_FILE_SERVICE_KEY ] = [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_FILES, HC.CONTENT_UPDATE_DELETE, { other_local_hash_has_values } ) ]
         
-        assertSCUEqual( result[1], scu )
+        assertSCUEqual( result, scu )
         
         #
         
@@ -324,21 +319,16 @@ class TestSerialisables( unittest.TestCase ):
         scu[ CC.LOCAL_TAG_SERVICE_KEY ] = [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_MAPPINGS, HC.CONTENT_UPDATE_ADD, ( 'test tag', { local_hash_empty } ) ), HydrusData.ContentUpdate( HC.CONTENT_TYPE_MAPPINGS, HC.CONTENT_UPDATE_ADD, ( 'series:namespaced test tag', { local_hash_empty } ) ), HydrusData.ContentUpdate( HC.CONTENT_TYPE_MAPPINGS, HC.CONTENT_UPDATE_DELETE, ( 'test tag', { other_local_hash_has_values } ) ), HydrusData.ContentUpdate( HC.CONTENT_TYPE_MAPPINGS, HC.CONTENT_UPDATE_DELETE, ( 'series:namespaced test tag', { other_local_hash_has_values } ) ) ]
         scu[ TC.LOCAL_RATING_LIKE_SERVICE_KEY ] = [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_RATINGS, HC.CONTENT_UPDATE_ADD, ( 1.0, { local_hash_empty } ) ), HydrusData.ContentUpdate( HC.CONTENT_TYPE_RATINGS, HC.CONTENT_UPDATE_ADD, ( None, { other_local_hash_has_values } ) ) ]
         scu[ TC.LOCAL_RATING_NUMERICAL_SERVICE_KEY ] = [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_RATINGS, HC.CONTENT_UPDATE_ADD, ( 0.8, { local_hash_empty } ) ), HydrusData.ContentUpdate( HC.CONTENT_TYPE_RATINGS, HC.CONTENT_UPDATE_ADD, ( None, { other_local_hash_has_values } ) ) ]
         
-        assertSCUEqual( result[0], scu )
-        
-        scu = {}
-        
         scu[ CC.LOCAL_FILE_SERVICE_KEY ] = [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_FILES, HC.CONTENT_UPDATE_DELETE, { other_local_hash_has_values } ) ]
         
-        assertSCUEqual( result[1], scu )
+        assertSCUEqual( result, scu )
         
         #
         #
         
         result = duplicate_action_options_copy.ProcessPairIntoContentUpdates( local_media_has_values, local_media_empty )
         
-        self.assertEqual( result, [] )
+        self.assertEqual( result, {} )
         
         #
         
@@ -350,7 +340,7 @@ class TestSerialisables( unittest.TestCase ):
         scu[ TC.LOCAL_RATING_LIKE_SERVICE_KEY ] = [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_RATINGS, HC.CONTENT_UPDATE_ADD, ( 1.0, { local_hash_empty } ) ) ]
         scu[ TC.LOCAL_RATING_NUMERICAL_SERVICE_KEY ] = [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_RATINGS, HC.CONTENT_UPDATE_ADD, ( 0.8, { local_hash_empty } ) ) ]
         
-        assertSCUEqual( result[0], scu )
+        assertSCUEqual( result, scu )
         
         #
         #
         
@@ -363,7 +353,7 @@ class TestSerialisables( unittest.TestCase ):
         scu[ TC.LOCAL_RATING_LIKE_SERVICE_KEY ] = [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_RATINGS, HC.CONTENT_UPDATE_ADD, ( 1.0, { local_hash_empty } ) ) ]
         scu[ TC.LOCAL_RATING_NUMERICAL_SERVICE_KEY ] = [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_RATINGS, HC.CONTENT_UPDATE_ADD, ( 0.8, { local_hash_empty } ) ) ]
         
-        assertSCUEqual( result[0], scu )
+        assertSCUEqual( result, scu )
         
         #
         
@@ -375,7 +365,7 @@ class TestSerialisables( unittest.TestCase ):
         scu[ TC.LOCAL_RATING_LIKE_SERVICE_KEY ] = [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_RATINGS, HC.CONTENT_UPDATE_ADD, ( 1.0, { local_hash_empty } ) ) ]
         scu[ TC.LOCAL_RATING_NUMERICAL_SERVICE_KEY ] = [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_RATINGS, HC.CONTENT_UPDATE_ADD, ( 0.8, { local_hash_empty } ) ) ]
         
-        assertSCUEqual( result[0], scu )
+        assertSCUEqual( result, scu )
         
         #
         
@@ -385,7 +375,7 @@ class TestSerialisables( unittest.TestCase ):
         
         scu[ CC.LOCAL_TAG_SERVICE_KEY ] = [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_MAPPINGS, HC.CONTENT_UPDATE_ADD, ( 'one', { two_hash } ) ), HydrusData.ContentUpdate( HC.CONTENT_TYPE_MAPPINGS, HC.CONTENT_UPDATE_ADD, ( 'two', { one_hash } ) ) ]
         
-        assertSCUEqual( result[0], scu )
+        assertSCUEqual( result, scu )
         
     
     def test_SERIALISABLE_TYPE_SHORTCUT( self ):