Version 305
|
@ -8,6 +8,43 @@
|
||||||
<div class="content">
|
<div class="content">
|
||||||
<h3>changelog</h3>
|
<h3>changelog</h3>
|
||||||
<ul>
|
<ul>
|
||||||
|
<li><h3>version 305</h3></li>
|
||||||
|
<ul>
|
||||||
|
<li>fixed the pixiv url class, which was unintentionally removing a parameter</li>
|
||||||
|
<li>wrote a pixiv parser in the new system, fixing a whole bunch of tag parsing along the way, and it also parses 'source time'! by default, pixiv now fetches the translated/romaji versions of tags</li>
|
||||||
|
<li>finished a safebooru parser that also handles source time and source urls</li>
|
||||||
|
<li>finished an e621 parser that also handles source time and source urls and hash!</li>
|
||||||
|
<li>wrote a danbooru parser that also handles source time and source urls and hash!</li>
|
||||||
|
<li>as a result, danbooru, safebooru, e621, and pixiv post urls are now drag-and-droppable onto the client!</li>
|
||||||
|
<li>finished up a full yiff.party watcher from another contribution by @cuddlebear on the discord, including url classes and a full parser, meaning yiff.party artist urls are now droppable onto the client and will spawn thread watchers (I expect to add some kind of subscription support for watchers in the future). inline links are supported, and there is source time and limited filename and hash parsing</li>
|
||||||
|
<li>fixed some thread watcher tag association problems in the new system</li>
|
||||||
|
<li>when pages put an (x) number after their name for number of files, they will now also put an (x/y) import total (if appropriate and not complete) as well. this also sums up through page of pages!</li>
|
||||||
|
<li>if a call to close a page of pages or the application would present more than one page's 'I am still importing' complaint, all the complaints are now summarised in a single yes/no dialog</li>
|
||||||
|
<li>url downloader pages now run a 'are you sure you want to close this page' when their import queues are unfinished and unpaused</li>
|
||||||
|
<li>if the subscriptions for 'manage subscriptions' take more than a second to load, a popup will come up with load progress. the popup is cancellable</li>
|
||||||
|
<li>added a prototype 'open in web browser' to the thumbnail right-click share menu. it will only appear on Windows if you are in advanced mode, as atm it mostly just launches the file in the default program, not the browser. I will keep working on this</li>
|
||||||
|
<li>harmonised more old download code into a single location in the new system</li>
|
||||||
|
<li>created a neater network job factory system for generalised network requests at the import job level</li>
|
||||||
|
<li>created a neater presentation context factory system for generalised and reliable set/clear network job ui presentation at the import job level</li>
|
||||||
|
<li>moved the new downloader simple-file-download-and-import to the new file object and harmonised all downloader code to call this single location where possible</li>
|
||||||
|
<li>did the same thing with the download-post-and-then-fetch-tags-and-file job and added hooks for it in the subscription and gallery downloader loops (where a parser match for the url is found)</li>
|
||||||
|
<li>the simple downloader and urls downloader now use 'downloader instance' network jobs, so they obey a couple more bandwidth rules</li>
|
||||||
|
<li>harmonised how imported media is then presented to pages as thumbnails through the new main import object</li>
|
||||||
|
<li>the new post downloader sets up referral urls for the file download (which are needed for pixiv and anything else picky) automatically</li>
|
||||||
|
<li>improved file download/import error reporting a little</li>
|
||||||
|
<li>entering an invalid regex phrase in the stringmatch panel (as happens all the time as you type it) will now present the error in the status area rather than spamming popups</li>
|
||||||
|
<li>fixed a bug in the new parsing gui that was prohibiting editing a date decode string transformation</li>
|
||||||
|
<li>fixed enabling of additional date decode controls in the string transformations edit panel</li>
|
||||||
|
<li>added a hyperlink to the date decoding controls that links to a python date decoding explainer</li>
|
||||||
|
<li>if a source time in the new parsing system suggests a time in the future, it will now clip to 30s ago</li>
|
||||||
|
<li>misc downloader refactoring and cleanup</li>
|
||||||
|
<li>fixed an issue where new file lookup scripts were initialising with bad string transformation rows and breaking the whole dialog in subsequent calls, fugg</li>
|
||||||
|
<li>hid the 'find similar files' menu entry for images that have duration (gifs and apngs), which are not yet supported</li>
|
||||||
|
<li>added 'flip_debug_force_idle_mode_do_not_set_this' to main_gui shortcut set. only set it if you are an advanced user and prepared for the potential consequences</li>
|
||||||
|
<li>silenced a problem with the newgrounds gallery parser; will fix it properly next week</li>
|
||||||
|
<li>fixed some old busted unit test code</li>
|
||||||
|
<li>rejiggered some thumb dupe menu entry layout</li>
|
||||||
|
</ul>
|
||||||
<li><h3>version 304</h3></li>
|
<li><h3>version 304</h3></li>
|
||||||
<ul>
|
<ul>
|
||||||
<li>renamed the new 'tagcensor' object to 'tagfilter' (since it will end up doing a bunch of non-censoring jobs) and refactored it into clienttags</li>
|
<li>renamed the new 'tagcensor' object to 'tagfilter' (since it will end up doing a bunch of non-censoring jobs) and refactored it into clienttags</li>
|
||||||
|
|
|
@ -339,7 +339,7 @@ SHORTCUTS_RESERVED_NAMES = [ 'archive_delete_filter', 'duplicate_filter', 'media
|
||||||
SHORTCUTS_MEDIA_ACTIONS = [ 'manage_file_tags', 'manage_file_ratings', 'manage_file_urls', 'manage_file_notes', 'archive_file', 'inbox_file', 'delete_file', 'export_files', 'remove_file_from_view', 'open_file_in_external_program', 'open_selection_in_new_page', 'launch_the_archive_delete_filter', 'copy_bmp', 'copy_file', 'copy_path', 'copy_sha256_hash', 'get_similar_to_exact', 'get_similar_to_very_similar', 'get_similar_to_similar', 'get_similar_to_speculative', 'duplicate_media_remove_relationships', 'duplicate_media_reset_to_potential', 'duplicate_media_set_alternate', 'duplicate_media_set_custom', 'duplicate_media_set_focused_better', 'duplicate_media_set_not_duplicate', 'duplicate_media_set_same_quality' ]
|
SHORTCUTS_MEDIA_ACTIONS = [ 'manage_file_tags', 'manage_file_ratings', 'manage_file_urls', 'manage_file_notes', 'archive_file', 'inbox_file', 'delete_file', 'export_files', 'remove_file_from_view', 'open_file_in_external_program', 'open_selection_in_new_page', 'launch_the_archive_delete_filter', 'copy_bmp', 'copy_file', 'copy_path', 'copy_sha256_hash', 'get_similar_to_exact', 'get_similar_to_very_similar', 'get_similar_to_similar', 'get_similar_to_speculative', 'duplicate_media_remove_relationships', 'duplicate_media_reset_to_potential', 'duplicate_media_set_alternate', 'duplicate_media_set_custom', 'duplicate_media_set_focused_better', 'duplicate_media_set_not_duplicate', 'duplicate_media_set_same_quality' ]
|
||||||
SHORTCUTS_MEDIA_VIEWER_ACTIONS = [ 'move_animation_to_previous_frame', 'move_animation_to_next_frame', 'switch_between_fullscreen_borderless_and_regular_framed_window', 'pan_up', 'pan_down', 'pan_left', 'pan_right', 'zoom_in', 'zoom_out', 'switch_between_100_percent_and_canvas_zoom', 'flip_darkmode' ]
|
SHORTCUTS_MEDIA_VIEWER_ACTIONS = [ 'move_animation_to_previous_frame', 'move_animation_to_next_frame', 'switch_between_fullscreen_borderless_and_regular_framed_window', 'pan_up', 'pan_down', 'pan_left', 'pan_right', 'zoom_in', 'zoom_out', 'switch_between_100_percent_and_canvas_zoom', 'flip_darkmode' ]
|
||||||
SHORTCUTS_MEDIA_VIEWER_BROWSER_ACTIONS = [ 'view_next', 'view_first', 'view_last', 'view_previous' ]
|
SHORTCUTS_MEDIA_VIEWER_BROWSER_ACTIONS = [ 'view_next', 'view_first', 'view_last', 'view_previous' ]
|
||||||
SHORTCUTS_MAIN_GUI_ACTIONS = [ 'refresh', 'new_page', 'new_page_of_pages', 'new_duplicate_filter_page', 'new_url_downloader_page', 'new_simple_downloader_page', 'new_watcher_downloader_page', 'synchronised_wait_switch', 'set_media_focus', 'show_hide_splitters', 'set_search_focus', 'unclose_page', 'close_page', 'redo', 'undo', 'flip_darkmode', 'check_all_import_folders' ]
|
SHORTCUTS_MAIN_GUI_ACTIONS = [ 'refresh', 'new_page', 'new_page_of_pages', 'new_duplicate_filter_page', 'new_url_downloader_page', 'new_simple_downloader_page', 'new_watcher_downloader_page', 'synchronised_wait_switch', 'set_media_focus', 'show_hide_splitters', 'set_search_focus', 'unclose_page', 'close_page', 'redo', 'undo', 'flip_darkmode', 'check_all_import_folders', 'flip_debug_force_idle_mode_do_not_set_this' ]
|
||||||
SHORTCUTS_DUPLICATE_FILTER_ACTIONS = [ 'duplicate_filter_this_is_better', 'duplicate_filter_exactly_the_same', 'duplicate_filter_alternates', 'duplicate_filter_not_dupes', 'duplicate_filter_custom_action', 'duplicate_filter_skip', 'duplicate_filter_back' ]
|
SHORTCUTS_DUPLICATE_FILTER_ACTIONS = [ 'duplicate_filter_this_is_better', 'duplicate_filter_exactly_the_same', 'duplicate_filter_alternates', 'duplicate_filter_not_dupes', 'duplicate_filter_custom_action', 'duplicate_filter_skip', 'duplicate_filter_back' ]
|
||||||
SHORTCUTS_ARCHIVE_DELETE_FILTER_ACTIONS = [ 'archive_delete_filter_keep', 'archive_delete_filter_delete', 'archive_delete_filter_skip', 'archive_delete_filter_back' ]
|
SHORTCUTS_ARCHIVE_DELETE_FILTER_ACTIONS = [ 'archive_delete_filter_keep', 'archive_delete_filter_delete', 'archive_delete_filter_skip', 'archive_delete_filter_back' ]
|
||||||
|
|
||||||
|
|
|
@ -9933,11 +9933,11 @@ class DB( HydrusDB.HydrusDB ):
|
||||||
|
|
||||||
new_yiff_sdf = [ sdf for sdf in default_sdf if 'yiff' in sdf.GetName() ]
|
new_yiff_sdf = [ sdf for sdf in default_sdf if 'yiff' in sdf.GetName() ]
|
||||||
|
|
||||||
existing_sdf = new_options.GetSimpleDownloaderFormulae()
|
existing_sdf = list( new_options.GetSimpleDownloaderFormulae() )
|
||||||
|
|
||||||
existing_sdf.extend( new_yiff_sdf )
|
existing_sdf.extend( new_yiff_sdf )
|
||||||
|
|
||||||
new_options.SetSimpleDownloaderFormulae( ClientDefaults.GetDefaultSimpleDownloaderFormulae() )
|
new_options.SetSimpleDownloaderFormulae( existing_sdf )
|
||||||
|
|
||||||
self._SetJSONDump( new_options )
|
self._SetJSONDump( new_options )
|
||||||
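The corrected v303→v304 update above copies the user's existing simple downloader formulae and appends only the new yiff.party entries, where the earlier code reset the whole list to defaults, discarding user customisations. A minimal standalone sketch of that merge pattern (names are hypothetical, not the real hydrus objects):

```python
def merge_new_defaults(existing, defaults, wanted):
    """Append matching default entries to a copy of the user's list."""
    # copy first: the buggy update mutated nothing of the user's and then
    # overwrote the stored list with the full defaults
    merged = list(existing)
    merged.extend(d for d in defaults if wanted(d))
    return merged

user_formulae = ['gelbooru gallery', 'my custom formula']
default_formulae = ['gelbooru gallery', 'yiff.party file attachment']

merged = merge_new_defaults(user_formulae, default_formulae, lambda name: 'yiff' in name)
# user entries survive untouched and only the new yiff formula is appended
```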
|
|
||||||
|
@ -10032,6 +10032,60 @@ class DB( HydrusDB.HydrusDB ):
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
|
if version == 304:
|
||||||
|
|
||||||
|
try:
|
||||||
|
|
||||||
|
domain_manager = self._GetJSONDump( HydrusSerialisable.SERIALISABLE_TYPE_NETWORK_DOMAIN_MANAGER )
|
||||||
|
|
||||||
|
domain_manager.Initialise()
|
||||||
|
|
||||||
|
#
|
||||||
|
|
||||||
|
url_matches = ClientDefaults.GetDefaultURLMatches()
|
||||||
|
|
||||||
|
url_matches = [ url_match for url_match in url_matches if 'pixiv file page' in url_match.GetName() or 'yiff.party' in url_match.GetName() ]
|
||||||
|
|
||||||
|
existing_url_matches = [ url_match for url_match in domain_manager.GetURLMatches() if 'pixiv file page' not in url_match.GetName() and 'yiff.party' not in url_match.GetName() ]
|
||||||
|
|
||||||
|
url_matches.extend( existing_url_matches )
|
||||||
|
|
||||||
|
domain_manager.SetURLMatches( url_matches )
|
||||||
|
|
||||||
|
#
|
||||||
|
|
||||||
|
existing_parsers = domain_manager.GetParsers()
|
||||||
|
|
||||||
|
existing_names = { parser.GetName() for parser in existing_parsers }
|
||||||
|
|
||||||
|
new_parsers = list( existing_parsers )
|
||||||
|
|
||||||
|
default_parsers = ClientDefaults.GetDefaultParsers()
|
||||||
|
|
||||||
|
interesting_new_parsers = [ parser for parser in default_parsers if parser.GetName() not in existing_names ] # add it
|
||||||
|
|
||||||
|
new_parsers.extend( interesting_new_parsers )
|
||||||
|
|
||||||
|
domain_manager.SetParsers( new_parsers )
|
||||||
|
|
||||||
|
#
|
||||||
|
|
||||||
|
domain_manager.TryToLinkURLMatchesAndParsers()
|
||||||
|
|
||||||
|
#
|
||||||
|
|
||||||
|
self._SetJSONDump( domain_manager )
|
||||||
|
|
||||||
|
except Exception as e:
|
||||||
|
|
||||||
|
HydrusData.PrintException( e )
|
||||||
|
|
||||||
|
message = 'Trying to update pixiv url class failed! Please let hydrus dev know!'
|
||||||
|
|
||||||
|
self.pub_initial_message( message )
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
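The parser half of the v304→v305 update above merges the default parsers into the user's set keyed by name, so user-modified parsers with matching names survive and only genuinely new defaults are added. A sketch of that pattern, using `(name, payload)` tuples as stand-ins for the serialisable parser objects and their `GetName()` calls:

```python
def merge_defaults_by_name(existing, defaults):
    """Keep every existing object; add only defaults whose names are new."""
    existing_names = {name for (name, _) in existing}
    merged = list(existing)
    merged.extend(item for item in defaults if item[0] not in existing_names)
    return merged

existing_parsers = [('pixiv file page parser', 'user tweaked')]
default_parsers = [
    ('pixiv file page parser', 'default'),
    ('safebooru file page parser', 'default'),
]

merged_parsers = merge_defaults_by_name(existing_parsers, default_parsers)
# the user's tweaked pixiv parser wins; only the safebooru default is added
```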
self._controller.pub( 'splash_set_title_text', 'updated db to v' + str( version + 1 ) )
|
self._controller.pub( 'splash_set_title_text', 'updated db to v' + str( version + 1 ) )
|
||||||
|
|
||||||
self._c.execute( 'UPDATE version SET version = ?;', ( version + 1, ) )
|
self._c.execute( 'UPDATE version SET version = ?;', ( version + 1, ) )
|
||||||
|
|
|
@ -1387,7 +1387,14 @@ class GalleryNewgrounds( Gallery ):
|
||||||
|
|
||||||
fatcol = soup.find( 'div', class_ = 'fatcol' )
|
fatcol = soup.find( 'div', class_ = 'fatcol' )
|
||||||
|
|
||||||
links = fatcol.find_all( 'a' )
|
if fatcol is not None:
|
||||||
|
|
||||||
|
links = fatcol.find_all( 'a' )
|
||||||
|
|
||||||
|
else:
|
||||||
|
|
||||||
|
links = []
|
||||||
|
|
||||||
|
|
||||||
urls_set = set()
|
urls_set = set()
|
||||||
|
|
||||||
|
|
|
@ -2444,7 +2444,43 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
subscriptions = HG.client_controller.Read( 'serialisable_named', HydrusSerialisable.SERIALISABLE_TYPE_SUBSCRIPTION )
|
job_key = ClientThreading.JobKey( cancellable = True )
|
||||||
|
|
||||||
|
job_key.SetVariable( 'popup_title', 'loading subscriptions' )
|
||||||
|
|
||||||
|
subscription_names = HG.client_controller.Read( 'serialisable_names', HydrusSerialisable.SERIALISABLE_TYPE_SUBSCRIPTION )
|
||||||
|
|
||||||
|
pubbed_it = False
|
||||||
|
started = HydrusData.GetNowFloat()
|
||||||
|
num_to_do = len( subscription_names )
|
||||||
|
|
||||||
|
subscriptions = []
|
||||||
|
|
||||||
|
for ( i, name ) in enumerate( subscription_names ):
|
||||||
|
|
||||||
|
if job_key.IsCancelled():
|
||||||
|
|
||||||
|
job_key.Delete()
|
||||||
|
|
||||||
|
return
|
||||||
|
|
||||||
|
|
||||||
|
if not pubbed_it and HydrusData.TimeHasPassedFloat( started + 1.0 ):
|
||||||
|
|
||||||
|
self._controller.pub( 'message', job_key )
|
||||||
|
|
||||||
|
pubbed_it = True
|
||||||
|
|
||||||
|
|
||||||
|
job_key.SetVariable( 'popup_text_1', HydrusData.ConvertValueRangeToPrettyString( i + 1, num_to_do ) + ': ' + name )
|
||||||
|
job_key.SetVariable( 'popup_gauge_1', ( i + 1, num_to_do ) )
|
||||||
|
|
||||||
|
subscription = HG.client_controller.Read( 'serialisable_named', HydrusSerialisable.SERIALISABLE_TYPE_SUBSCRIPTION, name )
|
||||||
|
|
||||||
|
subscriptions.append( subscription )
|
||||||
|
|
||||||
|
|
||||||
|
job_key.Delete()
|
||||||
|
|
||||||
controller.CallBlockingToWx( wx_do_it, subscriptions, original_pause_status )
|
controller.CallBlockingToWx( wx_do_it, subscriptions, original_pause_status )
|
||||||
|
|
||||||
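The subscription-loading change above fetches names first, loads each subscription individually, and only publishes a cancellable progress popup once the load has taken more than a second. A simplified sketch of that load loop, with hypothetical callbacks standing in for the hydrus Read call, `JobKey.IsCancelled` and the popup publish:

```python
import time

def load_named_items(names, load_one, should_cancel, publish_progress, delay=1.0):
    """Load items one by one, surfacing progress only if the job runs long."""
    started = time.time()
    published = False
    items = []

    for name in names:
        if should_cancel():
            return None  # user cancelled: abandon the load

        if not published and time.time() > started + delay:
            publish_progress()  # popup only appears when the load is actually slow
            published = True

        items.append(load_one(name))

    return items
```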
|
@ -3728,7 +3764,7 @@ The password is cleartext here but obscured in the entry dialog. Enter a blank p
|
||||||
|
|
||||||
self._notebook.TestAbleToClose()
|
self._notebook.TestAbleToClose()
|
||||||
|
|
||||||
except HydrusExceptions.PermissionException:
|
except HydrusExceptions.VetoException:
|
||||||
|
|
||||||
return False
|
return False
|
||||||
|
|
||||||
|
@ -4157,6 +4193,14 @@ The password is cleartext here but obscured in the entry dialog. Enter a blank p
|
||||||
|
|
||||||
self._controller.pub( 'undo' )
|
self._controller.pub( 'undo' )
|
||||||
|
|
||||||
|
elif action == 'flip_debug_force_idle_mode_do_not_set_this':
|
||||||
|
|
||||||
|
self._SwitchBoolean( 'force_idle_mode' )
|
||||||
|
|
||||||
|
self._DirtyMenu( 'help' )
|
||||||
|
|
||||||
|
self._menu_updater.Update()
|
||||||
|
|
||||||
else:
|
else:
|
||||||
|
|
||||||
command_processed = False
|
command_processed = False
|
||||||
|
|
|
@ -693,6 +693,49 @@ class ManagementController( HydrusSerialisable.SerialisableBase ):
|
||||||
return self._management_type
|
return self._management_type
|
||||||
|
|
||||||
|
|
||||||
|
def GetValueRange( self ):
|
||||||
|
|
||||||
|
try:
|
||||||
|
|
||||||
|
if self._management_type == MANAGEMENT_TYPE_IMPORT_GALLERY:
|
||||||
|
|
||||||
|
gallery_import = self._serialisables[ 'gallery_import' ]
|
||||||
|
|
||||||
|
return gallery_import.GetValueRange()
|
||||||
|
|
||||||
|
elif self._management_type == MANAGEMENT_TYPE_IMPORT_HDD:
|
||||||
|
|
||||||
|
hdd_import = self._serialisables[ 'hdd_import' ]
|
||||||
|
|
||||||
|
return hdd_import.GetValueRange()
|
||||||
|
|
||||||
|
elif self._management_type == MANAGEMENT_TYPE_IMPORT_SIMPLE_DOWNLOADER:
|
||||||
|
|
||||||
|
simple_downloader_import = self._serialisables[ 'simple_downloader_import' ]
|
||||||
|
|
||||||
|
return simple_downloader_import.GetValueRange()
|
||||||
|
|
||||||
|
elif self._management_type == MANAGEMENT_TYPE_IMPORT_THREAD_WATCHER:
|
||||||
|
|
||||||
|
thread_watcher_import = self._serialisables[ 'thread_watcher_import' ]
|
||||||
|
|
||||||
|
return thread_watcher_import.GetValueRange()
|
||||||
|
|
||||||
|
elif self._management_type == MANAGEMENT_TYPE_IMPORT_URLS:
|
||||||
|
|
||||||
|
urls_import = self._serialisables[ 'urls_import' ]
|
||||||
|
|
||||||
|
return urls_import.GetValueRange()
|
||||||
|
|
||||||
|
|
||||||
|
except KeyError:
|
||||||
|
|
||||||
|
return ( 0, 0 )
|
||||||
|
|
||||||
|
|
||||||
|
return ( 0, 0 )
|
||||||
|
|
||||||
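`GetValueRange` above dispatches to whichever importer the management type selects and falls back to `( 0, 0 )` on `KeyError` when that importer has not been set up in `self._serialisables` yet. The dispatch can be sketched as a plain dict lookup (stub names hypothetical):

```python
class _StubImport:
    """Stand-in for a gallery/hdd/watcher import object."""
    def GetValueRange(self):
        return (3, 10)

def get_value_range(importers, management_type):
    """Return the selected importer's (done, total), or (0, 0) if unset."""
    try:
        return importers[management_type].GetValueRange()
    except KeyError:
        # the importer key is missing while the page is still initialising
        return (0, 0)

importers = {'gallery_import': _StubImport()}
```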
|
|
||||||
def GetVariable( self, name ):
|
def GetVariable( self, name ):
|
||||||
|
|
||||||
if name in self._simples:
|
if name in self._simples:
|
||||||
|
@ -719,6 +762,8 @@ class ManagementController( HydrusSerialisable.SerialisableBase ):
|
||||||
return thread_watcher_import.IsDead()
|
return thread_watcher_import.IsDead()
|
||||||
|
|
||||||
|
|
||||||
|
return False
|
||||||
|
|
||||||
|
|
||||||
def IsImporter( self ):
|
def IsImporter( self ):
|
||||||
|
|
||||||
|
@ -790,6 +835,11 @@ class ManagementPanel( wx.lib.scrolledpanel.ScrolledPanel ):
|
||||||
sizer.Add( tags_box, CC.FLAGS_EXPAND_BOTH_WAYS )
|
sizer.Add( tags_box, CC.FLAGS_EXPAND_BOTH_WAYS )
|
||||||
|
|
||||||
|
|
||||||
|
def CheckAbleToClose( self ):
|
||||||
|
|
||||||
|
pass
|
||||||
|
|
||||||
|
|
||||||
def CleanBeforeDestroy( self ):
|
def CleanBeforeDestroy( self ):
|
||||||
|
|
||||||
pass
|
pass
|
||||||
|
@ -815,11 +865,6 @@ class ManagementPanel( wx.lib.scrolledpanel.ScrolledPanel ):
|
||||||
pass
|
pass
|
||||||
|
|
||||||
|
|
||||||
def TestAbleToClose( self ):
|
|
||||||
|
|
||||||
pass
|
|
||||||
|
|
||||||
|
|
||||||
def TIMERPageUpdate( self ):
|
def TIMERPageUpdate( self ):
|
||||||
|
|
||||||
pass
|
pass
|
||||||
|
@ -1571,6 +1616,14 @@ class ManagementPanelImporterGallery( ManagementPanelImporter ):
|
||||||
self._current_action.SetLabelText( current_action )
|
self._current_action.SetLabelText( current_action )
|
||||||
|
|
||||||
|
|
||||||
|
def CheckAbleToClose( self ):
|
||||||
|
|
||||||
|
if self._gallery_import.CurrentlyWorking():
|
||||||
|
|
||||||
|
raise HydrusExceptions.VetoException( 'This page is still importing.' )
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
def EventAdvance( self, event ):
|
def EventAdvance( self, event ):
|
||||||
|
|
||||||
selected_indices = self._pending_queries_listbox.GetSelections()
|
selected_indices = self._pending_queries_listbox.GetSelections()
|
||||||
|
@ -1656,20 +1709,6 @@ class ManagementPanelImporterGallery( ManagementPanelImporter ):
|
||||||
self._gallery_import.Start( self._page_key )
|
self._gallery_import.Start( self._page_key )
|
||||||
|
|
||||||
|
|
||||||
def TestAbleToClose( self ):
|
|
||||||
|
|
||||||
if self._gallery_import.CurrentlyWorking():
|
|
||||||
|
|
||||||
with ClientGUIDialogs.DialogYesNo( self, 'This page is still importing. Are you sure you want to close it?' ) as dlg:
|
|
||||||
|
|
||||||
if dlg.ShowModal() == wx.ID_NO:
|
|
||||||
|
|
||||||
raise HydrusExceptions.PermissionException()
|
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
management_panel_types_to_classes[ MANAGEMENT_TYPE_IMPORT_GALLERY ] = ManagementPanelImporterGallery
|
management_panel_types_to_classes[ MANAGEMENT_TYPE_IMPORT_GALLERY ] = ManagementPanelImporterGallery
|
||||||
|
|
||||||
class ManagementPanelImporterHDD( ManagementPanelImporter ):
|
class ManagementPanelImporterHDD( ManagementPanelImporter ):
|
||||||
|
@ -1748,6 +1787,14 @@ class ManagementPanelImporterHDD( ManagementPanelImporter ):
|
||||||
self._current_action.SetLabelText( current_action )
|
self._current_action.SetLabelText( current_action )
|
||||||
|
|
||||||
|
|
||||||
|
def CheckAbleToClose( self ):
|
||||||
|
|
||||||
|
if self._hdd_import.CurrentlyWorking():
|
||||||
|
|
||||||
|
raise HydrusExceptions.VetoException( 'This page is still importing.' )
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
def EventPause( self, event ):
|
def EventPause( self, event ):
|
||||||
|
|
||||||
self._hdd_import.PausePlay()
|
self._hdd_import.PausePlay()
|
||||||
|
@ -1760,20 +1807,6 @@ class ManagementPanelImporterHDD( ManagementPanelImporter ):
|
||||||
self._hdd_import.Start( self._page_key )
|
self._hdd_import.Start( self._page_key )
|
||||||
|
|
||||||
|
|
||||||
def TestAbleToClose( self ):
|
|
||||||
|
|
||||||
if self._hdd_import.CurrentlyWorking():
|
|
||||||
|
|
||||||
with ClientGUIDialogs.DialogYesNo( self, 'This page is still importing. Are you sure you want to close it?' ) as dlg:
|
|
||||||
|
|
||||||
if dlg.ShowModal() == wx.ID_NO:
|
|
||||||
|
|
||||||
raise HydrusExceptions.PermissionException()
|
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
management_panel_types_to_classes[ MANAGEMENT_TYPE_IMPORT_HDD ] = ManagementPanelImporterHDD
|
management_panel_types_to_classes[ MANAGEMENT_TYPE_IMPORT_HDD ] = ManagementPanelImporterHDD
|
||||||
|
|
||||||
class ManagementPanelImporterSimpleDownloader( ManagementPanelImporter ):
|
class ManagementPanelImporterSimpleDownloader( ManagementPanelImporter ):
|
||||||
|
@ -2118,6 +2151,14 @@ class ManagementPanelImporterSimpleDownloader( ManagementPanelImporter ):
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
|
def CheckAbleToClose( self ):
|
||||||
|
|
||||||
|
if self._simple_downloader_import.CurrentlyWorking():
|
||||||
|
|
||||||
|
raise HydrusExceptions.VetoException( 'This page is still importing.' )
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
def EventAdvance( self, event ):
|
def EventAdvance( self, event ):
|
||||||
|
|
||||||
selection = self._pending_jobs_listbox.GetSelection()
|
selection = self._pending_jobs_listbox.GetSelection()
|
||||||
|
@ -2201,20 +2242,6 @@ class ManagementPanelImporterSimpleDownloader( ManagementPanelImporter ):
|
||||||
self._simple_downloader_import.Start( self._page_key )
|
self._simple_downloader_import.Start( self._page_key )
|
||||||
|
|
||||||
|
|
||||||
def TestAbleToClose( self ):
|
|
||||||
|
|
||||||
if self._simple_downloader_import.CurrentlyWorking():
|
|
||||||
|
|
||||||
with ClientGUIDialogs.DialogYesNo( self, 'This page is still importing. Are you sure you want to close it?' ) as dlg:
|
|
||||||
|
|
||||||
if dlg.ShowModal() == wx.ID_NO:
|
|
||||||
|
|
||||||
raise HydrusExceptions.PermissionException()
|
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
management_panel_types_to_classes[ MANAGEMENT_TYPE_IMPORT_SIMPLE_DOWNLOADER ] = ManagementPanelImporterSimpleDownloader
|
management_panel_types_to_classes[ MANAGEMENT_TYPE_IMPORT_SIMPLE_DOWNLOADER ] = ManagementPanelImporterSimpleDownloader
|
||||||
|
|
||||||
class ManagementPanelImporterThreadWatcher( ManagementPanelImporter ):
|
class ManagementPanelImporterThreadWatcher( ManagementPanelImporter ):
|
||||||
|
@ -2463,6 +2490,17 @@ class ManagementPanelImporterThreadWatcher( ManagementPanelImporter ):
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
|
def CheckAbleToClose( self ):
|
||||||
|
|
||||||
|
if self._thread_watcher_import.HasThread():
|
||||||
|
|
||||||
|
if self._thread_watcher_import.CurrentlyWorking():
|
||||||
|
|
||||||
|
raise HydrusExceptions.VetoException( 'This page is still importing.' )
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
def EventCheckNow( self, event ):
|
def EventCheckNow( self, event ):
|
||||||
|
|
||||||
self._thread_watcher_import.CheckNow()
|
self._thread_watcher_import.CheckNow()
|
||||||
|
@ -2527,23 +2565,6 @@ class ManagementPanelImporterThreadWatcher( ManagementPanelImporter ):
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
def TestAbleToClose( self ):
|
|
||||||
|
|
||||||
if self._thread_watcher_import.HasThread():
|
|
||||||
|
|
||||||
if self._thread_watcher_import.CurrentlyWorking():
|
|
||||||
|
|
||||||
with ClientGUIDialogs.DialogYesNo( self, 'This page is still importing. Are you sure you want to close it?' ) as dlg:
|
|
||||||
|
|
||||||
if dlg.ShowModal() == wx.ID_NO:
|
|
||||||
|
|
||||||
raise HydrusExceptions.PermissionException()
|
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
management_panel_types_to_classes[ MANAGEMENT_TYPE_IMPORT_THREAD_WATCHER ] = ManagementPanelImporterThreadWatcher
|
management_panel_types_to_classes[ MANAGEMENT_TYPE_IMPORT_THREAD_WATCHER ] = ManagementPanelImporterThreadWatcher
|
||||||
|
|
||||||
class ManagementPanelImporterURLs( ManagementPanelImporter ):
|
class ManagementPanelImporterURLs( ManagementPanelImporter ):
|
||||||
|
@ -2661,6 +2682,14 @@ class ManagementPanelImporterURLs( ManagementPanelImporter ):
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
|
def CheckAbleToClose( self ):
|
||||||
|
|
||||||
|
if self._urls_import.CurrentlyWorking():
|
||||||
|
|
||||||
|
raise HydrusExceptions.VetoException( 'This page is still importing.' )
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
def EventPause( self, event ):
|
def EventPause( self, event ):
|
||||||
|
|
||||||
self._urls_import.PausePlay()
|
self._urls_import.PausePlay()
|
||||||
|
|
|
@ -1095,6 +1095,26 @@ class MediaPanel( ClientMedia.ListeningMediaList, wx.ScrolledWindow ):
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
|
def _OpenFileInWebBrowser( self ):
|
||||||
|
|
||||||
|
if self._focussed_media is not None:
|
||||||
|
|
||||||
|
if self._focussed_media.GetLocationsManager().IsLocal():
|
||||||
|
|
||||||
|
hash = self._focussed_media.GetHash()
|
||||||
|
mime = self._focussed_media.GetMime()
|
||||||
|
|
||||||
|
client_files_manager = HG.client_controller.client_files_manager
|
||||||
|
|
||||||
|
path = client_files_manager.GetFilePath( hash, mime )
|
||||||
|
|
||||||
|
self._SetFocussedMedia( None )
|
||||||
|
|
||||||
|
webbrowser.open( 'file://' + path )
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
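`_OpenFileInWebBrowser` above builds its URL with `'file://' + path`, which works for simple paths but mishandles Windows backslashes and characters that need percent-encoding. A hedged alternative sketch using `pathlib` to build the URI (returning it rather than opening, for demonstration; the real code would pass it to `webbrowser.open`):

```python
import pathlib

def build_file_url(path):
    """Build a proper percent-encoded file:// URL from a filesystem path."""
    # as_uri() requires an absolute path and handles platform separators
    # and percent-encoding, unlike naive 'file://' + path concatenation
    return pathlib.Path(path).absolute().as_uri()
```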
|
|
||||||
def _OpenFileLocation( self ):
|
def _OpenFileLocation( self ):
|
||||||
|
|
||||||
if self._focussed_media is not None:
|
if self._focussed_media is not None:
|
||||||
|
@ -3348,13 +3368,24 @@ class MediaPanelThumbnails( MediaPanel ):
|
||||||
|
|
||||||
#
|
#
|
||||||
|
|
||||||
if advanced_mode:
|
if focussed_is_local:
|
||||||
|
|
||||||
if not HC.PLATFORM_LINUX and focussed_is_local:
|
show_open_in_web = not HC.PLATFORM_WINDOWS or advanced_mode # let's turn it on for any Winlads who want to try
|
||||||
|
show_open_in_explorer = advanced_mode and not HC.PLATFORM_LINUX
|
||||||
|
|
||||||
|
if show_open_in_web or show_open_in_explorer:
|
||||||
|
|
||||||
open_menu = wx.Menu()
|
open_menu = wx.Menu()
|
||||||
|
|
||||||
ClientGUIMenus.AppendMenuItem( self, open_menu, 'in file browser', 'Show this file in your OS\'s file browser.', self._OpenFileLocation )
|
if show_open_in_web:
|
||||||
|
|
||||||
|
ClientGUIMenus.AppendMenuItem( self, open_menu, 'in web browser (prototype)', 'Show this file in your OS\'s web browser.', self._OpenFileInWebBrowser )
|
||||||
|
|
||||||
|
|
||||||
|
if show_open_in_explorer:
|
||||||
|
|
||||||
|
ClientGUIMenus.AppendMenuItem( self, open_menu, 'in file browser', 'Show this file in your OS\'s file browser.', self._OpenFileLocation )
|
||||||
|
|
||||||
|
|
||||||
ClientGUIMenus.AppendMenu( share_menu, open_menu, 'open' )
|
ClientGUIMenus.AppendMenu( share_menu, open_menu, 'open' )
|
||||||
|
|
||||||
|
@ -3547,6 +3578,8 @@ class MediaPanelThumbnails( MediaPanel ):
|
||||||
|
|
||||||
if advanced_mode:
|
if advanced_mode:
|
||||||
|
|
||||||
|
ClientGUIMenus.AppendSeparator( menu )
|
||||||
|
|
||||||
duplicates_menu = menu # this is important to make the menu flexible if not multiple selected
|
duplicates_menu = menu # this is important to make the menu flexible if not multiple selected
|
||||||
|
|
||||||
focussed_hash = self._focussed_media.GetDisplayMedia().GetHash()
|
focussed_hash = self._focussed_media.GetDisplayMedia().GetHash()
|
||||||
|
@ -3625,13 +3658,8 @@ class MediaPanelThumbnails( MediaPanel ):
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
if advanced_mode:
|
|
||||||
|
|
||||||
if self._focussed_media.HasImages():
|
if self._focussed_media.HasImages():
|
||||||
|
|
||||||
ClientGUIMenus.AppendSeparator( menu )
|
|
||||||
|
|
||||||
similar_menu = wx.Menu()
|
similar_menu = wx.Menu()
|
||||||
|
|
||||||
ClientGUIMenus.AppendMenuItem( self, similar_menu, 'exact match', 'Search the database for files that look precisely like this one.', self._GetSimilarTo, HC.HAMMING_EXACT_MATCH )
|
ClientGUIMenus.AppendMenuItem( self, similar_menu, 'exact match', 'Search the database for files that look precisely like this one.', self._GetSimilarTo, HC.HAMMING_EXACT_MATCH )
|
||||||
|
@ -3642,8 +3670,7 @@ class MediaPanelThumbnails( MediaPanel ):
|
||||||
ClientGUIMenus.AppendMenu( menu, similar_menu, 'find similar files' )
|
ClientGUIMenus.AppendMenu( menu, similar_menu, 'find similar files' )
|
||||||
|
|
||||||
|
|
||||||
|
ClientGUIMenus.AppendSeparator( menu )
|
||||||
if advanced_mode:
|
|
||||||
|
|
||||||
ClientGUIMenus.AppendMenuItem( self, menu, 'reparse files and regenerate thumbnails', 'Refresh this file\'s metadata and regenerate its thumbnails.', self._ReparseFile )
|
ClientGUIMenus.AppendMenuItem( self, menu, 'reparse files and regenerate thumbnails', 'Refresh this file\'s metadata and regenerate its thumbnails.', self._ReparseFile )
|
||||||
|
|
||||||
|
|
|
@ -10,6 +10,7 @@ import ClientGUICanvas
|
||||||
import ClientDownloading
|
import ClientDownloading
|
||||||
import ClientSearch
|
import ClientSearch
|
||||||
import ClientThreading
|
import ClientThreading
|
||||||
|
import collections
|
||||||
import hashlib
|
import hashlib
|
||||||
import HydrusData
|
import HydrusData
|
||||||
import HydrusExceptions
|
import HydrusExceptions
|
||||||
|
@ -475,6 +476,11 @@ class Page( wx.SplitterWindow ):
|
||||||
self._controller.pub( 'refresh_page_name', self._page_key )
|
self._controller.pub( 'refresh_page_name', self._page_key )
|
||||||
|
|
||||||
|
|
||||||
|
def CheckAbleToClose( self ):
|
||||||
|
|
||||||
|
self._management_panel.CheckAbleToClose()
|
||||||
|
|
||||||
|
|
||||||
def CleanBeforeDestroy( self ):
|
def CleanBeforeDestroy( self ):
|
||||||
|
|
||||||
self._management_panel.CleanBeforeDestroy()
|
self._management_panel.CleanBeforeDestroy()
|
||||||
|
@ -550,17 +556,26 @@ class Page( wx.SplitterWindow ):
|
||||||
return self._management_controller.GetPageName()
|
return self._management_controller.GetPageName()
|
||||||
|
|
||||||
|
|
||||||
def GetNumFiles( self ):
|
def GetNumFileSummary( self ):
|
||||||
|
|
||||||
if self._initialised:
|
if self._initialised:
|
||||||
|
|
||||||
return self._media_panel.GetNumFiles()
|
num_files = self._media_panel.GetNumFiles()
|
||||||
|
|
||||||
else:
|
else:
|
||||||
|
|
||||||
return len( self._initial_hashes )
|
num_files = len( self._initial_hashes )
|
||||||
|
|
||||||
|
|
||||||
|
( num_value, num_range ) = self._management_controller.GetValueRange()
|
||||||
|
|
||||||
|
if num_value == num_range:
|
||||||
|
|
||||||
|
( num_value, num_range ) = ( 0, 0 )
|
||||||
|
|
||||||
|
|
||||||
|
return ( num_files, ( num_value, num_range ) )
|
||||||
|
|
||||||
|
|
||||||
def GetPageKey( self ):
|
def GetPageKey( self ):
|
||||||
|
|
||||||
|
@@ -751,7 +766,22 @@ class Page( wx.SplitterWindow ):
     
     def TestAbleToClose( self ):
         
-        self._management_panel.TestAbleToClose()
+        try:
+            
+            self._management_panel.CheckAbleToClose()
+            
+        except HydrusExceptions.VetoException as e:
+            
+            reason = HydrusData.ToUnicode( e )
+            
+            with ClientGUIDialogs.DialogYesNo( self, reason + ' Are you sure you want to close it?' ) as dlg:
+                
+                if dlg.ShowModal() == wx.ID_NO:
+                    
+                    raise HydrusExceptions.VetoException()
+                
+            
+        
         
     
     def THREADLoadInitialMediaResults( self, controller, initial_hashes ):
@@ -912,7 +942,7 @@ class PagesNotebook( wx.Notebook ):
             
             page.TestAbleToClose()
             
-        except HydrusExceptions.PermissionException:
+        except HydrusExceptions.VetoException:
             
             return False
             
@@ -1252,9 +1282,16 @@ class PagesNotebook( wx.Notebook ):
         
         if page_file_count_display == CC.PAGE_FILE_COUNT_DISPLAY_ALL or ( page_file_count_display == CC.PAGE_FILE_COUNT_DISPLAY_ONLY_IMPORTERS and page.IsImporter() ):
             
-            num_files = page.GetNumFiles()
+            ( num_files, ( num_value, num_range ) ) = page.GetNumFileSummary()
             
-            page_name += ' (' + HydrusData.ConvertIntToPrettyString( num_files ) + ')'
+            num_string = HydrusData.ConvertIntToPrettyString( num_files )
+            
+            if num_range > 0 and num_value != num_range:
+                
+                num_string += ', ' + HydrusData.ConvertValueRangeToPrettyString( num_value, num_range )
+                
+            
+            page_name += ' (' + num_string + ')'
             
         
         safe_page_name = self.EscapeMnemonics( page_name )
@@ -1847,9 +1884,22 @@ class PagesNotebook( wx.Notebook ):
         
         return self._name
         
     
-    def GetNumFiles( self ):
+    def GetNumFileSummary( self ):
         
-        return sum( page.GetNumFiles() for page in self._GetPages() )
+        total_num_files = 0
+        total_num_value = 0
+        total_num_range = 0
+        
+        for page in self._GetPages():
+            
+            ( num_files, ( num_value, num_range ) ) = page.GetNumFileSummary()
+            
+            total_num_files += num_files
+            total_num_value += num_value
+            total_num_range += num_range
+            
+        
+        return ( total_num_files, ( total_num_value, total_num_range ) )
         
     
     def GetNumPages( self, only_my_level = False ):
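The page-of-pages aggregation above sums each child page's `( num_files, ( num_value, num_range ) )` tuple component-wise. A minimal standalone sketch of the same folding (hypothetical free function, not hydrus API):

```python
def sum_file_summaries( summaries ):
    
    # summaries: iterable of ( num_files, ( num_value, num_range ) ) tuples,
    # as GetNumFileSummary returns them; totals are summed component-wise
    total_num_files = 0
    total_num_value = 0
    total_num_range = 0
    
    for ( num_files, ( num_value, num_range ) ) in summaries:
        
        total_num_files += num_files
        total_num_value += num_value
        total_num_range += num_range
        
    
    return ( total_num_files, ( total_num_value, total_num_range ) )
```

This is why a page of pages can show an (x/y) import total: completed pages contribute ( 0, 0 ) and drop out of the visible ratio.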
@@ -1912,7 +1962,16 @@ class PagesNotebook( wx.Notebook ):
     
     def GetPrettyStatus( self ):
         
-        return HydrusData.ConvertIntToPrettyString( self.GetPageCount() ) + ' pages, ' + HydrusData.ConvertIntToPrettyString( self.GetNumFiles() ) + ' files'
+        ( num_files, ( num_value, num_range ) ) = self.GetNumFileSummary()
+        
+        num_string = HydrusData.ConvertIntToPrettyString( num_files )
+        
+        if num_range > 0 and num_value != num_range:
+            
+            num_string += ', ' + HydrusData.ConvertValueRangeToPrettyString( num_value, num_range )
+            
+        
+        return HydrusData.ConvertIntToPrettyString( self.GetPageCount() ) + ' pages, ' + num_string + ' files'
         
     
     def HasPage( self, page ):
@@ -1978,7 +2037,7 @@ class PagesNotebook( wx.Notebook ):
             
             self.TestAbleToClose()
             
-        except HydrusExceptions.PermissionException:
+        except HydrusExceptions.VetoException:
             
             return
             
@@ -2700,9 +2759,60 @@ class PagesNotebook( wx.Notebook ):
     
     def TestAbleToClose( self ):
         
-        for page in self._GetPages():
+        count = collections.Counter()
+        
+        for page in self._GetMediaPages( False ):
             
-            page.TestAbleToClose()
+            try:
+                
+                page.CheckAbleToClose()
+                
+            except HydrusExceptions.VetoException as e:
+                
+                reason = HydrusData.ToUnicode( e )
+                
+                count[ reason ] += 1
+                
+            
+        
+        if len( count ) > 0:
+            
+            total_problems = sum( count.values() )
+            
+            message = ''
+            
+            for ( reason, c ) in count.items():
+                
+                if c == 1:
+                    
+                    message = '1 page says: ' + reason
+                    
+                else:
+                    
+                    message = HydrusData.ConvertIntToPrettyString( c ) + ' pages say:' + reason
+                    
+                
+                message += os.linesep
+                
+            
+            message += os.linesep
+            
+            if total_problems == 1:
+                
+                message += 'Are you sure you want to close it?'
+                
+            else:
+                
+                message += 'Are you sure you want to close them?'
+                
+            
+            with ClientGUIDialogs.DialogYesNo( self, message ) as dlg:
+                
+                if dlg.ShowModal() == wx.ID_NO:
+                    
+                    raise HydrusExceptions.VetoException()
+                
+            
+        
             
         
     
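The single-dialog summary built by the new TestAbleToClose can be sketched as a pure function (illustrative only; the real method groups reasons the same way but then shows a wx yes/no dialog rather than returning a string):

```python
import collections

def summarise_close_vetoes( reasons ):
    
    # reasons: one veto reason string per page that objected to closing;
    # identical reasons are grouped so the user sees one line per complaint
    count = collections.Counter( reasons )
    
    lines = []
    
    for ( reason, c ) in count.items():
        
        if c == 1:
            
            lines.append( '1 page says: ' + reason )
            
        else:
            
            lines.append( str( c ) + ' pages say: ' + reason )
            
        
    
    if sum( count.values() ) == 1:
        
        question = 'Are you sure you want to close it?'
        
    else:
        
        question = 'Are you sure you want to close them?'
        
    
    return '\n'.join( lines ) + '\n\n' + question
```

Collapsing per-page complaints into one Counter is what lets a page-of-pages close present a single yes/no instead of one dialog per importing page.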
@@ -1433,8 +1433,8 @@ class EditContentParserPanel( ClientGUIScrolledPanels.EditPanel ):
         
         self._url_type = ClientGUICommon.BetterChoice( self._urls_panel )
         
-        self._url_type.Append( 'actual file', HC.URL_TYPE_FILE )
-        self._url_type.Append( 'post page', HC.URL_TYPE_POST )
+        self._url_type.Append( 'file url', HC.URL_TYPE_FILE )
+        self._url_type.Append( 'post url', HC.URL_TYPE_POST )
         self._url_type.Append( 'next gallery page', HC.URL_TYPE_NEXT )
         
         self._file_priority = wx.SpinCtrl( self._urls_panel, min = 0, max = 100 )
@@ -3663,6 +3663,7 @@ class EditStringConverterPanel( ClientGUIScrolledPanels.EditPanel ):
         self._data_encoding = ClientGUICommon.BetterChoice( self )
         self._data_regex_pattern = wx.TextCtrl( self )
         self._data_regex_repl = wx.TextCtrl( self )
+        self._data_date_link = wx.adv.HyperlinkCtrl( self, label = 'link to date info', url = 'https://docs.python.org/2/library/datetime.html#strftime-strptime-behavior' )
         self._data_timezone = ClientGUICommon.BetterChoice( self )
         self._data_timezone_offset = wx.SpinCtrl( self, min = -86400, max = 86400 )
         
@@ -3699,7 +3700,7 @@ class EditStringConverterPanel( ClientGUIScrolledPanels.EditPanel ):
             ( phrase, timezone_type, timezone_offset ) = data
             
             self._data_text.SetValue( phrase )
-            self._data_timezone.SetValue( timezone_type )
+            self._data_timezone.SelectClientData( timezone_type )
             self._data_timezone_offset.SetValue( timezone_offset )
             
         elif data is not None:
@@ -3723,6 +3724,7 @@ class EditStringConverterPanel( ClientGUIScrolledPanels.EditPanel ):
         rows.append( ( 'encoding data: ', self._data_encoding ) )
         rows.append( ( 'regex pattern: ', self._data_regex_pattern ) )
         rows.append( ( 'regex replacement: ', self._data_regex_repl ) )
+        rows.append( ( 'date info: ', self._data_date_link ) )
         rows.append( ( 'date timezone: ', self._data_timezone ) )
         rows.append( ( 'timezone offset: ', self._data_timezone_offset ) )
         
@@ -3761,20 +3763,20 @@ class EditStringConverterPanel( ClientGUIScrolledPanels.EditPanel ):
             
             self._data_text.Enable()
             
+            if transformation_type == ClientParsing.STRING_TRANSFORMATION_DATE_DECODE:
+                
+                self._data_timezone.Enable()
+                
+                if self._data_timezone.GetChoice() == HC.TIMEZONE_OFFSET:
+                    
+                    self._data_timezone_offset.Enable()
+                    
+                
+            
         elif transformation_type in ( ClientParsing.STRING_TRANSFORMATION_REMOVE_TEXT_FROM_BEGINNING, ClientParsing.STRING_TRANSFORMATION_REMOVE_TEXT_FROM_END, ClientParsing.STRING_TRANSFORMATION_CLIP_TEXT_FROM_BEGINNING, ClientParsing.STRING_TRANSFORMATION_CLIP_TEXT_FROM_END ):
             
             self._data_number.Enable()
             
-        elif transformation_type == ClientParsing.STRING_TRANSFORMATION_DATE_DECODE:
-            
-            self._data_text.Enable()
-            self._data_timezone.Enable()
-            
-            if self._data_timezone.GetChoice() == HC.TIMEZONE_OFFSET:
-                
-                self._data_timezone_offset.Enable()
-                
-            
-            
         elif transformation_type == ClientParsing.STRING_TRANSFORMATION_REGEX_SUB:
             
             self._data_regex_pattern.Enable()
@@ -4162,7 +4164,7 @@ class ManageParsingScriptsPanel( ClientGUIScrolledPanels.ManagePanel ):
         url = ''
         query_type = HC.GET
         file_identifier_type = ClientParsing.FILE_IDENTIFIER_TYPE_MD5
-        file_identifier_string_converter = ClientParsing.StringConverter( [ ClientParsing.STRING_TRANSFORMATION_ENCODE, 'hex' ], 'some hash bytes' )
+        file_identifier_string_converter = ClientParsing.StringConverter( ( [ ClientParsing.STRING_TRANSFORMATION_ENCODE, 'hex' ] ), 'some hash bytes' )
         file_identifier_arg_name = 'md5'
         static_args = {}
         children = []
@@ -1664,7 +1664,10 @@ class MediaSingleton( Media ):
     
     def IsCollection( self ): return False
    
-    def IsImage( self ): return self._media_result.GetMime() in HC.IMAGES
+    def IsImage( self ):
+        
+        return self._media_result.GetMime() in HC.IMAGES and not self.HasDuration()
+        
    
     def IsNoisy( self ): return self._media_result.GetMime() in HC.NOISY_MIMES
@@ -293,6 +293,11 @@ def GetTimestampFromParseResults( results, desired_timestamp_type ):
                 continue
                 
            
+            if timestamp_type == HC.TIMESTAMP_TYPE_SOURCE:
+                
+                timestamp = min( HydrusData.GetNow() - 30, timestamp )
+                
+            
             timestamp_results.append( timestamp )
            
        
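The new source-time clamp stops a parsed 'source time' from landing in the future (site clocks are not reliable) by capping it at thirty seconds before now. A standalone sketch with `now` injectable for testing (hypothetical helper name; the real code uses HydrusData.GetNow()):

```python
import time

def clamp_source_timestamp( timestamp, now = None ):
    
    # never let a parsed source timestamp be later than 30 seconds before now
    if now is None:
        
        now = int( time.time() )
        
    
    return min( now - 30, timestamp )
```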
@@ -337,7 +342,7 @@ def GetTitleFromAllParseResults( all_parse_results ):
     
     return None
     
 
-def GetURLsFromParseResults( results, desired_url_types ):
+def GetURLsFromParseResults( results, desired_url_types, only_get_top_priority = False ):
     
     url_results = collections.defaultdict( list )
@@ -354,24 +359,36 @@
     
-    # ( priority, url_list ) pairs
-    url_results = list( url_results.items() )
-    
-    # ordered by descending priority
-    url_results.sort( reverse = True )
-    
-    # url_lists of descending priority
-    
-    if len( url_results ) > 0:
+    if only_get_top_priority:
         
-        ( priority, url_list ) = url_results[0]
+        # ( priority, url_list ) pairs
+        url_results = list( url_results.items() )
+        
+        # ordered by descending priority
+        url_results.sort( reverse = True )
+        
+        # url_lists of descending priority
+        
+        if len( url_results ) > 0:
+            
+            ( priority, url_list ) = url_results[0]
+            
+        else:
+            
+            url_list = []
+            
         
     else:
         
         url_list = []
         
+        for u_l in url_results.values():
+            
+            url_list.extend( u_l )
+            
+        
     
     return url_list
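The reshaped GetURLsFromParseResults control flow, with the priority map hoisted into a parameter, can be exercised in isolation (hypothetical free function mirroring the diff's branches):

```python
def get_urls( url_results, only_get_top_priority = False ):
    
    # url_results: priority -> list of urls (a defaultdict( list ) in the source)
    if only_get_top_priority:
        
        # ( priority, url_list ) pairs, ordered by descending priority
        pairs = sorted( url_results.items(), reverse = True )
        
        if len( pairs ) > 0:
            
            ( priority, url_list ) = pairs[0]
            
        else:
            
            url_list = []
            
        
    else:
        
        url_list = []
        
        for u_l in url_results.values():
            
            url_list.extend( u_l )
            
        
    
    return url_list
```

The old code always took the top-priority bucket; the flag lets callers such as the post-url associator collect every parsed url instead.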
@@ -1866,6 +1883,8 @@ class PageParser( HydrusSerialisable.SerialisableBaseNamed ):
         
         except HydrusExceptions.VetoException as e:
             
+            all_parse_results = [ 1 ]
+            
             pretty_parse_result_text = 'veto: ' + HydrusData.ToUnicode( e )
             
         
@@ -2404,14 +2423,21 @@ class StringConverter( HydrusSerialisable.SerialisableBase ):
         
         self.transformations = []
         
-        for ( transformation_type, data ) in serialisable_transformations:
+        try: # I initialised this bad one time and broke a dialog on subsequent loads, fugg
             
-            if isinstance( data, list ):
+            for ( transformation_type, data ) in serialisable_transformations:
                 
-                data = tuple( data ) # convert from list to tuple thing
+                if isinstance( data, list ):
+                    
+                    data = tuple( data ) # convert from list to tuple thing
+                    
                 
-            self.transformations.append( ( transformation_type, data ) )
+                self.transformations.append( ( transformation_type, data ) )
+                
+            
+        except:
+            
+            pass
             
         
@@ -2705,7 +2731,16 @@ class StringMatch( HydrusSerialisable.SerialisableBase ):
             
             fail_reason = ' did not match "' + r + '"'
             
-            if re.search( r, text, flags = re.UNICODE ) is None:
+            try:
+                
+                result = re.search( r, text, flags = re.UNICODE )
+                
+            except Exception as e:
+                
+                raise HydrusExceptions.StringMatchException( 'That regex did not work! ' + HydrusData.ToUnicode( e ) )
+                
+            
+            if result is None:
                 
                 raise HydrusExceptions.StringMatchException( presentation_text + fail_reason )
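Wrapping re.search as the StringMatch fix does turns a malformed user-supplied pattern into a catchable, friendly error instead of an unhandled exception. A sketch of the same guard, using ValueError in place of HydrusExceptions.StringMatchException:

```python
import re

def pattern_matches( r, text ):
    
    # a bad pattern makes re.search raise re.error; surface it as a
    # catchable error with the underlying message attached
    try:
        
        result = re.search( r, text, flags = re.UNICODE )
        
    except Exception as e:
        
        raise ValueError( 'That regex did not work! ' + str( e ) )
        
    
    return result is not None
```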
@@ -49,7 +49,7 @@ options = {}
 # Misc
 
 NETWORK_VERSION = 18
-SOFTWARE_VERSION = 304
+SOFTWARE_VERSION = 305
 
 UNSCALED_THUMBNAIL_DIMENSIONS = ( 200, 200 )
 
@@ -1384,31 +1384,6 @@ class TestServerDB( unittest.TestCase ):
         
         self.assertEqual( set( result ), { self._tag_service_key, self._file_service_key } )
         
-        #
-        
-        result = self._read( 'services_info' )
-        
-        services_info = { service_key : ( service_type, options ) for ( service_key, service_type, options ) in result }
-        
-        self.assertEqual( services_info[ HC.SERVER_ADMIN_KEY ], ( 99, { 'max_monthly_data' : None, 'message' : 'hydrus server administration service', 'max_storage' : None, 'upnp' : None, 'port' : 45870 } ) )
-        self.assertEqual( services_info[ self._tag_service_key ], ( HC.TAG_REPOSITORY, t_options ) )
-        self.assertEqual( services_info[ self._file_service_key ], ( HC.FILE_REPOSITORY, f_options ) )
-        
-        #
-        
-        f_options_modified = dict( f_options )
-        f_options_modified[ 'port' ] = 102
-        
-        edit_log = [ ( HC.EDIT, ( self._file_service_key, HC.FILE_REPOSITORY, f_options_modified ) ) ]
-        
-        self._write( 'services', self._admin_account_key, edit_log )
-        
-        result = self._read( 'services_info' )
-        
-        services_info = { service_key : ( service_type, options ) for ( service_key, service_type, options ) in result }
-        
-        self.assertEqual( services_info[ self._file_service_key ], ( HC.FILE_REPOSITORY, f_options_modified ) )
-        
     
     def test_server( self ):
@@ -322,7 +322,7 @@ class TestServer( unittest.TestCase ):
         
         # num_petitions
         
-        num_petitions = [ ( HC.CONTENT_TYPE_MAPPINGS, HC.CONTENT_STATUS_PETITIONED, 23 ), ( HC.CONTENT_TYPE_TAG_PARENTS, HC.CONTENT_STATUS_PENDING, 0 ) ]
+        num_petitions = [ [ HC.CONTENT_TYPE_MAPPINGS, HC.CONTENT_STATUS_PETITIONED, 23 ], [ HC.CONTENT_TYPE_TAG_PARENTS, HC.CONTENT_STATUS_PENDING, 0 ] ]
         
         HG.test_controller.SetRead( 'num_petitions', num_petitions )
         
|
@ -341,7 +341,7 @@ class TestServer( unittest.TestCase ):
|
||||||
|
|
||||||
HG.test_controller.SetRead( 'petition', petition )
|
HG.test_controller.SetRead( 'petition', petition )
|
||||||
|
|
||||||
response = service.Request( HC.GET, 'petition' )
|
response = service.Request( HC.GET, 'petition', { 'content_type' : HC.CONTENT_TYPE_FILES, 'status' : HC.CONTENT_UPDATE_PETITION } )
|
||||||
|
|
||||||
self.assertEqual( response[ 'petition' ].GetSerialisableTuple(), petition.GetSerialisableTuple() )
|
self.assertEqual( response[ 'petition' ].GetSerialisableTuple(), petition.GetSerialisableTuple() )
|
||||||
|
|
||||||
|
@ -361,7 +361,12 @@ class TestServer( unittest.TestCase ):
|
||||||
|
|
||||||
path = ServerFiles.GetExpectedFilePath( definitions_update_hash )
|
path = ServerFiles.GetExpectedFilePath( definitions_update_hash )
|
||||||
|
|
||||||
with open( path, 'wb' ) as f: f.write( definitions_update_network_string )
|
HydrusPaths.MakeSureDirectoryExists( path )
|
||||||
|
|
||||||
|
with open( path, 'wb' ) as f:
|
||||||
|
|
||||||
|
f.write( definitions_update_network_string )
|
||||||
|
|
||||||
|
|
||||||
response = service.Request( HC.GET, 'update', { 'update_hash' : definitions_update_hash } )
|
response = service.Request( HC.GET, 'update', { 'update_hash' : definitions_update_hash } )
|
||||||
|
|
||||||
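The test fix above creates the file's directory before opening it for write. Assuming HydrusPaths.MakeSureDirectoryExists takes a file path and creates its missing parent directories (an assumption; the helper may instead take a directory path), an equivalent sketch:

```python
import os

def make_sure_directory_exists( path ):
    
    # ensure the directory that will contain 'path' exists before an open( path, 'wb' );
    # assumption: the helper accepts a file path and creates parent dirs as needed
    dirname = os.path.dirname( path )
    
    if dirname != '' and not os.path.exists( dirname ):
        
        os.makedirs( dirname )
```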
|
@ -475,25 +480,24 @@ class TestServer( unittest.TestCase ):
|
||||||
|
|
||||||
# account_types
|
# account_types
|
||||||
|
|
||||||
account_types = { 'message' : 'hello' }
|
account_types = [ HydrusNetwork.AccountType.GenerateAdminAccountType( service.GetServiceType() ) ]
|
||||||
|
|
||||||
HG.test_controller.SetRead( 'account_types', account_types )
|
HG.test_controller.SetRead( 'account_types', account_types )
|
||||||
|
|
||||||
response = service.Request( HC.GET, 'account_types' )
|
response = service.Request( HC.GET, 'account_types' )
|
||||||
|
|
||||||
self.assertEqual( response[ 'account_types' ], account_types )
|
self.assertEqual( response[ 'account_types' ][0].GetAccountTypeKey(), account_types[0].GetAccountTypeKey() )
|
||||||
|
|
||||||
edit_log = 'blah'
|
service.Request( HC.POST, 'account_types', { 'account_types' : account_types, 'deletee_account_type_keys_to_new_account_type_keys' : {} } )
|
||||||
|
|
||||||
service.Request( HC.POST, 'account_types', { 'edit_log' : edit_log } )
|
|
||||||
|
|
||||||
written = HG.test_controller.GetWrite( 'account_types' )
|
written = HG.test_controller.GetWrite( 'account_types' )
|
||||||
|
|
||||||
[ ( args, kwargs ) ] = written
|
[ ( args, kwargs ) ] = written
|
||||||
|
|
||||||
( written_service_key, written_edit_log ) = args
|
( written_service_key, written_account, written_account_types, written_deletee_account_type_keys_to_new_account_type_keys ) = args
|
||||||
|
|
||||||
self.assertEqual( edit_log, written_edit_log )
|
self.assertEqual( written_account_types[0].GetAccountTypeKey(), account_types[0].GetAccountTypeKey() )
|
||||||
|
self.assertEqual( written_deletee_account_type_keys_to_new_account_type_keys, {} )
|
||||||
|
|
||||||
# registration_keys
|
# registration_keys
|
||||||
|
|
||||||
|
@@ -514,37 +518,19 @@ class TestServer( unittest.TestCase ):
         
         HG.test_controller.SetRead( 'access_key', access_key )
         
-        response = service.Request( HC.GET, 'access_key', 'init' )
+        response = service.Request( HC.GET, 'access_key', { 'registration_key' : 'init' } )
         
         self.assertEqual( response[ 'access_key' ], access_key )
         
         #
         
-        # backup
+        ## backup
         
         response = service.Request( HC.POST, 'backup' )
         
-        # services
+        #
         
-        services_info = { 'message' : 'hello' }
+        # add some new services info
         
-        HG.test_controller.SetRead( 'services_info', services_info )
-        
-        response = service.Request( HC.GET, 'services_info' )
-        
-        self.assertEqual( response[ 'services_info' ], services_info )
-        
-        edit_log = 'blah'
-        
-        registration_keys = service.Request( HC.POST, 'services', { 'edit_log' : edit_log } )
-        
-        written = HG.test_controller.GetWrite( 'services' )
-        
-        [ ( args, kwargs ) ] = written
-        
-        ( written_service_key, written_edit_log ) = args
-        
-        self.assertEqual( edit_log, written_edit_log )
-        
     
     def _test_tag_repo( self, service ):
(binary image changes: nine static images added and one modified, each roughly 2-3 KiB)