Version 325

This commit is contained in:
parent d38cac210f
commit 7d0fb823c6
@@ -8,6 +8,32 @@

<div class="content">
	<h3>changelog</h3>
	<ul>
		<li><h3>version 325</h3></li>
		<ul>
			<li>added a 'show a popup while working' checkbox to edit subscription panel--be careful with it, I think maybe only turn it off after you are happy everything is set up right and the sub has run once</li>
			<li>advanced mode users will see a new 'get quality info' button on the edit subscription panel. this will show some ugly+hacky inbox/archived/deleted info on the selected queries to help you figure out if you are only archiving, say, 2% of one query. this is a quickly made but cpu-expensive way of calculating this info. I can obviously expand it in future, so I would appreciate your thoughts</li>
			<li>subscription queries now have an optional display name, which has no bearing on their function but if set will appear instead of the query text in various presentation contexts (this is useful, for instance, if the downloader query text deals in something unhelpful like an integer artist_id)</li>
			<li>subscription queries now each have simple tag import options! this only allows 'additional tags', in case you want to add some simple per-query tags</li>
			<li>selecting 'try again' on file imports that previously failed due to 'deleted' will now pop up a little yes/no asking if you would like to first erase these files' previously deleted file record!</li>
			<li>the watcher and gallery import panels now have 'retry failed' buttons and right-click menu entries when appropriate</li>
			<li>the watcher and gallery import panels will now do some ui updates less frequently when they contain a lot of data</li>
			<li>fixed the new human-friendly tag sorting code for ungrouped lexicographic sort orders, where it was accidentally grouping by namespace</li>
			<li>downloader easy-import pngs can now hold custom header and bandwidth rules metadata! this info, if explicitly present for the appropriate domain, will be added automatically on the export side as you add gugs. it can also be bundled separately after manually typing a domain to add. on the import side, it is now listed as a new type. longer human-friendly descriptions of all bandwidth and header information being bundled will be displayed during the export and import processes, just as an additional check</li>
			<li>for advanced users, added 'do not skip downloading because of known urls/hashes' options to downloader file import options. these checkboxes work like the tag import options ones--ignoring known urls and hashes to force downloads. they are advanced and should not be used unless you have a particular problem to fix</li>
			<li>improved how the pre-import url/hash checking code is compared for the tag and file import options, particularly on the hash side</li>
			<li>for advanced users, added 'associate additional source urls' to downloader file import options, which governs whether a site's given 'source urls' should be added and trusted for downloaded files. turn this off if the site is giving bad source urls</li>
			<li>fixed an unusual problem where gallery searches with search terms that included the search separator (like '6+girls skirt', with a separator of '+') were being overzealously de/encoded (to '6+girls+skirt' rather than '6%2bgirls+skirt')</li>
			<li>improved how unicode quoted characters in URLs' query parameters, like %E5%B0%BB%E7%A5%9E%E6%A7%98, are auto-converted to something prettier when the user sees them</li>
			<li>the client now tests if 'already in db' results are actually backed by the file structure--now, if the actual file is missing despite the db record, the import will be force-attempted and the file structure hopefully healed</li>
			<li>gallery url jobs will no longer spawn new 'next page' urls if the job yielded 0 _new_ (rather than _total_) file urls (so we should have fixed loops fetching the same x 'already in file import cache' results due to the gallery just passing the same results for n+1 page fetches)</li>
			<li>in the edit parsing panels, if the example data currently looks like json, new content parsers will spawn with json formulae, otherwise they will get html formulae</li>
			<li>fixed an issue with the default twitter tweet parser pulling the wrong month for source time</li>
			<li>added a simple 'media load report mode' to the help debug menu to help figure out some PIL/OpenCV load order stuff</li>
			<li>the 'missing locations recovery' dialog that spawns on boot if file locations are missing now uses the new listctrl, so is thankfully sortable! it also works better behind the scenes</li>
			<li>this dialog now also has an 'add a possibly correct location' button, which will scan the given directory for the correct prefixes and automatically fill in the list for you</li>
			<li>fixed some of the new import folder error reporting</li>
			<li>misc code cleanup</li>
		</ul>
		<li><h3>version 324</h3></li>
		<ul>
			<li>downloaders:</li>
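The gallery search encoding fix in the changelog hinges on percent-encoding reserved characters in each search term before the terms are joined with the gallery's separator. A minimal sketch of the intended behaviour using Python's standard library (this is illustrative, not hydrus's own networking code):

```python
from urllib.parse import quote_plus

def encode_search_terms( terms, separator = '+' ):
    
    # each term is percent-encoded on its own, so a literal '+' inside a
    # term becomes '%2B' and cannot be confused with the separator
    return separator.join( quote_plus( term ) for term in terms )

# the changelog's example: tags '6+girls' and 'skirt' with separator '+'
print( encode_search_terms( [ '6+girls', 'skirt' ] ) )
```

Encoding per-term first is what prevents the over-zealous round trip the fix describes, where the term's own '+' was treated as a separator.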
@@ -5377,6 +5377,24 @@ class DB( HydrusDB.HydrusDB ):

        ( timestamp, ) = result
        
        result = self._c.execute( 'SELECT mime FROM files_info WHERE hash_id = ?;', ( hash_id, ) ).fetchone()
        
        if result is not None:
            
            ( mime, ) = result
            
            try:
                
                self._controller.client_files_manager.LocklessGetFilePath( hash, mime )
                
            except HydrusExceptions.FileMissingException:
                
                note = 'The client believed this file was already in the db, but it was truly missing! Import will go ahead, in an attempt to fix the situation.'
                
                return ( CC.STATUS_UNKNOWN, hash, prefix + ': ' + note )
                
            
        
        note = 'Imported at ' + HydrusData.ConvertTimestampToPrettyTime( timestamp ) + ', which was ' + HydrusData.TimestampToPrettyTimeDelta( timestamp, just_now_threshold = 0 ) + ' (before this check).'
        
        return ( CC.STATUS_SUCCESSFUL_BUT_REDUNDANT, hash, prefix + ': ' + note )
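The hunk above stops trusting an 'already in db' record until the underlying file is confirmed present, falling back to a re-import when it is not. The same guard can be sketched generically; the function and status names here are illustrative, not hydrus's:

```python
import os

def verify_db_record( db_says_present, file_path ):
    
    # a db row is only trusted if the file structure actually backs it up;
    # otherwise the import is re-attempted, hopefully healing the structure
    if db_says_present and not os.path.exists( file_path ):
        
        return 'reimport'
        
    
    return 'already in db' if db_says_present else 'new'
```

The point of the pattern is that the database is an index, not the source of truth: a cheap existence check before short-circuiting an import catches the rare case where the two have drifted apart.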
@@ -11015,6 +11033,36 @@ class DB( HydrusDB.HydrusDB ):

        if version == 324:
            
            try:
                
                domain_manager = self._GetJSONDump( HydrusSerialisable.SERIALISABLE_TYPE_NETWORK_DOMAIN_MANAGER )
                
                domain_manager.Initialise()
                
                #
                
                domain_manager.OverwriteDefaultParsers( [ 'twitter tweet parser' ] )
                
                #
                
                domain_manager.TryToLinkURLMatchesAndParsers()
                
                #
                
                self._SetJSONDump( domain_manager )
                
            except Exception as e:
                
                HydrusData.PrintException( e )
                
                message = 'Trying to update some url classes and parsers failed! Please let hydrus dev know!'
                
                self.pub_initial_message( message )
                
            
            self._controller.pub( 'splash_set_title_text', 'updated db to v' + str( version + 1 ) )
            
            self._c.execute( 'UPDATE version SET version = ?;', ( version + 1, ) )
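The update hunk above follows the usual migration shape in this codebase: a version-gated block that attempts the risky work, logs and reports on failure, and bumps the version either way so the client is never stuck. A stripped-down sketch of that pattern, with hypothetical names:

```python
def run_migrations( version, target, migrations, report ):
    
    # migrations maps a version number to a callable that upgrades from it;
    # a failing step is reported but does not block the version bump
    while version < target:
        
        if version in migrations:
            
            try:
                
                migrations[ version ]()
                
            except Exception as e:
                
                report( 'Migration from v{} failed: {}'.format( version, e ) )
                
            
        
        version += 1
        
    
    return version
```

Swallowing the exception is deliberate here: a broken default-parser refresh is recoverable by the user, whereas a half-applied version number would wedge every subsequent boot.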
@@ -132,6 +132,8 @@ def DAEMONDownloadFiles( controller ):

        controller.WaitUntilModelFree()
        
        exclude_deleted = False # this is the important part here
        do_not_check_known_urls_before_importing = False
        do_not_check_hashes_before_importing = False
        allow_decompression_bombs = True
        min_size = None
        max_size = None

@@ -139,11 +141,12 @@ def DAEMONDownloadFiles( controller ):

        min_resolution = None
        max_resolution = None
        automatic_archive = False
        associate_source_urls = True
        
        file_import_options = ClientImportOptions.FileImportOptions()
        
-       file_import_options.SetPreImportOptions( exclude_deleted, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
-       file_import_options.SetPostImportOptions( automatic_archive )
+       file_import_options.SetPreImportOptions( exclude_deleted, do_not_check_known_urls_before_importing, do_not_check_hashes_before_importing, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
+       file_import_options.SetPostImportOptions( automatic_archive, associate_source_urls )
        
        file_import_job = ClientImportFileSeeds.FileImportJob( temp_path, file_import_options )
@@ -1974,6 +1974,7 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):

        ClientGUIMenus.AppendMenuCheckItem( self, report_modes, 'file report mode', 'Have the file manager report file request information, where supported.', HG.file_report_mode, self._SwitchBoolean, 'file_report_mode' )
        ClientGUIMenus.AppendMenuCheckItem( self, report_modes, 'gui report mode', 'Have the gui report inside information, where supported.', HG.gui_report_mode, self._SwitchBoolean, 'gui_report_mode' )
        ClientGUIMenus.AppendMenuCheckItem( self, report_modes, 'hover window report mode', 'Have the hover windows report their show/hide logic.', HG.hover_window_report_mode, self._SwitchBoolean, 'hover_window_report_mode' )
        ClientGUIMenus.AppendMenuCheckItem( self, report_modes, 'media load report mode', 'Have the client report media load information, where supported.', HG.media_load_report_mode, self._SwitchBoolean, 'media_load_report_mode' )
        ClientGUIMenus.AppendMenuCheckItem( self, report_modes, 'network report mode', 'Have the network engine report new jobs.', HG.network_report_mode, self._SwitchBoolean, 'network_report_mode' )
        ClientGUIMenus.AppendMenuCheckItem( self, report_modes, 'shortcut report mode', 'Have the new shortcut system report what shortcuts it catches and whether it matches an action.', HG.shortcut_report_mode, self._SwitchBoolean, 'shortcut_report_mode' )
        ClientGUIMenus.AppendMenuCheckItem( self, report_modes, 'subscription report mode', 'Have the subscription system report what it is doing.', HG.subscription_report_mode, self._SwitchBoolean, 'subscription_report_mode' )

@@ -3322,6 +3323,10 @@ The password is cleartext here but obscured in the entry dialog. Enter a blank p

        HG.hover_window_report_mode = not HG.hover_window_report_mode
        
    elif name == 'media_load_report_mode':
        
        HG.media_load_report_mode = not HG.media_load_report_mode
        
    elif name == 'menu_profile_mode':
        
        HG.menu_profile_mode = not HG.menu_profile_mode
@@ -2192,7 +2192,7 @@ class NoneableSpinCtrl( wx.Panel ):

class NoneableTextCtrl( wx.Panel ):
    
-   def __init__( self, parent, message = '', none_phrase = 'no limit' ):
+   def __init__( self, parent, message = '', none_phrase = 'none' ):
        
        wx.Panel.__init__( self, parent )
@@ -826,8 +826,9 @@ class FrameInputLocalFiles( wx.Frame ):

        self._progress_cancel.Disable()
        
        file_import_options = HG.client_controller.new_options.GetDefaultFileImportOptions( 'loud' )
        show_downloader_options = False
        
-       self._file_import_options = ClientGUIImport.FileImportOptionsButton( self, file_import_options )
+       self._file_import_options = ClientGUIImport.FileImportOptionsButton( self, file_import_options, show_downloader_options )
        
        menu_items = []
@@ -1957,13 +1957,15 @@ class DialogManageImportFoldersEdit( ClientGUIDialogs.Dialog ):

        self._action_failed = create_choice()
        self._location_failed = wx.DirPickerCtrl( self._file_box, style = wx.DIRP_USE_TEXTCTRL )
        
-       self._file_import_options = ClientGUIImport.FileImportOptionsButton( self._file_box, file_import_options )
+       show_downloader_options = False
+       
+       self._file_import_options = ClientGUIImport.FileImportOptionsButton( self._file_box, file_import_options, show_downloader_options )
        
        #
        
        self._tag_box = ClientGUICommon.StaticBox( self._panel, 'tag options' )
        
-       self._tag_import_options = ClientGUIImport.TagImportOptionsButton( self._tag_box, tag_import_options, show_downloader_options = False )
+       self._tag_import_options = ClientGUIImport.TagImportOptionsButton( self._tag_box, tag_import_options, show_downloader_options )
        
        self._filename_tagging_options_box = ClientGUICommon.StaticBox( self._tag_box, 'filename tagging' )
@@ -238,6 +238,31 @@ class EditFileSeedCachePanel( ClientGUIScrolledPanels.EditPanel ):

        file_seeds = self._list_ctrl.GetData( only_selected = True )
        
        if status_to_set == CC.STATUS_UNKNOWN:
            
            deleted_file_seeds = [ file_seed for file_seed in file_seeds if file_seed.IsDeleted() and file_seed.HasHash() ]
            
            if len( deleted_file_seeds ) > 0:
                
                message = 'One or more of these files did not import due to being previously deleted. They will likely fail again unless you erase those deletion records. Would you like to do this now?'
                
                with ClientGUIDialogs.DialogYesNo( self, message ) as dlg:
                    
                    if dlg.ShowModal() == wx.ID_YES:
                        
                        deletee_hashes = { file_seed.GetHash() for file_seed in deleted_file_seeds }
                        
                        content_update_erase_record = HydrusData.ContentUpdate( HC.CONTENT_TYPE_FILES, HC.CONTENT_UPDATE_ADVANCED, ( 'delete_deleted', deletee_hashes ) )
                        content_update_undelete_from_trash = HydrusData.ContentUpdate( HC.CONTENT_TYPE_FILES, HC.CONTENT_UPDATE_UNDELETE, deletee_hashes )
                        
                        service_keys_to_content_updates = { CC.COMBINED_LOCAL_FILE_SERVICE_KEY : [ content_update_erase_record, content_update_undelete_from_trash ] }
                        
                        HG.client_controller.WriteSynchronous( 'content_updates', service_keys_to_content_updates )
                        
                    
                
            
        
        for file_seed in file_seeds:
            
            file_seed.SetStatus( status_to_set )
@@ -89,11 +89,12 @@ class CheckerOptionsButton( ClientGUICommon.BetterButton ):

class FileImportOptionsButton( ClientGUICommon.BetterButton ):
    
-   def __init__( self, parent, file_import_options, update_callable = None ):
+   def __init__( self, parent, file_import_options, show_downloader_options, update_callable = None ):
        
        ClientGUICommon.BetterButton.__init__( self, parent, 'file import options', self._EditOptions )
        
        self._file_import_options = file_import_options
+       self._show_downloader_options = show_downloader_options
        self._update_callable = update_callable
        
        self._SetToolTip()

@@ -103,7 +104,7 @@ class FileImportOptionsButton( ClientGUICommon.BetterButton ):

        with ClientGUITopLevelWindows.DialogEdit( self, 'edit file import options' ) as dlg:
            
-           panel = ClientGUIScrolledPanelsEdit.EditFileImportOptions( dlg, self._file_import_options )
+           panel = ClientGUIScrolledPanelsEdit.EditFileImportOptions( dlg, self._file_import_options, self._show_downloader_options )
            
            dlg.SetPanel( panel )
@@ -1152,8 +1153,10 @@ class GalleryImportPanel( ClientGUICommon.StaticBox ):

        file_import_options = ClientImportOptions.FileImportOptions()
        tag_import_options = ClientImportOptions.TagImportOptions( is_default = True )
        
        show_downloader_options = True
        
-       self._file_import_options = FileImportOptionsButton( self, file_import_options, self._SetFileImportOptions )
-       self._tag_import_options = TagImportOptionsButton( self, tag_import_options, update_callable = self._SetTagImportOptions, allow_default_selection = True )
+       self._file_import_options = FileImportOptionsButton( self, file_import_options, show_downloader_options, self._SetFileImportOptions )
+       self._tag_import_options = TagImportOptionsButton( self, tag_import_options, show_downloader_options, update_callable = self._SetTagImportOptions, allow_default_selection = True )
        
        #
@@ -1518,13 +1521,13 @@ class GUGKeyAndNameSelector( ClientGUICommon.BetterButton ):

class TagImportOptionsButton( ClientGUICommon.BetterButton ):
    
-   def __init__( self, parent, tag_import_options, update_callable = None, show_downloader_options = True, allow_default_selection = False ):
+   def __init__( self, parent, tag_import_options, show_downloader_options, update_callable = None, allow_default_selection = False ):
        
        ClientGUICommon.BetterButton.__init__( self, parent, 'tag import options', self._EditOptions )
        
        self._tag_import_options = tag_import_options
+       self._show_downloader_options = show_downloader_options
        self._update_callable = update_callable
        self._allow_default_selection = allow_default_selection
        
        self._SetToolTip()

@@ -1545,7 +1548,7 @@ class TagImportOptionsButton( ClientGUICommon.BetterButton ):

        with ClientGUITopLevelWindows.DialogEdit( self, 'edit tag import options' ) as dlg:
            
-           panel = ClientGUIScrolledPanelsEdit.EditTagImportOptionsPanel( dlg, self._tag_import_options, show_downloader_options = self._show_downloader_options, allow_default_selection = self._allow_default_selection )
+           panel = ClientGUIScrolledPanelsEdit.EditTagImportOptionsPanel( dlg, self._tag_import_options, self._show_downloader_options, allow_default_selection = self._allow_default_selection )
            
            dlg.SetPanel( panel )
@@ -1692,8 +1695,10 @@ class WatcherReviewPanel( ClientGUICommon.StaticBox ):

        file_import_options = ClientImportOptions.FileImportOptions()
        tag_import_options = ClientImportOptions.TagImportOptions( is_default = True )
        
        show_downloader_options = True
        
-       self._file_import_options = FileImportOptionsButton( self, file_import_options, self._SetFileImportOptions )
-       self._tag_import_options = TagImportOptionsButton( self, tag_import_options, update_callable = self._SetTagImportOptions, allow_default_selection = True )
+       self._file_import_options = FileImportOptionsButton( self, file_import_options, show_downloader_options, self._SetFileImportOptions )
+       self._tag_import_options = TagImportOptionsButton( self, tag_import_options, show_downloader_options, update_callable = self._SetTagImportOptions, allow_default_selection = True )
        
        #
@@ -627,6 +627,8 @@ class ManagementController( HydrusSerialisable.SerialisableBase ):

        paths = [ path_info for ( path_type, path_info ) in paths_info if path_type != 'zip' ]
        
        exclude_deleted = advanced_import_options[ 'exclude_deleted' ]
        do_not_check_known_urls_before_importing = False
        do_not_check_hashes_before_importing = False
        allow_decompression_bombs = False
        min_size = advanced_import_options[ 'min_size' ]
        max_size = None

@@ -635,11 +637,12 @@ class ManagementController( HydrusSerialisable.SerialisableBase ):

        max_resolution = None
        
        automatic_archive = advanced_import_options[ 'automatic_archive' ]
        associate_source_urls = True
        
        file_import_options = ClientImportOptions.FileImportOptions()
        
-       file_import_options.SetPreImportOptions( exclude_deleted, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
-       file_import_options.SetPostImportOptions( automatic_archive )
+       file_import_options.SetPreImportOptions( exclude_deleted, do_not_check_known_urls_before_importing, do_not_check_hashes_before_importing, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
+       file_import_options.SetPostImportOptions( automatic_archive, associate_source_urls )
        
        paths_to_tags = { path : { service_key.decode( 'hex' ) : tags for ( service_key, tags ) in service_keys_to_tags } for ( path, service_keys_to_tags ) in paths_to_tags.items() }
@@ -1453,8 +1456,9 @@ class ManagementPanelImporterHDD( ManagementPanelImporter ):

        self._hdd_import = self._management_controller.GetVariable( 'hdd_import' )
        
        file_import_options = self._hdd_import.GetFileImportOptions()
        show_downloader_options = False
        
-       self._file_import_options = ClientGUIImport.FileImportOptionsButton( self._import_queue_panel, file_import_options, self._hdd_import.SetFileImportOptions )
+       self._file_import_options = ClientGUIImport.FileImportOptionsButton( self._import_queue_panel, file_import_options, show_downloader_options, self._hdd_import.SetFileImportOptions )
        
        #
@@ -1575,6 +1579,7 @@ class ManagementPanelImporterMultipleGallery( ManagementPanelImporter ):

        self._gallery_importers_listctrl_panel.NewButtonRow()
        
        self._gallery_importers_listctrl_panel.AddButton( 'retry failed', self._RetryFailed, enabled_check_func = self._CanRetryFailed )
        self._gallery_importers_listctrl_panel.AddButton( 'remove', self._RemoveGalleryImports, enabled_only_on_selection = True )
        
        self._gallery_importers_listctrl_panel.NewButtonRow()

@@ -1597,8 +1602,10 @@ class ManagementPanelImporterMultipleGallery( ManagementPanelImporter ):

        tag_import_options = self._multiple_gallery_import.GetTagImportOptions()
        file_limit = self._multiple_gallery_import.GetFileLimit()
        
        show_downloader_options = True
        
-       self._file_import_options = ClientGUIImport.FileImportOptionsButton( self._gallery_downloader_panel, file_import_options, self._multiple_gallery_import.SetFileImportOptions )
-       self._tag_import_options = ClientGUIImport.TagImportOptionsButton( self._gallery_downloader_panel, tag_import_options, update_callable = self._multiple_gallery_import.SetTagImportOptions, allow_default_selection = True )
+       self._file_import_options = ClientGUIImport.FileImportOptionsButton( self._gallery_downloader_panel, file_import_options, show_downloader_options, self._multiple_gallery_import.SetFileImportOptions )
+       self._tag_import_options = ClientGUIImport.TagImportOptionsButton( self._gallery_downloader_panel, tag_import_options, show_downloader_options, update_callable = self._multiple_gallery_import.SetTagImportOptions, allow_default_selection = True )
        
        #
@@ -1664,6 +1671,19 @@ class ManagementPanelImporterMultipleGallery( ManagementPanelImporter ):

        return gallery_import != self._highlighted_gallery_import
        
    
    def _CanRetryFailed( self ):
        
        for gallery_import in self._gallery_importers_listctrl.GetData( only_selected = True ):
            
            if gallery_import.CanRetryFailed():
                
                return True
                
            
        
        return False
        
    
    def _ClearExistingHighlight( self ):
        
        if self._highlighted_gallery_import is not None:
@@ -1801,6 +1821,13 @@ class ManagementPanelImporterMultipleGallery( ManagementPanelImporter ):

        ClientGUIMenus.AppendMenuItem( self, menu, 'copy queries', 'Copy all the selected downloaders\' queries to clipboard.', self._CopySelectedQueries )
        
        if self._CanRetryFailed():
            
            ClientGUIMenus.AppendSeparator( menu )
            
            ClientGUIMenus.AppendMenuItem( self, menu, 'retry failed', 'Retry all the failed downloads.', self._RetryFailed )
            
        
        ClientGUIMenus.AppendSeparator( menu )
        
        ClientGUIMenus.AppendMenuItem( self, menu, 'pause/play files', 'Pause/play all the selected downloaders\' file queues.', self._PausePlayFiles )
@@ -1958,6 +1985,14 @@ class ManagementPanelImporterMultipleGallery( ManagementPanelImporter ):

        self._UpdateImportStatusNow()
        
    
    def _RetryFailed( self ):
        
        for gallery_import in self._gallery_importers_listctrl.GetData( only_selected = True ):
            
            gallery_import.RetryFailed()
            
        
    
    def _SetGUGKeyAndName( self, gug_key_and_name ):
        
        current_initial_search_text = self._multiple_gallery_import.GetInitialSearchText()
@@ -2009,7 +2044,11 @@ class ManagementPanelImporterMultipleGallery( ManagementPanelImporter ):

        if HydrusData.TimeHasPassed( self._next_update_time ):
            
-           self._next_update_time = HydrusData.GetNow() + 1
+           num_items = len( self._gallery_importers_listctrl.GetData() )
+           
+           update_period = max( 1, int( ( num_items / 10 ) ** 0.33 ) )
+           
+           self._next_update_time = HydrusData.GetNow() + update_period
            
            #
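The gallery and watcher panels throttle their ui refreshes by list size: instead of a fixed one-second period, the period grows with the cube root of num_items / 10, so small lists stay snappy while a queue of thousands of items refreshes only every few seconds. The formula from the hunk, isolated for inspection:

```python
def update_period( num_items ):
    
    # 1s for small lists, growing slowly (cube root) with list size
    return max( 1, int( ( num_items / 10 ) ** 0.33 ) )

for n in ( 0, 50, 500, 5000 ):
    
    print( n, update_period( n ) )
```

A cube root is a reasonable middle ground here: it caps redraw cost on huge lists without the period ever feeling unresponsive, since even ten thousand items yields a single-digit period.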
@@ -2152,6 +2191,7 @@ class ManagementPanelImporterMultipleWatcher( ManagementPanelImporter ):

        self._watchers_listctrl_panel.NewButtonRow()
        
        self._watchers_listctrl_panel.AddButton( 'retry failed', self._RetryFailed, enabled_check_func = self._CanRetryFailed )
        self._watchers_listctrl_panel.AddButton( 'remove', self._RemoveWatchers, enabled_only_on_selection = True )
        
        self._watchers_listctrl_panel.NewButtonRow()

@@ -2162,9 +2202,11 @@ class ManagementPanelImporterMultipleWatcher( ManagementPanelImporter ):

        self._watcher_url_input = ClientGUIControls.TextAndPasteCtrl( self._watchers_panel, self._AddURLs )
        
        show_downloader_options = True
        
        self._checker_options = ClientGUIImport.CheckerOptionsButton( self._watchers_panel, checker_options, self._OptionsUpdated )
-       self._file_import_options = ClientGUIImport.FileImportOptionsButton( self._watchers_panel, file_import_options, self._OptionsUpdated )
-       self._tag_import_options = ClientGUIImport.TagImportOptionsButton( self._watchers_panel, tag_import_options, update_callable = self._OptionsUpdated, allow_default_selection = True )
+       self._file_import_options = ClientGUIImport.FileImportOptionsButton( self._watchers_panel, file_import_options, show_downloader_options, self._OptionsUpdated )
+       self._tag_import_options = ClientGUIImport.TagImportOptionsButton( self._watchers_panel, tag_import_options, show_downloader_options, update_callable = self._OptionsUpdated, allow_default_selection = True )
        
        # suck up watchers from elsewhere in the program (presents a checklistboxdialog)
@@ -2250,6 +2292,19 @@ class ManagementPanelImporterMultipleWatcher( ManagementPanelImporter ):

        return watcher != self._highlighted_watcher
        
    
    def _CanRetryFailed( self ):
        
        for watcher in self._watchers_listctrl.GetData( only_selected = True ):
            
            if watcher.CanRetryFailed():
                
                return True
                
            
        
        return False
        
    
    def _CheckNow( self ):
        
        for watcher in self._watchers_listctrl.GetData( only_selected = True ):
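Both the gallery and watcher panels implement the same enable-check for the new 'retry failed' button: scan the selected importers and stop at the first one that reports retryable failures. With a generic `any()` that collapses to one line; a sketch over plain stand-in objects (hydrus's importer classes are assumed, not imported):

```python
def can_retry_failed( importers ):
    
    # True as soon as one selected importer reports retryable failures;
    # any() short-circuits, matching the early-return loop in the panels
    return any( importer.CanRetryFailed() for importer in importers )

class FakeImporter:
    
    def __init__( self, can_retry ):
        
        self._can_retry = can_retry
        
    
    def CanRetryFailed( self ):
        
        return self._can_retry
```

The short-circuit matters because this check runs on every menu open and button-state refresh, so it should not touch more importers than necessary.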
@@ -2392,6 +2447,13 @@ class ManagementPanelImporterMultipleWatcher( ManagementPanelImporter ):

        ClientGUIMenus.AppendMenuItem( self, menu, 'copy urls', 'Copy all the selected watchers\' urls to clipboard.', self._CopySelectedURLs )
        ClientGUIMenus.AppendMenuItem( self, menu, 'open urls', 'Open all the selected watchers\' urls in your browser.', self._OpenSelectedURLs )
        
        if self._CanRetryFailed():
            
            ClientGUIMenus.AppendSeparator( menu )
            
            ClientGUIMenus.AppendMenuItem( self, menu, 'retry failed', 'Retry all the failed downloads.', self._RetryFailed )
            
        
        ClientGUIMenus.AppendSeparator( menu )
        
        ClientGUIMenus.AppendMenuItem( self, menu, 'pause/play files', 'Pause/play all the selected watchers\' file queues.', self._PausePlayFiles )
@@ -2570,6 +2632,14 @@ class ManagementPanelImporterMultipleWatcher( ManagementPanelImporter ):

        self._UpdateImportStatusNow()
        
    
    def _RetryFailed( self ):
        
        for watcher in self._watchers_listctrl.GetData( only_selected = True ):
            
            watcher.RetryFailed()
            
        
    
    def _SetOptionsToWatchers( self ):
        
        watchers = self._watchers_listctrl.GetData( only_selected = True )
@@ -2603,7 +2673,11 @@ class ManagementPanelImporterMultipleWatcher( ManagementPanelImporter ):

        if HydrusData.TimeHasPassed( self._next_update_time ):
            
-           self._next_update_time = HydrusData.GetNow() + 1
+           num_items = len( self._watchers_listctrl.GetData() )
+           
+           update_period = max( 1, int( ( num_items / 10 ) ** 0.33 ) )
+           
+           self._next_update_time = HydrusData.GetNow() + update_period
            
            #
@@ -2779,7 +2853,9 @@ class ManagementPanelImporterSimpleDownloader( ManagementPanelImporter ):

        file_import_options = self._simple_downloader_import.GetFileImportOptions()
        show_downloader_options = True
        
-       self._file_import_options = ClientGUIImport.FileImportOptionsButton( self._simple_downloader_panel, file_import_options, self._simple_downloader_import.SetFileImportOptions )
+       self._file_import_options = ClientGUIImport.FileImportOptionsButton( self._simple_downloader_panel, file_import_options, show_downloader_options, self._simple_downloader_import.SetFileImportOptions )
        
        #
@@ -3179,9 +3255,10 @@ class ManagementPanelImporterURLs( ManagementPanelImporter ):

        ( file_import_options, tag_import_options ) = self._urls_import.GetOptions()
        
-       self._file_import_options = ClientGUIImport.FileImportOptionsButton( self._url_panel, file_import_options, self._urls_import.SetFileImportOptions )
-       self._tag_import_options = ClientGUIImport.TagImportOptionsButton( self._url_panel, tag_import_options, update_callable = self._urls_import.SetTagImportOptions, show_downloader_options = True, allow_default_selection = True )
+       show_downloader_options = True
+       
+       self._file_import_options = ClientGUIImport.FileImportOptionsButton( self._url_panel, file_import_options, show_downloader_options, self._urls_import.SetFileImportOptions )
+       self._tag_import_options = ClientGUIImport.TagImportOptionsButton( self._url_panel, tag_import_options, show_downloader_options, update_callable = self._urls_import.SetTagImportOptions, allow_default_selection = True )
        
        #
@@ -12,6 +12,7 @@ import ClientGUIScrolledPanels

import ClientGUIScrolledPanelsEdit
import ClientGUISerialisable
import ClientGUITopLevelWindows
import ClientNetworkingContexts
import ClientNetworkingDomain
import ClientNetworkingJobs
import ClientParsing

@@ -25,6 +26,7 @@ import HydrusGlobals as HG

import HydrusSerialisable
import HydrusTags
import HydrusText
import itertools
import json
import os
import sys
@@ -178,6 +180,7 @@ class DownloaderExportPanel( ClientGUIScrolledPanels.ReviewPanel ):

        listctrl_panel.AddButton( 'add gug', self._AddGUG )
        listctrl_panel.AddButton( 'add url class', self._AddURLMatch )
        listctrl_panel.AddButton( 'add parser', self._AddParser )
        listctrl_panel.AddButton( 'add headers/bandwidth rules', self._AddDomainMetadata )
        listctrl_panel.AddButton( 'delete', self._Delete, enabled_only_on_selection = True )
        listctrl_panel.AddSeparator()
        listctrl_panel.AddButton( 'export to png', self._Export, enabled_check_func = self._CanExport )
@@ -192,6 +195,34 @@ class DownloaderExportPanel( ClientGUIScrolledPanels.ReviewPanel ):

        self.SetSizer( vbox )
        
    
    def _AddDomainMetadata( self ):
        
        message = 'Enter domain:'
        
        with ClientGUIDialogs.DialogTextEntry( self, message ) as dlg:
            
            if dlg.ShowModal() == wx.ID_OK:
                
                domain = dlg.GetValue()
                
            else:
                
                return
                
            
        
        domain_metadatas = self._GetDomainMetadatasToInclude( { domain } )
        
        if len( domain_metadatas ) > 0:
            
            self._listctrl.AddDatas( domain_metadatas )
            
        else:
            
            wx.MessageBox( 'No headers/bandwidth rules found!' )
            
        
    
    def _AddGUG( self ):
        
        choosable_gugs = [ gug for gug in self._network_engine.domain_manager.GetGUGs() if gug.IsFunctional() ]
@ -222,12 +253,17 @@ class DownloaderExportPanel( ClientGUIScrolledPanels.ReviewPanel ):
|
|||
|
||||
|
||||
|
||||
domains = { ClientNetworkingDomain.ConvertURLIntoDomain( example_url ) for example_url in itertools.chain.from_iterable( ( gug.GetExampleURLs() for gug in gugs_to_include ) ) }
|
||||
|
||||
domain_metadatas_to_include = self._GetDomainMetadatasToInclude( domains )
|
||||
|
||||
url_matches_to_include = self._GetURLMatchesToInclude( gugs_to_include )
|
||||
|
||||
url_matches_to_include = self._FlushOutURLMatchesWithAPILinks( url_matches_to_include )
|
||||
|
||||
parsers_to_include = self._GetParsersToInclude( url_matches_to_include )
|
||||
|
||||
self._listctrl.AddDatas( domain_metadatas_to_include )
|
||||
self._listctrl.AddDatas( gugs_to_include )
|
||||
self._listctrl.AddDatas( url_matches_to_include )
|
||||
self._listctrl.AddDatas( parsers_to_include )
|
||||
|
@ -307,7 +343,15 @@ class DownloaderExportPanel( ClientGUIScrolledPanels.ReviewPanel ):
|
|||
|
||||
def _ConvertContentToListCtrlTuples( self, content ):
|
||||
|
||||
name = content.GetName()
|
||||
if isinstance( content, ClientNetworkingDomain.DomainMetadataPackage ):
|
||||
|
||||
name = content.GetDomain()
|
||||
|
||||
else:
|
||||
|
||||
name = content.GetName()
|
||||
|
||||
|
||||
t = content.SERIALISABLE_NAME
|
||||
|
||||
pretty_name = name
|
||||
|
@@ -417,6 +461,58 @@ class DownloaderExportPanel( ClientGUIScrolledPanels.ReviewPanel ):
return list( url_matches_to_include )

def _GetDomainMetadatasToInclude( self, domains ):
    
    domains = { d for d in itertools.chain.from_iterable( ClientNetworkingDomain.ConvertDomainIntoAllApplicableDomains( domain ) for domain in domains ) }
    
    existing_domains = { obj.GetDomain() for obj in self._listctrl.GetData() if isinstance( obj, ClientNetworkingDomain.DomainMetadataPackage ) }
    
    domains = domains.difference( existing_domains )
    
    domains = list( domains )
    
    domains.sort()
    
    domain_metadatas = []
    
    for domain in domains:
        
        network_context = ClientNetworkingContexts.NetworkContext( CC.NETWORK_CONTEXT_DOMAIN, domain )
        
        if self._network_engine.domain_manager.HasCustomHeaders( network_context ):
            
            headers_list = self._network_engine.domain_manager.GetShareableCustomHeaders( network_context )
            
        else:
            
            headers_list = None
            
        
        if self._network_engine.bandwidth_manager.HasRules( network_context ):
            
            bandwidth_rules = self._network_engine.bandwidth_manager.GetRules( network_context )
            
        else:
            
            bandwidth_rules = None
            
        
        if headers_list is not None or bandwidth_rules is not None:
            
            domain_metadata = ClientNetworkingDomain.DomainMetadataPackage( domain = domain, headers_list = headers_list, bandwidth_rules = bandwidth_rules )
            
            domain_metadatas.append( domain_metadata )
            
        
    
    for domain_metadata in domain_metadatas:
        
        wx.MessageBox( domain_metadata.GetDetailedSafeSummary() )
        
    
    return domain_metadatas
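The dedupe-and-sort step at the top of `_GetDomainMetadatasToInclude` can be illustrated in isolation. This is a hypothetical helper, not part of the hydrus API: it keeps only domains that are not already listed, sorted for stable display.

```python
def filter_new_domains( candidate_domains, existing_domains ):
    
    # drop domains already present in the export list, then sort for stable display
    new_domains = set( candidate_domains ).difference( existing_domains )
    
    return sorted( new_domains )
```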

def _GetParsersToInclude( self, url_matches ):
    
    parsers_to_include = set()

@@ -2184,12 +2280,23 @@ class EditContentParsersPanel( ClientGUICommon.StaticBox ):

dlg_title = 'edit content node'

content_parser = ClientParsing.ContentParser( 'new content parser' )
test_context = self._test_context_callable()

( example_parsing_context, example_data ) = test_context

if len( example_data ) > 0 and HydrusText.LooksLikeJSON( example_data ):
    
    formula = ClientParsing.ParseFormulaJSON()
    
else:
    
    formula = ClientParsing.ParseFormulaHTML()
    

content_parser = ClientParsing.ContentParser( 'new content parser', formula = formula )

with ClientGUITopLevelWindows.DialogEdit( self, 'edit content parser', frame_key = 'deeply_nested_dialog' ) as dlg_edit:
    
    test_context = self._test_context_callable()
    
    panel = EditContentParserPanel( dlg_edit, content_parser, test_context )
    
    dlg_edit.SetPanel( panel )

@@ -5140,16 +5247,10 @@ class TestPanel( wx.Panel ):

# put this second, so if the JSON contains some HTML, it'll overwrite here. decent compromise
try:
    
    json.loads( example_data )
    
if HydrusText.LooksLikeJSON( example_data ):
    
    parse_phrase = 'looks like JSON'
    
except:
    
    pass
    

description = HydrusData.ConvertIntToBytes( len( example_data ) ) + ' total, ' + parse_phrase
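The `HydrusText.LooksLikeJSON` call replaces the inline `json.loads` try/except shown in the removed lines. A minimal stand-in with the same behaviour might look like this; the real hydrus implementation may differ:

```python
import json

def looks_like_json( text ):
    
    # a parse attempt is the simplest 'looks like JSON' test
    try:
        
        json.loads( text )
        
        return True
        
    except ValueError:
        
        return False
```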

@@ -5303,16 +5404,10 @@ class TestPanelPageParser( TestPanel ):

# put this second, so if the JSON contains some HTML, it'll overwrite here. decent compromise
try:
    
    json.loads( post_conversion_example_data )
    
if HydrusText.LooksLikeJSON( example_data ):
    
    parse_phrase = 'looks like JSON'
    
except:
    
    pass
    

description = HydrusData.ConvertIntToBytes( len( post_conversion_example_data ) ) + ' total, ' + parse_phrase

@@ -325,8 +325,10 @@ class EditDefaultTagImportOptionsPanel( ClientGUIScrolledPanels.EditPanel ):

#

self._file_post_default_tag_import_options_button = ClientGUIImport.TagImportOptionsButton( self, file_post_default_tag_import_options )
self._watchable_default_tag_import_options_button = ClientGUIImport.TagImportOptionsButton( self, watchable_default_tag_import_options )
show_downloader_options = True

self._file_post_default_tag_import_options_button = ClientGUIImport.TagImportOptionsButton( self, file_post_default_tag_import_options, show_downloader_options )
self._watchable_default_tag_import_options_button = ClientGUIImport.TagImportOptionsButton( self, watchable_default_tag_import_options, show_downloader_options )

self._list_ctrl_panel = ClientGUIListCtrl.BetterListCtrlPanel( self )

@@ -445,8 +447,9 @@ class EditDefaultTagImportOptionsPanel( ClientGUIScrolledPanels.EditPanel ):
with ClientGUITopLevelWindows.DialogEdit( self, 'edit tag import options' ) as dlg:
    
    tag_import_options = self._GetDefaultTagImportOptions( url_match )
    show_downloader_options = True
    
    panel = EditTagImportOptionsPanel( dlg, tag_import_options )
    panel = EditTagImportOptionsPanel( dlg, tag_import_options, show_downloader_options )
    
    dlg.SetPanel( panel )

@@ -1297,7 +1300,7 @@ class EditDuplicateActionOptionsPanel( ClientGUIScrolledPanels.EditPanel ):

class EditFileImportOptions( ClientGUIScrolledPanels.EditPanel ):
    
    def __init__( self, parent, file_import_options ):
    def __init__( self, parent, file_import_options, show_downloader_options ):
        
        ClientGUIScrolledPanels.EditPanel.__init__( self, parent )

@@ -1311,6 +1314,16 @@ class EditFileImportOptions( ClientGUIScrolledPanels.EditPanel ):

self._exclude_deleted = wx.CheckBox( pre_import_panel )

self._do_not_check_known_urls_before_importing = wx.CheckBox( pre_import_panel )
self._do_not_check_hashes_before_importing = wx.CheckBox( pre_import_panel )

tt = 'If hydrus recognises a file\'s URL or hash, it can decide to skip downloading it if it believes it already has it or previously deleted it.'
tt += os.linesep * 2
tt += 'This is usually a great way to reduce bandwidth, but if you believe the clientside url mappings or serverside hashes are inaccurate and the file is being wrongly skipped, turn these on to force a download.'

self._do_not_check_known_urls_before_importing.SetToolTip( tt )
self._do_not_check_hashes_before_importing.SetToolTip( tt )

self._allow_decompression_bombs = wx.CheckBox( pre_import_panel )

self._min_size = ClientGUIControls.NoneableBytesControl( pre_import_panel )

@@ -1333,6 +1346,13 @@ class EditFileImportOptions( ClientGUIScrolledPanels.EditPanel ):
post_import_panel = ClientGUICommon.StaticBox( self, 'post-import actions' )

self._auto_archive = wx.CheckBox( post_import_panel )
self._associate_source_urls = wx.CheckBox( post_import_panel )

tt = 'If the parser discovers an additional source URL for another site (e.g. "This file on wewbooru was originally posted to Bixiv [here]."), should that URL be associated with the final URL? Should it be trusted to make \'already in db/previously deleted\' determinations?'
tt += os.linesep * 2
tt += 'You should turn this off if the site supplies bad (incorrect or imprecise or malformed) source urls.'

self._associate_source_urls.SetToolTip( tt )

#
@@ -1344,9 +1364,11 @@ class EditFileImportOptions( ClientGUIScrolledPanels.EditPanel ):

#

( exclude_deleted, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution ) = file_import_options.GetPreImportOptions()
( exclude_deleted, do_not_check_known_urls_before_importing, do_not_check_hashes_before_importing, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution ) = file_import_options.GetPreImportOptions()

self._exclude_deleted.SetValue( exclude_deleted )
self._do_not_check_known_urls_before_importing.SetValue( do_not_check_known_urls_before_importing )
self._do_not_check_hashes_before_importing.SetValue( do_not_check_hashes_before_importing )
self._allow_decompression_bombs.SetValue( allow_decompression_bombs )
self._min_size.SetValue( min_size )
self._max_size.SetValue( max_size )

@@ -1356,9 +1378,10 @@ class EditFileImportOptions( ClientGUIScrolledPanels.EditPanel ):

#

automatic_archive = file_import_options.GetPostImportOptions()
( automatic_archive, associate_source_urls ) = file_import_options.GetPostImportOptions()

self._auto_archive.SetValue( automatic_archive )
self._associate_source_urls.SetValue( associate_source_urls )

#

@@ -1373,6 +1396,18 @@ class EditFileImportOptions( ClientGUIScrolledPanels.EditPanel ):
rows = []

rows.append( ( 'exclude previously deleted files: ', self._exclude_deleted ) )

if show_downloader_options and HG.client_controller.new_options.GetBoolean( 'advanced_mode' ):
    
    rows.append( ( 'do not skip downloading because of known urls: ', self._do_not_check_known_urls_before_importing ) )
    rows.append( ( 'do not skip downloading because of hashes: ', self._do_not_check_hashes_before_importing ) )
    
else:
    
    self._do_not_check_known_urls_before_importing.Hide()
    self._do_not_check_hashes_before_importing.Hide()
    

rows.append( ( 'allow decompression bombs: ', self._allow_decompression_bombs ) )
rows.append( ( 'minimum filesize: ', self._min_size ) )
rows.append( ( 'maximum filesize: ', self._max_size ) )

@@ -1390,6 +1425,15 @@ class EditFileImportOptions( ClientGUIScrolledPanels.EditPanel ):

rows.append( ( 'archive all imports: ', self._auto_archive ) )

if show_downloader_options and HG.client_controller.new_options.GetBoolean( 'advanced_mode' ):
    
    rows.append( ( 'associate (and trust) additional source urls: ', self._associate_source_urls ) )
    
else:
    
    self._associate_source_urls.Hide()
    

gridbox = ClientGUICommon.WrapInGrid( post_import_panel, rows )

post_import_panel.Add( gridbox, CC.FLAGS_EXPAND_SIZER_PERPENDICULAR )

@@ -1448,6 +1492,8 @@ If you have a very large (10k+ files) file import page, consider hiding some or
def GetValue( self ):
    
    exclude_deleted = self._exclude_deleted.GetValue()
    do_not_check_known_urls_before_importing = self._do_not_check_known_urls_before_importing.GetValue()
    do_not_check_hashes_before_importing = self._do_not_check_hashes_before_importing.GetValue()
    allow_decompression_bombs = self._allow_decompression_bombs.GetValue()
    min_size = self._min_size.GetValue()
    max_size = self._max_size.GetValue()

@@ -1456,6 +1502,7 @@ If you have a very large (10k+ files) file import page, consider hiding some or
max_resolution = self._max_resolution.GetValue()

automatic_archive = self._auto_archive.GetValue()
associate_source_urls = self._associate_source_urls.GetValue()

present_new_files = self._present_new_files.GetValue()
present_already_in_inbox_files = self._present_already_in_inbox_files.GetValue()

@@ -1463,8 +1510,8 @@ If you have a very large (10k+ files) file import page, consider hiding some or

file_import_options = ClientImportOptions.FileImportOptions()

file_import_options.SetPreImportOptions( exclude_deleted, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
file_import_options.SetPostImportOptions( automatic_archive )
file_import_options.SetPreImportOptions( exclude_deleted, do_not_check_known_urls_before_importing, do_not_check_hashes_before_importing, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
file_import_options.SetPostImportOptions( automatic_archive, associate_source_urls )
file_import_options.SetPresentationOptions( present_new_files, present_already_in_inbox_files, present_already_in_archive_files )

return file_import_options
@@ -3133,7 +3180,7 @@ class EditSubscriptionPanel( ClientGUIScrolledPanels.EditPanel ):

queries_panel = ClientGUIListCtrl.BetterListCtrlPanel( self._query_panel )

columns = [ ( 'query', 20 ), ( 'paused', 8 ), ( 'status', 8 ), ( 'last new file time', 20 ), ( 'last check time', 20 ), ( 'next check time', 20 ), ( 'file velocity', 20 ), ( 'recent delays', 20 ), ( 'items', 13 ) ]
columns = [ ( 'name/query', 20 ), ( 'paused', 8 ), ( 'status', 8 ), ( 'last new file time', 20 ), ( 'last check time', 20 ), ( 'next check time', 20 ), ( 'file velocity', 20 ), ( 'recent delays', 20 ), ( 'items', 13 ) ]

self._queries = ClientGUIListCtrl.BetterListCtrl( queries_panel, 'subscription_queries', 20, 20, columns, self._ConvertQueryToListCtrlTuples, delete_key_callback = self._DeleteQuery, activation_callback = self._EditQuery )

@@ -3151,6 +3198,12 @@ class EditSubscriptionPanel( ClientGUIScrolledPanels.EditPanel ):
queries_panel.AddButton( 'check now', self._CheckNow, enabled_check_func = self._ListCtrlCanCheckNow )
queries_panel.AddButton( 'reset cache', self._ResetCache, enabled_check_func = self._ListCtrlCanResetCache )

if HG.client_controller.new_options.GetBoolean( 'advanced_mode' ):
    
    queries_panel.AddSeparator()
    queries_panel.AddButton( 'show \'quality\' info', self._GetQualityInfo, enabled_only_on_selection = True )
    

self._checker_options = ClientGUIImport.CheckerOptionsButton( self._query_panel, checker_options, update_callable = self._CheckerOptionsUpdated )

#

@@ -3208,6 +3261,9 @@ But if 2 is--and is also perhaps accompanied by many 'could not parse' errors--t
self._periodic_file_limit = wx.SpinCtrl( self._options_panel, min = 1, max = limits_max )
self._periodic_file_limit.SetToolTip( 'Normal syncs will add no more than this many URLs, stopping early if they find several URLs the query has seen before.' )

self._show_a_popup_while_working = wx.CheckBox( self._options_panel )
self._show_a_popup_while_working.SetToolTip( 'Careful with this! Leave it on to begin with, just in case it goes wrong!' )

self._publish_files_to_popup_button = wx.CheckBox( self._options_panel )
self._publish_files_to_page = wx.CheckBox( self._options_panel )
self._merge_query_publish_events = wx.CheckBox( self._options_panel )

@@ -3224,9 +3280,10 @@ But if 2 is--and is also perhaps accompanied by many 'could not parse' errors--t

#

self._file_import_options = ClientGUIImport.FileImportOptionsButton( self, file_import_options )
show_downloader_options = True

self._tag_import_options = ClientGUIImport.TagImportOptionsButton( self, tag_import_options, allow_default_selection = True )
self._file_import_options = ClientGUIImport.FileImportOptionsButton( self, file_import_options, show_downloader_options )
self._tag_import_options = ClientGUIImport.TagImportOptionsButton( self, tag_import_options, show_downloader_options, allow_default_selection = True )

#

@@ -3239,8 +3296,9 @@ But if 2 is--and is also perhaps accompanied by many 'could not parse' errors--t
self._initial_file_limit.SetValue( initial_file_limit )
self._periodic_file_limit.SetValue( periodic_file_limit )

( publish_files_to_popup_button, publish_files_to_page, merge_query_publish_events ) = subscription.GetPresentationOptions()
( show_a_popup_while_working, publish_files_to_popup_button, publish_files_to_page, merge_query_publish_events ) = subscription.GetPresentationOptions()

self._show_a_popup_while_working.SetValue( show_a_popup_while_working )
self._publish_files_to_popup_button.SetValue( publish_files_to_popup_button )
self._publish_files_to_page.SetValue( publish_files_to_page )
self._merge_query_publish_events.SetValue( merge_query_publish_events )

@@ -3259,6 +3317,7 @@ But if 2 is--and is also perhaps accompanied by many 'could not parse' errors--t

rows.append( ( 'on first check, get at most this many files: ', self._initial_file_limit ) )
rows.append( ( 'on normal checks, get at most this many newer files: ', self._periodic_file_limit ) )
rows.append( ( 'show a popup while working: ', self._show_a_popup_while_working ) )
rows.append( ( 'publish new files to a popup button: ', self._publish_files_to_popup_button ) )
rows.append( ( 'publish new files to a page: ', self._publish_files_to_page ) )
rows.append( ( 'publish all queries to the same page/popup button: ', self._merge_query_publish_events ) )

@@ -3360,7 +3419,8 @@ But if 2 is--and is also perhaps accompanied by many 'could not parse' errors--t

( query_text, check_now, last_check_time, next_check_time, paused, status ) = query.ToTuple()

pretty_query_text = query_text
name = query.GetHumanName()
pretty_name = name

if paused:

@@ -3437,8 +3497,8 @@ But if 2 is--and is also perhaps accompanied by many 'could not parse' errors--t

pretty_items = simple_status

display_tuple = ( pretty_query_text, pretty_paused, pretty_status, pretty_last_new_file_time, pretty_last_check_time, pretty_next_check_time, pretty_file_velocity, pretty_delay, pretty_items )
sort_tuple = ( query_text, paused, status, last_new_file_time, last_check_time, next_check_time, file_velocity, delay, items )
display_tuple = ( pretty_name, pretty_paused, pretty_status, pretty_last_new_file_time, pretty_last_check_time, pretty_next_check_time, pretty_file_velocity, pretty_delay, pretty_items )
sort_tuple = ( name, paused, status, last_new_file_time, last_check_time, next_check_time, file_velocity, delay, items )

return ( display_tuple, sort_tuple )
@@ -3522,6 +3582,58 @@ But if 2 is--and is also perhaps accompanied by many 'could not parse' errors--t
return query_strings

def _GetQualityInfo( self ):
    
    data_strings = []
    
    for query in self._queries.GetData():
        
        fsc = query.GetFileSeedCache()
        
        hashes = fsc.GetHashes()
        
        media_results = HG.client_controller.Read( 'media_results', hashes )
        
        num_inbox = 0
        num_archived = 0
        num_deleted = 0
        
        for media_result in media_results:
            
            lm = media_result.GetLocationsManager()
            
            if lm.IsLocal() and not lm.IsTrashed():
                
                if media_result.GetInbox():
                    
                    num_inbox += 1
                    
                else:
                    
                    num_archived += 1
                    
                
            else:
                
                num_deleted += 1
                
            
        
        data_string = query.GetHumanName() + ': inbox ' + HydrusData.ToHumanInt( num_inbox ) + ' | archive ' + HydrusData.ToHumanInt( num_archived ) + ' | deleted ' + HydrusData.ToHumanInt( num_deleted )
        
        if num_archived + num_deleted > 0:
            
            data_string += ' | good ' + HydrusData.ConvertFloatToPercentage( float( num_archived ) / ( num_archived + num_deleted ) )
            
        
        data_strings.append( data_string )
        
    
    message = os.linesep.join( data_strings )
    
    wx.MessageBox( message )
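The 'good' percentage in `_GetQualityInfo` is archived files over all resolved files (archived plus deleted). A sketch of the summary line, using plain string formatting in place of the HydrusData helpers:

```python
def quality_summary( name, num_inbox, num_archived, num_deleted ):
    
    # mirror the '<name>: inbox x | archive y | deleted z | good p%' line above
    summary = '{}: inbox {} | archive {} | deleted {}'.format( name, num_inbox, num_archived, num_deleted )
    
    if num_archived + num_deleted > 0:
        
        # good = archived / ( archived + deleted )
        percent = 100 * float( num_archived ) / ( num_archived + num_deleted )
        
        summary += ' | good {:.0f}%'.format( percent )
        
    
    return summary
```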

def _ListCtrlCanCheckNow( self ):
    
    for query in self._queries.GetData( only_selected = True ):

@@ -3726,11 +3838,12 @@ But if 2 is--and is also perhaps accompanied by many 'could not parse' errors--t

subscription.SetTuple( gug_key_and_name, queries, checker_options, initial_file_limit, periodic_file_limit, paused, file_import_options, tag_import_options, self._no_work_until )

show_a_popup_while_working = self._show_a_popup_while_working.GetValue()
publish_files_to_popup_button = self._publish_files_to_popup_button.GetValue()
publish_files_to_page = self._publish_files_to_page.GetValue()
merge_query_publish_events = self._merge_query_publish_events.GetValue()

subscription.SetPresentationOptions( publish_files_to_popup_button, publish_files_to_page, merge_query_publish_events )
subscription.SetPresentationOptions( show_a_popup_while_working, publish_files_to_popup_button, publish_files_to_page, merge_query_publish_events )

return subscription

@@ -3749,6 +3862,7 @@ class EditSubscriptionQueryPanel( ClientGUIScrolledPanels.EditPanel ):

self._status_st.SetMinSize( ( st_width, -1 ) )

self._display_name = ClientGUICommon.NoneableTextCtrl( self, none_phrase = 'show query text' )
self._query_text = wx.TextCtrl( self )
self._check_now = wx.CheckBox( self )
self._paused = wx.CheckBox( self )

@@ -3757,10 +3871,19 @@ class EditSubscriptionQueryPanel( ClientGUIScrolledPanels.EditPanel ):

self._gallery_seed_log_control = ClientGUIGallerySeedLog.GallerySeedLogStatusControl( self, HG.client_controller, True, True )

tag_import_options = self._original_query.GetTagImportOptions()
show_downloader_options = False # just for additional tags, no parsing gubbins needed

self._tag_import_options = ClientGUIImport.TagImportOptionsButton( self, tag_import_options, show_downloader_options )

#

( query_text, check_now, self._last_check_time, self._next_check_time, paused, self._status ) = self._original_query.ToTuple()

display_name = self._original_query.GetDisplayName()

self._display_name.SetValue( display_name )

self._query_text.SetValue( query_text )

self._check_now.SetValue( check_now )

@@ -3779,6 +3902,7 @@ class EditSubscriptionQueryPanel( ClientGUIScrolledPanels.EditPanel ):

rows = []

rows.append( ( 'optional display name: ', self._display_name ) )
rows.append( ( 'query text: ', self._query_text ) )
rows.append( ( 'check now: ', self._check_now ) )
rows.append( ( 'paused: ', self._paused ) )

@@ -3791,6 +3915,7 @@ class EditSubscriptionQueryPanel( ClientGUIScrolledPanels.EditPanel ):
vbox.Add( self._file_seed_cache_control, CC.FLAGS_EXPAND_PERPENDICULAR )
vbox.Add( self._gallery_seed_log_control, CC.FLAGS_EXPAND_PERPENDICULAR )
vbox.Add( gridbox, CC.FLAGS_EXPAND_SIZER_PERPENDICULAR )
vbox.Add( self._tag_import_options, CC.FLAGS_EXPAND_PERPENDICULAR )

self.SetSizer( vbox )

@@ -3815,6 +3940,10 @@ class EditSubscriptionQueryPanel( ClientGUIScrolledPanels.EditPanel ):

query.SetCheckNow( self._check_now.GetValue() )

query.SetDisplayName( self._display_name.GetValue() )

query.SetTagImportOptions( self._tag_import_options.GetValue() )

return query

@@ -4537,7 +4666,7 @@ class EditSubscriptionsPanel( ClientGUIScrolledPanels.EditPanel ):

queries = subscription.GetQueries()

choice_tuples = [ ( query.GetQueryText(), query, False ) for query in queries ]
choice_tuples = [ ( query.GetHumanName(), query, False ) for query in queries ]

with ClientGUITopLevelWindows.DialogEdit( self, 'select the queries to extract' ) as dlg:

@@ -4683,10 +4812,11 @@ class EditSubscriptionsPanel( ClientGUIScrolledPanels.EditPanel ):

tag_import_options = HG.client_controller.network_engine.domain_manager.GetDefaultTagImportOptionsForPosts()
show_downloader_options = True

with ClientGUITopLevelWindows.DialogEdit( self, 'edit tag import options' ) as dlg:
    
    panel = EditTagImportOptionsPanel( dlg, tag_import_options, show_downloader_options = True, allow_default_selection = True )
    panel = EditTagImportOptionsPanel( dlg, tag_import_options, show_downloader_options, allow_default_selection = True )
    
    dlg.SetPanel( panel )

@@ -4706,7 +4836,7 @@ class EditSubscriptionsPanel( ClientGUIScrolledPanels.EditPanel ):

class EditTagImportOptionsPanel( ClientGUIScrolledPanels.EditPanel ):
    
    def __init__( self, parent, tag_import_options, show_downloader_options = True, allow_default_selection = False ):
    def __init__( self, parent, tag_import_options, show_downloader_options, allow_default_selection = False ):
        
        ClientGUIScrolledPanels.EditPanel.__init__( self, parent )
@@ -1897,13 +1897,15 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):

import ClientGUIImport

show_downloader_options = True

quiet_file_import_options = self._new_options.GetDefaultFileImportOptions( 'quiet' )

self._quiet_fios = ClientGUIImport.FileImportOptionsButton( default_fios, quiet_file_import_options )
self._quiet_fios = ClientGUIImport.FileImportOptionsButton( default_fios, quiet_file_import_options, show_downloader_options )

loud_file_import_options = self._new_options.GetDefaultFileImportOptions( 'loud' )

self._loud_fios = ClientGUIImport.FileImportOptionsButton( default_fios, loud_file_import_options )
self._loud_fios = ClientGUIImport.FileImportOptionsButton( default_fios, loud_file_import_options, show_downloader_options )

#

@@ -6264,8 +6266,13 @@ class RepairFileSystemPanel( ClientGUIScrolledPanels.ManagePanel ):

self._only_thumbs = True

self._incorrect_locations = {}
self._correct_locations = {}

for ( incorrect_location, prefix ) in missing_locations:
    
    self._incorrect_locations[ prefix ] = incorrect_location
    
    if prefix.startswith( 'f' ):
        
        self._only_thumbs = False

@@ -6274,7 +6281,7 @@ class RepairFileSystemPanel( ClientGUIScrolledPanels.ManagePanel ):

text = 'This dialog has launched because some expected file storage directories were not found. This is a serious error. You have two options:'
text += os.linesep * 2
text += '1) If you know what these should be (e.g. you recently remapped their external drive to another location), update the paths here manually. For most users, this will likely be a simple ctrl+a->correct, but if you have a more complicated system or store your thumbnails different to your files, make sure you skim the whole list. Check everything reports _ok!_'
text += '1) If you know what these should be (e.g. you recently remapped their external drive to another location), update the paths here manually. For most users, this will be clicking _add a possibly correct location_ and then selecting the new folder where the subdirectories all went. You can repeat this if your folders are missing in multiple locations. Check everything reports _ok!_'
text += os.linesep * 2
text += 'Although it is best if you can find everything, you only _have_ to fix the subdirectories starting with \'f\', which store your original files. Those starting \'t\' and \'r\' are for your thumbnails, which can be regenerated with a bit of work.'
text += os.linesep * 2
@@ -6292,25 +6299,20 @@ class RepairFileSystemPanel( ClientGUIScrolledPanels.ManagePanel ):

st.SetWrapWidth( 640 )

self._locations = ClientGUIListCtrl.SaneListCtrl( self, 400, [ ( 'missing location', -1 ), ( 'expected subdirectory', 120 ), ( 'correct location', 240 ), ( 'now ok?', 120 ) ], activation_callback = self._SetLocations )
columns = [ ( 'missing location', -1 ), ( 'expected subdirectory', 23 ), ( 'correct location', 36 ), ( 'now ok?', 9 ) ]

self._locations = ClientGUIListCtrl.BetterListCtrl( self, 'repair_locations', 12, 36, columns, self._ConvertPrefixToListCtrlTuples, activation_callback = self._SetLocations )

self._set_button = ClientGUICommon.BetterButton( self, 'set correct location', self._SetLocations )
self._add_button = ClientGUICommon.BetterButton( self, 'add a possibly correct location (let the client figure out what it contains)', self._AddLocation )

# add a button here for 'try to fill them in for me'. you give it a dir, and it tries to figure out and fill in the prefixes for you

#

for ( incorrect_location, prefix ) in missing_locations:
    
    t = ( incorrect_location, prefix, '', '' )
    
    self._locations.Append( t, t )
    
self._locations.AddDatas( [ prefix for ( incorrect_location, prefix ) in missing_locations ] )

# sort by prefix

#self._locations.SortListItems( 1 ) # subdirs secondary
#self._locations.SortListItems( 0 ) # missing location primary
self._locations.Sort( 0 )

#

@@ -6319,40 +6321,89 @@ class RepairFileSystemPanel( ClientGUIScrolledPanels.ManagePanel ):
vbox.Add( st, CC.FLAGS_EXPAND_PERPENDICULAR )
vbox.Add( self._locations, CC.FLAGS_EXPAND_PERPENDICULAR )
vbox.Add( self._set_button, CC.FLAGS_LONE_BUTTON )
vbox.Add( self._add_button, CC.FLAGS_LONE_BUTTON )

self.SetSizer( vbox )

def _AddLocation( self ):
    
    with wx.DirDialog( self, 'Select the potential correct location.' ) as dlg:
        
        if dlg.ShowModal() == wx.ID_OK:
            
            path = HydrusData.ToUnicode( dlg.GetPath() )
            
            for prefix in self._locations.GetData():
                
                ok = os.path.exists( os.path.join( path, prefix ) )
                
                if ok:
                    
                    self._correct_locations[ prefix ] = ( path, ok )
                    
                
            
            self._locations.UpdateDatas()
def _ConvertPrefixToListCtrlTuples( self, prefix ):

incorrect_location = self._incorrect_locations[ prefix ]

if prefix in self._correct_locations:

( correct_location, ok ) = self._correct_locations[ prefix ]

if ok:

pretty_ok = 'ok!'

else:

pretty_ok = 'not found'

else:

correct_location = ''
ok = None
pretty_ok = ''

pretty_incorrect_location = incorrect_location
pretty_prefix = prefix
pretty_correct_location = correct_location

display_tuple = ( pretty_incorrect_location, pretty_prefix, pretty_correct_location, pretty_ok )
sort_tuple = ( incorrect_location, prefix, correct_location, ok )

return ( display_tuple, sort_tuple )
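The `_ConvertPrefixToListCtrlTuples` method above follows the display/sort tuple pattern the new `BetterListCtrl` expects: the control asks a converter for two parallel tuples per row, one prettified for display and one raw for sorting. A minimal standalone sketch of that pattern (this is an illustration, not the hydrus implementation; the dict arguments stand in for the panel's `self._incorrect_locations` / `self._correct_locations`):

```python
def convert_row( prefix, incorrect_locations, correct_locations ):
    
    # raw values drive sorting; 'pretty' values drive display
    incorrect_location = incorrect_locations[ prefix ]
    
    if prefix in correct_locations:
        
        ( correct_location, ok ) = correct_locations[ prefix ]
        
        pretty_ok = 'ok!' if ok else 'not found'
        
    else:
        
        # nothing set yet, so blank display and a None sort value
        ( correct_location, ok, pretty_ok ) = ( '', None, '' )
        
    
    display_tuple = ( incorrect_location, prefix, correct_location, pretty_ok )
    sort_tuple = ( incorrect_location, prefix, correct_location, ok )
    
    return ( display_tuple, sort_tuple )
```

Keeping the raw `ok` flag in the sort tuple while showing `'ok!'`/`'not found'` lets the column sort on actual state rather than on display strings.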
def _SetLocations( self ):

selected_indices = self._locations.GetAllSelected()
prefixes = self._locations.GetData( only_selected = True )

if len( selected_indices ) > 0:
if len( prefixes ) > 0:

with wx.DirDialog( self, 'Select correct location.' ) as dlg:

if dlg.ShowModal() == wx.ID_OK:

correct_location = HydrusData.ToUnicode( dlg.GetPath() )
path = HydrusData.ToUnicode( dlg.GetPath() )

for index in selected_indices:
for prefix in prefixes:

( incorrect_location, prefix, gumpf, gumpf_ok ) = self._locations.GetClientData( index )
ok = os.path.exists( os.path.join( path, prefix ) )

if os.path.exists( os.path.join( correct_location, prefix ) ):

ok = 'ok!'

else:

ok = 'not found'

t = ( incorrect_location, prefix, correct_location, ok )

self._locations.UpdateRow( index, t, t )
self._correct_locations[ prefix ] = ( path, ok )

self._locations.UpdateDatas()
@@ -6363,9 +6414,11 @@ class RepairFileSystemPanel( ClientGUIScrolledPanels.ManagePanel ):

thumb_problems = False

for ( incorrect_location, prefix, correct_location, ok ) in self._locations.GetClientData():
for prefix in self._locations.GetData():

if correct_location == '':
incorrect_location = self._incorrect_locations[ prefix ]

if prefix not in self._correct_locations:

if prefix.startswith( 'f' ):

@@ -6378,15 +6431,20 @@ class RepairFileSystemPanel( ClientGUIScrolledPanels.ManagePanel ):

correct_location = incorrect_location

elif ok != 'ok!':
else:

if prefix.startswith( 'f' ):
( correct_location, ok ) = self._correct_locations[ prefix ]

if not ok:

raise HydrusExceptions.VetoException( 'You did not find all the correct file locations!' )

else:

thumb_problems = True

if prefix.startswith( 'f' ):

raise HydrusExceptions.VetoException( 'You did not find all the correct file locations!' )

else:

thumb_problems = True
@@ -1549,13 +1549,13 @@ class ReviewDownloaderImport( ClientGUIScrolledPanels.ReviewPanel ):

def ImportFromDragDrop( self, paths ):

gugs = []

url_matches = []

parsers = []
domain_metadatas = []

num_misc_objects = 0

bandwidth_manager = self._network_engine.bandwidth_manager
domain_manager = self._network_engine.domain_manager

for path in paths:
@@ -1584,7 +1584,7 @@ class ReviewDownloaderImport( ClientGUIScrolledPanels.ReviewPanel ):

continue

if isinstance( obj_list, ( ClientNetworkingDomain.GalleryURLGenerator, ClientNetworkingDomain.NestedGalleryURLGenerator, ClientNetworkingDomain.URLMatch, ClientParsing.PageParser ) ):
if isinstance( obj_list, ( ClientNetworkingDomain.GalleryURLGenerator, ClientNetworkingDomain.NestedGalleryURLGenerator, ClientNetworkingDomain.URLMatch, ClientParsing.PageParser, ClientNetworkingDomain.DomainMetadataPackage ) ):

obj_list = HydrusSerialisable.SerialisableList( [ obj_list ] )
@@ -1610,6 +1610,10 @@ class ReviewDownloaderImport( ClientGUIScrolledPanels.ReviewPanel ):

parsers.append( obj )

elif isinstance( obj, ClientNetworkingDomain.DomainMetadataPackage ):

domain_metadatas.append( obj )

else:

num_misc_objects += 1

@@ -1617,7 +1621,7 @@ class ReviewDownloaderImport( ClientGUIScrolledPanels.ReviewPanel ):

if len( gugs ) + len( url_matches ) + len( parsers ) == 0:
if len( obj_list ) - num_misc_objects == 0:

if num_misc_objects > 0:
@@ -1663,8 +1667,6 @@ class ReviewDownloaderImport( ClientGUIScrolledPanels.ReviewPanel ):

#

# now parsers

num_exact_dupe_parsers = 0
@@ -1682,9 +1684,58 @@ class ReviewDownloaderImport( ClientGUIScrolledPanels.ReviewPanel ):

total_num_dupes = num_exact_dupe_gugs + num_exact_dupe_url_matches + num_exact_dupe_parsers
# now domain metadata

if len( new_gugs ) + len( new_url_matches ) + len( new_parsers ) == 0:
num_exact_dupe_domain_metadatas = 0
new_domain_metadatas = []

for domain_metadata in domain_metadatas:

# check if the headers or rules are new, discarding any dupe content

domain = domain_metadata.GetDomain()
headers_list = None
bandwidth_rules = None

nc = ClientNetworkingContexts.NetworkContext( CC.NETWORK_CONTEXT_DOMAIN, domain )

if domain_metadata.HasHeaders():

headers_list = domain_metadata.GetHeaders()

if domain_manager.AlreadyHaveExactlyTheseHeaders( nc, headers_list ):

headers_list = None

if domain_metadata.HasBandwidthRules():

bandwidth_rules = domain_metadata.GetBandwidthRules()

if bandwidth_manager.AlreadyHaveExactlyTheseBandwidthRules( nc, bandwidth_rules ):

bandwidth_rules = None

if headers_list is None and bandwidth_rules is None:

num_exact_dupe_domain_metadatas += 1

else:

new_dm = ClientNetworkingDomain.DomainMetadataPackage( domain = domain, headers_list = headers_list, bandwidth_rules = bandwidth_rules )

new_domain_metadatas.append( new_dm )

#

total_num_dupes = num_exact_dupe_gugs + num_exact_dupe_url_matches + num_exact_dupe_parsers + num_exact_dupe_domain_metadatas

if len( new_gugs ) + len( new_url_matches ) + len( new_parsers ) + len( new_domain_metadatas ) == 0:

wx.MessageBox( 'All ' + HydrusData.ToHumanInt( total_num_dupes ) + ' downloader objects in that package appeared to already be in the client, so nothing need be added.' )
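The loop above only keeps a domain metadata package if it carries headers or bandwidth rules the client does not already have for that domain; anything fully duplicated is counted and dropped. A hedged sketch of that dedupe rule, with plain dicts standing in for the domain and bandwidth managers (the `AlreadyHaveExactlyThese...` calls in the diff are modelled as simple equality checks here):

```python
def filter_new_packages( packages, existing_headers, existing_rules ):
    
    # packages are ( domain, headers, rules ) triples; None means 'not bundled'
    num_dupes = 0
    new_packages = []
    
    for ( domain, headers, rules ) in packages:
        
        if headers is not None and existing_headers.get( domain ) == headers:
            
            headers = None # exact dupe content, discard
            
        
        if rules is not None and existing_rules.get( domain ) == rules:
            
            rules = None # exact dupe content, discard
            
        
        if headers is None and rules is None:
            
            num_dupes += 1 # nothing new survived
            
        else:
            
            new_packages.append( ( domain, headers, rules ) )
            
        
    
    return ( num_dupes, new_packages )
```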
@@ -1699,6 +1750,7 @@ class ReviewDownloaderImport( ClientGUIScrolledPanels.ReviewPanel ):

choice_tuples.extend( [ ( 'GUG: ' + gug.GetName(), gug, True ) for gug in new_gugs ] )
choice_tuples.extend( [ ( 'URL Class: ' + url_match.GetName(), url_match, True ) for url_match in new_url_matches ] )
choice_tuples.extend( [ ( 'Parser: ' + parser.GetName(), parser, True ) for parser in new_parsers ] )
choice_tuples.extend( [ ( 'Domain Metadata: ' + domain_metadata.GetDomain(), domain_metadata, True ) for domain_metadata in new_domain_metadatas ] )

with ClientGUITopLevelWindows.DialogEdit( self, 'select objects to add' ) as dlg:
@@ -1713,6 +1765,7 @@ class ReviewDownloaderImport( ClientGUIScrolledPanels.ReviewPanel ):

new_gugs = [ obj for obj in new_objects if isinstance( obj, ( ClientNetworkingDomain.GalleryURLGenerator, ClientNetworkingDomain.NestedGalleryURLGenerator ) ) ]
new_url_matches = [ obj for obj in new_objects if isinstance( obj, ClientNetworkingDomain.URLMatch ) ]
new_parsers = [ obj for obj in new_objects if isinstance( obj, ClientParsing.PageParser ) ]
new_domain_metadatas = [ obj for obj in new_objects if isinstance( obj, ClientNetworkingDomain.DomainMetadataPackage ) ]

else:
@@ -1726,10 +1779,32 @@ class ReviewDownloaderImport( ClientGUIScrolledPanels.ReviewPanel ):

new_gugs.sort( key = lambda o: o.GetName() )
new_url_matches.sort( key = lambda o: o.GetName() )
new_parsers.sort( key = lambda o: o.GetName() )
new_domain_metadatas.sort( key = lambda o: o.GetDomain() )

if len( new_domain_metadatas ) > 0:

message = 'Before the final import confirmation, I will now show the domain metadata in detail. Give it a cursory look just to check it seems good. If not, click no on the final dialog!'

TOO_MANY_DM = 8

if len( new_domain_metadatas ) > TOO_MANY_DM:

message += os.linesep * 2
message += 'There are more than ' + HydrusData.ToHumanInt( TOO_MANY_DM ) + ' domain metadata objects. So I do not give you dozens of preview windows, I will only show you these first ' + HydrusData.ToHumanInt( TOO_MANY_DM ) + '.'

new_domain_metadatas_to_show = new_domain_metadatas[:TOO_MANY_DM]

for new_dm in new_domain_metadatas_to_show:

wx.MessageBox( new_dm.GetDetailedSafeSummary() )

all_to_add = list( new_gugs )
all_to_add.extend( new_url_matches )
all_to_add.extend( new_parsers )
all_to_add.extend( new_domain_metadatas )

message = 'The client is about to add and link these objects:'
message += os.linesep * 2

@@ -1761,10 +1836,13 @@ class ReviewDownloaderImport( ClientGUIScrolledPanels.ReviewPanel ):

domain_manager.AutoAddURLMatchesAndParsers( new_url_matches, dupe_url_matches, new_parsers )

bandwidth_manager.AutoAddDomainMetadatas( new_domain_metadatas )
domain_manager.AutoAddDomainMetadatas( new_domain_metadatas, approved = True )

num_new_gugs = len( new_gugs )
num_aux = len( new_url_matches ) + len( new_parsers )

final_message = 'Successfully added ' + HydrusData.ToHumanInt( len( new_gugs ) ) + ' new downloaders and ' + HydrusData.ToHumanInt( len( new_url_matches ) + len( new_parsers ) ) + ' auxiliary objects.'
final_message = 'Successfully added ' + HydrusData.ToHumanInt( len( new_gugs ) ) + ' new downloaders and ' + HydrusData.ToHumanInt( len( new_url_matches ) + len( new_parsers ) + len( new_domain_metadatas ) ) + ' auxiliary objects.'

if total_num_dupes > 0:
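The `TOO_MANY_DM` cap above is a plain slice: show at most eight detail dialogs and warn when the list was truncated. A minimal sketch of that pattern (the function name is mine; the diff inlines this logic):

```python
TOO_MANY_DM = 8 # cap on detail preview dialogs, as in the diff

def build_preview( summaries ):
    
    # take at most TOO_MANY_DM items to show, and flag whether we truncated
    shown = summaries[ : TOO_MANY_DM ]
    
    truncated = len( summaries ) > TOO_MANY_DM
    
    return ( shown, truncated )
```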
@@ -2,6 +2,7 @@ import numpy.core.multiarray # important this comes before cv!

import ClientConstants as CC
import cv2
import HydrusConstants as HC
import HydrusData
import HydrusImageHandling
import HydrusGlobals as HG

@@ -61,8 +62,18 @@ def EfficientlyThumbnailNumpyImage( numpy_image, ( target_x, target_y ) ):

def GenerateNumpyImage( path, mime ):

if HG.media_load_report_mode:

HydrusData.ShowText( 'Loading media: ' + path )

if mime == HC.IMAGE_GIF or HG.client_controller.new_options.GetBoolean( 'load_images_with_pil' ):

if HG.media_load_report_mode:

HydrusData.ShowText( 'Loading with PIL' )

# a regular cv.imread call, can crash the whole process on random thumbs, hooray, so have this as backup
# it was just the read that was the problem, so this seems to work fine, even if pil is only about half as fast
@@ -72,6 +83,11 @@ def GenerateNumpyImage( path, mime ):

else:

if HG.media_load_report_mode:

HydrusData.ShowText( 'Loading with OpenCV' )

if mime == HC.IMAGE_JPEG:

flags = CV_IMREAD_FLAGS_SUPPORTS_EXIF_REORIENTATION

@@ -85,6 +101,11 @@ def GenerateNumpyImage( path, mime ):

if numpy_image is None: # doesn't support static gifs and some random other stuff

if HG.media_load_report_mode:

HydrusData.ShowText( 'OpenCV Failed, loading with PIL' )

pil_image = HydrusImageHandling.GeneratePILImage( path )

numpy_image = GenerateNumPyImageFromPILImage( pil_image )
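The hunks above implement a load-with-fallback pattern: try the fast loader first (OpenCV in the real code), and if it returns `None` for an unsupported format, fall back to the slower but more forgiving one (PIL). A self-contained sketch of that shape, with hypothetical loader callables standing in for `cv2.imread` and `GeneratePILImage`:

```python
def load_image( path, fast_loader, safe_loader ):
    
    try:
        
        # fast path; may return None for formats it does not understand
        image = fast_loader( path )
        
    except Exception:
        
        image = None
        
    
    if image is None:
        
        # slow but reliable fallback
        image = safe_loader( path )
        
    
    return image
```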
@@ -439,7 +439,7 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):

def AddParseResults( self, parse_results ):
def AddParseResults( self, parse_results, file_import_options ):

for ( hash_type, hash ) in ClientParsing.GetHashesFromParseResults( parse_results ):

@@ -449,13 +449,16 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):

urls = ClientParsing.GetURLsFromParseResults( parse_results, ( HC.URL_TYPE_SOURCE, ) )

associable_urls = self._NormaliseAndFilterAssociableURLs( urls )

associable_urls.discard( self.file_seed_data )

self._urls.update( associable_urls )
if file_import_options.ShouldAssociateSourceURLs():

source_urls = ClientParsing.GetURLsFromParseResults( parse_results, ( HC.URL_TYPE_SOURCE, ) )

associable_urls = self._NormaliseAndFilterAssociableURLs( source_urls )

associable_urls.discard( self.file_seed_data )

self._urls.update( associable_urls )

tags = ClientParsing.GetTagsFromParseResults( parse_results )
@@ -554,6 +557,159 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):

return None

def GetPreImportStatusPredictionHash( self, file_import_options ):

UNKNOWN_DEFAULT = ( CC.STATUS_UNKNOWN, None, '' )

( status, hash, note ) = UNKNOWN_DEFAULT

if file_import_options.DoNotCheckHashesBeforeImporting():

return ( status, hash, note )

# hashes

if status == CC.STATUS_UNKNOWN:

for ( hash_type, found_hash ) in self._hashes.items():

( status, hash, note ) = HG.client_controller.Read( 'hash_status', hash_type, found_hash )

if status != CC.STATUS_UNKNOWN:

break

if status == CC.STATUS_DELETED:

if not file_import_options.ExcludesDeleted():

( status, hash, note ) = UNKNOWN_DEFAULT

return ( status, hash, note )
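The hash prediction above is a first-match-wins scan: each parsed hash is checked against the database, and the first non-unknown status short-circuits the loop. A sketch of that loop, with `lookup` standing in for the `HG.client_controller.Read( 'hash_status', ... )` call and an assumed integer status constant:

```python
STATUS_UNKNOWN = 0 # assumed stand-in for CC.STATUS_UNKNOWN

def predict_from_hashes( hashes, lookup ):
    
    ( status, hash, note ) = ( STATUS_UNKNOWN, None, '' )
    
    for ( hash_type, found_hash ) in hashes.items():
        
        ( status, hash, note ) = lookup( hash_type, found_hash )
        
        if status != STATUS_UNKNOWN:
            
            break # first recognised hash wins
            
        
    
    return ( status, hash, note )
```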
def GetPreImportStatusPredictionURL( self, file_import_options, file_url = None ):

UNKNOWN_DEFAULT = ( CC.STATUS_UNKNOWN, None, '' )

( status, hash, note ) = UNKNOWN_DEFAULT

if file_import_options.DoNotCheckKnownURLsBeforeImporting():

return ( status, hash, note )

# urls

urls = set( self._urls )

if file_url is not None:

urls.add( file_url )

if self.file_seed_type == FILE_SEED_TYPE_URL:

urls.add( self.file_seed_data )

unrecognised_url_results = set()

for url in urls:

if HG.client_controller.network_engine.domain_manager.URLCanReferToMultipleFiles( url ):

continue

# we now only trust url-matched single urls and the post/file urls
# trusting unmatched source urls was too much of a hassle with too many boorus providing bad source urls like user account pages

if HG.client_controller.network_engine.domain_manager.URLDefinitelyRefersToOneFile( url ) or url in ( self.file_seed_data, file_url ):

results = HG.client_controller.Read( 'url_statuses', url )

if len( results ) == 0: # if no match found, no useful data discovered

continue

elif len( results ) > 1: # if more than one file claims this url, it cannot be relied on to guess the file

continue

else: # i.e. 1 match found

( status, hash, note ) = results[0]

if status != CC.STATUS_UNKNOWN:

# a known one-file url has given a single clear result. sounds good

we_have_a_match = True

if self.file_seed_type == FILE_SEED_TYPE_URL:

# to double-check, let's see if the file that claims that url has any other interesting urls
# if the file has another url with the same url class as ours, then this is prob an unreliable 'alternate' source url attribution, and untrustworthy

my_url = self.file_seed_data

if url != my_url:

my_url_match = HG.client_controller.network_engine.domain_manager.GetURLMatch( my_url )

( media_result, ) = HG.client_controller.Read( 'media_results', ( hash, ) )

this_files_urls = media_result.GetLocationsManager().GetURLs()

for this_files_url in this_files_urls:

if this_files_url != my_url:

this_url_match = HG.client_controller.network_engine.domain_manager.GetURLMatch( this_files_url )

if my_url_match == this_url_match:

# oh no, the file this source url refers to has a different known url in this same domain
# it is more likely that an edit on this site points to the original elsewhere

( status, hash, note ) = UNKNOWN_DEFAULT

we_have_a_match = False

break

if we_have_a_match:

break # if a known one-file url gives a single clear result, that result is reliable

if status == CC.STATUS_DELETED:

if not file_import_options.ExcludesDeleted():

( status, hash, note ) = UNKNOWN_DEFAULT

return ( status, hash, note )
def GetSearchFileSeeds( self ):

if self.file_seed_type == FILE_SEED_TYPE_URL:

@@ -660,6 +816,11 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):

return False

def IsDeleted( self ):

return self.status == CC.STATUS_DELETED

def Normalise( self ):

if self.file_seed_type == FILE_SEED_TYPE_URL:
@@ -670,16 +831,44 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):

def PredictPreImportStatus( self, file_import_options, tag_import_options, file_url = None ):

# atm, if url recognised, then hash is always recognised because url sets sha256 hash wew
# however, we will move these to now be 'setting' methods soonish, and do the status/note/etc... set here, at which point this will be _more_ accurate, at least first time around
# the should_download_file test should take into account future url/hash checkboxes in a similar way, which this will matter more
# and in fact, it may be appropriate to not actually do/'trust' url/hash status sets if the file ones are checked

( url_status, url_hash, url_note ) = self.GetPreImportStatusPredictionURL( file_import_options, file_url = file_url )
( hash_status, hash_hash, hash_note ) = self.GetPreImportStatusPredictionHash( file_import_options )

url_recognised_and_file_already_in_db = self.PredictPreImportStatusURL( file_import_options, file_url = file_url )
hash_recognised_and_file_already_in_db = self.PredictPreImportStatusHash( file_import_options )
url_recognised_and_file_already_in_db = url_status == CC.STATUS_SUCCESSFUL_BUT_REDUNDANT
hash_recognised_and_file_already_in_db = hash_status == CC.STATUS_SUCCESSFUL_BUT_REDUNDANT

should_download_metadata = self.status == CC.STATUS_UNKNOWN # if the file is unknown, we need the metadata to get the file_url!
# now let's set the prediction

if hash_status != CC.STATUS_UNKNOWN: # trust hashes over urls m8

( status, hash, note ) = ( hash_status, hash_hash, hash_note )

else:

( status, hash, note ) = ( url_status, url_hash, url_note )

if self.status == CC.STATUS_UNKNOWN and status != CC.STATUS_UNKNOWN:

self.status = status

if hash is not None:

self._hashes[ 'sha256' ] = hash

self.note = note

self._UpdateModified()

# and make some recommendations

should_download_file = self.status == CC.STATUS_UNKNOWN

should_download_metadata = should_download_file # if we want the file, we need the metadata to get the file_url!

# but if we otherwise still want to force some tags, let's do it
if not should_download_metadata and tag_import_options.WorthFetchingTags():

url_override = url_recognised_and_file_already_in_db and tag_import_options.ShouldFetchTagsEvenIfURLKnownAndFileAlreadyInDB()
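The combination rule in `PredictPreImportStatus` above is simple: compute both predictions, then let the hash prediction win whenever it is not unknown ("trust hashes over urls"). A sketch of just that rule, with an assumed integer constant standing in for `CC.STATUS_UNKNOWN`:

```python
STATUS_UNKNOWN = 0 # assumed stand-in for CC.STATUS_UNKNOWN

def combine_predictions( hash_prediction, url_prediction ):
    
    # each prediction is a ( status, hash, note ) triple
    ( hash_status, hash_hash, hash_note ) = hash_prediction
    
    if hash_status != STATUS_UNKNOWN: # trust hashes over urls
        
        return hash_prediction
        
    
    return url_prediction
```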
@@ -691,192 +880,9 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):

should_download_file = self.status == CC.STATUS_UNKNOWN

return ( should_download_metadata, should_download_file )

def PredictPreImportStatusHash( self, file_import_options ):

UNKNOWN_DEFAULT = ( CC.STATUS_UNKNOWN, None, '' )

( status, hash, note ) = UNKNOWN_DEFAULT

# hashes

if status == CC.STATUS_UNKNOWN:

for ( hash_type, found_hash ) in self._hashes.items():

( status, hash, note ) = HG.client_controller.Read( 'hash_status', hash_type, found_hash )

if status != CC.STATUS_UNKNOWN:

break

hash_recognised_and_file_already_in_db = status == CC.STATUS_SUCCESSFUL_BUT_REDUNDANT

#

if status == CC.STATUS_DELETED:

if not file_import_options.ExcludesDeleted():

status = CC.STATUS_UNKNOWN
note = ''

if self.status == CC.STATUS_UNKNOWN:

self.status = status

if hash is not None:

self._hashes[ 'sha256' ] = hash

self.note = note

self._UpdateModified()

return hash_recognised_and_file_already_in_db

def PredictPreImportStatusURL( self, file_import_options, file_url = None ):

UNKNOWN_DEFAULT = ( CC.STATUS_UNKNOWN, None, '' )

( status, hash, note ) = UNKNOWN_DEFAULT

# urls

urls = set( self._urls )

if file_url is not None:

urls.add( file_url )

if self.file_seed_type == FILE_SEED_TYPE_URL:

urls.add( self.file_seed_data )

unrecognised_url_results = set()

for url in urls:

if HG.client_controller.network_engine.domain_manager.URLCanReferToMultipleFiles( url ):

continue

# we now only trust url-matched single urls and the post/file urls
# trusting unmatched source urls was too much of a hassle with too many boorus providing bad source urls like user account pages

if HG.client_controller.network_engine.domain_manager.URLDefinitelyRefersToOneFile( url ) or url in ( self.file_seed_data, file_url ):

results = HG.client_controller.Read( 'url_statuses', url )

if len( results ) == 0: # if no match found, no useful data discovered

continue

elif len( results ) > 1: # if more than one file claims this url, it cannot be relied on to guess the file

continue

else: # i.e. 1 match found

( status, hash, note ) = results[0]

if status != CC.STATUS_UNKNOWN:

# a known one-file url has given a single clear result. sounds good

we_have_a_match = True

if self.file_seed_type == FILE_SEED_TYPE_URL:

# to double-check, let's see if the file that claims that url has any other interesting urls
# if the file has another url with the same url class as ours, then this is prob an unreliable 'alternate' source url attribution, and untrustworthy

my_url = self.file_seed_data

if url != my_url:

my_url_match = HG.client_controller.network_engine.domain_manager.GetURLMatch( my_url )

( media_result, ) = HG.client_controller.Read( 'media_results', ( hash, ) )

this_files_urls = media_result.GetLocationsManager().GetURLs()

for this_files_url in this_files_urls:

if this_files_url != my_url:

this_url_match = HG.client_controller.network_engine.domain_manager.GetURLMatch( this_files_url )

if my_url_match == this_url_match:

# oh no, the file this source url refers to has a different known url in this same domain
# it is more likely that an edit on this site points to the original elsewhere

( status, hash, note ) = UNKNOWN_DEFAULT

we_have_a_match = False

break

if we_have_a_match:

break # if a known one-file url gives a single clear result, that result is reliable

url_recognised_and_file_already_in_db = status == CC.STATUS_SUCCESSFUL_BUT_REDUNDANT

#

if status == CC.STATUS_DELETED:

if not file_import_options.ExcludesDeleted():

status = CC.STATUS_UNKNOWN
note = ''

if self.status == CC.STATUS_UNKNOWN:

self.status = status

if hash is not None:

self._hashes[ 'sha256' ] = hash

self.note = note

self._UpdateModified()

return url_recognised_and_file_already_in_db
def PresentToPage( self, page_key ):

hash = self.GetHash()

@@ -1051,7 +1057,7 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):

elif len( all_parse_results ) > 1:

file_seeds = ClientImporting.ConvertAllParseResultsToFileSeeds( all_parse_results, self.file_seed_data )
file_seeds = ClientImporting.ConvertAllParseResultsToFileSeeds( all_parse_results, self.file_seed_data, file_import_options )

( num_urls_added, num_urls_already_in_file_seed_cache, can_search_for_more_files, stop_reason ) = ClientImporting.UpdateFileSeedCacheWithFileSeeds( file_seed_cache, file_seeds )
@@ -1064,7 +1070,7 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):

parse_results = all_parse_results[0]

self.AddParseResults( parse_results )
self.AddParseResults( parse_results, file_import_options )

self.CheckPreFetchMetadata( tag_import_options )

@@ -1799,6 +1805,16 @@ class FileSeedCache( HydrusSerialisable.SerialisableBase ):

def GetHashes( self ):

with self._lock:

hashes = [ file_seed.GetHash() for file_seed in self._file_seeds if file_seed.HasHash() ]

return hashes

def GetLatestAddedTime( self ):

with self._lock:
@@ -389,6 +389,14 @@ class GalleryImport( HydrusSerialisable.SerialisableBase ):

return True

def CanRetryFailed( self ):

with self._lock:

return self._file_seed_cache.GetFileSeedCount( CC.STATUS_ERROR ) > 0

def CurrentlyWorking( self ):

with self._lock:

@@ -603,6 +611,14 @@ class GalleryImport( HydrusSerialisable.SerialisableBase ):

def RetryFailed( self ):

with self._lock:

self._file_seed_cache.RetryFailures()

def SetFileLimit( self, file_limit ):

with self._lock:
@@ -249,7 +249,7 @@ class GallerySeed( HydrusSerialisable.SerialisableBase ):

title_hook( title )

file_seeds = ClientImporting.ConvertAllParseResultsToFileSeeds( all_parse_results, self.url )
file_seeds = ClientImporting.ConvertAllParseResultsToFileSeeds( all_parse_results, self.url, file_import_options )

num_urls_total = len( file_seeds )

@@ -270,7 +270,7 @@ class GallerySeed( HydrusSerialisable.SerialisableBase ):

# only keep searching if we found any files, otherwise this could be a blank results page with another stub page

can_add_more_gallery_urls = num_urls_total > 0 and can_search_for_more_files
can_add_more_gallery_urls = num_urls_added > 0 and can_search_for_more_files

if self._can_generate_more_pages and can_add_more_gallery_urls:
@@ -585,15 +585,18 @@ class ImportFolder( HydrusSerialisable.SerialisableBaseNamed ):

hash = file_seed.GetHash()

downloaded_tags = []

in_inbox = HG.client_controller.Read( 'in_inbox', hash )

service_keys_to_content_updates = self._tag_import_options.GetServiceKeysToContentUpdates( file_seed.status, in_inbox, hash, downloaded_tags ) # additional tags

if len( service_keys_to_content_updates ) > 0:
if self._tag_import_options.HasAdditionalTags():

HG.client_controller.WriteSynchronous( 'content_updates', service_keys_to_content_updates )
in_inbox = HG.client_controller.Read( 'in_inbox', hash )

downloaded_tags = []

service_keys_to_content_updates = self._tag_import_options.GetServiceKeysToContentUpdates( file_seed.status, in_inbox, hash, downloaded_tags ) # additional tags

if len( service_keys_to_content_updates ) > 0:

HG.client_controller.WriteSynchronous( 'content_updates', service_keys_to_content_updates )

service_keys_to_tags = {}
@@ -783,17 +786,17 @@ class ImportFolder( HydrusSerialisable.SerialisableBaseNamed ):

error_occured = False

job_key = ClientThreading.JobKey( pausable = False, cancellable = True )

try:

if not os.path.exists( self._path ) or not os.path.isdir( self._path ):

raise Exception( 'Path does not seem to exist, or is not a directory.' )
raise Exception( 'Path "' + self._path + '" does not seem to exist, or is not a directory.' )

pubbed_job_key = False

job_key = ClientThreading.JobKey( pausable = False, cancellable = True )

job_key.SetVariable( 'popup_title', 'import folder - ' + self._name )

due_by_check_now = self._check_now
@@ -528,13 +528,15 @@ class FileImportOptions( HydrusSerialisable.SerialisableBase ):

SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_FILE_IMPORT_OPTIONS
SERIALISABLE_NAME = 'File Import Options'
SERIALISABLE_VERSION = 3
SERIALISABLE_VERSION = 4

def __init__( self ):

HydrusSerialisable.SerialisableBase.__init__( self )

self._exclude_deleted = True
self._do_not_check_known_urls_before_importing = False
self._do_not_check_hashes_before_importing = False
self._allow_decompression_bombs = False
self._min_size = None
self._max_size = None

@@ -542,6 +544,7 @@ class FileImportOptions( HydrusSerialisable.SerialisableBase ):

self._min_resolution = None
self._max_resolution = None
self._automatic_archive = False
self._associate_source_urls = True
self._present_new_files = True
self._present_already_in_inbox_files = True
self._present_already_in_archive_files = True

@@ -549,8 +552,8 @@ class FileImportOptions( HydrusSerialisable.SerialisableBase ):

def _GetSerialisableInfo( self ):

pre_import_options = ( self._exclude_deleted, self._allow_decompression_bombs, self._min_size, self._max_size, self._max_gif_size, self._min_resolution, self._max_resolution )
post_import_options = self._automatic_archive
pre_import_options = ( self._exclude_deleted, self._do_not_check_known_urls_before_importing, self._do_not_check_hashes_before_importing, self._allow_decompression_bombs, self._min_size, self._max_size, self._max_gif_size, self._min_resolution, self._max_resolution )
post_import_options = ( self._automatic_archive, self._associate_source_urls )
presentation_options = ( self._present_new_files, self._present_already_in_inbox_files, self._present_already_in_archive_files )

return ( pre_import_options, post_import_options, presentation_options )

@@ -560,8 +563,8 @@ class FileImportOptions( HydrusSerialisable.SerialisableBase ):

( pre_import_options, post_import_options, presentation_options ) = serialisable_info

( self._exclude_deleted, self._allow_decompression_bombs, self._min_size, self._max_size, self._max_gif_size, self._min_resolution, self._max_resolution ) = pre_import_options
self._automatic_archive = post_import_options
( self._exclude_deleted, self._do_not_check_known_urls_before_importing, self._do_not_check_hashes_before_importing, self._allow_decompression_bombs, self._min_size, self._max_size, self._max_gif_size, self._min_resolution, self._max_resolution ) = pre_import_options
|
||||
( self._automatic_archive, self._associate_source_urls ) = post_import_options
|
||||
( self._present_new_files, self._present_already_in_inbox_files, self._present_already_in_archive_files ) = presentation_options
|
||||
|
||||
|
||||
|
@ -599,6 +602,27 @@ class FileImportOptions( HydrusSerialisable.SerialisableBase ):
|
|||
return ( 3, new_serialisable_info )
|
||||
|
||||
|
||||
if version == 3:
|
||||
|
||||
( pre_import_options, post_import_options, presentation_options ) = old_serialisable_info
|
||||
|
||||
( exclude_deleted, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution ) = pre_import_options
|
||||
|
||||
automatic_archive = post_import_options
|
||||
|
||||
do_not_check_known_urls_before_importing = False
|
||||
do_not_check_hashes_before_importing = False
|
||||
associate_source_urls = True
|
||||
|
||||
pre_import_options = ( exclude_deleted, do_not_check_known_urls_before_importing, do_not_check_hashes_before_importing, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
|
||||
|
||||
post_import_options = ( automatic_archive, associate_source_urls )
|
||||
|
||||
new_serialisable_info = ( pre_import_options, post_import_options, presentation_options )
|
||||
|
||||
return ( 4, new_serialisable_info )
|
||||
|
||||
|
||||
|
||||
def AllowsDecompressionBombs( self ):
|
||||
|
||||
|
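The v3-to-v4 hunk above follows the client's usual pattern for migrating serialised objects: each update step unpacks the old tuple, fills in safe defaults for the new fields, and returns the next version number, so the caller can step version by version. A minimal standalone sketch of that pattern (function and field names here are illustrative, not hydrus's real API):

```python
def update_serialisable_info( version, old_info ):
    
    # each step upgrades exactly one version; the caller loops
    # until the stored version matches the current one
    if version == 3:
        
        ( pre_import, post_import, presentation ) = old_info
        
        # brand-new fields get safe defaults
        do_not_check_urls = False
        do_not_check_hashes = False
        associate_source_urls = True
        
        # splice the new fields into the old tuples
        pre_import = ( pre_import[0], do_not_check_urls, do_not_check_hashes ) + pre_import[1:]
        post_import = ( post_import, associate_source_urls )
        
        return ( 4, ( pre_import, post_import, presentation ) )
        
    
    return ( version, old_info )
```

Because every step only knows about its own version pair, new fields can keep being appended in later releases without touching the older steps.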
@@ -696,7 +720,7 @@ class FileImportOptions( HydrusSerialisable.SerialisableBase ):

def GetPostImportOptions( self ):

- post_import_options = self._automatic_archive
+ post_import_options = ( self._automatic_archive, self._associate_source_urls )

return post_import_options

@@ -710,7 +734,7 @@ class FileImportOptions( HydrusSerialisable.SerialisableBase ):

def GetPreImportOptions( self ):

- pre_import_options = ( self._exclude_deleted, self._allow_decompression_bombs, self._min_size, self._max_size, self._max_gif_size, self._min_resolution, self._max_resolution )
+ pre_import_options = ( self._exclude_deleted, self._do_not_check_known_urls_before_importing, self._do_not_check_hashes_before_importing, self._allow_decompression_bombs, self._min_size, self._max_size, self._max_gif_size, self._min_resolution, self._max_resolution )

return pre_import_options

@@ -802,9 +826,10 @@ class FileImportOptions( HydrusSerialisable.SerialisableBase ):

return summary

- def SetPostImportOptions( self, automatic_archive ):
+ def SetPostImportOptions( self, automatic_archive, associate_source_urls ):

self._automatic_archive = automatic_archive
+ self._associate_source_urls = associate_source_urls

def SetPresentationOptions( self, present_new_files, present_already_in_inbox_files, present_already_in_archive_files ):

@@ -814,9 +839,11 @@ class FileImportOptions( HydrusSerialisable.SerialisableBase ):

self._present_already_in_archive_files = present_already_in_archive_files

- def SetPreImportOptions( self, exclude_deleted, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution ):
+ def SetPreImportOptions( self, exclude_deleted, do_not_check_known_urls_before_importing, do_not_check_hashes_before_importing, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution ):

self._exclude_deleted = exclude_deleted
+ self._do_not_check_known_urls_before_importing = do_not_check_known_urls_before_importing
+ self._do_not_check_hashes_before_importing = do_not_check_hashes_before_importing
self._allow_decompression_bombs = allow_decompression_bombs
self._min_size = min_size
self._max_size = max_size

@@ -825,6 +852,21 @@ class FileImportOptions( HydrusSerialisable.SerialisableBase ):

self._max_resolution = max_resolution

+ def ShouldAssociateSourceURLs( self ):
+
+ return self._associate_source_urls
+
+ def DoNotCheckHashesBeforeImporting( self ):
+
+ return self._do_not_check_hashes_before_importing
+
+ def DoNotCheckKnownURLsBeforeImporting( self ):
+
+ return self._do_not_check_known_urls_before_importing

def ShouldNotPresentIgnorantOfInbox( self, status ):

return NewInboxArchiveNonMatchIgnorantOfInbox( self._present_new_files, self._present_already_in_inbox_files, self._present_already_in_archive_files, status )

@@ -1148,6 +1190,11 @@ class TagImportOptions( HydrusSerialisable.SerialisableBase ):

return self._tag_blacklist

+ def HasAdditionalTags( self ):
+
+ return True in ( service_tag_import_options.HasAdditionalTags() for service_tag_import_options in self._service_keys_to_service_tag_import_options.values() )

def IsDefault( self ):

return self._is_default

@@ -1330,6 +1377,11 @@ class ServiceTagImportOptions( HydrusSerialisable.SerialisableBase ):

return tags

+ def HasAdditionalTags( self ):
+
+ return len( self._additional_tags ) > 0

def ToTuple( self ):

return ( self._get_tags, self._get_tags_filter, self._additional_tags, self._to_new_files, self._to_already_in_inbox, self._to_already_in_archive, self._only_add_existing_tags, self._only_add_existing_tags_filter )
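The `True in ( … )` membership test in the new `TagImportOptions.HasAdditionalTags` is an older idiom that short-circuits over a generator exactly like the builtin `any`. A small self-contained comparison of the two spellings (the `ServiceOptions` class here is a stand-in, not the real hydrus class):

```python
class ServiceOptions:
    
    def __init__( self, additional_tags ):
        
        self._additional_tags = additional_tags
        
    
    def HasAdditionalTags( self ):
        
        return len( self._additional_tags ) > 0
        
    

def has_additional_tags_membership( services ):
    
    # the idiom used in the diff above; `in` stops at the first True
    return True in ( s.HasAdditionalTags() for s in services )

def has_additional_tags_any( services ):
    
    # the equivalent spelling with the builtin any()
    return any( s.HasAdditionalTags() for s in services )
```

Both stop consuming the generator at the first service that reports additional tags, so they are interchangeable here.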
@@ -22,7 +22,7 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):

SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_SUBSCRIPTION
SERIALISABLE_NAME = 'Subscription'
- SERIALISABLE_VERSION = 8
+ SERIALISABLE_VERSION = 9

def __init__( self, name, gug_key_and_name = None ):

@@ -62,6 +62,7 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):

self._no_work_until = 0
self._no_work_until_reason = ''

+ self._show_a_popup_while_working = True
self._publish_files_to_popup_button = True
self._publish_files_to_page = False
self._merge_query_publish_events = True

@@ -94,9 +95,9 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):

def _GetNetworkJobSubscriptionKey( self, query ):

- query_text = query.GetQueryText()
+ query_name = query.GetHumanName()

- return self._name + ': ' + query_text
+ return self._name + ': ' + query_name

def _GetQueriesForProcessing( self ):

@@ -111,7 +112,7 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):

def key( q ):

- return q.GetQueryText()
+ return q.GetHumanName()

queries.sort( key = key )

@@ -130,12 +131,12 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):

serialisable_file_import_options = self._file_import_options.GetSerialisableTuple()
serialisable_tag_import_options = self._tag_import_options.GetSerialisableTuple()

- return ( serialisable_gug_key_and_name, serialisable_queries, serialisable_checker_options, self._initial_file_limit, self._periodic_file_limit, self._paused, serialisable_file_import_options, serialisable_tag_import_options, self._no_work_until, self._no_work_until_reason, self._publish_files_to_popup_button, self._publish_files_to_page, self._merge_query_publish_events )
+ return ( serialisable_gug_key_and_name, serialisable_queries, serialisable_checker_options, self._initial_file_limit, self._periodic_file_limit, self._paused, serialisable_file_import_options, serialisable_tag_import_options, self._no_work_until, self._no_work_until_reason, self._show_a_popup_while_working, self._publish_files_to_popup_button, self._publish_files_to_page, self._merge_query_publish_events )

def _InitialiseFromSerialisableInfo( self, serialisable_info ):

- ( serialisable_gug_key_and_name, serialisable_queries, serialisable_checker_options, self._initial_file_limit, self._periodic_file_limit, self._paused, serialisable_file_import_options, serialisable_tag_import_options, self._no_work_until, self._no_work_until_reason, self._publish_files_to_popup_button, self._publish_files_to_page, self._merge_query_publish_events ) = serialisable_info
+ ( serialisable_gug_key_and_name, serialisable_queries, serialisable_checker_options, self._initial_file_limit, self._periodic_file_limit, self._paused, serialisable_file_import_options, serialisable_tag_import_options, self._no_work_until, self._no_work_until_reason, self._show_a_popup_while_working, self._publish_files_to_popup_button, self._publish_files_to_page, self._merge_query_publish_events ) = serialisable_info

( serialisable_gug_key, gug_name ) = serialisable_gug_key_and_name

@@ -177,7 +178,7 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):

if HG.subscription_report_mode:

- HydrusData.ShowText( 'Query "' + query.GetQueryText() + '" pre-work bandwidth test. Bandwidth ok: ' + str( result ) + '.' )
+ HydrusData.ShowText( 'Query "' + query.GetHumanName() + '" pre-work bandwidth test. Bandwidth ok: ' + str( result ) + '.' )

return result

@@ -296,6 +297,17 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):

return ( 8, new_serialisable_info )

+ if version == 8:
+
+ ( serialisable_gug_key_and_name, serialisable_queries, serialisable_checker_options, initial_file_limit, periodic_file_limit, paused, serialisable_file_import_options, serialisable_tag_import_options, no_work_until, no_work_until_reason, publish_files_to_popup_button, publish_files_to_page, merge_query_publish_events ) = old_serialisable_info
+
+ show_a_popup_while_working = True
+
+ new_serialisable_info = ( serialisable_gug_key_and_name, serialisable_queries, serialisable_checker_options, initial_file_limit, periodic_file_limit, paused, serialisable_file_import_options, serialisable_tag_import_options, no_work_until, no_work_until_reason, show_a_popup_while_working, publish_files_to_popup_button, publish_files_to_page, merge_query_publish_events )
+
+ return ( 9, new_serialisable_info )

def _WorkOnFiles( self, job_key ):

@@ -310,16 +322,16 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):

this_query_has_done_work = False

- query_text = query.GetQueryText()
+ query_name = query.GetHumanName()
file_seed_cache = query.GetFileSeedCache()

text_1 = 'downloading files'
query_summary_name = self._name

- if query_text != self._name:
+ if query_name != self._name:

- text_1 += ' for "' + query_text + '"'
- query_summary_name += ': ' + query_text
+ text_1 += ' for "' + query_name + '"'
+ query_summary_name += ': ' + query_name

job_key.SetVariable( 'popup_text_1', text_1 )

@@ -339,7 +351,7 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):

if HG.subscription_report_mode:

- HydrusData.ShowText( 'Query "' + query_text + '" can do no more file work due to running out of unknown urls.' )
+ HydrusData.ShowText( 'Query "' + query_name + '" can do no more file work due to running out of unknown urls.' )

break

@@ -381,6 +393,24 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):

file_seed.WorkOnURL( file_seed_cache, status_hook, self._GenerateNetworkJobFactory( query ), ClientImporting.GenerateMultiplePopupNetworkJobPresentationContextFactory( job_key ), self._file_import_options, self._tag_import_options )

+ query_tag_import_options = query.GetTagImportOptions()
+
+ if query_tag_import_options.HasAdditionalTags() and file_seed.status in CC.SUCCESSFUL_IMPORT_STATES:
+
+ hash = file_seed.GetHash()
+
+ in_inbox = HG.client_controller.Read( 'in_inbox', hash )
+
+ downloaded_tags = []
+
+ service_keys_to_content_updates = query_tag_import_options.GetServiceKeysToContentUpdates( file_seed.status, in_inbox, hash, downloaded_tags ) # additional tags
+
+ if len( service_keys_to_content_updates ) > 0:
+
+ HG.client_controller.WriteSynchronous( 'content_updates', service_keys_to_content_updates )

if file_seed.ShouldPresent( self._file_import_options ):

hash = file_seed.GetHash()

@@ -529,7 +559,7 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):

if HG.subscription_report_mode:

- HydrusData.ShowText( 'Query "' + query.GetQueryText() + '" started. Current can_sync is ' + str( can_sync ) + '.' )
+ HydrusData.ShowText( 'Query "' + query.GetHumanName() + '" started. Current can_sync is ' + str( can_sync ) + '.' )

if not can_sync:

@@ -538,6 +568,7 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):

query_text = query.GetQueryText()
+ query_name = query.GetHumanName()
file_seed_cache = query.GetFileSeedCache()
gallery_seed_log = query.GetGallerySeedLog()

@@ -562,9 +593,9 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):

prefix = 'synchronising'

- if query_text != self._name:
+ if query_name != self._name:

- prefix += ' "' + query_text + '"'
+ prefix += ' "' + query_name + '"'

job_key.SetVariable( 'popup_text_1', prefix )

@@ -676,7 +707,7 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):

else:

- self._ShowHitPeriodicFileLimitMessage( query_text )
+ self._ShowHitPeriodicFileLimitMessage( query_name )

stop_reason = 'hit periodic file limit'

@@ -769,11 +800,11 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):

if this_is_initial_sync:

- HydrusData.ShowText( 'The query "' + query_text + '" for subscription "' + self._name + '" did not find any files on its first sync! Could the query text have a typo, like a missing underscore?' )
+ HydrusData.ShowText( 'The query "' + query_name + '" for subscription "' + self._name + '" did not find any files on its first sync! Could the query text have a typo, like a missing underscore?' )

else:

- HydrusData.ShowText( 'The query "' + query_text + '" for subscription "' + self._name + '" appears to be dead!' )
+ HydrusData.ShowText( 'The query "' + query_name + '" for subscription "' + self._name + '" appears to be dead!' )

else:

@@ -782,7 +813,7 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):

if not self._QueryBandwidthIsOK( query ) and not have_made_an_initial_sync_bandwidth_notification:

- HydrusData.ShowText( 'FYI: The query "' + query_text + '" for subscription "' + self._name + '" performed its initial sync ok, but that domain is short on bandwidth right now, so no files will be downloaded yet. The subscription will catch up in future as bandwidth becomes available. You can review the estimated time until bandwidth is available under the manage subscriptions dialog. If more queries are performing initial syncs in this run, they may be the same.' )
+ HydrusData.ShowText( 'FYI: The query "' + query_name + '" for subscription "' + self._name + '" performed its initial sync ok, but that domain is short on bandwidth right now, so no files will be downloaded yet. The subscription will catch up in future as bandwidth becomes available. You can review the estimated time until bandwidth is available under the manage subscriptions dialog. If more queries are performing initial syncs in this run, they may be the same.' )

have_made_an_initial_sync_bandwidth_notification = True

@@ -889,7 +920,7 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):

def GetPresentationOptions( self ):

- return ( self._publish_files_to_popup_button, self._publish_files_to_page, self._merge_query_publish_events )
+ return ( self._show_a_popup_while_working, self._publish_files_to_popup_button, self._publish_files_to_page, self._merge_query_publish_events )

def GetTagImportOptions( self ):

@@ -999,7 +1030,7 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):

subscription._queries = [ query.Duplicate() ]

- subscription.SetName( base_name + ': ' + query.GetQueryText() )
+ subscription.SetName( base_name + ': ' + query.GetHumanName() )

subscriptions.append( subscription )

@@ -1019,8 +1050,9 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):

- def SetPresentationOptions( self, publish_files_to_popup_button, publish_files_to_page, merge_query_publish_events ):
+ def SetPresentationOptions( self, show_a_popup_while_working, publish_files_to_popup_button, publish_files_to_page, merge_query_publish_events ):

+ self._show_a_popup_while_working = show_a_popup_while_working
self._publish_files_to_popup_button = publish_files_to_popup_button
self._publish_files_to_page = publish_files_to_page
self._merge_query_publish_events = merge_query_publish_events

@@ -1083,7 +1115,10 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):

job_key.SetVariable( 'popup_title', 'subscriptions - ' + self._name )

- HG.client_controller.pub( 'message', job_key )
+ if self._show_a_popup_while_working:
+
+ HG.client_controller.pub( 'message', job_key )

self._SyncQuery( job_key )

@@ -1145,13 +1180,14 @@ class SubscriptionQuery( HydrusSerialisable.SerialisableBase ):

SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_SUBSCRIPTION_QUERY
SERIALISABLE_NAME = 'Subscription Query'
- SERIALISABLE_VERSION = 2
+ SERIALISABLE_VERSION = 3

def __init__( self, query = 'query text' ):

HydrusSerialisable.SerialisableBase.__init__( self )

self._query = query
+ self._display_name = None
self._check_now = False
self._last_check_time = 0
self._next_check_time = 0

@@ -1159,22 +1195,25 @@ class SubscriptionQuery( HydrusSerialisable.SerialisableBase ):

self._status = ClientImporting.CHECKER_STATUS_OK
self._gallery_seed_log = ClientImportGallerySeeds.GallerySeedLog()
self._file_seed_cache = ClientImportFileSeeds.FileSeedCache()
+ self._tag_import_options = ClientImportOptions.TagImportOptions()

def _GetSerialisableInfo( self ):

serialisable_gallery_seed_log = self._gallery_seed_log.GetSerialisableTuple()
serialisable_file_seed_cache = self._file_seed_cache.GetSerialisableTuple()
+ serialisable_tag_import_options = self._tag_import_options.GetSerialisableTuple()

- return ( self._query, self._check_now, self._last_check_time, self._next_check_time, self._paused, self._status, serialisable_gallery_seed_log, serialisable_file_seed_cache )
+ return ( self._query, self._display_name, self._check_now, self._last_check_time, self._next_check_time, self._paused, self._status, serialisable_gallery_seed_log, serialisable_file_seed_cache, serialisable_tag_import_options )

def _InitialiseFromSerialisableInfo( self, serialisable_info ):

- ( self._query, self._check_now, self._last_check_time, self._next_check_time, self._paused, self._status, serialisable_gallery_seed_log, serialisable_file_seed_cache ) = serialisable_info
+ ( self._query, self._display_name, self._check_now, self._last_check_time, self._next_check_time, self._paused, self._status, serialisable_gallery_seed_log, serialisable_file_seed_cache, serialisable_tag_import_options ) = serialisable_info

self._gallery_seed_log = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_gallery_seed_log )
self._file_seed_cache = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_file_seed_cache )
+ self._tag_import_options = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_tag_import_options )

def _UpdateSerialisableInfo( self, version, old_serialisable_info ):

@@ -1192,6 +1231,20 @@ class SubscriptionQuery( HydrusSerialisable.SerialisableBase ):

return ( 2, new_serialisable_info )

+ if version == 2:
+
+ ( query, check_now, last_check_time, next_check_time, paused, status, serialisable_gallery_seed_log, serialisable_file_seed_cache ) = old_serialisable_info
+
+ display_name = None
+ tag_import_options = ClientImportOptions.TagImportOptions()
+
+ serialisable_tag_import_options = tag_import_options.GetSerialisableTuple()
+
+ new_serialisable_info = ( query, display_name, check_now, last_check_time, next_check_time, paused, status, serialisable_gallery_seed_log, serialisable_file_seed_cache, serialisable_tag_import_options )
+
+ return ( 3, new_serialisable_info )

def CanWorkOnFiles( self ):

@@ -1263,6 +1316,11 @@ class SubscriptionQuery( HydrusSerialisable.SerialisableBase ):

self._gallery_seed_log.Compact( compact_before_this_time )

+ def GetDisplayName( self ):
+
+ return self._display_name

def GetFileSeedCache( self ):

return self._file_seed_cache

@@ -1273,6 +1331,18 @@ class SubscriptionQuery( HydrusSerialisable.SerialisableBase ):

return self._gallery_seed_log

+ def GetHumanName( self ):
+
+ if self._display_name is None:
+
+ return self._query
+
+ else:
+
+ return self._display_name

def GetLastChecked( self ):

return self._last_check_time

@@ -1323,6 +1393,11 @@ class SubscriptionQuery( HydrusSerialisable.SerialisableBase ):

return self._query

+ def GetTagImportOptions( self ):
+
+ return self._tag_import_options

def IsDead( self ):

return self._status == ClientImporting.CHECKER_STATUS_DEAD

@@ -1375,6 +1450,11 @@ class SubscriptionQuery( HydrusSerialisable.SerialisableBase ):

self._check_now = check_now

+ def SetDisplayName( self, display_name ):
+
+ self._display_name = display_name

def SetPaused( self, paused ):

self._paused = paused

@@ -1387,6 +1467,11 @@ class SubscriptionQuery( HydrusSerialisable.SerialisableBase ):

self._gallery_seed_log = gallery_seed_log

+ def SetTagImportOptions( self, tag_import_options ):
+
+ self._tag_import_options = tag_import_options

def UpdateNextCheckTime( self, checker_options ):

if self._check_now:

@@ -896,6 +896,14 @@ class WatcherImport( HydrusSerialisable.SerialisableBase ):

+ def CanRetryFailed( self ):
+
+ with self._lock:
+
+ return self._file_seed_cache.GetFileSeedCount( CC.STATUS_ERROR ) > 0

def CheckingPaused( self ):

with self._lock:

@@ -1176,6 +1184,14 @@ class WatcherImport( HydrusSerialisable.SerialisableBase ):

+ def RetryFailed( self ):
+
+ with self._lock:
+
+ self._file_seed_cache.RetryFailures()

def SetCheckerOptions( self, checker_options ):

with self._lock:

@@ -35,7 +35,7 @@ DID_SUBSTANTIAL_FILE_WORK_MINIMUM_SLEEP_TIME = 0.1

REPEATING_JOB_TYPICAL_PERIOD = 30.0

- def ConvertAllParseResultsToFileSeeds( all_parse_results, source_url ):
+ def ConvertAllParseResultsToFileSeeds( all_parse_results, source_url, file_import_options ):

file_seeds = []

@@ -49,7 +49,7 @@ def ConvertAllParseResultsToFileSeeds( all_parse_results, source_url ):

file_seed.SetReferralURL( source_url )

- file_seed.AddParseResults( parse_results )
+ file_seed.AddParseResults( parse_results, file_import_options )

file_seeds.append( file_seed )

@@ -115,6 +115,44 @@ class NetworkBandwidthManager( HydrusSerialisable.SerialisableBase ):

self._dirty = True

+ def AlreadyHaveExactlyTheseBandwidthRules( self, network_context, bandwidth_rules ):
+
+ with self._lock:
+
+ if network_context in self._network_contexts_to_bandwidth_rules:
+
+ if self._network_contexts_to_bandwidth_rules[ network_context ].GetSerialisableTuple() == bandwidth_rules.GetSerialisableTuple():
+
+ return True
+
+ return False
+
+ def AutoAddDomainMetadatas( self, domain_metadatas ):
+
+ for domain_metadata in domain_metadatas:
+
+ if not domain_metadata.HasBandwidthRules():
+
+ return
+
+ with self._lock:
+
+ domain = domain_metadata.GetDomain()
+
+ network_context = ClientNetworkingContexts.NetworkContext( CC.NETWORK_CONTEXT_DOMAIN, domain )
+
+ bandwidth_rules = domain_metadata.GetBandwidthRules()
+
+ self._network_contexts_to_bandwidth_rules[ network_context ] = bandwidth_rules

def CanContinueDownload( self, network_contexts ):

with self._lock:

@@ -322,6 +360,14 @@ class NetworkBandwidthManager( HydrusSerialisable.SerialisableBase ):

+ def HasRules( self, network_context ):
+
+ with self._lock:
+
+ return network_context in self._network_contexts_to_bandwidth_rules

def IsDirty( self ):

with self._lock:

@@ -1,4 +1,5 @@

import ClientConstants as CC
+ import ClientNetworkingContexts
import ClientParsing
import ClientThreading
import collections

@@ -6,6 +7,7 @@ import HydrusConstants as HC

import HydrusGlobals as HG
import HydrusData
+ import HydrusExceptions
import HydrusNetworking
import HydrusSerialisable
import os
import re

@@ -102,12 +104,17 @@ def ConvertQueryDictToText( query_dict ):

param_pairs.sort()

- query_text = u'&'.join( ( unicode( key ) + u'=' + unicode( value ) for ( key, value ) in param_pairs ) )
+ query_text = u'&'.join( ( HydrusData.ToUnicode( key ) + u'=' + HydrusData.ToUnicode( value ) for ( key, value ) in param_pairs ) )

return query_text

def ConvertQueryTextToDict( query_text ):

+ # we generally do not want quote characters, %20 stuff, in our urls. we would prefer regular ascii and even unicode
+
+ # first we will decode all unicode, which allows urllib to work
+ query_text = HydrusData.ToByteString( query_text )

query_dict = {}

pairs = query_text.split( '&' )

@@ -120,11 +127,26 @@ def ConvertQueryTextToDict( query_text ):

if len( result ) == 2:

+ # so, let's replace all keys and values with unquoted versions
+ # -but-
+ # we only replace if it is a completely reversible operation!
+ # odd situations like '6+girls+skirt', which comes here encoded as '6%2Bgirls+skirt', shouldn't turn into '6+girls+skirt'
+ # so if there is a mix of encoded and non-encoded, we won't touch it here m8
+
+ # we convert to unicode afterwards so %E5%B0%BB%E7%A5%9E%E6%A7%98 -> \xe5\xb0\xbb\xe7\xa5\x9e\xe6\xa7\x98 -> \u5c3b\u795e\u69d8

( key, value ) = result

try:

- key = urllib.unquote( key )
+ unquoted_key = urllib.unquote( key )
+
+ requoted_key = urllib.quote( unquoted_key )
+
+ if key == requoted_key:
+
+ key = HydrusData.ToUnicode( unquoted_key )

except:

@@ -133,7 +155,14 @@ def ConvertQueryTextToDict( query_text ):

try:

- value = urllib.unquote( value )
+ unquoted_value = urllib.unquote( value )
+
+ requoted_value = urllib.quote( unquoted_value )
+
+ if value == requoted_value:
+
+ value = HydrusData.ToUnicode( unquoted_value )

except:
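The round-trip guard above only accepts an unquote when re-quoting reproduces the exact original, so mixed strings are left alone rather than mangled. The diff's code is Python 2 (`urllib.unquote`, `HydrusData.ToUnicode`); a minimal sketch of the same idea in Python 3, using `urllib.parse`:

```python
from urllib.parse import quote, unquote

def safe_unquote( text ):
    
    # only unquote when the operation is fully reversible;
    # a mixed string like '6%2Bgirls+skirt' re-quotes to
    # '6%2Bgirls%2Bskirt', so it is left untouched
    unquoted = unquote( text )
    
    if quote( unquoted ) == text:
        
        return unquoted
        
    
    return text
```

Fully percent-encoded input like `%E5%B0%BB%E7%A5%9E%E6%A7%98` survives the round trip and decodes to its unicode form, while the partially encoded booru-style query stays as it arrived.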
@ -793,6 +822,41 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):
|
|||
        self.SetURLMatches( url_matches )
        
    
+    def AlreadyHaveExactlyTheseHeaders( self, network_context, headers_list ):
+        
+        with self._lock:
+            
+            if network_context in self._network_contexts_to_custom_header_dicts:
+                
+                custom_headers_dict = self._network_contexts_to_custom_header_dicts[ network_context ]
+                
+                if len( headers_list ) != len( custom_headers_dict ):
+                    
+                    return False
+                    
+                
+                for ( key, value, reason ) in headers_list:
+                    
+                    if key not in custom_headers_dict:
+                        
+                        return False
+                        
+                    
+                    ( existing_value, existing_approved, existing_reason ) = custom_headers_dict[ key ]
+                    
+                    if existing_value != value:
+                        
+                        return False
+                        
+                    
+                
+                return True
+                
+            
+            return False
+            
+        
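The exact-match test above can be exercised in isolation. A minimal Python 3 sketch with hypothetical data; the real manager also tracks an approved flag per header, which the comparison deliberately ignores.

```python
# hypothetical bundled headers for one domain: ( key, value, reason ) tuples
headers_list = [ ( 'User-Agent', 'MyClient/1.0', 'the site blocks the default agent' ) ]

# the import side stores them keyed by header name, with an approved flag
approved = False
custom_headers_dict = { key : ( value, approved, reason ) for ( key, value, reason ) in headers_list }

def already_have_exactly_these_headers( custom_headers_dict, headers_list ):
    
    # an exact match means importing the bundle would change nothing
    if len( headers_list ) != len( custom_headers_dict ):
        
        return False
        
    
    for ( key, value, reason ) in headers_list:
        
        if key not in custom_headers_dict:
            
            return False
            
        
        ( existing_value, existing_approved, existing_reason ) = custom_headers_dict[ key ]
        
        if existing_value != value:
            
            return False
            
        
    
    return True

print( already_have_exactly_these_headers( custom_headers_dict, headers_list ) )
```

This is why the export/import process can skip bundles the client already has: same key set, same values, nothing to do.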
    def AlreadyHaveExactlyThisGUG( self, new_gug ):
        
        with self._lock:

@@ -887,6 +951,30 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):
            return False
            
        
    
+    def AutoAddDomainMetadatas( self, domain_metadatas, approved = False ):
+        
+        for domain_metadata in domain_metadatas:
+            
+            if not domain_metadata.HasHeaders():
+                
+                continue
+                
+            
+            with self._lock:
+                
+                domain = domain_metadata.GetDomain()
+                
+                network_context = ClientNetworkingContexts.NetworkContext( CC.NETWORK_CONTEXT_DOMAIN, domain )
+                
+                headers_list = domain_metadata.GetHeaders()
+                
+                custom_headers_dict = { key : ( value, approved, reason ) for ( key, value, reason ) in headers_list }
+                
+                self._network_contexts_to_custom_header_dicts[ network_context ] = custom_headers_dict
+                
+            
+        
    def AutoAddURLMatchesAndParsers( self, new_url_matches, dupe_url_matches, new_parsers ):
        
        for url_match in new_url_matches:

@@ -1214,6 +1302,26 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):
+    def GetShareableCustomHeaders( self, network_context ):
+        
+        with self._lock:
+            
+            headers_list = []
+            
+            if network_context in self._network_contexts_to_custom_header_dicts:
+                
+                custom_header_dict = self._network_contexts_to_custom_header_dicts[ network_context ]
+                
+                for ( key, ( value, approved, reason ) ) in custom_header_dict.items():
+                    
+                    headers_list.append( ( key, value, reason ) )
+                    
+                
+            
+            return headers_list
+            
+        
    def GetURLMatch( self, url ):
        
        with self._lock:

@@ -1283,6 +1391,14 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):
+    def HasCustomHeaders( self, network_context ):
+        
+        with self._lock:
+            
+            return network_context in self._network_contexts_to_custom_header_dicts and len( self._network_contexts_to_custom_header_dicts[ network_context ] ) > 0
+            
+        
    def Initialise( self ):
        
        self._RecalcCache()

@@ -1829,6 +1945,127 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):
HydrusSerialisable.SERIALISABLE_TYPES_TO_OBJECT_TYPES[ HydrusSerialisable.SERIALISABLE_TYPE_NETWORK_DOMAIN_MANAGER ] = NetworkDomainManager

+class DomainMetadataPackage( HydrusSerialisable.SerialisableBase ):
+    
+    SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_DOMAIN_METADATA_PACKAGE
+    SERIALISABLE_NAME = 'Domain Metadata'
+    SERIALISABLE_VERSION = 1
+    
+    def __init__( self, domain = None, headers_list = None, bandwidth_rules = None ):
+        
+        HydrusSerialisable.SerialisableBase.__init__( self )
+        
+        if domain is None:
+            
+            domain = 'example.com'
+            
+        
+        self._domain = domain
+        self._headers_list = headers_list
+        self._bandwidth_rules = bandwidth_rules
+        
+    
+    def _GetSerialisableInfo( self ):
+        
+        if self._bandwidth_rules is None:
+            
+            serialisable_bandwidth_rules = self._bandwidth_rules
+            
+        else:
+            
+            serialisable_bandwidth_rules = self._bandwidth_rules.GetSerialisableTuple()
+            
+        
+        return ( self._domain, self._headers_list, serialisable_bandwidth_rules )
+        
+    
+    def _InitialiseFromSerialisableInfo( self, serialisable_info ):
+        
+        ( self._domain, self._headers_list, serialisable_bandwidth_rules ) = serialisable_info
+        
+        if serialisable_bandwidth_rules is None:
+            
+            self._bandwidth_rules = serialisable_bandwidth_rules
+            
+        else:
+            
+            self._bandwidth_rules = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_bandwidth_rules )
+            
+        
+    
+    def GetBandwidthRules( self ):
+        
+        return self._bandwidth_rules
+        
+    
+    def GetDetailedSafeSummary( self ):
+        
+        components = [ 'For domain "' + self._domain + '":' ]
+        
+        if self.HasBandwidthRules():
+            
+            m = 'Bandwidth rules: '
+            m += os.linesep
+            m += os.linesep.join( [ HydrusNetworking.ConvertBandwidthRuleToString( rule ) for rule in self._bandwidth_rules.GetRules() ] )
+            
+            components.append( m )
+            
+        
+        if self.HasHeaders():
+            
+            m = 'Headers: '
+            m += os.linesep
+            m += os.linesep.join( [ key + ' : ' + value + ' - ' + reason for ( key, value, reason ) in self._headers_list ] )
+            
+            components.append( m )
+            
+        
+        joiner = os.linesep * 2
+        
+        s = joiner.join( components )
+        
+        return s
+        
+    
+    def GetDomain( self ):
+        
+        return self._domain
+        
+    
+    def GetHeaders( self ):
+        
+        return self._headers_list
+        
+    
+    def GetSafeSummary( self ):
+        
+        components = []
+        
+        if self.HasBandwidthRules():
+            
+            components.append( 'bandwidth rules' )
+            
+        
+        if self.HasHeaders():
+            
+            components.append( 'headers' )
+            
+        
+        return ' and '.join( components ) + ' - ' + self._domain
+        
+    
+    def HasBandwidthRules( self ):
+        
+        return self._bandwidth_rules is not None
+        
+    
+    def HasHeaders( self ):
+        
+        return self._headers_list is not None
+        
+    
+HydrusSerialisable.SERIALISABLE_TYPES_TO_OBJECT_TYPES[ HydrusSerialisable.SERIALISABLE_TYPE_DOMAIN_METADATA_PACKAGE ] = DomainMetadataPackage
+
class DomainValidationPopupProcess( object ):
    
    def __init__( self, domain_manager, header_tuples ):

@@ -1988,12 +2225,28 @@ class GalleryURLGenerator( HydrusSerialisable.SerialisableBaseNamed ):
            # this basically fixes e621 searches for 'male/female', which through some httpconf trickery are embedded in the path but end up in a query, so they need to be encoded correctly beforehand
            # we need ToByteString as urllib.quote can't handle unicode hiragana etc...
            
-           search_terms = [ urllib.quote( HydrusData.ToByteString( search_term ), safe = '' ) for search_term in search_terms ]
+           encoded_search_terms = [ urllib.quote( HydrusData.ToByteString( search_term ), safe = '' ) for search_term in search_terms ]
            
+       else:
+           
+           # when the separator is '+' but the permitted tags might be '6+girls', we run into fun internet land
+           
+           encoded_search_terms = []
+           
+           for search_term in search_terms:
+               
+               if self._search_terms_separator in search_term:
+                   
+                   search_term = urllib.quote( HydrusData.ToByteString( search_term ), safe = '' )
+                   
+               
+               encoded_search_terms.append( search_term )
+               
+           
+       
        try:
            
-           search_phrase = self._search_terms_separator.join( search_terms )
+           search_phrase = self._search_terms_separator.join( encoded_search_terms )
            
            gallery_url = self._url_template.replace( self._replacement_phrase, search_phrase )
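The separator-aware branch above can be sketched standalone in Python 3 (the hunk itself is Python 2; encode_search_terms is a hypothetical name):

```python
from urllib.parse import quote

def encode_search_terms( search_terms, separator ):
    
    # a term that contains the separator (e.g. '6+girls' with a '+' separator)
    # must be percent-encoded, or it would split into two terms in the final URL
    encoded = []
    
    for term in search_terms:
        
        if separator in term:
            
            term = quote( term, safe = '' )
            
        
        encoded.append( term )
        
    
    return separator.join( encoded )

print( encode_search_terms( [ '6+girls', 'skirt' ], '+' ) )  # -> 6%2Bgirls+skirt
```

Terms without the separator in them pass through untouched, so ordinary searches keep their readable form.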
@@ -2015,6 +2268,11 @@ class GalleryURLGenerator( HydrusSerialisable.SerialisableBaseNamed ):
        return self.GenerateGalleryURL( self._example_search_text )
        
    
+    def GetExampleURLs( self ):
+        
+        return ( self.GetExampleURL(), )
+        
+    
    def GetGUGKey( self ):
        
        return self._gallery_url_generator_key

@@ -335,6 +335,8 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):
        self._dictionary[ 'default_file_import_options' ] = HydrusSerialisable.SerialisableDictionary()
        
        exclude_deleted = True
+       do_not_check_known_urls_before_importing = False
+       do_not_check_hashes_before_importing = False
        allow_decompression_bombs = False
        min_size = None
        max_size = None

@@ -343,6 +345,7 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):
        max_resolution = None
        
        automatic_archive = False
+       associate_source_urls = True
        
        present_new_files = True
        present_already_in_inbox_files = False

@@ -352,8 +355,8 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):
        quiet_file_import_options = ClientImportOptions.FileImportOptions()
        
-       quiet_file_import_options.SetPreImportOptions( exclude_deleted, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
-       quiet_file_import_options.SetPostImportOptions( automatic_archive )
+       quiet_file_import_options.SetPreImportOptions( exclude_deleted, do_not_check_known_urls_before_importing, do_not_check_hashes_before_importing, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
+       quiet_file_import_options.SetPostImportOptions( automatic_archive, associate_source_urls )
        quiet_file_import_options.SetPresentationOptions( present_new_files, present_already_in_inbox_files, present_already_in_archive_files )
        
        self._dictionary[ 'default_file_import_options' ][ 'quiet' ] = quiet_file_import_options

@@ -364,8 +367,8 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):
        loud_file_import_options = ClientImportOptions.FileImportOptions()
        
-       loud_file_import_options.SetPreImportOptions( exclude_deleted, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
-       loud_file_import_options.SetPostImportOptions( automatic_archive )
+       loud_file_import_options.SetPreImportOptions( exclude_deleted, do_not_check_known_urls_before_importing, do_not_check_hashes_before_importing, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
+       loud_file_import_options.SetPostImportOptions( automatic_archive, associate_source_urls )
        loud_file_import_options.SetPresentationOptions( present_new_files, present_already_in_inbox_files, present_already_in_archive_files )
        
        self._dictionary[ 'default_file_import_options' ][ 'loud' ] = loud_file_import_options

@@ -73,6 +73,7 @@ def SortTags( sort_by, tags_list, tags_to_count = None ):
        ( namespace, subtag ) = HydrusTags.SplitTag( tag )
        
+       comparable_namespace = HydrusTags.ConvertTagToSortable( namespace )
        comparable_subtag = HydrusTags.ConvertTagToSortable( subtag )
        
        # 'cat' < 'character:rei'

@@ -87,7 +88,7 @@ def SortTags( sort_by, tags_list, tags_to_count = None ):
        else:
            
-           return ( namespace, comparable_subtag )
+           return ( comparable_namespace, comparable_subtag )
            
        

@@ -109,7 +110,7 @@ def SortTags( sort_by, tags_list, tags_to_count = None ):
        if namespace == '':
            
-           namespace = '{' # '{' is above 'z' in ascii, so this works for most situations
+           namespace = u'{' # '{' is above 'z' in ascii, so this works for most situations
            
        
        return namespace
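The '{' trick above relies on plain ascii ordering and is easy to check directly (namespace_sort_key is a hypothetical name for this sketch):

```python
def namespace_sort_key( namespace ):
    
    # unnamespaced tags get '{', which sorts after every lowercase letter in ascii,
    # so they land after all namespaced tags in an ascending sort
    if namespace == '':
        
        namespace = u'{'
        
    
    return namespace

namespaces = [ 'character', '', 'series' ]

print( sorted( namespaces, key = namespace_sort_key ) )  # -> ['character', 'series', '']
```

It only "works for most situations" because a namespace that itself starts with '{', '|', '}' or '~' would still sort after the unnamespaced group.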
@@ -1,6 +1,7 @@
import numpy.core.multiarray # important this comes before cv!
import cv2
import ClientImageHandling
import HydrusData
import HydrusExceptions
import HydrusGlobals as HG
import HydrusImageHandling

@@ -58,6 +59,11 @@ class GIFRenderer( object ):
    def __init__( self, path, num_frames, target_resolution ):
        
+       if HG.media_load_report_mode:
+           
+           HydrusData.ShowText( 'Loading GIF: ' + path )
+           
+       
        self._path = path
        self._num_frames = num_frames
        self._target_resolution = target_resolution

@@ -136,6 +142,11 @@ class GIFRenderer( object ):
    def _InitialiseCV( self ):
        
+       if HG.media_load_report_mode:
+           
+           HydrusData.ShowText( 'Loading GIF with OpenCV' )
+           
+       
        self._cv_mode = True
        
        self._cv_video = cv2.VideoCapture( self._path )

@@ -148,6 +159,11 @@ class GIFRenderer( object ):
    def _InitialisePIL( self ):
        
+       if HG.media_load_report_mode:
+           
+           HydrusData.ShowText( 'Loading GIF with PIL' )
+           
+       
        self._cv_mode = False
        
        self._pil_image = HydrusImageHandling.GeneratePILImage( self._path )

@@ -187,6 +203,11 @@ class GIFRenderer( object ):
        if self._last_frame is None:
            
+           if HG.media_load_report_mode:
+               
+               HydrusData.ShowText( 'OpenCV Failed to render a frame' )
+               
+           
            self._InitialisePIL()
            
            numpy_image = self._RenderCurrentFrame()

@@ -49,7 +49,7 @@ options = {}
# Misc

NETWORK_VERSION = 18
-SOFTWARE_VERSION = 324
+SOFTWARE_VERSION = 325

UNSCALED_THUMBNAIL_DIMENSIONS = ( 200, 200 )

@@ -15,6 +15,7 @@ callto_report_mode = False
db_report_mode = False
db_profile_mode = False
file_report_mode = False
+media_load_report_mode = False
gui_report_mode = False
shortcut_report_mode = False
subscription_report_mode = False

@@ -28,7 +28,7 @@ def ConvertBandwidthRuleToString( rule ):
    elif bandwidth_type == HC.BANDWIDTH_TYPE_REQUESTS:
        
-       s = HydrusData.ToHumanInt( max_allowed )
+       s = HydrusData.ToHumanInt( max_allowed ) + ' rqs'
        
    
    if time_delta is None:

@@ -85,6 +85,7 @@ SERIALISABLE_TYPE_GALLERY_SEED_LOG = 67
SERIALISABLE_TYPE_GALLERY_IMPORT = 68
SERIALISABLE_TYPE_GALLERY_URL_GENERATOR = 69
SERIALISABLE_TYPE_NESTED_GALLERY_URL_GENERATOR = 70
+SERIALISABLE_TYPE_DOMAIN_METADATA_PACKAGE = 71

SERIALISABLE_TYPES_TO_OBJECT_TYPES = {}

@@ -283,7 +283,7 @@ def SplitTag( tag ):
    else:
        
-       return ( '', tag )
+       return ( u'', tag )
        
    
def StripTextOfGumpf( t ):

@@ -1,3 +1,4 @@
+import json
import re

re_newlines = re.compile( '[\r\n]+', re.UNICODE )

@@ -21,9 +22,23 @@ def DeserialiseNewlinedTexts( text ):
    return texts
    
+def LooksLikeHTML( file_data ):
+    
+    # this will false-positive if it is json that contains html, ha ha
+    
+    return '<html' in file_data or '<HTML' in file_data
+    
+def LooksLikeJSON( file_data ):
+    
+    try:
+        
+        json.loads( file_data )
+        
+        return True
+        
+    except:
+        
+        return False
+        
+    
def RemoveNewlines( text ):
    
    text = re.sub( r'\r|\n', '', text )
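The two sniffing helpers added above are straightforward to exercise. A Python 3 sketch (lowercase names are hypothetical, and the hunk's bare except is narrowed to ValueError here):

```python
import json

def looks_like_html( file_data ):
    
    # will false-positive on json that happens to contain an html fragment
    return '<html' in file_data or '<HTML' in file_data

def looks_like_json( file_data ):
    
    # anything json.loads accepts counts, including bare numbers and strings
    try:
        
        json.loads( file_data )
        
        return True
        
    except ValueError:
        
        return False
        
    

print( looks_like_json( '{"posts": []}' ) )           # -> True
print( looks_like_json( '<html><body></body></html>' ) )  # -> False
```

json.JSONDecodeError subclasses ValueError, so catching ValueError covers the failed-parse case without swallowing unrelated errors.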
@@ -199,6 +199,8 @@ class TestFileImportOptions( unittest.TestCase ):
        file_import_options = ClientImportOptions.FileImportOptions()
        
        exclude_deleted = False
+       do_not_check_known_urls_before_importing = False
+       do_not_check_hashes_before_importing = False
        allow_decompression_bombs = False
        min_size = None
        max_size = None

@@ -206,11 +208,12 @@ class TestFileImportOptions( unittest.TestCase ):
        min_resolution = None
        max_resolution = None
        
-       file_import_options.SetPreImportOptions( exclude_deleted, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
+       file_import_options.SetPreImportOptions( exclude_deleted, do_not_check_known_urls_before_importing, do_not_check_hashes_before_importing, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
        
        automatic_archive = False
+       associate_source_urls = False
        
-       file_import_options.SetPostImportOptions( automatic_archive )
+       file_import_options.SetPostImportOptions( automatic_archive, associate_source_urls )
        
        present_new_files = True
        present_already_in_inbox_files = True

@@ -223,6 +226,7 @@ class TestFileImportOptions( unittest.TestCase ):
        self.assertFalse( file_import_options.ExcludesDeleted() )
        self.assertFalse( file_import_options.AllowsDecompressionBombs() )
        self.assertFalse( file_import_options.AutomaticallyArchives() )
+       self.assertFalse( file_import_options.ShouldAssociateSourceURLs() )
        
        file_import_options.CheckFileIsValid( 65536, HC.IMAGE_JPEG, 640, 480 )
        file_import_options.CheckFileIsValid( 65536, HC.APPLICATION_7Z, None, None )

@@ -238,7 +242,7 @@ class TestFileImportOptions( unittest.TestCase ):
        exclude_deleted = True
        
-       file_import_options.SetPreImportOptions( exclude_deleted, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
+       file_import_options.SetPreImportOptions( exclude_deleted, do_not_check_known_urls_before_importing, do_not_check_hashes_before_importing, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
        
        self.assertTrue( file_import_options.ExcludesDeleted() )
        self.assertFalse( file_import_options.AllowsDecompressionBombs() )

@@ -248,7 +252,7 @@ class TestFileImportOptions( unittest.TestCase ):
        allow_decompression_bombs = True
        
-       file_import_options.SetPreImportOptions( exclude_deleted, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
+       file_import_options.SetPreImportOptions( exclude_deleted, do_not_check_known_urls_before_importing, do_not_check_hashes_before_importing, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
        
        self.assertTrue( file_import_options.ExcludesDeleted() )
        self.assertTrue( file_import_options.AllowsDecompressionBombs() )

@@ -257,18 +261,20 @@ class TestFileImportOptions( unittest.TestCase ):
        #
        
        automatic_archive = True
+       associate_source_urls = True
        
-       file_import_options.SetPostImportOptions( automatic_archive )
+       file_import_options.SetPostImportOptions( automatic_archive, associate_source_urls )
        
        self.assertTrue( file_import_options.ExcludesDeleted() )
        self.assertTrue( file_import_options.AllowsDecompressionBombs() )
        self.assertTrue( file_import_options.AutomaticallyArchives() )
+       self.assertTrue( file_import_options.ShouldAssociateSourceURLs() )
        
        #
        
        min_size = 4096
        
-       file_import_options.SetPreImportOptions( exclude_deleted, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
+       file_import_options.SetPreImportOptions( exclude_deleted, do_not_check_known_urls_before_importing, do_not_check_hashes_before_importing, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
        
        file_import_options.CheckFileIsValid( 65536, HC.IMAGE_JPEG, 640, 480 )

@@ -282,7 +288,7 @@ class TestFileImportOptions( unittest.TestCase ):
        min_size = None
        max_size = 2000
        
-       file_import_options.SetPreImportOptions( exclude_deleted, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
+       file_import_options.SetPreImportOptions( exclude_deleted, do_not_check_known_urls_before_importing, do_not_check_hashes_before_importing, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
        
        file_import_options.CheckFileIsValid( 1800, HC.IMAGE_JPEG, 640, 480 )

@@ -296,7 +302,7 @@ class TestFileImportOptions( unittest.TestCase ):
        max_size = None
        max_gif_size = 2000
        
-       file_import_options.SetPreImportOptions( exclude_deleted, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
+       file_import_options.SetPreImportOptions( exclude_deleted, do_not_check_known_urls_before_importing, do_not_check_hashes_before_importing, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
        
        file_import_options.CheckFileIsValid( 1800, HC.IMAGE_JPEG, 640, 480 )
        file_import_options.CheckFileIsValid( 2200, HC.IMAGE_JPEG, 640, 480 )

@@ -313,7 +319,7 @@ class TestFileImportOptions( unittest.TestCase ):
        max_gif_size = None
        min_resolution = ( 200, 100 )
        
-       file_import_options.SetPreImportOptions( exclude_deleted, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
+       file_import_options.SetPreImportOptions( exclude_deleted, do_not_check_known_urls_before_importing, do_not_check_hashes_before_importing, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
        
        file_import_options.CheckFileIsValid( 65536, HC.IMAGE_JPEG, 640, 480 )

@@ -334,7 +340,7 @@ class TestFileImportOptions( unittest.TestCase ):
        min_resolution = None
        max_resolution = ( 3000, 4000 )
        
-       file_import_options.SetPreImportOptions( exclude_deleted, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
+       file_import_options.SetPreImportOptions( exclude_deleted, do_not_check_known_urls_before_importing, do_not_check_hashes_before_importing, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution )
        
        file_import_options.CheckFileIsValid( 65536, HC.IMAGE_JPEG, 640, 480 )

@@ -748,8 +748,9 @@ class TestClientDB( unittest.TestCase ):
        ( written_status, written_note ) = self._write( 'import_file', file_import_job )
        
-       self.assertEqual( written_status, CC.STATUS_SUCCESSFUL_BUT_REDUNDANT )
-       self.assertTrue( len( written_note ) > 0 )
+       # would be redundant, but triggers the 'it is missing from db' hook
+       self.assertEqual( written_status, CC.STATUS_SUCCESSFUL_AND_NEW )
+       self.assertIn( 'already in the db', written_note )
        self.assertEqual( file_import_job.GetHash(), hash )
        
        written_hash = file_import_job.GetHash()

@@ -872,9 +873,15 @@ class TestClientDB( unittest.TestCase ):
        #
        
-       ( status, hash, note ) = self._read( 'hash_status', 'md5', md5 )
+       ( status, written_hash, note ) = self._read( 'hash_status', 'md5', md5 )
        
-       self.assertEqual( ( status, hash ), ( CC.STATUS_SUCCESSFUL_BUT_REDUNDANT, hash ) )
+       # would be redundant, but sometimes(?) triggers the 'it is missing from db' hook
+       self.assertIn( status, ( CC.STATUS_UNKNOWN, CC.STATUS_SUCCESSFUL_BUT_REDUNDANT ) )
+       self.assertEqual( written_hash, hash )
+       
+       if status == CC.STATUS_UNKNOWN:
+           
+           self.assertIn( 'already in the db', note )
+           
+       
        #
Binary file not shown. Before: 2.4 KiB. After: 2.4 KiB.