Version 293

@@ -8,6 +8,52 @@
 <div class="content">
 <h3>changelog</h3>
 <ul>
+<li><h3>version 293</h3></li>
+<ul>
+<li>fixed the issue with options dialog not opening again after a save--I apologise for the inconvenience</li>
+<li>the default system:time imported predicate (as set in options) is reset to the default '< 7 days'</li>
+<li>fixed another potential crash with the manage tags dialog (in fact with any dialog that has an embedded autocomplete or any other autocomplete with a non-floating dropdown list, such as the import file path tagging dialog)</li>
+<li>the sankaku user-agent has been updated to be a generic Firefox string "Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:56.0) Gecko/20100101 Firefox/56.0", which seems to work with their new rules. if your sank has been broken and you haven't touched the settings that do this stuff, your sank should be magically fixed on update</li>
+<li>the default global hydrus user-agent has been updated to the more polite and compatible "Mozilla/5.0 (compatible; Hydrus Client)". if your previous global user-agent is currently 'hydrus client' (the old default), you'll be updated</li>
+<li>import folders now deal with interruptions more skillfully and take a break and save their progress every ten minutes</li>
+<li>the manage import folders dialog now waits properly for any currently running import folders to quit and save themselves before opening itself (like how the manage subs dialog does it)</li>
+<li>import folders deal with some error states in a better way and are able to cancel out of their loops more reliably when an error occurs</li>
+<li>added a new checkbox option to options->tags that lets your 'write' autocomplete inputs (ones where you can submit new tags, like in manage tags) select the first result with non-zero count by default. this means it will skip over the first result if it is just nonsense like 'boyp' that you typed to get the results</li>
+<li>linked all the final behind the scenes stuff in the thread watcher together with the new parsing system. this stuff remains complicated and non-user-friendly, so please feel free to completely ignore it and the following batch of points:</li>
+<li>-</li>
+<li>wrote new url classes and parsers for 4chan and 8chan--they should all be automatically linked up on update</li>
+<li>the new thread parsers will use the first part of the first post's comment if no subject is set</li>
+<li>the domain manager can now cope with URL->API URL links</li>
+<li>manage url class links now displays the current expected api pairs of URL class links and filters out what cannot be currently parsed as a result</li>
+<li>manage url class links now only displays watchable urls in the bottom listctrl and permits parsers to be linked!</li>
+<li>the domain manager can now try to link url classes with parsers based on example urls!</li>
+<li>a button to fill in missing links is now available on the manage url class links panel</li>
+<li>content parsers that produce urls now define different url types--file, post, and 'next gallery page'</li>
+<li>content parsers can now parse a 'source timestamp' type</li>
+<li>veto content parsers now use a StringMatch object to do their matching. existing veto content parsers will be updated and _should_ work about the same as before (unless the previous string was crazy in regex rules) using a regex match</li>
+<li>html formulas now pick up the 'string' value more reliably in complicated or mangled html</li>
+<li>html and json parsing formulas can now work without any rules, in which case they action the top level node (this is useful to replicate the json/html for subsidiary page parsing)</li>
+<li>activated some of the new file import objects' features for the new parsing system--now, they store tags and known hashes as informed by what is parsed. this information is tested and used in import</li>
+<li>the db can now test all the hash types--md5, sha1, sha256, sha512--during pre-import checking</li>
+<li>the new parsing system now takes a temporary 'parsing context' that will eventually be able to receive and serve temporary variables but at the moment just holds a 'url' entry if the parser would like to use the url used to fetch the data anywhere</li>
+<li>all the new parsing ui now has a button to edit the parsing context, and the example parsing context is saved to the topmost page parser object</li>
+<li>wrote some url classes for 420chan.org--they will be added on update</li>
+<li>fixed an issue with the 4chan thread url class</li>
+<li>-</li>
+<li>the standard new listctrl panel wrapper can now provide a special 'export to pngs' for certain named objects that will quickly export one object per png</li>
+<li>the 'import from png' dialog as launched from the standard new listctrl panel wrapper now allows multiple png selection</li>
+<li>the default parsers (and now the url classes as well) are stored in install_dir/static/defaults. they are read from here when you create a db or ask to load from defaults, so feel free to play around with this</li>
+<li>the manage services dialog now has an additional Yes/No warning/confirmation dialog if it expects to delete any services on dialog ok</li>
+<li>page tabs escape mnemonic characters (typically '&', which in ui labelling rules makes the following character an underlined shortcut) more reliably</li>
+<li>in an attempt to stop subs making so many separate small file popups, the subscription daemon will now only naturally check every four hours (previously one hour) and will wait up to ten minutes for bandwidth to free up before dumping out (previously 30 seconds). these values will be user-configurable in a future update</li>
+<li>fixed import for some unusual apngs</li>
+<li>corrected some more GMT/local time stuff with the new system:time imported 'date' mode</li>
+<li>misc GMT/local time fixes (profile logs names themselves correctly, etc..)</li>
+<li>improved how the thread watcher reports thread-check errors in unusual situations</li>
+<li>fixed an issue generating network contexts when the initial domain is 'localhost'</li>
+<li>improved some misc program stability</li>
+<li>started a new job scheduler object to reduce idle thread count and CPU usage--will play around with it a bit more and see about integrating it in the coming weeks</li>
+</ul>
 <li><h3>version 292</h3></li>
 <ul>
 <li>extended system:age to support searching by fixed calendar date, e.g. "system:age > 2018/01/24"</li>
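The changelog notes that the db can now test all four hash types--md5, sha1, sha256, sha512--during pre-import checking. As a minimal standalone sketch (a generic illustration of computing all four digests in one pass, not hydrus's own code):

```python
import hashlib

def compute_hashes(data):
    # compute the four digest types the client can now check before import
    hashers = {name: hashlib.new(name) for name in ('md5', 'sha1', 'sha256', 'sha512')}
    for hasher in hashers.values():
        hasher.update(data)
    return {name: hasher.hexdigest() for name, hasher in hashers.items()}

digests = compute_hashes(b'example file bytes')
```

In the real client these digests are then checked against the db to decide whether a parsed file is already known, as sketched in the `_GetHashStatus` hunk below.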
@@ -660,7 +660,7 @@ class Controller( HydrusController.HydrusController ):
         self._daemons.append( HydrusThreading.DAEMONWorker( self, 'SaveDirtyObjects', ClientDaemons.DAEMONSaveDirtyObjects, ( 'important_dirt_to_clean', ), period = 30 ) )
         
         self._daemons.append( HydrusThreading.DAEMONForegroundWorker( self, 'DownloadFiles', ClientDaemons.DAEMONDownloadFiles, ( 'notify_new_downloads', 'notify_new_permissions' ) ) )
-        self._daemons.append( HydrusThreading.DAEMONForegroundWorker( self, 'SynchroniseSubscriptions', ClientDaemons.DAEMONSynchroniseSubscriptions, ( 'notify_restart_subs_sync_daemon', 'notify_new_subscriptions' ), init_wait = 60, pre_call_wait = 3 ) )
+        self._daemons.append( HydrusThreading.DAEMONForegroundWorker( self, 'SynchroniseSubscriptions', ClientDaemons.DAEMONSynchroniseSubscriptions, ( 'notify_restart_subs_sync_daemon', 'notify_new_subscriptions' ), period = 14400, init_wait = 60, pre_call_wait = 3 ) )
         self._daemons.append( HydrusThreading.DAEMONForegroundWorker( self, 'CheckImportFolders', ClientDaemons.DAEMONCheckImportFolders, ( 'notify_restart_import_folders_daemon', 'notify_new_import_folders' ), period = 180 ) )
         self._daemons.append( HydrusThreading.DAEMONForegroundWorker( self, 'CheckExportFolders', ClientDaemons.DAEMONCheckExportFolders, ( 'notify_restart_export_folders_daemon', 'notify_new_export_folders' ), period = 180 ) )
         self._daemons.append( HydrusThreading.DAEMONForegroundWorker( self, 'MaintainTrash', ClientDaemons.DAEMONMaintainTrash, init_wait = 120 ) )
@@ -4983,13 +4983,47 @@ class DB( HydrusDB.HydrusDB ):
         return ( CC.STATUS_NEW, None, '' )
         
     
-    def _GetHashStatus( self, hash ):
+    def _GetHashStatus( self, hash_type, hash ):
         
-        hash_id = self._GetHashId( hash )
-        
-        ( status, hash, note ) = self._GetHashIdStatus( hash_id )
-        
-        return status
+        if hash_type == 'sha256':
+            
+            if not self._HashExists( hash ):
+                
+                return ( CC.STATUS_NEW, hash, '' )
+                
+            else:
+                
+                hash_id = self._GetHashId( hash )
+                
+                return self._GetHashIdStatus( hash_id )
+                
+            
+        else:
+            
+            if hash_type == 'md5':
+                
+                result = self._c.execute( 'SELECT hash_id FROM local_hashes WHERE md5 = ?;', ( sqlite3.Binary( hash ), ) ).fetchone()
+                
+            elif hash_type == 'sha1':
+                
+                result = self._c.execute( 'SELECT hash_id FROM local_hashes WHERE sha1 = ?;', ( sqlite3.Binary( hash ), ) ).fetchone()
+                
+            elif hash_type == 'sha512':
+                
+                result = self._c.execute( 'SELECT hash_id FROM local_hashes WHERE sha512 = ?;', ( sqlite3.Binary( hash ), ) ).fetchone()
+                
+            
+            if result is None:
+                
+                return ( CC.STATUS_NEW, None, '' )
+                
+            else:
+                
+                ( hash_id, ) = result
+                
+                return self._GetHashIdStatus( hash_id )
+                
+            
+        
+    
     def _GetHydrusSessions( self ):
@@ -5091,22 +5125,6 @@ class DB( HydrusDB.HydrusDB ):
         return value
         
     
-    def _GetMD5Status( self, md5 ):
-        
-        result = self._c.execute( 'SELECT hash_id FROM local_hashes WHERE md5 = ?;', ( sqlite3.Binary( md5 ), ) ).fetchone()
-        
-        if result is None:
-            
-            return ( CC.STATUS_NEW, None, '' )
-            
-        else:
-            
-            ( hash_id, ) = result
-            
-            return self._GetHashIdStatus( hash_id )
-            
-        
-    
     def _GetMediaResults( self, hash_ids ):
         
         # get first detailed results
@@ -8048,7 +8066,6 @@ class DB( HydrusDB.HydrusDB ):
         elif action == 'local_booru_share': result = self._GetYAMLDump( YAML_DUMP_ID_LOCAL_BOORU, *args, **kwargs )
         elif action == 'local_booru_shares': result = self._GetYAMLDump( YAML_DUMP_ID_LOCAL_BOORU )
         elif action == 'maintenance_due': result = self._MaintenanceDue( *args, **kwargs )
-        elif action == 'md5_status': result = self._GetMD5Status( *args, **kwargs )
         elif action == 'media_results': result = self._GetMediaResultsFromHashes( *args, **kwargs )
         elif action == 'media_results_from_ids': result = self._GetMediaResults( *args, **kwargs )
         elif action == 'missing_repository_update_hashes': result = self._GetRepositoryUpdateHashesIDoNotHave( *args, **kwargs )
@@ -10242,7 +10259,7 @@ class DB( HydrusDB.HydrusDB ):
         
         if version == 292:
             
-            if HC.SOFTWARE_VERSION == 293: # I don't need this info fifty weeks from now, so we'll just do it for the one week
+            if HC.SOFTWARE_VERSION < 296: # I don't need this info fifty weeks from now, so we'll just do it for a bit
                 
                 try:
@@ -10280,6 +10297,108 @@ class DB( HydrusDB.HydrusDB ):
             
+            #
+            
+            try:
+                
+                domain_manager = self._GetJSONDump( HydrusSerialisable.SERIALISABLE_TYPE_NETWORK_DOMAIN_MANAGER )
+                
+                #
+                
+                url_matches = ClientDefaults.GetDefaultURLMatches()
+                
+                url_matches = [ url_match for url_match in url_matches if '420chan' in url_match.GetName() or url_match.GetName() == '4chan thread' ]
+                
+                existing_url_matches = [ url_match for url_match in domain_manager.GetURLMatches() if url_match.GetName() != '4chan thread' ]
+                
+                url_matches.extend( existing_url_matches )
+                
+                domain_manager.SetURLMatches( url_matches )
+                
+                #
+                
+                parsers = ClientDefaults.GetDefaultParsers()
+                
+                domain_manager.SetParsers( parsers )
+                
+                #
+                
+                domain_manager.TryToLinkURLMatchesAndParsers()
+                
+                #
+                
+                network_contexts_to_custom_header_dicts = domain_manager.GetNetworkContextsToCustomHeaderDicts()
+                
+                #
+                
+                # sank changed again, wew
+                
+                sank_network_context = ClientNetworking.NetworkContext( CC.NETWORK_CONTEXT_DOMAIN, 'sankakucomplex.com' )
+                
+                if sank_network_context in network_contexts_to_custom_header_dicts:
+                    
+                    custom_header_dict = network_contexts_to_custom_header_dicts[ sank_network_context ]
+                    
+                    if 'User-Agent' in custom_header_dict:
+                        
+                        ( value, approved, reason ) = custom_header_dict[ 'User-Agent' ]
+                        
+                        if value.startswith( 'SCChannelApp' ):
+                            
+                            value = 'Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:56.0) Gecko/20100101 Firefox/56.0'
+                            
+                            custom_header_dict[ 'User-Agent' ] = ( value, approved, reason )
+                            
+                        
+                    
+                
+                if ClientNetworking.GLOBAL_NETWORK_CONTEXT in network_contexts_to_custom_header_dicts:
+                    
+                    custom_header_dict = network_contexts_to_custom_header_dicts[ ClientNetworking.GLOBAL_NETWORK_CONTEXT ]
+                    
+                    if 'User-Agent' in custom_header_dict:
+                        
+                        ( value, approved, reason ) = custom_header_dict[ 'User-Agent' ]
+                        
+                        if value == 'hydrus client':
+                            
+                            value = 'Mozilla/5.0 (compatible; Hydrus Client)'
+                            
+                            custom_header_dict[ 'User-Agent' ] = ( value, approved, reason )
+                            
+                        
+                    
+                
+                domain_manager.SetNetworkContextsToCustomHeaderDicts( network_contexts_to_custom_header_dicts )
+                
+                #
+                
+                self._SetJSONDump( domain_manager )
+                
+            except Exception as e:
+                
+                HydrusData.PrintException( e )
+                
+                self.pub_initial_message( 'The client was unable to add some new domain and parsing data. The error has been written to the log--hydrus_dev would be interested in this information.' )
+                
+            
+            #
+            
+            try:
+                
+                options = self._GetOptions()
+                
+                options[ 'file_system_predicates' ][ 'age' ] = ( '<', 'delta', ( 0, 0, 7, 0 ) )
+                
+                self._SaveOptions( options )
+                
+            except Exception as e:
+                
+                HydrusData.PrintException( e )
+                
+                self.pub_initial_message( 'The client was unable to fix a predicate issue. The error has been written to the log--hydrus_dev would be interested in this information.' )
+                
+            
         
         self._controller.pub( 'splash_set_title_text', 'updated db to v' + str( version + 1 ) )
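The update step above swaps the old default global user-agent for the new polite one while preserving the approved/reason metadata. The core of that logic as a pure function (names are illustrative; the real code works on the domain manager's header dicts in place):

```python
NEW_DEFAULT_UA = 'Mozilla/5.0 (compatible; Hydrus Client)'

def update_user_agent(custom_header_dict):
    # only touch the header if it is still the old default, 'hydrus client'
    if 'User-Agent' in custom_header_dict:
        (value, approved, reason) = custom_header_dict['User-Agent']
        if value == 'hydrus client':
            custom_header_dict['User-Agent'] = (NEW_DEFAULT_UA, approved, reason)
    return custom_header_dict
```

Guarding on the exact old default means a user who has set a custom user-agent is left untouched, which is why the changelog promises the update only fires "if your previous global user-agent is currently 'hydrus client'".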
@@ -35,18 +35,27 @@ def DAEMONCheckImportFolders( controller ):
     
     if not controller.options[ 'pause_import_folders_sync' ]:
         
+        HG.import_folders_running = True
+        
         import_folder_names = controller.Read( 'serialisable_names', HydrusSerialisable.SERIALISABLE_TYPE_IMPORT_FOLDER )
         
-        for name in import_folder_names:
+        try:
             
-            import_folder = controller.Read( 'serialisable_named', HydrusSerialisable.SERIALISABLE_TYPE_IMPORT_FOLDER, name )
-            
-            if controller.options[ 'pause_import_folders_sync' ]:
+            for name in import_folder_names:
                 
-                break
+                import_folder = controller.Read( 'serialisable_named', HydrusSerialisable.SERIALISABLE_TYPE_IMPORT_FOLDER, name )
                 
+                if controller.options[ 'pause_import_folders_sync' ] or HG.view_shutdown:
+                    
+                    break
+                    
+                
+                import_folder.DoWork()
+                
             
-            import_folder.DoWork()
+        finally:
             
+            HG.import_folders_running = False
+            
         
     
@@ -874,6 +874,8 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):
         
         self._dictionary[ 'booleans' ][ 'process_subs_in_random_order' ] = True
         
+        self._dictionary[ 'booleans' ][ 'ac_select_first_with_count' ] = False
+        
         #
         
         self._dictionary[ 'colours' ] = HydrusSerialisable.SerialisableDictionary()
@@ -1864,7 +1866,10 @@ class Credentials( HydrusData.HydrusYAMLBase ):
         
         HydrusData.HydrusYAMLBase.__init__( self )
         
-        if host == 'localhost': host = '127.0.0.1'
+        if host == 'localhost':
+            
+            host = '127.0.0.1'
+            
         
         self._host = host
         self._port = port
@@ -880,14 +880,23 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
         
         self._bandwidth_timer = None
         
         
         if self._page_update_timer is not None:
             
             self._page_update_timer.Stop()
             
             self._page_update_timer = None
             
         
         if self._ui_update_timer is not None:
             
             self._ui_update_timer.Stop()
             
             self._ui_update_timer = None
             
         
         if self._animation_update_timer is not None:
             
             self._animation_update_timer.Stop()
             
             self._animation_update_timer = None
             
@@ -926,11 +935,18 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
             ip = response[ 'ip' ]
             timestamp = response[ 'timestamp' ]
             
-            text = 'File Hash: ' + hash.encode( 'hex' ) + os.linesep + 'Uploader\'s IP: ' + ip + os.linesep + 'Upload Time (GMT): ' + time.asctime( time.gmtime( int( timestamp ) ) )
+            gmt_time = HydrusData.ConvertTimestampToPrettyTime( timestamp, in_gmt = True )
+            local_time = HydrusData.ConvertTimestampToPrettyTime( timestamp )
+            
+            text = 'File Hash: ' + hash.encode( 'hex' )
+            text += os.linesep
+            text += 'Uploader\'s IP: ' + ip
+            text += os.linesep
+            text += 'Upload Time (GMT): ' + gmt_time
+            text += os.linesep
+            text += 'Upload Time (Your time): ' + local_time
             
             HydrusData.Print( text )
             
-            wx.MessageBox( text + os.linesep + 'This has been written to the log.' )
+            wx.MessageBox( text + os.linesep * 2 + 'This has been written to the log.' )
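The hunk above renders the same timestamp in both GMT and the user's local time. `ConvertTimestampToPrettyTime` is hydrus's own helper; as a hedged stand-in, the stdlib `time` module can do the same split:

```python
import time

def pretty_times(timestamp):
    # render one epoch timestamp two ways, as the new dialog text does
    gmt_time = time.asctime(time.gmtime(timestamp))
    local_time = time.asctime(time.localtime(timestamp))
    return (gmt_time, local_time)

(gmt_time, local_time) = pretty_times(1516752000)
```

Keeping both strings side by side avoids the GMT/local confusion the changelog's other timezone fixes address.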
@@ -1890,21 +1906,64 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
     
     def _ManageImportFolders( self ):
         
-        original_pause_status = HC.options[ 'pause_import_folders_sync' ]
-        
-        HC.options[ 'pause_import_folders_sync' ] = True
-        
-        try:
+        def wx_do_it():
             
+            if not self:
+                
+                return
+                
+            
             with ClientGUIDialogsManage.DialogManageImportFolders( self ) as dlg:
                 
                 dlg.ShowModal()
                 
             
-        finally:
+        
+        def THREAD_do_it( controller ):
             
-            HC.options[ 'pause_import_folders_sync' ] = original_pause_status
+            original_pause_status = controller.options[ 'pause_import_folders_sync' ]
+            
+            controller.options[ 'pause_import_folders_sync' ] = True
+            
+            try:
+                
+                if HG.import_folders_running:
+                    
+                    job_key = ClientThreading.JobKey()
+                    
+                    try:
+                        
+                        job_key.SetVariable( 'popup_text_1', 'Waiting for import folders to finish.' )
+                        
+                        controller.pub( 'message', job_key )
+                        
+                        while HG.import_folders_running:
+                            
+                            time.sleep( 0.1 )
+                            
+                            if HG.view_shutdown:
+                                
+                                return
+                                
+                            
+                        
+                    finally:
+                        
+                        job_key.Delete()
+                        
+                    
+                
+                controller.CallBlockingToWx( wx_do_it )
+                
+            finally:
+                
+                controller.options[ 'pause_import_folders_sync' ] = original_pause_status
+                
+                controller.pub( 'notify_new_import_folders' )
+                
+            
+        
+        self._controller.CallToThread( THREAD_do_it, self._controller )
         
     
     def _ManageNetworkHeaders( self ):
@@ -2046,6 +2105,11 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
         
         def wx_do_it():
             
+            if not self:
+                
+                return
+                
+            
             title = 'manage subscriptions'
             frame_key = 'manage_subscriptions_dialog'
@@ -2059,11 +2123,11 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
         
-        def THREAD_do_it():
+        def THREAD_do_it( controller ):
             
-            original_pause_status = self._controller.options[ 'pause_subs_sync' ]
+            original_pause_status = controller.options[ 'pause_subs_sync' ]
             
-            self._controller.options[ 'pause_subs_sync' ] = True
+            controller.options[ 'pause_subs_sync' ] = True
             
             try:
@@ -2075,7 +2139,7 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
                     
                     job_key.SetVariable( 'popup_text_1', 'Waiting for subs to finish.' )
                     
-                    self._controller.pub( 'message', job_key )
+                    controller.pub( 'message', job_key )
                     
                     while HG.subscriptions_running:
@@ -2093,17 +2157,17 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
                 
-            self._controller.CallBlockingToWx( wx_do_it )
+            controller.CallBlockingToWx( wx_do_it )
             
             finally:
                 
-                self._controller.options[ 'pause_subs_sync' ] = original_pause_status
+                controller.options[ 'pause_subs_sync' ] = original_pause_status
                 
-                HG.client_controller.pub( 'notify_new_subscriptions' )
+                controller.pub( 'notify_new_subscriptions' )
                 
             
         
-        self._controller.CallToThread( THREAD_do_it )
+        self._controller.CallToThread( THREAD_do_it, self._controller )
         
     
     def _ManageTagCensorship( self ):
@@ -2150,15 +2214,14 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
        
         with ClientGUITopLevelWindows.DialogEdit( self, title ) as dlg:
             
-            # eventually make this a proper management panel with several notebook pages or something
-            
             domain_manager = self._controller.network_engine.domain_manager
             
             url_matches = domain_manager.GetURLMatches()
+            parsers = domain_manager.GetParsers()
             
             ( url_match_keys_to_display, url_match_keys_to_parser_keys ) = domain_manager.GetURLMatchLinks()
             
-            panel = ClientGUIScrolledPanelsEdit.EditURLMatchLinksPanel( dlg, self._controller.network_engine, url_matches, url_match_keys_to_display, url_match_keys_to_parser_keys )
+            panel = ClientGUIScrolledPanelsEdit.EditURLMatchLinksPanel( dlg, self._controller.network_engine, url_matches, parsers, url_match_keys_to_display, url_match_keys_to_parser_keys )
             
             dlg.SetPanel( panel )
@@ -3581,7 +3644,9 @@ The password is cleartext here but obscured in the entry dialog. Enter a blank p
     
     def ImportURL( self, url ):
         
-        ( url_type, match_name, can_parse ) = self._controller.network_engine.domain_manager.GetURLParseCapability( url )
+        domain_manager = self._controller.network_engine.domain_manager
+        
+        ( url_type, match_name, can_parse ) = domain_manager.GetURLParseCapability( url )
         
         if url_type in ( HC.URL_TYPE_UNKNOWN, HC.URL_TYPE_FILE ):
@@ -3598,18 +3663,11 @@ The password is cleartext here but obscured in the entry dialog. Enter a blank p
         
         else:
             
-            # this is temporary until I can get the parser link going and actually do this dynamically
-            
-            if url_type == HC.URL_TYPE_WATCHABLE:
-                
-                self._notebook.NewPageImportThreadWatcher( url, on_deepest_notebook = True )
-                
-                return
-                
-            
+            # url was recognised as a gallery, page, or watchable url
+            
             if not can_parse:
                 
-                message = 'This URL was recognised as a ' + match_name + ' but this URL class does not yet have a parsing script linked to it!'
+                message = 'This URL was recognised as a "' + match_name + '" but this URL class does not yet have a parsing script linked to it!'
                 message += os.linesep * 2
                 message += 'Since this URL cannot be parsed, a downloader cannot be created for it! Please check your url class links under the \'networking\' menu.'
@@ -3623,6 +3681,13 @@ The password is cleartext here but obscured in the entry dialog. Enter a blank p
             # gallery url -> open gallery page for the respective parser for import options, but no query input stuff, queue up gallery page to be parsed for page urls
             # page url -> open gallery page for the respective parser for import options, but no query input stuff (maybe no gallery stuff, but this is prob overkill), queue up file page to be parsed for tags and file
             
+            if url_type == HC.URL_TYPE_WATCHABLE:
+                
+                self._notebook.NewPageImportThreadWatcher( url, on_deepest_notebook = True )
+                
+                return
+                
+            
         
     
     def IsCurrentPage( self, page_key ):
@@ -108,7 +108,10 @@ class AutoCompleteDropdown( wx.Panel ):
         
         self._dropdown_list = self._InitDropDownList()
         
-        if not self._float_mode: vbox.Add( self._dropdown_window, CC.FLAGS_EXPAND_BOTH_WAYS )
+        if not self._float_mode:
+            
+            vbox.Add( self._dropdown_window, CC.FLAGS_EXPAND_BOTH_WAYS )
+            
         
         self.SetSizer( vbox )
@@ -313,6 +316,9 @@ class AutoCompleteDropdown( wx.Panel ):
             self._move_hide_timer.Stop()
             self._move_hide_timer = None
             
         
         if self._lag_timer is not None:
             
             self._lag_timer.Stop()
             self._lag_timer = None
             
@@ -482,8 +488,14 @@ class AutoCompleteDropdown( wx.Panel ):
         
         self._next_updatelist_is_probably_fast = self._next_updatelist_is_probably_fast and num_chars > len( self._last_search_text )
         
         if self._next_updatelist_is_probably_fast: self._UpdateList()
-        elif num_chars < char_limit: self._lag_timer.Start( long_wait, wx.TIMER_ONE_SHOT )
-        else: self._lag_timer.Start( short_wait, wx.TIMER_ONE_SHOT )
+        elif num_chars < char_limit:
+            
+            self._lag_timer.Start( long_wait, wx.TIMER_ONE_SHOT )
+            
+        else:
+            
+            self._lag_timer.Start( short_wait, wx.TIMER_ONE_SHOT )
+            
@@ -578,3 +578,42 @@ class NetworkJobControl( wx.Panel ):
         self._Update()
         
     
+
+class StringToStringDictButton( ClientGUICommon.BetterButton ):
+    
+    def __init__( self, parent, label ):
+        
+        ClientGUICommon.BetterButton.__init__( self, parent, label, self._Edit )
+        
+        self._value = {}
+        
+    
+    def _Edit( self ):
+        
+        with ClientGUITopLevelWindows.DialogEdit( self, 'edit string dictionary' ) as dlg:
+            
+            panel = ClientGUIScrolledPanels.EditSingleCtrlPanel( dlg )
+            
+            control = EditStringToStringDictControl( panel, self._value )
+            
+            panel.SetControl( control )
+            
+            dlg.SetPanel( panel )
+            
+            if dlg.ShowModal() == wx.ID_OK:
+                
+                self._value = control.GetValue()
+                
+            
+        
+    
+    def GetValue( self ):
+        
+        return self._value
+        
+    
+    def SetValue( self, value ):
+        
+        self._value = value
+        
+    
@@ -147,7 +147,13 @@ class Dialog( wx.Dialog ):
         HG.client_controller.ResetIdleTimer()
         
     
-    def EventDialogButton( self, event ): self.EndModal( event.GetId() )
+    def EventDialogButton( self, event ):
+        
+        if self.IsModal():
+            
+            self.EndModal( event.GetId() )
+            
+        
     
     def SetInitialSize( self, ( width, height ) ):
@@ -2771,7 +2771,7 @@ class DialogManageImportFoldersEdit( ClientGUIDialogs.Dialog ):
             
             if not os.path.exists( path ):
                 
-                wx.MessageBox( 'The path you have entered--"' + path + '"--does not exist! The dialog will not force you to correct it, but you should not let this import folder run until you have corrected or created it!' )
+                wx.MessageBox( 'The path you have entered--"' + path + '"--does not exist! The dialog will not force you to correct it, but this import folder will do no work as long as the location is missing!' )
                 
             
            if HC.BASE_DIR.startswith( path ) or HG.client_controller.GetDBDir().startswith( path ):
@@ -1756,7 +1756,25 @@ class ListBoxTagsAC( ListBoxTagsPredicates ):
         
         if len( predicates ) > 0:
             
-            self._Hit( False, False, 0 )
+            hit_index = 0
+            
+            if len( predicates ) > 1:
+                
+                if HG.client_controller.new_options.GetBoolean( 'ac_select_first_with_count' ):
+                    
+                    for ( index, predicate ) in enumerate( predicates ):
+                        
+                        if predicate.GetCount() != 0:
+                            
+                            hit_index = index
+                            
+                            break
+                            
+                        
+                    
+                
+            
+            self._Hit( False, False, hit_index )
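The new `ac_select_first_with_count` behaviour above can be sketched as a standalone function over `(tag, count)` pairs (an illustration, not hydrus's predicate objects): default to index 0, else jump to the first result with a non-zero count.

```python
def first_hit_index(predicates, select_first_with_count):
    # predicates: list of (tag, count) pairs, as the autocomplete would show them
    hit_index = 0
    if len(predicates) > 1 and select_first_with_count:
        for index, (tag, count) in enumerate(predicates):
            if count != 0:
                hit_index = index
                break
    return hit_index
```

With the option on, typed nonsense like `('boyp', 0)` at the head of the list is skipped in favour of the first real tag; with it off, behaviour is unchanged from v292.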
@@ -1011,6 +1011,36 @@ class BetterListCtrlPanel( wx.Panel ):
         
     
+    def _ExportToPngs( self ):
+        
+        export_object = self._GetExportObject()
+        
+        if export_object is None:
+            
+            return
+            
+        
+        if not isinstance( export_object, HydrusSerialisable.SerialisableList ):
+            
+            self._ExportToPng()
+            
+            return
+            
+        
+        import ClientGUITopLevelWindows
+        import ClientGUISerialisable
+        
+        with ClientGUITopLevelWindows.DialogNullipotent( self, 'export to png' ) as dlg:
+            
+            panel = ClientGUISerialisable.PngsExportPanel( dlg, export_object )
+            
+            dlg.SetPanel( panel )
+            
+            dlg.ShowModal()
+            
+        
+    
     def _GetExportObject( self ):
         
         to_export = HydrusSerialisable.SerialisableList()
@@ -1057,32 +1087,37 @@ class BetterListCtrlPanel( wx.Panel ):
     
     def _ImportFromPng( self ):
         
-        with wx.FileDialog( self, 'select the png with the encoded script', wildcard = 'PNG (*.png)|*.png' ) as dlg:
+        with wx.FileDialog( self, 'select the png or pngs with the encoded data', style = wx.FD_OPEN | wx.FD_MULTIPLE, wildcard = 'PNG (*.png)|*.png' ) as dlg:
             
             if dlg.ShowModal() == wx.ID_OK:
                 
-                path = HydrusData.ToUnicode( dlg.GetPath() )
-                
-                try:
-                    
-                    payload = ClientSerialisable.LoadFromPng( path )
-                    
-                except Exception as e:
-                    
-                    wx.MessageBox( HydrusData.ToUnicode( e ) )
-                    
-                    return
-                    
-                
-                try:
-                    
-                    obj = HydrusSerialisable.CreateFromNetworkString( payload )
-                    
-                    self._ImportObject( obj )
-                    
-                except:
-                    
-                    wx.MessageBox( 'I could not understand what was encoded in the png!' )
-                    
+                for path in dlg.GetPaths():
+                    
+                    path = HydrusData.ToUnicode( path )
+                    
+                    try:
+                        
+                        payload = ClientSerialisable.LoadFromPng( path )
+                        
+                    except Exception as e:
+                        
+                        wx.MessageBox( HydrusData.ToUnicode( e ) )
+                        
+                        return
+                        
+                    
+                    try:
+                        
+                        obj = HydrusSerialisable.CreateFromNetworkString( payload )
+                        
+                        self._ImportObject( obj )
+                        
+                    except:
+                        
+                        wx.MessageBox( 'I could not understand what was encoded in the png!' )
+                        
+                        return
@@ -1159,10 +1194,17 @@ class BetterListCtrlPanel( wx.Panel ):

export_menu_items.append( ( 'normal', 'to clipboard', 'Serialise the selected data and put it on your clipboard.', self._ExportToClipboard ) )
export_menu_items.append( ( 'normal', 'to png', 'Serialise the selected data and encode it to an image file you can easily share with other hydrus users.', self._ExportToPng ) )

all_objs_are_named = False not in ( issubclass( o, HydrusSerialisable.SerialisableBaseNamed ) for o in self._permitted_object_types )

if all_objs_are_named:

export_menu_items.append( ( 'normal', 'to pngs', 'Serialise the selected data and encode it to multiple image files you can easily share with other hydrus users.', self._ExportToPngs ) )

import_menu_items = []

import_menu_items.append( ( 'normal', 'from clipboard', 'Load a data from text in your clipboard.', self._ImportFromClipboard ) )
import_menu_items.append( ( 'normal', 'from png', 'Load a data from an encoded png.', self._ImportFromPng ) )
import_menu_items.append( ( 'normal', 'from pngs', 'Load a data from an encoded png.', self._ImportFromPng ) )

self.AddMenuButton( 'export', export_menu_items, enabled_only_on_selection = True )
self.AddMenuButton( 'import', import_menu_items )
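The hunk above gates the new 'to pngs' export behind a check that every permitted object type is named (only named objects have a usable png filename). A minimal plain-Python sketch of that subclass check, using `all(...)` as the idiomatic equivalent of `False not in ( ... )`; the class hierarchy below is illustrative, not hydrus's real one:

```python
class SerialisableBase:
    pass

class SerialisableBaseNamed(SerialisableBase):
    # named objects carry a name we can use as a png filename
    pass

class Subscription(SerialisableBaseNamed):
    pass

class Bandwidth(SerialisableBase):
    pass

def all_objs_are_named(permitted_object_types):
    # equivalent to: False not in (issubclass(o, SerialisableBaseNamed) for o in ...)
    return all(issubclass(o, SerialisableBaseNamed) for o in permitted_object_types)

print(all_objs_are_named([Subscription]))
print(all_objs_are_named([Subscription, Bandwidth]))
```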
@@ -118,27 +118,18 @@ def CreateManagementControllerImportHDD( paths, file_import_options, paths_to_ta

def CreateManagementControllerImportThreadWatcher( thread_url = None ):

if thread_url is None:

thread_url = ''

management_controller = CreateManagementController( 'thread watcher', MANAGEMENT_TYPE_IMPORT_THREAD_WATCHER )

thread_watcher_import = ClientImporting.ThreadWatcherImport()

management_controller.SetVariable( 'thread_watcher_import', thread_watcher_import )
thread_watcher_import.SetThreadURL( thread_url )

if thread_url is not None:

try:

( thread_url, host, board, thread_id ) = ClientDownloading.ParseImageboardThreadURL( thread_url )

except Exception as e:

HydrusData.ShowException( e )

return

thread_watcher_import.SetThreadURL( thread_url )

management_controller.SetVariable( 'thread_watcher_import', thread_watcher_import )

return management_controller
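The removed branch above called `ClientDownloading.ParseImageboardThreadURL` and bailed out on failure. A hedged sketch of what such a parser might look like in plain Python, splitting a 4chan-style path into host, board, and thread id; the function name and URL shape are assumptions for illustration, not hydrus's actual implementation:

```python
from urllib.parse import urlparse

def parse_imageboard_thread_url(thread_url):
    # hypothetical stand-in for ClientDownloading.ParseImageboardThreadURL:
    # expects something like https://boards.4chan.org/tg/thread/12345
    parts = urlparse(thread_url)
    path_components = [p for p in parts.path.split('/') if p != '']
    if len(path_components) != 3 or path_components[1] != 'thread':
        raise ValueError('not a recognised thread url: ' + thread_url)
    (board, _, thread_id) = path_components
    return (thread_url, parts.netloc, board, thread_id)

(url, host, board, thread_id) = parse_imageboard_thread_url('https://boards.4chan.org/tg/thread/12345')
print(host, board, thread_id)
```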
@@ -2290,17 +2281,6 @@ class ManagementPanelImporterThreadWatcher( ManagementPanelImporter ):

return

try:

( thread_url, host, board, thread_id ) = ClientDownloading.ParseImageboardThreadURL( thread_url )

except Exception as e:

HydrusData.ShowException( e )

return

self._thread_input.SetEditable( False )

self._thread_watcher_import.SetThreadURL( thread_url )
@@ -1144,6 +1144,8 @@ class PagesNotebook( wx.Notebook ):

page_name = page.GetDisplayName()

page_name = page_name.replace( os.linesep, '' )

if len( page_name ) > max_page_name_chars:

page_name = page_name[ : max_page_name_chars ] + u'\u2026'
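The hunk above strips newlines from a page name and truncates it with an ellipsis. The same logic as a self-contained sketch; the length limit is an assumed value for illustration:

```python
import os

MAX_PAGE_NAME_CHARS = 10  # assumed limit for the sketch

def clean_page_name(page_name, max_chars=MAX_PAGE_NAME_CHARS):
    # strip newlines, then truncate with a single ellipsis character
    page_name = page_name.replace(os.linesep, '')
    if len(page_name) > max_chars:
        page_name = page_name[:max_chars] + '\u2026'
    return page_name

print(clean_page_name('a short name that is too long'))
```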
@@ -1156,9 +1158,13 @@ class PagesNotebook( wx.Notebook ):

page_name += ' (' + HydrusData.ConvertIntToPrettyString( num_files ) + ')'

if self.GetPageText( index ) != page_name:
safe_page_name = self.EscapeMnemonics( page_name )

existing_page_name = self.GetPageText( index )

if existing_page_name not in ( safe_page_name, page_name ):

self.SetPageText( index, page_name )
self.SetPageText( index, safe_page_name )

@@ -1179,8 +1185,6 @@ class PagesNotebook( wx.Notebook ):

new_name = dlg.GetValue()

new_name = self.EscapeMnemonics( new_name )

page.SetName( new_name, from_user = True )

self._controller.pub( 'refresh_page_name', page.GetPageKey() )
@@ -853,9 +853,12 @@ class PopupMessageManager( wx.Frame ):

self._timer.Stop()

self._timer = None
if self._timer is not None:

self._timer.Stop()

self._timer = None

sys.excepthook = self._old_excepthook
@@ -52,7 +52,7 @@ class PanelPredicateSystemAgeDate( PanelPredicateSystem ):

hbox = wx.BoxSizer( wx.HORIZONTAL )

hbox.Add( ClientGUICommon.BetterStaticText( self, 'system:age' ), CC.FLAGS_VCENTER )
hbox.Add( ClientGUICommon.BetterStaticText( self, 'system:time imported' ), CC.FLAGS_VCENTER )
hbox.Add( self._sign, CC.FLAGS_VCENTER )
hbox.Add( self._date, CC.FLAGS_VCENTER )
@@ -89,7 +89,20 @@ class PanelPredicateSystemAgeDelta( PanelPredicateSystem ):

system_predicates = HC.options[ 'file_system_predicates' ]

( sign, years, months, days, hours ) = system_predicates[ 'age' ]
try:

( sign, age_type, ( years, months, days, hours ) ) = system_predicates[ 'age' ]

except:

# wew lad. replace this all with proper system pred saving on new_options in future
sign = '<'

years = 0
months = 0
days = 7
hours = 0

self._sign.SetStringSelection( sign )
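The hunk above unpacks the stored 'age' predicate in its new three-part shape and, when the stored value is in the old shape, falls back to the default '< 7 days' mentioned in the changelog. A runnable sketch of that defensive unpack, with narrowed exception types (the original uses a bare `except:`):

```python
def get_age_delta_defaults(system_predicates):
    # new format: (sign, age_type, (years, months, days, hours));
    # fall back to '< 7 days' when the stored value has the old shape
    try:
        (sign, age_type, (years, months, days, hours)) = system_predicates['age']
    except (KeyError, TypeError, ValueError):
        (sign, years, months, days, hours) = ('<', 0, 0, 7, 0)
    return (sign, years, months, days, hours)

print(get_age_delta_defaults({'age': ('<', 'delta', (1, 2, 3, 4))}))
print(get_age_delta_defaults({'age': ('<', 0, 0, 7, 0)}))  # old five-tuple shape
```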
@@ -100,7 +113,7 @@ class PanelPredicateSystemAgeDelta( PanelPredicateSystem ):

hbox = wx.BoxSizer( wx.HORIZONTAL )

hbox.Add( ClientGUICommon.BetterStaticText( self, 'system:age' ), CC.FLAGS_VCENTER )
hbox.Add( ClientGUICommon.BetterStaticText( self, 'system:time imported' ), CC.FLAGS_VCENTER )
hbox.Add( self._sign, CC.FLAGS_VCENTER )
hbox.Add( self._years, CC.FLAGS_VCENTER )
hbox.Add( ClientGUICommon.BetterStaticText( self, 'years' ), CC.FLAGS_VCENTER )
@@ -3354,9 +3354,11 @@ class EditURLMatchPanel( ClientGUIScrolledPanels.EditPanel ):

try:

api_lookup_url = url_match.GetAPIURL( normalised )

if api_lookup_url == normalised:
if url_match.UsesAPIURL():

api_lookup_url = url_match.GetAPIURL( normalised )

else:

api_lookup_url = 'none set'
@@ -3547,13 +3549,16 @@ class EditURLMatchesPanel( ClientGUIScrolledPanels.EditPanel ):

class EditURLMatchLinksPanel( ClientGUIScrolledPanels.EditPanel ):

def __init__( self, parent, network_engine, url_matches, url_match_keys_to_display, url_match_keys_to_parser_keys ):
def __init__( self, parent, network_engine, url_matches, parsers, url_match_keys_to_display, url_match_keys_to_parser_keys ):

ClientGUIScrolledPanels.EditPanel.__init__( self, parent )

self._url_matches = url_matches
self._url_match_keys_to_url_matches = { url_match.GetMatchKey() : url_match for url_match in self._url_matches }

self._parsers = parsers
self._parser_keys_to_parsers = { parser.GetParserKey() : parser for parser in self._parsers }

self._network_engine = network_engine

self._display_list_ctrl_panel = ClientGUIListCtrl.BetterListCtrlPanel( self )

@@ -3564,13 +3569,16 @@ class EditURLMatchLinksPanel( ClientGUIScrolledPanels.EditPanel ):

self._display_list_ctrl_panel.AddButton( 'edit', self._EditDisplay, enabled_only_on_selection = True )

self._api_pairs_list_ctrl = ClientGUIListCtrl.BetterListCtrl( self, 'url_match_api_pairs', 10, 36, [ ( 'url class', -1 ), ( 'api url class', 36 ) ], self._ConvertAPIPairDataToListCtrlTuples )

self._parser_list_ctrl_panel = ClientGUIListCtrl.BetterListCtrlPanel( self )

self._parser_list_ctrl = ClientGUIListCtrl.BetterListCtrl( self._parser_list_ctrl_panel, 'url_match_keys_to_parser_keys', 15, 36, [ ( 'url class', -1 ), ( 'url type', 20 ), ( 'page parser', 36 ) ], self._ConvertParserDataToListCtrlTuples, activation_callback = self._EditParser )
self._parser_list_ctrl = ClientGUIListCtrl.BetterListCtrl( self._parser_list_ctrl_panel, 'url_match_keys_to_parser_keys', 15, 36, [ ( 'url class', -1 ), ( 'url type', 20 ), ( 'parser', 36 ) ], self._ConvertParserDataToListCtrlTuples, activation_callback = self._EditParser )

self._parser_list_ctrl_panel.SetListCtrl( self._parser_list_ctrl )

self._parser_list_ctrl_panel.AddButton( 'edit', self._EditParser, enabled_only_on_selection = True )
self._parser_list_ctrl_panel.AddButton( 'try to fill in gaps based on example urls', self._TryToLinkUrlMatchesAndParsers, enabled_check_func = self._GapsExist )

#
@@ -3596,12 +3604,30 @@ class EditURLMatchLinksPanel( ClientGUIScrolledPanels.EditPanel ):

#

api_pairs = ClientNetworkingDomain.ConvertURLMatchesIntoAPIPairs( url_matches )

self._api_pairs_list_ctrl.AddDatas( api_pairs )

# anything that goes to an api url will be parsed by that api's parser--it can't have its own
api_pair_unparsable_url_matches = set()

for ( a, b ) in api_pairs:

api_pair_unparsable_url_matches.add( a )

#

listctrl_data = []

for url_match in url_matches:

if not url_match.IsParsable():
if not url_match.IsParsable() or url_match in api_pair_unparsable_url_matches:

continue

if not url_match.IsWatchableURL(): # only starting with the thread watcher atm

continue
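The hunk above excludes any url class that maps to an api url, since such urls are parsed by the api's parser and cannot carry their own. A plain-Python sketch of that filter, using strings for url classes and ignoring the separate `IsParsable` check:

```python
def get_linkable_url_matches(url_matches, api_pairs):
    # anything that maps to an api url is parsed by the api's parser, so skip it
    api_pair_unparsable = {a for (a, b) in api_pairs}
    return [m for m in url_matches if m not in api_pair_unparsable]

url_matches = ['gallery page', 'post page', 'api post page']
api_pairs = [('post page', 'api post page')]
print(get_linkable_url_matches(url_matches, api_pairs))
```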
@@ -3629,12 +3655,28 @@ class EditURLMatchLinksPanel( ClientGUIScrolledPanels.EditPanel ):

vbox = wx.BoxSizer( wx.VERTICAL )

vbox.Add( self._display_list_ctrl_panel, CC.FLAGS_EXPAND_BOTH_WAYS )
vbox.Add( ClientGUICommon.BetterStaticText( self, 'a listctrl here to show current api links as understood by the domain manager' ), CC.FLAGS_EXPAND_PERPENDICULAR )
vbox.Add( self._api_pairs_list_ctrl, CC.FLAGS_EXPAND_PERPENDICULAR )
vbox.Add( self._parser_list_ctrl_panel, CC.FLAGS_EXPAND_BOTH_WAYS )

self.SetSizer( vbox )

def _ConvertAPIPairDataToListCtrlTuples( self, data ):

( a, b ) = data

a_name = a.GetName()
b_name = b.GetName()

pretty_a_name = a_name
pretty_b_name = b_name

display_tuple = ( pretty_a_name, pretty_b_name )
sort_tuple = ( a_name, b_name )

return ( display_tuple, sort_tuple )

def _ConvertDisplayDataToListCtrlTuples( self, data ):

( url_match_key, display ) = data
@@ -3662,27 +3704,31 @@ class EditURLMatchLinksPanel( ClientGUIScrolledPanels.EditPanel ):

( url_match_key, parser_key ) = data

url_match = url_match_name = self._url_match_keys_to_url_matches[ url_match_key ]
url_match = self._url_match_keys_to_url_matches[ url_match_key ]

url_match_name = url_match.GetName()

url_type = url_match.GetURLType()

pretty_name = url_match_name

pretty_url_type = HC.url_type_string_lookup[ url_type ]

if parser_key is None:

pretty_parser_key = ''
parser_name = ''

else:

pretty_parser_key = 'fetch this from network engine like with the url matches'
parser = self._parser_keys_to_parsers[ parser_key ]

parser_name = parser.GetName()

display_tuple = ( pretty_name, pretty_url_type, pretty_parser_key )
sort_tuple = ( url_match_name, url_type, pretty_parser_key )
pretty_url_match_name = url_match_name

pretty_url_type = HC.url_type_string_lookup[ url_type ]

pretty_parser_name = parser_name

display_tuple = ( pretty_url_match_name, pretty_url_type, pretty_parser_name )
sort_tuple = ( url_match_name, url_type, parser_name )

return ( display_tuple, sort_tuple )
@@ -3723,15 +3769,75 @@ class EditURLMatchLinksPanel( ClientGUIScrolledPanels.EditPanel ):

def _EditParser( self ):

if len( self._parsers ) == 0:

wx.MessageBox( 'Unfortunately, you do not have any parsers, so none can be linked to your url classes. Please create some!' )

return

for data in self._parser_list_ctrl.GetData( only_selected = True ):

( url_match_key, page_script_key ) = data
( url_match_key, parser_key ) = data

# present the user with a dialog to choose page script key, or none
url_match = self._url_match_keys_to_url_matches[ url_match_key ]

wx.MessageBox( 'This does not work yet!' )
choice_tuples = [ ( parser.GetName(), parser ) for parser in self._parsers ]

break
with ClientGUIDialogs.DialogSelectFromList( self, 'select parser for ' + url_match.GetName(), choice_tuples ) as dlg:

if dlg.ShowModal() == wx.ID_OK:

parser = dlg.GetChoice()

self._parser_list_ctrl.DeleteDatas( ( data, ) )

new_data = ( url_match_key, parser.GetParserKey() )

self._parser_list_ctrl.AddDatas( ( new_data, ) )

else:

break

self._parser_list_ctrl.Sort()

def _GapsExist( self ):

parser_keys = [ parser_key for ( url_match_key, parser_key ) in self._parser_list_ctrl.GetData() ]

return None in parser_keys

def _TryToLinkUrlMatchesAndParsers( self ):

existing_url_match_keys_to_parser_keys = { url_match_key : parser_key for ( url_match_key, parser_key ) in self._parser_list_ctrl.GetData() if parser_key is not None }

new_url_match_keys_to_parser_keys = ClientNetworkingDomain.NetworkDomainManager.STATICLinkURLMatchesAndParsers( self._url_matches, self._parsers, existing_url_match_keys_to_parser_keys )

if len( new_url_match_keys_to_parser_keys ) > 0:

removees = []

for row in self._parser_list_ctrl.GetData():

( url_match_key, parser_key ) = row

if url_match_key in new_url_match_keys_to_parser_keys:

removees.append( row )

self._parser_list_ctrl.DeleteDatas( removees )

self._parser_list_ctrl.AddDatas( new_url_match_keys_to_parser_keys.items() )

self._parser_list_ctrl.Sort()
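`_TryToLinkUrlMatchesAndParsers` above fills only the gaps: rows with a newly suggested parser are replaced, everything else is left alone. A hedged sketch of that list-update pattern using plain tuples (the row values are illustrative, not real hydrus keys):

```python
def link_gaps(rows, new_links):
    # rows are (url_match_key, parser_key) pairs; replace only rows whose key
    # has a newly suggested parser, leaving all other rows untouched
    kept = [row for row in rows if row[0] not in new_links]
    return kept + sorted(new_links.items())

rows = [('danbooru post', None), ('gelbooru post', 'old parser')]
new_links = {'danbooru post': 'danbooru parser'}
print(link_gaps(rows, new_links))
```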
@@ -266,9 +266,9 @@ class ManageClientServicesPanel( ClientGUIScrolledPanels.ManagePanel ):

#

services = HG.client_controller.services_manager.GetServices()
self._original_services = HG.client_controller.services_manager.GetServices()

self._listctrl.AddDatas( services )
self._listctrl.AddDatas( self._original_services )

self._listctrl.Sort( 0 )

@@ -402,6 +402,28 @@ class ManageClientServicesPanel( ClientGUIScrolledPanels.ManagePanel ):

services = self._listctrl.GetData()

new_service_keys = { service.GetServiceKey() for service in services }

deletee_service_names = [ service.GetName() for service in self._original_services if service.GetServiceKey() not in new_service_keys ]

if len( deletee_service_names ) > 0:

message = 'You are about to delete the following services:'
message += os.linesep * 2
message += os.linesep.join( deletee_service_names )
message += os.linesep * 2
message += 'Are you absolutely sure this is correct?'

with ClientGUIDialogs.DialogYesNo( self, message ) as dlg:

if dlg.ShowModal() != wx.ID_YES:

raise HydrusExceptions.VetoException( 'Commit cancelled by user! If you do not believe you meant to delete any services (i.e the code accidentally intended to delete them all by itself), please inform hydrus dev immediately.' )

HG.client_controller.SetServices( services )
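The hunk above computes which services are about to be deleted by diffing the original service keys against the keys still in the list. The same set-difference logic as a runnable sketch, with (key, name) tuples standing in for service objects:

```python
def get_deletee_names(original_services, current_services):
    # a service is about to be deleted if its key is no longer present
    new_keys = {key for (key, name) in current_services}
    return [name for (key, name) in original_services if key not in new_keys]

original = [('abc1', 'my tag repo'), ('abc2', 'local booru')]
current = [('abc1', 'my tag repo')]
print(get_deletee_names(original, current))
```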
@@ -3372,6 +3394,8 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):

self._show_all_tags_in_autocomplete = wx.CheckBox( general_panel )

self._ac_select_first_with_count = wx.CheckBox( general_panel )

self._apply_all_parents_to_all_services = wx.CheckBox( general_panel )
self._apply_all_siblings_to_all_services = wx.CheckBox( general_panel )

@@ -3487,6 +3511,7 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):

self._default_tag_service_search_page.SelectClientData( new_options.GetKey( 'default_tag_service_search_page' ) )

self._show_all_tags_in_autocomplete.SetValue( HC.options[ 'show_all_tags_in_autocomplete' ] )
self._ac_select_first_with_count.SetValue( self._new_options.GetBoolean( 'ac_select_first_with_count' ) )

self._apply_all_parents_to_all_services.SetValue( self._new_options.GetBoolean( 'apply_all_parents_to_all_services' ) )
self._apply_all_siblings_to_all_services.SetValue( self._new_options.GetBoolean( 'apply_all_siblings_to_all_services' ) )

@@ -3526,6 +3551,7 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):

rows.append( ( 'Default tag service in search pages: ', self._default_tag_service_search_page ) )
rows.append( ( 'Default tag sort: ', self._default_tag_sort ) )
rows.append( ( 'By default, search non-local tags in write-autocomplete: ', self._show_all_tags_in_autocomplete ) )
rows.append( ( 'By default, select the first tag result with actual count in write-autocomplete: ', self._ac_select_first_with_count ) )
rows.append( ( 'Suggest all parents for all services: ', self._apply_all_parents_to_all_services ) )
rows.append( ( 'Apply all siblings to all services (local siblings have precedence): ', self._apply_all_siblings_to_all_services ) )

@@ -3665,6 +3691,8 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):

HC.options[ 'default_tag_sort' ] = self._default_tag_sort.GetClientData( self._default_tag_sort.GetSelection() )
HC.options[ 'show_all_tags_in_autocomplete' ] = self._show_all_tags_in_autocomplete.GetValue()

self._new_options.SetBoolean( 'ac_select_first_with_count', self._ac_select_first_with_count.GetValue() )

self._new_options.SetKey( 'default_tag_service_search_page', self._default_tag_service_search_page.GetChoice() )

self._new_options.SetInteger( 'suggested_tags_width', self._suggested_tags_width.GetValue() )
@@ -6,6 +6,8 @@ import ClientSerialisable

import ClientThreading
import HydrusConstants as HC
import HydrusData
import HydrusSerialisable
import os
import wx

class PngExportPanel( ClientGUIScrolledPanels.ReviewPanel ):

@@ -40,7 +42,12 @@ class PngExportPanel( ClientGUIScrolledPanels.ReviewPanel ):

self._width.SetValue( 512 )

self.EventChanged( None )
if isinstance( self._payload_obj, HydrusSerialisable.SerialisableBaseNamed ):

self._title.SetValue( self._payload_obj.GetName() )

self._Update()

#

@@ -58,7 +65,7 @@ class PngExportPanel( ClientGUIScrolledPanels.ReviewPanel ):

self.SetSizer( gridbox )

def EventChanged( self, event ):
def _Update( self ):

problems = []

@@ -88,6 +95,11 @@ class PngExportPanel( ClientGUIScrolledPanels.ReviewPanel ):

def EventChanged( self, event ):

self._Update()

def Export( self ):

width = self._width.GetValue()
@@ -110,3 +122,94 @@ class PngExportPanel( ClientGUIScrolledPanels.ReviewPanel ):

ClientThreading.CallLater( self, 2, self._export.SetLabelText, 'export' )

class PngsExportPanel( ClientGUIScrolledPanels.ReviewPanel ):

def __init__( self, parent, payload_objs ):

ClientGUIScrolledPanels.ReviewPanel.__init__( self, parent )

self._payload_objs = payload_objs

self._directory_picker = wx.DirPickerCtrl( self )
self._directory_picker.Bind( wx.EVT_DIRPICKER_CHANGED, self.EventChanged )

self._width = wx.SpinCtrl( self, min = 100, max = 4096 )

self._export = ClientGUICommon.BetterButton( self, 'export', self.Export )

#

self._width.SetValue( 512 )

self._Update()

#

rows = []

rows.append( ( 'export path: ', self._directory_picker ) )
rows.append( ( 'png width: ', self._width ) )
rows.append( ( '', self._export ) )

gridbox = ClientGUICommon.WrapInGrid( self, rows )

self.SetSizer( gridbox )

def _Update( self ):

problems = []

path = self._directory_picker.GetPath()

if path == '' or path is None:

problems.append( 'select a path' )

if len( problems ) == 0:

self._export.SetLabelText( 'export' )

self._export.Enable()

else:

self._export.SetLabelText( ' and '.join( problems ) )

self._export.Disable()

def EventChanged( self, event ):

self._Update()

def Export( self ):

width = self._width.GetValue()

directory = HydrusData.ToUnicode( self._directory_picker.GetPath() )

for obj in self._payload_objs:

( payload_description, payload_string ) = ClientSerialisable.GetPayloadDescriptionAndString( obj )

title = obj.GetName()
text = ''
path = os.path.join( directory, title )

if not path.endswith( '.png' ):

path += '.png'

ClientSerialisable.DumpToPng( width, payload_string, title, payload_description, text, path )

self._export.SetLabelText( 'done!' )

ClientThreading.CallLater( self, 2, self._export.SetLabelText, 'export' )
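The new `PngsExportPanel.Export` above builds one output path per object, appending `.png` only when the title does not already end with it. That path handling as a standalone sketch:

```python
import os

def get_export_path(directory, title):
    # mirror of the export loop's path handling: join and ensure a .png suffix
    path = os.path.join(directory, title)
    if not path.endswith('.png'):
        path += '.png'
    return path

print(get_export_path('pngs', 'my subscription'))
print(get_export_path('pngs', 'already.png'))
```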
@@ -424,9 +424,9 @@ class FileLookupScriptTagsPanel( wx.Panel ):

self._SetTags( tags )

content_results = script.DoQuery( job_key, file_identifier )
parse_results = script.DoQuery( job_key, file_identifier )

tags = ClientParsing.GetTagsFromContentResults( content_results )
tags = ClientParsing.GetTagsFromParseResults( parse_results )

wx.CallAfter( wx_code, tags )

@@ -6,6 +6,7 @@ import ClientDownloading

import ClientFiles
import ClientImageHandling
import ClientNetworking
import ClientParsing
import ClientTags
import ClientThreading
import collections
@@ -37,6 +38,36 @@ CHECKER_STATUS_404 = 2

DID_SUBSTANTIAL_FILE_WORK_MINIMUM_SLEEP_TIME = 0.1

def GetInitialSeedStatus( seed ):

( status, hash, note ) = ( CC.STATUS_NEW, None, '' )

url_not_known_beforehand = True

if seed.seed_type == SEED_TYPE_URL:

url = seed.seed_data

( status, hash, note ) = HG.client_controller.Read( 'url_status', url )

url_not_known_beforehand = status == CC.STATUS_NEW

if status == CC.STATUS_NEW:

for ( hash_type, found_hash ) in seed.GetHashes().items():

( status, hash, note ) = HG.client_controller.Read( 'hash_status', hash_type, found_hash )

if status != CC.STATUS_NEW:

break

return ( url_not_known_beforehand, ( status, hash, note ) )

def THREADDownloadURL( job_key, url, url_string ):

job_key.SetVariable( 'popup_title', url_string )
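`GetInitialSeedStatus` above checks the url first; only if that comes back 'new' does it try each known hash, and the first non-new answer wins. A simplified, runnable sketch of that fallthrough (the status constants are placeholders, not hydrus's real values):

```python
STATUS_NEW = 0
STATUS_REDUNDANT = 4  # hypothetical status values for the sketch

def get_initial_status(url_status, hash_statuses):
    # the url check runs first; if it comes back 'new', each known hash
    # is tried in turn and the first non-new answer wins
    status = url_status
    url_not_known_beforehand = (status == STATUS_NEW)
    if status == STATUS_NEW:
        for hash_status in hash_statuses:
            if hash_status != STATUS_NEW:
                status = hash_status
                break
    return (url_not_known_beforehand, status)

print(get_initial_status(STATUS_NEW, [STATUS_NEW, STATUS_REDUNDANT]))
```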
@@ -249,6 +280,57 @@ def THREADDownloadURLs( job_key, urls, title ):

job_key.Finish()

def UpdateSeedCacheWithAllParseResults( seed_cache, all_parse_results ):

# need a limit param here for 'stop at 40 total new because of file limit'

new_seeds = []

num_new = 0
num_already_in = 0

for parse_results in all_parse_results:

parsed_urls = ClientParsing.GetURLsFromParseResults( parse_results, ( HC.URL_TYPE_FILE, HC.URL_TYPE_POST ) )

urls_to_add = filter( lambda u: not seed_cache.HasURL( u ), parsed_urls )

num_new += len( urls_to_add )
num_already_in += len( parsed_urls ) - len( urls_to_add )

if len( urls_to_add ) == 0:

continue

tags = ClientParsing.GetTagsFromParseResults( parse_results )
hashes = ClientParsing.GetHashesFromParseResults( parse_results )
source_timestamp = ClientParsing.GetTimestampFromParseResults( parse_results, HC.TIMESTAMP_TYPE_SOURCE )

for url in urls_to_add:

seed = Seed( SEED_TYPE_URL, url )

seed.AddTags( tags )

for ( hash_type, hash ) in hashes:

seed.SetHash( hash_type, hash )

if source_timestamp is not None:

seed.source_time = source_timestamp

new_seeds.append( seed )

seed_cache.AddSeeds( new_seeds )

return ( num_new, num_already_in )

class FileImportJob( object ):

def __init__( self, temp_path, file_import_options = None ):
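`UpdateSeedCacheWithAllParseResults` above counts how many parsed urls are genuinely new versus already cached before building seeds. The counting step as a standalone sketch (note the original's Python 2 `filter(...)` supports `len()`; a list comprehension is the Python 3 equivalent):

```python
def split_new_urls(parsed_urls, seed_cache_urls):
    # count how many parsed urls are genuinely new versus already cached
    urls_to_add = [u for u in parsed_urls if u not in seed_cache_urls]
    num_new = len(urls_to_add)
    num_already_in = len(parsed_urls) - num_new
    return (urls_to_add, num_new, num_already_in)

cache = {'https://example.com/post/1'}
parsed = ['https://example.com/post/1', 'https://example.com/post/2']
(to_add, num_new, num_already_in) = split_new_urls(parsed, cache)
print(num_new, num_already_in)
```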
@@ -379,7 +461,7 @@ class FileImportJob( object ):

self._hash = HydrusFileHandling.GetHashFromPath( self._temp_path )

self._pre_import_status = HG.client_controller.Read( 'hash_status', self._hash )
( self._pre_import_status, hash, note ) = HG.client_controller.Read( 'hash_status', 'sha256', self._hash )

def GenerateInfo( self ):

@@ -2040,9 +2122,14 @@ class ImportFolder( HydrusSerialisable.SerialisableBaseNamed ):

try:

if os.path.exists( path ):
dest_dir = self._action_locations[ status ]

if not os.path.exists( dest_dir ):

dest_dir = self._action_locations[ status ]
raise HydrusExceptions.DataMissing( 'The move location "' + dest_dir + '" does not exist!' )

if os.path.exists( path ):

filename = os.path.basename( path )

@@ -2057,8 +2144,6 @@ class ImportFolder( HydrusSerialisable.SerialisableBaseNamed ):

if os.path.exists( txt_path ):

dest_dir = self._action_locations[ status ]

txt_filename = os.path.basename( txt_path )

txt_dest_path = os.path.join( dest_dir, txt_filename )
@@ -2199,6 +2284,8 @@ class ImportFolder( HydrusSerialisable.SerialisableBaseNamed ):

return

time_to_stop = HydrusData.GetNow() + 600
was_interrupted = False

due_by_check_now = self._check_now
due_by_period = not self._paused and HydrusData.TimeHasPassed( self._last_checked + self._period )

@@ -2241,11 +2328,20 @@ class ImportFolder( HydrusSerialisable.SerialisableBaseNamed ):

seed = self._path_cache.GetNextSeed( CC.STATUS_UNKNOWN )

p1 = HC.options[ 'pause_import_folders_sync' ]
p1 = HC.options[ 'pause_import_folders_sync' ] or self._paused
p2 = HG.view_shutdown

if seed is None or p1 or p2:

was_interrupted = True

break

if HydrusData.TimeHasPassed( time_to_stop ):

was_interrupted = True

break

@@ -2380,8 +2476,11 @@ class ImportFolder( HydrusSerialisable.SerialisableBaseNamed ):

self._ActionPaths()

self._last_checked = HydrusData.GetNow()
self._check_now = False
if not was_interrupted:

self._last_checked = HydrusData.GetNow()
self._check_now = False

HG.client_controller.WriteSynchronous( 'serialisable', self )
@@ -3081,17 +3180,17 @@ class Seed( HydrusSerialisable.SerialisableBase ):

self.note = ''

self._urls = set()

self._service_keys_to_tags = collections.defaultdict( set )
self._tags = set()

self._hashes = {}

def _GetSerialisableInfo( self ):

serialisable_urls = list( self._urls )

serialisable_service_keys_to_tags = [ ( service_key.encode( 'hex' ), list( tags ) ) for ( service_key, tags ) in self._service_keys_to_tags.items() ]
serialisable_tags = list( self._tags )

serialisable_hashes = [ ( hash_type, hash.encode( 'hex' ) ) for ( hash_type, hash ) in self._hashes.items() ]

return ( self.seed_type, self.seed_data, self.created, self.modified, self.source_time, self.status, self.note, serialisable_urls, serialisable_service_keys_to_tags, serialisable_hashes )
return ( self.seed_type, self.seed_data, self.created, self.modified, self.source_time, self.status, self.note, serialisable_urls, serialisable_tags, serialisable_hashes )

def __eq__( self, other ):

@@ -3111,10 +3210,10 @@ class Seed( HydrusSerialisable.SerialisableBase ):

def _InitialiseFromSerialisableInfo( self, serialisable_info ):

( self.seed_type, self.seed_data, self.created, self.modified, self.source_time, self.status, self.note, serialisable_urls, serialisable_service_keys_to_tags, serialisable_hashes ) = serialisable_info
( self.seed_type, self.seed_data, self.created, self.modified, self.source_time, self.status, self.note, serialisable_urls, serialisable_tags, serialisable_hashes ) = serialisable_info

self._urls = set( serialisable_urls )

self._service_keys_to_tags = { encoded_service_key.decode( 'hex' ) : set( tags ) for ( encoded_service_key, tags ) in serialisable_service_keys_to_tags }
self._service_keys_to_tags = set( serialisable_tags )

self._hashes = { hash_type : encoded_hash.decode( 'hex' ) for ( hash_type, encoded_hash ) in serialisable_hashes }

@@ -3123,14 +3222,9 @@ class Seed( HydrusSerialisable.SerialisableBase ):

self.modified = HydrusData.GetNow()

def AddTags( self, service_key, tags ):
def AddTags( self, tags ):

if service_key not in self._service_keys_to_tags:

self._service_keys_to_tags[ service_key ] = set()

self._service_keys_to_tags[ service_key ].update( tags )
self._tags.update( tags )

self._UpdateModified()

@@ -3170,6 +3264,11 @@ class Seed( HydrusSerialisable.SerialisableBase ):

return search_seeds

def GetTags( self ):

return set( self._tags )

def SetHash( self, hash_type, hash ):

self._hashes[ hash_type ] = hash
@@ -4117,8 +4216,9 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):

# just a little padding, to make sure we don't accidentally get into a long wait because we need to fetch file and tags independantly etc...
expected_requests = 3
expected_bytes = 1048576
threshold = 600

p4 = not HG.client_controller.network_engine.bandwidth_manager.CanDoWork( example_nj.GetNetworkContexts(), expected_requests, expected_bytes )
p4 = not HG.client_controller.network_engine.bandwidth_manager.CanDoWork( example_nj.GetNetworkContexts(), expected_requests = expected_requests, expected_bytes = expected_bytes, threshold = threshold )

if p1 or p3 or p4:

@@ -4340,8 +4440,9 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):

# just a little padding here
expected_requests = 3
expected_bytes = 1048576
threshold = 30

if HG.client_controller.network_engine.bandwidth_manager.CanDoWork( example_nj.GetNetworkContexts(), expected_requests, expected_bytes ):
if HG.client_controller.network_engine.bandwidth_manager.CanDoWork( example_nj.GetNetworkContexts(), expected_requests = expected_requests, expected_bytes = expected_bytes, threshold = threshold ):

return True
@@ -5262,6 +5363,40 @@ class ThreadWatcherImport( HydrusSerialisable.SerialisableBase ):

        error_occurred = False
+       watcher_status_should_stick = True

+       ( url_type, match_name, can_parse ) = HG.client_controller.network_engine.domain_manager.GetURLParseCapability( self._thread_url )

+       if url_type != HC.URL_TYPE_WATCHABLE:
+           
+           error_occurred = True
+           
+           watcher_status = 'Did not understand the given URL as watchable!'
+           
+       elif not can_parse:
+           
+           error_occurred = True
+           
+           watcher_status = 'Could not parse the given URL!'

+       # convert to API url as appropriate
+       ( url_to_check, parser ) = HG.client_controller.network_engine.domain_manager.GetURLToFetchAndParser( self._thread_url )

+       if parser is None:
+           
+           error_occurred = True
+           
+           watcher_status = 'Could not find a parser for the given URL!'

+       if error_occurred:
+           
+           self._FinishCheck( page_key, watcher_status, error_occurred, watcher_status_should_stick )
+           
+           return

+       #

        with self._lock:

            self._watcher_status = 'checking thread'
@@ -5269,9 +5404,7 @@ class ThreadWatcherImport( HydrusSerialisable.SerialisableBase ):

        try:

-           json_url = ClientDownloading.GetImageboardThreadJSONURL( self._thread_url )
-           
-           network_job = ClientNetworking.NetworkJobThreadWatcher( self._thread_key, 'GET', json_url )
+           network_job = ClientNetworking.NetworkJobThreadWatcher( self._thread_key, 'GET', url_to_check )

            network_job.OverrideBandwidth()
@@ -5297,55 +5430,30 @@ class ThreadWatcherImport( HydrusSerialisable.SerialisableBase ):

-           raw_json = network_job.GetContent()
+           data = network_job.GetContent()

+           parser = HG.client_controller.network_engine.domain_manager.GetParser( url_to_check )

+           parse_context = {}

-           parse_context[ 'thread_url' ] = self._thread_url
+           parse_context[ 'url' ] = url_to_check

+           all_parse_results = parser.Parse( parse_context, data )

+           subject = ClientParsing.GetTitleFromAllParseResults( all_parse_results )

+           if subject is None:
+               
+               subject = ''

            with self._lock:

-               self._thread_subject = ClientDownloading.ParseImageboardThreadSubject( raw_json )
+               self._thread_subject = subject

-           file_infos = ClientDownloading.ParseImageboardFileURLsFromJSON( self._thread_url, raw_json )
-           
-           new_urls = []
-           new_urls_set = set()
-           
-           file_urls_to_source_timestamps = {}
-           
-           for ( file_url, file_md5_base64, file_original_filename, source_timestamp ) in file_infos:
-               
-               if not self._urls_cache.HasURL( file_url ) and not file_url in new_urls_set:
-                   
-                   new_urls.append( file_url )
-                   new_urls_set.add( file_url )
-                   
-                   self._urls_to_filenames[ file_url ] = file_original_filename
-                   
-                   if file_md5_base64 is not None:
-                       
-                       self._urls_to_md5_base64[ file_url ] = file_md5_base64
-                   
-                   file_urls_to_source_timestamps[ file_url ] = source_timestamp
-           
-           seeds = []
-           
-           for url in new_urls:
-               
-               seed = Seed( SEED_TYPE_URL, url )
-               
-               if url in file_urls_to_source_timestamps:
-                   
-                   seed.source_time = file_urls_to_source_timestamps[ url ]
-               
-               seeds.append( seed )
-           
-           self._urls_cache.AddSeeds( seeds )
-           
-           num_new = len( new_urls )
+           ( num_new, num_already_in ) = UpdateSeedCacheWithAllParseResults( self._urls_cache, all_parse_results )

            watcher_status = 'thread checked OK - ' + HydrusData.ConvertIntToPrettyString( num_new ) + ' new urls'
            watcher_status_should_stick = False
@@ -5359,6 +5467,14 @@ class ThreadWatcherImport( HydrusSerialisable.SerialisableBase ):

                return

+           except HydrusExceptions.ParseException as e:
+               
+               error_occurred = True
+               
+               watcher_status = 'Was unable to parse the returned data! Full error written to log!'
+               
+               HydrusData.PrintException( e )

            except HydrusExceptions.NotFoundException:

                error_occurred = True

@@ -5387,6 +5503,31 @@ class ThreadWatcherImport( HydrusSerialisable.SerialisableBase ):

            HydrusData.PrintException( e )

+       self._FinishCheck( page_key, watcher_status, error_occurred, watcher_status_should_stick )


+   def _DelayWork( self, time_delta, reason ):
+       
+       self._no_work_until = HydrusData.GetNow() + time_delta
+       self._no_work_until_reason = reason


+   def _FinishCheck( self, page_key, watcher_status, error_occurred, watcher_status_should_stick ):
+       
+       if error_occurred:
+           
+           # the [DEAD] stuff can override watcher status, so let's give a brief time for this to display the error
+           
+           with self._lock:
+               
+               self._thread_paused = True
+               
+               self._watcher_status = watcher_status
+           
+           time.sleep( 5 )
+       
+       with self._lock:
+           
+           if self._check_now:
@@ -5402,19 +5543,9 @@ class ThreadWatcherImport( HydrusSerialisable.SerialisableBase ):

            self._UpdateNextCheckTime()

-           if error_occurred:
-               
-               self._thread_paused = True

            self._PublishPageName( page_key )

-       if error_occurred:
-           
-           time.sleep( 5 )

+       if not watcher_status_should_stick:
+           
+           time.sleep( 5 )

@@ -5426,12 +5557,6 @@ class ThreadWatcherImport( HydrusSerialisable.SerialisableBase ):

-   def _DelayWork( self, time_delta, reason ):
-       
-       self._no_work_until = HydrusData.GetNow() + time_delta
-       self._no_work_until_reason = reason

    def _GetSerialisableInfo( self ):

        serialisable_url_cache = self._urls_cache.GetSerialisableTuple()
@@ -5610,28 +5735,10 @@ class ThreadWatcherImport( HydrusSerialisable.SerialisableBase ):

            self._current_action = 'reviewing file'

-           file_original_filename = self._urls_to_filenames[ file_url ]
-           
-           downloaded_tags = [ 'filename:' + file_original_filename ]
-           
-           # we now do both url and md5 tests here because cloudflare was sometimes giving optimised versions of images, meaning the api's md5 was unreliable
-           # if someone set up a thread watcher of a thread they had previously watched, any optimised images would be redownloaded
-           
-           ( status, hash, note ) = HG.client_controller.Read( 'url_status', file_url )
-           
-           url_not_known_beforehand = status == CC.STATUS_NEW
-           
-           if status == CC.STATUS_NEW:
-               
-               if file_url in self._urls_to_md5_base64:
-                   
-                   file_md5_base64 = self._urls_to_md5_base64[ file_url ]
-                   
-                   file_md5 = file_md5_base64.decode( 'base64' )
-                   
-                   ( status, hash, note ) = HG.client_controller.Read( 'md5_status', file_md5 )

+           ( url_not_known_beforehand, ( status, hash, note ) ) = GetInitialSeedStatus( seed )

            if status == CC.STATUS_DELETED:

@@ -5735,7 +5842,9 @@ class ThreadWatcherImport( HydrusSerialisable.SerialisableBase ):

            with self._lock:

-               service_keys_to_content_updates = self._tag_import_options.GetServiceKeysToContentUpdates( hash, downloaded_tags )
+               tags = seed.GetTags()
+               
+               service_keys_to_content_updates = self._tag_import_options.GetServiceKeysToContentUpdates( hash, tags )

            if len( service_keys_to_content_updates ) > 0:

@@ -6016,6 +6125,16 @@ class ThreadWatcherImport( HydrusSerialisable.SerialisableBase ):

+   def SetThreadURL( self, thread_url ):
+       
+       if thread_url is None:
+           
+           thread_url = ''
+       
+       if thread_url != '':
+           
+           thread_url = HG.client_controller.network_engine.domain_manager.NormaliseURL( thread_url )
+       
+       with self._lock:
+           
+           self._thread_url = thread_url
@@ -222,7 +222,7 @@ class NetworkBandwidthManager( HydrusSerialisable.SerialisableBase ):

-   def CanDoWork( self, network_contexts, expected_requests = 3, expected_bytes = 1048576 ):
+   def CanDoWork( self, network_contexts, expected_requests = 3, expected_bytes = 1048576, threshold = 30 ):

        with self._lock:

@@ -232,7 +232,7 @@ class NetworkBandwidthManager( HydrusSerialisable.SerialisableBase ):

            bandwidth_tracker = self._network_contexts_to_bandwidth_trackers[ network_context ]

-           if not bandwidth_rules.CanDoWork( bandwidth_tracker, expected_requests, expected_bytes ):
+           if not bandwidth_rules.CanDoWork( bandwidth_tracker, expected_requests = expected_requests, expected_bytes = expected_bytes, threshold = threshold ):

                return False
@@ -16,7 +16,12 @@ import urlparse

def ConvertDomainIntoAllApplicableDomains( domain ):

    # is an ip address, possibly with a port
-   if re.search( '^[\d\.):]+$', domain ) is not None:
+   if re.search( r'^[\d\.):]+$', domain ) is not None:

        return [ domain ]

    if domain == 'localhost':

        return [ domain ]

@@ -41,6 +46,35 @@ def ConvertDomainIntoAllApplicableDomains( domain ):

def ConvertDomainIntoSecondLevelDomain( domain ):

    return ConvertDomainIntoAllApplicableDomains( domain )[-1]

+def ConvertURLMatchesIntoAPIPairs( url_matches ):
+   
+   pairs = []
+   
+   for url_match in url_matches:
+       
+       if not url_match.UsesAPIURL():
+           
+           continue
+       
+       api_url = url_match.GetAPIURL( url_match.GetExampleURL() )
+       
+       for other_url_match in url_matches:
+           
+           if other_url_match == url_match:
+               
+               continue
+           
+           if other_url_match.Matches( api_url ):
+               
+               pairs.append( ( url_match, other_url_match ) )
+   
+   return pairs
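The pairing logic in `ConvertURLMatchesIntoAPIPairs` above can be exercised standalone. This is a minimal sketch, not hydrus's real `URLMatch`: the `ToyURLMatch` class, its prefix-based `Matches`, and the example URLs are all invented for illustration.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ToyURLMatch:
    # invented stand-in for hydrus's URLMatch: a name, an example URL, a
    # prefix-based Matches() test, and an optional API URL conversion
    name: str
    example_url: str
    match_prefix: str
    api_prefix: Optional[str] = None  # None means no API conversion

    def UsesAPIURL(self):
        return self.api_prefix is not None

    def GetExampleURL(self):
        return self.example_url

    def GetAPIURL(self, url):
        # trivial conversion: move the last path component onto the api prefix
        return self.api_prefix + url.rsplit('/', 1)[-1]

    def Matches(self, url):
        return url.startswith(self.match_prefix)

def convert_url_matches_into_api_pairs(url_matches):
    # same shape as ConvertURLMatchesIntoAPIPairs in the diff: for every
    # match that converts to an API URL, find the match that recognises
    # that converted URL
    pairs = []
    for url_match in url_matches:
        if not url_match.UsesAPIURL():
            continue
        api_url = url_match.GetAPIURL(url_match.GetExampleURL())
        for other_url_match in url_matches:
            if other_url_match is url_match:
                continue
            if other_url_match.Matches(api_url):
                pairs.append((url_match, other_url_match))
    return pairs

thread_page = ToyURLMatch('thread page', 'http://board.example/t/123', 'http://board.example/t/', 'http://board.example/api/')
thread_api = ToyURLMatch('thread api', 'http://board.example/api/123', 'http://board.example/api/')

pairs = convert_url_matches_into_api_pairs([thread_page, thread_api])  # one pair: page -> api
```

Further down the diff, `SetURLMatches` uses exactly these pairs to strip parser links from matches whose URLs are always converted to an API URL first.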
def ConvertURLIntoDomain( url ):

@@ -439,6 +473,23 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):

+   def GetParser( self, url ):
+       
+       with self._lock:
+           
+           url_match = self._GetURLMatch( url )
+           
+           if url_match is None:
+               
+               return None
+           
+           parser = self._GetParser( url_match )
+           
+           return parser

    def GetParsers( self ):

        with self._lock:
@@ -465,30 +516,95 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):

    def GetURLParseCapability( self, url ):

-       url_match = self._GetURLMatch( url )
-       
-       if url_match is None:
-           
-           return ( HC.URL_TYPE_UNKNOWN, 'unknown url', False )
-       
-       url_type = url_match.GetURLType()
-       match_name = url_match.GetName()
-       
-       parser = self._GetParser( url_match )
-       
-       if parser is None:
-           
-           can_parse = False
-           
-       else:
-           
-           can_parse = True

+       with self._lock:
+           
+           url_match = self._GetURLMatch( url )
+           
+           if url_match is None:
+               
+               return ( HC.URL_TYPE_UNKNOWN, 'unknown url', False )
+           
+           url_type = url_match.GetURLType()
+           match_name = url_match.GetName()
+           
+           parser_url_match = url_match
+           
+           while parser_url_match.UsesAPIURL():
+               
+               api_url = parser_url_match.GetAPIURL( url )
+               
+               parser_url_match = self._GetURLMatch( api_url )
+               
+               if parser_url_match is None:
+                   
+                   break
+           
+           if parser_url_match is None:
+               
+               can_parse = False
+               
+           else:
+               
+               parser = self._GetParser( parser_url_match )
+               
+               if parser is None:
+                   
+                   can_parse = False
+                   
+               else:
+                   
+                   can_parse = True

            return ( url_type, match_name, can_parse )


+   def GetURLToFetchAndParser( self, url ):
+       
+       with self._lock:
+           
+           url_match = self._GetURLMatch( url )
+           
+           url = url_match.Normalise( url )
+           
+           if url_match is None:
+               
+               return ( url, None )
+           
+           fetch_url = url
+           parser_url_match = url_match
+           
+           while parser_url_match.UsesAPIURL():
+               
+               fetch_url = parser_url_match.GetAPIURL( url )
+               
+               parser_url_match = self._GetURLMatch( fetch_url )
+               
+               if parser_url_match is None:
+                   
+                   break
+           
+           if parser_url_match is None:
+               
+               parser = None
+               
+           else:
+               
+               parser = self._GetParser( parser_url_match )
+           
+           return ( fetch_url, parser )
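The chain-following loop in `GetURLToFetchAndParser` above is worth isolating: it repeatedly swaps a URL for its API equivalent until it reaches a match with no further conversion, or dead-ends. A minimal standalone sketch, with dict-backed lookups standing in for the domain manager's internals (all class names and URLs here are invented):

```python
def get_url_to_fetch_and_parser(url, get_url_match, get_parser):
    # sketch of the flow above: follow UsesAPIURL() conversions until we
    # land on a url match we can hand to a parser directly
    url_match = get_url_match(url)
    if url_match is None:
        return (url, None)
    fetch_url = url
    parser_url_match = url_match
    while parser_url_match.UsesAPIURL():
        fetch_url = parser_url_match.GetAPIURL(url)
        parser_url_match = get_url_match(fetch_url)
        if parser_url_match is None:
            break  # the API url is not recognised, so no parser is available
    parser = None if parser_url_match is None else get_parser(parser_url_match)
    return (fetch_url, parser)

class ToyMatch:
    def __init__(self, api_url=None):
        self._api_url = api_url
    def UsesAPIURL(self):
        return self._api_url is not None
    def GetAPIURL(self, url):
        return self._api_url

html_match = ToyMatch(api_url='http://example.com/api/thread/1')  # html page redirects to a json api
api_match = ToyMatch()  # the api url itself is fetched and parsed directly

matches = {
    'http://example.com/thread/1': html_match,
    'http://example.com/api/thread/1': api_match,
}

result = get_url_to_fetch_and_parser(
    'http://example.com/thread/1',
    matches.get,
    lambda url_match: 'json parser' if url_match is api_match else None,
)
# result is ('http://example.com/api/thread/1', 'json parser')
```

This is why the thread watcher above fetches `url_to_check` rather than the URL the user pasted.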

    def Initialise( self ):

        self._RecalcCache()
@@ -647,6 +763,20 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):

            del self._url_match_keys_to_parser_keys[ deletee_key ]

+       # any url matches that link to another via the API conversion will not be using parsers
+       
+       url_match_api_pairs = ConvertURLMatchesIntoAPIPairs( self._url_matches )
+       
+       for ( url_match_original, url_match_api ) in url_match_api_pairs:
+           
+           url_match_key = url_match_original.GetMatchKey()
+           
+           if url_match_key in self._url_match_keys_to_parser_keys:
+               
+               del self._url_match_keys_to_parser_keys[ url_match_key ]

        self._RecalcCache()

        self._SetDirty()

@@ -667,6 +797,75 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):

+   def TryToLinkURLMatchesAndParsers( self ):
+       
+       with self._lock:
+           
+           new_url_match_keys_to_parser_keys = NetworkDomainManager.STATICLinkURLMatchesAndParsers( self._url_matches, self._parsers, self._url_match_keys_to_parser_keys )
+           
+           self._url_match_keys_to_parser_keys.update( new_url_match_keys_to_parser_keys )
+           
+           self._SetDirty()


+   @staticmethod
+   def STATICLinkURLMatchesAndParsers( url_matches, parsers, existing_url_match_keys_to_parser_keys ):
+       
+       new_url_match_keys_to_parser_keys = {}
+       
-       for url_match in url_matches:
+       api_pairs = ConvertURLMatchesIntoAPIPairs( url_matches )
+       
+       # anything that goes to an api url will be parsed by that api's parser--it can't have its own
+       api_pair_unparsable_url_matches = set()
+       
+       for ( a, b ) in api_pairs:
+           
+           api_pair_unparsable_url_matches.add( a )
+       
+       #
+       
+       listctrl_data = []
+       
+       for url_match in url_matches:
+           
+           if not url_match.IsParsable() or url_match in api_pair_unparsable_url_matches:
+               
+               continue
+           
+           if not url_match.IsWatchableURL(): # only starting with the thread watcher atm
+               
+               continue
+           
+           url_match_key = url_match.GetMatchKey()
+           
+           if url_match_key in existing_url_match_keys_to_parser_keys:
+               
+               continue
+           
+           for parser in parsers:
+               
+               example_urls = parser.GetExampleURLs()
+               
+               if True in ( url_match.Matches( example_url ) for example_url in example_urls ):
+                   
+                   new_url_match_keys_to_parser_keys[ url_match_key ] = parser.GetParserKey()
+                   
+                   break
+       
+       return new_url_match_keys_to_parser_keys


HydrusSerialisable.SERIALISABLE_TYPES_TO_OBJECT_TYPES[ HydrusSerialisable.SERIALISABLE_TYPE_NETWORK_DOMAIN_MANAGER ] = NetworkDomainManager

class DomainValidationPopupProcess( object ):
@@ -1073,5 +1272,10 @@ class URLMatch( HydrusSerialisable.SerialisableBaseNamed ):

        return ( self._url_type, self._preferred_scheme, self._netloc, self._allow_subdomains, self._keep_subdomains, self._path_components, self._parameters, self._api_lookup_converter, self._example_url )

+   def UsesAPIURL( self ):
+       
+       return self._api_lookup_converter.MakesChanges()


HydrusSerialisable.SERIALISABLE_TYPES_TO_OBJECT_TYPES[ HydrusSerialisable.SERIALISABLE_TYPE_URL_MATCH ] = URLMatch

@@ -1,4 +1,5 @@

import bs4
import calendar
import ClientNetworking
import collections
import HydrusConstants as HC

@@ -13,13 +14,26 @@ import re

import time
import urlparse
-def ConvertContentResultToPrettyString( result ):
+def ConvertParseResultToPrettyString( result ):

    ( ( name, content_type, additional_info ), parsed_text ) = result

    if content_type == HC.CONTENT_TYPE_URLS:

-       return 'url: ' + parsed_text
+       ( url_type, priority ) = additional_info
+       
+       if url_type == HC.URL_TYPE_FILE:
+           
+           return 'file url: ' + parsed_text
+           
+       elif url_type == HC.URL_TYPE_POST:
+           
+           return 'post url: ' + parsed_text
+           
+       elif url_type == HC.URL_TYPE_NEXT:
+           
+           return 'next page url: ' + parsed_text

    elif content_type == HC.CONTENT_TYPE_MAPPINGS:

@@ -29,6 +43,32 @@ def ConvertContentResultToPrettyString( result ):

        return additional_info + ' hash: ' + parsed_text.encode( 'hex' )

+   elif content_type == HC.CONTENT_TYPE_TIMESTAMP:
+       
+       timestamp_type = additional_info
+       
+       try:
+           
+           timestamp = int( parsed_text )
+           
+           timestamp_string = HydrusData.ConvertTimestampToPrettyTime( timestamp )
+           
+       except:
+           
+           timestamp_string = 'could not convert to integer'
+       
+       if timestamp_type == HC.TIMESTAMP_TYPE_SOURCE:
+           
+           return 'source time: ' + timestamp_string

+   elif content_type == HC.CONTENT_TYPE_TITLE:
+       
+       priority = additional_info
+       
+       return 'thread watcher page title (priority ' + str( priority ) + '): ' + parsed_text

    elif content_type == HC.CONTENT_TYPE_VETO:

        return 'veto'
@@ -50,7 +90,21 @@ def ConvertParsableContentToPrettyString( parsable_content, include_veto = False ):

        if content_type == HC.CONTENT_TYPE_URLS:

-           pretty_strings.append( 'urls' )
+           for ( url_type, priority ) in additional_infos:
+               
+               if url_type == HC.URL_TYPE_FILE:
+                   
+                   pretty_strings.append( 'file url' )
+                   
+               elif url_type == HC.URL_TYPE_POST:
+                   
+                   pretty_strings.append( 'post url' )
+                   
+               elif url_type == HC.URL_TYPE_NEXT:
+                   
+                   pretty_strings.append( 'gallery next page url' )

        elif content_type == HC.CONTENT_TYPE_MAPPINGS:

@@ -80,6 +134,20 @@ def ConvertParsableContentToPrettyString( parsable_content, include_veto = False ):

            pretty_strings.append( 'hashes: ' + ', '.join( hash_types ) )

+       elif content_type == HC.CONTENT_TYPE_TIMESTAMP:
+           
+           for timestamp_type in additional_infos:
+               
+               if timestamp_type == HC.TIMESTAMP_TYPE_SOURCE:
+                   
+                   pretty_strings.append( 'source time' )

+       elif content_type == HC.CONTENT_TYPE_TITLE:
+           
+           pretty_strings.append( 'thread watcher page title' )

        elif content_type == HC.CONTENT_TYPE_VETO:

            if include_veto:
@@ -112,7 +180,7 @@ def GetChildrenContent( job_key, children, data, referral_url ):

        elif isinstance( child, ContentParser ):

-           child_content = child.Parse( data )
+           child_content = child.Parse( {}, data )

    except HydrusExceptions.VetoException:

@@ -125,7 +193,7 @@ def GetChildrenContent( job_key, children, data, referral_url ):

    return content

-def GetHashesFromContentResults( results ):
+def GetHashesFromParseResults( results ):

    hash_results = []

@@ -139,7 +207,7 @@ def GetHashesFromContentResults( results ):

    return hash_results

-def GetTagsFromContentResults( results ):
+def GetTagsFromParseResults( results ):

    tag_results = []
@@ -155,7 +223,72 @@ def GetTagsFromContentResults( results ):

    return tag_results

-def GetURLsFromContentResults( results ):
+def GetTimestampFromParseResults( results, desired_timestamp_type ):
+   
+   timestamp_results = []
+   
+   for ( ( name, content_type, additional_info ), parsed_text ) in results:
+       
+       if content_type == HC.CONTENT_TYPE_TIMESTAMP:
+           
+           timestamp_type = additional_info
+           
+           if timestamp_type == desired_timestamp_type:
+               
+               try:
+                   
+                   timestamp = int( parsed_text )
+                   
+               except:
+                   
+                   continue
+               
+               timestamp_results.append( timestamp )
+   
+   if len( timestamp_results ) == 0:
+       
+       return None
+       
+   else:
+       
+       return min( timestamp_results )

+def GetTitleFromAllParseResults( all_parse_results ):
+   
+   titles = []
+   
+   for results in all_parse_results:
+       
+       for ( ( name, content_type, additional_info ), parsed_text ) in results:
+           
+           if content_type == HC.CONTENT_TYPE_TITLE:
+               
+               priority = additional_info
+               
+               titles.append( ( priority, parsed_text ) )
+   
+   if len( titles ) > 0:
+       
+       titles.sort( reverse = True ) # highest priority first
+       
+       ( priority, title ) = titles[0]
+       
+       return title
+       
+   else:
+       
+       return None
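`GetTitleFromAllParseResults` above reduces to a simple priority sort. A standalone sketch (the string constant and sample results are invented stand-ins; hydrus uses `HC.CONTENT_TYPE_TITLE` and real parse results):

```python
CONTENT_TYPE_TITLE = 'title'  # stand-in for HC.CONTENT_TYPE_TITLE

def get_title_from_all_parse_results(all_parse_results):
    # collect every title result as ( priority, text ) and keep the text
    # carrying the highest priority, as in the diff above
    titles = []
    for results in all_parse_results:
        for ((name, content_type, additional_info), parsed_text) in results:
            if content_type == CONTENT_TYPE_TITLE:
                priority = additional_info
                titles.append((priority, parsed_text))
    if titles:
        titles.sort(reverse=True)  # highest priority first
        (priority, title) = titles[0]
        return title
    return None

all_parse_results = [
    [(('subject', CONTENT_TYPE_TITLE, 10), 'thread subject')],
    [(('fallback', CONTENT_TYPE_TITLE, 0), 'page title tag')],
]

best = get_title_from_all_parse_results(all_parse_results)  # 'thread subject'
```

The priority field lets a parser provide a low-priority fallback (say, the html `<title>`) that only wins when nothing better is parsed.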
+def GetURLsFromParseResults( results, desired_url_types ):

    url_results = collections.defaultdict( list )

@@ -163,15 +296,13 @@ def GetURLsFromContentResults( results ):

        if content_type == HC.CONTENT_TYPE_URLS:

-           priority = additional_info
+           ( url_type, priority ) = additional_info

-           if priority is None:
+           if url_type in desired_url_types:

-               priority = -1
+               url_results[ priority ].append( parsed_text )

-           url_results[ priority ].append( parsed_text )

    # ( priority, url_list ) pairs

@@ -184,9 +315,16 @@ def GetURLsFromContentResults( results ):

    # url_lists of descending priority

-   url_results = [ url_list for ( priority, url_list ) in url_results ]
+   if len( url_results ) > 0:
+       
+       ( priority, url_list ) = url_results[0]
+       
+   else:
+       
+       url_list = []

-   return url_results
+   return url_list
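`GetURLsFromParseResults` above filters by URL type and then keeps only the highest-priority bucket. A standalone sketch (the constants and sample data are invented stand-ins; hydrus uses `HC.CONTENT_TYPE_URLS` and `HC.URL_TYPE_FILE` etc.):

```python
import collections

CONTENT_TYPE_URLS = 'urls'  # stand-in for HC.CONTENT_TYPE_URLS

def get_urls_from_parse_results(results, desired_url_types):
    # bucket the matching urls by priority, then return the single
    # highest-priority bucket, as in the diff above
    url_results = collections.defaultdict(list)
    for ((name, content_type, additional_info), parsed_text) in results:
        if content_type == CONTENT_TYPE_URLS:
            (url_type, priority) = additional_info
            if url_type in desired_url_types:
                url_results[priority].append(parsed_text)
    buckets = sorted(url_results.items(), reverse=True)  # ( priority, url_list ), highest first
    if buckets:
        (priority, url_list) = buckets[0]
        return url_list
    return []

results = [
    (('full image', CONTENT_TYPE_URLS, ('file', 50)), 'http://example.com/full.jpg'),
    (('thumbnail', CONTENT_TYPE_URLS, ('file', 10)), 'http://example.com/thumb.jpg'),
    (('post page', CONTENT_TYPE_URLS, ('post', 50)), 'http://example.com/post/1'),
]

file_urls = get_urls_from_parse_results(results, {'file'})  # ['http://example.com/full.jpg']
```

Returning only the top bucket means a parser can offer a thumbnail url as a low-priority fallback without it ever being downloaded when the full-size url parses successfully.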
def RenderJSONParseRule( parse_rule ):

@@ -251,14 +389,14 @@ class ParseFormula( HydrusSerialisable.SerialisableBase ):

        self._string_converter = string_converter

-   def _ParseRawContents( self, data ):
+   def _ParseRawContents( self, parse_context, data ):

        raise NotImplementedError()

-   def Parse( self, data ):
+   def Parse( self, parse_context, data ):

-       raw_contents = self._ParseRawContents( data )
+       raw_contents = self._ParseRawContents( parse_context, data )

        contents = []

@@ -341,7 +479,7 @@ class ParseFormulaCompound( ParseFormula ):

        self._string_converter = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_string_converter )

-   def _ParseRawContents( self, data ):
+   def _ParseRawContents( self, parse_context, data ):

        def get_stream_data( index, s ):

@@ -363,7 +501,14 @@ class ParseFormulaCompound( ParseFormula ):

        for formula in self._formulae:

-           streams.append( formula.Parse( data ) )
+           stream = formula.Parse( parse_context, data )
+           
+           if len( stream ) == 0: # no contents were found for one of the /1 replace components, so no valid strings can be made.
+               
+               return []
+           
+           streams.append( stream )

        num_raw_contents_to_make = max( ( len( stream ) for stream in streams ) )

@@ -419,6 +564,77 @@ class ParseFormulaCompound( ParseFormula ):

HydrusSerialisable.SERIALISABLE_TYPES_TO_OBJECT_TYPES[ HydrusSerialisable.SERIALISABLE_TYPE_PARSE_FORMULA_COMPOUND ] = ParseFormulaCompound

+class ParseFormulaContextVariable( ParseFormula ):
+   
+   SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_PARSE_FORMULA_CONTEXT_VARIABLE
+   SERIALISABLE_NAME = 'Context Variable Formula'
+   SERIALISABLE_VERSION = 1
+   
+   def __init__( self, variable_name = None, string_match = None, string_converter = None ):
+       
+       ParseFormula.__init__( self, string_match, string_converter )
+       
+       if variable_name is None:
+           
+           variable_name = 'url'
+       
+       self._variable_name = variable_name
+   
+   def _GetSerialisableInfo( self ):
+       
+       serialisable_string_match = self._string_match.GetSerialisableTuple()
+       serialisable_string_converter = self._string_converter.GetSerialisableTuple()
+       
+       return ( self._variable_name, serialisable_string_match, serialisable_string_converter )
+   
+   def _InitialiseFromSerialisableInfo( self, serialisable_info ):
+       
+       ( self._variable_name, serialisable_string_match, serialisable_string_converter ) = serialisable_info
+       
+       self._string_match = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_string_match )
+       self._string_converter = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_string_converter )
+   
+   def _ParseRawContents( self, parse_context, data ):
+       
+       raw_contents = []
+       
+       if self._variable_name in parse_context:
+           
+           raw_contents.append( parse_context[ self._variable_name ] )
+       
+       return raw_contents
+   
+   def ToPrettyString( self ):
+       
+       return 'CONTEXT VARIABLE: ' + self._variable_name
+   
+   def ToPrettyMultilineString( self ):
+       
+       s = []
+       
+       s.append( 'fetch the "' + self._variable_name + '" variable from the parsing context' )
+       
+       separator = os.linesep * 2
+       
+       text = '--CONTEXT VARIABLE--' + os.linesep * 2 + separator.join( s )
+       
+       return text
+   
+   def ToTuple( self ):
+       
+       return ( self._variable_name, self._string_match, self._string_converter )

+HydrusSerialisable.SERIALISABLE_TYPES_TO_OBJECT_TYPES[ HydrusSerialisable.SERIALISABLE_TYPE_PARSE_FORMULA_CONTEXT_VARIABLE ] = ParseFormulaContextVariable

HTML_CONTENT_ATTRIBUTE = 0
HTML_CONTENT_STRING = 1
HTML_CONTENT_HTML = 2
@@ -520,7 +736,16 @@ class ParseFormulaHTML( ParseFormula ):

        elif self._content_to_fetch == HTML_CONTENT_STRING:

-           result = tag.string
+           all_strings = [ s for s in tag.strings if len( s ) > 0 ]
+           
+           if len( all_strings ) == 0:
+               
+               result = ''
+               
+           else:
+               
+               result = all_strings[0]

        elif self._content_to_fetch == HTML_CONTENT_HTML:

@@ -572,7 +797,7 @@ class ParseFormulaHTML( ParseFormula ):

        self._string_converter = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_string_converter )

-   def _ParseRawContents( self, data ):
+   def _ParseRawContents( self, parse_context, data ):

        root = bs4.BeautifulSoup( data, 'lxml' )

@@ -841,7 +1066,7 @@ class ParseFormulaJSON( ParseFormula ):

        self._string_converter = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_string_converter )

-   def _ParseRawContents( self, data ):
+   def _ParseRawContents( self, parse_context, data ):

        j = json.loads( data )

@@ -893,7 +1118,7 @@ class ContentParser( HydrusSerialisable.SerialisableBase ):

    SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_CONTENT_PARSER
    SERIALISABLE_NAME = 'Content Parser'
-   SERIALISABLE_VERSION = 1
+   SERIALISABLE_VERSION = 2

    def __init__( self, name = None, content_type = None, formula = None, additional_info = None ):
|
@ -930,21 +1155,75 @@ class ContentParser( HydrusSerialisable.SerialisableBase ):
|
|||
|
||||
serialisable_formula = self._formula.GetSerialisableTuple()
|
||||
|
||||
return ( self._name, self._content_type, serialisable_formula, self._additional_info )
|
||||
if self._content_type == HC.CONTENT_TYPE_VETO:
|
||||
|
||||
( veto_if_matches_found, string_match ) = self._additional_info
|
||||
|
||||
serialisable_additional_info = ( veto_if_matches_found, string_match.GetSerialisableTuple() )
|
||||
|
||||
else:
|
||||
|
||||
serialisable_additional_info = self._additional_info
|
||||
|
||||
|
||||
return ( self._name, self._content_type, serialisable_formula, serialisable_additional_info )
|
||||
|
||||
|
||||
def _InitialiseFromSerialisableInfo( self, serialisable_info ):
|
||||
|
||||
( self._name, self._content_type, serialisable_formula, self._additional_info ) = serialisable_info
|
||||
( self._name, self._content_type, serialisable_formula, serialisable_additional_info ) = serialisable_info
|
||||
|
||||
if isinstance( self._additional_info, list ):
|
||||
if self._content_type == HC.CONTENT_TYPE_VETO:
|
||||
|
||||
self._additional_info = tuple( self._additional_info )
|
||||
( veto_if_matches_found, serialisable_string_match ) = serialisable_additional_info
|
||||
|
||||
string_match = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_string_match )
|
||||
|
||||
self._additional_info = ( veto_if_matches_found, string_match )
|
||||
|
||||
else:
|
||||
|
||||
self._additional_info = serialisable_additional_info
|
||||
|
||||
if isinstance( self._additional_info, list ):
|
||||
|
||||
self._additional_info = tuple( self._additional_info )
|
||||
|
||||
|
||||
|
||||
self._formula = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_formula )
|
||||
|
||||
|
||||
def _UpdateSerialisableInfo( self, version, old_serialisable_info ):
|
||||
|
||||
if version == 1:
|
||||
|
||||
( name, content_type, serialisable_formula, additional_info ) = old_serialisable_info
|
||||
|
||||
if content_type == HC.CONTENT_TYPE_VETO:
|
||||
|
||||
( veto_if_matches_found, match_if_text_present, search_text ) = additional_info
|
||||
|
||||
if match_if_text_present:
|
||||
|
||||
string_match = StringMatch( match_type = STRING_MATCH_REGEX, match_value = search_text, example_string = search_text )
|
||||
|
||||
else:
|
||||
|
||||
string_match = StringMatch()
|
||||
|
||||
|
||||
serialisable_string_match = string_match.GetSerialisableTuple()
|
||||
|
||||
additional_info = ( veto_if_matches_found, serialisable_string_match )
|
||||
|
||||
|
||||
new_serialisable_info = ( name, content_type, serialisable_formula, additional_info )
|
||||
|
||||
return ( 2, new_serialisable_info )
|
||||
|
||||
|
||||
|
||||
def GetName( self ):
|
||||
|
||||
return self._name
|
||||
|
@@ -955,24 +1234,19 @@ class ContentParser( HydrusSerialisable.SerialisableBase ):

        return { ( self._name, self._content_type, self._additional_info ) }

-   def Parse( self, data ):
+   def Parse( self, parse_context, data ):

-       parsed_texts = self._formula.Parse( data )
+       parsed_texts = self._formula.Parse( parse_context, data )

        if self._content_type == HC.CONTENT_TYPE_VETO:

-           ( veto_if_matches_found, match_if_text_present, search_text ) = self._additional_info
+           ( veto_if_matches_found, string_match ) = self._additional_info

-           if match_if_text_present:
-               
-               match_found = True in [ search_text in parsed_text for parsed_text in parsed_texts ]
-               
-           else:
-               
-               match_found = True not in [ search_text in parsed_text for parsed_text in parsed_texts ]

+           match_found = True in ( string_match.Matches( parsed_text ) for parsed_text in parsed_texts )

-           do_veto = ( veto_if_matches_found and match_found ) or ( not veto_if_matches_found and not match_found )
+           veto_if_missing = not veto_if_matches_found
+           
+           do_veto = ( veto_if_matches_found and match_found ) or ( veto_if_missing and not match_found )

            if do_veto:
@@ -1012,9 +1286,9 @@ class PageParser( HydrusSerialisable.SerialisableBaseNamed ):

    SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_PAGE_PARSER
    SERIALISABLE_NAME = 'Page Parser'
-    SERIALISABLE_VERSION = 1
+    SERIALISABLE_VERSION = 2
    
-    def __init__( self, name, parser_key = None, string_converter = None, sub_page_parsers = None, content_parsers = None, example_urls = None ):
+    def __init__( self, name, parser_key = None, string_converter = None, sub_page_parsers = None, content_parsers = None, example_urls = None, example_parsing_context = None ):
        
        if parser_key is None:
@@ -1041,6 +1315,13 @@ class PageParser( HydrusSerialisable.SerialisableBaseNamed ):

            example_urls = []
            
+        if example_parsing_context is None:
+            
+            example_parsing_context = {}
+            
+            example_parsing_context[ 'url' ] = 'http://example.com/posts/index.php?id=123456'
+            
        HydrusSerialisable.SerialisableBaseNamed.__init__( self, name )
        
        self._parser_key = parser_key
@@ -1048,6 +1329,7 @@ class PageParser( HydrusSerialisable.SerialisableBaseNamed ):

        self._sub_page_parsers = sub_page_parsers
        self._content_parsers = content_parsers
        self._example_urls = example_urls
+        self._example_parsing_context = example_parsing_context
        
    
    def _GetSerialisableInfo( self ):
@@ -1059,12 +1341,12 @@ class PageParser( HydrusSerialisable.SerialisableBaseNamed ):

        serialisable_content_parsers = HydrusSerialisable.SerialisableList( self._content_parsers ).GetSerialisableTuple()
        
-        return ( self._name, serialisable_parser_key, serialisable_string_converter, serialisable_sub_page_parsers, serialisable_content_parsers, self._example_urls )
+        return ( self._name, serialisable_parser_key, serialisable_string_converter, serialisable_sub_page_parsers, serialisable_content_parsers, self._example_urls, self._example_parsing_context )
        
    
    def _InitialiseFromSerialisableInfo( self, serialisable_info ):
        
-        ( self._name, serialisable_parser_key, serialisable_string_converter, serialisable_sub_page_parsers, serialisable_content_parsers, self._example_urls ) = serialisable_info
+        ( self._name, serialisable_parser_key, serialisable_string_converter, serialisable_sub_page_parsers, serialisable_content_parsers, self._example_urls, self._example_parsing_context ) = serialisable_info
        
        self._parser_key = serialisable_parser_key.decode( 'hex' )
        self._string_converter = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_string_converter )
@@ -1072,11 +1354,31 @@ class PageParser( HydrusSerialisable.SerialisableBaseNamed ):

        self._content_parsers = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_content_parsers )
        
    
+    def _UpdateSerialisableInfo( self, version, old_serialisable_info ):
+        
+        if version == 1:
+            
+            ( name, serialisable_parser_key, serialisable_string_converter, serialisable_sub_page_parsers, serialisable_content_parsers, example_urls ) = old_serialisable_info
+            
+            example_parsing_context = {}
+            
+            example_parsing_context[ 'url' ] = 'http://example.com/posts/index.php?id=123456'
+            
+            new_serialisable_info = ( name, serialisable_parser_key, serialisable_string_converter, serialisable_sub_page_parsers, serialisable_content_parsers, example_urls, example_parsing_context )
+            
+            return ( 2, new_serialisable_info )
+        
+    
    def GetContentParsers( self ):
        
        return ( self._sub_page_parsers, self._content_parsers )
        
    
+    def GetExampleParsingContext( self ):
+        
+        return self._example_parsing_context
+        
+    
    def GetExampleURLs( self ):
        
        return self._example_urls
@@ -1109,7 +1411,7 @@ class PageParser( HydrusSerialisable.SerialisableBaseNamed ):

        return self._string_converter
        
    
-    def Parse( self, page_data ):
+    def Parse( self, parse_context, page_data ):
        
        try:
@@ -1122,20 +1424,29 @@ class PageParser( HydrusSerialisable.SerialisableBaseNamed ):

        #
        
-        whole_page_content_results = []
+        whole_page_parse_results = []
        
        for content_parser in self._content_parsers:
            
-            whole_page_content_results.extend( content_parser.Parse( converted_page_data ) )
+            try:
+                
+                whole_page_parse_results.extend( content_parser.Parse( parse_context, converted_page_data ) )
+                
+            except HydrusExceptions.VetoException:
+                
+                return []
            
        
        #
        
+        all_parse_results = []
+        
        if len( self._sub_page_parsers ) == 0:
            
-            if len( whole_page_content_results ) > 0:
+            if len( whole_page_parse_results ) > 0:
                
-                all_content_results = [ whole_page_content_results ]
+                all_parse_results = [ whole_page_parse_results ]
                
            
        else:
@@ -1151,27 +1462,25 @@ class PageParser( HydrusSerialisable.SerialisableBaseNamed ):

            sub_page_parsers.sort( key = sort_key )
            
-            all_content_results = []
-            
            for ( formula, page_parser ) in self._sub_page_parsers:
                
-                posts = formula.Parse( converted_page_data )
+                posts = formula.Parse( parse_context, converted_page_data )
                
                for post in posts:
                    
-                    page_parser_all_content_results = page_parser.Parse( post )
+                    page_parser_all_parse_results = page_parser.Parse( parse_context, post )
                    
-                    for page_parser_content_results in page_parser_all_content_results:
+                    for page_parser_parse_results in page_parser_all_parse_results:
                        
-                        page_parser_content_results.extend( whole_page_content_results )
+                        page_parser_parse_results.extend( whole_page_parse_results )
                        
-                        all_content_results.append( page_parser_content_results )
+                        all_parse_results.append( page_parser_parse_results )
                        
                    
                
            
        
-        return all_content_results
+        return all_parse_results
        
    
    def RegenerateParserKey( self ):
@@ -1305,7 +1614,7 @@ class ParseNodeContentLink( HydrusSerialisable.SerialisableBase ):

    def ParseURLs( self, job_key, data, referral_url ):
        
-        basic_urls = self._formula.Parse( data )
+        basic_urls = self._formula.Parse( {}, data )
        
        absolute_urls = [ urlparse.urljoin( referral_url, basic_url ) for basic_url in basic_urls ]
@@ -1413,6 +1722,7 @@ class ParseRootFileLookup( HydrusSerialisable.SerialisableBaseNamed ):

            return ( 2, new_serialisable_info )
        
    
    def ConvertMediaToFileIdentifier( self, media ):
        
        if self._file_identifier_type == FILE_IDENTIFIER_TYPE_USER_INPUT:
@@ -1579,9 +1889,9 @@ class ParseRootFileLookup( HydrusSerialisable.SerialisableBaseNamed ):

                return []
                
            
-            content_results = self.Parse( job_key, data )
+            parse_results = self.Parse( job_key, data )
            
-            return content_results
+            return parse_results
            
        except HydrusExceptions.CancelledException:
@@ -1602,18 +1912,18 @@ class ParseRootFileLookup( HydrusSerialisable.SerialisableBaseNamed ):

    def Parse( self, job_key, data ):
        
-        content_results = GetChildrenContent( job_key, self._children, data, self._url )
+        parse_results = GetChildrenContent( job_key, self._children, data, self._url )
        
-        if len( content_results ) == 0:
+        if len( parse_results ) == 0:
            
            job_key.SetVariable( 'script_status', 'Did not find anything.' )
            
        else:
            
-            job_key.SetVariable( 'script_status', 'Found ' + HydrusData.ConvertIntToPrettyString( len( content_results ) ) + ' rows.' )
+            job_key.SetVariable( 'script_status', 'Found ' + HydrusData.ConvertIntToPrettyString( len( parse_results ) ) + ' rows.' )
            
        
-        return content_results
+        return parse_results
        
    
    def SetChildren( self, children ):
@@ -1643,6 +1953,7 @@ STRING_TRANSFORMATION_CLIP_TEXT_FROM_BEGINNING = 6

STRING_TRANSFORMATION_CLIP_TEXT_FROM_END = 7
STRING_TRANSFORMATION_REVERSE = 8
STRING_TRANSFORMATION_REGEX_SUB = 9
+STRING_TRANSFORMATION_DATE_DECODE = 10

transformation_type_str_lookup = {}
@@ -1656,6 +1967,7 @@ transformation_type_str_lookup[ STRING_TRANSFORMATION_CLIP_TEXT_FROM_BEGINNING ]

transformation_type_str_lookup[ STRING_TRANSFORMATION_CLIP_TEXT_FROM_END ] = 'take the end of the string'
transformation_type_str_lookup[ STRING_TRANSFORMATION_REVERSE ] = 'reverse text'
transformation_type_str_lookup[ STRING_TRANSFORMATION_REGEX_SUB ] = 'regex substitution'
+transformation_type_str_lookup[ STRING_TRANSFORMATION_DATE_DECODE ] = 'date decode'

class StringConverter( HydrusSerialisable.SerialisableBase ):
@@ -1695,7 +2007,7 @@ class StringConverter( HydrusSerialisable.SerialisableBase ):

        for ( transformation_type, data ) in serialisable_transformations:
            
-            if transformation_type == STRING_TRANSFORMATION_REGEX_SUB:
+            if isinstance( data, list ):
                
                data = tuple( data ) # convert from list to tuple thing
@@ -1770,6 +2082,34 @@ class StringConverter( HydrusSerialisable.SerialisableBase ):

                    s = re.sub( pattern, repl, s, flags = re.UNICODE )
                    
+                elif transformation_type == STRING_TRANSFORMATION_DATE_DECODE:
+                    
+                    ( phrase, timezone, timezone_offset ) = data
+                    
+                    struct_time = time.strptime( s, phrase )
+                    
+                    if timezone == HC.TIMEZONE_GMT:
+                        
+                        # the given struct is in GMT, so calendar.timegm is appropriate here
+                        
+                        timestamp = int( calendar.timegm( struct_time ) )
+                        
+                    elif timezone == HC.TIMEZONE_LOCAL:
+                        
+                        # the given struct is in local time, so time.mktime is correct
+                        
+                        timestamp = int( time.mktime( struct_time ) )
+                        
+                    elif timezone == HC.TIMEZONE_OFFSET:
+                        
+                        # the given struct is in server time, which is the same as GMT minus an offset
+                        # if we are 7200 seconds ahead, the correct GMT timestamp needs to be 7200 smaller
+                        
+                        timestamp = int( calendar.timegm( struct_time ) ) - timezone_offset
+                        
+                    
+                    s = str( timestamp )
                
            except:
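The three timezone branches of the new 'date decode' transformation can be sketched as a standalone helper. Everything here is illustrative (the function name, the string timezone labels, and the format phrase are not hydrus's own), but the `calendar.timegm` vs `time.mktime` choice is the same as in the diff above:

```python
import calendar
import time

def decode_date( s, phrase, timezone, timezone_offset = 0 ):
    # parse the string into a struct_time according to the given format phrase
    struct_time = time.strptime( s, phrase )
    
    if timezone == 'gmt':
        # the struct is already GMT: calendar.timegm treats its input as UTC
        return int( calendar.timegm( struct_time ) )
    elif timezone == 'local':
        # the struct is local time: time.mktime applies the local UTC offset
        return int( time.mktime( struct_time ) )
    else:
        # the struct is server time = GMT plus an offset, so subtract the offset
        return int( calendar.timegm( struct_time ) ) - timezone_offset

decode_date( '1970-01-01 02:00:00', '%Y-%m-%d %H:%M:%S', 'gmt' )  # 7200
```

The key asymmetry: `timegm` ignores the machine's timezone entirely, while `mktime` bakes it in, so a parser running on two machines only agrees on the timestamp when the source text's timezone is known.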
@@ -1790,6 +2130,11 @@ class StringConverter( HydrusSerialisable.SerialisableBase ):

        return [ self.TransformationToUnicode( transformation ) for transformation in self.transformations ]
        
    
+    def MakesChanges( self ):
+        
+        return len( self.transformations ) > 0
+        
+    
    @staticmethod
    def TransformationToUnicode( transformation ):
@@ -1889,6 +2234,20 @@ class StringMatch( HydrusSerialisable.SerialisableBase ):

        self._min_chars = min_chars
        
    
+    def Matches( self, text ):
+        
+        try:
+            
+            self.Test( text )
+            
+            return True
+            
+        except HydrusExceptions.StringMatchException:
+            
+            return False
+        
+    
    def Test( self, text ):
        
        text_len = len( text )
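The new `Matches` is a boolean wrapper over an exception-raising validator, a common pattern worth a minimal sketch. The class and exception names below are toy stand-ins, not hydrus's actual `StringMatch`:

```python
class StringMatchError( Exception ):
    pass

class SimpleStringMatch( object ):
    # Test raises on failure with a reason; Matches collapses that to a bool
    def __init__( self, min_chars = None, max_chars = None ):
        self._min_chars = min_chars
        self._max_chars = max_chars
    
    def Test( self, text ):
        if self._min_chars is not None and len( text ) < self._min_chars:
            raise StringMatchError( 'too short' )
        if self._max_chars is not None and len( text ) > self._max_chars:
            raise StringMatchError( 'too long' )
    
    def Matches( self, text ):
        try:
            self.Test( text )
            return True
        except StringMatchError:
            return False
```

Keeping `Test` as the primary API preserves the failure reason for UI error messages, while `Matches` serves the hot path in the veto content parser above.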
@@ -9,6 +9,7 @@ import HydrusGlobals as HG

import HydrusSerialisable
import HydrusTags
import re
+import time
import wx

IGNORED_TAG_SEARCH_CHARACTERS = u'[](){}"\''

@@ -487,8 +488,10 @@ class FileSystemPredicates( object ):

            ( year, month, day ) = age_value
            
+            # convert this dt, which is in local time, to a gmt timestamp
+            
            day_dt = datetime.datetime( year, month, day )
-            timestamp = calendar.timegm( day_dt.timetuple() )
+            timestamp = int( time.mktime( day_dt.timetuple() ) )
            
            if operator == '<':

@@ -1134,7 +1137,8 @@ class Predicate( HydrusSerialisable.SerialisableBase ):

            dt = datetime.datetime( year, month, day )
            
-            timestamp = calendar.timegm( dt.timetuple() )
+            # make a timestamp (IN GMT SECS SINCE 1970) from the local meaning of 2018/02/01
+            timestamp = int( time.mktime( dt.timetuple() ) )
            
            if operator == '<':

@@ -1153,6 +1157,7 @@ class Predicate( HydrusSerialisable.SerialisableBase ):

                pretty_operator = u'a month either side of '
                
            
+            # convert this GMT TIMESTAMP to a pretty local string
            base += u': ' + pretty_operator + HydrusData.ConvertTimestampToPrettyTime( timestamp, include_24h_time = False )
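The bug fixed here is worth seeing concretely: a calendar date typed by a user means "midnight local time on that day", so `time.mktime` (which interprets the struct as local time) yields the right GMT epoch timestamp, whereas `calendar.timegm` is off by the local UTC offset. A small, assumption-free demonstration:

```python
import calendar
import datetime
import time

dt = datetime.datetime( 2018, 2, 1 )

# correct: treat the struct as local time, as the fix above does
local_ts = int( time.mktime( dt.timetuple() ) )

# the old behaviour: treat the struct as if it were already UTC
gmt_ts = int( calendar.timegm( dt.timetuple() ) )

# the two differ by the machine's UTC offset (zero only when running in UTC)
offset = gmt_ts - local_ts
```

On a UTC+2 machine, `offset` is 7200, which is exactly how far the old system:age predicate boundary was shifted.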
@@ -49,7 +49,7 @@ options = {}

# Misc

NETWORK_VERSION = 18
-SOFTWARE_VERSION = 292
+SOFTWARE_VERSION = 293

UNSCALED_THUMBNAIL_DIMENSIONS = ( 200, 200 )

@@ -108,6 +108,8 @@ CONTENT_TYPE_UNKNOWN = 12

CONTENT_TYPE_ACCOUNT_TYPES = 13
CONTENT_TYPE_VARIABLE = 14
+CONTENT_TYPE_HASH = 15
+CONTENT_TYPE_TIMESTAMP = 16
CONTENT_TYPE_TITLE = 17

content_type_string_lookup = {}

@@ -125,6 +127,10 @@ content_type_string_lookup[ CONTENT_TYPE_OPTIONS ] = 'options'

content_type_string_lookup[ CONTENT_TYPE_SERVICES ] = 'services'
content_type_string_lookup[ CONTENT_TYPE_UNKNOWN ] = 'unknown'
content_type_string_lookup[ CONTENT_TYPE_ACCOUNT_TYPES ] = 'account types'
+content_type_string_lookup[ CONTENT_TYPE_VARIABLE ] = 'variable'
+content_type_string_lookup[ CONTENT_TYPE_HASH ] = 'hash'
+content_type_string_lookup[ CONTENT_TYPE_TIMESTAMP ] = 'timestamp'
+content_type_string_lookup[ CONTENT_TYPE_TITLE ] = 'title'

REPOSITORY_CONTENT_TYPES = [ CONTENT_TYPE_FILES, CONTENT_TYPE_MAPPINGS, CONTENT_TYPE_TAG_PARENTS, CONTENT_TYPE_TAG_SIBLINGS ]

@@ -652,12 +658,19 @@ site_type_string_lookup[ SITE_TYPE_PIXIV_TAG ] = 'pixiv tag'

site_type_string_lookup[ SITE_TYPE_TUMBLR ] = 'tumblr'
site_type_string_lookup[ SITE_TYPE_THREAD_WATCHER ] = 'thread watcher'

+TIMESTAMP_TYPE_SOURCE = 0
+
+TIMEZONE_GMT = 0
+TIMEZONE_LOCAL = 1
+TIMEZONE_OFFSET = 2
+
URL_TYPE_POST = 0
URL_TYPE_API = 1
URL_TYPE_FILE = 2
URL_TYPE_GALLERY = 3
URL_TYPE_WATCHABLE = 4
-URL_TYPE_UNKNOWN = 5
+URL_TYPE_NEXT = 5

url_type_string_lookup = {}
@@ -343,7 +343,7 @@ class HydrusController( object ):

    def PrintProfile( self, summary, profile_text ):
        
-        boot_pretty_timestamp = time.strftime( '%Y-%m-%d %H-%M-%S', time.gmtime( self._timestamps[ 'boot' ] ) )
+        boot_pretty_timestamp = time.strftime( '%Y-%m-%d %H-%M-%S', time.localtime( self._timestamps[ 'boot' ] ) )
        
        profile_log_filename = self._name + ' profile - ' + boot_pretty_timestamp + '.log'
@@ -301,7 +301,7 @@ def ConvertTimeDeltaToPrettyString( seconds ):

    else:
        
-        result_components.append( ' %d' % hours + ' hours' )
+        result_components.append( '%d' % hours + ' hours' )

@@ -613,7 +613,7 @@ def ConvertTimestampToPrettySync( timestamp ):

    elif hours > 0: return ' '.join( ( h, m ) ) + ' ago'
    else: return ' '.join( ( m, s ) ) + ' ago'
    
-def ConvertTimestampToPrettyTime( timestamp, include_24h_time = True ):
+def ConvertTimestampToPrettyTime( timestamp, in_gmt = False, include_24h_time = True ):
    
    if include_24h_time:

@@ -624,7 +624,18 @@ def ConvertTimestampToPrettyTime( timestamp, include_24h_time = True ):

        phrase = '%Y/%m/%d'
        
    
-    return time.strftime( phrase, time.gmtime( timestamp ) )
+    if in_gmt:
+        
+        struct_time = time.gmtime( timestamp )
+        
+        phrase = phrase + ' GMT'
+        
+    else:
+        
+        struct_time = time.localtime( timestamp )
+        
+    
+    return time.strftime( phrase, struct_time )
    
def ConvertTimestampToHumanPrettyTime( timestamp ):
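The `in_gmt` change above can be condensed into a self-contained sketch. The 24-hour format phrase here is an assumption for illustration (the diff only shows the non-24h branch, `'%Y/%m/%d'`), and the function name is lowercased to mark it as a stand-in:

```python
import time

def convert_timestamp_to_pretty_time( timestamp, in_gmt = False, include_24h_time = True ):
    # assumed phrase for the 24h branch; hydrus's actual format may differ
    phrase = '%Y/%m/%d %H:%M:%S' if include_24h_time else '%Y/%m/%d'
    
    if in_gmt:
        # caller wants an unambiguous server-side time, so label it explicitly
        struct_time = time.gmtime( timestamp )
        phrase += ' GMT'
    else:
        # default: render in the user's local timezone
        struct_time = time.localtime( timestamp )
    
    return time.strftime( phrase, struct_time )

convert_timestamp_to_pretty_time( 0, in_gmt = True )  # '1970/01/01 00:00:00 GMT'
```

Defaulting `in_gmt = False` flips the old behaviour: timestamps shown to the user are now local, and only contexts like the repository update log below opt back into labelled GMT.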
@@ -7,6 +7,7 @@ test_controller = None

view_shutdown = False
model_shutdown = False

+import_folders_running = False
subscriptions_running = False

callto_report_mode = False
@@ -10,6 +10,9 @@ import ssl

import threading
import time

+# The calendar portion of this works in GMT. A new 'day' or 'month' is calculated based on GMT time, so it won't tick over at midnight for most people.
+# But this means a server can pass a bandwidth object to a lad and everyone can agree on when a new day is.
+
def ConvertBandwidthRuleToString( rule ):
    
    ( bandwidth_type, time_delta, max_allowed ) = rule

@@ -347,6 +350,7 @@ class BandwidthTracker( HydrusSerialisable.SerialisableBase ):

    def _GetCurrentDateTime( self ):
        
+        # keep getnow in here for the moment to aid in testing, which patches it to do time shifting
        return datetime.datetime.utcfromtimestamp( HydrusData.GetNow() )

@@ -408,7 +412,7 @@ class BandwidthTracker( HydrusSerialisable.SerialisableBase ):

        month_dt = datetime.datetime( year, month, 1 )
        
-        month_time = calendar.timegm( month_dt.timetuple() )
+        month_time = int( calendar.timegm( month_dt.timetuple() ) )
        
        return month_time

@@ -475,12 +479,12 @@ class BandwidthTracker( HydrusSerialisable.SerialisableBase ):

        hour_dt = datetime.datetime( year, month, day, hour )
        minute_dt = datetime.datetime( year, month, day, hour, minute )
        
-        month_time = calendar.timegm( month_dt.timetuple() )
-        day_time = calendar.timegm( day_dt.timetuple() )
-        hour_time = calendar.timegm( hour_dt.timetuple() )
-        minute_time = calendar.timegm( minute_dt.timetuple() )
+        month_time = int( calendar.timegm( month_dt.timetuple() ) )
+        day_time = int( calendar.timegm( day_dt.timetuple() ) )
+        hour_time = int( calendar.timegm( hour_dt.timetuple() ) )
+        minute_time = int( calendar.timegm( minute_dt.timetuple() ) )
        
-        second_time = calendar.timegm( dt.timetuple() )
+        second_time = int( calendar.timegm( dt.timetuple() ) )
        
        return ( month_time, day_time, hour_time, minute_time, second_time )

@@ -633,7 +637,7 @@ class BandwidthTracker( HydrusSerialisable.SerialisableBase ):

        next_month_dt = datetime.datetime( next_month_year, next_month, 1 )
        
-        next_month_time = calendar.timegm( next_month_dt.timetuple() )
+        next_month_time = int( calendar.timegm( next_month_dt.timetuple() ) )
        
        return next_month_time - HydrusData.GetNow()
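The bandwidth tracker's windowing, truncating one UTC datetime to the GMT epoch second at which its month, day, hour, and minute buckets began, can be sketched as below. The helper name is illustrative; only the `calendar.timegm` bucketing mirrors the diff:

```python
import calendar
import datetime

def get_window_times( dt ):
    # truncate a UTC datetime to the start of its month/day/hour/minute,
    # returning each boundary as an integer GMT epoch timestamp
    month_dt = datetime.datetime( dt.year, dt.month, 1 )
    day_dt = datetime.datetime( dt.year, dt.month, dt.day )
    hour_dt = datetime.datetime( dt.year, dt.month, dt.day, dt.hour )
    minute_dt = datetime.datetime( dt.year, dt.month, dt.day, dt.hour, dt.minute )
    
    return tuple( int( calendar.timegm( d.timetuple() ) ) for d in ( month_dt, day_dt, hour_dt, minute_dt, dt ) )
```

Because every boundary is computed in GMT, a client and server comparing bandwidth windows always agree on when a bucket rolled over, which is the point made in the comment added at the top of the file.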
@@ -74,6 +74,7 @@ SERIALISABLE_TYPE_FILENAME_TAGGING_OPTIONS = 56

SERIALISABLE_TYPE_SEED = 57
SERIALISABLE_TYPE_PAGE_PARSER = 58
SERIALISABLE_TYPE_PARSE_FORMULA_COMPOUND = 59
+SERIALISABLE_TYPE_PARSE_FORMULA_CONTEXT_VARIABLE = 60

SERIALISABLE_TYPES_TO_OBJECT_TYPES = {}
@@ -1,3 +1,4 @@

import collections
+import HydrusExceptions
import Queue
import threading

@@ -192,6 +193,261 @@ class DAEMONForegroundWorker( DAEMONWorker ):

        return self._controller.GoodTimeToDoForegroundWork()
        
    

class JobScheduler( DAEMON ):
    
    def __init__( self, controller ):
        
        DAEMON.__init__( self, controller, 'JobScheduler' )
        
        self._currently_working = []
        
        self._waiting = []
        
        self._waiting_lock = threading.Lock()
        
        self._new_action = threading.Event()
        
        self._sort_needed = threading.Event()
        
    
    def _InsertJob( self, job ):
        
        # write __lt__, __gt__, stuff and do a bisect insort_left here
        
        with self._waiting_lock:
            
            self._waiting.append( job )
            
        
        self._sort_needed.set()
        
    
    def _NoWorkToStart( self ):
        
        with self._waiting_lock:
            
            if len( self._waiting ) == 0:
                
                return True
                
            
            next_job = self._waiting[0]
            
        
        if HydrusData.TimeHasPassed( next_job.GetNextWorkTime() ):
            
            return False
            
        else:
            
            return True
            
        
    
    def _RescheduleFinishedJobs( self ):
        
        def reschedule_finished_job( job ):
            
            if job.CurrentlyWorking():
                
                return True
                
            else:
                
                self._InsertJob( job )
                
                return False
                
            
        
        self._currently_working = filter( reschedule_finished_job, self._currently_working )
        
    
    def _SortWaiting( self ):
        
        # sort the waiting jobs in ascending order of expected work time
        
        def key( job ):
            
            return job.GetNextWorkTime()
            
        
        with self._waiting_lock:
            
            self._waiting.sort( key = key )
            
        
    
    def _StartWork( self ):
        
        while True:
            
            with self._waiting_lock:
                
                if len( self._waiting ) == 0:
                    
                    break
                    
                
                next_job = self._waiting[0]
                
                if HydrusData.TimeHasPassed( next_job.GetNextWorkTime() ):
                    
                    next_job = self._waiting.pop( 0 )
                    
                    if not next_job.IsDead():
                        
                        next_job.StartWork()
                        
                        self._currently_working.append( next_job )
                        
                    
                else:
                    
                    break # all the rest in the queue are not due
                    
                
            
        
    
    def RegisterJob( self, job ):
        
        job.SetScheduler( self )
        
        self._InsertJob( job )
        
    
    def WorkTimesHaveChanged( self ):
        
        self._sort_needed.set()
        
    
    def run( self ):
        
        while True:
            
            try:
                
                while self._NoWorkToStart():
                    
                    if self._controller.ModelIsShutdown():
                        
                        return
                        
                    
                    #
                    
                    self._RescheduleFinishedJobs()
                    
                    #
                    
                    self._sort_needed.wait( 0.2 )
                    
                    if self._sort_needed.is_set():
                        
                        self._SortWaiting()
                        
                        self._sort_needed.clear()
                        
                    
                
                self._StartWork()
                
            except HydrusExceptions.ShutdownException:
                
                return
                
            except Exception as e:
                
                HydrusData.Print( traceback.format_exc() )
                
                HydrusData.ShowException( e )
                
            
            time.sleep( 0.00001 )
            
        
    

class RepeatingJob( object ):
    
    def __init__( self, controller, work_callable, period, initial_delay = 0 ):
        
        self._controller = controller
        self._work_callable = work_callable
        self._period = period
        
        self._is_dead = threading.Event()
        
        self._work_lock = threading.Lock()
        
        self._currently_working = threading.Event()
        
        self._next_work_time = HydrusData.GetNow() + initial_delay
        
        self._scheduler = None
        
        # registers itself with controller here
        
    
    def CurrentlyWorking( self ):
        
        return self._currently_working.is_set()
        
    
    def GetNextWorkTime( self ):
        
        return self._next_work_time
        
    
    def IsDead( self ):
        
        return self._is_dead.is_set()
        
    
    def Kill( self ):
        
        self._is_dead.set()
        
    
    def SetScheduler( self, scheduler ):
        
        self._scheduler = scheduler
        
    
    def StartWork( self ):
        
        self._currently_working.set()
        
        self._controller.CallToThread( self.Work )
        
    
    def WakeAndWork( self ):
        
        self._next_work_time = HydrusData.GetNow()
        
        if self._scheduler is not None:
            
            self._scheduler.WorkTimesHaveChanged()
            
        
    
    def Work( self ):
        
        with self._work_lock:
            
            try:
                
                self._work_callable()
                
            finally:
                
                self._next_work_time = HydrusData.GetNow() + self._period
                
                self._currently_working.clear()
                
            
        
    

class THREADCallToThread( DAEMON ):
    
    def __init__( self, controller ):
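The `_InsertJob` comment above points at a planned improvement: give jobs an ordering and use `bisect.insort_left` so the waiting queue stays sorted on insert instead of being re-sorted by the scheduler loop. A minimal sketch of that idea, with a toy job class rather than hydrus's `RepeatingJob`:

```python
import bisect

class Job( object ):
    # toy job ordered by its next work time, so bisect can keep a sorted queue
    def __init__( self, next_work_time ):
        self.next_work_time = next_work_time
    
    def __lt__( self, other ):
        return self.next_work_time < other.next_work_time

waiting = []

# insort_left uses __lt__, so each insert lands in sorted position in O(log n) comparisons
for t in ( 30, 10, 20 ):
    bisect.insort_left( waiting, Job( t ) )

[ job.next_work_time for job in waiting ]  # [10, 20, 30]
```

With this in place the scheduler's `_StartWork` can still pop from index 0, and the `_sort_needed` event would only be required when an already-queued job's work time changes.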
@@ -659,12 +659,22 @@ class VideoRendererFFMPEG( object ):

        if self._mime in ( HC.IMAGE_APNG, HC.IMAGE_GIF ):
            
+            do_ss = False
            ss = 0
            self.pos = 0
            skip_frames = start_index
            
        else:
            
+            if start_index == 0:
+                
+                do_ss = True
+                
+            else:
+                
+                do_ss = False
+                
+            
            ss = float( start_index ) / self.fps
            self.pos = start_index
            skip_frames = 0

@@ -672,15 +682,20 @@ class VideoRendererFFMPEG( object ):

        ( w, h ) = self._target_resolution
        
-        cmd = [ FFMPEG_PATH,
-            '-ss', "%.03f" % ss,
-            '-i', self._path,
+        cmd = [ FFMPEG_PATH ]
+        
+        if do_ss:
+            
+            cmd.extend( [ '-ss', "%.03f" % ss ] )
+            
+        
+        cmd.extend( [ '-i', self._path,
            '-loglevel', 'quiet',
            '-f', 'image2pipe',
            "-pix_fmt", self.pix_fmt,
            "-s", str( w ) + 'x' + str( h ),
            '-vsync', '0',
-            '-vcodec', 'rawvideo', '-' ]
+            '-vcodec', 'rawvideo', '-' ] )
        
        try:
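The command construction above moves from a fixed argv literal to conditional assembly so the seek flag can be dropped entirely. The same pattern in isolation, with a toy builder (the function and its flag set are illustrative, not hydrus's full ffmpeg invocation):

```python
def build_cmd( path, ss = None ):
    # start with the binary, then append flags only when they apply
    cmd = [ 'ffmpeg' ]
    
    if ss is not None:
        # a seek request: ffmpeg's -ss takes seconds with millisecond precision
        cmd.extend( [ '-ss', '%.03f' % ss ] )
    
    cmd.extend( [ '-i', path, '-f', 'image2pipe', '-' ] )
    
    return cmd

build_cmd( 'x.mp4', ss = 1.5 )  # ['ffmpeg', '-ss', '1.500', '-i', 'x.mp4', '-f', 'image2pipe', '-']
```

Building the list in stages keeps each flag's inclusion condition next to the flag itself, which is easier to audit than several near-duplicate argv literals.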
@@ -1536,7 +1536,7 @@ class DB( HydrusDB.HydrusDB ):

        ( name, ) = self._c.execute( 'SELECT name FROM services WHERE service_id = ?;', ( service_id, ) ).fetchone()
        
-        HydrusData.Print( 'Creating update for ' + repr( name ) + ' from ' + HydrusData.ConvertTimestampToPrettyTime( begin ) + ' to ' + HydrusData.ConvertTimestampToPrettyTime( end ) )
+        HydrusData.Print( 'Creating update for ' + repr( name ) + ' from ' + HydrusData.ConvertTimestampToPrettyTime( begin, in_gmt = True ) + ' to ' + HydrusData.ConvertTimestampToPrettyTime( end, in_gmt = True ) )
        
        updates = self._RepositoryGenerateUpdates( service_id, begin, end )
@@ -25,7 +25,7 @@ class TestDaemons( unittest.TestCase ):

        try:
            
-            HG.test_controller.SetRead( 'hash_status', CC.STATUS_NEW )
+            HG.test_controller.SetRead( 'hash_status', ( CC.STATUS_NEW, None, '' ) )
            
            HydrusPaths.MakeSureDirectoryExists( test_dir )
@@ -70,7 +70,7 @@ class TestClientDB( unittest.TestCase ):

        cls._db = ClientDB.DB( HG.test_controller, TestConstants.DB_DIR, 'client' )
        
-        HG.test_controller.SetRead( 'hash_status', CC.STATUS_NEW )
+        HG.test_controller.SetRead( 'hash_status', ( CC.STATUS_NEW, None, '' ) )
        
    
    @classmethod

@@ -887,7 +887,7 @@ class TestClientDB( unittest.TestCase ):

-    def test_md5_status( self ):
+    def test_hash_status( self ):
        
        TestClientDB._clear_db()

@@ -899,7 +899,7 @@ class TestClientDB( unittest.TestCase ):

        #
        
-        result = self._read( 'md5_status', md5 )
+        result = self._read( 'hash_status', 'md5', md5 )
        
        self.assertEqual( result, ( CC.STATUS_NEW, None, '' ) )

@@ -915,7 +915,7 @@ class TestClientDB( unittest.TestCase ):

        #
        
-        ( status, hash, note ) = self._read( 'md5_status', md5 )
+        ( status, hash, note ) = self._read( 'hash_status', 'md5', md5 )
        
        self.assertEqual( ( status, hash ), ( CC.STATUS_REDUNDANT, hash ) )

@@ -929,7 +929,7 @@ class TestClientDB( unittest.TestCase ):

        #
        
-        ( status, hash, note ) = self._read( 'md5_status', md5 )
+        ( status, hash, note ) = self._read( 'hash_status', 'md5', md5 )
        
        self.assertEqual( ( status, hash ), ( CC.STATUS_DELETED, hash ) )
@@ -408,15 +408,15 @@ class TestTagObjects( unittest.TestCase ):

        p = ClientSearch.Predicate( HC.PREDICATE_TYPE_SYSTEM_AGE, ( '<', 'delta', ( 1, 2, 3, 4 ) ) )
        
-        self.assertEqual( p.GetUnicode(), u'system:age < 1y2m3d4h' )
+        self.assertEqual( p.GetUnicode(), u'system:time imported: since 1 year 2 months 3 days 4 hours ago' )
        
        p = ClientSearch.Predicate( HC.PREDICATE_TYPE_SYSTEM_AGE, ( u'\u2248', 'delta', ( 1, 2, 3, 4 ) ) )
        
-        self.assertEqual( p.GetUnicode(), u'system:age ' + u'\u2248' + ' 1y2m3d4h' )
+        self.assertEqual( p.GetUnicode(), u'system:time imported: around 1 year 2 months 3 days 4 hours ago' )
        
        p = ClientSearch.Predicate( HC.PREDICATE_TYPE_SYSTEM_AGE, ( '>', 'delta', ( 1, 2, 3, 4 ) ) )
        
-        self.assertEqual( p.GetUnicode(), u'system:age > 1y2m3d4h' )
+        self.assertEqual( p.GetUnicode(), u'system:time imported: before 1 year 2 months 3 days 4 hours ago' )
        
        p = ClientSearch.Predicate( HC.PREDICATE_TYPE_SYSTEM_ARCHIVE, min_current_count = 1000 )
[ binary diff: 54 new image files added, each roughly 1.8 to 3.0 KiB ]