Version 301
This commit is contained in:
parent
32dc66cb95
commit
c8773bad86
@ -8,6 +8,33 @@
<div class="content">
<h3>changelog</h3>
<ul>
<li><h3>version 301</h3></li>
<ul>
<li>after discussions with Sankaku Complex about their recent bandwidth problems, added a new 64MB/day default bandwidth rule for sankakucomplex.com--please check the release post for more information</li>
<li>the 'page of images downloader' is now called the 'simple downloader'; it uses the new parsing system (particularly, a single formula to parse urls)</li>
<li>the simple downloader supports multiple named parsers--currently defaulting to: html 4chan and 8chan threads, all images, gfycat mp4, gfycat webm, imgur image, imgur video, and twitter images (which fetches the :orig and also works on galleries!)</li>
<li>there is some basic editing of these parsing formulae, but it isn't pretty or easy to import/export yet</li>
<li>the new parsing test panel now has a 'link' button that lets you fetch test data straight from a URL</li>
<li>added a 'gather to this page of pages->dead thread watchers' menu to the page of pages right-click menu--it searches for all 404/DEAD thread watchers in the current page structure and puts them in the clicked page of pages!</li>
<li>cleaned up some page tab right-click menu layout and order</li>
<li>fixed tag parents, which I previously broke while optimising their load time</li>
<li>the new favourites list now presents parents in 'write' tag contexts, like manage tags--see if you like it (maybe this is better if hidden?)</li>
<li>sped up known_url searches for most situations</li>
<li>fixed an unusual error when drag-and-dropping a focused collection thumbnail to a new page</li>
<li>fixed a problem that was marking collected thumbnails' media as not eligible for the archive/delete filter</li>
<li>wrote a 'subscription report mode' that will say some things about subscriptions and their internal test states as they try (and potentially fail) to run</li>
<li>if a subscription query fails to find any files on its first sync, it will give a better text popup notification</li>
<li>if a subscription query finds files in its initial sync but does not have bandwidth to download them, an FYI text popup notification will explain what happened and how to review the estimated wait time</li>
<li>the delete key now deletes from file import status lists</li>
<li>default downloader tag import options will now inherit the fetch_tags_even_if_url_known_and_file_already_in_db value more reliably from 'parent' default options objects (like 'general boorus'->'specific booru')</li>
<li>the db maintenance routine 'clear file orphans' will now move files to a chosen location as it finds them (previously, it waited until the end of the search to do the move). if the user chooses to delete, this is still put off until the end of the search (so a mid-search cancel event in this case remains harmless)</li>
<li>the migrate database panel should now launch ok even if a location does not exist (it will also notify you about this)</li>
<li>brushed up some help (and updated a screenshot) about tag import options</li>
<li>fixed a problem that stopped some old manage parsing scripts ui (to content links) from opening correctly</li>
<li>improved some parsing test code so it can't hang the client on certain network problems</li>
<li>misc ui code updates</li>
<li>misc refactoring</li>
</ul>
<li><h3>version 300</h3></li>
<ul>
<li>wrote system:known url to find files that have--or do not have--certain types of urls. it works but is still a little slow--I can optimise it later!</li>
@ -26,12 +26,6 @@
<li><a href="https://www.patreon.com/hydrus_dev">patreon</a></li>
<li><a href="https://github.com/CuddleBear92/Hydrus-Presets-and-Scripts">user-run wiki (including download presets for several non-default boorus)</a></li>
</ul>
<p>If you would like to send me something physical, you can use my PO Box:</p>
<ul>
<li>PO Box 8883</li>
<li>Rockford, IL, 61126</li>
<li>UNITED STATES</li>
</ul>
</div>
</body>
</html>
@ -28,6 +28,12 @@
<p>If you add more tags or system predicates to a search, you will limit the results to those files that match every single one:</p>
<p><a href="sororitas_hanako.png"><img src="sororitas_hanako.png" width="960" height="540" /></a></p>
<p>You can also exclude a tag by prefixing it with a hyphen (e.g. '-heresy').</p>
<h3>importing tags from galleries</h3>
<p>In several places around the hydrus client, always in the context of importing files from another location, you will see a <i>tag import options</i> button:</p>
<p><img src="import_tag_options.png" /></p>
<p>The namespaces listed are those that hydrus knows how to parse from where you are downloading. Selecting one will tell hydrus to get those tags and set/pend them to the respective tag service.</p>
<p>'Explicit tags' are a way to force-add some additional tags to every file that comes through this import context. This can be useful for creating personal 'processing' tags on your local tags service, like 'from tumblr' or 'imported on sunday', that you can revisit later to find this download's files again.</p>
<p>You can quickly get thousands of tags in a few minutes this way!</p>
<h3>tag repositories</h3>
<p>It can take a long time to tag even this small number of files well, so I created <i>tag repositories</i> so people can share the work.</p>
<p>Tag repos store many file->tag relationships. Anyone who has an access key to the repository can sync with it and hence download all these relationships. If any of their own files match up, they will get those tags. Access keys will also usually have permission to upload new tags and ask for existing ones to be deleted.</p>
@ -47,13 +53,6 @@
<p>Please do not spam tags to my public tag repo until you get a rough feel for the <a href="tagging_schema.html">tag schema</a>, or just lurk until you get the idea. I am not very strict about it (it is not difficult to correct mistakes), but I essentially only want factual tags--no subjective opinion.</p>
<p>You can connect to many different tag repositories, if you like. When you are in the <i>manage tags</i> dialog, pressing the up or down arrow keys on an empty input switches between your services.</p>
<p><a href="faq.html#delays">FAQ: why can my friend not see what I just uploaded?</a></p>
<h3>importing tags from galleries</h3>
<p>In several places around the hydrus client, always in the context of importing files from another location, you will see this:</p>
<p><img src="import_tag_options_collapsed.png" /></p>
<p>If you hit the 'expand' buttons, you will get some more options for the imports. The tag ones look like this:</p>
<p><img src="import_tag_options_expanded.png" /></p>
<p>The namespaces listed are those that hydrus knows how to parse from the gallery site or wherever you are downloading files from. Selecting one will tell hydrus to get those tags and set/pend them to the respective tag service.</p>
<p>You can quickly get thousands of tags in a few minutes this way!</p>
<p class="right"><a href="getting_started_ratings.html">Read about ratings ---></a></p>
<p class="right"><a href="index.html">Go back to the index ---></a></p>
</div>
Binary file not shown.
After Width: | Height: | Size: 31 KiB |
Binary file not shown.
Before Width: | Height: | Size: 7.3 KiB |
Binary file not shown.
Before Width: | Height: | Size: 9.4 KiB |
@ -792,6 +792,19 @@ class ClientFilesManager( object ):
if is_an_orphan:
if move_location is not None:
( source_dir, filename ) = os.path.split( path )
dest = os.path.join( move_location, filename )
dest = HydrusPaths.AppendPathUntilNoConflicts( dest )
HydrusData.Print( 'Moving the orphan ' + path + ' to ' + dest )
HydrusPaths.MergeFile( path, dest )
orphan_paths.append( path )
@ -839,65 +852,30 @@ class ClientFilesManager( object ):
time.sleep( 2 )
if len( orphan_paths ) > 0:
if move_location is None and len( orphan_paths ) > 0:
if move_location is None:
status = 'found ' + HydrusData.ConvertIntToPrettyString( len( orphan_paths ) ) + ' orphans, now deleting'
job_key.SetVariable( 'popup_text_1', status )
time.sleep( 5 )
for path in orphan_paths:
status = 'found ' + HydrusData.ConvertIntToPrettyString( len( orphan_paths ) ) + ' orphans, now deleting'
( i_paused, should_quit ) = job_key.WaitIfNeeded()
if should_quit:
return
HydrusData.Print( 'Deleting the orphan ' + path )
status = 'deleting orphan files: ' + HydrusData.ConvertValueRangeToPrettyString( i + 1, len( orphan_paths ) )
job_key.SetVariable( 'popup_text_1', status )
time.sleep( 5 )
for path in orphan_paths:
( i_paused, should_quit ) = job_key.WaitIfNeeded()
if should_quit:
return
HydrusData.Print( 'Deleting the orphan ' + path )
status = 'deleting orphan files: ' + HydrusData.ConvertValueRangeToPrettyString( i + 1, len( orphan_paths ) )
job_key.SetVariable( 'popup_text_1', status )
HydrusPaths.DeletePath( path )
else:
status = 'found ' + HydrusData.ConvertIntToPrettyString( len( orphan_paths ) ) + ' orphans, now moving to ' + move_location
job_key.SetVariable( 'popup_text_1', status )
time.sleep( 5 )
for path in orphan_paths:
( i_paused, should_quit ) = job_key.WaitIfNeeded()
if should_quit:
return
( source_dir, filename ) = os.path.split( path )
dest = os.path.join( move_location, filename )
dest = HydrusPaths.AppendPathUntilNoConflicts( dest )
HydrusData.Print( 'Moving the orphan ' + path + ' to ' + dest )
status = 'moving orphan files: ' + HydrusData.ConvertValueRangeToPrettyString( i + 1, len( orphan_paths ) )
job_key.SetVariable( 'popup_text_1', status )
HydrusPaths.MergeFile( path, dest )
HydrusPaths.DeletePath( path )
@ -4204,19 +4204,6 @@ class DB( HydrusDB.HydrusDB ):
query_hash_ids = update_qhi( query_hash_ids, similar_hash_ids )
if 'known_url_rules' in simple_preds:
for ( operator, rule_type, rule ) in simple_preds[ 'known_url_rules' ]:
if operator: # inclusive
url_hash_ids = self._GetHashIdsFromURLRule( rule_type, rule )
query_hash_ids = update_qhi( query_hash_ids, url_hash_ids )
# now the simple preds and typical ways to populate query_hash_ids
if 'min_size' in simple_preds: files_info_predicates.append( 'size > ' + str( simple_preds[ 'min_size' ] ) )
@ -4463,19 +4450,6 @@ class DB( HydrusDB.HydrusDB ):
#
if 'known_url_rules' in simple_preds:
for ( operator, rule_type, rule ) in simple_preds[ 'known_url_rules' ]:
if not operator: # exclusive
url_hash_ids = self._GetHashIdsFromURLRule( rule_type, rule )
query_hash_ids.difference_update( url_hash_ids )
( file_services_to_include_current, file_services_to_include_pending, file_services_to_exclude_current, file_services_to_exclude_pending ) = system_predicates.GetFileServiceInfo()
for service_key in file_services_to_include_current:
@ -4619,6 +4593,25 @@ class DB( HydrusDB.HydrusDB ):
query_hash_ids.difference_update( self._inbox_hash_ids )
#
if 'known_url_rules' in simple_preds:
for ( operator, rule_type, rule ) in simple_preds[ 'known_url_rules' ]:
url_hash_ids = self._GetHashIdsFromURLRule( rule_type, rule, hash_ids = query_hash_ids )
if operator: # inclusive
query_hash_ids.intersection_update( url_hash_ids )
else:
query_hash_ids.difference_update( url_hash_ids )
#
num_tags_zero = False
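The consolidated hunk above folds both predicate directions into one pass: an inclusive rule intersects the running candidate set with the matching ids, an exclusive rule subtracts them. A minimal pure-Python sketch of that set logic (the function and rule shapes here are illustrative, not hydrus's actual API):

```python
def apply_url_rules( query_hash_ids, rules, matcher ):
    # rules: iterable of ( operator, rule ); operator True means inclusive.
    # matcher( rule ) returns the set of hash ids whose urls satisfy the rule.
    query_hash_ids = set( query_hash_ids )
    for ( operator, rule ) in rules:
        url_hash_ids = matcher( rule )
        if operator: # inclusive: keep only matching ids
            query_hash_ids.intersection_update( url_hash_ids )
        else: # exclusive: drop matching ids
            query_hash_ids.difference_update( url_hash_ids )
    return query_hash_ids
```

Running the rules late, against an already-narrowed candidate set, is what lets the new `hash_ids = query_hash_ids` parameter pay off.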
@ -4903,29 +4896,38 @@ class DB( HydrusDB.HydrusDB ):
return hash_ids
def _GetHashIdsFromURLRule( self, rule_type, rule ):
def _GetHashIdsFromURLRule( self, rule_type, rule, hash_ids = None ):
hash_ids = set()
if hash_ids is None:
query = self._c.execute( 'SELECT hash_id, url FROM urls;' )
else:
query = self._SelectFromList( 'SELECT hash_id, url FROM urls WHERE hash_id in %s;', hash_ids )
for ( hash_id, url ) in self._c.execute( 'SELECT hash_id, url FROM urls;' ):
result_hash_ids = set()
for ( hash_id, url ) in query:
if rule_type == 'url_match':
if rule.Matches( url ):
hash_ids.add( hash_id )
result_hash_ids.add( hash_id )
else:
if re.search( rule, url ) is not None:
hash_ids.add( hash_id )
result_hash_ids.add( hash_id )
return hash_ids
return result_hash_ids
def _GetHashIdsFromWildcard( self, file_service_key, tag_service_key, wildcard, include_current_tags, include_pending_tags ):
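The scan in `_GetHashIdsFromURLRule` walks `( hash_id, url )` rows and tests each url against either a url-class style matcher or a regex. A hedged, self-contained sketch of that matching loop, with the row source reduced to a plain list (the `Matches` predicate on the rule object is assumed from the diff, not a documented API):

```python
import re

def hash_ids_from_url_rule( rows, rule_type, rule ):
    result_hash_ids = set()
    for ( hash_id, url ) in rows:
        if rule_type == 'url_match':
            # rule is assumed to expose a Matches( url ) predicate
            if rule.Matches( url ):
                result_hash_ids.add( hash_id )
        else:
            # otherwise rule is treated as a regex pattern
            if re.search( rule, url ) is not None:
                result_hash_ids.add( hash_id )
    return result_hash_ids
```

Note the refactor's point: the result set gets its own name (`result_hash_ids`) so the optional incoming `hash_ids` filter can no longer be shadowed and clobbered.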
@ -6412,7 +6414,7 @@ class DB( HydrusDB.HydrusDB ):
if service_key is None:
service_ids_to_statuses_and_pair_ids = HydrusData.BuildKeyToListDict( ( ( service_id, ( status, child_tag_id, parent_tag_id ) ) for ( service_id, child_tag_id, parent_tag_id, status ) in self._c.execute( 'SELECT service_id, status, child_tag_id, parent_tag_id FROM tag_parents UNION SELECT service_id, status, child_tag_id, parent_tag_id FROM tag_parent_petitions;' ) ) )
service_ids_to_statuses_and_pair_ids = HydrusData.BuildKeyToListDict( ( ( service_id, ( status, child_tag_id, parent_tag_id ) ) for ( service_id, status, child_tag_id, parent_tag_id ) in self._c.execute( 'SELECT service_id, status, child_tag_id, parent_tag_id FROM tag_parents UNION SELECT service_id, status, child_tag_id, parent_tag_id FROM tag_parent_petitions;' ) ) )
service_keys_to_statuses_to_pairs = collections.defaultdict( HydrusData.default_dict_set )
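The tag-parents fix above is a column-order bug: the SELECT yields rows as `( service_id, status, child_tag_id, parent_tag_id )`, but the old generator unpacked them as `( service_id, child_tag_id, parent_tag_id, status )`, silently rotating status into the tag slots. A toy reproduction with made-up row values:

```python
# (service_id, status, child, parent) -- the order the SELECT actually yields
rows = [ ( 7, 'current', 'child', 'parent' ) ]

# buggy unpack order: every field after service_id lands in the wrong slot
buggy = [ ( service_id, ( status, child, parent ) )
          for ( service_id, child, parent, status ) in rows ]

# corrected unpack order matches the SELECT column order
fixed = [ ( service_id, ( status, child, parent ) )
          for ( service_id, status, child, parent ) in rows ]
```

Because tuple unpacking is positional, nothing raises here; the data is simply scrambled, which is why the breakage only surfaced as wrong parent behaviour.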
@ -9862,6 +9864,42 @@ class DB( HydrusDB.HydrusDB ):
self._AddService( CC.LOCAL_NOTES_SERVICE_KEY, HC.LOCAL_NOTES, CC.LOCAL_NOTES_SERVICE_KEY, dictionary )
if version == 300:
try:
sank_nc = ClientNetworking.NetworkContext( CC.NETWORK_CONTEXT_DOMAIN, 'sankakucomplex.com' )
bandwidth_manager = self._GetJSONDump( HydrusSerialisable.SERIALISABLE_TYPE_NETWORK_BANDWIDTH_MANAGER )
rules = bandwidth_manager.GetRules( sank_nc )
rules = rules.Duplicate()
rules.AddRule( HC.BANDWIDTH_TYPE_DATA, 86400, 64 * 1024 * 1024 ) # added as a compromise to try to reduce hydrus sankaku bandwidth usage until their new API and subscription model comes in
bandwidth_manager.SetRules( sank_nc, rules )
self._SetJSONDump( bandwidth_manager )
message = 'Sankaku Complex have mentioned to me, Hydrus Dev, that they have recently been running into bandwidth problems. They were respectful in reaching out to me and I am sympathetic to their problem. After some discussion, rather than removing hydrus support for Sankaku entirely, I am in this version adding a new restrictive default bandwidth rule for the sankakucomplex.com domain of 64MB/day.'
self.pub_initial_message( message )
message = 'If you are a heavy Sankaku downloader, please bear with this limit until we can come up with a better solution. They told me they have plans for API upgrades and will be rolling out a subscription service in the coming months that may relieve this problem. I also expect to write some way to embed \'Here is how to support this source: (LINK)\' links into the downloader ui of my new downloader engine for those who can and wish to help out with bandwidth costs. Please check my release post if you would like to read more, and feel free to contact me directly to discuss it further.'
self.pub_initial_message( message )
except Exception as e:
HydrusData.PrintException( e )
message = 'Attempting to add a new rule to sankaku\'s domain failed. The error has been printed to your log file--please let hydrus dev know the details.'
self.pub_initial_message( message )
self._controller.pub( 'splash_set_title_text', 'updated db to v' + str( version + 1 ) )
self._c.execute( 'UPDATE version SET version = ?;', ( version + 1, ) )
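The new rule registers a data budget of `64 * 1024 * 1024` bytes per `86400` seconds for the sankakucomplex.com domain. As a rough sketch of what a `( type, time_delta, max_allowed )` data rule implies at request time (the rule representation here is illustrative, not hydrus's `BandwidthRules` class):

```python
DAY = 86400
MB = 1024 * 1024

# one rule: at most 64MB of data per rolling day, as added in the update above
rules = [ ( 'data', DAY, 64 * MB ) ]

def can_start( rules, used_bytes_in_window, request_size ):
    # A request may start only if it fits inside every data rule's budget
    # for the current window.
    for ( rule_type, _time_delta, max_allowed ) in rules:
        if rule_type == 'data' and used_bytes_in_window + request_size > max_allowed:
            return False
    return True
```

Subscriptions that hit this budget are what the new FYI popup (see the changelog) reports on, including an estimated wait time.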
@ -317,6 +317,11 @@ def DAEMONSynchroniseRepositories( controller ):
def DAEMONSynchroniseSubscriptions( controller ):
if HG.subscription_report_mode:
HydrusData.ShowText( 'Subscription daemon started a run.' )
subscription_names = list( controller.Read( 'serialisable_names', HydrusSerialisable.SERIALISABLE_TYPE_SUBSCRIPTION ) )
if controller.new_options.GetBoolean( 'process_subs_in_random_order' ):
@ -337,6 +342,11 @@ def DAEMONSynchroniseSubscriptions( controller ):
p1 = controller.options[ 'pause_subs_sync' ]
p2 = HydrusThreading.IsThreadShuttingDown()
if HG.subscription_report_mode:
HydrusData.ShowText( 'Subscription "' + name + '" about to start. Global sub pause is ' + str( p1 ) + ' and thread shutdown status is ' + str( p2 ) + '.' )
if p1 or p2:
return
@ -1142,6 +1142,10 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):
#
self._dictionary[ 'simple_downloader_formulae' ] = HydrusSerialisable.SerialisableDictionary()
#
self._dictionary[ 'noneable_strings' ] = {}
self._dictionary[ 'noneable_strings' ][ 'favourite_file_lookup_script' ] = 'gelbooru md5'
@ -1157,6 +1161,7 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):
self._dictionary[ 'strings' ][ 'namespace_connector' ] = ':'
self._dictionary[ 'strings' ][ 'export_phrase' ] = '{hash}'
self._dictionary[ 'strings' ][ 'current_colourset' ] = 'default'
self._dictionary[ 'strings' ][ 'favourite_simple_downloader_formula' ] = 'all images'
self._dictionary[ 'string_list' ] = {}
@ -1626,11 +1631,15 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):
guidance_tag_import_options = default_tag_import_options[ default_gallery_identifier ]
fetch_tags_even_if_url_known_and_file_already_in_db = False
service_keys_to_namespaces = {}
service_keys_to_explicit_tags = {}
if guidance_tag_import_options is not None:
fetch_tags_even_if_url_known_and_file_already_in_db = guidance_tag_import_options.ShouldFetchTagsEvenIfURLKnownAndFileAlreadyInDB()
( namespaces, search_value ) = ClientDefaults.GetDefaultNamespacesAndSearchValue( gallery_identifier )
guidance_service_keys_to_namespaces = guidance_tag_import_options.GetServiceKeysToNamespaces()
@ -1652,7 +1661,7 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):
import ClientImporting
tag_import_options = ClientImporting.TagImportOptions( service_keys_to_namespaces = service_keys_to_namespaces, service_keys_to_explicit_tags = service_keys_to_explicit_tags )
tag_import_options = ClientImporting.TagImportOptions( fetch_tags_even_if_url_known_and_file_already_in_db = fetch_tags_even_if_url_known_and_file_already_in_db, service_keys_to_namespaces = service_keys_to_namespaces, service_keys_to_explicit_tags = service_keys_to_explicit_tags )
return tag_import_options
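The inheritance fix above follows a simple pattern: start from a safe default and only override when a 'guidance' (parent) defaults object actually exists. A minimal sketch of that pattern with a dict standing in for the real options object (the key name here is a hypothetical stand-in for `ShouldFetchTagsEvenIfURLKnownAndFileAlreadyInDB`):

```python
def build_fetch_flag( guidance_options ):
    # default when there is no parent defaults object to inherit from
    fetch_even_if_known = False
    if guidance_options is not None:
        # inherit the parent's value, e.g. 'general boorus' -> 'specific booru'
        fetch_even_if_known = guidance_options.get( 'fetch_even_if_known', False )
    return fetch_even_if_known
```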
@ -1827,6 +1836,22 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):
def GetSimpleDownloaderFormulae( self ):
with self._lock:
if len( self._dictionary[ 'simple_downloader_formulae' ] ) == 0:
for ( formula_name, formula ) in ClientDefaults.GetDefaultSimpleDownloaderFormulae():
self._dictionary[ 'simple_downloader_formulae' ][ formula_name ] = formula
return self._dictionary[ 'simple_downloader_formulae' ].items()
def GetString( self, name ):
with self._lock:
@ -2067,6 +2092,19 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):
def SetSimpleDownloaderFormulae( self, formulae ):
with self._lock:
self._dictionary[ 'simple_downloader_formulae' ] = HydrusSerialisable.SerialisableDictionary()
for ( formula_name, formula ) in formulae:
self._dictionary[ 'simple_downloader_formulae' ][ formula_name ] = formula
def SetString( self, name, value ):
with self._lock:
@ -92,6 +92,8 @@ def SetDefaultBandwidthManagerRules( bandwidth_manager ):
rules.AddRule( HC.BANDWIDTH_TYPE_DATA, 86400, 2 * GB ) # keep this in there so subs can know better when to stop running (the files come from a subdomain, which causes a pain for bandwidth calcs)
rules.AddRule( HC.BANDWIDTH_TYPE_DATA, 86400, 64 * MB ) # added as a compromise to try to reduce hydrus sankaku bandwidth usage until their new API and subscription model comes in
bandwidth_manager.SetRules( ClientNetworking.NetworkContext( CC.NETWORK_CONTEXT_DOMAIN, 'sankakucomplex.com' ), rules )
def SetDefaultDomainManagerData( domain_manager ):
@ -809,6 +811,118 @@ def GetDefaultShortcuts():
return shortcuts
def GetDefaultSimpleDownloaderFormulae():
import ClientParsing
formulae = []
#
formula_name = 'all images'
tag_rules = [ ( 'img', {}, None ) ]
content_to_fetch = ClientParsing.HTML_CONTENT_ATTRIBUTE
attribute_to_fetch = 'src'
formula = ClientParsing.ParseFormulaHTML( tag_rules = tag_rules, content_to_fetch = content_to_fetch, attribute_to_fetch = attribute_to_fetch )
formulae.append( ( formula_name, formula ) )
#
formula_name = '4chan thread (html)'
tag_rules = [ ( 'div', { 'class' : 'fileText' }, None ), ( 'a', {}, 0 ) ]
content_to_fetch = ClientParsing.HTML_CONTENT_ATTRIBUTE
attribute_to_fetch = 'href'
formula = ClientParsing.ParseFormulaHTML( tag_rules = tag_rules, content_to_fetch = content_to_fetch, attribute_to_fetch = attribute_to_fetch )
formulae.append( ( formula_name, formula ) )
#
formula_name = '8chan thread (html)'
tag_rules = [ ( 'p', { 'class' : 'fileinfo' }, None ), ( 'a', {}, 0 ) ]
content_to_fetch = ClientParsing.HTML_CONTENT_ATTRIBUTE
attribute_to_fetch = 'href'
formula = ClientParsing.ParseFormulaHTML( tag_rules = tag_rules, content_to_fetch = content_to_fetch, attribute_to_fetch = attribute_to_fetch )
formulae.append( ( formula_name, formula ) )
#
formula_name = 'twitter image'
tag_rules = [ ( 'div', { 'class' : 'permalink-tweet' }, 0 ), ( 'div', { 'class' : 'AdaptiveMedia-container' }, None ), ( 'img', {}, None ) ]
content_to_fetch = ClientParsing.HTML_CONTENT_ATTRIBUTE
attribute_to_fetch = 'src'
string_converter = ClientParsing.StringConverter( transformations = [ ( ClientParsing.STRING_TRANSFORMATION_APPEND_TEXT, ':orig' ) ], example_string = 'https://pbs.twimg.com/media/DZoQ2SdXcAIrFkm.jpg' )
formula = ClientParsing.ParseFormulaHTML( tag_rules = tag_rules, content_to_fetch = content_to_fetch, attribute_to_fetch = attribute_to_fetch, string_converter = string_converter )
formulae.append( ( formula_name, formula ) )
#
formula_name = 'gfycat webm'
tag_rules = [ ( 'video', {}, None ), ( 'source', {}, None ) ]
content_to_fetch = ClientParsing.HTML_CONTENT_ATTRIBUTE
attribute_to_fetch = 'src'
string_match = ClientParsing.StringMatch( match_type = ClientParsing.STRING_MATCH_REGEX, match_value = '\\.webm$', example_string = 'https://giant.gfycat.com/HatefulBleakBluegill.webm' )
formula = ClientParsing.ParseFormulaHTML( tag_rules = tag_rules, content_to_fetch = content_to_fetch, attribute_to_fetch = attribute_to_fetch, string_match = string_match )
formulae.append( ( formula_name, formula ) )
#
formula_name = 'gfycat mp4'
tag_rules = [ ( 'video', {}, None ), ( 'source', {}, None ) ]
content_to_fetch = ClientParsing.HTML_CONTENT_ATTRIBUTE
attribute_to_fetch = 'src'
string_match = ClientParsing.StringMatch( match_type = ClientParsing.STRING_MATCH_REGEX, match_value = '\\.mp4$', example_string = 'https://giant.gfycat.com/HatefulBleakBluegill.mp4' )
formula = ClientParsing.ParseFormulaHTML( tag_rules = tag_rules, content_to_fetch = content_to_fetch, attribute_to_fetch = attribute_to_fetch, string_match = string_match )
formulae.append( ( formula_name, formula ) )
#
formula_name = 'imgur video'
tag_rules = [ ( 'meta', { 'property' : 'og:video' }, None ) ]
content_to_fetch = ClientParsing.HTML_CONTENT_ATTRIBUTE
attribute_to_fetch = 'content'
formula = ClientParsing.ParseFormulaHTML( tag_rules = tag_rules, content_to_fetch = content_to_fetch, attribute_to_fetch = attribute_to_fetch )
formulae.append( ( formula_name, formula ) )
#
formula_name = 'imgur image'
tag_rules = [ ( 'link', { 'rel' : 'image_src' }, None ) ]
content_to_fetch = ClientParsing.HTML_CONTENT_ATTRIBUTE
attribute_to_fetch = 'href'
formula = ClientParsing.ParseFormulaHTML( tag_rules = tag_rules, content_to_fetch = content_to_fetch, attribute_to_fetch = attribute_to_fetch )
formulae.append( ( formula_name, formula ) )
#
return formulae
def GetDefaultURLMatches():
url_match_dir = os.path.join( HC.STATIC_DIR, 'default', 'url_classes' )
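Each default formula above is a chain of tag rules ending in an attribute fetch, e.g. 'all images' walks to every `<img>` and takes its `src`. Hydrus's `ClientParsing.ParseFormulaHTML` is far more general (nested tag rules, index selection, string matches and converters); this stdlib sketch only shows the core idea of that simplest formula.

```python
from html.parser import HTMLParser

class ImgSrcParser( HTMLParser ):
    # Collects the 'src' attribute of every <img> tag encountered,
    # analogous to the 'all images' tag_rules = [ ( 'img', {}, None ) ].
    def __init__( self ):
        super().__init__()
        self.urls = []
    def handle_starttag( self, tag, attrs ):
        if tag == 'img':
            for ( name, value ) in attrs:
                if name == 'src' and value is not None:
                    self.urls.append( value )

def parse_all_images( html ):
    parser = ImgSrcParser()
    parser.feed( html )
    return parser.urls
```

The twitter formula then post-processes each result with a string converter that appends `:orig`, and the gfycat formulae filter results with a regex string match such as `\.webm$`.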
@ -1367,7 +1367,7 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
ClientGUIMenus.AppendMenuItem( self, download_menu, 'url download', 'Open a new tab to download some raw urls.', self._notebook.NewPageImportURLs, on_deepest_notebook = True )
ClientGUIMenus.AppendMenuItem( self, download_menu, 'thread watcher', 'Open a new tab to watch a thread.', self._notebook.NewPageImportThreadWatcher, on_deepest_notebook = True )
ClientGUIMenus.AppendMenuItem( self, download_menu, 'webpage of images', 'Open a new tab to download files from generic galleries or threads.', self._notebook.NewPageImportPageOfImages, on_deepest_notebook = True )
ClientGUIMenus.AppendMenuItem( self, download_menu, 'simple downloader', 'Open a new tab to download files from generic galleries or threads.', self._notebook.NewPageImportSimpleDownloader, on_deepest_notebook = True )
gallery_menu = wx.Menu()
@ -1819,6 +1819,7 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
ClientGUIMenus.AppendMenuCheckItem( self, report_modes, 'hover window report mode', 'Have the hover windows report their show/hide logic.', HG.hover_window_report_mode, self._SwitchBoolean, 'hover_window_report_mode' )
ClientGUIMenus.AppendMenuCheckItem( self, report_modes, 'network report mode', 'Have the network engine report new jobs.', HG.network_report_mode, self._SwitchBoolean, 'network_report_mode' )
ClientGUIMenus.AppendMenuCheckItem( self, report_modes, 'shortcut report mode', 'Have the new shortcut system report what shortcuts it catches and whether it matches an action.', HG.shortcut_report_mode, self._SwitchBoolean, 'shortcut_report_mode' )
ClientGUIMenus.AppendMenuCheckItem( self, report_modes, 'subscription report mode', 'Have the subscription system report what it is doing.', HG.subscription_report_mode, self._SwitchBoolean, 'subscription_report_mode' )
ClientGUIMenus.AppendMenu( debug, report_modes, 'report modes' )
@ -3003,6 +3004,10 @@ The password is cleartext here but obscured in the entry dialog. Enter a blank p
HG.shortcut_report_mode = not HG.shortcut_report_mode
elif name == 'subscription_report_mode':
HG.subscription_report_mode = not HG.subscription_report_mode
elif name == 'pubsub_profile_mode':
HG.pubsub_profile_mode = not HG.pubsub_profile_mode
@ -1458,3 +1458,18 @@ class AutoCompleteDropdownTagsWrite( AutoCompleteDropdownTags ):
def RefreshFavouriteTags( self ):
favourite_tags = list( HG.client_controller.new_options.GetStringList( 'favourite_tags' ) )
favourite_tags.sort()
predicates = [ ClientSearch.Predicate( HC.PREDICATE_TYPE_TAG, tag ) for tag in favourite_tags ]
parents_manager = HG.client_controller.GetManager( 'tag_parents' )
predicates = parents_manager.ExpandPredicates( CC.COMBINED_TAG_SERVICE_KEY, predicates )
self._favourites_list.SetPredicates( predicates )
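The favourites refresh above sorts the favourite tags, wraps them as predicates, and then asks the parents manager to expand each with its known parents (the changelog's "favourites list now presents parents"). A toy version of that expansion, with a plain dict standing in for the parents manager (an assumed simplification of `ExpandPredicates`):

```python
def expand_favourites( favourite_tags, tag_to_parents ):
    # sort first, as RefreshFavouriteTags does
    favourite_tags = sorted( favourite_tags )
    expanded = []
    for tag in favourite_tags:
        expanded.append( tag )
        # each favourite is followed by its parents, if any are known
        expanded.extend( tag_to_parents.get( tag, [] ) )
    return expanded
```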
@ -3520,8 +3520,14 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
( modifier, key ) = ClientData.ConvertKeyEventToSimpleTuple( event )
if modifier == wx.ACCEL_NORMAL and key in CC.DELETE_KEYS: self._Delete()
elif modifier == wx.ACCEL_SHIFT and key in CC.DELETE_KEYS: self._Undelete()
if modifier == wx.ACCEL_NORMAL and key in CC.DELETE_KEYS:
self._Delete()
elif modifier == wx.ACCEL_SHIFT and key in CC.DELETE_KEYS:
self._Undelete()
else:
CanvasWithHovers.EventCharHook( self, event )
@ -4067,8 +4073,15 @@ class CanvasMediaListFilterArchiveDelete( CanvasMediaList ):
if result == wx.ID_CANCEL:
if self._current_media in self._kept: self._kept.remove( self._current_media )
if self._current_media in self._deleted: self._deleted.remove( self._current_media )
if self._current_media in self._kept:
self._kept.remove( self._current_media )
if self._current_media in self._deleted:
self._deleted.remove( self._current_media )
return
@ -322,13 +322,14 @@ class DialogFinishFiltering( Dialog ):
Dialog.__init__( self, parent, 'are you sure?', position = 'center' )
self._commit = wx.Button( self, id = wx.ID_YES, label = 'commit' )
self._commit = ClientGUICommon.BetterButton( self, 'commit', self.EndModal, wx.ID_YES )
self._commit.SetForegroundColour( ( 0, 128, 0 ) )
self._forget = wx.Button( self, id = wx.ID_NO, label = 'forget' )
self._forget = ClientGUICommon.BetterButton( self, 'forget', self.EndModal, wx.ID_NO )
self._forget.SetForegroundColour( ( 128, 0, 0 ) )
self._back = wx.Button( self, id = wx.ID_CANCEL, label = 'back to filtering' )
self._back = ClientGUICommon.BetterButton( self, 'back to filtering', self.EndModal, wx.ID_CANCEL )
self._back.SetId( wx.ID_CANCEL )
hbox = wx.BoxSizer( wx.HORIZONTAL )
|
@@ -188,6 +188,11 @@ class AddEditDeleteListBox( wx.Panel ):
         return datas
         
     
+    def GetValue( self ):
+        
+        return self.GetData()
+        
+    
 class QueueListBox( wx.Panel ):
     
     def __init__( self, parent, height_num_chars, data_to_pretty_callable, add_callable = None, edit_callable = None ):
@@ -18,12 +18,15 @@ import ClientGUIImport
+import ClientGUIListBoxes
 import ClientGUIMedia
 import ClientGUIMenus
+import ClientGUIParsing
 import ClientGUIScrolledPanels
 import ClientGUIScrolledPanelsEdit
 import ClientGUISeedCache
 import ClientGUITime
 import ClientGUITopLevelWindows
 import ClientImporting
 import ClientMedia
+import ClientParsing
 import ClientRendering
 import ClientSearch
 import ClientThreading
@@ -50,7 +53,7 @@ ID_TIMER_DUMP = wx.NewId()
 
 MANAGEMENT_TYPE_DUMPER = 0
 MANAGEMENT_TYPE_IMPORT_GALLERY = 1
-MANAGEMENT_TYPE_IMPORT_PAGE_OF_IMAGES = 2
+MANAGEMENT_TYPE_IMPORT_SIMPLE_DOWNLOADER = 2
 MANAGEMENT_TYPE_IMPORT_HDD = 3
 MANAGEMENT_TYPE_IMPORT_THREAD_WATCHER = 4
 MANAGEMENT_TYPE_PETITIONS = 5
@@ -97,13 +100,13 @@ def CreateManagementControllerImportGallery( gallery_identifier ):
     
     return management_controller
     
-def CreateManagementControllerImportPageOfImages():
+def CreateManagementControllerImportSimpleDownloader():
     
-    management_controller = CreateManagementController( 'page download', MANAGEMENT_TYPE_IMPORT_PAGE_OF_IMAGES )
+    management_controller = CreateManagementController( 'simple downloader', MANAGEMENT_TYPE_IMPORT_SIMPLE_DOWNLOADER )
     
-    page_of_images_import = ClientImporting.PageOfImagesImport()
+    simple_downloader_import = ClientImporting.SimpleDownloaderImport()
     
-    management_controller.SetVariable( 'page_of_images_import', page_of_images_import )
+    management_controller.SetVariable( 'simple_downloader_import', simple_downloader_import )
     
     return management_controller
     
@@ -538,7 +541,7 @@ class ManagementController( HydrusSerialisable.SerialisableBase ):
    
    SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_MANAGEMENT_CONTROLLER
    SERIALISABLE_NAME = 'Client Page Management Controller'
-    SERIALISABLE_VERSION = 3
+    SERIALISABLE_VERSION = 4
    
    def __init__( self, page_name = 'page' ):
        
@@ -576,7 +579,7 @@ class ManagementController( HydrusSerialisable.SerialisableBase ):
    
    def _InitialiseFromSerialisableInfo( self, serialisable_info ):
        
-        ( self._page_name, self._management_type, serialisable_keys, serialisable_simples, serialisables ) = serialisable_info
+        ( self._page_name, self._management_type, serialisable_keys, serialisable_simples, serialisable_serialisables ) = serialisable_info
        
        self._InitialiseDefaults()
        
@@ -592,7 +595,7 @@ class ManagementController( HydrusSerialisable.SerialisableBase ):
        
        self._simples.update( dict( serialisable_simples ) )
        
-        self._serialisables.update( { name : HydrusSerialisable.CreateFromSerialisableTuple( value ) for ( name, value ) in serialisables.items() } )
+        self._serialisables.update( { name : HydrusSerialisable.CreateFromSerialisableTuple( value ) for ( name, value ) in serialisable_serialisables.items() } )
        
    
    def _UpdateSerialisableInfo( self, version, old_serialisable_info ):
        
@@ -653,6 +656,22 @@ class ManagementController( HydrusSerialisable.SerialisableBase ):
            return ( 3, new_serialisable_info )
            
        
+        if version == 3:
+            
+            ( page_name, management_type, serialisable_keys, serialisable_simples, serialisable_serialisables ) = old_serialisable_info
+            
+            if 'page_of_images_import' in serialisable_serialisables:
+                
+                serialisable_serialisables[ 'simple_downloader_import' ] = serialisable_serialisables[ 'page_of_images_import' ]
+                
+                del serialisable_serialisables[ 'page_of_images_import' ]
+                
+            
+            new_serialisable_info = ( page_name, management_type, serialisable_keys, serialisable_simples, serialisable_serialisables )
+            
+            return ( 4, new_serialisable_info )
+            
+        
    
    def GetKey( self, name ):
        
@@ -686,9 +705,19 @@ class ManagementController( HydrusSerialisable.SerialisableBase ):
        return name in self._simples or name in self._serialisables
        
    
+    def IsDeadThreadWatcher( self ):
+        
+        if self._management_type == MANAGEMENT_TYPE_IMPORT_THREAD_WATCHER:
+            
+            thread_watcher_import = self.GetVariable( 'thread_watcher_import' )
+            
+            return thread_watcher_import.IsDead()
+            
+        
+    
    def IsImporter( self ):
        
-        return self._management_type in ( MANAGEMENT_TYPE_IMPORT_GALLERY, MANAGEMENT_TYPE_IMPORT_HDD, MANAGEMENT_TYPE_IMPORT_PAGE_OF_IMAGES, MANAGEMENT_TYPE_IMPORT_THREAD_WATCHER, MANAGEMENT_TYPE_IMPORT_URLS )
+        return self._management_type in ( MANAGEMENT_TYPE_IMPORT_GALLERY, MANAGEMENT_TYPE_IMPORT_HDD, MANAGEMENT_TYPE_IMPORT_SIMPLE_DOWNLOADER, MANAGEMENT_TYPE_IMPORT_THREAD_WATCHER, MANAGEMENT_TYPE_IMPORT_URLS )
        
    
    def SetKey( self, name, key ):
        
@@ -1742,17 +1771,17 @@ class ManagementPanelImporterHDD( ManagementPanelImporter ):
 
 management_panel_types_to_classes[ MANAGEMENT_TYPE_IMPORT_HDD ] = ManagementPanelImporterHDD
 
-class ManagementPanelImporterPageOfImages( ManagementPanelImporter ):
+class ManagementPanelImporterSimpleDownloader( ManagementPanelImporter ):
     
     def __init__( self, parent, page, controller, management_controller ):
         
         ManagementPanelImporter.__init__( self, parent, page, controller, management_controller )
         
-        self._page_of_images_panel = ClientGUICommon.StaticBox( self, 'page of images downloader' )
+        self._simple_downloader_panel = ClientGUICommon.StaticBox( self, 'simple downloader' )
         
         #
         
-        self._import_queue_panel = ClientGUICommon.StaticBox( self._page_of_images_panel, 'imports' )
+        self._import_queue_panel = ClientGUICommon.StaticBox( self._simple_downloader_panel, 'imports' )
         
         self._pause_files_button = wx.BitmapButton( self._import_queue_panel, bitmap = CC.GlobalBMPs.pause )
         self._pause_files_button.Bind( wx.EVT_BUTTON, self.EventPauseFiles )
@@ -1763,41 +1792,43 @@ class ManagementPanelImporterPageOfImages( ManagementPanelImporter ):
         
         #
         
-        self._pending_page_urls_panel = ClientGUICommon.StaticBox( self._page_of_images_panel, 'pending page urls' )
+        self._pending_jobs_panel = ClientGUICommon.StaticBox( self._simple_downloader_panel, 'pending urls' )
         
-        self._pause_queue_button = wx.BitmapButton( self._pending_page_urls_panel, bitmap = CC.GlobalBMPs.pause )
+        self._pause_queue_button = wx.BitmapButton( self._pending_jobs_panel, bitmap = CC.GlobalBMPs.pause )
         self._pause_queue_button.Bind( wx.EVT_BUTTON, self.EventPauseQueue )
         
-        self._parser_status = ClientGUICommon.BetterStaticText( self._pending_page_urls_panel )
+        self._parser_status = ClientGUICommon.BetterStaticText( self._pending_jobs_panel )
         
-        self._page_download_control = ClientGUIControls.NetworkJobControl( self._pending_page_urls_panel )
+        self._page_download_control = ClientGUIControls.NetworkJobControl( self._pending_jobs_panel )
         
-        self._pending_page_urls_listbox = wx.ListBox( self._pending_page_urls_panel, size = ( -1, 100 ) )
+        self._pending_jobs_listbox = wx.ListBox( self._pending_jobs_panel, size = ( -1, 100 ) )
         
-        self._advance_button = wx.Button( self._pending_page_urls_panel, label = u'\u2191' )
+        self._advance_button = wx.Button( self._pending_jobs_panel, label = u'\u2191' )
         self._advance_button.Bind( wx.EVT_BUTTON, self.EventAdvance )
         
-        self._delete_button = wx.Button( self._pending_page_urls_panel, label = 'X' )
+        self._delete_button = wx.Button( self._pending_jobs_panel, label = 'X' )
         self._delete_button.Bind( wx.EVT_BUTTON, self.EventDelete )
         
-        self._delay_button = wx.Button( self._pending_page_urls_panel, label = u'\u2193' )
+        self._delay_button = wx.Button( self._pending_jobs_panel, label = u'\u2193' )
         self._delay_button.Bind( wx.EVT_BUTTON, self.EventDelay )
         
-        self._page_url_input = ClientGUICommon.TextAndPasteCtrl( self._pending_page_urls_panel, self._PendPageURLs )
+        self._page_url_input = ClientGUICommon.TextAndPasteCtrl( self._pending_jobs_panel, self._PendPageURLs )
         
-        self._download_image_links = wx.CheckBox( self._page_of_images_panel, label = 'download image links' )
-        self._download_image_links.Bind( wx.EVT_CHECKBOX, self.EventDownloadImageLinks )
-        self._download_image_links.SetToolTip( 'i.e. download the href url of an <a> tag if there is an <img> tag nested beneath it' )
+        self._formulae = ClientGUICommon.BetterChoice( self._pending_jobs_panel )
         
-        self._download_unlinked_images = wx.CheckBox( self._page_of_images_panel, label = 'download unlinked images' )
-        self._download_unlinked_images.Bind( wx.EVT_CHECKBOX, self.EventDownloadUnlinkedImages )
-        self._download_unlinked_images.SetToolTip( 'i.e. download the src url of an <img> tag if there is no parent <a> tag' )
+        menu_items = []
         
-        self._page_of_images_import = self._management_controller.GetVariable( 'page_of_images_import' )
+        menu_items.append( ( 'normal', 'edit formulae', 'Edit these parsing formulae.', self._EditFormulae ) )
         
-        ( file_import_options, download_image_links, download_unlinked_images ) = self._page_of_images_import.GetOptions()
+        self._formula_cog = ClientGUICommon.MenuBitmapButton( self._pending_jobs_panel, CC.GlobalBMPs.cog, menu_items )
         
-        self._file_import_options = ClientGUIImport.FileImportOptionsButton( self._page_of_images_panel, file_import_options, self._page_of_images_import.SetFileImportOptions )
+        self._RefreshFormulae()
+        
+        self._simple_downloader_import = self._management_controller.GetVariable( 'simple_downloader_import' )
+        
+        file_import_options = self._simple_downloader_import.GetFileImportOptions()
+        
+        self._file_import_options = ClientGUIImport.FileImportOptionsButton( self._simple_downloader_panel, file_import_options, self._simple_downloader_import.SetFileImportOptions )
         
         #
         
@@ -1814,22 +1845,26 @@ class ManagementPanelImporterPageOfImages( ManagementPanelImporter ):
         
         queue_hbox = wx.BoxSizer( wx.HORIZONTAL )
         
-        queue_hbox.Add( self._pending_page_urls_listbox, CC.FLAGS_EXPAND_BOTH_WAYS )
+        queue_hbox.Add( self._pending_jobs_listbox, CC.FLAGS_EXPAND_BOTH_WAYS )
         queue_hbox.Add( queue_buttons_vbox, CC.FLAGS_VCENTER )
         
-        self._pending_page_urls_panel.Add( self._parser_status, CC.FLAGS_EXPAND_PERPENDICULAR )
-        self._pending_page_urls_panel.Add( self._page_download_control, CC.FLAGS_EXPAND_PERPENDICULAR )
-        self._pending_page_urls_panel.Add( queue_hbox, CC.FLAGS_EXPAND_SIZER_BOTH_WAYS )
-        self._pending_page_urls_panel.Add( self._page_url_input, CC.FLAGS_EXPAND_PERPENDICULAR )
-        self._pending_page_urls_panel.Add( self._pause_queue_button, CC.FLAGS_LONE_BUTTON )
+        formulae_hbox = wx.BoxSizer( wx.HORIZONTAL )
+        
+        formulae_hbox.Add( self._formulae, CC.FLAGS_EXPAND_BOTH_WAYS )
+        formulae_hbox.Add( self._formula_cog, CC.FLAGS_VCENTER )
+        
+        self._pending_jobs_panel.Add( self._parser_status, CC.FLAGS_EXPAND_PERPENDICULAR )
+        self._pending_jobs_panel.Add( self._page_download_control, CC.FLAGS_EXPAND_PERPENDICULAR )
+        self._pending_jobs_panel.Add( queue_hbox, CC.FLAGS_EXPAND_SIZER_BOTH_WAYS )
+        self._pending_jobs_panel.Add( self._page_url_input, CC.FLAGS_EXPAND_PERPENDICULAR )
+        self._pending_jobs_panel.Add( formulae_hbox, CC.FLAGS_EXPAND_PERPENDICULAR )
+        self._pending_jobs_panel.Add( self._pause_queue_button, CC.FLAGS_LONE_BUTTON )
         
         #
         
-        self._page_of_images_panel.Add( self._import_queue_panel, CC.FLAGS_EXPAND_PERPENDICULAR )
-        self._page_of_images_panel.Add( self._pending_page_urls_panel, CC.FLAGS_EXPAND_PERPENDICULAR )
-        self._page_of_images_panel.Add( self._download_image_links, CC.FLAGS_EXPAND_PERPENDICULAR )
-        self._page_of_images_panel.Add( self._download_unlinked_images, CC.FLAGS_EXPAND_PERPENDICULAR )
-        self._page_of_images_panel.Add( self._file_import_options, CC.FLAGS_EXPAND_PERPENDICULAR )
+        self._simple_downloader_panel.Add( self._import_queue_panel, CC.FLAGS_EXPAND_PERPENDICULAR )
+        self._simple_downloader_panel.Add( self._pending_jobs_panel, CC.FLAGS_EXPAND_PERPENDICULAR )
+        self._simple_downloader_panel.Add( self._file_import_options, CC.FLAGS_EXPAND_PERPENDICULAR )
         
         #
         
@@ -1839,7 +1874,7 @@ class ManagementPanelImporterPageOfImages( ManagementPanelImporter ):
         
         self._collect_by.Hide()
         
-        vbox.Add( self._page_of_images_panel, CC.FLAGS_EXPAND_PERPENDICULAR )
+        vbox.Add( self._simple_downloader_panel, CC.FLAGS_EXPAND_PERPENDICULAR )
         
         self._MakeCurrentSelectionTagsBox( vbox )
         
@@ -1847,34 +1882,154 @@ class ManagementPanelImporterPageOfImages( ManagementPanelImporter ):
         
         #
         
-        seed_cache = self._page_of_images_import.GetSeedCache()
+        seed_cache = self._simple_downloader_import.GetSeedCache()
         
         self._seed_cache_control.SetSeedCache( seed_cache )
         
-        self._page_of_images_import.SetDownloadControlFile( self._file_download_control )
-        self._page_of_images_import.SetDownloadControlPage( self._page_download_control )
-        
-        self._download_image_links.SetValue( download_image_links )
-        self._download_unlinked_images.SetValue( download_unlinked_images )
+        self._simple_downloader_import.SetDownloadControlFile( self._file_download_control )
+        self._simple_downloader_import.SetDownloadControlPage( self._page_download_control )
         
         self._UpdateStatus()
         
     
+    def _EditFormulae( self ):
+        
+        def data_to_pretty_callable( data ):
+            
+            ( formula_name, formula ) = data
+            
+            return formula_name
+            
+        
+        def edit_callable( data ):
+            
+            ( formula_name, formula ) = data
+            
+            with ClientGUIDialogs.DialogTextEntry( dlg, 'edit name', default = formula_name ) as dlg_2:
+                
+                if dlg_2.ShowModal() == wx.ID_OK:
+                    
+                    formula_name = dlg_2.GetValue()
+                    
+                else:
+                    
+                    return ( False, None )
+                    
+                
+            
+            with ClientGUITopLevelWindows.DialogEdit( dlg, 'edit formula' ) as dlg_3:
+                
+                panel = ClientGUIScrolledPanels.EditSingleCtrlPanel( dlg_3 )
+                
+                control = ClientGUIParsing.EditFormulaPanel( panel, formula, lambda: ( {}, '' ) )
+                
+                panel.SetControl( control )
+                
+                dlg_3.SetPanel( panel )
+                
+                if dlg_3.ShowModal() == wx.ID_OK:
+                    
+                    formula = control.GetValue()
+                    
+                    data = ( formula_name, formula )
+                    
+                    return ( True, data )
+                    
+                else:
+                    
+                    return ( False, None )
+                    
+                
+            
+        
+        def add_callable():
+            
+            formula_name = 'new formula'
+            
+            formula = ClientParsing.ParseFormulaHTML()
+            
+            data = ( formula_name, formula )
+            
+            return edit_callable( data )
+            
+        
+        formulae = list( self._controller.new_options.GetSimpleDownloaderFormulae() )
+        
+        formulae.sort()
+        
+        with ClientGUITopLevelWindows.DialogEdit( self, 'edit simple downloader formulae' ) as dlg:
+            
+            panel = ClientGUIScrolledPanels.EditSingleCtrlPanel( dlg )
+            
+            height_num_chars = 20
+            
+            control = ClientGUIListBoxes.AddEditDeleteListBox( panel, height_num_chars, data_to_pretty_callable, add_callable, edit_callable )
+            
+            control.AddDatas( formulae )
+            
+            panel.SetControl( control )
+            
+            dlg.SetPanel( panel )
+            
+            if dlg.ShowModal() == wx.ID_OK:
+                
+                formulae = control.GetData()
+                
+                self._controller.new_options.SetSimpleDownloaderFormulae( formulae )
+                
+            
+        
+        self._RefreshFormulae()
+        
+    
     def _PendPageURLs( self, urls ):
         
         urls = [ url for url in urls if url.startswith( 'http' ) ]
         
+        ( formula_name, formula ) = self._formulae.GetChoice()
+        
+        self._controller.new_options.SetString( 'favourite_simple_downloader_formula', formula_name )
+        
         for url in urls:
             
-            self._page_of_images_import.PendPageURL( url )
+            job = ( url, formula_name, formula )
+            
+            self._simple_downloader_import.PendJob( job )
             
         
         self._UpdateStatus()
         
     
+    def _RefreshFormulae( self ):
+        
+        self._formulae.Clear()
+        
+        favourite = None
+        favourite_name = self._controller.new_options.GetString( 'favourite_simple_downloader_formula' )
+        
+        formulae = list( self._controller.new_options.GetSimpleDownloaderFormulae() )
+        
+        formulae.sort()
+        
+        for ( i, ( formula_name, formula ) ) in enumerate( formulae ):
+            
+            self._formulae.Append( formula_name, ( formula_name, formula ) )
+            
+            if formula_name == favourite_name:
+                
+                favourite = i
+                
+            
+        
+        if favourite is not None:
+            
+            self._formulae.Select( favourite )
+            
+        
+    
     def _SeedCache( self ):
         
-        seed_cache = self._page_of_images_import.GetSeedCache()
+        seed_cache = self._simple_downloader_import.GetSeedCache()
         
         title = 'file import status'
         frame_key = 'file_import_status'
@@ -1888,19 +2043,30 @@ class ManagementPanelImporterPageOfImages( ManagementPanelImporter ):
     
     def _UpdateStatus( self ):
         
-        ( pending_page_urls, parser_status, current_action, queue_paused, files_paused ) = self._page_of_images_import.GetStatus()
+        ( pending_jobs, parser_status, current_action, queue_paused, files_paused ) = self._simple_downloader_import.GetStatus()
         
-        if self._pending_page_urls_listbox.GetStrings() != pending_page_urls:
+        current_pending_jobs = [ self._pending_jobs_listbox.GetClientData( i ) for i in range( self._pending_jobs_listbox.GetCount() ) ]
+        
+        if current_pending_jobs != pending_jobs:
             
-            selected_string = self._pending_page_urls_listbox.GetStringSelection()
+            selected_string = self._pending_jobs_listbox.GetStringSelection()
             
-            self._pending_page_urls_listbox.SetItems( pending_page_urls )
+            self._pending_jobs_listbox.Clear()
+            
+            for job in pending_jobs:
+                
+                ( url, formula_name, formula ) = job
+                
+                pretty_job = formula_name + ': ' + url
+                
+                self._pending_jobs_listbox.Append( pretty_job, job )
+                
             
-            selection_index = self._pending_page_urls_listbox.FindString( selected_string )
+            selection_index = self._pending_jobs_listbox.FindString( selected_string )
             
             if selection_index != wx.NOT_FOUND:
                 
-                self._pending_page_urls_listbox.Select( selection_index )
+                self._pending_jobs_listbox.Select( selection_index )
                 
             
         
@@ -1939,13 +2105,13 @@ class ManagementPanelImporterPageOfImages( ManagementPanelImporter ):
    
    def EventAdvance( self, event ):
        
-        selection = self._pending_page_urls_listbox.GetSelection()
+        selection = self._pending_jobs_listbox.GetSelection()
        
        if selection != wx.NOT_FOUND:
            
-            page_url = self._pending_page_urls_listbox.GetString( selection )
+            job = self._pending_jobs_listbox.GetClientData( selection )
            
-            self._page_of_images_import.AdvancePageURL( page_url )
+            self._simple_downloader_import.AdvanceJob( job )
            
            self._UpdateStatus()
            
@@ -1953,13 +2119,13 @@ class ManagementPanelImporterPageOfImages( ManagementPanelImporter ):
    
    def EventDelay( self, event ):
        
-        selection = self._pending_page_urls_listbox.GetSelection()
+        selection = self._pending_jobs_listbox.GetSelection()
        
        if selection != wx.NOT_FOUND:
            
-            page_url = self._pending_page_urls_listbox.GetString( selection )
+            job = self._pending_jobs_listbox.GetClientData( selection )
            
-            self._page_of_images_import.DelayPageURL( page_url )
+            self._simple_downloader_import.DelayJob( job )
            
            self._UpdateStatus()
            
@@ -1967,38 +2133,28 @@ class ManagementPanelImporterPageOfImages( ManagementPanelImporter ):
    
    def EventDelete( self, event ):
        
-        selection = self._pending_page_urls_listbox.GetSelection()
+        selection = self._pending_jobs_listbox.GetSelection()
        
        if selection != wx.NOT_FOUND:
            
-            page_url = self._pending_page_urls_listbox.GetString( selection )
+            job = self._pending_jobs_listbox.GetClientData( selection )
            
-            self._page_of_images_import.DeletePageURL( page_url )
+            self._simple_downloader_import.DeleteJob( job )
            
            self._UpdateStatus()
            
        
    
-    def EventDownloadImageLinks( self, event ):
-        
-        self._page_of_images_import.SetDownloadImageLinks( self._download_image_links.GetValue() )
-        
-    
-    def EventDownloadUnlinkedImages( self, event ):
-        
-        self._page_of_images_import.SetDownloadUnlinkedImages( self._download_unlinked_images.GetValue() )
-        
-    
    def EventPauseQueue( self, event ):
        
-        self._page_of_images_import.PausePlayQueue()
+        self._simple_downloader_import.PausePlayQueue()
        
        self._UpdateStatus()
        
    
    def EventPauseFiles( self, event ):
        
-        self._page_of_images_import.PausePlayFiles()
+        self._simple_downloader_import.PausePlayFiles()
        
        self._UpdateStatus()
        
@@ -2015,12 +2171,12 @@ class ManagementPanelImporterPageOfImages( ManagementPanelImporter ):
    
    def Start( self ):
        
-        self._page_of_images_import.Start( self._page_key )
+        self._simple_downloader_import.Start( self._page_key )
        
    
    def TestAbleToClose( self ):
        
-        if self._page_of_images_import.CurrentlyWorking():
+        if self._simple_downloader_import.CurrentlyWorking():
            
            with ClientGUIDialogs.DialogYesNo( self, 'This page is still importing. Are you sure you want to close it?' ) as dlg:
                
@@ -2032,7 +2188,7 @@ class ManagementPanelImporterPageOfImages( ManagementPanelImporter ):
    
 
-management_panel_types_to_classes[ MANAGEMENT_TYPE_IMPORT_PAGE_OF_IMAGES ] = ManagementPanelImporterPageOfImages
+management_panel_types_to_classes[ MANAGEMENT_TYPE_IMPORT_SIMPLE_DOWNLOADER ] = ManagementPanelImporterSimpleDownloader
 
 class ManagementPanelImporterThreadWatcher( ManagementPanelImporter ):
     
@@ -1503,7 +1503,16 @@ class MediaPanel( ClientMedia.ListeningMediaList, wx.ScrolledWindow ):
        
        self._focussed_media = media
        
-        HG.client_controller.pub( 'preview_changed', self._page_key, media )
+        if self._focussed_media is None:
+            
+            publish_media = None
+            
+        else:
+            
+            publish_media = self._focussed_media.GetDisplayMedia()
+            
+        
+        HG.client_controller.pub( 'preview_changed', self._page_key, publish_media )
        
    
    def _ShareOnLocalBooru( self ):
        
@@ -1680,7 +1689,16 @@ class MediaPanel( ClientMedia.ListeningMediaList, wx.ScrolledWindow ):
    
    def PageShown( self ):
        
-        HG.client_controller.pub( 'preview_changed', self._page_key, self._focussed_media )
+        if self._focussed_media is None:
+            
+            publish_media = None
+            
+        else:
+            
+            publish_media = self._focussed_media.GetDisplayMedia()
+            
+        
+        HG.client_controller.pub( 'preview_changed', self._page_key, publish_media )
        
        self._PublishSelectionChange()
        
@@ -117,9 +117,9 @@ class DialogPageChooser( ClientGUIDialogs.Dialog ):
            
            button.SetLabelText( text )
            
-        elif entry_type == 'page_import_page_of_images':
+        elif entry_type == 'page_import_simple_downloader':
            
-            button.SetLabelText( 'page of images' )
+            button.SetLabelText( 'simple downloader' )
            
        elif entry_type == 'page_import_thread_watcher':
            
@@ -200,9 +200,9 @@ class DialogPageChooser( ClientGUIDialogs.Dialog ):
            
            self._result = ( 'page', ClientGUIManagement.CreateManagementControllerImportGallery( gallery_identifier ) )
            
-        elif entry_type == 'page_import_page_of_images':
+        elif entry_type == 'page_import_simple_downloader':
            
-            self._result = ( 'page', ClientGUIManagement.CreateManagementControllerImportPageOfImages() )
+            self._result = ( 'page', ClientGUIManagement.CreateManagementControllerImportSimpleDownloader() )
            
        elif entry_type == 'page_import_thread_watcher':
            
@@ -261,7 +261,7 @@ class DialogPageChooser( ClientGUIDialogs.Dialog ):
            entries.append( ( 'page_import_urls', None ) )
            entries.append( ( 'page_import_thread_watcher', None ) )
            entries.append( ( 'menu', 'gallery' ) )
-            entries.append( ( 'page_import_page_of_images', None ) )
+            entries.append( ( 'page_import_simple_downloader', None ) )
            
        elif menu_keyword == 'gallery':
            
@@ -963,6 +963,15 @@ class PagesNotebook( wx.Notebook ):
        
    
+    def _GatherDeadThreadWatchers( self, insertion_page ):
+        
+        top_notebook = self._GetTopNotebook()
+        
+        gathered_pages = top_notebook.GetGatherPages( 'dead_thread_watchers' )
+        
+        self._MovePages( gathered_pages, insertion_page )
+        
+    
    def _GetDefaultPageInsertionIndex( self ):
        
        new_options = self._controller.new_options
        
@@ -1118,6 +1127,22 @@ class PagesNotebook( wx.Notebook ):
        return None
        
    
+    def _GetTopNotebook( self ):
+        
+        top_notebook = self
+        
+        parent = top_notebook.GetParent()
+        
+        while isinstance( parent, PagesNotebook ):
+            
+            top_notebook = parent
+            
+            parent = top_notebook.GetParent()
+            
+        
+        return top_notebook
+        
+    
    def _MovePage( self, page, dest_notebook, insertion_tab_index, follow_dropped_page = False ):
        
        source_notebook = page.GetParent()
        
@@ -1151,6 +1176,21 @@ class PagesNotebook( wx.Notebook ):
        self._controller.pub( 'refresh_page_name', page.GetPageKey() )
        
    
+    def _MovePages( self, pages, dest_notebook ):
+        
+        insertion_tab_index = dest_notebook.GetNumPages( only_my_level = True )
+        
+        for page in pages:
+            
+            if page.GetParent() != dest_notebook:
+                
+                self._MovePage( page, dest_notebook, insertion_tab_index )
+                
+                insertion_tab_index += 1
+                
+            
+        
+    
    def _ShiftPage( self, page_index, delta = None, new_index = None ):
        
        new_page_index = page_index
        
@@ -1289,13 +1329,16 @@ class PagesNotebook( wx.Notebook ):
        
        num_pages = self.GetPageCount()
        
+        end_index = num_pages - 1
+        
        more_than_one_tab = num_pages > 1
        
        click_over_tab = tab_index != -1
        
+        click_over_page_of_pages = False
+        
+        can_go_left = tab_index > 0
+        can_go_right = tab_index < end_index
+        
-        end_index = num_pages - 1
-        click_over_page_of_pages = False
-        
        existing_session_names = self._controller.Read( 'serialisable_names', HydrusSerialisable.SERIALISABLE_TYPE_GUI_SESSION )
        
@@ -1309,9 +1352,6 @@ class PagesNotebook( wx.Notebook ):
        
        ClientGUIMenus.AppendMenuItem( self, menu, 'close page', 'Close this page.', self._ClosePage, tab_index )
        
-        can_go_left = tab_index > 0
-        can_go_right = tab_index < end_index
-        
        if num_pages > 1:
            
            ClientGUIMenus.AppendMenuItem( self, menu, 'close other pages', 'Close all pages but this one.', self._CloseOtherPages, tab_index )
            
@@ -1327,15 +1367,6 @@ class PagesNotebook( wx.Notebook ):
        
        
-        ClientGUIMenus.AppendSeparator( menu )
-        
-        ClientGUIMenus.AppendMenuItem( self, menu, 'send this page down to a new page of pages', 'Make a new page of pages and put this page in it.', self._SendPageToNewNotebook, tab_index )
-        
-        if can_go_right:
-            
-            ClientGUIMenus.AppendMenuItem( self, menu, 'send pages to the right to a new page of pages', 'Make a new page of pages and put all the pages to the right into it.', self._SendRightPagesToNewNotebook, tab_index )
-            
-        
        ClientGUIMenus.AppendSeparator( menu )
        
        ClientGUIMenus.AppendMenuItem( self, menu, 'rename page', 'Rename this page.', self._RenamePage, tab_index )
        
@@ -1343,43 +1374,9 @@ class PagesNotebook( wx.Notebook ):
        
        ClientGUIMenus.AppendMenuItem( self, menu, 'new page', 'Choose a new page.', self._ChooseNewPage )
        
-        if click_over_page_of_pages or len( existing_session_names ) > 0:
-            
-            ClientGUIMenus.AppendSeparator( menu )
-            
-        
-        if len( existing_session_names ) > 0:
-            
-            submenu = wx.Menu()
-            
-            for name in existing_session_names:
-                
-                ClientGUIMenus.AppendMenuItem( self, submenu, name, 'Load this session here.', self.AppendGUISession, name )
-                
-            
-            ClientGUIMenus.AppendMenu( menu, submenu, 'append session' )
-            
-        
        if click_over_tab:
            
-            if click_over_page_of_pages:
-                
-                submenu = wx.Menu()
-                
-                for name in existing_session_names:
-                    
-                    if name == 'last session':
-                        
-                        continue
-                        
-                    
-                    ClientGUIMenus.AppendMenuItem( self, submenu, name, 'Save this page of pages to the session.', page.SaveGUISession, name )
-                    
-                
-                ClientGUIMenus.AppendMenuItem( self, submenu, 'create a new session', 'Save this page of pages to the session.', page.SaveGUISession, suggested_name = page.GetDisplayName() )
-                
-                ClientGUIMenus.AppendMenu( menu, submenu, 'save this page of pages to a session' )
-                
-            
+            ClientGUIMenus.AppendMenuItem( self, menu, 'new page here', 'Choose a new page.', self._ChooseNewPage, tab_index )
            
            if more_than_one_tab:
                
@@ -1390,10 +1387,6 @@ class PagesNotebook( wx.Notebook ):
                can_move_right = tab_index < end_index
                can_end = tab_index < end_index - 1
                
-                ClientGUIMenus.AppendMenuItem( self, menu, 'new page here', 'Choose a new page.', self._ChooseNewPage, tab_index )
-                
-                ClientGUIMenus.AppendSeparator( menu )
-                
                if can_home:
                    
                    ClientGUIMenus.AppendMenuItem( self, menu, 'move to left end', 'Move this page all the way to the left.', self._ShiftPage, tab_index, new_index = 0 )
                    
@@ -1415,6 +1408,15 @@ class PagesNotebook( wx.Notebook ):

ClientGUIMenus.AppendSeparator( menu )

ClientGUIMenus.AppendMenuItem( self, menu, 'send this page down to a new page of pages', 'Make a new page of pages and put this page in it.', self._SendPageToNewNotebook, tab_index )

if can_go_right:
    
    ClientGUIMenus.AppendMenuItem( self, menu, 'send pages to the right to a new page of pages', 'Make a new page of pages and put all the pages to the right into it.', self._SendRightPagesToNewNotebook, tab_index )
    
if click_over_page_of_pages and page.GetPageCount() > 0:
    
    ClientGUIMenus.AppendSeparator( menu )
@@ -1423,6 +1425,53 @@ class PagesNotebook( wx.Notebook ):

+if click_over_page_of_pages:
+    
+    ClientGUIMenus.AppendSeparator( menu )
+    
+    submenu = wx.Menu()
+    
+    ClientGUIMenus.AppendMenuItem( self, submenu, 'dead thread watchers', 'Find all currently open dead thread watchers and move them to this page of pages.', self._GatherDeadThreadWatchers, page )
+    
+    ClientGUIMenus.AppendMenu( menu, submenu, 'gather on this page of pages' )
+    
+if len( existing_session_names ) > 0 or click_over_page_of_pages:
+    
+    ClientGUIMenus.AppendSeparator( menu )
+    
+    if len( existing_session_names ) > 0:
+        
+        submenu = wx.Menu()
+        
+        for name in existing_session_names:
+            
+            ClientGUIMenus.AppendMenuItem( self, submenu, name, 'Load this session here.', self.AppendGUISession, name )
+            
+        ClientGUIMenus.AppendMenu( menu, submenu, 'append session' )
+        
+    if click_over_page_of_pages:
+        
+        submenu = wx.Menu()
+        
+        for name in existing_session_names:
+            
+            if name == 'last session':
+                
+                continue
+                
+            ClientGUIMenus.AppendMenuItem( self, submenu, name, 'Save this page of pages to the session.', page.SaveGUISession, name )
+            
+        ClientGUIMenus.AppendMenuItem( self, submenu, 'create a new session', 'Save this page of pages to the session.', page.SaveGUISession, suggested_name = page.GetDisplayName() )
+        
+        ClientGUIMenus.AppendMenu( menu, submenu, 'save this page of pages to a session' )
+        
 self._controller.PopupMenu( self, menu )
@@ -1753,6 +1802,35 @@ class PagesNotebook( wx.Notebook ):

+def GetGatherPages( self, gather_type ):
+    
+    if gather_type == 'dead_thread_watchers':
+        
+        def test( page ):
+            
+            management_controller = page.GetManagementController()
+            
+            return management_controller.IsDeadThreadWatcher()
+            
+    else:
+        
+        raise NotImplementedError()
+        
+    gathered_pages = []
+    
+    for page in self.GetMediaPages():
+        
+        if test( page ):
+            
+            gathered_pages.append( page )
+            
+    return gathered_pages
+    
 def GetMediaPages( self, only_my_level = False ):
     
     return self._GetMediaPages( only_my_level )
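The new GetGatherPages dispatches on gather_type to a local test function and then filters the page list with it. A minimal stand-alone sketch of the same predicate-dispatch pattern (the Page class and gather type here are simplified stand-ins, not the hydrus API):

```python
# Sketch of the predicate-dispatch pattern used by GetGatherPages.
# 'Page' and its attributes are hypothetical stand-ins, not hydrus classes.

class Page:
    
    def __init__( self, name, is_dead_watcher ):
        
        self.name = name
        self.is_dead_watcher = is_dead_watcher
        

def get_gather_pages( pages, gather_type ):
    
    # choose the predicate once, up front, based on the gather type
    if gather_type == 'dead_thread_watchers':
        
        def test( page ):
            
            return page.is_dead_watcher
            
    else:
        
        raise NotImplementedError( gather_type )
        
    # then apply it uniformly to every candidate page
    return [ page for page in pages if test( page ) ]
    

pages = [ Page( 'live thread', False ), Page( 'dead thread', True ) ]

gathered = get_gather_pages( pages, 'dead_thread_watchers' )
```

New gather types would only need a new branch supplying a different `test`, which is presumably why the dispatch and the filtering loop are kept separate.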
@@ -2111,9 +2189,9 @@ class PagesNotebook( wx.Notebook ):

     return self.NewPage( management_controller, on_deepest_notebook = on_deepest_notebook )
     
-def NewPageImportPageOfImages( self, on_deepest_notebook = False ):
+def NewPageImportSimpleDownloader( self, on_deepest_notebook = False ):
    
-    management_controller = ClientGUIManagement.CreateManagementControllerImportPageOfImages()
+    management_controller = ClientGUIManagement.CreateManagementControllerImportSimpleDownloader()
    
    return self.NewPage( management_controller, on_deepest_notebook = on_deepest_notebook )
@@ -2203,18 +2203,12 @@ The formula should attempt to parse full or relative urls. If the url is relativ

 def EventTestParse( self, event ):
     
-    node = self.GetValue()
-    
-    try:
+    def wx_code( parsed_urls ):
         
-        stop_time = HydrusData.GetNow() + 30
-        
-        job_key = ClientThreading.JobKey( cancellable = True, stop_time = stop_time )
-        
-        data = self._example_data.GetValue()
-        referral_url = self._referral_url
-        
-        parsed_urls = node.ParseURLs( job_key, data, referral_url )
+        if not self:
+            
+            return
         
         if len( parsed_urls ) > 0:
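The hunk's context docstring says the formula should parse full or relative urls, with relative results resolved against the url the data came from. The standard-library way to do that resolution is `urllib.parse.urljoin`; a short sketch (the example urls are made up):

```python
from urllib.parse import urljoin

# A relative parse result is resolved against the url the page was fetched from.
base_url = 'https://site.example/gallery/page2.html'  # hypothetical fetch url

full = urljoin( base_url, '/images/1234.jpg' )                  # root-relative
sibling = urljoin( base_url, 'thumb/1234s.jpg' )                # document-relative
absolute = urljoin( base_url, 'https://cdn.example/1234.jpg' )  # already full, passes through
```

Root-relative paths replace everything after the host, document-relative paths resolve against the fetch url's directory, and already-absolute urls are returned unchanged.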
@@ -2232,15 +2226,40 @@ The formula should attempt to parse full or relative urls. If the url is relativ

         self._results.SetValue( results_text )
         
-    except Exception as e:
+    def do_it( node, data, referral_url ):
         
-        HydrusData.ShowException( e )
-        
-        message = 'Could not parse!'
-        
-        wx.MessageBox( message )
+        try:
+            
+            stop_time = HydrusData.GetNow() + 30
+            
+            job_key = ClientThreading.JobKey( cancellable = True, stop_time = stop_time )
+            
+            parsed_urls = node.ParseURLs( job_key, data, referral_url )
+            
+            wx.CallAfter( wx_code, parsed_urls )
+            
+        except Exception as e:
+            
+            HydrusData.ShowException( e )
+            
+            message = 'Could not parse!'
+            
+            wx.CallAfter( wx.MessageBox, message )
+            
+    node = self.GetValue()
+    
+    data = self._example_data.GetValue()
+    referral_url = self._referral_url
+    
+    HG.client_controller.CallToThread( do_it, node, data, referral_url )
    
 def GetExampleData( self ):
     
     return self._example_data.GetValue()
     
 def GetExampleURL( self ):
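The refactor above moves the blocking parse onto a worker thread (CallToThread) and marshals the result back to the GUI thread with wx.CallAfter, which posts a callable into the GUI event loop. A minimal sketch of the same hand-off, with a plain queue standing in for the wx event loop so no wx install is needed:

```python
import threading
import queue

# Stand-in for the GUI event loop's pending-call queue.
call_after_queue = queue.Queue()

def call_after( func, *args ):
    
    # like wx.CallAfter: thread-safe, just enqueues the call
    call_after_queue.put( ( func, args ) )
    

results = []

def wx_code( parsed_urls ):
    
    # 'GUI side' consumer of the worker's output
    results.extend( parsed_urls )
    

def do_it( data ):
    
    # worker-thread side: do the slow/blocking work here, never touch the GUI
    try:
        
        parsed_urls = [ word for word in data.split() if word.startswith( 'http' ) ]
        
        call_after( wx_code, parsed_urls )
        
    except Exception:
        
        call_after( wx_code, [] )
        

worker = threading.Thread( target = do_it, args = ( 'http://a.example nope http://b.example', ) )
worker.start()
worker.join()

# the 'main thread' drains the queue, as the wx event loop would between events
while not call_after_queue.empty():
    
    func, args = call_after_queue.get()
    
    func( *args )
```

The key property is that `wx_code` only ever runs on the consumer side, so widget state (here just `results`) is never touched from the worker thread.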
@@ -3100,19 +3119,12 @@ And pass that html to a number of 'parsing children' that will each look through

 def EventTestParse( self, event ):
     
-    script = self.GetValue()
-    
-    try:
+    def wx_code( results ):
         
-        stop_time = HydrusData.GetNow() + 30
-        
-        job_key = ClientThreading.JobKey( cancellable = True, stop_time = stop_time )
-        
-        self._test_script_management.SetJobKey( job_key )
-        
-        data = self._example_data.GetValue()
-        
-        results = script.Parse( job_key, data )
+        if not self:
+            
+            return
         
         result_lines = [ '*** ' + HydrusData.ConvertIntToPrettyString( len( results ) ) + ' RESULTS BEGIN ***' ]
@@ -3124,19 +3136,41 @@ And pass that html to a number of 'parsing children' that will each look through

         self._results.SetValue( results_text )
         
-    except Exception as e:
+    def do_it( script, job_key, data ):
         
-        HydrusData.ShowException( e )
-        
-        message = 'Could not parse!'
-        
-        wx.MessageBox( message )
-        
-    finally:
-        
-        job_key.Finish()
+        try:
+            
+            results = script.Parse( job_key, data )
+            
+            wx.CallAfter( wx_code, results )
+            
+        except Exception as e:
+            
+            HydrusData.ShowException( e )
+            
+            message = 'Could not parse!'
+            
+            wx.CallAfter( wx.MessageBox, message )
+            
+        finally:
+            
+            job_key.Finish()
+            
+    script = self.GetValue()
+    
+    stop_time = HydrusData.GetNow() + 30
+    
+    job_key = ClientThreading.JobKey( cancellable = True, stop_time = stop_time )
+    
+    self._test_script_management.SetJobKey( job_key )
+    
+    data = self._example_data.GetValue()
+    
+    HG.client_controller.CallToThread( do_it, script, job_key, data )
    
 def GetExampleData( self ):
@@ -4414,6 +4448,9 @@ class TestPanel( wx.Panel ):

 self._copy_button = ClientGUICommon.BetterBitmapButton( self, CC.GlobalBMPs.copy, self._Copy )
 self._copy_button.SetToolTip( 'Copy the current example data to the clipboard.' )
 
+self._fetch_button = ClientGUICommon.BetterBitmapButton( self, CC.GlobalBMPs.link, self._FetchFromURL )
+self._fetch_button.SetToolTip( 'Fetch data from a URL.' )
+
 self._paste_button = ClientGUICommon.BetterBitmapButton( self, CC.GlobalBMPs.paste, self._Paste )
 self._paste_button.SetToolTip( 'Paste the current clipboard data into here.' )
@@ -4446,6 +4483,7 @@ class TestPanel( wx.Panel ):

 buttons_hbox = wx.BoxSizer( wx.HORIZONTAL )
 
 buttons_hbox.Add( self._copy_button, CC.FLAGS_VCENTER )
+buttons_hbox.Add( self._fetch_button, CC.FLAGS_VCENTER )
 buttons_hbox.Add( self._paste_button, CC.FLAGS_VCENTER )
 
 desc_hbox = wx.BoxSizer( wx.HORIZONTAL )
@@ -4469,6 +4507,54 @@ class TestPanel( wx.Panel ):

 HG.client_controller.pub( 'clipboard', 'text', self._example_data )
 
+def _FetchFromURL( self ):
+    
+    def wx_code( example_data ):
+        
+        self._SetExampleData( example_data )
+        
+    def do_it( url ):
+        
+        network_job = ClientNetworking.NetworkJob( 'GET', url )
+        
+        network_job.OverrideBandwidth()
+        
+        HG.client_controller.network_engine.AddJob( network_job )
+        
+        try:
+            
+            network_job.WaitUntilDone()
+            
+            example_data = network_job.GetContent()
+            
+        except HydrusExceptions.CancelledException:
+            
+            example_data = 'fetch cancelled'
+            
+        except Exception as e:
+            
+            example_data = 'fetch failed:' + os.linesep * 2 + HydrusData.ToUnicode( e )
+            
+            HydrusData.ShowException( e )
+            
+        wx.CallAfter( wx_code, example_data )
+        
+    message = 'Enter URL to fetch data for.'
+    
+    with ClientGUIDialogs.DialogTextEntry( self, message, default = 'enter url', allow_blank = False ) as dlg:
+        
+        if dlg.ShowModal() == wx.ID_OK:
+            
+            url = dlg.GetValue()
+            
+            HG.client_controller.CallToThread( do_it, url )
+            
 def _Paste( self ):
     
     raw_text = HG.client_controller.GetClipboardText()
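Note how _FetchFromURL never lets a failed fetch escape: cancellation and errors both fall through to a readable placeholder string that lands in the example-data box. The same fallback shape can be sketched without the hydrus network engine (the stub job class below is hypothetical, not ClientNetworking):

```python
# Stand-in for a cancellable network job; the real one is ClientNetworking.NetworkJob.

class CancelledException( Exception ):
    
    pass
    

class StubNetworkJob:
    
    def __init__( self, outcome ):
        
        # outcome is either the fetched content or an exception to raise
        self._outcome = outcome
        
    def wait_until_done( self ):
        
        if isinstance( self._outcome, Exception ):
            
            raise self._outcome
            
    def get_content( self ):
        
        return self._outcome
        

def fetch_example_data( job ):
    
    # always returns a string: content on success, a placeholder otherwise
    try:
        
        job.wait_until_done()
        
        return job.get_content()
        
    except CancelledException:
        
        return 'fetch cancelled'
        
    except Exception as e:
        
        return 'fetch failed: ' + str( e )
        

ok = fetch_example_data( StubNetworkJob( '<html>hello</html>' ) )
cancelled = fetch_example_data( StubNetworkJob( CancelledException() ) )
failed = fetch_example_data( StubNetworkJob( ConnectionError( 'timed out' ) ) )
```

Ordering the cancellation handler before the generic `except Exception` matters: a user cancel is expected and gets a terse note, while genuine errors keep their message.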
@@ -1908,8 +1908,23 @@ class MigrateDatabasePanel( ClientGUIScrolledPanels.ReviewPanel ):

 pretty_portable = 'no'
 
-free_space = HydrusPaths.GetFreeSpace( location )
-pretty_free_space = HydrusData.ConvertIntToBytes( free_space )
+try:
+    
+    free_space = HydrusPaths.GetFreeSpace( location )
+    pretty_free_space = HydrusData.ConvertIntToBytes( free_space )
+    
+except Exception as e:
+    
+    HydrusData.ShowException( e )
+    
+    message = 'There was a problem finding the free space for "' + location + '"! Perhaps this location does not exist?'
+    
-    wx.MessageBox( message )
+    HydrusData.ShowText( message )
+    
+    free_space = 0
+    pretty_free_space = 'problem finding free space'
+    
 fp = locations_to_file_weights[ location ] / 256.0
 tp = locations_to_fs_thumb_weights[ location ] / 256.0
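The hunk above wraps the free-space lookup so that a missing or unreadable location reports a problem instead of raising into the panel. The stdlib equivalent of GetFreeSpace is `shutil.disk_usage`; a sketch of the same guard (the formatting helper is made up, not HydrusData):

```python
import os
import shutil

def get_free_space_string( location ):
    
    # on failure (e.g. the location does not exist), fall back to a
    # placeholder string rather than letting the exception propagate
    try:
        
        free_space = shutil.disk_usage( location ).free
        
        return '%.1f GB free' % ( free_space / ( 1024 ** 3 ) )
        
    except Exception:
        
        return 'problem finding free space'
        

ok = get_free_space_string( os.getcwd() )
bad = get_free_space_string( os.path.join( os.getcwd(), 'does-not-exist-xyz' ) )
```

`shutil.disk_usage` raises `FileNotFoundError` for a nonexistent path, which the broad `except` converts into the placeholder, mirroring the panel's behaviour.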
@@ -33,7 +33,7 @@ class EditSeedCachePanel( ClientGUIScrolledPanels.EditPanel ):

 columns = [ ( '#', 3 ), ( 'source', -1 ), ( 'status', 12 ), ( 'added', 23 ), ( 'last modified', 23 ), ( 'source time', 23 ), ( 'note', 20 ) ]
 
-self._list_ctrl = ClientGUIListCtrl.BetterListCtrl( self, 'seed_cache', 30, 30, columns, self._ConvertSeedToListCtrlTuples )
+self._list_ctrl = ClientGUIListCtrl.BetterListCtrl( self, 'seed_cache', 30, 30, columns, self._ConvertSeedToListCtrlTuples, delete_key_callback = self._DeleteSelected )
 
 #
File diff suppressed because it is too large
@@ -958,7 +958,9 @@ class MediaList( object ):

 if media.IsCollection():
     
-    media_results.extend( media.GenerateMediaResults( has_location = has_location, discriminant = discriminant, selected_media = selected_media, unrated = unrated, for_media_viewer = True ) )
+    # don't include selected_media here as it is not valid at the deeper collection level
+    
+    media_results.extend( media.GenerateMediaResults( has_location = has_location, discriminant = discriminant, unrated = unrated, for_media_viewer = True ) )
     
 else:
@@ -1489,68 +1491,6 @@ class MediaSingleton( Media ):

 return self._media_result.GetHash()
 
-def MatchesDiscriminant( self, has_location = None, discriminant = None, not_uploaded_to = None ):
-    
-    if discriminant is not None:
-        
-        inbox = self._media_result.GetInbox()
-        
-        locations_manager = self._media_result.GetLocationsManager()
-        
-        if discriminant == CC.DISCRIMINANT_INBOX:
-            
-            p = inbox
-            
-        elif discriminant == CC.DISCRIMINANT_ARCHIVE:
-            
-            p = not inbox
-            
-        elif discriminant == CC.DISCRIMINANT_LOCAL:
-            
-            p = locations_manager.IsLocal()
-            
-        elif discriminant == CC.DISCRIMINANT_LOCAL_BUT_NOT_IN_TRASH:
-            
-            p = locations_manager.IsLocal() and not locations_manager.IsTrashed()
-            
-        elif discriminant == CC.DISCRIMINANT_NOT_LOCAL:
-            
-            p = not locations_manager.IsLocal()
-            
-        elif discriminant == CC.DISCRIMINANT_DOWNLOADING:
-            
-            p = locations_manager.IsDownloading()
-            
-        if not p:
-            
-            return False
-            
-    if has_location is not None:
-        
-        locations_manager = self._media_result.GetLocationsManager()
-        
-        if has_location not in locations_manager.GetCurrent():
-            
-            return False
-            
-    if not_uploaded_to is not None:
-        
-        locations_manager = self._media_result.GetLocationsManager()
-        
-        if not_uploaded_to in locations_manager.GetCurrentRemote():
-            
-            return False
-            
-    return True
-    
 def GetHashes( self, has_location = None, discriminant = None, not_uploaded_to = None, ordered = False ):
     
     if self.MatchesDiscriminant( has_location = has_location, discriminant = discriminant, not_uploaded_to = not_uploaded_to ):
@@ -1730,6 +1670,68 @@ class MediaSingleton( Media ):

 def IsSizeDefinite( self ): return self._media_result.GetSize() is not None
 
+def MatchesDiscriminant( self, has_location = None, discriminant = None, not_uploaded_to = None ):
+    
+    if discriminant is not None:
+        
+        inbox = self._media_result.GetInbox()
+        
+        locations_manager = self._media_result.GetLocationsManager()
+        
+        if discriminant == CC.DISCRIMINANT_INBOX:
+            
+            p = inbox
+            
+        elif discriminant == CC.DISCRIMINANT_ARCHIVE:
+            
+            p = not inbox
+            
+        elif discriminant == CC.DISCRIMINANT_LOCAL:
+            
+            p = locations_manager.IsLocal()
+            
+        elif discriminant == CC.DISCRIMINANT_LOCAL_BUT_NOT_IN_TRASH:
+            
+            p = locations_manager.IsLocal() and not locations_manager.IsTrashed()
+            
+        elif discriminant == CC.DISCRIMINANT_NOT_LOCAL:
+            
+            p = not locations_manager.IsLocal()
+            
+        elif discriminant == CC.DISCRIMINANT_DOWNLOADING:
+            
+            p = locations_manager.IsDownloading()
+            
+        if not p:
+            
+            return False
+            
+    if has_location is not None:
+        
+        locations_manager = self._media_result.GetLocationsManager()
+        
+        if has_location not in locations_manager.GetCurrent():
+            
+            return False
+            
+    if not_uploaded_to is not None:
+        
+        locations_manager = self._media_result.GetLocationsManager()
+        
+        if not_uploaded_to in locations_manager.GetCurrentRemote():
+            
+            return False
+            
+    return True
+    
 def RefreshFileInfo( self ):
     
     self._media_result.RefreshFileInfo()
@@ -49,7 +49,7 @@ options = {}

 # Misc
 
 NETWORK_VERSION = 18
-SOFTWARE_VERSION = 300
+SOFTWARE_VERSION = 301
 
 UNSCALED_THUMBNAIL_DIMENSIONS = ( 200, 200 )
@@ -16,6 +16,7 @@ db_report_mode = False

 db_profile_mode = False
 gui_report_mode = False
 shortcut_report_mode = False
+subscription_report_mode = False
 hover_window_report_mode = False
 menu_profile_mode = False
 network_report_mode = False
@@ -32,7 +32,7 @@ SERIALISABLE_TYPE_PREDICATE = 14

 SERIALISABLE_TYPE_FILE_SEARCH_CONTEXT = 15
 SERIALISABLE_TYPE_EXPORT_FOLDER = 16
 SERIALISABLE_TYPE_THREAD_WATCHER_IMPORT = 17
-SERIALISABLE_TYPE_PAGE_OF_IMAGES_IMPORT = 18
+SERIALISABLE_TYPE_SIMPLE_DOWNLOADER_IMPORT = 18
 SERIALISABLE_TYPE_IMPORT_FOLDER = 19
 SERIALISABLE_TYPE_GALLERY_IMPORT = 20
 SERIALISABLE_TYPE_DICTIONARY = 21
@@ -662,7 +662,7 @@ class TestClientDB( unittest.TestCase ):

 #
 
-management_controller = ClientGUIManagement.CreateManagementControllerImportPageOfImages()
+management_controller = ClientGUIManagement.CreateManagementControllerImportSimpleDownloader()
 
 page = ClientGUIPages.Page( test_frame, HG.test_controller, management_controller, [] )
@@ -744,7 +744,7 @@ class TestClientDB( unittest.TestCase ):

-self.assertEqual( page_names, [ u'hentai foundry artist', u'import', u'thread watcher', u'page download', u'example tag repo petitions', u'search', u'search', u'files', u'wew lad', u'files' ] )
+self.assertEqual( page_names, [ u'hentai foundry artist', u'import', u'thread watcher', u'simple downloader', u'example tag repo petitions', u'search', u'search', u'files', u'wew lad', u'files' ] )
 
 finally: