Version 321

@@ -8,8 +8,47 @@
 <div class="content">
 <h3>changelog</h3>
 <ul>
+<li><h3>version 321</h3></li>
+<ul>
+<li>downloader overhaul</li>
+<li>the basic downloader overhaul is complete! at this point, any user can create and share the objects required for a completely new downloader! it is still rough in some places, so a round of EZ-import is coming to make adding new downloaders a single easy drag and drop action</li>
+<li>rounded out the ngug (nested gugs, which contain multiple gugs) code</li>
+<li>updated the edit gug panel to deal with gugs and ngugs on different notebook pages</li>
+<li>added a bunch of logic to this panel and backend data handling to deal with missing gugs in ngugs</li>
+<li>if an ngug cannot find a gug by its internal identifier key, it will now attempt to fall back to its simple name, and will silently fail if no gug can be found. all gug tracking now uses this 'key first, name later' id method, so downloaders and subs should generally survive gug renames and same-name overwrites</li>
+<li>the gallery selector now works in gugs. it has two 'pages', depending on which gugs are set to 'display', and will note if the chosen gug cannot be found in the current definitions. the gugs have slightly more specific names ('gelbooru tag search', 'hentai foundry artist lookup', etc...) than before</li>
+<li>the gallery selector also puts 'non-functional' gugs (i.e. those with no parsable gallery url class) on a third page</li>
+<li>moved the gallery downloader gallery and file pipeline completely over to the new system</li>
+<li>the gallery downloader will now bundle nested gugs (like hentai foundry artist, which searches both works and scraps) into a single downloader</li>
+<li>moved the subs gallery and file pipeline completely over to the new system</li>
+<li>the subs gallery sync now handles nested gugs (like hentai foundry artist, which searches both works and scraps) in an interleaved manner and makes behind-the-scenes checking decisions in a clearer and more logical way</li>
+<li>subs should now make correct 'hit limit' stop reason reports and not generate new gallery pages when the current page has exactly enough results to hit the current file limit</li>
+<li>artstation artist lookup is now available as a default downloader</li>
+<li>newgrounds artist lookup makes a triumphant return. it works pretty well, given how flash and NG have changed since</li>
+<li>derpibooru tag lookup is now available as a default downloader. due to unusual search syntax on derpibooru, please enter queries exactly as you would on derpi, using ',' or ' AND ' to separate tags (such as 'rainbow dash,straight')</li>
+<li>pixiv now has multiple artist lookup options--either images, manga, ugoira (doesn't work yet!), or everything</li>
+<li>the old downloader code is deleted!</li>
+<li>the old manage booru dialog is deleted!</li>
+<li>'custom' boorus (i.e. new ones you created or imported through 'manage boorus') cannot be completely automatically updated to the new system. I've figured out a way to generate new gugs and gallery&post parsers, but they will be missing the url classes needed to get working again. your custom-booru subs will notice this and safely pause until the issue is fixed. if you rely on custom boorus, please check the release post for info on this--you might like to put off updating</li>
+<li>many misc changes and fixes to gugs and the overall gallery url handling pipeline</li>
+<li>some misc refactoring and concept-renaming in the gallery pipeline r.e. gugs</li>
+<li>when the downloader tries to import what looks like a raw html file, its error notes will specify this and suggest that a parser may be needed</li>
+<li>moved the 'media viewer url display' options panel from the manage url match links dialog to the new network->downloaders->manage downloader and url display</li>
+<li>this new dialog also hosts a list for managing which downloaders to show in the first list of the downloader selector</li>
+<li>.</li>
+<li>misc</li>
+<li>gave the video rendering pipeline communication logic a quick pass, cleaning up a bunch of bad code and other decisions. the video renderer should be quicker to respond to various changes in scanbar position, and incidences of the frame buffer suddenly sperging out (usually inexplicably falling behind the current frame position or deciding to regen for no apparent reason) should be greatly reduced if not completely eliminated</li>
+<li>the test that stops repository processing if there is not enough disk space now uses half the current size of client.mappings.db for its estimate (previously 1GB), also tests temp folder location free space (just as the vacuum test does), and reports the nature of the error along with pausing the repo, stopping further attempts</li>
+<li>might have fixed another out-of-order dialog close/open event combination during manage tags close->advanced content update open</li>
+<li>fixed gallery queries that include '/' (or some other unusual characters) that end up in the 'path' of the url (as opposed to the query). this fixes 'male/female' on e621, for instance</li>
+<li>'advanced mode' users now have a 'nudge subs awake' menu entry below 'manage subs'. this simply wakes the subs daemon (which usually only checks once every four hours), in case any subs are due</li>
+<li>'db report mode' now reports every db job as it comes in (formerly, it only reported some optimisation esoterica). this makes it a more lightweight version of 'db profile mode' for several debugging tasks</li>
+<li>fixed a tiny issue in fetching the 'how boned am I?' stats when the user had a zero inbox/everything count</li>
+<li>fixed a typo in the default new url class object that was breaking the edit ui panel</li>
+<li>highlighted the quiet filename tagging options on the edit import folder panel</li>
+</ul>
 <li><h3>version 320</h3></li>
 <ul>
 <li>clients should now have objects for all default downloaders. everything should be prepped for the big switchover:</li>
 <li>wrote gallery url generators for all the default downloaders and a couple more as well</li>
 <li>wrote a gallery parser for deviant art--it also comes with an update to the DA url class because the meta 'next page' link on DA gallery pages is invalid wew!</li>
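The 'key first, name later' id method described in the changelog can be sketched as follows. This is a hypothetical illustration, not hydrus's real API: the function and dict names are invented, and the real client resolves serialisable objects rather than plain dicts.

```python
# Hypothetical sketch of 'key first, name later' gug resolution.
# gugs_by_key maps stable identifier keys to gugs; gugs_by_name maps
# human-readable names to gugs. Names are illustrative only.
def lookup_gug(gugs_by_key, gugs_by_name, gug_key, gug_name):
    # Prefer the stable identifier key, which survives renames...
    if gug_key in gugs_by_key:
        return gugs_by_key[gug_key]

    # ...fall back to the simple name, which survives same-name overwrites...
    if gug_name in gugs_by_name:
        return gugs_by_name[gug_name]

    # ...and fail silently (None) if neither matches.
    return None
```

This ordering is why, per the changelog, downloaders and subs should generally survive both gug renames (key still matches) and same-name overwrites (name still matches after the key changes).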
@@ -2947,8 +2947,6 @@ class DB( HydrusDB.HydrusDB ):
         self._AddService( service_key, service_type, name, dictionary )
         
         
-        self._c.executemany( 'INSERT INTO yaml_dumps VALUES ( ?, ?, ? );', ( ( YAML_DUMP_ID_REMOTE_BOORU, name, booru ) for ( name, booru ) in ClientDefaults.GetDefaultBoorus().items() ) )
-        
         self._c.executemany( 'INSERT INTO yaml_dumps VALUES ( ?, ?, ? );', ( ( YAML_DUMP_ID_IMAGEBOARD, name, imageboards ) for ( name, imageboards ) in ClientDefaults.GetDefaultImageboards() ) )
         
         new_options = ClientOptions.ClientOptions( self._db_dir )
@@ -3936,6 +3934,16 @@ class DB( HydrusDB.HydrusDB ):
         ( num_total, size_total ) = self._c.execute( 'SELECT COUNT( hash_id ), SUM( size ) FROM files_info NATURAL JOIN current_files WHERE service_id = ?;', ( self._local_file_service_id, ) ).fetchone()
         ( num_inbox, size_inbox ) = self._c.execute( 'SELECT COUNT( hash_id ), SUM( size ) FROM files_info NATURAL JOIN current_files NATURAL JOIN file_inbox WHERE service_id = ?;', ( self._local_file_service_id, ) ).fetchone()
         
+        if size_total is None:
+            
+            size_total = 0
+            
+        
+        if size_inbox is None:
+            
+            size_inbox = 0
+            
+        
         num_archive = num_total - num_inbox
         size_archive = size_total - size_inbox
         
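The `None` guards added above exist because of a SQLite quirk: `COUNT()` over zero rows returns 0, but `SUM()` over zero rows returns NULL, which arrives in Python as `None`. A minimal, self-contained demonstration of the same coalescing pattern:

```python
import sqlite3

# In-memory db with an empty table shaped loosely like files_info.
conn = sqlite3.connect(':memory:')
c = conn.cursor()
c.execute('CREATE TABLE files_info ( hash_id INTEGER, size INTEGER );')

# COUNT() gives 0 over no rows, but SUM() gives NULL -> Python None.
(num_total, size_total) = c.execute('SELECT COUNT( hash_id ), SUM( size ) FROM files_info;').fetchone()

# Coalesce exactly as the hunk above does, so later arithmetic cannot
# crash with a TypeError on None.
if size_total is None:
    size_total = 0
```

This is the "zero inbox/everything count" edge case the changelog's 'how boned am I?' fix refers to.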
@@ -8390,11 +8398,19 @@ class DB( HydrusDB.HydrusDB ):
 
     def _ProcessRepositoryUpdates( self, service_key, only_when_idle = False, stop_time = None ):
         
-        if HydrusPaths.GetFreeSpace( self._db_dir ) < 1024 * 1048576:
+        db_path = os.path.join( self._db_dir, 'client.mappings.db' )
+        
+        db_size = os.path.getsize( db_path )
+        
+        ( has_space, reason ) = HydrusPaths.HasSpaceForDBTransaction( self._db_dir, db_size / 2 )
+        
+        if not has_space:
             
-            HydrusData.ShowText( 'The db partition has <1GB free space, so will not sync repositories.' )
+            message = 'Not enough free disk space to guarantee a safe repository processing job. Full text: ' + os.linesep * 2 + reason
             
-            return
+            HydrusData.DebugPrint( message )
+            
+            raise Exception( message )
             
         
         service_id = self._GetServiceId( service_key )
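The hunk above replaces a flat 1GB threshold with an estimate based on half the current size of client.mappings.db. A rough sketch of such a guard follows; `has_space_for_db_transaction` and its safety margin are invented stand-ins for `HydrusPaths.HasSpaceForDBTransaction`, whose real signature and internals are not shown in this diff.

```python
import shutil

def has_space_for_db_transaction(db_dir, num_bytes):
    # Hypothetical stand-in: require the estimated transaction size plus a
    # fixed safety margin of free space on the db partition, and return a
    # (has_space, reason) pair like the real helper appears to.
    safety_margin = 512 * 1024 * 1024  # assumed margin, illustrative only

    free = shutil.disk_usage(db_dir).free

    if free < num_bytes + safety_margin:
        reason = 'Needed {} bytes free in {}, but only {} were available.'.format(
            num_bytes + safety_margin, db_dir, free)
        return (False, reason)

    return (True, 'ok')
```

The point of returning a reason string rather than printing directly is visible in the hunk: the caller can both log the message and raise it, so the repo pauses with a human-readable explanation instead of silently returning.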
@@ -8673,8 +8689,6 @@ class DB( HydrusDB.HydrusDB ):
         elif action == 'options': result = self._GetOptions( *args, **kwargs )
         elif action == 'pending': result = self._GetPending( *args, **kwargs )
         elif action == 'recent_tags': result = self._GetRecentTags( *args, **kwargs )
-        elif action == 'remote_booru': result = self._GetYAMLDump( YAML_DUMP_ID_REMOTE_BOORU, *args, **kwargs )
-        elif action == 'remote_boorus': result = self._GetYAMLDump( YAML_DUMP_ID_REMOTE_BOORU )
         elif action == 'repository_progress': result = self._GetRepositoryProgress( *args, **kwargs )
         elif action == 'serialisable': result = self._GetJSONDump( *args, **kwargs )
         elif action == 'serialisable_simple': result = self._GetJSONSimple( *args, **kwargs )
@@ -10792,6 +10806,111 @@ class DB( HydrusDB.HydrusDB ):
 
 
 
+        if version == 320:
+            
+            try:
+                
+                domain_manager = self._GetJSONDump( HydrusSerialisable.SERIALISABLE_TYPE_NETWORK_DOMAIN_MANAGER )
+                
+                domain_manager.Initialise()
+                
+                #
+                
+                domain_manager.DeleteGUGs( [ 'rule.34.paheal tag search' ] )
+                domain_manager.OverwriteDefaultGUGs( [ 'rule34.paheal tag search' ] )
+                
+                domain_manager.OverwriteDefaultGUGs( [ 'newgrounds artist lookup', 'hentai foundry artist lookup', 'danbooru & gelbooru tag search' ] )
+                
+                domain_manager.OverwriteDefaultGUGs( [ 'derpibooru tag search' ] )
+                
+                #
+                
+                domain_manager.OverwriteDefaultURLMatches( [ 'derpibooru gallery page' ] )
+                
+                #
+                
+                domain_manager.OverwriteDefaultParsers( [ 'derpibooru gallery page parser' ] )
+                
+                #
+                
+                boorus = self._STL( self._c.execute( 'SELECT dump FROM yaml_dumps WHERE dump_type = ?;', ( YAML_DUMP_ID_REMOTE_BOORU, ) ) )
+                
+                default_names = { 'derpibooru', 'gelbooru', 'safebooru', 'e621', 'rule34@paheal', 'danbooru', 'mishimmie', 'rule34@booru.org', 'furry@booru.org', 'xbooru', 'konachan', 'yande.re', 'tbib', 'sankaku chan', 'sankaku idol', 'rule34hentai' }
+                
+                new_gugs = []
+                new_parsers = []
+                
+                import ClientDownloading
+                
+                for booru in boorus:
+                    
+                    name = booru.GetName()
+                    
+                    if name in default_names:
+                        
+                        continue # we already have an update in place, so no need for a legacy update
+                        
+                    
+                    try:
+                        
+                        ( gug, gallery_parser, post_parser ) = ClientDownloading.ConvertBooruToNewObjects( booru )
+                        
+                        new_gugs.append( gug )
+                        
+                        new_parsers.extend( ( gallery_parser, post_parser ) )
+                        
+                    except Exception as e:
+                        
+                        HydrusData.PrintException( e )
+                        
+                    
+                
+                if len( new_gugs ) > 0:
+                    
+                    try:
+                        
+                        domain_manager.AddGUGs( new_gugs )
+                        
+                    except Exception as e:
+                        
+                        HydrusData.PrintException( e )
+                        
+                    
+                
+                if len( new_parsers ) > 0:
+                    
+                    try:
+                        
+                        domain_manager.AddParsers( new_parsers )
+                        
+                    except Exception as e:
+                        
+                        HydrusData.PrintException( e )
+                        
+                    
+                
+                #
+                
+                domain_manager.TryToLinkURLMatchesAndParsers()
+                
+                #
+                
+                self._SetJSONDump( domain_manager )
+                
+                message = 'The final big step of the downloader overhaul has just occurred! It looks like it went well. All of your default downloaders and subs should have updated smoothly, but if you use any \'custom\' boorus that you created or imported yourself, these will need more work to get going again. The subs using these old boorus will notice the problem and safely pause on their next sync attempt. Please see my v321 release post for more information.'
+                
+                self.pub_initial_message( message )
+                
+            except Exception as e:
+                
+                HydrusData.PrintException( e )
+                
+                message = 'Trying to update some downloader objects failed! Please let hydrus dev know!'
+                
+                self.pub_initial_message( message )
+                
+            
+        
         self._controller.pub( 'splash_set_title_text', 'updated db to v' + str( version + 1 ) )
         
         self._c.execute( 'UPDATE version SET version = ?;', ( version + 1, ) )
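The conversion loop above wraps each legacy booru in its own try/except, so one malformed object cannot abort the whole migration; failures are logged and the rest still convert. That failure-isolation pattern in general terms (the names here are illustrative, not hydrus's):

```python
# Sketch of per-item conversion with failure isolation, as used in the
# v320->v321 booru migration above. convert_one may raise on bad input.
def convert_all(legacy_items, convert_one):
    converted = []
    failures = []

    for item in legacy_items:
        try:
            converted.append(convert_one(item))
        except Exception as e:
            # Record and carry on, so the remaining items still migrate.
            failures.append((item, e))

    return (converted, failures)
```

Collecting the successes first and only then calling the equivalent of `AddGUGs`/`AddParsers` (each itself guarded) mirrors the hunk's structure: a bad object degrades the migration gracefully instead of leaving the domain manager half-written.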
@@ -11364,7 +11483,6 @@ class DB( HydrusDB.HydrusDB ):
         elif action == 'delete_imageboard': result = self._DeleteYAMLDump( YAML_DUMP_ID_IMAGEBOARD, *args, **kwargs )
         elif action == 'delete_local_booru_share': result = self._DeleteYAMLDump( YAML_DUMP_ID_LOCAL_BOORU, *args, **kwargs )
         elif action == 'delete_pending': result = self._DeletePending( *args, **kwargs )
-        elif action == 'delete_remote_booru': result = self._DeleteYAMLDump( YAML_DUMP_ID_REMOTE_BOORU, *args, **kwargs )
         elif action == 'delete_serialisable_named': result = self._DeleteJSONDumpNamed( *args, **kwargs )
         elif action == 'delete_service_info': result = self._DeleteServiceInfo( *args, **kwargs )
         elif action == 'delete_unknown_duplicate_pairs': result = self._CacheSimilarFilesDeleteUnknownDuplicatePairs( *args, **kwargs )
@@ -11386,7 +11504,6 @@ class DB( HydrusDB.HydrusDB ):
         elif action == 'regenerate_ac_cache': result = self._RegenerateACCache( *args, **kwargs )
         elif action == 'regenerate_similar_files': result = self._CacheSimilarFilesRegenerateTree( *args, **kwargs )
         elif action == 'relocate_client_files': result = self._RelocateClientFiles( *args, **kwargs )
-        elif action == 'remote_booru': result = self._SetYAMLDump( YAML_DUMP_ID_REMOTE_BOORU, *args, **kwargs )
         elif action == 'repair_client_files': result = self._RepairClientFiles( *args, **kwargs )
         elif action == 'reparse_files': result = self._ReparseFiles( *args, **kwargs )
         elif action == 'reset_repository': result = self._ResetRepository( *args, **kwargs )
@@ -165,249 +165,29 @@ def GetDefaultHentaiFoundryInfo():
     
     return info
     
-def GetDefaultSearchValue( gallery_identifier ):
-    
-    site_type = gallery_identifier.GetSiteType()
-    
-    if site_type == HC.SITE_TYPE_DEFAULT:
-        
-        search_value = ''
-        
-    elif site_type == HC.SITE_TYPE_BOORU:
-        
-        search_value = 'search tags'
-        
-    elif site_type == HC.SITE_TYPE_DEVIANT_ART:
-        
-        search_value = 'artist username'
-        
-    elif site_type == HC.SITE_TYPE_GIPHY:
-        
-        search_value = 'search tag'
-        
-    elif site_type in ( HC.SITE_TYPE_HENTAI_FOUNDRY, HC.SITE_TYPE_HENTAI_FOUNDRY_ARTIST, HC.SITE_TYPE_HENTAI_FOUNDRY_TAGS ):
-        
-        if site_type == HC.SITE_TYPE_HENTAI_FOUNDRY:
-            
-            search_value = 'search'
-            
-        elif site_type == HC.SITE_TYPE_HENTAI_FOUNDRY_ARTIST:
-            
-            search_value = 'artist username'
-            
-        elif site_type == HC.SITE_TYPE_HENTAI_FOUNDRY_TAGS:
-            
-            search_value = 'search tags'
-            
-        
-    elif site_type == HC.SITE_TYPE_NEWGROUNDS:
-        
-        search_value = 'artist username'
-        
-    elif site_type in ( HC.SITE_TYPE_PIXIV, HC.SITE_TYPE_PIXIV_ARTIST_ID, HC.SITE_TYPE_PIXIV_TAG ):
-        
-        if site_type == HC.SITE_TYPE_PIXIV:
-            
-            search_value = 'search'
-            
-        elif site_type == HC.SITE_TYPE_PIXIV_ARTIST_ID:
-            
-            search_value = 'numerical artist id'
-            
-        elif site_type == HC.SITE_TYPE_PIXIV_TAG:
-            
-            search_value = 'search tag'
-            
-        
-    elif site_type == HC.SITE_TYPE_TUMBLR:
-        
-        search_value = 'username'
-        
-    elif site_type == HC.SITE_TYPE_WATCHER:
-        
-        search_value = 'thread url'
-        
-    
-    return search_value
-    
-def GetDefaultBoorus():
-    
-    boorus = {}
-    
-    name = 'gelbooru'
-    search_url = 'https://gelbooru.com/index.php?page=post&s=list&tags=%tags%&pid=%index%'
-    search_separator = '+'
-    advance_by_page_num = False
-    thumb_classname = 'thumb'
-    image_id = None
-    image_data = 'Original image'
-    tag_classnames_to_namespaces = { 'tag-type-general' : '', 'tag-type-character' : 'character', 'tag-type-copyright' : 'series', 'tag-type-artist' : 'creator' }
-    
-    boorus[ 'gelbooru' ] = ClientData.Booru( name, search_url, search_separator, advance_by_page_num, thumb_classname, image_id, image_data, tag_classnames_to_namespaces )
-    
-    name = 'safebooru'
-    search_url = 'https://safebooru.org/index.php?page=post&s=list&tags=%tags%&pid=%index%'
-    search_separator = '+'
-    advance_by_page_num = False
-    thumb_classname = 'thumb'
-    image_id = None
-    image_data = 'Original image'
-    tag_classnames_to_namespaces = { 'tag-type-general' : '', 'tag-type-character' : 'character', 'tag-type-copyright' : 'series', 'tag-type-artist' : 'creator' }
-    
-    boorus[ 'safebooru' ] = ClientData.Booru( name, search_url, search_separator, advance_by_page_num, thumb_classname, image_id, image_data, tag_classnames_to_namespaces )
-    
-    name = 'e621'
-    search_url = 'https://e621.net/post/index/%index%/%tags%'
-    search_separator = '%20'
-    advance_by_page_num = True
-    thumb_classname = 'thumb'
-    image_id = None
-    image_data = 'Download'
-    tag_classnames_to_namespaces = { 'tag-type-general' : '', 'tag-type-character' : 'character', 'tag-type-copyright' : 'series', 'tag-type-artist' : 'creator', 'tag-type-species' : 'species' }
-    
-    boorus[ 'e621' ] = ClientData.Booru( name, search_url, search_separator, advance_by_page_num, thumb_classname, image_id, image_data, tag_classnames_to_namespaces )
-    
-    name = 'rule34@paheal'
-    search_url = 'https://rule34.paheal.net/post/list/%tags%/%index%'
-    search_separator = '%20'
-    advance_by_page_num = True
-    thumb_classname = 'thumb'
-    image_id = 'main_image'
-    image_data = None
-    tag_classnames_to_namespaces = { 'tag_name' : '' }
-    
-    boorus[ 'rule34@paheal' ] = ClientData.Booru( name, search_url, search_separator, advance_by_page_num, thumb_classname, image_id, image_data, tag_classnames_to_namespaces )
-    
-    name = 'danbooru'
-    search_url = 'https://danbooru.donmai.us/posts?page=%index%&tags=%tags%'
-    search_separator = '%20'
-    advance_by_page_num = True
-    thumb_classname = 'post-preview'
-    image_id = 'image'
-    image_data = None
-    tag_classnames_to_namespaces = { 'category-0' : '', 'category-4' : 'character', 'category-3' : 'series', 'category-1' : 'creator' }
-    
-    boorus[ 'danbooru' ] = ClientData.Booru( name, search_url, search_separator, advance_by_page_num, thumb_classname, image_id, image_data, tag_classnames_to_namespaces )
-    
-    name = 'mishimmie'
-    search_url = 'https://shimmie.katawa-shoujo.com/post/list/%tags%/%index%'
-    search_separator = '%20'
-    advance_by_page_num = True
-    thumb_classname = 'thumb'
-    image_id = 'main_image'
-    image_data = None
-    tag_classnames_to_namespaces = { 'tag_name' : '' }
-    
-    boorus[ 'mishimmie' ] = ClientData.Booru( name, search_url, search_separator, advance_by_page_num, thumb_classname, image_id, image_data, tag_classnames_to_namespaces )
-    
-    name = 'rule34@booru.org'
-    search_url = 'https://rule34.xxx/index.php?page=post&s=list&tags=%tags%&pid=%index%'
-    search_separator = '%20'
-    advance_by_page_num = False
-    thumb_classname = 'thumb'
-    image_id = None
-    image_data = 'Original image'
-    tag_classnames_to_namespaces = { 'tag-type-general' : '', 'tag-type-character' : 'character', 'tag-type-copyright' : 'series', 'tag-type-artist' : 'creator' }
-    
-    boorus[ 'rule34@booru.org' ] = ClientData.Booru( name, search_url, search_separator, advance_by_page_num, thumb_classname, image_id, image_data, tag_classnames_to_namespaces )
-    
-    name = 'furry@booru.org'
-    search_url = 'http://furry.booru.org/index.php?page=post&s=list&tags=%tags%&pid=%index%'
-    search_separator = '+'
-    advance_by_page_num = False
-    thumb_classname = 'thumb'
-    image_id = None
-    image_data = 'Original image'
-    tag_classnames_to_namespaces = { 'tag-type-general' : '', 'tag-type-character' : 'character', 'tag-type-copyright' : 'series', 'tag-type-artist' : 'creator' }
-    
-    boorus[ 'furry@booru.org' ] = ClientData.Booru( name, search_url, search_separator, advance_by_page_num, thumb_classname, image_id, image_data, tag_classnames_to_namespaces )
-    
-    name = 'xbooru'
-    search_url = 'https://xbooru.com/index.php?page=post&s=list&tags=%tags%&pid=%index%'
-    search_separator = '+'
-    advance_by_page_num = False
-    thumb_classname = 'thumb'
-    image_id = None
-    image_data = 'Original image'
-    tag_classnames_to_namespaces = { 'tag-type-general' : '', 'tag-type-character' : 'character', 'tag-type-copyright' : 'series', 'tag-type-artist' : 'creator' }
-    
-    boorus[ 'xbooru' ] = ClientData.Booru( name, search_url, search_separator, advance_by_page_num, thumb_classname, image_id, image_data, tag_classnames_to_namespaces )
-    
-    name = 'konachan'
-    search_url = 'https://konachan.com/post?page=%index%&tags=%tags%'
-    search_separator = '+'
-    advance_by_page_num = True
-    thumb_classname = 'thumb'
-    image_id = None
-    image_data = 'View larger version'
-    tag_classnames_to_namespaces = { 'tag-type-general' : '', 'tag-type-character' : 'character', 'tag-type-copyright' : 'series', 'tag-type-artist' : 'creator' }
-    
-    boorus[ 'konachan' ] = ClientData.Booru( name, search_url, search_separator, advance_by_page_num, thumb_classname, image_id, image_data, tag_classnames_to_namespaces )
-    
-    name = 'yande.re'
-    search_url = 'https://yande.re/post?page=%index%&tags=%tags%'
-    search_separator = '+'
-    advance_by_page_num = True
-    thumb_classname = 'thumb'
-    image_id = None
-    image_data = 'View larger version'
-    tag_classnames_to_namespaces = { 'tag-type-general' : '', 'tag-type-character' : 'character', 'tag-type-copyright' : 'series', 'tag-type-artist' : 'creator' }
-    
-    boorus[ 'yande.re' ] = ClientData.Booru( name, search_url, search_separator, advance_by_page_num, thumb_classname, image_id, image_data, tag_classnames_to_namespaces )
-    
-    name = 'tbib'
-    search_url = 'https://tbib.org/index.php?page=post&s=list&tags=%tags%&pid=%index%'
-    search_separator = '+'
-    advance_by_page_num = False
-    thumb_classname = 'thumb'
-    image_id = None
-    image_data = 'Original image'
-    tag_classnames_to_namespaces = { 'tag-type-general' : '', 'tag-type-character' : 'character', 'tag-type-copyright' : 'series', 'tag-type-artist' : 'creator' }
-    
-    boorus[ 'tbib' ] = ClientData.Booru( name, search_url, search_separator, advance_by_page_num, thumb_classname, image_id, image_data, tag_classnames_to_namespaces )
-    
-    name = 'sankaku chan'
-    search_url = 'https://chan.sankakucomplex.com/?tags=%tags%&page=%index%'
-    search_separator = '+'
-    advance_by_page_num = True
-    thumb_classname = 'thumb'
-    image_id = 'highres'
-    image_data = None
-    tag_classnames_to_namespaces = { 'tag-type-general' : '', 'tag-type-character' : 'character', 'tag-type-copyright' : 'series', 'tag-type-artist' : 'creator', 'tag-type-medium' : 'medium', 'tag-type-meta' : 'meta', 'tag-type-studio' : 'studio' }
-    
-    boorus[ 'sankaku chan' ] = ClientData.Booru( name, search_url, search_separator, advance_by_page_num, thumb_classname, image_id, image_data, tag_classnames_to_namespaces )
-    
-    name = 'sankaku idol'
-    search_url = 'https://idol.sankakucomplex.com/?tags=%tags%&page=%index%'
-    search_separator = '+'
-    advance_by_page_num = True
-    thumb_classname = 'thumb'
-    image_id = 'highres'
-    image_data = None
-    tag_classnames_to_namespaces = { 'tag-type-general' : '', 'tag-type-character' : 'character', 'tag-type-copyright' : 'series', 'tag-type-artist' : 'creator', 'tag-type-medium' : 'medium', 'tag-type-meta' : 'meta', 'tag-type-photo_set' : 'photo set', 'tag-type-idol' : 'person' }
-    
-    boorus[ 'sankaku idol' ] = ClientData.Booru( name, search_url, search_separator, advance_by_page_num, thumb_classname, image_id, image_data, tag_classnames_to_namespaces )
-    
-    name = 'rule34hentai'
-    search_url = 'https://rule34hentai.net/post/list/%tags%/%index%'
-    search_separator = '%20'
-    advance_by_page_num = True
-    thumb_classname = 'shm-thumb'
-    image_id = 'main_image'
-    image_data = None
-    tag_classnames_to_namespaces = { 'tag_name' : '' }
-    
-    boorus[ 'rule34hentai' ] = ClientData.Booru( name, search_url, search_separator, advance_by_page_num, thumb_classname, image_id, image_data, tag_classnames_to_namespaces )
-    
-    return boorus
-    
 def GetDefaultGUGs():
     
     dir_path = os.path.join( HC.STATIC_DIR, 'default', 'gugs' )
     
     import ClientNetworkingDomain
     
-    return GetDefaultObjectsFromPNGs( dir_path, ( ClientNetworkingDomain.GalleryURLGenerator, ) )
+    return GetDefaultObjectsFromPNGs( dir_path, ( ClientNetworkingDomain.GalleryURLGenerator, ClientNetworkingDomain.NestedGalleryURLGenerator ) )
     
+def GetDefaultNGUGs():
+    
+    import ClientNetworkingDomain
+    
+    gugs = [ gug for gug in GetDefaultGUGs() if isinstance( gug, ClientNetworkingDomain.NestedGalleryURLGenerator ) ]
+    
+    return gugs
+    
+def GetDefaultSingleGUGs():
+    
+    import ClientNetworkingDomain
+    
+    gugs = [ gug for gug in GetDefaultGUGs() if isinstance( gug, ClientNetworkingDomain.GalleryURLGenerator ) ]
+    
+    return gugs
+    
 def GetDefaultImageboards():
     
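`GetDefaultNGUGs` and `GetDefaultSingleGUGs` above partition one list of default objects by type with `isinstance` filters. A self-contained sketch of that partitioning, using stand-in classes (it assumes, as the hunk implies, that the nested generator class is not a subclass of the single one, so the two filters do not overlap):

```python
# Stand-ins for ClientNetworkingDomain's classes; illustrative only.
class GalleryURLGenerator(object):
    pass

class NestedGalleryURLGenerator(object):
    # Deliberately not a subclass of GalleryURLGenerator, so isinstance
    # partitions the list cleanly into disjoint groups.
    pass

def split_gugs(gugs):
    ngugs = [gug for gug in gugs if isinstance(gug, NestedGalleryURLGenerator)]
    singles = [gug for gug in gugs if isinstance(gug, GalleryURLGenerator)]
    return (singles, ngugs)
```

If the nested class did subclass the single one, the 'single' filter would also capture every ngug, which is why the class relationship matters here.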
@@ -818,7 +598,13 @@ def SetDefaultDomainManagerData( domain_manager ):
     
     #
     
-    domain_manager.SetGUGs( GetDefaultGUGs() )
+    gugs = GetDefaultGUGs()
+    
+    domain_manager.SetGUGs( gugs )
+    
+    gug_keys_to_display = [ gug.GetGUGKey() for gug in gugs if 'ugoira' not in gug.GetName() ]
+    
+    domain_manager.SetGUGKeysToDisplay( gug_keys_to_display )
     
     #
     
@@ -1703,6 +1703,11 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
 
         ClientGUIMenus.AppendMenuItem( self, submenu, 'manage subscriptions', 'Change the queries you want the client to regularly import from.', self._ManageSubscriptions )
         
+        if self._controller.new_options.GetBoolean( 'advanced_mode' ):
+            
+            ClientGUIMenus.AppendMenuItem( self, submenu, 'nudge subscriptions awake', 'Tell the subs daemon to wake up, just in case any subs are due.', self._controller.pub, 'notify_restart_subs_sync_daemon' )
+            
+        
         ClientGUIMenus.AppendSeparator( submenu )
         
         # this will be the easy-mode 'export ability to download from blahbooru' that'll bundle it all into a nice package with a neat png.
@@ -1713,6 +1718,7 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
         ClientGUIMenus.AppendSeparator( submenu )
         
         ClientGUIMenus.AppendMenuItem( self, submenu, 'manage default tag import options', 'Change the default tag import options for each of your linked url matches.', self._ManageDefaultTagImportOptions )
+        ClientGUIMenus.AppendMenuItem( self, submenu, 'manage downloader and url display', 'Configure how downloader objects present across the client.', self._ManageDownloaderDisplay )
         
         ClientGUIMenus.AppendMenu( menu, submenu, 'downloaders' )
         
@@ -1730,7 +1736,6 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
 
         ClientGUIMenus.AppendSeparator( submenu )
         
-        ClientGUIMenus.AppendMenuItem( self, submenu, 'LEGACY: manage boorus', 'Change the html parsing information for boorus to download from.', self._ManageBoorus )
         ClientGUIMenus.AppendMenuItem( self, submenu, 'SEMI-LEGACY: manage file lookup scripts', 'Manage how the client parses different types of web content.', self._ManageParsingScripts )
         
         ClientGUIMenus.AppendMenu( menu, submenu, 'downloader definitions' )
@@ -2240,11 +2245,6 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
 
 
 
-    def _ManageBoorus( self ):
-        
-        with ClientGUIDialogsManage.DialogManageBoorus( self ) as dlg: dlg.ShowModal()
-        
-    
     def _ManageDefaultTagImportOptions( self ):
         
         title = 'manage default tag import options'
@@ -2258,7 +2258,7 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
             url_matches = domain_manager.GetURLMatches()
             parsers = domain_manager.GetParsers()
             
-            ( url_match_keys_to_display, url_match_keys_to_parser_keys ) = domain_manager.GetURLMatchLinks()
+            url_match_keys_to_parser_keys = domain_manager.GetURLMatchKeysToParserKeys()
            
             panel = ClientGUIScrolledPanelsEdit.EditDefaultTagImportOptionsPanel( dlg, url_matches, parsers, url_match_keys_to_parser_keys, file_post_default_tag_import_options, watchable_default_tag_import_options, url_match_keys_to_tag_import_options )
             
@@ -2273,6 +2273,36 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
 
 
 
+    def _ManageDownloaderDisplay( self ):
+        
+        title = 'manage downloader display'
+        
+        with ClientGUITopLevelWindows.DialogEdit( self, title ) as dlg:
+            
+            domain_manager = self._controller.network_engine.domain_manager
+            
+            gugs = domain_manager.GetGUGs()
+            
+            gug_keys_to_display = domain_manager.GetGUGKeysToDisplay()
+            
+            url_matches = domain_manager.GetURLMatches()
+            
+            url_match_keys_to_display = domain_manager.GetURLMatchKeysToDisplay()
+            
+            panel = ClientGUIScrolledPanelsEdit.EditDownloaderDisplayPanel( dlg, self._controller.network_engine, gugs, gug_keys_to_display, url_matches, url_match_keys_to_display )
+            
+            dlg.SetPanel( panel )
+            
+            if dlg.ShowModal() == wx.ID_OK:
+                
+                ( gug_keys_to_display, url_match_keys_to_display ) = panel.GetValue()
+                
+                domain_manager.SetGUGKeysToDisplay( gug_keys_to_display )
+                domain_manager.SetURLMatchKeysToDisplay( url_match_keys_to_display )
+                
+            
+        
+    
     def _ManageExportFolders( self ):
         
         def wx_do_it():
@@ -2727,17 +2757,17 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
             
             url_matches = domain_manager.GetURLMatches()
             parsers = domain_manager.GetParsers()
             
-            ( url_match_keys_to_display, url_match_keys_to_parser_keys ) = domain_manager.GetURLMatchLinks()
+            url_match_keys_to_parser_keys = domain_manager.GetURLMatchKeysToParserKeys()
             
-            panel = ClientGUIScrolledPanelsEdit.EditURLMatchLinksPanel( dlg, self._controller.network_engine, url_matches, parsers, url_match_keys_to_display, url_match_keys_to_parser_keys )
+            panel = ClientGUIScrolledPanelsEdit.EditURLMatchLinksPanel( dlg, self._controller.network_engine, url_matches, parsers, url_match_keys_to_parser_keys )
             
             dlg.SetPanel( panel )
             
             if dlg.ShowModal() == wx.ID_OK:
                 
-                ( url_match_keys_to_display, url_match_keys_to_parser_keys ) = panel.GetValue()
+                url_match_keys_to_parser_keys = panel.GetValue()
                 
-                domain_manager.SetURLMatchLinks( url_match_keys_to_display, url_match_keys_to_parser_keys )
+                domain_manager.SetURLMatchKeysToParserKeys( url_match_keys_to_parser_keys )

@@ -1869,84 +1869,6 @@ class DialogModifyAccounts( Dialog ):
     
-class DialogSelectBooru( Dialog ):
-    
-    def __init__( self, parent ):
-        
-        Dialog.__init__( self, parent, 'select booru' )
-        
-        self._hidden_cancel = wx.Button( self, id = wx.ID_CANCEL, size = ( 0, 0 ) )
-        
-        self._boorus = wx.ListBox( self )
-        self._boorus.Bind( wx.EVT_LISTBOX_DCLICK, self.EventDoubleClick )
-        
-        self._ok = wx.Button( self, id = wx.ID_OK, label = 'ok' )
-        self._ok.Bind( wx.EVT_BUTTON, self.EventOK )
-        self._ok.SetDefault()
-        
-        #
-        
-        boorus = HG.client_controller.Read( 'remote_boorus' )
-        
-        booru_names = boorus.keys()
-        
-        booru_names.sort()
-        
-        for name in booru_names:
-            
-            self._boorus.Append( name )
-            
-        
-        self._boorus.Select( 0 )
-        
-        self._boorus.SetFocus()
-        
-        #
-        
-        vbox = wx.BoxSizer( wx.VERTICAL )
-        
-        vbox.Add( self._boorus, CC.FLAGS_EXPAND_BOTH_WAYS )
-        vbox.Add( self._ok, CC.FLAGS_LONE_BUTTON )
-        
-        self.SetSizer( vbox )
-        
-        ( x, y ) = self.GetEffectiveMinSize()
-        
-        x = max( x, 320 )
-        y = max( y, 320 )
-        
-        self.SetInitialSize( ( x, y ) )
-        
-        if len( boorus ) == 1:
-            
-            wx.CallAfter( self.EndModal, wx.ID_OK )
-            
-        
-    
-    def EventDoubleClick( self, event ):
-        
-        self.EndModal( wx.ID_OK )
-        
-    
-    def EventOK( self, event ):
-        
-        selection = self._boorus.GetSelection()
-        
-        if selection != wx.NOT_FOUND:
-            
-            self.EndModal( wx.ID_OK )
-            
-        
-    
-    def GetGalleryIdentifier( self ):
-        
-        name = self._boorus.GetString( self._boorus.GetSelection() )
-        
-        gallery_identifier = ClientDownloading.GalleryIdentifier( HC.SITE_TYPE_BOORU, additional_info = name )
-        
-        return gallery_identifier
-    
-    
 class DialogSelectFromURLTree( Dialog ):
     
     def __init__( self, parent, url_tree ):
@@ -2124,7 +2046,7 @@ class DialogSelectImageboard( Dialog ):
     
 class DialogSelectFromList( Dialog ):
     
-    def __init__( self, parent, title, choice_tuples, value_to_select = None ):
+    def __init__( self, parent, title, choice_tuples, value_to_select = None, sort_tuples = True ):
         
         Dialog.__init__( self, parent, title )
         
@@ -2140,7 +2062,10 @@
         
         selected_a_value = False
         
-        choice_tuples.sort()
+        if sort_tuples:
+            
+            choice_tuples.sort()
+            
         
         for ( i, ( label, value ) ) in enumerate( choice_tuples ):
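The new `sort_tuples` flag lets callers pass pre-grouped choices (for example, lists that mix real entries with `'--other galleries'` page headers) without the dialog reordering them. A sketch of the idea:

```python
def build_choices(choice_tuples, sort_tuples=True):
    # mirrors the DialogSelectFromList change: sort only when asked, so
    # grouped lists keep their caller-defined order
    choice_tuples = list(choice_tuples)

    if sort_tuples:
        choice_tuples.sort()

    return choice_tuples


grouped = [('gelbooru tag search', 1), ('--other galleries', -1)]

print(build_choices(grouped, sort_tuples=False))  # order preserved
print(build_choices(grouped))  # '--' sorts before letters, so the header jumps first
```

This is why the gug selector below passes `sort_tuples = False` for its first page: it sorts each group itself, then appends the sentinel page entries last.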
@@ -64,521 +64,7 @@ def GenerateMultipartFormDataCTAndBodyFromDict( fields ):
     
     return m.get()
     
-class DialogManageBoorus( ClientGUIDialogs.Dialog ):
-    
-    def __init__( self, parent ):
-        
-        ClientGUIDialogs.Dialog.__init__( self, parent, 'manage boorus' )
-        
-        self._names_to_delete = []
-        
-        self._boorus = ClientGUICommon.ListBook( self )
-        
-        self._add = wx.Button( self, label = 'add' )
-        self._add.Bind( wx.EVT_BUTTON, self.EventAdd )
-        self._add.SetForegroundColour( ( 0, 128, 0 ) )
-        
-        self._remove = wx.Button( self, label = 'remove' )
-        self._remove.Bind( wx.EVT_BUTTON, self.EventRemove )
-        self._remove.SetForegroundColour( ( 128, 0, 0 ) )
-        
-        menu_items = []
-        
-        menu_items.append( ( 'normal', 'all', 'restore all defaults', self._RestoreDefault ) )
-        menu_items.append( ( 'separator', 0, 0, 0 ) )
-        
-        default_booru_data = list( ClientDefaults.GetDefaultBoorus().items() )
-        
-        default_booru_data.sort()
-        
-        for ( name, booru ) in default_booru_data:
-            
-            menu_items.append( ( 'normal', name, 'add this booru from the defaults', HydrusData.Call( self._RestoreDefault, booru ) ) )
-            
-        
-        self._restore_default = ClientGUICommon.MenuButton( self, 'restore default', menu_items )
-        
-        self._export = wx.Button( self, label = 'export' )
-        self._export.Bind( wx.EVT_BUTTON, self.EventExport )
-        
-        self._ok = wx.Button( self, id = wx.ID_OK, label = 'ok' )
-        self._ok.Bind( wx.EVT_BUTTON, self.EventOK )
-        self._ok.SetForegroundColour( ( 0, 128, 0 ) )
-        
-        self._cancel = wx.Button( self, id = wx.ID_CANCEL, label = 'cancel' )
-        self._cancel.SetForegroundColour( ( 128, 0, 0 ) )
-        
-        #
-        
-        boorus = HG.client_controller.Read( 'remote_boorus' )
-        
-        for ( name, booru ) in boorus.items():
-            
-            self._boorus.AddPageArgs( name, name, self._Panel, ( self._boorus, booru ), {} )
-            
-        
-        #
-        
-        add_remove_hbox = wx.BoxSizer( wx.HORIZONTAL )
-        add_remove_hbox.Add( self._add, CC.FLAGS_VCENTER )
-        add_remove_hbox.Add( self._remove, CC.FLAGS_VCENTER )
-        add_remove_hbox.Add( self._restore_default, CC.FLAGS_VCENTER )
-        add_remove_hbox.Add( self._export, CC.FLAGS_VCENTER )
-        
-        ok_hbox = wx.BoxSizer( wx.HORIZONTAL )
-        ok_hbox.Add( self._ok, CC.FLAGS_VCENTER )
-        ok_hbox.Add( self._cancel, CC.FLAGS_VCENTER )
-        
-        vbox = wx.BoxSizer( wx.VERTICAL )
-        vbox.Add( self._boorus, CC.FLAGS_EXPAND_BOTH_WAYS )
-        vbox.Add( add_remove_hbox, CC.FLAGS_SMALL_INDENT )
-        vbox.Add( ok_hbox, CC.FLAGS_BUTTON_SIZER )
-        
-        self.SetSizer( vbox )
-        
-        self.SetDropTarget( ClientDragDrop.FileDropTarget( self, filenames_callable = self.Import ) )
-        
-        ( x, y ) = self.GetEffectiveMinSize()
-        
-        self.SetInitialSize( ( 980, y ) )
-        
-        wx.CallAfter( self._ok.SetFocus )
-        
-    
-    def _RestoreDefault( self, booru = None ):
-        
-        if booru is None:
-            
-            for booru in ClientDefaults.GetDefaultBoorus().values():
-                
-                self._RestoreDefault( booru )
-                
-            
-        else:
-            
-            name = booru.GetName()
-            
-            if self._boorus.KeyExists( name ):
-                
-                message = '\'' + name + '\' already exists--are you sure you want to overwrite it with the default entry?'
-                
-                with ClientGUIDialogs.DialogYesNo( self, message ) as dlg:
-                    
-                    if dlg.ShowModal() == wx.ID_YES:
-                        
-                        self._boorus.Select( name )
-                        
-                        page = self._boorus.GetPage( name )
-                        
-                        page.Update( booru )
-                        
-                    
-                
-            else:
-                
-                page = self._Panel( self._boorus, booru, is_new = True )
-                
-                self._boorus.AddPage( name, name, page, select = True )
-                
-            
-        
-    
-    def EventAdd( self, event ):
-        
-        with ClientGUIDialogs.DialogTextEntry( self, 'Enter new booru\'s name.' ) as dlg:
-            
-            if dlg.ShowModal() == wx.ID_OK:
-                
-                try:
-                    
-                    name = dlg.GetValue()
-                    
-                    if self._boorus.KeyExists( name ):
-                        
-                        raise HydrusExceptions.NameException( 'That name is already in use!' )
-                        
-                    
-                    if name == '':
-                        
-                        raise HydrusExceptions.NameException( 'Please enter a nickname for the booru.' )
-                        
-                    
-                    booru = ClientData.Booru( name, 'search_url', '+', 1, 'thumbnail', '', 'original image', {} )
-                    
-                    page = self._Panel( self._boorus, booru, is_new = True )
-                    
-                    self._boorus.AddPage( name, name, page, select = True )
-                    
-                except HydrusExceptions.NameException as e:
-                    
-                    wx.MessageBox( HydrusData.ToUnicode( e ) )
-                    
-                    self.EventAdd( event )
-                    
-                
-            
-        
-    
-    def EventExport( self, event ):
-        
-        booru_panel = self._boorus.GetCurrentPage()
-        
-        if booru_panel is not None:
-            
-            name = self._boorus.GetCurrentKey()
-            
-            booru = booru_panel.GetBooru()
-            
-            with wx.FileDialog( self, 'select where to export booru', defaultFile = 'booru.yaml', style = wx.FD_SAVE ) as dlg:
-                
-                if dlg.ShowModal() == wx.ID_OK:
-                    
-                    path = HydrusData.ToUnicode( dlg.GetPath() )
-                    
-                    with open( path, 'wb' ) as f: f.write( yaml.safe_dump( booru ) )
-                    
-                
-            
-        
-    
-    def EventOK( self, event ):
-        
-        try:
-            
-            for name in self._names_to_delete:
-                
-                HG.client_controller.Write( 'delete_remote_booru', name )
-                
-            
-            for page in self._boorus.GetActivePages():
-                
-                if page.HasChanges():
-                    
-                    booru = page.GetBooru()
-                    
-                    name = booru.GetName()
-                    
-                    HG.client_controller.Write( 'remote_booru', name, booru )
-                    
-                
-            
-        finally: self.EndModal( wx.ID_OK )
-        
-    
-    def EventRemove( self, event ):
-        
-        booru_panel = self._boorus.GetCurrentPage()
-        
-        if booru_panel is not None:
-            
-            name = self._boorus.GetCurrentKey()
-            
-            self._names_to_delete.append( name )
-            
-            self._boorus.DeleteCurrentPage()
-            
-        
-    
-    def Import( self, paths ):
-        
-        for path in paths:
-            
-            try:
-                
-                with open( path, 'rb' ) as f: file = f.read()
-                
-                thing = yaml.safe_load( file )
-                
-                if isinstance( thing, ClientData.Booru ):
-                    
-                    booru = thing
-                    
-                    name = booru.GetName()
-                    
-                    if not self._boorus.KeyExists( name ):
-                        
-                        new_booru = ClientData.Booru( name, 'search_url', '+', 1, 'thumbnail', '', 'original image', {} )
-                        
-                        page = self._Panel( self._boorus, new_booru, is_new = True )
-                        
-                        self._boorus.AddPage( name, name, page, select = True )
-                        
-                    
-                    self._boorus.Select( name )
-                    
-                    page = self._boorus.GetPage( name )
-                    
-                    page.Update( booru )
-                    
-                
-            except:
-                
-                wx.MessageBox( traceback.format_exc() )
-                
-            
-        
-    
-    class _Panel( wx.Panel ):
-        
-        def __init__( self, parent, booru, is_new = False ):
-            
-            wx.Panel.__init__( self, parent )
-            
-            self._booru = booru
-            self._is_new = is_new
-            
-            ( search_url, search_separator, advance_by_page_num, thumb_classname, image_id, image_data, tag_classnames_to_namespaces ) = booru.GetData()
-            
-            self._booru_panel = ClientGUICommon.StaticBox( self, 'booru' )
-            
-            #
-            
-            self._search_panel = ClientGUICommon.StaticBox( self._booru_panel, 'search' )
-            
-            self._search_url = wx.TextCtrl( self._search_panel )
-            self._search_url.Bind( wx.EVT_TEXT, self.EventHTML )
-            
-            self._search_separator = wx.Choice( self._search_panel, choices = [ '+', '&', '%20' ] )
-            self._search_separator.Bind( wx.EVT_CHOICE, self.EventHTML )
-            
-            self._advance_by_page_num = wx.CheckBox( self._search_panel )
-            
-            self._thumb_classname = wx.TextCtrl( self._search_panel )
-            self._thumb_classname.Bind( wx.EVT_TEXT, self.EventHTML )
-            
-            self._example_html_search = wx.StaticText( self._search_panel, style = wx.ST_NO_AUTORESIZE )
-            
-            #
-            
-            self._image_panel = ClientGUICommon.StaticBox( self._booru_panel, 'image' )
-            
-            self._image_info = wx.TextCtrl( self._image_panel )
-            self._image_info.Bind( wx.EVT_TEXT, self.EventHTML )
-            
-            self._image_id = wx.RadioButton( self._image_panel, style = wx.RB_GROUP )
-            self._image_id.Bind( wx.EVT_RADIOBUTTON, self.EventHTML )
-            
-            self._image_data = wx.RadioButton( self._image_panel )
-            self._image_data.Bind( wx.EVT_RADIOBUTTON, self.EventHTML )
-            
-            self._example_html_image = wx.StaticText( self._image_panel, style = wx.ST_NO_AUTORESIZE )
-            
-            #
-            
-            self._tag_panel = ClientGUICommon.StaticBox( self._booru_panel, 'tags' )
-            
-            self._tag_classnames_to_namespaces = wx.ListBox( self._tag_panel )
-            self._tag_classnames_to_namespaces.Bind( wx.EVT_LEFT_DCLICK, self.EventRemove )
-            
-            self._tag_classname = wx.TextCtrl( self._tag_panel )
-            self._namespace = wx.TextCtrl( self._tag_panel )
-            
-            self._add = wx.Button( self._tag_panel, label = 'add' )
-            self._add.Bind( wx.EVT_BUTTON, self.EventAdd )
-            
-            self._example_html_tags = wx.StaticText( self._tag_panel, style = wx.ST_NO_AUTORESIZE )
-            
-            #
-            
-            self._search_url.SetValue( search_url )
-            
-            self._search_separator.Select( self._search_separator.FindString( search_separator ) )
-            
-            self._advance_by_page_num.SetValue( advance_by_page_num )
-            
-            self._thumb_classname.SetValue( thumb_classname )
-            
-            #
-            
-            if image_id is None:
-                
-                self._image_info.SetValue( image_data )
-                self._image_data.SetValue( True )
-                
-            else:
-                
-                self._image_info.SetValue( image_id )
-                self._image_id.SetValue( True )
-                
-            
-            #
-            
-            for ( tag_classname, namespace ) in tag_classnames_to_namespaces.items(): self._tag_classnames_to_namespaces.Append( tag_classname + ' : ' + namespace, ( tag_classname, namespace ) )
-            
-            #
-            
-            rows = []
-            
-            rows.append( ( 'search url: ', self._search_url ) )
-            rows.append( ( 'search tag separator: ', self._search_separator ) )
-            rows.append( ( 'advance by page num: ', self._advance_by_page_num ) )
-            rows.append( ( 'thumbnail classname: ', self._thumb_classname ) )
-            
-            gridbox = ClientGUICommon.WrapInGrid( self._search_panel, rows )
-            
-            self._search_panel.Add( gridbox, CC.FLAGS_EXPAND_SIZER_PERPENDICULAR )
-            self._search_panel.Add( self._example_html_search, CC.FLAGS_EXPAND_PERPENDICULAR )
-            
-            #
-            
-            rows = []
-            
-            rows.append( ( 'text: ', self._image_info ) )
-            rows.append( ( 'id of <img>: ', self._image_id ) )
-            rows.append( ( 'text of <a>: ', self._image_data ) )
-            
-            gridbox = ClientGUICommon.WrapInGrid( self._image_panel, rows )
-            
-            self._image_panel.Add( gridbox, CC.FLAGS_EXPAND_SIZER_PERPENDICULAR )
-            self._image_panel.Add( self._example_html_image, CC.FLAGS_EXPAND_PERPENDICULAR )
-            
-            #
-            
-            hbox = wx.BoxSizer( wx.HORIZONTAL )
-            
-            hbox.Add( self._tag_classname, CC.FLAGS_VCENTER )
-            hbox.Add( self._namespace, CC.FLAGS_VCENTER )
-            hbox.Add( self._add, CC.FLAGS_VCENTER )
-            
-            self._tag_panel.Add( self._tag_classnames_to_namespaces, CC.FLAGS_EXPAND_BOTH_WAYS )
-            self._tag_panel.Add( hbox, CC.FLAGS_EXPAND_SIZER_PERPENDICULAR )
-            self._tag_panel.Add( self._example_html_tags, CC.FLAGS_EXPAND_PERPENDICULAR )
-            
-            #
-            
-            self._booru_panel.Add( self._search_panel, CC.FLAGS_EXPAND_PERPENDICULAR )
-            self._booru_panel.Add( self._image_panel, CC.FLAGS_EXPAND_PERPENDICULAR )
-            self._booru_panel.Add( self._tag_panel, CC.FLAGS_EXPAND_BOTH_WAYS )
-            
-            vbox = wx.BoxSizer( wx.VERTICAL )
-            
-            vbox.Add( self._booru_panel, CC.FLAGS_EXPAND_BOTH_WAYS )
-            
-            self.SetSizer( vbox )
-            
-        
-        def _GetInfo( self ):
-            
-            booru_name = self._booru.GetName()
-            
-            search_url = self._search_url.GetValue()
-            
-            search_separator = self._search_separator.GetStringSelection()
-            
-            advance_by_page_num = self._advance_by_page_num.GetValue()
-            
-            thumb_classname = self._thumb_classname.GetValue()
-            
-            if self._image_id.GetValue():
-                
-                image_id = self._image_info.GetValue()
-                image_data = None
-                
-            else:
-                
-                image_id = None
-                image_data = self._image_info.GetValue()
-                
-            
-            tag_classnames_to_namespaces = { tag_classname : namespace for ( tag_classname, namespace ) in [ self._tag_classnames_to_namespaces.GetClientData( i ) for i in range( self._tag_classnames_to_namespaces.GetCount() ) ] }
-            
-            return ( booru_name, search_url, search_separator, advance_by_page_num, thumb_classname, image_id, image_data, tag_classnames_to_namespaces )
-            
-        
-        def EventAdd( self, event ):
-            
-            tag_classname = self._tag_classname.GetValue()
-            namespace = self._namespace.GetValue()
-            
-            if tag_classname != '':
-                
-                self._tag_classnames_to_namespaces.Append( tag_classname + ' : ' + namespace, ( tag_classname, namespace ) )
-                
-                self._tag_classname.SetValue( '' )
-                self._namespace.SetValue( '' )
-                
-                self.EventHTML( event )
-                
-            
-        
-        def EventHTML( self, event ):
-            
-            pass
-            
-        
-        def EventRemove( self, event ):
-            
-            selection = self._tag_classnames_to_namespaces.GetSelection()
-            
-            if selection != wx.NOT_FOUND:
-                
-                self._tag_classnames_to_namespaces.Delete( selection )
-                
-                self.EventHTML( event )
-                
-            
-        
-        def GetBooru( self ):
-            
-            ( booru_name, search_url, search_separator, advance_by_page_num, thumb_classname, image_id, image_data, tag_classnames_to_namespaces ) = self._GetInfo()
-            
-            return ClientData.Booru( booru_name, search_url, search_separator, advance_by_page_num, thumb_classname, image_id, image_data, tag_classnames_to_namespaces )
-            
-        
-        def HasChanges( self ):
-            
-            if self._is_new: return True
-            
-            ( booru_name, my_search_url, my_search_separator, my_advance_by_page_num, my_thumb_classname, my_image_id, my_image_data, my_tag_classnames_to_namespaces ) = self._GetInfo()
-            
-            ( search_url, search_separator, advance_by_page_num, thumb_classname, image_id, image_data, tag_classnames_to_namespaces ) = self._booru.GetData()
-            
-            if search_url != my_search_url: return True
-            
-            if search_separator != my_search_separator: return True
-            
-            if advance_by_page_num != my_advance_by_page_num: return True
-            
-            if thumb_classname != my_thumb_classname: return True
-            
-            if image_id != my_image_id: return True
-            
-            if image_data != my_image_data: return True
-            
-            if tag_classnames_to_namespaces != my_tag_classnames_to_namespaces: return True
-            
-            return False
-            
-        
-        def Update( self, booru ):
-            
-            ( search_url, search_separator, advance_by_page_num, thumb_classname, image_id, image_data, tag_classnames_to_namespaces ) = booru.GetData()
-            
-            self._search_url.SetValue( search_url )
-            
-            self._search_separator.Select( self._search_separator.FindString( search_separator ) )
-            
-            self._advance_by_page_num.SetValue( advance_by_page_num )
-            
-            self._thumb_classname.SetValue( thumb_classname )
-            
-            if image_id is None:
-                
-                self._image_info.SetValue( image_data )
-                self._image_data.SetValue( True )
-                
-            else:
-                
-                self._image_info.SetValue( image_id )
-                self._image_id.SetValue( True )
-                
-            
-            self._tag_classnames_to_namespaces.Clear()
-            
-            for ( tag_classname, namespace ) in tag_classnames_to_namespaces.items(): self._tag_classnames_to_namespaces.Append( tag_classname + ' : ' + namespace, ( tag_classname, namespace ) )
-            
-        
-    
-'''
-class DialogManageContacts( ClientGUIDialogs.Dialog ):
+'''class DialogManageContacts( ClientGUIDialogs.Dialog ):
     
     def __init__( self, parent ):
@@ -2479,7 +1965,9 @@ class DialogManageImportFoldersEdit( ClientGUIDialogs.Dialog ):
         
         self._tag_import_options = ClientGUIImport.TagImportOptionsButton( self._tag_box, tag_import_options, show_downloader_options = False )
         
-        filename_tagging_options_panel = ClientGUIListCtrl.BetterListCtrlPanel( self._tag_box )
+        self._filename_tagging_options_box = ClientGUICommon.StaticBox( self._tag_box, 'filename tagging' )
+        
+        filename_tagging_options_panel = ClientGUIListCtrl.BetterListCtrlPanel( self._filename_tagging_options_box )
         
         columns = [ ( 'filename tagging options services', -1 ) ]
@@ -2601,8 +2089,10 @@
         
         #
         
+        self._filename_tagging_options_box.Add( filename_tagging_options_panel, CC.FLAGS_EXPAND_BOTH_WAYS )
+        
         self._tag_box.Add( self._tag_import_options, CC.FLAGS_EXPAND_PERPENDICULAR )
-        self._tag_box.Add( filename_tagging_options_panel, CC.FLAGS_EXPAND_BOTH_WAYS )
+        self._tag_box.Add( self._filename_tagging_options_box, CC.FLAGS_EXPAND_BOTH_WAYS )
         
         #
@@ -1371,13 +1371,20 @@ class GalleryImportPanel( ClientGUICommon.StaticBox ):
     
     
-class GallerySelector( ClientGUICommon.BetterButton ):
+class GUGKeyAndNameSelector( ClientGUICommon.BetterButton ):
     
-    def __init__( self, parent, gallery_identifier, update_callable = None ):
+    def __init__( self, parent, gug_key_and_name, update_callable = None ):
         
         ClientGUICommon.BetterButton.__init__( self, parent, 'gallery selector', self._Edit )
         
-        self._gallery_identifier = gallery_identifier
+        gug = HG.client_controller.network_engine.domain_manager.GetGUG( gug_key_and_name )
+        
+        if gug is not None:
+            
+            gug_key_and_name = gug.GetGUGKeyAndName()
+            
+        
+        self._gug_key_and_name = gug_key_and_name
         self._update_callable = update_callable
         
         self._SetLabel()
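The selector above stores a `( gug_key, name )` pair and resolves it through the domain manager, which the changelog describes as a 'key first, name later' id method: the stable internal key wins, the human-readable name is the fallback, and a miss fails silently. A hedged sketch of that lookup with hypothetical dictionaries standing in for the domain manager:

```python
def get_gug(gugs_by_key, gugs_by_name, gug_key_and_name):
    # 'key first, name later': prefer the stable internal key, fall back to
    # the human-readable name so renames and same-name overwrites survive
    (gug_key, name) = gug_key_and_name

    if gug_key in gugs_by_key:
        return gugs_by_key[gug_key]

    return gugs_by_name.get(name)  # silently None when nothing matches


gugs_by_key = {b'\x01': 'gelbooru tag search (v1)'}
gugs_by_name = {'gelbooru tag search': 'gelbooru tag search (v2)'}

print(get_gug(gugs_by_key, gugs_by_name, (b'\x01', 'gelbooru tag search')))
print(get_gug(gugs_by_key, gugs_by_name, (b'\x02', 'gelbooru tag search')))
```

The `__init__` above then re-reads `gug.GetGUGKeyAndName()` on a successful hit, so a name-based match upgrades the stored pair to the current key.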
@@ -1385,73 +1392,128 @@
     
     def _Edit( self ):
         
-        gallery_identifiers = []
+        domain_manager = HG.client_controller.network_engine.domain_manager
         
-        site_types = [ HC.SITE_TYPE_DEVIANT_ART, HC.SITE_TYPE_TUMBLR, HC.SITE_TYPE_HENTAI_FOUNDRY_ARTIST, HC.SITE_TYPE_HENTAI_FOUNDRY_TAGS ]
+        # maybe relegate to hidden page and something like "(does not work)" if no gallery url class match
         
-        result = HG.client_controller.Read( 'serialisable_simple', 'pixiv_account' )
+        my_gug = domain_manager.GetGUG( self._gug_key_and_name )
         
-        if result is not None:
-            
-            site_types.append( HC.SITE_TYPE_PIXIV_ARTIST_ID )
-            
-        
+        gugs = domain_manager.GetGUGs()
+        gug_keys_to_display = domain_manager.GetGUGKeysToDisplay()
+        
+        functional_gugs = []
+        non_functional_gugs = []
+        
+        for gug in gugs:
+            
+            if gug.IsFunctional():
+                
+                functional_gugs.append( gug )
+                
+            else:
+                
+                non_functional_gugs.append( gug )
+                
+            
+        
-        for site_type in site_types:
-            
-            gallery_identifier = ClientDownloading.GalleryIdentifier( site_type )
-            
-            gallery_identifiers.append( gallery_identifier )
-            
-        
-        boorus = HG.client_controller.Read( 'remote_boorus' )
-        
-        for booru_name in boorus.keys():
-            
-            gallery_identifier = ClientDownloading.GalleryIdentifier( HC.SITE_TYPE_BOORU, additional_info = booru_name )
-            
-            gallery_identifiers.append( gallery_identifier )
-            
-        
-        choice_tuples = [ ( gallery_identifier.ToString(), gallery_identifier ) for gallery_identifier in gallery_identifiers ]
+        choice_tuples = [ ( gug.GetName(), gug ) for gug in functional_gugs if gug.GetGUGKey() in gug_keys_to_display ]
         
         choice_tuples.sort()
         
-        with ClientGUIDialogs.DialogSelectFromList( self, 'select gallery', choice_tuples, value_to_select = self._gallery_identifier ) as dlg:
+        second_choice_tuples = [ ( gug.GetName(), gug ) for gug in functional_gugs if gug.GetGUGKey() not in gug_keys_to_display ]
+        
+        second_choice_tuples.sort()
+        
+        if len( second_choice_tuples ) > 0:
+            
+            choice_tuples.append( ( '--other galleries', -1 ) )
+            
+        
+        if len( non_functional_gugs ) > 0:
+            
+            non_functional_choice_tuples = [ ( gug.GetName(), gug ) for gug in non_functional_gugs ]
+            
+            non_functional_choice_tuples.sort()
+            
+            choice_tuples.append( ( '--non-functional galleries', -2 ) )
+            
+        
+        with ClientGUIDialogs.DialogSelectFromList( self, 'select gallery', choice_tuples, value_to_select = my_gug, sort_tuples = False ) as dlg:
             if dlg.ShowModal() == wx.ID_OK:
                 
-                gallery_identifier = dlg.GetChoice()
+                gug = dlg.GetChoice()
                 
-                self._SetValue( gallery_identifier )
+                if gug == -1:
+                    
+                    with ClientGUIDialogs.DialogSelectFromList( self, 'select gallery', second_choice_tuples, value_to_select = my_gug ) as dlg:
+                        
+                        if dlg.ShowModal() == wx.ID_OK:
+                            
+                            gug = dlg.GetChoice()
+                            
+                        else:
+                            
+                            return
+                            
+                        
+                    
+                elif gug == -2:
+                    
+                    with ClientGUIDialogs.DialogSelectFromList( self, 'select gallery', non_functional_choice_tuples, value_to_select = my_gug ) as dlg:
+                        
+                        if dlg.ShowModal() == wx.ID_OK:
+                            
+                            gug = dlg.GetChoice()
+                            
+                        else:
+                            
+                            return
+                            
+                        
+                    
+                
+                gug_key_and_name = gug.GetGUGKeyAndName()
+                
+                self._SetValue( gug_key_and_name )
     def _SetLabel( self ):
         
-        self.SetLabelText( self._gallery_identifier.ToString() )
+        label = self._gug_key_and_name[1]
+        
+        gug = HG.client_controller.network_engine.domain_manager.GetGUG( self._gug_key_and_name )
+        
+        if gug is None:
+            
+            label = 'not found: ' + label
+            
+        
+        self.SetLabelText( label )
         
     
-    def _SetValue( self, gallery_identifier ):
+    def _SetValue( self, gug_key_and_name ):
         
-        self._gallery_identifier = gallery_identifier
+        self._gug_key_and_name = gug_key_and_name
         
         self._SetLabel()
         
         if self._update_callable is not None:
             
-            self._update_callable( self._gallery_identifier )
+            self._update_callable( gug_key_and_name )
             
         
     
     def GetValue( self ):
         
-        return self._gallery_identifier
+        return self._gug_key_and_name
         
     
-    def SetValue( self, gallery_identifier ):
+    def SetValue( self, gug_key_and_name ):
         
-        self._SetValue( gallery_identifier )
+        self._SetValue( gug_key_and_name )
         
     
 
 class TagImportOptionsButton( ClientGUICommon.BetterButton ):
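The `_Edit` rewrite above pages its choices: sentinel entries (`-1`, `-2`) in the first list open a second list for 'other' or 'non-functional' galleries. A minimal sketch of that flow, with a scripted stand-in for the dialog rather than the real wx code:

```python
def choose(pages, ask):
    # the first page mixes real values with negative-int sentinels that jump
    # to another page, mirroring the gug selector's '--other galleries' entry
    choice = ask(pages[0])

    while isinstance(choice, int) and choice < 0:
        choice = ask(pages[-choice])  # sentinel -1 -> page 1, -2 -> page 2

    return choice


pages = [
    [('gelbooru tag search', 'gelbooru'), ('--other galleries', -1)],
    [('hentai foundry artist lookup', 'hf artist')],
]

# scripted 'user' that picks the sentinel first, then a real entry
answers = iter([-1, 'hf artist'])
picked = choose(pages, lambda choices: next(answers))
print(picked)
```

Keeping the sentinel handling in the caller lets `DialogSelectFromList` stay a dumb picker; the real code simply returns early if the second dialog is cancelled.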
@@ -103,7 +103,9 @@ def CreateManagementControllerImportGallery():
     
     management_controller = CreateManagementController( page_name, MANAGEMENT_TYPE_IMPORT_MULTIPLE_GALLERY )
     
-    multiple_gallery_import = ClientImportGallery.MultipleGalleryImport()
+    gug_key_and_name = HG.client_controller.network_engine.domain_manager.GetDefaultGUGKeyAndName()
+    
+    multiple_gallery_import = ClientImportGallery.MultipleGalleryImport( gug_key_and_name = gug_key_and_name )
     
     management_controller.SetVariable( 'multiple_gallery_import', multiple_gallery_import )
@@ -1585,7 +1587,7 @@ class ManagementPanelImporterMultipleGallery( ManagementPanelImporter ):
         
         self._query_input = ClientGUIControls.TextAndPasteCtrl( self._gallery_downloader_panel, self._PendQueries )
         
-        self._gallery_selector = ClientGUIImport.GallerySelector( self._gallery_downloader_panel, self._multiple_gallery_import.GetGalleryIdentifier(), update_callable = self._SetGalleryIdentifier )
+        self._gug_key_and_name = ClientGUIImport.GUGKeyAndNameSelector( self._gallery_downloader_panel, self._multiple_gallery_import.GetGUGKeyAndName(), update_callable = self._SetGUGKeyAndName )
         
         self._file_limit = ClientGUICommon.NoneableSpinCtrl( self._gallery_downloader_panel, 'stop after this many files', min = 1, none_phrase = 'no limit' )
         self._file_limit.Bind( wx.EVT_SPINCTRL, self.EventFileLimit )
@@ -1595,10 +1597,6 @@
         
         tag_import_options = self._multiple_gallery_import.GetTagImportOptions()
         file_limit = self._multiple_gallery_import.GetFileLimit()
         
-        gallery_identifier = self._multiple_gallery_import.GetGalleryIdentifier()
-        
-        search_value = ClientDefaults.GetDefaultSearchValue( gallery_identifier )
-        
         self._file_import_options = ClientGUIImport.FileImportOptionsButton( self._gallery_downloader_panel, file_import_options, self._multiple_gallery_import.SetFileImportOptions )
         self._tag_import_options = ClientGUIImport.TagImportOptionsButton( self._gallery_downloader_panel, tag_import_options, update_callable = self._multiple_gallery_import.SetTagImportOptions, allow_default_selection = True )
@@ -1608,7 +1606,7 @@
         
         self._gallery_downloader_panel.Add( self._gallery_importers_status_st_bottom, CC.FLAGS_EXPAND_PERPENDICULAR )
         self._gallery_downloader_panel.Add( self._gallery_importers_listctrl_panel, CC.FLAGS_EXPAND_BOTH_WAYS )
         self._gallery_downloader_panel.Add( self._query_input, CC.FLAGS_EXPAND_PERPENDICULAR )
-        self._gallery_downloader_panel.Add( self._gallery_selector, CC.FLAGS_EXPAND_PERPENDICULAR )
+        self._gallery_downloader_panel.Add( self._gug_key_and_name, CC.FLAGS_EXPAND_PERPENDICULAR )
         self._gallery_downloader_panel.Add( self._file_limit, CC.FLAGS_EXPAND_PERPENDICULAR )
         self._gallery_downloader_panel.Add( self._file_import_options, CC.FLAGS_EXPAND_PERPENDICULAR )
         self._gallery_downloader_panel.Add( self._tag_import_options, CC.FLAGS_EXPAND_PERPENDICULAR )
@@ -1636,7 +1634,9 @@
         
         #
         
-        self._query_input.SetValue( search_value )
+        initial_search_text = self._multiple_gallery_import.GetInitialSearchText()
+        
+        self._query_input.SetValue( initial_search_text )
         
         self._file_limit.SetValue( file_limit )
@@ -1707,9 +1707,9 @@
             
             pretty_query_text = '* ' + pretty_query_text
             
-        source = gallery_import.GetGalleryIdentifier()
+        source = gallery_import.GetSourceName()
         
-        pretty_source = source.ToString()
+        pretty_source = source
         
         files_paused = gallery_import.FilesPaused()
@ -1921,22 +1921,22 @@ class ManagementPanelImporterMultipleGallery( ManagementPanelImporter ):
self._UpdateImportStatusNow()


def _SetGalleryIdentifier( self, gallery_identifier ):
def _SetGUGKeyAndName( self, gug_key_and_name ):

current_gallery_identifier = self._multiple_gallery_import.GetGalleryIdentifier()

current_search_value = ClientDefaults.GetDefaultSearchValue( current_gallery_identifier )
current_initial_search_text = self._multiple_gallery_import.GetInitialSearchText()

current_input_value = self._query_input.GetValue()

if current_input_value in ( current_search_value, '' ):

search_value = ClientDefaults.GetDefaultSearchValue( gallery_identifier )

self._query_input.SetValue( search_value )

should_initialise_new_text = current_input_value in ( current_initial_search_text, '' )

self._multiple_gallery_import.SetGalleryIdentifier( gallery_identifier )
self._multiple_gallery_import.SetGUGKeyAndName( gug_key_and_name )

if should_initialise_new_text:

new_initial_search_text = self._multiple_gallery_import.GetInitialSearchText()

self._query_input.SetValue( new_initial_search_text )



def _SetOptionsToGalleryImports( self ):
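The hunk above swaps the gallery page's tracking from gallery identifiers to ( gug_key, gug_name ) pairs. The changelog's 'key first, name later' resolution can be sketched as follows; `FakeGUG` and `resolve_gug` are illustrative stand-ins, not hydrus's real API:

```python
class FakeGUG:
    
    # minimal stand-in for hydrus's GUG object, just for this sketch
    def __init__( self, key, name ):
        
        self._key = key
        self._name = name
        
    
    def GetGUGKey( self ):
        
        return self._key
        
    
    def GetName( self ):
        
        return self._name
        
    

def resolve_gug( gug_key_and_name, available_gugs ):
    
    # prefer the internal identifier key; fall back to the simple name
    ( gug_key, gug_name ) = gug_key_and_name
    
    keys_to_gugs = { gug.GetGUGKey() : gug for gug in available_gugs }
    
    if gug_key in keys_to_gugs:
        
        return keys_to_gugs[ gug_key ]
        
    
    names_to_gugs = { gug.GetName() : gug for gug in available_gugs }
    
    return names_to_gugs.get( gug_name ) # None here is the 'silently fail' case
    
```

This is why downloaders and subs survive gug renames (the key still matches) and same-name overwrites (the name still matches).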
|
@ -22,6 +22,7 @@ import HydrusExceptions
import HydrusGlobals as HG
import HydrusSerialisable
import HydrusTags
import HydrusText
import json
import os
import sys
@ -4747,7 +4748,7 @@ class TestPanel( wx.Panel ):
parse_phrase = 'uncertain data type'

# can't just throw this at bs4 to see if it 'works', as it'll just wrap any unparsable string in some bare <html><body><p> tags
if '<html' in example_data:
if HydrusText.LooksLikeHTML( example_data ):

parse_phrase = 'looks like HTML'
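The hunk above replaces the naive `'<html' in example_data` substring test with `HydrusText.LooksLikeHTML`. The actual hydrus implementation is not shown here; one plausible shape for such a heuristic, purely as an assumption, is a case-insensitive search for a doctype or a common structural tag:

```python
import re

def looks_like_html( example_data ):
    
    # hypothetical LooksLikeHTML-style heuristic: look for a doctype or a
    # structural tag anywhere in the text, case-insensitively, so that
    # '<HTML>' and '<!DOCTYPE html>' both register but bare JSON does not
    if isinstance( example_data, bytes ):
        
        example_data = example_data.decode( 'utf-8', 'replace' )
        
    
    return re.search( r'<\s*(!DOCTYPE\s+html|html|body|head)\b', example_data, re.IGNORECASE ) is not None
    
```

Whatever the real check is, it has to do its own inspection because, as the comment in the diff notes, bs4 happily wraps any unparsable string in bare `<html><body><p>` tags.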
@ -9,6 +9,7 @@ import ClientGUIDialogs
import ClientGUIImport
import ClientGUIListBoxes
import ClientGUIListCtrl
import ClientGUIMenus
import ClientGUIParsing
import ClientGUIScrolledPanels
import ClientGUIFileSeedCache
@ -582,6 +583,220 @@ class EditDomainManagerInfoPanel( ClientGUIScrolledPanels.EditPanel ):
|
|||
return ( url_matches, network_contexts_to_custom_header_dicts )
|
||||
|
||||
|
||||
class EditDownloaderDisplayPanel( ClientGUIScrolledPanels.EditPanel ):
|
||||
|
||||
def __init__( self, parent, network_engine, gugs, gug_keys_to_display, url_matches, url_match_keys_to_display ):
|
||||
|
||||
ClientGUIScrolledPanels.EditPanel.__init__( self, parent )
|
||||
|
||||
self._gugs = gugs
|
||||
self._gug_keys_to_gugs = { gug.GetGUGKey() : gug for gug in self._gugs }
|
||||
|
||||
self._url_matches = url_matches
|
||||
self._url_match_keys_to_url_matches = { url_match.GetMatchKey() : url_match for url_match in self._url_matches }
|
||||
|
||||
self._network_engine = network_engine
|
||||
|
||||
#
|
||||
|
||||
self._notebook = wx.Notebook( self )
|
||||
|
||||
#
|
||||
|
||||
self._gug_display_list_ctrl_panel = ClientGUIListCtrl.BetterListCtrlPanel( self._notebook )
|
||||
|
||||
columns = [ ( 'downloader', -1 ), ( 'show in main selector list?', 29 ) ]
|
||||
|
||||
self._gug_display_list_ctrl = ClientGUIListCtrl.BetterListCtrl( self._gug_display_list_ctrl_panel, 'gug_keys_to_display', 15, 36, columns, self._ConvertGUGDisplayDataToListCtrlTuples, activation_callback = self._EditGUGDisplay )
|
||||
|
||||
self._gug_display_list_ctrl_panel.SetListCtrl( self._gug_display_list_ctrl )
|
||||
|
||||
self._gug_display_list_ctrl_panel.AddButton( 'edit', self._EditGUGDisplay, enabled_only_on_selection = True )
|
||||
|
||||
#
|
||||
|
||||
self._url_display_list_ctrl_panel = ClientGUIListCtrl.BetterListCtrlPanel( self._notebook )
|
||||
|
||||
columns = [ ( 'url class', -1 ), ( 'url type', 20 ), ( 'display on media viewer?', 36 ) ]
|
||||
|
||||
self._url_display_list_ctrl = ClientGUIListCtrl.BetterListCtrl( self._url_display_list_ctrl_panel, 'url_match_keys_to_display', 15, 36, columns, self._ConvertURLDisplayDataToListCtrlTuples, activation_callback = self._EditURLDisplay )
|
||||
|
||||
self._url_display_list_ctrl_panel.SetListCtrl( self._url_display_list_ctrl )
|
||||
|
||||
self._url_display_list_ctrl_panel.AddButton( 'edit', self._EditURLDisplay, enabled_only_on_selection = True )
|
||||
|
||||
#
|
||||
|
||||
listctrl_data = []
|
||||
|
||||
for ( gug_key, gug ) in self._gug_keys_to_gugs.items():
|
||||
|
||||
display = gug_key in gug_keys_to_display
|
||||
|
||||
listctrl_data.append( ( gug_key, display ) )
|
||||
|
||||
|
||||
self._gug_display_list_ctrl.AddDatas( listctrl_data )
|
||||
|
||||
self._gug_display_list_ctrl.Sort( 1 )
|
||||
|
||||
#
|
||||
|
||||
listctrl_data = []
|
||||
|
||||
for ( url_match_key, url_match ) in self._url_match_keys_to_url_matches.items():
|
||||
|
||||
display = url_match_key in url_match_keys_to_display
|
||||
|
||||
listctrl_data.append( ( url_match_key, display ) )
|
||||
|
||||
|
||||
self._url_display_list_ctrl.AddDatas( listctrl_data )
|
||||
|
||||
self._url_display_list_ctrl.Sort( 1 )
|
||||
|
||||
#
|
||||
|
||||
self._notebook.AddPage( self._gug_display_list_ctrl_panel, 'downloaders selector', select = True )
|
||||
self._notebook.AddPage( self._url_display_list_ctrl_panel, 'media viewer urls', select = False )
|
||||
|
||||
#
|
||||
|
||||
vbox = wx.BoxSizer( wx.VERTICAL )
|
||||
|
||||
vbox.Add( self._notebook, CC.FLAGS_EXPAND_BOTH_WAYS )
|
||||
|
||||
self.SetSizer( vbox )
|
||||
|
||||
|
||||
def _ConvertGUGDisplayDataToListCtrlTuples( self, data ):
|
||||
|
||||
( gug_key, display ) = data
|
||||
|
||||
gug = self._gug_keys_to_gugs[ gug_key ]
|
||||
|
||||
name = gug.GetName()
|
||||
|
||||
pretty_name = name
|
||||
|
||||
if display:
|
||||
|
||||
pretty_display = 'yes'
|
||||
|
||||
else:
|
||||
|
||||
pretty_display = 'no'
|
||||
|
||||
|
||||
display_tuple = ( pretty_name, pretty_display )
|
||||
sort_tuple = ( name, display )
|
||||
|
||||
return ( display_tuple, sort_tuple )
|
||||
|
||||
|
||||
def _ConvertURLDisplayDataToListCtrlTuples( self, data ):
|
||||
|
||||
( url_match_key, display ) = data
|
||||
|
||||
url_match = self._url_match_keys_to_url_matches[ url_match_key ]
|
||||
|
||||
url_match_name = url_match.GetName()
|
||||
url_type = url_match.GetURLType()
|
||||
|
||||
pretty_name = url_match_name
|
||||
pretty_url_type = HC.url_type_string_lookup[ url_type ]
|
||||
|
||||
if display:
|
||||
|
||||
pretty_display = 'yes'
|
||||
|
||||
else:
|
||||
|
||||
pretty_display = 'no'
|
||||
|
||||
|
||||
display_tuple = ( pretty_name, pretty_url_type, pretty_display )
|
||||
sort_tuple = ( url_match_name, pretty_url_type, display )
|
||||
|
||||
return ( display_tuple, sort_tuple )
|
||||
|
||||
|
||||
def _EditGUGDisplay( self ):
|
||||
|
||||
for data in self._gug_display_list_ctrl.GetData( only_selected = True ):
|
||||
|
||||
( gug_key, display ) = data
|
||||
|
||||
name = self._gug_keys_to_gugs[ gug_key ].GetName()
|
||||
|
||||
message = 'Show ' + name + ' in the main selector list?'
|
||||
|
||||
with ClientGUIDialogs.DialogYesNo( self, message, title = 'Show in the first list?' ) as dlg:
|
||||
|
||||
result = dlg.ShowModal()
|
||||
|
||||
if result in ( wx.ID_YES, wx.ID_NO ):
|
||||
|
||||
display = result == wx.ID_YES
|
||||
|
||||
self._gug_display_list_ctrl.DeleteDatas( ( data, ) )
|
||||
|
||||
new_data = ( gug_key, display )
|
||||
|
||||
self._gug_display_list_ctrl.AddDatas( ( new_data, ) )
|
||||
|
||||
else:
|
||||
|
||||
break
|
||||
|
||||
|
||||
|
||||
|
||||
self._gug_display_list_ctrl.Sort()
|
||||
|
||||
|
||||
def _EditURLDisplay( self ):
|
||||
|
||||
for data in self._url_display_list_ctrl.GetData( only_selected = True ):
|
||||
|
||||
( url_match_key, display ) = data
|
||||
|
||||
url_match_name = self._url_match_keys_to_url_matches[ url_match_key ].GetName()
|
||||
|
||||
message = 'Show ' + url_match_name + ' in the media viewer?'
|
||||
|
||||
with ClientGUIDialogs.DialogYesNo( self, message, title = 'Show in the media viewer?' ) as dlg:
|
||||
|
||||
result = dlg.ShowModal()
|
||||
|
||||
if result in ( wx.ID_YES, wx.ID_NO ):
|
||||
|
||||
display = result == wx.ID_YES
|
||||
|
||||
self._url_display_list_ctrl.DeleteDatas( ( data, ) )
|
||||
|
||||
new_data = ( url_match_key, display )
|
||||
|
||||
self._url_display_list_ctrl.AddDatas( ( new_data, ) )
|
||||
|
||||
else:
|
||||
|
||||
break
|
||||
|
||||
|
||||
|
||||
|
||||
self._url_display_list_ctrl.Sort()
|
||||
|
||||
|
||||
def GetValue( self ):
|
||||
|
||||
gug_keys_to_display = { gug_key for ( gug_key, display ) in self._gug_display_list_ctrl.GetData() if display }
|
||||
url_match_keys_to_display = { url_match_key for ( url_match_key, display ) in self._url_display_list_ctrl.GetData() if display }
|
||||
|
||||
return ( gug_keys_to_display, url_match_keys_to_display )
|
||||
|
||||
|
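`GetValue` above reduces each list ctrl's ( key, display ) rows to the set of keys flagged for display with a set comprehension. The reduction on its own, separated from the wx plumbing, is just:

```python
def get_keys_to_display( rows ):
    
    # rows are ( key, display ) pairs as stored in the list ctrl;
    # keep only the keys whose display flag is set
    return { key for ( key, display ) in rows if display }
    
```

The same one-liner is applied twice in the panel, once for gug keys and once for url match keys.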
||||
class EditDuplicateActionOptionsPanel( ClientGUIScrolledPanels.EditPanel ):
|
||||
|
||||
def __init__( self, parent, duplicate_action, duplicate_action_options, for_custom_action = False ):
|
||||
|
@ -1498,6 +1713,146 @@ class EditGUGPanel( ClientGUIScrolledPanels.EditPanel ):
|
|||
return gug
|
||||
|
||||
|
||||
class EditNGUGPanel( ClientGUIScrolledPanels.EditPanel ):
|
||||
|
||||
def __init__( self, parent, ngug, available_gugs ):
|
||||
|
||||
ClientGUIScrolledPanels.EditPanel.__init__( self, parent )
|
||||
|
||||
self._original_ngug = ngug
|
||||
self._available_gugs = available_gugs
|
||||
|
||||
self._available_gugs.sort( key = lambda g: g.GetName() )
|
||||
|
||||
self._name = wx.TextCtrl( self )
|
||||
|
||||
self._initial_search_text = wx.TextCtrl( self )
|
||||
|
||||
self._gug_list_ctrl_panel = ClientGUIListCtrl.BetterListCtrlPanel( self )
|
||||
|
||||
columns = [ ( 'gug name', 24 ), ( 'available?', 20 ) ]
|
||||
|
||||
self._gug_list_ctrl = ClientGUIListCtrl.BetterListCtrl( self._gug_list_ctrl_panel, 'ngug_gugs', 30, 74, columns, self._ConvertGUGDataToListCtrlTuples, delete_key_callback = self._DeleteGUG )
|
||||
|
||||
self._gug_list_ctrl_panel.SetListCtrl( self._gug_list_ctrl )
|
||||
|
||||
self._add_button = ClientGUICommon.BetterButton( self._gug_list_ctrl_panel, 'add', self._AddGUGButtonClick )
|
||||
|
||||
self._gug_list_ctrl_panel.AddWindow( self._add_button )
|
||||
self._gug_list_ctrl_panel.AddButton( 'delete', self._DeleteGUG, enabled_only_on_selection = True )
|
||||
|
||||
#
|
||||
|
||||
name = ngug.GetName()
|
||||
|
||||
initial_search_text = ngug.GetInitialSearchText()
|
||||
|
||||
self._name.SetValue( name )
|
||||
self._initial_search_text.SetValue( initial_search_text )
|
||||
|
||||
gug_keys_and_names = ngug.GetGUGKeysAndNames()
|
||||
|
||||
self._gug_list_ctrl.AddDatas( gug_keys_and_names )
|
||||
|
||||
self._gug_list_ctrl.Sort( 0 )
|
||||
|
||||
#
|
||||
|
||||
rows = []
|
||||
|
||||
rows.append( ( 'name: ', self._name ) )
|
||||
rows.append( ( 'initial search text (to prompt user): ', self._initial_search_text ) )
|
||||
|
||||
gridbox = ClientGUICommon.WrapInGrid( self, rows )
|
||||
|
||||
vbox = wx.BoxSizer( wx.VERTICAL )
|
||||
|
||||
vbox.Add( gridbox, CC.FLAGS_EXPAND_SIZER_PERPENDICULAR )
|
||||
vbox.Add( self._gug_list_ctrl_panel, CC.FLAGS_EXPAND_BOTH_WAYS )
|
||||
|
||||
self.SetSizer( vbox )
|
||||
|
||||
|
||||
def _AddGUG( self, gug ):
|
||||
|
||||
gug_key_and_name = gug.GetGUGKeyAndName()
|
||||
|
||||
self._gug_list_ctrl.AddDatas( ( gug_key_and_name, ) )
|
||||
|
||||
|
||||
def _AddGUGButtonClick( self ):
|
||||
|
||||
existing_gug_keys = { gug_key for ( gug_key, gug_name ) in self._gug_list_ctrl.GetData() }
|
||||
existing_gug_names = { gug_name for ( gug_key, gug_name ) in self._gug_list_ctrl.GetData() }
|
||||
|
||||
menu = wx.Menu()
|
||||
|
||||
for gug in self._available_gugs:
|
||||
|
||||
if gug.GetGUGKey() in existing_gug_keys or gug.GetName() in existing_gug_names:
|
||||
|
||||
continue
|
||||
|
||||
|
||||
label = gug.GetName()
|
||||
description = 'Add this GUG to the NGUG.'
|
||||
|
||||
ClientGUIMenus.AppendMenuItem( self, menu, label, description, self._AddGUG, gug )
|
||||
|
||||
|
||||
HG.client_controller.PopupMenu( self._add_button, menu )
|
||||
|
||||
|
||||
def _ConvertGUGDataToListCtrlTuples( self, gug_key_and_name ):
|
||||
|
||||
( gug_key, gug_name ) = gug_key_and_name
|
||||
|
||||
name = gug_name
|
||||
pretty_name = name
|
||||
|
||||
available = gug_key in ( gug.GetGUGKey() for gug in self._available_gugs ) or gug_name in ( gug.GetName() for gug in self._available_gugs )
|
||||
|
||||
if available:
|
||||
|
||||
pretty_available = 'yes'
|
||||
|
||||
else:
|
||||
|
||||
pretty_available = 'no'
|
||||
|
||||
|
||||
display_tuple = ( pretty_name, pretty_available )
|
||||
sort_tuple = ( name, available )
|
||||
|
||||
return ( display_tuple, sort_tuple )
|
||||
|
||||
|
||||
def _DeleteGUG( self ):
|
||||
|
||||
with ClientGUIDialogs.DialogYesNo( self, 'Remove all selected?' ) as dlg:
|
||||
|
||||
if dlg.ShowModal() == wx.ID_YES:
|
||||
|
||||
self._gug_list_ctrl.DeleteSelected()
|
||||
|
||||
|
||||
|
||||
|
||||
def GetValue( self ):
|
||||
|
||||
gug_key = self._original_ngug.GetGUGKey()
|
||||
name = self._name.GetValue()
|
||||
initial_search_text = self._initial_search_text.GetValue()
|
||||
|
||||
gug_keys_and_names = self._gug_list_ctrl.GetData()
|
||||
|
||||
ngug = ClientNetworkingDomain.NestedGalleryURLGenerator( name, gug_key = gug_key, initial_search_text = initial_search_text, gug_keys_and_names = gug_keys_and_names )
|
||||
|
||||
ngug.RepairGUGs( self._available_gugs )
|
||||
|
||||
return ngug
|
||||
|
||||
|
||||
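`GetValue` above calls `ngug.RepairGUGs( self._available_gugs )` before returning. The implementation is not shown in this diff; given the panel's behaviour, a repair pass presumably re-links each stored ( key, name ) pair against the currently available gugs, key first and then name, dropping anything unresolvable. A sketch under that assumption, with `StubGUG` and `repair_gug_keys_and_names` as illustrative names:

```python
class StubGUG:
    
    # minimal stand-in for hydrus's GUG object, just for this sketch
    def __init__( self, key, name ):
        
        self._key = key
        self._name = name
        
    
    def GetGUGKey( self ):
        
        return self._key
        
    
    def GetName( self ):
        
        return self._name
        
    

def repair_gug_keys_and_names( gug_keys_and_names, available_gugs ):
    
    # refresh each stored ( key, name ) pair from the matching available gug,
    # preferring the key and falling back to the name; unresolvable pairs are dropped
    keys_to_gugs = { gug.GetGUGKey() : gug for gug in available_gugs }
    names_to_gugs = { gug.GetName() : gug for gug in available_gugs }
    
    repaired = []
    
    for ( gug_key, gug_name ) in gug_keys_and_names:
        
        gug = keys_to_gugs.get( gug_key, names_to_gugs.get( gug_name ) )
        
        if gug is not None:
            
            repaired.append( ( gug.GetGUGKey(), gug.GetName() ) )
            
        
    
    return repaired
    
```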
class EditGUGsPanel( ClientGUIScrolledPanels.EditPanel ):
|
||||
|
||||
def __init__( self, parent, gugs ):
|
||||
|
@ -1508,45 +1863,82 @@ class EditGUGsPanel( ClientGUIScrolledPanels.EditPanel ):
|
|||
|
||||
page_func = HydrusData.Call( ClientPaths.LaunchPathInWebBrowser, os.path.join( HC.HELP_DIR, 'downloader_gugs.html' ) )
|
||||
|
||||
menu_items.append( ( 'normal', 'open the url classes help', 'Open the help page for url classes in your web browser.', page_func ) )
|
||||
menu_items.append( ( 'normal', 'open the gugs help', 'Open the help page for gugs in your web browser.', page_func ) )
|
||||
|
||||
help_button = ClientGUICommon.MenuBitmapButton( self, CC.GlobalBMPs.help, menu_items )
|
||||
|
||||
help_hbox = ClientGUICommon.WrapInText( help_button, self, 'help for this panel -->', wx.Colour( 0, 0, 255 ) )
|
||||
|
||||
self._list_ctrl_panel = ClientGUIListCtrl.BetterListCtrlPanel( self )
|
||||
#
|
||||
|
||||
columns = [ ( 'name', 24 ), ( 'example url', -1 ), ( 'gallery url class?', 20 ) ]
|
||||
|
||||
self._list_ctrl = ClientGUIListCtrl.BetterListCtrl( self._list_ctrl_panel, 'gugs', 30, 74, columns, self._ConvertDataToListCtrlTuples, delete_key_callback = self._Delete, activation_callback = self._Edit )
|
||||
|
||||
self._list_ctrl_panel.SetListCtrl( self._list_ctrl )
|
||||
|
||||
self._list_ctrl_panel.AddButton( 'add', self._Add )
|
||||
self._list_ctrl_panel.AddButton( 'edit', self._Edit, enabled_only_on_selection = True )
|
||||
self._list_ctrl_panel.AddButton( 'delete', self._Delete, enabled_only_on_selection = True )
|
||||
self._list_ctrl_panel.AddSeparator()
|
||||
self._list_ctrl_panel.AddImportExportButtons( ( ClientNetworkingDomain.GalleryURLGenerator, ), self._AddGUG )
|
||||
self._list_ctrl_panel.AddSeparator()
|
||||
self._list_ctrl_panel.AddDefaultsButton( ClientDefaults.GetDefaultGUGs, self._AddGUG )
|
||||
self._notebook = wx.Notebook( self )
|
||||
|
||||
#
|
||||
|
||||
self._list_ctrl.AddDatas( gugs )
|
||||
self._gug_list_ctrl_panel = ClientGUIListCtrl.BetterListCtrlPanel( self._notebook )
|
||||
|
||||
self._list_ctrl.Sort( 0 )
|
||||
columns = [ ( 'name', 24 ), ( 'example url', -1 ), ( 'gallery url class?', 20 ) ]
|
||||
|
||||
self._gug_list_ctrl = ClientGUIListCtrl.BetterListCtrl( self._gug_list_ctrl_panel, 'gugs', 30, 74, columns, self._ConvertGUGToListCtrlTuples, delete_key_callback = self._DeleteGUG, activation_callback = self._EditGUG )
|
||||
|
||||
self._gug_list_ctrl_panel.SetListCtrl( self._gug_list_ctrl )
|
||||
|
||||
self._gug_list_ctrl_panel.AddButton( 'add', self._AddNewGUG )
|
||||
self._gug_list_ctrl_panel.AddButton( 'edit', self._EditGUG, enabled_only_on_selection = True )
|
||||
self._gug_list_ctrl_panel.AddButton( 'delete', self._DeleteGUG, enabled_only_on_selection = True )
|
||||
self._gug_list_ctrl_panel.AddSeparator()
|
||||
self._gug_list_ctrl_panel.AddImportExportButtons( ( ClientNetworkingDomain.GalleryURLGenerator, ), self._AddGUG )
|
||||
self._gug_list_ctrl_panel.AddSeparator()
|
||||
self._gug_list_ctrl_panel.AddDefaultsButton( ClientDefaults.GetDefaultSingleGUGs, self._AddGUG )
|
||||
|
||||
#
|
||||
|
||||
self._ngug_list_ctrl_panel = ClientGUIListCtrl.BetterListCtrlPanel( self._notebook )
|
||||
|
||||
columns = [ ( 'name', 24 ), ( 'gugs', -1 ), ( 'missing gugs', 14 ) ]
|
||||
|
||||
self._ngug_list_ctrl = ClientGUIListCtrl.BetterListCtrl( self._ngug_list_ctrl_panel, 'ngugs', 20, 64, columns, self._ConvertNGUGToListCtrlTuples, delete_key_callback = self._DeleteNGUG, activation_callback = self._EditNGUG )
|
||||
|
||||
self._ngug_list_ctrl_panel.SetListCtrl( self._ngug_list_ctrl )
|
||||
|
||||
self._ngug_list_ctrl_panel.AddButton( 'add', self._AddNewNGUG )
|
||||
self._ngug_list_ctrl_panel.AddButton( 'edit', self._EditNGUG, enabled_only_on_selection = True )
|
||||
self._ngug_list_ctrl_panel.AddButton( 'delete', self._DeleteNGUG, enabled_only_on_selection = True )
|
||||
self._ngug_list_ctrl_panel.AddSeparator()
|
||||
self._ngug_list_ctrl_panel.AddImportExportButtons( ( ClientNetworkingDomain.NestedGalleryURLGenerator, ), self._AddNGUG )
|
||||
self._ngug_list_ctrl_panel.AddSeparator()
|
||||
self._ngug_list_ctrl_panel.AddDefaultsButton( ClientDefaults.GetDefaultNGUGs, self._AddNGUG )
|
||||
|
||||
#
|
||||
|
||||
single_gugs = [ gug for gug in gugs if isinstance( gug, ClientNetworkingDomain.GalleryURLGenerator ) ]
|
||||
|
||||
self._gug_list_ctrl.AddDatas( single_gugs )
|
||||
|
||||
self._gug_list_ctrl.Sort( 0 )
|
||||
|
||||
ngugs = [ gug for gug in gugs if isinstance( gug, ClientNetworkingDomain.NestedGalleryURLGenerator ) ]
|
||||
|
||||
self._ngug_list_ctrl.AddDatas( ngugs )
|
||||
|
||||
self._ngug_list_ctrl.Sort( 0 )
|
||||
|
||||
#
|
||||
|
||||
self._notebook.AddPage( self._gug_list_ctrl_panel, 'gallery url generators', select = True )
|
||||
self._notebook.AddPage( self._ngug_list_ctrl_panel, 'nested gallery url generators', select = False )
|
||||
|
||||
#
|
||||
|
||||
vbox = wx.BoxSizer( wx.VERTICAL )
|
||||
|
||||
vbox.Add( help_hbox, CC.FLAGS_BUTTON_SIZER )
|
||||
vbox.Add( self._list_ctrl_panel, CC.FLAGS_EXPAND_BOTH_WAYS )
|
||||
vbox.Add( self._notebook, CC.FLAGS_EXPAND_BOTH_WAYS )
|
||||
|
||||
self.SetSizer( vbox )
|
||||
|
||||
|
||||
def _Add( self ):
|
||||
def _AddNewGUG( self ):
|
||||
|
||||
gug = ClientNetworkingDomain.GalleryURLGenerator( 'new gallery url generator' )
|
||||
|
||||
|
@ -1562,7 +1954,30 @@ class EditGUGsPanel( ClientGUIScrolledPanels.EditPanel ):
|
|||
|
||||
self._AddGUG( gug )
|
||||
|
||||
self._list_ctrl.Sort()
|
||||
self._gug_list_ctrl.Sort()
|
||||
|
||||
|
||||
|
||||
|
||||
def _AddNewNGUG( self ):
|
||||
|
||||
ngug = ClientNetworkingDomain.NestedGalleryURLGenerator( 'new nested gallery url generator' )
|
||||
|
||||
available_gugs = self._gug_list_ctrl.GetData()
|
||||
|
||||
with ClientGUITopLevelWindows.DialogEdit( self, 'edit nested gallery url generator' ) as dlg:
|
||||
|
||||
panel = EditNGUGPanel( dlg, ngug, available_gugs )
|
||||
|
||||
dlg.SetPanel( panel )
|
||||
|
||||
if dlg.ShowModal() == wx.ID_OK:
|
||||
|
||||
ngug = panel.GetValue()
|
||||
|
||||
self._AddNGUG( ngug )
|
||||
|
||||
self._ngug_list_ctrl.Sort()
|
||||
|
||||
|
||||
|
||||
|
@ -1573,10 +1988,19 @@ class EditGUGsPanel( ClientGUIScrolledPanels.EditPanel ):
|
|||
|
||||
gug.RegenerateGUGKey()
|
||||
|
||||
self._list_ctrl.AddDatas( ( gug, ) )
|
||||
self._gug_list_ctrl.AddDatas( ( gug, ) )
|
||||
|
||||
|
||||
def _ConvertDataToListCtrlTuples( self, gug ):
|
||||
def _AddNGUG( self, ngug ):
|
||||
|
||||
HydrusSerialisable.SetNonDupeName( ngug, self._GetExistingNames() )
|
||||
|
||||
ngug.RegenerateGUGKey()
|
||||
|
||||
self._ngug_list_ctrl.AddDatas( ( ngug, ) )
|
||||
|
||||
|
||||
def _ConvertGUGToListCtrlTuples( self, gug ):
|
||||
|
||||
name = gug.GetName()
|
||||
example_url = gug.GetExampleURL()
|
||||
|
@ -1605,22 +2029,97 @@ class EditGUGsPanel( ClientGUIScrolledPanels.EditPanel ):
|
|||
return ( display_tuple, sort_tuple )
|
||||
|
||||
|
||||
def _Delete( self ):
|
||||
def _ConvertNGUGToListCtrlTuples( self, ngug ):
|
||||
|
||||
# This GUG is in NGUG blah, you sure?
|
||||
existing_names = { gug.GetName() for gug in self._gug_list_ctrl.GetData() }
|
||||
|
||||
name = ngug.GetName()
|
||||
gugs = ngug.GetGUGNames()
|
||||
missing = len( set( gugs ).difference( existing_names ) ) > 0
|
||||
|
||||
pretty_name = name
|
||||
pretty_gugs = ', '.join( gugs )
|
||||
|
||||
if missing:
|
||||
|
||||
pretty_missing = 'yes'
|
||||
|
||||
else:
|
||||
|
||||
pretty_missing = ''
|
||||
|
||||
|
||||
display_tuple = ( pretty_name, pretty_gugs, pretty_missing )
|
||||
sort_tuple = ( name, gugs, missing )
|
||||
|
||||
return ( display_tuple, sort_tuple )
|
||||
|
||||
|
||||
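`_ConvertNGUGToListCtrlTuples` above flags an NGUG as having missing gugs when some of its gug names are absent from the current gug list. The check reduces to a set difference:

```python
def has_missing_gugs( ngug_gug_names, existing_names ):
    
    # True if the ngug references any gug name not present in the current gug list
    return len( set( ngug_gug_names ).difference( existing_names ) ) > 0
    
```

This is the same condition that drives the 'missing gugs' column's 'yes' cell in the ngug list.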
def _DeleteGUG( self ):
|
||||
|
||||
ngugs = self._ngug_list_ctrl.GetData()
|
||||
|
||||
deletees = self._gug_list_ctrl.GetData( only_selected = True )
|
||||
|
||||
with ClientGUIDialogs.DialogYesNo( self, 'Remove all selected?' ) as dlg:
|
||||
|
||||
if dlg.ShowModal() == wx.ID_YES:
|
||||
|
||||
self._list_ctrl.DeleteSelected()
|
||||
for deletee in deletees:
|
||||
|
||||
deletee_ngug_key = deletee.GetGUGKey()
|
||||
|
||||
affected_ngug_names = []
|
||||
|
||||
for ngug in ngugs:
|
||||
|
||||
if deletee_ngug_key in ngug.GetGUGKeys():
|
||||
|
||||
affected_ngug_names.append( ngug.GetName() )
|
||||
|
||||
|
||||
|
||||
if len( affected_ngug_names ) > 0:
|
||||
|
||||
affected_ngug_names.sort()
|
||||
|
||||
message = 'The GUG "' + deletee.GetName() + '" is in the NGUGs:'
|
||||
message += os.linesep * 2
|
||||
message += os.linesep.join( affected_ngug_names )
|
||||
message += os.linesep * 2
|
||||
message += 'Deleting this GUG will ultimately remove it from those NGUGs--are you sure that is ok?'
|
||||
|
||||
with ClientGUIDialogs.DialogYesNo( self, message ) as ngug_dlg:
|
||||
|
||||
if ngug_dlg.ShowModal() == wx.ID_YES:
|
||||
|
||||
self._gug_list_ctrl.DeleteDatas( ( deletee, ) )
|
||||
|
||||
else:
|
||||
|
||||
break
|
||||
|
||||
|
||||
|
||||
|
||||
|
||||
|
||||
|
||||
|
||||
def _Edit( self ):
|
||||
def _DeleteNGUG( self ):
|
||||
|
||||
for gug in self._list_ctrl.GetData( only_selected = True ):
|
||||
with ClientGUIDialogs.DialogYesNo( self, 'Remove all selected?' ) as dlg:
|
||||
|
||||
if dlg.ShowModal() == wx.ID_YES:
|
||||
|
||||
self._ngug_list_ctrl.DeleteSelected()
|
||||
|
||||
|
||||
|
||||
|
||||
def _EditGUG( self ):
|
||||
|
||||
for gug in self._gug_list_ctrl.GetData( only_selected = True ):
|
||||
|
||||
with ClientGUITopLevelWindows.DialogEdit( self, 'edit gallery url generator' ) as dlg:
|
||||
|
||||
|
@ -1630,13 +2129,13 @@ class EditGUGsPanel( ClientGUIScrolledPanels.EditPanel ):
|
|||
|
||||
if dlg.ShowModal() == wx.ID_OK:
|
||||
|
||||
self._list_ctrl.DeleteDatas( ( gug, ) )
|
||||
self._gug_list_ctrl.DeleteDatas( ( gug, ) )
|
||||
|
||||
gug = panel.GetValue()
|
||||
|
||||
HydrusSerialisable.SetNonDupeName( gug, self._GetExistingNames() )
|
||||
|
||||
self._list_ctrl.AddDatas( ( gug, ) )
|
||||
self._gug_list_ctrl.AddDatas( ( gug, ) )
|
||||
|
||||
else:
|
||||
|
||||
|
@ -1645,21 +2144,64 @@ class EditGUGsPanel( ClientGUIScrolledPanels.EditPanel ):
|
|||
|
||||
|
||||
|
||||
self._list_ctrl.Sort()
|
||||
self._gug_list_ctrl.Sort()
|
||||
|
||||
|
||||
def _EditNGUG( self ):
|
||||
|
||||
available_gugs = self._gug_list_ctrl.GetData()
|
||||
|
||||
for ngug in self._ngug_list_ctrl.GetData( only_selected = True ):
|
||||
|
||||
with ClientGUITopLevelWindows.DialogEdit( self, 'edit nested gallery url generator' ) as dlg:
|
||||
|
||||
panel = EditNGUGPanel( dlg, ngug, available_gugs )
|
||||
|
||||
dlg.SetPanel( panel )
|
||||
|
||||
if dlg.ShowModal() == wx.ID_OK:
|
||||
|
||||
self._ngug_list_ctrl.DeleteDatas( ( ngug, ) )
|
||||
|
||||
ngug = panel.GetValue()
|
||||
|
||||
HydrusSerialisable.SetNonDupeName( ngug, self._GetExistingNames() )
|
||||
|
||||
self._ngug_list_ctrl.AddDatas( ( ngug, ) )
|
||||
|
||||
else:
|
||||
|
||||
break
|
||||
|
||||
|
||||
|
||||
|
||||
self._ngug_list_ctrl.Sort()
|
||||
|
||||
|
||||
def _GetExistingNames( self ):
|
||||
|
||||
gugs = self._list_ctrl.GetData()
|
||||
gugs = self._gug_list_ctrl.GetData()
|
||||
ngugs = self._ngug_list_ctrl.GetData()
|
||||
|
||||
names = { gug.GetName() for gug in gugs }
|
||||
names.update( ( ngug.GetName() for ngug in ngugs ) )
|
||||
|
||||
return names
|
||||
|
||||
|
||||
def GetValue( self ):
|
||||
|
||||
gugs = self._list_ctrl.GetData()
|
||||
gugs = list( self._gug_list_ctrl.GetData() )
|
||||
|
||||
ngugs = self._ngug_list_ctrl.GetData()
|
||||
|
||||
for ngug in ngugs:
|
||||
|
||||
ngug.RepairGUGs( gugs )
|
||||
|
||||
|
||||
gugs.extend( ngugs )
|
||||
|
||||
return gugs
|
||||
|
||||
|
@ -2577,11 +3119,11 @@ class EditSubscriptionPanel( ClientGUIScrolledPanels.EditPanel ):
#

( name, gallery_identifier, gallery_stream_identifiers, queries, checker_options, initial_file_limit, periodic_file_limit, paused, file_import_options, tag_import_options, self._no_work_until, self._no_work_until_reason ) = subscription.ToTuple()
( name, gug_key_and_name, queries, checker_options, initial_file_limit, periodic_file_limit, paused, file_import_options, tag_import_options, self._no_work_until, self._no_work_until_reason ) = subscription.ToTuple()

self._query_panel = ClientGUICommon.StaticBox( self, 'site and queries' )

self._gallery_identifier = ClientGUIImport.GallerySelector( self._query_panel, gallery_identifier )
self._gug_key_and_name = ClientGUIImport.GUGKeyAndNameSelector( self._query_panel, gug_key_and_name )

queries_panel = ClientGUIListCtrl.BetterListCtrlPanel( self._query_panel )
@ -2701,7 +3243,7 @@ But if 2 is--and is also perhaps accompanied by many 'could not parse' errors--t
|
|||
|
||||
#
|
||||
|
||||
self._query_panel.Add( self._gallery_identifier, CC.FLAGS_EXPAND_PERPENDICULAR )
|
||||
self._query_panel.Add( self._gug_key_and_name, CC.FLAGS_EXPAND_PERPENDICULAR )
|
||||
self._query_panel.Add( queries_panel, CC.FLAGS_EXPAND_BOTH_WAYS )
|
||||
self._query_panel.Add( self._checker_options, CC.FLAGS_EXPAND_PERPENDICULAR )
|
||||
|
||||
|
@ -2750,11 +3292,11 @@ But if 2 is--and is also perhaps accompanied by many 'could not parse' errors--t
def _AddQuery( self ):

gallery_identifier = self._gallery_identifier.GetValue()
gug_key_and_name = self._gug_key_and_name.GetValue()

search_value = ClientDefaults.GetDefaultSearchValue( gallery_identifier )
initial_search_text = HG.client_controller.network_engine.domain_manager.GetInitialSearchText( gug_key_and_name )

query = ClientImportSubscriptions.SubscriptionQuery( search_value )
query = ClientImportSubscriptions.SubscriptionQuery( initial_search_text )

with ClientGUITopLevelWindows.DialogEdit( self, 'edit subscription query' ) as dlg:
@ -3163,10 +3705,7 @@ But if 2 is--and is also perhaps accompanied by many 'could not parse' errors--t
|
|||
|
||||
subscription = ClientImportSubscriptions.Subscription( name )
|
||||
|
||||
gallery_identifier = self._gallery_identifier.GetValue()
|
||||
|
||||
# in future, this can be harvested from some checkboxes or whatever for stream selection
|
||||
gallery_stream_identifiers = ClientDownloading.GetGalleryStreamIdentifiers( gallery_identifier )
|
||||
gug_key_and_name = self._gug_key_and_name.GetValue()
|
||||
|
||||
queries = self._queries.GetData()
|
||||
|
||||
|
@ -3179,7 +3718,7 @@ But if 2 is--and is also perhaps accompanied by many 'could not parse' errors--t
|
|||
file_import_options = self._file_import_options.GetValue()
|
||||
tag_import_options = self._tag_import_options.GetValue()
|
||||
|
||||
subscription.SetTuple( gallery_identifier, gallery_stream_identifiers, queries, checker_options, initial_file_limit, periodic_file_limit, paused, file_import_options, tag_import_options, self._no_work_until )
|
||||
subscription.SetTuple( gug_key_and_name, queries, checker_options, initial_file_limit, periodic_file_limit, paused, file_import_options, tag_import_options, self._no_work_until )
|
||||
|
||||
publish_files_to_popup_button = self._publish_files_to_popup_button.GetValue()
|
||||
publish_files_to_page = self._publish_files_to_page.GetValue()
|
||||
|
@ -3406,11 +3945,11 @@ class EditSubscriptionsPanel( ClientGUIScrolledPanels.EditPanel ):
subscriptions = [ subscription for subscription in subscriptions if len( subscription.GetQueries() ) > 0 ]

gallery_identifiers = { subscription.GetGalleryIdentifier() for subscription in subscriptions }
gug_names = { subscription.GetGUGKeyAndName()[1] for subscription in subscriptions }

# if there are fewer, there must be dupes, so we must be able to merge

return len( gallery_identifiers ) < len( subscriptions )
return len( gug_names ) < len( subscriptions )


def _CanReset( self ):
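`_CanMerge` above infers duplicates from cardinality alone: if the set of gug names across the subscriptions is smaller than the number of subscriptions, at least two subscriptions share a source and can merge. The pigeonhole check in isolation:

```python
def can_merge( subscription_gug_names ):
    
    # subscription_gug_names: one gug name per (non-empty) subscription;
    # fewer unique names than entries implies at least one duplicate source
    return len( set( subscription_gug_names ) ) < len( subscription_gug_names )
    
```

Note the diff also filters out subscriptions with zero queries first, so empty subs never count towards a merge.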
@ -3462,9 +4001,9 @@ class EditSubscriptionsPanel( ClientGUIScrolledPanels.EditPanel ):
|
|||
|
||||
def _ConvertSubscriptionToListCtrlTuples( self, subscription ):
|
||||
|
||||
( name, gallery_identifier, gallery_stream_identifiers, queries, checker_options, initial_file_limit, periodic_file_limit, paused, file_import_options, tag_import_options, no_work_until, no_work_until_reason ) = subscription.ToTuple()
|
||||
( name, gug_key_and_name, queries, checker_options, initial_file_limit, periodic_file_limit, paused, file_import_options, tag_import_options, no_work_until, no_work_until_reason ) = subscription.ToTuple()
|
||||
|
||||
pretty_site = gallery_identifier.ToString()
|
||||
pretty_site = gug_key_and_name[1]
|
||||
|
||||
period = 100
|
||||
pretty_period = 'fix this'
|
||||
|
@ -3672,7 +4211,9 @@ class EditSubscriptionsPanel( ClientGUIScrolledPanels.EditPanel ):
|
|||
|
||||
def Add( self ):
|
||||
|
||||
empty_subscription = ClientImportSubscriptions.Subscription( 'new subscription' )
|
||||
gug_key_and_name = HG.client_controller.network_engine.domain_manager.GetDefaultGUGKeyAndName()
|
||||
|
||||
empty_subscription = ClientImportSubscriptions.Subscription( 'new subscription', gug_key_and_name = gug_key_and_name )
|
||||
|
||||
with ClientGUITopLevelWindows.DialogEdit( self, 'edit subscription' ) as dlg_edit:
|
||||
|
||||
|
@ -5223,6 +5764,8 @@ class EditURLMatchPanel( ClientGUIScrolledPanels.EditPanel ):
|
|||
|
||||
new_row = ( new_string_match, new_default )
|
||||
|
||||
wx.CallAfter( self._UpdateControls ) # seems sometimes this doesn't kick in naturally
|
||||
|
||||
return ( True, new_row )
|
||||
|
||||
|
||||
|
@ -5696,7 +6239,7 @@ class EditURLMatchesPanel( ClientGUIScrolledPanels.EditPanel ):
|
|||
|
||||
class EditURLMatchLinksPanel( ClientGUIScrolledPanels.EditPanel ):
|
||||
|
||||
def __init__( self, parent, network_engine, url_matches, parsers, url_match_keys_to_display, url_match_keys_to_parser_keys ):
|
||||
def __init__( self, parent, network_engine, url_matches, parsers, url_match_keys_to_parser_keys ):
|
||||
|
||||
ClientGUIScrolledPanels.EditPanel.__init__( self, parent )
|
||||
|
||||
|
@ -5714,18 +6257,6 @@ class EditURLMatchLinksPanel( ClientGUIScrolledPanels.EditPanel ):
|
|||
|
||||
#
|
||||
|
||||
self._display_list_ctrl_panel = ClientGUIListCtrl.BetterListCtrlPanel( self._notebook )
|
||||
|
||||
columns = [ ( 'url class', -1 ), ( 'url type', 20 ), ( 'display on media viewer?', 36 ) ]
|
||||
|
||||
self._display_list_ctrl = ClientGUIListCtrl.BetterListCtrl( self._display_list_ctrl_panel, 'url_match_keys_to_display', 15, 36, columns, self._ConvertDisplayDataToListCtrlTuples, activation_callback = self._EditDisplay )
|
||||
|
||||
self._display_list_ctrl_panel.SetListCtrl( self._display_list_ctrl )
|
||||
|
||||
self._display_list_ctrl_panel.AddButton( 'edit', self._EditDisplay, enabled_only_on_selection = True )
|
||||
|
||||
#
|
||||
|
||||
columns = [ ( 'url class', -1 ), ( 'api url class', 36 ) ]
|
||||
|
||||
self._api_pairs_list_ctrl = ClientGUIListCtrl.BetterListCtrl( self._notebook, 'url_match_api_pairs', 10, 36, columns, self._ConvertAPIPairDataToListCtrlTuples )
|
||||
|
@@ -5746,23 +6277,6 @@ class EditURLMatchLinksPanel( ClientGUIScrolledPanels.EditPanel ):

#

listctrl_data = []

for url_match in url_matches:

url_match_key = url_match.GetMatchKey()

display = url_match_key in url_match_keys_to_display

listctrl_data.append( ( url_match_key, display ) )

self._display_list_ctrl.AddDatas( listctrl_data )

self._display_list_ctrl.Sort( 1 )

#

api_pairs = ClientNetworkingDomain.ConvertURLMatchesIntoAPIPairs( url_matches )

self._api_pairs_list_ctrl.AddDatas( api_pairs )
@@ -5808,7 +6322,6 @@ class EditURLMatchLinksPanel( ClientGUIScrolledPanels.EditPanel ):

self._notebook.AddPage( self._parser_list_ctrl_panel, 'parser links' )
self._notebook.AddPage( self._api_pairs_list_ctrl, 'api link review' )
self._notebook.AddPage( self._display_list_ctrl_panel, 'media viewer display' )

#
@@ -5857,33 +6370,6 @@ class EditURLMatchLinksPanel( ClientGUIScrolledPanels.EditPanel ):

return ( display_tuple, sort_tuple )

def _ConvertDisplayDataToListCtrlTuples( self, data ):

( url_match_key, display ) = data

url_match = self._url_match_keys_to_url_matches[ url_match_key ]

url_match_name = url_match.GetName()
url_type = url_match.GetURLType()

pretty_name = url_match_name
pretty_url_type = HC.url_type_string_lookup[ url_type ]

if display:

pretty_display = 'yes'

else:

pretty_display = 'no'

display_tuple = ( pretty_name, pretty_url_type, pretty_display )
sort_tuple = ( url_match_name, pretty_url_type, display )

return ( display_tuple, sort_tuple )

def _ConvertParserDataToListCtrlTuples( self, data ):

( url_match_key, parser_key ) = data
@@ -5917,40 +6403,6 @@ class EditURLMatchLinksPanel( ClientGUIScrolledPanels.EditPanel ):

return ( display_tuple, sort_tuple )

def _EditDisplay( self ):

for data in self._display_list_ctrl.GetData( only_selected = True ):

( url_match_key, display ) = data

url_match_name = self._url_match_keys_to_url_matches[ url_match_key ].GetName()

message = 'Show ' + url_match_name + ' in the media viewer?'

with ClientGUIDialogs.DialogYesNo( self, message, title = 'Show in the media viewer?' ) as dlg:

result = dlg.ShowModal()

if result in ( wx.ID_YES, wx.ID_NO ):

display = result == wx.ID_YES

self._display_list_ctrl.DeleteDatas( ( data, ) )

new_data = ( url_match_key, display )

self._display_list_ctrl.AddDatas( ( new_data, ) )

else:

break

self._display_list_ctrl.Sort()

def _EditParser( self ):

if len( self._parsers ) == 0:
@@ -6032,9 +6484,8 @@ class EditURLMatchLinksPanel( ClientGUIScrolledPanels.EditPanel ):

def GetValue( self ):

url_match_keys_to_display = { url_match_key for ( url_match_key, display ) in self._display_list_ctrl.GetData() if display }
url_match_keys_to_parser_keys = { url_match_key : parser_key for ( url_match_key, parser_key ) in self._parser_list_ctrl.GetData() if parser_key is not None }

return ( url_match_keys_to_display, url_match_keys_to_parser_keys )
return url_match_keys_to_parser_keys
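The GetValue change above drops the display set and returns only the parser-key dict. The dict comprehension pattern it uses can be sketched in isolation (the row data here is hypothetical, standing in for the real list-ctrl contents):

```python
# Hypothetical rows shaped like the list-ctrl data above:
# ( url_match_key, parser_key ), with None meaning 'no parser linked'.
rows = [ ( 'key_a', 'parser_1' ), ( 'key_b', None ), ( 'key_c', 'parser_2' ) ]

# Same dict-comprehension pattern as GetValue: keep only linked pairs.
url_match_keys_to_parser_keys = { url_match_key : parser_key for ( url_match_key, parser_key ) in rows if parser_key is not None }

print( url_match_keys_to_parser_keys )
```

The `if parser_key is not None` filter is what lets unlinked url classes simply drop out of the result rather than appearing with a null value.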
@@ -5823,7 +5823,7 @@ class ManageTagsPanel( ClientGUIScrolledPanels.ManagePanel ):

wx.CallAfter( do_it )
HG.client_controller.CallLaterWXSafe( HG.client_controller.gui, 0.5, do_it )

def EventCheckAddParents( self, event ):
@@ -5922,7 +5922,11 @@ class ManageTagsPanel( ClientGUIScrolledPanels.ManagePanel ):

def OK( self ):

wx.QueueEvent( self.GetEventHandler(), ClientGUITopLevelWindows.OKEvent( -1 ) )
# nice to have this happen immediately so the 'advanced content update' stuff can occur in a neat event order afterwards
self.GetEventHandler().ProcessEvent( ClientGUITopLevelWindows.OKEvent( -1 ) )

# old call:
# wx.QueueEvent( self.GetEventHandler(), ClientGUITopLevelWindows.OKEvent( -1 ) )

def ProcessContentUpdates( self, service_keys_to_content_updates ):
@@ -1053,7 +1053,7 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):

file_seeds = ClientImporting.ConvertAllParseResultsToFileSeeds( all_parse_results, self.file_seed_data )

( num_urls_added, num_urls_already_in_file_seed_cache, can_add_more_file_urls, stop_reason ) = ClientImporting.UpdateFileSeedCacheWithFileSeeds( file_seed_cache, file_seeds )
( num_urls_added, num_urls_already_in_file_seed_cache, can_search_for_more_files, stop_reason ) = ClientImporting.UpdateFileSeedCacheWithFileSeeds( file_seed_cache, file_seeds )

status = CC.STATUS_SUCCESSFUL_AND_NEW
note = 'Found ' + HydrusData.ToHumanInt( num_urls_added ) + ' new URLs.'
@@ -21,9 +21,9 @@ class GalleryImport( HydrusSerialisable.SerialisableBase ):

SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_GALLERY_IMPORT
SERIALISABLE_NAME = 'Gallery Import'
SERIALISABLE_VERSION = 1
SERIALISABLE_VERSION = 2

def __init__( self, query = None, gallery_identifier = None ):
def __init__( self, query = None, source_name = None, initial_search_urls = None ):

# eventually move this to be ( name, first_url ). the name will be like 'samus_aran on gelbooru'
# then queue up a first url
@@ -33,9 +33,14 @@ class GalleryImport( HydrusSerialisable.SerialisableBase ):

query = 'samus_aran'

if gallery_identifier is None:
if source_name is None:

gallery_identifier = ClientDownloading.GalleryIdentifier( HC.SITE_TYPE_DEVIANT_ART )
source_name = 'unknown'

if initial_search_urls is None:

initial_search_urls = []

HydrusSerialisable.SerialisableBase.__init__( self )
@@ -44,7 +49,8 @@ class GalleryImport( HydrusSerialisable.SerialisableBase ):

self._gallery_import_key = HydrusData.GenerateKey()

self._query = query
self._gallery_identifier = gallery_identifier
self._source_name = source_name

self._page_key = 'initialising page key'
self._publish_to_page = False
@@ -63,6 +69,11 @@ class GalleryImport( HydrusSerialisable.SerialisableBase ):

self._tag_import_options = ClientImportOptions.TagImportOptions( is_default = True )

self._gallery_seed_log = ClientImportGallerySeeds.GallerySeedLog()

gallery_seeds = [ ClientImportGallerySeeds.GallerySeed( url ) for url in initial_search_urls ]

self._gallery_seed_log.AddGallerySeeds( gallery_seeds )

self._file_seed_cache = ClientImportFileSeeds.FileSeedCache()

self._no_work_until = 0
@@ -84,31 +95,6 @@ class GalleryImport( HydrusSerialisable.SerialisableBase ):

HG.client_controller.sub( self, 'NotifyFileSeedsUpdated', 'file_seed_cache_file_seeds_updated' )

def _AddSearchPage( self, page_index ):

try:

gallery = ClientDownloading.GetGallery( self._gallery_identifier )

except Exception as e:

HydrusData.PrintException( e )

self._files_paused = True
self._gallery_paused = True

HydrusData.ShowText( 'A downloader could not load its gallery! It has been paused and the full error has been written to the log!' )

return

gallery_url = gallery.GetGalleryPageURL( self._query, page_index )

gallery_seed = ClientImportGallerySeeds.GallerySeed( gallery_url, can_generate_more_pages = True )

self._gallery_seed_log.AddGallerySeeds( ( gallery_seed, ) )

def _AmOverFileLimit( self ):

if self._file_limit is not None and self._num_new_urls_found >= self._file_limit:
@@ -129,25 +115,21 @@ class GalleryImport( HydrusSerialisable.SerialisableBase ):

serialisable_gallery_import_key = self._gallery_import_key.encode( 'hex' )

serialisable_gallery_identifier = self._gallery_identifier.GetSerialisableTuple()

serialisable_file_import_options = self._file_import_options.GetSerialisableTuple()
serialisable_tag_import_options = self._tag_import_options.GetSerialisableTuple()

serialisable_gallery_seed_log = self._gallery_seed_log.GetSerialisableTuple()
serialisable_file_seed_cache = self._file_seed_cache.GetSerialisableTuple()

return ( serialisable_gallery_import_key, self._creation_time, self._query, serialisable_gallery_identifier, self._current_page_index, self._num_urls_found, self._num_new_urls_found, self._file_limit, self._gallery_paused, self._files_paused, serialisable_file_import_options, serialisable_tag_import_options, serialisable_gallery_seed_log, serialisable_file_seed_cache, self._no_work_until, self._no_work_until_reason )
return ( serialisable_gallery_import_key, self._creation_time, self._query, self._source_name, self._current_page_index, self._num_urls_found, self._num_new_urls_found, self._file_limit, self._gallery_paused, self._files_paused, serialisable_file_import_options, serialisable_tag_import_options, serialisable_gallery_seed_log, serialisable_file_seed_cache, self._no_work_until, self._no_work_until_reason )

def _InitialiseFromSerialisableInfo( self, serialisable_info ):

( serialisable_gallery_import_key, self._creation_time, self._query, serialisable_gallery_identifier, self._current_page_index, self._num_urls_found, self._num_new_urls_found, self._file_limit, self._gallery_paused, self._files_paused, serialisable_file_import_options, serialisable_tag_import_options, serialisable_gallery_seed_log, serialisable_file_seed_cache, self._no_work_until, self._no_work_until_reason ) = serialisable_info
( serialisable_gallery_import_key, self._creation_time, self._query, self._source_name, self._current_page_index, self._num_urls_found, self._num_new_urls_found, self._file_limit, self._gallery_paused, self._files_paused, serialisable_file_import_options, serialisable_tag_import_options, serialisable_gallery_seed_log, serialisable_file_seed_cache, self._no_work_until, self._no_work_until_reason ) = serialisable_info

self._gallery_import_key = serialisable_gallery_import_key.decode( 'hex' )

self._gallery_identifier = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_gallery_identifier )

self._file_import_options = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_file_import_options )
self._tag_import_options = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_tag_import_options )
@@ -204,6 +186,22 @@ class GalleryImport( HydrusSerialisable.SerialisableBase ):

return network_job

def _UpdateSerialisableInfo( self, version, old_serialisable_info ):

if version == 1:

( serialisable_gallery_import_key, self._creation_time, self._query, serialisable_gallery_identifier, self._current_page_index, self._num_urls_found, self._num_new_urls_found, self._file_limit, self._gallery_paused, self._files_paused, serialisable_file_import_options, serialisable_tag_import_options, serialisable_gallery_seed_log, serialisable_file_seed_cache, self._no_work_until, self._no_work_until_reason ) = old_serialisable_info

gallery_identifier = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_gallery_identifier )

source_name = ClientDownloading.ConvertGalleryIdentifierToGUGName( gallery_identifier )

new_serialisable_info = ( serialisable_gallery_import_key, self._creation_time, self._query, source_name, self._current_page_index, self._num_urls_found, self._num_new_urls_found, self._file_limit, self._gallery_paused, self._files_paused, serialisable_file_import_options, serialisable_tag_import_options, serialisable_gallery_seed_log, serialisable_file_seed_cache, self._no_work_until, self._no_work_until_reason )

return ( 2, new_serialisable_info )

def _WorkOnFiles( self ):

file_seed = self._file_seed_cache.GetNextFileSeed( CC.STATUS_UNKNOWN )
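The _UpdateSerialisableInfo hunk above follows the stepwise version-migration pattern: the saved payload carries a version number, and each update function converts one version to the next until the payload is current. A minimal, self-contained sketch of the idea (the payload shape and names here are hypothetical, not the hydrus classes themselves):

```python
# A minimal sketch of the stepwise version-migration pattern used by
# _UpdateSerialisableInfo above. The payload shape here is a hypothetical
# ( query, source_name ) pair; the real serialisable tuples are much longer.
CURRENT_VERSION = 2

def update_serialisable_info( version, info ):

    if version == 1:

        # v1 stored an old-style identifier; v2 stores a plain source name,
        # derived from the old field just as the real update derives it
        ( query, old_identifier ) = info

        source_name = 'migrated-' + old_identifier

        return ( 2, ( query, source_name ) )

    raise Exception( 'Unknown serialisable version!' )

def load( version, info ):

    # apply one update at a time until the payload is current
    while version < CURRENT_VERSION:

        ( version, info ) = update_serialisable_info( version, info )

    return info
```

Because each step returns the next version number, a payload that is several versions old is upgraded incrementally, which is why the real class only needs one `if version == n:` branch per historical version.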
@@ -217,141 +215,28 @@ class GalleryImport( HydrusSerialisable.SerialisableBase ):

try:

if file_seed.WorksInNewSystem():

def status_hook( text ):

with self._lock:

self._current_action = text

did_substantial_work = file_seed.WorkOnURL( self._file_seed_cache, status_hook, self._NetworkJobFactory, self._FileNetworkJobPresentationContextFactory, self._file_import_options, self._tag_import_options )

def status_hook( text ):

with self._lock:

should_present = self._publish_to_page and file_seed.ShouldPresent( self._file_import_options )

page_key = self._page_key
self._current_action = text

if should_present:

file_seed.PresentToPage( page_key )

did_substantial_work = True

did_substantial_work = file_seed.WorkOnURL( self._file_seed_cache, status_hook, self._NetworkJobFactory, self._FileNetworkJobPresentationContextFactory, self._file_import_options, self._tag_import_options )

with self._lock:

else:

should_present = self._publish_to_page and file_seed.ShouldPresent( self._file_import_options )

def network_job_factory( method, url, **kwargs ):

network_job = ClientNetworkingJobs.NetworkJobDownloader( self._gallery_import_key, method, url, **kwargs )

with self._lock:

self._file_network_job = network_job

return network_job

page_key = self._page_key

try:

gallery = ClientDownloading.GetGallery( self._gallery_identifier )

except Exception as e:

HydrusData.PrintException( e )

with self._lock:

self._files_paused = True
self._gallery_paused = True

HydrusData.ShowText( 'A downloader could not load its gallery! It has been paused and the full error has been written to the log!' )

return

if should_present:

gallery.SetNetworkJobFactory( network_job_factory )
file_seed.PresentToPage( page_key )

with self._lock:

self._current_action = 'reviewing file'

file_seed.PredictPreImportStatus( self._file_import_options, self._tag_import_options )

status = file_seed.status

url = file_seed.file_seed_data

if status == CC.STATUS_SUCCESSFUL_BUT_REDUNDANT:

if self._tag_import_options.ShouldFetchTagsEvenIfURLKnownAndFileAlreadyInDB() and self._tag_import_options.WorthFetchingTags():

downloaded_tags = gallery.GetTags( url )

file_seed.AddTags( downloaded_tags )

elif status == CC.STATUS_UNKNOWN:

( os_file_handle, temp_path ) = ClientPaths.GetTempPath()

try:

with self._lock:

self._current_action = 'downloading file'

if self._tag_import_options.WorthFetchingTags():

downloaded_tags = gallery.GetFileAndTags( temp_path, url )

file_seed.AddTags( downloaded_tags )

else:

gallery.GetFile( temp_path, url )

file_seed.CheckPreFetchMetadata( self._tag_import_options )

with self._lock:

self._current_action = 'importing file'

file_seed.Import( temp_path, self._file_import_options )

did_substantial_work = True

finally:

HydrusPaths.CleanUpTempPath( os_file_handle, temp_path )

did_substantial_work = file_seed.WriteContentUpdates( self._tag_import_options )

with self._lock:

should_present = self._publish_to_page and file_seed.ShouldPresent( self._file_import_options )

page_key = self._page_key

if should_present:

file_seed.PresentToPage( page_key )

did_substantial_work = True

did_substantial_work = True

except HydrusExceptions.VetoException as e:
@@ -428,225 +313,70 @@ class GalleryImport( HydrusSerialisable.SerialisableBase ):

self._gallery_status = 'checking next page'

if gallery_seed.WorksInNewSystem():
def file_seeds_callable( file_seeds ):

def file_seeds_callable( file_seeds ):
if self._file_limit is None:

if self._file_limit is None:

max_new_urls_allowed = None

else:

max_new_urls_allowed = self._file_limit - self._num_new_urls_found

max_new_urls_allowed = None

return ClientImporting.UpdateFileSeedCacheWithFileSeeds( self._file_seed_cache, file_seeds, max_new_urls_allowed )
else:

max_new_urls_allowed = self._file_limit - self._num_new_urls_found

def status_hook( text ):
return ClientImporting.UpdateFileSeedCacheWithFileSeeds( self._file_seed_cache, file_seeds, max_new_urls_allowed )

def status_hook( text ):

with self._lock:

with self._lock:

self._gallery_status = text

self._gallery_status = text

def title_hook( text ):

def title_hook( text ):

return

try:

( num_urls_added, num_urls_already_in_file_seed_cache, num_urls_total, result_404, added_new_gallery_pages, stop_reason ) = gallery_seed.WorkOnURL( 'download page', self._gallery_seed_log, file_seeds_callable, status_hook, title_hook, self._NetworkJobFactory, self._GalleryNetworkJobPresentationContextFactory, self._file_import_options )

self._num_new_urls_found += num_urls_added
self._num_urls_found += num_urls_total

if num_urls_added > 0:

return
ClientImporting.WakeRepeatingJob( self._files_repeating_job )

try:

self._current_page_index += 1

except HydrusExceptions.NetworkException as e:

with self._lock:

( num_urls_added, num_urls_already_in_file_seed_cache, num_urls_total, result_404, can_add_more_file_urls, stop_reason ) = gallery_seed.WorkOnURL( 'download page', self._gallery_seed_log, file_seeds_callable, status_hook, title_hook, self._NetworkJobFactory, self._GalleryNetworkJobPresentationContextFactory, self._file_import_options )

self._num_new_urls_found += num_urls_added
self._num_urls_found += num_urls_total

if num_urls_added > 0:

ClientImporting.WakeRepeatingJob( self._files_repeating_job )

self._current_page_index += 1

except HydrusExceptions.NetworkException as e:

with self._lock:

self._DelayWork( 4 * 3600, HydrusData.ToUnicode( e ) )

return

except Exception as e:

gallery_seed_status = CC.STATUS_ERROR
gallery_seed_note = HydrusData.ToUnicode( e )

gallery_seed.SetStatus( gallery_seed_status, note = gallery_seed_note )

HydrusData.PrintException( e )

with self._lock:

self._gallery_paused = True

self._DelayWork( 4 * 3600, HydrusData.ToUnicode( e ) )

else:

return

def network_job_factory( method, url, **kwargs ):

network_job = ClientNetworkingJobs.NetworkJobDownloader( self._gallery_import_key, method, url, **kwargs )

network_job.SetGalleryToken( 'download page' )

network_job.OverrideBandwidth( 30 )

with self._lock:

self._gallery_network_job = network_job

return network_job

except Exception as e:

try:

gallery = ClientDownloading.GetGallery( self._gallery_identifier )

except Exception as e:

HydrusData.PrintException( e )

with self._lock:

self._files_paused = True
self._gallery_paused = True

HydrusData.ShowText( 'A downloader could not load its gallery! It has been paused and the full error has been written to the log!' )

return

gallery.SetNetworkJobFactory( network_job_factory )

num_already_in_file_seed_cache = 0
new_file_seeds = []

try:

gallery_url = gallery_seed.url

( page_of_file_seeds, definitely_no_more_pages ) = gallery.GetPage( gallery_url )

# do files

for file_seed in page_of_file_seeds:

self._num_urls_found += 1

if self._file_seed_cache.HasFileSeed( file_seed ):

num_already_in_file_seed_cache += 1

else:

with self._lock:

if self._AmOverFileLimit():

self._gallery_paused = True

break

new_file_seeds.append( file_seed )

self._num_new_urls_found += 1

num_urls_added = self._file_seed_cache.AddFileSeeds( new_file_seeds )

# do gallery pages

with self._lock:

no_urls_found = len( page_of_file_seeds ) == 0

no_new_urls = len( new_file_seeds ) == 0

am_over_limit = self._AmOverFileLimit()

if definitely_no_more_pages or no_urls_found or no_new_urls or am_over_limit:

pass # dead search

else:

self._current_page_index += 1

self._AddSearchPage( self._current_page_index )

# report and finish up

status = self._query + ': ' + HydrusData.ToHumanInt( len( new_file_seeds ) ) + ' new urls found'

if num_already_in_file_seed_cache > 0:

status += ' (' + HydrusData.ToHumanInt( num_already_in_file_seed_cache ) + ' of last page already in queue)'

if am_over_limit:

status += ' - hit file limit'

gallery_seed_status = CC.STATUS_SUCCESSFUL_AND_NEW
gallery_seed_note = status

if len( new_file_seeds ) > 0:

ClientImporting.WakeRepeatingJob( self._files_repeating_job )

except Exception as e:

if isinstance( e, HydrusExceptions.NotFoundException ):

text = 'gallery 404'

gallery_seed_status = CC.STATUS_VETOED
gallery_seed_note = text

else:

text = HydrusData.ToUnicode( e )

gallery_seed_status = CC.STATUS_ERROR
gallery_seed_note = text

HydrusData.DebugPrint( traceback.format_exc() )

finally:

with self._lock:

self._gallery_network_job = None

gallery_seed_status = CC.STATUS_ERROR
gallery_seed_note = HydrusData.ToUnicode( e )

gallery_seed.SetStatus( gallery_seed_status, note = gallery_seed_note )

HydrusData.PrintException( e )

with self._lock:

self._gallery_paused = True

self._gallery_seed_log.NotifyGallerySeedsUpdated( ( gallery_seed, ) )
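The legacy gallery loop above decides whether to queue another search page from four conditions (no more pages, nothing found, nothing new, over the file limit). That decision can be sketched as a small pure function (the parameter names are hypothetical stand-ins for the instance state):

```python
# Sketch of the 'keep paging?' decision from the legacy gallery loop above.
# Plain arguments stand in for the downloader's instance state.
def should_queue_next_page( definitely_no_more_pages, num_urls_on_page, num_new_urls, am_over_limit ):

    no_urls_found = num_urls_on_page == 0
    no_new_urls = num_new_urls == 0

    # any of these means the search is dead or capped: stop paging
    if definitely_no_more_pages or no_urls_found or no_new_urls or am_over_limit:

        return False

    return True
```

Treating "a page with zero new URLs" as a stop condition is what keeps a repeating search from walking forever through pages it has already seen.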
@@ -732,14 +462,6 @@ class GalleryImport( HydrusSerialisable.SerialisableBase ):

def GetGalleryIdentifier( self ):

with self._lock:

return self._gallery_identifier

def GetGalleryImportKey( self ):

with self._lock:
@@ -796,6 +518,14 @@ class GalleryImport( HydrusSerialisable.SerialisableBase ):

def GetSourceName( self ):

with self._lock:

return self._source_name

def GetStatus( self ):

with self._lock:
@@ -820,14 +550,6 @@ class GalleryImport( HydrusSerialisable.SerialisableBase ):

def InitialiseFirstSearchPage( self ):

with self._lock:

self._AddSearchPage( 0 )

def NotifyFileSeedsUpdated( self, file_seed_cache_key, file_seeds ):

if file_seed_cache_key == self._file_seed_cache.GetFileSeedCacheKey():
@@ -1032,13 +754,13 @@ class MultipleGalleryImport( HydrusSerialisable.SerialisableBase ):

SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_MULTIPLE_GALLERY_IMPORT
SERIALISABLE_NAME = 'Multiple Gallery Import'
SERIALISABLE_VERSION = 4
SERIALISABLE_VERSION = 5

def __init__( self, gallery_identifier = None ):
def __init__( self, gug_key_and_name = None ):

if gallery_identifier is None:
if gug_key_and_name is None:

gallery_identifier = ClientDownloading.GalleryIdentifier( HC.SITE_TYPE_DEVIANT_ART )
gug_key_and_name = ( HydrusData.GenerateKey(), 'unknown source' )

HydrusSerialisable.SerialisableBase.__init__( self )
@@ -1047,7 +769,7 @@ class MultipleGalleryImport( HydrusSerialisable.SerialisableBase ):

self._page_key = 'initialising page key'

self._gallery_identifier = gallery_identifier
self._gug_key_and_name = gug_key_and_name

self._highlighted_gallery_import_key = None
@@ -1095,7 +817,9 @@ class MultipleGalleryImport( HydrusSerialisable.SerialisableBase ):

def _GetSerialisableInfo( self ):

serialisable_gallery_identifier = self._gallery_identifier.GetSerialisableTuple()
( gug_key, gug_name ) = self._gug_key_and_name

serialisable_gug_key_and_name = ( gug_key.encode( 'hex' ), gug_name )

if self._highlighted_gallery_import_key is None:
@@ -1111,14 +835,16 @@ class MultipleGalleryImport( HydrusSerialisable.SerialisableBase ):

serialisable_gallery_imports = self._gallery_imports.GetSerialisableTuple()

return ( serialisable_gallery_identifier, serialisable_highlighted_gallery_import_key, self._file_limit, serialisable_file_import_options, serialisable_tag_import_options, serialisable_gallery_imports )
return ( serialisable_gug_key_and_name, serialisable_highlighted_gallery_import_key, self._file_limit, serialisable_file_import_options, serialisable_tag_import_options, serialisable_gallery_imports )

def _InitialiseFromSerialisableInfo( self, serialisable_info ):

( serialisable_gallery_identifier, serialisable_highlighted_gallery_import_key, self._file_limit, serialisable_file_import_options, serialisable_tag_import_options, serialisable_gallery_imports ) = serialisable_info
( serialisable_gug_key_and_name, serialisable_highlighted_gallery_import_key, self._file_limit, serialisable_file_import_options, serialisable_tag_import_options, serialisable_gallery_imports ) = serialisable_info

self._gallery_identifier = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_gallery_identifier )
( serialisable_gug_key, gug_name ) = serialisable_gug_key_and_name

self._gug_key_and_name = ( serialisable_gug_key.decode( 'hex' ), gug_name )

if serialisable_highlighted_gallery_import_key is None:
@@ -1207,8 +933,6 @@ class MultipleGalleryImport( HydrusSerialisable.SerialisableBase ):

gallery_imports = HydrusSerialisable.SerialisableList()

gallery_identifier = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_gallery_identifier )

file_import_options = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_file_import_options )
tag_import_options = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_tag_import_options )
@@ -1219,7 +943,7 @@ class MultipleGalleryImport( HydrusSerialisable.SerialisableBase ):

current_query = 'queue brought from old page'

gallery_import = GalleryImport( current_query, gallery_identifier )
gallery_import = GalleryImport( query = current_query, source_name = 'updated from old system', initial_search_urls = [] )

gallery_import.PausePlayGallery()
gallery_import.PausePlayFiles()
@@ -1235,26 +959,6 @@ class MultipleGalleryImport( HydrusSerialisable.SerialisableBase ):

gallery_imports.append( gallery_import )

for query in pending_queries:

pq_gallery_identifiers = ClientDownloading.GetGalleryStreamIdentifiers( gallery_identifier )

for pq_gallery_identifier in pq_gallery_identifiers:

gallery_import = GalleryImport( 'updated stub: ' + query + ' (will not run, please re-queue)', gallery_identifier )

gallery_import.PausePlayGallery()
gallery_import.PausePlayFiles()

gallery_import.SetFileLimit( file_limit )

gallery_import.SetFileImportOptions( file_import_options )
gallery_import.SetTagImportOptions( tag_import_options )

gallery_imports.append( gallery_import )

serialisable_gallery_imports = gallery_imports.GetSerialisableTuple()

new_serialisable_info = ( serialisable_gallery_identifier, serialisable_highlighted_gallery_import_key, file_limit, serialisable_file_import_options, serialisable_tag_import_options, serialisable_gallery_imports )
@@ -1262,6 +966,21 @@ class MultipleGalleryImport( HydrusSerialisable.SerialisableBase ):

return ( 4, new_serialisable_info )

if version == 4:

( serialisable_gallery_identifier, serialisable_highlighted_gallery_import_key, file_limit, serialisable_file_import_options, serialisable_tag_import_options, serialisable_gallery_imports ) = old_serialisable_info

gallery_identifier = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_gallery_identifier )

( gug_key, gug_name ) = ClientDownloading.ConvertGalleryIdentifierToGUGKeyAndName( gallery_identifier )

serialisable_gug_key_and_name = ( HydrusData.GenerateKey().encode( 'hex' ), gug_name )

new_serialisable_info = ( serialisable_gug_key_and_name, serialisable_highlighted_gallery_import_key, file_limit, serialisable_file_import_options, serialisable_tag_import_options, serialisable_gallery_imports )

return ( 5, new_serialisable_info )

def CurrentlyWorking( self ):
@@ -1287,14 +1006,6 @@ class MultipleGalleryImport( HydrusSerialisable.SerialisableBase ):

def GetGalleryIdentifier( self ):

with self._lock:

return self._gallery_identifier

def GetGalleryImports( self ):

with self._lock:
@@ -1303,6 +1014,14 @@ class MultipleGalleryImport( HydrusSerialisable.SerialisableBase ):

def GetGUGKeyAndName( self ):

with self._lock:

return self._gug_key_and_name

def GetHighlightedGalleryImport( self ):

with self._lock:
@ -1321,6 +1040,11 @@ class MultipleGalleryImport( HydrusSerialisable.SerialisableBase ):
|
|||
|
||||
|
||||
|
||||
def GetInitialSearchText( self ):
|
||||
|
||||
return HG.client_controller.network_engine.domain_manager.GetInitialSearchText( self._gug_key_and_name )
|
||||
|
||||
|
||||
def GetLastTimeImportsChanged( self ):
|
||||
|
||||
with self._lock:
|
||||
|
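The version 4→5 hunk above follows the client's versioned-serialisation pattern: a stored object carries a version number, and on load each update step migrates the old tuple one version forward until it reaches the class's current `SERIALISABLE_VERSION`. A minimal, hypothetical sketch of that chain (standalone class, not the real `HydrusSerialisable` API; written in Python 3, where the original's `key.encode( 'hex' )` becomes `bytes.hex()`):

```python
import os

class VersionedSub:

    SERIALISABLE_VERSION = 5

    def _update( self, version, old_info ):

        if version == 4:

            # v4 stored a bare gallery name; v5 stores a ( gug_key, gug_name ) pair,
            # with a freshly generated key so the name-based fallback still works
            ( gallery_name, file_limit ) = old_info

            gug_key_and_name = ( os.urandom( 32 ).hex(), gallery_name )

            return ( 5, ( gug_key_and_name, file_limit ) )

        raise ValueError( 'no update path from version {}'.format( version ) )

    def load( self, serialised ):

        # walk the update chain until the stored tuple is current
        ( version, info ) = serialised

        while version < self.SERIALISABLE_VERSION:

            ( version, info ) = self._update( version, info )

        return info
```

Each `if version == n:` branch only has to know about the step immediately before it, so very old objects are upgraded by running every step in sequence.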
@@ -1380,43 +1104,51 @@ class MultipleGalleryImport( HydrusSerialisable.SerialisableBase ):

    def PendQuery( self, query ):
    def PendQuery( self, query_text ):

        created_import = None

        with self._lock:

            gallery_identifiers = ClientDownloading.GetGalleryStreamIdentifiers( self._gallery_identifier )
            gug = HG.client_controller.network_engine.domain_manager.GetGUG( self._gug_key_and_name )

            for gallery_identifier in gallery_identifiers:
            if gug is None:

                gallery_import = GalleryImport( query, gallery_identifier )
                HydrusData.ShowText( 'Could not find a Gallery URL Generator for "' + self._gug_key_and_name[1] + '"!' )

                gallery_import.SetFileLimit( self._file_limit )
                return None

                gallery_import.SetFileImportOptions( self._file_import_options )
                gallery_import.SetTagImportOptions( self._tag_import_options )

            self._gug_key_and_name = gug.GetGUGKeyAndName() # just a refresher, to keep up with any changes

            initial_search_urls = gug.GenerateGalleryURLs( query_text )

            if len( initial_search_urls ) == 0:

                gallery_import.InitialiseFirstSearchPage()
                HydrusData.ShowText( 'The Gallery URL Generator "' + self._gug_key_and_name[1] + '" did not produce any URLs!' )

                publish_to_page = False

                gallery_import.Start( self._page_key, publish_to_page )

                self._AddGalleryImport( gallery_import )

                if created_import is None:

                    created_import = gallery_import

                return None

            gallery_import = GalleryImport( query = query_text, source_name = self._gug_key_and_name[1], initial_search_urls = initial_search_urls )

            gallery_import.SetFileLimit( self._file_limit )

            gallery_import.SetFileImportOptions( self._file_import_options )
            gallery_import.SetTagImportOptions( self._tag_import_options )

            publish_to_page = False

            gallery_import.Start( self._page_key, publish_to_page )

            self._AddGalleryImport( gallery_import )

            ClientImporting.WakeRepeatingJob( self._importers_repeating_job )

            self._SetDirty()

        return created_import
        return gallery_import

    def RemoveGalleryImport( self, gallery_import_key ):

@@ -1445,11 +1177,11 @@ class MultipleGalleryImport( HydrusSerialisable.SerialisableBase ):

    def SetGalleryIdentifier( self, gallery_identifier ):
    def SetGUGKeyAndName( self, gug_key_and_name ):

        with self._lock:

            self._gallery_identifier = gallery_identifier
            self._gug_key_and_name = gug_key_and_name
@@ -178,7 +178,7 @@ class GallerySeed( HydrusSerialisable.SerialisableBase ):

        num_urls_already_in_file_seed_cache = 0
        num_urls_total = 0
        result_404 = False
        can_add_more_file_urls = False
        added_new_gallery_pages = False
        stop_reason = ''

        try:

@@ -246,7 +246,7 @@ class GallerySeed( HydrusSerialisable.SerialisableBase ):

            num_urls_total = len( file_seeds )

            ( num_urls_added, num_urls_already_in_file_seed_cache, can_add_more_file_urls, stop_reason ) = file_seeds_callable( file_seeds )
            ( num_urls_added, num_urls_already_in_file_seed_cache, can_search_for_more_files, stop_reason ) = file_seeds_callable( file_seeds )

            status = CC.STATUS_SUCCESSFUL_AND_NEW

@@ -257,13 +257,13 @@ class GallerySeed( HydrusSerialisable.SerialisableBase ):

                note += ' (' + HydrusData.ToHumanInt( num_urls_already_in_file_seed_cache ) + ' of page already in)'

            if not can_add_more_file_urls:
            if not can_search_for_more_files:

                note += ' - ' + stop_reason

            # only keep searching if we found any files, otherwise this could be a blank results page with another stub page

            can_add_more_gallery_urls = num_urls_total > 0 and can_add_more_file_urls
            can_add_more_gallery_urls = num_urls_total > 0 and can_search_for_more_files

            if self._can_generate_more_pages and can_add_more_gallery_urls:

@@ -322,6 +322,8 @@ class GallerySeed( HydrusSerialisable.SerialisableBase ):

                    gallery_seed_log.AddGallerySeeds( next_gallery_seeds )

                    added_new_gallery_pages = True

                    gallery_urls_seen_before.update( new_next_page_urls )

                    if num_dupe_next_page_urls == 0:

@@ -405,7 +407,7 @@ class GallerySeed( HydrusSerialisable.SerialisableBase ):

        gallery_seed_log.NotifyGallerySeedsUpdated( ( self, ) )

        return ( num_urls_added, num_urls_already_in_file_seed_cache, num_urls_total, result_404, can_add_more_file_urls, stop_reason )
        return ( num_urls_added, num_urls_already_in_file_seed_cache, num_urls_total, result_404, added_new_gallery_pages, stop_reason )

HydrusSerialisable.SERIALISABLE_TYPES_TO_OBJECT_TYPES[ HydrusSerialisable.SERIALISABLE_TYPE_GALLERY_SEED ] = GallerySeed
@@ -22,15 +22,18 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):

    SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_SUBSCRIPTION
    SERIALISABLE_NAME = 'Subscription'
    SERIALISABLE_VERSION = 7
    SERIALISABLE_VERSION = 8

    def __init__( self, name ):
    def __init__( self, name, gug_key_and_name = None ):

        HydrusSerialisable.SerialisableBaseNamed.__init__( self, name )

        self._gallery_identifier = ClientDownloading.GalleryIdentifier( HC.SITE_TYPE_DEVIANT_ART )
        if gug_key_and_name is None:

            gug_key_and_name = ( HydrusData.GenerateKey(), 'unknown source' )

        self._gallery_stream_identifiers = ClientDownloading.GetGalleryStreamIdentifiers( self._gallery_identifier )
        self._gug_key_and_name = gug_key_and_name

        self._queries = []

@@ -119,22 +122,24 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):

    def _GetSerialisableInfo( self ):

        serialisable_gallery_identifier = self._gallery_identifier.GetSerialisableTuple()
        serialisable_gallery_stream_identifiers = [ gallery_stream_identifier.GetSerialisableTuple() for gallery_stream_identifier in self._gallery_stream_identifiers ]
        ( gug_key, gug_name ) = self._gug_key_and_name

        serialisable_gug_key_and_name = ( gug_key.encode( 'hex' ), gug_name )
        serialisable_queries = [ query.GetSerialisableTuple() for query in self._queries ]
        serialisable_checker_options = self._checker_options.GetSerialisableTuple()
        serialisable_file_import_options = self._file_import_options.GetSerialisableTuple()
        serialisable_tag_import_options = self._tag_import_options.GetSerialisableTuple()

        return ( serialisable_gallery_identifier, serialisable_gallery_stream_identifiers, serialisable_queries, serialisable_checker_options, self._initial_file_limit, self._periodic_file_limit, self._paused, serialisable_file_import_options, serialisable_tag_import_options, self._no_work_until, self._no_work_until_reason, self._publish_files_to_popup_button, self._publish_files_to_page, self._merge_query_publish_events )
        return ( serialisable_gug_key_and_name, serialisable_queries, serialisable_checker_options, self._initial_file_limit, self._periodic_file_limit, self._paused, serialisable_file_import_options, serialisable_tag_import_options, self._no_work_until, self._no_work_until_reason, self._publish_files_to_popup_button, self._publish_files_to_page, self._merge_query_publish_events )

    def _InitialiseFromSerialisableInfo( self, serialisable_info ):

        ( serialisable_gallery_identifier, serialisable_gallery_stream_identifiers, serialisable_queries, serialisable_checker_options, self._initial_file_limit, self._periodic_file_limit, self._paused, serialisable_file_import_options, serialisable_tag_import_options, self._no_work_until, self._no_work_until_reason, self._publish_files_to_popup_button, self._publish_files_to_page, self._merge_query_publish_events ) = serialisable_info
        ( serialisable_gug_key_and_name, serialisable_queries, serialisable_checker_options, self._initial_file_limit, self._periodic_file_limit, self._paused, serialisable_file_import_options, serialisable_tag_import_options, self._no_work_until, self._no_work_until_reason, self._publish_files_to_popup_button, self._publish_files_to_page, self._merge_query_publish_events ) = serialisable_info

        self._gallery_identifier = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_gallery_identifier )
        self._gallery_stream_identifiers = [ HydrusSerialisable.CreateFromSerialisableTuple( serialisable_gallery_stream_identifier ) for serialisable_gallery_stream_identifier in serialisable_gallery_stream_identifiers ]
        ( serialisable_gug_key, gug_name ) = serialisable_gug_key_and_name

        self._gug_key_and_name = ( serialisable_gug_key.decode( 'hex' ), gug_name )
        self._queries = [ HydrusSerialisable.CreateFromSerialisableTuple( serialisable_query ) for serialisable_query in serialisable_queries ]
        self._checker_options = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_checker_options )
        self._file_import_options = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_file_import_options )

@@ -276,26 +281,24 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):

            return ( 7, new_serialisable_info )

        if version == 7:

            ( serialisable_gallery_identifier, serialisable_gallery_stream_identifiers, serialisable_queries, serialisable_checker_options, initial_file_limit, periodic_file_limit, paused, serialisable_file_import_options, serialisable_tag_import_options, no_work_until, no_work_until_reason, publish_files_to_popup_button, publish_files_to_page, merge_query_publish_events ) = old_serialisable_info

            gallery_identifier = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_gallery_identifier )

            ( gug_key, gug_name ) = ClientDownloading.ConvertGalleryIdentifierToGUGKeyAndName( gallery_identifier )

            serialisable_gug_key_and_name = ( gug_key.encode( 'hex' ), gug_name )

            new_serialisable_info = ( serialisable_gug_key_and_name, serialisable_queries, serialisable_checker_options, initial_file_limit, periodic_file_limit, paused, serialisable_file_import_options, serialisable_tag_import_options, no_work_until, no_work_until_reason, publish_files_to_popup_button, publish_files_to_page, merge_query_publish_events )

            return ( 8, new_serialisable_info )

    def _WorkOnFiles( self, job_key ):

        try:

            gallery = ClientDownloading.GetGallery( self._gallery_identifier )

        except Exception as e:

            HydrusData.PrintException( e )

            self._DelayWork( HC.UPDATE_DURATION, 'gallery would not load' )

            self._paused = True

            HydrusData.ShowText( 'The subscription ' + self._name + ' could not load its gallery! It has been paused and the full error has been written to the log!' )

            return

        error_count = 0

        all_presentation_hashes = []
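The changelog describes the new 'key first, name later' identification method that the `( gug_key, gug_name )` pairs above serialise: lookups first try the stable internal key (which survives renames), then fall back to the human-readable name (which survives same-name overwrites). A hypothetical standalone sketch of that lookup (the real `domain_manager.GetGUG` is more involved; the dict-based `gugs` list here is an assumption for illustration):

```python
def get_gug( gugs, gug_key_and_name ):

    ( gug_key, gug_name ) = gug_key_and_name

    # key first: the internal identifier survives renames
    for gug in gugs:

        if gug[ 'key' ] == gug_key:

            return gug

    # name later: the simple name survives same-name overwrites
    for gug in gugs:

        if gug[ 'name' ] == gug_name:

            return gug

    # no match at all--the caller shows an error and/or pauses the sub
    return None
```

Because both halves of the pair are stored, downloaders and subscriptions generally survive gug renames and same-name overwrites, as the changelog notes.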
@@ -310,19 +313,6 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):

            query_text = query.GetQueryText()
            file_seed_cache = query.GetFileSeedCache()

            def network_job_factory( method, url, **kwargs ):

                network_job = ClientNetworkingJobs.NetworkJobSubscription( self._GetNetworkJobSubscriptionKey( query ), method, url, **kwargs )

                network_job.OverrideBandwidth( 30 )

                job_key.SetVariable( 'popup_network_job', network_job )

                return network_job

            gallery.SetNetworkJobFactory( network_job_factory )

            text_1 = 'downloading files'
            query_summary_name = self._name

@@ -384,106 +374,30 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):

                job_key.SetVariable( 'popup_gauge_2', ( num_done, num_urls ) )

                if file_seed.WorksInNewSystem():
                def status_hook( text ):
                def status_hook( text ):

                    job_key.SetVariable( 'popup_text_2', x_out_of_y + text )
                    job_key.SetVariable( 'popup_text_2', x_out_of_y + text )

                file_seed.WorkOnURL( file_seed_cache, status_hook, self._GenerateNetworkJobFactory( query ), ClientImporting.GenerateMultiplePopupNetworkJobPresentationContextFactory( job_key ), self._file_import_options, self._tag_import_options )
                file_seed.WorkOnURL( file_seed_cache, status_hook, self._GenerateNetworkJobFactory( query ), ClientImporting.GenerateMultiplePopupNetworkJobPresentationContextFactory( job_key ), self._file_import_options, self._tag_import_options )

                if file_seed.ShouldPresent( self._file_import_options ):
                if file_seed.ShouldPresent( self._file_import_options ):

                    hash = file_seed.GetHash()

                    if hash not in presentation_hashes_fast:

                        if hash not in all_presentation_hashes_fast:

                            all_presentation_hashes.append( hash )
                            all_presentation_hashes_fast.add( hash )

                        presentation_hashes.append( hash )
                        presentation_hashes_fast.add( hash )

                    hash = file_seed.GetHash()

                else:

                    job_key.SetVariable( 'popup_text_2', x_out_of_y + 'checking url status' )

                    ( should_download_metadata, should_download_file ) = file_seed.PredictPreImportStatus( self._file_import_options, self._tag_import_options )

                    status = file_seed.status
                    url = file_seed.file_seed_data

                    if status == CC.STATUS_SUCCESSFUL_BUT_REDUNDANT:
                    if hash not in presentation_hashes_fast:

                        if self._tag_import_options.ShouldFetchTagsEvenIfURLKnownAndFileAlreadyInDB() and self._tag_import_options.WorthFetchingTags():
                        if hash not in all_presentation_hashes_fast:

                            job_key.SetVariable( 'popup_text_2', x_out_of_y + 'found file in db, fetching tags' )
                            all_presentation_hashes.append( hash )

                            downloaded_tags = gallery.GetTags( url )

                            file_seed.AddTags( downloaded_tags )
                            all_presentation_hashes_fast.add( hash )

                    elif status == CC.STATUS_UNKNOWN:
                        presentation_hashes.append( hash )

                        ( os_file_handle, temp_path ) = ClientPaths.GetTempPath()
                        presentation_hashes_fast.add( hash )

                        try:

                            job_key.SetVariable( 'popup_text_2', x_out_of_y + 'downloading file' )

                            if self._tag_import_options.WorthFetchingTags():

                                downloaded_tags = gallery.GetFileAndTags( temp_path, url )

                                file_seed.AddTags( downloaded_tags )

                            else:

                                gallery.GetFile( temp_path, url )

                            file_seed.CheckPreFetchMetadata( self._tag_import_options )

                            job_key.SetVariable( 'popup_text_2', x_out_of_y + 'importing file' )

                            file_seed.Import( temp_path, self._file_import_options )

                            hash = file_seed.GetHash()

                            if hash not in presentation_hashes_fast:
                            if file_seed.ShouldPresent( self._file_import_options ):

                                if hash not in all_presentation_hashes_fast:

                                    all_presentation_hashes.append( hash )
                                    all_presentation_hashes_fast.add( hash )

                                presentation_hashes.append( hash )
                                presentation_hashes_fast.add( hash )

                        finally:

                            HydrusPaths.CleanUpTempPath( os_file_handle, temp_path )

                    file_seed.WriteContentUpdates( self._tag_import_options )

            except HydrusExceptions.CancelledException as e:
@@ -585,6 +499,28 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):

        have_made_an_initial_sync_bandwidth_notification = False

        gug = HG.client_controller.network_engine.domain_manager.GetGUG( self._gug_key_and_name )

        if gug is None:

            self._paused = True

            HydrusData.ShowText( 'The subscription "' + self._name + '" could not find a Gallery URL Generator for "' + self._gug_key_and_name[1] + '"! The sub has paused!' )

            return

        if not gug.IsFunctional():

            self._paused = True

            HydrusData.ShowText( 'The subscription "' + self._name + '"\'s Gallery URL Generator, "' + self._gug_key_and_name[1] + '" seems not to be functional! Maybe it needs a gallery url class or a gallery parser? The sub has paused!' )

            return

        self._gug_key_and_name = gug.GetGUGKeyAndName() # just a refresher, to keep up with any changes

        queries = self._GetQueriesForProcessing()

        for query in queries:

@@ -601,8 +537,6 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):

                continue

            done_first_page = False

            query_text = query.GetQueryText()
            file_seed_cache = query.GetFileSeedCache()
            gallery_seed_log = query.GetGallerySeedLog()

@@ -624,6 +558,8 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):

            file_seeds_to_add = set()
            file_seeds_to_add_ordered = []

            stop_reason = 'unknown stop reason'

            prefix = 'synchronising'

            if query_text != self._name:

@@ -633,44 +569,50 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):

            job_key.SetVariable( 'popup_text_1', prefix )

            for gallery_stream_identifier in self._gallery_stream_identifiers:
            initial_search_urls = gug.GenerateGalleryURLs( query_text )

            if len( initial_search_urls ) == 0:

                if file_limit_for_this_sync is not None and total_new_urls_for_this_sync >= file_limit_for_this_sync:

                    break

                self._paused = True

                p1 = HC.options[ 'pause_subs_sync' ]
                p2 = job_key.IsCancelled()
                p3 = HG.view_shutdown
                HydrusData.ShowText( 'The subscription "' + self._name + '"\'s Gallery URL Generator, "' + self._gug_key_and_name[1] + '" did not generate any URLs! The sub has paused!' )

                if p1 or p2 or p3:

                    break

                return

                try:

                    gallery = ClientDownloading.GetGallery( gallery_stream_identifier )

                except Exception as e:

                    HydrusData.PrintException( e )

                    self._DelayWork( HC.UPDATE_DURATION, 'gallery would not load' )

                    self._paused = True

                    HydrusData.ShowText( 'The subscription ' + self._name + ' could not load its gallery! It has been paused and the full error has been written to the log!' )

                    return

            gallery_seeds = [ ClientImportGallerySeeds.GallerySeed( url, can_generate_more_pages = True ) for url in initial_search_urls ]

            gallery_seed_log.AddGallerySeeds( gallery_seeds )

            try:

                first_gallery_url = gallery.GetGalleryPageURL( query_text, 0 )

                gallery_seed = ClientImportGallerySeeds.GallerySeed( first_gallery_url, can_generate_more_pages = True )

                if gallery_seed.WorksInNewSystem():
                while gallery_seed_log.WorkToDo():

                    p1 = HC.options[ 'pause_subs_sync' ]
                    p2 = HG.view_shutdown

                    if p1 or p2:

                        return

                    if job_key.IsCancelled():

                        stop_reason = 'gallery parsing cancelled, likely by user'

                        self._DelayWork( 600, stop_reason )

                        return

                    gallery_seed = gallery_seed_log.GetNextGallerySeed( CC.STATUS_UNKNOWN )

                    if gallery_seed is None:

                        stop_reason = 'thought there was a page to check, but apparently there was not!'

                        break

                    def status_hook( text ):
@@ -682,301 +624,130 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):

                pass

            gallery_seed_log.AddGallerySeeds( ( gallery_seed, ) )

            num_existing_urls_this_stream = 0

            stop_reason = 'unknown stop reason'

            keep_checking = True

            try:
            def file_seeds_callable( file_seeds ):

                while keep_checking and gallery_seed_log.WorkToDo():
                num_urls_added = 0
                num_urls_already_in_file_seed_cache = 0
                can_search_for_more_files = True
                stop_reason = 'unknown stop reason'

                for file_seed in file_seeds:

                    p1 = HC.options[ 'pause_subs_sync' ]
                    p2 = HG.view_shutdown

                    if p1 or p2:
                    if file_seed in file_seeds_to_add:

                        return
                        # this catches the occasional overflow when a new file is uploaded while gallery parsing is going on
                        # we don't want to count these 'seen before this run' urls in the 'caught up to last time' count

                        continue

                    if job_key.IsCancelled():
                    if file_seed_cache.HasFileSeed( file_seed ):

                        stop_reason = 'gallery parsing cancelled, likely by user'
                        num_urls_already_in_file_seed_cache += 1

                        self._DelayWork( 600, stop_reason )
                        WE_HIT_OLD_GROUND_THRESHOLD = 5

                        return

                    gallery_seed = gallery_seed_log.GetNextGallerySeed( CC.STATUS_UNKNOWN )

                    if gallery_seed is None:

                        stop_reason = 'thought there was a page to check, but apparently there was not!'

                        break

                    job_key.SetVariable( 'popup_text_1', prefix + ': found ' + HydrusData.ToHumanInt( total_new_urls_for_this_sync ) + ' new urls, checking next page' )

                    def file_seeds_callable( file_seeds ):

                        num_urls_added = 0
                        num_urls_already_in_file_seed_cache = 0
                        can_add_more_file_urls = True
                        stop_reason = 'no known stop reason'

                        for file_seed in file_seeds:
                        if num_urls_already_in_file_seed_cache >= WE_HIT_OLD_GROUND_THRESHOLD:

                            if file_limit_for_this_sync is not None and total_new_urls_for_this_sync + num_urls_added >= file_limit_for_this_sync:

                                if this_is_initial_sync:

                                    stop_reason = 'hit initial file limit'

                                else:

                                    self._ShowHitPeriodicFileLimitMessage( query_text )

                                    stop_reason = 'hit periodic file limit'

                                can_add_more_file_urls = False

                                break

                            # this gallery page has caught up to before, so it should not spawn any more gallery pages

                            if file_seed in file_seeds_to_add:

                                # this catches the occasional overflow when a new file is uploaded while gallery parsing is going on

                                continue

                            can_search_for_more_files = False

                            if file_seed_cache.HasFileSeed( file_seed ):

                                num_urls_already_in_file_seed_cache += 1

                                WE_HIT_OLD_GROUND_THRESHOLD = 5

                                if num_urls_already_in_file_seed_cache >= WE_HIT_OLD_GROUND_THRESHOLD:

                                    can_add_more_file_urls = False

                                    stop_reason = 'saw ' + HydrusData.ToHumanInt( WE_HIT_OLD_GROUND_THRESHOLD ) + ' previously seen urls, so assuming we caught up'

                                    break

                            else:

                                num_urls_added += 1

                                file_seeds_to_add.add( file_seed )
                                file_seeds_to_add_ordered.append( file_seed )

                        return ( num_urls_added, num_urls_already_in_file_seed_cache, can_add_more_file_urls, stop_reason )

                    try:

                        ( num_urls_added, num_urls_already_in_file_seed_cache, num_urls_total, result_404, can_add_more_file_urls, stop_reason ) = gallery_seed.WorkOnURL( 'subscription', gallery_seed_log, file_seeds_callable, status_hook, title_hook, self._GenerateNetworkJobFactory( query ), ClientImporting.GenerateMultiplePopupNetworkJobPresentationContextFactory( job_key ), self._file_import_options, gallery_urls_seen_before = gallery_urls_seen_this_sync )

                    except HydrusExceptions.CancelledException as e:

                        stop_reason = 'gallery network job cancelled, likely by user'

                        self._DelayWork( 600, stop_reason )

                        return

                    except Exception as e:

                        stop_reason = HydrusData.ToUnicode( e )

                        raise

                    finally:

                        done_first_page = True

                    keep_checking = can_add_more_file_urls

                    num_existing_urls_this_stream += num_urls_already_in_file_seed_cache

                    WE_HIT_OLD_GROUND_TOTAL_THRESHOLD = 15

                    if num_existing_urls_this_stream >= WE_HIT_OLD_GROUND_TOTAL_THRESHOLD:

                        keep_checking = False
                        stop_reason = 'saw ' + HydrusData.ToHumanInt( WE_HIT_OLD_GROUND_TOTAL_THRESHOLD ) + ' previously seen urls in the whole sync, so assuming we caught up'

                    total_new_urls_for_this_sync += num_urls_added

            finally:

                while gallery_seed_log.WorkToDo():

                    gallery_seed = gallery_seed_log.GetNextGallerySeed( CC.STATUS_UNKNOWN )

                    if gallery_seed is None:

                        break

                    gallery_seed.SetStatus( CC.STATUS_VETOED, note = stop_reason )

            else:

                def network_job_factory( method, url, **kwargs ):

                    network_job = ClientNetworkingJobs.NetworkJobSubscription( self._GetNetworkJobSubscriptionKey( query ), method, url, **kwargs )

                    job_key.SetVariable( 'popup_network_job', network_job )

                    network_job.SetGalleryToken( 'subscription' )

                    network_job.OverrideBandwidth( 30 )

                    return network_job

                gallery.SetNetworkJobFactory( network_job_factory )

                page_index = 0
                num_existing_urls_this_stream = 0
                keep_checking = True

                while keep_checking:

                    new_urls_this_page = 0

                    p1 = HC.options[ 'pause_subs_sync' ]
                    p2 = HG.view_shutdown

                    if p1 or p2:

                        return

                    if job_key.IsCancelled():

                        raise HydrusExceptions.CancelledException( 'gallery parsing cancelled, likely by user' )

                    job_key.SetVariable( 'popup_text_1', prefix + ': found ' + HydrusData.ToHumanInt( total_new_urls_for_this_sync ) + ' new urls, checking next page' )

                    gallery_url = gallery.GetGalleryPageURL( query_text, page_index )

                    try:

                        gallery_seed = ClientImportGallerySeeds.GallerySeed( gallery_url, can_generate_more_pages = False )

                        gallery_seed_log.AddGallerySeeds( ( gallery_seed, ) )

                        ( page_of_file_seeds, definitely_no_more_pages ) = gallery.GetPage( gallery_url )

                        done_first_page = True

                        page_index += 1

                        if definitely_no_more_pages:

                            keep_checking = False

                        for file_seed in page_of_file_seeds:

                            if file_limit_for_this_sync is not None and total_new_urls_for_this_sync >= file_limit_for_this_sync:

                                if not this_is_initial_sync:

                                    self._ShowHitPeriodicFileLimitMessage( query_text )

                                keep_checking = False
                                stop_reason = 'saw ' + HydrusData.ToHumanInt( WE_HIT_OLD_GROUND_THRESHOLD ) + ' previously seen urls, so assuming we caught up'

                                break

                            if file_seed in file_seeds_to_add:

                                # this catches the occasional overflow when a new file is uploaded while gallery parsing is going on

                                continue

                            else:

                            if file_seed_cache.HasFileSeed( file_seed ):
                                num_urls_added += 1

                                file_seeds_to_add.add( file_seed )
                                file_seeds_to_add_ordered.append( file_seed )

                            if file_limit_for_this_sync is not None and total_new_urls_for_this_sync + num_urls_added >= file_limit_for_this_sync:

                                # we have found enough new files this sync, so should stop adding files and new gallery pages

                                if this_is_initial_sync:

                                    num_existing_urls_this_stream += 1

                                    if num_existing_urls_this_stream > 5:

                                        keep_checking = False

                                        break

                                    stop_reason = 'hit initial file limit'

                                else:

                                    file_seeds_to_add.add( file_seed )
                                    file_seeds_to_add_ordered.append( file_seed )
                                    self._ShowHitPeriodicFileLimitMessage( query_text )

                                    new_urls_this_page += 1
                                    total_new_urls_for_this_sync += 1
                                    stop_reason = 'hit periodic file limit'

                        if new_urls_this_page == 0:
                        can_search_for_more_files = False

                            keep_checking = False
                            break

                        gallery_seed_status = CC.STATUS_SUCCESSFUL_AND_NEW
                        gallery_seed_note = 'checked OK - found ' + HydrusData.ToUnicode( new_urls_this_page ) + ' new urls'

                    except HydrusExceptions.CancelledException as e:

                        gallery_seed_status = CC.STATUS_VETOED
                        gallery_seed_note = HydrusData.ToUnicode( e )

                        self._DelayWork( 600, gallery_seed_note )

                        return

                    except HydrusExceptions.NotFoundException:

                        gallery_seed_status = CC.STATUS_VETOED
                        gallery_seed_note = '404'

                        # paheal now 404s when no results, so just naturally break

                        break

                    except Exception as e:

                        gallery_seed_status = CC.STATUS_ERROR
                        gallery_seed_note = HydrusData.ToUnicode( e )

                        raise

                    finally:

                        gallery_seed.SetStatus( gallery_seed_status, note = gallery_seed_note )

                        gallery_seed_log.NotifyGallerySeedsUpdated( ( gallery_seed, ) )

                if num_urls_added == 0:

                    can_search_for_more_files = False
                    stop_reason = 'no new urls found'

                return ( num_urls_added, num_urls_already_in_file_seed_cache, can_search_for_more_files, stop_reason )

            job_key.SetVariable( 'popup_text_1', prefix + ': found ' + HydrusData.ToHumanInt( total_new_urls_for_this_sync ) + ' new urls, checking next page' )

            try:

                ( num_urls_added, num_urls_already_in_file_seed_cache, num_urls_total, result_404, added_new_gallery_pages, stop_reason ) = gallery_seed.WorkOnURL( 'subscription', gallery_seed_log, file_seeds_callable, status_hook, title_hook, self._GenerateNetworkJobFactory( query ), ClientImporting.GenerateMultiplePopupNetworkJobPresentationContextFactory( job_key ), self._file_import_options, gallery_urls_seen_before = gallery_urls_seen_this_sync )

            except HydrusExceptions.CancelledException as e:

                stop_reason = 'gallery network job cancelled, likely by user'

                self._DelayWork( 600, stop_reason )

                return

            except Exception as e:

                stop_reason = HydrusData.ToUnicode( e )

                raise

            total_new_urls_for_this_sync += num_urls_added

            if file_limit_for_this_sync is not None and total_new_urls_for_this_sync >= file_limit_for_this_sync:

                # we have found enough new files this sync, so stop and cancel any outstanding gallery urls

                if this_is_initial_sync:

                    stop_reason = 'hit initial file limit'

                else:

                    stop_reason = 'hit periodic file limit'

                break

            finally:

                while gallery_seed_log.WorkToDo():

                    gallery_seed = gallery_seed_log.GetNextGallerySeed( CC.STATUS_UNKNOWN )

                    if gallery_seed is None:

                        break

                    gallery_seed.SetStatus( CC.STATUS_VETOED, note = stop_reason )
@@ -1106,9 +877,9 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):

        return ( min_estimate, max_estimate )
        
    
-    def GetGalleryIdentifier( self ):
+    def GetGUGKeyAndName( self ):
        
-        return self._gallery_identifier
+        return self._gug_key_and_name
        
    
    def GetQueries( self ):

@@ -1147,7 +918,7 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):

        for subscription in potential_mergee_subscriptions:
            
-            if subscription._gallery_identifier == self._gallery_identifier:
+            if subscription._gug_key_and_name[1] == self._gug_key_and_name[1]:
                
                my_new_queries = [ query.Duplicate() for query in subscription._queries ]

@@ -1260,10 +1031,9 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):

        self._tag_import_options = tag_import_options.Duplicate()
        
    
-    def SetTuple( self, gallery_identifier, gallery_stream_identifiers, queries, checker_options, initial_file_limit, periodic_file_limit, paused, file_import_options, tag_import_options, no_work_until ):
+    def SetTuple( self, gug_key_and_name, queries, checker_options, initial_file_limit, periodic_file_limit, paused, file_import_options, tag_import_options, no_work_until ):
        
-        self._gallery_identifier = gallery_identifier
-        self._gallery_stream_identifiers = gallery_stream_identifiers
+        self._gug_key_and_name = gug_key_and_name
        self._queries = queries
        self._checker_options = checker_options
        self._initial_file_limit = initial_file_limit

@@ -1366,7 +1136,7 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):

    def ToTuple( self ):
        
-        return ( self._name, self._gallery_identifier, self._gallery_stream_identifiers, self._queries, self._checker_options, self._initial_file_limit, self._periodic_file_limit, self._paused, self._file_import_options, self._tag_import_options, self._no_work_until, self._no_work_until_reason )
+        return ( self._name, self._gug_key_and_name, self._queries, self._checker_options, self._initial_file_limit, self._periodic_file_limit, self._paused, self._file_import_options, self._tag_import_options, self._no_work_until, self._no_work_until_reason )
        
    
HydrusSerialisable.SERIALISABLE_TYPES_TO_OBJECT_TYPES[ HydrusSerialisable.SERIALISABLE_TYPE_SUBSCRIPTION ] = Subscription
@@ -602,7 +602,7 @@ class WatcherImport( HydrusSerialisable.SerialisableBase ):

        try:
            
-            ( num_urls_added, num_urls_already_in_file_seed_cache, num_urls_total, result_404, can_add_more_file_urls, stop_reason ) = gallery_seed.WorkOnURL( 'watcher', self._gallery_seed_log, file_seeds_callable, status_hook, title_hook, self._NetworkJobFactory, self._CheckerNetworkJobPresentationContextFactory, self._file_import_options )
+            ( num_urls_added, num_urls_already_in_file_seed_cache, num_urls_total, result_404, added_new_gallery_pages, stop_reason ) = gallery_seed.WorkOnURL( 'watcher', self._gallery_seed_log, file_seeds_callable, status_hook, title_hook, self._NetworkJobFactory, self._CheckerNetworkJobPresentationContextFactory, self._file_import_options )
            
        
        if num_urls_added > 0:
@@ -296,14 +296,14 @@ def UpdateFileSeedCacheWithFileSeeds( file_seed_cache, file_seeds, max_new_urls_

    num_urls_added = 0
    num_urls_already_in_file_seed_cache = 0
-    can_add_more_file_urls = True
+    can_search_for_more_files = True
    stop_reason = ''
    
    for file_seed in file_seeds:
        
        if max_new_urls_allowed is not None and num_urls_added >= max_new_urls_allowed:
            
-            can_add_more_file_urls = False
+            can_search_for_more_files = False
            
            stop_reason = 'hit file limit'

@@ -324,7 +324,7 @@ def UpdateFileSeedCacheWithFileSeeds( file_seed_cache, file_seeds, max_new_urls_

    file_seed_cache.AddFileSeeds( new_file_seeds )
    
-    return ( num_urls_added, num_urls_already_in_file_seed_cache, can_add_more_file_urls, stop_reason )
+    return ( num_urls_added, num_urls_already_in_file_seed_cache, can_search_for_more_files, stop_reason )
    
def WakeRepeatingJob( job ):
@@ -304,7 +304,7 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):

    SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_NETWORK_DOMAIN_MANAGER
    SERIALISABLE_NAME = 'Domain Manager'
-    SERIALISABLE_VERSION = 5
+    SERIALISABLE_VERSION = 6
    
    def __init__( self ):

@@ -319,6 +319,8 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):

        self._parser_namespaces = []
        
+        self._gug_keys_to_display = set()
+        
        self._url_match_keys_to_display = set()
        self._url_match_keys_to_parser_keys = HydrusSerialisable.SerialisableBytesDictionary()

@@ -331,6 +333,9 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):

        self._url_match_keys_to_default_tag_import_options = {}
        
+        self._gug_keys_to_gugs = {}
+        self._gug_names_to_gugs = {}
+        
        self._parser_keys_to_parsers = {}
        
        self._dirty = False

@@ -383,6 +388,24 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):

    
+    def _GetGUG( self, gug_key_and_name ):
+        
+        ( gug_key, gug_name ) = gug_key_and_name
+        
+        if gug_key in self._gug_keys_to_gugs:
+            
+            return self._gug_keys_to_gugs[ gug_key ]
+            
+        elif gug_name in self._gug_names_to_gugs:
+            
+            return self._gug_names_to_gugs[ gug_name ]
+            
+        else:
+            
+            return None
+            
+        
+    
    def _GetNormalisedAPIURLMatchAndURL( self, url ):
        
        url_match = self._GetURLMatch( url )
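The new `_GetGUG` above is the heart of the 'key first, name later' lookup that lets downloaders and subs survive gug renames and same-name overwrites. A minimal Python 3 sketch of the same logic, using plain dicts in place of the domain manager's caches (the function and parameter names here are illustrative, not the real API):

```python
# Resolve a (key, name) pair against the current gug definitions: an exact
# key match wins, then a same-name fallback, then None (silent failure).

def get_gug(gug_key_and_name, keys_to_gugs, names_to_gugs):

    (gug_key, gug_name) = gug_key_and_name

    if gug_key in keys_to_gugs:

        return keys_to_gugs[gug_key]

    elif gug_name in names_to_gugs:

        # the stored key is stale (e.g. the gug was overwritten), but a gug
        # with the same name exists, so the subscription keeps working
        return names_to_gugs[gug_name]

    else:

        return None


keys_to_gugs = {b'k1': 'gug-a'}
names_to_gugs = {'gelbooru tag search': 'gug-b'}

get_gug((b'stale key', 'gelbooru tag search'), keys_to_gugs, names_to_gugs)  # falls back by name
```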
@@ -467,6 +490,7 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):

    def _GetSerialisableInfo( self ):
        
        serialisable_gugs = self._gugs.GetSerialisableTuple()
+        serialisable_gug_keys_to_display = [ gug_key.encode( 'hex' ) for gug_key in self._gug_keys_to_display ]
        
        serialisable_url_matches = self._url_matches.GetSerialisableTuple()
        serialisable_url_match_keys_to_display = [ url_match_key.encode( 'hex' ) for url_match_key in self._url_match_keys_to_display ]

@@ -481,7 +505,7 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):

        serialisable_parsers = self._parsers.GetSerialisableTuple()
        serialisable_network_contexts_to_custom_header_dicts = [ ( network_context.GetSerialisableTuple(), custom_header_dict.items() ) for ( network_context, custom_header_dict ) in self._network_contexts_to_custom_header_dicts.items() ]
        
-        return ( serialisable_gugs, serialisable_url_matches, serialisable_url_match_keys_to_display, serialisable_url_match_keys_to_parser_keys, serialisable_default_tag_import_options_tuple, serialisable_parsers, serialisable_network_contexts_to_custom_header_dicts )
+        return ( serialisable_gugs, serialisable_gug_keys_to_display, serialisable_url_matches, serialisable_url_match_keys_to_display, serialisable_url_match_keys_to_parser_keys, serialisable_default_tag_import_options_tuple, serialisable_parsers, serialisable_network_contexts_to_custom_header_dicts )
        
    
    def _GetURLMatch( self, url ):

@@ -512,10 +536,12 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):

    def _InitialiseFromSerialisableInfo( self, serialisable_info ):
        
-        ( serialisable_gugs, serialisable_url_matches, serialisable_url_match_keys_to_display, serialisable_url_match_keys_to_parser_keys, serialisable_default_tag_import_options_tuple, serialisable_parsers, serialisable_network_contexts_to_custom_header_dicts ) = serialisable_info
+        ( serialisable_gugs, serialisable_gug_keys_to_display, serialisable_url_matches, serialisable_url_match_keys_to_display, serialisable_url_match_keys_to_parser_keys, serialisable_default_tag_import_options_tuple, serialisable_parsers, serialisable_network_contexts_to_custom_header_dicts ) = serialisable_info
        
        self._gugs = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_gugs )
        
+        self._gug_keys_to_display = { serialisable_gug_key.decode( 'hex' ) for serialisable_gug_key in serialisable_gug_keys_to_display }
+        
        self._url_matches = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_url_matches )
        
        self._url_match_keys_to_display = { serialisable_url_match_key.decode( 'hex' ) for serialisable_url_match_key in serialisable_url_match_keys_to_display }

@@ -557,12 +583,10 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):

        NetworkDomainManager.STATICSortURLMatchesDescendingComplexity( url_matches )
        
-        self._parser_keys_to_parsers = {}
+        self._gug_keys_to_gugs = { gug.GetGUGKey() : gug for gug in self._gugs }
+        self._gug_names_to_gugs = { gug.GetName() : gug for gug in self._gugs }
        
-        for parser in self._parsers:
-            
-            self._parser_keys_to_parsers[ parser.GetParserKey() ] = parser
-            
+        self._parser_keys_to_parsers = { parser.GetParserKey() : parser for parser in self._parsers }
        
        namespaces = set()
@@ -696,6 +720,45 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):

            return ( 5, new_serialisable_info )
            
        
+        if version == 5:
+            
+            ( serialisable_gugs, serialisable_url_matches, serialisable_url_match_keys_to_display, serialisable_url_match_keys_to_parser_keys, serialisable_default_tag_import_options_tuple, serialisable_parsing_parsers, serialisable_network_contexts_to_custom_header_dicts ) = old_serialisable_info
+            
+            gugs = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_gugs )
+            
+            gug_keys_to_display = [ gug.GetGUGKey() for gug in gugs if 'ugoira' not in gug.GetName() ]
+            
+            serialisable_gug_keys_to_display = [ gug_key.encode( 'hex' ) for gug_key in gug_keys_to_display ]
+            
+            new_serialisable_info = ( serialisable_gugs, serialisable_gug_keys_to_display, serialisable_url_matches, serialisable_url_match_keys_to_display, serialisable_url_match_keys_to_parser_keys, serialisable_default_tag_import_options_tuple, serialisable_parsing_parsers, serialisable_network_contexts_to_custom_header_dicts )
+            
+            return ( 6, new_serialisable_info )
+            
+        
+    
+    def AddGUGs( self, new_gugs ):
+        
+        with self._lock:
+            
+            gugs = list( self._gugs )
+            
+            gugs.extend( new_gugs )
+            
+        
+        self.SetGUGs( gugs )
+        
+    
    def AddParsers( self, new_parsers ):
        
        with self._lock:
            
            parsers = list( self._parsers )
            
            parsers.extend( new_parsers )
            
        
        self.SetParsers( parsers )
        
    
    def CanValidateInPopup( self, network_contexts ):
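The `if version == 5:` block above follows the domain manager's chained version-update pattern: each step upgrades exactly one serialisable version, so an object saved at any old version can walk 5 → 6 → … on load. A minimal Python 3 sketch of that pattern under assumed names (the tuple shape and function name here are illustrative, not the real class, and the real step also round-trips through serialisable tuples):

```python
# Each call upgrades at most one version; callers loop until the stored
# version matches the current SERIALISABLE_VERSION.

def update_serialisable_info(version, old_info):

    if version == 5:

        (gugs, rest) = old_info

        # new in version 6: which gug keys to display, defaulting to
        # everything except the ugoira gugs, as in the update code above
        gug_keys_to_display = [key for (key, name) in gugs if 'ugoira' not in name]

        new_info = (gugs, gug_keys_to_display, rest)

        return (6, new_info)

    return (version, old_info)
```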
@@ -747,6 +810,16 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):

        return url_tuples
        
    
+    def DeleteGUGs( self, deletee_names ):
+        
+        with self._lock:
+            
+            gugs = [ gug for gug in self._gugs if gug.GetName() not in deletee_names ]
+            
+        
+        self.SetGUGs( gugs )
+        
+    
    def GenerateValidationPopupProcess( self, network_contexts ):
        
        with self._lock:

@@ -775,6 +848,27 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):

    
+    def GetDefaultGUGKeyAndName( self ):
+        
+        with self._lock:
+            
+            if len( self._gugs ) == 0:
+                
+                return ( HydrusData.GenerateKey(), 'unknown source' )
+                
+            else:
+                
+                gugs = list( self._gugs )
+                
+                gugs.sort( key = lambda g: g.GetName() )
+                
+                gug = gugs[0]
+                
+                return ( gug.GetGUGKey(), gug.GetName() )
+                
+            
+        
+    
    def GetDefaultTagImportOptions( self ):
        
        with self._lock:

@@ -812,6 +906,14 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):

    
+    def GetGUG( self, gug_key_and_name ):
+        
+        with self._lock:
+            
+            return self._GetGUG( gug_key_and_name )
+            
+        
+    
    def GetGUGs( self ):
        
        with self._lock:

@@ -820,6 +922,14 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):

    
+    def GetGUGKeysToDisplay( self ):
+        
+        with self._lock:
+            
+            return set( self._gug_keys_to_display )
+            
+        
+    
    def GetHeaders( self, network_contexts ):
        
        with self._lock:

@@ -846,6 +956,23 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):

    
+    def GetInitialSearchText( self, gug_key_and_name ):
+        
+        with self._lock:
+            
+            gug = self._GetGUG( gug_key_and_name )
+            
+            if gug is None:
+                
+                return 'unknown downloader'
+                
+            else:
+                
+                return gug.GetInitialSearchText()
+                
+            
+        
+    
    def GetNetworkContextsToCustomHeaderDicts( self ):
        
        with self._lock:
@@ -902,11 +1029,19 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):

    
-    def GetURLMatchLinks( self ):
+    def GetURLMatchKeysToParserKeys( self ):
        
        with self._lock:
            
-            return ( set( self._url_match_keys_to_display ), dict( self._url_match_keys_to_parser_keys ) )
+            return dict( self._url_match_keys_to_parser_keys )
            
        
    
+    def GetURLMatchKeysToDisplay( self ):
+        
+        with self._lock:
+            
+            return set( self._url_match_keys_to_display )
+            
+        
+    

@@ -1013,6 +1148,28 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):

    
+    def OverwriteDefaultGUGs( self, gug_names ):
+        
+        with self._lock:
+            
+            import ClientDefaults
+            
+            default_gugs = ClientDefaults.GetDefaultGUGs()
+            
+            for gug in default_gugs:
+                
+                gug.RegenerateGUGKey()
+                
+            
+            existing_gugs = list( self._gugs )
+            
+            new_gugs = [ gug for gug in existing_gugs if gug.GetName() not in gug_names ]
+            new_gugs.extend( [ gug for gug in default_gugs if gug.GetName() in gug_names ] )
+            
+        
+        self.SetGUGs( new_gugs )
+        
+    
    def OverwriteDefaultParsers( self, parser_names ):
        
        with self._lock:

@@ -1093,10 +1250,33 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):

        with self._lock:
            
-            #check ngugs maybe
+            # by default, we will show new gugs
+            
+            old_gug_keys = { gug.GetGUGKey() for gug in self._gugs }
+            gug_keys = { gug.GetGUGKey() for gug in gugs }
+            
+            added_gug_keys = gug_keys.difference( old_gug_keys )
+            
+            self._gug_keys_to_display.update( added_gug_keys )
+            
+            #
            
            self._gugs = HydrusSerialisable.SerialisableList( gugs )
            
            self._RecalcCache()
            
            self._SetDirty()
            
        
    
+    def SetGUGKeysToDisplay( self, gug_keys_to_display ):
+        
+        with self._lock:
+            
+            self._gug_keys_to_display = set()
+            
+            self._gug_keys_to_display.update( gug_keys_to_display )
+            
+            self._SetDirty()
+            
+        
+    
@@ -1222,15 +1402,25 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):

    
-    def SetURLMatchLinks( self, url_match_keys_to_display, url_match_keys_to_parser_keys ):
+    def SetURLMatchKeysToParserKeys( self, url_match_keys_to_parser_keys ):
        
        with self._lock:
            
            self._url_match_keys_to_parser_keys = HydrusSerialisable.SerialisableBytesDictionary()
            
            self._url_match_keys_to_parser_keys.update( url_match_keys_to_parser_keys )
            
            self._SetDirty()
            
        
    
+    def SetURLMatchKeysToDisplay( self, url_match_keys_to_display ):
+        
+        with self._lock:
+            
            self._url_match_keys_to_display = set()
-            self._url_match_keys_to_parser_keys = HydrusSerialisable.SerialisableBytesDictionary()
            
            self._url_match_keys_to_display.update( url_match_keys_to_display )
-            self._url_match_keys_to_parser_keys.update( url_match_keys_to_parser_keys )
            
            self._SetDirty()
@@ -1562,7 +1752,7 @@ class GalleryURLGenerator( HydrusSerialisable.SerialisableBaseNamed ):

        self._gallery_url_generator_key = serialisable_gallery_url_generator_key.decode( 'hex' )
        
    
-    def GenerateGalleryURL( self, search_terms ):
+    def GenerateGalleryURL( self, query_text ):
        
        if self._replacement_phrase == '':

@@ -1574,6 +1764,20 @@ class GalleryURLGenerator( HydrusSerialisable.SerialisableBaseNamed ):

            raise HydrusExceptions.GUGException( 'Replacement phrase not in URL template!' )
            
        
+        ( first_part, second_part ) = self._url_template.split( self._replacement_phrase, 1 )
+        
+        search_phrase_seems_to_go_in_path = '?' not in first_part
+        
+        search_terms = query_text.split( ' ' )
+        
+        if search_phrase_seems_to_go_in_path:
+            
+            # encode this gubbins since requests won't be able to do it
+            # this basically fixes e621 searches for 'male/female', which through some httpconf trickery are embedded in path but end up in a query, so need to be encoded right beforehand
+            
+            search_terms = [ urllib.quote( search_term, safe = '' ) for search_term in search_terms ]
+            
+        
        try:
            
            search_phrase = self._search_terms_separator.join( search_terms )
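The hunk above decides whether the search terms land in the URL path (no `?` before the replacement phrase) and, if so, percent-encodes them by hand, since the HTTP layer will not re-encode path segments. A self-contained Python 3 sketch of the same decision, using `urllib.parse.quote` in place of py2's `urllib.quote` (the function name and parameters here are illustrative, not the real method):

```python
from urllib.parse import quote

def generate_gallery_url(url_template, replacement_phrase, separator, query_text):
    # everything before the replacement phrase tells us whether the terms
    # go in the path (must be hand-encoded) or in the query string
    first_part = url_template.split(replacement_phrase, 1)[0]

    search_terms = query_text.split(' ')

    if '?' not in first_part:

        # path position: encode everything, including '/', so a term like
        # 'male/female' does not create an extra path segment
        search_terms = [quote(term, safe='') for term in search_terms]

    return url_template.replace(replacement_phrase, separator.join(search_terms))


# query-string position: terms pass through unencoded
generate_gallery_url('https://example.com/post?tags=%tags%', '%tags%', '+', 'blue_eyes blonde_hair')
# path position: '/' inside a term is escaped to %2F
generate_gallery_url('https://example.com/post/index/1/%tags%', '%tags%', ' ', 'male/female')
```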
@@ -1588,9 +1792,14 @@ class GalleryURLGenerator( HydrusSerialisable.SerialisableBaseNamed ):

        return gallery_url
        
    
+    def GenerateGalleryURLs( self, query_text ):
+        
+        return ( self.GenerateGalleryURL( query_text ), )
+        
+    
    def GetExampleURL( self ):
        
-        return self.GenerateGalleryURL( self._example_search_text.split( ' ' ) )
+        return self.GenerateGalleryURL( self._example_search_text )
        
    
    def GetGUGKey( self ):

@@ -1598,6 +1807,11 @@ class GalleryURLGenerator( HydrusSerialisable.SerialisableBaseNamed ):

        return self._gallery_url_generator_key
        
    
+    def GetGUGKeyAndName( self ):
+        
+        return ( self._gallery_url_generator_key, self._name )
+        
+    
    def GetInitialSearchText( self ):
        
        return self._initial_search_text

@@ -1608,6 +1822,15 @@ class GalleryURLGenerator( HydrusSerialisable.SerialisableBaseNamed ):

        return ( self._url_template, self._replacement_phrase, self._search_terms_separator, self._example_search_text )
        
    
+    def IsFunctional( self ):
+        
+        example_url = self.GetExampleURL()
+        
+        ( url_type, match_name, can_parse ) = HG.client_controller.network_engine.domain_manager.GetURLParseCapability( example_url )
+        
+        return can_parse
+        
+    
    def RegenerateGUGKey( self ):
        
        self._gallery_url_generator_key = HydrusData.GenerateKey()
@@ -1621,60 +1844,144 @@ class NestedGalleryURLGenerator( HydrusSerialisable.SerialisableBaseNamed ):

    SERIALISABLE_NAME = 'Nested Gallery URL Generator'
    SERIALISABLE_VERSION = 1
    
-    def __init__( self, name, initial_search_text = None, gug_keys = None ):
+    def __init__( self, name, gug_key = None, initial_search_text = None, gug_keys_and_names = None ):
+        
+        if gug_key is None:
+            
+            gug_key = HydrusData.GenerateKey()
+            
        
        if initial_search_text is None:
            
            initial_search_text = 'search tags'
            
        
-        if gug_keys is None:
+        if gug_keys_and_names is None:
            
-            gug_keys = []
+            gug_keys_and_names = []
            
        
        HydrusSerialisable.SerialisableBaseNamed.__init__( self, name )
        
+        self._gallery_url_generator_key = gug_key
        self._initial_search_text = initial_search_text
-        self._gug_keys = gug_keys
+        self._gug_keys_and_names = gug_keys_and_names
        
    
    def _GetSerialisableInfo( self ):
        
-        serialisable_gug_keys = [ gug_key.encode( 'hex' ) for gug_key in self._gug_keys ]
+        serialisable_gug_key = self._gallery_url_generator_key.encode( 'hex' )
+        serialisable_gug_keys_and_names = [ ( gug_key.encode( 'hex' ), gug_name ) for ( gug_key, gug_name ) in self._gug_keys_and_names ]
        
-        return ( self._initial_search_text, serialisable_gug_keys )
+        return ( serialisable_gug_key, self._initial_search_text, serialisable_gug_keys_and_names )
        
    
    def _InitialiseFromSerialisableInfo( self, serialisable_info ):
        
-        ( self._initial_search_text, serialisable_gug_keys ) = serialisable_info
+        ( serialisable_gug_key, self._initial_search_text, serialisable_gug_keys_and_names ) = serialisable_info
        
-        self._gug_keys = [ gug_key.decode( 'hex' ) for gug_key in serialisable_gug_keys ]
+        self._gallery_url_generator_key = serialisable_gug_key.decode( 'hex' )
+        self._gug_keys_and_names = [ ( gug_key.decode( 'hex' ), gug_name ) for ( gug_key, gug_name ) in serialisable_gug_keys_and_names ]
        
    
-    def GenerateGalleryURLs( self, search_terms ):
+    def GenerateGalleryURLs( self, query_text ):
        
        gallery_urls = []
        
-        for gug_key in self._gug_keys:
+        for gug_key_and_name in self._gug_keys_and_names:
            
-            gug = HG.client_controller.network_engine.domain_manager.GetGUG( gug_key )
+            gug = HG.client_controller.network_engine.domain_manager.GetGUG( gug_key_and_name )
            
            if gug is not None:
                
-                gallery_urls.append( gug.GenerateGalleryURL( search_terms ) )
+                gallery_urls.append( gug.GenerateGalleryURL( query_text ) )
                
            
        
        return gallery_urls
        
    
+    def GetGUGKey( self ):
+        
+        return self._gallery_url_generator_key
+        
+    
+    def GetGUGKeys( self ):
+        
+        return [ gug_key for ( gug_key, gug_name ) in self._gug_keys_and_names ]
+        
+    
+    def GetGUGKeysAndNames( self ):
+        
+        return list( self._gug_keys_and_names )
+        
+    
+    def GetGUGKeyAndName( self ):
+        
+        return ( self._gallery_url_generator_key, self._name )
+        
+    
+    def GetGUGNames( self ):
+        
+        return [ gug_name for ( gug_key, gug_name ) in self._gug_keys_and_names ]
+        
+    
    def GetInitialSearchText( self ):
        
        return self._initial_search_text
        
    
+    def IsFunctional( self ):
+        
+        for gug_key_and_name in self._gug_keys_and_names:
+            
+            gug = HG.client_controller.network_engine.domain_manager.GetGUG( gug_key_and_name )
+            
+            if gug is not None:
+                
+                if gug.IsFunctional():
+                    
+                    return True
+                    
+                
+            
+        
+        return False
+        
+    
+    def RegenerateGUGKey( self ):
+        
+        self._gallery_url_generator_key = HydrusData.GenerateKey()
+        
+    
+    def RepairGUGs( self, available_gugs ):
+        
+        available_keys_to_gugs = { gug.GetGUGKey() : gug for gug in available_gugs }
+        available_names_to_gugs = { gug.GetName() : gug for gug in available_gugs }
+        
+        good_gug_keys_and_names = []
+        
+        for ( gug_key, gug_name ) in self._gug_keys_and_names:
+            
+            if gug_key in available_keys_to_gugs:
+                
+                gug = available_keys_to_gugs[ gug_key ]
+                
+            elif gug_name in available_names_to_gugs:
+                
+                gug = available_names_to_gugs[ gug_name ]
+                
+            else:
+                
+                continue
+                
+            
+            good_gug_keys_and_names.append( ( gug.GetGUGKey(), gug.GetName() ) )
+            
+        
+        self._gug_keys_and_names = good_gug_keys_and_names
+        
+    
HydrusSerialisable.SERIALISABLE_TYPES_TO_OBJECT_TYPES[ HydrusSerialisable.SERIALISABLE_TYPE_NESTED_GALLERY_URL_GENERATOR ] = NestedGalleryURLGenerator

class URLMatch( HydrusSerialisable.SerialisableBaseNamed ):
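The ngug's `GenerateGalleryURLs` above fans one query out to every child gug it can still resolve, silently skipping children that cannot be found. A compact Python 3 sketch of that fan-out, with callables standing in for gug objects and a lookup function standing in for the domain manager (all names here are illustrative):

```python
# One query_text becomes one gallery url per resolvable child gug;
# missing children (e.g. deleted gugs) are skipped without error.

def generate_ngug_urls(gug_keys_and_names, get_gug, query_text):

    gallery_urls = []

    for gug_key_and_name in gug_keys_and_names:

        gug = get_gug(gug_key_and_name)

        if gug is not None:

            gallery_urls.append(gug(query_text))

    return gallery_urls


# e.g. an 'hentai foundry artist' ngug searching both works and scraps
gugs = {
    'hf artist works': lambda q: 'works/' + q,
    'hf artist scraps': lambda q: 'scraps/' + q,
}
pairs = [(b'k1', 'hf artist works'), (b'k2', 'hf artist scraps'), (b'k3', 'deleted gug')]
generate_ngug_urls(pairs, lambda kn: gugs.get(kn[1]), 'artistname')
```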
@@ -1709,7 +2016,7 @@ class URLMatch( HydrusSerialisable.SerialisableBaseNamed ):

            parameters[ 's' ] = ( ClientParsing.StringMatch( match_type = ClientParsing.STRING_MATCH_FIXED, match_value = 'view', example_string = 'view' ), None )
            parameters[ 'id' ] = ( ClientParsing.StringMatch( match_type = ClientParsing.STRING_MATCH_FLEXIBLE, match_value = ClientParsing.NUMERIC, example_string = '123456' ), None )
-            parameters[ 'page' ] = ( ClientParsing.StringMatch( match_type = ClientParsing.STRING_MATCH_FLEXIBLE, match_value = ClientParsing.NUMERIC, example_string = '1' ), 1 )
+            parameters[ 'page' ] = ( ClientParsing.StringMatch( match_type = ClientParsing.STRING_MATCH_FLEXIBLE, match_value = ClientParsing.NUMERIC, example_string = '1' ), '1' )
        
        if api_lookup_converter is None:
@@ -184,6 +184,7 @@ class RasterContainerVideo( RasterContainer ):

        self._initialised = False
        
+        self._frames = {}
        
        self._buffer_start_index = -1
        self._buffer_end_index = -1

@@ -198,9 +199,9 @@ class RasterContainerVideo( RasterContainer ):

        video_buffer_size_mb = new_options.GetInteger( 'video_buffer_size_mb' )
        
        duration = self._media.GetDuration()
-        num_frames = self._media.GetNumFrames()
+        num_frames_in_video = self._media.GetNumFrames()
        
-        if num_frames is None or num_frames == 0:
+        if num_frames_in_video is None or num_frames_in_video == 0:
            
            message = 'The file with hash ' + media.GetHash().encode( 'hex' ) + ', had an invalid number of frames.'
            message += os.linesep * 2

@@ -208,18 +209,18 @@ class RasterContainerVideo( RasterContainer ):

            HydrusData.ShowText( message )
            
-            num_frames = 1
+            num_frames_in_video = 1
            
        
-        self._average_frame_duration = float( duration ) / num_frames
+        self._average_frame_duration = float( duration ) / num_frames_in_video
        
        frame_buffer_length = ( video_buffer_size_mb * 1024 * 1024 ) / ( x * y * 3 )
        
        # if we can't buffer the whole vid, then don't have a clunky massive buffer
        
-        max_streaming_buffer_size = max( 48, int( num_frames / ( duration / 3.0 ) ) ) # 48 or 3 seconds
+        max_streaming_buffer_size = max( 48, int( num_frames_in_video / ( duration / 3.0 ) ) ) # 48 or 3 seconds
        
-        if max_streaming_buffer_size < frame_buffer_length and frame_buffer_length < num_frames:
+        if max_streaming_buffer_size < frame_buffer_length and frame_buffer_length < num_frames_in_video:
            
            frame_buffer_length = max_streaming_buffer_size
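The buffer sizing above caps the streaming buffer at roughly three seconds of frames, never dropping below 48. A one-function Python 3 sketch of that formula, assuming the duration is expressed in seconds as the "48 or 3 seconds" comment implies (the function name is illustrative):

```python
# num_frames / (duration / 3) is three seconds' worth of frames at the
# video's average frame rate; 48 is the floor for very low frame rates.

def streaming_buffer_size(num_frames_in_video, duration_s):

    return max(48, int(num_frames_in_video / (duration_s / 3.0)))


streaming_buffer_size(240, 10.0)  # 24 fps video: 72 frames = 3 seconds
streaming_buffer_size(30, 10.0)   # 3 fps video: floor of 48 applies
```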
@@ -227,18 +228,26 @@ class RasterContainerVideo( RasterContainer ):

        self._num_frames_backwards = frame_buffer_length * 2 / 3
        self._num_frames_forwards = frame_buffer_length / 3
        
-        self._render_lock = threading.Lock()
-        self._buffer_lock = threading.Lock()
+        self._lock = threading.Lock()
        
        self._last_index_rendered = -1
        self._next_render_index = -1
        self._render_to_index = -1
+        self._rendered_first_frame = False
        self._rush_to_index = None
+        self._ideal_next_frame = 0
        
        HG.client_controller.CallToThread( self.THREADRender )
        
    
+    def _HasFrame( self, index ):
+        
+        return index in self._frames
+        
+    
+    def _IndexInRange( self, index, range_start, range_end ):
+        
+        return not self._IndexOutOfRange( index, range_start, range_end )
+        
+    
    def _IndexOutOfRange( self, index, range_start, range_end ):
        
        before_start = index < range_start
@@ -264,64 +273,11 @@ class RasterContainerVideo( RasterContainer ):

    def _MaintainBuffer( self ):
        
-        with self._buffer_lock:
-            
-            deletees = [ index for index in self._frames.keys() if self._IndexOutOfRange( index, self._buffer_start_index, self._buffer_end_index ) ]
-            
-            for i in deletees:
-                
-                del self._frames[ i ]
-                
-            
+        deletees = [ index for index in self._frames.keys() if self._IndexOutOfRange( index, self._buffer_start_index, self._buffer_end_index ) ]
+        
+        for i in deletees:
+            
+            del self._frames[ i ]
+            
        
    
-    def THREADMoveRenderTo( self, render_to_index ):
-        
-        with self._render_lock:
-            
-            if self._render_to_index != render_to_index:
-                
-                self._render_to_index = render_to_index
-                
-                self._render_event.set()
-                
-            
-            self._initialised = True
-            
-        
-    
-    def THREADMoveRenderer( self, start_index, rush_to_index, render_to_index ):
-        
-        with self._render_lock:
-            
-            if self._next_render_index != start_index:
-                
-                self._renderer.set_position( start_index )
-                
-                self._last_index_rendered = -1
-                
-                self._next_render_index = start_index
-                
-            
-            self._rush_to_index = rush_to_index
-            
-            self._render_to_index = render_to_index
-            
-            self._render_event.set()
-            
-            self._initialised = True
-            
-        
-    
-    def THREADRushTo( self, rush_to_index ):
-        
-        with self._render_lock:
-            
-            self._rush_to_index = rush_to_index
-            
-            self._render_event.set()
-            
-            self._initialised = True
-            
-        
-    
@@ -330,7 +286,7 @@ class RasterContainerVideo( RasterContainer ):

        hash = self._media.GetHash()
        mime = self._media.GetMime()
        duration = self._media.GetDuration()
-        num_frames = self._media.GetNumFrames()
+        num_frames_in_video = self._media.GetNumFrames()
        
        client_files_manager = HG.client_controller.client_files_manager

@@ -338,15 +294,20 @@ class RasterContainerVideo( RasterContainer ):

            self._durations = HydrusImageHandling.GetGIFFrameDurations( self._path )
            
-            self._renderer = ClientVideoHandling.GIFRenderer( self._path, num_frames, self._target_resolution )
+            self._renderer = ClientVideoHandling.GIFRenderer( self._path, num_frames_in_video, self._target_resolution )
            
        else:
            
-            self._renderer = HydrusVideoHandling.VideoRendererFFMPEG( self._path, mime, duration, num_frames, self._target_resolution )
+            self._renderer = HydrusVideoHandling.VideoRendererFFMPEG( self._path, mime, duration, num_frames_in_video, self._target_resolution )
            
        
        self.GetReadyForFrame( self._init_position )
        
+        with self._lock:
+            
+            self._initialised = True
+            
+        
        while True:
            
            if self._stop or HG.view_shutdown:
@@ -354,12 +315,42 @@ class RasterContainerVideo( RasterContainer ):
                
                return
                
            
            ready_to_render = self._initialised
            frames_needed = not self._rendered_first_frame or self._next_render_index != ( self._render_to_index + 1 ) % num_frames
            
            #
            
            if ready_to_render and frames_needed:
                
                with self._lock:
                with self._render_lock:
                    
                    # lets see if we should move the renderer to a new position
                    
                    next_render_is_out_of_buffer = self._IndexOutOfRange( self._next_render_index, self._buffer_start_index, self._buffer_end_index )
                    buffer_not_fully_rendered = self._last_index_rendered != self._buffer_end_index
                    
                    currently_rendering_out_of_buffer = next_render_is_out_of_buffer and buffer_not_fully_rendered
                    
                    will_render_ideal_frame_soon = self._IndexInRange( self._next_render_index, self._buffer_start_index, self._ideal_next_frame )
                    
                    need_ideal_next_frame = not self._HasFrame( self._ideal_next_frame )
                    
                    will_not_get_to_ideal_frame = need_ideal_next_frame and not will_render_ideal_frame_soon
                    
                    if currently_rendering_out_of_buffer or will_not_get_to_ideal_frame:
                        
                        # we cannot get to the ideal next frame, so we need to rewind/reposition
                        
                        self._renderer.set_position( self._buffer_start_index )
                        
                        self._last_index_rendered = -1
                        
                        self._next_render_index = self._buffer_start_index
                        
                    
                    #
                    
                    need_to_render = self._last_index_rendered != self._buffer_end_index
                    
                
                if need_to_render:
                    
                    with self._lock:
                        
                        self._rendered_first_frame = True
@@ -379,38 +370,42 @@ class RasterContainerVideo( RasterContainer ):
                        
                        self._last_index_rendered = frame_index
                        
                        self._next_render_index = ( self._next_render_index + 1 ) % num_frames
                        self._next_render_index = ( self._next_render_index + 1 ) % num_frames_in_video
                        
                    
                    with self._buffer_lock:
                    with self._lock:
                        
                        frame_needed = frame_index not in self._frames
                        if self._next_render_index == 0 and self._buffer_end_index != num_frames_in_video - 1:
                            
                            # we need to rewind renderer
                            
                            self._renderer.set_position( 0 )
                            
                            self._last_index_rendered = -1
                            
                        
                        if self._rush_to_index is not None:
                            
                            reached_it = self._rush_to_index == frame_index
                            already_got_it = self._rush_to_index in self._frames
                            can_no_longer_reach_it = self._IndexOutOfRange( self._rush_to_index, self._next_render_index, self._render_to_index )
                            
                            if reached_it or already_got_it or can_no_longer_reach_it:
                                
                                self._rush_to_index = None
                                
                            
                        
                        should_save_frame = not self._HasFrame( frame_index )
                        
                    
                    if frame_needed:
                    if should_save_frame:
                        
                        frame = GenerateHydrusBitmapFromNumPyImage( numpy_image, compressed = False )
                        
                        with self._buffer_lock:
                        with self._lock:
                            
                            self._frames[ frame_index ] = frame
                            
                            self._MaintainBuffer()
                            
                        
                    
                    if self._rush_to_index is not None:
                    with self._lock:
                        
                        we_have_the_ideal_next_frame = self._HasFrame( self._ideal_next_frame )
                        
                    
                    if not we_have_the_ideal_next_frame: # there is work to do!
                        
                        time.sleep( 0.00001 )
@@ -418,7 +413,9 @@ class RasterContainerVideo( RasterContainer ):
                    
                    half_a_frame = ( self._average_frame_duration / 1000.0 ) * 0.5
                    
                    time.sleep( half_a_frame ) # just so we don't spam cpu
                    sleep_duration = min( 0.1, half_a_frame ) # for 10s-long 3-frame gifs, wew
                    
                    time.sleep( sleep_duration ) # just so we don't spam cpu
                    
                
            else:
@@ -456,14 +453,14 @@ class RasterContainerVideo( RasterContainer ):
    
    def GetFrame( self, index ):
        
        with self._buffer_lock:
        with self._lock:
            
            frame = self._frames[ index ]
            
        
        num_frames = self.GetNumFrames()
        num_frames_in_video = self.GetNumFrames()
        
        if index == num_frames - 1:
        if index == num_frames_in_video - 1:
            
            next_index = 0
@@ -477,81 +474,96 @@ class RasterContainerVideo( RasterContainer ):
        
        return frame
        
    
    def GetHash( self ): return self._media.GetHash()
    def GetHash( self ):
        
        return self._media.GetHash()
        
    
    def GetKey( self ): return ( self._media.GetHash(), self._target_resolution )
    def GetKey( self ):
        
        return ( self._media.GetHash(), self._target_resolution )
        
    
    def GetNumFrames( self ): return self._media.GetNumFrames()
    def GetNumFrames( self ):
        
        return self._media.GetNumFrames()
        
    
    def GetReadyForFrame( self, next_index_to_expect ):
        
        num_frames = self.GetNumFrames()
        num_frames_in_video = self.GetNumFrames()
        
        frame_exists = 0 <= next_index_to_expect and next_index_to_expect <= ( num_frames - 1 )
        frame_request_is_impossible = self._IndexOutOfRange( next_index_to_expect, 0, num_frames_in_video - 1 )
        
        if not frame_exists:
        if frame_request_is_impossible:
            
            return
            
        
        if num_frames > self._num_frames_backwards + 1 + self._num_frames_forwards:
        with self._lock:
            
            index_out_of_buffer = self._IndexOutOfRange( next_index_to_expect, self._buffer_start_index, self._buffer_end_index )
            self._ideal_next_frame = next_index_to_expect
            
            ideal_buffer_start_index = max( 0, next_index_to_expect - self._num_frames_backwards )
            video_is_bigger_than_buffer = num_frames_in_video > self._num_frames_backwards + 1 + self._num_frames_forwards
            
            ideal_buffer_end_index = ( next_index_to_expect + self._num_frames_forwards ) % num_frames
            
            if not self._rendered_first_frame or index_out_of_buffer:
            if video_is_bigger_than_buffer:
                
                self._buffer_start_index = ideal_buffer_start_index
                current_ideal_is_out_of_buffer = self._buffer_start_index == -1 or self._IndexOutOfRange( self._ideal_next_frame, self._buffer_start_index, self._buffer_end_index )
                
                self._buffer_end_index = ideal_buffer_end_index
                ideal_buffer_start_index = max( 0, self._ideal_next_frame - self._num_frames_backwards )
                
                HG.client_controller.CallToThread( self.THREADMoveRenderer, self._buffer_start_index, next_index_to_expect, self._buffer_end_index )
                ideal_buffer_end_index = ( self._ideal_next_frame + self._num_frames_forwards ) % num_frames_in_video
                
            else:
                
                # rendering can't go backwards, so dragging caret back shouldn't rewind either of these!
                
                if self.HasFrame( ideal_buffer_start_index ):
                if current_ideal_is_out_of_buffer:
                    
                    # the current buffer won't get to where we want, so remake it
                    
                    self._buffer_start_index = ideal_buffer_start_index
                    
                    if not self._IndexOutOfRange( self._next_render_index + 1, self._buffer_start_index, ideal_buffer_end_index ):
                        
                        self._buffer_end_index = ideal_buffer_end_index
                        
                    
                else:
                    
                    # we can get to our desired position, but should we move the start and beginning on a bit?
                    
                    # we do not ever want to shunt left (rewind)
                    # we do not want to shunt right if we don't have the earliest frames yet--be patient
                    
                    # i.e. it is between the current start and the ideal
                    next_ideal_start_would_shunt_right = self._IndexInRange( ideal_buffer_start_index, self._buffer_start_index, self._ideal_next_frame )
                    have_next_ideal_start = self._HasFrame( ideal_buffer_start_index )
                    
                    if next_ideal_start_would_shunt_right and have_next_ideal_start:
                        
                        self._buffer_start_index = ideal_buffer_start_index
                        
                    
                    next_ideal_end_would_shunt_right = self._IndexInRange( ideal_buffer_end_index, self._buffer_end_index, self._buffer_start_index )
                    
                    if next_ideal_end_would_shunt_right:
                        
                        self._buffer_end_index = ideal_buffer_end_index
                        
                    
                
                HG.client_controller.CallToThread( self.THREADMoveRenderTo, self._buffer_end_index )
                
            else:
                
                if self._buffer_end_index == -1:
                else:
                    
                    self._buffer_start_index = 0
                    
                    self._buffer_end_index = num_frames - 1
                    
                    HG.client_controller.CallToThread( self.THREADMoveRenderer, self._buffer_start_index, next_index_to_expect, self._buffer_end_index )
                    
                else:
                    
                    if not self.HasFrame( next_index_to_expect ):
                        
                        HG.client_controller.CallToThread( self.THREADRushTo, next_index_to_expect )
                        
                    
                    self._buffer_end_index = num_frames_in_video - 1
                    
                
            
            self._MaintainBuffer()
            self._render_event.set()
            
        
    
    def GetResolution( self ): return self._media.GetResolution()
    def GetResolution( self ):
        
        return self._media.GetResolution()
        
    
    def GetSize( self ): return self._target_resolution
    def GetSize( self ):
        
        return self._target_resolution
        
    
    def GetTotalDuration( self ):
@@ -567,18 +579,24 @@ class RasterContainerVideo( RasterContainer ):
    
    def HasFrame( self, index ):
        
        with self._buffer_lock:
        with self._lock:
            
            return index in self._frames
            return self._HasFrame( index )
            
        
    
    def IsInitialised( self ):
        
        return self._initialised
        with self._lock:
            
            return self._initialised
            
        
    
    def IsScaled( self ): return self._zoom != 1.0
    def IsScaled( self ):
        
        return self._zoom != 1.0
        
    
    def Stop( self ):
@@ -1447,7 +1447,7 @@ class ServiceRepository( ServiceRestricted ):
        
        with self._lock:
            
            message = 'While processing updates for the ' + self._name + ' repository, one failed! The error follows:'
            message = 'Failed to process updates for the ' + self._name + ' repository! The error follows:'
            
            HydrusData.ShowText( message )
@@ -49,7 +49,7 @@ options = {}

# Misc

NETWORK_VERSION = 18
SOFTWARE_VERSION = 320
SOFTWARE_VERSION = 321

UNSCALED_THUMBNAIL_DIMENSIONS = ( 200, 200 )
@@ -790,6 +790,13 @@ class HydrusDB( object ):
        
        try:
            
            if HG.db_report_mode:
                
                summary = 'Running ' + job.ToString()
                
                HydrusData.ShowText( summary )
                
            
            if HG.db_profile_mode:
                
                summary = 'Profiling ' + job.ToString()
@@ -2,17 +2,18 @@ import gc
import hashlib
import HydrusAudioHandling
import HydrusConstants as HC
import HydrusData
import HydrusDocumentHandling
import HydrusExceptions
import HydrusFlashHandling
import HydrusImageHandling
import HydrusPaths
import HydrusText
import HydrusVideoHandling
import os
import threading
import traceback
import cStringIO
import HydrusData

# Mime
@@ -194,7 +195,18 @@ def GetFileInfo( path, mime = None ):
    
    if mime not in HC.ALLOWED_MIMES:
        
        raise HydrusExceptions.MimeException( 'Filetype is not permitted!' )
        if mime == HC.TEXT_HTML:
            
            raise HydrusExceptions.MimeException( 'Looks like HTML -- maybe the client needs to be taught how to parse this?' )
            
        elif mime == HC.APPLICATION_UNKNOWN:
            
            raise HydrusExceptions.MimeException( 'Unknown filetype!' )
            
        else:
            
            raise HydrusExceptions.MimeException( 'Filetype is not permitted!' )
            
        
    
    width = None
@@ -280,8 +292,6 @@ def GetMime( path ):
    
    with open( path, 'rb' ) as f:
        
        f.seek( 0 )
        
        bit_to_check = f.read( 256 )
@@ -337,5 +347,10 @@ def GetMime( path ):
        
        HydrusData.PrintException( e, do_wait = False )
        
    
    if HydrusText.LooksLikeHTML( bit_to_check ):
        
        return HC.TEXT_HTML
        
    
    return HC.APPLICATION_UNKNOWN
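The GetMime change above adds a cheap sniff: if a file's type cannot be identified, check whether its first bytes look like HTML before falling back to 'unknown', which catches the common case of a saved HTML error page where an image download was expected. A self-contained sketch of the idea, with hypothetical names (`looks_like_html`, `sniff_fallback_mime`) and plain mime strings standing in for the HC constants:

```python
import os
import tempfile

def looks_like_html( file_data ):
    
    # a cheap sniff over the first few hundred bytes is enough to catch
    # an html page that was saved where a media file was expected
    return b'<html' in file_data or b'<HTML' in file_data

def sniff_fallback_mime( path ):
    
    # hypothetical fallback mirroring the shape of the change above
    with open( path, 'rb' ) as f:
        
        bit_to_check = f.read( 256 )
        
    
    if looks_like_html( bit_to_check ):
        
        return 'text/html'
        
    
    return 'application/unknown'

# demo with a fake 'download' that was actually an error page
( fd, path ) = tempfile.mkstemp()

with os.fdopen( fd, 'wb' ) as f:
    
    f.write( b'<!DOCTYPE html><html><head></head></html>' )
    

print( sniff_fallback_mime( path ) )

os.remove( path )
```

A substring check is deliberately loose; it only runs after every real signature test has failed, so false positives are unlikely in practice.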
@@ -20,6 +20,10 @@ def DeserialiseNewlinedTexts( text ):
    
    return texts
    

def LooksLikeHTML( file_data ):
    
    return '<html' in file_data or '<HTML' in file_data
    

def SortStringsIgnoringCase( list_of_strings ):
    
    list_of_strings.sort( key = lambda s: s.lower() )
@@ -789,7 +789,13 @@ class VideoRendererFFMPEG( object ):
        
        rewind = pos < self.pos
        jump_a_long_way_ahead = pos > self.pos + 60
        
        if rewind or jump_a_long_way_ahead: self.initialize( pos )
        else: self.skip_frames( pos - self.pos )
        if rewind or jump_a_long_way_ahead:
            
            self.initialize( pos )
            
        else:
            
            self.skip_frames( pos - self.pos )
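The set_position logic above encodes a standard heuristic for forward-only decoders like ffmpeg pipes: a rewind or a long jump ahead is cheaper to serve by restarting the decoder at the target, while a short forward jump is cheaper to serve by decoding and discarding the intervening frames. A sketch with a hypothetical `SeekableRenderer` class that just counts what each strategy costs:

```python
class SeekableRenderer:
    
    # a toy stand-in for a forward-only video decoder: rewinds and long
    # jumps restart it, short forward jumps decode-and-discard frames
    
    def __init__( self ):
        
        self.pos = 0
        self.reinitialisations = 0
        self.frames_skipped = 0
        
    
    def initialize( self, pos ):
        
        # restarting the decoder at a new position is the expensive path
        self.reinitialisations += 1
        self.pos = pos
        
    
    def skip_frames( self, n ):
        
        # decoding and discarding n frames is the cheap path
        self.frames_skipped += n
        self.pos += n
        
    
    def set_position( self, pos ):
        
        rewind = pos < self.pos
        jump_a_long_way_ahead = pos > self.pos + 60
        
        if rewind or jump_a_long_way_ahead:
            
            self.initialize( pos )
            
        else:
            
            self.skip_frames( pos - self.pos )
            
        
    

r = SeekableRenderer()

r.set_position( 10 )   # short forward jump: skip 10 frames
r.set_position( 5 )    # rewind: restart the decoder
r.set_position( 200 )  # long jump ahead: restart the decoder

print( r.reinitialisations, r.frames_skipped )
```

The 60-frame threshold is the break-even guess from the code above: below it, decode-and-discard is assumed faster than tearing down and respawning the decoder process.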
@@ -191,52 +191,6 @@ class TestClientDB( unittest.TestCase ):
        
        self.assertEqual( result, [] )
        
    
    def test_booru( self ):
        
        default_boorus = ClientDefaults.GetDefaultBoorus()
        
        for ( name, booru ) in default_boorus.items():
            
            read_booru = self._read( 'remote_booru', name )
            
            self.assertEqual( booru.GetData(), read_booru.GetData() )
            
        
        #
        
        result = self._read( 'remote_boorus' )
        
        for ( name, booru ) in default_boorus.items(): self.assertEqual( result[ name ].GetData(), booru.GetData() )
        
        #
        
        name = 'blah'
        search_url = 'url'
        search_separator = '%20'
        advance_by_page_num = True
        thumb_classname = 'thumb'
        image_id = None
        image_data = 'Download'
        tag_classnames_to_namespaces = { 'tag' : '' }
        
        booru = ClientData.Booru( name, search_url, search_separator, advance_by_page_num, thumb_classname, image_id, image_data, tag_classnames_to_namespaces )
        
        self._write( 'remote_booru', 'blah', booru )
        
        read_booru = self._read( 'remote_booru', name )
        
        self.assertEqual( booru.GetData(), read_booru.GetData() )
        
        #
        
        self._write( 'delete_remote_booru', 'blah' )
        
        with self.assertRaises( Exception ):
            
            read_booru = self._read( 'remote_booru', name )
            
        
    
    def test_export_folders( self ):
        
        file_search_context = ClientSearch.FileSearchContext( file_service_key = HydrusData.GenerateKey(), tag_service_key = HydrusData.GenerateKey(), predicates = [ ClientSearch.Predicate( HC.PREDICATE_TYPE_TAG, 'test' ) ] )
@@ -635,8 +589,6 @@ class TestClientDB( unittest.TestCase ):
        
        #
        
        gallery_identifier = ClientDownloading.GalleryIdentifier( HC.SITE_TYPE_HENTAI_FOUNDRY_ARTIST )
        
        management_controller = ClientGUIManagement.CreateManagementControllerImportGallery()
        
        page = ClientGUIPages.Page( test_frame, HG.test_controller, management_controller, [] )
@@ -59,20 +59,6 @@ def PressKey( window, key ):
    
class TestDBDialogs( unittest.TestCase ):
    
    def test_dialog_select_booru( self ):
        
        HG.test_controller.SetRead( 'remote_boorus', ClientDefaults.GetDefaultBoorus() )
        
        with ClientGUIDialogs.DialogSelectBooru( None ) as dlg:
            
            HitCancelButton( dlg )
            
            result = dlg.ShowModal()
            
            self.assertEqual( result, wx.ID_CANCEL )
            
        
    
    def test_dialog_manage_subs( self ):
        
        title = 'subs test'
@@ -9,6 +9,7 @@ import ClientImporting
import ClientImportOptions
import ClientImportSubscriptions
import ClientMedia
import ClientNetworkingDomain
import ClientRatings
import ClientSearch
import ClientTags
@@ -473,8 +474,7 @@ class TestSerialisables( unittest.TestCase ):
        
        self.assertEqual( obj.GetName(), dupe_obj.GetName() )
        
        self.assertEqual( obj._gallery_identifier, dupe_obj._gallery_identifier )
        self.assertEqual( obj._gallery_stream_identifiers, dupe_obj._gallery_stream_identifiers )
        self.assertEqual( obj._gug_key_and_name, dupe_obj._gug_key_and_name )
        self.assertEqual( len( obj._queries ), len( dupe_obj._queries ) )
        self.assertEqual( obj._initial_file_limit, dupe_obj._initial_file_limit )
        self.assertEqual( obj._periodic_file_limit, dupe_obj._periodic_file_limit )
@@ -490,8 +490,7 @@ class TestSerialisables( unittest.TestCase ):
        
        self._dump_and_load_and_test( sub, test )
        
        gallery_identifier = ClientDownloading.GalleryIdentifier( HC.SITE_TYPE_BOORU, 'gelbooru' )
        gallery_stream_identifiers = ClientDownloading.GetGalleryStreamIdentifiers( gallery_identifier )
        gug_key_and_name = ( HydrusData.GenerateKey(), 'muh test gug' )
        queries = [ ClientImportSubscriptions.SubscriptionQuery( 'test query' ), ClientImportSubscriptions.SubscriptionQuery( 'test query 2' ) ]
        checker_options = ClientImportOptions.CheckerOptions()
        initial_file_limit = 100
@@ -506,9 +505,9 @@ class TestSerialisables( unittest.TestCase ):
        
        no_work_until = HydrusData.GetNow() - 86400 * 20
        
        sub.SetTuple( gallery_identifier, gallery_stream_identifiers, queries, checker_options, initial_file_limit, periodic_file_limit, paused, file_import_options, tag_import_options, no_work_until )
        sub.SetTuple( gug_key_and_name, queries, checker_options, initial_file_limit, periodic_file_limit, paused, file_import_options, tag_import_options, no_work_until )
        
        self.assertEqual( sub.GetGalleryIdentifier(), gallery_identifier )
        self.assertEqual( sub.GetGUGKeyAndName(), gug_key_and_name )
        self.assertEqual( sub.GetTagImportOptions(), tag_import_options )
        self.assertEqual( sub.GetQueries(), queries )
After Width: | Height: | Size: 2.8 KiB |
After Width: | Height: | Size: 2.4 KiB |
After Width: | Height: | Size: 2.7 KiB |
After Width: | Height: | Size: 2.8 KiB |
Before Width: | Height: | Size: 2.5 KiB |
After Width: | Height: | Size: 2.5 KiB |
After Width: | Height: | Size: 2.5 KiB |
After Width: | Height: | Size: 2.2 KiB |