changelog
version 320
- clients should now have objects for all default downloaders. everything should be prepped for the big switchover:
- wrote gallery url generators for all the default downloaders and a couple more as well
- wrote a gallery parser for deviant art--it also comes with an update to the DA url class because the meta 'next page' link on DA gallery pages is invalid wew!
- wrote a gallery parser for hentai foundry, inkbunny, rule34hentai, moebooru (konachan, sakugabooru, yande.re), artstation, newgrounds, and pixiv artist galleries (static html)
- added a gallery parser for sankaku
- the artstation post url parser no longer fetches cover images
- url classes can now support 'default' values for path components and query parameters! so, if your url might be missing a page=1 initialisation value due to user drag-and-drop, you can auto-add it in the normalisation step!
- if the entered default does not match the rules of the component or parameter, it will be cleared back to none!
- all appropriate default gallery url classes (which is most) now have these default values. all default gallery url classes will be overwritten on db update
- three test 'search initialisation' url classes that attempted to fix this problem a different way will be deleted on update, if present
- updated some other url classes
- when checking source urls during the pre-download import status check, the client will now distrust parsed source urls if the files they seem to refer to also have other urls of the same url class as the file import object being actioned (basically, this is some logic that tries to detect bad source url attribution, where multiple files on a booru (typically including alternate edits) are all source-url'd back to a single original)
- gallery page parsing now discounts parsed 'next page' urls that are the same as the page that fetched them (some gallery end-points link themselves as the next page, wew)
- json parsing formulae that are set to parse all 'list' items will now also parse all dictionary entries if faced with a dict instead!
- added new stop-gap 'stop checking' logic in subscription syncing for certain low-gallery-count edge-cases
- fixed an issue where (typically new) subscriptions were bugging out trying to figure out a default stop_reason on certain page results
- fixed an unusual listctrl delete item index-tracking error that would sometimes cause exceptions on the 'try to link url stuff together' button press and maybe some other places
- thanks to a submission from user prkc on the discord, we now have 'import cookies.txt' buttons on the review sessions panels! if you are interested in 'manual' logins through browser-cookie-copying, please give this a go and let me know which kinds of cookies.txt do and do not work, and how your different site cookie-copy-login tests work in hydrus.
- the mappings cache tables now have some new indices that speed up certain kinds of tag search significantly. db update will spend a minute or two generating these indices for existing users
- advanced mode users will discover a fun new entry on the help menu
- the hyperlinks on the media viewer hover window and a couple of other places are now a custom control that uses any custom browser launch path in options->files and trash
- fixed an issue where certain canvas edge-case media clearing events could be caught incorrectly by the manage tags dialog and its subsidiary panels
- I think I fixed an issue where a client left with a dialog open could sometimes run into trouble later trying to show an idle time maintenance modal popup and give a 'C++ assertion IsRunning()' exception and end up locking the client's ui
- manage parsers dialog will now autosort after an add event
- the gug panels now normalise example urls
- improved some misc service error handling
- rewrote some url parsing to stop forcing '+'->' ' in our urls' query texts
- fixed some bad error handling for matplotlib import
- misc fixes
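The new url class 'default' values work roughly like this: during normalisation, any missing query parameter (or path component) that has a default gets filled in. This is a minimal sketch of the idea, not hydrus's actual code — `normalise_url` and the `defaults` mapping are hypothetical names:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def normalise_url(url, defaults):
    """Fill in missing query parameters with default values during normalisation."""
    parts = urlsplit(url)
    params = dict(parse_qsl(parts.query))
    for key, value in defaults.items():
        params.setdefault(key, value)  # only add the default if the param is missing
    return urlunsplit(parts._replace(query=urlencode(sorted(params.items()))))

# a dragged-and-dropped gallery url that is missing its page=1 initialisation value
print(normalise_url('https://example.com/index.php?tags=skirt', {'page': '1'}))
# https://example.com/index.php?page=1&tags=skirt
```

If the url already carries the parameter, the default is ignored, so normalisation never stomps on a real value.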
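The source url distrust check described above boils down to a simple test. This is an illustrative simplification under assumed names (`should_distrust_source_url` and the predicate standing in for a url class are hypothetical; hydrus's real url classes are richer objects):

```python
def looks_like_gelbooru_post(url):
    # hypothetical stand-in for a url class match test
    return 'gelbooru' in url and 's=view' in url

def should_distrust_source_url(import_url_class_test, known_urls_of_source_file):
    # if the file the parsed 'source' url refers to already carries another url
    # of the same url class as the import being actioned, the attribution is
    # suspect: alternate edits on a booru often all claim the same 'original'
    return any(import_url_class_test(u) for u in known_urls_of_source_file)

# importing from a gelbooru post whose parsed source file already has its own
# gelbooru post url -> probably a mis-attributed alternate, so distrust it
should_distrust_source_url(
    looks_like_gelbooru_post,
    ['https://gelbooru.com/index.php?page=post&s=view&id=123'])  # True
```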
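The 'all list items' json formula change can be sketched as: if the node turns out to be a dict instead of a list, walk its values rather than failing. A toy illustration (the function name is hypothetical):

```python
def parse_all_items(node):
    """Apply an 'all list items' json parsing rule to either a list or a dict."""
    if isinstance(node, dict):
        # new behaviour: a dict yields all of its entries' values
        return list(node.values())
    if isinstance(node, list):
        return list(node)
    return []  # scalars produce nothing

parse_all_items([1, 2, 3])          # [1, 2, 3]
parse_all_items({'a': 1, 'b': 2})   # [1, 2]
```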
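For the 'import cookies.txt' feature, browser addons typically export the Netscape cookies.txt format, which Python's standard library can parse directly. A small self-contained sketch of the general approach (not hydrus's actual import code):

```python
import http.cookiejar
import os
import tempfile

# a minimal Netscape-format cookies.txt, as exported by browser cookie addons:
# domain, include-subdomains flag, path, secure flag, expiry, name, value
COOKIES_TXT = '''# Netscape HTTP Cookie File
.example.com\tTRUE\t/\tFALSE\t2147483647\tsession_id\tabc123
'''

def load_cookies(path):
    jar = http.cookiejar.MozillaCookieJar()
    jar.load(path, ignore_discard=True, ignore_expires=True)
    return {c.name: c.value for c in jar}

with tempfile.NamedTemporaryFile('w', suffix='.txt', delete=False) as f:
    f.write(COOKIES_TXT)
    path = f.name

cookies = load_cookies(path)
os.unlink(path)
print(cookies)  # {'session_id': 'abc123'}
```

`MozillaCookieJar` rejects files missing the `# Netscape HTTP Cookie File` magic header, which is one reason different exporters' files may or may not work.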
version 319
- started the new convert-query-text-to-gallery-urls object. these objects, which I was thinking of calling 'Searchers', will be called the more specific and practical 'Gallery URL Generators', or GUGs for short
- the first version of GUGs is done, and I've written some test ui for advanced users under network->downloader definitions->manage gugs. this ui doesn't save anything yet, but lets you mess around with different values. if we don't think of anything else needed in the next week, I will fix this code for v320 and start filling in defaults
- watchers now have a checking slot, much like the recent change to galleries and subs. it safely throttles dozens of threads so they don't rudely hammer your (or the destination server's) CPU if they all happen to want to go at once (like just after your computer wakes up). the option is similarly under options->downloading
- moved the new gallery delay/token management code to the better-fit bandwidth manager (it was in domain manager before)
- the gallery delay/token code now works per-domain!
- moved the gallery delay/token checking code into the network job proper, simplifying a bunch of import-level code and making the text display now appear in the network job control. token consumption now occurs after bandwidth (it is now the last hoop to jump through, which reduces the chance of a pileup in unusual situations). I expect to soon add some kind of 'force-go' action to the cog menu
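The per-domain gallery delay/token behaviour is essentially one rate-limit slot per domain: many simultaneous watchers on the same domain take turns, while different domains proceed independently. A toy illustration of the idea, under assumed names (not hydrus's actual manager):

```python
import time
from collections import defaultdict

class GalleryTokenManager:
    """Hand out at most one gallery 'token' per domain every `delay` seconds."""

    def __init__(self, delay=10.0):
        self._delay = delay
        self._next_slot = defaultdict(float)  # domain -> earliest allowed time

    def try_to_consume_token(self, domain, now=None):
        now = time.time() if now is None else now
        if now >= self._next_slot[domain]:
            self._next_slot[domain] = now + self._delay  # claim the slot
            return True
        return False  # caller should wait and try again later

m = GalleryTokenManager(delay=10.0)
m.try_to_consume_token('example.com', now=0.0)  # True: slot was free
m.try_to_consume_token('example.com', now=5.0)  # False: example.com throttled
m.try_to_consume_token('other.com', now=5.0)    # True: per-domain, so independent
```

Consuming the token last, after bandwidth rules, means a job never claims a domain's slot and then stalls on something else.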
self._tag_repositories.GetPages(): + if new_media_singleton is not None: - page.SetMedia( self._current_media ) + self._current_media = ( new_media_singleton.Duplicate(), ) + + for page in self._tag_repositories.GetPages(): + + page.SetMedia( self._current_media ) + diff --git a/include/ClientGUIScrolledPanelsReview.py b/include/ClientGUIScrolledPanelsReview.py index be9a127c..fc031bd0 100644 --- a/include/ClientGUIScrolledPanelsReview.py +++ b/include/ClientGUIScrolledPanelsReview.py @@ -12,18 +12,21 @@ import ClientGUIScrolledPanels import ClientGUIScrolledPanelsEdit import ClientGUIPanels import ClientGUIPopupMessages +import ClientGUITags import ClientGUITime import ClientGUITopLevelWindows import ClientNetworking import ClientNetworkingContexts +import ClientNetworkingDomain import ClientPaths -import ClientGUITags +import ClientRendering import ClientTags import ClientThreading import collections import cookielib import HydrusConstants as HC import HydrusData +import HydrusExceptions import HydrusGlobals as HG import HydrusNATPunch import HydrusPaths @@ -42,7 +45,7 @@ try: MATPLOTLIB_OK = True -except ImportError: +except: MATPLOTLIB_OK = False @@ -1885,6 +1888,62 @@ class ReviewExportFilesPanel( ClientGUIScrolledPanels.ReviewPanel ): self._RefreshTags() +class ReviewHowBonedAmI( ClientGUIScrolledPanels.ReviewPanel ): + + def __init__( self, parent, stats ): + + ClientGUIScrolledPanels.ReviewPanel.__init__( self, parent ) + + ( num_inbox, num_archive, size_inbox, size_archive ) = stats + + vbox = wx.BoxSizer( wx.VERTICAL ) + + num_total = num_archive + num_inbox + size_total = size_archive + size_inbox + + if num_total < 1000: + + get_more = ClientGUICommon.BetterStaticText( self, label = 'I hope you enjoy my software. You might like to check out the downloaders!' 
) + + vbox.Add( get_more, CC.FLAGS_CENTER ) + + elif num_inbox <= num_archive / 100: + + hooray = ClientGUICommon.BetterStaticText( self, label = 'CONGRATULATIONS, YOU APPEAR TO BE UNBONED, BUT REMAIN EVER VIGILANT' ) + + vbox.Add( hooray, CC.FLAGS_CENTER ) + + else: + + boned_path = os.path.join( HC.STATIC_DIR, 'boned.jpg' ) + + boned_bmp = ClientRendering.GenerateHydrusBitmap( boned_path, HC.IMAGE_JPEG ).GetWxBitmap() + + win = ClientGUICommon.BufferedWindowIcon( self, boned_bmp ) + + vbox.Add( win, CC.FLAGS_CENTER ) + + + num_archive_percent = float( num_archive ) / num_total + size_archive_percent = float( size_archive ) / size_total + + num_inbox_percent = float( num_inbox ) / num_total + size_inbox_percent = float( size_inbox ) / size_total + + archive_label = 'Archive: ' + HydrusData.ToHumanInt( num_archive ) + ' files (' + ClientData.ConvertZoomToPercentage( num_archive_percent ) + '), totalling ' + HydrusData.ConvertIntToBytes( size_archive ) + '(' + ClientData.ConvertZoomToPercentage( size_archive_percent ) + ')' + + archive_st = ClientGUICommon.BetterStaticText( self, label = archive_label ) + + inbox_label = 'Inbox: ' + HydrusData.ToHumanInt( num_inbox ) + ' files (' + ClientData.ConvertZoomToPercentage( num_inbox_percent ) + '), totalling ' + HydrusData.ConvertIntToBytes( size_inbox ) + '(' + ClientData.ConvertZoomToPercentage( size_inbox_percent ) + ')' + + inbox_st = ClientGUICommon.BetterStaticText( self, label = inbox_label ) + + vbox.Add( archive_st, CC.FLAGS_CENTER ) + vbox.Add( inbox_st, CC.FLAGS_CENTER ) + + self.SetSizer( vbox ) + + class ReviewNetworkContextBandwidthPanel( ClientGUIScrolledPanels.ReviewPanel ): def __init__( self, parent, controller, network_context ): @@ -2226,6 +2285,7 @@ class ReviewNetworkSessionsPanel( ClientGUIScrolledPanels.ReviewPanel ): listctrl_panel.SetListCtrl( self._listctrl ) listctrl_panel.AddButton( 'create new', self._Add ) + listctrl_panel.AddButton( 'import cookies.txt', self._ImportCookiesTXT ) 
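
The `_ImportCookiesTXT` methods added in this diff lean on the stdlib Mozilla/Netscape cookie-jar loader. Below is a minimal, self-contained sketch of that load step. Note the hydrus code here runs on Python 2's `cookielib`; this sketch uses the equivalent Python 3 name, `http.cookiejar`, and the domain and cookie values are invented for illustration. Hydrus's sessions-overview version additionally maps each cookie's domain to a second-level-domain network context via `ConvertDomainIntoSecondLevelDomain`, which this sketch skips.

```python
import http.cookiejar  # Python 2: import cookielib
import os
import tempfile

# a hypothetical cookies.txt: the '# Netscape HTTP Cookie File' magic header is
# required by MozillaCookieJar, then tab-separated fields per cookie:
# domain, domain_flag, path, secure, expiry, name, value
COOKIES_TXT = (
    '# Netscape HTTP Cookie File\n'
    '.example.com\tTRUE\t/\tFALSE\t2147483647\tsession_id\tabc123\n'
)

def load_cookies_txt( path ):
    
    cj = http.cookiejar.MozillaCookieJar()
    
    # as in the diff: keep session cookies (ignore_discard) and
    # already-expired cookies (ignore_expires) rather than dropping them
    cj.load( path, ignore_discard = True, ignore_expires = True )
    
    return list( cj )

if __name__ == '__main__':
    
    path = os.path.join( tempfile.mkdtemp(), 'cookies.txt' )
    
    with open( path, 'w' ) as f:
        
        f.write( COOKIES_TXT )
        
    
    for cookie in load_cookies_txt( path ):
        
        print( cookie.domain, cookie.name, cookie.value )
        # -> .example.com session_id abc123
```

Each loaded `Cookie` object can then be pushed into a live session with `session.cookies.set_cookie( cookie )`, exactly as the two methods in the diff do.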
        listctrl_panel.AddButton( 'review', self._Review, enabled_only_on_selection = True )
        listctrl_panel.AddButton( 'clear', self._Clear, enabled_only_on_selection = True )
        listctrl_panel.AddSeparator()
@@ -2320,6 +2380,42 @@ class ReviewNetworkSessionsPanel( ClientGUIScrolledPanels.ReviewPanel ):
        return ( display_tuple, sort_tuple )
        
    
+    # this method is thanks to user prkc on the discord!
+    def _ImportCookiesTXT( self ):
+        
+        with wx.FileDialog( self, 'select cookies.txt', style = wx.FD_OPEN ) as f_dlg:
+            
+            if f_dlg.ShowModal() == wx.ID_OK:
+                
+                path = HydrusData.ToUnicode( f_dlg.GetPath() )
+                
+                cj = cookielib.MozillaCookieJar()
+                
+                cj.load( path, ignore_discard = True, ignore_expires = True )
+                
+                for cookie in cj:
+                    
+                    try:
+                        
+                        nc_domain = ClientNetworkingDomain.ConvertDomainIntoSecondLevelDomain( cookie.domain )
+                        
+                    except HydrusExceptions.URLMatchException:
+                        
+                        continue
+                        
+                    
+                    context = ClientNetworkingContexts.NetworkContext( CC.NETWORK_CONTEXT_DOMAIN, nc_domain )
+                    
+                    session = self._session_manager.GetSession( context )
+                    
+                    session.cookies.set_cookie( cookie )
+                    
+                
+            
+        
+        self._Update()
+        
    
    def _Review( self ):
        
        for network_context in self._listctrl.GetData( only_selected = True ):
@@ -2365,6 +2461,7 @@ class ReviewNetworkSessionPanel( ClientGUIScrolledPanels.ReviewPanel ):
        listctrl_panel.SetListCtrl( self._listctrl )
        
        listctrl_panel.AddButton( 'add', self._Add )
+        listctrl_panel.AddButton( 'import cookies.txt', self._ImportCookiesTXT )
        listctrl_panel.AddButton( 'edit', self._Edit, enabled_only_on_selection = True )
        listctrl_panel.AddButton( 'delete', self._Delete, enabled_only_on_selection = True )
        listctrl_panel.AddSeparator()
@@ -2502,6 +2599,29 @@ class ReviewNetworkSessionPanel( ClientGUIScrolledPanels.ReviewPanel ):
        self._Update()
        
    
+    # this method is thanks to user prkc on the discord!
+    def _ImportCookiesTXT( self ):
+        
+        with wx.FileDialog( self, 'select cookies.txt', style = wx.FD_OPEN ) as f_dlg:
+            
+            if f_dlg.ShowModal() == wx.ID_OK:
+                
+                path = HydrusData.ToUnicode( f_dlg.GetPath() )
+                
+                cj = cookielib.MozillaCookieJar()
+                
+                cj.load( path, ignore_discard = True, ignore_expires = True )
+                
+                for cookie in cj:
+                    
+                    self._session.cookies.set_cookie( cookie )
+                    
+                
+            
+        
+        self._Update()
+        
    
    def _SetCookie( self, name, value, domain, path, expires ):
        
        version = 0
diff --git a/include/ClientGUITagSuggestions.py b/include/ClientGUITagSuggestions.py
index 70505b7e..3e58b630 100644
--- a/include/ClientGUITagSuggestions.py
+++ b/include/ClientGUITagSuggestions.py
@@ -247,9 +247,12 @@ class RelatedTagsPanel( wx.Panel ):
        if canvas_key == self._canvas_key:
            
-            self._media = ( new_media_singleton.Duplicate(), )
-            
-            self._QuickSuggestedRelatedTags()
+            if new_media_singleton is not None:
+                
+                self._media = ( new_media_singleton.Duplicate(), )
+                
+                self._QuickSuggestedRelatedTags()
+                
@@ -372,9 +375,12 @@ class FileLookupScriptTagsPanel( wx.Panel ):
        if canvas_key == self._canvas_key:
            
-            self._media = ( new_media_singleton.Duplicate(), )
-            
-            self._SetTags( [] )
+            if new_media_singleton is not None:
+                
+                self._media = ( new_media_singleton.Duplicate(), )
+                
+                self._SetTags( [] )
+                
diff --git a/include/ClientImportFileSeeds.py b/include/ClientImportFileSeeds.py
index 2eab53ad..66ac0cbd 100644
--- a/include/ClientImportFileSeeds.py
+++ b/include/ClientImportFileSeeds.py
@@ -797,7 +797,51 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):
            if status != CC.STATUS_UNKNOWN:
                
-                break # if a known one-file url gives a single clear result, that result is reliable
+                # a known one-file url has given a single clear result. sounds good
+                
+                we_have_a_match = True
+                
+                if self.file_seed_type == FILE_SEED_TYPE_URL:
+                    
+                    # to double-check, let's see if the file that claims that url has any other interesting urls
+                    # if the file has another url with the same url class as ours, then this is prob an unreliable 'alternate' source url attribution, and untrustworthy
+                    
+                    my_url = self.file_seed_data
+                    
+                    if url != my_url:
+                        
+                        my_url_match = HG.client_controller.network_engine.domain_manager.GetURLMatch( my_url )
+                        
+                        ( media_result, ) = HG.client_controller.Read( 'media_results', ( hash, ) )
+                        
+                        this_files_urls = media_result.GetLocationsManager().GetURLs()
+                        
+                        for this_files_url in this_files_urls:
+                            
+                            if this_files_url != my_url:
+                                
+                                this_url_match = HG.client_controller.network_engine.domain_manager.GetURLMatch( this_files_url )
+                                
+                                if my_url_match == this_url_match:
+                                    
+                                    # oh no, the file this source url refers to has a different known url in this same domain
+                                    # it is more likely that an edit on this site points to the original elsewhere
+                                    
+                                    ( status, hash, note ) = UNKNOWN_DEFAULT
+                                    
+                                    we_have_a_match = False
+                                    
+                                    break
+                                    
+                                
+                            
+                        
+                    
+                
+                if we_have_a_match:
+                    
+                    break # if a known one-file url gives a single clear result, that result is reliable
+                    
diff --git a/include/ClientImportGallerySeeds.py b/include/ClientImportGallerySeeds.py
index 6c4b8029..09e7436d 100644
--- a/include/ClientImportGallerySeeds.py
+++ b/include/ClientImportGallerySeeds.py
@@ -271,6 +271,11 @@ class GallerySeed( HydrusSerialisable.SerialisableBase ):
            next_page_urls = ClientParsing.GetURLsFromParseResults( flattened_results, ( HC.URL_TYPE_NEXT, ), only_get_top_priority = True )
            
+            if self.url in next_page_urls:
+                
+                next_page_urls.remove( self.url )
+                
+            
            if len( next_page_urls ) > 0:
                
                next_page_generation_phrase = ' next gallery pages found'
diff --git a/include/ClientImportSubscriptions.py b/include/ClientImportSubscriptions.py
index 46f17464..ef2dcbf4 100644
--- a/include/ClientImportSubscriptions.py
+++ b/include/ClientImportSubscriptions.py
@@ -727,6 +727,7 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):
                num_urls_added = 0
                num_urls_already_in_file_seed_cache = 0
                can_add_more_file_urls = True
+                stop_reason = 'no known stop reason'
                
                for file_seed in file_seeds:
@@ -761,7 +762,7 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):
                    WE_HIT_OLD_GROUND_THRESHOLD = 5
                    
-                    if num_urls_already_in_file_seed_cache > WE_HIT_OLD_GROUND_THRESHOLD:
+                    if num_urls_already_in_file_seed_cache >= WE_HIT_OLD_GROUND_THRESHOLD:
                        
                        can_add_more_file_urls = False
@@ -809,6 +810,14 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):
                num_existing_urls_this_stream += num_urls_already_in_file_seed_cache
                
+                WE_HIT_OLD_GROUND_TOTAL_THRESHOLD = 15
+                
+                if num_existing_urls_this_stream >= WE_HIT_OLD_GROUND_TOTAL_THRESHOLD:
+                    
+                    keep_checking = False
+                    stop_reason = 'saw ' + HydrusData.ToHumanInt( WE_HIT_OLD_GROUND_TOTAL_THRESHOLD ) + ' previously seen urls in the whole sync, so assuming we caught up'
+                    
+                
                total_new_urls_for_this_sync += num_urls_added
diff --git a/include/ClientNetworkingDomain.py b/include/ClientNetworkingDomain.py
index c72c02ca..3e7097ac 100644
--- a/include/ClientNetworkingDomain.py
+++ b/include/ClientNetworkingDomain.py
@@ -108,7 +108,41 @@ def ConvertQueryDictToText( query_dict ):
 def ConvertQueryTextToDict( query_text ):
     
-    query_dict = dict( urlparse.parse_qsl( query_text ) )
+    query_dict = {}
+    
+    pairs = query_text.split( '&' )
+    
+    for pair in pairs:
+        
+        result = pair.split( '=', 1 )
+        
+        # for the moment, ignore tracker bugs and so on that have only key and no value
+        
+        if len( result ) == 2:
+            
+            ( key, value ) = result
+            
+            try:
+                
+                key = urllib.unquote( key )
+                
+            except:
+                
+                pass
+                
+            
+            try:
+                
+                value = urllib.unquote( value )
+                
+            except:
+                
+                pass
+                
+            
+            query_dict[ key ] = value
+            
+        
+    
     return query_dict
@@ -270,7 +304,7 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):
    SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_NETWORK_DOMAIN_MANAGER
    SERIALISABLE_NAME = 'Domain Manager'
-    SERIALISABLE_VERSION = 4
+    SERIALISABLE_VERSION = 5
    
    def __init__( self ):
@@ -432,6 +466,8 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):
    def _GetSerialisableInfo( self ):
        
+        serialisable_gugs = self._gugs.GetSerialisableTuple()
+        
        serialisable_url_matches = self._url_matches.GetSerialisableTuple()
        serialisable_url_match_keys_to_display = [ url_match_key.encode( 'hex' ) for url_match_key in self._url_match_keys_to_display ]
        serialisable_url_match_keys_to_parser_keys = self._url_match_keys_to_parser_keys.GetSerialisableTuple()
@@ -445,7 +481,7 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):
        serialisable_parsers = self._parsers.GetSerialisableTuple()
        serialisable_network_contexts_to_custom_header_dicts = [ ( network_context.GetSerialisableTuple(), custom_header_dict.items() ) for ( network_context, custom_header_dict ) in self._network_contexts_to_custom_header_dicts.items() ]
        
-        return ( serialisable_url_matches, serialisable_url_match_keys_to_display, serialisable_url_match_keys_to_parser_keys, serialisable_default_tag_import_options_tuple, serialisable_parsers, serialisable_network_contexts_to_custom_header_dicts )
+        return ( serialisable_gugs, serialisable_url_matches, serialisable_url_match_keys_to_display, serialisable_url_match_keys_to_parser_keys, serialisable_default_tag_import_options_tuple, serialisable_parsers, serialisable_network_contexts_to_custom_header_dicts )
        
    
    def _GetURLMatch( self, url ):
@@ -476,7 +512,9 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):
    def _InitialiseFromSerialisableInfo( self, serialisable_info ):
        
-        ( serialisable_url_matches, serialisable_url_match_keys_to_display, serialisable_url_match_keys_to_parser_keys, serialisable_default_tag_import_options_tuple, serialisable_parsers, serialisable_network_contexts_to_custom_header_dicts ) = serialisable_info
+        ( serialisable_gugs, serialisable_url_matches, serialisable_url_match_keys_to_display, serialisable_url_match_keys_to_parser_keys, serialisable_default_tag_import_options_tuple, serialisable_parsers, serialisable_network_contexts_to_custom_header_dicts ) = serialisable_info
+        
+        self._gugs = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_gugs )
        
        self._url_matches = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_url_matches )
@@ -645,6 +683,19 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):
            return ( 4, new_serialisable_info )
            
        
+        if version == 4:
+            
+            ( serialisable_url_matches, serialisable_url_match_keys_to_display, serialisable_url_match_keys_to_parser_keys, serialisable_default_tag_import_options_tuple, serialisable_parsing_parsers, serialisable_network_contexts_to_custom_header_dicts ) = old_serialisable_info
+            
+            gugs = HydrusSerialisable.SerialisableList()
+            
+            serialisable_gugs = gugs.GetSerialisableTuple()
+            
+            new_serialisable_info = ( serialisable_gugs, serialisable_url_matches, serialisable_url_match_keys_to_display, serialisable_url_match_keys_to_parser_keys, serialisable_default_tag_import_options_tuple, serialisable_parsing_parsers, serialisable_network_contexts_to_custom_header_dicts )
+            
+            return ( 5, new_serialisable_info )
+            
+        
    
    def CanValidateInPopup( self, network_contexts ):
@@ -1044,7 +1095,9 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):
            #check ngugs maybe
            
-            self._gugs = gugs
+            self._gugs = HydrusSerialisable.SerialisableList( gugs )
+            
+            self._SetDirty()
@@ -1628,7 +1681,7 @@ class URLMatch( HydrusSerialisable.SerialisableBaseNamed ):
    SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_URL_MATCH
    SERIALISABLE_NAME = 'URL Class'
-    SERIALISABLE_VERSION = 5
+    SERIALISABLE_VERSION = 6
    
    def __init__( self, name, url_match_key = None, url_type = None, preferred_scheme = 'https', netloc = 'hostname.com', match_subdomains = False, keep_matched_subdomains = False, path_components = None, parameters = None, api_lookup_converter = None, can_produce_multiple_files = False, should_be_associated_with_files = True, gallery_index_type = None, gallery_index_identifier = None, gallery_index_delta = 1, example_url = 'https://hostname.com/post/page.php?id=123456&s=view' ):
@@ -1644,18 +1697,19 @@ class URLMatch( HydrusSerialisable.SerialisableBaseNamed ):
        if path_components is None:
            
-            path_components = HydrusSerialisable.SerialisableList()
+            path_components = []
            
-            path_components.append( ClientParsing.StringMatch( match_type = ClientParsing.STRING_MATCH_FIXED, match_value = 'post', example_string = 'post' ) )
-            path_components.append( ClientParsing.StringMatch( match_type = ClientParsing.STRING_MATCH_FIXED, match_value = 'page.php', example_string = 'page.php' ) )
+            path_components.append( ( ClientParsing.StringMatch( match_type = ClientParsing.STRING_MATCH_FIXED, match_value = 'post', example_string = 'post' ), None ) )
+            path_components.append( ( ClientParsing.StringMatch( match_type = ClientParsing.STRING_MATCH_FIXED, match_value = 'page.php', example_string = 'page.php' ), None ) )
            
        
        if parameters is None:
            
-            parameters = HydrusSerialisable.SerialisableDictionary()
+            parameters = {}
            
-            parameters[ 's' ] = ClientParsing.StringMatch( match_type = ClientParsing.STRING_MATCH_FIXED, match_value = 'view', example_string = 'view' )
-            parameters[ 'id' ] = ClientParsing.StringMatch( match_type = ClientParsing.STRING_MATCH_FLEXIBLE, match_value = ClientParsing.NUMERIC, example_string = '123456' )
+            parameters[ 's' ] = ( ClientParsing.StringMatch( match_type = ClientParsing.STRING_MATCH_FIXED, match_value = 'view', example_string = 'view' ), None )
+            parameters[ 'id' ] = ( ClientParsing.StringMatch( match_type = ClientParsing.STRING_MATCH_FLEXIBLE, match_value = ClientParsing.NUMERIC, example_string = '123456' ), None )
+            parameters[ 'page' ] = ( ClientParsing.StringMatch( match_type = ClientParsing.STRING_MATCH_FLEXIBLE, match_value = ClientParsing.NUMERIC, example_string = '1' ), 1 )
        
        if api_lookup_converter is None:
@@ -1712,7 +1766,7 @@ class URLMatch( HydrusSerialisable.SerialisableBaseNamed ):
        return netloc
        
    
-    def _ClipPath( self, path ):
+    def _ClipAndFleshOutPath( self, path, allow_clip = True ):
        
        # /post/show/1326143/akunim-anthro-armband-armwear-clothed-clothing-fem
@@ -1725,7 +1779,30 @@ class URLMatch( HydrusSerialisable.SerialisableBaseNamed ):
        path_components = path.split( '/' )
        
-        path = '/'.join( path_components[ : len( self._path_components ) ] )
+        if allow_clip or len( path_components ) < len( self._path_components ):
+            
+            clipped_path_components = []
+            
+            for ( index, ( string_match, default ) ) in enumerate( self._path_components ):
+                
+                if len( path_components ) > index: # the given path has the value
+                    
+                    clipped_path_component = path_components[ index ]
+                    
+                elif default is not None:
+                    
+                    clipped_path_component = default
+                    
+                else:
+                    
+                    raise HydrusExceptions.URLMatchException( 'Could not clip path--given url appeared to be too short!' )
+                    
+                
+                clipped_path_components.append( clipped_path_component )
+                
+            
+            path = '/'.join( clipped_path_components )
+            
        
        # post/show/1326143
@@ -1739,13 +1816,31 @@ class URLMatch( HydrusSerialisable.SerialisableBaseNamed ):
        return path
        
    
-    def _ClipQuery( self, query ):
+    def _ClipAndFleshOutQuery( self, query, allow_clip = True ):
        
        query_dict = ConvertQueryTextToDict( query )
        
-        valid_parameters = { key : value for ( key, value ) in query_dict.items() if key in self._parameters }
+        if allow_clip:
+            
+            query_dict = { key : value for ( key, value ) in query_dict.items() if key in self._parameters }
+            
        
-        query = ConvertQueryDictToText( valid_parameters )
+        for ( key, ( string_match, default ) ) in self._parameters.items():
+            
+            if key not in query_dict:
+                
+                if default is None:
+                    
+                    raise HydrusExceptions.URLMatchException( 'Could not flesh out query--no default for ' + key + ' defined!' )
+                    
+                else:
+                    
+                    query_dict[ key ] = default
+                    
+                
+            
+        
+        query = ConvertQueryDictToText( query_dict )
        
        return query
@@ -1753,8 +1848,8 @@ class URLMatch( HydrusSerialisable.SerialisableBaseNamed ):
    def _GetSerialisableInfo( self ):
        
        serialisable_url_match_key = self._url_match_key.encode( 'hex' )
-        serialisable_path_components = self._path_components.GetSerialisableTuple()
-        serialisable_parameters = self._parameters.GetSerialisableTuple()
+        serialisable_path_components = [ ( string_match.GetSerialisableTuple(), default ) for ( string_match, default ) in self._path_components ]
+        serialisable_parameters = [ ( key, ( string_match.GetSerialisableTuple(), default ) ) for ( key, ( string_match, default ) ) in self._parameters.items() ]
        serialisable_api_lookup_converter = self._api_lookup_converter.GetSerialisableTuple()
        
        return ( serialisable_url_match_key, self._url_type, self._preferred_scheme, self._netloc, self._match_subdomains, self._keep_matched_subdomains, serialisable_path_components, serialisable_parameters, serialisable_api_lookup_converter, self._can_produce_multiple_files, self._should_be_associated_with_files, self._gallery_index_type, self._gallery_index_identifier, self._gallery_index_delta, self._example_url )
@@ -1765,8 +1860,8 @@ class URLMatch( HydrusSerialisable.SerialisableBaseNamed ):
        ( serialisable_url_match_key, self._url_type, self._preferred_scheme, self._netloc, self._match_subdomains, self._keep_matched_subdomains, serialisable_path_components, serialisable_parameters, serialisable_api_lookup_converter, self._can_produce_multiple_files, self._should_be_associated_with_files, self._gallery_index_type, self._gallery_index_identifier, self._gallery_index_delta, self._example_url ) = serialisable_info
        
        self._url_match_key = serialisable_url_match_key.decode( 'hex' )
-        self._path_components = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_path_components )
-        self._parameters = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_parameters )
+        self._path_components = [ ( HydrusSerialisable.CreateFromSerialisableTuple( serialisable_string_match ), default ) for ( serialisable_string_match, default ) in serialisable_path_components ]
+        self._parameters = { key : ( HydrusSerialisable.CreateFromSerialisableTuple( serialisable_string_match ), default ) for ( key, ( serialisable_string_match, default ) ) in serialisable_parameters }
        self._api_lookup_converter = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_api_lookup_converter )
@@ -1831,6 +1926,24 @@ class URLMatch( HydrusSerialisable.SerialisableBaseNamed ):
            return ( 5, new_serialisable_info )
            
        
+        if version == 5:
+            
+            ( serialisable_url_match_key, url_type, preferred_scheme, netloc, match_subdomains, keep_matched_subdomains, serialisable_path_components, serialisable_parameters, serialisable_api_lookup_converter, can_produce_multiple_files, should_be_associated_with_files, gallery_index_type, gallery_index_identifier, gallery_index_delta, example_url ) = old_serialisable_info
+            
+            path_components = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_path_components )
+            parameters = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_parameters )
+            
+            path_components = [ ( value, None ) for value in path_components ]
+            parameters = { key : ( value, None ) for ( key, value ) in parameters.items() }
+            
+            serialisable_path_components = [ ( string_match.GetSerialisableTuple(), default ) for ( string_match, default ) in path_components ]
+            serialisable_parameters = [ ( key, ( string_match.GetSerialisableTuple(), default ) ) for ( key, ( string_match, default ) ) in parameters.items() ]
+            
+            new_serialisable_info = ( serialisable_url_match_key, url_type, preferred_scheme, netloc, match_subdomains, keep_matched_subdomains, serialisable_path_components, serialisable_parameters, serialisable_api_lookup_converter, can_produce_multiple_files, should_be_associated_with_files, gallery_index_type, gallery_index_identifier, gallery_index_delta, example_url )
+            
+            return ( 6, new_serialisable_info )
+            
+        
    
    def CanGenerateNextGalleryPage( self ):
@@ -1854,6 +1967,11 @@ class URLMatch( HydrusSerialisable.SerialisableBaseNamed ):
        return is_a_gallery_page or is_a_multipost_post_page
        
    
+    def ClippingIsAppropriate( self ):
+        
+        return self._should_be_associated_with_files or self.UsesAPIURL()
+        
    
    def GetAPIURL( self, url = None ):
        
        if url is None:
@@ -1888,6 +2006,8 @@ class URLMatch( HydrusSerialisable.SerialisableBaseNamed ):
    def GetNextGalleryPage( self, url ):
        
+        url = self.Normalise( url )
+        
        p = urlparse.urlparse( url )
        
        scheme = p.scheme
@@ -2005,11 +2125,6 @@ class URLMatch( HydrusSerialisable.SerialisableBaseNamed ):
        
    
-    def NormalisationIsAppropriate( self ):
-        
-        return self._should_be_associated_with_files or self.UsesAPIURL()
-        
    
    def Normalise( self, url ):
        
        p = urlparse.urlparse( url )
@@ -2018,19 +2133,17 @@ class URLMatch( HydrusSerialisable.SerialisableBaseNamed ):
        params = ''
        fragment = ''
        
-        # gallery urls we don't want to clip stuff, but we do want to flip to https
-        
-        if self.NormalisationIsAppropriate():
+        if self.ClippingIsAppropriate():
            
            netloc = self._ClipNetLoc( p.netloc )
-            path = self._ClipPath( p.path )
-            query = self._ClipQuery( p.query )
+            path = self._ClipAndFleshOutPath( p.path )
+            query = self._ClipAndFleshOutQuery( p.query )
            
        else:
            
            netloc = p.netloc
-            path = p.path
-            query = AlphabetiseQueryText( p.query )
+            path = self._ClipAndFleshOutPath( p.path, allow_clip = False )
+            query = self._ClipAndFleshOutQuery( p.query, allow_clip = False )
            
        
        r = urlparse.ParseResult( scheme, netloc, path, params, query, fragment )
@@ -2085,35 +2198,41 @@ class URLMatch( HydrusSerialisable.SerialisableBaseNamed ):
        url_path_components = url_path.split( '/' )
        
-        if len( url_path_components ) < len( self._path_components ):
+        for ( index, ( string_match, default ) ) in enumerate( self._path_components ):
            
-            raise HydrusExceptions.URLMatchException( url_path + ' did not have ' + str( len( self._path_components ) ) + ' components' )
-            
-        
-        for ( url_path_component, expected_path_component ) in zip( url_path_components, self._path_components ):
-            
-            try:
+            if len( url_path_components ) > index:
                
-                expected_path_component.Test( url_path_component )
+                url_path_component = url_path_components[ index ]
                
-            except HydrusExceptions.StringMatchException as e:
+                try:
+                    
+                    string_match.Test( url_path_component )
+                    
+                except HydrusExceptions.StringMatchException as e:
+                    
+                    raise HydrusExceptions.URLMatchException( HydrusData.ToUnicode( e ) )
+                    
                
-                raise HydrusExceptions.URLMatchException( HydrusData.ToUnicode( e ) )
+            elif default is None:
+                
+                raise HydrusExceptions.URLMatchException( url_path + ' did not have enough of the required path components!' )
            
        
        url_parameters = ConvertQueryTextToDict( p.query )
        
-        if len( url_parameters ) < len( self._parameters ):
-            
-            raise HydrusExceptions.URLMatchException( p.query + ' did not have ' + str( len( self._parameters ) ) + ' parameters' )
-            
-        
-        for ( key, string_match ) in self._parameters.items():
+        for ( key, ( string_match, default ) ) in self._parameters.items():
            
            if key not in url_parameters:
                
-                raise HydrusExceptions.URLMatchException( key + ' not found in ' + p.query )
+                if default is None:
+                    
+                    raise HydrusExceptions.URLMatchException( key + ' not found in ' + p.query )
+                    
+                else:
+                    
+                    continue
+                    
                
            
            value = url_parameters[ key ]
diff --git a/include/ClientParsing.py b/include/ClientParsing.py
index 24de79b9..48f102d5 100644
--- a/include/ClientParsing.py
+++ b/include/ClientParsing.py
@@ -1355,13 +1355,26 @@ class ParseFormulaJSON( ParseFormula ):
            if parse_rule is None:
                
-                if not isinstance( root, list ):
+                if isinstance( root, list ):
+                    
+                    next_roots.extend( root )
+                    
+                elif isinstance( root, dict ):
+                    
+                    pairs = list( root.items() )
+                    
+                    pairs.sort()
+                    
+                    for ( key, value ) in pairs:
+                        
+                        next_roots.append( value )
+                        
+                    
+                else:
                    
                    continue
                    
                
-                next_roots.extend( root )
-                
            elif isinstance( parse_rule, int ):
                
                if not isinstance( root, list ):
diff --git a/include/HydrusConstants.py b/include/HydrusConstants.py
index 6c1053ec..e92890fd 100755
--- a/include/HydrusConstants.py
+++ b/include/HydrusConstants.py
@@ -49,7 +49,7 @@ options = {}
 # Misc
 NETWORK_VERSION = 18
-SOFTWARE_VERSION = 319
+SOFTWARE_VERSION = 320
 UNSCALED_THUMBNAIL_DIMENSIONS = ( 200, 200 )
diff --git a/static/boned.jpg b/static/boned.jpg
new file mode 100644
index 00000000..d3036dae
Binary files /dev/null and b/static/boned.jpg differ
diff --git a/static/default/gugs/artstation artist lookup.png b/static/default/gugs/artstation artist lookup.png
new file mode 100644
index 00000000..03ef1459
Binary files /dev/null and b/static/default/gugs/artstation artist lookup.png differ
diff --git a/static/default/gugs/danbooru tag search.png b/static/default/gugs/danbooru tag search.png
new file mode 100644
index 00000000..95c02d28
Binary files /dev/null and b/static/default/gugs/danbooru tag search.png differ
diff --git a/static/default/gugs/deviant art artist lookup.png b/static/default/gugs/deviant art artist lookup.png
new file mode 100644
index 00000000..0942bb39
Binary files /dev/null and b/static/default/gugs/deviant art artist lookup.png differ
diff --git a/static/default/gugs/e621 tag search.png b/static/default/gugs/e621 tag search.png
new file mode 100644
index 00000000..154c83e7
Binary files /dev/null and b/static/default/gugs/e621 tag search.png differ
diff --git a/static/default/gugs/furry.booru.org tag search.png b/static/default/gugs/furry.booru.org tag search.png
new file mode 100644
index 00000000..afb2fb03
Binary files /dev/null and b/static/default/gugs/furry.booru.org tag search.png differ
diff --git a/static/default/gugs/gelbooru tag search.png b/static/default/gugs/gelbooru tag search.png
new file mode 100644
index 00000000..30adf9a8
Binary files /dev/null and b/static/default/gugs/gelbooru tag search.png differ
diff --git a/static/default/gugs/hentai foundry artist scraps lookup.png b/static/default/gugs/hentai foundry artist scraps lookup.png
new file mode 100644
index 00000000..9f1d3e00
Binary files /dev/null and b/static/default/gugs/hentai foundry artist scraps lookup.png differ
diff --git a/static/default/gugs/hentai foundry artist works lookup.png b/static/default/gugs/hentai foundry artist works lookup.png
new file mode 100644
index 00000000..cd052f0b
Binary files /dev/null and b/static/default/gugs/hentai foundry artist works lookup.png differ
diff --git a/static/default/gugs/hentai foundry tag search.png b/static/default/gugs/hentai foundry tag search.png
new file mode 100644
index 00000000..1bfde904
Binary files /dev/null and b/static/default/gugs/hentai foundry tag search.png differ
diff --git a/static/default/gugs/inkbunny artist lookup.png b/static/default/gugs/inkbunny artist lookup.png
new file mode 100644
index 00000000..a21aaf2a
Binary files /dev/null and b/static/default/gugs/inkbunny artist lookup.png differ
diff --git a/static/default/gugs/inkbunny tag search.png b/static/default/gugs/inkbunny tag search.png
new file mode 100644
index 00000000..9d62399e
Binary files /dev/null and b/static/default/gugs/inkbunny tag search.png differ
diff --git a/static/default/gugs/konachan tag search.png b/static/default/gugs/konachan tag search.png
new file mode 100644
index 00000000..460fd64a
Binary files /dev/null and b/static/default/gugs/konachan tag search.png differ
diff --git a/static/default/gugs/mishimmie tag search.png b/static/default/gugs/mishimmie tag search.png
new file mode 100644
index 00000000..3ac4228d
Binary files /dev/null and b/static/default/gugs/mishimmie tag search.png differ
diff --git a/static/default/gugs/newgrounds artist games lookup.png b/static/default/gugs/newgrounds artist games lookup.png
new file mode 100644
index 00000000..baaadd36
Binary files /dev/null and b/static/default/gugs/newgrounds artist games lookup.png differ
diff --git a/static/default/gugs/newgrounds artist movies lookup.png b/static/default/gugs/newgrounds artist movies lookup.png
new file mode 100644
index 00000000..fdabbc14
Binary files /dev/null and b/static/default/gugs/newgrounds artist movies lookup.png differ
diff --git a/static/default/gugs/pixiv artist illust lookup.png b/static/default/gugs/pixiv artist illust lookup.png
new file mode 100644
index 00000000..af8a9ba7
Binary files /dev/null and b/static/default/gugs/pixiv artist illust lookup.png differ
diff --git a/static/default/gugs/pixiv artist lookup.png b/static/default/gugs/pixiv artist lookup.png
new file mode 100644
index 00000000..b926292c
Binary files /dev/null and b/static/default/gugs/pixiv artist lookup.png differ
diff --git a/static/default/gugs/pixiv artist manga lookup.png b/static/default/gugs/pixiv artist manga lookup.png
new file mode 100644
index 00000000..8d572880
Binary files /dev/null and b/static/default/gugs/pixiv artist manga lookup.png differ
diff --git a/static/default/gugs/pixiv artist ugoira lookup.png b/static/default/gugs/pixiv artist ugoira lookup.png
new file mode 100644
index 00000000..1036ed35
Binary files /dev/null and b/static/default/gugs/pixiv artist ugoira lookup.png differ
diff --git a/static/default/gugs/rule.34.paheal tag search.png b/static/default/gugs/rule.34.paheal tag search.png
new file mode 100644
index 00000000..928163db
Binary files /dev/null and b/static/default/gugs/rule.34.paheal tag search.png differ
diff --git a/static/default/gugs/rule34.xxx tag search.png b/static/default/gugs/rule34.xxx tag search.png
new file mode 100644
index 00000000..eaf4f04d
Binary files /dev/null and b/static/default/gugs/rule34.xxx tag search.png differ
diff --git a/static/default/gugs/rule34hentai tag search.png b/static/default/gugs/rule34hentai tag search.png
new file mode 100644
index 00000000..bc77b6d4
Binary files /dev/null and b/static/default/gugs/rule34hentai tag search.png differ
diff --git a/static/default/gugs/safebooru tag search.png b/static/default/gugs/safebooru tag search.png
new file mode 100644
index
00000000..bdd26e3f Binary files /dev/null and b/static/default/gugs/safebooru tag search.png differ diff --git a/static/default/gugs/sakugabooru tag search.png b/static/default/gugs/sakugabooru tag search.png new file mode 100644 index 00000000..419a5c10 Binary files /dev/null and b/static/default/gugs/sakugabooru tag search.png differ diff --git a/static/default/gugs/sankaku channel tag search.png b/static/default/gugs/sankaku channel tag search.png new file mode 100644 index 00000000..0f2e8bf2 Binary files /dev/null and b/static/default/gugs/sankaku channel tag search.png differ diff --git a/static/default/gugs/sankaku idol tag search.png b/static/default/gugs/sankaku idol tag search.png new file mode 100644 index 00000000..cd6fc753 Binary files /dev/null and b/static/default/gugs/sankaku idol tag search.png differ diff --git a/static/default/gugs/tbib tag search.png b/static/default/gugs/tbib tag search.png new file mode 100644 index 00000000..af55e599 Binary files /dev/null and b/static/default/gugs/tbib tag search.png differ diff --git a/static/default/gugs/tumblr username lookup.png b/static/default/gugs/tumblr username lookup.png new file mode 100644 index 00000000..364e1b94 Binary files /dev/null and b/static/default/gugs/tumblr username lookup.png differ diff --git a/static/default/gugs/xbooru tag search.png b/static/default/gugs/xbooru tag search.png new file mode 100644 index 00000000..92e35837 Binary files /dev/null and b/static/default/gugs/xbooru tag search.png differ diff --git a/static/default/gugs/yande.re tag search.png b/static/default/gugs/yande.re tag search.png new file mode 100644 index 00000000..4ff4a929 Binary files /dev/null and b/static/default/gugs/yande.re tag search.png differ diff --git a/static/default/parsers/artstation file page api parser.png b/static/default/parsers/artstation file page api parser.png index 5878fdd6..6391a5df 100644 Binary files a/static/default/parsers/artstation file page api parser.png and 
b/static/default/parsers/artstation file page api parser.png differ diff --git a/static/default/parsers/artstation gallery page api parser.png b/static/default/parsers/artstation gallery page api parser.png new file mode 100644 index 00000000..b79eec8f Binary files /dev/null and b/static/default/parsers/artstation gallery page api parser.png differ diff --git a/static/default/parsers/deviant art gallery page parser.png b/static/default/parsers/deviant art gallery page parser.png new file mode 100644 index 00000000..d33c5a02 Binary files /dev/null and b/static/default/parsers/deviant art gallery page parser.png differ diff --git a/static/default/parsers/hentai foundry gallery page parser.png b/static/default/parsers/hentai foundry gallery page parser.png new file mode 100644 index 00000000..ba8f52da Binary files /dev/null and b/static/default/parsers/hentai foundry gallery page parser.png differ diff --git a/static/default/parsers/inkbunny gallery page parser.png b/static/default/parsers/inkbunny gallery page parser.png new file mode 100644 index 00000000..717eb8e6 Binary files /dev/null and b/static/default/parsers/inkbunny gallery page parser.png differ diff --git a/static/default/parsers/moebooru gallery page parser.png b/static/default/parsers/moebooru gallery page parser.png new file mode 100644 index 00000000..8f8652cb Binary files /dev/null and b/static/default/parsers/moebooru gallery page parser.png differ diff --git a/static/default/parsers/newgrounds gallery page parser.png b/static/default/parsers/newgrounds gallery page parser.png new file mode 100644 index 00000000..49ae630f Binary files /dev/null and b/static/default/parsers/newgrounds gallery page parser.png differ diff --git a/static/default/parsers/pixiv static html gallery page parser.png b/static/default/parsers/pixiv static html gallery page parser.png new file mode 100644 index 00000000..7bd2499f Binary files /dev/null and b/static/default/parsers/pixiv static html gallery page parser.png 
differ diff --git a/static/default/parsers/rule34hentai gallery page parser.png b/static/default/parsers/rule34hentai gallery page parser.png new file mode 100644 index 00000000..d0d2b364 Binary files /dev/null and b/static/default/parsers/rule34hentai gallery page parser.png differ diff --git a/static/default/parsers/sankaku gallery page parser.png b/static/default/parsers/sankaku gallery page parser.png new file mode 100644 index 00000000..eb0701b1 Binary files /dev/null and b/static/default/parsers/sankaku gallery page parser.png differ diff --git a/static/default/url_classes/artstation artist gallery page api.png b/static/default/url_classes/artstation artist gallery page api.png index e47c5eae..e1c2b894 100644 Binary files a/static/default/url_classes/artstation artist gallery page api.png and b/static/default/url_classes/artstation artist gallery page api.png differ diff --git a/static/default/url_classes/artstation artist gallery page.png b/static/default/url_classes/artstation artist gallery page.png index c7f64629..791d3c1d 100644 Binary files a/static/default/url_classes/artstation artist gallery page.png and b/static/default/url_classes/artstation artist gallery page.png differ diff --git a/static/default/url_classes/danbooru gallery page.png b/static/default/url_classes/danbooru gallery page.png index bd2f1c9e..cf0d938d 100644 Binary files a/static/default/url_classes/danbooru gallery page.png and b/static/default/url_classes/danbooru gallery page.png differ diff --git a/static/default/url_classes/deviant art artist gallery page - search initialisation.png b/static/default/url_classes/deviant art artist gallery page - search initialisation.png deleted file mode 100644 index 1195b2f2..00000000 Binary files a/static/default/url_classes/deviant art artist gallery page - search initialisation.png and /dev/null differ diff --git a/static/default/url_classes/deviant art artist gallery page.png b/static/default/url_classes/deviant art artist gallery page.png 
index 796c931b..058cdbe1 100644 Binary files a/static/default/url_classes/deviant art artist gallery page.png and b/static/default/url_classes/deviant art artist gallery page.png differ diff --git a/static/default/url_classes/e621 gallery page - search initialisation.png b/static/default/url_classes/e621 gallery page - search initialisation.png deleted file mode 100644 index 63f4385f..00000000 Binary files a/static/default/url_classes/e621 gallery page - search initialisation.png and /dev/null differ diff --git a/static/default/url_classes/e621 gallery page.png b/static/default/url_classes/e621 gallery page.png index 74445dc1..e50875a5 100644 Binary files a/static/default/url_classes/e621 gallery page.png and b/static/default/url_classes/e621 gallery page.png differ diff --git a/static/default/url_classes/furry.booru.org gallery page.png b/static/default/url_classes/furry.booru.org gallery page.png index f2e72083..ac08491a 100644 Binary files a/static/default/url_classes/furry.booru.org gallery page.png and b/static/default/url_classes/furry.booru.org gallery page.png differ diff --git a/static/default/url_classes/gelbooru gallery page - search initialisation.png b/static/default/url_classes/gelbooru gallery page - search initialisation.png deleted file mode 100644 index c0472ae7..00000000 Binary files a/static/default/url_classes/gelbooru gallery page - search initialisation.png and /dev/null differ diff --git a/static/default/url_classes/gelbooru gallery page.png b/static/default/url_classes/gelbooru gallery page.png index 9476d909..0ee952b9 100644 Binary files a/static/default/url_classes/gelbooru gallery page.png and b/static/default/url_classes/gelbooru gallery page.png differ diff --git a/static/default/url_classes/hentai foundry artist pictures gallery page.png b/static/default/url_classes/hentai foundry artist pictures gallery page.png index 5cbb1641..065529ec 100644 Binary files a/static/default/url_classes/hentai foundry artist pictures gallery page.png 
and b/static/default/url_classes/hentai foundry artist pictures gallery page.png differ diff --git a/static/default/url_classes/hentai foundry artist scraps gallery page.png b/static/default/url_classes/hentai foundry artist scraps gallery page.png index dacd459f..45124bd4 100644 Binary files a/static/default/url_classes/hentai foundry artist scraps gallery page.png and b/static/default/url_classes/hentai foundry artist scraps gallery page.png differ diff --git a/static/default/url_classes/hentai foundry tag search gallery page.png b/static/default/url_classes/hentai foundry tag search gallery page.png index 2ea3e7ce..2d8802dc 100644 Binary files a/static/default/url_classes/hentai foundry tag search gallery page.png and b/static/default/url_classes/hentai foundry tag search gallery page.png differ diff --git a/static/default/url_classes/inkbunny artist gallery page.png b/static/default/url_classes/inkbunny artist gallery page.png new file mode 100644 index 00000000..81f8c201 Binary files /dev/null and b/static/default/url_classes/inkbunny artist gallery page.png differ diff --git a/static/default/url_classes/inkbunny tag search gallery page.png b/static/default/url_classes/inkbunny tag search gallery page.png new file mode 100644 index 00000000..b93183e6 Binary files /dev/null and b/static/default/url_classes/inkbunny tag search gallery page.png differ diff --git a/static/default/url_classes/konachan gallery page.png b/static/default/url_classes/konachan gallery page.png index 01b0e173..e40623e4 100644 Binary files a/static/default/url_classes/konachan gallery page.png and b/static/default/url_classes/konachan gallery page.png differ diff --git a/static/default/url_classes/mishimmie gallery page.png b/static/default/url_classes/mishimmie gallery page.png index 499af078..876a6861 100644 Binary files a/static/default/url_classes/mishimmie gallery page.png and b/static/default/url_classes/mishimmie gallery page.png differ diff --git 
a/static/default/url_classes/newgrounds games gallery page.png b/static/default/url_classes/newgrounds games gallery page.png index 53099d1f..9dcb3c87 100644 Binary files a/static/default/url_classes/newgrounds games gallery page.png and b/static/default/url_classes/newgrounds games gallery page.png differ diff --git a/static/default/url_classes/newgrounds movies gallery page.png b/static/default/url_classes/newgrounds movies gallery page.png index 73d63264..c584aa54 100644 Binary files a/static/default/url_classes/newgrounds movies gallery page.png and b/static/default/url_classes/newgrounds movies gallery page.png differ diff --git a/static/default/url_classes/pixiv artist gallery page.png b/static/default/url_classes/pixiv artist gallery page.png index d0f7df26..a7125d53 100644 Binary files a/static/default/url_classes/pixiv artist gallery page.png and b/static/default/url_classes/pixiv artist gallery page.png differ diff --git a/static/default/url_classes/rule34.paheal gallery page.png b/static/default/url_classes/rule34.paheal gallery page.png index 5367ca5a..7e5894df 100644 Binary files a/static/default/url_classes/rule34.paheal gallery page.png and b/static/default/url_classes/rule34.paheal gallery page.png differ diff --git a/static/default/url_classes/rule34.xxx gallery page.png b/static/default/url_classes/rule34.xxx gallery page.png index bbeccf15..cd3fdaa5 100644 Binary files a/static/default/url_classes/rule34.xxx gallery page.png and b/static/default/url_classes/rule34.xxx gallery page.png differ diff --git a/static/default/url_classes/rule34hentai gallery page.png b/static/default/url_classes/rule34hentai gallery page.png index 8531683a..82e8acaf 100644 Binary files a/static/default/url_classes/rule34hentai gallery page.png and b/static/default/url_classes/rule34hentai gallery page.png differ diff --git a/static/default/url_classes/safebooru gallery page.png b/static/default/url_classes/safebooru gallery page.png index d5117523..17f476a2 100644 
Binary files a/static/default/url_classes/safebooru gallery page.png and b/static/default/url_classes/safebooru gallery page.png differ diff --git a/static/default/url_classes/sakugabooru gallery page.png b/static/default/url_classes/sakugabooru gallery page.png index 55b5f8dc..cc1d04d4 100644 Binary files a/static/default/url_classes/sakugabooru gallery page.png and b/static/default/url_classes/sakugabooru gallery page.png differ diff --git a/static/default/url_classes/sankaku chan gallery page.png b/static/default/url_classes/sankaku chan gallery page.png index 538f8f25..e0218979 100644 Binary files a/static/default/url_classes/sankaku chan gallery page.png and b/static/default/url_classes/sankaku chan gallery page.png differ diff --git a/static/default/url_classes/sankaku idol gallery page.png b/static/default/url_classes/sankaku idol gallery page.png index 092f9da5..9f7f4d4a 100644 Binary files a/static/default/url_classes/sankaku idol gallery page.png and b/static/default/url_classes/sankaku idol gallery page.png differ diff --git a/static/default/url_classes/tbib gallery page.png b/static/default/url_classes/tbib gallery page.png index bc4b8503..8ace1c59 100644 Binary files a/static/default/url_classes/tbib gallery page.png and b/static/default/url_classes/tbib gallery page.png differ diff --git a/static/default/url_classes/tumblr api gallery page.png b/static/default/url_classes/tumblr api gallery page.png index e032917d..8ef32eb0 100644 Binary files a/static/default/url_classes/tumblr api gallery page.png and b/static/default/url_classes/tumblr api gallery page.png differ diff --git a/static/default/url_classes/xbooru gallery page.png b/static/default/url_classes/xbooru gallery page.png index 0688511b..8c9d8b7f 100644 Binary files a/static/default/url_classes/xbooru gallery page.png and b/static/default/url_classes/xbooru gallery page.png differ diff --git a/static/default/url_classes/yande.re gallery page.png b/static/default/url_classes/yande.re gallery 
page.png index 41073d31..e2b2592f 100644 Binary files a/static/default/url_classes/yande.re gallery page.png and b/static/default/url_classes/yande.re gallery page.png differ
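The url class hunks above add optional defaults for query parameters: a missing parameter with a default gets filled in during normalisation (e.g. an absent `page=1` from a user drag-and-drop), while a missing parameter without a default still fails the match. A minimal standalone sketch of that idea, under simplified assumptions — `normalise_query`, its `(validator, default)` rule tuples, and the example URL are illustrative, not hydrus's actual API:

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

def normalise_query(url, parameters):
    """Re-serialise a URL's query, filling in defaults for missing params.

    `parameters` maps param name -> (validator, default). A missing param
    with default None is a match failure; one with a non-None default is
    added during normalisation. Unknown extra params pass through untouched.
    """
    p = urlparse(url)
    query = dict(parse_qsl(p.query))

    for key, (validator, default) in parameters.items():
        if key not in query:
            if default is None:
                raise ValueError(key + ' not found in ' + p.query)
            # the defaulting step: inject e.g. page=1
            query[key] = default
        if not validator(query[key]):
            raise ValueError(key + ' failed validation')

    # sort for a stable, deduplicatable normalised form
    return urlunparse(p._replace(query=urlencode(sorted(query.items()))))

rules = {'tags': (str.isalnum, None), 'page': (str.isdigit, '1')}
print(normalise_query('https://example.com/index.php?tags=samus', rules))
# -> https://example.com/index.php?page=1&tags=samus
```

As in the diff, validation happens after defaulting, so a default that fails its own validator is rejected rather than silently kept.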
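The ClientParsing.py hunk makes the JSON formula's 'parse every item' rule (`parse_rule is None`) accept dicts as well as lists: lists yield their items in order, dicts now yield their values sorted by key, and anything else yields nothing. A minimal sketch of that descent step — the function name `descend_all` is invented for illustration:

```python
def descend_all(root):
    """Collect the next parse roots for a 'get all items' JSON rule."""
    next_roots = []
    if isinstance(root, list):
        # original behaviour: take every list item
        next_roots.extend(root)
    elif isinstance(root, dict):
        # v320 addition: take every dict value, key-sorted for determinism
        for key, value in sorted(root.items()):
            next_roots.append(value)
    # scalars (str, int, None, ...) have nothing to descend into
    return next_roots

print(descend_all([1, 2, 3]))         # [1, 2, 3]
print(descend_all({'b': 2, 'a': 1}))  # [1, 2]
print(descend_all('scalar'))          # []
```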