Version 320

This commit is contained in:
Hydrus Network Developer 2018-08-29 15:20:41 -05:00
parent c45aa2c914
commit 64d2403f85
94 changed files with 726 additions and 296 deletions

View File

@ -8,11 +8,43 @@
<div class="content">
<h3>changelog</h3>
<ul>
<li><h3>version 320</h3></li>
<ul>
<li>clients should now have objects for all default downloaders. everything should be prepped for the big switchover:</li>
<li>wrote gallery url generators for all the default downloaders and a couple more as well</li>
<li>wrote a gallery parser for deviant art--it also comes with an update to the DA url class because the meta 'next page' link on DA gallery pages is invalid wew!</li>
<li>wrote a gallery parser for hentai foundry, inkbunny, rule34hentai, moebooru (konachan, sakugabooru, yande.re), artstation, newgrounds, and pixiv artist galleries (static html)</li>
<li>added a gallery parser for sankaku</li>
<li>the artstation post url parser no longer fetches cover images</li>
<li>url classes can now support 'default' values for path components and query parameters! so, if your url might be missing a page=1 initialisation value due to user drag-and-drop, you can auto-add it in the normalisation step!</li>
<li>if the entered default does not match the rules of the component or parameter, it will be cleared back to none!</li>
<li>all appropriate default gallery url classes (which is most) now have these default values. all default gallery url classes will be overwritten on db update</li>
<li>three test 'search initialisation' url classes that attempted to fix this problem a different way will be deleted on update, if present</li>
<li>updated some other url classes</li>
<li>when checking source urls during the pre-download import status check, the client will now distrust parsed source urls if the files they seem to refer to also have other urls of the same url class as the file import object being actioned. basically, this logic tries to detect bad source url attribution, where multiple files on a booru (typically including alternate edits) are all source-url'd back to a single original</li>
<li>gallery page parsing now discounts parsed 'next page' urls that are the same as the page that fetched them (some gallery end-points link themselves as the next page, wew)</li>
<li>json parsing formulae that are set to parse all 'list' items will now also parse all dictionary entries if faced with a dict instead!</li>
<li>added new stop-gap 'stop checking' logic in subscription syncing for certain low-gallery-count edge-cases</li>
<li>fixed an issue where (typically new) subscriptions were bugging out trying to figure out a default stop_reason on certain page results</li>
<li>fixed an unusual listctrl delete item index-tracking error that would sometimes cause exceptions on the 'try to link url stuff together' button press and maybe some other places</li>
<li>thanks to a submission from user prkc on the discord, we now have 'import cookies.txt' buttons on the review sessions panels! if you are interested in 'manual' logins through browser-cookie-copying, please give this a go and let me know which kinds of cookies.txt do and do not work, and how your different site cookie-copy-login tests work in hydrus.</li>
<li>the mappings cache tables now have some new indices that speed up certain kinds of tag search significantly. db update will spend a minute or two generating these indices for existing users</li>
<li>advanced mode users will discover a fun new entry on the help menu</li>
<li>the hyperlinks on the media viewer hover window and a couple of other places are now a custom control that uses any custom browser launch path in options->files and trash</li>
<li>fixed an issue where certain canvas edge-case media clearing events could be caught incorrectly by the manage tags dialog and its subsidiary panels</li>
<li>think I fixed an issue where a client left with a dialog open could sometimes run into trouble later trying to show an idle time maintenance modal popup and give a 'C++ assertion IsRunning()' exception and end up locking the client's ui</li>
<li>manage parsers dialog will now autosort after an add event</li>
<li>the gug panels now normalise example urls</li>
<li>improved some misc service error handling</li>
<li>rewrote some url parsing to stop forcing '+'->' ' in our urls' query texts</li>
<li>fixed some bad error handling for matplotlib import</li>
<li>misc fixes</li>
</ul>
<li><h3>version 319</h3></li>
<ul>
<li>started the new convert-query-text-to-gallery-urls object. these objects, which I was thinking of calling 'Searchers', will be called the more specific and practical 'Gallery URL Generators', or GUGs for short</li>
<li>the first version of GUGs is done, and I've written some test ui for advanced users under network->downloader definitions->manage gugs. this ui doesn't save anything yet, but lets you mess around with different values. if we don't think of anything else needed in the next week, I will fix this code for v320 and start filling in defaults</li>
<li>watchers now have a global checking slot, much like the recent change to galleries and subs. it safely throttles dozens of threads so they don't rudely hammer your (or the destination server's) CPU if they all happen to want to go at once (like just after your computer wakes up). the option is similarly under options->downloading, and is global for the moment</li>
<li>watchers now have a checking slot, much like the recent change to galleries and subs. it safely throttles dozens of threads so they don't rudely hammer your (or the destination server's) CPU if they all happen to want to go at once (like just after your computer wakes up). the option is similarly under options->downloading</li>
<li>moved the new gallery delay/token management code to the better-fit bandwidth manager (it was in domain manager before)</li>
<li>the gallery delay/token code now works per-domain!</li>
<li>moved the gallery delay/token checking code into the network job proper, simplifying a bunch of import-level code; the text display now appears in the network job control. token consumption now occurs after bandwidth (it is now the last hoop to jump through, which reduces the chance of a pileup in unusual situations). I expect to soon add some kind of 'force-go' action to the cog menu</li>

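The pre-download source-url distrust logic described in the version 320 list can be sketched without any of hydrus's machinery. This is an illustrative toy only: the `url_class` helper and both dict arguments are invented stand-ins for the real url class matching and client file/url stores.

```python
def url_class( url ):
    # toy stand-in for hydrus's url classes: here, two urls share a
    # 'class' if they agree on everything before the query string
    return url.split( '?' )[0]

def should_trust_source_url( import_url, source_url, url_to_hashes, hash_to_urls ):
    # distrust a parsed source url if a file it already maps to carries a
    # *different* url of the same class as the url being imported -- the
    # telltale of several booru posts (e.g. alternate edits) all
    # source-url'd back to one original
    for file_hash in url_to_hashes.get( source_url, () ):
        for known_url in hash_to_urls.get( file_hash, () ):
            if known_url != import_url and url_class( known_url ) == url_class( import_url ):
                return False
    return True

url_to_hashes = { 'https://pixiv.example/artworks/111' : { 'h1' } }
hash_to_urls = { 'h1' : { 'https://booru.example/post?id=1' } }

# importing post id=2, whose parsed source is the same pixiv url that
# already belongs to the file from post id=1: distrust it
print( should_trust_source_url( 'https://booru.example/post?id=2', 'https://pixiv.example/artworks/111', url_to_hashes, hash_to_urls ) )
```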
View File

@ -8,7 +8,7 @@
<div class="content">
<h3>on being anonymous</h3>
<p>Nearly all sites use the same pseudonymous username/password system, and nearly all of them have the same drama, sockpuppets, and egotistical mods. Censorship is routine. That works for many people, but not for me.</p>
<p>I enjoy being anonymous online. When you aren't afraid of repurcussions, you can be as truthful as you want. You can have conversations that can happen nowhere else. It's fun!</p>
<p>I enjoy being anonymous online. When you aren't afraid of repercussions, you can be as truthful as you want. You can have conversations that can happen nowhere else. It's fun!</p>
<p>I've been on the imageboards for a <span class="de">lo</span><span class="su">ng</span> time, saving everything I like to my hard drive. After a while, the whole collection was just too large to manage on my own.</p>
<h3>the hydrus network</h3>
<p>So! I'm developing a program that helps people organise their files together anonymously. I want to help you do what you want with your stuff, and that's it. You can share some tags and files with other people if you want to, but you don't have to connect to anything if you don't. The default is <b>no sharing</b>, and every upload requires a conscious action on your part. I don't plan to ever record metrics on users, nor serve ads, nor charge for my software. The software never phones home.</p>

View File

@ -2431,6 +2431,10 @@ class DB( HydrusDB.HydrusDB ):
self._CacheSpecificMappingsAddFiles( file_service_id, tag_service_id, hash_ids )
self._CreateIndex( cache_current_mappings_table_name, [ 'tag_id', 'hash_id' ], unique = True )
self._CreateIndex( cache_deleted_mappings_table_name, [ 'tag_id', 'hash_id' ], unique = True )
self._CreateIndex( cache_pending_mappings_table_name, [ 'tag_id', 'hash_id' ], unique = True )
def _CacheSpecificMappingsGetAutocompleteCounts( self, file_service_id, tag_service_id, tag_ids ):
@ -3927,6 +3931,17 @@ class DB( HydrusDB.HydrusDB ):
return names_to_analyze
def _GetBonedStats( self ):
( num_total, size_total ) = self._c.execute( 'SELECT COUNT( hash_id ), SUM( size ) FROM files_info NATURAL JOIN current_files WHERE service_id = ?;', ( self._local_file_service_id, ) ).fetchone()
( num_inbox, size_inbox ) = self._c.execute( 'SELECT COUNT( hash_id ), SUM( size ) FROM files_info NATURAL JOIN current_files NATURAL JOIN file_inbox WHERE service_id = ?;', ( self._local_file_service_id, ) ).fetchone()
num_archive = num_total - num_inbox
size_archive = size_total - size_inbox
return ( num_inbox, num_archive, size_inbox, size_archive )
def _GetClientFilesLocations( self ):
result = { prefix : HydrusPaths.ConvertPortablePathToAbsPath( location ) for ( prefix, location ) in self._c.execute( 'SELECT prefix, location FROM client_files_locations;' ) }
@ -8627,6 +8642,7 @@ class DB( HydrusDB.HydrusDB ):
def _Read( self, action, *args, **kwargs ):
if action == 'autocomplete_predicates': result = self._GetAutocompletePredicates( *args, **kwargs )
elif action == 'boned_stats': result = self._GetBonedStats( *args, **kwargs )
elif action == 'client_files_locations': result = self._GetClientFilesLocations( *args, **kwargs )
elif action == 'downloads': result = self._GetDownloads( *args, **kwargs )
elif action == 'duplicate_hashes': result = self._CacheSimilarFilesGetDuplicateHashes( *args, **kwargs )
@ -9390,194 +9406,6 @@ class DB( HydrusDB.HydrusDB ):
self._controller.pub( 'splash_set_status_text', 'updating db to v' + str( version + 1 ) )
if version == 260:
self._controller.pub( 'splash_set_status_subtext', 'generating some new tag search data' )
self._c.execute( 'CREATE TABLE external_caches.integer_subtags ( subtag_id INTEGER PRIMARY KEY, integer_subtag INTEGER );' )
existing_subtag_data = self._c.execute( 'SELECT subtag_id, subtag FROM subtags;' ).fetchall()
inserts = []
for ( subtag_id, subtag ) in existing_subtag_data:
try:
integer_subtag = int( subtag )
if CanCacheInteger( integer_subtag ):
inserts.append( ( subtag_id, integer_subtag ) )
except ValueError:
pass
self._c.executemany( 'INSERT OR IGNORE INTO integer_subtags ( subtag_id, integer_subtag ) VALUES ( ?, ? );', inserts )
self._CreateIndex( 'external_caches.integer_subtags', [ 'integer_subtag' ] )
#
do_the_message = False
subscriptions = self._GetJSONDumpNamed( HydrusSerialisable.SERIALISABLE_TYPE_SUBSCRIPTION )
for subscription in subscriptions:
g_i = subscription._gallery_identifier
if g_i.GetSiteType() == HC.SITE_TYPE_TUMBLR:
do_the_message = True
break
if do_the_message:
message = 'The tumblr downloader can now produce \'raw\' urls for images that have >1280px width. It is possible some of your tumblr subscriptions\' urls are resizes, so at some point you may want to reset their url caches. I recommend you not do it yet--wait for the upcoming downloader overhaul, which will provide other benefits such as associating the \'post\' url with the image, rather than the ugly API url.'
self.pub_initial_message( message )
if version == 262:
self._controller.pub( 'splash_set_status_subtext', 'moving some hash data' )
self._c.execute( 'CREATE TABLE IF NOT EXISTS external_master.hashes ( hash_id INTEGER PRIMARY KEY, hash BLOB_BYTES UNIQUE );' )
self._c.execute( 'CREATE TABLE external_master.local_hashes ( hash_id INTEGER PRIMARY KEY, md5 BLOB_BYTES, sha1 BLOB_BYTES, sha512 BLOB_BYTES );' )
self._CreateIndex( 'external_master.local_hashes', [ 'md5' ] )
self._CreateIndex( 'external_master.local_hashes', [ 'sha1' ] )
self._CreateIndex( 'external_master.local_hashes', [ 'sha512' ] )
self._c.execute( 'INSERT INTO external_master.local_hashes SELECT * FROM main.local_hashes;' )
self._c.execute( 'DROP TABLE main.local_hashes;' )
self._c.execute( 'ANALYZE external_master.local_hashes;' )
self._CloseDBCursor()
self._controller.pub( 'splash_set_status_subtext', 'vacuuming main db ' )
db_path = os.path.join( self._db_dir, 'client.db' )
try:
if HydrusDB.CanVacuum( db_path ):
HydrusDB.VacuumDB( db_path )
except Exception as e:
HydrusData.Print( 'Vacuum failed!' )
HydrusData.PrintException( e )
self._InitDBCursor()
#
bandwidth_manager = ClientNetworkingBandwidth.NetworkBandwidthManager()
ClientDefaults.SetDefaultBandwidthManagerRules( bandwidth_manager )
self._SetJSONDump( bandwidth_manager )
session_manager = ClientNetworkingSessions.NetworkSessionManager()
self._SetJSONDump( session_manager )
#
self._controller.pub( 'splash_set_status_subtext', 'generating deleted tag cache' )
tag_service_ids = self._GetServiceIds( HC.TAG_SERVICES )
file_service_ids = self._GetServiceIds( HC.AUTOCOMPLETE_CACHE_SPECIFIC_FILE_SERVICES )
for ( file_service_id, tag_service_id ) in itertools.product( file_service_ids, tag_service_ids ):
( cache_files_table_name, cache_current_mappings_table_name, cache_deleted_mappings_table_name, cache_pending_mappings_table_name, ac_cache_table_name ) = GenerateSpecificMappingsCacheTableNames( file_service_id, tag_service_id )
self._c.execute( 'CREATE TABLE IF NOT EXISTS ' + cache_deleted_mappings_table_name + ' ( hash_id INTEGER, tag_id INTEGER, PRIMARY KEY ( hash_id, tag_id ) ) WITHOUT ROWID;' )
hash_ids = self._STL( self._c.execute( 'SELECT hash_id FROM current_files WHERE service_id = ?;', ( file_service_id, ) ) )
if len( hash_ids ) > 0:
( current_mappings_table_name, deleted_mappings_table_name, pending_mappings_table_name, petitioned_mappings_table_name ) = GenerateMappingsTableNames( tag_service_id )
for group_of_hash_ids in HydrusData.SplitListIntoChunks( hash_ids, 100 ):
splayed_group_of_hash_ids = HydrusData.SplayListForDB( group_of_hash_ids )
deleted_mapping_ids_raw = self._c.execute( 'SELECT tag_id, hash_id FROM ' + deleted_mappings_table_name + ' WHERE hash_id IN ' + splayed_group_of_hash_ids + ';' ).fetchall()
deleted_mapping_ids_dict = HydrusData.BuildKeyToSetDict( deleted_mapping_ids_raw )
all_ids_seen = set( deleted_mapping_ids_dict.keys() )
for tag_id in all_ids_seen:
deleted_hash_ids = deleted_mapping_ids_dict[ tag_id ]
num_deleted = len( deleted_hash_ids )
if num_deleted > 0:
self._c.executemany( 'INSERT OR IGNORE INTO ' + cache_deleted_mappings_table_name + ' ( hash_id, tag_id ) VALUES ( ?, ? );', ( ( hash_id, tag_id ) for hash_id in deleted_hash_ids ) )
self._c.execute( 'ANALYZE ' + cache_deleted_mappings_table_name + ';' )
if version == 263:
self._controller.pub( 'splash_set_status_subtext', 'rebuilding urls table' )
self._c.execute( 'ALTER TABLE urls RENAME TO urls_old;' )
self._c.execute( 'CREATE TABLE urls ( hash_id INTEGER, url TEXT, PRIMARY KEY ( hash_id, url ) );' )
self._CreateIndex( 'urls', [ 'url' ] )
self._c.execute( 'INSERT INTO urls SELECT hash_id, url FROM urls_old;' )
self._c.execute( 'DROP TABLE urls_old;' )
self._c.execute( 'ANALYZE urls;' )
if version == 264:
default_bandwidth_manager = ClientNetworkingBandwidth.NetworkBandwidthManager()
ClientDefaults.SetDefaultBandwidthManagerRules( default_bandwidth_manager )
bandwidth_manager = self._GetJSONDump( HydrusSerialisable.SERIALISABLE_TYPE_NETWORK_BANDWIDTH_MANAGER )
bandwidth_manager._network_contexts_to_bandwidth_rules = dict( default_bandwidth_manager._network_contexts_to_bandwidth_rules )
self._SetJSONDump( bandwidth_manager )
if version == 266:
self._c.execute( 'DROP TABLE IF EXISTS web_sessions;' )
if version == 272:
try:
@ -10871,7 +10699,7 @@ class DB( HydrusDB.HydrusDB ):
if i % 100 == 0:
self._controller.pub( 'splash_set_status_subtext', 'normalising some uls: ' + HydrusData.ConvertValueRangeToPrettyString( i, num_to_do ) )
self._controller.pub( 'splash_set_status_subtext', 'normalising some urls: ' + HydrusData.ConvertValueRangeToPrettyString( i, num_to_do ) )
@ -10885,6 +10713,85 @@ class DB( HydrusDB.HydrusDB ):
if version == 319:
tag_service_ids = self._GetServiceIds( HC.TAG_SERVICES )
file_service_ids = self._GetServiceIds( HC.AUTOCOMPLETE_CACHE_SPECIFIC_FILE_SERVICES )
for ( file_service_id, tag_service_id ) in itertools.product( file_service_ids, tag_service_ids ):
try:
self._controller.pub( 'splash_set_status_subtext', 'generating some new indices: ' + str( file_service_id ) + ', ' + str( tag_service_id ) )
( cache_files_table_name, cache_current_mappings_table_name, cache_deleted_mappings_table_name, cache_pending_mappings_table_name, ac_cache_table_name ) = GenerateSpecificMappingsCacheTableNames( file_service_id, tag_service_id )
self._CreateIndex( cache_current_mappings_table_name, [ 'tag_id', 'hash_id' ], unique = True )
self._CreateIndex( cache_deleted_mappings_table_name, [ 'tag_id', 'hash_id' ], unique = True )
self._CreateIndex( cache_pending_mappings_table_name, [ 'tag_id', 'hash_id' ], unique = True )
except:
message = 'Trying to create some new tag lookup indices failed! This is not a huge deal, particularly if you have had service errors in the past, but you might want to let hydrus dev know! Your magic failure numbers are: ' + str( ( file_service_id, tag_service_id ) )
self.pub_initial_message( message )
#
try:
domain_manager = self._GetJSONDump( HydrusSerialisable.SERIALISABLE_TYPE_NETWORK_DOMAIN_MANAGER )
domain_manager.Initialise()
#
gugs = ClientDefaults.GetDefaultGUGs()
domain_manager.SetGUGs( gugs )
#
# lots of gallery updates this week, so let's just do them all
default_url_matches = ClientDefaults.GetDefaultURLMatches()
overwrite_names = [ url_match.GetName() for url_match in default_url_matches if url_match.GetURLType() == HC.URL_TYPE_GALLERY ]
domain_manager.OverwriteDefaultURLMatches( overwrite_names )
# now clear out some failed experiments
url_matches = domain_manager.GetURLMatches()
url_matches = [ url_match for url_match in url_matches if 'search initialisation' not in url_match.GetName() ]
domain_manager.SetURLMatches( url_matches )
#
domain_manager.OverwriteDefaultParsers( ( 'artstation file page api parser', 'artstation gallery page api parser', 'hentai foundry gallery page parser', 'inkbunny gallery page parser', 'moebooru gallery page parser', 'newgrounds gallery page parser', 'pixiv static html gallery page parser', 'rule34hentai gallery page parser', 'sankaku gallery page parser', 'deviant art gallery page parser' ) )
#
domain_manager.TryToLinkURLMatchesAndParsers()
#
self._SetJSONDump( domain_manager )
except Exception as e:
HydrusData.PrintException( e )
message = 'Trying to update some url classes and parsers failed! Please let hydrus dev know!'
self.pub_initial_message( message )
self._controller.pub( 'splash_set_title_text', 'updated db to v' + str( version + 1 ) )
self._c.execute( 'UPDATE version SET version = ?;', ( version + 1, ) )

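The v319→320 update above spends a minute or two adding unique ( tag_id, hash_id ) indices to the specific mappings cache tables. A small self-contained sqlite3 sketch of why that helps, with table and column names simplified from the real cache tables:

```python
import sqlite3

con = sqlite3.connect( ':memory:' )
c = con.cursor()

# the cache tables are keyed ( hash_id, tag_id ), which serves
# "what tags does this file have?" well, but a tag-first lookup has to
# scan; the added unique ( tag_id, hash_id ) index covers that case too
c.execute( 'CREATE TABLE current_mappings ( hash_id INTEGER, tag_id INTEGER, PRIMARY KEY ( hash_id, tag_id ) ) WITHOUT ROWID;' )
c.executemany( 'INSERT INTO current_mappings VALUES ( ?, ? );', [ ( hash_id, tag_id ) for hash_id in range( 100 ) for tag_id in ( 1, 2, 3 ) ] )

c.execute( 'CREATE UNIQUE INDEX idx_tag_hash ON current_mappings ( tag_id, hash_id );' )

# with the index in place, a tag-first query becomes an index search
plan = c.execute( 'EXPLAIN QUERY PLAN SELECT hash_id FROM current_mappings WHERE tag_id = ?;', ( 2, ) ).fetchone()[ -1 ]
print( plan )
```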
View File

@ -818,6 +818,10 @@ def SetDefaultDomainManagerData( domain_manager ):
#
domain_manager.SetGUGs( GetDefaultGUGs() )
#
domain_manager.SetURLMatches( GetDefaultURLMatches() )
#

View File

@ -1720,7 +1720,7 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
submenu = wx.Menu()
ClientGUIMenus.AppendMenuItem( self, submenu, 'UNDER CONSTRUCTION: manage gallery url generators', 'Manage the client\'s GUGs, which convert search terms into URLs.', self._ManageGUGs )
ClientGUIMenus.AppendMenuItem( self, submenu, 'manage gallery url generators', 'Manage the client\'s GUGs, which convert search terms into URLs.', self._ManageGUGs )
ClientGUIMenus.AppendMenuItem( self, submenu, 'manage url classes', 'Configure which URLs the client can recognise.', self._ManageURLMatches )
ClientGUIMenus.AppendMenuItem( self, submenu, 'manage parsers', 'Manage the client\'s parsers, which convert URL content into hydrus metadata.', self._ManageParsers )
@ -1907,6 +1907,11 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
ClientGUIMenus.AppendMenu( menu, dont_know, 'I don\'t know what I am doing' )
if self._controller.new_options.GetBoolean( 'advanced_mode' ):
ClientGUIMenus.AppendMenuItem( self, menu, 'how boned am I?', 'Check for a summary of your ride so far.', self._HowBonedAmI )
ClientGUIMenus.AppendSeparator( menu )
currently_darkmode = self._new_options.GetString( 'current_colourset' ) == 'darkmode'
@ -2015,6 +2020,17 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
with ClientGUIDialogs.DialogGenerateNewAccounts( self, service_key ) as dlg: dlg.ShowModal()
def _HowBonedAmI( self ):
boned_stats = self._controller.Read( 'boned_stats' )
frame = ClientGUITopLevelWindows.FrameThatTakesScrollablePanel( self, 'review your fate' )
panel = ClientGUIScrolledPanelsReview.ReviewHowBonedAmI( frame, boned_stats )
frame.SetPanel( panel )
def _ImportFiles( self, paths = None ):
if paths is None:
@ -2332,10 +2348,6 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
gugs = domain_manager.GetGUGs()
import ClientNetworkingDomain
gugs.append( ClientNetworkingDomain.GalleryURLGenerator( 'test gug', url_template = 'https://www.gelbooru.com/index.php?page=post&s=list&tags=%tags%&pid=0' ) )
panel = ClientGUIScrolledPanelsEdit.EditGUGsPanel( dlg, gugs )
dlg.SetPanel( panel )
@ -3622,7 +3634,19 @@ The password is cleartext here but obscured in the entry dialog. Enter a blank p
return
if self.IsIconized():
dialog_open = False
tlps = wx.GetTopLevelWindows()
for tlp in tlps:
if isinstance( tlp, wx.Dialog ):
dialog_open = True
if self.IsIconized() or dialog_open:
self._controller.CallLaterWXSafe( self, 10, self.AddModalMessage, job_key )

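The 'manage gallery url generators' menu above edits GUGs, which the v319 notes describe as converting search terms into URLs. A minimal sketch of the idea, reusing the gelbooru `%tags%` template from the removed test gug; the class and method names here are illustrative, not hydrus's actual API:

```python
import urllib.parse

class GalleryURLGenerator:
    # toy GUG: substitute the user's percent-encoded search terms into a
    # gallery url template; the real class also handles separators,
    # initial page values, and more
    def __init__( self, name, url_template ):
        self._name = name
        self._url_template = url_template
    
    def GenerateGalleryURL( self, query_text ):
        # one '+'-separated, percent-encoded term per whitespace-split tag
        tags = query_text.split()
        encoded_tags = '+'.join( urllib.parse.quote( tag, safe = '' ) for tag in tags )
        return self._url_template.replace( '%tags%', encoded_tags )

gug = GalleryURLGenerator( 'test gug', 'https://www.gelbooru.com/index.php?page=post&s=list&tags=%tags%&pid=0' )
print( gug.GenerateGalleryURL( 'blue_eyes skirt' ) )
```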
View File

@ -4,6 +4,7 @@ import ClientConstants as CC
import ClientGUIMenus
import ClientGUITopLevelWindows
import ClientMedia
import ClientPaths
import ClientRatings
import ClientThreading
import HydrusConstants as HC
@ -698,6 +699,33 @@ class BetterStaticText( wx.StaticText ):
class BetterHyperLink( BetterStaticText ):
def __init__( self, parent, label, url ):
BetterStaticText.__init__( self, parent, label )
self._url = url
self.SetCursor( wx.Cursor( wx.CURSOR_HAND ) )
self.SetForegroundColour( wx.SystemSettings.GetColour( wx.SYS_COLOUR_HOTLIGHT ) )
font = self.GetFont()
font.SetUnderlined( True )
self.SetFont( font )
self.SetToolTipString( self._url )
self.Bind( wx.EVT_LEFT_DOWN, self.EventClick )
def EventClick( self, event ):
ClientPaths.LaunchURLInWebBrowser( self._url )
class BufferedWindow( wx.Window ):
def __init__( self, *args, **kwargs ):

View File

@ -44,7 +44,6 @@ import traceback
import urllib
import wx
import wx.lib.agw.customtreectrl
import wx.adv
import yaml
import HydrusData
import ClientSearch
@ -1343,8 +1342,8 @@ class DialogInputNamespaceRegex( Dialog ):
self._shortcuts = ClientGUICommon.RegexButton( self )
self._regex_intro_link = wx.adv.HyperlinkCtrl( self, id = -1, label = 'a good regex introduction', url = 'http://www.aivosto.com/vbtips/regex.html' )
self._regex_practise_link = wx.adv.HyperlinkCtrl( self, id = -1, label = 'regex practise', url = 'http://regexr.com/3cvmf' )
self._regex_intro_link = ClientGUICommon.BetterHyperLink( self, 'a good regex introduction', 'http://www.aivosto.com/vbtips/regex.html' )
self._regex_practise_link = ClientGUICommon.BetterHyperLink( self, 'regex practise', 'http://regexr.com/3cvmf' )
self._ok = wx.Button( self, id = wx.ID_OK, label = 'OK' )
self._ok.Bind( wx.EVT_BUTTON, self.EventOK )

View File

@ -16,7 +16,6 @@ import HydrusSerialisable
import urlparse
import os
import wx
import wx.adv
class FullscreenHoverFrame( wx.Frame ):
@ -934,9 +933,7 @@ class FullscreenHoverFrameTopRight( FullscreenHoverFrame ):
for ( display_string, url ) in url_tuples:
link = wx.adv.HyperlinkCtrl( self, id = -1, label = display_string, url = url )
link.SetToolTip( url )
link = ClientGUICommon.BetterHyperLink( self, display_string, url )
self._urls_vbox.Add( link, CC.FLAGS_EXPAND_PERPENDICULAR )

View File

@ -30,7 +30,6 @@ import HydrusText
import os
import re
import wx
import wx.adv
class CheckerOptionsButton( ClientGUICommon.BetterButton ):
@ -253,8 +252,8 @@ class FilenameTaggingOptionsPanel( wx.Panel ):
self._regex_shortcuts = ClientGUICommon.RegexButton( self._regexes_panel )
self._regex_intro_link = wx.adv.HyperlinkCtrl( self._regexes_panel, id = -1, label = 'a good regex introduction', url = 'http://www.aivosto.com/vbtips/regex.html' )
self._regex_practise_link = wx.adv.HyperlinkCtrl( self._regexes_panel, id = -1, label = 'regex practise', url = 'http://regexr.com/3cvmf' )
self._regex_intro_link = ClientGUICommon.BetterHyperLink( self._regexes_panel, 'a good regex introduction', 'http://www.aivosto.com/vbtips/regex.html' )
self._regex_practise_link = ClientGUICommon.BetterHyperLink( self._regexes_panel, 'regex practise', 'http://regexr.com/3cvmf' )
#

View File

@ -704,10 +704,20 @@ class BetterListCtrl( wx.ListCtrl, ListCtrlAutoWidthMixin ):
deletees.sort( reverse = True )
# I am not sure, but I think if subsequent deleteitems occur in the same event, the event processing of the first is forced!!
# this means that button checking and so on occurs for n-1 times on an invalid indices structure in this thing before correcting itself in the last one
# if a button update then tests selected data against the invalid index and a selection is on the i+1 or whatever but just got bumped up into invalid area, we are exception city
# this doesn't normally affect us because mostly we _are_ deleting selections when we do deletes, but 'try to link url stuff' auto thing hit this
# I obviously don't want to recalc all indices for every delete
# so I wrote a catch in getdata to skip the missing error, and now I'm moving the data deletion to a second loop, which seems to help
for ( index, data ) in deletees:
self.DeleteItem( index )
for ( index, data ) in deletees:
del self._data_to_indices[ data ]
del self._indices_to_data_info[ index ]
@ -829,6 +839,12 @@ class BetterListCtrl( wx.ListCtrl, ListCtrlAutoWidthMixin ):
for index in indices:
# this can get fired while indices are invalid, wew
if index not in self._indices_to_data_info:
continue
( data, display_tuple, sort_tuple ) = self._indices_to_data_info[ index ]
result.append( data )

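The listctrl fix above deletes rows in one pass and cleans the index bookkeeping in a second, so any event handlers fired mid-delete never see a half-updated map. A wx-free sketch of that ordering; the dict names mimic the ones in the diff, but the control itself is a stub:

```python
class IndexTrackingList:
    # stub of a list control that keeps data<->index maps alongside rows
    def __init__( self, datas ):
        self._rows = list( datas )
        self._data_to_indices = { d : i for ( i, d ) in enumerate( datas ) }
        self._indices_to_data_info = { i : d for ( i, d ) in enumerate( datas ) }
    
    def GetData( self, indices ):
        result = []
        for index in indices:
            # indices can be briefly invalid mid-delete, so skip misses
            if index not in self._indices_to_data_info:
                continue
            result.append( self._indices_to_data_info[ index ] )
        return result
    
    def DeleteDatas( self, datas ):
        deletees = [ ( self._data_to_indices[ d ], d ) for d in datas ]
        deletees.sort( reverse = True ) # delete bottom-up so earlier indices stay valid
        # first loop: remove the rows (this is where events would fire)
        for ( index, data ) in deletees:
            del self._rows[ index ]
        # second loop: only now clean the bookkeeping
        for ( index, data ) in deletees:
            del self._data_to_indices[ data ]
            del self._indices_to_data_info[ index ]

lst = IndexTrackingList( [ 'a', 'b', 'c', 'd' ] )
lst.DeleteDatas( [ 'b', 'd' ] )
print( lst._rows )
```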
View File

@ -594,6 +594,13 @@ class ReviewServicePanel( wx.Panel ):
wx.CallAfter( self._Refresh )
if HG.client_controller.options[ 'pause_repo_sync' ]:
wx.MessageBox( 'All repositories are currently paused under the services->pause menu! Please unpause them and then try again!' )
return
self._refresh_account_button.Disable()
self._refresh_account_button.SetLabelText( u'fetching\u2026' )

View File

@ -1106,7 +1106,7 @@ class EditJSONParsingRulePanel( ClientGUIScrolledPanels.EditPanel ):
self._type = ClientGUICommon.BetterChoice( self )
self._type.Append( 'dictionary entry', self.DICT_ENTRY )
self._type.Append( 'all list items', self.ALL_LIST_ITEMS )
self._type.Append( 'all dictionary/list items', self.ALL_LIST_ITEMS )
self._type.Append( 'indexed list item', self.INDEXED_LIST_ITEM)
self._key = wx.TextCtrl( self )
@ -2946,6 +2946,8 @@ class EditParsersPanel( ClientGUIScrolledPanels.EditPanel ):
self._AddParser( new_parser )
self._parsers.Sort()
@ -3716,7 +3718,7 @@ class EditStringConverterPanel( ClientGUIScrolledPanels.EditPanel ):
self._data_encoding = ClientGUICommon.BetterChoice( self )
self._data_regex_pattern = wx.TextCtrl( self )
self._data_regex_repl = wx.TextCtrl( self )
self._data_date_link = wx.adv.HyperlinkCtrl( self, label = 'link to date info', url = 'https://docs.python.org/2/library/datetime.html#strftime-strptime-behavior' )
self._data_date_link = ClientGUICommon.BetterHyperLink( self, 'link to date info', 'https://docs.python.org/2/library/datetime.html#strftime-strptime-behavior' )
self._data_timezone = ClientGUICommon.BetterChoice( self )
self._data_timezone_offset = wx.SpinCtrl( self, min = -86400, max = 86400 )

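The relabelled 'all dictionary/list items' choice above matches the changelog note that list-item formulae now also walk a dict when handed one. A sketch of that dispatch; the function name is illustrative, and sorting dict keys is a determinism choice for this sketch only:

```python
def parse_all_items( obj ):
    # 'all list items' rule: yield every element of a list, but if the
    # document gave us a dict instead, yield every value
    if isinstance( obj, dict ):
        return [ obj[ key ] for key in sorted( obj ) ]
    elif isinstance( obj, list ):
        return list( obj )
    else:
        return []

print( parse_all_items( [ 1, 2, 3 ] ) )        # [1, 2, 3]
print( parse_all_items( { 'b' : 2, 'a' : 1 } ) ) # [1, 2]
```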
View File

@ -1443,6 +1443,8 @@ class EditGUGPanel( ClientGUIScrolledPanels.EditPanel ):
example_url = gug.GetExampleURL()
example_url = HG.client_controller.network_engine.domain_manager.NormaliseURL( example_url )
self._example_url.SetValue( example_url )
except HydrusExceptions.GUGException as e:
@ -1514,7 +1516,7 @@ class EditGUGsPanel( ClientGUIScrolledPanels.EditPanel ):
self._list_ctrl_panel = ClientGUIListCtrl.BetterListCtrlPanel( self )
columns = [ ( 'name', 16 ), ( 'example url', -1 ), ( 'gallery url class?', 20 ) ]
columns = [ ( 'name', 24 ), ( 'example url', -1 ), ( 'gallery url class?', 20 ) ]
self._list_ctrl = ClientGUIListCtrl.BetterListCtrl( self._list_ctrl_panel, 'gugs', 30, 74, columns, self._ConvertDataToListCtrlTuples, delete_key_callback = self._Delete, activation_callback = self._Edit )
@ -1579,6 +1581,8 @@ class EditGUGsPanel( ClientGUIScrolledPanels.EditPanel ):
name = gug.GetName()
example_url = gug.GetExampleURL()
example_url = HG.client_controller.network_engine.domain_manager.NormaliseURL( example_url )
url_match = HG.client_controller.network_engine.domain_manager.GetURLMatch( example_url )
if url_match is None:
@ -4809,7 +4813,7 @@ class EditURLMatchPanel( ClientGUIScrolledPanels.EditPanel ):
path_components_panel = ClientGUICommon.StaticBox( self, 'path components' )
self._path_components = ClientGUIListBoxes.QueueListBox( path_components_panel, 6, self._ConvertPathComponentToString, self._AddPathComponent, self._EditPathComponent )
self._path_components = ClientGUIListBoxes.QueueListBox( path_components_panel, 6, self._ConvertPathComponentRowToString, self._AddPathComponent, self._EditPathComponent )
#
@ -4999,13 +5003,36 @@ class EditURLMatchPanel( ClientGUIScrolledPanels.EditPanel ):
string_match = panel.GetValue()
with ClientGUIDialogs.DialogTextEntry( self, 'Enter optional \'default\' value for this parameter, which will be filled in if missing. Leave blank for none (recommended).', allow_blank = True ) as dlg_default:
if dlg_default.ShowModal() == wx.ID_OK:
default = dlg_default.GetValue()
if default == '':
default = None
elif not string_match.Matches( default ):
wx.MessageBox( 'That default does not match the given rule! Clearing it to none!' )
default = None
else:
return
else:
return
data = ( key, string_match )
data = ( key, ( string_match, default ) )
self._parameters.AddDatas( ( data, ) )
@ -5017,17 +5044,23 @@ class EditURLMatchPanel( ClientGUIScrolledPanels.EditPanel ):
def _AddPathComponent( self ):
string_match = ClientParsing.StringMatch()
default = None
return self._EditPathComponent( string_match )
return self._EditPathComponent( ( string_match, default ) )
def _ConvertParameterToListCtrlTuples( self, data ):
( key, string_match ) = data
( key, ( string_match, default ) ) = data
pretty_key = key
pretty_string_match = string_match.ToUnicode()
if default is not None:
pretty_string_match += ' (default "' + default + '")'
sort_key = pretty_key
sort_string_match = pretty_string_match
@ -5037,9 +5070,18 @@ class EditURLMatchPanel( ClientGUIScrolledPanels.EditPanel ):
return ( display_tuple, sort_tuple )
def _ConvertPathComponentToString( self, path_component ):
def _ConvertPathComponentRowToString( self, row ):
return path_component.ToUnicode()
( string_match, default ) = row
s = string_match.ToUnicode()
if default is not None:
s += ' (default "' + default + '")'
return s
def _DeleteParameters( self ):
@ -5061,7 +5103,7 @@ class EditURLMatchPanel( ClientGUIScrolledPanels.EditPanel ):
for parameter in selected_params:
( original_key, original_string_match ) = parameter
( original_key, ( original_string_match, original_default ) ) = parameter
with ClientGUIDialogs.DialogTextEntry( self, 'edit the key', default = original_key, allow_blank = False ) as dlg:
@ -5097,6 +5139,34 @@ class EditURLMatchPanel( ClientGUIScrolledPanels.EditPanel ):
string_match = panel.GetValue()
if original_default is None:
original_default = ''
with ClientGUIDialogs.DialogTextEntry( self, 'Enter optional \'default\' value for this parameter, which will be filled in if missing. Leave blank for none (recommended).', default = original_default, allow_blank = True ) as dlg_default:
if dlg_default.ShowModal() == wx.ID_OK:
default = dlg_default.GetValue()
if default == '':
default = None
elif not string_match.Matches( default ):
wx.MessageBox( 'That default does not match the given rule! Clearing it to none!' )
default = None
else:
return
else:
return
@ -5105,7 +5175,7 @@ class EditURLMatchPanel( ClientGUIScrolledPanels.EditPanel ):
self._parameters.DeleteDatas( ( parameter, ) )
new_parameter = ( key, string_match )
new_parameter = ( key, ( string_match, default ) )
self._parameters.AddDatas( ( new_parameter, ) )
@ -5115,7 +5185,9 @@ class EditURLMatchPanel( ClientGUIScrolledPanels.EditPanel ):
self._UpdateControls()
def _EditPathComponent( self, string_match ):
def _EditPathComponent( self, row ):
( string_match, default ) = row
with ClientGUITopLevelWindows.DialogEdit( self, 'edit path component' ) as dlg:
@ -5127,13 +5199,37 @@ class EditURLMatchPanel( ClientGUIScrolledPanels.EditPanel ):
new_string_match = panel.GetValue()
return ( True, new_string_match )
if default is None:
default = ''
else:
return ( False, None )
with ClientGUIDialogs.DialogTextEntry( self, 'Enter optional \'default\' value for this path component, which will be filled in if missing. Leave blank for none (recommended).', default = default, allow_blank = True ) as dlg_default:
if dlg_default.ShowModal() == wx.ID_OK:
new_default = dlg_default.GetValue()
if new_default == '':
new_default = None
elif not string_match.Matches( new_default ):
wx.MessageBox( 'That default does not match the given rule! Clearing it to none!' )
new_default = None
new_row = ( new_string_match, new_default )
return ( True, new_row )
return ( False, None )
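The two dialog flows above share one rule: a blank entry means 'no default', and a default that fails the component's StringMatch rule is cleared back to None with a warning. A hedged, dialog-free sketch of that rule (`normalise_default` and the tiny regex rule are illustrative stand-ins, not the client's API):

```python
import re

def normalise_default(entered_text, rule):
    if entered_text == '':
        return None  # blank means no default (recommended)
    if not rule(entered_text):
        # the GUI warns 'That default does not match the given rule!' here
        return None
    return entered_text

# stand-in for a NUMERIC-flavoured StringMatch
numeric_rule = lambda s: re.fullmatch(r'[0-9]+', s) is not None

print(normalise_default('', numeric_rule))    # no default
print(normalise_default('1', numeric_rule))   # valid default kept
print(normalise_default('abc', numeric_rule)) # invalid, cleared to None
```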
def _GetExistingKeys( self ):
@ -5180,17 +5276,17 @@ class EditURLMatchPanel( ClientGUIScrolledPanels.EditPanel ):
choices = [ ( 'no next gallery page info set', ( None, None ) ) ]
for ( index, path_component ) in enumerate( self._path_components.GetData() ):
for ( index, ( string_match, default ) ) in enumerate( self._path_components.GetData() ):
if True in ( path_component.Matches( n ) for n in ( '0', '1', '10', '100', '42' ) ):
if True in ( string_match.Matches( n ) for n in ( '0', '1', '10', '100', '42' ) ):
choices.append( ( HydrusData.ConvertIntToPrettyOrdinalString( index + 1 ) + ' path component', ( ClientNetworkingDomain.GALLERY_INDEX_TYPE_PATH_COMPONENT, index ) ) )
for ( index, ( key, value ) ) in enumerate( self._parameters.GetData() ):
for ( index, ( key, ( string_match, default ) ) ) in enumerate( self._parameters.GetData() ):
if True in ( value.Matches( n ) for n in ( '0', '1', '10', '100', '42' ) ):
if True in ( string_match.Matches( n ) for n in ( '0', '1', '10', '100', '42' ) ):
choices.append( ( key + ' parameter', ( ClientNetworkingDomain.GALLERY_INDEX_TYPE_PARAMETER, key ) ) )
@ -5238,7 +5334,7 @@ class EditURLMatchPanel( ClientGUIScrolledPanels.EditPanel ):
self._can_produce_multiple_files.Disable()
if url_match.NormalisationIsAppropriate():
if url_match.ClippingIsAppropriate():
if self._match_subdomains.GetValue():

View File

@ -5214,11 +5214,14 @@ class ManageTagsPanel( ClientGUIScrolledPanels.ManagePanel ):
if canvas_key == self._canvas_key:
self._current_media = ( new_media_singleton.Duplicate(), )
for page in self._tag_repositories.GetPages():
page.SetMedia( self._current_media )
if new_media_singleton is not None:
self._current_media = ( new_media_singleton.Duplicate(), )
for page in self._tag_repositories.GetPages():
page.SetMedia( self._current_media )

View File

@ -12,18 +12,21 @@ import ClientGUIScrolledPanels
import ClientGUIScrolledPanelsEdit
import ClientGUIPanels
import ClientGUIPopupMessages
import ClientGUITags
import ClientGUITime
import ClientGUITopLevelWindows
import ClientNetworking
import ClientNetworkingContexts
import ClientNetworkingDomain
import ClientPaths
import ClientGUITags
import ClientRendering
import ClientTags
import ClientThreading
import collections
import cookielib
import HydrusConstants as HC
import HydrusData
import HydrusExceptions
import HydrusGlobals as HG
import HydrusNATPunch
import HydrusPaths
@ -42,7 +45,7 @@ try:
MATPLOTLIB_OK = True
except ImportError:
except:
MATPLOTLIB_OK = False
@ -1885,6 +1888,62 @@ class ReviewExportFilesPanel( ClientGUIScrolledPanels.ReviewPanel ):
self._RefreshTags()
class ReviewHowBonedAmI( ClientGUIScrolledPanels.ReviewPanel ):
def __init__( self, parent, stats ):
ClientGUIScrolledPanels.ReviewPanel.__init__( self, parent )
( num_inbox, num_archive, size_inbox, size_archive ) = stats
vbox = wx.BoxSizer( wx.VERTICAL )
num_total = num_archive + num_inbox
size_total = size_archive + size_inbox
if num_total < 1000:
get_more = ClientGUICommon.BetterStaticText( self, label = 'I hope you enjoy my software. You might like to check out the downloaders!' )
vbox.Add( get_more, CC.FLAGS_CENTER )
elif num_inbox <= num_archive / 100:
hooray = ClientGUICommon.BetterStaticText( self, label = 'CONGRATULATIONS, YOU APPEAR TO BE UNBONED, BUT REMAIN EVER VIGILANT' )
vbox.Add( hooray, CC.FLAGS_CENTER )
else:
boned_path = os.path.join( HC.STATIC_DIR, 'boned.jpg' )
boned_bmp = ClientRendering.GenerateHydrusBitmap( boned_path, HC.IMAGE_JPEG ).GetWxBitmap()
win = ClientGUICommon.BufferedWindowIcon( self, boned_bmp )
vbox.Add( win, CC.FLAGS_CENTER )
num_archive_percent = float( num_archive ) / num_total
size_archive_percent = float( size_archive ) / size_total
num_inbox_percent = float( num_inbox ) / num_total
size_inbox_percent = float( size_inbox ) / size_total
archive_label = 'Archive: ' + HydrusData.ToHumanInt( num_archive ) + ' files (' + ClientData.ConvertZoomToPercentage( num_archive_percent ) + '), totalling ' + HydrusData.ConvertIntToBytes( size_archive ) + ' (' + ClientData.ConvertZoomToPercentage( size_archive_percent ) + ')'
archive_st = ClientGUICommon.BetterStaticText( self, label = archive_label )
inbox_label = 'Inbox: ' + HydrusData.ToHumanInt( num_inbox ) + ' files (' + ClientData.ConvertZoomToPercentage( num_inbox_percent ) + '), totalling ' + HydrusData.ConvertIntToBytes( size_inbox ) + ' (' + ClientData.ConvertZoomToPercentage( size_inbox_percent ) + ')'
inbox_st = ClientGUICommon.BetterStaticText( self, label = inbox_label )
vbox.Add( archive_st, CC.FLAGS_CENTER )
vbox.Add( inbox_st, CC.FLAGS_CENTER )
self.SetSizer( vbox )
class ReviewNetworkContextBandwidthPanel( ClientGUIScrolledPanels.ReviewPanel ):
def __init__( self, parent, controller, network_context ):
@ -2226,6 +2285,7 @@ class ReviewNetworkSessionsPanel( ClientGUIScrolledPanels.ReviewPanel ):
listctrl_panel.SetListCtrl( self._listctrl )
listctrl_panel.AddButton( 'create new', self._Add )
listctrl_panel.AddButton( 'import cookies.txt', self._ImportCookiesTXT )
listctrl_panel.AddButton( 'review', self._Review, enabled_only_on_selection = True )
listctrl_panel.AddButton( 'clear', self._Clear, enabled_only_on_selection = True )
listctrl_panel.AddSeparator()
@ -2320,6 +2380,42 @@ class ReviewNetworkSessionsPanel( ClientGUIScrolledPanels.ReviewPanel ):
return ( display_tuple, sort_tuple )
# this method is thanks to user prkc on the discord!
def _ImportCookiesTXT( self ):
with wx.FileDialog( self, 'select cookies.txt', style = wx.FD_OPEN ) as f_dlg:
if f_dlg.ShowModal() == wx.ID_OK:
path = HydrusData.ToUnicode( f_dlg.GetPath() )
cj = cookielib.MozillaCookieJar()
cj.load( path, ignore_discard = True, ignore_expires = True )
for cookie in cj:
try:
nc_domain = ClientNetworkingDomain.ConvertDomainIntoSecondLevelDomain( cookie.domain )
except HydrusExceptions.URLMatchException:
continue
context = ClientNetworkingContexts.NetworkContext( CC.NETWORK_CONTEXT_DOMAIN, nc_domain )
session = self._session_manager.GetSession( context )
session.cookies.set_cookie( cookie )
self._Update()
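The import flow above can be sketched in isolation. This is a hedged round-trip of one cookie through a temporary cookies.txt (the client is Python 2 and uses `cookielib`; `http.cookiejar` is the Python 3 name): `MozillaCookieJar` reads the Netscape cookies.txt format, and `ignore_discard`/`ignore_expires` mirror the client's call so session and stale cookies are kept too.

```python
import http.cookiejar
import os
import tempfile

def load_cookies_txt(path):
    # parses the classic Netscape cookies.txt format, one cookie per tab-separated line
    cj = http.cookiejar.MozillaCookieJar()
    cj.load(path, ignore_discard=True, ignore_expires=True)
    return list(cj)

# usage: write a one-cookie file and read it back
with tempfile.NamedTemporaryFile('w', suffix='.txt', delete=False) as f:
    f.write('# Netscape HTTP Cookie File\n')
    f.write('.example.com\tTRUE\t/\tFALSE\t2000000000\tsessionid\tabc123\n')
    path = f.name

cookies_out = [(c.domain, c.name, c.value) for c in load_cookies_txt(path)]
os.unlink(path)
print(cookies_out)  # [('.example.com', 'sessionid', 'abc123')]
```

The client then sets each loaded cookie on the matching domain's session via `session.cookies.set_cookie( cookie )`.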
def _Review( self ):
for network_context in self._listctrl.GetData( only_selected = True ):
@ -2365,6 +2461,7 @@ class ReviewNetworkSessionPanel( ClientGUIScrolledPanels.ReviewPanel ):
listctrl_panel.SetListCtrl( self._listctrl )
listctrl_panel.AddButton( 'add', self._Add )
listctrl_panel.AddButton( 'import cookies.txt', self._ImportCookiesTXT )
listctrl_panel.AddButton( 'edit', self._Edit, enabled_only_on_selection = True )
listctrl_panel.AddButton( 'delete', self._Delete, enabled_only_on_selection = True )
listctrl_panel.AddSeparator()
@ -2502,6 +2599,29 @@ class ReviewNetworkSessionPanel( ClientGUIScrolledPanels.ReviewPanel ):
self._Update()
# this method is thanks to user prkc on the discord!
def _ImportCookiesTXT( self ):
with wx.FileDialog( self, 'select cookies.txt', style = wx.FD_OPEN ) as f_dlg:
if f_dlg.ShowModal() == wx.ID_OK:
path = HydrusData.ToUnicode( f_dlg.GetPath() )
cj = cookielib.MozillaCookieJar()
cj.load( path, ignore_discard = True, ignore_expires = True )
for cookie in cj:
self._session.cookies.set_cookie( cookie )
self._Update()
def _SetCookie( self, name, value, domain, path, expires ):
version = 0

View File

@ -247,9 +247,12 @@ class RelatedTagsPanel( wx.Panel ):
if canvas_key == self._canvas_key:
self._media = ( new_media_singleton.Duplicate(), )
self._QuickSuggestedRelatedTags()
if new_media_singleton is not None:
self._media = ( new_media_singleton.Duplicate(), )
self._QuickSuggestedRelatedTags()
@ -372,9 +375,12 @@ class FileLookupScriptTagsPanel( wx.Panel ):
if canvas_key == self._canvas_key:
self._media = ( new_media_singleton.Duplicate(), )
self._SetTags( [] )
if new_media_singleton is not None:
self._media = ( new_media_singleton.Duplicate(), )
self._SetTags( [] )

View File

@ -797,7 +797,51 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):
if status != CC.STATUS_UNKNOWN:
break # if a known one-file url gives a single clear result, that result is reliable
# a known one-file url has given a single clear result. sounds good
we_have_a_match = True
if self.file_seed_type == FILE_SEED_TYPE_URL:
# to double-check, let's see if the file that claims that url has any other interesting urls
# if the file has another url with the same url class as ours, then this is prob an unreliable 'alternate' source url attribution, and untrustworthy
my_url = self.file_seed_data
if url != my_url:
my_url_match = HG.client_controller.network_engine.domain_manager.GetURLMatch( my_url )
( media_result, ) = HG.client_controller.Read( 'media_results', ( hash, ) )
this_files_urls = media_result.GetLocationsManager().GetURLs()
for this_files_url in this_files_urls:
if this_files_url != my_url:
this_url_match = HG.client_controller.network_engine.domain_manager.GetURLMatch( this_files_url )
if my_url_match == this_url_match:
# oh no, the file this source url refers to has a different known url in this same domain
# it is more likely that an edit on this site points to the original elsewhere
( status, hash, note ) = UNKNOWN_DEFAULT
we_have_a_match = False
break
if we_have_a_match:
break # if a known one-file url gives a single clear result, that result is reliable
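The distrust heuristic above reduces to one check: if the file a parsed source url points to already has another url of the same url class as the url being imported, the attribution is probably a bad 'alternate' link and the result falls back to unknown. A sketch under illustrative names (not the client's real API; the toy url-class lookup just classifies by hostname):

```python
def source_url_is_trustworthy(my_url, files_other_urls, get_url_class):
    my_class = get_url_class(my_url)
    for other_url in files_other_urls:
        if other_url != my_url and get_url_class(other_url) == my_class:
            # a same-class sibling url: likely a misattributed 'alternate' source
            return False
    return True

# toy url-class lookup: classify by hostname
get_url_class = lambda url: url.split('/')[2]

print(source_url_is_trustworthy(
    'https://booru.example/post/123',
    ['https://booru.example/post/456', 'https://origin.example/art/9'],
    get_url_class))  # False: the file already has another booru.example post url
```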

View File

@ -271,6 +271,11 @@ class GallerySeed( HydrusSerialisable.SerialisableBase ):
next_page_urls = ClientParsing.GetURLsFromParseResults( flattened_results, ( HC.URL_TYPE_NEXT, ), only_get_top_priority = True )
if self.url in next_page_urls:
next_page_urls.remove( self.url )
if len( next_page_urls ) > 0:
next_page_generation_phrase = ' next gallery pages found'

View File

@ -727,6 +727,7 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):
num_urls_added = 0
num_urls_already_in_file_seed_cache = 0
can_add_more_file_urls = True
stop_reason = 'no known stop reason'
for file_seed in file_seeds:
@ -761,7 +762,7 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):
WE_HIT_OLD_GROUND_THRESHOLD = 5
if num_urls_already_in_file_seed_cache > WE_HIT_OLD_GROUND_THRESHOLD:
if num_urls_already_in_file_seed_cache >= WE_HIT_OLD_GROUND_THRESHOLD:
can_add_more_file_urls = False
@ -809,6 +810,14 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):
num_existing_urls_this_stream += num_urls_already_in_file_seed_cache
WE_HIT_OLD_GROUND_TOTAL_THRESHOLD = 15
if num_existing_urls_this_stream >= WE_HIT_OLD_GROUND_TOTAL_THRESHOLD:
keep_checking = False
stop_reason = 'saw ' + HydrusData.ToHumanInt( WE_HIT_OLD_GROUND_TOTAL_THRESHOLD ) + ' previously seen urls in the whole sync, so assuming we caught up'
total_new_urls_for_this_sync += num_urls_added
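The two 'old ground' thresholds above (constants from the diff; the function shape is illustrative) combine as follows. Note the fix from `>` to `>=`: hitting exactly the per-page threshold now also stops adding urls.

```python
WE_HIT_OLD_GROUND_THRESHOLD = 5          # previously seen urls on one page
WE_HIT_OLD_GROUND_TOTAL_THRESHOLD = 15   # previously seen urls in the whole sync

def sync_decision(num_seen_this_page, num_seen_this_stream):
    stop_reason = 'no known stop reason'
    can_add_more_file_urls = num_seen_this_page < WE_HIT_OLD_GROUND_THRESHOLD
    keep_checking = num_seen_this_stream < WE_HIT_OLD_GROUND_TOTAL_THRESHOLD
    if not keep_checking:
        stop_reason = 'saw ' + str(WE_HIT_OLD_GROUND_TOTAL_THRESHOLD) + ' previously seen urls in the whole sync, so assuming we caught up'
    return can_add_more_file_urls, keep_checking, stop_reason

print(sync_decision(5, 3))  # exactly at the page threshold: stop adding this page
```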

View File

@ -108,7 +108,41 @@ def ConvertQueryDictToText( query_dict ):
def ConvertQueryTextToDict( query_text ):
query_dict = dict( urlparse.parse_qsl( query_text ) )
query_dict = {}
pairs = query_text.split( '&' )
for pair in pairs:
result = pair.split( '=', 1 )
# for the moment, ignore tracker bugs and so on that have only key and no value
if len( result ) == 2:
( key, value ) = result
try:
key = urllib.unquote( key )
except:
pass
try:
value = urllib.unquote( value )
except:
pass
query_dict[ key ] = value
return query_dict
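The rewritten parser above can be run standalone (Python 3's `urllib.parse.unquote` standing in for Python 2's `urllib.unquote`): split on `&`, keep only `key=value` pairs, and for the moment ignore valueless tracker keys.

```python
from urllib.parse import unquote

def convert_query_text_to_dict(query_text):
    query_dict = {}
    for pair in query_text.split('&'):
        result = pair.split('=', 1)
        # skip tracker bugs and so on that have only a key and no value
        if len(result) == 2:
            key, value = result
            try:
                key = unquote(key)
            except Exception:
                pass
            try:
                value = unquote(value)
            except Exception:
                pass
            query_dict[key] = value
    return query_dict

print(convert_query_text_to_dict('id=123&s=view&notracker&tags=blue%20sky'))
# {'id': '123', 's': 'view', 'tags': 'blue sky'}
```

Unlike the old `urlparse.parse_qsl` call, this keeps control over how bare keys and badly-encoded values are handled.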
@ -270,7 +304,7 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):
SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_NETWORK_DOMAIN_MANAGER
SERIALISABLE_NAME = 'Domain Manager'
SERIALISABLE_VERSION = 4
SERIALISABLE_VERSION = 5
def __init__( self ):
@ -432,6 +466,8 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):
def _GetSerialisableInfo( self ):
serialisable_gugs = self._gugs.GetSerialisableTuple()
serialisable_url_matches = self._url_matches.GetSerialisableTuple()
serialisable_url_match_keys_to_display = [ url_match_key.encode( 'hex' ) for url_match_key in self._url_match_keys_to_display ]
serialisable_url_match_keys_to_parser_keys = self._url_match_keys_to_parser_keys.GetSerialisableTuple()
@ -445,7 +481,7 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):
serialisable_parsers = self._parsers.GetSerialisableTuple()
serialisable_network_contexts_to_custom_header_dicts = [ ( network_context.GetSerialisableTuple(), custom_header_dict.items() ) for ( network_context, custom_header_dict ) in self._network_contexts_to_custom_header_dicts.items() ]
return ( serialisable_url_matches, serialisable_url_match_keys_to_display, serialisable_url_match_keys_to_parser_keys, serialisable_default_tag_import_options_tuple, serialisable_parsers, serialisable_network_contexts_to_custom_header_dicts )
return ( serialisable_gugs, serialisable_url_matches, serialisable_url_match_keys_to_display, serialisable_url_match_keys_to_parser_keys, serialisable_default_tag_import_options_tuple, serialisable_parsers, serialisable_network_contexts_to_custom_header_dicts )
def _GetURLMatch( self, url ):
@ -476,7 +512,9 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):
def _InitialiseFromSerialisableInfo( self, serialisable_info ):
( serialisable_url_matches, serialisable_url_match_keys_to_display, serialisable_url_match_keys_to_parser_keys, serialisable_default_tag_import_options_tuple, serialisable_parsers, serialisable_network_contexts_to_custom_header_dicts ) = serialisable_info
( serialisable_gugs, serialisable_url_matches, serialisable_url_match_keys_to_display, serialisable_url_match_keys_to_parser_keys, serialisable_default_tag_import_options_tuple, serialisable_parsers, serialisable_network_contexts_to_custom_header_dicts ) = serialisable_info
self._gugs = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_gugs )
self._url_matches = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_url_matches )
@ -645,6 +683,19 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):
return ( 4, new_serialisable_info )
if version == 4:
( serialisable_url_matches, serialisable_url_match_keys_to_display, serialisable_url_match_keys_to_parser_keys, serialisable_default_tag_import_options_tuple, serialisable_parsing_parsers, serialisable_network_contexts_to_custom_header_dicts ) = old_serialisable_info
gugs = HydrusSerialisable.SerialisableList()
serialisable_gugs = gugs.GetSerialisableTuple()
new_serialisable_info = ( serialisable_gugs, serialisable_url_matches, serialisable_url_match_keys_to_display, serialisable_url_match_keys_to_parser_keys, serialisable_default_tag_import_options_tuple, serialisable_parsing_parsers, serialisable_network_contexts_to_custom_header_dicts )
return ( 5, new_serialisable_info )
def CanValidateInPopup( self, network_contexts ):
@ -1044,7 +1095,9 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):
#check ngugs maybe
self._gugs = gugs
self._gugs = HydrusSerialisable.SerialisableList( gugs )
self._SetDirty()
@ -1628,7 +1681,7 @@ class URLMatch( HydrusSerialisable.SerialisableBaseNamed ):
SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_URL_MATCH
SERIALISABLE_NAME = 'URL Class'
SERIALISABLE_VERSION = 5
SERIALISABLE_VERSION = 6
def __init__( self, name, url_match_key = None, url_type = None, preferred_scheme = 'https', netloc = 'hostname.com', match_subdomains = False, keep_matched_subdomains = False, path_components = None, parameters = None, api_lookup_converter = None, can_produce_multiple_files = False, should_be_associated_with_files = True, gallery_index_type = None, gallery_index_identifier = None, gallery_index_delta = 1, example_url = 'https://hostname.com/post/page.php?id=123456&s=view' ):
@ -1644,18 +1697,19 @@ class URLMatch( HydrusSerialisable.SerialisableBaseNamed ):
if path_components is None:
path_components = HydrusSerialisable.SerialisableList()
path_components = []
path_components.append( ClientParsing.StringMatch( match_type = ClientParsing.STRING_MATCH_FIXED, match_value = 'post', example_string = 'post' ) )
path_components.append( ClientParsing.StringMatch( match_type = ClientParsing.STRING_MATCH_FIXED, match_value = 'page.php', example_string = 'page.php' ) )
path_components.append( ( ClientParsing.StringMatch( match_type = ClientParsing.STRING_MATCH_FIXED, match_value = 'post', example_string = 'post' ), None ) )
path_components.append( ( ClientParsing.StringMatch( match_type = ClientParsing.STRING_MATCH_FIXED, match_value = 'page.php', example_string = 'page.php' ), None ) )
if parameters is None:
parameters = HydrusSerialisable.SerialisableDictionary()
parameters = {}
parameters[ 's' ] = ClientParsing.StringMatch( match_type = ClientParsing.STRING_MATCH_FIXED, match_value = 'view', example_string = 'view' )
parameters[ 'id' ] = ClientParsing.StringMatch( match_type = ClientParsing.STRING_MATCH_FLEXIBLE, match_value = ClientParsing.NUMERIC, example_string = '123456' )
parameters[ 's' ] = ( ClientParsing.StringMatch( match_type = ClientParsing.STRING_MATCH_FIXED, match_value = 'view', example_string = 'view' ), None )
parameters[ 'id' ] = ( ClientParsing.StringMatch( match_type = ClientParsing.STRING_MATCH_FLEXIBLE, match_value = ClientParsing.NUMERIC, example_string = '123456' ), None )
parameters[ 'page' ] = ( ClientParsing.StringMatch( match_type = ClientParsing.STRING_MATCH_FLEXIBLE, match_value = ClientParsing.NUMERIC, example_string = '1' ), '1' )
if api_lookup_converter is None:
@ -1712,7 +1766,7 @@ class URLMatch( HydrusSerialisable.SerialisableBaseNamed ):
return netloc
def _ClipPath( self, path ):
def _ClipAndFleshOutPath( self, path, allow_clip = True ):
# /post/show/1326143/akunim-anthro-armband-armwear-clothed-clothing-fem
@ -1725,7 +1779,30 @@ class URLMatch( HydrusSerialisable.SerialisableBaseNamed ):
path_components = path.split( '/' )
path = '/'.join( path_components[ : len( self._path_components ) ] )
if allow_clip or len( path_components ) < len( self._path_components ):
clipped_path_components = []
for ( index, ( string_match, default ) ) in enumerate( self._path_components ):
if len( path_components ) > index: # the given path has the value
clipped_path_component = path_components[ index ]
elif default is not None:
clipped_path_component = default
else:
raise HydrusExceptions.URLMatchException( 'Could not clip path--given url appeared to be too short!' )
clipped_path_components.append( clipped_path_component )
path = '/'.join( clipped_path_components )
# post/show/1326143
@ -1739,13 +1816,31 @@ class URLMatch( HydrusSerialisable.SerialisableBaseNamed ):
return path
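A minimal stand-in for `_ClipAndFleshOutPath`, with `(rule, default)` tuples instead of StringMatch objects (StringMatch testing omitted; names are illustrative): take the first `len(spec)` path components, and where the given url is short, substitute the component's default or fail.

```python
def clip_and_flesh_out_path(path, spec, allow_clip=True):
    components = [c for c in path.split('/') if c != '']
    if allow_clip or len(components) < len(spec):
        out = []
        for index, (_rule, default) in enumerate(spec):
            if len(components) > index:  # the given path has the value
                out.append(components[index])
            elif default is not None:
                out.append(default)
            else:
                raise ValueError('Could not clip path--given url appeared to be too short!')
        components = out
    return '/' + '/'.join(components)

# last component has a default of '1', so a short url gets fleshed out
spec = [(None, None), (None, None), (None, '1')]
print(clip_and_flesh_out_path('/post/page.php', spec))  # /post/page.php/1
```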
def _ClipQuery( self, query ):
def _ClipAndFleshOutQuery( self, query, allow_clip = True ):
query_dict = ConvertQueryTextToDict( query )
valid_parameters = { key : value for ( key, value ) in query_dict.items() if key in self._parameters }
if allow_clip:
query_dict = { key : value for ( key, value ) in query_dict.items() if key in self._parameters }
query = ConvertQueryDictToText( valid_parameters )
for ( key, ( string_match, default ) ) in self._parameters.items():
if key not in query_dict:
if default is None:
raise HydrusExceptions.URLMatchException( 'Could not flesh out query--no default for ' + key + ' defined!' )
else:
query_dict[ key ] = default
query = ConvertQueryDictToText( query_dict )
return query
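The query-side counterpart, sketched with plain dicts instead of StringMatch objects (names are illustrative): drop unrecognised parameters when clipping, fill in defaults for missing ones, and raise when a required parameter has no default.

```python
def clip_and_flesh_out_query(query_dict, param_defaults, allow_clip=True):
    if allow_clip:
        # keep only the parameters the url class knows about
        query_dict = {k: v for k, v in query_dict.items() if k in param_defaults}
    for key, default in param_defaults.items():
        if key not in query_dict:
            if default is None:
                raise ValueError('Could not flesh out query--no default for ' + key + ' defined!')
            query_dict[key] = default
    # alphabetise, as the client's query normalisation does
    return '&'.join(k + '=' + v for k, v in sorted(query_dict.items()))

defaults = {'id': None, 's': None, 'page': '1'}
print(clip_and_flesh_out_query({'id': '123456', 's': 'view', 'utm': 'x'}, defaults))
# id=123456&page=1&s=view
```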
@ -1753,8 +1848,8 @@ class URLMatch( HydrusSerialisable.SerialisableBaseNamed ):
def _GetSerialisableInfo( self ):
serialisable_url_match_key = self._url_match_key.encode( 'hex' )
serialisable_path_components = self._path_components.GetSerialisableTuple()
serialisable_parameters = self._parameters.GetSerialisableTuple()
serialisable_path_components = [ ( string_match.GetSerialisableTuple(), default ) for ( string_match, default ) in self._path_components ]
serialisable_parameters = [ ( key, ( string_match.GetSerialisableTuple(), default ) ) for ( key, ( string_match, default ) ) in self._parameters.items() ]
serialisable_api_lookup_converter = self._api_lookup_converter.GetSerialisableTuple()
return ( serialisable_url_match_key, self._url_type, self._preferred_scheme, self._netloc, self._match_subdomains, self._keep_matched_subdomains, serialisable_path_components, serialisable_parameters, serialisable_api_lookup_converter, self._can_produce_multiple_files, self._should_be_associated_with_files, self._gallery_index_type, self._gallery_index_identifier, self._gallery_index_delta, self._example_url )
@ -1765,8 +1860,8 @@ class URLMatch( HydrusSerialisable.SerialisableBaseNamed ):
( serialisable_url_match_key, self._url_type, self._preferred_scheme, self._netloc, self._match_subdomains, self._keep_matched_subdomains, serialisable_path_components, serialisable_parameters, serialisable_api_lookup_converter, self._can_produce_multiple_files, self._should_be_associated_with_files, self._gallery_index_type, self._gallery_index_identifier, self._gallery_index_delta, self._example_url ) = serialisable_info
self._url_match_key = serialisable_url_match_key.decode( 'hex' )
self._path_components = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_path_components )
self._parameters = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_parameters )
self._path_components = [ ( HydrusSerialisable.CreateFromSerialisableTuple( serialisable_string_match ), default ) for ( serialisable_string_match, default ) in serialisable_path_components ]
self._parameters = { key : ( HydrusSerialisable.CreateFromSerialisableTuple( serialisable_string_match ), default ) for ( key, ( serialisable_string_match, default ) ) in serialisable_parameters }
self._api_lookup_converter = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_api_lookup_converter )
@ -1831,6 +1926,24 @@ class URLMatch( HydrusSerialisable.SerialisableBaseNamed ):
return ( 5, new_serialisable_info )
if version == 5:
( serialisable_url_match_key, url_type, preferred_scheme, netloc, match_subdomains, keep_matched_subdomains, serialisable_path_components, serialisable_parameters, serialisable_api_lookup_converter, can_produce_multiple_files, should_be_associated_with_files, gallery_index_type, gallery_index_identifier, gallery_index_delta, example_url ) = old_serialisable_info
path_components = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_path_components )
parameters = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_parameters )
path_components = [ ( value, None ) for value in path_components ]
parameters = { key : ( value, None ) for ( key, value ) in parameters.items() }
serialisable_path_components = [ ( string_match.GetSerialisableTuple(), default ) for ( string_match, default ) in path_components ]
serialisable_parameters = [ ( key, ( string_match.GetSerialisableTuple(), default ) ) for ( key, ( string_match, default ) ) in parameters.items() ]
new_serialisable_info = ( serialisable_url_match_key, url_type, preferred_scheme, netloc, match_subdomains, keep_matched_subdomains, serialisable_path_components, serialisable_parameters, serialisable_api_lookup_converter, can_produce_multiple_files, should_be_associated_with_files, gallery_index_type, gallery_index_identifier, gallery_index_delta, example_url )
return ( 6, new_serialisable_info )
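The 5-to-6 update above follows a simple pattern, shown here in miniature with plain values standing in for the serialised StringMatch objects: old rows are bare values, new rows are `(value, default)` tuples with `default = None`, so every existing url class keeps behaving exactly as before.

```python
def update_5_to_6(old_path_components, old_parameters):
    # wrap each stored value with a None default
    path_components = [(value, None) for value in old_path_components]
    parameters = {key: (value, None) for key, value in old_parameters.items()}
    return path_components, parameters

print(update_5_to_6(['post', 'page.php'], {'s': 'view'}))
# ([('post', None), ('page.php', None)], {'s': ('view', None)})
```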
def CanGenerateNextGalleryPage( self ):
@ -1854,6 +1967,11 @@ class URLMatch( HydrusSerialisable.SerialisableBaseNamed ):
return is_a_gallery_page or is_a_multipost_post_page
def ClippingIsAppropriate( self ):
return self._should_be_associated_with_files or self.UsesAPIURL()
def GetAPIURL( self, url = None ):
if url is None:
@ -1888,6 +2006,8 @@ class URLMatch( HydrusSerialisable.SerialisableBaseNamed ):
def GetNextGalleryPage( self, url ):
url = self.Normalise( url )
p = urlparse.urlparse( url )
scheme = p.scheme
@ -2005,11 +2125,6 @@ class URLMatch( HydrusSerialisable.SerialisableBaseNamed ):
def NormalisationIsAppropriate( self ):
return self._should_be_associated_with_files or self.UsesAPIURL()
def Normalise( self, url ):
p = urlparse.urlparse( url )
@ -2018,19 +2133,17 @@ class URLMatch( HydrusSerialisable.SerialisableBaseNamed ):
params = ''
fragment = ''
# gallery urls we don't want to clip stuff, but we do want to flip to https
if self.NormalisationIsAppropriate():
if self.ClippingIsAppropriate():
netloc = self._ClipNetLoc( p.netloc )
path = self._ClipPath( p.path )
query = self._ClipQuery( p.query )
path = self._ClipAndFleshOutPath( p.path )
query = self._ClipAndFleshOutQuery( p.query )
else:
netloc = p.netloc
path = p.path
query = AlphabetiseQueryText( p.query )
path = self._ClipAndFleshOutPath( p.path, allow_clip = False )
query = self._ClipAndFleshOutQuery( p.query, allow_clip = False )
r = urlparse.ParseResult( scheme, netloc, path, params, query, fragment )
@ -2085,35 +2198,41 @@ class URLMatch( HydrusSerialisable.SerialisableBaseNamed ):
url_path_components = url_path.split( '/' )
if len( url_path_components ) < len( self._path_components ):
raise HydrusExceptions.URLMatchException( url_path + ' did not have ' + str( len( self._path_components ) ) + ' components' )
for ( url_path_component, expected_path_component ) in zip( url_path_components, self._path_components ):
try:
expected_path_component.Test( url_path_component )
except HydrusExceptions.StringMatchException as e:
raise HydrusExceptions.URLMatchException( HydrusData.ToUnicode( e ) )
for ( index, ( string_match, default ) ) in enumerate( self._path_components ):
if len( url_path_components ) > index:
url_path_component = url_path_components[ index ]
try:
string_match.Test( url_path_component )
except HydrusExceptions.StringMatchException as e:
raise HydrusExceptions.URLMatchException( HydrusData.ToUnicode( e ) )
elif default is None:
raise HydrusExceptions.URLMatchException( url_path + ' did not have enough of the required path components!' )
url_parameters = ConvertQueryTextToDict( p.query )
if len( url_parameters ) < len( self._parameters ):
raise HydrusExceptions.URLMatchException( p.query + ' did not have ' + str( len( self._parameters ) ) + ' parameters' )
for ( key, string_match ) in self._parameters.items():
for ( key, ( string_match, default ) ) in self._parameters.items():
if key not in url_parameters:
raise HydrusExceptions.URLMatchException( key + ' not found in ' + p.query )
if default is None:
raise HydrusExceptions.URLMatchException( key + ' not found in ' + p.query )
else:
continue
value = url_parameters[ key ]

View File

@ -1355,13 +1355,26 @@ class ParseFormulaJSON( ParseFormula ):
if parse_rule is None:
if not isinstance( root, list ):
if isinstance( root, list ):
next_roots.extend( root )
elif isinstance( root, dict ):
pairs = list( root.items() )
pairs.sort()
for ( key, value ) in pairs:
next_roots.append( value )
else:
continue
next_roots.extend( root )
elif isinstance( parse_rule, int ):
if not isinstance( root, list ):

View File

@ -49,7 +49,7 @@ options = {}
# Misc
NETWORK_VERSION = 18
SOFTWARE_VERSION = 319
SOFTWARE_VERSION = 320
UNSCALED_THUMBNAIL_DIMENSIONS = ( 200, 200 )

BIN
static/boned.jpg Normal file


After

Width:  |  Height:  |  Size: 2.1 KiB

Binary file not shown.

Before

Width:  |  Height:  |  Size: 2.2 KiB

After

Width:  |  Height:  |  Size: 2.2 KiB

Binary file not shown.

Before

Width:  |  Height:  |  Size: 2.2 KiB

After

Width:  |  Height:  |  Size: 2.2 KiB

Binary file not shown.

Before

Width:  |  Height:  |  Size: 2.2 KiB

After

Width:  |  Height:  |  Size: 2.2 KiB