Version 522

This commit is contained in:
Hydrus Network Developer 2023-03-29 15:57:59 -05:00
parent 67f8b3e651
commit 91e1c54d24
No known key found for this signature in database
GPG Key ID: 76249F053212133C
46 changed files with 1562 additions and 368 deletions


@ -132,3 +132,13 @@ The cool thing about JSON files is I can export multiple times to the same file
```
You should be careful that the location you are exporting to does not have any old JSON files with conflicting filenames in it--hydrus will update them, not overwrite them! This may be an issue if you have a synchronising Export Folder that exports random files with the same filenames.
## Note on Notes
You can now import/export notes with your sidecars. Since notes have two variables--name and text--but the sidecars system only supports lists of single strings, I merge these together! If you export notes, they will output in the form 'name: text'. If you want to import notes, arrange them in the same form, 'name: text'.
If you do need to select a particular note out of many, see if a String Match (regex `^name: `) in the String Processor will do it.
If you need to work with multiple notes that have newlines, I recommend you use JSON rather than txt. If you have to use txt on multiple multi-paragraph-notes, then try a different separator than newline. Go for `||||` or something, whatever works for your job.
Depending on how awkward this all is, I may revise it.
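The 'name: text' merge rule above can be sketched in a few lines of Python. These helper names are hypothetical, not part of hydrus, but they show the round-trip and the regex-style selection trick:

```python
import re

def merge_note(name, text):
    # hydrus exports each note as a single 'name: text' string
    return f'{name}: {text}'

def split_note(row):
    # split on the first ': ' only, so colons inside the text survive
    name, _, text = row.partition(': ')
    return name, text

def select_notes(rows, name):
    # pick one note out of many, String-Match-style, with regex '^name: '
    pattern = re.compile(f'^{re.escape(name)}: ')
    return [row for row in rows if pattern.match(row)]

rows = ['source: gelbooru', 'translation: hello: world']
assert merge_note('source', 'gelbooru') == 'source: gelbooru'
assert split_note('translation: hello: world') == ('translation', 'hello: world')
assert select_notes(rows, 'source') == ['source: gelbooru']
```

Splitting on the first ': ' only is the important part--note text can itself contain colons.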


@ -7,6 +7,52 @@ title: Changelog
!!! note
This is the new changelog, only the most recent builds. For all versions, see the [old changelog](old_changelog.html).
## [Version 522](https://github.com/hydrusnetwork/hydrus/releases/tag/v522)
### notes in sidecars
* the sidecars system now supports notes!
* my sidecars only support univariate rows atm (a list of strings, rather than, say, a list of pairs of strings), so I had to decide how to handle note names. if I reworked the pipeline to handle multivariate data, it would take weeks; if I incorporated explicit names into the sidecar object, it would have made 'get/export all my notes' awkward or impossible and not solved the storage problem; so I have compromised in this first version by choosing to import/export everything and merging the name and text into the same row. it expects/says 'name: text' for input and output. let me know what you think. I may revisit this, depending on how it goes
* I added a note to the sidecars help about this special 'name: text' rule along with a couple ideas for tricky situations
### misc
* added 'system:framerate' and 'system:number of frames' to the system predicate parser!
* I am undoing two changes to tag logic from last week: you can now have as many colons at the start of a tag as you like, and the content parser no longer tries to stop double-stacked namespaces. both of these were more trouble than they were worth. in related news, '::' is now a valid tag again, displaying as ':', and you can create ':blush:'-style tags by typing '::blush:'. I'm pretty sure these tags will autocomplete search awfully, so if you end up using something like this legit, let me know how it goes
* if you change the 'media/preview viewer uses its own volume' setting, the client now updates the UI sliders for this immediately, it doesn't need a client restart. the actual volume on the video also changes immediately
* when an mpv window is called to play media that has 'no audio', the mpv window is now explicitly muted. we'll see if this fixes an interesting issue where on one system, videos that have an audio channel with no sound, which hydrus detects as 'no audio', were causing cracks and pops and bursts of hellnoise in mpv (we suspect some sort of normalisation gain error)
### file safety with duplicate symlinked directory entries
* the main hydrus function that merges/mirrors files and directories now checks if the source and destination are the same location but with two different representations (e.g. a mapped drive and its network location). if so, to act as a final safety backstop, the mirror skips work and the merge throws an error. previously, if you wangled two entries for the same location into 'migrate database' and started a migration, it could cause file deletions!
* I've also updated my database migration routines to recognise and handle this situation explicitly. it now skips all file operations and just updates the location record instantly. it is now safe to have the same location twice in the dialog using different names, and to migrate from one to the other. the only bizarro thing is if you look in the directory, it of course has the contents of both. as always though, I'll say make backups regularly, and sync them before you do any big changes like a migration--then if something goes wrong, you always have an up-to-date backup to roll back to
* the 'migrate database' dialog no longer chases the real path of what you give it. if you want to give it the mapped drive Z:, it'll take and remember it
* some related 'this is in the wrong place' recovery code handles these symlink situations better as well
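The safety check described above comes down to asking the filesystem whether two directory entries resolve to the same place. A minimal sketch, with a hypothetical helper name--hydrus's real routine does more, but `os.path.samefile` is the core idea:

```python
import os
import tempfile

def is_same_location(source, dest):
    # two directory entries can point at the same place via a symlink
    # or a mapped drive; samefile() resolves that before any merge/mirror
    return (
        os.path.exists(source)
        and os.path.exists(dest)
        and os.path.samefile(source, dest)
    )

with tempfile.TemporaryDirectory() as d:
    real = os.path.join(d, 'files')
    os.makedirs(real)
    link = os.path.join(d, 'files_link')
    os.symlink(real, link)
    assert is_same_location(real, link)   # same place, two names: skip the work
    assert not is_same_location(real, d)  # genuinely different: safe to merge
```

If the check fires, a mirror can safely no-op and a merge can raise, rather than a file ever being copied onto (or deleted out from under) itself.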
### advanced new parsing tricks
* thanks to a clever user doing the heavy lifting, there are two neat but advanced additions to the downloader system
* first, the parsing system has a new content parser type, 'http headers', which lets you parse http headers to be used on subsequent downloads created by the parsing downloader object (e.g. next gallery page urls, file downloads from post pages, multi-file posts that split off to single post page urls). should be possible to wangle tokenized gallery searches and file downloads and some hacky login systems
* second, the string converter system now lets you calculate the normal hydrus hashes--md5, sha1, sha256, sha512--of any string (decoding it by utf-8), outputting hexadecimal
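The new hash conversion is straightforward to express in Python--encode the string as utf-8, hash it, output hexadecimal. This standalone sketch mirrors the behaviour; the function name is illustrative, not hydrus's actual API:

```python
import hashlib

SUPPORTED = ('md5', 'sha1', 'sha256', 'sha512')

def hash_string(s, hash_function):
    # mirrors the string converter step: utf-8 encode, hash, hexdigest
    if hash_function not in SUPPORTED:
        raise ValueError(f'Unknown hash function "{hash_function}"!')
    return getattr(hashlib, hash_function)(s.encode('utf-8')).hexdigest()

assert hash_string('hydrus', 'md5') == hashlib.md5(b'hydrus').hexdigest()
assert len(hash_string('hydrus', 'sha256')) == 64  # 32 bytes as hex
```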
### http headers on the client api
* the client api now lets you see and edit the http headers (as under _network->data->review http headers_) for the global network context and specific domains. the commands are `/manage_headers/get_headers` and `/manage_headers/set_headers`
* if you have the 'Make a short-lived popup on cookie updates through the Client API' option set (under 'popups' options page), this now applies to these header changes too
* also debuting on the side is a 'network context' object in the `get_headers` response, confirming the domain you set it for. this is an internal object that does domain location stuff all over. it isn't important here, but as we do more network domain setting editing, I expect we'll see more of this guy
* I added some documentation for all this, as normal, to the client api help
* the labels and help around 'manage cookies' permission are now 'manage cookies and headers'
* the client api version is now 43
* the old `/manage_headers/set_user_agent` still works. ideally, please move to `set_headers`, since it isn't that complex, but no rush. I've made a job to delete it in a year
* while I was doing this, I realised get/set_cookies is pretty bad. I hate their old 'just spam tuples' approach. I've slowly been replacing this stuff with nicer named JSON Objects as is more typical in APIs and is easier to update, so I expect I'll overhaul them at some point
### boring cleanup
* gave the about window a pass. it now runs on the newer scrolling panel system using my hydrus UI objects (so e.g. the hyperlink now opens on a custom browser command, if you need it), says what platform you are on and whether you are source/build/app, and the version info lines are cleaned a little
* fixed/cleaned some bad code all around http header management
* wrote some unit tests for http headers in the client api
* wrote some unit tests for notes in sidecars
## [Version 521](https://github.com/hydrusnetwork/hydrus/releases/tag/v521)
### some tag presentation
@ -346,58 +392,3 @@ title: Changelog
* reordered and renamed the dev help headers in the same way
* simple but significant rename-refactoring in file duplicates database module, tearing off the old 'Duplicates' prefixes to every method ha ha
* updated the advanced Windows 'running from source' help to talk more about VC build tools. some old scripts don't seem to work any more in Win 11, but you also don't really need it any more (I moved to a new dev machine this week so had to set everything up again)
## [Version 512](https://github.com/hydrusnetwork/hydrus/releases/tag/v512)
### two searches in duplicates
* the duplicate filter page now lets you search 'one file is in this search, the other is in this search'! the only real limitation is both searches are locked to the same file domain
* the main neat thing is you can now search 'pngs vs jpegs, and must be pixel dupes' super easy. this is the first concrete step towards my plan to introduce an optional duplicate auto resolution system (png/jpeg pixel dupes is easy--the jpeg is 99.9999% always better)
* the database tech to get this working was actually simpler than 'one file matches the search', and in testing it works at _ok_ speed, so we'll see how this goes IRL
* duplicate calculations should be faster in some simple cases, usually when you set a search to system:everything. this extends to the new two-search mode too (e.g. a two-search with one as system:everything is just a one-search, and the system optimises for this), however I also search complicated domains much more precisely now, which may make some duplicate search stuff work real slow. again, let me know!
### sidecars
* the txt importer/exporter sidecars now allow custom 'separators', so if you don't want newlines, you can use ', ' or whatever format you need
### misc
* when you right-click on a selection of thumbs, the 'x files' can now be 'x videos' or 'x pngs' etc., as you see on the status bar
* when you select or right-click on a selection of thumbs that all have duration, the status bar and menu now show the total duration of your selection. same deal on the status bar if you have no selection on a page of only duration-having media
* thanks to the user who figured out the correct render flag, the new 'thumbnail ui-scale supersampling %' option now draws non-pixelly thumbs on 100% monitors when it is set higher (e.g. 200% thumbs drawing on 100% monitor), so users with unusual multi-monitor setups etc... should have a nicer experience. as the tooltip now says, this setting should now be set to the largest UI scale you have
* I removed the newgrounds downloader from the defaults (this only affects new users). the downloader has been busted for a while, and last time I looked, it was not trivial to figure out, so I am removing myself from the question
* the 'manage where tag siblings and parents apply' dialog now explicitly points users to the 'review current sync' panel
### client api
* a new command, /manage_pages/refresh_page, refreshes the specified page
* the help is updated to talk about this
* client api version is now 39
### server management
* in the 'modify accounts' dialog, if the null account is checked when you try to do an action, it will be unchecked. this should stop the annoying 400 Errors when you accidentally try to set it to something
* also, if you do 'add to expires', any accounts that currently do not expire will be deselected before the action too, with a brief dialog note about it
### other duplicates improvements
* I reworked a ton of code here, fixing a heap of logic and general 'that isn't quite what you'd expect' comparison selection issues. ideally, the system will just make more obvious human sense more often, but this tech gets a little complicated as it tries to select comparison kings from larger groups, and we might have some situations where it says '3 pairs', but when you load it in the filter it says 'no pairs found m8', so let me know how it goes!
* first, most importantly, the 'show some random potential pairs' button is vastly improved. it is now much better about limiting the group of presented files to what you specifically have searched, and the 'pixel dupes' and 'search distance' settings are obeyed properly (previously it was fetching too many potentials, not always limiting to the search you set, and choosing candidates from larger groups too liberally)
* while it shows smaller groups now, since they are all culled better, it _should_ select larger groups more often than before
* when you say 'show some random potential pairs' with 'at least one file matches the search', the first file displayed, which is the 'master' that the other file(s) are paired against, now always matches the search. when you are set to the new two-search 'files match different searches', the master will always match the first search, and the others of the pairs will always match the second search. in the filter itself, some similar logic applies, so the files selected for actual comparison should match the search you inputted better.
* setting duplicates with 'custom options' from the thumbnail menu and selecting 'this is better' now correctly sets the focused media as the best. previously it set the first file as the best
* also, in the duplicate merge options, you can now set notes to 'move' from worse to better
* as a side thing, the 'search distance' number control is now disabled if you select 'must be pixel dupes'. duh!
### boring cleanup
* refactored the duplicate comparison statement generation code from ClientMedia to ClientDuplicates
* significantly refactored all the duplicate files calculation pipelines to deal with two file search contexts
* cleaned up a bunch of the 'find potential duplicate pairs in this file domain' master table join code. less hardcoding, more dynamic assembly
* refactored the duplicated 'figure out pixel dupes table join gubbins' code in the file duplicates database module into a single separate method, and rolled in the base initialisation and hamming distance part into it too, clearing out more duplicated code
* split up the 'both files match' search code into separate methods to further clean the logic here
* updated the main object that handles page data to the new serialisable dictionary, combining its hardcoded key/primitive/serialisable storage into one clean dict that looks after itself
* cleaned up the type definitions of the main database file search and fixed the erroneous empty set returns
* I added a couple unit tests for the new .txt sidecar separator
* fixed a bad sidecar unit test
* 'client_running' and 'server_running' are now in the .gitignore


@ -221,7 +221,7 @@ Arguments:
* 2 - Edit File Tags
* 3 - Search for and Fetch Files
* 4 - Manage Pages
* 5 - Manage Cookies
* 5 - Manage Cookies and Headers
* 6 - Manage Database
* 7 - Edit File Notes
* 8 - Manage File Relationships
@ -2035,29 +2035,29 @@ Response:
The files will be promoted to be the kings of their respective duplicate groups. If the file is already the king (also true for any file with no duplicates), this is idempotent. It also processes the files in the given order, so if you specify two files in the same group, the latter will be the king at the end of the request.
## Managing Cookies and HTTP Headers
## Managing Cookies
This refers to the cookies held in the client's session manager, which are sent with network requests to different domains.
This refers to the cookies held in the client's session manager, which you can review under _network->data->manage session cookies_. These are sent with every request on the respective domains.
### **GET `/manage_cookies/get_cookies`** { id="manage_cookies_get_cookies" }
_Get the cookies for a particular domain._
Restricted access:
: YES. Manage Cookies permission needed.
: YES. Manage Cookies and Headers permission needed.
Required Headers: n/a
Arguments:
: * `domain`
``` title="Example request (for gelbooru.com)"
/manage_cookies/get_cookies?domain=gelbooru.com
```
``` title="Example request (for gelbooru.com)"
/manage_cookies/get_cookies?domain=gelbooru.com
```
Response:
: A JSON Object listing all the cookies for that domain in \[ name, value, domain, path, expires \] format.
```json title="Example response"
{
"cookies" : [
@ -2077,7 +2077,7 @@ Response:
Set some new cookies for the client. This makes it easier to 'copy' a login from a web browser or similar to hydrus if hydrus's login system can't handle the site yet.
Restricted access:
: YES. Manage Cookies permission needed.
: YES. Manage Cookies and Headers permission needed.
Required Headers:
:
@ -2100,12 +2100,121 @@ You can set 'value' to be null, which will clear any existing cookie with the co
Expires can be null, but session cookies will time-out in hydrus after 60 minutes of non-use.
## Managing HTTP Headers
This refers to the custom headers you can see under _network->data->manage http headers_.
### **GET `/manage_headers/get_headers`** { id="manage_headers_get_headers" }
Get the custom http headers.
Restricted access:
: YES. Manage Cookies and Headers permission needed.
Required Headers: n/a
Arguments:
: * `domain`: optional, the domain to fetch headers for
``` title="Example request (for gelbooru.com)"
/manage_headers/get_headers?domain=gelbooru.com
```
``` title="Example request (for global)"
/manage_headers/get_headers
```
Response:
: A JSON Object listing all the headers:
```json title="Example response"
{
"network_context" : {
"type" : 2,
"data" : "gelbooru.com"
},
"headers" : {
"User-Agent" : {
"value" : "Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:56.0) Gecko/20100101 Firefox/56.0",
"approved" : "approved",
"reason" : "Set by Client API"
},
"DNT" : {
"value" : "1",
"approved" : "approved",
"reason" : "Set by Client API"
}
}
}
```
### **POST `/manage_headers/set_headers`** { id="manage_headers_set_headers" }
Manages the custom http headers.
Restricted access:
: YES. Manage Cookies and Headers permission needed.
Required Headers:
:
* `Content-Type`: application/json
Arguments (in JSON):
:
* `domain`: (optional, the specific domain to set the header for)
* `headers`: (a JSON Object that holds "key" objects)
```json title="Example request body"
{
"domain" : "mysite.com",
"headers" : {
"User-Agent" : {
"value" : "Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:56.0) Gecko/20100101 Firefox/56.0"
},
"DNT" : {
"value" : "1"
},
"CoolStuffToken" : {
"value" : "abcdef0123456789",
"approved" : "pending",
"reason" : "This unlocks the Sonic fanfiction!"
}
}
}
```
```json title="Example request body that deletes"
{
"domain" : "myothersite.com",
"headers" : {
"User-Agent" : {
"value" : null
},
"Authorization" : {
"value" : null
}
}
}
```
If you do not set a domain, or you set it to `null`, the 'context' will be the global context, which applies as a fallback to all jobs.
Domain headers also apply to their subdomains--unless they are overwritten by specific subdomain entries.
Each `key` Object under `headers` has the same form as [/manage\_headers/get\_headers](#manage_headers_get_headers). `value` is obvious--it is the value of the header. If the pair doesn't exist yet, you need the `value`, but if you just want to approve something, it is optional. Set it to `null` to delete an existing pair.
You probably won't ever use `approved` or `reason`, but they plug into the 'validation' system in the client. They are both optional. Approved can be any of `[ approved, denied, pending ]`, and by default everything you add will be `approved`. If there is anything `pending` when a network job asks, the user will be presented with a yes/no popup presenting the reason for the header. If they click 'no', the header is set to `denied` and the network job goes ahead without it. If you have a header that changes behaviour or unlocks special content, you might like to make it optional in this way.
If you need to reinstate it, the default `global` `User-Agent` is `Mozilla/5.0 (compatible; Hydrus Client)`.
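Putting the above together, a call to `set_headers` from Python's standard library looks like this. The localhost port 45869 and the placeholder access key are assumptions here--use your own client's address and a key that has the 'manage cookies and headers' permission:

```python
import json
import urllib.request

API = 'http://127.0.0.1:45869'          # hydrus client api, default port assumed
ACCESS_KEY = 'your 64-character access key here'

# set a custom User-Agent for one domain, per the request body shape above
body = {
    'domain': 'mysite.com',
    'headers': {
        'User-Agent': {
            'value': 'Mozilla/5.0 (compatible; Hydrus Client)'
        }
    }
}

req = urllib.request.Request(
    f'{API}/manage_headers/set_headers',
    data=json.dumps(body).encode('utf-8'),
    headers={
        'Content-Type': 'application/json',
        'Hydrus-Client-API-Access-Key': ACCESS_KEY,
    },
)
# urllib.request.urlopen(req)  # uncomment with a real client running
```

Setting `"value": null` in the same body shape deletes an existing header, and omitting `domain` targets the global context.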
### **POST `/manage_headers/set_user_agent`** { id="manage_headers_set_user_agent" }
_This is deprecated--move to [/manage\_headers/set\_headers](#manage_headers_set_headers)!_
This sets the 'Global' User-Agent for the client, as typically editable under _network->data->manage http headers_, for instance if you want hydrus to appear as a specific browser associated with some cookies.
Restricted access:
: YES. Manage Cookies permission needed.
: YES. Manage Cookies and Headers permission needed.
Required Headers:
:


@ -34,6 +34,43 @@
<div class="content">
<h1 id="changelog"><a href="#changelog">changelog</a></h1>
<ul>
<li>
<h2 id="version_522"><a href="#version_522">version 522</a></h2>
<ul>
<li><h3>notes in sidecars</h3></li>
<li>the sidecars system now supports notes!</li>
<li>my sidecars only support univariate rows atm (a list of strings, rather than, say, a list of pairs of strings), so I had to decide how to handle note names. if I reworked the pipeline to handle multivariate data, it would take weeks; if I incorporated explicit names into the sidecar object, it would have made 'get/export all my notes' awkward or impossible and not solved the storage problem; so I have compromised in this first version by choosing to import/export everything and merging the name and text into the same row. it expects/says 'name: text' for input and output. let me know what you think. I may revisit this, depending on how it goes</li>
<li>I added a note to the sidecars help about this special 'name: text' rule along with a couple ideas for tricky situations</li>
<li><h3>misc</h3></li>
<li>added 'system:framerate' and 'system:number of frames' to the system predicate parser!</li>
<li>I am undoing two changes to tag logic from last week: you can now have as many colons at the start of a tag as you like, and the content parser no longer tries to stop double-stacked namespaces. both of these were more trouble than they were worth. in related news, '::' is now a valid tag again, displaying as ':', and you can create ':blush:'-style tags by typing '::blush:'. I'm pretty sure these tags will autocomplete search awfully, so if you end up using something like this legit, let me know how it goes</li>
<li>if you change the 'media/preview viewer uses its own volume' setting, the client now updates the UI sliders for this immediately, it doesn't need a client restart. the actual volume on the video also changes immediately</li>
<li>when an mpv window is called to play media that has 'no audio', the mpv window is now explicitly muted. we'll see if this fixes an interesting issue where on one system, videos that have an audio channel with no sound, which hydrus detects as 'no audio', were causing cracks and pops and bursts of hellnoise in mpv (we suspect some sort of normalisation gain error)</li>
<li><h3>file safety with duplicate symlinked directory entries</h3></li>
<li>the main hydrus function that merges/mirrors files and directories now checks if the source and destination are the same location but with two different representations (e.g. a mapped drive and its network location). if so, to act as a final safety backstop, the mirror skips work and the merge throws an error. previously, if you wangled two entries for the same location into 'migrate database' and started a migration, it could cause file deletions!</li>
<li>I've also updated my database migration routines to recognise and handle this situation explicitly. it now skips all file operations and just updates the location record instantly. it is now safe to have the same location twice in the dialog using different names, and to migrate from one to the other. the only bizarro thing is if you look in the directory, it of course has the contents of both. as always though, I'll say make backups regularly, and sync them before you do any big changes like a migration--then if something goes wrong, you always have an up-to-date backup to roll back to</li>
<li>the 'migrate database' dialog no longer chases the real path of what you give it. if you want to give it the mapped drive Z:, it'll take and remember it</li>
<li>some related 'this is in the wrong place' recovery code handles these symlink situations better as well</li>
<li><h3>advanced new parsing tricks</h3></li>
<li>thanks to a clever user doing the heavy lifting, there are two neat but advanced additions to the downloader system</li>
<li>first, the parsing system has a new content parser type, 'http headers', which lets you parse http headers to be used on subsequent downloads created by the parsing downloader object (e.g. next gallery page urls, file downloads from post pages, multi-file posts that split off to single post page urls). should be possible to wangle tokenized gallery searches and file downloads and some hacky login systems</li>
<li>second, the string converter system now lets you calculate the normal hydrus hashes--md5, sha1, sha256, sha512--of any string (decoding it by utf-8), outputting hexadecimal</li>
<li><h3>http headers on the client api</h3></li>
<li>the client api now lets you see and edit the http headers (as under _network->data->review http headers_) for the global network context and specific domains. the commands are `/manage_headers/get_headers` and `/manage_headers/set_headers`</li>
<li>if you have the 'Make a short-lived popup on cookie updates through the Client API' option set (under 'popups' options page), this now applies to these header changes too</li>
<li>also debuting on the side is a 'network context' object in the `get_headers` response, confirming the domain you set it for. this is an internal object that does domain location stuff all over. it isn't important here, but as we do more network domain setting editing, I expect we'll see more of this guy</li>
<li>I added some documentation for all this, as normal, to the client api help</li>
<li>the labels and help around 'manage cookies' permission are now 'manage cookies and headers'</li>
<li>the client api version is now 43</li>
<li>the old `/manage_headers/set_user_agent` still works. ideally, please move to `set_headers`, since it isn't that complex, but no rush. I've made a job to delete it in a year</li>
<li>while I was doing this, I realised get/set_cookies is pretty bad. I hate their old 'just spam tuples' approach. I've slowly been replacing this stuff with nicer named JSON Objects as is more typical in APIs and is easier to update, so I expect I'll overhaul them at some point</li>
<li><h3>boring cleanup</h3></li>
<li>gave the about window a pass. it now runs on the newer scrolling panel system using my hydrus UI objects (so e.g. the hyperlink now opens on a custom browser command, if you need it), says what platform you are on and whether you are source/build/app, and the version info lines are cleaned a little</li>
<li>fixed/cleaned some bad code all around http header management</li>
<li>wrote some unit tests for http headers in the client api</li>
<li>wrote some unit tests for notes in sidecars</li>
</ul>
</li>
<li>
<h2 id="version_521"><a href="#version_521">version 521</a></h2>
<ul>


@ -14,12 +14,12 @@ CLIENT_API_PERMISSION_ADD_FILES = 1
CLIENT_API_PERMISSION_ADD_TAGS = 2
CLIENT_API_PERMISSION_SEARCH_FILES = 3
CLIENT_API_PERMISSION_MANAGE_PAGES = 4
CLIENT_API_PERMISSION_MANAGE_COOKIES = 5
CLIENT_API_PERMISSION_MANAGE_HEADERS = 5
CLIENT_API_PERMISSION_MANAGE_DATABASE = 6
CLIENT_API_PERMISSION_ADD_NOTES = 7
CLIENT_API_PERMISSION_MANAGE_FILE_RELATIONSHIPS = 8
ALLOWED_PERMISSIONS = ( CLIENT_API_PERMISSION_ADD_FILES, CLIENT_API_PERMISSION_ADD_TAGS, CLIENT_API_PERMISSION_ADD_URLS, CLIENT_API_PERMISSION_SEARCH_FILES, CLIENT_API_PERMISSION_MANAGE_PAGES, CLIENT_API_PERMISSION_MANAGE_COOKIES, CLIENT_API_PERMISSION_MANAGE_DATABASE, CLIENT_API_PERMISSION_ADD_NOTES, CLIENT_API_PERMISSION_MANAGE_FILE_RELATIONSHIPS )
ALLOWED_PERMISSIONS = ( CLIENT_API_PERMISSION_ADD_FILES, CLIENT_API_PERMISSION_ADD_TAGS, CLIENT_API_PERMISSION_ADD_URLS, CLIENT_API_PERMISSION_SEARCH_FILES, CLIENT_API_PERMISSION_MANAGE_PAGES, CLIENT_API_PERMISSION_MANAGE_HEADERS, CLIENT_API_PERMISSION_MANAGE_DATABASE, CLIENT_API_PERMISSION_ADD_NOTES, CLIENT_API_PERMISSION_MANAGE_FILE_RELATIONSHIPS )
basic_permission_to_str_lookup = {}
@ -28,7 +28,7 @@ basic_permission_to_str_lookup[ CLIENT_API_PERMISSION_ADD_FILES ] = 'import and
basic_permission_to_str_lookup[ CLIENT_API_PERMISSION_ADD_TAGS ] = 'edit file tags'
basic_permission_to_str_lookup[ CLIENT_API_PERMISSION_SEARCH_FILES ] = 'search and fetch files'
basic_permission_to_str_lookup[ CLIENT_API_PERMISSION_MANAGE_PAGES ] = 'manage pages'
basic_permission_to_str_lookup[ CLIENT_API_PERMISSION_MANAGE_COOKIES ] = 'manage cookies'
basic_permission_to_str_lookup[ CLIENT_API_PERMISSION_MANAGE_HEADERS ] = 'manage cookies and headers'
basic_permission_to_str_lookup[ CLIENT_API_PERMISSION_MANAGE_DATABASE ] = 'manage database'
basic_permission_to_str_lookup[ CLIENT_API_PERMISSION_ADD_NOTES ] = 'edit file notes'
basic_permission_to_str_lookup[ CLIENT_API_PERMISSION_MANAGE_FILE_RELATIONSHIPS ] = 'manage file relationships'


@ -614,17 +614,37 @@ class ClientFilesManager( object ):
def _GetRecoverTuple( self ):
all_locations = { location for location in list(self._prefixes_to_locations.values()) }
all_locations = { location for location in self._prefixes_to_locations.values() }
all_prefixes = list(self._prefixes_to_locations.keys())
for possible_location in all_locations:
if not os.path.exists( possible_location ):
continue
for prefix in all_prefixes:
correct_location = self._prefixes_to_locations[ prefix ]
if possible_location != correct_location and os.path.exists( os.path.join( possible_location, prefix ) ):
if correct_location == possible_location:
continue
if os.path.exists( os.path.join( possible_location, prefix ) ):
if not os.path.exists( correct_location ):
continue
if os.path.samefile( possible_location, correct_location ):
continue
recoverable_location = possible_location


@ -77,7 +77,7 @@ def ConvertParseResultToPrettyString( result ):
else:
combined_tag = HydrusTags.CombineTag( additional_info, parsed_text, do_not_double_namespace = True )
combined_tag = HydrusTags.CombineTag( additional_info, parsed_text )
tag = HydrusTags.CleanTag( combined_tag )
@ -147,7 +147,7 @@ def ConvertParseResultToPrettyString( result ):
return 'watcher page title (priority ' + str( priority ) + '): ' + parsed_text
elif content_type == HC.CONTENT_TYPE_HTTP_HEADER:
elif content_type == HC.CONTENT_TYPE_HTTP_HEADERS:
header_name = additional_info
@ -245,9 +245,9 @@ def ConvertParsableContentToPrettyString( parsable_content, include_veto = False
pretty_strings.append( 'watcher page title' )
elif content_type == HC.CONTENT_TYPE_HTTP_HEADER:
elif content_type == HC.CONTENT_TYPE_HTTP_HEADERS:
headers = [ header for header in additional_infos if header not in ( '', None ) ]
headers = sorted( [ header for header in additional_infos if header not in ( '', None ) ] )
pretty_strings.append( 'http headers: ' + ', '.join( headers ) )
@ -499,7 +499,7 @@ def GetTagsFromParseResults( results ):
else:
combined_tag = HydrusTags.CombineTag( namespace, parsed_text, do_not_double_namespace = True )
combined_tag = HydrusTags.CombineTag( namespace, parsed_text )
tag_results.append( combined_tag )
@ -589,16 +589,16 @@ def GetHTTPHeadersFromParseResults( parse_results ):
for ( ( name, content_type, additional_info ), parsed_text ) in parse_results:
if content_type == HC.CONTENT_TYPE_HTTP_HEADER:
if content_type == HC.CONTENT_TYPE_HTTP_HEADERS:
header_name = additional_info
headers[header_name] = parsed_text
headers[ header_name ] = parsed_text
return headers
def GetURLsFromParseResults( results, desired_url_types, only_get_top_priority = False ):


@ -166,6 +166,8 @@ pred_generators = {
SystemPredicateParser.Predicate.SIMILAR_TO : lambda o, v, u: ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_SIMILAR_TO, convert_hex_hashlist_and_other_to_bytes_and_other( v ) ),
SystemPredicateParser.Predicate.HASH : lambda o, v, u: ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_HASH, convert_hex_hashlist_and_other_to_bytes_and_other( v ), inclusive = o == '=' ),
SystemPredicateParser.Predicate.DURATION : lambda o, v, u: ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_DURATION, ( o, v[0] * 1000 + v[1] ) ),
SystemPredicateParser.Predicate.FRAMERATE : lambda o, v, u: ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_FRAMERATE, ( o, v ) ),
SystemPredicateParser.Predicate.NUM_OF_FRAMES : lambda o, v, u: ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_NUM_FRAMES, ( o, v ) ),
SystemPredicateParser.Predicate.NUM_PIXELS : lambda o, v, u: ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_NUM_PIXELS, ( o, v, HydrusData.ConvertPixelsToInt( u ) ) ),
SystemPredicateParser.Predicate.RATIO : lambda o, v, u: ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_RATIO, ( o, v[0], v[1] ) ),
SystemPredicateParser.Predicate.TAG_AS_NUMBER : lambda o, v, u: ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_TAG_AS_NUMBER, ( o[0], o[1], v ) ),

View File

@@ -316,24 +316,29 @@ class StringConverter( StringProcessingStep ):
if hash_function == 'md5':
s = hashlib.md5(s.encode('utf-8')).hexdigest()
s = hashlib.md5( s.encode( 'utf-8' ) ).hexdigest()
elif hash_function == 'sha1':
s = hashlib.sha1(s.encode('utf-8')).hexdigest()
s = hashlib.sha1( s.encode( 'utf-8' ) ).hexdigest()
elif hash_function == 'sha256':
s = hashlib.sha256(s.encode('utf-8')).hexdigest()
s = hashlib.sha256( s.encode( 'utf-8' ) ).hexdigest()
elif hash_function == 'sha512':
s = hashlib.sha512(s.encode('utf-8')).hexdigest()
s = hashlib.sha512( s.encode( 'utf-8' ) ).hexdigest()
else:
raise Exception( f'Unknown hash function "{hash_function}"!' )
except Exception as e:
raise HydrusExceptions.StringConvertException( 'ERROR: Could not apply "' + self.ConversionToString( conversion ) + '" to string "' + repr( s ) + '":' + str( e ) )
raise HydrusExceptions.StringConvertException( 'ERROR: Could not apply "{}" to string "{}": {}'.format( self.ConversionToString( conversion ), s, e ) )
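The `STRING_CONVERSION_HASH_FUNCTION` branch above dispatches on the function name and hashes the string's UTF-8 bytes to hexadecimal. A table-driven sketch of the same dispatch (a simplification; hydrus keeps the if/elif chain and its own exception types):

```python
import hashlib

# the four algorithms the converter supports
HASHERS = {
    'md5' : hashlib.md5,
    'sha1' : hashlib.sha1,
    'sha256' : hashlib.sha256,
    'sha512' : hashlib.sha512
}

def hash_string( s: str, hash_function: str ) -> str:
    
    if hash_function not in HASHERS:
        
        raise Exception( f'Unknown hash function "{hash_function}"!' )
    
    # hash the string's UTF-8 bytes and return the hex digest
    return HASHERS[ hash_function ]( s.encode( 'utf-8' ) ).hexdigest()
```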
@@ -454,7 +459,7 @@ class StringConverter( StringProcessingStep ):
elif conversion_type == STRING_CONVERSION_HASH_FUNCTION:
return 'hash string with ' + str( data )
return 'hash string by ' + str( data )
else:

View File

@@ -121,6 +121,11 @@ class ClientDBFilesPhysicalStorage( ClientDBModule.ClientDBModule ):
def RelocateClientFiles( self, prefix, abs_source, abs_dest ):
if not os.path.exists( abs_source ):
raise Exception( 'Was commanded to move prefix "{}" from "{}" to "{}", but that source does not exist!'.format( prefix, abs_source, abs_dest ) )
if not os.path.exists( abs_dest ):
raise Exception( 'Was commanded to move prefix "{}" from "{}" to "{}", but that destination does not exist!'.format( prefix, abs_source, abs_dest ) )
@@ -129,23 +134,29 @@ class ClientDBFilesPhysicalStorage( ClientDBModule.ClientDBModule ):
full_source = os.path.join( abs_source, prefix )
full_dest = os.path.join( abs_dest, prefix )
if os.path.exists( full_source ):
if not os.path.samefile( abs_source, abs_dest ):
HydrusPaths.MergeTree( full_source, full_dest )
elif not os.path.exists( full_dest ):
HydrusPaths.MakeSureDirectoryExists( full_dest )
if os.path.exists( full_source ):
HydrusPaths.MergeTree( full_source, full_dest )
elif not os.path.exists( full_dest ):
HydrusPaths.MakeSureDirectoryExists( full_dest )
portable_dest = HydrusPaths.ConvertAbsPathToPortablePath( abs_dest )
self._Execute( 'UPDATE client_files_locations SET location = ? WHERE prefix = ?;', ( portable_dest, prefix ) )
if os.path.exists( full_source ):
if not os.path.samefile( abs_source, abs_dest ):
try: HydrusPaths.RecyclePath( full_source )
except: pass
if os.path.exists( full_source ):
try: HydrusPaths.RecyclePath( full_source )
except: pass
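The rework above adds `os.path.samefile` guards so a relocation between two paths that resolve to the same directory (e.g. via a symlink or mount) no longer merges a folder into itself and then recycles it. A simplified sketch of the guarded move, using stdlib `shutil` in place of `HydrusPaths.MergeTree`/`RecyclePath` (which recycle rather than delete):

```python
import os
import shutil

def relocate_prefix_dir( full_source: str, full_dest: str ) -> None:
    
    if os.path.exists( full_source ):
        
        # same-file check: never merge a directory into itself
        if not ( os.path.exists( full_dest ) and os.path.samefile( full_source, full_dest ) ):
            
            shutil.copytree( full_source, full_dest, dirs_exist_ok = True )
            shutil.rmtree( full_source )
        
    elif not os.path.exists( full_dest ):
        
        os.makedirs( full_dest, exist_ok = True )
```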

View File

@@ -702,12 +702,12 @@ class FrameGUI( ClientGUITopLevelWindows.MainFrameThatResizes, CAC.ApplicationCo
def _AboutWindow( self ):
aboutinfo = QP.AboutDialogInfo()
name = 'hydrus client'
version = '{}, using network version {}'.format( HC.SOFTWARE_VERSION, HC.NETWORK_VERSION )
aboutinfo.SetName( 'hydrus client' )
aboutinfo.SetVersion( str( HC.SOFTWARE_VERSION ) + ', using network version ' + str( HC.NETWORK_VERSION ) )
library_version_lines = []
library_versions = []
library_version_lines.append( 'running on {} {}'.format( HC.NICE_PLATFORM_STRING, HC.NICE_RUNNING_AS_STRING ) )
# 2.7.12 (v2.7.12:d33e0cf91556, Jun 27 2016, 15:24:40) [MSC v.1500 64 bit (AMD64)]
v = sys.version
@@ -717,20 +717,20 @@ class FrameGUI( ClientGUITopLevelWindows.MainFrameThatResizes, CAC.ApplicationCo
v = v.split( ' ' )[0]
library_versions.append( ( 'python', v ) )
library_versions.append( ( 'FFMPEG', HydrusVideoHandling.GetFFMPEGVersion() ) )
library_version_lines.append( 'python: {}'.format( v ) )
library_version_lines.append( 'FFMPEG: {}'.format( HydrusVideoHandling.GetFFMPEGVersion() ) )
if ClientGUIMPV.MPV_IS_AVAILABLE:
library_versions.append( ( 'mpv api version: ', ClientGUIMPV.GetClientAPIVersionString() ) )
library_version_lines.append( 'mpv api version: {}'.format( ClientGUIMPV.GetClientAPIVersionString() ) )
else:
library_versions.append( ( 'mpv', 'not available' ) )
library_version_lines.append( 'mpv not available!' )
if HC.RUNNING_FROM_FROZEN_BUILD and HC.PLATFORM_MACOS:
HydrusData.ShowText( 'The macOS App does not come with MPV support on its own, but if your system has the dev library, libmpv1, it will try to import it. It seems your system does not have this or it failed to import. The specific error follows:' )
HydrusData.ShowText( 'The macOS App does not come with MPV support on its own, but if your system has the dev library, libmpv1, it will try to import it. It seems your system does not have this, or it failed to import. The specific error follows:' )
else:
@@ -740,9 +740,9 @@ class FrameGUI( ClientGUITopLevelWindows.MainFrameThatResizes, CAC.ApplicationCo
HydrusData.ShowText( ClientGUIMPV.mpv_failed_reason )
library_versions.append( ( 'OpenCV', cv2.__version__ ) )
library_versions.append( ( 'openssl', ssl.OPENSSL_VERSION ) )
library_versions.append( ( 'Pillow', PIL.__version__ ) )
library_version_lines.append( 'OpenCV: {}'.format( cv2.__version__ ) )
library_version_lines.append( 'openssl: {}'.format( ssl.OPENSSL_VERSION ) )
library_version_lines.append( 'Pillow: {}'.format( PIL.__version__ ) )
if QtInit.WE_ARE_QT5:
@@ -750,13 +750,13 @@ class FrameGUI( ClientGUITopLevelWindows.MainFrameThatResizes, CAC.ApplicationCo
import PySide2
library_versions.append( ( 'PySide2', PySide2.__version__ ) )
library_version_lines.append( 'Qt: PySide2 {}'.format( PySide2.__version__ ) )
elif QtInit.WE_ARE_PYQT:
from PyQt5.Qt import PYQT_VERSION_STR # pylint: disable=E0401,E0611
library_versions.append( ( 'PyQt5', PYQT_VERSION_STR ) )
library_version_lines.append( 'Qt: PyQt5 {}'.format( PYQT_VERSION_STR ) )
elif QtInit.WE_ARE_QT6:
@@ -765,21 +765,17 @@ class FrameGUI( ClientGUITopLevelWindows.MainFrameThatResizes, CAC.ApplicationCo
import PySide6
library_versions.append( ( 'PySide6', PySide6.__version__ ) )
library_version_lines.append( 'Qt: PySide6 {}'.format( PySide6.__version__ ) )
elif QtInit.WE_ARE_PYQT:
from PyQt6.QtCore import PYQT_VERSION_STR # pylint: disable=E0401,E0611
library_versions.append( ( 'PyQt6', PYQT_VERSION_STR ) )
library_version_lines.append( 'Qt: PyQt6 {}'.format( PYQT_VERSION_STR ) )
library_versions.append( ( 'qtpy', qtpy.__version__ ) )
library_versions.append( ( 'Qt', QC.__version__ ) )
library_versions.append( ( 'sqlite', sqlite3.sqlite_version ) )
library_version_lines.append( 'sqlite: {}'.format( sqlite3.sqlite_version ) )
CBOR_AVAILABLE = False
@@ -793,50 +789,57 @@ class FrameGUI( ClientGUITopLevelWindows.MainFrameThatResizes, CAC.ApplicationCo
pass
library_versions.append( ( 'cbor2 present: ', str( CBOR_AVAILABLE ) ) )
library_version_lines.append( 'cbor2 present: {}'.format( str( CBOR_AVAILABLE ) ) )
library_version_lines.append( 'chardet present: {}'.format( str( HydrusText.CHARDET_OK ) ) )
from hydrus.client.networking import ClientNetworkingJobs
library_versions.append( ( 'chardet present: ', str( HydrusText.CHARDET_OK ) ) )
if ClientNetworkingJobs.CLOUDSCRAPER_OK:
library_versions.append( ( 'cloudscraper present:', ClientNetworkingJobs.cloudscraper.__version__ ) )
try:
library_version_lines.append( 'cloudscraper present: {}'.format( ClientNetworkingJobs.cloudscraper.__version__ ) )
except:
library_version_lines.append( 'cloudscraper present: unknown version' )
else:
library_versions.append( ( 'cloudscraper present: ', 'False' ) )
library_version_lines.append( 'cloudscraper present: {}'.format( 'False' ) )
library_versions.append( ( 'cryptography present:', str( HydrusEncryption.CRYPTO_OK ) ) )
library_versions.append( ( 'dateutil present: ', str( ClientTime.DATEUTIL_OK ) ) )
library_versions.append( ( 'html5lib present: ', str( ClientParsing.HTML5LIB_IS_OK ) ) )
library_versions.append( ( 'lxml present: ', str( ClientParsing.LXML_IS_OK ) ) )
library_versions.append( ( 'lz4 present: ', str( HydrusCompression.LZ4_OK ) ) )
library_versions.append( ( 'pympler present:', str( HydrusMemory.PYMPLER_OK ) ) )
library_versions.append( ( 'pyopenssl present:', str( HydrusEncryption.OPENSSL_OK ) ) )
library_versions.append( ( 'speedcopy present:', str( HydrusFileHandling.SPEEDCOPY_OK ) ) )
library_versions.append( ( 'install dir', HC.BASE_DIR ) )
library_versions.append( ( 'db dir', HG.client_controller.db_dir ) )
library_versions.append( ( 'temp dir', HydrusTemp.GetCurrentTempDir() ) )
library_versions.append( ( 'db cache size per file', '{}MB'.format( HG.db_cache_size ) ) )
library_versions.append( ( 'db journal mode', HG.db_journal_mode ) )
library_versions.append( ( 'db synchronous mode', str( HG.db_synchronous ) ) )
library_versions.append( ( 'db transaction commit period', '{}'.format( HydrusData.TimeDeltaToPrettyTimeDelta( HG.db_cache_size ) ) ) )
library_versions.append( ( 'db using memory for temp?', str( HG.no_db_temp_files ) ) )
library_version_lines.append( 'cryptography present: {}'.format( HydrusEncryption.CRYPTO_OK ) )
library_version_lines.append( 'dateutil present: {}'.format( ClientTime.DATEUTIL_OK ) )
library_version_lines.append( 'html5lib present: {}'.format( ClientParsing.HTML5LIB_IS_OK ) )
library_version_lines.append( 'lxml present: {}'.format( ClientParsing.LXML_IS_OK ) )
library_version_lines.append( 'lz4 present: {}'.format( HydrusCompression.LZ4_OK ) )
library_version_lines.append( 'pympler present: {}'.format( HydrusMemory.PYMPLER_OK ) )
library_version_lines.append( 'pyopenssl present: {}'.format( HydrusEncryption.OPENSSL_OK ) )
library_version_lines.append( 'speedcopy present: {}'.format( HydrusFileHandling.SPEEDCOPY_OK ) )
library_version_lines.append( 'install dir: {}'.format( HC.BASE_DIR ) )
library_version_lines.append( 'db dir: {}'.format( HG.client_controller.db_dir ) )
library_version_lines.append( 'temp dir: {}'.format( HydrusTemp.GetCurrentTempDir() ) )
library_version_lines.append( 'db cache size per file: {}MB'.format( HG.db_cache_size ) )
library_version_lines.append( 'db journal mode: {}'.format( HG.db_journal_mode ) )
library_version_lines.append( 'db synchronous mode: {}'.format( HG.db_synchronous ) )
library_version_lines.append( 'db transaction commit period: {}'.format( HydrusData.TimeDeltaToPrettyTimeDelta( HG.db_cache_size ) ) )
library_version_lines.append( 'db using memory for temp?: {}'.format( HG.no_db_temp_files ) )
import locale
l_string = locale.getlocale()[0]
qtl_string = QC.QLocale().name()
library_versions.append( ( 'locale strings', str( ( l_string, qtl_string ) ) ) )
library_version_lines.append( 'locale: {}/{}'.format( l_string, qtl_string ) )
description = 'This client is the media management application of the hydrus software suite.'
description = 'This is the media management application of the hydrus software suite.'
description += os.linesep * 2 + os.linesep.join( ( lib + ': ' + version for ( lib, version ) in library_versions ) )
description += os.linesep * 2 + os.linesep.join( library_version_lines )
aboutinfo.SetDescription( description )
#
if os.path.exists( HC.LICENSE_PATH ):
@@ -850,12 +853,15 @@ class FrameGUI( ClientGUITopLevelWindows.MainFrameThatResizes, CAC.ApplicationCo
license = 'no licence file found!'
aboutinfo.SetLicense( license )
developers = [ 'Anonymous' ]
aboutinfo.SetDevelopers( [ 'Anonymous' ] )
aboutinfo.SetWebSite( 'https://hydrusnetwork.github.io/hydrus/' )
site = 'https://hydrusnetwork.github.io/hydrus/'
QP.AboutBox( self, aboutinfo )
frame = ClientGUITopLevelWindowsPanels.FrameThatTakesScrollablePanel( self, 'about hydrus' )
panel = ClientGUIScrolledPanelsReview.AboutPanel( frame, name, version, description, license, developers, site )
frame.SetPanel( panel )
def _AnalyzeDatabase( self ):

View File

@@ -534,9 +534,9 @@ class EditFileSeedCachePanel( ClientGUIScrolledPanels.EditPanel ):
headers = selected_file_seed.GetHTTPHeaders()
if headers is None:
if len( headers ) == 0:
ClientGUIMenus.AppendMenuLabel( menu, 'no additional headers')
ClientGUIMenus.AppendMenuLabel( menu, 'no additional headers' )
else:

View File

@@ -198,6 +198,8 @@ class VolumeControl( QW.QWidget ):
self.adjustSize()
HG.client_controller.sub( self, 'NotifyNewOptions', 'notify_new_options' )
def DoShowHide( self ):
@@ -245,6 +247,34 @@ class VolumeControl( QW.QWidget ):
event.ignore()
def NotifyNewOptions( self ):
if self._canvas_type in CC.CANVAS_MEDIA_VIEWER_TYPES:
option_to_use = 'media_viewer_uses_its_own_audio_volume'
volume_type = AUDIO_MEDIA_VIEWER
else:
option_to_use = 'preview_uses_its_own_audio_volume'
volume_type = AUDIO_PREVIEW
if HG.client_controller.new_options.GetBoolean( option_to_use ):
slider_volume_type = volume_type
else:
slider_volume_type = AUDIO_GLOBAL
if slider_volume_type != self._volume.GetVolumeType():
self._volume.SetVolumeType( slider_volume_type )
class VolumeSlider( QW.QSlider ):
@@ -278,6 +308,20 @@ class VolumeSlider( QW.QSlider ):
ChangeVolume( self._volume_type, self.value() )
def GetVolumeType( self ):
return self._volume_type
def SetVolumeType( self, volume_type ):
self._volume_type = volume_type
volume = self._GetCorrectValue()
self.setValue( volume )
def UpdateMute( self ):
volume = self._GetCorrectValue()
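The new `NotifyNewOptions`/`SetVolumeType` plumbing above lets each canvas switch its slider between a private volume and the global one when the relevant option flips. The selection itself reduces to a small pure function; a sketch with placeholder constants standing in for hydrus's `AUDIO_*` values:

```python
# placeholders for the hydrus AUDIO_* constants
AUDIO_GLOBAL = 0
AUDIO_MEDIA_VIEWER = 1
AUDIO_PREVIEW = 2

def pick_slider_volume_type( is_media_viewer_canvas: bool, uses_its_own_volume: bool ) -> int:
    
    # a canvas only gets its private volume type when its option is switched on
    volume_type = AUDIO_MEDIA_VIEWER if is_media_viewer_canvas else AUDIO_PREVIEW
    
    return volume_type if uses_its_own_volume else AUDIO_GLOBAL
```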

View File

@@ -2498,7 +2498,7 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
self._freeze_message_manager_when_main_gui_minimised.setToolTip( 'This is useful if the popup toaster restores strangely after minimised changes.' )
self._notify_client_api_cookies = QW.QCheckBox( self._popup_panel )
self._notify_client_api_cookies.setToolTip( 'This will make a short-lived popup message every time you get new cookie information over the Client API.' )
self._notify_client_api_cookies.setToolTip( 'This will make a short-lived popup message every time you get new cookie or http header information over the Client API.' )
#
@@ -2519,7 +2519,7 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
rows.append( ( 'BUGFIX: Force this width as the fixed width for all popup messages: ', self._popup_message_force_min_width ) )
rows.append( ( 'Freeze the popup toaster when mouse is on another display: ', self._freeze_message_manager_when_mouse_on_other_monitor ) )
rows.append( ( 'Freeze the popup toaster when the main gui is minimised: ', self._freeze_message_manager_when_main_gui_minimised ) )
rows.append( ( 'Make a short-lived popup on cookie updates through the Client API: ', self._notify_client_api_cookies ) )
rows.append( ( 'Make a short-lived popup on cookie/header updates through the Client API: ', self._notify_client_api_cookies ) )
gridbox = ClientGUICommon.WrapInGrid( self._popup_panel, rows )

View File

@@ -63,6 +63,63 @@ from hydrus.client.networking import ClientNetworkingGUG
from hydrus.client.networking import ClientNetworkingLogin
from hydrus.client.networking import ClientNetworkingURLClass
class AboutPanel( ClientGUIScrolledPanels.ReviewPanel ):
def __init__( self, parent, name, version, description, license_text, developers, site ):
ClientGUIScrolledPanels.ReviewPanel.__init__( self, parent )
icon_label = ClientGUICommon.BetterStaticText( self )
icon_label.setPixmap( HG.client_controller.frame_icon_pixmap )
name_label = ClientGUICommon.BetterStaticText( self, name )
name_label_font = name_label.font()
name_label_font.setBold( True )
name_label.setFont( name_label_font )
version_label = ClientGUICommon.BetterStaticText( self, version )
tabwidget = QW.QTabWidget( self )
desc_panel = QW.QWidget( self )
desc_label = ClientGUICommon.BetterStaticText( self, description )
desc_label.setAlignment( QC.Qt.AlignHCenter | QC.Qt.AlignVCenter )
url_label = ClientGUICommon.BetterHyperLink( self, site, site )
credits = QW.QTextEdit( self )
credits.setPlainText( 'Created by ' + ', '.join( developers ) )
credits.setReadOnly( True )
credits.setAlignment( QC.Qt.AlignHCenter )
license_textedit = QW.QTextEdit( self )
license_textedit.setPlainText( license_text )
license_textedit.setReadOnly( True )
tabwidget.addTab( desc_panel, 'Description' )
tabwidget.addTab( credits, 'Credits' )
tabwidget.addTab( license_textedit, 'License' )
tabwidget.setCurrentIndex( 0 )
desc_layout = QP.VBoxLayout()
QP.AddToLayout( desc_layout, desc_label, CC.FLAGS_CENTER_PERPENDICULAR )
QP.AddToLayout( desc_layout, url_label, CC.FLAGS_CENTER_PERPENDICULAR )
desc_panel.setLayout( desc_layout )
vbox = QP.VBoxLayout()
QP.AddToLayout( vbox, icon_label, CC.FLAGS_CENTER_PERPENDICULAR )
QP.AddToLayout( vbox, name_label, CC.FLAGS_CENTER_PERPENDICULAR )
QP.AddToLayout( vbox, version_label, CC.FLAGS_CENTER_PERPENDICULAR )
QP.AddToLayout( vbox, tabwidget, CC.FLAGS_CENTER_PERPENDICULAR )
self.widget().setLayout( vbox )
class MigrateDatabasePanel( ClientGUIScrolledPanels.ReviewPanel ):
def __init__( self, parent, controller ):
@@ -187,17 +244,6 @@ class MigrateDatabasePanel( ClientGUIScrolledPanels.ReviewPanel ):
def _AddPath( self, path, starting_weight = 1 ):
try:
path = os.path.realpath( path )
except OSError as e:
HydrusData.PrintException( e )
QW.QMessageBox.warning( self, 'Warning', 'I tried to remove symlinks from this path, but that failed! If this path is a clever mount, this situation may be ok. I will let you continue, and if the path looks ok and you are confident you can read from and write to it, you can continue. I recommend you close the client and make a backup right now though. The full error has been printed to log.' )
if path in self._locations_to_ideal_weights:
QW.QMessageBox.warning( self, 'Warning', 'You already have that location entered!' )
@@ -816,17 +862,6 @@ class MigrateDatabasePanel( ClientGUIScrolledPanels.ReviewPanel ):
path = dlg.GetPath()
try:
path = os.path.realpath( path )
except OSError as e:
HydrusData.PrintException( e )
QW.QMessageBox.warning( self, 'Warning', 'I tried to remove symlinks from this path, but that failed! If this path is a clever mount, this situation may be ok. I will let you continue, and if the path looks ok and you are confident you can read from and write to it, you can continue. I recommend you close the client and make a backup right now though. The full error has been printed to log.' )
if path in self._locations_to_ideal_weights:
QW.QMessageBox.warning( self, 'Warning', 'That path already exists as a regular file location! Please choose another.' )
@@ -903,6 +938,7 @@ class MigrateDatabasePanel( ClientGUIScrolledPanels.ReviewPanel ):
self._rebalance_status_st.style().polish( self._rebalance_status_st )
def THREADMigrateDatabase( controller, source, portable_locations, dest ):
time.sleep( 2 ) # important to have this, so the migrate dialog can close itself and clean its event loop, wew
@@ -956,6 +992,11 @@ def THREADMigrateDatabase( controller, source, portable_locations, dest ):
source_path = os.path.join( source, filename )
dest_path = os.path.join( dest, filename )
if os.path.exists( source_path ) and os.path.exists( dest_path ) and os.path.samefile( source_path, dest_path ):
continue
HydrusPaths.MergeFile( source_path, dest_path )
@@ -965,6 +1006,11 @@ def THREADMigrateDatabase( controller, source, portable_locations, dest ):
source_path = os.path.join( source, portable_location )
dest_path = os.path.join( dest, portable_location )
if os.path.exists( source_path ) and os.path.exists( dest_path ) and os.path.samefile( source_path, dest_path ):
continue
HydrusPaths.MergeTree( source_path, dest_path, text_update_hook = text_update_hook )
@@ -2221,7 +2267,7 @@ class ReviewDownloaderImport( ClientGUIScrolledPanels.ReviewPanel ):
domain_manager.AutoAddURLClassesAndParsers( new_url_classes, dupe_url_classes, new_parsers )
bandwidth_manager.AutoAddDomainMetadatas( new_domain_metadatas )
domain_manager.AutoAddDomainMetadatas( new_domain_metadatas, approved = True )
domain_manager.AutoAddDomainMetadatas( new_domain_metadatas, approved = ClientNetworkingDomain.VALID_APPROVED )
login_manager.AutoAddLoginScripts( new_login_scripts )
num_new_gugs = len( new_gugs )

View File

@@ -704,7 +704,22 @@ class EditStringConverterPanel( ClientGUIScrolledPanels.EditPanel ):
self._conversion_type = ClientGUICommon.BetterChoice( self._control_panel )
for t_type in ( ClientStrings.STRING_CONVERSION_REMOVE_TEXT_FROM_BEGINNING, ClientStrings.STRING_CONVERSION_REMOVE_TEXT_FROM_END, ClientStrings.STRING_CONVERSION_CLIP_TEXT_FROM_BEGINNING, ClientStrings.STRING_CONVERSION_CLIP_TEXT_FROM_END, ClientStrings.STRING_CONVERSION_PREPEND_TEXT, ClientStrings.STRING_CONVERSION_APPEND_TEXT, ClientStrings.STRING_CONVERSION_ENCODE, ClientStrings.STRING_CONVERSION_DECODE, ClientStrings.STRING_CONVERSION_REVERSE, ClientStrings.STRING_CONVERSION_REGEX_SUB, ClientStrings.STRING_CONVERSION_DATE_DECODE, ClientStrings.STRING_CONVERSION_DATE_ENCODE, ClientStrings.STRING_CONVERSION_INTEGER_ADDITION, ClientStrings.STRING_CONVERSION_HASH_FUNCTION ):
for t_type in (
ClientStrings.STRING_CONVERSION_REMOVE_TEXT_FROM_BEGINNING,
ClientStrings.STRING_CONVERSION_REMOVE_TEXT_FROM_END,
ClientStrings.STRING_CONVERSION_CLIP_TEXT_FROM_BEGINNING,
ClientStrings.STRING_CONVERSION_CLIP_TEXT_FROM_END,
ClientStrings.STRING_CONVERSION_PREPEND_TEXT,
ClientStrings.STRING_CONVERSION_APPEND_TEXT,
ClientStrings.STRING_CONVERSION_ENCODE,
ClientStrings.STRING_CONVERSION_DECODE,
ClientStrings.STRING_CONVERSION_REVERSE,
ClientStrings.STRING_CONVERSION_REGEX_SUB,
ClientStrings.STRING_CONVERSION_DATE_DECODE,
ClientStrings.STRING_CONVERSION_DATE_ENCODE,
ClientStrings.STRING_CONVERSION_INTEGER_ADDITION,
ClientStrings.STRING_CONVERSION_HASH_FUNCTION
):
self._conversion_type.addItem( ClientStrings.conversion_type_str_lookup[ t_type ], t_type )
@@ -718,7 +733,10 @@ class EditStringConverterPanel( ClientGUIScrolledPanels.EditPanel ):
self._data_timezone_decode = ClientGUICommon.BetterChoice( self._control_panel )
self._data_timezone_encode = ClientGUICommon.BetterChoice( self._control_panel )
self._data_timezone_offset = ClientGUICommon.BetterSpinBox( self._control_panel, min=-86400, max=86400 )
self._data_hash_function = ClientGUICommon.BetterChoice( self._control_panel )
tt = 'This hashes the string\'s UTF-8-decoded bytes to hexadecimal.'
self._data_hash_function.setToolTip( tt )
for e in ( 'hex', 'base64', 'url percent encoding', 'unicode escape characters', 'html entities' ):
@@ -807,8 +825,8 @@ class EditStringConverterPanel( ClientGUIScrolledPanels.EditPanel ):
self._data_timezone_encode.SetValue( timezone_type )
elif conversion_type == ClientStrings.STRING_CONVERSION_HASH_FUNCTION:
self._data_hash_function.SetValue(data)
self._data_hash_function.SetValue( data )
elif data is not None:
@@ -849,7 +867,7 @@ class EditStringConverterPanel( ClientGUIScrolledPanels.EditPanel ):
rows.append( ( self._data_timezone_decode_label, self._data_timezone_decode ) )
rows.append( ( self._data_timezone_offset_label, self._data_timezone_offset ) )
rows.append( ( self._data_timezone_encode_label, self._data_timezone_encode ) )
rows.append( ( self._data_hash_function_label, self._data_hash_function) )
rows.append( ( self._data_hash_function_label, self._data_hash_function ) )
self._control_gridbox = ClientGUICommon.WrapInGrid( self._control_panel, rows )

View File

@@ -1552,47 +1552,6 @@ class StatusBar( QW.QStatusBar ):
class AboutDialogInfo:
def __init__( self ):
self.name = ''
self.version = ''
self.description = ''
self.license = ''
self.developers = []
self.website = ''
def SetName( self, name ):
self.name = name
def SetVersion( self, version ):
self.version = version
def SetDescription( self, description ):
self.description = description
def SetLicense( self, license ):
self.license = license
def SetDevelopers( self, developers_list ):
self.developers = developers_list
def SetWebSite( self, url ):
self.website = url
class UIActionSimulator:
def __init__( self ):
@@ -1614,74 +1573,6 @@ class UIActionSimulator:
QW.QApplication.instance().postEvent( widget, ev2 )
# TODO: rewrite this to be on my newer panel system so this can resize for lads on small screens etc..
class AboutBox( QW.QDialog ):
def __init__( self, parent, about_info ):
QW.QDialog.__init__( self, parent )
self.setWindowFlag( QC.Qt.WindowContextHelpButtonHint, on = False )
self.setAttribute( QC.Qt.WA_DeleteOnClose )
self.setWindowIcon( QG.QIcon( HG.client_controller.frame_icon_pixmap ) )
layout = QW.QVBoxLayout( self )
self.setWindowTitle( 'About ' + about_info.name )
icon_label = QW.QLabel( self )
name_label = QW.QLabel( about_info.name, self )
version_label = QW.QLabel( about_info.version, self )
tabwidget = QW.QTabWidget( self )
desc_panel = QW.QWidget( self )
desc_label = QW.QLabel( about_info.description, self )
url_label = QW.QLabel( '<a href="{0}">{0}</a>'.format( about_info.website ), self )
credits = QW.QTextEdit( self )
license = QW.QTextEdit( self )
close_button = QW.QPushButton( 'close', self )
icon_label.setPixmap( HG.client_controller.frame_icon_pixmap )
layout.addWidget( icon_label, alignment = QC.Qt.AlignHCenter )
name_label_font = name_label.font()
name_label_font.setBold( True )
name_label.setFont( name_label_font )
layout.addWidget( name_label, alignment = QC.Qt.AlignHCenter )
layout.addWidget( version_label, alignment = QC.Qt.AlignHCenter )
layout.addWidget( tabwidget, alignment = QC.Qt.AlignHCenter )
tabwidget.addTab( desc_panel, 'Description' )
tabwidget.addTab( credits, 'Credits' )
tabwidget.addTab( license, 'License' )
tabwidget.setCurrentIndex( 0 )
credits.setPlainText( 'Created by ' + ', '.join(about_info.developers) )
credits.setReadOnly( True )
credits.setAlignment( QC.Qt.AlignHCenter )
license.setPlainText( about_info.license )
license.setReadOnly( True )
desc_layout = QW.QVBoxLayout()
desc_layout.addWidget( desc_label, alignment = QC.Qt.AlignHCenter )
desc_label.setWordWrap( True )
desc_label.setAlignment( QC.Qt.AlignHCenter | QC.Qt.AlignVCenter )
desc_layout.addWidget( url_label, alignment = QC.Qt.AlignHCenter )
url_label.setTextFormat( QC.Qt.RichText )
url_label.setTextInteractionFlags( QC.Qt.TextBrowserInteraction )
url_label.setOpenExternalLinks( True )
desc_panel.setLayout( desc_layout )
layout.addWidget( close_button, alignment = QC.Qt.AlignRight )
close_button.clicked.connect( self.accept )
self.setLayout( layout )
self.exec_()
class RadioBox( QW.QFrame ):
radioBoxChanged = QC.Signal()

View File

@@ -1,6 +1,7 @@
import locale
import os
import traceback
import typing
from qtpy import QtCore as QC
from qtpy import QtWidgets as QW
@@ -18,6 +19,7 @@ from hydrus.client.gui import ClientGUIMedia
from hydrus.client.gui import ClientGUIMediaControls
from hydrus.client.gui import ClientGUIShortcuts
from hydrus.client.gui import QtPorting as QP
from hydrus.client.media import ClientMedia
mpv_failed_reason = 'MPV seems ok!'
@@ -521,7 +523,7 @@ class MPVWidget( QW.QWidget, CAC.ApplicationCommandProcessorMixin ):
self._player.set_loglevel( level )
def SetMedia( self, media, start_paused = False ):
def SetMedia( self, media: typing.Optional[ ClientMedia.MediaSingleton ], start_paused = False ):
if media == self._media:
@@ -576,6 +578,11 @@ class MPVWidget( QW.QWidget, CAC.ApplicationCommandProcessorMixin ):
hash = self._media.GetHash()
mime = self._media.GetMime()
# some videos have an audio channel that is silent. hydrus thinks these dudes are 'no audio', but when we throw them at mpv, it may play audio for them
# would be fine, you think, except in one reported case this causes scratches and pops and hell whitenoise
# so let's see what happens here
mute_override = not self._media.HasAudio()
client_files_manager = HG.client_controller.client_files_manager
path = client_files_manager.GetFilePath( hash, mime )
@@ -596,7 +603,7 @@ class MPVWidget( QW.QWidget, CAC.ApplicationCommandProcessorMixin ):
self._player.volume = self._GetCorrectCurrentVolume()
self._player.mute = self._GetCorrectCurrentMute()
self._player.mute = mute_override or self._GetCorrectCurrentMute()
self._player.pause = start_paused
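The `mute_override` change above is a boolean OR: files hydrus believes are silent get force-muted in mpv, regardless of the user's mute setting, to suppress the reported scratches and pops from silent audio channels. As a tiny sketch (function name hypothetical):

```python
def effective_mute( media_has_audio: bool, user_mute: bool ) -> bool:
    
    # force mute for files hydrus thinks have no audio; otherwise honour the user
    mute_override = not media_has_audio
    
    return mute_override or user_mute
```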

View File

@@ -304,7 +304,7 @@ class EditExportFolderPanel( ClientGUIScrolledPanels.EditPanel ):
self._metadata_routers_box = ClientGUICommon.StaticBox( self, 'sidecar exporting' )
metadata_routers = export_folder.GetMetadataRouters()
allowed_importer_classes = [ ClientMetadataMigrationImporters.SingleFileMetadataImporterMediaTags, ClientMetadataMigrationImporters.SingleFileMetadataImporterMediaURLs ]
allowed_importer_classes = [ ClientMetadataMigrationImporters.SingleFileMetadataImporterMediaTags, ClientMetadataMigrationImporters.SingleFileMetadataImporterMediaNotes, ClientMetadataMigrationImporters.SingleFileMetadataImporterMediaURLs ]
allowed_exporter_classes = [ ClientMetadataMigrationExporters.SingleFileMetadataExporterTXT, ClientMetadataMigrationExporters.SingleFileMetadataExporterJSON ]
self._metadata_routers_button = ClientGUIMetadataMigration.SingleFileMetadataRoutersButton( self._metadata_routers_box, metadata_routers, allowed_importer_classes, allowed_exporter_classes )
@@ -566,7 +566,7 @@ class ReviewExportFilesPanel( ClientGUIScrolledPanels.ReviewPanel ):
metadata_routers = new_options.GetDefaultExportFilesMetadataRouters()
allowed_importer_classes = [ ClientMetadataMigrationImporters.SingleFileMetadataImporterMediaTags, ClientMetadataMigrationImporters.SingleFileMetadataImporterMediaURLs ]
allowed_importer_classes = [ ClientMetadataMigrationImporters.SingleFileMetadataImporterMediaTags, ClientMetadataMigrationImporters.SingleFileMetadataImporterMediaNotes, ClientMetadataMigrationImporters.SingleFileMetadataImporterMediaURLs ]
allowed_exporter_classes = [ ClientMetadataMigrationExporters.SingleFileMetadataExporterTXT, ClientMetadataMigrationExporters.SingleFileMetadataExporterJSON ]
self._metadata_routers_button = ClientGUIMetadataMigration.SingleFileMetadataRoutersButton( self, metadata_routers, allowed_importer_classes, allowed_exporter_classes )

View File

@@ -1023,7 +1023,7 @@ class EditLocalImportFilenameTaggingPanel( ClientGUIScrolledPanels.EditPanel ):
self._paths_list = ClientGUIListCtrl.BetterListCtrl( self, CGLC.COLUMN_LIST_PATHS_TO_TAGS.ID, 10, self._ConvertDataToListCtrlTuples )
allowed_importer_classes = [ ClientMetadataMigrationImporters.SingleFileMetadataImporterTXT, ClientMetadataMigrationImporters.SingleFileMetadataImporterJSON ]
allowed_exporter_classes = [ ClientMetadataMigrationExporters.SingleFileMetadataExporterMediaTags, ClientMetadataMigrationExporters.SingleFileMetadataExporterMediaURLs ]
allowed_exporter_classes = [ ClientMetadataMigrationExporters.SingleFileMetadataExporterMediaTags, ClientMetadataMigrationExporters.SingleFileMetadataExporterMediaNotes, ClientMetadataMigrationExporters.SingleFileMetadataExporterMediaURLs ]
self._metadata_routers_panel = ClientGUIMetadataMigration.SingleFileMetadataRoutersControl( self, metadata_routers, allowed_importer_classes, allowed_exporter_classes )

View File

@@ -258,7 +258,7 @@ class EditImportFolderPanel( ClientGUIScrolledPanels.EditPanel ):
metadata_routers = self._import_folder.GetMetadataRouters()
allowed_importer_classes = [ ClientMetadataMigrationImporters.SingleFileMetadataImporterTXT, ClientMetadataMigrationImporters.SingleFileMetadataImporterJSON ]
allowed_exporter_classes = [ ClientMetadataMigrationExporters.SingleFileMetadataExporterMediaTags, ClientMetadataMigrationExporters.SingleFileMetadataExporterMediaURLs ]
allowed_exporter_classes = [ ClientMetadataMigrationExporters.SingleFileMetadataExporterMediaTags, ClientMetadataMigrationExporters.SingleFileMetadataExporterMediaNotes, ClientMetadataMigrationExporters.SingleFileMetadataExporterMediaURLs ]
self._metadata_routers_button = ClientGUIMetadataMigration.SingleFileMetadataRoutersButton( self, metadata_routers, allowed_importer_classes, allowed_exporter_classes )

View File

@ -310,11 +310,11 @@ class COLUMN_LIST_PATHS_TO_TAGS( COLUMN_LIST_DEFINITION ):
TAGS = 2
column_list_type_name_lookup[ COLUMN_LIST_PATHS_TO_TAGS.ID ] = 'import paths and their tags'
column_list_type_name_lookup[ COLUMN_LIST_PATHS_TO_TAGS.ID ] = 'import paths and their metadata'
register_column_type( COLUMN_LIST_PATHS_TO_TAGS.ID, COLUMN_LIST_PATHS_TO_TAGS.NUMBER, '#', False, 4, True )
register_column_type( COLUMN_LIST_PATHS_TO_TAGS.ID, COLUMN_LIST_PATHS_TO_TAGS.PATH, 'path', False, 40, True )
register_column_type( COLUMN_LIST_PATHS_TO_TAGS.ID, COLUMN_LIST_PATHS_TO_TAGS.TAGS, 'tags', False, 40, True )
register_column_type( COLUMN_LIST_PATHS_TO_TAGS.ID, COLUMN_LIST_PATHS_TO_TAGS.TAGS, 'metadata', False, 40, True )
default_column_list_sort_lookup[ COLUMN_LIST_PATHS_TO_TAGS.ID ] = ( COLUMN_LIST_PATHS_TO_TAGS.NUMBER, True )

View File

@ -21,6 +21,7 @@ from hydrus.client.gui.widgets import ClientGUICommon
from hydrus.client.metadata import ClientMetadataMigrationExporters
choice_tuple_label_lookup = {
ClientMetadataMigrationExporters.SingleFileMetadataExporterMediaNotes : 'a file\'s notes',
ClientMetadataMigrationExporters.SingleFileMetadataExporterMediaTags : 'a file\'s tags',
ClientMetadataMigrationExporters.SingleFileMetadataExporterMediaURLs : 'a file\'s URLs',
ClientMetadataMigrationExporters.SingleFileMetadataExporterTXT : 'a .txt sidecar',
@ -28,6 +29,7 @@ choice_tuple_label_lookup = {
}
choice_tuple_description_lookup = {
ClientMetadataMigrationExporters.SingleFileMetadataExporterMediaNotes : 'The notes that a file has.',
ClientMetadataMigrationExporters.SingleFileMetadataExporterMediaTags : 'The tags that a file has on a particular service.',
ClientMetadataMigrationExporters.SingleFileMetadataExporterMediaURLs : 'The known URLs that a file has.',
ClientMetadataMigrationExporters.SingleFileMetadataExporterTXT : 'A list of raw newline-separated texts in a .txt file.',
@ -170,7 +172,7 @@ class EditSingleFileMetadataExporterPanel( ClientGUIScrolledPanels.EditPanel ):
exporter.SetServiceKey( self._service_key )
elif isinstance( exporter, ClientMetadataMigrationExporters.SingleFileMetadataExporterMediaURLs ):
elif isinstance( exporter, ( ClientMetadataMigrationExporters.SingleFileMetadataExporterMediaNotes, ClientMetadataMigrationExporters.SingleFileMetadataExporterMediaURLs ) ):
pass
@ -233,6 +235,10 @@ class EditSingleFileMetadataExporterPanel( ClientGUIScrolledPanels.EditPanel ):
exporter = ClientMetadataMigrationExporters.SingleFileMetadataExporterMediaURLs()
elif self._current_exporter_class == ClientMetadataMigrationExporters.SingleFileMetadataExporterMediaNotes:
exporter = ClientMetadataMigrationExporters.SingleFileMetadataExporterMediaNotes()
elif self._current_exporter_class == ClientMetadataMigrationExporters.SingleFileMetadataExporterTXT:
remove_actual_filename_ext = self._sidecar_panel.GetRemoveActualFilenameExt()
@ -316,7 +322,7 @@ class EditSingleFileMetadataExporterPanel( ClientGUIScrolledPanels.EditPanel ):
QW.QMessageBox.warning( self, 'Warning', message )
elif isinstance( exporter, ClientMetadataMigrationExporters.SingleFileMetadataExporterMediaURLs ):
elif isinstance( exporter, ( ClientMetadataMigrationExporters.SingleFileMetadataExporterMediaNotes, ClientMetadataMigrationExporters.SingleFileMetadataExporterMediaURLs ) ):
pass

View File

@ -22,6 +22,7 @@ from hydrus.client.gui.widgets import ClientGUICommon
from hydrus.client.metadata import ClientMetadataMigrationImporters
choice_tuple_label_lookup = {
ClientMetadataMigrationImporters.SingleFileMetadataImporterMediaNotes : 'a file\'s notes',
ClientMetadataMigrationImporters.SingleFileMetadataImporterMediaTags : 'a file\'s tags',
ClientMetadataMigrationImporters.SingleFileMetadataImporterMediaURLs : 'a file\'s URLs',
ClientMetadataMigrationImporters.SingleFileMetadataImporterTXT : 'a .txt sidecar',
@ -29,6 +30,7 @@ choice_tuple_label_lookup = {
}
choice_tuple_description_lookup = {
ClientMetadataMigrationImporters.SingleFileMetadataImporterMediaNotes : 'The notes that a file has.',
ClientMetadataMigrationImporters.SingleFileMetadataImporterMediaTags : 'The tags that a file has on a particular service.',
ClientMetadataMigrationImporters.SingleFileMetadataImporterMediaURLs : 'The known URLs that a file has.',
ClientMetadataMigrationImporters.SingleFileMetadataImporterTXT : 'A list of raw newline-separated texts in a .txt file.',
@ -171,7 +173,7 @@ class EditSingleFileMetadataImporterPanel( ClientGUIScrolledPanels.EditPanel ):
importer.SetServiceKey( self._service_key )
elif isinstance( importer, ClientMetadataMigrationImporters.SingleFileMetadataImporterMediaURLs ):
elif isinstance( importer, ( ClientMetadataMigrationImporters.SingleFileMetadataImporterMediaNotes, ClientMetadataMigrationImporters.SingleFileMetadataImporterMediaURLs ) ):
pass
@ -227,6 +229,10 @@ class EditSingleFileMetadataImporterPanel( ClientGUIScrolledPanels.EditPanel ):
importer = ClientMetadataMigrationImporters.SingleFileMetadataImporterMediaTags( string_processor = string_processor, service_key = self._service_key )
elif self._current_importer_class == ClientMetadataMigrationImporters.SingleFileMetadataImporterMediaNotes:
importer = ClientMetadataMigrationImporters.SingleFileMetadataImporterMediaNotes( string_processor = string_processor )
elif self._current_importer_class == ClientMetadataMigrationImporters.SingleFileMetadataImporterMediaURLs:
importer = ClientMetadataMigrationImporters.SingleFileMetadataImporterMediaURLs( string_processor = string_processor )
@ -306,7 +312,7 @@ class EditSingleFileMetadataImporterPanel( ClientGUIScrolledPanels.EditPanel ):
self._service_selection_panel.setVisible( True )
elif isinstance( importer, ClientMetadataMigrationImporters.SingleFileMetadataImporterMediaURLs ):
elif isinstance( importer, ( ClientMetadataMigrationImporters.SingleFileMetadataImporterMediaNotes, ClientMetadataMigrationImporters.SingleFileMetadataImporterMediaURLs ) ):
pass

View File

@ -529,7 +529,7 @@ class EditContentParserPanel( ClientGUIScrolledPanels.EditPanel ):
types_to_str[ HC.CONTENT_TYPE_HASH ] = 'file hash'
types_to_str[ HC.CONTENT_TYPE_TIMESTAMP ] = 'timestamp'
types_to_str[ HC.CONTENT_TYPE_TITLE ] = 'watcher title'
types_to_str[ HC.CONTENT_TYPE_HTTP_HEADER ] = 'http header'
types_to_str[ HC.CONTENT_TYPE_HTTP_HEADERS ] = 'http headers'
types_to_str[ HC.CONTENT_TYPE_VETO ] = 'veto'
types_to_str[ HC.CONTENT_TYPE_VARIABLE ] = 'temporary variable'
@ -606,6 +606,8 @@ class EditContentParserPanel( ClientGUIScrolledPanels.EditPanel ):
self._header_panel = QW.QWidget( self._content_panel )
self._header_name = QW.QLineEdit( self._header_panel )
tt = 'Any header you parse here will be passed on to subsequent jobs/objects created by this same parse: next gallery pages, file downloads from post urls, and post pages spawned from multi-file posts. The headers will then be passed on to their children too. This should help with tokenised searches or weird guest-login issues.'
self._header_name.setToolTip( tt )
#
@ -670,11 +672,11 @@ class EditContentParserPanel( ClientGUIScrolledPanels.EditPanel ):
self._title_priority.setValue( priority )
elif content_type == HC.CONTENT_TYPE_HTTP_HEADER:
elif content_type == HC.CONTENT_TYPE_HTTP_HEADERS:
header_name = additional_info
self._header_name.setText(header_name)
self._header_name.setText( header_name )
elif content_type == HC.CONTENT_TYPE_VETO:
@ -910,7 +912,7 @@ class EditContentParserPanel( ClientGUIScrolledPanels.EditPanel ):
self._title_panel.show()
elif content_type == HC.CONTENT_TYPE_HTTP_HEADER:
elif content_type == HC.CONTENT_TYPE_HTTP_HEADERS:
self._header_panel.show()
@ -977,7 +979,7 @@ class EditContentParserPanel( ClientGUIScrolledPanels.EditPanel ):
additional_info = priority
elif content_type == HC.CONTENT_TYPE_HTTP_HEADER:
elif content_type == HC.CONTENT_TYPE_HTTP_HEADERS:
header_name = self._header_name.text()
@ -1268,7 +1270,7 @@ class EditPageParserPanel( ClientGUIScrolledPanels.EditPanel ):
#
permitted_content_types = [ HC.CONTENT_TYPE_URLS, HC.CONTENT_TYPE_MAPPINGS, HC.CONTENT_TYPE_NOTES, HC.CONTENT_TYPE_HASH, HC.CONTENT_TYPE_TIMESTAMP, HC.CONTENT_TYPE_TITLE, HC.CONTENT_TYPE_HTTP_HEADER, HC.CONTENT_TYPE_VETO ]
permitted_content_types = [ HC.CONTENT_TYPE_URLS, HC.CONTENT_TYPE_MAPPINGS, HC.CONTENT_TYPE_NOTES, HC.CONTENT_TYPE_HASH, HC.CONTENT_TYPE_TIMESTAMP, HC.CONTENT_TYPE_TITLE, HC.CONTENT_TYPE_HTTP_HEADERS, HC.CONTENT_TYPE_VETO ]
self._content_parsers = EditContentParsersPanel( content_parsers_panel, self._test_panel.GetTestDataForChild, permitted_content_types )

View File

@ -111,7 +111,7 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):
SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_FILE_SEED
SERIALISABLE_NAME = 'File Import'
SERIALISABLE_VERSION = 6
SERIALISABLE_VERSION = 7
def __init__( self, file_seed_type: int = None, file_seed_data: str = None ):
@ -139,7 +139,7 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):
self._cloudflare_last_modified_time = None
self._referral_url = None
self._request_headers = None
self._request_headers = {}
self._external_filterable_tags = set()
self._external_additional_service_keys_to_tags = ClientTags.ServiceKeysToTags()
@ -483,6 +483,50 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):
return ( 6, new_serialisable_info )
if version == 6:
(
file_seed_type,
file_seed_data,
created,
modified,
source_time,
status,
note,
referral_url,
serialisable_external_filterable_tags,
serialisable_external_additional_service_keys_to_tags,
serialisable_primary_urls,
serialisable_source_urls,
serialisable_tags,
names_and_notes,
serialisable_hashes
) = old_serialisable_info
request_headers = {}
new_serialisable_info = (
file_seed_type,
file_seed_data,
created,
modified,
source_time,
status,
note,
referral_url,
request_headers,
serialisable_external_filterable_tags,
serialisable_external_additional_service_keys_to_tags,
serialisable_primary_urls,
serialisable_source_urls,
serialisable_tags,
names_and_notes,
serialisable_hashes
)
return ( 7, new_serialisable_info )
def AddParseResults( self, parse_results, file_import_options: FileImportOptions.FileImportOptions ):
@ -583,6 +627,7 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):
for ( key, value ) in self._request_headers.items():
network_job.AddAdditionalHeader( key, value )
if override_bandwidth:
@ -911,7 +956,7 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):
return set( self._source_urls )
def GetHTTPHeaders( self ):
def GetHTTPHeaders( self ) -> dict:
return self._request_headers
@ -1184,7 +1229,7 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):
def SetRequestHeaders( self, request_headers: dict ):
self._request_headers = request_headers
self._request_headers = dict( request_headers )
def SetStatus( self, status: int, note: str = '', exception = None ):
@ -1306,12 +1351,9 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):
network_job = network_job_factory( 'GET', url_to_check, referral_url = referral_url )
if self._request_headers is not None:
for ( key, value ) in self._request_headers.items():
network_job.AddAdditionalHeader( key, value )
for ( key, value ) in self._request_headers.items():
network_job.AddAdditionalHeader( key, value )
HG.client_controller.network_engine.AddJob( network_job )
@ -1438,6 +1480,8 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):
parsed_request_headers = ClientParsing.GetHTTPHeadersFromParseResults( parse_results )
self._request_headers.update( parsed_request_headers )
desired_urls = ClientParsing.GetURLsFromParseResults( parse_results, ( HC.URL_TYPE_DESIRED, ), only_get_top_priority = True )
child_urls = []
@ -1498,7 +1542,7 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):
duplicate_file_seed.SetReferralURL( url_for_child_referral )
duplicate_file_seed.SetRequestHeaders( parsed_request_headers )
duplicate_file_seed.SetRequestHeaders( self._request_headers )
if self._referral_url is not None:

View File

@ -70,7 +70,7 @@ class GallerySeed( HydrusSerialisable.SerialisableBase ):
SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_GALLERY_SEED
SERIALISABLE_NAME = 'Gallery Log Entry'
SERIALISABLE_VERSION = 3
SERIALISABLE_VERSION = 4
def __init__( self, url = None, can_generate_more_pages = True ):
@ -104,7 +104,7 @@ class GallerySeed( HydrusSerialisable.SerialisableBase ):
self.note = ''
self._referral_url = None
self._request_headers = None
self._request_headers = {}
self._force_next_page_url_generation = False
@ -202,6 +202,28 @@ class GallerySeed( HydrusSerialisable.SerialisableBase ):
return ( 3, new_serialisable_info )
if version == 3:
( url, can_generate_more_pages, serialisable_external_filterable_tags, serialisable_external_additional_service_keys_to_tags, created, modified, status, note, referral_url ) = old_serialisable_info
request_headers = {}
new_serialisable_info = (
url,
can_generate_more_pages,
serialisable_external_filterable_tags,
serialisable_external_additional_service_keys_to_tags,
created,
modified,
status,
note,
referral_url,
request_headers
)
return ( 4, new_serialisable_info )
def ForceNextPageURLGeneration( self ):
@ -266,7 +288,7 @@ class GallerySeed( HydrusSerialisable.SerialisableBase ):
def SetRequestHeaders( self, request_headers: dict ):
self._request_headers = request_headers
self._request_headers = dict( request_headers )
def SetRunToken( self, run_token: bytes ):
@ -366,6 +388,7 @@ class GallerySeed( HydrusSerialisable.SerialisableBase ):
for ( key, value ) in self._request_headers.items():
network_job.AddAdditionalHeader( key, value )
network_job.SetGalleryToken( gallery_token_name )
@ -419,8 +442,6 @@ class GallerySeed( HydrusSerialisable.SerialisableBase ):
file_seed.SetRequestHeaders( self._request_headers )
file_seeds = [ file_seed ]
file_seeds_callable( ( file_seed, ) )
status = CC.STATUS_SUCCESSFUL_AND_NEW
@ -431,11 +452,11 @@ class GallerySeed( HydrusSerialisable.SerialisableBase ):
if do_parse:
parsing_context = {}
parsing_context[ 'gallery_url' ] = gallery_url
parsing_context[ 'url' ] = url_to_check
parsing_context[ 'post_index' ] = '0'
parsing_context = {
'gallery_url' : gallery_url,
'url' : url_to_check,
'post_index' : '0'
}
all_parse_results = parser.Parse( parsing_context, parsing_text )
@ -489,6 +510,10 @@ class GallerySeed( HydrusSerialisable.SerialisableBase ):
flattened_results = list( itertools.chain.from_iterable( all_parse_results ) )
parsed_request_headers = ClientParsing.GetHTTPHeadersFromParseResults( flattened_results )
self._request_headers.update( parsed_request_headers )
sub_gallery_urls = ClientParsing.GetURLsFromParseResults( flattened_results, ( HC.URL_TYPE_SUB_GALLERY, ), only_get_top_priority = True )
sub_gallery_urls = HydrusData.DedupeList( sub_gallery_urls )

View File

@ -5,6 +5,9 @@ from hydrus.core import HydrusExceptions
from hydrus.client import ClientStrings
NOTE_CONNECTOR_STRING = ': '
NOTE_NAME_ESCAPE_STRING = ':\\ '
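These two constants drive the 'name: text' merging described in the changelog. A standalone sketch of the round trip, using the same connector and escape strings (the helper names here are illustrative; hydrus does this inline in its importer/exporter):

```python
# The same constants as above: ': ' joins name and text; any ': ' already
# inside the name is escaped to ':\ ' so the first real connector is unambiguous.
NOTE_CONNECTOR_STRING = ': '
NOTE_NAME_ESCAPE_STRING = ':\\ '

def merge_note( name: str, text: str ) -> str:
    # escape connectors in the name, then join name and text into one row
    escaped_name = name.replace( NOTE_CONNECTOR_STRING, NOTE_NAME_ESCAPE_STRING )
    return escaped_name + NOTE_CONNECTOR_STRING + text

def split_note( row: str ):
    # split on the first unescaped connector, then unescape the name
    ( name, text ) = row.split( NOTE_CONNECTOR_STRING, 1 )
    return ( name.replace( NOTE_NAME_ESCAPE_STRING, NOTE_CONNECTOR_STRING ), text )
```

Because the escape string is colon-backslash-space, the first literal ': ' in a merged row is always the real boundary, so names containing ': ' survive the round trip.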
def GetSidecarPath( actual_file_path: str, remove_actual_filename_ext: bool, suffix: str, filename_string_converter: ClientStrings.StringConverter, file_extension: str ):
path_components = []

View File

@ -11,6 +11,7 @@ from hydrus.core import HydrusTags
from hydrus.client import ClientConstants as CC
from hydrus.client import ClientStrings
from hydrus.client.importing.options import NoteImportOptions
from hydrus.client.metadata import ClientMetadataMigrationCore
class SingleFileMetadataExporter( ClientMetadataMigrationCore.ImporterExporterNode ):
@ -58,6 +59,90 @@ class SingleFileMetadataExporterSidecar( SingleFileMetadataExporter, ClientMetad
class SingleFileMetadataExporterMediaNotes( HydrusSerialisable.SerialisableBase, SingleFileMetadataExporterMedia ):
SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_METADATA_SINGLE_FILE_EXPORTER_MEDIA_NOTES
SERIALISABLE_NAME = 'Metadata Single File Exporter Media Notes'
SERIALISABLE_VERSION = 1
def __init__( self ):
HydrusSerialisable.SerialisableBase.__init__( self )
SingleFileMetadataExporterMedia.__init__( self )
def _GetSerialisableInfo( self ):
return list()
def _InitialiseFromSerialisableInfo( self, serialisable_info ):
gumpf = serialisable_info
def Export( self, hash: bytes, rows: typing.Collection[ str ] ):
if len( rows ) == 0:
return
names_and_notes = []
for row in rows:
if ClientMetadataMigrationCore.NOTE_CONNECTOR_STRING not in row:
continue
( name, text ) = row.split( ClientMetadataMigrationCore.NOTE_CONNECTOR_STRING, 1 )
if name == '' or text == '':
continue
name = name.replace( ClientMetadataMigrationCore.NOTE_NAME_ESCAPE_STRING, ClientMetadataMigrationCore.NOTE_CONNECTOR_STRING )
names_and_notes.append( ( name, text ) )
media_result = HG.client_controller.Read( 'media_result', hash )
note_import_options = NoteImportOptions.NoteImportOptions()
note_import_options.SetIsDefault( False )
note_import_options.SetExtendExistingNoteIfPossible( True )
note_import_options.SetConflictResolution( NoteImportOptions.NOTE_IMPORT_CONFLICT_RENAME )
service_keys_to_content_updates = note_import_options.GetServiceKeysToContentUpdates( media_result, names_and_notes )
if len( service_keys_to_content_updates ) > 0:
HG.client_controller.WriteSynchronous( 'content_updates', service_keys_to_content_updates )
def GetExampleStrings( self ):
examples = [
'Artist Commentary: This work is one of my favourites.',
'Translation: "What a nice day!"'
]
return examples
def ToString( self ) -> str:
return 'notes to media'
HydrusSerialisable.SERIALISABLE_TYPES_TO_OBJECT_TYPES[ HydrusSerialisable.SERIALISABLE_TYPE_METADATA_SINGLE_FILE_EXPORTER_MEDIA_NOTES ] = SingleFileMetadataExporterMediaNotes
class SingleFileMetadataExporterMediaTags( HydrusSerialisable.SerialisableBase, SingleFileMetadataExporterMedia ):
SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_METADATA_SINGLE_FILE_EXPORTER_MEDIA_TAGS

View File

@ -74,6 +74,78 @@ class SingleFileMetadataImporterSidecar( SingleFileMetadataImporter, ClientMetad
class SingleFileMetadataImporterMediaNotes( HydrusSerialisable.SerialisableBase, SingleFileMetadataImporterMedia ):
SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_METADATA_SINGLE_FILE_IMPORTER_MEDIA_NOTES
SERIALISABLE_NAME = 'Metadata Single File Importer Media Notes'
SERIALISABLE_VERSION = 1
def __init__( self, string_processor: typing.Optional[ ClientStrings.StringProcessor ] = None ):
if string_processor is None:
string_processor = ClientStrings.StringProcessor()
HydrusSerialisable.SerialisableBase.__init__( self )
SingleFileMetadataImporterMedia.__init__( self, string_processor )
def _GetSerialisableInfo( self ):
serialisable_string_processor = self._string_processor.GetSerialisableTuple()
return serialisable_string_processor
def _InitialiseFromSerialisableInfo( self, serialisable_info ):
serialisable_string_processor = serialisable_info
self._string_processor = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_string_processor )
def GetExampleStrings( self ):
examples = [
'Artist Commentary: This work is one of my favourites.',
'Translation: "What a nice day!"'
]
return examples
def Import( self, media_result: ClientMediaResult.MediaResult ):
names_to_notes = media_result.GetNotesManager().GetNamesToNotes()
rows = [ '{}{}{}'.format( name.replace( ClientMetadataMigrationCore.NOTE_CONNECTOR_STRING, ClientMetadataMigrationCore.NOTE_NAME_ESCAPE_STRING ), ClientMetadataMigrationCore.NOTE_CONNECTOR_STRING, text ) for ( name, text ) in names_to_notes.items() ]
if self._string_processor.MakesChanges():
rows = self._string_processor.ProcessStrings( rows )
return rows
def ToString( self ) -> str:
if self._string_processor.MakesChanges():
full_munge_text = ', applying {}'.format( self._string_processor.ToString() )
else:
full_munge_text = ''
return 'notes from media{}'.format( full_munge_text )
HydrusSerialisable.SERIALISABLE_TYPES_TO_OBJECT_TYPES[ HydrusSerialisable.SERIALISABLE_TYPE_METADATA_SINGLE_FILE_IMPORTER_MEDIA_NOTES ] = SingleFileMetadataImporterMediaNotes
class SingleFileMetadataImporterMediaTags( HydrusSerialisable.SerialisableBase, SingleFileMetadataImporterMedia ):
SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_METADATA_SINGLE_FILE_IMPORTER_MEDIA_TAGS

View File

@ -123,6 +123,8 @@ class HydrusServiceClientAPI( HydrusClientService ):
root.putChild( b'manage_headers', manage_headers )
manage_headers.putChild( b'set_user_agent', ClientLocalServerResources.HydrusResourceClientAPIRestrictedManageCookiesSetUserAgent( self._service, self._client_requests_domain ) )
manage_headers.putChild( b'get_headers', ClientLocalServerResources.HydrusResourceClientAPIRestrictedManageCookiesGetHeaders( self._service, self._client_requests_domain ) )
manage_headers.putChild( b'set_headers', ClientLocalServerResources.HydrusResourceClientAPIRestrictedManageCookiesSetHeaders( self._service, self._client_requests_domain ) )
manage_pages = NoResource()

View File

@ -45,6 +45,7 @@ from hydrus.client.importing.options import FileImportOptions
from hydrus.client.media import ClientMedia
from hydrus.client.metadata import ClientTags
from hydrus.client.networking import ClientNetworkingContexts
from hydrus.client.networking import ClientNetworkingDomain
from hydrus.client.networking import ClientNetworkingFunctions
local_booru_css = FileResource( os.path.join( HC.STATIC_DIR, 'local_booru_style.css' ), defaultType = 'text/css' )
@ -2972,13 +2973,15 @@ class HydrusResourceClientAPIRestrictedGetFilesGetThumbnail( HydrusResourceClien
return response_context
class HydrusResourceClientAPIRestrictedManageCookies( HydrusResourceClientAPIRestricted ):
def _CheckAPIPermissions( self, request: HydrusServerRequest.HydrusRequest ):
request.client_api_permissions.CheckPermission( ClientAPI.CLIENT_API_PERMISSION_MANAGE_COOKIES )
request.client_api_permissions.CheckPermission( ClientAPI.CLIENT_API_PERMISSION_MANAGE_HEADERS )
class HydrusResourceClientAPIRestrictedManageCookiesGetCookies( HydrusResourceClientAPIRestrictedManageCookies ):
def _threadDoGETJob( self, request: HydrusServerRequest.HydrusRequest ):
@ -3018,6 +3021,7 @@ class HydrusResourceClientAPIRestrictedManageCookiesGetCookies( HydrusResourceCl
return response_context
class HydrusResourceClientAPIRestrictedManageCookiesSetCookies( HydrusResourceClientAPIRestrictedManageCookies ):
def _threadDoPOSTJob( self, request: HydrusServerRequest.HydrusRequest ):
@ -3027,6 +3031,9 @@ class HydrusResourceClientAPIRestrictedManageCookiesSetCookies( HydrusResourceCl
domains_cleared = set()
domains_set = set()
# TODO: This all sucks. replace the rows in this and the _set_ with an Object, and the domains_cleared/set stuff should say more, like count removed from each etc...
# refer to get/set_headers for example
for cookie_row in cookie_rows:
if len( cookie_row ) != 5:
@ -3096,6 +3103,7 @@ class HydrusResourceClientAPIRestrictedManageCookiesSetCookies( HydrusResourceCl
return response_context
class HydrusResourceClientAPIRestrictedManageCookiesSetUserAgent( HydrusResourceClientAPIRestrictedManageCookies ):
def _threadDoPOSTJob( self, request: HydrusServerRequest.HydrusRequest ):
@ -3109,13 +3117,241 @@ class HydrusResourceClientAPIRestrictedManageCookiesSetUserAgent( HydrusResource
user_agent = ClientDefaults.DEFAULT_USER_AGENT
HG.client_controller.network_engine.domain_manager.SetGlobalUserAgent( user_agent )
HG.client_controller.network_engine.domain_manager.SetCustomHeader( ClientNetworkingContexts.GLOBAL_NETWORK_CONTEXT, 'User-Agent', value = user_agent )
response_context = HydrusServerResources.ResponseContext( 200 )
return response_context
def GenerateNetworkContextFromRequest( request: HydrusServerRequest.Request ):
domain = request.parsed_request_args.GetValueOrNone( 'domain', str )
if domain is None:
network_context = ClientNetworkingContexts.GLOBAL_NETWORK_CONTEXT
else:
if '.' not in domain:
raise HydrusExceptions.BadRequestException( 'The value "{}" does not seem to be a domain!'.format( domain ) )
network_context = ClientNetworkingContexts.NetworkContext( CC.NETWORK_CONTEXT_DOMAIN, domain )
return network_context
def RenderNetworkContextToJSONObject( network_context: ClientNetworkingContexts.NetworkContext ) -> dict:
result = {}
result[ 'type' ] = network_context.context_type
if isinstance( network_context.context_data, bytes ):
result[ 'data' ] = network_context.context_data.hex()
elif isinstance( network_context.context_data, str ) or network_context.context_data is None:
result[ 'data' ] = network_context.context_data
else:
result[ 'data' ] = repr( network_context.context_data )
return result
class HydrusResourceClientAPIRestrictedManageCookiesGetHeaders( HydrusResourceClientAPIRestrictedManageCookies ):
def _threadDoGETJob( self, request: HydrusServerRequest.HydrusRequest ):
network_context = GenerateNetworkContextFromRequest( request )
ncs_to_header_dicts = HG.client_controller.network_engine.domain_manager.GetNetworkContextsToCustomHeaderDicts()
body_dict = {}
body_dict[ 'network_context' ] = RenderNetworkContextToJSONObject( network_context )
headers_dict = ncs_to_header_dicts.get( network_context, {} )
body_headers_dict = {}
for ( key, ( value, approved, reason ) ) in headers_dict.items():
body_headers_dict[ key ] = {
'value' : value,
'approved' : ClientNetworkingDomain.valid_str_lookup[ approved ],
'reason' : reason
}
body_dict[ 'headers' ] = body_headers_dict
body = Dumps( body_dict, request.preferred_mime )
response_context = HydrusServerResources.ResponseContext( 200, mime = request.preferred_mime, body = body )
return response_context
class HydrusResourceClientAPIRestrictedManageCookiesSetHeaders( HydrusResourceClientAPIRestrictedManageCookies ):
def _threadDoPOSTJob( self, request: HydrusServerRequest.HydrusRequest ):
network_context = GenerateNetworkContextFromRequest( request )
http_header_objects = request.parsed_request_args.GetValue( 'headers', dict )
headers_cleared = set()
headers_set = set()
headers_altered = set()
for ( key, info_dict ) in http_header_objects.items():
ncs_to_header_dicts = HG.client_controller.network_engine.domain_manager.GetNetworkContextsToCustomHeaderDicts()
if network_context in ncs_to_header_dicts:
headers_dict = ncs_to_header_dicts[ network_context ]
else:
headers_dict = {}
approved = None
reason = None
if 'approved' in info_dict:
approved_str = info_dict[ 'approved' ]
approved = ClientNetworkingDomain.valid_enum_lookup.get( approved_str, None )
if approved is None:
raise HydrusExceptions.BadRequestException( 'The value "{}" was not in the permitted list!'.format( approved_str ) )
if 'reason' in info_dict:
reason = info_dict[ 'reason' ]
if not isinstance( reason, str ):
raise HydrusExceptions.BadRequestException( 'The reason "{}" was not a string!'.format( reason ) )
if 'value' in info_dict:
value = info_dict[ 'value' ]
if value is None:
if key in headers_dict:
HG.client_controller.network_engine.domain_manager.DeleteCustomHeader( network_context, key )
headers_cleared.add( key )
else:
if not isinstance( value, str ):
raise HydrusExceptions.BadRequestException( 'The value "{}" was not a string!'.format( value ) )
do_it = True
if key in headers_dict:
old_value = headers_dict[ key ][0]
if old_value == value:
do_it = False
else:
headers_altered.add( key )
else:
headers_set.add( key )
if do_it:
HG.client_controller.network_engine.domain_manager.SetCustomHeader( network_context, key, value = value, approved = approved, reason = reason )
else:
if approved is None and reason is None:
raise HydrusExceptions.BadRequestException( 'Sorry, you have to set a value, approved, or reason parameter!' )
if key not in headers_dict:
raise HydrusExceptions.BadRequestException( 'Sorry, you tried to set approved/reason on "{}" for "{}", but that entry does not exist, so there is no value to set them to! Please give a value!'.format( key, network_context ) )
headers_altered.add( key )
HG.client_controller.network_engine.domain_manager.SetCustomHeader( network_context, key, approved = approved, reason = reason )
if HG.client_controller.new_options.GetBoolean( 'notify_client_api_cookies' ) and len( headers_cleared ) + len( headers_set ) + len( headers_altered ) > 0:
message_lines = [ 'Headers sent from API:' ]
if len( headers_cleared ) > 0:
message_lines.extend( [ 'Cleared: {}'.format( key ) for key in sorted( headers_cleared ) ] )
if len( headers_set ) > 0:
message_lines.extend( [ 'Set: {}'.format( key ) for key in sorted( headers_set ) ] )
if len( headers_altered ) > 0:
message_lines.extend( [ 'Altered: {}'.format( key ) for key in sorted( headers_altered ) ] )
message = os.linesep.join( message_lines )
job_key = ClientThreading.JobKey()
job_key.SetStatusText( message )
job_key.Delete( 5 )
HG.client_controller.pub( 'message', job_key )
response_context = HydrusServerResources.ResponseContext( 200 )
return response_context
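From the parsing in the handler above, a client POSTs a 'headers' dict (and an optional 'domain') where each entry may carry 'value', 'approved', and 'reason', and a null 'value' deletes the header. A sketch of the payload shape, inferred from this handler code rather than the official API docs--the domain and header values here are made up:

```python
import json

# Build a request body for the set_headers endpoint added above.
# 'approved' must be one of the strings in valid_str_lookup:
# 'denied', 'approved', or 'pending'.
def build_set_headers_body( domain, headers ):
    body = { 'headers' : headers }
    if domain is not None:
        # omitting 'domain' targets the global network context instead
        body[ 'domain' ] = domain
    return json.dumps( body )

body = build_set_headers_body(
    'example.com',
    {
        'User-Agent' : { 'value' : 'Mozilla/5.0', 'approved' : 'approved', 'reason' : 'Set by Client API' },
        'X-Old-Token' : { 'value' : None }  # a null value clears the header if it exists
    }
)
```

This would be sent as the JSON body of a POST to the manage_headers/set_headers path registered in the routing code above.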
class HydrusResourceClientAPIRestrictedManageDatabase( HydrusResourceClientAPIRestricted ):
def _CheckAPIPermissions( self, request: HydrusServerRequest.HydrusRequest ):

View File

@ -29,6 +29,11 @@ class NetworkContext( HydrusSerialisable.SerialisableBase ):
return NotImplemented
def __format__( self, format_spec ):
return self.ToString()
def __hash__( self ):
return ( self.context_type, self.context_data ).__hash__()

View File

@ -25,9 +25,11 @@ VALID_UNKNOWN = 2
valid_str_lookup = {
VALID_DENIED : 'denied',
VALID_APPROVED : 'approved',
VALID_UNKNOWN : 'unknown'
VALID_UNKNOWN : 'pending'
}
valid_enum_lookup = { value : key for ( key, value ) in valid_str_lookup.items() }
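The inverted lookup above is what the Client API's set_headers handler uses to parse the 'approved' string back into an enum (note the rename of the VALID_UNKNOWN string from 'unknown' to 'pending'). A standalone sketch--only VALID_UNKNOWN = 2 is visible in this hunk, so the other enum values are assumed:

```python
# enum values assumed for illustration; only VALID_UNKNOWN = 2 appears in the hunk
VALID_DENIED = 0
VALID_APPROVED = 1
VALID_UNKNOWN = 2

valid_str_lookup = {
    VALID_DENIED : 'denied',
    VALID_APPROVED : 'approved',
    VALID_UNKNOWN : 'pending'
}

# inverting the dict gives the API its string-to-enum parse table
valid_enum_lookup = { value : key for ( key, value ) in valid_str_lookup.items() }

def parse_approved( approved_str ):
    approved = valid_enum_lookup.get( approved_str, None )
    if approved is None:
        raise ValueError( 'The value "{}" was not in the permitted list!'.format( approved_str ) )
    return approved
```

The old string 'unknown' is no longer in the table, so a client still sending it gets the bad-request error rather than a silent default.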
class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):
SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_NETWORK_DOMAIN_MANAGER
@ -289,7 +291,7 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):
serialisable_default_note_import_options_tuple = ( serialisable_file_post_default_note_import_options, serialisable_watchable_default_note_import_options, serialisable_url_class_keys_to_default_note_import_options )
serialisable_parsers = self._parsers.GetSerialisableTuple()
serialisable_network_contexts_to_custom_header_dicts = [ ( network_context.GetSerialisableTuple(), list(custom_header_dict.items()) ) for ( network_context, custom_header_dict ) in list(self._network_contexts_to_custom_header_dicts.items()) ]
serialisable_network_contexts_to_custom_header_dicts = [ ( network_context.GetSerialisableTuple(), list( custom_header_dict.items() ) ) for ( network_context, custom_header_dict ) in self._network_contexts_to_custom_header_dicts.items() ]
return ( serialisable_gugs, serialisable_gug_keys_to_display, serialisable_url_classes, serialisable_url_class_keys_to_display, serialisable_url_class_keys_to_parser_keys, serialisable_default_tag_import_options_tuple, serialisable_default_note_import_options_tuple, serialisable_parsers, serialisable_network_contexts_to_custom_header_dicts )
@ -755,7 +757,7 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):
return False
def AutoAddDomainMetadatas( self, domain_metadatas, approved = False ):
def AutoAddDomainMetadatas( self, domain_metadatas, approved = VALID_APPROVED ):
for domain_metadata in domain_metadatas:
@ -936,6 +938,24 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):
return url_tuples
def DeleteCustomHeader( self, network_context: ClientNetworkingContexts.NetworkContext, key: str ):
with self._lock:
if network_context in self._network_contexts_to_custom_header_dicts:
custom_header_dict = self._network_contexts_to_custom_header_dicts[ network_context ]
if key in custom_header_dict:
del custom_header_dict[ key ]
self._SetDirty()
def DeleteGUGs( self, deletee_names ):
with self._lock:
@ -1164,7 +1184,7 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):
custom_header_dict = self._network_contexts_to_custom_header_dicts[ network_context ]
for ( key, ( value, approved, reason ) ) in list(custom_header_dict.items()):
for ( key, ( value, approved, reason ) ) in list( custom_header_dict.items() ):
if approved == VALID_APPROVED:
@@ -1262,9 +1282,12 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):
custom_header_dict = self._network_contexts_to_custom_header_dicts[ network_context ]
for ( key, ( value, approved, reason ) ) in list(custom_header_dict.items()):
for ( key, ( value, approved, reason ) ) in list( custom_header_dict.items() ):
headers_list.append( ( key, value, reason ) )
if approved == VALID_APPROVED:
headers_list.append( ( key, value, reason ) )
@@ -1416,7 +1439,7 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):
def IsValid( self, network_contexts ):
# for now, let's say that denied headers are simply not added, not that they invalidate a query
# denied headers are simply not added--they don't invalidate a query
for network_context in network_contexts:
@@ -1713,6 +1736,54 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):
def SetCustomHeader( self, network_context: ClientNetworkingContexts.NetworkContext, key, value = None, approved = None, reason = None ):
with self._lock:
fallback_value = None
fallback_approved = VALID_APPROVED
fallback_reason = 'Set by Client API'
if network_context not in self._network_contexts_to_custom_header_dicts:
self._network_contexts_to_custom_header_dicts[ network_context ] = {}
custom_header_dict = self._network_contexts_to_custom_header_dicts[ network_context ]
if key in custom_header_dict:
( fallback_value, fallback_approved, fallback_reason ) = custom_header_dict[ key ]
if value is None:
if fallback_value is None:
raise Exception( 'Sorry, was called to set HTTP Header information for key "{}" on "{}", but there was no attached value and it did not already exist!'.format( key, network_context ) )
else:
value = fallback_value
if approved is None:
approved = fallback_approved
if reason is None:
reason = fallback_reason
custom_header_dict[ key ] = ( value, approved, reason )
self._SetDirty()
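Pulled out of its class for illustration, the fallback behaviour of the new `SetCustomHeader` can be sketched as a plain-dict function. This is a sketch only: the lock, the network-context lookup, and the `VALID_APPROVED` constant are elided, with the string `'approved'` standing in for `VALID_APPROVED`.

```python
def set_custom_header( custom_header_dict, key, value = None, approved = None, reason = None ):
    
    # defaults for a brand new header; an existing entry overrides these
    fallback_value = None
    fallback_approved = 'approved'
    fallback_reason = 'Set by Client API'
    
    if key in custom_header_dict:
        
        ( fallback_value, fallback_approved, fallback_reason ) = custom_header_dict[ key ]
        
    
    if value is None:
        
        if fallback_value is None:
            
            # a header that does not exist yet must come with a value
            raise Exception( 'No value provided for new header "{}"!'.format( key ) )
            
        
        value = fallback_value
        
    
    if approved is None:
        
        approved = fallback_approved
        
    
    if reason is None:
        
        reason = fallback_reason
        
    
    custom_header_dict[ key ] = ( value, approved, reason )
```

So a later call that only sets `approved` keeps the existing value and reason, which is what lets the Client API approve a pending header without restating it.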
def SetDefaultFilePostNoteImportOptions( self, note_import_options ):
with self._lock:
@@ -1805,14 +1876,6 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):
def SetGlobalUserAgent( self, user_agent_string ):
with self._lock:
self._network_contexts_to_custom_header_dicts[ ClientNetworkingContexts.GLOBAL_NETWORK_CONTEXT ][ 'User-Agent' ] = ( user_agent_string, True, 'Set by Client API' )
def SetHeaderValidation( self, network_context, key, approved ):
with self._lock:
@@ -47,9 +47,25 @@ PLATFORM_MACOS = muh_platform == 'darwin'
PLATFORM_LINUX = muh_platform == 'linux'
PLATFORM_HAIKU = muh_platform == 'haiku1'
if PLATFORM_WINDOWS:
NICE_PLATFORM_STRING = 'Windows'
elif PLATFORM_MACOS:
NICE_PLATFORM_STRING = 'macOS'
elif PLATFORM_LINUX:
NICE_PLATFORM_STRING = 'Linux'
elif PLATFORM_HAIKU:
NICE_PLATFORM_STRING = 'Haiku'
RUNNING_FROM_SOURCE = sys.argv[0].endswith( '.py' ) or sys.argv[0].endswith( '.pyw' )
RUNNING_FROM_MACOS_APP = os.path.exists( os.path.join( BASE_DIR, 'running_from_app' ) )
if RUNNING_FROM_SOURCE:
NICE_RUNNING_AS_STRING = 'from source'
elif RUNNING_FROM_FROZEN_BUILD:
NICE_RUNNING_AS_STRING = 'from frozen build'
elif RUNNING_FROM_MACOS_APP:
NICE_RUNNING_AS_STRING = 'from App'
BIN_DIR = os.path.join( BASE_DIR, 'bin' )
HELP_DIR = os.path.join( BASE_DIR, 'help' )
INCLUDE_DIR = os.path.join( BASE_DIR, 'include' )
@@ -84,8 +100,8 @@ options = {}
# Misc
NETWORK_VERSION = 20
SOFTWARE_VERSION = 521
CLIENT_API_VERSION = 42
SOFTWARE_VERSION = 522
CLIENT_API_VERSION = 43
SERVER_THUMBNAIL_DIMENSIONS = ( 200, 200 )
@@ -164,7 +180,7 @@ CONTENT_TYPE_NOTES = 18
CONTENT_TYPE_FILE_VIEWING_STATS = 19
CONTENT_TYPE_TAG = 20
CONTENT_TYPE_DEFINITIONS = 21
CONTENT_TYPE_HTTP_HEADER = 22
CONTENT_TYPE_HTTP_HEADERS = 22
content_type_string_lookup = {
CONTENT_TYPE_MAPPINGS : 'mappings',
@@ -188,7 +204,7 @@ content_type_string_lookup = {
CONTENT_TYPE_NOTES : 'notes',
CONTENT_TYPE_FILE_VIEWING_STATS : 'file viewing stats',
CONTENT_TYPE_DEFINITIONS : 'definitions',
CONTENT_TYPE_HTTP_HEADER : 'http header'
CONTENT_TYPE_HTTP_HEADERS : 'http headers'
}
CONTENT_UPDATE_ADD = 0
@@ -538,6 +538,11 @@ def MergeFile( source, dest ):
# this can merge a file, but if it is given a dir it will just straight up overwrite not merge
if os.path.exists( source ) and os.path.exists( dest ) and os.path.samefile( source, dest ):
raise Exception( f'Woah, "{source}" and "{dest}" are the same file!' )
if not os.path.isdir( source ):
if PathsHaveSameSizeAndDate( source, dest ):
@@ -564,8 +569,14 @@ def MergeFile( source, dest ):
return True
def MergeTree( source, dest, text_update_hook = None ):
if os.path.exists( source ) and os.path.exists( dest ) and os.path.samefile( source, dest ):
raise Exception( f'Woah, "{source}" and "{dest}" are the same directory!' )
pauser = HydrusData.BigJobPauser()
if not os.path.exists( dest ):
@@ -638,8 +649,14 @@ def MergeTree( source, dest, text_update_hook = None ):
def MirrorFile( source, dest ):
if os.path.exists( source ) and os.path.exists( dest ) and os.path.samefile( source, dest ):
return True
if not PathsHaveSameSizeAndDate( source, dest ):
try:
@@ -662,6 +679,11 @@ def MirrorFile( source, dest ):
def MirrorTree( source, dest, text_update_hook = None, is_cancelled_hook = None ):
if os.path.exists( source ) and os.path.exists( dest ) and os.path.samefile( source, dest ):
return
pauser = HydrusData.BigJobPauser()
MakeSureDirectoryExists( dest )
@@ -729,6 +751,7 @@ def MirrorTree( source, dest, text_update_hook = None, is_cancelled_hook = None
def OpenFileLocation( path ):
def do_it():
@@ -133,6 +133,8 @@ SERIALISABLE_TYPE_METADATA_SINGLE_FILE_EXPORTER_MEDIA_TAGS = 115
SERIALISABLE_TYPE_METADATA_SINGLE_FILE_EXPORTER_TXT = 116
SERIALISABLE_TYPE_METADATA_SINGLE_FILE_EXPORTER_MEDIA_URLS = 117
SERIALISABLE_TYPE_METADATA_SINGLE_FILE_IMPORTER_MEDIA_URLS = 118
SERIALISABLE_TYPE_METADATA_SINGLE_FILE_EXPORTER_MEDIA_NOTES = 119
SERIALISABLE_TYPE_METADATA_SINGLE_FILE_IMPORTER_MEDIA_NOTES = 120
SERIALISABLE_TYPES_TO_OBJECT_TYPES = {}
@@ -200,8 +200,6 @@ def CleanTag( tag ):
tag = tag.lower()
tag = HydrusText.re_leading_colons.sub( ':', tag )
if HydrusText.re_leading_single_colon_and_no_more_colons.match( tag ) is not None:
# Convert anything starting with one colon to start with two i.e. :D -> ::D
@@ -265,11 +263,11 @@ def CleanTags( tags ):
return clean_tags
def CombineTag( namespace, subtag, do_not_double_namespace = False ):
def CombineTag( namespace, subtag ):
if namespace == '':
if ':' in subtag and HydrusText.re_leading_double_colon.match( subtag ) is None:
if ':' in subtag:
return ':' + subtag
@@ -280,14 +278,7 @@ def CombineTag( namespace, subtag, do_not_double_namespace = False ):
else:
if do_not_double_namespace and subtag.startswith( namespace + ':' ):
return subtag
else:
return namespace + ':' + subtag
return namespace + ':' + subtag
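For reference, the simplified `CombineTag` above, with its implicit "no namespace, no colon" branch filled in, behaves like this sketch (lowercase name used here; the real function lives in `HydrusTags`):

```python
def combine_tag( namespace, subtag ):
    
    if namespace == '':
        
        # an unnamespaced subtag that contains a colon gets a single leading
        # colon so it cannot be misread later as namespace:subtag
        if ':' in subtag:
            
            return ':' + subtag
            
        
        return subtag
        
    
    # the old do_not_double_namespace special case is gone--always join
    return namespace + ':' + subtag
```

Note the behaviour change the updated tests pin down: `( '', '::p' )` now gains another leading colon rather than being passed through.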
@@ -490,6 +490,8 @@ def TestVariableType( name: str, value: typing.Any, expected_type: type, expecte
EXPECTED_TYPE = typing.TypeVar( 'EXPECTED_TYPE' )
class ParsedRequestArguments( dict ):
def __missing__( self, key ):
@@ -497,8 +499,13 @@ class ParsedRequestArguments( dict ):
raise HydrusExceptions.BadRequestException( 'It looks like the parameter "{}" was missing!'.format( key ) )
def GetValue( self, key, expected_type, expected_list_type = None, expected_dict_types = None, default_value = None, none_on_missing = False ):
def GetValue( self, key, expected_type: EXPECTED_TYPE, expected_list_type = None, expected_dict_types = None, default_value = None ) -> EXPECTED_TYPE:
return GetValueFromDict( self, key, expected_type, expected_list_type = expected_list_type, expected_dict_types = expected_dict_types, default_value = default_value, none_on_missing = none_on_missing )
return GetValueFromDict( self, key, expected_type, expected_list_type = expected_list_type, expected_dict_types = expected_dict_types, default_value = default_value )
def GetValueOrNone( self, key, expected_type: EXPECTED_TYPE, expected_list_type = None, expected_dict_types = None ) -> typing.Optional[ EXPECTED_TYPE ]:
return GetValueFromDict( self, key, expected_type, expected_list_type = expected_list_type, expected_dict_types = expected_dict_types, none_on_missing = True )
@@ -82,6 +82,8 @@ class Predicate( Enum ):
LAST_VIEWED_TIME = auto()
TIME_IMPORTED = auto()
DURATION = auto()
FRAMERATE = auto()
NUM_OF_FRAMES = auto()
FILE_SERVICE = auto()
NUM_FILE_RELS = auto()
RATIO = auto()
@@ -145,6 +147,7 @@ class Units( Enum ):
FILE_RELATIONSHIP_TYPE = auto() # One of 'not related/false positive', 'duplicates', 'alternates', 'potential duplicates'
PIXELS_OR_NONE = auto() # Always None (meaning pixels)
PIXELS = auto() # One of 'pixels', 'kilopixels', 'megapixels'
FPS_OR_NONE = auto() # 'fps'
# All system predicates
@@ -182,6 +185,8 @@ SYSTEM_PREDICATES = {
'last viewed time|last view time': (Predicate.LAST_VIEWED_TIME, Operators.RELATIONAL, Value.DATE_OR_TIME_INTERVAL, None),
'time imported|import time': (Predicate.TIME_IMPORTED, Operators.RELATIONAL, Value.DATE_OR_TIME_INTERVAL, None),
'duration': (Predicate.DURATION, Operators.RELATIONAL, Value.TIME_SEC_MSEC, None),
'framerate': (Predicate.FRAMERATE, Operators.RELATIONAL_EXACT, Value.NATURAL, Units.FPS_OR_NONE),
'number of frames': (Predicate.NUM_OF_FRAMES, Operators.RELATIONAL, Value.NATURAL, None),
'file service': (Predicate.FILE_SERVICE, Operators.FILESERVICE_STATUS, Value.ANY_STRING, None),
'num(ber of)? file relationships': (Predicate.NUM_FILE_RELS, Operators.RELATIONAL, Value.NATURAL, Units.FILE_RELATIONSHIP_TYPE),
'ratio': (Predicate.RATIO, Operators.RATIO_OPERATORS, Value.RATIO, None),
@@ -272,6 +277,13 @@ def parse_unit( string: str, spec ):
match = re.match( 'mpx|megapixels|megapixel', string )
if match: return string[ len( match[ 0 ] ): ], 'megapixels'
raise ValueError( "Invalid unit, expected pixels" )
elif spec == Units.FPS_OR_NONE:
if not string:
return string, None
else:
match = re.match( 'fps', string )
if match: return string[ len( match[ 0 ] ): ], None
raise ValueError( "Invalid unit, expected no unit or fps" )
raise ValueError( "Invalid unit specification" )
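The new `FPS_OR_NONE` branch can be exercised on its own; a minimal sketch of just that unit parser, keeping the return convention above (leftover string plus parsed unit):

```python
import re

def parse_fps_unit( string ):
    
    # 'framerate' accepts either no unit at all or a literal 'fps' suffix;
    # both parse to a unit of None, since 'fps' carries no extra information
    if not string:
        
        return string, None
        
    
    match = re.match( 'fps', string )
    
    if match:
        
        return string[ len( match[ 0 ] ): ], None
        
    
    raise ValueError( "Invalid unit, expected no unit or fps" )
```

This is why both `system:framerate > 60fps` and `system:framerate > 60` parse in the examples below.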
@@ -487,6 +499,8 @@ examples = [
"system:duration < 5 seconds",
"system:duration ~= 5 sec 6000 msecs",
"system:duration > 3 milliseconds",
"system:framerate > 60fps",
"system:number of frames > 6000",
"system:file service is pending to my files",
" system:file service currently in my files",
"system:file service isn't currently in my files",
@@ -905,7 +905,7 @@ class HydrusResourceRestrictedNumPetitions( HydrusResourceRestricted ):
def _threadDoGETJob( self, request: HydrusServerRequest.HydrusRequest ):
subject_account_key = request.parsed_request_args.GetValue( 'subject_account_key', bytes, none_on_missing = True )
subject_account_key = request.parsed_request_args.GetValueOrNone( 'subject_account_key', bytes )
petition_count_info = HG.server_controller.Read( 'num_petitions', self._service_key, request.hydrus_account, subject_account_key = subject_account_key )
@@ -949,7 +949,7 @@ class HydrusResourceRestrictedPetition( HydrusResourceRestricted ):
def _threadDoGETJob( self, request: HydrusServerRequest.HydrusRequest ):
subject_account_key = request.parsed_request_args.GetValue( 'subject_account_key', bytes, none_on_missing = True )
subject_account_key = request.parsed_request_args.GetValueOrNone( 'subject_account_key', bytes )
# add reason to here some time, for when we eventually select petitions from a summary list of ( account, reason, size ) stuff
content_type = request.parsed_request_args[ 'content_type' ]
status = request.parsed_request_args[ 'status' ]
@@ -261,7 +261,7 @@ class TestClientAPI( unittest.TestCase ):
permissions_to_set_up.append( ( 'add_tags', [ ClientAPI.CLIENT_API_PERMISSION_ADD_TAGS ] ) )
permissions_to_set_up.append( ( 'add_urls', [ ClientAPI.CLIENT_API_PERMISSION_ADD_URLS ] ) )
permissions_to_set_up.append( ( 'manage_pages', [ ClientAPI.CLIENT_API_PERMISSION_MANAGE_PAGES ] ) )
permissions_to_set_up.append( ( 'manage_cookies', [ ClientAPI.CLIENT_API_PERMISSION_MANAGE_COOKIES ] ) )
permissions_to_set_up.append( ( 'manage_headers', [ ClientAPI.CLIENT_API_PERMISSION_MANAGE_HEADERS ] ) )
permissions_to_set_up.append( ( 'search_all_files', [ ClientAPI.CLIENT_API_PERMISSION_SEARCH_FILES ] ) )
permissions_to_set_up.append( ( 'search_green_files', [ ClientAPI.CLIENT_API_PERMISSION_SEARCH_FILES ] ) )
@@ -638,7 +638,7 @@ class TestClientAPI( unittest.TestCase ):
def _test_get_services( self, connection, set_up_permissions ):
should_work = { set_up_permissions[ 'everything' ], set_up_permissions[ 'add_files' ], set_up_permissions[ 'add_tags' ], set_up_permissions[ 'manage_pages' ], set_up_permissions[ 'search_all_files' ], set_up_permissions[ 'search_green_files' ] }
should_break = { set_up_permissions[ 'add_urls' ], set_up_permissions[ 'manage_cookies' ] }
should_break = { set_up_permissions[ 'add_urls' ], set_up_permissions[ 'manage_headers' ] }
expected_answer = {
'local_tags' : [
@@ -2317,7 +2317,7 @@ class TestClientAPI( unittest.TestCase ):
def _test_manage_cookies( self, connection, set_up_permissions ):
api_permissions = set_up_permissions[ 'manage_cookies' ]
api_permissions = set_up_permissions[ 'manage_headers' ]
access_key_hex = api_permissions.GetAccessKey().hex()
@@ -2459,6 +2459,15 @@ class TestClientAPI( unittest.TestCase ):
#
def _test_manage_headers( self, connection, set_up_permissions ):
api_permissions = set_up_permissions[ 'manage_headers' ]
access_key_hex = api_permissions.GetAccessKey().hex()
#
headers = { 'Hydrus-Client-API-Access-Key' : access_key_hex, 'Content-Type' : HC.mime_mimetype_string_lookup[ HC.APPLICATION_JSON ] }
path = '/manage_headers/set_user_agent'
@@ -2501,6 +2510,316 @@ class TestClientAPI( unittest.TestCase ):
self.assertEqual( current_headers[ 'User-Agent' ], ClientDefaults.DEFAULT_USER_AGENT )
#
headers = { 'Hydrus-Client-API-Access-Key' : access_key_hex }
path = '/manage_headers/get_headers'
connection.request( 'GET', path, headers = headers )
response = connection.getresponse()
data = response.read()
text = str( data, 'utf-8' )
self.assertEqual( response.status, 200 )
d = json.loads( text )
expected_result = {
'network_context' : {
'type' : 0,
'data' : None
},
'headers' : {
'User-Agent' : {
'approved': 'approved',
'reason': 'This is the default User-Agent identifier for the client for all network connections.',
'value' : ClientDefaults.DEFAULT_USER_AGENT
}
}
}
self.assertEqual( d, expected_result )
#
headers = { 'Hydrus-Client-API-Access-Key' : access_key_hex, 'Content-Type' : HC.mime_mimetype_string_lookup[ HC.APPLICATION_JSON ] }
path = '/manage_headers/set_headers'
request_dict = { 'headers' : { 'Test' : { 'value' : 'test_value' } } }
request_body = json.dumps( request_dict )
connection.request( 'POST', path, body = request_body, headers = headers )
response = connection.getresponse()
data = response.read()
self.assertEqual( response.status, 200 )
headers = { 'Hydrus-Client-API-Access-Key' : access_key_hex }
path = '/manage_headers/get_headers'
connection.request( 'GET', path, headers = headers )
response = connection.getresponse()
data = response.read()
text = str( data, 'utf-8' )
self.assertEqual( response.status, 200 )
d = json.loads( text )
expected_result = {
'network_context' : {
'type' : 0,
'data' : None
},
'headers' : {
'User-Agent' : {
'approved': 'approved',
'reason': 'This is the default User-Agent identifier for the client for all network connections.',
'value' : ClientDefaults.DEFAULT_USER_AGENT
},
'Test' : {
'approved': 'approved',
'reason': 'Set by Client API',
'value' : 'test_value'
}
}
}
self.assertEqual( d, expected_result )
#
headers = { 'Hydrus-Client-API-Access-Key' : access_key_hex, 'Content-Type' : HC.mime_mimetype_string_lookup[ HC.APPLICATION_JSON ] }
path = '/manage_headers/set_headers'
request_dict = { 'domain' : None, 'headers' : { 'Test' : { 'value' : 'test_value2' } } }
request_body = json.dumps( request_dict )
connection.request( 'POST', path, body = request_body, headers = headers )
response = connection.getresponse()
data = response.read()
self.assertEqual( response.status, 200 )
headers = { 'Hydrus-Client-API-Access-Key' : access_key_hex }
path = '/manage_headers/get_headers'
connection.request( 'GET', path, headers = headers )
response = connection.getresponse()
data = response.read()
text = str( data, 'utf-8' )
self.assertEqual( response.status, 200 )
d = json.loads( text )
expected_result = {
'network_context' : {
'type' : 0,
'data' : None
},
'headers' : {
'User-Agent' : {
'approved': 'approved',
'reason': 'This is the default User-Agent identifier for the client for all network connections.',
'value' : ClientDefaults.DEFAULT_USER_AGENT
},
'Test' : {
'approved': 'approved',
'reason': 'Set by Client API',
'value' : 'test_value2'
}
}
}
self.assertEqual( d, expected_result )
#
domain = 'subdomain.example.com'
headers = { 'Hydrus-Client-API-Access-Key' : access_key_hex }
path = f'/manage_headers/get_headers?domain={domain}'
connection.request( 'GET', path, headers = headers )
response = connection.getresponse()
data = response.read()
text = str( data, 'utf-8' )
self.assertEqual( response.status, 200 )
d = json.loads( text )
expected_result = {
'network_context' : {
'type' : 2,
'data' : 'subdomain.example.com'
},
'headers' : {}
}
self.assertEqual( d, expected_result )
#
headers = { 'Hydrus-Client-API-Access-Key' : access_key_hex, 'Content-Type' : HC.mime_mimetype_string_lookup[ HC.APPLICATION_JSON ] }
path = '/manage_headers/set_headers'
request_dict = { 'domain' : 'subdomain.example.com', 'headers' : { 'cool_stuff' : { 'value' : 'on', 'approved' : 'pending', 'reason' : 'select yes to turn on cool stuff' } } }
request_body = json.dumps( request_dict )
connection.request( 'POST', path, body = request_body, headers = headers )
response = connection.getresponse()
data = response.read()
self.assertEqual( response.status, 200 )
headers = { 'Hydrus-Client-API-Access-Key' : access_key_hex }
path = '/manage_headers/get_headers?domain=subdomain.example.com'
connection.request( 'GET', path, headers = headers )
response = connection.getresponse()
data = response.read()
text = str( data, 'utf-8' )
self.assertEqual( response.status, 200 )
d = json.loads( text )
expected_result = {
'network_context' : {
'type' : 2,
'data' : 'subdomain.example.com'
},
'headers' : {
'cool_stuff' : { 'value' : 'on', 'approved' : 'pending', 'reason' : 'select yes to turn on cool stuff' }
}
}
self.assertEqual( d, expected_result )
#
headers = { 'Hydrus-Client-API-Access-Key' : access_key_hex, 'Content-Type' : HC.mime_mimetype_string_lookup[ HC.APPLICATION_JSON ] }
path = '/manage_headers/set_headers'
request_dict = { 'domain' : 'subdomain.example.com', 'headers' : { 'cool_stuff' : { 'approved' : 'approved' } } }
request_body = json.dumps( request_dict )
connection.request( 'POST', path, body = request_body, headers = headers )
response = connection.getresponse()
data = response.read()
self.assertEqual( response.status, 200 )
headers = { 'Hydrus-Client-API-Access-Key' : access_key_hex }
path = '/manage_headers/get_headers?domain=subdomain.example.com'
connection.request( 'GET', path, headers = headers )
response = connection.getresponse()
data = response.read()
text = str( data, 'utf-8' )
self.assertEqual( response.status, 200 )
d = json.loads( text )
expected_result = {
'network_context' : {
'type' : 2,
'data' : 'subdomain.example.com'
},
'headers' : {
'cool_stuff' : { 'value' : 'on', 'approved' : 'approved', 'reason' : 'select yes to turn on cool stuff' }
}
}
self.assertEqual( d, expected_result )
#
headers = { 'Hydrus-Client-API-Access-Key' : access_key_hex, 'Content-Type' : HC.mime_mimetype_string_lookup[ HC.APPLICATION_JSON ] }
path = '/manage_headers/set_headers'
request_dict = { 'domain' : 'subdomain.example.com', 'headers' : { 'cool_stuff' : { 'value' : None } } }
request_body = json.dumps( request_dict )
connection.request( 'POST', path, body = request_body, headers = headers )
response = connection.getresponse()
data = response.read()
self.assertEqual( response.status, 200 )
headers = { 'Hydrus-Client-API-Access-Key' : access_key_hex }
path = '/manage_headers/get_headers?domain=subdomain.example.com'
connection.request( 'GET', path, headers = headers )
response = connection.getresponse()
data = response.read()
text = str( data, 'utf-8' )
self.assertEqual( response.status, 200 )
d = json.loads( text )
expected_result = {
'network_context' : {
'type' : 2,
'data' : 'subdomain.example.com'
},
'headers' : {}
}
self.assertEqual( d, expected_result )
def _test_manage_database( self, connection, set_up_permissions ):
@@ -4610,6 +4929,7 @@ class TestClientAPI( unittest.TestCase ):
self._test_add_urls( connection, set_up_permissions )
self._test_manage_duplicates( connection, set_up_permissions )
self._test_manage_cookies( connection, set_up_permissions )
self._test_manage_headers( connection, set_up_permissions )
self._test_manage_pages( connection, set_up_permissions )
self._test_search_files( connection, set_up_permissions )
@@ -261,6 +261,48 @@ class TestSingleFileMetadataImporters( unittest.TestCase ):
self.assertEqual( set( result ), set( string_processor.ProcessStrings( my_current_storage_tags ) ) )
def test_media_notes( self ):
names_to_notes = {
'test' : 'This is a test note!',
'Another Test' : 'This one has\n\na newline!'
}
expected_rows = [ '{}: {}'.format( name, note ) for ( name, note ) in names_to_notes.items() ]
# simple
hash = HydrusData.GenerateKey()
media_result = HF.GetFakeMediaResult( hash )
media_result.GetNotesManager().SetNamesToNotes( names_to_notes )
# simple
importer = ClientMetadataMigrationImporters.SingleFileMetadataImporterMediaNotes()
result = importer.Import( media_result )
self.assertEqual( set( result ), set( expected_rows ) )
# with string processor
string_processor = ClientStrings.StringProcessor()
processing_steps = [ ClientStrings.StringConverter( conversions = [ ( ClientStrings.STRING_CONVERSION_REMOVE_TEXT_FROM_BEGINNING, 1 ) ] ) ]
string_processor.SetProcessingSteps( processing_steps )
importer = ClientMetadataMigrationImporters.SingleFileMetadataImporterMediaNotes( string_processor = string_processor )
result = importer.Import( media_result )
self.assertTrue( len( result ) > 0 )
self.assertNotEqual( set( result ), set( expected_rows ) )
self.assertEqual( set( result ), set( string_processor.ProcessStrings( expected_rows ) ) )
def test_media_urls( self ):
urls = { 'https://site.com/123456', 'https://cdn5.st.com/file/123456' }
@@ -556,6 +598,43 @@ class TestSingleFileMetadataExporters( unittest.TestCase ):
HF.compare_content_updates( self, service_keys_to_content_updates, expected_service_keys_to_content_updates )
def test_media_notes( self ):
hash = os.urandom( 32 )
notes = [ 'test: this is a test note', 'another test: this is a different\n\ntest note' ]
# no notes makes no write
exporter = ClientMetadataMigrationExporters.SingleFileMetadataExporterMediaNotes()
HG.test_controller.ClearWrites( 'content_updates' )
exporter.Export( hash, [] )
with self.assertRaises( Exception ):
[ ( ( service_keys_to_content_updates, ), kwargs ) ] = HG.test_controller.GetWrite( 'content_updates' )
# simple
exporter = ClientMetadataMigrationExporters.SingleFileMetadataExporterMediaNotes()
HG.test_controller.SetRead( 'media_result', HF.GetFakeMediaResult( hash ) )
HG.test_controller.ClearWrites( 'content_updates' )
exporter.Export( hash, notes )
content_updates = [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_NOTES, HC.CONTENT_UPDATE_SET, ( hash, name, note ) ) for ( name, note ) in [ n.split( ': ', 1 ) for n in notes ] ]
expected_service_keys_to_content_updates = { CC.LOCAL_NOTES_SERVICE_KEY : content_updates }
[ ( ( service_keys_to_content_updates, ), kwargs ) ] = HG.test_controller.GetWrite( 'content_updates' )
HF.compare_content_updates( self, service_keys_to_content_updates, expected_service_keys_to_content_updates )
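The `'name: text'` merge format these tests rely on round-trips like this. This is a sketch of the convention only, not hydrus's actual helpers:

```python
def merge_note( name, text ):
    
    # sidecar rows are single strings, so name and text are merged into one
    return '{}: {}'.format( name, text )
    

def split_note( row ):
    
    # split on the first ': ' only, so the note text may itself contain colons
    ( name, text ) = row.split( ': ', 1 )
    
    return ( name, text )
```

Splitting with `maxsplit = 1`, as the exporter above does, is what keeps a colon inside the note text from being mistaken for the name separator.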
def test_media_urls( self ):
hash = os.urandom( 32 )
@@ -77,7 +77,7 @@ class TestContentParser( unittest.TestCase ):
content_parser = ClientParsing.ContentParser( name = name, content_type = HC.CONTENT_TYPE_MAPPINGS, formula = dummy_formula, additional_info = additional_info )
self.assertEqual( ClientParsing.GetTagsFromParseResults( content_parser.Parse( parsing_context, parsing_text ) ), { 'character:lara croft', 'character:double pistols' } )
self.assertEqual( ClientParsing.GetTagsFromParseResults( content_parser.Parse( parsing_context, parsing_text ) ), { 'character:character:lara croft', 'character:double pistols' } )
# series
@@ -5,12 +5,8 @@ from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusData
from hydrus.core import HydrusGlobals as HG
from hydrus.core import HydrusTags
from hydrus.core import HydrusText
from hydrus.external import SystemPredicateParser
from hydrus.client import ClientConstants as CC
from hydrus.client import ClientManagers
from hydrus.client import ClientSearch
from hydrus.client import ClientSearchParseSystemPredicates
from hydrus.client.media import ClientMediaManagers
@@ -15,7 +15,7 @@ class TestHydrusTags( unittest.TestCase ):
self.assertEqual( HydrusTags.CleanTag( ':p' ), '::p' )
self.assertEqual( HydrusTags.CombineTag( '', ':p' ), '::p' )
self.assertEqual( HydrusTags.CombineTag( '', '::p' ), '::p' )
self.assertEqual( HydrusTags.CombineTag( '', '::p' ), ':::p' )
self.assertEqual( HydrusTags.CombineTag( '', 'unnamespace:withcolon' ), ':unnamespace:withcolon' )