Version 508

This commit is contained in:
Hydrus Network Developer 2022-11-30 16:06:58 -06:00
parent b1b841cb11
commit 6d16ec02ce
44 changed files with 16881 additions and 15136 deletions


@ -7,7 +7,47 @@ title: Changelog
!!! note
This is the new changelog, only the most recent builds. For all versions, see the [old changelog](old_changelog.html).
## [Version 507](https://github.com/hydrusnetwork/hydrus/releases/tag/v506)
## [Version 508](https://github.com/hydrusnetwork/hydrus/releases/tag/v508)
### misc
* added a shortcut action to the 'media' set for 'file relationships: show x', where x is duplicates, potential duplicates, alternates, or false positives, just like the action buried in the thumbnail right-click menu. this actually works in both thumbs and the canvas.
* fixed file deletes not getting processed in the duplicate filter when there were no normal duplicate actions committed in a batch. sorry for the trouble here--duplicate decisions and deletes are now counted and reported in the confirmation dialogs as separate numbers
* as an experiment, the duplicate filter now says (+50%, -33%) percentage differences in the file size comparison statement. while the numbers here are correct, I'm not sure if this is helpful or awkward. maybe it should be phrased differently--let me know
* url classes get two new checkboxes this week: 'do not allow any extra path components/parameters', which will stop a match if the testee URL is 'longer' than the url class's definition. this should help with some difficult 'path-nested URLs aren't matching to the right URL Class' problems
* when you import hard drive files manually or in an import folder, files with .txt, .json, or .xml suffixes are now ignored in the file scanning phase. when hydrus eventually supports text files and arbitrary files, the solution will be nicer here, but this patch makes the new sidecar system nicer to work with in the meantime without, I hope, causing too much other fuss
* the 'tags' button in the advanced-mode 'sort files' control now hides/shows based on the sort type. also, the asc/desc button now hides/shows when it is invalid (filetype, hash, random), rather than disable/enable. there was a bit more signals-cleanup behind the scenes here too
* updated the 'could not set up qtpy/QtCore' error handling yet again to try to figure out this macOS App boot problem some users are getting. the error handling now says what the initial QT_API env variable was and tries to import every possible Qt and prints the whole error for each. hopefully we'll now see why PySide6 is not loading
* cleaned up the 'old changelog' page. all the '.' separators are replaced with proper header tags and I rejiggered some of the ul and li elements to interleave better. its favicon is also fixed. btw if you want to edit 500-odd elements at a time in a 2MB document, PyCharm is mostly great. multi-hundred simultaneous edit hung for about five minutes per character, but multiline regex Find and Replace was instant
* added a link to a user-written guide for running Hydrus on Windows in Anaconda to the 'installing' help
* fixed some old/invalid dialog locations in the 'how to build a downloader' help
### client api
* a new `/get_files/file_hashes` command lets you look up any of the sha256, md5, sha1, sha512 hashes that hydrus knows about using any of the other hashes. if you have a bunch of md5 and want to figure out if you have them, or if you want to get the md5s of your files and run them against an external check, this is now possible
* added help and unit tests for this new command
* added a service enum to the `/get_services` Client API help
* client api version is now 37
* as a side thing, I rejiggered the 'what non-sha256 hash do these sha256 hashes have?' test here. it now returns a mapping, allowing for more efficient mass lookups, and it no longer creates new sha256 records for novel hashes. feel free to spam this on new sha256 hashes if you like
### interesting serverside
* the tag repository now manages a tag filter. admins with 'modify options' permission can alter it under the new menu command _services->administrate services->tag repo->edit tag filter_.
* any time new tags are pended to the tag repository, they are now washed through the tag filter. any that don't pass are silently discarded
* normal users will regularly fetch the tag filter as long as their client is relatively new. they can review it under a new read-only Tag Filter panel from _review services_. if their client is super old (or the server), account sync and the UI should fail gracefully
* if you are in advanced mode and your client account-syncs and discovers the tag filter has changed, it will make a popup with a summary of the changes. I am not sure how spammy/annoying this will be, so let me know if you'd rather turn them off or auto-hide after two hours or something
* future updates will have more feedback in the _manage tags_ dialog and similar, just to let you know there and then if an entered tag is not wanted. also, admins who change the tag filter will be able to retroactively remove tags that apply to the filter, not just stop new ones. I'd also like some sibling hard-replace to go along with this, so we don't accidentally remove tags that are otherwise sibling'd to be good--we'll see
* the hydrus server won't bug out so much at unusual errors now. previously, I ingrained that any error during any request would kick off automatic delays, but I have rejiggered it a bit so this mostly just happens during automatic work like update downloading
### boring serverside
* added get/set and similar to the tag repo's until-now-untouched tag filter
* wrote a nice helper method that splays two tag filters into their added/changed/deleted rules and another that can present that in human-readable format. it prints to the server log whenever a human changes the tag filter, and will be used in future retroactive syncing
* cleaned up how the service options are delivered to the client. previously, there would have been a version desync pain if I had ever updated the tag filter internal version. now, the service options delivered to the client are limited to python primitives, atm just update period and nullification period, and tag filter and other complex objects will have their own get calls and fail in quiet isolation
* I fixed some borked nullification period initialisation serverside
* whenever a tag filter describes itself, if either the blacklist or whitelist has more than 12 rules, it now summarises rather than listing every single one
## [Version 507](https://github.com/hydrusnetwork/hydrus/releases/tag/v507)
### misc
@ -442,35 +482,3 @@ title: Changelog
* fixed a weird menu creation bug involving a QStandardItem appearing in the menu actions
* fixed a similar weird QStandardItem bug in the media viewer canvas code
* fixed an error that could appear on force-emptied pages that receive sort signals
## [Version 498](https://github.com/hydrusnetwork/hydrus/releases/tag/v498)
_almost all the changes this week are only important to server admins and janitors. regular users can skip updating this week_
## overview
* the server has important database and network updates this week. if your server has a lot of content, it has to count it all up, so it will take a short while to update. the petition protocol has also changed, so older clients will not be able to fetch new servers' petitions without an error. I think newer clients will be able to fetch older servers' ones, but it may be iffy
* I considered whether I should update the network protocol version number, which would (politely) force all users to update, but as this causes inconvenience every time I do it, and I expect to do more incremental updates here in coming weeks, and since this only affects admins and janitors, I decided to not. we are going to be in awkward flux for a little bit, so please make sure you update privileged clients and servers at roughly the same time
## server petition workflow
* the server now maintains an ongoing fast count of its various repository metadata, such as 'number of mappings' and 'number of petitions of type x'. when you fetch petition counts, no longer will it count live and max out at 1,000, it'll give you good full numbers every time, and real fast
* you can see the current numbers from the new 'service info' button on review services, which only appears in advanced mode. any user with an account key can see these numbers, which include number of petitions in the queue. I can make this more private if you like, but for now I think it is good if advanced users can see them all
* in the petition processing page, sibling and parent petitions will now include both delete and add rows if the account and reason are the same. I'm aiming to get better 'full' coverage of a replace petition, so you can see and approve/deny both the add and the remove parts in one go. for fetching, these combined petitions count as 'delete' petitions, and won't appear in the 'add' petition queue
* when users encounter an automatic conflict resolution in the manage siblings dialog, those auto-petitioned pairs are now assigned the same reason as the original conflicting pended pairs. they _should_ show up together in the new petition processing UI
* as part of this, sibling and parent petitions are no longer filtered by namespace. you will see everything with that same account and reason in one go. let's try it out, and if it is too much, I will add filters clientside or something. since we are now starting to see add and remove together, we'll want to at least have the option to see everything
## boring server stuff
* the petition object is updated to handle multiple actions per petition, and the clientside petition UI is updated appropriately
* the server tracks 'actionable' petition counts as separate to the number of raw petition rows. some of this was happening before, but the logic is improved, including clever counting of the new petitions that include both add and delete rows
* for when my count-update logic inevitably fails, there is now a 'regen service info' entry in the 'administrate services' menu for all repositories. numbers generated will be printed to server log
* some unusual repo upload logic is cleaned up, e.g. if a user with 'create permission' uploads a sibling or parent, any pending rows for that content will now be properly cleared
* fixed a stupid swap logical bug where janitors who could only moderate siblings (and not parents) were only being given parent numbers and vice versa
* all server services now respond to /busy check. it requires no authentication and just returns 1 or 0 depending on the current lock state
* fixed a bug where tag siblings or parents that were denied would still make a new definition record for the child/bad tag
* with all the fine number changes, fleshed out the server unit tests with more examples of submitting and altering content and then checking for numbers afterwards. now checked are: file add, file admin delete, mapping add, mapping admin delete, mapping petition, mapping petition approve+deny, parent add, parent admin delete, parent pend, parent pend approve+deny, parent petition, parent petition approve+deny
* significant refactoring of the tail end of server content update pipeline. more things now go through logic-harmonised update methods that ensure count is reliable
* did some misc server db and constant enum code cleanup
## misc
* to match the new change in the server, in the client, tag and rating services now store their 'num_files' service info count as the new 'num_file_hashes'. existing numbers will be converted over during update
* fixed a probably ten year old bug where 'num pending/petitioned files' had the same enum as 'num pending/petitioned mappings'. never noticed, since no service has done both those things
* if the upload pending process fails due to an unusual permission error or similar, the pending menu should now recover and update itself (previously it stayed greyed out)


@ -314,6 +314,27 @@ Response:
Now that I state `type` and `type_pretty` here, I may rearrange this call, probably to make the `service_key` the Object key, rather than the arbitrary 'all_known_tags' strings.
You won't see all these, and you'll only ever need some, but `type` is:
* 0 - tag repository
* 1 - file repository
* 2 - a local file domain like 'my files'
* 5 - a local tag domain like 'my tags'
* 6 - a 'numerical' rating service with several stars
* 7 - a 'like/dislike' rating service with on/off status
* 10 - all known tags -- a union of all the tag services
* 11 - all known files -- a union of all the file services and files that appear in tag services
* 12 - the local booru -- you can ignore this
* 13 - IPFS
* 14 - trash
* 15 - all local files -- all files on hard disk ('all my files' + updates + trash)
* 17 - file notes
* 18 - Client API
* 19 - all deleted files -- you can ignore this
* 20 - local updates -- a file domain to store repository update files in
* 21 - all my files -- union of all local file domains
* 99 - server administration
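As a rough illustration of working with these numbers, here is a sketch that filters a `/get_services`-style response by numeric type. The exact response shape shown here (service-group keys mapping to lists of service Objects with a `type` field) is an assumption based on the enum list above; check the live Client API help for the structure your client actually returns.

```python
# Hypothetical sketch: grouping /get_services entries by their numeric 'type'.
# The example response shape is an assumption, not the guaranteed API format.
SERVICE_TYPE_NAMES = {
    0: 'tag repository', 1: 'file repository', 2: 'local file domain',
    5: 'local tag domain', 6: 'numerical rating service',
    7: 'like/dislike rating service', 10: 'all known tags',
    11: 'all known files', 12: 'local booru', 13: 'IPFS', 14: 'trash',
    15: 'all local files', 17: 'file notes', 18: 'Client API',
    19: 'all deleted files', 20: 'local updates', 21: 'all my files',
    99: 'server administration',
}

def services_of_type(get_services_response: dict, service_type: int) -> list:
    """Collect every service entry whose 'type' matches the given number."""
    matches = []
    for service_list in get_services_response.values():
        for service in service_list:
            if service.get('type') == service_type:
                matches.append(service)
    return matches

# trimmed example response, keys as assumptions
example = {
    'local_tags': [{'name': 'my tags', 'type': 5, 'service_key': '6c6f63616c2074616773'}],
    'all_known_tags': [{'name': 'all known tags', 'type': 10, 'service_key': '616c6c206b6e6f776e2074616773'}],
}
print(services_of_type(example, 5))
```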
## Adding Files
@ -925,7 +946,7 @@ Arguments (in percent-encoded JSON):
* `hash`: (selective, an SHA256 hash for the file in 64 characters of hexadecimal)
* `file_id`: (selective, the integer numerical identifier for the file)
Existing notes will be overwritten.
Existing notes will be overwritten. If the file has extra notes you do not specify, they are untouched.
```json title="Example request body"
{
"notes" : {
@ -1477,6 +1498,38 @@ Response:
This search does **not** apply the implicit limit that most clients set to all searches (usually 10,000), so if you do system:everything on a client with millions of files, expect to get boshed. Even with a system:limit included, complicated queries with large result sets may take several seconds to respond. Just like the client itself.
### **GET `/get_files/file_hashes`** { id="get_files_file_hashes" }
_Lookup file hashes from other hashes._
Restricted access:
: YES. Search for Files permission needed.
Required Headers: n/a
Arguments (in percent-encoded JSON):
:
* `hash`: (selective, a hexadecimal hash)
* `hashes`: (selective, a list of hexadecimal hashes)
* `source_hash_type`: [sha256|md5|sha1|sha512] (optional, defaulting to sha256)
* `desired_hash_type`: [sha256|md5|sha1|sha512]
If you have some MD5 hashes and want to see what their SHA256 are, or _vice versa_, this is the place. Hydrus records the non-SHA256 hashes for every file it has ever imported. This data is not removed on file deletion.
``` title="Example request"
/get_files/file_hashes?hash=ec5c5a4d7da4be154597e283f0b6663c&source_hash_type=md5&desired_hash_type=sha256
```
Response:
: A mapping Object of the successful lookups. Where no matching hash is found, no entry will be made (therefore, if none of your source hashes have matches on the client, this will return an empty `hashes` Object).
```json title="Example response"
{
"hashes" : {
"ec5c5a4d7da4be154597e283f0b6663c" : "2a0174970defa6f147f2eabba829c5b05aba1f1aea8b978611a07b7bb9cf9399"
}
}
```
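To make the request shape concrete, here is a hedged sketch of building the lookup URL and reading the response mapping in Python. The percent-encoded-JSON handling of list arguments follows the general Client API conventions described earlier in this document; the base address and port are the usual defaults, but treat both as assumptions and verify against your client.

```python
# Sketch: build a /get_files/file_hashes URL and read the response mapping.
# The encoding of list arguments as percent-encoded JSON is per the Client API
# conventions; confirm details against your client's API version.
import json
import urllib.parse

def build_file_hashes_url(api_base, hashes, source_hash_type='md5', desired_hash_type='sha256'):
    params = {
        'hashes': json.dumps(hashes),  # list arguments go as percent-encoded JSON
        'source_hash_type': source_hash_type,
        'desired_hash_type': desired_hash_type,
    }
    return api_base + '/get_files/file_hashes?' + urllib.parse.urlencode(params)

def map_hashes(response: dict) -> dict:
    # a source hash with no entry simply means the client has never seen it
    return response.get('hashes', {})

url = build_file_hashes_url('http://127.0.0.1:45869', ['ec5c5a4d7da4be154597e283f0b6663c'])
print(url)
print(map_hashes({'hashes': {'ec5c5a4d7da4be154597e283f0b6663c': '2a01...'}}))
```

Because missing entries are silently omitted, always check the returned mapping rather than assuming every source hash you sent came back.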
### **GET `/get_files/file_metadata`** { id="get_files_file_metadata" }
_Get metadata about files in the client._


@ -6,7 +6,7 @@ title: Putting It All Together
Now you know what GUGs, URL Classes, and Parsers are, you should have some ideas of how URL Classes could steer what happens when the downloader is faced with an URL to process. Should a URL be imported as a media file, or should it be parsed? If so, how?
You may have noticed in the Edit GUG ui that it lists if a current URL Class matches the example URL output. If the GUG has no matching URL Class, it won't be listed in the main 'gallery selector' button's list--it'll be relegated to the 'non-functioning' page. Without a URL Class, the client doesn't know what to do with the output of that GUG. But if a URL Class does match, we can then hand the result over to a parser set at _network->downloader definitions->manage url class links_:
You may have noticed in the Edit GUG ui that it lists if a current URL Class matches the example URL output. If the GUG has no matching URL Class, it won't be listed in the main 'gallery selector' button's list--it'll be relegated to the 'non-functioning' page. Without a URL Class, the client doesn't know what to do with the output of that GUG. But if a URL Class does match, we can then hand the result over to a parser set at _network->downloader components->manage url class links_:
![](images/downloader_completion_url_links.png)


@ -26,7 +26,7 @@ These are all the 'first page' of the results if you type or click-through to th
## actually doing it { id="doing_it" }
Although it is usually a fairly simple process of just substituting the inputted tags into a string template, there are a couple of extra things to think about. Let's look at the ui under _network->downloader definitions->manage gugs_:
Although it is usually a fairly simple process of just substituting the inputted tags into a string template, there are a couple of extra things to think about. Let's look at the ui under _network->downloader components->manage gugs_:
![](images/downloader_edit_gug_panel.png)


@ -6,7 +6,7 @@ title: Parsers
In hydrus, a parser is an object that takes a single block of HTML or JSON data and returns many kinds of hydrus-level metadata.
Parsers are flexible and potentially quite complicated. You might like to open _network->manage parsers_ and explore the UI as you read these pages. Check out how the default parsers already in the client work, and if you want to write a new one, see if there is something already in there that is similar--it is usually easier to duplicate an existing parser and then alter it than to create a new one from scratch every time.
Parsers are flexible and potentially quite complicated. You might like to open _network->downloader components->manage parsers_ and explore the UI as you read these pages. Check out how the default parsers already in the client work, and if you want to write a new one, see if there is something already in there that is similar--it is usually easier to duplicate an existing parser and then alter it than to create a new one from scratch every time.
There are three main components in the parsing system (click to open each component's help page):


@ -10,7 +10,7 @@ Some sites offer API calls for their pages. Depending on complexity and quality
We convert the original Post URL, [https://www.artstation.com/artwork/mQLe1](https://www.artstation.com/artwork/mQLe1) to [https://www.artstation.com/projects/mQLe1.json](https://www.artstation.com/projects/mQLe1.json). Note that Artstation Post URLs can produce multiple files, and that the API url should not be associated with those final files.
So, when the client encounters an 'artstation file page' URL, it will generate the equivalent 'artstation file page json api' URL and use that for downloading and parsing. If you would like to review your API links, check out _network->downloader definitions->manage url class links->api links_. Using Example URLs, it will figure out which URL Classes link to others and ensure you are mapping parsers only to the final link in the chain--there should be several already in there by default.
So, when the client encounters an 'artstation file page' URL, it will generate the equivalent 'artstation file page json api' URL and use that for downloading and parsing. If you would like to review your API links, check out _network->downloader components->manage url class links->api links_. Using Example URLs, it will figure out which URL Classes link to others and ensure you are mapping parsers only to the final link in the chain--there should be several already in there by default.
Now let's look at the JSON. Loading clean JSON in a browser should present you with a nicer view:


@ -8,7 +8,7 @@ If you are working with users who also understand the downloader system, you can
But if you want to share conveniently, and with users who are not familiar with the different downloader objects, you can package everything into a single easy-import png as per [here](adding_new_downloaders.md).
The dialog to use is _network->downloader definitions->export downloaders_:
The dialog to use is _network->downloader components->export downloaders_:
![](images/downloader_export_panel.png)


@ -40,9 +40,9 @@ As far as we are concerned, a URL string has four parts:
* **Scheme:** `http` or `https`
* **Location/Domain:** `safebooru.org` or `i.4cdn.org` or `cdn002.somebooru.net`
* **Path Components:** `index.php` or `tesla/res/7518.json` or `pictures/user/daruak/page/2` or `art/Commission-animation-Elsa-and-Anna-541820782`
* **Query Parameters:** `page=post&s=list&tags=yorha_no._2_type_b&pid=40` or `page=post&s=view&id=2429668`
* **Parameters:** `page=post&s=list&tags=yorha_no._2_type_b&pid=40` or `page=post&s=view&id=2429668`
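The four-part decomposition above can be illustrated with Python's standard `urllib`, which is just a convenient way to see the pieces; hydrus has its own internal URL handling.

```python
# Illustrative only: splitting a URL into the four parts named above.
from urllib.parse import urlparse, parse_qsl

url = 'https://safebooru.org/index.php?page=post&s=view&id=2429668'
parsed = urlparse(url)

print(parsed.scheme)                  # scheme: 'https'
print(parsed.netloc)                  # location/domain: 'safebooru.org'
print(parsed.path.lstrip('/'))        # path components: 'index.php'
print(dict(parse_qsl(parsed.query)))  # parameters: page/s/id key-value pairs
```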
So, let's look at the 'edit url class' panel, which is found under _network->manage url classes_:
So, let's look at the 'edit url class' panel, which is found under _network->downloader components->manage url classes_:
![](images/downloader_edit_url_class_panel.png)
@ -79,9 +79,9 @@ Path Components
:
TBIB just uses a single "index.php" on the root directory, so the path is not complicated. Were it longer (like "gallery/cgi/index.php"), we would add more components ("gallery" and "cgi"), and since the path of a URL has a strict order, we would need to arrange the items in the listbox there so they were sorted correctly.
Query Parameters
Parameters
:
TBIB's index.php takes many query parameters to render different page types. Note that the Post URL uses "s=view", while TBIB Gallery URLs use "s=list". In any case, for a Post URL, "id", "page", and "s" are necessary and sufficient.
TBIB's index.php takes many parameters to render different page types. Note that the Post URL uses "s=view", while TBIB Gallery URLs use "s=list". In any case, for a Post URL, "id", "page", and "s" are necessary and sufficient.
## string matches { id="string_matches" }
@ -96,7 +96,7 @@ Don't go overboard with this stuff, though--most sites do not have super-fine di
## how do they match, exactly? { id="match_details" }
This URL Class will be assigned to any URL that matches the location, path, and query. Missing path compontent or query parameters in the URL will invalidate the match but additonal ones will not!
This URL Class will be assigned to any URL that matches the location, path, and query. Missing path components or parameters in the URL will invalidate the match, but additional ones will not!
For instance, given:
@ -125,7 +125,7 @@ And:
Both URL A and B will match, URL C will not
If multiple URL Classes match a URL, the client will try to assign the most 'complicated' one, with the most path components and then query parameters.
If multiple URL Classes match a URL, the client will try to assign the most 'complicated' one, with the most path components and then parameters.
Given two example URLs and URL Classes:
@ -160,7 +160,7 @@ Since we are in the business of storing and comparing URLs, we want to 'normalis
Note that in e621's case (and for many other sites!), that text after the id is purely decoration. It can change when the file's tags change, so if we want to compare today's URLs with those we saw a month ago, we'd rather just be without it.
On normalisation, all URLs will get the preferred http/https switch, and their query parameters will be alphabetised. File and Post URLs will also cull out any surplus path or query components. This wouldn't affect our TBIB example above, but it will clip the e621 example down to that 'bare' id URL, and it will take any surplus 'lang=en' or 'browser=netscape_24.11' garbage off the query text as well. URLs that are not associated and saved and compared (i.e. normal Gallery and Watchable URLs) are not culled of unmatched path components or query parameters, which can sometimes be useful if you want to match (and keep intact) gallery URLs that might or might not include an important 'sort=desc' type of parameter.
On normalisation, all URLs will get the preferred http/https switch, and their parameters will be alphabetised. File and Post URLs will also cull out any surplus path or query components. This wouldn't affect our TBIB example above, but it will clip the e621 example down to that 'bare' id URL, and it will take any surplus 'lang=en' or 'browser=netscape_24.11' garbage off the query text as well. URLs that are not associated and saved and compared (i.e. normal Gallery and Watchable URLs) are not culled of unmatched path components or query parameters, which can sometimes be useful if you want to match (and keep intact) gallery URLs that might or might not include an important 'sort=desc' type of parameter.
Since File and Post URLs will do this culling, be careful that you not leave out anything important in your rules. Make sure what you have is both necessary (nothing can be removed and still keep it valid) and sufficient (no more needs to be added to make it valid). It is a good idea to try pasting the 'normalised' version of the example URL into your browser, just to check it still works.
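The normalisation described above can be sketched roughly like this: alphabetise the parameters and, for File/Post URLs, cull any parameter the URL Class does not define. This is an illustration under those stated rules, not hydrus's actual implementation.

```python
# Hedged sketch of File/Post URL normalisation: preferred scheme, alphabetised
# parameters, surplus parameters culled. Not hydrus's real code.
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

def normalise(url, expected_params, prefer_scheme='https'):
    parsed = urlparse(url)
    # keep only the parameters the URL Class defines, then alphabetise
    kept = sorted((k, v) for (k, v) in parse_qsl(parsed.query) if k in expected_params)
    return urlunparse((prefer_scheme, parsed.netloc, parsed.path, '', urlencode(kept), ''))

# the surplus 'lang=en' garbage is culled and the rest is alphabetised
print(normalise(
    'http://tbib.org/index.php?s=view&page=post&id=6391256&lang=en',
    {'page', 's', 'id'},
))
```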
@ -184,11 +184,11 @@ What happened to 'page=1' and '/page/1'? Adding those '1' values in works fine!
![](images/downloader_edit_url_class_panel_default.png)
After you set a path component or query parameter String Match, you will be asked for an optional 'default' value. You won't want to set one most of the time, but for Gallery URLs, it can be hugely useful--see how the normalisation process automatically fills in the missing path component with the default! There are plenty of examples in the default Gallery URLs of this, so check them out. Most sites use page indices starting at '1', but Gelbooru-style imageboards use 'pid=0' file index (and often move forward 42, so the next pages will be 'pid=42', 'pid=84', and so on, although others use deltas of 20 or 40).
After you set a path component or parameter String Match, you will be asked for an optional 'default' value. You won't want to set one most of the time, but for Gallery URLs, it can be hugely useful--see how the normalisation process automatically fills in the missing path component with the default! There are plenty of examples in the default Gallery URLs of this, so check them out. Most sites use page indices starting at '1', but Gelbooru-style imageboards use 'pid=0' file index (and often move forward 42, so the next pages will be 'pid=42', 'pid=84', and so on, although others use deltas of 20 or 40).
## can we predict the next gallery page? { id="next_gallery_page_prediction" }
Now we can harmonise gallery urls to a single format, we can predict the next gallery page! If, say, the third path component or 'page' query parameter is always a number referring to page, you can select this under the 'next gallery page' section and set the delta to change it by. The 'next gallery page url' section will be automatically filled in. This value will be consulted if the parser cannot find a 'next gallery page url' from the page content.
Now we can harmonise gallery urls to a single format, we can predict the next gallery page! If, say, the third path component or 'page' parameter is always a number referring to page, you can select this under the 'next gallery page' section and set the delta to change it by. The 'next gallery page url' section will be automatically filled in. This value will be consulted if the parser cannot find a 'next gallery page url' from the page content.
It is neat to set this up, but I only recommend it if you actually cannot reliably parse a next gallery page url from the HTML later in the process. It is neater to have searches stop naturally because the parser said 'no more gallery pages' than to have hydrus always run one page beyond and end every single search on an uglier 'No results found' or 404 result.
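The prediction itself is simple arithmetic: find the page parameter (or path component) and bump it by the configured delta. A minimal sketch of the parameter case, as an illustration rather than hydrus's real code:

```python
# Sketch of 'next gallery page' prediction by parameter delta. The parameter
# name and delta here match the Gelbooru-style 'pid' example above.
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

def next_gallery_page(url, page_param='pid', delta=42):
    parsed = urlparse(url)
    params = dict(parse_qsl(parsed.query))
    params[page_param] = str(int(params.get(page_param, '0')) + delta)
    return urlunparse(parsed._replace(query=urlencode(params)))

# Gelbooru-style file indices move forward 42 per page
print(next_gallery_page('https://gelbooru.com/index.php?page=post&s=list&tags=skirt&pid=42'))
```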


@ -107,7 +107,7 @@ It is important to note that while subscriptions can have multiple queries (even
### Setting up subscriptions
Here's the dialog, which is under _network->downloaders->manage subscriptions_:
Here's the dialog, which is under _network->manage subscriptions_:
![](images/subscriptions_edit_subscriptions.png)
@ -156,7 +156,7 @@ Again: the real problem with downloading is not finding new things, it is keepin
## Logins
The client now supports a flexible (but slightly prototype and ugly) login system. It can handle simple sites and is as [completely user-customisable as the downloader system](downloader_login.md). The client starts with multiple login scripts by default, which you can review under _network->downloaders->manage logins_:
The client now supports a flexible (but slightly prototype and ugly) login system. It can handle simple sites and is as [completely user-customisable as the downloader system](downloader_login.md). The client starts with multiple login scripts by default, which you can review under _network->logins->manage logins_:
![](images/manage_logins.png)


@ -28,6 +28,7 @@ I try to release a new version every Wednesday by 8pm EST and write an accompany
* [Chocolatey](https://community.chocolatey.org/packages/hydrus-network)
* [Scoop](https://github.com/ScoopInstaller/Scoop) (`hydrus-network` in the 'Extras' bucket)
* Winget. The command is `winget install --id=HydrusNetwork.HydrusNetwork -e --location "\PATH\TO\INSTALL\HERE"`, which can, if you know what you are doing, be `winget install --id=HydrusNetwork.HydrusNetwork -e --location ".\"`, maybe rolled into a batch file.
* [User guide for Anaconda](https://gist.github.com/jufogu/b78509695c6c65cdb2866a56fb14a820)
=== "macOS"


@ -6,11 +6,11 @@ title: Introduction and Statement of Principles
## this help { id="this_help" }
Click the links on the left to go through the getting started guide. Please at least skim every page in turn, as it will introduce you to the main systems in the client. There is a lot, so you do not have to do it all in one go.
Click the links on the left to go through the getting started guide. Subheadings are on the right. Larger sections are up top. Please at least skim every page in the getting started section, as this will introduce you to the main systems in the client. There is a lot, so you do not have to do it all in one go.
The section on installing, updating, and **backing up** is very important.
This help is available locally in every release. Open `install_dir/help/index.html`.
This help is available locally in every release. Hit `help->help and getting started guide` in the client, or open `install_dir/help/index.html`.
## on having too many files { id="files" }

File diff suppressed because it is too large


@ -43,7 +43,7 @@ So, please let me know:
* The type of hard drive you are running hydrus from. (e.g. "A 2TB 7200rpm drive that is 20% full. I regularly defrag it.")
* Any _profiles_ you have collected.
You can generate a profile by hitting _help->debug->profile mode_, which tells the client to generate profile information for almost all of its behind the scenes jobs. This can be spammy, so don't leave it on for a very long time (you can turn it off by hitting the help menu entry again).
You can generate a profile by hitting _help->debug->profiling->profile mode_, which tells the client to generate profile information for almost all of its behind the scenes jobs. This can be spammy, so don't leave it on for a very long time (you can turn it off by hitting the help menu entry again).
Turn on profile mode, do the thing that runs slow for you (importing a file, fetching some tags, whatever), and then check your database folder (most likely _install_dir/db_) for a new 'client profile - DATE.log' file. This file will be filled with several sets of tables with timing information. Please send that whole file to me, or if it is too large, cut what seems important. It should not contain any personal information, but feel free to look through it.

View File

@ -148,6 +148,7 @@ SIMPLE_ZOOM_MAX = 140
SIMPLE_ZOOM_CANVAS = 141
SIMPLE_ZOOM_100 = 142
SIMPLE_ZOOM_DEFAULT = 143
SIMPLE_SHOW_DUPLICATES = 144
simple_enum_to_str_lookup = {
SIMPLE_ARCHIVE_DELETE_FILTER_BACK : 'archive/delete filter: back',
@ -269,6 +270,7 @@ simple_enum_to_str_lookup = {
SIMPLE_ZOOM_IN_VIEWER_CENTER : 'zoom: in with forced media viewer center',
SIMPLE_ZOOM_OUT_VIEWER_CENTER : 'zoom: out with forced media viewer center',
SIMPLE_SWITCH_BETWEEN_100_PERCENT_AND_CANVAS_ZOOM_VIEWER_CENTER : 'zoom: switch 100% and canvas fit with forced media viewer center',
SIMPLE_SHOW_DUPLICATES : 'file relationships: show',
SIMPLE_DUPLICATE_MEDIA_DISSOLVE_FOCUSED_ALTERNATE_GROUP : 'file relationships: dissolve focused file alternate group',
SIMPLE_DUPLICATE_MEDIA_DISSOLVE_ALTERNATE_GROUP : 'file relationships: dissolve alternate groups',
SIMPLE_DUPLICATE_MEDIA_DISSOLVE_FOCUSED_DUPLICATE_GROUP : 'file relationships: dissolve focused file duplicate group',
@ -645,7 +647,13 @@ class ApplicationCommand( HydrusSerialisable.SerialisableBase ):
s = simple_enum_to_str_lookup[ action ]
if action == SIMPLE_MEDIA_SEEK_DELTA:
if action == SIMPLE_SHOW_DUPLICATES:
duplicate_type = self.GetSimpleData()
s = '{} {}'.format( s, HC.duplicate_type_string_lookup[ duplicate_type ] )
elif action == SIMPLE_MEDIA_SEEK_DELTA:
( direction, ms ) = self.GetSimpleData()

View File

@ -181,7 +181,7 @@ regen_file_enum_to_overruled_jobs = {
ALL_REGEN_JOBS_IN_PREFERRED_ORDER = [ REGENERATE_FILE_DATA_JOB_FILE_INTEGRITY_PRESENCE_TRY_URL_ELSE_REMOVE_RECORD, REGENERATE_FILE_DATA_JOB_FILE_INTEGRITY_PRESENCE_TRY_URL, REGENERATE_FILE_DATA_JOB_FILE_INTEGRITY_DATA_TRY_URL_ELSE_REMOVE_RECORD, REGENERATE_FILE_DATA_JOB_FILE_INTEGRITY_DATA_TRY_URL, REGENERATE_FILE_DATA_JOB_FILE_INTEGRITY_PRESENCE_REMOVE_RECORD, REGENERATE_FILE_DATA_JOB_FILE_INTEGRITY_PRESENCE_DELETE_RECORD, REGENERATE_FILE_DATA_JOB_FILE_INTEGRITY_DATA_REMOVE_RECORD, REGENERATE_FILE_DATA_JOB_FILE_INTEGRITY_DATA_SILENT_DELETE, REGENERATE_FILE_DATA_JOB_FILE_INTEGRITY_PRESENCE_LOG_ONLY, REGENERATE_FILE_DATA_JOB_FILE_METADATA, REGENERATE_FILE_DATA_JOB_REFIT_THUMBNAIL, REGENERATE_FILE_DATA_JOB_FORCE_THUMBNAIL, REGENERATE_FILE_DATA_JOB_SIMILAR_FILES_METADATA, REGENERATE_FILE_DATA_JOB_CHECK_SIMILAR_FILES_MEMBERSHIP, REGENERATE_FILE_DATA_JOB_FIX_PERMISSIONS, REGENERATE_FILE_DATA_JOB_FILE_MODIFIED_TIMESTAMP, REGENERATE_FILE_DATA_JOB_OTHER_HASHES, REGENERATE_FILE_DATA_JOB_FILE_HAS_EXIF, REGENERATE_FILE_DATA_JOB_FILE_HAS_HUMAN_READABLE_EMBEDDED_METADATA, REGENERATE_FILE_DATA_JOB_FILE_HAS_ICC_PROFILE, REGENERATE_FILE_DATA_JOB_PIXEL_HASH, REGENERATE_FILE_DATA_JOB_DELETE_NEIGHBOUR_DUPES ]
def GetAllFilePaths( raw_paths, do_human_sort = True ):
def GetAllFilePaths( raw_paths, do_human_sort = True, clear_out_sidecars = True ):
file_paths = []
@ -230,6 +230,23 @@ def GetAllFilePaths( raw_paths, do_human_sort = True ):
HydrusData.HumanTextSort( file_paths )
if clear_out_sidecars:
exts = [ '.txt', '.json', '.xml' ]
def not_a_sidecar( p ):
if True in ( p.endswith( ext ) for ext in exts ):
return False
return True
file_paths = [ path for path in file_paths if not_a_sidecar( path ) ]
return file_paths
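The sidecar-skipping behaviour in this hunk can be sketched in isolation. The helper names and sample paths below are illustrative, not hydrus's actual code (hydrus's check is also case-sensitive, where this sketch lowercases first):

```python
# Skip common sidecar suffixes when scanning a directory for importable files.
SIDECAR_EXTS = ('.txt', '.json', '.xml')

def is_sidecar(path: str) -> bool:
    # str.endswith accepts a tuple of suffixes, which reads tighter than a loop
    return path.lower().endswith(SIDECAR_EXTS)

def clear_out_sidecars(file_paths):
    return [p for p in file_paths if not is_sidecar(p)]

paths = ['a.jpg', 'a.jpg.txt', 'b.png', 'b.png.json', 'notes.xml']
print(clear_out_sidecars(paths))  # ['a.jpg', 'b.png']
```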
class ClientFilesManager( object ):

View File

@ -459,14 +459,14 @@ class MigrationSourceHTA( MigrationSource ):
for ( hash, tags ) in data:
result = self._controller.Read( 'file_hashes', ( hash, ), source_hash_type, desired_hash_type )
source_to_desired = self._controller.Read( 'file_hashes', ( hash, ), source_hash_type, desired_hash_type )
if len( result ) == 0:
if len( source_to_desired ) == 0:
continue
desired_hash = result[0]
desired_hash = list( source_to_desired.values() )[0]
fixed_data.append( ( desired_hash, tags ) )

View File

@ -2949,7 +2949,9 @@ class ParseRootFileLookup( HydrusSerialisable.SerialisableBaseNamed ):
try:
( other_hash, ) = HG.client_controller.Read( 'file_hashes', ( sha256_hash, ), 'sha256', hash_type )
source_to_desired = HG.client_controller.Read( 'file_hashes', ( sha256_hash, ), 'sha256', hash_type )
other_hash = list( source_to_desired.values() )[0]
return other_hash

View File

@ -15,6 +15,7 @@ from hydrus.core import HydrusExceptions
from hydrus.core import HydrusGlobals as HG
from hydrus.core import HydrusPaths
from hydrus.core import HydrusSerialisable
from hydrus.core import HydrusTags
from hydrus.core.networking import HydrusNATPunch
from hydrus.core.networking import HydrusNetwork
from hydrus.core.networking import HydrusNetworkVariableHandling
@ -1053,7 +1054,12 @@ class ServiceRestricted( ServiceRemote ):
def _SetNewServiceOptions( self, service_options ):
self._service_options = service_options
self._service_options.update( service_options )
def _SetNewTagFilter( self, tag_filter: HydrusTags.TagFilter ):
self._service_options[ 'tag_filter' ] = tag_filter
def CanSyncAccount( self, including_external_communication = True ):
@ -1292,6 +1298,10 @@ class ServiceRestricted( ServiceRemote ):
with self._lock:
if isinstance( e, HydrusExceptions.NotFoundException ):
self._DelayFutureRequests( 'got an unexpected 404', SHORT_DELAY_PERIOD )
if isinstance( e, HydrusExceptions.ServerBusyException ):
self._DelayFutureRequests( 'server was busy', 5 * 60 )
@ -1308,15 +1318,11 @@ class ServiceRestricted( ServiceRemote ):
self._DealWithFundamentalNetworkError()
elif isinstance( e, HydrusExceptions.NotFoundException ):
self._DelayFutureRequests( 'got an unexpected 404', SHORT_DELAY_PERIOD )
elif isinstance( e, HydrusExceptions.BandwidthException ):
self._DelayFutureRequests( 'service has exceeded bandwidth', ACCOUNT_SYNC_PERIOD )
else:
elif isinstance( e, HydrusExceptions.ServerException ):
self._DelayFutureRequests( str( e ) )
@ -1421,6 +1427,46 @@ class ServiceRestricted( ServiceRemote ):
pass
if self._service_type == HC.TAG_REPOSITORY:
try:
tag_filter_response = self.Request( HC.GET, 'tag_filter' )
with self._lock:
tag_filter = tag_filter_response[ 'tag_filter' ]
if 'tag_filter' in self._service_options and HG.client_controller.new_options.GetBoolean( 'advanced_mode' ):
old_tag_filter = self._service_options[ 'tag_filter' ]
if old_tag_filter != tag_filter:
try:
summary = tag_filter.GetChangesSummaryText( old_tag_filter )
message = 'The tag filter for "{}" just changed! Changes are:{}{}'.format( self._name, os.linesep * 2, summary )
HydrusData.ShowText( message )
except:
pass
self._SetNewTagFilter( tag_filter )
except Exception: # any exception, screw it
pass
except ( HydrusExceptions.CancelledException, HydrusExceptions.NetworkException ) as e:
HydrusData.Print( 'Failed to refresh account for {}:'.format( name ) )
@ -2342,6 +2388,33 @@ class ServiceRepository( ServiceRestricted ):
def GetTagFilter( self ) -> HydrusTags.TagFilter:
with self._lock:
if self._service_type != HC.TAG_REPOSITORY:
raise Exception( 'This is not a tag repository! It does not have a tag filter!' )
if 'tag_filter' in self._service_options:
tag_filter = self._service_options[ 'tag_filter' ]
if not isinstance( tag_filter, HydrusTags.TagFilter ):
raise HydrusExceptions.DataMissing( 'This service has a bad tag filter! Try refreshing your account!' )
return tag_filter
else:
raise HydrusExceptions.DataMissing( 'This service does not seem to have a tag filter! Try refreshing your account!' )
def GetUpdateHashes( self ):
with self._lock:
@ -2537,6 +2610,19 @@ class ServiceRepository( ServiceRestricted ):
HG.client_controller.Write( 'reset_repository', self )
def SetTagFilter( self, tag_filter: HydrusTags.TagFilter ):
with self._lock:
if self._service_type != HC.TAG_REPOSITORY:
raise Exception( 'This is not a tag repository! It does not have a tag filter!' )
self._service_options[ 'tag_filter' ] = tag_filter
def SyncRemote( self, stop_time = None ):
with self._sync_remote_lock:

View File

@ -3347,7 +3347,9 @@ class DB( HydrusDB.HydrusDB ):
else:
matching_sha256_hashes = self.modules_hashes.GetFileHashes( search_hashes, search_hash_type, 'sha256' )
source_to_desired = self.modules_hashes.GetFileHashes( search_hashes, search_hash_type, 'sha256' )
matching_sha256_hashes = list( source_to_desired.values() )
specific_hash_ids = self.modules_hashes_local_cache.GetHashIds( matching_sha256_hashes )
@ -3781,7 +3783,9 @@ class DB( HydrusDB.HydrusDB ):
else:
matching_sha256_hashes = self.modules_hashes.GetFileHashes( search_hashes, search_hash_type, 'sha256' )
source_to_desired = self.modules_hashes.GetFileHashes( search_hashes, search_hash_type, 'sha256' )
matching_sha256_hashes = list( source_to_desired.values() )
specific_hash_ids = self.modules_hashes_local_cache.GetHashIds( matching_sha256_hashes )

View File

@ -129,15 +129,17 @@ class ClientDBMasterHashes( ClientDBModule.ClientDBModule ):
return hash
def GetFileHashes( self, given_hashes, given_hash_type, desired_hash_type ) -> typing.Collection[ bytes ]:
def GetFileHashes( self, given_hashes, given_hash_type, desired_hash_type ) -> typing.Dict[ bytes, bytes ]:
if given_hash_type == 'sha256':
hash_ids = self.GetHashIds( given_hashes )
hashes_we_have = [ hash for hash in given_hashes if self.HasHash( hash ) ]
hash_ids_to_source_hashes = self.GetHashIdsToHashes( hashes = hashes_we_have )
else:
hash_ids = []
hash_ids_to_source_hashes = {}
for given_hash in given_hashes:
@ -152,21 +154,26 @@ class ClientDBMasterHashes( ClientDBModule.ClientDBModule ):
( hash_id, ) = result
hash_ids.append( hash_id )
hash_ids_to_source_hashes[ hash_id ] = given_hash
if desired_hash_type == 'sha256':
desired_hashes = self.GetHashes( hash_ids )
hash_ids_to_desired_hashes = self.GetHashIdsToHashes( hash_ids = set( hash_ids_to_source_hashes.keys() ) )
else:
desired_hashes = [ desired_hash for ( desired_hash, ) in self._Execute( 'SELECT {} FROM local_hashes WHERE hash_id IN {};'.format( desired_hash_type, HydrusData.SplayListForDB( hash_ids ) ) ) ]
with self._MakeTemporaryIntegerTable( set( hash_ids_to_source_hashes.keys() ), 'hash_id' ) as temp_table_name:
hash_ids_to_desired_hashes = { hash_id : desired_hash for ( hash_id, desired_hash ) in self._Execute( 'SELECT hash_id, {} FROM {} CROSS JOIN local_hashes USING ( hash_id );'.format( desired_hash_type, temp_table_name ) ) }
return desired_hashes
source_to_desired = { hash_ids_to_source_hashes[ hash_id ] : hash_ids_to_desired_hashes[ hash_id ] for hash_id in list( hash_ids_to_desired_hashes.keys() ) }
return source_to_desired
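The refactor above changes `GetFileHashes` from returning a bare list of desired hashes to a source-to-desired dict, so callers can tell which input hash produced which output (and which inputs were missing). A toy sketch of that shape, with an invented lookup table rather than hydrus's schema:

```python
import hashlib

def build_lookup(blobs):
    # Pretend master table: sha256 hex digest -> md5 hex digest, from known bytes.
    return {hashlib.sha256(b).hexdigest(): hashlib.md5(b).hexdigest() for b in blobs}

def get_file_hashes(lookup, given_hashes):
    # Return a source->desired mapping, silently dropping unknown hashes,
    # mirroring the dict-returning signature in the hunk above.
    return {h: lookup[h] for h in given_hashes if h in lookup}

lookup = build_lookup([b'cat', b'dog'])
sha_cat = hashlib.sha256(b'cat').hexdigest()
mapping = get_file_hashes(lookup, [sha_cat, 'deadbeef'])
print(len(mapping))  # 1: the unknown hash is simply absent, not an error
```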
def GetHash( self, hash_id ) -> bytes:

View File

@ -2929,6 +2929,11 @@ class FrameGUI( ClientGUITopLevelWindows.MainFrameThatResizes, CAC.ApplicationCo
ClientGUIMenus.AppendMenuItem( submenu, 'change anonymisation period', 'Change the account history nullification period for this service.', self._ManageServiceOptionsNullificationPeriod, service_key )
if service_type == HC.TAG_REPOSITORY:
ClientGUIMenus.AppendMenuItem( submenu, 'edit tag filter', 'Change the tag filter for this service.', self._ManageServiceOptionsTagFilter, service_key )
ClientGUIMenus.AppendSeparator( submenu )
ClientGUIMenus.AppendMenuItem( submenu, 'maintenance: regen service info', 'Add, edit, and delete this server\'s services.', self._ServerMaintenanceRegenServiceInfo, service_key )
@ -4443,6 +4448,67 @@ class FrameGUI( ClientGUITopLevelWindows.MainFrameThatResizes, CAC.ApplicationCo
job_key.Finish()
job_key.Delete( 5 )
service.SetAccountRefreshDueNow()
def errback_ui_cleanup_callable():
job_key.SetVariable( 'popup_text_1', 'error!' )
job_key.Finish()
job = ClientGUIAsync.AsyncQtJob( self, work_callable, publish_callable, errback_ui_cleanup_callable = errback_ui_cleanup_callable )
job.start()
def _ManageServiceOptionsTagFilter( self, service_key ):
service = self._controller.services_manager.GetService( service_key )
tag_filter = service.GetTagFilter()
with ClientGUITopLevelWindowsPanels.DialogEdit( self, 'edit tag repository tag filter' ) as dlg:
namespaces = HG.client_controller.network_engine.domain_manager.GetParserNamespaces()
message = 'The repository will apply this to all new pending tags that are uploaded to it. Anything that does not pass is silently discarded.'
panel = ClientGUITags.EditTagFilterPanel( dlg, tag_filter, message = message, namespaces = namespaces )
dlg.SetPanel( panel )
if dlg.exec() == QW.QDialog.Accepted:
tag_filter = panel.GetValue()
job_key = ClientThreading.JobKey()
job_key.SetStatusTitle( 'setting tag filter' )
job_key.SetVariable( 'popup_text_1', 'uploading\u2026' )
self._controller.pub( 'message', job_key )
def work_callable():
service.Request( HC.POST, 'tag_filter', { 'tag_filter' : tag_filter } )
return 1
def publish_callable( gumpf ):
job_key.SetVariable( 'popup_text_1', 'done!' )
job_key.Finish()
job_key.Delete( 5 )
service.SetAccountRefreshDueNow()
@ -4511,6 +4577,8 @@ class FrameGUI( ClientGUITopLevelWindows.MainFrameThatResizes, CAC.ApplicationCo
job_key.Finish()
job_key.Delete( 5 )
service.DoAFullMetadataResync()
service.SetAccountRefreshDueNow()

View File

@ -1020,6 +1020,22 @@ class EditURLClassPanel( ClientGUIScrolledPanels.EditPanel ):
self._alphabetise_get_parameters.setToolTip( tt )
self._no_more_path_components_than_this = QW.QCheckBox( self._options_panel )
tt = 'Normally, hydrus will match a URL that has a longer path than is defined here. site.com/index/123456/cool-pic-by-artist will match a URL class that looks for site.com/index/123456, and it will remove that extra cruft on normalisation.'
tt += os.linesep * 2
tt += 'Checking this turns that behaviour off. It will only match if the given URL satisfies all defined path component tests, and no more. If you have multiple URL Classes matching on different levels of a tree, and hydrus is having difficulty matching them up in the right order (neighbouring Gallery/Post URLs can do this), try this.'
self._no_more_path_components_than_this.setToolTip( tt )
self._no_more_parameters_than_this = QW.QCheckBox( self._options_panel )
tt = 'Normally, hydrus will match a URL that has more parameters than is defined here. site.com/index?p=123456&orig_tags=skirt will match a URL class that looks for site.com/index?p=123456. Post URLs will remove that extra cruft on normalisation.'
tt += os.linesep * 2
tt += 'Checking this turns that behaviour off. It will only match if the given URL satisfies all defined parameter tests, and no more. If you have multiple URL Classes matching on the same base URL path but with different query params, and hydrus is having difficulty matching them up in the right order (neighbouring Gallery/Post URLs can do this), try this.'
self._no_more_parameters_than_this.setToolTip( tt )
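The two checkboxes above toggle a "no longer than defined" rule for path components and query parameters. A rough standard-library sketch of how such a test might behave; the function shape and parameters here are hypothetical, not hydrus's URL Class API:

```python
from urllib.parse import urlparse, parse_qs

def matches(url, required_path_len, required_params,
            strict_path=False, strict_params=False):
    parsed = urlparse(url)
    path_parts = [p for p in parsed.path.split('/') if p]
    params = parse_qs(parsed.query)
    if len(path_parts) < required_path_len:
        return False  # URL is too short to satisfy the class
    if strict_path and len(path_parts) > required_path_len:
        return False  # 'do not allow any extra path components'
    if not all(k in params for k in required_params):
        return False  # a required parameter is missing
    if strict_params and set(params) - set(required_params):
        return False  # 'do not allow any extra parameters'
    return True

url = 'https://site.com/index/123456/cool-pic-by-artist'
print(matches(url, 2, []))                    # True: extra component tolerated
print(matches(url, 2, [], strict_path=True))  # False: extra component rejected
```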
self._can_produce_multiple_files = QW.QCheckBox( self._options_panel )
tt = 'If checked, the client will not rely on instances of this URL class to predetermine \'already in db\' or \'previously deleted\' outcomes. This is important for post types like pixiv pages (which can ultimately be manga, and represent many pages) and tweets (which can have multiple images).'
@ -1144,6 +1160,9 @@ class EditURLClassPanel( ClientGUIScrolledPanels.EditPanel ):
self._should_be_associated_with_files.setChecked( should_be_associated_with_files )
self._keep_fragment.setChecked( keep_fragment )
self._no_more_path_components_than_this.setChecked( url_class.NoMorePathComponentsThanThis() )
self._no_more_parameters_than_this.setChecked( url_class.NoMoreParametersThanThis() )
self._path_components.AddDatas( path_components )
self._parameters.AddDatas( list( parameters.items() ) )
@ -1241,6 +1260,8 @@ class EditURLClassPanel( ClientGUIScrolledPanels.EditPanel ):
rows.append( ( 'if matching by subdomain, keep it when normalising?: ', self._keep_matched_subdomains ) )
rows.append( ( 'alphabetise GET parameters when normalising?: ', self._alphabetise_get_parameters ) )
rows.append( ( 'do not allow any extra path components?: ', self._no_more_path_components_than_this ) )
rows.append( ( 'do not allow any extra parameters?: ', self._no_more_parameters_than_this ) )
rows.append( ( 'keep fragment when normalising?: ', self._keep_fragment ) )
rows.append( ( 'post page can produce multiple files?: ', self._can_produce_multiple_files ) )
rows.append( ( 'associate a \'known url\' with resulting files?: ', self._should_be_associated_with_files ) )
@ -1283,6 +1304,8 @@ class EditURLClassPanel( ClientGUIScrolledPanels.EditPanel ):
self._preferred_scheme.currentIndexChanged.connect( self._UpdateControls )
self._netloc.textChanged.connect( self._UpdateControls )
self._alphabetise_get_parameters.clicked.connect( self._UpdateControls )
self._no_more_path_components_than_this.clicked.connect( self._UpdateControls )
self._no_more_parameters_than_this.clicked.connect( self._UpdateControls )
self._match_subdomains.clicked.connect( self._UpdateControls )
self._keep_matched_subdomains.clicked.connect( self._UpdateControls )
self._keep_fragment.clicked.connect( self._UpdateControls )
@ -1639,6 +1662,14 @@ class EditURLClassPanel( ClientGUIScrolledPanels.EditPanel ):
keep_fragment
)
no_more = self._no_more_path_components_than_this.isChecked()
url_class.SetNoMorePathComponentsThanThis( no_more )
no_more = self._no_more_parameters_than_this.isChecked()
url_class.SetNoMoreParametersThanThis( no_more )
return url_class

View File

@ -33,7 +33,9 @@ def CopyHashesToClipboard( win: QW.QWidget, hash_type: str, medias: typing.Seque
num_hashes = len( sha256_hashes )
num_remote_sha256_hashes = len( [ itertools.chain.from_iterable( ( media.GetHashes( discriminant = CC.DISCRIMINANT_NOT_LOCAL, ordered = True ) for media in medias ) ) ] )
desired_hashes = HG.client_controller.Read( 'file_hashes', sha256_hashes, 'sha256', hash_type )
source_to_desired = HG.client_controller.Read( 'file_hashes', sha256_hashes, 'sha256', hash_type )
desired_hashes = [ source_to_desired[ source_hash ] for source_hash in sha256_hashes if source_hash in source_to_desired ]
num_missing = num_hashes - len( desired_hashes )
@ -361,9 +363,10 @@ def OpenMediaURLClassURLs( medias, url_class ):
def ShowDuplicatesInNewPage( location_context: ClientLocation.LocationContext, hash, duplicate_type ):
# TODO: this can be replaced by a call to the MediaResult when it holds these hashes
# don't forget to return itself in position 0!
hashes = HG.client_controller.Read( 'file_duplicate_hashes', location_context, hash, duplicate_type )
if hashes is not None and len( hashes ) > 0:
if hashes is not None and len( hashes ) > 1:
HG.client_controller.pub( 'new_page_query', location_context, initial_hashes = hashes )
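The `> 0` to `> 1` change above reflects that the duplicate lookup returns the queried file itself in position 0, so a single result means the file has no actual relatives. A minimal guard in the same spirit; the callable names are stand-ins, not hydrus's API:

```python
def show_duplicates_in_new_page(read_duplicate_hashes, open_page, hash_value):
    # The read returns the queried hash itself first, so one result = no dupes.
    hashes = read_duplicate_hashes(hash_value)
    if hashes is not None and len(hashes) > 1:
        open_page(hashes)
        return True
    return False

opened = []
result = show_duplicates_in_new_page(lambda h: [h], opened.append, 'abc')
print(result)  # False: only the file itself came back, so no page is opened
```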

View File

@ -234,7 +234,7 @@ shortcut_names_to_descriptions = {
SHORTCUTS_RESERVED_NAMES = [ 'global', 'archive_delete_filter', 'duplicate_filter', 'media', 'tags_autocomplete', 'main_gui', 'media_viewer_browser', 'media_viewer', 'media_viewer_media_window', 'preview_media_window' ]
SHORTCUTS_GLOBAL_ACTIONS = [ CAC.SIMPLE_GLOBAL_AUDIO_MUTE, CAC.SIMPLE_GLOBAL_AUDIO_UNMUTE, CAC.SIMPLE_GLOBAL_AUDIO_MUTE_FLIP, CAC.SIMPLE_EXIT_APPLICATION, CAC.SIMPLE_EXIT_APPLICATION_FORCE_MAINTENANCE, CAC.SIMPLE_RESTART_APPLICATION, CAC.SIMPLE_HIDE_TO_SYSTEM_TRAY, CAC.SIMPLE_GLOBAL_PROFILE_MODE_FLIP, CAC.SIMPLE_GLOBAL_FORCE_ANIMATION_SCANBAR_SHOW ]
SHORTCUTS_MEDIA_ACTIONS = [ CAC.SIMPLE_MANAGE_FILE_TAGS, CAC.SIMPLE_MANAGE_FILE_RATINGS, CAC.SIMPLE_MANAGE_FILE_URLS, CAC.SIMPLE_MANAGE_FILE_NOTES, CAC.SIMPLE_ARCHIVE_FILE, CAC.SIMPLE_INBOX_FILE, CAC.SIMPLE_DELETE_FILE, CAC.SIMPLE_UNDELETE_FILE, CAC.SIMPLE_EXPORT_FILES, CAC.SIMPLE_EXPORT_FILES_QUICK_AUTO_EXPORT, CAC.SIMPLE_REMOVE_FILE_FROM_VIEW, CAC.SIMPLE_OPEN_FILE_IN_EXTERNAL_PROGRAM, CAC.SIMPLE_OPEN_SELECTION_IN_NEW_PAGE, CAC.SIMPLE_LAUNCH_THE_ARCHIVE_DELETE_FILTER, CAC.SIMPLE_COPY_BMP, CAC.SIMPLE_COPY_BMP_OR_FILE_IF_NOT_BMPABLE, CAC.SIMPLE_COPY_FILE, CAC.SIMPLE_COPY_PATH, CAC.SIMPLE_COPY_SHA256_HASH, CAC.SIMPLE_COPY_MD5_HASH, CAC.SIMPLE_COPY_SHA1_HASH, CAC.SIMPLE_COPY_SHA512_HASH, CAC.SIMPLE_GET_SIMILAR_TO_EXACT, CAC.SIMPLE_GET_SIMILAR_TO_VERY_SIMILAR, CAC.SIMPLE_GET_SIMILAR_TO_SIMILAR, CAC.SIMPLE_GET_SIMILAR_TO_SPECULATIVE, CAC.SIMPLE_DUPLICATE_MEDIA_SET_ALTERNATE, CAC.SIMPLE_DUPLICATE_MEDIA_SET_ALTERNATE_COLLECTIONS, CAC.SIMPLE_DUPLICATE_MEDIA_SET_CUSTOM, CAC.SIMPLE_DUPLICATE_MEDIA_SET_FOCUSED_BETTER, CAC.SIMPLE_DUPLICATE_MEDIA_SET_FOCUSED_KING, CAC.SIMPLE_DUPLICATE_MEDIA_SET_SAME_QUALITY, CAC.SIMPLE_DUPLICATE_MEDIA_SET_POTENTIAL, CAC.SIMPLE_OPEN_KNOWN_URL ]
SHORTCUTS_MEDIA_ACTIONS = [ CAC.SIMPLE_MANAGE_FILE_TAGS, CAC.SIMPLE_MANAGE_FILE_RATINGS, CAC.SIMPLE_MANAGE_FILE_URLS, CAC.SIMPLE_MANAGE_FILE_NOTES, CAC.SIMPLE_ARCHIVE_FILE, CAC.SIMPLE_INBOX_FILE, CAC.SIMPLE_DELETE_FILE, CAC.SIMPLE_UNDELETE_FILE, CAC.SIMPLE_EXPORT_FILES, CAC.SIMPLE_EXPORT_FILES_QUICK_AUTO_EXPORT, CAC.SIMPLE_REMOVE_FILE_FROM_VIEW, CAC.SIMPLE_OPEN_FILE_IN_EXTERNAL_PROGRAM, CAC.SIMPLE_OPEN_SELECTION_IN_NEW_PAGE, CAC.SIMPLE_LAUNCH_THE_ARCHIVE_DELETE_FILTER, CAC.SIMPLE_COPY_BMP, CAC.SIMPLE_COPY_BMP_OR_FILE_IF_NOT_BMPABLE, CAC.SIMPLE_COPY_FILE, CAC.SIMPLE_COPY_PATH, CAC.SIMPLE_COPY_SHA256_HASH, CAC.SIMPLE_COPY_MD5_HASH, CAC.SIMPLE_COPY_SHA1_HASH, CAC.SIMPLE_COPY_SHA512_HASH, CAC.SIMPLE_GET_SIMILAR_TO_EXACT, CAC.SIMPLE_GET_SIMILAR_TO_VERY_SIMILAR, CAC.SIMPLE_GET_SIMILAR_TO_SIMILAR, CAC.SIMPLE_GET_SIMILAR_TO_SPECULATIVE, CAC.SIMPLE_DUPLICATE_MEDIA_SET_ALTERNATE, CAC.SIMPLE_DUPLICATE_MEDIA_SET_ALTERNATE_COLLECTIONS, CAC.SIMPLE_DUPLICATE_MEDIA_SET_CUSTOM, CAC.SIMPLE_DUPLICATE_MEDIA_SET_FOCUSED_BETTER, CAC.SIMPLE_DUPLICATE_MEDIA_SET_FOCUSED_KING, CAC.SIMPLE_DUPLICATE_MEDIA_SET_SAME_QUALITY, CAC.SIMPLE_DUPLICATE_MEDIA_SET_POTENTIAL, CAC.SIMPLE_SHOW_DUPLICATES, CAC.SIMPLE_OPEN_KNOWN_URL ]
SHORTCUTS_MEDIA_VIEWER_ACTIONS = [ CAC.SIMPLE_PAUSE_MEDIA, CAC.SIMPLE_PAUSE_PLAY_MEDIA, CAC.SIMPLE_MEDIA_SEEK_DELTA, CAC.SIMPLE_MOVE_ANIMATION_TO_PREVIOUS_FRAME, CAC.SIMPLE_MOVE_ANIMATION_TO_NEXT_FRAME, CAC.SIMPLE_SWITCH_BETWEEN_FULLSCREEN_BORDERLESS_AND_REGULAR_FRAMED_WINDOW, CAC.SIMPLE_PAN_UP, CAC.SIMPLE_PAN_DOWN, CAC.SIMPLE_PAN_LEFT, CAC.SIMPLE_PAN_RIGHT, CAC.SIMPLE_PAN_TOP_EDGE, CAC.SIMPLE_PAN_BOTTOM_EDGE, CAC.SIMPLE_PAN_LEFT_EDGE, CAC.SIMPLE_PAN_RIGHT_EDGE, CAC.SIMPLE_PAN_VERTICAL_CENTER, CAC.SIMPLE_PAN_HORIZONTAL_CENTER, CAC.SIMPLE_ZOOM_IN, CAC.SIMPLE_ZOOM_OUT, CAC.SIMPLE_SWITCH_BETWEEN_100_PERCENT_AND_CANVAS_ZOOM, CAC.SIMPLE_SWITCH_BETWEEN_100_PERCENT_AND_MAX_ZOOM, CAC.SIMPLE_SWITCH_BETWEEN_CANVAS_AND_MAX_ZOOM, CAC.SIMPLE_ZOOM_100, CAC.SIMPLE_ZOOM_CANVAS, CAC.SIMPLE_ZOOM_DEFAULT, CAC.SIMPLE_ZOOM_MAX, CAC.SIMPLE_FLIP_DARKMODE, CAC.SIMPLE_CLOSE_MEDIA_VIEWER ]
SHORTCUTS_MEDIA_VIEWER_BROWSER_ACTIONS = [ CAC.SIMPLE_VIEW_NEXT, CAC.SIMPLE_VIEW_FIRST, CAC.SIMPLE_VIEW_LAST, CAC.SIMPLE_VIEW_PREVIOUS, CAC.SIMPLE_PAUSE_PLAY_SLIDESHOW, CAC.SIMPLE_SHOW_MENU, CAC.SIMPLE_CLOSE_MEDIA_VIEWER ]
SHORTCUTS_MAIN_GUI_ACTIONS = [ CAC.SIMPLE_REFRESH, CAC.SIMPLE_REFRESH_ALL_PAGES, CAC.SIMPLE_REFRESH_PAGE_OF_PAGES_PAGES, CAC.SIMPLE_NEW_PAGE, CAC.SIMPLE_NEW_PAGE_OF_PAGES, CAC.SIMPLE_NEW_DUPLICATE_FILTER_PAGE, CAC.SIMPLE_NEW_GALLERY_DOWNLOADER_PAGE, CAC.SIMPLE_NEW_URL_DOWNLOADER_PAGE, CAC.SIMPLE_NEW_SIMPLE_DOWNLOADER_PAGE, CAC.SIMPLE_NEW_WATCHER_DOWNLOADER_PAGE, CAC.SIMPLE_SET_MEDIA_FOCUS, CAC.SIMPLE_SHOW_HIDE_SPLITTERS, CAC.SIMPLE_SET_SEARCH_FOCUS, CAC.SIMPLE_UNCLOSE_PAGE, CAC.SIMPLE_CLOSE_PAGE, CAC.SIMPLE_REDO, CAC.SIMPLE_UNDO, CAC.SIMPLE_FLIP_DARKMODE, CAC.SIMPLE_RUN_ALL_EXPORT_FOLDERS, CAC.SIMPLE_CHECK_ALL_IMPORT_FOLDERS, CAC.SIMPLE_FLIP_DEBUG_FORCE_IDLE_MODE_DO_NOT_SET_THIS, CAC.SIMPLE_SHOW_AND_FOCUS_MANAGE_TAGS_FAVOURITE_TAGS, CAC.SIMPLE_SHOW_AND_FOCUS_MANAGE_TAGS_RELATED_TAGS, CAC.SIMPLE_REFRESH_RELATED_TAGS, CAC.SIMPLE_SHOW_AND_FOCUS_MANAGE_TAGS_FILE_LOOKUP_SCRIPT_TAGS, CAC.SIMPLE_SHOW_AND_FOCUS_MANAGE_TAGS_RECENT_TAGS, CAC.SIMPLE_FOCUS_MEDIA_VIEWER, CAC.SIMPLE_MOVE_PAGES_SELECTION_LEFT, CAC.SIMPLE_MOVE_PAGES_SELECTION_RIGHT, CAC.SIMPLE_MOVE_PAGES_SELECTION_HOME, CAC.SIMPLE_MOVE_PAGES_SELECTION_END, CAC.SIMPLE_OPEN_COMMAND_PALETTE ]

View File

@ -637,12 +637,13 @@ class EditTagFilterPanel( ClientGUIScrolledPanels.EditPanel ):
TEST_RESULT_DEFAULT = 'Enter a tag here to test if it passes the current filter:'
TEST_RESULT_BLACKLIST_DEFAULT = 'Enter a tag here to test if it passes the blacklist (siblings tested, unnamespaced rules match namespaced tags):'
def __init__( self, parent, tag_filter, only_show_blacklist = False, namespaces = None, message = None ):
def __init__( self, parent, tag_filter, only_show_blacklist = False, namespaces = None, message = None, read_only = False ):
ClientGUIScrolledPanels.EditPanel.__init__( self, parent )
self._only_show_blacklist = only_show_blacklist
self._namespaces = namespaces
self._read_only = read_only
self._wildcard_replacements = {}
@ -652,12 +653,6 @@ class EditTagFilterPanel( ClientGUIScrolledPanels.EditPanel ):
#
help_button = ClientGUICommon.BetterBitmapButton( self, CC.global_pixmaps().help, self._ShowHelp )
help_hbox = ClientGUICommon.WrapInText( help_button, self, 'help for this panel -->', object_name = 'HydrusIndeterminate' )
#
self._import_favourite = ClientGUICommon.BetterButton( self, 'import', self._ImportFavourite )
self._export_favourite = ClientGUICommon.BetterButton( self, 'export', self._ExportFavourite )
self._load_favourite = ClientGUICommon.BetterButton( self, 'load', self._LoadFavourite )
@ -716,7 +711,14 @@ class EditTagFilterPanel( ClientGUIScrolledPanels.EditPanel ):
vbox = QP.VBoxLayout()
QP.AddToLayout( vbox, help_hbox, CC.FLAGS_ON_RIGHT )
if not self._read_only:
help_button = ClientGUICommon.BetterBitmapButton( self, CC.global_pixmaps().help, self._ShowHelp )
help_hbox = ClientGUICommon.WrapInText( help_button, self, 'help for this panel -->', object_name = 'HydrusIndeterminate' )
QP.AddToLayout( vbox, help_hbox, CC.FLAGS_ON_RIGHT )
if message is not None:
@ -729,6 +731,12 @@ class EditTagFilterPanel( ClientGUIScrolledPanels.EditPanel ):
hbox = QP.HBoxLayout()
if self._read_only:
self._import_favourite.hide()
self._load_favourite.hide()
QP.AddToLayout( hbox, self._import_favourite, CC.FLAGS_CENTER_PERPENDICULAR )
QP.AddToLayout( hbox, self._export_favourite, CC.FLAGS_CENTER_PERPENDICULAR )
QP.AddToLayout( hbox, self._load_favourite, CC.FLAGS_CENTER_PERPENDICULAR )
@ -1094,7 +1102,7 @@ class EditTagFilterPanel( ClientGUIScrolledPanels.EditPanel ):
blacklist_panel = ClientGUICommon.StaticBox( advanced_panel, 'exclude these' )
self._advanced_blacklist = ClientGUIListBoxes.ListBoxTagsFilter( blacklist_panel )
self._advanced_blacklist = ClientGUIListBoxes.ListBoxTagsFilter( blacklist_panel, read_only = self._read_only )
self._advanced_blacklist_input = ClientGUIControls.TextAndPasteCtrl( blacklist_panel, self._AdvancedAddBlacklistMultiple, allow_empty_input = True )
@ -1106,7 +1114,7 @@ class EditTagFilterPanel( ClientGUIScrolledPanels.EditPanel ):
whitelist_panel = ClientGUICommon.StaticBox( advanced_panel, 'except for these' )
self._advanced_whitelist = ClientGUIListBoxes.ListBoxTagsFilter( whitelist_panel )
self._advanced_whitelist = ClientGUIListBoxes.ListBoxTagsFilter( whitelist_panel, read_only = self._read_only )
self._advanced_whitelist_input = ClientGUIControls.TextAndPasteCtrl( whitelist_panel, self._AdvancedAddWhitelistMultiple, allow_empty_input = True )
@ -1115,6 +1123,18 @@ class EditTagFilterPanel( ClientGUIScrolledPanels.EditPanel ):
#
if self._read_only:
self._advanced_blacklist_input.hide()
add_blacklist_button.hide()
delete_blacklist_button.hide()
blacklist_everything_button.hide()
self._advanced_whitelist_input.hide()
self._advanced_add_whitelist_button.hide()
delete_whitelist_button.hide()
button_hbox = QP.HBoxLayout()
QP.AddToLayout( button_hbox, self._advanced_blacklist_input, CC.FLAGS_EXPAND_BOTH_WAYS )
@ -1173,12 +1193,20 @@ class EditTagFilterPanel( ClientGUIScrolledPanels.EditPanel ):
self._simple_blacklist_namespace_checkboxes.Append( namespace, namespace + ':' )
self._simple_blacklist = ClientGUIListBoxes.ListBoxTagsFilter( blacklist_panel )
self._simple_blacklist = ClientGUIListBoxes.ListBoxTagsFilter( blacklist_panel, read_only = self._read_only )
self._simple_blacklist_input = ClientGUIControls.TextAndPasteCtrl( blacklist_panel, self._SimpleAddBlacklistMultiple, allow_empty_input = True )
#
if self._read_only:
self._simple_blacklist_global_checkboxes.setEnabled( False )
self._simple_blacklist_namespace_checkboxes.setEnabled( False )
self._simple_blacklist_input.hide()
left_vbox = QP.VBoxLayout()
QP.AddToLayout( left_vbox, self._simple_blacklist_global_checkboxes, CC.FLAGS_EXPAND_PERPENDICULAR )
@ -1232,12 +1260,21 @@ class EditTagFilterPanel( ClientGUIScrolledPanels.EditPanel ):
self._simple_whitelist_namespace_checkboxes.Append( namespace, namespace + ':' )
self._simple_whitelist = ClientGUIListBoxes.ListBoxTagsFilter( whitelist_panel )
self._simple_whitelist = ClientGUIListBoxes.ListBoxTagsFilter( whitelist_panel, read_only = self._read_only )
self._simple_whitelist_input = ClientGUIControls.TextAndPasteCtrl( whitelist_panel, self._SimpleAddWhitelistMultiple, allow_empty_input = True )
#
if self._read_only:
self._simple_whitelist_global_checkboxes.setEnabled( False )
self._simple_whitelist_namespace_checkboxes.setEnabled( False )
self._simple_whitelist_input.hide()
left_vbox = QP.VBoxLayout()
QP.AddToLayout( left_vbox, self._simple_whitelist_global_checkboxes, CC.FLAGS_EXPAND_PERPENDICULAR )

View File

@ -1,77 +1,67 @@
import os
import traceback
# If not explicitly set, prefer PySide instead of PyQt, which is the qtpy default
# It is critical that this runs on startup *before* anything is imported from qtpy.
def get_qt_api_str_status():
def get_qt_library_str_status():
infos = []
try:
if 'QT_API' in os.environ:
qt_api = os.environ[ 'QT_API' ]
import_status = 'imported ok'
if qt_api == 'pyqt5':
try:
import PyQt5
except ImportError as e:
import_status = 'did not import ok: {}'.format( e )
elif qt_api == 'pyside2':
try:
import PySide2
except ImportError as e:
import_status = 'did not import ok: {}'.format( e )
elif qt_api == 'pyqt6':
try:
import PyQt6
except ImportError as e:
import_status = 'did not import ok: {}'.format( e )
elif qt_api == 'pyside6':
try:
import PySide6
except ImportError as e:
import_status = 'did not import ok: {}'.format( e )
return 'QT_API: {}, {}'.format( qt_api, import_status )
else:
return 'No QT_API set.'
import PyQt5
infos.append( 'PyQt5 imported ok' )
except Exception as e:
return 'Unable to get QT_API info: {}'.format( e )
infos.append( 'PyQt5 did not import ok:\n{}'.format( traceback.format_exc() ) )
try:
import PySide2
infos.append( 'PySide2 imported ok' )
except Exception as e:
infos.append( 'PySide2 did not import ok:\n{}'.format( traceback.format_exc() ) )
try:
import PyQt6
infos.append( 'PyQt6 imported ok' )
except Exception as e:
infos.append( 'PyQt6 did not import ok:\n{}'.format( traceback.format_exc() ) )
try:
import PySide6
infos.append( 'PySide6 imported ok' )
except Exception as e:
infos.append( 'PySide6 did not import ok:\n{}'.format( traceback.format_exc() ) )
return '\n'.join( infos )
if 'QT_API' not in os.environ:
if 'QT_API' in os.environ:
QT_API_INITIAL_VALUE = os.environ[ 'QT_API' ]
else:
QT_API_INITIAL_VALUE = None
try:
@ -94,6 +84,36 @@ if 'QT_API' not in os.environ:
def get_qt_api_str_status():
try:
if QT_API_INITIAL_VALUE is None:
initial_qt = 'QT_API was initially not set.'
else:
initial_qt = 'QT_API was initially "{}".'.format( QT_API_INITIAL_VALUE )
if 'QT_API' in os.environ:
current_qt = 'Current QT_API is "{}".'.format( os.environ[ 'QT_API' ] )
else:
current_qt = 'Currently QT_API is not set.'
return '{} {}'.format( initial_qt, current_qt )
except Exception as e:
return 'Unable to get QT_API info: {}'.format( traceback.format_exc() )
#
try:
@ -102,15 +122,19 @@ try:
except ModuleNotFoundError as e:
qt_str = get_qt_api_str_status()
message = 'Either the qtpy module was not found, or qtpy could not find a Qt to use!'
message = 'Either the qtpy module was not found, or qtpy could not find a Qt to use! Error was: {}'.format(
e
)
message += os.linesep * 2
message += 'Are you sure you installed and activated your venv correctly? Check the \'running from source\' section of the help if you are confused! Here is info on QT_API: {}'.format(
qt_str
)
message += '\n' * 2
message += 'Are you sure you installed and activated your venv correctly? Check the \'running from source\' section of the help if you are confused!'
message += '\n' * 2
message += 'Here is info on QT_API:\n{}'.format( get_qt_api_str_status() )
message += '\n' * 2
message += 'Here is info on your available Qt Libraries:\n{}'.format( get_qt_library_str_status() )
raise Exception( message )
@ -123,11 +147,11 @@ try:
except ModuleNotFoundError as e:
message = 'One of the Qt modules could not be loaded! Error was: {}'.format(
e
message = 'One of the Qt modules could not be loaded! Error was:\n{}'.format(
traceback.format_exc()
)
message += os.linesep * 2
message += '\n' * 2
try:
@ -143,13 +167,15 @@ except ModuleNotFoundError as e:
message += 'qtpy had problems saying which module it had selected!'
qt_str = get_qt_api_str_status()
message += '\n' * 2
message += ' Here is info on QT_API: {}'.format(
qt_str
)
message += 'Here is info on QT_API:\n{}'.format( get_qt_api_str_status() )
message += os.linesep * 2
message += '\n' * 2
message += 'Here is info on your available Qt Libraries:\n{}'.format( get_qt_library_str_status() )
message += '\n' * 2
message += 'If you are running from a built release, please let hydev know!'


@ -22,6 +22,7 @@ from hydrus.client.gui import ClientGUICore as CGC
from hydrus.client.gui import ClientGUIDialogs
from hydrus.client.gui import ClientGUIDialogsManage
from hydrus.client.gui import ClientGUIDialogsQuick
from hydrus.client.gui import ClientGUIDuplicates
from hydrus.client.gui import ClientGUIFunctions
from hydrus.client.gui import ClientGUIMedia
from hydrus.client.gui import ClientGUIMediaActions
@ -908,6 +909,128 @@ class Canvas( QW.QWidget, CAC.ApplicationCommandProcessorMixin ):
self._media_container.PausePlay()
elif action == CAC.SIMPLE_SHOW_DUPLICATES:
if self._current_media is not None:
hash = self._current_media.GetHash()
duplicate_type = command.GetSimpleData()
ClientGUIMedia.ShowDuplicatesInNewPage( self._location_context, hash, duplicate_type )
elif action == CAC.SIMPLE_DUPLICATE_MEDIA_CLEAR_FOCUSED_FALSE_POSITIVES:
# TODO: when media knows dupe relationships, all these lads here need a media scan for the existence of alternate groups or whatever
# no duplicate group->don't start the process
if self._current_media is not None:
hash = self._current_media.GetHash()
ClientGUIDuplicates.ClearFalsePositives( self, ( hash, ) )
elif action == CAC.SIMPLE_DUPLICATE_MEDIA_CLEAR_FALSE_POSITIVES:
if self._current_media is not None:
hash = self._current_media.GetHash()
ClientGUIDuplicates.ClearFalsePositives( self, ( hash, ) )
elif action == CAC.SIMPLE_DUPLICATE_MEDIA_DISSOLVE_FOCUSED_ALTERNATE_GROUP:
if self._current_media is not None:
hash = self._current_media.GetHash()
ClientGUIDuplicates.DissolveAlternateGroup( self, ( hash, ) )
elif action == CAC.SIMPLE_DUPLICATE_MEDIA_DISSOLVE_ALTERNATE_GROUP:
if self._current_media is not None:
hash = self._current_media.GetHash()
ClientGUIDuplicates.DissolveAlternateGroup( self, ( hash, ) )
elif action == CAC.SIMPLE_DUPLICATE_MEDIA_DISSOLVE_FOCUSED_DUPLICATE_GROUP:
if self._current_media is not None:
hash = self._current_media.GetHash()
ClientGUIDuplicates.DissolveDuplicateGroup( self, ( hash, ) )
elif action == CAC.SIMPLE_DUPLICATE_MEDIA_DISSOLVE_DUPLICATE_GROUP:
if self._current_media is not None:
hash = self._current_media.GetHash()
ClientGUIDuplicates.DissolveDuplicateGroup( self, ( hash, ) )
elif action == CAC.SIMPLE_DUPLICATE_MEDIA_REMOVE_FOCUSED_FROM_ALTERNATE_GROUP:
if self._current_media is not None:
hash = self._current_media.GetHash()
ClientGUIDuplicates.RemoveFromAlternateGroup( self, ( hash, ) )
elif action == CAC.SIMPLE_DUPLICATE_MEDIA_REMOVE_FOCUSED_FROM_DUPLICATE_GROUP:
if self._current_media is not None:
hash = self._current_media.GetHash()
ClientGUIDuplicates.RemoveFromDuplicateGroup( self, ( hash, ) )
elif action == CAC.SIMPLE_DUPLICATE_MEDIA_RESET_FOCUSED_POTENTIAL_SEARCH:
if self._current_media is not None:
hash = self._current_media.GetHash()
ClientGUIDuplicates.ResetPotentialSearch( self, ( hash, ) )
elif action == CAC.SIMPLE_DUPLICATE_MEDIA_RESET_POTENTIAL_SEARCH:
if self._current_media is not None:
hash = self._current_media.GetHash()
ClientGUIDuplicates.ResetPotentialSearch( self, ( hash, ) )
elif action == CAC.SIMPLE_DUPLICATE_MEDIA_REMOVE_FOCUSED_POTENTIALS:
if self._current_media is not None:
hash = self._current_media.GetHash()
ClientGUIDuplicates.RemovePotentials( self, ( hash, ) )
elif action == CAC.SIMPLE_DUPLICATE_MEDIA_REMOVE_POTENTIALS:
if self._current_media is not None:
hash = self._current_media.GetHash()
ClientGUIDuplicates.RemovePotentials( self, ( hash, ) )
elif action == CAC.SIMPLE_MEDIA_SEEK_DELTA:
( direction, ms ) = command.GetSimpleData()
@ -2421,7 +2544,29 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
index_string = HydrusData.ConvertValueRangeToPrettyString( progress, total )
num_decisions_string = '{} decisions'.format( HydrusData.ToHumanInt( self._GetNumCommittableDecisions() ) )
num_committable = self._GetNumCommittableDecisions()
num_deletable = self._GetNumCommittableDeletes()
components = []
if num_committable > 0:
components.append( '{} decisions'.format( HydrusData.ToHumanInt( num_committable ) ) )
if num_deletable > 0:
components.append( '{} deletes'.format( HydrusData.ToHumanInt( num_deletable ) ) )
if len( components ) == 0:
num_decisions_string = 'no decisions yet'
else:
num_decisions_string = ', '.join( components )
return '{} - {} - {}'.format( current_media_label, index_string, num_decisions_string )
@ -2437,6 +2582,11 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
return len( [ 1 for ( duplicate_type, first_media, second_media, list_of_service_keys_to_content_updates, was_auto_skipped ) in self._processed_pairs if duplicate_type is not None ] )
def _GetNumCommittableDeletes( self ):
return len( [ 1 for ( duplicate_type, first_media, second_media, list_of_service_keys_to_content_updates, was_auto_skipped ) in self._processed_pairs if duplicate_type is None and len( list_of_service_keys_to_content_updates ) > 0 ] )
def _GetNumRemainingDecisions( self ):
# this looks a little weird, but I want to be clear that we make a decision on the final index
@ -2761,10 +2911,23 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
if num_remaining == 0:
num_committable = self._GetNumCommittableDecisions()
num_deletable = self._GetNumCommittableDeletes()
if num_committable > 0:
if num_committable + num_deletable > 0:
label = 'commit ' + HydrusData.ToHumanInt( num_committable ) + ' decisions and continue?'
components = []
if num_committable > 0:
components.append( '{} decisions'.format( HydrusData.ToHumanInt( num_committable ) ) )
if num_deletable > 0:
components.append( '{} deletes'.format( HydrusData.ToHumanInt( num_deletable ) ) )
label = 'commit {} and continue?'.format( ' and '.join( components ) )
result = ClientGUIDialogsQuick.GetInterstitialFilteringAnswer( self, label )
@ -3033,10 +3196,23 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
def TryToDoPreClose( self ):
num_committable = self._GetNumCommittableDecisions()
num_deletable = self._GetNumCommittableDeletes()
if num_committable > 0:
if num_committable + num_deletable > 0:
label = 'commit ' + HydrusData.ToHumanInt( num_committable ) + ' decisions?'
components = []
if num_committable > 0:
components.append( '{} decisions'.format( HydrusData.ToHumanInt( num_committable ) ) )
if num_deletable > 0:
components.append( '{} deletes'.format( HydrusData.ToHumanInt( num_deletable ) ) )
label = 'commit {}?'.format( ' and '.join( components ) )
( result, cancelled ) = ClientGUIDialogsQuick.GetFinishFilteringAnswer( self, label )


@ -1152,7 +1152,7 @@ class EditTagImportOptionsPanel( ClientGUIScrolledPanels.EditPanel ):
self._use_default_dropdown.addItem( 'use the default tag import options at the time of import', True )
self._use_default_dropdown.addItem( 'set custom tag import options just for this importer', False )
tt = 'Normally, the client will refer to the defaults (as set under "network->downloaders->manage default tag import options") for the appropriate tag import options at the time of import.'
tt = 'Normally, the client will refer to the defaults (as set under "network->downloaders->manage default import options") for the appropriate tag import options at the time of import.'
tt += os.linesep * 2
tt += 'It is easier to work this way, since you can change a single default setting and update all current and future downloaders that refer to those defaults, whereas having specific options for every subscription or downloader means you have to update every single one just to make a little change somewhere.'
tt += os.linesep * 2
@ -1408,7 +1408,7 @@ You can also set some fixed 'explicit' tags (like, say, 'read later' or 'from my
---
Please note that once you know what tags you like, you can (and should) set up the 'default' values for these tag import options under _network->downloaders->manage default tag import options_, both globally and on a per-parser basis. If you always want all the tags going to 'my tags', this is easy to set up there, and you won't have to put it in every time.'''
Please note that once you know what tags you like, you can (and should) set up the 'default' values for these tag import options under _network->downloaders->manage default import options_, both globally and on a per-parser basis. If you always want all the tags going to 'my tags', this is easy to set up there, and you won't have to put it in every time.'''
QW.QMessageBox.information( self, 'Information', message )


@ -3208,14 +3208,16 @@ class ListBoxTagsFilter( ListBoxTags ):
tagsRemoved = QC.Signal( list )
def __init__( self, parent ):
def __init__( self, parent, read_only = False ):
ListBoxTags.__init__( self, parent )
self._read_only = read_only
def _Activate( self, ctrl_down, shift_down ) -> bool:
if len( self._selected_terms ) > 0:
if len( self._selected_terms ) > 0 and not self._read_only:
tag_slices = [ term.GetTagSlice() for term in self._selected_terms ]


@ -2010,6 +2010,19 @@ class MediaPanel( ClientMedia.ListeningMediaList, QW.QScrollArea, CAC.Applicatio
self._CopyHashesToClipboard( 'sha512' )
elif action == CAC.SIMPLE_SHOW_DUPLICATES:
if self._HasFocusSingleton():
media = self._GetFocusSingleton()
hash = media.GetHash()
duplicate_type = command.GetSimpleData()
ClientGUIMedia.ShowDuplicatesInNewPage( self._location_context, hash, duplicate_type )
elif action == CAC.SIMPLE_DUPLICATE_MEDIA_CLEAR_FOCUSED_FALSE_POSITIVES:
if self._HasFocusSingleton():


@ -513,6 +513,7 @@ class MediaSortControl( QW.QWidget ):
self._sort_order_choice.setMinimumWidth( asc_width )
self._UpdateSortTypeLabel()
self._UpdateButtonsVisible()
self._UpdateAscDescLabelsAndDefault()
#
@ -547,7 +548,7 @@ class MediaSortControl( QW.QWidget ):
self._sort_tag_display_type_button.valueChanged.connect( self.EventTagDisplayTypeChoice )
self._sort_order_choice.valueChanged.connect( self.EventSortAscChoice )
self._tag_context_button.valueChanged.connect( self._TagContextChanged )
self._tag_context_button.valueChanged.connect( self.EventTagContextChanged )
def _BroadcastSort( self ):
@ -738,6 +739,7 @@ class MediaSortControl( QW.QWidget ):
self._sort_type = sort_type
self._UpdateSortTypeLabel()
self._UpdateButtonsVisible()
self._UpdateAscDescLabelsAndDefault()
@ -750,13 +752,6 @@ class MediaSortControl( QW.QWidget ):
self._BroadcastSort()
def _TagContextChanged( self, tag_context: ClientSearch.TagContext ):
self._UserChoseASort()
self._BroadcastSort()
def _UpdateAscDescLabelsAndDefault( self ):
media_sort = self._GetCurrentSort()
@ -772,6 +767,8 @@ class MediaSortControl( QW.QWidget ):
( desc_str, CC.SORT_DESC )
]
# if there are no changes to asc/desc texts, then we'll keep the previous value
if choice_tuples != self._sort_order_choice.GetChoiceTuples():
self._sort_order_choice.SetChoiceTuples( choice_tuples )
@ -779,11 +776,12 @@ class MediaSortControl( QW.QWidget ):
self._sort_order_choice.SetValue( default_sort_order )
# if there are no changes to asc/desc texts, then we'll keep the previous value
self._sort_order_choice.setVisible( True )
else:
self._sort_order_choice.SetChoiceTuples( [] )
self._sort_order_choice.setVisible( False )
#self._sort_order_choice.SetChoiceTuples( [] )
self._sort_order_choice.blockSignals( False )
@ -791,7 +789,13 @@ class MediaSortControl( QW.QWidget ):
def _UpdateButtonsVisible( self ):
self._tag_context_button.setVisible( HG.client_controller.new_options.GetBoolean( 'advanced_mode' ) )
( sort_metatype, sort_data ) = self._sort_type
show_tag_button = sort_metatype == 'namespaces' and HG.client_controller.new_options.GetBoolean( 'advanced_mode' )
self._tag_context_button.setVisible( show_tag_button )
self._sort_tag_display_type_button.setVisible( show_tag_button )
def _UpdateSortTypeLabel( self ):
@ -828,8 +832,6 @@ class MediaSortControl( QW.QWidget ):
self._sort_tag_display_type_button.setVisible( show_tdt )
def _UserChoseASort( self ):
@ -871,9 +873,14 @@ class MediaSortControl( QW.QWidget ):
self._BroadcastSort()
def EventTagDisplayTypeChoice( self ):
def EventTagContextChanged( self, tag_context: ClientSearch.TagContext ):
tag_display_type = self._sort_tag_display_type_button.GetValue()
self._UserChoseASort()
self._BroadcastSort()
def EventTagDisplayTypeChoice( self ):
( sort_metatype, sort_data ) = self._sort_type
@ -881,9 +888,11 @@ class MediaSortControl( QW.QWidget ):
( namespaces, current_tag_display_type ) = sort_data
tag_display_type = self._sort_tag_display_type_button.GetValue()
sort_data = ( namespaces, tag_display_type )
self._sort_type = ( sort_metatype, sort_data )
self._SetSortType( ( sort_metatype, sort_data ) )
self._UserChoseASort()
@ -910,8 +919,6 @@ class MediaSortControl( QW.QWidget ):
self._tag_context_button.SetValue( media_sort.tag_context )
self._UpdateButtonsVisible()
def wheelEvent( self, event ):


@ -31,6 +31,7 @@ from hydrus.client.gui import ClientGUIPanels
from hydrus.client.gui import ClientGUIScrolledPanels
from hydrus.client.gui import ClientGUIScrolledPanelsReview
from hydrus.client.gui import ClientGUIStringControls
from hydrus.client.gui import ClientGUITags
from hydrus.client.gui import ClientGUITopLevelWindowsPanels
from hydrus.client.gui import QtPorting as QP
from hydrus.client.gui.lists import ClientGUIListConstants as CGLC
@ -2628,6 +2629,9 @@ class ReviewServiceRepositorySubPanel( QW.QWidget ):
self._metadata_st = ClientGUICommon.BetterStaticText( self._network_panel )
self._tag_filter_button = ClientGUICommon.BetterButton( self._network_panel, 'tag filter', self._ReviewTagFilter )
self._tag_filter_button.setEnabled( False )
self._download_progress = ClientGUICommon.TextAndGauge( self._network_panel )
self._update_downloading_paused_button = ClientGUICommon.BetterBitmapButton( self._network_panel, CC.global_pixmaps().pause, self._PausePlayUpdateDownloading )
@ -2701,6 +2705,11 @@ class ReviewServiceRepositorySubPanel( QW.QWidget ):
self._reset_processing_button.hide()
if not self._service.GetServiceType() == HC.TAG_REPOSITORY:
self._tag_filter_button.hide()
self._network_panel.Add( self._repo_options_st, CC.FLAGS_EXPAND_PERPENDICULAR )
self._network_panel.Add( self._metadata_st, CC.FLAGS_EXPAND_PERPENDICULAR )
self._network_panel.Add( self._download_progress, CC.FLAGS_EXPAND_PERPENDICULAR )
@ -2709,6 +2718,7 @@ class ReviewServiceRepositorySubPanel( QW.QWidget ):
hbox = QP.HBoxLayout()
QP.AddToLayout( hbox, self._service_info_button, CC.FLAGS_CENTER_PERPENDICULAR )
QP.AddToLayout( hbox, self._tag_filter_button, CC.FLAGS_CENTER_PERPENDICULAR )
QP.AddToLayout( hbox, self._sync_remote_now_button, CC.FLAGS_CENTER_PERPENDICULAR )
QP.AddToLayout( hbox, self._reset_downloading_button, CC.FLAGS_CENTER_PERPENDICULAR )
QP.AddToLayout( hbox, self._export_updates_button, CC.FLAGS_CENTER_PERPENDICULAR )
@ -3031,6 +3041,25 @@ class ReviewServiceRepositorySubPanel( QW.QWidget ):
self._repo_options_st.setText( ', '.join( repo_options_text_components ) )
if self._service.GetServiceType() == HC.TAG_REPOSITORY:
try:
tag_filter = self._service.GetTagFilter()
self._tag_filter_button.setEnabled( True )
tt = 'See which tags this repository accepts. Summary:{}{}'.format( os.linesep * 2, tag_filter.ToPermittedString() )
self._tag_filter_button.setToolTip( tt )
except HydrusExceptions.DataMissing:
self._tag_filter_button.setEnabled( False )
self._tag_filter_button.setToolTip( 'Do not have a tag filter for this repository. Try refreshing your account, or, if your client is old, update it.' )
self._metadata_st.setText( self._service.GetNextUpdateDueString() )
HG.client_controller.CallToThread( self.THREADFetchInfo, self._service )
@ -3146,6 +3175,28 @@ class ReviewServiceRepositorySubPanel( QW.QWidget ):
def _ReviewTagFilter( self ):
try:
tag_filter = self._service.GetTagFilter()
except HydrusExceptions.DataMissing:
return
frame = ClientGUITopLevelWindowsPanels.FrameThatTakesScrollablePanel( self, 'review tag filter' )
message = 'The Tag Repository applies this to all new pending tag mapping uploads. If you upload a mapping that this filter denies, it will be silently discarded serverside. Siblings and parents are not affected.'
namespaces = HG.client_controller.network_engine.domain_manager.GetParserNamespaces()
panel = ClientGUITags.EditTagFilterPanel( frame, tag_filter, namespaces = namespaces, message = message, read_only = True )
frame.SetPanel( panel )
def _SelectContentTypes( self ):
choice_tuples = [ ( HC.content_type_string_lookup[ content_type ], content_type, False ) for content_type in self._content_types_to_gauges_and_buttons.keys() ]


@ -418,6 +418,16 @@ class SimpleSubPanel( QW.QWidget ):
self._simple_actions.addItem( display_string, data )
#
self._duplicates_type_panel = QW.QWidget( self )
choices = [ ( HC.duplicate_type_string_lookup[ t ], t ) for t in ( HC.DUPLICATE_MEMBER, HC.DUPLICATE_ALTERNATE, HC.DUPLICATE_FALSE_POSITIVE, HC.DUPLICATE_POTENTIAL ) ]
self._duplicate_type = ClientGUICommon.BetterRadioBox( self._duplicates_type_panel, choices = choices )
#
self._seek_panel = QW.QWidget( self )
choices = [
@ -439,14 +449,21 @@ class SimpleSubPanel( QW.QWidget ):
#
vbox = QP.VBoxLayout()
QP.AddToLayout( vbox, self._duplicate_type, CC.FLAGS_EXPAND_BOTH_WAYS )
self._duplicates_type_panel.setLayout( vbox )
#
hbox = QP.HBoxLayout()
QP.AddToLayout( hbox, self._seek_direction, CC.FLAGS_CENTER )
QP.AddToLayout( hbox, self._seek_direction, CC.FLAGS_EXPAND_BOTH_WAYS )
QP.AddToLayout( hbox, self._seek_duration_s, CC.FLAGS_CENTER )
QP.AddToLayout( hbox, ClientGUICommon.BetterStaticText( self._seek_panel, label = 's' ), CC.FLAGS_CENTER )
QP.AddToLayout( hbox, self._seek_duration_ms, CC.FLAGS_CENTER )
QP.AddToLayout( hbox, ClientGUICommon.BetterStaticText( self._seek_panel, label = 'ms' ), CC.FLAGS_CENTER )
hbox.addStretch( 1 )
self._seek_panel.setLayout( hbox )
@ -455,6 +472,7 @@ class SimpleSubPanel( QW.QWidget ):
vbox = QP.VBoxLayout()
QP.AddToLayout( vbox, self._simple_actions, CC.FLAGS_EXPAND_SIZER_BOTH_WAYS )
QP.AddToLayout( vbox, self._duplicates_type_panel, CC.FLAGS_EXPAND_SIZER_BOTH_WAYS )
QP.AddToLayout( vbox, self._seek_panel, CC.FLAGS_EXPAND_SIZER_BOTH_WAYS )
self.setLayout( vbox )
@ -468,6 +486,7 @@ class SimpleSubPanel( QW.QWidget ):
action = self._simple_actions.GetValue()
self._duplicates_type_panel.setVisible( action == CAC.SIMPLE_SHOW_DUPLICATES )
self._seek_panel.setVisible( action == CAC.SIMPLE_MEDIA_SEEK_DELTA )
@ -481,7 +500,13 @@ class SimpleSubPanel( QW.QWidget ):
else:
if action == CAC.SIMPLE_MEDIA_SEEK_DELTA:
if action == CAC.SIMPLE_SHOW_DUPLICATES:
duplicate_type = self._duplicate_type.GetValue()
simple_data = duplicate_type
elif action == CAC.SIMPLE_MEDIA_SEEK_DELTA:
direction = self._seek_direction.GetValue()
@ -505,7 +530,13 @@ class SimpleSubPanel( QW.QWidget ):
self._simple_actions.SetValue( action )
if action == CAC.SIMPLE_MEDIA_SEEK_DELTA:
if action == CAC.SIMPLE_SHOW_DUPLICATES:
duplicate_type = command.GetSimpleData()
self._duplicate_type.SetValue( duplicate_type )
elif action == CAC.SIMPLE_MEDIA_SEEK_DELTA:
( direction, ms ) = command.GetSimpleData()


@ -683,11 +683,6 @@ class ImportFolder( HydrusSerialisable.SerialisableBaseNamed ):
break
if path.endswith( '.txt' ):
continue
file_seed = ClientImportFileSeeds.FileSeed( ClientImportFileSeeds.FILE_SEED_TYPE_HDD, path )
if not self._file_seed_cache.HasFileSeed( file_seed ):
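The removed lines above show the old import-folder check, which only skipped `.txt` paths; per the changelog, `.txt`, `.json`, and `.xml` suffixes are now ignored earlier, in the file scanning phase. A minimal sketch of the broadened filter (the constant and function names here are illustrative, not the client's actual ones):

```python
IGNORED_SIDECAR_SUFFIXES = ( '.txt', '.json', '.xml' )

def filter_scanned_paths( paths ):
    
    # str.endswith accepts a tuple, so one call covers all three sidecar suffixes
    return [ path for path in paths if not path.lower().endswith( IGNORED_SIDECAR_SUFFIXES ) ]
    
```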


@ -193,12 +193,25 @@ def GetDuplicateComparisonStatements( shown_media, comparison_media ):
score = 0
if s_size > c_size:
sign = '+'
percentage_difference = ( s_size / c_size ) - 1.0
else:
sign = ''
percentage_difference = ( s_size / c_size ) - 1.0
percentage_different_string = ' ({}{})'.format( sign, HydrusData.ConvertFloatToPercentage( percentage_difference ) )
if is_a_pixel_dupe:
score = 0
statement = '{} {} {}'.format( HydrusData.ToHumanBytes( s_size ), operator, HydrusData.ToHumanBytes( c_size ) )
statement = '{} {} {}{}'.format( HydrusData.ToHumanBytes( s_size ), operator, HydrusData.ToHumanBytes( c_size ), percentage_different_string )
statements_and_scores[ 'filesize' ] = ( statement, score )
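The changelog's "(+50%, -33%)" experiment can be reproduced with a small sketch of the computation above (the helper name is hypothetical, and simple `{:.0f}` formatting stands in for `HydrusData.ConvertFloatToPercentage`):

```python
def percentage_difference_string( s_size, c_size ):
    
    percentage_difference = ( s_size / c_size ) - 1.0
    
    # positive differences get an explicit '+'; negative values already carry '-'
    sign = '+' if s_size > c_size else ''
    
    return '({}{:.0f}%)'.format( sign, percentage_difference * 100 )
    
```

So a 150KB file against a 100KB duplicate reads "(+50%)", and the reverse comparison reads "(-33%)", matching the asymmetry noted in the changelog.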


@ -81,6 +81,7 @@ class HydrusServiceClientAPI( HydrusClientService ):
get_files.putChild( b'search_files', ClientLocalServerResources.HydrusResourceClientAPIRestrictedGetFilesSearchFiles( self._service, self._client_requests_domain ) )
get_files.putChild( b'file_metadata', ClientLocalServerResources.HydrusResourceClientAPIRestrictedGetFilesFileMetadata( self._service, self._client_requests_domain ) )
get_files.putChild( b'file_hashes', ClientLocalServerResources.HydrusResourceClientAPIRestrictedGetFilesFileHashes( self._service, self._client_requests_domain ) )
get_files.putChild( b'file', ClientLocalServerResources.HydrusResourceClientAPIRestrictedGetFilesGetFile( self._service, self._client_requests_domain ) )
get_files.putChild( b'thumbnail', ClientLocalServerResources.HydrusResourceClientAPIRestrictedGetFilesGetThumbnail( self._service, self._client_requests_domain ) )


@ -57,7 +57,7 @@ LOCAL_BOORU_JSON_BYTE_LIST_PARAMS = set()
CLIENT_API_INT_PARAMS = { 'file_id', 'file_sort_type' }
CLIENT_API_BYTE_PARAMS = { 'hash', 'destination_page_key', 'page_key', 'Hydrus-Client-API-Access-Key', 'Hydrus-Client-API-Session-Key', 'tag_service_key', 'file_service_key' }
CLIENT_API_STRING_PARAMS = { 'name', 'url', 'domain', 'search', 'file_service_name', 'tag_service_name', 'reason', 'tag_display_type' }
CLIENT_API_STRING_PARAMS = { 'name', 'url', 'domain', 'search', 'file_service_name', 'tag_service_name', 'reason', 'tag_display_type', 'source_hash_type', 'desired_hash_type' }
CLIENT_API_JSON_PARAMS = { 'basic_permissions', 'system_inbox', 'system_archive', 'tags', 'file_ids', 'only_return_identifiers', 'only_return_basic_information', 'create_new_file_ids', 'detailed_url_information', 'hide_service_names_tags', 'hide_service_keys_tags', 'simple', 'file_sort_asc', 'return_hashes', 'return_file_ids', 'include_notes', 'notes', 'note_names', 'doublecheck_file_system' }
CLIENT_API_JSON_BYTE_LIST_PARAMS = { 'hashes' }
CLIENT_API_JSON_BYTE_DICT_PARAMS = { 'service_keys_to_tags', 'service_keys_to_actions_to_tags', 'service_keys_to_additional_tags' }
@ -2354,6 +2354,64 @@ class HydrusResourceClientAPIRestrictedGetFilesGetFile( HydrusResourceClientAPIR
return response_context
class HydrusResourceClientAPIRestrictedGetFilesFileHashes( HydrusResourceClientAPIRestrictedGetFiles ):
def _threadDoGETJob( self, request: HydrusServerRequest.HydrusRequest ):
supported_hash_types = ( 'sha256', 'md5', 'sha1', 'sha512' )
source_hash_type = request.parsed_request_args.GetValue( 'source_hash_type', str, default_value = 'sha256' )
if source_hash_type not in supported_hash_types:
raise HydrusExceptions.BadRequestException( 'I do not support that hash type!' )
desired_hash_type = request.parsed_request_args.GetValue( 'desired_hash_type', str )
if desired_hash_type not in supported_hash_types:
raise HydrusExceptions.BadRequestException( 'I do not support that hash type!' )
source_hashes = set()
if 'hash' in request.parsed_request_args:
request_hash = request.parsed_request_args.GetValue( 'hash', bytes )
source_hashes.add( request_hash )
if 'hashes' in request.parsed_request_args:
request_hashes = request.parsed_request_args.GetValue( 'hashes', list, expected_list_type = bytes )
source_hashes.update( request_hashes )
if len( source_hashes ) == 0:
raise HydrusExceptions.BadRequestException( 'You have to specify a hash to look up!' )
CheckHashLength( source_hashes, hash_type = source_hash_type )
source_to_desired = HG.client_controller.Read( 'file_hashes', source_hashes, source_hash_type, desired_hash_type )
encoded_source_to_desired = { source_hash.hex() : desired_hash.hex() for ( source_hash, desired_hash ) in source_to_desired.items() }
body_dict = {
'hashes' : encoded_source_to_desired
}
body = Dumps( body_dict, request.preferred_mime )
response_context = HydrusServerResources.ResponseContext( 200, mime = request.preferred_mime, body = body )
return response_context
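A hedged sketch of calling the new `/get_files/file_hashes` endpoint from the client side. The path and parameter names come from this diff; the base address is the Client API's default port, and since `hashes` sits in `CLIENT_API_JSON_BYTE_LIST_PARAMS`, it is assumed to travel as a percent-encoded JSON list of hex strings:

```python
import json
from urllib.parse import urlencode

def build_file_hashes_url( hashes, source_hash_type = 'sha256', desired_hash_type = 'md5', base = 'http://127.0.0.1:45869' ):
    
    # 'hashes' is JSON-encoded before percent-encoding, like other JSON GET params
    params = urlencode( {
        'hashes' : json.dumps( hashes ),
        'source_hash_type' : source_hash_type,
        'desired_hash_type' : desired_hash_type
    } )
    
    return '{}/get_files/file_hashes?{}'.format( base, params )
    
```

The response's `hashes` object then maps each source hex hash to its desired-type counterpart, per `encoded_source_to_desired` above.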
class HydrusResourceClientAPIRestrictedGetFilesFileMetadata( HydrusResourceClientAPIRestrictedGetFiles ):
def _threadDoGETJob( self, request: HydrusServerRequest.HydrusRequest ):


@ -72,7 +72,7 @@ class URLClass( HydrusSerialisable.SerialisableBaseNamed ):
SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_URL_CLASS
SERIALISABLE_NAME = 'URL Class'
SERIALISABLE_VERSION = 11
SERIALISABLE_VERSION = 12
def __init__(
self,
@ -156,6 +156,8 @@ class URLClass( HydrusSerialisable.SerialisableBaseNamed ):
self._match_subdomains = False
self._keep_matched_subdomains = False
self._alphabetise_get_parameters = True
self._no_more_path_components_than_this = False
self._no_more_parameters_than_this = False
self._can_produce_multiple_files = False
self._should_be_associated_with_files = True
self._keep_fragment = False
@ -296,7 +298,7 @@ class URLClass( HydrusSerialisable.SerialisableBaseNamed ):
serialisable_api_lookup_converter = self._api_lookup_converter.GetSerialisableTuple()
serialisable_referral_url_converter = self._referral_url_converter.GetSerialisableTuple()
booleans = ( self._match_subdomains, self._keep_matched_subdomains, self._alphabetise_get_parameters, self._can_produce_multiple_files, self._should_be_associated_with_files, self._keep_fragment )
booleans = ( self._match_subdomains, self._keep_matched_subdomains, self._alphabetise_get_parameters, self._no_more_path_components_than_this, self._no_more_parameters_than_this, self._can_produce_multiple_files, self._should_be_associated_with_files, self._keep_fragment )
return (
serialisable_url_class_key,
@ -341,7 +343,7 @@ class URLClass( HydrusSerialisable.SerialisableBaseNamed ):
self._example_url
) = serialisable_info
( self._match_subdomains, self._keep_matched_subdomains, self._alphabetise_get_parameters, self._can_produce_multiple_files, self._should_be_associated_with_files, self._keep_fragment ) = booleans
( self._match_subdomains, self._keep_matched_subdomains, self._alphabetise_get_parameters, self._no_more_path_components_than_this, self._no_more_parameters_than_this, self._can_produce_multiple_files, self._should_be_associated_with_files, self._keep_fragment ) = booleans
self._url_class_key = bytes.fromhex( serialisable_url_class_key )
self._path_components = [ ( HydrusSerialisable.CreateFromSerialisableTuple( serialisable_string_match ), default ) for ( serialisable_string_match, default ) in serialisable_path_components ]
@ -514,6 +516,58 @@ class URLClass( HydrusSerialisable.SerialisableBaseNamed ):
return ( 11, new_serialisable_info )
if version == 11:
(
serialisable_url_class_key,
url_type,
preferred_scheme,
netloc,
booleans,
serialisable_path_components,
serialisable_parameters,
has_single_value_parameters,
serialisable_single_value_parameters_match,
serialisable_header_overrides,
serialisable_api_lookup_converter,
send_referral_url,
serialisable_referrel_url_converter,
gallery_index_type,
gallery_index_identifier,
gallery_index_delta,
example_url
) = old_serialisable_info
( match_subdomains, keep_matched_subdomains, alphabetise_get_parameters, can_produce_multiple_files, should_be_associated_with_files, keep_fragment ) = booleans
no_more_path_components_than_this = False
no_more_parameters_than_this = False
booleans = ( match_subdomains, keep_matched_subdomains, alphabetise_get_parameters, no_more_path_components_than_this, no_more_parameters_than_this, can_produce_multiple_files, should_be_associated_with_files, keep_fragment )
new_serialisable_info = (
serialisable_url_class_key,
url_type,
preferred_scheme,
netloc,
booleans,
serialisable_path_components,
serialisable_parameters,
has_single_value_parameters,
serialisable_single_value_parameters_match,
serialisable_header_overrides,
serialisable_api_lookup_converter,
send_referral_url,
serialisable_referrel_url_converter,
gallery_index_type,
gallery_index_identifier,
gallery_index_delta,
example_url
)
return ( 12, new_serialisable_info )
def AlphabetiseGetParameters( self ):
@ -824,6 +878,16 @@ class URLClass( HydrusSerialisable.SerialisableBaseNamed ):
return r.geturl()
def NoMorePathComponentsThanThis( self ) -> bool:
return self._no_more_path_components_than_this
def NoMoreParametersThanThis( self ) -> bool:
return self._no_more_parameters_than_this
def RefersToOneFile( self ):
is_a_direct_file_page = self._url_type == HC.URL_TYPE_FILE
@ -853,6 +917,16 @@ class URLClass( HydrusSerialisable.SerialisableBaseNamed ):
self._example_url = example_url
def SetNoMorePathComponentsThanThis( self, no_more: bool ):
self._no_more_path_components_than_this = no_more
def SetNoMoreParametersThanThis( self, no_more: bool ):
self._no_more_parameters_than_this = no_more
def SetSingleValueParameterData( self, has_single_value_parameters: bool, single_value_parameters_string_match: ClientStrings.StringMatch ):
self._has_single_value_parameters = has_single_value_parameters
@ -910,6 +984,11 @@ class URLClass( HydrusSerialisable.SerialisableBaseNamed ):
url_path_components = url_path.split( '/' )
if len( url_path_components ) > len( self._path_components ) and self._no_more_path_components_than_this:
raise HydrusExceptions.URLClassException( '"{}" has {} path components, but I will not allow more than my defined {}!'.format( url_path, len( url_path_components ), len( self._path_components ) ) )
for ( index, ( string_match, default ) ) in enumerate( self._path_components ):
if len( url_path_components ) > index:
@ -942,6 +1021,11 @@ class URLClass( HydrusSerialisable.SerialisableBaseNamed ):
( url_parameters, single_value_parameters, param_order ) = ClientNetworkingFunctions.ConvertQueryTextToDict( p.query )
if len( url_parameters ) > len( self._parameters ) and self._no_more_parameters_than_this:
raise HydrusExceptions.URLClassException( '"{}" has {} parameters, but I will not allow more than my defined {}!'.format( url_path, len( url_parameters ), len( self._parameters ) ) )
for ( key, ( string_match, default ) ) in self._parameters.items():
if key not in url_parameters:
@ -968,6 +1052,11 @@ class URLClass( HydrusSerialisable.SerialisableBaseNamed ):
if len( single_value_parameters ) > 0 and not self._has_single_value_parameters and self._no_more_parameters_than_this:
raise HydrusExceptions.URLClassException( '"{}" has unexpected single-value parameters, but I am set not to allow any unexpected parameters!'.format( url_path ) )
if self._has_single_value_parameters:
if len( single_value_parameters ) == 0:
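The new 'do not allow any extra path components' checkbox above can be illustrated in isolation. This is a minimal sketch under the same rule, with a boolean return standing in for the `URLClassException` raise:

```python
def path_matches( url_path, defined_components, no_more_path_components_than_this ):
    
    url_path_components = url_path.split( '/' )
    
    if no_more_path_components_than_this and len( url_path_components ) > len( defined_components ):
        
        return False # the real code raises URLClassException here
        
    
    return True
    
```

With the box checked, a testee URL that is 'longer' than the class definition no longer matches, which is the fix for path-nested URLs matching the wrong URL Class.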


@ -80,8 +80,8 @@ options = {}
# Misc
NETWORK_VERSION = 20
SOFTWARE_VERSION = 507
CLIENT_API_VERSION = 36
SOFTWARE_VERSION = 508
CLIENT_API_VERSION = 37
SERVER_THUMBNAIL_DIMENSIONS = ( 200, 200 )
@ -266,6 +266,11 @@ EXPORT_FOLDER_TYPE_SYNCHRONISE = 1
FILTER_WHITELIST = 0
FILTER_BLACKLIST = 1
filter_black_white_str_lookup = {
FILTER_WHITELIST : 'whitelist',
FILTER_BLACKLIST : 'blacklist'
}
HYDRUS_CLIENT = 0
HYDRUS_SERVER = 1
HYDRUS_TEST = 2


@ -6,6 +6,7 @@ import threading
from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusExceptions
from hydrus.core import HydrusSerialisable
from hydrus.core import HydrusData
from hydrus.core import HydrusText
def CensorshipMatch( tag, censorships ):
@ -344,6 +345,8 @@ class TagFilter( HydrusSerialisable.SerialisableBase ):
SERIALISABLE_NAME = 'Tag Filter Rules'
SERIALISABLE_VERSION = 1
WOAH_TOO_MANY_RULES_THRESHOLD = 12
def __init__( self ):
HydrusSerialisable.SerialisableBase.__init__( self )
@ -669,6 +672,68 @@ class TagFilter( HydrusSerialisable.SerialisableBase ):
def GetChanges( self, old_tag_filter: "TagFilter" ):
old_slices_to_rules = old_tag_filter.GetTagSlicesToRules()
new_rules = [ ( slice, rule ) for ( slice, rule ) in self._tag_slices_to_rules.items() if slice not in old_slices_to_rules ]
changed_rules = [ ( slice, rule ) for ( slice, rule ) in self._tag_slices_to_rules.items() if slice in old_slices_to_rules and rule != old_slices_to_rules[ slice ] ]
deleted_rules = [ ( slice, rule ) for ( slice, rule ) in old_slices_to_rules.items() if slice not in self._tag_slices_to_rules ]
return ( new_rules, changed_rules, deleted_rules )
def GetChangesSummaryText( self, old_tag_filter: "TagFilter" ):
( new_rules, changed_rules, deleted_rules ) = self.GetChanges( old_tag_filter )
summary_components = []
if len( new_rules ) > 0:
if len( new_rules ) > self.WOAH_TOO_MANY_RULES_THRESHOLD:
summary_components.append( 'Added {} rules'.format( HydrusData.ToHumanInt( len( new_rules ) ) ) )
else:
rows = [ 'Added rule: {} - {}'.format( HC.filter_black_white_str_lookup[ rule ], ConvertTagSliceToString( slice ) ) for ( slice, rule ) in new_rules ]
summary_components.append( os.linesep.join( rows ) )
if len( changed_rules ) > 0:
if len( changed_rules ) > self.WOAH_TOO_MANY_RULES_THRESHOLD:
summary_components.append( 'Changed {} rules'.format( HydrusData.ToHumanInt( len( changed_rules ) ) ) )
else:
rows = [ 'Flipped rule: to {} - {}'.format( HC.filter_black_white_str_lookup[ rule ], ConvertTagSliceToString( slice ) ) for ( slice, rule ) in changed_rules ]
summary_components.append( os.linesep.join( rows ) )
if len( deleted_rules ) > 0:
if len( deleted_rules ) > self.WOAH_TOO_MANY_RULES_THRESHOLD:
summary_components.append( 'Deleted {} rules'.format( HydrusData.ToHumanInt( len( deleted_rules ) ) ) )
else:
rows = [ 'Deleted rule: {} - {}'.format( HC.filter_black_white_str_lookup[ rule ], ConvertTagSliceToString( slice ) ) for ( slice, rule ) in deleted_rules ]
summary_components.append( os.linesep.join( rows ) )
return os.linesep.join( summary_components )
def GetTagSlicesToRules( self ):
with self._lock:
@ -729,12 +794,26 @@ class TagFilter( HydrusSerialisable.SerialisableBase ):
else:
text = 'blacklisting on ' + ', '.join( ( ConvertTagSliceToString( tag_slice ) for tag_slice in blacklist ) )
if len( blacklist ) > self.WOAH_TOO_MANY_RULES_THRESHOLD:
text = 'blacklisting on {} rules'.format( HydrusData.ToHumanInt( len( blacklist ) ) )
else:
text = 'blacklisting on ' + ', '.join( ( ConvertTagSliceToString( tag_slice ) for tag_slice in blacklist ) )
if len( whitelist ) > 0:
text += ' except ' + ', '.join( ( ConvertTagSliceToString( tag_slice ) for tag_slice in whitelist ) )
if len( whitelist ) > self.WOAH_TOO_MANY_RULES_THRESHOLD:
text += ' except {} other rules'.format( HydrusData.ToHumanInt( len( whitelist ) ) )
else:
text += ' except ' + ', '.join( ( ConvertTagSliceToString( tag_slice ) for tag_slice in whitelist ) )
return text
@ -776,12 +855,26 @@ class TagFilter( HydrusSerialisable.SerialisableBase ):
else:
text = 'all but ' + ', '.join( ( ConvertTagSliceToString( tag_slice ) for tag_slice in blacklist ) ) + ' allowed'
if len( blacklist ) > self.WOAH_TOO_MANY_RULES_THRESHOLD:
text = 'all but {} rules allowed'.format( HydrusData.ToHumanInt( len( blacklist ) ) )
else:
text = 'all but ' + ', '.join( ( ConvertTagSliceToString( tag_slice ) for tag_slice in blacklist ) ) + ' allowed'
if len( whitelist ) > 0:
text += ' except ' + ', '.join( ( ConvertTagSliceToString( tag_slice ) for tag_slice in whitelist ) )
if len( whitelist ) > self.WOAH_TOO_MANY_RULES_THRESHOLD:
text += ' except for {} other rules'.format( HydrusData.ToHumanInt( len( whitelist ) ) )
else:
text += ' except ' + ', '.join( ( ConvertTagSliceToString( tag_slice ) for tag_slice in whitelist ) )
return text
@ -796,7 +889,7 @@ class TagFilter( HydrusSerialisable.SerialisableBase ):
blacklist = []
whitelist = []
for ( tag_slice, rule ) in list(self._tag_slices_to_rules.items()):
for ( tag_slice, rule ) in self._tag_slices_to_rules.items():
if rule == HC.FILTER_BLACKLIST:
@ -823,6 +916,10 @@ class TagFilter( HydrusSerialisable.SerialisableBase ):
text = 'no tags'
elif len( whitelist ) > self.WOAH_TOO_MANY_RULES_THRESHOLD:
text = '{} rules that allow'.format( HydrusData.ToHumanInt( len( whitelist ) ) )
else:
text = 'only ' + ', '.join( ( ConvertTagSliceToString( tag_slice ) for tag_slice in whitelist ) )
@ -832,7 +929,11 @@ class TagFilter( HydrusSerialisable.SerialisableBase ):
text = 'all namespaced tags'
if len( whitelist ) > 0:
if len( whitelist ) > self.WOAH_TOO_MANY_RULES_THRESHOLD:
text += ' and {} other rules'.format( HydrusData.ToHumanInt( len( whitelist ) ) )
elif len( whitelist ) > 0:
text += ' and ' + ', '.join( ( ConvertTagSliceToString( tag_slice ) for tag_slice in whitelist ) )
@ -841,16 +942,31 @@ class TagFilter( HydrusSerialisable.SerialisableBase ):
text = 'all unnamespaced tags'
if len( whitelist ) > 0:
if len( whitelist ) > self.WOAH_TOO_MANY_RULES_THRESHOLD:
text += ' and {} other rules'.format( HydrusData.ToHumanInt( len( whitelist ) ) )
elif len( whitelist ) > 0:
text += ' and ' + ', '.join( ( ConvertTagSliceToString( tag_slice ) for tag_slice in whitelist ) )
else:
text = 'all tags except ' + ', '.join( ( ConvertTagSliceToString( tag_slice ) for tag_slice in blacklist ) )
if len( blacklist ) > self.WOAH_TOO_MANY_RULES_THRESHOLD:
text = 'all tags except {} other rules'.format( HydrusData.ToHumanInt( len( blacklist ) ) )
else:
text = 'all tags except ' + ', '.join( ( ConvertTagSliceToString( tag_slice ) for tag_slice in blacklist ) )
if len( whitelist ) > 0:
if len( whitelist ) > self.WOAH_TOO_MANY_RULES_THRESHOLD:
text += ' (except {} other rules)'.format( HydrusData.ToHumanInt( len( whitelist ) ) )
elif len( whitelist ) > 0:
text += ' (except ' + ', '.join( ( ConvertTagSliceToString( tag_slice ) for tag_slice in whitelist ) ) + ')'
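The rule-diffing behind `GetChanges` reduces to a plain comparison of two tag-slice-to-rule dicts; a standalone sketch (hypothetical function, not the hydrus class method):

```python
# Sketch of TagFilter.GetChanges: compare old and new
# tag-slice -> rule dicts and report added/changed/deleted rules.

def get_rule_changes( old_rules, new_rules ):
    
    added = [ ( s, r ) for ( s, r ) in new_rules.items() if s not in old_rules ]
    changed = [ ( s, r ) for ( s, r ) in new_rules.items() if s in old_rules and r != old_rules[ s ] ]
    deleted = [ ( s, r ) for ( s, r ) in old_rules.items() if s not in new_rules ]
    
    return ( added, changed, deleted )
    
```

Note a 'changed' rule reports the new rule value, which is what the summary text then describes as a flip.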

View File

@ -37,8 +37,8 @@ def GenerateDefaultServiceDictionary( service_type ):
update_period = 100000
dictionary[ 'service_options' ][ 'update_period' ] = update_period
dictionary[ 'service_options' ][ 'nullification_period' ] = 90 * 86400
dictionary[ 'nullification_period' ] = 90 * 86400
dictionary[ 'next_nullification_update_index' ] = 0
metadata = Metadata()
@ -1201,6 +1201,33 @@ class ClientToServerUpdate( HydrusSerialisable.SerialisableBase ):
self._actions_to_contents_and_reasons[ action ].append( ( content, reason ) )
def ApplyTagFilterToPendingMappings( self, tag_filter: HydrusTags.TagFilter ):
if HC.CONTENT_UPDATE_PEND in self._actions_to_contents_and_reasons:
contents_and_reasons = self._actions_to_contents_and_reasons[ HC.CONTENT_UPDATE_PEND ]
new_contents_and_reasons = []
for ( content, reason ) in contents_and_reasons:
if content.GetContentType() == HC.CONTENT_TYPE_MAPPINGS:
( tag, hashes ) = content.GetContentData()
if not tag_filter.TagOK( tag ):
continue
new_contents_and_reasons.append( ( content, reason ) )
self._actions_to_contents_and_reasons[ HC.CONTENT_UPDATE_PEND ] = new_contents_and_reasons
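The pend-filtering step can be sketched independently; this simplified version assumes every content is a mapping and takes a plain callable in place of the real `TagFilter.TagOK`:

```python
# Sketch of ApplyTagFilterToPendingMappings: drop pending mapping
# contents whose tag fails the service's tag filter, keep the rest.

def filter_pending_mappings( contents_and_reasons, tag_ok ):
    
    kept = []
    
    for ( content, reason ) in contents_and_reasons:
        
        ( tag, hashes ) = content
        
        if not tag_ok( tag ):
            
            continue # the repository's tag filter rejects this tag
            
        
        kept.append( ( content, reason ) )
        
    
    return kept
    
```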
def GetClientsideContentUpdates( self ):
content_updates = []
@ -1238,7 +1265,7 @@ class ClientToServerUpdate( HydrusSerialisable.SerialisableBase ):
hashes = set()
for contents_and_reasons in list(self._actions_to_contents_and_reasons.values()):
for contents_and_reasons in self._actions_to_contents_and_reasons.values():
for ( content, reason ) in contents_and_reasons:
@ -2675,22 +2702,6 @@ class ServerServiceRestricted( ServerService ):
def GetServiceOptions( self ):
with self._lock:
return self._service_options
def SetServiceOptions( self, service_options: HydrusSerialisable.SerialisableDictionary ):
with self._lock:
self._service_options = service_options
class ServerServiceRepository( ServerServiceRestricted ):
def _GetSerialisableDictionary( self ):
@ -2712,9 +2723,20 @@ class ServerServiceRepository( ServerServiceRestricted ):
self._service_options[ 'update_period' ] = 100000
if 'nullification_period' in dictionary:
default_nullification_period = dictionary[ 'nullification_period' ]
del dictionary[ 'nullification_period' ]
else:
default_nullification_period = 90 * 86400
if 'nullification_period' not in self._service_options:
self._service_options[ 'nullification_period' ] = 90 * 86400
self._service_options[ 'nullification_period' ] = default_nullification_period
if 'next_nullification_update_index' not in dictionary:
@ -2873,8 +2895,8 @@ class ServerServiceRepository( ServerServiceRestricted ):
self._SetDirty()
HG.server_controller.pub( 'notify_new_nullification' )
HG.server_controller.pub( 'notify_new_nullification' )
def SetUpdatePeriod( self, update_period: int ):
@ -2887,8 +2909,8 @@ class ServerServiceRepository( ServerServiceRestricted ):
self._SetDirty()
HG.server_controller.pub( 'notify_new_repo_sync' )
HG.server_controller.pub( 'notify_new_repo_sync' )
def Sync( self ):
@ -2968,6 +2990,25 @@ class ServerServiceRepositoryTag( ServerServiceRepository ):
def GetTagFilter( self ) -> HydrusTags.TagFilter:
with self._lock:
return self._service_options[ 'tag_filter' ]
def SetTagFilter( self, tag_filter: HydrusTags.TagFilter ):
with self._lock:
self._service_options[ 'tag_filter' ] = tag_filter
self._SetDirty()
class ServerServiceRepositoryFile( ServerServiceRepository ):
def _GetSerialisableDictionary( self ):

View File

@ -92,5 +92,12 @@ class HydrusServiceRepositoryFile( HydrusServiceRepository ):
class HydrusServiceRepositoryTag( HydrusServiceRepository ):
pass
def _InitRoot( self ):
root = HydrusServiceRepository._InitRoot( self )
root.putChild( b'tag_filter', ServerServerResources.HydrusResourceRestrictedTagFilter( self._service, HydrusServer.REMOTE_DOMAIN ) )
return root

View File

@ -1,4 +1,5 @@
import http.cookies
import os
import threading
import time
@ -8,6 +9,7 @@ from hydrus.core import HydrusExceptions
from hydrus.core import HydrusGlobals as HG
from hydrus.core import HydrusPaths
from hydrus.core import HydrusSerialisable
from hydrus.core import HydrusTags
from hydrus.core import HydrusTemp
from hydrus.core.networking import HydrusNetwork
from hydrus.core.networking import HydrusNetworkVariableHandling
@ -366,7 +368,22 @@ class HydrusResourceRestrictedOptions( HydrusResourceRestricted ):
def _threadDoGETJob( self, request: HydrusServerRequest.HydrusRequest ):
service_options = self._service.GetServiceOptions()
# originally I fetched and dumped the serialisabledict here straight from the service
# buuuut, of course if I update the tag filter object version, we can't just spit that out to the network and expect old clients to be ok
# so now this is just 'get the primitives' request, and it comes back as a straight up JSON dict
# anything serialisable can be its own request and can have separate deserialisation error handling
if self._service.GetServiceType() in HC.REPOSITORIES:
service_options = {
'update_period' : self._service.GetUpdatePeriod(),
'nullification_period' : self._service.GetNullificationPeriod()
}
else:
service_options = {}
body = HydrusNetworkVariableHandling.DumpHydrusArgsToNetworkBytes( { 'service_options' : service_options } )
@ -375,6 +392,7 @@ class HydrusResourceRestrictedOptions( HydrusResourceRestricted ):
return response_context
class HydrusResourceRestrictedOptionsModify( HydrusResourceRestricted ):
def _checkAccountPermissions( self, request: HydrusServerRequest.HydrusRequest ):
@ -382,6 +400,7 @@ class HydrusResourceRestrictedOptionsModify( HydrusResourceRestricted ):
request.hydrus_account.CheckPermission( HC.CONTENT_TYPE_OPTIONS, HC.PERMISSION_ACTION_MODERATE )
class HydrusResourceRestrictedOptionsModifyNullificationPeriod( HydrusResourceRestrictedOptionsModify ):
def _threadDoPOSTJob( self, request: HydrusServerRequest.HydrusRequest ):
@ -405,7 +424,7 @@ class HydrusResourceRestrictedOptionsModifyNullificationPeriod( HydrusResourceRe
self._service.SetNullificationPeriod( nullification_period )
HydrusData.Print(
'Account {} changed the anonymisation period to from "{}" to "{}".'.format(
'Account {} changed the anonymisation period from "{}" to "{}".'.format(
request.hydrus_account.GetAccountKey().hex(),
HydrusData.TimeDeltaToPrettyTimeDelta( old_nullification_period ),
HydrusData.TimeDeltaToPrettyTimeDelta( nullification_period )
@ -418,6 +437,7 @@ class HydrusResourceRestrictedOptionsModifyNullificationPeriod( HydrusResourceRe
return response_context
class HydrusResourceRestrictedOptionsModifyUpdatePeriod( HydrusResourceRestrictedOptionsModify ):
def _threadDoPOSTJob( self, request: HydrusServerRequest.HydrusRequest ):
@ -441,7 +461,7 @@ class HydrusResourceRestrictedOptionsModifyUpdatePeriod( HydrusResourceRestricte
self._service.SetUpdatePeriod( update_period )
HydrusData.Print(
'Account {} changed the update period to from "{}" to "{}".'.format(
'Account {} changed the update period from "{}" to "{}".'.format(
request.hydrus_account.GetAccountKey().hex(),
HydrusData.TimeDeltaToPrettyTimeDelta( old_update_period ),
HydrusData.TimeDeltaToPrettyTimeDelta( update_period )
@ -454,6 +474,7 @@ class HydrusResourceRestrictedOptionsModifyUpdatePeriod( HydrusResourceRestricte
return response_context
class HydrusResourceRestrictedAccountModify( HydrusResourceRestricted ):
def _checkAccountPermissions( self, request: HydrusServerRequest.HydrusRequest ):
@ -1158,6 +1179,75 @@ class HydrusResourceRestrictedServices( HydrusResourceRestricted ):
return response_context
class HydrusResourceRestrictedTagFilter( HydrusResourceRestricted ):
def _checkAccount( self, request: HydrusServerRequest.HydrusRequest ):
if request.IsPOST():
return HydrusResourceRestricted._checkAccount( self, request )
else:
# you can always fetch the tag filter
return request
def _checkAccountPermissions( self, request: HydrusServerRequest.HydrusRequest ):
if request.IsPOST():
request.hydrus_account.CheckPermission( HC.CONTENT_TYPE_OPTIONS, HC.PERMISSION_ACTION_MODERATE )
else:
# you can always fetch the tag filter
pass
def _threadDoGETJob( self, request: HydrusServerRequest.HydrusRequest ):
tag_filter = self._service.GetTagFilter()
body = HydrusNetworkVariableHandling.DumpHydrusArgsToNetworkBytes( { 'tag_filter' : tag_filter } )
response_context = HydrusServerResources.ResponseContext( 200, body = body )
return response_context
def _threadDoPOSTJob( self, request: HydrusServerRequest.HydrusRequest ):
tag_filter = request.parsed_request_args[ 'tag_filter' ]
old_tag_filter = self._service.GetTagFilter()
if old_tag_filter != tag_filter:
self._service.SetTagFilter( tag_filter )
summary_text = tag_filter.GetChangesSummaryText( old_tag_filter )
HydrusData.Print(
'Account {} changed the tag filter. Rule changes are:{}{}.'.format(
request.hydrus_account.GetAccountKey().hex(),
os.linesep,
summary_text
)
)
response_context = HydrusServerResources.ResponseContext( 200 )
return response_context
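The access pattern on this new resource--GET is always allowed, POST requires the moderate-options permission--can be sketched like so (names are illustrative; the real checks live in `_checkAccount`/`_checkAccountPermissions`):

```python
# Sketch of the tag_filter endpoint's permission split:
# anyone may fetch the tag filter, only moderators may set it.

def check_tag_filter_access( method, account_can_moderate_options ):
    
    if method == 'POST':
        
        if not account_can_moderate_options:
            
            raise PermissionError( 'You do not have permission to moderate options on this service!' )
            
        
    
    # GET falls through--you can always fetch the tag filter
    
```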
class HydrusResourceRestrictedUpdate( HydrusResourceRestricted ):
def _checkAccountPermissions( self, request: HydrusServerRequest.HydrusRequest ):
@ -1196,6 +1286,11 @@ class HydrusResourceRestrictedUpdate( HydrusResourceRestricted ):
client_to_server_update = request.parsed_request_args[ 'client_to_server_update' ]
if isinstance( self._service, HydrusNetwork.ServerServiceRepositoryTag ):
client_to_server_update.ApplyTagFilterToPendingMappings( self._service.GetTagFilter() )
timestamp = self._service.GetMetadata().GetNextUpdateBegin() + 1
HG.server_controller.WriteSynchronous( 'update', self._service_key, request.hydrus_account, client_to_server_update, timestamp )

View File

@ -2973,6 +2973,44 @@ class TestClientAPI( unittest.TestCase ):
def _test_file_hashes( self, connection, set_up_permissions ):
api_permissions = set_up_permissions[ 'everything' ]
access_key_hex = api_permissions.GetAccessKey().hex()
headers = { 'Hydrus-Client-API-Access-Key' : access_key_hex }
md5_hash = bytes.fromhex( 'ec5c5a4d7da4be154597e283f0b6663c' )
sha256_hash = bytes.fromhex( '2a0174970defa6f147f2eabba829c5b05aba1f1aea8b978611a07b7bb9cf9399' )
source_to_dest = { md5_hash : sha256_hash }
HG.test_controller.SetRead( 'file_hashes', source_to_dest )
path = '/get_files/file_hashes?source_hash_type=md5&desired_hash_type=sha256&hash={}'.format( md5_hash.hex() )
connection.request( 'GET', path, headers = headers )
response = connection.getresponse()
data = response.read()
text = str( data, 'utf-8' )
self.assertEqual( response.status, 200 )
d = json.loads( text )
expected_answer = {
'hashes' : {
md5_hash.hex() : sha256_hash.hex()
}
}
self.assertEqual( d, expected_answer )
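The query the test sends can be sketched as a small helper that builds the `/get_files/file_hashes` path (a hypothetical convenience function mirroring the hardcoded string above):

```python
# Build the Client API /get_files/file_hashes query path the test
# exercises: translate a hash of one type into another for one hex hash.

def build_file_hashes_path( source_hash_type, desired_hash_type, hash_hex ):
    
    return '/get_files/file_hashes?source_hash_type={}&desired_hash_type={}&hash={}'.format( source_hash_type, desired_hash_type, hash_hex )
    
```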
def _test_file_metadata( self, connection, set_up_permissions ):
# test file metadata
@ -3916,6 +3954,7 @@ class TestClientAPI( unittest.TestCase ):
self._test_search_files_predicate_parsing( connection, set_up_permissions )
self._test_file_hashes( connection, set_up_permissions )
self._test_file_metadata( connection, set_up_permissions )
self._test_get_files( connection, set_up_permissions )
self._test_permission_failures( connection, set_up_permissions )