Version 478

This commit is contained in:
Hydrus Network Developer 2022-03-23 15:57:10 -05:00
parent a09ab83963
commit 6239eef1c5
43 changed files with 1799 additions and 1460 deletions

View File

@@ -3,6 +3,41 @@
!!! note
This is the new changelog, only the most recent builds. For all versions, see the [old changelog](old_changelog.html).
## [Version 478](https://github.com/hydrusnetwork/hydrus/releases/tag/v478)
### misc
* if a file note text is crazy and can't be displayed, this is now handled and the best visual approximation is displayed (and saved back on ok) instead
* fixed an error in the cloudflare problem detection calls for the newer versions of cloudscraper (>=1.2.60) while maintaining support for the older versions. fingers crossed, we also shouldn't repeat this specific error if they refactor again
### file history chart updates
* fixed the 'inbox' line in file history, which has to be calculated in an odd way and was not counting on file imports adding to the inbox
* the file history chart now expands its y axis range to show all data even if deleted_files is huge. we'll see how nice this actually is IRL
* bumped the file history resolution up from 1,000 to 2,000 steps
* the y axis _should_ now show localised numbers, 5,000 instead of 5000, but the method by which this occurs involves fox tongues and the breath of a slighted widow, so it may just not work for some machines
### cleanup, mostly file location stuff
* I believe I have replaced all the remaining surplus static 'my files' references with code compatible with multiple local file services. when I add the capability to create new local file services, there now won't be a problem trying to display thumbnails or generate menu actions etc... if they aren't in 'my files'
* pulled the autocomplete dropdown file domain button code out to its own class and refactored it and the multiple location context panel to their own file
* added a 'default file location' option to 'files and trash' page, and a bunch of dialogs (e.g. the search panel when you make a new export folder) and similar now pull it to initialise. for most users this will stay 'my files' forever, but when we hit multiple local file services, it may want to change
* the file domain override options in 'manage tag display and search' now work on the new location system and support multiple file services
* in downloaders, when highlighting, a database job that does the 'show files' filter (e.g. to include those in trash or not) now works on the new location context system and will handle files that will be imported to places other than my files
* refactored client api file service parsing
* refactored client api hashes parsing
* cleaned a whole heap of misc location code
* cleaned misc basic code across hydrus and client constant files
* gave 'you don't want the server' help page a very quick pass
### client api
* in prep for multiple local file services, delete_files now takes an optional file_service_key or file_service_name. by default, it now deletes from all appropriate local services, so behaviour is unchanged from before without the parameter if you just want to delete m8
* undelete files is the same. when we have multiple local file services, an undelete without a file service will undelete to all locations that have a delete record
* delete_files also now takes an optional 'reason' parameter
* the 'set_notes' command now checks the type of the notes Object. it obviously has to be string-to-string
* the 'get_thumbnail' command should now never 404. if you ask for a pdf thumb, it gives the pdf default thumb, and if there is no thumb for whatever reason, you get the hydrus fallback thumbnail. just like in the client itself
* updated client api help to talk about these
* updated the unit tests to handle them too
* did a pass over the client api help to unify indent style and fix other small formatting issues
* client api version is now 28
## [Version 477](https://github.com/hydrusnetwork/hydrus/releases/tag/v477)
### misc
@@ -272,45 +307,3 @@
* misc cleanup and refactoring for file domain search code
* purged more single file service inspection code from file search systems
* refactored most duplicate files storage code (about 70KB) to a new client db module
## [Version 468](https://github.com/hydrusnetwork/hydrus/releases/tag/v468)
### misc
* fixed an issue where the one pixel border on the new 'control bar' on the media viewer was annoyingly catching mouse events at the bottom of the screen when in borderless fullscreen mode (and hence dragging the video, not scanning video position). the animation scanbar now owns its own border and processes mouse events on it properly
* fixed a typo bug in the new pixel hash system that meant new imports were not being added to the system correctly. on update, all files affected will be fixed of bad data and scheduled for a pixel hash regen. sorry for the trouble, and thank you for the quick reports here
* added a 'fixed font size example' qss file to the install. I have passed this file to others as an example of a quick way to make the font (and essentially ui scale) larger. it has some help comments inside and is now always available. the default example font size is lmao
* fixed another type checking problem for (mostly Arch/AUR) PyQt5 users (issue #1033)
* wrote a new display mappings maintenance routine for the database menu that repopulates the display mappings cache for missing rows. this task will be radically faster than a full regen for some problems, but it only solves those problems
* on boot, the program now explicitly checks if any of the database files are set as read-only and if so will dump out with an appropriate error
* rewrote my various 'file size problem' exception hierarchy to clearly split 'the file import options disallow this big gif' vs 'this file is zero size/this file is malformed'. we've had several problems here recently, but file import options rule-breaking should set 'ignore' again, and import objects should be better about ignore vs fail state from now on
* added more error handling for broken image files. some will report cleaner errors, some will now import
* the new parsing system that discards source urls if they share a domain with a primary import url is now stricter--now discarding only if they share the same url class. the domain check was messing up saving post urls when they were parsed from an api url (issue #1036)
* the network engine no longer sends a Range header if it is expecting to pull html/json, just files. this fixes fetching pages from nijie.info (and several other server engines, it seems), which has some unusual access rules regarding Range and Accept-Encoding
* fixed a problem with no_daemons and the docker package server scripts (issue #1039)
* if the server engine (serverside or client api) is running a request during program shutdown, it now politely says 'Application is shutting down!' with a 503 rather than going bananas and dumping to log with an uncaught 500
* fixed some bad client db update error handling code
### multiple local file services (system predicate edition)
* system:file service now supports 'deleted' and 'petitioned' status
* advanced 'all known files' search pages now show more system predicates
* when inbox and archive are hidden because one has 0 count, and the search space is simple, system everything now says what they are, e.g. "system:everything (24) (all in inbox)"
* file repos' 'system:local/not local' now sort at the top of the system predicate list, like inbox/archive
### client api
* the GET /get_files/file_metadata call now returns the file modified date and imported/deleted timestamps for each file service the file is currently in or deleted from. check the help for an example!
* fixed client api file search with random sort (file_sort_type = 4)
* client api version is now 24
### boring multiple local file services work
* the system predicates listed in a search page dropdown now support the new 'multiple location search context' object, which means in future I will be able to switch over to 'file domain is union of (A, deleted from B, and C)' and all the numbers will add up appropriately with ranged 'x-y' counts and deal with combinations of file repo and local service and current/deleted neatly
* when fetching system preds in 'all known files', the system:everything 'num files' count will be stated if available in the cache
* for the new system:file service search, refactored db level file filtering to support all status types
* cleaned up how system preds are generated
### boring refactoring work
* moved GUGs from network domain code to their own file
* moved URL Class from network domain code to its own file
* moved the pure functions from network domain code to their own file
* cleared up some file maintenance enum variable names
* sped up random file sort for large result sets
* misc client network code cleanup and type hints, and rejiggered cleaner imports after the refactoring

View File

@@ -110,14 +110,13 @@ Arguments: n/a
Response:
: Some simple JSON describing the current api version (and hydrus client version, if you are interested).
```json title="Example response"
{
"version": 17,
"hydrus_version": 441
}
```
### **GET `/request_new_permissions`** { id="request_new_permissions" }
@@ -278,9 +277,9 @@ Required Headers:
Arguments (in JSON):
: - `path`: (the path you want to import)
```json title="Example request body"
{"path": "E:\\to_import\\ayanami.jpg"}
```
Arguments (as bytes):
: You can alternately just send the file's bytes as the POST body.
@@ -323,18 +322,20 @@ Arguments (in JSON):
:
* `hash`: (an SHA256 hash for a file in 64 characters of hexadecimal)
* `hashes`: (a list of SHA256 hashes)
* `file_service_name`: (optional, selective, string, the local file domain from which to delete, or all local files)
* `file_service_key`: (optional, selective, hexadecimal, the local file domain from which to delete, or all local files)
* `reason`: (optional, string, the reason attached to the delete action)
```json title="Example request body"
{"hash": "78f92ba4a786225ee2a1236efa6b7dc81dd729faf4af99f96f3e20bad6d8b538"}
```
Response:
: 200 and no content.
You can use hash or hashes, whichever is more convenient.
If you specify a file service, the file will only be deleted from that location. Only local file domains are allowed (so you can't delete from a file repository or unpin from ipfs yet), but if you specify 'all local files', you should be able to trigger a physical delete if you wish.
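Putting the new optional parameters together, a request body might look like this (the service name and reason strings here are illustrative):
```json title="Example request body with optional parameters"
{
  "hash": "78f92ba4a786225ee2a1236efa6b7dc81dd729faf4af99f96f3e20bad6d8b538",
  "file_service_name": "my files",
  "reason": "advert"
}
```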
### **POST `/add_files/undelete_files`** { id="add_files_undelete_files" }
@@ -351,17 +352,19 @@ Arguments (in JSON):
:
* `hash`: (an SHA256 hash for a file in 64 characters of hexadecimal)
* `hashes`: (a list of SHA256 hashes)
* `file_service_name`: (optional, selective, string, the local file domain to which to undelete)
* `file_service_key`: (optional, selective, hexadecimal, the local file domain to which to undelete)
```json title="Example request body"
{"hash": "78f92ba4a786225ee2a1236efa6b7dc81dd729faf4af99f96f3e20bad6d8b538"}
```
Response:
: 200 and no content.
You can use hash or hashes, whichever is more convenient.
This is the reverse of a delete_files--removing files from trash and putting them back where they came from. If you specify a file service, the files will only be undeleted to there (if they have a delete record, otherwise this is nullipotent). If you do not specify a file service, they will be undeleted to all local file services for which there are deletion records. There is no error if any files do not currently exist in 'trash'.
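As a sketch, undeleting to one specific local file domain (the service name is illustrative) would look like:
```json title="Example request body with a target file domain"
{
  "hash": "78f92ba4a786225ee2a1236efa6b7dc81dd729faf4af99f96f3e20bad6d8b538",
  "file_service_name": "my files"
}
```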
### **POST `/add_files/archive_files`** { id="add_files_archive_files" }
@@ -380,9 +383,9 @@ Arguments (in JSON):
* `hash`: (an SHA256 hash for a file in 64 characters of hexadecimal)
* `hashes`: (a list of SHA256 hashes)
```json title="Example request body"
{"hash": "78f92ba4a786225ee2a1236efa6b7dc81dd729faf4af99f96f3e20bad6d8b538"}
```
Response:
: 200 and no content.
@@ -408,9 +411,9 @@ Arguments (in JSON):
* `hash`: (an SHA256 hash for a file in 64 characters of hexadecimal)
* `hashes`: (a list of SHA256 hashes)
```json title="Example request body"
{"hash": "78f92ba4a786225ee2a1236efa6b7dc81dd729faf4af99f96f3e20bad6d8b538"}
```
Response:
: 200 and no content.
@@ -725,62 +728,62 @@ Arguments (in JSON):
* `filterable_tags`: (optional tags to be filtered by any tag import options that applies to the URL)
* _`service_names_to_tags`: (obsolete, legacy synonym for service\_names\_to\_additional_tags)_
If you specify a `destination_page_name` and an appropriate importer page already exists with that name, that page will be used. Otherwise, a new page with that name will be recreated (and used by subsequent calls with that name). Make sure that page name is unique (e.g. '/b/ threads', not 'watcher') in your client, or it may not be found.
Alternately, `destination_page_key` defines exactly which page should be used. Bear in mind this page key is only valid to the current session (they are regenerated on client reset or session reload), so you must figure out which one you want using the [/manage\_pages/get\_pages](#manage_pages_get_pages) call. If the correct page_key is not found, or the page it corresponds to is of the incorrect type, the standard page selection/creation rules will apply.
`show_destination_page` defaults to False to reduce flicker when adding many URLs to different pages quickly. If you turn it on, the client will behave like a URL drag and drop and select the final page the URL ends up on.
`service_names_to_additional_tags` and `service_keys_to_additional_tags` use the same data structure as in /add\_tags/add\_tags--service ids to a list of tags to add. You will need 'add tags' permission or this will 403. These tags work exactly as 'additional' tags work in a _tag import options_. They are service specific, and always added unless some advanced tag import options checkbox (like 'only add tags to new files') is set.
filterable_tags works like the tags parsed by a hydrus downloader. It is just a list of strings. They have no inherent service and will be sent to a _tag import options_, if one exists, to decide which tag services get what. This parameter is useful if you are pulling all a URL's tags outside of hydrus and want to have them processed like any other downloader, rather than figuring out service names and namespace filtering on your end. Note that in order for a tag import options to kick in, I think you will have to have a Post URL URL Class hydrus-side set up for the URL so some tag import options (whether that is Class-specific or just the default) can be loaded at import time.
```json title="Example request body"
{
"url": "https://8ch.net/tv/res/1846574.html",
"destination_page_name": "kino zone",
"service_names_to_additional_tags": {
"my tags": ["as seen on /tv/"]
}
}
```
```json title="Example request body"
{
"url": "https://safebooru.org/index.php?page=post&s=view&id=3195917",
"filterable_tags": [
"1girl",
"artist name",
"creator:azto dio",
"blonde hair",
"blue eyes",
"breasts",
"character name",
"commentary",
"english commentary",
"formal",
"full body",
"glasses",
"gloves",
"hair between eyes",
"high heels",
"highres",
"large breasts",
"long hair",
"long sleeves",
"looking at viewer",
"series:metroid",
"mole",
"mole under mouth",
"patreon username",
"ponytail",
"character:samus aran",
"solo",
"standing",
"suit",
"watermark"
]
}
```
Response:
: Some JSON with info on the URL added.
@@ -841,11 +844,11 @@ Required Headers:
Arguments (in percent-encoded JSON):
:
* `notes`: (an Object mapping string note names to string note contents)
* `hash`: (selective, an SHA256 hash for the file in 64 characters of hexadecimal)
* `file_id`: (selective, the integer numerical identifier for the file)
You must provide one of `hash` or `file_id`. Existing notes will be overwritten.
```json title="Example request body"
{
"notes": {
@@ -872,11 +875,10 @@ Required Headers:
Arguments (in percent-encoded JSON):
:
* `note_names`: (a list of string note names to delete)
* `hash`: (selective, an SHA256 hash for the file in 64 characters of hexadecimal)
* `file_id`: (selective, the integer numerical identifier for the file)
You must provide one of `hash` or `file_id`.
```json title="Example request body"
{
"note_names": ["note name", "another note"],
@@ -939,18 +941,18 @@ Arguments (in JSON):
:
* `cookies`: (a list of cookie rows in the same format as the GET request above)
```json title="Example request body"
{
"cookies": [
["PHPSESSID", "07669eb2a1a6e840e498bb6e0799f3fb", ".somesite.com", "/", 1627327719],
["tag_filter", "1", ".somesite.com", "/", 1627327719]
]
}
```
You can set 'value' to be null, which will clear any existing cookie with the corresponding name, domain, and path (acting essentially as a delete).
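As a sketch, clearing the PHPSESSID cookie from the example above would look like this (assuming a null expires is acceptable here, as it is elsewhere):
```json title="Example request body clearing a cookie"
{
  "cookies": [
    ["PHPSESSID", null, ".somesite.com", "/", null]
  ]
}
```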
Expires can be null, but session cookies will time-out in hydrus after 60 minutes of non-use.
### **POST `/manage_headers/set_user_agent`** { id="manage_headers_set_user_agent" }
@@ -967,13 +969,13 @@ Arguments (in JSON):
:
* `user-agent`: (a string)
```json title="Example request body"
{
"user-agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:56.0) Gecko/20100101 Firefox/56.0"
}
```
Send an empty string to reset the client back to the default User-Agent, which should be `Mozilla/5.0 (compatible; Hydrus Client)`.
## Managing Pages
@@ -1165,18 +1167,17 @@ Arguments (in JSON):
* `file_ids`: (selective, a list of numerical file ids)
* `hashes`: (selective, a list of hexadecimal SHA256 hashes)
You need to use either file_ids or hashes. The files they refer to will be appended to the given page, just like a thumbnail drag and drop operation. The page key is the same as fetched in the [/manage\_pages/get\_pages](#manage_pages_get_pages) call.
```json title="Example request body"
{
"page_key": "af98318b6eece15fef3cf0378385ce759bfe056916f6e12157cd928eb56c1f18",
"file_ids": [123, 124, 125]
}
```
Response:
: 200 with no content. If the page key is not found, this will 404.
### **POST `/manage_pages/focus_page`** { id="manage_pages_focus_page" }
@@ -1193,13 +1194,13 @@ Arguments (in JSON):
:
* `page_key`: (the page key for the page you wish to show)
The page key is the same as fetched in the [/manage\_pages/get\_pages](#manage_pages_get_pages) call.
```json title="Example request body"
{
"page_key": "af98318b6eece15fef3cf0378385ce759bfe056916f6e12157cd928eb56c1f18"
}
```
Response:
: 200 with no content. If the page key is not found, this will 404.
@@ -1231,135 +1232,135 @@ Arguments (in percent-encoded JSON):
* _`system_inbox`: true or false (obsolete, use tags)_
* _`system_archive`: true or false (obsolete, use tags)_
``` title='Example request for 16 files (system:limit=16) in the inbox with tags "blue eyes", "blonde hair", and "кино"'
/get_files/search_files?tags=%5B%22blue%20eyes%22%2C%20%22blonde%20hair%22%2C%20%22%5Cu043a%5Cu0438%5Cu043d%5Cu043e%22%2C%20%22system%3Ainbox%22%2C%20%22system%3Alimit%3D16%22%5D
```
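For reference, that percent-encoded `tags` parameter decodes to this JSON list (the Cyrillic tag is sent with escaped \u codepoints):
```json title="Decoded tags parameter"
["blue eyes", "blonde hair", "\u043a\u0438\u043d\u043e", "system:inbox", "system:limit=16"]
```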
If the access key's permissions only permit search for certain tags, at least one positive whitelisted/non-blacklisted tag must be in the "tags" list or this will 403. Tags can be prepended with a hyphen to make a negated tag (e.g. "-green eyes"), but these will not be checked against the permissions whitelist.
Wildcards and namespace searches are supported, so if you search for 'character:sam*' or 'series:*', this will be handled correctly clientside.
Many system predicates are also supported using a text parser! The parser was designed by a clever user for human input and allows for a certain amount of error (e.g. ~= instead of ≈, or "isn't" instead of "is not") or requires more information (e.g. the specific hashes for a hash lookup). Here's a big list of current formats supported:
??? example "System Predicates"
* system:everything
* system:inbox
* system:archive
* system:has duration
* system:no duration
* system:is the best quality file of its duplicate group
* system:is not the best quality file of its duplicate group
* system:has audio
* system:no audio
* system:has icc profile
* system:no icc profile
* system:has tags
* system:no tags
* system:untagged
* system:number of tags > 5
* system:number of tags ~= 10
* system:number of tags > 0
* system:number of words < 2
* system:height = 600
* system:height > 900
* system:width < 200
* system:width > 1000
* system:filesize ~= 50 kilobytes
* system:filesize > 10megabytes
* system:filesize < 1 GB
* system:filesize > 0 B
* system:similar to abcdef01 abcdef02 abcdef03, abcdef04 with distance 3
* system:similar to abcdef distance 5
* system:limit = 100
* system:filetype = image/jpg, image/png, apng
* system:hash = abcdef01 abcdef02 abcdef03 _(this does sha256)_
* system:hash = abcdef01 abcdef02 md5
* system:modified date < 7 years 45 days 7h
* system:modified date > 2011-06-04
* system:date modified > 7 years 2 months
* system:date modified < 0 years 1 month 1 day 1 hour
* system:time imported < 7 years 45 days 7h
* system:time imported > 2011-06-04
* system:time imported > 7 years 2 months
* system:time imported < 0 years 1 month 1 day 1 hour
* system:time imported ~= 2011-1-3
* system:time imported ~= 1996-05-2
* system:duration < 5 seconds
* system:duration ~= 600 msecs
* system:duration > 3 milliseconds
* system:file service is pending to my files
* system:file service currently in my files
* system:file service is not currently in my files
* system:file service is not pending to my files
* system:num file relationships < 3 alternates
* system:number of file relationships > 3 false positives
* system:ratio is wider than 16:9
* system:ratio is 16:9
* system:ratio taller than 1:1
* system:num pixels > 50 px
* system:num pixels < 1 megapixels
* system:num pixels ~= 5 kilopixel
* system:media views ~= 10
* system:all views > 0
* system:preview views < 10
* system:media viewtime < 1 days 1 hour 0 minutes
* system:all viewtime > 1 hours 100 seconds
* system:preview viewtime ~= 1 day 30 hours 100 minutes 90s
* system:has url matching regex index\\.php
* system:does not have a url matching regex index\\.php
* system:has url https://safebooru.donmai.us/posts/4695284
* system:does not have url https://safebooru.donmai.us/posts/4695284
* system:has domain safebooru.com
* system:does not have domain safebooru.com
* system:has a url with class safebooru file page
* system:does not have a url with url class safebooru file page
* system:tag as number page < 5
* system:has notes
* system:no notes
* system:does not have notes
* system:num notes is 5
* system:num notes > 1
* system:has note with name note name
* system:no note with name note name
* system:does not have note with name note name
More system predicate types and input formats will be available in future. Please test out the system predicates you want to send. Reverse engineering system predicate data from text is obviously tricky. If a system predicate does not parse, you'll get 400.
Also, OR predicates are now supported! Just nest within the tag list, and it'll be treated like an OR. For instance:
* `#!json [ "skirt", [ "samus aran", "lara croft" ], "system:height > 1000" ]`
Makes:
* skirt
* samus aran OR lara croft
* system:height > 1000
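Concretely, the whole tag list (nested OR included) is serialised to JSON and then percent-encoded into the `tags` query parameter. A quick sketch of that encoding (the endpoint path follows the search examples elsewhere in this document):

```python
import json
import urllib.parse

# A nested list inside the tags list is treated as an OR of its members.
tags = [ "skirt", [ "samus aran", "lara croft" ], "system:height > 1000" ]

# Serialise to JSON, then percent-encode everything for the query string.
encoded = urllib.parse.quote( json.dumps( tags ), safe = '' )

request_path = '/get_files/search_files?tags=' + encoded

print( request_path )
```

Decoding the parameter on the other side simply reverses the two steps, so a quick round-trip check is an easy way to validate your encoder.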
The file and tag services are for search domain selection, just like clicking the buttons in the client. They are optional--default is 'my files' and 'all known tags', and you can use either key or name as in [GET /get_services](#get_services), whichever is easiest for your situation.
file\_sort\_asc is 'true' for ascending, and 'false' for descending. The default is descending.
file\_sort\_type is by default _import time_. It is an integer according to the following enum, and I have written the semantic (asc/desc) meaning for each type after:
* 0 - file size (smallest first/largest first)
* 1 - duration (shortest first/longest first)
* 2 - import time (oldest first/newest first)
* 3 - filetype (N/A)
* 4 - random (N/A)
* 5 - width (slimmest first/widest first)
* 6 - height (shortest first/tallest first)
* 7 - ratio (tallest first/widest first)
* 8 - number of pixels (ascending/descending)
* 9 - number of tags (on the current tag domain) (ascending/descending)
* 10 - number of media views (ascending/descending)
* 11 - total media viewtime (ascending/descending)
* 12 - approximate bitrate (smallest first/largest first)
* 13 - has audio (audio first/silent first)
* 14 - modified time (oldest first/newest first)
* 15 - framerate (slowest first/fastest first)
* 16 - number of frames (smallest first/largest first)
* 18 - last viewed time (oldest first/newest first)
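As a sketch of how the sort parameters combine in a request (the endpoint and parameter names follow the documentation above; the enum values are copied from the list):

```python
import json
import urllib.parse

# A couple of values from the file_sort_type enum above.
SORT_FILESIZE = 0
SORT_IMPORT_TIME = 2

def build_search_query( tags, file_sort_type = SORT_IMPORT_TIME, file_sort_asc = False ):
    
    # file_sort_asc travels as the strings 'true'/'false'.
    params = {
        'tags' : json.dumps( tags ),
        'file_sort_type' : str( file_sort_type ),
        'file_sort_asc' : 'true' if file_sort_asc else 'false'
    }
    
    return '/get_files/search_files?' + urllib.parse.urlencode( params )

# Largest files first: filesize sort, descending.
query = build_search_query( [ 'system:filesize > 10megabytes' ], SORT_FILESIZE, False )

print( query )
```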
Response:
: The full list of numerical file ids that match the search.
@ -1400,21 +1401,21 @@ Arguments (in percent-encoded JSON):
* `hide_service_names_tags`: true or false (optional, defaulting to false)
* `include_notes`: true or false (optional, defaulting to false)
You need one of file_ids or hashes. If your access key is restricted by tag, you cannot search by hashes, and **the file_ids you search for must have been in the most recent search result**.
``` title="Example request for two files with ids 123 and 4567"
/get_files/file_metadata?file_ids=%5B123%2C%204567%5D
```
``` title="The same, but only wants hashes back"
/get_files/file_metadata?file_ids=%5B123%2C%204567%5D&only_return_identifiers=true
```
``` title="And one that fetches two hashes"
/get_files/file_metadata?hashes=%5B%224c77267f93415de0bc33b7725b8c331a809a924084bee03ab2f5fae1c6019eb2%22%2C%20%223e7cb9044fe81bda0d7a84b5cb781cba4e255e4871cba6ae8ecd8207850d5b82%22%5D
```
``` title="Example request for two files with ids 123 and 4567"
/get_files/file_metadata?file_ids=%5B123%2C%204567%5D
```
``` title="The same, but only wants hashes back"
/get_files/file_metadata?file_ids=%5B123%2C%204567%5D&only_return_identifiers=true
```
``` title="And one that fetches two hashes"
/get_files/file_metadata?hashes=%5B%224c77267f93415de0bc33b7725b8c331a809a924084bee03ab2f5fae1c6019eb2%22%2C%20%223e7cb9044fe81bda0d7a84b5cb781cba4e255e4871cba6ae8ecd8207850d5b82%22%5D
```
This request string can obviously get pretty ridiculously long. It also takes a bit of time to fetch metadata from the database. In its normal searches, the client usually fetches file metadata in batches of 256.
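If you are fetching metadata for a large result set, it is reasonable to copy the client and batch your requests. A sketch:

```python
import json
import urllib.parse

BATCH_SIZE = 256  # the client's own batch size, per the note above

def metadata_request_paths( file_ids, batch_size = BATCH_SIZE ):
    
    # Yield one /get_files/file_metadata request path per batch of ids.
    for i in range( 0, len( file_ids ), batch_size ):
        
        batch = file_ids[ i : i + batch_size ]
        
        encoded = urllib.parse.quote( json.dumps( batch ), safe = '' )
        
        yield '/get_files/file_metadata?file_ids=' + encoded
    

paths = list( metadata_request_paths( list( range( 600 ) ) ) )

print( len( paths ) )  # 600 ids -> batches of 256, 256, and 88
```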
Response:
: A list of JSON Objects that store a variety of file metadata.
@ -1542,47 +1543,46 @@ Response:
}
```
Size is in bytes. Duration is in milliseconds, and may be an int or a float.
file_services stores which file services the file is <i>current</i>ly in and _deleted_ from. The entries are by the service key, same as for tags later on. In rare cases, the timestamps may be `null`, if they are unknown (e.g. a `time_deleted` for the file deleted before this information was tracked). The `time_modified` can also be null. Time modified is just the filesystem modified time for now, but it will evolve into more complicated storage in future with multiple locations (website post times) that'll be aggregated to a sensible value in UI.
The `service_names_to_statuses_to_tags` and `service_keys_to_statuses_to_tags` structures are similar to the `/add_tags/add_tags` scheme, excepting that the status numbers are:
* 0 - current
* 1 - pending
* 2 - deleted
* 3 - petitioned
The tag structure is duplicated for both `name` and `key`. The use of `name` is an increasingly legacy issue--a hack when the Client API was young--and 'service\_names\_to...' lookups are likely to be deleted in future in favour of `service_key`. I recommend you move to service key when you can. To learn more about service names and keys on a client, use the [/get_services](#get_services) call (and cache the response--it doesn't change much!).
!!! note
Since JSON Object keys must be strings, these status numbers are strings, not ints.
While `service_XXX_to_statuses_to_tags` represent the actual tags stored on the database for a file, the <code>service_XXX_to_statuses_to_<i>display</i>_tags</code> structures reflect how tags appear in the UI, after siblings are collapsed and parents are added. If you want to edit a file's tags, start with `service_keys_to_statuses_to_tags`. If you want to render to the user, use `service_keys_to_statuses_to_displayed_tags`.
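As an illustration of walking that structure, here is a sketch that collects the 'current' display tags across all services for one file. The metadata fragment and the service key here are invented for the example; note that the status keys are strings:

```python
# A hand-made fragment in the shape of one file's metadata (service key invented).
metadata = {
    'service_keys_to_statuses_to_display_tags' : {
        '6c6f63616c2074616773' : {
            '0' : [ 'character:samus aran', 'series:metroid' ],  # current
            '1' : [ 'skirt' ]                                    # pending
        }
    }
}

def current_display_tags( file_metadata ):
    
    # Status keys are strings, not ints; '0' is 'current'.
    tags = set()
    
    for statuses_to_tags in file_metadata[ 'service_keys_to_statuses_to_display_tags' ].values():
        
        tags.update( statuses_to_tags.get( '0', [] ) )
    
    return sorted( tags )

print( current_display_tags( metadata ) )  # ['character:samus aran', 'series:metroid']
```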
If you add `hide_service_names_tags=true`, the `service_names_to_statuses_to_tags` and `service_names_to_statuses_to_display_tags` Objects will not be included. Use this to save data/CPU on large queries.
If you add `detailed_url_information=true`, a new entry, `detailed_known_urls`, will be added for each file, with a list of the same structure as `/add_urls/get_url_info`. This may be an expensive request if you are querying thousands of files at once.
```json title="For example"
"detailed_known_urls" : [
{
"normalised_url": "https://gelbooru.com/index.php?id=4841557&page=post&s=view",
"url_type": 0,
"url_type_string": "post url",
"match_name": "gelbooru file page",
"can_parse": true
},
{
"normalised_url": "https://img2.gelbooru.com//images/80/c8/80c8646b4a49395fb36c805f316c49a9.jpg",
"url_type": 5,
"url_type_string": "unknown url",
"match_name": "unknown url",
"can_parse": false
}
]
```
```json title="For example"
"detailed_known_urls" : [
{
"normalised_url": "https://gelbooru.com/index.php?id=4841557&page=post&s=view",
"url_type": 0,
"url_type_string": "post url",
"match_name": "gelbooru file page",
"can_parse": true
},
{
"normalised_url": "https://img2.gelbooru.com//images/80/c8/80c8646b4a49395fb36c805f316c49a9.jpg",
"url_type": 5,
"url_type_string": "unknown url",
"match_name": "unknown url",
"can_parse": false
}
]
```
### **GET `/get_files/file`** { id="get_files_file" }
@ -1610,7 +1610,7 @@ Arguments :
Response:
: The file itself. You should get the correct mime type as the Content-Type header.
### **GET `/get_files/thumbnail`** { id="get_files_thumbnail" }
@ -1628,15 +1628,21 @@ Arguments:
Only use one. As with metadata fetching, you may only use the hash argument if you have access to all files. If you are tag-restricted, you will have to use a file_id in the last search you ran.
``` title="Example request"
/get_files/thumbnail?file_id=452158
```
``` title="Example request"
/get_files/thumbnail?hash=7f30c113810985b69014957c93bc25e8eb4cf3355dae36d8b9d011d8b0cf623a
```
``` title="Example request"
/get_files/thumbnail?file_id=452158
```
``` title="Example request"
/get_files/thumbnail?hash=7f30c113810985b69014957c93bc25e8eb4cf3355dae36d8b9d011d8b0cf623a
```
Response:
: The thumbnail for the file. It will give application/octet-stream as the mime type. Some hydrus thumbs are jpegs, some are pngs.
If hydrus keeps no thumbnail for the filetype, for instance with pdfs, then you will get the same default 'pdf' icon you see in the client. If the file does not exist in the client, or the thumbnail was expected but is missing from storage, you will get the fallback 'hydrus' icon, again just as you would in the client itself. This request should never give a 404.
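A minimal request-building sketch (the default port 45869 and the `Hydrus-Client-API-Access-Key` header come from the Client API setup docs; the access key here is a placeholder):

```python
def thumbnail_request( file_id, access_key, base_url = 'http://127.0.0.1:45869' ):
    
    # Build the URL and headers for a thumbnail fetch; send them with any HTTP client.
    url = '{}/get_files/thumbnail?file_id={}'.format( base_url, file_id )
    
    headers = { 'Hydrus-Client-API-Access-Key' : access_key }
    
    return ( url, headers )

( url, headers ) = thumbnail_request( 452158, 'replace-with-your-access-key' )

print( url )
```

Remember the response's mime type is always application/octet-stream, so sniff the bytes if you need to know whether you got a jpeg or a png.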
!!! note
If you get a 'default' filetype thumbnail like the pdf or hydrus one, you will be pulling the defaults straight from the hydrus/static folder. They will most likely be 200x200 pixels.
## Managing the Database
View File
@ -33,6 +33,41 @@
<div class="content">
<h3 id="changelog"><a href="#changelog">changelog</a></h3>
<ul>
<li><h3 id="version_478"><a href="#version_478">version 478</a></h3></li>
<ul>
<li>misc:</li>
<li>if a file note text is crazy and can't be displayed, this is now handled and the best visual approximation is displayed (and saved back on ok) instead</li>
<li>fixed an error in the cloudflare problem detection calls for the newer versions of cloudscraper (>=1.2.60) while maintaining support for the older versions. fingers crossed, we also shouldn't repeat this specific error if they refactor again</li>
<li>.</li>
<li>file history chart updates:</li>
<li>fixed the 'inbox' line in file history, which has to be calculated in an odd way and was not counting on file imports adding to the inbox</li>
<li>the file history chart now expands its y axis range to show all data even if deleted_files is huge. we'll see how nice this actually is IRL</li>
<li>bumped the file history resolution up from 1,000 to 2,000 steps</li>
<li>the y axis _should_ now show localised numbers, 5,000 instead of 5000, but the method by which this occurs involves fox tongues and the breath of a slighted widow, so it may just not work for some machines</li>
<li>.</li>
<li>cleanup, mostly file location stuff:</li>
<li>I believe I have replaced all the remaining surplus static 'my files' references with code compatible with multiple local file services. when I add the capability to create new local file services, there now won't be a problem trying to display thumbnails or generate menu actions etc... if they aren't in 'my files'</li>
<li>pulled the autocomplete dropdown file domain button code out to its own class and refactored it and the multiple location context panel to their own file</li>
<li>added a 'default file location' option to 'files and trash' page, and a bunch of dialogs (e.g. the search panel when you make a new export folder) and similar now pull it to initialise. for most users this will stay 'my files' forever, but when we hit multiple local file services, it may want to change</li>
<li>the file domain override options in 'manage tag display and search' now work on the new location system and support multiple file services</li>
<li>in downloaders, when highlighting, a database job that does the 'show files' filter (e.g. to include those in trash or not) now works on the new location context system and will handle files that will be imported to places other than my files</li>
<li>refactored client api file service parsing</li>
<li>refactored client api hashes parsing</li>
<li>cleaned a whole heap of misc location code</li>
<li>cleaned misc basic code across hydrus and client constant files</li>
<li>gave 'you don't want the server' help page a very quick pass</li>
<li>.</li>
<li>client api:</li>
<li>in prep for multiple local file services, delete_files now takes an optional file_service_key or file_service_name. by default, it now deletes from all appropriate local services, so behaviour is unchanged from before without the parameter if you just want to delete m8</li>
<li>undelete files is the same. when we have multiple local file services, an undelete without a file service will undelete to all locations that have a delete record</li>
<li>delete_files also now takes an optional 'reason' parameter</li>
<li>the 'set_notes' command now checks the type of the notes Object. it obviously has to be string-to-string</li>
<li>the 'get_thumbnail' command should now never 404. if you ask for a pdf thumb, it gives the pdf default thumb, and if there is no thumb for whatever reason, you get the hydrus fallback thumbnail. just like in the client itself</li>
<li>updated client api help to talk about these</li>
<li>updated the unit tests to handle them too</li>
<li>did a pass over the client api help to unify indent style and fix other small formatting issues</li>
<li>client api version is now 28</li>
</ul>
<li><h3 id="version_477"><a href="#version_477">version 477</a></h3></li>
<ul>
<li>misc:</li>
View File
@ -8,7 +8,7 @@ The server.exe/server.py is the victim of many a misconception. You don't need t
The server is only really useful for a few specific cases which will not apply for the vast majority of users.
## The server
The Hydrus server doesn't really work as most people envision a server working. When you sync with a Hydrus server you get everything it has, a complete copy. You can't have it host files which you can then search and selectively retrieve, it's all or nothing.
The Hydrus server doesn't really work as most people envision a server working. Rather than on-demand viewing, when you link with a Hydrus server, you synchronise a complete copy of all its data. For the tag repository, you download every single tag it has ever been told about. For the file repository, you download the whole file list, related file info, and every single thumbnail, which lets you browse the whole repository in your client in a regular search page--to view files in the media viewer, you need to download and import them specifically.
## You don't want the server (probably)
Do you want to remotely view your files? You don't want the server.
@ -19,7 +19,7 @@ Do you want to use multiple clients and have everything synced between them? You
Do you want to expose API for Hydrus Web, Hydroid, or some other third-party tool? You don't want the server.
Do you want to share some files and tags in a small group of friends? You might actually want the server.
Do you want to share some files and/or tags in a small group of friends? You might actually want the server.
## The options
Now, you're not the first person to have any of the above ideas and some of the thinkers even had enough programming know-how to make something for it. Below is a list of some options, see [this page](client_api.md) for a few more.
@ -34,4 +34,4 @@ Now, you're not the first person to have any of the above ideas and some of the
- Lets you browse your collection.
### [Database migration](https://hydrusnetwork.github.io/hydrus/help/database_migration.html)
- Lets you host your files on another drive, even on another computer in the network.
- Lets you host your files on another drive, even on another computer in the network.
View File
@ -939,9 +939,11 @@ class ThumbnailCache( object ):
bounding_dimensions = self._controller.options[ 'thumbnail_dimensions' ]
thumbnail_scale_type = self._controller.new_options.GetInteger( 'thumbnail_scale_type' )
# it would be ideal to replace this with mimes_to_default_thumbnail_paths at a convenient point
for name in names:
path = os.path.join( HC.STATIC_DIR, name + '.png' )
path = os.path.join( HC.STATIC_DIR, '{}.png'.format( name ) )
numpy_image = ClientImageHandling.GenerateNumPyImage( path, HC.IMAGE_PNG )
View File
@ -16,11 +16,11 @@ CANVAS_MEDIA_VIEWER_DUPLICATES = 2
CANVAS_MEDIA_VIEWER_TYPES = { CANVAS_MEDIA_VIEWER, CANVAS_MEDIA_VIEWER_DUPLICATES }
canvas_type_str_lookup = {}
canvas_type_str_lookup[ CANVAS_MEDIA_VIEWER ] = 'media viewer'
canvas_type_str_lookup[ CANVAS_PREVIEW ] = 'preview'
canvas_type_str_lookup[ CANVAS_MEDIA_VIEWER_DUPLICATES ] = 'duplicates filter'
canvas_type_str_lookup = {
CANVAS_MEDIA_VIEWER : 'media viewer',
CANVAS_PREVIEW : 'preview',
CANVAS_MEDIA_VIEWER_DUPLICATES : 'duplicates filter'
}
# Hue is generally 200, Sat and Lum changes based on need
COLOUR_LIGHT_SELECTED = QG.QColor( 235, 248, 255 )
@ -56,12 +56,12 @@ DIRECTION_LEFT = 1
DIRECTION_RIGHT = 2
DIRECTION_DOWN = 3
directions_alignment_string_lookup = {}
directions_alignment_string_lookup[ DIRECTION_UP ] = 'top'
directions_alignment_string_lookup[ DIRECTION_LEFT ] = 'left'
directions_alignment_string_lookup[ DIRECTION_RIGHT ] = 'right'
directions_alignment_string_lookup[ DIRECTION_DOWN ] = 'bottom'
directions_alignment_string_lookup = {
DIRECTION_UP : 'top',
DIRECTION_LEFT : 'left',
DIRECTION_RIGHT : 'right',
DIRECTION_DOWN : 'bottom'
}
FIELD_VERIFICATION_RECAPTCHA = 0
FIELD_COMMENT = 1
@ -73,25 +73,25 @@ FIELD_PASSWORD = 6
FIELDS = [ FIELD_VERIFICATION_RECAPTCHA, FIELD_COMMENT, FIELD_TEXT, FIELD_CHECKBOX, FIELD_FILE, FIELD_THREAD_ID, FIELD_PASSWORD ]
field_enum_lookup = {}
field_enum_lookup = {
'recaptcha' : FIELD_VERIFICATION_RECAPTCHA,
'comment' : FIELD_COMMENT,
'text' : FIELD_TEXT,
'checkbox' : FIELD_CHECKBOX,
'file' : FIELD_FILE,
'thread id': FIELD_THREAD_ID,
'password' : FIELD_PASSWORD
}
field_enum_lookup[ 'recaptcha' ] = FIELD_VERIFICATION_RECAPTCHA
field_enum_lookup[ 'comment' ] = FIELD_COMMENT
field_enum_lookup[ 'text' ] = FIELD_TEXT
field_enum_lookup[ 'checkbox' ] = FIELD_CHECKBOX
field_enum_lookup[ 'file' ] = FIELD_FILE
field_enum_lookup[ 'thread id' ] = FIELD_THREAD_ID
field_enum_lookup[ 'password' ] = FIELD_PASSWORD
field_string_lookup = {}
field_string_lookup[ FIELD_VERIFICATION_RECAPTCHA ] = 'recaptcha'
field_string_lookup[ FIELD_COMMENT ] = 'comment'
field_string_lookup[ FIELD_TEXT ] = 'text'
field_string_lookup[ FIELD_CHECKBOX ] = 'checkbox'
field_string_lookup[ FIELD_FILE ] = 'file'
field_string_lookup[ FIELD_THREAD_ID ] = 'thread id'
field_string_lookup[ FIELD_PASSWORD ] = 'password'
field_string_lookup = {
FIELD_VERIFICATION_RECAPTCHA : 'recaptcha',
FIELD_COMMENT : 'comment',
FIELD_TEXT : 'text',
FIELD_CHECKBOX : 'checkbox',
FIELD_FILE : 'file',
FIELD_THREAD_ID : 'thread id',
FIELD_PASSWORD : 'password'
}
FILE_VIEWING_STATS_MENU_DISPLAY_NONE = 0
FILE_VIEWING_STATS_MENU_DISPLAY_MEDIA_ONLY = 1
@ -154,21 +154,21 @@ IDLE_NOT_ON_SHUTDOWN = 0
IDLE_ON_SHUTDOWN = 1
IDLE_ON_SHUTDOWN_ASK_FIRST = 2
idle_string_lookup = {}
idle_string_lookup[ IDLE_NOT_ON_SHUTDOWN ] = 'do not run jobs on shutdown'
idle_string_lookup[ IDLE_ON_SHUTDOWN ] = 'run jobs on shutdown if needed'
idle_string_lookup[ IDLE_ON_SHUTDOWN_ASK_FIRST ] = 'run jobs on shutdown if needed, but ask first'
idle_string_lookup = {
IDLE_NOT_ON_SHUTDOWN : 'do not run jobs on shutdown',
IDLE_ON_SHUTDOWN : 'run jobs on shutdown if needed',
IDLE_ON_SHUTDOWN_ASK_FIRST : 'run jobs on shutdown if needed, but ask first'
}
IMPORT_FOLDER_DELETE = 0
IMPORT_FOLDER_IGNORE = 1
IMPORT_FOLDER_MOVE = 2
import_folder_string_lookup = {}
import_folder_string_lookup[ IMPORT_FOLDER_DELETE ] = 'delete the file'
import_folder_string_lookup[ IMPORT_FOLDER_IGNORE ] = 'leave the file alone, do not reattempt it'
import_folder_string_lookup[ IMPORT_FOLDER_MOVE ] = 'move the file'
import_folder_string_lookup = {
IMPORT_FOLDER_DELETE : 'delete the file',
IMPORT_FOLDER_IGNORE : 'leave the file alone, do not reattempt it',
IMPORT_FOLDER_MOVE : 'move the file'
}
EXIT_SESSION_SESSION_NAME = 'exit session'
LAST_SESSION_SESSION_NAME = 'last session'
@ -182,16 +182,16 @@ MEDIA_VIEWER_ACTION_DO_NOT_SHOW_ON_ACTIVATION_OPEN_EXTERNALLY = 5
MEDIA_VIEWER_ACTION_DO_NOT_SHOW = 6
MEDIA_VIEWER_ACTION_SHOW_WITH_MPV = 7
media_viewer_action_string_lookup = {}
media_viewer_action_string_lookup[ MEDIA_VIEWER_ACTION_SHOW_WITH_NATIVE ] = 'show with native hydrus viewer'
media_viewer_action_string_lookup[ MEDIA_VIEWER_ACTION_SHOW_WITH_NATIVE_PAUSED ] = 'show as normal, but start paused -- obselete'
media_viewer_action_string_lookup[ MEDIA_VIEWER_ACTION_SHOW_BEHIND_EMBED ] = 'show, but initially behind an embed button -- obselete'
media_viewer_action_string_lookup[ MEDIA_VIEWER_ACTION_SHOW_BEHIND_EMBED_PAUSED ] = 'show, but initially behind an embed button, and start paused -- obselete'
media_viewer_action_string_lookup[ MEDIA_VIEWER_ACTION_SHOW_OPEN_EXTERNALLY_BUTTON ] = 'show an \'open externally\' button'
media_viewer_action_string_lookup[ MEDIA_VIEWER_ACTION_DO_NOT_SHOW_ON_ACTIVATION_OPEN_EXTERNALLY ] = 'do not show in the media viewer. on thumbnail activation, open externally'
media_viewer_action_string_lookup[ MEDIA_VIEWER_ACTION_DO_NOT_SHOW ] = 'do not show at all'
media_viewer_action_string_lookup[ MEDIA_VIEWER_ACTION_SHOW_WITH_MPV ] = 'show using mpv'
media_viewer_action_string_lookup = {
MEDIA_VIEWER_ACTION_SHOW_WITH_NATIVE : 'show with native hydrus viewer',
MEDIA_VIEWER_ACTION_SHOW_WITH_NATIVE_PAUSED : 'show as normal, but start paused -- obselete',
MEDIA_VIEWER_ACTION_SHOW_BEHIND_EMBED : 'show, but initially behind an embed button -- obselete',
MEDIA_VIEWER_ACTION_SHOW_BEHIND_EMBED_PAUSED : 'show, but initially behind an embed button, and start paused -- obselete',
MEDIA_VIEWER_ACTION_SHOW_OPEN_EXTERNALLY_BUTTON : 'show an \'open externally\' button',
MEDIA_VIEWER_ACTION_DO_NOT_SHOW_ON_ACTIVATION_OPEN_EXTERNALLY : 'do not show in the media viewer. on thumbnail activation, open externally',
MEDIA_VIEWER_ACTION_DO_NOT_SHOW : 'do not show at all',
MEDIA_VIEWER_ACTION_SHOW_WITH_MPV : 'show using mpv'
}
unsupported_media_actions = [ MEDIA_VIEWER_ACTION_SHOW_OPEN_EXTERNALLY_BUTTON, MEDIA_VIEWER_ACTION_DO_NOT_SHOW_ON_ACTIVATION_OPEN_EXTERNALLY, MEDIA_VIEWER_ACTION_DO_NOT_SHOW ]
static_media_actions = [ MEDIA_VIEWER_ACTION_SHOW_WITH_NATIVE ] + unsupported_media_actions
@ -204,13 +204,13 @@ animated_full_support = ( animated_media_actions, True, True )
audio_full_support = ( audio_media_actions, True, True )
no_support = ( unsupported_media_actions, False, False )
media_viewer_capabilities = {}
media_viewer_capabilities[ HC.GENERAL_ANIMATION ] = animated_full_support
media_viewer_capabilities[ HC.GENERAL_IMAGE ] = static_full_support
media_viewer_capabilities[ HC.GENERAL_VIDEO ] = animated_full_support
media_viewer_capabilities[ HC.GENERAL_AUDIO ] = audio_full_support
media_viewer_capabilities[ HC.GENERAL_APPLICATION ] = no_support
media_viewer_capabilities = {
HC.GENERAL_ANIMATION : animated_full_support,
HC.GENERAL_IMAGE : static_full_support,
HC.GENERAL_VIDEO : animated_full_support,
HC.GENERAL_AUDIO : audio_full_support,
HC.GENERAL_APPLICATION : no_support
}
for mime in HC.SEARCHABLE_MIMES:
@ -243,23 +243,23 @@ MEDIA_VIEWER_SCALE_100 = 0
MEDIA_VIEWER_SCALE_MAX_REGULAR = 1
MEDIA_VIEWER_SCALE_TO_CANVAS = 2
media_viewer_scale_string_lookup = {}
media_viewer_scale_string_lookup[ MEDIA_VIEWER_SCALE_100 ] = 'show at 100%'
media_viewer_scale_string_lookup[ MEDIA_VIEWER_SCALE_MAX_REGULAR ] = 'scale to the largest regular zoom that fits'
media_viewer_scale_string_lookup[ MEDIA_VIEWER_SCALE_TO_CANVAS ] = 'scale to the canvas size'
media_viewer_scale_string_lookup = {
MEDIA_VIEWER_SCALE_100 : 'show at 100%',
MEDIA_VIEWER_SCALE_MAX_REGULAR : 'scale to the largest regular zoom that fits',
MEDIA_VIEWER_SCALE_TO_CANVAS : 'scale to the canvas size'
}
NEW_PAGE_GOES_FAR_LEFT = 0
NEW_PAGE_GOES_LEFT_OF_CURRENT = 1
NEW_PAGE_GOES_RIGHT_OF_CURRENT = 2
NEW_PAGE_GOES_FAR_RIGHT = 3
new_page_goes_string_lookup = {}
new_page_goes_string_lookup[ NEW_PAGE_GOES_FAR_LEFT ] = 'the far left'
new_page_goes_string_lookup[ NEW_PAGE_GOES_LEFT_OF_CURRENT ] = 'left of current page tab'
new_page_goes_string_lookup[ NEW_PAGE_GOES_RIGHT_OF_CURRENT ] = 'right of current page tab'
new_page_goes_string_lookup[ NEW_PAGE_GOES_FAR_RIGHT ] = 'the far right'
new_page_goes_string_lookup = {
NEW_PAGE_GOES_FAR_LEFT : 'the far left',
NEW_PAGE_GOES_LEFT_OF_CURRENT : 'left of current page tab',
NEW_PAGE_GOES_RIGHT_OF_CURRENT : 'right of current page tab',
NEW_PAGE_GOES_FAR_RIGHT : 'the far right'
}
NETWORK_CONTEXT_GLOBAL = 0
NETWORK_CONTEXT_HYDRUS = 1
@ -269,35 +269,35 @@ NETWORK_CONTEXT_DOWNLOADER_PAGE = 4
NETWORK_CONTEXT_SUBSCRIPTION = 5
NETWORK_CONTEXT_WATCHER_PAGE = 6
network_context_type_string_lookup = {}
network_context_type_string_lookup = {
NETWORK_CONTEXT_GLOBAL : 'global',
NETWORK_CONTEXT_HYDRUS : 'hydrus service',
NETWORK_CONTEXT_DOMAIN : 'web domain',
NETWORK_CONTEXT_DOWNLOADER : 'downloader',
NETWORK_CONTEXT_DOWNLOADER_PAGE : 'downloader page',
NETWORK_CONTEXT_SUBSCRIPTION : 'subscription',
NETWORK_CONTEXT_WATCHER_PAGE : 'watcher page'
}
network_context_type_string_lookup[ NETWORK_CONTEXT_GLOBAL ] = 'global'
network_context_type_string_lookup[ NETWORK_CONTEXT_HYDRUS ] = 'hydrus service'
network_context_type_string_lookup[ NETWORK_CONTEXT_DOMAIN ] = 'web domain'
network_context_type_string_lookup[ NETWORK_CONTEXT_DOWNLOADER ] = 'downloader'
network_context_type_string_lookup[ NETWORK_CONTEXT_DOWNLOADER_PAGE ] = 'downloader page'
network_context_type_string_lookup[ NETWORK_CONTEXT_SUBSCRIPTION ] = 'subscription'
network_context_type_string_lookup[ NETWORK_CONTEXT_WATCHER_PAGE ] = 'watcher page'
network_context_type_description_lookup = {}
network_context_type_description_lookup[ NETWORK_CONTEXT_GLOBAL ] = 'All network traffic, no matter the source or destination.'
network_context_type_description_lookup[ NETWORK_CONTEXT_HYDRUS ] = 'Network traffic going to or from a hydrus service.'
network_context_type_description_lookup[ NETWORK_CONTEXT_DOMAIN ] = 'Network traffic going to or from a web domain (or a subdomain).'
network_context_type_description_lookup[ NETWORK_CONTEXT_DOWNLOADER ] = 'Network traffic going through a downloader. This is no longer used.'
network_context_type_description_lookup[ NETWORK_CONTEXT_DOWNLOADER_PAGE ] = 'Network traffic going through a single downloader page. This is an ephemeral context--it will not be saved through a client restart. It is useful to throttle individual downloader pages so they give the db and other import pages time to do work.'
network_context_type_description_lookup[ NETWORK_CONTEXT_SUBSCRIPTION ] = 'Network traffic going through a subscription query. Each query gets its own network context, named \'[subscription name]: [query text]\'.'
network_context_type_description_lookup[ NETWORK_CONTEXT_WATCHER_PAGE ] = 'Network traffic going through a single watcher page. This is an ephemeral context--it will not be saved through a client restart. It is useful to throttle individual watcher pages so they give the db and other import pages time to do work.'
network_context_type_description_lookup = {
NETWORK_CONTEXT_GLOBAL : 'All network traffic, no matter the source or destination.',
NETWORK_CONTEXT_HYDRUS : 'Network traffic going to or from a hydrus service.',
NETWORK_CONTEXT_DOMAIN : 'Network traffic going to or from a web domain (or a subdomain).',
NETWORK_CONTEXT_DOWNLOADER : 'Network traffic going through a downloader. This is no longer used.',
NETWORK_CONTEXT_DOWNLOADER_PAGE : 'Network traffic going through a single downloader page. This is an ephemeral context--it will not be saved through a client restart. It is useful to throttle individual downloader pages so they give the db and other import pages time to do work.',
NETWORK_CONTEXT_SUBSCRIPTION : 'Network traffic going through a subscription query. Each query gets its own network context, named \'[subscription name]: [query text]\'.',
NETWORK_CONTEXT_WATCHER_PAGE : 'Network traffic going through a single watcher page. This is an ephemeral context--it will not be saved through a client restart. It is useful to throttle individual watcher pages so they give the db and other import pages time to do work.'
}
PAGE_FILE_COUNT_DISPLAY_ALL = 0
PAGE_FILE_COUNT_DISPLAY_NONE = 1
PAGE_FILE_COUNT_DISPLAY_ONLY_IMPORTERS = 2
page_file_count_display_string_lookup = {}
page_file_count_display_string_lookup[ PAGE_FILE_COUNT_DISPLAY_ALL ] = 'for all pages'
page_file_count_display_string_lookup[ PAGE_FILE_COUNT_DISPLAY_ONLY_IMPORTERS ] = 'for import pages'
page_file_count_display_string_lookup[ PAGE_FILE_COUNT_DISPLAY_NONE ] = 'for no pages'
page_file_count_display_string_lookup = {
PAGE_FILE_COUNT_DISPLAY_ALL : 'for all pages',
PAGE_FILE_COUNT_DISPLAY_ONLY_IMPORTERS : 'for import pages',
PAGE_FILE_COUNT_DISPLAY_NONE : 'for no pages'
}
SHUTDOWN_TIMESTAMP_VACUUM = 0
SHUTDOWN_TIMESTAMP_FATTEN_AC_CACHE = 1
@ -347,51 +347,51 @@ SYSTEM_SORT_TYPES = {
SORT_FILES_BY_ARCHIVED_TIMESTAMP
}
system_sort_type_submetatype_string_lookup = {}
system_sort_type_submetatype_string_lookup = {
SORT_FILES_BY_NUM_COLLECTION_FILES : 'collections',
SORT_FILES_BY_HEIGHT : 'dimensions',
SORT_FILES_BY_NUM_PIXELS : 'dimensions',
SORT_FILES_BY_RATIO : 'dimensions',
SORT_FILES_BY_WIDTH : 'dimensions',
SORT_FILES_BY_DURATION : 'duration',
SORT_FILES_BY_FRAMERATE : 'duration',
SORT_FILES_BY_NUM_FRAMES : 'duration',
SORT_FILES_BY_APPROX_BITRATE : 'file',
SORT_FILES_BY_FILESIZE : 'file',
SORT_FILES_BY_MIME : 'file',
SORT_FILES_BY_HAS_AUDIO : 'file',
SORT_FILES_BY_RANDOM : None,
SORT_FILES_BY_NUM_TAGS : 'tags',
SORT_FILES_BY_IMPORT_TIME : 'time',
SORT_FILES_BY_FILE_MODIFIED_TIMESTAMP : 'time',
SORT_FILES_BY_ARCHIVED_TIMESTAMP : 'time',
SORT_FILES_BY_LAST_VIEWED_TIME : 'time',
SORT_FILES_BY_MEDIA_VIEWS : 'views',
SORT_FILES_BY_MEDIA_VIEWTIME : 'views'
}
system_sort_type_submetatype_string_lookup[ SORT_FILES_BY_NUM_COLLECTION_FILES ] = 'collections'
system_sort_type_submetatype_string_lookup[ SORT_FILES_BY_HEIGHT ] = 'dimensions'
system_sort_type_submetatype_string_lookup[ SORT_FILES_BY_NUM_PIXELS ] = 'dimensions'
system_sort_type_submetatype_string_lookup[ SORT_FILES_BY_RATIO ] = 'dimensions'
system_sort_type_submetatype_string_lookup[ SORT_FILES_BY_WIDTH ] = 'dimensions'
system_sort_type_submetatype_string_lookup[ SORT_FILES_BY_DURATION ] = 'duration'
system_sort_type_submetatype_string_lookup[ SORT_FILES_BY_FRAMERATE ] = 'duration'
system_sort_type_submetatype_string_lookup[ SORT_FILES_BY_NUM_FRAMES ] = 'duration'
system_sort_type_submetatype_string_lookup[ SORT_FILES_BY_APPROX_BITRATE ] = 'file'
system_sort_type_submetatype_string_lookup[ SORT_FILES_BY_FILESIZE ] = 'file'
system_sort_type_submetatype_string_lookup[ SORT_FILES_BY_MIME ] = 'file'
system_sort_type_submetatype_string_lookup[ SORT_FILES_BY_HAS_AUDIO ] = 'file'
system_sort_type_submetatype_string_lookup[ SORT_FILES_BY_RANDOM ] = None
system_sort_type_submetatype_string_lookup[ SORT_FILES_BY_NUM_TAGS ] = 'tags'
system_sort_type_submetatype_string_lookup[ SORT_FILES_BY_IMPORT_TIME ] = 'time'
system_sort_type_submetatype_string_lookup[ SORT_FILES_BY_FILE_MODIFIED_TIMESTAMP ] = 'time'
system_sort_type_submetatype_string_lookup[ SORT_FILES_BY_ARCHIVED_TIMESTAMP ] = 'time'
system_sort_type_submetatype_string_lookup[ SORT_FILES_BY_LAST_VIEWED_TIME ] = 'time'
system_sort_type_submetatype_string_lookup[ SORT_FILES_BY_MEDIA_VIEWS ] = 'views'
system_sort_type_submetatype_string_lookup[ SORT_FILES_BY_MEDIA_VIEWTIME ] = 'views'
sort_type_basic_string_lookup = {}
sort_type_basic_string_lookup[ SORT_FILES_BY_DURATION ] = 'duration'
sort_type_basic_string_lookup[ SORT_FILES_BY_FRAMERATE ] = 'framerate'
sort_type_basic_string_lookup[ SORT_FILES_BY_NUM_FRAMES ] = 'number of frames'
sort_type_basic_string_lookup[ SORT_FILES_BY_HEIGHT ] = 'height'
sort_type_basic_string_lookup[ SORT_FILES_BY_NUM_COLLECTION_FILES ] = 'number of files in collection'
sort_type_basic_string_lookup[ SORT_FILES_BY_NUM_PIXELS ] = 'number of pixels'
sort_type_basic_string_lookup[ SORT_FILES_BY_RATIO ] = 'resolution ratio'
sort_type_basic_string_lookup[ SORT_FILES_BY_WIDTH ] = 'width'
sort_type_basic_string_lookup[ SORT_FILES_BY_APPROX_BITRATE ] = 'approximate bitrate'
sort_type_basic_string_lookup[ SORT_FILES_BY_FILESIZE ] = 'filesize'
sort_type_basic_string_lookup[ SORT_FILES_BY_MIME ] = 'filetype'
sort_type_basic_string_lookup[ SORT_FILES_BY_HAS_AUDIO ] = 'has audio'
sort_type_basic_string_lookup[ SORT_FILES_BY_IMPORT_TIME ] = 'import time'
sort_type_basic_string_lookup[ SORT_FILES_BY_FILE_MODIFIED_TIMESTAMP ] = 'modified time'
sort_type_basic_string_lookup[ SORT_FILES_BY_ARCHIVED_TIMESTAMP ] = 'archived time'
sort_type_basic_string_lookup[ SORT_FILES_BY_LAST_VIEWED_TIME ] = 'last viewed time'
sort_type_basic_string_lookup[ SORT_FILES_BY_RANDOM ] = 'random'
sort_type_basic_string_lookup[ SORT_FILES_BY_NUM_TAGS ] = 'number of tags'
sort_type_basic_string_lookup[ SORT_FILES_BY_MEDIA_VIEWS ] = 'media views'
sort_type_basic_string_lookup[ SORT_FILES_BY_MEDIA_VIEWTIME ] = 'media viewtime'
sort_type_basic_string_lookup = {
SORT_FILES_BY_DURATION : 'duration',
SORT_FILES_BY_FRAMERATE : 'framerate',
SORT_FILES_BY_NUM_FRAMES : 'number of frames',
SORT_FILES_BY_HEIGHT : 'height',
SORT_FILES_BY_NUM_COLLECTION_FILES : 'number of files in collection',
SORT_FILES_BY_NUM_PIXELS : 'number of pixels',
SORT_FILES_BY_RATIO : 'resolution ratio',
SORT_FILES_BY_WIDTH : 'width',
SORT_FILES_BY_APPROX_BITRATE : 'approximate bitrate',
SORT_FILES_BY_FILESIZE : 'filesize',
SORT_FILES_BY_MIME : 'filetype',
SORT_FILES_BY_HAS_AUDIO : 'has audio',
SORT_FILES_BY_IMPORT_TIME : 'import time',
SORT_FILES_BY_FILE_MODIFIED_TIMESTAMP : 'modified time',
SORT_FILES_BY_ARCHIVED_TIMESTAMP : 'archived time',
SORT_FILES_BY_LAST_VIEWED_TIME : 'last viewed time',
SORT_FILES_BY_RANDOM : 'random',
SORT_FILES_BY_NUM_TAGS : 'number of tags',
SORT_FILES_BY_MEDIA_VIEWS : 'media views',
SORT_FILES_BY_MEDIA_VIEWTIME : 'media viewtime'
}
sort_type_string_lookup = {}
@ -424,18 +424,18 @@ STATUS_VETOED = 7
STATUS_SKIPPED = 8
STATUS_SUCCESSFUL_AND_CHILD_FILES = 9
status_string_lookup = {}
status_string_lookup[ STATUS_UNKNOWN ] = ''
status_string_lookup[ STATUS_SUCCESSFUL_AND_NEW ] = 'successful'
status_string_lookup[ STATUS_SUCCESSFUL_BUT_REDUNDANT ] = 'already in db'
status_string_lookup[ STATUS_DELETED ] = 'deleted'
status_string_lookup[ STATUS_ERROR ] = 'error'
status_string_lookup[ STATUS_NEW ] = 'new'
status_string_lookup[ STATUS_PAUSED ] = 'paused'
status_string_lookup[ STATUS_VETOED ] = 'ignored'
status_string_lookup[ STATUS_SKIPPED ] = 'skipped'
status_string_lookup[ STATUS_SUCCESSFUL_AND_CHILD_FILES ] = 'completed'
status_string_lookup = {
STATUS_UNKNOWN : '',
STATUS_SUCCESSFUL_AND_NEW : 'successful',
STATUS_SUCCESSFUL_BUT_REDUNDANT : 'already in db',
STATUS_DELETED : 'deleted',
STATUS_ERROR : 'error',
STATUS_NEW : 'new',
STATUS_PAUSED : 'paused',
STATUS_VETOED : 'ignored',
STATUS_SKIPPED : 'skipped',
STATUS_SUCCESSFUL_AND_CHILD_FILES : 'completed'
}
SUCCESSFUL_IMPORT_STATES = { STATUS_SUCCESSFUL_AND_NEW, STATUS_SUCCESSFUL_BUT_REDUNDANT, STATUS_SUCCESSFUL_AND_CHILD_FILES }
UNSUCCESSFUL_IMPORT_STATES = { STATUS_DELETED, STATUS_ERROR, STATUS_VETOED }
@ -450,13 +450,13 @@ ZOOM_AREA = 2 # for shrinking without moire
ZOOM_CUBIC = 3 # for interpolating, pretty good
ZOOM_LANCZOS4 = 4 # for interpolating, noice
zoom_string_lookup = {}
zoom_string_lookup[ ZOOM_NEAREST ] = 'nearest neighbour'
zoom_string_lookup[ ZOOM_LINEAR ] = 'bilinear interpolation'
zoom_string_lookup[ ZOOM_AREA ] = 'pixel area resampling'
zoom_string_lookup[ ZOOM_CUBIC ] = '4x4 bilinear interpolation'
zoom_string_lookup[ ZOOM_LANCZOS4 ] = '8x8 Lanczos interpolation'
zoom_string_lookup = {
ZOOM_NEAREST : 'nearest neighbour',
ZOOM_LINEAR : 'bilinear interpolation',
ZOOM_AREA : 'pixel area resampling',
ZOOM_CUBIC : '4x4 bilinear interpolation',
ZOOM_LANCZOS4 : '8x8 Lanczos interpolation'
}
class GlobalPixmaps( object ):


@ -719,7 +719,7 @@ def SetDefaultFavouriteSearchManagerData( favourite_search_manager ):
#
favourite_search_manager.SetFavouriteSearchRows( rows )
def SetDefaultLoginManagerScripts( login_manager ):
default_login_scripts = GetDefaultLoginScripts()


@ -291,7 +291,7 @@ class ExportFolder( HydrusSerialisable.SerialisableBaseNamed ):
if file_search_context is None:
default_location_context = HG.client_controller.services_manager.GetDefaultLocationContext()
default_location_context = HG.client_controller.new_options.GetDefaultLocalLocationContext()
file_search_context = ClientSearch.FileSearchContext( location_context = default_location_context )


@ -75,23 +75,24 @@ class LocationContext( HydrusSerialisable.SerialisableBase ):
self.deleted_service_keys = frozenset( { bytes.fromhex( service_key ) for service_key in serialisable_deleted_service_keys } )
def ClearAllLocalFilesServices( self, filter_func: typing.Callable ):
def ClearSurplusLocalFilesServices( self, service_type_func: typing.Callable ):
# if we have combined local files, then we don't need specific local domains
if CC.COMBINED_LOCAL_FILE_SERVICE_KEY in self.current_service_keys:
self.current_service_keys = frozenset( ( service_key for service_key in self.current_service_keys if filter_func( service_key ) ) )
self.current_service_keys = frozenset( ( service_key for service_key in self.current_service_keys if service_type_func( service_key ) not in ( HC.LOCAL_FILE_DOMAIN, HC.LOCAL_FILE_TRASH_DOMAIN ) ) )
if CC.COMBINED_LOCAL_FILE_SERVICE_KEY in self.deleted_service_keys:
self.deleted_service_keys = frozenset( ( service_key for service_key in self.deleted_service_keys if filter_func( service_key ) ) )
self.deleted_service_keys = frozenset( ( service_key for service_key in self.deleted_service_keys if service_type_func( service_key ) not in ( HC.LOCAL_FILE_DOMAIN, HC.LOCAL_FILE_TRASH_DOMAIN ) ) )
def FixMissingServices( self, filter_method: typing.Callable ):
def FixMissingServices( self, services_exist_func: typing.Callable ):
self.current_service_keys = frozenset( filter_method( self.current_service_keys ) )
self.deleted_service_keys = frozenset( filter_method( self.deleted_service_keys ) )
self.current_service_keys = frozenset( services_exist_func( self.current_service_keys ) )
self.deleted_service_keys = frozenset( services_exist_func( self.deleted_service_keys ) )
def GetCoveringCurrentFileServiceKeys( self ):
@ -108,6 +109,14 @@ class LocationContext( HydrusSerialisable.SerialisableBase ):
return ( file_service_keys, file_location_is_cross_referenced )
def GetStatusesAndServiceKeysList( self ):
statuses_and_service_keys = [ ( HC.CONTENT_STATUS_CURRENT, service_key ) for service_key in self.current_service_keys ]
statuses_and_service_keys.extend( [ ( HC.CONTENT_STATUS_DELETED, service_key ) for service_key in self.deleted_service_keys ] )
return statuses_and_service_keys
def IncludesCurrent( self ):
return len( self.current_service_keys ) > 0
@ -133,6 +142,13 @@ class LocationContext( HydrusSerialisable.SerialisableBase ):
return len( self.current_service_keys ) + len( self.deleted_service_keys ) == 1
def LimitToServiceTypes( self, service_type_func: typing.Callable, service_types ):
self.current_service_keys = frozenset( ( service_key for service_key in self.current_service_keys if service_type_func( service_key ) in service_types ) )
self.deleted_service_keys = frozenset( ( service_key for service_key in self.deleted_service_keys if service_type_func( service_key ) in service_types ) )
def SearchesAnything( self ):
return len( self.current_service_keys ) + len( self.deleted_service_keys ) > 0
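
The hunk above swaps opaque filter callables for an explicit service-type lookup: when the umbrella "all local files" domain is in a location context, the specific local file domains it already covers are redundant and get dropped. A minimal sketch of that rule, with hypothetical stand-ins for the hydrus service-key and type constants:

```python
def clear_surplus_local_services( service_keys, service_type_of, combined_key, surplus_types ):
    # If the combined local-files umbrella is absent, there is nothing surplus.
    if combined_key not in service_keys:
        return frozenset( service_keys )
    # Otherwise drop every key whose type is one of the specific local domains
    # the umbrella already covers, keeping the umbrella itself.
    return frozenset(
        key for key in service_keys
        if key == combined_key or service_type_of( key ) not in surplus_types
    )
```

The combined key survives its own filter because its service type is not a specific local domain.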


@ -498,6 +498,10 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):
self._dictionary[ 'predicate_types_to_recent_predicates' ] = HydrusSerialisable.SerialisableDictionary()
from hydrus.client import ClientLocation
self._dictionary[ 'default_local_location_context' ] = ClientLocation.LocationContext.STATICCreateSimple( CC.LOCAL_FILE_SERVICE_KEY )
#
self._dictionary[ 'favourite_tag_filters' ] = HydrusSerialisable.SerialisableDictionary()
@ -932,6 +936,14 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):
def GetDefaultLocalLocationContext( self ):
with self._lock:
return self._dictionary[ 'default_local_location_context' ]
def GetDefaultMediaViewOptions( self ):
with self._lock:
@ -1344,19 +1356,11 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):
def SetDefaultSort( self, media_sort ):
def SetDefaultLocalLocationContext( self, location_context ):
with self._lock:
self._dictionary[ 'default_sort' ] = media_sort
def SetDefaultSubscriptionCheckerOptions( self, checker_options ):
with self._lock:
self._dictionary[ 'misc' ][ 'default_subscription_checker_options' ] = checker_options
self._dictionary[ 'default_local_location_context' ] = location_context
@ -1383,6 +1387,22 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):
def SetDefaultSort( self, media_sort ):
with self._lock:
self._dictionary[ 'default_sort' ] = media_sort
def SetDefaultSubscriptionCheckerOptions( self, checker_options ):
with self._lock:
self._dictionary[ 'misc' ][ 'default_subscription_checker_options' ] = checker_options
def SetDefaultWatcherCheckerOptions( self, checker_options ):
with self._lock:


@ -3323,11 +3323,6 @@ class ServicesManager( object ):
def GetDefaultLocationContext( self ) -> bytes:
return ClientLocation.LocationContext.STATICCreateSimple( CC.LOCAL_FILE_SERVICE_KEY )
def GetLocalMediaFileServices( self ):
with self._lock:
@ -3336,6 +3331,13 @@ class ServicesManager( object ):
def GetLocalMediaLocationContextUmbrella( self ) -> ClientLocation.LocationContext:
service_keys = [ service.GetServiceKey() for service in self.GetLocalMediaFileServices() ]
return ClientLocation.LocationContext( current_service_keys = service_keys )
def GetName( self, service_key: bytes ):
with self._lock:


@ -3285,20 +3285,23 @@ class DB( HydrusDB.HydrusDB ):
return hash_ids
def _FilterHashesByService( self, file_service_key: bytes, hashes: typing.Sequence[ bytes ] ) -> typing.List[ bytes ]:
def _FilterHashesByService( self, location_context: ClientLocation.LocationContext, hashes: typing.Sequence[ bytes ] ) -> typing.List[ bytes ]:
# returns hashes in order, to be nice to UI
if file_service_key == CC.COMBINED_FILE_SERVICE_KEY:
if not location_context.SearchesAnything():
return []
if location_context.IsAllKnownFiles():
return list( hashes )
service_id = self.modules_services.GetServiceId( file_service_key )
hashes_to_hash_ids = { hash : self.modules_hashes_local_cache.GetHashId( hash ) for hash in hashes if self.modules_hashes.HasHash( hash ) }
valid_hash_ids = self.modules_files_storage.FilterHashIdsToStatus( service_id, set( hashes_to_hash_ids.values() ), HC.CONTENT_STATUS_CURRENT )
valid_hash_ids = self.modules_files_storage.FilterHashIds( location_context, hashes_to_hash_ids.values() )
return [ hash for hash in hashes if hash in hashes_to_hash_ids and hashes_to_hash_ids[ hash ] in valid_hash_ids ]
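
The rewritten `_FilterHashesByService` keeps one subtle property noted in its comment: results come back in the caller's original order, to be nice to the UI. That pattern, isolated as a hypothetical helper (not hydrus API):

```python
def filter_hashes_preserving_order( hashes, is_valid ):
    # Build the validity set first, then walk the original sequence, so the
    # caller receives the surviving hashes in the order it asked for.
    valid = { h for h in hashes if is_valid( h ) }
    return [ h for h in hashes if h in valid ]
```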
@ -4073,41 +4076,56 @@ class DB( HydrusDB.HydrusDB ):
file_history[ 'deleted' ] = deleted_file_history
# and inbox, which will work backwards since we have numbers for archiving. several subtle differences here
( total_inbox_files, ) = self._Execute( 'SELECT COUNT( * ) FROM file_inbox;' ).fetchone()
archive_timestamps = self._STL( self._Execute( 'SELECT archived_timestamp FROM archive_timestamps ORDER BY archived_timestamp ASC;' ) )
# we know the inbox now and the recent history of archives and file changes
# working backwards in time (which reverses increment/decrement):
# an archive increments
# a file import decrements
# note that we archive right before we delete a file, so file deletes shouldn't change anything. all deletes are on archived files, so the increment will already be counted
inbox_file_history = []
( total_inbox_files, ) = self._Execute( 'SELECT COUNT( * ) FROM file_inbox;' ).fetchone()
# note also that we do not scrub archived time on a file delete, so this upcoming fetch is for all files ever. this is useful, so don't undo it m8
archive_timestamps = self._STL( self._Execute( 'SELECT archived_timestamp FROM archive_timestamps ORDER BY archived_timestamp ASC;' ) )
if len( archive_timestamps ) > 0:
if len( archive_timestamps ) < 2:
step_gap = 1
else:
step_gap = max( ( archive_timestamps[-1] - archive_timestamps[0] ) // num_steps, 1 )
first_archive_time = archive_timestamps[0]
archive_timestamps.reverse()
combined_timestamps_with_delta = [ ( timestamp, 1 ) for timestamp in archive_timestamps ]
combined_timestamps_with_delta.extend( ( ( timestamp, -1 ) for timestamp in current_timestamps if timestamp >= first_archive_time ) )
step_timestamp = archive_timestamps[0]
combined_timestamps_with_delta.sort( reverse = True )
for archived_timestamp in archive_timestamps:
if len( combined_timestamps_with_delta ) > 0:
if archived_timestamp < step_timestamp - step_gap:
if len( combined_timestamps_with_delta ) < 2:
inbox_file_history.append( ( archived_timestamp, total_inbox_files ) )
step_gap = 1
step_timestamp = archived_timestamp
else:
# reversed, so first minus last
step_gap = max( ( combined_timestamps_with_delta[0][0] - combined_timestamps_with_delta[-1][0] ) // num_steps, 1 )
total_inbox_files += 1
step_timestamp = combined_timestamps_with_delta[0][0]
for ( archived_timestamp, delta ) in combined_timestamps_with_delta:
if archived_timestamp < step_timestamp - step_gap:
inbox_file_history.append( ( archived_timestamp, total_inbox_files ) )
step_timestamp = archived_timestamp
total_inbox_files += delta
inbox_file_history.reverse()
inbox_file_history.reverse()
file_history[ 'inbox' ] = inbox_file_history
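
The inbox fix above is easiest to see in isolation: the count is reconstructed backwards from the present, and reversing time flips each event's effect. An archive removed a file from the inbox, so stepping back over it adds one; a file import added one, so stepping back removes one. A simplified sketch without the step bucketing, one point per event (names are stand-ins):

```python
def reconstruct_inbox_history( total_inbox_now, archive_times, import_times ):
    # Walking backwards: archives increment, imports decrement. File deletes
    # are ignored because hydrus archives right before deleting.
    events = [ ( t, 1 ) for t in archive_times ]
    events += [ ( t, -1 ) for t in import_times ]
    events.sort( reverse = True )
    
    history = []
    count = total_inbox_now
    
    for ( timestamp, delta ) in events:
        history.append( ( timestamp, count ) )
        count += delta
    
    history.reverse()
    
    return history
```

Each recorded pair is the inbox size just after the event at that timestamp.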
@ -12209,7 +12227,10 @@ class DB( HydrusDB.HydrusDB ):
rows = self.modules_files_storage.GetUndeleteRows( service_id, hash_ids )
self._AddFiles( service_id, rows )
if len( rows ) > 0:
self._AddFiles( service_id, rows )
def _UnloadModules( self ):


@ -2276,7 +2276,7 @@ class FrameGUI( ClientGUITopLevelWindows.MainFrameThatResizes ):
HG.client_controller.pub( 'message', job_key )
num_steps = 1000
num_steps = 2000
file_history = HG.client_controller.Read( 'file_history', num_steps )
@ -3213,7 +3213,7 @@ class FrameGUI( ClientGUITopLevelWindows.MainFrameThatResizes ):
gui_actions = QW.QMenu( debug )
default_location_context = HG.client_controller.services_manager.GetDefaultLocationContext()
default_location_context = HG.client_controller.new_options.GetDefaultLocalLocationContext()
def flip_macos_antiflicker():
@ -3634,7 +3634,7 @@ class FrameGUI( ClientGUITopLevelWindows.MainFrameThatResizes ):
if load_a_blank_page:
default_location_context = HG.client_controller.services_manager.GetDefaultLocationContext()
default_location_context = HG.client_controller.new_options.GetDefaultLocalLocationContext()
self._notebook.NewPageQuery( default_location_context, on_deepest_notebook = True )
@ -5717,7 +5717,7 @@ class FrameGUI( ClientGUITopLevelWindows.MainFrameThatResizes ):
t = 0.25
default_location_context = HG.client_controller.services_manager.GetDefaultLocationContext()
default_location_context = HG.client_controller.new_options.GetDefaultLocalLocationContext()
HG.client_controller.CallLaterQtSafe( self, t, 'test job', self._notebook.NewPageQuery, default_location_context, page_name = 'test', on_deepest_notebook = True )
@ -5789,7 +5789,7 @@ class FrameGUI( ClientGUITopLevelWindows.MainFrameThatResizes ):
def qt_test_ac():
default_location_context = HG.client_controller.services_manager.GetDefaultLocationContext()
default_location_context = HG.client_controller.new_options.GetDefaultLocalLocationContext()
SYS_PRED_REFRESH = 1.0


@ -478,7 +478,7 @@ class TagSubPanel( QW.QWidget ):
self._tag_value = QW.QLineEdit( self )
self._tag_value.setReadOnly( True )
default_location_context = HG.client_controller.services_manager.GetDefaultLocationContext()
default_location_context = HG.client_controller.new_options.GetDefaultLocalLocationContext()
self._tag_input = ClientGUIACDropdown.AutoCompleteDropdownTagsWrite( self, self.SetTags, default_location_context, CC.COMBINED_TAG_SERVICE_KEY )


@ -1,3 +1,5 @@
import itertools
from qtpy import QtCore as QC
try:
@ -81,10 +83,14 @@ try:
current_files_series.setName( 'files in storage' )
max_num_files = 0
for ( timestamp, num_files ) in file_history[ 'current' ]:
current_files_series.append( timestamp * 1000.0, num_files )
max_num_files = max( max_num_files, num_files )
deleted_files_series = QCh.QtCharts.QLineSeries()
@ -94,6 +100,8 @@ try:
deleted_files_series.append( timestamp * 1000.0, num_files )
max_num_files = max( max_num_files, num_files )
inbox_files_series = QCh.QtCharts.QLineSeries()
@ -103,6 +111,8 @@ try:
inbox_files_series.append( timestamp * 1000.0, num_files )
max_num_files = max( max_num_files, num_files )
# takes ms since epoch
x_datetime_axis = QCh.QtCharts.QDateTimeAxis()
@ -114,7 +124,7 @@ try:
y_value_axis = QCh.QtCharts.QValueAxis()
y_value_axis.setLabelFormat( '%i' )
y_value_axis.setLabelFormat( '%\'i' )
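
The `%'i` change is the "fox tongues" item from the changelog: the apostrophe is the POSIX printf flag requesting locale-aware thousands grouping, which glibc honours but some platforms presumably ignore, hence the hedging about it "just not working for some machines". A rough Python illustration of the same locale dependence:

```python
import locale

# The "C" locale defines no thousands separator, so grouping is a no-op there.
# This is roughly why a format like "%'i" can silently do nothing on some machines.
locale.setlocale( locale.LC_ALL, 'C' )
ungrouped = locale.format_string( '%d', 5000, grouping = True )

# str.format's comma option groups unconditionally, independent of locale.
grouped = f'{5000:,}'
```

So the axis shows `5,000` only where the active locale actually supplies a separator.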
chart = QCh.QtCharts.QChart()
@ -134,7 +144,7 @@ try:
inbox_files_series.attachAxis( x_datetime_axis )
inbox_files_series.attachAxis( y_value_axis )
y_value_axis.setMin( 0 )
y_value_axis.setRange( 0, max_num_files )
y_value_axis.applyNiceNumbers()


@ -514,7 +514,7 @@ class DialogInputTags( Dialog ):
self._tags = ClientGUIListBoxes.ListBoxTagsStringsAddRemove( self, service_key, tag_display_type )
default_location_context = HG.client_controller.services_manager.GetDefaultLocationContext()
default_location_context = HG.client_controller.new_options.GetDefaultLocalLocationContext()
self._tag_autocomplete = ClientGUIACDropdown.AutoCompleteDropdownTagsWrite( self, self.EnterTags, default_location_context, service_key, null_entry_callable = self.OK, show_paste_button = True )


@ -1,3 +1,4 @@
import collections
import os
import time
import traceback
@ -74,7 +75,7 @@ class EditExportFoldersPanel( ClientGUIScrolledPanels.EditPanel ):
export_type = HC.EXPORT_FOLDER_TYPE_REGULAR
delete_from_client_after_export = False
default_location_context = HG.client_controller.services_manager.GetDefaultLocationContext()
default_location_context = HG.client_controller.new_options.GetDefaultLocalLocationContext()
file_search_context = ClientSearch.FileSearchContext( location_context = default_location_context )
@ -900,22 +901,35 @@ class ReviewExportFilesPanel( ClientGUIScrolledPanels.ReviewPanel ):
if delete_lock_for_archived_files:
deletee_hashes = { media.GetHash() for ( ordering_index, media, path ) in to_do if not media.HasArchive() }
deletee_medias = { media for ( ordering_index, media, path ) in to_do if not media.HasArchive() }
else:
deletee_hashes = { media.GetHash() for ( ordering_index, media, path ) in to_do }
deletee_medias = { media for ( ordering_index, media, path ) in to_do }
chunks_of_hashes = HydrusData.SplitListIntoChunks( deletee_hashes, 64 )
chunks_of_deletee_medias = HydrusData.SplitListIntoChunks( list( deletee_medias ), 64 )
reason = 'Deleted after manual export to "{}".'.format( directory )
content_updates = [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_FILES, HC.CONTENT_UPDATE_DELETE, chunk_of_hashes, reason = reason ) for chunk_of_hashes in chunks_of_hashes ]
for content_update in content_updates:
for chunk_of_deletee_medias in chunks_of_deletee_medias:
HG.client_controller.WriteSynchronous( 'content_updates', { CC.LOCAL_FILE_SERVICE_KEY : [ content_update ] } )
reason = 'Deleted after manual export to "{}".'.format( directory )
service_keys_to_hashes = collections.defaultdict( set )
for media in chunk_of_deletee_medias:
for service_key in media.GetLocationsManager().GetCurrent():
service_keys_to_hashes[ service_key ].add( media.GetHash() )
for service_key in ClientLocation.ValidLocalDomainsFilter( service_keys_to_hashes.keys() ):
content_update = HydrusData.ContentUpdate( HC.CONTENT_TYPE_FILES, HC.CONTENT_UPDATE_DELETE, service_keys_to_hashes[ service_key ], reason = reason )
HG.client_controller.WriteSynchronous( 'content_updates', { service_key : [ content_update ] } )
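
The export-delete rewrite stops assuming everything lives in 'my files': each chunk of exported files is bucketed under every local service that currently holds it, and one delete update is issued per valid domain. The bucketing step, sketched with plain tuples standing in for the media objects above:

```python
import collections

def group_hashes_by_service( medias, valid_service_keys ):
    # medias: iterable of ( hash, current_service_keys ) pairs, standing in for
    # media.GetHash() / media.GetLocationsManager().GetCurrent() above.
    service_keys_to_hashes = collections.defaultdict( set )
    
    for ( media_hash, current_service_keys ) in medias:
        for service_key in current_service_keys:
            service_keys_to_hashes[ service_key ].add( media_hash )
    
    # Keep only the domains it is valid to delete from (the local file domains).
    return {
        key : hashes for ( key, hashes ) in service_keys_to_hashes.items()
        if key in valid_service_keys
    }
```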


@ -493,7 +493,7 @@ class FilenameTaggingOptionsPanel( QW.QWidget ):
self._tags = ClientGUIListBoxes.ListBoxTagsStringsAddRemove( self._tags_panel, self._service_key, tag_display_type = ClientTags.TAG_DISPLAY_STORAGE )
default_location_context = HG.client_controller.services_manager.GetDefaultLocationContext()
default_location_context = HG.client_controller.new_options.GetDefaultLocalLocationContext()
self._tag_autocomplete_all = ClientGUIACDropdown.AutoCompleteDropdownTagsWrite( self._tags_panel, self.EnterTags, default_location_context, service_key, show_paste_button = True )


@ -1628,13 +1628,21 @@ class EditFileNotesPanel( ClientGUIScrolledPanels.EditPanel ):
control = QW.QPlainTextEdit( self._notebook )
control.setPlainText( note )
try:
control.setPlainText( note )
except:
control.setPlainText( repr( note ) )
self._notebook.addTab( control, name )
self._notebook.setCurrentWidget( control )
ClientGUIFunctions.SetFocusLater( control )
HG.client_controller.CallAfterQtSafe( control, 'moving cursor to end', control.moveCursor, QG.QTextCursor.End )
self._UpdateButtons()
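
The try/except above is the changelog's "crazy note text" fix: if the raw text cannot be set, display the best approximation via `repr` instead of erroring. The same shape, with a UTF-8 encodability check standing in for whatever `setPlainText` actually rejects:

```python
def displayable_note( note: str ) -> str:
    # Unpaired surrogates and similar oddities cannot round-trip through the
    # widget; fall back to repr() so the user still sees something editable.
    try:
        note.encode( 'utf-8' )  # stand-in check, not the real Qt failure mode
        return note
    except UnicodeEncodeError:
        return repr( note )
```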


@ -20,6 +20,7 @@ from hydrus.core import HydrusText
from hydrus.client import ClientApplicationCommand as CAC
from hydrus.client import ClientConstants as CC
from hydrus.client import ClientLocation
from hydrus.client.gui import ClientGUIDialogs
from hydrus.client.gui import ClientGUIDialogsQuick
from hydrus.client.gui import ClientGUIFunctions
@ -38,7 +39,7 @@ from hydrus.client.gui.lists import ClientGUIListConstants as CGLC
from hydrus.client.gui.lists import ClientGUIListCtrl
from hydrus.client.gui.pages import ClientGUIResultsSortCollect
from hydrus.client.gui.search import ClientGUIACDropdown
from hydrus.client.gui.search import ClientGUISearch
from hydrus.client.gui.search import ClientGUILocation
from hydrus.client.gui.widgets import ClientGUICommon
from hydrus.client.gui.widgets import ClientGUIControls
from hydrus.client.media import ClientMedia
@ -909,6 +910,13 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
self._export_location = QP.DirPickerCtrl( self )
location_context = self._new_options.GetDefaultLocalLocationContext()
self._default_local_location_context = ClientGUILocation.LocationSearchContextButton( self, location_context )
self._default_local_location_context.setToolTip( 'This is initialised into a bunch of dialogs across the program. You can probably leave it alone forever, but if you delete or move away from \'my files\' as your main place to do work, please update it here.' )
self._default_local_location_context.SetOnlyImportableDomainsAllowed( True )
self._prefix_hash_when_copying = QW.QCheckBox( self )
self._prefix_hash_when_copying.setToolTip( 'If you often paste hashes into boorus, check this to automatically prefix with the type, like "md5:2496dabcbd69e3c56a5d8caabb7acde5".' )
@ -988,6 +996,7 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
rows = []
rows.append( ( 'Default local file search location: ', self._default_local_location_context ) )
rows.append( ( 'When copying file hashes, prefix with booru-friendly hash type: ', self._prefix_hash_when_copying ) )
rows.append( ( 'Confirm sending files to trash: ', self._confirm_trash ) )
rows.append( ( 'Confirm sending more than one file to archive or inbox: ', self._confirm_archive ) )
@ -1071,6 +1080,8 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
HC.options[ 'export_path' ] = HydrusPaths.ConvertAbsPathToPortablePath( self._export_location.GetPath() )
self._new_options.SetDefaultLocalLocationContext( self._default_local_location_context.GetValue() )
self._new_options.SetBoolean( 'prefix_hash_when_copying', self._prefix_hash_when_copying.isChecked() )
HC.options[ 'delete_to_recycle_bin' ] = self._delete_to_recycle_bin.isChecked()
@ -1950,7 +1961,7 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
self._animated_scanbar_height = QP.MakeQSpinBox( self, min=1, max=255 )
self._animated_scanbar_nub_width = QP.MakeQSpinBox( self, min=1, max=63 )
self._media_viewer_panel = ClientGUICommon.StaticBox( self, 'media viewer mime handling' )
self._media_viewer_panel = ClientGUICommon.StaticBox( self, 'media viewer filetype handling' )
media_viewer_list_panel = ClientGUIListCtrl.BetterListCtrlPanel( self._media_viewer_panel )
@ -3279,7 +3290,7 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
favourites_st = ClientGUICommon.BetterStaticText( favourites_panel, desc )
favourites_st.setWordWrap( True )
default_location_context = HG.client_controller.services_manager.GetDefaultLocationContext()
default_location_context = HG.client_controller.new_options.GetDefaultLocalLocationContext()
self._favourites = ClientGUIListBoxes.ListBoxTagsStringsAddRemove( favourites_panel, CC.COMBINED_TAG_SERVICE_KEY, ClientTags.TAG_DISPLAY_STORAGE )
self._favourites_input = ClientGUIACDropdown.AutoCompleteDropdownTagsWrite( favourites_panel, self._favourites.AddTags, default_location_context, CC.COMBINED_TAG_SERVICE_KEY, show_paste_button = True )
@ -3573,7 +3584,7 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
self._suggested_favourites_dict = {}
default_location_context = HG.client_controller.services_manager.GetDefaultLocationContext()
default_location_context = HG.client_controller.new_options.GetDefaultLocalLocationContext()
self._suggested_favourites_input = ClientGUIACDropdown.AutoCompleteDropdownTagsWrite( suggested_tags_favourites_panel, self._suggested_favourites.AddTags, default_location_context, CC.COMBINED_TAG_SERVICE_KEY, show_paste_button = True )


@ -2356,7 +2356,7 @@ class ReviewFileMaintenance( ClientGUIScrolledPanels.ReviewPanel ):
page_key = HydrusData.GenerateKey()
default_location_context = HG.client_controller.services_manager.GetDefaultLocationContext()
default_location_context = HG.client_controller.new_options.GetDefaultLocalLocationContext()
file_search_context = ClientSearch.FileSearchContext( location_context = default_location_context )


@ -40,6 +40,7 @@ from hydrus.client.gui.lists import ClientGUIListConstants as CGLC
from hydrus.client.gui.lists import ClientGUIListCtrl
from hydrus.client.gui.networking import ClientGUIHydrusNetwork
from hydrus.client.gui.search import ClientGUIACDropdown
from hydrus.client.gui.search import ClientGUILocation
from hydrus.client.gui.widgets import ClientGUICommon
from hydrus.client.gui.widgets import ClientGUIControls
from hydrus.client.gui.widgets import ClientGUIMenuButton
@ -120,7 +121,6 @@ class EditTagAutocompleteOptionsPanel( ClientGUIScrolledPanels.EditPanel ):
services_manager = HG.client_controller.services_manager
all_real_tag_service_keys = services_manager.GetServiceKeys( HC.REAL_TAG_SERVICES )
all_real_file_service_keys = services_manager.GetServiceKeys( ( HC.LOCAL_FILE_DOMAIN, HC.FILE_REPOSITORY ) )
#
@ -134,18 +134,13 @@ class EditTagAutocompleteOptionsPanel( ClientGUIScrolledPanels.EditPanel ):
self._write_autocomplete_tag_domain.addItem( services_manager.GetName( service_key ), service_key )
self._override_write_autocomplete_file_domain = QW.QCheckBox( self )
self._override_write_autocomplete_file_domain.setToolTip( 'If set, a manage tags dialog autocomplete will start with a different file domain than the one that launched the dialog.' )
self._override_write_autocomplete_location_context = QW.QCheckBox( self )
self._override_write_autocomplete_location_context.setToolTip( 'If set, a manage tags dialog autocomplete will start with a different file domain than the one that launched the dialog.' )
self._write_autocomplete_file_domain = ClientGUICommon.BetterChoice( self )
self._write_autocomplete_file_domain.setToolTip( 'A manage tags autocomplete will start with this domain. Normally only useful for "all known files" or "my files".' )
self._write_autocomplete_location_context = ClientGUILocation.LocationSearchContextButton( self, tag_autocomplete_options.GetWriteAutocompleteLocationContext() )
self._write_autocomplete_location_context.setToolTip( 'A manage tags autocomplete will start with this domain. Normally only useful for "all known files" or "my files".' )
self._write_autocomplete_file_domain.addItem( services_manager.GetName( CC.COMBINED_FILE_SERVICE_KEY ), CC.COMBINED_FILE_SERVICE_KEY )
for service_key in all_real_file_service_keys:
self._write_autocomplete_file_domain.addItem( services_manager.GetName( service_key ), service_key )
self._write_autocomplete_location_context.SetAllKnownFilesAllowed( True, False )
self._search_namespaces_into_full_tags = QW.QCheckBox( self )
self._search_namespaces_into_full_tags.setToolTip( 'If on, a search for "ser" will return all "series:" results such as "series:metroid". On large tag services, these searches are extremely slow.' )
@ -168,8 +163,7 @@ class EditTagAutocompleteOptionsPanel( ClientGUIScrolledPanels.EditPanel ):
#
self._write_autocomplete_tag_domain.SetValue( tag_autocomplete_options.GetWriteAutocompleteTagDomain() )
self._override_write_autocomplete_file_domain.setChecked( tag_autocomplete_options.OverridesWriteAutocompleteFileDomain() )
self._write_autocomplete_file_domain.SetValue( tag_autocomplete_options.GetWriteAutocompleteFileDomain() )
self._override_write_autocomplete_location_context.setChecked( tag_autocomplete_options.OverridesWriteAutocompleteLocationContext() )
self._search_namespaces_into_full_tags.setChecked( tag_autocomplete_options.SearchNamespacesIntoFullTags() )
self._namespace_bare_fetch_all_allowed.setChecked( tag_autocomplete_options.NamespaceBareFetchAllAllowed() )
self._namespace_fetch_all_allowed.setChecked( tag_autocomplete_options.NamespaceFetchAllAllowed() )
@ -187,13 +181,13 @@ class EditTagAutocompleteOptionsPanel( ClientGUIScrolledPanels.EditPanel ):
if tag_autocomplete_options.GetServiceKey() == CC.COMBINED_TAG_SERVICE_KEY:
self._write_autocomplete_tag_domain.setVisible( False )
self._override_write_autocomplete_file_domain.setVisible( False )
self._write_autocomplete_file_domain.setVisible( False )
self._override_write_autocomplete_location_context.setVisible( False )
self._write_autocomplete_location_context.setVisible( False )
else:
rows.append( ( 'Override default autocomplete file domain in _manage tags_: ', self._override_write_autocomplete_file_domain ) )
rows.append( ( 'Default autocomplete file domain in _manage tags_: ', self._write_autocomplete_file_domain ) )
rows.append( ( 'Override default autocomplete file domain in _manage tags_: ', self._override_write_autocomplete_location_context ) )
rows.append( ( 'Default autocomplete location in _manage tags_: ', self._write_autocomplete_location_context ) )
rows.append( ( 'Default autocomplete tag domain in _manage tags_: ', self._write_autocomplete_tag_domain ) )
@ -218,14 +212,14 @@ class EditTagAutocompleteOptionsPanel( ClientGUIScrolledPanels.EditPanel ):
self._UpdateControls()
self._override_write_autocomplete_file_domain.stateChanged.connect( self._UpdateControls )
self._override_write_autocomplete_location_context.stateChanged.connect( self._UpdateControls )
self._search_namespaces_into_full_tags.stateChanged.connect( self._UpdateControls )
self._namespace_bare_fetch_all_allowed.stateChanged.connect( self._UpdateControls )
def _UpdateControls( self ):
self._write_autocomplete_file_domain.setEnabled( self._override_write_autocomplete_file_domain.isChecked() )
self._write_autocomplete_location_context.setEnabled( self._override_write_autocomplete_location_context.isChecked() )
if self._search_namespaces_into_full_tags.isChecked():
@ -264,8 +258,8 @@ class EditTagAutocompleteOptionsPanel( ClientGUIScrolledPanels.EditPanel ):
tag_autocomplete_options = ClientTagsHandling.TagAutocompleteOptions( self._original_tag_autocomplete_options.GetServiceKey() )
write_autocomplete_tag_domain = self._write_autocomplete_tag_domain.GetValue()
override_write_autocomplete_file_domain = self._override_write_autocomplete_file_domain.isChecked()
write_autocomplete_file_domain = self._write_autocomplete_file_domain.GetValue()
override_write_autocomplete_location_context = self._override_write_autocomplete_location_context.isChecked()
write_autocomplete_location_context = self._write_autocomplete_location_context.GetValue()
search_namespaces_into_full_tags = self._search_namespaces_into_full_tags.isChecked()
namespace_bare_fetch_all_allowed = self._namespace_bare_fetch_all_allowed.isChecked()
namespace_fetch_all_allowed = self._namespace_fetch_all_allowed.isChecked()
@ -273,8 +267,8 @@ class EditTagAutocompleteOptionsPanel( ClientGUIScrolledPanels.EditPanel ):
tag_autocomplete_options.SetTuple(
write_autocomplete_tag_domain,
override_write_autocomplete_file_domain,
write_autocomplete_file_domain,
override_write_autocomplete_location_context,
write_autocomplete_location_context,
search_namespaces_into_full_tags,
namespace_bare_fetch_all_allowed,
namespace_fetch_all_allowed,
@ -2951,7 +2945,7 @@ class ManageTagParents( ClientGUIScrolledPanels.ManagePanel ):
self._children.setMinimumHeight( preview_height )
self._parents.setMinimumHeight( preview_height )
default_location_context = HG.client_controller.services_manager.GetDefaultLocationContext()
default_location_context = HG.client_controller.new_options.GetDefaultLocalLocationContext()
self._child_input = ClientGUIACDropdown.AutoCompleteDropdownTagsWrite( self, self.EnterChildren, default_location_context, service_key, show_paste_button = True )
self._child_input.setEnabled( False )
@ -3950,7 +3944,7 @@ class ManageTagSiblings( ClientGUIScrolledPanels.ManagePanel ):
self._old_siblings.setMinimumHeight( preview_height )
default_location_context = HG.client_controller.services_manager.GetDefaultLocationContext()
default_location_context = HG.client_controller.new_options.GetDefaultLocalLocationContext()
self._old_input = ClientGUIACDropdown.AutoCompleteDropdownTagsWrite( self, self.EnterOlds, default_location_context, service_key, show_paste_button = True )
self._old_input.setEnabled( False )

@ -1593,6 +1593,7 @@ class UIActionSimulator:
QW.QApplication.instance().postEvent( widget, ev2 )
# TODO: rewrite this to be on my newer panel system so this can resize for lads on small screens etc..
class AboutBox( QW.QDialog ):
def __init__( self, parent, about_info ):

@ -150,7 +150,7 @@ def CreateManagementController( page_name, management_type, location_context = N
def CreateManagementControllerDuplicateFilter():
default_location_context = HG.client_controller.services_manager.GetDefaultLocationContext()
default_location_context = HG.client_controller.new_options.GetDefaultLocalLocationContext()
management_controller = CreateManagementController( 'duplicates', MANAGEMENT_TYPE_DUPLICATE_FILTER, location_context = default_location_context )

@ -122,9 +122,7 @@ class DialogPageChooser( ClientGUIDialogs.Dialog ):
self.setMinimumWidth( width )
self.setMinimumHeight( height )
self._services = HG.client_controller.services_manager.GetServices()
self._petition_service_keys = [ service.GetServiceKey() for service in self._services if service.GetServiceType() in HC.REPOSITORIES and True in ( service.HasPermission( content_type, HC.PERMISSION_ACTION_MODERATE ) for content_type in HC.SERVICE_TYPES_TO_CONTENT_TYPES[ service.GetServiceType() ] ) ]
self._petition_service_keys = [ service.GetServiceKey() for service in HG.client_controller.services_manager.GetServices( HC.REPOSITORIES ) if True in ( service.HasPermission( content_type, HC.PERMISSION_ACTION_MODERATE ) for content_type in HC.SERVICE_TYPES_TO_CONTENT_TYPES[ service.GetServiceType() ] ) ]
self._InitButtons( 'home' )
@ -280,20 +278,26 @@ class DialogPageChooser( ClientGUIDialogs.Dialog ):
elif menu_keyword == 'files':
entries.append( ( 'page_query', CC.LOCAL_FILE_SERVICE_KEY ) )
for service_key in self._controller.services_manager.GetServiceKeys( ( HC.LOCAL_FILE_DOMAIN, ) ):
if service_key == CC.LOCAL_UPDATE_SERVICE_KEY:
continue
entries.append( ( 'page_query', service_key ) )
entries.append( ( 'page_query', CC.TRASH_SERVICE_KEY ) )
if HG.client_controller.new_options.GetBoolean( 'advanced_mode' ):
if self._controller.new_options.GetBoolean( 'advanced_mode' ):
entries.append( ( 'page_query', CC.COMBINED_LOCAL_FILE_SERVICE_KEY ) )
for service in self._services:
for service_key in self._controller.services_manager.GetServiceKeys( ( HC.FILE_REPOSITORY, ) ):
if service.GetServiceType() == HC.FILE_REPOSITORY:
entries.append( ( 'page_query', service.GetServiceKey() ) )
entries.append( ( 'page_query', service_key ) )
elif menu_keyword == 'download':
@ -3222,7 +3226,7 @@ class PagesNotebook( QP.TabWidgetWithDnD ):
if give_it_a_blank_page:
default_location_context = HG.client_controller.services_manager.GetDefaultLocationContext()
default_location_context = HG.client_controller.new_options.GetDefaultLocalLocationContext()
page.NewPageQuery( default_location_context )

@ -316,7 +316,7 @@ class MediaPanel( ClientMedia.ListeningMediaList, QW.QScrollArea ):
media_to_delete = [ m for m in media_to_delete if only_those_in_file_service_key in m.GetLocationsManager().GetCurrent() ]
if file_service_key is None or file_service_key in ( CC.LOCAL_FILE_SERVICE_KEY, CC.COMBINED_LOCAL_FILE_SERVICE_KEY ):
if file_service_key is None or HG.client_controller.services_manager.GetServiceType( file_service_key ) in HC.LOCAL_FILE_SERVICES:
default_reason = 'Deleted from Media Page.'
@ -3264,7 +3264,7 @@ class MediaPanelThumbnails( MediaPanel ):
selected_locations_managers = [ media.GetLocationsManager() for media in flat_selected_medias ]
selection_has_local = True in ( locations_manager.IsLocal() for locations_manager in selected_locations_managers )
selection_has_local_file_domain = True in ( CC.LOCAL_FILE_SERVICE_KEY in locations_manager.GetCurrent() for locations_manager in selected_locations_managers )
selection_has_local_file_domain = True in ( locations_manager.IsLocal() and not locations_manager.IsTrashed() for locations_manager in selected_locations_managers )
selection_has_trash = True in ( locations_manager.IsTrashed() for locations_manager in selected_locations_managers )
selection_has_inbox = True in ( media.HasInbox() for media in self._selected_media )
selection_has_archive = True in ( media.HasArchive() for media in self._selected_media )
@ -3704,7 +3704,7 @@ class MediaPanelThumbnails( MediaPanel ):
if HG.client_controller.DBCurrentlyDoingJob():
file_duplicate_info = None
file_duplicate_info = {}
else:
@ -3712,7 +3712,7 @@ class MediaPanelThumbnails( MediaPanel ):
if self._location_context.current_service_keys.isdisjoint( HG.client_controller.services_manager.GetServiceKeys( ( HC.LOCAL_FILE_DOMAIN, HC.LOCAL_FILE_TRASH_DOMAIN ) ) ):
all_local_files_file_duplicate_info = None
all_local_files_file_duplicate_info = {}
else:
@ -3726,7 +3726,7 @@ class MediaPanelThumbnails( MediaPanel ):
focus_has_potentials = False
focus_can_be_searched = focus_singleton.GetMime() in HC.FILES_THAT_HAVE_PERCEPTUAL_HASH
if file_duplicate_info is None:
if len( file_duplicate_info ) == 0:
ClientGUIMenus.AppendMenuLabel( duplicates_menu, 'could not fetch file\'s duplicates (db currently locked)' )
@ -3739,7 +3739,7 @@ class MediaPanelThumbnails( MediaPanel ):
view_duplicate_relations_jobs.append( ( self._location_context, file_duplicate_info ) )
if all_local_files_file_duplicate_info is not None and len( all_local_files_file_duplicate_info[ 'counts' ] ) > 0 and all_local_files_file_duplicate_info != file_duplicate_info:
if len( all_local_files_file_duplicate_info ) > 0 and len( all_local_files_file_duplicate_info[ 'counts' ] ) > 0 and all_local_files_file_duplicate_info != file_duplicate_info:
view_duplicate_relations_jobs.append( ( combined_local_location_context, all_local_files_file_duplicate_info ) )
@ -3807,7 +3807,7 @@ class MediaPanelThumbnails( MediaPanel ):
focus_is_definitely_king = file_duplicate_info is not None and file_duplicate_info[ 'is_king' ]
focus_is_definitely_king = len( file_duplicate_info ) > 0 and file_duplicate_info[ 'is_king' ]
dissolution_actions_available = focus_can_be_searched or focus_is_in_duplicate_group or focus_is_in_alternate_group or focus_has_fps
@ -3817,7 +3817,7 @@ class MediaPanelThumbnails( MediaPanel ):
duplicates_action_submenu = QW.QMenu( duplicates_menu )
if file_duplicate_info is None:
if len( file_duplicate_info ) == 0:
ClientGUIMenus.AppendMenuLabel( duplicates_action_submenu, 'could not fetch info to check for available file actions (db currently locked)' )
@ -4557,10 +4557,15 @@ def AddRemoveMenu( win: MediaPanel, menu, filter_counts, all_specific_file_domai
all_specific_file_domains.insert( 0, CC.TRASH_SERVICE_KEY )
if CC.LOCAL_FILE_SERVICE_KEY in all_specific_file_domains:
for service in HG.client_controller.services_manager.GetLocalMediaFileServices():
all_specific_file_domains.remove( CC.LOCAL_FILE_SERVICE_KEY )
all_specific_file_domains.insert( 0, CC.LOCAL_FILE_SERVICE_KEY )
service_key = service.GetServiceKey()
if service_key in all_specific_file_domains:
all_specific_file_domains.remove( service_key )
all_specific_file_domains.insert( 0, service_key )
for file_service_key in all_specific_file_domains:
@ -4642,10 +4647,15 @@ def AddSelectMenu( win: MediaPanel, menu, filter_counts, all_specific_file_domai
all_specific_file_domains.insert( 0, CC.TRASH_SERVICE_KEY )
if CC.LOCAL_FILE_SERVICE_KEY in all_specific_file_domains:
for service in HG.client_controller.services_manager.GetLocalMediaFileServices():
all_specific_file_domains.remove( CC.LOCAL_FILE_SERVICE_KEY )
all_specific_file_domains.insert( 0, CC.LOCAL_FILE_SERVICE_KEY )
service_key = service.GetServiceKey()
if service_key in all_specific_file_domains:
all_specific_file_domains.remove( service_key )
all_specific_file_domains.insert( 0, service_key )
for file_service_key in all_specific_file_domains:

@ -15,7 +15,6 @@ from hydrus.core import HydrusText
from hydrus.client import ClientApplicationCommand as CAC
from hydrus.client import ClientConstants as CC
from hydrus.client import ClientData
from hydrus.client import ClientLocation
from hydrus.client import ClientSearch
from hydrus.client import ClientThreading
@ -27,8 +26,8 @@ from hydrus.client.gui import ClientGUIShortcuts
from hydrus.client.gui import ClientGUITopLevelWindowsPanels
from hydrus.client.gui import QtPorting as QP
from hydrus.client.gui.lists import ClientGUIListBoxes
from hydrus.client.gui.lists import ClientGUIListBoxesData
from hydrus.client.gui.pages import ClientGUIResultsSortCollect
from hydrus.client.gui.search import ClientGUILocation
from hydrus.client.gui.search import ClientGUISearch
from hydrus.client.gui.widgets import ClientGUICommon
from hydrus.client.metadata import ClientTags
@ -39,37 +38,6 @@ def AppendLoadingPredicate( predicates ):
predicates.append( ClientSearch.Predicate( predicate_type = ClientSearch.PREDICATE_TYPE_LABEL, value = 'loading results\u2026' ) )
def GetPossibleFileDomainServicesInOrder( all_known_files_allowed: bool ):
services_manager = HG.client_controller.services_manager
service_types_in_order = [ HC.LOCAL_FILE_DOMAIN, HC.LOCAL_FILE_TRASH_DOMAIN ]
advanced_mode = HG.client_controller.new_options.GetBoolean( 'advanced_mode' )
if advanced_mode:
service_types_in_order.append( HC.COMBINED_LOCAL_FILE )
service_types_in_order.append( HC.FILE_REPOSITORY )
service_types_in_order.append( HC.IPFS )
if all_known_files_allowed:
service_types_in_order.append( HC.COMBINED_FILE )
services = services_manager.GetServices( service_types_in_order )
if not advanced_mode:
services = [ service for service in services if service.GetServiceKey() != CC.LOCAL_UPDATE_SERVICE_KEY ]
return services
def InsertOtherPredicatesForRead( predicates: list, parsed_autocomplete_text: ClientSearch.ParsedAutocompleteText, include_unusual_predicate_types: bool, under_construction_or_predicate: typing.Optional[ ClientSearch.Predicate ] ):
if include_unusual_predicate_types:
@ -529,120 +497,6 @@ def WriteFetch( win, job_key, results_callable, parsed_autocomplete_text: Client
HG.client_controller.CallAfterQtSafe( win, 'write a/c fetch', results_callable, job_key, parsed_autocomplete_text, results_cache, matches )
class EditLocationContextPanel( ClientGUIScrolledPanels.EditPanel ):
def __init__( self, parent: QW.QWidget, location_context: ClientLocation.LocationContext, all_known_files_allowed: bool ):
ClientGUIScrolledPanels.EditPanel.__init__( self, parent )
self._original_location_context = location_context
self._all_known_files_allowed = all_known_files_allowed
self._location_list = ClientGUICommon.BetterCheckBoxList( self )
services = GetPossibleFileDomainServicesInOrder( all_known_files_allowed )
for service in services:
name = service.GetName()
service_key = service.GetServiceKey()
starts_checked = service_key in self._original_location_context.current_service_keys
self._location_list.Append( name, ( HC.CONTENT_STATUS_CURRENT, service_key ), starts_checked = starts_checked )
advanced_mode = HG.client_controller.new_options.GetBoolean( 'advanced_mode' )
if advanced_mode:
for service in services:
name = service.GetName()
service_key = service.GetServiceKey()
if service_key in ( CC.COMBINED_FILE_SERVICE_KEY, CC.TRASH_SERVICE_KEY ):
continue
starts_checked = service_key in self._original_location_context.deleted_service_keys
self._location_list.Append( 'deleted from {}'.format( name ), ( HC.CONTENT_STATUS_DELETED, service_key ), starts_checked = starts_checked )
vbox = QP.VBoxLayout()
QP.AddToLayout( vbox, self._location_list, CC.FLAGS_EXPAND_SIZER_BOTH_WAYS )
self.widget().setLayout( vbox )
self._location_list.checkBoxListChanged.connect( self._ClearSurplusServices )
def _ClearSurplusServices( self ):
# if user clicks all known files, then all other services will be wiped
# all local files should do other file services too
location_context = self._GetValue()
filter_func = lambda service_key: HG.client_controller.services_manager.GetServiceType( service_key ) not in ( HC.LOCAL_FILE_DOMAIN, HC.LOCAL_FILE_TRASH_DOMAIN )
location_context.ClearAllLocalFilesServices( filter_func )
if set( self._GetStatusesAndServiceKeys( location_context ) ) != set( self._location_list.GetValue() ):
self._SetValue( location_context )
def _GetStatusesAndServiceKeys( self, location_context: ClientLocation.LocationContext ):
statuses_and_service_keys = [ ( HC.CONTENT_STATUS_CURRENT, service_key ) for service_key in location_context.current_service_keys ]
statuses_and_service_keys.extend( [ ( HC.CONTENT_STATUS_DELETED, service_key ) for service_key in location_context.deleted_service_keys ] )
return statuses_and_service_keys
def _GetValue( self ):
statuses_and_service_keys = self._location_list.GetValue()
current_service_keys = { service_key for ( status, service_key ) in statuses_and_service_keys if status == HC.CONTENT_STATUS_CURRENT }
deleted_service_keys = { service_key for ( status, service_key ) in statuses_and_service_keys if status == HC.CONTENT_STATUS_DELETED }
location_context = ClientLocation.LocationContext( current_service_keys = current_service_keys, deleted_service_keys = deleted_service_keys )
return location_context
def _SetValue( self, location_context: ClientLocation.LocationContext ):
self._location_list.blockSignals( True )
statuses_and_service_keys = self._GetStatusesAndServiceKeys( location_context )
self._location_list.SetValue( statuses_and_service_keys )
self._location_list.blockSignals( False )
def GetValue( self ) -> ClientLocation.LocationContext:
location_context = self._GetValue()
return location_context
def SetValue( self, location_context: ClientLocation.LocationContext ):
self._SetValue( location_context )
self._location_list.checkBoxListChanged.emit()
class ListBoxTagsPredicatesAC( ClientGUIListBoxes.ListBoxTagsPredicates ):
def __init__( self, parent, callable, service_key, float_mode, **kwargs ):
@ -1527,17 +1381,14 @@ class AutoCompleteDropdownTags( AutoCompleteDropdown ):
tag_service_key = CC.COMBINED_TAG_SERVICE_KEY
self._location_context = location_context
self._tag_service_key = tag_service_key
AutoCompleteDropdown.__init__( self, parent )
self._allow_all_known_files = True
tag_service = HG.client_controller.services_manager.GetService( self._tag_service_key )
self._file_repo_button = ClientGUICommon.BetterButton( self._dropdown_window, location_context.ToString( HG.client_controller.services_manager.GetName ), self.FileButtonHit )
self._file_repo_button.setMinimumWidth( 20 )
self._location_context_button = ClientGUILocation.LocationSearchContextButton( self._dropdown_window, location_context )
self._location_context_button.setMinimumWidth( 20 )
self._tag_repo_button = ClientGUICommon.BetterButton( self._dropdown_window, tag_service.GetName(), self.TagButtonHit )
self._tag_repo_button.setMinimumWidth( 20 )
@ -1550,47 +1401,24 @@ class AutoCompleteDropdownTags( AutoCompleteDropdown ):
#
self._location_context_button.locationChanged.connect( self._LocationContextJustChanged )
HG.client_controller.sub( self, 'RefreshFavouriteTags', 'notify_new_favourite_tags' )
HG.client_controller.sub( self, 'NotifyNewServices', 'notify_new_services' )
def _IsAllKnownFilesServiceTypeAllowed( self ):
raise NotImplementedError()
def _ChangeLocationContext( self, location_context: ClientLocation.LocationContext ):
location_context.FixMissingServices( HG.client_controller.services_manager.FilterValidServiceKeys )
if location_context.IsAllKnownFiles() and self._tag_service_key == CC.COMBINED_TAG_SERVICE_KEY:
local_tag_services = HG.client_controller.services_manager.GetServices( ( HC.LOCAL_TAG, ) )
self._ChangeTagService( local_tag_services[0].GetServiceKey() )
self._location_context = location_context
self._UpdateFileServiceLabel()
self.locationChanged.emit( self._location_context )
self._SetListDirty()
def _ChangeTagService( self, tag_service_key ):
def _SetTagService( self, tag_service_key ):
if not HG.client_controller.services_manager.ServiceExists( tag_service_key ):
tag_service_key = CC.COMBINED_TAG_SERVICE_KEY
if tag_service_key == CC.COMBINED_TAG_SERVICE_KEY and self._location_context.IsAllKnownFiles():
if tag_service_key == CC.COMBINED_TAG_SERVICE_KEY and self._location_context_button.GetValue().IsAllKnownFiles():
default_location_context = HG.client_controller.services_manager.GetDefaultLocationContext()
default_location_context = HG.client_controller.new_options.GetDefaultLocalLocationContext()
self._ChangeLocationContext( default_location_context )
self._SetLocationContext( default_location_context )
self._tag_service_key = tag_service_key
@ -1605,23 +1433,6 @@ class AutoCompleteDropdownTags( AutoCompleteDropdown ):
self._SetListDirty()
def _EditMultipleLocationContext( self ):
with ClientGUITopLevelWindowsPanels.DialogEdit( self, 'edit multiple location' ) as dlg:
panel = EditLocationContextPanel( dlg, self._location_context, self._IsAllKnownFilesServiceTypeAllowed() )
dlg.SetPanel( panel )
if dlg.exec() == QW.QDialog.Accepted:
location_context = panel.GetValue()
self._ChangeLocationContext( location_context )
def _GetCurrentBroadcastTextPredicate( self ) -> typing.Optional[ ClientSearch.Predicate ]:
raise NotImplementedError()
@ -1643,6 +1454,31 @@ class AutoCompleteDropdownTags( AutoCompleteDropdown ):
raise NotImplementedError()
def _LocationContextJustChanged( self, location_context: ClientLocation.LocationContext ):
self._RestoreTextCtrlFocus()
self.locationChanged.emit( location_context )
self._SetListDirty()
def _SetLocationContext( self, location_context: ClientLocation.LocationContext ):
location_context.FixMissingServices( HG.client_controller.services_manager.FilterValidServiceKeys )
if location_context.IsAllKnownFiles() and self._tag_service_key == CC.COMBINED_TAG_SERVICE_KEY:
local_tag_services = HG.client_controller.services_manager.GetServices( ( HC.LOCAL_TAG, ) )
self._SetTagService( local_tag_services[0].GetServiceKey() )
self._location_context_button.SetValue( location_context )
self._SetListDirty()
def _SetResultsToList( self, results, parsed_autocomplete_text: ClientSearch.ParsedAutocompleteText ):
self._search_results_list.SetPredicates( results )
@ -1650,15 +1486,6 @@ class AutoCompleteDropdownTags( AutoCompleteDropdown ):
self._current_list_parsed_autocomplete_text = parsed_autocomplete_text
def _UpdateFileServiceLabel( self ):
name = self._location_context.ToString( HG.client_controller.services_manager.GetName )
self._file_repo_button.setText( name )
self._SetListDirty()
def _UpdateTagServiceLabel( self ):
tag_service = HG.client_controller.services_manager.GetService( self._tag_service_key )
@ -1668,51 +1495,10 @@ class AutoCompleteDropdownTags( AutoCompleteDropdown ):
self._tag_repo_button.setText( name )
def FileButtonHit( self ):
services = GetPossibleFileDomainServicesInOrder( self._IsAllKnownFilesServiceTypeAllowed() )
advanced_mode = HG.client_controller.new_options.GetBoolean( 'advanced_mode' )
menu = QW.QMenu()
for service in services:
location_context = ClientLocation.LocationContext.STATICCreateSimple( service.GetServiceKey() )
ClientGUIMenus.AppendMenuItem( menu, service.GetName(), 'Change the current file domain to ' + service.GetName() + '.', self._ChangeLocationContext, location_context )
if advanced_mode and False:
ClientGUIMenus.AppendSeparator( menu )
for service in services:
if service.GetServiceKey() in ( CC.COMBINED_FILE_SERVICE_KEY, CC.TRASH_SERVICE_KEY ):
continue
location_context = ClientLocation.LocationContext( [], [ service.GetServiceKey() ] )
ClientGUIMenus.AppendMenuItem( menu, 'deleted from {}'.format( service.GetName() ), 'Change the current file domain to files deleted from ' + service.GetName() + '.', self._ChangeLocationContext, location_context )
ClientGUIMenus.AppendSeparator( menu )
ClientGUIMenus.AppendMenuItem( menu, 'multiple locations', 'Change the current file domain to something with multiple locations.', self._EditMultipleLocationContext )
CGC.core().PopupMenu( self._file_repo_button, menu )
self._RestoreTextCtrlFocus()
def NotifyNewServices( self ):
self._ChangeLocationContext( self._location_context )
self._ChangeTagService( self._tag_service_key )
self._SetLocationContext( self._location_context_button.GetValue() )
self._SetTagService( self._tag_service_key )
def RefreshFavouriteTags( self ):
@ -1724,9 +1510,9 @@ class AutoCompleteDropdownTags( AutoCompleteDropdown ):
self._favourites_list.SetPredicates( predicates )
def ChangeLocationContext( self, location_context: ClientLocation.LocationContext ):
def SetLocationContext( self, location_context: ClientLocation.LocationContext ):
self._ChangeLocationContext( location_context )
self._SetLocationContext( location_context )
def SetStubPredicates( self, job_key, stub_predicates, parsed_autocomplete_text ):
@ -1739,7 +1525,7 @@ class AutoCompleteDropdownTags( AutoCompleteDropdown ):
def SetTagServiceKey( self, tag_service_key ):
self._ChangeTagService( tag_service_key )
self._SetTagService( tag_service_key )
def TagButtonHit( self ):
@ -1754,7 +1540,7 @@ class AutoCompleteDropdownTags( AutoCompleteDropdown ):
for service in services:
ClientGUIMenus.AppendMenuItem( menu, service.GetName(), 'Change the current tag domain to ' + service.GetName() + '.', self._ChangeTagService, service.GetServiceKey() )
ClientGUIMenus.AppendMenuItem( menu, service.GetName(), 'Change the current tag domain to ' + service.GetName() + '.', self._SetTagService, service.GetServiceKey() )
CGC.core().PopupMenu( self._tag_repo_button, menu )
@ -1786,8 +1572,6 @@ class AutoCompleteDropdownTagsRead( AutoCompleteDropdownTags ):
self._media_sort_widget = media_sort_widget
self._media_collect_widget = media_collect_widget
self._allow_all_known_files = allow_all_known_files
self._media_callable = media_callable
self._file_search_context = file_search_context
@ -1796,6 +1580,8 @@ class AutoCompleteDropdownTagsRead( AutoCompleteDropdownTags ):
self._predicates_listbox.SetPredicates( self._file_search_context.GetPredicates() )
self._location_context_button.SetAllKnownFilesAllowed( allow_all_known_files, True )
#
self._favourite_searches_button = ClientGUICommon.BetterBitmapButton( self._text_input_panel, CC.global_pixmaps().star, self._FavouriteSearchesMenu )
@ -1848,7 +1634,7 @@ class AutoCompleteDropdownTagsRead( AutoCompleteDropdownTags ):
button_hbox_2 = QP.HBoxLayout()
QP.AddToLayout( button_hbox_2, self._file_repo_button, CC.FLAGS_EXPAND_BOTH_WAYS )
QP.AddToLayout( button_hbox_2, self._location_context_button, CC.FLAGS_EXPAND_BOTH_WAYS )
QP.AddToLayout( button_hbox_2, self._tag_repo_button, CC.FLAGS_EXPAND_BOTH_WAYS )
vbox = QP.VBoxLayout()
@ -1867,13 +1653,6 @@ class AutoCompleteDropdownTagsRead( AutoCompleteDropdownTags ):
self._search_pause_play.valueChanged.connect( self.SetSynchronised )
def _IsAllKnownFilesServiceTypeAllowed( self ):
advanced_mode = HG.client_controller.new_options.GetBoolean( 'advanced_mode' )
return advanced_mode and self._allow_all_known_files
def _AdvancedORInput( self ):
title = 'enter advanced OR predicates'
@ -1971,24 +1750,6 @@ class AutoCompleteDropdownTagsRead( AutoCompleteDropdownTags ):
self._ClearInput()
def _ChangeLocationContext( self, location_context: ClientLocation.LocationContext ):
AutoCompleteDropdownTags._ChangeLocationContext( self, location_context )
self._file_search_context.SetLocationContext( location_context )
self._SignalNewSearchState()
def _ChangeTagService( self, tag_service_key ):
AutoCompleteDropdownTags._ChangeTagService( self, tag_service_key )
self._file_search_context.SetTagServiceKey( tag_service_key )
self._SignalNewSearchState()
def _FavouriteSearchesMenu( self ):
menu = QW.QMenu()
@ -2092,6 +1853,15 @@ class AutoCompleteDropdownTagsRead( AutoCompleteDropdownTags ):
return ListBoxTagsPredicatesAC( self._dropdown_notebook, self.BroadcastChoices, self._tag_service_key, self._float_mode, tag_display_type = ClientTags.TAG_DISPLAY_ACTUAL, height_num_chars = height_num_chars )
def _LocationContextJustChanged( self, location_context: ClientLocation.LocationContext ):
AutoCompleteDropdownTags._LocationContextJustChanged( self, location_context )
self._file_search_context.SetLocationContext( location_context )
self._SignalNewSearchState()
def _LoadFavouriteSearch( self, folder_name, name ):
( file_search_context, synchronised, media_sort, media_collect ) = HG.client_controller.favourite_search_manager.GetFavouriteSearch( folder_name, name )
@ -2114,7 +1884,7 @@ class AutoCompleteDropdownTagsRead( AutoCompleteDropdownTags ):
self.blockSignals( False )
self.locationChanged.emit( self._location_context )
self.locationChanged.emit( self._location_context_button.GetValue() )
self.tagServiceChanged.emit( self._tag_service_key )
self._SignalNewSearchState()
@ -2195,6 +1965,15 @@ class AutoCompleteDropdownTagsRead( AutoCompleteDropdownTags ):
self._ManageFavouriteSearches( favourite_search_row_to_save = search_row )
def _SetTagService( self, tag_service_key ):
AutoCompleteDropdownTags._SetTagService( self, tag_service_key )
self._file_search_context.SetTagServiceKey( tag_service_key )
self._SignalNewSearchState()
def _SetupTopListBox( self ):
self._predicates_listbox = ListBoxTagsActiveSearchPredicates( self, self._page_key )
@ -2373,8 +2152,8 @@ class AutoCompleteDropdownTagsRead( AutoCompleteDropdownTags ):
self._predicates_listbox.SetPredicates( self._file_search_context.GetPredicates() )
self._ChangeLocationContext( self._file_search_context.GetLocationContext() )
self._ChangeTagService( self._file_search_context.GetTagSearchContext().service_key )
self._SetLocationContext( self._file_search_context.GetLocationContext() )
self._SetTagService( self._file_search_context.GetTagSearchContext().service_key )
self._SignalNewSearchState()
@ -2691,10 +2470,12 @@ class AutoCompleteDropdownTagsWrite( AutoCompleteDropdownTags ):
tag_autocomplete_options = HG.client_controller.tag_display_manager.GetTagAutocompleteOptions( tag_service_key )
( location_context, tag_service_key ) = tag_autocomplete_options.GetWriteAutocompleteDomain( location_context )
( location_context, tag_service_key ) = tag_autocomplete_options.GetWriteAutocompleteSearchDomain( location_context )
AutoCompleteDropdownTags.__init__( self, parent, location_context, tag_service_key )
self._location_context_button.SetAllKnownFilesAllowed( True, False )
self._paste_button = ClientGUICommon.BetterBitmapButton( self._text_input_panel, CC.global_pixmaps().paste, self._Paste )
self._paste_button.setToolTip( 'Paste from the clipboard and quick-enter as if you had typed. This can take multiple newline-separated tags.' )
@ -2709,7 +2490,7 @@ class AutoCompleteDropdownTagsWrite( AutoCompleteDropdownTags ):
hbox = QP.HBoxLayout()
QP.AddToLayout( hbox, self._file_repo_button, CC.FLAGS_EXPAND_BOTH_WAYS )
QP.AddToLayout( hbox, self._location_context_button, CC.FLAGS_EXPAND_BOTH_WAYS )
QP.AddToLayout( hbox, self._tag_repo_button, CC.FLAGS_EXPAND_BOTH_WAYS )
QP.AddToLayout( vbox, hbox, CC.FLAGS_EXPAND_SIZER_PERPENDICULAR )
@ -2718,11 +2499,6 @@ class AutoCompleteDropdownTagsWrite( AutoCompleteDropdownTags ):
self._dropdown_window.setLayout( vbox )
def _IsAllKnownFilesServiceTypeAllowed( self ):
return self._allow_all_known_files
def _BroadcastChoices( self, predicates, shift_down ):
tags = { predicate.GetValue() for predicate in predicates }
@ -2735,9 +2511,9 @@ class AutoCompleteDropdownTagsWrite( AutoCompleteDropdownTags ):
self._ClearInput()
def _ChangeTagService( self, tag_service_key ):
def _SetTagService( self, tag_service_key ):
AutoCompleteDropdownTags._ChangeTagService( self, tag_service_key )
AutoCompleteDropdownTags._SetTagService( self, tag_service_key )
if self._tag_service_key_changed_callable is not None:
@ -2862,7 +2638,7 @@ class AutoCompleteDropdownTagsWrite( AutoCompleteDropdownTags ):
tag_search_context = ClientSearch.TagSearchContext( service_key = self._tag_service_key, display_service_key = self._display_tag_service_key )
file_search_context = ClientSearch.FileSearchContext( location_context = self._location_context, tag_search_context = tag_search_context )
file_search_context = ClientSearch.FileSearchContext( location_context = self._location_context_button.GetValue(), tag_search_context = tag_search_context )
HG.client_controller.CallToThread( WriteFetch, self, job_key, self.SetFetchedResults, parsed_autocomplete_text, file_search_context, self._results_cache )

View File

@ -0,0 +1,259 @@
from qtpy import QtCore as QC
from qtpy import QtWidgets as QW
from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusGlobals as HG
from hydrus.client import ClientConstants as CC
from hydrus.client import ClientLocation
from hydrus.client.gui import ClientGUICore as CGC
from hydrus.client.gui import ClientGUIMenus
from hydrus.client.gui import ClientGUIScrolledPanels
from hydrus.client.gui import ClientGUITopLevelWindowsPanels
from hydrus.client.gui import QtPorting as QP
from hydrus.client.gui.widgets import ClientGUICommon
def GetPossibleFileDomainServicesInOrder( all_known_files_allowed: bool, only_local_file_domains_allowed: bool ):
services_manager = HG.client_controller.services_manager
if only_local_file_domains_allowed:
service_types_in_order = [ HC.LOCAL_FILE_DOMAIN ]
else:
service_types_in_order = [ HC.LOCAL_FILE_DOMAIN, HC.LOCAL_FILE_TRASH_DOMAIN ]
advanced_mode = HG.client_controller.new_options.GetBoolean( 'advanced_mode' )
if advanced_mode:
service_types_in_order.append( HC.COMBINED_LOCAL_FILE )
service_types_in_order.append( HC.FILE_REPOSITORY )
service_types_in_order.append( HC.IPFS )
if all_known_files_allowed:
service_types_in_order.append( HC.COMBINED_FILE )
services = services_manager.GetServices( service_types_in_order )
if only_local_file_domains_allowed or not advanced_mode:
services = [ service for service in services if service.GetServiceKey() != CC.LOCAL_UPDATE_SERVICE_KEY ]
return services
class EditMultipleLocationContextPanel( ClientGUIScrolledPanels.EditPanel ):
def __init__( self, parent: QW.QWidget, location_context: ClientLocation.LocationContext, all_known_files_allowed: bool, only_local_file_domains_allowed: bool ):
ClientGUIScrolledPanels.EditPanel.__init__( self, parent )
self._original_location_context = location_context
self._all_known_files_allowed = all_known_files_allowed
self._only_local_file_domains_allowed = only_local_file_domains_allowed
self._location_list = ClientGUICommon.BetterCheckBoxList( self )
services = GetPossibleFileDomainServicesInOrder( all_known_files_allowed, only_local_file_domains_allowed )
for service in services:
name = service.GetName()
service_key = service.GetServiceKey()
starts_checked = service_key in self._original_location_context.current_service_keys
self._location_list.Append( name, ( HC.CONTENT_STATUS_CURRENT, service_key ), starts_checked = starts_checked )
advanced_mode = HG.client_controller.new_options.GetBoolean( 'advanced_mode' )
if advanced_mode and not only_local_file_domains_allowed:
for service in services:
name = service.GetName()
service_key = service.GetServiceKey()
if service_key in ( CC.COMBINED_FILE_SERVICE_KEY, CC.TRASH_SERVICE_KEY ):
continue
starts_checked = service_key in self._original_location_context.deleted_service_keys
self._location_list.Append( 'deleted from {}'.format( name ), ( HC.CONTENT_STATUS_DELETED, service_key ), starts_checked = starts_checked )
vbox = QP.VBoxLayout()
QP.AddToLayout( vbox, self._location_list, CC.FLAGS_EXPAND_SIZER_BOTH_WAYS )
self.widget().setLayout( vbox )
self._location_list.checkBoxListChanged.connect( self._ClearSurplusServices )
def _ClearSurplusServices( self ):
# if the user checks 'all known files', every other service is surplus and gets wiped
# 'all local files' should likewise cover its component local file services
location_context = self._GetValue()
location_context.ClearSurplusLocalFilesServices( HG.client_controller.services_manager.GetServiceType )
if set( location_context.GetStatusesAndServiceKeysList() ) != set( self._location_list.GetValue() ):
self._SetValue( location_context )
def _GetValue( self ):
statuses_and_service_keys = self._location_list.GetValue()
current_service_keys = { service_key for ( status, service_key ) in statuses_and_service_keys if status == HC.CONTENT_STATUS_CURRENT }
deleted_service_keys = { service_key for ( status, service_key ) in statuses_and_service_keys if status == HC.CONTENT_STATUS_DELETED }
location_context = ClientLocation.LocationContext( current_service_keys = current_service_keys, deleted_service_keys = deleted_service_keys )
return location_context
def _SetValue( self, location_context: ClientLocation.LocationContext ):
self._location_list.blockSignals( True )
statuses_and_service_keys = location_context.GetStatusesAndServiceKeysList()
self._location_list.SetValue( statuses_and_service_keys )
self._location_list.blockSignals( False )
def GetValue( self ) -> ClientLocation.LocationContext:
location_context = self._GetValue()
return location_context
def SetValue( self, location_context: ClientLocation.LocationContext ):
self._SetValue( location_context )
self._location_list.checkBoxListChanged.emit()
class LocationSearchContextButton( ClientGUICommon.BetterButton ):
locationChanged = QC.Signal( ClientLocation.LocationContext )
def __init__( self, parent: QW.QWidget, location_context: ClientLocation.LocationContext ):
self._location_context = location_context
ClientGUICommon.BetterButton.__init__( self, parent, 'initialising', self._EditLocation )
self._all_known_files_allowed = True
self._all_known_files_allowed_only_in_advanced_mode = False
self._only_importable_domains_allowed = False
self.SetValue( location_context )
def _EditLocation( self ):
services = GetPossibleFileDomainServicesInOrder( self._IsAllKnownFilesServiceTypeAllowed(), self._only_importable_domains_allowed )
menu = QW.QMenu()
for service in services:
location_context = ClientLocation.LocationContext.STATICCreateSimple( service.GetServiceKey() )
ClientGUIMenus.AppendMenuItem( menu, service.GetName(), 'Change the current file domain to {}.'.format( service.GetName() ), self.SetValue, location_context )
ClientGUIMenus.AppendSeparator( menu )
ClientGUIMenus.AppendMenuItem( menu, 'multiple locations', 'Change the current file domain to something with multiple locations.', self._EditMultipleLocationContext )
CGC.core().PopupMenu( self, menu )
def _EditMultipleLocationContext( self ):
with ClientGUITopLevelWindowsPanels.DialogEdit( self, 'edit multiple location' ) as dlg:
panel = EditMultipleLocationContextPanel( dlg, self._location_context, self._IsAllKnownFilesServiceTypeAllowed(), self._only_importable_domains_allowed )
dlg.SetPanel( panel )
if dlg.exec() == QW.QDialog.Accepted:
location_context = panel.GetValue()
self.SetValue( location_context )
def _IsAllKnownFilesServiceTypeAllowed( self ) -> bool:
if self._all_known_files_allowed:
if self._all_known_files_allowed_only_in_advanced_mode and not HG.client_controller.new_options.GetBoolean( 'advanced_mode' ):
return False
else:
return True
else:
return False
def GetValue( self ) -> ClientLocation.LocationContext:
return self._location_context
def SetOnlyImportableDomainsAllowed( self, only_importable_domains_allowed: bool ):
self._only_importable_domains_allowed = only_importable_domains_allowed
def SetAllKnownFilesAllowed( self, all_known_files_allowed: bool, all_known_files_allowed_only_in_advanced_mode: bool ):
self._all_known_files_allowed = all_known_files_allowed
self._all_known_files_allowed_only_in_advanced_mode = all_known_files_allowed_only_in_advanced_mode
def SetValue( self, location_context: ClientLocation.LocationContext ):
location_context = location_context.Duplicate()
location_context.FixMissingServices( HG.client_controller.services_manager.FilterValidServiceKeys )
self._location_context = location_context
self.setText( self._location_context.ToString( HG.client_controller.services_manager.GetName ) )
self.locationChanged.emit( self._location_context )
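The edit panel above stores each checked row as a ( status, service_key ) pair and splits them into 'current' and 'deleted' sets when building a LocationContext. A minimal standalone sketch of that partition, with illustrative constants standing in for HC.CONTENT_STATUS_CURRENT/_DELETED:

```python
# illustrative stand-ins for HC.CONTENT_STATUS_CURRENT / HC.CONTENT_STATUS_DELETED
CONTENT_STATUS_CURRENT = 0
CONTENT_STATUS_DELETED = 1

def partition_statuses( statuses_and_service_keys ):
    
    # mirrors EditMultipleLocationContextPanel._GetValue: one set comprehension per bucket
    current_service_keys = { service_key for ( status, service_key ) in statuses_and_service_keys if status == CONTENT_STATUS_CURRENT }
    deleted_service_keys = { service_key for ( status, service_key ) in statuses_and_service_keys if status == CONTENT_STATUS_DELETED }
    
    return ( current_service_keys, deleted_service_keys )
```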

View File

@ -4,7 +4,6 @@ from qtpy import QtCore as QC
from qtpy import QtWidgets as QW
from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusExceptions
from hydrus.core import HydrusGlobals as HG
from hydrus.client import ClientConstants as CC

View File

@ -4,6 +4,7 @@ from qtpy import QtWidgets as QW
from hydrus.core import HydrusData
from hydrus.core import HydrusExceptions
from hydrus.core import HydrusGlobals as HG
from hydrus.client import ClientConstants as CC
from hydrus.client import ClientLocation
@ -32,7 +33,7 @@ class ORPredicateControl( QW.QWidget ):
page_key = HydrusData.GenerateKey()
location_context = ClientLocation.LocationContext.STATICCreateSimple( CC.LOCAL_FILE_SERVICE_KEY )
location_context = HG.client_controller.new_options.GetDefaultLocalLocationContext()
file_search_context = ClientSearch.FileSearchContext( location_context = location_context, predicates = predicates )

View File

@ -220,7 +220,7 @@ class EditFavouriteSearchesPanel( ClientGUIScrolledPanels.EditPanel ):
foldername = None
name = 'new favourite search'
default_location_context = HG.client_controller.services_manager.GetDefaultLocationContext()
default_location_context = HG.client_controller.new_options.GetDefaultLocalLocationContext()
file_search_context = ClientSearch.FileSearchContext( location_context = default_location_context )

View File

@ -3953,7 +3953,7 @@ class ReviewServicesPanel( ClientGUIScrolledPanels.ReviewPanel ):
if lb.count() == 0:
previous_service_key = CC.LOCAL_FILE_SERVICE_KEY
previous_service_key = None
else:

View File

@ -7,15 +7,13 @@ from hydrus.core import HydrusSerialisable
from hydrus.client import ClientConstants as CC
from hydrus.client import ClientLocation
from hydrus.client import ClientSearch
from hydrus.client.importing.options import ClientImportOptions
from hydrus.client.importing.options import PresentationImportOptions
class FileImportOptions( HydrusSerialisable.SerialisableBase ):
SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_FILE_IMPORT_OPTIONS
SERIALISABLE_NAME = 'File Import Options'
SERIALISABLE_VERSION = 6
SERIALISABLE_VERSION = 7
def __init__( self ):
@ -34,11 +32,14 @@ class FileImportOptions( HydrusSerialisable.SerialisableBase ):
self._associate_primary_urls = True
self._associate_source_urls = True
self._presentation_import_options = PresentationImportOptions.PresentationImportOptions()
self._import_destination_location_context = ClientLocation.LocationContext.STATICCreateSimple( CC.LOCAL_FILE_SERVICE_KEY )
def _GetSerialisableInfo( self ):
pre_import_options = ( self._exclude_deleted, self._do_not_check_known_urls_before_importing, self._do_not_check_hashes_before_importing, self._allow_decompression_bombs, self._min_size, self._max_size, self._max_gif_size, self._min_resolution, self._max_resolution )
serialisable_import_destination_location_context = self._import_destination_location_context.GetSerialisableTuple()
pre_import_options = ( self._exclude_deleted, self._do_not_check_known_urls_before_importing, self._do_not_check_hashes_before_importing, self._allow_decompression_bombs, self._min_size, self._max_size, self._max_gif_size, self._min_resolution, self._max_resolution, serialisable_import_destination_location_context )
post_import_options = ( self._automatic_archive, self._associate_primary_urls, self._associate_source_urls )
serialisable_presentation_import_options = self._presentation_import_options.GetSerialisableTuple()
@ -49,10 +50,12 @@ class FileImportOptions( HydrusSerialisable.SerialisableBase ):
( pre_import_options, post_import_options, serialisable_presentation_import_options ) = serialisable_info
( self._exclude_deleted, self._do_not_check_known_urls_before_importing, self._do_not_check_hashes_before_importing, self._allow_decompression_bombs, self._min_size, self._max_size, self._max_gif_size, self._min_resolution, self._max_resolution ) = pre_import_options
( self._exclude_deleted, self._do_not_check_known_urls_before_importing, self._do_not_check_hashes_before_importing, self._allow_decompression_bombs, self._min_size, self._max_size, self._max_gif_size, self._min_resolution, self._max_resolution, serialisable_import_destination_location_context ) = pre_import_options
( self._automatic_archive, self._associate_primary_urls, self._associate_source_urls ) = post_import_options
self._presentation_import_options = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_presentation_import_options )
self._import_destination_location_context = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_import_destination_location_context )
def _UpdateSerialisableInfo( self, version, old_serialisable_info ):
@ -160,6 +163,23 @@ class FileImportOptions( HydrusSerialisable.SerialisableBase ):
return ( 6, new_serialisable_info )
if version == 6:
( pre_import_options, post_import_options, serialisable_presentation_import_options ) = old_serialisable_info
( exclude_deleted, do_not_check_known_urls_before_importing, do_not_check_hashes_before_importing, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution ) = pre_import_options
import_destination_location_context = ClientLocation.LocationContext.STATICCreateSimple( CC.LOCAL_FILE_SERVICE_KEY )
serialisable_import_destination_location_context = import_destination_location_context.GetSerialisableTuple()
pre_import_options = ( exclude_deleted, do_not_check_known_urls_before_importing, do_not_check_hashes_before_importing, allow_decompression_bombs, min_size, max_size, max_gif_size, min_resolution, max_resolution, serialisable_import_destination_location_context )
new_serialisable_info = ( pre_import_options, post_import_options, serialisable_presentation_import_options )
return ( 7, new_serialisable_info )
def AllowsDecompressionBombs( self ):
@ -255,9 +275,7 @@ class FileImportOptions( HydrusSerialisable.SerialisableBase ):
def GetDestinationLocationContext( self ) -> ClientLocation.LocationContext:
# for now this is static, but obviously we'll have a control to handle this (with 'needs at least one service m8' handling/errors) and then import code to make it happen
return ClientLocation.LocationContext.STATICCreateSimple( CC.LOCAL_FILE_SERVICE_KEY )
return self._import_destination_location_context
def GetPresentationImportOptions( self ):
@ -333,6 +351,11 @@ class FileImportOptions( HydrusSerialisable.SerialisableBase ):
return summary
def SetDestinationLocationContext( self, location_context: ClientLocation.LocationContext ):
self._import_destination_location_context = location_context.Duplicate()
def SetPostImportOptions( self, automatic_archive: bool, associate_primary_urls: bool, associate_source_urls: bool ):
self._automatic_archive = automatic_archive
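The version 6→7 bump above follows hydrus's usual serialisable migration pattern: a stored object carries its version number, and on load the updates are applied one step at a time until SERIALISABLE_VERSION is reached. A simplified sketch of that loop — the tuple contents here are placeholders, not the real FileImportOptions fields:

```python
SERIALISABLE_VERSION = 7

def _UpdateSerialisableInfo( version, old_serialisable_info ):
    
    if version == 6:
        
        ( pre_import_options, post_import_options, serialisable_presentation_import_options ) = old_serialisable_info
        
        # v7 appends a serialised import destination to the pre-import tuple
        pre_import_options = pre_import_options + ( 'serialisable_import_destination', )
        
        new_serialisable_info = ( pre_import_options, post_import_options, serialisable_presentation_import_options )
        
        return ( 7, new_serialisable_info )
        
    
    raise ValueError( 'unknown version' )

def LoadFromSerialisableInfo( version, serialisable_info ):
    
    # walk the upgrade chain one version at a time
    while version < SERIALISABLE_VERSION:
        
        ( version, serialisable_info ) = _UpdateSerialisableInfo( version, serialisable_info )
        
    
    return serialisable_info
```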

View File

@ -3,6 +3,7 @@ from hydrus.core import HydrusGlobals as HG
from hydrus.core import HydrusSerialisable
from hydrus.client import ClientConstants as CC
from hydrus.client import ClientLocation
PRESENTATION_LOCATION_IN_LOCAL_FILES = 0
PRESENTATION_LOCATION_IN_TRASH_TOO = 1
@ -302,14 +303,16 @@ class PresentationImportOptions( HydrusSerialisable.SerialisableBase ):
if len( presented_hashes ) > 0 and should_check_location:
file_service_key = CC.LOCAL_FILE_SERVICE_KEY
if self._presentation_location == PRESENTATION_LOCATION_IN_TRASH_TOO:
file_service_key = CC.COMBINED_LOCAL_FILE_SERVICE_KEY
location_context = ClientLocation.LocationContext.STATICCreateSimple( CC.COMBINED_LOCAL_FILE_SERVICE_KEY )
else:
location_context = HG.client_controller.services_manager.GetLocalMediaLocationContextUmbrella()
presented_hashes = HG.client_controller.Read( 'filter_hashes', file_service_key, presented_hashes )
presented_hashes = HG.client_controller.Read( 'filter_hashes', location_context, presented_hashes )
return presented_hashes

View File

@ -13,13 +13,12 @@ from hydrus.core import HydrusTags
from hydrus.client import ClientConstants as CC
from hydrus.client import ClientLocation
from hydrus.client.metadata import ClientTags
class TagAutocompleteOptions( HydrusSerialisable.SerialisableBase ):
SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_TAG_AUTOCOMPLETE_OPTIONS
SERIALISABLE_NAME = 'Tag Autocomplete Options'
SERIALISABLE_VERSION = 3
SERIALISABLE_VERSION = 4
def __init__( self, service_key: typing.Optional[ bytes ] = None ):
@ -34,15 +33,15 @@ class TagAutocompleteOptions( HydrusSerialisable.SerialisableBase ):
self._write_autocomplete_tag_domain = self._service_key
self._override_write_autocomplete_file_domain = True
self._override_write_autocomplete_location_context = True
if service_key == CC.DEFAULT_LOCAL_TAG_SERVICE_KEY:
self._write_autocomplete_file_domain = CC.LOCAL_FILE_SERVICE_KEY
self._write_autocomplete_location_context = HG.client_controller.services_manager.GetLocalMediaLocationContextUmbrella()
else:
self._write_autocomplete_file_domain = CC.COMBINED_FILE_SERVICE_KEY
self._write_autocomplete_location_context = ClientLocation.LocationContext.STATICCreateSimple( CC.COMBINED_FILE_SERVICE_KEY )
self._search_namespaces_into_full_tags = False
@ -58,13 +57,13 @@ class TagAutocompleteOptions( HydrusSerialisable.SerialisableBase ):
serialisable_service_key = self._service_key.hex()
serialisable_write_autocomplete_tag_domain = self._write_autocomplete_tag_domain.hex()
serialisable_write_autocomplete_file_domain = self._write_autocomplete_file_domain.hex()
serialisable_write_autocomplete_location_context = self._write_autocomplete_location_context.GetSerialisableTuple()
serialisable_info = [
serialisable_service_key,
serialisable_write_autocomplete_tag_domain,
self._override_write_autocomplete_file_domain,
serialisable_write_autocomplete_file_domain,
self._override_write_autocomplete_location_context,
serialisable_write_autocomplete_location_context,
self._search_namespaces_into_full_tags,
self._namespace_bare_fetch_all_allowed,
self._namespace_fetch_all_allowed,
@ -81,8 +80,8 @@ class TagAutocompleteOptions( HydrusSerialisable.SerialisableBase ):
[
serialisable_service_key,
serialisable_write_autocomplete_tag_domain,
self._override_write_autocomplete_file_domain,
serialisable_write_autocomplete_file_domain,
self._override_write_autocomplete_location_context,
serialisable_write_autocomplete_location_context,
self._search_namespaces_into_full_tags,
self._namespace_bare_fetch_all_allowed,
self._namespace_fetch_all_allowed,
@ -93,7 +92,7 @@ class TagAutocompleteOptions( HydrusSerialisable.SerialisableBase ):
self._service_key = bytes.fromhex( serialisable_service_key )
self._write_autocomplete_tag_domain = bytes.fromhex( serialisable_write_autocomplete_tag_domain )
self._write_autocomplete_file_domain = bytes.fromhex( serialisable_write_autocomplete_file_domain )
self._write_autocomplete_location_context = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_write_autocomplete_location_context )
def _UpdateSerialisableInfo( self, version, old_serialisable_info ):
@ -103,7 +102,7 @@ class TagAutocompleteOptions( HydrusSerialisable.SerialisableBase ):
[
serialisable_service_key,
serialisable_write_autocomplete_tag_domain,
override_write_autocomplete_file_domain,
override_write_autocomplete_location_context,
serialisable_write_autocomplete_file_domain,
search_namespaces_into_full_tags,
namespace_fetch_all_allowed,
@ -115,7 +114,7 @@ class TagAutocompleteOptions( HydrusSerialisable.SerialisableBase ):
new_serialisable_info = [
serialisable_service_key,
serialisable_write_autocomplete_tag_domain,
override_write_autocomplete_file_domain,
override_write_autocomplete_location_context,
serialisable_write_autocomplete_file_domain,
search_namespaces_into_full_tags,
namespace_bare_fetch_all_allowed,
@ -132,7 +131,7 @@ class TagAutocompleteOptions( HydrusSerialisable.SerialisableBase ):
[
serialisable_service_key,
serialisable_write_autocomplete_tag_domain,
override_write_autocomplete_file_domain,
override_write_autocomplete_location_context,
serialisable_write_autocomplete_file_domain,
search_namespaces_into_full_tags,
namespace_bare_fetch_all_allowed,
@ -146,7 +145,7 @@ class TagAutocompleteOptions( HydrusSerialisable.SerialisableBase ):
new_serialisable_info = [
serialisable_service_key,
serialisable_write_autocomplete_tag_domain,
override_write_autocomplete_file_domain,
override_write_autocomplete_location_context,
serialisable_write_autocomplete_file_domain,
search_namespaces_into_full_tags,
namespace_bare_fetch_all_allowed,
@ -159,6 +158,42 @@ class TagAutocompleteOptions( HydrusSerialisable.SerialisableBase ):
return ( 3, new_serialisable_info )
if version == 3:
[
serialisable_service_key,
serialisable_write_autocomplete_tag_domain,
override_write_autocomplete_location_context,
serialisable_write_autocomplete_file_domain,
search_namespaces_into_full_tags,
namespace_bare_fetch_all_allowed,
namespace_fetch_all_allowed,
fetch_all_allowed,
fetch_results_automatically,
exact_match_character_threshold
] = old_serialisable_info
file_service_key = bytes.fromhex( serialisable_write_autocomplete_file_domain )
location_context = ClientLocation.LocationContext.STATICCreateSimple( file_service_key )
serialisable_write_autocomplete_location_context = location_context.GetSerialisableTuple()
new_serialisable_info = [
serialisable_service_key,
serialisable_write_autocomplete_tag_domain,
override_write_autocomplete_location_context,
serialisable_write_autocomplete_location_context,
search_namespaces_into_full_tags,
namespace_bare_fetch_all_allowed,
namespace_fetch_all_allowed,
fetch_all_allowed,
fetch_results_automatically,
exact_match_character_threshold
]
return ( 4, new_serialisable_info )
def FetchAllAllowed( self ):
@ -180,20 +215,20 @@ class TagAutocompleteOptions( HydrusSerialisable.SerialisableBase ):
return self._service_key
def GetWriteAutocompleteFileDomain( self ):
def GetWriteAutocompleteLocationContext( self ) -> ClientLocation.LocationContext:
return self._write_autocomplete_file_domain
return self._write_autocomplete_location_context
def GetWriteAutocompleteDomain( self, location_context: ClientLocation.LocationContext ):
def GetWriteAutocompleteSearchDomain( self, location_context: ClientLocation.LocationContext ):
tag_service_key = self._service_key
if self._service_key != CC.COMBINED_TAG_SERVICE_KEY:
if self._override_write_autocomplete_file_domain:
if self._override_write_autocomplete_location_context:
location_context = ClientLocation.LocationContext.STATICCreateSimple( self._write_autocomplete_file_domain )
location_context = self._write_autocomplete_location_context.Duplicate()
tag_service_key = self._write_autocomplete_tag_domain
@ -222,9 +257,9 @@ class TagAutocompleteOptions( HydrusSerialisable.SerialisableBase ):
return self._namespace_fetch_all_allowed
def OverridesWriteAutocompleteFileDomain( self ):
def OverridesWriteAutocompleteLocationContext( self ):
return self._override_write_autocomplete_file_domain
return self._override_write_autocomplete_location_context
def SearchNamespacesIntoFullTags( self ):
@ -244,8 +279,8 @@ class TagAutocompleteOptions( HydrusSerialisable.SerialisableBase ):
def SetTuple( self,
write_autocomplete_tag_domain: bytes,
override_write_autocomplete_file_domain: bool,
write_autocomplete_file_domain: bytes,
override_write_autocomplete_location_context: bool,
write_autocomplete_location_context: ClientLocation.LocationContext,
search_namespaces_into_full_tags: bool,
namespace_bare_fetch_all_allowed: bool,
namespace_fetch_all_allowed: bool,
@ -253,8 +288,8 @@ class TagAutocompleteOptions( HydrusSerialisable.SerialisableBase ):
):
self._write_autocomplete_tag_domain = write_autocomplete_tag_domain
self._override_write_autocomplete_file_domain = override_write_autocomplete_file_domain
self._write_autocomplete_file_domain = write_autocomplete_file_domain
self._override_write_autocomplete_location_context = override_write_autocomplete_location_context
self._write_autocomplete_location_context = write_autocomplete_location_context
self._search_namespaces_into_full_tags = search_namespaces_into_full_tags
self._namespace_bare_fetch_all_allowed = namespace_bare_fetch_all_allowed
self._namespace_fetch_all_allowed = namespace_fetch_all_allowed
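Throughout these serialisation changes, raw-bytes service keys are stored as hex strings (`self._service_key.hex()` on save, `bytes.fromhex(...)` on load). The round trip is just:

```python
# service keys are raw bytes in memory, hex strings in serialised form
service_key = b'\x01\x02\xab\xcd'

serialisable_service_key = service_key.hex()  # '0102abcd'
restored_service_key = bytes.fromhex( serialisable_service_key )

assert restored_service_key == service_key
```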

View File

@ -8,11 +8,16 @@ import traceback
import typing
CBOR_AVAILABLE = False
try:
import cbor2
CBOR_AVAILABLE = True
except:
pass
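The guarded cbor2 import above is the standard optional-dependency pattern: attempt the import once at module load, record a capability flag, and branch on the flag at call sites rather than re-trying the import. The same shape, demonstrated with a deliberately missing module name:

```python
FAKE_DEPENDENCY_AVAILABLE = False

try:
    
    import _module_that_should_not_exist_  # stand-in for an optional dependency like cbor2
    
    FAKE_DEPENDENCY_AVAILABLE = True
    
except ImportError:
    
    pass

# call sites can now branch on the flag instead of catching ImportError everywhere
assert FAKE_DEPENDENCY_AVAILABLE is False
```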
from twisted.web.static import File as FileResource
@ -49,7 +54,7 @@ LOCAL_BOORU_JSON_BYTE_LIST_PARAMS = set()
CLIENT_API_INT_PARAMS = { 'file_id', 'file_sort_type' }
CLIENT_API_BYTE_PARAMS = { 'hash', 'destination_page_key', 'page_key', 'Hydrus-Client-API-Access-Key', 'Hydrus-Client-API-Session-Key', 'tag_service_key', 'file_service_key' }
CLIENT_API_STRING_PARAMS = { 'name', 'url', 'domain', 'search', 'file_service_name', 'tag_service_name' }
CLIENT_API_STRING_PARAMS = { 'name', 'url', 'domain', 'search', 'file_service_name', 'tag_service_name', 'reason' }
CLIENT_API_JSON_PARAMS = { 'basic_permissions', 'system_inbox', 'system_archive', 'tags', 'file_ids', 'only_return_identifiers', 'detailed_url_information', 'hide_service_names_tags', 'simple', 'file_sort_asc', 'return_hashes', 'include_notes', 'notes', 'note_names' }
CLIENT_API_JSON_BYTE_LIST_PARAMS = { 'hashes' }
CLIENT_API_JSON_BYTE_DICT_PARAMS = { 'service_keys_to_tags', 'service_keys_to_actions_to_tags', 'service_keys_to_additional_tags' }
@ -67,6 +72,11 @@ def Dumps( data, mime ):
def CheckHashLength( hashes, hash_type = 'sha256' ):
if len( hashes ) == 0:
raise HydrusExceptions.BadRequestException( 'Sorry, I was expecting at least 1 {} hash, but none were given!'.format( hash_type ) )
hash_types_to_length = {
'sha256' : 32,
'md5' : 16,
@ -375,6 +385,71 @@ def ParseClientAPISearchPredicates( request ):
return predicates
def ParseLocationContext( request: HydrusServerRequest.HydrusRequest, default: ClientLocation.LocationContext ):
if 'file_service_key' in request.parsed_request_args or 'file_service_name' in request.parsed_request_args:
if 'file_service_key' in request.parsed_request_args:
file_service_key = request.parsed_request_args[ 'file_service_key' ]
else:
file_service_name = request.parsed_request_args[ 'file_service_name' ]
try:
file_service_key = HG.client_controller.services_manager.GetServiceKeyFromName( HC.ALL_FILE_SERVICES, file_service_name )
except:
raise HydrusExceptions.BadRequestException( 'Could not find the service "{}"!'.format( file_service_name ) )
try:
service_type = HG.client_controller.services_manager.GetServiceType( file_service_key )
except:
raise HydrusExceptions.BadRequestException( 'Could not find that file service!' )
if service_type not in HC.ALL_FILE_SERVICES:
raise HydrusExceptions.BadRequestException( 'Sorry, that service key did not give a file service!' )
return ClientLocation.LocationContext.STATICCreateSimple( file_service_key )
else:
return default
def ParseHashes( request: HydrusServerRequest.HydrusRequest ):
hashes = set()
if 'hash' in request.parsed_request_args:
hash = request.parsed_request_args.GetValue( 'hash', bytes )
hashes.add( hash )
if 'hashes' in request.parsed_request_args:
more_hashes = request.parsed_request_args.GetValue( 'hashes', list, expected_list_type = bytes )
hashes.update( more_hashes )
CheckHashLength( hashes )
return hashes
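ParseHashes above deduplicates the 'hash'/'hashes' parsing that was previously copy-pasted into each POST handler: union the single hash and the list, then validate once. A plain-dict sketch of that behaviour (the real version reads typed values from parsed_request_args and raises BadRequestException):

```python
def parse_hashes( request_args, expected_length = 32 ):
    
    hashes = set()
    
    if 'hash' in request_args:
        
        hashes.add( request_args[ 'hash' ] )
        
    
    if 'hashes' in request_args:
        
        hashes.update( request_args[ 'hashes' ] )
        
    
    if len( hashes ) == 0:
        
        raise ValueError( 'expected at least one hash' )
        
    
    for h in hashes:
        
        if len( h ) != expected_length:
            
            raise ValueError( 'hash had the wrong length' )
            
        
    
    return hashes
```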
def ConvertTagListToPredicates( request, tag_list, do_permission_check = True ) -> list:
or_tag_lists = [ tag for tag in tag_list if isinstance( tag, list ) ]
@ -737,26 +812,14 @@ class HydrusResourceBooruThumbnail( HydrusResourceBooru ):
response_context_mime = HC.APPLICATION_UNKNOWN
elif mime in HC.AUDIO:
path = os.path.join( HC.STATIC_DIR, 'audio.png' )
elif mime == HC.APPLICATION_PDF:
path = os.path.join( HC.STATIC_DIR, 'pdf.png' )
elif mime == HC.APPLICATION_PSD:
path = os.path.join( HC.STATIC_DIR, 'psd.png' )
if not os.path.exists( path ):
path = HydrusPaths.mimes_to_default_thumbnail_paths[ mime ]
else:
path = os.path.join( HC.STATIC_DIR, 'hydrus.png' )
if not os.path.exists( path ):
raise HydrusExceptions.NotFoundException( 'Could not find that thumbnail!' )
path = HydrusPaths.mimes_to_default_thumbnail_paths[ mime ]
response_context = HydrusServerResources.ResponseContext( 200, mime = response_context_mime, path = path )
@ -1136,23 +1199,7 @@ class HydrusResourceClientAPIRestrictedAddFilesArchiveFiles( HydrusResourceClien
def _threadDoPOSTJob( self, request: HydrusServerRequest.HydrusRequest ):
hashes = set()
if 'hash' in request.parsed_request_args:
hash = request.parsed_request_args.GetValue( 'hash', bytes )
hashes.add( hash )
if 'hashes' in request.parsed_request_args:
more_hashes = request.parsed_request_args.GetValue( 'hashes', list, expected_list_type = bytes )
hashes.update( more_hashes )
CheckHashLength( hashes )
hashes = ParseHashes( request )
content_update = HydrusData.ContentUpdate( HC.CONTENT_TYPE_FILES, HC.CONTENT_UPDATE_ARCHIVE, hashes )
@ -1172,31 +1219,28 @@ class HydrusResourceClientAPIRestrictedAddFilesDeleteFiles( HydrusResourceClient
def _threadDoPOSTJob( self, request: HydrusServerRequest.HydrusRequest ):
hashes = set()
location_context = ParseLocationContext( request, HG.client_controller.services_manager.GetLocalMediaLocationContextUmbrella() )
if 'hash' in request.parsed_request_args:
if 'reason' in request.parsed_request_args:
hash = request.parsed_request_args.GetValue( 'hash', bytes )
reason = request.parsed_request_args.GetValue( 'reason', str )
hashes.add( hash )
else:
reason = 'Deleted via Client API.'
if 'hashes' in request.parsed_request_args:
hashes = ParseHashes( request )
# expand this to take reason
location_context.LimitToServiceTypes( HG.client_controller.services_manager.GetServiceType, ( HC.COMBINED_LOCAL_FILE, HC.LOCAL_FILE_DOMAIN ) )
content_update = HydrusData.ContentUpdate( HC.CONTENT_TYPE_FILES, HC.CONTENT_UPDATE_DELETE, hashes, reason = reason )
for service_key in location_context.current_service_keys:
more_hashes = request.parsed_request_args.GetValue( 'hashes', list, expected_list_type = bytes )
hashes.update( more_hashes )
CheckHashLength( hashes )
# expand this to take file service and reason
content_update = HydrusData.ContentUpdate( HC.CONTENT_TYPE_FILES, HC.CONTENT_UPDATE_DELETE, hashes )
service_keys_to_content_updates = { CC.LOCAL_FILE_SERVICE_KEY : [ content_update ] }
if len( service_keys_to_content_updates ) > 0:
service_keys_to_content_updates = { service_key : [ content_update ] }
HG.client_controller.WriteSynchronous( 'content_updates', service_keys_to_content_updates )
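The delete refactor above now iterates over location_context.current_service_keys and submits one content update per targeted service, instead of hard-coding CC.LOCAL_FILE_SERVICE_KEY. One way to express the per-service fan-out, with placeholder service keys and update payload:

```python
def build_service_keys_to_content_updates( current_service_keys, content_update ):
    
    # each targeted file service gets the same content update list
    return { service_key : [ content_update ] for service_key in current_service_keys }
```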
@ -1210,32 +1254,13 @@ class HydrusResourceClientAPIRestrictedAddFilesUnarchiveFiles( HydrusResourceCli
def _threadDoPOSTJob( self, request: HydrusServerRequest.HydrusRequest ):
hashes = set()
if 'hash' in request.parsed_request_args:
hash = request.parsed_request_args.GetValue( 'hash', bytes )
hashes.add( hash )
if 'hashes' in request.parsed_request_args:
more_hashes = request.parsed_request_args.GetValue( 'hashes', list, expected_list_type = bytes )
hashes.update( more_hashes )
CheckHashLength( hashes )
hashes = ParseHashes( request )
content_update = HydrusData.ContentUpdate( HC.CONTENT_TYPE_FILES, HC.CONTENT_UPDATE_INBOX, hashes )
service_keys_to_content_updates = { CC.COMBINED_LOCAL_FILE_SERVICE_KEY : [ content_update ] }
if len( service_keys_to_content_updates ) > 0:
HG.client_controller.WriteSynchronous( 'content_updates', service_keys_to_content_updates )
HG.client_controller.WriteSynchronous( 'content_updates', service_keys_to_content_updates )
response_context = HydrusServerResources.ResponseContext( 200 )
@@ -1246,31 +1271,17 @@ class HydrusResourceClientAPIRestrictedAddFilesUndeleteFiles( HydrusResourceClie
def _threadDoPOSTJob( self, request: HydrusServerRequest.HydrusRequest ):
hashes = set()
location_context = ParseLocationContext( request, HG.client_controller.services_manager.GetLocalMediaLocationContextUmbrella() )
if 'hash' in request.parsed_request_args:
hash = request.parsed_request_args.GetValue( 'hash', bytes )
hashes.add( hash )
hashes = ParseHashes( request )
if 'hashes' in request.parsed_request_args:
more_hashes = request.parsed_request_args.GetValue( 'hashes', list, expected_list_type = bytes )
hashes.update( more_hashes )
CheckHashLength( hashes )
# expand this to take file service, if and when we move to multiple trashes or whatever
location_context.LimitToServiceTypes( HG.client_controller.services_manager.GetServiceType, ( HC.LOCAL_FILE_DOMAIN, ) )
content_update = HydrusData.ContentUpdate( HC.CONTENT_TYPE_FILES, HC.CONTENT_UPDATE_UNDELETE, hashes )
service_keys_to_content_updates = { CC.LOCAL_FILE_SERVICE_KEY : [ content_update ] }
if len( service_keys_to_content_updates ) > 0:
for service_key in location_context.current_service_keys:
service_keys_to_content_updates = { service_key : [ content_update ] }
HG.client_controller.WriteSynchronous( 'content_updates', service_keys_to_content_updates )
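The delete and undelete hunks above move from a single hardcoded `CC.LOCAL_FILE_SERVICE_KEY` write to one `content_updates` write per service key in the parsed location context. A hedged sketch of that fan-out, with `write_synchronous` standing in for `HG.client_controller.WriteSynchronous` (none of these names are the real hydrus API):

```python
def fan_out_content_update( service_keys, content_update, write_synchronous ):
    
    # one write per file service, mirroring the multiple-local-file-services loop
    for service_key in service_keys:
        
        service_keys_to_content_updates = { service_key : [ content_update ] }
        
        write_synchronous( 'content_updates', service_keys_to_content_updates )
        
    
```

With a recording callback, two service keys produce two separate synchronous writes, one per service.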
@@ -1308,7 +1319,7 @@ class HydrusResourceClientAPIRestrictedAddNotesSetNotes( HydrusResourceClientAPI
raise HydrusExceptions.BadRequestException( 'There was no file identifier or hash given!' )
notes = request.parsed_request_args.GetValue( 'notes', dict )
notes = request.parsed_request_args.GetValue( 'notes', dict, expected_dict_types = ( str, str ) )
content_updates = [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_NOTES, HC.CONTENT_UPDATE_SET, ( hash, name, note ) ) for ( name, note ) in notes.items() ]
@@ -1365,28 +1376,7 @@ class HydrusResourceClientAPIRestrictedAddTagsAddTags( HydrusResourceClientAPIRe
def _threadDoPOSTJob( self, request: HydrusServerRequest.HydrusRequest ):
hashes = set()
if 'hash' in request.parsed_request_args:
hash = request.parsed_request_args.GetValue( 'hash', bytes )
hashes.add( hash )
if 'hashes' in request.parsed_request_args:
more_hashes = request.parsed_request_args.GetValue( 'hashes', list, expected_list_type = bytes )
hashes.update( more_hashes )
CheckHashLength( hashes )
if len( hashes ) == 0:
raise HydrusExceptions.BadRequestException( 'There were no hashes given!' )
hashes = ParseHashes( request )
#
@@ -1643,7 +1633,7 @@ class HydrusResourceClientAPIRestrictedAddTagsSearchTags( HydrusResourceClientAP
autocomplete_search_text = parsed_autocomplete_text.GetSearchText( True )
default_location_context = HG.client_controller.services_manager.GetDefaultLocationContext()
default_location_context = HG.client_controller.new_options.GetDefaultLocalLocationContext()
file_search_context = ClientSearch.FileSearchContext( location_context = default_location_context, tag_search_context = tag_search_context )
@@ -1786,23 +1776,7 @@ class HydrusResourceClientAPIRestrictedAddURLsAssociateURL( HydrusResourceClient
raise HydrusExceptions.BadRequestException( 'Did not find any URLs to add or delete!' )
applicable_hashes = []
if 'hash' in request.parsed_request_args:
hash = request.parsed_request_args.GetValue( 'hash', bytes )
applicable_hashes.append( hash )
if 'hashes' in request.parsed_request_args:
hashes = request.parsed_request_args.GetValue( 'hashes', list, expected_list_type = bytes )
applicable_hashes.extend( hashes )
CheckHashLength( applicable_hashes )
applicable_hashes = ParseHashes( request )
if len( applicable_hashes ) == 0:
@@ -2034,47 +2008,7 @@ class HydrusResourceClientAPIRestrictedGetFilesSearchFiles( HydrusResourceClient
def _threadDoGETJob( self, request: HydrusServerRequest.HydrusRequest ):
if 'file_service_key' in request.parsed_request_args or 'file_service_name' in request.parsed_request_args:
if 'file_service_key' in request.parsed_request_args:
file_service_key = request.parsed_request_args[ 'file_service_key' ]
else:
file_service_name = request.parsed_request_args[ 'file_service_name' ]
try:
file_service_key = HG.client_controller.services_manager.GetServiceKeyFromName( HC.ALL_FILE_SERVICES, file_service_name )
except:
raise HydrusExceptions.BadRequestException( 'Could not find the service "{}"!'.format( file_service_name ) )
try:
service = HG.client_controller.services_manager.GetService( file_service_key )
except:
raise HydrusExceptions.BadRequestException( 'Could not find that file service!' )
if service.GetServiceType() not in HC.ALL_FILE_SERVICES:
raise HydrusExceptions.BadRequestException( 'Sorry, that service key did not give a file service!' )
else:
# I guess ideally we would go for the 'all local services' umbrella, or a list of them, or however we end up doing that
# for now we'll fudge it
file_service_key = list( HG.client_controller.services_manager.GetServiceKeys( ( HC.LOCAL_FILE_DOMAIN, ) ) )[0]
location_context = ParseLocationContext( request, HG.client_controller.services_manager.GetLocalMediaLocationContextUmbrella() )
if 'tag_service_key' in request.parsed_request_args or 'tag_service_name' in request.parsed_request_args:
@@ -2115,12 +2049,11 @@ class HydrusResourceClientAPIRestrictedGetFilesSearchFiles( HydrusResourceClient
tag_service_key = CC.COMBINED_TAG_SERVICE_KEY
if tag_service_key == CC.COMBINED_TAG_SERVICE_KEY and file_service_key == CC.COMBINED_FILE_SERVICE_KEY:
if tag_service_key == CC.COMBINED_TAG_SERVICE_KEY and location_context.IsAllKnownFiles():
raise HydrusExceptions.BadRequestException( 'Sorry, search for all known tags over all known files is not supported!' )
location_context = ClientLocation.LocationContext.STATICCreateSimple( file_service_key )
tag_search_context = ClientSearch.TagSearchContext( service_key = tag_service_key )
predicates = ParseClientAPISearchPredicates( request )
@@ -2527,7 +2460,7 @@ class HydrusResourceClientAPIRestrictedGetFilesGetThumbnail( HydrusResourceClien
except HydrusExceptions.FileMissingException:
raise HydrusExceptions.NotFoundException( 'Could not find that file!' )
path = HydrusPaths.mimes_to_default_thumbnail_paths[ media_result.GetMime() ]
response_context = HydrusServerResources.ResponseContext( 200, mime = HC.APPLICATION_OCTET_STREAM, path = path )


@@ -759,22 +759,71 @@ class NetworkJob( object ):
try:
is_firewall = cloudscraper.CloudScraper.is_Firewall_Blocked( response )
# cloudscraper refactored a bit around 1.2.60, so we now have some different paths to what we want
if hasattr( cloudscraper.CloudScraper, 'is_reCaptcha_Challenge' ):
possible_paths = [
( cloudscraper.CloudScraper, 'is_Firewall_Blocked' ),
( cloudscraper.cloudflare.Cloudflare, 'is_Firewall_Blocked' )
]
is_firewall = False
for ( m, method_name ) in possible_paths:
is_captcha = getattr( cloudscraper.CloudScraper, 'is_reCaptcha_Challenge' )( response )
elif hasattr( cloudscraper.CloudScraper, 'is_Captcha_Challenge' ):
is_captcha = getattr( cloudscraper.CloudScraper, 'is_Captcha_Challenge' )( response )
else:
is_captcha = False
if hasattr( m, method_name ):
is_firewall = getattr( m, method_name )( response )
if is_firewall:
break
is_attemptable = is_captcha or cloudscraper.CloudScraper.is_IUAM_Challenge( response )
possible_paths = [
( cloudscraper.CloudScraper, 'is_reCaptcha_Challenge' ),
( cloudscraper.CloudScraper, 'is_Captcha_Challenge' ),
( cloudscraper.cloudflare.Cloudflare, 'is_Captcha_Challenge' )
]
is_captcha = False
for ( m, method_name ) in possible_paths:
if hasattr( m, method_name ):
is_captcha = getattr( m, method_name )( response )
if is_captcha:
break
possible_paths = [
( cloudscraper.CloudScraper, 'is_IUAM_Challenge' ),
( cloudscraper.cloudflare.Cloudflare, 'is_IUAM_Challenge' ),
( cloudscraper.cloudflare.Cloudflare, 'is_New_IUAM_Challenge' )
]
is_iuam = False
for ( m, method_name ) in possible_paths:
if hasattr( m, method_name ):
is_iuam = getattr( m, method_name )( response )
if is_iuam:
break
is_attemptable = is_captcha or is_iuam
except Exception as e:
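The compatibility probing in this hunk (per the changelog, cloudscraper moved these checks around 1.2.60) can be sketched generically: walk a list of ( class, method_name ) locations, call the first one that exists, and stop on a positive result. The `OldScraper`/`NewScraper` classes below are stand-ins for the real cloudscraper classes, purely illustrative:

```python
def probe( possible_paths, response ):
    
    # try each candidate location for the method; break on the first True
    for ( m, method_name ) in possible_paths:
        
        if hasattr( m, method_name ):
            
            if getattr( m, method_name )( response ):
                
                return True
                
            
        
    
    return False
    

class OldScraper:
    
    @staticmethod
    def is_Firewall_Blocked( response ):
        
        return response.get( 'firewall', False )
        
    

class NewScraper:
    
    pass # in newer versions the method lives on a different class
    

# the probe skips the class that lacks the method and uses the one that has it
blocked = probe( [ ( NewScraper, 'is_Firewall_Blocked' ), ( OldScraper, 'is_Firewall_Blocked' ) ], { 'firewall' : True } )
```

Listing both old and new locations in `possible_paths` is what should keep this working across further cloudscraper refactors, as the changelog hopes.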


@@ -1,6 +1,8 @@
import os
import sqlite3
import sys
import typing
import yaml
# old method of getting frozen dir, doesn't work for symlinks looks like:
# BASE_DIR = getattr( sys, '_MEIPASS', None )
@@ -72,16 +74,13 @@ LICENSE_PATH = os.path.join( BASE_DIR, 'license.txt' )
#
import sqlite3
import yaml
options = {}
# Misc
NETWORK_VERSION = 20
SOFTWARE_VERSION = 477
CLIENT_API_VERSION = 27
SOFTWARE_VERSION = 478
CLIENT_API_VERSION = 28
SERVER_THUMBNAIL_DIMENSIONS = ( 200, 200 )
@@ -101,32 +100,32 @@ noneable_str = typing.Optional[ str ]
BANDWIDTH_TYPE_DATA = 0
BANDWIDTH_TYPE_REQUESTS = 1
bandwidth_type_string_lookup = {}
bandwidth_type_string_lookup[ BANDWIDTH_TYPE_DATA ] = 'data'
bandwidth_type_string_lookup[ BANDWIDTH_TYPE_REQUESTS ] = 'requests'
bandwidth_type_string_lookup = {
BANDWIDTH_TYPE_DATA : 'data',
BANDWIDTH_TYPE_REQUESTS : 'requests'
}
CONTENT_MERGE_ACTION_COPY = 0
CONTENT_MERGE_ACTION_MOVE = 1
CONTENT_MERGE_ACTION_TWO_WAY_MERGE = 2
content_merge_string_lookup = {}
content_merge_string_lookup[ CONTENT_MERGE_ACTION_COPY ] = 'copy from worse to better'
content_merge_string_lookup[ CONTENT_MERGE_ACTION_MOVE ] = 'move from worse to better'
content_merge_string_lookup[ CONTENT_MERGE_ACTION_TWO_WAY_MERGE ] = 'copy in both directions'
content_merge_string_lookup = {
CONTENT_MERGE_ACTION_COPY : 'copy from worse to better',
CONTENT_MERGE_ACTION_MOVE : 'move from worse to better',
CONTENT_MERGE_ACTION_TWO_WAY_MERGE : 'copy in both directions'
}
CONTENT_STATUS_CURRENT = 0
CONTENT_STATUS_PENDING = 1
CONTENT_STATUS_DELETED = 2
CONTENT_STATUS_PETITIONED = 3
content_status_string_lookup = {}
content_status_string_lookup[ CONTENT_STATUS_CURRENT ] = 'current'
content_status_string_lookup[ CONTENT_STATUS_PENDING ] = 'pending'
content_status_string_lookup[ CONTENT_STATUS_DELETED ] = 'deleted'
content_status_string_lookup[ CONTENT_STATUS_PETITIONED ] = 'petitioned'
content_status_string_lookup = {
CONTENT_STATUS_CURRENT : 'current',
CONTENT_STATUS_PENDING : 'pending',
CONTENT_STATUS_DELETED : 'deleted',
CONTENT_STATUS_PETITIONED : 'petitioned'
}
CONTENT_TYPE_MAPPINGS = 0
CONTENT_TYPE_TAG_SIBLINGS = 1
@@ -151,29 +150,29 @@ CONTENT_TYPE_FILE_VIEWING_STATS = 19
CONTENT_TYPE_TAG = 20
CONTENT_TYPE_DEFINITIONS = 21
content_type_string_lookup = {}
content_type_string_lookup[ CONTENT_TYPE_MAPPINGS ] = 'mappings'
content_type_string_lookup[ CONTENT_TYPE_TAG_SIBLINGS ] = 'tag siblings'
content_type_string_lookup[ CONTENT_TYPE_TAG_PARENTS ] = 'tag parents'
content_type_string_lookup[ CONTENT_TYPE_FILES ] = 'files'
content_type_string_lookup[ CONTENT_TYPE_RATINGS ] = 'ratings'
content_type_string_lookup[ CONTENT_TYPE_MAPPING ] = 'mapping'
content_type_string_lookup[ CONTENT_TYPE_DIRECTORIES ] = 'directories'
content_type_string_lookup[ CONTENT_TYPE_URLS ] = 'urls'
content_type_string_lookup[ CONTENT_TYPE_VETO ] = 'veto'
content_type_string_lookup[ CONTENT_TYPE_ACCOUNTS ] = 'accounts'
content_type_string_lookup[ CONTENT_TYPE_OPTIONS ] = 'options'
content_type_string_lookup[ CONTENT_TYPE_SERVICES ] = 'services'
content_type_string_lookup[ CONTENT_TYPE_UNKNOWN ] = 'unknown'
content_type_string_lookup[ CONTENT_TYPE_ACCOUNT_TYPES ] = 'account types'
content_type_string_lookup[ CONTENT_TYPE_VARIABLE ] = 'variable'
content_type_string_lookup[ CONTENT_TYPE_HASH ] = 'hash'
content_type_string_lookup[ CONTENT_TYPE_TIMESTAMP ] = 'timestamp'
content_type_string_lookup[ CONTENT_TYPE_TITLE ] = 'title'
content_type_string_lookup[ CONTENT_TYPE_NOTES ] = 'notes'
content_type_string_lookup[ CONTENT_TYPE_FILE_VIEWING_STATS ] = 'file viewing stats'
content_type_string_lookup[ CONTENT_TYPE_DEFINITIONS ] = 'definitions'
content_type_string_lookup = {
CONTENT_TYPE_MAPPINGS : 'mappings',
CONTENT_TYPE_TAG_SIBLINGS : 'tag siblings',
CONTENT_TYPE_TAG_PARENTS : 'tag parents',
CONTENT_TYPE_FILES : 'files',
CONTENT_TYPE_RATINGS : 'ratings',
CONTENT_TYPE_MAPPING : 'mapping',
CONTENT_TYPE_DIRECTORIES : 'directories',
CONTENT_TYPE_URLS : 'urls',
CONTENT_TYPE_VETO : 'veto',
CONTENT_TYPE_ACCOUNTS : 'accounts',
CONTENT_TYPE_OPTIONS : 'options',
CONTENT_TYPE_SERVICES : 'services',
CONTENT_TYPE_UNKNOWN : 'unknown',
CONTENT_TYPE_ACCOUNT_TYPES : 'account types',
CONTENT_TYPE_VARIABLE : 'variable',
CONTENT_TYPE_HASH : 'hash',
CONTENT_TYPE_TIMESTAMP : 'timestamp',
CONTENT_TYPE_TITLE : 'title',
CONTENT_TYPE_NOTES : 'notes',
CONTENT_TYPE_FILE_VIEWING_STATS : 'file viewing stats',
CONTENT_TYPE_DEFINITIONS : 'definitions'
}
CONTENT_UPDATE_ADD = 0
CONTENT_UPDATE_DELETE = 1
@@ -195,26 +194,26 @@ CONTENT_UPDATE_CLEAR_DELETE_RECORD = 17
CONTENT_UPDATE_INCREMENT = 18
CONTENT_UPDATE_DECREMENT = 19
content_update_string_lookup = {}
content_update_string_lookup[ CONTENT_UPDATE_ADD ] = 'add'
content_update_string_lookup[ CONTENT_UPDATE_DELETE ] = 'delete'
content_update_string_lookup[ CONTENT_UPDATE_PEND ] = 'pending'
content_update_string_lookup[ CONTENT_UPDATE_RESCIND_PEND ] = 'rescind pending'
content_update_string_lookup[ CONTENT_UPDATE_PETITION ] = 'petition'
content_update_string_lookup[ CONTENT_UPDATE_RESCIND_PETITION ] = 'rescind petition'
content_update_string_lookup[ CONTENT_UPDATE_EDIT_LOG ] = 'edit log'
content_update_string_lookup[ CONTENT_UPDATE_ARCHIVE ] = 'archive'
content_update_string_lookup[ CONTENT_UPDATE_INBOX ] = 'inbox'
content_update_string_lookup[ CONTENT_UPDATE_RATING ] = 'rating'
content_update_string_lookup[ CONTENT_UPDATE_DENY_PEND ] = 'deny pend'
content_update_string_lookup[ CONTENT_UPDATE_DENY_PETITION ] = 'deny petition'
content_update_string_lookup[ CONTENT_UPDATE_UNDELETE ] = 'undelete'
content_update_string_lookup[ CONTENT_UPDATE_SET ] = 'set'
content_update_string_lookup[ CONTENT_UPDATE_FLIP ] = 'flip on/off'
content_update_string_lookup[ CONTENT_UPDATE_CLEAR_DELETE_RECORD ] = 'clear deletion record'
content_update_string_lookup[ CONTENT_UPDATE_INCREMENT ] = 'increment'
content_update_string_lookup[ CONTENT_UPDATE_DECREMENT ] = 'decrement'
content_update_string_lookup = {
CONTENT_UPDATE_ADD : 'add',
CONTENT_UPDATE_DELETE : 'delete',
CONTENT_UPDATE_PEND : 'pending',
CONTENT_UPDATE_RESCIND_PEND : 'rescind pending',
CONTENT_UPDATE_PETITION : 'petition',
CONTENT_UPDATE_RESCIND_PETITION : 'rescind petition',
CONTENT_UPDATE_EDIT_LOG : 'edit log',
CONTENT_UPDATE_ARCHIVE : 'archive',
CONTENT_UPDATE_INBOX : 'inbox',
CONTENT_UPDATE_RATING : 'rating',
CONTENT_UPDATE_DENY_PEND : 'deny pend',
CONTENT_UPDATE_DENY_PETITION : 'deny petition',
CONTENT_UPDATE_UNDELETE : 'undelete',
CONTENT_UPDATE_SET : 'set',
CONTENT_UPDATE_FLIP : 'flip on/off',
CONTENT_UPDATE_CLEAR_DELETE_RECORD : 'clear deletion record',
CONTENT_UPDATE_INCREMENT : 'increment',
CONTENT_UPDATE_DECREMENT : 'decrement'
}
DEFINITIONS_TYPE_HASHES = 0
DEFINITIONS_TYPE_TAGS = 1
@@ -231,29 +230,29 @@ DUPLICATE_MEMBER = 8
DUPLICATE_KING = 9
DUPLICATE_CONFIRMED_ALTERNATE = 10
duplicate_type_string_lookup = {}
duplicate_type_string_lookup[ DUPLICATE_POTENTIAL ] = 'potential duplicates'
duplicate_type_string_lookup[ DUPLICATE_FALSE_POSITIVE ] = 'not related/false positive'
duplicate_type_string_lookup[ DUPLICATE_SAME_QUALITY ] = 'same quality'
duplicate_type_string_lookup[ DUPLICATE_ALTERNATE ] = 'alternates'
duplicate_type_string_lookup[ DUPLICATE_BETTER ] = 'this is better'
duplicate_type_string_lookup[ DUPLICATE_SMALLER_BETTER ] = 'smaller hash_id is better'
duplicate_type_string_lookup[ DUPLICATE_LARGER_BETTER ] = 'larger hash_id is better'
duplicate_type_string_lookup[ DUPLICATE_WORSE ] = 'this is worse'
duplicate_type_string_lookup[ DUPLICATE_MEMBER ] = 'duplicates'
duplicate_type_string_lookup[ DUPLICATE_KING ] = 'the best quality duplicate'
duplicate_type_string_lookup[ DUPLICATE_CONFIRMED_ALTERNATE ] = 'confirmed alternates'
duplicate_type_string_lookup = {
DUPLICATE_POTENTIAL : 'potential duplicates',
DUPLICATE_FALSE_POSITIVE : 'not related/false positive',
DUPLICATE_SAME_QUALITY : 'same quality',
DUPLICATE_ALTERNATE : 'alternates',
DUPLICATE_BETTER : 'this is better',
DUPLICATE_SMALLER_BETTER : 'smaller hash_id is better',
DUPLICATE_LARGER_BETTER : 'larger hash_id is better',
DUPLICATE_WORSE : 'this is worse',
DUPLICATE_MEMBER : 'duplicates',
DUPLICATE_KING : 'the best quality duplicate',
DUPLICATE_CONFIRMED_ALTERNATE : 'confirmed alternates'
}
ENCODING_RAW = 0
ENCODING_HEX = 1
ENCODING_BASE64 = 2
encoding_string_lookup = {}
encoding_string_lookup[ ENCODING_RAW ] = 'raw bytes'
encoding_string_lookup[ ENCODING_HEX ] = 'hexadecimal'
encoding_string_lookup[ ENCODING_BASE64 ] = 'base64'
encoding_string_lookup = {
ENCODING_RAW : 'raw bytes',
ENCODING_HEX : 'hexadecimal',
ENCODING_BASE64 : 'base64'
}
IMPORT_FOLDER_TYPE_DELETE = 0
IMPORT_FOLDER_TYPE_SYNCHRONISE = 1
@@ -273,27 +272,27 @@ MAINTENANCE_SHUTDOWN = 1
MAINTENANCE_FORCED = 2
MAINTENANCE_ACTIVE = 3
NICE_RESOLUTIONS = {}
NICE_RESOLUTIONS = {
( 640, 480 ) : '480p',
( 1280, 720 ) : '720p',
( 1920, 1080 ) : '1080p',
( 3840, 2160 ) : '4k',
( 720, 1280 ) : 'vertical 720p',
( 1080, 1920 ) : 'vertical 1080p',
( 2160, 3840 ) : 'vertical 4k'
}
NICE_RESOLUTIONS[ ( 640, 480 ) ] = '480p'
NICE_RESOLUTIONS[ ( 1280, 720 ) ] = '720p'
NICE_RESOLUTIONS[ ( 1920, 1080 ) ] = '1080p'
NICE_RESOLUTIONS[ ( 3840, 2160 ) ] = '4k'
NICE_RESOLUTIONS[ ( 720, 1280 ) ] = 'vertical 720p'
NICE_RESOLUTIONS[ ( 1080, 1920 ) ] = 'vertical 1080p'
NICE_RESOLUTIONS[ ( 2160, 3840 ) ] = 'vertical 4k'
NICE_RATIOS = {}
NICE_RATIOS[ 1 ] = '1:1'
NICE_RATIOS[ 4 / 3 ] = '4:3'
NICE_RATIOS[ 5 / 4 ] = '5:4'
NICE_RATIOS[ 16 / 9 ] = '16:9'
NICE_RATIOS[ 21 / 9 ] = '21:9'
NICE_RATIOS[ 47 / 20 ] = '2.35:1'
NICE_RATIOS[ 9 / 16 ] = '9:16'
NICE_RATIOS[ 2 / 3 ] = '2:3'
NICE_RATIOS[ 4 / 5 ] = '4:5'
NICE_RATIOS = {
1 : '1:1',
4 / 3 : '4:3',
5 / 4 : '5:4',
16 / 9 : '16:9',
21 / 9 : '21:9',
47 / 20 : '2.35:1',
9 / 16 : '9:16',
2 / 3 : '2:3',
4 / 5 : '4:5'
}
GET_DATA = 0
POST_DATA = 1
@@ -307,16 +306,16 @@ UNKNOWN_PERMISSION = 7
CREATABLE_PERMISSIONS = [ GET_DATA, POST_DATA, POST_PETITIONS, RESOLVE_PETITIONS, MANAGE_USERS, GENERAL_ADMIN ]
ADMIN_PERMISSIONS = [ RESOLVE_PETITIONS, MANAGE_USERS, GENERAL_ADMIN, EDIT_SERVICES ]
permissions_string_lookup = {}
permissions_string_lookup[ GET_DATA ] = 'get data'
permissions_string_lookup[ POST_DATA ] = 'post data'
permissions_string_lookup[ POST_PETITIONS ] = 'post petitions'
permissions_string_lookup[ RESOLVE_PETITIONS ] = 'resolve petitions'
permissions_string_lookup[ MANAGE_USERS ] = 'manage users'
permissions_string_lookup[ GENERAL_ADMIN ] = 'general administration'
permissions_string_lookup[ EDIT_SERVICES ] = 'edit services'
permissions_string_lookup[ UNKNOWN_PERMISSION ] = 'unknown'
permissions_string_lookup = {
GET_DATA : 'get data',
POST_DATA : 'post data',
POST_PETITIONS : 'post petitions',
RESOLVE_PETITIONS : 'resolve petitions',
MANAGE_USERS : 'manage users',
GENERAL_ADMIN : 'general administration',
EDIT_SERVICES : 'edit services',
UNKNOWN_PERMISSION : 'unknown'
}
# new permissions
@@ -324,38 +323,31 @@ PERMISSION_ACTION_PETITION = 0
PERMISSION_ACTION_CREATE = 1
PERMISSION_ACTION_MODERATE = 2
permission_pair_string_lookup = {}
permission_pair_string_lookup[ ( CONTENT_TYPE_ACCOUNTS, None ) ] = 'cannot change accounts'
permission_pair_string_lookup[ ( CONTENT_TYPE_ACCOUNTS, PERMISSION_ACTION_CREATE ) ] = 'can create accounts'
permission_pair_string_lookup[ ( CONTENT_TYPE_ACCOUNTS, PERMISSION_ACTION_MODERATE ) ] = 'can manage accounts completely'
permission_pair_string_lookup[ ( CONTENT_TYPE_ACCOUNT_TYPES, None ) ] = 'cannot change account types'
permission_pair_string_lookup[ ( CONTENT_TYPE_ACCOUNT_TYPES, PERMISSION_ACTION_MODERATE ) ] = 'can manage account types completely'
permission_pair_string_lookup[ ( CONTENT_TYPE_OPTIONS, None ) ] = 'cannot change service options'
permission_pair_string_lookup[ ( CONTENT_TYPE_OPTIONS, PERMISSION_ACTION_MODERATE ) ] = 'can manage service options completely'
permission_pair_string_lookup[ ( CONTENT_TYPE_SERVICES, None ) ] = 'cannot change services'
permission_pair_string_lookup[ ( CONTENT_TYPE_SERVICES, PERMISSION_ACTION_MODERATE ) ] = 'can manage services completely'
permission_pair_string_lookup[ ( CONTENT_TYPE_FILES, None ) ] = 'can only download files'
permission_pair_string_lookup[ ( CONTENT_TYPE_FILES, PERMISSION_ACTION_PETITION ) ] = 'can petition to remove existing files'
permission_pair_string_lookup[ ( CONTENT_TYPE_FILES, PERMISSION_ACTION_CREATE ) ] = 'can upload new files and petition existing ones'
permission_pair_string_lookup[ ( CONTENT_TYPE_FILES, PERMISSION_ACTION_MODERATE ) ] = 'can upload and delete files and process petitions'
permission_pair_string_lookup[ ( CONTENT_TYPE_MAPPINGS, None ) ] = 'can only download mappings'
permission_pair_string_lookup[ ( CONTENT_TYPE_MAPPINGS, PERMISSION_ACTION_PETITION ) ] = 'can petition to remove existing mappings'
permission_pair_string_lookup[ ( CONTENT_TYPE_MAPPINGS, PERMISSION_ACTION_CREATE ) ] = 'can upload new mappings and petition existing ones'
permission_pair_string_lookup[ ( CONTENT_TYPE_MAPPINGS, PERMISSION_ACTION_MODERATE ) ] = 'can upload and delete mappings and process petitions'
permission_pair_string_lookup[ ( CONTENT_TYPE_TAG_PARENTS, None ) ] = 'can only download tag parents'
permission_pair_string_lookup[ ( CONTENT_TYPE_TAG_PARENTS, PERMISSION_ACTION_PETITION ) ] = 'can petition to add or remove tag parents'
permission_pair_string_lookup[ ( CONTENT_TYPE_TAG_PARENTS, PERMISSION_ACTION_MODERATE ) ] = 'can upload and delete tag parents and process petitions'
permission_pair_string_lookup[ ( CONTENT_TYPE_TAG_SIBLINGS, None ) ] = 'can only download tag siblings'
permission_pair_string_lookup[ ( CONTENT_TYPE_TAG_SIBLINGS, PERMISSION_ACTION_PETITION ) ] = 'can petition to add or remove tag siblings'
permission_pair_string_lookup[ ( CONTENT_TYPE_TAG_SIBLINGS, PERMISSION_ACTION_MODERATE ) ] = 'can upload and delete tag siblings and process petitions'
permission_pair_string_lookup = {
( CONTENT_TYPE_ACCOUNTS, None ) : 'cannot change accounts',
( CONTENT_TYPE_ACCOUNTS, PERMISSION_ACTION_CREATE ) : 'can create accounts',
( CONTENT_TYPE_ACCOUNTS, PERMISSION_ACTION_MODERATE ) : 'can manage accounts completely',
( CONTENT_TYPE_ACCOUNT_TYPES, None ) : 'cannot change account types',
( CONTENT_TYPE_ACCOUNT_TYPES, PERMISSION_ACTION_MODERATE ) : 'can manage account types completely',
( CONTENT_TYPE_OPTIONS, None ) : 'cannot change service options',
( CONTENT_TYPE_OPTIONS, PERMISSION_ACTION_MODERATE ) : 'can manage service options completely',
( CONTENT_TYPE_SERVICES, None ) : 'cannot change services',
( CONTENT_TYPE_SERVICES, PERMISSION_ACTION_MODERATE ) : 'can manage services completely',
( CONTENT_TYPE_FILES, None ) : 'can only download files',
( CONTENT_TYPE_FILES, PERMISSION_ACTION_PETITION ) : 'can petition to remove existing files',
( CONTENT_TYPE_FILES, PERMISSION_ACTION_CREATE ) : 'can upload new files and petition existing ones',
( CONTENT_TYPE_FILES, PERMISSION_ACTION_MODERATE ) : 'can upload and delete files and process petitions',
( CONTENT_TYPE_MAPPINGS, None ) : 'can only download mappings',
( CONTENT_TYPE_MAPPINGS, PERMISSION_ACTION_PETITION ) : 'can petition to remove existing mappings',
( CONTENT_TYPE_MAPPINGS, PERMISSION_ACTION_CREATE ) : 'can upload new mappings and petition existing ones',
( CONTENT_TYPE_MAPPINGS, PERMISSION_ACTION_MODERATE ) : 'can upload and delete mappings and process petitions',
( CONTENT_TYPE_TAG_PARENTS, None ) : 'can only download tag parents',
( CONTENT_TYPE_TAG_PARENTS, PERMISSION_ACTION_PETITION ) : 'can petition to add or remove tag parents',
( CONTENT_TYPE_TAG_PARENTS, PERMISSION_ACTION_MODERATE ) : 'can upload and delete tag parents and process petitions',
( CONTENT_TYPE_TAG_SIBLINGS, None ) : 'can only download tag siblings',
( CONTENT_TYPE_TAG_SIBLINGS, PERMISSION_ACTION_PETITION ) : 'can petition to add or remove tag siblings',
( CONTENT_TYPE_TAG_SIBLINGS, PERMISSION_ACTION_MODERATE ) : 'can upload and delete tag siblings and process petitions'
}
TAG_REPOSITORY = 0
FILE_REPOSITORY = 1
@@ -379,29 +371,29 @@ COMBINED_DELETED_FILE = 19
SERVER_ADMIN = 99
NULL_SERVICE = 100
service_string_lookup = {}
service_string_lookup[ TAG_REPOSITORY ] = 'hydrus tag repository'
service_string_lookup[ FILE_REPOSITORY ] = 'hydrus file repository'
service_string_lookup[ LOCAL_FILE_DOMAIN ] = 'local file domain'
service_string_lookup[ LOCAL_FILE_TRASH_DOMAIN ] = 'local trash file domain'
service_string_lookup[ COMBINED_LOCAL_FILE ] = 'virtual combined local file service'
service_string_lookup[ MESSAGE_DEPOT ] = 'hydrus message depot'
service_string_lookup[ LOCAL_TAG ] = 'local tag service'
service_string_lookup[ LOCAL_RATING_NUMERICAL ] = 'local numerical rating service'
service_string_lookup[ LOCAL_RATING_LIKE ] = 'local like/dislike rating service'
service_string_lookup[ RATING_NUMERICAL_REPOSITORY ] = 'hydrus numerical rating repository'
service_string_lookup[ RATING_LIKE_REPOSITORY ] = 'hydrus like/dislike rating repository'
service_string_lookup[ COMBINED_TAG ] = 'virtual combined tag service'
service_string_lookup[ COMBINED_FILE ] = 'virtual combined file service'
service_string_lookup[ LOCAL_BOORU ] = 'client local booru'
service_string_lookup[ CLIENT_API_SERVICE ] = 'client api'
service_string_lookup[ IPFS ] = 'ipfs daemon'
service_string_lookup[ TEST_SERVICE ] = 'test service'
service_string_lookup[ LOCAL_NOTES ] = 'local file notes service'
service_string_lookup[ SERVER_ADMIN ] = 'hydrus server administration service'
service_string_lookup[ COMBINED_DELETED_FILE ] = 'virtual deleted file service'
service_string_lookup[ NULL_SERVICE ] = 'null service'
service_string_lookup = {
TAG_REPOSITORY : 'hydrus tag repository',
FILE_REPOSITORY : 'hydrus file repository',
LOCAL_FILE_DOMAIN : 'local file domain',
LOCAL_FILE_TRASH_DOMAIN : 'local trash file domain',
COMBINED_LOCAL_FILE : 'virtual combined local file service',
MESSAGE_DEPOT : 'hydrus message depot',
LOCAL_TAG : 'local tag service',
LOCAL_RATING_NUMERICAL : 'local numerical rating service',
LOCAL_RATING_LIKE : 'local like/dislike rating service',
RATING_NUMERICAL_REPOSITORY : 'hydrus numerical rating repository',
RATING_LIKE_REPOSITORY : 'hydrus like/dislike rating repository',
COMBINED_TAG : 'virtual combined tag service',
COMBINED_FILE : 'virtual combined file service',
LOCAL_BOORU : 'client local booru',
CLIENT_API_SERVICE : 'client api',
IPFS : 'ipfs daemon',
TEST_SERVICE : 'test service',
LOCAL_NOTES : 'local file notes service',
SERVER_ADMIN : 'hydrus server administration service',
COMBINED_DELETED_FILE : 'virtual deleted file service',
NULL_SERVICE : 'null service'
}
LOCAL_FILE_SERVICES = ( LOCAL_FILE_DOMAIN, LOCAL_FILE_TRASH_DOMAIN, COMBINED_LOCAL_FILE )
LOCAL_TAG_SERVICES = ( LOCAL_TAG, )
@@ -494,11 +486,11 @@ GET = 0
POST = 1
OPTIONS = 2
query_type_string_lookup = {}
query_type_string_lookup[ GET ] = 'GET'
query_type_string_lookup[ POST ] = 'POST'
query_type_string_lookup[ OPTIONS ] = 'OPTIONS'
query_type_string_lookup = {
GET : 'GET',
POST : 'POST',
OPTIONS : 'OPTIONS'
}
APPLICATION_HYDRUS_CLIENT_COLLECTION = 0
IMAGE_JPEG = 1
@@ -782,53 +774,53 @@ mime_mimetype_string_lookup[ UNDETERMINED_WM ] = '{} or {}'.format( mime_mimetyp
mime_mimetype_string_lookup[ UNDETERMINED_MP4 ] = '{} or {}'.format( mime_mimetype_string_lookup[ AUDIO_MP4 ], mime_mimetype_string_lookup[ VIDEO_MP4 ] )
mime_mimetype_string_lookup[ UNDETERMINED_PNG ] = '{} or {}'.format( mime_mimetype_string_lookup[ IMAGE_PNG ], mime_mimetype_string_lookup[ IMAGE_APNG ] )
mime_ext_lookup = {}
mime_ext_lookup[ APPLICATION_HYDRUS_CLIENT_COLLECTION ] = '.collection'
mime_ext_lookup[ IMAGE_JPEG ] = '.jpg'
mime_ext_lookup[ IMAGE_PNG ] = '.png'
mime_ext_lookup[ IMAGE_APNG ] = '.png'
mime_ext_lookup[ IMAGE_GIF ] = '.gif'
mime_ext_lookup[ IMAGE_BMP ] = '.bmp'
mime_ext_lookup[ IMAGE_WEBP ] = '.webp'
mime_ext_lookup[ IMAGE_TIFF ] = '.tiff'
mime_ext_lookup[ IMAGE_ICON ] = '.ico'
mime_ext_lookup[ APPLICATION_FLASH ] = '.swf'
mime_ext_lookup[ APPLICATION_OCTET_STREAM ] = '.bin'
mime_ext_lookup[ APPLICATION_YAML ] = '.yaml'
mime_ext_lookup[ APPLICATION_JSON ] = '.json'
mime_ext_lookup[ APPLICATION_PDF ] = '.pdf'
mime_ext_lookup[ APPLICATION_PSD ] = '.psd'
mime_ext_lookup[ APPLICATION_CLIP ] = '.clip'
mime_ext_lookup[ APPLICATION_ZIP ] = '.zip'
mime_ext_lookup[ APPLICATION_RAR ] = '.rar'
mime_ext_lookup[ APPLICATION_7Z ] = '.7z'
mime_ext_lookup[ APPLICATION_HYDRUS_ENCRYPTED_ZIP ] = '.zip.encrypted'
mime_ext_lookup[ APPLICATION_HYDRUS_UPDATE_CONTENT ] = ''
mime_ext_lookup[ APPLICATION_HYDRUS_UPDATE_DEFINITIONS ] = ''
mime_ext_lookup[ AUDIO_M4A ] = '.m4a'
mime_ext_lookup[ AUDIO_MP3 ] = '.mp3'
mime_ext_lookup[ AUDIO_MKV ] = '.mkv'
mime_ext_lookup[ AUDIO_MP4 ] = '.mp4'
mime_ext_lookup[ AUDIO_OGG ] = '.ogg'
mime_ext_lookup[ AUDIO_REALMEDIA ] = '.ra'
mime_ext_lookup[ AUDIO_FLAC ] = '.flac'
mime_ext_lookup[ AUDIO_WAVE ] = '.wav'
mime_ext_lookup[ AUDIO_TRUEAUDIO ] = '.tta'
mime_ext_lookup[ AUDIO_WMA ] = '.wma'
mime_ext_lookup[ TEXT_HTML ] = '.html'
mime_ext_lookup[ TEXT_PLAIN ] = '.txt'
mime_ext_lookup[ VIDEO_AVI ] = '.avi'
mime_ext_lookup[ VIDEO_FLV ] = '.flv'
mime_ext_lookup[ VIDEO_MOV ] = '.mov'
mime_ext_lookup[ VIDEO_MP4 ] = '.mp4'
mime_ext_lookup[ VIDEO_MPEG ] = '.mpeg'
mime_ext_lookup[ VIDEO_WMV ] = '.wmv'
mime_ext_lookup[ VIDEO_MKV ] = '.mkv'
mime_ext_lookup[ VIDEO_OGV ] = '.ogv'
mime_ext_lookup[ VIDEO_REALMEDIA ] = '.rm'
mime_ext_lookup[ VIDEO_WEBM ] = '.webm'
mime_ext_lookup[ APPLICATION_UNKNOWN ] = ''
mime_ext_lookup = {
APPLICATION_HYDRUS_CLIENT_COLLECTION : '.collection',
IMAGE_JPEG : '.jpg',
IMAGE_PNG : '.png',
IMAGE_APNG : '.png',
IMAGE_GIF : '.gif',
IMAGE_BMP : '.bmp',
IMAGE_WEBP : '.webp',
IMAGE_TIFF : '.tiff',
IMAGE_ICON : '.ico',
APPLICATION_FLASH : '.swf',
APPLICATION_OCTET_STREAM : '.bin',
APPLICATION_YAML : '.yaml',
APPLICATION_JSON : '.json',
APPLICATION_PDF : '.pdf',
APPLICATION_PSD : '.psd',
APPLICATION_CLIP : '.clip',
APPLICATION_ZIP : '.zip',
APPLICATION_RAR : '.rar',
APPLICATION_7Z : '.7z',
APPLICATION_HYDRUS_ENCRYPTED_ZIP : '.zip.encrypted',
APPLICATION_HYDRUS_UPDATE_CONTENT : '',
APPLICATION_HYDRUS_UPDATE_DEFINITIONS : '',
AUDIO_M4A : '.m4a',
AUDIO_MP3 : '.mp3',
AUDIO_MKV : '.mkv',
AUDIO_MP4 : '.mp4',
AUDIO_OGG : '.ogg',
AUDIO_REALMEDIA : '.ra',
AUDIO_FLAC : '.flac',
AUDIO_WAVE : '.wav',
AUDIO_TRUEAUDIO : '.tta',
AUDIO_WMA : '.wma',
TEXT_HTML : '.html',
TEXT_PLAIN : '.txt',
VIDEO_AVI : '.avi',
VIDEO_FLV : '.flv',
VIDEO_MOV : '.mov',
VIDEO_MP4 : '.mp4',
VIDEO_MPEG : '.mpeg',
VIDEO_WMV : '.wmv',
VIDEO_MKV : '.mkv',
VIDEO_OGV : '.ogv',
VIDEO_REALMEDIA : '.rm',
VIDEO_WEBM : '.webm',
APPLICATION_UNKNOWN : ''
}
ALLOWED_MIME_EXTENSIONS = [ mime_ext_lookup[ mime ] for mime in ALLOWED_MIMES ]
@@ -850,25 +842,25 @@ SITE_TYPE_PIXIV_TAG = 14
SITE_TYPE_DEFAULT = 15
SITE_TYPE_WATCHER = 16
site_type_string_lookup = {}
site_type_string_lookup[ SITE_TYPE_DEFAULT ] = 'default'
site_type_string_lookup[ SITE_TYPE_BOORU ] = 'booru'
site_type_string_lookup[ SITE_TYPE_DEVIANT_ART ] = 'deviant art'
site_type_string_lookup[ SITE_TYPE_GIPHY ] = 'giphy'
site_type_string_lookup[ SITE_TYPE_HENTAI_FOUNDRY ] = 'hentai foundry'
site_type_string_lookup[ SITE_TYPE_HENTAI_FOUNDRY_ARTIST ] = 'hentai foundry artist'
site_type_string_lookup[ SITE_TYPE_HENTAI_FOUNDRY_ARTIST_PICTURES ] = 'hentai foundry artist pictures'
site_type_string_lookup[ SITE_TYPE_HENTAI_FOUNDRY_ARTIST_SCRAPS ] = 'hentai foundry artist scraps'
site_type_string_lookup[ SITE_TYPE_HENTAI_FOUNDRY_TAGS ] = 'hentai foundry tags'
site_type_string_lookup[ SITE_TYPE_NEWGROUNDS ] = 'newgrounds'
site_type_string_lookup[ SITE_TYPE_NEWGROUNDS_GAMES ] = 'newgrounds games'
site_type_string_lookup[ SITE_TYPE_NEWGROUNDS_MOVIES ] = 'newgrounds movies'
site_type_string_lookup[ SITE_TYPE_PIXIV ] = 'pixiv'
site_type_string_lookup[ SITE_TYPE_PIXIV_ARTIST_ID ] = 'pixiv artist id'
site_type_string_lookup[ SITE_TYPE_PIXIV_TAG ] = 'pixiv tag'
site_type_string_lookup[ SITE_TYPE_TUMBLR ] = 'tumblr'
site_type_string_lookup[ SITE_TYPE_WATCHER ] = 'watcher'
site_type_string_lookup = {
SITE_TYPE_DEFAULT : 'default',
SITE_TYPE_BOORU : 'booru',
SITE_TYPE_DEVIANT_ART : 'deviant art',
SITE_TYPE_GIPHY : 'giphy',
SITE_TYPE_HENTAI_FOUNDRY : 'hentai foundry',
SITE_TYPE_HENTAI_FOUNDRY_ARTIST : 'hentai foundry artist',
SITE_TYPE_HENTAI_FOUNDRY_ARTIST_PICTURES : 'hentai foundry artist pictures',
SITE_TYPE_HENTAI_FOUNDRY_ARTIST_SCRAPS : 'hentai foundry artist scraps',
SITE_TYPE_HENTAI_FOUNDRY_TAGS : 'hentai foundry tags',
SITE_TYPE_NEWGROUNDS : 'newgrounds',
SITE_TYPE_NEWGROUNDS_GAMES : 'newgrounds games',
SITE_TYPE_NEWGROUNDS_MOVIES : 'newgrounds movies',
SITE_TYPE_PIXIV : 'pixiv',
SITE_TYPE_PIXIV_ARTIST_ID : 'pixiv artist id',
SITE_TYPE_PIXIV_TAG : 'pixiv tag',
SITE_TYPE_TUMBLR : 'tumblr',
SITE_TYPE_WATCHER : 'watcher'
}
TIMESTAMP_TYPE_SOURCE = 0
@@ -887,18 +879,17 @@ URL_TYPE_DESIRED = 7
URL_TYPE_SOURCE = 8
URL_TYPE_SUB_GALLERY = 9
url_type_string_lookup = {}
url_type_string_lookup[ URL_TYPE_POST ] = 'post url'
url_type_string_lookup[ URL_TYPE_API ] = 'api url'
url_type_string_lookup[ URL_TYPE_FILE ] = 'file url'
url_type_string_lookup[ URL_TYPE_GALLERY ] = 'gallery url'
url_type_string_lookup[ URL_TYPE_WATCHABLE ] = 'watchable url'
url_type_string_lookup[ URL_TYPE_UNKNOWN ] = 'unknown url'
url_type_string_lookup[ URL_TYPE_NEXT ] = 'next page url'
url_type_string_lookup[ URL_TYPE_DESIRED ] = 'downloadable/pursuable url'
url_type_string_lookup[ URL_TYPE_SUB_GALLERY ] = 'sub-gallery url (is queued even if creator found no post/file urls)'
url_type_string_lookup = {
URL_TYPE_POST : 'post url',
URL_TYPE_API : 'api url',
URL_TYPE_FILE : 'file url',
URL_TYPE_GALLERY : 'gallery url',
URL_TYPE_WATCHABLE : 'watchable url',
URL_TYPE_UNKNOWN : 'unknown url',
URL_TYPE_NEXT : 'next page url',
URL_TYPE_DESIRED : 'downloadable/pursuable url',
URL_TYPE_SUB_GALLERY : 'sub-gallery url (is queued even if creator found no post/file urls)'
}
# default options
@@ -907,15 +898,24 @@ DEFAULT_SERVICE_PORT = 45871
SERVER_ADMIN_KEY = b'server admin'
def construct_python_tuple( self, node ): return tuple( self.construct_sequence( node ) )
def represent_python_tuple( self, data ): return self.represent_sequence( 'tag:yaml.org,2002:python/tuple', data )
def construct_python_tuple( self, node ):
return tuple( self.construct_sequence( node ) )
def represent_python_tuple( self, data ):
return self.represent_sequence( 'tag:yaml.org,2002:python/tuple', data )
yaml.SafeLoader.add_constructor( 'tag:yaml.org,2002:python/tuple', construct_python_tuple )
yaml.SafeDumper.add_representer( tuple, represent_python_tuple )
# for some reason, sqlite doesn't parse to int before this, despite the column affinity
# it gives the register_converter function a bytestring :/
def integer_boolean_to_bool( integer_boolean ): return bool( int( integer_boolean ) )
def integer_boolean_to_bool( integer_boolean ):
return bool( int( integer_boolean ) )
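As the comment notes, sqlite3 hands converter functions a raw bytestring regardless of column affinity, so the parse goes through `int()` before `bool()`. A self-contained sketch of registering such a converter; the `INTEGER_BOOLEAN` column type name here is illustrative, not necessarily the one hydrus declares:

```python
import sqlite3

# sqlite3 passes converters a bytestring like b'1', not an int,
# so parse to int before casting to bool
def integer_boolean_to_bool( integer_boolean ):
    return bool( int( integer_boolean ) )

sqlite3.register_converter( 'INTEGER_BOOLEAN', integer_boolean_to_bool )

# PARSE_DECLTYPES makes sqlite3 consult the declared column type on read
db = sqlite3.connect( ':memory:', detect_types = sqlite3.PARSE_DECLTYPES )
db.execute( 'CREATE TABLE files ( archived INTEGER_BOOLEAN );' )
db.executemany( 'INSERT INTO files VALUES ( ? );', [ ( 1, ), ( 0, ) ] )

rows = [ value for ( value, ) in db.execute( 'SELECT archived FROM files;' ) ]
# rows == [ True, False ], real bools rather than raw integers
```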
# sqlite mod


@@ -1,3 +1,4 @@
import collections
import os
import psutil
import re
@@ -14,6 +15,33 @@ from hydrus.core import HydrusData
from hydrus.core import HydrusGlobals as HG
from hydrus.core import HydrusThreading
mimes_to_default_thumbnail_paths = collections.defaultdict( lambda: os.path.join( HC.STATIC_DIR, 'hydrus.png' ) )
mimes_to_default_thumbnail_paths[ HC.APPLICATION_PDF ] = os.path.join( HC.STATIC_DIR, 'pdf.png' )
mimes_to_default_thumbnail_paths[ HC.APPLICATION_PSD ] = os.path.join( HC.STATIC_DIR, 'psd.png' )
mimes_to_default_thumbnail_paths[ HC.APPLICATION_CLIP ] = os.path.join( HC.STATIC_DIR, 'clip.png' )
for mime in HC.AUDIO:
path = os.path.join( HC.STATIC_DIR, 'audio.png' )
mimes_to_default_thumbnail_paths[ mime ] = os.path.join( path )
for mime in HC.VIDEO:
path = os.path.join( HC.STATIC_DIR, 'video.png' )
mimes_to_default_thumbnail_paths[ mime ] = os.path.join( path )
for mime in HC.ARCHIVES:
path = os.path.join( HC.STATIC_DIR, 'zip.png' )
mimes_to_default_thumbnail_paths[ mime ] = os.path.join( path )
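The defaultdict above means any mime without an explicit entry silently falls back to the generic hydrus thumbnail. A standalone sketch of the pattern; the directory and mime keys are placeholders:

```python
import collections
import os

STATIC_DIR = 'static'  # placeholder for HC.STATIC_DIR

# any mime not explicitly registered falls back to the generic thumb
mimes_to_default_thumbnail_paths = collections.defaultdict(
    lambda: os.path.join( STATIC_DIR, 'hydrus.png' )
)

mimes_to_default_thumbnail_paths[ 'application/pdf' ] = os.path.join( STATIC_DIR, 'pdf.png' )

pdf_thumb = mimes_to_default_thumbnail_paths[ 'application/pdf' ]
fallback_thumb = mimes_to_default_thumbnail_paths[ 'mime/never-registered' ]
# pdf_thumb ends with 'pdf.png'; fallback_thumb ends with 'hydrus.png'
```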
def AppendPathUntilNoConflicts( path ):
( path_absent_ext, ext ) = os.path.splitext( path )


@@ -1,3 +1,4 @@
import collections
import json
import os
import traceback
@@ -418,14 +419,14 @@ class ParsedRequestArguments( dict ):
raise HydrusExceptions.BadRequestException( 'It looks like the parameter "{}" was missing!'.format( key ) )
def GetValue( self, key, expected_type, expected_list_type = None, default_value = None ):
def GetValue( self, key, expected_type, expected_list_type = None, expected_dict_types = None, default_value = None ):
# not None because in JSON sometimes people put 'null' to mean 'did not enter this optional parameter'
if key in self and self[ key ] is not None:
value = self[ key ]
error_text_lookup = {}
error_text_lookup = collections.defaultdict( lambda: 'unknown!' )
error_text_lookup[ int ] = 'integer'
error_text_lookup[ str ] = 'string'
@@ -454,16 +455,25 @@ class ParsedRequestArguments( dict ):
if not isinstance( item, expected_list_type ):
if expected_list_type in error_text_lookup:
type_error_text = error_text_lookup[ expected_list_type ]
else:
type_error_text = 'unknown!'
raise HydrusExceptions.BadRequestException( 'The list parameter "{}" held an item, "{}" that was {} and not the expected type: {}!'.format( key, item, type( item ), error_text_lookup[ expected_list_type ] ) )
raise HydrusExceptions.BadRequestException( 'The list parameter "{}" held an item that was not the expected type: {}!'.format( key, type_error_text ) )
if expected_type is dict and expected_dict_types is not None:
( expected_key_type, expected_value_type ) = expected_dict_types
for ( dict_key, dict_value ) in value.items():
if not isinstance( dict_key, expected_key_type ):
raise HydrusExceptions.BadRequestException( 'The Object parameter "{}" held a key, "{}" that was {} and not the expected type: {}!'.format( key, dict_key, type( dict_key ), error_text_lookup[ expected_key_type ] ) )
if not isinstance( dict_value, expected_value_type ):
raise HydrusExceptions.BadRequestException( 'The Object parameter "{}" held a value, "{}" that was {} and not the expected type: {}!'.format( key, dict_value, type( dict_value ), error_text_lookup[ expected_value_type ] ) )
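The new `expected_dict_types` branch validates every key/value pair of a JSON-decoded Object. A simplified, hypothetical version of that loop, with `ValueError` standing in for `HydrusExceptions.BadRequestException`:

```python
def check_dict_types( value, expected_dict_types ):
    # mirrors the hunk above: check every key and value of a decoded dict
    ( expected_key_type, expected_value_type ) = expected_dict_types

    for ( dict_key, dict_value ) in value.items():
        if not isinstance( dict_key, expected_key_type ):
            raise ValueError( 'key "{}" was {}, not the expected type: {}!'.format( dict_key, type( dict_key ), expected_key_type ) )
        if not isinstance( dict_value, expected_value_type ):
            raise ValueError( 'value "{}" was {}, not the expected type: {}!'.format( dict_value, type( dict_value ), expected_value_type ) )

check_dict_types( { 'deadbeef' : 1.0 }, ( str, float ) )  # passes silently

try:
    check_dict_types( { 'deadbeef' : 'nope' }, ( str, float ) )
    caught = False
except ValueError:
    caught = True
```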


@@ -749,7 +749,7 @@ class TestClientAPI( unittest.TestCase ):
[ ( ( service_keys_to_content_updates, ), kwargs ) ] = HG.test_controller.GetWrite( 'content_updates' )
expected_service_keys_to_content_updates = { CC.LOCAL_FILE_SERVICE_KEY : [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_FILES, HC.CONTENT_UPDATE_DELETE, { hash } ) ] }
expected_service_keys_to_content_updates = { CC.LOCAL_FILE_SERVICE_KEY : [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_FILES, HC.CONTENT_UPDATE_DELETE, { hash }, reason = 'Deleted via Client API.' ) ] }
self._compare_content_updates( service_keys_to_content_updates, expected_service_keys_to_content_updates )
@@ -773,10 +773,58 @@ class TestClientAPI( unittest.TestCase ):
[ ( ( service_keys_to_content_updates, ), kwargs ) ] = HG.test_controller.GetWrite( 'content_updates' )
expected_service_keys_to_content_updates = { CC.LOCAL_FILE_SERVICE_KEY : [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_FILES, HC.CONTENT_UPDATE_DELETE, hashes ) ] }
expected_service_keys_to_content_updates = { CC.LOCAL_FILE_SERVICE_KEY : [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_FILES, HC.CONTENT_UPDATE_DELETE, hashes, reason = 'Deleted via Client API.' ) ] }
self._compare_content_updates( service_keys_to_content_updates, expected_service_keys_to_content_updates )
# now with a reason
HG.test_controller.ClearWrites( 'content_updates' )
path = '/add_files/delete_files'
reason = 'yo'
body_dict = { 'hashes' : [ h.hex() for h in hashes ], 'reason' : reason }
body = json.dumps( body_dict )
connection.request( 'POST', path, body = body, headers = headers )
response = connection.getresponse()
data = response.read()
self.assertEqual( response.status, 200 )
[ ( ( service_keys_to_content_updates, ), kwargs ) ] = HG.test_controller.GetWrite( 'content_updates' )
expected_service_keys_to_content_updates = { CC.LOCAL_FILE_SERVICE_KEY : [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_FILES, HC.CONTENT_UPDATE_DELETE, hashes, reason = reason ) ] }
self._compare_content_updates( service_keys_to_content_updates, expected_service_keys_to_content_updates )
# now test it not working
HG.test_controller.ClearWrites( 'content_updates' )
path = '/add_files/delete_files'
body_dict = { 'hashes' : [ h.hex() for h in hashes ], 'file_service_name' : 'not existing service' }
body = json.dumps( body_dict )
connection.request( 'POST', path, body = body, headers = headers )
response = connection.getresponse()
data = response.read()
self.assertEqual( response.status, 400 )
text = str( data, 'utf-8' )
self.assertIn( 'not existing service', text ) # error message should be complaining about it
#
HG.test_controller.ClearWrites( 'content_updates' )
@@ -829,6 +877,28 @@ class TestClientAPI( unittest.TestCase ):
HG.test_controller.ClearWrites( 'content_updates' )
path = '/add_files/undelete_files'
body_dict = { 'hashes' : [ h.hex() for h in hashes ], 'file_service_name' : 'not existing service' }
body = json.dumps( body_dict )
connection.request( 'POST', path, body = body, headers = headers )
response = connection.getresponse()
data = response.read()
self.assertEqual( response.status, 400 )
text = str( data, 'utf-8' )
self.assertIn( 'not existing service', text ) # error message should be complaining about it
#
HG.test_controller.ClearWrites( 'content_updates' )
path = '/add_files/archive_files'
body_dict = { 'hash' : hash.hex() }
@@ -1646,7 +1716,7 @@ class TestClientAPI( unittest.TestCase ):
expected_service_keys_to_content_updates = collections.defaultdict( list )
expected_service_keys_to_content_updates[ CC.COMBINED_LOCAL_FILE_SERVICE_KEY ] = [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_URLS, HC.CONTENT_UPDATE_ADD, ( [ url ], [ hash ] ) ) ]
expected_service_keys_to_content_updates[ CC.COMBINED_LOCAL_FILE_SERVICE_KEY ] = [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_URLS, HC.CONTENT_UPDATE_ADD, ( [ url ], { hash } ) ) ]
expected_result = [ ( ( expected_service_keys_to_content_updates, ), {} ) ]
@@ -1675,7 +1745,7 @@ class TestClientAPI( unittest.TestCase ):
expected_service_keys_to_content_updates = collections.defaultdict( list )
expected_service_keys_to_content_updates[ CC.COMBINED_LOCAL_FILE_SERVICE_KEY ] = [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_URLS, HC.CONTENT_UPDATE_ADD, ( [ url ], [ hash ] ) ) ]
expected_service_keys_to_content_updates[ CC.COMBINED_LOCAL_FILE_SERVICE_KEY ] = [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_URLS, HC.CONTENT_UPDATE_ADD, ( [ url ], { hash } ) ) ]
expected_result = [ ( ( expected_service_keys_to_content_updates, ), {} ) ]
@@ -1704,7 +1774,7 @@ class TestClientAPI( unittest.TestCase ):
expected_service_keys_to_content_updates = collections.defaultdict( list )
expected_service_keys_to_content_updates[ CC.COMBINED_LOCAL_FILE_SERVICE_KEY ] = [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_URLS, HC.CONTENT_UPDATE_DELETE, ( [ url ], [ hash ] ) ) ]
expected_service_keys_to_content_updates[ CC.COMBINED_LOCAL_FILE_SERVICE_KEY ] = [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_URLS, HC.CONTENT_UPDATE_DELETE, ( [ url ], { hash } ) ) ]
expected_result = [ ( ( expected_service_keys_to_content_updates, ), {} ) ]
@@ -1733,7 +1803,7 @@ class TestClientAPI( unittest.TestCase ):
expected_service_keys_to_content_updates = collections.defaultdict( list )
expected_service_keys_to_content_updates[ CC.COMBINED_LOCAL_FILE_SERVICE_KEY ] = [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_URLS, HC.CONTENT_UPDATE_DELETE, ( [ url ], [ hash ] ) ) ]
expected_service_keys_to_content_updates[ CC.COMBINED_LOCAL_FILE_SERVICE_KEY ] = [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_URLS, HC.CONTENT_UPDATE_DELETE, ( [ url ], { hash } ) ) ]
expected_result = [ ( ( expected_service_keys_to_content_updates, ), {} ) ]
@@ -3238,7 +3308,7 @@ class TestClientAPI( unittest.TestCase ):
self.assertEqual( response.status, 404 )
#
# this no longer 404s, it should give the hydrus thumb
path = '/get_files/thumbnail?hash={}'.format( hash_404.hex() )
@@ -3248,7 +3318,14 @@ class TestClientAPI( unittest.TestCase ):
data = response.read()
self.assertEqual( response.status, 404 )
with open( os.path.join( HC.STATIC_DIR, 'hydrus.png' ), 'rb' ) as f:
expected_data = f.read()
self.assertEqual( response.status, 200 )
self.assertEqual( data, expected_data )
#


@@ -12,16 +12,15 @@ from hydrus.core import HydrusGlobals as HG
from hydrus.core import HydrusTags
from hydrus.client import ClientConstants as CC
from hydrus.client import ClientLocation
from hydrus.client.importing import ClientImportFileSeeds
from hydrus.client.importing.options import ClientImportOptions
from hydrus.client.importing.options import FileImportOptions
from hydrus.client.importing.options import NoteImportOptions
from hydrus.client.importing.options import PresentationImportOptions
from hydrus.client.importing.options import TagImportOptions
from hydrus.client.media import ClientMedia
from hydrus.client.media import ClientMediaManagers
from hydrus.client.media import ClientMediaResult
from hydrus.client.metadata import ClientTags
class TestCheckerOptions( unittest.TestCase ):
@@ -581,7 +580,7 @@ class TestPresentationImportOptions( unittest.TestCase ):
[ ( args, kwargs ) ] = HG.test_controller.GetRead( 'filter_hashes' )
self.assertEqual( args, ( CC.LOCAL_FILE_SERVICE_KEY, pre_filter_expected_result ) )
self.assertEqual( args, ( ClientLocation.LocationContext( current_service_keys = ( CC.LOCAL_FILE_SERVICE_KEY, ) ), pre_filter_expected_result ) )
self.assertEqual( result, expected_result )
@@ -619,7 +618,7 @@ class TestPresentationImportOptions( unittest.TestCase ):
[ ( args, kwargs ) ] = HG.test_controller.GetRead( 'filter_hashes' )
self.assertEqual( args, ( CC.COMBINED_LOCAL_FILE_SERVICE_KEY, pre_filter_expected_result ) )
self.assertEqual( args, ( ClientLocation.LocationContext( current_service_keys = ( CC.COMBINED_LOCAL_FILE_SERVICE_KEY, ) ), pre_filter_expected_result ) )
self.assertEqual( result, expected_result )
@@ -672,7 +671,7 @@ class TestPresentationImportOptions( unittest.TestCase ):
[ ( args, kwargs ) ] = HG.test_controller.GetRead( 'filter_hashes' )
self.assertEqual( args, ( CC.LOCAL_FILE_SERVICE_KEY, pre_filter_expected_result ) )
self.assertEqual( args, ( ClientLocation.LocationContext( current_service_keys = ( CC.LOCAL_FILE_SERVICE_KEY, ) ), pre_filter_expected_result ) )
self.assertEqual( result, expected_result )
@@ -723,7 +722,7 @@ class TestPresentationImportOptions( unittest.TestCase ):
[ ( args, kwargs ) ] = HG.test_controller.GetRead( 'filter_hashes' )
self.assertEqual( args, ( CC.LOCAL_FILE_SERVICE_KEY, pre_filter_expected_result ) )
self.assertEqual( args, ( ClientLocation.LocationContext( current_service_keys = ( CC.LOCAL_FILE_SERVICE_KEY, ) ), pre_filter_expected_result ) )
self.assertEqual( result, expected_result )
@@ -756,7 +755,7 @@ class TestPresentationImportOptions( unittest.TestCase ):
[ ( args, kwargs ) ] = HG.test_controller.GetRead( 'filter_hashes' )
self.assertEqual( args, ( CC.LOCAL_FILE_SERVICE_KEY, pre_filter_expected_result ) )
self.assertEqual( args, ( ClientLocation.LocationContext( current_service_keys = ( CC.LOCAL_FILE_SERVICE_KEY, ) ), pre_filter_expected_result ) )
self.assertEqual( result, expected_result )
@@ -802,7 +801,7 @@ class TestPresentationImportOptions( unittest.TestCase ):
[ ( args, kwargs ) ] = HG.test_controller.GetRead( 'filter_hashes' )
self.assertEqual( args, ( CC.LOCAL_FILE_SERVICE_KEY, pre_filter_expected_result ) )
self.assertEqual( args, ( ClientLocation.LocationContext( current_service_keys = ( CC.LOCAL_FILE_SERVICE_KEY, ) ), pre_filter_expected_result ) )
self.assertEqual( result, expected_result )
@@ -850,7 +849,7 @@ class TestPresentationImportOptions( unittest.TestCase ):
[ ( args, kwargs ) ] = HG.test_controller.GetRead( 'filter_hashes' )
self.assertEqual( args, ( CC.LOCAL_FILE_SERVICE_KEY, pre_filter_expected_result ) )
self.assertEqual( args, ( ClientLocation.LocationContext( current_service_keys = ( CC.LOCAL_FILE_SERVICE_KEY, ) ), pre_filter_expected_result ) )
self.assertEqual( result, expected_result )


@@ -637,8 +637,8 @@ class TestTagObjects( unittest.TestCase ):
tag_autocomplete_options.SetTuple(
tag_autocomplete_options.GetWriteAutocompleteTagDomain(),
tag_autocomplete_options.OverridesWriteAutocompleteFileDomain(),
tag_autocomplete_options.GetWriteAutocompleteFileDomain(),
tag_autocomplete_options.OverridesWriteAutocompleteLocationContext(),
tag_autocomplete_options.GetWriteAutocompleteLocationContext(),
search_namespaces_into_full_tags,
namespace_bare_fetch_all_allowed,
namespace_fetch_all_allowed,
@@ -697,8 +697,8 @@ class TestTagObjects( unittest.TestCase ):
tag_autocomplete_options.SetTuple(
tag_autocomplete_options.GetWriteAutocompleteTagDomain(),
tag_autocomplete_options.OverridesWriteAutocompleteFileDomain(),
tag_autocomplete_options.GetWriteAutocompleteFileDomain(),
tag_autocomplete_options.OverridesWriteAutocompleteLocationContext(),
tag_autocomplete_options.GetWriteAutocompleteLocationContext(),
search_namespaces_into_full_tags,
namespace_bare_fetch_all_allowed,
namespace_fetch_all_allowed,
@@ -757,8 +757,8 @@ class TestTagObjects( unittest.TestCase ):
tag_autocomplete_options.SetTuple(
tag_autocomplete_options.GetWriteAutocompleteTagDomain(),
tag_autocomplete_options.OverridesWriteAutocompleteFileDomain(),
tag_autocomplete_options.GetWriteAutocompleteFileDomain(),
tag_autocomplete_options.OverridesWriteAutocompleteLocationContext(),
tag_autocomplete_options.GetWriteAutocompleteLocationContext(),
search_namespaces_into_full_tags,
namespace_bare_fetch_all_allowed,
namespace_fetch_all_allowed,
@@ -817,8 +817,8 @@ class TestTagObjects( unittest.TestCase ):
tag_autocomplete_options.SetTuple(
tag_autocomplete_options.GetWriteAutocompleteTagDomain(),
tag_autocomplete_options.OverridesWriteAutocompleteFileDomain(),
tag_autocomplete_options.GetWriteAutocompleteFileDomain(),
tag_autocomplete_options.OverridesWriteAutocompleteLocationContext(),
tag_autocomplete_options.GetWriteAutocompleteLocationContext(),
search_namespaces_into_full_tags,
namespace_bare_fetch_all_allowed,
namespace_fetch_all_allowed,
@@ -871,8 +871,8 @@ class TestTagObjects( unittest.TestCase ):
tag_autocomplete_options.SetTuple(
tag_autocomplete_options.GetWriteAutocompleteTagDomain(),
tag_autocomplete_options.OverridesWriteAutocompleteFileDomain(),
tag_autocomplete_options.GetWriteAutocompleteFileDomain(),
tag_autocomplete_options.OverridesWriteAutocompleteLocationContext(),
tag_autocomplete_options.GetWriteAutocompleteLocationContext(),
search_namespaces_into_full_tags,
namespace_bare_fetch_all_allowed,
namespace_fetch_all_allowed,
@@ -916,8 +916,8 @@ class TestTagObjects( unittest.TestCase ):
tag_autocomplete_options.SetTuple(
tag_autocomplete_options.GetWriteAutocompleteTagDomain(),
tag_autocomplete_options.OverridesWriteAutocompleteFileDomain(),
tag_autocomplete_options.GetWriteAutocompleteFileDomain(),
tag_autocomplete_options.OverridesWriteAutocompleteLocationContext(),
tag_autocomplete_options.GetWriteAutocompleteLocationContext(),
search_namespaces_into_full_tags,
namespace_bare_fetch_all_allowed,
namespace_fetch_all_allowed,
@@ -965,8 +965,8 @@ class TestTagObjects( unittest.TestCase ):
tag_autocomplete_options.SetTuple(
tag_autocomplete_options.GetWriteAutocompleteTagDomain(),
tag_autocomplete_options.OverridesWriteAutocompleteFileDomain(),
tag_autocomplete_options.GetWriteAutocompleteFileDomain(),
tag_autocomplete_options.OverridesWriteAutocompleteLocationContext(),
tag_autocomplete_options.GetWriteAutocompleteLocationContext(),
search_namespaces_into_full_tags,
namespace_bare_fetch_all_allowed,
namespace_fetch_all_allowed,
@@ -1028,8 +1028,8 @@ class TestTagObjects( unittest.TestCase ):
tag_autocomplete_options.SetTuple(
tag_autocomplete_options.GetWriteAutocompleteTagDomain(),
tag_autocomplete_options.OverridesWriteAutocompleteFileDomain(),
tag_autocomplete_options.GetWriteAutocompleteFileDomain(),
tag_autocomplete_options.OverridesWriteAutocompleteLocationContext(),
tag_autocomplete_options.GetWriteAutocompleteLocationContext(),
search_namespaces_into_full_tags,
namespace_bare_fetch_all_allowed,
namespace_fetch_all_allowed,
@@ -1083,8 +1083,8 @@ class TestTagObjects( unittest.TestCase ):
tag_autocomplete_options.SetTuple(
tag_autocomplete_options.GetWriteAutocompleteTagDomain(),
tag_autocomplete_options.OverridesWriteAutocompleteFileDomain(),
tag_autocomplete_options.GetWriteAutocompleteFileDomain(),
tag_autocomplete_options.OverridesWriteAutocompleteLocationContext(),
tag_autocomplete_options.GetWriteAutocompleteLocationContext(),
search_namespaces_into_full_tags,
namespace_bare_fetch_all_allowed,
namespace_fetch_all_allowed,
@@ -1147,8 +1147,8 @@ class TestTagObjects( unittest.TestCase ):
tag_autocomplete_options.SetTuple(
tag_autocomplete_options.GetWriteAutocompleteTagDomain(),
tag_autocomplete_options.OverridesWriteAutocompleteFileDomain(),
tag_autocomplete_options.GetWriteAutocompleteFileDomain(),
tag_autocomplete_options.OverridesWriteAutocompleteLocationContext(),
tag_autocomplete_options.GetWriteAutocompleteLocationContext(),
search_namespaces_into_full_tags,
namespace_bare_fetch_all_allowed,
namespace_fetch_all_allowed,
@@ -1201,8 +1201,8 @@ class TestTagObjects( unittest.TestCase ):
tag_autocomplete_options.SetTuple(
tag_autocomplete_options.GetWriteAutocompleteTagDomain(),
tag_autocomplete_options.OverridesWriteAutocompleteFileDomain(),
tag_autocomplete_options.GetWriteAutocompleteFileDomain(),
tag_autocomplete_options.OverridesWriteAutocompleteLocationContext(),
tag_autocomplete_options.GetWriteAutocompleteLocationContext(),
search_namespaces_into_full_tags,
namespace_bare_fetch_all_allowed,
namespace_fetch_all_allowed,
@@ -1254,8 +1254,8 @@ class TestTagObjects( unittest.TestCase ):
tag_autocomplete_options.SetTuple(
tag_autocomplete_options.GetWriteAutocompleteTagDomain(),
tag_autocomplete_options.OverridesWriteAutocompleteFileDomain(),
tag_autocomplete_options.GetWriteAutocompleteFileDomain(),
tag_autocomplete_options.OverridesWriteAutocompleteLocationContext(),
tag_autocomplete_options.GetWriteAutocompleteLocationContext(),
search_namespaces_into_full_tags,
namespace_bare_fetch_all_allowed,
namespace_fetch_all_allowed,
@@ -1308,8 +1308,8 @@ class TestTagObjects( unittest.TestCase ):
tag_autocomplete_options.SetTuple(
tag_autocomplete_options.GetWriteAutocompleteTagDomain(),
tag_autocomplete_options.OverridesWriteAutocompleteFileDomain(),
tag_autocomplete_options.GetWriteAutocompleteFileDomain(),
tag_autocomplete_options.OverridesWriteAutocompleteLocationContext(),
tag_autocomplete_options.GetWriteAutocompleteLocationContext(),
search_namespaces_into_full_tags,
namespace_bare_fetch_all_allowed,
namespace_fetch_all_allowed,
@@ -1361,8 +1361,8 @@ class TestTagObjects( unittest.TestCase ):
tag_autocomplete_options.SetTuple(
tag_autocomplete_options.GetWriteAutocompleteTagDomain(),
tag_autocomplete_options.OverridesWriteAutocompleteFileDomain(),
tag_autocomplete_options.GetWriteAutocompleteFileDomain(),
tag_autocomplete_options.OverridesWriteAutocompleteLocationContext(),
tag_autocomplete_options.GetWriteAutocompleteLocationContext(),
search_namespaces_into_full_tags,
namespace_bare_fetch_all_allowed,
namespace_fetch_all_allowed,
@@ -1415,8 +1415,8 @@ class TestTagObjects( unittest.TestCase ):
tag_autocomplete_options.SetTuple(
tag_autocomplete_options.GetWriteAutocompleteTagDomain(),
tag_autocomplete_options.OverridesWriteAutocompleteFileDomain(),
tag_autocomplete_options.GetWriteAutocompleteFileDomain(),
tag_autocomplete_options.OverridesWriteAutocompleteLocationContext(),
tag_autocomplete_options.GetWriteAutocompleteLocationContext(),
search_namespaces_into_full_tags,
namespace_bare_fetch_all_allowed,
namespace_fetch_all_allowed,
@@ -1468,8 +1468,8 @@ class TestTagObjects( unittest.TestCase ):
tag_autocomplete_options.SetTuple(
tag_autocomplete_options.GetWriteAutocompleteTagDomain(),
tag_autocomplete_options.OverridesWriteAutocompleteFileDomain(),
tag_autocomplete_options.GetWriteAutocompleteFileDomain(),
tag_autocomplete_options.OverridesWriteAutocompleteLocationContext(),
tag_autocomplete_options.GetWriteAutocompleteLocationContext(),
search_namespaces_into_full_tags,
namespace_bare_fetch_all_allowed,
namespace_fetch_all_allowed,
@@ -1522,8 +1522,8 @@ class TestTagObjects( unittest.TestCase ):
tag_autocomplete_options.SetTuple(
tag_autocomplete_options.GetWriteAutocompleteTagDomain(),
tag_autocomplete_options.OverridesWriteAutocompleteFileDomain(),
tag_autocomplete_options.GetWriteAutocompleteFileDomain(),
tag_autocomplete_options.OverridesWriteAutocompleteLocationContext(),
tag_autocomplete_options.GetWriteAutocompleteLocationContext(),
search_namespaces_into_full_tags,
namespace_bare_fetch_all_allowed,
namespace_fetch_all_allowed,