Version 341
This commit is contained in:
parent
0f516b23aa
commit
4b6fc54a4c
|
@ -8,6 +8,48 @@
|
|||
<div class="content">
|
||||
<h3>changelog</h3>
|
||||
<ul>
|
||||
<li><h3>version 341</h3></li>
|
||||
<ul>
|
||||
<li>client api:</li>
|
||||
<li>added /add_tags/add_tags, which does several kinds of tag content updates</li>
|
||||
<li>added /add_tags/clean_tags, which shows how hydrus will handle potential tags</li>
|
||||
<li>added /add_urls/associate_url, which allows you to associate urls with files</li>
|
||||
<li>added 'destination_page_name' to /add_urls/add_url, which will choose which destination watcher/url importer to place the url (or create a new one with that name)</li>
|
||||
<li>updated client api version to 2</li>
|
||||
<li>updated client help and unit tests for the above</li>
|
||||
<li>added a linked contents section to the client api help</li>
|
||||
<li>improved some server error handling, mostly moving 403s to more correct 400s</li>
|
||||
<li>improved how missing-parameter 400 errors are reported from the server, as distinct from deeper keyerrors that should be 500s</li>
|
||||
<li>.</li>
|
||||
<li>the rest:</li>
|
||||
<li>tag repository update processing now saves progress to disk every million rows or every minute, whichever comes first. this reduces journaling bloat, improves recovery when the process quits unexpectedly, and makes for significantly faster cancel when requested by the user</li>
|
||||
<li>when processing duplicates and copying/merging/moving ratings, the 'source' file will now also overwrite the 'destination' file's rating if that destination rating is lower (previously, the rating would only go over if the dest had no rating set)</li>
|
||||
<li>added a new 'thumbnail experiment mode' under help->debug->gui. this will load fullsize thumbs and resize them in memory, please see release post for more details</li>
|
||||
<li>reduced menubar replacement flicker while, I believe, keeping and strengthening recent menubar indexing stability improvements</li>
|
||||
<li>the tag autocomplete dropdown will now always embed (instead of floating) in non-Windows</li>
|
||||
<li>when data seems non-decodable, the fallback encoding format is now that given by chardet, rather than utf-8</li>
|
||||
<li>improved serialisability of some pending tag data</li>
|
||||
<li>watchers can now hold and pass on fixed pending tag data</li>
|
||||
<li>gallery log objects can now hold and pass on fixed pending tag data</li>
|
||||
<li>file import objects can now hold and action fixed pending tag data</li>
|
||||
<li>hard drive imports now store their paths-to-tags info in this new format, directly in the file import objects</li>
|
||||
<li>improved some url-import page drop-target-selection logic</li>
|
||||
<li>improved error reporting when dropping/api-adding urls</li>
|
||||
<li>adjusted some url import workflow so big 'already in db' download lists should work a bit faster</li>
|
||||
<li>attempting to start the program with some external database files but not the main 'client.db/server.db' file will now cause a boot-fail exception with an explanation before any stub db files can be made</li>
|
||||
<li>tightened up some hydrus service login-capability-testing code that was previously stopping certain error states from recovering promptly, even on a force account refresh, while the service was maxed on bandwidth</li>
|
||||
<li>fixed a source of linux CRITICAL logspam related to several common dialogs</li>
|
||||
<li>improved ui stability on boot when file folders are missing (particularly for linux)</li>
|
||||
<li>improved stability for the various async tasks on the duplicates processing page, particularly for linux. I am not sure I got everything here, but it is definitely better</li>
|
||||
<li>did some more misc stability improvements, particularly in various boot fail scenarios</li>
|
||||
<li>completely removed an ancient and janky focus catcher widget from the main gui frame</li>
|
||||
<li>now that various db caching is improved on the python side, removed a sqlite instruction forcing temp information to always stay in memory--hot data now starts mostly in memory and spools to disk if the transaction gets too large</li>
|
||||
<li>fixed approx bitrate sorting for malformed video files with explicitly '0' duration</li>
|
||||
<li>daemon_profile_mode now spams some more info about export folders</li>
|
||||
<li>fixed an issue that meant client db maintenance was firing its jobs too aggressively, regardless of idle status</li>
|
||||
<li>updated windows build to cv 4.0</li>
|
||||
<li>misc refactoring and fixes</li>
|
||||
</ul>
|
||||
<li><h3>version 340</h3></li>
|
||||
<ul>
|
||||
<li>client api:</li>
|
||||
|
@ -33,7 +75,7 @@
|
|||
<li>added gelbooru 0.1.11 parser for future application</li>
|
||||
<li>fixed an issue that was stopping advanced content updates from fully copying all the desired mappings in the transaction</li>
|
||||
<li>added a semi-hacky checkbox to 'options->files and trash' that will delay all new file/thumb requests for 15s after the computer resumes from sleep (useful if your files are on a NAS that takes a few seconds to reconnect on wake)</li>
|
||||
<li>wrote some more graceful fallback decoding handling code that attempts original assumed encoding and 'utf-8' if different and returns the one with the fewest ' 'replacement characters</li>
|
||||
<li>wrote some more graceful fallback decoding handling code that attempts original assumed encoding and 'utf-8' if different and returns the one with the fewest '�' replacement characters</li>
|
||||
<li>the network engine and the ffmpeg info parsing now use this new 'safe' decoding, so even if a site has borked bytes or the video file has unexpected Shift-JIS title metadata, it'll still go through, albeit with some question marks</li>
|
||||
<li>moved some more old daemons to the new job scheduler, deleted some old daemon code</li>
|
||||
<li>improved some daemon job wake and shutdown code</li>
|
||||
|
|
|
@ -20,9 +20,36 @@
|
|||
<li><a href="https://gitlab.com/cryzed/hydrus-api">https://gitlab.com/cryzed/hydrus-api</a> - A python module that talks to the API.</li>
|
||||
</ul>
|
||||
<h3>API</h3>
|
||||
<p>The API should always return JSON on 200. Otherwise, assume it will return plain text, sometimes a raw traceback. You'll typically get 400 for a missing parameter, 401 or 403 for missing/insufficient access, and 500 for a real deal serverside error.</p>
|
||||
<p>If the API returns anything on 200, it should always return JSON. Otherwise, assume it will return plain text, sometimes a raw traceback. You'll typically get 400 for a missing parameter, 401 or 403 for missing/insufficient access, and 500 for a real deal serverside error.</p>
|
||||
<h3>Contents</h3>
|
||||
<ul>
|
||||
<li>
|
||||
<h4>Access Management</h4>
|
||||
<ul>
|
||||
<li><a href="#api_version">GET /api_version</a></li>
|
||||
<li><a href="#request_new_permissions">GET /request_new_permissions</a></li>
|
||||
<li><a href="#verify_access_key">GET /verify_access_key</a></li>
|
||||
</ul>
|
||||
<h4>Adding Files</h4>
|
||||
<ul>
|
||||
<li><a href="#add_files_add_file">POST /add_files/add_file</a></li>
|
||||
</ul>
|
||||
<h4>Adding Tags</h4>
|
||||
<ul>
|
||||
<li><a href="#add_tags_clean_tags">GET /add_tags/clean_tags</a></li>
|
||||
<li><a href="#add_tags_get_tag_services">GET /add_tags/get_tag_services</a></li>
|
||||
<li><a href="#add_tags_add_tags">POST /add_tags/add_tags</a></li>
|
||||
</ul>
|
||||
<h4>Adding URLs</h4>
|
||||
<ul>
|
||||
<li><a href="#add_urls_get_url_files">GET /add_urls/get_url_files</a></li>
|
||||
<li><a href="#add_urls_get_url_info">GET /add_urls/get_url_info</a></li>
|
||||
<li><a href="#add_urls_add_url">POST /add_urls/add_url</a></li>
|
||||
<li><a href="#add_urls_associate_url">POST /add_urls/associate_url</a></li>
|
||||
</ul>
|
||||
</ul>
|
||||
<h3>Access Management</h3>
|
||||
<div class="apiborder">
|
||||
<div class="apiborder" id="api_version">
|
||||
<h3><b>GET /api_version</b></h3>
|
||||
<p><i>Gets the current API version. I will increment this every time I alter the API.</i></p>
|
||||
<ul>
|
||||
|
@ -37,7 +64,7 @@
|
|||
</li>
|
||||
</ul>
|
||||
</div>
|
||||
<div class="apiborder">
|
||||
<div class="apiborder" id="request_new_permissions">
|
||||
<h3><b>GET /request_new_permissions</b></h3>
|
||||
<p><i>Register a new external program with the client. This requires the 'add from api request' mini-dialog under </i>services->review services<i> to be open, otherwise it will 403.</i></p>
|
||||
<ul>
|
||||
|
@ -73,7 +100,7 @@
|
|||
</li>
|
||||
</ul>
|
||||
</div>
|
||||
<div class="apiborder">
|
||||
<div class="apiborder" id="verify_access_key">
|
||||
<h3><b>GET /verify_access_key</b></h3>
|
||||
<p><i>Check your access key is valid.</i></p>
|
||||
<ul>
|
||||
|
@ -99,7 +126,7 @@
|
|||
</ul>
|
||||
</div>
|
||||
<h3>Adding Files</h3>
|
||||
<div class="apiborder">
|
||||
<div class="apiborder" id="add_files_add_file">
|
||||
<h3><b>POST /add_files/add_file</b></h3>
|
||||
<p><i>Tell the client to import a file.</i></p>
|
||||
<ul>
|
||||
|
@ -139,7 +166,44 @@
|
|||
</ul>
|
||||
</div>
|
||||
<h3>Adding Tags</h3>
|
||||
<div class="apiborder">
|
||||
<div class="apiborder" id="add_tags_clean_tags">
|
||||
<h3><b>GET /add_tags/clean_tags</b></h3>
|
||||
<p><i>Ask the client about how it will see certain tags.</i></p>
|
||||
<ul>
|
||||
<li>
|
||||
<p>Headers:</p>
|
||||
<ul>
|
||||
<li>Hydrus-Client-API-Access-Key : (Your hexadecimal access key)</li>
|
||||
</ul>
|
||||
</li>
|
||||
<li><p>Arguments (in percent-encoded JSON):</p></li>
|
||||
<ul>
|
||||
<li>tags : (a list of the tags you want cleaned)</li>
|
||||
</ul>
|
||||
<li>
|
||||
<p>Example request:</p>
|
||||
<pre>Given tags [ " bikini ", "blue    eyes", " character : samus aran ", ":)", " ", "", "10", "11", "9", "system:wew", "-flower" ]:</pre>
|
||||
<ul>
|
||||
<li><p>/add_tags/clean_tags?tags=%5B%22%20bikini%20%22%2C%20%22blue%20%20%20%20eyes%22%2C%20%22%20character%20%3A%20samus%20aran%20%22%2C%20%22%3A%29%22%2C%20%22%20%20%20%22%2C%20%22%22%2C%20%2210%22%2C%20%2211%22%2C%20%229%22%2C%20%22system%3Awew%22%2C%20%22-flower%22%5D</p></li>
|
||||
</ul>
|
||||
</li>
|
||||
<li>
|
||||
<p>Response description: The tags cleaned according to hydrus rules. They will also be in hydrus human-friendly sorting order.</p>
|
||||
</li>
|
||||
<li>
|
||||
<p>Example response:</p>
|
||||
<ul>
|
||||
<li>
|
||||
<pre>{
|
||||
"tags": [ "9", "10", "11", "::)", "bikini", "blue eyes", "character:samus aran", "flower", "wew" ]
|
||||
}</pre>
|
||||
</li>
|
||||
</ul>
|
||||
<p>Mostly, hydrus simply trims excess whitespace, but the other examples are rare issues you might run into. 'system' is an invalid namespace, tags cannot be prefixed with hyphens, and any tag starting with ':' is secretly dealt with internally as "[no namespace]:[colon-prefixed-subtag]". Again, you probably won't run into these, but if you see a mismatch somewhere and want to figure it out, or just want to sort some numbered tags, you might like to try this.</p>
|
||||
</li>
|
||||
</ul>
|
||||
</div>
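As a rough illustration of the percent-encoded JSON argument format used by this GET call, here is a small Python sketch. It is an assumption-laden example, not the client's own code: 127.0.0.1:45869 is assumed as the client's default API address, and build_clean_tags_url is a hypothetical helper name.

```python
import json
import urllib.parse

def build_clean_tags_url(api_base, tags):
    # list/object GET arguments go over the wire as percent-encoded JSON
    tags_json = json.dumps(tags)
    return api_base + '/add_tags/clean_tags?tags=' + urllib.parse.quote(tags_json, safe='')

# 127.0.0.1:45869 is assumed here as the client's default API address
url = build_clean_tags_url('http://127.0.0.1:45869', [' bikini ', 'blue    eyes'])
```

The result matches the percent-encoded example request shown above: brackets become %5B/%5D, quotes %22, and spaces %20.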
|
||||
<div class="apiborder" id="add_tags_get_tag_services">
|
||||
<h3><b>GET /add_tags/get_tag_services</b></h3>
|
||||
<p><i>Ask the client about its tag services.</i></p>
|
||||
<ul>
|
||||
|
@ -167,11 +231,74 @@
|
|||
</li>
|
||||
</ul>
|
||||
</div>
|
||||
<div class="apiborder">
|
||||
<div class="apiborder" id="add_tags_add_tags">
|
||||
<h3><b>POST /add_tags/add_tags</b></h3>
|
||||
<p><i>Make changes to the tags that files have.</i></p>
|
||||
<ul>
|
||||
<li>
|
||||
<p>Headers:</p>
|
||||
<ul>
|
||||
<li>Hydrus-Client-API-Access-Key : (Your hexadecimal access key)</li>
|
||||
</ul>
|
||||
</li>
|
||||
<li><p>Arguments (in JSON):</p></li>
|
||||
<ul>
|
||||
<li>hash : (an SHA256 hash for a file in 64 characters of hexadecimal)</li>
|
||||
<li>hashes : (a list of SHA256 hashes)</li>
|
||||
<li>service_names_to_tags : (an Object of service names to lists of tags to be 'added' to the files)</li>
|
||||
<li>service_names_to_actions_to_tags : (an Object of service names to content update actions to lists of tags)</li>
|
||||
</ul>
|
||||
<p>You can use either 'hash' or 'hashes', and you can use either the simple add-only 'service_names_to_tags' or the advanced 'service_names_to_actions_to_tags'.</p>
|
||||
<p>The service names are as in the <i>/add_tags/get_tag_services</i> call.</p>
|
||||
<p>The permitted 'actions' are:</p>
|
||||
<ul>
|
||||
<li>0 - Add to a local tag service.</li>
|
||||
<li>1 - Delete from a local tag service.</li>
|
||||
<li>2 - Pend to a tag repository.</li>
|
||||
<li>3 - Rescind a pend from a tag repository.</li>
|
||||
<li>4 - Petition from a tag repository. (This is special)</li>
|
||||
<li>5 - Rescind a petition from a tag repository.</li>
|
||||
</ul>
|
||||
<p>When you petition a tag from a repository, a 'reason' for the petition is typically needed. If you send a normal list of tags here, a default reason of "Petitioned from API" will be given. If you want to set your own reason, you can instead give a list of [ tag, reason ] pairs.</p>
|
||||
<p>Some example requests:</p>
|
||||
<p>Adding some tags to a file:</p>
|
||||
<pre>{
|
||||
"hash" : "df2a7b286d21329fc496e3aa8b8a08b67bb1747ca32749acb3f5d544cbfc0f56",
|
||||
"service_names_to_tags" : {
|
||||
"local tags" : [ "character:supergirl", "rating:safe" ]
|
||||
}
|
||||
}</pre>
|
||||
<p>Adding more tags to two files:</p>
|
||||
<pre>{
|
||||
"hashes" : [ "df2a7b286d21329fc496e3aa8b8a08b67bb1747ca32749acb3f5d544cbfc0f56", "f2b022214e711e9a11e2fcec71bfd524f10f0be40c250737a7861a5ddd3faebf" ],
|
||||
"service_names_to_tags" : {
|
||||
"local tags" : [ "process this" ],
|
||||
"public tag repository" : [ "creator:dandon fuga" ]
|
||||
}
|
||||
}</pre>
|
||||
<p>A complicated transaction with all possible actions:</p>
|
||||
<pre>{
|
||||
"hash" : "df2a7b286d21329fc496e3aa8b8a08b67bb1747ca32749acb3f5d544cbfc0f56",
|
||||
"service_names_to_actions_to_tags" : {
|
||||
"local tags" : {
|
||||
0 : [ "character:supergirl", "rating:safe" ],
|
||||
1 : [ "character:superman" ]
|
||||
},
|
||||
"public tag repository" : {
|
||||
2 : [ "character:supergirl", "rating:safe" ],
|
||||
3 : [ "filename:image.jpg" ],
|
||||
4 : [ [ "creator:danban faga", "typo" ], [ "character:super_girl", "underscore" ] ],
|
||||
5 : [ "skirt" ]
|
||||
}
|
||||
}
|
||||
}</pre>
|
||||
<p>This last example is far more complicated than you will usually see. Pend rescinds and petition rescinds are not common. Petitions are also quite rare, and gathering a good petition reason for each tag is often a pain.</p>
|
||||
<p>Response description: 200 and no content.</p>
|
||||
<p>Note also that hydrus tag actions are safely idempotent. You can pend a tag that is already pended and not worry about an error--it will be discarded. The same for other reasonable logical scenarios: deleting a tag that does not exist will silently make no change, pending a tag that is already 'current' will again be passed over. It is fine to just throw 'process this' tags at every file import you add and not have to worry about checking which files you already added it to.</p>
|
||||
</ul>
|
||||
</div>
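To make the action codes and the [ tag, reason ] petition pairs concrete, here is a hedged Python sketch that assembles an advanced request body. The hash, service names, and access key are placeholders, and this only builds the body--it does not contact a client.

```python
import json

# action codes: 0 add, 1 delete, 2 pend, 3 rescind pend, 4 petition, 5 rescind petition
payload = {
    'hash': 'df2a7b286d21329fc496e3aa8b8a08b67bb1747ca32749acb3f5d544cbfc0f56',
    'service_names_to_actions_to_tags': {
        'local tags': {
            0: ['character:supergirl', 'rating:safe'],
            1: ['character:superman']
        },
        'public tag repository': {
            2: ['character:supergirl'],
            4: [['creator:danban faga', 'typo']]  # a [ tag, reason ] pair for a petition
        }
    }
}

body = json.dumps(payload)  # note json.dumps coerces the int action keys to strings
headers = {
    'Hydrus-Client-API-Access-Key': '0123456789abcdef' * 4,  # placeholder 64-char key
    'Content-Type': 'application/json'
}
```

You would POST this body to /add_tags/add_tags with those headers; since the actions are idempotent, re-sending the same body is harmless.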
|
||||
<h3>Adding URLs</h3>
|
||||
<div class="apiborder">
|
||||
<div class="apiborder" id="add_urls_get_url_files">
|
||||
<h3><b>GET /add_urls/get_url_files</b></h3>
|
||||
<p><i>Ask the client about a URL's files.</i></p>
|
||||
<ul>
|
||||
|
@ -223,7 +350,7 @@
|
|||
</li>
|
||||
</ul>
|
||||
</div>
|
||||
<div class="apiborder">
|
||||
<div class="apiborder" id="add_urls_get_url_info">
|
||||
<h3><b>GET /add_urls/get_url_info</b></h3>
|
||||
<p><i>Ask the client for information about a URL.</i></p>
|
||||
<ul>
|
||||
|
@ -273,7 +400,7 @@
|
|||
</li>
|
||||
</ul>
|
||||
</div>
|
||||
<div class="apiborder">
|
||||
<div class="apiborder" id="add_urls_add_url">
|
||||
<h3><b>POST /add_urls/add_url</b></h3>
|
||||
<p><i>Tell the client to 'import' a URL. This triggers the exact same routine as drag-and-dropping a text URL onto the main client window.</i></p>
|
||||
<ul>
|
||||
|
@ -288,12 +415,24 @@
|
|||
<p>Arguments (in JSON):</p>
|
||||
<ul>
|
||||
<li>url : (the url you want to add)</li>
|
||||
<li>destination_page_name : (optional page name to receive the url)</li>
|
||||
<li>service_names_to_tags : (optional tags to give to any files imported from this url)</li>
|
||||
</ul>
|
||||
</li>
|
||||
<p>If you specify a destination_page_name and an appropriate importer page already exists with that name, that page will be used. Otherwise, a new page with that name will be created (and used by subsequent calls with that name). Make sure that page name is unique (e.g. '/b/ threads', not 'watcher') in your client, or it may not be found.</p>
|
||||
<p>The service_names_to_tags uses the same system as for /add_tags/add_tags. You will need 'add tags' permission, or this will 403.</p>
|
||||
<li>
|
||||
<p>Example request body:</p>
|
||||
<ul>
|
||||
<li><p>{"url": "https://8ch.net/tv/res/1846574.html"}</p></li>
|
||||
<li>
|
||||
<pre>{
|
||||
"url": "https://8ch.net/tv/res/1846574.html",
|
||||
"destination_page_name": "kino zone",
|
||||
"service_names_to_tags": {
|
||||
"local tags" : [ "as seen on /tv/" ]
|
||||
}
|
||||
}</pre>
|
||||
</li>
|
||||
</ul>
|
||||
</li>
|
||||
<li><p>Response description: Some JSON with info on the URL added.</p></li>
|
||||
|
@ -310,8 +449,43 @@
|
|||
</li>
|
||||
</ul>
|
||||
</div>
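A sketch of what a full add_url POST might look like from Python, under stated assumptions: the page name and access key are placeholders, 127.0.0.1:45869 is the assumed default API address, and the actual network call is left commented so nothing fires without a running client.

```python
import json

payload = {
    'url': 'https://8ch.net/tv/res/1846574.html',
    'destination_page_name': 'kino zone',  # reuses or creates a page with this name
    'service_names_to_tags': {'local tags': ['as seen on /tv/']}
}
headers = {
    'Hydrus-Client-API-Access-Key': 'replace-with-your-key',  # placeholder
    'Content-Type': 'application/json'
}
body = json.dumps(payload)

# with a running client, something like:
# import requests
# r = requests.post('http://127.0.0.1:45869/add_urls/add_url', data=body, headers=headers)
```

Remember that including service_names_to_tags requires 'add tags' permission, or the call will 403.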
|
||||
<div class="apiborder">
|
||||
<div class="apiborder" id="add_urls_associate_url">
|
||||
<h3><b>POST /add_urls/associate_url</b></h3>
|
||||
<p><i>Manage which URLs the client considers to be associated with which files.</i></p>
|
||||
<ul>
|
||||
<li>
|
||||
<p>Headers:</p>
|
||||
<ul>
|
||||
<li>Hydrus-Client-API-Access-Key : (Your hexadecimal access key)</li>
|
||||
<li>Content-Type : application/json</li>
|
||||
</ul>
|
||||
</li>
|
||||
<li>
|
||||
<p>Arguments (in JSON):</p>
|
||||
<ul>
|
||||
<li>url_to_add : (a url you want to associate with the file(s))</li>
|
||||
<li>urls_to_add : (a list of urls you want to associate with the file(s))</li>
|
||||
<li>url_to_delete : (a url you want to disassociate from the file(s))</li>
|
||||
<li>urls_to_delete : (a list of urls you want to disassociate from the file(s))</li>
|
||||
<li>hash : (an SHA256 hash for a file in 64 characters of hexadecimal)</li>
|
||||
<li>hashes : (a list of SHA256 hashes)</li>
|
||||
</ul>
|
||||
</li>
|
||||
<p>All of these are optional, but you obviously need to have at least one of the 'url' arguments and one of the 'hash' arguments. Unless you really know what you are doing with URL Classes, I strongly recommend you stick to associating URLs with just one single 'hash' at a time. Multiple hashes pointing to the same URL is unusual and frequently unhelpful.</p>
|
||||
<li>
|
||||
<p>Example request body:</p>
|
||||
<ul>
|
||||
<li>
|
||||
<pre>{
|
||||
"url_to_add": "https://rule34.xxx/index.php?id=2588418&page=post&s=view",
|
||||
"hash": "3b820114f658d768550e4e3d4f1dced3ff8db77443472b5ad93700647ad2d3ba"
|
||||
|
||||
}</pre>
|
||||
</li>
|
||||
</ul>
|
||||
</li>
|
||||
<li><p>Response description: 200 with no content. Like when adding tags, this is safely idempotent--do not worry about re-adding URL associations that already exist or accidentally trying to delete ones that don't.</p></li>
|
||||
</ul>
|
||||
</div>
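Since every argument here is optional but at least one url argument and one hash argument are required, a caller might want to validate before sending. This is a hypothetical client-side helper, not part of the API; it folds the singular forms into the plural ones and builds a request body.

```python
def build_associate_url_body(url_to_add=None, urls_to_add=None,
                             url_to_delete=None, urls_to_delete=None,
                             hash=None, hashes=None):
    # parameter names mirror the API arguments, including 'hash'
    urls_to_add = list(urls_to_add or []) + ([url_to_add] if url_to_add else [])
    urls_to_delete = list(urls_to_delete or []) + ([url_to_delete] if url_to_delete else [])
    hashes = list(hashes or []) + ([hash] if hash else [])
    if not (urls_to_add or urls_to_delete):
        raise ValueError('need at least one url argument')
    if not hashes:
        raise ValueError('need at least one hash argument')
    body = {'hashes': hashes}
    if urls_to_add:
        body['urls_to_add'] = urls_to_add
    if urls_to_delete:
        body['urls_to_delete'] = urls_to_delete
    return body

body = build_associate_url_body(
    url_to_add='https://rule34.xxx/index.php?id=2588418&page=post&s=view',
    hash='3b820114f658d768550e4e3d4f1dced3ff8db77443472b5ad93700647ad2d3ba')
```

Per the recommendation above, passing a single hash at a time is the usual pattern.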
|
||||
<h3>Searching Files</h3>
|
||||
<div class="apiborder">
|
||||
|
|
|
@ -28,7 +28,7 @@
|
|||
<p>That '. venv/bin/activate' line turns your venv on, and will be needed every time you run the client.pyw/server.py files. You can easily tuck it into a launch script.</p>
|
||||
<p>After that, you can go nuts with pip. I think this will do for most systems:</p>
|
||||
<ul>
|
||||
<li>pip3 install beautifulsoup4 html5lib lxml nose numpy opencv-python six Pillow psutil PyOpenSSL PyYAML requests Send2Trash service_identity twisted</li>
|
||||
<li>pip3 install beautifulsoup4 chardet html5lib lxml nose numpy opencv-python six Pillow psutil PyOpenSSL PyYAML requests Send2Trash service_identity twisted</li>
|
||||
</ul>
|
||||
<p>You may want to do all that in smaller batches.</p>
|
||||
<p>And optionally, you can add these packages:</p>
|
||||
|
|
|
@ -250,7 +250,7 @@ class APIPermissions( HydrusSerialisable.SerialisableBaseNamed ):
|
|||
|
||||
if self._last_search_results is None:
|
||||
|
||||
raise HydrusExceptions.InsufficientCredentialsException( 'It looks like those search results are no longer available--please run the search again!' )
|
||||
raise HydrusExceptions.BadRequestException( 'It looks like those search results are no longer available--please run the search again!' )
|
||||
|
||||
|
||||
num_files_asked_for = len( hash_ids )
|
||||
|
@ -262,7 +262,7 @@ class APIPermissions( HydrusSerialisable.SerialisableBaseNamed ):
|
|||
|
||||
error_text = error_text.format( HydrusData.ToHumanInt( num_files_asked_for ), HydrusData.ToHumanInt( num_files_allowed_to_see ) )
|
||||
|
||||
raise HydrusExceptions.InsufficientCredentialsException( error_text )
|
||||
raise HydrusExceptions.BadRequestException( error_text )
|
||||
|
||||
|
||||
self._search_results_timeout = HydrusData.GetNow() + SEARCH_RESULTS_CACHE_TIMEOUT
|
||||
|
|
|
@ -1,3 +1,4 @@
|
|||
from . import ClientImageHandling
|
||||
from . import ClientParsing
|
||||
from . import ClientPaths
|
||||
from . import ClientRendering
|
||||
|
@ -7,6 +8,7 @@ from . import ClientThreading
|
|||
from . import HydrusConstants as HC
|
||||
from . import HydrusExceptions
|
||||
from . import HydrusFileHandling
|
||||
from . import HydrusImageHandling
|
||||
from . import HydrusPaths
|
||||
from . import HydrusSerialisable
|
||||
from . import HydrusThreading
|
||||
|
@ -313,6 +315,11 @@ class ClientFilesManager( object ):
|
|||
|
||||
file_path = self._GenerateExpectedFilePath( hash, mime )
|
||||
|
||||
if not os.path.exists( file_path ):
|
||||
|
||||
raise HydrusExceptions.FileMissingException( 'The thumbnail for file ' + hash.hex() + ' was missing. It could not be regenerated from the original file because the original file is missing! This event could indicate hard drive corruption. Please check everything is ok.')
|
||||
|
||||
|
||||
try:
|
||||
|
||||
percentage_in = self._controller.new_options.GetInteger( 'video_thumbnail_percentage_in' )
|
||||
|
@ -358,7 +365,7 @@ class ClientFilesManager( object ):
|
|||
|
||||
try:
|
||||
|
||||
thumbnail_resized = HydrusFileHandling.GenerateThumbnailFromStaticImage( full_size_path, thumbnail_dimensions, fullsize_thumbnail_mime )
|
||||
thumbnail_resized = HydrusFileHandling.GenerateThumbnailFileBytesFromStaticImagePath( full_size_path, thumbnail_dimensions, fullsize_thumbnail_mime )
|
||||
|
||||
except:
|
||||
|
||||
|
@ -373,7 +380,7 @@ class ClientFilesManager( object ):
|
|||
|
||||
self._GenerateFullSizeThumbnail( hash, mime )
|
||||
|
||||
thumbnail_resized = HydrusFileHandling.GenerateThumbnailFromStaticImage( full_size_path, thumbnail_dimensions, fullsize_thumbnail_mime )
|
||||
thumbnail_resized = HydrusFileHandling.GenerateThumbnailFileBytesFromStaticImagePath( full_size_path, thumbnail_dimensions, fullsize_thumbnail_mime )
|
||||
|
||||
|
||||
resized_path = self._GenerateExpectedResizedThumbnailPath( hash )
|
||||
|
@ -631,7 +638,7 @@ class ClientFilesManager( object ):
|
|||
|
||||
text = 'Attempting to create the database\'s client_files folder structure failed!'
|
||||
|
||||
wx.MessageBox( text )
|
||||
wx.SafeShowMessage( 'unable to create file structure', text )
|
||||
|
||||
raise
|
||||
|
||||
|
@ -693,7 +700,7 @@ class ClientFilesManager( object ):
|
|||
text += os.linesep * 2
|
||||
text += 'If this is happening on client boot, you should now be presented with a dialog to correct this manually!'
|
||||
|
||||
wx.MessageBox( text )
|
||||
wx.SafeShowMessage( 'missing locations', text )
|
||||
|
||||
HydrusData.DebugPrint( text )
|
||||
HydrusData.DebugPrint( 'Missing locations follow:' )
|
||||
|
@ -707,7 +714,7 @@ class ClientFilesManager( object ):
|
|||
text += os.linesep * 2
|
||||
text += 'If this is happening on client boot, you should now be presented with a dialog to correct this manually!'
|
||||
|
||||
wx.MessageBox( text )
|
||||
wx.SafeShowMessage( 'missing locations', text )
|
||||
HydrusData.DebugPrint( text )
|
||||
|
||||
|
||||
|
@ -1341,22 +1348,30 @@ class ClientFilesManager( object ):
|
|||
|
||||
|
||||
|
||||
def RegenerateFullSizeThumbnail( self, hash, mime ):
|
||||
|
||||
with self._lock:
|
||||
|
||||
if HG.file_report_mode:
|
||||
|
||||
HydrusData.ShowText( 'Thumbnail regen request: ' + str( ( hash, mime ) ) )
|
||||
|
||||
|
||||
self._GenerateFullSizeThumbnail( hash, mime )
|
||||
|
||||
|
||||
|
||||
def RegenerateResizedThumbnail( self, hash, mime ):
|
||||
|
||||
with self._lock:
|
||||
|
||||
self.LocklessRegenerateResizedThumbnail( hash, mime )
|
||||
if HG.file_report_mode:
|
||||
|
||||
HydrusData.ShowText( 'Thumbnail regen request: ' + str( ( hash, mime ) ) )
|
||||
|
||||
|
||||
|
||||
|
||||
def LocklessRegenerateResizedThumbnail( self, hash, mime ):
|
||||
|
||||
if HG.file_report_mode:
|
||||
self._GenerateResizedThumbnail( hash, mime )
|
||||
|
||||
HydrusData.ShowText( 'Thumbnail regen request: ' + str( ( hash, mime ) ) )
|
||||
|
||||
|
||||
self._GenerateResizedThumbnail( hash, mime )
|
||||
|
||||
|
||||
def RegenerateThumbnails( self, only_do_missing = False ):
|
||||
|
@ -2276,6 +2291,87 @@ class ThumbnailCache( object ):
|
|||
self._controller.sub( self, 'ClearThumbnails', 'clear_thumbnails' )
|
||||
|
||||
|
||||
def _GetResizedHydrusBitmap( self, display_media ):
|
||||
|
||||
if not HG.thumbnail_experiment_mode:
|
||||
|
||||
return self._GetResizedHydrusBitmapFromHardDrive( display_media )
|
||||
|
||||
else:
|
||||
|
||||
return self._GetResizedHydrusBitmapFromFullSize( display_media )
|
||||
|
||||
|
||||
|
||||
def _GetResizedHydrusBitmapFromFullSize( self, display_media ):
|
||||
|
||||
thumbnail_dimensions = self._controller.options[ 'thumbnail_dimensions' ]
|
||||
|
||||
hash = display_media.GetHash()
|
||||
mime = display_media.GetMime()
|
||||
|
||||
locations_manager = display_media.GetLocationsManager()
|
||||
|
||||
try:
|
||||
|
||||
path = self._controller.client_files_manager.GetFullSizeThumbnailPath( hash, mime )
|
||||
|
||||
except HydrusExceptions.FileMissingException as e:
|
||||
|
||||
if locations_manager.IsLocal():
|
||||
|
||||
HydrusData.ShowException( e )
|
||||
|
||||
|
||||
return self._special_thumbs[ 'hydrus' ]
|
||||
|
||||
|
||||
try:
|
||||
|
||||
numpy_image = ClientImageHandling.GenerateNumpyImage( path, mime )
|
||||
|
||||
except Exception as e:
|
||||
|
||||
try:
|
||||
|
||||
# file is malformed, let's force a regen
|
||||
self._controller.client_files_manager.RegenerateResizedThumbnail( hash, mime )
|
||||
|
||||
try:
|
||||
|
||||
numpy_image = ClientImageHandling.GenerateNumpyImage( path, mime )
|
||||
|
||||
except Exception as e:
|
||||
|
||||
HydrusData.ShowException( e )
|
||||
|
||||
raise HydrusExceptions.FileMissingException( 'The thumbnail for file ' + hash.hex() + ' was broken. It was regenerated, but the new file would not render for the above reason. Please inform the hydrus developer what has happened.' )
|
||||
|
||||
|
||||
except Exception as e:
|
||||
|
||||
HydrusData.ShowException( e )
|
||||
|
||||
return self._special_thumbs[ 'hydrus' ]
|
||||
|
||||
|
||||
|
||||
( fullsize_x, fullsize_y ) = ClientImageHandling.GetNumPyImageResolution( numpy_image )
|
||||
|
||||
( resized_x, resized_y ) = HydrusImageHandling.GetThumbnailResolution( ( fullsize_x, fullsize_y ), thumbnail_dimensions )
|
||||
|
||||
already_correct = fullsize_x == resized_x and fullsize_y == resized_y
|
||||
|
||||
if not already_correct:
|
||||
|
||||
numpy_image = ClientImageHandling.EfficientlyThumbnailNumpyImage( numpy_image, ( resized_x, resized_y ) )
|
||||
|
||||
|
||||
hydrus_bitmap = ClientRendering.GenerateHydrusBitmapFromNumPyImage( numpy_image )
|
||||
|
||||
return hydrus_bitmap
|
||||
|
||||
|
||||
def _GetResizedHydrusBitmapFromHardDrive( self, display_media ):
|
||||
|
||||
thumbnail_dimensions = self._controller.options[ 'thumbnail_dimensions' ]
|
||||
|
@ -2315,8 +2411,6 @@ class ThumbnailCache( object ):
|
|||
return self._special_thumbs[ 'hydrus' ]
|
||||
|
||||
|
||||
mime = display_media.GetMime()
|
||||
|
||||
try:
|
||||
|
||||
hydrus_bitmap = ClientRendering.GenerateHydrusBitmap( path, mime )
|
||||
|
@ -2325,7 +2419,8 @@ class ThumbnailCache( object ):
|
|||
|
||||
try:
|
||||
|
||||
self._controller.client_files_manager.RegenerateResizedThumbnail( hash, mime )
|
||||
# file is malformed, let's force a regen
|
||||
self._controller.client_files_manager.RegenerateFullSizeThumbnail( hash, mime )
|
||||
|
||||
try:
|
||||
|
||||
|
@ -2412,7 +2507,7 @@ class ThumbnailCache( object ):
|
|||
|
||||
thumbnail_dimensions = self._controller.options[ 'thumbnail_dimensions' ]
|
||||
|
||||
thumbnail_bytes = HydrusFileHandling.GenerateThumbnailFromStaticImage( path, thumbnail_dimensions, HC.IMAGE_PNG )
|
||||
thumbnail_bytes = HydrusFileHandling.GenerateThumbnailFileBytesFromStaticImagePath( path, thumbnail_dimensions, HC.IMAGE_PNG )
|
||||
|
||||
with open( temp_path, 'wb' ) as f:
|
||||
|
||||
|
@ -2478,24 +2573,13 @@ class ThumbnailCache( object ):
|
|||
|
||||
if result is None:
|
||||
|
||||
if locations_manager.ShouldDefinitelyHaveThumbnail():
|
||||
try:
|
||||
|
||||
# local file, should be able to regen if needed
|
||||
hydrus_bitmap = self._GetResizedHydrusBitmap( display_media )
|
||||
|
||||
hydrus_bitmap = self._GetResizedHydrusBitmapFromHardDrive( display_media )
|
||||
except:
|
||||
|
||||
else:
|
||||
|
||||
# repository file, maybe not actually available yet
|
||||
|
||||
try:
|
||||
|
||||
hydrus_bitmap = self._GetResizedHydrusBitmapFromHardDrive( display_media )
|
||||
|
||||
except:
|
||||
|
||||
hydrus_bitmap = self._special_thumbs[ 'hydrus' ]
|
||||
|
||||
hydrus_bitmap = self._special_thumbs[ 'hydrus' ]
|
||||
|
||||
|
||||
self._data_cache.AddData( hash, hydrus_bitmap )
|
||||
|
|
|
@@ -401,7 +401,7 @@ class Controller( HydrusController.HydrusController ):
 stop_time = HydrusData.GetNow() + ( self.options[ 'idle_shutdown_max_minutes' ] * 60 )
-self.MaintainDB( stop_time = stop_time )
+self.MaintainDB( only_if_idle = False, stop_time = stop_time )
 if not self.options[ 'pause_repo_sync' ]:
@@ -643,7 +643,7 @@ class Controller( HydrusController.HydrusController ):
 client_api_manager._dirty = True
-wx.MessageBox( 'Your client api manager was missing on boot! I have recreated a new empty one. Please check that your hard drive and client are ok and let the hydrus dev know the details if there is a mystery.' )
+wx.SafeShowMessage( 'Problem loading object', 'Your client api manager was missing on boot! I have recreated a new empty one. Please check that your hard drive and client are ok and let the hydrus dev know the details if there is a mystery.' )
 self.client_api_manager = client_api_manager
@@ -658,7 +658,7 @@ class Controller( HydrusController.HydrusController ):
 bandwidth_manager._dirty = True
-wx.MessageBox( 'Your bandwidth manager was missing on boot! I have recreated a new empty one with default rules. Please check that your hard drive and client are ok and let the hydrus dev know the details if there is a mystery.' )
+wx.SafeShowMessage( 'Problem loading object', 'Your bandwidth manager was missing on boot! I have recreated a new empty one with default rules. Please check that your hard drive and client are ok and let the hydrus dev know the details if there is a mystery.' )
 session_manager = self.Read( 'serialisable', HydrusSerialisable.SERIALISABLE_TYPE_NETWORK_SESSION_MANAGER )
@@ -669,7 +669,7 @@ class Controller( HydrusController.HydrusController ):
 session_manager._dirty = True
-wx.MessageBox( 'Your session manager was missing on boot! I have recreated a new empty one. Please check that your hard drive and client are ok and let the hydrus dev know the details if there is a mystery.' )
+wx.SafeShowMessage( 'Problem loading object', 'Your session manager was missing on boot! I have recreated a new empty one. Please check that your hard drive and client are ok and let the hydrus dev know the details if there is a mystery.' )
 domain_manager = self.Read( 'serialisable', HydrusSerialisable.SERIALISABLE_TYPE_NETWORK_DOMAIN_MANAGER )
@@ -682,7 +682,7 @@ class Controller( HydrusController.HydrusController ):
 domain_manager._dirty = True
-wx.MessageBox( 'Your domain manager was missing on boot! I have recreated a new empty one. Please check that your hard drive and client are ok and let the hydrus dev know the details if there is a mystery.' )
+wx.SafeShowMessage( 'Problem loading object', 'Your domain manager was missing on boot! I have recreated a new empty one. Please check that your hard drive and client are ok and let the hydrus dev know the details if there is a mystery.' )
 domain_manager.Initialise()
@@ -697,7 +697,7 @@ class Controller( HydrusController.HydrusController ):
 login_manager._dirty = True
-wx.MessageBox( 'Your login manager was missing on boot! I have recreated a new empty one. Please check that your hard drive and client are ok and let the hydrus dev know the details if there is a mystery.' )
+wx.SafeShowMessage( 'Problem loading object', 'Your login manager was missing on boot! I have recreated a new empty one. Please check that your hard drive and client are ok and let the hydrus dev know the details if there is a mystery.' )
 login_manager.Initialise()
@@ -860,7 +860,12 @@ class Controller( HydrusController.HydrusController ):
 return self._last_shutdown_was_bad
-def MaintainDB( self, stop_time = None ):
+def MaintainDB( self, only_if_idle = True, stop_time = None ):
+    if only_if_idle and not self.GoodTimeToDoBackgroundWork():
+        return
 if self.new_options.GetBoolean( 'maintain_similar_files_duplicate_pairs_during_idle' ):
@@ -1139,8 +1144,6 @@ class Controller( HydrusController.HydrusController ):
 def THREADRestart():
-    wx.CallAfter( self.gui.Exit )
     while not self.db.LoopIsFinished():
         time.sleep( 0.1 )
@@ -1158,6 +1161,8 @@ class Controller( HydrusController.HydrusController ):
 self.CallToThreadLongRunning( THREADRestart )
+wx.CallAfter( self.gui.Exit )
@@ -1366,8 +1371,8 @@ class Controller( HydrusController.HydrusController ):
 HydrusData.DebugPrint( traceback.format_exc() )
-wx.CallAfter( wx.MessageBox, traceback.format_exc() )
-wx.CallAfter( wx.MessageBox, text )
+wx.SafeShowMessage( 'boot error', text )
+wx.SafeShowMessage( 'boot error', traceback.format_exc() )
 HG.emergency_exit = True
@@ -1436,7 +1441,10 @@ class Controller( HydrusController.HydrusController ):
 wx.TheClipboard.Close()
-else: wx.MessageBox( 'Could not get permission to access the clipboard!' )
+else:
+    wx.MessageBox( 'Could not get permission to access the clipboard!' )
 elif data_type == 'text':
@@ -1450,7 +1458,10 @@ class Controller( HydrusController.HydrusController ):
 wx.TheClipboard.Close()
-else: wx.MessageBox( 'I could not get permission to access the clipboard.' )
+else:
+    wx.MessageBox( 'I could not get permission to access the clipboard.' )
 elif data_type == 'bmp':
@@ -1390,8 +1390,6 @@ class DB( HydrusDB.HydrusDB ):
 hash_ids = self._STL( self._c.execute( 'SELECT hash_id FROM shape_search_cache WHERE searched_distance IS NULL or searched_distance < ?;', ( search_distance, ) ) )
-pairs_found = 0
 total_done_previously = total_num_hash_ids_in_cache - len( hash_ids )
 for ( i, hash_id ) in enumerate( hash_ids ):
@@ -1430,8 +1428,6 @@ class DB( HydrusDB.HydrusDB ):
 self._c.executemany( 'INSERT OR IGNORE INTO duplicate_pairs ( smaller_hash_id, larger_hash_id, duplicate_type ) VALUES ( ?, ?, ? );', ( ( min( hash_id, duplicate_hash_id ), max( hash_id, duplicate_hash_id ), HC.DUPLICATE_UNKNOWN ) for duplicate_hash_id in duplicate_hash_ids ) )
-pairs_found += self._GetRowCount()
 self._c.execute( 'UPDATE shape_search_cache SET searched_distance = ? WHERE hash_id = ?;', ( search_distance, hash_id ) )
@@ -1888,16 +1884,16 @@ class DB( HydrusDB.HydrusDB ):
 search_radius = max_hamming_distance
-result = self._c.execute( 'SELECT phash_id FROM shape_vptree WHERE parent_id IS NULL;' ).fetchone()
+top_node_result = self._c.execute( 'SELECT phash_id FROM shape_vptree WHERE parent_id IS NULL;' ).fetchone()
-if result is None:
+if top_node_result is None:
     return []
-( root_node_phash_id, ) = result
+( root_node_phash_id, ) = top_node_result
-search_phashes = [ phash for ( phash, ) in self._c.execute( 'SELECT phash FROM shape_perceptual_hashes NATURAL JOIN shape_perceptual_hash_map WHERE hash_id = ?;', ( hash_id, ) ) ]
+search_phashes = self._STL( self._c.execute( 'SELECT phash FROM shape_perceptual_hashes NATURAL JOIN shape_perceptual_hash_map WHERE hash_id = ?;', ( hash_id, ) ) )
 if len( search_phashes ) == 0:
@@ -4835,8 +4831,14 @@ class DB( HydrusDB.HydrusDB ):
 duration = simple_preds[ 'duration' ]
-if duration == 0: files_info_predicates.append( '( duration IS NULL OR duration = 0 )' )
-else: files_info_predicates.append( 'duration = ' + str( duration ) )
+if duration == 0:
+    files_info_predicates.append( '( duration IS NULL OR duration = 0 )' )
+else:
+    files_info_predicates.append( 'duration = ' + str( duration ) )
 if 'max_duration' in simple_preds:
@@ -8967,6 +8969,7 @@ class DB( HydrusDB.HydrusDB ):
 larger_precise_timestamp = HydrusData.GetNowPrecise()
 total_definitions_rows = 0
+transaction_rows = 0
 try:
@@ -9008,6 +9011,23 @@ class DB( HydrusDB.HydrusDB ):
 report_speed_to_job_key( job_key, precise_timestamp, num_rows, 'definitions' )
 total_definitions_rows += num_rows
+transaction_rows += num_rows
+been_a_minute = HydrusData.TimeHasPassed( self._transaction_started + 60 )
+been_a_hundred_k = transaction_rows > 100000
+if been_a_minute or been_a_hundred_k:
+    job_key.SetVariable( 'popup_text_1', 'committing' )
+    self._Commit()
+    self._BeginImmediate()
+    time.sleep( 0.5 )
+    transaction_rows = 0
 # let's atomically save our progress here to avoid the desync issue some people had.
@@ -9036,6 +9056,7 @@ class DB( HydrusDB.HydrusDB ):
 precise_timestamp = HydrusData.GetNowPrecise()
 total_content_rows = 0
+transaction_rows = 0
 try:
@@ -9080,6 +9101,23 @@ class DB( HydrusDB.HydrusDB ):
 num_rows = content_update.GetNumRows()
 total_content_rows += num_rows
+transaction_rows += num_rows
+been_a_minute = HydrusData.TimeHasPassed( self._transaction_started + 60 )
+been_a_million = transaction_rows > 1000000
+if been_a_minute or been_a_million:
+    job_key.SetVariable( 'popup_text_1', 'committing' )
+    self._Commit()
+    self._BeginImmediate()
+    transaction_rows = 0
+    time.sleep( 0.5 )
 finally:
@@ -21,6 +21,11 @@ def DAEMONCheckExportFolders():
 controller = HG.client_controller
+if HG.daemon_report_mode:
+    HydrusData.ShowText( 'Export folders daemon started: paused = {}'.format( controller.options[ 'pause_export_folders_sync' ] ) )
 if not controller.options[ 'pause_export_folders_sync' ]:
     HG.export_folders_running = True
@@ -31,6 +36,11 @@ def DAEMONCheckExportFolders():
 for name in export_folder_names:
+    if HG.daemon_report_mode:
+        HydrusData.ShowText( 'Export folders daemon running: {}'.format( name ) )
     export_folder = controller.Read( 'serialisable_named', HydrusSerialisable.SERIALISABLE_TYPE_EXPORT_FOLDER, name )
 if controller.options[ 'pause_export_folders_sync' ] or HydrusThreading.IsThreadShuttingDown():
@@ -156,7 +156,12 @@ def ConvertServiceKeysToTagsToServiceKeysToContentUpdates( hashes, service_keys_
 service_keys_to_content_updates = {}
-for ( service_key, tags ) in list(service_keys_to_tags.items()):
+for ( service_key, tags ) in service_keys_to_tags.items():
+    if len( tags ) == 0:
+        continue
 if service_key == CC.LOCAL_TAG_SERVICE_KEY:
@@ -414,8 +419,8 @@ def ReportShutdownException():
 HydrusData.DebugPrint( traceback.format_exc() )
-wx.CallAfter( wx.MessageBox, traceback.format_exc() )
-wx.CallAfter( wx.MessageBox, text )
+wx.SafeShowMessage( 'shutdown error', text )
+wx.SafeShowMessage( 'shutdown error', traceback.format_exc() )
 def ShowExceptionClient( e, do_wait = True ):
@@ -208,6 +208,19 @@ class DuplicateActionOptions( HydrusSerialisable.SerialisableBase ):
+def worth_updating_rating( source_rating, dest_rating ):
+    if source_rating is not None:
+        if dest_rating is None or source_rating > dest_rating:
+            return True
+    return False
 for ( service_key, action ) in self._rating_service_actions:
     content_updates = []
@@ -226,28 +239,18 @@ class DuplicateActionOptions( HydrusSerialisable.SerialisableBase ):
 if action == HC.CONTENT_MERGE_ACTION_TWO_WAY_MERGE:
-    if first_current_value == second_current_value:
-        continue
-    if first_current_value is None and second_current_value is not None:
-        content_updates.append( HydrusData.ContentUpdate( HC.CONTENT_TYPE_RATINGS, HC.CONTENT_UPDATE_ADD, ( second_current_value, first_hashes ) ) )
-    elif first_current_value is not None and second_current_value is None:
+    if worth_updating_rating( first_current_value, second_current_value ):
         content_updates.append( HydrusData.ContentUpdate( HC.CONTENT_TYPE_RATINGS, HC.CONTENT_UPDATE_ADD, ( first_current_value, second_hashes ) ) )
+    elif worth_updating_rating( second_current_value, first_current_value ):
+        content_updates.append( HydrusData.ContentUpdate( HC.CONTENT_TYPE_RATINGS, HC.CONTENT_UPDATE_ADD, ( second_current_value, first_hashes ) ) )
 elif action == HC.CONTENT_MERGE_ACTION_COPY:
-    if first_current_value == second_current_value:
-        continue
-    if first_current_value is None and second_current_value is not None:
+    if worth_updating_rating( second_current_value, first_current_value ):
         content_updates.append( HydrusData.ContentUpdate( HC.CONTENT_TYPE_RATINGS, HC.CONTENT_UPDATE_ADD, ( second_current_value, first_hashes ) ) )
@@ -256,7 +259,7 @@ class DuplicateActionOptions( HydrusSerialisable.SerialisableBase ):
 if second_current_value is not None:
-    if first_current_value is None:
+    if worth_updating_rating( second_current_value, first_current_value ):
         content_updates.append( HydrusData.ContentUpdate( HC.CONTENT_TYPE_RATINGS, HC.CONTENT_UPDATE_ADD, ( second_current_value, first_hashes ) ) )
@@ -313,6 +313,11 @@ class ExportFolder( HydrusSerialisable.SerialisableBaseNamed ):
 try:
+    if HG.daemon_report_mode:
+        HydrusData.ShowText( 'Export folder start check: {} {} {} {}'.format( HydrusData.GetNow(), self._last_checked, self._period, HydrusData.TimeHasPassed( self._last_checked + self._period ) ) )
     if HydrusData.TimeHasPassed( self._last_checked + self._period ):
         if self._path != '' and os.path.exists( self._path ) and os.path.isdir( self._path ):
@@ -34,6 +34,7 @@ from . import ClientPaths
 from . import ClientRendering
 from . import ClientSearch
 from . import ClientServices
+from . import ClientTags
 from . import ClientThreading
 import collections
 import cv2
@@ -97,10 +98,6 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
 self._statusbar_thread_updater = ClientGUICommon.ThreadToGUIUpdater( self._statusbar, self.RefreshStatusBar )
-self._focus_holder = wx.Window( self )
-self._focus_holder.SetSize( ( 0, 0 ) )
 self._closed_pages = []
 self._lock = threading.Lock()
@@ -147,8 +144,6 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
 self._controller.sub( self, 'SetTitle', 'main_gui_title' )
 self._controller.sub( self, 'SyncToTagArchive', 'sync_to_tag_archive' )
-self._menus = {}
 vbox = wx.BoxSizer( wx.VERTICAL )
 vbox.Add( self._notebook, CC.FLAGS_EXPAND_SIZER_BOTH_WAYS )
@@ -1157,11 +1152,9 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
 if name not in self._dirty_menus:
-    ( label, show ) = self._menus[ name ]
-    if show:
-        menu_index = self._FindMenuBarIndex( label, name )
+    menu_index = self._FindMenuBarIndex( name )
+    if menu_index != wx.NOT_FOUND:
         self._menubar.EnableTop( menu_index, False )
@@ -1213,18 +1206,8 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
-def _FindMenuBarIndex( self, label, name ):
+def _FindMenuBarIndex( self, name ):
-    for index in range( self._menubar.GetMenuCount() ):
-        # not GetMenuLabelText, which will ignore '&'
-        if self._menubar.GetMenuLabel( index ) == label:
-            return index
-    # backup wew in case things get muddled
     for index in range( self._menubar.GetMenuCount() ):
         if self._menubar.GetMenu( index ).hydrus_menubar_name == name:
@@ -1233,7 +1216,7 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
-raise HydrusExceptions.DataMissing( 'Menu not found!' )
+return wx.NOT_FOUND
 def _ForceFitAllNonGUITLWs( self ):
@@ -2085,6 +2068,7 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
 gui_actions = wx.Menu()
+ClientGUIMenus.AppendMenuCheckItem( self, gui_actions, 'thumbnail experiment mode', 'Try the new experiment.', HG.thumbnail_experiment_mode, self._SwitchBoolean, 'thumbnail_experiment_mode' )
 ClientGUIMenus.AppendMenuItem( self, gui_actions, 'make some popups', 'Throw some varied popups at the message manager, just to check it is working.', self._DebugMakeSomePopups )
 ClientGUIMenus.AppendMenuItem( self, gui_actions, 'make a popup in five seconds', 'Throw a delayed popup at the message manager, giving you time to minimise or otherwise alter the client before it arrives.', self._controller.CallLater, 5, HydrusData.ShowText, 'This is a delayed popup message.' )
 ClientGUIMenus.AppendMenuItem( self, gui_actions, 'make a modal popup in five seconds', 'Throw up a delayed modal popup to test with. It will stay alive for five seconds.', self._DebugMakeDelayedModalPopup )
@@ -2315,11 +2299,11 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
-def _ImportURL( self, url, service_keys_to_tags = None ):
+def _ImportURL( self, url, service_keys_to_tags = None, destination_page_name = None ):
     if service_keys_to_tags is None:
-        service_keys_to_tags = {}
+        service_keys_to_tags = ClientTags.ServiceKeysToTags()
     url = HG.client_controller.network_engine.domain_manager.NormaliseURL( url )
@@ -2337,30 +2321,30 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
 if url_type in ( HC.URL_TYPE_UNKNOWN, HC.URL_TYPE_FILE, HC.URL_TYPE_POST, HC.URL_TYPE_GALLERY ):
-    page = self._notebook.GetOrMakeURLImportPage()
+    page = self._notebook.GetOrMakeURLImportPage( destination_page_name )
     if page is not None:
         self._notebook.ShowPage( page )
-        page_key = page.GetPageKey()
+        management_panel = page.GetManagementPanel()
-        HG.client_controller.pub( 'pend_url', page_key, url, service_keys_to_tags )
+        management_panel.PendURL( url, service_keys_to_tags = service_keys_to_tags )
         return ( url, '"{}" URL added successfully.'.format( match_name ) )
 elif url_type == HC.URL_TYPE_WATCHABLE:
-    page = self._notebook.GetOrMakeMultipleWatcherPage()
+    page = self._notebook.GetOrMakeMultipleWatcherPage( destination_page_name )
     if page is not None:
         self._notebook.ShowPage( page )
-        page_key = page.GetPageKey()
+        management_panel = page.GetManagementPanel()
-        HG.client_controller.pub( 'pend_url', page_key, url, service_keys_to_tags )
+        management_panel.PendURL( url, service_keys_to_tags = service_keys_to_tags )
         return ( url, '"{}" URL added successfully.'.format( match_name ) )
@@ -2389,8 +2373,6 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
 self._menubar.Append( menu, label )
-self._menus[ name ] = ( label, show )
 def _InitialiseSession( self ):
@@ -3867,6 +3849,10 @@ The password is cleartext here but obscured in the entry dialog. Enter a blank p
 HG.thumbnail_debug_mode = not HG.thumbnail_debug_mode
+elif name == 'thumbnail_experiment_mode':
+    HG.thumbnail_experiment_mode = not HG.thumbnail_experiment_mode
 elif name == 'ui_timer_profile_mode':
     HG.ui_timer_profile_mode = not HG.ui_timer_profile_mode
@@ -4281,7 +4267,7 @@ The password is cleartext here but obscured in the entry dialog. Enter a blank p
 self._DestroyPages( deletee_pages )
-self._focus_holder.SetFocus()
+self._notebook.SetFocus()
 self._controller.pub( 'notify_new_undo' )
@@ -4633,11 +4619,11 @@ The password is cleartext here but obscured in the entry dialog. Enter a blank p
 self._ImportFiles( paths )
-def ImportURLFromAPI( self, url, service_keys_to_tags ):
+def ImportURLFromAPI( self, url, service_keys_to_tags, destination_page_name ):
     try:
-        ( normalised_url, result_text ) = self._ImportURL( url, service_keys_to_tags )
+        ( normalised_url, result_text ) = self._ImportURL( url, service_keys_to_tags = service_keys_to_tags, destination_page_name = destination_page_name )
         return ( normalised_url, result_text )
@@ -4675,9 +4661,9 @@ The password is cleartext here but obscured in the entry dialog. Enter a blank p
-def NewPageImportHDD( self, paths, file_import_options, paths_to_tags, delete_after_success ):
+def NewPageImportHDD( self, paths, file_import_options, paths_to_service_keys_to_tags, delete_after_success ):
-    management_controller = ClientGUIManagement.CreateManagementControllerImportHDD( paths, file_import_options, paths_to_tags, delete_after_success )
+    management_controller = ClientGUIManagement.CreateManagementControllerImportHDD( paths, file_import_options, paths_to_service_keys_to_tags, delete_after_success )
     self._notebook.NewPage( management_controller, on_deepest_notebook = True )
@@ -4707,7 +4693,7 @@ The password is cleartext here but obscured in the entry dialog. Enter a blank p
 if self._notebook.GetNumPages() == 0:
-    self._focus_holder.SetFocus()
+    self._notebook.SetFocus()
 self._DirtyMenu( 'pages' )
@@ -4719,7 +4705,7 @@ The password is cleartext here but obscured in the entry dialog. Enter a blank p
 if self._notebook.GetNumPages() == 0:
-    self._focus_holder.SetFocus()
+    self._notebook.SetFocus()
 self._DestroyPages( ( page, ) )
@@ -4931,45 +4917,55 @@ The password is cleartext here but obscured in the entry dialog. Enter a blank p
 name = self._dirty_menus.pop()
-( old_label, old_show ) = self._menus[ name ]
+( menu_or_none, label, show ) = self._GenerateMenuInfo( name )
-if old_show:
+old_menu_index = self._FindMenuBarIndex( name )
+if old_menu_index == wx.NOT_FOUND:
-    old_menu_index = self._FindMenuBarIndex( old_label, name )
+    if show:
+        menu = menu_or_none
+        insert_index = 0
+        # for every menu that may display, if it is displayed now, bump up insertion index up one
+        for possible_name in MENU_ORDER:
+            if possible_name == name:
+                break
+            possible_menu_index = self._FindMenuBarIndex( possible_name )
+            if possible_menu_index != wx.NOT_FOUND:
+                insert_index += 1
+        self._menubar.Insert( insert_index, menu, label )
-    old_menu = self._menubar.Remove( old_menu_index )
+else:
+    old_menu = self._menubar.GetMenu( old_menu_index )
+    if show:
+        menu = menu_or_none
+        self._menubar.Replace( old_menu_index, menu, label )
+    else:
+        self._menubar.Remove( old_menu_index )
     ClientGUIMenus.DestroyMenu( self, old_menu )
-( menu_or_none, label, show ) = self._GenerateMenuInfo( name )
-if show:
-    menu = menu_or_none
-    insert_index = 0
-    for temp_name in MENU_ORDER:
-        if temp_name == name:
-            break
-        ( temp_label, temp_show ) = self._menus[ temp_name ]
-        if temp_show:
-            insert_index += 1
-    self._menubar.Insert( insert_index, menu, label )
-self._menus[ name ] = ( label, show )
 if len( self._dirty_menus ) > 0:
@@ -346,7 +346,7 @@ class AutoCompleteDropdown( wx.Panel ):
 not_main_gui = tlp.GetParent() is not None
-if not_main_gui or HC.options[ 'always_embed_autocompletes' ]:
+if not_main_gui or HC.options[ 'always_embed_autocompletes' ] or not HC.PLATFORM_WINDOWS:
     self._float_mode = False
@@ -1997,7 +1997,7 @@ class TextAndPasteCtrl( wx.Panel ):
 text = self._text_input.GetValue()
-text = HydrusText.StripTrailingAndLeadingSpaces( text )
+text = HydrusText.StripIOInputLine( text )
 if text == '' and not self._allow_empty_input:
@@ -19,6 +19,7 @@ from . import ClientGUIShortcuts
 from . import ClientGUITime
 from . import ClientGUITopLevelWindows
 from . import ClientImporting
+from . import ClientTags
 from . import ClientThreading
 import collections
 import gc
@@ -135,6 +136,7 @@ class Dialog( wx.Dialog ):
 self.SetIcon( HG.client_controller.frame_icon )
 self.Bind( wx.EVT_BUTTON, self.EventDialogButton )
+self.Bind( wx.EVT_CHAR_HOOK, self.EventCharHook )
 if parent is not None and position == 'center':
@@ -144,6 +146,20 @@ class Dialog( wx.Dialog ):
 HG.client_controller.ResetIdleTimer()
+def EventCharHook( self, event ):
+    ( modifier, key ) = ClientGUIShortcuts.ConvertKeyEventToSimpleTuple( event )
+    if key == wx.WXK_ESCAPE:
+        self.EndModal( wx.ID_CANCEL )
+    else:
+        event.Skip()
 def EventDialogButton( self, event ):
     if self.IsModal():
@@ -175,8 +191,6 @@ class DialogChooseNewServiceMethod( Dialog ):
 Dialog.__init__( self, parent, 'how to set up the account?', position = 'center' )
-self._hidden_cancel = wx.Button( self, id = wx.ID_CANCEL, size = ( 0, 0 ) )
 register_message = 'I want to initialise a new account with the server. I have a registration key (a key starting with \'r\').'
 self._register = wx.Button( self, label = register_message )
@@ -759,11 +773,11 @@ class FrameInputLocalFiles( wx.Frame ):
 file_import_options = self._file_import_options.GetValue()
-paths_to_tags = {}
+paths_to_service_keys_to_tags = collections.defaultdict( ClientTags.ServiceKeysToTags )
 delete_after_success = self._delete_after_success.GetValue()
-HG.client_controller.pub( 'new_hdd_import', self._current_paths, file_import_options, paths_to_tags, delete_after_success )
+HG.client_controller.pub( 'new_hdd_import', self._current_paths, file_import_options, paths_to_service_keys_to_tags, delete_after_success )
 self.Close()
@@ -783,11 +797,11 @@ class FrameInputLocalFiles( wx.Frame ):
 if dlg.ShowModal() == wx.ID_OK:
-    paths_to_tags = panel.GetValue()
+    paths_to_service_keys_to_tags = panel.GetValue()
     delete_after_success = self._delete_after_success.GetValue()
-    HG.client_controller.pub( 'new_hdd_import', self._current_paths, file_import_options, paths_to_tags, delete_after_success )
+    HG.client_controller.pub( 'new_hdd_import', self._current_paths, file_import_options, paths_to_service_keys_to_tags, delete_after_success )
     self.Close()
@@ -1757,8 +1771,6 @@ class DialogSelectImageboard( Dialog ):
 Dialog.__init__( self, parent, 'select imageboard' )
-self._hidden_cancel = wx.Button( self, id = wx.ID_CANCEL, size = ( 0, 0 ) )
 self._tree = wx.TreeCtrl( self )
 self._tree.Bind( wx.EVT_TREE_ITEM_ACTIVATED, self.EventActivate )
@@ -1947,8 +1959,6 @@ class DialogYesNo( Dialog ):
 self._no.SetForegroundColour( ( 128, 0, 0 ) )
 self._no.SetLabelText( no_label )
-self._hidden_cancel = wx.Button( self, id = wx.ID_CANCEL, size = ( 0, 0 ) )
 #
 hbox = wx.BoxSizer( wx.HORIZONTAL )
@@ -2003,8 +2013,6 @@ class DialogYesYesNo( Dialog ):
 self._no.SetForegroundColour( ( 128, 0, 0 ) )
 self._no.SetLabelText( no_label )
-self._hidden_cancel = wx.Button( self, id = wx.ID_CANCEL, size = ( 0, 0 ) )
 #
 hbox = wx.BoxSizer( wx.HORIZONTAL )
@@ -347,8 +347,6 @@ class DialogManageUPnP( ClientGUIDialogs.Dialog ):
 ClientGUIDialogs.Dialog.__init__( self, parent, title )
-self._hidden_cancel = wx.Button( self, id = wx.ID_CANCEL, size = ( 0, 0 ) )
 self._status_st = ClientGUICommon.BetterStaticText( self )
 self._mappings_list_ctrl = ClientGUIListCtrl.SaneListCtrl( self, 480, [ ( 'description', -1 ), ( 'internal ip', 100 ), ( 'internal port', 80 ), ( 'external ip', 100 ), ( 'external port', 80 ), ( 'protocol', 80 ), ( 'lease', 80 ) ], delete_key_callback = self.RemoveMappings, activation_callback = self.EditMappings )
@@ -22,6 +22,7 @@ from . import ClientImportFileSeeds
 from . import ClientImportGallerySeeds
 from . import ClientImportLocal
 from . import ClientImportOptions
+from . import ClientTags
 import collections
 from . import HydrusConstants as HC
 from . import HydrusData
@@ -1495,7 +1496,7 @@ class EditLocalImportFilenameTaggingPanel( ClientGUIScrolledPanels.EditPanel ):
 def GetValue( self ):
-    paths_to_tags = collections.defaultdict( dict )
+    paths_to_service_keys_to_tags = collections.defaultdict( ClientTags.ServiceKeysToTags )
     for page in self._tag_repositories.GetPages():
@@ -1508,11 +1509,11 @@ class EditLocalImportFilenameTaggingPanel( ClientGUIScrolledPanels.EditPanel ):
 continue
-paths_to_tags[ path ][ service_key ] = tags
+paths_to_service_keys_to_tags[ path ][ service_key ] = tags
-return paths_to_tags
+return paths_to_service_keys_to_tags
 class _Panel( wx.Panel ):
@@ -27,6 +27,7 @@ from . import ClientImportWatchers
 from . import ClientMedia
 from . import ClientParsing
 from . import ClientPaths
+from . import ClientTags
 from . import ClientThreading
 from . import HydrusData
 from . import HydrusGlobals as HG
@ -109,19 +110,24 @@ def CreateManagementControllerImportSimpleDownloader():
|
|||
|
||||
return management_controller
|
||||
|
||||
-def CreateManagementControllerImportHDD( paths, file_import_options, paths_to_tags, delete_after_success ):
+def CreateManagementControllerImportHDD( paths, file_import_options, paths_to_service_keys_to_tags, delete_after_success ):
     
     management_controller = CreateManagementController( 'import', MANAGEMENT_TYPE_IMPORT_HDD )
     
-    hdd_import = ClientImportLocal.HDDImport( paths = paths, file_import_options = file_import_options, paths_to_tags = paths_to_tags, delete_after_success = delete_after_success )
+    hdd_import = ClientImportLocal.HDDImport( paths = paths, file_import_options = file_import_options, paths_to_service_keys_to_tags = paths_to_service_keys_to_tags, delete_after_success = delete_after_success )
     
     management_controller.SetVariable( 'hdd_import', hdd_import )
     
     return management_controller
     
-def CreateManagementControllerImportMultipleWatcher( url = None ):
+def CreateManagementControllerImportMultipleWatcher( page_name = None, url = None ):
     
-    management_controller = CreateManagementController( 'watcher', MANAGEMENT_TYPE_IMPORT_MULTIPLE_WATCHER )
+    if page_name is None:
+        
+        page_name = 'watcher'
+        
+    
+    management_controller = CreateManagementController( page_name, MANAGEMENT_TYPE_IMPORT_MULTIPLE_WATCHER )
     
     multiple_watcher_import = ClientImportWatchers.MultipleWatcherImport( url = url )
     
@@ -129,9 +135,14 @@ def CreateManagementControllerImportMultipleWatcher( url = None ):
     
     return management_controller
     
-def CreateManagementControllerImportURLs():
+def CreateManagementControllerImportURLs( page_name = None ):
     
-    management_controller = CreateManagementController( 'url import', MANAGEMENT_TYPE_IMPORT_URLS )
+    if page_name is None:
+        
+        page_name = 'url import'
+        
+    
+    management_controller = CreateManagementController( page_name, MANAGEMENT_TYPE_IMPORT_URLS )
     
     urls_import = ClientImportSimpleURLs.URLsImport()
@@ -598,7 +609,7 @@ class ManagementController( HydrusSerialisable.SerialisableBase ):
             
             paths_to_tags = { path : { bytes.fromhex( service_key ) : tags for ( service_key, tags ) in service_keys_to_tags } for ( path, service_keys_to_tags ) in list(paths_to_tags.items()) }
             
-            hdd_import = ClientImportLocal.HDDImport( paths = paths, file_import_options = file_import_options, paths_to_tags = paths_to_tags, delete_after_success = delete_after_success )
+            hdd_import = ClientImportLocal.HDDImport( paths = paths, file_import_options = file_import_options, paths_to_service_keys_to_tags = paths_to_tags, delete_after_success = delete_after_success )
             
             serialisable_serialisables[ 'hdd_import' ] = hdd_import.GetSerialisableTuple()
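The hunk above serialises a `{ path : { service_key_bytes : tags } }` mapping by hex-encoding the binary service keys and decodes them back with `bytes.fromhex`. A minimal sketch of that round trip (function names here are illustrative, not hydrus's):

```python
def encode_mapping(service_keys_to_tags):
    # bytes keys are not JSON-serialisable, so store them as hex strings;
    # sort the tags so the encoded form is deterministic
    return {service_key.hex(): sorted(tags) for (service_key, tags) in service_keys_to_tags.items()}

def decode_mapping(serialisable):
    # reverse the transform: hex string -> original bytes key
    return {bytes.fromhex(service_key): set(tags) for (service_key, tags) in serialisable.items()}
```

The same `.hex()` / `bytes.fromhex()` pair appears throughout this commit wherever service keys cross a serialisation boundary.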
@@ -877,7 +888,7 @@ class ManagementPanel( wx.lib.scrolledpanel.ScrolledPanel ):
        
        pass
        
    
-def WaitOnDupeFilterJob( job_key, win, callable ):
+def WaitOnDupeFilterJob( job_key ):
    
    while not job_key.IsDone():
        
@@ -891,7 +902,7 @@ def WaitOnDupeFilterJob( job_key, win, callable ):
        
        time.sleep( 1.0 )
        
    
-    HG.client_controller.CallLaterWXSafe( win, 0.0, callable )
+    HG.client_controller.pub( 'refresh_dupe_numbers' )
    
class ManagementPanelDuplicateFilter( ManagementPanel ):
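The change above swaps a direct window callback (`CallLaterWXSafe( win, 0.0, callable )`) for a pub/sub broadcast, so the polling thread no longer needs a reference to any particular panel. A minimal sketch of that decoupling, with a hypothetical stand-in for the controller's pubsub:

```python
import threading
import time

class MiniPubSub:
    # hypothetical stand-in for the hydrus controller's pubsub
    def __init__(self):
        self._subscribers = {}
    def sub(self, topic, callback):
        self._subscribers.setdefault(topic, []).append(callback)
    def pub(self, topic):
        for callback in self._subscribers.get(topic, []):
            callback()

def wait_on_job(job_is_done, pubsub, poll_interval=0.01):
    # poll until the job reports done, then broadcast an event instead of
    # calling back into a specific window object
    while not job_is_done():
        time.sleep(poll_interval)
    pubsub.pub('refresh_dupe_numbers')
```

Any number of panels can subscribe to `'refresh_dupe_numbers'`, and a panel that closes mid-job simply stops listening, rather than leaving the worker holding a dead window reference.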
@@ -1123,11 +1134,13 @@ class ManagementPanelDuplicateFilter( ManagementPanel ):
        
        job_key = ClientThreading.JobKey( cancellable = True )
        
        job_key.SetVariable( 'popup_title', 'initialising' )
        
        self._controller.Write( 'maintain_similar_files_tree', job_key = job_key )
        
+        self._controller.pub( 'modal_message', job_key )
+        
-        self._controller.CallLater( 1.0, WaitOnDupeFilterJob, job_key, self, self.RefreshAndUpdateStatus )
+        self._controller.CallLater( 1.0, WaitOnDupeFilterJob, job_key )
        
    
    def _RefreshAndUpdateStatus( self ):
        
@@ -1143,11 +1156,13 @@ class ManagementPanelDuplicateFilter( ManagementPanel ):
        
        job_key = ClientThreading.JobKey( cancellable = True )
        
        job_key.SetVariable( 'popup_title', 'initialising' )
        
        self._controller.Write( 'maintain_similar_files_phashes', job_key = job_key )
        
+        self._controller.pub( 'modal_message', job_key )
+        
-        self._controller.CallLater( 1.0, WaitOnDupeFilterJob, job_key, self, self.RefreshAndUpdateStatus )
+        self._controller.CallLater( 1.0, WaitOnDupeFilterJob, job_key )
        
    
    def _ResetUnknown( self ):
        
@@ -1163,6 +1178,7 @@ class ManagementPanelDuplicateFilter( ManagementPanel ):
        
        self._controller.Write( 'delete_unknown_duplicate_pairs' )
        
+        self._RefreshAndUpdateStatus()
        
    
@@ -1170,13 +1186,15 @@ class ManagementPanelDuplicateFilter( ManagementPanel ):
        
        job_key = ClientThreading.JobKey( cancellable = True )
        
        job_key.SetVariable( 'popup_title', 'initialising' )
        
        search_distance = self._search_distance_spinctrl.GetValue()
        
        self._controller.Write( 'maintain_similar_files_duplicate_pairs', search_distance, job_key = job_key )
        
+        self._controller.pub( 'modal_message', job_key )
+        
-        self._controller.CallLater( 1.0, WaitOnDupeFilterJob, job_key, self, self.RefreshAndUpdateStatus )
+        self._controller.CallLater( 1.0, WaitOnDupeFilterJob, job_key )
        
    
    def _SetCurrentMediaAs( self, duplicate_type ):
@@ -1268,11 +1286,11 @@ class ManagementPanelDuplicateFilter( ManagementPanel ):
        
        ( num_phashes_to_regen, num_branches_to_regen, searched_distances_to_count, duplicate_types_to_count ) = self._similar_files_maintenance_status
        
        self._cog_button.Enable()
        
+        '''
        ClientGUICommon.SetBitmapButtonBitmap( self._phashes_button, CC.GlobalBMPs.play )
        ClientGUICommon.SetBitmapButtonBitmap( self._branches_button, CC.GlobalBMPs.play )
        ClientGUICommon.SetBitmapButtonBitmap( self._search_button, CC.GlobalBMPs.play )
        
+        '''
+        total_num_files = max( num_phashes_to_regen, sum( searched_distances_to_count.values() ) )
        
        if num_phashes_to_regen == 0:
@@ -2295,7 +2313,6 @@ class ManagementPanelImporterMultipleWatcher( ManagementPanelImporter ):
        
        self._UpdateImportStatus()
        
        HG.client_controller.sub( self, 'PendURL', 'pend_url' )
        HG.client_controller.sub( self, '_ClearExistingHighlightAndPanel', 'clear_multiwatcher_highlights' )
        
    
@@ -2303,7 +2320,7 @@ class ManagementPanelImporterMultipleWatcher( ManagementPanelImporter ):
        
        if service_keys_to_tags is None:
            
-            service_keys_to_tags = {}
+            service_keys_to_tags = ClientTags.ServiceKeysToTags()
            
        
        first_result = None
        
@@ -2882,18 +2899,15 @@ class ManagementPanelImporterMultipleWatcher( ManagementPanelImporter ):
    
-    def PendURL( self, page_key, url, service_keys_to_tags = None ):
+    def PendURL( self, url, service_keys_to_tags = None ):
        
-        if page_key == self._page_key:
+        if service_keys_to_tags is None:
            
-            if service_keys_to_tags is None:
-                
-                service_keys_to_tags = {}
-                
-            
-            self._AddURLs( ( url, ), service_keys_to_tags )
+            service_keys_to_tags = ClientTags.ServiceKeysToTags()
            
        
+        self._AddURLs( ( url, ), service_keys_to_tags )
        
    
    def SetSearchFocus( self ):
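A pattern this commit applies in many places: a parameter defaults to `None`, and the real container (here `ClientTags.ServiceKeysToTags`, a serialisable dict subclass) is constructed inside the function. A sketch with a hypothetical stand-in class, showing why this beats a mutable default argument:

```python
class ServiceKeysToTagsSketch(dict):
    # stand-in for hydrus's ClientTags.ServiceKeysToTags: a dict subclass
    # the serialisation system can recognise and persist
    pass

def pend_url(url, service_keys_to_tags=None):
    # default to None rather than to a mutable default argument, then
    # build the richer container here so every caller gets a fresh,
    # correctly typed object
    if service_keys_to_tags is None:
        service_keys_to_tags = ServiceKeysToTagsSketch()
    return (url, service_keys_to_tags)
```

Had the signature been `service_keys_to_tags = ServiceKeysToTagsSketch()`, a single instance would be shared by every call that omits the argument, so tags pended for one URL could silently leak into another.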
@@ -3414,14 +3428,12 @@ class ManagementPanelImporterURLs( ManagementPanelImporter ):
        
        self._UpdateImportStatus()
        
        HG.client_controller.sub( self, 'PendURL', 'pend_url' )
        
    
    def _PendURLs( self, urls, service_keys_to_tags = None ):
        
        if service_keys_to_tags is None:
            
-            service_keys_to_tags = {}
+            service_keys_to_tags = ClientTags.ServiceKeysToTags()
            
        
        urls = [ url for url in urls if url.startswith( 'http' ) ]
        
@@ -3466,18 +3478,15 @@ class ManagementPanelImporterURLs( ManagementPanelImporter ):
        
        self._UpdateImportStatus()
        
    
-    def PendURL( self, page_key, url, service_keys_to_tags = None ):
+    def PendURL( self, url, service_keys_to_tags = None ):
        
-        if page_key == self._page_key:
+        if service_keys_to_tags is None:
            
-            if service_keys_to_tags is None:
-                
-                service_keys_to_tags = {}
-                
-            
-            self._PendURLs( ( url, ), service_keys_to_tags )
+            service_keys_to_tags = ClientTags.ServiceKeysToTags()
            
        
+        self._PendURLs( ( url, ), service_keys_to_tags )
        
    
    def SetSearchFocus( self ):
@@ -357,7 +357,7 @@ class MediaPanel( ClientMedia.ListeningMediaList, wx.ScrolledCanvas ):
    
    def __init__( self, parent, page_key, file_service_key, media_results ):
        
-        wx.ScrolledCanvas.__init__( self, parent, size = ( 0, 0 ), style = wx.BORDER_SUNKEN )
+        wx.ScrolledCanvas.__init__( self, parent, size = ( 20, 20 ), style = wx.BORDER_SUNKEN )
        ClientMedia.ListeningMediaList.__init__( self, file_service_key, media_results )
        
        self._UpdateBackgroundColour()
        
@@ -2338,7 +2338,7 @@ class MediaPanelThumbnails( MediaPanel ):
        
        MediaPanel.__init__( self, parent, page_key, file_service_key, media_results )
        
-        self._last_client_size = ( 0, 0 )
+        self._last_client_size = ( 20, 20 )
        self._num_columns = 1
        
        self._drag_init_coordinates = None
@@ -485,6 +485,11 @@ class Page( wx.SplitterWindow ):
        
        return self._management_controller
        
    
+    def GetManagementPanel( self ):
+        
+        return self._management_panel
+        
+    
    # used by autocomplete
    def GetMedia( self ):
@@ -1898,14 +1903,33 @@ class PagesNotebook( wx.Notebook ):
    
-    def GetOrMakeMultipleWatcherPage( self ):
+    def GetOrMakeMultipleWatcherPage( self, desired_page_name = None ):
        
-        current_page = self.GetCurrentPage()
+        current_media_page = self.GetCurrentMediaPage()
        
-        if current_page is not None and isinstance( current_page, Page ) and current_page.IsMultipleWatcherPage():
+        if current_media_page is not None and current_media_page.IsMultipleWatcherPage():
            
-            return current_page
+            has_wrong_name = desired_page_name is not None and current_media_page.GetName() != desired_page_name
+            
+            if not has_wrong_name:
+                
+                return current_media_page
+                
            
        
+        # first do a search for the page name, if asked for
+        
+        if desired_page_name is not None:
+            
+            page = self._GetPageFromName( desired_page_name )
+            
+            if page is not None and page.IsMultipleWatcherPage():
+                
+                return page
+                
+            
+        
+        # failing that, find/generate one with default name
+        
        for page in self._GetPages():
            
@@ -1913,7 +1937,7 @@ class PagesNotebook( wx.Notebook ):
            
            if page.HasMultipleWatcherPage():
                
-                return page.GetOrMakeMultipleWatcherPage()
+                return page.GetOrMakeMultipleWatcherPage( desired_page_name = desired_page_name )
                
            elif page.IsMultipleWatcherPage():
                
@@ -1924,10 +1948,36 @@ class PagesNotebook( wx.Notebook ):
        
        # import page does not exist
        
-        return self.NewPageImportMultipleWatcher( on_deepest_notebook = True )
+        return self.NewPageImportMultipleWatcher( page_name = desired_page_name, on_deepest_notebook = True )
        
    
-    def GetOrMakeURLImportPage( self ):
+    def GetOrMakeURLImportPage( self, desired_page_name = None ):
        
+        current_media_page = self.GetCurrentMediaPage()
+        
+        if current_media_page is not None and current_media_page.IsURLImportPage():
+            
+            has_wrong_name = desired_page_name is not None and current_media_page.GetName() != desired_page_name
+            
+            if not has_wrong_name:
+                
+                return current_media_page
+                
+            
+        
+        # first do a search for the page name, if asked for
+        
+        if desired_page_name is not None:
+            
+            page = self._GetPageFromName( desired_page_name )
+            
+            if page is not None and page.IsURLImportPage():
+                
+                return page
+                
+            
+        
+        # failing that, find/generate one with default name
+        
        for page in self._GetPages():
            
@@ -1935,7 +1985,7 @@ class PagesNotebook( wx.Notebook ):
            
            if page.HasURLImportPage():
                
-                return page.GetOrMakeURLImportPage()
+                return page.GetOrMakeURLImportPage( desired_page_name = desired_page_name )
                
            elif page.IsURLImportPage():
                
@@ -1946,7 +1996,7 @@ class PagesNotebook( wx.Notebook ):
        
        # import page does not exist
        
-        return self.NewPageImportURLs( on_deepest_notebook = True )
+        return self.NewPageImportURLs( page_name = desired_page_name, on_deepest_notebook = True )
        
    
    def GetPageKey( self ):
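The two `GetOrMake…Page` methods above now resolve a destination page in a fixed order: the currently focused page (if it is the right type and, when a name was requested, has that name), then any existing page with the desired name, then any page of the right type, and only then a freshly created page. A condensed sketch of that resolution order, with hypothetical page objects standing in for hydrus's `Page`:

```python
def get_or_make_page(pages, current, desired_name, is_right_type, make_page):
    # 1. prefer the currently focused page if it qualifies
    if current is not None and is_right_type(current):
        if desired_name is None or current.name == desired_name:
            return current
    # 2. otherwise search for an existing page with the requested name
    if desired_name is not None:
        for page in pages:
            if page.name == desired_name and is_right_type(page):
                return page
    # 3. failing that, settle for any page of the right type
    for page in pages:
        if is_right_type(page):
            return page
    # 4. nothing suitable exists, so create one
    return make_page(desired_name)
```

This is what lets the new client api's `destination_page_name` parameter route a URL to a named watcher page, creating it only when no match exists.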
@@ -2114,11 +2164,21 @@ class PagesNotebook( wx.Notebook ):
        
        return False
        
    
+    def IsMultipleWatcherPage( self ):
+        
+        return False
+        
+    
    def IsImporter( self ):
        
        return False
        
    
+    def IsURLImportPage( self ):
+        
+        return False
+        
+    
    def LoadGUISession( self, name ):
        
        if self.GetPageCount() > 0:
            
@@ -2356,16 +2416,16 @@ class PagesNotebook( wx.Notebook ):
        
        return self.NewPage( management_controller, on_deepest_notebook = on_deepest_notebook )
        
    
-    def NewPageImportMultipleWatcher( self, url = None, on_deepest_notebook = False ):
+    def NewPageImportMultipleWatcher( self, page_name = None, url = None, on_deepest_notebook = False ):
        
-        management_controller = ClientGUIManagement.CreateManagementControllerImportMultipleWatcher( url )
+        management_controller = ClientGUIManagement.CreateManagementControllerImportMultipleWatcher( page_name = page_name, url = url )
        
        return self.NewPage( management_controller, on_deepest_notebook = on_deepest_notebook )
        
    
-    def NewPageImportURLs( self, on_deepest_notebook = False ):
+    def NewPageImportURLs( self, page_name = None, on_deepest_notebook = False ):
        
-        management_controller = ClientGUIManagement.CreateManagementControllerImportURLs()
+        management_controller = ClientGUIManagement.CreateManagementControllerImportURLs( page_name = page_name )
        
        return self.NewPage( management_controller, on_deepest_notebook = on_deepest_notebook )
@@ -854,9 +854,9 @@ class ManageClientServicesPanel( ClientGUIScrolledPanels.ManagePanel ):
                
                network_bytes = network_job.GetContentBytes()
                
-                hydrus_args = HydrusNetwork.ParseNetworkBytesToHydrusArgs( network_bytes )
+                parsed_request_args = HydrusNetwork.ParseNetworkBytesToParsedHydrusArgs( network_bytes )
                
-                access_key_encoded = hydrus_args[ 'access_key' ].hex()
+                access_key_encoded = parsed_request_args[ 'access_key' ].hex()
                
                wx.CallAfter( wx_setkey, access_key_encoded )
@@ -272,7 +272,7 @@ def GenerateShapePerceptualHashes( path, mime ):
    
    return phashes
    
-def GenerateThumbnailFromStaticImageCV( path, dimensions = HC.UNSCALED_THUMBNAIL_DIMENSIONS, mime = None ):
+def GenerateThumbnailFileBytesFromStaticImagePathCV( path, dimensions = HC.UNSCALED_THUMBNAIL_DIMENSIONS, mime = None ):
    
    if mime is None:
        
@@ -281,7 +281,7 @@ def GenerateThumbnailFromStaticImageCV( path, dimensions = HC.UNSCALED_THUMBNAIL
    
    if mime == HC.IMAGE_GIF:
        
-        return HydrusFileHandling.GenerateThumbnailFromStaticImagePIL( path, dimensions, mime )
+        return HydrusFileHandling.GenerateThumbnailFileBytesFromStaticImagePathPIL( path, dimensions, mime )
        
    
    numpy_image = GenerateNumpyImage( path, mime )
    
@@ -324,12 +324,18 @@ def GenerateThumbnailFromStaticImageCV( path, dimensions = HC.UNSCALED_THUMBNAIL
    
    else:
        
-        return HydrusFileHandling.GenerateThumbnailFromStaticImagePIL( path, dimensions, mime )
+        return HydrusFileHandling.GenerateThumbnailFileBytesFromStaticImagePathPIL( path, dimensions, mime )
        
    
from . import HydrusFileHandling

-HydrusFileHandling.GenerateThumbnailFromStaticImage = GenerateThumbnailFromStaticImageCV
+HydrusFileHandling.GenerateThumbnailFileBytesFromStaticImagePath = GenerateThumbnailFileBytesFromStaticImagePathCV

+def GetNumPyImageResolution( numpy_image ):
+    
+    ( image_y, image_x, depth ) = numpy_image.shape
+    
+    return ( image_x, image_y )
+    
def ResizeNumpyImage( mime, numpy_image, target_resolution ):
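The final lines of the hunk above assign the OpenCV-backed function onto the `HydrusFileHandling` module at import time, so that lower-level module can call the richer implementation without importing the client-side module, sidestepping a circular import. A sketch of that late-binding pattern, using `SimpleNamespace` objects as stand-ins for the two modules:

```python
import types

# stand-in for the low-level module (imported first, no client deps)
file_handling = types.SimpleNamespace()

def generate_thumbnail_pil(path):
    # the plain default implementation the low-level module ships with
    return ('pil', path)

file_handling.generate_thumbnail = generate_thumbnail_pil

def generate_thumbnail_cv(path):
    # the richer implementation living in a higher-level module
    return ('cv', path)

# at the higher module's import time, overwrite the attribute; callers
# that look the function up through the module (not a direct reference)
# transparently pick up the upgraded version
file_handling.generate_thumbnail = generate_thumbnail_cv
```

The catch, visible in the real code too, is that the substitution only takes effect for call sites that resolve `module.function` at call time, which is why the low-level module calls through its own attribute rather than holding the function directly.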
@@ -1,9 +1,11 @@
from . import ClientConstants as CC
+from . import ClientData
from . import ClientImageHandling
from . import ClientImporting
from . import ClientNetworkingDomain
from . import ClientParsing
from . import ClientPaths
+from . import ClientTags
import collections
from . import HydrusConstants as HC
from . import HydrusData

@@ -304,7 +306,7 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):
    
    SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_FILE_SEED
    SERIALISABLE_NAME = 'File Import'
-    SERIALISABLE_VERSION = 2
+    SERIALISABLE_VERSION = 3
    
    def __init__( self, file_seed_type = None, file_seed_data = None ):
        
@@ -331,6 +333,8 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):
        
        self._referral_url = None
        
+        self._fixed_service_keys_to_tags = ClientTags.ServiceKeysToTags()
+        
        self._urls = set()
        self._tags = set()
        self._hashes = {}
        
@@ -358,16 +362,20 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):
    
    def _GetSerialisableInfo( self ):
        
+        serialisable_fixed_service_keys_to_tags = self._fixed_service_keys_to_tags.GetSerialisableTuple()
+        
        serialisable_urls = list( self._urls )
        serialisable_tags = list( self._tags )
        serialisable_hashes = [ ( hash_type, hash.hex() ) for ( hash_type, hash ) in list(self._hashes.items()) if hash is not None ]
        
-        return ( self.file_seed_type, self.file_seed_data, self.created, self.modified, self.source_time, self.status, self.note, self._referral_url, serialisable_urls, serialisable_tags, serialisable_hashes )
+        return ( self.file_seed_type, self.file_seed_data, self.created, self.modified, self.source_time, self.status, self.note, self._referral_url, serialisable_fixed_service_keys_to_tags, serialisable_urls, serialisable_tags, serialisable_hashes )
        
    
    def _InitialiseFromSerialisableInfo( self, serialisable_info ):
        
-        ( self.file_seed_type, self.file_seed_data, self.created, self.modified, self.source_time, self.status, self.note, self._referral_url, serialisable_urls, serialisable_tags, serialisable_hashes ) = serialisable_info
+        ( self.file_seed_type, self.file_seed_data, self.created, self.modified, self.source_time, self.status, self.note, self._referral_url, serialisable_fixed_service_keys_to_tags, serialisable_urls, serialisable_tags, serialisable_hashes ) = serialisable_info
+        
+        self._fixed_service_keys_to_tags = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_fixed_service_keys_to_tags )
        
        self._urls = set( serialisable_urls )
        self._tags = set( serialisable_tags )
        
@@ -445,6 +453,19 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):
            
            return ( 2, new_serialisable_info )
            
        
+        if version == 2:
+            
+            ( file_seed_type, file_seed_data, created, modified, source_time, status, note, referral_url, serialisable_urls, serialisable_tags, serialisable_hashes ) = old_serialisable_info
+            
+            fixed_service_keys_to_tags = ClientTags.ServiceKeysToTags()
+            
+            serialisable_fixed_service_keys_to_tags = fixed_service_keys_to_tags.GetSerialisableTuple()
+            
+            new_serialisable_info = ( file_seed_type, file_seed_data, created, modified, source_time, status, note, referral_url, serialisable_fixed_service_keys_to_tags, serialisable_urls, serialisable_tags, serialisable_hashes )
+            
+            return ( 3, new_serialisable_info )
+            
+        
    
    def AddParseResults( self, parse_results, file_import_options ):
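`FileSeed` above bumps `SERIALISABLE_VERSION` from 2 to 3 and adds a `version == 2` branch to `_UpdateSerialisableInfo` that slots an empty fixed-tags object into old tuples. A generic sketch of that chained-upgrade pattern (simplified tuple shape, illustrative names):

```python
CURRENT_VERSION = 3

def update_serialisable_info(version, info):
    # one upgrade step per historical version; each step returns the next
    # version number and the reshaped tuple
    if version == 2:
        (seed_type, seed_data, urls, tags) = info
        fixed_tags = {}  # new slot, empty for data written before v3
        return (3, (seed_type, seed_data, fixed_tags, urls, tags))
    raise ValueError('no upgrade path from version {}'.format(version))

def load(stored_version, info):
    # walk the upgrade chain until the tuple matches the current schema,
    # so a v1 object saved years ago still loads after several bumps
    while stored_version < CURRENT_VERSION:
        (stored_version, info) = update_serialisable_info(stored_version, info)
    return info
```

Because each step only knows about the version immediately before it, adding a field later means writing one new branch rather than touching every older one.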
@@ -814,6 +835,8 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):
                
                HydrusPaths.CleanUpTempPath( os_file_handle, temp_path )
                
            
+            self.WriteContentUpdates()
+            
        except HydrusExceptions.MimeException as e:
            
            self.SetStatus( CC.STATUS_ERROR, exception = e )
            
@@ -924,6 +947,11 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):
    
+    def SetFixedServiceKeysToTags( self, service_keys_to_tags ):
+        
+        self._fixed_service_keys_to_tags = ClientTags.ServiceKeysToTags( service_keys_to_tags )
+        
+    
    def SetHash( self, hash ):
        
        if hash is not None:
            
@@ -1287,6 +1315,9 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):
            
            return did_work
            
        
+        # changed this to say that urls alone are not 'did work' since all url results are doing this, and when they have no tags, they are usually superfast db hits anyway
+        # better to scream through an 'already in db' import list that flicker
+        
        service_keys_to_content_updates = collections.defaultdict( list )
        
        urls = set( self._urls )
        
@@ -1314,18 +1345,25 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):
            
            in_inbox = HG.client_controller.Read( 'in_inbox', hash )
            
-            for ( service_key, content_updates ) in list(tag_import_options.GetServiceKeysToContentUpdates( self.status, in_inbox, hash, set( self._tags ) ).items()):
+            for ( service_key, content_updates ) in tag_import_options.GetServiceKeysToContentUpdates( self.status, in_inbox, hash, set( self._tags ) ).items():
                
                service_keys_to_content_updates[ service_key ].extend( content_updates )
                
                did_work = True
                
            
        
+        for ( service_key, content_updates ) in ClientData.ConvertServiceKeysToTagsToServiceKeysToContentUpdates( ( hash, ), self._fixed_service_keys_to_tags ).items():
+            
+            service_keys_to_content_updates[ service_key ].extend( content_updates )
+            
+            did_work = True
+            
+        
        if len( service_keys_to_content_updates ) > 0:
            
            HG.client_controller.WriteSynchronous( 'content_updates', service_keys_to_content_updates )
            
-            did_work = True
            
        
        return did_work
@@ -2,6 +2,7 @@ from . import ClientConstants as CC
from . import ClientImporting
from . import ClientNetworkingDomain
from . import ClientParsing
+from . import ClientTags
import collections
from . import HydrusConstants as HC
from . import HydrusData

@@ -67,7 +68,7 @@ class GallerySeed( HydrusSerialisable.SerialisableBase ):
    
    SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_GALLERY_SEED
    SERIALISABLE_NAME = 'Gallery Log Entry'
-    SERIALISABLE_VERSION = 1
+    SERIALISABLE_VERSION = 2
    
    def __init__( self, url = None, can_generate_more_pages = True ):
        
@@ -85,6 +86,8 @@ class GallerySeed( HydrusSerialisable.SerialisableBase ):
        
        self.url = url
        self._can_generate_more_pages = can_generate_more_pages
        
+        self._fixed_service_keys_to_tags = ClientTags.ServiceKeysToTags()
+        
        self.created = HydrusData.GetNow()
        self.modified = self.created
        self.status = CC.STATUS_UNKNOWN
        
@@ -112,12 +115,16 @@ class GallerySeed( HydrusSerialisable.SerialisableBase ):
    
    def _GetSerialisableInfo( self ):
        
-        return ( self.url, self._can_generate_more_pages, self.created, self.modified, self.status, self.note, self._referral_url )
+        serialisable_fixed_service_keys_to_tags = self._fixed_service_keys_to_tags.GetSerialisableTuple()
+        
+        return ( self.url, self._can_generate_more_pages, serialisable_fixed_service_keys_to_tags, self.created, self.modified, self.status, self.note, self._referral_url )
        
    
    def _InitialiseFromSerialisableInfo( self, serialisable_info ):
        
-        ( self.url, self._can_generate_more_pages, self.created, self.modified, self.status, self.note, self._referral_url ) = serialisable_info
+        ( self.url, self._can_generate_more_pages, serialisable_fixed_service_keys_to_tags, self.created, self.modified, self.status, self.note, self._referral_url ) = serialisable_info
+        
+        self._fixed_service_keys_to_tags = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_fixed_service_keys_to_tags )
        
    
    def _UpdateModified( self ):
        
@@ -125,6 +132,21 @@ class GallerySeed( HydrusSerialisable.SerialisableBase ):
        
        self.modified = HydrusData.GetNow()
        
    
+    def _UpdateSerialisableInfo( self, version, old_serialisable_info ):
+        
+        if version == 1:
+            
+            ( url, can_generate_more_pages, created, modified, status, note, referral_url ) = old_serialisable_info
+            
+            fixed_service_keys_to_tags = ClientTags.ServiceKeysToTags()
+            
+            serialisable_fixed_service_keys_to_tags = fixed_service_keys_to_tags.GetSerialisableTuple()
+            
+            new_serialisable_info = ( url, can_generate_more_pages, serialisable_fixed_service_keys_to_tags, created, modified, status, note, referral_url )
+            
+            return ( 2, new_serialisable_info )
+            
+        
+    
    def ForceNextPageURLGeneration( self ):
        
        self._force_next_page_url_generation = True
        
@@ -151,6 +173,11 @@ class GallerySeed( HydrusSerialisable.SerialisableBase ):
        
        return network_job
        
    
+    def SetFixedServiceKeysToTags( self, service_keys_to_tags ):
+        
+        self._fixed_service_keys_to_tags = ClientTags.ServiceKeysToTags( service_keys_to_tags )
+        
+    
    def SetReferralURL( self, referral_url ):
        
        self._referral_url = referral_url
        
@@ -271,6 +298,11 @@ class GallerySeed( HydrusSerialisable.SerialisableBase ):
        
        file_seeds = ClientImporting.ConvertAllParseResultsToFileSeeds( all_parse_results, self.url, file_import_options )
        
+        for file_seed in file_seeds:
+            
+            file_seed.SetFixedServiceKeysToTags( self._fixed_service_keys_to_tags )
+            
+        
        num_urls_total = len( file_seeds )
        
        ( num_urls_added, num_urls_already_in_file_seed_cache, can_search_for_more_files, stop_reason ) = file_seeds_callable( file_seeds )
        
@@ -359,6 +391,11 @@ class GallerySeed( HydrusSerialisable.SerialisableBase ):
        
        next_gallery_seeds = [ GallerySeed( next_page_url ) for next_page_url in new_next_page_urls ]
        
+        for next_gallery_seed in next_gallery_seeds:
+            
+            next_gallery_seed.SetFixedServiceKeysToTags( self._fixed_service_keys_to_tags )
+            
+        
        gallery_seed_log.AddGallerySeeds( next_gallery_seeds )
        
        added_new_gallery_pages = True
@@ -5,6 +5,7 @@ from . import ClientImporting
from . import ClientImportFileSeeds
from . import ClientImportOptions
from . import ClientPaths
+from . import ClientTags
from . import ClientThreading
from . import HydrusConstants as HC
from . import HydrusData

@@ -21,9 +22,9 @@ class HDDImport( HydrusSerialisable.SerialisableBase ):
    
    SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_HDD_IMPORT
    SERIALISABLE_NAME = 'Local File Import'
-    SERIALISABLE_VERSION = 1
+    SERIALISABLE_VERSION = 2
    
-    def __init__( self, paths = None, file_import_options = None, paths_to_tags = None, delete_after_success = None ):
+    def __init__( self, paths = None, file_import_options = None, paths_to_service_keys_to_tags = None, delete_after_success = None ):
        
        HydrusSerialisable.SerialisableBase.__init__( self )
        
@@ -52,6 +53,11 @@ class HDDImport( HydrusSerialisable.SerialisableBase ):
                
                pass
                
            
+            if path in paths_to_service_keys_to_tags:
+                
+                file_seed.SetFixedServiceKeysToTags( paths_to_service_keys_to_tags[ path ] )
+                
+            
            file_seeds.append( file_seed )
            
@@ -59,7 +65,6 @@ class HDDImport( HydrusSerialisable.SerialisableBase ):
        
        self._file_import_options = file_import_options
-        self._paths_to_tags = paths_to_tags
        self._delete_after_success = delete_after_success
        
        self._current_action = ''
        
@@ -76,18 +81,44 @@ class HDDImport( HydrusSerialisable.SerialisableBase ):
        
        serialisable_file_seed_cache = self._file_seed_cache.GetSerialisableTuple()
        serialisable_options = self._file_import_options.GetSerialisableTuple()
-        serialisable_paths_to_tags = { path : { service_key.hex() : tags for ( service_key, tags ) in list(service_keys_to_tags.items()) } for ( path, service_keys_to_tags ) in list(self._paths_to_tags.items()) }
        
-        return ( serialisable_file_seed_cache, serialisable_options, serialisable_paths_to_tags, self._delete_after_success, self._paused )
+        return ( serialisable_file_seed_cache, serialisable_options, self._delete_after_success, self._paused )
        
    
    def _InitialiseFromSerialisableInfo( self, serialisable_info ):
        
-        ( serialisable_file_seed_cache, serialisable_options, serialisable_paths_to_tags, self._delete_after_success, self._paused ) = serialisable_info
+        ( serialisable_file_seed_cache, serialisable_options, self._delete_after_success, self._paused ) = serialisable_info
        
        self._file_seed_cache = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_file_seed_cache )
        self._file_import_options = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_options )
-        self._paths_to_tags = { path : { bytes.fromhex( service_key ) : tags for ( service_key, tags ) in list(service_keys_to_tags.items()) } for ( path, service_keys_to_tags ) in list(serialisable_paths_to_tags.items()) }
        
    
+    def _UpdateSerialisableInfo( self, version, old_serialisable_info ):
+        
+        if version == 1:
+            
+            ( serialisable_file_seed_cache, serialisable_options, serialisable_paths_to_tags, delete_after_success, paused ) = old_serialisable_info
+            
+            file_seed_cache = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_file_seed_cache )
+            
+            paths_to_service_keys_to_tags = { path : { bytes.fromhex( service_key ) : tags for ( service_key, tags ) in service_keys_to_tags.items() } for ( path, service_keys_to_tags ) in serialisable_paths_to_tags.items() }
+            
+            for file_seed in file_seed_cache.GetFileSeeds():
+                
+                path = file_seed.file_seed_data
+                
+                if path in paths_to_service_keys_to_tags:
+                    
+                    file_seed.SetFixedServiceKeysToTags( paths_to_service_keys_to_tags[ path ] )
+                    
+                
+            
+            serialisable_file_seed_cache = file_seed_cache.GetSerialisableTuple()
+            
+            new_serialisable_info = ( serialisable_file_seed_cache, serialisable_options, delete_after_success, paused )
+            
+            return ( 2, new_serialisable_info )
+            
+        
+    
    def _WorkOnFiles( self, page_key ):
        
@@ -103,18 +134,6 @@ class HDDImport( HydrusSerialisable.SerialisableBase ):
        
        path = file_seed.file_seed_data
        
-        with self._lock:
-            
-            if path in self._paths_to_tags:
-                
-                service_keys_to_tags = self._paths_to_tags[ path ]
-                
-            else:
-                
-                service_keys_to_tags = {}
-                
-            
-        
        with self._lock:
            
            self._current_action = 'importing'
            
@@ -126,20 +145,6 @@ class HDDImport( HydrusSerialisable.SerialisableBase ):
        
        if file_seed.status in CC.SUCCESSFUL_IMPORT_STATES:
            
-            if file_seed.HasHash():
-                
-                hash = file_seed.GetHash()
-                
-                service_keys_to_content_updates = ClientData.ConvertServiceKeysToTagsToServiceKeysToContentUpdates( { hash }, service_keys_to_tags )
-                
-                if len( service_keys_to_content_updates ) > 0:
-                    
-                    HG.client_controller.WriteSynchronous( 'content_updates', service_keys_to_content_updates )
-                    
-                    did_substantial_work = True
-                    
-                
-            
            if file_seed.ShouldPresent( self._file_import_options ):
                
                file_seed.PresentToPage( page_key )
                
@@ -610,7 +615,7 @@ class ImportFolder( HydrusSerialisable.SerialisableBaseNamed ):
            
-            service_keys_to_tags = {}
+            service_keys_to_tags = ClientTags.ServiceKeysToTags()
            
            for ( tag_service_key, filename_tagging_options ) in list(self._tag_service_keys_to_filename_tagging_options.items()):
@@ -1086,7 +1086,7 @@ class TagImportOptions( HydrusSerialisable.SerialisableBase ):
        
        parsed_tags = HydrusTags.CleanTags( parsed_tags )
        
-        service_keys_to_tags = collections.defaultdict( set )
+        service_keys_to_tags = ClientTags.ServiceKeysToTags()
        
        for ( service_key, service_tag_import_options ) in list(self._service_keys_to_service_tag_import_options.items()):
@@ -5,6 +5,7 @@ from . import ClientImporting
from . import ClientImportFileSeeds
from . import ClientImportGallerySeeds
from . import ClientImportOptions
+from . import ClientTags
from . import HydrusConstants as HC
from . import HydrusData
from . import HydrusExceptions

@@ -941,7 +942,7 @@ class URLsImport( HydrusSerialisable.SerialisableBase ):
        
        if service_keys_to_tags is None:
            
-            service_keys_to_tags = {}
+            service_keys_to_tags = ClientTags.ServiceKeysToTags()
            
        
        with self._lock:
            
@@ -960,6 +961,8 @@ class URLsImport( HydrusSerialisable.SerialisableBase ):
                
                file_seed = ClientImportFileSeeds.FileSeed( ClientImportFileSeeds.FILE_SEED_TYPE_URL, url )
                
+                file_seed.SetFixedServiceKeysToTags( service_keys_to_tags )
+                
                file_seeds.append( file_seed )
                
            else:
                
@@ -968,6 +971,8 @@ class URLsImport( HydrusSerialisable.SerialisableBase ):
                
                gallery_seed = ClientImportGallerySeeds.GallerySeed( url, can_generate_more_pages = can_generate_more_pages )
                
+                gallery_seed.SetFixedServiceKeysToTags( service_keys_to_tags )
+                
                gallery_seeds.append( gallery_seed )
@ -6,6 +6,7 @@ from . import ClientImportFileSeeds
|
|||
from . import ClientImportGallerySeeds
|
||||
from . import ClientNetworkingJobs
|
||||
from . import ClientParsing
|
||||
from . import ClientTags
|
||||
import collections
|
||||
from . import HydrusConstants as HC
|
||||
from . import HydrusData
|
||||
|
@ -197,18 +198,13 @@ class MultipleWatcherImport( HydrusSerialisable.SerialisableBase ):
|
|||
|
||||
|
||||
|
||||
def AddURL( self, url, service_keys_to_tags ):
|
||||
def AddURL( self, url, service_keys_to_tags = None ):
|
||||
|
||||
if url == '':
|
||||
|
||||
return None
|
||||
|
||||
|
||||
if service_keys_to_tags is None:
|
||||
|
||||
service_keys_to_tags = {}
|
||||
|
||||
|
||||
url = HG.client_controller.network_engine.domain_manager.NormaliseURL( url )
|
||||
|
||||
with self._lock:
|
||||
|
@ -229,6 +225,11 @@ class MultipleWatcherImport( HydrusSerialisable.SerialisableBase ):
|
|||
|
||||
watcher.SetURL( url )
|
||||
|
||||
if service_keys_to_tags is not None:
|
||||
|
||||
watcher.SetFixedServiceKeysToTags( service_keys_to_tags )
|
||||
|
||||
|
||||
watcher.SetCheckerOptions( self._checker_options )
|
||||
watcher.SetFileImportOptions( self._file_import_options )
|
||||
watcher.SetTagImportOptions( self._tag_import_options )
|
||||
|
@ -499,7 +500,7 @@ class WatcherImport( HydrusSerialisable.SerialisableBase ):
|
|||
|
||||
SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_WATCHER_IMPORT
|
||||
SERIALISABLE_NAME = 'Watcher'
|
||||
SERIALISABLE_VERSION = 6
|
||||
SERIALISABLE_VERSION = 7
|
||||
|
||||
MIN_CHECK_PERIOD = 30
|
||||
|
||||
|
@ -515,8 +516,8 @@ class WatcherImport( HydrusSerialisable.SerialisableBase ):
|
|||
self._gallery_seed_log = ClientImportGallerySeeds.GallerySeedLog()
|
||||
self._file_seed_cache = ClientImportFileSeeds.FileSeedCache()
|
||||
|
||||
self._urls_to_filenames = {}
|
||||
self._urls_to_md5_base64 = {}
|
||||
self._fixed_service_keys_to_tags = ClientTags.ServiceKeysToTags()
|
||||
|
||||
self._checker_options = HG.client_controller.new_options.GetDefaultWatcherCheckerOptions()
|
||||
self._file_import_options = HG.client_controller.new_options.GetDefaultFileImportOptions( 'loud' )
|
||||
self._tag_import_options = ClientImportOptions.TagImportOptions( is_default = True )
|
||||
|
@ -600,6 +601,8 @@ class WatcherImport( HydrusSerialisable.SerialisableBase ):
|
|||
|
||||
gallery_seed = ClientImportGallerySeeds.GallerySeed( self._url, can_generate_more_pages = False )
|
||||
|
||||
gallery_seed.SetFixedServiceKeysToTags( self._fixed_service_keys_to_tags )
|
||||
|
||||
self._gallery_seed_log.AddGallerySeeds( ( gallery_seed, ) )
|
||||
|
||||
with self._lock:
|
||||
|
@ -729,11 +732,13 @@ class WatcherImport( HydrusSerialisable.SerialisableBase ):
|
|||
serialisable_gallery_seed_log = self._gallery_seed_log.GetSerialisableTuple()
|
||||
serialisable_file_seed_cache = self._file_seed_cache.GetSerialisableTuple()
|
||||
|
||||
serialisable_fixed_service_keys_to_tags = self._fixed_service_keys_to_tags.GetSerialisableTuple()
|
||||
|
||||
serialisable_checker_options = self._checker_options.GetSerialisableTuple()
|
||||
serialisable_file_import_options = self._file_import_options.GetSerialisableTuple()
|
||||
serialisable_tag_import_options = self._tag_import_options.GetSerialisableTuple()
|
||||
|
||||
return ( self._url, serialisable_gallery_seed_log, serialisable_file_seed_cache, self._urls_to_filenames, self._urls_to_md5_base64, serialisable_checker_options, serialisable_file_import_options, serialisable_tag_import_options, self._last_check_time, self._files_paused, self._checking_paused, self._checking_status, self._subject, self._no_work_until, self._no_work_until_reason, self._creation_time )
|
||||
return ( self._url, serialisable_gallery_seed_log, serialisable_file_seed_cache, serialisable_fixed_service_keys_to_tags, serialisable_checker_options, serialisable_file_import_options, serialisable_tag_import_options, self._last_check_time, self._files_paused, self._checking_paused, self._checking_status, self._subject, self._no_work_until, self._no_work_until_reason, self._creation_time )
|
||||
|
||||
|
||||
def _HasURL( self ):
|
||||
|
@ -743,7 +748,9 @@ class WatcherImport( HydrusSerialisable.SerialisableBase ):
|
|||
|
||||
def _InitialiseFromSerialisableInfo( self, serialisable_info ):
|
||||
|
||||
( self._url, serialisable_gallery_seed_log, serialisable_file_seed_cache, self._urls_to_filenames, self._urls_to_md5_base64, serialisable_checker_options, serialisable_file_import_options, serialisable_tag_import_options, self._last_check_time, self._files_paused, self._checking_paused, self._checking_status, self._subject, self._no_work_until, self._no_work_until_reason, self._creation_time ) = serialisable_info
|
||||
( self._url, serialisable_gallery_seed_log, serialisable_file_seed_cache, serialisable_fixed_service_keys_to_tags, serialisable_checker_options, serialisable_file_import_options, serialisable_tag_import_options, self._last_check_time, self._files_paused, self._checking_paused, self._checking_status, self._subject, self._no_work_until, self._no_work_until_reason, self._creation_time ) = serialisable_info
|
||||
|
||||
self._fixed_service_keys_to_tags = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_fixed_service_keys_to_tags )
|
||||
|
||||
self._gallery_seed_log = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_gallery_seed_log )
|
||||
self._file_seed_cache = HydrusSerialisable.CreateFromSerialisableTuple( serialisable_file_seed_cache )
|
||||
|
@ -858,6 +865,19 @@ class WatcherImport( HydrusSerialisable.SerialisableBase ):
|
|||
return ( 6, new_serialisable_info )
|
||||
|
||||
|
||||
if version == 6:
|
||||
|
||||
( url, serialisable_gallery_seed_log, serialisable_file_seed_cache, urls_to_filenames, urls_to_md5_base64, serialisable_checker_options, serialisable_file_import_options, serialisable_tag_import_options, last_check_time, files_paused, checking_paused, checking_status, subject, no_work_until, no_work_until_reason, creation_time ) = old_serialisable_info
|
||||
|
||||
fixed_service_keys_to_tags = ClientTags.ServiceKeysToTags()
|
||||
|
||||
serialisable_fixed_service_keys_to_tags = fixed_service_keys_to_tags.GetSerialisableTuple()
|
||||
|
||||
new_serialisable_info = ( url, serialisable_gallery_seed_log, serialisable_file_seed_cache, serialisable_fixed_service_keys_to_tags, serialisable_checker_options, serialisable_file_import_options, serialisable_tag_import_options, last_check_time, files_paused, checking_paused, checking_status, subject, no_work_until, no_work_until_reason, creation_time )
|
||||
|
||||
return ( 7, new_serialisable_info )
|
||||
|
||||
|
||||
|
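The hunk above follows the codebase's standard serialisable-upgrade pattern: each `if version == N` step unpacks the old tuple, injects defaults for the new fields, and returns the bumped version. A minimal sketch of that chained pattern, with illustrative names rather than the real hydrus classes:

```python
def update_serialisable_info(version, info):
    # one-step upgrades, mirroring _UpdateSerialisableInfo: each branch
    # unpacks the old tuple, adds defaults, and bumps the version number
    if version == 6:
        (url, gallery_log, file_seed_cache, urls_to_filenames,
         urls_to_md5_base64, *rest) = info

        # v7 replaces the two legacy url->metadata dicts with a single
        # fixed_service_keys_to_tags structure, empty by default
        fixed_service_keys_to_tags = {}

        info = (url, gallery_log, file_seed_cache,
                fixed_service_keys_to_tags, *rest)

        return (7, info)

    return (version, info)

def load(version, info, current_version=7):
    # walk an object forward one version at a time until it is current
    while version < current_version:
        (version, info) = update_serialisable_info(version, info)
    return (version, info)
```

The advantage of chaining single-step upgrades is that an object saved at any old version can always be walked forward, one version at a time, to the current format.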
||||
def _WorkOnFiles( self ):
|
||||
|
||||
|
@ -1252,6 +1272,14 @@ class WatcherImport( HydrusSerialisable.SerialisableBase ):
|
|||
|
||||
|
||||
|
||||
def SetFixedServiceKeysToTags( self, service_keys_to_tags ):
|
||||
|
||||
with self._lock:
|
||||
|
||||
self._fixed_service_keys_to_tags = ClientTags.ServiceKeysToTags( service_keys_to_tags )
|
||||
|
||||
|
||||
|
||||
def SetTagImportOptions( self, tag_import_options ):
|
||||
|
||||
with self._lock:
|
||||
|
|
|
@ -53,6 +53,8 @@ class HydrusServiceClientAPI( HydrusClientService ):
|
|||
|
||||
root.putChild( b'add_tags', add_tags )
|
||||
|
||||
add_tags.putChild( b'add_tags', ClientLocalServerResources.HydrusResourceClientAPIRestrictedAddTagsAddTags( self._service, self._client_requests_domain ) )
|
||||
add_tags.putChild( b'clean_tags', ClientLocalServerResources.HydrusResourceClientAPIRestrictedAddTagsCleanTags( self._service, self._client_requests_domain ) )
|
||||
add_tags.putChild( b'get_tag_services', ClientLocalServerResources.HydrusResourceClientAPIRestrictedAddTagsGetTagServices( self._service, self._client_requests_domain ) )
|
||||
|
||||
add_urls = NoResource()
|
||||
|
@ -62,6 +64,7 @@ class HydrusServiceClientAPI( HydrusClientService ):
|
|||
add_urls.putChild( b'get_url_info', ClientLocalServerResources.HydrusResourceClientAPIRestrictedAddURLsGetURLInfo( self._service, self._client_requests_domain ) )
|
||||
add_urls.putChild( b'get_url_files', ClientLocalServerResources.HydrusResourceClientAPIRestrictedAddURLsGetURLFiles( self._service, self._client_requests_domain ) )
|
||||
add_urls.putChild( b'add_url', ClientLocalServerResources.HydrusResourceClientAPIRestrictedAddURLsImportURL( self._service, self._client_requests_domain ) )
|
||||
add_urls.putChild( b'associate_url', ClientLocalServerResources.HydrusResourceClientAPIRestrictedAddURLsAssociateURL( self._service, self._client_requests_domain ) )
|
||||
|
||||
return root
|
||||
|
||||
|
|
|
@ -1,7 +1,9 @@
|
|||
import collections
|
||||
from . import ClientAPI
|
||||
from . import ClientConstants as CC
|
||||
from . import ClientFiles
|
||||
from . import ClientImportFileSeeds
|
||||
from . import ClientTags
|
||||
from . import HydrusConstants as HC
|
||||
from . import HydrusData
|
||||
from . import HydrusExceptions
|
||||
|
@ -9,6 +11,7 @@ from . import HydrusGlobals as HG
|
|||
from . import HydrusNetworking
|
||||
from . import HydrusPaths
|
||||
from . import HydrusServerResources
|
||||
from . import HydrusTags
|
||||
import json
|
||||
import os
|
||||
import traceback
|
||||
|
@ -24,7 +27,7 @@ LOCAL_BOORU_JSON_PARAMS = set()
|
|||
CLIENT_API_INT_PARAMS = set()
|
||||
CLIENT_API_BYTE_PARAMS = set()
|
||||
CLIENT_API_STRING_PARAMS = { 'name', 'url' }
|
||||
CLIENT_API_JSON_PARAMS = { 'basic_permissions' }
|
||||
CLIENT_API_JSON_PARAMS = { 'basic_permissions', 'tags' }
|
||||
|
||||
def ParseLocalBooruGETArgs( requests_args ):
|
||||
|
||||
|
@ -38,15 +41,68 @@ def ParseClientAPIGETArgs( requests_args ):
|
|||
|
||||
return args
|
||||
|
||||
def ParseClientAPIPOSTArgs( request ):
|
||||
def ParseClientAPIPOSTHashesArgs( args ):
|
||||
|
||||
# this is a mangled dupe of the hydrus parsing stuff. I should refactor it all to something neater and pull out the HydrusNetwork.ParseNetworkBytesToHydrusArgs
|
||||
if not isinstance( args, dict ):
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'The given parameter did not seem to be a JSON Object!' )
|
||||
|
||||
|
||||
parsed_request_args = HydrusNetworking.ParsedRequestArguments( args )
|
||||
|
||||
if 'hash' in parsed_request_args:
|
||||
|
||||
try:
|
||||
|
||||
hash = bytes.fromhex( parsed_request_args[ 'hash' ] )
|
||||
|
||||
if len( hash ) == 0:
|
||||
|
||||
del parsed_request_args[ 'hash' ]
|
||||
|
||||
else:
|
||||
|
||||
parsed_request_args[ 'hash' ] = hash
|
||||
|
||||
|
||||
except:
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'I was expecting to parse \'' + 'hash' + '\' as a hex-encoded bytestring, but it failed.' )
|
||||
|
||||
|
||||
|
||||
if 'hashes' in parsed_request_args:
|
||||
|
||||
try:
|
||||
|
||||
hashes = [ bytes.fromhex( hash_hex ) for hash_hex in parsed_request_args[ 'hashes' ] ]
|
||||
|
||||
hashes = [ hash for hash in hashes if len( hash ) > 0 ]
|
||||
|
||||
if len( hashes ) == 0:
|
||||
|
||||
del parsed_request_args[ 'hashes' ]
|
||||
|
||||
else:
|
||||
|
||||
parsed_request_args[ 'hashes' ] = hashes
|
||||
|
||||
|
||||
except:
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'I was expecting to parse \'' + 'hashes' + '\' as a list of hex-encoded bytestrings, but it failed.' )
|
||||
|
||||
|
||||
|
||||
return parsed_request_args
|
||||
|
||||
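`ParseClientAPIPOSTHashesArgs` above converts the optional `hash`/`hashes` JSON parameters from hex strings into raw bytes, dropping empty results and removing a key entirely when nothing usable remains. The same filtering logic in isolation (a sketch, not the hydrus function itself):

```python
def parse_hex_hashes(args):
    # mirror the hash/hashes handling: decode hex, drop empties,
    # and delete the key when no usable value is left
    out = dict(args)

    if 'hash' in out:
        h = bytes.fromhex(out['hash'])
        if len(h) == 0:
            del out['hash']
        else:
            out['hash'] = h

    if 'hashes' in out:
        hs = [bytes.fromhex(hash_hex) for hash_hex in out['hashes']]
        hs = [h for h in hs if len(h) > 0]
        if len(hs) == 0:
            del out['hashes']
        else:
            out['hashes'] = hs

    return out
```

Note that `bytes.fromhex` raises `ValueError` on odd-length or non-hex input, which is why the real handler wraps each conversion in a try/except and converts the failure into a 400.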
def ParseClientAPIPOSTArgs( request ):
|
||||
|
||||
request.content.seek( 0 )
|
||||
|
||||
if not request.requestHeaders.hasHeader( 'Content-Type' ):
|
||||
|
||||
hydrus_args = {}
|
||||
parsed_request_args = HydrusNetworking.ParsedRequestArguments()
|
||||
|
||||
total_bytes_read = 0
|
||||
|
||||
|
@ -62,7 +118,7 @@ def ParseClientAPIPOSTArgs( request ):
|
|||
|
||||
except:
|
||||
|
||||
raise HydrusExceptions.InsufficientCredentialsException( 'Did not recognise Content-Type header!' )
|
||||
raise HydrusExceptions.BadRequestException( 'Did not recognise Content-Type header!' )
|
||||
|
||||
|
||||
total_bytes_read = 0
|
||||
|
@ -75,11 +131,13 @@ def ParseClientAPIPOSTArgs( request ):
|
|||
|
||||
json_string = str( json_bytes, 'utf-8' )
|
||||
|
||||
hydrus_args = json.loads( json_string )
|
||||
args = json.loads( json_string )
|
||||
|
||||
parsed_request_args = ParseClientAPIPOSTHashesArgs( args )
|
||||
|
||||
else:
|
||||
|
||||
hydrus_args = {}
|
||||
parsed_request_args = HydrusNetworking.ParsedRequestArguments()
|
||||
|
||||
( os_file_handle, temp_path ) = HydrusPaths.GetTempPath()
|
||||
|
||||
|
@ -97,15 +155,15 @@ def ParseClientAPIPOSTArgs( request ):
|
|||
|
||||
|
||||
|
||||
return ( hydrus_args, total_bytes_read )
|
||||
return ( parsed_request_args, total_bytes_read )
|
||||
|
||||
class HydrusResourceBooru( HydrusServerResources.HydrusResource ):
|
||||
|
||||
def _callbackParseGETArgs( self, request ):
|
||||
|
||||
hydrus_args = ParseLocalBooruGETArgs( request.args )
|
||||
parsed_request_args = ParseLocalBooruGETArgs( request.args )
|
||||
|
||||
request.hydrus_args = hydrus_args
|
||||
request.parsed_request_args = parsed_request_args
|
||||
|
||||
return request
|
||||
|
||||
|
@ -139,8 +197,8 @@ class HydrusResourceBooruFile( HydrusResourceBooru ):
|
|||
|
||||
def _threadDoGETJob( self, request ):
|
||||
|
||||
share_key = request.hydrus_args[ 'share_key' ]
|
||||
hash = request.hydrus_args[ 'hash' ]
|
||||
share_key = request.parsed_request_args[ 'share_key' ]
|
||||
hash = request.parsed_request_args[ 'hash' ]
|
||||
|
||||
HG.client_controller.local_booru_manager.CheckFileAuthorised( share_key, hash )
|
||||
|
||||
|
@ -164,7 +222,7 @@ class HydrusResourceBooruGallery( HydrusResourceBooru ):
|
|||
# in future, make this a standard frame with a search key that'll load xml or yaml AJAX stuff
|
||||
# with file info included, so the page can sort and whatever
|
||||
|
||||
share_key = request.hydrus_args[ 'share_key' ]
|
||||
share_key = request.parsed_request_args[ 'share_key' ]
|
||||
|
||||
local_booru_manager = HG.client_controller.local_booru_manager
|
||||
|
||||
|
@ -245,8 +303,8 @@ class HydrusResourceBooruPage( HydrusResourceBooru ):
|
|||
|
||||
def _threadDoGETJob( self, request ):
|
||||
|
||||
share_key = request.hydrus_args[ 'share_key' ]
|
||||
hash = request.hydrus_args[ 'hash' ]
|
||||
share_key = request.parsed_request_args[ 'share_key' ]
|
||||
hash = request.parsed_request_args[ 'hash' ]
|
||||
|
||||
local_booru_manager = HG.client_controller.local_booru_manager
|
||||
|
||||
|
@ -334,8 +392,8 @@ class HydrusResourceBooruThumbnail( HydrusResourceBooru ):
|
|||
|
||||
def _threadDoGETJob( self, request ):
|
||||
|
||||
share_key = request.hydrus_args[ 'share_key' ]
|
||||
hash = request.hydrus_args[ 'hash' ]
|
||||
share_key = request.parsed_request_args[ 'share_key' ]
|
||||
hash = request.parsed_request_args[ 'hash' ]
|
||||
|
||||
local_booru_manager = HG.client_controller.local_booru_manager
|
||||
|
||||
|
@ -377,20 +435,20 @@ class HydrusResourceClientAPI( HydrusServerResources.HydrusResource ):
|
|||
|
||||
def _callbackParseGETArgs( self, request ):
|
||||
|
||||
hydrus_args = ParseClientAPIGETArgs( request.args )
|
||||
parsed_request_args = ParseClientAPIGETArgs( request.args )
|
||||
|
||||
request.hydrus_args = hydrus_args
|
||||
request.parsed_request_args = parsed_request_args
|
||||
|
||||
return request
|
||||
|
||||
|
||||
def _callbackParsePOSTArgs( self, request ):
|
||||
|
||||
( hydrus_args, total_bytes_read ) = ParseClientAPIPOSTArgs( request )
|
||||
( parsed_request_args, total_bytes_read ) = ParseClientAPIPOSTArgs( request )
|
||||
|
||||
self._reportDataUsed( request, total_bytes_read )
|
||||
|
||||
request.hydrus_args = hydrus_args
|
||||
request.parsed_request_args = parsed_request_args
|
||||
|
||||
return request
|
||||
|
||||
|
@ -447,8 +505,8 @@ class HydrusResourceClientAPIPermissionsRequest( HydrusResourceClientAPI ):
|
|||
raise HydrusExceptions.InsufficientCredentialsException( 'The permission registration dialog is not open. Please open it under "review services" in the hydrus client.' )
|
||||
|
||||
|
||||
name = request.hydrus_args[ 'name' ]
|
||||
basic_permissions = request.hydrus_args[ 'basic_permissions' ]
|
||||
name = request.parsed_request_args[ 'name' ]
|
||||
basic_permissions = request.parsed_request_args[ 'basic_permissions' ]
|
||||
|
||||
basic_permissions = [ int( value ) for value in basic_permissions ]
|
||||
|
||||
|
@ -561,11 +619,11 @@ class HydrusResourceClientAPIRestrictedAddFile( HydrusResourceClientAPIRestricte
|
|||
|
||||
if not hasattr( request, 'temp_file_info' ):
|
||||
|
||||
path = request.hydrus_args[ 'path' ]
|
||||
path = request.parsed_request_args[ 'path' ]
|
||||
|
||||
if not os.path.exists( path ):
|
||||
|
||||
raise HydrusExceptions.InsufficientCredentialsException( 'Path "{}" does not exist!'.format( path ) )
|
||||
raise HydrusExceptions.BadRequestException( 'Path "{}" does not exist!'.format( path ) )
|
||||
|
||||
|
||||
( os_file_handle, temp_path ) = HydrusPaths.GetTempPath()
|
||||
|
@ -616,9 +674,134 @@ class HydrusResourceClientAPIRestrictedAddTagsAddTags( HydrusResourceClientAPIRe
|
|||
|
||||
def _threadDoPOSTJob( self, request ):
|
||||
|
||||
# grab hash and tags from POST args
|
||||
hashes = set()
|
||||
|
||||
# do it to db
|
||||
if 'hash' in request.parsed_request_args:
|
||||
|
||||
hash = request.parsed_request_args[ 'hash' ]
|
||||
|
||||
hashes.add( hash )
|
||||
|
||||
|
||||
if 'hashes' in request.parsed_request_args:
|
||||
|
||||
more_hashes = request.parsed_request_args[ 'hashes' ]
|
||||
|
||||
hashes.update( more_hashes )
|
||||
|
||||
|
||||
if len( hashes ) == 0:
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'There were no hashes given!' )
|
||||
|
||||
|
||||
#
|
||||
|
||||
service_keys_to_content_updates = collections.defaultdict( list )
|
||||
|
||||
if 'service_names_to_tags' in request.parsed_request_args:
|
||||
|
||||
service_names_to_tags = request.parsed_request_args[ 'service_names_to_tags' ]
|
||||
|
||||
for ( service_name, tags ) in service_names_to_tags.items():
|
||||
|
||||
try:
|
||||
|
||||
service_key = HG.client_controller.services_manager.GetServiceKeyFromName( HC.TAG_SERVICES, service_name )
|
||||
|
||||
except:
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'Could not find the service "{}"!'.format( service_name ) )
|
||||
|
||||
|
||||
tags = HydrusTags.CleanTags( tags )
|
||||
|
||||
if len( tags ) == 0:
|
||||
|
||||
continue
|
||||
|
||||
|
||||
if service_key == CC.LOCAL_TAG_SERVICE_KEY:
|
||||
|
||||
content_action = HC.CONTENT_UPDATE_ADD
|
||||
|
||||
else:
|
||||
|
||||
content_action = HC.CONTENT_UPDATE_PEND
|
||||
|
||||
|
||||
content_updates = [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_MAPPINGS, content_action, ( tag, hashes ) ) for tag in tags ]
|
||||
|
||||
service_keys_to_content_updates[ service_key ].extend( content_updates )
|
||||
|
||||
|
||||
|
||||
if 'service_names_to_actions_to_tags' in request.parsed_request_args:
|
||||
|
||||
service_names_to_actions_to_tags = request.parsed_request_args[ 'service_names_to_actions_to_tags' ]
|
||||
|
||||
for ( service_name, actions_to_tags ) in service_names_to_actions_to_tags.items():
|
||||
|
||||
try:
|
||||
|
||||
service_key = HG.client_controller.services_manager.GetServiceKeyFromName( HC.TAG_SERVICES, service_name )
|
||||
|
||||
except:
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'Could not find the service "{}"!'.format( service_name ) )
|
||||
|
||||
|
||||
for ( content_action, tags ) in actions_to_tags.items():
|
||||
|
||||
tags = HydrusTags.CleanTags( tags )
|
||||
|
||||
if len( tags ) == 0:
|
||||
|
||||
continue
|
||||
|
||||
|
||||
if service_key == CC.LOCAL_TAG_SERVICE_KEY:
|
||||
|
||||
if content_action not in ( HC.CONTENT_UPDATE_ADD, HC.CONTENT_UPDATE_DELETE ):
|
||||
|
||||
continue
|
||||
|
||||
|
||||
else:
|
||||
|
||||
if content_action in ( HC.CONTENT_UPDATE_ADD, HC.CONTENT_UPDATE_DELETE ):
|
||||
|
||||
continue
|
||||
|
||||
|
||||
|
||||
if content_action == HC.CONTENT_UPDATE_PETITION:
|
||||
|
||||
if isinstance( next( iter( tags ) ), str ): # tags may be a set after CleanTags, so avoid indexing
|
||||
|
||||
tags_and_reasons = [ ( tag, 'Petitioned from API' ) for tag in tags ]
|
||||
|
||||
else:
|
||||
|
||||
tags_and_reasons = tags
|
||||
|
||||
|
||||
content_updates = [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_MAPPINGS, content_action, ( tag, hashes, reason ) ) for ( tag, reason ) in tags_and_reasons ]
|
||||
|
||||
else:
|
||||
|
||||
content_updates = [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_MAPPINGS, content_action, ( tag, hashes ) ) for tag in tags ]
|
||||
|
||||
|
||||
service_keys_to_content_updates[ service_key ].extend( content_updates )
|
||||
|
||||
|
||||
|
||||
|
||||
if len( service_keys_to_content_updates ) > 0:
|
||||
|
||||
HG.client_controller.WriteSynchronous( 'content_updates', service_keys_to_content_updates )
|
||||
|
||||
|
||||
response_context = HydrusServerResources.ResponseContext( 200 )
|
||||
|
||||
|
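A consumer of the new `/add_tags/add_tags` endpoint would POST a JSON body shaped like the one this handler expects: some hashes plus either `service_names_to_tags` (simple add/pend) or `service_names_to_actions_to_tags` (explicit content actions). A hedged sketch of payload construction; the service name and hash here are examples only:

```python
import json

def build_add_tags_body(hashes_hex, simple=None, advanced=None):
    # assemble the JSON body the add_tags handler parses above
    body = {'hashes': hashes_hex}
    if simple is not None:
        body['service_names_to_tags'] = simple
    if advanced is not None:
        body['service_names_to_actions_to_tags'] = advanced
    return json.dumps(body)

# example: pend two tags to a hypothetical 'local tags' service
payload = build_add_tags_body(
    ['aa' * 32],
    simple={'local tags': ['samus aran', 'blue eyes']},
)
```

The actual HTTP call (URL, port, and access-key header) is configured client-side and is not shown here.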
@ -644,6 +827,27 @@ class HydrusResourceClientAPIRestrictedAddTagsGetTagServices( HydrusResourceClie
|
|||
return response_context
|
||||
|
||||
|
||||
class HydrusResourceClientAPIRestrictedAddTagsCleanTags( HydrusResourceClientAPIRestrictedAddTags ):
|
||||
|
||||
def _threadDoGETJob( self, request ):
|
||||
|
||||
tags = request.parsed_request_args[ 'tags' ]
|
||||
|
||||
tags = list( HydrusTags.CleanTags( tags ) )
|
||||
|
||||
tags = HydrusTags.SortNumericTags( tags )
|
||||
|
||||
body_dict = {}
|
||||
|
||||
body_dict[ 'tags' ] = tags
|
||||
|
||||
body = json.dumps( body_dict )
|
||||
|
||||
response_context = HydrusServerResources.ResponseContext( 200, mime = HC.APPLICATION_JSON, body = body )
|
||||
|
||||
return response_context
|
||||
|
||||
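`/add_tags/clean_tags` lets API consumers preview what `HydrusTags.CleanTags` will do to their tags before committing them. A rough approximation of that normalisation (lowercasing, whitespace collapse, dropping empties) for illustration only; the real function handles many more cases, such as namespaces and invalid characters:

```python
import re

def approx_clean_tags(tags):
    # very rough stand-in for HydrusTags.CleanTags: collapse internal
    # whitespace, strip ends, lowercase, and drop anything left empty
    cleaned = set()
    for tag in tags:
        tag = re.sub(r'\s+', ' ', tag).strip().lower()
        if tag == '':
            continue
        cleaned.add(tag)
    return cleaned
```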
|
||||
class HydrusResourceClientAPIRestrictedAddURLs( HydrusResourceClientAPIRestricted ):
|
||||
|
||||
def _CheckAPIPermissions( self, request ):
|
||||
|
@ -651,11 +855,123 @@ class HydrusResourceClientAPIRestrictedAddURLs( HydrusResourceClientAPIRestricte
|
|||
request.client_api_permissions.CheckPermission( ClientAPI.CLIENT_API_PERMISSION_ADD_URLS )
|
||||
|
||||
|
||||
class HydrusResourceClientAPIRestrictedAddURLsAssociateURL( HydrusResourceClientAPIRestrictedAddURLs ):
|
||||
|
||||
def _threadDoPOSTJob( self, request ):
|
||||
|
||||
urls_to_add = []
|
||||
|
||||
if 'url_to_add' in request.parsed_request_args:
|
||||
|
||||
url = request.parsed_request_args[ 'url_to_add' ]
|
||||
|
||||
if not isinstance( url, str ):
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'Did not understand the given URL "{}"!'.format( url ) )
|
||||
|
||||
|
||||
urls_to_add.append( url )
|
||||
|
||||
|
||||
if 'urls_to_add' in request.parsed_request_args:
|
||||
|
||||
urls = request.parsed_request_args[ 'urls_to_add' ]
|
||||
|
||||
if not isinstance( urls, list ):
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'Did not understand the given URLs!' )
|
||||
|
||||
|
||||
for url in urls:
|
||||
|
||||
if not isinstance( url, str ):
|
||||
|
||||
continue
|
||||
|
||||
|
||||
urls_to_add.append( url )
|
||||
|
||||
|
||||
|
||||
urls_to_delete = []
|
||||
|
||||
if 'url_to_delete' in request.parsed_request_args:
|
||||
|
||||
url = request.parsed_request_args[ 'url_to_delete' ]
|
||||
|
||||
if not isinstance( url, str ):
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'Did not understand the given URL "{}"!'.format( url ) )
|
||||
|
||||
|
||||
urls_to_delete.append( url )
|
||||
|
||||
|
||||
if 'urls_to_delete' in request.parsed_request_args:
|
||||
|
||||
urls = request.parsed_request_args[ 'urls_to_delete' ]
|
||||
|
||||
if not isinstance( urls, list ):
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'Did not understand the given URLs!' )
|
||||
|
||||
|
||||
for url in urls:
|
||||
|
||||
if not isinstance( url, str ):
|
||||
|
||||
continue
|
||||
|
||||
|
||||
urls_to_delete.append( url )
|
||||
|
||||
|
||||
|
||||
applicable_hashes = []
|
||||
|
||||
if 'hash' in request.parsed_request_args:
|
||||
|
||||
applicable_hashes.append( request.parsed_request_args[ 'hash' ] )
|
||||
|
||||
|
||||
if 'hashes' in request.parsed_request_args:
|
||||
|
||||
applicable_hashes.extend( request.parsed_request_args[ 'hashes' ] )
|
||||
|
||||
|
||||
if len( urls_to_add ) == 0 and len( urls_to_delete ) == 0:
|
||||
|
||||
raise HydrusExceptions.BadRequestException( 'Did not find any URLs to add or delete!' )
|
||||
|
||||
|
||||
service_keys_to_content_updates = collections.defaultdict( list )
|
||||
|
||||
if len( urls_to_add ) > 0:
|
||||
|
||||
content_update = HydrusData.ContentUpdate( HC.CONTENT_TYPE_URLS, HC.CONTENT_UPDATE_ADD, ( urls_to_add, applicable_hashes ) )
|
||||
|
||||
service_keys_to_content_updates[ CC.COMBINED_LOCAL_FILE_SERVICE_KEY ].append( content_update )
|
||||
|
||||
|
||||
if len( urls_to_delete ) > 0:
|
||||
|
||||
content_update = HydrusData.ContentUpdate( HC.CONTENT_TYPE_URLS, HC.CONTENT_UPDATE_DELETE, ( urls_to_delete, applicable_hashes ) )
|
||||
|
||||
service_keys_to_content_updates[ CC.COMBINED_LOCAL_FILE_SERVICE_KEY ].append( content_update )
|
||||
|
||||
|
||||
HG.client_controller.WriteSynchronous( 'content_updates', service_keys_to_content_updates )
|
||||
|
||||
response_context = HydrusServerResources.ResponseContext( 200 )
|
||||
|
||||
return response_context
|
||||
|
||||
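The `associate_url` handler above accepts both singular and plural forms of each parameter (`url_to_add`/`urls_to_add`, `url_to_delete`/`urls_to_delete`) and merges them. The merge logic in isolation, as a sketch of the behaviour rather than hydrus code:

```python
def gather_urls(args, singular, plural):
    # accept a single url under the singular key and/or a list under
    # the plural key, merged in that order
    out = []
    if singular in args:
        out.append(args[singular])
    if plural in args:
        out.extend(args[plural])
    return out
```

Accepting both spellings keeps simple one-url calls terse while still allowing batched requests in the same endpoint.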
|
||||
class HydrusResourceClientAPIRestrictedAddURLsGetURLFiles( HydrusResourceClientAPIRestrictedAddURLs ):
|
||||
|
||||
def _threadDoGETJob( self, request ):
|
||||
|
||||
url = request.hydrus_args[ 'url' ]
|
||||
url = request.parsed_request_args[ 'url' ]
|
||||
|
||||
normalised_url = HG.client_controller.network_engine.domain_manager.NormaliseURL( url )
|
||||
|
||||
|
@ -687,7 +1003,7 @@ class HydrusResourceClientAPIRestrictedAddURLsGetURLInfo( HydrusResourceClientAP
|
|||
|
||||
def _threadDoGETJob( self, request ):
|
||||
|
||||
url = request.hydrus_args[ 'url' ]
|
||||
url = request.parsed_request_args[ 'url' ]
|
||||
|
||||
normalised_url = HG.client_controller.network_engine.domain_manager.NormaliseURL( url )
|
||||
|
||||
|
@ -706,34 +1022,55 @@ class HydrusResourceClientAPIRestrictedAddURLsImportURL( HydrusResourceClientAPI
|
|||
|
||||
def _threadDoPOSTJob( self, request ):
|
||||
|
||||
url = request.hydrus_args[ 'url' ]
|
||||
url = request.parsed_request_args[ 'url' ]
|
||||
|
||||
service_keys_to_tags = {}
|
||||
service_keys_to_tags = None
|
||||
|
||||
if 'service_names_to_tags' in request.hydrus_args:
|
||||
if 'service_names_to_tags' in request.parsed_request_args:
|
||||
|
||||
service_keys_to_tags = ClientTags.ServiceKeysToTags()
|
||||
|
||||
request.client_api_permissions.CheckPermission( ClientAPI.CLIENT_API_PERMISSION_ADD_TAGS )
|
||||
|
||||
service_names_to_tags = request.hydrus_args
|
||||
service_names_to_tags = request.parsed_request_args[ 'service_names_to_tags' ]
|
||||
|
||||
for ( service_name, tags ) in service_names_to_tags.items():
|
||||
|
||||
try:
|
||||
|
||||
service_key = HG.client_controller.service_manager.GetServiceKeyFromName( HC.TAG_SERVICES, service_name )
|
||||
service_key = HG.client_controller.services_manager.GetServiceKeyFromName( HC.TAG_SERVICES, service_name )
|
||||
|
||||
except:
|
||||
|
||||
raise KeyError( 'Could not find the service "{}"!'.format( service_name ) )
|
||||
raise HydrusExceptions.BadRequestException( 'Could not find the service "{}"!'.format( service_name ) )
|
||||
|
||||
|
||||
tags = HydrusTags.CleanTags( tags )
|
||||
|
||||
if len( tags ) == 0:
|
||||
|
||||
continue
|
||||
|
||||
|
||||
service_keys_to_tags[ service_key ] = tags
|
||||
|
||||
|
||||
|
||||
destination_page_name = None
|
||||
|
||||
if 'destination_page_name' in request.parsed_request_args:
|
||||
|
||||
destination_page_name = request.parsed_request_args[ 'destination_page_name' ]
|
||||
|
||||
if not isinstance( destination_page_name, str ):
|
||||
|
||||
raise HydrusExceptions.BadRequestException( '"destination_page_name" did not seem to be a string!' )
|
||||
|
||||
|
||||
|
||||
gui = HG.client_controller.gui
|
||||
|
||||
( normalised_url, result_text ) = HG.client_controller.CallBlockingToWX( gui, gui.ImportURLFromAPI, url, service_keys_to_tags )
|
||||
( normalised_url, result_text ) = HG.client_controller.CallBlockingToWX( gui, gui.ImportURLFromAPI, url, service_keys_to_tags, destination_page_name )
|
||||
|
||||
body_dict = { 'human_result_text' : result_text, 'normalised_url' : normalised_url }
|
||||
|
||||
|
@ -778,7 +1115,7 @@ class HydrusResourceClientAPIRestrictedSearchFilesGetFile( HydrusResourceClientA
|
|||
|
||||
def _threadDoGETJob( self, request ):
|
||||
|
||||
file_id = request.hydrus_args[ 'file_id' ]
|
||||
file_id = request.parsed_request_args[ 'file_id' ]
|
||||
|
||||
request.client_api_permissions.CheckPermissionToSeeFiles( ( file_id, ) )
|
||||
|
||||
|
@ -799,7 +1136,7 @@ class HydrusResourceClientAPIRestrictedSearchFilesGetMetadata( HydrusResourceCli
|
|||
|
||||
def _threadDoGETJob( self, request ):
|
||||
|
||||
file_ids = request.hydrus_args[ 'file_ids' ]
|
||||
file_ids = request.parsed_request_args[ 'file_ids' ]
|
||||
|
||||
request.client_api_permissions.CheckPermissionToSeeFiles( file_ids )
|
||||
|
||||
|
@ -819,7 +1156,7 @@ class HydrusResourceClientAPIRestrictedSearchFilesGetThumbnail( HydrusResourceCl
|
|||
|
||||
def _threadDoGETJob( self, request ):
|
||||
|
||||
file_id = request.hydrus_args[ 'file_id' ]
|
||||
file_id = request.parsed_request_args[ 'file_id' ]
|
||||
|
||||
request.client_api_permissions.CheckPermissionToSeeFiles( ( file_id, ) )
|
||||
|
||||
|
|
|
@ -608,11 +608,6 @@ class LocationsManager( object ):
|
|||
self._petitioned.discard( service_key )
|
||||
|
||||
|
||||
def ShouldDefinitelyHaveThumbnail( self ): # local only
|
||||
|
||||
return HC.COMBINED_LOCAL_FILE in self._current
|
||||
|
||||
|
||||
def ShouldIdeallyHaveThumbnail( self ): # file repo or local
|
||||
|
||||
return len( self._current ) > 0
|
||||
|
@ -2068,7 +2063,7 @@ class MediaSort( HydrusSerialisable.SerialisableBase ):
|
|||
|
||||
duration = x.GetDuration()
|
||||
|
||||
if duration is None:
|
||||
if duration is None or duration == 0:
|
||||
|
||||
return 0
|
||||
|
||||
|
@ -2076,7 +2071,7 @@ class MediaSort( HydrusSerialisable.SerialisableBase ):
|
|||
|
||||
size = x.GetSize()
|
||||
|
||||
if size is None:
|
||||
if size is None or size == 0:
|
||||
|
||||
return -1
|
||||
|
||||
|
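The two-line change above means files with a zero duration or size now sort with the 'missing value' group rather than among real values. Expressed as a sort key (a sketch; the real code lives inside `MediaSort`):

```python
def duration_sort_key(duration):
    # treat both missing and zero durations as 'no duration', so that
    # zero-length entries group with the files that have no value at all
    if duration is None or duration == 0:
        return 0
    return duration

ordered = sorted([120, None, 0, 30], key=duration_sort_key)
```

Python's sort is stable, so the `None` and `0` entries keep their relative order within the shared key-0 group.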
|
|
@ -28,6 +28,10 @@ def ConvertStatusCodeAndDataIntoExceptionInfo( status_code, data, is_hydrus_serv
|
|||
|
||||
eclass = HydrusExceptions.NotModifiedException
|
||||
|
||||
elif status_code == 400:
|
||||
|
||||
eclass = HydrusExceptions.BadRequestException
|
||||
|
||||
elif status_code == 401:
|
||||
|
||||
eclass = HydrusExceptions.MissingCredentialsException
|
||||
|
|
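With 400 now mapped, the network client turns HTTP status codes into typed exceptions. The elif chain above can equally be written table-driven; a sketch with plain exception classes standing in for the `HydrusExceptions` hierarchy:

```python
class NotModified(Exception): pass
class BadRequest(Exception): pass
class MissingCredentials(Exception): pass

# mirrors ConvertStatusCodeAndDataIntoExceptionInfo's elif chain
STATUS_TO_EXCEPTION = {
    304: NotModified,
    400: BadRequest,
    401: MissingCredentials,
}

def exception_for(status_code, default=Exception):
    # fall back to a generic exception for unmapped codes
    return STATUS_TO_EXCEPTION.get(status_code, default)
```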
|
@ -348,7 +348,7 @@ class NetworkLoginManager( HydrusSerialisable.SerialisableBase ):
|
|||
|
||||
try:
|
||||
|
||||
service.CheckFunctional( including_account = False )
|
||||
service.CheckFunctional( including_bandwidth = False, including_account = False )
|
||||
|
||||
except Exception as e:
|
||||
|
||||
|
|
|
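The first hunk above slots a 400 branch into the status-code-to-exception ladder, so client-side code raises the new `BadRequestException` rather than falling through. The same dispatch can be sketched as a lookup table (exception hierarchy abbreviated; this is an illustration, not hydrus's actual function):

```python
class NetworkException(Exception): pass
class NotModifiedException(NetworkException): pass
class BadRequestException(NetworkException): pass
class MissingCredentialsException(NetworkException): pass

# status codes map to exception classes; unknown codes fall back to the base class
STATUS_TO_EXCEPTION = {
    304: NotModifiedException,
    400: BadRequestException,
    401: MissingCredentialsException,
}

def exception_for_status(status_code, message):
    eclass = STATUS_TO_EXCEPTION.get(status_code, NetworkException)
    return eclass(message)
```

A table keeps each new status code a one-line addition instead of another `elif` arm.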
@@ -48,10 +48,22 @@ def FrameIndexOutOfRange( index, range_start, range_end ):
     return False
     
-def GenerateHydrusBitmap( path, mime, compressed = True ):
+def GenerateHydrusBitmap( path, mime, compressed = True, desired_dimensions = None, do_thumbnail_resize = False ):
     
     numpy_image = ClientImageHandling.GenerateNumpyImage( path, mime )
     
+    if desired_dimensions is not None:
+        
+        if do_thumbnail_resize:
+            
+            numpy_image = ClientImageHandling.EfficientlyThumbnailNumpyImage( numpy_image, desired_dimensions )
+            
+        else:
+            
+            numpy_image = ClientImageHandling.EfficientlyResizeNumpyImage( numpy_image, desired_dimensions )
+            
+        
     return GenerateHydrusBitmapFromNumPyImage( numpy_image, compressed = compressed )
     
 def GenerateHydrusBitmapFromNumPyImage( numpy_image, compressed = True ):
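The hunk above supports the new 'thumbnail experiment mode': the full-size image is loaded and only then shrunk to `desired_dimensions` in memory. A thumbnail resize must preserve aspect ratio while fitting inside a bounding box, which reduces to a small scaling calculation (a hypothetical helper, not the `ClientImageHandling` implementation):

```python
def get_thumbnail_dimensions(image_size, bounding_size):
    # scale to fit inside the bounding box while preserving aspect ratio
    (im_w, im_h) = image_size
    (b_w, b_h) = bounding_size
    scale = min(b_w / im_w, b_h / im_h)
    if scale >= 1.0:
        # never upscale a thumbnail past the source image
        return (im_w, im_h)
    return (max(1, int(im_w * scale)), max(1, int(im_h * scale)))
```

For example, a 400x200 source against a 200x200 bounding box scales by 0.5 to 200x100.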
@@ -447,7 +447,10 @@ class FileSystemPredicates( object ):
         elif operator == '=': self._common_info[ 'duration' ] = duration
         elif operator == '\u2248':
             
-            if duration == 0: self._common_info[ 'duration' ] = 0
+            if duration == 0:
+                
+                self._common_info[ 'duration' ] = 0
+                
             else:
                 
                 self._common_info[ 'min_duration' ] = int( duration * 0.85 )

@@ -550,30 +550,33 @@ class ServiceRemote( Service ):
         return 3600 * 4
         
-    def _CheckFunctional( self, including_external_communication = True ):
+    def _CheckFunctional( self, including_external_communication = True, including_bandwidth = True ):
         
         if including_external_communication:
             
-            self._CheckCanCommunicateExternally()
+            self._CheckCanCommunicateExternally( including_bandwidth = including_bandwidth )
             
         
         Service._CheckFunctional( self )
         
-    def _CheckCanCommunicateExternally( self ):
+    def _CheckCanCommunicateExternally( self, including_bandwidth = True ):
         
         if not HydrusData.TimeHasPassed( self._no_requests_until ):
             
             raise HydrusExceptions.InsufficientCredentialsException( self._no_requests_reason + ' - next request ' + HydrusData.TimestampToPrettyTimeDelta( self._no_requests_until ) )
             
         
-        example_nj = ClientNetworkingJobs.NetworkJobHydrus( self._service_key, 'GET', self._GetBaseURL() )
-        
-        can_start = HG.client_controller.network_engine.bandwidth_manager.CanDoWork( example_nj.GetNetworkContexts(), threshold = 60 )
-        
-        if not can_start:
+        if including_bandwidth:
             
-            raise HydrusExceptions.BandwidthException( 'bandwidth exceeded' )
+            example_nj = ClientNetworkingJobs.NetworkJobHydrus( self._service_key, 'GET', self._GetBaseURL() )
+            
+            can_start = HG.client_controller.network_engine.bandwidth_manager.CanDoWork( example_nj.GetNetworkContexts(), threshold = 60 )
+            
+            if not can_start:
+                
+                raise HydrusExceptions.BandwidthException( 'bandwidth exceeded' )
+                
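The ServiceRemote change above threads a new `including_bandwidth` flag through the `_CheckFunctional` chain, so callers (like the login manager hunk earlier) can test a service while skipping the bandwidth gate. The pattern, reduced to a stand-alone sketch with simplified names:

```python
class ServiceRemote:
    def check_functional(self, including_bandwidth=True):
        # the bandwidth gate is now optional, controlled by the caller
        if including_bandwidth and not self._bandwidth_ok():
            raise RuntimeError('bandwidth exceeded')

    def _bandwidth_ok(self):
        return False  # pretend the allowance is used up

class ServiceRestricted(ServiceRemote):
    def check_functional(self, including_bandwidth=True, including_account=True):
        if including_account:
            self._check_account()
        # forward the flag so the base class stays the single source of truth
        ServiceRemote.check_functional(self, including_bandwidth=including_bandwidth)

    def _check_account(self):
        pass
```

Each subclass adds its own keyword and forwards the rest, so a new check only touches the class that owns it.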
@@ -687,24 +690,24 @@ class ServiceRestricted( ServiceRemote ):
-    def _CheckFunctional( self, including_external_communication = True, including_account = True ):
+    def _CheckFunctional( self, including_external_communication = True, including_bandwidth = True, including_account = True ):
         
         if including_account:
             
             self._account.CheckFunctional()
             
         
-        ServiceRemote._CheckFunctional( self, including_external_communication = including_external_communication )
+        ServiceRemote._CheckFunctional( self, including_external_communication = including_external_communication, including_bandwidth = including_bandwidth )
         
-    def _CheckCanCommunicateExternally( self ):
+    def _CheckCanCommunicateExternally( self, including_bandwidth = True ):
         
         if not self._credentials.HasAccessKey():
             
             raise HydrusExceptions.MissingCredentialsException( 'this service has no access key set' )
             
         
-        ServiceRemote._CheckCanCommunicateExternally( self )
+        ServiceRemote._CheckCanCommunicateExternally( self, including_bandwidth = including_bandwidth )

@@ -726,11 +729,11 @@ class ServiceRestricted( ServiceRemote ):
         self._next_account_sync = dictionary[ 'next_account_sync' ]
         
-    def CheckFunctional( self, including_external_communication = True, including_account = True ):
+    def CheckFunctional( self, including_external_communication = True, including_bandwidth = True, including_account = True ):
         
         with self._lock:
             
-            self._CheckFunctional( including_external_communication = including_external_communication, including_account = including_account )
+            self._CheckFunctional( including_external_communication = including_external_communication, including_bandwidth = including_bandwidth, including_account = including_account )

@@ -777,13 +780,13 @@ class ServiceRestricted( ServiceRemote ):
-    def IsFunctional( self, including_external_communication = True, including_account = True ):
+    def IsFunctional( self, including_external_communication = True, including_bandwidth = True, including_account = True ):
         
         with self._lock:
             
             try:
                 
-                self._CheckFunctional( including_external_communication = including_external_communication, including_account = including_account )
+                self._CheckFunctional( including_external_communication = including_external_communication, including_bandwidth = including_bandwidth, including_account = including_account )
                 
                 return True

@@ -881,20 +884,20 @@ class ServiceRestricted( ServiceRemote ):
         if content_type == 'application/json':
             
-            hydrus_args = HydrusNetwork.ParseNetworkBytesToHydrusArgs( network_bytes )
+            parsed_args = HydrusNetwork.ParseNetworkBytesToParsedHydrusArgs( network_bytes )
             
-            if command == 'account' and 'account' in hydrus_args:
+            if command == 'account' and 'account' in parsed_args:
                 
                 data_used = network_job.GetTotalDataUsed()
                 
-                account = hydrus_args[ 'account' ]
+                account = parsed_args[ 'account' ]
                 
                 # because the account was one behind when it was serialised! mostly do this just to sync up nicely with the service bandwidth tracker
                 account.ReportDataUsed( data_used )
                 account.ReportRequestUsed()
                 
             
-            response = hydrus_args
+            response = parsed_args
             
         else:

@@ -936,7 +939,6 @@ class ServiceRestricted( ServiceRemote ):
             self._DelayFutureRequests( str( e ) )
             
-        
         raise

@@ -1071,7 +1073,7 @@ class ServiceRepository( ServiceRestricted ):
         try:
             
-            self._CheckFunctional( including_external_communication = False, including_account = False )
+            self._CheckFunctional( including_external_communication = False, including_bandwidth = False, including_account = False )
             
             return True

@@ -1081,7 +1083,7 @@ class ServiceRepository( ServiceRestricted ):
-    def _CheckFunctional( self, including_external_communication = True, including_account = True ):
+    def _CheckFunctional( self, including_external_communication = True, including_bandwidth = True, including_account = True ):
         
         if self._paused:

@@ -1093,7 +1095,7 @@ class ServiceRepository( ServiceRestricted ):
             raise HydrusExceptions.InsufficientCredentialsException( 'All repositories are paused!' )
             
         
-        ServiceRestricted._CheckFunctional( self, including_external_communication = including_external_communication, including_account = including_account )
+        ServiceRestricted._CheckFunctional( self, including_external_communication = including_external_communication, including_bandwidth = including_bandwidth, including_account = including_account )
         
     def _GetSerialisableDictionary( self ):

@@ -1502,6 +1504,10 @@ class ServiceRepository( ServiceRestricted ):
                 time.sleep( 3 ) # stop spamming of repo sync daemon from bringing this up again too quick
                 
             
+        except HydrusExceptions.ShutdownException:
+            
+            return
+            
         except Exception as e:
             
             with self._lock:
@@ -1,4 +1,5 @@
 from . import ClientConstants as CC
+import collections
 from . import HydrusGlobals as HG
 from . import HydrusSerialisable
 from . import HydrusTags

@@ -183,6 +184,33 @@ def SortTags( sort_by, tags_list, tags_to_count = None ):
     tags_list.sort( key = key, reverse = reverse )
     
+class ServiceKeysToTags( HydrusSerialisable.SerialisableBase, collections.defaultdict ):
+    
+    SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_SERVICE_KEYS_TO_TAGS
+    SERIALISABLE_NAME = 'Service Keys To Tags'
+    SERIALISABLE_VERSION = 1
+    
+    def __init__( self, *args, **kwargs ):
+        
+        collections.defaultdict.__init__( self, set, *args, **kwargs )
+        HydrusSerialisable.SerialisableBase.__init__( self )
+        
+    
+    def _GetSerialisableInfo( self ):
+        
+        return [ ( service_key.hex(), list( tags ) ) for ( service_key, tags ) in self.items() ]
+        
+    
+    def _InitialiseFromSerialisableInfo( self, serialisable_info ):
+        
+        for ( service_key_hex, tags_list ) in serialisable_info:
+            
+            self[ bytes.fromhex( service_key_hex ) ] = set( tags_list )
+            
+        
+    
+HydrusSerialisable.SERIALISABLE_TYPES_TO_OBJECT_TYPES[ HydrusSerialisable.SERIALISABLE_TYPE_SERVICE_KEYS_TO_TAGS ] = ServiceKeysToTags
+
 class TagFilter( HydrusSerialisable.SerialisableBase ):
     
     SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_TAG_FILTER
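The new `ServiceKeysToTags` above is a `defaultdict(set)` keyed by service-key bytes, with hex/list conversion for serialisation. A stand-alone sketch of the same idea (the `SerialisableBase` plumbing dropped, and `sorted` used instead of `list` so the output is deterministic):

```python
import collections

class ServiceKeysToTags(collections.defaultdict):
    # looking up an unknown service key yields a fresh set, so callers
    # can do skt[key].add(tag) without existence checks
    def __init__(self, *args, **kwargs):
        collections.defaultdict.__init__(self, set, *args, **kwargs)

    def to_serialisable(self):
        # bytes keys become hex strings, sets become (sorted) lists
        return [(service_key.hex(), sorted(tags)) for (service_key, tags) in self.items()]

    @classmethod
    def from_serialisable(cls, serialisable_info):
        skt = cls()
        for (service_key_hex, tags_list) in serialisable_info:
            skt[bytes.fromhex(service_key_hex)] = set(tags_list)
        return skt
```

The hex round-trip is needed because JSON cannot carry raw bytes keys.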
@@ -67,8 +67,8 @@ options = {}
 # Misc
 
 NETWORK_VERSION = 18
-SOFTWARE_VERSION = 340
-CLIENT_API_VERSION = 1
+SOFTWARE_VERSION = 341
+CLIENT_API_VERSION = 2
 
 UNSCALED_THUMBNAIL_DIMENSIONS = ( 200, 200 )
@@ -299,7 +299,6 @@ class HydrusDB( object ):
         self._transaction_started = HydrusData.GetNow()
         self._in_transaction = True
-        self._transaction_contains_writes = False
         

@@ -408,6 +407,19 @@ class HydrusDB( object ):
             create_db = True
             
+            external_db_paths = [ os.path.join( self._db_dir, self._db_filenames[ db_name ] ) for db_name in self._db_filenames if db_name != 'main' ]
+            
+            existing_external_db_paths = [ external_db_path for external_db_path in external_db_paths if os.path.exists( external_db_path ) ]
+            
+            if len( existing_external_db_paths ) > 0:
+                
+                message = 'Although the external files, "{}" do exist, the main database file, "{}", does not! This makes for an invalid database, and the program will now quit. Please contact hydrus_dev if you do not know how this happened or need help recovering from hard drive failure.'
+                
+                message = message.format( ', '.join( existing_external_db_paths ), db_path )
+                
+                raise HydrusExceptions.DBException( message )
+                
+            
         self._InitDBCursor()

@@ -444,8 +456,6 @@ class HydrusDB( object ):
         self._c = self._db.cursor()
         
         self._c.execute( 'PRAGMA temp_store = 2;' )
         
         self._c.execute( 'PRAGMA main.cache_size = -10000;' )
         
         self._c.execute( 'ATTACH ":memory:" AS mem;' )

@@ -580,6 +590,8 @@ class HydrusDB( object ):
             self._BeginImmediate()
             
+            self._transaction_contains_writes = False
+            
         else:
             
             self._Save()

@@ -713,6 +725,11 @@ class HydrusDB( object ):
         return [ row for row in self._SelectFromList( select_statement, xs ) ]
         
     
+    def _ShrinkMemory( self ):
+        
+        self._c.execute( 'PRAGMA shrink_memory;' )
+        
+    
     def _STI( self, iterable_cursor ):
         
         # strip singleton tuples to an iterator

@@ -891,6 +908,8 @@ class HydrusDB( object ):
             self._BeginImmediate()
             
+            self._transaction_contains_writes = False
+            
         
         if HydrusData.TimeHasPassed( self._connection_timestamp + CONNECTION_REFRESH_TIME ): # just to clear out the journal files
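The `@@ -408` hunk above refuses to boot when external database files exist but the main file is missing, since silently creating a fresh `main` would desync it from the survivors. The check reduces to a small path scan (a sketch with hypothetical filenames, not the `HydrusDB` code):

```python
import os

def check_no_orphaned_externals(db_dir, main_filename, external_filenames):
    # if the main db exists, any mix of external files is fine
    main_path = os.path.join(db_dir, main_filename)
    if os.path.exists(main_path):
        return
    # main is missing: any surviving external file means a broken store,
    # and recreating main would silently orphan it
    existing = [f for f in external_filenames
                if os.path.exists(os.path.join(db_dir, f))]
    if existing:
        raise RuntimeError(
            'External files {} exist but the main database file {} does not!'.format(
                ', '.join(existing), main_path))
```

Failing loudly here trades a confusing crash later for an actionable error at startup.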
@@ -43,6 +43,7 @@ class NetworkVersionException( NetworkException ): pass
 class NoContentException( NetworkException ): pass
 class NotFoundException( NetworkException ): pass
 class NotModifiedException( NetworkException ): pass
+class BadRequestException( NetworkException ): pass
 class MissingCredentialsException( NetworkException ): pass
 class InsufficientCredentialsException( NetworkException ): pass
 class RedirectionException( NetworkException ): pass
@@ -71,7 +71,7 @@ def GenerateThumbnail( path, mime, dimensions = HC.UNSCALED_THUMBNAIL_DIMENSIONS
     if mime in ( HC.IMAGE_JPEG, HC.IMAGE_PNG, HC.IMAGE_GIF ):
         
-        thumbnail = GenerateThumbnailFromStaticImage( path, dimensions, mime )
+        thumbnail = GenerateThumbnailFileBytesFromStaticImagePath( path, dimensions, mime )
         
     else:

@@ -139,7 +139,7 @@ def GenerateThumbnail( path, mime, dimensions = HC.UNSCALED_THUMBNAIL_DIMENSIONS
     return thumbnail
     
-def GenerateThumbnailFromStaticImagePIL( path, dimensions = HC.UNSCALED_THUMBNAIL_DIMENSIONS, mime = None ):
+def GenerateThumbnailFileBytesFromStaticImagePathPIL( path, dimensions = HC.UNSCALED_THUMBNAIL_DIMENSIONS, mime = None ):
     
     f = io.BytesIO()

@@ -155,7 +155,7 @@ def GenerateThumbnailFromStaticImagePIL( path, dimensions = HC.UNSCALED_THUMBNAI
     return thumbnail
     
-GenerateThumbnailFromStaticImage = GenerateThumbnailFromStaticImagePIL
+GenerateThumbnailFileBytesFromStaticImagePath = GenerateThumbnailFileBytesFromStaticImagePathPIL
 
 def GetExtraHashesFromPath( path ):
@@ -37,6 +37,8 @@ shutdown_complete = False
 restart = False
 emergency_exit = False
 
+thumbnail_experiment_mode = False
+
 twisted_is_broke = False
 
 do_not_catch_char_hook = False
@@ -231,7 +231,7 @@ def DumpToGETQuery( args ):
     return query
     
-def ParseNetworkBytesToHydrusArgs( network_bytes ):
+def ParseNetworkBytesToParsedHydrusArgs( network_bytes ):
     
     if len( network_bytes ) == 0:

@@ -240,6 +240,13 @@ def ParseNetworkBytesToHydrusArgs( network_bytes ):
     args = HydrusSerialisable.CreateFromNetworkBytes( network_bytes )
     
+    if not isinstance( args, dict ):
+        
+        raise HydrusExceptions.BadRequestException( 'The given parameter did not seem to be a JSON Object!' )
+        
+    
+    args = HydrusNetworking.ParsedRequestArguments( args )
+    
     if 'access_key' in args:
         
         args[ 'access_key' ] = bytes.fromhex( args[ 'access_key' ] )
@@ -82,7 +82,7 @@ def LocalPortInUse( port ):
 def ParseTwistedRequestGETArgs( requests_args, int_params, byte_params, string_params, json_params ):
     
-    args = {}
+    args = ParsedRequestArguments()
     
     for name_bytes in requests_args:

@@ -116,7 +116,7 @@ def ParseTwistedRequestGETArgs( requests_args, int_params, byte_params, string_p
             except:
                 
-                raise HydrusExceptions.InsufficientCredentialsException( 'I was expecting to parse \'' + name + '\' as an integer, but it failed.' )
+                raise HydrusExceptions.BadRequestException( 'I was expecting to parse \'' + name + '\' as an integer, but it failed.' )
                 
             
         elif name in byte_params:

@@ -127,7 +127,7 @@ def ParseTwistedRequestGETArgs( requests_args, int_params, byte_params, string_p
             except:
                 
-                raise HydrusExceptions.InsufficientCredentialsException( 'I was expecting to parse \'' + name + '\' as a hex-encoded string, but it failed.' )
+                raise HydrusExceptions.BadRequestException( 'I was expecting to parse \'' + name + '\' as a hex-encoded bytestring, but it failed.' )
                 
             
         elif name in string_params:

@@ -138,7 +138,7 @@ def ParseTwistedRequestGETArgs( requests_args, int_params, byte_params, string_p
             except:
                 
-                raise HydrusExceptions.InsufficientCredentialsException( 'I was expecting to parse \'' + name + '\' as a hex-encoded string, but it failed.' )
+                raise HydrusExceptions.BadRequestException( 'I was expecting to parse \'' + name + '\' as a percent-encdode string, but it failed.' )
                 
             
         elif name in json_params:

@@ -149,13 +149,20 @@ def ParseTwistedRequestGETArgs( requests_args, int_params, byte_params, string_p
             except:
                 
-                raise HydrusExceptions.InsufficientCredentialsException( 'I was expecting to parse \'' + name + '\' as a json-encoded string, but it failed.' )
+                raise HydrusExceptions.BadRequestException( 'I was expecting to parse \'' + name + '\' as a json-encoded string, but it failed.' )
                 
             
         
     
     return args
     
+class ParsedRequestArguments( dict ):
+    
+    def __missing__( self, key ):
+        
+        raise HydrusExceptions.BadRequestException( 'It looks like the parameter "{}" was missing!'.format( key ) )
+        
+    
 class BandwidthRules( HydrusSerialisable.SerialisableBase ):
     
     SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_BANDWIDTH_RULES
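The new `ParsedRequestArguments` class above subclasses `dict` and overrides `__missing__`, so any handler that indexes a parameter the client never sent gets a 400-style "missing parameter" error instead of a bare `KeyError` that would surface as a 500. A stand-alone sketch (the exception class here stands in for `HydrusExceptions.BadRequestException`):

```python
class BadRequestError(KeyError):
    # stand-in for HydrusExceptions.BadRequestException
    pass

class ParsedRequestArguments(dict):
    def __missing__(self, key):
        # dict only calls __missing__ on a failed [] lookup,
        # so .get() and `in` tests still behave normally
        raise BadRequestError('It looks like the parameter "{}" was missing!'.format(key))

args = ParsedRequestArguments({'file_id': 123})
args['file_id']   # -> 123
args.get('hash')  # -> None, no error raised
```

This is why the hunks elsewhere swap `args = {}` for `args = ParsedRequestArguments()`: the handlers themselves stay unchanged but gain precise error reporting.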
@@ -91,6 +91,7 @@ SERIALISABLE_TYPE_LOGIN_SCRIPT_DOMAIN = 73
 SERIALISABLE_TYPE_LOGIN_STEP = 74
 SERIALISABLE_TYPE_CLIENT_API_MANAGER = 75
 SERIALISABLE_TYPE_CLIENT_API_PERMISSIONS = 76
+SERIALISABLE_TYPE_SERVICE_KEYS_TO_TAGS = 77
 
 SERIALISABLE_TYPES_TO_OBJECT_TYPES = {}
@@ -17,7 +17,7 @@ class HydrusRequest( Request ):
         self.start_time = time.clock()
         self.is_hydrus_client = True
-        self.hydrus_args = None
+        self.parsed_request_args = None
         self.hydrus_response_context = None
@@ -3,6 +3,7 @@ from . import HydrusExceptions
 from . import HydrusFileHandling
 from . import HydrusImageHandling
 from . import HydrusNetwork
+from . import HydrusNetworking
 from . import HydrusPaths
 from . import HydrusSerialisable
 import os

@@ -230,10 +231,10 @@ def ParseFileArguments( path, decompression_bombs_ok = False ):
     except Exception as e:
         
-        raise HydrusExceptions.InsufficientCredentialsException( 'File ' + hash.hex() + ' could not parse: ' + str( e ) )
+        raise HydrusExceptions.BadRequestException( 'File ' + hash.hex() + ' could not parse: ' + str( e ) )
         
     
-    args = {}
+    args = HydrusNetworking.ParsedRequestArguments()
     
     args[ 'path' ] = path
     args[ 'hash' ] = hash

@@ -256,7 +257,7 @@ def ParseFileArguments( path, decompression_bombs_ok = False ):
             tb = traceback.format_exc()
             
-            raise HydrusExceptions.InsufficientCredentialsException( 'Could not generate thumbnail from that file:' + os.linesep + tb )
+            raise HydrusExceptions.BadRequestException( 'Could not generate thumbnail from that file:' + os.linesep + tb )
             
         
         args[ 'thumbnail' ] = thumbnail
@@ -540,13 +541,9 @@ class HydrusResource( Resource ):
         default_mime = HC.TEXT_HTML
         default_encoding = str
         
-        if failure.type == KeyError:
+        if failure.type == HydrusExceptions.BadRequestException:
             
-            response_context = ResponseContext( 400, mime = default_mime, body = default_encoding( 'It appears one or more parameters required for that request were missing:' + os.linesep + failure.getTraceback() ) )
-            
-        elif failure.type == HydrusExceptions.BandwidthException:
-            
-            response_context = ResponseContext( 509, mime = default_mime, body = default_encoding( failure.value ) )
+            response_context = ResponseContext( 400, mime = default_mime, body = default_encoding( failure.value ) )
             
         elif failure.type == HydrusExceptions.MissingCredentialsException:

@@ -560,6 +557,10 @@ class HydrusResource( Resource ):
             response_context = ResponseContext( 404, mime = default_mime, body = default_encoding( failure.value ) )
             
+        elif failure.type == HydrusExceptions.SessionException:
+            
+            response_context = ResponseContext( 419, mime = default_mime, body = default_encoding( failure.value ) )
+            
         elif failure.type == HydrusExceptions.NetworkVersionException:
             
             response_context = ResponseContext( 426, mime = default_mime, body = default_encoding( failure.value ) )

@@ -568,9 +569,9 @@ class HydrusResource( Resource ):
             response_context = ResponseContext( 503, mime = default_mime, body = default_encoding( failure.value ) )
             
-        elif failure.type == HydrusExceptions.SessionException:
+        elif failure.type == HydrusExceptions.BandwidthException:
             
-            response_context = ResponseContext( 419, mime = default_mime, body = default_encoding( failure.value ) )
+            response_context = ResponseContext( 509, mime = default_mime, body = default_encoding( failure.value ) )
             
         else:
@@ -290,7 +290,7 @@ def StripTextOfGumpf( t ):
     t = HydrusText.re_multiple_spaces.sub( ' ', t )
     
-    t = HydrusText.re_trailing_space.sub( '', t )
+    t = t.strip()
     
     t = HydrusText.re_leading_space_or_garbage.sub( '', t )
@@ -1,10 +1,18 @@
+try:
+    
+    import chardet
+    
+    CHARDET_OK = True
+    
+except:
+    
+    CHARDET_OK = False
+    
 import json
 import re
 
 re_newlines = re.compile( '[\r\n]+' )
 re_multiple_spaces = re.compile( '\\s+' )
 re_trailing_space = re.compile( '\\s+$' )
 re_leading_space = re.compile( '^\\s+' )
 re_leading_space_or_garbage = re.compile( '^(\\s|-|system:)+' )
 re_leading_single_colon = re.compile( '^:(?!:)' )
 re_leading_byte_order_mark = re.compile( '^\ufeff' ) # unicode .txt files prepend with this, wew

@@ -21,7 +29,7 @@ def DeserialiseNewlinedTexts( text ):
     texts = text.splitlines()
     
-    texts = [ StripTrailingAndLeadingSpaces( line ) for line in texts ]
+    texts = [ StripIOInputLine( line ) for line in texts ]
     
     texts = [ line for line in texts if line != '' ]
@@ -81,15 +89,22 @@ def NonFailingUnicodeDecode( data, encoding ):
     error_count = text.count( unicode_replacement_character )
     
-    if encoding not in ( 'utf-8', 'utf8', 'UTF-8', 'UTF8' ):
+    if CHARDET_OK:
         
-        utf8_text = str( data, 'utf-8', errors = 'replace' )
-        
-        utf8_error_count = utf8_text.count( unicode_replacement_character )
-        
-        if utf8_error_count < error_count:
+        chardet_result = chardet.detect( data )
+        
+        if chardet_result[ 'confidence' ] > 0.85:
             
-            return ( utf8_text, 'utf-8' )
+            chardet_encoding = chardet_result[ 'encoding' ]
+            
+            chardet_text = str( data, chardet_encoding, errors = 'replace' )
+            
+            chardet_error_count = chardet_text.count( unicode_replacement_character )
+            
+            if chardet_error_count < error_count:
+                
+                return ( chardet_text, chardet_encoding )
+                

@@ -106,13 +121,11 @@ def SortStringsIgnoringCase( list_of_strings ):
     list_of_strings.sort( key = lambda s: s.lower() )
     
-def StripTrailingAndLeadingSpaces( t ):
+def StripIOInputLine( t ):
     
     t = re_leading_byte_order_mark.sub( '', t )
     
-    t = re_trailing_space.sub( '', t )
-    
-    t = re_leading_space.sub( '', t )
+    t = t.strip()
     
     return t
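The `NonFailingUnicodeDecode` rewrite above only trusts a chardet guess when its confidence beats 0.85 *and* decoding with it produces fewer U+FFFD replacement characters than the first attempt. The replacement-counting comparison at the heart of it works like this (chardet swapped for a fixed candidate list so the sketch stays stdlib-only):

```python
unicode_replacement_character = '\ufffd'

def best_effort_decode(data, first_encoding, candidates=('utf-8', 'cp1252')):
    # decode with errors='replace', then prefer the first candidate that
    # yields fewer U+FFFD replacement characters than the original attempt
    text = str(data, first_encoding, errors='replace')
    error_count = text.count(unicode_replacement_character)
    for candidate in candidates:
        candidate_text = str(data, candidate, errors='replace')
        if candidate_text.count(unicode_replacement_character) < error_count:
            return (candidate_text, candidate)
    return (text, first_encoding)

# utf-8 bytes decoded as ascii produce replacement chars, so utf-8 wins
best_effort_decode('héllo'.encode('utf-8'), 'ascii')  # -> ('héllo', 'utf-8')
```

Counting replacement characters is a cheap proxy for "this decoding lost information", which is why the real function keeps the comparison even when chardet is confident.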
@@ -1974,7 +1974,7 @@ class DB( HydrusDB.HydrusDB ):
         if result is None:
             
-            raise HydrusExceptions.InsufficientCredentialsException( 'Did not find ip information for that hash.' )
+            raise HydrusExceptions.NotFoundException( 'Did not find ip information for that hash.' )
             
         
         return result
@@ -4,6 +4,7 @@ from . import HydrusData
 from . import HydrusExceptions
 from . import HydrusGlobals as HG
 from . import HydrusNetwork
+from . import HydrusNetworking
 from . import HydrusPaths
 from . import HydrusSerialisable
 from . import HydrusServerResources

@@ -38,9 +39,9 @@ class HydrusResourceHydrusNetwork( HydrusServerResources.HydrusResource ):
     def _callbackParseGETArgs( self, request ):
         
-        hydrus_args = HydrusNetwork.ParseHydrusNetworkGETArgs( request.args )
+        parsed_request_args = HydrusNetwork.ParseHydrusNetworkGETArgs( request.args )
         
-        request.hydrus_args = hydrus_args
+        request.parsed_request_args = parsed_request_args
         
         return request

@@ -51,7 +52,7 @@ class HydrusResourceHydrusNetwork( HydrusServerResources.HydrusResource ):
         if not request.requestHeaders.hasHeader( 'Content-Type' ):
             
-            hydrus_args = {}
+            parsed_request_args = HydrusNetworking.ParsedRequestArguments()
             
         else:

@@ -65,7 +66,7 @@ class HydrusResourceHydrusNetwork( HydrusServerResources.HydrusResource ):
             except:
                 
-                raise HydrusExceptions.InsufficientCredentialsException( 'Did not recognise Content-Type header!' )
+                raise HydrusExceptions.BadRequestException( 'Did not recognise Content-Type header!' )
                 
             
             total_bytes_read = 0

@@ -76,7 +77,7 @@ class HydrusResourceHydrusNetwork( HydrusServerResources.HydrusResource ):
                 total_bytes_read += len( json_string )
                 
-                hydrus_args = HydrusNetwork.ParseNetworkBytesToHydrusArgs( json_string )
+                parsed_request_args = HydrusNetwork.ParseNetworkBytesToParsedHydrusArgs( json_string )
                 
             else:

@@ -96,13 +97,13 @@ class HydrusResourceHydrusNetwork( HydrusServerResources.HydrusResource ):
                 decompression_bombs_ok = self._DecompressionBombsOK( request )
                 
-                hydrus_args = HydrusServerResources.ParseFileArguments( temp_path, decompression_bombs_ok )
+                parsed_request_args = HydrusServerResources.ParseFileArguments( temp_path, decompression_bombs_ok )
                 
             
             self._reportDataUsed( request, total_bytes_read )
             
         
-        request.hydrus_args = hydrus_args
+        request.parsed_request_args = parsed_request_args
         
         return request
@@ -111,7 +112,7 @@ class HydrusResourceAccessKey( HydrusResourceHydrusNetwork ):
     def _threadDoGETJob( self, request ):
         
-        registration_key = request.hydrus_args[ 'registration_key' ]
+        registration_key = request.parsed_request_args[ 'registration_key' ]
         
         access_key = HG.server_controller.Read( 'access_key', self._service_key, registration_key )

@@ -289,7 +290,7 @@ class HydrusResourceRestrictedAccountInfo( HydrusResourceRestricted ):
     def _threadDoGETJob( self, request ):
         
-        subject_identifier = request.hydrus_args[ 'subject_identifier' ]
+        subject_identifier = request.parsed_request_args[ 'subject_identifier' ]
         
         if subject_identifier.HasAccountKey():

@@ -315,11 +316,11 @@ class HydrusResourceRestrictedAccountModification( HydrusResourceRestricted ):
     def _threadDoPOSTJob( self, request ):
         
-        action = request.hydrus_args[ 'action' ]
+        action = request.parsed_request_args[ 'action' ]
         
-        subject_accounts = request.hydrus_args[ 'accounts' ]
+        subject_accounts = request.parsed_request_args[ 'accounts' ]
         
-        kwargs = request.hydrus_args # for things like expires, title, and so on
+        kwargs = request.parsed_request_args # for things like expires, title, and so on
         
         with HG.dirty_object_lock:

@@ -348,8 +349,8 @@ class HydrusResourceRestrictedAccountTypes( HydrusResourceRestricted ):
     def _threadDoPOSTJob( self, request ):
         
-        account_types = request.hydrus_args[ 'account_types' ]
-        deletee_account_type_keys_to_new_account_type_keys = request.hydrus_args[ 'deletee_account_type_keys_to_new_account_type_keys' ]
+        account_types = request.parsed_request_args[ 'account_types' ]
+        deletee_account_type_keys_to_new_account_type_keys = request.parsed_request_args[ 'deletee_account_type_keys_to_new_account_type_keys' ]
         
         HG.server_controller.WriteSynchronous( 'account_types', self._service_key, request.hydrus_account, account_types, deletee_account_type_keys_to_new_account_type_keys )

@@ -378,7 +379,7 @@ class HydrusResourceRestrictedIP( HydrusResourceRestricted ):
     def _threadDoGETJob( self, request ):
         
-        hash = request.hydrus_args[ 'hash' ]
+        hash = request.parsed_request_args[ 'hash' ]
         
         ( ip, timestamp ) = HG.server_controller.Read( 'ip', self._service_key, request.hydrus_account, hash )

@@ -406,8 +407,8 @@ class HydrusResourceRestrictedPetition( HydrusResourceRestricted ):
     def _threadDoGETJob( self, request ):
         
-        content_type = request.hydrus_args[ 'content_type' ]
-        status = request.hydrus_args[ 'status' ]
+        content_type = request.parsed_request_args[ 'content_type' ]
+        status = request.parsed_request_args[ 'status' ]
         
         petition = HG.server_controller.Read( 'petition', self._service_key, request.hydrus_account, content_type, status )

@@ -422,12 +423,12 @@ class HydrusResourceRestrictedRegistrationKeys( HydrusResourceRestricted ):
     def _threadDoGETJob( self, request ):
         
-        num = request.hydrus_args[ 'num' ]
-        account_type_key = request.hydrus_args[ 'account_type_key' ]
+        num = request.parsed_request_args[ 'num' ]
+        account_type_key = request.parsed_request_args[ 'account_type_key' ]
         
-        if 'expires' in request.hydrus_args:
+        if 'expires' in request.parsed_request_args:
             
-            expires = request.hydrus_args[ 'expires' ]
+            expires = request.parsed_request_args[ 'expires' ]
             
         else:
@@ -456,7 +457,7 @@ class HydrusResourceRestrictedRepositoryFile( HydrusResourceRestricted ):
         # no permission check as any functional account can get files
         
-        hash = request.hydrus_args[ 'hash' ]
+        hash = request.parsed_request_args[ 'hash' ]
         
         ( valid, mime ) = HG.server_controller.Read( 'service_has_file', self._service_key, hash )

@@ -474,7 +475,7 @@ class HydrusResourceRestrictedRepositoryFile( HydrusResourceRestricted ):
     def _threadDoPOSTJob( self, request ):
         
-        file_dict = request.hydrus_args
+        file_dict = request.parsed_request_args
         
         if self._service.LogUploaderIPs():

@@ -496,7 +497,7 @@ class HydrusResourceRestrictedRepositoryThumbnail( HydrusResourceRestricted ):
         # no permission check as any functional account can get thumbnails
         
-        hash = request.hydrus_args[ 'hash' ]
+        hash = request.parsed_request_args[ 'hash' ]
         
         ( valid, mime ) = HG.server_controller.Read( 'service_has_file', self._service_key, hash )

@@ -532,13 +533,13 @@ class HydrusResourceRestrictedServices( HydrusResourceRestricted ):
     def _threadDoPOSTJob( self, request ):
         
-        services = request.hydrus_args[ 'services' ]
+        services = request.parsed_request_args[ 'services' ]
         
         unique_ports = { service.GetPort() for service in services }
         
         if len( unique_ports ) < len( services ):
             
-            raise HydrusExceptions.InsufficientCredentialsException( 'It looks like some of those services share ports! Please give them unique ports!' )
+            raise HydrusExceptions.BadRequestException( 'It looks like some of those services share ports! Please give them unique ports!' )
             
         
         with HG.dirty_object_lock:

@@ -563,7 +564,7 @@ class HydrusResourceRestrictedUpdate( HydrusResourceRestricted ):
         # no permissions check as any functional account can get updates
         
-        update_hash = request.hydrus_args[ 'update_hash' ]
+        update_hash = request.parsed_request_args[ 'update_hash' ]
         
         if not self._service.HasUpdateHash( update_hash ):

@@ -579,7 +580,7 @@ class HydrusResourceRestrictedUpdate( HydrusResourceRestricted ):
     def _threadDoPOSTJob( self, request ):
         
-        client_to_server_update = request.hydrus_args[ 'client_to_server_update' ]
+        client_to_server_update = request.parsed_request_args[ 'client_to_server_update' ]
         
         HG.server_controller.WriteSynchronous( 'update', self._service_key, request.hydrus_account, client_to_server_update )

@@ -609,7 +610,7 @@ class HydrusResourceRestrictedMetadataUpdate( HydrusResourceRestricted ):
         # no permissions check as any functional account can get metadata slices
         
-        since = request.hydrus_args[ 'since' ]
+        since = request.parsed_request_args[ 'since' ]
         
         metadata_slice = self._service.GetMetadataSlice( since )
@@ -3,8 +3,10 @@ from . import ClientAPI
from . import ClientLocalServer
from . import ClientServices
from . import ClientTags
import collections
import http.client
from . import HydrusConstants as HC
from . import HydrusTags
from . import HydrusText
import json
import os

@@ -309,7 +311,7 @@ class TestClientAPI( unittest.TestCase ):
        
        headers = { 'Hydrus-Client-API-Access-Key' : access_key_hex }
        
-        # none
+        #
        
        path = '/add_tags/get_tag_services'
        
@@ -332,6 +334,117 @@ class TestClientAPI( unittest.TestCase ):
        
        self.assertEqual( d, expected_answer )
        
        # clean tags
        
        tags = [ " bikini ", "blue eyes", " character : samus aran ", ":)", " ", "", "10", "11", "9", "system:wew", "-flower" ]
        
        json_tags = json.dumps( tags )
        
        path = '/add_tags/clean_tags?tags={}'.format( urllib.parse.quote( json_tags, safe = '' ) )
        
        connection.request( 'GET', path, headers = headers )
        
        response = connection.getresponse()
        
        data = response.read()
        
        text = str( data, 'utf-8' )
        
        self.assertEqual( response.status, 200 )
        
        d = json.loads( text )
        
        expected_answer = {}
        
        clean_tags = [ "bikini", "blue eyes", "character:samus aran", "::)", "10", "11", "9", "wew", "flower" ]
        
        clean_tags = HydrusTags.SortNumericTags( clean_tags )
        
        expected_answer[ 'tags' ] = clean_tags
        
        self.assertEqual( d, expected_answer )
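Outside the test harness, the GET path the clean tags test builds can be sketched as follows. The endpoint and the percent-encoded JSON `tags` parameter come from the test above; the example tag list is shortened, and no actual client connection is made:

```python
import json
import urllib.parse

# tags as a user might type them, before hydrus cleans them
tags = [ " bikini ", "blue eyes", " character : samus aran " ]

# /add_tags/clean_tags takes a single 'tags' parameter: percent-encoded JSON
json_tags = json.dumps( tags )

path = '/add_tags/clean_tags?tags={}'.format( urllib.parse.quote( json_tags, safe = '' ) )
```

The resulting `path` would then be issued as a GET against the client's API port, with the access key in the `Hydrus-Client-API-Access-Key` header as above.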
        
        # add tags
        
        headers = { 'Hydrus-Client-API-Access-Key' : access_key_hex, 'Content-Type' : HC.mime_string_lookup[ HC.APPLICATION_JSON ] }
        
        hash = os.urandom( 32 )
        hash_hex = hash.hex()
        
        hash2 = os.urandom( 32 )
        hash2_hex = hash2.hex()
        
        # missing hashes
        
        path = '/add_tags/add_tags'
        
        body_dict = { 'service_names_to_tags' : { 'local tags' : [ 'test' ] } }
        
        body = json.dumps( body_dict )
        
        connection.request( 'POST', path, body = body, headers = headers )
        
        response = connection.getresponse()
        
        data = response.read()
        
        self.assertEqual( response.status, 400 )
        
        # invalid service key
        
        path = '/add_tags/add_tags'
        
        body_dict = { 'hash' : hash_hex, 'service_names_to_tags' : { 'bad tag service' : [ 'test' ] } }
        
        body = json.dumps( body_dict )
        
        connection.request( 'POST', path, body = body, headers = headers )
        
        response = connection.getresponse()
        
        data = response.read()
        
        self.assertEqual( response.status, 400 )
        
        # add tags to local
        
        HG.test_controller.ClearWrites( 'content_updates' )
        
        path = '/add_tags/add_tags'
        
        body_dict = { 'hash' : hash_hex, 'service_names_to_tags' : { 'local tags' : [ 'test', 'test2' ] } }
        
        body = json.dumps( body_dict )
        
        connection.request( 'POST', path, body = body, headers = headers )
        
        response = connection.getresponse()
        
        data = response.read()
        
        self.assertEqual( response.status, 200 )
        
        expected_service_keys_to_content_updates = collections.defaultdict( list )
        
        expected_service_keys_to_content_updates[ CC.LOCAL_TAG_SERVICE_KEY ] = [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_MAPPINGS, HC.CONTENT_UPDATE_ADD, ( 'test', set( [ hash ] ) ) ), HydrusData.ContentUpdate( HC.CONTENT_TYPE_MAPPINGS, HC.CONTENT_UPDATE_ADD, ( 'test2', set( [ hash ] ) ) ) ]
        
        [ ( ( service_keys_to_content_updates, ), kwargs ) ] = HG.test_controller.GetWrite( 'content_updates' )
        
        self.assertEqual( len( service_keys_to_content_updates ), len( expected_service_keys_to_content_updates ) )
        
        for ( service_key, content_updates ) in service_keys_to_content_updates.items():
            
            expected_content_updates = expected_service_keys_to_content_updates[ service_key ]
            
            c_u_tuples = [ c_u.ToTuple() for c_u in content_updates ]
            e_c_u_tuples = [ e_c_u.ToTuple() for e_c_u in expected_content_updates ]
            
            c_u_tuples.sort()
            e_c_u_tuples.sort()
            
            self.assertEqual( c_u_tuples, e_c_u_tuples )
            
        
    
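A minimal sketch of the JSON body that /add_tags/add_tags expects, following the successful case in the test above. 'local tags' is assumed to be the name of the client's local tag service, and the hash is a random placeholder rather than a real file:

```python
import json
import os

# a random 32-byte (sha256-sized) file hash, hex-encoded, as in the tests
hash_hex = os.urandom( 32 ).hex()

# map tag service names to the tags that should be added to the file
body_dict = { 'hash' : hash_hex, 'service_names_to_tags' : { 'local tags' : [ 'test', 'test2' ] } }

body = json.dumps( body_dict )
```

As the tests show, omitting the hash or naming an unknown tag service yields a 400 response.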
    def _test_add_urls( self, connection, set_up_permissions ):
        
@@ -504,9 +617,13 @@ class TestClientAPI( unittest.TestCase ):
        
        # add url
        
        HG.test_controller.ClearWrites( 'import_url_test' )
        
        headers = { 'Hydrus-Client-API-Access-Key' : access_key_hex, 'Content-Type' : HC.mime_string_lookup[ HC.APPLICATION_JSON ] }
        
-        request_dict = { 'url' : 'http://8ch.net/tv/res/1846574.html' }
+        url = 'http://8ch.net/tv/res/1846574.html'
+        
+        request_dict = { 'url' : url }
        
        request_body = json.dumps( request_dict )
        
@@ -525,6 +642,176 @@ class TestClientAPI( unittest.TestCase ):
        
        self.assertEqual( response_json[ 'human_result_text' ], '"https://8ch.net/tv/res/1846574.html" URL added successfully.' )
        self.assertEqual( response_json[ 'normalised_url' ], 'https://8ch.net/tv/res/1846574.html' )
        
        self.assertEqual( HG.test_controller.GetWrite( 'import_url_test' ), [ ( ( url, None, None ), {} ) ] )
        
        # with name
        
        HG.test_controller.ClearWrites( 'import_url_test' )
        
        request_dict = { 'url' : url, 'destination_page_name' : 'muh /tv/' }
        
        request_body = json.dumps( request_dict )
        
        connection.request( 'POST', '/add_urls/add_url', body = request_body, headers = headers )
        
        response = connection.getresponse()
        
        self.assertEqual( response.status, 200 )
        
        data = response.read()
        
        text = str( data, 'utf-8' )
        
        response_json = json.loads( text )
        
        self.assertEqual( response_json[ 'human_result_text' ], '"https://8ch.net/tv/res/1846574.html" URL added successfully.' )
        self.assertEqual( response_json[ 'normalised_url' ], 'https://8ch.net/tv/res/1846574.html' )
        
        self.assertEqual( HG.test_controller.GetWrite( 'import_url_test' ), [ ( ( url, None, 'muh /tv/' ), {} ) ] )
        
        # add tags and name
        
        HG.test_controller.ClearWrites( 'import_url_test' )
        
        request_dict = { 'url' : url, 'destination_page_name' : 'muh /tv/', 'service_names_to_tags' : { 'local tags' : [ '/tv/ thread' ] } }
        
        request_body = json.dumps( request_dict )
        
        connection.request( 'POST', '/add_urls/add_url', body = request_body, headers = headers )
        
        response = connection.getresponse()
        
        self.assertEqual( response.status, 200 )
        
        data = response.read()
        
        text = str( data, 'utf-8' )
        
        response_json = json.loads( text )
        
        self.assertEqual( response_json[ 'human_result_text' ], '"https://8ch.net/tv/res/1846574.html" URL added successfully.' )
        self.assertEqual( response_json[ 'normalised_url' ], 'https://8ch.net/tv/res/1846574.html' )
        
        service_keys_to_tags = ClientTags.ServiceKeysToTags( { CC.LOCAL_TAG_SERVICE_KEY : set( [ '/tv/ thread' ] ) } )
        
        self.assertEqual( HG.test_controller.GetWrite( 'import_url_test' ), [ ( ( url, service_keys_to_tags, 'muh /tv/' ), {} ) ] )
        
        # associate url
        
        HG.test_controller.ClearWrites( 'content_updates' )
        
        hash = bytes.fromhex( '3b820114f658d768550e4e3d4f1dced3ff8db77443472b5ad93700647ad2d3ba' )
        url = 'https://rule34.xxx/index.php?id=2588418&page=post&s=view'
        
        request_dict = { 'url_to_add' : url, 'hash' : hash.hex() }
        
        request_body = json.dumps( request_dict )
        
        connection.request( 'POST', '/add_urls/associate_url', body = request_body, headers = headers )
        
        response = connection.getresponse()
        
        data = response.read()
        
        self.assertEqual( response.status, 200 )
        
        expected_service_keys_to_content_updates = collections.defaultdict( list )
        
        expected_service_keys_to_content_updates[ CC.COMBINED_LOCAL_FILE_SERVICE_KEY ] = [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_URLS, HC.CONTENT_UPDATE_ADD, ( [ url ], [ hash ] ) ) ]
        
        expected_result = [ ( ( expected_service_keys_to_content_updates, ), {} ) ]
        
        result = HG.test_controller.GetWrite( 'content_updates' )
        
        self.assertEqual( result, expected_result )
        
        #
        
        HG.test_controller.ClearWrites( 'content_updates' )
        
        hash = bytes.fromhex( '3b820114f658d768550e4e3d4f1dced3ff8db77443472b5ad93700647ad2d3ba' )
        url = 'https://rule34.xxx/index.php?id=2588418&page=post&s=view'
        
        request_dict = { 'urls_to_add' : [ url ], 'hashes' : [ hash.hex() ] }
        
        request_body = json.dumps( request_dict )
        
        connection.request( 'POST', '/add_urls/associate_url', body = request_body, headers = headers )
        
        response = connection.getresponse()
        
        data = response.read()
        
        self.assertEqual( response.status, 200 )
        
        expected_service_keys_to_content_updates = collections.defaultdict( list )
        
        expected_service_keys_to_content_updates[ CC.COMBINED_LOCAL_FILE_SERVICE_KEY ] = [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_URLS, HC.CONTENT_UPDATE_ADD, ( [ url ], [ hash ] ) ) ]
        
        expected_result = [ ( ( expected_service_keys_to_content_updates, ), {} ) ]
        
        result = HG.test_controller.GetWrite( 'content_updates' )
        
        self.assertEqual( result, expected_result )
        
        #
        
        HG.test_controller.ClearWrites( 'content_updates' )
        
        hash = bytes.fromhex( '3b820114f658d768550e4e3d4f1dced3ff8db77443472b5ad93700647ad2d3ba' )
        url = 'http://rule34.xxx/index.php?id=2588418&page=post&s=view'
        
        request_dict = { 'url_to_delete' : url, 'hash' : hash.hex() }
        
        request_body = json.dumps( request_dict )
        
        connection.request( 'POST', '/add_urls/associate_url', body = request_body, headers = headers )
        
        response = connection.getresponse()
        
        data = response.read()
        
        self.assertEqual( response.status, 200 )
        
        expected_service_keys_to_content_updates = collections.defaultdict( list )
        
        expected_service_keys_to_content_updates[ CC.COMBINED_LOCAL_FILE_SERVICE_KEY ] = [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_URLS, HC.CONTENT_UPDATE_DELETE, ( [ url ], [ hash ] ) ) ]
        
        expected_result = [ ( ( expected_service_keys_to_content_updates, ), {} ) ]
        
        result = HG.test_controller.GetWrite( 'content_updates' )
        
        self.assertEqual( result, expected_result )
        
        #
        
        HG.test_controller.ClearWrites( 'content_updates' )
        
        hash = bytes.fromhex( '3b820114f658d768550e4e3d4f1dced3ff8db77443472b5ad93700647ad2d3ba' )
        url = 'http://rule34.xxx/index.php?id=2588418&page=post&s=view'
        
        request_dict = { 'urls_to_delete' : [ url ], 'hashes' : [ hash.hex() ] }
        
        request_body = json.dumps( request_dict )
        
        connection.request( 'POST', '/add_urls/associate_url', body = request_body, headers = headers )
        
        response = connection.getresponse()
        
        data = response.read()
        
        self.assertEqual( response.status, 200 )
        
        expected_service_keys_to_content_updates = collections.defaultdict( list )
        
        expected_service_keys_to_content_updates[ CC.COMBINED_LOCAL_FILE_SERVICE_KEY ] = [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_URLS, HC.CONTENT_UPDATE_DELETE, ( [ url ], [ hash ] ) ) ]
        
        expected_result = [ ( ( expected_service_keys_to_content_updates, ), {} ) ]
        
        result = HG.test_controller.GetWrite( 'content_updates' )
        
        self.assertEqual( result, expected_result )
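The four request shapes exercised above can be summarised in one sketch. /add_urls/associate_url accepts both singular and plural forms for adding and deleting url/file associations; the hash and url are the same fixed examples the tests use:

```python
import json

hash_hex = '3b820114f658d768550e4e3d4f1dced3ff8db77443472b5ad93700647ad2d3ba'
url = 'https://rule34.xxx/index.php?id=2588418&page=post&s=view'

# singular and plural 'add' bodies
single_add = json.dumps( { 'url_to_add' : url, 'hash' : hash_hex } )
plural_add = json.dumps( { 'urls_to_add' : [ url ], 'hashes' : [ hash_hex ] } )

# singular and plural 'delete' bodies
single_delete = json.dumps( { 'url_to_delete' : url, 'hash' : hash_hex } )
plural_delete = json.dumps( { 'urls_to_delete' : [ url ], 'hashes' : [ hash_hex ] } )
```

Each body is POSTed to /add_urls/associate_url with the JSON content type, and per the tests both forms produce the same url content update on the combined local file service.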
    
    def _test_search_files( self, connection, set_up_permissions ):
        
@@ -14,6 +14,7 @@ from . import ClientImportFileSeeds
from . import ClientRatings
from . import ClientSearch
from . import ClientServices
+from . import ClientTags
import collections
from . import HydrusConstants as HC
from . import HydrusData

@@ -605,7 +606,7 @@ class TestClientDB( unittest.TestCase ):
        
        #
        
-        service_keys_to_tags = { HydrusData.GenerateKey() : [ 'some', 'tags' ] }
+        service_keys_to_tags = ClientTags.ServiceKeysToTags( { HydrusData.GenerateKey() : [ 'some', 'tags' ] } )
        
        management_controller = ClientGUIManagement.CreateManagementControllerImportHDD( [ 'some', 'paths' ], ClientImportOptions.FileImportOptions(), { 'paths' : service_keys_to_tags }, True )
        
@@ -1,6 +1,7 @@
import collections
from . import HydrusConstants as HC
from . import ClientData
+from . import ClientTags
import os
from . import TestConstants
import unittest

@@ -18,19 +19,19 @@ class TestFunctions( unittest.TestCase ):
        
        local_key = CC.LOCAL_TAG_SERVICE_KEY
        remote_key = HydrusData.GenerateKey()
        
-        service_keys_to_tags = { local_key : { 'a' } }
+        service_keys_to_tags = ClientTags.ServiceKeysToTags( { local_key : { 'a' } } )
        
        content_updates = { local_key : [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_MAPPINGS, HC.CONTENT_UPDATE_ADD, ( 'a', hashes ) ) ] }
        
        self.assertEqual( ClientData.ConvertServiceKeysToTagsToServiceKeysToContentUpdates( { hash }, service_keys_to_tags ), content_updates )
        
-        service_keys_to_tags = { remote_key : { 'c' } }
+        service_keys_to_tags = ClientTags.ServiceKeysToTags( { remote_key : { 'c' } } )
        
        content_updates = { remote_key : [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_MAPPINGS, HC.CONTENT_UPDATE_PEND, ( 'c', hashes ) ) ] }
        
        self.assertEqual( ClientData.ConvertServiceKeysToTagsToServiceKeysToContentUpdates( { hash }, service_keys_to_tags ), content_updates )
        
-        service_keys_to_tags = { local_key : [ 'a', 'character:b' ], remote_key : [ 'c', 'series:d' ] }
+        service_keys_to_tags = ClientTags.ServiceKeysToTags( { local_key : [ 'a', 'character:b' ], remote_key : [ 'c', 'series:d' ] } )
        
        content_updates = {}
        
|
@ -376,6 +376,8 @@ class TestSerialisables( unittest.TestCase ):
|
|||
scu = {}
|
||||
|
||||
scu[ CC.LOCAL_TAG_SERVICE_KEY ] = [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_MAPPINGS, HC.CONTENT_UPDATE_ADD, ( 'one', { two_hash } ) ), HydrusData.ContentUpdate( HC.CONTENT_TYPE_MAPPINGS, HC.CONTENT_UPDATE_ADD, ( 'two', { one_hash } ) ) ]
|
||||
scu[ TC.LOCAL_RATING_LIKE_SERVICE_KEY ] = [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_RATINGS, HC.CONTENT_UPDATE_ADD, ( 1.0, { two_hash } ) ) ]
|
||||
scu[ TC.LOCAL_RATING_NUMERICAL_SERVICE_KEY ] = [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_RATINGS, HC.CONTENT_UPDATE_ADD, ( 0.8, { two_hash } ) ) ]
|
||||
|
||||
assertSCUEqual( result, scu )
|
||||
|
||||
|
|
test.py
@@ -389,12 +389,14 @@ class Controller( object ):
        
        return write
        
    
-    def ImportURLFromAPI( self, url, service_keys_to_tags ):
+    def ImportURLFromAPI( self, url, service_keys_to_tags, destination_page_name ):
        
        normalised_url = self.network_engine.domain_manager.NormaliseURL( url )
        
        human_result_text = '"{}" URL added successfully.'.format( normalised_url )
        
        self.Write( 'import_url_test', url, service_keys_to_tags, destination_page_name )
        
        return ( normalised_url, human_result_text )
        