Version 363
This commit is contained in:
parent 5f561d9d60
commit 144c24e93c
@@ -8,6 +8,48 @@
<div class="content">
<h3>changelog</h3>
<ul>
<li><h3>version 363</h3></li>
<ul>
<li>has audio:</li>
<li>wrote a detection routine that can determine if a video has audio. it reads actual audio data and should be able to detect videos with a 'fake' silent audio track and consider them as not having audio</li>
<li>extended the client database, file import pipeline, and file metadata object to track the new has_audio value</li>
<li>flash files and audio files (like mp3) are considered to always have audio</li>
<li>all 'maybe' audio files (atm this means video) are queued up for a file metadata reparse in the files maintenance manager. your existing videos will start off as not having audio, but once they are rescanned, they will get it. this is one of the first big jobs of the new maintenance system, and I expect it will need some different throttling rules to finish this job in reasonable time--by default it does 100 files a day, but if you have 50,000 videos, that's a long time!</li>
<li>files now show if they have audio in their info string that appears on thumbnail right-click or the top of the media viewer. it defaults to the unicode character 🔊, but can be edited under the new 'sound' options page</li>
<li>added a system:has audio predicate to search for files with/without audio</li>
<li>updated file import unit tests to check 'has audio' parsing, and added tests for system:has audio</li>
<li>.</li>
<li>client api:</li>
<li>the /get_files/file_metadata call now provides has_audio info</li>
<li>the /get_files/file_metadata call now provides known_urls!</li>
<li>added 'cookie management' permission</li>
<li>added /manage_cookies/get_cookies to get current cookies by domain</li>
<li>added /manage_cookies/set_cookies to set or clear current cookies</li>
<li>added/updated unit tests for the above</li>
<li>updated help for the above</li>
<li>client api version is now 10</li>
<li>the rest:</li>
<li>system:hash and system:similar to now accept multiple hashes! so, if you have 100 md5s, you can now search for them all at once</li>
<li>the thumbnail right-click->file relationships->find similar files now works for multiple selections!</li>
<li>when system:hash was just one hash, it would run before anything else and complete a search immediately on finding a match, but now it works like any other predicate, checking for file domain and ANDing with other predicates in the search</li>
<li>the 'complete' file maintenance regen job now only does file metadata, not a complete thumb regen. its name and labels are updated to reflect this, and any existing job in the system will get the separate thumb regen job</li>
<li>the file maintenance manager now has a couple of how-to sentences at the top, and a new 'see description' button will talk more about each job type</li>
<li>the login script testing system now uses a duplicate of the existing domain manager (rather than a fresh empty one), so it will inherit current http headers such as default User-Agent, the lacking of which was messing up some tests</li>
<li>fixed the login script testing system not showing downloaded data</li>
<li>subscriptions with multiple queries now publish the files they have imported as soon as each query has finished, rather than waiting for the whole sub to be done</li>
<li>subscriptions now publish the files they have imported to page/popup even if they have an error</li>
<li>added 9:16, 2:3, and 4:5 to the duplicate comparison statement system, for various vertical social media types</li>
<li>the autocomplete tag search 'read', which appears on places like search pages, should now more reliably accept the current entered text when there are no search results yet to show</li>
<li>the autocomplete tag search 'write', which appears on places like the manage tags dialog, should now correctly accept the input (including appropriate sibling-collapse) when you select a 'stub' result while other results are still loading, rather than broadcasting the exact current text</li>
<li>fixed the deviant art file page parser to get source time--however the login script may now be broken/unreliable</li>
<li>fixed a missing dialog import when deleting a string transformation</li>
<li>reduced the base network connection error reattempt time to 10s (from 60s). there will be more work here in future</li>
<li>network jobs that are waiting on a connection error now have a reattempt wait override option in their cog icon menus</li>
<li>the post-bad-shutdown 'open your default session or a blank page' dialog will now auto-choose to open your default session in 15 seconds</li>
<li>a variety of ui-update events will now not fire as long as the main gui is minimised. as well as saving a sliver of resources, I believe this may fix an issue where long-running subscriptions and other import pipelines could sometimes put the ui in an unrecoverable state due to too many thumb-fade etc... events when the currently focused page was receiving new files while the main gui was minimised</li>
<li>maybe fixed a rare problem with deleting old pages</li>
<li>cleaned some misc code</li>
</ul>
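The changelog above describes the new has_audio detection only at a high level. As a hedged illustration (not hydrus's actual routine), one common way to flag a 'fake' silent audio track is to decode the stream and measure its level, for instance with ffmpeg's volumedetect filter; the threshold below is an assumption for the sketch:

```python
import re
import subprocess

def get_mean_volume_db( path ):
    
    # run ffmpeg's volumedetect filter over the first audio stream and parse
    # the mean volume it reports on stderr; returns None if there is no audio stream
    result = subprocess.run(
        [ 'ffmpeg', '-i', path, '-map', '0:a:0', '-af', 'volumedetect', '-f', 'null', '-' ],
        capture_output = True, text = True
    )
    
    match = re.search( r'mean_volume:\s*(-?[\d.]+)\s*dB', result.stderr )
    
    return float( match.group( 1 ) ) if match else None
    

def has_audible_audio( mean_volume_db, silence_threshold_db = -60.0 ):
    
    # a 'fake' silent track decodes fine but sits near digital silence, so
    # treat anything quieter than the (assumed) threshold as not having audio
    return mean_volume_db is not None and mean_volume_db > silence_threshold_db
```

A real routine would likely inspect decoded samples directly rather than shelling out per file, but the decision logic is the same shape.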
<li><h3>version 362</h3></li>
<ul>
<li>duplicates work finished:</li>
@@ -35,7 +77,7 @@
<li>like repository processing, there is now a 1 hour hard limit on any individual import folder run</li>
<li>fixed an issue where if a gallery url fetch produced faulty urls, it could sometimes invalidate the whole page with an error rather than just the bad file url items</li>
<li>subscriptions will now stop a gallery-page-results-urls-add action early if that one page produces 100 previously seen before urls in a row. this _should_ fix the issue users were seeing with pixiv artist subs resyncing with much older urls that had previously been compacted out of the sub's cache</li>
<li>until we can get better asynch ui feedback for admin-level repository commands (lke fetching/setting account types), they now override bandwidth rules and only try the connection once for quicker responses</li>
<li>until we can get better asynch ui feedback for admin-level repository commands (like fetching/setting account types), they now override bandwidth rules and only try the connection once for quicker responses</li>
<li>misc code cleanup</li>
</ul>
<li><h3>version 361</h3></li>
@@ -63,6 +63,11 @@
<li><a href="#add_urls_add_url">POST /add_urls/add_url</a></li>
<li><a href="#add_urls_associate_url">POST /add_urls/associate_url</a></li>
</ul>
<h4>Managing Cookies</h4>
<ul>
<li><a href="#manage_cookies_get_cookies">GET /manage_cookies/get_cookies</a></li>
<li><a href="#manage_cookies_set_cookies">POST /manage_cookies/set_cookies</a></li>
</ul>
<h4>Managing Pages</h4>
<ul>
<li><a href="#manage_pages_get_pages">GET /manage_pages/get_pages</a></li>
@@ -88,7 +93,7 @@
<li>
<p>Example response:</p>
<ul>
<li><pre>{"version": 1}</pre></li>
<li><pre>{"version" : 1}</pre></li>
</ul>
</li>
</ul>
@@ -114,6 +119,7 @@
<li>2 - Add Tags</li>
<li>3 - Search for Files</li>
<li>4 - Manage Pages</li>
<li>5 - Manage Cookies</li>
</ul>
</li>
<li>
@@ -126,7 +132,7 @@
<li>
<p>Example response:</p>
<ul>
<li><pre>{"access_key": "73c9ab12751dcf3368f028d3abbe1d8e2a3a48d0de25e64f3a8f00f3a1424c57"}</pre></li>
<li><pre>{"access_key" : "73c9ab12751dcf3368f028d3abbe1d8e2a3a48d0de25e64f3a8f00f3a1424c57"}</pre></li>
</ul>
</li>
</ul>
@@ -143,7 +149,7 @@
<p>Example response:</p>
<ul>
<li>
<pre>{"session_key": "f6e651e7467255ade6f7c66050f3d595ff06d6f3d3693a3a6fb1a9c2b278f800"}</pre>
<pre>{"session_key" : "f6e651e7467255ade6f7c66050f3d595ff06d6f3d3693a3a6fb1a9c2b278f800"}</pre>
</li>
</ul>
</li>
@@ -166,8 +172,8 @@
<ul>
<li>
<pre>{
"basic_permissions": [0, 1, 3],
"human_description": "API Permissions (autotagger): add tags to files, import files, search for files: Can search: only autotag this"
"basic_permissions" : [0, 1, 3],
"human_description" : "API Permissions (autotagger): add tags to files, import files, search for files: Can search: only autotag this"
}</pre>
</li>
</ul>
@@ -190,16 +196,16 @@
<blockquote>path : (the path you want to import)</blockquote>
<li>
<p>Example request body:</p>
<blockquote><pre>{"path": "E:\\to_import\\ayanami.jpg"}</pre></blockquote>
<blockquote><pre>{"path" : "E:\\to_import\\ayanami.jpg"}</pre></blockquote>
</li>
<li><p>Arguments (as bytes): You can alternately just send the file's bytes as the POST body.</p></li>
<li><p>Response description: Some JSON with the import result. Please note that file imports for large files may take several seconds, and longer if the client is busy doing other db work, so make sure your request is willing to wait that long for the response.</p></li>
<li>
<p>Example response:</p>
<pre>{
"status": 1,
"hash": "29a15ad0c035c0a0e86e2591660207db64b10777ced76565a695102a481c3dd1",
"note": ""
"status" : 1,
"hash" : "29a15ad0c035c0a0e86e2591660207db64b10777ced76565a695102a481c3dd1",
"note" : ""
}</pre>
<p>'status' is:</p>
<ul>
@@ -240,7 +246,7 @@
<ul>
<li>
<pre>{
"tags": [ "9", "10", "11", "::)", "bikini", "blue eyes", "character:samus aran", "flower", "wew" ]
"tags" : [ "9", "10", "11", "::)", "bikini", "blue eyes", "character:samus aran", "flower", "wew" ]
}</pre>
</li>
</ul>
@@ -263,8 +269,8 @@
<ul>
<li>
<pre>{
"local_tags": [ "local tags" ]
"tag_repositories": [ "public tag repository", "mlp fanfic tagging server" ]
"local_tags" : [ "local tags" ]
"tag_repositories" : [ "public tag repository", "mlp fanfic tagging server" ]
}</pre>
</li>
</ul>
@@ -364,12 +370,12 @@
<ul>
<li>
<pre>{
"normalised_url": "https://safebooru.org/index.php?id=2753608&page=post&s=view"
"url_file_statuses": [
"normalised_url" : "https://safebooru.org/index.php?id=2753608&page=post&s=view"
"url_file_statuses" : [
{
"status": 2
"hash": "20e9002824e5e7ffc240b91b6e4a6af552b3143993c1778fd523c30d9fdde02c",
"note": "url recognised: Imported at 2015/10/18 10:58:01, which was 3 years 4 months ago (before this check)."
"status" : 2
"hash" : "20e9002824e5e7ffc240b91b6e4a6af552b3143993c1778fd523c30d9fdde02c",
"note" : "url recognised: Imported at 2015/10/18 10:58:01, which was 3 years 4 months ago (before this check)."
}
]
}</pre>
@@ -412,11 +418,11 @@
<ul>
<li>
<pre>{
"normalised_url": "https://8ch.net/tv/res/1846574.html",
"url_type": 4,
"url_type_string": "watchable url",
"match_name": "8chan thread",
"can_parse": true,
"normalised_url" : "https://8ch.net/tv/res/1846574.html",
"url_type" : 4,
"url_type_string" : "watchable url",
"match_name" : "8chan thread",
"can_parse" : true,
}</pre>
</li>
</ul>
@@ -462,9 +468,9 @@
<ul>
<li>
<pre>{
"url": "https://8ch.net/tv/res/1846574.html",
"destination_page_name": "kino zone",
"service_names_to_tags": {
"url" : "https://8ch.net/tv/res/1846574.html",
"destination_page_name" : "kino zone",
"service_names_to_tags" : {
"local tags" : [ "as seen on /tv/" ]
}
}</pre>
@@ -477,8 +483,8 @@
<ul>
<li>
<pre>{
"human_result_text": "\"https://8ch.net/tv/res/1846574.html\" URL added successfully.",
"normalised_url": "https://8ch.net/tv/res/1846574.html"
"human_result_text" : "\"https://8ch.net/tv/res/1846574.html\" URL added successfully.",
"normalised_url" : "https://8ch.net/tv/res/1846574.html"
}</pre>
</li>
</ul>
@@ -513,14 +519,83 @@
<ul>
<li>
<pre>{
"url_to_add": "https://rule34.xxx/index.php?id=2588418&page=post&s=view",
"hash": "3b820114f658d768550e4e3d4f1dced3ff8db77443472b5ad93700647ad2d3ba"
"url_to_add" : "https://rule34.xxx/index.php?id=2588418&page=post&s=view",
"hash" : "3b820114f658d768550e4e3d4f1dced3ff8db77443472b5ad93700647ad2d3ba"
}</pre>
</li>
</ul>
</li>
<li><p>Response description: 200 with no content. Like when adding tags, this is safely idempotent--do not worry about re-adding URL associations that already exist or accidentally trying to delete ones that don't.</p></li>
</ul>
</div><h3>Managing Cookies</h3>
<p>This refers to the cookies held in the client's session manager, which are sent with network requests to different domains.</p>
<div class="apiborder" id="manage_cookies_get_cookies">
<h3><b>GET /manage_cookies/get_cookies</b></h3>
<p><i>Get the cookies for a particular domain.</i></p>
<ul>
<li><p>Restricted access: YES. Manage Cookies permission needed.</p></li>
<li><p>Required Headers: n/a</p></li>
<li>
<p>Arguments: domain</p>
</li>
<li>
<p>Example request (for gelbooru.com):</p>
<ul>
<li><p>/manage_cookies/get_cookies?domain=gelbooru.com</p></li>
</ul>
</li>
<p>Response description: A JSON Object listing all the cookies for that domain in [ name, value, domain, path, expires ] format.</p>
<li>
<p>Example response:</p>
<ul>
<li>
<pre>{
"cookies" : [
[ "__cfduid", "f1bef65041e54e93110a883360bc7e71", ".gelbooru.com", "/", 1596223327 ],
[ "pass_hash", "0b0833b797f108e340b315bc5463c324", "gelbooru.com", "/", 1585855361 ],
[ "user_id", "123456", "gelbooru.com", "/", 1585855361 ]
]
}</pre>
</li>
</ul>
</li>
<p>Note that these variables are all strings except 'expires', which is either an integer timestamp or <i>null</i> for session cookies.</p>
<p>This request will also return any cookies for subdomains. The session system in hydrus generally stores cookies according to the second-level domain, so if you request for specific.someoverbooru.net, you will still get the cookies for someoverbooru.net and all its subdomains.</p>
</ul>
</div>
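As a rough illustration of calling the endpoint above from Python, using only the stdlib (the port is the Client API default, the access key is a placeholder, and the small row-parsing helper is an assumption for the sketch, not part of hydrus):

```python
import json
import urllib.request

API_BASE = 'http://127.0.0.1:45869'  # default Client API address; adjust to your setup
ACCESS_KEY = 'replace with your access key'

def get_cookies( domain ):
    
    # GET /manage_cookies/get_cookies?domain=...
    url = '{}/manage_cookies/get_cookies?domain={}'.format( API_BASE, domain )
    
    request = urllib.request.Request( url, headers = { 'Hydrus-Client-API-Access-Key' : ACCESS_KEY } )
    
    with urllib.request.urlopen( request ) as response:
        
        return json.loads( response.read() )[ 'cookies' ]
        

def cookies_as_dict( cookie_rows ):
    
    # each row is [ name, value, domain, path, expires ]; 'expires' may be null
    return { name : value for ( name, value, domain, path, expires ) in cookie_rows }
```

For example, `cookies_as_dict( get_cookies( 'gelbooru.com' ) )` would give a plain name-to-value mapping for that domain.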
<div class="apiborder" id="manage_cookies_set_cookies">
<h3><b>POST /manage_cookies/set_cookies</b></h3>
<p>Set some new cookies for the client. This makes it easier to 'copy' a login from a web browser or similar to hydrus if hydrus's login system can't handle the site yet.</p>
<ul>
<li><p>Restricted access: YES. Manage Cookies permission needed.</p></li>
<li>
<p>Required Headers:</p>
<ul>
<li>Content-Type : application/json</li>
</ul>
</li>
<li>
<p>Arguments (in JSON):</p>
<ul>
<li>cookies : (a list of cookie rows in the same format as the GET request above)</li>
</ul>
</li>
<li>
<p>Example request body:</p>
<ul>
<li>
<pre>{
"cookies" : [
[ "PHPSESSID", "07669eb2a1a6e840e498bb6e0799f3fb", ".somesite.com", "/", 1627327719 ],
[ "tag_filter", "1", ".somesite.com", "/", 1627327719 ]
]
}</pre>
</li>
</ul>
</li>
<p>You can set 'value' to be null, which will clear any existing cookie with the corresponding name, domain, and path (acting essentially as a delete).</p>
<p>Expires can be null, but session cookies will time-out in hydrus after 60 minutes of non-use.</p>
</ul>
</div>
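And the POST counterpart, in the same hedged spirit (address and key are again placeholders; the body-building helper is an assumption for the sketch):

```python
import json
import urllib.request

API_BASE = 'http://127.0.0.1:45869'  # default Client API address; adjust to your setup
ACCESS_KEY = 'replace with your access key'

def make_set_cookies_body( cookie_rows ):
    
    # rows use the same [ name, value, domain, path, expires ] format as the GET;
    # per the notes above, a null value deletes the matching cookie
    return json.dumps( { 'cookies' : cookie_rows } ).encode( 'utf-8' )
    

def set_cookies( cookie_rows ):
    
    request = urllib.request.Request(
        API_BASE + '/manage_cookies/set_cookies',
        data = make_set_cookies_body( cookie_rows ),
        headers = {
            'Hydrus-Client-API-Access-Key' : ACCESS_KEY,
            'Content-Type' : 'application/json'
        }
    )
    
    # a successful call returns 200 with no content
    with urllib.request.urlopen( request ) as response:
        
        return response.status
```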
<h3>Managing Pages</h3>
<p>This refers to the pages of the main client UI.</p>
@@ -622,7 +697,7 @@
<ul>
<li>
<pre>{
"page_key": "af98318b6eece15fef3cf0378385ce759bfe056916f6e12157cd928eb56c1f18"
"page_key" : "af98318b6eece15fef3cf0378385ce759bfe056916f6e12157cd928eb56c1f18"
}</pre>
</li>
</ul>
@@ -659,7 +734,7 @@
<ul>
<li>
<pre>{
"file_ids": [ 125462, 4852415, 123, 591415 ]
"file_ids" : [ 125462, 4852415, 123, 591415 ]
}</pre>
</li>
</ul>
@@ -718,8 +793,10 @@
"width" : 640,
"height" : 480,
"duration" : null,
"has_audio" : false,
"num_frames" : null,
"num_words" : null,
"known_urls" : [],
"service_names_to_statuses_to_tags" : {}
},
{
@@ -731,7 +808,12 @@
"height" : 1080,
"duration" : 4040,
"num_frames" : 102,
"num_words" : null
"num_words" : null,
"known_urls" : [
"https://gelbooru.com/index.php?page=post&s=view&id=4841557",
"https://img2.gelbooru.com//images/80/c8/80c8646b4a49395fb36c805f316c49a9.jpg",
"http://origin-orig.deviantart.net/ed31/f/2019/210/7/8/beachqueen_samus_by_dandonfuga-ddcu1xg.jpg"
],
"service_names_to_statuses_to_tags" : {
"local tags" : {
"0" : [ "favourites" ]
@@ -11,8 +11,9 @@ CLIENT_API_PERMISSION_ADD_FILES = 1
CLIENT_API_PERMISSION_ADD_TAGS = 2
CLIENT_API_PERMISSION_SEARCH_FILES = 3
CLIENT_API_PERMISSION_MANAGE_PAGES = 4
CLIENT_API_PERMISSION_MANAGE_COOKIES = 5

ALLOWED_PERMISSIONS = ( CLIENT_API_PERMISSION_ADD_FILES, CLIENT_API_PERMISSION_ADD_TAGS, CLIENT_API_PERMISSION_ADD_URLS, CLIENT_API_PERMISSION_SEARCH_FILES, CLIENT_API_PERMISSION_MANAGE_PAGES )
ALLOWED_PERMISSIONS = ( CLIENT_API_PERMISSION_ADD_FILES, CLIENT_API_PERMISSION_ADD_TAGS, CLIENT_API_PERMISSION_ADD_URLS, CLIENT_API_PERMISSION_SEARCH_FILES, CLIENT_API_PERMISSION_MANAGE_PAGES, CLIENT_API_PERMISSION_MANAGE_COOKIES )

basic_permission_to_str_lookup = {}

@@ -21,6 +22,7 @@ basic_permission_to_str_lookup[ CLIENT_API_PERMISSION_ADD_FILES ] = 'import file
basic_permission_to_str_lookup[ CLIENT_API_PERMISSION_ADD_TAGS ] = 'add tags to files'
basic_permission_to_str_lookup[ CLIENT_API_PERMISSION_SEARCH_FILES ] = 'search for files'
basic_permission_to_str_lookup[ CLIENT_API_PERMISSION_MANAGE_PAGES ] = 'manage pages'
basic_permission_to_str_lookup[ CLIENT_API_PERMISSION_MANAGE_COOKIES ] = 'manage cookies'

SEARCH_RESULTS_CACHE_TIMEOUT = 4 * 3600
@@ -211,8 +211,8 @@ class DB( HydrusDB.HydrusDB ):
insert_phrase = 'INSERT OR IGNORE INTO'

# hash_id, size, mime, width, height, duration, num_frames, num_words
self._c.executemany( insert_phrase + ' files_info VALUES ( ?, ?, ?, ?, ?, ?, ?, ? );', rows )
# hash_id, size, mime, width, height, duration, num_frames, has_audio, num_words
self._c.executemany( insert_phrase + ' files_info VALUES ( ?, ?, ?, ?, ?, ?, ?, ?, ? );', rows )

def _AddFiles( self, service_id, rows ):
@@ -1394,7 +1394,7 @@ class DB( HydrusDB.HydrusDB ):

self._c.execute( 'CREATE TABLE file_inbox ( hash_id INTEGER PRIMARY KEY );' )

self._c.execute( 'CREATE TABLE files_info ( hash_id INTEGER PRIMARY KEY, size INTEGER, mime INTEGER, width INTEGER, height INTEGER, duration INTEGER, num_frames INTEGER, num_words INTEGER );' )
self._c.execute( 'CREATE TABLE files_info ( hash_id INTEGER PRIMARY KEY, size INTEGER, mime INTEGER, width INTEGER, height INTEGER, duration INTEGER, num_frames INTEGER, has_audio INTEGER_BOOLEAN, num_words INTEGER );' )
self._CreateIndex( 'files_info', [ 'size' ] )
self._CreateIndex( 'files_info', [ 'mime' ] )
self._CreateIndex( 'files_info', [ 'width' ] )
@@ -3733,11 +3733,11 @@ class DB( HydrusDB.HydrusDB ):

if additional_data is not None:

if job_type == ClientFiles.REGENERATE_FILE_DATA_JOB_COMPLETE:
if job_type == ClientFiles.REGENERATE_FILE_DATA_JOB_FILE_METADATA:

( size, mime, width, height, duration, num_frames, num_words ) = additional_data
( size, mime, width, height, duration, num_frames, has_audio, num_words ) = additional_data

self._c.execute( 'UPDATE files_info SET size = ?, mime = ?, width = ?, height = ?, duration = ?, num_frames = ?, num_words = ? WHERE hash_id = ?;', ( size, mime, width, height, duration, num_frames, num_words, hash_id ) )
self._c.execute( 'UPDATE files_info SET size = ?, mime = ?, width = ?, height = ?, duration = ?, num_frames = ?, has_audio = ?, num_words = ? WHERE hash_id = ?;', ( size, mime, width, height, duration, num_frames, has_audio, num_words, hash_id ) )

self._weakref_media_result_cache.DropMediaResult( hash_id, hash )
@@ -4602,7 +4602,7 @@ class DB( HydrusDB.HydrusDB ):
predicates.append( ClientSearch.Predicate( HC.PREDICATE_TYPE_SYSTEM_NOT_LOCAL, min_current_count = num_not_local ) )

predicates.extend( [ ClientSearch.Predicate( predicate_type ) for predicate_type in [ HC.PREDICATE_TYPE_SYSTEM_UNTAGGED, HC.PREDICATE_TYPE_SYSTEM_NUM_TAGS, HC.PREDICATE_TYPE_SYSTEM_LIMIT, HC.PREDICATE_TYPE_SYSTEM_SIZE, HC.PREDICATE_TYPE_SYSTEM_AGE, HC.PREDICATE_TYPE_SYSTEM_KNOWN_URLS, HC.PREDICATE_TYPE_SYSTEM_HASH, HC.PREDICATE_TYPE_SYSTEM_DIMENSIONS, HC.PREDICATE_TYPE_SYSTEM_DURATION, HC.PREDICATE_TYPE_SYSTEM_NUM_WORDS, HC.PREDICATE_TYPE_SYSTEM_MIME ] ] )
predicates.extend( [ ClientSearch.Predicate( predicate_type ) for predicate_type in [ HC.PREDICATE_TYPE_SYSTEM_UNTAGGED, HC.PREDICATE_TYPE_SYSTEM_NUM_TAGS, HC.PREDICATE_TYPE_SYSTEM_LIMIT, HC.PREDICATE_TYPE_SYSTEM_SIZE, HC.PREDICATE_TYPE_SYSTEM_AGE, HC.PREDICATE_TYPE_SYSTEM_KNOWN_URLS, HC.PREDICATE_TYPE_SYSTEM_HASH, HC.PREDICATE_TYPE_SYSTEM_DIMENSIONS, HC.PREDICATE_TYPE_SYSTEM_DURATION, HC.PREDICATE_TYPE_SYSTEM_HAS_AUDIO, HC.PREDICATE_TYPE_SYSTEM_NUM_WORDS, HC.PREDICATE_TYPE_SYSTEM_MIME ] ] )

if have_ratings:
@@ -5068,6 +5068,13 @@ class DB( HydrusDB.HydrusDB ):

if 'has_audio' in simple_preds:

has_audio = simple_preds[ 'has_audio' ]

files_info_predicates.append( 'has_audio = {}'.format( int( has_audio ) ) )

if file_service_key != CC.COMBINED_FILE_SERVICE_KEY:

if 'min_timestamp' in simple_preds: files_info_predicates.append( 'timestamp >= ' + str( simple_preds[ 'min_timestamp' ] ) )
@@ -5145,42 +5152,6 @@ class DB( HydrusDB.HydrusDB ):

there_are_simple_files_info_preds_to_search_for = len( files_info_predicates ) > 0

#

# This now overrides any other predicates, including file domain

if 'hash' in simple_preds:

query_hash_ids = set()

( search_hash, search_hash_type ) = simple_preds[ 'hash' ]

if search_hash_type != 'sha256':

result = self._GetFileHashes( [ search_hash ], search_hash_type, 'sha256' )

if len( result ) > 0:

( search_hash, ) = result

hash_id = self._GetHashId( search_hash )

query_hash_ids = { hash_id }

else:

if self._HashExists( search_hash ):

hash_id = self._GetHashId( search_hash )

query_hash_ids = { hash_id }

return query_hash_ids

# start with some quick ways to populate query_hash_ids

def update_qhi( query_hash_ids, some_hash_ids, force_create_new_set = False ):
@@ -5262,19 +5233,48 @@ class DB( HydrusDB.HydrusDB ):
done_or_predicates = True

#

if 'hash' in simple_preds:

specific_hash_ids = set()

( search_hashes, search_hash_type ) = simple_preds[ 'hash' ]

if search_hash_type == 'sha256':

matching_sha256_hashes = [ search_hash for search_hash in search_hashes if self._HashExists( search_hash ) ]

else:

matching_sha256_hashes = self._GetFileHashes( search_hashes, search_hash_type, 'sha256' )

specific_hash_ids = self._GetHashIds( matching_sha256_hashes )

query_hash_ids = update_qhi( query_hash_ids, specific_hash_ids )

#

if system_predicates.HasSimilarTo():

( similar_to_hash, max_hamming ) = system_predicates.GetSimilarTo()
( similar_to_hashes, max_hamming ) = system_predicates.GetSimilarTo()

hash_id = self._GetHashId( similar_to_hash )
all_similar_hash_ids = set()

similar_hash_ids_and_distances = self._PHashesSearch( hash_id, max_hamming )
for similar_to_hash in similar_to_hashes:

hash_id = self._GetHashId( similar_to_hash )

similar_hash_ids_and_distances = self._PHashesSearch( hash_id, max_hamming )

similar_hash_ids = [ similar_hash_id for ( similar_hash_id, distance ) in similar_hash_ids_and_distances ]

all_similar_hash_ids.update( similar_hash_ids )

similar_hash_ids = [ similar_hash_id for ( similar_hash_id, distance ) in similar_hash_ids_and_distances ]

query_hash_ids = update_qhi( query_hash_ids, similar_hash_ids )
query_hash_ids = update_qhi( query_hash_ids, all_similar_hash_ids )

for ( operator, value, rating_service_key ) in system_predicates.GetRatingsPredicates():
@@ -6632,7 +6632,7 @@ class DB( HydrusDB.HydrusDB ):

self._PopulateHashIdsToHashesCache( hash_ids )

hash_ids_to_info = { hash_id : ClientMedia.FileInfoManager( hash_id, self._hash_ids_to_hashes_cache[ hash_id ], size, mime, width, height, duration, num_frames, num_words ) for ( hash_id, size, mime, width, height, duration, num_frames, num_words ) in self._SelectFromList( 'SELECT * FROM files_info WHERE hash_id IN {};', hash_ids ) }
hash_ids_to_info = { hash_id : ClientMedia.FileInfoManager( hash_id, self._hash_ids_to_hashes_cache[ hash_id ], size, mime, width, height, duration, num_frames, has_audio, num_words ) for ( hash_id, size, mime, width, height, duration, num_frames, has_audio, num_words ) in self._SelectFromList( 'SELECT * FROM files_info WHERE hash_id IN {};', hash_ids ) }

hash_ids_to_current_file_service_ids_and_timestamps = HydrusData.BuildKeyToListDict( ( ( hash_id, ( service_id, timestamp ) ) for ( hash_id, service_id, timestamp ) in self._SelectFromList( 'SELECT hash_id, service_id, timestamp FROM current_files WHERE hash_id IN {};', hash_ids ) ) )
@@ -7941,7 +7941,7 @@ class DB( HydrusDB.HydrusDB ):
self._c.executemany( 'UPDATE ' + repository_updates_table_name + ' SET processed = ? WHERE hash_id = ?;', ( ( False, hash_id ) for hash_id in definition_hash_ids ) )

self._ScheduleRepositoryUpdateFileMaintenance( service_id, ClientFiles.REGENERATE_FILE_DATA_JOB_FILE_INTEGRITY_DATA )
self._ScheduleRepositoryUpdateFileMaintenance( service_id, ClientFiles.REGENERATE_FILE_DATA_JOB_COMPLETE )
self._ScheduleRepositoryUpdateFileMaintenance( service_id, ClientFiles.REGENERATE_FILE_DATA_JOB_FILE_METADATA )

self._Commit()
@@ -7984,7 +7984,7 @@ class DB( HydrusDB.HydrusDB ):
HydrusData.ShowText( 'File import job adding new file' )

( size, mime, width, height, duration, num_frames, num_words ) = file_import_job.GetFileInfo()
( size, mime, width, height, duration, num_frames, has_audio, num_words ) = file_import_job.GetFileInfo()

timestamp = HydrusData.GetNow()
@@ -8005,7 +8005,7 @@ class DB( HydrusDB.HydrusDB ):
HydrusData.ShowText( 'File import job adding file info row' )

self._AddFilesInfo( [ ( hash_id, size, mime, width, height, duration, num_frames, num_words ) ], overwrite = True )
self._AddFilesInfo( [ ( hash_id, size, mime, width, height, duration, num_frames, has_audio, num_words ) ], overwrite = True )

if HG.file_import_report_mode:
@@ -8014,7 +8014,7 @@ class DB( HydrusDB.HydrusDB ):

self._AddFiles( self._local_file_service_id, [ ( hash_id, timestamp ) ] )

file_info_manager = ClientMedia.FileInfoManager( hash_id, hash, size, mime, width, height, duration, num_frames, num_words )
file_info_manager = ClientMedia.FileInfoManager( hash_id, hash, size, mime, width, height, duration, num_frames, has_audio, num_words )

content_update = HydrusData.ContentUpdate( HC.CONTENT_TYPE_FILES, HC.CONTENT_UPDATE_ADD, ( file_info_manager, timestamp ) )
@@ -8077,13 +8077,14 @@ class DB( HydrusDB.HydrusDB ):
height = None
duration = None
num_frames = None
has_audio = None
num_words = None

client_files_manager = self._controller.client_files_manager

client_files_manager.LocklessAddFileFromBytes( update_hash, mime, update_network_bytes )

self._AddFilesInfo( [ ( hash_id, size, mime, width, height, duration, num_frames, num_words ) ], overwrite = True )
self._AddFilesInfo( [ ( hash_id, size, mime, width, height, duration, num_frames, has_audio, num_words ) ], overwrite = True )

now = HydrusData.GetNow()
@@ -9352,9 +9353,9 @@ class DB( HydrusDB.HydrusDB ):

( file_info_manager, timestamp ) = row

( hash_id, hash, size, mime, width, height, duration, num_frames, num_words ) = file_info_manager.ToTuple()
( hash_id, hash, size, mime, width, height, duration, num_frames, has_audio, num_words ) = file_info_manager.ToTuple()

self._AddFilesInfo( [ ( hash_id, size, mime, width, height, duration, num_frames, num_words ) ] )
self._AddFilesInfo( [ ( hash_id, size, mime, width, height, duration, num_frames, has_audio, num_words ) ] )

elif service_type == HC.IPFS:
@@ -10026,6 +10027,8 @@ class DB( HydrusDB.HydrusDB ):

total_rows = content_update.GetNumRows()

has_audio = None # hack until we figure this out better

rows_processed = 0

for chunk in HydrusData.SplitListIntoChunks( content_update.GetNewFiles(), FILES_CHUNK_SIZE ):
@@ -10039,7 +10042,7 @@ class DB( HydrusDB.HydrusDB ):

hash_id = self._CacheRepositoryNormaliseServiceHashId( service_id, service_hash_id )

files_info_rows.append( ( hash_id, size, mime, width, height, duration, num_frames, num_words ) )
files_info_rows.append( ( hash_id, size, mime, width, height, duration, num_frames, has_audio, num_words ) )

files_rows.append( ( hash_id, timestamp ) )
@@ -10477,7 +10480,7 @@ class DB( HydrusDB.HydrusDB ):

if not isinstance( definition_update, HydrusNetwork.DefinitionsUpdate ):

self._ScheduleRepositoryUpdateFileMaintenance( service_id, ClientFiles.REGENERATE_FILE_DATA_JOB_COMPLETE )
self._ScheduleRepositoryUpdateFileMaintenance( service_id, ClientFiles.REGENERATE_FILE_DATA_JOB_FILE_METADATA )

self._Commit()
@@ -10601,7 +10604,7 @@ class DB( HydrusDB.HydrusDB ):

if not isinstance( content_update, HydrusNetwork.ContentUpdate ):

self._ScheduleRepositoryUpdateFileMaintenance( service_id, ClientFiles.REGENERATE_FILE_DATA_JOB_COMPLETE )
self._ScheduleRepositoryUpdateFileMaintenance( service_id, ClientFiles.REGENERATE_FILE_DATA_JOB_FILE_METADATA )

self._Commit()
@ -13202,6 +13205,85 @@ class DB( HydrusDB.HydrusDB ):
|
|||
self._c.execute( 'DELETE FROM file_transfers WHERE service_id = ?;', ( service_id, ) )
|
||||
|
||||
|
||||
if version == 362:
|
||||
|
||||
# complete job no longer does thumbs
|
||||
|
||||
self._c.execute( 'INSERT OR IGNORE INTO file_maintenance_jobs ( hash_id, job_type, time_can_start ) SELECT hash_id, ?, time_can_start FROM file_maintenance_jobs WHERE job_type = ?;', ( ClientFiles.REGENERATE_FILE_DATA_JOB_FORCE_THUMBNAIL, ClientFiles.REGENERATE_FILE_DATA_JOB_FILE_METADATA ) )
|
||||
|
||||
#
|
||||
|
||||
one_row = self._c.execute( 'SELECT * FROM files_info;' ).fetchone()
|
||||
|
||||
if one_row is None or len( one_row ) == 8: # doesn't have has_audio yet
|
||||
|
||||
self._controller.pub( 'splash_set_status_subtext', 'adding \'has audio\' metadata column' )
|
||||
|
||||
existing_files_info = self._c.execute( 'SELECT * FROM files_info;' ).fetchall()
|
||||
|
||||
self._c.execute( 'DROP TABLE files_info;' )
|
||||
|
||||
self._c.execute( 'CREATE TABLE files_info ( hash_id INTEGER PRIMARY KEY, size INTEGER, mime INTEGER, width INTEGER, height INTEGER, duration INTEGER, num_frames INTEGER, has_audio INTEGER_BOOLEAN, num_words INTEGER );' )
|
||||
|
||||
insert_iterator = ( ( hash_id, size, mime, width, height, duration, num_frames, mime in HC.MIMES_THAT_DEFINITELY_HAVE_AUDIO, num_words ) for ( hash_id, size, mime, width, height, duration, num_frames, num_words ) in existing_files_info )
|
||||
|
||||
self._c.executemany( 'INSERT INTO files_info VALUES ( ?, ?, ?, ?, ?, ?, ?, ?, ? );', insert_iterator )
|
||||
|
||||
self._CreateIndex( 'files_info', [ 'size' ] )
|
||||
self._CreateIndex( 'files_info', [ 'mime' ] )
|
||||
self._CreateIndex( 'files_info', [ 'width' ] )
|
||||
self._CreateIndex( 'files_info', [ 'height' ] )
|
||||
self._CreateIndex( 'files_info', [ 'duration' ] )
|
||||
self._CreateIndex( 'files_info', [ 'num_frames' ] )
|
||||
|
||||
self._c.execute( 'ANALYZE files_info;' )
|
||||
|
||||
try:
|
||||
|
||||
service_id = self._GetServiceId( CC.COMBINED_LOCAL_FILE_SERVICE_KEY )
|
||||
|
||||
self._c.execute( 'INSERT OR IGNORE INTO file_maintenance_jobs ( hash_id, job_type, time_can_start ) SELECT hash_id, ?, ? FROM files_info NATURAL JOIN current_files WHERE service_id = ? AND mime IN ' + HydrusData.SplayListForDB( HC.MIMES_THAT_MAY_HAVE_AUDIO ) + ';', ( ClientFiles.REGENERATE_FILE_DATA_JOB_FILE_METADATA, 0, service_id ) )
|
||||
|
||||
except Exception as e:
|
||||
|
||||
HydrusData.PrintException( e )
|
||||
|
||||
message = 'Trying to schedule audio detection on videos failed! Please let hydrus dev know!'
|
||||
|
||||
self.pub_initial_message( message )
|
||||
|
||||
|
||||
|
||||
#
|
||||
|
||||
try:
|
||||
|
||||
domain_manager = self._GetJSONDump( HydrusSerialisable.SERIALISABLE_TYPE_NETWORK_DOMAIN_MANAGER )
|
||||
|
||||
domain_manager.Initialise()
|
||||
|
||||
#
|
||||
|
||||
domain_manager.OverwriteDefaultParsers( ( 'deviant art file page parser', ) )
|
||||
|
||||
#
|
||||
|
||||
domain_manager.TryToLinkURLClassesAndParsers()
|
||||
|
||||
#
|
||||
|
||||
self._SetJSONDump( domain_manager )
|
||||
|
||||
except Exception as e:
|
||||
|
||||
HydrusData.PrintException( e )
|
||||
|
||||
message = 'Trying to update some url classes and parsers failed! Please let hydrus dev know!'
|
||||
|
||||
self.pub_initial_message( message )
|
||||
|
||||
|
||||
|
||||
self._controller.pub( 'splash_set_title_text', 'updated db to v' + str( version + 1 ) )
|
||||
|
||||
self._c.execute( 'UPDATE version SET version = ?;', ( version + 1, ) )
|
||||
|
|
|
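The v362 update above uses the standard SQLite pattern for adding a column mid-table: SQLite's `ALTER TABLE ... ADD COLUMN` can only append at the end, so to place `has_audio` before `num_words` the table is read out, dropped, recreated, and backfilled from the mime type. A minimal, hypothetical sketch of that pattern (the two-column schema and mime constants here are simplified stand-ins, not the real hydrus schema):

```python
import sqlite3

# simplified stand-in for HC.MIMES_THAT_DEFINITELY_HAVE_AUDIO
MIMES_THAT_DEFINITELY_HAVE_AUDIO = { 'audio/mp3', 'video/flash' }

def add_has_audio_column( c ):
    
    # read everything out, rebuild the table with the new column in place,
    # and backfill has_audio from the mime type
    existing = c.execute( 'SELECT * FROM files_info;' ).fetchall()
    
    c.execute( 'DROP TABLE files_info;' )
    
    c.execute( 'CREATE TABLE files_info ( hash_id INTEGER PRIMARY KEY, mime TEXT, has_audio INTEGER_BOOLEAN );' )
    
    insert_iterator = ( ( hash_id, mime, mime in MIMES_THAT_DEFINITELY_HAVE_AUDIO ) for ( hash_id, mime ) in existing )
    
    c.executemany( 'INSERT INTO files_info VALUES ( ?, ?, ? );', insert_iterator )
    

db = sqlite3.connect( ':memory:' )
c = db.cursor()

c.execute( 'CREATE TABLE files_info ( hash_id INTEGER PRIMARY KEY, mime TEXT );' )
c.executemany( 'INSERT INTO files_info VALUES ( ?, ? );', [ ( 1, 'audio/mp3' ), ( 2, 'image/png' ) ] )

add_has_audio_column( c )

print( c.execute( 'SELECT hash_id, has_audio FROM files_info ORDER BY hash_id;' ).fetchall() )
# → [(1, 1), (2, 0)]
```

The real migration additionally recreates the indices and runs `ANALYZE`, since dropping the table discards both.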
@@ -19,7 +19,7 @@ import threading
 import time
 import wx
-REGENERATE_FILE_DATA_JOB_COMPLETE = 0
+REGENERATE_FILE_DATA_JOB_FILE_METADATA = 0
 REGENERATE_FILE_DATA_JOB_FORCE_THUMBNAIL = 1
 REGENERATE_FILE_DATA_JOB_REFIT_THUMBNAIL = 2
 REGENERATE_FILE_DATA_JOB_OTHER_HASHES = 3
@@ -32,7 +32,7 @@ REGENERATE_FILE_DATA_JOB_SIMILAR_FILES_METADATA = 9
 regen_file_enum_to_str_lookup = {}
-regen_file_enum_to_str_lookup[ REGENERATE_FILE_DATA_JOB_COMPLETE ] = 'complete reparse and thumbnail regen'
+regen_file_enum_to_str_lookup[ REGENERATE_FILE_DATA_JOB_FILE_METADATA ] = 'regenerate file metadata'
 regen_file_enum_to_str_lookup[ REGENERATE_FILE_DATA_JOB_FORCE_THUMBNAIL ] = 'regenerate thumbnail'
 regen_file_enum_to_str_lookup[ REGENERATE_FILE_DATA_JOB_REFIT_THUMBNAIL ] = 'regenerate thumbnail if incorrect size'
 regen_file_enum_to_str_lookup[ REGENERATE_FILE_DATA_JOB_OTHER_HASHES ] = 'regenerate non-standard hashes'
@@ -43,22 +43,35 @@ regen_file_enum_to_str_lookup[ REGENERATE_FILE_DATA_JOB_FIX_PERMISSIONS ] = 'fix
 regen_file_enum_to_str_lookup[ REGENERATE_FILE_DATA_JOB_CHECK_SIMILAR_FILES_MEMBERSHIP ] = 'check for membership in the similar files search system'
 regen_file_enum_to_str_lookup[ REGENERATE_FILE_DATA_JOB_SIMILAR_FILES_METADATA ] = 'regenerate similar files metadata'
+regen_file_enum_to_description_lookup = {}
+regen_file_enum_to_description_lookup[ REGENERATE_FILE_DATA_JOB_FILE_METADATA ] = 'This regenerates file metadata like resolution and duration, or even filetype (such as mkv->webm), which may have been misparsed in a previous version.'
+regen_file_enum_to_description_lookup[ REGENERATE_FILE_DATA_JOB_FORCE_THUMBNAIL ] = 'This forces a complete regeneration of the thumbnail from the source file.'
+regen_file_enum_to_description_lookup[ REGENERATE_FILE_DATA_JOB_REFIT_THUMBNAIL ] = 'This looks for the existing thumbnail, and if it is not the correct resolution or is missing, will regenerate a new one for the source file.'
+regen_file_enum_to_description_lookup[ REGENERATE_FILE_DATA_JOB_OTHER_HASHES ] = 'This regenerates hydrus\'s store of md5, sha1, and sha512 supplementary hashes, which it can use for various external (usually website) lookups.'
+regen_file_enum_to_description_lookup[ REGENERATE_FILE_DATA_JOB_DELETE_NEIGHBOUR_DUPES ] = 'Sometimes, a file metadata regeneration will mean a new filetype and thus a new file extension. If the existing, incorrectly named file is in use, it must be copied rather than renamed, and so there is a spare duplicate left over after the operation. This jobs cleans up the duplicate at a later time.'
+regen_file_enum_to_description_lookup[ REGENERATE_FILE_DATA_JOB_FILE_INTEGRITY_PRESENCE ] = 'This checks to see if the file is present in the file system as expected. If it is not, the internal file record in the database is removed, just as if the file were deleted. Use this if you have manually deleted or otherwise lost a number of files from your file structure and need hydrus to re-sync with what it has. Missing files will have their known URLs exported to your database directory if you wish to attempt to re-download them.'
+regen_file_enum_to_description_lookup[ REGENERATE_FILE_DATA_JOB_FILE_INTEGRITY_DATA ] = 'This does the same check as the \'present\' job, and if the file is where it is expected, it ensures its file content, byte-for-byte, is correct. This is a heavy job, so be wary. Files that are incorrect will be exported to your database directory along with their known URLs.'
+regen_file_enum_to_description_lookup[ REGENERATE_FILE_DATA_JOB_FIX_PERMISSIONS ] = 'This ensures that files in the file system are readable and writeable. For Linux/OS X users, it specifically sets 644. If you wish to run this job on Linux/OS X, ensure you are first the file owner of all your files.'
+regen_file_enum_to_description_lookup[ REGENERATE_FILE_DATA_JOB_CHECK_SIMILAR_FILES_MEMBERSHIP ] = 'This checks to see if files should be in the similar files system, and if they are falsely in or falsely out, it will remove their record or queue them up for a search as appropriate. It is useful to repair database damage.'
+regen_file_enum_to_description_lookup[ REGENERATE_FILE_DATA_JOB_SIMILAR_FILES_METADATA ] = 'This forces a regeneration of the file\'s similar-files \'phashes\'. It is not useful unless you know there is missing data to repair.'
 regen_file_enum_to_ideal_job_size_lookup = {}
-regen_file_enum_to_ideal_job_size_lookup[ REGENERATE_FILE_DATA_JOB_COMPLETE ] = 100
+regen_file_enum_to_ideal_job_size_lookup[ REGENERATE_FILE_DATA_JOB_FILE_METADATA ] = 250
 regen_file_enum_to_ideal_job_size_lookup[ REGENERATE_FILE_DATA_JOB_FORCE_THUMBNAIL ] = 250
 regen_file_enum_to_ideal_job_size_lookup[ REGENERATE_FILE_DATA_JOB_REFIT_THUMBNAIL ] = 1000
 regen_file_enum_to_ideal_job_size_lookup[ REGENERATE_FILE_DATA_JOB_OTHER_HASHES ] = 25
 regen_file_enum_to_ideal_job_size_lookup[ REGENERATE_FILE_DATA_JOB_DELETE_NEIGHBOUR_DUPES ] = 100
 regen_file_enum_to_ideal_job_size_lookup[ REGENERATE_FILE_DATA_JOB_FILE_INTEGRITY_PRESENCE ] = 10000
 regen_file_enum_to_ideal_job_size_lookup[ REGENERATE_FILE_DATA_JOB_FILE_INTEGRITY_DATA ] = 100
-regen_file_enum_to_ideal_job_size_lookup[ REGENERATE_FILE_DATA_JOB_FIX_PERMISSIONS ] = 250
-regen_file_enum_to_ideal_job_size_lookup[ REGENERATE_FILE_DATA_JOB_CHECK_SIMILAR_FILES_MEMBERSHIP ] = 100
+regen_file_enum_to_ideal_job_size_lookup[ REGENERATE_FILE_DATA_JOB_FIX_PERMISSIONS ] = 500
+regen_file_enum_to_ideal_job_size_lookup[ REGENERATE_FILE_DATA_JOB_CHECK_SIMILAR_FILES_MEMBERSHIP ] = 1000
 regen_file_enum_to_ideal_job_size_lookup[ REGENERATE_FILE_DATA_JOB_SIMILAR_FILES_METADATA ] = 100
 regen_file_enum_to_overruled_jobs = {}
-regen_file_enum_to_overruled_jobs[ REGENERATE_FILE_DATA_JOB_COMPLETE ] = [ REGENERATE_FILE_DATA_JOB_FORCE_THUMBNAIL, REGENERATE_FILE_DATA_JOB_REFIT_THUMBNAIL ]
+regen_file_enum_to_overruled_jobs[ REGENERATE_FILE_DATA_JOB_FILE_METADATA ] = []
 regen_file_enum_to_overruled_jobs[ REGENERATE_FILE_DATA_JOB_FORCE_THUMBNAIL ] = [ REGENERATE_FILE_DATA_JOB_REFIT_THUMBNAIL ]
 regen_file_enum_to_overruled_jobs[ REGENERATE_FILE_DATA_JOB_REFIT_THUMBNAIL ] = []
 regen_file_enum_to_overruled_jobs[ REGENERATE_FILE_DATA_JOB_OTHER_HASHES ] = []
@@ -69,7 +82,7 @@ regen_file_enum_to_overruled_jobs[ REGENERATE_FILE_DATA_JOB_FIX_PERMISSIONS ] =
 regen_file_enum_to_overruled_jobs[ REGENERATE_FILE_DATA_JOB_CHECK_SIMILAR_FILES_MEMBERSHIP ] = []
 regen_file_enum_to_overruled_jobs[ REGENERATE_FILE_DATA_JOB_SIMILAR_FILES_METADATA ] = [ REGENERATE_FILE_DATA_JOB_CHECK_SIMILAR_FILES_MEMBERSHIP ]
-ALL_REGEN_JOBS_IN_PREFERRED_ORDER = [ REGENERATE_FILE_DATA_JOB_FILE_INTEGRITY_PRESENCE, REGENERATE_FILE_DATA_JOB_FILE_INTEGRITY_DATA, REGENERATE_FILE_DATA_JOB_REFIT_THUMBNAIL, REGENERATE_FILE_DATA_JOB_FORCE_THUMBNAIL, REGENERATE_FILE_DATA_JOB_COMPLETE, REGENERATE_FILE_DATA_JOB_SIMILAR_FILES_METADATA, REGENERATE_FILE_DATA_JOB_CHECK_SIMILAR_FILES_MEMBERSHIP, REGENERATE_FILE_DATA_JOB_FIX_PERMISSIONS, REGENERATE_FILE_DATA_JOB_OTHER_HASHES, REGENERATE_FILE_DATA_JOB_DELETE_NEIGHBOUR_DUPES ]
+ALL_REGEN_JOBS_IN_PREFERRED_ORDER = [ REGENERATE_FILE_DATA_JOB_FILE_INTEGRITY_PRESENCE, REGENERATE_FILE_DATA_JOB_FILE_INTEGRITY_DATA, REGENERATE_FILE_DATA_JOB_FILE_METADATA, REGENERATE_FILE_DATA_JOB_REFIT_THUMBNAIL, REGENERATE_FILE_DATA_JOB_FORCE_THUMBNAIL, REGENERATE_FILE_DATA_JOB_SIMILAR_FILES_METADATA, REGENERATE_FILE_DATA_JOB_CHECK_SIMILAR_FILES_MEMBERSHIP, REGENERATE_FILE_DATA_JOB_FIX_PERMISSIONS, REGENERATE_FILE_DATA_JOB_OTHER_HASHES, REGENERATE_FILE_DATA_JOB_DELETE_NEIGHBOUR_DUPES ]
 def GetAllFilePaths( raw_paths, do_human_sort = True ):
@@ -1454,7 +1467,7 @@ class FilesMaintenanceManager( object ):
-def _RegenFileData( self, media_result ):
+def _RegenFileMetadata( self, media_result ):
 hash = media_result.GetHash()
 original_mime = media_result.GetMime()
@@ -1463,9 +1476,9 @@ class FilesMaintenanceManager( object ):
 path = self._controller.client_files_manager.GetFilePath( hash, original_mime )
-( size, mime, width, height, duration, num_frames, num_words ) = HydrusFileHandling.GetFileInfo( path, ok_to_look_for_hydrus_updates = True )
+( size, mime, width, height, duration, num_frames, has_audio, num_words ) = HydrusFileHandling.GetFileInfo( path, ok_to_look_for_hydrus_updates = True )
-additional_data = ( size, mime, width, height, duration, num_frames, num_words )
+additional_data = ( size, mime, width, height, duration, num_frames, has_audio, num_words )
 if mime != original_mime:
@@ -1477,11 +1490,6 @@ class FilesMaintenanceManager( object ):
-if mime in HC.MIMES_WITH_THUMBNAILS:
-    self._RegenFileThumbnailForce( media_result )
 return additional_data
 except HydrusExceptions.MimeException:
@@ -1603,6 +1611,8 @@ class FilesMaintenanceManager( object ):
 num_bad_files = 0
 num_thumb_refits = 0
+next_gc_collect = HydrusData.GetNow() + 10
 try:
 cleared_jobs = []
@@ -1637,9 +1647,9 @@ class FilesMaintenanceManager( object ):
 try:
-if job_type == REGENERATE_FILE_DATA_JOB_COMPLETE:
+if job_type == REGENERATE_FILE_DATA_JOB_FILE_METADATA:
-    additional_data = self._RegenFileData( media_result )
+    additional_data = self._RegenFileMetadata( media_result )
 elif job_type == REGENERATE_FILE_DATA_JOB_OTHER_HASHES:
@@ -1705,8 +1715,17 @@ class FilesMaintenanceManager( object ):
 cleared_jobs.append( ( hash, job_type, additional_data ) )
+if HydrusData.TimeHasPassed( next_gc_collect ):
+    
+    gc.collect()
+    
+    next_gc_collect = HydrusData.GetNow() + 10
+    
 if len( cleared_jobs ) > 100:
     gc.collect()
     self._controller.WriteSynchronous( 'file_maintenance_clear_jobs', cleared_jobs )
     cleared_jobs = []
@@ -1717,6 +1736,8 @@ class FilesMaintenanceManager( object ):
 if len( cleared_jobs ) > 0:
+    gc.collect()
     self._controller.Write( 'file_maintenance_clear_jobs', cleared_jobs )
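The maintenance loop above switches garbage collection from "every batch" to a deadline-based throttle: collect only once `next_gc_collect` has passed, then push the deadline forward. A generic sketch of that throttle, with `HydrusData.GetNow`/`TimeHasPassed` approximated by `time.time()` (class and method names here are illustrative, not hydrus's):

```python
import gc
import time

class GCThrottle:
    
    def __init__( self, period = 10 ):
        
        self._period = period
        
        # mirrors next_gc_collect = HydrusData.GetNow() + 10 in the loop above
        self._next_gc_collect = time.time() + period
        
    
    def MaybeCollect( self ):
        
        # collect only if the deadline has passed; returns True if a collection ran
        if time.time() >= self._next_gc_collect:
            
            gc.collect()
            
            self._next_gc_collect = time.time() + self._period
            
            return True
            
        
        return False
        

throttle = GCThrottle( period = 10 )

# immediately after construction the deadline has not passed, so nothing runs
print( throttle.MaybeCollect() )
# → False
```

The design point is that `gc.collect()` is expensive, so a long-running worker ties its cost to wall-clock time rather than to how fast jobs clear.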
@@ -1215,9 +1215,12 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
 for page in pages:
-    page.CleanBeforeDestroy()
-    page.DestroyLater()
+    if page:
+        
+        page.CleanBeforeDestroy()
+        
+        page.DestroyLater()
+        
@@ -2562,8 +2565,10 @@ class FrameGUI( ClientGUITopLevelWindows.FrameThatResizes ):
 message = 'It looks like the last instance of the client did not shut down cleanly.'
 message += os.linesep * 2
 message += 'Would you like to try loading your default session "' + default_gui_session + '", or just a blank page?'
+message += os.linesep * 2
+message += 'This will auto-choose to open your default session in 15 seconds.'
-result = ClientGUIDialogsQuick.GetYesNo( self, message, title = 'Previous shutdown was bad', yes_label = 'try to load "' + default_gui_session + '"', no_label = 'just load a blank page' )
+result = ClientGUIDialogsQuick.GetYesNo( self, message, title = 'Previous shutdown was bad', yes_label = 'try to load "' + default_gui_session + '"', no_label = 'just load a blank page', auto_yes_time = 15 )
 if result == wx.ID_NO:
@@ -4444,6 +4449,11 @@ The password is cleartext here but obscured in the entry dialog. Enter a blank p
 def TIMEREventAnimationUpdate( self, event ):
+    if self.IsIconized():
+        
+        return
+        
 try:
 windows = list( self._animation_update_windows )
@@ -4520,21 +4530,11 @@ The password is cleartext here but obscured in the entry dialog. Enter a blank p
 text += able_to_close_statement
-with ClientGUIDialogs.DialogYesNo( self, text ) as dlg:
-    job = self._controller.CallLaterWXSafe( dlg, 15, dlg.EndModal, wx.ID_YES )
-    try:
-        if dlg.ShowModal() == wx.ID_NO:
-            return False
-    finally:
-        job.Cancel()
+result = ClientGUIDialogsQuick.GetYesNo( self, text, auto_yes_time = 15 )
+if result == wx.ID_NO:
+    return False
@@ -4634,7 +4634,7 @@ The password is cleartext here but obscured in the entry dialog. Enter a blank p
 ( predicate_type, value, inclusive ) = predicate.GetInfo()
-if value is None and predicate_type in [ HC.PREDICATE_TYPE_SYSTEM_NUM_TAGS, HC.PREDICATE_TYPE_SYSTEM_LIMIT, HC.PREDICATE_TYPE_SYSTEM_SIZE, HC.PREDICATE_TYPE_SYSTEM_DIMENSIONS, HC.PREDICATE_TYPE_SYSTEM_AGE, HC.PREDICATE_TYPE_SYSTEM_KNOWN_URLS, HC.PREDICATE_TYPE_SYSTEM_HASH, HC.PREDICATE_TYPE_SYSTEM_DURATION, HC.PREDICATE_TYPE_SYSTEM_NUM_WORDS, HC.PREDICATE_TYPE_SYSTEM_MIME, HC.PREDICATE_TYPE_SYSTEM_RATING, HC.PREDICATE_TYPE_SYSTEM_SIMILAR_TO, HC.PREDICATE_TYPE_SYSTEM_FILE_SERVICE, HC.PREDICATE_TYPE_SYSTEM_TAG_AS_NUMBER, HC.PREDICATE_TYPE_SYSTEM_FILE_RELATIONSHIPS, HC.PREDICATE_TYPE_SYSTEM_FILE_VIEWING_STATS ]:
+if value is None and predicate_type in [ HC.PREDICATE_TYPE_SYSTEM_NUM_TAGS, HC.PREDICATE_TYPE_SYSTEM_LIMIT, HC.PREDICATE_TYPE_SYSTEM_SIZE, HC.PREDICATE_TYPE_SYSTEM_DIMENSIONS, HC.PREDICATE_TYPE_SYSTEM_AGE, HC.PREDICATE_TYPE_SYSTEM_KNOWN_URLS, HC.PREDICATE_TYPE_SYSTEM_HASH, HC.PREDICATE_TYPE_SYSTEM_DURATION, HC.PREDICATE_TYPE_SYSTEM_HAS_AUDIO, HC.PREDICATE_TYPE_SYSTEM_NUM_WORDS, HC.PREDICATE_TYPE_SYSTEM_MIME, HC.PREDICATE_TYPE_SYSTEM_RATING, HC.PREDICATE_TYPE_SYSTEM_SIMILAR_TO, HC.PREDICATE_TYPE_SYSTEM_FILE_SERVICE, HC.PREDICATE_TYPE_SYSTEM_TAG_AS_NUMBER, HC.PREDICATE_TYPE_SYSTEM_FILE_RELATIONSHIPS, HC.PREDICATE_TYPE_SYSTEM_FILE_VIEWING_STATS ]:
 with ClientGUITopLevelWindows.DialogEdit( self, 'input predicate', hide_buttons = True ) as dlg:
@@ -5118,6 +5118,11 @@ The password is cleartext here but obscured in the entry dialog. Enter a blank p
 def REPEATINGBandwidth( self ):
+    if self.IsIconized():
+        
+        return
+        
 global_tracker = self._controller.network_engine.bandwidth_manager.GetTracker( ClientNetworkingContexts.GLOBAL_NETWORK_CONTEXT )
 boot_time = self._controller.GetBootTime()
@@ -5202,6 +5207,11 @@ The password is cleartext here but obscured in the entry dialog. Enter a blank p
 def REPEATINGPageUpdate( self ):
+    if self.IsIconized():
+        
+        return
+        
 page = self.GetCurrentPage()
 if page is not None:
@@ -5221,6 +5231,11 @@ The password is cleartext here but obscured in the entry dialog. Enter a blank p
 def REPEATINGUIUpdate( self ):
+    if self.IsIconized():
+        
+        return
+        
 for window in list( self._ui_update_windows ):
 if not window:
@@ -607,6 +607,8 @@ class AutoCompleteDropdown( wx.Panel ):
 self._current_fetch_job_key.Cancel()
+self._current_fetch_job_key = None
 def _CancelScheduledListRefresh( self ):
@@ -1157,8 +1159,6 @@ class AutoCompleteDropdown( wx.Panel ):
 self._search_text_for_current_cache = search_text_for_cache
 self._cached_results = cached_results
-self._current_fetch_job_key = None
 self._initial_matches_fetched = True
 self._SetResultsToList( results )
@@ -1639,11 +1639,16 @@ class AutoCompleteDropdownTagsRead( AutoCompleteDropdownTags ):
 ( raw_entry, inclusive, wildcard_text, search_text, explicit_wildcard, cache_text, entry_predicate ) = self._ParseSearchText()
-sitting_on_empty = raw_entry == ''
+something_to_broadcast = cache_text != ''
+current_page = self._dropdown_notebook.GetCurrentPage()
+# looking at empty or system results
+nothing_to_select = current_page == self._search_results_list and ( self._last_fetched_search_text == '' or not self._search_results_list.HasValues() )
 # when the user has quickly typed something in and the results are not yet in
-p1 = not sitting_on_empty and self._last_fetched_search_text != search_text
+p1 = something_to_broadcast and nothing_to_select
 return p1
@@ -1943,15 +1948,20 @@ class AutoCompleteDropdownTagsWrite( AutoCompleteDropdownTags ):
 ( raw_entry, search_text, cache_text, entry_predicate, sibling_predicate ) = self._ParseSearchText()
+current_page = self._dropdown_notebook.GetCurrentPage()
 sitting_on_empty = raw_entry == ''
+something_to_broadcast = not sitting_on_empty
+nothing_to_select = isinstance( current_page, ClientGUIListBoxes.ListBox ) and not current_page.HasValues()
 # when the user has quickly typed something in and the results are not yet in
-p1 = not sitting_on_empty and self._last_fetched_search_text != search_text
+p1 = something_to_broadcast and nothing_to_select
 # when the text ctrl is empty, we are looking at search results, and we want to push a None to the parent dialog
-p2 = sitting_on_empty and self._dropdown_notebook.GetCurrentPage() == self._search_results_list
+p2 = sitting_on_empty and current_page == self._search_results_list
 return p1 or p2
@@ -467,7 +467,7 @@ class EditStringConverterPanel( ClientGUIScrolledPanels.EditPanel ):
 text = 'Delete all selected?'
-import ClientGUIDialogsQuick
+from . import ClientGUIDialogsQuick
 result = ClientGUIDialogsQuick.GetYesNo( self, text )
@@ -1193,6 +1193,11 @@ class NetworkJobControl( wx.Panel ):
 if self._network_job is not None:
+    if self._network_job.CurrentlyWaitingOnConnectionError():
+        
+        ClientGUIMenus.AppendMenuItem( self, menu, 'reattempt connection now', 'Stop waiting on a connection error and reattempt the job now.', self._network_job.OverrideConnectionErrorWait )
+        
 if self._network_job.ObeysBandwidth():
     ClientGUIMenus.AppendMenuItem( self, menu, 'override bandwidth rules for this job', 'Tell the current job to ignore existing bandwidth rules and go ahead anyway.', self._network_job.OverrideBandwidth )
@@ -2,6 +2,7 @@ from . import ClientGUIScrolledPanelsButtonQuestions
 from . import ClientGUIScrolledPanelsEdit
 from . import ClientGUITopLevelWindows
 from . import HydrusExceptions
+from . import HydrusGlobals as HG
 import wx
 def GetDeleteFilesJobs( win, media, default_reason, suggested_file_service_key = None ):
@@ -55,7 +56,7 @@ def GetInterstitialFilteringAnswer( win, label ):
 return dlg.ShowModal()
-def GetYesNo( win, message, title = 'Are you sure?', yes_label = 'yes', no_label = 'no' ):
+def GetYesNo( win, message, title = 'Are you sure?', yes_label = 'yes', no_label = 'no', auto_yes_time = None ):
 with ClientGUITopLevelWindows.DialogCustomButtonQuestion( win, title ) as dlg:
@@ -63,7 +64,23 @@ def GetYesNo( win, message, title = 'Are you sure?', yes_label = 'yes', no_label
 dlg.SetPanel( panel )
-return dlg.ShowModal()
+if auto_yes_time is None:
+    
+    return dlg.ShowModal()
+    
+else:
+    
+    job = HG.client_controller.CallLaterWXSafe( dlg, auto_yes_time, dlg.EndModal, wx.ID_YES )
+    
+    try:
+        
+        return dlg.ShowModal()
+        
+    finally:
+        
+        job.Cancel()
+        
 def SelectFromList( win, title, choice_tuples, value_to_select = None, sort_tuples = True ):
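The `auto_yes_time` change above is a general "default answer after a timeout" pattern: schedule a job that forces the default, run the modal wait, and always cancel the job in a `finally` so a fast manual answer does not leave a stray callback behind. A headless sketch of the same shape, with a stand-in dialog object and `threading.Timer` in place of `CallLaterWXSafe` (all names here are illustrative, not wx or hydrus APIs):

```python
import threading

YES, NO = 'yes', 'no'

class FakeDialog:
    """Stand-in for a modal dialog: ShowModal blocks until EndModal is called."""
    
    def __init__( self ):
        
        self._answered = threading.Event()
        self._answer = None
        
    
    def EndModal( self, answer ):
        
        self._answer = answer
        self._answered.set()
        
    
    def ShowModal( self ):
        
        self._answered.wait()
        
        return self._answer
        

def get_yes_no( dlg, auto_yes_time = None ):
    
    if auto_yes_time is None:
        
        return dlg.ShowModal()
        
    else:
        
        # schedule the default answer, mirroring CallLaterWXSafe( dlg, auto_yes_time, dlg.EndModal, wx.ID_YES )
        job = threading.Timer( auto_yes_time, dlg.EndModal, ( YES, ) )
        
        job.start()
        
        try:
            
            return dlg.ShowModal()
            
        finally:
            
            # always cancel, so an early manual answer kills the pending timer
            job.cancel()
            

dlg = FakeDialog()

# nobody answers, so after 0.05s the timer answers 'yes' for us
print( get_yes_no( dlg, auto_yes_time = 0.05 ) )
# → yes
```

The `finally: job.Cancel()` is the load-bearing part: without it, a dialog answered in one second would still have a fifteen-second callback pending against a destroyed window.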
@@ -1520,6 +1520,11 @@ class ListBox( wx.ScrolledWindow ):
 return self._text_y * len( self._ordered_terms ) + 20
+def HasValues( self ):
+    
+    return len( self._ordered_terms ) > 0
+    
 def MoveSelectionDown( self ):
 if len( self._ordered_terms ) > 1 and self._last_hit_index is not None:
@@ -1630,7 +1630,7 @@ class EditLoginScriptPanel( ClientGUIScrolledPanels.EditPanel ):
 bandwidth_manager = ClientNetworkingBandwidth.NetworkBandwidthManager()
 session_manager = ClientNetworkingSessions.NetworkSessionManager()
-domain_manager = ClientNetworkingDomain.NetworkDomainManager()
+domain_manager = HG.client_controller.network_engine.domain_manager.Duplicate() # keep custom headers from current domain stuff
 login_manager = ClientNetworkingLogin.NetworkLoginManager()
 network_engine = ClientNetworking.NetworkEngine( HG.client_controller, bandwidth_manager, session_manager, domain_manager, login_manager )
@@ -1140,11 +1140,21 @@ class MediaPanel( ClientMedia.ListeningMediaList, wx.ScrolledCanvas ):
 def _GetSimilarTo( self, max_hamming ):
-    if self._focused_media is not None:
-        hash = self._focused_media.GetDisplayMedia().GetHash()
-        initial_predicates = [ ClientSearch.Predicate( HC.PREDICATE_TYPE_SYSTEM_SIMILAR_TO, ( hash, max_hamming ) ) ]
+    hashes = set()
+    
+    media = self._GetSelectedFlatMedia()
+    
+    for m in media:
+        
+        if m.GetMime() in HC.MIMES_WE_CAN_PHASH:
+            
+            hashes.add( m.GetHash() )
+            
+        
+    if len( hashes ) > 0:
+        
+        initial_predicates = [ ClientSearch.Predicate( HC.PREDICATE_TYPE_SYSTEM_SIMILAR_TO, ( tuple( hashes ), max_hamming ) ) ]
         HG.client_controller.pub( 'new_page_query', CC.LOCAL_FILE_SERVICE_KEY, initial_predicates = initial_predicates )
@@ -1632,9 +1642,9 @@ class MediaPanel( ClientMedia.ListeningMediaList, wx.ScrolledCanvas ):
 if num_files > 0:
-    if job_type == ClientFiles.REGENERATE_FILE_DATA_JOB_COMPLETE:
+    if job_type == ClientFiles.REGENERATE_FILE_DATA_JOB_FILE_METADATA:
-        text = 'This will reparse the {} selected files\' metadata and regenerate their thumbnails.'.format( HydrusData.ToHumanInt( num_files ) )
+        text = 'This will reparse the {} selected files\' metadata.'.format( HydrusData.ToHumanInt( num_files ) )
         text += os.linesep * 2
         text += 'If the files were imported before some more recent improvement in the parsing code (such as EXIF rotation or bad video resolution or duration or frame count calculation), this will update them.'
@@ -4510,10 +4520,10 @@ class MediaPanelThumbnails( MediaPanel ):
 similar_menu = wx.Menu()
-ClientGUIMenus.AppendMenuItem( self, similar_menu, 'exact match', 'Search the database for files that look precisely like this one.', self._GetSimilarTo, HC.HAMMING_EXACT_MATCH )
-ClientGUIMenus.AppendMenuItem( self, similar_menu, 'very similar', 'Search the database for files that look just like this one.', self._GetSimilarTo, HC.HAMMING_VERY_SIMILAR )
-ClientGUIMenus.AppendMenuItem( self, similar_menu, 'similar', 'Search the database for files that look generally like this one.', self._GetSimilarTo, HC.HAMMING_SIMILAR )
-ClientGUIMenus.AppendMenuItem( self, similar_menu, 'speculative', 'Search the database for files that probably look like this one. This is sometimes useful for symbols with sharp edges or lines.', self._GetSimilarTo, HC.HAMMING_SPECULATIVE )
+ClientGUIMenus.AppendMenuItem( self, similar_menu, 'exact match', 'Search the database for files that look precisely like those selected.', self._GetSimilarTo, HC.HAMMING_EXACT_MATCH )
+ClientGUIMenus.AppendMenuItem( self, similar_menu, 'very similar', 'Search the database for files that look just like those selected.', self._GetSimilarTo, HC.HAMMING_VERY_SIMILAR )
+ClientGUIMenus.AppendMenuItem( self, similar_menu, 'similar', 'Search the database for files that look generally like those selected.', self._GetSimilarTo, HC.HAMMING_SIMILAR )
+ClientGUIMenus.AppendMenuItem( self, similar_menu, 'speculative', 'Search the database for files that probably look like those selected. This is sometimes useful for symbols with sharp edges or lines.', self._GetSimilarTo, HC.HAMMING_SPECULATIVE )
 ClientGUIMenus.AppendMenu( duplicates_menu, similar_menu, 'find similar-looking files' )
@@ -4531,7 +4541,7 @@ class MediaPanelThumbnails( MediaPanel ):
 ClientGUIMenus.AppendMenuItem( self, regen_menu, 'thumbnails, but only if wrong size', 'Regenerate the selected files\' thumbnails, but only if they are the wrong size.', self._RegenerateFileData, ClientFiles.REGENERATE_FILE_DATA_JOB_REFIT_THUMBNAIL )
 ClientGUIMenus.AppendMenuItem( self, regen_menu, 'thumbnails', 'Regenerate the selected files\'s thumbnails.', self._RegenerateFileData, ClientFiles.REGENERATE_FILE_DATA_JOB_FORCE_THUMBNAIL )
-ClientGUIMenus.AppendMenuItem( self, regen_menu, 'file metadata and thumbnails', 'Regenerated the selected files\' metadata and thumbnails.', self._RegenerateFileData, ClientFiles.REGENERATE_FILE_DATA_JOB_COMPLETE )
+ClientGUIMenus.AppendMenuItem( self, regen_menu, 'file metadata', 'Regenerated the selected files\' metadata and thumbnails.', self._RegenerateFileData, ClientFiles.REGENERATE_FILE_DATA_JOB_FILE_METADATA )
 ClientGUIMenus.AppendMenu( menu, regen_menu, 'regenerate' )
@ -2,6 +2,7 @@ from . import ClientConstants as CC
|
|||
from . import ClientData
|
||||
from . import ClientGUICommon
|
||||
from . import ClientGUIControls
|
||||
from . import ClientGUIFunctions
|
||||
from . import ClientGUIOptionsPanels
|
||||
from . import ClientGUIScrolledPanels
|
||||
from . import ClientGUIShortcuts
|
||||
|
@ -55,6 +56,10 @@ class InputFileSystemPredicate( ClientGUIScrolledPanels.EditPanel ):
|
|||
pred_classes.append( PanelPredicateSystemKnownURLsRegex )
|
||||
pred_classes.append( PanelPredicateSystemKnownURLsURLClass )
|
||||
|
||||
elif predicate_type == HC.PREDICATE_TYPE_SYSTEM_HAS_AUDIO:
|
||||
|
||||
pred_classes.append( PanelPredicateSystemHasAudio )
|
||||
|
||||
elif predicate_type == HC.PREDICATE_TYPE_SYSTEM_HASH:
|
||||
|
||||
pred_classes.append( PanelPredicateSystemHash )
|
||||
|
@ -577,6 +582,41 @@ class PanelPredicateSystemFileViewingStatsViewtime( PanelPredicateSystem ):

        return info


class PanelPredicateSystemHasAudio( PanelPredicateSystem ):

    PREDICATE_TYPE = HC.PREDICATE_TYPE_SYSTEM_HAS_AUDIO

    def __init__( self, parent ):

        PanelPredicateSystem.__init__( self, parent )

        choices = [ 'has audio', 'does not have audio' ]

        self._has_audio = wx.RadioBox( self, choices = choices, style = wx.RA_SPECIFY_ROWS )

        #

        hbox = wx.BoxSizer( wx.HORIZONTAL )

        hbox.Add( ClientGUICommon.BetterStaticText( self, 'system:' ), CC.FLAGS_VCENTER )
        hbox.Add( self._has_audio, CC.FLAGS_VCENTER )

        self.SetSizer( hbox )

        wx.CallAfter( self._has_audio.SetFocus )

    def GetInfo( self ):

        has_audio_string = self._has_audio.GetStringSelection()

        has_audio = has_audio_string == 'has audio'

        info = has_audio

        return info


class PanelPredicateSystemHash( PanelPredicateSystem ):

    PREDICATE_TYPE = HC.PREDICATE_TYPE_SYSTEM_HASH
@ -585,45 +625,44 @@ class PanelPredicateSystemHash( PanelPredicateSystem ):

PanelPredicateSystem.__init__( self, parent )

self.SetToolTip( 'As this can only ever return one result, it overrules the active file domain and any other active predicate.' )
self._hashes = wx.TextCtrl( self, style = wx.TE_MULTILINE )

self._hash = wx.TextCtrl( self, size = ( 200, -1 ) )
init_size = ClientGUIFunctions.ConvertTextToPixels( self._hashes, ( 66, 10 ) )

self._hashes.SetMinSize( init_size )

choices = [ 'sha256', 'md5', 'sha1', 'sha512' ]

self._hash_type = wx.RadioBox( self, choices = choices, style = wx.RA_SPECIFY_COLS )

self._hashes.SetValue( 'enter hash (paste newline-separated for multiple hashes)' )

hbox = wx.BoxSizer( wx.HORIZONTAL )

hbox.Add( ClientGUICommon.BetterStaticText( self, 'system:hash=' ), CC.FLAGS_VCENTER )
hbox.Add( self._hash, CC.FLAGS_VCENTER )
hbox.Add( self._hashes, CC.FLAGS_VCENTER )
hbox.Add( self._hash_type, CC.FLAGS_VCENTER )

self.SetSizer( hbox )

wx.CallAfter( self._hash.SetFocus )
wx.CallAfter( self._hashes.SetFocus )

def GetInfo( self ):

hex_hash = self._hash.GetValue().lower()
hex_hashes_raw = self._hashes.GetValue()

hex_hash = HydrusText.HexFilter( hex_hash )
hex_hashes = HydrusText.DeserialiseNewlinedTexts( hex_hashes_raw )

if len( hex_hash ) == 0:

    hex_hash = '00'

elif len( hex_hash ) % 2 == 1:

    hex_hash += '0' # since we are later decoding to byte

hex_hashes = [ HydrusText.HexFilter( hex_hash ) for hex_hash in hex_hashes ]

hash = bytes.fromhex( hex_hash )
hex_hashes = HydrusData.DedupeList( hex_hashes )

hashes = tuple( [ bytes.fromhex( hex_hash ) for hex_hash in hex_hashes ] )

hash_type = self._hash_type.GetStringSelection()

return ( hash, hash_type )
return ( hashes, hash_type )


class PanelPredicateSystemHeight( PanelPredicateSystem ):
@ -1283,13 +1322,17 @@ class PanelPredicateSystemSimilarTo( PanelPredicateSystem ):

PanelPredicateSystem.__init__( self, parent )

self._hash = wx.TextCtrl( self )
self._hashes = wx.TextCtrl( self, style = wx.TE_MULTILINE )

init_size = ClientGUIFunctions.ConvertTextToPixels( self._hashes, ( 66, 10 ) )

self._hashes.SetMinSize( init_size )

self._max_hamming = wx.SpinCtrl( self, max = 256, size = ( 60, -1 ) )

system_predicates = HC.options[ 'file_system_predicates' ]

self._hash.SetValue( 'enter hash' )
self._hashes.SetValue( 'enter hash (paste newline-separated for multiple hashes)' )

hamming_distance = system_predicates[ 'hamming_distance' ]
@ -1298,31 +1341,28 @@
hbox = wx.BoxSizer( wx.HORIZONTAL )

hbox.Add( ClientGUICommon.BetterStaticText( self, 'system:similar_to' ), CC.FLAGS_VCENTER )
hbox.Add( self._hash, CC.FLAGS_VCENTER )
hbox.Add( self._hashes, CC.FLAGS_VCENTER )
hbox.Add( wx.StaticText( self, label='\u2248' ), CC.FLAGS_VCENTER )
hbox.Add( self._max_hamming, CC.FLAGS_VCENTER )

self.SetSizer( hbox )

wx.CallAfter( self._hash.SetFocus )
wx.CallAfter( self._hashes.SetFocus )

def GetInfo( self ):

hex_hash = self._hash.GetValue()
hex_hashes_raw = self._hashes.GetValue()

hex_hash = HydrusText.HexFilter( hex_hash )
hex_hashes = HydrusText.DeserialiseNewlinedTexts( hex_hashes_raw )

if len( hex_hash ) == 0:

    hex_hash = '00'

elif len( hex_hash ) % 2 == 1:

    hex_hash += '0' # since we are later decoding to byte

hex_hashes = [ HydrusText.HexFilter( hex_hash ) for hex_hash in hex_hashes ]

info = ( bytes.fromhex( hex_hash ), self._max_hamming.GetValue() )
hex_hashes = HydrusData.DedupeList( hex_hashes )

hashes = tuple( [ bytes.fromhex( hex_hash ) for hex_hash in hex_hashes ] )

info = ( hashes, self._max_hamming.GetValue() )

return info
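The multi-hash `GetInfo` pipeline introduced above runs newline-separated input through `HydrusText.DeserialiseNewlinedTexts`, `HydrusText.HexFilter`, and `HydrusData.DedupeList` before decoding to bytes. Those are hydrus helpers; the standalone functions below are assumptions about their behaviour (split and strip, keep hex characters only, order-preserving dedupe), combined with the odd-length padding seen in the hunk:

```python
def parse_hex_hashes( raw_text ):
    
    # split on newlines and drop empty lines, like DeserialiseNewlinedTexts
    lines = [ line.strip() for line in raw_text.splitlines() ]
    lines = [ line for line in lines if line != '' ]
    
    hex_hashes = []
    
    for line in lines:
        
        # keep only hex characters, like HexFilter
        hex_hash = ''.join( c for c in line.lower() if c in '0123456789abcdef' )
        
        if len( hex_hash ) % 2 == 1:
            
            hex_hash += '0' # pad so bytes.fromhex does not raise
            
        
        if len( hex_hash ) > 0:
            
            hex_hashes.append( hex_hash )
            
        
    
    # dedupe while preserving order, like DedupeList
    hex_hashes = list( dict.fromkeys( hex_hashes ) )
    
    return tuple( bytes.fromhex( hex_hash ) for hex_hash in hex_hashes )
```

This is a sketch of the parsing idea only; the real panels additionally feed the resulting tuple into the predicate alongside the selected hash type or hamming distance.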
@ -1512,7 +1512,7 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
self._listbook.AddPage( 'speed and memory', 'speed and memory', self._SpeedAndMemoryPanel( self._listbook, self._new_options ) )
self._listbook.AddPage( 'maintenance and processing', 'maintenance and processing', self._MaintenanceAndProcessingPanel( self._listbook ) )
self._listbook.AddPage( 'media', 'media', self._MediaPanel( self._listbook ) )
#self._listbook.AddPage( 'sound', 'sound', self._SoundPanel( self._listbook ) )
self._listbook.AddPage( 'sound', 'sound', self._SoundPanel( self._listbook, self._new_options ) )
self._listbook.AddPage( 'default system predicates', 'default system predicates', self._DefaultFileSystemPredicatesPanel( self._listbook, self._new_options ) )
self._listbook.AddPage( 'colours', 'colours', self._ColoursPanel( self._listbook ) )
self._listbook.AddPage( 'regex favourites', 'regex favourites', self._RegexPanel( self._listbook ) )
@ -3496,28 +3496,36 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):

class _SoundPanel( wx.Panel ):

def __init__( self, parent ):
def __init__( self, parent, new_options ):

wx.Panel.__init__( self, parent )

self._play_dumper_noises = wx.CheckBox( self, label = 'play success/fail noises when dumping' )
self._new_options = new_options

self._has_audio_label = wx.TextCtrl( self )

#

self._play_dumper_noises.SetValue( HC.options[ 'play_dumper_noises' ] )
self._has_audio_label.SetValue( self._new_options.GetString( 'has_audio_label' ) )

#

vbox = wx.BoxSizer( wx.VERTICAL )

vbox.Add( self._play_dumper_noises, CC.FLAGS_EXPAND_PERPENDICULAR )
rows = []

rows.append( ( 'Label for files with audio: ', self._has_audio_label ) )

gridbox = ClientGUICommon.WrapInGrid( self, rows )

vbox.Add( gridbox, CC.FLAGS_EXPAND_PERPENDICULAR )

self.SetSizer( vbox )

def UpdateOptions( self ):

HC.options[ 'play_dumper_noises' ] = self._play_dumper_noises.GetValue()
self._new_options.SetString( 'has_audio_label', self._has_audio_label.GetValue() )
@ -1990,6 +1990,8 @@ class ReviewFileMaintenance( ClientGUIScrolledPanels.ReviewPanel ):
self._action_selector.Append( ClientFiles.regen_file_enum_to_str_lookup[ job_type ], job_type )

self._description_button = ClientGUICommon.BetterButton( self._action_panel, 'see description', self._SeeDescription )

self._add_new_job = ClientGUICommon.BetterButton( self._action_panel, 'add job', self._AddJob )

self._add_new_job.Disable()
@ -1998,6 +2000,7 @@ class ReviewFileMaintenance( ClientGUIScrolledPanels.ReviewPanel ):
hbox.Add( self._selected_files_st, CC.FLAGS_EXPAND_BOTH_WAYS )
hbox.Add( self._action_selector, CC.FLAGS_VCENTER )
hbox.Add( self._description_button, CC.FLAGS_VCENTER )
hbox.Add( self._add_new_job, CC.FLAGS_VCENTER )

self._action_panel.Add( hbox, CC.FLAGS_EXPAND_BOTH_WAYS )
@ -2006,6 +2009,11 @@ class ReviewFileMaintenance( ClientGUIScrolledPanels.ReviewPanel ):
vbox = wx.BoxSizer( wx.VERTICAL )

label = 'First, select which files you wish to queue up for the job using a normal search. Hit \'run this search\' to select those files.'
label += os.linesep
label += 'Then select the job type and click \'add job\'.'

vbox.Add( ClientGUICommon.BetterStaticText( self._new_work_panel, label = label ), CC.FLAGS_EXPAND_PERPENDICULAR )
vbox.Add( self._search_panel, CC.FLAGS_EXPAND_PERPENDICULAR )
vbox.Add( self._button_panel, CC.FLAGS_EXPAND_PERPENDICULAR )
vbox.Add( self._action_panel, CC.FLAGS_EXPAND_PERPENDICULAR )
@ -2210,6 +2218,13 @@ class ReviewFileMaintenance( ClientGUIScrolledPanels.ReviewPanel ):
HG.client_controller.CallToThread( do_it, file_search_context )

def _SeeDescription( self ):

    job_type = self._action_selector.GetValue()

    wx.MessageBox( ClientFiles.regen_file_enum_to_description_lookup[ job_type ] )

def _SelectRepoUpdateFiles( self ):

    def wx_done( hash_ids ):
@ -187,7 +187,7 @@ class FileImportJob( object ):
HydrusData.ShowText( 'File import job testing if good to import for file import options' )

( size, mime, width, height, duration, num_frames, num_words ) = self._file_info
( size, mime, width, height, duration, num_frames, has_audio, num_words ) = self._file_info

self._file_import_options.CheckFileIsValid( size, mime, width, height )
@ -280,7 +280,7 @@ class FileImportJob( object ):
self._file_info = HydrusFileHandling.GetFileInfo( self._temp_path, mime )

( size, mime, width, height, duration, num_frames, num_words ) = self._file_info
( size, mime, width, height, duration, num_frames, has_audio, num_words ) = self._file_info

if HG.file_import_report_mode:
@ -348,7 +348,7 @@ class FileImportJob( object ):

def GetMime( self ):

    ( size, mime, width, height, duration, num_frames, num_words ) = self._file_info
    ( size, mime, width, height, duration, num_frames, has_audio, num_words ) = self._file_info

    return mime
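The file import hunks all widen the `self._file_info` tuple from seven to eight fields, inserting `has_audio` between `num_frames` and `num_words`. A minimal sketch of the new unpacking order, with invented sample values for illustration:

```python
# ( size, mime, width, height, duration, num_frames, has_audio, num_words )
file_info = ( 12345, 'video/mp4', 1280, 720, 30000, 900, True, None )

( size, mime, width, height, duration, num_frames, has_audio, num_words ) = file_info

def describe_audio( has_audio ):
    
    # any code that unpacked the old seven-tuple must be updated,
    # or the assignment above raises ValueError at runtime
    return 'has audio' if has_audio else 'no audio'
```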
@ -463,9 +463,6 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):
error_count = 0

all_presentation_hashes = []
all_presentation_hashes_fast = set()

queries = self._GetQueriesForProcessing()

num_queries = len( queries )
@ -500,182 +497,173 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):
starting_num_unknown = file_seed_cache.GetFileSeedCount( CC.STATUS_UNKNOWN )
starting_num_done = starting_num_urls - starting_num_unknown

while True:
try:

file_seed = file_seed_cache.GetNextFileSeed( CC.STATUS_UNKNOWN )

if file_seed is None:
while True:

if HG.subscription_report_mode:
file_seed = file_seed_cache.GetNextFileSeed( CC.STATUS_UNKNOWN )

if file_seed is None:

HydrusData.ShowText( 'Query "' + query_name + '" can do no more file work due to running out of unknown urls.' )
if HG.subscription_report_mode:

HydrusData.ShowText( 'Query "' + query_name + '" can do no more file work due to running out of unknown urls.' )

break

break

if job_key.IsCancelled():

self._DelayWork( 300, 'recently cancelled' )

break

p1 = HC.options[ 'pause_subs_sync' ]
p2 = HydrusThreading.IsThreadShuttingDown()
p3 = HG.view_shutdown
p4 = not self._QueryBandwidthIsOK( query )
p5 = not self._QueryFileLoginIsOK( query )

if p1 or p2 or p3 or p4 or p5:

if p4 and this_query_has_done_work:
if job_key.IsCancelled():

job_key.SetVariable( 'popup_text_2', 'no more bandwidth to download files, will do some more later' )
self._DelayWork( 300, 'recently cancelled' )

time.sleep( 5 )
break

break
p1 = HC.options[ 'pause_subs_sync' ]
p2 = HydrusThreading.IsThreadShuttingDown()
p3 = HG.view_shutdown
p4 = not self._QueryBandwidthIsOK( query )
p5 = not self._QueryFileLoginIsOK( query )

try:

num_urls = file_seed_cache.GetFileSeedCount()
num_unknown = file_seed_cache.GetFileSeedCount( CC.STATUS_UNKNOWN )
num_done = num_urls - num_unknown

# 4001/4003 is not as useful as 1/3

human_num_urls = num_urls - starting_num_done
human_num_done = num_done - starting_num_done

x_out_of_y = 'file ' + HydrusData.ConvertValueRangeToPrettyString( human_num_done + 1, human_num_urls ) + ': '

job_key.SetVariable( 'popup_gauge_2', ( human_num_done, human_num_urls ) )

def status_hook( text ):
if p1 or p2 or p3 or p4 or p5:

job_key.SetVariable( 'popup_text_2', x_out_of_y + text )
if p4 and this_query_has_done_work:

job_key.SetVariable( 'popup_text_2', 'no more bandwidth to download files, will do some more later' )

time.sleep( 5 )

break

file_seed.WorkOnURL( file_seed_cache, status_hook, self._GenerateNetworkJobFactory( query ), ClientImporting.GenerateMultiplePopupNetworkJobPresentationContextFactory( job_key ), self._file_import_options, self._tag_import_options )

query_tag_import_options = query.GetTagImportOptions()

if query_tag_import_options.HasAdditionalTags() and file_seed.status in CC.SUCCESSFUL_IMPORT_STATES:
try:

if file_seed.HasHash():
num_urls = file_seed_cache.GetFileSeedCount()
num_unknown = file_seed_cache.GetFileSeedCount( CC.STATUS_UNKNOWN )
num_done = num_urls - num_unknown

# 4001/4003 is not as useful as 1/3

human_num_urls = num_urls - starting_num_done
human_num_done = num_done - starting_num_done

x_out_of_y = 'file ' + HydrusData.ConvertValueRangeToPrettyString( human_num_done + 1, human_num_urls ) + ': '

job_key.SetVariable( 'popup_gauge_2', ( human_num_done, human_num_urls ) )

def status_hook( text ):

job_key.SetVariable( 'popup_text_2', x_out_of_y + text )

file_seed.WorkOnURL( file_seed_cache, status_hook, self._GenerateNetworkJobFactory( query ), ClientImporting.GenerateMultiplePopupNetworkJobPresentationContextFactory( job_key ), self._file_import_options, self._tag_import_options )

query_tag_import_options = query.GetTagImportOptions()

if query_tag_import_options.HasAdditionalTags() and file_seed.status in CC.SUCCESSFUL_IMPORT_STATES:

if file_seed.HasHash():

hash = file_seed.GetHash()

in_inbox = HG.client_controller.Read( 'in_inbox', hash )

downloaded_tags = []

service_keys_to_content_updates = query_tag_import_options.GetServiceKeysToContentUpdates( file_seed.status, in_inbox, hash, downloaded_tags ) # additional tags

if len( service_keys_to_content_updates ) > 0:

HG.client_controller.WriteSynchronous( 'content_updates', service_keys_to_content_updates )

if file_seed.ShouldPresent( self._file_import_options ):

hash = file_seed.GetHash()

in_inbox = HG.client_controller.Read( 'in_inbox', hash )

downloaded_tags = []

service_keys_to_content_updates = query_tag_import_options.GetServiceKeysToContentUpdates( file_seed.status, in_inbox, hash, downloaded_tags ) # additional tags

if len( service_keys_to_content_updates ) > 0:
if hash not in presentation_hashes_fast:

HG.client_controller.WriteSynchronous( 'content_updates', service_keys_to_content_updates )
presentation_hashes.append( hash )

presentation_hashes_fast.add( hash )

if file_seed.ShouldPresent( self._file_import_options ):
except HydrusExceptions.CancelledException as e:

hash = file_seed.GetHash()
self._DelayWork( 300, str( e ) )

if hash not in presentation_hashes_fast:
break

except HydrusExceptions.VetoException as e:

status = CC.STATUS_VETOED

note = str( e )

file_seed.SetStatus( status, note = note )

except HydrusExceptions.NotFoundException:

status = CC.STATUS_VETOED

note = '404'

file_seed.SetStatus( status, note = note )

except Exception as e:

status = CC.STATUS_ERROR

job_key.SetVariable( 'popup_text_2', x_out_of_y + 'file failed' )

file_seed.SetStatus( status, exception = e )

if isinstance( e, HydrusExceptions.DataMissing ):

if hash not in all_presentation_hashes_fast:

all_presentation_hashes.append( hash )

all_presentation_hashes_fast.add( hash )

# DataMissing is a quick thing to avoid subscription abandons when lots of deleted files in e621 (or any other booru)
# this should be richer in any case in the new system

presentation_hashes.append( hash )
pass

presentation_hashes_fast.add( hash )
else:

error_count += 1

time.sleep( 10 )

if error_count > 4:

raise Exception( 'The subscription ' + self._name + ' encountered several errors when downloading files, so it abandoned its sync.' )

except HydrusExceptions.CancelledException as e:
this_query_has_done_work = True

self._DelayWork( 300, str( e ) )

break

except HydrusExceptions.VetoException as e:

status = CC.STATUS_VETOED

note = str( e )

file_seed.SetStatus( status, note = note )

except HydrusExceptions.NotFoundException:

status = CC.STATUS_VETOED

note = '404'

file_seed.SetStatus( status, note = note )

except Exception as e:

status = CC.STATUS_ERROR

job_key.SetVariable( 'popup_text_2', x_out_of_y + 'file failed' )

file_seed.SetStatus( status, exception = e )

if isinstance( e, HydrusExceptions.DataMissing ):
if len( presentation_hashes ) > 0:

# DataMissing is a quick thing to avoid subscription abandons when lots of deleted files in e621 (or any other booru)
# this should be richer in any case in the new system

pass

else:

error_count += 1

time.sleep( 10 )
job_key.SetVariable( 'popup_files', ( list( presentation_hashes ), query_summary_name ) )

if error_count > 4:

raise Exception( 'The subscription ' + self._name + ' encountered several errors when downloading files, so it abandoned its sync.' )

time.sleep( ClientImporting.DID_SUBSTANTIAL_FILE_WORK_MINIMUM_SLEEP_TIME )

HG.client_controller.WaitUntilViewFree()

this_query_has_done_work = True
finally:

if len( presentation_hashes ) > 0:

job_key.SetVariable( 'popup_files', ( list( presentation_hashes ), query_summary_name ) )
publishing_label = self._GetPublishingLabel( query )

ClientImporting.PublishPresentationHashes( publishing_label, presentation_hashes, self._publish_files_to_popup_button, self._publish_files_to_page )

time.sleep( ClientImporting.DID_SUBSTANTIAL_FILE_WORK_MINIMUM_SLEEP_TIME )

HG.client_controller.WaitUntilViewFree()

if not self._merge_query_publish_events and len( presentation_hashes ) > 0:

publishing_label = self._GetPublishingLabel( query )

ClientImporting.PublishPresentationHashes( publishing_label, presentation_hashes, self._publish_files_to_popup_button, self._publish_files_to_page )

if self._merge_query_publish_events and len( all_presentation_hashes ) > 0:

publishing_label = self._GetPublishingLabel( query )

ClientImporting.PublishPresentationHashes( publishing_label, all_presentation_hashes, self._publish_files_to_popup_button, self._publish_files_to_page )

job_key.DeleteVariable( 'popup_files' )
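The subscription rework wraps the per-file loop in a try/except where most exceptions (veto, 404) just mark the file seed, but a generic error increments a counter and the sync is abandoned once more than four errors accumulate. A standalone sketch of that abandon rule (the worker callable and the threshold parameter are simplifications for illustration; the real loop also sets seed statuses and sleeps between errors):

```python
def sync_files( file_seeds, work_on_url, max_errors = 4 ):
    
    error_count = 0
    
    for file_seed in file_seeds:
        
        try:
            
            work_on_url( file_seed )
            
        except Exception:
            
            # the real loop also records the error on the seed and sleeps
            error_count += 1
            
            if error_count > max_errors:
                
                raise Exception( 'encountered several errors when downloading files, so abandoned sync' )
                
            
        
    
    return error_count
```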
@ -76,6 +76,13 @@ class HydrusServiceClientAPI( HydrusClientService ):
get_files.putChild( b'file', ClientLocalServerResources.HydrusResourceClientAPIRestrictedGetFilesGetFile( self._service, self._client_requests_domain ) )
get_files.putChild( b'thumbnail', ClientLocalServerResources.HydrusResourceClientAPIRestrictedGetFilesGetThumbnail( self._service, self._client_requests_domain ) )

manage_cookies = NoResource()

root.putChild( b'manage_cookies', manage_cookies )

manage_cookies.putChild( b'get_cookies', ClientLocalServerResources.HydrusResourceClientAPIRestrictedManageCookiesGetCookies( self._service, self._client_requests_domain ) )
manage_cookies.putChild( b'set_cookies', ClientLocalServerResources.HydrusResourceClientAPIRestrictedManageCookiesSetCookies( self._service, self._client_requests_domain ) )

manage_pages = NoResource()

root.putChild( b'manage_pages', manage_pages )
@ -2,6 +2,7 @@ import collections
from . import ClientAPI
from . import ClientConstants as CC
from . import ClientImportFileSeeds
from . import ClientNetworkingContexts
from . import ClientSearch
from . import ClientTags
from . import HydrusConstants as HC
@ -12,6 +13,7 @@ from . import HydrusNetworking
from . import HydrusPaths
from . import HydrusServerResources
from . import HydrusTags
import http.cookiejar
import json
import os
import time
@ -28,7 +30,7 @@ LOCAL_BOORU_JSON_BYTE_LIST_PARAMS = set()

CLIENT_API_INT_PARAMS = { 'file_id' }
CLIENT_API_BYTE_PARAMS = { 'hash', 'destination_page_key', 'page_key', 'Hydrus-Client-API-Access-Key', 'Hydrus-Client-API-Session-Key' }
CLIENT_API_STRING_PARAMS = { 'name', 'url' }
CLIENT_API_STRING_PARAMS = { 'name', 'url', 'domain' }
CLIENT_API_JSON_PARAMS = { 'basic_permissions', 'system_inbox', 'system_archive', 'tags', 'file_ids', 'only_return_identifiers' }
CLIENT_API_JSON_BYTE_LIST_PARAMS = { 'hashes' }
@ -1485,6 +1487,13 @@ class HydrusResourceClientAPIRestrictedGetFilesFileMetadata( HydrusResourceClien
metadata_row[ 'duration' ] = file_info_manager.duration
metadata_row[ 'num_frames' ] = file_info_manager.num_frames
metadata_row[ 'num_words' ] = file_info_manager.num_words
metadata_row[ 'has_audio' ] = file_info_manager.has_audio

known_urls = list( media_result.GetLocationsManager().GetURLs() )

known_urls.sort()

metadata_row[ 'known_urls' ] = known_urls

tags_manager = media_result.GetTagsManager()
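The `/get_files/file_metadata` additions put `has_audio` into each metadata row and attach a sorted `known_urls` list. A sketch of how a client might expect the resulting JSON row to look (field names follow the hunk; the helper function and sample values are invented for illustration):

```python
import json

def build_metadata_row( file_info, urls ):
    
    metadata_row = dict( file_info )
    
    # the server sorts urls before sending, so the list order is stable
    known_urls = list( urls )
    known_urls.sort()
    
    metadata_row[ 'known_urls' ] = known_urls
    
    return metadata_row

row = build_metadata_row(
    { 'duration' : None, 'num_frames' : None, 'num_words' : None, 'has_audio' : False },
    { 'https://example.com/b', 'https://example.com/a' }
)
```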
@ -1566,6 +1575,116 @@ class HydrusResourceClientAPIRestrictedGetFilesGetThumbnail( HydrusResourceClien

        return response_context


class HydrusResourceClientAPIRestrictedManageCookies( HydrusResourceClientAPIRestricted ):

    def _CheckAPIPermissions( self, request ):

        request.client_api_permissions.CheckPermission( ClientAPI.CLIENT_API_PERMISSION_MANAGE_COOKIES )


class HydrusResourceClientAPIRestrictedManageCookiesGetCookies( HydrusResourceClientAPIRestrictedManageCookies ):

    def _threadDoGETJob( self, request ):

        if 'domain' not in request.parsed_request_args:

            raise HydrusExceptions.BadRequestException( 'Please include a domain parameter!' )

        domain = request.parsed_request_args[ 'domain' ]

        if '.' not in domain:

            raise HydrusExceptions.BadRequestException( 'The value "{}" does not seem to be a domain!'.format( domain ) )

        network_context = ClientNetworkingContexts.NetworkContext( CC.NETWORK_CONTEXT_DOMAIN, domain )

        session = HG.client_controller.network_engine.session_manager.GetSession( network_context )

        body_cookies_list = []

        for cookie in session.cookies:

            name = cookie.name
            value = cookie.value
            domain = cookie.domain
            path = cookie.path
            expires = cookie.expires

            body_cookies_list.append( [ name, value, domain, path, expires ] )

        body_dict = {}

        body_dict = { 'cookies' : body_cookies_list }

        body = json.dumps( body_dict )

        response_context = HydrusServerResources.ResponseContext( 200, mime = HC.APPLICATION_JSON, body = body )

        return response_context


class HydrusResourceClientAPIRestrictedManageCookiesSetCookies( HydrusResourceClientAPIRestrictedManageCookies ):

    def _threadDoPOSTJob( self, request ):

        cookie_rows = request.parsed_request_args[ 'cookies' ]

        for cookie_row in cookie_rows:

            if len( cookie_row ) != 5:

                raise HydrusExceptions.BadRequestException( 'The cookie "{}" did not come in the format [ name, value, domain, path, expires ]!'.format( cookie_row ) )

            ( name, value, domain, path, expires ) = cookie_row

            ndp_bad = True in ( not isinstance( var, str ) for var in ( name, domain, path ) )
            v_bad = value is not None and not isinstance( value, str )
            e_bad = expires is not None and not isinstance( expires, int )

            if ndp_bad or v_bad or e_bad:

                raise HydrusExceptions.BadRequestException( 'In the row [ name, value, domain, path, expires ], which I received as "{}", name, domain, and path need to be strings, value needs to be null or a string, and expires needs to be null or an integer!'.format( cookie_row ) )

            network_context = ClientNetworkingContexts.NetworkContext( CC.NETWORK_CONTEXT_DOMAIN, domain )

            session = HG.client_controller.network_engine.session_manager.GetSession( network_context )

            if value is None:

                session.cookies.clear( domain, path, name )

            else:

                version = 0
                port = None
                port_specified = False
                domain_specified = True
                domain_initial_dot = domain.startswith( '.' )
                path_specified = True
                secure = False
                discard = False
                comment = None
                comment_url = None
                rest = {}

                cookie = http.cookiejar.Cookie( version, name, value, port, port_specified, domain, domain_specified, domain_initial_dot, path, path_specified, secure, expires, discard, comment, comment_url, rest )

                session.cookies.set_cookie( cookie )

        HG.client_controller.network_engine.session_manager.SetDirty()

        response_context = HydrusServerResources.ResponseContext( 200 )

        return response_context


class HydrusResourceClientAPIRestrictedManagePages( HydrusResourceClientAPIRestricted ):

    def _CheckAPIPermissions( self, request ):
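The `set_cookies` handler builds a full `http.cookiejar.Cookie` by hand and, when the value is null, clears the matching cookie instead. The same construction works against a plain stdlib `CookieJar`; this sketch reuses the hunk's argument order and default flags, with a hypothetical `make_cookie` helper standing in for the inline code:

```python
import http.cookiejar

def make_cookie( name, value, domain, path, expires ):
    
    # defaults mirror the handler: version 0, no port, explicit domain and path
    version = 0
    port = None
    port_specified = False
    domain_specified = True
    domain_initial_dot = domain.startswith( '.' )
    path_specified = True
    secure = False
    discard = False
    comment = None
    comment_url = None
    rest = {}
    
    return http.cookiejar.Cookie( version, name, value, port, port_specified, domain, domain_specified, domain_initial_dot, path, path_specified, secure, expires, discard, comment, comment_url, rest )

jar = http.cookiejar.CookieJar()

jar.set_cookie( make_cookie( 'session', 'abc123', 'example.com', '/', None ) )
```

`CookieJar.clear( domain, path, name )` then mirrors the null-value branch that removes a cookie.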
@ -64,6 +64,9 @@ NICE_RATIOS[ 5 / 4 ] = '5:4'
NICE_RATIOS[ 16 / 9 ] = '16:9'
NICE_RATIOS[ 21 / 9 ] = '21:9'
NICE_RATIOS[ 47 / 20 ] = '2.35:1'
NICE_RATIOS[ 9 / 16 ] = '9:16'
NICE_RATIOS[ 2 / 3 ] = '2:3'
NICE_RATIOS[ 4 / 5 ] = '4:5'

def GetDuplicateComparisonStatements( shown_media, comparison_media ):
@ -499,7 +502,7 @@ class DuplicatesManager( object ):

class FileInfoManager( object ):

def __init__( self, hash_id, hash, size = None, mime = None, width = None, height = None, duration = None, num_frames = None, num_words = None ):
def __init__( self, hash_id, hash, size = None, mime = None, width = None, height = None, duration = None, num_frames = None, has_audio = None, num_words = None ):

if mime is None:
@ -514,17 +517,18 @@ class FileInfoManager( object ):
self.height = height
self.duration = duration
self.num_frames = num_frames
self.has_audio = has_audio
self.num_words = num_words

def Duplicate( self ):

    return FileInfoManager( self.hash_id, self.hash, self.size, self.mime, self.width, self.height, self.duration, self.num_frames, self.num_words )
    return FileInfoManager( self.hash_id, self.hash, self.size, self.mime, self.width, self.height, self.duration, self.num_frames, self.has_audio, self.num_words )

def ToTuple( self ):

    return ( self.hash_id, self.hash, self.size, self.mime, self.width, self.height, self.duration, self.num_frames, self.num_words )
    return ( self.hash_id, self.hash, self.size, self.mime, self.width, self.height, self.duration, self.num_frames, self.has_audio, self.num_words )

class FileViewingStatsManager( object ):
@ -1780,8 +1784,6 @@ class MediaCollection( MediaList, Media ):

def IsImage( self ): return False

def IsNoisy( self ): return self.GetDisplayMedia().GetMime() in HC.NOISY_MIMES

def IsSizeDefinite( self ): return self._size_definite

def ProcessContentUpdates( self, service_keys_to_content_updates ):
@ -1896,7 +1898,7 @@ class MediaSingleton( Media ):
file_info_manager = self._media_result.GetFileInfoManager()
locations_manager = self._media_result.GetLocationsManager()

( hash_id, hash, size, mime, width, height, duration, num_frames, num_words ) = file_info_manager.ToTuple()
( hash_id, hash, size, mime, width, height, duration, num_frames, has_audio, num_words ) = file_info_manager.ToTuple()

info_string = HydrusData.ToHumanBytes( size ) + ' ' + HC.mime_string_lookup[ mime ]
@ -1906,6 +1908,11 @@ class MediaSingleton( Media ):

if num_frames is not None: info_string += ' (' + HydrusData.ToHumanInt( num_frames ) + ' frames)'

if has_audio:

    info_string += ', {}'.format( HG.client_controller.new_options.GetString( 'has_audio_label' ) )

if num_words is not None: info_string += ' (' + HydrusData.ToHumanInt( num_words ) + ' words)'

lines = [ info_string ]
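This hunk appends the user-configurable audio label (the changelog says it defaults to the 🔊 character, editable under the new 'sound' options page) to the thumbnail/media-viewer info string. A reduced sketch of that assembly, with the size/mime formatting simplified to plain strings:

```python
def build_info_string( size_string, mime_string, has_audio, has_audio_label = '\U0001F50A' ):
    
    info_string = size_string + ' ' + mime_string
    
    if has_audio:
        
        # the real code reads the label from new_options' 'has_audio_label' string
        info_string += ', {}'.format( has_audio_label )
        
    
    return info_string
```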
@ -2017,7 +2024,15 @@ class MediaSingleton( Media ):

    return self._media_result.GetHash() in hashes

def HasArchive( self ): return not self._media_result.GetInbox()
def HasArchive( self ):

    return not self._media_result.GetInbox()

def HasAudio( self ):

    return self._media_result.HasAudio()

def HasDuration( self ):
@ -2052,8 +2067,6 @@ class MediaSingleton( Media ):

return self._media_result.GetMime() in HC.IMAGES and not self.HasDuration()

def IsNoisy( self ): return self._media_result.GetMime() in HC.NOISY_MIMES

def IsSizeDefinite( self ): return self._media_result.GetSize() is not None

def IsStaticImage( self ):
@ -2236,6 +2249,11 @@ class MediaResult( object ):

    return self._tags_manager

def HasAudio( self ):

    return self._file_info_manager.has_audio is True

def IsStaticImage( self ):

    return self._file_info_manager.mime in HC.IMAGES and self._file_info_manager.duration in ( None, 0 )
@@ -130,6 +130,8 @@ class NetworkJob( object ):

self._bandwidth_tracker = HydrusNetworking.BandwidthTracker()

self._connection_error_wake_time = 0
self._wake_time = 0

self._content_type = None

@@ -478,13 +480,13 @@ class NetworkJob( object ):

def _WaitOnConnectionError( self, status_text ):

time_to_try_again = HydrusData.GetNow() + ( ( self._current_connection_attempt_number - 1 ) * 60 )
self._connection_error_wake_time = HydrusData.GetNow() + ( ( self._current_connection_attempt_number - 1 ) * 10 )

while not HydrusData.TimeHasPassed( time_to_try_again ) and not self._IsCancelled():
while not HydrusData.TimeHasPassed( self._connection_error_wake_time ) and not self._IsCancelled():

with self._lock:

self._status_text = status_text + ' - retrying in {}'.format( HydrusData.TimestampToPrettyTimeDelta( time_to_try_again ) )
self._status_text = status_text + ' - retrying in {}'.format( HydrusData.TimestampToPrettyTimeDelta( self._connection_error_wake_time ) )

time.sleep( 1 )

@@ -618,6 +620,14 @@ class NetworkJob( object ):

def CurrentlyWaitingOnConnectionError( self ):

with self._lock:

return not HydrusData.TimeHasPassed( self._connection_error_wake_time )

def GenerateLoginProcess( self ):

with self._lock:

@@ -857,6 +867,14 @@ class NetworkJob( object ):

def OverrideConnectionErrorWait( self ):

with self._lock:

self._connection_error_wake_time = 0

def OverrideToken( self ):

with self._lock:

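The `_WaitOnConnectionError` hunk above stores a wake timestamp, then sleeps in one-second slices, so a cancel or the new `OverrideConnectionErrorWait` can cut the wait short by zeroing the timestamp. A minimal standalone sketch of that pattern (names here are illustrative, not hydrus's real API):

```python
import threading
import time

class ConnectionErrorWaiter:
    """Sketch of a cancellable linear backoff: a shared wake timestamp that a
    short polling loop re-checks, so another thread can end the wait early."""

    def __init__( self ):
        self._lock = threading.Lock()
        self._wake_time = 0.0
        self._cancelled = False

    def wait_on_error( self, attempt_number ):
        # linear backoff: 10s per previous attempt, as in the diff above
        with self._lock:
            self._wake_time = time.time() + ( attempt_number - 1 ) * 10
        while True:
            with self._lock:
                if self._cancelled or time.time() >= self._wake_time:
                    return
            time.sleep( 0.1 )  # the real job sleeps 1s between checks

    def override_wait( self ):
        # zeroing the wake time makes the loop exit on its next poll
        with self._lock:
            self._wake_time = 0.0

    def cancel( self ):
        with self._lock:
            self._cancelled = True
```

Polling a timestamp instead of sleeping the whole delay in one call is what makes the new `OverrideConnectionErrorWait` button-style control possible.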
@@ -1585,7 +1585,7 @@ class LoginStep( HydrusSerialisable.SerialisableBaseNamed ):

url = 'Did not make a url.'
test_result_body = None
downloaded_data = 'Did not download data.'
downloaded_text = 'Did not download data.'
new_temp_variables = {}
original_cookie_strings = session_to_cookie_strings( engine.session_manager.GetSessionForDomain( domain ) )
test_script_result = 'Did not start.'

@@ -1751,7 +1751,7 @@ class LoginStep( HydrusSerialisable.SerialisableBaseNamed ):

new_temp_strings = tuple( ( key + ': ' + value for ( key, value ) in list(new_temp_variables.items()) ) )

test_result = ( self._name, url, test_result_body, downloaded_data, new_temp_strings, new_cookie_strings, test_script_result )
test_result = ( self._name, url, test_result_body, downloaded_text, new_temp_strings, new_cookie_strings, test_script_result )

test_result_callable( test_result )

@@ -240,4 +240,12 @@ class NetworkSessionManager( HydrusSerialisable.SerialisableBase ):

def SetDirty( self ):

with self._lock:

self._dirty = True

HydrusSerialisable.SERIALISABLE_TYPES_TO_OBJECT_TYPES[ HydrusSerialisable.SERIALISABLE_TYPE_NETWORK_SESSION_MANAGER ] = NetworkSessionManager

@@ -312,6 +312,7 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):

self._dictionary[ 'strings' ][ 'pause_character' ] = '\u23F8'
self._dictionary[ 'strings' ][ 'stop_character' ] = '\u23F9'
self._dictionary[ 'strings' ][ 'default_gug_name' ] = 'safebooru tag search'
self._dictionary[ 'strings' ][ 'has_audio_label' ] = '\U0001F50A'

self._dictionary[ 'string_list' ] = {}

@@ -1872,9 +1872,9 @@ class ContentParser( HydrusSerialisable.SerialisableBase ):

# ->
# http:/www.pixiv.net/member_illust.php?illust_id=48114073&mode=medium

while re.search( '\shttp', u ) is not None:
while re.search( r'\shttp', u ) is not None:

u = re.sub( '^.*\shttp', 'http', u )
u = re.sub( r'^.*\shttp', 'http', u )

@@ -89,28 +89,28 @@ def FilterTagsBySearchText( service_key, search_text, tags, search_siblings = Tr

# \Z is end of string
# \s is whitespace

if '\:' in s:
if r'\:' in s:

beginning = '\\A'
beginning = r'\A'

s = s.replace( '\:', '(\:|.*\\s)', 1 )
s = s.replace( r'\:', r'(\:|.*\s)', 1 )

elif s.startswith( '.*' ):

beginning = '(\\A|\:)'
beginning = r'(\A|\:)'

else:

beginning = '(\\A|\:|\\s)'
beginning = r'(\A|\:|\s)'

if s.endswith( '.*' ):

end = '\\Z' # end of string
end = r'\Z' # end of string

else:

end = '(\\s|\\Z)' # whitespace or end of string
end = r'(\s|\Z)' # whitespace or end of string

return re.compile( beginning + s + end )

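This hunk, and several like it below, converts escaped string literals to raw strings. A short sketch of why the two forms are equivalent and why the conversion matters: a raw string avoids double escaping, and a non-raw `'\s'` only works because Python leaves unrecognised escape sequences alone, which raises a DeprecationWarning in modern Pythons.

```python
import re

# Raw and doubly-escaped literals produce the identical two-character string.
assert '\\A' == r'\A'

# The anchor pattern built by FilterTagsBySearchText-style code: match a
# term at the start of the string or after whitespace/namespace colon.
pattern = re.compile( r'(\A|\s)dog(\s|\Z)' )

assert pattern.search( 'dog runs' ) is not None   # matched at string start
assert pattern.search( 'hotdog' ) is None          # not preceded by \A or \s
```

Writing the patterns as `r'...'` keeps them warning-free without changing their meaning, which is all these hunks do.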
@@ -421,11 +421,18 @@ class FileSystemPredicates( object ):

self._common_info[ 'known_url_rules' ].append( ( operator, rule_type, rule ) )

if predicate_type == HC.PREDICATE_TYPE_SYSTEM_HAS_AUDIO:

has_audio = value

self._common_info[ 'has_audio' ] = has_audio

if predicate_type == HC.PREDICATE_TYPE_SYSTEM_HASH:

( hash, hash_type ) = value
( hashes, hash_type ) = value

self._common_info[ 'hash' ] = ( hash, hash_type )
self._common_info[ 'hash' ] = ( hashes, hash_type )

if predicate_type == HC.PREDICATE_TYPE_SYSTEM_AGE:

@@ -683,9 +690,9 @@ class FileSystemPredicates( object ):

if predicate_type == HC.PREDICATE_TYPE_SYSTEM_SIMILAR_TO:

( hash, max_hamming ) = value
( hashes, max_hamming ) = value

self._similar_to = ( hash, max_hamming )
self._similar_to = ( hashes, max_hamming )

if predicate_type == HC.PREDICATE_TYPE_SYSTEM_FILE_RELATIONSHIPS_COUNT:

@@ -731,11 +738,6 @@ class FileSystemPredicates( object ):

return self._king_filter

def GetSimpleInfo( self ):

return self._common_info

def GetLimit( self, apply_implicit_limit = True ):

if self._limit is None and apply_implicit_limit:

@@ -748,11 +750,25 @@ class FileSystemPredicates( object ):

return self._limit

def GetRatingsPredicates( self ): return self._ratings_predicates
def GetSimpleInfo( self ):

return self._common_info

def GetSimilarTo( self ): return self._similar_to
def GetRatingsPredicates( self ):

return self._ratings_predicates

def HasSimilarTo( self ): return self._similar_to is not None
def GetSimilarTo( self ):

return self._similar_to

def HasSimilarTo( self ):

return self._similar_to is not None

def HasSystemEverything( self ):

@@ -771,7 +787,7 @@ class Predicate( HydrusSerialisable.SerialisableBase ):

SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_PREDICATE
SERIALISABLE_NAME = 'File Search Predicate'
SERIALISABLE_VERSION = 2
SERIALISABLE_VERSION = 3

def __init__( self, predicate_type = None, value = None, inclusive = True, min_current_count = 0, min_pending_count = 0, max_current_count = None, max_pending_count = None ):

@@ -821,9 +837,9 @@ class Predicate( HydrusSerialisable.SerialisableBase ):

elif self._predicate_type == HC.PREDICATE_TYPE_SYSTEM_SIMILAR_TO:

( hash, max_hamming ) = self._value
( hashes, max_hamming ) = self._value

serialisable_value = ( hash.hex(), max_hamming )
serialisable_value = ( [ hash.hex() for hash in hashes ], max_hamming )

elif self._predicate_type == HC.PREDICATE_TYPE_SYSTEM_KNOWN_URLS:

@@ -842,9 +858,9 @@ class Predicate( HydrusSerialisable.SerialisableBase ):

elif self._predicate_type == HC.PREDICATE_TYPE_SYSTEM_HASH:

( hash, hash_type ) = self._value
( hashes, hash_type ) = self._value

serialisable_value = ( hash.hex(), hash_type )
serialisable_value = ( [ hash.hex() for hash in hashes ], hash_type )

elif self._predicate_type == HC.PREDICATE_TYPE_OR_CONTAINER:

@@ -872,9 +888,9 @@ class Predicate( HydrusSerialisable.SerialisableBase ):

elif self._predicate_type == HC.PREDICATE_TYPE_SYSTEM_SIMILAR_TO:

( serialisable_hash, max_hamming ) = serialisable_value
( serialisable_hashes, max_hamming ) = serialisable_value

self._value = ( bytes.fromhex( serialisable_hash ), max_hamming )
self._value = ( tuple( [ bytes.fromhex( serialisable_hash ) for serialisable_hash in serialisable_hashes ] ), max_hamming )

elif self._predicate_type == HC.PREDICATE_TYPE_SYSTEM_KNOWN_URLS:

@@ -893,9 +909,9 @@ class Predicate( HydrusSerialisable.SerialisableBase ):

elif self._predicate_type == HC.PREDICATE_TYPE_SYSTEM_HASH:

( serialisable_hash, hash_type ) = serialisable_value
( serialisable_hashes, hash_type ) = serialisable_value

self._value = ( bytes.fromhex( serialisable_hash ), hash_type )
self._value = ( tuple( [ bytes.fromhex( serialisable_hash ) for serialisable_hash in serialisable_hashes ] ), hash_type )

elif self._predicate_type == HC.PREDICATE_TYPE_SYSTEM_AGE:

@@ -944,6 +960,26 @@ class Predicate( HydrusSerialisable.SerialisableBase ):

return ( 2, new_serialisable_info )

if version == 2:

( predicate_type, serialisable_value, inclusive ) = old_serialisable_info

if predicate_type in ( HC.PREDICATE_TYPE_SYSTEM_HASH, HC.PREDICATE_TYPE_SYSTEM_SIMILAR_TO ):

# other value is either hash type or max hamming distance

( serialisable_hash, other_value ) = serialisable_value

serialisable_hashes = ( serialisable_hash, )

serialisable_value = ( serialisable_hashes, other_value )

new_serialisable_info = ( predicate_type, serialisable_value, inclusive )

return ( 3, new_serialisable_info )

def AddCounts( self, predicate ):

@@ -1290,15 +1326,36 @@ class Predicate( HydrusSerialisable.SerialisableBase ):

base = description

elif self._predicate_type == HC.PREDICATE_TYPE_SYSTEM_HAS_AUDIO:

base = 'has audio'

if self._value is not None:

has_audio = self._value

if not has_audio:

base = 'does not have audio'

elif self._predicate_type == HC.PREDICATE_TYPE_SYSTEM_HASH:

base = 'hash'

if self._value is not None:

( hash, hash_type ) = self._value
( hashes, hash_type ) = self._value

base = hash_type + ' hash is ' + hash.hex()
if len( hashes ) == 1:

base = '{} hash is {}'.format( hash_type, hashes[0].hex() )

else:

base = '{} hash is in {} hashes'.format( hash_type, HydrusData.ToHumanInt( len( hashes ) ) )

elif self._predicate_type == HC.PREDICATE_TYPE_SYSTEM_MIME:

@@ -1406,9 +1463,9 @@ class Predicate( HydrusSerialisable.SerialisableBase ):

if self._value is not None:

( hash, max_hamming ) = self._value
( hashes, max_hamming ) = self._value

base += ' ' + hash.hex() + ' using max hamming of ' + str( max_hamming )
base += ' {} files using max hamming of {}'.format( HydrusData.ToHumanInt( len( hashes ) ), max_hamming )

elif self._predicate_type == HC.PREDICATE_TYPE_SYSTEM_FILE_SERVICE:

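The `SERIALISABLE_VERSION` bump above pairs with a version-2-to-3 updater that wraps the old single serialised hash into a tuple, so old saved predicates keep loading. A standalone sketch of that upgrade step, with placeholder predicate-type constants (the real values live in HydrusConstants):

```python
# Placeholder constants: illustrative only, not hydrus's real values.
PREDICATE_TYPE_SYSTEM_HASH = 'system:hash'
PREDICATE_TYPE_SYSTEM_SIMILAR_TO = 'system:similar to'

def update_v2_to_v3( old_serialisable_info ):
    """Wrap a single stored hash into a one-element tuple of hashes, so the
    multi-hash predicate format introduced in v3 can load v2 data."""
    ( predicate_type, serialisable_value, inclusive ) = old_serialisable_info

    if predicate_type in ( PREDICATE_TYPE_SYSTEM_HASH, PREDICATE_TYPE_SYSTEM_SIMILAR_TO ):
        # the second element is the hash type or the max hamming distance
        ( serialisable_hash, other_value ) = serialisable_value
        serialisable_value = ( ( serialisable_hash, ), other_value )

    return ( 3, ( predicate_type, serialisable_value, inclusive ) )
```

Returning the new version number with the migrated payload lets a serialisation framework chain updaters until the object reaches the current version.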
@@ -1,31 +1,115 @@

from . import HydrusConstants as HC
from . import HydrusData
from . import HydrusExceptions
from . import HydrusVideoHandling
import os
import re
import subprocess
import threading
import time
import traceback

#if HC.PLATFORM_WINDOWS: import mp3play

'''
def PlayNoise( name ):

if HC.PLATFORM_OSX: return

if name not in parsed_noises:

if name == 'success': filename = 'success.mp3'
elif name == 'error': filename = 'error.mp3'

path = os.path.join( HC.STATIC_DIR, filename )

noise = mp3play.load( path )

parsed_noises[ name ] = noise

noise = parsed_noises[ name ]

noise.play()
'''

# There used to be hsaudiotag duration stuff here, but I moved it all to FFMPEG

def ParseFFMPEGAudio( lines ):

# the ^\sStream is to exclude the 'title' line, when it exists, includes the string 'Audio: ', ha ha
lines_audio = [ l for l in lines if re.search( r'^\s*Stream', l ) is not None and 'Audio: ' in l ]

audio_found = lines_audio != []
audio_format = None

if audio_found:

line = lines_audio[0]

try:

match = re.search(" [0-9]* Hz", line)

audio_fps = int(line[match.start()+1:match.end()])

except:

audio_fps = 'unknown'

try:

match = re.search( r'(?<=Audio\:\s).+?(?=,)', line )

audio_format = match.group()

except:

audio_format = 'unknown'

return ( audio_found, audio_format )

def VideoHasAudio( path ):

info_lines = HydrusVideoHandling.GetFFMPEGInfoLines( path )

( audio_found, audio_format ) = ParseFFMPEGAudio( info_lines )

if not audio_found:

return False

# just because video metadata has an audio stream doesn't mean it has audio. some vids have silent audio streams lmao
# so, let's read it as PCM and see if there is any noise
# this obviously only works for single audio stream vids, we'll adapt this if someone discovers a multi-stream mkv with a silent channel that doesn't work here

cmd = [ HydrusVideoHandling.FFMPEG_PATH ]

# this is perhaps not sensible for eventual playback and I should rather go for wav file-like and feed into python 'wave' in order to maintain stereo/mono and so on and have easy chunk-reading

cmd.extend( [ '-i', path,
'-loglevel', 'quiet',
'-f', 's16le',
'-' ] )

sbp_kwargs = HydrusData.GetSubprocessKWArgs()

try:

process = subprocess.Popen( cmd, bufsize = 65536, stdout=subprocess.PIPE, stderr=subprocess.PIPE, **sbp_kwargs )

except FileNotFoundError as e:

HydrusData.ShowText( 'Cannot render audio--FFMPEG not found!' )

raise

# silent PCM data is just 00 bytes

try:

chunk_of_pcm_data = process.stdout.read( 65536 )

while len( chunk_of_pcm_data ) > 0:

for b in chunk_of_pcm_data:

if b != b'\x00':

return True

chunk_of_pcm_data = process.stdout.read( 65536 )

return False

finally:

process.terminate()

process.stdout.close()
process.stderr.close()

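The silence check in `VideoHasAudio` asks ffmpeg for raw signed 16-bit PCM (`-f s16le`) and scans it for any nonzero byte, since digital silence decodes to all-zero samples. A standalone sketch of the core check (no ffmpeg needed): note that iterating a Python 3 `bytes` object yields ints, so the comparison must be against the int `0`; the diff's `b != b'\x00'` compares an int to a bytes object and is therefore always true, a subtlety worth a regression test.

```python
def pcm_is_silent( pcm_bytes ):
    """Return True if a chunk of s16le PCM decodes to digital silence,
    i.e. contains only zero bytes. Iterating bytes yields ints, so we
    compare each byte against the int 0."""
    return all( b == 0 for b in pcm_bytes )

# digital silence: every sample byte is 0x00
assert pcm_is_silent( b'\x00' * 1024 )

# any nonzero byte means the stream carries actual signal
assert not pcm_is_silent( b'\x00' * 100 + b'\x01' + b'\x00' * 100 )

# the pitfall: an int compared to b'\x00' is never equal, so this naive
# version would report even pure silence as having audio
naive_has_audio = any( b != b'\x00' for b in b'\x00' * 16 )
assert naive_has_audio  # always True, despite the data being silent
```

Chunked reading, as the real routine does, keeps memory bounded and lets the scan return early on the first audible sample.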
@@ -67,8 +67,8 @@ options = {}

# Misc

NETWORK_VERSION = 18
SOFTWARE_VERSION = 362
CLIENT_API_VERSION = 9
SOFTWARE_VERSION = 363
CLIENT_API_VERSION = 10

SERVER_THUMBNAIL_DIMENSIONS = ( 200, 200 )

@@ -490,7 +490,8 @@ NATIVE_VIDEO = ( IMAGE_APNG, VIDEO_AVI, VIDEO_FLV, VIDEO_MOV, VIDEO_MP4, VIDEO_W

APPLICATIONS = ( APPLICATION_FLASH, APPLICATION_PSD, APPLICATION_PDF, APPLICATION_ZIP, APPLICATION_RAR, APPLICATION_7Z )

NOISY_MIMES = tuple( [ APPLICATION_FLASH ] + list( AUDIO ) + list( VIDEO ) )
MIMES_THAT_DEFINITELY_HAVE_AUDIO = tuple( [ APPLICATION_FLASH ] + list( AUDIO ) )
MIMES_THAT_MAY_HAVE_AUDIO = tuple( list( MIMES_THAT_DEFINITELY_HAVE_AUDIO ) + list( VIDEO ) )

ARCHIVES = ( APPLICATION_ZIP, APPLICATION_HYDRUS_ENCRYPTED_ZIP, APPLICATION_RAR, APPLICATION_7Z )

@@ -668,8 +669,9 @@ PREDICATE_TYPE_OR_CONTAINER = 30

PREDICATE_TYPE_LABEL = 31
PREDICATE_TYPE_SYSTEM_FILE_RELATIONSHIPS_KING = 32
PREDICATE_TYPE_SYSTEM_FILE_RELATIONSHIPS = 33
PREDICATE_TYPE_SYSTEM_HAS_AUDIO = 34

SYSTEM_PREDICATES = [ PREDICATE_TYPE_SYSTEM_EVERYTHING, PREDICATE_TYPE_SYSTEM_INBOX, PREDICATE_TYPE_SYSTEM_ARCHIVE, PREDICATE_TYPE_SYSTEM_UNTAGGED, PREDICATE_TYPE_SYSTEM_NUM_TAGS, PREDICATE_TYPE_SYSTEM_LIMIT, PREDICATE_TYPE_SYSTEM_SIZE, PREDICATE_TYPE_SYSTEM_AGE, PREDICATE_TYPE_SYSTEM_HASH, PREDICATE_TYPE_SYSTEM_WIDTH, PREDICATE_TYPE_SYSTEM_HEIGHT, PREDICATE_TYPE_SYSTEM_RATIO, PREDICATE_TYPE_SYSTEM_DURATION, PREDICATE_TYPE_SYSTEM_MIME, PREDICATE_TYPE_SYSTEM_RATING, PREDICATE_TYPE_SYSTEM_SIMILAR_TO, PREDICATE_TYPE_SYSTEM_LOCAL, PREDICATE_TYPE_SYSTEM_NOT_LOCAL, PREDICATE_TYPE_SYSTEM_NUM_WORDS, PREDICATE_TYPE_SYSTEM_FILE_SERVICE, PREDICATE_TYPE_SYSTEM_NUM_PIXELS, PREDICATE_TYPE_SYSTEM_DIMENSIONS, PREDICATE_TYPE_SYSTEM_TAG_AS_NUMBER, PREDICATE_TYPE_SYSTEM_FILE_RELATIONSHIPS, PREDICATE_TYPE_SYSTEM_FILE_RELATIONSHIPS_COUNT, PREDICATE_TYPE_SYSTEM_FILE_RELATIONSHIPS_KING, PREDICATE_TYPE_SYSTEM_KNOWN_URLS, PREDICATE_TYPE_SYSTEM_FILE_VIEWING_STATS ]
SYSTEM_PREDICATES = [ PREDICATE_TYPE_SYSTEM_EVERYTHING, PREDICATE_TYPE_SYSTEM_INBOX, PREDICATE_TYPE_SYSTEM_ARCHIVE, PREDICATE_TYPE_SYSTEM_UNTAGGED, PREDICATE_TYPE_SYSTEM_NUM_TAGS, PREDICATE_TYPE_SYSTEM_LIMIT, PREDICATE_TYPE_SYSTEM_SIZE, PREDICATE_TYPE_SYSTEM_AGE, PREDICATE_TYPE_SYSTEM_HASH, PREDICATE_TYPE_SYSTEM_WIDTH, PREDICATE_TYPE_SYSTEM_HEIGHT, PREDICATE_TYPE_SYSTEM_RATIO, PREDICATE_TYPE_SYSTEM_DURATION, PREDICATE_TYPE_SYSTEM_HAS_AUDIO, PREDICATE_TYPE_SYSTEM_MIME, PREDICATE_TYPE_SYSTEM_RATING, PREDICATE_TYPE_SYSTEM_SIMILAR_TO, PREDICATE_TYPE_SYSTEM_LOCAL, PREDICATE_TYPE_SYSTEM_NOT_LOCAL, PREDICATE_TYPE_SYSTEM_NUM_WORDS, PREDICATE_TYPE_SYSTEM_FILE_SERVICE, PREDICATE_TYPE_SYSTEM_NUM_PIXELS, PREDICATE_TYPE_SYSTEM_DIMENSIONS, PREDICATE_TYPE_SYSTEM_TAG_AS_NUMBER, PREDICATE_TYPE_SYSTEM_FILE_RELATIONSHIPS, PREDICATE_TYPE_SYSTEM_FILE_RELATIONSHIPS_COUNT, PREDICATE_TYPE_SYSTEM_FILE_RELATIONSHIPS_KING, PREDICATE_TYPE_SYSTEM_KNOWN_URLS, PREDICATE_TYPE_SYSTEM_FILE_VIEWING_STATS ]

SITE_TYPE_DEVIANT_ART = 0
SITE_TYPE_GIPHY = 1

@@ -6,7 +6,7 @@ import traceback

def GetNumWordsFromString( s ):

s = re.sub( '[\s]+', ' ', s ) # turns multiple spaces into single spaces
s = re.sub( r'[\s]+', ' ', s ) # turns multiple spaces into single spaces

num_words = len( s.split( ' ' ) )

@@ -201,6 +201,19 @@ def GetFileInfo( path, mime = None, ok_to_look_for_hydrus_updates = False ):

duration = int( file_duration_in_s * 1000 )

if mime in HC.MIMES_THAT_DEFINITELY_HAVE_AUDIO:

has_audio = True

elif mime in HC.MIMES_THAT_MAY_HAVE_AUDIO:

has_audio = HydrusAudioHandling.VideoHasAudio( path )

else:

has_audio = False

if width is not None and width < 0:

width *= -1

@@ -226,7 +239,7 @@ def GetFileInfo( path, mime = None, ok_to_look_for_hydrus_updates = False ):

num_words *= -1

return ( size, mime, width, height, duration, num_frames, num_words )
return ( size, mime, width, height, duration, num_frames, has_audio, num_words )

def GetHashFromPath( path ):

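The `GetFileInfo` hunk above makes a three-way decision, matching the changelog: audio files and flash always count as having audio, videos get the expensive ffmpeg/PCM probe, everything else is audio-free. A sketch of that branching with placeholder mime strings and the probe passed in as a callable, so the decision logic can be exercised without ffmpeg:

```python
# Placeholder mime sets: illustrative stand-ins for hydrus's
# MIMES_THAT_DEFINITELY_HAVE_AUDIO / MIMES_THAT_MAY_HAVE_AUDIO tuples.
DEFINITELY_HAVE_AUDIO = { 'audio/mp3', 'application/x-shockwave-flash' }
MAY_HAVE_AUDIO = DEFINITELY_HAVE_AUDIO | { 'video/mp4', 'video/webm' }

def has_audio( mime, probe_video ):
    if mime in DEFINITELY_HAVE_AUDIO:
        return True           # audio files and flash are always 'has audio'
    elif mime in MAY_HAVE_AUDIO:
        return probe_video()  # only videos pay for the actual PCM inspection
    else:
        return False          # images etc. never have audio

assert has_audio( 'audio/mp3', probe_video = lambda: False ) is True
assert has_audio( 'video/mp4', probe_video = lambda: True ) is True
assert has_audio( 'video/mp4', probe_video = lambda: False ) is False
assert has_audio( 'image/png', probe_video = lambda: True ) is False
```

Gating the probe on the mime set keeps the file-maintenance reparse cheap for the vast majority of files.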
@@ -226,7 +226,7 @@ def ParseFileArguments( path, decompression_bombs_ok = False ):

( size, mime, width, height, duration, num_frames, num_words ) = HydrusFileHandling.GetFileInfo( path, mime )
( size, mime, width, height, duration, num_frames, has_audio, num_words ) = HydrusFileHandling.GetFileInfo( path, mime )

except Exception as e:

@@ -244,6 +244,7 @@ def ParseFileArguments( path, decompression_bombs_ok = False ):

if height is not None: args[ 'height' ] = height
if duration is not None: args[ 'duration' ] = duration
if num_frames is not None: args[ 'num_frames' ] = num_frames
args[ 'has_audio' ] = has_audio
if num_words is not None: args[ 'num_words' ] = num_words

if mime in HC.MIMES_WITH_THUMBNAILS:

@@ -12,8 +12,8 @@ import json

import re

re_newlines = re.compile( '[\r\n]+' )
re_multiple_spaces = re.compile( '\\s+' )
re_leading_space_or_garbage = re.compile( '^(\\s|-|system:)+' )
re_multiple_spaces = re.compile( r'\s+' )
re_leading_space_or_garbage = re.compile( r'^(\s|-|system:)+' )
re_leading_single_colon = re.compile( '^:(?!:)' )
re_leading_byte_order_mark = re.compile( '^\ufeff' ) # unicode .txt files prepend with this, wew

@@ -1,3 +1,4 @@

from . import HydrusAudioHandling
from . import HydrusConstants as HC
from . import HydrusData
from . import HydrusExceptions

@@ -318,7 +319,7 @@ def GetMime( path ):

has_webm_video = True in ( webm_video_format in video_format for webm_video_format in webm_video_formats )

( has_audio, audio_format ) = ParseFFMPEGAudio( lines )
( has_audio, audio_format ) = HydrusAudioHandling.ParseFFMPEGAudio( lines )

if has_audio:

@@ -381,44 +382,6 @@ def HasVideoStream( path ):

return ParseFFMPEGHasVideo( lines )

def ParseFFMPEGAudio( lines ):

# this is from the old stuff--might be helpful later when we add audio

lines_audio = [l for l in lines if 'Audio: ' in l]

audio_found = lines_audio != []
audio_format = None

if audio_found:

line = lines_audio[0]

try:

match = re.search(" [0-9]* Hz", line)

audio_fps = int(line[match.start()+1:match.end()])

except:

audio_fps = 'unknown'

try:

match = re.search( '(?<=Audio\:\s).+?(?=,)', line )

audio_format = match.group()

except:

audio_format = 'unknown'

return ( audio_found, audio_format )

def ParseFFMPEGDuration( lines ):

# get duration (in seconds)

@@ -631,7 +594,7 @@ def ParseFFMPEGVideoFormat( lines ):

try:

match = re.search( '(?<=Video\:\s).+?(?=,)', line )
match = re.search( r'(?<=Video\:\s).+?(?=,)', line )

video_format = match.group()

@@ -646,7 +609,7 @@ def ParseFFMPEGVideoLine( lines ):

# get the output line that speaks about video
# the ^\sStream is to exclude the 'title' line, when it exists, includes the string 'Video: ', ha ha
lines_video = [ l for l in lines if re.search( '^\s*Stream', l ) is not None and 'Video: ' in l and not ( 'Video: png' in l or 'Video: jpg' in l ) ] # mp3 says it has a 'png' video stream
lines_video = [ l for l in lines if re.search( r'^\s*Stream', l ) is not None and 'Video: ' in l and not ( 'Video: png' in l or 'Video: jpg' in l ) ] # mp3 says it has a 'png' video stream

if len( lines_video ) == 0:

@@ -788,7 +751,8 @@ class VideoRendererFFMPEG( object ):

"-pix_fmt", self.pix_fmt,
"-s", str( w ) + 'x' + str( h ),
'-vsync', '0',
'-vcodec', 'rawvideo', '-' ] )
'-vcodec', 'rawvideo',
'-' ] )

sbp_kwargs = HydrusData.GetSubprocessKWArgs()

@@ -123,6 +123,7 @@ class TestClientAPI( unittest.TestCase ):

permissions_to_set_up.append( ( 'add_tags', [ ClientAPI.CLIENT_API_PERMISSION_ADD_TAGS ] ) )
permissions_to_set_up.append( ( 'add_urls', [ ClientAPI.CLIENT_API_PERMISSION_ADD_URLS ] ) )
permissions_to_set_up.append( ( 'manage_pages', [ ClientAPI.CLIENT_API_PERMISSION_MANAGE_PAGES ] ) )
permissions_to_set_up.append( ( 'manage_cookies', [ ClientAPI.CLIENT_API_PERMISSION_MANAGE_COOKIES ] ) )
permissions_to_set_up.append( ( 'search_all_files', [ ClientAPI.CLIENT_API_PERMISSION_SEARCH_FILES ] ) )
permissions_to_set_up.append( ( 'search_green_files', [ ClientAPI.CLIENT_API_PERMISSION_SEARCH_FILES ] ) )

@@ -1225,6 +1226,130 @@ class TestClientAPI( unittest.TestCase ):

self.assertEqual( result, expected_result )

def _test_manage_cookies( self, connection, set_up_permissions ):

api_permissions = set_up_permissions[ 'manage_cookies' ]

access_key_hex = api_permissions.GetAccessKey().hex()

headers = { 'Hydrus-Client-API-Access-Key' : access_key_hex }

#

path = '/manage_cookies/get_cookies?domain=somesite.com'

connection.request( 'GET', path, headers = headers )

response = connection.getresponse()

data = response.read()

text = str( data, 'utf-8' )

self.assertEqual( response.status, 200 )

d = json.loads( text )

cookies = d[ 'cookies' ]

self.assertEqual( cookies, [] )

#

headers = { 'Hydrus-Client-API-Access-Key' : access_key_hex, 'Content-Type' : HC.mime_string_lookup[ HC.APPLICATION_JSON ] }

path = '/manage_cookies/set_cookies'

cookies = []

cookies.append( [ 'one', '1', '.somesite.com', '/', HydrusData.GetNow() + 86400 ] )
cookies.append( [ 'two', '2', 'somesite.com', '/', HydrusData.GetNow() + 86400 ] )
cookies.append( [ 'three', '3', 'wew.somesite.com', '/', HydrusData.GetNow() + 86400 ] )
cookies.append( [ 'four', '4', '.somesite.com', '/', None ] )

request_dict = { 'cookies' : cookies }

request_body = json.dumps( request_dict )

connection.request( 'POST', path, body = request_body, headers = headers )

response = connection.getresponse()

data = response.read()

self.assertEqual( response.status, 200 )

path = '/manage_cookies/get_cookies?domain=somesite.com'

connection.request( 'GET', path, headers = headers )

response = connection.getresponse()

data = response.read()

text = str( data, 'utf-8' )

self.assertEqual( response.status, 200 )

d = json.loads( text )

result_cookies = d[ 'cookies' ]

frozen_result_cookies = { tuple( row ) for row in result_cookies }
frozen_expected_cookies = { tuple( row ) for row in cookies }

self.assertEqual( frozen_result_cookies, frozen_expected_cookies )

#

headers = { 'Hydrus-Client-API-Access-Key' : access_key_hex, 'Content-Type' : HC.mime_string_lookup[ HC.APPLICATION_JSON ] }

path = '/manage_cookies/set_cookies'

cookies = []

cookies.append( [ 'one', None, '.somesite.com', '/', None ] )

request_dict = { 'cookies' : cookies }

request_body = json.dumps( request_dict )

connection.request( 'POST', path, body = request_body, headers = headers )

response = connection.getresponse()

data = response.read()

self.assertEqual( response.status, 200 )

path = '/manage_cookies/get_cookies?domain=somesite.com'

connection.request( 'GET', path, headers = headers )

response = connection.getresponse()

data = response.read()

text = str( data, 'utf-8' )

self.assertEqual( response.status, 200 )

d = json.loads( text )

result_cookies = d[ 'cookies' ]

expected_cookies = []

expected_cookies.append( [ 'two', '2', 'somesite.com', '/', HydrusData.GetNow() + 86400 ] )
expected_cookies.append( [ 'three', '3', 'wew.somesite.com', '/', HydrusData.GetNow() + 86400 ] )
expected_cookies.append( [ 'four', '4', '.somesite.com', '/', None ] )

frozen_result_cookies = { tuple( row ) for row in result_cookies }
frozen_expected_cookies = { tuple( row ) for row in expected_cookies }

self.assertEqual( frozen_result_cookies, frozen_expected_cookies )

def _test_manage_pages( self, connection, set_up_permissions ):

api_permissions = set_up_permissions[ 'manage_pages' ]

@@ -1455,6 +1580,11 @@ class TestClientAPI( unittest.TestCase ):

media_results = []

urls = { "https://gelbooru.com/index.php?page=post&s=view&id=4841557", "https://img2.gelbooru.com//images/80/c8/80c8646b4a49395fb36c805f316c49a9.jpg" }

sorted_urls = list( urls )
sorted_urls.sort()

for ( file_id, hash ) in file_ids_to_hashes.items():

size = random.randint( 8192, 20 * 1048576 )

@@ -1462,14 +1592,15 @@ class TestClientAPI( unittest.TestCase ):

width = random.randint( 200, 4096 )
height = random.randint( 200, 4096 )
duration = random.choice( [ 220, 16.66667, None ] )
has_audio = random.choice( [ True, False ] )

file_info_manager = ClientMedia.FileInfoManager( file_id, hash, size = size, mime = mime, width = width, height = height, duration = duration )
file_info_manager = ClientMedia.FileInfoManager( file_id, hash, size = size, mime = mime, width = width, height = height, duration = duration, has_audio = has_audio )

service_keys_to_statuses_to_tags = { CC.LOCAL_TAG_SERVICE_KEY : { HC.CONTENT_STATUS_CURRENT : [ 'blue eyes', 'blonde hair' ], HC.CONTENT_STATUS_PENDING : [ 'bodysuit' ] } }

tags_manager = ClientMedia.TagsManager( service_keys_to_statuses_to_tags )

locations_manager = ClientMedia.LocationsManager( set(), set(), set(), set() )
locations_manager = ClientMedia.LocationsManager( set(), set(), set(), set(), urls = urls )
ratings_manager = ClientRatings.RatingsManager( {} )
file_viewing_stats_manager = ClientMedia.FileViewingStatsManager( 0, 0, 0, 0 )

@@ -1497,9 +1628,12 @@ class TestClientAPI( unittest.TestCase ):

metadata_row[ 'width' ] = file_info_manager.width
metadata_row[ 'height' ] = file_info_manager.height
metadata_row[ 'duration' ] = file_info_manager.duration
metadata_row[ 'has_audio' ] = file_info_manager.has_audio
metadata_row[ 'num_frames' ] = file_info_manager.num_frames
metadata_row[ 'num_words' ] = file_info_manager.num_words

metadata_row[ 'known_urls' ] = list( sorted_urls )

tags_manager = media_result.GetTagsManager()

service_names_to_statuses_to_tags = {}

@@ -1853,6 +1987,7 @@ class TestClientAPI( unittest.TestCase ):

self._test_add_files( connection, set_up_permissions )
self._test_add_tags( connection, set_up_permissions )
self._test_add_urls( connection, set_up_permissions )
self._test_manage_cookies( connection, set_up_permissions )
self._test_manage_pages( connection, set_up_permissions )
self._test_search_files( connection, set_up_permissions )
self._test_permission_failures( connection, set_up_permissions )

@@ -322,8 +322,11 @@ class TestClientDB( unittest.TestCase ):
 tests.append( ( HC.PREDICATE_TYPE_SYSTEM_FILE_SERVICE, ( True, HC.CONTENT_STATUS_CURRENT, CC.LOCAL_FILE_SERVICE_KEY ), 1 ) )
 tests.append( ( HC.PREDICATE_TYPE_SYSTEM_FILE_SERVICE, ( True, HC.CONTENT_STATUS_PENDING, CC.LOCAL_FILE_SERVICE_KEY ), 0 ) )
 
-tests.append( ( HC.PREDICATE_TYPE_SYSTEM_HASH, ( hash, 'sha256' ), 1 ) )
-tests.append( ( HC.PREDICATE_TYPE_SYSTEM_HASH, ( bytes.fromhex( '0123456789abcdef' * 4 ), 'sha256' ), 0 ) )
+tests.append( ( HC.PREDICATE_TYPE_SYSTEM_HAS_AUDIO, True, 0 ) )
+tests.append( ( HC.PREDICATE_TYPE_SYSTEM_HAS_AUDIO, False, 1 ) )
+
+tests.append( ( HC.PREDICATE_TYPE_SYSTEM_HASH, ( ( hash, ), 'sha256' ), 1 ) )
+tests.append( ( HC.PREDICATE_TYPE_SYSTEM_HASH, ( ( bytes.fromhex( '0123456789abcdef' * 4 ), ), 'sha256' ), 0 ) )
 
 tests.append( ( HC.PREDICATE_TYPE_SYSTEM_HEIGHT, ( '<', 201 ), 1 ) )
 tests.append( ( HC.PREDICATE_TYPE_SYSTEM_HEIGHT, ( '<', 200 ), 0 ) )
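The hunk above pairs each predicate with an expected result count: the single imported test file has no audio, so `system:has audio` should match 0 files and its negation 1. A toy sketch of that `( predicate_type, value, expected_count )` pattern (hypothetical names, not the real hydrus test harness):

```python
# Hypothetical sketch of the ( predicate_type, value, expected_count )
# pattern used in these tests -- not the actual hydrus harness.

def run_predicate_tests( tests, search ):
    
    for ( predicate_type, value, expected_count ) in tests:
        
        result = search( predicate_type, value )
        
        assert len( result ) == expected_count, ( predicate_type, value )

# Toy 'database' holding the single imported test file, which has no audio.
FILES = [ { 'has_audio' : False } ]

def toy_search( predicate_type, value ):
    
    if predicate_type == 'has_audio':
        
        return [ f for f in FILES if f[ 'has_audio' ] == value ]
    
    return []

# 'has audio' matches 0 files, 'no audio' matches the 1 file.
run_predicate_tests( [ ( 'has_audio', True, 0 ), ( 'has_audio', False, 1 ) ], toy_search )
print( 'ok' )
```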
@@ -369,8 +372,8 @@ class TestClientDB( unittest.TestCase ):
 tests.append( ( HC.PREDICATE_TYPE_SYSTEM_RATIO, ( '\u2248', 200, 201 ), 1 ) )
 tests.append( ( HC.PREDICATE_TYPE_SYSTEM_RATIO, ( '\u2248', 4, 1 ), 0 ) )
 
-tests.append( ( HC.PREDICATE_TYPE_SYSTEM_SIMILAR_TO, ( hash, 5 ), 1 ) )
-tests.append( ( HC.PREDICATE_TYPE_SYSTEM_SIMILAR_TO, ( bytes.fromhex( '0123456789abcdef' * 4 ), 5 ), 0 ) )
+tests.append( ( HC.PREDICATE_TYPE_SYSTEM_SIMILAR_TO, ( ( hash, ), 5 ), 1 ) )
+tests.append( ( HC.PREDICATE_TYPE_SYSTEM_SIMILAR_TO, ( ( bytes.fromhex( '0123456789abcdef' * 4 ), ), 5 ), 0 ) )
 
 tests.append( ( HC.PREDICATE_TYPE_SYSTEM_SIZE, ( '<', 0, HydrusData.ConvertUnitToInt( 'B' ) ), 0 ) )
 tests.append( ( HC.PREDICATE_TYPE_SYSTEM_SIZE, ( '<', 5270, HydrusData.ConvertUnitToInt( 'B' ) ), 0 ) )
@@ -638,7 +641,7 @@ class TestClientDB( unittest.TestCase ):
 predicates.append( ClientSearch.Predicate( HC.PREDICATE_TYPE_SYSTEM_EVERYTHING, min_current_count = 1 ) )
 predicates.append( ClientSearch.Predicate( HC.PREDICATE_TYPE_SYSTEM_INBOX, min_current_count = 1 ) )
 predicates.append( ClientSearch.Predicate( HC.PREDICATE_TYPE_SYSTEM_ARCHIVE, min_current_count = 0 ) )
-predicates.extend( [ ClientSearch.Predicate( predicate_type ) for predicate_type in [ HC.PREDICATE_TYPE_SYSTEM_UNTAGGED, HC.PREDICATE_TYPE_SYSTEM_NUM_TAGS, HC.PREDICATE_TYPE_SYSTEM_LIMIT, HC.PREDICATE_TYPE_SYSTEM_SIZE, HC.PREDICATE_TYPE_SYSTEM_AGE, HC.PREDICATE_TYPE_SYSTEM_KNOWN_URLS, HC.PREDICATE_TYPE_SYSTEM_HASH, HC.PREDICATE_TYPE_SYSTEM_DIMENSIONS, HC.PREDICATE_TYPE_SYSTEM_DURATION, HC.PREDICATE_TYPE_SYSTEM_NUM_WORDS, HC.PREDICATE_TYPE_SYSTEM_MIME, HC.PREDICATE_TYPE_SYSTEM_SIMILAR_TO, HC.PREDICATE_TYPE_SYSTEM_FILE_SERVICE, HC.PREDICATE_TYPE_SYSTEM_TAG_AS_NUMBER, HC.PREDICATE_TYPE_SYSTEM_FILE_RELATIONSHIPS, HC.PREDICATE_TYPE_SYSTEM_FILE_VIEWING_STATS ] ] )
+predicates.extend( [ ClientSearch.Predicate( predicate_type ) for predicate_type in [ HC.PREDICATE_TYPE_SYSTEM_UNTAGGED, HC.PREDICATE_TYPE_SYSTEM_NUM_TAGS, HC.PREDICATE_TYPE_SYSTEM_LIMIT, HC.PREDICATE_TYPE_SYSTEM_SIZE, HC.PREDICATE_TYPE_SYSTEM_AGE, HC.PREDICATE_TYPE_SYSTEM_KNOWN_URLS, HC.PREDICATE_TYPE_SYSTEM_HAS_AUDIO, HC.PREDICATE_TYPE_SYSTEM_HASH, HC.PREDICATE_TYPE_SYSTEM_DIMENSIONS, HC.PREDICATE_TYPE_SYSTEM_DURATION, HC.PREDICATE_TYPE_SYSTEM_NUM_WORDS, HC.PREDICATE_TYPE_SYSTEM_MIME, HC.PREDICATE_TYPE_SYSTEM_SIMILAR_TO, HC.PREDICATE_TYPE_SYSTEM_FILE_SERVICE, HC.PREDICATE_TYPE_SYSTEM_TAG_AS_NUMBER, HC.PREDICATE_TYPE_SYSTEM_FILE_RELATIONSHIPS, HC.PREDICATE_TYPE_SYSTEM_FILE_VIEWING_STATS ] ] )
 
 self.assertEqual( set( result ), set( predicates ) )
 
@@ -784,16 +787,16 @@ class TestClientDB( unittest.TestCase ):
 
 test_files = []
 
-test_files.append( ( 'muh_swf.swf', 'edfef9905fdecde38e0752a5b6ab7b6df887c3968d4246adc9cffc997e168cdf', 456774, HC.APPLICATION_FLASH, 400, 400, { 33 }, { 1 }, None ) )
-test_files.append( ( 'muh_mp4.mp4', '2fa293907144a046d043d74e9570b1c792cbfd77ee3f5c93b2b1a1cb3e4c7383', 570534, HC.VIDEO_MP4, 480, 480, { 6266, 6290 }, { 151 }, None ) )
-test_files.append( ( 'muh_mpeg.mpeg', 'aebb10aaf3b27a5878fd2732ea28aaef7bbecef7449eaa759421c4ba4efff494', 772096, HC.VIDEO_MPEG, 720, 480, { 3500 }, { 105 }, None ) )
-test_files.append( ( 'muh_webm.webm', '55b6ce9d067326bf4b2fbe66b8f51f366bc6e5f776ba691b0351364383c43fcb', 84069, HC.VIDEO_WEBM, 640, 360, { 4010 }, { 120 }, None ) )
-test_files.append( ( 'muh_jpg.jpg', '5d884d84813beeebd59a35e474fa3e4742d0f2b6679faa7609b245ddbbd05444', 42296, HC.IMAGE_JPEG, 392, 498, { None }, { None }, None ) )
-test_files.append( ( 'muh_png.png', 'cdc67d3b377e6e1397ffa55edc5b50f6bdf4482c7a6102c6f27fa351429d6f49', 31452, HC.IMAGE_PNG, 191, 196, { None }, { None }, None ) )
-test_files.append( ( 'muh_apng.png', '9e7b8b5abc7cb11da32db05671ce926a2a2b701415d1b2cb77a28deea51010c3', 616956, HC.IMAGE_APNG, 500, 500, { 3133, 1880, 1125, 1800 }, { 27, 47 }, None ) )
-test_files.append( ( 'muh_gif.gif', '00dd9e9611ebc929bfc78fde99a0c92800bbb09b9d18e0946cea94c099b211c2', 15660, HC.IMAGE_GIF, 329, 302, { 600 }, { 5 }, None ) )
+test_files.append( ( 'muh_swf.swf', 'edfef9905fdecde38e0752a5b6ab7b6df887c3968d4246adc9cffc997e168cdf', 456774, HC.APPLICATION_FLASH, 400, 400, { 33 }, { 1 }, True, None ) )
+test_files.append( ( 'muh_mp4.mp4', '2fa293907144a046d043d74e9570b1c792cbfd77ee3f5c93b2b1a1cb3e4c7383', 570534, HC.VIDEO_MP4, 480, 480, { 6266, 6290 }, { 151 }, True, None ) )
+test_files.append( ( 'muh_mpeg.mpeg', 'aebb10aaf3b27a5878fd2732ea28aaef7bbecef7449eaa759421c4ba4efff494', 772096, HC.VIDEO_MPEG, 720, 480, { 3500 }, { 105 }, False, None ) )
+test_files.append( ( 'muh_webm.webm', '55b6ce9d067326bf4b2fbe66b8f51f366bc6e5f776ba691b0351364383c43fcb', 84069, HC.VIDEO_WEBM, 640, 360, { 4010 }, { 120 }, True, None ) )
+test_files.append( ( 'muh_jpg.jpg', '5d884d84813beeebd59a35e474fa3e4742d0f2b6679faa7609b245ddbbd05444', 42296, HC.IMAGE_JPEG, 392, 498, { None }, { None }, False, None ) )
+test_files.append( ( 'muh_png.png', 'cdc67d3b377e6e1397ffa55edc5b50f6bdf4482c7a6102c6f27fa351429d6f49', 31452, HC.IMAGE_PNG, 191, 196, { None }, { None }, False, None ) )
+test_files.append( ( 'muh_apng.png', '9e7b8b5abc7cb11da32db05671ce926a2a2b701415d1b2cb77a28deea51010c3', 616956, HC.IMAGE_APNG, 500, 500, { 3133, 1880, 1125, 1800 }, { 27, 47 }, False, None ) )
+test_files.append( ( 'muh_gif.gif', '00dd9e9611ebc929bfc78fde99a0c92800bbb09b9d18e0946cea94c099b211c2', 15660, HC.IMAGE_GIF, 329, 302, { 600 }, { 5 }, False, None ) )
 
-for ( filename, hex_hash, size, mime, width, height, durations, nums_frame, num_words ) in test_files:
+for ( filename, hex_hash, size, mime, width, height, durations, num_frames, has_audio, num_words ) in test_files:
 
 path = os.path.join( HC.STATIC_DIR, 'testing', filename )
 
@@ -830,7 +833,7 @@ class TestClientDB( unittest.TestCase ):
 
 ( mr_file_info_manager, mr_tags_manager, mr_locations_manager, mr_ratings_manager ) = media_result.ToTuple()
 
-( mr_hash_id, mr_hash, mr_size, mr_mime, mr_width, mr_height, mr_duration, mr_num_frames, mr_num_words ) = mr_file_info_manager.ToTuple()
+( mr_hash_id, mr_hash, mr_size, mr_mime, mr_width, mr_height, mr_duration, mr_num_frames, mr_has_audio, mr_num_words ) = mr_file_info_manager.ToTuple()
 
 mr_inbox = mr_locations_manager.GetInbox()
 
@@ -843,7 +846,8 @@ class TestClientDB( unittest.TestCase ):
 self.assertEqual( mr_width, width )
 self.assertEqual( mr_height, height )
 self.assertIn( mr_duration, durations )
-self.assertIn( mr_num_frames, nums_frame )
+self.assertIn( mr_num_frames, num_frames )
+self.assertEqual( mr_has_audio, has_audio )
 self.assertEqual( mr_num_words, num_words )
 
@@ -978,7 +982,7 @@ class TestClientDB( unittest.TestCase ):
 
 ( mr_file_info_manager, mr_tags_manager, mr_locations_manager, mr_ratings_manager ) = media_result.ToTuple()
 
-( mr_hash_id, mr_hash, mr_size, mr_mime, mr_width, mr_height, mr_duration, mr_num_frames, mr_num_words ) = mr_file_info_manager.ToTuple()
+( mr_hash_id, mr_hash, mr_size, mr_mime, mr_width, mr_height, mr_duration, mr_num_frames, mr_has_audio, mr_num_words ) = mr_file_info_manager.ToTuple()
 
 mr_inbox = mr_locations_manager.GetInbox()
 
@@ -993,13 +997,14 @@ class TestClientDB( unittest.TestCase ):
 self.assertEqual( mr_height, 200 )
 self.assertEqual( mr_duration, None )
 self.assertEqual( mr_num_frames, None )
+self.assertEqual( mr_has_audio, False )
 self.assertEqual( mr_num_words, None )
 
 ( media_result, ) = self._read( 'media_results_from_ids', ( 1, ) )
 
 ( mr_file_info_manager, mr_tags_manager, mr_locations_manager, mr_ratings_manager ) = media_result.ToTuple()
 
-( mr_hash_id, mr_hash, mr_size, mr_mime, mr_width, mr_height, mr_duration, mr_num_frames, mr_num_words ) = mr_file_info_manager.ToTuple()
+( mr_hash_id, mr_hash, mr_size, mr_mime, mr_width, mr_height, mr_duration, mr_num_frames, mr_has_audio, mr_num_words ) = mr_file_info_manager.ToTuple()
 
 mr_inbox = mr_locations_manager.GetInbox()
@@ -1014,6 +1019,7 @@ class TestClientDB( unittest.TestCase ):
 self.assertEqual( mr_height, 200 )
 self.assertEqual( mr_duration, None )
 self.assertEqual( mr_num_frames, None )
+self.assertEqual( mr_has_audio, False )
 self.assertEqual( mr_num_words, None )
 
@@ -114,14 +114,14 @@ class TestClientDBDuplicates( unittest.TestCase ):
 
 # fake-import the files with the phash
 
-( size, mime, width, height, duration, num_frames, num_words ) = ( 65535, HC.IMAGE_JPEG, 640, 480, None, None, None )
+( size, mime, width, height, duration, num_frames, has_audio, num_words ) = ( 65535, HC.IMAGE_JPEG, 640, 480, None, None, False, None )
 
 for hash in self._all_hashes:
 
 fake_file_import_job = ClientImportFileSeeds.FileImportJob( 'fake path' )
 
 fake_file_import_job._hash = hash
-fake_file_import_job._file_info = ( size, mime, width, height, duration, num_frames, num_words )
+fake_file_import_job._file_info = ( size, mime, width, height, duration, num_frames, has_audio, num_words )
 fake_file_import_job._extra_hashes = ( b'abcd', b'abcd', b'abcd' )
 fake_file_import_job._phashes = [ phash ]
 fake_file_import_job._file_import_options = ClientImportOptions.FileImportOptions()
@@ -170,6 +170,7 @@ class TestSerialisables( unittest.TestCase ):
 height = 480
 duration = None
 num_frames = None
+has_audio = False
 num_words = None
 
 local_locations_manager = ClientMedia.LocationsManager( { CC.LOCAL_FILE_SERVICE_KEY, CC.COMBINED_LOCAL_FILE_SERVICE_KEY }, set(), set(), set(), inbox )
@@ -194,7 +195,7 @@ class TestSerialisables( unittest.TestCase ):
 
 local_hash_has_values = HydrusData.GenerateKey()
 
-file_info_manager = ClientMedia.FileInfoManager( 1, local_hash_has_values, size, mime, width, height, duration, num_frames, num_words )
+file_info_manager = ClientMedia.FileInfoManager( 1, local_hash_has_values, size, mime, width, height, duration, num_frames, has_audio, num_words )
 
 media_result = ClientMedia.MediaResult( file_info_manager, substantial_tags_manager, local_locations_manager, substantial_ratings_manager, file_viewing_stats_manager )
 
@@ -204,7 +205,7 @@ class TestSerialisables( unittest.TestCase ):
 
 other_local_hash_has_values = HydrusData.GenerateKey()
 
-file_info_manager = ClientMedia.FileInfoManager( 2, other_local_hash_has_values, size, mime, width, height, duration, num_frames, num_words )
+file_info_manager = ClientMedia.FileInfoManager( 2, other_local_hash_has_values, size, mime, width, height, duration, num_frames, has_audio, num_words )
 
 media_result = ClientMedia.MediaResult( file_info_manager, substantial_tags_manager, local_locations_manager, substantial_ratings_manager, file_viewing_stats_manager )
 
@@ -214,7 +215,7 @@ class TestSerialisables( unittest.TestCase ):
 
 local_hash_empty = HydrusData.GenerateKey()
 
-file_info_manager = ClientMedia.FileInfoManager( 3, local_hash_empty, size, mime, width, height, duration, num_frames, num_words )
+file_info_manager = ClientMedia.FileInfoManager( 3, local_hash_empty, size, mime, width, height, duration, num_frames, has_audio, num_words )
 
 media_result = ClientMedia.MediaResult( file_info_manager, empty_tags_manager, local_locations_manager, empty_ratings_manager, file_viewing_stats_manager )
 
@@ -224,7 +225,7 @@ class TestSerialisables( unittest.TestCase ):
 
 trashed_hash_empty = HydrusData.GenerateKey()
 
-file_info_manager = ClientMedia.FileInfoManager( 4, trashed_hash_empty, size, mime, width, height, duration, num_frames, num_words )
+file_info_manager = ClientMedia.FileInfoManager( 4, trashed_hash_empty, size, mime, width, height, duration, num_frames, has_audio, num_words )
 
 media_result = ClientMedia.MediaResult( file_info_manager, empty_tags_manager, trash_locations_manager, empty_ratings_manager, file_viewing_stats_manager )
 
@@ -234,7 +235,7 @@ class TestSerialisables( unittest.TestCase ):
 
 deleted_hash_empty = HydrusData.GenerateKey()
 
-file_info_manager = ClientMedia.FileInfoManager( 5, deleted_hash_empty, size, mime, width, height, duration, num_frames, num_words )
+file_info_manager = ClientMedia.FileInfoManager( 5, deleted_hash_empty, size, mime, width, height, duration, num_frames, has_audio, num_words )
 
 media_result = ClientMedia.MediaResult( file_info_manager, empty_tags_manager, deleted_locations_manager, empty_ratings_manager, file_viewing_stats_manager )
 
@@ -244,7 +245,7 @@ class TestSerialisables( unittest.TestCase ):
 
 one_hash = HydrusData.GenerateKey()
 
-file_info_manager = ClientMedia.FileInfoManager( 6, one_hash, size, mime, width, height, duration, num_frames, num_words )
+file_info_manager = ClientMedia.FileInfoManager( 6, one_hash, size, mime, width, height, duration, num_frames, has_audio, num_words )
 
 media_result = ClientMedia.MediaResult( file_info_manager, one_tags_manager, local_locations_manager, one_ratings_manager, file_viewing_stats_manager )
 
@@ -254,7 +255,7 @@ class TestSerialisables( unittest.TestCase ):
 
 two_hash = HydrusData.GenerateKey()
 
-file_info_manager = ClientMedia.FileInfoManager( 7, two_hash, size, mime, width, height, duration, num_frames, num_words )
+file_info_manager = ClientMedia.FileInfoManager( 7, two_hash, size, mime, width, height, duration, num_frames, has_audio, num_words )
 
 media_result = ClientMedia.MediaResult( file_info_manager, two_tags_manager, local_locations_manager, two_ratings_manager, file_viewing_stats_manager )
 
@@ -460,7 +460,13 @@ class TestTagObjects( unittest.TestCase ):
 self.assertEqual( p.GetNamespace(), 'system' )
 self.assertEqual( p.GetTextsAndNamespaces(), [ ( p.ToString(), p.GetNamespace() ) ] )
 
-p = ClientSearch.Predicate( HC.PREDICATE_TYPE_SYSTEM_HASH, ( bytes.fromhex( 'abcd' ), 'sha256' ) )
+p = ClientSearch.Predicate( HC.PREDICATE_TYPE_SYSTEM_HAS_AUDIO, True )
+
+self.assertEqual( p.ToString(), 'system:has audio' )
+self.assertEqual( p.GetNamespace(), 'system' )
+self.assertEqual( p.GetTextsAndNamespaces(), [ ( p.ToString(), p.GetNamespace() ) ] )
+
+p = ClientSearch.Predicate( HC.PREDICATE_TYPE_SYSTEM_HASH, ( ( bytes.fromhex( 'abcd' ), ), 'sha256' ) )
 
 self.assertEqual( p.ToString(), 'system:sha256 hash is abcd' )
 self.assertEqual( p.GetNamespace(), 'system' )
 
@@ -538,9 +544,9 @@ class TestTagObjects( unittest.TestCase ):
 self.assertEqual( p.GetNamespace(), 'system' )
 self.assertEqual( p.GetTextsAndNamespaces(), [ ( p.ToString(), p.GetNamespace() ) ] )
 
-p = ClientSearch.Predicate( HC.PREDICATE_TYPE_SYSTEM_SIMILAR_TO, ( bytes.fromhex( 'abcd' ), 5 ) )
+p = ClientSearch.Predicate( HC.PREDICATE_TYPE_SYSTEM_SIMILAR_TO, ( ( bytes.fromhex( 'abcd' ), ), 5 ) )
 
-self.assertEqual( p.ToString(), 'system:similar to abcd using max hamming of 5' )
+self.assertEqual( p.ToString(), 'system:similar to 1 files using max hamming of 5' )
 self.assertEqual( p.GetNamespace(), 'system' )
 self.assertEqual( p.GetTextsAndNamespaces(), [ ( p.ToString(), p.GetNamespace() ) ] )
 
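The `has_audio` values these tests assert come from the new detection routine the version 363 changelog describes: it reads actual decoded audio data, so a 'fake' silent audio track does not count as audio. A minimal illustration of that idea (a hypothetical sketch, not the actual hydrus routine, which works on real decoded streams):

```python
def samples_have_audio( samples, silence_threshold = 0.005 ):
    
    # Treat normalised PCM samples as real audio only if at least one
    # sample rises above the silence threshold.
    
    return any( abs( sample ) > silence_threshold for sample in samples )

print( samples_have_audio( [ 0.0, 0.0001, -0.0002 ] ) ) # a 'fake' silent track -> False
print( samples_have_audio( [ 0.0, 0.2, -0.5 ] ) ) # real signal -> True
```

In this toy form, a stream whose samples never leave the silence band is classed as having no audio, which matches the changelog's intent for videos carrying a dummy silent track.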
Binary file not shown (before: 3.3 KiB, after: 3.3 KiB).