Version 518
This commit is contained in: parent 0c0a80433c, commit ec4a161523
@ -1,4 +0,0 @@
If your hydrus crashes as soon as you load a video in mpv, and your audio driver is ASIO or WASAPI, please add these lines to your mpv.conf:

ao=wasapi
audio-fallback-to-null=yes
@ -0,0 +1,8 @@
If your audio driver is ASIO or WASAPI and hydrus crashes as soon as you load an audio/video in mpv, please add these lines to your mpv.conf:

ao=wasapi
audio-fallback-to-null=yes

If you have no audio devices on your computer and hydrus crashes as soon as you load audio/video in mpv, just add the fallback line:

audio-fallback-to-null=yes
@ -7,6 +7,45 @@ title: Changelog

!!! note
    This is the new changelog, only the most recent builds. For all versions, see the [old changelog](old_changelog.html).

## [Version 518](https://github.com/hydrusnetwork/hydrus/releases/tag/v518)

### autocomplete improvements

* tl;dr: I went through the whole tag autocomplete search pipeline, cleaned out the cruft, and made the pre-fetch results more sensible. searching for tags on thumbnails isn't horrible any more!
* -
* when you type a tag search, either in search or edit autocomplete contexts, and it needs to spend some time reading from the database, the search now always does the 'exact match' search first on what you typed. if you type in 'cat', it will show 'cat' and 'species:cat' and 'character:cat' and anything else that matches 'cat' exactly, with counts, and easy to select, while you are waiting for the full autocomplete results to come back
* in edit contexts, these exact-matching pre-fetch results now include sibling suggestions, even if the results have no count
* in edit contexts, the full results should more reliably include sibling suggestions, including those with no count. in some situations ('all known tags'), there may be too many siblings, so let me know!
* the main predicate sorting method now sorts by string secondarily, stabilising the sort between same-count preds
* when the results list transitions from pre-fetch results to full results, your current selection is now preserved!!! selecting and then hitting enter right when the full results come in should be safe now!
* when you type on a set of full results and it quickly filters down on the results cache to a smaller result, it now preserves selection. I'm not sure how totally useful this will be, but I did it anyway. hitting backspace and filtering 'up' will reset selection
* when you search for tags on a page of thumbnails, you should now get some early results super fast! these results are lacking sibling data and will be replaced with the better answer soon after, but if you want something simple, they'll work! no more waiting ages for anything on thumbnail tag searches!
* fixed an issue where the edit autocomplete was not caching results properly when you had the 'unnamespaced input gives (any namespace) wildcard results' option on
* the different loading states of autocomplete all now have clear 'loading...' labels, and each label is a little different based on what it is doing, like 'loading sibling data...'
* I generally cleared out jank. as the results move from one type to another, or as they filter down as you type, they _should_ flicker less
* added a new gui debug mode to force a three second delay on all autocomplete database jobs, to help simulate slow searches and play with the above
* NOTE: autocomplete has a heap of weird options under _tags->manage tag display and search_. I'm really happy with the above changes, but I messed around with the result injection rules, so I may have broken one of the combinations of wildcard rules here. let me know how you get on and I'll fix anything that I busted.

### pympler

* hydrus now optionally uses 'pympler', a python memory profiling library. for now, it replaces my old python gc (garbage collection) summarising commands under _help->debug->memory actions_, and gives much nicer formatting and now various estimates of actual memory use. this is a first version that mostly just replicates old behaviour, but I added a 'spam a more accurate total mem size of all the Qt widgets' in there too. I will keep developing this in future. we should be able to track some memory leaks better in future
* pympler is now in all the requirements.txts, so if you run from source and want to play with it, please reinstall your venv and you'll be sorted. _help->about_ says whether you have it or not

### misc

* the system:time predicates now allow you to specify the hh:mm time on the calendar control. if needed, you can now easily search for files viewed between 10pm-11:30pm yesterday. all existing 'date' system predicates will update to midnight. if you are a time-search nerd, note this changes the precision of existing time predicates--previously they searched _before/after_ the given date, but now they search including the given date, pivoting around the minute (default: 0:00am) rather than the integer calendar day! 'same day as' remains the same, though--midnight to midnight of the given calendar day
* if hydrus has previously initial-booted without mpv available and so set the media view options for video/animations/audio to 'show with native viewer', and you then boot with mpv available, hydrus now sets your view options to use mpv and gives a popup saying so. trying to get mpv to work should be a bit easier to test now, since it'll popup and fix itself as soon as you get it working, and people who never realised it was missing and fix it accidentally will now get sorted without having to do anything extra
* made some small speed and memory optimisations to content processing for busy clients with large sessions, particularly those with large collect-by'd pages
* also boosted the speed of the content update pipeline as it consults which files are affected by which update object
* the migrate tags dialog now lets you filter the tag source by pending only on tag repositories
* cleaned up some calendar/time code
* updated the Client API help on how Hydrus-Client-API-Access-Key works in GET vs POST arguments
* patched the legacy use of 'service_names_to_tags' in `/add_urls/add_url` in the client api. this parameter is more obsolete than the other legacy names (it got renamed a while ago to 'service_names_to_additional_tags'), but I'm supporting it again, just for a bit, for Hydrus Companion users stuck on an older version. sorry for the trouble here, this missed my legacy checks!

### windows mpv test

* hey, if you are an advanced windows user and want to run a test for me, please rename your mpv-2.dll to .old and then get this https://sourceforge.net/projects/mpv-player-windows/files/libmpv/mpv-dev-x86_64-20230212-git-a40958c.7z/download . extract the libmpv-2.dll and rename it to mpv-2.dll. does it work for you, showing api v2.1 in _help->about_? are you running the built windows release, or from source? it runs great for me from source, but I'd like to get a wider canvas before I update it for everyone. if it doesn't work, then delete the new dll and rename the .old back, and then let me know your windows version etc., thank you!

## [Version 517](https://github.com/hydrusnetwork/hydrus/releases/tag/v517)

### misc
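The pre-fetch flow described in the autocomplete bullets can be sketched as a two-stage search. This is an illustrative sketch only; the function names and callback shape here are hypothetical, not hydrus's actual API:

```python
# Sketch: publish cheap exact-match results first, then replace them with the
# full (slower) autocomplete results once they arrive.
def search_tags(typed, fetch_exact, fetch_full, publish):
    exact = fetch_exact(typed)   # fast: tags matching 'cat' exactly, any namespace
    publish(exact, final=False)  # the user can already select these
    full = fetch_full(typed)     # slow: everything matching 'cat*', with siblings
    publish(full, final=True)    # selection is preserved across this swap

results = []
search_tags(
    'cat',
    lambda t: [t, f'species:{t}'],
    lambda t: [t, f'species:{t}', f'{t}girl'],
    lambda r, final: results.append((final, r)),
)
# results[0] holds the quick pre-fetch batch, results[1] the full batch
```

The point of the two `publish` calls is the UI behaviour described above: the early batch renders immediately, and the later batch replaces it without losing the user's selection.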
@ -377,43 +416,3 @@ title: Changelog

* fixed the main thumbnail loader being confused at times about which thumbnail mime to load with. the check I have added is ultra-fast on data we are loading anyway, so we shouldn't notice a difference, but if you get slow thumb loads, let me know
* fixed the media container embed buttons using the file mime rather than the thumb mime when loading thumbnails (again causing transparency issues)
* fixed more generally bad mime handling in the thumbnail generation routine that could have caused more unusual transparency handling for clip, psd, or flash files

## [Version 508](https://github.com/hydrusnetwork/hydrus/releases/tag/v508)

### misc

* added a shortcut action to the 'media' set for 'file relationships: show x', where x is duplicates, potential duplicates, alternates, or false positives, just like the action buried in the thumbnail right-click menu. this actually works in both thumbs and the canvas.
* fixed file deletes not getting processed in the duplicate filter when there were no normal duplicate actions committed in a batch. sorry for the trouble here--duplicate decisions and deletes are now counted and reported in the confirmation dialogs as separate numbers
* as an experiment, the duplicate filter now says (+50%, -33%) percentage differences in the file size comparison statement. while the numbers here are correct, I'm not sure if this is helpful or awkward. maybe it should be phrased differently--let me know
* url classes get two new checkboxes this week: 'do not allow any extra path components/parameters', which will stop a match if the testee URL is 'longer' than the url class's definition. this should help with some difficult 'path-nested URLs aren't matching to the right URL Class' problems
* when you import hard drive files manually or in an import folder, files with .txt, .json, or .xml suffixes are now ignored in the file scanning phase. when hydrus eventually supports text files and arbitrary files, the solution will be nicer here, but this patch makes the new sidecar system nicer to work with in the meantime without, I hope, causing too much other fuss
* the 'tags' button in the advanced-mode 'sort files' control now hides/shows based on the sort type. also, the asc/desc button now hides/shows when it is invalid (filetype, hash, random), rather than disable/enable. there was a bit more signals-cleanup behind the scenes here too
* updated the 'could not set up qtpy/QtCore' error handling yet again to try to figure out this macOS App boot problem some users are getting. the error handling now says what the initial QT_API env variable was and tries to import every possible Qt and prints the whole error for each. hopefully we'll now see why PySide6 is not loading
* cleaned up the 'old changelog' page. all the '.' separators are replaced with proper header tags and I rejiggered some of the ul and li elements to interleave better. its favicon is also fixed. btw if you want to edit 500-odd elements at a time in a 2MB document, PyCharm is mostly great. multi-hundred simultaneous edit hung for about five minutes per character, but multiline regex Find and Replace was instant
* added a link to a user-written guide for running Hydrus on Windows in Anaconda to the 'installing' help
* fixed some old/invalid dialog locations in the 'how to build a downloader' help

### client api

* a new `/get_files/file_hashes` command lets you look up any of the sha256, md5, sha1, sha512 hashes that hydrus knows about using any of the other hashes. if you have a bunch of md5 and want to figure out if you have them, or if you want to get the md5s of your files and run them against an external check, this is now possible
* added help and unit tests for this new command
* added a service enum to the `/get_services` Client API help
* client api version is now 37
* as a side thing, I rejiggered the 'what non-sha256 hash do these sha256 hashes have?' test here. it now returns a mapping, allowing for more efficient mass lookups, and it no longer creates new sha256 records for novel hashes. feel free to spam this on new sha256 hashes if you like

### interesting serverside

* the tag repository now manages a tag filter. admins with 'modify options' permission can alter it under the new menu command _services->administrate services->tag repo->edit tag filter_.
* any time new tags are pended to the tag repository, they are now washed through the tag filter. any that don't pass are silently discarded
* normal users will regularly fetch the tag filter as long as their client is relatively new. they can review it under a new read-only Tag Filter panel from _review services_. if their client is super old (or the server), account sync and the UI should fail gracefully
* if you are in advanced mode and your client account-syncs and discovers the tag filter has changed, it will make a popup with a summary of the changes. I am not sure how spammy/annoying this will be, so let me know if you'd rather turn them off or auto-hide after two hours or something
* future updates will have more feedback on _manage tags_ dialog and similar, just to let you know there and then if an entered tag is not wanted. also, admins who change the tag filter will be able to retroactively remove tags that apply to the filter, not just stop new ones. I'd also like some sibling hard-replace to go along with this, so we don't accidentally remove tags that are otherwise sibling'd to be good--we'll see
* the hydrus server won't bug out so much at unusual errors now. previously, I ingrained that any error during any request would kick off automatic delays, but I have rejiggered it a bit so this mostly just happens during automatic work like update downloading

### boring serverside

* added get/set and similar to the tag repo's until-now-untouched tag filter
* wrote a nice helper method that splays two tag filters into their added/changed/deleted rules and another that can present that in human-readable format. it prints to the server log whenever a human changes the tag filter, and will be used in future retroactive syncing
* cleaned up how the service options are delivered to the client. previously, there would have been a version desync pain if I had ever updated the tag filter internal version. now, the service options delivered to the client are limited to python primitives, atm just update period and nullification period, and tag filter and other complex objects will have their own get calls and fail in quiet isolation
* I fixed some borked nullification period initialisation serverside
* whenever a tag filter describes itself, if either black or whitelist have more than 12 rules, it now summarises rather than listing every single one
@ -94,17 +94,26 @@ Access is required for every request. You can provide this as an http header, like so:

Hydrus-Client-API-Access-Key : 0150d9c4f6a6d2082534a997f4588dcf0c56dffe1d03ffbf98472236112236ae
```

Or you can include it as a GET or POST parameter on any request (except _POST /add\_files/add\_file_, which uses the entire POST body for the file's bytes). Use the same name for your GET or POST argument, such as:
Or you can include it in the normal parameters of any request (except _POST /add\_files/add\_file_, which uses the entire POST body for the file's bytes). For GET, this means including it in the URL parameters:

```
/get_files/thumbnail?file_id=452158&Hydrus-Client-API-Access-Key=0150d9c4f6a6d2082534a997f4588dcf0c56dffe1d03ffbf98472236112236ae
```

There is now a simple 'session' system, where you can get a temporary key that gives the same access without having to include the permanent access key in every request. You can fetch a session key with the [/session_key](#session_key) command and thereafter use it just as you would an access key, just with _Hydrus-Client-API-Session-Key_ instead.
For POST, this means in the JSON body parameters, like so:

```
{
    "hash_id" : 123456,
    "Hydrus-Client-API-Access-Key" : "0150d9c4f6a6d2082534a997f4588dcf0c56dffe1d03ffbf98472236112236ae"
}
```

There is also a simple 'session' system, where you can get a temporary key that gives the same access without having to include the permanent access key in every request. You can fetch a session key with the [/session_key](#session_key) command and thereafter use it just as you would an access key, just with _Hydrus-Client-API-Session-Key_ instead.

Session keys will expire if they are not used within 24 hours, or if the client is restarted, or if the underlying access key is deleted. An invalid/expired session key will give a **419** result with an appropriate error text.

Bear in mind the Client API is still under construction. Setting up the Client API to be accessible across the internet requires technical experience to be convenient. HTTPS is available for encrypted comms, but the default certificate is self-signed (which basically means an eavesdropper can't see through it, but your ISP/government could if they decided to target you). If you have your own domain and SSL cert, you can replace them though (check the db directory for client.crt and client.key). Otherwise, be careful about transmitting sensitive content outside of your localhost/network.
Bear in mind the Client API is still under construction. Setting up the Client API to be accessible across the internet requires technical experience to be convenient. HTTPS is available for encrypted comms, but the default certificate is self-signed (which basically means an eavesdropper can't see through it, but your ISP/government could if they decided to target you). If you have your own domain to host from and an SSL cert, you can replace them and it'll use them instead (check the db directory for client.crt and client.key). Otherwise, be careful about transmitting sensitive content outside of your localhost/network.

## Common Complex Parameters
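The three ways of supplying the key described in this help hunk can be sketched with the Python standard library. The default port, file id, and hash id below are illustrative assumptions; adjust them for your own setup:

```python
import json
import urllib.parse
import urllib.request

API = 'http://127.0.0.1:45869'  # assumed default Client API address; adjust as needed
KEY = '0150d9c4f6a6d2082534a997f4588dcf0c56dffe1d03ffbf98472236112236ae'

# 1) header form: works for any request
req = urllib.request.Request(
    f'{API}/api_version',
    headers={'Hydrus-Client-API-Access-Key': KEY},
)

# 2) GET form: the key goes into the URL parameters
params = urllib.parse.urlencode({'file_id': 452158, 'Hydrus-Client-API-Access-Key': KEY})
thumb_url = f'{API}/get_files/thumbnail?{params}'

# 3) POST form: the key goes into the JSON body parameters
body = json.dumps({'hash_id': 123456, 'Hydrus-Client-API-Access-Key': KEY}).encode()
```

Each snippet only builds the request; sending it (e.g. with `urllib.request.urlopen`) requires a running client with the API enabled.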
@ -34,6 +34,40 @@

<div class="content">
  <h1 id="changelog"><a href="#changelog">changelog</a></h1>
  <ul>
    <li>
      <h2 id="version_518"><a href="#version_518">version 518</a></h2>
      <ul>
        <li><h3>autocomplete improvements</h3></li>
        <li>tl;dr: I went through the whole tag autocomplete search pipeline, cleaned out the cruft, and made the pre-fetch results more sensible. searching for tags on thumbnails isn't horrible any more!</li>
        <li>-</li>
        <li>when you type a tag search, either in search or edit autocomplete contexts, and it needs to spend some time reading from the database, the search now always does the 'exact match' search first on what you typed. if you type in 'cat', it will show 'cat' and 'species:cat' and 'character:cat' and anything else that matches 'cat' exactly, with counts, and easy to select, while you are waiting for the full autocomplete results to come back</li>
        <li>in edit contexts, these exact-matching pre-fetch results now include sibling suggestions, even if the results have no count</li>
        <li>in edit contexts, the full results should more reliably include sibling suggestions, including those with no count. in some situations ('all known tags'), there may be too many siblings, so let me know!</li>
        <li>the main predicate sorting method now sorts by string secondarily, stabilising the sort between same-count preds</li>
        <li>when the results list transitions from pre-fetch results to full results, your current selection is now preserved!!! selecting and then hitting enter right when the full results come in should be safe now!</li>
        <li>when you type on a set of full results and it quickly filters down on the results cache to a smaller result, it now preserves selection. I'm not sure how totally useful this will be, but I did it anyway. hitting backspace and filtering 'up' will reset selection</li>
        <li>when you search for tags on a page of thumbnails, you should now get some early results super fast! these results are lacking sibling data and will be replaced with the better answer soon after, but if you want something simple, they'll work! no more waiting ages for anything on thumbnail tag searches!</li>
        <li>fixed an issue where the edit autocomplete was not caching results properly when you had the 'unnamespaced input gives (any namespace) wildcard results' option on</li>
        <li>the different loading states of autocomplete all now have clear 'loading...' labels, and each label is a little different based on what it is doing, like 'loading sibling data...'</li>
        <li>I generally cleared out jank. as the results move from one type to another, or as they filter down as you type, they _should_ flicker less</li>
        <li>added a new gui debug mode to force a three second delay on all autocomplete database jobs, to help simulate slow searches and play with the above</li>
        <li>NOTE: autocomplete has a heap of weird options under _tags->manage tag display and search_. I'm really happy with the above changes, but I messed around with the result injection rules, so I may have broken one of the combinations of wildcard rules here. let me know how you get on and I'll fix anything that I busted.</li>
        <li><h3>pympler</h3></li>
        <li>hydrus now optionally uses 'pympler', a python memory profiling library. for now, it replaces my old python gc (garbage collection) summarising commands under _help->debug->memory actions_, and gives much nicer formatting and now various estimates of actual memory use. this is a first version that mostly just replicates old behaviour, but I added a 'spam a more accurate total mem size of all the Qt widgets' in there too. I will keep developing this in future. we should be able to track some memory leaks better in future</li>
        <li>pympler is now in all the requirements.txts, so if you run from source and want to play with it, please reinstall your venv and you'll be sorted. _help->about_ says whether you have it or not</li>
        <li><h3>misc</h3></li>
        <li>the system:time predicates now allow you to specify the hh:mm time on the calendar control. if needed, you can now easily search for files viewed between 10pm-11:30pm yesterday. all existing 'date' system predicates will update to midnight. if you are a time-search nerd, note this changes the precision of existing time predicates--previously they searched _before/after_ the given date, but now they search including the given date, pivoting around the minute (default: 0:00am) rather than the integer calendar day! 'same day as' remains the same, though--midnight to midnight of the given calendar day</li>
        <li>if hydrus has previously initial-booted without mpv available and so set the media view options for video/animations/audio to 'show with native viewer', and you then boot with mpv available, hydrus now sets your view options to use mpv and gives a popup saying so. trying to get mpv to work should be a bit easier to test now, since it'll popup and fix itself as soon as you get it working, and people who never realised it was missing and fix it accidentally will now get sorted without having to do anything extra</li>
        <li>made some small speed and memory optimisations to content processing for busy clients with large sessions, particularly those with large collect-by'd pages</li>
        <li>also boosted the speed of the content update pipeline as it consults which files are affected by which update object</li>
        <li>the migrate tags dialog now lets you filter the tag source by pending only on tag repositories</li>
        <li>cleaned up some calendar/time code</li>
        <li>updated the Client API help on how Hydrus-Client-API-Access-Key works in GET vs POST arguments</li>
        <li>patched the legacy use of 'service_names_to_tags' in `/add_urls/add_url` in the client api. this parameter is more obsolete than the other legacy names (it got renamed a while ago to 'service_names_to_additional_tags'), but I'm supporting it again, just for a bit, for Hydrus Companion users stuck on an older version. sorry for the trouble here, this missed my legacy checks!</li>
        <li><h3>windows mpv test</h3></li>
        <li>hey, if you are an advanced windows user and want to run a test for me, please rename your mpv-2.dll to .old and then get this https://sourceforge.net/projects/mpv-player-windows/files/libmpv/mpv-dev-x86_64-20230212-git-a40958c.7z/download . extract the libmpv-2.dll and rename it to mpv-2.dll. does it work for you, showing api v2.1 in _help->about_? are you running the built windows release, or from source? it runs great for me from source, but I'd like to get a wider canvas before I update it for everyone. if it doesn't work, then delete the new dll and rename the .old back, and then let me know your windows version etc., thank you!</li>
      </ul>
    </li>
    <li>
      <h2 id="version_517"><a href="#version_517">version 517</a></h2>
      <ul>
@ -278,6 +278,10 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):

        self._dictionary[ 'booleans' ][ 'focus_preview_on_shift_click' ] = False
        self._dictionary[ 'booleans' ][ 'focus_preview_on_shift_click_only_static' ] = False
        
        from hydrus.client.gui.canvas import ClientGUIMPV
        
        self._dictionary[ 'booleans' ][ 'mpv_available_at_start' ] = ClientGUIMPV.MPV_IS_AVAILABLE
        
        #
        
        self._dictionary[ 'colours' ] = HydrusSerialisable.SerialisableDictionary()
@ -234,11 +234,12 @@ def IsComplexWildcard( search_text ):

def SortPredicates( predicates ):
    
    key = lambda p: p.GetCount().GetMinCount()
    key = lambda p: ( - p.GetCount().GetMinCount(), p.ToString() )
    
    predicates.sort( key = key, reverse = True )
    predicates.sort( key = key )
    
    return predicates

NUMBER_TEST_OPERATOR_LESS_THAN = 0
NUMBER_TEST_OPERATOR_GREATER_THAN = 1
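The new sort key negates the count and adds the display string, so a single ascending sort gives descending counts with a stable alphabetical tiebreak. A minimal sketch with a stand-in class (`FakePred` is hypothetical, not hydrus's `Predicate`):

```python
# Sketch of the new SortPredicates behaviour: descending count first,
# then ascending display string, so same-count predicates keep a stable order.
class FakePred:
    def __init__(self, text, count):
        self.text = text
        self.count = count
    def ToString(self):
        return self.text

preds = [FakePred('character:cat', 5), FakePred('species:cat', 5), FakePred('cat', 12)]
preds.sort(key=lambda p: (-p.count, p.ToString()))
ordered = [p.ToString() for p in preds]  # ['cat', 'character:cat', 'species:cat']
```

This is why the old `reverse = True` goes away: negating the count inside the tuple key handles the descending order while leaving the string component ascending.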
@ -482,12 +483,16 @@ class FileSystemPredicates( object ):

    elif age_type == 'date':
        
        ( year, month, day ) = age_value
        ( year, month, day, hour, minute ) = age_value
        
        dt = ClientTime.GetDateTime( year, month, day )
        dt = ClientTime.GetDateTime( year, month, day, hour, minute )
        
        time_pivot = ClientTime.CalendarToTimestamp( dt )
        next_day_timestamp = ClientTime.CalendarToTimestamp( ClientTime.CalendarDelta( dt, day_delta = 1 ) )
        
        dt_day_of_start = ClientTime.GetDateTime( year, month, day, 0, 0 )
        
        day_of_start = ClientTime.CalendarToTimestamp( dt_day_of_start )
        day_of_end = ClientTime.CalendarToTimestamp( ClientTime.CalendarDelta( dt_day_of_start, day_delta = 1 ) )
        
        # the before/since semantic logic is:
        # '<' 2022-05-05 means 'before that date'
@ -499,12 +504,12 @@ class FileSystemPredicates( object ):

    elif operator == '>':
        
        self._timestamp_ranges[ predicate_type ][ '>' ] = next_day_timestamp
        self._timestamp_ranges[ predicate_type ][ '>' ] = time_pivot
        
    elif operator == '=':
        
        self._timestamp_ranges[ predicate_type ][ '>' ] = time_pivot
        self._timestamp_ranges[ predicate_type ][ '<' ] = next_day_timestamp
        self._timestamp_ranges[ predicate_type ][ '>' ] = day_of_start
        self._timestamp_ranges[ predicate_type ][ '<' ] = day_of_end
        
    elif operator == CC.UNICODE_ALMOST_EQUAL_TO:
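The semantics of this hunk, as the changelog puts it, are that '<' and '>' now pivot around the given hh:mm while '=' still spans the whole calendar day. A standalone sketch (the function is illustrative, and it uses UTC via `calendar.timegm` for determinism where hydrus works in local time):

```python
import calendar
import datetime

def date_search_ranges(year, month, day, hour=0, minute=0):
    # '<' and '>' pivot around the given minute; '=' spans midnight to midnight.
    pivot_dt = datetime.datetime(year, month, day, hour, minute)
    day_start = datetime.datetime(year, month, day)
    pivot = calendar.timegm(pivot_dt.timetuple())
    day_of_start = calendar.timegm(day_start.timetuple())
    day_of_end = calendar.timegm((day_start + datetime.timedelta(days=1)).timetuple())
    return {'<': pivot, '>': pivot, '=': (day_of_start, day_of_end)}

r = date_search_ranges(2023, 2, 15, 22, 0)
# 'viewed after 10pm that day' now starts at 22:00, not the next midnight
```

Under the old code, '>' used the next day's midnight as the boundary; the pivot timestamp is what makes the new minute-level precision possible.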
@ -1564,7 +1569,7 @@ class Predicate( HydrusSerialisable.SerialisableBase ):

    SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_PREDICATE
    SERIALISABLE_NAME = 'File Search Predicate'
    SERIALISABLE_VERSION = 5
    SERIALISABLE_VERSION = 6
    
    def __init__(
        self,
@ -1876,6 +1881,32 @@ class Predicate( HydrusSerialisable.SerialisableBase ):

            return ( 5, new_serialisable_info )
            
        if version == 5:
            
            ( predicate_type, serialisable_value, inclusive ) = old_serialisable_info
            
            if predicate_type in ( PREDICATE_TYPE_SYSTEM_AGE, PREDICATE_TYPE_SYSTEM_LAST_VIEWED_TIME, PREDICATE_TYPE_SYSTEM_MODIFIED_TIME ):
                
                ( operator, age_type, age_value ) = serialisable_value
                
                if age_type == 'date':
                    
                    ( year, month, day ) = age_value
                    
                    hour = 0
                    minute = 0
                    
                    age_value = ( year, month, day, hour, minute )
                    
                    serialisable_value = ( operator, age_type, age_value )
                    
                
            
            new_serialisable_info = ( predicate_type, serialisable_value, inclusive )
            
            return ( 6, new_serialisable_info )
            
        
    def GetCopy( self ):
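The v5-to-v6 migration in this hunk is a plain tuple-padding step: old date predicates stored `(year, month, day)` and new ones append `hour, minute`, defaulting to midnight as the changelog says. A minimal standalone sketch (the function name is illustrative):

```python
def update_date_age_value_v5_to_v6(age_value):
    # v5 serialised form: (year, month, day); v6 appends hh:mm,
    # defaulting existing predicates to midnight
    year, month, day = age_value
    return (year, month, day, 0, 0)

assert update_date_age_value_v5_to_v6((2023, 2, 15)) == (2023, 2, 15, 0, 0)
```

Defaulting to `(0, 0)` keeps old '<'/'>' predicates pivoting at the start of the given day, the closest match to their previous calendar-day behaviour.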
@ -2463,19 +2494,11 @@ class Predicate( HydrusSerialisable.SerialisableBase ):

    elif age_type == 'date':
        
        ( year, month, day ) = age_value
        ( year, month, day, hour, minute ) = age_value
        
        dt = datetime.datetime( year, month, day )
        dt = datetime.datetime( year, month, day, hour, minute )
        
        try:
            
            # make a timestamp (IN GMT SECS SINCE 1970) from the local meaning of 2018/02/01
            timestamp = int( time.mktime( dt.timetuple() ) )
            
        except:
            
            timestamp = HydrusData.GetNow()
            
        timestamp = ClientTime.CalendarToTimestamp( dt )
        
        if operator == '<':
@ -2494,8 +2517,10 @@ class Predicate( HydrusSerialisable.SerialisableBase ):

            pretty_operator = 'a month either side of '
            
        
        include_24h_time = operator != '=' and ( hour > 0 or minute > 0 )
        
        # convert this GMT TIMESTAMP to a pretty local string
        base += ': ' + pretty_operator + HydrusData.ConvertTimestampToPrettyTime( timestamp, include_24h_time = False )
        base += ': ' + pretty_operator + HydrusData.ConvertTimestampToPrettyTime( timestamp, include_24h_time = include_24h_time )
@ -3339,7 +3364,7 @@ class PredicateResultsCache( object ):

        self._predicates = list( predicates )
        
    
    def CanServeTagResults( self, parsed_autocomplete_text: ParsedAutocompleteText, exact_match: bool ):
    def CanServeTagResults( self, parsed_autocomplete_text: ParsedAutocompleteText, exact_match: bool, allow_auto_wildcard_conversion = True ):
        
        return False
@ -3383,9 +3408,9 @@ class PredicateResultsCacheTag( PredicateResultsCache ):

        self._exact_match = exact_match
        
    
    def CanServeTagResults( self, parsed_autocomplete_text: ParsedAutocompleteText, exact_match: bool ):
    def CanServeTagResults( self, parsed_autocomplete_text: ParsedAutocompleteText, exact_match: bool, allow_auto_wildcard_conversion = True ):
        
        strict_search_text = parsed_autocomplete_text.GetSearchText( False )
        strict_search_text = parsed_autocomplete_text.GetSearchText( False, allow_auto_wildcard_conversion = allow_auto_wildcard_conversion )
        
        if self._exact_match:
@ -72,10 +72,10 @@ def date_pred_generator( pred_type, o, v ):

    # Either a tuple of 4 non-negative integers: (years, months, days, hours) where the latter is < 24 OR
    # a datetime.date object. For the latter, only the YYYY-MM-DD format is accepted.
    
    if isinstance( v, datetime.date ):
    if isinstance( v, datetime.datetime ):
        
        date_type = 'date'
        v = ( v.year, v.month, v.day )
        v = ( v.year, v.month, v.day, v.hour, v.minute )
        
    else:
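The `isinstance` change in this hunk matters because `datetime.datetime` is a subclass of `datetime.date`: the old check matched both types, but only a full `datetime` carries the hour and minute that the new five-element tuple needs. A small sketch of the distinction:

```python
import datetime

d = datetime.date(2023, 2, 15)
dt = datetime.datetime(2023, 2, 15, 22, 30)

# datetime is a subclass of date, so the old isinstance check matched both...
assert isinstance(dt, datetime.date) and isinstance(d, datetime.date)
# ...but only a full datetime carries the hh:mm needed for the new tuple
assert isinstance(dt, datetime.datetime)
assert not isinstance(d, datetime.datetime)

v = (dt.year, dt.month, dt.day, dt.hour, dt.minute)  # (2023, 2, 15, 22, 30)
```

Tightening the check to `datetime.datetime` means a bare `date` now falls through to the `else:` branch instead of silently producing a tuple with no time part.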
@ -46,9 +46,9 @@ def CalendarDelta( dt: datetime.datetime, month_delta = 0, day_delta = 0 ) -> datetime.datetime:

def GetDateTime( year: int, month: int, day: int ) -> datetime.datetime:
def GetDateTime( year: int, month: int, day: int, hour: int, minute: int ) -> datetime.datetime:
    
    return datetime.datetime( year, month, day )
    return datetime.datetime( year, month, day, hour, minute )
    
def MergeModifiedTimes( existing_timestamp: typing.Optional[ int ], new_timestamp: typing.Optional[ int ] ) -> typing.Optional[ int ]:
@@ -1,12 +1,14 @@

import collections
import itertools
import sqlite3
+import time
import typing

from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusData
from hydrus.core import HydrusDBBase
from hydrus.core import HydrusExceptions
from hydrus.core import HydrusGlobals as HG

from hydrus.client import ClientConstants as CC
from hydrus.client import ClientSearch

@@ -317,6 +319,21 @@ class ClientDBTagDisplay( ClientDBModule.ClientDBModule ):

def GetMediaPredicates( self, tag_context: ClientSearch.TagContext, tags_to_counts, inclusive, job_key = None ):

+if HG.autocomplete_delay_mode:
+
+time_to_stop = HydrusData.GetNowFloat() + 3.0
+
+while not HydrusData.TimeHasPassedFloat( time_to_stop ):
+
+time.sleep( 0.1 )
+
+if job_key.IsCancelled():
+
+return []

display_tag_service_id = self.modules_services.GetServiceId( tag_context.display_service_key )

max_current_count = None
@@ -1,5 +1,6 @@

import collections
import sqlite3
+import time
import typing

from hydrus.core import HydrusConstants as HC

@@ -476,6 +477,21 @@ class ClientDBTagSearch( ClientDBModule.ClientDBModule ):

# For instance, if we search for A on a domain where one tag service has A->B, we return the B results. Well, let's increment the A (x) count according to that, based on each service!
# and then obviously a nice big merge at the end

+if HG.autocomplete_delay_mode and not exact_match:
+
+time_to_stop = HydrusData.GetNowFloat() + 3.0
+
+while not HydrusData.TimeHasPassedFloat( time_to_stop ):
+
+time.sleep( 0.1 )
+
+if job_key is not None and job_key.IsCancelled():
+
+return []

location_context = file_search_context.GetLocationContext()
tag_context = file_search_context.GetTagContext()
@@ -27,9 +27,10 @@ from hydrus.core import HydrusData

from hydrus.core import HydrusEncryption
from hydrus.core import HydrusExceptions
from hydrus.core import HydrusFileHandling
-from hydrus.core import HydrusImageHandling
-from hydrus.core import HydrusPaths
from hydrus.core import HydrusGlobals as HG
+from hydrus.core import HydrusImageHandling
+from hydrus.core import HydrusMemory
+from hydrus.core import HydrusPaths
from hydrus.core import HydrusSerialisable
from hydrus.core import HydrusTags
from hydrus.core import HydrusTemp

@@ -506,8 +507,6 @@ class FrameGUI( ClientGUITopLevelWindows.MainFrameThatResizes, CAC.ApplicationCo

self._notebook = ClientGUIPages.PagesNotebook( self, self._controller, 'top page notebook' )

-self._garbage_snapshot = collections.Counter()

self._currently_uploading_pending = set()

self._last_clipboard_watched_text = ''
@@ -641,6 +640,65 @@ class FrameGUI( ClientGUITopLevelWindows.MainFrameThatResizes, CAC.ApplicationCo

self._locator_widget.setEscapeShortcuts( [ QG.QKeySequence( QC.Qt.Key_Escape ) ] )
# self._locator_widget.setQueryTimeout( 100 ) # how much to wait before starting a search after user edit. default 0

+#

+try:

+mpv_available_at_start = self._controller.new_options.GetBoolean( 'mpv_available_at_start' )

+if not mpv_available_at_start and ClientGUIMPV.MPV_IS_AVAILABLE:

+# ok, mpv has started working!

+self._controller.new_options.SetBoolean( 'mpv_available_at_start', True )

+original_mimes_to_view_options = self._new_options.GetMediaViewOptions()

+edited_mimes_to_view_options = dict( original_mimes_to_view_options )

+we_done_it = False

+for general_mime in ( HC.GENERAL_VIDEO, HC.GENERAL_ANIMATION, HC.GENERAL_AUDIO ):

+if general_mime in original_mimes_to_view_options:

+view_options = original_mimes_to_view_options[ general_mime ]

+( media_show_action, media_start_paused, media_start_with_embed, preview_show_action, preview_start_paused, preview_start_with_embed, zoom_info ) = view_options

+if media_show_action == CC.MEDIA_VIEWER_ACTION_SHOW_WITH_NATIVE:

+media_show_action = CC.MEDIA_VIEWER_ACTION_SHOW_WITH_MPV
+preview_show_action = CC.MEDIA_VIEWER_ACTION_SHOW_WITH_MPV

+view_options = ( media_show_action, media_start_paused, media_start_with_embed, preview_show_action, preview_start_paused, preview_start_with_embed, zoom_info )

+edited_mimes_to_view_options[ general_mime ] = view_options

+we_done_it = True

+if we_done_it:

+self._controller.new_options.SetMediaViewOptions( edited_mimes_to_view_options )

+HydrusData.ShowText( 'Hey, MPV was not working on a previous boot, but it looks like it is now. I have updated your media view settings to use MPV.')

+else:

+HydrusData.ShowText( 'Hey, MPV was not working on a previous boot, but it looks like it is now. You might like to check your file view settings under options->media.')

+except Exception as e:

+HydrusData.ShowText( 'Hey, while trying to check some MPV stuff on boot, I encountered an error. Please let hydev know.' )

+HydrusData.ShowException( e )

def _AboutWindow( self ):
@@ -755,6 +813,7 @@ class FrameGUI( ClientGUITopLevelWindows.MainFrameThatResizes, CAC.ApplicationCo

library_versions.append( ( 'html5lib present: ', str( ClientParsing.HTML5LIB_IS_OK ) ) )
library_versions.append( ( 'lxml present: ', str( ClientParsing.LXML_IS_OK ) ) )
library_versions.append( ( 'lz4 present: ', str( HydrusCompression.LZ4_OK ) ) )
+library_versions.append( ( 'pympler present:', str( HydrusMemory.PYMPLER_OK ) ) )
library_versions.append( ( 'pyopenssl present:', str( HydrusEncryption.OPENSSL_OK ) ) )
library_versions.append( ( 'speedcopy present:', str( HydrusFileHandling.SPEEDCOPY_OK ) ) )
library_versions.append( ( 'install dir', HC.BASE_DIR ) )

@@ -1084,11 +1143,11 @@ class FrameGUI( ClientGUITopLevelWindows.MainFrameThatResizes, CAC.ApplicationCo

def _ClearOrphanFiles( self ):

-text = 'This will iterate through every file in your database\'s file storage, removing any it does not expect to be there. It may take some time.'
+text = 'This job will iterate through every file in your database\'s file storage, removing any it does not expect to be there. It may take some time.'

text += os.linesep * 2
text += 'Files and thumbnails will be inaccessible while this occurs, so it is best to leave the client alone until it is done.'

-result = ClientGUIDialogsQuick.GetYesNo( self, text, yes_label = 'do it', no_label = 'forget it' )
+result = ClientGUIDialogsQuick.GetYesNo( self, text, yes_label = 'get started', no_label = 'forget it' )

if result == QW.QDialog.Accepted:
@@ -1535,144 +1594,40 @@ class FrameGUI( ClientGUITopLevelWindows.MainFrameThatResizes, CAC.ApplicationCo

self._controller.column_list_manager.ResetToDefaults()

-def _DebugShowGarbageDifferences( self ):
+def _DebugShowMemoryUseDifferences( self ):

-count = collections.Counter()

-for o in gc.get_objects():
+if not HydrusMemory.PYMPLER_OK:

-count[ type( o ) ] += 1
+HydrusData.ShowText( 'Sorry, you need pympler for this!' )

+return

-count.subtract( self._garbage_snapshot )

-text = 'Garbage differences start here:'

-to_print = list( count.items() )

-to_print.sort( key = lambda pair: -pair[1] )

-for ( t, count ) in to_print:

-if count == 0:

-continue

-text += os.linesep + '{}: {}'.format( t, HydrusData.ToHumanInt( count ) )

-HydrusData.ShowText( text )
+HydrusMemory.PrintSnapshotDiff()

-def _DebugTakeGarbageSnapshot( self ):
+def _DebugTakeMemoryUseSnapshot( self ):

-count = collections.Counter()

-for o in gc.get_objects():
+if not HydrusMemory.PYMPLER_OK:

-count[ type( o ) ] += 1
+HydrusData.ShowText( 'Sorry, you need pympler for this!' )

+return

-self._garbage_snapshot = count
+HydrusMemory.TakeMemoryUseSnapshot()

-def _DebugPrintGarbage( self ):
+def _DebugPrintMemoryUse( self ):

-HydrusData.ShowText( 'Printing garbage to log' )

-HydrusData.Print( 'uncollectable gc.garbage:' )

-count = collections.Counter()

-for o in gc.garbage:
+if not HydrusMemory.PYMPLER_OK:

-count[ type( o ) ] += 1
+HydrusData.ShowText( 'Sorry, you need pympler for this!' )

+return

-to_print = list( count.items() )

-to_print.sort( key = lambda pair: -pair[1] )

-for ( k, v ) in to_print:

-HydrusData.Print( ( k, v ) )

-del gc.garbage[:]

-old_debug = gc.get_debug()

-HydrusData.Print( 'running a collect with stats on:' )

-gc.set_debug( gc.DEBUG_LEAK | gc.DEBUG_STATS )

-gc.collect()

-del gc.garbage[:]

-gc.set_debug( old_debug )

-#

-count = collections.Counter()

-objects_to_inspect = set()

-for o in gc.get_objects():

-# add objects to inspect here

-count[ type( o ) ] += 1

-current_frame = sys._getframe( 0 )

-for o in objects_to_inspect:

-HydrusData.Print( o )

-parents = gc.get_referrers( o )

-for parent in parents:

-if parent == current_frame or parent == objects_to_inspect:

-continue

-HydrusData.Print( 'parent {}'.format( parent ) )

-grandparents = gc.get_referrers( parent )

-for gp in grandparents:

-if gp == current_frame or gp == parents:

-continue

-HydrusData.Print( 'grandparent {}'.format( gp ) )

-HydrusData.Print( 'currently tracked types:' )

-to_print = list( count.items() )

-to_print.sort( key = lambda pair: -pair[1] )

-for ( k, v ) in to_print:

-if v > 15:

-HydrusData.Print( ( k, v ) )

-HydrusData.DebugPrint( 'garbage printing finished' )
+HydrusMemory.PrintCurrentMemoryUse( ( QW.QWidget, ) )

def _DebugShowScheduledJobs( self ):
@@ -3351,6 +3306,7 @@ class FrameGUI( ClientGUITopLevelWindows.MainFrameThatResizes, CAC.ApplicationCo

ClientGUIMenus.AppendMenuItem( gui_actions, 'macos anti-flicker test', 'Try it out, let me know how it goes.', flip_macos_antiflicker )

+ClientGUIMenus.AppendMenuCheckItem( gui_actions, 'autocomplete delay mode', 'Delay all autocomplete requests at the database level by three seconds.', HG.autocomplete_delay_mode, self._SwitchBoolean, 'autocomplete_delay_mode' )
ClientGUIMenus.AppendMenuItem( gui_actions, 'make some popups', 'Throw some varied popups at the message manager, just to check it is working.', self._DebugMakeSomePopups )
ClientGUIMenus.AppendMenuItem( gui_actions, 'make a long text popup', 'Make a popup with text that will grow in size.', self._DebugLongTextPopup )
ClientGUIMenus.AppendMenuItem( gui_actions, 'make a popup in five seconds', 'Throw a delayed popup at the message manager, giving you time to minimise or otherwise alter the client before it arrives.', self._controller.CallLater, 5, HydrusData.ShowText, 'This is a delayed popup message.' )
@@ -3385,9 +3341,13 @@ class FrameGUI( ClientGUITopLevelWindows.MainFrameThatResizes, CAC.ApplicationCo

ClientGUIMenus.AppendMenuItem( memory_actions, 'run slow memory maintenance', 'Tell all the slow caches to maintain themselves.', self._controller.MaintainMemorySlow )
ClientGUIMenus.AppendMenuItem( memory_actions, 'clear all rendering caches', 'Tell the image rendering system to forget all current images, tiles, and thumbs. This will often free up a bunch of memory immediately.', self._controller.ClearCaches )
ClientGUIMenus.AppendMenuItem( memory_actions, 'clear thumbnail cache', 'Tell the thumbnail cache to forget everything and redraw all current thumbs.', self._controller.pub, 'reset_thumbnail_cache' )
-ClientGUIMenus.AppendMenuItem( memory_actions, 'print garbage', 'Print some information about the python garbage to the log.', self._DebugPrintGarbage )
-ClientGUIMenus.AppendMenuItem( memory_actions, 'take garbage snapshot', 'Capture current garbage object counts.', self._DebugTakeGarbageSnapshot )
-ClientGUIMenus.AppendMenuItem( memory_actions, 'show garbage snapshot changes', 'Show object count differences from the last snapshot.', self._DebugShowGarbageDifferences )

+if HydrusMemory.PYMPLER_OK:

+ClientGUIMenus.AppendMenuItem( memory_actions, 'print memory-use summary', 'Print some information about the python memory use to the log.', self._DebugPrintMemoryUse )
+ClientGUIMenus.AppendMenuItem( memory_actions, 'take memory-use snapshot', 'Capture current memory use.', self._DebugTakeMemoryUseSnapshot )
+ClientGUIMenus.AppendMenuItem( memory_actions, 'print memory-use snapshot diff', 'Show memory use differences since the last snapshot.', self._DebugShowMemoryUseDifferences )

ClientGUIMenus.AppendMenu( debug, memory_actions, 'memory actions' )
|
|||
|
||||
def _SwitchBoolean( self, name ):
|
||||
|
||||
if name == 'cache_report_mode':
|
||||
if name == 'autocomplete_delay_mode':
|
||||
|
||||
HG.autocomplete_delay_mode = not HG.autocomplete_delay_mode
|
||||
|
||||
elif name == 'cache_report_mode':
|
||||
|
||||
HG.cache_report_mode = not HG.cache_report_mode
|
||||
|
||||
|
|
|
@@ -1221,19 +1221,20 @@ class MigrateTagsPanel( ClientGUIScrolledPanels.ReviewPanel ):

extra_info = ''

-source_content_statuses_strings = {}
+source_content_statuses_strings = {
+( HC.CONTENT_STATUS_CURRENT, ) : 'current',
+( HC.CONTENT_STATUS_CURRENT, HC.CONTENT_STATUS_PENDING ) : 'current and pending',
+( HC.CONTENT_STATUS_PENDING, ) : 'pending',
+( HC.CONTENT_STATUS_DELETED, ) : 'deleted'
+}

-source_content_statuses_strings[ ( HC.CONTENT_STATUS_CURRENT, ) ] = 'current'
-source_content_statuses_strings[ ( HC.CONTENT_STATUS_CURRENT, HC.CONTENT_STATUS_PENDING ) ] = 'current and pending'
-source_content_statuses_strings[ ( HC.CONTENT_STATUS_DELETED, ) ] = 'deleted'

-destination_action_strings = {}

-destination_action_strings[ HC.CONTENT_UPDATE_ADD ] = 'adding them to'
-destination_action_strings[ HC.CONTENT_UPDATE_DELETE ] = 'deleting them from'
-destination_action_strings[ HC.CONTENT_UPDATE_CLEAR_DELETE_RECORD ] = 'clearing their deletion record from'
-destination_action_strings[ HC.CONTENT_UPDATE_PEND ] = 'pending them to'
-destination_action_strings[ HC.CONTENT_UPDATE_PETITION ] = 'petitioning them for removal from'

+destination_action_strings = {
+HC.CONTENT_UPDATE_ADD : 'adding them to',
+HC.CONTENT_UPDATE_DELETE : 'deleting them from',
+HC.CONTENT_UPDATE_CLEAR_DELETE_RECORD : 'clearing their deletion record from',
+HC.CONTENT_UPDATE_PEND : 'pending them to',
+HC.CONTENT_UPDATE_PETITION : 'petitioning them for removal from'
+}

content_type = self._migration_content_type.GetValue()
content_statuses = self._migration_source_content_status_filter.GetValue()
@@ -1703,6 +1704,7 @@ class MigrateTagsPanel( ClientGUIScrolledPanels.ReviewPanel ):

if source_service_type == HC.TAG_REPOSITORY:

self._migration_source_content_status_filter.addItem( 'current and pending', ( HC.CONTENT_STATUS_CURRENT, HC.CONTENT_STATUS_PENDING ) )
+self._migration_source_content_status_filter.addItem( 'current and pending', ( HC.CONTENT_STATUS_PENDING, ) )

self._migration_source_content_status_filter.addItem( 'deleted', ( HC.CONTENT_STATUS_DELETED, ) )
@@ -32,6 +32,7 @@ except Exception as e:

mpv_failed_reason = traceback.format_exc()

MPV_IS_AVAILABLE = False

def GetClientAPIVersionString():
@@ -490,16 +490,9 @@ class ListBoxItemPredicate( ListBoxItem ):

return text

-def GetSearchPredicates( self ) -> typing.List[ ClientSearch.Predicate ]:
+def GetPredicate( self ) -> ClientSearch.Predicate:

-if self._predicate.GetType() in ( ClientSearch.PREDICATE_TYPE_LABEL, ClientSearch.PREDICATE_TYPE_PARENT ):
-
-return []
-
-else:
-
-return [ self._predicate ]
+return self._predicate

def GetRowCount( self, show_parent_rows: bool ):

@@ -555,6 +548,18 @@ class ListBoxItemPredicate( ListBoxItem ):

return rows_of_texts_and_namespaces

+def GetSearchPredicates( self ) -> typing.List[ ClientSearch.Predicate ]:
+
+if self._predicate.GetType() in ( ClientSearch.PREDICATE_TYPE_LABEL, ClientSearch.PREDICATE_TYPE_PARENT ):
+
+return []
+
+else:
+
+return [ self._predicate ]

def GetTags( self ) -> typing.Set[ str ]:

if self._predicate.GetType() == ClientSearch.PREDICATE_TYPE_TAG:

@@ -573,3 +578,4 @@ class ListBoxItemPredicate( ListBoxItem ):

self._i_am_an_or_under_construction = value
@@ -1,6 +1,7 @@

import collections
+import itertools
import os
import time
import typing

from qtpy import QtCore as QC

@@ -27,6 +28,7 @@ from hydrus.client.gui import ClientGUIShortcuts

from hydrus.client.gui import ClientGUITopLevelWindowsPanels
from hydrus.client.gui import QtPorting as QP
from hydrus.client.gui.lists import ClientGUIListBoxes
+from hydrus.client.gui.lists import ClientGUIListBoxesData
from hydrus.client.gui.pages import ClientGUIResultsSortCollect
from hydrus.client.gui.search import ClientGUILocation
from hydrus.client.gui.search import ClientGUISearch

@@ -35,10 +37,11 @@ from hydrus.client.metadata import ClientTags

from hydrus.external import LogicExpressionQueryParser

-def AppendLoadingPredicate( predicates ):
+def AppendLoadingPredicate( predicates, label ):

-predicates.append( ClientSearch.Predicate( predicate_type = ClientSearch.PREDICATE_TYPE_LABEL, value = 'loading results\u2026' ) )
+predicates.append( ClientSearch.Predicate( predicate_type = ClientSearch.PREDICATE_TYPE_LABEL, value = label + '\u2026' ) )

def InsertOtherPredicatesForRead( predicates: list, parsed_autocomplete_text: ClientSearch.ParsedAutocompleteText, include_unusual_predicate_types: bool, under_construction_or_predicate: typing.Optional[ ClientSearch.Predicate ] ):

if include_unusual_predicate_types:
@@ -110,6 +113,7 @@ def InsertTagPredicates( predicates: list, tag_service_key: bytes, parsed_autoco

def ReadFetch(
win: QW.QWidget,
job_key: ClientThreading.JobKey,
+prefetch_callable,
results_callable,
parsed_autocomplete_text: ClientSearch.ParsedAutocompleteText,
qt_media_callable,

@@ -129,6 +133,12 @@ def ReadFetch(

if parsed_autocomplete_text.IsEmpty():

matches = []

+AppendLoadingPredicate( matches, 'loading system predicates' )

+HG.client_controller.CallAfterQtSafe( win, 'read a/c exact match results', prefetch_callable, job_key, matches, parsed_autocomplete_text )

cache_valid = isinstance( results_cache, ClientSearch.PredicateResultsCacheSystem )

we_need_results = not cache_valid

@@ -156,7 +166,7 @@ def ReadFetch(

else:

-fetch_from_db = True
+db_based_results = True

if synchronised and qt_media_callable is not None and not file_search_context.GetSystemPredicates().HasSystemLimit():
@@ -178,51 +188,62 @@ def ReadFetch(

if media_available_and_good:

-fetch_from_db = False
+db_based_results = False

strict_search_text = parsed_autocomplete_text.GetSearchText( False )
autocomplete_search_text = parsed_autocomplete_text.GetSearchText( True )

-if fetch_from_db:
+if db_based_results:

+allow_auto_wildcard_conversion = True

is_explicit_wildcard = parsed_autocomplete_text.IsExplicitWildcard( allow_auto_wildcard_conversion )

-small_exact_match_search = ShouldDoExactSearch( parsed_autocomplete_text )

matches = []

-if small_exact_match_search:
+if is_explicit_wildcard:

-if not results_cache.CanServeTagResults( parsed_autocomplete_text, True ):

-predicates = HG.client_controller.Read( 'autocomplete_predicates', ClientTags.TAG_DISPLAY_ACTUAL, file_search_context, search_text = strict_search_text, exact_match = True, inclusive = parsed_autocomplete_text.inclusive, job_key = job_key )

-results_cache = ClientSearch.PredicateResultsCacheTag( predicates, strict_search_text, True )

-matches = results_cache.FilterPredicates( tag_service_key, strict_search_text )
+cache_valid = False

else:

-if is_explicit_wildcard:
-
-cache_valid = False
-
-else:

cache_valid = results_cache.CanServeTagResults( parsed_autocomplete_text, False )

if cache_valid:

matches = results_cache.FilterPredicates( tag_service_key, autocomplete_search_text )

+else:

+exact_match_predicates = HG.client_controller.Read( 'autocomplete_predicates', ClientTags.TAG_DISPLAY_ACTUAL, file_search_context, search_text = strict_search_text, exact_match = True, inclusive = parsed_autocomplete_text.inclusive, job_key = job_key )

+small_exact_match_search = ShouldDoExactSearch( parsed_autocomplete_text )

+if small_exact_match_search:

+results_cache = ClientSearch.PredicateResultsCacheTag( exact_match_predicates, strict_search_text, True )

+matches = results_cache.FilterPredicates( tag_service_key, strict_search_text )

+else:

+exact_match_matches = ClientSearch.FilterPredicatesBySearchText( tag_service_key, autocomplete_search_text, exact_match_predicates )

+exact_match_matches = ClientSearch.SortPredicates( exact_match_matches )

+allow_auto_wildcard_conversion = True

+InsertTagPredicates( exact_match_matches, tag_service_key, parsed_autocomplete_text, allow_auto_wildcard_conversion, insert_if_does_not_exist = False )

+InsertOtherPredicatesForRead( exact_match_matches, parsed_autocomplete_text, include_unusual_predicate_types, under_construction_or_predicate )

+AppendLoadingPredicate( exact_match_matches, 'loading full results' )

+HG.client_controller.CallAfterQtSafe( win, 'read a/c exact match results', prefetch_callable, job_key, exact_match_matches, parsed_autocomplete_text )

+#

search_namespaces_into_full_tags = parsed_autocomplete_text.GetTagAutocompleteOptions().SearchNamespacesIntoFullTags()

predicates = HG.client_controller.Read( 'autocomplete_predicates', ClientTags.TAG_DISPLAY_ACTUAL, file_search_context, search_text = autocomplete_search_text, inclusive = parsed_autocomplete_text.inclusive, job_key = job_key, search_namespaces_into_full_tags = search_namespaces_into_full_tags )
@@ -254,6 +275,12 @@ def ReadFetch(

if not isinstance( results_cache, ClientSearch.PredicateResultsCacheMedia ):

matches = []

+AppendLoadingPredicate( matches, 'calculating results' )

+HG.client_controller.CallAfterQtSafe( win, 'read a/c exact match results', prefetch_callable, job_key, matches, parsed_autocomplete_text )

# it is possible that media will change between calls to this, so don't cache it

tags_managers = []

@@ -311,6 +338,28 @@ def ReadFetch(

return

+# we have data sans siblings and parents. send it as prefetch results, user will have _something_

+prefetch_predicates = [ ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, value = tag, inclusive = parsed_autocomplete_text.inclusive, count = ClientSearch.PredicateCount( current_count, pending_count, None, None ) ) for ( tag, ( current_count, pending_count ) ) in tags_to_count.items() ]

+prefetch_matches = ClientSearch.FilterPredicatesBySearchText( tag_service_key, autocomplete_search_text, prefetch_predicates )

+prefetch_matches = ClientSearch.SortPredicates( prefetch_matches )

+allow_auto_wildcard_conversion = True

+InsertTagPredicates( prefetch_matches, tag_service_key, parsed_autocomplete_text, allow_auto_wildcard_conversion, insert_if_does_not_exist = False )

+InsertOtherPredicatesForRead( prefetch_matches, parsed_autocomplete_text, include_unusual_predicate_types, under_construction_or_predicate )

+AppendLoadingPredicate( prefetch_matches, 'loading sibling data' )

+HG.client_controller.CallAfterQtSafe( win, 'read a/c exact match results', prefetch_callable, job_key, prefetch_matches, parsed_autocomplete_text )

+#

+# now spend time fetching siblings if needed

predicates = HG.client_controller.Read( 'media_predicates', tag_context, tags_to_count, parsed_autocomplete_text.inclusive, job_key = job_key )

results_cache = ClientSearch.PredicateResultsCacheMedia( predicates )

@@ -355,7 +404,7 @@ def ReadFetch(

return

-HG.client_controller.CallAfterQtSafe( win, 'read a/c fetch', results_callable, job_key, parsed_autocomplete_text, results_cache, matches )
+HG.client_controller.CallAfterQtSafe( win, 'read a/c full results', results_callable, job_key, parsed_autocomplete_text, results_cache, matches )

def PutAtTopOfMatches( matches: list, predicate: ClientSearch.Predicate, insert_if_does_not_exist: bool = True ):

@@ -414,7 +463,16 @@ def ShouldDoExactSearch( parsed_autocomplete_text: ClientSearch.ParsedAutocomple

return len( test_text ) <= exact_match_character_threshold

-def WriteFetch( win, job_key, results_callable, parsed_autocomplete_text: ClientSearch.ParsedAutocompleteText, file_search_context: ClientSearch.FileSearchContext, results_cache: ClientSearch.PredicateResultsCache ):
+def WriteFetch(
+win: QW.QWidget,
+job_key: ClientThreading.JobKey,
+prefetch_callable,
+results_callable,
+parsed_autocomplete_text: ClientSearch.ParsedAutocompleteText,
+file_search_context: ClientSearch.FileSearchContext,
+results_cache: ClientSearch.PredicateResultsCache
+):

tag_context = file_search_context.GetTagContext()
@ -432,41 +490,59 @@ def WriteFetch( win, job_key, results_callable, parsed_autocomplete_text: Client
|
|||
is_explicit_wildcard = parsed_autocomplete_text.IsExplicitWildcard( allow_auto_wildcard_conversion )
|
||||
|
||||
strict_search_text = parsed_autocomplete_text.GetSearchText( False, allow_auto_wildcard_conversion = allow_auto_wildcard_conversion )
|
||||
autocomplete_search_text = parsed_autocomplete_text.GetSearchText( True )
|
||||
autocomplete_search_text = parsed_autocomplete_text.GetSearchText( True, allow_auto_wildcard_conversion = allow_auto_wildcard_conversion )
|
||||
|
||||
small_exact_match_search = ShouldDoExactSearch( parsed_autocomplete_text )
|
||||
|
||||
if small_exact_match_search:
|
||||
if is_explicit_wildcard:
|
||||
|
||||
if not results_cache.CanServeTagResults( parsed_autocomplete_text, True ):
|
||||
|
||||
predicates = HG.client_controller.Read( 'autocomplete_predicates', ClientTags.TAG_DISPLAY_STORAGE, file_search_context, search_text = strict_search_text, exact_match = True, job_key = job_key )
|
||||
|
||||
results_cache = ClientSearch.PredicateResultsCacheTag( predicates, strict_search_text, True )
|
||||
|
||||
|
||||
matches = results_cache.FilterPredicates( display_tag_service_key, strict_search_text )
|
||||
cache_valid = False
|
||||
|
||||
else:
|
||||
|
||||
if is_explicit_wildcard:
    
    cache_valid = False
    
else:
    
-   cache_valid = results_cache.CanServeTagResults( parsed_autocomplete_text, False )
+   cache_valid = results_cache.CanServeTagResults( parsed_autocomplete_text, False, allow_auto_wildcard_conversion = allow_auto_wildcard_conversion )
    

if cache_valid:
    
    matches = results_cache.FilterPredicates( display_tag_service_key, autocomplete_search_text )
    
else:
    
+   original_exact_match_predicates = HG.client_controller.Read( 'autocomplete_predicates', ClientTags.TAG_DISPLAY_STORAGE, file_search_context, search_text = strict_search_text, exact_match = True, zero_count_ok = True, job_key = job_key )
+   
+   exact_match_predicates = list( original_exact_match_predicates )
+   
    small_exact_match_search = ShouldDoExactSearch( parsed_autocomplete_text )
    
    if small_exact_match_search:
        
-       matches = results_cache.FilterPredicates( display_tag_service_key, autocomplete_search_text )
+       results_cache = ClientSearch.PredicateResultsCacheTag( exact_match_predicates, strict_search_text, True )
+       
+       matches = results_cache.FilterPredicates( display_tag_service_key, strict_search_text )
        
    else:
        
+       exact_match_matches = ClientSearch.FilterPredicatesBySearchText( display_tag_service_key, autocomplete_search_text, exact_match_predicates )
+       
+       exact_match_matches = ClientSearch.SortPredicates( exact_match_matches )
+       
+       allow_auto_wildcard_conversion = False
+       
+       InsertTagPredicates( exact_match_matches, display_tag_service_key, parsed_autocomplete_text, allow_auto_wildcard_conversion )
+       
+       AppendLoadingPredicate( exact_match_matches, 'loading full results' )
+       
+       HG.client_controller.CallAfterQtSafe( win, 'write a/c exact match results', prefetch_callable, job_key, exact_match_matches, parsed_autocomplete_text )
+       
        if job_key.IsCancelled():
            
            return
            
        
        #
        
        search_namespaces_into_full_tags = parsed_autocomplete_text.GetTagAutocompleteOptions().SearchNamespacesIntoFullTags()
        
-       predicates = HG.client_controller.Read( 'autocomplete_predicates', ClientTags.TAG_DISPLAY_STORAGE, file_search_context, search_text = autocomplete_search_text, job_key = job_key, search_namespaces_into_full_tags = search_namespaces_into_full_tags )
+       predicates = HG.client_controller.Read( 'autocomplete_predicates', ClientTags.TAG_DISPLAY_STORAGE, file_search_context, search_text = autocomplete_search_text, job_key = job_key, zero_count_ok = True, search_namespaces_into_full_tags = search_namespaces_into_full_tags )
        
        if is_explicit_wildcard:
@@ -480,33 +556,22 @@ def WriteFetch( win, job_key, results_callable, parsed_autocomplete_text: Client
    
+   if not is_explicit_wildcard:
+       
+       # this lets us get sibling data for tags that do not exist with count in the domain
+       # we always do this, because results cache will not have current text input data
+       
+       input_text_predicates = HG.client_controller.Read( 'autocomplete_predicates', ClientTags.TAG_DISPLAY_STORAGE, file_search_context, search_text = strict_search_text, exact_match = True, zero_count_ok = True, job_key = job_key )
+       
+       for input_text_predicate in input_text_predicates:
+           
+           if ( input_text_predicate.HasIdealSibling() or input_text_predicate.HasParentPredicates() ) and input_text_predicate not in matches:
+               
+               matches.append( input_text_predicate )
+               
+           
+       
    
    if job_key.IsCancelled():
        
        return
        
    
    matches = ClientSearch.SortPredicates( matches )
    
    allow_auto_wildcard_conversion = False
    
    InsertTagPredicates( matches, display_tag_service_key, parsed_autocomplete_text, allow_auto_wildcard_conversion )
    
-   HG.client_controller.CallAfterQtSafe( win, 'write a/c fetch', results_callable, job_key, parsed_autocomplete_text, results_cache, matches )
+   HG.client_controller.CallAfterQtSafe( win, 'write a/c full results', results_callable, job_key, parsed_autocomplete_text, results_cache, matches )
    

class ListBoxTagsPredicatesAC( ClientGUIListBoxes.ListBoxTagsPredicates ):
    
    def __init__( self, parent, callable, service_key, float_mode, **kwargs ):
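The WriteFetch change above splits one results callback into two: a cheap exact-match "prefetch" is published first, with a loading stub appended, and the full results follow when the slower database read finishes. A minimal sketch of that two-phase pattern (not hydrus code; all names here are hypothetical stand-ins for the `prefetch_callable`/`results_callable` pair):

```python
def fetch_with_prefetch( query, quick_lookup, full_lookup, publish ):
    
    # phase 1: cheap exact-match results, plus a visible 'loading' stub,
    # so the user sees something selectable immediately
    quick_results = quick_lookup( query )
    
    publish( 'prefetch', quick_results + [ 'loading full results...' ] )
    
    # phase 2: the full (slower) results replace the prefetch list
    full_results = full_lookup( query )
    
    publish( 'full', full_results )
```

In the real code both phases run off the Qt thread and are marshalled back with `CallAfterQtSafe`; the sketch just shows the ordering contract.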
@@ -557,7 +622,7 @@ class ListBoxTagsPredicatesAC( ClientGUIListBoxes.ListBoxTagsPredicates ):
        return term
    
-   def SetPredicates( self, predicates ):
+   def SetPredicates( self, predicates, preserve_single_selection = False ):
        
        # need to do a clever compare, since normal predicate compare doesn't take count into account
        
@@ -585,6 +650,18 @@ class ListBoxTagsPredicatesAC( ClientGUIListBoxes.ListBoxTagsPredicates ):
        if not they_are_the_same:
            
+           previously_selected_predicate = None
+           
+           if len( self._selected_terms ) == 1:
+               
+               ( previously_selected_term, ) = self._selected_terms
+               
+               if isinstance( previously_selected_term, ClientGUIListBoxesData.ListBoxItemPredicate ):
+                   
+                   previously_selected_predicate = previously_selected_term.GetPredicate()
+                   
+               
+           
            # important to make own copy, as same object originals can be altered (e.g. set non-inclusive) in cache, and we need to notice that change just above
            self._predicates = [ predicate.GetCopy() for predicate in predicates ]
@@ -596,38 +673,45 @@ class ListBoxTagsPredicatesAC( ClientGUIListBoxes.ListBoxTagsPredicates ):
        self._DataHasChanged()
        
-       if len( predicates ) > 0:
-           
-           logical_index = 0
-           
-           if len( predicates ) > 1:
-               
-               skip_ors = True
-               
-               some_preds_have_count = True in ( predicate.GetCount().HasNonZeroCount() for predicate in predicates )
-               skip_countless = HG.client_controller.new_options.GetBoolean( 'ac_select_first_with_count' ) and some_preds_have_count
-               
-               for ( index, predicate ) in enumerate( predicates ):
-                   
-                   # now only apply this to simple tags, not wildcards and system tags
-                   
-                   if skip_ors and predicate.GetType() == ClientSearch.PREDICATE_TYPE_OR_CONTAINER:
-                       
-                       continue
-                       
-                   
-                   if skip_countless and predicate.GetType() in ( ClientSearch.PREDICATE_TYPE_PARENT, ClientSearch.PREDICATE_TYPE_TAG ) and predicate.GetCount().HasZeroCount():
-                       
-                       continue
-                       
-                   
-                   logical_index = index
-                   
-                   break
-                   
-               
-           
-           self._Hit( False, False, logical_index )
+       if len( self._predicates ) > 0:
+           
+           if preserve_single_selection and previously_selected_predicate is not None and previously_selected_predicate in self._predicates:
+               
+               logical_index_to_select = self._predicates.index( previously_selected_predicate )
+               
+           else:
+               
+               logical_index_to_select = 0
+               
+               if len( self._predicates ) > 1:
+                   
+                   skip_ors = True
+                   
+                   some_preds_have_count = True in ( predicate.GetCount().HasNonZeroCount() for predicate in self._predicates )
+                   skip_countless = HG.client_controller.new_options.GetBoolean( 'ac_select_first_with_count' ) and some_preds_have_count
+                   
+                   for ( index, predicate ) in enumerate( self._predicates ):
+                       
+                       # now only apply this to simple tags, not wildcards and system tags
+                       
+                       if skip_ors and predicate.GetType() == ClientSearch.PREDICATE_TYPE_OR_CONTAINER:
+                           
+                           continue
+                           
+                       
+                       if skip_countless and predicate.GetType() in ( ClientSearch.PREDICATE_TYPE_PARENT, ClientSearch.PREDICATE_TYPE_TAG ) and predicate.GetCount().HasZeroCount():
+                           
+                           continue
+                           
+                       
+                       logical_index_to_select = index
+                       
+                       break
+                       
+                   
+               
+           
+           self._Hit( False, False, logical_index_to_select )
@@ -1378,18 +1462,6 @@ class AutoCompleteDropdown( QW.QWidget, CAC.ApplicationCommandProcessorMixin ):
        return command_processed
    
-   def SetFetchedResults( self, job_key: ClientThreading.JobKey, parsed_autocomplete_text: ClientSearch.ParsedAutocompleteText, results_cache: ClientSearch.PredicateResultsCache, results: list ):
-       
-       if self._current_fetch_job_key is not None and self._current_fetch_job_key.GetKey() == job_key.GetKey():
-           
-           self._CancelSearchResultsFetchJob()
-           
-           self._results_cache = results_cache
-           
-           self._SetResultsToList( results, parsed_autocomplete_text )
-           
-       
    def SetForceDropdownHide( self, value ):
        
        self._force_dropdown_hide = value
@@ -1411,6 +1483,8 @@ class AutoCompleteDropdownTags( AutoCompleteDropdown ):
        tag_service_key = CC.COMBINED_TAG_SERVICE_KEY
        
+   self._last_prefetch_job_key = None
+   
    self._tag_service_key = tag_service_key
    
    AutoCompleteDropdown.__init__( self, parent )
@@ -1506,9 +1580,9 @@ class AutoCompleteDropdownTags( AutoCompleteDropdown ):
        self._location_context_button.SetValue( location_context )
    
-   def _SetResultsToList( self, results, parsed_autocomplete_text: ClientSearch.ParsedAutocompleteText ):
+   def _SetResultsToList( self, results, parsed_autocomplete_text: ClientSearch.ParsedAutocompleteText, preserve_single_selection = False ):
        
-       self._search_results_list.SetPredicates( results )
+       self._search_results_list.SetPredicates( results, preserve_single_selection = preserve_single_selection )
        
        self._current_list_parsed_autocomplete_text = parsed_autocomplete_text
@@ -1611,16 +1685,45 @@ class AutoCompleteDropdownTags( AutoCompleteDropdown ):
        self._favourites_list.SetPredicates( predicates )
    
+   def SetFetchedResults( self, job_key: ClientThreading.JobKey, parsed_autocomplete_text: ClientSearch.ParsedAutocompleteText, results_cache: ClientSearch.PredicateResultsCache, results: list ):
+       
+       if self._current_fetch_job_key is not None and self._current_fetch_job_key.GetKey() == job_key.GetKey():
+           
+           preserve_single_selection = False
+           
+           if self._last_prefetch_job_key == self._current_fetch_job_key:
+               
+               # we are completing a prefetch, so see if we can preserve if the user moved position
+               preserve_single_selection = True
+               
+           
+           if self._results_cache == results_cache and len( self._search_results_list ) >= len( results ):
+               
+               # if we are filtering down existing results, then preserve selection
+               # don't preserve on user backspace, filtering up, it is confusing
+               preserve_single_selection = True
+               
+           
+           self._CancelSearchResultsFetchJob()
+           
+           self._results_cache = results_cache
+           
+           self._SetResultsToList( results, parsed_autocomplete_text, preserve_single_selection = preserve_single_selection )
+           
+       
    def SetLocationContext( self, location_context: ClientLocation.LocationContext ):
        
        self._SetLocationContext( location_context )
        
    
-   def SetStubPredicates( self, job_key, stub_predicates, parsed_autocomplete_text ):
+   def SetPrefetchResults( self, job_key: ClientThreading.JobKey, predicates: typing.List[ ClientSearch.Predicate ], parsed_autocomplete_text: ClientSearch.ParsedAutocompleteText ):
        
        if self._current_fetch_job_key is not None and self._current_fetch_job_key.GetKey() == job_key.GetKey():
            
-           self._SetResultsToList( stub_predicates, parsed_autocomplete_text )
+           self._last_prefetch_job_key = self._current_fetch_job_key
+           
+           self._SetResultsToList( predicates, parsed_autocomplete_text, preserve_single_selection = False )
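The `preserve_single_selection` logic above re-selects the user's previously selected predicate when the list refreshes with full results, falling back to the first item with a count. A small sketch of that selection rule in isolation (hypothetical names, not hydrus code; `has_count` stands in for the `skip_countless` check):

```python
def choose_index_to_select( new_items, previously_selected = None, has_count = None ):
    
    # if the old selection survived the refresh, keep pointing at it
    if previously_selected is not None and previously_selected in new_items:
        
        return new_items.index( previously_selected )
        
    
    # otherwise prefer the first 'useful' item, e.g. the first with a count
    if has_count is not None:
        
        for ( index, item ) in enumerate( new_items ):
            
            if has_count( item ):
                
                return index
                
            
        
    
    return 0
```

The real code works in terms of predicate objects and `self._Hit(...)`, but the fallback ordering is the same: previous selection, then first counted item, then index 0.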
@@ -2139,14 +2242,6 @@ class AutoCompleteDropdownTagsRead( AutoCompleteDropdownTags ):
        
        parsed_autocomplete_text = self._GetParsedAutocompleteText()
        
-       stub_predicates = []
-       
-       InsertOtherPredicatesForRead( stub_predicates, parsed_autocomplete_text, self._include_unusual_predicate_types, self._under_construction_or_predicate )
-       
-       AppendLoadingPredicate( stub_predicates )
-       
-       HG.client_controller.CallLaterQtSafe( self, 0.2, 'set stub predicates', self.SetStubPredicates, job_key, stub_predicates, parsed_autocomplete_text )
-       
        fsc = self.GetFileSearchContext()
        
        if self._under_construction_or_predicate is None:
        
@@ -2158,7 +2253,7 @@ class AutoCompleteDropdownTagsRead( AutoCompleteDropdownTags ):
        under_construction_or_predicate = self._under_construction_or_predicate.Duplicate()
        
-       HG.client_controller.CallToThread( ReadFetch, self, job_key, self.SetFetchedResults, parsed_autocomplete_text, self._media_callable, fsc, self._search_pause_play.IsOn(), self._include_unusual_predicate_types, self._results_cache, under_construction_or_predicate, self._force_system_everything )
+       HG.client_controller.CallToThread( ReadFetch, self, job_key, self.SetPrefetchResults, self.SetFetchedResults, parsed_autocomplete_text, self._media_callable, fsc, self._search_pause_play.IsOn(), self._include_unusual_predicate_types, self._results_cache, under_construction_or_predicate, self._force_system_everything )
        
    def _ShouldTakeResponsibilityForEnter( self ):
@@ -2778,21 +2873,11 @@ class AutoCompleteDropdownTagsWrite( AutoCompleteDropdownTags ):
        
        parsed_autocomplete_text = self._GetParsedAutocompleteText()
        
-       stub_predicates = []
-       
-       allow_auto_wildcard_conversion = False
-       
-       InsertTagPredicates( stub_predicates, self._display_tag_service_key, parsed_autocomplete_text, allow_auto_wildcard_conversion )
-       
-       AppendLoadingPredicate( stub_predicates )
-       
-       HG.client_controller.CallLaterQtSafe( self, 0.2, 'set stub predicates', self.SetStubPredicates, job_key, stub_predicates, parsed_autocomplete_text )
-       
        tag_context = ClientSearch.TagContext( service_key = self._tag_service_key, display_service_key = self._display_tag_service_key )
        
        file_search_context = ClientSearch.FileSearchContext( location_context = self._location_context_button.GetValue(), tag_context = tag_context )
        
-       HG.client_controller.CallToThread( WriteFetch, self, job_key, self.SetFetchedResults, parsed_autocomplete_text, file_search_context, self._results_cache )
+       HG.client_controller.CallToThread( WriteFetch, self, job_key, self.SetPrefetchResults, self.SetFetchedResults, parsed_autocomplete_text, file_search_context, self._results_cache )
        
    def _TakeResponsibilityForEnter( self, shift_down ):
@@ -343,7 +343,7 @@ class PanelPredicateSystemSingle( PanelPredicateSystem ):
        return len( custom_defaults ) > 0
    
-class PanelPredicateSystemAgeDate( PanelPredicateSystemSingle ):
+class PanelPredicateSystemDate( PanelPredicateSystemSingle ):
    
    def __init__( self, parent, predicate ):
    
@@ -353,11 +353,13 @@ class PanelPredicateSystemAgeDate( PanelPredicateSystemSingle ):
        
        self._date = QW.QCalendarWidget( self )
        
+       self._time = QW.QTimeEdit( self )
+       
        #
        
        predicate = self._GetPredicateToInitialisePanelWith( predicate )
        
-       ( sign, age_type, ( years, months, days ) ) = predicate.GetValue()
+       ( sign, age_type, ( years, months, days, hours, minutes ) ) = predicate.GetValue()
        
        self._sign.SetValue( sign )
        
@@ -365,19 +367,36 @@ class PanelPredicateSystemAgeDate( PanelPredicateSystemSingle ):
        
        self._date.setSelectedDate( qt_dt )
        
+       qt_t = QC.QTime( hours, minutes )
+       
+       self._time.setTime( qt_t )
+       
        #
        
        hbox = QP.HBoxLayout()
        
-       QP.AddToLayout( hbox, ClientGUICommon.BetterStaticText(self,'system:import time'), CC.FLAGS_CENTER_PERPENDICULAR )
+       system_pred_label = self._GetSystemPredicateLabel()
+       
+       QP.AddToLayout( hbox, ClientGUICommon.BetterStaticText( self, system_pred_label ), CC.FLAGS_CENTER_PERPENDICULAR )
        QP.AddToLayout( hbox, self._sign, CC.FLAGS_CENTER_PERPENDICULAR )
        QP.AddToLayout( hbox, self._date, CC.FLAGS_CENTER_PERPENDICULAR )
+       QP.AddToLayout( hbox, self._time, CC.FLAGS_CENTER_PERPENDICULAR )
        
        hbox.addStretch( 1 )
        
        self.setLayout( hbox )
        
    
+   def _GetSystemPredicateLabel( self ) -> str:
+       
+       raise NotImplementedError()
+       
+   
+   def _GetPredicateType( self ) -> int:
+       
+       raise NotImplementedError()
+       
+   
    def GetDefaultPredicate( self ) -> ClientSearch.Predicate:
        
        qt_dt = QC.QDate.currentDate()
@@ -388,7 +407,12 @@ class PanelPredicateSystemAgeDate( PanelPredicateSystemSingle ):
        month = qt_dt.month()
        day = qt_dt.day()
        
-       return ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_AGE, ( '>', 'date', ( year, month, day ) ) )
+       hour = 0
+       minute = 0
+       
+       predicate_type = self._GetPredicateType()
+       
+       return ClientSearch.Predicate( predicate_type, ( '>', 'date', ( year, month, day, hour, minute ) ) )
        
    def GetPredicates( self ):
    
@@ -399,11 +423,76 @@ class PanelPredicateSystemAgeDate( PanelPredicateSystemSingle ):
        month = qt_dt.month()
        day = qt_dt.day()
        
-       predicates = ( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_AGE, ( self._sign.GetValue(), 'date', ( year, month, day ) ) ), )
+       qt_t = self._time.time()
+       
+       hour = qt_t.hour()
+       minute = qt_t.minute()
+       
+       predicate_type = self._GetPredicateType()
+       
+       predicates = ( ClientSearch.Predicate( predicate_type, ( self._sign.GetValue(), 'date', ( year, month, day, hour, minute ) ) ), )
        
        return predicates
        
    
+class PanelPredicateSystemAgeDate( PanelPredicateSystemDate ):
+    
+    def _GetSystemPredicateLabel( self ) -> str:
+        
+        return 'system:import time'
+        
+    
+    def _GetPredicateType( self ) -> int:
+        
+        return ClientSearch.PREDICATE_TYPE_SYSTEM_AGE
+        
+    
+    def GetDefaultPredicate( self ) -> ClientSearch.Predicate:
+        
+        qt_dt = QC.QDate.currentDate()
+        
+        qt_dt.addDays( -7 )
+        
+        year = qt_dt.year()
+        month = qt_dt.month()
+        day = qt_dt.day()
+        
+        hour = 0
+        minute = 0
+        
+        predicate_type = self._GetPredicateType()
+        
+        return ClientSearch.Predicate( predicate_type, ( '<', 'date', ( year, month, day, hour, minute ) ) )
+        
+    
+class PanelPredicateSystemLastViewedDate( PanelPredicateSystemDate ):
+    
+    def _GetSystemPredicateLabel( self ) -> str:
+        
+        return 'system:last viewed date'
+        
+    
+    def _GetPredicateType( self ) -> int:
+        
+        return ClientSearch.PREDICATE_TYPE_SYSTEM_LAST_VIEWED_TIME
+        
+    
+class PanelPredicateSystemModifiedDate( PanelPredicateSystemDate ):
+    
+    def _GetSystemPredicateLabel( self ) -> str:
+        
+        return 'system:modified date'
+        
+    
+    def _GetPredicateType( self ) -> int:
+        
+        return ClientSearch.PREDICATE_TYPE_SYSTEM_MODIFIED_TIME
+        
+    
class PanelPredicateSystemAgeDelta( PanelPredicateSystemSingle ):
    
    def __init__( self, parent, predicate ):
@@ -412,10 +501,10 @@ class PanelPredicateSystemAgeDelta( PanelPredicateSystemSingle ):
        
        self._sign = TimeDeltaOperator( self )
        
-       self._years = ClientGUICommon.BetterSpinBox( self, max=30, width = 60 )
-       self._months = ClientGUICommon.BetterSpinBox( self, max=60, width = 60 )
-       self._days = ClientGUICommon.BetterSpinBox( self, max=90, width = 60 )
-       self._hours = ClientGUICommon.BetterSpinBox( self, max=24, width = 60 )
+       self._years = ClientGUICommon.BetterSpinBox( self, max = 30, width = 60 )
+       self._months = ClientGUICommon.BetterSpinBox( self, max = 60, width = 60 )
+       self._days = ClientGUICommon.BetterSpinBox( self, max = 90, width = 60 )
+       self._hours = ClientGUICommon.BetterSpinBox( self, max = 24, width = 60 )
        
        #
@@ -462,67 +551,6 @@ class PanelPredicateSystemAgeDelta( PanelPredicateSystemSingle ):
        return predicates
    
-class PanelPredicateSystemLastViewedDate( PanelPredicateSystemSingle ):
-    
-    def __init__( self, parent, predicate ):
-        
-        PanelPredicateSystemSingle.__init__( self, parent )
-        
-        self._sign = TimeDateOperator( self )
-        
-        self._date = QW.QCalendarWidget( self )
-        
-        #
-        
-        predicate = self._GetPredicateToInitialisePanelWith( predicate )
-        
-        ( sign, age_type, ( years, months, days ) ) = predicate.GetValue()
-        
-        self._sign.SetValue( sign )
-        
-        qt_dt = QC.QDate( years, months, days )
-        
-        self._date.setSelectedDate( qt_dt )
-        
-        #
-        
-        hbox = QP.HBoxLayout()
-        
-        QP.AddToLayout( hbox, ClientGUICommon.BetterStaticText(self,'system:last viewed date'), CC.FLAGS_CENTER_PERPENDICULAR )
-        QP.AddToLayout( hbox, self._sign, CC.FLAGS_CENTER_PERPENDICULAR )
-        QP.AddToLayout( hbox, self._date, CC.FLAGS_CENTER_PERPENDICULAR )
-        
-        hbox.addStretch( 1 )
-        
-        self.setLayout( hbox )
-        
-    
-    def GetDefaultPredicate( self ) -> ClientSearch.Predicate:
-        
-        qt_dt = QC.QDate.currentDate()
-        
-        qt_dt.addDays( -7 )
-        
-        year = qt_dt.year()
-        month = qt_dt.month()
-        day = qt_dt.day()
-        
-        return ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_LAST_VIEWED_TIME, ( '>', 'date', ( year, month, day ) ) )
-        
-    
-    def GetPredicates( self ):
-        
-        qt_dt = self._date.selectedDate()
-        
-        year = qt_dt.year()
-        month = qt_dt.month()
-        day = qt_dt.day()
-        
-        predicates = ( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_LAST_VIEWED_TIME, ( self._sign.GetValue(), 'date', ( year, month, day ) ) ), )
-        
-        return predicates
-        
-    
class PanelPredicateSystemLastViewedDelta( PanelPredicateSystemSingle ):
    
    def __init__( self, parent, predicate ):
@@ -581,67 +609,6 @@ class PanelPredicateSystemLastViewedDelta( PanelPredicateSystemSingle ):
        return predicates
    
-class PanelPredicateSystemModifiedDate( PanelPredicateSystemSingle ):
-    
-    def __init__( self, parent, predicate ):
-        
-        PanelPredicateSystemSingle.__init__( self, parent )
-        
-        self._sign = TimeDateOperator( self )
-        
-        self._date = QW.QCalendarWidget( self )
-        
-        #
-        
-        predicate = self._GetPredicateToInitialisePanelWith( predicate )
-        
-        ( sign, age_type, ( years, months, days ) ) = predicate.GetValue()
-        
-        self._sign.SetValue( sign )
-        
-        qt_dt = QC.QDate( years, months, days )
-        
-        self._date.setSelectedDate( qt_dt )
-        
-        #
-        
-        hbox = QP.HBoxLayout()
-        
-        QP.AddToLayout( hbox, ClientGUICommon.BetterStaticText(self,'system:modified date'), CC.FLAGS_CENTER_PERPENDICULAR )
-        QP.AddToLayout( hbox, self._sign, CC.FLAGS_CENTER_PERPENDICULAR )
-        QP.AddToLayout( hbox, self._date, CC.FLAGS_CENTER_PERPENDICULAR )
-        
-        hbox.addStretch( 1 )
-        
-        self.setLayout( hbox )
-        
-    
-    def GetDefaultPredicate( self ) -> ClientSearch.Predicate:
-        
-        qt_dt = QC.QDate.currentDate()
-        
-        qt_dt.addDays( -7 )
-        
-        year = qt_dt.year()
-        month = qt_dt.month()
-        day = qt_dt.day()
-        
-        return ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_MODIFIED_TIME, ( '>', 'date', ( year, month, day ) ) )
-        
-    
-    def GetPredicates( self ):
-        
-        qt_dt = self._date.selectedDate()
-        
-        year = qt_dt.year()
-        month = qt_dt.month()
-        day = qt_dt.day()
-        
-        predicates = ( ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_MODIFIED_TIME, ( self._sign.GetValue(), 'date', ( year, month, day ) ) ), )
-        
-        return predicates
-        
-    
class PanelPredicateSystemModifiedDelta( PanelPredicateSystemSingle ):
    
    def __init__( self, parent, predicate ):
@@ -249,7 +249,7 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):
        serialisable_source_urls = list( self._source_urls )
        serialisable_tags = list( self._tags )
        serialisable_names_and_notes_dict = list( self._names_and_notes_dict.items() )
-       serialisable_hashes = [ ( hash_type, hash.hex() ) for ( hash_type, hash ) in list(self._hashes.items()) if hash is not None ]
+       serialisable_hashes = [ ( hash_type, hash.hex() ) for ( hash_type, hash ) in self._hashes.items() if hash is not None ]
        
        return (
            self.file_seed_type,
@@ -22,24 +22,16 @@ from hydrus.client.metadata import ClientTags

def FilterServiceKeysToContentUpdates( full_service_keys_to_content_updates, hashes ):
    
-   filtered_service_keys_to_content_updates = {}
-   
    if not isinstance( hashes, set ):
        
        hashes = set( hashes )
        
    
+   filtered_service_keys_to_content_updates = collections.defaultdict( list )
+   
    for ( service_key, full_content_updates ) in full_service_keys_to_content_updates.items():
        
-       filtered_content_updates = []
-       
-       for content_update in full_content_updates:
-           
-           if not hashes.isdisjoint( content_update.GetHashes() ):
-               
-               filtered_content_updates.append( content_update )
-               
-           
-       
+       filtered_content_updates = [ content_update for content_update in full_content_updates if not hashes.isdisjoint( content_update.GetHashes() ) ]
        
        if len( filtered_content_updates ) > 0:
        
@@ -48,6 +40,7 @@ def FilterServiceKeysToContentUpdates( full_service_keys_to_content_updates, has
    
    return filtered_service_keys_to_content_updates
    

def FlattenMedia( media_list ):
    
@@ -1289,6 +1282,11 @@ class MediaList( object ):
    
    def ProcessContentUpdates( self, full_service_keys_to_content_updates ):
        
+       if len( full_service_keys_to_content_updates ) == 0:
+           
+           return
+           
+       
        service_keys_to_content_updates = FilterServiceKeysToContentUpdates( full_service_keys_to_content_updates, self._hashes )
        
        if len( service_keys_to_content_updates ) == 0:
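The refactored `FilterServiceKeysToContentUpdates` above replaces an explicit accumulator loop with a `defaultdict` plus a list comprehension keyed on `set.isdisjoint`. A self-contained sketch of the same shape (hypothetical data model: each "update" here is just a `( hashes, payload )` tuple standing in for a `ContentUpdate`):

```python
import collections

def filter_updates_to_hashes( service_keys_to_updates, hashes ):
    
    # normalise once so isdisjoint is a fast set-vs-set check
    if not isinstance( hashes, set ):
        
        hashes = set( hashes )
        
    
    filtered = collections.defaultdict( list )
    
    for ( service_key, updates ) in service_keys_to_updates.items():
        
        # keep only updates that touch at least one of our hashes
        relevant = [ update for update in updates if not hashes.isdisjoint( update[ 0 ] ) ]
        
        if len( relevant ) > 0:
            
            filtered[ service_key ] = relevant
            
        
    
    return filtered
```

`isdisjoint` short-circuits on the first common element, so this avoids building intersection sets just to test for overlap.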
@@ -223,6 +223,12 @@ def ParseClientLegacyArgs( args: dict ):
        
        new_service_dict_param_name = ConvertLegacyServiceNameParamToKey( legacy_service_dict_param_name )
        
+       # little hack for a super old obsolete thing, it got renamed more significantly
+       if new_service_dict_param_name == 'service_keys_to_tags':
+           
+           parsed_request_args[ 'service_keys_to_additional_tags' ] = service_keys_to_gubbins
+           
+       
        parsed_request_args[ new_service_dict_param_name ] = service_keys_to_gubbins
@@ -84,7 +84,7 @@ options = {}
# Misc

NETWORK_VERSION = 20
-SOFTWARE_VERSION = 517
+SOFTWARE_VERSION = 518
CLIENT_API_VERSION = 42

SERVER_THUMBNAIL_DIMENSIONS = ( 200, 200 )
@@ -1829,6 +1829,7 @@ class ContentUpdate( object ):
        self._action = action
        self._row = row
        self._reason = reason
+       self._hashes = None
        
    def __eq__( self, other ):
    
@@ -1863,6 +1864,11 @@ class ContentUpdate( object ):
    
    def GetHashes( self ):
        
+       if self._hashes is not None:
+           
+           return self._hashes
+           
+       
        hashes = set()
        
        if self._data_type == HC.CONTENT_TYPE_FILES:
    
@@ -1953,6 +1959,8 @@ class ContentUpdate( object ):
        hashes = set( hashes )
        
+       self._hashes = hashes
+       
        return hashes
        
    
@@ -1997,6 +2005,8 @@ class ContentUpdate( object ):
        
        self._row = row
        
+       self._hashes = None
+       
    def ToTuple( self ):
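The `ContentUpdate` change above is a classic memoisation with manual invalidation: `GetHashes` computes its set once, caches it in `self._hashes`, and any mutation of the row resets the cache to `None`. A minimal sketch of the pattern (illustrative class, not hydrus code):

```python
class CachedHashesExample:
    
    def __init__( self, row ):
        
        self._row = row
        self._hashes = None # cache slot, None means 'not computed yet'
        
    
    def GetHashes( self ):
        
        # serve the cached set if we already computed it
        if self._hashes is not None:
            
            return self._hashes
            
        
        # stand-in for the real per-data-type extraction work
        self._hashes = set( self._row )
        
        return self._hashes
        
    
    def SetRow( self, row ):
        
        self._row = row
        self._hashes = None # invalidate: the row changed under us
```

The important detail mirrored from the diff is that `SetRow` resets the cache; forgetting that step would serve stale hashes after the row is swapped.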
@@ -70,6 +70,7 @@ mpv_report_mode = False
force_idle_mode = False
no_page_limit_mode = False
+thumbnail_debug_mode = False
autocomplete_delay_mode = False

do_idle_shutdown_work = False
shutdown_complete = False
@@ -0,0 +1,100 @@
+import collections
+from hydrus.core import HydrusData
+
+try:
+    
+    import pympler
+    
+    from pympler import asizeof
+    from pympler import muppy
+    from pympler import summary
+    from pympler import classtracker
+    from pympler import tracker
+    
+    PYMPLER_OK = True
+    
+except:
+    
+    PYMPLER_OK = False
+    
+
+CURRENT_TRACKER = None
+
+# good examples here:
+# https://pympler.readthedocs.io/en/latest/muppy.html#muppy
+# this can do other stuff, class tracking and even charts with matplotlib
+
+# pretty sure the Client should only ever call this stuff on the GUI thread of course, since it'll be touching Qt stuff
+
+def CheckPymplerOK():
+    
+    if not PYMPLER_OK:
+        
+        raise Exception( 'Pympler is not available!' )
+        
+    
+
+def PrintCurrentMemoryUse( classes_to_track = None ):
+    
+    CheckPymplerOK()
+    
+    HydrusData.Print( '---printing memory use to log---' )
+    
+    all_objects = muppy.get_objects()
+    
+    sm = summary.summarize( all_objects )
+    
+    summary.print_( sm, limit = 500 )
+    
+    HydrusData.DebugPrint( '----memory-use snapshot done----' )
+    
+    if classes_to_track is None:
+        
+        return
+        
+    
+    HydrusData.Print( '----printing class use to log---' )
+    
+    ct = classtracker.ClassTracker()
+    
+    for o in all_objects:
+        
+        if isinstance( o, classes_to_track ):
+            
+            ct.track_object( o )
+            
+        
+    
+    ct.create_snapshot()
+    
+    ct.stats.print_summary()
+    
+    HydrusData.DebugPrint( '-----class-use snapshot done----' )
+    
+
+def PrintSnapshotDiff():
+    
+    CheckPymplerOK()
+    
+    global CURRENT_TRACKER
+    
+    if CURRENT_TRACKER is None:
+        
+        TakeMemoryUseSnapshot()
+        
+    
+    HydrusData.Print( '---printing memory diff to log--' )
+    
+    diff = CURRENT_TRACKER.diff()
+    
+    summary.print_( diff, limit = 500 )
+    
+    HydrusData.DebugPrint( '----memory-use snapshot done----' )
+    
+
+def TakeMemoryUseSnapshot():
+    
+    global CURRENT_TRACKER
+    
+    CURRENT_TRACKER = tracker.SummaryTracker()
@@ -116,7 +116,7 @@ class Value( Enum ):
    HASHLIST_WITH_ALGORITHM = auto() # A 2-tuple, where the first part is a set of potential hashes (as strings), the second part is one of 'sha256', 'md5', 'sha1', 'sha512'
    FILETYPE_LIST = auto() # A set of file types using the enum set in InitialiseFiletypes as defined in FILETYPES
    # Either a tuple of 4 non-negative integers: (years, months, days, hours) where the latter is < 24 OR
-   # a datetime.date object. For the latter, only the YYYY-MM-DD format is accepted.
+   # a datetime.datetime object. For the latter, only the YYYY-MM-DD format is accepted.
    # dateutils has a function to try to guess and parse arbitrary date formats but I didn't use it here since it would be an additional dependency.
    DATE_OR_TIME_INTERVAL = auto()
    TIME_SEC_MSEC = auto() # A tuple of two non-negative integers: (seconds, milliseconds) where the latter is <1000
    
@@ -321,7 +321,8 @@ def parse_value( string: str, spec ):
            return string[ len( match[ 0 ] ): ], (years, months, days, hours)
        match = re.match( '(?P<year>[0-9][0-9][0-9][0-9])-(?P<month>[0-9][0-9]?)-(?P<day>[0-9][0-9]?)', string )
        if match:
-           return string[ len( match[ 0 ] ): ], datetime.date( int( match.group( 'year' ) ), int( match.group( 'month' ) ), int( match.group( 'day' ) ) )
+           # good expansion here would be to parse a full date with 08:20am kind of thing, but we'll wait for better datetime parsing library for that I think!
+           return string[ len( match[ 0 ] ): ], datetime.datetime( int( match.group( 'year' ) ), int( match.group( 'month' ) ), int( match.group( 'day' ) ) )
        raise ValueError( "Invalid value, expected a date or a time interval" )
    elif spec == Value.TIME_SEC_MSEC:
        match = re.match( '((?P<sec>0|([1-9][0-9]*))\s*(seconds|second|secs|sec|s))?\s*((?P<msec>0|([1-9][0-9]*))\s*(milliseconds|millisecond|msecs|msec|ms))?', string )
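The `parse_value` change above switches the YYYY-MM-DD branch from `datetime.date` to `datetime.datetime` (midnight), so the parsed value now carries hour/minute slots for the new date-with-time predicates. A standalone sketch of just that branch (hypothetical function name; the real parser returns the unconsumed remainder of the string the same way):

```python
import datetime
import re

def parse_leading_date( string ):
    
    # same shape as the parser's regex: 4-digit year, 1-2 digit month/day
    match = re.match( r'(?P<year>[0-9]{4})-(?P<month>[0-9]{1,2})-(?P<day>[0-9]{1,2})', string )
    
    if match is None:
        
        raise ValueError( 'Invalid value, expected a date' )
        
    
    rest = string[ len( match[ 0 ] ): ]
    
    # datetime.datetime rather than datetime.date: hours/minutes default to 0
    return rest, datetime.datetime( int( match.group( 'year' ) ), int( match.group( 'month' ) ), int( match.group( 'day' ) ) )
```

As the diff's own comment notes, a clock-time suffix ('08:20am' style) is a possible future expansion; this sketch, like the current code, only consumes the date.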
@@ -1,5 +1,6 @@
cbor2
cryptography
+pympler
python-dateutil

beautifulsoup4>=4.0.0

@@ -1,5 +1,6 @@
cbor2
cryptography
+pympler
python-dateutil

beautifulsoup4>=4.0.0

@@ -1,5 +1,6 @@
cbor2
cryptography
+pympler
python-dateutil

beautifulsoup4>=4.0.0

@@ -1,5 +1,6 @@
cbor2
cryptography
+pympler
python-dateutil

beautifulsoup4>=4.0.0

@@ -1,5 +1,6 @@
cbor2
cryptography
+pympler
python-dateutil

beautifulsoup4>=4.0.0