parent b9687b41e5
commit 583a6f282c
@@ -94,20 +94,6 @@ jobs:
 name: Build docs to /help
 run: mkdocs build -d help
 working-directory: hydrus
-#- name: Cache Qt
-#  id: cache-qt
-#  uses: actions/cache@v1
-#  with:
-#    path: Qt
-#    key: ${{ runner.os }}-QtCache
-#-
-#  name: Install Qt
-#  uses: jurplel/install-qt-action@v2
-#  with:
-#    install-deps: true
-#    setup-python: 'false'
-#    modules: qtcharts qtwidgets qtgui qtcore
-#    cached: ${{ steps.cache-qt.outputs.cache-hit }}
 -
 name: Build Hydrus
 run: |
@@ -171,21 +157,6 @@ jobs:
 name: Build docs to /help
 run: mkdocs build -d help
 working-directory: hydrus
-#-
-#  name: Cache Qt
-#  id: cache_qt
-#  uses: actions/cache@v1
-#  with:
-#    path: ../Qt
-#    key: ${{ runner.os }}-QtCache
-#-
-#  name: Install Qt
-#  uses: jurplel/install-qt-action@v2
-#  with:
-#    install-deps: true
-#    setup-python: 'false'
-#    modules: qtcharts qtwidgets qtgui qtcore
-#    cached: ${{ steps.cache_qt.outputs.cache-hit }}
 -
 name: Download mpv-dev
 uses: carlosperate/download-file-action@v1.1.1
@@ -7,6 +7,31 @@ title: Changelog
 !!! note
     This is the new changelog, only the most recent builds. For all versions, see the [old changelog](old_changelog.html).
 
+## [Version 525](https://github.com/hydrusnetwork/hydrus/releases/tag/v525)
+
+### library updates
+
+* after successful testing amongst source users, I am finally updating the official builds and the respective requirements.txts for Qt, from 6.3.1 to 6.4.1 (with 'test' now 6.5.0), opencv-python-headless from 4.5.3.56 to 4.5.5.64 (with a new 'test' of 4.7.0.72), and in the Windows build, the mpv dll from 2022-05-01 to 2023-02-12 (API 2.0 to 2.1). if you use my normal builds, you don't have to do anything special in the update, and with luck you'll get slightly faster images, video, and UI, and with fewer bugs. if you run from source, you might want to re-run your setup_venv script--it'll update you automatically--and if you are a modern Windows source user and haven't yet, grab the new dll here and rename it to mpv-2.dll https://sourceforge.net/projects/mpv-player-windows/files/libmpv/mpv-dev-x86_64-20230212-git-a40958c.7z . there is a chance that some older OSes will not be able to boot this new build, but I think these people were already migrated to being source users when Win 7-level was no longer supported. in any case, let me know how you get on, and if you are on an older OS, be prepared to rollback if this version doesn't boot
+* setup_venv.bat (Windows source) now adds PyWin32, just like the builds (the new version of pympler, a memory management module, moans on boot if it doesn't have it)
+
+### timestamps
+
+* a couple places where fixed calendar time-deltas are converted to absolute datestrings now work better over longer times. going back (5 years, 3 months) should now work out the actual calendar dates (previously they used a rough total_num_seconds estimation) and go back to the same day of the destination month, also accounting for if that has fewer days than the starting month and handling leap years. it also handles >'12 months' better now
+* in system:time predicates that use since/before a delta, it now allows much larger values in the UI, like '72 months', and it won't merge those into the larger values in the label. so if you set a gap of 100 days, it'll say that, not 3 months 10 days or whatever
+* the main copy button on 'manage file times' is now a menu button letting you choose to copy all timestamps or just those for the file services. as a hacky experiment, you can also copy the file service timestamps plus one second (in case you want to try finick-ily going through a handful of files to force a certain import sort order)
+* the system predicate time parsing is now more flexible. for archived, modified, last viewed, and imported time, you can now generally say all variants in the form 'import' or 'imported' and 'time' or 'date' and 'time imported' or 'imported time'.
+* fixed an issue that meant editing existing delta 'system:archived time' predicates was launching the 'date' edit panel
+
+### misc
+
+* in the 'exif and other embedded metadata' review window, which is launched from a button on the media viewer's top hover, jpegs now state their subsampling and whether they are progressive
+* every simple place where the client eats clipboard data and tries to import something now has a unified error-reporting process. before, it would make a popup with something like 'I could not understand what was in the clipboard!'. Now it makes a popup with info on what was pasted, what was expected, and actual exception info. Longer info is printed to the log
+* many places across the program say the specific exception type when they report errors now, not just the string summary
+* the sankaku downloader is updated with a new url class for their new md5 links. also, the file parser is updated to associate the old id URL, and the gallery parser is updated to skip the 'get sank pro' thumbnail links if you are not logged in. if you have sank subscriptions, they are going to go crazy this week due to the URL format changing--sorry, there's no nice way around it!--just ignore their popups about hitting file limits and wait them out. unfortunately, due to an unusual 404-based redirect, the id-based URLs will not work in hydrus any more
+* the 'API URL' system for url classes now supports File URLs--this may help you figure out some CDN redirects and similar. in a special rule for these File URLs, both URLs will be associated with the imported file (normally, Post API URLs are not saved as Known URLs). relatedly, I have renamed this system broadly to 'api/redirect url', since we use it for a bunch of non-API stuff now
+* fixed a problem where deleting one of the new inc/dec rating services was not clearing the actual number ratings for that service from the database, causing service-id error hell on loading files with those orphaned rating records. sorry for the trouble, this slipped through testing! any users who were affected by this will also be fixed (orphan records cleared out) on update (issue #1357)
+* the client cleans up the temporary paths used by file imports more carefully now: it tries more times to delete 'sticky' temp files; it tries to clear them again immediately on shutdown; and it stores them all in the hydrus temp subdirectory where they are less loose and will be captured by the final directory clear on shutdown (issue #1356)
+
 ## [Version 524](https://github.com/hydrusnetwork/hydrus/releases/tag/v524)
 
 ### timestamp sidecars
@@ -356,29 +381,3 @@ title: Changelog
 * changed and fixed an issue in the client api's new `get_file_relationships` call. previously, I said 'king' would be null if it was not on the given file domain, but this was not working correctly--it was giving pseudorandom 'fallback' kings. now it always gives the king, no matter what! a new param, `king_is_on_file_domain` says whether the king is on the given domain. `king_is_local` says whether the king is available on disk
 * added some discussion and a list of the 8 possible 'better than' and 'same quality' logical combinations to the `set_file_relationships` help so you can see how group merge involving non-kings works
 * client api is now version 42
-
-## [Version 515](https://github.com/hydrusnetwork/hydrus/releases/tag/v515)
-
-### related tags
-
-* I worked on last week's related tags algorithm test, bringing it up to usable standard. the old buttons now use the new algorithm exclusively. all users now get 'related tags' showing in manage tags by default (if you don't like it, you can turn it off under _options->tag suggestions_)
-* the new algorithm has new cancel tech and does a 'work for 600ms' kind of deal, like the old system, and the last-minute blocks from last week are gone--it will search as much as it has time for, including partial results. it also won't lag you out for thirty seconds (unless you tell it to in the options). it searches tags with low count first, so don't worry if it doesn't get to everything--'1girl' usually doesn't have a huge amount extra to offer once everything else has run
-* it also uses 'hydev actually thought about this' statistical sampling tech to work massively faster on larger-count tags at the cost of some variance in rank and the odd false positive (considered sufficiently related when it actually shouldn't meet the threshold) nearer the bottom end of the tags result list
-* rather than 'new 1' and 'new 2', there is now an on/off button for searching your local files or all known files on tag repositories. 'all known files' = great results, but very slow, which the tooltip explains
-* there's also a new status label that will tell you when it is searching and how well the search went (e.g. '12/51 tags searched fully in 459ms')
-* I also added the 'quick' search button back in, since we can now repeat searches for just selections of tags
-* I fixed a couple typos in the algorithm that were messing up some results
-* in the manage tags dialog, if you have the suggested tag panels 'side-to-side', they now go in named boxes
-* in the manage tags dialog, if you have suggested tag panels in a notebook, 'related tags' will only refresh its search on a media change event (including dialog initialisation) when it is the selected page. it won't lag you from the background!
-* options->tag suggestions now lets you pick which notebook'd tag suggestions page you want to show by default. this defaults to 'related'
-* I have more plans here. these related tags results are very cachable, so that's an obvious next step to speed up results, and when I have done some other long-term tag improvements elsewhere in the program, I'll be able to quickly filter out unhelpful sibling and parent suggestions. more immediately, I think we'll want some options for namespace weighting (e.g. 'series:' tags' suggestions could have higher rank than 'smile'), so we can tune things a bit
-
-### misc
-
-* the 'open externally' canvas widget, which shows any available thumbnail of the flash or psd or whatever, now sizes itself correctly and draws the thumbnail nicely if you set the new thumbnail supersampling option to >100%. if your thumbnail is the wrong size (and probably in a queue to be regenerated soon), I _think_ it'll still make the window too big/small, but it'll draw the thumbnail to fit
-* if a tag content update comes in with an invalid tag (such as could happen with sidecars recently), the client now heals better. the bad tag is corrected live in more places, and this should be propagated to the UI. if you got a warning about 'you have invalid tags in view' recently but running the routine found no problems, please reboot, and I think you'll be fixed. I'm pretty sure the database wasn't being damaged at all here (it has cleaning safeguards, so it _shouldn't_ be possible to actually save bad tags)--it was just a thing to do with the UI not being told of the cleaned tag, and it shouldn't happen again. thank you for the reports! (issue #1324)
-* export folders and the file maintenance dialog no longer apply the implicit system:limit (defaults to max 10k files) to their searches!
-* old OR predicates that you load with saved searches and similar should now always have alphabetised components, and if you double-click them to remove them, they will now clear correctly (previously, they were doing something similar to the recent filetype problem, where instead of recognising themselves and deleting, they would instead duplicate a normalised (sorted) copy of themselves)
-* thanks to a user, updated the recently note-and-ai-updated pixiv parser again to grab the canonical pixiv URL and translated tags, if present
-* thanks to a user, updated the sankaku parser to grab some more tags
-* the file location context and tag context buttons under tag autocompletes now put menu separators between each type of file/tag service in their menus. for basic users, this'll be a separator for every row, but for advanced users with multiple local domains, it will help categorise the list a bit
@@ -14,6 +14,6 @@ Here you simply set which parsers go with which URL Classes. If you have URL Cla
 
 If the URL Class has no parser set or the parser is broken or otherwise invalid, the respective URL's file import object in the downloader or subscription is going to throw some kind of error when it runs. If you make and share some parsers, the first indication that something is wrong is going to be several users saying 'I got this error: (_copy notes_ from file import status window)'. You can then load the parser back up in _manage parsers_ and try to figure out what changed and roll out an update.
 
-_manage url class links_ also shows 'api link review', which summarises which URL Classes api-link to others. In these cases, only the api URL gets a parser entry in the first 'parser links' window, since the first will never be fetched for parsing (in the downloader, it will always be converted to the API URL, and _that_ is fetched and parsed).
+_manage url class links_ also shows 'api/redirect link review', which summarises which URL Classes redirect to others. In these cases, only the redirected-to URL gets a parser entry in the first 'parser links' window, since the first will never be fetched for parsing (in the downloader, it will always be converted to the Redirected URL, and _that_ is fetched and parsed).
 
 Once your GUG has a URL Class and your URL Classes have parsers linked, test your downloader! Note that Hydrus's URL drag-and-drop import uses URL Classes, so if you don't have the GUG and gallery stuff done but you have a Post URL set up, you can test that just by dragging a Post URL from your browser to the client, and it should be added to a new URL Downloader and just work. It feels pretty good once it does!
@@ -34,6 +34,28 @@
 <div class="content">
 <h1 id="changelog"><a href="#changelog">changelog</a></h1>
 <ul>
 <li>
+<h2 id="version_525"><a href="#version_525">version 525</a></h2>
+<ul>
+<li><h3>library updates</h3></li>
+<li>after successful testing amongst source users, I am finally updating the official builds and the respective requirements.txts for Qt, from 6.3.1 to 6.4.1 (with 'test' now 6.5.0), opencv-python-headless from 4.5.3.56 to 4.5.5.64 (with a new 'test' of 4.7.0.72), and in the Windows build, the mpv dll from 2022-05-01 to 2023-02-12 (API 2.0 to 2.1). if you use my normal builds, you don't have to do anything special in the update, and with luck you'll get slightly faster images, video, and UI, and with fewer bugs. if you run from source, you might want to re-run your setup_venv script--it'll update you automatically--and if you are a modern Windows source user and haven't yet, grab the new dll here and rename it to mpv-2.dll https://sourceforge.net/projects/mpv-player-windows/files/libmpv/mpv-dev-x86_64-20230212-git-a40958c.7z . there is a chance that some older OSes will not be able to boot this new build, but I think these people were already migrated to being source users when Win 7-level was no longer supported. in any case, let me know how you get on, and if you are on an older OS, be prepared to rollback if this version doesn't boot</li>
+<li>setup_venv.bat (Windows source) now adds PyWin32, just like the builds (the new version of pympler, a memory management module, moans on boot if it doesn't have it)</li>
+<li><h3>timestamps</h3></li>
+<li>a couple places where fixed calendar time-deltas are converted to absolute datestrings now work better over longer times. going back (5 years, 3 months) should now work out the actual calendar dates (previously they used a rough total_num_seconds estimation) and go back to the same day of the destination month, also accounting for if that has fewer days than the starting month and handling leap years. it also handles >'12 months' better now</li>
+<li>in system:time predicates that use since/before a delta, it now allows much larger values in the UI, like '72 months', and it won't merge those into the larger values in the label. so if you set a gap of 100 days, it'll say that, not 3 months 10 days or whatever</li>
+<li>the main copy button on 'manage file times' is now a menu button letting you choose to copy all timestamps or just those for the file services. as a hacky experiment, you can also copy the file service timestamps plus one second (in case you want to try finick-ily going through a handful of files to force a certain import sort order)</li>
+<li>the system predicate time parsing is now more flexible. for archived, modified, last viewed, and imported time, you can now generally say all variants in the form 'import' or 'imported' and 'time' or 'date' and 'time imported' or 'imported time'.</li>
+<li>fixed an issue that meant editing existing delta 'system:archived time' predicates was launching the 'date' edit panel</li>
+<li><h3>misc</h3></li>
+<li>in the 'exif and other embedded metadata' review window, which is launched from a button on the media viewer's top hover, jpegs now state their subsampling and whether they are progressive</li>
+<li>every simple place where the client eats clipboard data and tries to import something now has a unified error-reporting process. before, it would make a popup with something like 'I could not understand what was in the clipboard!'. Now it makes a popup with info on what was pasted, what was expected, and actual exception info. Longer info is printed to the log</li>
+<li>many places across the program say the specific exception type when they report errors now, not just the string summary</li>
+<li>the sankaku downloader is updated with a new url class for their new md5 links. also, the file parser is updated to associate the old id URL, and the gallery parser is updated to skip the 'get sank pro' thumbnail links if you are not logged in. if you have sank subscriptions, they are going to go crazy this week due to the URL format changing--sorry, there's no nice way around it!--just ignore their popups about hitting file limits and wait them out. unfortunately, due to an unusual 404-based redirect, the id-based URLs will not work in hydrus any more</li>
+<li>the 'API URL' system for url classes now supports File URLs--this may help you figure out some CDN redirects and similar. in a special rule for these File URLs, both URLs will be associated with the imported file (normally, Post API URLs are not saved as Known URLs). relatedly, I have renamed this system broadly to 'api/redirect url', since we use it for a bunch of non-API stuff now</li>
+<li>fixed a problem where deleting one of the new inc/dec rating services was not clearing the actual number ratings for that service from the database, causing service-id error hell on loading files with those orphaned rating records. sorry for the trouble, this slipped through testing! any users who were affected by this will also be fixed (orphan records cleared out) on update (issue #1357)</li>
+<li>the client cleans up the temporary paths used by file imports more carefully now: it tries more times to delete 'sticky' temp files; it tries to clear them again immediately on shutdown; and it stores them all in the hydrus temp subdirectory where they are less loose and will be captured by the final directory clear on shutdown (issue #1356)</li>
+</ul>
+</li>
+<li>
 <h2 id="version_524"><a href="#version_524">version 524</a></h2>
 <ul>
@@ -56,7 +56,7 @@ There are three external libraries. You just have to get them and put them in th
 
 1. If you are on Windows 8.1 or older, get [this](https://sourceforge.net/projects/mpv-player-windows/files/libmpv/mpv-dev-x86_64-20210228-git-d1be8bb.7z).
 2. If you are on Windows 10 or newer, try [this](https://sourceforge.net/projects/mpv-player-windows/files/libmpv/mpv-dev-x86_64-20220501-git-9ffaa6b.7z).
-2a. If you want the latest, try [this](https://sourceforge.net/projects/mpv-player-windows/files/libmpv/mpv-dev-x86_64-20230212-git-a40958c.7z/download), but you have to rename the dll to `mpv-2.dll`.
+2a. If you want the latest, try [this](https://sourceforge.net/projects/mpv-player-windows/files/libmpv/mpv-dev-x86_64-20230212-git-a40958c.7z), but you have to rename the dll to `mpv-2.dll`.
 
 Then open that archive and place the 'mpv-1.dll' or 'mpv-2.dll' into `install_dir`.
@@ -219,11 +219,6 @@ class Controller( HydrusController.HydrusController ):
         return ClientDB.DB( self, self.db_dir, 'client' )
         
     
-    def _InitTempDir( self ):
-        
-        self.temp_dir = HydrusTemp.GetTempDir()
-        
-    
     def _DestroySplash( self ):
         
         def qt_code( splash ):
@@ -1270,7 +1265,7 @@ class Controller( HydrusController.HydrusController ):
         
         except Exception as e:
             
-            HydrusData.Print( 'Could not load Qt style: {}'.format( e ) )
+            HydrusData.Print( 'Could not load Qt style: {}'.format( repr( e ) ) )
             
         
     
@@ -1288,7 +1283,7 @@ class Controller( HydrusController.HydrusController ):
         
         except Exception as e:
            
-            HydrusData.Print( 'Could not load Qt stylesheet: {}'.format( e ) )
+            HydrusData.Print( 'Could not load Qt stylesheet: {}'.format( repr( e ) ) )
            
        
    
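The `str( e )` to `repr( e )` swaps in the hunks above (and the several that follow) all exist for the same reason: `str()` on an exception can drop the exception type entirely, and for an argument-less exception it is an empty string. A minimal plain-Python illustration, independent of any hydrus code:

```python
# str() of an exception can be empty or ambiguous; repr() always includes
# the exception type, which is what a diagnostic log line needs.
try:
    raise ValueError()
except Exception as e:
    assert str( e ) == ''                # no message at all
    assert repr( e ) == 'ValueError()'   # the type survives
```

This is why "many places across the program say the specific exception type when they report errors now" in the changelog entry above.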
@@ -511,7 +511,7 @@ def GetDefaultObjectsFromPNGs( dir_path, allowed_object_types ):
         
         except Exception as e:
             
-            HydrusData.Print( 'Object at location "{}" failed to load: {}'.format( path, e ) )
+            HydrusData.Print( 'Object at location "{}" failed to load: {}'.format( path, repr( e ) ) )
             
         
     
@@ -117,7 +117,7 @@ def ConvertParseResultToPrettyString( result ):
         
         except Exception as e:
             
-            parsed_text = 'Could not decode a hash from {}: {}'.format( parsed_text, str( e ) )
+            parsed_text = 'Could not decode a hash from {}: {}'.format( parsed_text, repr( e ) )
             
         
         return '{} hash: {}'.format( hash_type, parsed_text )
@@ -1237,7 +1237,7 @@ class ParseFormulaHTML( ParseFormula ):
         
         except Exception as e:
             
-            raise HydrusExceptions.ParseException( 'Unable to parse that HTML: {}. HTML Sample: {}'.format( str( e ), parsing_text[:1024] ) )
+            raise HydrusExceptions.ParseException( 'Unable to parse that HTML: {}. HTML Sample: {}'.format( repr( e ), parsing_text[:1024] ) )
             
         
         tags = self._FindHTMLTags( root )
@@ -1877,7 +1877,7 @@ class ParseFormulaJSON( ParseFormula ):
         
         else:
             
-            message = 'Unable to parse that JSON: {}.'.format( str( e ) )
+            message = 'Unable to parse that JSON: {}.'.format( repr( e ) )
             
         
         message += ' Parsing text sample: {}'.format( parsing_text[:1024] )
@@ -453,9 +453,9 @@ class FileSystemPredicates( object ):
         
         ( years, months, days, hours ) = age_value
         
-        age = ( years * 365 * 86400 ) + ( ( ( ( ( months * 30 ) + days ) * 24 ) + hours ) * 3600 )
+        dt = HydrusTime.CalendarDeltaToDateTime( years, months, days, hours )
         
-        now = HydrusTime.GetNow()
+        time_pivot = HydrusTime.DateTimeToTimestamp( dt )
         
         # this is backwards (less than means min timestamp) because we are talking about age, not timestamp
@@ -465,23 +465,24 @@ class FileSystemPredicates( object ):
         
         if operator == '<':
             
-            time_pivot = now - age
-            
             self._timestamp_ranges[ predicate_type ][ '>' ] = time_pivot
             
         elif operator == '>':
             
-            time_pivot = now - age
-            
             self._timestamp_ranges[ predicate_type ][ '<' ] = time_pivot
             
         elif operator == CC.UNICODE_ALMOST_EQUAL_TO:
             
-            earliest = now - int( age * 1.15 )
-            latest = now - int( age * 0.85 )
+            rough_timedelta_gap = HydrusTime.CalendarDeltaToRoughDateTimeTimeDelta( years, months, days, hours ) * 0.15
             
-            self._timestamp_ranges[ predicate_type ][ '>' ] = earliest
-            self._timestamp_ranges[ predicate_type ][ '<' ] = latest
+            earliest_dt = dt - rough_timedelta_gap
+            latest_dt = dt + rough_timedelta_gap
+            
+            earliest_time_pivot = HydrusTime.DateTimeToTimestamp( earliest_dt )
+            latest_time_pivot = HydrusTime.DateTimeToTimestamp( latest_dt )
+            
+            self._timestamp_ranges[ predicate_type ][ '>' ] = earliest_time_pivot
+            self._timestamp_ranges[ predicate_type ][ '<' ] = latest_time_pivot
             
         elif age_type == 'date':
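The hunk above swaps the old rough seconds estimate (`years * 365 * 86400 + …`) for real calendar arithmetic via `HydrusTime.CalendarDeltaToDateTime`, which is what lets "go back (5 years, 3 months)" land on the actual calendar date, clamp to shorter destination months, and respect leap years. A standalone sketch of that idea — this is an illustrative approximation, not the hydrus implementation:

```python
import datetime

def calendar_delta_to_datetime( now, years, months, days, hours ):
    # Walk back whole calendar months, clamping the day when the destination
    # month is shorter (e.g. Mar 31 -> Feb 28, or Feb 29 in a leap year).
    total_months = years * 12 + months
    month_index = ( now.year * 12 + ( now.month - 1 ) ) - total_months
    ( year, month_zero ) = divmod( month_index, 12 )
    month = month_zero + 1
    # clamp the day to the destination month's length
    if month == 12:
        next_month_start = datetime.date( year + 1, 1, 1 )
    else:
        next_month_start = datetime.date( year, month + 1, 1 )
    days_in_month = ( next_month_start - datetime.date( year, month, 1 ) ).days
    day = min( now.day, days_in_month )
    dt = datetime.datetime( year, month, day, now.hour, now.minute )
    # days and hours are a plain timedelta on top of the calendar walk
    return dt - datetime.timedelta( days = days, hours = hours )
```

For example, one month back from 2023-03-31 gives 2023-02-28, and from 2024-03-31 gives 2024-02-29.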
@@ -2494,35 +2495,53 @@ class Predicate( HydrusSerialisable.SerialisableBase ):
         
         ( years, months, days, hours ) = age_value
         
-        DAY = 86400
-        MONTH = DAY * 30
-        YEAR = DAY * 365
+        str_components = []
         
-        time_delta = 0
+        for ( quantity, label ) in [
+            ( years, 'year' ),
+            ( months, 'month' ),
+            ( days, 'day' ),
+            ( hours, 'hour' ),
+        ]:
+            
+            if quantity > 0:
+                
+                str_component = '{} {}'.format( HydrusData.ToHumanInt( quantity ), label )
+                
+                if quantity > 1:
+                    
+                    str_component += 's'
+                    
+                
+                str_components.append( str_component )
+                
+            
+            if len( str_components ) == 2:
+                
+                break
+                
+            
         
-        time_delta += hours * 3600
-        time_delta += days * DAY
-        time_delta += months * MONTH
-        time_delta += years * YEAR
+        nice_date_string = ' '.join( str_components )
         
         if operator == '<':
             
-            pretty_operator = 'since '
+            pretty_operator = 'since'
             
         elif operator == '>':
             
-            pretty_operator = 'before '
+            pretty_operator = 'before'
             
         elif operator == CC.UNICODE_ALMOST_EQUAL_TO:
             
-            pretty_operator = 'around '
+            pretty_operator = 'around'
             
         else:
             
-            pretty_operator = 'unknown operator '
+            pretty_operator = 'unknown operator'
            
        
-        base += ': ' + pretty_operator + HydrusTime.TimeDeltaToPrettyTimeDelta( time_delta ) + ' ago'
+        base += ': {} {} ago'.format( pretty_operator, nice_date_string )
        
    elif age_type == 'date':
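The label-building hunk above is what backs the changelog note that "if you set a gap of 100 days, it'll say that, not 3 months 10 days or whatever": the old code summed everything into seconds and re-derived units, while the new code keeps the user's own units and prints at most the two largest non-zero components. A simplified sketch of that two-component rule (without the `HydrusData.ToHumanInt` thousands formatting):

```python
def calendar_delta_to_pretty_string( years, months, days, hours ):
    # Emit at most two non-zero components, largest unit first, without
    # converting between units ('100 days' stays '100 days').
    components = []
    for ( quantity, label ) in ( ( years, 'year' ), ( months, 'month' ), ( days, 'day' ), ( hours, 'hour' ) ):
        if quantity > 0:
            components.append( '{} {}{}'.format( quantity, label, 's' if quantity > 1 else '' ) )
        if len( components ) == 2:
            break
    return ' '.join( components )
```

So `( 5, 3, 10, 0 )` renders as '5 years 3 months' (the third component is dropped) and `( 0, 0, 100, 0 )` stays '100 days'.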
@@ -341,7 +341,7 @@ def LoadFromNumPyImage( numpy_image: numpy.array ):
         
         HydrusData.PrintException( e )
         
-        message = 'The image loaded, but it did not seem to be a hydrus serialised png! The error was: {}'.format( str( e ) )
+        message = 'The image loaded, but it did not seem to be a hydrus serialised png! The error was: {}'.format( repr( e ) )
         message += os.linesep * 2
         message += 'If you believe this is a legit non-resized, non-converted hydrus serialised png, please send it to hydrus_dev.'
@@ -1657,6 +1657,7 @@ class DB( HydrusDB.HydrusDB ):
         # so now we just blat all tables and trust in the Lord that we don't forget to add any new ones in future
         
         self._Execute( 'DELETE FROM local_ratings WHERE service_id = ?;', ( service_id, ) )
+        self._Execute( 'DELETE FROM local_incdec_ratings WHERE service_id = ?;', ( service_id, ) )
         self._Execute( 'DELETE FROM recent_tags WHERE service_id = ?;', ( service_id, ) )
         self._Execute( 'DELETE FROM service_info WHERE service_id = ?;', ( service_id, ) )
@@ -9321,6 +9322,46 @@ class DB( HydrusDB.HydrusDB ):
         
         
+        if version == 524:
+            
+            try:
+                
+                domain_manager = self.modules_serialisable.GetJSONDump( HydrusSerialisable.SERIALISABLE_TYPE_NETWORK_DOMAIN_MANAGER )
+                
+                domain_manager.Initialise()
+                
+                #
+                
+                domain_manager.OverwriteDefaultURLClasses( [
+                    'sankaku chan file page (md5)'
+                ] )
+                
+                domain_manager.OverwriteDefaultParsers( [
+                    'sankaku gallery page parser',
+                    'sankaku file page parser'
+                ] )
+                
+                #
+                
+                domain_manager.TryToLinkURLClassesAndParsers()
+                
+                #
+                
+                self.modules_serialisable.SetJSONDump( domain_manager )
+                
+            except Exception as e:
+                
+                HydrusData.PrintException( e )
+                
+                message = 'Trying to update some downloader objects failed! Please let hydrus dev know!'
+                
+                self.pub_initial_message( message )
+                
+            
+            # when I brought these services in, I forgot to clear their ratings on service deletion! cleaning up here
+            self._Execute( 'DELETE FROM local_incdec_ratings WHERE service_id NOT IN ( SELECT service_id FROM services );' )
+            
+        
         self._controller.frame_splash_status.SetTitleText( 'updated db to v{}'.format( HydrusData.ToHumanInt( version + 1 ) ) )
         
         self._Execute( 'UPDATE version SET version = ?;', ( version + 1, ) )
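The `DELETE … WHERE service_id NOT IN ( SELECT service_id FROM services )` line in the update step above is a classic orphan-row sweep: it removes rating rows whose parent service has already been deleted. A self-contained sketch of the pattern with `sqlite3` — the table schemas here are simplified stand-ins, not the real hydrus schema:

```python
import sqlite3

# Reproduce the orphan situation: a rating row survives its deleted service.
con = sqlite3.connect( ':memory:' )
con.execute( 'CREATE TABLE services ( service_id INTEGER PRIMARY KEY );' )
con.execute( 'CREATE TABLE local_incdec_ratings ( service_id INTEGER, hash_id INTEGER, rating INTEGER );' )
con.execute( 'INSERT INTO services VALUES ( 1 );' )
con.executemany( 'INSERT INTO local_incdec_ratings VALUES ( ?, ?, ? );', [ ( 1, 10, 5 ), ( 2, 11, 3 ) ] )

# service 2 was deleted without clearing its ratings; sweep the orphans
con.execute( 'DELETE FROM local_incdec_ratings WHERE service_id NOT IN ( SELECT service_id FROM services );' )

remaining = con.execute( 'SELECT service_id FROM local_incdec_ratings;' ).fetchall()
# only the row belonging to the surviving service is left: [(1,)]
```

Note the update also fixes the root cause: the normal service-deletion path now deletes from `local_incdec_ratings` too, so the sweep is a one-time repair.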
@@ -820,7 +820,7 @@ class FrameGUI( CAC.ApplicationCommandProcessorMixin, ClientGUITopLevelWindows.M
         library_version_lines.append( 'lz4 present: {}'.format( HydrusCompression.LZ4_OK ) )
         library_version_lines.append( 'pympler present: {}'.format( HydrusMemory.PYMPLER_OK ) )
         library_version_lines.append( 'pyopenssl present: {}'.format( HydrusEncryption.OPENSSL_OK ) )
-        library_version_lines.append( 'speedcopy present: {}'.format( HydrusFileHandling.SPEEDCOPY_OK ) )
+        library_version_lines.append( 'speedcopy (experimental test) present: {}'.format( HydrusFileHandling.SPEEDCOPY_OK ) )
         library_version_lines.append( 'install dir: {}'.format( HC.BASE_DIR ) )
         library_version_lines.append( 'db dir: {}'.format( HG.client_controller.db_dir ) )
         library_version_lines.append( 'temp dir: {}'.format( HydrusTemp.GetCurrentTempDir() ) )
@@ -3352,9 +3352,10 @@ class FrameGUI( CAC.ApplicationCommandProcessorMixin, ClientGUITopLevelWindows.M
         
         if HydrusMemory.PYMPLER_OK:
             
-            ClientGUIMenus.AppendMenuItem( memory_actions, 'print memory-use summary', 'Print some information about the python memory use to the log.', self._DebugPrintMemoryUse )
-            ClientGUIMenus.AppendMenuItem( memory_actions, 'take memory-use snapshot', 'Capture current memory use.', self._DebugTakeMemoryUseSnapshot )
-            ClientGUIMenus.AppendMenuItem( memory_actions, 'print memory-use snapshot diff', 'Show memory use differences since the last snapshot.', self._DebugShowMemoryUseDifferences )
+            ClientGUIMenus.AppendSeparator( memory_actions )
+            ClientGUIMenus.AppendMenuItem( memory_actions, 'WARNING, MEGA-LAGGY: print memory-use summary', 'Print some information about the python memory use to the log.', self._DebugPrintMemoryUse )
+            ClientGUIMenus.AppendMenuItem( memory_actions, 'WARNING, MEGA-LAGGY: take memory-use snapshot', 'Capture current memory use.', self._DebugTakeMemoryUseSnapshot )
+            ClientGUIMenus.AppendMenuItem( memory_actions, 'WARNING, MEGA-LAGGY: print memory-use snapshot diff', 'Show memory use differences since the last snapshot.', self._DebugShowMemoryUseDifferences )
             
         
         ClientGUIMenus.AppendMenu( debug, memory_actions, 'memory actions' )
@@ -16,6 +16,7 @@ from hydrus.client import ClientConstants as CC
 from hydrus.client.gui import ClientGUIAsync
 from hydrus.client.gui import ClientGUIDialogs
 from hydrus.client.gui import ClientGUIDialogsQuick
+from hydrus.client.gui import ClientGUIFunctions
 from hydrus.client.gui import ClientGUIRatings
 from hydrus.client.gui import ClientGUIShortcuts
 from hydrus.client.gui import QtPorting as QP
@@ -144,9 +145,9 @@ class DialogManageRatings( CAC.ApplicationCommandProcessorMixin, ClientGUIDialog
         
         rating_clipboard_pairs = [ ( bytes.fromhex( service_key_encoded ), rating ) for ( service_key_encoded, rating ) in rating_clipboard_pairs_encoded ]
         
-    except:
+    except Exception as e:
         
-        QW.QMessageBox.critical( self, 'Error', 'Did not understand what was in the clipboard!' )
+        ClientGUIFunctions.PresentClipboardParseError( self, raw_text, 'JSON pairs of service keys and rating values', e )
         
         return
@@ -1136,7 +1136,7 @@ class EditURLClassPanel( ClientGUIScrolledPanels.EditPanel ):
         tt = 'The same url can be expressed in different ways. The parameters can be reordered, and descriptive \'sugar\' like "/123456/bodysuit-samus_aran" can be altered at a later date, say to "/123456/bodysuit-green_eyes-samus_aran". In order to collapse all the different expressions of a url down to a single comparable form, the client will \'normalise\' them based on the essential definitions in their url class. Parameters will be alphebatised and non-defined elements will be removed.'
         tt += os.linesep * 2
-        tt += 'All normalisation will switch to the preferred scheme (http/https). The alphabetisation of parameters and stripping out of non-defined elements will occur for all URLs except Gallery URLs or Watchable URLs that do not use an API Lookup. (In general, you can define gallery and watchable urls a little more loosely since they generally do not need to be compared, but if you will be saving it with a file or need to perform some regex conversion into an API URL, you\'ll want a rigorously defined url class that will normalise to something reliable and pretty.)'
+        tt += 'All normalisation will switch to the preferred scheme (http/https). The alphabetisation of parameters and stripping out of non-defined elements will occur for all URLs except Gallery URLs or Watchable URLs that do not use an API Lookup. (In general, you can define gallery and watchable urls a little more loosely since they generally do not need to be compared, but if you will be saving it with a file or need to perform some regex conversion into an API/Redirect URL, you\'ll want a rigorously defined url class that will normalise to something reliable and pretty.)'
         
         self._normalised_url.setToolTip( tt )
         
@@ -1250,8 +1250,8 @@ class EditURLClassPanel( ClientGUIScrolledPanels.EditPanel ):
         rows = []
         
-        rows.append( ( 'optional api url converter: ', self._api_lookup_converter ) )
-        rows.append( ( 'api url: ', self._api_url ) )
+        rows.append( ( 'optional api/redirect url converter: ', self._api_lookup_converter ) )
+        rows.append( ( 'api/redirect url: ', self._api_url ) )
         
         gridbox = ClientGUICommon.WrapInGrid( self._api_url_panel, rows )
         
@@ -1833,7 +1833,7 @@ class EditURLClassPanel( ClientGUIScrolledPanels.EditPanel ):
                 if url_class.Matches( api_lookup_url ):
                     
-                    self._example_url_classes.setText( 'Matches own API URL!' )
+                    self._example_url_classes.setText( 'Matches own API/Redirect URL!' )
                     self._example_url_classes.setObjectName( 'HydrusInvalid' )
                     
                 
@@ -1850,7 +1850,7 @@ class EditURLClassPanel( ClientGUIScrolledPanels.EditPanel ):
                 self._api_url.setText( 'Could not convert - ' + reason )
                 
-                self._example_url_classes.setText( 'API URL Problem!' )
+                self._example_url_classes.setText( 'API/Redirect URL Problem!' )
                 self._example_url_classes.setObjectName( 'HydrusInvalid' )
                 
             
@@ -1954,7 +1954,7 @@ class EditURLClassPanel( ClientGUIScrolledPanels.EditPanel ):
            except HydrusExceptions.StringConvertException as e:
                
-                raise HydrusExceptions.VetoException( 'Problem making API URL!' )
+                raise HydrusExceptions.VetoException( 'Problem making API/Redirect URL!' )
                
            
        
@@ -1980,7 +1980,7 @@ class EditURLClassPanel( ClientGUIScrolledPanels.EditPanel ):
            if url_class.Matches( api_lookup_url ):
                
-                message = 'This URL class matches its own API URL! This can break a downloader unless there is a more specific URL Class the matches the API URL before this. I recommend you fix this here, but you do not have to. Exit now?'
+                message = 'This URL class matches its own API/Redirect URL! This can break a downloader unless there is a more specific URL Class the matches the API URL before this. I recommend you fix this here, but you do not have to. Exit now?'
                
                result = ClientGUIDialogsQuick.GetYesNo( self, message )
                
@@ -2280,7 +2280,7 @@ class EditURLClassLinksPanel( ClientGUIScrolledPanels.EditPanel ):
        #
        
        self._notebook.addTab( self._parser_list_ctrl_panel, 'parser links' )
-        self._notebook.addTab( self._api_pairs_list_ctrl, 'api link review' )
+        self._notebook.addTab( self._api_pairs_list_ctrl, 'api/redirect link review' )
        
        #
        
@@ -18,6 +18,7 @@ from hydrus.client import ClientPaths
 from hydrus.client import ClientSerialisable
 from hydrus.client import ClientTime
 from hydrus.client.gui import ClientGUIDialogsQuick
+from hydrus.client.gui import ClientGUIFunctions
 from hydrus.client.gui import ClientGUIMenus
 from hydrus.client.gui import ClientGUISerialisable
 from hydrus.client.gui import ClientGUIScrolledPanels
@@ -106,13 +107,12 @@ def ImportFromClipboard( win: QW.QWidget, file_seed_cache: ClientImportFileSeeds
        ImportSources( file_seed_cache, sources )
        
-    except:
+    except Exception as e:
        
-        QW.QMessageBox.critical( win, 'Error', 'Could not import!' )
-        
-        raise
+        ClientGUIFunctions.PresentClipboardParseError( win, raw_text, 'Lines of URLs or file paths', e )
        
    

def ImportFromPNG( win: QW.QWidget, file_seed_cache: ClientImportFileSeeds.FileSeedCache ):
    
    with QP.FileDialog( win, 'select the png with the sources', wildcard = 'PNG (*.png)' ) as dlg:
        
@@ -6,6 +6,7 @@ from qtpy import QtWidgets as QW
 from qtpy import QtGui as QG
 
 from hydrus.core import HydrusConstants as HC
 from hydrus.core import HydrusData
 from hydrus.core import HydrusGlobals as HG
+from hydrus.core import HydrusText
 
@@ -319,6 +320,31 @@ def NotebookScreenToHitTest( notebook, screen_position ):
    return notebook.tabBar().tabAt( tab_pos )
    

+def PresentClipboardParseError( win: QW.QWidget, content: str, expected_content_description: str, e: Exception ):
+    
+    MAX_CONTENT_SIZE = 1024
+    
+    log_message = 'Clipboard Error!\nI was expecting: {}'.format( expected_content_description )
+    
+    if len( content ) > MAX_CONTENT_SIZE:
+        
+        log_message += '\nFirst {} of content received (total was {}):\n'.format( HydrusData.ToHumanBytes( MAX_CONTENT_SIZE ), HydrusData.ToHumanBytes( len( content ) ) ) + content[:MAX_CONTENT_SIZE]
+        
+    else:
+        
+        log_message += '\nContent received ({}):\n'.format( HydrusData.ToHumanBytes( len( content ) ) ) + content[:MAX_CONTENT_SIZE]
+        
+    
+    HydrusData.DebugPrint( log_message )
+    
+    HydrusData.PrintException( e, do_wait = False )
+    
+    message = 'Sorry, I could not understand what was in the clipboard. I was expecting "{}" but received this text:\n\n{}\n\nMore details have been written to the log, but the general error was:\n\n{}'.format( expected_content_description, HydrusText.ElideText( content, 64 ), repr( e ) )
+    
+    QW.QMessageBox.critical( win, 'Clipboard Error!', message )
+    
+
def SetBitmapButtonBitmap( button, bitmap ):
    
    # old wx stuff, but still basically relevant
    
@@ -16,6 +16,7 @@ from hydrus.client import ClientPaths
 from hydrus.client import ClientSerialisable
 from hydrus.client import ClientTime
 from hydrus.client.gui import ClientGUIDialogsQuick
+from hydrus.client.gui import ClientGUIFunctions
 from hydrus.client.gui import ClientGUIMenus
 from hydrus.client.gui import ClientGUISerialisable
 from hydrus.client.gui import ClientGUIScrolledPanels
@@ -53,6 +54,7 @@ def GetURLsFromURLsString( urls_string ):
    return urls
    

def ImportFromClipboard( win: QW.QWidget, gallery_seed_log: ClientImportGallerySeeds.GallerySeedLog, can_generate_more_pages: bool ):
    
    try:
        
@@ -72,13 +74,12 @@ def ImportFromClipboard( win: QW.QWidget, gallery_seed_log: ClientImportGalleryS
        ImportURLs( win, gallery_seed_log, urls, can_generate_more_pages )
        
-    except:
+    except Exception as e:
        
-        QW.QMessageBox.critical( win, 'Error', 'Could not import!' )
-        
-        raise
+        ClientGUIFunctions.PresentClipboardParseError( win, raw_text, 'Lines of URLs', e )
        
    

def ImportFromPNG( win: QW.QWidget, gallery_seed_log: ClientImportGallerySeeds.GallerySeedLog, can_generate_more_pages: bool ):
    
    with QP.FileDialog( win, 'select the png with the urls', wildcard = 'PNG (*.png)' ) as dlg:
        
@@ -693,6 +693,15 @@ def ShowFileEmbeddedMetadata( win: QW.QWidget, media: ClientMedia.MediaSingleton
            file_text = HydrusImageHandling.GetEmbeddedFileText( pil_image )
            
+            
+            extra_rows = []
+            
+            if mime == HC.IMAGE_JPEG:
+                
+                extra_rows.append( ( 'progressive', 'yes' if 'progression' in pil_image.info else 'no' ) )
+                
+                extra_rows.append( ( 'subsampling', HydrusImageHandling.GetJpegSubsampling( pil_image ) ) )
+                
+            
    if exif_dict is None and file_text is None:
        
        QW.QMessageBox.information( win, 'Nothing found', 'Sorry, could not see any human-readable information in this file! Hydrus should have known this, so if this keeps happening, you may need to schedule a rescan of this info in file maintenance.' )
        
@@ -702,7 +711,7 @@ def ShowFileEmbeddedMetadata( win: QW.QWidget, media: ClientMedia.MediaSingleton
    frame = ClientGUITopLevelWindowsPanels.FrameThatTakesScrollablePanel( win, 'Embedded Metadata' )
    
-    panel = ClientGUIScrolledPanelsReview.ReviewFileEmbeddedMetadata( frame, exif_dict, file_text )
+    panel = ClientGUIScrolledPanelsReview.ReviewFileEmbeddedMetadata( frame, exif_dict, file_text, extra_rows )
    
    frame.SetPanel( panel )
    
@@ -36,6 +36,7 @@ from hydrus.client.gui.importing import ClientGUIImportOptions
 from hydrus.client.gui.lists import ClientGUIListConstants as CGLC
 from hydrus.client.gui.lists import ClientGUIListCtrl
 from hydrus.client.gui.widgets import ClientGUICommon
+from hydrus.client.gui.widgets import ClientGUIMenuButton
 from hydrus.client.importing.options import NoteImportOptions
 from hydrus.client.importing.options import TagImportOptions
 from hydrus.client.media import ClientMedia
@@ -524,9 +525,7 @@ class EditDefaultImportOptionsPanel( ClientGUIScrolledPanels.EditPanel ):
        except Exception as e:
            
-            QW.QMessageBox.critical( self, 'Error', 'I could not understand what was in the clipboard' )
-            
-            HydrusData.ShowException( e )
+            ClientGUIFunctions.PresentClipboardParseError( self, raw_text, 'An instance of JSON-serialised tag or note import options', e )
            
        
@@ -1944,9 +1943,9 @@ class EditFileNotesPanel( CAC.ApplicationCommandProcessorMixin, ClientGUIScrolle
            names_and_notes = clean_names_and_notes
            
-        except:
+        except Exception as e:
            
-            QW.QMessageBox.critical( self, 'Error', 'Did not understand what was in the clipboard!' )
+            ClientGUIFunctions.PresentClipboardParseError( self, raw_text, 'JSON names and notes, either as an Object or a list of pairs', e )
            
            return
            
@@ -2238,11 +2237,23 @@ class EditFileTimestampsPanel( CAC.ApplicationCommandProcessorMixin, ClientGUISc
        #
        
-        self._copy_button = ClientGUICommon.BetterBitmapButton( self, CC.global_pixmaps().copy, self._Copy )
-        self._copy_button.setToolTip( 'Copy all timestamps to the clipboard.' )
+        menu_items = []
+        
+        menu_items.append( ( 'normal', 'all times', 'Copy every time here for pasting in another file\'s dialog.', self._Copy ) )
+        
+        c = HydrusData.Call( self._Copy, allowed_timestamp_types = ( HC.TIMESTAMP_TYPE_IMPORTED, HC.TIMESTAMP_TYPE_PREVIOUSLY_IMPORTED, HC.TIMESTAMP_TYPE_DELETED ) )
+        
+        menu_items.append( ( 'normal', 'all file service times', 'Copy every imported/deleted/previously imported time here for pasting in another file\'s dialog.', c ) )
+        
+        c = HydrusData.Call( self._Copy, allowed_timestamp_types = ( HC.TIMESTAMP_TYPE_IMPORTED, HC.TIMESTAMP_TYPE_PREVIOUSLY_IMPORTED, HC.TIMESTAMP_TYPE_DELETED ), adjust_delta = 1 )
+        
+        menu_items.append( ( 'normal', 'all file service times, plus one second', 'This is an experiment, feel free to play around with it to manually force a certain order on a handful of files. I expect to replace it will a full \'cascade\' dialog in future.', c ) )
+        
+        self._copy_button = ClientGUIMenuButton.MenuBitmapButton( self, CC.global_pixmaps().copy, menu_items )
+        self._copy_button.setToolTip( 'Copy timestamps to the clipboard.' )
        
        self._paste_button = ClientGUICommon.BetterBitmapButton( self, CC.global_pixmaps().paste, self._Paste )
-        self._paste_button.setToolTip( 'Paste all timestamps from another timestamps dialog.' )
+        self._paste_button.setToolTip( 'Paste timestamps from another timestamps dialog.' )
        
        #
        
@@ -2319,9 +2330,27 @@ class EditFileTimestampsPanel( CAC.ApplicationCommandProcessorMixin, ClientGUISc
        return ( display_tuple, sort_tuple )
        
    
-    def _Copy( self ):
+    def _Copy( self, allowed_timestamp_types = None, adjust_delta = 0 ):
        
-        list_of_timestamp_data = HydrusSerialisable.SerialisableList( self._GetValidTimestampDatas() )
+        list_of_timestamp_data = self._GetValidTimestampDatas()
+        
+        if allowed_timestamp_types is not None:
+            
+            list_of_timestamp_data = [ timestamp_data for timestamp_data in list_of_timestamp_data if timestamp_data.timestamp_type in allowed_timestamp_types ]
+            
+        
+        if adjust_delta != 0:
+            
+            for timestamp_data in list_of_timestamp_data:
+                
+                if timestamp_data.timestamp is not None:
+                    
+                    timestamp_data.timestamp += adjust_delta
+                    
+                
+            
+        
+        list_of_timestamp_data = HydrusSerialisable.SerialisableList( list_of_timestamp_data )
        
        text = json.dumps( list_of_timestamp_data.GetSerialisableTuple() )
        
@@ -2453,7 +2482,7 @@ class EditFileTimestampsPanel( CAC.ApplicationCommandProcessorMixin, ClientGUISc
        
    
-    def _GetValidTimestampDatas( self, only_changes = False ):
+    def _GetValidTimestampDatas( self, only_changes = False ) -> typing.List[ ClientTime.TimestampData ]:
        
        timestamps_manager = self._media.GetLocationsManager().GetTimestampsManager()
        
@@ -2541,6 +2570,8 @@ class EditFileTimestampsPanel( CAC.ApplicationCommandProcessorMixin, ClientGUISc
            
        
+        result = HydrusSerialisable.SerialisableList( result ).Duplicate()
+        
        return result
        
    
@@ -2571,9 +2602,9 @@ class EditFileTimestampsPanel( CAC.ApplicationCommandProcessorMixin, ClientGUISc
            
        
-        except:
+        except Exception as e:
            
-            QW.QMessageBox.critical( self, 'Error', 'Did not understand what was in the clipboard!' )
+            ClientGUIFunctions.PresentClipboardParseError( self, raw_text, 'A list of JSON-serialised Timestamp Data objects', e )
            
            return
            
@@ -3390,7 +3390,7 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
            except Exception as e:
                
-                QW.QMessageBox.critical( self, 'Critical', 'Could not apply style: {}'.format( str( e ) ) )
+                QW.QMessageBox.critical( self, 'Critical', 'Could not apply style: {}'.format( repr( e ) ) )
                
            
            try:
                
@@ -3406,7 +3406,7 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
            except Exception as e:
                
-                QW.QMessageBox.critical( self, 'Critical', 'Could not apply stylesheet: {}'.format( str( e ) ) )
+                QW.QMessageBox.critical( self, 'Critical', 'Could not apply stylesheet: {}'.format( repr( e ) ) )
                
            
        
@@ -4713,7 +4713,7 @@ class ManageURLsPanel( CAC.ApplicationCommandProcessorMixin, ClientGUIScrolledPa
        except Exception as e:
            
-            QW.QMessageBox.warning( self, 'Warning', 'I could not understand what was in the clipboard: {}'.format( e ) )
+            ClientGUIFunctions.PresentClipboardParseError( self, raw_text, 'Lines of URLs', e )
            
        
@@ -2297,7 +2297,7 @@ class ReviewDownloaderImport( ClientGUIScrolledPanels.ReviewPanel ):
        except Exception as e:
            
-            QW.QMessageBox.critical( self, 'Error', 'Sorry, seemed to be a problem: {}'.format( str( e ) ) )
+            QW.QMessageBox.critical( self, 'Error', 'Sorry, seemed to be a problem: {}'.format( repr( e ) ) )
            
            return
            
@@ -2345,7 +2345,7 @@ class ReviewDownloaderImport( ClientGUIScrolledPanels.ReviewPanel ):
class ReviewFileEmbeddedMetadata( ClientGUIScrolledPanels.ReviewPanel ):
    
-    def __init__( self, parent, exif_dict: typing.Optional[ dict ], file_text: typing.Optional[ str ] ):
+    def __init__( self, parent, exif_dict: typing.Optional[ dict ], file_text: typing.Optional[ str ], extra_rows: typing.List[ typing.Tuple[ str, str ] ] ):
        
        ClientGUIScrolledPanels.ReviewPanel.__init__( self, parent )
        
@@ -2376,6 +2376,16 @@ class ReviewFileEmbeddedMetadata( ClientGUIScrolledPanels.ReviewPanel ):
        #
        
+        extra_rows_panel = ClientGUICommon.StaticBox( self, 'extra info' )
+        
+        rows = [ ( f'{key}: ', ClientGUICommon.BetterStaticText( extra_rows_panel, label = value ) ) for ( key, value ) in extra_rows ]
+        
+        gridbox = ClientGUICommon.WrapInGrid( self, rows )
+        
+        extra_rows_panel.Add( gridbox, CC.FLAGS_EXPAND_PERPENDICULAR )
+        
+        #
+        
        if exif_dict is None:
            
            exif_panel.setVisible( False )
            
@@ -2408,12 +2418,18 @@ class ReviewFileEmbeddedMetadata( ClientGUIScrolledPanels.ReviewPanel ):
            self._text.setPlainText( file_text )
            
        
+        if len( extra_rows ) == 0:
+            
+            extra_rows_panel.setVisible( False )
+            
+        
        #
        
        vbox = QP.VBoxLayout()
        
        QP.AddToLayout( vbox, exif_panel, CC.FLAGS_EXPAND_BOTH_WAYS )
        QP.AddToLayout( vbox, text_panel, CC.FLAGS_EXPAND_BOTH_WAYS )
+        QP.AddToLayout( vbox, extra_rows_panel, CC.FLAGS_EXPAND_PERPENDICULAR )
        
        self.widget().setLayout( vbox )
        
@@ -209,7 +209,7 @@ class SingleStringConversionTestPanel( QW.QWidget ):
            except Exception as e:
                
-                results = [ 'error: {}'.format( str( e ) ) ]
+                results = [ 'error: {}'.format( repr( e ) ) ]
                
                stop_now = True
                
@@ -809,7 +809,7 @@ class EditSubscriptionPanel( ClientGUIScrolledPanels.EditPanel ):
        try:
            
-            pasted_text = HG.client_controller.GetClipboardText()
+            raw_text = HG.client_controller.GetClipboardText()
            
        except HydrusExceptions.DataMissing as e:
            
@@ -820,11 +820,11 @@ class EditSubscriptionPanel( ClientGUIScrolledPanels.EditPanel ):
        try:
            
-            pasted_query_texts = HydrusText.DeserialiseNewlinedTexts( pasted_text )
+            pasted_query_texts = HydrusText.DeserialiseNewlinedTexts( raw_text )
            
-        except:
+        except Exception as e:
            
-            QW.QMessageBox.critical( self, 'Error', 'I could not understand what was in the clipboard!' )
+            ClientGUIFunctions.PresentClipboardParseError( self, raw_text, 'Lines of Queries', e )
            
            return
            
@@ -1081,7 +1081,7 @@ class EditTagFilterPanel( ClientGUIScrolledPanels.EditPanel ):
        except Exception as e:
            
-            QW.QMessageBox.critical( self, 'Error', 'I could not understand what was in the clipboard' )
+            ClientGUIFunctions.PresentClipboardParseError( self, raw_text, 'JSON-serialised Tag Filter object', e )
            
            return
            
@@ -2720,7 +2720,7 @@ class ManageTagsPanel( CAC.ApplicationCommandProcessorMixin, ClientGUIScrolledPa
        try:
            
-            text = HG.client_controller.GetClipboardText()
+            raw_text = HG.client_controller.GetClipboardText()
            
        except HydrusExceptions.DataMissing as e:
            
@@ -2731,7 +2731,7 @@ class ManageTagsPanel( CAC.ApplicationCommandProcessorMixin, ClientGUIScrolledPa
        try:
            
-            tags = HydrusText.DeserialiseNewlinedTexts( text )
+            tags = HydrusText.DeserialiseNewlinedTexts( raw_text )
            
            tags = HydrusTags.CleanTags( tags )
            
@@ -2739,7 +2739,7 @@ class ManageTagsPanel( CAC.ApplicationCommandProcessorMixin, ClientGUIScrolledPa
        except Exception as e:
            
-            QW.QMessageBox.warning( self, 'Warning', 'I could not understand what was in the clipboard' )
+            ClientGUIFunctions.PresentClipboardParseError( self, raw_text, 'Lines of tags', e )
            
        
@@ -3590,7 +3590,7 @@ class ManageTagParents( ClientGUIScrolledPanels.ManagePanel ):
            try:
                
-                import_string = HG.client_controller.GetClipboardText()
+                raw_text = HG.client_controller.GetClipboardText()
                
            except HydrusExceptions.DataMissing as e:
                
@@ -3599,11 +3599,18 @@ class ManageTagParents( ClientGUIScrolledPanels.ManagePanel ):
                return
                
            
-            pairs = self._DeserialiseImportString( import_string )
-            
-            self._AddPairs( pairs, add_only = add_only )
-            
-            self._UpdateListCtrlData()
+            try:
+                
+                pairs = self._DeserialiseImportString( raw_text )
+                
+                self._AddPairs( pairs, add_only = add_only )
+                
+                self._UpdateListCtrlData()
+                
+            except Exception as e:
+                
+                ClientGUIFunctions.PresentClipboardParseError( self, raw_text, 'Lines of child-parent line-pairs', e )
+                
            
        
        def _ImportFromTXT( self, add_only = False ):
            
@@ -4899,7 +4906,7 @@ class ManageTagSiblings( ClientGUIScrolledPanels.ManagePanel ):
            try:
                
-                import_string = HG.client_controller.GetClipboardText()
+                raw_text = HG.client_controller.GetClipboardText()
                
            except HydrusExceptions.DataMissing as e:
                
@@ -4908,15 +4915,22 @@ class ManageTagSiblings( ClientGUIScrolledPanels.ManagePanel ):
                return
                
            
-            pairs = self._DeserialiseImportString( import_string )
-            
-            self._AutoPetitionConflicts( pairs )
-            
-            self._AutoPetitionLoops( pairs )
-            
-            self._AddPairs( pairs, add_only = add_only )
-            
-            self._UpdateListCtrlData()
+            try:
+                
+                pairs = self._DeserialiseImportString( raw_text )
+                
+                self._AutoPetitionConflicts( pairs )
+                
+                self._AutoPetitionLoops( pairs )
+                
+                self._AddPairs( pairs, add_only = add_only )
+                
+                self._UpdateListCtrlData()
+                
+            except Exception as e:
+                
+                ClientGUIFunctions.PresentClipboardParseError( self, raw_text, 'Lines of lesser-ideal sibling line-pairs', e )
+                
            
        
        def _ImportFromTXT( self, add_only = False ):
            
@@ -540,9 +540,9 @@ class DateTimeCtrl( QW.QWidget ):
            qt_time = QC.QDateTime.fromSecsSinceEpoch( timestamp )
            
        
-        except:
+        except Exception as e:
            
-            QW.QMessageBox.critical( self, 'Error', 'Did not understand what was in the clipboard!' )
+            ClientGUIFunctions.PresentClipboardParseError( self, raw_text, 'A simple integer timestamp', e )
            
            return
            
@@ -855,8 +855,11 @@ class CanvasHoverFrameTop( CanvasHoverFrame ):
        has_exif = self._current_media.GetMediaResult().GetFileInfoManager().has_exif
        has_human_readable_embedded_metadata = self._current_media.GetMediaResult().GetFileInfoManager().has_human_readable_embedded_metadata
+        has_extra_rows = self._current_media.GetMime() == HC.IMAGE_JPEG
        
-        if has_exif or has_human_readable_embedded_metadata:
+        stuff_to_show = has_exif or has_human_readable_embedded_metadata or has_extra_rows
+        
+        if stuff_to_show:
            
            tt_components = []
            
@@ -870,12 +873,17 @@ class CanvasHoverFrameTop( CanvasHoverFrame ):
                tt_components.append( 'non-exif human-readable embedded metadata')
                
            
+            if has_extra_rows:
+                
+                tt_components.append( 'extra info' )
+                
+            
            tt = 'show {}'.format( ' and '.join( tt_components ) )
            
            self._show_embedded_metadata_button.setToolTip( tt )
            
        
-        self._show_embedded_metadata_button.setVisible( has_exif or has_human_readable_embedded_metadata )
+        self._show_embedded_metadata_button.setVisible( stuff_to_show )
        
@@ -590,7 +590,7 @@ class FilenameTaggingOptionsPanel( QW.QWidget ):
        try:
            
-            text = HG.client_controller.GetClipboardText()
+            raw_text = HG.client_controller.GetClipboardText()
            
        except HydrusExceptions.DataMissing as e:
            
@@ -601,15 +601,17 @@ class FilenameTaggingOptionsPanel( QW.QWidget ):
        try:
            
-            tags = HydrusText.DeserialiseNewlinedTexts( text )
+            tags = HydrusText.DeserialiseNewlinedTexts( raw_text )
            
            tags = HydrusTags.CleanTags( tags )
            
            return tags
            
-        except:
+        except Exception as e:
            
-            raise Exception( 'I could not understand what was in the clipboard' )
+            ClientGUIFunctions.PresentClipboardParseError( self, raw_text, 'Lines of tags', e )
+            
+            raise
            
        
@@ -15,6 +15,7 @@ from hydrus.client import ClientConstants as CC
 from hydrus.client.gui import ClientGUICore as CGC
 from hydrus.client.gui import ClientGUIDialogs
 from hydrus.client.gui import ClientGUIDialogsQuick
+from hydrus.client.gui import ClientGUIFunctions
 from hydrus.client.gui import ClientGUIMenus
 from hydrus.client.gui import ClientGUIOptionsPanels
 from hydrus.client.gui import ClientGUIScrolledPanels
@@ -1883,9 +1884,7 @@ class ImportOptionsButton( ClientGUICommon.ButtonWithMenuArrow ):
        except Exception as e:
            
-            QW.QMessageBox.critical( self, 'Error', 'I could not understand what was in the clipboard: {}'.format( e ) )
-            
-            HydrusData.ShowException( e )
+            ClientGUIFunctions.PresentClipboardParseError( self, raw_text, 'JSON-serialised File Import Options', e )
            
            return
            
@@ -1924,9 +1923,7 @@ class ImportOptionsButton( ClientGUICommon.ButtonWithMenuArrow ):
        except Exception as e:
            
-            QW.QMessageBox.critical( self, 'Error', 'I could not understand what was in the clipboard: {}'.format( e ) )
-            
-            HydrusData.ShowException( e )
+            ClientGUIFunctions.PresentClipboardParseError( self, raw_text, 'JSON-serialised Note Import Options', e )
            
            return
            
@@ -1965,9 +1962,7 @@ class ImportOptionsButton( ClientGUICommon.ButtonWithMenuArrow ):
        except Exception as e:
            
-            QW.QMessageBox.critical( self, 'Error', 'I could not understand what was in the clipboard: {}'.format( e ) )
-            
-            HydrusData.ShowException( e )
+            ClientGUIFunctions.PresentClipboardParseError( self, raw_text, 'JSON-serialised Tag Import Options', e )
            
            return
            
@@ -517,7 +517,7 @@ class AddEditDeleteListBox( QW.QWidget ):
        except Exception as e:
            
-            QW.QMessageBox.critical( self, 'Error', 'I could not understand what was in the clipboard' )
+            ClientGUIFunctions.PresentClipboardParseError( self, raw_text, 'JSON-serialised Hydrus Object(s)', e )
            
        
@@ -869,10 +869,10 @@ class COLUMN_LIST_URL_CLASS_API_PAIRS( COLUMN_LIST_DEFINITION ):
    API_URL_CLASS = 1
    

-column_list_type_name_lookup[ COLUMN_LIST_URL_CLASS_API_PAIRS.ID ] = 'api links'
+column_list_type_name_lookup[ COLUMN_LIST_URL_CLASS_API_PAIRS.ID ] = 'api/redirect links'

register_column_type( COLUMN_LIST_URL_CLASS_API_PAIRS.ID, COLUMN_LIST_URL_CLASS_API_PAIRS.URL_CLASS, 'url class', False, 36, True )
-register_column_type( COLUMN_LIST_URL_CLASS_API_PAIRS.ID, COLUMN_LIST_URL_CLASS_API_PAIRS.API_URL_CLASS, 'api url class', False, 36, True )
+register_column_type( COLUMN_LIST_URL_CLASS_API_PAIRS.ID, COLUMN_LIST_URL_CLASS_API_PAIRS.API_URL_CLASS, 'api/redirect url class', False, 36, True )

default_column_list_sort_lookup[ COLUMN_LIST_URL_CLASS_API_PAIRS.ID ] = ( COLUMN_LIST_URL_CLASS_API_PAIRS.API_URL_CLASS, True )

@@ -1233,13 +1233,13 @@ class BetterListCtrlPanel( QW.QWidget ):
        except HydrusExceptions.SerialisationException as e:
            
-            QW.QMessageBox.critical( self, 'Problem loading', 'Problem loading that object: {}'.format( str( e ) ) )
+            QW.QMessageBox.critical( self, 'Problem loading', 'Problem loading that object: {}'.format( repr( e ) ) )
            
            return
            
        except Exception as e:
            
-            QW.QMessageBox.critical( self, 'Error', 'I could not understand what was in the clipboard: {}'.format( str( e ) ) )
+            QW.QMessageBox.critical( self, 'Error', 'I could not understand what was in the clipboard: {}'.format( repr( e ) ) )
            
            return
            
@@ -1261,15 +1261,9 @@ class BetterListCtrlPanel( QW.QWidget ):
            obj = HydrusSerialisable.CreateFromString( raw_text, raise_error_on_future_version = True )
            
-        except HydrusExceptions.SerialisationException as e:
-            
-            QW.QMessageBox.critical( self, 'Problem loading', 'Problem loading that object: {}'.format( str( e ) ) )
-            
-            return
-            
        except Exception as e:
            
-            QW.QMessageBox.critical( self, 'Error', 'I could not understand what was in the clipboard: {}'.format( str( e ) ) )
+            ClientGUIFunctions.PresentClipboardParseError( self, raw_text, 'JSON-serialised Hydrus Object(s)', e )
            
            return
            
@@ -1281,7 +1275,7 @@ class BetterListCtrlPanel( QW.QWidget ):
        except Exception as e:
            
-            QW.QMessageBox.critical( self, 'Error', 'Problem importing: {}'.format( str( e ) ) )
+            QW.QMessageBox.critical( self, 'Error', 'Problem importing: {}'.format( repr( e ) ) )
            
        
        self._listctrl.Sort()
        
@@ -3910,6 +3910,8 @@ class MediaPanelThumbnails( MediaPanel ):
        else:
            
+            # TODO: move away from this hell function GetPrettyInfoLines and set the timestamp tooltips to the be the full ISO time
+            
            pretty_info_lines = list( focus_singleton.GetPrettyInfoLines() )
            
            top_line = pretty_info_lines.pop( 0 )
            
@@ -18,6 +18,7 @@ from hydrus.client import ClientSerialisable
 from hydrus.client import ClientStrings
 from hydrus.client import ClientThreading
 from hydrus.client.gui import ClientGUIDialogsQuick
+from hydrus.client.gui import ClientGUIFunctions
 from hydrus.client.gui import ClientGUIMenus
 from hydrus.client.gui import ClientGUICore as CGC
 from hydrus.client.gui import ClientGUIScrolledPanels
@@ -278,9 +279,9 @@ class EditNodes( QW.QWidget ):
     self._ImportObject( obj )

-except:
+except Exception as e:

-    QW.QMessageBox.critical( self, 'Error', 'I could not understand what was in the clipboard' )
+    ClientGUIFunctions.PresentClipboardParseError( self, raw_text, 'JSON-serialised Nodes', e )
@@ -1136,7 +1137,7 @@ class ManageParsingScriptsPanel( ClientGUIScrolledPanels.ManagePanel ):
 except Exception as e:

-    QW.QMessageBox.critical( self, 'Error', 'I could not understand what was in the clipboard' )
+    ClientGUIFunctions.PresentClipboardParseError( self, raw_text, 'JSON-serialised Parsing Scripts', e )
@@ -210,7 +210,14 @@ class TestPanel( QW.QWidget ):
     return

-self._SetExampleData( raw_text, example_bytes = raw_bytes )
+try:
+
+    self._SetExampleData( raw_text, example_bytes = raw_bytes )
+
+except Exception as e:
+
+    ClientGUIFunctions.PresentClipboardParseError( self, raw_text, 'UTF-8 text', e )

 def _SetExampleData( self, example_data, example_bytes = None ):
@@ -2857,9 +2857,9 @@ class AutoCompleteDropdownTagsWrite( AutoCompleteDropdownTags ):
     self.tagsPasted.emit( list( tags ) )

-except:
+except Exception as e:

-    QW.QMessageBox.critical( self, 'Error', 'I could not understand what was in the clipboard' )
+    ClientGUIFunctions.PresentClipboardParseError( self, raw_text, 'Lines of tags', e )

     raise
@@ -515,10 +515,10 @@ class PanelPredicateSystemAgeDelta( PanelPredicateSystemSingle ):
 self._sign = TimeDeltaOperator( self )

-self._years = ClientGUICommon.BetterSpinBox( self, max = 30, width = 60 )
-self._months = ClientGUICommon.BetterSpinBox( self, max = 60, width = 60 )
-self._days = ClientGUICommon.BetterSpinBox( self, max = 90, width = 60 )
-self._hours = ClientGUICommon.BetterSpinBox( self, max = 24, width = 60 )
+self._years = ClientGUICommon.BetterSpinBox( self, max = 50000 )
+self._months = ClientGUICommon.BetterSpinBox( self, max = 1000 )
+self._days = ClientGUICommon.BetterSpinBox( self, max = 1000 )
+self._hours = ClientGUICommon.BetterSpinBox( self, max = 10000 )

 #
@@ -575,10 +575,10 @@ class PanelPredicateSystemLastViewedDelta( PanelPredicateSystemSingle ):
 self._sign = TimeDeltaOperator( self )

-self._years = ClientGUICommon.BetterSpinBox( self, max=30 )
-self._months = ClientGUICommon.BetterSpinBox( self, max=60 )
-self._days = ClientGUICommon.BetterSpinBox( self, max=90 )
-self._hours = ClientGUICommon.BetterSpinBox( self, max=24 )
+self._years = ClientGUICommon.BetterSpinBox( self, max = 50000 )
+self._months = ClientGUICommon.BetterSpinBox( self, max = 1000 )
+self._days = ClientGUICommon.BetterSpinBox( self, max = 1000 )
+self._hours = ClientGUICommon.BetterSpinBox( self, max = 10000 )

 #
@@ -634,10 +634,10 @@ class PanelPredicateSystemArchivedDelta( PanelPredicateSystemSingle ):
 self._sign = TimeDeltaOperator( self )

-self._years = ClientGUICommon.BetterSpinBox( self, max=30 )
-self._months = ClientGUICommon.BetterSpinBox( self, max=60 )
-self._days = ClientGUICommon.BetterSpinBox( self, max=90 )
-self._hours = ClientGUICommon.BetterSpinBox( self, max=24 )
+self._years = ClientGUICommon.BetterSpinBox( self, max = 50000 )
+self._months = ClientGUICommon.BetterSpinBox( self, max = 1000 )
+self._days = ClientGUICommon.BetterSpinBox( self, max = 1000 )
+self._hours = ClientGUICommon.BetterSpinBox( self, max = 10000 )

 #
@@ -693,10 +693,10 @@ class PanelPredicateSystemModifiedDelta( PanelPredicateSystemSingle ):
 self._sign = TimeDeltaOperator( self )

-self._years = ClientGUICommon.BetterSpinBox( self, max=30 )
-self._months = ClientGUICommon.BetterSpinBox( self, max=60 )
-self._days = ClientGUICommon.BetterSpinBox( self, max=90 )
-self._hours = ClientGUICommon.BetterSpinBox( self, max=24 )
+self._years = ClientGUICommon.BetterSpinBox( self, max = 50000 )
+self._months = ClientGUICommon.BetterSpinBox( self, max = 1000 )
+self._days = ClientGUICommon.BetterSpinBox( self, max = 1000 )
+self._hours = ClientGUICommon.BetterSpinBox( self, max = 10000 )

 #
@@ -202,7 +202,7 @@ class EditPredicatesPanel( ClientGUIScrolledPanels.EditPanel ):
 AGE_DELTA_PRED = ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_AGE, ( '>', 'delta', ( 2000, 1, 1, 1 ) ) )
 LAST_VIEWED_DELTA_PRED = ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_LAST_VIEWED_TIME, ( '>', 'delta', ( 2000, 1, 1, 1 ) ) )
-ARCHIVED_DELTA_PRED = ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_MODIFIED_TIME, ( '>', 'delta', ( 2000, 1, 1, 1 ) ) )
+ARCHIVED_DELTA_PRED = ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_ARCHIVED_TIME, ( '>', 'delta', ( 2000, 1, 1, 1 ) ) )
 MODIFIED_DELTA_PRED = ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_MODIFIED_TIME, ( '>', 'delta', ( 2000, 1, 1, 1 ) ) )
 KNOWN_URL_EXACT = ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_KNOWN_URLS, ( True, 'exact_match', '', '' ) )
 KNOWN_URL_DOMAIN = ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_KNOWN_URLS, ( True, 'domain', '', '' ) )
@@ -9,6 +9,7 @@ from hydrus.core import HydrusGlobals as HG
 from hydrus.core import HydrusTime

 from hydrus.client.gui import ClientGUICore as CGC
+from hydrus.client.gui import ClientGUIFunctions
 from hydrus.client.gui import ClientGUIMenus
 from hydrus.client.gui import ClientGUIStyle
 from hydrus.client.gui import QtInit
@@ -124,7 +125,7 @@ class ColourPickerButton( QW.QPushButton ):
 try:

-    import_string = HG.client_controller.GetClipboardText()
+    raw_text = HG.client_controller.GetClipboardText()

 except Exception as e:
@@ -133,6 +134,8 @@
     return

+import_string = raw_text
+
 if import_string.startswith( '#' ):

     import_string = import_string[1:]
@@ -153,9 +156,7 @@
 except Exception as e:

-    QW.QMessageBox.critical( self, 'Error', str(e) )
-
-    HydrusData.ShowException( e )
+    ClientGUIFunctions.PresentClipboardParseError( self, raw_text, 'A hex colour like #FF0050', e )

     return
@@ -485,9 +485,9 @@ class TextAndPasteCtrl( QW.QWidget ):
 self._add_callable( texts )

-except:
+except Exception as e:

-    QW.QMessageBox.critical( self, 'Error', 'I could not understand what was in the clipboard' )
+    ClientGUIFunctions.PresentClipboardParseError( self, raw_text, 'Lines of text', e )
@@ -624,7 +624,9 @@ class FileSeed( HydrusSerialisable.SerialisableBase ):
 status_hook( 'downloading file' )

-network_job = network_job_factory( 'GET', file_url, temp_path = temp_path, referral_url = referral_url )
+url_to_fetch = HG.client_controller.network_engine.domain_manager.GetURLToFetch( file_url )
+
+network_job = network_job_factory( 'GET', url_to_fetch, temp_path = temp_path, referral_url = referral_url )

 for ( key, value ) in self._request_headers.items():
@@ -645,9 +647,14 @@
 network_job.WaitUntilDone()

+if url_to_fetch != file_url:
+
+    self._AddPrimaryURLs( ( url_to_fetch, ) )
+
 actual_fetched_url = network_job.GetActualFetchedURL()

-if actual_fetched_url != file_url:
+if actual_fetched_url not in ( file_url, url_to_fetch ):

     self._AddPrimaryURLs( ( actual_fetched_url, ) )
@@ -1357,7 +1364,7 @@
 network_job = network_job_factory( 'GET', url_to_check, referral_url = referral_url )

 for ( key, value ) in self._request_headers.items():

     network_job.AddAdditionalHeader( key, value )
@@ -359,7 +359,7 @@ class FileImportJob( object ):
 except Exception as e:

-    raise HydrusExceptions.DamagedOrUnusualFileException( 'Could not render a thumbnail: {}'.format( str( e ) ) )
+    raise HydrusExceptions.DamagedOrUnusualFileException( 'Could not render a thumbnail: {}'.format( repr( e ) ) )
@@ -1649,7 +1649,7 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):
 self._paused = True

-self._DelayWork( 300, 'error: {}'.format( str( e ) ) )
+self._DelayWork( 300, 'error: {}'.format( repr( e ) ) )

 return
@@ -425,7 +425,7 @@ def ParseClientAPIPOSTArgs( request ):
 except json.decoder.JSONDecodeError as e:

-    raise HydrusExceptions.BadRequestException( 'Sorry, did not understand the JSON you gave me: {}'.format( str( e ) ) )
+    raise HydrusExceptions.BadRequestException( 'Sorry, did not understand the JSON you gave me: {}'.format( e ) )

 parsed_request_args = ParseClientAPIPOSTByteArgs( args )
@@ -1948,7 +1948,7 @@ class HydrusResourceClientAPIRestrictedAddTagsAddTags( HydrusResourceClientAPIRe
 if service_keys_to_actions_to_tags is None:

-    raise HydrusExceptions.BadRequestException( 'Need a service-names-to-tags parameter!' )
+    raise HydrusExceptions.BadRequestException( 'Need a service_keys_to_tags or service_keys_to_actions_to_tags parameter!' )

 service_keys_to_content_updates = collections.defaultdict( list )
@@ -292,11 +292,11 @@ class NetworkEngine( object ):
 if job.IsHydrusJob():

-    message = 'This hydrus service (' + job.GetLoginNetworkContext().ToString() + ') could not do work because: {}'.format( str( e ) )
+    message = 'This hydrus service (' + job.GetLoginNetworkContext().ToString() + ') could not do work because: {}'.format( repr( e ) )

 else:

-    message = 'This job\'s network context (' + job.GetLoginNetworkContext().ToString() + ') seems to have an invalid login. The error was: {}'.format( str( e ) )
+    message = 'This job\'s network context (' + job.GetLoginNetworkContext().ToString() + ') seems to have an invalid login. The error was: {}'.format( repr( e ) )

 job.Cancel( message )
@@ -239,7 +239,7 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):
 if api_url_class is None:

-    raise HydrusExceptions.URLClassException( 'Could not find an API URL Class for ' + api_url + ' URL, which originally came from ' + url + '!' )
+    raise HydrusExceptions.URLClassException( 'Could not find an API/Redirect URL Class for ' + api_url + ' URL, which originally came from ' + url + '!' )

 if api_url_class in seen_url_classes:
@@ -248,15 +248,15 @@
 if loop_size == 1:

-    message = 'Could not find an API URL Class for ' + url + ' as the url class API-linked to itself!'
+    message = 'Could not find an API/Redirect URL Class for ' + url + ' as the url class API-linked to itself!'

 elif loop_size == 2:

-    message = 'Could not find an API URL Class for ' + url + ' as the url class and its API url class API-linked to each other!'
+    message = 'Could not find an API/Redirect URL Class for ' + url + ' as the url class and its API url class API-linked to each other!'

 else:

-    message = 'Could not find an API URL Class for ' + url + ' as it and its API url classes linked in a loop of size ' + HydrusData.ToHumanInt( loop_size ) + '!'
+    message = 'Could not find an API/Redirect URL Class for ' + url + ' as it and its API url classes linked in a loop of size ' + HydrusData.ToHumanInt( loop_size ) + '!'

 raise HydrusExceptions.URLClassException( message )
@@ -323,6 +323,27 @@
     return None

+def _GetURLToFetch( self, url: str ):
+
+    url_class = self._GetURLClass( url )
+
+    if url_class is None:
+
+        return url
+
+    try:
+
+        ( url_class, url_to_fetch ) = self._GetNormalisedAPIURLClassAndURL( url )
+
+    except HydrusExceptions.URLClassException as e:
+
+        raise HydrusExceptions.URLClassException( 'Could not find a URL class for ' + url + '!' + os.linesep * 2 + str( e ) )
+
+    return url_to_fetch
+
 def _GetURLToFetchAndParser( self, url ):

     try:
@@ -331,7 +352,7 @@
 except HydrusExceptions.URLClassException as e:

-    raise HydrusExceptions.URLClassException( 'Could not find a parser for ' + url + '!' + os.linesep * 2 + str( e ) )
+    raise HydrusExceptions.URLClassException( 'Could not find a URL class for ' + url + '!' + os.linesep * 2 + str( e ) )

 url_class_key = parser_url_class.GetClassKey()
@@ -1392,6 +1413,21 @@
     return ( url_type, match_name, can_parse, cannot_parse_reason )

+def GetURLToFetch( self, url ):
+
+    with self._lock:
+
+        url_to_fetch = self._GetURLToFetch( url )
+
+        if HG.network_report_mode:
+
+            HydrusData.ShowText( 'request for URL to fetch: {} -> {}'.format( url, url_to_fetch ) )
+
+        return url_to_fetch
+
 def GetURLToFetchAndParser( self, url ):

     with self._lock:
@@ -1717,7 +1717,7 @@ class NetworkJob( object ):
 self.engine.domain_manager.ReportNetworkInfrastructureError( self._url )

-raise HydrusExceptions.ConnectionException( 'Problem with SSL: {}'.format( str( e ) ) )
+raise HydrusExceptions.ConnectionException( 'Problem with SSL: {}'.format( repr( e ) ) )

 except ( requests.exceptions.ConnectionError, requests.exceptions.ConnectTimeout ):
@@ -100,7 +100,7 @@ options = {}
 # Misc

 NETWORK_VERSION = 20
-SOFTWARE_VERSION = 524
+SOFTWARE_VERSION = 525
 CLIENT_API_VERSION = 44

 SERVER_THUMBNAIL_DIMENSIONS = ( 200, 200 )
@@ -1058,7 +1058,7 @@ URL_TYPE_SUB_GALLERY = 9
 url_type_string_lookup = {
     URL_TYPE_POST : 'post url',
-    URL_TYPE_API : 'api url',
+    URL_TYPE_API : 'api/redirect url',
     URL_TYPE_FILE : 'file url',
     URL_TYPE_GALLERY : 'gallery url',
     URL_TYPE_WATCHABLE : 'watchable url',
@@ -161,7 +161,7 @@ class HydrusController( object ):
 def _InitTempDir( self ):

-    self.temp_dir = HydrusTemp.GetTempDir()
+    self.temp_dir = HydrusTemp.GetHydrusTempDir()

 def _MaintainCallToThreads( self ):
@@ -793,6 +793,8 @@
 self._slow_job_scheduler = None

+HydrusTemp.CleanUpOldTempPaths()
+
 if hasattr( self, 'temp_dir' ):

     HydrusPaths.DeletePath( self.temp_dir )
@@ -705,6 +705,21 @@ def GetJPEGQuantizationQualityEstimate( path ):
     return ( 'unknown', None )

+def GetJpegSubsampling( pil_image: PILImage.Image ) -> str:
+
+    from PIL import JpegImagePlugin
+
+    result = JpegImagePlugin.get_sampling( pil_image )
+
+    subsampling_str_lookup = {
+        0 : '4:4:4',
+        1 : '4:2:2',
+        2 : '4:2:0'
+    }
+
+    return subsampling_str_lookup.get( result, 'unknown' )
+
 def GetEmbeddedFileText( pil_image: PILImage.Image ) -> typing.Optional[ str ]:

     def render_dict( d, prefix ):
@@ -41,6 +41,7 @@ def CleanUpTempPath( os_file_handle, temp_path ):

+
 def CleanUpOldTempPaths():

     with TEMP_PATH_LOCK:
@@ -61,7 +62,7 @@ def CleanUpOldTempPaths():
 except OSError:

-    if HydrusTime.TimeHasPassed( time_failed + 600 ):
+    if HydrusTime.TimeHasPassed( time_failed + 1200 ):

         IN_USE_TEMP_PATHS.discard( row )
@@ -70,14 +71,28 @@
 def GetCurrentTempDir():

     return tempfile.gettempdir()

-def GetTempDir( dir = None ):
-
-    return tempfile.mkdtemp( prefix = 'hydrus', dir = dir )
+HYDRUS_TEMP_DIR = None
+
+def GetHydrusTempDir():
+
+    path = tempfile.mkdtemp( prefix = 'hydrus' )
+
+    global HYDRUS_TEMP_DIR
+
+    if HYDRUS_TEMP_DIR is None:
+
+        HYDRUS_TEMP_DIR = path
+
+    return path

 def SetEnvTempDir( path ):

     if os.path.exists( path ) and not os.path.isdir( path ):
@@ -109,7 +124,22 @@ def SetEnvTempDir( path ):
 tempfile.tempdir = path

 def GetTempPath( suffix = '', dir = None ):

+    global HYDRUS_TEMP_DIR
+
+    if dir is None and HYDRUS_TEMP_DIR is not None:
+
+        dir = HYDRUS_TEMP_DIR
+
+        if not os.path.exists( dir ):
+
+            HYDRUS_TEMP_DIR = None
+
+            dir = GetHydrusTempDir()
+
     return tempfile.mkstemp( suffix = suffix, prefix = 'hydrus', dir = dir )
@@ -1,3 +1,4 @@
+import calendar
 import datetime
 import time
@@ -121,6 +122,44 @@ def TimeUntil( timestamp ):
     return timestamp - GetNow()

+def CalendarDeltaToDateTime( years : int, months : int, days : int, hours : int ) -> datetime.datetime:
+
+    now = datetime.datetime.now()
+
+    day_and_hour_delta = datetime.timedelta( days = days, hours = hours )
+
+    result = now - day_and_hour_delta
+
+    while months > result.month:
+
+        years += 1
+        months -= 12
+
+    new_year = result.year - years
+    new_month = result.month - months
+
+    dayrange = calendar.monthrange( new_year, new_month )
+
+    new_day = min( dayrange[1], result.day )
+
+    result = datetime.datetime(
+        year = new_year,
+        month = new_month,
+        day = new_day,
+        hour = result.hour,
+        minute = result.minute,
+        second = result.second
+    )
+
+    return result
+
+def CalendarDeltaToRoughDateTimeTimeDelta( years : int, months : int, days : int, hours : int ) -> datetime.timedelta:
+
+    return datetime.timedelta( days = days + ( months * ( 365.25 / 12 ) ) + ( years * 365.25 ), hours = hours )
+
 def TimeDeltaToPrettyTimeDelta( seconds, show_seconds = True ):

     if seconds is None:
@@ -145,8 +184,8 @@
 MINUTE = 60
 HOUR = 60 * MINUTE
 DAY = 24 * HOUR
-MONTH = 30 * DAY
-YEAR = 365 * DAY
+YEAR = 365.25 * DAY
+MONTH = YEAR / 12

 lines = []
@@ -802,7 +802,7 @@ class HydrusResource( Resource ):
 if isinstance( e, HydrusExceptions.DBException ):

-    e = e.db_e # could well be a DataException
+    e = e.db_e # could well be a DataException, which we want to promote

 try: self._CleanUpTempFile( request )
@@ -185,10 +185,10 @@ SYSTEM_PREDICATES = {
 'limit': (Predicate.LIMIT, Operators.ONLY_EQUAL, Value.NATURAL, None),
 'file ?type': (Predicate.FILETYPE, Operators.ONLY_EQUAL, Value.FILETYPE_LIST, None),
 'hash': (Predicate.HASH, Operators.EQUAL, Value.HASHLIST_WITH_ALGORITHM, None),
-'archived? date|date archived': (Predicate.ARCHIVED_DATE, Operators.RELATIONAL, Value.DATE_OR_TIME_INTERVAL, None),
-'modified date|date modified': (Predicate.MOD_DATE, Operators.RELATIONAL, Value.DATE_OR_TIME_INTERVAL, None),
-'last viewed time|last view time': (Predicate.LAST_VIEWED_TIME, Operators.RELATIONAL, Value.DATE_OR_TIME_INTERVAL, None),
-'time imported|import time': (Predicate.TIME_IMPORTED, Operators.RELATIONAL, Value.DATE_OR_TIME_INTERVAL, None),
+'archived? (date|time)|(date|time) archived': (Predicate.ARCHIVED_DATE, Operators.RELATIONAL, Value.DATE_OR_TIME_INTERVAL, None),
+'modified (date|time)|(date|time) modified': (Predicate.MOD_DATE, Operators.RELATIONAL, Value.DATE_OR_TIME_INTERVAL, None),
+'last view(ed)? (date|time)|(date|time) last viewed': (Predicate.LAST_VIEWED_TIME, Operators.RELATIONAL, Value.DATE_OR_TIME_INTERVAL, None),
+'import(ed)? (date|time)|(date|time) imported': (Predicate.TIME_IMPORTED, Operators.RELATIONAL, Value.DATE_OR_TIME_INTERVAL, None),
 'duration': (Predicate.DURATION, Operators.RELATIONAL, Value.TIME_SEC_MSEC, None),
 'framerate': (Predicate.FRAMERATE, Operators.RELATIONAL_EXACT, Value.NATURAL, Units.FPS_OR_NONE),
 'number of frames': (Predicate.NUM_OF_FRAMES, Operators.RELATIONAL, Value.NATURAL, None),
@@ -334,8 +334,6 @@ def parse_value( string: str, spec ):
 months = int( match.group( 'month' ) ) if match.group( 'month' ) else 0
 days = int( match.group( 'day' ) ) if match.group( 'day' ) else 0
 hours = int( match.group( 'hour' ) ) if match.group( 'hour' ) else 0
-days += math.floor( hours / 24 )
-hours = hours % 24
 return string[ len( match[ 0 ] ): ], (years, months, days, hours)
 match = re.match( '(?P<year>[0-9][0-9][0-9][0-9])-(?P<month>[0-9][0-9]?)-(?P<day>[0-9][0-9]?)', string )
 if match:
@@ -20,7 +20,7 @@ class TestDaemons( unittest.TestCase ):
 def test_import_folders_daemon( self ):

-    test_dir = HydrusTemp.GetTempDir()
+    test_dir = HydrusTemp.GetHydrusTempDir()

     try:
@@ -2037,30 +2037,46 @@ class TestTagObjects( unittest.TestCase ):
 ( 'system:md5 hash is not abcdef01', "system:Hash != Abcdef01 md5" ),
 ( 'system:md5 hash is not abcdef01', "system:Hash is not Abcdef01 md5" ),
 ( 'system:sha256 hash is abcdef0102', "system:hash = abcdef0102" ),
-( 'system:archived time: since 7 years 1 month ago', "system:archived date < 7 years 45 days 70h" ),
-( 'system:archived time: since 7 years 1 month ago', "system:archive date < 7 years 45 days 70h" ),
-( 'system:modified time: since 7 years 1 month ago', "system:modified date < 7 years 45 days 70h" ),
+( 'system:archived time: since 7 years 45 days ago', "system:archived date < 7 years 45 days 700h" ),
+( 'system:archived time: since 7 years 45 days ago', "system:archive date < 7 years 45 days 700h" ),
+( 'system:archived time: since 7 years 45 days ago', "system:archived time < 7 years 45 days 700h" ),
+( 'system:archived time: since 7 years 45 days ago', "system:archive time < 7 years 45 days 700h" ),
+( 'system:archived time: since 7 years 45 days ago', "system:date archived < 7 years 45 days 700h" ),
+( 'system:archived time: since 7 years 45 days ago', "system:time archived < 7 years 45 days 700h" ),
+( 'system:modified time: since 7 years 45 days ago', "system:modified date < 7 years 45 days 700h" ),
 ( 'system:modified time: since 2011-06-04', "system:modified date > 2011-06-04" ),
 ( 'system:modified time: before 7 years 2 months ago', "system:date modified > 7 years 2 months" ),
 ( 'system:modified time: since 1 day ago', "system:date modified < 1 day" ),
+( 'system:modified time: since 1 day ago', "system:time modified < 1 day" ),
+( 'system:modified time: since 1 day ago', "system:modified date < 1 day" ),
+( 'system:modified time: since 1 day ago', "system:modified time < 1 day" ),
 ( 'system:modified time: since 1 month 1 day ago', "system:date modified < 0 years 1 month 1 day 1 hour" ),
-( 'system:last view time: since 7 years 1 month ago', "system:last viewed time < 7 years 45 days 70h" ),
-( 'system:last view time: since 7 years 1 month ago', "system:last view time < 7 years 45 days 70h" ),
-( 'system:import time: since 7 years 1 month ago', "system:time_imported < 7 years 45 days 70h" ),
+( 'system:last view time: since 7 years 45 days ago', "system:last viewed time < 7 years 45 days 700h" ),
+( 'system:last view time: since 7 years 45 days ago', "system:last viewed date < 7 years 45 days 700h" ),
+( 'system:last view time: since 7 years 45 days ago', "system:last view time < 7 years 45 days 700h" ),
+( 'system:last view time: since 7 years 45 days ago', "system:last view date < 7 years 45 days 700h" ),
+( 'system:last view time: since 7 years 45 days ago', "system:time last viewed < 7 years 45 days 700h" ),
+( 'system:last view time: since 7 years 45 days ago', "system:date last viewed < 7 years 45 days 700h" ),
+( 'system:import time: since 7 years 45 days ago', "system:time_imported < 7 years 45 days 700h" ),
 ( 'system:import time: since 2011-06-04', "system:time imported > 2011-06-04" ),
 ( 'system:import time: before 7 years 2 months ago', "system:time imported > 7 years 2 months" ),
 ( 'system:import time: since 1 day ago', "system:time imported < 1 day" ),
 ( 'system:import time: since 1 month 1 day ago', "system:time imported < 0 years 1 month 1 day 1 hour" ),
 ( 'system:import time: a month either side of 2011-01-03', " system:time imported ~= 2011-1-3 " ),
 ( 'system:import time: a month either side of 1996-05-02', "system:time imported ~= 1996-05-2" ),
-( 'system:import time: since 7 years 1 month ago', "system:import_time < 7 years 45 days 70h" ),
+( 'system:import time: since 7 years 45 days ago', "system:import_time < 7 years 45 days 700h" ),
+( 'system:import time: since 2011-06-04', "system:import time > 2011-06-04" ),
+( 'system:import time: before 7 years 2 months ago', "system:import time > 7 years 2 months" ),
+( 'system:import time: since 1 day ago', "system:import time < 1 day" ),
+( 'system:import time: around 1 day ago', "system:import time = 1 day" ),
+( 'system:import time: since 1 month 1 day ago', "system:import time < 0 years 1 month 1 day 1 hour" ),
+( 'system:import time: a month either side of 2011-01-03', " system:import time ~= 2011-1-3 " ),
+( 'system:import time: a month either side of 1996-05-02', "system:import time ~= 1996-05-2" ),
+( 'system:import time: since 1 day ago', "system:imported time < 1 day" ),
+( 'system:import time: since 1 day ago', "system:import date < 1 day" ),
+( 'system:import time: since 1 day ago', "system:imported date < 1 day" ),
+( 'system:import time: since 1 day ago', "system:date imported < 1 day" ),
 ( 'system:duration < 5.0 seconds', "system:duration < 5 seconds" ),
 ( 'system:duration \u2248 11.0 seconds', "system:duration ~= 5 sec 6000 msecs" ),
 ( 'system:duration > 3 milliseconds', "system:duration > 3 milliseconds" ),
@@ -105,13 +105,14 @@ IF "%install_type%" == "s" (
 IF "%install_type%" == "d" (

     python -m pip install -r static\requirements\advanced\requirements_core.txt
+    python -m pip install -r static\requirements\advanced\requirements_windows.txt

     python -m pip install -r static\requirements\advanced\requirements_qt6_test.txt
-    python -m pip install pyside2
-    python -m pip install PyQtChart PyQt5
+    python -m pip install PyQt6-Charts PyQt6
     python -m pip install -r static\requirements\advanced\requirements_mpv_new.txt
-    python -m pip install -r static\requirements\advanced\requirements_opencv_new.txt
+    python -m pip install -r static\requirements\advanced\requirements_opencv_test.txt
     python -m pip install -r static\requirements\hydev\requirements_windows_build.txt

 )
@@ -119,6 +120,7 @@ IF "%install_type%" == "d" (
 IF "%install_type%" == "a" (

     python -m pip install -r static\requirements\advanced\requirements_core.txt
+    python -m pip install -r static\requirements\advanced\requirements_windows.txt

     IF "%qt%" == "5" python -m pip install -r static\requirements\advanced\requirements_qt5.txt
     IF "%qt%" == "6" python -m pip install -r static\requirements\advanced\requirements_qt6.txt
@@ -33,6 +33,4 @@ pyinstaller==5.5
 mkdocs-material

-PyWin32
-pypiwin32
 pywin32-ctypes
 pefile
Binary file not shown. (Before: 3.0 KiB; After: 3.1 KiB)
Binary file not shown. (Before: 2.6 KiB; After: 2.5 KiB)
Binary file not shown. (After: 2.6 KiB)
@@ -0,0 +1 @@
+PyWin32
@@ -4,7 +4,4 @@ mock>=4.0.0
 httmock>=1.4.0
 mkdocs-material

-PyWin32
-pypiwin32
-pywin32-ctypes
-pefile
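The month-aware delta arithmetic this commit adds to HydrusTime (subtract days/hours linearly, then step back whole calendar months, clamping the day to the target month's length) can be sketched standalone. This is a paraphrase, not the shipped code: the name `calendar_delta_before` and the explicit `ref` parameter (in place of `datetime.datetime.now()`) are assumptions made here so the behaviour is testable.

```python
import calendar
import datetime

def calendar_delta_before(ref, years, months, days, hours):
    # Subtract days/hours linearly first, then walk back months/years on the
    # calendar, clamping the day-of-month to the target month's length
    # (e.g. March 31 minus one month lands on Feb 28).
    result = ref - datetime.timedelta(days=days, hours=hours)

    # borrow whole years while the month subtraction would underflow
    while months > result.month:
        years += 1
        months -= 12

    new_year = result.year - years
    new_month = result.month - months
    new_day = min(calendar.monthrange(new_year, new_month)[1], result.day)

    return result.replace(year=new_year, month=new_month, day=new_day)
```

The day-clamp via `calendar.monthrange` is what keeps the result a valid date; a naive `result.replace(month=result.month - months)` raises `ValueError` whenever the source day does not exist in the target month.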