!!! note
    This is the new changelog, only the most recent builds. For all versions, see the [old changelog](old_changelog.html).
## [Version 554](https://github.com/hydrusnetwork/hydrus/releases/tag/v554)

### checker options fixes

* **sorry for any jank 'static check interval' watcher or subscription timings you saw last week! I screwed something up and it slipped through testing**
* the 'static check interval' logic is now much simpler. rather than trying to always keep to the same check period even when the actual check is delayed, it just works off 'last check time + period' every time. the clever stuff was generally confusing and failing in a variety of ways
* fixed a bug in the new static check time code that was stopping certain in-limbo watchers from calculating their correct next check time on program load
* fixed a bug in the new static check time code that was causing too many checks in long-paused-and-now-unpaused downloaders
* some new unit tests will make sure these errors do not happen again
* in the checker options UI, if you uncheck 'just check at a static, regular interval' and leave the faster/slower values the same when you OK, the dialog now asks you if that is what you want
* in the checker options UI, the 'slower than' value will now automatically update itself to be no smaller than the 'faster than' value

### job status fixes and cleanup (mostly boring)

* **sorry for any 'Cancel/IsCancellable' related errors you saw last week! I screwed something else up**
* fixed a dumb infinite recursion error in the new job status cancellable 'safety' checks that happened when it was time to auto-dismiss a cancellable job due to program/thread shutdown or a maintenance mode change. this also fixes some non-dismissing popup messages (usually subscriptions) that weren't setting their cancel status correctly
* this happened because the code here was ancient and ugly. I have renamed, simplified, and reworked the logical dangerzone variables and methods in the job status object so we don't run into this problem again. 'Cancel' and 'Finish' no longer take a seconds parameter; 'Delete' is now 'FinishAndDismiss'; 'IsDeleted' is now 'IsDismissed'; 'IsDeletable' is merged into a cleaner 'IsDone'; 'IsWorking' is removed; and 'SetCancellable' and 'SetPausable' are removed (these are now always set in the init, and determine what type of job we have). the various new Client API calls and help are updated for all of this
* also, the job status methods now run their backstop 'cancel' tests far less often, and a throttle makes sure they can only run once a second anyway
* also ditched the needless threading events for simple bools
* also cleared up about 40 pointless Finish/FinishAndDismiss duplicate calls across the program
* also fixed up the job status object to do its various yield pauses more sanely

### cbz and ugoira detection and thumbnails

* CBZ files are now detected! there is no very strict standard for what is or isn't a CBZ (it is basically just a zip of images and maybe some metadata files), but I threw together a 'yeah, that looks like a cbz' test that now runs on every zip. there will probably be several false positives, but with luck fewer false negatives, which I think is the way we want to lean here. if you have just some zip of similarly named images, it'll now be counted as a CBZ, but I think we'll want to give those all the upcoming CBZ tech anyway, even if they aren't technically intended to be 'CBZ', whatever that actually means here other than the different file extension
* the client looks for the cover image in your CBZ and uses that for the thumbnail! it also uses this file's resolution as the CBZ resolution
* Ugoira files are now detected! there is a firmer standard for what an Ugoira is, but it is still tricky, as we are just talking about a different list of zipped image files here. I expect zero false negatives and some false positives (unfortunately, it'll be CBZs with zero-padded, numerical-only filenames). as all Ugoiras are valid CBZs but few CBZs are valid Ugoiras, the Ugoira test runs first
* the client now gets a thumbnail for Ugoiras. it'll also use the x%-in setting that other animations and videos use! it also fetches resolution and 'num frames'. duration can't be inferred just yet, but we hope to have some options (and actual rendering) happening in the medium-term future
* this is all an experiment. let me know how it goes, and send in any examples of it failing awfully. there is lots more to do. if things don't explode with this much, I'll see about .cbr and .cb7, which seem totally doable, and then I can seriously plan out UI for actual viewing and internal navigation. I can't promise proper reader features like bookmarks or anything, but I'll keep pushing
* all your existing zips will be scheduled for a filetype re-scan on update
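To make the lean of the two tests concrete, here is a rough sketch of the kind of heuristics described. This is my own illustration; hydrus's actual tests differ in detail:

```python
import re
import zipfile

IMAGE_EXTS = ('.jpg', '.jpeg', '.png', '.gif', '.webp')

# Ugoira-style names: zero-padded, numbered images like 000000.jpg
UGOIRA_NAME = re.compile(r'\A\d{6}\.(jpg|png)\Z', re.IGNORECASE)

def looks_like_ugoira(names: list[str]) -> bool:
    # strict: every file must match the zero-padded numerical pattern
    return bool(names) and all(UGOIRA_NAME.match(n) for n in names)

def looks_like_cbz(names: list[str]) -> bool:
    # loose: mostly images, maybe some metadata files alongside
    images = [n for n in names if n.lower().endswith(IMAGE_EXTS)]
    return bool(images) and 2 * len(images) >= len(names)

def classify_zip(path: str) -> str:
    with zipfile.ZipFile(path) as z:
        names = [n for n in z.namelist() if not n.endswith('/')]
    # all Ugoiras are valid CBZs, so the stricter Ugoira test runs first
    if looks_like_ugoira(names):
        return 'ugoira'
    if looks_like_cbz(names):
        return 'cbz'
    return 'zip'
```

Note how a CBZ with zero-padded, numerical-only filenames would pass the Ugoira test first, which is exactly the false-positive case mentioned above.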

### animations

* the native FFMPEG renderer pipeline is now capable of transparency. APNGs rendered in the native viewer now have correct transparency and can pass 'has transparency' checks
* all your apngs will be scheduled for the 'has transparency' check, just like pngs and gifs and stuff a couple of weeks ago. thanks to the user who submitted some transparency-having apngs to test with!
* the thumbnails for animated gifs are now taken using the FFMPEG renderer, which puts them x% in, just like APNG and other video. transparency in these thumbnails also seems to be good! I am not going to regen everyone's animated gif thumbs yet--I'll do some more IRL testing--but this will probably come in a few weeks. let me know if you see a bevy of newly imported gifs with crazy thumbs
* I also overhauled the native GIF renderer. what used to be a cobbled-together RGB OpenCV solution with a fallback to bad PIL code is now a proper PIL-only RGBA solution, and the transparency seems to be great now (the OpenCV code had no transparency, and the PIL fallback tried but generally drew each frame on top of the previous, giving a noclip effect). the new renderer also skips to an unrendered area faster
* given the file maintenance I/O Error problems we had the past couple of weeks, I also made this cleaner GIF renderer much more robust; it will generally just rewind itself or give a black frame if it runs into truncation problems, no worries, and for gifs that just have one weird frame that doesn't break seek, it should be able to skip past those now, repeating the last good frame until it hits something valid
* as a side thing, the FFMPEG GIF renderer seems capable of doing almost everything the PIL renderer can now. I can flip the code to using the FFMPEG pipeline and gifs come through fine, transparency included. I prefer PIL for now, but depending on how things go, I may add options to use the FFMPEG bridge as a testbed/fallback in future
* added some PIL animated gif rendering tech to handle a gif that out of nowhere produces a giga 85171x53524 frame, eating up multiple GB of memory and taking twenty seconds to fail to render
* fixed yet another potential source of the false positive I/O Errors caused by the recent 'has transparency' checking, this time not just in malformed animated gif frames, but in some busted static images too
* improved the PIL loading code a little more, converting more possible I/O Errors and other weird damaged file states to the correct hydrus-internal exception types with nicer error texts
* the 'disable CV for gifs' option is removed

### file pre-import checks

* the 'is this file free to work on' test that runs before files are added to the manual or import folder file list now has an additional file-open check. this improves reliability over NAS connections, where the file may be in use by a remote process, and also improves detection of files where the current user only has read permissions
* import folders now have a 'recent modified time skip period' setting, defaulting to 60 seconds. any file that has a modified date newer than that many seconds ago will not be imported on the current check. this helps to avoid importing files that are still being downloaded/copied into the folder when the import folder runs (when that folder/download process is otherwise immune to the existing 'already in use' checks)
* import folders now repeat-check folders that have many previously-seen files much faster
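The two checks together amount to something like the following sketch. This is an illustration under assumptions; the real tests also handle manual imports, retries, and more:

```python
import os
import time

def file_looks_free(path: str, skip_period_seconds: int = 60) -> bool:
    # skip files modified too recently: they may still be
    # downloading/copying into the folder
    if time.time() - os.path.getmtime(path) < skip_period_seconds:
        return False
    # additional file-open check: catches remote locks over NAS and
    # other in-use/permission situations that stat-style tests miss
    try:
        with open(path, 'rb'):
            pass
    except OSError:
        return False
    return True
```

A file that fails either test is simply left alone and picked up again on a later check.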

### misc

* the 'max gif size' setting in the quiet and loud file import options now defaults to 'no limit'. it used to be 32MB, to catch various trash webm re-encodes, but these days it catches more false positives than it is worth, and 32MB is less of a big deal these days too
* the test on boot to see if the given database location is writeable-to should now give an error when that location is on a non-existing drive (e.g. a removable usb drive that is not currently plugged in). previously, depending on the situation, it could either proceed and go crazy later or wait indefinitely in a CPU-heavy busy-wait for the drive to be plugged back in. unfortunately, because at this stage there is no logfile location and no UI, if your custom db dir does not and cannot exist, the program terminates instantly and silently writes a crash log to your desktop. I have a plan to improve this in future
* also cleaned up all the db_dir boot code generally. the various validity tests should now only happen once per potential location
* the function that converts an export phrase into a filename will now elide long unicode filenames correctly. filenames with complex unicode characters can take more than one byte per character (and most OSes have a ~255-byte filename limit), which requires a trickier check. also, on Windows, where there is a 260-character total path limit, the combined directory+filename length is checked better, and is only checked on Windows. all errors raised here are better
* added some unit tests to check the new path eliding tech
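Byte-aware eliding is the subtle part: you have to truncate the UTF-8 encoding rather than the character count, and without splitting a multi-byte character. A minimal sketch, not the actual hydrus function:

```python
def elide_filename_bytes(name: str, max_bytes: int = 250) -> str:
    encoded = name.encode('utf-8')
    if len(encoded) <= max_bytes:
        return name
    # cut at the byte limit, then drop any partial trailing character
    return encoded[:max_bytes].decode('utf-8', errors='ignore')
```

For plain ASCII this behaves like a simple length cap; for complex unicode it may keep far fewer characters than the byte limit suggests.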

* brushed up the 'getting started with ratings' help a little

### client api

* thanks to a user, the Client API can now see and interact with the current popup messages in the popup toaster!
* fixed a stupid typo I made in the new Client API options call. added a unit test to catch this in future, too
* the client api version is now 57

## [Version 553](https://github.com/hydrusnetwork/hydrus/releases/tag/v553)

### animated gif fixes

* re-sorted the job list dropdown in the file maintenance dialog
* some file maintenance database work should be a bit faster
* fixed some behind the scenes stuff for when the file history chart has no file info to show

## [Version 544](https://github.com/hydrusnetwork/hydrus/releases/tag/v544)

### webp vulnerability

* the main webp library (libwebp) that many programs use for webp support had a remote execution (very bad) vulnerability. you probably noticed your chrome/firefox updated this week, which was fixing this. we use the same thing via the `Pillow` library, which also rolled out a fix. I'm not sure how vulnerable hydrus ever was, since we are usually jank about how we do anything, but it is best to be safe about these things. there were apparently exploits for this floating around
* the builds today have the fix, so if you use them, update as normal and you are good
* if you run from source, **rebuild your venv at your earliest convenience** and you'll get the new version of Pillow and be good. note, if you use the advanced setup, that there is a new question about `Pillow`
* unfortunately, Windows 7 users (or anyone else running from source on Python 3.7) cannot get the fix! it needs Pillow 10.0.1, which is >=Python 3.8. it seems many large programs are dropping support for Win 7 this year, so while I will continue to support it for a reasonable while longer, I think the train may be running out of track
### max size in file storage system

* the `migrate database` dialog now allows you to set a 'max size' for all but one of your media locations. if you have a 500GB drive you want to store some stuff on, you no longer have to balance the weights in your head--just set a max size of 450GB and hydrus will figure it out for you. it is not super precise (and it isn't healthy to fill drives up to 98% anyway), so make sure you leave some padding
* also, please note that this will not automatically rebalance _yet_. right now, the only way files move between locations is through the 'move files now' button on the dialog, so if you have a location that is full up according to its max size rule and then spend a month importing many files, it will go over its limit until and unless you revisit 'migrate database' and move files again. I hope to have automatic background rebalancing in the near future
* updated the 'database migration' help to talk about this and added a new migration example
* the 'edit num bytes' widget now supports terabytes (TB)
* I fleshed out the logic and fixed several bugs in the migration code, mostly to do with the new max size stuff and distributing weights appropriately in various situations
### misc

* when an image file fails to render in the media viewer, it now draws a bordered box with a brief 'failed to render' note. previously, it janked with a second of lag, made some popups, and left the display on an eternal blank hang. now it finishes its job cleanly and returns a 'nah m8' 'image' result
* I reworked the Mr Bones layout a bit. the search is now on the left, and the rows of the main count table are separated for readability
* it turns out that bitmap (.bmp) files can support ICC Profiles, so I've told hydrus to look for them in new bitmaps and retroactively scan all your existing ones
* fixed an issue with the recent PSD code updates that was breaking boot for clients running from source without the psd-tools library (this affected the Docker build)
* updated all the 'setup_venv' scripts. all the formatting and text has had a pass, and there is now a question on (n)ew or (o)ld Pillow
* to stop FFMPEG's false positives where it can think a txt file is an mpeg, the main hydrus filetype scanning routine will no longer send files with common text extensions to ffmpeg. if you do have an mp3 called music.txt, rename it before import!
* thanks to a user, the inkbunny file page parser fetches the correct source time again (#1431)
* thanks to a user, the old sankaku gallery parser can find the 'next page' again
* removed the broken sankaku login script for new users. I recommend people move to Hydrus Companion for all tricky login situations (#1435)
* thanks to a user, procreate file parsing, which had the width/height flipped, is fixed. all existing procreate files will regen their metadata and thumbs
### client api

* thanks to a user, the Client API now has a `/get_files/render` command, which gives you a 100% zoom png render of the given file. useful if you want to display a PSD on a web page!
* I screwed up Mr Bones's Client API request last week. this is now fixed
* Mr Bones now supports a full file search context on the Client API, just like the main UI. the parameters are the same as `/get_files/search_files`, and the help talks about it. he also cancels his work early if the request is terminated
* Mr Bones gets several new unit tests to guarantee long-term ride reliability
* the Client API (and all hydrus servers) now return proper JSON on an error. there's the error summary, the specific exception name, and the http status code. the big bad 500-error-of-last-resort still tacks the large serverside traceback onto the summary, so we'll see if that is still annoying and split it off if needed
* the new `/add_tags/get_siblings_and_parents` now properly cleans the tags you give it, trimming whitespace and lowercasing letters and so on
* the client api version is now 52

"had_error": false,
"is_cancellable": false,
"is_cancelled": false,
"is_done": true,
"is_pausable": false,
"is_paused": false,
"is_working": true,
"nice_string": "This is a test popup message"

"had_error": false,
"is_cancellable": false,
"is_cancelled": false,
"is_done": true,
"is_pausable": false,
"is_paused": false,
"is_working": true,
"nice_string": "sub gap downloader test",

"had_error": false,
"is_cancellable": true,
"is_cancelled": false,
"is_done": false,
"is_pausable": false,
"is_paused": false,
"is_working": true,
"nice_string": "subscriptions - safebooru\r\ndownloading files for \"elf\" (1/1)\r\nfile 4/27: downloading file",

Required Headers:
:
    * `Content-Type`: application/json

Arguments (in JSON):
:
    * it accepts these fields of a [job status object](#job_status_objects):
        * `is_cancellable`
        * `is_pausable`
        * `attached_files_mergable`
        * `status_title`
        * `status_text_1` and `status_text_2`
        * `popup_gauge_1` and `popup_gauge_2`
        * `api_data`
    * `files_label`: the label for the files attached to the job status. It will be returned as `label` in the `files` object in the [job status object](#job_status_objects).
    * [files](#parameters_files) that will be added to the job status. They will be returned as `hashes` in the `files` object in the [job status object](#job_status_objects). `files_label` is required to add files.

A new job status will be created and submitted as a popup. Set a `status_title` on bigger ongoing jobs that will take a while and receive many updates--and leave it alone, even when the job is done. For simple notes, just set `status_text_1`.

!!! danger "Finishing Jobs"
    The pausable, cancellable, and files-mergable status of a job is only settable at creation. A pausable or cancellable popup represents an ongoing and unfinished job. The popup will exist indefinitely and will not be user-dismissable unless the user can first cancel it.

    **You, as the creator, _must_ plan to call Finish once your work is done. Yes, even if there is an error!**

!!! note "Pausing and Cancelling"
    If the user pauses a job, you should recognise that and pause your work. Resume when they do.

    If the user cancels a job, you should recognise that and stop work. Either call `finish` with an appropriate status update, or `finish_and_dismiss` if you have nothing more to say.

    If your long-term job has a main loop, place these checks at the top of the loop, along with your status update calls.

```json title="Example request body"
{
    "status_text_1": "Note to user"
}
```

```json title="Example request body"
{
    "popup_gauge_2": [9, 10],
    "status_text_1": "Doing things",
    "status_text_2": "Doing other things",
    "is_cancellable": true,
    "api_data": {
        "whatever": "stuff"
    }
}
```

Response:
: A JSON Object containing `job_status`, the [job status object](#job_status_objects) that was added.

### **POST `/manage_popups/call_user_callable`** { id="manage_popups_call_user_callable" }

_Call the user callable function of a popup._

Response:
: 200 with no content.

### **POST `/manage_popups/cancel_popup`** { id="manage_popups_cancel_popup" }

_Try to cancel a popup._

Restricted access:
: YES. Manage Popups permission needed.

Required Headers:
:
    * `Content-Type`: application/json

Arguments (in JSON):
:
    * `job_status_key`: The job status key to cancel

The job status must be cancellable to be cancelled. If it isn't, this call is nullipotent.

```json title="Example request body"
{
    "job_status_key": "abee8b37d47dba8abf82638d4afb1d11586b9ef7be634aeb8ae3bcb8162b2c86"
}
```

Response:
: 200 with no content.

### **POST `/manage_popups/dismiss_popup`** { id="manage_popups_dismiss_popup" }

_Try to dismiss a popup._

Restricted access:
: YES. Manage Popups permission needed.

Required Headers:
:
    * `Content-Type`: application/json

Arguments (in JSON):
:
    * `job_status_key`: The job status key to dismiss

This is a call that an 'observer' (i.e. not the job creator) makes. In the client UI, it would be a user right-clicking a popup to dismiss it. If the job is dismissable (i.e. it `is_done`), the popup disappears, but if it is pausable/cancellable--an ongoing job--then this action is nullipotent.

You should call this on jobs you did not create yourself.

```json title="Example request body"
{
    "job_status_key": "abee8b37d47dba8abf82638d4afb1d11586b9ef7be634aeb8ae3bcb8162b2c86"
}
```

Response:
: 200 with no content.

### **POST `/manage_popups/finish_popup`** { id="manage_popups_finish_popup" }

_Mark a popup as done._

Restricted access:
: YES. Manage Popups permission needed.

Required Headers:
:
    * `Content-Type`: application/json

Arguments (in JSON):
:
    * `job_status_key`: The job status key to finish

!!! danger "Important"
    **You may only call this on jobs you created yourself.**

You only need to call it on jobs you created pausable or cancellable. It clears those statuses, sets `is_done`, and allows the user to dismiss the job with a right-click.

Once called, the popup will remain indefinitely. You should marry this call with an `update` that clears the texts and gauges you were using and leaves a "Done, processed x files with y errors!" or similar statement to let the user know how the job went.

```json title="Example request body"
{
    "job_status_key": "abee8b37d47dba8abf82638d4afb1d11586b9ef7be634aeb8ae3bcb8162b2c86"
}
```

Response:
: 200 with no content.

### **POST `/manage_popups/finish_and_dismiss_popup`** { id="manage_popups_finish_and_dismiss_popup" }

_Finish and dismiss a popup._

Restricted access:
: YES. Manage Popups permission needed.

Required Headers:
:
    * `Content-Type`: application/json

Arguments (in JSON):
:
    * `job_status_key`: The job status key to dismiss
    * `seconds`: (optional) an integer number of seconds to wait before dismissing the job status. defaults to happening immediately

!!! danger "Important"
    **You may only call this on jobs you created yourself.**

This will call `finish` immediately and flag the message for auto-dismissal (i.e. removal from the popup toaster) either immediately or after the given number of seconds.

You would want this instead of plain `finish` when you either do not need to leave a 'Done!' summary, or when the summary is not so important and is only needed if the user happens to glance that way. If you did boring work for ten minutes, you might like to set a simple 'Done!' and auto-dismiss after thirty seconds or so.

```json title="Example request body"
{
    "job_status_key": "abee8b37d47dba8abf82638d4afb1d11586b9ef7be634aeb8ae3bcb8162b2c86",
    "seconds": 5
}
```

Response:
: 200 with no content.

### **POST `/manage_popups/update_popup`** { id="manage_popups_update_popuip" }

_Update a popup._

Restricted access:
: YES. Manage Popups permission needed.

Required Headers:
:
    * `Content-Type`: application/json

Arguments (in JSON):
:
    * `job_status_key`: The hex key of the job status to update.
    * It accepts these fields of a [job status object](#job_status_objects):
        * `status_title`
        * `status_text_1` and `status_text_2`
        * `popup_gauge_1` and `popup_gauge_2`
        * `api_data`
    * `files_label`: the label for the files attached to the job status. It will be returned as `label` in the `files` object in the [job status object](#job_status_objects).
    * [files](#parameters_files) that will be added to the job status. They will be returned as `hashes` in the `files` object in the [job status object](#job_status_objects). `files_label` is required to add files.

The specified job status will be updated with the new values submitted. Any field without a value will be left alone, and any field set to `null` will be removed from the job status.

```json title="Example request body"
{
    "job_status_key": "abee8b37d47dba8abf82638d4afb1d11586b9ef7be634aeb8ae3bcb8162b2c86",
    "status_title": "Example Popup",
    "status_text_1": null,
    "popup_gauge_1": [12, 120],
    "api_data": {
        "whatever": "other stuff"
    }
}
```

Response:
: A JSON Object containing `job_status`, the [job status object](#job_status_objects) that was updated.
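Putting the lifecycle together--`add_popup`, then `update_popup` while working, then `finish_popup` when done--a client-side sketch might look like the following. It only builds the requests; the endpoint paths come from this help, but the access-key header name and default port are my assumptions, so check them against your own setup, and the key values are hypothetical placeholders:

```python
import json
import urllib.request

API_BASE = 'http://127.0.0.1:45869'          # assumed default Client API address
ACCESS_KEY = 'replace with your access key'  # placeholder

def build_popup_request(endpoint: str, body: dict) -> urllib.request.Request:
    # build a POST request for a /manage_popups/ endpoint
    return urllib.request.Request(
        url=f'{API_BASE}/manage_popups/{endpoint}',
        data=json.dumps(body).encode('utf-8'),
        headers={
            'Content-Type': 'application/json',
            'Hydrus-Client-API-Access-Key': ACCESS_KEY,
        },
        method='POST',
    )

# lifecycle: create the popup, update it as work progresses, then finish it
add_req = build_popup_request('add_popup', {'status_title': 'my big job', 'is_cancellable': True})
update_req = build_popup_request('update_popup', {'job_status_key': 'abee8b37...', 'popup_gauge_1': [5, 10]})
finish_req = build_popup_request('finish_popup', {'job_status_key': 'abee8b37...'})
# send each with urllib.request.urlopen(req); read job_status_key from add_popup's response
```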

## Managing the Database

### **POST `/manage_database/lock_on`** { id="manage_database_lock_on" }

If you change the star range at a later date, any existing ratings will be 'stretched' across the new range. As values are collapsed to the nearest integer, this is best done for scales that are multiples. 2/5 will neatly become 4/10 on a zero-allowed service, for instance, and 0/4 can nicely become 1/5 if you disallow zero ratings in the same step. If you didn't intuitively understand that, just don't touch the number of stars or the zero rating checkbox after you have created the numerical rating service!
|
||||
## inc/dec
|
||||
|
||||
This is a simple counter. It can represent whatever you like, but most people usually go for 'I x this image y times'. You left-click to +1 it, right-click to -1.
|
||||
|
||||
![](images/ratings_incdec.png)
|
||||
|
||||
![](images/ratings_incdec_canvas.png)
|
||||
|
||||
## now what? { id="using_ratings" }

Ratings are displayed in the top-right of the media viewer:

![](images/ratings_ebola_chan.png)

Hovering over each control will pop up its name, in case you forget which is which.

For like/dislike:

- **Left-click:** Set 'like'
- **Right-click:** Set 'dislike'
- **Second X-click:** Set 'not rated'

For numerical:

- **Left-click:** Set value
- **Right-click:** Set 'not rated'

For inc/dec:

- **Left-click:** +1
- **Right-click:** -1

Pressing F4 on a selection of thumbnails will open a dialog with a very similar layout, which will let you set the same rating to many files simultaneously.

Once you have some ratings set, you can search for them using system:rating, which produces this dialog:

![](images/ratings_system_pred.png)

On my own client, I find it useful to have several like/dislike ratings set up as quick one-click pseudo-tags. Stuff like 'this would make a good post' or 'read this later' that I can hit while I am doing archive/delete filtering.
|
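The click behaviour described above can be sketched as a couple of tiny functions. This is an illustration only, not hydrus's actual code; the rating values are made up.

```python
LIKE = 1.0
DISLIKE = 0.0

def click_like_dislike( current_rating, clicked_value ):
    
    # a second click on the value you already have resets to 'not rated' (None)
    if current_rating == clicked_value:
        
        return None
        
    
    return clicked_value
    

def click_inc_dec( count, left_click ):
    
    # inc/dec: left-click is +1, right-click is -1
    return count + 1 if left_click else count - 1
    

rating = click_like_dislike( None, LIKE )    # left-click: 'like'
rating = click_like_dislike( rating, LIKE )  # same click again: 'not rated'
print( rating )

count = click_inc_dec( 0, left_click = True )
count = click_inc_dec( count, left_click = False )
print( count )
```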
||||
|
|
Binary file not shown.
After Width: | Height: | Size: 12 KiB |
Binary file not shown.
After Width: | Height: | Size: 1.8 KiB |
|
@ -34,6 +34,60 @@
|
|||
<div class="content">
|
||||
<h1 id="changelog"><a href="#changelog">changelog</a></h1>
|
||||
<ul>
|
||||
<li>
|
||||
<h2 id="version_554"><a href="#version_554">version 554</a></h2>
|
||||
<ul>
|
||||
<li><h3>checker options fixes</h3></li>
|
||||
<li><b>sorry for any jank 'static check interval' watcher or subscription timings you saw last week! I screwed something up and it slipped through testing</b></li>
|
||||
<li>the 'static check interval' logic is much much simpler. rather than try to always keep to the same check period, even if the actual check is delayed, it just works off 'last check time + period', every time. the clever stuff was generally confusing and failing in a variety of ways</li>
|
||||
<li>fixed a bug in the new static check time code that was stopping certain in-limbo watchers from calculating their correct next check time on program load</li>
|
||||
<li>fixed a bug in the new static check time code that was causing too many checks in long-paused-and-now-unpaused downloaders</li>
|
||||
<li>some new unit tests will make sure these errors do not happen again</li>
|
||||
<li>in the checker options UI, if you uncheck 'just check at a static, regular interval', and leave the faster/slower values as the same when you OK, then the dialog now asks you if that is what you want</li>
|
||||
<li>in the checker options UI, the 'slower than' value will now automatically update itself to be no smaller than the 'faster than' value</li>
|
||||
<li><h3>job status fixes and cleanup (mostly boring)</h3></li>
|
||||
<li><b>sorry for any 'Cancel/IsCancellable' related errors you saw last week! I screwed something else up</b></li>
|
||||
<li>fixed a dumb infinite recursion error in the new job status cancellable 'safety' checks that was happening when it was time to auto-dismiss a cancellable job due to program/thread shutdown or a maintenance mode change. this also fixes some non-dismissing popup messages (usually subscriptions) that weren't setting their cancel status correctly</li>
|
||||
<li>this happened because the code here was ancient and ugly. I have renamed, simplified, and reworked the logical dangerzone variables and methods in the job status object so we don't run into this problem again. 'Cancel' and 'Finish' no longer take a seconds parameter, 'Delete' is now 'FinishAndDismiss', 'IsDeleted' is now 'IsDismissed', 'IsDeletable' is now merged into a cleaner 'IsDone', 'IsWorking' is removed, 'SetCancellable' and 'SetPausable' are removed (these will always be in the init, and will determine what type of job we have), and the various new Client API calls and help are updated for this</li>
|
||||
<li>also, the job status methods now check their backstop 'cancel' tests far less often, and there's a throttle to make sure they can only run once a second anyway</li>
|
||||
<li>also ditched the needless threading events for simple bools</li>
|
||||
<li>also cleared up about 40 pointless Finish/FinishAndDismiss duplicate calls across the program</li>
|
||||
<li>also fixed up the job status object to do its various yield pauses more sanely</li>
|
||||
<li><h3>cbz and ugoira detection and thumbnails</h3></li>
|
||||
<li>CBZ files are now detected! there is no very strict standard of what is or isn't a CBZ (it is basically just a zip of images and maybe some metadata files), but I threw together a 'yeah that looks like a cbz' test that now runs on every zip. there will probably be several false positives, but with luck fewer false negatives, which I think is the way we want to lean here. if you have just some zip of similarly named images, it'll now be counted as a CBZ, but I think we'll nonetheless want to give those all the upcoming CBZ tech anyway, even if they aren't technically intended to be 'CBZ', whatever that actually means here other than the different file extension</li>
|
||||
<li>the client looks for the cover image in your CBZ and uses that for the thumbnail! it also uses this file's resolution as the CBZ resolution</li>
|
||||
<li>Ugoira files are now detected! there is a firmer standard of what an Ugoira is, but it is still tricky as we are just talking about a different list of zipped image files here. I expect zero false negatives and some false positives (unfortunately, it'll be CBZs with zero-padded numerical-only filenames). as all ugoiras are valid CBZs but few CBZs are valid ugoiras, the Ugoira test runs first</li>
|
||||
<li>the client now gets a thumbnail for Ugoiras. It'll also use the x%-in setting that other animations and videos use! it also fetches resolution and 'num frames'. duration can't be inferred just yet, but we hope to have some options (and actual rendering) happening in the medium-term future</li>
|
||||
<li>this is all an experiment. let me know how it goes, and send in any examples of it failing awfully. there is lots more to do. if things don't explode with this much, I'll see about .cbr and cb7, which seems totally doable, and then I can seriously plan out UI for actual view and internal navigation. I can't promise proper reader features like bookmarks or anything, but I'll keep on pushing</li>
|
||||
<li>all your existing zips will be scheduled for a filetype re-scan on update</li>
|
||||
<li><h3>animations</h3></li>
|
||||
<li>the native FFMPEG renderer pipeline is now capable of transparency. APNGs rendered in the native viewer now have correct transparency and can pass 'has transparency' checks</li>
|
||||
<li>all your apngs will be scheduled for the 'has transparency' check, just like pngs and gifs and stuff a couple weeks ago. thanks to the user who submitted some transparency-having apngs to test with!</li>
|
||||
<li>the thumbnails for animated gifs are now taken using the FFMPEG renderer, which puts them x% in, just like APNG and other video. transparency in these thumbnails also seems to be good! am not going to regen everyone's animated gif thumbs yet--I'll do some more IRL testing--but this will probably come in a few weeks. let me know if you see a bevy of newly imported gifs with crazy thumbs</li>
|
||||
<li>I also overhauled the native GIF renderer. what used to be a cobbled-together RGB OpenCV solution with a fallback to bad PIL code is now a proper only-PIL RGBA solution, and the transparency seems to be great now (the OpenCV code had no transparency, and the PIL fallback tried but generally drew the last frame on top of the previous, giving a noclip effect). the new renderer also skips to an unrendered area faster</li>
|
||||
<li>given the file maintenance I/O Error problems we had the past couple weeks, I also made this cleaner GIF renderer much more robust; it will generally just rewind itself or give a black frame if it runs into truncation problems, no worries, and for gifs that just have one weird frame that doesn't break seek, it should be able to skip past those now, repeating the last good frame until it hits something valid</li>
|
||||
<li>as a side thing, the FFMPEG GIF renderer seems capable of doing almost everything the PIL renderer can now. I can flip the code to using the FFMPEG pipeline and gifs come through fine, transparency included. I prefer the PIL for now, but depending on how things go, I may add options to use the FFMPEG bridge as a testbed/fallback in future</li>
|
||||
<li>added some PIL animated gif rendering tech to handle a gif that out of nowhere produces a giga 85171x53524 frame, eating up multiple GB of memory and taking twenty seconds to failrender</li>
|
||||
<li>fixed yet another potential source of the false positive I/O Errors caused by the recent 'has transparency' checking, this time not just in malformed animated gif frames, but some busted static images too</li>
|
||||
<li>improved the PIL loading code a little more, converting more possible I/O Errors and other weird damaged file states to the correct hydrus-internal exception types with nicer error texts</li>
|
||||
<li>the 'disable CV for gifs' option is removed</li>
|
||||
<li><h3>file pre-import checks</h3></li>
|
||||
<li>the 'is this file free to work on' test that runs before files are added to the manual or import folder file list now has an additional file-open check. this improves reliability over NAS connections, where the file may be used by a remote process, and also improves detection for files where the current user only has read permissions</li>
|
||||
<li>import folders now have a 'recent modified time skip period' setting, defaulting to 60 seconds. any file that has a modified date newer than that many seconds ago will not be imported on the current check. this helps to avoid importing files that are currently being downloaded/copied into the folder when the import folder runs (when that folder/download process is otherwise immune to the existing 'already in use' checks)</li>
|
||||
<li>import folders now repeat-check folders that have many previously-seen files much faster</li>
|
||||
<li><h3>misc</h3></li>
|
||||
<li>the 'max gif size' setting in the quiet and loud file import options now defaults to 'no limit'. it used to be 32MB, to catch various trash webm re-encodes, but these days it catches more false positives than it is worth, and 32MB is less of a deal these days too</li>
|
||||
<li>the test on boot to see if the given database location is writeable-to should now give an error when that location is a non-existent location (e.g. a removable usb drive that is not currently plugged in). previously, it could, depending on the situation, either proceed and go crazy later or wait indefinitely on a CPU-heavy busy-wait for the drive to be plugged back in. unfortunately, because at this stage there is no logfile location and no UI, if your custom db dir does not and cannot exist, the program terminates instantly and silently writes a crash log to your desktop. I have made a plan to improve this in future</li>
|
||||
<li>also cleaned up all the db_dir boot code generally. the various validity tests should now only happen once per potential location</li>
|
||||
<li>the function that converts an export phrase into a filename will now elide long unicode filenames correctly. filenames with complex unicode characters will take more than one byte per character (and most OSes have a ~255-byte filename limit), which requires a trickier check. also, on Windows, where there is a 260-character total path limit, the combined directory+filename length is checked better, and just checked on Windows. all errors raised here are better</li>
|
||||
<li>added some unit tests to check the new path eliding tech</li>
|
||||
<li>brushed up the 'getting started with ratings' help a little</li>
|
||||
<li><h3>client api</h3></li>
|
||||
<li>thanks to a user, the Client API now has the ability to see and interact with the current popup messages in the popup toaster!</li>
|
||||
<li>fixed a stupid typo that I made in the new Client API options call. added a unit test to catch this in future, too</li>
|
||||
<li>the client api version is now 57</li>
|
||||
</ul>
|
||||
</li>
|
||||
<li>
|
||||
<h2 id="version_553"><a href="#version_553">version 553</a></h2>
|
||||
<ul>
|
||||
|
|
|
@ -22,7 +22,19 @@ CLIENT_API_PERMISSION_MANAGE_FILE_RELATIONSHIPS = 8
|
|||
CLIENT_API_PERMISSION_EDIT_RATINGS = 9
|
||||
CLIENT_API_PERMISSION_MANAGE_POPUPS = 10
|
||||
|
||||
ALLOWED_PERMISSIONS = ( CLIENT_API_PERMISSION_ADD_FILES, CLIENT_API_PERMISSION_ADD_TAGS, CLIENT_API_PERMISSION_ADD_URLS, CLIENT_API_PERMISSION_SEARCH_FILES, CLIENT_API_PERMISSION_MANAGE_PAGES, CLIENT_API_PERMISSION_MANAGE_HEADERS, CLIENT_API_PERMISSION_MANAGE_DATABASE, CLIENT_API_PERMISSION_ADD_NOTES, CLIENT_API_PERMISSION_MANAGE_FILE_RELATIONSHIPS, CLIENT_API_PERMISSION_EDIT_RATINGS, CLIENT_API_PERMISSION_MANAGE_POPUPS )
|
||||
ALLOWED_PERMISSIONS = (
|
||||
CLIENT_API_PERMISSION_ADD_FILES,
|
||||
CLIENT_API_PERMISSION_ADD_TAGS,
|
||||
CLIENT_API_PERMISSION_ADD_URLS,
|
||||
CLIENT_API_PERMISSION_SEARCH_FILES,
|
||||
CLIENT_API_PERMISSION_MANAGE_PAGES,
|
||||
CLIENT_API_PERMISSION_MANAGE_HEADERS,
|
||||
CLIENT_API_PERMISSION_MANAGE_DATABASE,
|
||||
CLIENT_API_PERMISSION_ADD_NOTES,
|
||||
CLIENT_API_PERMISSION_MANAGE_FILE_RELATIONSHIPS,
|
||||
CLIENT_API_PERMISSION_EDIT_RATINGS,
|
||||
CLIENT_API_PERMISSION_MANAGE_POPUPS
|
||||
)
|
||||
|
||||
basic_permission_to_str_lookup = {}
|
||||
|
||||
|
|
|
@ -224,7 +224,7 @@ media_viewer_capabilities = {
|
|||
|
||||
for mime in HC.SEARCHABLE_MIMES:
|
||||
|
||||
if mime in HC.ANIMATIONS:
|
||||
if mime in HC.VIEWABLE_ANIMATIONS:
|
||||
|
||||
media_viewer_capabilities[ mime ] = animated_full_support
|
||||
|
||||
|
|
|
@ -297,7 +297,7 @@ class Controller( HydrusController.HydrusController ):
|
|||
|
||||
job_status.SetStatusText( 'enabling I/O now' )
|
||||
|
||||
job_status.Delete()
|
||||
job_status.FinishAndDismiss()
|
||||
|
||||
return
|
||||
|
||||
|
@ -307,7 +307,7 @@ class Controller( HydrusController.HydrusController ):
|
|||
|
||||
if HydrusTime.TimeHasPassed( wake_time ):
|
||||
|
||||
job_status.Delete()
|
||||
job_status.FinishAndDismiss()
|
||||
|
||||
return
|
||||
|
||||
|
@ -422,19 +422,10 @@ class Controller( HydrusController.HydrusController ):
|
|||
|
||||
|
||||
|
||||
job_status = ClientThreading.JobStatus( cancel_on_shutdown = False )
|
||||
job_status = ClientThreading.JobStatus( cancellable = True, cancel_on_shutdown = False )
|
||||
|
||||
QP.CallAfter( qt_code, win, job_status )
|
||||
|
||||
i = 0
|
||||
|
||||
while not job_status.IsDone() and i < 8:
|
||||
|
||||
time.sleep( 0.02 )
|
||||
|
||||
i += 1
|
||||
|
||||
|
||||
# I think in some cases with the splash screen we may actually be pushing stuff here after model shutdown
|
||||
# but I also don't want a hang, as we have seen with some GUI async job that got fired on shutdown and it seems some event queue was halted or deadlocked
|
||||
# so, we'll give it 16ms to work, then we'll start testing for shutdown hang
|
||||
|
@ -1353,13 +1344,24 @@ class Controller( HydrusController.HydrusController ):
|
|||
if self.db.IsFirstStart():
|
||||
|
||||
message = 'Hi, this looks like the first time you have started the hydrus client.'
|
||||
message += os.linesep * 2
|
||||
message += '\n' * 2
|
||||
message += 'Don\'t forget to check out the help if you haven\'t already, by clicking help->help--it has an extensive \'getting started\' section, including how to update and the importance of backing up your database.'
|
||||
message += os.linesep * 2
|
||||
message += '\n' * 2
|
||||
message += 'To dismiss popup messages like this, right-click them.'
|
||||
|
||||
HydrusData.ShowText( message )
|
||||
|
||||
if HC.WE_SWITCHED_TO_USERPATH:
|
||||
|
||||
message = f'Hey, furthermore, it looks like the original desired database location was not writable-to, so the client has fallen back to using your userpath at:'
|
||||
message += '\n' * 2
|
||||
message += HC.USERPATH_DB_DIR
|
||||
message += '\n' * 2
|
||||
message += 'If that is fine with you, no problem. But if you were expecting to load an existing database and the above "first start" popup is a surprise, then your old db path is probably read-only. Fix that and try again. If it helps, hit up help->about to see the directories hydrus is currently using.'
|
||||
|
||||
HydrusData.ShowText( message )
|
||||
|
||||
|
||||
|
||||
if self.db.IsDBUpdated():
|
||||
|
||||
|
|
|
@ -9,7 +9,6 @@ from hydrus.core import HydrusTime
|
|||
|
||||
from hydrus.client import ClientApplicationCommand as CAC
|
||||
from hydrus.client import ClientConstants as CC
|
||||
from hydrus.client import ClientData
|
||||
from hydrus.client import ClientLocation
|
||||
|
||||
def GetClientDefaultOptions():
|
||||
|
|
|
@ -401,9 +401,7 @@ class QuickDownloadManager( object ):
|
|||
|
||||
job_status_pub_job.Cancel()
|
||||
|
||||
job_status.Finish()
|
||||
|
||||
job_status.Delete( 1 )
|
||||
job_status.FinishAndDismiss( 1 )
|
||||
|
||||
|
||||
|
||||
|
|
|
@ -770,7 +770,7 @@ class DuplicatesManager( object ):
|
|||
time.sleep( reasonable_work_time * rest_ratio )
|
||||
|
||||
|
||||
job_status.Delete()
|
||||
job_status.FinishAndDismiss()
|
||||
|
||||
finally:
|
||||
|
||||
|
|
|
@ -15,6 +15,7 @@ from hydrus.core import HydrusLists
|
|||
from hydrus.core import HydrusPaths
|
||||
from hydrus.core import HydrusThreading
|
||||
from hydrus.core import HydrusTime
|
||||
from hydrus.core import HydrusVideoHandling
|
||||
from hydrus.core.images import HydrusBlurhash
|
||||
from hydrus.core.images import HydrusImageColours
|
||||
from hydrus.core.images import HydrusImageHandling
|
||||
|
@ -1706,9 +1707,7 @@ class ClientFilesManager( object ):
|
|||
|
||||
job_status.SetStatusText( 'done!' )
|
||||
|
||||
job_status.Finish()
|
||||
|
||||
job_status.Delete()
|
||||
job_status.FinishAndDismiss()
|
||||
|
||||
|
||||
|
||||
|
@ -1870,50 +1869,67 @@ def HasHumanReadableEmbeddedMetadata( path, mime ):
|
|||
return has_human_readable_embedded_metadata
|
||||
|
||||
|
||||
def HasTransparency( path, mime, num_frames = None, resolution = None ):
|
||||
def HasTransparency( path, mime, duration = None, num_frames = None, resolution = None ):
|
||||
|
||||
if mime not in HC.MIMES_THAT_WE_CAN_CHECK_FOR_TRANSPARENCY:
|
||||
|
||||
return False
|
||||
|
||||
|
||||
if mime in HC.IMAGES:
|
||||
try:
|
||||
|
||||
numpy_image = HydrusImageHandling.GenerateNumPyImage( path, mime )
|
||||
|
||||
return HydrusImageColours.NumPyImageHasUsefulAlphaChannel( numpy_image )
|
||||
|
||||
elif mime == HC.ANIMATION_GIF:
|
||||
|
||||
if num_frames is None or resolution is None:
|
||||
if mime in HC.IMAGES:
|
||||
|
||||
return False # something crazy going on, so let's bail out
|
||||
numpy_image = HydrusImageHandling.GenerateNumPyImage( path, mime )
|
||||
|
||||
|
||||
we_checked_alpha_channel = False
|
||||
|
||||
renderer = ClientVideoHandling.GIFRenderer( path, num_frames, resolution, force_pil = True )
|
||||
|
||||
for i in range( num_frames ):
|
||||
return HydrusImageColours.NumPyImageHasUsefulAlphaChannel( numpy_image )
|
||||
|
||||
numpy_image = renderer.read_frame()
|
||||
elif mime in ( HC.ANIMATION_GIF, HC.ANIMATION_APNG ):
|
||||
|
||||
if not we_checked_alpha_channel:
|
||||
if num_frames is None or resolution is None:
|
||||
|
||||
if not HydrusImageColours.NumPyImageHasAlphaChannel( numpy_image ):
|
||||
return False # something crazy going on, so let's bail out
|
||||
|
||||
|
||||
we_checked_alpha_channel = False
|
||||
|
||||
if mime == HC.ANIMATION_GIF:
|
||||
|
||||
renderer = ClientVideoHandling.GIFRenderer( path, num_frames, resolution )
|
||||
|
||||
else: # HC.ANIMATION_APNG
|
||||
|
||||
renderer = HydrusVideoHandling.VideoRendererFFMPEG( path, mime, duration, num_frames, resolution )
|
||||
|
||||
|
||||
for i in range( num_frames ):
|
||||
|
||||
numpy_image = renderer.read_frame()
|
||||
|
||||
if not we_checked_alpha_channel:
|
||||
|
||||
return False
|
||||
if not HydrusImageColours.NumPyImageHasAlphaChannel( numpy_image ):
|
||||
|
||||
return False
|
||||
|
||||
|
||||
we_checked_alpha_channel = True
|
||||
|
||||
|
||||
we_checked_alpha_channel = True
|
||||
|
||||
|
||||
if HydrusImageColours.NumPyImageHasUsefulAlphaChannel( numpy_image ):
|
||||
|
||||
return True
|
||||
if HydrusImageColours.NumPyImageHasUsefulAlphaChannel( numpy_image ):
|
||||
|
||||
return True
|
||||
|
||||
|
||||
|
||||
|
||||
except HydrusExceptions.DamagedOrUnusualFileException as e:
|
||||
|
||||
HydrusData.Print( 'Problem determining transparency for "{}":'.format( path ) )
|
||||
HydrusData.PrintException( e )
|
||||
|
||||
return False
|
||||
|
||||
|
||||
return False
|
||||
|
||||
|
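The loop above leans on the distinction between an image merely having an alpha channel and that channel being 'useful'. A standalone sketch of that distinction, with simplified stand-ins for the `HydrusImageColours` helpers and images modelled as rows of RGBA tuples:

```python
# Illustrative re-implementations, not the hydrus originals: an image can carry
# an alpha channel that is entirely opaque, in which case transparency checking
# should report False.

def has_alpha_channel( image ):
    
    return all( len( pixel ) == 4 for row in image for pixel in row )
    

def has_useful_alpha_channel( image ):
    
    if not has_alpha_channel( image ):
        
        return False
        
    
    # 'useful' means the channel actually does something: at least one pixel
    # is not fully opaque
    return any( pixel[3] < 255 for row in image for pixel in row )
    

opaque = [ [ ( 255, 255, 255, 255 ) ] * 4 for _ in range( 4 ) ]

see_through = [ row[:] for row in opaque ]
see_through[0][0] = ( 255, 255, 255, 0 )

print( has_useful_alpha_channel( opaque ) )       # False
print( has_useful_alpha_channel( see_through ) )  # True
```

Checking the full alpha channel on the first frame, then only the 'useful' test on later frames, lets the real code bail out early on RGB-only animations.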
@ -2429,7 +2445,7 @@ class FilesMaintenanceManager( object ):
|
|||
|
||||
path = self._controller.client_files_manager.GetFilePath( hash, mime )
|
||||
|
||||
has_transparency = HasTransparency( path, mime, num_frames = media_result.GetNumFrames(), resolution = media_result.GetResolution() )
|
||||
has_transparency = HasTransparency( path, mime, duration = media_result.GetDurationMS(), num_frames = media_result.GetNumFrames(), resolution = media_result.GetResolution() )
|
||||
|
||||
additional_data = has_transparency
|
||||
|
||||
|
@ -3025,9 +3041,7 @@ class FilesMaintenanceManager( object ):
|
|||
|
||||
job_status.DeleteVariable( 'popup_gauge_1' )
|
||||
|
||||
job_status.Finish()
|
||||
|
||||
job_status.Delete( 5 )
|
||||
job_status.FinishAndDismiss( 5 )
|
||||
|
||||
if not work_done:
|
||||
|
||||
|
@ -3232,9 +3246,7 @@ class FilesMaintenanceManager( object ):
|
|||
|
||||
job_status.DeleteVariable( 'popup_gauge_1' )
|
||||
|
||||
job_status.Finish()
|
||||
|
||||
job_status.Delete( 5 )
|
||||
job_status.FinishAndDismiss( 5 )
|
||||
|
||||
self._controller.pub( 'notify_files_maintenance_done' )
|
||||
|
||||
|
|
|
@ -394,9 +394,7 @@ class MigrationJob( object ):
|
|||
|
||||
job_status.SetStatusText( 'done!' )
|
||||
|
||||
job_status.Finish()
|
||||
|
||||
job_status.Delete( 3 )
|
||||
job_status.FinishAndDismiss( 3 )
|
||||
|
||||
|
||||
|
||||
|
|
|
@ -130,8 +130,6 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):
|
|||
self._dictionary[ 'booleans' ][ 'discord_dnd_fix' ] = False
|
||||
self._dictionary[ 'booleans' ][ 'secret_discord_dnd_fix' ] = False
|
||||
|
||||
self._dictionary[ 'booleans' ][ 'disable_cv_for_gifs' ] = False
|
||||
|
||||
self._dictionary[ 'booleans' ][ 'show_unmatched_urls_in_media_viewer' ] = False
|
||||
|
||||
self._dictionary[ 'booleans' ][ 'set_search_focus_on_page_change' ] = False
|
||||
|
@ -774,7 +772,7 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):
|
|||
allow_decompression_bombs = True
|
||||
min_size = None
|
||||
max_size = None
|
||||
max_gif_size = 32 * 1048576
|
||||
max_gif_size = None
|
||||
min_resolution = None
|
||||
max_resolution = None
|
||||
|
||||
|
|
|
@ -11,6 +11,7 @@ from hydrus.core import HydrusAnimationHandling
|
|||
from hydrus.core import HydrusCompression
|
||||
from hydrus.core import HydrusConstants as HC
|
||||
from hydrus.core import HydrusData
|
||||
from hydrus.core import HydrusExceptions
|
||||
from hydrus.core import HydrusGlobals as HG
|
||||
from hydrus.core import HydrusVideoHandling
|
||||
from hydrus.core.images import HydrusImageColours
|
||||
|
@ -68,8 +69,16 @@ def GenerateHydrusBitmapFromPILImage( pil_image, compressed = True ):
|
|||
depth = 3
|
||||
|
||||
|
||||
return HydrusBitmap( pil_image.tobytes(), pil_image.size, depth, compressed = compressed )
|
||||
try:
|
||||
|
||||
return HydrusBitmap( pil_image.tobytes(), pil_image.size, depth, compressed = compressed )
|
||||
|
||||
except IOError:
|
||||
|
||||
raise HydrusExceptions.DamagedOrUnusualFileException( 'Looks like a truncated file that PIL could not handle!' )
|
||||
|
||||
|
||||
|
||||
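The try/except added above follows a common error-translation pattern: a low-level `IOError` from a truncated file is re-raised as the domain exception so callers can handle damaged files uniformly. A standalone sketch, using a stand-in exception class rather than the real `HydrusExceptions` one:

```python
class DamagedOrUnusualFileException( Exception ):
    
    # stand-in for HydrusExceptions.DamagedOrUnusualFileException
    pass
    

def tobytes_or_raise( read_fn ):
    
    # wrap a low-level read; translate truncation IOErrors into the domain exception
    try:
        
        return read_fn()
        
    except IOError:
        
        raise DamagedOrUnusualFileException( 'Looks like a truncated file that PIL could not handle!' )
        
    

def truncated_read():
    
    raise IOError( 'image file is truncated' )
    

try:
    
    tobytes_or_raise( truncated_read )
    
except DamagedOrUnusualFileException as e:
    
    print( 'caught: {}'.format( e ) )
```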
class ImageRenderer( ClientCachesBase.CacheableObject ):
|
||||
|
||||
def __init__( self, media, this_is_for_metadata_alone = False ):
|
||||
|
@ -631,170 +640,11 @@ class RasterContainerVideo( RasterContainer ):
|
|||
|
||||
|
||||
|
||||
def THREADRender( self ):
|
||||
|
||||
mime = self._media.GetMime()
|
||||
duration = self._media.GetDurationMS()
|
||||
num_frames_in_video = self._media.GetNumFrames()
|
||||
|
||||
time.sleep( 0.00001 )
|
||||
|
||||
if self._media.GetMime() == HC.ANIMATION_GIF:
|
||||
|
||||
( self._durations, self._times_to_play_animation ) = HydrusAnimationHandling.GetFrameDurationsPILAnimation( self._path )
|
||||
|
||||
self._renderer = ClientVideoHandling.GIFRenderer( self._path, num_frames_in_video, self._target_resolution )
|
||||
|
||||
else:
|
||||
|
||||
if self._media.GetMime() == HC.ANIMATION_APNG:
|
||||
|
||||
self._times_to_play_animation = HydrusAnimationHandling.GetTimesToPlayAPNG( self._path )
|
||||
|
||||
|
||||
self._renderer = HydrusVideoHandling.VideoRendererFFMPEG( self._path, mime, duration, num_frames_in_video, self._target_resolution )
|
||||
|
||||
|
||||
# give ui a chance to draw a blank frame rather than hard-charge right into CPUland
|
||||
time.sleep( 0.00001 )
|
||||
|
||||
self.GetReadyForFrame( self._init_position )
|
||||
def CanHaveVariableFramerate( self ):
|
||||
|
||||
with self._lock:
|
||||
|
||||
self._initialised = True
|
||||
|
||||
|
||||
while True:
|
||||
|
||||
if self._stop or HG.started_shutdown:
|
||||
|
||||
self._renderer.Stop()
|
||||
|
||||
self._renderer = None
|
||||
|
||||
with self._lock:
|
||||
|
||||
self._frames = {}
|
||||
|
||||
|
||||
return
|
||||
|
||||
|
||||
#
|
||||
|
||||
with self._lock:
|
||||
|
||||
# lets see if we should move the renderer to a new position
|
||||
|
||||
next_render_is_out_of_buffer = FrameIndexOutOfRange( self._next_render_index, self._buffer_start_index, self._buffer_end_index )
|
||||
buffer_not_fully_rendered = self._last_index_rendered != self._buffer_end_index
|
||||
|
||||
currently_rendering_out_of_buffer = next_render_is_out_of_buffer and buffer_not_fully_rendered
|
||||
|
||||
will_render_ideal_frame_soon = self._IndexInRange( self._next_render_index, self._buffer_start_index, self._ideal_next_frame )
|
||||
|
||||
need_ideal_next_frame = not self._HasFrame( self._ideal_next_frame )
|
||||
|
||||
will_not_get_to_ideal_frame = need_ideal_next_frame and not will_render_ideal_frame_soon
|
||||
|
||||
if currently_rendering_out_of_buffer or will_not_get_to_ideal_frame:
|
||||
|
||||
# we cannot get to the ideal next frame, so we need to rewind/reposition
|
||||
|
||||
self._renderer.set_position( self._buffer_start_index )
|
||||
|
||||
self._last_index_rendered = -1
|
||||
|
||||
self._next_render_index = self._buffer_start_index
|
||||
|
||||
|
||||
#
|
||||
|
||||
need_to_render = self._last_index_rendered != self._buffer_end_index
|
||||
|
||||
|
||||
if need_to_render:
|
||||
|
||||
with self._lock:
|
||||
|
||||
self._rendered_first_frame = True
|
||||
|
||||
frame_index = self._next_render_index # keep this before the get call, as it increments in a clock arithmetic way afterwards
|
||||
|
||||
renderer = self._renderer
|
||||
|
||||
|
||||
try:
|
||||
|
||||
numpy_image = renderer.read_frame()
|
||||
|
||||
except Exception as e:
|
||||
|
||||
HydrusData.ShowException( e )
|
||||
|
||||
return
|
||||
|
||||
finally:
|
||||
|
||||
with self._lock:
|
||||
|
||||
self._last_index_rendered = frame_index
|
||||
|
||||
self._next_render_index = ( self._next_render_index + 1 ) % num_frames_in_video
|
||||
|
||||
|
||||
|
||||
with self._lock:
|
||||
|
||||
if self._next_render_index == 0 and self._buffer_end_index != num_frames_in_video - 1:
|
||||
|
||||
# we need to rewind renderer
|
||||
|
||||
self._renderer.set_position( 0 )
|
||||
|
||||
self._last_index_rendered = -1
|
||||
|
||||
|
||||
should_save_frame = not self._HasFrame( frame_index )
|
||||
|
||||
|
||||
if should_save_frame:
|
||||
|
||||
frame = GenerateHydrusBitmapFromNumPyImage( numpy_image, compressed = False )
|
||||
|
||||
with self._lock:
|
||||
|
||||
self._frames[ frame_index ] = frame
|
||||
|
||||
self._MaintainBuffer()
|
||||
|
||||
|
||||
|
||||
with self._lock:
|
||||
|
||||
work_still_to_do = self._last_index_rendered != self._buffer_end_index
|
||||
|
||||
|
||||
if work_still_to_do:
|
||||
|
||||
time.sleep( 0.0001 )
|
||||
|
||||
else:
|
||||
|
||||
half_a_frame = ( self._average_frame_duration / 1000.0 ) * 0.5
|
||||
|
||||
sleep_duration = min( 0.1, half_a_frame ) # for 10s-long 3-frame gifs, wew
|
||||
|
||||
time.sleep( sleep_duration ) # just so we don't spam cpu
|
||||
|
||||
|
||||
else:
|
||||
|
||||
self._render_event.wait( 1 )
|
||||
|
||||
self._render_event.clear()
|
||||
|
||||
return self._media.GetMime() == HC.ANIMATION_GIF
|
||||
|
||||
|
||||
|
||||
|
@ -1007,14 +857,6 @@ class RasterContainerVideo( RasterContainer ):
|
|||
|
||||
|
||||
|
||||
def CanHaveVariableFramerate( self ):
|
||||
|
||||
with self._lock:
|
||||
|
||||
return self._media.GetMime() == HC.ANIMATION_GIF
|
||||
|
||||
|
||||
|
||||
def IsInitialised( self ):
|
||||
|
||||
with self._lock:
|
||||
|
@@ -1033,6 +875,175 @@ class RasterContainerVideo( RasterContainer ):
        self._stop = True
        
    
    def THREADRender( self ):
        
        mime = self._media.GetMime()
        duration = self._media.GetDurationMS()
        num_frames_in_video = self._media.GetNumFrames()
        
        time.sleep( 0.00001 )
        
        # OK so just a note, you can switch GIF to the FFMPEG renderer these days and it works fine mate, transparency included
        if self._media.GetMime() == HC.ANIMATION_GIF:
            
            ( self._durations, self._times_to_play_animation ) = HydrusAnimationHandling.GetFrameDurationsPILAnimation( self._path )
            
            self._renderer = ClientVideoHandling.GIFRenderer( self._path, num_frames_in_video, self._target_resolution )
            
        else:
            
            if self._media.GetMime() == HC.ANIMATION_APNG:
                
                self._times_to_play_animation = HydrusAnimationHandling.GetTimesToPlayAPNG( self._path )
                
            
            self._renderer = HydrusVideoHandling.VideoRendererFFMPEG( self._path, mime, duration, num_frames_in_video, self._target_resolution )
            
        
        # give ui a chance to draw a blank frame rather than hard-charge right into CPUland
        time.sleep( 0.00001 )
        
        self.GetReadyForFrame( self._init_position )
        
        with self._lock:
            
            self._initialised = True
            
        
        while True:
            
            if self._stop or HG.started_shutdown:
                
                self._renderer.Stop()
                
                self._renderer = None
                
                with self._lock:
                    
                    self._frames = {}
                    
                
                return
                
            
            #
            
            with self._lock:
                
                # lets see if we should move the renderer to a new position
                
                next_render_is_out_of_buffer = FrameIndexOutOfRange( self._next_render_index, self._buffer_start_index, self._buffer_end_index )
                buffer_not_fully_rendered = self._last_index_rendered != self._buffer_end_index
                
                currently_rendering_out_of_buffer = next_render_is_out_of_buffer and buffer_not_fully_rendered
                
                will_render_ideal_frame_soon = self._IndexInRange( self._next_render_index, self._buffer_start_index, self._ideal_next_frame )
                
                need_ideal_next_frame = not self._HasFrame( self._ideal_next_frame )
                
                will_not_get_to_ideal_frame = need_ideal_next_frame and not will_render_ideal_frame_soon
                
                if currently_rendering_out_of_buffer or will_not_get_to_ideal_frame:
                    
                    # we cannot get to the ideal next frame, so we need to rewind/reposition
                    
                    self._renderer.set_position( self._buffer_start_index )
                    
                    self._last_index_rendered = -1
                    
                    self._next_render_index = self._buffer_start_index
                    
                
                #
                
                need_to_render = self._last_index_rendered != self._buffer_end_index
                
            
            if need_to_render:
                
                with self._lock:
                    
                    self._rendered_first_frame = True
                    
                    frame_index = self._next_render_index # keep this before the get call, as it increments in a clock arithmetic way afterwards
                    
                    renderer = self._renderer
                    
                
                try:
                    
                    numpy_image = renderer.read_frame()
                    
                except Exception as e:
                    
                    HydrusData.ShowException( e )
                    
                    return
                    
                finally:
                    
                    with self._lock:
                        
                        self._last_index_rendered = frame_index
                        
                        self._next_render_index = ( self._next_render_index + 1 ) % num_frames_in_video
                        
                    
                
                with self._lock:
                    
                    if self._next_render_index == 0 and self._buffer_end_index != num_frames_in_video - 1:
                        
                        # we need to rewind renderer
                        
                        self._renderer.set_position( 0 )
                        
                        self._last_index_rendered = -1
                        
                    
                
                should_save_frame = not self._HasFrame( frame_index )
                
                if should_save_frame:
                    
                    frame = GenerateHydrusBitmapFromNumPyImage( numpy_image, compressed = False )
                    
                    with self._lock:
                        
                        self._frames[ frame_index ] = frame
                        
                        self._MaintainBuffer()
                        
                    
                
                with self._lock:
                    
                    work_still_to_do = self._last_index_rendered != self._buffer_end_index
                    
                
                if work_still_to_do:
                    
                    time.sleep( 0.0001 )
                    
                else:
                    
                    half_a_frame = ( self._average_frame_duration / 1000.0 ) * 0.5
                    
                    sleep_duration = min( 0.1, half_a_frame ) # for 10s-long 3-frame gifs, wew
                    
                    time.sleep( sleep_duration ) # just so we don't spam cpu
                    
                
            else:
                
                self._render_event.wait( 1 )
                
                self._render_event.clear()
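The buffer bookkeeping above treats frame indices as clock arithmetic: the render buffer can wrap around the end of the video, so "is this index inside the buffer?" has two cases. This is a minimal sketch of that range test, with hypothetical names (`FrameIndexOutOfRange` in the hunk is assumed to behave along these lines; this is not the actual hydrus implementation):

```python
def frame_index_out_of_range( index, range_start, range_end ):
    # the buffer can wrap past the last frame of the video, so the
    # "in range" test has two cases, like clock arithmetic
    if range_start <= range_end:
        # contiguous buffer, e.g. frames 10..50
        return not ( range_start <= index <= range_end )
    else:
        # wrapped buffer, e.g. frames 90..20 of a 100-frame video
        return range_end < index < range_start

# contiguous case
assert not frame_index_out_of_range( 25, 10, 50 )
assert frame_index_out_of_range( 60, 10, 50 )

# wrapped case: the buffer covers 90..99 and 0..20
assert not frame_index_out_of_range( 95, 90, 20 )
assert frame_index_out_of_range( 50, 90, 20 )
```

The same modulo trick appears in `self._next_render_index = ( self._next_render_index + 1 ) % num_frames_in_video`, which is why `frame_index` is captured before the read call.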
class HydrusBitmap( ClientCachesBase.CacheableObject ):
    
    def __init__( self, data, size, depth, compressed = True ):
@@ -1986,8 +1986,7 @@ class ServiceRepository( ServiceRestricted ):
         finally:
             
-            job_status.Finish()
-            job_status.Delete( 5 )
+            job_status.FinishAndDismiss( 5 )
             

@@ -2362,8 +2361,7 @@ class ServiceRepository( ServiceRestricted ):
         job_status.DeleteStatusText( 2 )
         job_status.DeleteVariable( 'popup_gauge_1' )
         
-        job_status.Finish()
-        job_status.Delete( 3 )
+        job_status.FinishAndDismiss( 3 )
         

@@ -2860,8 +2858,7 @@ class ServiceRepository( ServiceRestricted ):
         finally:
             
-            job_status.Finish()
-            job_status.Delete( 5 )
+            job_status.FinishAndDismiss( 5 )
             

@@ -3095,13 +3092,13 @@ class ServiceIPFS( ServiceRemote ):
         if not urls_good:
             
-            job_status.Delete()
+            job_status.FinishAndDismiss()
             
         
     except:
         
-        job_status.Delete()
+        job_status.FinishAndDismiss()
         
         raise

@@ -3152,7 +3149,7 @@ class ServiceIPFS( ServiceRemote ):
     except:
         
-        job_status.Delete()
+        job_status.FinishAndDismiss()
         
         raise
@@ -26,25 +26,29 @@ class JobStatus( object ):
        self._stop_time = stop_time
        self._cancel_on_shutdown = cancel_on_shutdown and maintenance_mode != HC.MAINTENANCE_SHUTDOWN
        
        self._start_time = HydrusTime.GetNow()
        self._cancelled = False
        self._paused = False
        self._dismissed = False
        self._finish_and_dismiss_time = None
        
        self._deleted = threading.Event()
        self._deletion_time = None
        self._done = threading.Event()
        self._cancelled = threading.Event()
        self._paused = threading.Event()
        self._i_am_an_ongoing_job = self._pausable or self._cancellable
        
        self._ui_update_pause_period = 0.1
        self._next_ui_update_pause = HydrusTime.GetNowFloat() + self._ui_update_pause_period
        if self._i_am_an_ongoing_job:
            
            self._i_am_done = False
            self._job_finish_time = None
            
        else:
            
            self._i_am_done = True
            self._job_finish_time = HydrusTime.GetNowFloat()
            
        
        self._yield_pause_period = 10
        self._next_yield_pause = HydrusTime.GetNow() + self._yield_pause_period
        self._ui_update_pauser = HydrusThreading.BigJobPauser( 0.1, 0.00001 )
        
        self._bigger_pause_period = 100
        self._next_bigger_pause = HydrusTime.GetNow() + self._bigger_pause_period
        self._yield_pauser = HydrusThreading.BigJobPauser()
        
        self._longer_pause_period = 1000
        self._next_longer_pause = HydrusTime.GetNow() + self._longer_pause_period
        self._cancel_tests_regular_checker = HydrusThreading.RegularJobChecker( 1.0 )
        
        self._exception = None
@@ -70,35 +74,33 @@ class JobStatus( object ):
    def _CheckCancelTests( self ):
        
        if not self._cancelled.is_set():
        if self._cancel_tests_regular_checker.Due():
            
            should_cancel = False
            
            if self._cancel_on_shutdown and HydrusThreading.IsThreadShuttingDown():
            if not self._i_am_done:
                
                should_cancel = True
                
            
            if HG.client_controller.ShouldStopThisWork( self._maintenance_mode, self._stop_time ):
                
                should_cancel = True
                
            
            if should_cancel:
                
                self.Cancel()
                
            
        
        if not self._deleted.is_set():
            
            if self._deletion_time is not None:
                
                if HydrusTime.TimeHasPassed( self._deletion_time ):
                if self._cancel_on_shutdown and HydrusThreading.IsThreadShuttingDown():
                    
                    self.Finish()
                    self.Cancel()
                    
                    self._deleted.set()
                    return
                    
                
            
            if HG.client_controller.ShouldStopThisWork( self._maintenance_mode, self._stop_time ):
                
                self.Cancel()
                
                return
                
            
        
        if not self._dismissed:
            
            if self._finish_and_dismiss_time is not None:
                
                if HydrusTime.TimeHasPassed( self._finish_and_dismiss_time ):
                    
                    self.FinishAndDismiss()
@@ -112,41 +114,11 @@ class JobStatus( object ):
    def Cancel( self, seconds = None ) -> bool:
    def Cancel( self ):
        
        if not self.IsCancellable():
            
            return False
            
        
        self._cancelled = True
        
        if seconds is None:
            
            self._cancelled.set()
            
            self.Finish()
            
        else:
            
            HG.client_controller.CallLater( seconds, self.Cancel )
            
        
        return True
        
    
    def Delete( self, seconds = None ) -> bool:
        
        if seconds is None:
            
            self._deleted.set()
            
            self.Finish()
            
        else:
            
            self._deletion_time = HydrusTime.GetNow() + seconds
            
        
        return True
        self.Finish()
        
    
    def DeleteFiles( self ):
@@ -168,7 +140,7 @@ class JobStatus( object ):
        self.DeleteVariable( 'status_title' )
        
    
    def DeleteVariable( self, name ):
        
        with self._variable_lock:
@@ -179,23 +151,31 @@ class JobStatus( object ):
        if HydrusTime.TimeHasPassedFloat( self._next_ui_update_pause ):
            
            time.sleep( 0.00001 )
            
            self._next_ui_update_pause = HydrusTime.GetNowFloat() + self._ui_update_pause_period
            
        self._ui_update_pauser.Pause()
        
    
    def Finish( self, seconds = None ):
    def Finish( self ):
        
        self._i_am_done = True
        self._job_finish_time = HydrusTime.GetNowFloat()
        
        self._paused = False
        
        self._pausable = False
        self._cancellable = False
        
    
    def FinishAndDismiss( self, seconds = None ):
        
        self.Finish()
        
        if seconds is None:
            
            self._done.set()
            self._dismissed = True
            
        else:
            
            HG.client_controller.CallLater( seconds, self.Finish )
            self._finish_and_dismiss_time = HydrusTime.GetNow() + seconds
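Per the changelog, the reworked delayed dismissal no longer schedules a `CallLater` callback; it records a due-time that the regular cancel-test check compares lazily. This is a minimal sketch of that pattern with hypothetical names and `time.time()` standing in for hydrus's `HydrusTime` helpers (an illustration of the idea, not the actual hydrus code):

```python
import time

class Job:
    def __init__( self ):
        self._dismissed = False
        self._finish_and_dismiss_time = None
    
    def FinishAndDismiss( self, seconds = None ):
        if seconds is None:
            self._dismissed = True
        else:
            # no scheduler thread needed: just record when dismissal is due
            self._finish_and_dismiss_time = time.time() + seconds
    
    def IsDismissed( self ):
        # the check is lazy--it runs whenever anyone asks about the job
        if not self._dismissed and self._finish_and_dismiss_time is not None:
            if time.time() >= self._finish_and_dismiss_time:
                self.FinishAndDismiss()
        return self._dismissed

job = Job()
job.FinishAndDismiss( 0.05 )
assert not job.IsDismissed()
time.sleep( 0.1 )
assert job.IsDismissed()
```

Because nothing holds a reference to a pending callback, a job that is garbage-collected early cannot fire a stale `Cancel`/`Finish` later, which is one way the recursion and non-dismissing-popup bugs described in the changelog get avoided.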
@@ -286,73 +266,43 @@ class JobStatus( object ):
    def IsCancellable( self ):
        
        self._CheckCancelTests()
        
        return self._cancellable and not self.IsDone()
        return self._cancellable
        
    
    def IsCancelled( self ):
        
        self._CheckCancelTests()
        
        return self._cancelled.is_set()
        return self._cancelled
        
    
    def IsDeletable( self ):
        
        return not ( self.IsPausable() or self.IsCancellable() )
        
    
    def IsDeleted( self ):
    def IsDismissed( self ):
        
        self._CheckCancelTests()
        
        return self._deleted.is_set()
        return self._dismissed
        
    
    def IsDone( self ):
        
        self._CheckCancelTests()
        
        return self._done.is_set()
        return self._i_am_done
        
    
    def IsPausable( self ):
        
        self._CheckCancelTests()
        
        return self._pausable and not self.IsDone()
        return self._pausable
        
    
    def IsPaused( self ):
        
        self._CheckCancelTests()
        
        return self._paused.is_set() and not self.IsDone()
        
    
    def IsWorking( self ):
        
        self._CheckCancelTests()
        
        return not self.IsDone()
        return self._paused
        
    
    def PausePlay( self ):
        
        if self._paused.is_set():
            
            self._paused.clear()
            
        else:
            
            self._paused.set()
            
        
    
    def SetCancellable( self, value ):
        
        self._cancellable = value
        self._paused = not self._paused
        
    
    def SetErrorException( self, e: Exception ):
@@ -381,8 +331,6 @@ class JobStatus( object ):
        self.SetVariable( 'network_job', network_job )
        
    
    def SetPausable( self, value ): self._pausable = value
    
    def SetStatusText( self, text: str, level = 1 ):
        
        self.SetVariable( 'status_text_{}'.format( level ), text )
@@ -407,17 +355,19 @@ class JobStatus( object ):
        with self._variable_lock: self._variables[ name ] = value
        
        if HydrusTime.TimeHasPassed( self._next_ui_update_pause ):
            
            time.sleep( 0.00001 )
            
            self._next_ui_update_pause = HydrusTime.GetNow() + self._ui_update_pause_period
            
        self._ui_update_pauser.Pause()
        
    
    def TimeRunning( self ):
        
        return HydrusTime.GetNow() - self._start_time
        if self._job_finish_time is None:
            
            return HydrusTime.GetNowFloat() - self._creation_time
            
        else:
            
            return self._job_finish_time - self._creation_time
            
        
    
    def ToString( self ):
@@ -466,26 +416,7 @@ class JobStatus( object ):
    def WaitIfNeeded( self ):
        
        if HydrusTime.TimeHasPassed( self._next_yield_pause ):
            
            time.sleep( 0.1 )
            
            self._next_yield_pause = HydrusTime.GetNow() + self._yield_pause_period
            
        
        if HydrusTime.TimeHasPassed( self._next_bigger_pause ):
            
            time.sleep( 1 )
            
            self._next_bigger_pause = HydrusTime.GetNow() + self._bigger_pause_period
            
        
        if HydrusTime.TimeHasPassed( self._longer_pause_period ):
            
            time.sleep( 10 )
            
            self._next_longer_pause = HydrusTime.GetNow() + self._longer_pause_period
            
        
        self._yield_pauser.Pause()
        
        i_paused = False
        should_quit = False
@@ -510,6 +441,7 @@ class JobStatus( object ):
        return ( i_paused, should_quit )
        
    

class FileRWLock( object ):
    
    class RLock( object ):
@@ -1,8 +1,11 @@
import PIL.Image
import cv2
import numpy

from hydrus.core import HydrusData
from hydrus.core import HydrusExceptions
from hydrus.core import HydrusGlobals as HG
from hydrus.core import HydrusTime
from hydrus.core.images import HydrusImageHandling
from hydrus.core.images import HydrusImageNormalisation

@@ -54,7 +57,7 @@ def GetCVVideoProperties( path ):
# the cv code was initially written by @fluffy_cub
class GIFRenderer( object ):
    
    def __init__( self, path, num_frames, target_resolution, force_pil = False ):
    def __init__( self, path, num_frames, target_resolution ):
        
        if HG.media_load_report_mode:
@@ -64,100 +67,239 @@ class GIFRenderer( object ):
        self._path = path
        self._num_frames = num_frames
        self._target_resolution = target_resolution
        self._force_pil = force_pil
        self._pil_dangerzone_frame = None
        self._cannot_seek_to_or_beyond_this_index = None
        self._frames_we_could_not_render = set()
        
        new_options = HG.client_controller.new_options
        self._current_render_index = 0
        self._last_valid_numpy_frame = None
        
        if new_options.GetBoolean( 'disable_cv_for_gifs' ) or cv2.__version__.startswith( '2' ) or self._force_pil:
        self._Initialise()
        
    
    def _GetRecoveryFrame( self ) -> numpy.array:
        
        if self._last_valid_numpy_frame is None:
            
            self._InitialisePIL()
            numpy_image = numpy.zeros( ( self._target_resolution[1], self._target_resolution[0], 4 ), dtype = 'uint8' )
            numpy_image[:,:,3] = 255 # numpy is great!
            
        else:
            
            self._InitialiseCV()
            numpy_image = self._last_valid_numpy_frame
            
        
        return numpy_image
        
    
    def _GetCurrentFrame( self ):
    def _Initialise( self ):
        
        if self._cv_mode:
        if HG.media_load_report_mode:
            
            ( retval, numpy_image ) = self._cv_video.read()
            HydrusData.ShowText( 'Loading GIF with PIL' )
            
            if not retval:
                
                self._next_render_index = ( self._next_render_index + 1 ) % self._num_frames
                
                raise HydrusExceptions.CantRenderWithCVException( 'CV could not render frame ' + str( self._next_render_index - 1 ) + '.' )
                
            
        
        # dequantize = False since we'll be doing that later for each frame in turn
        # if we do it now, it collapses down to a one frame object
        self._pil_image = HydrusImageHandling.GeneratePILImage( self._path, dequantize = False )
        
        self._pil_global_palette = self._pil_image.palette
        
        # years-old weirdo fix, taking it out 2023-11
        '''
        # believe it or not, doing this actually fixed a couple of gifs!
        try:
            
            self._pil_image.seek( 1 )
            self._pil_image.seek( 0 )
            
        except Exception as e:
            
            raise HydrusExceptions.DamagedOrUnusualFileException( 'Could not initialise GIF!' ) from e
            
        '''
        
    
    def _MoveRendererOnOneFrame( self ):
        
        self._current_render_index = ( self._current_render_index + 1 ) % self._num_frames
        
        we_are_in_the_dangerzone = self._cannot_seek_to_or_beyond_this_index is not None and self._current_render_index >= self._cannot_seek_to_or_beyond_this_index
        
        if self._current_render_index == 0 or we_are_in_the_dangerzone:
            
            self._RewindGIF( reinitialise = True )
            
        else:
            
            try:
                
                current_frame = HydrusImageNormalisation.DequantizePILImage( self._pil_image )
                self._pil_image.seek( self._current_render_index )
                
                size = self._pil_image.size
                
                # this out of the blue: <PIL.GifImagePlugin.GifImageFile image mode=RGBA size=85171x53524 at 0x1BF0386C460>
                # 8GB memory 20 second fail render
                if size[0] > 16384 or size[1] > 16384:
                    
                    raise HydrusExceptions.DamagedOrUnusualFileException( 'Crazy GIF frame went bananas!' )
                    
                
            except:
                
                raise HydrusExceptions.UnsupportedFileException( f'PIL could not move to frame {self._next_render_index - 1}.' )
                
            
            if current_frame.mode == 'RGBA':
                
                if self._pil_canvas is None:
                # this can raise OSError in some 'truncated file' circumstances
                # trying to render beyond with PIL is rife with trouble, so we won't try
                if self._cannot_seek_to_or_beyond_this_index is None:
                    
                    self._pil_canvas = current_frame
                    self._cannot_seek_to_or_beyond_this_index = self._current_render_index
                    
                else:
                    
                    try:
                        
                        self._pil_canvas.paste( current_frame, None, current_frame ) # use the rgba image as its own mask
                        
                    except:
                        
                        # the 'paste' can produce an OSError(!!!) on a truncated file, lfg
                        # so let's just bail out in that case mate
                        
                        self._pil_dangerzone_frame = self._next_render_index
                        
                        self._pil_canvas = current_frame
                        
                        self._cannot_seek_to_or_beyond_this_index = min( self._cannot_seek_to_or_beyond_this_index, self._current_render_index )
                        
                    
                
            elif current_frame.mode == 'RGB':
                
                self._pil_canvas = current_frame
                self._RewindGIF( reinitialise = True )
                
            
            numpy_image = HydrusImageHandling.GenerateNumPyImageFromPILImage( self._pil_canvas, strip_useless_alpha = False )
            
        
    
    def _RenderCurrentFrameAndResizeIt( self ) -> numpy.array:
        
        if self._cannot_seek_to_or_beyond_this_index is not None and self._current_render_index >= self._cannot_seek_to_or_beyond_this_index:
            
            numpy_image = self._GetRecoveryFrame()
            
        elif self._current_render_index in self._frames_we_could_not_render:
            
            numpy_image = self._GetRecoveryFrame()
            
        else:
            
            time_started = HydrusTime.GetNowFloat()
            
            try:
                
                current_frame = HydrusImageNormalisation.DequantizePILImage( self._pil_image )
                
                # don't have to worry about pasting alpha-having transparent frames over the previous frame--PIL seems to handle this these days!
                self._pil_canvas = current_frame
                
                numpy_image = HydrusImageHandling.GenerateNumPyImageFromPILImage( self._pil_canvas, strip_useless_alpha = False )
                
            except:
                
                # PIL can produce an IOError, which is an OSError(!!!), on a truncated file, lfg
                # so let's just bail out in that case mate
                
                self._frames_we_could_not_render.add( self._current_render_index )
                
                time_to_error = HydrusTime.GetNowFloat() - time_started
                
                if time_to_error > 2.0:
                    
                    # this is a crazy file that, with its broken frame, needs to re-render the whole thing or something
                    # don't push any further
                    
                    self._cannot_seek_to_or_beyond_this_index = min( self._frames_we_could_not_render )
                    
                
                numpy_image = self._GetRecoveryFrame()
                
            
        
        self._last_valid_numpy_frame = numpy_image
        
        numpy_image = HydrusImageHandling.ResizeNumPyImage( numpy_image, self._target_resolution )
        
        return numpy_image
        
    
    def _RewindGIF( self, reinitialise = False ):
        
        self._pil_image.seek( 0 )
        
        self._current_render_index = 0
        
        if reinitialise:
            
            self._Initialise()
            
        
    
    def read_frame( self ):
        
        numpy_image = self._RenderCurrentFrameAndResizeIt()
        
        self._MoveRendererOnOneFrame()
        
        return numpy_image
        
    
    def set_position( self, index ):
        
        if self._cannot_seek_to_or_beyond_this_index is not None and index >= self._cannot_seek_to_or_beyond_this_index:
            
            return
            
        
        if index == self._current_render_index:
            
            return
            
        elif index < self._current_render_index:
            
            self._RewindGIF()
            
        
        while self._current_render_index < index:
            
            self._MoveRendererOnOneFrame()
            
        
        #self._cv_video.set( CV_CAP_PROP_POS_FRAMES, index )
        
    
    def Stop( self ):
        
        pass
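The hunk above repeatedly falls back to `_GetRecoveryFrame` when a GIF frame fails to decode, remembering bad indices so the render thread never retries or crashes on them. This is a minimal sketch of that recovery pattern with hypothetical names and plain strings standing in for numpy frames (an illustration of the idea, not the hydrus implementation):

```python
class RecoveringRenderer:
    # sketch: remember frames that failed to decode and serve the last
    # good frame (or a blank placeholder) instead of raising again
    def __init__( self, frames ):
        self._frames = frames          # index -> frame data, or None for "broken"
        self._bad_indices = set()
        self._last_valid = None
    
    def read_frame( self, index ):
        if index in self._bad_indices or self._frames[ index ] is None:
            # known-bad or newly-bad frame: note it and recover
            self._bad_indices.add( index )
            return self._recovery_frame()
        self._last_valid = self._frames[ index ]
        return self._last_valid
    
    def _recovery_frame( self ):
        # fall back to a blank frame if nothing has rendered yet,
        # mirroring the zeroed numpy image in _GetRecoveryFrame
        return self._last_valid if self._last_valid is not None else 'blank'

r = RecoveringRenderer( [ 'f0', None, 'f2' ] )
assert r.read_frame( 0 ) == 'f0'
assert r.read_frame( 1 ) == 'f0'   # broken frame 1 -> last good frame
```

The extra wrinkle in the real code is `_cannot_seek_to_or_beyond_this_index`: once a frame takes too long to fail, everything at or past it is treated as unreachable rather than probed again.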
# the cv code was initially written by @fluffy_cub
# hydev is splitting this off into its own clean thing now, just for posterity
# PIL gif rendering has improved by leaps and bounds in recent years, and we have good alpha rendering now. CV is now the sub-par
class GIFRendererCV( object ):
    
    def __init__( self, path, num_frames, target_resolution ):
        
        if HG.media_load_report_mode:
            
            HydrusData.ShowText( 'Loading GIF: ' + path )
            
        
        self._path = path
        self._num_frames = num_frames
        self._target_resolution = target_resolution
        
        self._InitialiseCV()
        
    
    def _GetCurrentFrameAndMoveOn( self ):
        
        ( retval, numpy_image ) = self._cv_video.read()
        
        if not retval:
            
            self._next_render_index = ( self._next_render_index + 1 ) % self._num_frames
            
            raise HydrusExceptions.CantRenderWithCVException( 'CV could not render frame ' + str( self._next_render_index - 1 ) + '.' )
            
        
        self._next_render_index = ( self._next_render_index + 1 ) % self._num_frames
        
        dangerzone = self._pil_dangerzone_frame is not None and self._next_render_index >= self._pil_dangerzone_frame
        
        if self._next_render_index == 0 or dangerzone:
        if self._next_render_index == 0:
            
            self._RewindGIF()
            
        else:
            
            if not self._cv_mode:
                
                try:
                    
                    self._pil_image.seek( self._next_render_index )
                    
                except:
                    
                    # this can raise OSError in some 'truncated file' circumstances, lmao
                    # trying to render beyond with PIL is rife with trouble, so we won't try
                    self._RewindGIF()
                    
                
            
        
        return numpy_image
@@ -169,129 +311,45 @@ class GIFRenderer( object ):
            HydrusData.ShowText( 'Loading GIF with OpenCV' )
            
        
        self._cv_mode = True
        
        self._cv_video = cv2.VideoCapture( self._path )
        
        self._cv_video.set( CAP_PROP_CONVERT_RGB, 1.0 ) # True cast to double
        
        self._next_render_index = 0
        self._last_frame = None
        
    
    def _InitialisePIL( self ):
    def _RenderCurrentFrameAndResizeIt( self ):
        
        if HG.media_load_report_mode:
            
            HydrusData.ShowText( 'Loading GIF with PIL' )
            
        numpy_image = self._GetCurrentFrameAndMoveOn()
        
        self._cv_mode = False
        numpy_image = HydrusImageHandling.ResizeNumPyImage( numpy_image, self._target_resolution )
        
        # dequantize = False since we'll be doing that later for each frame in turn
        # if we do it now, it collapses down to a one frame object
        self._pil_image = HydrusImageHandling.GeneratePILImage( self._path, dequantize = False )
        
        self._pil_canvas = None
        
        self._pil_global_palette = self._pil_image.palette
        
        self._next_render_index = 0
        self._last_frame = None
        
        # years-old weirdo fix, taking it out 2023-11
        '''
        # believe it or not, doing this actually fixed a couple of gifs!
        try:
            
            self._pil_image.seek( 1 )
            self._pil_image.seek( 0 )
            
        except Exception as e:
            
            raise HydrusExceptions.UnsupportedFileException( 'Could not initialise GIF!' ) from e
            
        '''
        
    
    def _RenderCurrentFrame( self ):
        
        if self._cv_mode:
            
            try:
                
                numpy_image = self._GetCurrentFrame()
                
                numpy_image = HydrusImageHandling.ResizeNumPyImage( numpy_image, self._target_resolution )
                
                numpy_image = cv2.cvtColor( numpy_image, cv2.COLOR_BGR2RGB )
                
            except HydrusExceptions.CantRenderWithCVException:
                
                if self._last_frame is None:
                    
                    if HG.media_load_report_mode:
                        
                        HydrusData.ShowText( 'OpenCV Failed to render a frame' )
                        
                    
                    self._InitialisePIL()
                    
                    numpy_image = self._RenderCurrentFrame()
                    
                else:
                    
                    numpy_image = self._last_frame
                    
                
            
        else:
            
            numpy_image = self._GetCurrentFrame()
            
            numpy_image = HydrusImageHandling.ResizeNumPyImage( numpy_image, self._target_resolution )
            
        
        self._last_frame = numpy_image
        numpy_image = cv2.cvtColor( numpy_image, cv2.COLOR_BGR2RGB )
        
        return numpy_image
        
    
    def _RewindGIF( self ):
        
        if self._cv_mode:
            
            self._cv_video.release()
            self._cv_video.open( self._path )
            
            #self._cv_video.set( CAP_PROP_POS_FRAMES, 0.0 )
            
        else:
            
            self._pil_image.seek( 0 )
            
            self._pil_canvas = None
            
        
        self._cv_video.release()
        self._cv_video.open( self._path )
        
        #self._cv_video.set( CAP_PROP_POS_FRAMES, 0.0 )
        
        self._next_render_index = 0
        
    
    def read_frame( self ):
        
        return self._RenderCurrentFrame()
        return self._RenderCurrentFrameAndResizeIt()
        
    
    def set_position( self, index ):
        
        if self._pil_dangerzone_frame is not None and index >= self._pil_dangerzone_frame:
            
            return
            
        
        if index == self._next_render_index: return
        elif index < self._next_render_index: self._RewindGIF()
        
        while self._next_render_index < index: self._GetCurrentFrame()
        while self._next_render_index < index: self._GetCurrentFrameAndMoveOn()
        
        #self._cv_video.set( CV_CAP_PROP_POS_FRAMES, index )
@@ -2432,9 +2432,7 @@ class DB( HydrusDB.HydrusDB ):
         job_status.SetStatusText( 'done!' )
         
-        job_status.Finish()
-        
-        job_status.Delete( 5 )
+        job_status.FinishAndDismiss( 5 )
         

@@ -6883,9 +6881,7 @@ class DB( HydrusDB.HydrusDB ):
         job_status.SetStatusText( 'done!' )
         
-        job_status.Finish()
-        
-        job_status.Delete( 5 )
+        job_status.FinishAndDismiss( 5 )
         

@@ -6910,9 +6906,7 @@ class DB( HydrusDB.HydrusDB ):
         job_status.SetStatusText( 'done!' )
         
-        job_status.Finish()
-        
-        job_status.Delete( 5 )
+        job_status.FinishAndDismiss( 5 )
         
         self._cursor_transaction_wrapper.pub_after_job( 'notify_new_tag_display_application' )
         self._cursor_transaction_wrapper.pub_after_job( 'notify_new_force_refresh_tags_data' )

@@ -6985,9 +6979,7 @@ class DB( HydrusDB.HydrusDB ):
         job_status.SetStatusText( 'done!' )
         
-        job_status.Finish()
-        
-        job_status.Delete( 5 )
+        job_status.FinishAndDismiss( 5 )
         

@@ -7065,9 +7057,7 @@ class DB( HydrusDB.HydrusDB ):
         job_status.SetStatusText( 'done!' )
         
-        job_status.Finish()
-        
-        job_status.Delete( 5 )
+        job_status.FinishAndDismiss( 5 )
         

@@ -7171,9 +7161,7 @@ class DB( HydrusDB.HydrusDB ):
         job_status.SetStatusText( 'done!' )
         
-        job_status.Finish()
-        
-        job_status.Delete( 5 )
+        job_status.FinishAndDismiss( 5 )
         
         self._cursor_transaction_wrapper.pub_after_job( 'notify_new_tag_display_application' )
         self._cursor_transaction_wrapper.pub_after_job( 'notify_new_force_refresh_tags_data' )

@@ -7253,9 +7241,7 @@ class DB( HydrusDB.HydrusDB ):
         job_status.SetStatusText( 'done!' )
         
-        job_status.Finish()
-        
-        job_status.Delete( 5 )
+        job_status.FinishAndDismiss( 5 )
         
         self._cursor_transaction_wrapper.pub_after_job( 'notify_new_force_refresh_tags_data' )

@@ -7356,9 +7342,7 @@ class DB( HydrusDB.HydrusDB ):
         job_status.SetStatusText( 'done!' )
         
-        job_status.Finish()
-        
-        job_status.Delete( 5 )
+        job_status.FinishAndDismiss( 5 )
         
         HydrusData.ShowText( 'Now the mappings cache regen is done, you might want to restart the program.' )

@@ -7457,9 +7441,7 @@ class DB( HydrusDB.HydrusDB ):
         job_status.SetStatusText( 'done!' )
         
-        job_status.Finish()
-        
-        job_status.Delete( 5 )
+        job_status.FinishAndDismiss( 5 )
         
         self._cursor_transaction_wrapper.pub_after_job( 'notify_new_force_refresh_tags_data' )

@@ -7939,9 +7921,7 @@ class DB( HydrusDB.HydrusDB ):
         job_status.SetStatusText( 'done!' )
         
-        job_status.Finish()
-        
-        job_status.Delete( 5 )
+        job_status.FinishAndDismiss( 5 )
         

@@ -8000,9 +7980,7 @@ class DB( HydrusDB.HydrusDB ):
         job_status.SetStatusText( 'done!' )
         
-        job_status.Finish()
-        
-        job_status.Delete( 5 )
+        job_status.FinishAndDismiss( 5 )
         
         self._cursor_transaction_wrapper.pub_after_job( 'notify_new_force_refresh_tags_data' )

@@ -8292,9 +8270,7 @@ class DB( HydrusDB.HydrusDB ):
         job_status.SetStatusText( 'done!' )
         
-        job_status.Finish()
-        
-        job_status.Delete( 5 )
+        job_status.FinishAndDismiss( 5 )
         
         self._cursor_transaction_wrapper.pub_after_job( 'notify_new_tag_display_application' )
         self._cursor_transaction_wrapper.pub_after_job( 'notify_new_force_refresh_tags_data' )

@@ -9928,6 +9904,33 @@ class DB( HydrusDB.HydrusDB ):
+        if version == 553:
+            
+            try:
+                
+                self._controller.frame_splash_status.SetSubtext( f'scheduling some maintenance work' )
+                
+                all_local_hash_ids = self.modules_files_storage.GetCurrentHashIdsList( self.modules_services.combined_local_file_service_id )
+                
+                with self._MakeTemporaryIntegerTable( all_local_hash_ids, 'hash_id' ) as temp_hash_ids_table_name:
+                    
+                    hash_ids = self._STS( self._Execute( f'SELECT hash_id FROM {temp_hash_ids_table_name} CROSS JOIN files_info USING ( hash_id ) WHERE mime IN {HydrusData.SplayListForDB( [ HC.ANIMATION_APNG ] )};', ) )
+                    self.modules_files_maintenance_queue.AddJobs( hash_ids, ClientFiles.REGENERATE_FILE_DATA_JOB_FILE_HAS_TRANSPARENCY )
+                    
+                    hash_ids = self._STS( self._Execute( f'SELECT hash_id FROM {temp_hash_ids_table_name} CROSS JOIN files_info USING ( hash_id ) WHERE mime IN {HydrusData.SplayListForDB( [ HC.APPLICATION_ZIP ] )};', ) )
+                    self.modules_files_maintenance_queue.AddJobs( hash_ids, ClientFiles.REGENERATE_FILE_DATA_JOB_FILE_METADATA )
+                    
+                
+            except Exception as e:
+                
+                HydrusData.PrintException( e )
+                
+                message = 'Some file updates failed to schedule! This is not super important, but hydev would be interested in seeing the error that was printed to the log.'
+                
+                self.pub_initial_message( message )
+                
+            
+        
         self._controller.frame_splash_status.SetTitleText( 'updated db to v{}'.format( HydrusData.ToHumanInt( version + 1 ) ) )
         
         self._Execute( 'UPDATE version SET version = ?;', ( version + 1, ) )

@@ -10392,9 +10395,7 @@ class DB( HydrusDB.HydrusDB ):
         job_status.SetStatusText( 'done!' )
         
-        job_status.Finish()
-        
-        job_status.Delete( 10 )
+        job_status.FinishAndDismiss( 10 )
@@ -292,9 +292,7 @@ class ClientDBMaintenance( ClientDBModule.ClientDBModule ):

         finally:

-            job_status.Finish()
-
-            job_status.Delete( 10 )
+            job_status.FinishAndDismiss( 10 )

@@ -714,9 +714,7 @@ class ClientDBSimilarFiles( ClientDBModule.ClientDBModule ):

         job_status.DeleteVariable( 'popup_gauge_1' )
         job_status.DeleteStatusText( 2 ) # used in the regenbranch call

-        job_status.Finish()
-
-        job_status.Delete( 5 )
+        job_status.FinishAndDismiss( 5 )

@@ -778,9 +776,7 @@ class ClientDBSimilarFiles( ClientDBModule.ClientDBModule ):

         job_status.SetStatusText( 'done!' )
         job_status.DeleteStatusText( 2 )

-        job_status.Finish()
-
-        job_status.Delete( 5 )
+        job_status.FinishAndDismiss( 5 )

@@ -21,8 +21,6 @@ from hydrus.client.metadata import ClientMetadataMigration
 from hydrus.client.metadata import ClientTags
 from hydrus.client.search import ClientSearch

-MAX_PATH_LENGTH = 240 # bit of padding from 255 for .txt neigbouring and other surprises
-
 def GenerateExportFilename( destination_directory, media, terms, file_index, do_not_use_filenames = None ):

     def clean_tag_text( t ):

@@ -39,10 +37,23 @@ def GenerateExportFilename( destination_directory, media, terms, file_index, do_

         return t

-    if len( destination_directory ) > ( MAX_PATH_LENGTH - 10 ):
-
-        raise Exception( 'The destination directory is too long!' )
-
+    decent_expected_filename_length = 16
+
+    try:
+
+        destination_directory_elided = HydrusPaths.ElideFilenameOrDirectorySafely( destination_directory, num_characters_used_in_other_components = decent_expected_filename_length )
+
+    except Exception as e:
+
+        raise Exception( 'Sorry, the destination directory path is way too long! Try shortening it.' ) from e
+
+    if destination_directory_elided != destination_directory:
+
+        raise Exception( 'Sorry, the destination directory path is too long! Try shortening it.' )
+
+    destination_directory_num_characters_in_filesystem = len( destination_directory.encode( 'utf-8' ) )
+
     filename = ''

@@ -140,14 +151,7 @@ def GenerateExportFilename( destination_directory, media, terms, file_index, do_

         filename = filename[ : - len( ext ) ]

-    example_dest_path = os.path.join( destination_directory, filename + ext )
-
-    excess_chars = len( example_dest_path ) - MAX_PATH_LENGTH
-
-    if excess_chars > 0:
-
-        filename = filename[ : - excess_chars ]
-
+    filename = HydrusPaths.ElideFilenameOrDirectorySafely( filename, num_characters_used_in_other_components = destination_directory_num_characters_in_filesystem )

     if do_not_use_filenames is not None:

@@ -816,7 +820,7 @@ class ExportFolder( HydrusSerialisable.SerialisableBaseNamed ):

             HG.client_controller.WriteSynchronous( 'serialisable', self )

-            job_status.Delete()
+            job_status.FinishAndDismiss()

@@ -210,7 +210,7 @@ def THREADUploadPending( service_key ):

         if len( content_types_to_request ) > 0:

-            unauthorised_job_status.Delete( 120 )
+            unauthorised_job_status.FinishAndDismiss( 120 )

             call = HydrusData.Call( HG.client_controller.pub, 'open_manage_services_and_try_to_auto_create_account', service_key )

@@ -281,7 +281,7 @@ def THREADUploadPending( service_key ):

         HydrusData.Print( job_status.ToString() )

-        job_status.Delete( 5 )
+        job_status.FinishAndDismiss( 5 )

         return

@@ -424,15 +424,13 @@ def THREADUploadPending( service_key ):

     HydrusData.Print( job_status.ToString() )

-    job_status.Finish()
-
     if len( content_types_to_request ) == 0:

-        job_status.Delete()
+        job_status.FinishAndDismiss()

     else:

-        job_status.Delete( 5 )
+        job_status.FinishAndDismiss( 5 )

@@ -1371,7 +1369,7 @@ class FrameGUI( CAC.ApplicationCommandProcessorMixin, ClientGUITopLevelWindows.M

             finally:

-                job_status.Delete( seconds = 3 )
+                job_status.FinishAndDismiss( seconds = 3 )

         QP.CallAfter( qt_code, network_job )

@@ -1413,7 +1411,7 @@ class FrameGUI( CAC.ApplicationCommandProcessorMixin, ClientGUITopLevelWindows.M

             time.sleep( 1 )

-            job_status.Delete()
+            job_status.FinishAndDismiss()

         self._controller.CallToThread( do_it, self._controller, cancellable )

@@ -3888,7 +3886,7 @@ class FrameGUI( CAC.ApplicationCommandProcessorMixin, ClientGUITopLevelWindows.M

         def publish_callable( account_types ):

-            job_status.Delete()
+            job_status.FinishAndDismiss()

             self._ManageAccountTypes( service_key, account_types )

@@ -4086,7 +4084,7 @@ class FrameGUI( CAC.ApplicationCommandProcessorMixin, ClientGUITopLevelWindows.M

             finally:

-                job_status.Delete()
+                job_status.FinishAndDismiss()

@@ -4205,7 +4203,7 @@ class FrameGUI( CAC.ApplicationCommandProcessorMixin, ClientGUITopLevelWindows.M

             finally:

-                job_status.Delete()
+                job_status.FinishAndDismiss()

@@ -4494,9 +4492,7 @@ class FrameGUI( CAC.ApplicationCommandProcessorMixin, ClientGUITopLevelWindows.M

             job_status.SetStatusText( 'done!' )

-            job_status.Finish()
-
-            job_status.Delete( 5 )
+            job_status.FinishAndDismiss( 5 )

             service.SetAccountRefreshDueNow()

@@ -4553,9 +4549,7 @@ class FrameGUI( CAC.ApplicationCommandProcessorMixin, ClientGUITopLevelWindows.M

             job_status.SetStatusText( 'done!' )

-            job_status.Finish()
-
-            job_status.Delete( 5 )
+            job_status.FinishAndDismiss( 5 )

             service.SetAccountRefreshDueNow()

@@ -4623,9 +4617,7 @@ class FrameGUI( CAC.ApplicationCommandProcessorMixin, ClientGUITopLevelWindows.M

             job_status.SetStatusText( 'done!' )

-            job_status.Finish()
-
-            job_status.Delete( 5 )
+            job_status.FinishAndDismiss( 5 )

             service.DoAFullMetadataResync()

@@ -4776,7 +4768,7 @@ class FrameGUI( CAC.ApplicationCommandProcessorMixin, ClientGUITopLevelWindows.M

             finally:

-                job_status.Delete()
+                job_status.FinishAndDismiss()

             subscriptions = HG.client_controller.subscriptions_manager.GetSubscriptions()

@@ -4823,7 +4815,7 @@ class FrameGUI( CAC.ApplicationCommandProcessorMixin, ClientGUITopLevelWindows.M

                 finally:

-                    done_job_status.Delete()
+                    done_job_status.FinishAndDismiss()

             finally:

@@ -5538,7 +5530,7 @@ class FrameGUI( CAC.ApplicationCommandProcessorMixin, ClientGUITopLevelWindows.M

         def publish_callable( accounts ):

-            job_status.Delete()
+            job_status.FinishAndDismiss()

             self._ReviewAllAccounts( service_key, accounts )

@@ -5659,7 +5651,7 @@ class FrameGUI( CAC.ApplicationCommandProcessorMixin, ClientGUITopLevelWindows.M

             frame.SetPanel( panel )

-            job_status.Delete()
+            job_status.FinishAndDismiss()

         job_status.SetStatusText( 'loading database data' )

@@ -5970,7 +5962,7 @@ class FrameGUI( CAC.ApplicationCommandProcessorMixin, ClientGUITopLevelWindows.M

             HG.client_controller.RestartClientServerServices()

-            job_status.Delete()
+            job_status.FinishAndDismiss()

@@ -6983,7 +6975,7 @@ The password is cleartext here but obscured in the entry dialog. Enter a blank p

     def AddModalMessage( self, job_status: ClientThreading.JobStatus ):

-        if job_status.IsCancelled() or job_status.IsDeleted():
+        if job_status.IsDismissed():

             return

@@ -112,7 +112,7 @@ def CopyHashesToClipboard( win: QW.QWidget, hash_type: str, medias: typing.Seque

         HG.client_controller.pub( 'message', job_status )

-        job_status.Delete( 2 )
+        job_status.FinishAndDismiss( 2 )

 def CopyMediaURLs( medias ):

@@ -357,9 +357,7 @@ def OpenURLs( urls ):

     if job_status is not None:

-        job_status.Finish()
-
-        job_status.Delete( 1 )
+        job_status.FinishAndDismiss( 1 )

@@ -652,7 +652,7 @@ def MoveOrDuplicateLocalFiles( win: QW.QWidget, dest_service_key: bytes, action:

             pauser.Pause()

-        job_status.Delete()
+        job_status.FinishAndDismiss()

     def publish_callable( result ):

@@ -212,7 +212,7 @@ class PopupMessage( PopupWindow ):

         self._job_status.SetVariable( 'popup_yes_no_answer', False )

-        self._job_status.Delete()
+        self._job_status.FinishAndDismiss()

         self._yes.hide()
         self._no.hide()

@@ -240,7 +240,7 @@ class PopupMessage( PopupWindow ):

         self._job_status.SetVariable( 'popup_yes_no_answer', True )

-        self._job_status.Delete()
+        self._job_status.FinishAndDismiss()

         self._yes.hide()
         self._no.hide()

@@ -327,10 +327,7 @@ class PopupMessage( PopupWindow ):

         if len( presented_hashes ) == 0:

-            if self._job_status.IsDone():
-
-                self.TryToDismiss()
-
+            self.TryToDismiss()

@@ -381,16 +378,16 @@ class PopupMessage( PopupWindow ):

         return self._job_status

-    def IsDeleted( self ):
+    def IsDismissed( self ):

-        return self._job_status.IsDeleted()
+        return self._job_status.IsDismissed()

     def TryToDismiss( self ):

-        if self._job_status.IsDeletable():
+        if self._job_status.IsDone():

-            self._job_status.Delete()
+            self._job_status.FinishAndDismiss()

         PopupWindow.TryToDismiss( self )

@@ -661,14 +658,14 @@ class JobStatusPopupQueue( object ):

         with self._lock:

-            removees = [ job_status.GetKey() for job_status in self._job_status_ordered_dict_queue.values() if job_status.IsDeleted() ]
+            removees = [ job_status.GetKey() for job_status in self._job_status_ordered_dict_queue.values() if job_status.IsDismissed() ]

             for job_status_key in removees:

                 del self._job_status_ordered_dict_queue[ job_status_key ]

-            self._job_statuses_in_view = { job_status for job_status in self._job_statuses_in_view if not job_status.IsDeleted() }
+            self._job_statuses_in_view = { job_status for job_status in self._job_statuses_in_view if not job_status.IsDismissed() }

@@ -678,9 +675,9 @@ class JobStatusPopupQueue( object ):

             for job_status in self._job_status_ordered_dict_queue.values():

-                if job_status.IsDeletable():
+                if job_status.IsDone():

-                    job_status.Delete()
+                    job_status.FinishAndDismiss()

@@ -863,7 +860,7 @@ class PopupMessageManager( QW.QFrame ):

         HG.client_controller.CallLaterQtSafe( self, 0.5, 'initialise message', self.AddMessage, job_status )

-        HG.client_controller.CallLaterQtSafe( self, 1.0, 'delete initial message', job_status.Delete )
+        HG.client_controller.CallLaterQtSafe( self, 1.0, 'delete initial message', job_status.FinishAndDismiss )

     def _CheckPending( self ):

@@ -1065,7 +1062,7 @@ class PopupMessageManager( QW.QFrame ):

             if message_window:

-                if message_window.IsDeleted():
+                if message_window.IsDismissed():

                     self._RemovePopupWindow( message_window )

@@ -1141,12 +1138,12 @@ class PopupMessageManager( QW.QFrame ):

         job_status = window.GetJobStatus()

-        if not job_status.IsDeletable():
+        if not job_status.IsDone():

             return

-        job_status.Delete()
+        job_status.FinishAndDismiss()

         self._message_vbox.removeWidget( window )

@@ -1188,7 +1185,7 @@ class PopupMessageManager( QW.QFrame ):

                 continue

-            if message_window.GetJobStatus().IsDeleted():
+            if message_window.GetJobStatus().IsDismissed():

                 removees.append( message_window )

@@ -2369,9 +2369,6 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):

         self._use_system_ffmpeg = QW.QCheckBox( system_panel )
         self._use_system_ffmpeg.setToolTip( 'Check this to always default to the system ffmpeg in your path, rather than using the static ffmpeg in hydrus\'s bin directory. (requires restart)' )

-        self._disable_cv_for_gifs = QW.QCheckBox( system_panel )
-        self._disable_cv_for_gifs.setToolTip( 'OpenCV is good at rendering gifs, but if you have problems with it and your graphics card, check this and the less reliable and slower PIL will be used instead. EDIT: OpenCV is much better these days--this is mostly not needed.' )
-
         self._load_images_with_pil = QW.QCheckBox( system_panel )
         self._load_images_with_pil.setToolTip( 'OpenCV is much faster than PIL, but it is sometimes less reliable. Switch this on if you experience crashes or other unusual problems while importing or viewing certain images. EDIT: OpenCV is much better these days--this is mostly not needed.' )

@@ -2457,7 +2454,6 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):

         self._animation_start_position.setValue( int( HC.options['animation_start_position'] * 100.0 ) )
         self._hide_uninteresting_local_import_time.setChecked( self._new_options.GetBoolean( 'hide_uninteresting_local_import_time' ) )
         self._hide_uninteresting_modified_time.setChecked( self._new_options.GetBoolean( 'hide_uninteresting_modified_time' ) )
-        self._disable_cv_for_gifs.setChecked( self._new_options.GetBoolean( 'disable_cv_for_gifs' ) )
         self._load_images_with_pil.setChecked( self._new_options.GetBoolean( 'load_images_with_pil' ) )
         self._use_system_ffmpeg.setChecked( self._new_options.GetBoolean( 'use_system_ffmpeg' ) )
         self._always_loop_animations.setChecked( self._new_options.GetBoolean( 'always_loop_gifs' ) )

@@ -2560,7 +2556,6 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):

         rows.append( ( 'Set a new mpv.conf on dialog ok?:', self._mpv_conf_path ) )
         rows.append( ( 'Prefer system FFMPEG:', self._use_system_ffmpeg ) )
-        rows.append( ( 'BUGFIX: Load gifs with PIL instead of OpenCV (slower, bad transparency):', self._disable_cv_for_gifs ) )
         rows.append( ( 'BUGFIX: Load images with PIL (slower):', self._load_images_with_pil ) )

         gridbox = ClientGUICommon.WrapInGrid( system_panel, rows )

@@ -2799,7 +2794,6 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):

         self._new_options.SetBoolean( 'hide_uninteresting_local_import_time', self._hide_uninteresting_local_import_time.isChecked() )
         self._new_options.SetBoolean( 'hide_uninteresting_modified_time', self._hide_uninteresting_modified_time.isChecked() )
-        self._new_options.SetBoolean( 'disable_cv_for_gifs', self._disable_cv_for_gifs.isChecked() )
         self._new_options.SetBoolean( 'load_images_with_pil', self._load_images_with_pil.isChecked() )
         self._new_options.SetBoolean( 'use_system_ffmpeg', self._use_system_ffmpeg.isChecked() )
         self._new_options.SetBoolean( 'always_loop_gifs', self._always_loop_animations.isChecked() )

@@ -2897,7 +2891,7 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):

         self._new_options = new_options

         self._start_note_editing_at_end = QW.QCheckBox( self )
-        self._start_note_editing_at_end.setToolTip( 'Otherwise, start with the caret at the start of the document.' )
+        self._start_note_editing_at_end.setToolTip( 'Otherwise, start the text cursor at the start of the document.' )

         self._start_note_editing_at_end.setChecked( self._new_options.GetBoolean( 'start_note_editing_at_end' ) )

@@ -2905,7 +2899,7 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):

         rows = []

-        rows.append( ( 'Start editing notes with caret at the end of the document: ', self._start_note_editing_at_end ) )
+        rows.append( ( 'Start editing notes with the text cursor at the end of the document: ', self._start_note_editing_at_end ) )

         gridbox = ClientGUICommon.WrapInGrid( self, rows )

@@ -183,6 +183,9 @@ class EditCheckerOptions( ClientGUIScrolledPanels.EditPanel ):

         self._flat_check_period_checkbox.clicked.connect( self.EventFlatPeriodCheck )

+        self._never_faster_than.timeDeltaChanged.connect( self._UpdateTimeDeltas )
+        self._never_slower_than.timeDeltaChanged.connect( self._UpdateTimeDeltas )
+
     def _ShowHelp( self ):

@@ -217,6 +220,44 @@ class EditCheckerOptions( ClientGUIScrolledPanels.EditPanel ):

             self._reactive_check_panel.show()
             self._static_check_panel.hide()

+        self._UpdateTimeDeltas()
+
+    def _UpdateTimeDeltas( self ):
+
+        if not self._flat_check_period_checkbox.isChecked():
+
+            never_faster_than = self._never_faster_than.GetValue()
+            never_slower_than = self._never_slower_than.GetValue()
+
+            if never_slower_than < never_faster_than:
+
+                self._never_slower_than.SetValue( never_faster_than )
+
+    def UserIsOKToOK( self ):
+
+        if not self._flat_check_period_checkbox.isChecked():
+
+            if self._never_faster_than.GetValue() == self._never_slower_than.GetValue():
+
+                from hydrus.client.gui import ClientGUIDialogsQuick
+
+                message = 'The "never check faster/slower than" values are the same, which means this checker will always check at a static, regular interval. Is that OK?'
+
+                result = ClientGUIDialogsQuick.GetYesNo( self, message )
+
+                if result != QW.QDialog.Accepted:
+
+                    return False
+
+        return True
+
     def EventFlatPeriodCheck( self ):

@@ -279,7 +279,7 @@ def ShouldHaveAnimationBar( media, show_action ):

         return False

-    is_animation = media.GetMime() in HC.ANIMATIONS
+    is_animation = media.GetMime() in HC.VIEWABLE_ANIMATIONS
     is_audio = media.GetMime() in HC.AUDIO
     is_video = media.GetMime() in HC.VIDEO

@@ -448,18 +448,13 @@ class Animation( QW.QWidget ):

             self._canvas_qt_pixmap = HG.client_controller.bitmap_manager.GetQtPixmap( my_raw_width, my_raw_height )

-            self._canvas_qt_pixmap.setDevicePixelRatio( self.devicePixelRatio() )
-
-            painter = QG.QPainter( self._canvas_qt_pixmap )
-
-            self._DrawABlankFrame( painter )
-
-        else:
-
-            self._canvas_qt_pixmap.setDevicePixelRatio( self.devicePixelRatio() )
-
-            painter = QG.QPainter( self._canvas_qt_pixmap )
-
+        self._canvas_qt_pixmap.setDevicePixelRatio( self.devicePixelRatio() )
+
+        painter = QG.QPainter( self._canvas_qt_pixmap )
+
+        # this makes transparency work nice, so just force it
+        self._DrawABlankFrame( painter )
+
         current_frame = self._video_container.GetFrame( self._current_frame_index )

@@ -840,7 +835,7 @@ class Animation( QW.QWidget ):

         do_times_to_play_animation_pause = False

-        if self._media.GetMime() in HC.ANIMATIONS and not HG.client_controller.new_options.GetBoolean( 'always_loop_gifs' ):
+        if self._media.GetMime() in HC.VIEWABLE_ANIMATIONS and not HG.client_controller.new_options.GetBoolean( 'always_loop_gifs' ):

             times_to_play_animation = self._video_container.GetTimesToPlayAnimation()

@@ -949,7 +949,7 @@ class MPVWidget( CAC.ApplicationCommandProcessorMixin, QW.QWidget ):

         self._player.pause = True

-        if mime in HC.ANIMATIONS and not HG.client_controller.new_options.GetBoolean( 'always_loop_gifs' ):
+        if mime in HC.VIEWABLE_ANIMATIONS and not HG.client_controller.new_options.GetBoolean( 'always_loop_gifs' ):

             if mime == HC.ANIMATION_GIF:

@@ -924,9 +924,7 @@ class ReviewExportFilesPanel( ClientGUIScrolledPanels.ReviewPanel ):

             job_status.DeleteVariable( 'popup_gauge_1' )
             job_status.SetStatusText( 'Done!' )

-            job_status.Finish()
-
-            job_status.Delete( 5 )
+            job_status.FinishAndDismiss( 5 )

             QP.CallAfter( qt_update_label, 'done!' )

@@ -193,6 +193,10 @@ class EditImportFolderPanel( ClientGUIScrolledPanels.EditPanel ):

         self._period = ClientGUITime.TimeDeltaButton( self._folder_box, min = 3 * 60, days = True, hours = True, minutes = True )

+        self._last_modified_time_skip_period = ClientGUITime.TimeDeltaButton( self._folder_box, min = 1, days = True, hours = True, minutes = True, seconds = True )
+        tt = 'If a file has a modified time more recent than this long ago, it will not be imported in the current check. Helps to avoid importing files that are in the process of downloading/copying (usually on a NAS where other "already in use" checks may fail).'
+        self._last_modified_time_skip_period.setToolTip( tt )
+
         self._paused = QW.QCheckBox( self._folder_box )

         self._check_now = QW.QCheckBox( self._folder_box )

@@ -273,6 +277,9 @@ class EditImportFolderPanel( ClientGUIScrolledPanels.EditPanel ):

         self._check_regularly.setChecked( check_regularly )

         self._period.SetValue( period )

+        self._last_modified_time_skip_period.SetValue( import_folder.GetLastModifiedTimeSkipPeriod() )
+
         self._paused.setChecked( paused )

         self._show_working_popup.setChecked( show_working_popup )

@@ -318,6 +325,7 @@ class EditImportFolderPanel( ClientGUIScrolledPanels.EditPanel ):

         rows.append( ( 'currently paused (if set, will not ever do any work): ', self._paused ) )
         rows.append( ( 'check regularly?: ', self._check_regularly ) )
         rows.append( ( 'check period: ', self._period ) )
+        rows.append( ( 'recent modified time skip period: ', self._last_modified_time_skip_period ) )
         rows.append( ( 'check on manage dialog ok: ', self._check_now ) )
         rows.append( ( 'show a popup while working: ', self._show_working_popup ) )
         rows.append( ( 'publish presented files to a popup button: ', self._publish_files_to_popup_button ) )

@@ -632,6 +640,7 @@ class EditImportFolderPanel( ClientGUIScrolledPanels.EditPanel ):

         period = self._period.GetValue()

         check_regularly = self._check_regularly.isChecked()

         paused = self._paused.isChecked()

@@ -646,6 +655,8 @@ class EditImportFolderPanel( ClientGUIScrolledPanels.EditPanel ):

         self._import_folder.SetTuple( name, path, file_import_options, tag_import_options, tag_service_keys_to_filename_tagging_options, actions, action_locations, period, check_regularly, paused, check_now, show_working_popup, publish_files_to_popup_button, publish_files_to_page )

+        self._import_folder.SetLastModifiedTimeSkipPeriod( self._last_modified_time_skip_period.GetValue() )
+
         metadata_routers = self._metadata_routers_button.GetValue()

         self._import_folder.SetMetadataRouters( metadata_routers )

@@ -1793,8 +1793,7 @@ class ManagementPanelImporterMultipleGallery( ManagementPanelImporter ):

         self._gallery_importers_listctrl.UpdateDatas()

-        job_status.Finish()
-        job_status.Delete()
+        job_status.FinishAndDismiss()

@@ -2789,8 +2788,7 @@ class ManagementPanelImporterMultipleWatcher( ManagementPanelImporter ):

         self._watchers_listctrl.UpdateDatas()

-        job_status.Finish()
-        job_status.Delete()
+        job_status.FinishAndDismiss()

@@ -4579,7 +4577,7 @@ class ManagementPanelPetitions( ManagementPanel ):

             job_status.SetStatusText( 'Hey, the server did not have that type of petition after all. Please hit refresh counts.' )

-            job_status.Delete( 5 )
+            job_status.FinishAndDismiss( 5 )

             HG.client_controller.pub( 'message', job_status )

@@ -5447,7 +5445,7 @@ class ManagementPanelPetitions( ManagementPanel ):

             finally:

-                job_status.Delete()
+                job_status.FinishAndDismiss()

             HG.client_controller.CallBlockingToQt( self, qt_petition_cleared, outgoing_petition )

@@ -5603,7 +5601,7 @@ class ManagementPanelQuery( ManagementPanel ):

         if len( file_search_context.GetPredicates() ) > 0:

-            self._query_job_status = ClientThreading.JobStatus()
+            self._query_job_status = ClientThreading.JobStatus( cancellable = True )

             sort_by = self._media_sort_widget.GetSort()

@@ -2174,7 +2174,7 @@ class PagesNotebook( QP.TabWidgetWithDnD ):

         self.freshSessionLoaded.emit( session )

-        job_status.Delete()
+        job_status.FinishAndDismiss()

     def ChooseNewPage( self ):

@@ -1785,7 +1785,7 @@ class ReviewServicePanel( QW.QWidget ):

             if should_quit:

-                job_status.Delete()
+                job_status.FinishAndDismiss()

                 return

@@ -406,7 +406,7 @@ class FileImportJob( object ):

         #

-        self._has_transparency = ClientFiles.HasTransparency( self._temp_path, mime, num_frames = num_frames, resolution = ( width, height ) )
+        self._has_transparency = ClientFiles.HasTransparency( self._temp_path, mime, duration = duration, num_frames = num_frames, resolution = ( width, height ) )

         has_exif = False

@ -483,7 +483,7 @@ class ImportFolder( HydrusSerialisable.SerialisableBaseNamed ):
|
|||
|
||||
SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_TYPE_IMPORT_FOLDER
|
||||
SERIALISABLE_NAME = 'Import Folder'
|
||||
SERIALISABLE_VERSION = 8
|
||||
SERIALISABLE_VERSION = 9
|
||||
|
||||
def __init__(
|
||||
self,
|
||||
|
@ -552,6 +552,8 @@ class ImportFolder( HydrusSerialisable.SerialisableBaseNamed ):
|
|||
self._period = period
|
||||
self._check_regularly = check_regularly
|
||||
|
||||
self._last_modified_time_skip_period = 60
|
||||
|
||||
self._file_seed_cache = ClientImportFileSeeds.FileSeedCache()
|
||||
self._last_checked = 0
|
||||
self._paused = False
|
||||
|
@ -698,29 +700,22 @@ class ImportFolder( HydrusSerialisable.SerialisableBaseNamed ):
|
|||
|
||||
|
||||
def _CheckFolder( self, job_status: ClientThreading.JobStatus ):
|
||||
|
||||
|
||||
( all_paths, num_sidecars ) = ClientFiles.GetAllFilePaths( [ self._path ] )
|
||||
|
||||
all_paths = HydrusPaths.FilterFreePaths( all_paths )
|
||||
paths_to_file_seeds = { path : ClientImportFileSeeds.FileSeed( ClientImportFileSeeds.FILE_SEED_TYPE_HDD, path ) for path in all_paths }
|
||||
|
||||
file_seeds = []
|
||||
new_paths = [ path for ( path, file_seed ) in paths_to_file_seeds.items() if not self._file_seed_cache.HasFileSeed( file_seed ) ]
|
||||
|
||||
for path in all_paths:
|
||||
|
||||
if job_status.IsCancelled():
|
||||
|
||||
break
|
||||
|
||||
|
||||
file_seed = ClientImportFileSeeds.FileSeed( ClientImportFileSeeds.FILE_SEED_TYPE_HDD, path )
|
||||
|
||||
if not self._file_seed_cache.HasFileSeed( file_seed ):
|
||||
|
||||
file_seeds.append( file_seed )
|
||||
|
||||
|
||||
job_status.SetStatusText( 'checking: found ' + HydrusData.ToHumanInt( len( file_seeds ) ) + ' new files' )
|
||||
|
||||
job_status.SetStatusText( f'checking: found {HydrusData.ToHumanInt( len( new_paths ) )} new files' )
|
||||
|
||||
old_new_paths = HydrusPaths.FilterOlderModifiedFiles( new_paths, self._last_modified_time_skip_period )
|
||||
|
||||
free_old_new_paths = HydrusPaths.FilterFreePaths( old_new_paths )
|
||||
|
||||
file_seeds = [ paths_to_file_seeds[ path ] for path in free_old_new_paths ]
|
||||
|
||||
job_status.SetStatusText( f'checking: found {HydrusData.ToHumanInt( len( file_seeds ) )} new files to import' )
|
||||
|
||||
self._file_seed_cache.AddFileSeeds( file_seeds )
|
||||
|
||||
|
@ -740,7 +735,25 @@ class ImportFolder( HydrusSerialisable.SerialisableBaseNamed ):
|
|||
action_pairs = list(self._actions.items())
|
||||
action_location_pairs = list(self._action_locations.items())
|
||||
|
||||
return ( self._path, serialisable_file_import_options, serialisable_tag_import_options, serialisable_metadata_routers, serialisable_tag_service_keys_to_filename_tagging_options, action_pairs, action_location_pairs, self._period, self._check_regularly, serialisable_file_seed_cache, self._last_checked, self._paused, self._check_now, self._show_working_popup, self._publish_files_to_popup_button, self._publish_files_to_page )
|
||||
return (
|
||||
self._path,
|
||||
serialisable_file_import_options,
|
||||
serialisable_tag_import_options,
|
||||
serialisable_metadata_routers,
|
||||
serialisable_tag_service_keys_to_filename_tagging_options,
|
||||
action_pairs,
|
||||
action_location_pairs,
|
||||
self._period,
|
||||
self._check_regularly,
|
||||
serialisable_file_seed_cache,
|
||||
self._last_checked,
|
||||
self._paused,
|
||||
self._check_now,
|
||||
self._last_modified_time_skip_period,
|
||||
self._show_working_popup,
|
||||
self._publish_files_to_popup_button,
|
||||
self._publish_files_to_page
|
||||
)
|
||||
|
||||
|
||||
def _ImportFiles( self, job_status ):
|
||||
|
@@ -907,7 +920,25 @@ class ImportFolder( HydrusSerialisable.SerialisableBaseNamed ):
 
     def _InitialiseFromSerialisableInfo( self, serialisable_info ):
 
-        ( self._path, serialisable_file_import_options, serialisable_tag_import_options, serialisable_metadata_routers, serialisable_tag_service_keys_to_filename_tagging_options, action_pairs, action_location_pairs, self._period, self._check_regularly, serialisable_file_seed_cache, self._last_checked, self._paused, self._check_now, self._show_working_popup, self._publish_files_to_popup_button, self._publish_files_to_page ) = serialisable_info
+        (
+            self._path,
+            serialisable_file_import_options,
+            serialisable_tag_import_options,
+            serialisable_metadata_routers,
+            serialisable_tag_service_keys_to_filename_tagging_options,
+            action_pairs,
+            action_location_pairs,
+            self._period,
+            self._check_regularly,
+            serialisable_file_seed_cache,
+            self._last_checked,
+            self._paused,
+            self._check_now,
+            self._last_modified_time_skip_period,
+            self._show_working_popup,
+            self._publish_files_to_popup_button,
+            self._publish_files_to_page
+        ) = serialisable_info
 
         self._actions = dict( action_pairs )
         self._action_locations = dict( action_location_pairs )
@@ -1049,6 +1080,52 @@ class ImportFolder( HydrusSerialisable.SerialisableBaseNamed ):
 
             return ( 8, new_serialisable_info )
 
+        if version == 8:
+
+            (
+                path,
+                serialisable_file_import_options,
+                serialisable_tag_import_options,
+                serialisable_metadata_routers,
+                serialisable_tag_service_keys_to_filename_tagging_options,
+                action_pairs,
+                action_location_pairs,
+                period,
+                check_regularly,
+                serialisable_file_seed_cache,
+                last_checked,
+                paused,
+                check_now,
+                show_working_popup,
+                publish_files_to_popup_button,
+                publish_files_to_page
+            ) = old_serialisable_info
+
+            last_modified_time_skip_period = 60
+
+            new_serialisable_info = (
+                path,
+                serialisable_file_import_options,
+                serialisable_tag_import_options,
+                serialisable_metadata_routers,
+                serialisable_tag_service_keys_to_filename_tagging_options,
+                action_pairs,
+                action_location_pairs,
+                period,
+                check_regularly,
+                serialisable_file_seed_cache,
+                last_checked,
+                paused,
+                check_now,
+                last_modified_time_skip_period,
+                show_working_popup,
+                publish_files_to_popup_button,
+                publish_files_to_page
+            )
+
+            return ( 9, new_serialisable_info )
 
     def CheckNow( self ):
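Reviewer note: the version 8→9 update above follows the usual tuple-versioned serialisation pattern: unpack the old tuple, slot a default for the new field (`last_modified_time_skip_period = 60`) into its position, and repack. A minimal standalone sketch of that pattern, using a hypothetical three-field tuple rather than the real ImportFolder one:

```python
def update_serialisable_info( version, old_serialisable_info ):
    
    # hypothetical miniature of the ImportFolder update: the new field gets a
    # default value inserted at its position in the serialised tuple
    if version == 8:
        
        ( path, period, paused ) = old_serialisable_info
        
        last_modified_time_skip_period = 60
        
        new_serialisable_info = ( path, period, last_modified_time_skip_period, paused )
        
        return ( 9, new_serialisable_info )
        
    
    return ( version, old_serialisable_info )
```

Because the tuple is positional, an old client reading a new tuple (or vice versa) would mis-align every later field, which is why the update runs exactly once per version step.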
@@ -1140,7 +1217,7 @@ class ImportFolder( HydrusSerialisable.SerialisableBaseNamed ):
 
         HG.client_controller.WriteSynchronous( 'serialisable', self )
 
-        job_status.Delete()
+        job_status.FinishAndDismiss()
 
     def GetFileSeedCache( self ):
@@ -1148,6 +1225,11 @@ class ImportFolder( HydrusSerialisable.SerialisableBaseNamed ):
 
         return self._file_seed_cache
 
+    def GetLastModifiedTimeSkipPeriod( self ) -> int:
+
+        return self._last_modified_time_skip_period
+
     def GetMetadataRouters( self ):
 
         return list( self._metadata_routers )
 
@@ -1178,6 +1260,11 @@ class ImportFolder( HydrusSerialisable.SerialisableBaseNamed ):
 
         self._file_seed_cache = file_seed_cache
 
+    def SetLastModifiedTimeSkipPeriod( self, value: int ):
+
+        self._last_modified_time_skip_period = value
+
     def SetMetadataRouters( self, metadata_routers: typing.Collection[ ClientMetadataMigration.SingleFileMetadataRouter ] ):
 
         self._metadata_routers = HydrusSerialisable.SerialisableList( metadata_routers )
@@ -1712,7 +1712,7 @@ class Subscription( HydrusSerialisable.SerialisableBaseNamed ):
 
         else:
 
-            job_status.Delete()
+            job_status.FinishAndDismiss()
@@ -5,7 +5,6 @@ from hydrus.core import HydrusData
 from hydrus.core import HydrusSerialisable
 from hydrus.core import HydrusTime
 
 from hydrus.client import ClientConstants as CC
 from hydrus.client.media import ClientMediaResult
 from hydrus.client.metadata import ClientTags
@@ -102,7 +101,12 @@ class CheckerOptions( HydrusSerialisable.SerialisableBase ):
 
         return death_file_velocity_period
 
-    def GetNextCheckTime( self, file_seed_cache, last_check_time, previous_next_check_time ):
+    def GetNextCheckTime( self, file_seed_cache, last_check_time: int, previous_next_check_time: typing.Optional[ int ] ) -> int:
+
+        if previous_next_check_time is None:
+
+            previous_next_check_time = last_check_time + self._never_faster_than
 
         if len( file_seed_cache ) == 0:
 
@@ -115,28 +119,27 @@ class CheckerOptions( HydrusSerialisable.SerialisableBase ):
 
             return HydrusTime.GetNow() + self._never_slower_than
 
-        elif self._never_faster_than == self._never_slower_than:
+        if self._never_faster_than == self._never_slower_than:
 
             # fixed check period
-            check_period = self._never_slower_than
+            fixed_check_period = self._never_slower_than
 
-            # ok the issue here is that, if the thing is supposed to check every seven days, we want to persist in saturday night rather than slowly creep forward a few hours every week due to delays
-            # thus, if the 'last' next check time is 'reasonable' (which for now we really simplified to just mean "it happened already"), we'll add to that instead of our actual last check time
-            # but if the user reduces the check time to one day on Wednesday, we want to notice that and reset immediately, not wait until next saturday to recalculate!
-
-            if HydrusTime.TimeHasPassed( previous_next_check_time - 5 ):
-
-                next_check_time = previous_next_check_time + check_period
-
-            else:
-
-                next_check_time = last_check_time + check_period
-
-            return next_check_time
+            # I had a bunch of complicated logic to try and make sure a saturday check stayed on saturday, even if the check was delayed to sunday, and it just wasn't worth the trouble
+            # KISS
+
+            next_check_time = last_check_time + fixed_check_period
+
+            while HydrusTime.TimeHasPassed( next_check_time + fixed_check_period ):
+
+                next_check_time += fixed_check_period
 
         else:
 
             # dynamic check period
 
             ( current_files_found, current_time_delta ) = self._GetCurrentFilesVelocity( file_seed_cache, last_check_time )
 
             if current_files_found == 0:
 
@@ -163,9 +166,11 @@ class CheckerOptions( HydrusSerialisable.SerialisableBase ):
 
             check_period = min( max( never_faster_than, ideal_check_period ), self._never_slower_than )
 
-        return last_check_time + check_period
+            next_check_time = last_check_time + check_period
+
+        return next_check_time
 
     def GetPrettyCurrentVelocity( self, file_seed_cache, last_check_time, no_prefix = False ):
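Reviewer note: the new fixed-interval branch above reduces to 'last check time + period', plus a catch-up loop so a long-paused downloader schedules one check instead of replaying every missed slot. A standalone sketch of that logic (simplified names; `now` is passed in explicitly where the real code calls HydrusTime.TimeHasPassed):

```python
def next_check_time_static( last_check_time, fixed_check_period, now ):
    
    # work off the last actual check, not the previously scheduled slot
    next_check_time = last_check_time + fixed_check_period
    
    # if we have been paused for several periods, skip forward whole periods
    # so we do one catch-up check instead of many back-to-back checks
    while now > next_check_time + fixed_check_period:
        
        next_check_time += fixed_check_period
        
    
    return next_check_time
```

A watcher on a 7-unit period last checked 30 units ago gets its next check at 28, i.e. one catch-up check inside the last period, rather than four stacked checks.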
@@ -151,9 +151,10 @@ class HydrusServiceClientAPI( HydrusClientService ):
 
         root.putChild( b'manage_popups', manage_popups )
 
         manage_popups.putChild( b'get_popups', ClientLocalServerResources.HydrusResourceClientAPIRestrictedManagePopupsGetPopups( self._service, self._client_requests_domain ) )
-        manage_popups.putChild( b'dismiss_popup', ClientLocalServerResources.HydrusResourceClientAPIRestrictedManagePopupsDismissPopup( self._service, self._client_requests_domain ) )
+        manage_popups.putChild( b'cancel_popup', ClientLocalServerResources.HydrusResourceClientAPIRestrictedManagePopupsCancelPopup( self._service, self._client_requests_domain ) )
+        manage_popups.putChild( b'dismiss_popup', ClientLocalServerResources.HydrusResourceClientAPIRestrictedManagePopupsDismissPopup( self._service, self._client_requests_domain ) )
         manage_popups.putChild( b'finish_popup', ClientLocalServerResources.HydrusResourceClientAPIRestrictedManagePopupsFinishPopup( self._service, self._client_requests_domain ) )
         manage_popups.putChild( b'finish_and_dismiss_popup', ClientLocalServerResources.HydrusResourceClientAPIRestrictedManagePopupsFinishAndDismissPopup( self._service, self._client_requests_domain ) )
         manage_popups.putChild( b'call_user_callable', ClientLocalServerResources.HydrusResourceClientAPIRestrictedManagePopupsCallUserCallable( self._service, self._client_requests_domain ) )
         manage_popups.putChild( b'add_popup', ClientLocalServerResources.HydrusResourceClientAPIRestrictedManagePopupsAddPopup( self._service, self._client_requests_domain ) )
         manage_popups.putChild( b'update_popup', ClientLocalServerResources.HydrusResourceClientAPIRestrictedManagePopupsUpdatePopup( self._service, self._client_requests_domain ) )
@@ -1167,7 +1167,7 @@ class HydrusResourceBooruPage( HydrusResourceBooru ):
 
         mime = media_result.GetMime()
 
-        if mime in HC.IMAGES or mime in HC.ANIMATIONS:
+        if mime in HC.IMAGES or mime in HC.VIEWABLE_ANIMATIONS:
 
             ( width, height ) = media_result.GetResolution()
@@ -3571,7 +3571,7 @@ class HydrusResourceClientAPIRestrictedManageCookiesSetCookies( HydrusResourceCl
 
         job_status.SetStatusText( message )
 
-        job_status.Delete( 5 )
+        job_status.FinishAndDismiss( 5 )
 
         HG.client_controller.pub( 'message', job_status )
 
@@ -3819,7 +3819,7 @@ class HydrusResourceClientAPIRestrictedManageCookiesSetHeaders( HydrusResourceCl
 
         job_status.SetStatusText( message )
 
-        job_status.Delete( 5 )
+        job_status.FinishAndDismiss( 5 )
 
         HG.client_controller.pub( 'message', job_status )
@@ -3934,7 +3934,7 @@ class HydrusResourceClientAPIRestrictedManageDatabaseGetClientOptions( HydrusRes
 
         old_options = HG.client_controller.options
 
-        old_options = { key : value for ( key, value ) in old_options if key in OLD_OPTIONS_DEFAULT }
+        old_options = { key : value for ( key, value ) in old_options.items() if key in OLD_OPTIONS_DEFAULT }
 
         new_options: ClientOptions.ClientOptions = HG.client_controller.new_options
@@ -4422,16 +4422,14 @@ def JobStatusToDict( job_status: ClientThreading.JobStatus ):
 
         'had_error' : job_status.HadError(),
         'is_cancellable' : job_status.IsCancellable(),
         'is_cancelled' : job_status.IsCancelled(),
-        'is_deleted' : job_status.IsDeleted(),
         'is_done' : job_status.IsDone(),
-        'is_pauseable' : job_status.IsPausable(),
+        'is_pausable' : job_status.IsPausable(),
         'is_paused' : job_status.IsPaused(),
         'is_working' : job_status.IsWorking(),
         'nice_string' : job_status.ToString(),
         'popup_gauge_1' : job_status.GetIfHasVariable( 'popup_gauge_1' ),
         'popup_gauge_2' : job_status.GetIfHasVariable( 'popup_gauge_2' ),
         'attached_files_mergable' : job_status.GetIfHasVariable( 'attached_files_mergable' ),
-        'api_data' : job_status.GetIfHasVariable( 'api_data' ),
+        'api_data' : job_status.GetIfHasVariable( 'api_data' )
     }
 
     files_object = job_status.GetFiles()
@@ -4481,19 +4479,27 @@ def JobStatusToDict( job_status: ClientThreading.JobStatus ):
 
-class HydrusResourceClientAPIRestrictedManagePopupsGetPopups( HydrusResourceClientAPIRestrictedManagePages ):
+class HydrusResourceClientAPIRestrictedManagePopupsAddPopup( HydrusResourceClientAPIRestrictedManagePages ):
 
-    def _threadDoGETJob( self, request: HydrusServerRequest.HydrusRequest ):
+    def _threadDoPOSTJob(self, request: HydrusServerRequest.HydrusRequest ):
 
-        job_status_queue: ClientGUIPopupMessages.JobStatusPopupQueue = HG.client_controller.job_status_popup_queue
-
-        only_in_view = request.parsed_request_args.GetValue( 'only_in_view', bool, default_value = False )
-
-        job_statuses = job_status_queue.GetJobStatuses( only_in_view )
+        pausable = request.parsed_request_args.GetValue( 'is_pausable', bool, default_value = False )
+        cancellable = request.parsed_request_args.GetValue( 'is_cancellable', bool, default_value = False )
+
+        job_status = ClientThreading.JobStatus( pausable = pausable, cancellable = cancellable )
+
+        if request.parsed_request_args.GetValue( 'attached_files_mergable', bool, default_value = False ):
+
+            job_status.SetVariable( 'attached_files_mergable', True )
+
+        HandlePopupUpdate( job_status, request )
+
+        HG.client_controller.pub( 'message', job_status )
 
         body_dict = {
-            'job_statuses' : [JobStatusToDict( job ) for job in job_statuses]
-        }
+            'job_status': JobStatusToDict( job_status )
+        }
 
         body = Dumps( body_dict, request.preferred_mime )
@@ -4506,34 +4512,33 @@ class HydrusResourceClientAPIRestrictedManagePopupsGetPopups( HydrusResourceClie
 
 def GetJobStatusFromRequest( request: HydrusServerRequest.HydrusRequest ) -> ClientThreading.JobStatus:
 
     job_status_key = request.parsed_request_args.GetValue( 'job_status_key', bytes )
 
     job_status_queue: ClientGUIPopupMessages.JobStatusPopupQueue = HG.client_controller.job_status_popup_queue
 
     job_status = job_status_queue.GetJobStatus( job_status_key )
 
     if job_status is None:
 
-        raise HydrusExceptions.BadRequestException('This job key doesn\'t exist!')
+        raise HydrusExceptions.BadRequestException( 'This job key doesn\'t exist!' )
 
     return job_status
-class HydrusResourceClientAPIRestrictedManagePopupsDismissPopup( HydrusResourceClientAPIRestrictedManagePages ):
+class HydrusResourceClientAPIRestrictedManagePopupsCallUserCallable( HydrusResourceClientAPIRestrictedManagePages ):
 
     def _threadDoPOSTJob(self, request: HydrusServerRequest.HydrusRequest ):
 
         job_status = GetJobStatusFromRequest( request )
 
-        if not job_status.IsDeletable():
+        user_callable = job_status.GetUserCallable()
+
+        if user_callable is None:
 
-            raise HydrusExceptions.BadRequestException('This job can\'t be dismissed!')
+            raise HydrusExceptions.BadRequestException('This job doesn\'t have a user callable!')
 
-        seconds = request.parsed_request_args.GetValueOrNone( 'seconds', int )
-
-        job_status.Delete( seconds )
+        HG.client_controller.CallBlockingToQt( HG.client_controller.gui, user_callable )
 
         response_context = HydrusServerResources.ResponseContext( 200 )
@@ -4547,14 +4552,27 @@ class HydrusResourceClientAPIRestrictedManagePopupsCancelPopup( HydrusResourceCl
 
         job_status = GetJobStatusFromRequest( request )
 
-        if not job_status.IsCancellable():
+        if job_status.IsCancellable():
 
-            raise HydrusExceptions.BadRequestException('This job can\'t be cancelled!')
+            job_status.Cancel()
 
-        seconds = request.parsed_request_args.GetValueOrNone( 'seconds', int )
-
-        job_status.Cancel( seconds )
+        response_context = HydrusServerResources.ResponseContext( 200 )
+
+        return response_context
+
+
+class HydrusResourceClientAPIRestrictedManagePopupsDismissPopup( HydrusResourceClientAPIRestrictedManagePages ):
+
+    def _threadDoPOSTJob(self, request: HydrusServerRequest.HydrusRequest ):
+
+        job_status = GetJobStatusFromRequest( request )
+
+        if job_status.IsDone():
+
+            job_status.FinishAndDismiss()
 
         response_context = HydrusServerResources.ResponseContext( 200 )
@@ -4568,9 +4586,7 @@ class HydrusResourceClientAPIRestrictedManagePopupsFinishPopup( HydrusResourceCl
 
         job_status = GetJobStatusFromRequest( request )
 
-        seconds = request.parsed_request_args.GetValueOrNone( 'seconds', int )
-
-        job_status.Finish( seconds )
+        job_status.Finish()
 
         response_context = HydrusServerResources.ResponseContext( 200 )
@@ -4578,20 +4594,15 @@ class HydrusResourceClientAPIRestrictedManagePopupsFinishPopup( HydrusResourceCl
 
-class HydrusResourceClientAPIRestrictedManagePopupsCallUserCallable( HydrusResourceClientAPIRestrictedManagePages ):
+class HydrusResourceClientAPIRestrictedManagePopupsFinishAndDismissPopup( HydrusResourceClientAPIRestrictedManagePages ):
 
     def _threadDoPOSTJob(self, request: HydrusServerRequest.HydrusRequest ):
 
         job_status = GetJobStatusFromRequest( request )
 
-        user_callable = job_status.GetUserCallable()
-
-        if user_callable is None:
-
-            raise HydrusExceptions.BadRequestException('This job doesn\'t have a user callable!')
-
         seconds = request.parsed_request_args.GetValueOrNone( 'seconds', int )
 
-        HG.client_controller.CallBlockingToQt( HG.client_controller.gui, user_callable )
+        job_status.FinishAndDismiss( seconds )
 
         response_context = HydrusServerResources.ResponseContext( 200 )
@@ -4599,6 +4610,28 @@ class HydrusResourceClientAPIRestrictedManagePopupsCallUserCallable( HydrusResou
 
+class HydrusResourceClientAPIRestrictedManagePopupsGetPopups( HydrusResourceClientAPIRestrictedManagePages ):
+
+    def _threadDoGETJob( self, request: HydrusServerRequest.HydrusRequest ):
+
+        job_status_queue: ClientGUIPopupMessages.JobStatusPopupQueue = HG.client_controller.job_status_popup_queue
+
+        only_in_view = request.parsed_request_args.GetValue( 'only_in_view', bool, default_value = False )
+
+        job_statuses = job_status_queue.GetJobStatuses( only_in_view )
+
+        body_dict = {
+            'job_statuses' : [JobStatusToDict( job ) for job in job_statuses]
+        }
+
+        body = Dumps( body_dict, request.preferred_mime )
+
+        response_context = HydrusServerResources.ResponseContext( 200, mime = request.preferred_mime, body = body )
+
+        return response_context
+
 def HandlePopupUpdate( job_status: ClientThreading.JobStatus, request: HydrusServerRequest.HydrusRequest ):
 
     def HandleGenericVariable( name: str, type: type ):
@@ -4619,9 +4652,9 @@ def HandlePopupUpdate( job_status: ClientThreading.JobStatus, request: HydrusSer
 
     if 'status_title' in request.parsed_request_args:
 
         status_title = request.parsed_request_args.GetValueOrNone( 'status_title', str )
 
         if status_title is not None:
 
             job_status.SetStatusTitle( status_title )
 
@@ -4635,7 +4668,7 @@ def HandlePopupUpdate( job_status: ClientThreading.JobStatus, request: HydrusSer
 
     if 'status_text_1' in request.parsed_request_args:
 
         status_text = request.parsed_request_args.GetValueOrNone( 'status_text_1', str )
 
         if status_text is not None:
 
             job_status.SetStatusText( status_text, 1 )
 
@@ -4647,7 +4680,7 @@ def HandlePopupUpdate( job_status: ClientThreading.JobStatus, request: HydrusSer
 
     if 'status_text_2' in request.parsed_request_args:
 
         status_text_2 = request.parsed_request_args.GetValueOrNone( 'status_text_2', str )
 
         if status_text_2 is not None:
@@ -4660,22 +4693,6 @@ def HandlePopupUpdate( job_status: ClientThreading.JobStatus, request: HydrusSer
 
-    is_cancellable = request.parsed_request_args.GetValueOrNone( 'is_cancellable', bool )
-
-    if is_cancellable is not None:
-
-        job_status.SetCancellable( is_cancellable )
-
-    is_pausable = request.parsed_request_args.GetValueOrNone( 'is_pausable', bool )
-
-    if is_pausable is not None:
-
-        job_status.SetPausable( is_pausable )
-
     HandleGenericVariable( 'attached_files_mergable', bool )
 
     HandleGenericVariable( 'api_data', dict )
 
     for name in ['popup_gauge_1', 'popup_gauge_2']:
@@ -4715,28 +4732,6 @@ def HandlePopupUpdate( job_status: ClientThreading.JobStatus, request: HydrusSer
 
-class HydrusResourceClientAPIRestrictedManagePopupsAddPopup( HydrusResourceClientAPIRestrictedManagePages ):
-
-    def _threadDoPOSTJob(self, request: HydrusServerRequest.HydrusRequest ):
-
-        job_status = ClientThreading.JobStatus()
-
-        HandlePopupUpdate( job_status, request )
-
-        HG.client_controller.pub( 'message', job_status )
-
-        body_dict = {
-            'job_status': JobStatusToDict( job_status )
-        }
-
-        body = Dumps( body_dict, request.preferred_mime )
-
-        response_context = HydrusServerResources.ResponseContext( 200, mime = request.preferred_mime, body = body )
-
-        return response_context
 
 class HydrusResourceClientAPIRestrictedManagePopupsUpdatePopup( HydrusResourceClientAPIRestrictedManagePages ):
 
     def _threadDoPOSTJob(self, request: HydrusServerRequest.HydrusRequest ):
@@ -4744,7 +4739,7 @@ class HydrusResourceClientAPIRestrictedManagePopupsUpdatePopup( HydrusResourceCl
 
         job_status = GetJobStatusFromRequest( request )
 
         HandlePopupUpdate( job_status, request )
 
         body_dict = {
             'job_status': JobStatusToDict( job_status )
         }
 
@@ -4752,7 +4747,7 @@ class HydrusResourceClientAPIRestrictedManagePopupsUpdatePopup( HydrusResourceCl
 
         body = Dumps( body_dict, request.preferred_mime )
 
         response_context = HydrusServerResources.ResponseContext( 200, mime = request.preferred_mime, body = body )
 
         return response_context
@@ -969,9 +969,7 @@ class LoginProcessDomain( LoginProcess ):
 
             job_status.SetStatusText( result )
 
-            job_status.Finish()
-
-            job_status.Delete( 4 )
+            job_status.FinishAndDismiss( 4 )
 
 class LoginProcessHydrus( LoginProcess ):
@@ -1,6 +1,10 @@
+import collections
+import re
 import zipfile
 
 from hydrus.core import HydrusConstants as HC
+from hydrus.core import HydrusData
+from hydrus.core import HydrusExceptions
 
 def ExtractSingleFileFromZip( path_to_zip, filename_to_extract, extract_into_file_path ):
@@ -16,6 +20,42 @@ def ExtractSingleFileFromZip( path_to_zip, filename_to_extract, extract_into_fil
 
+def ExtractCoverPage( path_to_zip, extract_path ):
+
+    # this probably depth-first fails with a crazy multiple-nested-subdirectory structure, but we'll cross that bridge when we come to it
+    with zipfile.ZipFile( path_to_zip ) as zip_handle:
+
+        all_file_paths = [ zip_info.filename for zip_info in zip_handle.infolist() if not zip_info.is_dir() ]
+
+        HydrusData.HumanTextSort( all_file_paths )
+
+        for path in all_file_paths:
+
+            if '.' in path:
+
+                ext_with_dot = '.' + path.split( '.' )[-1]
+
+                if ext_with_dot in HC.IMAGE_FILE_EXTS:
+
+                    # this is the cover page
+
+                    with zip_handle.open( path ) as reader:
+
+                        with open( extract_path, 'wb' ) as writer:
+
+                            writer.write( reader.read() )
+
+                    return
+
+    raise HydrusExceptions.DamagedOrUnusualFileException( 'Sorry, could not find an image file in there!' )
+
 def GetSingleFileFromZipBytes( path_to_zip, path_in_zip ):
 
     return GetZipAsPath( path_to_zip, path_in_zip = path_in_zip ).read_bytes()
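Reviewer note: ExtractCoverPage above boils down to 'human-sort the archive listing, then copy out the first file with an image extension'. The selection step in isolation, with plain `sorted` standing in for HydrusData.HumanTextSort and a small stand-in set for HC.IMAGE_FILE_EXTS:

```python
IMAGE_FILE_EXTS = { '.jpg', '.jpeg', '.png', '.gif' }  # stand-in for HC.IMAGE_FILE_EXTS

def pick_cover_page( file_paths ):
    
    # sort so 'page01.jpg' beats 'page02.jpg', then return the first
    # path whose extension looks like an image
    for path in sorted( file_paths ):
        
        if '.' in path:
            
            ext_with_dot = '.' + path.split( '.' )[-1]
            
            if ext_with_dot.lower() in IMAGE_FILE_EXTS:
                
                return path
            
        
    
    return None

print( pick_cover_page( [ 'metadata.txt', 'page02.jpg', 'page01.jpg' ] ) )  # page01.jpg
```

The real code human-sorts so that 'page2' correctly precedes 'page10'; plain lexicographic sort only matches that behaviour for zero-padded names.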
@@ -26,6 +66,118 @@ def GetZipAsPath( path_to_zip, path_in_zip="" ):
 
     return zipfile.Path( path_to_zip, at=path_in_zip )
 
+def ZipLooksLikeCBZ( path_to_zip ):
+
+    # TODO: we should probably wangle this away from 'zip' and towards 'archive', but it is fine as a first step
+
+    # what does a Comic Book Archive look like? it is ad-hoc, not rigorous, so be forgiving
+    # it is a list of images
+    # they may be flat in the base, or they may be in one or more subfolders
+    # they may be accomanied by extra metadata files like: md5sum, .sfv/.SFV, .nfo, comicbook.xml, metadata.txt
+    # nothing else
+
+    directories_to_image_filenames = collections.defaultdict( set )
+
+    num_directories = 1
+    num_weird_files = 0
+    num_images = 0
+    num_weird_files_allowed_per_directory = 5
+    num_images_needed_per_directory = 1
+    ok_weird_filenames = { 'md5sum', 'comicbook.xml', 'metadata.txt' }
+
+    with zipfile.ZipFile( path_to_zip ) as zip_handle:
+
+        for zip_info in zip_handle.infolist():
+
+            if zip_info.is_dir():
+
+                num_directories += 1
+
+                continue
+
+            filename = zip_info.filename
+
+            if '/' in filename:
+
+                directory_path = '/'.join( filename.split( '/' )[:-1] )
+                filename = filename.split( '/' )[-1]
+
+            else:
+
+                directory_path = ''
+
+            filename = filename.lower()
+
+            if filename in ok_weird_filenames:
+
+                continue
+
+            if '.' in filename:
+
+                ext_with_dot = '.' + filename.split( '.' )[-1]
+
+                if ext_with_dot in HC.IMAGE_FILE_EXTS:
+
+                    num_images += 1
+
+                    directories_to_image_filenames[ directory_path ].add( filename )
+
+                    continue
+
+            else:
+
+                num_weird_files += 1
+
+    if len( directories_to_image_filenames ) > 0:
+
+        directories_to_looks_good_scores = {}
+
+        for ( directory_path, filenames ) in directories_to_image_filenames.items():
+
+            # ok, so a zip that has fifteen different filename styles is not a cbz
+            # one that is all "Coolguy Adventures-c4-p001.jpg" however is!
+
+            # so let's take all the numbers and figure out how commonly the filenames are templated
+
+            unique_numberless_filenames = { re.sub( r'\d', '', filename ) for filename in filenames }
+
+            magical_uniqueness_percentage = len( unique_numberless_filenames ) / len( filenames )
+
+            directories_to_looks_good_scores[ directory_path ] = magical_uniqueness_percentage
+
+        all_percentages = list( directories_to_looks_good_scores.values() )
+
+        average_directory_good = sum( all_percentages ) / len( all_percentages )
+
+        # experimentally, I haven't seen it go above 0.138 on a legit cbz
+        if average_directory_good > 0.2:
+
+            return False
+
+    if num_weird_files * num_directories > num_weird_files_allowed_per_directory:
+
+        return False
+
+    if num_images * num_directories < num_images_needed_per_directory:
+
+        return False
+
+    return True
+
 def MimeFromOpenDocument( path ):
 
     try:
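Reviewer note: the digit-stripping heuristic in ZipLooksLikeCBZ scores each directory by how many distinct filename 'shapes' remain once digits are removed; a score near zero means one page-numbering template, while a grab-bag zip scores near 1.0 and is rejected as a cbz. The scoring step on its own:

```python
import re

def template_uniqueness( filenames ):
    
    # 'page001.jpg' and 'page002.jpg' both collapse to 'page.jpg',
    # so templated sets collapse to very few unique shapes
    unique_numberless_filenames = { re.sub( r'\d', '', filename ) for filename in filenames }
    
    return len( unique_numberless_filenames ) / len( filenames )

pages = [ 'Coolguy Adventures-c4-p{:03}.jpg'.format( i ) for i in range( 1, 21 ) ]

print( template_uniqueness( pages ) )  # 0.05, comfortably under the 0.2 reject threshold
```

Twenty filenames from unrelated templates would instead score at or near 1.0, which is what pushes a miscellaneous zip over the 0.2 threshold.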
@@ -94,6 +94,8 @@ if USERPATH_DB_DIR == desired_userpath_db_dir:
 
     USERPATH_DB_DIR = None
 
+WE_SWITCHED_TO_USERPATH = False
+
 LICENSE_PATH = os.path.join( BASE_DIR, 'license.txt' )
 
 #
@@ -103,8 +105,8 @@ options = {}
 
 # Misc
 
 NETWORK_VERSION = 20
-SOFTWARE_VERSION = 553
-CLIENT_API_VERSION = 56
+SOFTWARE_VERSION = 554
+CLIENT_API_VERSION = 57
 
 SERVER_THUMBNAIL_DIMENSIONS = ( 200, 200 )
@@ -123,7 +125,7 @@ noneable_str = typing.Optional[ str ]
 
 BANDWIDTH_TYPE_DATA = 0
 BANDWIDTH_TYPE_REQUESTS = 1
 
 bandwidth_type_string_lookup = {
     BANDWIDTH_TYPE_DATA : 'data',
     BANDWIDTH_TYPE_REQUESTS : 'requests'
@@ -736,6 +738,8 @@ APPLICATION_PROCREATE = 69
 IMAGE_QOI = 70
 APPLICATION_EPUB = 71
 APPLICATION_DJVU = 72
+APPLICATION_CBZ = 73
+ANIMATION_UGOIRA = 74
 APPLICATION_OCTET_STREAM = 100
 APPLICATION_UNKNOWN = 101
@@ -767,6 +771,7 @@ SEARCHABLE_MIMES = {
 
     IMAGE_AVIF,
     IMAGE_AVIF_SEQUENCE,
     IMAGE_BMP,
+    ANIMATION_UGOIRA,
     APPLICATION_FLASH,
     VIDEO_AVI,
     VIDEO_FLV,
 
@@ -777,6 +782,7 @@ SEARCHABLE_MIMES = {
 
     VIDEO_WEBM,
     VIDEO_OGV,
     VIDEO_MPEG,
+    APPLICATION_CBZ,
     APPLICATION_CLIP,
     APPLICATION_PSD,
     APPLICATION_SAI2,
@@ -827,6 +833,15 @@ IMAGES = [
 
 ]
 
 ANIMATIONS = [
     ANIMATION_GIF,
     ANIMATION_APNG,
     IMAGE_AVIF_SEQUENCE,
     IMAGE_HEIC_SEQUENCE,
     IMAGE_HEIF_SEQUENCE,
+    ANIMATION_UGOIRA
 ]
 
+VIEWABLE_ANIMATIONS = [
+    ANIMATION_GIF,
+    ANIMATION_APNG,
+    IMAGE_AVIF_SEQUENCE,
@@ -885,6 +900,7 @@ IMAGE_PROJECT_FILES = [
 
 ]
 
 ARCHIVES = [
+    APPLICATION_CBZ,
     APPLICATION_7Z,
     APPLICATION_GZIP,
     APPLICATION_RAR,
@@ -939,7 +955,8 @@ MIMES_THAT_WE_CAN_CHECK_FOR_TRANSPARENCY = {
 
     IMAGE_AVIF,
     IMAGE_HEIF,
     IMAGE_HEIC,
-    ANIMATION_GIF
+    ANIMATION_GIF,
+    ANIMATION_APNG
 }
 
 MIMES_THAT_MAY_THEORETICALLY_HAVE_TRANSPARENCY = MIMES_THAT_WE_CAN_CHECK_FOR_TRANSPARENCY.union( {
@@ -956,7 +973,7 @@ MIMES_THAT_MAY_THEORETICALLY_HAVE_TRANSPARENCY = MIMES_THAT_WE_CAN_CHECK_FOR_TRA
 
     ANIMATION_APNG
 } )
 
-APPLICATIONS_WITH_THUMBNAILS = { IMAGE_SVG, APPLICATION_PDF, APPLICATION_FLASH, APPLICATION_CLIP, APPLICATION_PROCREATE }.union( VIEWABLE_IMAGE_PROJECT_FILES )
+APPLICATIONS_WITH_THUMBNAILS = { IMAGE_SVG, APPLICATION_PDF, APPLICATION_FLASH, APPLICATION_CLIP, APPLICATION_PROCREATE }.union( VIEWABLE_IMAGE_PROJECT_FILES ).union( { APPLICATION_CBZ } )
 
 MIMES_WITH_THUMBNAILS = set( IMAGES ).union( ANIMATIONS ).union( VIDEO ).union( APPLICATIONS_WITH_THUMBNAILS )
@@ -1013,6 +1030,7 @@ mime_enum_lookup = {
 
     'image/vnd.djvu' : APPLICATION_DJVU,
     'image/vnd.djvu+multipage' : APPLICATION_DJVU,
     'image/x-djvu' : APPLICATION_DJVU,
+    'application/vnd.comicbook+zip' : APPLICATION_CBZ,
     'application/zip' : APPLICATION_ZIP,
     'application/vnd.rar' : APPLICATION_RAR,
     'application/x-7z-compressed' : APPLICATION_7Z,
@@ -1064,12 +1082,14 @@ mime_string_lookup = {
 
     IMAGE_QOI : 'qoi',
     IMAGE_ICON : 'icon',
     IMAGE_SVG : 'svg',
-    IMAGE_HEIF: 'heif',
-    IMAGE_HEIF_SEQUENCE: 'heif sequence',
-    IMAGE_HEIC: 'heic',
-    IMAGE_HEIC_SEQUENCE: 'heic sequence',
-    IMAGE_AVIF: 'avif',
-    IMAGE_AVIF_SEQUENCE: 'avif sequence',
+    IMAGE_HEIF : 'heif',
+    IMAGE_HEIF_SEQUENCE : 'heif sequence',
+    IMAGE_HEIC : 'heic',
+    IMAGE_HEIC_SEQUENCE : 'heic sequence',
+    IMAGE_AVIF : 'avif',
+    IMAGE_AVIF_SEQUENCE : 'avif sequence',
+    ANIMATION_UGOIRA : 'ugoira',
+    APPLICATION_CBZ : 'cbz',
     APPLICATION_FLASH : 'flash',
     APPLICATION_OCTET_STREAM : 'application/octet-stream',
     APPLICATION_YAML : 'yaml',
@@ -1149,8 +1169,10 @@ mime_mimetype_string_lookup = {
 
     IMAGE_HEIC_SEQUENCE: 'image/heic-sequence',
     IMAGE_AVIF: 'image/avif',
     IMAGE_AVIF_SEQUENCE: 'image/avif-sequence',
+    ANIMATION_UGOIRA : 'application/zip',
     APPLICATION_FLASH : 'application/x-shockwave-flash',
     APPLICATION_OCTET_STREAM : 'application/octet-stream',
+    APPLICATION_CBZ: 'application/vnd.comicbook+zip',
     APPLICATION_YAML : 'application/x-yaml',
     APPLICATION_JSON : 'application/json',
     APPLICATION_CBOR : 'application/cbor',
@@ -1227,6 +1249,8 @@ mime_ext_lookup = {
 
     IMAGE_HEIC_SEQUENCE: '.heics',
     IMAGE_AVIF: '.avif',
     IMAGE_AVIF_SEQUENCE: '.avifs',
+    ANIMATION_UGOIRA : '.zip',
+    APPLICATION_CBZ : '.cbz',
     APPLICATION_FLASH : '.swf',
     APPLICATION_OCTET_STREAM : '.bin',
     APPLICATION_YAML : '.yaml',
@@ -1274,6 +1298,9 @@ mime_ext_lookup = {
 
     APPLICATION_UNKNOWN : ''
 }
 
+IMAGE_FILE_EXTS = { mime_ext_lookup[ mime ] for mime in IMAGES }
+IMAGE_FILE_EXTS.update( ( '.jpe', '.jpeg' ) )
+
 ALLOWED_MIME_EXTENSIONS = [ mime_ext_lookup[ mime ] for mime in ALLOWED_MIMES ]
 
 SITE_TYPE_DEVIANT_ART = 0
@ -2,9 +2,8 @@ import hashlib

import os

from hydrus.core import HydrusAnimationHandling
from hydrus.core import HydrusPSDHandling
from hydrus.core import HydrusClipHandling
from hydrus.core import HydrusArchiveHandling
from hydrus.core import HydrusClipHandling
from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusData
from hydrus.core import HydrusDocumentHandling

@ -12,12 +11,14 @@ from hydrus.core import HydrusExceptions

from hydrus.core import HydrusFlashHandling
from hydrus.core import HydrusKritaHandling
from hydrus.core import HydrusProcreateHandling
from hydrus.core import HydrusPSDHandling
from hydrus.core import HydrusPaths
from hydrus.core import HydrusSerialisable
from hydrus.core import HydrusSVGHandling
from hydrus.core import HydrusPDFHandling
from hydrus.core import HydrusTemp
from hydrus.core import HydrusText
from hydrus.core import HydrusUgoiraHandling
from hydrus.core import HydrusVideoHandling
from hydrus.core.images import HydrusImageHandling
from hydrus.core.networking import HydrusNetwork

@ -51,36 +52,27 @@ def GenerateThumbnailBytes( path, target_resolution, mime, duration, num_frames,

def GenerateThumbnailNumPy( path, target_resolution, mime, duration, num_frames, percentage_in = 35 ):

if mime == HC.APPLICATION_PSD:
if mime == HC.APPLICATION_CBZ:

( os_file_handle, temp_path ) = HydrusTemp.GetTempPath()

try:

thumbnail_numpy = HydrusPSDHandling.GenerateThumbnailNumPyFromPSDPath( path, target_resolution )
HydrusArchiveHandling.ExtractCoverPage( path, temp_path )

except Exception as e:

cover_mime = GetMime( temp_path )

HydrusData.Print( 'Problem generating thumbnail for "{}":'.format( path ) )
HydrusData.PrintException( e )
HydrusData.Print( 'Attempting ffmpeg PSD thumbnail fallback' )
thumbnail_numpy = HydrusImageHandling.GenerateThumbnailNumPyFromStaticImagePath( temp_path, target_resolution, cover_mime )

( os_file_handle, temp_path ) = HydrusTemp.GetTempPath( suffix = '.png' )
except:

try:

HydrusVideoHandling.RenderImageToImagePath( path, temp_path )

thumbnail_numpy = HydrusImageHandling.GenerateThumbnailNumPyFromStaticImagePath( temp_path, target_resolution, HC.IMAGE_PNG )

except Exception as e:

thumb_path = os.path.join( HC.STATIC_DIR, 'psd.png' )

thumbnail_numpy = HydrusImageHandling.GenerateThumbnailNumPyFromStaticImagePath( thumb_path, target_resolution, HC.IMAGE_PNG )

finally:

HydrusTemp.CleanUpTempPath( os_file_handle, temp_path )

thumb_path = os.path.join( HC.STATIC_DIR, 'zip.png' )

thumbnail_numpy = HydrusImageHandling.GenerateThumbnailNumPyFromStaticImagePath( thumb_path, target_resolution, HC.IMAGE_PNG )

finally:

HydrusTemp.CleanUpTempPath( os_file_handle, temp_path )

elif mime == HC.APPLICATION_CLIP:

@ -144,6 +136,38 @@ def GenerateThumbnailNumPy( path, target_resolution, mime, duration, num_frames,

HydrusTemp.CleanUpTempPath( os_file_handle, temp_path )

elif mime == HC.APPLICATION_PSD:

try:

thumbnail_numpy = HydrusPSDHandling.GenerateThumbnailNumPyFromPSDPath( path, target_resolution )

except Exception as e:

HydrusData.Print( 'Problem generating thumbnail for "{}":'.format( path ) )
HydrusData.PrintException( e )
HydrusData.Print( 'Attempting ffmpeg PSD thumbnail fallback' )

( os_file_handle, temp_path ) = HydrusTemp.GetTempPath( suffix = '.png' )

try:

HydrusVideoHandling.RenderImageToImagePath( path, temp_path )

thumbnail_numpy = HydrusImageHandling.GenerateThumbnailNumPyFromStaticImagePath( temp_path, target_resolution, HC.IMAGE_PNG )

except Exception as e:

thumb_path = os.path.join( HC.STATIC_DIR, 'psd.png' )

thumbnail_numpy = HydrusImageHandling.GenerateThumbnailNumPyFromStaticImagePath( thumb_path, target_resolution, HC.IMAGE_PNG )

finally:

HydrusTemp.CleanUpTempPath( os_file_handle, temp_path )

elif mime == HC.IMAGE_SVG:

try:

@ -203,7 +227,7 @@ def GenerateThumbnailNumPy( path, target_resolution, mime, duration, num_frames,

HydrusTemp.CleanUpTempPath( os_file_handle, temp_path )

elif mime in HC.IMAGES or mime == HC.ANIMATION_GIF: # not apng atm
elif mime in HC.IMAGES:

# TODO: it would be nice to have gif and apng generating their thumb x frames in, like with videos. maybe we should add animation thumb fetcher to hydrusanimationhandling

@ -221,27 +245,52 @@ def GenerateThumbnailNumPy( path, target_resolution, mime, duration, num_frames,

thumbnail_numpy = HydrusImageHandling.GenerateThumbnailNumPyFromStaticImagePath( thumb_path, target_resolution, HC.IMAGE_PNG )

else:
elif mime == HC.ANIMATION_UGOIRA:

renderer = None

desired_thumb_frame = int( ( percentage_in / 100.0 ) * num_frames )
( os_file_handle, temp_path ) = HydrusTemp.GetTempPath()

try:

renderer = HydrusVideoHandling.VideoRendererFFMPEG( path, mime, duration, num_frames, target_resolution, start_pos = desired_thumb_frame )
desired_thumb_frame_index = int( ( percentage_in / 100.0 ) * ( num_frames - 1 ) )

HydrusUgoiraHandling.ExtractFrame( path, desired_thumb_frame_index, temp_path )

cover_mime = GetMime( temp_path )

thumbnail_numpy = HydrusImageHandling.GenerateThumbnailNumPyFromStaticImagePath( temp_path, target_resolution, cover_mime )

except:

thumb_path = os.path.join( HC.STATIC_DIR, 'zip.png' )

thumbnail_numpy = HydrusImageHandling.GenerateThumbnailNumPyFromStaticImagePath( thumb_path, target_resolution, HC.IMAGE_PNG )

finally:

HydrusTemp.CleanUpTempPath( os_file_handle, temp_path )

else: # animations and video

renderer = None

desired_thumb_frame_index = int( ( percentage_in / 100.0 ) * ( num_frames - 1 ) )

try:

renderer = HydrusVideoHandling.VideoRendererFFMPEG( path, mime, duration, num_frames, target_resolution, start_pos = desired_thumb_frame_index )

numpy_image = renderer.read_frame()

except Exception as e:

HydrusData.Print( 'Problem generating thumbnail for "{}" at frame {} ({})--FFMPEG could not render it.'.format( path, desired_thumb_frame, HydrusData.ConvertFloatToPercentage( percentage_in / 100.0 ) ) )
HydrusData.Print( 'Problem generating thumbnail for "{}" at frame {} ({})--FFMPEG could not render it.'.format( path, desired_thumb_frame_index, HydrusData.ConvertFloatToPercentage( percentage_in / 100.0 ) ) )
HydrusData.PrintException( e )

numpy_image = None

if numpy_image is None and desired_thumb_frame != 0:
if numpy_image is None and desired_thumb_frame_index != 0:

if renderer is not None:

@ -358,7 +407,28 @@ def GetFileInfo( path, mime = None, ok_to_look_for_hydrus_updates = False ):

# keep this in the specific-first, general-last test order
if mime == HC.APPLICATION_CLIP:
if mime == HC.APPLICATION_CBZ:

( os_file_handle, temp_path ) = HydrusTemp.GetTempPath()

try:

HydrusArchiveHandling.ExtractCoverPage( path, temp_path )

cover_mime = GetMime( temp_path )

( width, height ) = HydrusImageHandling.GetImageResolution( temp_path, cover_mime )

except:

( width, height ) = ( 100, 100 )

finally:

HydrusTemp.CleanUpTempPath( os_file_handle, temp_path )

elif mime == HC.APPLICATION_CLIP:

( ( width, height ), duration, num_frames ) = HydrusClipHandling.GetClipProperties( path )

@ -433,10 +503,14 @@ def GetFileInfo( path, mime = None, ok_to_look_for_hydrus_updates = False ):

( ( width, height ), duration, num_frames, has_audio ) = HydrusVideoHandling.GetFFMPEGVideoProperties( path )

elif mime in HC.ANIMATIONS:
elif mime in HC.VIEWABLE_ANIMATIONS:

( ( width, height ), duration, num_frames ) = HydrusAnimationHandling.GetAnimationProperties( path, mime )

elif mime == HC.ANIMATION_UGOIRA:

( ( width, height ), num_frames ) = HydrusUgoiraHandling.GetUgoiraProperties( path )

elif mime in HC.IMAGES:

( width, height ) = HydrusImageHandling.GetImageResolution( path, mime )

@ -630,6 +704,16 @@ def GetMime( path, ok_to_look_for_hydrus_updates = False ):

return HC.APPLICATION_PROCREATE

if HydrusUgoiraHandling.ZipLooksLikeUgoira( path ):

return HC.ANIMATION_UGOIRA

if HydrusArchiveHandling.ZipLooksLikeCBZ( path ):

return HC.APPLICATION_CBZ

return HC.APPLICATION_ZIP

if mime in ( HC.UNDETERMINED_WM, HC.UNDETERMINED_MP4 ):
@ -21,7 +21,7 @@ def MergedPILImageFromKra( path ):

except FileNotFoundError:

raise HydrusExceptions.UnsupportedFileException( f'Could not read {KRITA_FILE_MERGED} from this Krita file' )
raise HydrusExceptions.DamagedOrUnusualFileException( f'Could not read {KRITA_FILE_MERGED} from this Krita file' )
@ -23,7 +23,7 @@ def MergedPILImageFromPSD( path: str ) -> PILImage:

if not psd.has_preview():

raise HydrusExceptions.UnsupportedFileException('PSD file has no embedded preview!')
raise HydrusExceptions.DamagedOrUnusualFileException('PSD file has no embedded preview!')

pil_image = convert_image_data_to_pil( psd )
@ -17,6 +17,7 @@ from hydrus.core import HydrusConstants as HC

from hydrus.core import HydrusData
from hydrus.core import HydrusGlobals as HG
from hydrus.core import HydrusThreading
from hydrus.core import HydrusTime

mimes_to_default_thumbnail_paths = collections.defaultdict( lambda: os.path.join( HC.STATIC_DIR, 'hydrus.png' ) )

@ -32,23 +33,23 @@ mimes_to_default_thumbnail_paths[ HC.IMAGE_SVG ] = os.path.join( HC.STATIC_DIR,

for mime in HC.AUDIO:

path = os.path.join( HC.STATIC_DIR, 'audio.png' )
png_path = os.path.join( HC.STATIC_DIR, 'audio.png' )

mimes_to_default_thumbnail_paths[ mime ] = os.path.join( path )
mimes_to_default_thumbnail_paths[ mime ] = os.path.join( png_path )

for mime in HC.VIDEO:

path = os.path.join( HC.STATIC_DIR, 'video.png' )
png_path = os.path.join( HC.STATIC_DIR, 'video.png' )

mimes_to_default_thumbnail_paths[ mime ] = os.path.join( path )
mimes_to_default_thumbnail_paths[ mime ] = os.path.join( png_path )

for mime in HC.ARCHIVES:

path = os.path.join( HC.STATIC_DIR, 'zip.png' )
png_path = os.path.join( HC.STATIC_DIR, 'zip.png' )

mimes_to_default_thumbnail_paths[ mime ] = os.path.join( path )
mimes_to_default_thumbnail_paths[ mime ] = os.path.join( png_path )

def AppendPathUntilNoConflicts( path ):

@ -179,18 +180,16 @@ def DeletePath( path ) -> bool:

return True

def DirectoryIsWriteable( path ):

while not os.path.exists( path ):
try:

try:

path = os.path.dirname( path )

except:

return False

MakeSureDirectoryExists( path )

except:

raise Exception( f'While trying to determine if a directory, "{path}", has write permissions, I could not even ensure that it exists!' )

if not os.access( path, os.W_OK | os.X_OK ):

@ -198,7 +197,7 @@ def DirectoryIsWriteable( path ):

return False

# we'll actually do a file, since Program Files passes the above test lmaoooo
# we'll actually do a file, since Windows Program Files passes the above test lmaoooo

try:

@ -221,10 +220,59 @@ def DirectoryIsWriteable( path ):

return True

def ElideFilenameOrDirectorySafely( name: str, num_characters_used_in_other_components: typing.Optional[ int ] = None, num_characters_already_used_in_this_component: typing.Optional[ int ] = None ):

# most OSes cannot handle a filename or dirname with more than 255 characters
# Windows cannot handle a _total_ pathname more than 260
# to be safe and deal with surprise extensions like (11) or .txt sidecars, we use 240
# moreover, unicode paths are encoded to bytes, so we have to count differently

MAX_PATH_LENGTH = 240

num_characters_available = MAX_PATH_LENGTH

if num_characters_used_in_other_components is not None:

if HC.PLATFORM_WINDOWS:

num_characters_available -= num_characters_used_in_other_components

if num_characters_available <= 0:

raise Exception( 'Sorry, it looks like the combined export filename or directory would be too long! Try shortening the export directory name!' )

if num_characters_already_used_in_this_component is not None:

num_characters_available -= num_characters_already_used_in_this_component

if num_characters_available <= 0:

raise Exception( 'Sorry, it looks like the export filename would be too long! Try shortening the export phrase or directory!' )

while len( name.encode( 'utf-8' ) ) > num_characters_available:

name = name[:-1]

if name == '':

raise Exception( 'Sorry, it looks like the export filename would be too long! Try shortening the export phrase or directory!' )

return name
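The `ElideFilenameOrDirectorySafely` hunk above trims the name by UTF-8 byte length rather than character count, since multi-byte characters can overrun a byte-based filename limit even when the character count looks safe. A minimal standalone sketch of that trimming loop (the function name and byte budget here are illustrative, not hydrus API):

```python
def elide_to_utf8_budget( name: str, max_bytes: int ) -> str:
    
    # drop characters off the end until the UTF-8 encoding fits the byte budget
    while len( name.encode( 'utf-8' ) ) > max_bytes:
        
        name = name[:-1]
        
    
    if name == '':
        
        raise ValueError( 'Nothing left of the name after eliding!' )
        
    
    return name
```

Note that a two-byte character such as 'é' consumes two units of the budget, so `elide_to_utf8_budget( 'ééé', 4 )` keeps only two characters.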
def FileisWriteable( path: str ):

return os.access( path, os.W_OK )

def FilterFreePaths( paths ):

free_paths = []

@ -241,6 +289,31 @@ def FilterFreePaths( paths ):

return free_paths

def FilterOlderModifiedFiles( paths: typing.Collection[ str ], grace_period: int ) -> typing.List[ str ]:

only_older_than = HydrusTime.GetNow() - grace_period

good_paths = []

for path in paths:

try:

if os.path.getmtime( path ) < only_older_than:

good_paths.append( path )

except:

continue

return good_paths

def GetDefaultLaunchPath():

if HC.PLATFORM_WINDOWS:

@ -463,11 +536,10 @@ def LaunchFile( path, launch_path = None ):

thread.start()

def MakeSureDirectoryExists( path ):

it_exists_already = os.path.exists( path )

if it_exists_already:
if os.path.exists( path ):

if os.path.isdir( path ):

@ -475,11 +547,13 @@ def MakeSureDirectoryExists( path ):

else:

raise Exception( 'Sorry, the desired directory "{}" already exists as a normal file!'.format( path ) )
raise Exception( f'Sorry, the directory "{path}" already exists as a normal file!' )

os.makedirs( path, exist_ok = True )
else:

os.makedirs( path, exist_ok = True )

def FileModifiedTimeIsOk( mtime: int ):

@ -860,30 +934,46 @@ def PathsHaveSameSizeAndDate( path1, path2 ):

def PathIsFree( path ):

if not os.path.exists( path ):

return False

try:

stat_result = os.stat( path )

current_bits = stat_result.st_mode

if not current_bits & stat.S_IWRITE:
if current_bits & stat.S_IWRITE:

# read-only file, cannot do the rename check
os.rename( path, path ) # rename a path to itself

return True

os.rename( path, path ) # rename a path to itself

return True

except OSError as e: # 'already in use by another process' or an odd filename too long error

HydrusData.Print( 'Already in use/inaccessible: ' + path )

return False

return False
try:

with open( path, 'rb' ) as f:

return True

except:

HydrusData.Print( 'Could not open the file: ' + path )

return False

def ReadFileLikeAsBlocks( f ):

next_block = f.read( HC.READ_BLOCK_SIZE )
@ -141,6 +141,30 @@ def SubprocessCommunicate( process: subprocess.Popen ):

class RegularJobChecker( object ):

def __init__( self, period = 10 ):

self._period = period

self._next_check = HydrusTime.GetNowFloat()

def Due( self ) -> bool:

if HydrusTime.TimeHasPassedFloat( self._next_check ):

self._next_check = HydrusTime.GetNowFloat() + self._period

return True

else:

return False

class BigJobPauser( object ):

def __init__( self, period = 10, wait_time = 0.1 ):

@ -148,16 +172,16 @@ class BigJobPauser( object ):

self._period = period
self._wait_time = wait_time

self._next_pause = HydrusTime.GetNow() + self._period
self._next_pause = HydrusTime.GetNowFloat() + self._period

def Pause( self ):

if HydrusTime.TimeHasPassed( self._next_pause ):
if HydrusTime.TimeHasPassedFloat( self._next_pause ):

time.sleep( self._wait_time )

self._next_pause = HydrusTime.GetNow() + self._period
self._next_pause = HydrusTime.GetNowFloat() + self._period
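The new `RegularJobChecker` above is an "at most once per period" throttle: `Due()` returns True and pushes the next-check time forward, otherwise False. A self-contained sketch of the same pattern using only the standard library (swapping the `HydrusTime` helpers for `time.monotonic`, which is an assumption about their behaviour):

```python
import time

class RegularJobChecker( object ):
    
    def __init__( self, period = 10 ):
        
        self._period = period
        
        # due immediately on construction, then at most once per period
        self._next_check = time.monotonic()
        
    
    def Due( self ) -> bool:
        
        now = time.monotonic()
        
        if now >= self._next_check:
            
            self._next_check = now + self._period
            
            return True
            
        
        return False
```

In a long worker loop, `if checker.Due(): do_housekeeping()` then runs the housekeeping at most once per period no matter how fast the loop spins.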
@ -0,0 +1,164 @@

import zipfile

from hydrus.core import HydrusArchiveHandling
from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusExceptions
from hydrus.core import HydrusTemp
from hydrus.core.images import HydrusImageHandling

def ExtractFrame( path_to_zip, frame_index, extract_path ):

# this is too ugly to use for an animation thing, but it'll work for fetching a thumb fine

with zipfile.ZipFile( path_to_zip ) as zip_handle:

all_file_paths = [ zip_info.filename for zip_info in zip_handle.infolist() if not zip_info.is_dir() ]

if len( all_file_paths ) == 0:

raise HydrusExceptions.DamagedOrUnusualFileException( 'This Ugoira seems to be empty! It has probably been corrupted!' )

all_file_paths.sort()

frame_index = min( frame_index, len( all_file_paths ) - 1 )

frame_path = all_file_paths[ frame_index ]

with zip_handle.open( frame_path ) as reader:

with open( extract_path, 'wb' ) as writer:

writer.write( reader.read() )

def GetUgoiraProperties( path_to_zip ):

( os_file_handle, temp_path ) = HydrusTemp.GetTempPath()

try:

try:

HydrusArchiveHandling.ExtractCoverPage( path_to_zip, temp_path )

pil_image = HydrusImageHandling.GeneratePILImage( temp_path, dequantize = False )

( width, height ) = pil_image.size

except:

( width, height ) = ( 100, 100 )

try:

with zipfile.ZipFile( path_to_zip ) as zip_handle:

num_frames = len( zip_handle.infolist() )

except:

num_frames = None

finally:

HydrusTemp.CleanUpTempPath( os_file_handle, temp_path )

return ( ( width, height ), num_frames )

def ZipLooksLikeUgoira( path_to_zip ):

# what does an Ugoira look like? it has a standard, but this is not always followed, so be somewhat forgiving
# it is a list of images named in the format 000123.jpg. this is very typically 6-figure, starting at 000000, but it may be shorter and start at 0001
# no directories
# we can forgive a .json or .js file, nothing else

our_image_ext = None

with zipfile.ZipFile( path_to_zip ) as zip_handle:

zip_infos = zip_handle.infolist()

if True in ( zip_info.is_dir() for zip_info in zip_infos ):

return False

image_number_strings = []

filenames = [ zip_info.filename for zip_info in zip_infos ]

for filename in filenames:

if '.' not in filename:

return False

number = '.'.join( filename.split( '.' )[:-1] )
ext = '.' + filename.split( '.' )[-1]

if ext in ( '.js', '.json' ):

continue

if ext not in HC.IMAGE_FILE_EXTS:

return False

if our_image_ext is None:

our_image_ext = ext

if ext != our_image_ext:

return False

image_number_strings.append( number )

if len( image_number_strings ) == 0:

return False

image_number_strings.sort()

try:

current_image_number = int( image_number_strings[0] )

except:

return False

number_of_digits = len( image_number_strings[0] )

for image_number_string in image_number_strings:

string_we_expect = str( current_image_number ).zfill( number_of_digits )

if image_number_string != string_we_expect:

return False

current_image_number += 1

return True
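`ZipLooksLikeUgoira` above accepts a zip whose image entries are zero-padded, consecutively numbered frames (e.g. `000000.jpg`, `000001.jpg`), with `.js`/`.json` extras forgiven and everything else rejected. The core numbering rule can be sketched on its own like this (a simplified, hypothetical helper, not the hydrus function):

```python
def frame_names_look_sequential( names ):
    
    # strip extensions, sort, and check the stems count up with a fixed zero-pad width
    stems = sorted( name.rsplit( '.', 1 )[0] for name in names )
    
    if len( stems ) == 0:
        
        return False
        
    
    try:
        
        start = int( stems[0] )
        
    except ValueError:
        
        return False
        
    
    width = len( stems[0] )
    
    return all( stem == str( start + i ).zfill( width ) for ( i, stem ) in enumerate( stems ) )
```

This accepts both the typical six-digit `000000`-based runs and shorter runs starting at `0001`, but rejects any gap in the sequence.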
@ -996,7 +996,7 @@ def VideoHasAudio( path, info_lines ) -> bool:

# This was built from moviepy's FFMPEG_VideoReader
class VideoRendererFFMPEG( object ):

def __init__( self, path, mime, duration, num_frames, target_resolution, pix_fmt = "rgb24", clip_rect = None, start_pos = None ):
def __init__( self, path, mime, duration, num_frames, target_resolution, clip_rect = None, start_pos = None ):

self._path = path
self._mime = mime

@ -1005,6 +1005,15 @@ class VideoRendererFFMPEG( object ):

self._target_resolution = target_resolution
self._clip_rect = clip_rect

if self._mime in HC.MIMES_THAT_WE_CAN_CHECK_FOR_TRANSPARENCY:

self.pix_fmt = 'rgba'

else:

self.pix_fmt = 'rgb24'

self.lastread = None

self.fps = self._num_frames / self._duration

@ -1014,10 +1023,14 @@ class VideoRendererFFMPEG( object ):

self.fps = 24

self.pix_fmt = pix_fmt

if pix_fmt == 'rgba': self.depth = 4
else: self.depth = 3
if self.pix_fmt == 'rgba':

self.depth = 4

else:

self.depth = 3

( x, y ) = self._target_resolution

@ -1170,7 +1183,7 @@ class VideoRendererFFMPEG( object ):

s = self.process.stdout.read( nbytes )

if len(s) != nbytes:
if len( s ) != nbytes:

if self.lastread is None:

@ -1188,7 +1201,7 @@ class VideoRendererFFMPEG( object ):

return self.read_frame()

raise Exception( 'Unable to render that video! Please send it to hydrus dev so he can look at it!' )
raise HydrusExceptions.DamagedOrUnusualFileException( 'Unable to render that video! Please send it to hydrus dev so he can look at it!' )

result = self.lastread

@ -1197,7 +1210,7 @@ class VideoRendererFFMPEG( object ):

else:

result = numpy.fromstring( s, dtype = 'uint8' ).reshape( ( h, w, len( s ) // ( w * h ) ) )
result = numpy.fromstring( s, dtype = 'uint8' ).reshape( ( h, w, self.depth ) )

self.lastread = result
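In the `VideoRendererFFMPEG` hunks above, the pixel format is now chosen from the mime ('rgba' for transparency-capable formats, else 'rgb24'), and the raw frame buffer is reshaped using `self.depth` rather than inferring depth from the byte count. The per-frame byte budget follows directly: width × height × bytes-per-pixel, with 4 bytes for 'rgba' and 3 for 'rgb24'. A quick illustrative calculation (hypothetical helper, not hydrus API):

```python
def frame_byte_count( width, height, pix_fmt ):
    
    # rgba carries an alpha channel, so four bytes per pixel; rgb24 is three
    depth = 4 if pix_fmt == 'rgba' else 3
    
    return width * height * depth
```

So a 1080p 'rgb24' frame reads about 6.2 MB from the ffmpeg pipe, and switching to 'rgba' raises that by a third.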
@ -258,8 +258,15 @@ def GenerateNumPyImage( path, mime, force_pil = False ) -> numpy.array:

def GenerateNumPyImageFromPILImage( pil_image: PILImage.Image, strip_useless_alpha = True ) -> numpy.array:

# this seems to magically work, I guess asarray either has a match for Image or Image provides some common shape/datatype properties that it can hook into
numpy_image = numpy.asarray( pil_image )
try:

# this seems to magically work, I guess asarray either has a match for Image or Image provides some common shape/datatype properties that it can hook into
numpy_image = numpy.asarray( pil_image )

except IOError:

raise HydrusExceptions.DamagedOrUnusualFileException( 'Looks like a truncated file that PIL could not handle!' )

if strip_useless_alpha:

@ -268,43 +275,27 @@ def GenerateNumPyImageFromPILImage( pil_image: PILImage.Image, strip_useless_alp

return numpy_image

# old method:
'''
( w, h ) = pil_image.size

try:

s = pil_image.tobytes()

except OSError as e: # e.g. OSError: unrecognized data stream contents when reading image file

raise HydrusExceptions.UnsupportedFileException( str( e ) )

depth = len( s ) // ( w * h )

return numpy.fromstring( s, dtype = 'uint8' ).reshape( ( h, w, depth ) )
'''

def GeneratePILImage( path: typing.Union[ str, typing.BinaryIO ], dequantize = True ) -> PILImage.Image:

pil_image = HydrusImageOpening.RawOpenPILImage( path )

if pil_image is None:
try:

raise Exception( 'The file at {} could not be rendered!'.format( path ) )
pil_image = HydrusImageNormalisation.RotateEXIFPILImage( pil_image )

pil_image = HydrusImageNormalisation.RotateEXIFPILImage( pil_image )

if dequantize:
if dequantize:

# note this destroys animated gifs atm, it collapses down to one frame
pil_image = HydrusImageNormalisation.DequantizePILImage( pil_image )

# note this destroys animated gifs atm, it collapses down to one frame
pil_image = HydrusImageNormalisation.DequantizePILImage( pil_image )
return pil_image

except IOError:

raise HydrusExceptions.DamagedOrUnusualFileException( 'Looks like a truncated file that PIL could not handle!' )

return pil_image

def GeneratePILImageFromNumPyImage( numpy_image: numpy.array ) -> PILImage.Image:
@ -11,7 +11,12 @@ def RawOpenPILImage( path: typing.Union[ str, typing.BinaryIO ] ) -> PILImage.Im

except Exception as e:

raise HydrusExceptions.DamagedOrUnusualFileException( 'Could not load the image--it was likely malformed!' )
raise HydrusExceptions.DamagedOrUnusualFileException( f'Could not load the image at "{path}"--it was likely malformed!' ) from e

if pil_image is None:

raise HydrusExceptions.DamagedOrUnusualFileException( f'Could not load the image at "{path}"--it was likely malformed!' )

return pil_image
@ -53,54 +53,46 @@ try:

if result.db_dir is None:

db_dir = HC.DEFAULT_DB_DIR

if not HydrusPaths.DirectoryIsWriteable( db_dir ) or HC.RUNNING_FROM_MACOS_APP:
if HC.RUNNING_FROM_MACOS_APP:

if HC.USERPATH_DB_DIR is None:

raise Exception( 'The default db path "{}" was not writeable, and the userpath could not be determined!'.format( HC.DEFAULT_DB_DIR ) )
raise Exception( 'The userpath (for macOS App database) could not be determined!' )

db_dir = HC.USERPATH_DB_DIR

else:

db_dir = HC.DEFAULT_DB_DIR

else:

db_dir = result.db_dir

db_dir = HydrusPaths.ConvertPortablePathToAbsPath( db_dir, HC.BASE_DIR )
db_dir = HydrusPaths.ConvertPortablePathToAbsPath( db_dir, HC.BASE_DIR )

if not HydrusPaths.DirectoryIsWriteable( db_dir ):

message = 'The given db path "{}" is not a writeable-to!'.format( db_dir )

db_dir = HC.USERPATH_DB_DIR

raise Exception( message )

try:

HydrusPaths.MakeSureDirectoryExists( db_dir )

except:

message = 'Could not ensure db path "{}" exists! Check the location is correct and that you have permission to write to it!'.format( db_dir )

db_dir = HC.USERPATH_DB_DIR

raise Exception( message )

if not os.path.isdir( db_dir ):

message = 'The given db path "{}" is not a directory!'.format( db_dir )

db_dir = HC.USERPATH_DB_DIR

raise Exception( message )
if HC.USERPATH_DB_DIR is None:

raise Exception( f'The db path "{db_dir}" was not writeable-to, and the userpath could not be determined!' )

else:

if not HydrusPaths.DirectoryIsWriteable( HC.USERPATH_DB_DIR ):

raise Exception( f'Neither the default db path "{db_dir}", nor the userpath fallback "{HC.USERPATH_DB_DIR}", were writeable-to!' )

HydrusData.Print( f'The given db path "{db_dir}" is not writeable-to! Falling back to userpath at "{HC.USERPATH_DB_DIR}".' )

HC.WE_SWITCHED_TO_USERPATH = True

db_dir = HC.USERPATH_DB_DIR

HG.db_journal_mode = result.db_journal_mode
@ -63,54 +63,46 @@ try:

if result.db_dir is None:

db_dir = HC.DEFAULT_DB_DIR

if not HydrusPaths.DirectoryIsWriteable( db_dir ) or HC.RUNNING_FROM_MACOS_APP:
if HC.RUNNING_FROM_MACOS_APP:

if HC.USERPATH_DB_DIR is None:

raise Exception( 'The default db path "{}" was not writeable, and the userpath could not be determined!'.format( HC.DEFAULT_DB_DIR ) )
raise Exception( 'The userpath (for macOS App database) could not be determined!' )

db_dir = HC.USERPATH_DB_DIR

else:

db_dir = HC.DEFAULT_DB_DIR

else:

db_dir = result.db_dir

db_dir = HydrusPaths.ConvertPortablePathToAbsPath( db_dir, HC.BASE_DIR )
db_dir = HydrusPaths.ConvertPortablePathToAbsPath( db_dir, HC.BASE_DIR )

if not HydrusPaths.DirectoryIsWriteable( db_dir ):

message = 'The given db path "{}" is not a writeable-to!'.format( db_dir )

db_dir = HC.USERPATH_DB_DIR

raise Exception( message )

try:

HydrusPaths.MakeSureDirectoryExists( db_dir )

except:

message = 'Could not ensure db path "{}" exists! Check the location is correct and that you have permission to write to it!'.format( db_dir )

db_dir = HC.USERPATH_DB_DIR

raise Exception( message )

if not os.path.isdir( db_dir ):

message = 'The given db path "{}" is not a directory!'.format( db_dir )

db_dir = HC.USERPATH_DB_DIR

raise Exception( message )
if HC.USERPATH_DB_DIR is None:

raise Exception( f'The db path "{db_dir}" was not writeable-to, and the userpath could not be determined!' )

else:

if not HydrusPaths.DirectoryIsWriteable( HC.USERPATH_DB_DIR ):

raise Exception( f'Neither the default db path "{db_dir}", nor the userpath fallback "{HC.USERPATH_DB_DIR}", were writeable-to!' )

HydrusData.Print( f'The given db path "{db_dir}" is not writeable-to! Falling back to userpath at "{HC.USERPATH_DB_DIR}".' )

HC.WE_SWITCHED_TO_USERPATH = True

db_dir = HC.USERPATH_DB_DIR

HG.db_journal_mode = result.db_journal_mode
```diff
@@ -4082,6 +4082,56 @@ class TestClientAPI( unittest.TestCase ):
         
         self.assertEqual( result, expected_result )
         
+    
+    def _test_options( self, connection, set_up_permissions ):
+        
+        # first fail
+        
+        api_permissions = set_up_permissions[ 'add_urls' ]
+        
+        access_key_hex = api_permissions.GetAccessKey().hex()
+        
+        headers = { 'Hydrus-Client-API-Access-Key' : access_key_hex }
+        
+        #
+        
+        path = '/manage_database/get_client_options'
+        
+        connection.request( 'GET', path, headers = headers )
+        
+        response = connection.getresponse()
+        
+        data = response.read()
+        
+        self.assertEqual( response.status, 403 )
+        
+        # now good
+        
+        api_permissions = set_up_permissions[ 'everything' ]
+        
+        access_key_hex = api_permissions.GetAccessKey().hex()
+        
+        headers = { 'Hydrus-Client-API-Access-Key' : access_key_hex }
+        
+        #
+        
+        path = '/manage_database/get_client_options'
+        
+        connection.request( 'GET', path, headers = headers )
+        
+        response = connection.getresponse()
+        
+        data = response.read()
+        
+        self.assertEqual( response.status, 200 )
+        
+        text = str( data, 'utf-8' )
+        
+        d = json.loads( text )
+        
+        self.assertIn( 'old_options', d )
+        self.assertIn( 'options', d )
+        
+    
     def _test_search_files( self, connection, set_up_permissions ):
         
         hash_ids = [ 1, 2, 3, 4, 5, 10, 15, 16, 17, 18, 19, 20, 21, 25, 100, 101, 150 ]
```
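The call pattern the new `_test_options` exercises (a GET to `/manage_database/get_client_options` carrying the `Hydrus-Client-API-Access-Key` header, expecting 403 without permission and 200 with it) can be tried against a throwaway stub server. The stub below and its hardcoded `'deadbeef'` key are illustrative only; it is not the real Client API, just a local fake that reproduces the same status codes and payload keys:

```python
import http.client
import http.server
import json
import threading

class StubOptionsHandler( http.server.BaseHTTPRequestHandler ):
    # Stand-in for the Client API: check the access key header,
    # then return a minimal get_client_options payload.
    def do_GET( self ):
        if self.headers.get( 'Hydrus-Client-API-Access-Key' ) != 'deadbeef':
            self.send_response( 403 )
            self.end_headers()
            return
        body = json.dumps( { 'old_options' : {}, 'options' : {} } ).encode( 'utf-8' )
        self.send_response( 200 )
        self.send_header( 'Content-Type', 'application/json' )
        self.send_header( 'Content-Length', str( len( body ) ) )
        self.end_headers()
        self.wfile.write( body )
    def log_message( self, format, *args ):
        pass  # keep test output quiet

def fetch_options( port, access_key_hex ):
    # Same call shape as the unit test: GET with the access key header.
    connection = http.client.HTTPConnection( '127.0.0.1', port )
    connection.request( 'GET', '/manage_database/get_client_options', headers = { 'Hydrus-Client-API-Access-Key' : access_key_hex } )
    response = connection.getresponse()
    data = response.read()
    connection.close()
    return ( response.status, data )

server = http.server.HTTPServer( ( '127.0.0.1', 0 ), StubOptionsHandler )
threading.Thread( target = server.serve_forever, daemon = True ).start()
port = server.server_address[ 1 ]

status_bad, _ = fetch_options( port, 'wrong_key' )
status_ok, data = fetch_options( port, 'deadbeef' )

server.shutdown()
```

Against a real client you would point the connection at the Client API port and use an access key granted the 'manage database' permission.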
```diff
@@ -5858,6 +5908,7 @@ class TestClientAPI( unittest.TestCase ):
         set_up_permissions = self._test_client_api_basics( connection )
         self._test_get_services( connection, set_up_permissions )
         self._test_manage_database( connection, set_up_permissions )
+        self._test_options( connection, set_up_permissions )
         self._test_add_files_add_file( connection, set_up_permissions )
         self._test_add_files_other_actions( connection, set_up_permissions )
         self._test_add_notes( connection, set_up_permissions )
```
```diff
@@ -191,7 +191,7 @@ class TestCheckerOptions( unittest.TestCase ):
         self.assertEqual( fast_checker_options.GetNextCheckTime( new_thread_file_seed_cache, last_check_time, 0 ), last_check_time + 600 )
         self.assertEqual( slow_checker_options.GetNextCheckTime( new_thread_file_seed_cache, last_check_time, 0 ), last_check_time + 600 )
         
-        # Let's test these new static timings, where if faster_than == slower_than, we just add that period to the 'previous_next_check_time' (e.g. checking every sunday night)
+        # Let's test the static timings, where if faster_than == slower_than
         
         static_checker_options = ClientImportOptions.CheckerOptions( intended_files_per_check = 5, never_faster_than = 3600, never_slower_than = 3600, death_file_velocity = ( 1, 3600 ) )
```
```diff
@@ -203,6 +203,19 @@ class TestCheckerOptions( unittest.TestCase ):
         
         self.assertEqual( static_checker_options.GetNextCheckTime( new_thread_file_seed_cache, last_check_time, previous_next_check_time ), previous_next_check_time + 3600 )
         
+        self.assertEqual( static_checker_options.GetNextCheckTime( new_thread_file_seed_cache, last_check_time, None ), last_check_time + 3600 )
+        
+        # user just changed the check period, and previous next check time is in the future
+        
+        last_check_time = HydrusTime.GetNow() - 600
+        previous_next_check_time = HydrusTime.GetNow() + 1800
+        
+        self.assertEqual( static_checker_options.GetNextCheckTime( new_thread_file_seed_cache, last_check_time, previous_next_check_time ), last_check_time + 3600 )
+        
+        last_check_time = HydrusTime.GetNow() - 100000
+        previous_next_check_time = last_check_time - 600
+        
+        self.assertEqual( static_checker_options.GetNextCheckTime( new_thread_file_seed_cache, last_check_time, previous_next_check_time ), HydrusTime.GetNow() + 3600 )
+        
 class TestFileImportOptions( unittest.TestCase ):
```
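Read together, the assertions above imply a rule along these lines. The function `static_next_check_time` is a guess at the shape of the logic, not hydrus's actual `GetNextCheckTime`; it only reproduces the four cases the tests pin down:

```python
def static_next_check_time( now, period, last_check_time, previous_next_check_time ):
    # No previous schedule, or the previous schedule is in the future
    # (e.g. the user just changed the period): work from the last check.
    if previous_next_check_time is None or previous_next_check_time > now:
        next_check_time = last_check_time + period
    else:
        # Normal case: keep the regular cadence going.
        next_check_time = previous_next_check_time + period
    # A long-dead schedule should not fire immediately; bump it from now.
    if next_check_time < now:
        next_check_time = now + period
    return next_check_time
```

This matches the changelog's description of the fix: a simple 'last check time + period' style calculation rather than clever catch-up logic.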
```diff
@@ -65,6 +65,7 @@ from hydrus.test import TestFunctions
 from hydrus.test import TestHydrusData
 from hydrus.test import TestHydrusNATPunch
 from hydrus.test import TestHydrusNetworking
+from hydrus.test import TestHydrusPaths
 from hydrus.test import TestHydrusSerialisable
 from hydrus.test import TestHydrusServer
 from hydrus.test import TestHydrusSessions
```
```diff
@@ -418,7 +419,7 @@ class Controller( object ):
         
-        job_status = ClientThreading.JobStatus()
+        job_status = ClientThreading.JobStatus( cancellable = True, cancel_on_shutdown = False )
         
         QP.CallAfter( qt_code, win, job_status )
```
```diff
@@ -800,6 +801,7 @@ class Controller( object ):
             TestClientDBDuplicates,
             TestClientDBTags,
             TestHydrusData,
+            TestHydrusPaths,
             TestHydrusTime,
             TestHydrusNATPunch,
             TestClientNetworking,
```
```diff
@@ -835,6 +837,7 @@ class Controller( object ):
             TestClientThreading,
             TestFunctions,
             TestHydrusData,
+            TestHydrusPaths,
             TestHydrusTags,
             TestHydrusTime,
             TestHydrusSerialisable,
```
```diff
@@ -3,7 +3,6 @@ import unittest
 from hydrus.core import HydrusConstants as HC
 from hydrus.core import HydrusData
 from hydrus.core import HydrusGlobals as HG
 from hydrus.core import HydrusTime
 
 from hydrus.client import ClientConstants as CC
 from hydrus.client import ClientData
```
```diff
@@ -1,13 +1,6 @@
 import unittest
 
-from hydrus.core import HydrusConstants as HC
 from hydrus.core import HydrusData
-from hydrus.core import HydrusGlobals as HG
-from hydrus.core import HydrusTime
-
-from hydrus.client import ClientConstants as CC
-from hydrus.client import ClientData
-from hydrus.client.metadata import ClientTags
 
 class TestHydrusData( unittest.TestCase ):
```
```diff
@@ -0,0 +1,60 @@
+import unittest
+
+from hydrus.core import HydrusConstants as HC
+from hydrus.core import HydrusPaths
+
+class TestHydrusPaths( unittest.TestCase ):
+    
+    def test_eliding( self ):
+        
+        name_too_long = 'a' * 245
+        name_shortened = 'a' * 240
+        
+        self.assertEqual( HydrusPaths.ElideFilenameOrDirectorySafely( name_too_long ), name_shortened )
+        
+        unicode_name_too_long_hex = '57696e646f7773e381a7e381afe380814e544653e381aee5a0b4e59088e38081e9809ae5b8b8e38081e38395e382a1e382a4e383abe5908de381aee69c80e5a4a7e995b7e381af323630e69687e5ad97e381a7e38199e380826d61634f53efbc88556e6978e38399e383bce382b9efbc89e381a7e381afe380814846532be3818ae38288e381b341504653e381aee5a0b4e59088e38081e9809ae5b8b8e38081e38395e382a1e382a4e383abe5908de381aee69c80e5a4a7e995b7e381af323535e69687e5ad97e381a7e38199e380824c696e7578efbc88e3818ae38288e381b3e3819de381aee4bb96e381ae556e6978e7b3bbe382b7e382b9e38386e383a0efbc89e381a7e381afe38081e381bbe381a8e38293e381a9e381aee38395e382a1e382a4e383abe382b7e382b9e38386e383a0e38081e4be8be38188e381b065787434e381aee5a0b4e59088e38081e9809ae5b8b8e38081e38395e382a1e382a4e383abe5908de381aee69c80e5a4a7e995b7e381af323535e69687e5ad97e381a7e38199e38082'
+        
+        unicode_name_too_long = bytes.fromhex( unicode_name_too_long_hex ).decode( 'utf-8' )
+        
+        unicode_name_shortened_hex = '57696e646f7773e381a7e381afe380814e544653e381aee5a0b4e59088e38081e9809ae5b8b8e38081e38395e382a1e382a4e383abe5908de381aee69c80e5a4a7e995b7e381af323630e69687e5ad97e381a7e38199e380826d61634f53efbc88556e6978e38399e383bce382b9efbc89e381a7e381afe380814846532be3818ae38288e381b341504653e381aee5a0b4e59088e38081e9809ae5b8b8e38081e38395e382a1e382a4e383abe5908de381aee69c80e5a4a7e995b7e381af323535e69687e5ad97e381a7e38199e380824c696e7578efbc88e3818ae38288e381b3e3819de381aee4bb96e381ae556e69'
+        
+        unicode_name_shortened = bytes.fromhex( unicode_name_shortened_hex ).decode( 'utf-8' )
+        
+        self.assertEqual( HydrusPaths.ElideFilenameOrDirectorySafely( unicode_name_too_long ), unicode_name_shortened )
+        
+        #
+        
+        old_platform_windows = HC.PLATFORM_WINDOWS
+        
+        num_characters_used_in_other_components = 4
+        
+        try:
+            
+            HC.PLATFORM_WINDOWS = True
+            
+            name_shortened = 'a' * ( 240 - num_characters_used_in_other_components )
+            
+            self.assertEqual( HydrusPaths.ElideFilenameOrDirectorySafely( name_too_long, num_characters_used_in_other_components = num_characters_used_in_other_components ), name_shortened )
+            
+            HC.PLATFORM_WINDOWS = False
+            
+            name_shortened = 'a' * 240
+            
+            self.assertEqual( HydrusPaths.ElideFilenameOrDirectorySafely( name_too_long, num_characters_used_in_other_components = num_characters_used_in_other_components ), name_shortened )
+            
+        finally:
+            
+            HC.PLATFORM_WINDOWS = old_platform_windows
+            
+        
+        num_characters_already_used_in_this_component = 3
+        
+        name_shortened = 'a' * ( 240 - num_characters_already_used_in_this_component )
+        
+        self.assertEqual( HydrusPaths.ElideFilenameOrDirectorySafely( name_too_long, num_characters_already_used_in_this_component = num_characters_already_used_in_this_component ), name_shortened )
+        
+        #
```
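The multi-byte case in these tests cuts the unicode name mid-word, which suggests elision by encoded byte length rather than by character count. A minimal sketch of that idea follows; `elide_filename` and its 250-byte budget are illustrative guesses, and the real `HydrusPaths.ElideFilenameOrDirectorySafely` also accounts for the platform and for characters used elsewhere in the path:

```python
def elide_filename( name, max_bytes = 250 ):
    # Trim the name so its UTF-8 encoding fits the byte budget,
    # without splitting a multi-byte codepoint in half.
    encoded = name.encode( 'utf-8' )
    if len( encoded ) <= max_bytes:
        return name
    # errors='ignore' silently drops a trailing partial codepoint
    return encoded[ : max_bytes ].decode( 'utf-8', errors = 'ignore' )
```

The byte budget matters because most Linux filesystems cap a single filename at 255 bytes, not 255 characters, so a Japanese filename hits the limit at far fewer characters than an ASCII one.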
```diff
@@ -1,13 +1,7 @@
 import unittest
 
-from hydrus.core import HydrusConstants as HC
-from hydrus.core import HydrusGlobals as HG
 from hydrus.core import HydrusTime
 
-from hydrus.client import ClientConstants as CC
-from hydrus.client import ClientData
-from hydrus.client.metadata import ClientTags
-
 class TestHydrusTime( unittest.TestCase ):
     
     def test_quick( self ):
```