@@ -7,6 +7,46 @@ title: Changelog
!!! note
    This is the new changelog, only the most recent builds. For all versions, see the [old changelog](old_changelog.html).

## [Version 544](https://github.com/hydrusnetwork/hydrus/releases/tag/v544)

### webp vulnerability

* the main webp library (libwebp) that many programs use for webp support had a remote execution (very bad) vulnerability. you probably noticed your chrome/firefox updated this week, which was fixing this. we use the same thing via the `Pillow` library, which also rolled out a fix. I'm not sure how vulnerable hydrus ever was, since we are usually jank about how we do anything, but best to be safe about these things. there were apparently exploits for this floating around
* the builds today have the fix, so if you use them, update as normal and you are good
* if you run from source, **rebuild your venv at your earliest convenience**, and you'll get the new version of Pillow and be good. note, if you use the advanced setup, that there is a new question about `Pillow`
* unfortunately, Windows 7 users (or anyone else running from source on Python 3.7) cannot get the fix! it needs Pillow 10.0.1, which is >=Python 3.8. it seems many large programs are dropping support for Win 7 this year, so while I will continue to support it for a reasonable while longer, I think the train may be running out of track bros

### max size in file storage system

* the `migrate database` dialog now allows you to set a 'max size' for all but one of your media locations. if you have a 500GB drive you want to store some stuff on, you no longer have to balance the weights in your head--just set a max size of 450GB and hydrus will figure it out for you. it is not super precise (and it isn't healthy to fill drives up to 98% anyway), so make sure you leave some padding
* also, please note that this will not automatically rebalance _yet_. right now, the only way files move between locations is through the 'move files now' button on the dialog, so if you have a location that is full up according to its max size rule and then spend a month importing many files, it will go over its limit until and unless you revisit 'migrate database' and move files again. I hope to have automatic background rebalancing in the near future
* updated the 'database migration' help to talk about this and added a new migration example
* the 'edit num bytes' widget now supports terabytes (TB)
* I fleshed out the logic and fixed several bugs in the migration code, mostly to do with the new max size stuff and distributing weights appropriately in various situations

### misc

* when an image file fails to render in the media viewer, it now draws a bordered box with a brief 'failed to render' note. previously, it janked with a second of lag, made some popups, and left the display on eternal blank hang. now it finishes its job cleanly and returns a 'nah m8' 'image' result
* I reworked the Mr Bones layout a bit. the search is now on the left, and the rows of the main count table are separated for readability
* it turns out that bitmap (.bmp) files can support ICC Profiles, so I've told hydrus to look for them in new bitmaps and retroactively scan all your existing ones
* fixed an issue with the recent PSD code updates that was breaking boot for clients running from source without the psd-tools library (this affected the Docker build)
* updated all the 'setup_venv' scripts. all the formatting and text has had a pass, and there is now a question on (n)ew or (o)ld Pillow
* to stop FFMPEG's false positives where it can think a txt file is an mpeg, the main hydrus filetype scanning routine will no longer send files with common text extensions to ffmpeg. if you do have an mp3 called music.txt, rename it before import!
* thanks to a user, the inkbunny file page parser fetches the correct source time again (#1431)
* thanks to a user, the old sankaku gallery parser can find the 'next page' again
* removed the broken sankaku login script for new users. I recommend people move to Hydrus Companion for all tricky login situations (#1435)
* thanks to a user, procreate file parsing, which had the width/height flipped, is fixed. all existing procreate files will regen their metadata and thumbs

### client api

* thanks to a user, the Client API now has a `/get_files/render` command, which gives you a 100% zoom png render of the given file. useful if you want to display a PSD on a web page!
* I screwed up Mr Bones's Client API request last week. this is now fixed
* Mr Bones now supports a full file search context on the Client API, just like the main UI now. same parameters as `/get_files/search_files`, the help talks about it. He also cancels his work early if the request is terminated
* Mr Bones gets several new unit tests to guarantee long-term ride reliability
* the Client API (and all hydrus servers) now return proper JSON on an error. there's the error summary, specific exception name, and http status code. the big bad 500-error-of-last-resort still tacks on the large serverside traceback to the summary, so we'll see if that is still annoying and split it off if needed
* the new `/add_tags/get_siblings_and_parents` now properly cleans the tags you give it, trimming whitespace and lowercasing letters and so on
* the client api version is now 52

## [Version 543](https://github.com/hydrusnetwork/hydrus/releases/tag/v543)

### misc

@@ -370,53 +410,3 @@ title: Changelog
* all fetches of multiple rows of data from multi-column lists now happen sorted. this is just a little thing, but it'll probably dejank a few operations where you edit several things at once or get some errors and are trying to figure out which of five things caused it
* the hydrus official mimetype for psd files is now 'image/vnd.adobe.photoshop' (instead of 'application/x-photoshop')
* with krita file (which are actually just zip files) support, we now have the very barebones of archive tech started. I'll expand it a bit more and we should be able to improve support for other archive-like formats in the future

## [Version 534](https://github.com/hydrusnetwork/hydrus/releases/tag/v534)

### user submissions

* thanks to a user, we now have SAI2 (.sai2) file support!
* thanks to a user, the duplicate filter now says if one file has audio. this complements the recent Hydrus Video Deduplicator (https://github.com/appleappleapplenanner/hydrus-video-deduplicator), which can queue videos up in your dupe filter
* thanks to a user, we now have some nice svg images in the help->links(?) menu instead of gritty bitmaps
* thanks to a user, some help documentation for recent client vs hydrus_client changes got fixed

### quality of life/new stuff

* the media viewer's top-area 'removed from x' lines for files deleted from a local file service no longer appear--unless that file is currently in the trash. on clients with busy multiple local file services, they were mostly just annoying and spammy. if you need this data, hit up the right-click menu of the file--it is still listed there
* the 'loading' media page now draws a background in the same colour as the thumbnail grid, so new searches or refreshes will no longer flash to a default grey colour--it should just be a smooth thumbs gone/thumbs back now
* added a new shortcut action, 'copy small bmp of image for quick source lookups', for last week's new bitmap copy action
* it turns out PNG and WEBP files can have EXIF data, and our existing scanner works with them, so the EXIF scanner now looks at PNGs and WEBPs too. PNGs appear to be rare, about 1-in-200. I will retroactively scan your existing WEBPs, since they have EXIF more commonly, maybe as high as 1-in-5, and are less common as a filetype anyway so the scan will be less work, but when you update you will get a yes/no dialog asking if you want to do PNGs too. it isn't a big deal and you can always queue it up later if you want

### fixes

* I banged my head against the notes layout code and actually had great success--a variety of borked note-spilling-over-into-nothing and note-stretching-itself-crazy and note-has-fifty-pixels-of-margin issues are fixed. let me know if you still have crazy notes anywhere
* the duplicate filter right-hand hover is now more aggressive about getting out of the way of the notes hover, especially when the notes hover jitter-resizes itself a few extra pixels of height. the notes hover should no longer ever overlap the duplicate filter hover's top buttons--very sorry this took so long
* when you drag and drop thumbnails out of the program while using an automatic pattern to rename them (_options->gui_), all the filenames are now unique, adding '(x)' after their name as needed for dedupe. previously, on duplicates, it was basically doing a weird spam-merge
* fixed an issue when sanitizing export filenames when the destination directory's file system type cannot be determined
* fixed a bug when doing a search in a deleted file domain with 'file import time' as the file sort
* fixed a bug when hitting the shortcut for 'open file in media viewer' on a non-local file
* fixed a bug when the client wants to figure out view options for a file that has mime 'application/unknown'
* I may have improved the 'woah the db caches are unsynced' error handling in the 'what siblings and parents does this have?' check when you right-click tags

### weird bitmap pastes

* fixed the new 'paste image' button under `system:similar files` for a certain category of unusual clipboard bitmaps, including several that hydrus itself generates, where it turns out the QImage storage system stores extra padding bytes on each line of pixels
* fixed the new 'paste image' button when the incoming bitmap has a useless alpha channel (e.g. 100% transparent). this was not being stripped like it is for imported images, and so some similar files data was not lining up
* many bitmaps copied from other programs like Firefox remain slightly different to what hydrus generates (even though both are at 100% scale). my best guess here is that there is some differing ICC-profile-like colour adjustment happening somewhere, probably either a global browser setting, the browser obeying a global GPU setting, a simply better application of such image metadata on the browser's side, or maybe a stripping of such data, since it seems a 'copy image' event in Firefox also generates and attaches to your clipboard a temporary png file in your temp folder, so maybe the bitmap that we pull from the clipboard is actually generated during some conversion process amidst all that, and it loses some jpeg colour data. whatever the case here, it changes the pixel hash and subtly alters the perceptual hash in many cases. I'm bumping the default distance on this search predicate up to 8 now, to catch the weirder instances

### misc

* the 'does the db partition have 500MB free?' check that runs on database boot now occurs after some initial database cleanup, and it will use half the total database size instead, if that is smaller than 500MB, down to 64MB (issue #1373)
* added a note to the 'running from source' help that the newer mpv dll seems to work on Qt5 and Windows 7 (issue #1338)
* the twitter parsers and gugs are removed from the defaults for new users. a shame, but we'll see what happens in future
* more misc linting cleanup

### ratings on the client api

* the services object now shows `star_shape` and `min_stars` and `max_stars` for like/dislike and numerical rating services
* the file metadata object now has a 'ratings' key, which lists `rating_service_key->rating` for all the client's rating services. this thing is simple and uses human-friendly values, but it can hold several different data types, so check the help for details and examples
* a new permission, 'edit ratings', is added.
* a new command, `/edit_ratings/set_rating`, is added. Guess what it does! (issue #343)
* the help is updated for these
* the unit tests are updated for these
* the client api version is now 48

@@ -2,6 +2,9 @@
title: Database Migration
---

!!! warning
    I am working on this system right now and will be moving the 'move files now' action to a more granular, always-on background migration. This document will update to reflect those changes!

# database migration

## the hydrus database { id="intro" }
@@ -38,19 +41,39 @@ Backing such an arrangement up is obviously more complicated, and the internal c
!!! danger
    **As always, I recommend creating a backup before you try any of this, just in case it goes wrong.**

If you would like to move your files and thumbnails to new locations, I generally recommend you not move their folders around yourself--the database has an internal knowledge of where it thinks its file and thumbnail folders are, and if you move them while it is closed, it will become confused and you will have to manually relocate what is missing on the next boot via a repair dialog. This is not impossible to figure out, but if the program's 'client files' folder confuses you at all, I'd recommend you stay away. Instead, you can simply do it through the gui:
If you would like to move your files and thumbnails to new locations, I generally recommend you not move their folders around yourself--the database has an internal knowledge of where it thinks its file and thumbnail folders are, and if you move them while it is closed, it will become confused.

??? note "Missing Locations"
    If your folders are in the wrong locations on a client boot, a repair dialog appears, and you can manually update the client's internal understanding. This is not impossible to figure out, _and in some tricky storage situations doing this on purpose can be faster than letting the client migrate things itself_, but generally it is best and safest to do everything through the dialog.

Go _database->migrate database_, giving you this dialog:

![](images/db_migration.png)

To move your files somewhere else, add the new location, empty/remove the old location, and then click 'move files now'.
The buttons let you add more locations and remove old ones. The operations on this dialog are simple and atomic--at no point is your db ever invalid.

**Portable** means that the path is beneath the main db dir and so is stored as a relative path. Portable paths will still function if the database changes location between boots (for instance, if you run the client from a USB drive and it mounts under a different location).
**Beneath db?** means that the path is beneath the main db dir and so is stored internally as a relative path. Portable paths will still function if the database changes location between boots (for instance, if you run the client from a USB drive and it mounts under a different location).

**Weight** means the relative amount of media you would like to store in that location. It only matters if you are spreading your files across multiple locations. If location A has a weight of 1 and B has a weight of 2, A will get approximately one third of your files and B will get approximately two thirds.

The operations on this dialog are simple and atomic--at no point is your db ever invalid. Once you have the locations and ideal usage set how you like, hit the 'move files now' button to actually shuffle your files around. It will take some time to finish, but you can pause and resume it later if the job is large or you want to undo or alter something.
**Max Size** means the max total size of files the client will want to store in that location. Again, it only matters if you are spreading your files across multiple locations, but it is a simple way to ensure you don't go over a particular smaller hard drive's size. One location must always be limitless. This is not precise, so give it some padding. When one location is maxed out, the remaining locations will distribute the remainder of the files according to their respective weights. _For the meantime, this will not update by itself. If you import many files, the location may go over its limit and you will have to revisit 'migrate database' to rebalance your files again. Bear with me--I will fix this soon with the background migrate._
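
As a rough illustration of how weight and max size interact--maxed-out locations are capped first, and the remainder is shared by weight--here is a toy sketch. This is _not_ hydrus's actual code; the function name and example paths are invented for the illustration:

```python
# A toy sketch of the weight/max size rule, NOT hydrus's real algorithm:
# cap any maxed-out locations first, then share the remainder by weight.

def ideal_split( locations, total_bytes ):
    
    # locations: { name : ( weight, max_bytes or None ) }; at least one must be limitless
    
    result = {}
    active = dict( locations )
    remaining = total_bytes
    
    while True:
        
        total_weight = sum( weight for ( weight, max_bytes ) in active.values() )
        
        # locations whose weighted share would blow past their max get capped
        capped = { name for ( name, ( weight, max_bytes ) ) in active.items() if max_bytes is not None and remaining * weight / total_weight > max_bytes }
        
        if len( capped ) == 0:
            
            break
            
        
        for name in capped:
            
            result[ name ] = active[ name ][ 1 ]
            remaining -= result[ name ]
            
            del active[ name ]
            
        
    
    # everyone left shares the remainder by weight
    for ( name, ( weight, max_bytes ) ) in active.items():
        
        result[ name ] = remaining * weight / total_weight
        
    
    return result
    

# a 450GB-capped drive and a limitless drive, equal weights, 1000GB of files:
print( ideal_split( { 'C:\\hydrus_files' : ( 1, None ), 'D:\\hydrus_files' : ( 1, 450 ) }, 1000 ) )
# D:\hydrus_files is capped at 450, C:\hydrus_files takes the remaining 550
```
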

Let's set up an example move:

![](images/db_migration_move_pending.png)

I made several changes:

* Added `C:\hydrus_files` to store files.
* Added `D:\hydrus_files` to store files, with a max size of 128MB.
* Set `C:\hydrus_thumbs` as the location to store thumbnails.
* Removed the original `C:\Hydrus Network\db\client_files` location.

While the ideal usage has changed significantly, note that the current usage remains the same. Nothing moves until you click 'move files now'. Moving files will take some time to finish. Once done, it looks like this:

![](images/db_migration_move_done.png)

The current and ideal usages line up, and the defunct `C:\Hydrus Network\db\client_files` location, which no longer stores anything, is removed from the list.

## informing the software that the SQLite database is not in the default location { id="launch_parameter" }

@@ -72,7 +95,7 @@ Rather than typing the path out in a terminal every time you want to launch your

Note that an install with an 'external' database no longer needs access to write to its own path, so you can store it anywhere you like, including protected read-only locations (e.g. in 'Program Files'). Just double-check your shortcuts are good.

## finally { id="finally" }
## backups { id="finally" }

If your database now lives in one or more new locations, make sure to update your backup routine to follow them!

@@ -91,15 +114,15 @@ Specifically:
* Create two empty folders on your SSD with names like 'hydrus\_db' and 'hydrus\_thumbnails'.
* Set the 'thumbnail location override' to 'hydrus_thumbnails'. You should get that new location in the list, currently empty but prepared to take all your thumbs.
* Hit 'move files now' to actually move the thumbnails. Since this involves moving a lot of individual files from a high-latency source, it will take a long time to finish. The hydrus client may hang periodically as it works, but you can just leave it to work on its own--it will get there in the end. You can also watch it do its disk work under Task Manager.
* Now hit 'add location' and select your new 'hydrus\_files'. 'hydrus\_files' should be added and willing to take 50% of the files.
* Select the old location (probably 'install\_dir/db/client\_files') and hit 'decrease weight' until it has weight 0 and you are prompted to remove it completely. 'hydrus_files' should now be willing to take all the files from the old location.
* Now hit 'add location' and select your new 'hydrus\_files'. 'hydrus\_files' should appear and be willing to take 50% of the files.
* Select the old location (probably 'install\_dir/db/client\_files') and hit 'remove location' or 'decrease weight' until it has weight 0 and you are prompted to remove it completely. 'hydrus_files' should now be willing to take 100% of the files from the old location.
* Hit 'move files now' again to make this happen. This should be fast since it is just moving a bunch of folders across the same partition.
* With everything now 'non-portable' and hence decoupled from the db, you can now easily migrate the install and db to 'hydrus_db' simply by shutting the client down and moving the install folder in a file explorer.
* Update your shortcut to the new hydrus_client.exe location and try to boot.
* Update your backup scheme to match your new locations.
* Enjoy a much faster client.

You should now have _something_ like this:
You should now have _something_ like this (let's say the D drive is the fast SSD, and E is the high capacity HDD):

![](images/db_migration_example.png)

@@ -46,7 +46,7 @@ In general, the API deals with standard UTF-8 JSON. POST requests and 200 OK res
```


On 200 OK, the API returns JSON for everything except actual file/thumbnail requests. On 4XX and 5XX, assume it will return plain text, which may be a raw traceback that I'd be interested in seeing. You'll typically get 400 for a missing parameter, 401/403/419 for missing/insufficient/expired access, and 500 for a real deal serverside error.
The API returns JSON for everything except actual file/thumbnail requests. For errors, you'll typically get 400 for a missing/invalid parameter, 401/403/419 for missing/insufficient/expired access, and 500 for a real deal serverside error.
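
For reference, here is a minimal sketch of reading an error body on the client side. The key names (`error`, `exception_type`, `status_code`) are my assumption based on the v544 changelog's 'summary, exception name, and http status code' description--check a live response before relying on them:

```python
import json

# a hypothetical 400 response body--the key names are an assumption, not confirmed spec
body = '{"error": "Missing the required parameter!", "exception_type": "BadRequestException", "status_code": 400}'

info = json.loads( body )

if info[ 'status_code' ] == 400:
    
    print( 'bad request: ' + info[ 'error' ] )
    
```
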

!!! note
    For any request sent to the API, the total size of the initial request line (this includes the URL and any parameters) and the headers must not be larger than 2 megabytes.

@@ -353,11 +353,11 @@ Arguments:
* 7 - Edit File Notes
* 8 - Edit File Relationships
* 9 - Edit File Ratings


``` title="Example request"
/request_new_permissions?name=my%20import%20script&basic_permissions=[0,1]
```


Response:
: Some JSON with your access key, which is 64 characters of hex. This will not be valid until the user approves the request in the client ui.
```json title="Example response"
@@ -2764,7 +2764,16 @@ _Get the data from help->how boned am I?. This is a simple Object of numbers jus
Restricted access:
: YES. Manage Database permission needed.

Arguments: None
Arguments (in percent-encoded JSON):
:
    * `tags`: (optional, a list of tags you wish to search for)
    * [file domain](#parameters_file_domain) (optional, defaults to 'all my files')
    * `tag_service_key`: (optional, hexadecimal, the tag domain on which to search, defaults to 'all known tags')

``` title="Example requests"
/manage_database/mr_bones
/manage_database/mr_bones?tags=%5B%22blonde_hair%22%2C%20%22blue_eyes%22%5D
```

```json title="Example response"
{
@@ -2783,3 +2792,5 @@ Arguments: None
    }
}
```

The arguments here are the same as for [GET /get\_files/search\_files](#get_files_search_files). You can set any or none of them to set a search domain like in the dialog.

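If you are building the request yourself, the percent-encoded JSON in the example above can be produced with the standard library alone. A quick sketch, assuming the default Client API address of `127.0.0.1:45869`:

```python
import json
import urllib.parse

tags = [ 'blonde_hair', 'blue_eyes' ]

# dump the list to a JSON string, then percent-encode it for the URL
encoded_tags = urllib.parse.quote( json.dumps( tags ) )

url = 'http://127.0.0.1:45869/manage_database/mr_bones?tags=' + encoded_tags

print( encoded_tags )
# %5B%22blonde_hair%22%2C%20%22blue_eyes%22%5D
```
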
(image diff: one image modified, 30 KiB → 29 KiB; two images added, 33 KiB and 36 KiB)
@@ -34,6 +34,41 @@
    <div class="content">
        <h1 id="changelog"><a href="#changelog">changelog</a></h1>
        <ul>
            <li>
                <h2 id="version_544"><a href="#version_544">version 544</a></h2>
                <ul>
                    <li><h3>webp vulnerability</h3></li>
                    <li>the main webp library (libwebp) that many programs use for webp support had a remote execution (very bad) vulnerability. you probably noticed your chrome/firefox updated this week, which was fixing this. we use the same thing via the `Pillow` library, which also rolled out a fix. I'm not sure how vulnerable hydrus ever was, since we are usually jank about how we do anything, but best to be safe about these things. there were apparently exploits for this floating around</li>
                    <li>the builds today have the fix, so if you use them, update as normal and you are good</li>
                    <li>if you run from source, **rebuild your venv at your earliest convenience**, and you'll get the new version of Pillow and be good. note, if you use the advanced setup, that there is a new question about `Pillow`</li>
                    <li>unfortunately, Windows 7 users (or anyone else running from source on Python 3.7) cannot get the fix! it needs Pillow 10.0.1, which is >=Python 3.8. it seems many large programs are dropping support for Win 7 this year, so while I will continue to support it for a reasonable while longer, I think the train may be running out of track bros</li>
                    <li><h3>max size in file storage system</h3></li>
                    <li>the `migrate database` dialog now allows you to set a 'max size' for all but one of your media locations. if you have a 500GB drive you want to store some stuff on, you no longer have to balance the weights in your head--just set a max size of 450GB and hydrus will figure it out for you. it is not super precise (and it isn't healthy to fill drives up to 98% anyway), so make sure you leave some padding</li>
                    <li>also, please note that this will not automatically rebalance _yet_. right now, the only way files move between locations is through the 'move files now' button on the dialog, so if you have a location that is full up according to its max size rule and then spend a month importing many files, it will go over its limit until and unless you revisit 'migrate database' and move files again. I hope to have automatic background rebalancing in the near future</li>
                    <li>updated the 'database migration' help to talk about this and added a new migration example</li>
                    <li>I fleshed out the logic and fixed several bugs in the migration code, mostly to do with the new max size stuff and distributing weights appropriately in various situations</li>
                    <li><h3>misc</h3></li>
                    <li>when an image file fails to render in the media viewer, it now draws a bordered box with a brief 'failed to render' note. previously, it janked with a second of lag, made some popups, and left the display on eternal blank hang. now it finishes its job cleanly and returns a 'nah m8' 'image' result</li>
                    <li>I reworked the Mr Bones layout a bit. the search is now on the left, and the rows of the main count table are separated for readability</li>
                    <li>it turns out that bitmap (.bmp) files can support ICC Profiles, so I've told hydrus to look for them in new bitmaps and retroactively scan all your existing ones</li>
                    <li>the 'edit num bytes' widget now supports terabytes (TB)</li>
                    <li>fixed an issue with the recent PSD code updates that was breaking boot for clients running from source without the psd-tools library (this affected the Docker build)</li>
                    <li>updated all the 'setup_venv' scripts. all the formatting and text has had a pass, and there is now a question on (n)ew or (o)ld Pillow</li>
                    <li>to stop FFMPEG's false positives where it can think a txt file is an mpeg, the main hydrus filetype scanning routine will no longer send files with common text extensions to ffmpeg. if you do have an mp3 called music.txt, rename it before import!</li>
                    <li>thanks to a user, the inkbunny file page parser fetches the correct source time again (#1431)</li>
                    <li>thanks to a user, the old sankaku gallery parser can find the 'next page' again</li>
                    <li>removed the broken sankaku login script for new users. I recommend people move to Hydrus Companion for all tricky login situations (#1435)</li>
                    <li>thanks to a user, procreate file parsing, which had the width/height flipped, is fixed. all existing procreate files will regen their metadata and thumbs</li>
                    <li><h3>client api</h3></li>
                    <li>thanks to a user, the Client API now has a `/get_files/render` command, which gives you a 100% zoom png render of the given file. useful if you want to display a PSD on a web page!</li>
                    <li>I screwed up Mr Bones's Client API request last week. this is now fixed</li>
                    <li>Mr Bones now supports a full file search context on the Client API, just like the main UI now. same parameters as `/get_files/search_files`, the help talks about it. He also cancels his work early if the request is terminated</li>
                    <li>Mr Bones gets several new unit tests to guarantee long-term ride reliability</li>
                    <li>the Client API (and all hydrus servers) now return proper JSON on an error. there's the error summary, specific exception name, and http status code. the big bad 500-error-of-last-resort still tacks on the large serverside traceback to the summary, so we'll see if that is still annoying and split it off if needed</li>
                    <li>the new `/add_tags/get_siblings_and_parents` now properly cleans the tags you give it, trimming whitespace and lowercasing letters and so on</li>
                    <li>the client api version is now 52</li>
                </ul>
            </li>
            <li>
                <h2 id="version_543"><a href="#version_543">version 543</a></h2>
                <ul>

@@ -727,23 +727,36 @@ class ClientFilesManager( object ):
        # we want these overweight guys to nonetheless distribute their stuff according to relative weights
        # so, what we'll do is we'll play a game with a split-pot, where bust players can't get dosh from later rounds
        
        second_round_total_ideal_weight = total_ideal_weight
        second_round_base_locations = []
        
        desperately_overweight_locations = []
        overweight_locations = []
        underweight_locations = []
        available_locations = []
        starving_locations = []
        
        # first round, we need to sort out who is bust
        
        total_normalised_weight_lost_in_first_round = 0
        
        for base_location in all_media_base_locations:
            
            current_weight = current_base_locations_to_normalised_weights[ base_location ]
            current_num_bytes = current_base_locations_to_size_estimate[ base_location ]
            
            if base_location.NeedsToRemoveSubfolders( current_num_bytes ):
            if not base_location.AbleToAcceptSubfolders( current_num_bytes, smallest_subfolder_num_bytes ):
                
                overweight_locations.append( base_location )
                second_round_total_ideal_weight -= current_weight
                if base_location.max_num_bytes is None:
                    
                    total_normalised_weight_lost_in_first_round += base_location.ideal_weight / total_ideal_weight
                    
                else:
                    
                    total_normalised_weight_lost_in_first_round += base_location.max_num_bytes / all_local_files_total_size
                    
                
                if base_location.NeedsToRemoveSubfolders( current_num_bytes ):
                    
                    desperately_overweight_locations.append( base_location )
                    
                
            else:
                
@@ -754,32 +767,56 @@ class ClientFilesManager( object ):

 random.shuffle( second_round_base_locations )
 
 # second round, let's distribute the remainder
+# I fixed some logic and it seems like everything here is now AbleToAccept, so maybe we want another quick pass on this
+# or just wait until I do the slow migration and we'll figure something out with the staticmethod on BaseLocation that just gets ideal weights
+# I also added this jank regarding / ( 1 - first_round_weight ), which makes sure we are distributing the remaining weight correctly
 
 second_round_total_ideal_weight = sum( ( base_location.ideal_weight for base_location in second_round_base_locations ) )
 
 for base_location in second_round_base_locations:
     
-    current_weight = current_base_locations_to_normalised_weights[ base_location ]
+    current_normalised_weight = current_base_locations_to_normalised_weights[ base_location ]
     current_num_bytes = current_base_locations_to_size_estimate[ base_location ]
     
-    if base_location.WouldLikeToRemoveSubfolders( current_weight, second_round_total_ideal_weight, largest_subfolder_normalised_weight ):
+    # can be both overweight and able to eat more
+    
+    if base_location.WouldLikeToRemoveSubfolders( current_normalised_weight / ( 1 - total_normalised_weight_lost_in_first_round ), second_round_total_ideal_weight, largest_subfolder_normalised_weight ):
        
        overweight_locations.append( base_location )
        
-    elif base_location.EagerToAcceptSubfolders( current_weight, second_round_total_ideal_weight, smallest_subfolder_normalised_weight, current_num_bytes, smallest_subfolder_num_bytes ):
+    if base_location.EagerToAcceptSubfolders( current_normalised_weight / ( 1 - total_normalised_weight_lost_in_first_round ), second_round_total_ideal_weight, smallest_subfolder_normalised_weight, current_num_bytes, smallest_subfolder_num_bytes ):
        
-        underweight_locations.insert( 0, base_location )
+        starving_locations.insert( 0, base_location )
        
     elif base_location.AbleToAcceptSubfolders( current_num_bytes, smallest_subfolder_num_bytes ):
        
-        underweight_locations.append( base_location )
+        available_locations.append( base_location )
        
     
 
 #
 
-if len( underweight_locations ) > 0 and len( overweight_locations ) > 0:
+if len( desperately_overweight_locations ) > 0:
    
-    overweight_location = overweight_locations.pop( 0 )
-    underweight_location = underweight_locations.pop( 0 )
+    potential_sources = desperately_overweight_locations
+    potential_destinations = starving_locations + available_locations
    
+elif len( overweight_locations ) > 0:
+    
+    potential_sources = overweight_locations
+    potential_destinations = starving_locations
+    
+else:
+    
+    potential_sources = []
+    potential_destinations = []
+    
+
+if len( potential_sources ) > 0 and len( potential_destinations ) > 0:
+    
+    source_base_location = potential_sources.pop( 0 )
+    destination_base_location = potential_destinations.pop( 0 )
    
     random.shuffle( file_prefixes )
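The `/ ( 1 - first_round_weight )` "jank" the comment above mentions is a plain renormalisation: once the bust locations' fixed shares are off the table, the survivors' weights are rescaled so they again sum to 1 over what remains. A tiny sketch (hypothetical names):

```python
def renormalise( current_normalised_weight, weight_lost_in_first_round ):
    
    # once the bust locations' fixed shares are taken off the table, the
    # surviving weights are rescaled so they sum to 1 over what remains
    return current_normalised_weight / ( 1 - weight_lost_in_first_round )
```

For example, a location that held 30% of everything, after 40% of the pot is locked up by capped locations, holds 50% of what is still in play.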
@@ -791,10 +828,10 @@ class ClientFilesManager( object ):

 base_location = subfolder.base_location
 
-if base_location == overweight_location:
+if base_location == source_base_location:
    
-    overweight_subfolder = ClientFilesPhysical.FilesStorageSubfolder( file_prefix, overweight_location )
-    underweight_subfolder = ClientFilesPhysical.FilesStorageSubfolder( file_prefix, underweight_location )
+    overweight_subfolder = ClientFilesPhysical.FilesStorageSubfolder( file_prefix, source_base_location )
+    underweight_subfolder = ClientFilesPhysical.FilesStorageSubfolder( file_prefix, destination_base_location )
    
     return ( overweight_subfolder, underweight_subfolder )
@@ -158,6 +158,11 @@ class FilesStorageBaseLocation( object ):

 def NeedsToRemoveSubfolders( self, current_num_bytes: int ):
     
     if self.ideal_weight == 0:
         
         return True
         
     
+    if self.max_num_bytes is not None and current_num_bytes > self.max_num_bytes:
+        
+        return True
+        
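Pulled out of its class, the predicate above amounts to the following (function and argument names are mine, not hydrus's):

```python
def needs_to_remove_subfolders( ideal_weight, max_num_bytes, current_num_bytes ):
    
    # a location with zero weight should hold nothing, and a capped location
    # that has gone over its byte limit must shed subfolders
    if ideal_weight == 0:
        
        return True
        
    
    if max_num_bytes is not None and current_num_bytes > max_num_bytes:
        
        return True
        
    
    return False
```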
@@ -184,6 +189,78 @@ class FilesStorageBaseLocation( object ):

 return current_normalised_weight - weight_of_subfolder > ideal_normalised_weight
 
+@staticmethod
+def STATICGetIdealWeights( current_num_bytes: int, base_locations: typing.List[ "FilesStorageBaseLocation" ] ) -> typing.Dict[ "FilesStorageBaseLocation", float ]:
+    
+    # This is kind of tacked on logic versus the eager/able/needs/would stuff, but I'm collecting it here so at least the logic, pseudo-doubled, is in one place
+    # this is used by the migrate database listctrl atm, but maybe we can merge all this together sometime
+    
+    result = {}
+    
+    limited_locations = sorted( [ base_location for base_location in base_locations if base_location.max_num_bytes is not None ], key = lambda b_l: b_l.max_num_bytes )
+    unlimited_locations = [ base_location for base_location in base_locations if base_location.max_num_bytes is None ]
+    
+    # ok we are first playing a game of elimination. eliminate limited locations that are overweight and distribute the extra for the next round
+    next_round_of_limited_locations = []
+    players_eliminated = False
+    
+    amount_of_normalised_weight_lost_to_bust_players = 0.0
+    
+    while len( limited_locations ) > 0:
+        
+        total_ideal_weight = sum( ( base_location.ideal_weight for base_location in limited_locations ) ) + sum( ( base_location.ideal_weight for base_location in unlimited_locations ) )
+        
+        limited_location_under_examination = limited_locations.pop( 0 )
+        
+        normalised_weight = limited_location_under_examination.ideal_weight / total_ideal_weight
+        
+        max_num_bytes = limited_location_under_examination.max_num_bytes
+        
+        if normalised_weight * current_num_bytes > max_num_bytes:
+            
+            true_ideal_normalised_weight = max_num_bytes / current_num_bytes
+            
+            result[ limited_location_under_examination ] = true_ideal_normalised_weight
+            
+            amount_of_normalised_weight_lost_to_bust_players += true_ideal_normalised_weight
+            
+            current_num_bytes -= max_num_bytes
+            
+            players_eliminated = True
+            
+        else:
+            
+            next_round_of_limited_locations.append( limited_location_under_examination )
+            
+        
+        if len( limited_locations ) == 0:
+            
+            if players_eliminated:
+                
+                limited_locations = next_round_of_limited_locations
+                
+                next_round_of_limited_locations = []
+                players_eliminated = False
+                
+            else:
+                
+                unlimited_locations.extend( next_round_of_limited_locations )
+                
+            
+        
+    
+    # ok, all the bust players have been eliminated. the remaining pot is distributed according to relative weights as normal
+    
+    total_ideal_weight = sum( ( base_location.ideal_weight for base_location in unlimited_locations ) )
+    
+    for base_location in unlimited_locations:
+        
+        result[ base_location ] = ( base_location.ideal_weight / total_ideal_weight ) * ( 1 - amount_of_normalised_weight_lost_to_bust_players )
+        
+    
+    return result
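A self-contained sketch of the same elimination-rounds idea as `STATICGetIdealWeights`, simplified so every fraction is relative to the full byte total (plain tuples stand in for `FilesStorageBaseLocation`; this is an illustration of the technique, not the hydrus implementation):

```python
def ideal_weights( total_bytes, locations ):
    """locations: list of ( name, ideal_weight, max_num_bytes-or-None ) tuples."""
    
    result = {}
    
    limited = sorted( [ l for l in locations if l[2] is not None ], key = lambda l: l[2] )
    unlimited = [ l for l in locations if l[2] is None ]
    
    lost = 0.0
    
    # elimination rounds: any capped location whose proportional share of the
    # remaining pot would overflow its cap is fixed at the cap and leaves play
    while len( limited ) > 0:
        
        total_weight = sum( ( w for ( n, w, m ) in limited + unlimited ) )
        
        next_round = []
        eliminated = False
        
        for ( name, weight, cap ) in limited:
            
            if ( weight / total_weight ) * ( 1 - lost ) * total_bytes > cap:
                
                result[ name ] = cap / total_bytes
                lost += cap / total_bytes
                eliminated = True
                
            else:
                
                next_round.append( ( name, weight, cap ) )
                
            
        
        # survivors re-enter only if someone busted, since shares then change
        limited = next_round if eliminated else []
        
        if not eliminated:
            
            unlimited.extend( next_round )
            
        
    
    # the remaining pot is split among the survivors by relative weight
    total_weight = sum( ( w for ( n, w, m ) in unlimited ) )
    
    for ( name, weight, cap ) in unlimited:
        
        result[ name ] = ( weight / total_weight ) * ( 1 - lost )
        
    
    return result
```

With a 1000-byte collection, a weight-1 location capped at 100 bytes busts (its fair quarter would be 250 bytes) and takes a fixed 10% share, while the two uncapped locations split the remaining 90% by weight.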
 class FilesStorageSubfolder( object ):
@@ -78,6 +78,8 @@ class ImageRenderer( ClientCachesBase.CacheableObject ):

 ClientCachesBase.CacheableObject.__init__( self )
 
 self._numpy_image = None
+self._render_failed = False
+self._is_ready = False
 
 self._hash = media.GetHash()
 self._mime = media.GetMime()
@@ -235,28 +237,38 @@ class ImageRenderer( ClientCachesBase.CacheableObject ):

 except Exception as e:
     
-    HydrusData.ShowText( 'Problem rendering image at "{}"! Error follows:'.format( self._path ) )
+    self._numpy_image = self._InitialiseErrorImage( e )
     
-    HydrusData.ShowException( e )
+    self._render_failed = True
     
+    HydrusData.Print( 'Problem rendering image at "{}"! Error follows:'.format( self._path ) )
+    
+    HydrusData.PrintException( e, do_wait = False )
     
 
+self._is_ready = True
 
 if not self._this_is_for_metadata_alone:
     
-    if self._numpy_image is None:
-        
-        m = 'There was a problem rendering the image with hash {}! It may be damaged.'.format(
-            self._hash.hex()
-        )
-        
-        m += os.linesep * 2
-        m += 'Jobs to check its integrity and metadata have been scheduled. If it is damaged, it may be redownloaded or removed from the client completely. If it is not damaged, it may be fixed automatically or further action may be required.'
-        
-        HydrusData.ShowText( m )
-        
-        HG.client_controller.Write( 'file_maintenance_add_jobs_hashes', { self._hash }, ClientFiles.REGENERATE_FILE_DATA_JOB_FILE_INTEGRITY_DATA_TRY_URL_ELSE_REMOVE_RECORD )
-        HG.client_controller.Write( 'file_maintenance_add_jobs_hashes', { self._hash }, ClientFiles.REGENERATE_FILE_DATA_JOB_FILE_METADATA )
-        
-    else:
+    # TODO: Move this error code to a nice button or something
+    # old recovery code, before the ErrorImage
+    # I think move to show a nice 'check integrity' button when a file errors, so the user can kick it off, and we avoid the popup spam
+    '''
+    # (if image failed to render)
+    m = 'There was a problem rendering the image with hash {}! It may be damaged.'.format(
+        self._hash.hex()
+    )
+    
+    m += os.linesep * 2
+    m += 'Jobs to check its integrity and metadata have been scheduled. If it is damaged, it may be redownloaded or removed from the client completely. If it is not damaged, it may be fixed automatically or further action may be required.'
+    
+    HydrusData.ShowText( m )
+    
+    HG.client_controller.Write( 'file_maintenance_add_jobs_hashes', { self._hash }, ClientFiles.REGENERATE_FILE_DATA_JOB_FILE_INTEGRITY_DATA_TRY_URL_ELSE_REMOVE_RECORD )
+    HG.client_controller.Write( 'file_maintenance_add_jobs_hashes', { self._hash }, ClientFiles.REGENERATE_FILE_DATA_JOB_FILE_METADATA )
+    '''
+    
+    if not self._render_failed:
        
        my_resolution_size = QC.QSize( self._resolution[0], self._resolution[1] )
        my_numpy_size = QC.QSize( self._numpy_image.shape[1], self._numpy_image.shape[0] )
@@ -280,6 +292,48 @@ class ImageRenderer( ClientCachesBase.CacheableObject ):

+def _InitialiseErrorImage( self, e: Exception ):
+    
+    ( width, height ) = self._resolution
+    
+    qt_image = QG.QImage( width, height, QG.QImage.Format_RGB888 )
+    
+    painter = QG.QPainter( qt_image )
+    
+    painter.setBackground( QG.QBrush( QC.Qt.white ) )
+    
+    painter.eraseRect( painter.viewport() )
+    
+    pen = QG.QPen( QG.QColor( 20, 20, 20 ) )
+    
+    pen.setWidth( 5 )
+    
+    painter.setPen( pen )
+    painter.setBrush( QC.Qt.NoBrush )
+    
+    painter.drawRect( 0, 0, width - 1, height - 1 )
+    
+    from hydrus.client.gui import ClientGUIFunctions
+    
+    font = painter.font()
+    
+    font.setPixelSize( height // 20 )
+    
+    painter.setFont( font )
+    
+    text = 'Image failed to render:'
+    text += '\n'
+    text += str( e )
+    text += '\n'
+    text += 'Full info written to the log.'
+    
+    painter.drawText( QC.QRectF( 0, 0, width, height ), QC.Qt.AlignCenter, text )
+    
+    del painter
+    
+    return ClientGUIFunctions.ConvertQtImageToNumPy( qt_image )
+    
 
 def GetEstimatedMemoryFootprint( self ):
     
     if self._numpy_image is None:
@@ -415,9 +469,15 @@ class ImageRenderer( ClientCachesBase.CacheableObject ):

 def IsReady( self ):
     
-    return self._numpy_image is not None
+    return self._is_ready
     
 
+def RenderFailed( self ):
+    
+    return self._render_failed
+    
 

 class ImageTile( ClientCachesBase.CacheableObject ):
     
     def __init__( self, hash: bytes, clip_rect: QC.QRect, qt_pixmap: QG.QPixmap ):
@@ -2601,6 +2601,11 @@ class DB( HydrusDB.HydrusDB ):

 job_key = None
 ):
 
+if job_key is None:
+    
+    job_key = ClientThreading.JobKey()
+    
 
 boned_stats = {}
 
 ( num_total, size_total ) = self._Execute( f'SELECT COUNT( hash_id ), SUM( size ) FROM {files_table_name} CROSS JOIN files_info USING ( hash_id );' ).fetchone()
@@ -9714,6 +9719,64 @@ class DB( HydrusDB.HydrusDB ):

+if version == 543:
+    
+    try:
+        
+        domain_manager = self.modules_serialisable.GetJSONDump( HydrusSerialisable.SERIALISABLE_TYPE_NETWORK_DOMAIN_MANAGER )
+        
+        domain_manager.Initialise()
+        
+        #
+        
+        domain_manager.OverwriteDefaultParsers( [
+            'sankaku gallery page parser',
+            'inkbunny file page parser'
+        ] )
+        
+        #
+        
+        domain_manager.TryToLinkURLClassesAndParsers()
+        
+        #
+        
+        self.modules_serialisable.SetJSONDump( domain_manager )
+        
+    except Exception as e:
+        
+        HydrusData.PrintException( e )
+        
+        message = 'Trying to update some downloader objects failed! Please let hydrus dev know!'
+        
+        self.pub_initial_message( message )
+        
+    
+    try:
+        
+        self._controller.frame_splash_status.SetSubtext( f'scheduling some maintenance work' )
+        
+        all_local_hash_ids = self.modules_files_storage.GetCurrentHashIdsList( self.modules_services.combined_local_file_service_id )
+        
+        with self._MakeTemporaryIntegerTable( all_local_hash_ids, 'hash_id' ) as temp_hash_ids_table_name:
+            
+            hash_ids = self._STS( self._Execute( f'SELECT hash_id FROM {temp_hash_ids_table_name} CROSS JOIN files_info USING ( hash_id ) WHERE mime = ?;', ( HC.IMAGE_BMP, ) ) )
+            self.modules_files_maintenance_queue.AddJobs( hash_ids, ClientFiles.REGENERATE_FILE_DATA_JOB_FILE_HAS_ICC_PROFILE )
+            
+            hash_ids = self._STS( self._Execute( f'SELECT hash_id FROM {temp_hash_ids_table_name} CROSS JOIN files_info USING ( hash_id ) WHERE mime = ?;', ( HC.APPLICATION_PROCREATE, ) ) )
+            self.modules_files_maintenance_queue.AddJobs( hash_ids, ClientFiles.REGENERATE_FILE_DATA_JOB_FILE_METADATA )
+            self.modules_files_maintenance_queue.AddJobs( hash_ids, ClientFiles.REGENERATE_FILE_DATA_JOB_FORCE_THUMBNAIL )
+            
+        
+    except Exception as e:
+        
+        HydrusData.PrintException( e )
+        
+        message = 'Some file updates failed to schedule! This is not super important, but hydev would be interested in seeing the error that was printed to the log.'
+        
+        self.pub_initial_message( message )
+        
+    
 
 self._controller.frame_splash_status.SetTitleText( 'updated db to v{}'.format( HydrusData.ToHumanInt( version + 1 ) ) )
 
 self._Execute( 'UPDATE version SET version = ?;', ( version + 1, ) )
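The `if version == 543:` block above is one step of hydrus's incremental schema migration: each client version gets a single-step updater, and the update routine walks them in order, bumping the stored version after each. The loop shape can be sketched as (hypothetical helper, heavily simplified; the real thing also wraps each step in try/except and reports failures as popups):

```python
def update_database( version, target_version, updaters ):
    
    # run each single-step updater in order; each step is responsible for
    # migrating exactly one version and is followed by a version bump
    while version < target_version:
        
        updaters[ version ]()
        
        version += 1
        
    
    return version
```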
@@ -203,7 +203,7 @@ class EditPanel( ResizingScrolledPanel ):

 class EditSingleCtrlPanel( CAC.ApplicationCommandProcessorMixin, EditPanel ):
     
-    def __init__( self, parent, ok_on_these_commands = None ):
+    def __init__( self, parent, ok_on_these_commands = None, message = None ):
         
         EditPanel.__init__( self, parent )
         CAC.ApplicationCommandProcessorMixin.__init__( self )

@@ -221,6 +221,17 @@ class EditSingleCtrlPanel( CAC.ApplicationCommandProcessorMixin, EditPanel ):

 self._vbox = QP.VBoxLayout( margin = 0 )
 
+if message is not None:
+    
+    from hydrus.client.gui.widgets import ClientGUICommon
+    
+    st = ClientGUICommon.BetterStaticText( self, label = message )
+    
+    st.setWordWrap( True )
+    
+    QP.AddToLayout( self._vbox, st, CC.FLAGS_EXPAND_PERPENDICULAR )
+    
 
 self.widget().setLayout( self._vbox )
 
 self._my_shortcuts_handler = ClientGUIShortcuts.ShortcutsHandler( self, [ 'media' ] )
@@ -51,6 +51,7 @@ from hydrus.client.gui.lists import ClientGUIListCtrl

 from hydrus.client.gui.search import ClientGUIACDropdown
 from hydrus.client.gui.search import ClientGUILocation
 from hydrus.client.gui.widgets import ClientGUICommon
+from hydrus.client.gui.widgets import ClientGUIControls
 from hydrus.client.gui.widgets import ClientGUIMenuButton
 from hydrus.client.importing.options import FileImportOptions
 from hydrus.client.metadata import ClientTags

@@ -159,7 +160,7 @@ class MigrateDatabasePanel( ClientGUIScrolledPanels.ReviewPanel ):

 current_media_base_locations_listctrl_panel = ClientGUIListCtrl.BetterListCtrlPanel( file_locations_panel )
 
-self._current_media_base_locations_listctrl = ClientGUIListCtrl.BetterListCtrl( current_media_base_locations_listctrl_panel, CGLC.COLUMN_LIST_DB_MIGRATION_LOCATIONS.ID, 8, self._ConvertLocationToListCtrlTuples )
+self._current_media_base_locations_listctrl = ClientGUIListCtrl.BetterListCtrl( current_media_base_locations_listctrl_panel, CGLC.COLUMN_LIST_DB_MIGRATION_LOCATIONS.ID, 8, self._ConvertLocationToListCtrlTuples, activation_callback = self._SetMaxNumBytes )
 self._current_media_base_locations_listctrl.setSelectionMode( QW.QAbstractItemView.SingleSelection )
 
 self._current_media_base_locations_listctrl.Sort()
@@ -169,6 +170,7 @@ class MigrateDatabasePanel( ClientGUIScrolledPanels.ReviewPanel ):

 current_media_base_locations_listctrl_panel.AddButton( 'add new location for files', self._SelectPathToAdd )
 current_media_base_locations_listctrl_panel.AddButton( 'increase location weight', self._IncreaseWeight, enabled_check_func = self._CanIncreaseWeight )
 current_media_base_locations_listctrl_panel.AddButton( 'decrease location weight', self._DecreaseWeight, enabled_check_func = self._CanDecreaseWeight )
+current_media_base_locations_listctrl_panel.AddButton( 'set max size', self._SetMaxNumBytes, enabled_check_func = self._CanSetMaxNumBytes )
 current_media_base_locations_listctrl_panel.AddButton( 'remove location', self._RemoveSelectedBaseLocation, enabled_check_func = self._CanRemoveLocation )
 
 self._thumbnails_location = QW.QLineEdit( file_locations_panel )
@@ -358,6 +360,30 @@ class MigrateDatabasePanel( ClientGUIScrolledPanels.ReviewPanel ):

 return False
 
+def _CanSetMaxNumBytes( self ):
+    
+    only_one_location_is_limitless = len( [ 1 for base_location in self._media_base_locations if base_location.max_num_bytes is None ] ) == 1
+    
+    base_locations = self._current_media_base_locations_listctrl.GetData( only_selected = True )
+    
+    if len( base_locations ) > 0:
+        
+        base_location = base_locations[0]
+        
+        if base_location in self._media_base_locations:
+            
+            if base_location.max_num_bytes is None and only_one_location_is_limitless:
+                
+                return False
+                
+            
+            return True
+            
+        
+    
+    return False
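The enable-check above enforces one invariant: at least one media location must stay limitless to absorb overflow. A minimal sketch of the same rule over plain dicts (function and argument names are mine, not hydrus's):

```python
def can_set_max_num_bytes( media_locations, selected ):
    
    # media_locations: { location : max_num_bytes-or-None }. a cap may be set
    # on any known location except the single remaining limitless one
    if selected not in media_locations:
        
        return False
        
    
    limitless = [ location for ( location, cap ) in media_locations.items() if cap is None ]
    
    if media_locations[ selected ] is None and len( limitless ) == 1:
        
        return False
        
    
    return True
```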
@@ -377,7 +403,7 @@ class MigrateDatabasePanel( ClientGUIScrolledPanels.ReviewPanel ):

 self._SaveToDB()
 

-def _ConvertLocationToListCtrlTuples( self, base_location ):
+def _ConvertLocationToListCtrlTuples( self, base_location: ClientFilesPhysical.FilesStorageBaseLocation ):
    
    f_space = self._all_local_files_total_size
    ( t_space_min, t_space_max ) = self._GetThumbnailSizeEstimates()

@@ -468,8 +494,6 @@ class MigrateDatabasePanel( ClientGUIScrolledPanels.ReviewPanel ):

 #
 
-total_ideal_weight = sum( ( base_location.ideal_weight for base_location in self._media_base_locations ) )
-
 if base_location in self._media_base_locations:
     
     ideal_weight = base_location.ideal_weight
@@ -490,15 +514,19 @@ class MigrateDatabasePanel( ClientGUIScrolledPanels.ReviewPanel ):

-if base_location in self._media_base_locations:
+if base_location.max_num_bytes is None:
    
-    ideal_fp = base_location.ideal_weight / total_ideal_weight
+    pretty_max_num_bytes = 'n/a'
+    sort_max_num_bytes = -1
    
 else:
    
-    ideal_fp = 0.0
+    pretty_max_num_bytes = HydrusData.ToHumanBytes( base_location.max_num_bytes )
+    sort_max_num_bytes = base_location.max_num_bytes
    
 
+ideal_fp = self._media_base_locations_to_ideal_usage.get( base_location, 0.0 )
 
 if self._ideal_thumbnails_base_location_override is None:
     
     ideal_tp = ideal_fp

@@ -545,8 +573,8 @@ class MigrateDatabasePanel( ClientGUIScrolledPanels.ReviewPanel ):

 pretty_ideal_usage = 'nothing'
 

-display_tuple = ( pretty_location, pretty_portable, pretty_free_space, pretty_current_usage, pretty_ideal_weight, pretty_ideal_usage )
-sort_tuple = ( pretty_location, portable, free_space, ideal_weight, ideal_usage, current_usage )
+display_tuple = ( pretty_location, pretty_portable, pretty_free_space, pretty_current_usage, pretty_ideal_weight, pretty_max_num_bytes, pretty_ideal_usage )
+sort_tuple = ( pretty_location, portable, free_space, ideal_weight, ideal_usage, sort_max_num_bytes, current_usage )
 
 return ( display_tuple, sort_tuple )
@@ -763,6 +791,54 @@ class MigrateDatabasePanel( ClientGUIScrolledPanels.ReviewPanel ):

+def _SetMaxNumBytes( self ):
+    
+    if not self._CanSetMaxNumBytes():
+        
+        return
+        
+    
+    base_locations = self._current_media_base_locations_listctrl.GetData( only_selected = True )
+    
+    if len( base_locations ) > 0:
+        
+        base_location = base_locations[0]
+        
+        if base_location in self._media_base_locations:
+            
+            max_num_bytes = base_location.max_num_bytes
+            
+            with ClientGUITopLevelWindowsPanels.DialogEdit( self, 'edit max size' ) as dlg:
+                
+                message = 'If a location goes over its set size, it will schedule to migrate some files to other locations. At least one media location must have no limit.'
+                message += '\n' * 2
+                message += 'This is not precise (it works on average size, and thumbnails can add a bit), so give it some padding. Also, in general, remember it is not healthy to fill any hard drive more than 90% full.'
+                message += '\n' * 2
+                message += 'Also, this feature is under active development. The client will go over this limit if your collection grows significantly--the only way things rebalance atm are if you click "move files now"--but in future I will have things automatically migrate in the background to ensure limits are obeyed. This is just for advanced users to play with for now!'
+                
+                panel = ClientGUIScrolledPanels.EditSingleCtrlPanel( dlg, message = message )
+                
+                control = ClientGUIControls.NoneableBytesControl( panel, initial_value = 100 * ( 1024 ** 3 ) )
+                
+                control.SetValue( max_num_bytes )
+                
+                panel.SetControl( control )
+                
+                dlg.SetPanel( panel )
+                
+                if dlg.exec() == QW.QDialog.Accepted:
+                    
+                    max_num_bytes = control.GetValue()
+                    
+                    base_location.max_num_bytes = max_num_bytes
+                    
+                    self._SaveToDB()
+                    
+                
+            
+        
+    
 
 def _SetThumbnailLocation( self ):
     
     with QP.DirDialog( self, 'Select thumbnail location' ) as dlg:
@@ -798,10 +874,12 @@ class MigrateDatabasePanel( ClientGUIScrolledPanels.ReviewPanel ):

 ( self._media_base_locations, self._ideal_thumbnails_base_location_override ) = self._controller.Read( 'ideal_client_files_locations' )
 
+self._media_base_locations_to_ideal_usage = ClientFilesPhysical.FilesStorageBaseLocation.STATICGetIdealWeights( self._all_local_files_total_size, self._media_base_locations )
+
 approx_total_db_size = self._controller.db.GetApproxTotalFileSize()
 
 self._current_db_path_st.setText( 'database (about '+HydrusData.ToHumanBytes(approx_total_db_size)+'): '+self._controller.GetDBDir() )
-self._current_install_path_st.setText( 'install: '+HC.BASE_DIR )
+self._current_install_path_st.setText( 'install: ' + HC.BASE_DIR )
 
 approx_total_client_files = self._all_local_files_total_size
 ( approx_total_thumbnails_min, approx_total_thumbnails_max ) = self._GetThumbnailSizeEstimates()
@@ -2887,8 +2965,8 @@ class ReviewHowBonedAmI( ClientGUIScrolledPanels.ReviewPanel ):

 big_hbox = QP.HBoxLayout()
 
-QP.AddToLayout( big_hbox, vbox, CC.FLAGS_EXPAND_BOTH_WAYS )
 QP.AddToLayout( big_hbox, self._search_panel, CC.FLAGS_EXPAND_BOTH_WAYS )
+QP.AddToLayout( big_hbox, vbox, CC.FLAGS_EXPAND_BOTH_WAYS )
 
 self.widget().setLayout( big_hbox )
@@ -3085,7 +3163,16 @@ class ReviewHowBonedAmI( ClientGUIScrolledPanels.ReviewPanel ):

 #
 
-QP.AddToLayout( text_table_layout, ClientGUICommon.BetterStaticText( self._files_content_panel, label = '\u251cCurrent:' ), CC.FLAGS_ON_LEFT )
+QP.AddToLayout( text_table_layout, ClientGUICommon.BetterStaticText( self._files_content_panel, label = '' ), CC.FLAGS_ON_LEFT )
+QP.AddToLayout( text_table_layout, QW.QWidget( self._files_content_panel ), CC.FLAGS_ON_LEFT )
+QP.AddToLayout( text_table_layout, QW.QWidget( self._files_content_panel ), CC.FLAGS_ON_LEFT )
+QP.AddToLayout( text_table_layout, QW.QWidget( self._files_content_panel ), CC.FLAGS_ON_LEFT )
+QP.AddToLayout( text_table_layout, QW.QWidget( self._files_content_panel ), CC.FLAGS_ON_LEFT )
+QP.AddToLayout( text_table_layout, QW.QWidget( self._files_content_panel ), CC.FLAGS_ON_LEFT )
+
+#
+
+QP.AddToLayout( text_table_layout, ClientGUICommon.BetterStaticText( self._files_content_panel, label = 'Current:' ), CC.FLAGS_ON_LEFT )
 QP.AddToLayout( text_table_layout, ClientGUICommon.BetterStaticText( self._files_content_panel, label = HydrusData.ToHumanInt( num_total ) ), CC.FLAGS_ON_RIGHT )
 QP.AddToLayout( text_table_layout, ClientGUICommon.BetterStaticText( self._files_content_panel, label = ClientData.ConvertZoomToPercentage( current_num_percent ) ), CC.FLAGS_ON_RIGHT )
 QP.AddToLayout( text_table_layout, ClientGUICommon.BetterStaticText( self._files_content_panel, label = HydrusData.ToHumanBytes( size_total ) ), CC.FLAGS_ON_RIGHT )

@@ -3094,7 +3181,25 @@ class ReviewHowBonedAmI( ClientGUIScrolledPanels.ReviewPanel ):

 #
 
-QP.AddToLayout( text_table_layout, ClientGUICommon.BetterStaticText( self._files_content_panel, label = '\u2502\u251cInbox:' ), CC.FLAGS_ON_LEFT )
+QP.AddToLayout( text_table_layout, ClientGUICommon.BetterStaticText( self._files_content_panel, label = 'Deleted:' ), CC.FLAGS_ON_LEFT )
+QP.AddToLayout( text_table_layout, ClientGUICommon.BetterStaticText( self._files_content_panel, label = HydrusData.ToHumanInt( num_deleted ) ), CC.FLAGS_ON_RIGHT )
+QP.AddToLayout( text_table_layout, ClientGUICommon.BetterStaticText( self._files_content_panel, label = ClientData.ConvertZoomToPercentage( deleted_num_percent ) ), CC.FLAGS_ON_RIGHT )
+QP.AddToLayout( text_table_layout, ClientGUICommon.BetterStaticText( self._files_content_panel, label = HydrusData.ToHumanBytes( size_deleted ) ), CC.FLAGS_ON_RIGHT )
+QP.AddToLayout( text_table_layout, ClientGUICommon.BetterStaticText( self._files_content_panel, label = ClientData.ConvertZoomToPercentage( deleted_size_percent ) ), CC.FLAGS_ON_RIGHT )
+QP.AddToLayout( text_table_layout, ClientGUICommon.BetterStaticText( self._files_content_panel, label = HydrusData.ToHumanBytes( deleted_average_filesize ) ), CC.FLAGS_ON_RIGHT )
+
+#
+
+QP.AddToLayout( text_table_layout, ClientGUICommon.BetterStaticText( self._files_content_panel, label = '' ), CC.FLAGS_ON_LEFT )
+QP.AddToLayout( text_table_layout, QW.QWidget( self._files_content_panel ), CC.FLAGS_ON_LEFT )
+QP.AddToLayout( text_table_layout, QW.QWidget( self._files_content_panel ), CC.FLAGS_ON_LEFT )
+QP.AddToLayout( text_table_layout, QW.QWidget( self._files_content_panel ), CC.FLAGS_ON_LEFT )
+QP.AddToLayout( text_table_layout, QW.QWidget( self._files_content_panel ), CC.FLAGS_ON_LEFT )
+QP.AddToLayout( text_table_layout, QW.QWidget( self._files_content_panel ), CC.FLAGS_ON_LEFT )
+
+#
+
+QP.AddToLayout( text_table_layout, ClientGUICommon.BetterStaticText( self._files_content_panel, label = 'Inbox:' ), CC.FLAGS_ON_LEFT )
 QP.AddToLayout( text_table_layout, ClientGUICommon.BetterStaticText( self._files_content_panel, label = HydrusData.ToHumanInt( num_inbox ) ), CC.FLAGS_ON_RIGHT )
 QP.AddToLayout( text_table_layout, ClientGUICommon.BetterStaticText( self._files_content_panel, label = ClientData.ConvertZoomToPercentage( inbox_num_percent ) ), CC.FLAGS_ON_RIGHT )
 QP.AddToLayout( text_table_layout, ClientGUICommon.BetterStaticText( self._files_content_panel, label = HydrusData.ToHumanBytes( size_inbox ) ), CC.FLAGS_ON_RIGHT )
@@ -3103,22 +3208,13 @@ class ReviewHowBonedAmI( ClientGUIScrolledPanels.ReviewPanel ):

 #
 
-QP.AddToLayout( text_table_layout, ClientGUICommon.BetterStaticText( self._files_content_panel, label = '\u2502\u2514Archive:' ), CC.FLAGS_ON_LEFT )
+QP.AddToLayout( text_table_layout, ClientGUICommon.BetterStaticText( self._files_content_panel, label = 'Archive:' ), CC.FLAGS_ON_LEFT )
 QP.AddToLayout( text_table_layout, ClientGUICommon.BetterStaticText( self._files_content_panel, label = HydrusData.ToHumanInt( num_archive ) ), CC.FLAGS_ON_RIGHT )
 QP.AddToLayout( text_table_layout, ClientGUICommon.BetterStaticText( self._files_content_panel, label = ClientData.ConvertZoomToPercentage( archive_num_percent ) ), CC.FLAGS_ON_RIGHT )
 QP.AddToLayout( text_table_layout, ClientGUICommon.BetterStaticText( self._files_content_panel, label = HydrusData.ToHumanBytes( size_archive ) ), CC.FLAGS_ON_RIGHT )
 QP.AddToLayout( text_table_layout, ClientGUICommon.BetterStaticText( self._files_content_panel, label = ClientData.ConvertZoomToPercentage( archive_size_percent ) ), CC.FLAGS_ON_RIGHT )
 QP.AddToLayout( text_table_layout, ClientGUICommon.BetterStaticText( self._files_content_panel, label = HydrusData.ToHumanBytes( archive_average_filesize ) ), CC.FLAGS_ON_RIGHT )
 
-#
-
-QP.AddToLayout( text_table_layout, ClientGUICommon.BetterStaticText( self._files_content_panel, label = '\u2514Deleted:' ), CC.FLAGS_ON_LEFT )
-QP.AddToLayout( text_table_layout, ClientGUICommon.BetterStaticText( self._files_content_panel, label = HydrusData.ToHumanInt( num_deleted ) ), CC.FLAGS_ON_RIGHT )
-QP.AddToLayout( text_table_layout, ClientGUICommon.BetterStaticText( self._files_content_panel, label = ClientData.ConvertZoomToPercentage( deleted_num_percent ) ), CC.FLAGS_ON_RIGHT )
-QP.AddToLayout( text_table_layout, ClientGUICommon.BetterStaticText( self._files_content_panel, label = HydrusData.ToHumanBytes( size_deleted ) ), CC.FLAGS_ON_RIGHT )
-QP.AddToLayout( text_table_layout, ClientGUICommon.BetterStaticText( self._files_content_panel, label = ClientData.ConvertZoomToPercentage( deleted_size_percent ) ), CC.FLAGS_ON_RIGHT )
-QP.AddToLayout( text_table_layout, ClientGUICommon.BetterStaticText( self._files_content_panel, label = HydrusData.ToHumanBytes( deleted_average_filesize ) ), CC.FLAGS_ON_RIGHT )
-
 else:
     
     panel_vbox = QP.VBoxLayout()
@@ -3150,7 +3246,16 @@ class ReviewHowBonedAmI( ClientGUIScrolledPanels.ReviewPanel ):

 #
 
-QP.AddToLayout( text_table_layout, ClientGUICommon.BetterStaticText( self._files_content_panel, label = '\u251cInbox:' ), CC.FLAGS_ON_LEFT )
+QP.AddToLayout( text_table_layout, ClientGUICommon.BetterStaticText( self._files_content_panel, label = '' ), CC.FLAGS_ON_LEFT )
+QP.AddToLayout( text_table_layout, QW.QWidget( self._files_content_panel ), CC.FLAGS_ON_LEFT )
+QP.AddToLayout( text_table_layout, QW.QWidget( self._files_content_panel ), CC.FLAGS_ON_LEFT )
+QP.AddToLayout( text_table_layout, QW.QWidget( self._files_content_panel ), CC.FLAGS_ON_LEFT )
+QP.AddToLayout( text_table_layout, QW.QWidget( self._files_content_panel ), CC.FLAGS_ON_LEFT )
+QP.AddToLayout( text_table_layout, QW.QWidget( self._files_content_panel ), CC.FLAGS_ON_LEFT )
+
+#
+
+QP.AddToLayout( text_table_layout, ClientGUICommon.BetterStaticText( self._files_content_panel, label = 'Inbox:' ), CC.FLAGS_ON_LEFT )
 QP.AddToLayout( text_table_layout, ClientGUICommon.BetterStaticText( self._files_content_panel, label = HydrusData.ToHumanInt( num_inbox ) ), CC.FLAGS_ON_RIGHT )
 QP.AddToLayout( text_table_layout, ClientGUICommon.BetterStaticText( self._files_content_panel, label = ClientData.ConvertZoomToPercentage( inbox_num_percent ) ), CC.FLAGS_ON_RIGHT )
 QP.AddToLayout( text_table_layout, ClientGUICommon.BetterStaticText( self._files_content_panel, label = HydrusData.ToHumanBytes( size_inbox ) ), CC.FLAGS_ON_RIGHT )

@@ -3159,7 +3264,7 @@ class ReviewHowBonedAmI( ClientGUIScrolledPanels.ReviewPanel ):

 #
 
-QP.AddToLayout( text_table_layout, ClientGUICommon.BetterStaticText( self._files_content_panel, label = '\u2514Archive:' ), CC.FLAGS_ON_LEFT )
+QP.AddToLayout( text_table_layout, ClientGUICommon.BetterStaticText( self._files_content_panel, label = 'Archive:' ), CC.FLAGS_ON_LEFT )
 QP.AddToLayout( text_table_layout, ClientGUICommon.BetterStaticText( self._files_content_panel, label = HydrusData.ToHumanInt( num_archive ) ), CC.FLAGS_ON_RIGHT )
 QP.AddToLayout( text_table_layout, ClientGUICommon.BetterStaticText( self._files_content_panel, label = ClientData.ConvertZoomToPercentage( archive_num_percent ) ), CC.FLAGS_ON_RIGHT )
 QP.AddToLayout( text_table_layout, ClientGUICommon.BetterStaticText( self._files_content_panel, label = HydrusData.ToHumanBytes( size_archive ) ), CC.FLAGS_ON_RIGHT )
@@ -1049,6 +1049,7 @@ class COLUMN_LIST_DB_MIGRATION_LOCATIONS( COLUMN_LIST_DEFINITION ):

CURRENT_USAGE = 3
WEIGHT = 4
IDEAL_USAGE = 5
MAX_NUM_BYTES = 6

column_list_type_name_lookup[ COLUMN_LIST_DB_MIGRATION_LOCATIONS.ID ] = 'db file storage locations'

@@ -1058,6 +1059,7 @@ register_column_type( COLUMN_LIST_DB_MIGRATION_LOCATIONS.ID, COLUMN_LIST_DB_MIGR

register_column_type( COLUMN_LIST_DB_MIGRATION_LOCATIONS.ID, COLUMN_LIST_DB_MIGRATION_LOCATIONS.FREE_SPACE, 'disk free space', False, 12, True )
register_column_type( COLUMN_LIST_DB_MIGRATION_LOCATIONS.ID, COLUMN_LIST_DB_MIGRATION_LOCATIONS.CURRENT_USAGE, 'current usage', False, 24, True )
register_column_type( COLUMN_LIST_DB_MIGRATION_LOCATIONS.ID, COLUMN_LIST_DB_MIGRATION_LOCATIONS.WEIGHT, 'weight', False, 8, True )
register_column_type( COLUMN_LIST_DB_MIGRATION_LOCATIONS.ID, COLUMN_LIST_DB_MIGRATION_LOCATIONS.MAX_NUM_BYTES, 'max size', False, 10, True )
register_column_type( COLUMN_LIST_DB_MIGRATION_LOCATIONS.ID, COLUMN_LIST_DB_MIGRATION_LOCATIONS.IDEAL_USAGE, 'ideal usage', False, 24, True )

default_column_list_sort_lookup[ COLUMN_LIST_DB_MIGRATION_LOCATIONS.ID ] = ( COLUMN_LIST_DB_MIGRATION_LOCATIONS.LOCATION, True )

@@ -280,8 +280,9 @@ class BytesControl( QW.QWidget ):

self._unit.addItem( 'B', 1 )
self._unit.addItem( 'KB', 1024 )
self._unit.addItem( 'MB', 1024 * 1024 )
self._unit.addItem( 'GB', 1024 * 1024 * 1024 )
self._unit.addItem( 'MB', 1024 ** 2 )
self._unit.addItem( 'GB', 1024 ** 3 )
self._unit.addItem( 'TB', 1024 ** 4 )

#

@@ -311,7 +312,7 @@ class BytesControl( QW.QWidget ):

def GetSeparatedValue( self ):

return (self._spin.value(), self._unit.GetValue())
return ( self._spin.value(), self._unit.GetValue() )

def GetValue( self ):

@@ -321,12 +322,12 @@ class BytesControl( QW.QWidget ):

def SetSeparatedValue( self, value, unit ):

return (self._spin.setValue( value ), self._unit.SetValue( unit ))
return ( self._spin.setValue( value ), self._unit.SetValue( unit ) )

def SetValue( self, value: int ):

max_unit = 1024 * 1024 * 1024
max_unit = 1024 ** 4

unit = 1

@@ -341,6 +342,7 @@ class BytesControl( QW.QWidget ):

self._unit.SetValue( unit )

class NoneableBytesControl( QW.QWidget ):

valueChanged = QC.Signal()

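The hunk above moves the dialog's byte-unit table from repeated `1024 * 1024` literals to `1024 ** n` powers and adds a `TB` entry. A minimal sketch of that unit logic, assuming 1024-based units; `separate_value` and `UNITS` are illustrative names, not hydrus API:

```python
# Unit table as powers of 1024, largest first, mirroring the new addItem calls.
UNITS = ( ( 'TB', 1024 ** 4 ), ( 'GB', 1024 ** 3 ), ( 'MB', 1024 ** 2 ), ( 'KB', 1024 ), ( 'B', 1 ) )

def separate_value( num_bytes ):
    
    # choose the largest unit that divides the byte count exactly, so a
    # spinbox value and unit pair round-trips without losing precision
    for ( name, size ) in UNITS:
        
        if num_bytes >= size and num_bytes % size == 0:
            
            return ( num_bytes // size, name )
            
        
    
    return ( num_bytes, 'B' )
```

So a 450GB cap stores as `( 450, 'GB' )`, while an uneven byte count falls back to plain bytes rather than a lossy fraction.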
@@ -2232,6 +2232,8 @@ class HydrusResourceClientAPIRestrictedAddTagsGetTagSiblingsParents( HydrusResou

CheckTags( tags )

tags = HydrusTags.CleanTags( tags )

tags_to_service_keys_to_siblings_and_parents = HG.client_controller.Read( 'tag_siblings_and_parents_lookup', tags )

tags_dict = {}

@@ -2819,7 +2821,7 @@ class HydrusResourceClientAPIRestrictedGetFilesGetRenderedFile( HydrusResourceCl

def _threadDoGETJob( self, request: HydrusServerRequest.HydrusRequest ):

try:

media_result: ClientMedia.MediaSingleton

if 'file_id' in request.parsed_request_args:

@@ -2847,34 +2849,34 @@ class HydrusResourceClientAPIRestrictedGetFilesGetRenderedFile( HydrusResourceCl

raise HydrusExceptions.NotFoundException( 'One or more of those file identifiers was missing!' )

if not media_result.IsStaticImage():

raise HydrusExceptions.BadRequestException('Requested file is not an image!')

hash = media_result.GetHash()

if not media_result.IsStaticImage():

raise HydrusExceptions.BadRequestException('Requested file is not an image!')

renderer: ClientRendering.ImageRenderer = HG.client_controller.GetCache( 'images' ).GetImageRenderer( media_result )

while not renderer.IsReady():

if request.disconnected:

return

time.sleep( 0.1 )

numpy_image = renderer.GetNumPyImage()

body = HydrusImageHandling.GeneratePNGBytesNumPy(numpy_image)
body = HydrusImageHandling.GeneratePNGBytesNumPy( numpy_image )

is_attachment = request.parsed_request_args.GetValue( 'download', bool, default_value = False )

response_context = HydrusServerResources.ResponseContext( 200, mime = HC.IMAGE_PNG, body = body, is_attachment = is_attachment, max_age = 86400 * 365 )

return response_context

class HydrusResourceClientAPIRestrictedGetFilesFileHashes( HydrusResourceClientAPIRestrictedGetFiles ):

@@ -3787,7 +3789,25 @@ class HydrusResourceClientAPIRestrictedManageDatabaseMrBones( HydrusResourceClie

def _threadDoGETJob( self, request: HydrusServerRequest.HydrusRequest ):

boned_stats = HG.client_controller.Read( 'boned_stats' )
location_context = ParseLocationContext( request, ClientLocation.LocationContext.STATICCreateSimple( CC.COMBINED_LOCAL_MEDIA_SERVICE_KEY ) )

tag_service_key = ParseTagServiceKey( request )

if tag_service_key == CC.COMBINED_TAG_SERVICE_KEY and location_context.IsAllKnownFiles():

raise HydrusExceptions.BadRequestException( 'Sorry, search for all known tags over all known files is not supported!' )

tag_context = ClientSearch.TagContext( service_key = tag_service_key )
predicates = ParseClientAPISearchPredicates( request )

file_search_context = ClientSearch.FileSearchContext( location_context = location_context, tag_context = tag_context, predicates = predicates )

job_key = ClientThreading.JobKey( cancellable = True )

request.disconnect_callables.append( job_key.Cancel )

boned_stats = HG.client_controller.Read( 'boned_stats', file_search_context = file_search_context, job_key = job_key )

body_dict = { 'boned_stats' : boned_stats }

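The Mr Bones endpoint above now parses search parameters before reading the stats, so a Client API caller can scope the numbers with a tag search. A sketch of how such a caller might build the request path, with the tags passed as a URL-quoted JSON list; `build_mr_bones_path` is an illustrative helper, not part of hydrus:

```python
import json
import urllib.parse

def build_mr_bones_path( tags = None ):
    
    path = '/manage_database/mr_bones'
    
    if tags is not None:
        
        # the Client API takes list parameters as percent-encoded JSON
        path += '?tags={}'.format( urllib.parse.quote( json.dumps( tags ) ) )
        
    
    return path
```

The resulting path would then be requested with the usual `Hydrus-Client-API-Access-Key` header.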
@@ -100,8 +100,8 @@ options = {}

# Misc

NETWORK_VERSION = 20
SOFTWARE_VERSION = 543
CLIENT_API_VERSION = 51
SOFTWARE_VERSION = 544
CLIENT_API_VERSION = 52

SERVER_THUMBNAIL_DIMENSIONS = ( 200, 200 )

@@ -919,7 +919,7 @@ APPLICATIONS_WITH_THUMBNAILS = set( { IMAGE_SVG, APPLICATION_PDF, APPLICATION_FL

MIMES_WITH_THUMBNAILS = set( IMAGES ).union( ANIMATIONS ).union( VIDEO ).union( APPLICATIONS_WITH_THUMBNAILS )

FILES_THAT_CAN_HAVE_ICC_PROFILE = { IMAGE_JPEG, IMAGE_PNG, IMAGE_GIF, IMAGE_TIFF, APPLICATION_PSD }.union( PIL_HEIF_MIMES )
FILES_THAT_CAN_HAVE_ICC_PROFILE = { IMAGE_BMP, IMAGE_JPEG, IMAGE_PNG, IMAGE_GIF, IMAGE_TIFF, APPLICATION_PSD }.union( PIL_HEIF_MIMES )

FILES_THAT_CAN_HAVE_EXIF = { IMAGE_JPEG, IMAGE_TIFF, IMAGE_PNG, IMAGE_WEBP }.union( PIL_HEIF_MIMES )
# images and animations that PIL can handle

@@ -690,25 +690,31 @@ def GetMime( path, ok_to_look_for_hydrus_updates = False ):

return HC.IMAGE_SVG

# it is important this goes at the end, because ffmpeg has a billion false positives!
# it is important this goes at the end, because ffmpeg has a billion false positives! and it takes CPU to true negative
# for instance, it once thought some hydrus update files were mpegs
try:
# it also thinks txt files can be mpegs
likely_to_false_positive = True in ( path.endswith( ext ) for ext in ( '.txt', '.log', '.json' ) )

if not likely_to_false_positive:

mime = HydrusVideoHandling.GetMime( path )

if mime != HC.APPLICATION_UNKNOWN:

try:

return mime

mime = HydrusVideoHandling.GetMime( path )

if mime != HC.APPLICATION_UNKNOWN:

return mime

except HydrusExceptions.UnsupportedFileException:

pass

except Exception as e:

HydrusData.Print( 'FFMPEG had trouble with: ' + path )
HydrusData.PrintException( e, do_wait = False )

except HydrusExceptions.UnsupportedFileException:

pass

except Exception as e:

HydrusData.Print( 'FFMPEG had trouble with: ' + path )
HydrusData.PrintException( e, do_wait = False )

return HC.APPLICATION_UNKNOWN

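The `GetMime` hunk above now skips the expensive ffmpeg sniff for plain-text extensions that are known to false-positive as mpeg. A minimal sketch of that guard, using the same extension list as the diff (`any()` is the idiomatic spelling of the diff's `True in (...)` generator test):

```python
def likely_to_false_positive( path ):
    
    # plain text formats that ffmpeg has been known to misidentify as mpeg
    return any( path.endswith( ext ) for ext in ( '.txt', '.log', '.json' ) )
```

Only paths that fail this check go on to the ffmpeg-based `GetMime` call.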
@@ -3,7 +3,8 @@ import typing

from PIL import Image as PILImage

from hydrus.core import HydrusExceptions, HydrusImageHandling
from hydrus.core import HydrusExceptions
from hydrus.core import HydrusImageHandling

try:

@@ -55,8 +56,9 @@ def GenerateThumbnailBytesFromPSDPath( path: str, target_resolution: typing.Tupl

def GetPSDResolution( path: str ):

if not PSD_TOOLS_OK:

raise HydrusExceptions.UnsupportedFileException( 'psd_tools unavailable' )

return HydrusPSDTools.GetPSDResolution( path )

@@ -75,4 +77,4 @@ def GetPSDResolutionFallback( path: str ):

width: int = struct.unpack( '>L', width_bytes )[0]

return ( width, height )

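The fallback above unpacks the dimensions straight from the file header with `struct`, so it works even when psd_tools is unavailable. A self-contained sketch of that approach, assuming the standard PSD header layout (`8BPS` signature, then big-endian uint32 height and width at byte offsets 14 and 18); the function name mirrors the diff but the body here is illustrative:

```python
import struct

def get_psd_resolution_fallback( path ):
    
    with open( path, 'rb' ) as f:
        
        # the fixed-size PSD header is the first 26 bytes; 22 covers the dimensions
        header = f.read( 22 )
        
    
    if header[ : 4 ] != b'8BPS':
        
        raise ValueError( 'Not a PSD file!' )
        
    
    # height comes before width in the header, both big-endian uint32
    ( height, width ) = struct.unpack( '>LL', header[ 14 : 22 ] )
    
    return ( width, height )
```

Reading 22 bytes instead of parsing layer data keeps the fallback cheap and dependency-free.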
@@ -2,14 +2,11 @@ from PIL import Image as PILImage

from hydrus.core import HydrusExceptions

from psd_tools import PSDImage
from psd_tools.constants import Resource, ColorMode, Resource
from psd_tools.api.numpy_io import has_transparency, get_transparency_index
from psd_tools.api.pil_io import get_pil_mode, get_pil_channels, _create_image

def PSDHasICCProfile( path: str ):

psd = PSDImage.open( path )

@@ -24,11 +21,11 @@ def MergedPILImageFromPSD( path: str ) -> PILImage:

#pil_image = psd.topil( apply_icc = False )

if psd.has_preview():

pil_image = convert_image_data_to_pil(psd)

else:

raise HydrusExceptions.UnsupportedFileException('PSD file has no embedded preview!')

@@ -46,51 +43,69 @@ def MergedPILImageFromPSD( path: str ) -> PILImage:

def GetPSDResolution( path: str ):

psd = PSDImage.open( path )

return ( psd.width, psd.height )

# modified from psd-tools source:
# https://github.com/psd-tools/psd-tools/blob/main/src/psd_tools/api/pil_io.py

def convert_image_data_to_pil(psd: PSDImage):
def convert_image_data_to_pil( psd: PSDImage ):

alpha = None

channel_data = psd._record.image_data.get_data(psd._record.header)
size = (psd.width, psd.height)

channels = [_create_image(size, c, psd.depth) for c in channel_data]

# has_transparency not quite correct
# see https://github.com/psd-tools/psd-tools/issues/369
# and https://github.com/psd-tools/psd-tools/pull/370
no_alpha = psd._record.layer_and_mask_information.layer_info is not None and psd._record.layer_and_mask_information.layer_info.layer_count > 0

if has_transparency(psd) and not no_alpha:

alpha = channels[get_transparency_index(psd)]

if psd.color_mode == ColorMode.INDEXED:

image = channels[0]
image.putpalette(psd._record.color_mode_data.interleave())

elif psd.color_mode == ColorMode.MULTICHANNEL:

image = channels[0] # Multi-channel mode is a collection of alpha.

else:

mode = get_pil_mode(psd.color_mode)
image = PILImage.merge(mode, channels[:get_pil_channels(mode)])

if not image:

return None

return post_process(image, alpha)

def post_process(image, alpha):

# Fix inverted CMYK.
if image.mode == 'CMYK':

from PIL import ImageChops
image = ImageChops.invert(image)

# In Pillow, alpha channel is only available in RGB or L.
if alpha and image.mode in ('RGB', 'L'):

image.putalpha(alpha)

return image

@@ -79,15 +79,13 @@ def GetProcreateResolution( path ):

# canvas is rotated 90 or -90 degrees

height = size[1]

width = size[0]
width = size[1]
height = size[0]

else:

height = size[0]

width = size[1]
width = size[0]
height = size[1]

except:

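The Procreate fix above swaps which stored dimension maps to width and which to height when the canvas is rotated. A sketch of that mapping in isolation; the `size` tuple and `rotated_90` flag are illustrative stand-ins for the values the parser reads from the file:

```python
def oriented_resolution( size, rotated_90 ):
    
    if rotated_90:
        
        # canvas is stored rotated 90 or -90 degrees, so the axes swap
        ( width, height ) = ( size[1], size[0] )
        
    else:
        
        ( width, height ) = ( size[0], size[1] )
        
    
    return ( width, height )
```

The old code assigned height and width the other way around in both branches, which reported portrait canvases as landscape and vice versa.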
@@ -1,3 +1,4 @@

import json
import os
import time

@@ -822,64 +823,79 @@ class HydrusResource( Resource ):

try: self._CleanUpTempFile( request )
except: pass

default_mime = HC.TEXT_HTML
default_encoding = str
error_summary = str( e )

if isinstance( e, HydrusExceptions.BadRequestException ):

response_context = ResponseContext( 400, mime = default_mime, body = default_encoding( e ) )
status_code = 400

elif isinstance( e, ( HydrusExceptions.MissingCredentialsException, HydrusExceptions.DoesNotSupportCORSException ) ):

response_context = ResponseContext( 401, mime = default_mime, body = default_encoding( e ) )
status_code = 401

elif isinstance( e, HydrusExceptions.InsufficientCredentialsException ):

response_context = ResponseContext( 403, mime = default_mime, body = default_encoding( e ) )
status_code = 403

elif isinstance( e, ( HydrusExceptions.NotFoundException, HydrusExceptions.DataMissing, HydrusExceptions.FileMissingException ) ):

response_context = ResponseContext( 404, mime = default_mime, body = default_encoding( e ) )
status_code = 404

elif isinstance( e, HydrusExceptions.NotAcceptable ):

response_context = ResponseContext( 406, mime = default_mime, body = default_encoding( e ) )
status_code = 406

elif isinstance( e, HydrusExceptions.ConflictException ):

response_context = ResponseContext( 409, mime = default_mime, body = default_encoding( e ) )
status_code = 409

elif isinstance( e, HydrusExceptions.RangeNotSatisfiableException ):

response_context = ResponseContext( 416, mime = default_mime, body = default_encoding( e ) )
status_code = 416

elif isinstance( e, HydrusExceptions.SessionException ):

response_context = ResponseContext( 419, mime = default_mime, body = default_encoding( e ) )
status_code = 419

elif isinstance( e, HydrusExceptions.NetworkVersionException ):

response_context = ResponseContext( 426, mime = default_mime, body = default_encoding( e ) )
status_code = 426

elif isinstance( e, ( HydrusExceptions.ServerBusyException, HydrusExceptions.ShutdownException ) ):

response_context = ResponseContext( 503, mime = default_mime, body = default_encoding( e ) )
status_code = 503

elif isinstance( e, HydrusExceptions.BandwidthException ):

response_context = ResponseContext( 509, mime = default_mime, body = default_encoding( e ) )
status_code = 509

elif isinstance( e, HydrusExceptions.ServerException ):

response_context = ResponseContext( 500, mime = default_mime, body = default_encoding( e ) )
status_code = 500

else:

status_code = 500

HydrusData.DebugPrint( failure.getTraceback() )

response_context = ResponseContext( 500, mime = default_mime, body = default_encoding( 'The repository encountered an error it could not handle! Here is a dump of what happened, which will also be written to your client.log file. If it persists, please forward it to hydrus.admin@gmail.com:' + os.linesep * 2 + failure.getTraceback() ) )
error_summary = f'The "{self._service.GetName()}" encountered an error it could not handle!\n\nHere is a dump of what happened, which will also be written to your log. If it persists, please forward it to hydrus.admin@gmail.com:\n\n' + failure.getTraceback()

# TODO: maybe pull the cbor stuff down to hydrus core here and respond with Dumps( blah, requested_mime ) instead

default_mime = HC.APPLICATION_JSON

body_dict = {
'error' : error_summary,
'exception_type' : str( type( e ).__name__ ),
'status_code' : status_code
}

body = json.dumps( body_dict )

response_context = ResponseContext( status_code, mime = default_mime, body = body )

request.hydrus_response_context = response_context

self._callbackRenderResponseContext( request )

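The server hunk above collapses a dozen per-exception `ResponseContext` constructions into one pass: map the exception to a status code, then emit a single JSON body with `error`, `exception_type`, and `status_code` keys. A sketch of that shape, using stdlib exception classes as stand-ins for the hydrus exception types:

```python
import json

# stand-ins for the hydrus exception-to-status mapping in the diff
STATUS_CODES = {
    ValueError : 400,
    KeyError : 404,
}

def error_body( e ):
    
    # unmapped exceptions fall through to a generic 500, as in the diff's else branch
    status_code = STATUS_CODES.get( type( e ), 500 )
    
    body_dict = {
        'error' : str( e ),
        'exception_type' : str( type( e ).__name__ ),
        'status_code' : status_code
    }
    
    return json.dumps( body_dict )
```

Clients then get a machine-readable error instead of the old HTML page, with the status code duplicated in the body for convenience.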
@@ -3478,6 +3478,40 @@ class TestClientAPI( unittest.TestCase ):

self.assertEqual( boned_stats, dict( expected_data ) )

[ ( args, kwargs ) ] = HG.test_controller.GetRead( 'boned_stats' )

file_search_context = kwargs[ 'file_search_context' ]

self.assertEqual( len( file_search_context.GetPredicates() ), 0 )

#

HG.test_controller.SetRead( 'boned_stats', expected_data )

path = '/manage_database/mr_bones?tags={}'.format( urllib.parse.quote( json.dumps( [ 'skirt', 'blue_eyes' ] ) ) )

connection.request( 'GET', path, headers = headers )

response = connection.getresponse()

data = response.read()

text = str( data, 'utf-8' )

self.assertEqual( response.status, 200 )

d = json.loads( text )

boned_stats = d[ 'boned_stats' ]

self.assertEqual( boned_stats, dict( expected_data ) )

[ ( args, kwargs ) ] = HG.test_controller.GetRead( 'boned_stats' )

file_search_context = kwargs[ 'file_search_context' ]

self.assertEqual( len( file_search_context.GetPredicates() ), 2 )

def _test_manage_duplicates( self, connection, set_up_permissions ):

@@ -1582,6 +1582,82 @@ class TestClientDB( unittest.TestCase ):

self.assertEqual( mr_num_words, None )

def test_mr_bones( self ):

TestClientDB._clear_db()

test_files = []

test_files.append( 'muh_swf.swf' )
test_files.append( 'muh_mp4.mp4' )

file_import_options = FileImportOptions.FileImportOptions()
file_import_options.SetIsDefault( True )

for filename in test_files:

HG.test_controller.SetRead( 'hash_status', ClientImportFiles.FileImportStatus.STATICGetUnknownStatus() )

path = os.path.join( HC.STATIC_DIR, 'testing', filename )

file_import_job = ClientImportFiles.FileImportJob( path, file_import_options )

file_import_job.GeneratePreImportHashAndStatus()

file_import_job.GenerateInfo()

file_import_status = self._write( 'import_file', file_import_job )

swf_hash_hex = 'edfef9905fdecde38e0752a5b6ab7b6df887c3968d4246adc9cffc997e168cdf'

media_result = self._read( 'media_result', bytes.fromhex( swf_hash_hex ) )

earliest_import_time = media_result.GetLocationsManager().GetTimestampsManager().GetImportedTimestamp( CC.COMBINED_LOCAL_MEDIA_SERVICE_KEY )

result = self._read( 'boned_stats' )

expected_result = {
'earliest_import_time': earliest_import_time,
'num_archive': 0,
'num_deleted': 0,
'num_inbox': 2,
'size_archive': 0,
'size_deleted': 0,
'size_inbox': 1027308,
'total_alternate_files': 0,
'total_duplicate_files': 0,
'total_viewtime': (0, 0, 0, 0)
}

self.assertEqual( result, expected_result )

#

location_context = ClientLocation.LocationContext.STATICCreateSimple( CC.COMBINED_LOCAL_MEDIA_SERVICE_KEY )

predicates = [ ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_SYSTEM_MIME, value = ( HC.APPLICATION_FLASH, ) ) ]

file_search_context = ClientSearch.FileSearchContext( location_context = location_context, predicates = predicates )

result = self._read( 'boned_stats', file_search_context = file_search_context )

expected_result = {
'earliest_import_time': earliest_import_time,
'num_archive': 0,
'num_deleted': 0,
'num_inbox': 1,
'size_archive': 0,
'size_deleted': 0,
'size_inbox': 456774,
'total_alternate_files': 0,
'total_duplicate_files': 0,
'total_viewtime': (0, 0, 0, 0)
}

self.assertEqual( result, expected_result )

def test_nums_pending( self ):

TestClientDB._clear_db()

@@ -12,7 +12,7 @@ lxml>=4.5.0

lz4>=3.0.0
numpy>=1.16.0
psd-tools>=1.9.28
Pillow>=9.5.0
Pillow>=10.0.1
pillow-heif>=0.12.0
psutil>=5.0.0
pyOpenSSL>=19.1.0

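The requirements bump above pins Pillow at 10.0.1 or later, the release that bundles the patched libwebp. After rebuilding a venv, the installed version (e.g. `PIL.__version__`) can be checked against that floor with a simple tuple comparison; this naive sketch assumes plain `major.minor.patch` strings and is not a full PEP 440 comparison:

```python
def version_at_least( version, minimum ):
    
    # compare numeric components positionally: (10, 1, 0) >= (10, 0, 1)
    def parts( v ):
        
        return tuple( int( p ) for p in v.split( '.' ) )
        
    
    return parts( version ) >= parts( minimum )
```

For anything fancier (pre-release suffixes and so on), a real version parser is the safer tool.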
@@ -2,6 +2,31 @@

pushd "%~dp0"

ECHO r::::::::::::::::::::::::::::::::::r
ECHO :                                  :
ECHO :               :PP.               :
ECHO :              vBBr                :
ECHO :             7BB:                 :
ECHO :            rBB:                  :
ECHO :  :DQRE:   rBB:     :gMBb:        :
ECHO :  :BBBi   rBB:       7BBB.        :
ECHO :   KBB:  rBB:       rBBI          :
ECHO :    qBB: rBB:      rQBU           :
ECHO :     qBB:rBB:     iBBS            :
ECHO :      qBBBB:     iBBj             :
ECHO :       zBBBBJ:  iBB2              :
ECHO :        :7BBBBQgBBB:              :
ECHO :          .PBBBBBQ:               :
ECHO :          7BBBBBBBDi              :
ECHO :        :RBB. :BBBBBQ:            :
ECHO :       :QBB:    :BBBBBi           :
ECHO :      :BBB:       :BBBB:          :
ECHO :                                  :
ECHO r::::::::::::::::::::::::::::::::::r
ECHO:
ECHO   hydrus
ECHO:

where /q python
IF ERRORLEVEL 1 (

@@ -29,8 +54,8 @@ IF EXIST "venv\" (

:questions

ECHO:
ECHO Users on older Windows need the advanced install.
ECHO --------
ECHO Users on older Windows or Python ^>=3.11 need the advanced install.
ECHO:
ECHO Your Python version is:
python --version

@@ -44,10 +69,15 @@ goto :parse_fail

:question_qt

ECHO --------
ECHO We are now going to choose which versions of some larger libraries we are going to use. If something doesn't install, or hydrus won't boot, just run this script again and it will delete everything and start over.
ECHO:
ECHO Qt is the User Interface library. We are now on Qt6.
ECHO If you are on Windows ^<=8.1, choose 5.
ECHO If you have multi-monitor menu position bugs with the normal Qt6, try the (o)lder build on Python ^<=3.10 or (m)iddle on Python ^>=3.11.

ECHO Qt - User Interface
ECHO:
ECHO Most people want "6".
ECHO If you are on Windows ^<=8.1, choose "5".
ECHO If you have multi-monitor menu position bugs with the normal Qt6, try "o" on Python ^<=3.10 or "m" on Python ^>=3.11.
SET /P qt="Do you want Qt(5), Qt(6), Qt6 (o)lder, Qt6 (m)iddle or (t)est? "

IF "%qt%" == "5" goto :question_mpv

@@ -59,21 +89,39 @@ goto :parse_fail

:question_mpv

ECHO --------
ECHO mpv - audio and video playback
ECHO:
ECHO mpv is the main way to play audio and video. We need to tell hydrus how to talk to your mpv dll.
ECHO Try the n first. If it doesn't work, fall back to o.
ECHO We need to tell hydrus how to talk to your mpv dll.
ECHO Most people want "n".
ECHO If it doesn't work, fall back to "o".
SET /P mpv="Do you want (o)ld mpv, (n)ew mpv, or (t)est mpv? "

IF "%mpv%" == "o" goto :question_opencv
IF "%mpv%" == "n" goto :question_opencv
IF "%mpv%" == "t" goto :question_opencv
IF "%mpv%" == "o" goto :question_pillow
IF "%mpv%" == "n" goto :question_pillow
IF "%mpv%" == "t" goto :question_pillow
goto :parse_fail

:question_pillow

ECHO --------
ECHO Pillow - Images
ECHO:
ECHO Most people want "n".
ECHO If you are Python 3.7 or earlier, choose "o".
SET /P pillow="Do you want (o)ld pillow or (n)ew pillow? "

IF "%pillow%" == "o" goto :question_opencv
IF "%pillow%" == "n" goto :question_opencv
goto :parse_fail

:question_opencv

ECHO --------
ECHO OpenCV - Images
ECHO:
ECHO OpenCV is the main image processing library.
ECHO Try the n first. If it doesn't work, fall back to o. Very new python versions might need t.
ECHO Most people want "n".
ECHO If it doesn't work, fall back to "o". Python ^>=3.11 might need "t".
SET /P opencv="Do you want (o)ld OpenCV, (n)ew OpenCV, or (t)est OpenCV? "

IF "%opencv%" == "o" goto :create

@@ -83,6 +131,7 @@ goto :parse_fail

:create

ECHO --------
echo Creating new venv...

python -m venv venv

@@ -118,6 +167,7 @@ IF "%install_type%" == "d" (

python -m pip install pyside2
python -m pip install PyQtChart PyQt5
python -m pip install PyQt6-Charts PyQt6
python -m pip install -r static\requirements\advanced\requirements_pillow_new.txt
python -m pip install -r static\requirements\advanced\requirements_mpv_test.txt
python -m pip install -r static\requirements\advanced\requirements_opencv_test.txt
python -m pip install -r static\requirements\hydev\requirements_windows_build.txt

@@ -135,6 +185,9 @@ IF "%install_type%" == "a" (

IF "%qt%" == "m" python -m pip install -r static\requirements\advanced\requirements_qt6_middle.txt
IF "%qt%" == "t" python -m pip install -r static\requirements\advanced\requirements_qt6_test.txt

IF "%pillow%" == "o" python -m pip install -r static\requirements\advanced\requirements_pillow_old.txt
IF "%pillow%" == "n" python -m pip install -r static\requirements\advanced\requirements_pillow_new.txt

IF "%mpv%" == "o" python -m pip install -r static\requirements\advanced\requirements_mpv_old.txt
IF "%mpv%" == "n" python -m pip install -r static\requirements\advanced\requirements_mpv_new.txt
IF "%mpv%" == "t" python -m pip install -r static\requirements\advanced\requirements_mpv_test.txt

@@ -147,6 +200,7 @@ IF "%install_type%" == "a" (

CALL venv\Scripts\deactivate.bat

ECHO --------
SET /P done="Done!"

popd

@@ -155,6 +209,7 @@ EXIT /B 0

:parse_fail

ECHO --------
SET /P done="Sorry, did not understand that input!"

popd

@ -2,24 +2,50 @@
|
|||
|
||||
pushd "$(dirname "$0")" || exit 1
|
||||
|
||||
echo " r::::::::::::::::::::::::::::::::::r"
|
||||
echo " : :"
|
||||
echo " : :PP. :"
|
||||
echo " : vBBr :"
|
||||
echo " : 7BB: :"
|
||||
echo " : rBB: :"
|
||||
echo " : :DQRE: rBB: :gMBb: :"
|
||||
echo " : :BBBi rBB: 7BBB. :"
|
||||
echo " : KBB: rBB: rBBI :"
|
||||
echo " : qBB: rBB: rQBU :"
|
||||
echo " : qBB: rBB: iBBS :"
|
||||
echo " : qBB: iBB: 7BBj :"
|
||||
echo " : iBBY iBB. 2BB. :"
|
||||
echo " : SBQq iBQ: EBBY :"
|
||||
echo " : :MQBZMBBDRBBP. :"
|
||||
echo " : .YBB7 :"
|
||||
echo " : :BB. :"
|
||||
echo " : 7BBi :"
|
||||
echo " : rBB: :"
|
||||
echo " : :"
|
||||
echo " r::::::::::::::::::::::::::::::::::r"
|
||||
echo
|
||||
echo " hydrus"
|
||||
echo
|
||||
|
||||
py_command=python3
|
||||
|
||||
if ! type -P $py_command >/dev/null 2>&1; then
|
||||
echo "No python3 found, using python."
|
||||
py_command=python
|
||||
echo "No \"python3\" found, using \"python\"."
|
||||
py_command=python
|
||||
fi
|
||||
|
||||
if [ -d "venv" ]; then
|
||||
echo "Virtual environment will be reinstalled. Hit Enter to start."
|
||||
read -r
|
||||
echo "Deleting old venv..."
|
||||
rm -rf venv
|
||||
echo "Virtual environment will be reinstalled. Hit Enter to start."
|
||||
read -r
|
||||
echo "Deleting old venv..."
|
||||
rm -rf venv
|
||||
else
|
||||
echo "If you do not know what this is, check the 'running from source' help. Hit Enter to start."
|
||||
read -r
|
||||
echo "If you do not know what this is, check the 'running from source' help. Hit Enter to start."
|
||||
read -r
|
||||
fi
|
||||
|
||||
echo "If your macOS is slightly old, do the advanced install. Let hydev know what works for you."
|
||||
echo "--------"
|
||||
echo "If your macOS is old, or you are on >=Python 3.11, do the advanced install. Let hydev know what works for you."
|
||||
echo
|
||||
echo "Your Python version is:"
|
||||
$py_command --version
|
||||
|
@@ -29,68 +55,93 @@ echo "Do you want the (s)imple or (a)dvanced install? "
 read -r install_type

 if [ "$install_type" = "s" ]; then
-:
+:
 elif [ "$install_type" = "a" ]; then
-echo
-echo "Qt is the User Interface library. We are now on Qt6."
-echo "If you are <= 10.13 (High Sierra), choose 5."
-echo "If you are <=10.15 (Catalina) or otherwise have trouble with the normal Qt6, try the (o)lder build on Python ^<=3.10 or (m)iddle on Python ^>=3.11."
+echo "--------"
+echo "We are now going to choose which versions of some larger libraries we are going to use. If something doesn't install, or hydrus won't boot, just run this script again and it will delete everything and start over."
+echo
+echo "Qt - User Interface"
+echo "Most people want \"6\"."
+echo "If you are <= 10.13 (High Sierra), choose \"5\"."
+echo "If you are <=10.15 (Catalina) or otherwise have trouble with the normal Qt6, try \"o\" on Python <=3.10 or \"m\" on Python >=3.11."
 echo "Do you want Qt(5), Qt(6), Qt6 (o)lder, Qt6 (m)iddle or (t)est? "
-read -r qt
-if [ "$qt" = "5" ]; then
-:
-elif [ "$qt" = "6" ]; then
-:
-elif [ "$qt" = "o" ]; then
-:
-elif [ "$qt" = "m" ]; then
-:
-elif [ "$qt" = "t" ]; then
-:
-else
-echo "Sorry, did not understand that input!"
-popd || exit 1
-exit 1
-fi
-
-echo
-echo "mpv is broken on macOS. As a safe default, choose n."
-echo "Do you want (o)ld mpv, (n)ew mpv, or (t)est mpv? "
-read -r mpv
-if [ "$mpv" = "o" ]; then
-:
-elif [ "$mpv" = "n" ]; then
-:
-elif [ "$mpv" = "t" ]; then
-:
-else
-echo "Sorry, did not understand that input!"
-popd || exit 1
-exit 1
-fi
-
-echo
-echo "OpenCV is the main image processing library."
-echo "Try the n first. If it doesn't work, fall back to o. Very new python versions might need t."
-echo "Do you want (o)ld OpenCV, (n)ew OpenCV, or (t)est OpenCV? "
-read -r opencv
-if [ "$opencv" = "o" ]; then
-:
-elif [ "$opencv" = "n" ]; then
-:
-elif [ "$opencv" = "t" ]; then
-:
-else
-echo "Sorry, did not understand that input!"
-popd || exit 1
-exit 1
-fi
+read -r qt
+if [ "$qt" = "5" ]; then
+:
+elif [ "$qt" = "6" ]; then
+:
+elif [ "$qt" = "o" ]; then
+:
+elif [ "$qt" = "m" ]; then
+:
+elif [ "$qt" = "t" ]; then
+:
+else
+echo "Sorry, did not understand that input!"
+popd || exit 1
+exit 1
+fi
+
+echo "--------"
+echo "mpv - audio and video playback"
+echo
+echo "mpv is broken on macOS. As a safe default, choose \"n\"."
+echo "Do you want (o)ld mpv, (n)ew mpv, or (t)est mpv? "
+read -r mpv
+if [ "$mpv" = "o" ]; then
+:
+elif [ "$mpv" = "n" ]; then
+:
+elif [ "$mpv" = "t" ]; then
+:
+else
+echo "Sorry, did not understand that input!"
+popd || exit 1
+exit 1
+fi
+
+echo "--------"
+echo "Pillow - Images"
+echo
+echo "Most people want \"n\"."
+echo "If you are Python 3.7 or earlier, choose \"o\""
+echo "Do you want (o)ld pillow or (n)ew pillow? "
+read -r pillow
+if [ "$pillow" = "o" ]; then
+:
+elif [ "$pillow" = "n" ]; then
+:
+else
+echo "Sorry, did not understand that input!"
+popd || exit 1
+exit 1
+fi
+
+echo "--------"
+echo "OpenCV - Images"
+echo
+echo "Most people want \"n\"."
+echo "If it doesn't work, fall back to \"o\". Python >=3.11 might need \"t\"."
+echo "Do you want (o)ld OpenCV, (n)ew OpenCV, or (t)est OpenCV? "
+read -r opencv
+if [ "$opencv" = "o" ]; then
+:
+elif [ "$opencv" = "n" ]; then
+:
+elif [ "$opencv" = "t" ]; then
+:
+else
+echo "Sorry, did not understand that input!"
+popd || exit 1
+exit 1
+fi
 else
-echo "Sorry, did not understand that input!"
-popd || exit 1
-exit 1
+echo "Sorry, did not understand that input!"
+popd || exit 1
+exit 1
 fi

+echo "--------"
 echo "Creating new venv..."
 $py_command -m venv venv

@@ -98,8 +149,8 @@ source venv/bin/activate

 if ! source venv/bin/activate; then
 echo "The venv failed to activate, stopping now!"
-popd || exit 1
-exit 1
+popd || exit 1
+exit 1
 fi

 python -m pip install --upgrade pip

@@ -107,41 +158,48 @@ python -m pip install --upgrade pip
 python -m pip install --upgrade wheel

 if [ "$install_type" = "s" ]; then
-python -m pip install -r requirements.txt
+python -m pip install -r requirements.txt
 elif [ "$install_type" = "a" ]; then
-python -m pip install -r static/requirements/advanced/requirements_core.txt
-
-if [ "$qt" = "5" ]; then
-python -m pip install -r static/requirements/advanced/requirements_qt5.txt
-elif [ "$qt" = "6" ]; then
-python -m pip install -r static/requirements/advanced/requirements_qt6.txt
-elif [ "$qt" = "o" ]; then
-python -m pip install -r static/requirements/advanced/requirements_qt6_older.txt
-elif [ "$qt" = "m" ]; then
-python -m pip install -r static/requirements/advanced/requirements_qt6_middle.txt
-elif [ "$qt" = "t" ]; then
-python -m pip install -r static/requirements/advanced/requirements_qt6_test.txt
-fi
-
-if [ "$mpv" = "o" ]; then
-python -m pip install -r static/requirements/advanced/requirements_mpv_old.txt
-elif [ "$mpv" = "n" ]; then
-python -m pip install -r static/requirements/advanced/requirements_mpv_new.txt
-elif [ "$mpv" = "t" ]; then
-python -m pip install -r static/requirements/advanced/requirements_mpv_test.txt
-fi
-
-if [ "$opencv" = "o" ]; then
-python -m pip install -r static/requirements/advanced/requirements_opencv_old.txt
-elif [ "$opencv" = "n" ]; then
-python -m pip install -r static/requirements/advanced/requirements_opencv_new.txt
-elif [ "$opencv" = "t" ]; then
-python -m pip install -r static/requirements/advanced/requirements_opencv_test.txt
-fi
+python -m pip install -r static/requirements/advanced/requirements_core.txt
+
+if [ "$qt" = "5" ]; then
+python -m pip install -r static/requirements/advanced/requirements_qt5.txt
+elif [ "$qt" = "6" ]; then
+python -m pip install -r static/requirements/advanced/requirements_qt6.txt
+elif [ "$qt" = "o" ]; then
+python -m pip install -r static/requirements/advanced/requirements_qt6_older.txt
+elif [ "$qt" = "m" ]; then
+python -m pip install -r static/requirements/advanced/requirements_qt6_middle.txt
+elif [ "$qt" = "t" ]; then
+python -m pip install -r static/requirements/advanced/requirements_qt6_test.txt
+fi
+
+if [ "$mpv" = "o" ]; then
+python -m pip install -r static/requirements/advanced/requirements_mpv_old.txt
+elif [ "$mpv" = "n" ]; then
+python -m pip install -r static/requirements/advanced/requirements_mpv_new.txt
+elif [ "$mpv" = "t" ]; then
+python -m pip install -r static/requirements/advanced/requirements_mpv_test.txt
+fi
+
+if [ "$pillow" = "o" ]; then
+python -m pip install -r static/requirements/advanced/requirements_pillow_old.txt
+elif [ "$pillow" = "n" ]; then
+python -m pip install -r static/requirements/advanced/requirements_pillow_new.txt
+fi
+
+if [ "$opencv" = "o" ]; then
+python -m pip install -r static/requirements/advanced/requirements_opencv_old.txt
+elif [ "$opencv" = "n" ]; then
+python -m pip install -r static/requirements/advanced/requirements_opencv_new.txt
+elif [ "$opencv" = "t" ]; then
+python -m pip install -r static/requirements/advanced/requirements_opencv_test.txt
+fi
 fi

 deactivate

+echo "--------"
 echo "Done!"

 read -r

@@ -2,10 +2,35 @@

 pushd "$(dirname "$0")" || exit 1

+echo " r::::::::::::::::::::::::::::::::::r"
+echo " : :"
+echo " : :PP. :"
+echo " : vBBr :"
+echo " : 7BB: :"
+echo " : rBB: :"
+echo " : :DQRE: rBB: :gMBb: :"
+echo " : :BBBi rBB: 7BBB. :"
+echo " : KBB: rBB: rBBI :"
+echo " : qBB: rBB: rQBU :"
+echo " : qBB: rBB: iBBS :"
+echo " : qBB: iBB: 7BBj :"
+echo " : iBBY iBB. 2BB. :"
+echo " : SBQq iBQ: EBBY :"
+echo " : :MQBZMBBDRBBP. :"
+echo " : .YBB7 :"
+echo " : :BB. :"
+echo " : 7BBi :"
+echo " : rBB: :"
+echo " : :"
+echo " r::::::::::::::::::::::::::::::::::r"
+echo
+echo " hydrus"
+echo

 py_command=python3

 if ! type -P $py_command >/dev/null 2>&1; then
-echo "No python3 found, using python."
+echo "No \"python3\" found, using \"python\"."
 py_command=python
 fi

@@ -19,7 +44,8 @@ else
 read -r
 fi

-echo "Users on older OSes need the advanced install."
+echo "--------"
+echo "Users on older OSes or Python >=3.11 need the advanced install."
 echo
 echo "Your Python version is:"
 $py_command --version

@@ -31,10 +57,13 @@ read -r install_type
 if [ "$install_type" = "s" ]; then
 :
 elif [ "$install_type" = "a" ]; then
+echo "--------"
+echo "We are now going to choose which versions of some larger libraries we are going to use. If something doesn't install, or hydrus won't boot, just run this script again and it will delete everything and start over."
 echo
-echo "Qt is the User Interface library. We are now on Qt6."
-echo "If you are <=Ubuntu 18.04 or equivalent, choose 5."
-echo "If you cannot boot with the normal Qt6, try the (o)lder build on Python ^<=3.10 or (m)iddle on Python ^>=3.11."
+echo "Qt - User Interface"
+echo "Most people want \"6\"."
+echo "If you are <=Ubuntu 18.04 or equivalent, choose \"5\"."
+echo "If you cannot boot with the normal Qt6, try \"o\" on Python ^<=3.10 or \"m\" on Python ^>=3.11."
 echo "Do you want Qt(5), Qt(6), Qt6 (o)lder, Qt6 (m)iddle or (t)est? "
 read -r qt
 if [ "$qt" = "5" ]; then
@@ -52,9 +81,12 @@ elif [ "$install_type" = "a" ]; then
 exit 1
 fi

+echo "--------"
+echo "mpv - audio and video playback"
 echo
-echo "mpv is the main way to play audio and video. We need to tell hydrus how to talk to your existing mpv install."
-echo "Try the n first. If it doesn't work, fall back to o."
+echo "We need to tell hydrus how to talk to your existing mpv install."
+echo "Most people want \"n\"."
+echo "If it doesn't work, fall back to \"o\"."
 echo "Do you want (o)ld mpv, (n)ew mpv, or (t)est mpv? "
 read -r mpv
 if [ "$mpv" = "o" ]; then
@@ -69,9 +101,28 @@ elif [ "$install_type" = "a" ]; then
 exit 1
 fi

+echo "--------"
+echo "Pillow - Images"
 echo
-echo "OpenCV is the main image processing library."
-echo "Try the n first. If it doesn't work, fall back to o. Very new python versions might need t."
+echo "Most people want \"n\"."
+echo "If you are Python 3.7 or earlier, choose \"o\""
+echo "Do you want (o)ld pillow or (n)ew pillow? "
+read -r pillow
+if [ "$pillow" = "o" ]; then
+:
+elif [ "$pillow" = "n" ]; then
+:
+else
+echo "Sorry, did not understand that input!"
+popd || exit 1
+exit 1
+fi
+
+echo "--------"
+echo "OpenCV - Images"
+echo
+echo "Most people want \"n\"."
+echo "If it doesn't work, fall back to \"o\". Python >=3.11 might need \"t\"."
 echo "Do you want (o)ld OpenCV, (n)ew OpenCV, or (t)est OpenCV? "
 read -r opencv
 if [ "$opencv" = "o" ]; then
@@ -91,6 +142,7 @@ else
 exit 1
 fi

+echo "--------"
 echo "Creating new venv..."
 $py_command -m venv venv

@@ -129,6 +181,12 @@ elif [ "$install_type" = "a" ]; then
 python -m pip install -r static/requirements/advanced/requirements_mpv_test.txt
 fi

+if [ "$pillow" = "o" ]; then
+python -m pip install -r static/requirements/advanced/requirements_pillow_old.txt
+elif [ "$pillow" = "n" ]; then
+python -m pip install -r static/requirements/advanced/requirements_pillow_new.txt
+fi
+
 if [ "$opencv" = "o" ]; then
 python -m pip install -r static/requirements/advanced/requirements_opencv_old.txt
 elif [ "$opencv" = "n" ]; then
@@ -140,6 +198,7 @@ fi

 deactivate

+echo "--------"
 echo "Done!"

 popd || exit

@@ -11,7 +11,7 @@ html5lib>=1.0.1
 lxml>=4.5.0
 lz4>=3.0.0
 numpy>=1.16.0
-Pillow>=9.5.0
+Pillow>=10.0.1
 pillow-heif>=0.12.0
 psd-tools>=1.9.28
 psutil>=5.0.0

@@ -11,7 +11,7 @@ html5lib>=1.0.1
 lxml>=4.5.0
 lz4>=3.0.0
 numpy>=1.16.0
-Pillow>=9.5.0
+Pillow>=10.0.1
 pillow-heif>=0.12.0
 psd-tools>=1.9.28
 psutil>=5.0.0

@@ -11,7 +11,7 @@ html5lib>=1.0.1
 lxml>=4.5.0
 lz4>=3.0.0
 numpy>=1.16.0
-Pillow>=9.5.0
+Pillow>=10.0.1
 pillow-heif>=0.12.0
 psd-tools>=1.9.28
 psutil>=5.0.0

(binary image changes: one image removed at 3.2 KiB; one changed 3.0 KiB → 2.9 KiB; one changed 2.5 KiB → 2.6 KiB)
@@ -11,8 +11,6 @@ html5lib>=1.0.1
 lxml>=4.5.0
 lz4>=3.0.0
 numpy>=1.16.0
-Pillow>=9.5.0
-pillow-heif>=0.12.0
 psd-tools>=1.9.28
 psutil>=5.0.0
 pyOpenSSL>=19.1.0

@@ -0,0 +1,2 @@
+Pillow>=10.0.1
+pillow-heif>=0.12.0
@@ -0,0 +1,2 @@
+Pillow>=9.5.0
+pillow-heif>=0.12.0
@@ -4,7 +4,7 @@ cloudscraper>=1.2.33
 html5lib>=1.0.1
 lz4>=3.0.0
 numpy>=1.16.0
-Pillow>=9.5.0
+Pillow>=10.0.1
 pillow-heif>=0.12.0
 psutil>=5.0.0
 pyOpenSSL>=19.1.0
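The Pillow pin bumps above all hinge on the same constraint from the changelog: Pillow 10.0.1 (which carries the libwebp fix) needs Python >= 3.8, so older interpreters must stay on the 9.x pin via `requirements_pillow_old.txt`. The setup scripts ask the user which to use; as a sketch only, a script could also pick the file automatically from the interpreter version. The `choose_pillow_requirements` helper below is hypothetical (not part of the repo), though the file paths mirror the ones in the diff, and it assumes the interpreter passed in is some Python 3:

```shell
#!/bin/sh
# Hypothetical helper: pick the Pillow requirements file based on the
# given interpreter's minor version. Pillow 10.0.1 (the libwebp CVE fix)
# needs Python >= 3.8; anything older must keep the Pillow 9.x pin.
choose_pillow_requirements() {
    py="$1"
    # ask the interpreter for its minor version, e.g. 11 for Python 3.11
    minor="$("$py" -c 'import sys; print(sys.version_info[1])')"
    if [ "$minor" -ge 8 ]; then
        echo "static/requirements/advanced/requirements_pillow_new.txt"
    else
        echo "static/requirements/advanced/requirements_pillow_old.txt"
    fi
}

# prints the path a subsequent "python -m pip install -r" call would use
choose_pillow_requirements python3
```

This keeps the interactive prompt as the fallback for unusual setups while sparing most users one question.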