diff --git a/.github/workflows/release.yml b/.github/workflows/release.yml
index ae503b46..11f3876e 100644
--- a/.github/workflows/release.yml
+++ b/.github/workflows/release.yml
@@ -326,14 +326,14 @@ jobs:
uses: carlosperate/download-file-action@v1.0.3
id: download_mpv
with:
- file-url: 'https://sourceforge.net/projects/mpv-player-windows/files/libmpv/mpv-dev-x86_64-20210228-git-d1be8bb.7z'
+ file-url: 'https://sourceforge.net/projects/mpv-player-windows/files/libmpv/mpv-dev-x86_64-v3-20220829-git-211ce69.7z'
file-name: 'mpv-dev-x86_64.7z'
location: '.'
-
name: Process mpv-dev
run: |
7z x ${{ steps.download_mpv.outputs.file-path }}
- move mpv-1.dll hydrus\
+ move mpv-2.dll hydrus\
-
name: Build Hydrus
run: |
@@ -413,14 +413,14 @@ jobs:
uses: carlosperate/download-file-action@v1.0.3
id: download_mpv
with:
- file-url: 'https://sourceforge.net/projects/mpv-player-windows/files/libmpv/mpv-dev-x86_64-20210228-git-d1be8bb.7z'
+ file-url: 'https://sourceforge.net/projects/mpv-player-windows/files/libmpv/mpv-dev-x86_64-v3-20220829-git-211ce69.7z'
file-name: 'mpv-dev-x86_64.7z'
location: '.'
-
name: Process mpv-dev
run: |
7z x ${{ steps.download_mpv.outputs.file-path }}
- move mpv-1.dll hydrus\
+ move mpv-2.dll hydrus\
-
name: Build Hydrus
run: |
diff --git a/docs/about_docs.md b/docs/about_docs.md
index f069fd36..cfefc8a0 100644
--- a/docs/about_docs.md
+++ b/docs/about_docs.md
@@ -1,3 +1,7 @@
+---
+title: About These Docs
+---
+
# About These Docs
The Hydrus docs are built with [MkDocs](https://www.mkdocs.org/) using the [Material for MkDocs](https://squidfunk.github.io/mkdocs-material/) theme. The .md files in the `docs` directory are converted into nice html in the `help` directory. This is done automatically in the built releases, but if you run from source, you will want to build your own.
diff --git a/docs/access_keys.md b/docs/access_keys.md
index f420fb1a..f96b6055 100644
--- a/docs/access_keys.md
+++ b/docs/access_keys.md
@@ -1,7 +1,9 @@
---
-title: PTR access keys
+title: PTR Access Keys
---
+# PTR access keys
+
The PTR is now run by users with more bandwidth than I had to give, so the bandwidth limits are gone! If you would like to talk with the new management, please check the [discord](https://discord.gg/wPHPCUZ).
A guide and schema for the new PTR is [here](PTR.md).
@@ -66,4 +68,4 @@ Then you can check your PTR at any time under _services->review services_, under
A user kindly manages a store of update files and pre-processed empty client databases to get your synced quicker. This is generally recommended for advanced users or those following a guide, but if you are otherwise interested, please check it out:
-[https://cuddlebear92.github.io/Quicksync/](https://cuddlebear92.github.io/Quicksync/)
\ No newline at end of file
+[https://cuddlebear92.github.io/Quicksync/](https://cuddlebear92.github.io/Quicksync/)
diff --git a/docs/adding_new_downloaders.md b/docs/adding_new_downloaders.md
index 8ae6c123..58d5665e 100644
--- a/docs/adding_new_downloaders.md
+++ b/docs/adding_new_downloaders.md
@@ -1,5 +1,5 @@
---
-title: adding new downloaders
+title: Adding New Downloaders
---
# adding new downloaders
@@ -18,4 +18,4 @@ You can get these pngs from anyone who has experience in the downloader system.
To 'add' the easy-import pngs to your client, hit _network->downloaders->import downloaders_. A little image-panel will appear onto which you can drag-and-drop these png files. The client will then decode and go through the png, looking for interesting new objects and automatically import and link them up without you having to do any more. Your only further input on your end is a 'does this look correct?' check right before the actual import, just to make sure there isn't some mistake or other glaring problem.
-Objects imported this way will take precedence over existing functionality, so if one of your downloaders breaks due to a site change, importing a fixed png here will overwrite the broken entries and become the new default.
\ No newline at end of file
+Objects imported this way will take precedence over existing functionality, so if one of your downloaders breaks due to a site change, importing a fixed png here will overwrite the broken entries and become the new default.
diff --git a/docs/advanced.md b/docs/advanced.md
index 01c9d16a..7afd76b5 100644
--- a/docs/advanced.md
+++ b/docs/advanced.md
@@ -1,7 +1,8 @@
---
-title: general clever tricks
+title: General Clever Tricks
---
+# general clever tricks
!!! note "this is non-comprehensive"
I am always changing and adding little things. The best way to learn is just to look around. If you think a shortcut should probably do something, try it out! If you can't find something, let me know and I'll try to add it!
@@ -95,4 +96,4 @@ If the file is one you particularly care about, the easiest solution is to open
## setting a password { id="password" }
-the client offers a very simple password system, enough to keep out noobs. You can set it at _database->set a password_. It will thereafter ask for the password every time you start the program, and will not open without it. However none of the database is encrypted, and someone with enough enthusiasm or a tool and access to your computer can still very easily see what files you have. The password is mainly to stop idle snoops checking your images if you are away from your machine.
\ No newline at end of file
+the client offers a very simple password system, enough to keep out noobs. You can set it at _database->set a password_. It will thereafter ask for the password every time you start the program, and will not open without it. However none of the database is encrypted, and someone with enough enthusiasm or a tool and access to your computer can still very easily see what files you have. The password is mainly to stop idle snoops checking your images if you are away from your machine.
diff --git a/docs/advanced_multiple_local_file_services.md b/docs/advanced_multiple_local_file_services.md
index 20f14edf..93f39b72 100644
--- a/docs/advanced_multiple_local_file_services.md
+++ b/docs/advanced_multiple_local_file_services.md
@@ -1,7 +1,9 @@
---
-title: multiple local file services
+title: Multiple Local File Services
---
+# multiple local file services
+
The client lets you store your files in different overlapping partitions. This can help management workflows and privacy.
## what's the problem? { id="the_problem" }
diff --git a/docs/advanced_parents.md b/docs/advanced_parents.md
index 81cce42c..ca9cbfab 100644
--- a/docs/advanced_parents.md
+++ b/docs/advanced_parents.md
@@ -1,7 +1,9 @@
---
-title: tag parents
+title: Tag Parents
---
+# tag parents
+
Tag parents let you automatically add a particular tag every time another tag is added. The relationship will also apply retroactively.
## what's the problem? { id="the_problem" }
diff --git a/docs/advanced_siblings.md b/docs/advanced_siblings.md
index 8dd3eabe..15854083 100644
--- a/docs/advanced_siblings.md
+++ b/docs/advanced_siblings.md
@@ -1,7 +1,9 @@
---
-title: tag siblings
+title: Tag Siblings
---
+# tag siblings
+
Tag siblings let you replace a bad tag with a better tag.
## what's the problem? { id="the_problem" }
diff --git a/docs/after_disaster.md b/docs/after_disaster.md
index 6565ed82..ef134a65 100644
--- a/docs/after_disaster.md
+++ b/docs/after_disaster.md
@@ -2,7 +2,9 @@
title: recovering after disaster
---
-# you just had a database problem
+# Recovering After Disaster
+
+## you just had a database problem
I have helped quite a few users recover a mangled database from disk failure or accidental deletion. You just had similar and have been pointed here. This is a simple spiel on the next step that I, hydev, like to give people once we are done.
diff --git a/docs/changelog.md b/docs/changelog.md
index bf805223..41e6160f 100644
--- a/docs/changelog.md
+++ b/docs/changelog.md
@@ -1,8 +1,49 @@
+---
+title: Changelog
+---
+
# changelog
!!! note
This is the new changelog, only the most recent builds. For all versions, see the [old changelog](old_changelog.html).
+## [Version 499](https://github.com/hydrusnetwork/hydrus/releases/tag/v499)
+
+### mpv
+* updated the mpv version for Windows. this is more complicated than it sounds and has been fraught with difficulty at times, so I do not try it often, but the situation seems to be much better now. today we are updating about twelve months. I may be imagining it, but things seem a bit smoother. a variety of weird file support should be better--an old transparent apng that I know crashed older mpv no longer causes a crash--and there's some acceleration now for very new CPU chipsets. I've also insisted on precise seeking (rather than keyframe seeking, which some users may have defaulted to). mpv-1.dll is now mpv-2.dll
+* I don't have an easy Linux testbed any more, so I would be interested in a Linux 'running from source' user trying out a similar update and letting me know how it goes. try getting the latest libmpv1 and then update python-mpv to 1.0.1 on pip. your 'mpv api version' in _help->about_ should now be 2.0. this new python-mpv seems to have several compatibility improvements, which is what has plagued us before here
+* mpv on macOS is still a frustrating question mark, but if this works on Linux, it may open another door. who knows, maybe the new version doesn't crash instantly on load
+
+### search change for potential duplicates
+* this is subtle and complicated, so if you are a casual user of duplicates, don't worry about it. duplicates page = better now
+* for those who are more invested in dupes, I have altered the main potential duplicate search query. when the filter prepares some potential dupes to compare, or you load up some random thumbs in the page, or simply when the duplicates processing page presents counts, this all now only tests kings. previously, it could compare any member of a duplicate group to any other, and it would nominate kings as group representatives, but this led to some odd situations where if you said 'must be pixel dupes', you could get two low quality pixel dupes offering their better king(s) up for actual comparison, giving you a comparison that was not a pixel dupe. same for the general searching of potentials, where if you search for 'bad quality', any bad quality file you set as a dupe but didn't delete could get matched (including in 'both match' mode), and offer a 'nicer' king as tribute that didn't have the tag. now, it only searches kings. kings match searches, and it is those kings that must match pixel dupe rules. this also means that kings will always be available on the current file domain, and no fallback king-nomination-from-filtered-members routine is needed any more
+* the knock-on effect here is minimal, but in general all database work in the duplicate filter should be a little faster, and some of your numbers may be a few counts smaller, typically after discounting weird edge case split-up duplicate groups that aren't real/common enough to really worry about. if you use a waterfall of multiple local file services to process your files, you might see significantly smaller counts due to kings not always being in the same file domain as their bad members, so you may want to try 'all my files' or just see how it goes--might be far less confusing, now you are only given unambiguous kings. anyway, in general, I think no big differences here for most users except better precision in searching!
+* but let me know how you get on IRL!
+
+### misc
+* thanks to a user's hard work, the default twitter downloader gets some upgrades this week: you can now download from twitter lists, a twitter user's likes, and twitter collections (which are curated lists of tweets). the downloaders still get a lot of 'ignored' results for text-only tweets, and you still have to be logged in to get nsfw, but this adds some neat tools to the toolbox
+* thanks to a user, the Client API now reports brief caching information and should boost Hydrus Companion performance (issue #605)
+* the simple shortcut list in the edit shortcut action dialog now no longer shows any duplicates (such as 'close media viewer' in the dupes window)
+* added a new default reason for tag petitions, 'clearing mass-pasted junk'. 'not applicable' is now 'not applicable/incorrect'
+* in the petition processing page, the content boxes now specifically say ADD or DELETE to reinforce what you are doing and to differentiate the two boxes when you have a pixel petition
+* in the petition processing page, the content boxes now grow and shrink in height, up to a max of 20 rows, depending on how much stuff is in them. I _think_ I have pixel perfect heights here, so let me know if yours are wrong!
+* the 'service info' rows in review services are now presented in nicer order
+* updated the header/title formatting across the help documentation. when you search for a page title, it should now show up in results (e.g. you type 'running from source', you get that nicely at the top, not a confusing sub-header of that article). the section links are also all now capitalised
+* misc refactoring
+
+### bunch of fixes
+* fixed a weird and possible crash-inducing scrolling bug in the tag list some users had in Qt6
+* fixed a typo error in file lookup scripts from when I added multi-line support to the parsing system (issue #1221)
+* fixed some bad labels in 'speed and memory' that talked about 'MB' when the widget allowed setting different units. also, I updated the 'video buffer' option on that page to a full 'bytes value' widget too (issue #1223)
+* the 'bytes value' widget, where you can set '100 MB' and similar, now gives the 'unit' dropdown a little more minimum width. it was getting a little thin on some styles and not showing the full text in the dropdown menu (issue #1222)
+* fixed a bug in similar-shape-search-tree-rebalancing maintenance in the rare case that the queue of branches in need of regeneration becomes out of sync with the main tree (issue #1219)
+* fixed a bug in archive/delete filter where clicks that were making actions would start borked drag-and-drop panning states if you dragged before releasing the click. it would cause warped media movement if you then clicked on hover window greyspace
+* fixed the 'this was a cloudflare problem' scanner for the new 1.2.64 version of cloudscraper
+* updated the popup manager's positioning update code to use a nicer event filter and gave its position calculation code a quick pass. it might fix some popup toaster position bugs, not sure
+* fixed a weird menu creation bug involving a QStandardItem appearing in the menu actions
+* fixed a similar weird QStandardItem bug in the media viewer canvas code
+* fixed an error that could appear on force-emptied pages that receive sort signals
+
## [Version 498](https://github.com/hydrusnetwork/hydrus/releases/tag/v498)
_almost all the changes this week are only important to server admins and janitors. regular users can skip updating this week_
@@ -335,19 +376,3 @@ _almost all the changes this week are only important to server admins and janito
* I also cleaned some of the maintenance code, and made it more aggressive, so 'do a full metadata resync' is now be even more uncompromising
* also, the repository updates file service gets a bit of cleanup. it seems some ghost files have snuck in there over time, and today their records are corrected. the bug that let this happen in the first place is also fixed
* there remains an issue where some users' clients have tried to hit the PTR with 404ing update file hashes. I am still investigating this
-
-## [Version 488](https://github.com/hydrusnetwork/hydrus/releases/tag/v488)
-
-### all misc this week
-* the client now supports 'wavpack' files. these are basically a kind of compressed wav. mpv seems to play them fine too!
-* added a new file maintenance action, 'if file is missing, note it in log', which records the metadata about missing files to the database directory but makes no other action
-* the 'file is missing/incorrect' file maintenance jobs now also export the files' tags to the database directory, to further help identify them
-* simplified the logic behind the 'remove files if they are trashed' option. it should fire off more reliably now, even if you have a weird multiple-domain location for the current page, and still not fire if you are actually looking at the trash
-* if you paste an URL into the normal 'urls' downloader page, and it already has that URL and the URL has status 'failed', that existing URL will now be tried again. let's see how this works IRL, maybe it needs an option, maybe this feels natural when it comes up
-* the default bandwidth rules are boosted. the client is more efficient these days and doesn't need so many forced breaks on big import lists, and the internet has generally moved on. thanks to the users who helped talk out what the new limits should aim at. if you are an existing user, you can change your current defaults under _network->data->review bandwidth usage and edit rules_--there's even a button to revert your defaults 'back' to these new rules
-* now like all its neighbours, the cog icon on the duplicate right-side hover no longer annoyingly steals keyboard focus on a click.
-* did some code and logic cleanup around 'delete files', particularly to improve repository update deletes now we have multiple local file services, and in planning for future maintenance in this area
-* all the 'yes yes no' dialogs--the ones with multiple yes options--are moved to the newer panel system and will render their size and layout a bit more uniformly
-* may have fixed an issue with a very slow to boot client trying to politely wait on the thumbnail cache before it instantiates
-* misc UI text rewording and layout flag fixes
-* fixed some jank formatting on database migration help
diff --git a/docs/contact.md b/docs/contact.md
index 3a76395e..bda62062 100644
--- a/docs/contact.md
+++ b/docs/contact.md
@@ -1,4 +1,6 @@
-
+---
+title: Contact and Links
+---
# contact and links
@@ -29,4 +31,4 @@ That said:
* [email](mailto:hydrus.admin@gmail.com)
* [discord](https://discord.gg/wPHPCUZ)
* [patreon](https://www.patreon.com/hydrus_dev)
-* [user-run repository and wiki (including download presets for several non-default boorus)](https://github.com/CuddleBear92/Hydrus-Presets-and-Scripts)
\ No newline at end of file
+* [user-run repository and wiki (including download presets for several non-default boorus)](https://github.com/CuddleBear92/Hydrus-Presets-and-Scripts)
diff --git a/docs/database_migration.md b/docs/database_migration.md
index e9dd6805..8dae79c6 100644
--- a/docs/database_migration.md
+++ b/docs/database_migration.md
@@ -1,7 +1,9 @@
---
-title: database migration
+title: Database Migration
---
+# database migration
+
## the hydrus database { id="intro" }
A hydrus client consists of three components:
diff --git a/docs/downloader_completion.md b/docs/downloader_completion.md
index 2b2c6104..dcfe820f 100644
--- a/docs/downloader_completion.md
+++ b/docs/downloader_completion.md
@@ -1,7 +1,9 @@
---
-title: Putting it All Together
+title: Putting It All Together
---
+# Putting it all together
+
Now you know what GUGs, URL Classes, and Parsers are, you should have some ideas of how URL Classes could steer what happens when the downloader is faced with an URL to process. Should a URL be imported as a media file, or should it be parsed? If so, how?
You may have noticed in the Edit GUG ui that it lists if a current URL Class matches the example URL output. If the GUG has no matching URL Class, it won't be listed in the main 'gallery selector' button's list--it'll be relegated to the 'non-functioning' page. Without a URL Class, the client doesn't know what to do with the output of that GUG. But if a URL Class does match, we can then hand the result over to a parser set at _network->downloader definitions->manage url class links_:
diff --git a/docs/downloader_gugs.md b/docs/downloader_gugs.md
index cd90d535..1c706ca7 100644
--- a/docs/downloader_gugs.md
+++ b/docs/downloader_gugs.md
@@ -2,6 +2,8 @@
title: Gallery URL Generators
---
+# Gallery URL Generators
+
Gallery URL Generators, or **GUGs** are simple objects that take a simple string from the user, like:
* blue_eyes
diff --git a/docs/downloader_login.md b/docs/downloader_login.md
index 69497a8e..b43d955c 100644
--- a/docs/downloader_login.md
+++ b/docs/downloader_login.md
@@ -1,3 +1,7 @@
+---
+title: Login Manager
+---
+
# Login Manager
-The system works, but this help was never done! Check the defaults for examples of how it works, sorry!
\ No newline at end of file
+The system works, but this help was never done! Check the defaults for examples of how it works, sorry!
diff --git a/docs/downloader_parsers.md b/docs/downloader_parsers.md
index f3a986db..462dd33d 100644
--- a/docs/downloader_parsers.md
+++ b/docs/downloader_parsers.md
@@ -1,3 +1,7 @@
+---
+title: Parsers
+---
+
# Parsers
In hydrus, a parser is an object that takes a single block of HTML or JSON data and returns many kinds of hydrus-level metadata.
@@ -27,4 +31,4 @@ When you are making a parser, consider this checklist (you might want to copy/ha
* Is a source/post time available?
* Is a source URL available? Is it good quality, or does it often just point to an artist's base twitter profile? If you pull it from text or a tooltip, is it clipped for longer URLs?
-[Taken a break? Now let's put it all together ---->](downloader_completion.md)
\ No newline at end of file
+[Taken a break? Now let's put it all together ---->](downloader_completion.md)
diff --git a/docs/downloader_parsers_content_parsers.md b/docs/downloader_parsers_content_parsers.md
index 1a4de4f5..e40fcff2 100644
--- a/docs/downloader_parsers_content_parsers.md
+++ b/docs/downloader_parsers_content_parsers.md
@@ -1,3 +1,7 @@
+---
+title: Content Parsers
+---
+
# Content Parsers
So, we can now generate some strings from a document. Content Parsers will let us apply a single metadata type to those strings to inform hydrus what they are.
@@ -67,4 +71,4 @@ This is a special content type--it tells the next highest stage of parsing that
![](images/edit_content_parser_panel_veto.png)
-They will associate their name with the veto being raised, so it is useful to give these a decent descriptive name so you can see what might be going right or wrong during testing. If it is an appropriate and serious enough veto, it may also rise up to the user level and will be useful if they need to report you an error (like "After five pages of parsing, it gives 'veto: no next page link'").
\ No newline at end of file
+They will associate their name with the veto being raised, so it is useful to give these a decent descriptive name so you can see what might be going right or wrong during testing. If it is an appropriate and serious enough veto, it may also rise up to the user level and will be useful if they need to report you an error (like "After five pages of parsing, it gives 'veto: no next page link'").
diff --git a/docs/downloader_parsers_full_example_api.md b/docs/downloader_parsers_full_example_api.md
index d642f178..fe5a8212 100644
--- a/docs/downloader_parsers_full_example_api.md
+++ b/docs/downloader_parsers_full_example_api.md
@@ -1,3 +1,7 @@
+---
+title: API Example
+---
+
# api example
Some sites offer API calls for their pages. Depending on complexity and quality of content, using these APIs may or may not be a good idea. Artstation has a good one--let's first review our URL Classes:
@@ -42,4 +46,4 @@ This again uses python's datetime to decode the date, which Artstation presents
## summary { id="summary" }
-APIs that are stable and free to access (e.g. do not require OAuth or other complicated login headers) can make parsing fantastic. They save bandwidth and CPU time, and they are typically easier to work with than HTML. Unfortunately, the boorus that do provide APIs often list their tags without namespace information, so I recommend you double-check you can get what you want before you get too deep into it. Some APIs also offer incomplete data, such as relative URLs (relative to the original URL!), which can be a pain to figure out in our system.
\ No newline at end of file
+APIs that are stable and free to access (e.g. do not require OAuth or other complicated login headers) can make parsing fantastic. They save bandwidth and CPU time, and they are typically easier to work with than HTML. Unfortunately, the boorus that do provide APIs often list their tags without namespace information, so I recommend you double-check you can get what you want before you get too deep into it. Some APIs also offer incomplete data, such as relative URLs (relative to the original URL!), which can be a pain to figure out in our system.
diff --git a/docs/downloader_parsers_full_example_file_page.md b/docs/downloader_parsers_full_example_file_page.md
index 85ffa651..5574556c 100644
--- a/docs/downloader_parsers_full_example_file_page.md
+++ b/docs/downloader_parsers_full_example_file_page.md
@@ -1,3 +1,7 @@
+---
+title: File Page Example
+---
+
# file page example
Let's look at this page: [https://gelbooru.com/index.php?page=post&s=view&id=3837615](https://gelbooru.com/index.php?page=post&s=view&id=3837615).
@@ -96,4 +100,4 @@ Phew--all that for a bit of Lara Croft! Thankfully, most sites use similar schem
![](images/downloader_post_example_final.png)
-This is overall a decent parser. Some parts of it may fail when Gelbooru update to their next version, but that can be true of even very good parsers with multiple redundancy. For now, hydrus can use this to quickly and efficiently pull content from anything running Gelbooru 0.2.5., and the effort spent now can save millions of combined _right-click->save as_ and manual tag copies in future. If you make something like this and share it about, you'll be doing a good service for those who could never figure it out.
\ No newline at end of file
+This is overall a decent parser. Some parts of it may fail when Gelbooru update to their next version, but that can be true of even very good parsers with multiple redundancy. For now, hydrus can use this to quickly and efficiently pull content from anything running Gelbooru 0.2.5., and the effort spent now can save millions of combined _right-click->save as_ and manual tag copies in future. If you make something like this and share it about, you'll be doing a good service for those who could never figure it out.
diff --git a/docs/downloader_parsers_full_example_gallery_page.md b/docs/downloader_parsers_full_example_gallery_page.md
index 2cc459a5..9b93276b 100644
--- a/docs/downloader_parsers_full_example_gallery_page.md
+++ b/docs/downloader_parsers_full_example_gallery_page.md
@@ -1,3 +1,7 @@
+---
+title: Gallery Page Example
+---
+
# gallery page example
!!! caution
@@ -39,4 +43,4 @@ Note that this finds two URLs. e621 apply the `rel="next"` to both the "2" link
## summary { id="summary" }
-With those two rules, we are done. Gallery parsers are nice and simple.
\ No newline at end of file
+With those two rules, we are done. Gallery parsers are nice and simple.
diff --git a/docs/downloader_parsers_page_parsers.md b/docs/downloader_parsers_page_parsers.md
index 5fcf24be..0aa0f48f 100644
--- a/docs/downloader_parsers_page_parsers.md
+++ b/docs/downloader_parsers_page_parsers.md
@@ -1,3 +1,7 @@
+---
+title: Page Parsers
+---
+
# Page Parsers
We can now produce individual rows of rich metadata. To arrange them all into a useful structure, we will use Page Parsers.
@@ -56,4 +60,4 @@ When the client sees this in a downloader context, it will where to download the
## subsidiary page parsers { id="subsidiary_page_parsers" }
-Here be dragons. This was an attempt to make parsing more helpful in certain API situations, but it ended up ugly. I do not recommend you use it, as I will likely scratch the whole thing and replace it with something better one day. It basically splits the page up into pieces that can then be parsed by nested page parsers as separate objects, but the UI and workflow is hell. Afaik, the imageboard API parsers use it, but little/nothing else. If you are really interested, check out how those work and maybe duplicate to figure out your own imageboard parser and/or send me your thoughts on how to separate File URL/timestamp combos better.
\ No newline at end of file
+Here be dragons. This was an attempt to make parsing more helpful in certain API situations, but it ended up ugly. I do not recommend you use it, as I will likely scratch the whole thing and replace it with something better one day. It basically splits the page up into pieces that can then be parsed by nested page parsers as separate objects, but the UI and workflow is hell. Afaik, the imageboard API parsers use it, but little/nothing else. If you are really interested, check out how those work and maybe duplicate to figure out your own imageboard parser and/or send me your thoughts on how to separate File URL/timestamp combos better.
diff --git a/docs/downloader_sharing.md b/docs/downloader_sharing.md
index 225a2920..e2b66cbe 100644
--- a/docs/downloader_sharing.md
+++ b/docs/downloader_sharing.md
@@ -1,5 +1,5 @@
---
-title: Sharing Downloaders
+title: Sharing
---
# Sharing Downloaders
@@ -16,4 +16,4 @@ It isn't difficult. Essentially, you want to bundle enough objects to make one o
This all works on Example URLs and some domain guesswork, so make sure your url classes are good and the parsers have correct Example URLs as well. If they don't, they won't all link up neatly for the end user. If part of your downloader is on a different domain to the GUGs and Gallery URLs, then you'll have to add them manually. Just start with 'add gug' and see if it looks like enough.
-Once you have the necessary and sufficient objects added, you can export to png. You'll get a similar 'does this look right?' summary as what the end-user will see, just to check you have everything in order and the domains all correct. If that is good, then make sure to give the png a sensible filename and embellish the title and description if you need to. You can then send/post that png wherever, and any regular user will be able to use your work.
\ No newline at end of file
+Once you have the necessary and sufficient objects added, you can export to png. You'll get a similar 'does this look right?' summary as what the end-user will see, just to check you have everything in order and the domains all correct. If that is good, then make sure to give the png a sensible filename and embellish the title and description if you need to. You can then send/post that png wherever, and any regular user will be able to use your work.
diff --git a/docs/downloader_url_classes.md b/docs/downloader_url_classes.md
index 8852b9c3..8f6d20ce 100644
--- a/docs/downloader_url_classes.md
+++ b/docs/downloader_url_classes.md
@@ -1,3 +1,7 @@
+---
+title: URL Classes
+---
+
# URL Classes
The fundamental connective tissue of the downloader system is the 'URL Class'. This object identifies and normalises URLs and links them to other components. Whenever the client handles a URL, it tries to match it to a URL Class to figure out what to do.
diff --git a/docs/duplicates.md b/docs/duplicates.md
index c7ed2d71..07794d8e 100644
--- a/docs/duplicates.md
+++ b/docs/duplicates.md
@@ -1,5 +1,5 @@
---
-title: duplicates
+title: Filtering Duplicates
---
# duplicates { id="intro" }
diff --git a/docs/faq.md b/docs/faq.md
index 679ea4f4..2f03eecf 100644
--- a/docs/faq.md
+++ b/docs/faq.md
@@ -1,3 +1,7 @@
+---
+title: FAQ
+---
+
# FAQ
## what is a repository? { id="repositories" }
@@ -105,4 +109,4 @@ This is another long string of random hexadecimal that _identifies_ your account
The repositories do not work like conventional search engines; it takes a short but predictable while for changes to propagate to other users.
-The client's searches only ever happen over its local cache of what is on the repository. Any changes you make will be delayed for others until their next update occurs. At the moment, the update period is 100,000 seconds, which is about 1 day and 4 hours.
\ No newline at end of file
+The client's searches only ever happen over its local cache of what is on the repository. Any changes you make will be delayed for others until their next update occurs. At the moment, the update period is 100,000 seconds, which is about 1 day and 4 hours.
diff --git a/docs/gettingStartedOverview.md b/docs/gettingStartedOverview.md
index ca04779c..7737e781 100644
--- a/docs/gettingStartedOverview.md
+++ b/docs/gettingStartedOverview.md
@@ -1,5 +1,5 @@
---
-title: Overview for getting started
+title: Overview For Getting Started
---
# Overview for getting started
@@ -25,4 +25,4 @@ It is also worth having a look at [siblings](advanced_siblings.md) for when you
Have a lot of very similar looking pictures because of one reason or another? Have a look at [duplicates](duplicates.md), Hydrus' duplicates finder and filtering tool.
## API
-Hydrus has an API that lets external tools connect to it. See [API](client_api.md) for how to turn it on and a list of some of these tools.
\ No newline at end of file
+Hydrus has an API that lets external tools connect to it. See [API](client_api.md) for how to turn it on and a list of some of these tools.
diff --git a/docs/getting_started_importing.md b/docs/getting_started_importing.md
index ca23d578..ca82a76d 100644
--- a/docs/getting_started_importing.md
+++ b/docs/getting_started_importing.md
@@ -1,5 +1,5 @@
---
-title: Importing and exporting
+title: Importing and Exporting
---
# Importing and exporting
@@ -68,4 +68,4 @@ While you can import and export tags together with images sometimes you just don
Going to `tags -> migrate tags` you get a window that lets you deal with just tags. One of the options here is what's called a Hydrus Tag Archive, a file containing the hash <-> tag mappings for the files and tags matching the query.
-![](images/hydrus_tag_archive.png)
\ No newline at end of file
+![](images/hydrus_tag_archive.png)
diff --git a/docs/getting_started_installing.md b/docs/getting_started_installing.md
index 9d5ead7a..6f75c945 100644
--- a/docs/getting_started_installing.md
+++ b/docs/getting_started_installing.md
@@ -1,7 +1,9 @@
---
-title: installing and updating
+title: Installing and Updating
---
+# installing and updating
+
If any of this is confusing, a simpler guide is [here](https://github.com/Zweibach/text/blob/master/Hydrus/Hydrus%20Help%20Docs/00_tableOfContents.md), and some video guides are [here](https://github.com/CuddleBear92/Hydrus-guides)!
## downloading
diff --git a/docs/getting_started_ratings.md b/docs/getting_started_ratings.md
index f68a108f..283575f8 100644
--- a/docs/getting_started_ratings.md
+++ b/docs/getting_started_ratings.md
@@ -1,6 +1,7 @@
---
-title: ratings
+title: Ratings
---
+
# getting started with ratings
The hydrus client supports two kinds of ratings: _like/dislike_ and _numerical_. Let's start with the simpler one:
diff --git a/docs/getting_started_searching.md b/docs/getting_started_searching.md
index 3649e24d..b82caa88 100644
--- a/docs/getting_started_searching.md
+++ b/docs/getting_started_searching.md
@@ -1,5 +1,5 @@
---
-title: Searching and sorting
+title: Searching and Sorting
---
# Searching and sorting
diff --git a/docs/getting_started_subscriptions.md b/docs/getting_started_subscriptions.md
index d5b92f63..bc463deb 100644
--- a/docs/getting_started_subscriptions.md
+++ b/docs/getting_started_subscriptions.md
@@ -1,5 +1,5 @@
---
-title: subscriptions
+title: Subscriptions
---
# subscriptions
diff --git a/docs/getting_started_tags.md b/docs/getting_started_tags.md
index 81906c5e..30189f85 100644
--- a/docs/getting_started_tags.md
+++ b/docs/getting_started_tags.md
@@ -1,5 +1,5 @@
---
-title: tags
+title: Tags
---
# getting started with tags
diff --git a/docs/index.md b/docs/index.md
index 2260ef7d..cacb83cc 100644
--- a/docs/index.md
+++ b/docs/index.md
@@ -4,6 +4,7 @@ hide:
- navigation
- toc
---
+
# hydrus network - client and server
The hydrus network client is a desktop application written for Anonymous and other internet enthusiasts with large media collections. It organises your files into an internal database and browses them with tags instead of folders, a little like a booru on your desktop. Tags and files can be anonymously shared through custom servers that any user may run. Everything is free, nothing phones home, and the source code is included with the release. It is developed mostly for Windows, but builds for Linux and macOS are available (perhaps with some limitations, depending on your situation).
diff --git a/docs/introduction.md b/docs/introduction.md
index 6f60b9ce..aeea68ef 100644
--- a/docs/introduction.md
+++ b/docs/introduction.md
@@ -1,7 +1,9 @@
---
-title: introduction and statement of principles
+title: Introduction and Statement of Principles
---
+# introduction and statement of principles
+
## on being anonymous { id="anonymous" }
Nearly all sites use the same pseudonymous username/password system, and nearly all of them have the same drama, sockpuppets, and egotistical mods. Censorship is routine. That works for many people, but not for me.
@@ -44,4 +46,4 @@ These programs are free software. Everything I, hydrus dev, have made is under t
--8<-- "license.txt"
```
-Do what the fuck you want to with my software, and if shit breaks, DEAL WITH IT.
\ No newline at end of file
+Do what the fuck you want to with my software, and if shit breaks, DEAL WITH IT.
diff --git a/docs/ipfs.md b/docs/ipfs.md
index eac5dc96..3001aa23 100644
--- a/docs/ipfs.md
+++ b/docs/ipfs.md
@@ -2,6 +2,8 @@
title: IPFS
---
+# IPFS
+
IPFS is a p2p protocol that makes it easy to share many sorts of data. The hydrus client can communicate with an IPFS daemon to send and receive files.
You can read more about IPFS from [their homepage](http://ipfs.io), or [this guide](https://medium.com/@ConsenSys/an-introduction-to-ipfs-9bba4860abd0) that explains its various rules in more detail.
diff --git a/docs/launch_arguments.md b/docs/launch_arguments.md
index 29915ebc..2419f728 100644
--- a/docs/launch_arguments.md
+++ b/docs/launch_arguments.md
@@ -1,5 +1,5 @@
---
-title: launch arguments
+title: Launch Arguments
---
# launch arguments
diff --git a/docs/local_booru.md b/docs/local_booru.md
index c97fbb07..62eff377 100644
--- a/docs/local_booru.md
+++ b/docs/local_booru.md
@@ -1,3 +1,6 @@
+---
+title: Local Booru
+---
# local booru
@@ -61,4 +64,4 @@ You can review all your shares on _services->review services_, under _local->boo
## future plans { id="future" }
-This was a fun project, but it never advanced beyond a prototype. The future of this system is other people's nice applications plugging into the [Client API](client_api.md).
\ No newline at end of file
+This was a fun project, but it never advanced beyond a prototype. The future of this system is other people's nice applications plugging into the [Client API](client_api.md).
diff --git a/docs/old_changelog.html b/docs/old_changelog.html
index 83c6d62e..616f0f80 100644
--- a/docs/old_changelog.html
+++ b/docs/old_changelog.html
@@ -33,9 +33,46 @@
+
+
+ - mpv:
+ - updated the mpv version for Windows. this is more complicated than it sounds and has been fraught with difficulty at times, so I do not try it often, but the situation seems to be much better now. today we are updating about twelve months. I may be imagining it, but things seem a bit smoother. a variety of weird file support should be better--an old transparent apng that I know crashed older mpv no longer causes a crash--and there's some acceleration now for very new CPU chipsets. I've also insisted on precise seeking (rather than keyframe seeking, which some users may have defaulted to). mpv-1.dll is now mpv-2.dll
+ - I don't have an easy Linux testbed any more, so I would be interested in a Linux 'running from source' user trying out a similar update and letting me know how it goes. try getting the latest libmpv1 and then update python-mpv to 1.0.1 on pip. your 'mpv api version' in _help->about_ should now be 2.0. this new python-mpv seems to have several compatibility improvements, which is what has plagued us before here
+ - mpv on macOS is still a frustrating question mark, but if this works on Linux, it may open another door. who knows, maybe the new version doesn't crash instantly on load
+ - .
+ - search change for potential duplicates:
+ - this is subtle and complicated, so if you are a casual user of duplicates, don't worry about it. duplicates page = better now
+ - for those who are more invested in dupes, I have altered the main potential duplicate search query. when the filter prepares some potential dupes to compare, or you load up some random thumbs in the page, or simply when the duplicates processing page presents counts, this all now only tests kings. previously, it could compare any member of a duplicate group to any other, and it would nominate kings as group representatives, but this led to some odd situations where if you said 'must be pixel dupes', you could get two low quality pixel dupes offering their better king(s) up for actual comparison, giving you a comparison that was not a pixel dupe. same for the general searching of potentials, where if you search for 'bad quality', any bad quality file you set as a dupe but didn't delete could get matched (including in 'both match' mode), and offer a 'nicer' king as tribute that didn't have the tag. now, it only searches kings. kings match searches, and it is those kings that must match pixel dupe rules. this also means that kings will always be available on the current file domain, and no fallback king-nomination-from-filtered-members routine is needed any more
+ - the knock-on effect here is minimal, but in general all database work in the duplicate filter should be a little faster, and some of your numbers may be a few counts smaller, typically after discounting weird edge case split-up duplicate groups that aren't real/common enough to really worry about. if you use a waterfall of multiple local file services to process your files, you might see significantly smaller counts due to kings not always being in the same file domain as their bad members, so you may want to try 'all my files' or just see how it goes--might be far less confusing, now you are only given unambiguous kings. anyway, in general, I think no big differences here for most users except better precision in searching!
+ - but let me know how you get on IRL!
+ - .
+ - misc:
+ - thanks to a user's hard work, the default twitter downloader gets some upgrades this week: you can now download from twitter lists, a twitter user's likes, and twitter collections (which are curated lists of tweets). the downloaders still get a lot of 'ignored' results for text-only tweets, and you still have to be logged in to get nsfw, but this adds some neat tools to the toolbox
+ - thanks to a user, the Client API now reports brief caching information and should boost Hydrus Companion performance (issue #605)
+ - the simple shortcut list in the edit shortcut action dialog now no longer shows any duplicates (such as 'close media viewer' in the dupes window)
+ - added a new default reason for tag petitions, 'clearing mass-pasted junk'. 'not applicable' is now 'not applicable/incorrect'
+ - in the petition processing page, the content boxes now specifically say ADD or DELETE to reinforce what you are doing and to differentiate the two boxes when you have a mixed petition
+ - in the petition processing page, the content boxes now grow and shrink in height, up to a max of 20 rows, depending on how much stuff is in them. I _think_ I have pixel perfect heights here, so let me know if yours are wrong!
+ - the 'service info' rows in review services are now presented in nicer order
+ - updated the header/title formatting across the help documentation. when you search for a page title, it should now show up in results (e.g. you type 'running from source', you get that nicely at the top, not a confusing sub-header of that article). the section links are also all now capitalised
+ - misc refactoring
+ - .
+ - bunch of fixes:
+ - fixed a weird and possible crash-inducing scrolling bug in the tag list some users had in Qt6
+ - fixed a typo error in file lookup scripts from when I added multi-line support to the parsing system (issue #1221)
+ - fixed some bad labels in 'speed and memory' that talked about 'MB' when the widget allowed setting different units. also, I updated the 'video buffer' option on that page to a full 'bytes value' widget too (issue #1223)
+ - the 'bytes value' widget, where you can set '100 MB' and similar, now gives the 'unit' dropdown a little more minimum width. it was getting a little thin on some styles and not showing the full text in the dropdown menu (issue #1222)
+ - fixed a bug in similar-shape-search-tree-rebalancing maintenance in the rare case that the queue of branches in need of regeneration become out of sync with the main tree (issue #1219)
+ - fixed a bug in archive/delete filter where clicks that were making actions would start borked drag-and-drop panning states if you dragged before releasing the click. it would cause warped media movement if you then clicked on hover window greyspace
+ - fixed the 'this was a cloudflare problem' scanner for the new 1.2.64 version of cloudscraper
+ - updated the popupmanager's positioning update code to use a nicer event filter and gave its position calculation code a quick pass. it might fix some popup toaster position bugs, not sure
+ - fixed a weird menu creation bug involving a QStandardItem appearing in the menu actions
+ - fixed a similar weird QStandardItem bug in the media viewer canvas code
+ - fixed an error that could appear on force-emptied pages that receive sort signals
+
- - _almost all the changes this week are only important to server admins and janitors. regular users can skip updating this week_
+ - almost all the changes this week are only important to server admins and janitors. regular users can skip updating this week
- overview:
- the server has important database and network updates this week. if your server has a lot of content, it has to count it all up, so it will take a short while to update. the petition protocol has also changed, so older clients will not be able to fetch new servers' petitions without an error. I think newer clients will be able to fetch older servers' ones, but it may be iffy
- I considered whether I should update the network protocol version number, which would (politely) force all users to update, but as this causes inconvenience every time I do it, and I expect to do more incremental updates here in coming weeks, and since this only affects admins and janitors, I decided to not. we are going to be in awkward flux for a little bit, so please make sure you update privileged clients and servers at roughly the same time
diff --git a/docs/petitionPractices.md b/docs/petitionPractices.md
index 61ced891..5c2c4a05 100644
--- a/docs/petitionPractices.md
+++ b/docs/petitionPractices.md
@@ -1,6 +1,7 @@
---
-title: Petition practices
+title: Petition Practices
---
+
# Petition practices
This document exists to give a rough idea what to do in regard to the PTR to avoid creating unnecessary work for the janitors.
@@ -42,4 +43,4 @@ List of some bad parents to `character:` tags as an example:
## Translations
Translations should be siblinged to what the closest in-use romanised tag is if there's no proper translation. If the tag is ambiguous, such as `響` or `ヒビキ` which means `hibiki`, just sibling them to the ambiguous tag. The tag can then later on be deleted and replaced by a less ambiguous tag. On the other hand, `響(艦隊これくしょん)` straight up means `hibiki (kantai kollection)` and can safely be siblinged to the proper `character:` tag.
-Do the same for subjective tags. `魅惑のふともも` can be translated to `bewitching thighs`. `まったく、駆逐艦は最高だぜ!!` straight up translates to `Geez, destroyers are the best!!`, which does not contain much usable information for Hydrus currently. These can then either be siblinged down to an unsubjective tag (`thighs`) if there's objective information in the tag, deleted if purely subjective, or deleted and replaced if ambiguous.
\ No newline at end of file
+Do the same for subjective tags. `魅惑のふともも` can be translated to `bewitching thighs`. `まったく、駆逐艦は最高だぜ!!` straight up translates to `Geez, destroyers are the best!!`, which does not contain much usable information for Hydrus currently. These can then either be siblinged down to an unsubjective tag (`thighs`) if there's objective information in the tag, deleted if purely subjective, or deleted and replaced if ambiguous.
diff --git a/docs/privacy.md b/docs/privacy.md
index 8baaecff..a009a28c 100644
--- a/docs/privacy.md
+++ b/docs/privacy.md
@@ -1,3 +1,7 @@
+---
+title: Privacy
+---
+
# privacy
!!! tldr "tl;dr"
diff --git a/docs/reducing_lag.md b/docs/reducing_lag.md
index 3a1c7a92..5b3f95a0 100644
--- a/docs/reducing_lag.md
+++ b/docs/reducing_lag.md
@@ -1,7 +1,9 @@
---
-title: reducing lag
+title: Reducing Lag
---
+# reducing lag
+
## hydrus is cpu and hdd hungry { id="intro" }
The hydrus client manages a lot of complicated data and gives you a lot of power over it. To add millions of files and tags to its database, and then to perform difficult searches over that information, it needs to use a lot of CPU time and hard drive time--sometimes in small laggy blips, and occasionally in big 100% CPU chunks. I don't put training wheels or limiters on the software either, so if you search for 300,000 files, the client will try to fetch that many.
@@ -45,4 +47,4 @@ You can generate a profile by hitting _help->debug->profile mode_, which tells t
Turn on profile mode, do the thing that runs slow for you (importing a file, fetching some tags, whatever), and then check your database folder (most likely _install_dir/db_) for a new 'client profile - DATE.log' file. This file will be filled with several sets of tables with timing information. Please send that whole file to me, or if it is too large, cut what seems important. It should not contain any personal information, but feel free to look through it.
-There are several ways to [contact me](contact.md).
\ No newline at end of file
+There are several ways to [contact me](contact.md).
diff --git a/docs/running_from_source.md b/docs/running_from_source.md
index 475587f4..4f71bcf2 100644
--- a/docs/running_from_source.md
+++ b/docs/running_from_source.md
@@ -1,7 +1,9 @@
---
-title: running from source
+title: Running From Source
---
+# running from source
+
I write the client and server entirely in [python](https://python.org), which can run straight from source. It is not simple to get hydrus running this way, but if none of the built packages work for you (for instance you use a non-Ubuntu-compatible flavour of Linux), it may be the only way you can get the program to run. Also, if you have a general interest in exploring the code or wish to otherwise modify the program, you will obviously need to do this.
## a quick note about Linux flavours { id="linux_flavours" }
@@ -88,7 +90,7 @@ MPV is optional and complicated, but it is great, so it is worth the time to fig
As well as the python wrapper, 'python-mpv' as in the requirements.txt, you also need the underlying library. This is _not_ mpv the program, but 'libmpv', often called 'libmpv1'.
-For Windows, the dll builds are [here](https://sourceforge.net/projects/mpv-player-windows/files/libmpv/), although getting the right version for the current wrapper can be difficult (you will get errors when you try to load video if it is not correct). Just put it in your hydrus base install directory. You can also just grab the 'mpv-1.dll' I bundle in my release. In my experience, [this](https://sourceforge.net/projects/mpv-player-windows/files/libmpv/mpv-dev-x86_64-20210228-git-d1be8bb.7z/download) works with python-mpv 0.5.2.
+For Windows, the dll builds are [here](https://sourceforge.net/projects/mpv-player-windows/files/libmpv/), although getting the right version for the current wrapper can be difficult (you will get errors when you try to load video if it is not correct). Just put it in your hydrus base install directory. You can also just grab the 'mpv-2.dll' I bundle in my release. In my experience, [this](https://sourceforge.net/projects/mpv-player-windows/files/libmpv/mpv-dev-x86_64-20210228-git-d1be8bb.7z/download) works with python-mpv 0.5.2.
If you are on Linux, you can usually get 'libmpv1' with _apt_. You might have to adjust your python-mpv version (e.g. `pip3 install python-mpv==0.4.5`) to get it to work.
diff --git a/docs/server.md b/docs/server.md
index 188b8456..f82e574f 100644
--- a/docs/server.md
+++ b/docs/server.md
@@ -1,7 +1,9 @@
---
-title: running your own server
+title: Running Your Own Server
---
+# running your own server
+
!!! note
**You do not need the server to do anything with hydrus! It is only for advanced users to do very specific jobs!** The server is also hacked-together and quite technical. It requires a fair amount of experience with the client and its concepts, and it does not operate on a timescale that works well on a LAN. Only try running your own server once you have a bit of experience synchronising with something like the PTR and you think, 'Hey, I know exactly what that does, and I would like one!'
@@ -75,4 +77,4 @@ All of a server's files and options are stored in its accompanying .db file and
If you get to a point where you can no longer boot the repository, try running SQLite Studio and opening server.db. If the issue is simple--like manually changing the port number--you may be in luck. Send me an email if it is tricky.
-Remember that everything is breaking all the time. Make regular backups, and you'll minimise your problems.
\ No newline at end of file
+Remember that everything is breaking all the time. Make regular backups, and you'll minimise your problems.
diff --git a/docs/support.md b/docs/support.md
index 3f8199a3..65e5a6d1 100644
--- a/docs/support.md
+++ b/docs/support.md
@@ -1,8 +1,10 @@
---
-title: financial support
+title: Financial Support
---
-# can I contribute to hydrus development? { id="support" }
+# Financial Support
+
+## can I contribute to hydrus development? { id="support" }
I do not expect anything from anyone. I'm amazed and grateful that anyone wants to use my software and share tags with others. I enjoy the feedback and work, and I hope to keep putting completely free weekly releases out as long as there is more to do.
@@ -10,4 +12,4 @@ That said, as I have developed the software, several users have kindly offered t
I find the tactics of most internet fundraising very distasteful, especially when they promise something they then fail to deliver. I much prefer the 'if you like me and would like to contribute, then please do, meanwhile I'll keep doing what I do' model. I support several 'put out regular free content' creators on Patreon in this way, and I get a lot out of it, even though I have no direct reward beyond the knowledge that I helped some people do something neat.
-If you feel the same way about my work, I've set up a simple Patreon page [here](https://www.patreon.com/hydrus_dev). If you can help out, it is deeply appreciated.
\ No newline at end of file
+If you feel the same way about my work, I've set up a simple Patreon page [here](https://www.patreon.com/hydrus_dev). If you can help out, it is deeply appreciated.
diff --git a/docs/wine.md b/docs/wine.md
index 39302e3e..8b0d1791 100644
--- a/docs/wine.md
+++ b/docs/wine.md
@@ -1,5 +1,5 @@
---
-title: running in wine
+title: Running In Wine
---
# running a client or server in wine
@@ -27,4 +27,4 @@ Installation process:
---
-If you get the client running in Wine, please let me know how you get on!
\ No newline at end of file
+If you get the client running in Wine, please let me know how you get on!
diff --git a/hydrus/client/ClientOptions.py b/hydrus/client/ClientOptions.py
index 14ea3196..c77c716f 100644
--- a/hydrus/client/ClientOptions.py
+++ b/hydrus/client/ClientOptions.py
@@ -334,7 +334,7 @@ class ClientOptions( HydrusSerialisable.SerialisableBase ):
self._dictionary[ 'integers' ][ 'notebook_tab_alignment' ] = CC.DIRECTION_UP
- self._dictionary[ 'integers' ][ 'video_buffer_size_mb' ] = 96
+ self._dictionary[ 'integers' ][ 'video_buffer_size' ] = 96 * 1024 * 1024
self._dictionary[ 'integers' ][ 'related_tags_search_1_duration_ms' ] = 250
self._dictionary[ 'integers' ][ 'related_tags_search_2_duration_ms' ] = 2000
diff --git a/hydrus/client/ClientParsing.py b/hydrus/client/ClientParsing.py
index 634307c3..4f3e480b 100644
--- a/hydrus/client/ClientParsing.py
+++ b/hydrus/client/ClientParsing.py
@@ -2799,7 +2799,9 @@ class ParseNodeContentLink( HydrusSerialisable.SerialisableBase ):
def ParseURLs( self, job_key, parsing_text, referral_url ):
- basic_urls = self._formula.Parse( {}, parsing_text )
+ collapse_newlines = True
+
+ basic_urls = self._formula.Parse( {}, parsing_text, collapse_newlines )
absolute_urls = [ urllib.parse.urljoin( referral_url, basic_url ) for basic_url in basic_urls ]
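Parsed URLs can be relative, so each one is resolved against the referral URL with `urljoin` before being handed on. A quick illustration of that step, using hypothetical URLs:

```python
import urllib.parse

# Hypothetical referral URL: the page the links were parsed from.
referral_url = 'https://example.com/gallery/page2'

# Relative links resolve against the referring page; absolute links pass through unchanged.
basic_urls = [ '/post/123', 'thumb/456', 'https://cdn.example.com/file.jpg' ]

absolute_urls = [ urllib.parse.urljoin( referral_url, basic_url ) for basic_url in basic_urls ]

# → [ 'https://example.com/post/123',
#     'https://example.com/gallery/thumb/456',
#     'https://cdn.example.com/file.jpg' ]
```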
diff --git a/hydrus/client/ClientRendering.py b/hydrus/client/ClientRendering.py
index 2de282f8..e7124a92 100644
--- a/hydrus/client/ClientRendering.py
+++ b/hydrus/client/ClientRendering.py
@@ -486,7 +486,7 @@ class RasterContainerVideo( RasterContainer ):
new_options = HG.client_controller.new_options
- video_buffer_size_mb = new_options.GetInteger( 'video_buffer_size_mb' )
+ video_buffer_size = new_options.GetInteger( 'video_buffer_size' )
duration = self._media.GetDuration()
num_frames_in_video = self._media.GetNumFrames()
@@ -515,7 +515,7 @@ class RasterContainerVideo( RasterContainer ):
self._average_frame_duration = duration / num_frames_in_video
- frame_buffer_length = ( video_buffer_size_mb * 1024 * 1024 ) // ( x * y * 3 )
+ frame_buffer_length = video_buffer_size // ( x * y * 3 )
# if we can't buffer the whole vid, then don't have a clunky massive buffer
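Since the option now stores bytes directly, the frame count falls out of a plain integer division. A quick check of the arithmetic, with hypothetical frame dimensions:

```python
# The video buffer option now holds raw bytes (old default of 96 MB becomes 96 * 1024 * 1024).
video_buffer_size = 96 * 1024 * 1024

# Hypothetical dimensions for illustration; each decoded RGB frame costs x * y * 3 bytes.
x, y = 1920, 1080

frame_buffer_length = video_buffer_size // ( x * y * 3 )

# 96 MiB holds 16 full-HD RGB frames.
```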
diff --git a/hydrus/client/db/ClientDB.py b/hydrus/client/db/ClientDB.py
index 295dc06d..0537277d 100644
--- a/hydrus/client/db/ClientDB.py
+++ b/hydrus/client/db/ClientDB.py
@@ -11518,6 +11518,72 @@ class DB( HydrusDB.HydrusDB ):
+ if version == 498:
+
+ try:
+
+ domain_manager = self.modules_serialisable.GetJSONDump( HydrusSerialisable.SERIALISABLE_TYPE_NETWORK_DOMAIN_MANAGER )
+
+ domain_manager.Initialise()
+
+ #
+
+ domain_manager.RenameGUG( 'twitter syndication profile lookup (limited)', 'twitter syndication profile lookup' )
+ domain_manager.RenameGUG( 'twitter syndication profile lookup (limited) (with replies)', 'twitter syndication profile lookup (with replies)' )
+
+ domain_manager.OverwriteDefaultGUGs( ( 'twitter syndication list lookup', 'twitter syndication likes lookup', 'twitter syndication collection lookup' ) )
+
+ domain_manager.OverwriteDefaultParsers( ( 'twitter syndication api profile parser', 'twitter syndication api tweet parser' ) )
+
+ domain_manager.OverwriteDefaultURLClasses( (
+ 'twitter list',
+ 'twitter syndication api collection',
+ 'twitter syndication api likes (user_id)',
+ 'twitter syndication api likes',
+ 'twitter syndication api list (list_id)',
+ 'twitter syndication api list (screen_name and slug)',
+ 'twitter syndication api list (user_id and slug)',
+ 'twitter syndication api profile (user_id)',
+ 'twitter syndication api profile',
+ 'twitter syndication api tweet',
+ 'twitter tweet'
+ ) )
+
+ #
+
+ domain_manager.TryToLinkURLClassesAndParsers()
+
+ #
+
+ self.modules_serialisable.SetJSONDump( domain_manager )
+
+ except Exception as e:
+
+ HydrusData.PrintException( e )
+
+ message = 'Trying to update some downloader objects failed! Please let hydrus dev know!'
+
+ self.pub_initial_message( message )
+
+
+ try:
+
+ new_options = self.modules_serialisable.GetJSONDump( HydrusSerialisable.SERIALISABLE_TYPE_CLIENT_OPTIONS )
+
+ new_options.SetInteger( 'video_buffer_size', new_options.GetInteger( 'video_buffer_size_mb' ) * 1024 * 1024 )
+
+ self.modules_serialisable.SetJSONDump( new_options )
+
+ except Exception as e:
+
+ HydrusData.PrintException( e )
+
+ message = 'Trying to update the video buffer option value failed! Please let hydrus dev know!'
+
+ self.pub_initial_message( message )
+
+
+
self._controller.frame_splash_status.SetTitleText( 'updated db to v{}'.format( HydrusData.ToHumanInt( version + 1 ) ) )
self._Execute( 'UPDATE version SET version = ?;', ( version + 1, ) )
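Both migration steps above share a guarded pattern: attempt the step, and on failure log the exception and notify the user rather than aborting the whole version update. A minimal sketch of that pattern, with hypothetical names:

```python
def run_migration_step( step, failure_message, log_exception, notify_user ):
    # Attempt one optional migration step; a failure is reported,
    # not allowed to abort the overall version bump.
    try:
        step()
    except Exception as e:
        log_exception( e )
        notify_user( failure_message )

notices = []

def failing_step():
    raise RuntimeError( 'simulated migration failure' )

def ok_step():
    pass

run_migration_step( failing_step, 'update failed, please report', lambda e: None, notices.append )
run_migration_step( ok_step, 'never shown', lambda e: None, notices.append )

# notices now holds only the failure message; execution continued past the error.
```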
diff --git a/hydrus/client/db/ClientDBFilesDuplicates.py b/hydrus/client/db/ClientDBFilesDuplicates.py
index 297b5f62..6632b499 100644
--- a/hydrus/client/db/ClientDBFilesDuplicates.py
+++ b/hydrus/client/db/ClientDBFilesDuplicates.py
@@ -416,7 +416,7 @@ class ClientDBFilesDuplicates( ClientDBModule.ClientDBModule ):
if king_hash_id not in preferred_hash_ids:
- king_hash_id = random.sample( preferred_hash_ids, 1 )[0]
+ king_hash_id = random.choice( list( preferred_hash_ids ) )
return king_hash_id
@@ -425,7 +425,7 @@ class ClientDBFilesDuplicates( ClientDBModule.ClientDBModule ):
if king_hash_id not in media_hash_ids:
- king_hash_id = random.sample( media_hash_ids, 1 )[0]
+ king_hash_id = random.choice( list( media_hash_ids ) )
return king_hash_id
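The switch from `random.sample( ..., 1 )[0]` to `random.choice( list( ... ) )` matters because newer Python dropped `random.sample`'s support for sets (deprecated in 3.9, removed in 3.11). A minimal sketch, assuming the ids live in a set as in the surrounding code:

```python
import random

hash_ids = { 101, 202, 303 }  # hypothetical hash ids, held in a set

# random.sample() requires a sequence on Python 3.11+, so passing a set raises TypeError there.
# Converting to a list and using random.choice() works on every version.
king_hash_id = random.choice( list( hash_ids ) )

# king_hash_id is always one of the original ids.
```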
@@ -926,8 +926,8 @@ class ClientDBFilesDuplicates( ClientDBModule.ClientDBModule ):
def DuplicatesGetPotentialDuplicatePairsTableJoinOnEverythingSearchResults( self, db_location_context: ClientDBFilesStorage.DBLocationContext, pixel_dupes_preference: int, max_hamming_distance: int ):
- tables = 'potential_duplicate_pairs, duplicate_file_members AS duplicate_file_members_smaller, duplicate_file_members AS duplicate_file_members_larger'
- join_predicate = 'smaller_media_id = duplicate_file_members_smaller.media_id AND larger_media_id = duplicate_file_members_larger.media_id AND distance <= {}'.format( max_hamming_distance )
+ tables = 'potential_duplicate_pairs, duplicate_files AS duplicate_files_smaller, duplicate_files AS duplicate_files_larger'
+ join_predicate = 'smaller_media_id = duplicate_files_smaller.media_id AND larger_media_id = duplicate_files_larger.media_id AND distance <= {}'.format( max_hamming_distance )
if not db_location_context.location_context.IsAllKnownFiles():
@@ -935,12 +935,12 @@ class ClientDBFilesDuplicates( ClientDBModule.ClientDBModule ):
tables = '{}, {} AS current_files_smaller, {} AS current_files_larger'.format( tables, files_table_name, files_table_name )
- join_predicate = '{} AND duplicate_file_members_smaller.hash_id = current_files_smaller.hash_id AND duplicate_file_members_larger.hash_id = current_files_larger.hash_id'.format( join_predicate )
+ join_predicate = '{} AND duplicate_files_smaller.king_hash_id = current_files_smaller.hash_id AND duplicate_files_larger.king_hash_id = current_files_larger.hash_id'.format( join_predicate )
if pixel_dupes_preference in ( CC.SIMILAR_FILES_PIXEL_DUPES_REQUIRED, CC.SIMILAR_FILES_PIXEL_DUPES_EXCLUDED ):
- join_predicate_pixel_dupes = 'duplicate_file_members_smaller.hash_id = pixel_hash_map_smaller.hash_id AND duplicate_file_members_larger.hash_id = pixel_hash_map_larger.hash_id AND pixel_hash_map_smaller.pixel_hash_id = pixel_hash_map_larger.pixel_hash_id'
+ join_predicate_pixel_dupes = 'duplicate_files_smaller.king_hash_id = pixel_hash_map_smaller.hash_id AND duplicate_files_larger.king_hash_id = pixel_hash_map_larger.hash_id AND pixel_hash_map_smaller.pixel_hash_id = pixel_hash_map_larger.pixel_hash_id'
if pixel_dupes_preference == CC.SIMILAR_FILES_PIXEL_DUPES_REQUIRED:
@@ -973,7 +973,7 @@ class ClientDBFilesDuplicates( ClientDBModule.ClientDBModule ):
files_table_name = db_location_context.GetSingleFilesTableName()
- table_join = 'potential_duplicate_pairs, duplicate_file_members AS duplicate_file_members_smaller, {} AS current_files_smaller, duplicate_file_members AS duplicate_file_members_larger, {} AS current_files_larger ON ( smaller_media_id = duplicate_file_members_smaller.media_id AND duplicate_file_members_smaller.hash_id = current_files_smaller.hash_id AND larger_media_id = duplicate_file_members_larger.media_id AND duplicate_file_members_larger.hash_id = current_files_larger.hash_id )'.format( files_table_name, files_table_name )
+ table_join = 'potential_duplicate_pairs, duplicate_files AS duplicate_files_smaller, {} AS current_files_smaller, duplicate_files AS duplicate_files_larger, {} AS current_files_larger ON ( smaller_media_id = duplicate_files_smaller.media_id AND duplicate_files_smaller.king_hash_id = current_files_smaller.hash_id AND larger_media_id = duplicate_files_larger.media_id AND duplicate_files_larger.king_hash_id = current_files_larger.hash_id )'.format( files_table_name, files_table_name )
return table_join
@@ -1030,15 +1030,15 @@ class ClientDBFilesDuplicates( ClientDBModule.ClientDBModule ):
# ████████████████████████████████████████████████████████████████████████
#
- base_tables = 'potential_duplicate_pairs, duplicate_file_members AS duplicate_file_members_smaller, duplicate_file_members AS duplicate_file_members_larger'
+ base_tables = 'potential_duplicate_pairs, duplicate_files AS duplicate_files_smaller, duplicate_files AS duplicate_files_larger'
- join_predicate_media_to_hashes = 'smaller_media_id = duplicate_file_members_smaller.media_id AND larger_media_id = duplicate_file_members_larger.media_id AND distance <= {}'.format( max_hamming_distance )
+ join_predicate_media_to_hashes = 'smaller_media_id = duplicate_files_smaller.media_id AND larger_media_id = duplicate_files_larger.media_id AND distance <= {}'.format( max_hamming_distance )
if both_files_match:
tables = '{}, {} AS results_smaller, {} AS results_larger'.format( base_tables, results_table_name, results_table_name )
- join_predicate_hashes_to_allowed_results = 'duplicate_file_members_smaller.hash_id = results_smaller.hash_id AND duplicate_file_members_larger.hash_id = results_larger.hash_id'
+ join_predicate_hashes_to_allowed_results = 'duplicate_files_smaller.king_hash_id = results_smaller.hash_id AND duplicate_files_larger.king_hash_id = results_larger.hash_id'
else:
@@ -1046,7 +1046,7 @@ class ClientDBFilesDuplicates( ClientDBModule.ClientDBModule ):
tables = '{}, {} AS results_table_for_this_query'.format( base_tables, results_table_name )
- join_predicate_hashes_to_allowed_results = '( duplicate_file_members_smaller.hash_id = results_table_for_this_query.hash_id OR duplicate_file_members_larger.hash_id = results_table_for_this_query.hash_id )'
+ join_predicate_hashes_to_allowed_results = '( duplicate_files_smaller.king_hash_id = results_table_for_this_query.hash_id OR duplicate_files_larger.king_hash_id = results_table_for_this_query.hash_id )'
else:
@@ -1054,9 +1054,9 @@ class ClientDBFilesDuplicates( ClientDBModule.ClientDBModule ):
tables = '{}, {} AS results_table_for_this_query, {} AS current_files_for_this_query'.format( base_tables, results_table_name, files_table_name )
- join_predicate_smaller_matches = '( duplicate_file_members_smaller.hash_id = results_table_for_this_query.hash_id AND duplicate_file_members_larger.hash_id = current_files_for_this_query.hash_id )'
+ join_predicate_smaller_matches = '( duplicate_files_smaller.king_hash_id = results_table_for_this_query.hash_id AND duplicate_files_larger.king_hash_id = current_files_for_this_query.hash_id )'
- join_predicate_larger_matches = '( duplicate_file_members_smaller.hash_id = current_files_for_this_query.hash_id AND duplicate_file_members_larger.hash_id = results_table_for_this_query.hash_id )'
+ join_predicate_larger_matches = '( duplicate_files_smaller.king_hash_id = current_files_for_this_query.hash_id AND duplicate_files_larger.king_hash_id = results_table_for_this_query.hash_id )'
join_predicate_hashes_to_allowed_results = '( {} OR {} )'.format( join_predicate_smaller_matches, join_predicate_larger_matches )
@@ -1064,7 +1064,7 @@ class ClientDBFilesDuplicates( ClientDBModule.ClientDBModule ):
if pixel_dupes_preference in ( CC.SIMILAR_FILES_PIXEL_DUPES_REQUIRED, CC.SIMILAR_FILES_PIXEL_DUPES_EXCLUDED ):
- join_predicate_pixel_dupes = 'duplicate_file_members_smaller.hash_id = pixel_hash_map_smaller.hash_id AND duplicate_file_members_larger.hash_id = pixel_hash_map_larger.hash_id AND pixel_hash_map_smaller.pixel_hash_id = pixel_hash_map_larger.pixel_hash_id'
+ join_predicate_pixel_dupes = 'duplicate_files_smaller.king_hash_id = pixel_hash_map_smaller.hash_id AND duplicate_files_larger.king_hash_id = pixel_hash_map_larger.hash_id AND pixel_hash_map_smaller.pixel_hash_id = pixel_hash_map_larger.pixel_hash_id'
if pixel_dupes_preference == CC.SIMILAR_FILES_PIXEL_DUPES_REQUIRED:
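The hunks above rewrite the duplicate-pair queries to join `duplicate_files` twice under the aliases `duplicate_files_smaller` and `duplicate_files_larger`, matching each side's `king_hash_id` against a current-files table. A minimal sketch of that double-alias pattern with stdlib `sqlite3` (the schema below is illustrative, not Hydrus's real one):

```python
import sqlite3

con = sqlite3.connect(':memory:')
con.execute('CREATE TABLE duplicate_files ( media_id INTEGER PRIMARY KEY, king_hash_id INTEGER )')
con.execute('CREATE TABLE current_files ( hash_id INTEGER PRIMARY KEY )')
con.execute('CREATE TABLE potential_duplicate_pairs ( smaller_media_id INTEGER, larger_media_id INTEGER )')

con.executemany('INSERT INTO duplicate_files VALUES ( ?, ? )', [(1, 10), (2, 20), (3, 30)])
con.executemany('INSERT INTO current_files VALUES ( ? )', [(10,), (20,)])  # king hash 30 is outside this file domain
con.executemany('INSERT INTO potential_duplicate_pairs VALUES ( ?, ? )', [(1, 2), (1, 3)])

# join the same table twice, once per side of the pair, and require
# each side's king hash to exist in the current files table
rows = con.execute(
    'SELECT smaller_media_id, larger_media_id FROM potential_duplicate_pairs, '
    'duplicate_files AS duplicate_files_smaller, duplicate_files AS duplicate_files_larger, '
    'current_files AS current_files_smaller, current_files AS current_files_larger '
    'WHERE smaller_media_id = duplicate_files_smaller.media_id '
    'AND larger_media_id = duplicate_files_larger.media_id '
    'AND duplicate_files_smaller.king_hash_id = current_files_smaller.hash_id '
    'AND duplicate_files_larger.king_hash_id = current_files_larger.hash_id'
).fetchall()

print(rows)  # only the (1, 2) pair survives; media 3's king hash is not current
```

The functional change in the patch is that only each group's king is tested against the file domain, rather than every `duplicate_file_members` row.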
diff --git a/hydrus/client/db/ClientDBSimilarFiles.py b/hydrus/client/db/ClientDBSimilarFiles.py
index 4f2516b0..c9553360 100644
--- a/hydrus/client/db/ClientDBSimilarFiles.py
+++ b/hydrus/client/db/ClientDBSimilarFiles.py
@@ -528,7 +528,18 @@ class ClientDBSimilarFiles( ClientDBModule.ClientDBModule ):
with self._MakeTemporaryIntegerTable( rebalance_perceptual_hash_ids, 'phash_id' ) as temp_table_name:
# temp perceptual hashes to tree
- ( biggest_perceptual_hash_id, ) = self._Execute( 'SELECT phash_id FROM {} CROSS JOIN shape_vptree USING ( phash_id ) ORDER BY inner_population + outer_population DESC;'.format( temp_table_name ) ).fetchone()
+ result = self._Execute( 'SELECT phash_id FROM {} CROSS JOIN shape_vptree USING ( phash_id ) ORDER BY inner_population + outer_population DESC;'.format( temp_table_name ) ).fetchone()
+
+ if result is None:
+
+ self._Execute( 'DELETE FROM shape_maintenance_branch_regen;' )
+
+ return
+
+ else:
+
+ ( biggest_perceptual_hash_id, ) = result
+
self._RegenerateBranch( job_key, biggest_perceptual_hash_id )
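The ClientDBSimilarFiles fix above guards against `fetchone()` returning `None` when the join yields no rows, instead of tuple-unpacking the result directly. A minimal reproduction of why the old line raised (illustrative schema):

```python
import sqlite3

con = sqlite3.connect(':memory:')
con.execute('CREATE TABLE shape_vptree ( phash_id INTEGER, inner_population INTEGER, outer_population INTEGER )')

result = con.execute(
    'SELECT phash_id FROM shape_vptree ORDER BY inner_population + outer_population DESC'
).fetchone()

# the old code did `( biggest_perceptual_hash_id, ) = result`, which raises
# TypeError when the result set is empty; the fix bails out cleanly instead
if result is None:
    biggest_perceptual_hash_id = None  # nothing to regenerate
else:
    ( biggest_perceptual_hash_id, ) = result

print(biggest_perceptual_hash_id)  # None on an empty tree
```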
diff --git a/hydrus/client/gui/ClientGUIMenus.py b/hydrus/client/gui/ClientGUIMenus.py
index ee9a4ca9..55ca6550 100644
--- a/hydrus/client/gui/ClientGUIMenus.py
+++ b/hydrus/client/gui/ClientGUIMenus.py
@@ -174,7 +174,11 @@ def AppendSeparator( menu ):
last_item = menu.actions()[-1]
- if not last_item.isSeparator():
+ # got this once, who knows what happened, so we test for QAction now
+ # 'PySide2.QtGui.QStandardItem' object has no attribute 'isSeparator'
+ last_item_is_separator = isinstance( last_item, QW.QAction ) and last_item.isSeparator()
+
+ if not last_item_is_separator:
menu.addSeparator()
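The `AppendSeparator` change replaces a direct `last_item.isSeparator()` call with an `isinstance` test first, since `menu.actions()[-1]` apparently returned a non-QAction once. The same defensive pattern, sketched without Qt (these two classes are stand-ins for `QW.QAction` and the stray `QStandardItem`):

```python
class FakeAction:
    """Stand-in for QW.QAction."""
    def __init__(self, separator=False):
        self._separator = separator
    def isSeparator(self):
        return self._separator

class FakeStandardItem:
    """Stand-in for the stray PySide2.QtGui.QStandardItem, which lacks isSeparator()."""
    pass

def last_item_is_separator(last_item):
    # check the type first, so isSeparator() is never called on an object
    # that does not have it; `and` short-circuits on the isinstance failure
    return isinstance(last_item, FakeAction) and last_item.isSeparator()

print(last_item_is_separator(FakeAction(separator=True)))   # True
print(last_item_is_separator(FakeAction(separator=False)))  # False
print(last_item_is_separator(FakeStandardItem()))           # False, no AttributeError
```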
diff --git a/hydrus/client/gui/ClientGUIPopupMessages.py b/hydrus/client/gui/ClientGUIPopupMessages.py
index fcbf3b54..45050913 100644
--- a/hydrus/client/gui/ClientGUIPopupMessages.py
+++ b/hydrus/client/gui/ClientGUIPopupMessages.py
@@ -588,6 +588,7 @@ class PopupMessage( PopupWindow ):
+
class PopupMessageManager( QW.QWidget ):
def __init__( self, parent ):
@@ -625,9 +626,7 @@ class PopupMessageManager( QW.QWidget ):
self._pending_job_keys = []
- self._gui_event_filter = QP.WidgetEventFilter( parent )
- self._gui_event_filter.EVT_SIZE( self.EventParentMovedOrResized )
- self._gui_event_filter.EVT_MOVE( self.EventParentMovedOrResized )
+ parent.installEventFilter( self )
HG.client_controller.sub( self, 'AddMessage', 'message' )
@@ -781,7 +780,7 @@ class PopupMessageManager( QW.QWidget ):
try:
- gui_frame = self.parentWidget()
+ gui_frame = HG.client_controller.gui
gui_is_hidden = not gui_frame.isVisible()
@@ -789,8 +788,6 @@ class PopupMessageManager( QW.QWidget ):
current_focus_tlw = QW.QApplication.activeWindow()
- self_is_active = current_focus_tlw == self
-
main_gui_or_child_window_is_active = ClientGUIFunctions.TLWOrChildIsActive( gui_frame )
num_messages_displayed = self._message_vbox.count()
@@ -799,6 +796,18 @@ class PopupMessageManager( QW.QWidget ):
if there_is_stuff_to_display:
+ # Unhiding tends to raise the main gui tlw in some window managers, which is annoying if a media viewer window has focus
+ show_is_not_annoying = main_gui_or_child_window_is_active or self._DisplayingError()
+
+ ok_to_show = show_is_not_annoying and not going_to_bug_out_at_hide_or_show
+
+ if ok_to_show:
+
+ self.show()
+
+
+ #
+
parent_size = gui_frame.size()
my_size = self.size()
@@ -816,16 +825,6 @@ class PopupMessageManager( QW.QWidget ):
- # Unhiding tends to raise the main gui tlw in some window managers, which is annoying if a media viewer window has focus
- show_is_not_annoying = main_gui_or_child_window_is_active or self._DisplayingError()
-
- ok_to_show = show_is_not_annoying and not going_to_bug_out_at_hide_or_show
-
- if ok_to_show:
-
- self.show()
-
-
else:
if not going_to_bug_out_at_hide_or_show:
@@ -1098,6 +1097,22 @@ class PopupMessageManager( QW.QWidget ):
self.MakeSureEverythingFits()
+ def eventFilter( self, watched, event ):
+
+ if watched == self.parentWidget():
+
+ if event.type() in ( QC.QEvent.Resize, QC.QEvent.Move ):
+
+ if self._OKToAlterUI():
+
+ self._SizeAndPositionAndShow()
+
+
+
+
+ return False
+
+
def resizeEvent( self, event ):
if not self or not QP.isValid( self ): # funny runtime error caused this
@@ -1113,21 +1128,6 @@ class PopupMessageManager( QW.QWidget ):
event.ignore()
- def EventParentMovedOrResized( self, event ):
-
- if not self or not QP.isValid( self ): # funny runtime error caused this
-
- return
-
-
- if self._OKToAlterUI():
-
- self._SizeAndPositionAndShow()
-
-
- return True # was: event.ignore()
-
-
def MakeSureEverythingFits( self ):
if self._OKToAlterUI():
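The PopupMessageManager hunks swap a wrapper-based `EVT_SIZE`/`EVT_MOVE` subscription for Qt's native `installEventFilter`/`eventFilter` pair, where returning `False` lets the event continue to the watched widget. A Qt-free sketch of that contract (the `Widget` dispatcher below is a stand-in for Qt's event delivery, not real Qt API):

```python
class Event:
    def __init__(self, type_):
        self.type = type_

class Widget:
    """Minimal stand-in for a QObject that supports event filters."""
    def __init__(self):
        self._filters = []
        self.handled = []
    def installEventFilter(self, f):
        self._filters.append(f)
    def send(self, event):
        # mirror Qt's contract: a filter returning True swallows the event
        for f in self._filters:
            if f.eventFilter(self, event):
                return
        self.handled.append(event.type)

class MoveResizeWatcher:
    def __init__(self):
        self.seen = []
    def eventFilter(self, watched, event):
        if event.type in ('Resize', 'Move'):
            self.seen.append(event.type)  # react, e.g. reposition the popup
        return False  # never swallow; the parent still processes the event

parent = Widget()
watcher = MoveResizeWatcher()
parent.installEventFilter(watcher)

parent.send(Event('Resize'))
parent.send(Event('Paint'))

print(watcher.seen, parent.handled)  # ['Resize'] ['Resize', 'Paint']
```

Note the contrast with `LayoutEventSilencer` later in this patch, which returns `True` for `LayoutRequest` events precisely to swallow them.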
diff --git a/hydrus/client/gui/ClientGUIScrolledPanelsManagement.py b/hydrus/client/gui/ClientGUIScrolledPanelsManagement.py
index adbafca8..79066a60 100644
--- a/hydrus/client/gui/ClientGUIScrolledPanelsManagement.py
+++ b/hydrus/client/gui/ClientGUIScrolledPanelsManagement.py
@@ -2864,8 +2864,8 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
buffer_panel = ClientGUICommon.StaticBox( self, 'video buffer' )
- self._video_buffer_size_mb = ClientGUICommon.BetterSpinBox( buffer_panel, min=48, max= 16 * 1024 )
- self._video_buffer_size_mb.valueChanged.connect( self.EventVideoBufferUpdate )
+ self._video_buffer_size = ClientGUIControls.BytesControl( buffer_panel )
+ self._video_buffer_size.valueChanged.connect( self.EventVideoBufferUpdate )
self._estimated_number_video_frames = QW.QLabel( '', buffer_panel )
@@ -2881,7 +2881,7 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
self._ideal_tile_dimension.setValue( self._new_options.GetInteger( 'ideal_tile_dimension' ) )
- self._video_buffer_size_mb.setValue( self._new_options.GetInteger( 'video_buffer_size_mb' ) )
+ self._video_buffer_size.SetValue( self._new_options.GetInteger( 'video_buffer_size' ) )
self._media_viewer_prefetch_delay_base_ms.setValue( self._new_options.GetInteger( 'media_viewer_prefetch_delay_base_ms' ) )
self._media_viewer_prefetch_num_previous.setValue( self._new_options.GetInteger( 'media_viewer_prefetch_num_previous' ) )
@@ -2929,7 +2929,7 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
video_buffer_sizer = QP.HBoxLayout()
- QP.AddToLayout( video_buffer_sizer, self._video_buffer_size_mb, CC.FLAGS_CENTER_PERPENDICULAR )
+ QP.AddToLayout( video_buffer_sizer, self._video_buffer_size, CC.FLAGS_CENTER_PERPENDICULAR )
QP.AddToLayout( video_buffer_sizer, self._estimated_number_video_frames, CC.FLAGS_CENTER_PERPENDICULAR )
#
@@ -2942,7 +2942,7 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
rows = []
- rows.append( ( 'MB memory reserved for thumbnail cache:', thumbnails_sizer ) )
+ rows.append( ( 'Memory reserved for thumbnail cache:', thumbnails_sizer ) )
rows.append( ( 'Thumbnail cache timeout:', self._thumbnail_cache_timeout ) )
gridbox = ClientGUICommon.WrapInGrid( thumbnail_cache_panel, rows )
@@ -2965,7 +2965,7 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
rows = []
- rows.append( ( 'MB memory reserved for image cache:', fullscreens_sizer ) )
+ rows.append( ( 'Memory reserved for image cache:', fullscreens_sizer ) )
rows.append( ( 'Image cache timeout:', self._image_cache_timeout ) )
rows.append( ( 'Maximum image size (in % of cache) that can be cached:', image_cache_storage_sizer ) )
rows.append( ( 'Maximum image size (in % of cache) that will be prefetched:', image_cache_prefetch_sizer ) )
@@ -2989,7 +2989,7 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
rows = []
- rows.append( ( 'MB memory reserved for image tile cache:', image_tiles_sizer ) )
+ rows.append( ( 'Memory reserved for image tile cache:', image_tiles_sizer ) )
rows.append( ( 'Image tile cache timeout:', self._image_tile_cache_timeout ) )
rows.append( ( 'Ideal tile width/height px:', self._ideal_tile_dimension ) )
@@ -3019,7 +3019,7 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
rows = []
- rows.append( ( 'MB memory for video buffer: ', video_buffer_sizer ) )
+ rows.append( ( 'Memory for video buffer: ', video_buffer_sizer ) )
gridbox = ClientGUICommon.WrapInGrid( buffer_panel, rows )
@@ -3041,7 +3041,7 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
self.EventImageCacheUpdate()
self.EventThumbnailsUpdate()
self.EventImageTilesUpdate()
- self.EventVideoBufferUpdate( self._video_buffer_size_mb.value() )
+ self.EventVideoBufferUpdate()
def EventImageCacheUpdate( self ):
@@ -3105,9 +3105,11 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
self._estimated_number_thumbnails.setText( '(at '+res_string+', about '+HydrusData.ToHumanInt(estimated_thumbs)+' thumbnails)' )
- def EventVideoBufferUpdate( self, value ):
+ def EventVideoBufferUpdate( self ):
- estimated_720p_frames = int( ( value * 1024 * 1024 ) // ( 1280 * 720 * 3 ) )
+ value = self._video_buffer_size.GetValue()
+
+ estimated_720p_frames = int( value // ( 1280 * 720 * 3 ) )
self._estimated_number_video_frames.setText( '(about '+HydrusData.ToHumanInt(estimated_720p_frames)+' frames of 720p video)' )
@@ -3131,7 +3133,7 @@ class ManageOptionsPanel( ClientGUIScrolledPanels.ManagePanel ):
self._new_options.SetInteger( 'image_cache_storage_limit_percentage', self._image_cache_storage_limit_percentage.value() )
self._new_options.SetInteger( 'image_cache_prefetch_limit_percentage', self._image_cache_prefetch_limit_percentage.value() )
- self._new_options.SetInteger( 'video_buffer_size_mb', self._video_buffer_size_mb.value() )
+ self._new_options.SetInteger( 'video_buffer_size', self._video_buffer_size.GetValue() )
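The options-panel hunks above move the video buffer from an MB spinbox to a `BytesControl` that stores raw bytes, so the 720p frame estimate now divides the byte count by one frame's size (1280 × 720 pixels × 3 bytes per pixel) without the old `* 1024 * 1024` scaling. The arithmetic, as a standalone check:

```python
def estimated_720p_frames(buffer_size_bytes):
    # one uncompressed 720p RGB frame: width * height * 3 bytes per pixel
    bytes_per_frame = 1280 * 720 * 3  # 2,764,800 bytes
    return int(buffer_size_bytes // bytes_per_frame)

# the old MB-based control at 96 and the new bytes-based control holding
# 96 MiB should produce the same estimate
frames = estimated_720p_frames(96 * 1024 * 1024)
print(frames)  # 36
```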
diff --git a/hydrus/client/gui/ClientGUIShortcuts.py b/hydrus/client/gui/ClientGUIShortcuts.py
index 5d71aaee..e85d7402 100644
--- a/hydrus/client/gui/ClientGUIShortcuts.py
+++ b/hydrus/client/gui/ClientGUIShortcuts.py
@@ -258,6 +258,8 @@ simple_shortcut_name_to_action_lookup = {
'custom' : SHORTCUTS_MEDIA_ACTIONS + SHORTCUTS_MEDIA_VIEWER_ACTIONS
}
+simple_shortcut_name_to_action_lookup = { key : HydrusData.DedupeList( value ) for ( key, value ) in simple_shortcut_name_to_action_lookup.items() }
+
CUMULATIVE_MOUSEWARP_MANHATTAN_LENGTH = 0
# ok, the problem here is that I get key codes that are converted, so if someone does shift+1 on a US keyboard, this ends up with Shift+! same with ctrl+alt+ to get accented characters
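The ClientGUIShortcuts line dedupes every action list once at module load, since entries like `'custom'` concatenate lists that share members. `HydrusData.DedupeList` is Hydrus's own helper; the sketch below assumes the common order-preserving behaviour, which may not match the real helper exactly:

```python
def dedupe_list(xs):
    # order-preserving dedupe; dict keys keep insertion order in Python 3.7+
    return list(dict.fromkeys(xs))

# illustrative action names: the combined 'custom' list repeats any action
# present in both source lists unless it is deduped
media_actions = ['copy_file', 'archive_file', 'delete_file']
viewer_actions = ['archive_file', 'close_media_viewer']

lookup = {'custom': media_actions + viewer_actions}
lookup = {key: dedupe_list(value) for (key, value) in lookup.items()}

print(lookup['custom'])  # ['copy_file', 'archive_file', 'delete_file', 'close_media_viewer']
```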
diff --git a/hydrus/client/gui/ClientGUITags.py b/hydrus/client/gui/ClientGUITags.py
index fe72e4ed..6fac25fe 100644
--- a/hydrus/client/gui/ClientGUITags.py
+++ b/hydrus/client/gui/ClientGUITags.py
@@ -2413,7 +2413,8 @@ class ManageTagsPanel( ClientGUIScrolledPanels.ManagePanel, CAC.ApplicationComma
suggestions = []
suggestions.append( 'mangled parse/typo' )
- suggestions.append( 'not applicable' )
+ suggestions.append( 'not applicable/incorrect' )
+ suggestions.append( 'clearing mass-pasted junk' )
suggestions.append( 'splitting filename/title/etc... into individual tags' )
with ClientGUIDialogs.DialogTextEntry( self, message, suggestions = suggestions ) as dlg:
diff --git a/hydrus/client/gui/QtPorting.py b/hydrus/client/gui/QtPorting.py
index e2898723..541a9cc0 100644
--- a/hydrus/client/gui/QtPorting.py
+++ b/hydrus/client/gui/QtPorting.py
@@ -1209,6 +1209,7 @@ class CallAfterEventCatcher( QC.QObject ):
return False
+
def CallAfter( fn, *args, **kwargs ):
diff --git a/hydrus/client/gui/canvas/ClientGUICanvas.py b/hydrus/client/gui/canvas/ClientGUICanvas.py
index 7eca4207..05f78516 100644
--- a/hydrus/client/gui/canvas/ClientGUICanvas.py
+++ b/hydrus/client/gui/canvas/ClientGUICanvas.py
@@ -229,6 +229,19 @@ class CanvasLayout( QW.QLayout ):
+class LayoutEventSilencer( QC.QObject ):
+
+ def eventFilter( self, watched, event ):
+
+ if watched == self.parent() and event.type() == QC.QEvent.LayoutRequest:
+
+ return True
+
+
+ return False
+
+
+
class Canvas( QW.QWidget, CAC.ApplicationCommandProcessorMixin ):
CANVAS_TYPE = CC.CANVAS_MEDIA_VIEWER
@@ -261,6 +274,9 @@ class Canvas( QW.QWidget, CAC.ApplicationCommandProcessorMixin ):
self._my_shortcuts_handler = ClientGUIShortcuts.ShortcutsHandler( self, [ 'media', 'media_viewer' ], catch_mouse = catch_mouse, ignore_activating_mouse_click = ignore_activating_mouse_click )
+ self._layout_silencer = LayoutEventSilencer( self )
+ self.installEventFilter( self._layout_silencer )
+
self._click_drag_reporting_filter = MediaContainerDragClickReportingFilter( self )
self.installEventFilter( self._click_drag_reporting_filter )
@@ -660,18 +676,6 @@ class Canvas( QW.QWidget, CAC.ApplicationCommandProcessorMixin ):
ClientGUIMediaActions.UndeleteMedia( self, ( self._current_media, ) )
- def event( self, event ):
-
- if event.type() == QC.QEvent.LayoutRequest:
-
- return True
-
- else:
-
- return QW.QWidget.event( self, event )
-
-
-
def CleanBeforeDestroy( self ):
self.ClearMedia()
@@ -704,7 +708,7 @@ class Canvas( QW.QWidget, CAC.ApplicationCommandProcessorMixin ):
self._media_container.ResetCenterPosition()
- self._last_drag_pos = None
+ self.EndDrag()
@@ -979,7 +983,7 @@ class Canvas( QW.QWidget, CAC.ApplicationCommandProcessorMixin ):
self._media_container.ResetCenterPosition()
- self._last_drag_pos = None
+ self.EndDrag()
def SetLocationContext( self, location_context: ClientLocation.LocationContext ):
@@ -1006,6 +1010,8 @@ class Canvas( QW.QWidget, CAC.ApplicationCommandProcessorMixin ):
if media != self._current_media:
+ self.EndDrag()
+
HG.client_controller.ResetIdleTimer()
self._SaveCurrentMediaViewTime()
@@ -2661,7 +2667,7 @@ class CanvasFilterDuplicates( CanvasWithHovers ):
self._media_container.ResetCenterPosition()
- self._last_drag_pos = None
+ self.EndDrag()
self._media_container.show()
@@ -3979,7 +3985,7 @@ class CanvasMediaListBrowser( CanvasMediaListNavigable ):
i_can_post_ratings = len( local_ratings_services ) > 0
- self._last_drag_pos = None # to stop successive right-click drag warp bug
+ self.EndDrag() # to stop successive right-click drag warp bug
locations_manager = self._current_media.GetLocationsManager()
diff --git a/hydrus/client/gui/canvas/ClientGUICanvasHoverFrames.py b/hydrus/client/gui/canvas/ClientGUICanvasHoverFrames.py
index 7278e59c..2feb2ccd 100644
--- a/hydrus/client/gui/canvas/ClientGUICanvasHoverFrames.py
+++ b/hydrus/client/gui/canvas/ClientGUICanvasHoverFrames.py
@@ -1372,7 +1372,7 @@ class NotePanel( QW.QWidget ):
return True
- return QW.QWidget.eventFilter( self, object, event )
+ return False
def heightForWidth( self, width: int ):
diff --git a/hydrus/client/gui/canvas/ClientGUIMPV.py b/hydrus/client/gui/canvas/ClientGUIMPV.py
index 7ab4ac03..5b659487 100644
--- a/hydrus/client/gui/canvas/ClientGUIMPV.py
+++ b/hydrus/client/gui/canvas/ClientGUIMPV.py
@@ -141,6 +141,15 @@ class mpvWidget( QW.QWidget, CAC.ApplicationCommandProcessorMixin ):
self._my_shortcut_handler = ClientGUIShortcuts.ShortcutsHandler( self, [], catch_mouse = True )
+ try:
+
+ self.we_are_newer_api = float( GetClientAPIVersionString() ) >= 2.0
+
+ except:
+
+ self.we_are_newer_api = False
+
+
def _GetAudioOptionNames( self ):
@@ -422,7 +431,7 @@ class mpvWidget( QW.QWidget, CAC.ApplicationCommandProcessorMixin ):
try:
- self._player.seek( time_index_s, reference = 'absolute' )
+ self._player.seek( time_index_s, reference = 'absolute', precision = 'exact' )
except:
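The mpvWidget hunk wraps `float( GetClientAPIVersionString() ) >= 2.0` in try/except and defaults to `False`, so an unparseable client API version string (e.g. from an odd libmpv build) degrades safely to the older-API code path. The pattern in isolation, with made-up version strings and the bare `except:` narrowed to the likely exception types:

```python
def is_newer_api(version_string):
    # libmpv client API version, e.g. '2.0' for the mpv-2.dll era;
    # fall back to the conservative answer if parsing fails
    try:
        return float(version_string) >= 2.0
    except (TypeError, ValueError):
        return False

print(is_newer_api('2.1'))      # True  -> new-API behaviour
print(is_newer_api('1.109'))    # False -> mpv-1.dll era
print(is_newer_api('unknown'))  # False -> safe default
print(is_newer_api(None))       # False -> safe default
```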
diff --git a/hydrus/client/gui/lists/ClientGUIListBoxes.py b/hydrus/client/gui/lists/ClientGUIListBoxes.py
index 52732580..e7010715 100644
--- a/hydrus/client/gui/lists/ClientGUIListBoxes.py
+++ b/hydrus/client/gui/lists/ClientGUIListBoxes.py
@@ -1882,7 +1882,7 @@ class ListBox( QW.QScrollArea ):
- return QW.QScrollArea.eventFilter( self, watched, event )
+ return False
def mouseMoveEvent( self, event ):
@@ -2425,569 +2425,571 @@ class ListBoxTags( ListBox ):
def ShowMenu( self ):
+ if len( self._ordered_terms ) == 0:
+
+ return
+
+
sub_selection_string = None
- if len( self._ordered_terms ) > 0:
+ selected_actual_tags = self._GetTagsFromTerms( self._selected_terms )
+
+ menu = QW.QMenu()
+
+ if self._terms_may_have_sibling_or_parent_info:
- selected_actual_tags = self._GetTagsFromTerms( self._selected_terms )
-
- menu = QW.QMenu()
-
- if self._terms_may_have_sibling_or_parent_info:
+ if self._show_parent_decorators:
- if self._show_parent_decorators:
+ if self._extra_parent_rows_allowed:
- if self._extra_parent_rows_allowed:
+ if len( self._ordered_terms ) != self._total_positional_rows:
- if len( self._ordered_terms ) != self._total_positional_rows:
-
- ClientGUIMenus.AppendMenuItem( menu, 'collapse parent rows', 'Show/hide parents.', self.SetExtraParentRowsAllowed, not self._extra_parent_rows_allowed )
-
+ ClientGUIMenus.AppendMenuItem( menu, 'collapse parent rows', 'Show/hide parents.', self.SetExtraParentRowsAllowed, not self._extra_parent_rows_allowed )
- else:
-
- ClientGUIMenus.AppendMenuItem( menu, 'expand parent rows', 'Show/hide parents.', self.SetExtraParentRowsAllowed, not self._extra_parent_rows_allowed )
-
-
- ClientGUIMenus.AppendMenuItem( menu, 'hide parent decorators', 'Show/hide parent info.', self.SetParentDecoratorsAllowed, not self._show_parent_decorators )
else:
- ClientGUIMenus.AppendMenuItem( menu, 'show parent decorators', 'Show/hide parent info.', self.SetParentDecoratorsAllowed, not self._show_parent_decorators )
+ ClientGUIMenus.AppendMenuItem( menu, 'expand parent rows', 'Show/hide parents.', self.SetExtraParentRowsAllowed, not self._extra_parent_rows_allowed )
- if self._show_sibling_decorators:
-
- ClientGUIMenus.AppendMenuItem( menu, 'hide sibling decorators', 'Show/hide sibling info.', self.SetSiblingDecoratorsAllowed, not self._show_sibling_decorators )
-
- else:
-
- ClientGUIMenus.AppendMenuItem( menu, 'show sibling decorators', 'Show/hide sibling info.', self.SetSiblingDecoratorsAllowed, not self._show_sibling_decorators )
-
-
- ClientGUIMenus.AppendSeparator( menu )
-
-
- copy_menu = QW.QMenu( menu )
-
- selected_copyable_tag_strings = self._GetCopyableTagStrings( COPY_SELECTED_TAGS )
- selected_copyable_subtag_strings = self._GetCopyableTagStrings( COPY_SELECTED_SUBTAGS )
-
- if len( selected_copyable_tag_strings ) == 1:
-
- ( selection_string, ) = selected_copyable_tag_strings
+ ClientGUIMenus.AppendMenuItem( menu, 'hide parent decorators', 'Show/hide parent info.', self.SetParentDecoratorsAllowed, not self._show_parent_decorators )
else:
- selection_string = '{} selected'.format( HydrusData.ToHumanInt( len( selected_copyable_tag_strings ) ) )
+ ClientGUIMenus.AppendMenuItem( menu, 'show parent decorators', 'Show/hide parent info.', self.SetParentDecoratorsAllowed, not self._show_parent_decorators )
- if len( selected_copyable_tag_strings ) > 0:
+ if self._show_sibling_decorators:
- ClientGUIMenus.AppendMenuItem( copy_menu, selection_string, 'Copy the selected tags to your clipboard.', self._ProcessMenuCopyEvent, COPY_SELECTED_TAGS )
+ ClientGUIMenus.AppendMenuItem( menu, 'hide sibling decorators', 'Show/hide sibling info.', self.SetSiblingDecoratorsAllowed, not self._show_sibling_decorators )
- if len( selected_copyable_subtag_strings ) == 1:
-
- # this does a quick test for 'are we selecting a namespaced tags' that also allows for having both 'samus aran' and 'character:samus aran'
- if set( selected_copyable_subtag_strings ) != set( selected_copyable_tag_strings ):
-
- ( sub_selection_string, ) = selected_copyable_subtag_strings
-
- ClientGUIMenus.AppendMenuItem( copy_menu, sub_selection_string, 'Copy the selected subtag to your clipboard.', self._ProcessMenuCopyEvent, COPY_SELECTED_SUBTAGS )
-
-
- else:
-
- sub_selection_string = '{} selected subtags'.format( HydrusData.ToHumanInt( len( selected_copyable_subtag_strings ) ) )
-
- ClientGUIMenus.AppendMenuItem( copy_menu, sub_selection_string, 'Copy the selected subtags to your clipboard.', self._ProcessMenuCopyEvent, COPY_SELECTED_SUBTAGS )
-
+ else:
- if self._HasCounts():
-
- ClientGUIMenus.AppendSeparator( copy_menu )
-
- ClientGUIMenus.AppendMenuItem( copy_menu, '{} with counts'.format( selection_string ), 'Copy the selected tags, with their counts, to your clipboard.', self._ProcessMenuCopyEvent, COPY_SELECTED_TAGS_WITH_COUNTS )
-
- if sub_selection_string is not None:
-
- ClientGUIMenus.AppendMenuItem( copy_menu, '{} with counts'.format( sub_selection_string ), 'Copy the selected subtags, with their counts, to your clipboard.', self._ProcessMenuCopyEvent, COPY_SELECTED_SUBTAGS_WITH_COUNTS )
-
-
+ ClientGUIMenus.AppendMenuItem( menu, 'show sibling decorators', 'Show/hide sibling info.', self.SetSiblingDecoratorsAllowed, not self._show_sibling_decorators )
- copy_all_is_appropriate = len( self._ordered_terms ) > len( self._selected_terms )
+ ClientGUIMenus.AppendSeparator( menu )
- if copy_all_is_appropriate:
+
+ copy_menu = QW.QMenu( menu )
+
+ selected_copyable_tag_strings = self._GetCopyableTagStrings( COPY_SELECTED_TAGS )
+ selected_copyable_subtag_strings = self._GetCopyableTagStrings( COPY_SELECTED_SUBTAGS )
+
+ if len( selected_copyable_tag_strings ) == 1:
+
+ ( selection_string, ) = selected_copyable_tag_strings
+
+ else:
+
+ selection_string = '{} selected'.format( HydrusData.ToHumanInt( len( selected_copyable_tag_strings ) ) )
+
+
+ if len( selected_copyable_tag_strings ) > 0:
+
+ ClientGUIMenus.AppendMenuItem( copy_menu, selection_string, 'Copy the selected tags to your clipboard.', self._ProcessMenuCopyEvent, COPY_SELECTED_TAGS )
+
+ if len( selected_copyable_subtag_strings ) == 1:
+
+ # this does a quick test for 'are we selecting a namespaced tags' that also allows for having both 'samus aran' and 'character:samus aran'
+ if set( selected_copyable_subtag_strings ) != set( selected_copyable_tag_strings ):
+
+ ( sub_selection_string, ) = selected_copyable_subtag_strings
+
+ ClientGUIMenus.AppendMenuItem( copy_menu, sub_selection_string, 'Copy the selected subtag to your clipboard.', self._ProcessMenuCopyEvent, COPY_SELECTED_SUBTAGS )
+
+
+ else:
+
+ sub_selection_string = '{} selected subtags'.format( HydrusData.ToHumanInt( len( selected_copyable_subtag_strings ) ) )
+
+ ClientGUIMenus.AppendMenuItem( copy_menu, sub_selection_string, 'Copy the selected subtags to your clipboard.', self._ProcessMenuCopyEvent, COPY_SELECTED_SUBTAGS )
+
+
+ if self._HasCounts():
ClientGUIMenus.AppendSeparator( copy_menu )
- ClientGUIMenus.AppendMenuItem( copy_menu, 'all tags', 'Copy all the tags in this list to your clipboard.', self._ProcessMenuCopyEvent, COPY_ALL_TAGS )
- ClientGUIMenus.AppendMenuItem( copy_menu, 'all subtags', 'Copy all the subtags in this list to your clipboard.', self._ProcessMenuCopyEvent, COPY_ALL_SUBTAGS )
+ ClientGUIMenus.AppendMenuItem( copy_menu, '{} with counts'.format( selection_string ), 'Copy the selected tags, with their counts, to your clipboard.', self._ProcessMenuCopyEvent, COPY_SELECTED_TAGS_WITH_COUNTS )
- if self._HasCounts():
+ if sub_selection_string is not None:
- ClientGUIMenus.AppendMenuItem( copy_menu, 'all tags with counts', 'Copy all the tags in this list, with their counts, to your clipboard.', self._ProcessMenuCopyEvent, COPY_ALL_TAGS_WITH_COUNTS )
- ClientGUIMenus.AppendMenuItem( copy_menu, 'all subtags with counts', 'Copy all the subtags in this list, with their counts, to your clipboard.', self._ProcessMenuCopyEvent, COPY_ALL_SUBTAGS_WITH_COUNTS )
+ ClientGUIMenus.AppendMenuItem( copy_menu, '{} with counts'.format( sub_selection_string ), 'Copy the selected subtags, with their counts, to your clipboard.', self._ProcessMenuCopyEvent, COPY_SELECTED_SUBTAGS_WITH_COUNTS )
- ClientGUIMenus.AppendMenu( menu, copy_menu, 'copy' )
+
+ copy_all_is_appropriate = len( self._ordered_terms ) > len( self._selected_terms )
+
+ if copy_all_is_appropriate:
- #
+ ClientGUIMenus.AppendSeparator( copy_menu )
- can_launch_sibling_and_parent_dialogs = len( selected_actual_tags ) > 0 and self.can_spawn_new_windows
- can_show_siblings_and_parents = len( selected_actual_tags ) == 1
+ ClientGUIMenus.AppendMenuItem( copy_menu, 'all tags', 'Copy all the tags in this list to your clipboard.', self._ProcessMenuCopyEvent, COPY_ALL_TAGS )
+ ClientGUIMenus.AppendMenuItem( copy_menu, 'all subtags', 'Copy all the subtags in this list to your clipboard.', self._ProcessMenuCopyEvent, COPY_ALL_SUBTAGS )
- if can_show_siblings_and_parents or can_launch_sibling_and_parent_dialogs:
+ if self._HasCounts():
- siblings_menu = QW.QMenu( menu )
- parents_menu = QW.QMenu( menu )
+ ClientGUIMenus.AppendMenuItem( copy_menu, 'all tags with counts', 'Copy all the tags in this list, with their counts, to your clipboard.', self._ProcessMenuCopyEvent, COPY_ALL_TAGS_WITH_COUNTS )
+ ClientGUIMenus.AppendMenuItem( copy_menu, 'all subtags with counts', 'Copy all the subtags in this list, with their counts, to your clipboard.', self._ProcessMenuCopyEvent, COPY_ALL_SUBTAGS_WITH_COUNTS )
- ClientGUIMenus.AppendMenu( menu, siblings_menu, 'siblings' )
- ClientGUIMenus.AppendMenu( menu, parents_menu, 'parents' )
+
+
+ ClientGUIMenus.AppendMenu( menu, copy_menu, 'copy' )
+
+ #
+
+ can_launch_sibling_and_parent_dialogs = len( selected_actual_tags ) > 0 and self.can_spawn_new_windows
+ can_show_siblings_and_parents = len( selected_actual_tags ) == 1
+
+ if can_show_siblings_and_parents or can_launch_sibling_and_parent_dialogs:
+
+ siblings_menu = QW.QMenu( menu )
+ parents_menu = QW.QMenu( menu )
+
+ ClientGUIMenus.AppendMenu( menu, siblings_menu, 'siblings' )
+ ClientGUIMenus.AppendMenu( menu, parents_menu, 'parents' )
+
+ if can_launch_sibling_and_parent_dialogs:
- if can_launch_sibling_and_parent_dialogs:
+ if len( selected_actual_tags ) == 1:
- if len( selected_actual_tags ) == 1:
+ ( tag, ) = selected_actual_tags
+
+ text = tag
+
+ else:
+
+ text = 'selection'
+
+
+ ClientGUIMenus.AppendMenuItem( siblings_menu, 'add siblings to ' + text, 'Add a sibling to this tag.', self._ProcessMenuTagEvent, 'sibling' )
+ ClientGUIMenus.AppendMenuItem( parents_menu, 'add parents to ' + text, 'Add a parent to this tag.', self._ProcessMenuTagEvent, 'parent' )
+
+
+ if can_show_siblings_and_parents:
+
+ ( selected_tag, ) = selected_actual_tags
+
+ def sp_work_callable():
+
+ selected_tag_to_service_keys_to_siblings_and_parents = HG.client_controller.Read( 'tag_siblings_and_parents_lookup', ( selected_tag, ) )
+
+ service_keys_to_siblings_and_parents = selected_tag_to_service_keys_to_siblings_and_parents[ selected_tag ]
+
+ return service_keys_to_siblings_and_parents
+
+
+ def sp_publish_callable( service_keys_to_siblings_and_parents ):
+
+ service_keys_in_order = HG.client_controller.services_manager.GetServiceKeys( HC.REAL_TAG_SERVICES )
+
+ all_siblings = set()
+
+ siblings_to_service_keys = collections.defaultdict( set )
+ parents_to_service_keys = collections.defaultdict( set )
+ children_to_service_keys = collections.defaultdict( set )
+
+ ideals_to_service_keys = collections.defaultdict( set )
+
+ for ( service_key, ( sibling_chain_members, ideal_tag, descendants, ancestors ) ) in service_keys_to_siblings_and_parents.items():
- ( tag, ) = selected_actual_tags
+ all_siblings.update( sibling_chain_members )
- text = tag
+ for sibling in sibling_chain_members:
+
+ if sibling == ideal_tag:
+
+ ideals_to_service_keys[ ideal_tag ].add( service_key )
+
+ continue
+
+
+ if sibling == selected_tag: # don't care about the selected tag unless it is ideal
+
+ continue
+
+
+ siblings_to_service_keys[ sibling ].add( service_key )
+
+
+ for ancestor in ancestors:
+
+ parents_to_service_keys[ ancestor ].add( service_key )
+
+
+ for descendant in descendants:
+
+ children_to_service_keys[ descendant ].add( service_key )
+
+
+
+ all_siblings.discard( selected_tag )
+
+ num_siblings = len( all_siblings )
+ num_parents = len( parents_to_service_keys )
+ num_children = len( children_to_service_keys )
+
+ service_keys_to_service_names = { service_key : HG.client_controller.services_manager.GetName( service_key ) for service_key in service_keys_in_order }
+
+ ALL_SERVICES_LABEL = 'all services'
+
+ def convert_service_keys_to_name_string( s_ks ):
+
+ if len( s_ks ) == len( service_keys_in_order ):
+
+ return ALL_SERVICES_LABEL
+
+
+ return ', '.join( ( service_keys_to_service_names[ service_key ] for service_key in service_keys_in_order if service_key in s_ks ) )
+
+
+ def group_and_sort_siblings_to_service_keys( t_to_s_ks ):
+
+ # convert "tag -> everywhere I am" to "sorted groups of locations -> what we have in common, also sorted"
+
+ service_key_groups_to_tags = collections.defaultdict( list )
+
+ for ( t, s_ks ) in t_to_s_ks.items():
+
+ service_key_groups_to_tags[ tuple( s_ks ) ].append( t )
+
+
+ tag_sort = ClientTagSorting.TagSort.STATICGetTextASCDefault()
+
+ for t_list in service_key_groups_to_tags.values():
+
+ ClientTagSorting.SortTags( tag_sort, t_list )
+
+
+ service_key_groups = sorted( service_key_groups_to_tags.keys(), key = lambda s_k_g: ( -len( s_k_g ), convert_service_keys_to_name_string( s_k_g ) ) )
+
+ service_key_group_names_and_tags = [ ( convert_service_keys_to_name_string( s_k_g ), service_key_groups_to_tags[ s_k_g ] ) for s_k_g in service_key_groups ]
+
+ return service_key_group_names_and_tags
+
+
+ def group_and_sort_parents_to_service_keys( p_to_s_ks, c_to_s_ks ):
+
+ # convert two lots of "tag -> everywhere I am" to "sorted groups of locations -> what we have in common, also sorted"
+
+ service_key_groups_to_tags = collections.defaultdict( lambda: ( [], [] ) )
+
+ for ( p, s_ks ) in p_to_s_ks.items():
+
+ service_key_groups_to_tags[ tuple( s_ks ) ][0].append( p )
+
+
+ for ( c, s_ks ) in c_to_s_ks.items():
+
+ service_key_groups_to_tags[ tuple( s_ks ) ][1].append( c )
+
+
+ tag_sort = ClientTagSorting.TagSort.STATICGetTextASCDefault()
+
+ for ( t_list_1, t_list_2 ) in service_key_groups_to_tags.values():
+
+ ClientTagSorting.SortTags( tag_sort, t_list_1 )
+ ClientTagSorting.SortTags( tag_sort, t_list_2 )
+
+
+ service_key_groups = sorted( service_key_groups_to_tags.keys(), key = lambda s_k_g: ( -len( s_k_g ), convert_service_keys_to_name_string( s_k_g ) ) )
+
+ service_key_group_names_and_tags = [ ( convert_service_keys_to_name_string( s_k_g ), service_key_groups_to_tags[ s_k_g ] ) for s_k_g in service_key_groups ]
+
+ return service_key_group_names_and_tags
+
+
+ if num_siblings == 0:
+
+ siblings_menu.setTitle( 'no siblings' )
else:
- text = 'selection'
+ siblings_menu.setTitle( '{} siblings'.format( HydrusData.ToHumanInt( num_siblings ) ) )
-
- ClientGUIMenus.AppendMenuItem( siblings_menu, 'add siblings to ' + text, 'Add a sibling to this tag.', self._ProcessMenuTagEvent, 'sibling' )
- ClientGUIMenus.AppendMenuItem( parents_menu, 'add parents to ' + text, 'Add a parent to this tag.', self._ProcessMenuTagEvent, 'parent' )
-
-
- if can_show_siblings_and_parents:
-
- ( selected_tag, ) = selected_actual_tags
-
- def sp_work_callable():
+ #
- selected_tag_to_service_keys_to_siblings_and_parents = HG.client_controller.Read( 'tag_siblings_and_parents_lookup', ( selected_tag, ) )
+ ClientGUIMenus.AppendSeparator( siblings_menu )
- service_keys_to_siblings_and_parents = selected_tag_to_service_keys_to_siblings_and_parents[ selected_tag ]
+ ideals = sorted( ideals_to_service_keys.keys(), key = HydrusTags.ConvertTagToSortable )
- return service_keys_to_siblings_and_parents
-
-
- def sp_publish_callable( service_keys_to_siblings_and_parents ):
-
- service_keys_in_order = HG.client_controller.services_manager.GetServiceKeys( HC.REAL_TAG_SERVICES )
-
- all_siblings = set()
-
- siblings_to_service_keys = collections.defaultdict( set )
- parents_to_service_keys = collections.defaultdict( set )
- children_to_service_keys = collections.defaultdict( set )
-
- ideals_to_service_keys = collections.defaultdict( set )
-
- for ( service_key, ( sibling_chain_members, ideal_tag, descendants, ancestors ) ) in service_keys_to_siblings_and_parents.items():
+ for ideal in ideals:
- all_siblings.update( sibling_chain_members )
-
- for sibling in sibling_chain_members:
+ if ideal == selected_tag:
- if sibling == ideal_tag:
-
- ideals_to_service_keys[ ideal_tag ].add( service_key )
-
- continue
-
-
- if sibling == selected_tag: # don't care about the selected tag unless it is ideal
-
- continue
-
-
- siblings_to_service_keys[ sibling ].add( service_key )
+ continue
- for ancestor in ancestors:
-
- parents_to_service_keys[ ancestor ].add( service_key )
-
+ ideal_label = 'ideal is "{}" on: {}'.format( ideal, convert_service_keys_to_name_string( ideals_to_service_keys[ ideal ] ) )
- for descendant in descendants:
-
- children_to_service_keys[ descendant ].add( service_key )
-
-
-
- all_siblings.discard( selected_tag )
-
- num_siblings = len( all_siblings )
- num_parents = len( parents_to_service_keys )
- num_children = len( children_to_service_keys )
-
- service_keys_to_service_names = { service_key : HG.client_controller.services_manager.GetName( service_key ) for service_key in service_keys_in_order }
-
- ALL_SERVICES_LABEL = 'all services'
-
- def convert_service_keys_to_name_string( s_ks ):
-
- if len( s_ks ) == len( service_keys_in_order ):
-
- return ALL_SERVICES_LABEL
-
-
- return ', '.join( ( service_keys_to_service_names[ service_key ] for service_key in service_keys_in_order if service_key in s_ks ) )
-
-
- def group_and_sort_siblings_to_service_keys( t_to_s_ks ):
-
- # convert "tag -> everywhere I am" to "sorted groups of locations -> what we have in common, also sorted"
-
- service_key_groups_to_tags = collections.defaultdict( list )
-
- for ( t, s_ks ) in t_to_s_ks.items():
-
- service_key_groups_to_tags[ tuple( s_ks ) ].append( t )
-
-
- tag_sort = ClientTagSorting.TagSort.STATICGetTextASCDefault()
-
- for t_list in service_key_groups_to_tags.values():
-
- ClientTagSorting.SortTags( tag_sort, t_list )
-
-
- service_key_groups = sorted( service_key_groups_to_tags.keys(), key = lambda s_k_g: ( -len( s_k_g ), convert_service_keys_to_name_string( s_k_g ) ) )
-
- service_key_group_names_and_tags = [ ( convert_service_keys_to_name_string( s_k_g ), service_key_groups_to_tags[ s_k_g ] ) for s_k_g in service_key_groups ]
-
- return service_key_group_names_and_tags
-
-
- def group_and_sort_parents_to_service_keys( p_to_s_ks, c_to_s_ks ):
-
- # convert two lots of "tag -> everywhere I am" to "sorted groups of locations -> what we have in common, also sorted"
-
- service_key_groups_to_tags = collections.defaultdict( lambda: ( [], [] ) )
-
- for ( p, s_ks ) in p_to_s_ks.items():
-
- service_key_groups_to_tags[ tuple( s_ks ) ][0].append( p )
-
-
- for ( c, s_ks ) in c_to_s_ks.items():
-
- service_key_groups_to_tags[ tuple( s_ks ) ][1].append( c )
-
-
- tag_sort = ClientTagSorting.TagSort.STATICGetTextASCDefault()
-
- for ( t_list_1, t_list_2 ) in service_key_groups_to_tags.values():
-
- ClientTagSorting.SortTags( tag_sort, t_list_1 )
- ClientTagSorting.SortTags( tag_sort, t_list_2 )
-
-
- service_key_groups = sorted( service_key_groups_to_tags.keys(), key = lambda s_k_g: ( -len( s_k_g ), convert_service_keys_to_name_string( s_k_g ) ) )
-
- service_key_group_names_and_tags = [ ( convert_service_keys_to_name_string( s_k_g ), service_key_groups_to_tags[ s_k_g ] ) for s_k_g in service_key_groups ]
-
- return service_key_group_names_and_tags
-
-
- if num_siblings == 0:
-
- siblings_menu.setTitle( 'no siblings' )
-
- else:
-
- siblings_menu.setTitle( '{} siblings'.format( HydrusData.ToHumanInt( num_siblings ) ) )
-
- #
-
- ClientGUIMenus.AppendSeparator( siblings_menu )
-
- ideals = sorted( ideals_to_service_keys.keys(), key = HydrusTags.ConvertTagToSortable )
-
- for ideal in ideals:
-
- if ideal == selected_tag:
-
- continue
-
-
- ideal_label = 'ideal is "{}" on: {}'.format( ideal, convert_service_keys_to_name_string( ideals_to_service_keys[ ideal ] ) )
-
- ClientGUIMenus.AppendMenuItem( siblings_menu, ideal_label, ideal_label, HG.client_controller.pub, 'clipboard', 'text', ideal )
-
-
- #
-
- for ( s_k_name, tags ) in group_and_sort_siblings_to_service_keys( siblings_to_service_keys ):
-
- ClientGUIMenus.AppendSeparator( siblings_menu )
-
- if s_k_name != ALL_SERVICES_LABEL:
-
- ClientGUIMenus.AppendMenuLabel( siblings_menu, '--{}--'.format( s_k_name ) )
-
-
- for tag in tags:
-
- ClientGUIMenus.AppendMenuLabel( siblings_menu, tag )
-
-
+ ClientGUIMenus.AppendMenuItem( siblings_menu, ideal_label, ideal_label, HG.client_controller.pub, 'clipboard', 'text', ideal )
#
- if num_parents + num_children == 0:
+ for ( s_k_name, tags ) in group_and_sort_siblings_to_service_keys( siblings_to_service_keys ):
- parents_menu.setTitle( 'no parents' )
+ ClientGUIMenus.AppendSeparator( siblings_menu )
- else:
+ if s_k_name != ALL_SERVICES_LABEL:
+
+ ClientGUIMenus.AppendMenuLabel( siblings_menu, '--{}--'.format( s_k_name ) )
+
- parents_menu.setTitle( '{} parents, {} children'.format( HydrusData.ToHumanInt( num_parents ), HydrusData.ToHumanInt( num_children ) ) )
+ for tag in tags:
+
+ ClientGUIMenus.AppendMenuLabel( siblings_menu, tag )
+
+
+
+
+ #
+
+ if num_parents + num_children == 0:
+
+ parents_menu.setTitle( 'no parents' )
+
+ else:
+
+ parents_menu.setTitle( '{} parents, {} children'.format( HydrusData.ToHumanInt( num_parents ), HydrusData.ToHumanInt( num_children ) ) )
+
+ ClientGUIMenus.AppendSeparator( parents_menu )
+
+ for ( s_k_name, ( parents, children ) ) in group_and_sort_parents_to_service_keys( parents_to_service_keys, children_to_service_keys ):
ClientGUIMenus.AppendSeparator( parents_menu )
- for ( s_k_name, ( parents, children ) ) in group_and_sort_parents_to_service_keys( parents_to_service_keys, children_to_service_keys ):
+ if s_k_name != ALL_SERVICES_LABEL:
- ClientGUIMenus.AppendSeparator( parents_menu )
+ ClientGUIMenus.AppendMenuLabel( parents_menu, '--{}--'.format( s_k_name ) )
- if s_k_name != ALL_SERVICES_LABEL:
-
- ClientGUIMenus.AppendMenuLabel( parents_menu, '--{}--'.format( s_k_name ) )
-
+
+ for parent in parents:
- for parent in parents:
-
- parent_label = 'parent: {}'.format( parent )
-
- ClientGUIMenus.AppendMenuItem( parents_menu, parent_label, parent_label, HG.client_controller.pub, 'clipboard', 'text', parent )
-
+ parent_label = 'parent: {}'.format( parent )
- for child in children:
-
- child_label = 'child: {}'.format( child )
-
- ClientGUIMenus.AppendMenuItem( parents_menu, child_label, child_label, HG.client_controller.pub, 'clipboard', 'text', child )
-
+ ClientGUIMenus.AppendMenuItem( parents_menu, parent_label, parent_label, HG.client_controller.pub, 'clipboard', 'text', parent )
+
+
+ for child in children:
+
+ child_label = 'child: {}'.format( child )
+
+ ClientGUIMenus.AppendMenuItem( parents_menu, child_label, child_label, HG.client_controller.pub, 'clipboard', 'text', child )
- async_job = ClientGUIAsync.AsyncQtJob( menu, sp_work_callable, sp_publish_callable )
-
- async_job.start()
-
+
+ async_job = ClientGUIAsync.AsyncQtJob( menu, sp_work_callable, sp_publish_callable )
+
+ async_job.start()
- if len( self._selected_terms ) > 0:
+
+ if len( self._selected_terms ) > 0:
+
+ ClientGUIMenus.AppendSeparator( menu )
+
+ ( predicates, or_predicate, inverse_predicates, namespace_predicate, inverse_namespace_predicate ) = self._GetSelectedPredicatesAndInverseCopies()
+
+ if len( predicates ) > 0:
+
+ if self.can_spawn_new_windows or self._CanProvideCurrentPagePredicates():
+
+ search_menu = QW.QMenu( menu )
+
+ ClientGUIMenus.AppendMenu( menu, search_menu, 'search' )
+
+
+ if self.can_spawn_new_windows:
+
+ ClientGUIMenus.AppendMenuItem( search_menu, 'open a new search page for ' + selection_string, 'Open a new search page starting with the selected predicates.', self._NewSearchPages, [ predicates ] )
+
+ if or_predicate is not None:
+
+ ClientGUIMenus.AppendMenuItem( search_menu, 'open a new OR search page for ' + selection_string, 'Open a new search page starting with the selection merged as an OR search predicate.', self._NewSearchPages, [ ( or_predicate, ) ] )
+
+
+ if len( predicates ) > 1:
+
+ for_each_predicates = [ ( predicate, ) for predicate in predicates ]
+
+ ClientGUIMenus.AppendMenuItem( search_menu, 'open new search pages for each in selection', 'Open one new search page for each selected predicate.', self._NewSearchPages, for_each_predicates )
+
+
+ ClientGUIMenus.AppendSeparator( search_menu )
+
+
+ if self._CanProvideCurrentPagePredicates():
+
+ current_predicates = self._GetCurrentPagePredicates()
+
+ predicates = set( predicates )
+ inverse_predicates = set( inverse_predicates )
+
+ if len( predicates ) == 1:
+
+ ( pred, ) = predicates
+
+ predicates_selection_string = pred.ToString( with_count = False )
+
+ else:
+
+ predicates_selection_string = 'selected'
+
+
+ some_selected_in_current = HydrusData.SetsIntersect( predicates, current_predicates )
+
+ if some_selected_in_current:
+
+ ClientGUIMenus.AppendMenuItem( search_menu, 'remove {} from current search'.format( predicates_selection_string ), 'Remove the selected predicates from the current search.', self._ProcessMenuPredicateEvent, 'remove_predicates' )
+
+
+ some_selected_not_in_current = len( predicates.intersection( current_predicates ) ) < len( predicates )
+
+ if some_selected_not_in_current:
+
+ ClientGUIMenus.AppendMenuItem( search_menu, 'add {} to current search'.format( predicates_selection_string ), 'Add the selected predicates to the current search.', self._ProcessMenuPredicateEvent, 'add_predicates' )
+
+
+ if or_predicate is not None:
+
+ ClientGUIMenus.AppendMenuItem( search_menu, 'add an OR of {} to current search'.format( predicates_selection_string ), 'Add the selected predicates as an OR predicate to the current search.', self._ProcessMenuPredicateEvent, 'add_or_predicate' )
+
+
+ some_selected_are_excluded_explicitly = HydrusData.SetsIntersect( inverse_predicates, current_predicates )
+
+ if some_selected_are_excluded_explicitly:
+
+ ClientGUIMenus.AppendMenuItem( search_menu, 'permit {} for current search'.format( predicates_selection_string ), 'Stop disallowing the selected predicates from the current search.', self._ProcessMenuPredicateEvent, 'remove_inverse_predicates' )
+
+
+ some_selected_are_not_excluded_explicitly = len( inverse_predicates.intersection( current_predicates ) ) < len( inverse_predicates )
+
+ if some_selected_are_not_excluded_explicitly:
+
+ ClientGUIMenus.AppendMenuItem( search_menu, 'exclude {} from the current search'.format( predicates_selection_string ), 'Disallow the selected predicates for the current search.', self._ProcessMenuPredicateEvent, 'add_inverse_predicates' )
+
+
+ if namespace_predicate is not None and namespace_predicate not in current_predicates:
+
+ ClientGUIMenus.AppendMenuItem( search_menu, 'add {} to current search'.format( namespace_predicate.ToString( with_count = False ) ), 'Add the namespace predicate to the current search.', self._ProcessMenuPredicateEvent, 'add_namespace_predicate' )
+
+
+ if inverse_namespace_predicate is not None and inverse_namespace_predicate not in current_predicates:
+
+ ClientGUIMenus.AppendMenuItem( search_menu, 'exclude {} from the current search'.format( namespace_predicate.ToString( with_count = False ) ), 'Disallow the namespace predicate from the current search.', self._ProcessMenuPredicateEvent, 'add_inverse_namespace_predicate' )
+
+
+
+ self._AddEditMenu( menu )
+
+
+ if len( selected_actual_tags ) > 0 and self._page_key is not None:
+
+ select_menu = QW.QMenu( menu )
+
+ tags_sorted_to_show_on_menu = HydrusTags.SortNumericTags( selected_actual_tags )
+
+ tags_sorted_to_show_on_menu_string = ', '.join( tags_sorted_to_show_on_menu )
+
+ while len( tags_sorted_to_show_on_menu_string ) > 64:
+
+ if len( tags_sorted_to_show_on_menu ) == 1:
+
+ tags_sorted_to_show_on_menu_string = '(many/long tags)'
+
+ else:
+
+ tags_sorted_to_show_on_menu.pop( -1 )
+
+ tags_sorted_to_show_on_menu_string = ', '.join( tags_sorted_to_show_on_menu + [ '\u2026' ] )
+
+
+
+ if len( selected_actual_tags ) == 1:
+
+ label = 'files with "{}"'.format( tags_sorted_to_show_on_menu_string )
+
+ else:
+
+ label = 'files with all of "{}"'.format( tags_sorted_to_show_on_menu_string )
+
+
+ ClientGUIMenus.AppendMenuItem( select_menu, label, 'Select the files with these tags.', self._SelectFilesWithTags, 'AND' )
+
+ if len( selected_actual_tags ) > 1:
+
+ label = 'files with any of "{}"'.format( tags_sorted_to_show_on_menu_string )
+
+ ClientGUIMenus.AppendMenuItem( select_menu, label, 'Select the files with any of these tags.', self._SelectFilesWithTags, 'OR' )
+
+
+ ClientGUIMenus.AppendMenu( menu, select_menu, 'select' )
+
+
+
+ if len( selected_actual_tags ) == 1:
+
+ ( selected_tag, ) = selected_actual_tags
+
+ if self._tag_display_type in ( ClientTags.TAG_DISPLAY_SINGLE_MEDIA, ClientTags.TAG_DISPLAY_SELECTION_LIST ):
ClientGUIMenus.AppendSeparator( menu )
- ( predicates, or_predicate, inverse_predicates, namespace_predicate, inverse_namespace_predicate ) = self._GetSelectedPredicatesAndInverseCopies()
+ ( namespace, subtag ) = HydrusTags.SplitTag( selected_tag )
- if len( predicates ) > 0:
-
- if self.can_spawn_new_windows or self._CanProvideCurrentPagePredicates():
-
- search_menu = QW.QMenu( menu )
-
- ClientGUIMenus.AppendMenu( menu, search_menu, 'search' )
-
-
- if self.can_spawn_new_windows:
-
- ClientGUIMenus.AppendMenuItem( search_menu, 'open a new search page for ' + selection_string, 'Open a new search page starting with the selected predicates.', self._NewSearchPages, [ predicates ] )
-
- if or_predicate is not None:
-
- ClientGUIMenus.AppendMenuItem( search_menu, 'open a new OR search page for ' + selection_string, 'Open a new search page starting with the selected merged as an OR search predicate.', self._NewSearchPages, [ ( or_predicate, ) ] )
-
-
- if len( predicates ) > 1:
-
- for_each_predicates = [ ( predicate, ) for predicate in predicates ]
-
- ClientGUIMenus.AppendMenuItem( search_menu, 'open new search pages for each in selection', 'Open one new search page for each selected predicate.', self._NewSearchPages, for_each_predicates )
-
-
- ClientGUIMenus.AppendSeparator( search_menu )
-
-
- if self._CanProvideCurrentPagePredicates():
-
- current_predicates = self._GetCurrentPagePredicates()
-
- predicates = set( predicates )
- inverse_predicates = set( inverse_predicates )
-
- if len( predicates ) == 1:
-
- ( pred, ) = predicates
-
- predicates_selection_string = pred.ToString( with_count = False )
-
- else:
-
- predicates_selection_string = 'selected'
-
-
- some_selected_in_current = HydrusData.SetsIntersect( predicates, current_predicates )
-
- if some_selected_in_current:
-
- ClientGUIMenus.AppendMenuItem( search_menu, 'remove {} from current search'.format( predicates_selection_string ), 'Remove the selected predicates from the current search.', self._ProcessMenuPredicateEvent, 'remove_predicates' )
-
-
- some_selected_not_in_current = len( predicates.intersection( current_predicates ) ) < len( predicates )
-
- if some_selected_not_in_current:
-
- ClientGUIMenus.AppendMenuItem( search_menu, 'add {} to current search'.format( predicates_selection_string ), 'Add the selected predicates to the current search.', self._ProcessMenuPredicateEvent, 'add_predicates' )
-
-
- if or_predicate is not None:
-
- ClientGUIMenus.AppendMenuItem( search_menu, 'add an OR of {} to current search'.format( predicates_selection_string ), 'Add the selected predicates as an OR predicate to the current search.', self._ProcessMenuPredicateEvent, 'add_or_predicate' )
-
-
- some_selected_are_excluded_explicitly = HydrusData.SetsIntersect( inverse_predicates, current_predicates )
-
- if some_selected_are_excluded_explicitly:
-
- ClientGUIMenus.AppendMenuItem( search_menu, 'permit {} for current search'.format( predicates_selection_string ), 'Stop disallowing the selected predicates from the current search.', self._ProcessMenuPredicateEvent, 'remove_inverse_predicates' )
-
-
- some_selected_are_not_excluded_explicitly = len( inverse_predicates.intersection( current_predicates ) ) < len( inverse_predicates )
-
- if some_selected_are_not_excluded_explicitly:
-
- ClientGUIMenus.AppendMenuItem( search_menu, 'exclude {} from the current search'.format( predicates_selection_string ), 'Disallow the selected predicates for the current search.', self._ProcessMenuPredicateEvent, 'add_inverse_predicates' )
-
-
- if namespace_predicate is not None and namespace_predicate not in current_predicates:
-
- ClientGUIMenus.AppendMenuItem( search_menu, 'add {} to current search'.format( namespace_predicate.ToString( with_count = False ) ), 'Add the namespace predicate to the current search.', self._ProcessMenuPredicateEvent, 'add_namespace_predicate' )
-
-
- if inverse_namespace_predicate is not None and inverse_namespace_predicate not in current_predicates:
-
- ClientGUIMenus.AppendMenuItem( search_menu, 'exclude {} from the current search'.format( namespace_predicate.ToString( with_count = False ) ), 'Disallow the namespace predicate from the current search.', self._ProcessMenuPredicateEvent, 'add_inverse_namespace_predicate' )
-
-
-
- self._AddEditMenu( menu )
-
+ hide_menu = QW.QMenu( menu )
- if len( selected_actual_tags ) > 0 and self._page_key is not None:
-
- select_menu = QW.QMenu( menu )
-
- tags_sorted_to_show_on_menu = HydrusTags.SortNumericTags( selected_actual_tags )
-
- tags_sorted_to_show_on_menu_string = ', '.join( tags_sorted_to_show_on_menu )
-
- while len( tags_sorted_to_show_on_menu_string ) > 64:
-
- if len( tags_sorted_to_show_on_menu ) == 1:
-
- tags_sorted_to_show_on_menu_string = '(many/long tags)'
-
- else:
-
- tags_sorted_to_show_on_menu.pop( -1 )
-
- tags_sorted_to_show_on_menu_string = ', '.join( tags_sorted_to_show_on_menu + [ '\u2026' ] )
-
-
-
- if len( selected_actual_tags ) == 1:
-
- label = 'files with "{}"'.format( tags_sorted_to_show_on_menu_string )
-
- else:
-
- label = 'files with all of "{}"'.format( tags_sorted_to_show_on_menu_string )
-
-
- ClientGUIMenus.AppendMenuItem( select_menu, label, 'Select the files with these tags.', self._SelectFilesWithTags, 'AND' )
-
- if len( selected_actual_tags ) > 1:
-
- label = 'files with any of "{}"'.format( tags_sorted_to_show_on_menu_string )
-
- ClientGUIMenus.AppendMenuItem( select_menu, label, 'Select the files with any of these tags.', self._SelectFilesWithTags, 'OR' )
-
-
- ClientGUIMenus.AppendMenu( menu, select_menu, 'select' )
-
+ ClientGUIMenus.AppendMenuItem( hide_menu, '"{}" tags from here'.format( ClientTags.RenderNamespaceForUser( namespace ) ), 'Hide this namespace from view in future.', self._ProcessMenuTagEvent, 'hide_namespace' )
+ ClientGUIMenus.AppendMenuItem( hide_menu, '"{}" from here'.format( selected_tag ), 'Hide this tag from view in future.', self._ProcessMenuTagEvent, 'hide' )
+
+ ClientGUIMenus.AppendMenu( menu, hide_menu, 'hide' )
- if len( selected_actual_tags ) == 1:
-
- ( selected_tag, ) = selected_actual_tags
-
- if self._tag_display_type in ( ClientTags.TAG_DISPLAY_SINGLE_MEDIA, ClientTags.TAG_DISPLAY_SELECTION_LIST ):
-
- ClientGUIMenus.AppendSeparator( menu )
-
- ( namespace, subtag ) = HydrusTags.SplitTag( selected_tag )
-
- hide_menu = QW.QMenu( menu )
-
- ClientGUIMenus.AppendMenuItem( hide_menu, '"{}" tags from here'.format( ClientTags.RenderNamespaceForUser( namespace ) ), 'Hide this namespace from view in future.', self._ProcessMenuTagEvent, 'hide_namespace' )
- ClientGUIMenus.AppendMenuItem( hide_menu, '"{}" from here'.format( selected_tag ), 'Hide this tag from view in future.', self._ProcessMenuTagEvent, 'hide' )
-
- ClientGUIMenus.AppendMenu( menu, hide_menu, 'hide' )
-
-
- def set_favourite_tags( tag ):
-
- favourite_tags = list( HG.client_controller.new_options.GetStringList( 'favourite_tags' ) )
-
- if selected_tag in favourite_tags:
-
- favourite_tags.remove( tag )
-
- else:
-
- favourite_tags.append( tag )
-
-
- HG.client_controller.new_options.SetStringList( 'favourite_tags', favourite_tags )
-
- HG.client_controller.pub( 'notify_new_favourite_tags' )
-
+ def set_favourite_tags( tag ):
favourite_tags = list( HG.client_controller.new_options.GetStringList( 'favourite_tags' ) )
if selected_tag in favourite_tags:
- label = 'remove "{}" from favourites'.format( selected_tag )
- description = 'Remove this tag from your favourites'
+ favourite_tags.remove( tag )
else:
- label = 'add "{}" to favourites'.format( selected_tag )
- description = 'Add this tag from your favourites'
+ favourite_tags.append( tag )
- favourites_menu = QW.QMenu( menu )
+ HG.client_controller.new_options.SetStringList( 'favourite_tags', favourite_tags )
- ClientGUIMenus.AppendMenuItem( favourites_menu, label, description, set_favourite_tags, selected_tag )
-
- m = ClientGUIMenus.AppendMenu( menu, favourites_menu, 'favourites' )
+ HG.client_controller.pub( 'notify_new_favourite_tags' )
- self.AddAdditionalMenuItems( menu )
+ favourite_tags = list( HG.client_controller.new_options.GetStringList( 'favourite_tags' ) )
- CGC.core().PopupMenu( self, menu )
+ if selected_tag in favourite_tags:
+
+ label = 'remove "{}" from favourites'.format( selected_tag )
+ description = 'Remove this tag from your favourites'
+
+ else:
+
+ label = 'add "{}" to favourites'.format( selected_tag )
+ description = 'Add this tag to your favourites'
+
+ favourites_menu = QW.QMenu( menu )
+
+ ClientGUIMenus.AppendMenuItem( favourites_menu, label, description, set_favourite_tags, selected_tag )
+
+ m = ClientGUIMenus.AppendMenu( menu, favourites_menu, 'favourites' )
+
+
+ self.AddAdditionalMenuItems( menu )
+
+ CGC.core().PopupMenu( self, menu )
def ForceTagRecalc( self ):
diff --git a/hydrus/client/gui/pages/ClientGUIManagement.py b/hydrus/client/gui/pages/ClientGUIManagement.py
index fe3f4216..9eaea94d 100644
--- a/hydrus/client/gui/pages/ClientGUIManagement.py
+++ b/hydrus/client/gui/pages/ClientGUIManagement.py
@@ -4332,7 +4332,10 @@ class ManagementPanelPetitions( ManagementPanel ):
self._reason_text = QW.QTextEdit( self._petition_panel )
self._reason_text.setReadOnly( True )
- self._reason_text.setMinimumHeight( 80 )
+
+ ( min_width, min_height ) = ClientGUIFunctions.ConvertTextToPixels( self._reason_text, ( 16, 6 ) )
+
+ self._reason_text.setFixedHeight( min_height )
check_all = ClientGUICommon.BetterButton( self._petition_panel, 'check all', self._CheckAll )
flip_selected = ClientGUICommon.BetterButton( self._petition_panel, 'flip selected', self._FlipSelected )
@@ -4349,14 +4352,14 @@ class ManagementPanelPetitions( ManagementPanel ):
( min_width, min_height ) = ClientGUIFunctions.ConvertTextToPixels( self._contents_add, ( 16, 20 ) )
- self._contents_add.setMinimumHeight( min_height )
+ self._contents_add.setFixedHeight( min_height )
self._contents_delete = ClientGUICommon.BetterCheckBoxList( self._petition_panel )
self._contents_delete.itemDoubleClicked.connect( self.ContentsDeleteDoubleClick )
( min_width, min_height ) = ClientGUIFunctions.ConvertTextToPixels( self._contents_delete, ( 16, 20 ) )
- self._contents_delete.setMinimumHeight( min_height )
+ self._contents_delete.setFixedHeight( min_height )
self._process = QW.QPushButton( 'process', self._petition_panel )
self._process.clicked.connect( self.EventProcess )
@@ -4395,8 +4398,8 @@ class ManagementPanelPetitions( ManagementPanel ):
self._petition_panel.Add( self._reason_text, CC.FLAGS_EXPAND_PERPENDICULAR )
self._petition_panel.Add( check_hbox, CC.FLAGS_EXPAND_SIZER_PERPENDICULAR )
self._petition_panel.Add( sort_hbox, CC.FLAGS_EXPAND_SIZER_PERPENDICULAR )
- self._petition_panel.Add( self._contents_add, CC.FLAGS_EXPAND_BOTH_WAYS )
- self._petition_panel.Add( self._contents_delete, CC.FLAGS_EXPAND_BOTH_WAYS )
+ self._petition_panel.Add( self._contents_add, CC.FLAGS_EXPAND_PERPENDICULAR )
+ self._petition_panel.Add( self._contents_delete, CC.FLAGS_EXPAND_PERPENDICULAR )
self._petition_panel.Add( self._process, CC.FLAGS_EXPAND_PERPENDICULAR )
self._petition_panel.Add( self._copy_account_key_button, CC.FLAGS_EXPAND_PERPENDICULAR )
self._petition_panel.Add( self._modify_petitioner, CC.FLAGS_EXPAND_PERPENDICULAR )
@@ -4407,7 +4410,7 @@ class ManagementPanelPetitions( ManagementPanel ):
QP.AddToLayout( vbox, self._media_collect, CC.FLAGS_EXPAND_PERPENDICULAR )
QP.AddToLayout( vbox, self._petitions_info_panel, CC.FLAGS_EXPAND_PERPENDICULAR )
- QP.AddToLayout( vbox, self._petition_panel, CC.FLAGS_EXPAND_BOTH_WAYS )
+ QP.AddToLayout( vbox, self._petition_panel, CC.FLAGS_EXPAND_PERPENDICULAR )
if service_type == HC.TAG_REPOSITORY:
@@ -4820,20 +4823,35 @@ class ManagementPanelPetitions( ManagementPanel ):
contents = self._contents_add
+ string_template = 'ADD: {}'
+
else:
contents = self._contents_delete
+ string_template = 'DELETE: {}'
+
contents.clear()
for ( i, ( content, check ) ) in enumerate( contents_and_checks ):
- content_string = content.ToString()
+ content_string = string_template.format( content.ToString() )
contents.Append( content_string, content, starts_checked = check )
+ if contents.count() > 0:
+
+ ideal_height_in_rows = min( 20, len( contents_and_checks ) )
+
+ pixels_per_row = contents.sizeHintForRow( 0 )
+
+ ideal_height_in_pixels = ( ideal_height_in_rows * pixels_per_row ) + ( contents.frameWidth() * 2 )
+
+ contents.setFixedHeight( ideal_height_in_pixels )
+
+
def _ShowHashes( self, hashes ):
diff --git a/hydrus/client/gui/parsing/ClientGUIParsing.py b/hydrus/client/gui/parsing/ClientGUIParsing.py
index 63e99abf..a51adf48 100644
--- a/hydrus/client/gui/parsing/ClientGUIParsing.py
+++ b/hydrus/client/gui/parsing/ClientGUIParsing.py
@@ -707,8 +707,6 @@ class EditContentParserPanel( ClientGUIScrolledPanels.EditPanel ):
vbox = QP.VBoxLayout()
label = 'Try to make sure you will only ever parse one text result here. A single content parser with a single note name producing eleven different note texts is going to be conflict hell for the user at the end.'
- label += os.linesep * 2
- label += 'Also this is prototype, it does not do anything yet!'
st = ClientGUICommon.BetterStaticText( self._notes_panel, label = label )
diff --git a/hydrus/client/gui/services/ClientGUIClientsideServices.py b/hydrus/client/gui/services/ClientGUIClientsideServices.py
index 033a5665..fe96dc05 100644
--- a/hydrus/client/gui/services/ClientGUIClientsideServices.py
+++ b/hydrus/client/gui/services/ClientGUIClientsideServices.py
@@ -2902,7 +2902,10 @@ class ReviewServiceRepositorySubPanel( QW.QWidget ):
message = 'Note that num file hashes and tags here include deleted content so will likely not line up with your review services value, which is only for current content.'
message += os.linesep * 2
- message += os.linesep.join( ( '{}: {}'.format( HC.service_info_enum_str_lookup[ int( info_type ) ], HydrusData.ToHumanInt( info ) ) for ( info_type, info ) in service_info_dict.items() ) )
+ tuples = [ ( HC.service_info_enum_str_lookup[ info_type ], HydrusData.ToHumanInt( service_info_dict[ info_type ] ) ) for info_type in l if info_type in service_info_dict ]
+ string_rows = [ '{}: {}'.format( info_type, info ) for ( info_type, info ) in tuples ]
+
+ message += os.linesep.join( string_rows )
QW.QMessageBox.information( self, 'Service Info', message )
diff --git a/hydrus/client/gui/widgets/ClientGUIControls.py b/hydrus/client/gui/widgets/ClientGUIControls.py
index 5a4c4df7..c47c0521 100644
--- a/hydrus/client/gui/widgets/ClientGUIControls.py
+++ b/hydrus/client/gui/widgets/ClientGUIControls.py
@@ -11,6 +11,7 @@ from hydrus.core import HydrusText
from hydrus.core.networking import HydrusNetworking
from hydrus.client import ClientConstants as CC
+from hydrus.client.gui import ClientGUIFunctions
from hydrus.client.gui import ClientGUIScrolledPanels
from hydrus.client.gui import ClientGUITime
from hydrus.client.gui import ClientGUITopLevelWindowsPanels
@@ -293,6 +294,10 @@ class BytesControl( QW.QWidget ):
QP.AddToLayout( hbox, self._unit, CC.FLAGS_CENTER_PERPENDICULAR )
self.setLayout( hbox )
+
+ min_width = ClientGUIFunctions.ConvertTextToPixelWidth( self._unit, 8 )
+
+ self._unit.setMinimumWidth( min_width )
self._spin.valueChanged.connect( self._HandleValueChanged )
self._unit.currentIndexChanged.connect( self._HandleValueChanged )
diff --git a/hydrus/client/media/ClientMedia.py b/hydrus/client/media/ClientMedia.py
index 74a2d240..b1426eec 100644
--- a/hydrus/client/media/ClientMedia.py
+++ b/hydrus/client/media/ClientMedia.py
@@ -1229,7 +1229,7 @@ class MediaList( object ):
self._collected_media = set()
self._selected_media = set()
- self._sorted_media = []
+ self._sorted_media = SortedList()
self._RecalcAfterMediaRemove()
diff --git a/hydrus/client/networking/ClientNetworkingDomain.py b/hydrus/client/networking/ClientNetworkingDomain.py
index 62f3e660..2399e18e 100644
--- a/hydrus/client/networking/ClientNetworkingDomain.py
+++ b/hydrus/client/networking/ClientNetworkingDomain.py
@@ -1619,6 +1619,31 @@ class NetworkDomainManager( HydrusSerialisable.SerialisableBase ):
+ def RenameGUG( self, original_name, new_name ):
+
+ with self._lock:
+
+ existing_gug_names_to_gugs = { gug.GetName() : gug for gug in self._gugs }
+
+ if original_name in existing_gug_names_to_gugs:
+
+ gug = existing_gug_names_to_gugs[ original_name ]
+
+ del existing_gug_names_to_gugs[ original_name ]
+
+ gug.SetName( new_name )
+
+ gug.SetNonDupeName( set( existing_gug_names_to_gugs.keys() ) )
+
+ existing_gug_names_to_gugs[ gug.GetName() ] = gug
+
+ new_gugs = list( existing_gug_names_to_gugs.values() )
+
+ self.SetGUGs( new_gugs )
+
+
+
def ReportNetworkInfrastructureError( self, url ):
with self._lock:
diff --git a/hydrus/client/networking/ClientNetworkingJobs.py b/hydrus/client/networking/ClientNetworkingJobs.py
index 280abcd5..f02279da 100644
--- a/hydrus/client/networking/ClientNetworkingJobs.py
+++ b/hydrus/client/networking/ClientNetworkingJobs.py
@@ -790,12 +790,12 @@ class NetworkJob( object ):
# cloudscraper refactored a bit around 1.2.60, so we now have some different paths to what we want
- old_module = None
- new_module = None
+ old_class_object = None
+ new_class_instance = None
if hasattr( cloudscraper, 'CloudScraper' ):
- old_module = getattr( cloudscraper, 'CloudScraper' )
+ old_class_object = getattr( cloudscraper, 'CloudScraper' )
if hasattr( cloudscraper, 'cloudflare' ):
@@ -804,13 +804,17 @@ class NetworkJob( object ):
if hasattr( m, 'Cloudflare' ):
- new_module = getattr( m, 'Cloudflare' )
+ new_class_object = getattr( m, 'Cloudflare' )
+
+ cs = cloudscraper.CloudScraper()
+
+ new_class_instance = new_class_object( cs )
possible_paths = [
- ( old_module, 'is_Firewall_Blocked' ),
- ( new_module, 'is_Firewall_Blocked' )
+ ( old_class_object, 'is_Firewall_Blocked' ),
+ ( new_class_instance, 'is_Firewall_Blocked' )
]
is_firewall = False
@@ -834,9 +838,9 @@ class NetworkJob( object ):
possible_paths = [
- ( old_module, 'is_reCaptcha_Challenge' ),
- ( old_module, 'is_Captcha_Challenge' ),
- ( new_module, 'is_Captcha_Challenge' )
+ ( old_class_object, 'is_reCaptcha_Challenge' ),
+ ( old_class_object, 'is_Captcha_Challenge' ),
+ ( new_class_instance, 'is_Captcha_Challenge' )
]
is_captcha = False
@@ -860,9 +864,9 @@ class NetworkJob( object ):
possible_paths = [
- ( old_module, 'is_IUAM_Challenge' ),
- ( new_module, 'is_IUAM_Challenge' ),
- ( new_module, 'is_New_IUAM_Challenge' )
+ ( old_class_object, 'is_IUAM_Challenge' ),
+ ( new_class_instance, 'is_IUAM_Challenge' ),
+ ( new_class_instance, 'is_New_IUAM_Challenge' )
]
is_iuam = False
diff --git a/hydrus/core/HydrusConstants.py b/hydrus/core/HydrusConstants.py
index 17d995c9..a0d0fb59 100644
--- a/hydrus/core/HydrusConstants.py
+++ b/hydrus/core/HydrusConstants.py
@@ -80,7 +80,7 @@ options = {}
# Misc
NETWORK_VERSION = 20
-SOFTWARE_VERSION = 498
+SOFTWARE_VERSION = 499
CLIENT_API_VERSION = 32
SERVER_THUMBNAIL_DIMENSIONS = ( 200, 200 )
diff --git a/hydrus/core/networking/HydrusServerResources.py b/hydrus/core/networking/HydrusServerResources.py
index a3eba640..c03b487d 100644
--- a/hydrus/core/networking/HydrusServerResources.py
+++ b/hydrus/core/networking/HydrusServerResources.py
@@ -675,7 +675,7 @@ class HydrusResource( Resource ):
request.setHeader( 'Content-Type', content_type )
request.setHeader( 'Content-Length', str( content_length ) )
request.setHeader( 'Content-Disposition', content_disposition )
- request.setHeader( 'Cache-Control', 'max-age={}'.format( 4 ) ) #hydurs won't change its mind about dynamic data under 4 seconds even if you ask repeatedly
+ request.setHeader( 'Cache-Control', 'max-age={}'.format( 4 ) ) # hydrus won't change its mind about dynamic data under 4 seconds even if you ask repeatedly
request.write( body_bytes )
diff --git a/mkdocs.yml b/mkdocs.yml
index c6aeb550..317de3fe 100644
--- a/mkdocs.yml
+++ b/mkdocs.yml
@@ -23,7 +23,7 @@ nav:
- Next Steps:
- adding_new_downloaders.md
- getting_started_subscriptions.md
- - filtering duplicates: duplicates.md
+ - duplicates.md
- Advanced:
- advanced_siblings.md
- advanced_parents.md
@@ -54,7 +54,7 @@ nav:
- downloader_parsers_full_example_file_page.md
- downloader_parsers_full_example_api.md
- downloader_completion.md
- - Sharing: downloader_sharing.md
+ - downloader_sharing.md
- downloader_login.md
- API:
- client_api.md
diff --git a/requirements_macos.txt b/requirements_macos.txt
index a8c5ca16..8a868d55 100644
--- a/requirements_macos.txt
+++ b/requirements_macos.txt
@@ -7,7 +7,7 @@ lxml>=4.5.0
lz4>=3.0.0
nose>=1.3.0
numpy>=1.16.0
-opencv-python-headless>=4.0.0, <=4.5.3.56
+opencv-python-headless>=4.0.0
Pillow>=6.0.0
psutil>=5.0.0
pylzma>=0.5.0
diff --git a/requirements_ubuntu.txt b/requirements_ubuntu.txt
index a8c5ca16..8a868d55 100644
--- a/requirements_ubuntu.txt
+++ b/requirements_ubuntu.txt
@@ -7,7 +7,7 @@ lxml>=4.5.0
lz4>=3.0.0
nose>=1.3.0
numpy>=1.16.0
-opencv-python-headless>=4.0.0, <=4.5.3.56
+opencv-python-headless>=4.0.0
Pillow>=6.0.0
psutil>=5.0.0
pylzma>=0.5.0
diff --git a/requirements_windows.txt b/requirements_windows.txt
index 77488458..46124e64 100644
--- a/requirements_windows.txt
+++ b/requirements_windows.txt
@@ -7,14 +7,14 @@ lxml>=4.5.0
lz4>=3.0.0
nose>=1.3.0
numpy>=1.16.0
-opencv-python-headless>=4.0.0, <=4.5.3.56
+opencv-python-headless>=4.0.0
Pillow>=6.0.0
psutil>=5.0.0
pylzma>=0.5.0
pyOpenSSL>=19.1.0
PySide2>=5.15.0
PySocks>=1.7.0
-python-mpv==0.5.2
+python-mpv==1.0.1
PyYAML>=5.0.0
QtPy>=1.9.0
requests==2.23.0
diff --git a/static/build_files/windows/client-winQt5.spec b/static/build_files/windows/client-winQt5.spec
index a26d72cd..224c4755 100644
--- a/static/build_files/windows/client-winQt5.spec
+++ b/static/build_files/windows/client-winQt5.spec
@@ -24,7 +24,7 @@ a = Analysis(['hydrus\\client.pyw'],
('hydrus\\db', 'db'),
('hydrus\\hydrus', 'hydrus'),
('hydrus\\sqlite3.dll', '.'),
- ('hydrus\\mpv-1.dll', '.'),
+ ('hydrus\\mpv-2.dll', '.'),
(cloudscraper_dir, 'cloudscraper'),
(shiboken_dir, 'shiboken2\\files.dir')
],
diff --git a/static/build_files/windows/client-winQt6.spec b/static/build_files/windows/client-winQt6.spec
index 928477d6..157bd3fb 100644
--- a/static/build_files/windows/client-winQt6.spec
+++ b/static/build_files/windows/client-winQt6.spec
@@ -22,7 +22,7 @@ a = Analysis(['hydrus\\client.pyw'],
('hydrus\\db', 'db'),
('hydrus\\hydrus', 'hydrus'),
('hydrus\\sqlite3.dll', '.'),
- ('hydrus\\mpv-1.dll', '.'),
+ ('hydrus\\mpv-2.dll', '.'),
(cloudscraper_dir, 'cloudscraper')
],
hiddenimports=['hydrus\\server.py', 'cloudscraper'],
diff --git a/static/build_files/windows/requirementsQt5.txt b/static/build_files/windows/requirementsQt5.txt
index 7806163e..dcf24fc6 100644
--- a/static/build_files/windows/requirementsQt5.txt
+++ b/static/build_files/windows/requirementsQt5.txt
@@ -14,7 +14,7 @@ pylzma>=0.5.0
pyOpenSSL>=19.1.0
PySide2>=5.15.0
PySocks>=1.7.0
-python-mpv==0.5.2
+python-mpv==1.0.1
PyYAML>=5.0.0
QtPy>=1.9.0
requests==2.23.0
diff --git a/static/build_files/windows/requirementsQt6.txt b/static/build_files/windows/requirementsQt6.txt
index 161ddb0b..94e0a823 100644
--- a/static/build_files/windows/requirementsQt6.txt
+++ b/static/build_files/windows/requirementsQt6.txt
@@ -14,7 +14,7 @@ pylzma>=0.5.0
pyOpenSSL>=19.1.0
PySide6>=6.0.0
PySocks>=1.7.0
-python-mpv==0.5.2
+python-mpv==1.0.1
PyYAML>=5.0.0
QtPy>=1.9.0
requests==2.23.0
diff --git a/static/default/gugs/twitter syndication collection lookup.png b/static/default/gugs/twitter syndication collection lookup.png
new file mode 100644
index 00000000..f3b10379
Binary files /dev/null and b/static/default/gugs/twitter syndication collection lookup.png differ
diff --git a/static/default/gugs/twitter syndication likes lookup.png b/static/default/gugs/twitter syndication likes lookup.png
new file mode 100644
index 00000000..b83d3ed6
Binary files /dev/null and b/static/default/gugs/twitter syndication likes lookup.png differ
diff --git a/static/default/gugs/twitter syndication list lookup.png b/static/default/gugs/twitter syndication list lookup.png
new file mode 100644
index 00000000..6771b9cf
Binary files /dev/null and b/static/default/gugs/twitter syndication list lookup.png differ
diff --git a/static/default/gugs/twitter syndication profile lookup (limited) (with replies).png b/static/default/gugs/twitter syndication profile lookup (limited) (with replies).png
deleted file mode 100644
index ac18843b..00000000
Binary files a/static/default/gugs/twitter syndication profile lookup (limited) (with replies).png and /dev/null differ
diff --git a/static/default/gugs/twitter syndication profile lookup (limited).png b/static/default/gugs/twitter syndication profile lookup (limited).png
deleted file mode 100644
index 38763d68..00000000
Binary files a/static/default/gugs/twitter syndication profile lookup (limited).png and /dev/null differ
diff --git a/static/default/gugs/twitter syndication profile lookup (with replies).png b/static/default/gugs/twitter syndication profile lookup (with replies).png
new file mode 100644
index 00000000..5b289d6c
Binary files /dev/null and b/static/default/gugs/twitter syndication profile lookup (with replies).png differ
diff --git a/static/default/gugs/twitter syndication profile lookup.png b/static/default/gugs/twitter syndication profile lookup.png
new file mode 100644
index 00000000..680fec67
Binary files /dev/null and b/static/default/gugs/twitter syndication profile lookup.png differ
diff --git a/static/default/parsers/twitter syndication api profile parser.png b/static/default/parsers/twitter syndication api profile parser.png
index 4bd95d9f..a1b3eb46 100644
Binary files a/static/default/parsers/twitter syndication api profile parser.png and b/static/default/parsers/twitter syndication api profile parser.png differ
diff --git a/static/default/parsers/twitter syndication api tweet parser.png b/static/default/parsers/twitter syndication api tweet parser.png
index 18764d40..678b6229 100644
Binary files a/static/default/parsers/twitter syndication api tweet parser.png and b/static/default/parsers/twitter syndication api tweet parser.png differ
diff --git a/static/default/url_classes/twitter list.png b/static/default/url_classes/twitter list.png
new file mode 100644
index 00000000..95e28236
Binary files /dev/null and b/static/default/url_classes/twitter list.png differ
diff --git a/static/default/url_classes/twitter syndication api collection.png b/static/default/url_classes/twitter syndication api collection.png
new file mode 100644
index 00000000..b08d4f7a
Binary files /dev/null and b/static/default/url_classes/twitter syndication api collection.png differ
diff --git a/static/default/url_classes/twitter syndication api likes (user_id).png b/static/default/url_classes/twitter syndication api likes (user_id).png
new file mode 100644
index 00000000..720342ee
Binary files /dev/null and b/static/default/url_classes/twitter syndication api likes (user_id).png differ
diff --git a/static/default/url_classes/twitter syndication api likes.png b/static/default/url_classes/twitter syndication api likes.png
new file mode 100644
index 00000000..417f9ee1
Binary files /dev/null and b/static/default/url_classes/twitter syndication api likes.png differ
diff --git a/static/default/url_classes/twitter syndication api list (list_id).png b/static/default/url_classes/twitter syndication api list (list_id).png
new file mode 100644
index 00000000..983dd088
Binary files /dev/null and b/static/default/url_classes/twitter syndication api list (list_id).png differ
diff --git a/static/default/url_classes/twitter syndication api list (screen_name and slug).png b/static/default/url_classes/twitter syndication api list (screen_name and slug).png
new file mode 100644
index 00000000..8546e0c3
Binary files /dev/null and b/static/default/url_classes/twitter syndication api list (screen_name and slug).png differ
diff --git a/static/default/url_classes/twitter syndication api list (user_id and slug).png b/static/default/url_classes/twitter syndication api list (user_id and slug).png
new file mode 100644
index 00000000..bff93d3d
Binary files /dev/null and b/static/default/url_classes/twitter syndication api list (user_id and slug).png differ
diff --git a/static/default/url_classes/twitter syndication api profile (user_id).png b/static/default/url_classes/twitter syndication api profile (user_id).png
new file mode 100644
index 00000000..4412799f
Binary files /dev/null and b/static/default/url_classes/twitter syndication api profile (user_id).png differ
diff --git a/static/default/url_classes/twitter syndication api profile.png b/static/default/url_classes/twitter syndication api profile.png
index db81d0f5..7914a875 100644
Binary files a/static/default/url_classes/twitter syndication api profile.png and b/static/default/url_classes/twitter syndication api profile.png differ
diff --git a/static/default/url_classes/twitter syndication api tweet.png b/static/default/url_classes/twitter syndication api tweet.png
index 3b0a94ec..b07b3855 100644
Binary files a/static/default/url_classes/twitter syndication api tweet.png and b/static/default/url_classes/twitter syndication api tweet.png differ
diff --git a/static/default/url_classes/twitter tweet.png b/static/default/url_classes/twitter tweet.png
index adacbfae..56421765 100644
Binary files a/static/default/url_classes/twitter tweet.png and b/static/default/url_classes/twitter tweet.png differ