diff --git a/db/help my mpv crashes with WASAPI or ASIO audio.txt b/db/help my mpv crashes with WASAPI or ASIO audio.txt
new file mode 100644
index 00000000..724c8869
--- /dev/null
+++ b/db/help my mpv crashes with WASAPI or ASIO audio.txt
@@ -0,0 +1,4 @@
+If your hydrus crashes as soon as you load a video in mpv, and your audio driver is ASIO or WASAPI, please add these lines to your mpv.conf:
+
+ao=wasapi
+audio-fallback-to-null=yes
\ No newline at end of file
diff --git a/help/changelog.html b/help/changelog.html
index dbd0e229..ee05f6e3 100755
--- a/help/changelog.html
+++ b/help/changelog.html
@@ -8,6 +8,31 @@
- This is now legacy! Use /get_services instead!
+ This is becoming obsolete and will be removed! Use /get_services instead!
Ask the client about its tag services.
Restricted access: YES. Add Tags permission needed.
@@ -517,14 +517,16 @@
Required Headers: n/a
Arguments (in JSON):
- - hash : (an SHA256 hash for a file in 64 characters of hexadecimal)
- - hashes : (a list of SHA256 hashes)
- - service_names_to_tags : (an Object of service names to lists of tags to be 'added' to the files)
- - service_names_to_actions_to_tags : (an Object of service names to content update actions to lists of tags)
- - add_siblings_and_parents : obsolete, now does nothing
+ - hash : (selective A, an SHA256 hash for a file in 64 characters of hexadecimal)
+ - hashes : (selective A, a list of SHA256 hashes)
+ - service_names_to_tags : (selective B, an Object of service names to lists of tags to be 'added' to the files)
+ - service_keys_to_tags : (selective B, an Object of service keys to lists of tags to be 'added' to the files)
+ - service_names_to_actions_to_tags : (selective B, an Object of service names to content update actions to lists of tags)
+ - service_keys_to_actions_to_tags : (selective B, an Object of service keys to content update actions to lists of tags)
- You can use either 'hash' or 'hashes', and you can use either the simple add-only 'service_names_to_tags' or the advanced 'service_names_to_actions_to_tags'.
- The service names are as in the /add_tags/get_tag_services call.
+ You can use either 'hash' or 'hashes'.
+ You can use either 'service_names_to...' or 'service_keys_to...'. Names are simple and human-friendly ("my tags" and similar), but may be renamed by the user; keys are a little more cumbersome but accurate and unique. Since a client may have multiple tag services with non-default names and pseudo-random keys, if it is not your client you will need to check the /get_services call to get the names or keys, and you may need some selection UI on your end so the user can pick what to do if there are multiple choices. I encourage using keys if you can.
+ Also, you can use either '...to_tags', which is simple and add-only, or '...to_actions_to_tags', which is more complicated and allows you to remove/petition or rescind pending content.
The permitted 'actions' are:
{
"hash" : "df2a7b286d21329fc496e3aa8b8a08b67bb1747ca32749acb3f5d544cbfc0f56",
- "service_names_to_actions_to_tags" : {
- "my tags" : {
+ "service_keys_to_actions_to_tags" : {
+ "6c6f63616c2074616773" : {
"0" : [ "character:supergirl", "rating:safe" ],
"1" : [ "character:superman" ]
},
- "public tag repository" : {
+ "aa0424b501237041dab0308c02c35454d377eebd74cfbc5b9d7b3e16cc2193e9" : {
"2" : [ "character:supergirl", "rating:safe" ],
"3" : [ "filename:image.jpg" ],
"4" : [ [ "creator:danban faga", "typo" ], [ "character:super_girl", "underscore" ] ]
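For reference, a request body like the example above can be sent with plain stdlib Python. This is a minimal sketch: the access key is a hypothetical placeholder, and 45869 is the Client API's default port.

```python
import json
import urllib.request

API_URL = 'http://127.0.0.1:45869'  # default Client API port; adjust to your client
ACCESS_KEY = '0123456789abcdef' * 4  # hypothetical 64-char hex access key

body = {
    'hash': 'df2a7b286d21329fc496e3aa8b8a08b67bb1747ca32749acb3f5d544cbfc0f56',
    'service_keys_to_actions_to_tags': {
        '6c6f63616c2074616773': {
            '0': ['character:supergirl', 'rating:safe'],  # action 0 = add
            '1': ['character:superman']                   # action 1 = delete
        }
    }
}

req = urllib.request.Request(
    API_URL + '/add_tags/add_tags',
    data=json.dumps(body).encode('utf-8'),
    headers={
        'Hydrus-Client-API-Access-Key': ACCESS_KEY,
        'Content-Type': 'application/json'
    }
)
# urllib.request.urlopen(req)  # uncomment to send against a running client
```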
@@ -686,7 +688,8 @@
destination_page_key : (optional page identifier for the page to receive the url)
destination_page_name : (optional page name to receive the url)
show_destination_page : (optional, defaulting to false, controls whether the UI will change pages on add)
- service_names_to_additional_tags : (optional tags to give to any files imported from this url)
+ service_names_to_additional_tags : (optional, selective, tags to give to any files imported from this url)
+ service_keys_to_additional_tags : (optional, selective, tags to give to any files imported from this url)
filterable_tags : (optional tags to be filtered by any tag import options that applies to the URL)
service_names_to_tags : (obsolete, legacy synonym for service_names_to_additional_tags)
@@ -694,7 +697,7 @@
If you specify a destination_page_name and an appropriate importer page already exists with that name, that page will be used. Otherwise, a new page with that name will be created (and used by subsequent calls with that name). Make sure that page name is unique (e.g. '/b/ threads', not 'watcher') in your client, or it may not be found.
Alternately, destination_page_key defines exactly which page should be used. Bear in mind this page key is only valid to the current session (they are regenerated on client reset or session reload), so you must figure out which one you want using the /manage_pages/get_pages call. If the correct page_key is not found, or the page it corresponds to is of the incorrect type, the standard page selection/creation rules will apply.
show_destination_page defaults to False to reduce flicker when adding many URLs to different pages quickly. If you turn it on, the client will behave like a URL drag and drop and select the final page the URL ends up on.
- service_names_to_additional_tags uses the same data structure as for /add_tags/add_tags. You will need 'add tags' permission, or this will 403. These tags work exactly as 'additional' tags work in a tag import options. They are service specific, and always added unless some advanced tag import options checkbox (like 'only add tags to new files') is set.
+ service_names_to_additional_tags and service_keys_to_additional_tags use the same data structure as in /add_tags/add_tags--service ids to a list of tags to add. You will need 'add tags' permission, or this will 403. These tags work exactly as 'additional' tags work in a tag import options. They are service specific, and always added unless some advanced tag import options checkbox (like 'only add tags to new files') is set.
filterable_tags works like the tags parsed by a hydrus downloader. It is just a list of strings. They have no inherent service and will be sent to a tag import options, if one exists, to decide which tag services get what. This parameter is useful if you are pulling all of a URL's tags outside of hydrus and want to have them processed like any other downloader, rather than figuring out service names and namespace filtering on your end. Note that in order for a tag import options to kick in, I think you will have to have a Post URL URL Class hydrus-side set up for the URL so some tag import options (whether that is Class-specific or just the default) can be loaded at import time.
Example request bodies:
@@ -1348,7 +1351,9 @@
"is_trashed" : false,
"known_urls" : [],
"service_names_to_statuses_to_tags" : {},
+ "service_keys_to_statuses_to_tags" : {},
"service_names_to_statuses_to_display_tags" : {},
+ "service_keys_to_statuses_to_display_tags" : {}
},
{
"file_id" : 4567,
@@ -1380,6 +1385,16 @@
"1" : [ "bodysuit" ]
}
},
+ "service_keys_to_statuses_to_tags" : {
+ "6c6f63616c2074616773" : {
+ "0" : [ "favourites" ],
+ "2" : [ "process this later" ]
+ },
+ "37e3849bda234f53b0e9792a036d14d4f3a9a136d1cb939705dbcd5287941db4" : {
+ "0" : [ "blonde_hair", "blue_eyes", "looking_at_viewer" ],
+ "1" : [ "bodysuit" ]
+ }
+ },
"service_names_to_statuses_to_display_tags" : {
"my tags" : {
"0" : [ "favourites" ],
@@ -1389,6 +1404,16 @@
"0" : [ "blonde hair", "blue eyes", "looking at viewer" ],
"1" : [ "bodysuit", "clothing" ]
}
+ },
+ },
+ "service_keys_to_statuses_to_display_tags" : {
+ "6c6f63616c2074616773" : {
+ "0" : [ "favourites" ],
+ "2" : [ "process this later", "processing" ]
+ },
+ "37e3849bda234f53b0e9792a036d14d4f3a9a136d1cb939705dbcd5287941db4" : {
+ "0" : [ "blonde hair", "blue eyes", "looking at viewer" ],
+ "1" : [ "bodysuit", "clothing" ]
+ }
}
}
]
@@ -1414,15 +1439,16 @@
Size is in bytes. Duration is in milliseconds, and may be an int or a float.
- The service_names_to_statuses_to_tags structures are similar to the /add_tags/add_tags scheme, excepting that the status numbers are:
+ The service_names_to_statuses_to_tags and service_keys_to_statuses_to_tags structures are similar to the /add_tags/add_tags scheme, excepting that the status numbers are:
- 0 - current
- 1 - pending
- 2 - deleted
- 3 - petitioned
+ The tag structure is duplicated for both 'name' and 'key'. The use of 'name' is an increasingly legacy issue--a hack from when the Client API was young--and 'service_names_to...' lookups are likely to be deleted in the future in favour of service_key. I recommend you move to service key when you can. To learn more about service names and keys on a client, use the /get_services call (and cache the response--it doesn't change much!).
Note that since JSON Object keys must be strings, these status numbers are strings, not ints.
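As an aside, these service keys are raw bytes rendered as hex strings. Most are pseudo-random, but the default local tag service's key happens to be the ASCII of its name, which you can verify:

```python
# The default local tag service's key is just hex-encoded ASCII.
service_key = '6c6f63616c2074616773'
print(bytes.fromhex(service_key).decode('ascii'))  # prints: local tags
```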
- While service_names_to_statuses_to_tags represents the actual tags stored on the database for a file, the service_names_to_statuses_to_display_tags structure reflects how tags appear in the UI, after siblings are collapsed and parents are added. If you want to edit a file's tags, use service_names_to_statuses_to_tags. If you want to render to the user, use service_names_to_statuses_to_displayed_tags.
+ While the service_XXX_to_statuses_to_tags structures represent the actual tags stored on the database for a file, the service_XXX_to_statuses_to_display_tags structures reflect how tags appear in the UI, after siblings are collapsed and parents are added. If you want to edit a file's tags, start with service_keys_to_statuses_to_tags. If you want to render to the user, use service_keys_to_statuses_to_display_tags.
If you add detailed_url_information=true, a new entry, 'detailed_known_urls', will be added for each file, with a list of the same structure as /add_urls/get_url_info. This may be an expensive request if you are querying thousands of files at once.
For example:
diff --git a/hydrus/client/ClientCaches.py b/hydrus/client/ClientCaches.py
index 73c6e40b..4b257dce 100644
--- a/hydrus/client/ClientCaches.py
+++ b/hydrus/client/ClientCaches.py
@@ -1102,7 +1102,6 @@ class ThumbnailCache( object ):
elif mime in HC.VIDEO: return self._special_thumbs[ 'video' ]
elif mime == HC.APPLICATION_PDF: return self._special_thumbs[ 'pdf' ]
elif mime == HC.APPLICATION_PSD: return self._special_thumbs[ 'psd' ]
- elif mime == HC.APPLICATION_CLIP: return self._special_thumbs[ 'clip' ]
elif mime in HC.ARCHIVES: return self._special_thumbs[ 'zip' ]
else: return self._special_thumbs[ 'hydrus' ]
diff --git a/hydrus/client/db/ClientDB.py b/hydrus/client/db/ClientDB.py
index 18b9399b..af309371 100644
--- a/hydrus/client/db/ClientDB.py
+++ b/hydrus/client/db/ClientDB.py
@@ -16506,6 +16506,35 @@ class DB( HydrusDB.HydrusDB ):
+ if version == 459:
+
+ try:
+
+ self._controller.frame_splash_status.SetSubtext( 'scheduling clip and apng files for regen' )
+
+ table_join = self.modules_files_storage.GetTableJoinLimitedByFileDomain( self.modules_services.combined_local_file_service_id, 'files_info', HC.CONTENT_STATUS_CURRENT )
+
+ from hydrus.client import ClientFiles
+
+ hash_ids = self._STL( self._Execute( 'SELECT hash_id FROM {} WHERE mime = ?;'.format( table_join ), ( HC.APPLICATION_CLIP, ) ) )
+
+ self.modules_files_maintenance_queue.AddJobs( hash_ids, ClientFiles.REGENERATE_FILE_DATA_JOB_FILE_METADATA )
+ self.modules_files_maintenance_queue.AddJobs( hash_ids, ClientFiles.REGENERATE_FILE_DATA_JOB_FORCE_THUMBNAIL )
+
+ hash_ids = self._STL( self._Execute( 'SELECT hash_id FROM {} WHERE mime = ?;'.format( table_join ), ( HC.IMAGE_APNG, ) ) )
+
+ self.modules_files_maintenance_queue.AddJobs( hash_ids, ClientFiles.REGENERATE_FILE_DATA_JOB_FILE_METADATA )
+
+ except Exception as e:
+
+ HydrusData.PrintException( e )
+
+ message = 'Trying to schedule clip and apng files for maintenance failed! Please let hydrus dev know!'
+
+ self.pub_initial_message( message )
+
+
+
self._controller.frame_splash_status.SetTitleText( 'updated db to v{}'.format( HydrusData.ToHumanInt( version + 1 ) ) )
self._Execute( 'UPDATE version SET version = ?;', ( version + 1, ) )
diff --git a/hydrus/client/gui/canvas/ClientGUICanvas.py b/hydrus/client/gui/canvas/ClientGUICanvas.py
index 61ed0f33..c24ad171 100644
--- a/hydrus/client/gui/canvas/ClientGUICanvas.py
+++ b/hydrus/client/gui/canvas/ClientGUICanvas.py
@@ -4248,6 +4248,10 @@ class CanvasMediaListBrowser( CanvasMediaListNavigable ):
return
+ else:
+
+ return
+
diff --git a/hydrus/client/gui/lists/ClientGUIListBoxes.py b/hydrus/client/gui/lists/ClientGUIListBoxes.py
index 6cf9d4fa..08226fb2 100644
--- a/hydrus/client/gui/lists/ClientGUIListBoxes.py
+++ b/hydrus/client/gui/lists/ClientGUIListBoxes.py
@@ -2283,6 +2283,11 @@ class ListBoxTags( ListBox ):
+ def _SelectFilesWithTags( self, select_type ):
+
+ pass
+
+
def _UpdateBackgroundColour( self ):
new_options = HG.client_controller.new_options
@@ -2809,13 +2814,13 @@ class ListBoxTags( ListBox ):
label = 'files with all of "{}"'.format( tags_sorted_to_show_on_menu_string )
- ClientGUIMenus.AppendMenuItem( select_menu, label, 'Select the files with these tags.', HG.client_controller.pub, 'select_files_with_tags', self._page_key, 'AND', set( selected_actual_tags ) )
+ ClientGUIMenus.AppendMenuItem( select_menu, label, 'Select the files with these tags.', self._SelectFilesWithTags, 'AND' )
if len( selected_actual_tags ) > 1:
label = 'files with any of "{}"'.format( tags_sorted_to_show_on_menu_string )
- ClientGUIMenus.AppendMenuItem( select_menu, label, 'Select the files with any of these tags.', HG.client_controller.pub, 'select_files_with_tags', self._page_key, 'OR', set( selected_actual_tags ) )
+ ClientGUIMenus.AppendMenuItem( select_menu, label, 'Select the files with any of these tags.', self._SelectFilesWithTags, 'OR' )
ClientGUIMenus.AppendMenu( menu, select_menu, 'select' )
@@ -3228,6 +3233,16 @@ class ListBoxTagsDisplayCapable( ListBoxTags ):
return work_callable
+ def _SelectFilesWithTags( self, and_or_or ):
+
+ if self._page_key is not None:
+
+ selected_actual_tags = self._GetTagsFromTerms( self._selected_terms )
+
+ HG.client_controller.pub( 'select_files_with_tags', self._page_key, self._service_key, and_or_or, set( selected_actual_tags ) )
+
+
+
def GetSelectedTags( self ):
return set( self._GetTagsFromTerms( self._selected_terms ) )
diff --git a/hydrus/client/gui/pages/ClientGUIResults.py b/hydrus/client/gui/pages/ClientGUIResults.py
index 1ab31b78..c8805c2d 100644
--- a/hydrus/client/gui/pages/ClientGUIResults.py
+++ b/hydrus/client/gui/pages/ClientGUIResults.py
@@ -2222,11 +2222,11 @@ class MediaPanel( ClientMedia.ListeningMediaList, QW.QScrollArea ):
- def SelectByTags( self, page_key, and_or_or, tags ):
+ def SelectByTags( self, page_key, tag_service_key, and_or_or, tags ):
if page_key == self._page_key:
- self._Select( ClientMedia.FileFilter( ClientMedia.FILE_FILTER_TAGS, ( and_or_or, tags ) ) )
+ self._Select( ClientMedia.FileFilter( ClientMedia.FILE_FILTER_TAGS, ( tag_service_key, and_or_or, tags ) ) )
self.setFocus( QC.Qt.OtherFocusReason )
diff --git a/hydrus/client/media/ClientMedia.py b/hydrus/client/media/ClientMedia.py
index 29999b8b..f37a5fb0 100644
--- a/hydrus/client/media/ClientMedia.py
+++ b/hydrus/client/media/ClientMedia.py
@@ -1174,17 +1174,17 @@ class MediaList( object ):
elif file_filter.filter_type == FILE_FILTER_TAGS:
- ( and_or_or, select_tags ) = file_filter.filter_data
+ ( tag_service_key, and_or_or, select_tags ) = file_filter.filter_data
if and_or_or == 'AND':
select_tags = set( select_tags )
- return sum( ( 1 for m in flat_media if select_tags.issubset( m.GetTagsManager().GetCurrentAndPending( CC.COMBINED_TAG_SERVICE_KEY, ClientTags.TAG_DISPLAY_ACTUAL ) ) ) )
+ return sum( ( 1 for m in flat_media if select_tags.issubset( m.GetTagsManager().GetCurrentAndPending( tag_service_key, ClientTags.TAG_DISPLAY_ACTUAL ) ) ) )
elif and_or_or == 'OR':
- return sum( ( 1 for m in flat_media if HydrusData.SetsIntersect( m.GetTagsManager().GetCurrentAndPending( CC.COMBINED_TAG_SERVICE_KEY, ClientTags.TAG_DISPLAY_ACTUAL ), select_tags ) ) )
+ return sum( ( 1 for m in flat_media if HydrusData.SetsIntersect( m.GetTagsManager().GetCurrentAndPending( tag_service_key, ClientTags.TAG_DISPLAY_ACTUAL ), select_tags ) ) )
@@ -1255,17 +1255,17 @@ class MediaList( object ):
elif file_filter.filter_type == FILE_FILTER_TAGS:
- ( and_or_or, select_tags ) = file_filter.filter_data
+ ( tag_service_key, and_or_or, select_tags ) = file_filter.filter_data
if and_or_or == 'AND':
select_tags = set( select_tags )
- filtered_media = [ m for m in flat_media if select_tags.issubset( m.GetTagsManager().GetCurrentAndPending( CC.COMBINED_TAG_SERVICE_KEY, ClientTags.TAG_DISPLAY_ACTUAL ) ) ]
+ filtered_media = [ m for m in flat_media if select_tags.issubset( m.GetTagsManager().GetCurrentAndPending( tag_service_key, ClientTags.TAG_DISPLAY_ACTUAL ) ) ]
elif and_or_or == 'OR':
- filtered_media = [ m for m in flat_media if HydrusData.SetsIntersect( m.GetTagsManager().GetCurrentAndPending( CC.COMBINED_TAG_SERVICE_KEY, ClientTags.TAG_DISPLAY_ACTUAL ), select_tags ) ]
+ filtered_media = [ m for m in flat_media if HydrusData.SetsIntersect( m.GetTagsManager().GetCurrentAndPending( tag_service_key, ClientTags.TAG_DISPLAY_ACTUAL ), select_tags ) ]
@@ -1321,17 +1321,17 @@ class MediaList( object ):
elif file_filter.filter_type == FILE_FILTER_TAGS:
- ( and_or_or, select_tags ) = file_filter.filter_data
+ ( tag_service_key, and_or_or, select_tags ) = file_filter.filter_data
if and_or_or == 'AND':
select_tags = set( select_tags )
- filtered_media = { m for m in self._sorted_media if select_tags.issubset( m.GetTagsManager().GetCurrentAndPending( CC.COMBINED_TAG_SERVICE_KEY, ClientTags.TAG_DISPLAY_ACTUAL ) ) }
+ filtered_media = { m for m in self._sorted_media if select_tags.issubset( m.GetTagsManager().GetCurrentAndPending( tag_service_key, ClientTags.TAG_DISPLAY_ACTUAL ) ) }
elif and_or_or == 'OR':
- filtered_media = { m for m in self._sorted_media if HydrusData.SetsIntersect( m.GetTagsManager().GetCurrentAndPending( CC.COMBINED_TAG_SERVICE_KEY, ClientTags.TAG_DISPLAY_ACTUAL ), select_tags ) }
+ filtered_media = { m for m in self._sorted_media if HydrusData.SetsIntersect( m.GetTagsManager().GetCurrentAndPending( tag_service_key, ClientTags.TAG_DISPLAY_ACTUAL ), select_tags ) }
@@ -1827,10 +1827,15 @@ class FileFilter( object ):
elif self.filter_type == FILE_FILTER_TAGS:
- ( and_or_or, select_tags ) = self.filter_data
+ ( tag_service_key, and_or_or, select_tags ) = self.filter_data
s = and_or_or.join( select_tags )
+ if tag_service_key != CC.COMBINED_TAG_SERVICE_KEY:
+
+ s = '{} on {}'.format( s, HG.client_controller.services_manager.GetName( tag_service_key ) )
+
+
s = HydrusText.ElideText( s, 64 )
elif self.filter_type == FILE_FILTER_MIME:
diff --git a/hydrus/client/networking/ClientLocalServerResources.py b/hydrus/client/networking/ClientLocalServerResources.py
index a1b4e2f9..4e496027 100644
--- a/hydrus/client/networking/ClientLocalServerResources.py
+++ b/hydrus/client/networking/ClientLocalServerResources.py
@@ -43,6 +43,7 @@ CLIENT_API_BYTE_PARAMS = { 'hash', 'destination_page_key', 'page_key', 'Hydrus-C
CLIENT_API_STRING_PARAMS = { 'name', 'url', 'domain', 'file_service_name', 'tag_service_name' }
CLIENT_API_JSON_PARAMS = { 'basic_permissions', 'system_inbox', 'system_archive', 'tags', 'file_ids', 'only_return_identifiers', 'detailed_url_information', 'simple', 'file_sort_asc' }
CLIENT_API_JSON_BYTE_LIST_PARAMS = { 'hashes' }
+CLIENT_API_JSON_BYTE_DICT_PARAMS = { 'service_keys_to_tags', 'service_keys_to_actions_to_tags', 'service_keys_to_additional_tags' }
def CheckHashLength( hashes, hash_type = 'sha256' ):
@@ -70,6 +71,26 @@ def CheckHashLength( hashes, hash_type = 'sha256' ):
+def ConvertServiceNamesDictToKeys( allowed_service_types, service_name_dict ):
+
+ service_key_dict = {}
+
+ for ( service_name, value ) in service_name_dict.items():
+
+ try:
+
+ service_key = HG.client_controller.services_manager.GetServiceKeyFromName( allowed_service_types, service_name )
+
+ except:
+
+ raise HydrusExceptions.BadRequestException( 'Could not find the service "{}", or it was the wrong type!'.format( service_name ) )
+
+
+ service_key_dict[ service_key ] = value
+
+
+ return service_key_dict
+
def ParseLocalBooruGETArgs( requests_args ):
args = HydrusNetworkVariableHandling.ParseTwistedRequestGETArgs( requests_args, LOCAL_BOORU_INT_PARAMS, LOCAL_BOORU_BYTE_PARAMS, LOCAL_BOORU_STRING_PARAMS, LOCAL_BOORU_JSON_PARAMS, LOCAL_BOORU_JSON_BYTE_LIST_PARAMS )
@@ -161,6 +182,52 @@ def ParseClientAPIPOSTByteArgs( args ):
+ for var_name in CLIENT_API_JSON_BYTE_DICT_PARAMS:
+
+ if var_name in parsed_request_args:
+
+ try:
+
+ raw_dict = parsed_request_args[ var_name ]
+
+ # In JSON, if someone puts 'null' for an optional value, treat that as 'did not enter anything'
+ if raw_dict is None:
+
+ del parsed_request_args[ var_name ]
+
+ continue
+
+
+ bytes_dict = {}
+
+ for ( key, value ) in raw_dict.items():
+
+ if len( key ) == 0:
+
+ continue
+
+
+ bytes_key = bytes.fromhex( key )
+
+ bytes_dict[ bytes_key ] = value
+
+
+ if len( bytes_dict ) == 0:
+
+ del parsed_request_args[ var_name ]
+
+ else:
+
+ parsed_request_args[ var_name ] = bytes_dict
+
+
+ except:
+
+ raise HydrusExceptions.BadRequestException( 'I was expecting to parse \'{}\' as a dictionary of hex strings to other data, but it failed.'.format( var_name ) )
+
+
+
+
return parsed_request_args
def ParseClientAPIPOSTArgs( request ):
@@ -650,10 +717,6 @@ class HydrusResourceBooruThumbnail( HydrusResourceBooru ):
path = os.path.join( HC.STATIC_DIR, 'psd.png' )
- elif mime == HC.APPLICATION_CLIP:
-
- path = os.path.join( HC.STATIC_DIR, 'clip.png' )
-
else:
path = os.path.join( HC.STATIC_DIR, 'hydrus.png' )
@@ -1216,30 +1279,34 @@ class HydrusResourceClientAPIRestrictedAddTagsAddTags( HydrusResourceClientAPIRe
#
- service_keys_to_content_updates = collections.defaultdict( list )
+ service_keys_to_tags = None
- if 'service_names_to_tags' in request.parsed_request_args:
+ if 'service_keys_to_tags' in request.parsed_request_args:
+
+ service_keys_to_tags = request.parsed_request_args.GetValue( 'service_keys_to_tags', dict )
+
+ elif 'service_names_to_tags' in request.parsed_request_args:
service_names_to_tags = request.parsed_request_args.GetValue( 'service_names_to_tags', dict )
- for ( service_name, tags ) in service_names_to_tags.items():
+ service_keys_to_tags = ConvertServiceNamesDictToKeys( HC.REAL_TAG_SERVICES, service_names_to_tags )
+
+
+ service_keys_to_actions_to_tags = None
+
+ if service_keys_to_tags is not None:
+
+ service_keys_to_actions_to_tags = {}
+
+ for ( service_key, tags ) in service_keys_to_tags.items():
try:
- service_key = HG.client_controller.services_manager.GetServiceKeyFromName( HC.REAL_TAG_SERVICES, service_name )
+ service = HG.client_controller.services_manager.GetService( service_key )
except:
- raise HydrusExceptions.BadRequestException( 'Could not find the service "{}"!'.format( service_name ) )
-
-
- service = HG.client_controller.services_manager.GetService( service_key )
-
- tags = HydrusTags.CleanTags( tags )
-
- if len( tags ) == 0:
-
- continue
+ raise HydrusExceptions.BadRequestException( 'Could not find the service with key {}! Maybe it was recently deleted?'.format( service_key.hex() ) )
if service.GetServiceType() == HC.LOCAL_TAG:
@@ -1251,105 +1318,121 @@ class HydrusResourceClientAPIRestrictedAddTagsAddTags( HydrusResourceClientAPIRe
content_action = HC.CONTENT_UPDATE_PEND
- content_updates = [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_MAPPINGS, content_action, ( tag, hashes ) ) for tag in tags ]
+ service_keys_to_actions_to_tags[ service_key ] = collections.defaultdict( set )
- service_keys_to_content_updates[ service_key ].extend( content_updates )
+ service_keys_to_actions_to_tags[ service_key ][ content_action ].update( tags )
- if 'service_names_to_actions_to_tags' in request.parsed_request_args:
+ if 'service_keys_to_actions_to_tags' in request.parsed_request_args:
+
+ service_keys_to_actions_to_tags = request.parsed_request_args.GetValue( 'service_keys_to_actions_to_tags', dict )
+
+ elif 'service_names_to_actions_to_tags' in request.parsed_request_args:
service_names_to_actions_to_tags = request.parsed_request_args.GetValue( 'service_names_to_actions_to_tags', dict )
- for ( service_name, actions_to_tags ) in service_names_to_actions_to_tags.items():
-
- try:
-
- service_key = HG.client_controller.services_manager.GetServiceKeyFromName( HC.REAL_TAG_SERVICES, service_name )
-
- except:
-
- raise HydrusExceptions.BadRequestException( 'Could not find the service "{}"!'.format( service_name ) )
-
+ service_keys_to_actions_to_tags = ConvertServiceNamesDictToKeys( HC.REAL_TAG_SERVICES, service_names_to_actions_to_tags )
+
+
+ if service_keys_to_actions_to_tags is None:
+
+ raise HydrusExceptions.BadRequestException( 'Need a service-names-to-tags parameter!' )
+
+
+ service_keys_to_content_updates = collections.defaultdict( list )
+
+ for ( service_key, actions_to_tags ) in service_keys_to_actions_to_tags.items():
+
+ try:
service = HG.client_controller.services_manager.GetService( service_key )
- for ( content_action, tags ) in actions_to_tags.items():
+ except HydrusExceptions.DataMissing:
+
+ raise HydrusExceptions.BadRequestException( 'Could not find the service with key {}! Maybe it was recently deleted?'.format( service_key.hex() ) )
+
+
+ if service.GetServiceType() not in HC.REAL_TAG_SERVICES:
+
+ raise HydrusExceptions.BadRequestException( 'Was given a service that is not a tag service!' )
+
+
+ for ( content_action, tags ) in actions_to_tags.items():
+
+ tags = list( tags )
+
+ if len( tags ) == 0:
- tags = list( tags )
+ continue
- if len( tags ) == 0:
+
+ content_action = int( content_action )
+
+ actual_tags = []
+
+ tags_to_reasons = {}
+
+ for tag_item in tags:
+
+ reason = 'Petitioned from API'
+
+ if isinstance( tag_item, str ):
- continue
+ tag = tag_item
-
- content_action = int( content_action )
-
- actual_tags = []
-
- tags_to_reasons = {}
-
- for tag_item in tags:
+ elif isinstance( tag_item, collections.abc.Collection ) and len( tag_item ) == 2:
- reason = 'Petitioned from API'
+ ( tag, reason ) = tag_item
- if isinstance( tag_item, str ):
-
- tag = tag_item
-
- elif isinstance( tag_item, collections.abc.Collection ) and len( tag_item ) == 2:
-
- ( tag, reason ) = tag_item
-
- if not ( isinstance( tag, str ) and isinstance( reason, str ) ):
-
- continue
-
-
- else:
-
- continue
-
-
- actual_tags.append( tag )
- tags_to_reasons[ tag ] = reason
-
-
- actual_tags = HydrusTags.CleanTags( actual_tags )
-
- if len( actual_tags ) == 0:
-
- continue
-
-
- tags = actual_tags
-
- if service.GetServiceType() == HC.LOCAL_TAG:
-
- if content_action not in ( HC.CONTENT_UPDATE_ADD, HC.CONTENT_UPDATE_DELETE ):
+ if not ( isinstance( tag, str ) and isinstance( reason, str ) ):
continue
else:
- if content_action in ( HC.CONTENT_UPDATE_ADD, HC.CONTENT_UPDATE_DELETE ):
-
- continue
-
+ continue
- if content_action == HC.CONTENT_UPDATE_PETITION:
+ actual_tags.append( tag )
+ tags_to_reasons[ tag ] = reason
+
+
+ actual_tags = HydrusTags.CleanTags( actual_tags )
+
+ if len( actual_tags ) == 0:
+
+ continue
+
+
+ tags = actual_tags
+
+ if service.GetServiceType() == HC.LOCAL_TAG:
+
+ if content_action not in ( HC.CONTENT_UPDATE_ADD, HC.CONTENT_UPDATE_DELETE ):
- content_updates = [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_MAPPINGS, content_action, ( tag, hashes ), reason = tags_to_reasons[ tag ] ) for tag in tags ]
-
- else:
-
- content_updates = [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_MAPPINGS, content_action, ( tag, hashes ) ) for tag in tags ]
+ continue
- service_keys_to_content_updates[ service_key ].extend( content_updates )
+ else:
+ if content_action in ( HC.CONTENT_UPDATE_ADD, HC.CONTENT_UPDATE_DELETE ):
+
+ continue
+
+
+
+ if content_action == HC.CONTENT_UPDATE_PETITION:
+
+ content_updates = [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_MAPPINGS, content_action, ( tag, hashes ), reason = tags_to_reasons[ tag ] ) for tag in tags ]
+
+ else:
+
+ content_updates = [ HydrusData.ContentUpdate( HC.CONTENT_TYPE_MAPPINGS, content_action, ( tag, hashes ) ) for tag in tags ]
+
+
+ service_keys_to_content_updates[ service_key ].extend( content_updates )
@@ -1630,10 +1713,10 @@ class HydrusResourceClientAPIRestrictedAddURLsImportURL( HydrusResourceClientAPI
additional_service_keys_to_tags = ClientTags.ServiceKeysToTags()
+ service_keys_to_additional_tags = None
+
if 'service_names_to_tags' in request.parsed_request_args or 'service_names_to_additional_tags' in request.parsed_request_args:
- request.client_api_permissions.CheckPermission( ClientAPI.CLIENT_API_PERMISSION_ADD_TAGS )
-
if 'service_names_to_tags' in request.parsed_request_args:
service_names_to_additional_tags = request.parsed_request_args.GetValue( 'service_names_to_tags', dict )
@@ -1643,15 +1726,24 @@ class HydrusResourceClientAPIRestrictedAddURLsImportURL( HydrusResourceClientAPI
service_names_to_additional_tags = request.parsed_request_args.GetValue( 'service_names_to_additional_tags', dict )
- for ( service_name, tags ) in service_names_to_additional_tags.items():
+ service_keys_to_additional_tags = ConvertServiceNamesDictToKeys( HC.REAL_TAG_SERVICES, service_names_to_additional_tags )
+
+ elif 'service_keys_to_additional_tags' in request.parsed_request_args:
+
+ service_keys_to_additional_tags = request.parsed_request_args.GetValue( 'service_keys_to_additional_tags', dict )
+
+
+ if service_keys_to_additional_tags is not None:
+
+ request.client_api_permissions.CheckPermission( ClientAPI.CLIENT_API_PERMISSION_ADD_TAGS )
+
+ for ( service_key, tags ) in service_keys_to_additional_tags.items():
- try:
+ service = HG.client_controller.services_manager.GetService( service_key )
+
+ if service.GetServiceType() not in HC.REAL_TAG_SERVICES:
- service_key = HG.client_controller.services_manager.GetServiceKeyFromName( HC.REAL_TAG_SERVICES, service_name )
-
- except:
-
- raise HydrusExceptions.BadRequestException( 'Could not find the service "{}"!'.format( service_name ) )
+ raise HydrusExceptions.BadRequestException( 'Was given a service that is not a tag service!' )
tags = HydrusTags.CleanTags( tags )
@@ -2034,6 +2126,7 @@ class HydrusResourceClientAPIRestrictedGetFilesFileMetadata( HydrusResourceClien
tags_manager = media_result.GetTagsManager()
service_names_to_statuses_to_tags = {}
+ api_service_keys_to_statuses_to_tags = {}
service_keys_to_statuses_to_tags = tags_manager.GetServiceKeysToStatusesToTags( ClientTags.TAG_DISPLAY_STORAGE )
@@ -2052,13 +2145,17 @@ class HydrusResourceClientAPIRestrictedGetFilesFileMetadata( HydrusResourceClien
service_names_to_statuses_to_tags[ service_name ] = statuses_to_tags_json_serialisable
+ api_service_keys_to_statuses_to_tags[ service_key.hex() ] = statuses_to_tags_json_serialisable
+
metadata_row[ 'service_names_to_statuses_to_tags' ] = service_names_to_statuses_to_tags
+ metadata_row[ 'service_keys_to_statuses_to_tags' ] = api_service_keys_to_statuses_to_tags
#
service_names_to_statuses_to_tags = {}
+ api_service_keys_to_statuses_to_tags = {}
service_keys_to_statuses_to_tags = tags_manager.GetServiceKeysToStatusesToTags( ClientTags.TAG_DISPLAY_ACTUAL )
@@ -2069,12 +2166,20 @@ class HydrusResourceClientAPIRestrictedGetFilesFileMetadata( HydrusResourceClien
service_keys_to_names[ service_key ] = services_manager.GetName( service_key )
- service_name = service_keys_to_names[ service_key ]
+ statuses_to_tags_json_serialisable = { str( status ) : sorted( tags, key = HydrusTags.ConvertTagToSortable ) for ( status, tags ) in statuses_to_tags.items() if len( tags ) > 0 }
- service_names_to_statuses_to_tags[ service_name ] = { str( status ) : sorted( tags, key = HydrusTags.ConvertTagToSortable ) for ( status, tags ) in statuses_to_tags.items() }
+ if len( statuses_to_tags_json_serialisable ) > 0:
+
+ service_name = service_keys_to_names[ service_key ]
+
+ service_names_to_statuses_to_tags[ service_name ] = statuses_to_tags_json_serialisable
+
+ api_service_keys_to_statuses_to_tags[ service_key.hex() ] = statuses_to_tags_json_serialisable
+
metadata_row[ 'service_names_to_statuses_to_display_tags' ] = service_names_to_statuses_to_tags
+ metadata_row[ 'service_keys_to_statuses_to_display_tags' ] = api_service_keys_to_statuses_to_tags
#
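The metadata hunks above mirror each `service_names_to_...` dict with a `service_keys_to_...` twin keyed by the hex-encoded service key, and now drop statuses that have no tags. A minimal sketch of that serialisation shape, with illustrative names and a plain `sorted` standing in for hydrus's tag sort:

```python
def serialise_statuses_to_tags(statuses_to_tags):
    # stringify status ints for JSON and drop statuses with no tags
    return {
        str(status): sorted(tags)
        for (status, tags) in statuses_to_tags.items()
        if len(tags) > 0
    }

def build_metadata_tags(service_key: bytes, service_name: str, statuses_to_tags):
    # emit both the legacy name-keyed dict and the new hex-key-keyed dict
    names_to_tags = {}
    keys_to_tags = {}

    s = serialise_statuses_to_tags(statuses_to_tags)

    if len(s) > 0:
        names_to_tags[service_name] = s
        keys_to_tags[service_key.hex()] = s  # hex so the key survives JSON

    return (names_to_tags, keys_to_tags)
```

Hex keys are stable across service renames, which is the point of adding them beside the name-keyed dicts.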
diff --git a/hydrus/client/networking/ClientNetworkingJobs.py b/hydrus/client/networking/ClientNetworkingJobs.py
index 0484aef3..785887f8 100644
--- a/hydrus/client/networking/ClientNetworkingJobs.py
+++ b/hydrus/client/networking/ClientNetworkingJobs.py
@@ -225,6 +225,7 @@ class NetworkJob( object ):
self._num_bytes_read = 0
self._num_bytes_to_read = 1
self._num_bytes_read_is_accurate = True
+ self._number_of_concurrent_empty_chunks = 0
self._file_import_options = None
@@ -541,10 +542,21 @@ class NetworkJob( object ):
if download_is_definitely_incomplete and not we_read_some_data:
- raise HydrusExceptions.NetworkException( 'The server appeared to want to send this URL in ranged chunks, but this chunk was empty!' )
+ self._number_of_concurrent_empty_chunks += 1
+
+ if self._number_of_concurrent_empty_chunks > 2:
+
+ raise HydrusExceptions.NetworkException( 'The server appeared to want to send this URL in ranged chunks, but this chunk was empty!' )
+
+
+ more_to_download = True
+
+ else:
+
+ self._number_of_concurrent_empty_chunks = 0
+
+ more_to_download = we_read_some_data and download_is_definitely_incomplete
-
- more_to_download = we_read_some_data and download_is_definitely_incomplete
if not more_to_download:
@@ -581,6 +593,7 @@ class NetworkJob( object ):
self._num_bytes_read = 0
self._num_bytes_to_read = 1
self._num_bytes_read_is_accurate = True
+ self._number_of_concurrent_empty_chunks = 0
def _SendRequestAndGetResponse( self ) -> requests.Response:
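The NetworkJob change above tolerates up to two consecutive empty ranged chunks before raising, instead of failing on the first. A standalone sketch of that counter pattern, with illustrative names rather than the hydrus API:

```python
class ChunkReader:
    """Tolerates a couple of consecutive empty ranged chunks before failing."""

    MAX_CONSECUTIVE_EMPTY_CHUNKS = 2

    def __init__(self):
        self._consecutive_empty_chunks = 0

    def process_chunk(self, we_read_some_data: bool, download_is_definitely_incomplete: bool) -> bool:
        # returns True if there is more to download; raises after too many
        # consecutive empty chunks
        if download_is_definitely_incomplete and not we_read_some_data:
            self._consecutive_empty_chunks += 1

            if self._consecutive_empty_chunks > self.MAX_CONSECUTIVE_EMPTY_CHUNKS:
                raise RuntimeError('The server wanted to send ranged chunks, but this chunk was empty!')

            return True  # give the server another chance

        self._consecutive_empty_chunks = 0  # any real data resets the counter

        return we_read_some_data and download_is_definitely_incomplete
```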
diff --git a/hydrus/core/HydrusClipHandling.py b/hydrus/core/HydrusClipHandling.py
new file mode 100644
index 00000000..01c72c70
--- /dev/null
+++ b/hydrus/core/HydrusClipHandling.py
@@ -0,0 +1,106 @@
+import sqlite3
+
+from hydrus.core import HydrusExceptions
+from hydrus.core import HydrusTemp
+
+def ExtractDBPNGToPath( path, temp_path ):
+
+ ( os_file_handle, sqlite_temp_path ) = HydrusTemp.GetTempPath()
+
+ db = None
+ c = None
+
+ try:
+
+ ( db, c ) = GetSQLiteDB( path, sqlite_temp_path )
+
+ ( png_bytes, ) = c.execute( 'SELECT ImageData FROM CanvasPreview;' ).fetchone()
+
+ with open( temp_path, 'wb' ) as f:
+
+ f.write( png_bytes )
+
+
+ finally:
+
+ if c is not None:
+
+ c.close()
+
+
+ if db is not None:
+
+ db.close()
+
+
+ HydrusTemp.CleanUpTempPath( os_file_handle, sqlite_temp_path )
+
+
+def GetResolution( path ):
+
+ ( os_file_handle, sqlite_temp_path ) = HydrusTemp.GetTempPath()
+
+ db = None
+ c = None
+
+ try:
+
+ ( db, c ) = GetSQLiteDB( path, sqlite_temp_path )
+
+ ( width_float, height_float ) = c.execute( 'SELECT CanvasWidth, CanvasHeight FROM Canvas;' ).fetchone()
+
+ finally:
+
+ if c is not None:
+
+ c.close()
+
+
+ if db is not None:
+
+ db.close()
+
+
+ HydrusTemp.CleanUpTempPath( os_file_handle, sqlite_temp_path )
+
+
+ return ( int( width_float ), int( height_float ) )
+
+def GetSQLiteDB( path, sqlite_temp_path ):
+
+ with open( path, 'rb' ) as f:
+
+ clip_bytes = f.read()
+
+
+ SQLITE_START = b'SQLite format 3'
+
+ try:
+
+ i = clip_bytes.index( SQLITE_START )
+
+ except ValueError:
+
+ raise HydrusExceptions.DamagedOrUnusualFileException( 'This clip file had no internal SQLite file, so no PNG thumb could be extracted!' )
+
+
+ sqlite_bytes = clip_bytes[ i : ]
+
+ with open( sqlite_temp_path, 'wb' ) as f:
+
+ f.write( sqlite_bytes )
+
+
+ try:
+
+ db = sqlite3.connect( sqlite_temp_path, isolation_level = None, detect_types = sqlite3.PARSE_DECLTYPES )
+
+ c = db.cursor()
+
+ except:
+
+ raise HydrusExceptions.DamagedOrUnusualFileException( 'This clip file seemed to have an invalid internal SQLite file!' )
+
+
+ return ( db, c )
+
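The new HydrusClipHandling module works by scanning the .clip container for the SQLite magic header and handing the tail to sqlite3. A self-contained sketch of the same technique against a synthetic container (note `bytes.index` raises `ValueError` when the magic is absent):

```python
import os
import sqlite3
import tempfile

SQLITE_MAGIC = b'SQLite format 3'

def extract_embedded_sqlite(container_bytes: bytes, out_path: str) -> sqlite3.Connection:
    # locate the embedded database by its magic header
    try:
        i = container_bytes.index(SQLITE_MAGIC)
    except ValueError:
        raise RuntimeError('No embedded SQLite database found!')

    # sqlite only needs the bytes from the magic onward
    with open(out_path, 'wb') as f:
        f.write(container_bytes[i:])

    return sqlite3.connect(out_path)
```

This relies on SQLite tolerating trailing garbage far less than leading garbage: slicing from the magic yields a file sqlite3 will open, which is why the module can pull `CanvasPreview` and `Canvas` rows out of a proprietary container.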
diff --git a/hydrus/core/HydrusConstants.py b/hydrus/core/HydrusConstants.py
index 41d1a9b0..708f35cc 100644
--- a/hydrus/core/HydrusConstants.py
+++ b/hydrus/core/HydrusConstants.py
@@ -81,8 +81,8 @@ options = {}
# Misc
NETWORK_VERSION = 20
-SOFTWARE_VERSION = 459
-CLIENT_API_VERSION = 20
+SOFTWARE_VERSION = 460
+CLIENT_API_VERSION = 21
SERVER_THUMBNAIL_DIMENSIONS = ( 200, 200 )
@@ -599,7 +599,7 @@ MIMES_THAT_MAY_HAVE_AUDIO = tuple( list( MIMES_THAT_DEFINITELY_HAVE_AUDIO ) + li
ARCHIVES = ( APPLICATION_ZIP, APPLICATION_HYDRUS_ENCRYPTED_ZIP, APPLICATION_RAR, APPLICATION_7Z )
-MIMES_WITH_THUMBNAILS = ( APPLICATION_FLASH, IMAGE_JPEG, IMAGE_PNG, IMAGE_APNG, IMAGE_GIF, IMAGE_BMP, IMAGE_WEBP, IMAGE_TIFF, IMAGE_ICON, APPLICATION_PSD, VIDEO_AVI, VIDEO_FLV, VIDEO_MOV, VIDEO_MP4, VIDEO_WMV, VIDEO_MKV, VIDEO_REALMEDIA, VIDEO_WEBM, VIDEO_MPEG )
+MIMES_WITH_THUMBNAILS = ( APPLICATION_FLASH, APPLICATION_CLIP, IMAGE_JPEG, IMAGE_PNG, IMAGE_APNG, IMAGE_GIF, IMAGE_BMP, IMAGE_WEBP, IMAGE_TIFF, IMAGE_ICON, APPLICATION_PSD, VIDEO_AVI, VIDEO_FLV, VIDEO_MOV, VIDEO_MP4, VIDEO_WMV, VIDEO_MKV, VIDEO_REALMEDIA, VIDEO_WEBM, VIDEO_MPEG )
HYDRUS_UPDATE_FILES = ( APPLICATION_HYDRUS_UPDATE_DEFINITIONS, APPLICATION_HYDRUS_UPDATE_CONTENT )
diff --git a/hydrus/core/HydrusFileHandling.py b/hydrus/core/HydrusFileHandling.py
index 944df3b1..9dd675f0 100644
--- a/hydrus/core/HydrusFileHandling.py
+++ b/hydrus/core/HydrusFileHandling.py
@@ -3,6 +3,7 @@ import os
import struct
from hydrus.core import HydrusAudioHandling
+from hydrus.core import HydrusClipHandling
from hydrus.core import HydrusConstants as HC
from hydrus.core import HydrusData
from hydrus.core import HydrusDocumentHandling
@@ -60,6 +61,11 @@ headers_and_mime = [
def GenerateThumbnailBytes( path, target_resolution, mime, duration, num_frames, percentage_in = 35 ):
+ if target_resolution == ( 0, 0 ):
+
+ target_resolution = ( 128, 128 )
+
+
if mime in ( HC.IMAGE_JPEG, HC.IMAGE_PNG, HC.IMAGE_GIF, HC.IMAGE_WEBP, HC.IMAGE_TIFF, HC.IMAGE_ICON ): # not apng atm
thumbnail_bytes = HydrusImageHandling.GenerateThumbnailBytesFromStaticImagePath( path, target_resolution, mime )
@@ -85,55 +91,73 @@ def GenerateThumbnailBytes( path, target_resolution, mime, duration, num_frames,
HydrusTemp.CleanUpTempPath( os_file_handle, temp_path )
+ elif mime == HC.APPLICATION_CLIP:
+
+ ( os_file_handle, temp_path ) = HydrusTemp.GetTempPath()
+
+ try:
+
+ HydrusClipHandling.ExtractDBPNGToPath( path, temp_path )
+
+ thumbnail_bytes = HydrusImageHandling.GenerateThumbnailBytesFromStaticImagePath( temp_path, target_resolution, mime )
+
+ except:
+
+ thumb_path = os.path.join( HC.STATIC_DIR, 'clip.png' )
+
+ thumbnail_bytes = HydrusImageHandling.GenerateThumbnailBytesFromStaticImagePath( thumb_path, target_resolution, mime )
+
+ finally:
+
+ HydrusTemp.CleanUpTempPath( os_file_handle, temp_path )
+
+
+ elif mime == HC.APPLICATION_FLASH:
+
+ ( os_file_handle, temp_path ) = HydrusTemp.GetTempPath()
+
+ try:
+
+ HydrusFlashHandling.RenderPageToFile( path, temp_path, 1 )
+
+ thumbnail_bytes = HydrusImageHandling.GenerateThumbnailBytesFromStaticImagePath( temp_path, target_resolution, mime )
+
+ except:
+
+ thumb_path = os.path.join( HC.STATIC_DIR, 'flash.png' )
+
+ thumbnail_bytes = HydrusImageHandling.GenerateThumbnailBytesFromStaticImagePath( thumb_path, target_resolution, mime )
+
+ finally:
+
+ HydrusTemp.CleanUpTempPath( os_file_handle, temp_path )
+
+
else:
- if mime == HC.APPLICATION_FLASH:
+ renderer = HydrusVideoHandling.VideoRendererFFMPEG( path, mime, duration, num_frames, target_resolution )
+
+ renderer.read_frame() # this initialises the renderer and loads the first frame as a fallback
+
+ desired_thumb_frame = int( ( percentage_in / 100.0 ) * num_frames )
+
+ renderer.set_position( desired_thumb_frame )
+
+ numpy_image = renderer.read_frame()
+
+ if numpy_image is None:
- ( os_file_handle, temp_path ) = HydrusTemp.GetTempPath()
-
- try:
-
- HydrusFlashHandling.RenderPageToFile( path, temp_path, 1 )
-
- thumbnail_bytes = HydrusImageHandling.GenerateThumbnailBytesFromStaticImagePath( temp_path, target_resolution, mime )
-
- except:
-
- thumb_path = os.path.join( HC.STATIC_DIR, 'flash.png' )
-
- thumbnail_bytes = HydrusImageHandling.GenerateThumbnailBytesFromStaticImagePath( thumb_path, target_resolution, mime )
-
- finally:
-
- HydrusTemp.CleanUpTempPath( os_file_handle, temp_path )
-
-
- else:
-
- renderer = HydrusVideoHandling.VideoRendererFFMPEG( path, mime, duration, num_frames, target_resolution )
-
- renderer.read_frame() # this initialises the renderer and loads the first frame as a fallback
-
- desired_thumb_frame = int( ( percentage_in / 100.0 ) * num_frames )
-
- renderer.set_position( desired_thumb_frame )
-
- numpy_image = renderer.read_frame()
-
- if numpy_image is None:
-
- raise Exception( 'Could not create a thumbnail from that video!' )
-
-
- numpy_image = HydrusImageHandling.ResizeNumPyImage( numpy_image, target_resolution ) # just in case ffmpeg doesn't deliver right
-
- thumbnail_bytes = HydrusImageHandling.GenerateThumbnailBytesNumPy( numpy_image, mime )
-
- renderer.Stop()
-
- del renderer
+ raise Exception( 'Could not create a thumbnail from that video!' )
+ numpy_image = HydrusImageHandling.ResizeNumPyImage( numpy_image, target_resolution ) # just in case ffmpeg doesn't deliver right
+
+ thumbnail_bytes = HydrusImageHandling.GenerateThumbnailBytesNumPy( numpy_image, mime )
+
+ renderer.Stop()
+
+ del renderer
+
return thumbnail_bytes
@@ -208,11 +232,19 @@ def GetFileInfo( path, mime = None, ok_to_look_for_hydrus_updates = False ):
( ( width, height ), duration, num_frames ) = HydrusImageHandling.GetImageProperties( path, mime )
+ elif mime == HC.APPLICATION_CLIP:
+
+ ( width, height ) = HydrusClipHandling.GetResolution( path )
+
elif mime == HC.APPLICATION_FLASH:
( ( width, height ), duration, num_frames ) = HydrusFlashHandling.GetFlashProperties( path )
- elif mime in ( HC.IMAGE_APNG, HC.VIDEO_AVI, HC.VIDEO_FLV, HC.VIDEO_WMV, HC.VIDEO_MOV, HC.VIDEO_MP4, HC.VIDEO_MKV, HC.VIDEO_REALMEDIA, HC.VIDEO_WEBM, HC.VIDEO_MPEG ):
+ elif mime == HC.IMAGE_APNG:
+
+ ( ( width, height ), duration, num_frames, has_audio ) = HydrusVideoHandling.GetFFMPEGAPNGProperties( path )
+
+ elif mime in ( HC.VIDEO_AVI, HC.VIDEO_FLV, HC.VIDEO_WMV, HC.VIDEO_MOV, HC.VIDEO_MP4, HC.VIDEO_MKV, HC.VIDEO_REALMEDIA, HC.VIDEO_WEBM, HC.VIDEO_MPEG ):
( ( width, height ), duration, num_frames, has_audio ) = HydrusVideoHandling.GetFFMPEGVideoProperties( path )
@@ -383,14 +415,16 @@ def GetMime( path, ok_to_look_for_hydrus_updates = False ):
def IsPNGAnimated( file_header_bytes ):
- if file_header_bytes[ 37: ].startswith( b'acTL' ):
+ apng_actl_bytes = HydrusVideoHandling.GetAPNGACTLChunk( file_header_bytes )
+
+ if apng_actl_bytes is not None:
# this is an animated png
# acTL chunk in an animated png is 4 bytes of num frames, then 4 bytes of num times to loop
# https://wiki.mozilla.org/APNG_Specification#.60acTL.60:_The_Animation_Control_Chunk
- num_frames = HydrusVideoHandling.GetAPNGNumFrames( file_header_bytes )
+ num_frames = HydrusVideoHandling.GetAPNGNumFrames( apng_actl_bytes )
if num_frames > 1:
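`IsPNGAnimated` above now locates the acTL chunk rather than assuming a fixed offset, since a pHYs chunk can precede it; the frame count is the first big-endian uint32 after the chunk name. A minimal sketch of that lookup on synthetic header bytes (not a real PNG encoder), using `find` where the hunk uses explicit `startswith`/`index` checks:

```python
import struct

ACTL = b'acTL'

def find_actl(header_bytes: bytes):
    # search the region after the IHDR chunk (offset 37 in a standard PNG);
    # other chunks such as pHYs may sit before acTL
    window = header_bytes[37:128]
    i = window.find(ACTL)
    return None if i == -1 else window[i:]

def get_num_frames(actl_bytes: bytes) -> int:
    # acTL layout: 4 bytes chunk name, 4 bytes num_frames, 4 bytes num_plays
    # https://wiki.mozilla.org/APNG_Specification
    (num_frames,) = struct.unpack('>I', actl_bytes[4:8])
    return num_frames
```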
diff --git a/hydrus/core/HydrusVideoHandling.py b/hydrus/core/HydrusVideoHandling.py
index a94228bc..49522291 100644
--- a/hydrus/core/HydrusVideoHandling.py
+++ b/hydrus/core/HydrusVideoHandling.py
@@ -43,9 +43,35 @@ def CheckFFMPEGError( lines ):
raise HydrusExceptions.DamagedOrUnusualFileException( 'FFMPEG could not parse.' )
-def GetAPNGNumFrames( file_header_bytes ):
+def GetAPNGACTLChunk( file_header_bytes: bytes ):
- ( num_frames, ) = struct.unpack( '>I', file_header_bytes[ 41 : 45 ] )
+ apng_actl_chunk_header = b'acTL'
+ apng_phys_chunk_header = b'pHYs'
+
+ first_guess_header = file_header_bytes[ 37:128 ]
+
+ if first_guess_header.startswith( apng_actl_chunk_header ):
+
+ return first_guess_header
+
+ elif first_guess_header.startswith( apng_phys_chunk_header ):
+
+ # aha, some weird other png chunk
+ # https://wiki.mozilla.org/APNG_Specification
+
+ if apng_actl_chunk_header in first_guess_header:
+
+ i = first_guess_header.index( apng_actl_chunk_header )
+
+ return first_guess_header[i:]
+
+
+
+ return None
+
+def GetAPNGNumFrames( apng_actl_bytes ):
+
+ ( num_frames, ) = struct.unpack( '>I', apng_actl_bytes[ 4 : 8 ] )
return num_frames
@@ -231,13 +257,20 @@ def GetFFMPEGAPNGProperties( path ):
file_header_bytes = f.read( 256 )
- num_frames = GetAPNGNumFrames( file_header_bytes )
+ apng_actl_bytes = GetAPNGACTLChunk( file_header_bytes )
+
+ if apng_actl_bytes is None:
+
+ raise HydrusExceptions.DamagedOrUnusualFileException( 'This APNG had an unusual file header!' )
+
+
+ num_frames = GetAPNGNumFrames( apng_actl_bytes )
lines = GetFFMPEGInfoLines( path )
- resolution = ParseFFMPEGVideoResolution( lines )
+ resolution = ParseFFMPEGVideoResolution( lines, png_ok = True )
- ( fps, confident_fps ) = ParseFFMPEGFPS( lines )
+ ( fps, confident_fps ) = ParseFFMPEGFPS( lines, png_ok = True )
if not confident_fps:
@@ -554,50 +587,41 @@ def ParseFFMPEGDuration( lines ):
raise HydrusExceptions.DamagedOrUnusualFileException( 'Error reading duration!' )
-def ParseFFMPEGFPS( lines_for_first_second ):
+def ParseFFMPEGFPS( lines, png_ok = False ):
+
+ try:
+
+ line = ParseFFMPEGVideoLine( lines, png_ok = png_ok )
+
+ ( possible_results, confident ) = ParseFFMPEGFPSPossibleResults( line )
+
+ if len( possible_results ) == 0:
+
+ fps = 1
+ confident = False
+
+ else:
+
+ fps = min( possible_results )
+
+
+ return ( fps, confident )
+
+ except:
+
+ raise HydrusExceptions.DamagedOrUnusualFileException( 'Error estimating framerate!' )
+
+
+def ParseFFMPEGFPSFromFirstSecond( lines_for_first_second ):
try:
line = ParseFFMPEGVideoLine( lines_for_first_second )
- # get the frame rate
-
- possible_results = set()
-
- match = re.search("( [0-9]*.| )[0-9]* tbr", line)
-
- if match is not None:
-
- tbr = line[match.start():match.end()].split(' ')[1]
-
- tbr_fps_is_likely_garbage = match is None or tbr.endswith( 'k' ) or float( tbr ) > 144
-
- if not tbr_fps_is_likely_garbage:
-
- possible_results.add( float( tbr ) )
-
-
-
- #
-
- match = re.search("( [0-9]*.| )[0-9]* fps", line)
-
- if match is not None:
-
- fps = line[match.start():match.end()].split(' ')[1]
-
- fps_is_likely_garbage = match is None or fps.endswith( 'k' ) or float( fps ) > 144
-
- if not fps_is_likely_garbage:
-
- possible_results.add( float( fps ) )
-
-
+ ( possible_results, confident ) = ParseFFMPEGFPSPossibleResults( line )
num_frames_in_first_second = ParseFFMPEGNumFramesManually( lines_for_first_second )
- confident = len( possible_results ) <= 1
-
if len( possible_results ) == 0:
fps = num_frames_in_first_second
@@ -641,6 +665,48 @@ def ParseFFMPEGFPS( lines_for_first_second ):
raise HydrusExceptions.DamagedOrUnusualFileException( 'Error estimating framerate!' )
+def ParseFFMPEGFPSPossibleResults( video_line ):
+
+ # get the frame rate
+
+ possible_results = set()
+
+ match = re.search("( [0-9]*.| )[0-9]* tbr", video_line)
+
+ if match is not None:
+
+ tbr = video_line[match.start():match.end()].split(' ')[1]
+
+ tbr_fps_is_likely_garbage = tbr.endswith( 'k' ) or float( tbr ) > 144
+
+ if not tbr_fps_is_likely_garbage:
+
+ possible_results.add( float( tbr ) )
+
+
+
+ #
+
+ match = re.search("( [0-9]*.| )[0-9]* fps", video_line)
+
+ if match is not None:
+
+ fps = video_line[match.start():match.end()].split(' ')[1]
+
+ fps_is_likely_garbage = fps.endswith( 'k' ) or float( fps ) > 144
+
+ if not fps_is_likely_garbage:
+
+ possible_results.add( float( fps ) )
+
+
+
+ possible_results.discard( 0 )
+
+ confident = len( possible_results ) <= 1
+
+ return ( possible_results, confident )
+
def ParseFFMPEGHasVideo( lines ):
try:
@@ -730,11 +796,20 @@ def ParseFFMPEGVideoFormat( lines ):
return ( True, video_format )
-def ParseFFMPEGVideoLine( lines ):
+def ParseFFMPEGVideoLine( lines, png_ok = False ):
+
+ if png_ok:
+
+ bad_video_formats = [ 'jpg' ]
+
+ else:
+
+ bad_video_formats = [ 'png', 'jpg' ]
+
# get the output line that speaks about video
# the ^\sStream is to exclude the 'title' line, when it exists, includes the string 'Video: ', ha ha
- lines_video = [ l for l in lines if re.search( r'^\s*Stream', l ) is not None and 'Video: ' in l and not ( 'Video: png' in l or 'Video: jpg' in l ) ] # mp3 says it has a 'png' video stream
+ lines_video = [ l for l in lines if re.search( r'^\s*Stream', l ) is not None and 'Video: ' in l and not any( 'Video: {}'.format( bad_video_format ) in l for bad_video_format in bad_video_formats ) ] # mp3 says it has a 'png' video stream
if len( lines_video ) == 0:
@@ -745,11 +820,11 @@ def ParseFFMPEGVideoLine( lines ):
return line
-def ParseFFMPEGVideoResolution( lines ):
+def ParseFFMPEGVideoResolution( lines, png_ok = False ):
try:
- line = ParseFFMPEGVideoLine( lines )
+ line = ParseFFMPEGVideoLine( lines, png_ok = png_ok )
# get the size, of the form 460x320 (w x h)
match = re.search(" [0-9]*x[0-9]*(,| )", line)
@@ -929,23 +1004,33 @@ class VideoRendererFFMPEG( object ):
skip_frames = 0
+ do_fast_seek = True
+
( w, h ) = self._target_resolution
cmd = [ FFMPEG_PATH ]
- if do_ss:
+ if do_ss and do_fast_seek: # fast seek
cmd.extend( [ '-ss', "%.03f" % ss ] )
- cmd.extend( [ '-i', self._path,
+ cmd.extend( [ '-i', self._path ] )
+
+ if do_ss and not do_fast_seek: # slow seek
+
+ cmd.extend( [ '-ss', "%.03f" % ss ] )
+
+
+ cmd.extend( [
'-loglevel', 'quiet',
'-f', 'image2pipe',
"-pix_fmt", self.pix_fmt,
"-s", str( w ) + 'x' + str( h ),
'-vsync', '0',
'-vcodec', 'rawvideo',
- '-' ] )
+ '-'
+ ] )
sbp_kwargs = HydrusData.GetSubprocessKWArgs()
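The renderer change above distinguishes ffmpeg's fast seek (`-ss` before `-i`, keyframe jump in the demuxer) from slow seek (`-ss` after `-i`, decode-and-discard up to the timestamp). A sketch of the command construction, with an assumed `ffmpeg` on PATH and a hardcoded pixel format in place of the renderer's attribute:

```python
FFMPEG_PATH = 'ffmpeg'  # assumed to be on PATH

def build_seek_cmd(path: str, ss: float, w: int, h: int, do_fast_seek: bool = True):
    cmd = [FFMPEG_PATH]

    if ss > 0 and do_fast_seek:
        # fast seek: -ss before -i jumps in the demuxer (keyframe-accurate)
        cmd.extend(['-ss', '%.03f' % ss])

    cmd.extend(['-i', path])

    if ss > 0 and not do_fast_seek:
        # slow seek: -ss after -i decodes and discards frames up to ss
        cmd.extend(['-ss', '%.03f' % ss])

    cmd.extend([
        '-loglevel', 'quiet',
        '-f', 'image2pipe',
        '-pix_fmt', 'rgb24',
        '-s', '{}x{}'.format(w, h),
        '-vsync', '0',
        '-vcodec', 'rawvideo',
        '-',
    ])

    return cmd
```

Fast seek is much quicker on long videos but only lands on keyframes, which is fine for thumbnail grabs; the slow path is kept behind the `do_fast_seek` flag for frame-exact positioning.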
diff --git a/hydrus/test/TestClientAPI.py b/hydrus/test/TestClientAPI.py
index 12b62342..8589a009 100644
--- a/hydrus/test/TestClientAPI.py
+++ b/hydrus/test/TestClientAPI.py
@@ -403,6 +403,10 @@ class TestClientAPI( unittest.TestCase ):
self.assertEqual( response.status, 200 )
+ #
+
+ HG.test_controller.ClearWrites( 'content_updates' )
+
body_dict = { 'Hydrus-Client-API-Session-Key' : session_key_hex, 'hash' : hash_hex, 'service_names_to_tags' : { 'my tags' : [ 'test', 'test2' ] } }
body = json.dumps( body_dict )
@@ -415,6 +419,55 @@ class TestClientAPI( unittest.TestCase ):
self.assertEqual( response.status, 200 )
+ [ ( ( service_keys_to_content_updates, ), kwargs ) ] = HG.test_controller.GetWrite( 'content_updates' )
+
+ self.assertIn( CC.DEFAULT_LOCAL_TAG_SERVICE_KEY, service_keys_to_content_updates )
+ self.assertTrue( len( service_keys_to_content_updates[ CC.DEFAULT_LOCAL_TAG_SERVICE_KEY ] ) > 0 )
+
+ #
+
+ HG.test_controller.ClearWrites( 'content_updates' )
+
+ body_dict = { 'Hydrus-Client-API-Session-Key' : session_key_hex, 'hash' : hash_hex, 'service_keys_to_tags' : { CC.DEFAULT_LOCAL_TAG_SERVICE_KEY.hex() : [ 'test', 'test2' ] } }
+
+ body = json.dumps( body_dict )
+
+ connection.request( 'POST', path, body = body, headers = headers )
+
+ response = connection.getresponse()
+
+ data = response.read()
+
+ self.assertEqual( response.status, 200 )
+
+ [ ( ( service_keys_to_content_updates, ), kwargs ) ] = HG.test_controller.GetWrite( 'content_updates' )
+
+ self.assertIn( CC.DEFAULT_LOCAL_TAG_SERVICE_KEY, service_keys_to_content_updates )
+ self.assertTrue( len( service_keys_to_content_updates[ CC.DEFAULT_LOCAL_TAG_SERVICE_KEY ] ) > 0 )
+
+ #
+
+ HG.test_controller.ClearWrites( 'content_updates' )
+
+ body_dict = { 'Hydrus-Client-API-Session-Key' : session_key_hex, 'hash' : hash_hex, 'service_keys_to_actions_to_tags' : { CC.DEFAULT_LOCAL_TAG_SERVICE_KEY.hex() : { str( HC.CONTENT_UPDATE_ADD ) : [ 'test', 'test2' ] } } }
+
+ body = json.dumps( body_dict )
+
+ connection.request( 'POST', path, body = body, headers = headers )
+
+ response = connection.getresponse()
+
+ data = response.read()
+
+ self.assertEqual( response.status, 200 )
+
+ [ ( ( service_keys_to_content_updates, ), kwargs ) ] = HG.test_controller.GetWrite( 'content_updates' )
+
+ self.assertIn( CC.DEFAULT_LOCAL_TAG_SERVICE_KEY, service_keys_to_content_updates )
+ self.assertTrue( len( service_keys_to_content_updates[ CC.DEFAULT_LOCAL_TAG_SERVICE_KEY ] ) > 0 )
+
+ #
+
return set_up_permissions
@@ -1370,6 +1423,34 @@ class TestClientAPI( unittest.TestCase ):
self.assertEqual( HG.test_controller.GetWrite( 'import_url_test' ), [ ( ( url, set( filterable_tags ), additional_service_keys_to_tags, 'muh /tv/', None, True ), {} ) ] )
+ # add tags with service key and name, and show destination page
+
+ HG.test_controller.ClearWrites( 'import_url_test' )
+
+ request_dict = { 'url' : url, 'destination_page_name' : 'muh /tv/', 'show_destination_page' : True, 'filterable_tags' : [ 'filename:yo' ], 'service_keys_to_additional_tags' : { CC.DEFAULT_LOCAL_TAG_SERVICE_KEY.hex() : [ '/tv/ thread' ] } }
+
+ request_body = json.dumps( request_dict )
+
+ connection.request( 'POST', '/add_urls/add_url', body = request_body, headers = headers )
+
+ response = connection.getresponse()
+
+ data = response.read()
+
+ text = str( data, 'utf-8' )
+
+ self.assertEqual( response.status, 200 )
+
+ response_json = json.loads( text )
+
+ self.assertEqual( response_json[ 'human_result_text' ], '"https://8ch.net/tv/res/1846574.html" URL added successfully.' )
+ self.assertEqual( response_json[ 'normalised_url' ], 'https://8ch.net/tv/res/1846574.html' )
+
+ filterable_tags = [ 'filename:yo' ]
+ additional_service_keys_to_tags = ClientTags.ServiceKeysToTags( { CC.DEFAULT_LOCAL_TAG_SERVICE_KEY : set( [ '/tv/ thread' ] ) } )
+
+ self.assertEqual( HG.test_controller.GetWrite( 'import_url_test' ), [ ( ( url, set( filterable_tags ), additional_service_keys_to_tags, 'muh /tv/', None, True ), {} ) ] )
+
# associate url
HG.test_controller.ClearWrites( 'content_updates' )
@@ -2286,6 +2367,7 @@ class TestClientAPI( unittest.TestCase ):
tags_manager = media_result.GetTagsManager()
service_names_to_statuses_to_tags = {}
+ api_service_keys_to_statuses_to_tags = {}
service_keys_to_statuses_to_tags = tags_manager.GetServiceKeysToStatusesToTags( ClientTags.TAG_DISPLAY_STORAGE )
@@ -2296,14 +2378,22 @@ class TestClientAPI( unittest.TestCase ):
service_keys_to_names[ service_key ] = services_manager.GetName( service_key )
- service_name = service_keys_to_names[ service_key ]
+ s = { str( status ) : sorted( tags, key = HydrusTags.ConvertTagToSortable ) for ( status, tags ) in statuses_to_tags.items() if len( tags ) > 0 }
- service_names_to_statuses_to_tags[ service_name ] = { str( status ) : sorted( tags, key = HydrusTags.ConvertTagToSortable ) for ( status, tags ) in statuses_to_tags.items() }
+ if len( s ) > 0:
+
+ service_name = service_keys_to_names[ service_key ]
+
+ service_names_to_statuses_to_tags[ service_name ] = s
+ api_service_keys_to_statuses_to_tags[ service_key.hex() ] = s
+
metadata_row[ 'service_names_to_statuses_to_tags' ] = service_names_to_statuses_to_tags
+ metadata_row[ 'service_keys_to_statuses_to_tags' ] = api_service_keys_to_statuses_to_tags
- service_names_to_statuses_to_tags = {}
+ service_names_to_statuses_to_display_tags = {}
+ service_keys_to_statuses_to_display_tags = {}
service_keys_to_statuses_to_tags = tags_manager.GetServiceKeysToStatusesToTags( ClientTags.TAG_DISPLAY_ACTUAL )
@@ -2314,12 +2404,19 @@ class TestClientAPI( unittest.TestCase ):
service_keys_to_names[ service_key ] = services_manager.GetName( service_key )
- service_name = service_keys_to_names[ service_key ]
+ s = { str( status ) : sorted( tags, key = HydrusTags.ConvertTagToSortable ) for ( status, tags ) in statuses_to_tags.items() if len( tags ) > 0 }
- service_names_to_statuses_to_tags[ service_name ] = { str( status ) : sorted( tags, key = HydrusTags.ConvertTagToSortable ) for ( status, tags ) in statuses_to_tags.items() }
+ if len( s ) > 0:
+
+ service_name = service_keys_to_names[ service_key ]
+
+ service_names_to_statuses_to_display_tags[ service_name ] = s
+ service_keys_to_statuses_to_display_tags[ service_key.hex() ] = s
+
- metadata_row[ 'service_names_to_statuses_to_display_tags' ] = service_names_to_statuses_to_tags
+ metadata_row[ 'service_names_to_statuses_to_display_tags' ] = service_names_to_statuses_to_display_tags
+ metadata_row[ 'service_keys_to_statuses_to_display_tags' ] = service_keys_to_statuses_to_display_tags
metadata.append( metadata_row )
diff --git a/requirements_macos.txt b/requirements_macos.txt
index f1fff14c..9a765e04 100644
--- a/requirements_macos.txt
+++ b/requirements_macos.txt
@@ -6,7 +6,7 @@ lxml>=4.5.0
lz4>=3.0.0
nose>=1.3.0
numpy>=1.16.0
-opencv-python-headless>=4.0.0
+opencv-python-headless>=4.0.0, <=4.5.3.56
Pillow>=6.0.0
psutil>=5.0.0
pylzma>=0.5.0
diff --git a/requirements_ubuntu.txt b/requirements_ubuntu.txt
index f1fff14c..9a765e04 100644
--- a/requirements_ubuntu.txt
+++ b/requirements_ubuntu.txt
@@ -6,7 +6,7 @@ lxml>=4.5.0
lz4>=3.0.0
nose>=1.3.0
numpy>=1.16.0
-opencv-python-headless>=4.0.0
+opencv-python-headless>=4.0.0, <=4.5.3.56
Pillow>=6.0.0
psutil>=5.0.0
pylzma>=0.5.0
diff --git a/requirements_windows.txt b/requirements_windows.txt
index 7516abf9..c768235a 100644
--- a/requirements_windows.txt
+++ b/requirements_windows.txt
@@ -6,7 +6,7 @@ lxml>=4.5.0
lz4>=3.0.0
nose>=1.3.0
numpy>=1.16.0
-opencv-python-headless>=4.0.0
+opencv-python-headless>=4.0.0, <=4.5.3.56
Pillow>=6.0.0
psutil>=5.0.0
pylzma>=0.5.0