Commit Graph

11 Commits

Author SHA1 Message Date
wm4 3b4641a5a9 filter: minor cosmetic naming issue
Just putting some more lipstick on the pig; maybe it looks a bit nicer
now.
2020-03-08 19:38:10 +01:00
wm4 8a1bd15216 filter: add functions to suspend filtering temporarily
Filtering is integrated into an event loop, which is something the
filter API user provides. To make interacting with the event loop
easier, and in particular to avoid filtering blocking event handling,
add functions with which the event loop code can suspend filtering.

While we cannot actually suspend a single filter, it's pretty easy to
suspend the filter graph run loop itself, which is responsible for
selecting which filter to run next.

This commit shouldn't change behavior at all, but the functions will be
used in later commits.
2020-03-05 22:00:50 +01:00
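
To illustrate the mechanism described above: suspension only needs to gate
the loop that picks the next filter to run. Below is a minimal sketch using
toy types; this is not mpv's actual filter API, and all names are made up.

    #include <stdbool.h>
    #include <stddef.h>

    /* Minimal sketch (not mpv's actual API): the run loop checks a suspend
     * flag before selecting the next filter, so a filter is never interrupted
     * mid-run, but no further filters are picked while suspended. */
    struct toy_filter {
        bool ready;
        void (*process)(struct toy_filter *f);
    };

    struct toy_graph {
        bool suspended;
        struct toy_filter *filters;
        size_t num_filters;
    };

    static void toy_graph_suspend(struct toy_graph *g) { g->suspended = true; }
    static void toy_graph_resume(struct toy_graph *g)  { g->suspended = false; }

    /* Run filters until nothing is ready or the event loop suspended us. */
    static void toy_graph_run(struct toy_graph *g)
    {
        while (!g->suspended) {
            struct toy_filter *next = NULL;
            for (size_t i = 0; i < g->num_filters; i++) {
                if (g->filters[i].ready) {
                    next = &g->filters[i];
                    break;
                }
            }
            if (!next)
                break;           /* no work; hand control back to the event loop */
            next->ready = false;
            next->process(next); /* run exactly one filter, then re-check */
        }
    }
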
wm4 485f335b69 filter: add async queue filter
This is supposed to enable communication between filter graphs on
separate threads. Having multiple threads only makes sense if they can
run concurrently with each other, which requires such an asynchronous
queue as a building block. (Probably.)

The basic idea is that you have two independent filters, which can each be
part of separate filter graphs, but which communicate in one direction
through an explicit queue. This is rather similar to unix pipes.
Just like unix pipes, the queue is limited in size, so that some data flow
control is still enforced, and runaway memory usage is avoided.

This implementation is pretty dumb. In theory, you could avoid
waking up the filter graphs in quite a lot of situations. For example,
you don't need to wake up the consumer filter if there are already
frames queued. Also, you could add "watermarks" that set a threshold at
which producer or consumer should be woken up to produce/consume more
frames (this would generally serve to "batch" multiple frames at once,
instead of performing high-frequency wakeups). But this is hard, so the
code is dumb. (I just deleted all related code when I kept running into
situations where wakeups were lost.)

This is actually salvaged and modified from a much older branch I had
lying around. It will be used in the next commit.
2020-02-29 21:52:00 +01:00
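
The building block described above amounts to a bounded producer/consumer
queue. Below is a generic pthreads sketch of the idea, not the real
f_async_queue code; real filters use wakeup callbacks rather than blocking,
but the blocking form is shorter.

    #include <pthread.h>
    #include <stdbool.h>

    /* Generic bounded frame queue in the spirit of the commit message. The
     * fixed capacity is what provides the pipe-like flow control. */
    #define TOY_QUEUE_CAP 8

    struct toy_queue {
        pthread_mutex_t lock;
        pthread_cond_t wakeup;          /* "dumb": signaled on every push/pop */
        void *frames[TOY_QUEUE_CAP];
        int num_frames;
        bool eof;
    };

    #define TOY_QUEUE_INIT { PTHREAD_MUTEX_INITIALIZER, PTHREAD_COND_INITIALIZER }

    /* Producer side: waits while the queue is full. */
    static void toy_queue_push(struct toy_queue *q, void *frame)
    {
        pthread_mutex_lock(&q->lock);
        while (q->num_frames == TOY_QUEUE_CAP)
            pthread_cond_wait(&q->wakeup, &q->lock);
        q->frames[q->num_frames++] = frame;
        pthread_cond_broadcast(&q->wakeup);
        pthread_mutex_unlock(&q->lock);
    }

    /* Producer signals end-of-stream; consumer then gets NULL once drained. */
    static void toy_queue_set_eof(struct toy_queue *q)
    {
        pthread_mutex_lock(&q->lock);
        q->eof = true;
        pthread_cond_broadcast(&q->wakeup);
        pthread_mutex_unlock(&q->lock);
    }

    /* Consumer side: waits while the queue is empty and EOF wasn't signaled;
     * returns NULL on EOF. */
    static void *toy_queue_pop(struct toy_queue *q)
    {
        void *frame = NULL;
        pthread_mutex_lock(&q->lock);
        while (q->num_frames == 0 && !q->eof)
            pthread_cond_wait(&q->wakeup, &q->lock);
        if (q->num_frames > 0) {
            frame = q->frames[0];
            for (int n = 1; n < q->num_frames; n++)
                q->frames[n - 1] = q->frames[n];
            q->num_frames--;
        }
        pthread_cond_broadcast(&q->wakeup);
        pthread_mutex_unlock(&q->lock);
        return frame;
    }

The "watermark" idea from the message would simply replace the unconditional
broadcast with a threshold check before waking the other side.
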
wm4 f29623786b filter: decide how multi-threading is supposed to work
Instead of vague ideas about making different filter graphs on different
threads interact directly, there is now simply no direct support for this.
Instead, helpers are required (such as the one added with the next commit).

Document it. Different root filters (i.e. separate filter graphs) are
now considered to be part of separate threads, and assert() checks catch
cases where they accidentally interact.
2020-02-29 21:49:14 +01:00
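
A sketch of the kind of check this implies, using toy types (purely
illustrative, not the actual code): each filter remembers its root, and
connecting pins across different roots is rejected.

    #include <assert.h>

    /* Illustrative only: every filter knows its root (the filter graph it
     * belongs to); connecting pins whose owners live under different roots
     * is treated as a bug. */
    struct toy_filter { struct toy_filter *root; };
    struct toy_pin    { struct toy_filter *owner; };

    static void toy_pin_connect(struct toy_pin *dst, struct toy_pin *src)
    {
        /* Separate roots imply separate threads, so refuse accidental
         * cross-thread interaction instead of trying to support it. */
        assert(dst->owner->root == src->owner->root);
        /* ... the actual connection would happen here ... */
    }
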
wm4 4b32d51b96 f_output_chain: log status of auto filters
Just so users don't think the auto filters are doing anything when they
don't actually insert any filters.
2018-04-29 02:21:32 +03:00
wm4 ff24285eb1 video: pass through container fps to filters
This means vf_vapoursynth doesn't need a hack to work around the filter
code, and libavfilter filters now actually get the frame_rate field on
input pads set.

The libavfilter doxygen says the frame_rate field is only to be set if
the frame rate is known to be constant, and uses the word "must" (which
probably means they really mean it?) - but ffmpeg.c sets the field to
mere guesses anyway, and it looks like this normally won't lead to
problems.
2018-04-19 23:22:48 +02:00
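
For reference, on the libavfilter side this boils down to handing the buffer
source a frame_rate, e.g. via AVBufferSrcParameters. The example below is
only a hedged illustration of that libavfilter API; how mpv actually wires
it up may differ, and buffersrc_ctx/container_fps are placeholders.

    #include <libavfilter/buffersrc.h>
    #include <libavutil/error.h>
    #include <libavutil/mem.h>
    #include <libavutil/rational.h>

    /* Pass a container FPS that is believed to be constant to a libavfilter
     * buffer source; error handling is reduced to the minimum. */
    static int set_input_frame_rate(AVFilterContext *buffersrc_ctx,
                                    AVRational container_fps)
    {
        AVBufferSrcParameters *par = av_buffersrc_parameters_alloc();
        if (!par)
            return AVERROR(ENOMEM);
        /* The doxygen says to set this only if the rate is known to be
         * constant; in practice callers like ffmpeg.c pass best guesses. */
        par->frame_rate = container_fps;
        int ret = av_buffersrc_parameters_set(buffersrc_ctx, par);
        av_free(par);
        return ret;
    }
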
wm4 3ffa70e2da filter: extend documentation comments
Add more explanations, and also fix some blatantly wrong things.
2018-02-13 17:45:29 -08:00
wm4 debc17663d filter: add/use a convenience function
I guess this is generally useful for filters which buffer data
internally.
2018-02-03 05:01:28 -08:00
wm4 6d36fad83c video: make decoder wrapper a filter
Move dec_video.c to filters/f_decoder_wrapper.c. It essentially becomes
a source filter. vd.h mostly disappears, because mp_filter takes care of
the dataflow, but its remains are in struct mp_decoder_fns.

One goal is to simplify dataflow by letting the filter framework handle
it (or more accurately, using its conventions). One result is that the
decode calls disappear from video.c, because we simply connect the
decoder wrapper and the filter chain with mp_pin_connect().

Another goal is to eventually remove the code duplication between the
audio and video paths for this. This commit prepares for this by trying
to make f_decoder_wrapper.c extensible, so it can be used for audio as
well later.

Decoder framedropping changes a bit. It doesn't seem to be worse than
before, and it's an obscure feature, so I'm content with its new state.
Some special code that was apparently meant to avoid dropping too many
frames in a row is removed, though.

I'm not sure how the source code tree should be organized. For one,
video/decode/vd_lavc.c is the only file in its directory, which is a bit
annoying.
2018-01-30 03:10:27 -08:00
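
A toy illustration of the "just connect it" idea, with made-up types rather
than mpv's real pin API: the decoder wrapper exposes an output pin, the
player connects it to the chain's input pin once at setup, and dataflow is
driven through the pins from then on.

    #include <stddef.h>

    /* Toy pin type (not mpv's API): a pin knows its connected peer and how
     * the upstream side produces data. */
    struct toy_pin {
        struct toy_pin *peer;            /* the pin this one is connected to */
        void *(*read_frame)(void *ctx);  /* upstream frame producer */
        void *ctx;
    };

    static void toy_pin_connect(struct toy_pin *dst, struct toy_pin *src)
    {
        dst->peer = src;
        src->peer = dst;
    }

    /* After the connection, frames flow through the pins; no explicit decode
     * call remains in the player code. */
    static void *toy_pin_read(struct toy_pin *dst)
    {
        struct toy_pin *src = dst->peer;
        return (src && src->read_frame) ? src->read_frame(src->ctx) : NULL;
    }
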
wm4 b9f804b566 audio: rewrite filtering glue code
Use the new filtering code for audio too.
2018-01-30 03:10:27 -08:00
wm4 76276c9210 video: rewrite filtering glue code
Get rid of the old vf.c code. Replace it with a generic filtering
framework, which can potentially handle more than just --vf. At least
reimplementing --af with this code is planned.

This changes some --vf semantics (including runtime behavior and the
"vf" command). The most important ones are listed in interface-changes.

vf_convert.c is renamed to f_swscale.c. It is now an internal filter
that cannot be inserted by the user manually.

f_lavfi.c is a refactor of player/lavfi.c. The latter will be removed
once --lavfi-complex is reimplemented on top of f_lavfi.c (which is
conceptually easy, but a big mess due to the data flow changes).

The existing filters are all changed heavily. The data flow of the new
filter framework is different. Especially EOF handling changes - EOF is
now a "frame" rather than a state, and must be passed through exactly
once.
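
To make that concrete: in the new model a filter's process step sees EOF as
just another frame and forwards it exactly once. A generic sketch with toy
types follows; mpv's real mp_frame/process API differs in detail.

    /* "EOF is a frame, passed through exactly once", in toy form. */
    enum toy_frame_type { TOY_FRAME_NONE, TOY_FRAME_VIDEO, TOY_FRAME_EOF };

    struct toy_frame { enum toy_frame_type type; void *data; };

    /* Called when input is available and the output side can take a frame. */
    static struct toy_frame toy_process(struct toy_frame in)
    {
        if (in.type == TOY_FRAME_EOF) {
            /* Drain whatever is still buffered internally (not shown), then
             * forward the EOF frame itself exactly once, instead of keeping
             * a sticky "we are at EOF" state like the old framework did. */
            return in;
        }
        /* ... normal per-frame filtering happens here ... */
        return in;
    }
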

Another major thing is that all filters must support dynamic format
changes. The filter reconfig() function goes away. (This sounds complex,
but since all filters need to handle EOF draining anyway, they can use
the same code, and it removes the mess with reconfig() having to predict
the output format, which completely breaks with libavfilter anyway.)

In addition, there is no automatic format negotiation or conversion.
libavfilter's primitive and insufficient API simply doesn't allow us to
do this in a reasonable way. Instead, filters can use f_autoconvert as
sub-filter, and tell it which formats they support. This filter will in
turn add actual conversion filters, such as f_swscale, to perform
necessary format changes.
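
A hedged sketch of that pattern with toy types (mpv's actual f_autoconvert
API is not reproduced here): the filter declares which formats it accepts,
and the converter decides whether a conversion step has to be spliced in
front of it.

    #include <stdbool.h>

    /* Toy stand-in for an autoconvert sub-filter. */
    enum toy_imgfmt { TOY_IMGFMT_YUV420P, TOY_IMGFMT_NV12, TOY_IMGFMT_COUNT };

    struct toy_autoconvert {
        bool accepted[TOY_IMGFMT_COUNT];   /* formats the parent filter takes */
    };

    /* The parent filter declares each format it can consume directly. */
    static void toy_autoconvert_add_imgfmt(struct toy_autoconvert *c,
                                           enum toy_imgfmt fmt)
    {
        c->accepted[fmt] = true;
    }

    /* If the input format isn't accepted, the converter would splice an
     * actual conversion filter (f_swscale-style) in front of the parent. */
    static bool toy_autoconvert_needs_conversion(struct toy_autoconvert *c,
                                                 enum toy_imgfmt input_fmt)
    {
        return !c->accepted[input_fmt];
    }
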

vf_vapoursynth.c uses the same basic principle of operation as before,
but with worryingly different details in data flow. Still appears to
work.

The hardware deint filters (vf_vavpp.c, vf_d3d11vpp.c, vf_vdpaupp.c) are
heavily changed. Fortunately, they all used refqueue.c, which is for
sharing the data flow logic (especially for managing future/past
surfaces and such). It turns out it can be used to factor out most of
the data flow. Some of these filters accepted software input. Instead of
having ad-hoc upload code in each filter, surface upload is now
delegated to f_autoconvert, which can use f_hwupload to perform this.

Exporting VO capabilities is still a big mess (mp_stream_info stuff).

The D3D11 code drops the redundant image formats, and all code uses the
hw_subfmt (sw_format in FFmpeg) instead. Although that too seems to be a
big mess for now.

f_async_queue is unused.
2018-01-30 03:10:27 -08:00