It is VERY recommended to check whether the MTRR registers are set up properly, because they can give a big performance boost. Do a 'cat /proc/mtrr':
--($:~)-- cat /proc/mtrr
reg00: base=0xe4000000 (3648MB), size= 16MB: write-combining, count=9
reg01: base=0xd8000000 (3456MB), size= 128MB: write-combining, count=1
That's right, it shows my Matrox G400 with 16MB of memory. I did this from XFree86 4.x.x, which sets up the MTRR registers automatically.
If nothing worked, you have to do it manually. First, you have to find the base address. There are 3 ways to find it:

1. from the X11 startup messages, for example:
  (--) SVGA: PCI: Matrox MGA G400 AGP rev 4, Memory @ 0xd8000000, 0xd4000000
  (--) SVGA: Linear framebuffer at 0xD8000000
2. from the output of 'lspci -v':
  01:00.0 VGA compatible controller: Matrox Graphics, Inc.: Unknown device 0525
  Memory at d8000000 (32-bit, prefetchable)
3. from the mga_vid kernel driver messages:
  mga_mem_base = d8000000
Then let's find the memory size. This is very easy, just convert the video RAM size to hexadecimal, or use this table:

   1 MB    0x100000
   2 MB    0x200000
   4 MB    0x400000
   8 MB    0x800000
  16 MB    0x1000000
  32 MB    0x2000000
Now you know the base address and the memory size, so let's set up the MTRR registers! For example, for the Matrox card above (base=0xd8000000) with 32MB of RAM (size=0x2000000), just execute:
echo "base=0xd8000000 size=0x2000000 type=write-combining" >| /proc/mtrr
Not all CPUs support MTRRs. For example, older K6-2s (around 266MHz, stepping 0) don't support MTRR, but stepping 12 does ('cat /proc/cpuinfo' to check it).
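Here is a minimal sketch of the manual setup above, to be run as root (the base address and RAM size are just placeholders for the values you found for your own card):

  BASE=0xd8000000                              # base address from lspci / X log / mga_vid
  MB=32                                        # video RAM in megabytes
  SIZE=$(printf '0x%x' $((MB * 1024 * 1024)))  # 32 MB -> 0x2000000
  echo "base=$BASE size=$SIZE type=write-combining" >| /proc/mtrr
  cat /proc/mtrr                               # verify that the new region shows up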
Under XFree86 4.0.2 or newer, you can use your card's hardware YUV routines via the XVideo extension. This is what the '-vo xv' option uses. Also, this driver supports adjusting brightness/contrast/hue/etc. (unless you use the old, slow DirectShow DivX codec, which supports it everywhere), see the man page.
In order to make this work, be sure to check the following:
(II) Loading extension XVideo
in /var/log/XFree86.0.log
NOTE: this only shows that XFree86's extension is loaded. In a good install this is always loaded, and it doesn't mean that the _card's_ XVideo support is loaded!
X-Video Extension version 2.2
screen #0
  Adaptor #0: "Savage Streams Engine"
    number of ports: 1
    port base: 43
    operations supported: PutImage
    supported visuals:
      depth 16, visualID 0x22
      depth 16, visualID 0x23
    number of attributes: 5
(...)
    Number of image formats: 7
      id: 0x32595559 (YUY2)
        guid: 59555932-0000-0010-8000-00aa00389b71
        bits per pixel: 16
        number of planes: 1
        type: YUV (packed)
      id: 0x32315659 (YV12)
        guid: 59563132-0000-0010-8000-00aa00389b71
        bits per pixel: 12
        number of planes: 3
        type: YUV (planar)
(...etc...)
It must support the YUY2 packed and YV12 planar pixel formats to be usable with MPlayer.
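A minimal sketch of both checks (it assumes the standard XFree86 log path and that the stock xvinfo utility is installed):

  grep "Loading extension XVideo" /var/log/XFree86.0.log
  xvinfo | grep -E 'YUY2|YV12'    # both formats should appear in the adaptor list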
Older 3dfx drivers were known to have problems with XVideo acceleration: they supported neither YUY2 nor YV12. Verify that you have XFree86 version 4.2.0 or greater, it works OK with YV12 and YUY2. Previous versions, including 4.1.0, crash with YV12! If you experience strange effects using -vo xv, try SDL (it has XVideo too) and see if it helps. Check the SDL section for details.
OR, try the NEW -vo tdfxfb driver! See section 2.3.1.9!
S3 Savage3D cards should work fine, but for the Savage4 use XFree86 version 4.0.3 or greater (in case of image problems, try 16bpp). As for the S3 Virge: there is Xv support, but the card itself is very slow, so you'd better sell it.
NOTE: Savage cards are slow at displaying YV12 images (they need a YV12->YUY2 conversion, because the Savage hardware can't display YV12). So when this documentation says at some point "this has YV12 output, use it, it's faster", it isn't necessarily true here. Try this driver, it uses MMX2 for this task and is faster than the native X driver.
nVidia isn't a very good choice under Linux (according to nVidia, this is not true). You'll have to use the binary closed-source nVidia driver, available from nVidia's website. The standard XFree86 driver doesn't support XVideo for these cards, due to nVidia's closed sources/specifications.
As far as I know the latest XFree86 driver contains XVideo support for Geforce 2 and 3.
NeoMagic: these cards can be found in many laptops. Unfortunately, the driver in X 4.2.0 can't do Xv, but we have a modified, Xv-capable driver for you. Download it from here. Driver provided by Stefan Seyfried.
To allow playback of DVD sized content change your XF86Config like this :
Section "Device"
[...]
Driver "neomagic"
Option "OverlayMem" "829440"
[...]
EndSection
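As an aside (my own guess, not stated by the author): 829440 happens to be exactly 720*576*2, i.e. one full PAL DVD frame at 2 bytes per pixel, so for a different maximum frame size you could compute the value the same way:

  echo $((720 * 576 * 2))    # prints 829440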
If you want to use Xv with a Trident card, given that it doesn't work with 4.1.0, try the latest CVS of XFree86 or wait for XFree86 4.2.0. The latest CVS adds fullscreen Xv support for the Cyberblade XP card.
This document tries to explain in a few words what DGA is in general and what the DGA video output driver for MPlayer can do (and what it can't).
DGA is short for Direct Graphics Access and is a means for a program to bypass the X server and directly modify the framebuffer memory. Technically speaking, this happens by mapping the framebuffer memory into the memory range of your process. This is allowed by the kernel only if you have superuser privileges. You can get these either by logging in as root or by setting the SUID bit on the mplayer executable (NOT recommended!).
There are two versions of DGA: DGA1 is used by XFree 3.x.x and DGA2 was introduced with XFree 4.0.1.
DGA1 provides only direct framebuffer access as described above. For switching the resolution of the video signal you have to rely on the XVidMode extension.
DGA2 incorporates the features of XVidMode extension and also allows switching the depth of the display. So you may, although basically running a 32 bit depth XServer, switch to a depth of 15 bits and vice versa.
However DGA has some drawbacks. It seems it is somewhat dependent on the graphics chip you use and on the implementation of the XServer's video driver that controls this chip. So it does not work on every system ...
2.3.1.3.3. Installing DGA support for MPlayer
First make sure X loads the DGA extension, see in /var/log/XFree86.0.log:
(II) Loading extension XFree86-DGA
As you can see, XFree86 4.0.x or greater is VERY RECOMMENDED! MPlayer's DGA driver is autodetected by ./configure, or you can force it with --enable-dga.
If the driver couldn't switch to a smaller resolution, experiment with the switches -vm (only with X 3.3.x), -fs, -bpp, -zoom to find a video mode that the movie fits in. There is no converter right now. :(
Become ROOT. DGA needs root access to be able to write directly video memory. If you want to run it as user, then install MPlayer SUID root:
chown root /usr/local/bin/mplayer
chmod 750 /usr/local/bin/mplayer
chmod +s /usr/local/bin/mplayer
Now it works as a simple user, too.
!!!! BUT STAY TUNED !!!!
This is a BIG security risk! Never do this on a server or on a computer that can be accessed by people other than you, because they can gain root privileges through the SUID root mplayer.
!!!! SO YOU HAVE BEEN WARNED ... !!!!
Now use the '-vo dga' option, and there you go! (hope so :) You should also try whether the '-vo sdl:dga' option works for you! It's much faster!
2.3.1.3.4. Resolution switching
The DGA driver allows for switching the resolution of the output signal. This avoids the need for doing (slow) software scaling and at the same time provides a fullscreen image. Ideally it would switch to the exact resolution (except for honouring aspect ratio) of the video data, but the X server only allows switching to resolutions predefined in /etc/X11/XF86Config (/etc/X11/XF86Config-4 for XFree 4.0.X respectively).
Those are defined by so-called modelines and depend on the capabilities of your video hardware. The X server scans this config file on startup and disables the modelines not suitable for your hardware. You can find out which modes survive from the X11 log file, which can be found at /var/log/XFree86.0.log.
See appendix A for some sample modeline definitions.
DGA is used in two places with MPlayer: The SDL driver can be made to make use of it (-vo sdl:dga) and within the DGA driver (-vo dga). The above said is true for both; in the following sections I'll explain how the DGA driver for MPlayer works.
2.3.1.3.6. Features of the DGA driver
The DGA driver is invoked by specifying -vo dga on the command line. The default behaviour is to switch to a resolution matching the original resolution of the video as closely as possible. It deliberately ignores the -vm and -fs switches (enabling of video mode switching and fullscreen) - it always tries to cover as much area of your screen as possible by switching the video mode, thus refraining from using a single additional cycle of your CPU to scale the image. If you don't like the mode it chooses, you may force it to choose the mode matching the resolution you specify with -x and -y most closely. By providing the -v option, the DGA driver will print, among a lot of other things, a list of all resolutions supported by your current XF86-Config file. Having DGA2 you may also force it to use a certain depth by using the -bpp option. Valid depths are 15, 16, 24 and 32. It depends on your hardware whether these depths are natively supported or if a (possibly slow) conversion has to be done.
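A hypothetical invocation combining the switches described above (the resolution, depth and file name are just placeholders):

  mplayer -vo dga -x 800 -y 600 -bpp 16 -v movie.avi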
If you are lucky enough to have enough offscreen memory left to put a whole image there, the DGA driver will use doublebuffering, which results in much smoother movie playback. It will tell you whether doublebuffering is enabled or not.
Doublebuffering means that the next frame of your video is being drawn in some offscreen memory while the current frame is being displayed. When the next frame is ready, the graphics chip is just told the location in memory of the new frame and simply fetches the data to be displayed from there. In the meantime the other buffer in memory will be filled again with new video data.
Doublebuffering may be switched on with the option -double and may be disabled with -nodouble. The current default is to disable doublebuffering. When using the DGA driver, onscreen display (OSD) only works with doublebuffering enabled. However, enabling doublebuffering may result in a big speed penalty (on my K6-II+ 525 it used an additional 20% of CPU time!) depending on the implementation of DGA for your hardware. Generally speaking, DGA framebuffer access should be at least as fast as using the X11 driver, with the additional benefit of getting a fullscreen image. The percentage speed values printed by mplayer have to be interpreted with some care; for example, with the X11 driver they do not include the time the X server needs for the actual drawing. Hook a terminal to a serial line of your box and start top to see what is really going on in your box ...
Generally speaking, the speedup gained by using DGA over 'normal' use of X11 highly depends on your graphics card and how well the X server module for it is optimized.
If you have a slow system, you'd better use a 15 or 16 bit depth, since they require only half the memory bandwidth of a 32 bit display.
Using a depth of 24 bit is a good idea even if your card natively supports only 32 bit depth, since it transfers 25% less data than the 32/32 mode.
I've already seen some AVI files played back on a Pentium MMX 266. AMD K6-2 CPUs might work at 400 MHz and above.
Well, according to some developers of XFree, DGA is quite a beast. They tell you better not to use it. Its implementation is not always flawless with every chipset driver for XFree out there.
Section "Modes" Identifier "Modes[0]" Modeline "800x600" 40 800 840 968 1056 600 601 605 628 Modeline "712x600" 35.0 712 740 850 900 400 410 412 425 Modeline "640x480" 25.175 640 664 760 800 480 491 493 525 Modeline "400x300" 20 400 416 480 528 300 301 303 314 Doublescan Modeline "352x288" 25.10 352 368 416 432 288 296 290 310 Modeline "352x240" 15.750 352 368 416 432 240 244 246 262 Doublescan Modeline "320x240" 12.588 320 336 384 400 240 245 246 262 Doublescan EndSection
These entries work fine with my Riva128 chip, using nv.o XServer driver module.
If you experience trouble with the DGA driver, please feel free to send a bug report to me (e-mail address below). Please start mplayer with the -v option and include in the bug report all lines of the output that start with vo_dga:
Please do also include the version of X11 you are using, the graphics card and your CPU type. The X11 driver module (defined in XF86-Config) might also help. Thanks!
Acki (acki@acki-netz.de, www.acki-netz.de)
SDL (Simple DirectMedia Layer) is basically a unified video/audio interface. Programs that use it know only about SDL, and not about which video or audio driver SDL actually uses. For example, a Doom port using SDL can run on svgalib, aalib, X, fbdev, and others; you only have to specify (for example) the video driver to use, with the SDL_VIDEODRIVER environment variable. Well, in theory.
With MPlayer, we used its X11 driver's software scaler ability for cards/drivers that don't support XVideo, until we made our own (faster, nicer) software scaler. We also used its aalib output, but now we have our own, which is more comfortable. Its DGA mode was better than ours, until recently. Get it now? :)
It also helps with some buggy drivers/cards if the video is jerky (not because of a slow system), or if the audio is lagging.
SDL video output supports displaying subtitles under the movie, on the black bar (if present).
Here are some notes about SDL out in MPlayer.
There are several command line switches for SDL:

  -vo sdl:name   specifies the SDL video driver to use (i.e. aalib, dga, x11)
  -ao sdl:name   specifies the SDL audio driver to use (i.e. dsp, esd, arts)
  -noxv          disables XVideo hardware acceleration
  -forcexv       tries to force XVideo acceleration

SDL keys:

  F     toggles fullscreen/windowed mode
  C     cycles available fullscreen modes
  W/S   mappings for * and / (mixer control)
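A hedged example combining the switches above (which SDL drivers are actually available depends on how your SDL was built; the file name is a placeholder):

  mplayer -vo sdl:aalib -ao sdl:dsp movie.avi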
KNOWN BUGS:
If you don't have X, you can use the SVGAlib target! Be sure not to use the -fs switch, since it toggles the usage of the software scaler, and that's SLOOOW right now, unless you have a really fast CPU (and/or MTRR?). :(
Of course you'll have to install svgalib and its development package in order for MPlayer to build its SVGAlib driver (autodetected, but it can be forced), and don't forget to edit /etc/vga/libvga.config to suit your card and monitor.
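A minimal sketch, assuming the SVGAlib driver was built in (its MPlayer name is 'svga'); run it from a text console, usually as root:

  mplayer -vo svga movie.avi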
2.3.1.6. Framebuffer output (FBdev)
Whether to build the FBdev target is autodetected during ./configure. Read the framebuffer documentation in the kernel sources (Documentation/fb/*) for information on how to enable it, etc.!
If your card doesn't support the VBE 2.0 standard (older ISA/PCI cards, such as the S3 Trio64), only VBE 1.2 (or older?): well, VESAfb is still available, but you'll have to load SciTech Display Doctor (formerly UniVBE) before booting Linux. Use a DOS boot disk or whatever. And don't forget to register your UniVBE ;))
The FBdev output takes some additional parameters on top of the others:

  -fb                 specify the framebuffer device to use (/dev/fb0)
  -fbmode             mode name to use (according to /etc/fb.modes)
  -fbmodeconfig       config file of modes (default /etc/fb.modes)
  -monitor_hfreq      IMPORTANT values, see example.conf
  -monitor_vfreq
  -monitor_dotclock
If you want to change to a specific mode, then use

  mplayer -vm -fbmode (NameOfMode) filename

To hide the cursor:
  echo -e '\033[?25l'
or
  setterm -cursor off
To turn off screen blanking:
  setterm -blank 0
To turn the cursor back on:
  echo -e '\033[?25h'
or
  setterm -cursor on
NOTE: FBdev video mode changing _does not work_ with the VESA framebuffer, and don't ask for it, since it's not an MPlayer limitation.
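Putting the commands above into one place, a hypothetical wrapper script might look like this (the mode name is only an example entry from /etc/fb.modes; leave out -vm and -fbmode on a VESA framebuffer):

  #!/bin/sh
  setterm -cursor off                             # hide the cursor
  setterm -blank 0                                # disable screen blanking
  mplayer -vo fbdev -vm -fbmode 640x480-72 "$1"   # play the file given as argument
  setterm -cursor on                              # restore the cursor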
2.3.1.7. Matrox framebuffer (mga_vid)
This section is about the Matrox G200/G400/G450/G550 BES (Back-End Scaler) support, the mga_vid kernel driver. It's actively developed by me (A'rpi), and it has hardware VSYNC support with triple buffering. It works both on a framebuffer console and under X.
WARNING: on non-Linux systems, use Vidix for mga_vid !!!
To use it, you first have to compile mga_vid.o:
cd drivers
make
Then create /dev/mga_vid device:
mknod /dev/mga_vid c 178 0
and load the driver with
insmod mga_vid.o
You should verify the memory size detection using the 'dmesg' command. If it's wrong, use the mga_ram_size option (rmmod mga_vid first) to specify the card's memory size in MB:
insmod mga_vid.o mga_ram_size=16
To make it load/unload automatically when needed, first insert the following line at the end of /etc/modules.conf:

  alias char-major-178 mga_vid

Then copy the mga_vid.o module to the appropriate place under /lib/modules/<kernel version>/somewhere and run

  depmod -a
Now you have to (re)compile MPlayer; ./configure will detect /dev/mga_vid and build the 'mga' driver. Using it from MPlayer goes by '-vo mga' if you have a matroxfb console, or '-vo xmga' under XFree86 3.x.x or 4.x.x.
The mga_vid driver cooperates with Xv.
The /dev/mga_vid device file can be read (for example by 'cat /dev/mga_vid') for some info, and written to for brightness changes:

  echo "brightness=120" > /dev/mga_vid
2.3.1.8. SiS 6326 framebuffer (sis_vid)
SiS 6326 YUV framebuffer driver -> sis_vid kernel driver.
Its interface should be compatible with mga_vid, but the driver was not updated after the mga_vid changes, so it's outdated now. Volunteers are needed to test it and bring the code up to date.
2.3.1.9. 3dfx YUV support (tdfxfb)
This driver uses the kernel's tdfx framebuffer driver to play movies with YUV acceleration. You'll need a kernel with tdfxfb support, and you have to recompile MPlayer with

  ./configure --enable-tdfxfb
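Once rebuilt, a minimal (assumed) invocation from the framebuffer console would be:

  mplayer -vo tdfxfb movie.avi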
2.3.1.10. OpenGL output
MPlayer supports displaying movies using OpenGL, but if your platform/driver supports Xv, as should be the case on a PC with Linux, use Xv instead; OpenGL performance is considerably worse. If you have an X11 implementation without Xv support, OpenGL is a viable alternative.
Unfortunately not all drivers support this feature. The Utah-GLX drivers (for XFree86 3.3.6) support it for all cards. See http://utah-glx.sourceforge.net for details about how to install it.
XFree86(DRI) >= 4.0.3 supports OpenGL with Matrox and Radeon cards, >= 4.2 supports Rage128. See http://dri.sourceforge.net for download and installation instructions.
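A minimal sketch, assuming the OpenGL output is selected with -vo gl (the file name is a placeholder):

  mplayer -vo gl movie.avi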
2.3.1.11. AAlib - text mode displaying
AAlib is a library for displaying graphics in text mode, using a powerful ASCII renderer. There are LOTS of programs already supporting it, like Doom, Quake, etc. MPlayer contains a very usable driver for it. If ./configure detects an installed aalib, the aalib libvo driver will be built.
You can use some keys in the AA window to change rendering options:

  1    decrease contrast
  2    increase contrast
  3    decrease brightness
  4    increase brightness
  5    switch fast rendering on/off
  6    set dithering mode (none, error distribution, Floyd-Steinberg)
  7    invert image
  a    toggles between aa and mplayer control

The following command line options can be used:

  -aaosdcolor=V    change OSD color
  -aasubcolor=V    change subtitle color
  where V can be: (0/normal, 1/dark, 2/bold, 3/boldfont, 4/reverse, 5/special)

AAlib itself provides a large number of options. Here are some important ones:

  -aadriver      set recommended aa driver (X11, curses, linux)
  -aaextended    use all 256 characters
  -aaeight       use eight bit ASCII
  -aahelp        prints out all aalib options
NOTE: the rendering is very CPU intensive, especially when using AA-on-X (using aalib on X), and it's least CPU intensive on a standard, non-framebuffer console. Use SVGATextMode to set up a big text mode, then enjoy! (secondary head Hercules cards rock :)) (anyone want to enhance fbdev to do conversion/dithering to hgafb? Would be neat :)
Use the -framedrop option if your comp isn't fast enough to render all frames!
When playing on a terminal you'll get better speed and quality using the linux driver, not curses (-aadriver linux). But for that you need write access to /dev/vcsa<terminal>! That isn't autodetected by aalib, but vo_aa tries to find the best mode. See http://aa-project.sourceforge.net/tune/ for further tuning issues.
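A hedged example pulling the above together for console playback (options as listed above; the file name is a placeholder):

  mplayer -vo aa -aadriver linux -framedrop movie.avi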
2.3.1.12. VESA - output to VESA BIOS
This driver was designed and introduced as a generic driver for any video card which has a VESA VBE 2.0 compatible BIOS. Another reason for developing this driver was the multitude of troubles with displaying movies on TV.
VESA BIOS EXTENSION (VBE) Version 3.0 Date: September 16, 1998 (Page 70)
says:
Dual-Controller Designs
VBE 3.0 supports the dual-controller design by assuming that since both
controllers are typically provided by the same OEM, under control of a
single BIOS ROM on the same graphics card, it is possible to hide the fact
that two controllers are indeed present from the application. This has the
limitation of preventing simultaneous use of the independent controllers,
but allows applications released before VBE 3.0 to operate normally. The
VBE Function 00h (Return Controller Information) returns the combined
information of both controllers, including the combined list of available modes.
When the application selects a mode, the appropriate controller is activated.
Each of the remaining VBE functions then operates on the active controller.
So you have a chance to get working TV-out by using this driver. (I guess that the TV-out is frequently a standalone head, or at least a standalone output.)
What are the pluses:
- You have a chance to watch movies even if Linux doesn't know your video hardware.
- You don't need to have any graphics-related things installed on your Linux (like X11 (aka XFree86), fbdev and so on). This driver can be run from text mode.
- You have a chance to get working TV-out. (It's known to work at least for ATI's cards.)
- This driver calls the int 10h handler, thus it's not an emulator - it calls real things of a real BIOS in real mode (actually, in vm86 mode).
- Most important :) You can watch DVDs at 320x200 if you don't have a powerful CPU.
What are the minuses:
- It works only on x86 systems.
- It's the slowest of all the drivers available for MPlayer (but only if your card doesn't support a DGA mode - otherwise this driver is comparable in speed with the -vo dga and -vo fbdev ones).
- It can be used only by ROOT.
- Currently it's available only for Linux.
- It doesn't use any hardware acceleration (like YUV overlay or hw scaling).
Don't use this driver with GCC 2.96 ! It won't work !
These command line switches are currently available for VESA:

  -vo vesa:opts              currently recognized: dga to force dga mode and nodga to disable dga mode. Note: you may omit these parameters to enable autodetection of dga mode. (In the future this will also specify mode parameters such as refresh rate, interlacing, doublescan and so on. Samples: i43, 85, d100)
  -screenw, -screenh, -bpp   force a user-defined mode
  -x, -y                     set user-defined prescaling
  -zoom                      enables user-defined prescaling
  -fs                        scales the image to fullscreen
  -fs -zoom                  scales the user-defined prescaling to fullscreen
  -double                    enables double buffering mode (available only in DGA mode). It should be slower than single buffering, but has no flickering effects.
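A hypothetical example using the switches above (run as root from text mode; the file name is a placeholder):

  mplayer -vo vesa:dga -fs -zoom -double movie.avi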
Known problems and workarounds:
- If you have an NLS font installed on your Linux box and run the VESA driver from text mode, then after terminating mplayer you will have the ROM font loaded instead of the national one. You can load the national font again by using the setsysfont utility from, for example, the Mandrake distribution. (Hint: the same utility is used for localizing fbdev.)
- Some Linux graphics drivers don't update the active BIOS mode in DOS memory. So if you have such a problem, always use the VESA driver from text mode only. Otherwise text mode (#03) will be activated anyway and you will need to restart your computer.
- Often after terminating the VESA driver you get a black screen. To return your screen to its original state, simply switch to another console (by pressing Alt-Fx), then switch back to your previous console the same way.
- To get working TV-out you need to have the TV connector plugged in before booting your PC, since the video BIOS initializes itself only once, during the POST procedure.
2.3.1.13. X11
Avoid if possible. Outputs to X11 (uses the shared memory extension), with no hardware acceleration at all. Supports (MMX/3DNow/SSE accelerated, but still slow) software scaling; use the options -fs -zoom. Most cards have hardware scaling support, use the -vo xv output for them, or -vo xmga for Matrox cards.
The problem is that most cards' drivers don't support hardware acceleration on the second head/TV. In those cases, you see a green/blue coloured window instead of the movie. This is where this driver comes in handy, but you need a powerful CPU to use software scaling. Don't use the SDL driver's software output+scaler, it has worse image quality!
Software scaling is very slow, so you'd better try changing video modes instead. It's very simple. See the DGA section's modelines, and insert them into your XF86Config. Then use the -vm option. It will change to a resolution your movie fits in. If it doesn't, you can still cycle through the configured modelines manually with Ctrl+Alt+Keypad-Plus and Ctrl+Alt+Keypad-Minus.

2.3.1.14. Rage128 (Pro) / Radeon video overlay (radeon_vid)
This section is OBSOLETED ! Use Vidix !
WHAT IS VIDIX
VIDIX is the abbreviation for VIDeo Interface for *niX. VIDIX was designed and introduced as an interface for fast user-space drivers, providing DGA everywhere where it's possible (unlike X11). I hope that these drivers will be as portable as X11 (and not only on *nix).
What is it? I can tell you in bold capital letters: VIDIX PROVIDES DIRECT GRAPHICS ACCESS TO BES YUV MEMORY. Well, implementing DGA for the MPEG2 decoder is still on my TODO list.
This interface was designed as an attempt to fit existing video acceleration interfaces (known as mga_vid, mga_yuv, radeon_vid) into a fixed scheme. It provides a high-level interface to chips which are known as BES (BackEnd Scalers) or OV (Video Overlays). It doesn't provide a low-level interface to things which are known as graphics servers. (I don't want to compete with the X11 team in graphics mode switching.) I.e. the main goal of this interface is to provide maximal speed of video playback, not putting the video signal on the screen of your TV or on the tape of your VCR. Although these things are also very significant, it's an entirely different task. (However, I guess it would be possible to implement something like mini-X (don't mix it up with Minix ;) in the future, if enough volunteers are found.)
USAGE
VIDIX can be used standalone (-vo xvidix) or as a subdevice of other video output drivers: -vo vesa:vidix and -vo fbdev:vidix.
REQUIREMENTS
The video output driver has to support the :vidix subdevice.
USAGE METHODS
When VIDIX is used as a subdevice (-vo vesa:vidix), video mode configuration is performed by the video output device (vo_server in short). Therefore you can pass the same keys on MPlayer's command line as for vo_server. In addition, it understands the -double key as a globally visible parameter. (I recommend using this key with VIDIX, at least for ATI cards.)
As for -vo xvidix: currently it recognizes the following options: -fs -zoom -x -y -double.
Also you can specify the VIDIX driver directly as a third subargument on the command line:

  mplayer -vo xvidix:mga_vid.so -fs -zoom -double file.avi
or
  mplayer -vo vesa:vidix:radeon_vid.so -fs -zoom -double -bpp 32 file.avi
But this is dangerous, and you shouldn't do it. In this case the given driver will be forced and the result is unpredictable (it may freeze your computer). You should do that ONLY if you are absolutely sure it will work and MPlayer doesn't do it automatically. Please tell the developers about it. The Right Way is to use VIDIX without arguments to enable driver autodetection.
VIDIX is a very new technology and it's quite possible that on your system (OS=abc CPU=xyz) it won't work. In this case the only solution for you is to port it (mainly libdha). But there is hope that it will work on those systems where X11 does.
And the last WARNING: (un)fortunately you MUST have ROOT privileges to use VIDIX, due to direct hardware access. At least set the SUID bit on the MPlayer executable.
VIDEO EQUALIZER
This is a video equalizer implemented especially for VIDIX. You can use it either with the 1-8 keys as described in the man page, or with command line arguments. MPlayer recognizes the following options:
  -brightness         adjust the BRIGHTNESS of the video output. It's not equal to the brightness adjustment on a monitor panel or TV; it changes the intensity of the RGB components of the video signal from black to white screen.
  -contrast           adjust the CONTRAST of the video output. Works in a similar manner to brightness.
  -saturation         adjust the SATURATION of the video output. You can get grayscale output with this option.
  -hue                adjust the HUE of the video signal. You can get a colored negative of the image with this option.
  -red_intensity      adjust the intensity of the RED component of the video signal.
  -green_intensity    adjust the intensity of the GREEN component of the video signal.
  -blue_intensity     adjust the intensity of the BLUE component of the video signal.
Each parameter can accept values from -1000 to +1000.
Default value for each parameter is 0.
Note: not every driver provides support for each of these parameters. Currently only radeon_vid.so provides full support for video equalizing. Other drivers only partly support these options.
Examples:

  mplayer -vo vesa:vidix -brightness -300 -contrast 200 filename.avi
or
  mplayer -vo xvidix -red_intensity -50 -saturation 400 -hue 300 filename.vob
This is a display driver (-vo zr) for a number of MJPEG capture/playback cards (tested with the DC10+ and Buz, and it should work for the LML33 and the DC10). The driver works by encoding the frame to JPEG and then sending it to the card. For the JPEG encoding libavcodec is used, and required. This driver talks to the kernel driver available at http://mjpeg.sourceforge.net, so you must get that working first. Then recompile MPlayer with --enable-zr.
Some remarks:
- The driver has a number of -zr* command line options. An explanation of these options can be viewed with -zrhelp. It is possible to crop the input frame (cut borders to make it fit or to enhance performance) and to do other things.
- If MPlayer prints "Sorry, selected video_out device is incompatible with this codec.", the codec in use produces output that this driver cannot handle.
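A minimal (assumed) invocation once the kernel driver is loaded and MPlayer is rebuilt; add -zrhelp to the command line to list the -zr* options mentioned above:

  mplayer -vo zr movie.avi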
Under Linux you have 2 methods to get the G400 TV out working:
IMPORTANT: only the Matrox G400DH/G400MAX has TV-out support under Linux; the others (G450, G550) do NOT!
1. XFree86: using the -vo x11 -fs -zoom options, but it will be SLOW, and has Macrovision copy protection enabled (you can "workaround" Macrovision using this perl script).
2. Framebuffer (matroxfb):
   - Enter TVout/matroxset and type make. Install matroxset somewhere in your PATH.
   - If you don't have fbset installed, enter TVout/fbset and type make. Install fbset somewhere in your PATH.
   - Then enter the TVout/ directory in the MPlayer source, and execute ./modules as root. Your text-mode console will enter framebuffer mode (no way back!).
   - Next, run the ./matroxtv script. This will present you with a very simple menu. Press 2 and ENTER. Now you should have the same picture on your monitor and TV. The 3rd option will turn on independent display, but then you can't use X! If the TV (PAL!) picture has some weird stripes on it, the script wasn't able to set the resolution correctly (640x512 by default). Use the other menu items randomly and it'll be OK :)

Yoh. The next task is to make the cursor on tty1 (or whatever) disappear, and to turn off screen blanking. Execute the following commands:
To hide the cursor:
  echo -e '\033[?25l'
or
  setterm -cursor off
To turn off screen blanking:
  setterm -blank 0
You possibly want to put the above into a script, and also clear the screen. To turn the cursor back on:
  echo -e '\033[?25h'
or
  setterm -cursor on
Yeah, kewl. Start movie playing with

  mplayer -vo mga -fs -screenw 640 -screenh 512 <filename>

(if you use X, now change to matroxfb with, for example, CTRL-ALT-F1!) Change 640x512 if you set the resolution to something else.
Enjoy the ultra-fast ultra-featured Matrox TV output (better than Xv) !
A few words about ATI's TV-out:
Currently ATI doesn't want to support any of its TV-out chips under Linux.
Below is the official answer from ATI Inc.:
> Hello!
>
> On your pages you wrote that you support linux developers.
> Currently I participate with mplayer project (www.mplayerhq.hu)
> I'm interesting with enabling TV-out on Radeon VE chips during
> movie playback. I would be glad to add this feature to radeonfb driver
> (which can be found in CVS tree of mplayer project at main/drivers/radeon).
> Do I have a chance to get any official technical documenation?
We will not provide TV out related documents due to macrovision concerns.
Also mpeg2 decoding is something that we MAY consider in the future but not
at this current time. This is again due to proprietary and 3rd party
information.
A pity, isn't it?
Q: What is Macrovision?
A: It's a copy protection mechanism.
It means that if they opened up any TV-out related information, then hackers would be able to disable the copy protection on their chips. Therefore we have no chance of getting working TV-out on ATI.
What's the status of ATI's TV-out chips under Linux?
Fortunately, owners of fast enough CPUs (Duron, Celeron2 and better) can watch movies on their TV through VESA drivers.
I should say some good words about ATI Inc. too: they produce top quality BIOSes.
The VESA driver doesn't use any hardware acceleration, but it simulates DGA through a 64K window, which is configured through the 32-bit mode functions of the BIOS. ATI cards have fast enough video memory (DIMM or DDR chips with 64-128 bit access), so it's not a bottleneck for them. There are no limitations on which video mode can be displayed on your TV (unlike on other cards), so you can use any video mode on your TV (from 320x200 up to 1024x768).
On the other hand (it's known at least for Radeons), there is a DGA mode which is detected automatically, and in this case you'll get speed comparable with the -vo dga and -vo fbdev drivers.
The only thing you need to do is have the TV connector plugged in before booting your PC, since the video BIOS initializes itself only once, during the POST procedure.
For details, see the VESA section of this documentation.
Check this URL.