[MPlayer-users] Tearing and VSync [WAS: Re: Mplayer picture question]

Ilkka Karvinen kartsan at gmail.com
Mon Apr 21 22:09:47 CEST 2008


On Monday 21 April 2008 09:49, Martin Emrich wrote:
> I actually have the same problem here (-vo xv, nVidia GF2 MX or GF3,
> connected via TV-Out to a PAL TV set). AFAIK, usually any playback
> device forces its own refresh rate (synced to the content being played)
> onto the TV set, which in turn syncs itself onto the presented refresh
> rate (in a very low margin, e.g. 47-53 Hz for a PAL TV).
>
> I am quite sure that this is not possible with an nvidia graphics card
> (correct me if I am wrong), but what is the best recommended hardware
> for tearing- and artefact-free mplayer/ffmpeg output to a TV set (analog
> Composite video or soon via DVI/HDMI)?
>
> Or how do I enable VSync on X.Org (the player PC is fast enough, and
> skipping/repeating one frame every few seconds to resynchronize would be
> quite less annoying for me than the constant tearing)?

After tweaking the whole evening, I'm finally getting tearing-free video.

I found this page:
http://www.nvnews.net/vbulletin/showthread.php?t=85301

which explained how to enable vsync with the OpenGL output. I used that
__GL_SYNC_TO_VBLANK=1 setting and enabled vsync in the nvidia-settings
program. I still saw these artifacts. I was also having problems with the
refresh rate. My projector said it was always getting 1080/60i whatever
I tried. It turned out that nvidia has this system of "backend resolution"
and "frontend resolution". When I disabled "Force Full GPU Scaling", I
was able to change the modeline so that the right output for 720/50p is
sent to the projector. When the refresh rate is 50 Hz, 25 fps video looks OK.
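For anyone trying the same combination, the invocation can be sketched like this (the filename is just a placeholder; __GL_SYNC_TO_VBLANK is the nvidia driver's environment variable for syncing OpenGL buffer swaps to vblank):

```shell
# Tell the nvidia OpenGL driver to sync buffer swaps to vblank,
# then use mplayer's OpenGL video output instead of xv.
__GL_SYNC_TO_VBLANK=1 mplayer -vo gl movie.mkv
```

The variable can also be set system-wide, or the same option toggled in nvidia-settings under "OpenGL Settings".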

One more thing: if I enable triple buffering, I get the artifacts again.
It only works with double buffering for me.
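For reference, the relevant xorg.conf pieces might look like the sketch below. The 720p50 timings are the standard CEA-861 ones (74.25 MHz pixel clock, 1980x750 total, giving exactly 50 Hz); the section layout and whether you need an explicit ModeLine at all depend on your setup:

```
Section "Monitor"
    Identifier "Projector"
    # CEA-861 720p50: 74.25e6 / (1980 * 750) = 50.00 Hz
    ModeLine "1280x720_50" 74.25  1280 1720 1760 1980  720 725 730 750 +HSync +VSync
EndSection

Section "Device"
    Identifier "nvidia0"
    Driver     "nvidia"
    # Triple buffering caused artifacts here, so keep it off.
    Option     "TripleBuffer" "false"
EndSection
```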

I'm using the nvidia driver. I didn't try with the xorg driver.

Ilkka
