[MPlayer-dev-eng] libvo2

David Holm dholm at telia.com
Thu Nov 15 10:11:09 CET 2001


Arpi wrote:

>Hi,
>
>>as some of you know my cvs updates were unappreciated (mainly due to 
>>your idiotic mailservers taking 3 hours to deliver mail from the cvs 
>>
>it was your, not our servers.
>
I need to channel my anger somehow until it's gone... please don't take 
it too personally...

>
>
>>mailinglist), so I got cut off from cvs access.
>>
>i can't see the connection.
>it's written down, and you should know how to make patches. the only
>difference is that instead of diff you type cvs commit.
>you shouldn't do cosmetic changes, commit files with zero changes, or
>commit non-working patches. it has nothing to do with your subscription to cvslog.
>if you're a newbie or unsure, do a cvs diff -u first, check that it's ok and
>only commit then. even i usually do this before a commit, to avoid committing
>debug code or false comments.
>
whenever I get cvs access again I will do so...

>
>
>>I'm not going to stop development of it because of this, but I have 
>>reprioritized.
>>
>you know.
>
you too...

>
>
>>The dxr3 users, unlike the rest of you, are pleased with my work. So 
>>I'll put all my effort into getting the current libvo dxr3 plugin working 
>>correctly (meaning win32 codec playback, plus audio and subpic playback 
>>without locking video) before I go back to libvo2.
>>I have created a draft of the libvo2 feature set and I'll post it now to get 
>>some feedback before I return to developing it. Most, if not all, of the 
>>things in the spec are from various developers and users on the list, and 
>>I don't think many of the additions are mine; I just wanted to 
>>point that out so people don't start sending me fan mail.
>>The libvo2 core is _NOT_ a concern for device developers; it is 
>>the glue between a device and the hard stuff (like pixel conversion, 
>>scaling, etc.).
>>I don't think the draft lists all the features, since I had time to 
>>implement some more before Arpi shit in his pants and closed off my cvs 
>>access.
>>
>don't be so upset... *you* messed up cvs. it's a fact. i suggested (and forced)
>you to send patches to the list instead of committing to cvs without checking,
>until you have read and understood cvs-howto.txt.
>
and that is exactly what I'm going to do

>
>
>>I'll post some headers later on for commenting, but they are a bit far 
>>from being in "feedback" shape right now. Comments, people...
>>
>ok
>
>>libvo2 specification draft
>>
>>        Device Capabilities
>>   
>>    * Direct rendering
>>    * (Better) hardware decoder support, including giving hw decoders 
>>access to the pts
>>    * Hardware scaling, flipping, cropping, etc.
>>    * A control() interface mimicking ioctl(), making it easy
>>      for developers to add functionality to the device interface
>>      (my personal opinion is to use this interface for as many
>>       functions as possible)
>>    * Simpler device development by letting the libvo2 core handle
>>      most of the general work (like pixel conversion, scaling, etc.)
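The control() bullet above maps naturally onto a single dispatch function. A rough C sketch, with invented request names and a toy YV12-only device (this is not the real libvo2 API):

```c
/* Hypothetical request codes -- illustrative only, not real libvo2 names. */
enum vo2_request {
    VOCTRL_QUERY_FORMAT,   /* arg: unsigned int fourcc; returns 1 if supported */
    VOCTRL_FULLSCREEN      /* arg: unused */
};

/* ioctl()-style entry point: one function taking a request code and a
 * request-specific argument.  A driver can grow new features without
 * the function table ever changing. */
static int vo2_control(enum vo2_request request, void *arg)
{
    switch (request) {
    case VOCTRL_QUERY_FORMAT: {
        unsigned int fourcc = *(unsigned int *)arg;
        return fourcc == 0x32315659;  /* toy device: the 'YV12' fourcc only */
    }
    case VOCTRL_FULLSCREEN:
        return 0;                     /* accepted; nothing to do in this sketch */
    }
    return -1;                        /* unknown request: not implemented */
}
```

Unsupported requests return -1, so the core can fall back to doing the work itself.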
>>
>>        Core Capabilities
>>
>>    * Checking for the availability of a vo_* device (requires a
>>      prototype implementation in all vo_ devices)
>>    * Finding the fastest way of scaling, processing and format conversion
>>      by scanning the available vo_ devices and their capabilities
>>
>it's partially the job of the mplayer core, because it needs to interface with
>video codecs for colorspace and buffering (and should be mixed with codec selection)
>
I'll find the best way to do this (I have some ideas); otherwise I'm sure 
someone here can come up with an even better solution. We'll see when it 
comes to that...
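The selection idea in the spec (use vo_Y's hardware scaler even though vo_X does the displaying) boils down to a capability scan over the registered drivers. A minimal sketch with invented flag and struct names, not the real libvo2 structures:

```c
#include <stddef.h>

/* Illustrative capability flags -- invented for this sketch. */
#define VO2_CAP_OUTPUT  0x01u  /* can actually display frames */
#define VO2_CAP_HWSCALE 0x02u  /* can scale in hardware */

struct vo2_driver {
    const char  *name;
    unsigned int caps;
};

static const struct vo2_driver drivers[] = {
    { "vo_X", VO2_CAP_OUTPUT },                   /* displays, no scaler */
    { "vo_Y", VO2_CAP_OUTPUT | VO2_CAP_HWSCALE }, /* also has a hw scaler */
};

/* Return the first registered driver offering every capability in
 * `caps`, or NULL if none qualifies. */
static const struct vo2_driver *vo2_find_caps(unsigned int caps)
{
    for (size_t i = 0; i < sizeof(drivers) / sizeof(drivers[0]); i++)
        if ((drivers[i].caps & caps) == caps)
            return &drivers[i];
    return NULL;
}
```

The core could then route frames through the driver returned for VO2_CAP_HWSCALE while still displaying via the user's chosen device.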

>
>
>>      during startup. The user might want to output video using device
>>      vo_X, but device vo_Y is capable of hardware scaling and vo_X
>>      isn't; in that case the vo2 core will try to utilize vo_Y's scaling
>>      capabilities while still rendering graphics through vo_X.
>>      Say, for instance, you have a device in your computer containing a
>>      programmable DSP (Digital Signal Processor), but this device has
>>      nothing at all to do with video. Assuming someone were
>>      interested in writing microcode for the DSP to, say, do hardware
>>      pixel conversion, this device would be a very useful vo_ device
>>      although it's completely incapable of video output.
>>
>ehh
>
just forget about this part for now, I'll rewrite it; actually, I can 
hardly understand it myself

>
>
>>    * Converting codec output to the vo_ device's desired input
>>      format, and if possible directly rendering it to the device's
>>      buffer.
>>
>direct rendering means direct rendering :)
>so the codec renders to the driver's buffer; it skips the libvo2 core.
>this is why i designed the libvo2 driver api to provide surfaces, and to leave
>the rendering, double-buffer handling etc. to libvo2 or the codecs.
>
But I still want libvo2 to glue in a format converter if needed, even 
in direct rendering mode. This isn't a problem, is it (aesthetically)?
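The glue being discussed can be sketched in a few lines: if the codec's output format matches what the device accepts, the frame goes straight through; otherwise the core interposes a converter. The formats and the "conversion" here are toy stand-ins, not real pixel-format code:

```c
#include <stdint.h>
#include <string.h>

enum pixfmt { FMT_YV12, FMT_YUY2 };

struct frame {
    enum pixfmt fmt;
    uint8_t     data[16];
};

/* Toy converter: a real one would repack pixels; here we just copy
 * the payload and retag the format. */
static void convert(const struct frame *in, struct frame *out)
{
    memcpy(out->data, in->data, sizeof(out->data));
    out->fmt = FMT_YUY2;
}

/* Returns 1 when the codec output could be used directly, 0 when the
 * core had to glue in a conversion step. */
static int vo2_draw(const struct frame *codec_out, struct frame *dev,
                    enum pixfmt dev_fmt)
{
    if (codec_out->fmt == dev_fmt) {
        *dev = *codec_out;       /* direct path: no extra work */
        return 1;
    }
    convert(codec_out, dev);     /* glue path: convert into the device buffer */
    return 0;
}
```

In true direct rendering the codec would write into the device surface itself; this sketch only shows where the conversion hook would sit.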

>
>
>>Additional notes
>>
>>Well, I'm not that good at writing development drafts, but I hope you 
>>understand what we are moving towards. I will supply some libvo2 headers 
>>for you to get a deeper understanding of what is to come.
>>I need a lot of feedback: good, bad, whatever.
>>
>>Someone on the mplayer-users list suggested a benchmark option which 
>>would try different combos for a specified -vo device and write a 
>>translation table (I'm suggesting something like 
>>~/.mplayer/vo_<device>.translation or something like that).
>>This is a great idea, especially if I can get you to see the point in 
>>providing vo_ devices which have nothing to do with video output but 
>>can still be utilized for doing calculations in hardware!
>>
>what device are you thinking of? i can't imagine one.
>
to make it simple, and to use what I had in mind when I wrote it: 
when using the dxr3 you could use some of the very powerful features 
of your 3d accelerator board (e.g. nvidia)... I'm not into how they work 
yet, but if you put stuff in agp memory (system ram) rather than video 
memory, you'd be able to manipulate it very fast and then send it to the 
output device (the hardware decoder in this case)...
This should be possible, shouldn't it? And I see it as a great 
feature... also, the SB Live has a programmable dsp, and there is supposed 
to be an sdk somewhere, so it could also be used for such things 
(although using it for hw mp3, ogg or whatever playback would perhaps be 
smarter in that case...)

//David Holm

>
>
>
>A'rpi / Astral & ESP-team
>
>--
>mailto:arpi at thot.banki.hu
>http://esp-team.scene.hu
>_______________________________________________
>MPlayer-dev-eng mailing list
>MPlayer-dev-eng at mplayerhq.hu
>http://mplayerhq.hu/mailman/listinfo/mplayer-dev-eng
>

