[Ffmpeg-devel] Macromedia Flash 8

Mike Melanson
Tue Sep 20 03:57:13 CEST 2005

Colin Ward wrote:
>    Ok, thanks.  You can never trust version numbers these days, 
> especially if the marketing men have had anything to do with them.  WMV9 
> could be version 9 of WMV, if engineers set the version number, or it 
> could be only version 2 or 3, if marketing men set the version number!

You have that backwards. If it were up to the tech people, version
numbers would be lower and the codec would be WMV3. It's the marketing
folks who artificially inflate version numbers to make software appear
more advanced. In fact, Microsoft video codecs progressed as follows:

Microsoft Video-1
Microsoft MPEG-4 v1
Microsoft MPEG-4 v2
Microsoft MPEG-4 v3
Microsoft WMV7
Microsoft WMV8 (WMV2)
Microsoft WMV9 (WMV3)
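For reference, that lineage can be sketched as a lookup keyed by the FourCC tags these codecs commonly carry in AVI/ASF files. The tags below are from memory and worth double-checking against ffmpeg's codec-tag tables before relying on them:

```python
# Hypothetical lookup of the Microsoft codec lineage by FourCC.
# Tags are the commonly seen ones (CRAM for Video-1, MPG4/MP42/MP43 for
# the MS MPEG-4 variants, WMV1/WMV2/WMV3 for WMV7/8/9) -- verify against
# ffmpeg's tag tables, since several codecs accept alternate tags too.
CODEC_LINEAGE = {
    "CRAM": "Microsoft Video-1",
    "MPG4": "Microsoft MPEG-4 v1",
    "MP42": "Microsoft MPEG-4 v2",
    "MP43": "Microsoft MPEG-4 v3",
    "WMV1": "Microsoft WMV7",
    "WMV2": "Microsoft WMV8",
    "WMV3": "Microsoft WMV9",
}

def describe(fourcc: str) -> str:
    """Map a FourCC (case-insensitive) to its codec name, if known."""
    return CODEC_LINEAGE.get(fourcc.upper(), "unknown tag")

print(describe("wmv3"))  # Microsoft WMV9
```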

>    I noticed that MPlayer can play some WMV9 videos but crashes with 
> others;  is this because support is only partially complete thus far?

If you are on an x86 platform and have the binary codecs installed, you
are likely using the binary Win32 decoder to handle the data. I do not
know if the MPlayer team has adapted the SMPTE VC-1 decoder yet and if
they did, I doubt they would distribute it with the regular source code.
But I'm sure an MPlayer team member will let me know the situation soon.

In any case, check that the crashing video works with WM 9 under
Windows. If it does, maybe the MPlayer people would like to examine the
errant video file.

>    WMV9 *is* pretty hot stuff too.  I went to a telco about a year ago 
> who were experimenting with broadband HD video and were playing WMV9 
> files on a huge wall mounted plasma screen, and it was *very* 
> impressive.  However, what was disappointing (and this applies to H.264 
> as well) was that the codec seemed to use the brute force approach (in 
> terms of CPU and bandwidth) to get its quality.  What would have been 
> *really* impressive if WMV9 and H.264 got their amazingly high quality 
> while still having only the bandwidth requirements of previous 

Soooo... you watched a video on a big TV and you could instinctively
determine the video data rate and amount of CPU required to decode it?
Impressive, but I would be inclined not to believe you. :) To
generalize, the next generation of codecs is spec'd to deliver
higher-resolution video at data rates similar to those in use today.
The trade-off is that, yes, more CPU muscle will be required for
decoding.

> generation codecs.  I mean, the Serenity H.264 trailer is heaps nicer 
> than the Serenity Quicktime trailer, but it was several times the size, 
> which takes away the "wow" factor a bit.

Trade-offs, my friend. As you observed, the video was nicer (i.e.,
bigger and more detailed) and the trade-off is that the data size was a
bit larger. Let's run some numbers here. I am working off the Serenity
trailers listed here:


The "full-screen" trailer (what a misnomer) is actually 640x272 pixels.
It is ~40 MB. The 1080p trailer is 129 MB in a zip file. I have
downloaded and unpacked it once, and the unpacked size is not much
larger. Let's call it ~130 MB. Assume that the files are all video data
(which they are not, there is also audio and file overhead) and that
they have the same framerate (not sure if this is true). Assuming that
1080p means 1920x1080 pixels, this is the amount of raw data that the
codec is processing at the different resolutions:

 640 x  272 =  174080 pixels/frame * 3/2 bytes/pixel =  261120 bytes/frame
1920 x 1080 = 2073600 pixels/frame * 3/2 bytes/pixel = 3110400 bytes/frame

The codec is pushing through almost 12 times as much raw data. Yet the
file size is only about 3.3 times as large. Not bad.
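The arithmetic above can be checked with a few lines of code. This is just a back-of-the-envelope sketch, assuming planar YUV 4:2:0 (hence the 3/2 bytes per pixel) and the ~40 MB / ~130 MB file sizes quoted earlier:

```python
# Raw-data vs. file-size comparison for the two Serenity trailers.
# Assumes YUV 4:2:0, i.e. 3/2 bytes per pixel of raw video.

def raw_bytes_per_frame(width: int, height: int, bytes_per_pixel: float = 1.5) -> int:
    """Uncompressed frame size for planar YUV 4:2:0 video."""
    return int(width * height * bytes_per_pixel)

sd = raw_bytes_per_frame(640, 272)    # the "full-screen" trailer
hd = raw_bytes_per_frame(1920, 1080)  # the 1080p trailer

print(sd)             # 261120 bytes/frame
print(hd)             # 3110400 bytes/frame
print(hd / sd)        # ~11.9x as much raw data per frame
print(130 / 40)       # ~3.25x growth in file size
```

So the codec chews through nearly 12 times the raw data while the compressed output grows only about 3.3 times, which is the point of the comparison.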

	-Mike Melanson 
