March 30th, 2008, 05:58 AM | #31
Inner Circle
Join Date: Jan 2006
Posts: 2,699
Quote:
They decided there was merit in the final number always corresponding to frames, rather than frames for progressive systems but fields for interlaced ones. Hence the two numbers simply define the amount of information without reference to the scanning system. I think the forward slash is supposed to denote that the new system is being referred to - hence the old nomenclature is 1080i60, the new standard is 1080i/30, while 1080i/60 implies a 1080-line system with 60 frames, i.e. 120 fields/sec. So no, I could have put 1080i60, but I didn't mean 1080i/60. :-)
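To make the distinction concrete, here is a minimal sketch (my own interpretation of the convention described above, not an official parser): in the old style "1080i60" the trailing number is fields per second, while in the new slash style "1080i/30" the number after the slash is always frames per second.

```python
# Sketch of the two nomenclatures described above (assumed interpretation):
# old style "1080i60"  -> number is fields/s for interlaced
# new style "1080i/30" -> number is always frames/s

def rates(label: str):
    """Return (frames_per_second, fields_per_second) for a format label."""
    if "/" in label:                      # new style: number = frames/s
        lines_scan, rate = label.split("/")
        frames = float(rate)
    else:                                 # old style: number = fields/s if interlaced
        idx = next(i for i, c in enumerate(label) if c in "ip") + 1
        lines_scan, rate = label[:idx], label[idx:]
        frames = float(rate) / 2 if "i" in lines_scan else float(rate)
    interlaced = "i" in lines_scan
    fields = frames * 2 if interlaced else frames
    return frames, fields

print(rates("1080i60"))    # old style  -> (30.0, 60.0)
print(rates("1080i/30"))   # new style  -> (30.0, 60.0)  (same signal)
print(rates("1080i/60"))   # new style  -> (60.0, 120.0) (a different, faster system)
```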
March 30th, 2008, 06:39 AM | #32
Regular Crew
Join Date: Dec 2007
Location: Cuenca (Spain)
Posts: 92
Quote:
In analog interfaces the signal must carry the vertical and horizontal sync signals as well as blanking intervals. These lay before and after each active part of each line (active part is the portion of the signal which effectively contains picture signal). But in digital interfaces, there is no need to digitize the sync and blanking portions of the signal. Their values are always known, and they are of no use while the video remains digital, so in order to save space, only the active samples are stored. If the video must be converted back to an analog interface, the sync and blanking will be generated and interleaved with the active video by the DAC. But it is impractical to transmit just these active samples, because this would complicate the design of DACs, so it is more convenient to 'leave a gap' between the EAV (end of active video) of each line and the SAV (start of active video) of the following line. The 'duration' (or number of bits) of the gap correspond, as you are guessing, to the duration of the sync and blanking of each line. And, of course, to leave something unused is against an engineer's religion, so they quickly developed methods to take profit of these gaps and fill them with ancillary data, that is, any kind of data that could fit in the remaining bandwidth, such as audio, captions, test signals, error correction, metadata, etc. So, of these 1.485 Gb/s, just 1.244 are used for video. That leaves enough bandwidth to accomodate 48 channels of uncompressed 24 bits, 192 KHz digital audio!
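As a quick sanity check of those numbers, here is a back-of-the-envelope calculation for 1080i/30 over HD-SDI. It uses the standard 2200 x 1125 total raster and 1920 x 1080 active raster figures, and is only a rough sketch: it ignores the few words reserved for SAV/EAV codes and other protocol overhead.

```python
# Rough bandwidth check for 1080i/30 over HD-SDI (sketch, ignores SAV/EAV overhead)

TOTAL_SAMPLES_PER_LINE = 2200     # active + horizontal blanking
TOTAL_LINES = 1125                # active + vertical blanking
ACTIVE_SAMPLES_PER_LINE = 1920
ACTIVE_LINES = 1080
FRAME_RATE = 30                   # frames/s (60 fields/s interlaced)
BITS_PER_SAMPLE = 10
STREAMS = 2                       # Y plus multiplexed Cb/Cr

total_bps  = TOTAL_SAMPLES_PER_LINE * TOTAL_LINES * FRAME_RATE * BITS_PER_SAMPLE * STREAMS
active_bps = ACTIVE_SAMPLES_PER_LINE * ACTIVE_LINES * FRAME_RATE * BITS_PER_SAMPLE * STREAMS
audio_bps  = 48 * 24 * 192_000    # 48 channels of 24-bit, 192 kHz audio

print(f"link rate:       {total_bps / 1e9:.3f} Gb/s")    # 1.485 Gb/s
print(f"active video:    {active_bps / 1e9:.3f} Gb/s")   # 1.244 Gb/s
print(f"gap (ancillary): {(total_bps - active_bps) / 1e6:.0f} Mb/s")  # ~241 Mb/s
print(f"48ch audio:      {audio_bps / 1e6:.0f} Mb/s")    # ~221 Mb/s, so it fits
```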
__________________
Wire telegraph is a kind of a very, very long cat. You pull his tail in New York and his head is meowing in Los Angeles. Radio operates exactly the same way, but there is no cat.
March 30th, 2008, 07:54 AM | #33
Regular Crew
Join Date: Sep 2007
Location: Bee Cave, Texas
Posts: 151
Excuse me, going off topic: another new device, this one using hard drives instead of flash, and JPEG2000. The following announcement explicitly states that the "Elite HD" supports the EX-1!
http://www.ffv.com/releases/080325.htm Apart from the info in that link, the device apparently won't be on view until NAB (as a prototype?). We all knew devices like these would be coming - as for myself, like one other responder to that link, I may just wait for Cineform SOLID.
March 30th, 2008, 09:16 AM | #34
Convergent Design
Join Date: Apr 2005
Location: Colorado Springs, CO
Posts: 869
Quote:
This can be very confusing. However, most people specify the number of fields when describing interlaced formats and the number of frames for progressive formats. The Sony EX1 brochure specifies 1080/59.94i (59.94 fields per second) and 1080/29.97p (29.97 frames per second). That's the "standard" we followed in the Flash XDR specification. But if in doubt, it's always best to ask.
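A tiny sketch of that convention (my own illustration, not anything official): interlaced labels carry the field rate and progressive labels carry the frame rate, as in the EX1 brochure's "1080/59.94i" and "1080/29.97p".

```python
# Sketch of the brochure-style convention described above (assumed, not official):
# "...i" labels give fields/s, "...p" labels give frames/s.

def frame_and_field_rate(label: str):
    """Return (frames_per_second, fields_per_second) for labels like '1080/59.94i'."""
    lines, rate = label.split("/")           # e.g. "1080", "59.94i"
    value, scan = float(rate[:-1]), rate[-1]
    if scan == "i":                           # number given is fields/s
        return value / 2, value
    return value, value                       # progressive: one field per frame

print(frame_and_field_rate("1080/59.94i"))   # (29.97, 59.94)
print(frame_and_field_rate("1080/29.97p"))   # (29.97, 29.97)
```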
__________________
Mike Schell
Convergent Design