DV Info Net > Digital Video Industry News

New dawn of video editing (provided everyone's aboard)

Old May 30th, 2006, 10:29 PM   #1
Trustee
 
Join Date: Sep 2002
Location: Vulcan
Posts: 1,564
New dawn of video editing (provided everyone's aboard):

http://www.youtube.com/watch?v=1n-6o...&search=novell

^ Now all they have to do is record the results of the render back out to an SD-DVD or HD-DVD/Blu-ray format. That means real-time HD editing with no rendering wait becomes available as soon as they can write the post-rendered output back to disc in HD.
__________________
bow wow wow
Yi Fong Yu
Old May 31st, 2006, 03:25 PM   #2
Major Player
 
Join Date: Jun 2003
Location: Golden, CO
Posts: 681
Er... what does Novell's new v10 shell for their SUSE Enterprise Desktop have to do with editing HD video in real time?
__________________
- Jeff Kilgroe
- Applied Visual Technologies | DarkScience
- www.darkscience.com
Jeff Kilgroe
Old June 2nd, 2006, 09:30 AM   #3
Trustee
 
Join Date: Sep 2002
Location: Vulcan
Posts: 1,564
Everything =).

You know how we all have to render the final product out to HD-DVD/BR and so on? With the desktop's computations run entirely by the GPU, the FX/edits applied during editing can be achieved in real time (via pixel shaders). The only hurdle (that's why I said "provided everyone's aboard") is writing that final rendered footage BACK from the PCIe card to the hard drive itself. We already know GPUs have enough juice to help encode MPEG-2 through MPEG-4.
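
A rough sanity check of the numbers involved (a minimal sketch; the frame size, frame rate, PCIe figure, and MPEG-2 bitrate below are my own assumptions, not anything from the Novell demo or this thread):

Code:
# Back-of-the-envelope check: can rendered frames be read back from the GPU
# over PCIe fast enough for real-time output? All figures are assumed, circa 2006.

FRAME_W, FRAME_H = 1920, 1080      # assumed 1080p frame
BYTES_PER_PIXEL = 4                # RGBA, 8 bits per channel
FPS = 30                           # assumed frame rate

frame_bytes = FRAME_W * FRAME_H * BYTES_PER_PIXEL
readback_rate = frame_bytes * FPS          # bytes/second leaving the GPU
pcie_x16_budget = 4e9                      # roughly 4 GB/s for a PCIe 1.x x16 slot

print(f"Uncompressed 1080p readback: {readback_rate / 1e6:.0f} MB/s")
print(f"Share of the assumed PCIe x16 budget: {100 * readback_rate / pcie_x16_budget:.1f}%")

# The encoded output is far smaller again (~25 Mbit/s for HD MPEG-2), so the
# bus is not the bottleneck; the open question is whether the GPU can encode
# in real time and whether the software path exists to stream it to disk.
MPEG2_HD_BITRATE = 25e6
print(f"Encoded HD MPEG-2 stream: {MPEG2_HD_BITRATE / 8 / 1e6:.1f} MB/s")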
__________________
bow wow wow
Yi Fong Yu
Old June 2nd, 2006, 11:07 AM   #4
Wrangler
 
Join Date: Oct 2002
Location: Los Angeles, CA
Posts: 2,100
Apple's been processing video on the GPU for a couple years now...they've built a few frameworks for it that Motion and FCP use.
__________________
My Work: nateweaver.net
Nate Weaver
Old June 2nd, 2006, 11:19 AM   #5
Trustee
 
Join Date: Mar 2004
Location: Milwaukee, WI
Posts: 1,719
Avid Liquid has also been using the GPU for three years now. This is why it can do so much with no special hardware.
Thomas Smet
Old June 2nd, 2006, 02:13 PM   #6
Trustee
 
Join Date: Sep 2002
Location: Vulcan
Posts: 1,564
Not 100%. They backed down from full-blown 3D acceleration because of the cost of converting so many binaries from bitmap to vector. There have been a few articles on it; Google around.

Quote:
Originally Posted by Nate Weaver
Apple's been processing video on the GPU for a couple years now...they've built a few frameworks for it that Motion and FCP use.
Dunno about Avid. Are they still around? ;)
__________________
bow wow wow
Yi Fong Yu
Old June 6th, 2006, 11:25 AM   #7
Major Player
 
Join Date: Jun 2003
Location: Golden, CO
Posts: 681
GPU acceleration has been around in one form or another for years and has been used for video acceleration, encoding, decoding, 3D rendering, etc.

I'm still puzzled as to the point of the original post and the reference to Novell's v10. It doesn't have the application support or industry backing to be a major player or even a purposeful solution for mainstream video editing. GPU support can be had in any OS out now. Both the next revision of Windows (Vista) and OSX will have more visual OS goodies than what was shown in that Novell demo (it actually looked like a half-assed Vista rip-off).

I smell a troll....

However, I will agree on one point: for it to work, everyone has to be on board. It's up to the software/hardware vendors to jump on board together and make this happen. I think what has kept full GPU acceleration for video editing/rendering from taking place on a mass scale is that the GPU market evolves so fast that it's difficult for game developers and the GPU makers' own driver development teams to keep up. 3D modeling/animation/CADD applications can't keep up, not even close. Why would video editing be able to do any better, or even as well?
__________________
- Jeff Kilgroe
- Applied Visual Technologies | DarkScience
- www.darkscience.com
Jeff Kilgroe
Old June 6th, 2006, 11:47 AM   #8
Trustee
 
Join Date: Sep 2002
Location: Vulcan
Posts: 1,564
How can it be a rip-off of Vista when it is available NOW? Yes, you can download it now and try to use it if you know *nix. Actually, it's been around a lot longer than that.

OS X has limited 3D support; it's not full-on 100% like the demo shows.

The key (IMHO) isn't just accelerating the process during editing. The key is writing the rendered goods BACK to the hard drive in real time. Think about it: you have 6 hours of HDV footage. You cut it down to 30 minutes. Then you send it off to render, the CPUs take over, and it takes you anywhere from 2 hours to 2 days (depending on how fast a machine you have). That's ridiculous in this day and age, when everyone is proclaiming how awesome the power of GPUs is. Why not hand the task to the GPU to ENCODE via hardware processing in REAL TIME, so we can output a "final cut" in MPEG-2 or MPEG-4 for either SD-DVD or HD-DVD/BR?

The key is having enough bandwidth to "write back" from the GPU card through PCIe, or, if the 4x4 concept from AMD actually ships, through HyperTransport back to the hard drive. That's assuming NVIDIA and ATI are committed to making a flip chip to plug into that design, either AM2/AM3 or Socket F. Anyway, the point is, if they don't iron that out, it's going to be hard to write that material back. But just imagine:

You have a locked final edit at 30 minutes. Regardless of whether you render to 2K, 4K, or 8K resolution, and regardless of the number of FX you applied, audio and visual, your "final render" takes merely 30 minutes of real time. In other words, your hard drive merely serves as a recording device for the output.
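
For what it's worth, here's a minimal sketch of what "the hard drive as a recording device" implies for sustained write speed; the runtime, bitrates, and resolutions are my own assumed figures for illustration, not anything measured:

Code:
# What sustained write rate would the drive need if a 30-minute locked edit
# were delivered in real time? All figures below are assumptions.

RUNTIME_S = 30 * 60  # 30-minute final edit

def uncompressed_bps(width, height, fps, bytes_per_px=4):
    """Bits per second for an uncompressed RGBA stream (assumed 8 bits/channel)."""
    return width * height * bytes_per_px * fps * 8

outputs = {
    "SD-DVD MPEG-2 (~8 Mbit/s)":            8e6,
    "HD MPEG-2 for HD-DVD/BD (~25 Mbit/s)": 25e6,
    "2K uncompressed (2048x1080, 24p)":     uncompressed_bps(2048, 1080, 24),
    "4K uncompressed (4096x2160, 24p)":     uncompressed_bps(4096, 2160, 24),
}

for name, bps in outputs.items():
    print(f"{name:40s} {bps / 8 / 1e6:7.1f} MB/s sustained, "
          f"{bps * RUNTIME_S / 8 / 1e9:7.1f} GB for 30 minutes")

# The compressed deliverables are trivial for a single drive; uncompressed
# 2K/4K is not, so the drive can only act as a simple "recorder" if the GPU
# also does the encoding before the write-back.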
__________________
bow wow wow
Yi Fong Yu
Old June 8th, 2006, 12:00 PM   #10
Major Player
 
Join Date: Jan 2004
Location: Seattle
Posts: 243
Quote:
OS X has limited 3D support, it's not full-on 100% like the demo shows.
Ummmmm, full 3D support would be handled by the GPU. The next OS X version, 10.5, should offer more functionality, but I agree with Kilgroe. This stuff has been shipping on production-level hardware for some time. Apple's Motion and Aperture heavily use the GPU for accelerating screen elements, and when OS X fully supports OpenGL 2.0 I expect things to improve even more.

Not trying to downplay the Novell advancement but calling it "a new Dawn" is incorrect. It's already being done.
Harrison Murchison
Old June 8th, 2006, 03:11 PM   #11
Wrangler
 
Join Date: Oct 2002
Location: Los Angeles, CA
Posts: 2,100
Even assuming Yi is right that no company is doing it on the level of Novell, the truth remains that it has zero relevance to the world of professional editing tools.

Sorry Yi. I know tech advancements are exciting, but your news isn't very connected to our world.
__________________
My Work: nateweaver.net
Nate Weaver
Old June 8th, 2006, 10:22 PM   #12
Major Player
 
Join Date: Jun 2003
Location: Golden, CO
Posts: 681
The Novell desktop in its standard form is very much copied (look and style) from Windows. Much of the animation and effects (jiggly windows, translucency, etc.) has been available in various skin/effects packages for OSX, Windows, and other Linux shells for some time. Everything shown in Novell's demo videos is available under Vista (and in most cases under XP and OSX with the proper 3rd-party add-ons). As for the "3D multiple desktops" where the user accesses other desktops by rotating a cube, nothing new there either. SGI had this very same feature as far back as IRIX 9.3 (like 8 years ago), and multiple desktops are nothing new and are supported with the proper software under Windows and OSX. While I haven't specifically seen the rotating-cube effect under Windows or OSX, it would be fairly easy to implement if someone hasn't already. I've seen desktops that shuffle like notecards and turn like pages of a book... There really isn't one original idea in that Novell desktop environment.

The only thing going on there that's pretty cool is the level of GPU integration in that desktop environment. But it's fairly new and has a head start vs. the upcoming revisions of OSX and Windows, which in current beta form already do everything we saw in those videos and then some.

All that aside, it has little to do with video editing and encoding. Having the GPU accelerate window draws and video overlays within a desktop environment is one thing, and it has nothing to do directly with the applications themselves actually using the GPU to aid in their own tasks. As others have pointed out, some applications already do this to some degree, although nowhere near as much as we would all probably like. And I don't see this changing a whole lot when OSX 10.5 or Windows Vista arrives. There are just too many GPU varieties on the market and no real standards to start offloading massive amounts of pro-level work to another piece of hardware, especially when trying to create something that depends on consistency of calculations across differing CPUs or platforms, like MPEG interpolation/encoding or random number seeding and generation.

Before all this happens, there needs to be an extensive and standard language with base functionality for GPU access. Something like nVidia's failed C for Graphics API, but on a much larger and more industry-serving scale.

But as it stands right now, when it comes to huge tasks involving rendering of video or 3D animation, adding more CPUs is cheaper and will provide more power in the long run than trying to rely on specialized GPU power. Not to mention, many functions (like the random number generator) are typically implemented in software rather than using the CPU's built-in method. Why? Because PowerPC CPUs have a different RNG than Intel CPUs, which is different again from the old DEC Alpha approach, and so on. When a network of 20+ render nodes is crunching away and needs to use a specific seed value to generate imagery for a procedural texture map, they all must call upon the consistent, software-based RNG. NewTek forgot to implement software RNGs in the PC and Alpha versions of Lightwave3D a few years back, and mixed PC/Alpha networks would show jitters and shimmers in procedurals due to mismatched RNG values between frames generated on the two different platforms. Without a standardized approach and/or API in place that the industry can work with, how would such a workflow even be possible when one system has a GeForce 7800GTS, the one next to it has a 6800U, and the one down the hall has a Radeon X1800?
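
To make the RNG point concrete, here's a minimal sketch of the kind of deterministic, software-only generator a renderer might ship so that every node produces identical procedural values for the same seed, whatever its CPU or GPU (the constants are the common Numerical Recipes LCG parameters; the rest is illustrative, not anyone's actual shipping code):

Code:
# A deterministic 32-bit linear congruential generator implemented purely in
# software. Because it never touches a platform RNG, any node seeded the same
# way produces the same values, frame after frame.

class SoftwareRNG:
    A, C, M = 1664525, 1013904223, 2**32   # Numerical Recipes LCG constants

    def __init__(self, seed):
        self.state = seed % self.M

    def next_float(self):
        """Return a reproducible float in [0, 1)."""
        self.state = (self.A * self.state + self.C) % self.M
        return self.state / self.M

# Two "render nodes" seeded with the same per-frame value agree exactly,
# regardless of the CPU or GPU they happen to run on.
frame_seed = 1042
node_a, node_b = SoftwareRNG(frame_seed), SoftwareRNG(frame_seed)
assert [node_a.next_float() for _ in range(5)] == [node_b.next_float() for _ in range(5)]
print("Both nodes agree on the first five procedural samples.")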

Even in a non-network situation, just look at a project being encoded for DVD. What if I edit and render a significant amount of footage to MPEG2 on my system with a 7800GTX GPU, and I have someone else do other sequences (like B&W "flashback" scenes or something) that get encoded on a similar system, but with a RADEON X1600 doing a ton of the processing? If each GPU influences how the MPEG2 encoder assembles its I-frame target data, and each has a different internal approach, there could be distinctly different noise patterns between the two sets of clips from these systems. One could have softer edges, or one could have more crushed detail in the blacks or whites, etc. Such potential concerns would keep serious users from taking a project home from the office to work on with a different system.
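
As a worked toy example of how encoders can diverge (the coefficient values and quantizer step are made up purely to show the rounding effect; this is not any real encoder's arithmetic):

Code:
# Two pipelines compute "the same" transform coefficient with slightly
# different precision. Near a quantization boundary, the tiny difference
# flips the quantized level, so the reconstructed pictures differ.

QSTEP = 0.5                      # assumed quantizer step

coeff_pipeline_a = 12.24999      # hypothetical value from one GPU path
coeff_pipeline_b = 12.25001      # same coefficient after tiny precision drift

def quantize(coeff, qstep):
    """Uniform quantization with round-half-up (illustrative only)."""
    return int(coeff / qstep + 0.5)

level_a = quantize(coeff_pipeline_a, QSTEP)   # -> 24
level_b = quantize(coeff_pipeline_b, QSTEP)   # -> 25

print(f"Pipeline A: level {level_a}, reconstructs to {level_a * QSTEP}")
print(f"Pipeline B: level {level_b}, reconstructs to {level_b * QSTEP}")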

Sorry for the long-winded reply, but what application developers need to do in the real world is a far cry from what Novell has going in their new Linux shell. A better argument would be this: game developers continuously exploit the latest abilities of these GPUs and squeeze a lot of power out of them, so why couldn't that talent be harnessed to create a video editing and encoding application or platform?
__________________
- Jeff Kilgroe
- Applied Visual Technologies | DarkScience
- www.darkscience.com
Jeff Kilgroe
Old June 9th, 2006, 03:58 AM   #13
Trustee
 
Join Date: Feb 2004
Location: Suwanee, GA
Posts: 1,241
It can be a Vista ripoff because I saw "that" demo at WinHEC 2003 done with Longhorn (now Vista).

BTW, to do that, MS has rewritten the display engine (WinFX). Now the GPU is treated much the same way the CPU is, with a scheduler, a virtual memory manager, and a crash recovery engine. They have completely changed how the engine generates fonts and objects (Vista can scale without aliasing because it regenerates the fonts using the GPU - hence the heavy graphics requirements for Aero Glass). I have seen it running on a 240dpi monitor and the text looks like print (finally!). It will also run wide gamut; they have demoed that on an experimental Japanese monitor that had blacks as close to 0,0,0 as I have seen.
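
A quick illustration of why resolution-independent text wants glyphs regenerated per DPI rather than scaled bitmaps (a sketch with assumed point sizes and DPI values, nothing from the WinHEC demo):

Code:
# Pixels needed to draw the same nominal font size at different display DPIs.
# A glyph rasterized once at 96 dpi would have to be stretched 2.5x for a
# 240 dpi panel (hence aliasing); regenerating the outline per DPI avoids it.

POINTS_PER_INCH = 72

def glyph_pixels(point_size, dpi):
    """Pixel height of a glyph box for a given point size and display DPI."""
    return point_size / POINTS_PER_INCH * dpi

for dpi in (96, 144, 240):
    px = glyph_pixels(12, dpi)   # assumed 12 pt UI text
    print(f"12 pt text at {dpi:3d} dpi -> {px:5.1f} px tall "
          f"({dpi / 96:.2f}x the 96 dpi raster)")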

But cool, glad Novell has that. Linux needs eye candy too!
George Ellis
Old June 11th, 2006, 05:39 PM   #14
Trustee
 
Join Date: Sep 2002
Location: Vulcan
Posts: 1,564
I believe there is a simple method to resolve that issue.

Since there have been big advances in pixel shading and applying FX in real time on top of a playing video stream, why not redirect that visual output back to the hard drive, almost like pressing record back in the VHS days? Then there's our "final render," anywhere from 720p through 1080p.
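
As a minimal sketch of the "press record" idea (the GPU side is stood in for by a dummy frame generator, since the real readback call depends entirely on the API and hardware; the output file name, resolution, and frame count are made up):

Code:
# Treat the drive as a dumb recorder: whatever the GPU hands back each frame
# is appended to a file in real time. The readback is faked so the sketch
# runs anywhere; a real pipeline would substitute an actual GPU readback or,
# better, an already-encoded bitstream.

import time

FRAME_W, FRAME_H, FPS = 1280, 720, 25   # assumed 720p output
FRAMES_TO_RECORD = 50                   # short test run

def fake_gpu_readback(frame_index):
    """Stand-in for pulling a rendered RGB frame off the GPU (hypothetical)."""
    return bytes([frame_index % 256]) * (FRAME_W * FRAME_H * 3)

start = time.time()
written = 0
with open("final_render.raw", "wb") as out:   # made-up output path
    for i in range(FRAMES_TO_RECORD):
        frame = fake_gpu_readback(i)
        out.write(frame)                      # the "record" step
        written += len(frame)

elapsed = max(time.time() - start, 1e-6)
needed = FRAME_W * FRAME_H * 3 * FPS / 1e6
print(f"Wrote {written / 1e6:.0f} MB in {elapsed:.2f} s "
      f"({written / elapsed / 1e6:.0f} MB/s achieved; real-time 720p/25 "
      f"needs about {needed:.0f} MB/s uncompressed)")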
__________________
bow wow wow
Yi Fong Yu