January 3rd, 2011, 01:42 PM | #16 |
Inner Circle
Join Date: Jun 2003
Location: San Jose, CA
Posts: 2,222
What's a regular desktop computer? Intel has a six-core 3.33 GHz chip, the Core i7-980X. The Sony Cell processor has eight cores and has been shipping in the Sony PlayStation 3 since 2006. nVidia is shipping 512 cores in the latest generation of its Fermi architecture. The biggest problem is that software is generally unable to use more than one core. Some of this is due to problems that cannot be broken into independent pieces because of a high degree of data and timing dependency.
Microsoft seems to have little interest in software performance and instead sells software by feature count. If Microsoft could figure out how to prevent a six-core Intel Windows 7 system from locking up for a few seconds whenever the DVD drive is accessed to identify a disc, I would have greater faith in the industry.
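As a rough illustration of that data-dependency point (a made-up sketch, not anything from this thread): the first loop below has independent iterations and can be spread across as many cores as you have, while the second cannot, because every step needs the result of the one before it.

[code]
from multiprocessing import Pool

def brighten(frame):
    # Stand-in for per-frame work with no shared state.
    return [min(255, p + 10) for p in frame]

if __name__ == "__main__":
    frames = [[0, 64, 128, 192]] * 8

    # Embarrassingly parallel: each frame is independent, so extra cores help.
    with Pool() as pool:
        adjusted = pool.map(brighten, frames)

    # Inherently serial: a running blend where step n needs step n-1.
    # No number of cores makes this loop faster.
    smoothed = [frames[0]]
    for frame in frames[1:]:
        prev = smoothed[-1]
        smoothed.append([(a + b) // 2 for a, b in zip(prev, frame)])

    print(len(adjusted), len(smoothed))
[/code]

Video encoding, compositing and the like usually fall somewhere between the two extremes, which is why extra cores help but rarely scale perfectly.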
January 3rd, 2011, 02:41 PM | #17 |
Trustee
Join Date: Feb 2005
Location: Worldwide
Posts: 1,589
Quote:
There is certainly no smoke without fire; and a movie shot in the highest possible resolution still needs to be filmed well and hold your interest to be worth watching... :)
January 3rd, 2011, 02:48 PM | #18 |
Major Player
Join Date: Nov 2005
Location: Tartu, Estonia
Posts: 579
... and still, somehow, the print industry never adopted the 1200 ppi resolution that was technically available 30 years ago. And we can still admire the 300 ppi high-gloss magazines and catalogs... It comes down to the presentation size and method; and while thinking about the issue, what are the current methods of presentation, and what is the "commercial" or commonly accepted resolution for the human retina?
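A quick back-of-the-envelope on that last question, assuming the commonly cited figure of about one arcminute of visual acuity and a 12-inch reading distance (both assumed numbers, not anything stated in this thread):

[code]
import math

# Assumed figures: ~1 arcminute of visual acuity, 12-inch reading distance.
viewing_distance_in = 12.0
acuity_rad = math.radians(1.0 / 60.0)               # one arcminute in radians
smallest_detail_in = viewing_distance_in * math.tan(acuity_rad)
print(round(1.0 / smallest_detail_in))              # ~286 ppi
[/code]

That lands right around the ~300 ppi that glossy print settled on, and the required ppi drops in proportion as the viewing distance grows.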
T
January 4th, 2011, 08:39 AM | #19 |
Major Player
Join Date: Feb 2007
Location: London UK
Posts: 430
I wonder if the movies will be any better. Certainly hasn't helped much thus far.
As Alfred Hitchcock said: "There are three things that make a great movie: a great script, a great script and a great script."
I don't suppose any new computer chip is going to make people write better, though there is always the danger - and plenty of evidence - of people using computers to make up for second-rate writing. BTW, my current pet hate is the obsession with colour grading. I frequently find myself cringing when I see what some dumb colourist has done to "improve" a film.
January 4th, 2011, 12:16 PM | #20 |
Major Player
Join Date: Apr 2010
Location: Reading, PA USA and Athens, Greece
Posts: 269
Agreed; most online HDSLR stuff is a bit trite, and trying. A broken suspension of disbelief, shoddy effects, and poor acting cannot be overcome by 4K 4:4:4 recording.
This means you too, Hollywood.
January 4th, 2011, 09:13 PM | #21 |
Major Player
Join Date: Mar 2005
Location: Neenah, WI
Posts: 547
I could not possibly agree more...
__________________
TimK Kolb Productions
January 4th, 2011, 11:48 PM | #22 |
Major Player
Join Date: Sep 2008
Location: Las Vegas, NV
Posts: 628
Absolutely. It feels like not too long ago I was asking when consumer HD cameras would start to be available, with laughter being the response. Today I have and love my EX3. Add to that all the sub-$1000 SFX/NLE software packages out there now, like After Effects, and the really big one for me: websites like YouTube and Vimeo, where you can actually have an audience for the movies you make. Man, it is so awesome!
__________________
EX3, Q6600 Quad core PC - Vista 64, Vegas 8.1 64bit, SR11 b-cam
January 5th, 2011, 12:12 AM | #23 |
Inner Circle
Join Date: May 2006
Location: Camas, WA, USA
Posts: 5,513
Personally, I think HDR could rock for a superhero film. It tends to have a comic book look, which could be perfect for the genre.
Of course, it won't rescue a bad script. But if the script calls for a comic book world, HDR could really help pull it off.
__________________
Jon Fairhurst |