View Full Version : Five years - Everything is Obsolete


David Rice
December 30th, 2010, 02:20 PM
London, Dec 29 (ANI): Scientists at the University of Glasgow have developed an ultra-fast computer chip that is 20 times faster than regular desktop computers.

Regular PCs have two, four or sometimes 16 cores, but the new central processing unit (CPU) developed by the researchers effectively has 1,000 cores on a single chip.

They claim the new technology will be in consumers' hands in 3-5 years.

A game changer. So much for today's top-of-the-line camcorders, computers, and editing software.

I'm about through chasing my tail.

Perrone Ford
December 30th, 2010, 02:27 PM
I wonder if the movies will be any better. Certainly hasn't helped much thus far.

Panagiotis Raris
December 30th, 2010, 03:17 PM
Would be nice, but nVidia has been working on CUDA supercomputers for years, promising supercomputing power for all, and it still hasn't happened.

Perrone Ford
December 30th, 2010, 03:23 PM
It's all relative.

I am doing things now on my fastest computer that we used to run on the Cray Y-MP years ago. So there you go.

David Parks
December 30th, 2010, 04:48 PM
It is all relative. It's all related to Intel. My favorite quote from Talking Heads..."Same as it ever was...Same as it ever was." Days go by!

Cheers

Nicholas de Kock
December 31st, 2010, 07:48 AM
3-5 years is a bit optimistic; more like 10 years. It's only been developed in a university lab so far.

However, he warned that the research was early proof-of-concept work but added that he hoped "to demonstrate a convenient way to program FPGAs so that their potential to provide very fast processing power could be used much more widely in future computing and electronics."

Scottish researchers claim 1,000-core processor - Techworld.com (http://news.techworld.com/personal-tech/3254758/scottish-researchers-claim-1000-core-processor/)

Jason Ryman
December 31st, 2010, 09:24 AM
As a photographer tired of paying the Canon tax every 18 months, I've realized that I don't need the latest, greatest whiz-bang camera in order to offer a quality product to my clients.

That being said, I cannot wait for the day when processing power is so fast that the post, rendering, etc., will be complete almost instantaneously. Hit the button and it's done.

Matt Buys
December 31st, 2010, 09:49 AM
Wow. Wouldn't mind investing in the company that owns the patent on that. A potential game changer.

David Rice
December 31st, 2010, 10:35 AM
Super HD (they already have it).

Terabyte video files and 300-500 terabyte hard drives.

But where are they going to get the bandwidth to move all that information?

Erik Phairas
December 31st, 2010, 08:13 PM
All I ever wanted to do is make monster movies like the ones I watched growing up. I don't need any of this new stuff to do that. 1080p XDCAM EX is plenty good enough for what I want to do. :)

John Kilderry
December 31st, 2010, 10:04 PM
You know, when it comes down to it, average people just don't see what we see on the widescreen. They don't notice that some local news stations are still using their SD equipment, because the picture fills their 16:9 screen. They don't notice when a new show comes on and a little group of LEDs on their cable box switches from 720p to 1080i. They just don't care as long as the picture is relatively sharp and the sound is good.

Granted, that's the masses, but it's worth keeping in mind when we are beating ourselves up for not having the consumer version of the CIA mainframe and a garage full of 4K cameras.

When you think about it, it's pretty cool what we can do right now and beyond my wildest dreams of even 10 years ago.

Just sayin'.

Brian Luce
January 1st, 2011, 11:13 AM
Will it really make a difference to people here? You can get i7s for $500 that are pretty fast. Footage always has to be previewed in real time. Stable and smart software is more useful to most editors, I'll argue. Animation is another story.

Kyle Root
January 1st, 2011, 01:16 PM
Yeah, I learned a while back that it doesn't pay to continually upgrade your video cameras, DSLRs, and computers. Unless you are really making six figures from it, or have a truly compelling reason, it's not worth it to upgrade gear every time something new is released.

You know, I just got CS5 about a month ago having upgraded from CS2.

My reasons for upgrading were:

(1) The computer I was using CS2 on I got in 2003, and it was starting to have problems: the DVD burner was spotty and the sound card died. So after about 7 years it was time for a computer upgrade.
(2) It wouldn't play nice with my D90 HD video files.
(3) Eventually I'm going to have either (or maybe both) an XF100 and an NX5U (or whatever they release next), and CS2 wouldn't play with either of those formats.
(4) I wanted to take advantage of the 64-bit system with up to 24GB of RAM and the Mercury Playback Engine (MPE).

Lawrence Bansbach
January 2nd, 2011, 12:22 PM
London, Dec 29 (ANI): Scientists at the University of Glasgow have developed an ultra-fast computer chip that is 20 times faster than regular desktop computers.

Regular PCs have two, four or sometimes 16 cores, but the new central processing unit (CPU) developed by the researchers effectively has 1,000 cores on a single chip.

They claim the new technology will be in consumers' hands in 3-5 years.

A game changer. So much for today's top-of-the-line camcorders, computers, and editing software.

I'm about through chasing my tail.
It seems that if the chip is only 20 times faster, then the cores are individually far less powerful than current ones, or the tech scales poorly.
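
A quick back-of-the-envelope check of that point (my own numbers; the article gives no baseline, so the quad-core desktop and the perfect scaling are assumptions):

desktop_cores = 4        # assumed baseline; the article doesn't say
chip_cores = 1000
overall_speedup = 20     # "20 times faster than regular desktop computers"

# With perfect, linear scaling across all 1,000 cores, each core would be
# doing only a small fraction of the work of one desktop core.
per_core_ratio = overall_speedup * desktop_cores / chip_cores
print(f"Each core ~{per_core_ratio:.0%} the speed of a desktop core")   # ~8%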

Panagiotis Raris
January 2nd, 2011, 01:00 PM
No different than CUDA processing/supercomputers. They rely on the volume of data that can be processed in parallel, but not all software can utilize two, three, or four cores, let alone 1,000.
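
For anyone wondering why more cores don't automatically mean more speed, here's a minimal Python sketch (my own illustration, not tied to any particular NLE): independent per-frame work splits cleanly across cores, but a calculation where each step needs the previous result does not.

from multiprocessing import Pool, cpu_count

def filter_frame(frame):
    # Stand-in for independent per-frame work (grading, sharpening, encoding a chunk).
    return [px * 0.9 for px in frame]

def running_average(frames):
    # Each step needs the previous result, so this stays on one core regardless.
    avg = 0.0
    for frame in frames:
        avg = 0.5 * avg + 0.5 * (sum(frame) / len(frame))
    return avg

if __name__ == "__main__":
    frames = [[i % 256] * 1000 for i in range(2000)]   # fake footage

    with Pool(cpu_count()) as pool:      # the per-frame pass scales with core count
        graded = pool.map(filter_frame, frames)

    exposure = running_average(frames)   # the sequential pass does not
    print(len(graded), round(exposure, 2))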

Gints Klimanis
January 3rd, 2011, 01:42 PM
What's a regular desktop computer? Intel has a six-core 3.33 GHz chip, the i7-980X. The Sony Cell processor has eight cores and has been shipping since 2006 in the Sony PlayStation 3. nVidia is shipping 512 cores in the latest generation of its Fermi architecture. The biggest problem is that software is generally unable to use more than one core. Some of this is due to problems that cannot be broken into pieces because of a high level of data and time dependency.
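
To put a rough number on that last point, here's a small Amdahl's law sketch (my illustration; the parallel fractions are made up): even a modest serial portion caps what 1,000 cores can buy you.

def amdahl_speedup(parallel_fraction, cores):
    # Amdahl's law: the serial fraction is never sped up, no matter the core count.
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

for p in (0.50, 0.90, 0.99):
    print(f"{int(p * 100)}% parallel: "
          f"8 cores -> {amdahl_speedup(p, 8):.1f}x, "
          f"1000 cores -> {amdahl_speedup(p, 1000):.1f}x")
# 50% parallel: 8 cores -> 1.8x, 1000 cores -> 2.0x
# 90% parallel: 8 cores -> 4.7x, 1000 cores -> 9.9x
# 99% parallel: 8 cores -> 7.5x, 1000 cores -> 91.0x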

Microsoft seems to have little interest in software performance and instead sells software by feature count. If Microsoft could figure out how to prevent a six-core Intel Windows 7 system from locking up for a few seconds whenever the DVD drive is accessed to identify a disk, I would have greater faith in the industry.

Tony Davies-Patrick
January 3rd, 2011, 02:41 PM
I wonder if the movies will be any better. Certainly hasn't helped much thus far.

Lol! I love that answer, Perrone!

There is certainly no smoke without fire; a movie filmed in the highest possible resolution also needs to be filmed well and hold your interest to make it worth watching... :)

Toenis Liivamaegi
January 3rd, 2011, 02:48 PM
... and still, somehow, the print industry didn't adopt the 1200 ppi resolution that was technically available 30 years ago, and we can still admire the 300 ppi high-gloss magazines and catalogs... It comes down to the relative presentation size. While thinking about the issue: what are the current methods of presentation, and what is the "commercial" or commonly accepted resolution for the human retina? (Rough numbers in the sketch below.)

T
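
A rough stab at that retina question (my assumptions, not Toenis's: roughly one arcminute of visual acuity, with everything hinging on viewing distance):

import math

def resolvable_ppi(viewing_distance_inches, acuity_arcmin=1.0):
    # Width of one just-resolvable spot at this distance, in inches.
    spot = viewing_distance_inches * math.tan(math.radians(acuity_arcmin / 60.0))
    return 1.0 / spot

for d in (12, 24, 120):   # magazine in hand, desktop monitor, living-room TV
    print(f"{d} in -> ~{resolvable_ppi(d):.0f} ppi")
# 12 in -> ~286 ppi, 24 in -> ~143 ppi, 120 in -> ~29 ppi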

Dom Stevenson
January 4th, 2011, 08:39 AM
As Alfred Hitchcock said: "There are three things that make a great movie: a great script, a great script and a great script."

I don't suppose any new computer chip is going to make people write better, though there is always the danger - and plenty of evidence - of people using computers to make up for second-rate writing.

BTW, my current pet hate is the obsession with color grading. I frequently find myself cringing when I see what some dumb colourist has done to "improve" a film.

Panagiotis Raris
January 4th, 2011, 12:16 PM
Agreed; most online HDSLR stuff is a bit trite and trying. A failed suspension of disbelief, shoddy effects, and poor acting cannot be overcome by 4K 4:4:4 recording.


This means you too, Hollywood.

Tim Kolb
January 4th, 2011, 09:13 PM
When you think about it, it's pretty cool what we can do right now and beyond my wildest dreams of even 10 years ago.

Just sayin'.

I could not possibly agree more...

Erik Phairas
January 4th, 2011, 11:48 PM
I could not possibly agree more...

Absolutely. It feels like not too long ago I used to ask when consumer HD cameras would start to become available, with laughter being the response. Today I have and love my EX3. Add to that all the sub-$1,000 SFX/NLE software packages they have now, like After Effects, and the really big one for me: websites like YouTube and Vimeo, where you can actually have an audience to watch the movies you make. Man, it is so awesome!

Jon Fairhurst
January 5th, 2011, 12:12 AM
Personally, I think HDR could rock for a superhero film. It tends to have a comic book look, which could be perfect for the genre.

Of course, it won't rescue a bad script. But if the script calls for a comic book world, HDR could really help pull it off.