Jim Ioannidis
October 17th, 2002, 01:50 PM
Hi guys,
I started reading Scott Billups' book "Digital Moviemaking," and I have a question that I just need answered, because either I'm slow (quite possibly) or this is not making sense.
In the early parts of the book, Scott talks about 8-bit, 16-bit, and 10-bit signals.
He says that an 8-bit signal can generate 256 levels of brightness and color, which sounds right.
A 16-bit signal can produce over 65,000 levels of color or brightness; that's also right.
But he says that a 10-bit signal can produce over 5,000 levels of color or brightness. Am I doing my binary conversions wrong? How can you get 5,000 from 10 bits? The most you should be able to get from a 10-bit signal, I believe, is 1,024 (2^10).
He doesn't seem to explain where the 5,000 comes from, and I just gotta know.
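Just to show my work, here's the arithmetic as I understand it (a quick Python sketch, assuming an n-bit signal gives 2^n levels, which matches his 8-bit and 16-bit numbers):

# Levels an n-bit signal can represent: 2**n
for bits in (8, 10, 16):
    print(f"{bits}-bit: {2 ** bits:,} levels")

# Output:
# 8-bit: 256 levels
# 10-bit: 1,024 levels
# 16-bit: 65,536 levels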
ok, thanx guys.