View Full Version : Losing my points of reference


Pages : [1] 2

Ozzie Alfonso
September 29th, 2010, 09:09 PM
For the past five years I've been an adjunct professor of Television Production at a major New York University. My students range from average to bright, and depending on the course I have a mixture of sophomores, juniors, and seniors in the class.

As a midterm question I have been asking for a research paper comparing digital to analog - the pros and cons, etc. With the midterms approaching, I've begun to think I should drop this comparison, since digital is the given and analog seems to be a vanishing and irrelevant topic. Still, today I prepared the class for the kind and depth of essay I was expecting. In the middle of the explanation I referred to "negative" film as opposed to "direct positive" film. One student interrupted with "Professor, you said 'negative' - what do you mean?" I went on to explain negative film, compared it to slide film... "Slide?" Most of the class did not know what a slide was because they had never seen one, and most have never shot film, much less negative film.

My feeling of loss and anxiety grew as I realized there were few common points of reference I could share with the class, and with each semester there are fewer and fewer. Thankfully, and surprisingly, one common point of reference was vinyl records. They had actually seen albums, even looked at the grooves! Finally, common ground on which to build my mini lecture.

These students were born in the early 1990s. By the time they started school it was already the mid-'90s. I bring this topic up because I know I'm not alone in trying to explain the current digital world to an audience that has no concept of analog. I appreciate any insights, ideas, or suggestions you might have. How do you explain digital when your audience knows nothing else? How can we understand the present when the past has been erased?

Bryan Cantwell
September 29th, 2010, 09:19 PM
I don't think there's any way to explain it.

You have to show it. Teach them how to record & edit on 1/4" audio tape. Have them work in film (if only 35mm stills), and then in digital video. They'll need to experience it to have an understanding of it at all...

Sareesh Sudhakaran
September 29th, 2010, 09:41 PM
Once I gave a seminar on digital image processing (think: photoshop; it was my project in my engineering course). Here's how I explained it:

The digital world is broken up into units of 0 & 1. The analog world is fluid and can't be broken up without losing an "understanding" of the whole. Tape can also be digital (HDV, DV, etc). Believe it or not, film can also be digital! Digital is just a way to record data and nothing else. We break up data into pieces so it is more manageable and can be transported faster. Analog data depends entirely on the medium (electric wires, radio/microwave frequencies, cables, sound waves...).
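To make "breaking data into pieces" concrete, here is a minimal sketch in Python (the tone frequency, sample rate, and bit depth are arbitrary choices for illustration) of the two steps - sampling and quantization - that turn a continuous signal into digital units:

```python
import math

# Sample a continuous 440 Hz tone (the "analog" signal) at 8 kHz,
# then quantize each sample to 8 bits: the two steps that turn a
# continuous waveform into digital data.
SAMPLE_RATE = 8000      # samples per second
BITS = 8                # bits per sample
LEVELS = 2 ** BITS      # 256 discrete amplitude levels

def analog(t):
    """The 'real world' signal: defined for every instant t."""
    return math.sin(2 * math.pi * 440 * t)

def sample_and_quantize(duration):
    """Return a list of integer codes, one per sampling instant."""
    n = int(SAMPLE_RATE * duration)
    codes = []
    for i in range(n):
        v = analog(i / SAMPLE_RATE)               # sample: discrete in time
        code = round((v + 1) / 2 * (LEVELS - 1))  # quantize: discrete in amplitude
        codes.append(code)
    return codes

codes = sample_and_quantize(0.01)   # 10 ms of audio
print(len(codes))                   # 80 samples
print(min(codes), max(codes))       # every value fits in 0..255
```

Sampling makes the signal discrete in time; quantization makes it discrete in amplitude. Together they are all that "digital" means - everything else (tape, file, fibre) is just where the resulting numbers live.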

Just to give you a wild example: sound itself can travel no faster than the speed of sound, but sound encoded digitally can be transmitted at the speed of light (optical fibre, etc).

However, the greatest advantage of analogue is that there is less data loss: digital needs intermediate conversion steps, and each step loses data. That's why film is still better than digital, and why the natural world (eyes and ears) is better than cameras and speakers. There is no digital without analogue. It's still there, but invisible.

What you'll need to do is convince them on an emotional (or artistic) level of why they need the choice: analogous to how a painter or a musician chooses his tools. If they don't appreciate the choice, either you have failed or the choices are not good enough for their purposes. That raises the question: what do THEY want? :)

Hope I haven't been too long winded about this. All the best!

Allan Black
September 29th, 2010, 11:52 PM
Most are not interested because they don't have time, what with their cells, texts, email, Facebook, Twitter etc.

So lead by practical example: as the students enter class, have them all leave their digital cell phones in a box by the door; then, when they're seated, you put in yours, and then sloooowly put in your notebook .. with appropriate comments.

Having arranged beforehand to take slides of some of the parents, use a slide projector and screen a few at the start to get attention, then screen some 1950s-60s city sights with your commentary.

Each student should bring an analogue example for show and tell: an audio cassette, 8-track cartridge, reel-to-reel tape, LP record, small valve amp etc.

Show and run a small valve radio without the case so they can see the valves light up; maybe lower the classroom lights. Try to borrow an old valve TV with a small screen; you might get some old valves from the local radio station. Describe how they need to warm up and how they work. Demonstrate how everyone was living in an earlier age, most of them happy, by playing a short extract from a radio comedy show. Have fun; say, 30 secs of Sex and the City would have landed the whole production company in gaol.

Have a pair of old phone handsets wired together with a battery so they work, ring and answer etc. Put one each side of the classroom so the wire is visible. Two students should demonstrate it, dialing etc.

Have a student dressed as a cablegram delivery kid who interrupts the class with a wire, maybe on a bike.

The more you can show and demonstrate the better. Hope this helps.

Cheers.

Laurence Janus
September 30th, 2010, 12:10 AM
What is your goal with this?

Perrone Ford
September 30th, 2010, 12:10 AM
Ozzie,

While I sympathize somewhat, I must ask the pertinent question. "What are you preparing them for?"

It is not necessary to understand the gas light to appreciate the light bulb, nor is it necessary to have a deep understanding of the horse and buggy to understand the nature of the automobile. The concept of slide film and negative film is for all practical purposes irrelevant to these students. The idea of actually shooting on film, particularly motion picture film, is as foreign to them as I suspect a paper encyclopedia is.

While I am all for preserving the past, I am far more for giving students the tools and knowledge they will need to succeed in the future. And film is not the future. As beautiful as it can be, its scarcity and price mean most students today will never handle it. I grew up on Kodachrome and Ektachrome. Today you can't even buy it. It's a relic, much like a tube TV. These lectures should be preparing the students for their future unless you are teaching a historical course.

A more relevant question might be what is the difference between studio camera and a cinema camera. Explain the path from camera to theater/broadcast. Or ask them to project what they see happening in THEIR future... not reflect on OUR past.

I will also put the question to you. As an educator, how well versed are you in TODAY's technology? In today's methodology? Are you as well versed in the methods of today as you are in the methods of our youth? If you are well versed in it, then impart that onto the students. VERY few things motivated me as a student more than going to a live set and seeing what it took to make the news, the weather, and the sports report. The lighting, the editing rooms, the greenscreen (blue screen back then), etc.




Chris Soucy
September 30th, 2010, 12:45 AM
I'm with Perrone on this one.

Sorry Ozzie, these kids are the future, not the past. What has been has already been, and has been surpassed, twice over in every generation, for the last 30 years.

You're teaching Television Production, not history.

Leave the history to someone else; teach what they need to know going forward, not the past. It really isn't much help to them.

Those with a more studious bent may well investigate further, but your job is getting them past those exams, not giving them a living history of the moving image, unless it's in the syllabus.

Those who can, will, those who just don't "geddit" won't, simple as that.

Stop stressing and make sure you know more than they do, being on the downside of that is the kiss of death for any Tech trainer.


CS

Jim Andrada
September 30th, 2010, 01:02 AM
Kodak: "There is a very real resurgence for film" - British Journal of Photography (http://www.bjp-online.com/british-journal-of-photography/q-and-a/1735570/kodak-there-real-resurgence-film)

Well, one good point to make about analog is that the real world IS analog. Nobody actually sees or hears digital anything and they won't until babies start getting born with digital inputs implanted in their ...

And even then, the nervous system is still analog.

So since the actual act of communicating to the viewer/listener is indeed analog, failure to have some understanding of fundamental analog concepts is, I think, a gross omission.

After all, the OP said he was teaching at a university, not a trade school or junior college.

Lorinda Norton
September 30th, 2010, 01:12 AM
Hey, Ozzie! Been awhile. This reminded me of a funny T-shirt I saw Chris Hurd wearing several years ago that said, "Long Live Analog." I loved it. Analog does have a pro or two, in my outdated opinion.

"How to explain digital when your audience knows nothing else?" While I think it is good for students to know the difference between digital and analog--particularly sampling and all that--your wondering if it's time to drop the comparison (especially in the form of a research paper) is most likely a signal that it's time to at least cut back. Personally, I like knowing the history behind technology and anything else, but I'll bet just a skimming would do.

EDIT: Just read your post, Jim. Good points!

Brian Drysdale
September 30th, 2010, 02:13 AM
Having laid a full sound track on a 16mm drama, using a lot of foley and location dialogue, I was surprised how little had changed when I was involved in doing the same (although not as the editor) on a couple of shorts I directed. The process had changed, but how you used the raw material hadn't.

You were still laying tracks, except they weren't physical tracks on 16mm magnetic film laid using spacer on a synchroniser, they were digital tracks. How they looked on the computer screen was exactly the same as the dubbing mixing cue sheet I made up for the 16mm film. One difference was that the dub mix actually took longer than when I did the 16mm mix, even though they had the same complexity (apart from being in stereo as against mono) and it was a longer film.

The on set working relationships are the same if you're shooting film or video, although film does force people to think more in advance and that's something worth learning.

Certainly students should learn the history of film and watch those old films; it's amazing how innovative some of the visuals in the best silent films are. There's more to it than pushing buttons.

Laurence Janus
September 30th, 2010, 02:36 AM
As Brian Drysdale astutely observed "film does force people to think more in advance and that's something worth learning."

This is a very important point. Digital has created a "get everything and sort it out later" attitude in people.
I have had some projects turn into a big pile of nothing because of this.

How about this for an assignment: give each group in the class an SD WORM card, and they can only tell you the IN-OUT points for each clip to make their movie - no color correction or other silliness!
SD WORM Card (http://www.sandisk.com/business-solutions/sd-worm/sd-worm-card)

These cards are small too, so they would have to be careful :)

Perrone Ford
September 30th, 2010, 07:50 AM
After all, the OP said he was teaching at a university, not a trade school or junior college.

And as such, I would suspect there would be coursework available to teach the foundations of our broadcast and film industry. But this is a PRODUCTION class. Had this class been taught 30 years ago, would the instructor have required everyone to write a thesis on silent film?

A "production" course should be about the tools and methods to produce episodic TV, documentary, narrative, news, etc. And nowhere in modern production (at least outside of the high-dollar shows) is anyone going to be working with film.

I am all for teaching basics of exposure, and even a bit of history. But that should be a lecture, not a research project. I'd imagine at a university level, I could barely get in 1/3 of what I'd want to teach kids about TV production in a semester. Even the idea of basic editing could take weeks.

Again, just my thoughts, but if I was paying the prices for classes these students probably are, I'd be wanting a class to teach me what I need to be employable. Not something I could get on my own or with a visit to my grandfather's attic or a museum.

Alan Emery
September 30th, 2010, 08:10 AM
Hi Ozzie;

I have been interested in or shooting stills for over 50 years and have moved to digital for everything but large prints, and even that is a hybrid (medium format scanned). I also have been involved as a subject, writer, or advisor in documentary TV productions for not quite as long, mostly on 35 mm or 16 mm film. Recently I decided I would tackle digital video production on my own as a hobby, from start to finish. So I am about as deeply embedded in analog as one can be, but moving to digital.

What fascinated me about the move was how dramatically the legacy of film has influenced and directed the development of the end processes of digital imagery both stills and video.

For example, for video, the image was required to be backwards compatible with the original b&w screens. When colour was developed for TV it could not be straight RGB (not backwards compatible); a new approach was needed: luminance plus chrominance - hence the luma/chroma (Y'CbCr) encoding, and chroma subsampling such as 4:2:2, of digital imagery for TV. (I expect that will change once all TV is digitally broadcast, but that is a while yet for most of the world.)

Our eye and brain function as an analogue, continuous system. The eye-brain system can resolve individual images up to a certain rate; beyond that rate, the brain tends to confuse the two closest images in time and fuses them into a single image. The rate varies, but is around 20 to 25 frames per second, and this sets the frame rate for any human viewer, digital or analogue. Called flicker fusion, the rate varies across animals: many invertebrates can resolve separate images at nearly 100 frames per second.

The original presentation format of TV was essentially analogue converted physically into discrete elements (phosphor dots). While film and photo prints are pretty close to continuous (they still have grain structure), the printed image we see in magazines has long been effectively digital. Continuous screen printing is pretty rare nowadays.
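The luminance/chrominance idea can be sketched in a few lines of code, using the standard ITU-R BT.601 weights (the exact constants vary by standard; this is an illustration, not a broadcast-accurate converter):

```python
def rgb_to_ycbcr(r, g, b):
    """Split an RGB pixel (0-255 per channel) into luma (Y) plus two
    colour-difference signals (Cb, Cr), using ITU-R BT.601 weights.
    A black-and-white receiver can display Y alone and simply ignore
    the chroma, which is what made colour TV backwards compatible."""
    y  =          0.299    * r + 0.587    * g + 0.114    * b
    cb = 128.0 -  0.168736 * r - 0.331264 * g + 0.5      * b
    cr = 128.0 +  0.5      * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr

# A pure grey pixel carries no colour information:
y, cb, cr = rgb_to_ycbcr(100, 100, 100)
print(round(y), round(cb), round(cr))  # 100 128 128
```

A grey pixel yields chroma values at the midpoint (128), i.e. zero colour information - exactly the part an old b&w set never needed.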

There are lots more examples that at first totally confused me as to why digital went through these wild contortions, but it is essentially the background from which the digital image evolved that sets up these sometimes crazy conventions. Perhaps that is a good focus for an historical essay on the importance of analogue in today's digital world.

Alan

Jonathan Palfrey
September 30th, 2010, 09:56 AM
I totally agree with Perrone.

As a university student myself I feel we have focused far too much on the history of TV, media and art rather than the future.

Yes, there needs to be some historical context to what you are teaching, but your students only need a basic understanding. For example, knowing how to process digital content, and thinking about how this might develop and change in their lifetime, would be far more useful and would prepare them for their future jobs.

I personally feel digital is a big enough subject to teach on its own, let alone trying to get them to explain analog (my main essay last year was on Digital Cinema and 3D).

Brian Drysdale
September 30th, 2010, 10:05 AM
And as such, I would suspect there would be coursework available to teach the foundations of our broadcast and film industry. But this is a PRODUCTION class. Had this class been taught 30 years ago, would the instructor have required everyone to write a thesis on silent film?

A "production" course should be about the tools and methods to produce episodic TV, documentary, narrative, news, etc. And nowhere in modern production (at least outside of the high-dollar shows) is anyone going to be working with film.


I think this depends on the level you're teaching students at. If you are dealing with the best, or people who have ambitions to be the best, they should be aware of many aspects of the subject and its place within modern culture. Yes, the instructor should teach about silent cinema; there's a lot to learn for any aspiring filmmaker - they should check out films like "Sunrise" and other classics. Good quality copies can still stun you, and they're from before the Hays Code stuff.

Jim Andrada
September 30th, 2010, 10:29 AM
Hi Perrone - I certainly hope the instructor would have covered silent film - in fact it should still be covered. After all, digital content can still be silent. But I do get your real point, i.e. how far back do we have to go in teaching production or anything else of a practical nature. And I guess it depends on what it is that we really are trying to teach.

However, I don't think that the digital vs analog issue is quite - well, let's say "analogous" to the nuclear fusion vs rubbing two sticks together to generate heat/flame etc analogy.

Let's face it - digital is a huge abstraction that boils down to the concept of processing in a discrete vs a continuous space, as I think was pointed out in another post. Digital isn't ones and zeros - after all, a lot of early large-scale computers were built to operate in base 10, versus the base 2 of most of today's digital computers. An abacus is a digital device.

Maybe I'm getting too far afield, but at a very fundamental level, the issues we all face relate to how one samples and represents the analog real world in order to map it into discrete computational space, perform abstract discrete computation, and re-map into analog space that can be perceived by our analog senses. And why this is a good thing in the sense of making lower cost and more efficient systems available to us.

How about the issues of audio dithering to avoid the ugliness of quantization error (quantization, by the way, being just another way of saying conversion between analog (continuous) and digital (quantized) space)? Sampling into 4:2:2 or other colour spaces? In fact, the fundamental act of sampling is another discrete/continuous transformation. Digital clipping, anyone?
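As a rough sketch of the dithering point (a hypothetical one-LSB example, not any particular audio tool):

```python
import random

def quantize(x, step):
    """Round a sample to the nearest multiple of `step` (forcing the
    continuous signal onto a discrete grid)."""
    return step * round(x / step)

def quantize_dithered(x, step, rng=random):
    """Add TPDF dither (the sum of two uniform randoms) before rounding,
    which decorrelates the quantization error from the signal: the error
    becomes benign noise instead of correlated distortion."""
    dither = (rng.random() - rng.random()) * step
    return step * round((x + dither) / step)

# A tiny, constant signal smaller than one quantization step:
step = 0.1
signal = [0.03] * 10000

plain = [quantize(s, step) for s in signal]
dithered = [quantize_dithered(s, step) for s in signal]

print(sum(plain) / len(plain))                   # 0.0, the signal vanished
print(round(sum(dithered) / len(dithered), 2))   # ~0.03, preserved on average
```

Without dither, anything below one quantization step is simply erased; with dither, it survives as the average of noisy samples - the distortion is traded for benign noise.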

Everything we do with our digital hardware and NLE's and 3D effects software is in the service of producing/transmitting etc analog content while capturing the efficiency benefits of digital encoding.

Analog will be with us forever so I think people need to understand it and think about it.

Edit - I realized that I confused a couple of things here - ie the real world is continuous but not analog, although our perception of it is via analog sensors like eyes, ears etc.

Seth Bloombaum
September 30th, 2010, 10:42 AM
I experienced the a-to-d revolution first-hand as a working pro. I'm not particularly nostalgic, though I still battle with my own gear fetishism. Case in point: I own an ARP 2600, which I never could afford back in the day (analog synthesizer, very fun toy).

I teach video production as an adjunct in a certificate program at a community college. Every term there are fewer and fewer students who've had experience in 35mm still photography - this term there is 1 out of 20. The average age at this college is 35; most of my students are in their mid-twenties to early thirties. I taught an introductory unit on sound a couple of days ago, and none of the students had any experience with analog sound recording, e.g. not even cassette tapes.

An understanding of analog acquisition and post is purely incidental to our college's mission of preparation for employment, IMO. On the other hand, we do need to establish a foundation of understanding analog-to-digital conversion, and the fundamental characteristics of digital information.

For example, everything in our world outside the camera is analog by nature, to include not only light and sound, but also our perception of light and sound. Understanding of the differences between our perceptions and what the camera or microphone perceives is essential.

Also bear in mind that this information remains analog through preamps and image sensors, until conversion to digital. In that, students do need a solid foundation in analog signals and techniques for dealing with them, but, not to include analog recording technique or media.

I am *so excited* that we now have such amazing tools at such low cost. I see it as the democratization of media. I think it was A.J. Liebling who said "Freedom of the press is guaranteed only to those who own one" - now, so many people can! I want to help them do it better - an historical perspective is helpful in that mission, but IMO there is no reason that a student needs to get inside the technology of film or analog tape - it's enough that they see what people did with these technologies.

Edit: Ah, I see Jim made some of these points above while I was typing; appreciate them, they resonate, especially the concept of digital tech as an intermediary between events in an analog world and the eventual display back in analog.

Joseph Santarromana
September 30th, 2010, 11:03 AM
Art has always been pushed forward by new technologies; each technology has its process that generates its own visual characteristics, and those characteristics speak something of the culture of the time of that technology. Photographic technologies from the Daguerreotype to black and white film to colour to digital to 3D to holograms each have unique processes, and these processes have become part of an artistic language. An artist who is highly informed about the historic development of their medium can have a greater command of its language. A Post Modern framing would be something like 'what does it mean for a digital photographer to create an image that looks as if it were a daguerreotype?' A poet who has a command of language, written and/or spoken, has greater potential in his or her art form.

Joseph Santarromana
September 30th, 2010, 11:14 AM
Sometimes, because of time constraints, it's too much to teach multiple processes in one class; maybe you can recommend that art history include these issues in their curriculum.

Perrone Ford
September 30th, 2010, 11:53 AM
Art has always been pushed forward by new technologies; each technology has its process that generates its own visual characteristics, and those characteristics speak something of the culture of the time of that technology. Photographic technologies from the Daguerreotype to black and white film to colour to digital to 3D to holograms each have unique processes, and these processes have become part of an artistic language. An artist who is highly informed about the historic development of their medium can have a greater command of its language. A Post Modern framing would be something like 'what does it mean for a digital photographer to create an image that looks as if it were a daguerreotype?' A poet who has a command of language, written and/or spoken, has greater potential in his or her art form.

While I don't disagree with this, I really feel much of this discussion is being played out in the minds of people who are not students, or haven't been around students in a long time. So here's the scenario.

You are a professor. You teach Video Production. It's day 1 of class. Thirty young men and women, most of whom have never seen a box of film, are about to walk into your twice-a-week, 1.5-hour class. You have 13 weeks, or 26 classes, to teach them a university-level course in producing commercial video. That is a total of 39 hours. Less than a full work week for someone clocking a 9-5 job.

So let's make it pertinent to the folks here. If you were going to shadow a wedding videographer, or a wildlife videographer, because that was what you were planning to do with the rest of your life, and you had a SINGLE 40-hour week to learn everything you needed to know, how much time would you like that person to spend on the history of video cameras, analog sound techniques, the Nyquist theorem, A/D converters, and other minutiae, and how much of that 40 hours would you like spent on the things you'd need to either become employable at the end of the week, or to start your own business at the end of the week?

These kids most likely have 5 or 6 classes and are trying to find their way to their future. Sure, understanding how we got to the point we are at is terrific. And very important. But with a total of 39 hours available, I'd be rather disappointed to spend $5k-$10k of my tuition money on learning it. Time that I'd rather spend learning to edit, light, script, mic, and create broadcast-level work.

If we're talking about a high school class, where you can dedicate 120-150 hours or more to the subject, then yes, I think you go into that kind of depth.

Jim Andrada
September 30th, 2010, 12:22 PM
Well, I don't think you need to spend two weeks on the subject, but I do think it's important to at least talk about it in an introductory class and point at relevant literature. I also think the "analogy" between a 40-hour workweek and a 40-hour total class (although if it's a semester system, probably closer to 60 or 70 hours) doesn't really take into consideration the time spent on reading/homework etc. for a class. I remember spending 6 hours a week in class and 20 hours a week in lab (I was a combined Chemistry & Physics major, and I think I lived in those labs!); all things considered, I think I spent more than 40 hours a week on those two subjects (and I had 3 others).

Brian Drysdale
September 30th, 2010, 01:04 PM
You have 13 weeks, or 26 classes, to teach them a university-level course in producing commercial video. That is a total of 39 hours. Less than a full work week for someone clocking a 9-5 job.

Is this really university degree level? It sounds like it falls short of what I would call a degree; no wonder that whenever I attend workshops, the students helping out say they learn more on that workshop than in one year of their course. I assume this is just one module which makes up part of the final degree.

Perrone Ford
September 30th, 2010, 02:00 PM
Well, I don't think you need to spend two weeks on the subject, but I do think it's important to at least talk about it in an introductory class and point at relevant literature. I also think the "analogy" between a 40-hour workweek and a 40-hour total class (although if it's a semester system, probably closer to 60 or 70 hours) doesn't really take into consideration the time spent on reading/homework etc. for a class. I remember spending 6 hours a week in class and 20 hours a week in lab (I was a combined Chemistry & Physics major, and I think I lived in those labs!); all things considered, I think I spent more than 40 hours a week on those two subjects (and I had 3 others).

When a paper is assigned on a subject, typically that indicates more than one classroom lecture is spent on the concepts. We can conservatively say 2-3 class periods on it, perhaps to include a discussion after the papers are read and graded.

As for the semester system, my basic courses were 3 credit hours, which meant either three 58-minute class periods or two 1.5-hour class periods per week. We had 4-credit-hour classes with lab (I was an engineering student), and yes, that was two 2-hour classes and a 3-hour lab per week, plus homework. But I wouldn't expect a video production course to run the same as a chemistry lab. I'd expect it to run more like most common classes, which are lecture only.

As previously mentioned, I see nothing wrong at all with devoting one, two, or perhaps even three lectures to laying the groundwork: explaining from whence we've come, and how we got to today. From there on, coursework should focus on present and, potentially, future techniques and methods. At least in my view.

Perrone Ford
September 30th, 2010, 02:05 PM
Is this really university degree level? It sounds like it falls short of what I would call a degree; no wonder that whenever I attend workshops, the students helping out say they learn more on that workshop than in one year of their course. I assume this is just one module which makes up part of the final degree.

Brian,

I think we are misunderstanding one another. This is likely a single course of perhaps 3-4 credit hours out of a total of 120+ that a student must complete to graduate. That is, assuming it is a traditional 4-year university. However, in the American university system there is often not a lot of overlap in this kind of field, so each course is designed to stand on its own somewhat. In that context, learning as much practical information per class as possible is optimal, whereas a more general course in the THEORY of production would seem better suited to a historical lesson on the craft, perhaps brought up to modern methods.

Coursework is often laid out in a mix of theory, and practical application. Much like in a liberal arts course, you might take "modern literature" to get a synopsis of poetry and prose from the past 100 years, but you'd take "Creative Writing" as your practical application of that knowledge. And the overlap between the two is quite small.

Hopefully, that makes sense.

Bill Davis
September 30th, 2010, 02:14 PM
Personally, I think the issue here has little to do with the state of the industry - OR the state of education. It was started by a teacher confronting the fact that every year, more and more of the knowledge base that allowed him to become a teacher is getting more and more obsolete.

That's difficult to confront.

And it's a BIG problem for the education industry. If everything is changing - and current knowledge is what you read over the last month online - what's even the ROLE of formal education?

It's easy to teach the historical basics, because the students can't be expected to pick that up without directed study. But when you have to make an encoding decision TODAY between H.264 and WebM, for example, the ONLY way to make that decision is if you've spent a LOT of time keeping current. And how can you do that - and also prepare to teach it - AND live your life?

What makes "school", school anymore? Perhaps that's the real question. And will it look like formal classrooms and lectures much longer? Or will "school" for pragmatic skills like Video Production become more like rolling shoots that demonstrate what's working TODAY?

Interesting times we live in.

Brian Drysdale
September 30th, 2010, 03:18 PM
Coursework is often laid out in a mix of theory, and practical application. Much like in a liberal arts course, you might take "modern literature" to get a synopsis of poetry and prose from the past 100 years, but you'd take "Creative Writing" as your practical application of that knowledge. And the overlap between the two is quite small.


I do get the impression that the American system is modular, but for the overall degree the modules themselves may not totally interconnect.

As for all the change: some aspects change very quickly, but other aspects haven't really changed at all. There are top DP's who aren't hugely technical, but are extremely creative and/or have excellent people skills. These people survive the technical changes because they have the core skills that have always been in the industry. They also know how to access the required new knowledge and make use of it.

Alan Emery
September 30th, 2010, 04:16 PM
Hi Ozzie,

You ask "How can we understand the present when the past has been erased?"

As you can tell from the responses, the past has definitely not been erased; it has evolved and is built into the digital world in many ways, from the technical and mechanical to the continuity of the skill of making fine stories and fine imagery captured and presented with the latest technologies.

Probably from the very beginning of human societies, wisdom (as opposed to information and knowledge) has been passed down by telling stories. I can imagine conjuring images in those long distant past times through firelight, costumes, dancing, song, plays, and mock battles. Along the timeline of human development we have created new ways to engage the mind and heart.

While the relationship between the film technology of the past and the digital technology of the present is important, it is really a means to an end -- passing on wisdom through storytelling and the metaphor of the images we see on the various screens we use to display them. In today's world, staying current with the amazingly rapid development of better and better technology is a challenge all by itself. So I agree with the sentiment that while the technical historical past is of interest, especially in the abstract, it is not a critical feature of being able to tell stories using modern digital equipment.

Alan

Seth Bloombaum
September 30th, 2010, 04:29 PM
I don't care what students know about film. I care what students know about what people have done with film. I agree with statements from several people above, that a "production" education needs to include practice and study in many genres.

Jonathan Jones
September 30th, 2010, 09:21 PM
Aside from the technical factors involved in distinguishing analogue from digital, the practical characteristics can certainly be extrapolated and philosophized upon.

An educational exploratory technique I have always favored would be the elemental core of a parable, but without the surface analogy narrative. To help explore and discuss such distinguishing characteristics, it might serve to start with analogous and communally recognized concepts with which they are most familiar in their own active memory. For these students, such concepts might be the ways that Myspace, and more recently Facebook, have largely replaced campus community billboards, texting has replaced phone calls and emails, and the growing use of PDFs and ebooks is starting to replace the bulky and expensive course textbooks required for their studies.

Each is, in its own way, a method of connecting the expression and the reception of information. Each also carries its own benefits and limitations - pros and cons, including such factors as ease of use, functionality, accessibility, cost of entry, maintenance, systemic adoption, scalability, and intra-operational compatibility, among a host of others.

The same holds true for analogue / digital media creation. Fundamentally, the information to be conveyed is unchanged - the precise amalgams of light and shadow, sound and silence. What is different is the methods through which these fundamentals are captured, manipulated, and expressed. And these also invite the same opportunities for extrapolation - benefits and limitations - pros and cons.

While some elements of these explorations may not deserve to be dwelt upon, in some ways "long view scrutiny" can sometimes invite stepping outside of the box, and compel the ambitious to re-think even the most basic tools of the trade that many take for granted, optimizing their potential, and even spurring on the next phase of related technological development. There will come a time when those students discussing this today will be the next generation's perception of those with a blinking "12:00" on their VCR faceplates, or a looping Foghat cartridge stuck in their 8-track player.

-Jon

Yoshiko Okada
September 30th, 2010, 11:15 PM
I also think "Sunrise" is one of the greatest silent films.
But most film makers have forgotten a lot of great films from the silent era.
Few people watch even Griffith's movies at the present day.
But I don't think people should be worried about it.
Because if the students need knowledge of analogue, they will learn about it by themselves in the future.
But if they don't need it, they won't learn it.
If they don't want to learn about past times, they won't be able to get complete knowledge.
What instructors can teach their students is limited.
The instructors should wait for students to find what they will need.

Chris Soucy
September 30th, 2010, 11:46 PM
Pretty well exactly what I said in post #7.

I know when I did my video course in London as a (very) mature student, I could bring a wealth of previous knowledge of still photography to the party which the "kids" just didn't have, neither did they have the exposure I'd already had to video from doing it in practice and what I'd learnt via DVinfo.

Did it stop them from learning the realities as they existed then, with the equipment available? Not a bloody bit.

Yes, I was able to accelerate their learning dramatically because I had "hands on" practice at a speciality they hadn't had even a chance to touch before, but I was amazed by how those "kids" (mostly early 20's, I was in my early 50's!) soaked up information.

They were "getting" the new stuff a heck of a lot faster than I was (tho' as I knew the "new stuff" was, in effect, "old stuff" I wasn't exactly too bothered).

Forget the analogue/ digital divide.

Movie making is about creativity and discipline; they will not be supplanted by any technology ever created.

Worrying about the technology just obscures the reality - how do you make a product that talks to people?

Lose that and sink into obscurity, whether it be analogue or digital, it's still just dross.

Forget the history unless the individual wants to find it out themselves; get on with what they NEED to know going forward.


CS

Yoshiko Okada
October 1st, 2010, 12:47 AM
So don't you think people can learn anything from history?
Many painters learnt useful techniques from the previous works of the pioneers.
I guess movie makers should be similar to them.
Few people at the present day know Griffith's techniques, but obviously those techniques made film develop in the 20th century.
If the technology of the 20th century didn't exist, the current technology wouldn't exist.
We can learn a lot of things from history.
But we can't force any students to do so.
Maybe some great students will find something in history.

Brian Drysdale
October 1st, 2010, 03:40 AM
I think people may be confusing analogue and digital technology, which are the tools, with learning what you do with the tools. Unfortunately, a lot of productions seem to be following a template, rather than letting the subject matter lead the form. Although, there's always the element of the production executives just blindly following the buzz words they heard on the latest guru workshop, rather than thinking about it.

Nothing really new about that; John Ford didn't shoot some CUs because he knew that the studios would use them given a chance. Other directors only let the shots run as long as THEY needed before cutting, so the studios didn't have too many alternatives during the editing. Now everything is over-covered and the director may not have any involvement in the edit.

Knowing about lighting and having a knowledge of images is something that carries forward regardless of the technology, as does knowing about dramatic timing and story telling skills.

Chris Hurd
October 1st, 2010, 07:40 AM
So don't you think people can learn anything from history?

Absolutely yes, they can. But at the university level, that's usually a separate class altogether. When I went to film school at UT-Austin, film history was a mandatory course, and it was two semesters long (it was also one of my favorite classes). Our introductory production class -- if I recall correctly, it was "Principles of Film & Video Production" or some such -- did not touch upon film history at all, since there was a separate course for that already. One day early in the course was spent on the history of the technology, and from there on it did not look back. Our text for most of that course was Photography by Upton. The instructor said, "this text is about still photography but the fundamentals of framing, composition and exposure are exactly the same for the moving image. Read everything in this book except for the part about developing film." It was one of the best pieces of advice, not to mention one of the best textbooks I ever experienced in film school.

Seth Bloombaum
October 1st, 2010, 10:04 AM
...The instructor said, "this text is about still photography but the fundamentals of framing, composition and exposure are exactly the same for the moving image. Read everything in this book except for the part about developing film." It was one of the best pieces of advice, not to mention one of the best textbooks I ever experienced in film school.

Most students have learned to read the visual language of film and television. For most, this is, in the main, a subconscious process.

Learning to write in the conventional visual language is of primary concern. As Chris points out above, the conventions of composition have been with us for a long time. Other conventions too, like, "when to cut" between shots. When not to cross the line. How to reflect the changing emotional intimacy of a scene in shot selection.

Here, the long history of art, photography, film, television, graphic design and other graphic arts can inform us in our study of visual language and the creation of new works. Likewise the history of musical exposition, radio, and even carnival hucksters can inform our audio compositions.

I encourage students not only to know, but to add to that palette of conventions. There's always a new way to cross the line with a camera, a way to make a stronger composition.

How to optically composite in the film printing process is a dead-end, if not a dead art. That now-ancient technology and craft may yet inform an effects designer, but, as Yoshiko Okada points out above, it will be post-university study for that effects designer.

Jay West
October 1st, 2010, 11:28 AM
Having read everything above, I find myself agreeing with pretty much everybody. That's because I don't really understand why Ozzie was requiring a midterm essay/paper on comparing analogue and digital.

The fundamental question is what is the analogue-digital comparison supposed to teach the students? That is not a rhetorical question about old and new.

If Ozzie is teaching an academic course, one could well ask a McLuhanesque question about the extent to which media and technical limitations shape perception and technique. A question about comparing analogue and digital could be a focus except that you really would need to teach a lot of history to provide the frame of reference mentioned in Ozzie's title for this thread. One could just as well ask for a discussion of how the limitations of movie film & equipment (and the related economics) shaped the filmic style that productions still emulate and whose look still has very strong partisans.

But is that where Ozzie was aiming? Or, is he running a practical/lab type course about actual production practices for things such as broadcast, movies, multimedia educational packages, etc.? If the latter, then, analogue-digital comparisons do not have a purpose apparent to me. True, there are folks who still deliberately work with Super 8mm film. (I recall reading a recent post in the Wedding and Event forum here about somebody who made a wedding video that he shot with a Super 8 film camera.) But, realistically, nobody coming out of a college level TV production program is going to have to compare and choose between digital and analogue production equipment. Digital won that fight a decade ago.

For hands-on production training, analogue-digital comparisons sound like something to skip, much as Chris's instructor said to skip the chapter on film development in the Upton book. You do need to learn how to work with depth of field and how choices of lenses, aperture and shutter speed affect the production. You do not need to compare film stocks and Beta video tape to XDCAM and P2.

I remember an experience somewhat like Chris's, except that my college film classes were about ten years before Upton first published his book. (I think we used something that combined chapters excerpted from writings by Ansel Adams and Sergei Eisenstein.) We also had a lecture on the old nitrate film. It was interesting, and also potentially useful for anybody thinking about conservation and museum work. It would not have been a good subject for a midterm paper.

It seems to me that looking at the old work to learn about narrative structure, and techniques of framing and composition is different than comparing, say, kinescopes, 3/4-inch Ampex, Beta, SVHS and Hi8 with DV, HDV and AVCHD.

So, back to where I started this post: what was Ozzie trying to get the students to learn by comparing analogue and digital? Answering that seems fundamental to helping him figure out how to change the midterm so that he can better communicate with his students.

Brian Drysdale
October 1st, 2010, 12:28 PM
When I first used a Digibeta camera compared to an analogue Betacam SP BVW 400, the only differences I noticed were that the CG was a bit higher and the sound reference tone setting was different; everything else was the same.

Having to switch between 4:3 and 16:9 depending on the production was a bigger deal, because some lenses had a built-in optical adapter so that the angle of view could be maintained, while other lenses didn't. This could restrict your wide end in 4:3 on a Digibeta camera.

Now everything is 16:9, so it's not an issue.

Jim Andrada
October 1st, 2010, 08:46 PM
Anyone want to wager on whether Chris actually read the part about film developing or not? As I remember college (50th reunion coming in a year or two!) whenever a professor told us to skip something it was the first thing we looked at. Usually we had to agree that it was a waste of time/irrelevant, but occasionally it was the best part of the book. I always thought that the greatest thing about college was that it exposed you to so many things that you had had no intention of studying or even finding out about. Or put another way, I think I often learned more from the "don't bother reading it" parts of the book than from what I was supposed to be reading.

But then again I've wound up in a career that's totally different from what I thought I wanted to do and am eternally glad that I studied at a liberal arts school instead of an engineering oriented school where I would never have been as challenged to think outside the bounds I had originally set for myself.

As someone said, Education is what's left when you've forgotten everything you learned in college.

Yoshiko Okada
October 1st, 2010, 10:25 PM
I wonder if our views can help Ozzie.
I guess that, as a teacher, he wants to teach his students the value of analogue, but he doesn't know how to teach them.
So I should have thought about how we could help him rather than about whether we had to study history or not.
If he can, I'd like him to tell us what he is thinking about it now.
And I found that maybe Laurence Janus made a good suggestion in his comment (#11).
But I could hardly understand what he wrote.
So if it doesn't bother him, I'd like him to write what he suggested again.

Anyway, I want a lot of students to learn various things, including the history of shooting films.
Probably nobody knows what will be useful while they are studying.

Geoffrey Cox
October 2nd, 2010, 02:53 PM
A very interesting thread especially for me who is also a university lecturer though not in film / video production.

One of the fundamental questions is what is a University education for? To help students get jobs or to help them develop a love of knowledge and the ability to learn and grow autonomously? To me it is the latter because the former is only short term whilst the latter will help them for their whole life (including, importantly, the ever changing demands of the job market). So I'm in the history / context / critical enquiry / understanding camp all the way.

Students often complain of being taught things that are 'irrelevant'. That is because they lack understanding of what is lastingly important and only see the short term. Our job is to try and enlighten them and 'selling out' to the ever decreasing demands of the fickle market will impoverish education and do the fee paying students a disservice.

Rant over, and of course practical / technical skills are very important too!

Bill Davis
October 2nd, 2010, 04:08 PM
Since I'm feeling especially ornery today let me pose an even more complicated meta-question...

Current research shows pretty clearly that when it comes to brain development, MOST people don't reach full brain maturity until 25 or so. So if you're standing in front of a classroom of 20 year olds, and they have diminished reasoning skills relative to where they'll be in another 5 years - what's the point of focusing overly much on ANY of the grave, important discussion processes suggested here that pretty much REQUIRE cognitive reasoning to parse?

Why not just teach PROCESS? At that age, kids can easily learn process and repetition. So drill, drill, drill on the button pushing skills. Push them out in the field and let them watch, and participate in the process of making content. But leave the philosophy stuff until the whole class can participate fairly?

Not my SUGGESTION mind you. Just a point of interest in the discussion.

If you've got 20 students and only 5 of them have the fully developed cognitive skills to learn the esoteric skills YOU want to teach - aren't you doing 15 of them a disservice by concentrating on what YOU find important at YOUR age? Rather than concentrating on what THEY can learn at their age?

Just asking the question.

And yes, I know school (at least in the US) doesn't work like that. I'm posing the question about what it SHOULD look like in order to be most effective.

Discuss amongst yourselves.

Jim Andrada
October 2nd, 2010, 06:14 PM
Funny - I thought full maturity (mental at least) didn't arrive until 80 or 90! At least I assumed that's why my wife keeps telling me I'm acting like a child! Now you've got me starting to worry - maybe I've been going downhill all these years and never realized it.

Lots of interesting ideas bouncing around here.

First, let me say that I'm not so sure what the OP meant by teaching analog(ue). If he was into teaching how to splice tape with a knife to do editing, maybe it isn't relevant any more (although I find it interesting), but if he was into things like understanding why digital (audio) clipping is so much different from analog (tube) clipping, then I think he'd be right on to teach it. After all, tubes seem to be making quite a comeback lately.
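
(To make the clipping point concrete: here is a minimal numerical sketch, not a model of any real circuit. Hard clipping, as at a digital converter's full scale, chops the waveform off flat, while the tanh curve below is just a common stand-in for the gradual saturation of a tube stage.)

```python
import numpy as np

def hard_clip(x, limit=1.0):
    """Digital-style clipping: anything past full scale is chopped off flat."""
    return np.clip(x, -limit, limit)

def soft_clip(x):
    """Tube-style saturation, approximated here by tanh: the peaks are
    rounded off gradually instead of being flattened abruptly."""
    return np.tanh(x)

# A 5 Hz sine wave driven to twice full scale.
t = np.linspace(0.0, 1.0, 1000, endpoint=False)
x = 2.0 * np.sin(2.0 * np.pi * 5.0 * t)

hard = hard_clip(x)   # flat tops: abrupt corners, harsh high harmonics
soft = soft_clip(x)   # rounded tops: gentler harmonic content

print(hard.max(), soft.max())  # hard hits exactly 1.0; soft stays just below it
```

Small signals pass through both curves almost unchanged; the difference only appears as the level approaches the limit, which is one reason tube overdrive reads as "warm" while digital overs read as harsh.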

Re Bill's comment re the age of (mental) maturity for 75% of the college population: I wonder if maybe 75% of the college population is in the wrong kind of school at their point in time - maybe they'd be better served by studying at a technical/trade school focused on workplace skills and going to university later in life. I think I'm sort of with the comments made by Geoffrey a couple of posts ago re what exactly the point of a university education really is.

Probably politically incorrect to go down this path but for some reason in the US we seem to have a fear of acknowledging that different people for different reasons have different educational goals and capabilities and everybody shouldn't necessarily be in a university setting.

Alan Emery
October 2nd, 2010, 07:10 PM
Hi All,

Recognizing that there is a continuum of education in a college/university setting that ranges from applied to abstruse, and also recognizing that each college or teaching university or research-based university sets its own objectives, I looked up what and where he teaches. Ozzie himself is a highly qualified professional who has won Emmy Awards for cinematography and outstanding directing.

The stated objectives of the program are:
•Training in all relevant production techniques such as camerawork, lighting, editing, sound engineering, tape operation and more
•Intensive classroom instruction detailing the history of television, film production, radio, journalism and web-based media
•Opportunities for hands-on learning in a state-of-the art Television Center, both through classroom instruction and extracurricular participation in WRED-TV
•Courses in management, marketing and business administration as the industry values professionals with strong business skills.

So this is not a research-based university style course, it is a teaching style university and one of its stated objectives is "detailing the history of television, film production, radio, journalism and web-based media." Because it is an entire program and within this program one of the objectives is a detailed history, it makes sense to ask students to do a paper on the subject; perhaps even a big paper. But clearly the paper should focus on history as a way to learn the profession.

Ozzie is facing a challenge and looking for guidance on one part of the historical aspects; the evolution from film to digital. How to talk about the past when most of the students have never heard of the basics of the earlier technology. Presumably he will also teach other important aspects of the history about how to make a good film by examining the great films of the past.

But this remains an interesting question (the evolution of film to digital) that in a detailed history seems logical to ask a student to understand.

Alan

Yoshiko Okada
October 3rd, 2010, 12:45 AM
My major was psychology, and I studied how films influenced human minds.
I watched plenty of films, including a lot of silent films and some movies made in the Soviet Union, for my studies.
The silent films fascinated me very much though they didn't have colour or sound.
To be honest I prefer Murnau's or Stroheim's films to recent Hollywood movies.
They made impressive and exciting films in the '20s.

I am just an amateur at shooting videos or films.
But I may be able to tell you about the audience's mind.
At the present day a lot of people have their own video cameras.
Even children can have their own cameras, and can post their works on websites.
But most of them are boring.
Though they watch their subjects carefully when they shoot videos or films, few cameramen think about the audience.
I think Murnau and some directors in the silent era understood what the audience wanted to watch.
Their techniques impressed the audience, but their cameras didn't shoot outstanding scenes.
The cameras didn't shoot only the subjects, but also the views of the cameramen.

I guess most students studying at universities want to be professionals of video and film.
But instructors can teach them just the fundamentals and techniques.
I know one famous photographer who could ignore the audience.
He was Jacques Henri Lartigue.
His family was very rich, so he didn't need to earn money.
He took plenty of photographs and shot some short films, but they were just his hobbies.
He took those photos for himself, his family and his friends.
Though he didn't expect to show his works to a lot of people, they impress a lot of people now.
He had a gift, but few cameramen are similar to him.

I don't think the latest technology can improve anyone's technique.
Once they post their works on websites, there is a large audience.
They can't decide by themselves if their works are good or not.
They should understand the minds of the audience.

Now I feel frustrated because my English is terrible, so I can hardly tell you my opinion well.
I think shooting films doesn't mean just handling machines.
But instructors can't teach their students about human minds.
So I'd like a lot of students to watch a lot of films, including very old works, and learn what shooting is.

Adam Gold
October 3rd, 2010, 12:50 AM
I think your English has a lovely poetic flow, and you express opinion very well. Don't let anyone tell you your English is terrible.

Yoshiko Okada
October 3rd, 2010, 01:03 AM
Thanks, Adam.

Geoffrey Cox
October 3rd, 2010, 02:37 AM
Well said Adam; your English is perfectly clear and a pleasure to read, Yoshiko, and what you write echoes what I feel - learning about technique and practical skills is just the start. I teach in a Music dept. and concentrate on digital audio composition / sound design etc as well as historical topics, which I begin by talking about the very first mechanical recordings on tin-foil cylinders, so analogue tape is all a bit new-fangled for me! Maybe it's because making a film is a more complex business than working in sound alone, but we concentrate very much on generating ideas and aesthetic technique, which involves a good understanding of the evolution of an art form. But yes, we also have classes which are dominated by practical skills, though the context is never far away. Bear in mind that for generations universities taught almost nothing practical at all - it was just theory and history: for the ancient Greeks, Music was studied, but as pure theory only.

So for me, courses that concentrate too heavily on technical skills are unbalanced and are essentially a commercial enterprise - come to us and get skills to get a good job - rather than a university education. In the past in the UK this is what an apprenticeship would have been for, but they have gone as employers stopped paying for them, and so that sphere has moved into the university sector; hence the conflict in the idea of what we as lecturers are really for - a conflict that is apparent every day in my job, but not necessarily a bad one - we search continually for the balance.

Ultimately it isn't about what we teach though but about how much we can help the students to teach themselves.

Jim Andrada
October 3rd, 2010, 01:17 PM
Yoshiko - thank you for your cogent comments.

Interesting that Murnau had actually studied art history at university. I wonder if anyone was teaching "film" in those days? If they had, I can just imagine people complaining that studying still images was irrelevant to the "new" technology of film. Or that studying B&W was irrelevant to the new technology of color. Or that studying silent films was irrelevant to "talkies". Etc etc etc.

I couldn't agree more with Geoffrey that teaching of skills without teaching the principles and historical contexts is in some way short-changing the student. But at the same time there are students at university who are very much focused on marketable skills rather than theory and history. Should the university teach the principles and leave it to the students to learn how these principles relate to practical skills, or should the university teach the skills and leave it to the students to learn the principles? Personally I favor the former, but that's just me.

And in the end maybe it doesn't matter all that much.

Brian Drysdale
October 3rd, 2010, 02:04 PM
The mechanical skills are pretty useless unless you have common references with the creative people you are dealing with. Doing your own thing is fine, but communicating with a director can involve establishing visual (photographic, painted or graphic), dramatic or musical sources.

Some of the top film schools don't even include shooting anything until the 2nd year, so that the fundamentals are in place.

Chris Hurd
October 3rd, 2010, 03:19 PM
That's how it was at UT-Austin back in my day. Freshmen weren't allowed anywhere near the hardware.