October 11th, 2010, 11:04 PM | #76 |
New Boot
Join Date: Sep 2010
Location: Osaka, Japan
Posts: 22
Well, I think this topic originally started as a question of whether students who know only digital should also learn analogue.
So we haven't reached a conclusion? Basically, young people, especially those in their twenties and under, think old things are boring. Amazingly, the French director René Clair captured what young people were like in his film "Les Belles de nuit" in the 1950s. Most young people don't think they need knowledge of analogue because they find it boring. They generally like rock music but dislike the works of Bach or Felix Mendelssohn. Yet most of them have never actually listened to that music; they simply claim old music is boring even though they don't know it. I think there were many films in the past more interesting than recent movies, and they can help young people make new films. I don't care which medium is better, analogue or digital. But I believe masters should have knowledge of both analogue and digital.
__________________
I don't care for modern films -- all crashing cars and close-ups of people's feet... Lillian Gish
October 14th, 2010, 08:38 AM | #77 |
Space Hipster
Join Date: Jun 2003
Location: Jacksonville, FL
Posts: 1,596
I'm with Perrone on this one. Sure, you can give a brief - and I mean brief - history of TV and film production, but then it's time to roll up the sleeves and teach these kids real-world production techniques. One semester isn't nearly enough time to learn about all the new HD cameras and formats, editing systems, audio, etc. When it comes time for the students to get a job, they will need this info and hands-on experience. Why devote a lot of time talking about the past?
I'm sure a lot of us in this forum can remember our first break into the production world. Mine was at a local TV station. Do you think they cared that I knew who invented the radio? They just wanted to know how much I knew about lighting and working with video cameras, and then to check out my sample "reel" (there's a throwback term). All of my college activities - clubs, honor rolls, membership in the student senate, being in Who's Who in Colleges - made not a bit of difference to them. They just wanted to know how I could contribute to their TV productions. It was hands-on experience that got me my first job.
October 14th, 2010, 09:04 AM | #78 |
Inner Circle
Join Date: Feb 2006
Location: Belfast, UK
Posts: 6,152
It's not one or the other. They need to have the grammar of the medium, to have references, and to have knowledge not just for now, but the grounding that can allow them to use whatever bits of kit the future will bring.
A lot of those basics haven't changed; you just upgrade them as you progress. Unfortunately, as one BBC cameraman put it to me years ago, the technological aspects can be a lot more interesting than the content of many programmes.
October 14th, 2010, 06:27 PM | #79 |
Inner Circle
Join Date: Feb 2007
Location: Tucson AZ
Posts: 2,211
Well, I would think there would be more than a one-semester course in a four-year university program.

But I guess we keep coming back to a fundamental disagreement on the role of a university. Honestly speaking, I personally don't think it's the right place for a nuts-and-bolts course on the details of production using X or Y or Z software, workflow, etc. I really think it has to provide a broader focus.

As an example, about 100 years ago (well, actually 50 years ago - it just feels like 100) when I was in college, I took a music theory course. My major was in the physical sciences, so this was what we called a "distribution" requirement - i.e., you weren't allowed to take ONLY courses in your major. I didn't learn much about music theory, unfortunately, but I did learn a lot about how the music department operated. To start with, there were no courses in performance. Theory, history, etc., yes; performance, no. If you wanted to study performance you were encouraged to take courses at a nearby conservatory in parallel with the university courses. Or, as one of my friends who DID major in music put it, at our school music was treated like the proverbial "good child" - i.e., something to be seen but not heard! And if performance was your objective, you were also encouraged to take part in the many extracurricular musical groups about.

I didn't learn programming at school - I learned programming by working at a federal research lab in the summer, putting in 80 hours a week for 40 hours' pay (at the princely salary of $1 per hour). At school we focused on the theory of computation, mathematics, physics, etc. - in other words, the fundamental principles. At work we learned how to apply those fundamental principles to actually do something useful.

Anyhow, that's how I got into the computer "biz" 51 years ago, so maybe I'm biased towards thinking that the university's role isn't preparation for a job per se, but rather preparation of the student by laying the foundation that practical skills can be built upon. Maybe things are different today. EoR (End of Rant!)
October 14th, 2010, 11:30 PM | #80 |
New Boot
Join Date: Sep 2010
Location: Osaka, Japan
Posts: 22
Before Impressionism, some painters also painted landscapes.

But in the old days their paints dried quickly and could not easily be carried, so they mostly couldn't work outside their studios. Around Monet's time, paint in portable tubes was introduced, and painters could take their paints outdoors. It is a good example of a new technology or tool helping to make new art. But I would guess Monet and Renoir still studied the old masterpieces even though they could use the new paints, and that even Picasso learned the fundamental, older techniques of painting.

I am still studying English, and I have known many English teachers. Most of them taught English from textbooks alone. They could teach me recent topics and modern English; sometimes we used a computer and learned natural English from the Internet. But that was boring for me, and my English didn't improve at all. One day I met one instructor. He never taught me recent topics or the modern English that most young Americans use (I mean slang, etc.), but he taught me many words and phrases about world history and the fine arts. His way of teaching was unique and not modern, but it encouraged me. He helped me keep studying English. I still thank him, though I go to another school now for various reasons.

If we are talking about what instructors should teach their students, rather than which medium is better, analogue or digital, I want to claim that instructors should teach their students everything about shooting, including analogue. Studying doesn't mean knowing only modern things. A few decades from now, the modern things of the present day will likely seem very old. Maybe most students can learn some easy techniques by themselves, but they can't learn history without instructors. I repeat: instructors should teach students both analogue and digital. Then the students can choose which medium is better for them.
__________________
I don't care for modern films -- all crashing cars and close-ups of people's feet... Lillian Gish
October 15th, 2010, 12:43 AM | #81 |
Inner Circle
Join Date: Feb 2007
Location: Tucson AZ
Posts: 2,211
In the "old days" of computation, we all operated with such modern workflows as use of the slide rule, interpolation in tables of logarithms, all manner of self-checking techniques for validating the results of long (a week or more of full-time work) calculations on electro-mechanical devices, plugboard wiring, paper-tape punching, and writing neatly in little boxes on forms so someone could transform our programs into punched cards.

In fact, the term "computer" originally referred to the people who used such electric calculators. I still have a neat little autographed copy of a book on the calculation of the orbits of minor planets by Paul Herget - all manner of manual tricks and "workflows". Oh, and by the way, even after large-scale digital computers began to appear, there were still large-scale analogue systems in daily use. I remember that it was still faster to process instrumentation tapes initially on an analogue system, where you plugged together a variety of modular electrical components to perform the "calculations" before feeding the result into a digital system.

Fortunately, we learned the fundamental principles. Otherwise we would all have been rendered obsolete by today's digital wonders. Oh well, I guess it's the eternal battle of how to balance the teaching of "how" with the teaching of "why". Personally, I would have felt cheated if classes had emphasized the hows at the expense of the whys. YMMV
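The table workflow described above is easy to sketch in code. This is an illustrative reconstruction only, not anyone's historical procedure: the coarse table and the `interp_log10` helper are invented for the demo, and a real printed four-figure table was much finer grained.

```python
import bisect
import math

# A coarse table of common logarithms, standing in for the printed
# log tables mentioned above (one entry per 0.1 step from 1 to 10).
xs = [round(1.0 + 0.1 * i, 1) for i in range(91)]
logs = [math.log10(x) for x in xs]

def interp_log10(x):
    """Look up log10(x) by linear interpolation between adjacent
    table entries - the same hand procedure used with printed tables."""
    i = min(max(bisect.bisect_right(xs, x) - 1, 0), len(xs) - 2)
    frac = (x - xs[i]) / (xs[i + 1] - xs[i])
    return logs[i] + frac * (logs[i + 1] - logs[i])

# Multiplication the slide-rule / log-table way: add the logarithms,
# then take the antilog.
a, b = 2.34, 3.71
product = 10 ** (interp_log10(a) + interp_log10(b))
print(round(product, 3))  # close to the exact value 8.6814
```

Even with a table this coarse, linear interpolation recovers the product to within a few parts in ten thousand, which is why the hand technique stayed in daily use for so long.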
October 15th, 2010, 01:56 AM | #82 |
Inner Circle
Join Date: Feb 2006
Location: Belfast, UK
Posts: 6,152
I believe the term "computer" for the people doing the calculating even predates the electric calculator, going back to the days of manually making out mathematical tables. Charles Babbage came up with a mechanical "difference engine" to solve the problem of human error in the tables.
Charles Babbage - Wikipedia, the free encyclopedia
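The arithmetic the difference engine mechanized can be shown in a few lines. A minimal sketch, not engine-accurate: the function names are invented, but this is the method of finite differences, tabulating a polynomial by repeated addition alone, which is all the engine's mechanical registers could do.

```python
def difference_table(first_values):
    """Seed the registers (value, 1st difference, 2nd difference, ...)
    from the first few true values of a polynomial."""
    regs = []
    row = list(first_values)
    while row:
        regs.append(row[0])
        row = [b - a for a, b in zip(row, row[1:])]
    return regs

def crank(regs, steps):
    """Advance the engine: each turn adds every register into the one
    above it, producing the next tabulated value by addition only."""
    out = []
    regs = list(regs)
    for _ in range(steps):
        out.append(regs[0])
        for i in range(len(regs) - 1):
            regs[i] += regs[i + 1]
    return out

# Tabulate f(x) = x**2 + x + 41 (a prime-generating polynomial often
# associated with Babbage's demonstrations) from three seed values.
seed = [43, 47, 53]  # f(1), f(2), f(3)
print(crank(difference_table(seed), 6))  # [43, 47, 53, 61, 71, 83]
```

Once the registers are seeded, no multiplication is ever needed, and exactly this property made the method suitable both for rooms full of human computers and for Babbage's gears.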
October 15th, 2010, 04:13 PM | #83 |
Inner Circle
Join Date: Feb 2007
Location: Tucson AZ
Posts: 2,211
Thanks Brian! Yes, the term is much older than even I am. What I found so interesting about its use was that the author of the book would still acknowledge the original use of the term even at the beginning of the "non-human" (some would say "in-human") computer era.
October 15th, 2010, 04:24 PM | #84 | |
Trustee
Join Date: Jan 2004
Location: Scottsdale, AZ 85260
Posts: 1,538
Quote:
As to the current debate: one of the more troublesome aspects of all this time spent trying to figure out the value proposition between Camera A and Camera B - or even the aesthetic proposition between black and white and color - is that it typically distracts us from moving on to the more fundamental question of whether the video we're making has a justification for its existence at all.

In a theatre last night, I saw a trailer for the coming "True Grit" remake. Does that film actually need a modern remake? Sure, you can make it bloodier. Even, perhaps, more "realistic." But no matter how hard you try, you can NEVER make another movie that will transcend what the ORIGINAL True Grit was. It's not really a "movie" - it's a "John Wayne movie". And it needs to be considered as just that.

Infinitely WORSE - I heard they're making a "Hollywood" version of The Girl with the Dragon Tattoo. If EVER there was a movie that DIDN'T need a remake of its original, THIS has GOT to be it. Watching the Swedish version of TGWTDT (EVEN after reading the whole book series) is like the movie equivalent of hearing Bach for the first time. It doesn't NEED anyone to waste their time trying to do it better - it's essentially PERFECT as it is. Spend your time, people, doing something else. Please!

And at its heart, that's why the debate about digital versus analogue is so silly. If the work you're doing is valid and original and compelling, at best the medium and the approach and the camera will ALL become a small part of the whole - like whether you scored your original music for a string quartet or a brass quartet. What they'll remember is whether the MUSIC moved them. If that music is superb, it will be echoed in new arrangements forever. If not, four Stradivari or four pipe organs won't save it.
__________________
Classroom editing instructor? Check out www.starteditingnow.com - turnkey editor training content, including licensed training footage for classroom use.
October 15th, 2010, 07:28 PM | #85 | |
Trustee
Join Date: Jan 2005
Location: Healdsburg, California
Posts: 1,138
|
Quote:
My sense is that such an effort stands to compromise the harmonious elements that make the 2009 Swedish version such a moving and compelling masterpiece. -Jon
__________________
"Are we to go on record, sir, with our assertion that the 'pink hearts, yellow moons, orange stars, and green clovers' are, in point of fact, magically delicious?" - Walter Hollarhan before the House Subcommittee on Integrity in Advertising - May, 1974
October 16th, 2010, 02:14 AM | #86 |
Inner Circle
Join Date: Feb 2006
Location: Belfast, UK
Posts: 6,152
Hollywood has been making remakes (often inferior, sometimes better) for years. What they're after is a proven story that reduces their risk.
October 16th, 2010, 10:56 AM | #87 | |
Trustee
Join Date: Jan 2005
Location: Healdsburg, California
Posts: 1,138
|
Quote:
-Jon
__________________
"Are we to go on record, sir, with our assertion that the 'pink hearts, yellow moons, orange stars, and green clovers' are, in point of fact, magically delicious?" - Walter Hollarhan before the House Subcommittee on Integrity in Advertising - May, 1974
October 16th, 2010, 12:49 PM | #88 |
Inner Circle
Join Date: Feb 2006
Location: Belfast, UK
Posts: 6,152
|
What they do with the story after the various hands have been dipping into it is entirely another matter, of course.
October 16th, 2010, 01:58 PM | #89 |
Major Player
Join Date: Feb 2008
Location: Janetville Ontario Canada
Posts: 210
Perhaps the debate that began over how to teach young people about film - when they are not old enough to have any personal knowledge of "film" or film techniques or film technicalities, but instead are familiar only with digital imaging and its techniques and technicalities - is similar to the debate about cosmological "reality". Currently there are about five string theories, each of which correctly defines reality within its own framework but cannot define the reality in another string theory. Where they overlap, they continue to define reality, but each in its own way.

Model-dependent realities fit the film/digital debate. When film was defining reality in its way, it worked fine. Similarly, we are developing ways to use digital imaging to define reality, and that works too. The place where they overlap is when a digital videographer attempts to mimic the reality of film. Here the use of 24 frames per second, shallow depth of field, a slight graininess, and other techniques attempts to define film reality using digital reality. While it comes close to film, it never quite makes it out of the reality of digital.

A teacher showing classic films would have a hard time teaching young people to create the reality of film in a digital reality. The students might wonder why the teacher is trying to mimic a look and feel that is "no longer in existence." Certainly the story-telling qualities transcend the imaging techniques, and that is an important lesson from the days of film. But how would one explain the lure of a not-quite-as-accurate recording technique to a student, other than by describing something about film techniques and technicalities and then comparing them to digital? Two separate and distinct realities, perhaps? But both describing the same reality. One could surmise that the teacher is from one reality and the student from another, and while each is in the same reality, their perceptions of it are quite different, although the same.

The idea for this comment came from an article by Stephen Hawking and his colleague in the most recent Scientific American, "The Elusive Theory of Everything." In that article there is another facet that is similar. He compares Newtonian physics (the everyday physics with which we are familiar, and which we generally use to describe film and the effects of light on film) and quantum physics (which is essentially a series of on-states and off-states - similar to the digital 1s and 0s we currently use to define the images in our digital cameras and, increasingly, in our everyday lives). Even one of the thread headings here, "Photon Management", is quantum, not Newtonian, physics; a photon being a particle of light, not a wave form of light. Fun topic. Alan
October 16th, 2010, 11:40 PM | #90 |
Inner Circle
Join Date: Feb 2007
Location: Tucson AZ
Posts: 2,211
And what, after all, defines "digital" vs. "analogue"?
Is a wind-up watch really analogue? Or is it a digital (quantized) device with an analogue readout? If the watch operates with an escapement, then I think it is in reality a digital device, i.e., it works by counting (a digital operation, after all) oscillations and displaying the result on an analogue clock face as an angular position "analogous" to the time. But if it works off a little electric motor that spins at a fixed rate, then I think it would qualify as an analogue device, at least to my benighted way of thinking.

Well, one might say, "So what?" - as, regardless of whether the clock works by counting pulses or not, the result "looks the same", at least within the capability of the human eye to discern the difference. Hmm - sounds like a description of digital photography or video or audio, doesn't it? Digital processes displayed on an analogue device, practically indistinguishable from an analogue recording, at least in the sense that an average person probably couldn't reliably tell you which was which.

A crappy worthless documentary is a crappy worthless documentary, regardless of whether it's a crappy digital documentary or a crappy analogue documentary.

Ah yes, wave vs. particle reality, continuous reality vs. quantum reality. Great stuff.
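The escapement argument lends itself to a toy model. A sketch only - the tick rate and function names are invented for illustration: the "digital" hand advances by counting discrete beats, the "analogue" hand moves continuously, and at a glance the two read the same.

```python
TICKS_PER_SECOND = 5  # an assumed escapement rate of 5 beats per second

def escapement_angle(t_seconds):
    """Seconds-hand angle (degrees) for a ticking watch: the mechanism
    COUNTS discrete oscillations, so the display is quantized."""
    ticks = int(t_seconds * TICKS_PER_SECOND)       # counting = digital
    return (ticks / TICKS_PER_SECOND) * 6.0 % 360   # 6 degrees per second

def motor_angle(t_seconds):
    """Seconds-hand angle for a smooth synchronous motor: continuous."""
    return (t_seconds * 6.0) % 360

# On tick boundaries the two displays agree exactly...
print(escapement_angle(10.0), motor_angle(10.0))    # both read 60.0
# ...but between ticks the counted hand lags, which is the quantization.
print(escapement_angle(10.15), motor_angle(10.15))  # 60.0 vs ~60.9
```

The same point carries over to recording: once the tick rate (sample rate) is fine enough, the quantized and continuous readouts are indistinguishable to the eye, even though the underlying mechanisms differ in kind.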