DV Info Net

DV Info Net (https://www.dvinfo.net/forum/)
-   All Things Audio (https://www.dvinfo.net/forum/all-things-audio/)
-   -   Zoom H6 line out phase inversion problem (https://www.dvinfo.net/forum/all-things-audio/538486-zoom-h6-line-out-phase-inversion-problem.html)

Geoffrey Cox January 5th, 2022 12:10 PM

Zoom H6 line out phase inversion problem
 
I was testing my new Zoom H6 with a Panasonic S5 camera. I connected a microphone to the Zoom input and Zoom line out to the S5 line input so as to have a guide track. I made a recording (i.e. internally on the Zoom and externally via the Zoom line out to the S5) and synched things in post. I worked in Logic so I could drill down right to sample level to get the synch as good as possible.

To my surprise I noticed the audio recording on the S5 video file was clearly phase inverted. If I lined them up exactly I got quite close to complete phase cancellation. I could only assume this was a fault with the input of the S5 or the output of the Zoom. I assumed it would be the S5, but tested the Zoom line out on a different device (my laptop, using a Scarlett Solo audio interface). The phase was, again, inverted.

The only logical conclusion I can come to is that the Zoom H6 line out phase inverts the signal. This should not be happening. It could be a fault with my machine only but that seems unlikely. Could it be a design flaw?
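
A rough way to double-check this without test gear - a sketch only, assuming Python with numpy, scipy and soundfile installed, and with hypothetical file names - is to cross-correlate the Zoom's internal recording against the line-out capture and look at the sign of the correlation peak: a strong negative peak means one capture is polarity-inverted relative to the other.

Code:

# Rough polarity check between the H6's internal file and the line-out capture.
# Assumes both files cover the same take at the same sample rate; names are hypothetical.
import numpy as np
import soundfile as sf
from scipy.signal import correlate

ref, sr = sf.read("zoom_internal.wav")       # file copied from the H6 card
cap, sr2 = sf.read("lineout_capture.wav")    # same take recorded via the line out
assert sr == sr2, "resample one file first if the rates differ"

# Reduce to mono and compare only the first ~10 seconds to keep it fast
if ref.ndim > 1: ref = ref[:, 0]
if cap.ndim > 1: cap = cap[:, 0]
n = min(len(ref), len(cap), 10 * sr)
ref = ref[:n] - ref[:n].mean()
cap = cap[:n] - cap[:n].mean()

xc = correlate(cap, ref, mode="full", method="fft")
peak = int(np.argmax(np.abs(xc)))
lag = peak - (n - 1)                         # offset of the capture relative to the reference
print(f"offset: {lag} samples ({1000 * lag / sr:.2f} ms)")
print("capture looks polarity-inverted" if xc[peak] < 0 else "same polarity")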

I would be surprised if anyone else had noticed this but any thoughts very welcome.

Patrick Tracy January 5th, 2022 01:13 PM

Re: Zoom H6 line out phase inversion problem
 
It might be a simple matter of cost, using fewer components in the path to the line out.

Greg Miller January 6th, 2022 03:08 AM

Re: Zoom H6 line out phase inversion problem
 
If I understand this correctly, the electrical polarity of the Zoom output seems to be reversed with respect to the polarity of a file recorded on the Zoom. I believe you're running 3.5mm TRS output to 3.5mm TRS input, so that pretty much rules out cable wiring issues.

If that's the case, then I can see two possible scenarios: either

(A) the Zoom file polarity is correct with respect to the Zoom input, and the Zoom line output is reverse polarity,

or

(B) the Zoom file polarity is reversed with respect to the Zoom input, and the Zoom line output is correct with respect to the Zoom input.

Time to get out the oscilloscope and a signal generator (or reference test source).
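
A scope isn't strictly needed for the polarity half of that test: an asymmetric test file (positive-only blips) makes polarity obvious in any waveform display. A minimal sketch, assuming Python with numpy and soundfile; the output file name is hypothetical.

Code:

# Generate a 5-second file of positive-going blips. Play it into the H6, record it
# internally and via the line out, and simply check whether the blips point up or
# down in each capture.
import numpy as np
import soundfile as sf

sr = 48000
sig = np.zeros(5 * sr)
blip = 0.9 * np.sin(np.linspace(0.0, np.pi, 48))   # one positive half-cycle, about 1 ms
for start in range(0, len(sig) - len(blip), sr // 2):
    sig[start:start + len(blip)] = blip            # a blip every half second
sf.write("polarity_test.wav", sig, sr)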

Don Palomaki January 6th, 2022 08:50 AM

Re: Zoom H6 line out phase inversion problem
 
What kind of mic did you connect to the Zoom, and how was it connected?

The 3.5mm Zoom output jack and the 3.5mm mic input jacks on both devices appear to be unbalanced stereo.

Phase reversal in one channel, with equal audio in the other channel, can happen when a balanced source is connected to left and right unbalanced inputs. Without getting into other possible configuration options on the Zoom, one example: a balanced mic connected to the MIC/LINE input jack would likely record with reversed phase on one channel.
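
For what it's worth, that specific symptom (the same audio on both channels with one leg flipped) is easy to spot numerically. A rough sketch, assuming Python with numpy and soundfile and a hypothetical file name:

Code:

# If L and R carry the same signal with opposite polarity, their correlation is
# strongly negative and the mono sum nearly nulls.
import numpy as np
import soundfile as sf

audio, sr = sf.read("suspect_stereo.wav")          # hypothetical stereo recording
left, right = audio[:, 0], audio[:, 1]
r = np.corrcoef(left, right)[0, 1]
sum_peak = np.max(np.abs(left + right))
ref_peak = max(np.max(np.abs(left)), 1e-12)
print(f"L/R correlation: {r:+.3f}, mono-sum peak vs L peak: {sum_peak / ref_peak:.3f}")
# r near -1 together with a tiny mono sum points at one leg being polarity-flipped.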

Geoffrey Cox January 6th, 2022 03:14 PM

Re: Zoom H6 line out phase inversion problem
 
I tried different mics.

First I tried a mono mic (Rode), then a 2x DPA 4061 stereo pair (linked in the H6). The Rode was connected via a 3.5mm TRS to XLR adapter, the DPAs via their own microdot to XLR adapters. So all mics were going into the Zoom via XLR.

The results were the same: the mono mic was phase inverted, and with the stereo pair both channels were inverted.

Using a phase inverter plugin solved the issue and with the stereo pair I had to invert both channels.

And yes, I was using a 3.5mm TRS lead.

Geoffrey Cox January 6th, 2022 03:18 PM

Re: Zoom H6 line out phase inversion problem
 
Quote:

Originally Posted by Greg Miller (Post 1966758)
If I understand this correctly, the electrical polarity of the Zoom output seems to be reversed with respect to the polarity of a file recorded on the Zoom. I believe you're running 3.5mm TRS output to 3.5mm TRS input, so that pretty much rules out cable wiring issues.

If that's the case, then I can see two possible scenarios: either

(A) the Zoom file polarity is correct with respect to the Zoom input, and the Zoom line output is reverse polarity,

or

(B) the Zoom file polarity is reversed with respect to the Zoom input, and the Zoom line output is correct with respect to the Zoom input.

Time to get out the oscilloscope and a signal generator (or reference test source).

Indeed, though I had not thought of scenario B, which I would have thought would be a much more basic fault. Unfortunately the oscilloscope test is beyond my capabilities!

Greg Miller January 6th, 2022 11:15 PM

Re: Zoom H6 line out phase inversion problem
 
I'm glad you seem to have found a solution, but I'm a bit puzzled.

Where did you use the phase inverter? At the mic input of the Zoom? Or somewhere else?

Geoffrey Cox January 7th, 2022 03:49 AM

Re: Zoom H6 line out phase inversion problem
 
I used a phase inversion plugin on one of the tracks in the audio software (Logic). In fact it is part of a general gain plugin.

In many ways the whole problem is not that important, as it is unlikely I would use both the Zoom and S5 audio tracks together; I would stick with the better-quality, multitrack Zoom audio unless something had gone wrong. I just think it's bad that there is this obvious design fault (or whatever it is). Where it is awkward is that it is trickier to synch to the guide track when it is visually inverted - the phase inversion plugin does not alter it visually, as presumably it only affects the outgoing signal.
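
If the visual mismatch is the main nuisance, one workaround - a sketch only, assuming Python with soundfile and hypothetical file names - is to flip the camera track's polarity in the file itself before importing, so the waveform display lines up with the Zoom track:

Code:

# Polarity inversion is just negation of the samples, so writing an inverted copy
# of the guide track makes the waveform display match as well.
import soundfile as sf

audio, sr = sf.read("s5_guide_track.wav")            # hypothetical camera audio export
sf.write("s5_guide_track_flipped.wav", -audio, sr)   # inverted copy for visual syncing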

Patrick Tracy January 7th, 2022 02:31 PM

Re: Zoom H6 line out phase inversion problem
 
Interesting. In Vegas Pro, which is a complete DAW as well as an NLE, there's a polarity invert switch right on the channel control, and it does flip the waveform. There's no need for a plugin. I'm kind of shocked that Logic doesn't have that kind of functionality.

Geoffrey Cox January 7th, 2022 02:39 PM

Re: Zoom H6 line out phase inversion problem
 
Quote:

Originally Posted by Patrick Tracy (Post 1966766)
Interesting. In Vegas Pro, which is a complete DAW as well as an NLE, there's a polarity invert switch right on the channel control, and it does flip the waveform. There's no need for a plugin. I'm kind of shocked that Logic doesn't have that kind of functionality.

It's a good point. I will double check but don't think there is a simple reverse polarity switch.

Paul R Johnson January 7th, 2022 04:57 PM

Re: Zoom H6 line out phase inversion problem
 
Can you just flip it in the audio editor? When aligning audio tracks for sync, I never get close enough for this to ever be a problem. I assume that when you look at the zoomed-in waveform you are seeing one waveform go positive on its first transient leading edge and the other one go negative? My guess is it's just a design feature. Have you actually tested the E-E of the Zoom, as in input to output, to see if the Zoom is the same polarity in as it is out? I'm not sure it's that critical, but if the output is reversed relative to the input that is poor design. If the in to out is the same, I doubt it matters too much.

Greg Miller January 7th, 2022 11:02 PM

Re: Zoom H6 line out phase inversion problem
 
Paul, I suggested testing the Zoom in/out polarity in post #3. The OP said, in post #6, that he didn't have that capability.

Paul R Johnson January 8th, 2022 06:29 AM

Re: Zoom H6 line out phase inversion problem
 
I was thinking more of checking a file recorded on the card, ingested into a computer, to see if the first data was positive-going compared to the same signal captured via the audio out or USB out. Ideally they'd all be the same, but probably not.

Thinking about this, I'm not sure it makes any actual difference practically, as I don't think I've ever zoomed in close enough to have ever noticed. After all, the camera audio won't ever cancel completely as the mics are in different places. Plus there's no point being more accurate than the frame rate, so minute shifts which are possible in an audio editor aren't usually possible in the video editor, locked to frames.

I edit in Premiere, and it's simple to pop into Audition to do individual sample shifts, but I've never had to?

Geoffrey Cox January 8th, 2022 08:43 AM

Re: Zoom H6 line out phase inversion problem
 
Quote:

Originally Posted by Paul R Johnson (Post 1966770)
I was thinking more of checking a file recorded on the card, ingested into a computer, to see if the first data was positive-going compared to the same signal captured via the audio out or USB out. Ideally they'd all be the same, but probably not.

Thinking about this, I'm not sure it makes any actual difference practically, as I don't think I've ever zoomed in close enough to have ever noticed. After all, the camera audio won't ever cancel completely as the mics are in different places. Plus there's no point being more accurate than the frame rate, so minute shifts which are possible in an audio editor aren't usually possible in the video editor, locked to frames.

I edit in Premiere, and it's simple to pop into Audition to do individual sample shifts, but I've never had to?

If I understand you correctly, this is what I already described above. I compared the recording made straight to the card in the Zoom device (ingested into a computer via the card - my laptop has a card reader built in) to the same simultaneous recording fed out via the line out of the Zoom. First I tried the line out plugged into my camera and noticed the phase inversion, so I assumed the camera input would be at fault, but then tried a second test with the Zoom line out going into my laptop via an audio interface. The result was the same - an inverted signal - so I had to assume that it was the Zoom line out that was the problem. The actual phase cancellation that resulted makes it clear this is not just some issue with the waveform depiction but an actual inversion.

When it comes to how much one needs that kind of close synch, it is a good point, but with dialogue it does need to be pretty close, and I have sometimes found frame-only accuracy is not enough, so I would finalise things in an audio editor (Logic). I do a lot of fine-tuned audio work anyway and find the limitations of such work in a video NLE (Premiere in my case) frustrating.

Don Palomaki January 8th, 2022 09:00 AM

Re: Zoom H6 line out phase inversion problem
 
On audio sync to video: light travels much faster than sound, about 186,000 miles per second vs. roughly 1,100 ft/sec. So audio lagging video slightly corresponds to standing back a bit further from the source; our brains process it. However, audio leading video is unnatural.
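
The numbers behind that point are easy to sanity-check - a small sketch, assuming roughly 343 m/s for sound and a couple of example frame rates:

Code:

# Audio lag corresponding to standing a given distance from the source,
# expressed in milliseconds and in video frames.
SPEED_OF_SOUND = 343.0  # m/s, approximate room-temperature value

for distance_m in (3, 10, 30):
    delay_ms = 1000.0 * distance_m / SPEED_OF_SOUND
    frames = ", ".join(f"{delay_ms * fps / 1000:.1f} frames at {fps} fps" for fps in (25, 50))
    print(f"{distance_m:>2} m away -> sound arrives {delay_ms:5.1f} ms late ({frames})")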

FWIW: Phase inversion and delays in one channel of stereo sound compared to the other channel can affect the spatial perception of the sound. You can demonstrate it with loudspeakers by reversing the +/- connections on one side and listening. With proper phase you can generally pinpoint instruments; with one side flipped in phase the same instrument may seem to wander in space.

Geoffrey Cox January 8th, 2022 09:20 AM

Re: Zoom H6 line out phase inversion problem
 
Quote:

Originally Posted by Don Palomaki (Post 1966772)
On audio sync to video: light travels much faster than sound, about 186,000 miles per second vs. roughly 1,100 ft/sec. So audio lagging video slightly corresponds to standing back a bit further from the source; our brains process it. However, audio leading video is unnatural.

'Unnatural'? But video is unnatural in the first place; it is all an artifice. I work on many occasions with audio leading video. It is called art, and being 'unnatural' is half of what art is about. We may receive the image quicker than the sound at our senses, but that is not the same as our brain registering it as such. As has been pointed out many times, sound is more emotive than image as it is less definite and has porous boundaries (and it is speculated that for the fight-or-flight survival of early man it was more important, as you could hear danger well before seeing it), so in fact it is often the sound that people notice first, and it is what stays with them long after the images have faded.

Greg Miller January 8th, 2022 09:40 AM

Re: Zoom H6 line out phase inversion problem
 
IMHO video displayed on a screen is no more natural or unnatural than audio reproduced by a loudspeaker. However, I agree with Don (and it's an indisputable fact): in the natural world (no recording or playback involved) it is inevitable that sound will reach an observer somewhat later than the light from the corresponding visual event; the reverse *never* happens.

I have observed, in any lip-sync combination, that sound-early playback is much more noticeable and objectionable than sound-late playback. (e.g. people sitting in the back row of a 100-foot-long theatre don't complain that the sound is out of sync)

The comments about polarity-flipped stereo, while true, seemingly don't apply to the OP's workflow as he describes it.

Glancing at the Zoom manual, there seem to be some menu-driven mixers. I wonder whether there is some configuration setting in one of those mixers which might be causing the observed problem.

Patrick Tracy January 8th, 2022 10:56 AM

Re: Zoom H6 line out phase inversion problem
 
Quote:

Originally Posted by Paul R Johnson (Post 1966770)
Thinking about this, I'm not sure it makes any actual difference practically, as I don't think I've ever zoomed in close enough to have ever noticed. After all, the camera audio won't ever cancel completely as the mics are in different places. Plus there's no point being more accurate than the frame rate, so minute shifts which are possible in an audio editor aren't usually possible in the video editor, locked to frames.

I edit audio and video in Vegas Pro. In that software you can unlock audio from video and move it down to the sample level, which I do routinely.

In a project I'm working on right now, I recorded a band on four cameras and a Zoom H5. The Zoom is getting a feed from the PA plus the onboard mics are capturing sound at the lip of stage, the combination of those giving a fair to very good representation of the live sound. I most definitely slide the direct feed to time align it with the onboard mic track. I usually do the audio in a separate project and drop rendered songs into the video editing project where I again slide it around to make it look correct. I don't just align it to the camera audio because that's often a frame or two off, which I can see. The audio definitely needs to be placed with sub-frame accuracy to look right.
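
The "slide the direct feed to time-align it" step can also be estimated numerically rather than by eye. A rough sketch, assuming Python with numpy, scipy and soundfile, and hypothetical file names for the board feed and the onboard-mic track from the same take:

Code:

# Estimate how far the onboard-mic track lags the board feed, then pad or trim
# the mic track so the two line up at the sample level.
import numpy as np
import soundfile as sf
from scipy.signal import correlate

board, sr = sf.read("board_feed.wav")
stage, _ = sf.read("onboard_mics.wav")
if board.ndim > 1: board = board.mean(axis=1)
if stage.ndim > 1: stage = stage.mean(axis=1)

n = min(len(board), len(stage), 30 * sr)                  # compare the first 30 s
xc = correlate(stage[:n], board[:n], mode="full", method="fft")
lag = int(np.argmax(np.abs(xc))) - (n - 1)                # positive: mics arrive later
print(f"onboard mics lag the board feed by {lag} samples ({1000 * lag / sr:.1f} ms)")

if lag > 0:
    aligned = np.concatenate([stage[lag:], np.zeros(lag)])           # advance the mic track
else:
    aligned = np.concatenate([np.zeros(-lag), stage])[:len(stage)]   # or delay it
sf.write("onboard_mics_aligned.wav", aligned, sr)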

Charlie Ross January 8th, 2022 08:00 PM

Re: Zoom H6 line out phase inversion problem
 
Patrick, you are describing exactly my live music workflow. Every bit of it.

None of it helps the OP in his present predicament but it's good to read that at least one other person thinks somewhat like I do. Wait until I can tell my friends!!! LOL.

I can't find a service manual for the Zoom to appraise what might be going on in the signal path, and I would just "get around" the inversion in software anyway, since I am already tinkering on the timeline in detail.
Cheers.

Patrick Tracy January 8th, 2022 10:41 PM

Re: Zoom H6 line out phase inversion problem
 
Quote:

Originally Posted by Charlie Ross (Post 1966782)
Patrick, you are describing exactly my live music workflow. Every bit of it.

None of it helps the OP in his present predicament but it's good to read that at least one other person thinks somewhat like I do. Wait until I can tell my friends!!! LOL.

I can't find a service manual for the Zoom to appraise what might be going on in the signal path, and I would just "get around" the inversion in software anyway, since I am already tinkering on the timeline in detail.
Cheers.

That's encouraging. I was really responding to the "audio can only be moved by full frames" idea.

I think it's probably as simple as economics. My guess is that the circuit that was good enough to get it done left the signal inverted at the line output. Inverting back to "normal" would have needed one more op amp stage, which would have added to the cost and almost never been noticed. I don't think I've ever used the line out on my Zoom.

Geoffrey Cox January 9th, 2022 05:40 AM

Re: Zoom H6 line out phase inversion problem
 
Quote:

Originally Posted by Patrick Tracy (Post 1966776)
I edit audio and video in Vegas Pro. In that software you can unlock audio from video and move it down to the sample level, which I do routinely.

In a project I'm working on right now, I recorded a band on four cameras and a Zoom H5. The Zoom is getting a feed from the PA plus the onboard mics are capturing sound at the lip of stage, the combination of those giving a fair to very good representation of the live sound. I most definitely slide the direct feed to time align it with the onboard mic track. I usually do the audio in a separate project and drop rendered songs into the video editing project where I again slide it around to make it look correct. I don't just align it to the camera audio because that's often a frame or two off, which I can see. The audio definitely needs to be placed with sub-frame accuracy to look right.

Indeed - sub-frame accuracy is needed. It is one of my frustrations that video NLEs do not tend to allow you to shift audio at the subframe level (once the frame rate has been fixed). In my sort of work it is actually timbre that is a major factor here. Aligning audio tracks right down to sample-level accuracy may not be audible in the sense that one can tell the time difference of a few samples, but you can hear the difference because the two sounds interact with each other differently. After all, two instances of the same sound misaligned by a few samples can easily be detected by the ear via the timbre shift.
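
The timbre shift described here is essentially comb filtering: summing a signal with a copy offset by d samples notches frequencies at odd multiples of sr/(2d). A few illustrative numbers, assuming 48 kHz material:

Code:

# Notch frequencies produced by summing a signal with a copy delayed by d samples:
# the first notch sits at sr / (2 * d) and they repeat every sr / d above that.
sr = 48000
for d in (2, 5, 10, 20):
    print(f"{d:>2}-sample offset ({1000 * d / sr:.2f} ms): "
          f"first notch near {sr / (2 * d):,.0f} Hz, then every {sr / d:,.0f} Hz")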

Rick Reineke January 9th, 2022 12:39 PM

Re: Zoom H6 line out phase inversion problem
 
Every version of Vegas Pro allows moving audio (and video) down to the sample level, unless it is locked to the frame level. VP also has a substantial DAW built in; in fact, Vegas was originally an (audio-only) DAW before video editing was added. On the down side, it lacks MIDI instruments and does not support VST3 audio plug-ins or side-chaining.

Patrick Tracy January 9th, 2022 12:45 PM

Re: Zoom H6 line out phase inversion problem
 
Quote:

Originally Posted by Geoffrey Cox (Post 1966785)
Indeed - sub-frame accuracy is needed. It is one of my frustrations that video NLEs do not tend to allow you to shift audio at the subframe level (once the frame rate has been fixed). In my sort of work it is actually timbre that is a major factor here. Aligning audio tracks right down to sample-level accuracy may not be audible in the sense that one can tell the time difference of a few samples, but you can hear the difference because the two sounds interact with each other differently. After all, two instances of the same sound misaligned by a few samples can easily be detected by the ear via the timbre shift.

Yep, I use a coincident pair for drum overheads so I have a point to which I can time align all the close mics. It's a similar process to aligning a room mic with a direct PA feed for a live concert recording. The difference is subtle but worth the effort. Control over polarity is important in these contexts, which is why it's nice that Vegas Pro has a button on the channel controls and inverts the waveform display accordingly. And you can do all this right inside a video editing project. These features might be handy for the OP.

Vegas Pro has a "Quantize to frames" check box. I always have it on for video editing and off for audio mixing.

Geoffrey Cox January 9th, 2022 01:28 PM

Re: Zoom H6 line out phase inversion problem
 
Quote:

Originally Posted by Patrick Tracy (Post 1966776)
I edit audio and video in Vegas Pro. In that software you can unlock audio from video and move it down to the sample level, which I do routinely.

In a project I'm working on right now, I recorded a band on four cameras and a Zoom H5. The Zoom is getting a feed from the PA plus the onboard mics are capturing sound at the lip of stage, the combination of those giving a fair to very good representation of the live sound. I most definitely slide the direct feed to time align it with the onboard mic track. I usually do the audio in a separate project and drop rendered songs into the video editing project where I again slide it around to make it look correct. I don't just align it to the camera audio because that's often a frame or two off, which I can see. The audio definitely needs to be placed with sub-frame accuracy to look right.

Quote:

Originally Posted by Patrick Tracy (Post 1966788)
Yep, I use a coincident pair for drum overheads so I have a point to which I can time align all the close mics. It's a similar process to aligning a room mic with a direct PA feed for a live concert recording. The difference is subtle but worth the effort. Control over polarity is important in these contexts, which is why it's nice that Vegas Pro has a button on the channel controls and inverts the waveform display accordingly. And you can do all this right inside a video editing project. These features might be handy for the OP.

Vegas Pro has a "Quantize to frames" check box. I always have it on for video editing and off for audio mixing.

I have heard of Vegas of course but never explored it. Went from FCP (original) to Premiere. I've used various audio sequencers over the years but for all its faults, the depth and breadth of Logic is still very good for the price of the thing.

It does sound though as if I should check out Vegas as my work is basically sound-led and I am a musician at heart.

Patrick Tracy January 9th, 2022 01:37 PM

Re: Zoom H6 line out phase inversion problem
 
Quote:

Originally Posted by Geoffrey Cox (Post 1966789)
I have heard of Vegas of course but never explored it. Went from FCP (original) to Premiere. I've used various audio sequencers over the years but for all its faults, the depth and breadth of Logic is still very good for the price of the thing.

It does sound though as if I should check out Vegas as my work is basically sound-led and I am a musician at heart.

I'm probably biased because I've been using this line of software for the better part of two decades, but it definitely does certain things really well. It is worth noting some of the limitations mentioned above. Also, there seems to be a trend of people getting frustrated with it and moving to other software, like Resolve, but I haven't had any of the trouble others have had.

For me the real advantage is that it's very effective with audio and video. I don't have to switch software to work on audio for a video project, though I do often have a separate project for the audio. You can actually nest one Vegas Pro project as a track in another Vegas Pro project.

Don Palomaki January 9th, 2022 01:37 PM

Re: Zoom H6 line out phase inversion problem
 
Quote:

Originally Posted by Geoffrey Cox (Post 1966773)
'Unnatural'? But video is unnatural in the first place, it is all an artifice...

Care to dolly-zoom anyone?

Unnatural, meaning not as it is found in nature. Recorded audio is colored by the transducers used (mic and speakers), the A/D and D/A converters, not to mention recording and listening environment acoustics vis-a-vis the transducers, audio effects processors, and so on. A lot of artistic effects are "unnatural", driven by an intent to provide sensory experiences different from what mother nature provides.

With a single audio source file sub-frame audio shifts are of less significance than when dealing with a mix of multiple sources.

What counts ultimately is what the intended viewer/listener (e.g., paying customer) sees and hears as compared to their expectation.

Geoffrey Cox January 9th, 2022 03:15 PM

Re: Zoom H6 line out phase inversion problem
 
Quote:

Originally Posted by Don Palomaki (Post 1966791)
Care to dolly-zoom anyone?

Unnatural, meaning not as it is found in nature. Recorded audio is colored by the transducers used (mic and speakers), the A/D and D/A converters, not to mention recording and listening environment acoustics vis-a-vis the transducers, audio effects processors, and so on. A lot of artistic effects are "unnatural", driven by an intent to provide sensory experiences different from what mother nature provides.

With a single audio source file sub-frame audio shifts are of less significance than when dealing with a mix of multiple sources.

What counts ultimately is what the intended viewer/listener (e.g., paying customer) sees and hears as compared to their expectation.

Are you trying to claim video (and film) is not also 'coloured' by the technical processes needed to make it happen? If anything video is less transparent than audio - colours are almost never really true to what you are looking at, for example, whereas a good audio recording can sound almost exactly like what you were just hearing. Also, the post-processing of video can be just as 'unnatural' as audio. Virtually every mainstream film I watch looks unnatural to my eye and nothing like 'reality'. Bazin's idea that the photograph is a window on the world is false, and increasingly so, yet we find ourselves in a world where, for example, CGI is considered something that looks good, whereas it is in fact horribly artificial; but we suspend our disbelief - that is how cinema works - we know and see it is all unnatural, but get over it.

What counts ultimately is not fulfilling the audience's expectations but doing the exact opposite.

Greg Miller January 9th, 2022 04:09 PM

Re: Zoom H6 line out phase inversion problem
 
My lunch was 'coloured' by the technical processes needed to create it. Certainly the french fries didn't taste like raw potato, and god only knows what that Big Mac was before processing.

Don Palomaki January 10th, 2022 05:16 PM

Re: Zoom H6 line out phase inversion problem
 
Quote:

Are you trying to claim video (and film) is not also 'coloured' by the technical processes needed to make it happen?

good audio recording can sound almost exactly like what you were just hearing...

What counts ultimately is not fulfilling the audience's expectations but doing the exact opposite.
No. I'm saying audio recordings are also colored. And "almost" can cover a lot of territory.

And no recording, audio or video, is an exact representation of the live event as experienced by the viewer/listener in his/her seat.

This all brings to mind the ads for AR speakers (Acoustic Research) from the 1960s: behind the stage curtain, an AR3a vs. a live performer. Point source vs. point source in a large venue and the best of gear. But a far cry from a pair of speakers vs. a 90-piece orchestra.

And a lot of current music is highly engineered - about the only purely acoustic instruments are the drum set and voice, and even they are processed, reverbed, and compressed to satisfy the artistic intents of whoever is making the final calls on the released work.

Is it bad? (A matter of taste.) Is it wrong? (It's art, so wrong is a matter of opinion.)

Except for a few wackos and those in chains, an unhappy audience is unlikely to come back for more.

Andrew Smith January 10th, 2022 10:39 PM

Re: Zoom H6 line out phase inversion problem
 
Further, the quality of one's (remaining) hearing will also colour the perception of the sound.

Andrew

Paul R Johnson January 14th, 2022 03:30 AM

Re: Zoom H6 line out phase inversion problem
 
I’m really struggling to get this time alignment accuracy quest. Audio is so rarely time aligned to anything near the accuracy needed to cause an accidental polarity reversal to wreck things unless you are trying to null out audio, and with multiple mics this is usually impossible anyway. Time alignment is wrecked totally at that precision when we are looking at, say, a band on stage. Anybody who attends real concerts hears delays as part of the ‘sound’. I suspect the reason drums evolved into centre stage is to keep the snare drum in time for the majority. A few of my videos have a close camera on the drummer, and sync-wise the audio at the furthest camera can be quite delayed, so syncing to a snare downbeat that can be seen is easy. I find that in Premiere the close cam must be spot on, but if frame-locked audio sliding makes that point impossible for the distant cams, I just have those a tad late, never early, and it works. I could easily pop into Audition and tweak the timing but it’s just not important. When we work with more than 8m distance, as in the concert halls and churches I often work in, I’m not sure trying to correct it works. Organs are the biggest culprit. Not all of a rank is in the same place. Often over the years they’ve been extended, and between a B and the next C the actual pipes could be a long way apart and even facing different directions, as in along the church or across. This is the real world and I’m sure this all helps the stereo field to sound realistic.

If you are shooting a string quartet you have one angle where the violin should arrive first as it is closest. On the other side the cello should arrive first, and the wide shot on centre should have them the same - if you were there. In the video your audio could be A/B, a bit rare nowadays, but more likely a variant of X/Y. Maybe the best solution is for sound to be delayed a tad compared to picture to create depth?

I’ve never seen a commercial orchestral concert with big delays to simulate a person at the back, but equally on the huge close miked orchestras we see now, the close mics all get processed to add realism, usually reverb and small delays.

The question is the old one. For the viewer, when is NOW?

Geoffrey Cox January 14th, 2022 07:13 AM

Re: Zoom H6 line out phase inversion problem
 
Quote:

Originally Posted by Don Palomaki (Post 1966797)
No. I'm saying audio recordings are also colored. And "almost" can cover a lot of territory.

And no recording, audio or video, is an exact representation of the live event as experienced by the viewer/listener in his/her seat.

This all brings to mind the ads for AR speakers (Acoustic Research) from the 1960s: behind the stage curtain, an AR3a vs. a live performer. Point source vs. point source in a large venue and the best of gear. But a far cry from a pair of speakers vs. a 90-piece orchestra.

And a lot of current music is highly engineered - about the only purely acoustic instruments are the drum set and voice, and even they are processed, reverbed, and compressed to satisfy the artistic intents of whoever is making the final calls on the released work.

Is it bad? (A matter of taste.) Is it wrong? (It's art, so wrong is a matter of opinion.)

Except for a few wackos and those in chains, an unhappy audience is unlikely to come back for more.

My point about the last bit is not to dissatisfy the audience but to delight them by doing something they don't expect. I think all really great art does this in one form or another.

Geoffrey Cox January 14th, 2022 07:19 AM

Re: Zoom H6 line out phase inversion problem
 
Quote:

Originally Posted by Paul R Johnson (Post 1966816)
I’m really struggling to get this time alignment accuracy quest. Audio is so rarely time aligned to anything near the accuracy needed to cause an accidental polarity reversal to wreck things unless you are trying to null out audio, and with multiple mics this is usually impossible anyway. Time alignment is wrecked totally at that precision when we are looking at, say, a band on stage. Anybody who attends real concerts hears delays as part of the ‘sound’. I suspect the reason drums evolved into centre stage is to keep the snare drum in time for the majority. A few of my videos have a close camera on the drummer, and sync-wise the audio at the furthest camera can be quite delayed, so syncing to a snare downbeat that can be seen is easy. I find that in Premiere the close cam must be spot on, but if frame-locked audio sliding makes that point impossible for the distant cams, I just have those a tad late, never early, and it works. I could easily pop into Audition and tweak the timing but it’s just not important. When we work with more than 8m distance, as in the concert halls and churches I often work in, I’m not sure trying to correct it works. Organs are the biggest culprit. Not all of a rank is in the same place. Often over the years they’ve been extended, and between a B and the next C the actual pipes could be a long way apart and even facing different directions, as in along the church or across. This is the real world and I’m sure this all helps the stereo field to sound realistic.

If you are shooting a string quartet you have one angle where the violin should arrive first as it is closest. On the other side the cello should arrive first, and the wide shot on centre should have them the same - if you were there. In the video your audio could be A/B, a bit rare nowadays, but more likely a variant of X/Y. Maybe the best solution is for sound to be delayed a tad compared to picture to create depth?

I’ve never seen a commercial orchestral concert with big delays to simulate a person at the back, but equally on the huge close miked orchestras we see now, the close mics all get processed to add realism, usually reverb and small delays.

The question is the old one. For the viewer, when is NOW?

In my case the specific project is simply a person talking to camera (reciting poetry to be precise) or at least having the camera pointing directly at them so sensitivity to synch is high. But my main reason for wanting to go finer than frame rate is the point I made about timbre upthread - this is definitely affected by alignment finer than frame rate.

To be clear, the phase inversion happening with the Zoom is more anecdotal than a real problem as such.

Don Palomaki January 14th, 2022 08:55 AM

Re: Zoom H6 line out phase inversion problem
 
Mixing multiple mics of the same material in the same audio track can result in comb effect cancellations and they would be sensitive to time alignment. Room acoustics can add to this. Proper mic placement can reduce this issue.

Greg Miller January 14th, 2022 10:00 AM

Re: Zoom H6 line out phase inversion problem
 
Quote:

Originally Posted by Paul R Johnson (Post 1966816)
Audio is so rarely time aligned to anything near the accuracy needed to cause an accidental polarity reversal

Time (mis)alignment can cause a polarity reversal only with a symmetrical waveform of fixed amplitude, and only at some particular frequency when t = n * (period of f) / 2, where n is an odd integer. (At other frequencies it will create a comb filter.)

Time misalignment will never cause a polarity reversal of something asymmetrical, such as a rim shot, clap of a slate, etc. (In the case of transients like those, misalignment will just cause a short same-polarity echo.)

Patrick Tracy January 14th, 2022 01:38 PM

Re: Zoom H6 line out phase inversion problem
 
Quote:

Originally Posted by Paul R Johnson (Post 1966816)
I’m really struggling to get this time alignment accuracy quest. Audio is so rarely time aligned to anything near the accuracy needed to cause an accidental polarity reversal to wreck things unless you are trying to null out audio, and with multiple mics this is usually impossible anyway. Time alignment is wrecked totally at that precision when we are looking at, say, a band on stage. Anybody who attends real concerts hears delays as part of the ‘sound’. I suspect the reason drums evolved into centre stage is to keep the snare drum in time for the majority. A few of my videos have a close camera on the drummer, and sync-wise the audio at the furthest camera can be quite delayed, so syncing to a snare downbeat that can be seen is easy. I find that in Premiere the close cam must be spot on, but if frame-locked audio sliding makes that point impossible for the distant cams, I just have those a tad late, never early, and it works. I could easily pop into Audition and tweak the timing but it’s just not important. When we work with more than 8m distance, as in the concert halls and churches I often work in, I’m not sure trying to correct it works. Organs are the biggest culprit. Not all of a rank is in the same place. Often over the years they’ve been extended, and between a B and the next C the actual pipes could be a long way apart and even facing different directions, as in along the church or across. This is the real world and I’m sure this all helps the stereo field to sound realistic.

If you are shooting a string quartet you have one angle where the violin should arrive first as it is closest. On the other side the cello should arrive first, and the wide shot on centre should have them the same - if you were there. In the video your audio could be A/B, a bit rare nowadays, but more likely a variant of X/Y. Maybe the best solution is for sound to be delayed a tad compared to picture to create depth?

I’ve never seen a commercial orchestral concert with big delays to simulate a person at the back, but equally on the huge close miked orchestras we see now, the close mics all get processed to add realism, usually reverb and small delays.

The question is the old one. For the viewer, when is NOW?

I'm not quite sure what you're referring to. Do you think it's a goal to change the acoustic perspective with the camera angle? I would never do that; I would pick one acoustic perspective that represents an idealized audience experience.

In the case of my Zoom H5, I'll slide the onboard mic tracks to account for the fact that it takes a bit more time for sound to arrive from the backline compared to the direct board feed. Sometimes that really matters when there's some common signal between them, in which case polarity and time have to be considered.

In the case of a drum kit, aligning, or not, the close mics to the overheads will affect the tone. There's no one right way to do it, so I go based on how it sounds. Different mics at different distances and angles will never be in phase across the audio spectrum, so I put them where they sound right to me. It just works out that I tend to prefer it when the arrival times are at least approximately compensated for.

Geoffrey Cox January 14th, 2022 04:50 PM

Re: Zoom H6 line out phase inversion problem
 
Quote:

Originally Posted by Don Palomaki (Post 1966823)
Mixing multiple mics of the same material in the same audio track can result in comb effect cancellations and they would be sensitive to time alignment. Room acoustics can add to this. Proper mic placement can reduce this issue.

This is why I am using the Zoom, so the tracks are separate and can be tweaked to get rid of any comb effect. I started off using a small mixer and going straight into the camera, but came up against the comb effect. And because it is outdoors, and it occurs when I go handheld and get close in (i.e. inside the 3x distance between mic and source), it is very hard to detect on the headphones till you get back in the studio, when it is too late.

Paul R Johnson January 16th, 2022 06:24 AM

Re: Zoom H6 line out phase inversion problem
 
I'm sorry guys - you've headed away from common sense practical issues into science fantasy.

Two mics on a snare and the polarity button does obvious stuff, but in a stereo recording of a quartet a miswired cable does not shout loudly about polarity errors, and often you just don't even notice until you see the stereo meter display doing something unusual. There really is no point to this when every person with an HDMI-connected display will be watching with the picture slightly misplaced from the audio, and as we've already said, sound behind is quite normal, but picture behind is horrible to watch. My point is simple. We are all used to latency and have our own minimums before we start to get cranky, but below a certain point, while we could adjust, there simply is no point in real life. At 50 fps, what are we talking about? A subject-to-camera distance of about 20 ft? We all cope with that sort of latency in real life every day. Are we not generating a problem to fix that doesn't need fixing?
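
Paul's 20 ft figure checks out; a quick sketch of the arithmetic, assuming sound at roughly 1,125 ft/s:

Code:

# Distance sound travels in one video frame, i.e. how far away a source has to be
# before its audio lags the picture by a full frame.
SPEED_OF_SOUND_FT = 1125.0   # ft/s, approximate

for fps in (24, 25, 30, 50):
    print(f"one frame at {fps} fps = {1000 / fps:.1f} ms "
          f"= about {SPEED_OF_SOUND_FT / fps:.0f} ft of sound travel")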

I'm just not that interested in a 'problem' that really isn't one.

Geoffrey Cox January 16th, 2022 07:58 AM

Re: Zoom H6 line out phase inversion problem
 
I think the times when subframe adjustment really does matter sonically have been clearly articulated. They may not apply to what you do, Paul, but it is often useful to understand why things matter to what others do. If such fine adjustments of audio were meaningless, why do audio sequencers allow much finer adjustments than even 100 frames per second? And a video NLE is also an audio sequencer.

Paul R Johnson January 17th, 2022 01:02 AM

Re: Zoom H6 line out phase inversion problem
 
Because when I am mixing a multi-mic orchestra project with stereo clusters and spot mics it is very important. In video, this kind of sonic cohesion technique is simply not something you should even be attempting. We are talking about aligning audio and video in a multi-cam shoot. The very nature of the source material usually dictates that any audio from camera positions is for effect, as in audience reaction or atmos. Distant capture rarely sounds right spatially and always lacks clarity, so your premixed replacement audio is the primary source. Micro-timing of a compromised recording is somewhat pointless.

The balance and blend of the audio tracks is paramount. Syncing these with the picture edit makes this audio track the key track, and the picture just needs to match it; frame-rate accuracy is sufficient. It annoys me when there are video screens in the edit that are delayed by just a few frames, or often now half a dozen!



DV Info Net -- Real Names, Real People, Real Info!
1998-2025 The Digital Video Information Network