Phil Reams
March 30th, 2003, 05:02 AM
I've heard and read a lot about RF affecting the SteadyShot feature on the PD-150.
Would anyone be interested in seeing the results posted if I did a somewhat scientific test as to the minimum level of RF it takes to start causing a problem?
Among all of the other gadgets I own, I have a Motorola R2001A communications service monitor. This device has an RF signal generator, spectrum analyzer, oscilloscope, CTCSS tone generator, as well as a host of other features.
I'm still knocking around in my head a testing methodology and procedure, but it would be something like this:
The procedure would be to set up the PD-150 to simulate real-world conditions as much as possible.
Next, I would set up an antenna on a portable test-stand. The antenna would be tunable so I could test at different frequencies, as well as variable distances from the camera. Coax would be run from the antenna to the generator output on the signal generator.
I would then start recording a tape in the PD-150 while it was pointing at a test target.
While recording, I would then turn on the signal generator and start keying it at different power levels, stepping up from the lowest power until I reached a level that caused an effect, and record the results.
The resulting spreadsheet would look something like this:
Antenna distance - 3 ft / frequency 154.000 MHz / power level -30 dBm - no effect
Antenna distance - 3 ft / frequency 154.000 MHz / power level -20 dBm - no effect
Antenna distance - 3 ft / frequency 154.000 MHz / power level -10 dBm - interference noted
This would be repeated at varying antenna distances, power levels, and frequencies.
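If it helps to see the logging shape concretely, here's a rough Python sketch of generating that blank test matrix as a CSV to fill in during the run. The specific distances, frequencies, and power steps below are just placeholders I made up, not tested values:

# Sketch: build a blank test-matrix CSV to fill in by hand during the run.
# Distances, frequencies, and power steps are illustrative assumptions only.
import csv
import itertools

distances_ft = [3, 6, 12, 25]                  # antenna-to-camera distance
frequencies_mhz = [154.000, 450.000, 800.000]  # example VHF/UHF test points
power_levels_dbm = range(-60, 1, 10)           # sweep -60 dBm up to 0 dBm

with open("pd150_rf_test_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["distance_ft", "frequency_mhz", "power_dbm", "effect"])
    for dist, freq, pwr in itertools.product(
            distances_ft, frequencies_mhz, power_levels_dbm):
        # "effect" column gets filled in by hand during the test run
        writer.writerow([dist, freq, pwr, ""])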
I have a couple of friends with PD-150s, so we could average the results and create a baseline graph of the maximum power from any kind of handie-talkie or headset communicator you could use without worrying about interference.
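Once a few logs are filled in, something like this (again Python, with made-up filenames, and assuming the "effect" column is left blank or marked "none" when nothing happened) could boil them down to an averaged threshold per distance/frequency point:

# Sketch: reduce completed logs from several cameras to averaged
# interference thresholds. Filenames and the "effect" convention
# are assumptions, matching the CSV format sketched above.
import csv
from collections import defaultdict
from statistics import mean

def thresholds(path):
    # lowest power (dBm) that produced any effect, per (distance, frequency)
    worst = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            if row["effect"].strip().lower() in ("", "none", "no effect"):
                continue
            key = (float(row["distance_ft"]), float(row["frequency_mhz"]))
            pwr = float(row["power_dbm"])
            worst[key] = min(pwr, worst.get(key, pwr))
    return worst

logs = ["camera_a.csv", "camera_b.csv", "camera_c.csv"]  # hypothetical files
merged = defaultdict(list)
for path in logs:
    for key, pwr in thresholds(path).items():
        merged[key].append(pwr)

for (dist, freq), pwrs in sorted(merged.items()):
    print("%5.1f ft  %8.3f MHz  threshold ~ %6.1f dBm" % (dist, freq, mean(pwrs)))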
Sounds like a good weekend project in between gigs for me - or this simply could be the incoherent ramblings of someone who just finished a marathon editing session at 6:00 AM. *grin*
Opinions? Ideas on methodology? Let me know!
cheers,
-Phil