September 20th, 2008, 01:23 PM | #1 |
How to use HDTV as monitor clone?
Not sure if this is the right forum to ask this hardware question, so excuse me if I'm not in the right place.
I want to pick off one of the two available DVI outputs on my Quadro video card and split the signal to feed two cloned displays: one is an LCD monitor, the other is a 720p HDTV. There are several DVI splitters available on the market. The problem is that the two cloned displays have different resolutions, so the Quadro defaults to the lowest-resolution display, which is the HDTV. I'm wondering if I can use a scaler box to match the HDTV and still allow the native 1680x1400 resolution on the LCD monitor.

There are also distribution amplifiers on the market, but to the best of my knowledge they all default to the lowest-resolution device. By the way, the Quadro card has a component output; however, to use it I need to disable one of my DVI outputs... not a good solution for me. Can anybody help me out with an answer, short of going to a three-monitor output card or dual PCIe video cards?
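(As a rough illustration of that fallback, here is a short Python sketch. The mode lists are made up for illustration, not read from real hardware: a clone pair can only use a mode both displays advertise, so everything gets dragged down to the best mode they have in common.)

def best_common_mode(modes_a, modes_b):
    """Return the highest-area mode that both displays advertise, or None."""
    common = set(modes_a) & set(modes_b)
    return max(common, key=lambda m: m[0] * m[1]) if common else None

# Hypothetical mode lists, for illustration only
lcd_modes = [(1680, 1050), (1280, 1024), (1280, 720), (1024, 768)]  # LCD monitor
hdtv_modes = [(1280, 720), (1024, 768), (640, 480)]                 # 720p HDTV

print(best_common_mode(lcd_modes, hdtv_modes))  # -> (1280, 720): the pair drops to the HDTV's best mode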
September 26th, 2008, 02:17 AM | #2 |
Major Player
Join Date: Dec 2003
Location: Newberg, Oregon
Posts: 494
My card has two DVI ports, and I actually send one to a monitor and the other to my HDTV via a DVI-to-VGA converter (my TV has a VGA port on it).
You can then tell your video drivers to clone. Or are you already using both DVI outputs?
September 26th, 2008, 06:07 AM | #3 |
Thanks, Jeremiah. The only problem with your suggestion is that VGA will not display HD resolutions. What I've done is install a second video card: I run my primary LCD monitors off the primary card's DVI outputs and the HDTV off the second card. This allows me to set different resolutions for the HDTV and the monitors. Works great!
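(For what it's worth, here is a rough sketch of the same independent-resolution idea scripted on a Linux box with xrandr. This is not what was actually used above, and the output names are placeholders for whatever the driver reports.)

import subprocess

# Sketch only: drive two outputs at different resolutions from one machine.
# Assumes Linux with xrandr installed; run "xrandr -q" first to see the real
# output names, since DVI-I-1 / DVI-I-2 below are placeholders.
subprocess.run([
    "xrandr",
    "--output", "DVI-I-1", "--mode", "1680x1050", "--primary",             # LCD monitor
    "--output", "DVI-I-2", "--mode", "1360x768", "--right-of", "DVI-I-1",  # HDTV
], check=True)

# For a true clone instead of an extended desktop, "--same-as DVI-I-1" overlays the
# outputs; newer xrandr versions also offer --scale-from to clone at mismatched resolutions.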
September 26th, 2008, 07:22 PM | #4 |
Major Player
Join Date: Dec 2003
Location: Newberg, Oregon
Posts: 494
Aha... my video card actually sends 1920x1080 to the TV (I can set the resolutions independently). My comp doubles as a Home Theater PC.
September 26th, 2008, 07:36 PM | #5 |
Major Player
Join Date: Sep 2008
Location: Las Vegas, NV
Posts: 628
My computer has an ATI Radeon HD 3650, a dual-monitor video card that supports up to 1920x1080. I was actually able to send video at the native resolution of my older (720p) LCD HDTV: 1360x768.
September 26th, 2008, 07:47 PM | #6 |
Major Player
Join Date: Dec 2003
Location: Newberg, Oregon
Posts: 494
Over on the AV Science boards there is a big debate over VGA vs. HDMI as to which gives you a better image going from a computer to an HDTV. I ended up siding with the VGA crowd; the HDMI image looked a bit soft from my ATI card. When I upgraded to a new computer I switched to an NVIDIA card, which didn't even have HDMI anymore, just dual DVI. Once I got a long enough optical audio cable I was able to use my computer to watch Blu-ray on my TV. My card also has component out, oddly enough, but I still ended up liking the image from the VGA more.
Buying the Blu-ray burner was a stopgap until prices come down on set-top players. I can still give customers Blu-ray discs if they request them, and I can also watch Blu-ray and HD DVD (I got the LG model that plays both).
September 26th, 2008, 08:36 PM | #7 |
Major Player
Join Date: Sep 2008
Location: Las Vegas, NV
Posts: 628
Well, what I have noticed (and it might just be a software thing, not hardware): playing my SR11 through its component output to my HDTV gives a much brighter picture, very clear and clean.
Playing it over VGA at 1360x768 to my HDTV using Sony's included software bundle is still very nice, but it's darker and the image looks a bit harsh. Could be Sony's software or just the video card settings, I suppose.
September 26th, 2008, 08:55 PM | #8 |
Major Player
Join Date: Dec 2003
Location: Newberg, Oregon
Posts: 494
Interesting. My main reason for sticking with VGA is that I have this 25-foot VGA cable, so it's routed out of the way. Maybe I'll pick up a longer component cable and try that out again.
September 27th, 2008, 05:46 AM | #9 |
I'm not sure about this, but I do know that VGA (analog) predates DVI (digital). As far as resolution goes, DVI and HDMI are identical digital standards. Yes, you can convert DVI to VGA for delivery to a VGA set, but the image is subject to noise, especially over long cable runs. My 46-inch Samsung HDTV monitor has several inputs, including VGA and HDMI; the VGA inputs are limited to lower resolutions than the HDMI input. Furthermore, the VGA looks like crap... more like the old days of SD television images.
I don't think this is a video card issue; the issue is with the TV inputs. While your TV may show 1920x1080, you may well be looking at an uprezzed image whose native size over VGA is no better than 720x480.
September 27th, 2008, 10:36 AM | #10 |
Major Player
Join Date: Sep 2008
Location: Las Vegas, NV
Posts: 628
Well, the thing is, my video card has one DVI output and one VGA... My HDTV is a Sony XBR flat-screen LCD that came out a couple of years ago. Anyway, my point is, it has that protocol (EDID) where it reports to the computer all the resolutions it will accept. It goes all the way up to 1920x1080, but I chose the native resolution of the HDTV to make it one less process the video has to go through.
I have no way to tell what is actually making it on screen, but it looks very nice. Just not as nice as straight from the camera using component. I could actually switch all this around and use the VGA for the computer monitor and send HDMI to the TV, but that means taking the TV off the wall, since they hide that connection behind it... maybe this weekend I will give it a try.
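(EDID is a 128-byte block the display hands back over the connector, listing the modes it accepts. Here is a minimal Python sketch of decoding the eight "standard timing" slots from it, assuming a Linux box where the raw block is exposed under /sys/class/drm; the connector name below is a placeholder.)

EDID_PATH = "/sys/class/drm/card0-HDMI-A-1/edid"  # placeholder; list /sys/class/drm/ to find yours

ASPECT = {0: (16, 10), 1: (4, 3), 2: (5, 4), 3: (16, 9)}

def standard_timings(edid: bytes):
    """Yield (width, height, refresh_hz) from the 8 standard-timing slots."""
    for i in range(38, 54, 2):            # bytes 38..53 of the base EDID block
        b0, b1 = edid[i], edid[i + 1]
        if b0 == 0x01 and b1 == 0x01:     # 0x0101 marks an unused slot
            continue
        width = (b0 + 31) * 8
        num, den = ASPECT[b1 >> 6]
        height = width * den // num
        refresh = (b1 & 0x3F) + 60
        yield width, height, refresh

with open(EDID_PATH, "rb") as f:
    for w, h, hz in standard_timings(f.read()):
        print(f"{w}x{h} @ {hz} Hz")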
September 27th, 2008, 01:22 PM | #11 |
Major Player
Join Date: Dec 2003
Location: Newberg, Oregon
Posts: 494
Well, the TV is reporting 1920x1080 from that input, which is what I have it set to. No noise. It's a Samsung. Maybe some TVs don't accept 1080p through VGA, but this one does.
September 27th, 2008, 04:02 PM | #12 |
Inner Circle
Hi guys..........
The VESA VGA standard only supports resolutions up to and including WXGA at 1360 (H) x 768 (V).
Any screen fed with such a VGA signal that "looks good" is, in all probability, natively 1360 x 768 and is thus not up- or down-ressing to get the picture to fit (although, if the original content was 1080, it is being down-ressed in the graphics card, which can introduce artifacts). In all probability, such a screen, when fed full 1920 x 1080 via the HDMI/DVI/component interfaces, may well make a pig's ear of it, as it needs to down-res to make the picture fit. Not all screens do this well.

1080 (anything) is not a supported VESA VGA standard, so it's highly unlikely your graphics card can output it (1080p/i over VGA), and a dead cert your screen couldn't display it even if it did. Many screens will only accept full 1080 on either the YPbPr component input (usually interlaced only) or the HDMI/DVI input (usually both progressive and interlaced).

The best bang for the buck when playing with full 1080 is 1080 content via HDMI/DVI to a full 1080 screen: no resolution or signal conversions required. YPbPr runs it a pretty close second (yes, it's analogue, but at least it's still 1080).

As soon as you throw resolution conversion (either up or down) or digital/analogue conversion into the mix, things start going downhill fast. The worst-case scenario is original 1080 content sent as analogue VGA at 768/720 lines to a full 1080 HD screen, where it's converted back to digital and up-ressed again to 1080 (actually you can get worse, by using the composite ports, but who'd bother?).

Add to that the fact that some screens can be pretty economical with the truth about the content formats they claim to display, and you can see where all sorts of odd things appear to happen.

CS
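(To put a rough number on that worst-case chain, here is a small Python toy: nearest-neighbour row mapping only, far cruder than a real scaler, just to show how many of the original 1080 lines survive a 1080 to 768 and back to 1080 round trip.)

def resample_rows(rows_in, rows_out):
    """Map each output row back to a source row (nearest neighbour)."""
    return [round(r * (rows_in - 1) / (rows_out - 1)) for r in range(rows_out)]

down = resample_rows(1080, 768)                    # rows the 768-line VGA signal keeps
up = [down[i] for i in resample_rows(768, 1080)]   # what the 1080 panel reconstructs

surviving = len(set(up))
print(f"{surviving} of 1080 original lines survive the round trip "
      f"({surviving / 1080:.0%}); the rest are duplicates of neighbouring lines.")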
September 27th, 2008, 05:33 PM | #13 |
Major Player
Join Date: Dec 2003
Location: Newberg, Oregon
Posts: 494
Some TVs can accept 1080p over VGA; Samsungs can. There was one model from a few years ago, the 5688W, that could only accept 1080i over HDMI, but you could feed 1080p into its VGA port. I am sending my Samsung LCD 1080p from my computer.
Many people connect their Xbox 360s to their HDTVs at 1080p using the VGA port. Go look at the AV Science forums... lots of threads on the subject: AVS Forum

JR
September 27th, 2008, 05:38 PM | #14 |
Major Player
Join Date: Sep 2008
Location: Las Vegas, NV
Posts: 628
Just tried some different settings. My video card will indeed send all the way up to 1920x1080 through VGA, but my TV will not accept anything higher than 1360x768; it just says "unsupported signal."
But the card will allow me to choose the resolutions and send them through the cable.