Date: Wed, 7 Nov 2001 19:12:01 -0800
MIME-Version: 1.0
Content-Transfer-Encoding: 7bit
Content-Type: text/plain; charset="iso-8859-1"

I missed the original thread, but I have some input and also a question.
I have some recent experience purchasing video cards with a DVI-D
interface. My problem was that I needed one with 1600x1200 output. Most
cards support only 1280x1024, which is fine for most monitors, but mine
is a 20-inch LCD.
The card I decided on is a Gainward GeForce 3 PowerPack that includes video
capture; I bought it for $315 from MIX PC. A much cheaper card that seems to
work well is the ATI Radeon VE, which I have seen for sale for as low as $60.
Matrox and Hercules also make cards that support DVI-D, though not
necessarily at 1600x1200.
Now a question for those who may know more about DVI-D than I do: is
there a refresh rate for the DVI-D interface, as there is for analog, or is
it fixed?
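For what it's worth, DVI-D does carry a refresh rate; the practical limit is the link's pixel clock (single-link DVI tops out at 165 MHz), which is why 1600x1200 is usually capped at 60 Hz. Here is a rough back-of-envelope sketch; the blanking overheads are typical assumed values, not exact VESA timings:

```python
# Rough pixel-clock estimate for a DVI video mode.
# Single-link DVI is limited to a 165 MHz pixel clock, so a mode
# only fits if its estimated clock stays under that ceiling.
# h_blank_frac / v_blank_frac are assumed typical blanking overheads.

def pixel_clock_mhz(h_active, v_active, refresh_hz,
                    h_blank_frac=0.25, v_blank_frac=0.05):
    """Estimate the pixel clock in MHz for a given display mode."""
    h_total = h_active * (1 + h_blank_frac)   # active + horizontal blanking
    v_total = v_active * (1 + v_blank_frac)   # active + vertical blanking
    return h_total * v_total * refresh_hz / 1e6

clk = pixel_clock_mhz(1600, 1200, 60)
print(f"1600x1200 @ 60 Hz needs roughly {clk:.0f} MHz "
      f"({'fits' if clk < 165 else 'exceeds'} single-link DVI)")
```

By this estimate 1600x1200 at 60 Hz lands close to the 165 MHz ceiling, while higher refresh rates at that resolution would not fit on a single link.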
______________________________________
Peter Shkabara
[log in to unmask] - http://gocolumbia.org/pesh
-----Original Message-----
You need to look for a card that
supports DVI-D (Digital Visual Interface, digital-only).
Rand Blunck
> I am looking at a 15" Samsung LCD monitor. I read somewhere that you need
> a digital video card. I can't seem to find anything that suggests if a
> card is digital.
Visit our website regularly for FAQs,
articles, how-to's, tech tips and much more
http://freepctech.com