Video Output
Topic Started: Sep 17 2011, 03:10 AM (180 Views)
Trotsky
Big City Boy
Next question (I know they're boring, but I want to buy with no regrets). And I really DO appreciate your help.

I am seeing a lot of lower-end computers with INTEGRATED VIDEO using ATI Radeon chipsets. I know it's not ideal, but my gaming is limited to Missile Command, Tetris and FreeCell, and movies are limited to YouTube, and even less of that now that I can download it right to the TV via super-fast internet, ethernet from the router and a Blu-ray player.
Down the road I might add a video card if I feel the need.

My main problem, if it is a problem, is that ALL computers offer VGA output, MANY offer HDMI and very few offer DVI. My new monitor takes DVI but not HDMI.
I remember many of the first generation HD sets had component inputs but not HDMI and they used a splitter to take the cable signal from HDMI to COMPONENT input.
I see lots of HDMI-DVI splitters around also.

Will a splitter taking the HDMI digital video output from the computer and outputting a DVI signal to the monitor give as good a signal as HDMI to HDMI? I know, or suspect, that there is no audio via DVI, but that is fine because the monitor is speakerless and my audio is a separate Logitech system.

Or must I search for a puter with DVI output or immediately buy a card with that capability? I don't want to output video in VGA to an LCD monitor.
wildie
Veteran Member
I can't pretend to be an expert on video. Like most people, I find it a struggle to figure out which way to go.

All I can do is recount my own experience!

As for on-board video, I have never come across a m/b in the last 10 years that isn't so equipped. Not to say there aren't some, but they are certainly few and far between.

I would venture to say that it's likely a m/b manufacturer would want to provide good video characteristics in order to make their product more attractive.
Also, all the laptop computers that I'm aware of use on-board video quite effectively.
On-board video usually uses some of the system memory, so this can negatively affect system performance.
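To put rough numbers on that point, here's a quick back-of-the-envelope sketch in Python. The amounts are made up for illustration; real reservations vary by chipset and BIOS setting:

```python
# Hypothetical figures: integrated video that carves a chunk out of system RAM.
TOTAL_RAM_MB = 4096        # installed memory (assumed)
SHARED_VIDEO_MB = 512      # reserved for the onboard GPU (assumed)

def usable_ram_mb(total_mb, shared_mb):
    """RAM left over for the OS and programs after the video reservation."""
    return total_mb - shared_mb

print(usable_ram_mb(TOTAL_RAM_MB, SHARED_VIDEO_MB))  # 3584
```

A discrete card with its own memory gives that reservation back, which is the effect wildie mentions below.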

My desktop computer came with on-board video and I was quite happy with it until the CRT monitor died and I replaced it with an LCD monitor that supported DVI. It supported VGA as well, but I wanted to try DVI, so I bought an ATI Radeon expansion card that supplied DVI output.
With XP this arrangement worked well, and it freed up some system memory, since the card has its own video memory.

All was well until I upgraded from XP to Win7. ATI (AMD) decided not to support this old video card (after all, it was all of two years old), so Win7 used MS drivers. Sadly, on Win7 round circles were all egg-shaped.
I searched and searched for a solution without success; then I realized that Win7 was much like Vista, but with more bells and whistles.
So I downloaded the Vista driver from ATI and forced its installation in Win7. And then all was well!

As to your problem of which way to go for video: from my experience, I would buy a machine with standard VGA (cheaper), then once you decide whether you want HDMI or DVI, you can buy an expansion card that suits you. You may even be able to find a card that provides both HDMI and DVI. The card I bought handles both VGA and DVI.

As for splitters, I'm not familiar with these, but splitters are usually parasitic and can weaken the signal, so I would be reluctant to go this route.
reactivate
Gold Star Member
I generally agree with wildie.

I did a home build with my current system and selected the motherboard with zero consideration of the onboard video capability. I already had a reasonably good GPU card and planned to move it to the new build, which I did. It has two DVI outputs for a dual-monitor system (which is what I run) and an HDMI output which I have never tried.

With 20/20 hindsight, I should have bought a MB with onboard video simply as a backup in case my GPU card dies; should it die, I would be dead in the water until I could replace it. That is the reason I would go along with purchasing a new system with onboard graphics. Should you decide you need or want a separate GPU card, it can easily be installed; you simply disable the onboard capability in the BIOS and keep a readily available emergency backup.

Modern video cards (and I presume onboard graphics processors) are quite powerful computing units in their own right. There are programs around that exploit this capability for background tasks. Certainly games and many videos place heavy demands on the GPU. Should a GPU be unable to keep up with the demands placed on it, it drops whole frames, and the movie or game becomes very "jerky". You will certainly know it if your GPU is incapable of keeping up.
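As a rough illustration of why frames get dropped (this is simplified arithmetic, not any real player's logic): a 60 Hz display gives the GPU about 16.7 ms per frame, and any frame that takes longer eats into the next frame's budget.

```python
def frames_dropped(render_ms_per_frame, fps=60, seconds=1.0):
    """Estimate frames dropped when rendering can't keep up with the display.
    Simplified model: the GPU just skips frames it has no time for."""
    budget_ms = 1000.0 / fps                      # ~16.7 ms per frame at 60 fps
    if render_ms_per_frame <= budget_ms:
        return 0                                   # GPU keeps up, nothing dropped
    wanted = round(fps * seconds)                  # frames the display expects
    deliverable = int(seconds * 1000.0 / render_ms_per_frame)
    return wanted - deliverable

print(frames_dropped(16.0))   # 0  -- under budget, smooth playback
print(frames_dropped(33.4))   # 31 -- roughly every other frame dropped: "jerky"
```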

The real problem with GPUs, and in particular onboard GPUs, is that all that processing power demands power. Power means heat and a larger power supply. Heat is the most destructive factor in computing. Trust me, if you do something that pushes your CPUs or GPUs to 100% utilization for anything more than a few minutes, you will very likely have a system crash, possibly accompanied by a hard component failure. A failure in the graphics when it is on the MB can render the entire MB useless, and onboard graphics are more likely to overheat because they do not have their own cooling system. I continuously run software utilities that monitor temperatures on my MB, CPU, GPU, all internal hard drives and the power supply. Any that exceed acceptable parameters cause an audible warning, and if the temperature gets to what I consider critical, the system does an emergency shutdown. A simple fan failure can easily cause overheating even without heavy computing demands.
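The warn/shutdown logic of a monitoring utility like that boils down to a couple of thresholds. This is only a sketch; the thresholds are assumed values, and actually reading a sensor is hardware-specific (vendor utilities, lm-sensors, etc.), so here the temperature arrives as a plain number:

```python
# Assumed thresholds in degrees Celsius -- real safe limits vary by component.
WARN_C = 70        # sound an audible warning above this
CRITICAL_C = 90    # force an emergency shutdown above this

def check_temperature(temp_c):
    """Return the action a monitor would take for one component's temperature."""
    if temp_c >= CRITICAL_C:
        return "shutdown"   # emergency shutdown before hardware damage
    if temp_c >= WARN_C:
        return "warn"       # audible warning, still running
    return "ok"

print(check_temperature(65))   # ok
print(check_temperature(75))   # warn
print(check_temperature(95))   # shutdown
```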

For these reasons, onboard GPUs are not very powerful, but they do make a good backup. Discrete GPU cards have their own cooling system (one or two fans and/or heatpipe cooling).

As for video adapters, I think DVI-HDMI converters should work with little loss in quality, except for the sound issue. I wouldn't want to offer an opinion on other converter types.
Trotsky
Big City Boy
Thanks guys, I appreciate your input.

Quote:
 
Wildie: All was well until I upgraded from XP to Win7. ATI (AMD) decided not to support this old video card (after all, it was all of two years old), so Win7 used MS drivers. Sadly, on Win7 round circles were all egg-shaped.


More and more I am beginning to think Win 7 is more of a pain in the ass than MSFT is letting on.

Quote:
 
reactivate: The real problem with GPUs, and in particular, onboard GPUs is that all that processing power demands power. Power means a heat and a larger power supply.


I am perusing these high-end graphics cards and am SHOCKED to see requirements like 200, 300 and even 400 WATTS. Jesus, they are approaching the output of a room heater on low.

(I got my circa-2000 video card (nVidia) with my computer, and it has all of 64 MB of onboard memory... I'm sure it would run on two D-cells. I doubt it even gets warm.)

So my goal will be to find something with 6 gigs of shared memory and integrated graphics with a good chipset, IF I can get a DVI output... I've seen very few. If needs must, I'll pop about $80 and get a mediocre graphics card.

Any thoughts on Intel i5 or i7 processors vs. AMD Phenom II?

Edited by Trotsky, Sep 17 2011, 01:10 PM.