Hi, I have an AOC LCD monitor (23" AOC 2330V, full HD) that shipped with a VGA cable only. My system: Intel Core i5 CPU, ATI Radeon HD 5770 GPU, Windows 7 x64. The video card offers several output ports, including VGA, DVI-I (I think), and HDMI; the monitor has two inputs, one VGA and one DVI. Right now the connection between the monitor and the computer is the VGA cable plus the VGA-to-DVI adapter that came with the ATI card: the VGA end is attached to the monitor, and the DVI adapter end is plugged into the computer.

I have tried all the troubleshooting tips and recommended settings from both the ATI support site and Microsoft, but image quality at the native resolution of 1920x1080 @ 60 Hz is still poor. As a result, font rendering is fuzzy and difficult to read, even with the default font size raised to 11. All relevant drivers are up to date.

When I open the ATI control software (Catalyst Control Center), the card appears to "see" my monitor as a CRT. At first I assumed that was just a quirk of which physical DVI port I'm using (the one near the bottom of the tower's rear panel) and unrelated to the poor output. However, when I power down the desktop, the monitor shows its "D-Sub" indicator, which I take to mean it's receiving only an analog signal, not the digital one I'd expect since I'm using a DVI adapter. (There's a small script at the end of this post that should show what connection type Windows itself reports.)

So: do I have the cable hooked up incorrectly? Do I need a different type of adapter (or just a plain DVI-to-DVI cable)? Does an ATI-made VGA-to-DVI adapter simply not work with AOC monitors? Or do I just have a lousy panel? If anyone out there knows connectors or can spot what I've done wrong, please help a fed-up do-it-yourselfer (obviously) who has reached the limits of his own capabilities. Thanks, all!
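In case it helps with diagnosis, here is a small Python sketch (my assumptions: Python is installed, and the machine is Windows 7 with a WDDM display driver; it just shells out to the built-in wmic tool to read the WmiMonitorConnectionParams WMI class) that prints the connection type Windows reports for each attached monitor. A report of code 0 (HD15/VGA) rather than 4 (DVI) would back up the analog-signal theory:

```
# Print the connection type Windows reports for each attached monitor.
# Assumes Windows with a WDDM display driver (standard on Win 7);
# shells out to the wmic tool that ships with Windows.
import subprocess

# D3DKMDT_VIDEO_OUTPUT_TECHNOLOGY codes I'm aware of; other values exist.
VOT_NAMES = {
    0: "HD15 (VGA, analog)",
    4: "DVI",
    5: "HDMI",
}

raw = subprocess.check_output(
    ["wmic", "/namespace:\\\\root\\wmi", "path",
     "WmiMonitorConnectionParams", "get",
     "InstanceName,VideoOutputTechnology", "/format:list"]
)
# wmic emits UTF-16 when its output is redirected; fall back otherwise.
if raw.startswith(b"\xff\xfe"):
    output = raw.decode("utf-16", errors="replace")
else:
    output = raw.decode(errors="replace")

for line in output.splitlines():
    line = line.strip()
    if line.startswith("InstanceName="):
        print(line)
    elif line.startswith("VideoOutputTechnology="):
        value = line.split("=", 1)[1]
        if value.lstrip("-").isdigit():
            code = int(value)
            print("  reported as: " + VOT_NAMES.get(code, "other (code %d)" % code))
```

The same value can also be checked without Python by running the wmic command above on its own from a Command Prompt.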