I sold my Dell laptop today to make way for a new Dell XPS 420 that I got decked out. I didn't get the Dell monitor because I found an Acer 24" that was a better deal. While I'm waiting on the computer to arrive, I went out and bought the monitor. It has three different inputs: VGA, DVI, and HDMI.

I went to my parents' place to test the monitor on their computer. The manual says it's plug and play, and that most computers will auto-recognize the new monitor and adjust the settings. So I hooked it up using the VGA input, since that's all my mom's computer has. It's an older (4-year-old) B-series tower. Anyway, I connected it and set it up like the instructions say: turned on the monitor power, then the computer. The monitor just flashes a picture for a second and then goes blank. Once the computer is booted up, if I turn the monitor off and on, it does the same thing: flashes for a second, then goes blank.

A friend of mine said the EXACT same thing happened to him at work. His screen would flash for a split second and go out. It turned out his monitor's input was set to DVI when he was using VGA; he switched it over and it worked fine. I looked at mine, and it seems I can't change the input manually without going into the on-screen menu. But here's the catch: I can't get into the menu if nothing is displayed on the monitor.

So is it possible the monitor just defaults to the DVI input instead of VGA, and my mom's computer is too old to auto-recognize it and switch over? The monitor does work in the sense that it flashes a picture for a second before going blank, exactly like what happened to my friend; I just can't change the input. I don't want to wait until my XPS gets here, because I don't want to find out then that something is wrong with the monitor and have to go back out and get a new one. I have a friend who has an XPS, but we work different shifts and it seems we're never home at the same time.

Any ideas? Sorry for the long post; I'm just trying to be as descriptive as possible.