So here's a question for those who are electronically inclined. I deal with a lot of laptops that don't have their original power supplies, so we usually match up the connector, voltage, and amperage. However, there's not always an exact match, and I have a few questions about this.

What are the dangers of power supplies rated for higher or lower voltages than recommended? What are the consequences of higher or lower amperage than recommended? And assuming Volts x Amps = Watts (from what I've figured out), do you suppose this would work? Suppose I have a laptop designed for 15v / 3a. This is roughly 45w, correct? What if I were to use a 19v / 2.3a power supply? Does the total wattage dictate whether the laptop will get fried or not? Or do the amp/voltage ratings fry it by themselves?

The fellows at work tell me higher amperage is the killer, but higher voltages are okay (within reason). From my own observations, this is not the case. It looks more like higher voltages kill, while higher amperage makes no difference. I've successfully used a number of lower-voltage, higher-amperage power supplies, but toasted some equipment (not laptops, thank goodness!) with higher-voltage, correct-amperage power supplies.
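To sanity-check the arithmetic above, here's a quick sketch (the `watts` helper is just for illustration) showing that the two supplies do come out to roughly the same total power, even though their voltages differ:

```python
def watts(volts: float, amps: float) -> float:
    """Power in watts is volts times amps (P = V * I)."""
    return volts * amps

# The laptop's original rating: 15 V at 3 A.
original = watts(15, 3)      # 45.0 W

# The candidate replacement: 19 V at 2.3 A.
candidate = watts(19, 2.3)   # ~43.7 W, nearly the same total power

print(f"original: {original:.1f} W, candidate: {candidate:.1f} W")
```

So by wattage alone the two look interchangeable; the open question is whether the 4-volt difference matters on its own, regardless of total power.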