Amperage vs Voltage
I would like to get in on this one. Sorry it's a bit late.
If you are retrofitting a piece of equipment with a new power supply, it is quite simple. The voltage must match the original within a small tolerance: too high and the components will fry; too low and the circuit will not get enough power. And that's the key thing: power. All circuits consume power, and the supply must be able to deliver it. If the replacement supply has a higher current capacity than the original, that is OK; if the circuit draws what it needs, the supply won't struggle. If the supply does struggle, the voltage will sag and you may get unpredictable operation. Extra current capacity is safe because the circuit should be fused to protect it in the event of a fault condition.
So, in summary: always match the voltage, and make sure the new supply's current rating is at least as high as the original's.
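The rule above is easy to express as a quick sanity check. This is just a sketch; `supply_ok` and the 5% tolerance are my own illustrative choices, not anything standard:

```python
def supply_ok(orig_volts, orig_amps, new_volts, new_amps, tolerance=0.05):
    """Hypothetical check for a replacement power supply.

    Voltage must match the original within the given tolerance;
    the current rating must be at least as high as the original's.
    """
    volts_match = abs(new_volts - orig_volts) <= tolerance * orig_volts
    enough_current = new_amps >= orig_amps
    return volts_match and enough_current

print(supply_ok(12.0, 2.0, 12.0, 5.0))  # True: same voltage, more current capacity is fine
print(supply_ok(12.0, 2.0, 9.0, 5.0))   # False: voltage too low, circuit may misbehave
print(supply_ok(12.0, 2.0, 12.0, 1.0))  # False: supply would struggle and the voltage would sag
```

Note the asymmetry: voltage is a hard match, while current capacity only has a floor.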
Incidentally, while the folks here are correct that current kills, it still comes back to power. A car battery can produce hundreds of amps for a short time, but it is only 12 volts, and voltage has to overcome resistance for current to flow (in simple terms; strictly, current always flows in a circuit that has a voltage supply). My body resistance is on the order of 10 million ohms, which is why I can put my fingers across my car battery with no effect: current does flow, but only about a millionth of an amp, not enough to hurt. Remember, though, that it only takes about 5 thousandths of an amp (5 milliamps) for 5 seconds to arrest your heart. At 240 volts (yeah, I'm from the UK) and about 10 million ohms, I would draw about 24 microamps, which should be OK, but I ain't giving it a go! If my resistance were lower, and that depends on your biology, the current could rise enough to cause damage. Power = volts times amps.