OK to use 230V switch on power supply?

Status
Not open for further replies.

charvey

I'm wiring my new place right now and will have a dedicated circuit for the computer. Is there any reason why I can't run a 230V circuit for the power supply, since it (Antec SmartPower 2.0 350W) can be switched to 230V? Never mind the plug configuration... easy fix.
Just an FYI for folks on the computer all the time: your watt-hour meter is measuring the higher of the current draw on L1 and L2. If you're plugged into 110V, you're still paying for 220V. Run anything off of 220V and you cut your power bill in half. A good reason to never buy a 110V window A/C unit!
 
Definition:
"In electrical terms, one watt is the power produced by a current of one ampere flowing through an electric potential of one volt. "



What? We run at 120V here. Are you talking about running your computer off the same line as your washer and dryer? Dryers run at 220V @ 60Hz, which is actually just two separate 110V lines; if one of the 110V lines is off, the dryer will spin but not heat. That switch on the back of the PSU is designed for other countries running higher voltage at 50Hz. You are asking for trouble if you try to wire it to a dryer outlet. Technically it may work, but it will likely reduce the life of the unit.

Amps are still amps regardless of the voltage: watts / volts = amps. A PSU at 230V vs 110V will still draw the same current. You would actually be using more watts on a 230V outlet: amps * volts = watts.

Example: let's say a device is drawing 10A -> 1200W @ 120V, 2300W @ 230V


How do you figure that your power bill will be any cheaper plugging into a higher voltage?
 
charvey said:
Run anything off of 220V and you cut your power bill in half. A good reason to never buy a 110V window A/C unit!

Error... you just have two 120V lines with inverted phases. For 220V, you connect one phase on one side and the other phase on the other side; instead of being hot/neutral, it's hot/hot.
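The split-phase arrangement described above can be sketched numerically (a minimal illustration, assuming ideal 120 V RMS sine waves at the nominal 60 Hz):

```python
import math

# Sketch of the US split-phase service described above: two 120 V RMS
# legs with inverted phase; measured hot-to-hot, they give 240 V RMS.
# (Idealized sine waves; amplitude and frequency are the nominal values.)
RMS = 120.0
PEAK = RMS * math.sqrt(2)

def leg1(t, f=60.0):
    """Instantaneous voltage of one 120 V leg at time t (seconds)."""
    return PEAK * math.sin(2 * math.pi * f * t)

def leg2(t, f=60.0):
    """The other leg: same waveform, inverted phase."""
    return -leg1(t, f)

# Hot-to-hot is leg1 - leg2 = 2 * leg1 at every instant,
# so its RMS value is 2 * 120 = 240 V.
t = 0.002
print(leg1(t) - leg2(t) == 2 * leg1(t))  # True
```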

Your meter is counting in watts (watt-hours), not amps.
 
seanp789 said:
Definition:
"In electrical terms, one watt is the power produced by a current of one ampere flowing through an electric potential of one volt. "



What? We run at 120V here. Are you talking about running your computer off the same line as your washer and dryer? Dryers run at 220V @ 60Hz, which is actually just two separate 110V lines; if one of the 110V lines is off, the dryer will spin but not heat. That switch on the back of the PSU is designed for other countries running higher voltage at 50Hz. You are asking for trouble if you try to wire it to a dryer outlet. Technically it may work, but it will likely reduce the life of the unit.

Amps are still amps regardless of the voltage: watts / volts = amps. A PSU at 230V vs 110V will still draw the same current. You would actually be using more watts on a 230V outlet: amps * volts = watts.

Example: let's say a device is drawing 10A -> 1200W @ 120V, 2300W @ 230V


How do you figure that your power bill will be any cheaper plugging into a higher voltage?

A PSU running at twice the input voltage will draw HALF the current (amps). The overall power (watts) stays the same. Look up Watt's law.

If everything in your house ran at 240V, you would likely see smaller power bills. Less voltage drop in the wiring means appliances such as water heaters and dryers may not have to run as long to finish their tasks. You won't see much difference with the switch-mode power supply in your PC, though some supplies will give you a bit more available output power (just by design) when you feed them a 240V input.
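The half-the-current point above can be checked with Watt's law; the I²R wiring loss falls to a quarter because it scales with the square of the current. A minimal sketch (the load power and wire resistance are assumed example figures, not measurements of any real circuit):

```python
# Sketch (assumed figures): compare input current and resistive wiring
# loss for the same load power at 120 V and 240 V.

def branch_loss(load_watts, volts, wire_ohms):
    """Current drawn by the load and the I^2*R heat lost in the wiring."""
    amps = load_watts / volts           # Watt's law: I = P / V
    return amps, amps ** 2 * wire_ohms  # wiring loss in watts

load = 1200.0   # e.g. a 1200 W heating element
r_wire = 0.25   # assumed round-trip resistance of the branch wiring

i120, loss120 = branch_loss(load, 120.0, r_wire)
i240, loss240 = branch_loss(load, 240.0, r_wire)

print(i120, i240)        # 10.0 vs 5.0 A: half the current
print(loss120, loss240)  # 25.0 vs 6.25 W: a quarter of the wire loss
```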
 
And the meter measures watt-hours.

Why build a special circuit for one room when the whole country has not taken this approach?

It is true that (in the USA) we need a 220/240 circuit for a clothes dryer and an electric oven, maybe even another in the garage for special services, but...

When in Rome, do as the Romans do!
 
And here in Europe we have a 230V / 50Hz system; heavy-duty machines use 380V, but I've never even thought of computers using that... I have a feeling it would send up smoke signals for a brief period of time.
 
cracked said:
Less voltage drop in wiring means appliances such as water heaters and dryers may not have to run as long to finish their tasks.

Uhmm... the voltage drop is just about unmeasurable, even if you laid all the wires in your house in series. Maybe only if you had a really huge house; you'd need kilometers of wiring to actually see any difference in your electric bill.
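A quick back-of-the-envelope check supports this; the wire gauge, run length, and load below are assumed example figures:

```python
# Rough estimate (assumed figures, not measurements): the voltage drop
# on a typical US branch circuit is only a few volts.
COPPER_OHMS_PER_M = 0.00828   # approx. resistance of 14 AWG copper wire

def voltage_drop(amps, one_way_metres):
    """IR drop over the round-trip length of the run."""
    return amps * COPPER_OHMS_PER_M * 2 * one_way_metres

drop = voltage_drop(10.0, 15.0)   # 10 A load, 15 m from the panel
print(round(drop, 2), "V")       # prints 2.48 V, about 2% of 120 V
```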
 
The main reason for supplying both 120 AND 240 V service to a house is that large electric motors and heaters (air conditioning, heat pumps, dryers, well pumps, jacuzzis, water heaters, etc.) can run more efficiently at the higher voltage. This is especially true for electric motors; the electrical distribution system in nearly every country was designed to run motors efficiently.

Running a computer at 240V vs 120V will not make any difference in power consumption, at least none that would actually be noticeable on a power bill.

seanp789 said:
A PSU at 230V vs 110V will still draw the same current

Not so.
When all is said and done, the end result from the power supply is 12, 5, and 3.3 VDC at whatever amps the PSU is rated to deliver on each rail. The PSU takes a certain number of watts of input to do that. Amps * volts = watts. Hence, if the input voltage is doubled, the current draw at the power source is cut in half.
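The constant-output-power behavior described above can be sketched like this (the efficiency figure is an assumed illustration, not an Antec specification):

```python
# Sketch of the constant-output-power model described above; the 80%
# efficiency is an assumed illustration, not a spec for any real PSU.

def input_current(output_watts, input_volts, efficiency=0.8):
    """Mains current drawn by a PSU delivering output_watts of DC."""
    input_watts = output_watts / efficiency  # PSU draws more than it delivers
    return input_watts / input_volts         # Watt's law: I = P / V

# Same 280 W of DC output, two input voltages:
print(input_current(280.0, 115.0))  # ~3.04 A
print(input_current(280.0, 230.0))  # ~1.52 A: half the current, same power
```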
 
Not recommended. While it would work, most computer peripherals are set for 110-115V; you could fry other things besides the system. Also bear in mind that the standard voltage here is 115V, so if you move the machine elsewhere you might not have the hookup if the PSU is not switchable.

If it is switchable, and you have other uses for 240 then go ahead.
 