
electroguy

First of all, I would like to apologize for creating an account solely to ask my question. I honestly don't know where else to turn to find this information; I've exhausted every Google search and the knowledge of the people around me. I'm hoping someone won't mind helping me with this.

I live in a country where, on average, we have power for 3/4 of every day. After years of living with power cuts that disconnect my internet and reset my entertainment devices, I have decided to invest in a battery backup system. A UPS will not suffice, because UPSs are designed to bridge very short outages, and I would like my devices to keep running for the long hours when the power is off. I am therefore setting up an inverter-plus-batteries system. I have purchased a pure sine wave 1400VA inverter, and I now need to calculate my power requirements in order to buy batteries of the right capacity.

I've been reading a lot about the relationships between amps, watts, and volts, and I have to say that after two full days of trying to learn, I am now more confused than when I started.

I'm going to explain this as best I can: let's say I have an internet modem or router, and on the back of this device the power requirement is stated as "12V--2.5A". This leads me to believe that, given W = V × A, the device uses 30 watts of power. However, another thought occurred to me. If I look at the power adapter for this device, it says it requires an input of "100-240V~50/60Hz 1.0A". So, with my limited knowledge, I'm thinking that this adapter uses anywhere from 100-240 watts of power. My country's mains is 220V, so that would make it 220 watts. Now, I know I'm completely wrong about one or more things in the above statement. Can someone please explain which parts I have wrong?
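
To make my confusion concrete, here is the arithmetic for both readings (Python used purely as a calculator; 220V is just my local mains):

```python
# Reading 1: the label on the device itself ("12V--2.5A")
device_volts = 12.0
device_amps = 2.5
device_watts = device_volts * device_amps    # 30 W

# Reading 2: the input rating on its power adapter ("100-240V~ 1.0A")
mains_volts = 220.0                          # my local mains voltage
adapter_amps = 1.0
adapter_watts = mains_volts * adapter_amps   # 220 W?!

print(device_watts, adapter_watts)           # 30.0 vs 220.0 -- which is real?
```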

Also, my TV has a sticker on the back saying something similar: "100-240V~50/60Hz 3.0A". What is the power consumption of my TV? Does it really use half as much power when it's plugged into a US power socket as in a UK power socket? That can't be right...

Ultimately, I need to decide on the battery capacity for my setup. The inverter is a 24-volt unit, so it needs two 12V batteries connected in series. I have to decide whether to get two 75Ah batteries or larger ones, and how much battery time I will get with them with all my devices plugged in. I have come across this equation:

(Battery Voltage in V × Battery Capacity in Ah) / Power Usage in W = Standby Time in hours.
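
And to check that I'm reading the equation right, a quick sanity check with made-up numbers (the 200W load is just a placeholder until I work out my real usage):

```python
battery_volts = 24.0    # two 12V batteries in series
battery_ah = 75.0       # candidate capacity in amp-hours
load_watts = 200.0      # hypothetical total load -- the number I still need to find

standby_hours = (battery_volts * battery_ah) / load_watts
print(standby_hours)    # 9.0 hours, ignoring inverter losses
```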

I'd really like to calculate my total device power usage so that I can finally work out my standby time. Can someone please point out which parts of my reasoning are confused?

Thank you all so much.
 
Forget the ratings on PSUs etc. A clamp meter on the circuit in use will tell you the actual current drain of the equipment. You have a 1400VA inverter; at 220V, that means a supply capable of roughly 6.4A, which corresponds to about 58A @ 24V on the battery side. So at full load, your battery backup is drawing 58A, and 75Ah batteries will last just over 1 hour. Obviously, you will not generally be running at full load. Hope this helps.
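
A rough sketch of that arithmetic, assuming the inverter is loaded to its full 1400VA rating and ignoring inverter efficiency losses:

```python
inverter_va = 1400.0
mains_volts = 220.0
battery_volts = 24.0
battery_ah = 75.0

mains_amps = inverter_va / mains_volts      # ~6.4 A available at 220 V
battery_amps = inverter_va / battery_volts  # ~58.3 A drawn from the 24 V bank

full_load_hours = battery_ah / battery_amps
print(round(mains_amps, 1), round(battery_amps, 1), round(full_load_hours, 2))
# 6.4  58.3  1.29 -- just over an hour at full load; real loads will be lighter
```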
 
I take it you live in India then? Just to add: you should only run what you need from the UPS, i.e. your PC/laptop and essential lights. Also, is it new? I ask because if it is old you may find the batteries won't last as long; their lifespan starts to decay after 5 years, and sooner in a hot country. Also check your UPS; some have a display showing you how much time you have left on it.
 
UPS manufacturer APC did have a tool on their website to calculate the size and type of backup unit for you. May be worth a look?
 
Hi everyone, thanks for your replies. I should have been more specific about my questions. Here is my main question:

When looking for the power consumption of a particular device, do I look at the device itself (which says "12V--2.5A") or at the AC input rating on its adapter (which says "100-240V~50/60Hz 1.0A")?
 
You look at the device itself. The rating on the power adapter is the maximum input current it can handle without frying. To convert a current at 12V to the equivalent at 220V, divide by about 18 (i.e. 220/12). E.g. 2.5A @ 12V is equivalent to roughly 0.14A @ 220V.
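
As a sketch of that conversion (the divide-by-18 comes from 220/12 ≈ 18.3; this assumes a near-lossless adapter, whereas a real one adds perhaps 10-20% on top):

```python
device_volts = 12.0
device_amps = 2.5
mains_volts = 220.0

device_watts = device_volts * device_amps   # 30 W actually consumed
mains_amps = device_watts / mains_volts     # current drawn at 220 V
print(device_watts, round(mains_amps, 2))   # 30.0  0.14
```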
 
You can do as Tel suggested in post number 2, or there is a thing called diversity.
Say your fuse board has two 32A and two 6A MCBs/fuses; you will not use anything approaching 76 amps.
There are methods of working out typical usage, but there is a simple rule of thumb where you add up all the values of the MCBs/fuses and multiply by 0.4 (I think, someone else will confirm!).
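
For what it's worth, here is that rule of thumb applied to the example board above (the 0.4 factor is only the figure quoted from memory, so treat it as indicative):

```python
mcb_ratings = [32, 32, 6, 6]    # the example fuse board above, in amps
diversity_factor = 0.4          # rule-of-thumb figure quoted above -- unverified

total_rating = sum(mcb_ratings)        # 76 A if everything ran flat out
estimated_demand = total_rating * diversity_factor
print(total_rating, estimated_demand)  # 76  30.4
```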
 
Thank you for your response, telectrix! My other question is: how much power does my TV use? It says "100-240V~50/60Hz 3.0A" on the back, but 220 × 3 = 660 watts sounds like way, way too much power, so I'm sure my calculation is wrong. Can you please explain how to work out its power usage? Thank you!
 
It could be that the 3.0A is the current drawn at 100V (worst case). Does it not give a wattage on the set?
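
If the 3.0A is indeed the worst-case input current at 100V, a rough upper bound works out like this (a wattage printed on the set, if there is one, would be more reliable):

```python
min_volts = 100.0        # bottom of the "100-240V" input range
nameplate_amps = 3.0     # worst-case current, drawn at the lowest voltage
mains_volts = 220.0

max_watts = min_volts * nameplate_amps   # ~300 W upper bound, not 660 W
amps_at_220 = max_watts / mains_volts    # same power needs less current at 220 V
print(max_watts, round(amps_at_220, 2))  # 300.0  1.36
```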
 
