Discuss Length of cable in the Electrical Engineering Chat area at ElectriciansForums.net


Does the length of cable de-rate the cable, as it creates a greater resistance and as a result heats up the cable, reducing its current-carrying capacity?

No.
What it does affect is the voltage. That's why you calculate the volt drop of circuits.
If the volt drop is too high then the cable size will have to increase.
 
No. Think about it - if you keep the current the same but double the length, then you double the resistance and double the heat generated. But you have also doubled the length of the cable, so the heat generated per unit length is exactly the same.
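A quick numeric sketch of that point, with assumed figures (a 10 A load and roughly 12.1 mΩ per metre for 1.5 mm² copper at 20 °C):

```python
# Sketch: doubling the run doubles total resistance and total heat,
# but the heat generated per metre of cable stays the same.
I = 10.0          # load current in amps (assumed)
R_PER_M = 0.0121  # ohms per metre of 1.5 mm^2 copper conductor (approximate)

def heat(length_m):
    """Return (total watts dissipated, watts per metre) for a given run."""
    total_w = I ** 2 * R_PER_M * length_m  # P = I^2 * R
    return total_w, total_w / length_m

print(heat(10))  # ~12 W total over 10 m
print(heat(20))  # twice the total, but the same watts per metre
```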
 
Not directly - as already said, the heat loss per unit length is the same.

But when it comes to coping with the overload under fault conditions the situation is rather different. The let-through fault "energy" (the I²t term) for an OCPD generally increases at small overloads, because the t term increases much faster than the I² term decreases.

So under fault conditions a long cable can end up suffering more thermal stress because it took a lot longer for the OCPD to disconnect. For a fuse the curve is fairly simple, dropping down and flattening off, with the worst-case PFC actually having the least I²t!

But an MCB is a lot more complex, and it depends on which region you are in (thermal trip, or "instantaneous" magnetic trip). Here is an example from the Hager commercial catalogue:
[Image: Hager B-curve MCB let-through energy characteristic]
You can see that when you hit the magnetic trip region at 3-5 × In, the fault energy drops massively as the disconnection time falls from 5-25 seconds to tens of milliseconds or so.

However, unlike a fuse, the MCB does not disconnect that much faster as the PFC increases, so there is no shorter t to offset the greater I². So once you get beyond a couple of kA of fault current, a similarly rated BS88 fuse lets through less energy.
 
Strange way of wording it, Jako, but in simple terms yes - I think this is what you are getting at!
If you have, for example, a 3% VD with 20 A at 50 m and then extend the cable, that 3% volt drop can only be maintained by de-rating the circuit current (effectively de-rating the CCC of the cable for that particular circuit). As the circuit current would then be lower than before, I'm not sure the heat generated is significant!
 
I think we are all looking at the OP's statement from differing points of view!

Typically there are several different limits on a cable that have to be taken into account when designing the circuit:
  1. The thermal limit of CCC. This is a product of both the cable thickness (hence the I²R loss per unit length heating it) and the environment it is in (thermally insulated, in duct, open air, etc.) that alters how easily the heat escapes, but it is in itself independent of the length. (post #3)
  2. Voltage drop limit. Here a long cable may end up much heavier than the CCC limit requires, simply to avoid too much VD (posts #2 & #5).
  3. Fault disconnection time limit. This depends on both the OCPD characteristics and the Zs value (i.e. the PFC), which depends on the cable resistance (R1 + R2), and that is length-dependent. So (much like VD) it can result in additional conductor size being needed to meet safe disconnection times.
  4. The fault let-through energy (post #4) depends on the disconnection time and so has a thermal aspect for survival - the adiabatic calculation to see what a fault would do to the insulation as an infrequent stress, which might require a shorter disconnection time than the shock-protection limits in the wiring regs.
So while a cable's thermal CCC is independent of length (just the current and thermal environment), the VD and fault disconnection/adiabatic limits are length-dependent factors to consider.
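A minimal sketch of the adiabatic calculation mentioned in point 4, using the BS 7671 k value for a PVC-insulated copper conductor; the let-through energy figure is an assumption for illustration only:

```python
import math

K = 115.0      # k for a copper conductor with PVC insulation (BS 7671)
I2T = 18000.0  # assumed OCPD let-through energy in A^2*s (illustrative only)

# Adiabatic equation: minimum CSA that survives the fault, S >= sqrt(I^2 t) / k
s_min = math.sqrt(I2T) / K
print(f"minimum conductor CSA: {s_min:.2f} mm^2")
```

Under these assumed numbers the minimum comes out just under 1.2 mm², so a 1.5 mm² conductor would survive - but a longer run with a slower disconnection (hence a larger I²t) might not.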
 
My question was: if the circuit, say, was run in 1.5 mm² cable and can take 10 amps, would a long run reduce that 10 amps? Sorry for being unclear. Thanks for your help guys
 
Yes - at some point you would find it fails to meet the VD or disconnection limits.

So as the length increases from near zero you have a constant limit, the thermal CCC, but then you reach one or more points where you need to derate it to maintain either the required VD (say 5%, or 3% for lights) or adequate fault clearance.
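That shape - flat at the CCC, then falling off with length - can be sketched with assumed numbers (230 V single phase, a 5% VD limit, a 10 A CCC, and roughly 12.1 mΩ/m for 1.5 mm² copper):

```python
V = 230.0
VD_LIMIT = 0.05 * V   # 11.5 V allowed at 5%
R_PER_M = 0.0121      # ohms per metre per conductor, 1.5 mm^2 copper (approx.)
CCC = 10.0            # thermal current-carrying capacity in amps (assumed)

def max_current(length_m):
    """Largest load current the run can carry within both limits."""
    i_vd = VD_LIMIT / (2 * R_PER_M * length_m)  # factor 2: out-and-back loop
    return min(CCC, i_vd)

for L in (10, 40, 50, 100):
    print(L, "m ->", round(max_current(L), 1), "A")
```

Under these assumptions the VD limit only starts to bite at around 48 m; by 100 m the usable current is down near 5 A.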
 
Agreed pc1966 - the wording of the question doesn't help, and of course there are many other things to take into consideration, but I got the impression a trainee was asking a simple question and looking for a simple answer! We all know, though, that nothing is that simple or straightforward when it comes to circuit design.
 

So, for example, if it could take 10 amps at 10 metres, and we went up to 100 metres, would we expect that to drop to, say, 5 amps?
 
Probably in the majority of cases the VD limit will also deal with the disconnection time indirectly. For example, if you need 10 A and have a 10 A MCB or similar, then at 5% VD your fault current is (up to) 20× the MCB rating, so a long cable that has had its size increased to maintain VD is also keeping the PFC up and the clearing time down.

Where that gets into difficulty is when you have a higher-resistance CPC (e.g. T&E with 2.5 L/N and 1.5 E, or SWA with a steel armour CPC that is a much poorer conductor), or on high-current circuits where your cable's R1 + R2 is getting down towards the supply's Ze value.
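The "up to 20×" figure follows directly from the VD limit. A rough sketch, ignoring Ze and assuming the CPC matches the line conductor:

```python
V = 230.0
IN = 10.0           # MCB rating in amps (assumed)
VD_FRACTION = 0.05  # 5% volt-drop limit

# If the circuit drops 5% of 230 V at its 10 A rated current,
# its loop resistance is:
r_circuit = VD_FRACTION * V / IN  # ~1.15 ohms
# A dead short at the far end then sees roughly:
i_fault = V / r_circuit
print(i_fault, i_fault / IN)  # ~200 A, i.e. ~20x the MCB rating
```

With a smaller CPC or a significant Ze the real fault current would be lower than this, which is exactly where the difficulties described above arise.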
 
Maybe - you would need to limit the circuit current, but you really need to do some proper circuit calcs to be sure. It won't automatically drop to 5 A as in your example.
 
It depends on the application:
  • Is it 3% or 5% drop you have to meet?
  • Are you on 230V single phase or 400V three-phase?
They will give you different current limits for the same cable!
 

Okay, thank you. I understand that you need to increase the cable size to satisfy voltage drop and disconnection times. It's just that the regs say a cable can carry a given amount of current depending on its installation method and external influences. The main question I was trying to clear up was whether, when the length of the conductor increases, so does the resistance, meaning the conductor gets hotter, affecting its current-carrying capability. Thanks for all your in-depth answers
 
No, the conductors don't get hotter.
As already said, for a given current, the increase in heat generated will be matched by the increase in length. The heat per length of the cable remains the same. So 1m or 100m - for a given current being carried, the cable will run at the same temperature.
But as described above - at some point as the length increases you will have to use a larger cable to meet volt drop or fault carrying requirements, but not for temperature rise reasons while carrying the normal load current.
 
Resistance will cause heat, like a high-resistance joint etc.

But it's a bit chicken-and-egg: resistance increases with the length of the conductor, resistance causes heat, and heat causes more resistance.

Under normal operation the cable is okay, but under fault conditions the cable could fry.

Edit: unless your CSA is too small in the first place, i.e. below your protective device rating - then it will fry anyway.
 
the main question I was trying to clear up was whether, when the length of the conductor increases, so does the resistance, meaning the conductor gets hotter, affecting the current-carrying capability.

As has been said already, this is incorrect - the conductor does not get hotter as it gets longer.
The conductor will reach the same temperature.

The heat is evenly distributed along the conductor's surface area; for every increase in length there is a proportional increase in surface area to dissipate it.
 
Surely an excessive volt drop from a very long run would cause the conductors to heat up.
Heat per unit length is exactly the same. Total wasted power is higher, of course!

But under fault conditions, the higher overall resistance and resulting lower fault current mean the cable can rise to a much higher temperature, due to the longer disconnection time of the fuse/MCB.
 
