I can't take the credit for this @essex, as it was a mate at work who pointed it out to me. I was initially a sheep, like most, who believed that AFDDs should be detecting most arcs.
As @netblindpaul has pointed out: "If the maximum fault energy to cause a fire is required to be limited by an RCD to 0.3^2 x t, then why is it allowed to be 2.5^2 x t for an AFDD?"
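Just to put a rough number on that comparison, here's my own back-of-envelope sketch. It takes the 0.3 and 2.5 figures from the quote at face value as currents in amps and assumes the same time t applies to both devices, so t cancels out; none of this comes from a standard, it's just the arithmetic.

```python
# Back-of-envelope comparison of the two I^2 x t figures quoted above.
# Assumption: both numbers are currents in amps and the same t applies
# to each device, so t cancels when taking the ratio.

rcd_current = 0.3   # the 0.3 figure from the quote (A)
afdd_current = 2.5  # the 2.5 figure from the quote (A)

# Ratio of the two I^2 x t energy limits for the same t
ratio = (afdd_current / rcd_current) ** 2
print(f"AFDD limit is roughly {ratio:.0f}x the RCD limit for the same t")
# -> AFDD limit is roughly 69x the RCD limit for the same t
```

So on those figures the AFDD is being allowed to let through getting on for 70 times the energy an RCD would, which is the point being made.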
I presume it's down to cost. When you look at how AFDDs detect arcs, if they were expected to trip at 30 mA there would be a lot of nuisance tripping, unless the detection circuitry was much more finely tuned, and that would push the cost of these up even further.