Various conventions were established a long time ago, including some of the points James mentions. The first installations were mostly for lighting, and in the UK, with our early adoption of 230V and similar voltages (not fully standardised, but voltages in this range became a de facto standard well before the national grid arrived in 1929), 5A could supply quite a lot of light, about as much as you would want to group onto a single circuit. Accessories and practices therefore evolved to suit the relevant cables and ratings, and the 5A circuit remained a convention even once outlet circuits became more widespread.
One convention specific to the UK is that until 1947 we used outlets of very different ratings (2A, 5A and 15A) for different purposes. In power terms these correspond roughly to 4A, 10A and 30A at 120V, none of which maps directly onto a NEMA 5-15 or 5-20 in function. It was acceptable to combine lights and 5A outlets on one circuit (and there could be multiple 5A outlets on it), but not 15A outlets: the regulations required that a 15A fuse serve exactly one 15A outlet and nothing else.
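The "correspond in power rating" claim is just V × I arithmetic; a quick sketch confirms the pairs are within about 5% of each other (the tolerance threshold here is my own, purely illustrative):

```python
# Compare the old UK outlet ratings with their quoted 120 V equivalents
# in delivered power. The pairings come from the text above.
UK_V, US_V = 230, 120
pairs = [(2, 4), (5, 10), (15, 30)]  # (UK amps, 120 V amps)

for uk_a, us_a in pairs:
    uk_w, us_w = UK_V * uk_a, US_V * us_a
    print(f"{uk_a}A @ 230V = {uk_w}W  vs  {us_a}A @ 120V = {us_w}W")
```

Running this gives 460W vs 480W, 1150W vs 1200W and 3450W vs 3600W, so "correspond" is approximate but close.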
When the option of the ring final circuit arrived with the 1947 introduction of the fused 13A plug, the 30A circuit fuse was far too large for light fittings to be supplied directly. The 5A lighting circuit now stood in even greater contrast to the outlet circuit, and the two never re-converged, except where lights are supplied from an FCU on an RFC, which is discouraged for the redundancy reasons James also notes.
The latter point became prominent when RCDs became mandatory, and remains relevant now that RCBOs, with separate residual protection for each circuit, are taking over. With what we call a 'dual RCD' board, there are two groups of circuits, each protected by an overall RCD. Good practice is to feed the lights in one half of the house and the outlets in the other half from each RCD, so that a fault which prevents one RCD from being reset still leaves a working source of light in every room.
We still have a regulation, dating mainly from the days of tungsten lighting and a little obsolete now, that the maximum permissible voltage drop on a lighting circuit is only 3%, while 5% is permitted for outlets. This is readily achieved if heavy loads and lights do not share a circuit. The main reason was that filament lamps become drastically less efficient with even a small reduction in voltage, so it was undesirable to let other, less voltage-sensitive loads contribute to the drop on lighting circuits. Filament lamps also flicker annoyingly as the load (and hence the drop) varies, and because of their long hours of use they contributed significantly to energy lost in the cabling.
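To see what the 3% and 5% limits mean in practice, here is a rough sketch of the usual voltage-drop arithmetic. The mV/A/m figures are the familiar tabulated values for common twin-and-earth cable sizes; treat them as illustrative for this example rather than authoritative design data:

```python
# Longest cable run that keeps the voltage drop within a given fraction
# of the 230 V nominal, for a load at the far end of the circuit.
NOMINAL_V = 230.0
MV_PER_A_PER_M = {1.0: 44, 1.5: 29, 2.5: 18}  # cable csa (mm^2) -> mV/A/m

def max_length_m(csa_mm2, load_a, limit_fraction):
    """Max run length (m) for a drop within limit_fraction of nominal."""
    allowed_drop_v = NOMINAL_V * limit_fraction
    drop_per_m_v = MV_PER_A_PER_M[csa_mm2] * load_a / 1000.0
    return allowed_drop_v / drop_per_m_v

# A 6 A lighting load on 1.0 mm^2 cable under the 3% lighting limit:
print(round(max_length_m(1.0, 6, 0.03), 1))  # -> 26.1 (metres)
# The same cable and load under the 5% outlet limit:
print(round(max_length_m(1.0, 6, 0.05), 1))  # -> 43.6 (metres)
```

So the tighter lighting limit cuts the permissible run length by the ratio 3/5, which is one reason heavy loads and long shared runs were kept off lighting circuits.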
There is also the important historical issue of dual tariffs. In some countries, and under some power supply contracts in others, energy for lighting was charged at a different rate from energy for heating, so the lights had to be wired to a separate fuse box and meter.
In some places, e.g. Italy, lighting and general power were at different voltages to discourage misuse and/or to allow the use of lower-voltage lamps, which were more efficient. Italy still has two sizes of outlet and plug, 10A and 16A, historically called 'Luce' (light) and 'Forza' (power) after their functions: you could plug a desk lamp into the lighting circuit but not into the power circuit. These days all outlets accept both sizes of plug and run at the same voltage.
Maybe there are more...