That doesn't have to relate to how the actual gamut is covered. Reading the spec carefully, the essence seems to be that encoding and decoding always come back to fractions between, loosely speaking, 0 and 1, so in a sense the actual bit depth doesn't matter.
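That normalise-to-a-fraction idea can be sketched in a few lines of Python (a toy illustration of the principle, not any particular spec's exact maths):

```python
# Sketch: pixel code values at any bit depth map to the same 0..1
# fraction, so the encoding itself is depth-agnostic.
def to_fraction(value, bits):
    """Map an integer code value to a fraction in [0, 1]."""
    return value / (2**bits - 1)

def from_fraction(fraction, bits):
    """Map a fraction in [0, 1] back to the nearest integer code value."""
    return round(fraction * (2**bits - 1))

# Mid-grey comes out at roughly the same fraction whether it was
# stored in 8 bits or 10 bits.
print(to_fraction(128, 8))    # roughly 0.502
print(to_fraction(514, 10))   # roughly 0.502
# Re-quantising an 8-bit value into a 10-bit range:
print(from_fraction(to_fraction(128, 8), 10))
```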
My view on this goes back to looking at purchasing a monitor not all that long ago. The first thing that became apparent is that it wasn't clear which graphics cards would allow 30-bit through. The second is that displays can upscale 24-bit to 30-bit, with the associated gaps that have been commented on around the web. Then come the connection cables to the display itself. VESA falls short in all sorts of ways now on larger displays. Looking more recently, like a couple of days ago, it still isn't clear to me whether DVI will support 30-bit. It seems DisplayPort will, via a rather short lead (optical may help if available), and as for HDMI, pass, but I suspect it has display resolution limits due to its bandwidth. My screen isn't 1080p, it's 2560x1440.
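Those upscaling gaps are easy to see with a toy sketch: spreading the 256 possible 8-bit values across a 10-bit range only ever lands on 256 of the 1024 available codes, leaving the rest as gaps a native 10-bit source would have used.

```python
# Sketch: upscale every 8-bit code value into the 10-bit range and
# count how many of the 1024 possible 10-bit codes actually get used.
upscaled = {round(v / 255 * 1023) for v in range(256)}

used = len(upscaled)                         # codes hit by upscaling
gaps = len(set(range(1024)) - upscaled)      # codes never reachable

print(used, gaps)  # 256 used, 768 gaps
```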
Then say I produce an aRGB JPEG for the web. Another grey area: is it 8, 10, 12 bits or what? There are two aspects to that. If I adjust using a 10-bit screen, the JPEG itself may still be 8-bit, and I have no idea if that would matter. It also doesn't matter if it's then displayed on a 32-bit floating-point screen, if such a thing existed: it would be translated to a fraction and then to 32-bit FP, but it's still 8-bit information. The other aspect is partly old hat. A browser will colour-manage it OK, but there are web pages showing that this may not work out; it all depends on the tones. With colour management in browsers things have improved, but...
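A quick sketch of why the 8-bit information stays 8-bit however precisely it is stored: promoting every possible 8-bit value to a floating-point fraction still leaves you with exactly 256 distinct levels.

```python
# Sketch: convert every possible 8-bit code value to a float fraction.
# The representation is far more precise, but no new levels appear.
eight_bit = range(256)                       # every possible 8-bit value
as_float = {v / 255.0 for v in eight_bit}    # "promoted" to floating point

print(len(as_float))  # still 256 distinct levels
```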
Also, say I produce a camera aRGB JPEG: what's in that? The JPEG standard is very flexible and is mainly concerned with the compression methods used; it doesn't even mandate RGB, if I remember correctly.
What I am saying is that it's rather difficult to know precisely what goes out of a PC and what happens to it when it's converted to an image that can be displayed and viewed by others. I'm left with the feeling: leave it alone until it's clearly sorted out. At the moment the bandwidth to the screen seems to be a bit of a problem, on large high-res screens in particular. On the other hand, the info I have seen may not be up to date. At this point in time it seems 85%-plus of PCs should be using DisplayPort; a 1 m lead wouldn't reach mine.
It's crazy really, because 30-bit aRGB used universally would get rid of even the need for sRGB. There wouldn't be any gaps: sRGB just becomes a subset of aRGB that may need some minor colour management. Would it cost more? I suspect not, because I strongly suspect that the main difference between the displays is the backlight.
So in real terms I still feel aRGB is a 24-bit colour system.
There has been other rather misleading info around, even about PC colour depth: add an alpha channel for transparency and include it in the quoted bit depth.
I have also read that when it comes to actual aRGB output Adobe only put 24bit out. True, old hat, pass.
John
-