Yes, the inrush current is certainly worth thinking about. With a 6 V bulb in place of a 12 V one of the same wattage, the rated current doubles, so presumably the inrush current would also double.
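A quick sanity check in Python (the 21 W rating is an invented example, and the 10:1 hot-to-cold resistance ratio is a rule of thumb for tungsten, not a measured value):

```python
# Compare a 12 V and a 6 V bulb of the same (assumed) wattage.
P = 21.0                      # rated power in watts (illustrative assumption)

for V in (12.0, 6.0):
    I_rated = P / V           # steady-state current, A
    R_hot = V / I_rated       # hot filament resistance, ohms
    R_cold = R_hot / 10       # cold resistance ~1/10 of hot (tungsten rule of thumb)
    I_inrush = V / R_cold     # peak current the instant the switch closes
    print(f"{V:4.0f} V: rated {I_rated:.2f} A, inrush ~{I_inrush:.1f} A")
```

Halving the voltage at the same wattage doubles the rated current, and the inrush peak doubles with it (here roughly 17.5 A to 35 A).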
Inrush is pretty certainly the time when filaments fail, but why do they fail due to this current, when they have happily survived it for thousands of previous starts?
Normally inrush current is not a problem. Current in itself is not harmful to metals; it is the heating effect that can be. The inrush current may be many times the rated current, but it flows for only a very short time. The heat generated goes as the square of the current (much higher during inrush) multiplied by the duration (very much shorter than at rated current), i.e. E = I^2 * R * t. So even though the inrush may be 10 to 20x the rated current, I believe the filament does not rise above its normal operating temperature during inrush.
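To put rough numbers on that, here is a deliberately pessimistic sketch in Python. The filament mass, the 5 ms pulse length and the cold-resistance ratio are all assumptions for a small 12 V / 21 W bulb, and the filament is held at its cold resistance for the whole pulse, which overstates the heating:

```python
# How much can the inrush pulse heat the filament, worst case?
V       = 12.0                # supply voltage
I_rated = 21.0 / V            # rated current, ~1.75 A (21 W assumed)
R_hot   = V / I_rated         # hot resistance, ~6.9 ohm
R_cold  = R_hot / 10          # cold resistance, assumed 1/10 of hot

m_fil   = 5e-6                # filament mass, ~5 mg (assumed)
c_w     = 134.0               # specific heat of tungsten, J/(kg*K)

t_pulse = 5e-3                # assume full inrush current for a generous 5 ms
I_peak  = V / R_cold          # ~17.5 A, i.e. 10x rated
E_pulse = I_peak**2 * R_cold * t_pulse   # Joule heating, E = I^2 * R * t

dT = E_pulse / (m_fil * c_w)  # adiabatic temperature rise, ignoring all cooling
print(f"peak {I_peak:.1f} A, energy {E_pulse:.2f} J, rise ~{dT:.0f} K")
```

Even this overestimate gives a rise of about 1600 K, comfortably below a tungsten filament's normal operating rise of roughly 2500 K above ambient; in reality the resistance climbs and the current falls as the filament warms, so the true figure is lower still.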
That is the average picture, though. I think failure comes when the filament is no longer uniform: it then behaves as a series of smaller filaments. If one of these is much thinner than average, it will (at any current) generate more heat than the others and therefore run hotter. Eventually, once it becomes thin enough, that section will heat above the melting point even at rated current.
During inrush it suffers disproportionately. Being thinner, it heats up faster than the rest, so its resistance increases faster than the rest, so the heat generated per mm grows still larger and it heats up faster again. So before it is bad enough to fail in constant use, it will be bad enough to fail during inrush.
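A toy model in Python makes that feedback visible. All segment values here are invented, and radiative cooling is ignored, so it only sketches the mechanism, not real temperatures:

```python
# Filament modelled as series segments, one with a thin spot.
alpha = 4.5e-3                 # tungsten resistivity tempco, per K (near room temp)
V     = 12.0
n_seg = 10
thin  = 0.7                    # thin spot has 70% of the normal cross-section

# Per segment: resistance scales as 1/area, heat capacity scales as area.
R0 = [0.069] * n_seg           # cold resistance per segment, ohms (assumed)
C  = [6.7e-5] * n_seg          # heat capacity per segment, J/K (assumed)
R0[0] /= thin                  # thin spot: higher resistance...
C[0]  *= thin                  # ...and less thermal mass

T  = [0.0] * n_seg             # temperature rise above ambient, K
dt = 1e-4
for _ in range(100):           # first 10 ms of inrush, no cooling modelled
    R = [r * (1 + alpha * t) for r, t in zip(R0, T)]
    I = V / sum(R)             # same current through every series segment
    for i in range(n_seg):
        T[i] += I**2 * R[i] * dt / C[i]   # each segment heats as I^2 * R_i

print(f"thin spot: +{T[0]:.0f} K, normal segment: +{T[1]:.0f} K")
```

The thin segment dissipates more power in less thermal mass, so it heats roughly twice as fast from the first instant, and because its resistance then rises faster than its neighbours', it takes a still larger share of the heating as the pulse goes on.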
Why does it get worse over its life? There must be random variations in the original filament. Anywhere it is even slightly thinner, or slightly higher in resistivity, than average will be overheating all the time: a little during continuous use and a lot more during each inrush cycle. That excess heating increases the rate of loss of filament material at that point (by evaporation, I think, though I'm not sure) and so increases the non-uniformity.
So what is the significance of this for the poor old transformer?
Firstly, it must be designed to cope with the normal inrush current pulses, and that is not a difficult design requirement. The inrush pulse is very short, and the thermal inertia of a transformer winding is enormously greater than that of a filament, so it will hardly notice the inrush at all. All wound components are well used to inrush currents, and it is always the associated circuitry, the switches and fuses, which suffers; the transformer or motor never even notices. Current only damages them through heat, and thermal inertia prevents that for short periods (in the case of large motors, inrush may last many seconds, but it is still the fuses and switchgear which burn out).
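That difference in thermal inertia is easy to quantify with the adiabatic heating rate of a wire, dT/dt = rho * J^2 / (density * specific heat), where J is the current density. A Python sketch, with assumed but typical-order current densities:

```python
def heating_rate(rho, density, c, J):
    """Initial temperature rise rate in K/s for a wire at current
    density J (A/m^2), ignoring all cooling: dT/dt = rho*J^2/(density*c)."""
    return rho * J**2 / (density * c)

# Tungsten filament: rated ~1.1e9 A/m^2 (assumed), 10x that during inrush.
filament = heating_rate(rho=5.6e-8, density=19300, c=134, J=10 * 1.1e9)
# Copper winding rated ~3 A/mm^2 (assumed), so a 10x overload is 30 A/mm^2.
winding  = heating_rate(rho=1.7e-8, density=8960, c=385, J=10 * 3.0e6)

print(f"filament: ~{filament:.1e} K/s, winding: ~{winding:.1f} K/s")
```

The filament's temperature moves at millions of kelvin per second under a 10x overload, while the winding warms at only a few kelvin per second; a 50 ms inrush pulse raises the winding temperature by a fraction of a kelvin.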
Secondly, there is no mechanism for the non-uniform deterioration. The wire of a transformer may get slightly warm, but it never reaches temperatures at which metal evaporates or even begins to melt. Of course a transformer ages and deteriorates, through vibration, thermal cycling (albeit over quite small temperature differences), chemical change in the insulation, and so on. Eventually it approaches some failure mode, and as it nears that inevitable end it is likely to fail at a moment of peak stress. But that stress is not the real cause of the failure.
It's like someone driving a rusty old banger over a speed bump and trying to sue the council because his door falls off. Normal, well-maintained cars drive over it all day long without ill effect, and many do so at twice the speed limit. Only a car already in seriously bad condition is likely to suffer.