Detailed state transition models of fault-tolerant systems tend to induce extremely large state spaces, mainly caused by the non-deterministic nature of faults. A well-known countermeasure is the use of partial ordering techniques, yet the remaining state space can still be far too large. This paper presents a special partial ordering criterion to limit fault effects: after the model components have been grouped into single fault regions, the concurrency between these regions is reduced by firing rules for the respective transitions. The rules are based either on a priority scheme or, preferably, on a model of time consumption. It is shown how the approach can be realized in standard SDL without extending the language. The problems of the underlying SDL time model and its relationship to single fault regions are discussed in depth. An experimental evaluation with a large model demonstrates the usefulness of the approach.