ReDoS

From Wikipedia, the free encyclopedia

This is an old revision of this page, as edited by 4SnavaA (talk | contribs) at 01:02, 22 September 2021 (Reverted 1 edit by 45.244.21.147 (talk) to last revision by 2A02:FE1:8081:5B00:6C1F:CFAC:185A:8D80). The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.

Regular expression denial of service (ReDoS)[1] is an algorithmic complexity attack that produces a denial of service by providing a regular expression that takes a very long time to evaluate. The attack exploits the fact that most regular expression implementations have worst-case exponential time complexity: the time taken can grow exponentially with the input size. By supplying such a regular expression, an attacker can cause a program to spend an unbounded amount of time processing, either slowing it down or making it unresponsive.[2][3]

Description

Regular expression ("regex") matching can be done by building a finite-state automaton. A regex can be converted straightforwardly into a nondeterministic finite automaton (NFA), in which for each state and input symbol there may be several possible next states. After building the automaton, several possibilities exist:

  • the engine may convert it to a deterministic finite-state automaton (DFA) and run the input through the result;
  • the engine may try all possible paths one by one until a match is found or all paths have been tried and failed ("backtracking");[4][5]
  • the engine may consider all possible paths through the nondeterministic automaton in parallel;
  • the engine may convert the nondeterministic automaton to a DFA lazily (i.e., on the fly, during the match).
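The third strategy, simulating the NFA by tracking the full set of reachable states, can be sketched in Python. This is a minimal illustration, not any real engine's implementation; the NFA below is hand-built for the pattern (a|aa)*c discussed later in this article.

```python
# Set-of-states (Thompson-style) NFA simulation, sketched for a
# hand-built NFA equivalent to the regex (a|aa)*c.
# States: 0 = start, 1 = middle of the 'aa' branch, 2 = accept.
# transitions[state][symbol] -> set of possible next states.
transitions = {
    0: {"a": {0, 1}, "c": {2}},  # 'a' completes the one-a branch OR starts 'aa'
    1: {"a": {0}},               # second 'a' of the 'aa' branch
    2: {},                       # accept state: no outgoing transitions
}

def nfa_match(s, start=0, accept=2):
    """Run the NFA on s, keeping the set of all currently reachable states.

    Each input character is processed exactly once, so the running time is
    O(len(s) * number_of_states): linear in the input, with no backtracking,
    no matter how ambiguous the NFA is.
    """
    current = {start}
    for ch in s:
        current = {nxt
                   for state in current
                   for nxt in transitions.get(state, {}).get(ch, set())}
        if not current:  # no state can continue: fail early
            return False
    return accept in current

print(nfa_match("aaac"))  # True
print(nfa_match("aaa!"))  # False
```

Because the simulation carries every alternative forward simultaneously, the ambiguity that makes backtracking engines blow up costs nothing more than a slightly larger state set.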

Of the above algorithms, the first two are problematic. The first is problematic because a deterministic automaton could have up to 2ᵐ states, where m is the number of states in the nondeterministic automaton; thus, the conversion from NFA to DFA may take exponential time. The second is problematic because a nondeterministic automaton could have an exponential number of paths of length n, so that walking through an input of length n will also take exponential time.[6] The last two algorithms, however, do not exhibit pathological behavior.

Note that for non-pathological regular expressions the problematic algorithms are usually fast, and in practice one can expect them to "compile" a regex in O(m) time and match it in O(n) time; instead, simulation of an NFA and lazy computation of the DFA have O(m²n) worst-case complexity.[7] Regex denial of service occurs when these expectations are applied to a regex provided by the user, and malicious regular expressions provided by the user trigger the worst-case complexity of the regex matcher.

While regex algorithms can be written in an efficient way, most regex engines in existence extend the regex languages with additional constructs that cannot always be solved efficiently. Such extended patterns essentially force the implementation of regex in most programming languages to use backtracking.
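Backreferences are one such extension: a pattern like (a+)\1 must match the same text twice, which no finite automaton can express, so engines that support them fall back to backtracking search. A small illustration using Python's re module, which is one such backtracking engine:

```python
import re

# A backreference: \1 must repeat exactly what group 1 captured.
# The language it describes (a string followed by a copy of itself) is
# not regular, so it cannot be compiled to a DFA or NFA; engines that
# support \1 must use a backtracking search instead.
pattern = re.compile(r"(a+)\1")

# group 1 captures "aa", and \1 matches the second "aa"
print(bool(pattern.fullmatch("aaaa")))  # True
# an odd number of a's cannot be split into two equal halves
print(bool(pattern.fullmatch("aaa")))   # False
```

Supporting even one such construct forces the whole engine onto the backtracking strategy, which is what exposes it to the pathological cases described below.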

Examples

Exponential backtracking

The most severe type of problem happens with backtracking regular expression matches, where some patterns have a runtime which is exponential in the length of the input string.[8] For strings of n characters, the runtime is O(2ⁿ). This happens when a regular expression has three properties:

  • the regular expression applies repetition (+, *) to a subexpression;
  • the subexpression can match the same input in multiple ways, or the subexpression can match an input string which is a prefix of a longer possible match;
  • and after the repeated subexpression, there is an expression that matches something which the subexpression does not match.

The second condition is best explained with two examples:

  • in (a|a)+$, repetition is applied to the subexpression a|a, which can match a in two ways on each side of the alternation.
  • in (a+)*$, repetition is applied to the subexpression a+, which can match a, aa, etc.

In both of these examples we used $ to match the end of the string, satisfying the third condition, but it is also possible to use another character for this. For example (a|aa)*c has the same problematic structure.

All three of the above regular expressions exhibit exponential runtime when applied to strings consisting of repeated a characters followed by a character that forces a mismatch. For example, matching them against aaaaaaaaaaaaaaaaaaaaaaaa! on a backtracking expression engine takes a very long time to complete, and the runtime approximately doubles for each extra a before the !.
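The doubling can be made concrete by instrumenting a naive backtracking matcher. The sketch below (illustrative only, not any real engine's code) hand-implements backtracking for the pattern (a|a)*! and counts the attempts made before the overall match fails:

```python
import sys

sys.setrecursionlimit(10000)

calls = 0

def match(s, i):
    """Naive backtracking matcher for the pattern (a|a)*! at position i."""
    global calls
    calls += 1
    if i < len(s) and s[i] == "!":
        return i + 1 == len(s)       # '!' must be the final character
    if i < len(s) and s[i] == "a":
        if match(s, i + 1):          # first branch of (a|a)
            return True
        if match(s, i + 1):          # second, identical branch is retried
            return True
    return False

def attempts(n):
    """Matcher calls made when the input is n a's plus a bad suffix."""
    global calls
    calls = 0
    match("a" * n + "x", 0)          # trailing 'x' makes the match fail
    return calls

for n in (5, 6, 7, 8):
    print(n, attempts(n))            # roughly doubles with each extra 'a'
```

Each a adds one more two-way choice point, so the failed search explores about 2ⁿ paths: exactly the doubling per extra character described above.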

It is also possible to have backtracking which is polynomial time O(n²) instead of exponential. Polynomial backtracking can also cause problems for long enough inputs, though less attention has been paid to this problem, as malicious input must be much longer to have a significant effect. An example of such a pattern is a*b?a*x, with the input being an arbitrarily long sequence of a's.
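The quadratic behavior can be illustrated with a hand-instrumented sketch (not a real engine) of how a backtracker tries a*b?a*x: the two a* runs compete for the same a's, so on an all-a input every split point is tried before failure.

```python
def attempts(s):
    """Count the split points a naive backtracker tries for a*b?a*x on s.

    For an input of n 'a's (which cannot match, since there is no final
    'x'), the first a* tries every length from n down to 0, and for each
    of those the second a* tries every remaining length before the
    required 'x' fails: about n^2/2 attempts in total.
    """
    n = len(s)
    tried = 0
    limit = 0
    while limit < n and s[limit] == "a":
        limit += 1
    for i in range(limit, -1, -1):            # first a*: greedy, backs off
        starts = [i]
        if i < n and s[i] == "b":             # b? : present or absent
            starts.append(i + 1)
        for start in starts:
            j = start
            while j < n and s[j] == "a":      # second a*: greedy
                j += 1
            for k in range(j, start - 1, -1): # second a* backs off too
                tried += 1
                if k < n and s[k] == "x" and k + 1 == n:
                    return tried              # matched (never, on all-a input)
    return tried

print(attempts("a" * 10))  # 66  (= 11*12/2 split points)
print(attempts("a" * 20))  # 231 (about four times larger)
```

Doubling the input length roughly quadruples the work, so an attacker needs an input thousands of characters long, rather than a few dozen, to cause comparable damage.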

Vulnerable regexes in online repositories

So-called "evil" or malicious regexes have been found in online regular expression repositories. Note that it is enough to find an evil subexpression in order to attack the full regex:

  1. RegExLib, id=1757 (email validation) - the repeated group (([\-.]|[_]+)?([a-zA-Z0-9]+))* is the evil subexpression
    ^([a-zA-Z0-9])(([\-.]|[_]+)?([a-zA-Z0-9]+))*(@){1}[a-z0-9]+[.]{1}(([a-z]{2,3})|([a-z]{2,3}[.]{1}[a-z]{2,3}))$
  2. OWASP Validation Regex Repository, Java Classname - the repeated group (([a-z])+.)+ is the evil subexpression
    ^(([a-z])+.)+[A-Z]([a-z])+$

These two examples are also susceptible to the input aaaaaaaaaaaaaaaaaaaaaaaa!.
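For instance, the Java-classname pattern can be exercised directly in Python's backtracking re engine. This is an illustration only, deliberately kept to a short hostile input so that it finishes quickly:

```python
import re

# The OWASP Java-classname pattern from the list above. The nested
# quantifier (([a-z])+.)+ is the evil subexpression: each iteration can
# split the input in many ways, and a failing suffix forces the engine
# to try all of them.
classname = re.compile(r"^(([a-z])+.)+[A-Z]([a-z])+$")

print(bool(classname.match("java.lang.String")))  # True: a valid class name
# Kept deliberately short: each extra 'a' multiplies the work the
# backtracking engine does before giving up.
print(bool(classname.match("aaaaaaaa!")))          # False, after backtracking
```

With the full 24-character aaaa…! input from above, the same call can take far longer, which is exactly the denial-of-service condition.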

Attacks

If the regex itself comes from user input, an attacker can inject a malicious regex and make the system vulnerable. Therefore, in most cases regular expression denial of service can be avoided by removing the possibility for the user to execute arbitrary patterns on the server. Web applications and databases are the main applications vulnerable in this way. Alternatively, a malicious page could hang the user's web browser or cause it to use arbitrary amounts of memory.

However, some of the examples in the above paragraphs are considerably less "artificial" than the others; they demonstrate how vulnerable regexes can arise from ordinary programming mistakes. In this case, e-mail scanners and intrusion detection systems could also be vulnerable. Fortunately, in most cases the problematic regular expressions can be rewritten as "non-evil" patterns. For example, (.*a)+ can be rewritten to ([^a]*a)+.
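The rewritten pattern accepts the same strings (any newline-free string ending in a), but ([^a]*a)+ leaves the engine only one way to carve up the input, eliminating the backtracking blow-up. A quick Python check of the equivalence on short inputs (illustrative; small lengths keep the evil variant fast):

```python
import re

evil = re.compile(r"(.*a)+")     # ambiguous: many ways to split the a's
safe = re.compile(r"([^a]*a)+")  # each repetition must end at the next 'a'

# Both patterns should accept exactly the same strings. Exhaustively
# compare them on every string over {a, b} up to length 9; inputs are
# kept short so the evil pattern stays fast even when it fails.
for n in range(10):
    for bits in range(2 ** n):
        s = "".join("ab"[(bits >> k) & 1] for k in range(n))
        assert bool(evil.fullmatch(s)) == bool(safe.fullmatch(s))
print("patterns agree on all tested strings")
```

The key design idea is to remove ambiguity: when the repeated subexpression can consume a given character in only one way, a failing match fails in linear time.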

In the case of a web application, the programmer may use the same regular expression to validate input on both the client and the server side of the system. An attacker could inspect the client code, looking for evil regular expressions, and send crafted input directly to the web server in order to hang it.

References

  1. ^ OWASP (2010-02-10). "Regex Denial of Service". Retrieved 2010-04-16.
  2. ^ RiverStar Software (2010-01-18). "Security Bulletin: Caution Using Regular Expressions". Retrieved 2010-04-16.
  3. ^ Ristic, Ivan (2010-03-15). ModSecurity Handbook. London, UK: Feisty Duck Ltd. p. 173. ISBN 978-1-907117-02-2.
  4. ^ Crosby and Wallach, Usenix Security (2003). "Regular Expression Denial Of Service". Retrieved 2010-01-13.
  5. ^ Bryan Sullivan, MSDN Magazine (2010-05-03). "Regular Expression Denial of Service Attacks and Defenses". Retrieved 2010-05-06.
  6. ^ Kirrage, J.; Rathnayake, A.; Thielecke, H. (2013). "Static Analysis for Regular Expression Denial-of-Service Attacks". Network and System Security. Madrid, Spain: Springer. pp. 135–148. arXiv:1301.0849. doi:10.1007/978-3-642-38631-2_11.
  7. ^ Lazy computation of the DFA can usually reach the speed of deterministic automatons while keeping worst case behavior similar to simulation of an NFA. However, it is considerably more complex to implement and can use more memory.
  8. ^ Jim Manico and Adar Weidman (2009-12-07). "OWASP Podcast 56 (ReDoS)". Retrieved 2010-04-02.