Mirror of http://git.haproxy.org/git/haproxy.git/, synced 2024-12-24 13:42:16 +00:00
8a1027aa45
For tokenizing a string, standard Lua recommends using regexes. The following example splits words:

for i in string.gmatch(example, "%S+") do print(i) end

This is a little bit overkill for simply splitting words. This patch adds a tokenize function which is quick and does not use regexes.
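For context, here is a short runnable Lua sketch: the first loop is the gmatch example quoted in the commit message, and the pattern-free tokenizer that follows is a hypothetical illustration of the idea only; the function this patch actually adds is exposed through HAProxy's Lua API, and its real name and signature are not reproduced here.

```lua
-- Standard-Lua approach from the commit message: split on whitespace
-- using a pattern.
local example = "hello haproxy world"
for i in string.gmatch(example, "%S+") do
    print(i)
end

-- Illustrative sketch of a pattern-free tokenizer in plain Lua
-- (hypothetical; not the actual HAProxy API). It scans the string
-- character by character and splits on any character found in `seps`.
local function tokenize(str, seps)
    local tokens, cur = {}, {}
    for idx = 1, #str do
        local c = str:sub(idx, idx)
        -- plain find (fourth argument true): no pattern matching involved
        if seps:find(c, 1, true) then
            if #cur > 0 then
                tokens[#tokens + 1] = table.concat(cur)
                cur = {}
            end
        else
            cur[#cur + 1] = c
        end
    end
    if #cur > 0 then
        tokens[#tokens + 1] = table.concat(cur)
    end
    return tokens
end

-- Usage: split on space and tab.
for _, w in ipairs(tokenize("hello haproxy world", " \t")) do
    print(w)
end
```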
Files in this directory:

..
_static
conf.py
index.rst
Makefile