fish-shell/sphinx_doc_src
Fabian Homborg 86133b0a2b Add read --tokenize
This splits a string into variables according to the shell's
tokenization rules, considering quoting, escaping etc.

This runs an automatic `unescape` on the string, so each token is presented
the way it would be passed to the command. E.g.

    printf '%s\n' a\ b

returns the tokens

    printf
    %s\n
    a b

It might be useful to add another mode "--tokenize-raw" that doesn't
do that, but this seems to be the more useful of the two.
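For context, a minimal sketch of how the option might be exercised from a
pipe, assuming `read`'s existing `--list` flag (collect the results into a
single list variable) composes with the new `--tokenize` switch:

    # Sketch only: --list is assumed here to store the tokens in $tokens.
    echo "printf '%s\n' a\ b" | read --tokenize --list tokens

    count $tokens
    # 3

    printf '%s\n' $tokens
    # printf
    # %s\n
    # a b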

Fixes #3823.
2019-12-01 18:14:26 +01:00
Name                  Last commit                                                            Last update
_static               sphinx: honor changes in static html assets                            2019-10-19 14:52:24 +02:00
cmds                  Add read --tokenize                                                    2019-12-01 18:14:26 +01:00
commands.rst          docs: Remove explicit .html links                                      2019-04-30 13:11:33 +02:00
conf.py               docs: restore compatibility with Sphinx < 1.8.0                        2019-11-26 18:17:20 +08:00
design.rst            docs: Remove explicit .html links                                      2019-04-30 13:11:33 +02:00
faq.rst               Support FOO=bar syntax for passing variables to individual commands   2019-11-25 09:20:51 +01:00
fish_indent_lexer.py  Reformat all files                                                     2019-05-05 12:09:25 +02:00
index.rst             Add individual documentation pages for string's subcommands           2019-11-07 09:54:25 +01:00
license.rst           Update copyright information                                           2019-08-08 18:47:36 +02:00
tutorial.rst          Disavow IRC channel                                                    2019-11-30 09:29:49 +01:00