This used the *logical* $PWD, but realpath operates on the
*physical* $PWD if given ".", even with -s. This makes the test fail if the
logical $PWD differs from the physical one.
This was long overdue since the setup logic is much more complex than
the actual tests.
tmux-prompt.fish had extra logic to protect against an XDG_CONFIG_HOME
with a leading double double-dot. I believe this is no longer necessary
with the new test driver.
We still use our own temp dir because we want to be able to run this
independently of the test driver, which can be useful for debugging
tests. For example, we can insert a "$tmux attach" command in a test
and then run
build/fish -C 'source tests/test_functions/isolated-tmux.fish' tests/checks/tmux-bind.fish
This lets us inspect the state of the test and debug interactively.
Attaching to the terminal doesn't work when running inside littlecheck
because littlecheck consumes our output and doesn't give us a terminal.
(Maybe there's an easy way to fix that?)
On request of a team member, this patches `basic.fish` to no longer
depend on being invoked by the test driver and started up in a $PWD that
points to a clean temporary directory.
This was requested by a team member who would like some tests to
remain invokable (in their own $HOME) directly via littlecheck without
relying on the test driver to prep the environment.
A comment explaining the rationale is also added so this doesn't get
passed down as folklore "you need to include this for tests to run" even
though no one understands why.
Tests are now executed in a test-specific temporary directory, so test
output on failure should be reproducible/reusable as-is without needing
to have TMPDIR defined (as it only exists by default under macOS).
Instead of trying to assert that there are no zombies when the test
starts (which often fails) and to prevent conflating existing or
irrelevant zombies with the ones we are interested in checking for,
have `ps` also emit the parent process id and filter its output to
include only children of the current fish instance.
Aside from the fact that the shared state could cause problems, tests
were randomly assuming it would already have been created when that wasn't
the case. In particular, `redirect.fish` and `basic.fish` were failing only
on macOS because `../test/temp` didn't exist yet - it would only be created
by other tests later.
This disables job control inside command substitutions. Prior to this
change, a cmdsub might get its own process group. This caused it to fail
to cancel loops properly. For example:
while true ; echo (sleep 5) ; end
could not be cancelled with control-C, because the signal would go to `sleep`,
and so the loop would continue on. The simplest way to fix this is to
match other shells and not use job control in cmdsubs.
Related: #1362
* commandline: Add --is-valid option to query whether it's syntactically complete
This means querying whether the commandline is in a state where it could
be executed, because our `execute` bind function inserts a newline
instead if it isn't.
One case that's not handled right now: `execute` also expands
abbreviations, which can technically make the commandline invalid
again.
Unfortunately we have no real way to *check* without doing the
replacement.
Also, since abbreviations are only available in command position, the
commandline will most likely still be valid when you _execute_ them.
This is enough to make transient prompts work:
```fish
function reset-transient --on-event fish_postexec
    set -g TRANSIENT 0
end

function maybe_execute
    if commandline --is-valid
        set -g TRANSIENT 1
        commandline -f repaint
    else
        set -g TRANSIENT 0
    end
    commandline -f execute
end

bind \r maybe_execute
```
and then in `fish_prompt` react to $TRANSIENT being set to 1.
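The prompt side might then look something like this (a hypothetical sketch, not part of the change):
```fish
function fish_prompt
    if test "$TRANSIENT" = 1
        # Minimal prompt for already-executed commandlines
        printf '> '
    else
        printf '%s %s> ' (whoami) (prompt_pwd)
    end
end
```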
Because we are, ultimately, interested in how many cells a string
occupies, we *have* to handle carriage return (`\r`) and line
feed (`\n`).
A carriage return sets the current tally to 0, and only the longest
tally is kept. The idea here is that the last position is the same as
the last position of the longest string. So:
abcdef\r123
ends up looking like
123def
which is the same width as abcdef, 6.
A line feed, meanwhile, means we flush the current tally and start a new
one. Every line's width is printed separately, even if the input is given
as one string. That's because, well, counting the width over multiple lines
doesn't *help*.
As a sidenote: This is necessarily imperfect, because, while we may
know the width of the terminal ($COLUMNS), we don't know the current
cursor position. So we can only give the width, and the user can then
figure something out on their own.
But for the common case of figuring out how wide the prompt is, this
should do.
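Assuming this surfaces as `string length --visible` (the flag name is an assumption here, not stated above), the rules would play out like this:
```fish
string length --visible abcdef\r123
# 6    (the \r resets the tally; only the longest tally counts)
string length --visible ab\ncdef
# 2
# 4    (each line is measured on its own)
```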
* Add `set --function`
This makes the function's scope available, even inside of blocks. Outside of blocks it's the toplevel local scope.
This removes the need to declare variables locally before use, and will probably end up being the main way variables get set.
E.g.:
```fish
set -l thing
if condition
    set thing one
else
    set thing two
end
```
could be written as
```fish
if condition
    set -f thing one
else
    set -f thing two
end
```
Note: Many scripts shipped with fish use workarounds like `and`/`or`
instead of `if`, so it isn't easy to find good examples.
Also, if there isn't an else branch in the above, i.e. just
```fish
if condition
    set -f thing one
end
```
that means something different from setting it beforehand! Now, if
`condition` isn't true, it would use a global (or universal) variable of
the same name!
Some more interesting parts:
Because it *is* a local scope, setting a variable `-f` and
`-l` in the toplevel of a function ends up the same:
```fish
function foo2
    set -l foo bar
    set -f foo baz # modifies the *same* variable!
end
```
but setting it locally inside a block creates a new local variable
that shadows the function-scoped variable:
```fish
function foo3
    set -f foo bar
    begin
        set -l foo banana
        # $foo is banana
    end
    # $foo is bar again
end
```
This is how local variables already work. "Local" is actually "block-scoped".
Also `set --show` will only show the closest local scope, so it won't
show a shadowed function-level variable. Again, this is how local
variables already work, and could be done as a separate change.
As a fun tidbit, functions with --no-scope-shadowing can now use this to set variables in the calling function. That's probably okay given that it's already an escape hatch (but to be clear: if it turns out to be problematic I reserve the right to remove it).
Fixes #565
Fixes some regressions from 35ca42413 ("Simplify some parse_util functions").
The tmux tests are not beautiful but I find them easy to write.
Probably a pexpect test would also be enough here?
for PWD in foo; true; end
prints:
>..src/parse_execution.cpp:461: end_execution_reason_t parse_execution_context_t::run_for_statement(const ast::for_header_t&, const ast::job_list_t&): Assertion `retval == ENV_OK' failed.
because this used the wrong way to see if something is read-only.
This allows us to test that `test` accepts numbers with a decimal point even in comma-using locales,
to stop those pesky Americans from breaking everything again.
(And yes, we use French to keep ourselves honest.)
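For illustration (not the actual test), the kind of check this enables might look like:
```fish
# In a locale whose decimal separator is a comma, "." must still work:
env LC_ALL=fr_FR.UTF-8 fish -c 'test 42.5 -gt 37.2; and echo ok'
# ok
```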
Through a mechanism I don't entirely understand, $PWD is sometimes
writable (so that `cd` can change it) and sometimes not.
In this case we ended up with it writable, which is wrong.
See #8179.
This didn't do all the syntax checks, so something like
fish -c 'echo foo; and $status'
complained of a missing command `0` (i.e. $status), and
fish -c 'echo foo | exec grep'
hit an assert!
So we do what read_ni does: parse each command into an ast, run
parse_util_detect_errors on it if parsing worked, and then eval the ast.
It might be possible to do this more neatly by modifying parser::eval, but I
can't find where.
This is slightly unclean. Even though it would otherwise be syntactically
valid, using $status as a command is very very very likely to be an
error, like
if not $status
We have reports of this surprisingly regularly, including #2773.
Because $status can only ever be a value from 0 to 255, it is also
very unlikely to be an actual command, and that command is very
unlikely to do what you want.
So we simply point the user towards the "conditions" help section,
which should explain things.
This is opt-in through a new feature flag "ampersand-nobg-in-token".
When this flag and "qmark-noglob" are enabled, this command no longer
needs quoting:
curl https://example.com/thing?foo=bar&duran=duran
Compared to the previous approach e1570a4 ("Let '&' only separate as
the first char of a word"), this has some advantages:
1. "&&" and "&>" are no longer affected. They are still special, even
if used between tokens without spaces, like "echo bar&>foo".
Maybe this is not really *better*, but it avoids the risk of annoying
users by breaking the old variant.
2. "&" is still special if at the end of a token, like in "sleep 1&".
Word movement is not affected by the semantics change, so Alt-F and
friends still stop at every "&".
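For reference, opting in would look something like this (feature flags are toggled via the fish_features universal variable):
```fish
# Enable both flags, then restart fish for them to take effect:
set -Ua fish_features ampersand-nobg-in-token qmark-noglob
# Afterwards this needs no quoting:
curl https://example.com/thing?foo=bar&duran=duran
```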
Currently, if a "return" is given outside of a function, we'd just
throw an error.
That always struck me as a bit weird, given that scripts can also
return a value.
So simply let "return" outside also exit the script, kinda like "exit"
does.
However, unlike "exit" it doesn't quit an interactive shell - it seems
weird to have "return" do that as well. It sets $status, so it can be
used to quickly set that, in case you want to test something.
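A small illustration (a hypothetical script, not from the change itself):
```fish
# myscript.fish - "return" at script top level now exits the script
if not test -d $argv[1]
    echo "not a directory: $argv[1]" >&2
    return 2
end
echo "looks good"
```
Running `fish myscript.fish /nonexistent` would then exit with status 2, while typing `return 2` interactively just sets $status without closing the shell.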
In the variable handler, we just go through the entire thing and keep
every element once.
If there's a duplicate, we set it again, which calls the handler
again.
This takes a bit of time, to be paid on each startup. On my system,
with 100 already deduplicated elements, that's about 4ms (compared to
~17ms for adding them to $PATH).
It's also semantically more complicated - now this variable
specifically is deduplicated? Do we just want "unique" variables that
can't have duplicates?
However: This entirely removes the pathological case of appending to
$fish_user_paths in config.fish (which should be an FAQ entry!), and the implementation is quite simple.
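The pathological pattern this defuses is roughly (illustrative; `~/.local/bin` is just a placeholder):
```fish
# In config.fish - runs on every startup; fish_user_paths is typically
# universal, so without deduplication this adds one more copy per session:
set -Ua fish_user_paths ~/.local/bin
```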
This adds a hack to the parser. Given a command
echo "x$()y z"
we virtually insert double quotes before and after the command
substitution, so the command internally looks like
echo "x"$()"y z"
This hack allows us to reuse the existing logic for handling (recursive)
command substitutions.
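For illustration, the user-visible effect is that command substitutions now work inside double quotes:
```fish
echo "x$(echo hello)y z"
# xhelloy z
```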
This makes the quoting syntax more complex; external highlighters
should consider adding support for this if possible.
The upside (more Bash compatibility) seems worth it.
Closes #159
In some setups (e.g. MacPorts), $tmpdir can expand to more than
100 characters, and tests fail with 'socket file name too long'
errors.
Using a relative path to the socket file fixes the issue.
* string: Allow `collect --no-empty` to avoid empty elision
Currently we still have that issue where
test -n (thing | string collect)
can return true if `thing` doesn't print anything, because the
collected argument will still be removed.
So, what we do is allow `--no-empty` to be used, in which case we
print one empty argument.
This means
test -n (thing | string collect -n)
can now be safely used.
"no-empty" isn't the best name for this flag, but string's design
really incentivizes reusing names, and it's not *terrible*.
* Switch to `--allow-empty`
`--no-empty` does the exact opposite for `string split` and split0.
Since `-a`/`--allow-empty` already exists, use it.
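A quick illustration of the final behaviour (using `true` as a stand-in for a command that prints nothing):
```fish
# Without --allow-empty the substitution produces no argument at all,
# so `test -n` is called with no operand and returns true.
test -n (true | string collect)
and echo "claims non-empty"   # this would run

# With --allow-empty an empty argument is passed, so `test -n ""` is false.
test -n (true | string collect --allow-empty)
or echo "correctly empty"
```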