Instead of trying to assert that there are no zombies when the test
starts (which often fails), and to avoid conflating pre-existing or
irrelevant zombies with the ones we are interested in checking for,
have `ps` also emit the parent process id and filter its output to
include only children of the current fish instance.
Aside from the fact that the shared state could cause problems, tests
were implicitly assuming it would already have been created when that
wasn't the case.
In particular, `redirect.fish` and `basic.fish` were failing only on
macOS because `../test/temp` didn't exist yet - it would be created by
other tests later.
The default matching logic for fish_tests was prefix-based, so when we
ran `history` we were also running every other test whose name starts
with `history`. This caused the test to fail for an unknown reason.
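A minimal sketch of the difference between the two matching modes; the function and group names are illustrative, not fish_tests' actual selection code:

```cpp
#include <cstdio>
#include <cstring>

// Prefix matching: asking for "history" also selects "history_races", "history_merge", etc.
static bool prefix_match(const char *group, const char *filter) {
    return std::strncmp(group, filter, std::strlen(filter)) == 0;
}

// Exact matching: asking for "history" selects only the group literally named "history".
static bool exact_match(const char *group, const char *filter) {
    return std::strcmp(group, filter) == 0;
}

int main() {
    std::printf("%d %d\n",
                prefix_match("history_races", "history"),   // 1: selected by the prefix logic
                exact_match("history_races", "history"));   // 0: not selected by exact matching
    return 0;
}
```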
Even though we are using CMake's ctest for testing, we still define our
own `make test` target rather than use its default for many reasons:
* CMake doesn't run tests in-proc or even add each test as an
individual node in the ninja dependency tree; instead it just bundles
all tests into a target called `test` that always shells out to
`ctest`, so we lose no build-related benefit by shelling out to
`ctest` ourselves.
* CMake devs insist that it is appropriate for `make test` to never
depend on `make all`, i.e. running `make test` does not require any
of the binaries to be built before testing.
* The only way to have a test depend on a binary is to add a fake test
with a name like "build_fish" that executes CMake recursively to
build the `fish` target.
* It is not possible to set top-level CTest options/settings such as
CTEST_PARALLEL_LEVEL from within the CMake configuration file.
* Circling back to the point about individual tests not being actual
Makefile targets, CMake does not offer any way to execute a named
test via the `make`/`ninja`/whatever interface; the only way to
manually invoke test `foo` is to run `ctest` and specify a regex
matching `foo` as an argument, e.g. `ctest -R ^foo$`... which is
really crazy.
With this patch, it is now possible to execute any single test by name,
by invoking the build directly, e.g. to run the `universal.fish` check:
`cmake --build build --target universal.fish` or
`ninja -C build universal.fish`. Unfortunately, this is not integrated
into the Makefile wrapper, so `make universal.fish` won't work (although
this can potentially be hacked around).
Instead of compiling `fish_tests.cpp` dynamically with weakly-linked
symbols and asking it to print the list of all available tests, we
use a magic string `#define`'d as a no-op to allow CMake to regex search
for matching test groups. This speeds up configuration somewhat (by not
compiling anything), but more importantly, it's much less brittle and
doesn't involve any linker dark magic.
There's of course still no getting around the fact that it's really ugly.
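A minimal sketch of the no-op marker idea; the macro name and test groups here are illustrative, not fish's actual ones:

```cpp
// The macro expands to nothing, so it has zero effect on compilation. Its only job
// is to leave a fixed, greppable string in fish_tests.cpp that the build system's
// regex search can pick up to enumerate test groups without compiling anything.
#define TEST_GROUP(name)

TEST_GROUP("history")
static void test_history() { /* ... */ }

TEST_GROUP("universal")
static void test_universal() { /* ... */ }

int main() {
    test_history();
    test_universal();
    return 0;
}
```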
We have a *lot* of color sequences to try, and tparm is slow (in
aggregate, when you do this thousands of times).
So let's just check colors last, which makes everything else (which is
comparatively nothing) faster, while barely impacting
colors (benchmarking confirms no measurable difference).
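A minimal sketch of the reordering, with stub checks standing in for fish's actual escape-sequence detection (the sequences and categories are illustrative):

```cpp
#include <string>

enum class seq_kind { cursor, other, color, none };

// Cheap checks (stubs standing in for fish's real detection):
static bool is_cursor_seq(const std::wstring &s) { return s == L"\x1b[A"; }
static bool is_other_seq(const std::wstring &s) { return s == L"\x1b[K"; }
// Expensive check (stands in for the tparm-heavy loop over all color sequences):
static bool is_color_seq(const std::wstring &s) { return s.rfind(L"\x1b[3", 0) == 0; }

seq_kind classify_escape(const std::wstring &s) {
    // Try the cheap categories first so the common cases never pay the tparm cost;
    // colors are checked last and barely notice the extra comparisons.
    if (is_cursor_seq(s)) return seq_kind::cursor;
    if (is_other_seq(s)) return seq_kind::other;
    if (is_color_seq(s)) return seq_kind::color;
    return seq_kind::none;
}
```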
Fixes #8253.
* Fix ls.fish: add -l option to GNU ls
* Sort alphabetically and remove --lcontext and --scontext (what are these?) in the shared and GNU parts.
* Revert --lcontext and --scontext options.
Unfortunately, we now need to know which .html file has which sections
to link to the correct one in help.fish.
So this script helps extract the sections from pre-built docs. It's
not supposed to be run at build time because
1. These change rarely.
2. We should link to the correct document even if the user doesn't
have the docs built.
And before anyone mentions it: This does *not* parse html with regex.
This "parses" the restricted subset of "class followed by href without
embedded quotes" that sphinx uses here in practice.
This should add all the sections that aren't linked internally,
including "identifiers".
(also give up on the line breaking because it makes it annoying to do
automatically)
Fixes #8245.
Fixes #8232.
Note that this needed to have expect_prompt used in the pexpect test -
we might want to add a "catchup" there so you can just ignore the
prompt counter for a bit and pick it back up later.
This was semi-automated with
```fish
for file in $argv
    set -l varname (string replace -r '.*/(.*).html' '$1' -- $file | string escape --style=var)pages
    set -l sections (string replace -rf '.*class="headerlink" href="#([^"]*)".*' '$1' <$file)
    echo set -l $varname $sections
end
```
(where $argv contains the paths to faq, fish_for_bash_users,
interactive, language and tutorial.html)
Building help.fish at compile time would work, but only for users who
build the docs.
* Remove safe_strerror, safe_perror and safe_append
This no longer works on new glibcs because they removed sys_errlist.
So just hardcode the relevant errno messages (and phrase them better).
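A minimal sketch of the hardcoding approach; the exact set of errnos and the wording here are illustrative, not fish's actual table:

```cpp
#include <cerrno>

// Async-signal-safe replacement for looking messages up via sys_errlist:
// just map the errno values we actually report to static strings.
const char *safe_error_message(int err) {
    switch (err) {
        case EACCES: return "Permission denied";
        case ENOENT: return "No such file or directory";
        case ENOEXEC: return "File is not executable";
        case ENOMEM: return "Out of memory";
        default: return "Unknown error";
    }
}
```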
Fixes #4183.
Co-authored-by: Johannes Altmanninger <aclopte@gmail.com>
The clang warning for pending_signals_t was about the operator=
return type being wrong (misc-unconventional-assign-operator).
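A minimal sketch of the signature that check expects; this is not fish's actual pending_signals_t, just the convention clang-tidy enforces:

```cpp
struct example_t {
    int value = 0;

    // misc-unconventional-assign-operator wants operator= to take the conventional
    // parameter type and to return a non-const reference to *this.
    example_t &operator=(const example_t &other) {
        value = other.value;
        return *this;
    }
};
```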
Signed-off-by: Rosen Penev <rosenp@gmail.com>
We don't want to convert the input to a "wcstring &" because
"stage_variables" needs to have the same type as other stages, so we
can use it in a loop. Communicate that to clang-tidy.
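One way to communicate it is an inline suppression; the check name below is an assumption about which clang-tidy diagnostic fired, and the function body is illustrative:

```cpp
#include <string>
using wcstring = std::wstring;

// Keep taking the argument by value so this stage has the same signature as the
// other stages used in the loop, and tell clang-tidy the copy is deliberate.
wcstring stage_variables(wcstring input) {  // NOLINT(performance-unnecessary-value-param)
    return input;
}
```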
We also don't want to take "wcstring &&". As the Google style guide
states, it's not really beneficial here, and it potentially hurts
readability because it's a relatively obscure feature.
The rest of our code contains a bunch of && parameters. We might
want to get rid of some of them.
Closes #8227
clang-tidy wrongly sees an std::move to a const ref parameter and
believes it to be pointless. The copy constructor however is deleted.
Signed-off-by: Rosen Penev <rosenp@gmail.com>
This disables job control inside command substitutions. Prior to this
change, a cmdsub might get its own process group. This caused it to fail
to cancel loops properly. For example:
while true ; echo (sleep 5) ; end
could not be control-C cancelled, because the signal would go to sleep,
and so the loop would continue on. The simplest way to fix this is to
match other shells and not use job control in cmdsubs.
Related: #1362
The same hack that is used for `pkg remove <foo>` is required here, too.
Due to the massive number of results, we use `head -n 250` to prevent
the completion from hanging or the shell from being overencumbered by
too many possible completions. However, this would only generate matches
from among the first 250 packages, rather than printing the first 250
packages that match.
[ci skip]
The previous layout confused me for a minute as it suggested it was
possible for `pipe_next_read` to be moved twice (once in the first
conditional block, then again when the deferred process conditional
called `continue` - if and only if the deferred process *was* the last
process in the job). This patch clarifies that can't be the case.
`pipe_next_read` is moved in the body of the loop, and not
re-initialized on the last iteration. However, we call
`pipe_next_read.close()` after the loop, which is undefined behavior (as
it has been moved from).
In the best-case scenario, the compiler passed the address of our copy
of the struct to `exec_process_in_job` and beyond, it went out of scope
there, the value of `fd` was set to closed (minus one), and our explicit
call to `.close()` does nothing.
In the worst-case scenario, the compiler reuses the storage of the
now-moved struct for something else and our call to `.close()` ends up
closing some other `fd` value (valid or invalid) and things break.
Aside from the fact that we obviously don't need to close it since it's
not assigned for the last process in the job, it's a RAII object so we
don't have to worry about manually closing it in the first place.
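A minimal sketch of the pattern described above; `autoclose_fd_t` here is a stand-in for fish's RAII fd wrapper, and the loop is heavily simplified:

```cpp
#include <unistd.h>

#include <utility>
#include <vector>

struct autoclose_fd_t {
    int fd = -1;
    autoclose_fd_t() = default;
    explicit autoclose_fd_t(int f) : fd(f) {}
    autoclose_fd_t(autoclose_fd_t &&rhs) noexcept : fd(rhs.fd) { rhs.fd = -1; }
    autoclose_fd_t &operator=(autoclose_fd_t &&rhs) noexcept {
        close();
        fd = rhs.fd;
        rhs.fd = -1;
        return *this;
    }
    void close() {
        if (fd >= 0) {
            ::close(fd);
            fd = -1;
        }
    }
    ~autoclose_fd_t() { close(); }
};

void run_processes(const std::vector<int> &pipe_fds) {
    autoclose_fd_t pipe_next_read;
    for (size_t i = 0; i < pipe_fds.size(); i++) {
        // The read end is moved into this iteration's process setup...
        autoclose_fd_t current_read = std::move(pipe_next_read);
        // ...and only re-initialized when another process follows.
        if (i + 1 < pipe_fds.size()) pipe_next_read = autoclose_fd_t(pipe_fds[i]);
    }
    // No explicit pipe_next_read.close() here: after the last iteration it was moved
    // from and owns nothing, and as an RAII type it closes itself on destruction anyway.
}
```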
`escape_code_length()` was converted from returning a `size_t` to
returning a `maybe_t<size_t>` but that subtly broke all existing call
sites by forcing all input to go through the slow path of assuming a
zero-length escape sequence was found.
This is because all callers predicated their next action on what amounts
to `if (escape_code_length(...))`, which would correctly skip the slow
path when `escape_code_length` returned zero. After the conversion to
`maybe_t`, however, the return value contained not `maybe_t::none()` but
rather `maybe_t::some(0)`, due to coercion of the `size_t` local
`esc_seq_len` to the `maybe_t<size_t>` return value - and
`maybe_t::some(0)` coerces to *true* rather than false.
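A minimal sketch of the coercion pitfall, using `std::optional` to stand in for `maybe_t` and a stubbed-out scanner in place of the real function:

```cpp
#include <cstddef>
#include <optional>

static std::optional<size_t> escape_code_length_buggy(const wchar_t *code) {
    size_t esc_seq_len = 0;
    (void)code;          // scanning elided; esc_seq_len stays 0 when nothing is found
    return esc_seq_len;  // 0 is coerced into some(0) rather than none()
}

bool takes_slow_path(const wchar_t *code) {
    // some(0) converts to true, so this branch is now taken even when nothing was
    // found; previously the plain size_t return value of 0 was falsy and skipped it.
    return static_cast<bool>(escape_code_length_buggy(code));
}
```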
The regression was introduced in 7ad855a844
and did not ship in any released versions so no harm, no foul.
This is required for the usage of placement new. It's not an issue for
fish itself, as it gets picked up from elsewhere, but it lets one use
placement new directly in a C++ test this way.
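A minimal, self-contained sketch of why the include matters, assuming the header in question is `<new>` (which declares the placement forms of operator new); the type here is illustrative:

```cpp
#include <new>

struct widget_t {
    int value;
    explicit widget_t(int v) : value(v) {}
};

int main() {
    alignas(widget_t) unsigned char storage[sizeof(widget_t)];
    widget_t *w = new (storage) widget_t(42);  // placement new, declared in <new>
    int ok = (w->value == 42);
    w->~widget_t();  // placement-new'd objects must be destroyed explicitly
    return ok ? 0 : 1;
}
```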