Alters `all`, `any`, `each while`, `each`, `insert`, `par-each`, `reduce`, `update`, `upsert` and `where`,
so that their blocks take an optional parameter containing the index.
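For illustration, a minimal sketch of the new form (the index is an optional second block parameter; blocks that declare only one parameter keep working as before):
❯ [ten twenty thirty] | each {|item, index| $"($index): ($item)" }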
* Make json require string and pass around metadata
The json deserializer was accepting any input by coercing non-strings
into strings. As an example, if the input was `[1, 2]`, the coercion
would turn it into `[12]`, which would deserialize as a list containing
the number twelve instead of a list of the two numbers one and two.
This could lead to silent data corruption.
Aside from that, pipeline metadata wasn't passed around.
This commit fixes the type issue by adding a strict conversion
function that errors if the input type is not a string or external
stream. It then uses this function instead of the original
`collect_string()`. In addition, this function returns the pipeline
metadata so it can be passed along.
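An illustrative sketch of the new behaviour (the exact error message may differ):
❯ '[1, 2]' | from json | length
2
❯ [1 2] | from json
# now errors because the input is a list, not a string or external stream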
* Make other formats require string
The problem with json coercing non-string types to string was present in
all other text formats. This reuses the `collect_string_strict` function
to fix them.
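For example (illustrative), the same strictness now applies to deserializers such as `from yaml`:
❯ {a: 1} | from yaml
# errors instead of coercing the record to a string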
* `IntoPipelineData` cleanup
The method `into_pipeline_data_with_metadata` can now be conveniently
used.
* add signature information when getting help on a single command
* tell the user when a command supports operating on cell paths
Also, make the type output friendlier: `record<>` should just be `record`,
and likewise `table<>` should just be `table`
* simplify code
* don't show signatures for parser keyword
* update comment
* output arg syntax shape as type, so it's the same as describe command
* fix string when no positional args
* update signature body
* update
* add help signature test
* fix argument output format for composite data types like list or record
* fix clippy
* add comment
* Grouped config commands better
* Tweaked test slightly
* Fix merge conflict(?)
* Remove recently-added test case
* Revert rm.always_trash default
* Untweak rm help messages
* Formatting
* Remove example
* Add deprecation warning
* Remove deprecation timeline
Not sure we want to commit to a specific timeline just yet
Co-authored-by: Reilly Wood <26268125+rgwood@users.noreply.github.com>
* Add input and output types to $nu.scope.commands
This commit changes the schema: instead of
command.signature: table
we now have
command.signatures: list<table>
with one signature for every input-output type pair.
* Represent signatures as a map from input_type to signature
* Sort signature entries
* Drop command name from signature tables
* Don't use "rest" as name of rest parameter; use empty string instead
* Bug fix: was creating records with repeated keys
E.g.
$nu.scope.commands | where name == 'hash sha256' | get signatures.0 | table -e
$nu.scope.commands | where name == 'transpose' | get signatures.0 | table -e
* removes unused features.
* Adds back multithreading feature to sysinfo.
* Adds back alloc for percent-encoding
* Adds updated lock file.
* Missed one sysinfo.
* `indexmap` just defaults
* Revert `miette` `default-features=false`
Co-authored-by: Stefan Holderbach <sholderbach@users.noreply.github.com>
Co-authored-by: JT <547158+jntrnr@users.noreply.github.com>
This adds support for (limited) mutable variables. Mutable variables are created with `mut` much the same way immutable variables are made with `let`.
Mutable variables allow mutation via the assignment operator (=).
❯ mut x = 100
❯ $x = 200
❯ print $x
200
Mutable variables are limited in that they're only intended to be used in the local code block. Trying to capture a mutable variable in a closure will result in an error:
❯ mut x = 123; {|| $x }
Error: nu::parser::expected_keyword (link)
× Capture of mutable variable.
The intent of this limitation is to reduce some of the issues with mutable variables in general: namely, they make code harder to reason about. By reducing the scope in which a mutable variable can be used, we can help create local reasoning about them.
Mutation can occur with fields as well, as in this case:
❯ mut y = {abc: 123}
❯ $y.abc = 456
❯ $y
On a historical note: mutable variables are something that we resisted for quite a long time, leaning as much as we could on the functional style of pipelines and dataflow. That said, we've watched folks struggle to work with reduce as an approximation for patterns that would be trivial to express with local mutation. With that in mind, we're leaning towards the happy path.
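As a sketch of what that looks like in practice (assuming loop bodies, unlike closures, are allowed to mutate), here is a running total written with reduce and then with a mutable variable:
❯ [1 2 3] | reduce {|it, acc| $acc + $it }
6
❯ mut total = 0
❯ for n in [1 2 3] { $total = $total + $n }
❯ print $total
6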
- Custom commands are true for builtin and custom
- Add classification as external command
- Specify wildcard in keyword: keyword is true for builtin and keyword