# Description
Use the `record!` macro instead of defining two separate `vec!`s for `cols`
and `vals` where appropriate.
This visually aligns each key with its value.
Furthermore, you don't have to deal with the construction of `Record {
cols, vals }` directly, so we can hide the implementation details in the future.
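For illustration, the shift looks roughly like this (the column names and values are made up; `Value::test_string`/`Value::test_int` are just the convenient test constructors):
```rust
// Before: two parallel vectors; each key sits far from its value.
let rec = Record {
    cols: vec!["name".into(), "size".into()],
    vals: vec![Value::test_string("foo"), Value::test_int(42)],
};

// After: `record!` keeps each key right next to its value and hides
// how the `Record` is assembled internally.
let rec = record! {
    "name" => Value::test_string("foo"),
    "size" => Value::test_int(42),
};
```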
## State
This does not cover all possible commands yet; some tests/examples are also
better expressed by creating `cols` and `vals` separately.
# User/Developer-Facing Changes
The examples and tests should read more naturally. No relevant functional
change.
# Bycatch
Where I noticed it, I replaced calls to `Value` constructors taking
`Span::test_data()` or `Span::unknown()` with the corresponding
`Value::test_...` constructors. This should make things more readable and
also simplify changes to the `Span` system in the future.
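For example, a typical swap looks like this (illustrative value only):
```rust
// Before
let v = Value::int(42, Span::test_data());
// After
let v = Value::test_int(42);
```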
# Description
As part of the refactor to split spans off of `Value`, this moves to using
helper functions to create values, and to using `.span()` instead of
matching the span out of `Value` directly.
Hoping to get a few more helping hands to finish this, as there are a
lot of commands to update :)
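A rough sketch of the pattern being replaced, assuming `.span()` returns the value's `Span` directly (variant fields simplified):
```rust
// Before: destructure the span out of each variant by hand.
let span = match &value {
    Value::Int { span, .. } => *span,
    Value::String { span, .. } => *span,
    _ => Span::unknown(),
};

// After: ask the value for its span.
let span = value.span();
```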
---------
Co-authored-by: Darren Schroeder <343840+fdncred@users.noreply.github.com>
Co-authored-by: WindSoilder <windsoilder@outlook.com>
# Description
This PR creates a new `Record` type to reduce duplicate code and
possibly bugs as well. (This is an edited version of #9648.)
- `Record` implements `FromIterator` and `IntoIterator` and so can be
iterated over or collected into. For example, this helps with
conversions to and from (hash)maps. (Also, no more
`cols.iter().zip(vals)`!)
- `Record` has a `push(col, val)` function to help ensure that the
number of columns is equal to the number of values. I caught a few
potential bugs thanks to this (e.g. in the `ls` command).
- Finally, this PR also adds a `record!` macro that helps simplify
record creation. It is used like so:
```rust
record! {
"key1" => some_value,
"key2" => Value::string("text", span),
"key3" => Value::int(optional_int.unwrap_or(0), span),
"key4" => Value::bool(config.setting, span),
}
```
Since macros hinder formatting, etc., the right-hand-side values should
be relatively short and sweet, like the examples above.
Where possible, prefer `record!` or `.collect()` on an iterator instead
of multiple `Record::push`s, since the first two automatically set the
record capacity and do less work overall.
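A minimal sketch of both styles, assuming `Record` is built from and iterated as `(String, Value)` pairs and that a `Record::new()` constructor exists (the column names and values are made up):
```rust
// Collect an iterator of (column, value) pairs straight into a Record;
// the capacity is set up front, unlike repeated `push` calls.
let collected: Record = ["foo", "bar", "baz"]
    .into_iter()
    .enumerate()
    .map(|(i, col)| (col.to_string(), Value::test_int(i as i64)))
    .collect();

// When the shape is only known incrementally, `push` keeps the number
// of columns and values in lockstep.
let mut rebuilt = Record::new();
for (col, val) in collected {
    rebuilt.push(col, val);
}
```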
# User-Facing Changes
Besides the changes in `nu-protocol`, the only other breaking changes are
to `nu-table::{ExpandedTable::build_map, JustTable::kv_table}`.
# Description
Resolves issue #8370
Adds the following flags to the `from csv` and `from tsv` commands:
- `--flexible`: allow the number of fields in records to be variable
- `-c, --comment`: a comment character; lines starting with it are ignored
- `-q, --quote`: a quote character so that separators inside quoted strings are ignored; defaults to `"`
- `-e, --escape`: an escape character for strings containing the quote character
Internally, the `Value` type has an additional helper function,
`as_char`, which converts it to a single `char`.
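Purely as a hedged illustration of the idea (the name `value_as_char`, return type, and error handling below are not the actual `nu-protocol` API), the helper boils down to accepting only one-character strings:
```rust
use nu_protocol::Value;

// Accept the value only if it is a string containing exactly one character.
fn value_as_char(value: &Value) -> Option<char> {
    match value {
        Value::String { val, .. } if val.chars().count() == 1 => val.chars().next(),
        _ => None,
    }
}
```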
# User-Facing Changes
The single quoted string `'\t'` can no longer be used as a parameter for
the flag `--separator '\t'` as it is interpreted as a two-character
string. One needs to use from now on the flag with a double quoted
string like so: `-s "\t"` which correctly interprets the string as a
single `char`.
# Description
I opened this PR to unify the `run` command method. It's mainly to improve
consistency across the tree.
# User-Facing Changes
None.
* Make json require string and pass around metadata
The json deserializer was accepting any input by coercing non-strings
into strings. As an example, if the input was `[1, 2]`, the coercion
would turn it into `[12]` and deserialize it as a list containing the
number twelve instead of a list of the two numbers one and two. This
could lead to silent data corruption.
Aside from that, pipeline metadata wasn't passed around.
This commit fixes the type issue by adding a strict conversion
function that errors if the input type is not a string or an external
stream. It then uses this function instead of the original
`collect_string()`. In addition, the new function returns the pipeline
metadata so it can be passed along. (A conceptual sketch of this idea
follows the commit list below.)
* Make other formats require string
The problem with json coercing non-string types to string was present in
all other text formats. This reuses the `collect_string_strict` function
to fix them.
* `IntoPipelineData` cleanup
The method `into_pipeline_data_with_metadata` can now be conveniently
used.
* Add failing test that list of ints and floats is List<Number>
* Start defining subtype relation
* Make it possible to declare input and output types for commands
- Enforce them in tests
* Declare input and output types of commands
* Add formatted signatures to `help commands` table
* Revert SyntaxShape::Table -> Type::Table change
* Revert unnecessary derive(Hash) on SyntaxShape
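As referenced above, here is a conceptual, self-contained sketch of the strict-collection idea (simplified stand-in types, not the real `nu-protocol` API): only string-like input is accepted, and its metadata is returned alongside the collected string instead of being dropped.
```rust
// Simplified stand-ins for pipeline data and its metadata.
struct Metadata {
    data_source: String,
}

enum Input {
    String(String, Option<Metadata>),
    ExternalStream(Vec<u8>, Option<Metadata>),
    Other(&'static str),
}

// Strict collection: no silent coercion of non-string input.
fn collect_string_strict(input: Input) -> Result<(String, Option<Metadata>), String> {
    match input {
        Input::String(s, meta) => Ok((s, meta)),
        Input::ExternalStream(bytes, meta) => {
            Ok((String::from_utf8_lossy(&bytes).into_owned(), meta))
        }
        Input::Other(ty) => Err(format!("expected string input, found {ty}")),
    }
}
```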
Co-authored-by: JT <547158+jntrnr@users.noreply.github.com>