Commit graph

62 commits

Author SHA1 Message Date
Devyn Cairns
6795ad7e33
Make custom value type handling more consistent (#12230)
[Context on
Discord](https://discord.com/channels/601130461678272522/855947301380947968/1219425984990806207)

# Description

- Rename `CustomValue::value_string()` to `type_name()` to reflect its
usage better.
- Change print behavior to always call `to_base_value()` first, to give
the custom value better control over the output.
- Change `describe --detailed` to show the type name as the subtype,
rather than trying to describe the base value.
- Change custom `Type` to use `type_name()` rather than `typetag_name()`
to make things like `PluginCustomValue` more transparent

One question: should `describe --detailed` still include a description
of the base value somewhere? I'm torn on it, it seems possibly useful
for some things (maybe sqlite databases?), but having `describe -d` not
include the custom type name anywhere felt weird. Another option would
be to add another method to `CustomValue` for info to be displayed in
`describe`, so that it can be more type-specific?
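
For illustration, a minimal sketch of the rename and the print path described above, using simplified stand-in types rather than the actual `nu-protocol` signatures:

```rust
// Illustrative stand-ins only; the real trait lives in nu-protocol and has more methods.
#[derive(Debug)]
struct BaseValue(String);

trait CustomValueLike: std::fmt::Debug {
    // Previously `value_string()`; renamed because it names the type, not the value.
    fn type_name(&self) -> String;
    // Printing now always goes through the base value first.
    fn to_base_value(&self) -> BaseValue;
}

#[derive(Debug)]
struct SqliteHandle {
    path: String,
}

impl CustomValueLike for SqliteHandle {
    fn type_name(&self) -> String {
        "sqlite-db".into()
    }
    fn to_base_value(&self) -> BaseValue {
        BaseValue(format!("SQLite database at {}", self.path))
    }
}

fn print_custom(v: &dyn CustomValueLike) {
    // `describe --detailed` would report the type name as the subtype,
    // while printing renders the base value.
    println!("{} ({})", v.to_base_value().0, v.type_name());
}

fn main() {
    print_custom(&SqliteHandle { path: "db.sqlite".into() });
}
```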

# User-Facing Changes
Everything above has implications for printing and `describe` on custom
values

# Tests + Formatting
- 🟢 `toolkit fmt`
- 🟢 `toolkit clippy`
- 🟢 `toolkit test`
- 🟢 `toolkit test stdlib`
2024-03-19 11:09:59 +01:00
Ian Manske
b6c7656194
IO and redirection overhaul (#11934)
# Description
The PR overhauls how IO redirection is handled, allowing more explicit
and fine-grained control over `stdout` and `stderr` output as well as more
efficient IO and piping.

To summarize the changes in this PR:
- Added a new `IoStream` type to indicate the intended destination for a
pipeline element's `stdout` and `stderr`.
- The `stdout` and `stderr` `IoStream`s are stored in the `Stack` to
avoid adding 6 additional arguments to every eval function and
`Command::run`. The `stdout` and `stderr` streams can be temporarily
overwritten through functions on `Stack`, and these functions return
a guard that restores the original `stdout` and `stderr` when dropped
(a rough sketch of this guard pattern follows the list below).
- In the AST, redirections are now directly part of a `PipelineElement`
as a `Option<Redirection>` field instead of having multiple different
`PipelineElement` enum variants for each kind of redirection. This
required changes to the parser, mainly in `lite_parser.rs`.
- `Command`s can also set an `IoStream` override/redirection which will
apply to the previous command in the pipeline. This is used, for
example, in `ignore` to allow the previous external command to have its
stdout redirected to `Stdio::null()` at spawn time. In contrast, the
current implementation has to create an OS pipe and manually consume the
output on nushell's side. File and pipe redirections (`o>`, `e>`, `e>|`,
etc.) have precedence over overrides from commands.
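
The override/restore behavior mentioned in the list above follows a drop-guard pattern; here is a rough, self-contained sketch with hypothetical types (not the actual `Stack` API):

```rust
// Hypothetical stand-ins to illustrate the guard pattern described above.
#[derive(Clone, Copy, Debug, PartialEq)]
enum IoDest {
    Inherit,
    Null,
}

#[derive(Debug)]
struct StackLike {
    stdout: IoDest,
    stderr: IoDest,
}

struct RedirectGuard<'a> {
    stack: &'a mut StackLike,
    saved: (IoDest, IoDest),
}

impl<'a> Drop for RedirectGuard<'a> {
    fn drop(&mut self) {
        // Restore the original destinations when the guard goes out of scope.
        self.stack.stdout = self.saved.0;
        self.stack.stderr = self.saved.1;
    }
}

impl StackLike {
    fn redirect(&mut self, stdout: IoDest, stderr: IoDest) -> RedirectGuard<'_> {
        let saved = (self.stdout, self.stderr);
        self.stdout = stdout;
        self.stderr = stderr;
        RedirectGuard { stack: self, saved }
    }
}

fn main() {
    let mut stack = StackLike { stdout: IoDest::Inherit, stderr: IoDest::Inherit };
    {
        let _guard = stack.redirect(IoDest::Null, IoDest::Inherit);
        // ... evaluate the pipeline element with the override in place ...
    }
    // Once the guard drops, the original destinations are back.
    assert_eq!(stack.stdout, IoDest::Inherit);
}
```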

This PR improves piping and IO speed, partially addressing #10763. Using
the `throughput` command from that issue, this PR gives the following
speedup on my setup for the commands below:
| Command | Before (MB/s) | After (MB/s) | Bash (MB/s) |
| --------------------------- | -------------: | -----------: | ----------: |
| `throughput o> /dev/null` | 1169 | 52938 | 54305 |
| `throughput \| ignore` | 840 | 55438 | N/A |
| `throughput \| null` | Error | 53617 | N/A |
| `throughput \| rg 'x'` | 1165 | 3049 | 3736 |
| `(throughput) \| rg 'x'` | 810 | 3085 | 3815 |

(Numbers above are the median samples for throughput)

This PR also paves the way to refactor our `ExternalStream` handling in
the various commands. For example, this PR already fixes the following
code:
```nushell
^sh -c 'echo -n "hello "; sleep 0; echo "world"' | find "hello world"
```
This returns an empty list on 0.90.1 and returns a highlighted "hello
world" on this PR.

Since the `stdout` and `stderr` `IoStream`s are available to commands
when they are run, this unlocks the potential for more convenient
behavior. E.g., the `find` command can disable its ansi highlighting if
it detects that the output `IoStream` is not the terminal. Knowing the
output streams will also allow background job output to be redirected
more easily and efficiently.

# User-Facing Changes
- External commands returned from closures will be collected (in most
cases):
  ```nushell
  1..2 | each {|_| nu -c "print a" }
  ```
This gives `["a", "a"]` on this PR, whereas this used to print "a\na\n"
and then return an empty list.

  ```nushell
  1..2 | each {|_| nu -c "print -e a" }
  ```
This gives `["", ""]` and prints "a\na\n" to stderr, whereas this used
to return an empty list and print "a\na\n" to stderr.

- Trailing newlines are always trimmed for external commands when
piping into internal commands or collecting them as a value. (Failure to
decode the output as utf-8 will keep the trailing newline for the last
binary value.) In the current nushell version, the following three code
snippets differ only in parenthesis placement, yet they all have
different outputs:

  1. `1..2 | each { ^echo a }`
     ```
     a
     a
     ╭────────────╮
     │ empty list │
     ╰────────────╯
     ```
  2. `1..2 | each { (^echo a) }`
     ```
     ╭───┬───╮
     │ 0 │ a │
     │ 1 │ a │
     ╰───┴───╯
     ```
  3. `1..2 | (each { ^echo a })`
     ```
     ╭───┬───╮
     │ 0 │ a │
     │   │   │
     │ 1 │ a │
     │   │   │
     ╰───┴───╯
     ```

  But in this PR, the above snippets will all have the same output:
  ```
  ╭───┬───╮
  │ 0 │ a │
  │ 1 │ a │
  ╰───┴───╯
  ```

- All existing flags on `run-external` are now deprecated.

- File redirections now apply to all commands inside a code block:
  ```nushell
  (nu -c "print -e a"; nu -c "print -e b") e> test.out
  ```
This gives "a\nb\n" in `test.out` and prints nothing. The same result
would happen when printing to stdout and using a `o>` file redirection.

- External command output will (almost) never be ignored, and ignoring
output must be explicit now:
  ```nushell
  (^echo a; ^echo b)
  ```
This prints "a\nb\n", whereas this used to print only "b\n". This only
applies to external commands; values and internal commands not in return
position will not print anything (e.g., `(echo a; echo b)` still only
prints "b").

- `complete` now always captures stderr (`do` is not necessary).

# After Submitting
The language guide and other documentation will need to be updated.
2024-03-14 15:51:55 -05:00
Devyn Cairns
73f3c0b60b
Support for all custom value operations on plugin custom values (#12088)
# Description

Adds support for the following operations on plugin custom values, in
addition to `to_base_value` which was already present:

- `follow_path_int()`
- `follow_path_string()`
- `partial_cmp()`
- `operation()`
- `Drop` (notification, if opted into with
`CustomValue::notify_plugin_on_drop`)

There are additionally customizable methods within the `Plugin` and
`StreamingPlugin` traits for implementing these functions in a way that
requires access to the plugin state, as a registered-handle model (such
as a dataframes plugin might use) would.
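
A hedged sketch of that registered-handle idea, with purely hypothetical types rather than the real plugin traits:

```rust
use std::collections::HashMap;

// Hypothetical sketch: the custom value sent across the plugin boundary is just
// an opaque handle; the plugin keeps the real state and consults it when the
// engine asks for operations like cell-path access or reports a drop.
#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash)]
struct Handle(u64);

#[derive(Default)]
struct PluginState {
    frames: HashMap<Handle, Vec<String>>, // e.g. dataframe-like data keyed by handle
}

impl PluginState {
    fn follow_path_int(&self, handle: Handle, index: usize) -> Option<&String> {
        self.frames.get(&handle).and_then(|rows| rows.get(index))
    }

    fn dropped(&mut self, handle: Handle) {
        // Called when the engine reports that the custom value was dropped.
        self.frames.remove(&handle);
    }
}

fn main() {
    let mut state = PluginState::default();
    state.frames.insert(Handle(1), vec!["a".into(), "b".into()]);
    assert_eq!(state.follow_path_int(Handle(1), 1), Some(&"b".to_string()));
    state.dropped(Handle(1));
    assert!(state.frames.is_empty());
}
```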

`Value::append` was also changed to handle custom values correctly.

# User-Facing Changes

- Signature of `CustomValue::follow_path_string` and
`CustomValue::follow_path_int` changed to give access to the span of the
custom value itself, useful for some errors.
- Plugins using custom values have to be recompiled because the engine
will now try custom value operations that older plugin builds don't
support.
- Plugins can do more things 🎉

# Tests + Formatting
Tests were added for all of the new custom values functionality.

- 🟢 `toolkit fmt`
- 🟢 `toolkit clippy`
- 🟢 `toolkit test`
- 🟢 `toolkit test stdlib`

# After Submitting
- [ ] Document protocol reference `CustomValueOp` variants:
  - [ ] `FollowPathInt`
  - [ ] `FollowPathString`
  - [ ] `PartialCmp`
  - [ ] `Operation`
  - [ ] `Dropped`
- [ ] Document `notify_on_drop` optional field in `PluginCustomValue`
2024-03-12 10:37:08 +01:00
Jakub Žádník
14d1c67863
Debugger experiments (#11441)

# Description

This PR adds a new evaluator path with callbacks to a mutable trait
object implementing a Debugger trait. The trait object can do anything,
e.g., profiling, code coverage, step debugging. Currently,
entering/leaving a block and a pipeline element is marked with
callbacks, but more callbacks can be added as necessary. Not all
callbacks need to be used by all debuggers; unused ones are simply empty
calls. A simple profiler is implemented as a proof of concept.
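
A minimal sketch of that callback shape (illustrative names only, not the actual `Debugger` trait):

```rust
use std::time::{Duration, Instant};

// Illustrative names only: a debugger trait with empty default callbacks,
// plus a minimal profiler that times each "element" it sees.
trait DebuggerLike {
    fn enter_element(&mut self, _name: &str) {}
    fn leave_element(&mut self, _name: &str) {}
}

// A debugger that doesn't care about an event just inherits the empty default.
struct NoDebug;
impl DebuggerLike for NoDebug {}

#[derive(Default)]
struct Profiler {
    started: Option<Instant>,
    samples: Vec<(String, Duration)>,
}

impl DebuggerLike for Profiler {
    fn enter_element(&mut self, _name: &str) {
        self.started = Some(Instant::now());
    }
    fn leave_element(&mut self, name: &str) {
        if let Some(start) = self.started.take() {
            self.samples.push((name.to_string(), start.elapsed()));
        }
    }
}

fn eval_element(debugger: &mut dyn DebuggerLike, name: &str) {
    debugger.enter_element(name);
    // ... the actual evaluation of the pipeline element would happen here ...
    debugger.leave_element(name);
}

fn main() {
    let mut profiler = Profiler::default();
    eval_element(&mut profiler, "ls | each { ... }");
    eval_element(&mut NoDebug, "ls | length"); // callbacks are no-ops here
    for (name, dur) in &profiler.samples {
        println!("{name}: {dur:?}");
    }
}
```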

The debugging support is implemented by making `eval_xxx()` functions
generic over whether we're debugging or not. This has zero
computational overhead, but makes the binary slightly larger (see
benchmarks below). `eval_xxx()` variants called from commands (like
`eval_block_with_early_return()` in `each`) are chosen with dynamic
dispatch for two reasons: to avoid growing the binary size by duplicating
the code of many commands, and because generics there would make
`Command` trait objects object-unsafe.

In the future, I hope it will be possible to allow plugin callbacks such
that users would be able to implement their profiler plugins instead of
having to recompile Nushell.
[DAP](https://microsoft.github.io/debug-adapter-protocol/) would also be
interesting to explore.

Try `help debug profile`.

## Screenshots

Basic output:

![profiler_new](https://github.com/nushell/nushell/assets/25571562/418b9df0-b659-4dcb-b023-2d5fcef2c865)

To profile with more granularity, increase the profiler depth (you'll
see that repeated `is-windows` calls take a large chunk of total time,
making it a good candidate for optimizing):

![profiler_new_m3](https://github.com/nushell/nushell/assets/25571562/636d756d-5d56-460c-a372-14716f65f37f)

## Benchmarks

### Binary size

Binary size increase vs. main: **+40360 bytes**. _(Both built with
`--release --features=extra,dataframe`.)_

### Time

```nushell
# bench_debug.nu
use std bench

let test = {
    1..100
    | each {
        ls | each {|row| $row.name | str length }
    }
    | flatten
    | math avg
}

print 'debug:'
let res2 = bench { debug profile $test } --pretty
print $res2
```

```nushell
# bench_nodebug.nu
use std bench

let test = {
    1..100
    | each {
        ls | each {|row| $row.name | str length }
    }
    | flatten
    | math avg
}

print 'no debug:'
let res1 = bench { do $test } --pretty
print $res1
```

`cargo run --release -- bench_debug.nu` is consistently 1-2 ms slower
than `cargo run --release -- bench_nodebug.nu` due to the collection
overhead plus gathering the report. This is expected. When gathering more
data, the overhead is obviously higher.

Comparing `cargo run --release -- bench_nodebug.nu` with `nu
bench_nodebug.nu`, I didn't measure any difference. Both benchmarks
report times between 97 and 103 ms randomly, without one being
consistently higher than the other. This suggests that, at least in this
particular case, there is no runtime overhead when not running any
debugger.

## API changes

This PR adds a generic parameter to all `eval_xxx` functions that forces
you to specify whether you use the debugger. You can resolve it in two
ways:
* Use a provided helper that will figure it out for you. If you wanted
to use `eval_block(&engine_state, ...)`, call `let eval_block =
get_eval_block(&engine_state); eval_block(&engine_state, ...)`
* If you know you're in an evaluation path that doesn't need debugger
support, call `eval_block::<WithoutDebug>(&engine_state, ...)` (this is
the case of hooks, for example).

I tried to add more explanation in the docstring of `debugger_trait.rs`.
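
For a concrete picture of the zero-cost selection, here is a hypothetical sketch that mirrors (but does not reproduce) the `WithoutDebug` marker idea:

```rust
// Hypothetical sketch of the compile-time selection: a marker type decides
// whether debugger callbacks fire, so the non-debug path pays no runtime cost.
trait DebugMode {
    const ACTIVE: bool;
}

struct WithoutDebugMarker;
impl DebugMode for WithoutDebugMarker {
    const ACTIVE: bool = false;
}

struct WithDebugMarker;
impl DebugMode for WithDebugMarker {
    const ACTIVE: bool = true;
}

fn eval_block_like<D: DebugMode>(block: &str) -> usize {
    if D::ACTIVE {
        // A real implementation would call into the debugger here; this branch
        // is optimized away entirely when D::ACTIVE is false.
        eprintln!("entering block: {block}");
    }
    block.len() // stand-in for the actual evaluation work
}

fn main() {
    // Call sites that never need a debugger pick the marker explicitly,
    // much like `eval_block::<WithoutDebug>(...)` described above.
    let quiet = eval_block_like::<WithoutDebugMarker>("ls | length");
    let noisy = eval_block_like::<WithDebugMarker>("ls | length");
    assert_eq!(quiet, noisy);
}
```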

## TODO

- [x] Better profiler output to reduce spam of iterative commands like
`each`
- [x] Resolve `TODO: DEBUG` comments
- [x] Resolve unwraps
- [x] Add doc comments
- [x] Add usage and extra usage for `debug profile`, explaining all
columns

# User-Facing Changes

Hopefully none.

# Tests + Formatting

# After Submitting
2024-03-08 20:21:35 +02:00
Jack Wright
2a721bad52
Add columns to dataframe that are present in the schema but not present in the Dataframe when applying schema. (#11987) 2024-02-26 17:22:33 -06:00
Ian Manske
fb4251aba7
Remove Record::from_raw_cols_vals_unchecked (#11810)
# Description
Follows from #11718 and replaces all usages of
`Record::from_raw_cols_vals_unchecked` with iterator or `record!`
equivalents.
2024-02-18 14:20:22 +02:00
Ian Manske
1c49ca503a
Name the Value conversion functions more clearly (#11851)
# Description
This PR renames the conversion functions on `Value` to be more consistent.
It follows the Rust [API guidelines](https://rust-lang.github.io/api-guidelines/naming.html#ad-hoc-conversions-follow-as_-to_-into_-conventions-c-conv) for ad-hoc conversions.
The conversion functions on `Value` now come in a few forms:
- `coerce_{type}` takes a `&Value` and attempts to convert the value to
`type` (e.g., `i64` values are converted to `f64`). This is the old
behavior of some of the `as_{type}` functions -- these functions have
simply been renamed to better reflect what they do.
- The new `as_{type}` functions take a `&Value` and return an `Ok`
result only if the value is of `type` (no conversion is attempted). The
returned value will be borrowed if `type` is non-`Copy`, otherwise an
owned value is returned.
- `into_{type}` exists for non-`Copy` types, but otherwise does not
attempt conversion, just like `as_{type}`. It takes an owned `Value` and
always returns an owned result.
- `coerce_into_{type}` has the same relationship with `coerce_{type}` as
`into_{type}` does with `as_{type}`.
- `to_{kind}_string`: conversion to different string formats (debug,
abbreviated, etc.). Only two of the old string conversion functions were
removed, the rest have been renamed only.
- `to_{type}`: other conversion functions. Currently, only `to_path`
exists. (And `to_string` through `Display`.)

This table summarizes the above:
| Form | Cost | Input Ownership | Output Ownership | Converts `Value` case/`type` |
| -------------------- | --------- | --------------- | ---------------- | ---------------------------- |
| `as_{type}` | Cheap | Borrowed | Borrowed/Owned | No |
| `into_{type}` | Cheap | Owned | Owned | No |
| `coerce_{type}` | Cheap | Borrowed | Borrowed/Owned | Yes |
| `coerce_into_{type}` | Cheap | Owned | Owned | Yes |
| `to_{kind}_string` | Expensive | Borrowed | Owned | Yes |
| `to_{type}` | Expensive | Borrowed | Owned | Yes |
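
To make the convention concrete, a toy example (not the real `Value` API) following the same naming rules:

```rust
// Toy type demonstrating the as_/into_/coerce_ convention described above.
#[derive(Debug, Clone)]
enum Val {
    Int(i64),
    Float(f64),
    Str(String),
}

impl Val {
    // `as_`: borrow, succeed only if the value is already this case.
    fn as_int(&self) -> Result<i64, String> {
        match self {
            Val::Int(i) => Ok(*i),
            other => Err(format!("expected int, got {other:?}")),
        }
    }

    // `coerce_`: borrow, and also convert compatible cases (e.g. int -> float).
    fn coerce_float(&self) -> Result<f64, String> {
        match self {
            Val::Int(i) => Ok(*i as f64),
            Val::Float(f) => Ok(*f),
            other => Err(format!("cannot coerce {other:?} to float")),
        }
    }

    // `into_`: like `as_`, but takes ownership so non-Copy payloads move out.
    fn into_string(self) -> Result<String, Val> {
        match self {
            Val::Str(s) => Ok(s),
            other => Err(other),
        }
    }
}

fn main() {
    assert_eq!(Val::Int(3).coerce_float().unwrap(), 3.0);
    assert!(Val::Float(1.5).as_int().is_err());
    assert_eq!(Val::Str("hi".into()).into_string().unwrap(), "hi");
}
```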

# User-Facing Changes
Breaking API change for `Value` in `nu-protocol` which is exposed as
part of the plugin API.
2024-02-17 18:14:16 +00:00
Jack Wright
525acf9d9e
Ability to cast a dataframe's column to a different dtype (#11803)
Provides the ability to cast columns in dataframes, lazy dataframes, and
expressions.

<img width="587" alt="Screenshot 2024-02-14 at 13 53 01"
src="https://github.com/nushell/nushell/assets/56345/b894f746-0e37-472e-9fb0-eb6f71f2bf27">

<img width="616" alt="Screenshot 2024-02-14 at 13 52 37"
src="https://github.com/nushell/nushell/assets/56345/cf10efa7-d89c-4189-ab71-d368b2354d19">

<img width="626" alt="Screenshot 2024-02-14 at 13 54 58"
src="https://github.com/nushell/nushell/assets/56345/cd57cdf0-5096-41dd-8ab5-46e3d1e061b8">

---------

Co-authored-by: Jack Wright <jack.wright@disqo.com>
2024-02-14 18:15:00 -06:00
nibon7
da4c918392
Bump polars from 0.36 to 0.37 (#11848)
# Description
Bump polars from 0.36 to 0.37

# User-Facing Changes

# Tests + Formatting

# After Submitting
2024-02-13 06:27:30 -06:00
Jakub Žádník
b8d37a7541
Fix panic in rotate; Add safe record creation function (#11718)

# Description

Fixes https://github.com/nushell/nushell/issues/11716

The problem is in our [record creation
API](0d518bf813/crates/nu-protocol/src/value/record.rs (L33))
which panics if the numbers of columns and values are different. I added
a safe variant that returns a `Result` and used it in the `rotate`
command.

## TODO in another PR:

Go through all `from_raw_cols_vals_unchecked()` calls (this includes the
`record!` macro, which uses the unchecked version) and make sure that
either
a) the number of cols and vals is guaranteed to be the same, or
b) the call is converted to `from_raw_cols_vals()`

Reason: Nushell should never panic.

# User-Facing Changes

# Tests + Formatting

# After Submitting
2024-02-03 13:23:16 +02:00
Jack Wright
175dab4898
"[11611] fixing dataframe column comparisons" (#11676)
fixes #11611

Co-authored-by: Jack Wright <jack.wright@disqo.com>
2024-01-29 17:28:12 -06:00
Jack Wright
f879c00f9d
The ability to specify a schema when using dfr open and dfr into-df (#11634)
# Description

There are times where explicitly specifying a schema for a dataframe is
needed, such as:
- Opening CSV and JSON lines files and needing to provide more
information to polars to keep it from failing, or wanting to override the
default type conversion
- When converting a nushell value to a dataframe and wanting to override
the default conversion behaviors.

This pull request provides:
- A flag to allow specifying a schema when using `dfr into-df`
- A flag to allow specifying a schema when using `dfr open` that works for
CSV and JSON types
- A new command `dfr schema`, which displays schema information and will
allow displaying supported schema dtypes

A schema is specified by creating a record whose keys are the column
names and whose values are the dtypes. Example usages:

```
{a:1, b:{a:2}} | dfr into-df -s {a: u8, b: {a: i32}} | dfr schema
{a: 1, b: {a: [1 2 3]}, c: [a b c]} | dfr into-df -s {a: u8, b: {a: list<u64>}, c: list<str>} | dfr schema
 dfr open -s {pid: i32, ppid: i32, name: str, status: str, cpu: f64, mem: i64, virtual: i64} /tmp/ps.jsonl  | dfr schema
```

Supported dtypes:
null
bool
u8
u16
u32
u64
i8
i16
i32
i64
f32
f64
str
binary
date
datetime[time_unit: (ms, us, ns) timezone (optional)]
duration[time_unit: (ms, us, ns)]
time
object
unknown
list[dtype]


Structs are also supported, but are specified via a nested record:
`{a: u8, b: {d: str}}`

Another feature of the `dfr schema` command is that it returns the data
in a format that can be passed back in as a valid schema argument:

<img width="638" alt="Screenshot 2024-01-29 at 10 23 58"
src="https://github.com/nushell/nushell/assets/56345/b49c3bff-5cda-4c86-975a-dfd91d991373">

---------

Co-authored-by: Jack Wright <jack.wright@disqo.com>
2024-01-29 13:26:04 -06:00
nibon7
a44ad949f1
Bump polars from 0.35 to 0.36 (#11624)
# Description
* Release notes:
  https://github.com/pola-rs/polars/releases/tag/rs-0.36.2

* Dependencies:
  - remove `sysinfo` 0.29.11
  - add `polars-compute` 0.36.2

# User-Facing Changes
[Change value_counts resulting column name from counts to
count](https://github.com/pola-rs/polars/pull/12506)

# Tests + Formatting

# After Submitting
2024-01-24 09:27:06 -06:00
Artemiy
1867bb1a88
Fix incorrect handling of boolean flags for builtin commands (#11492)
# Description
Possible fix of #11456.
This PR fixes a bug where builtin commands did not respect the logic of
dynamically passed boolean flags. The reason is that the
[has_flag](6f59abaf43/crates/nu-protocol/src/ast/call.rs (L204C5-L212C6))
method did not evaluate, or take into consideration, the expression used
with the flag.

To address this issue a solution is proposed:
1. `has_flag` method is moved to `CallExt` and new logic to evaluate
expression and check if it is a boolean value is added
2. `has_flag_const` method is added to `CallExt` which is a constant
version of `has_flag`
3. `has_named` method is added to `Call` which is basically the old
logic of `has_flag`
4. All usages of `has_flag` in code are updated, mostly to pass
`engine_state` and `stack` to the new `has_flag`. In `run_const` commands
it is replaced with `has_flag_const`. And in a few select places (the
parser, `to nuon`, and `into string`) the old logic via `has_named` is
used.
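
The distinction at the heart of this fix can be illustrated with a toy sketch (hypothetical types, not the real `Call`/`CallExt`):

```rust
// Toy illustration of the bug described above: a flag may be written with an
// explicit expression (`--force=false`), so "the flag was provided" is not the
// same thing as "the flag evaluates to true".
#[derive(Debug)]
struct Flag {
    name: &'static str,
    // None means bare `--force`; Some(value) stands in for `--force=<expr>`.
    value: Option<bool>,
}

struct CallLike {
    flags: Vec<Flag>,
}

impl CallLike {
    // Old behavior (`has_named`): only checks that the flag was written at all.
    fn has_named(&self, name: &str) -> bool {
        self.flags.iter().any(|f| f.name == name)
    }

    // New behavior (`has_flag`): evaluates the attached expression, if any.
    fn has_flag(&self, name: &str) -> bool {
        self.flags
            .iter()
            .find(|f| f.name == name)
            .map(|f| f.value.unwrap_or(true))
            .unwrap_or(false)
    }
}

fn main() {
    let call = CallLike {
        flags: vec![Flag { name: "force", value: Some(false) }], // `--force=false`
    };
    assert!(call.has_named("force")); // present, but...
    assert!(!call.has_flag("force")); // ...it evaluates to false
}
```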

# User-Facing Changes
Explicit values of boolean flags are now respected in builtin commands.
Before:

![image](https://github.com/nushell/nushell/assets/17511668/f9fbabb2-3cfd-43f9-ba9e-ece76d80043c)
After:

![image](https://github.com/nushell/nushell/assets/17511668/21867596-2075-437f-9c85-45563ac70083)

Another example:
Before:

![image](https://github.com/nushell/nushell/assets/17511668/efdbc5ca-5227-45a4-ac5b-532cdc2bbf5f)
After:

![image](https://github.com/nushell/nushell/assets/17511668/2907d5c5-aa93-404d-af1c-21cdc3d44646)


# Tests + Formatting
Added tests reproducing some variants of the original issue.
2024-01-11 17:19:48 +02:00
Stefan Holderbach
8cfa96b4c0
Construct Records only through checked helpers (#11386)
# Description

Constructing the internals of `Record` without checking the lengths is
bad. (also incompatible with changes to how we store records)

- Use `Record::from_raw_cols_vals` in dataframe code
- Use `record!` macro in dataframe test
- Use `record!` in `nu-color-config` tests
- Stop direct record construction in `nu-command`
- Refactor table construction in `from nuon`

# User-Facing Changes
None

# Tests + Formatting
No new tests, updated tests in equal fashion
2023-12-21 16:48:15 +01:00
Jack Wright
44dc890124
Polars Struct support without unsafe blocks (#11229)
Second attempt at polars Struct support. This version avoids unsafe
checks by cloning the StructArray and utilizing `into_static` to
convert it to a StructOwned.

---------

Co-authored-by: Jack Wright <jack.wright@disqo.com>
2023-12-15 11:21:30 +01:00
Eric Hodel
ecb3b3a364
Ensure that command usage starts uppercase and ends period (#11278)
# Description

This repeats #8268 to make all command usage strings start with an
uppercase letter and end with a period per #5056

Adds a test to ensure that commands won't regress

Part of #5066

# User-Facing Changes

Command usage is now consistent

# Tests + Formatting

- 🟢 `toolkit fmt`
- 🟢 `toolkit clippy`
- 🟢 `toolkit test`
- 🟢 `toolkit test stdlib`

# After Submitting

Automatic documentation updates
2023-12-10 08:28:54 -06:00
Eric Hodel
a95a4505ef
Convert ShellError::GenericError to named fields (#11230)
# Description

Replace `.to_string()` used in `GenericError` with `.into()` as
`.into()` seems more popular

Replace `Vec::new()` used in `GenericError` with `vec![]` as `vec![]`
seems more popular

(There are so, so many)
2023-12-07 00:40:03 +01:00
Jack Wright
31146a7591
Upgrading to polars 0.35 (#11241)
Co-authored-by: Jack Wright <jack.wright@disqo.com>
2023-12-05 18:09:34 -06:00
Eric Hodel
67eec92e76
Convert more ShellError variants to named fields (#11222)
# Description

Convert errors to named fields:
* NeedsPositiveValue
* MissingConfigValue
* UnsupportedConfigValue
* DowncastNotPossible
* NonUtf8Custom
* NonUtf8
* DidYouMeanCustom
* DidYouMean
* ReadingFile
* RemoveNotPossible
* ChangedModifiedTimeNotPossible
* ChangedAccessTimeNotPossible

Part of #10700
2023-12-04 10:19:32 +01:00
Stefan Holderbach
112306aab5
Revert "Adding support for Polars structs" (#11171)
Reverts nushell/nushell#10943

The current implementation of `arr_to_value` is unsound, as it allows
casts of arbitrary data to arbitrary types without being marked
`unsafe`.
The full safety requirements to perform both the cast and the following
unchecked access are not clear enough that a simple change of `fn
arr_to_value` to `unsafe fn arr_to_value` could be blessed without
further investigation.

cc @ayax79
2023-11-29 16:33:27 +01:00
Eric Hodel
e36f69bf3c
Convert FileNotFoundCustom to named fields (#11123)
# Description

Part of #10700

# User-Facing Changes

None

# Tests + Formatting

- 🟢 `toolkit fmt`
- 🟢 `toolkit clippy`
- 🟢 `toolkit test`
- 🟢 `toolkit test stdlib`

# After Submitting

N/A
2023-11-21 17:30:21 -06:00
nibon7
f41c93b2d3
Apply nightly clippy fixes (#11083)

# Description
Clippy fixes for rust 1.76.0-nightly

# User-Facing Changes

N/A
# Tests + Formatting

# After Submitting
2023-11-17 09:15:55 -06:00
Jack Wright
fe92051bb3
Adding support for Polars structs (#10943)
Provides support for reading Polars structs. This allows opening of
supported files (jsonl, parquet, etc) that contain rows with structured
data.

The following attached json lines file
([receipts.jsonl.gz](https://github.com/nushell/nushell/files/13311476/receipts.jsonl.gz))
contains a customer column with structured data. This json lines file
can now be loaded via `dfr open` and will render as follows:

<img width="525" alt="Screenshot 2023-11-09 at 10 09 18"
src="https://github.com/nushell/nushell/assets/56345/4b26ccdc-c230-43ae-a8d5-8af88a1b72de">


This also includes some cleanup of date handling, utilizing
timezones where provided.

This pull request only addresses reading data from polars structs. I
will address converting nushell data to polars structs in a future
request as this change is large enough as it is.

---------

Co-authored-by: Jack Wright <jack.wright@disqo.com>
2023-11-09 19:00:59 -06:00
Christopher Durham
0f600bc3f5
Improve case insensitivity consistency (#10884)
# Description

Add an extension trait `IgnoreCaseExt` to nu_utils which adds some case
insensitivity helpers, and use them throughout nu to improve the
handling of case insensitivity. Proper case folding is done via unicase,
which is already a dependency via mime_guess from nu-command.

In actuality a lot of code still does `to_lowercase`, because unicase
only provides immediate comparison and doesn't expose a `to_folded_case`
yet. And since we do a lot of `contains`/`starts_with`/`ends_with`, it's
not sufficient to just have `eq_ignore_case`. But if we get access in
the future, this makes us ready to use it with a change in one place.

Plus, it's clearer what the purpose is at the call site to call
`to_folded_case` instead of `to_lowercase` if it's exclusively for the
purpose of case insensitive comparison, even if it just does
`to_lowercase` still.
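
A hedged sketch of the extension-trait shape (the real `IgnoreCaseExt` lives in `nu_utils`; the bodies here fall back to `to_lowercase`, which is exactly the limitation noted above):

```rust
// Simplified illustration of the extension-trait idea; proper case folding
// would go through a crate like unicase, as described above.
trait IgnoreCaseExtLike {
    fn to_folded_case(&self) -> String;
    fn eq_ignore_case(&self, other: &str) -> bool;
}

impl IgnoreCaseExtLike for str {
    fn to_folded_case(&self) -> String {
        // Stand-in: real folding handles cases that plain lowercasing misses.
        self.to_lowercase()
    }
    fn eq_ignore_case(&self, other: &str) -> bool {
        self.to_folded_case() == other.to_folded_case()
    }
}

fn main() {
    assert!("Nushell".eq_ignore_case("NUSHELL"));
    // Lowercasing alone misses some folds (e.g. "ß" vs "ss"), which is why the
    // description above points at unicase for proper comparisons.
    assert!(!"Straße".eq_ignore_case("STRASSE"));
}
```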

# User-Facing Changes

- Some commands that were supposed to be case insensitive remained only
insensitive to ASCII case (a-z), and now are case insensitive w.r.t.
non-ASCII characters as well.

# Tests + Formatting

- 🟢 `toolkit fmt`
- 🟢 `toolkit clippy`
- 🟢 `toolkit test`
- 🟢 `toolkit test stdlib`

---------

Co-authored-by: Stefan Holderbach <sholderbach@users.noreply.github.com>
2023-11-08 23:58:54 +01:00
Eric Hodel
7a3cbf43e8
Convert ShellError::UnsupportedInput to named fields (#10971)
# Description

This is easy to do with rust-analyzer, but I didn't want to just pump
these all out without feedback.

Part of #10700

# User-Facing Changes

None

# Tests + Formatting

- 🟢 `toolkit fmt`
- 🟢 `toolkit clippy`
- 🟢 `toolkit test`
- 🟢 `toolkit test stdlib`

# After Submitting

N/A

---------

Co-authored-by: Stefan Holderbach <sholderbach@users.noreply.github.com>
2023-11-07 23:25:32 +01:00
Ian Manske
15c22db8f4
Make FromValue take owned Values (#10900)
# Description
Changes `FromValue` to take owned `Value`s instead of borrowed `Value`s.
This eliminates some unnecessary clones (e.g., in `call_ext.rs`).

# User-Facing Changes
Breaking API change for `nu_protocol`.
2023-10-31 19:47:00 +01:00
Stefan Holderbach
4b301710d3
Convert more examples and tests to record! macro (#10840)
# Description
Use the `record!` macro instead of defining two separate `vec!`s for
`cols` and `vals` when appropriate.
This visually aligns the key with the value.
Furthermore, you don't have to deal with the construction of `Record {
cols, vals }`, so we can hide the implementation details in the future.

## State

Not covering all possible commands yet, also some tests/examples are
better expressed by creating cols and vals separately.

# User/Developer-Facing Changes
The examples and tests should read more natural. No relevant functional
change

# Bycatch

Where I noticed it, I replaced usages of `Value` constructors taking
`Span::test_data()` or `Span::unknown()` with the `Value::test_...`
constructors. This should make things more readable and also simplify
changes to the `Span` system in the future.
2023-10-28 14:52:31 +02:00
Jack Wright
c6016d7659
Dataframe support for small int types (#10828)
Turned on features to allow signed and unsigned 8 and 16 bit types.

---------

Co-authored-by: Jack Wright <jack.wright@disqo.com>
2023-10-24 21:25:21 -05:00
Stefan Holderbach
c925537c48
Update polars to 0.33 (#10672)
# Description
Open question:

Undocumented behavior for the new argument `ambiguous` to the
`as_datetime`
methods. I cheated by passing a default (assuming empty string).
This appears to be an API primarily serving the Python impl:


https://pola-rs.github.io/polars/py-polars/html/reference/expressions/api/polars.Expr.str.to_datetime.html#polars-expr-str-to-datetime


# User-Facing Changes
Only dependent on breaking changes to the behavior of polars.

# Tests + Formatting
No observed changes to tests

Manually checked `dfr as-datetime`, doesn't seem to panic.
2023-10-11 21:28:18 +02:00
Stefan Holderbach
81ece18d5e
Add a stub dfr command (#10683)
# Description
This will only display the list of subcommands.

Prompted by a question on Discord about why completions may be missing.
With standard completion settings, getting the subcommands doesn't seem
to be a problem, but we could add this command for good measure.
# User-Facing Changes
New command `dfr` that does nothing apart from displaying the
subcommands and hogging a space in the completions

# Tests + Formatting
(-)
2023-10-11 17:51:20 +02:00
Hofer-Julian
7dbda76fad
Add long options for core and dataframes (#10619) 2023-10-06 18:55:29 +02:00
Darren Schroeder
a9a82de5c4
fix some new chrono warnings (#10384)
# Description

This PR cleans up some warnings on the latest chrono dependency.

# User-Facing Changes

# Tests + Formatting

# After Submitting
2023-09-15 15:46:25 -05:00
Stefan Holderbach
bbf0b45c59
Update internal use of decimal to float (#10333)
# Description
We made the decision that our floating point type should be referred to
as `float` over `decimal`.
Commands were updated by #9979 and #10320

Now make the internal codebase consistent in referring to this data type
as `float`.

Work for #10332

# User-Facing Changes

`decimal` has been removed as a type name/symbol. 

Instead of 
```nushell
def foo [bar: decimal] decimal -> decimal {}
```
use 
```nushell
def foo [bar: float] float -> float {}
```

Potential effect of `SyntaxShape`'s `Display` implementation now also
referring to `float` instead of `decimal`

# Details
- Rename `SyntaxShape::Decimal` to `Float`
- Update `Display for SyntaxShape` to `float`
- Update error message + fn name in dataframe code
- Fix docs in command examples
- Rename tests that are float specific
- Update doccomment on `SyntaxShape`
- Update comment in script

# Tests + Formatting
Updates the names of some tests
2023-09-13 23:53:55 +02:00
Stefan Holderbach
a14e9e0a2e
Invert &Options to Option<&T> (#10315)
Elide the reference for `Copy` types (`usize`).
Use the canonical deref where possible:
* `&Box` -> `&`
* `&String` -> `&str`
* `&PathBuf` -> `&Path`

Skips the ctrl-C handler for now.
2023-09-13 07:00:58 +08:00
JT
6cdfee3573
Move Value to helpers, separate span call (#10121)
# Description

As part of the refactor to split spans off of Value, this moves to using
helper functions to create values, and using `.span()` instead of
matching span out of Value directly.

Hoping to get a few more helping hands to finish this, as there are a
lot of commands to update :)
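
As a rough sketch of the direction (a toy type, not the real `Value`):

```rust
// Toy illustration of the refactor direction described above: use helper
// constructors and a `span()` accessor instead of matching variants by hand.
#[derive(Debug, Clone, Copy)]
struct Span {
    start: usize,
    end: usize,
}

#[derive(Debug)]
enum Val {
    Int { val: i64, span: Span },
}

impl Val {
    // Helper constructor instead of building the variant inline everywhere.
    fn int(val: i64, span: Span) -> Self {
        Val::Int { val, span }
    }

    // Accessor instead of matching the span out of every variant.
    fn span(&self) -> Span {
        match self {
            Val::Int { span, .. } => *span,
        }
    }
}

fn main() {
    let v = Val::int(42, Span { start: 0, end: 2 });
    println!("{:?}", v.span());
}
```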

# User-Facing Changes

# Tests + Formatting

# After Submitting

---------

Co-authored-by: Darren Schroeder <343840+fdncred@users.noreply.github.com>
Co-authored-by: WindSoilder <windsoilder@outlook.com>
2023-09-03 07:27:29 -07:00
Jack Wright
fd4ba0443d
fixed usages of deprecated chrono DateTime::from_utc (#10161)
This addresses the warnings generated from using DateTime::from_utc.
DateTime::from_utc was deprecated as of chrono 0.4.27

Co-authored-by: Jack Wright <jack.wright@disqo.com>
2023-08-30 17:04:19 -05:00
Jack Wright
3fd1a26ec0
Updating polars and sqlparser versions (#10114)
Polars and SQLParser upgrade.

I have exposed features that have been added to polars as command args
where appropriate.

---------

Co-authored-by: Jack Wright <jack.wright@disqo.com>
Co-authored-by: Darren Schroeder <343840+fdncred@users.noreply.github.com>
Co-authored-by: sholderbach <sholderbach@users.noreply.github.com>
2023-08-30 00:13:34 +02:00
Matthias Q
cea67cb30b
Allow for .parq file ending as alternative to .parquet (#10112)
# Description

Many systems like Hadoop's HDFS store parquet files with the short
variant `.parq`. It is quite annoying to rename these files before
opening them with nushell. This PR lets nushell accept `.parq` alongside
`.parquet` file endings.


# User-Facing Changes
Not sure if this is applicable here.

# Tests + Formatting

Make sure you've run and fixed any issues with these commands:

- `cargo fmt --all -- --check` to check standard code formatting (`cargo
fmt --all` applies these changes) - ✔️
- `cargo clippy --workspace -- -D warnings -D clippy::unwrap_used` to
check that you're using the standard code style - ✔️
- `cargo test --workspace` to check that all tests pass - (fails on an
unrelated test)
- `cargo run -- -c "use std testing; testing run-tests --path
crates/nu-std"` to run the tests for the standard library -
✔️
2023-08-24 15:57:33 -05:00
JT
1e3e034021
Spanned Value step 1: span all value cases (#10042)
# Description

This doesn't really do much that the user could see, but it helps get us
ready to do the steps of the refactor to split the span off of Value, so
that values can be spanless. This allows us to have top-level values
that can hold both a Value and a Span, without requiring that all values
have them.

We expect to see significant memory reduction by removing so many
unnecessary spans from values. For example, a table of 100,000 rows and
5 columns would have a savings of ~8megs in just spans that are almost
always duplicated.

# User-Facing Changes

Nothing yet

# Tests + Formatting

# After Submitting
2023-08-25 08:48:05 +12:00
Ian Manske
8da27a1a09
Create Record type (#10103)
# Description
This PR creates a new `Record` type to reduce duplicate code and
possibly bugs as well. (This is an edited version of #9648.)
- `Record` implements `FromIterator` and `IntoIterator` and so can be
iterated over or collected into. For example, this helps with
conversions to and from (hash)maps. (Also, no more
`cols.iter().zip(vals)`!)
- `Record` has a `push(col, val)` function to help ensure that the
number of columns is equal to the number of values. I caught a few
potential bugs thanks to this (e.g. in the `ls` command).
- Finally, this PR also adds a `record!` macro that helps simplify
record creation. It is used like so:
   ```rust
   record! {
       "key1" => some_value,
       "key2" => Value::string("text", span),
       "key3" => Value::int(optional_int.unwrap_or(0), span),
       "key4" => Value::bool(config.setting, span),
   }
   ```
Since macros hinder formatting, etc., the right hand side values should
be relatively short and sweet like the examples above.

Where possible, prefer `record!` or `.collect()` on an iterator instead
of multiple `Record::push`s, since the first two automatically set the
record capacity and do less work overall.
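
For example, collecting into a `Record` instead of repeated `push` calls might look like the following sketch, which assumes the `nu-protocol` API as described above (exact paths and helper names may differ):

```rust
use nu_protocol::{record, Record, Span, Value};

// A sketch based on the description above; it assumes `Record` implements
// `FromIterator` and has `push`, as stated in this PR.
fn build_records() -> (Record, Record) {
    let span = Span::test_data();

    // Collecting from an iterator sets the record capacity up front.
    let mut collected: Record = (1..=3i64)
        .map(|i| (format!("col{i}"), Value::int(i, span)))
        .collect();

    // `push` is still there for keys that arrive one at a time.
    collected.push("extra", Value::bool(true, span));

    // A fixed set of keys reads nicely with the `record!` macro.
    let fixed = record! {
        "name" => Value::string("nushell", span),
        "stars" => Value::int(5, span),
    };

    (collected, fixed)
}

fn main() {
    let _ = build_records();
}
```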

# User-Facing Changes
Besides the changes in `nu-protocol` the only other breaking changes are
to `nu-table::{ExpandedTable::build_map, JustTable::kv_table}`.
2023-08-25 07:50:29 +12:00
Jakub Žádník
fb908df17d
Add additional span to IncorrectValue error (#10036) 2023-08-18 20:47:05 +03:00
Jack Wright
7a123d3eb1
Expose polars avro support (#10019)
# Description

Exposes polars avro support via dfr open and dfr to-avro

---------

Co-authored-by: Jack Wright <jack.wright@disqo.com>
2023-08-15 20:31:49 -05:00
Jack Wright
8b160f9850
Nushell table list columns -> dataframe list columns. Explode / Flatten dataframe support. (#9951)
# Description
- Adds support for conversion between nushell lists and polars lists
instead of treating them as a polars object.
- Fixed explode and flatten to work both as expressions and as lazy
dataframe commands. The previous item was required to make this work.

---------

Co-authored-by: Jack Wright <jack.wright@disqo.com>
Co-authored-by: Darren Schroeder <343840+fdncred@users.noreply.github.com>
2023-08-15 06:54:37 -05:00
Reilly Wood
d5fa7b8a55
Put heavy dataframe dependencies behind feature flag (#9971)
Context from Discord:
https://discord.com/channels/601130461678272522/615962413203718156/1138694933545504819

I was working on Nu for the first time in a while and I noticed that
sometimes rust-analyzer takes a really long time to run `cargo check` on
the entire workspace. I dug in and it was checking a bunch of
dataframe-related dependencies even though the `dataframe` feature is
not built by default.

It looks like this is a regression of sorts, introduced by
https://github.com/nushell/nushell/pull/9241. Thankfully the fix is
pretty easy, we can make it so everything important in
`nu-cmd-dataframe` is only used when the `dataframe` feature is enabled.

### Impact on `cargo check --workspace`

Before this PR: 635 crates, 33.59s
After this PR: 498 crates, ~20s

(with the `mold` linker and a `cargo clean` before each run, the
relative difference for incremental checks will likely be much larger)
2023-08-09 22:36:09 -07:00
JT
c8f3799c20
Fix a couple clippy warnings (#9936)

# Description

# User-Facing Changes

# Tests + Formatting

# After Submitting
2023-08-07 06:23:11 +12:00
Jack Wright
87abfee268
Merged overloaded commands (#9860)
- fixes #9807

# Description

This pull request merges all overloaded dfr commands into one command:

eager:
dfr first -> eager/first.rs
dfr last -> eager/last.rs
dfr into-nu -> eager/to_nu.rs (merged)

lazy:
dfr min -> expressions/expressions_macro.rs lazy_expressions_macro
dfr max -> expressions/expressions_macro.rs lazy_expressions_macro
dfr sum -> expressions/expressions_macro.rs lazy_expressions_macro
dfr mean -> expressions/expressions_macro.rs lazy_expressions_macro
dfr std -> expressions/expressions_macro.rs lazy_expressions_macro
dfr var   -> expressions/expressions_macro.rs lazy_expressions_macro

series:
dfr n-unique -> series/n_unique.rs
dfr is-not-null -> series/masks/is_not_null.rs
dfr is-null -> series/masks/is_null.rs

# User-Facing Changes
No user facing changes

---------

Co-authored-by: Jack Wright <jack.wright@disqo.com>
2023-07-31 07:34:12 -05:00
Jack Wright
bf5bd3ff10
"merging into one dfr into-nu command" (#9858)
- fixes #9806

# Description

Merges the ExprAsNu and ToNu commands into one command.

# User-Facing Changes

As both commands were overloading `dfr into-nu`, there are no
user-facing changes.

---------

Co-authored-by: Jack Wright <jack.wright@disqo.com>
2023-07-29 15:23:31 -05:00
Ian Manske
7e1b922ea7
Add functions for each Value case (#9736)
# Description
This PR ensures functions exist to extract and create each and every
`Value` case. It also renames `Value::boolean` to `Value::bool` to match
`Value::test_bool`, `Value::as_bool`, and `Value::Bool`. Similarly,
`Value::as_integer` was renamed to `Value::as_int` to be consistent with
`Value::int`, `Value::test_int`, and `Value::Int`. These two renames can
be undone if necessary.

# User-Facing Changes
No user facing changes, but two public functions were renamed which may
affect downstream dependents.
2023-07-21 08:20:33 -05:00
Stefan Holderbach
bd0032898f
Apply nightly clippy lints (#9654)
# Description
- A new one is the removal of unnecessary `#` in raw strings without `"`
inside:
  https://rust-lang.github.io/rust-clippy/master/index.html#/needless_raw_string_hashes
- The automatically applied removal of `.into_iter()` touched several
places where #9648 will change to the use of the record API. If
necessary I can remove them, @IanManske, to avoid churn with this PR.
- Manually applied `.try_fold` in two places
- Removed a dead `if`
- Manual: Combat rightward-drift with early return
2023-07-12 00:00:31 +02:00