nushell/crates/nu_plugin_query/src/query_web.rs

423 lines
13 KiB
Rust
Raw Normal View History

Reorganize plugin API around commands (#12170) [Context on Discord](https://discord.com/channels/601130461678272522/855947301380947968/1216517833312309419) # Description This is a significant breaking change to the plugin API, but one I think is worthwhile. @ayax79 mentioned on Discord that while trying to start on a dataframes plugin, he was a little disappointed that more wasn't provided in terms of code organization for commands, particularly since there are *a lot* of `dfr` commands. This change treats plugins more like miniatures of the engine, with dispatch of the command name being handled inherently, each command being its own type, and each having their own signature within the trait impl for the command type rather than having to find a way to centralize it all into one `Vec`. For the example plugins that have multiple commands, I definitely like how this looks a lot better. This encourages doing code organization the right way and feels very good. For the plugins that have only one command, it's just a little bit more boilerplate - but still worth it, in my opinion. The `Box<dyn PluginCommand<Plugin = Self>>` type in `commands()` is a little bit hairy, particularly for Rust beginners, but ultimately not so bad, and it gives the desired flexibility for shared state for a whole plugin + the individual commands. # User-Facing Changes Pretty big breaking change to plugin API, but probably one that's worth making. ```rust use nu_plugin::*; use nu_protocol::{PluginSignature, PipelineData, Type, Value}; struct LowercasePlugin; struct Lowercase; // Plugins can now have multiple commands impl PluginCommand for Lowercase { type Plugin = LowercasePlugin; // The signature lives with the command fn signature(&self) -> PluginSignature { PluginSignature::build("lowercase") .usage("Convert each string in a stream to lowercase") .input_output_type(Type::List(Type::String.into()), Type::List(Type::String.into())) } // We also provide SimplePluginCommand which operates on Value like before fn run( &self, plugin: &LowercasePlugin, engine: &EngineInterface, call: &EvaluatedCall, input: PipelineData, ) -> Result<PipelineData, LabeledError> { let span = call.head; Ok(input.map(move |value| { value.as_str() .map(|string| Value::string(string.to_lowercase(), span)) // Errors in a stream should be returned as values. .unwrap_or_else(|err| Value::error(err, span)) }, None)?) } } // Plugin now just has a list of commands, and the custom value op stuff still goes here impl Plugin for LowercasePlugin { fn commands(&self) -> Vec<Box<dyn PluginCommand<Plugin=Self>>> { vec![Box::new(Lowercase)] } } fn main() { serve_plugin(&LowercasePlugin{}, MsgPackSerializer) } ``` Time this however you like - we're already breaking stuff for 0.92, so it might be good to do it now, but if it feels like a lot all at once, it could wait. # Tests + Formatting - :green_circle: `toolkit fmt` - :green_circle: `toolkit clippy` - :green_circle: `toolkit test` - :green_circle: `toolkit test stdlib` # After Submitting - [ ] Update examples in the book - [x] Fix #12088 to match - this change would actually simplify it a lot, because the methods are currently just duplicated between `Plugin` and `StreamingPlugin`, but they only need to be on `Plugin` with this change
2024-03-14 21:40:02 +00:00
use crate::{web_tables::WebTable, Query};
use nu_plugin::{EngineInterface, EvaluatedCall, SimplePluginCommand};
use nu_protocol::{
Category, LabeledError, PluginExample, PluginSignature, Record, Span, Spanned, SyntaxShape,
Value,
};
use scraper::{Html, Selector as ScraperSelector};
Reorganize plugin API around commands (#12170) [Context on Discord](https://discord.com/channels/601130461678272522/855947301380947968/1216517833312309419) # Description This is a significant breaking change to the plugin API, but one I think is worthwhile. @ayax79 mentioned on Discord that while trying to start on a dataframes plugin, he was a little disappointed that more wasn't provided in terms of code organization for commands, particularly since there are *a lot* of `dfr` commands. This change treats plugins more like miniatures of the engine, with dispatch of the command name being handled inherently, each command being its own type, and each having their own signature within the trait impl for the command type rather than having to find a way to centralize it all into one `Vec`. For the example plugins that have multiple commands, I definitely like how this looks a lot better. This encourages doing code organization the right way and feels very good. For the plugins that have only one command, it's just a little bit more boilerplate - but still worth it, in my opinion. The `Box<dyn PluginCommand<Plugin = Self>>` type in `commands()` is a little bit hairy, particularly for Rust beginners, but ultimately not so bad, and it gives the desired flexibility for shared state for a whole plugin + the individual commands. # User-Facing Changes Pretty big breaking change to plugin API, but probably one that's worth making. ```rust use nu_plugin::*; use nu_protocol::{PluginSignature, PipelineData, Type, Value}; struct LowercasePlugin; struct Lowercase; // Plugins can now have multiple commands impl PluginCommand for Lowercase { type Plugin = LowercasePlugin; // The signature lives with the command fn signature(&self) -> PluginSignature { PluginSignature::build("lowercase") .usage("Convert each string in a stream to lowercase") .input_output_type(Type::List(Type::String.into()), Type::List(Type::String.into())) } // We also provide SimplePluginCommand which operates on Value like before fn run( &self, plugin: &LowercasePlugin, engine: &EngineInterface, call: &EvaluatedCall, input: PipelineData, ) -> Result<PipelineData, LabeledError> { let span = call.head; Ok(input.map(move |value| { value.as_str() .map(|string| Value::string(string.to_lowercase(), span)) // Errors in a stream should be returned as values. .unwrap_or_else(|err| Value::error(err, span)) }, None)?) } } // Plugin now just has a list of commands, and the custom value op stuff still goes here impl Plugin for LowercasePlugin { fn commands(&self) -> Vec<Box<dyn PluginCommand<Plugin=Self>>> { vec![Box::new(Lowercase)] } } fn main() { serve_plugin(&LowercasePlugin{}, MsgPackSerializer) } ``` Time this however you like - we're already breaking stuff for 0.92, so it might be good to do it now, but if it feels like a lot all at once, it could wait. # Tests + Formatting - :green_circle: `toolkit fmt` - :green_circle: `toolkit clippy` - :green_circle: `toolkit test` - :green_circle: `toolkit test stdlib` # After Submitting - [ ] Update examples in the book - [x] Fix #12088 to match - this change would actually simplify it a lot, because the methods are currently just duplicated between `Plugin` and `StreamingPlugin`, but they only need to be on `Plugin` with this change
2024-03-14 21:40:02 +00:00
pub struct QueryWeb;
impl SimplePluginCommand for QueryWeb {
type Plugin = Query;
fn signature(&self) -> PluginSignature {
PluginSignature::build("query web")
.usage("execute selector query on html/web")
.named("query", SyntaxShape::String, "selector query", Some('q'))
.switch("as-html", "return the query output as html", Some('m'))
.plugin_examples(web_examples())
.named(
"attribute",
SyntaxShape::String,
"downselect based on the given attribute",
Some('a'),
)
.named(
"as-table",
SyntaxShape::List(Box::new(SyntaxShape::String)),
"find table based on column header list",
Some('t'),
)
.switch(
"inspect",
"run in inspect mode to provide more information for determining column headers",
Some('i'),
)
.category(Category::Network)
}
fn run(
&self,
_plugin: &Query,
_engine: &EngineInterface,
call: &EvaluatedCall,
input: &Value,
) -> Result<Value, LabeledError> {
parse_selector_params(call, input)
}
}
pub fn web_examples() -> Vec<PluginExample> {
vec![
PluginExample {
example: "http get https://phoronix.com | query web --query 'header' | flatten".into(),
description: "Retrieve all `<header>` elements from phoronix.com website".into(),
result: None,
},
PluginExample {
example: "http get https://en.wikipedia.org/wiki/List_of_cities_in_India_by_population |
query web --as-table [City 'Population(2011)[3]' 'Population(2001)[3][a]' 'State or unionterritory' 'Ref']".into(),
description: "Retrieve a html table from Wikipedia and parse it into a nushell table using table headers as guides".into(),
result: None
},
PluginExample {
example: "http get https://www.nushell.sh | query web --query 'h2, h2 + p' | each {str join} | group 2 | each {rotate --ccw tagline description} | flatten".into(),
description: "Pass multiple css selectors to extract several elements within single query, group the query results together and rotate them to create a table".into(),
result: None,
},
PluginExample {
example: "http get https://example.org | query web --query a --attribute href".into(),
description: "Retrieve a specific html attribute instead of the default text".into(),
result: None,
}
]
}
pub struct Selector {
pub query: String,
pub as_html: bool,
pub attribute: String,
pub as_table: Value,
pub inspect: bool,
}
impl Selector {
pub fn new() -> Selector {
Selector {
query: String::new(),
as_html: false,
attribute: String::new(),
as_table: Value::string("".to_string(), Span::unknown()),
inspect: false,
}
}
}
impl Default for Selector {
fn default() -> Self {
Self::new()
}
}
pub fn parse_selector_params(call: &EvaluatedCall, input: &Value) -> Result<Value, LabeledError> {
let head = call.head;
let query: Option<Spanned<String>> = call.get_flag("query")?;
let as_html = call.has_flag("as-html")?;
let attribute = call.get_flag("attribute")?.unwrap_or_default();
let as_table: Value = call
.get_flag("as-table")?
.unwrap_or_else(|| Value::nothing(head));
let inspect = call.has_flag("inspect")?;
if let Some(query) = &query {
if let Err(err) = ScraperSelector::parse(&query.item) {
return Err(LabeledError::new("CSS query parse error")
.with_label(err.to_string(), query.span)
.with_help("cannot parse this query as a valid CSS selector"));
}
} else {
return Err(
LabeledError::new("Missing query argument").with_label("add --query here", call.head)
);
}
let selector = Selector {
query: query.map(|q| q.item).unwrap_or_default(),
as_html,
attribute,
as_table,
inspect,
};
Move Value to helpers, separate span call (#10121) # Description As part of the refactor to split spans off of Value, this moves to using helper functions to create values, and using `.span()` instead of matching span out of Value directly. Hoping to get a few more helping hands to finish this, as there are a lot of commands to update :) # User-Facing Changes <!-- List of all changes that impact the user experience here. This helps us keep track of breaking changes. --> # Tests + Formatting <!-- Don't forget to add tests that cover your changes. Make sure you've run and fixed any issues with these commands: - `cargo fmt --all -- --check` to check standard code formatting (`cargo fmt --all` applies these changes) - `cargo clippy --workspace -- -D warnings -D clippy::unwrap_used` to check that you're using the standard code style - `cargo test --workspace` to check that all tests pass (on Windows make sure to [enable developer mode](https://learn.microsoft.com/en-us/windows/apps/get-started/developer-mode-features-and-debugging)) - `cargo run -- -c "use std testing; testing run-tests --path crates/nu-std"` to run the tests for the standard library > **Note** > from `nushell` you can also use the `toolkit` as follows > ```bash > use toolkit.nu # or use an `env_change` hook to activate it automatically > toolkit check pr > ``` --> # After Submitting <!-- If your PR had any user-facing changes, update [the documentation](https://github.com/nushell/nushell.github.io) after the PR is merged, if necessary. This will help us keep the docs up to date. --> --------- Co-authored-by: Darren Schroeder <343840+fdncred@users.noreply.github.com> Co-authored-by: WindSoilder <windsoilder@outlook.com>
2023-09-03 14:27:29 +00:00
let span = input.span();
match input {
Move Value to helpers, separate span call (#10121) # Description As part of the refactor to split spans off of Value, this moves to using helper functions to create values, and using `.span()` instead of matching span out of Value directly. Hoping to get a few more helping hands to finish this, as there are a lot of commands to update :) # User-Facing Changes <!-- List of all changes that impact the user experience here. This helps us keep track of breaking changes. --> # Tests + Formatting <!-- Don't forget to add tests that cover your changes. Make sure you've run and fixed any issues with these commands: - `cargo fmt --all -- --check` to check standard code formatting (`cargo fmt --all` applies these changes) - `cargo clippy --workspace -- -D warnings -D clippy::unwrap_used` to check that you're using the standard code style - `cargo test --workspace` to check that all tests pass (on Windows make sure to [enable developer mode](https://learn.microsoft.com/en-us/windows/apps/get-started/developer-mode-features-and-debugging)) - `cargo run -- -c "use std testing; testing run-tests --path crates/nu-std"` to run the tests for the standard library > **Note** > from `nushell` you can also use the `toolkit` as follows > ```bash > use toolkit.nu # or use an `env_change` hook to activate it automatically > toolkit check pr > ``` --> # After Submitting <!-- If your PR had any user-facing changes, update [the documentation](https://github.com/nushell/nushell.github.io) after the PR is merged, if necessary. This will help us keep the docs up to date. --> --------- Co-authored-by: Darren Schroeder <343840+fdncred@users.noreply.github.com> Co-authored-by: WindSoilder <windsoilder@outlook.com>
2023-09-03 14:27:29 +00:00
Value::String { val, .. } => Ok(begin_selector_query(val.to_string(), selector, span)),
_ => Err(LabeledError::new("Requires text input")
.with_label("expected text from pipeline", span)),
}
}
fn begin_selector_query(input_html: String, selector: Selector, span: Span) -> Value {
if let Value::List { .. } = selector.as_table {
return retrieve_tables(
input_html.as_str(),
&selector.as_table,
selector.inspect,
span,
);
} else if selector.attribute.is_empty() {
execute_selector_query(
input_html.as_str(),
selector.query.as_str(),
selector.as_html,
selector.inspect,
span,
)
} else {
execute_selector_query_with_attribute(
input_html.as_str(),
selector.query.as_str(),
selector.attribute.as_str(),
selector.inspect,
span,
)
}
}
pub fn retrieve_tables(
input_string: &str,
columns: &Value,
inspect_mode: bool,
span: Span,
) -> Value {
let html = input_string;
let mut cols: Vec<String> = Vec::new();
if let Value::List { vals, .. } = &columns {
for x in vals {
if let Value::String { val, .. } = x {
cols.push(val.to_string())
}
}
}
if inspect_mode {
update query web wiki example (#11709) # Description This PR tries to make `query web` more resilient and easier to debug with the `--inspect` parameter when trying to scrape tables. Previously it would just fail, now at least it tries to give you a hint. This is some example output now of when something went wrong. ``` ❯ http get https://en.wikipedia.org/wiki/List_of_cities_in_India_by_population | query web --as-table [Rank City 'Population(2011)[3]' 'Population(2001)[3][a]' 'State or union territory'] --inspect Passed in Column Headers = ["Rank", "City", "Population(2011)[3]", "Population(2001)[3][a]", "State or union territory"] First 2048 HTML chars = <!DOCTYPE html> <html class="client-nojs vector-feature-language-in-header-enabled vector-feature-language-in-main-page-header-disabled vector-feature-sticky-header-disabled vector-feature-page-tools-pinned-disabled vector-feature-toc-pinned-clientpref-1 vector-feature-main-menu-pinned-disabled vector-feature-limited-width-clientpref-1 vector-feature-limited-width-content-enabled vector-feature-custom-font-size-clientpref-0 vector-feature-client-preferences-disabled vector-feature-client-prefs-pinned-disabled vector-toc-available" lang="en" dir="ltr"> <head> <meta charset="UTF-8"> <title>List of cities in India by population - Wikipedia</title> <script>(function(){var className="client-js vector-feature-language-in-header-enabled vector-feature-language-in-main-page-header-disabled vector-feature-sticky-header-disabled vector-feature-page-tools-pinned-disabled vector-feature-toc-pinned-clientpref-1 vector-feature-main-menu-pinned-disabled vector-feature-limited-width-clientpref-1 vector-feature-limited-width-content-enabled vector-feature-custom-font-size-clientpref-0 vector-feature-client-preferences-disabled vector-feature-client-prefs-pinned-disabled vector-toc-available";var cookie=document.cookie.match(/(?:^|; )enwikimwclientpreferences=([^;]+)/);if(cookie){cookie[1].split('%2C').forEach(function(pref){className=className.replace(new RegExp('(^| )'+pref.replace(/-clientpref-\w+$|[^\w-]+/g,'')+'-clientpref-\\w+( |$)'),'$1'+pref+'$2');});}document.documentElement.className=className;}());RLCONF={"wgBreakFrames":false,"wgSeparatorTransformTable":["",""],"wgDigitTransformTable":["",""],"wgDefaultDateFormat":"dmy","wgMonthNames":["", "January","February","March","April","May","June","July","August","September","October","November","December"],"wgRequestId":"9ecdad8f-2dbd-4245-b54d-9c57aea5ca45","wgCanonicalNamespace":"","wgCanonicalSpecialPageName":false,"wgNamespaceNumber":0,"wgPageName":"List_of_cities_in_India_by_population","wgTitle":"List of cities in India by population","wgCurRevisionId":1192093210,"wgRev Potential HTML Headers = ["City", "Population(2011)[3]", "Population(2001)[3][a]", "State or unionterritory", "Ref"] Potential HTML Headers = ["City", "Population(2011)[5]", "Population(2001)", "State or unionterritory"] Potential HTML Headers = [".mw-parser-output .navbar{display:inline;font-size:88%;font-weight:normal}.mw-parser-output .navbar-collapse{float:left;text-align:left}.mw-parser-output .navbar-boxtext{word-spacing:0}.mw-parser-output .navbar ul{display:inline-block;white-space:nowrap;line-height:inherit}.mw-parser-output .navbar-brackets::before{margin-right:-0.125em;content:\"[ \"}.mw-parser-output .navbar-brackets::after{margin-left:-0.125em;content:\" ]\"}.mw-parser-output .navbar li{word-spacing:-0.125em}.mw-parser-output .navbar a>span,.mw-parser-output .navbar a>abbr{text-decoration:inherit}.mw-parser-output .navbar-mini abbr{font-variant:small-caps;border-bottom:none;text-decoration:none;cursor:inherit}.mw-parser-output .navbar-ct-full{font-size:114%;margin:0 7em}.mw-parser-output .navbar-ct-mini{font-size:114%;margin:0 4em}vtePopulation of cities in India"] Potential HTML Headers = ["vteGeography of India"] ╭──────────────────────────┬─────────────────────────────────────────────────────╮ │ Rank │ error: no data found (column name may be incorrect) │ │ City │ error: no data found (column name may be incorrect) │ │ Population(2011)[3] │ error: no data found (column name may be incorrect) │ │ Population(2001)[3][a] │ error: no data found (column name may be incorrect) │ │ State or union territory │ error: no data found (column name may be incorrect) │ ╰──────────────────────────┴─────────────────────────────────────────────────────╯ ``` The key here is to look at the `Passed in Column Headers` and compare them to the `Potential HTML Headers` and couple that with the error table at the bottom should give you a hint that, in this situation, wikipedia has changed the column names, yet again. So we need to update our query web statement's tables to get closer to what we want. ``` ❯ http get https://en.wikipedia.org/wiki/List_of_cities_in_India_by_population | query web --as-table [City 'Population(2011)[3]' 'Population(2001)[3][a]' 'State or unionterritory' 'Ref'] ╭─#──┬───────City───────┬─Population(2011)[3]─┬─Population(2001)[3][a]─┬─State or unionterritory─┬──Ref───╮ │ 0 │ Mumbai │ 12,442,373 │ 11,978,450 │ Maharashtra │ [3] │ │ 1 │ Delhi │ 11,034,555 │ 9,879,172 │ Delhi │ [3] │ │ 2 │ Bangalore │ 8,443,675 │ 5,682,293 │ Karnataka │ [3] │ │ 3 │ Hyderabad │ 6,993,262 │ 5,496,960 │ Telangana │ [3] │ │ 4 │ Ahmedabad │ 5,577,940 │ 4,470,006 │ Gujarat │ [3] │ │ 5 │ Chennai │ 4,646,732 │ 4,343,645 │ Tamil Nadu │ [3] │ │ 6 │ Kolkata │ 4,496,694 │ 4,580,546 │ West Bengal │ [3] │ │ 7 │ Surat │ 4,467,797 │ 2,788,126 │ Gujarat │ [3] │ │ 8 │ Pune │ 3,124,458 │ 2,538,473 │ Maharashtra │ [3] │ │ 9 │ Jaipur │ 3,046,163 │ 2,322,575 │ Rajasthan │ [3] │ │ 10 │ Lucknow │ 2,817,105 │ 2,185,927 │ Uttar Pradesh │ [3] │ │ 11 │ Kanpur │ 2,765,348 │ 2,551,337 │ Uttar Pradesh │ [3] │ │ 12 │ Nagpur │ 2,405,665 │ 2,052,066 │ Maharashtra │ [3] │ ``` # User-Facing Changes <!-- List of all changes that impact the user experience here. This helps us keep track of breaking changes. --> # Tests + Formatting <!-- Don't forget to add tests that cover your changes. Make sure you've run and fixed any issues with these commands: - `cargo fmt --all -- --check` to check standard code formatting (`cargo fmt --all` applies these changes) - `cargo clippy --workspace -- -D warnings -D clippy::unwrap_used` to check that you're using the standard code style - `cargo test --workspace` to check that all tests pass (on Windows make sure to [enable developer mode](https://learn.microsoft.com/en-us/windows/apps/get-started/developer-mode-features-and-debugging)) - `cargo run -- -c "use std testing; testing run-tests --path crates/nu-std"` to run the tests for the standard library > **Note** > from `nushell` you can also use the `toolkit` as follows > ```bash > use toolkit.nu # or use an `env_change` hook to activate it automatically > toolkit check pr > ``` --> # After Submitting <!-- If your PR had any user-facing changes, update [the documentation](https://github.com/nushell/nushell.github.io) after the PR is merged, if necessary. This will help us keep the docs up to date. -->
2024-02-02 15:03:28 +00:00
eprintln!("Passed in Column Headers = {:?}\n", &cols);
eprintln!("First 2048 HTML chars = {}\n", &html[0..2047]);
}
update query web wiki example (#11709) # Description This PR tries to make `query web` more resilient and easier to debug with the `--inspect` parameter when trying to scrape tables. Previously it would just fail, now at least it tries to give you a hint. This is some example output now of when something went wrong. ``` ❯ http get https://en.wikipedia.org/wiki/List_of_cities_in_India_by_population | query web --as-table [Rank City 'Population(2011)[3]' 'Population(2001)[3][a]' 'State or union territory'] --inspect Passed in Column Headers = ["Rank", "City", "Population(2011)[3]", "Population(2001)[3][a]", "State or union territory"] First 2048 HTML chars = <!DOCTYPE html> <html class="client-nojs vector-feature-language-in-header-enabled vector-feature-language-in-main-page-header-disabled vector-feature-sticky-header-disabled vector-feature-page-tools-pinned-disabled vector-feature-toc-pinned-clientpref-1 vector-feature-main-menu-pinned-disabled vector-feature-limited-width-clientpref-1 vector-feature-limited-width-content-enabled vector-feature-custom-font-size-clientpref-0 vector-feature-client-preferences-disabled vector-feature-client-prefs-pinned-disabled vector-toc-available" lang="en" dir="ltr"> <head> <meta charset="UTF-8"> <title>List of cities in India by population - Wikipedia</title> <script>(function(){var className="client-js vector-feature-language-in-header-enabled vector-feature-language-in-main-page-header-disabled vector-feature-sticky-header-disabled vector-feature-page-tools-pinned-disabled vector-feature-toc-pinned-clientpref-1 vector-feature-main-menu-pinned-disabled vector-feature-limited-width-clientpref-1 vector-feature-limited-width-content-enabled vector-feature-custom-font-size-clientpref-0 vector-feature-client-preferences-disabled vector-feature-client-prefs-pinned-disabled vector-toc-available";var cookie=document.cookie.match(/(?:^|; )enwikimwclientpreferences=([^;]+)/);if(cookie){cookie[1].split('%2C').forEach(function(pref){className=className.replace(new RegExp('(^| )'+pref.replace(/-clientpref-\w+$|[^\w-]+/g,'')+'-clientpref-\\w+( |$)'),'$1'+pref+'$2');});}document.documentElement.className=className;}());RLCONF={"wgBreakFrames":false,"wgSeparatorTransformTable":["",""],"wgDigitTransformTable":["",""],"wgDefaultDateFormat":"dmy","wgMonthNames":["", "January","February","March","April","May","June","July","August","September","October","November","December"],"wgRequestId":"9ecdad8f-2dbd-4245-b54d-9c57aea5ca45","wgCanonicalNamespace":"","wgCanonicalSpecialPageName":false,"wgNamespaceNumber":0,"wgPageName":"List_of_cities_in_India_by_population","wgTitle":"List of cities in India by population","wgCurRevisionId":1192093210,"wgRev Potential HTML Headers = ["City", "Population(2011)[3]", "Population(2001)[3][a]", "State or unionterritory", "Ref"] Potential HTML Headers = ["City", "Population(2011)[5]", "Population(2001)", "State or unionterritory"] Potential HTML Headers = [".mw-parser-output .navbar{display:inline;font-size:88%;font-weight:normal}.mw-parser-output .navbar-collapse{float:left;text-align:left}.mw-parser-output .navbar-boxtext{word-spacing:0}.mw-parser-output .navbar ul{display:inline-block;white-space:nowrap;line-height:inherit}.mw-parser-output .navbar-brackets::before{margin-right:-0.125em;content:\"[ \"}.mw-parser-output .navbar-brackets::after{margin-left:-0.125em;content:\" ]\"}.mw-parser-output .navbar li{word-spacing:-0.125em}.mw-parser-output .navbar a>span,.mw-parser-output .navbar a>abbr{text-decoration:inherit}.mw-parser-output .navbar-mini abbr{font-variant:small-caps;border-bottom:none;text-decoration:none;cursor:inherit}.mw-parser-output .navbar-ct-full{font-size:114%;margin:0 7em}.mw-parser-output .navbar-ct-mini{font-size:114%;margin:0 4em}vtePopulation of cities in India"] Potential HTML Headers = ["vteGeography of India"] ╭──────────────────────────┬─────────────────────────────────────────────────────╮ │ Rank │ error: no data found (column name may be incorrect) │ │ City │ error: no data found (column name may be incorrect) │ │ Population(2011)[3] │ error: no data found (column name may be incorrect) │ │ Population(2001)[3][a] │ error: no data found (column name may be incorrect) │ │ State or union territory │ error: no data found (column name may be incorrect) │ ╰──────────────────────────┴─────────────────────────────────────────────────────╯ ``` The key here is to look at the `Passed in Column Headers` and compare them to the `Potential HTML Headers` and couple that with the error table at the bottom should give you a hint that, in this situation, wikipedia has changed the column names, yet again. So we need to update our query web statement's tables to get closer to what we want. ``` ❯ http get https://en.wikipedia.org/wiki/List_of_cities_in_India_by_population | query web --as-table [City 'Population(2011)[3]' 'Population(2001)[3][a]' 'State or unionterritory' 'Ref'] ╭─#──┬───────City───────┬─Population(2011)[3]─┬─Population(2001)[3][a]─┬─State or unionterritory─┬──Ref───╮ │ 0 │ Mumbai │ 12,442,373 │ 11,978,450 │ Maharashtra │ [3] │ │ 1 │ Delhi │ 11,034,555 │ 9,879,172 │ Delhi │ [3] │ │ 2 │ Bangalore │ 8,443,675 │ 5,682,293 │ Karnataka │ [3] │ │ 3 │ Hyderabad │ 6,993,262 │ 5,496,960 │ Telangana │ [3] │ │ 4 │ Ahmedabad │ 5,577,940 │ 4,470,006 │ Gujarat │ [3] │ │ 5 │ Chennai │ 4,646,732 │ 4,343,645 │ Tamil Nadu │ [3] │ │ 6 │ Kolkata │ 4,496,694 │ 4,580,546 │ West Bengal │ [3] │ │ 7 │ Surat │ 4,467,797 │ 2,788,126 │ Gujarat │ [3] │ │ 8 │ Pune │ 3,124,458 │ 2,538,473 │ Maharashtra │ [3] │ │ 9 │ Jaipur │ 3,046,163 │ 2,322,575 │ Rajasthan │ [3] │ │ 10 │ Lucknow │ 2,817,105 │ 2,185,927 │ Uttar Pradesh │ [3] │ │ 11 │ Kanpur │ 2,765,348 │ 2,551,337 │ Uttar Pradesh │ [3] │ │ 12 │ Nagpur │ 2,405,665 │ 2,052,066 │ Maharashtra │ [3] │ ``` # User-Facing Changes <!-- List of all changes that impact the user experience here. This helps us keep track of breaking changes. --> # Tests + Formatting <!-- Don't forget to add tests that cover your changes. Make sure you've run and fixed any issues with these commands: - `cargo fmt --all -- --check` to check standard code formatting (`cargo fmt --all` applies these changes) - `cargo clippy --workspace -- -D warnings -D clippy::unwrap_used` to check that you're using the standard code style - `cargo test --workspace` to check that all tests pass (on Windows make sure to [enable developer mode](https://learn.microsoft.com/en-us/windows/apps/get-started/developer-mode-features-and-debugging)) - `cargo run -- -c "use std testing; testing run-tests --path crates/nu-std"` to run the tests for the standard library > **Note** > from `nushell` you can also use the `toolkit` as follows > ```bash > use toolkit.nu # or use an `env_change` hook to activate it automatically > toolkit check pr > ``` --> # After Submitting <!-- If your PR had any user-facing changes, update [the documentation](https://github.com/nushell/nushell.github.io) after the PR is merged, if necessary. This will help us keep the docs up to date. -->
2024-02-02 15:03:28 +00:00
let tables = match WebTable::find_by_headers(html, &cols, inspect_mode) {
Some(t) => {
if inspect_mode {
eprintln!("Table Found = {:#?}", &t);
}
t
}
None => vec![WebTable::empty()],
};
if tables.len() == 1 {
return retrieve_table(
tables.into_iter().next().expect("Error retrieving table"),
columns,
span,
);
}
let vals = tables
.into_iter()
.map(move |table| retrieve_table(table, columns, span))
.collect();
Move Value to helpers, separate span call (#10121) # Description As part of the refactor to split spans off of Value, this moves to using helper functions to create values, and using `.span()` instead of matching span out of Value directly. Hoping to get a few more helping hands to finish this, as there are a lot of commands to update :) # User-Facing Changes <!-- List of all changes that impact the user experience here. This helps us keep track of breaking changes. --> # Tests + Formatting <!-- Don't forget to add tests that cover your changes. Make sure you've run and fixed any issues with these commands: - `cargo fmt --all -- --check` to check standard code formatting (`cargo fmt --all` applies these changes) - `cargo clippy --workspace -- -D warnings -D clippy::unwrap_used` to check that you're using the standard code style - `cargo test --workspace` to check that all tests pass (on Windows make sure to [enable developer mode](https://learn.microsoft.com/en-us/windows/apps/get-started/developer-mode-features-and-debugging)) - `cargo run -- -c "use std testing; testing run-tests --path crates/nu-std"` to run the tests for the standard library > **Note** > from `nushell` you can also use the `toolkit` as follows > ```bash > use toolkit.nu # or use an `env_change` hook to activate it automatically > toolkit check pr > ``` --> # After Submitting <!-- If your PR had any user-facing changes, update [the documentation](https://github.com/nushell/nushell.github.io) after the PR is merged, if necessary. This will help us keep the docs up to date. --> --------- Co-authored-by: Darren Schroeder <343840+fdncred@users.noreply.github.com> Co-authored-by: WindSoilder <windsoilder@outlook.com>
2023-09-03 14:27:29 +00:00
Value::list(vals, span)
}
fn retrieve_table(mut table: WebTable, columns: &Value, span: Span) -> Value {
let mut cols: Vec<String> = Vec::new();
if let Value::List { vals, .. } = &columns {
for x in vals {
// TODO Find a way to get the Config object here
if let Value::String { val, .. } = x {
cols.push(val.to_string())
}
}
}
if cols.is_empty() && !table.headers().is_empty() {
for col in table.headers().keys() {
cols.push(col.to_string());
}
}
update query web wiki example (#11709) # Description This PR tries to make `query web` more resilient and easier to debug with the `--inspect` parameter when trying to scrape tables. Previously it would just fail, now at least it tries to give you a hint. This is some example output now of when something went wrong. ``` ❯ http get https://en.wikipedia.org/wiki/List_of_cities_in_India_by_population | query web --as-table [Rank City 'Population(2011)[3]' 'Population(2001)[3][a]' 'State or union territory'] --inspect Passed in Column Headers = ["Rank", "City", "Population(2011)[3]", "Population(2001)[3][a]", "State or union territory"] First 2048 HTML chars = <!DOCTYPE html> <html class="client-nojs vector-feature-language-in-header-enabled vector-feature-language-in-main-page-header-disabled vector-feature-sticky-header-disabled vector-feature-page-tools-pinned-disabled vector-feature-toc-pinned-clientpref-1 vector-feature-main-menu-pinned-disabled vector-feature-limited-width-clientpref-1 vector-feature-limited-width-content-enabled vector-feature-custom-font-size-clientpref-0 vector-feature-client-preferences-disabled vector-feature-client-prefs-pinned-disabled vector-toc-available" lang="en" dir="ltr"> <head> <meta charset="UTF-8"> <title>List of cities in India by population - Wikipedia</title> <script>(function(){var className="client-js vector-feature-language-in-header-enabled vector-feature-language-in-main-page-header-disabled vector-feature-sticky-header-disabled vector-feature-page-tools-pinned-disabled vector-feature-toc-pinned-clientpref-1 vector-feature-main-menu-pinned-disabled vector-feature-limited-width-clientpref-1 vector-feature-limited-width-content-enabled vector-feature-custom-font-size-clientpref-0 vector-feature-client-preferences-disabled vector-feature-client-prefs-pinned-disabled vector-toc-available";var cookie=document.cookie.match(/(?:^|; )enwikimwclientpreferences=([^;]+)/);if(cookie){cookie[1].split('%2C').forEach(function(pref){className=className.replace(new RegExp('(^| )'+pref.replace(/-clientpref-\w+$|[^\w-]+/g,'')+'-clientpref-\\w+( |$)'),'$1'+pref+'$2');});}document.documentElement.className=className;}());RLCONF={"wgBreakFrames":false,"wgSeparatorTransformTable":["",""],"wgDigitTransformTable":["",""],"wgDefaultDateFormat":"dmy","wgMonthNames":["", "January","February","March","April","May","June","July","August","September","October","November","December"],"wgRequestId":"9ecdad8f-2dbd-4245-b54d-9c57aea5ca45","wgCanonicalNamespace":"","wgCanonicalSpecialPageName":false,"wgNamespaceNumber":0,"wgPageName":"List_of_cities_in_India_by_population","wgTitle":"List of cities in India by population","wgCurRevisionId":1192093210,"wgRev Potential HTML Headers = ["City", "Population(2011)[3]", "Population(2001)[3][a]", "State or unionterritory", "Ref"] Potential HTML Headers = ["City", "Population(2011)[5]", "Population(2001)", "State or unionterritory"] Potential HTML Headers = [".mw-parser-output .navbar{display:inline;font-size:88%;font-weight:normal}.mw-parser-output .navbar-collapse{float:left;text-align:left}.mw-parser-output .navbar-boxtext{word-spacing:0}.mw-parser-output .navbar ul{display:inline-block;white-space:nowrap;line-height:inherit}.mw-parser-output .navbar-brackets::before{margin-right:-0.125em;content:\"[ \"}.mw-parser-output .navbar-brackets::after{margin-left:-0.125em;content:\" ]\"}.mw-parser-output .navbar li{word-spacing:-0.125em}.mw-parser-output .navbar a>span,.mw-parser-output .navbar a>abbr{text-decoration:inherit}.mw-parser-output .navbar-mini abbr{font-variant:small-caps;border-bottom:none;text-decoration:none;cursor:inherit}.mw-parser-output .navbar-ct-full{font-size:114%;margin:0 7em}.mw-parser-output .navbar-ct-mini{font-size:114%;margin:0 4em}vtePopulation of cities in India"] Potential HTML Headers = ["vteGeography of India"] ╭──────────────────────────┬─────────────────────────────────────────────────────╮ │ Rank │ error: no data found (column name may be incorrect) │ │ City │ error: no data found (column name may be incorrect) │ │ Population(2011)[3] │ error: no data found (column name may be incorrect) │ │ Population(2001)[3][a] │ error: no data found (column name may be incorrect) │ │ State or union territory │ error: no data found (column name may be incorrect) │ ╰──────────────────────────┴─────────────────────────────────────────────────────╯ ``` The key here is to look at the `Passed in Column Headers` and compare them to the `Potential HTML Headers` and couple that with the error table at the bottom should give you a hint that, in this situation, wikipedia has changed the column names, yet again. So we need to update our query web statement's tables to get closer to what we want. ``` ❯ http get https://en.wikipedia.org/wiki/List_of_cities_in_India_by_population | query web --as-table [City 'Population(2011)[3]' 'Population(2001)[3][a]' 'State or unionterritory' 'Ref'] ╭─#──┬───────City───────┬─Population(2011)[3]─┬─Population(2001)[3][a]─┬─State or unionterritory─┬──Ref───╮ │ 0 │ Mumbai │ 12,442,373 │ 11,978,450 │ Maharashtra │ [3] │ │ 1 │ Delhi │ 11,034,555 │ 9,879,172 │ Delhi │ [3] │ │ 2 │ Bangalore │ 8,443,675 │ 5,682,293 │ Karnataka │ [3] │ │ 3 │ Hyderabad │ 6,993,262 │ 5,496,960 │ Telangana │ [3] │ │ 4 │ Ahmedabad │ 5,577,940 │ 4,470,006 │ Gujarat │ [3] │ │ 5 │ Chennai │ 4,646,732 │ 4,343,645 │ Tamil Nadu │ [3] │ │ 6 │ Kolkata │ 4,496,694 │ 4,580,546 │ West Bengal │ [3] │ │ 7 │ Surat │ 4,467,797 │ 2,788,126 │ Gujarat │ [3] │ │ 8 │ Pune │ 3,124,458 │ 2,538,473 │ Maharashtra │ [3] │ │ 9 │ Jaipur │ 3,046,163 │ 2,322,575 │ Rajasthan │ [3] │ │ 10 │ Lucknow │ 2,817,105 │ 2,185,927 │ Uttar Pradesh │ [3] │ │ 11 │ Kanpur │ 2,765,348 │ 2,551,337 │ Uttar Pradesh │ [3] │ │ 12 │ Nagpur │ 2,405,665 │ 2,052,066 │ Maharashtra │ [3] │ ``` # User-Facing Changes <!-- List of all changes that impact the user experience here. This helps us keep track of breaking changes. --> # Tests + Formatting <!-- Don't forget to add tests that cover your changes. Make sure you've run and fixed any issues with these commands: - `cargo fmt --all -- --check` to check standard code formatting (`cargo fmt --all` applies these changes) - `cargo clippy --workspace -- -D warnings -D clippy::unwrap_used` to check that you're using the standard code style - `cargo test --workspace` to check that all tests pass (on Windows make sure to [enable developer mode](https://learn.microsoft.com/en-us/windows/apps/get-started/developer-mode-features-and-debugging)) - `cargo run -- -c "use std testing; testing run-tests --path crates/nu-std"` to run the tests for the standard library > **Note** > from `nushell` you can also use the `toolkit` as follows > ```bash > use toolkit.nu # or use an `env_change` hook to activate it automatically > toolkit check pr > ``` --> # After Submitting <!-- If your PR had any user-facing changes, update [the documentation](https://github.com/nushell/nushell.github.io) after the PR is merged, if necessary. This will help us keep the docs up to date. -->
2024-02-02 15:03:28 +00:00
// We provided columns but the table has no headers, so we'll just make a single column table
if !cols.is_empty() && table.headers().is_empty() {
let mut record = Record::new();
for col in &cols {
record.push(
col.clone(),
Value::string("error: no data found (column name may be incorrect)", span),
);
}
return Value::record(record, span);
}
let mut table_out = Vec::new();
// sometimes there are tables where the first column is the headers, kind of like
// a table has ben rotated ccw 90 degrees, in these cases all columns will be missing
// we keep track of this with this variable so we can deal with it later
let mut at_least_one_row_filled = false;
// if columns are still empty, let's just make a single column table with the data
if cols.is_empty() {
at_least_one_row_filled = true;
let table_with_no_empties: Vec<_> = table.iter().filter(|item| !item.is_empty()).collect();
let mut record = Record::new();
for row in &table_with_no_empties {
for (counter, cell) in row.iter().enumerate() {
record.push(format!("column{counter}"), Value::string(cell, span));
}
}
table_out.push(Value::record(record, span))
} else {
for row in &table {
let record = cols
.iter()
.map(|col| {
let val = row
.get(col)
.unwrap_or(&format!("Missing column: '{}'", &col))
.to_string();
if !at_least_one_row_filled && val != format!("Missing column: '{}'", &col) {
at_least_one_row_filled = true;
}
(col.clone(), Value::string(val, span))
})
.collect();
table_out.push(Value::record(record, span))
}
}
if !at_least_one_row_filled {
let mut data2 = Vec::new();
for x in &table.data {
data2.push(x.join(", "));
}
table.data = vec![data2];
return retrieve_table(table, columns, span);
}
// table_out
Move Value to helpers, separate span call (#10121) # Description As part of the refactor to split spans off of Value, this moves to using helper functions to create values, and using `.span()` instead of matching span out of Value directly. Hoping to get a few more helping hands to finish this, as there are a lot of commands to update :) # User-Facing Changes <!-- List of all changes that impact the user experience here. This helps us keep track of breaking changes. --> # Tests + Formatting <!-- Don't forget to add tests that cover your changes. Make sure you've run and fixed any issues with these commands: - `cargo fmt --all -- --check` to check standard code formatting (`cargo fmt --all` applies these changes) - `cargo clippy --workspace -- -D warnings -D clippy::unwrap_used` to check that you're using the standard code style - `cargo test --workspace` to check that all tests pass (on Windows make sure to [enable developer mode](https://learn.microsoft.com/en-us/windows/apps/get-started/developer-mode-features-and-debugging)) - `cargo run -- -c "use std testing; testing run-tests --path crates/nu-std"` to run the tests for the standard library > **Note** > from `nushell` you can also use the `toolkit` as follows > ```bash > use toolkit.nu # or use an `env_change` hook to activate it automatically > toolkit check pr > ``` --> # After Submitting <!-- If your PR had any user-facing changes, update [the documentation](https://github.com/nushell/nushell.github.io) after the PR is merged, if necessary. This will help us keep the docs up to date. --> --------- Co-authored-by: Darren Schroeder <343840+fdncred@users.noreply.github.com> Co-authored-by: WindSoilder <windsoilder@outlook.com>
2023-09-03 14:27:29 +00:00
Value::list(table_out, span)
}
fn execute_selector_query_with_attribute(
input_string: &str,
query_string: &str,
attribute: &str,
inspect: bool,
span: Span,
) -> Value {
let doc = Html::parse_fragment(input_string);
let vals: Vec<Value> = doc
.select(&css(query_string, inspect))
.map(|selection| {
Value::string(
selection.value().attr(attribute).unwrap_or("").to_string(),
span,
)
})
.collect();
Move Value to helpers, separate span call (#10121) # Description As part of the refactor to split spans off of Value, this moves to using helper functions to create values, and using `.span()` instead of matching span out of Value directly. Hoping to get a few more helping hands to finish this, as there are a lot of commands to update :) # User-Facing Changes <!-- List of all changes that impact the user experience here. This helps us keep track of breaking changes. --> # Tests + Formatting <!-- Don't forget to add tests that cover your changes. Make sure you've run and fixed any issues with these commands: - `cargo fmt --all -- --check` to check standard code formatting (`cargo fmt --all` applies these changes) - `cargo clippy --workspace -- -D warnings -D clippy::unwrap_used` to check that you're using the standard code style - `cargo test --workspace` to check that all tests pass (on Windows make sure to [enable developer mode](https://learn.microsoft.com/en-us/windows/apps/get-started/developer-mode-features-and-debugging)) - `cargo run -- -c "use std testing; testing run-tests --path crates/nu-std"` to run the tests for the standard library > **Note** > from `nushell` you can also use the `toolkit` as follows > ```bash > use toolkit.nu # or use an `env_change` hook to activate it automatically > toolkit check pr > ``` --> # After Submitting <!-- If your PR had any user-facing changes, update [the documentation](https://github.com/nushell/nushell.github.io) after the PR is merged, if necessary. This will help us keep the docs up to date. --> --------- Co-authored-by: Darren Schroeder <343840+fdncred@users.noreply.github.com> Co-authored-by: WindSoilder <windsoilder@outlook.com>
2023-09-03 14:27:29 +00:00
Value::list(vals, span)
}
fn execute_selector_query(
input_string: &str,
query_string: &str,
as_html: bool,
inspect: bool,
span: Span,
) -> Value {
let doc = Html::parse_fragment(input_string);
let vals: Vec<Value> = match as_html {
true => doc
.select(&css(query_string, inspect))
.map(|selection| Value::string(selection.html(), span))
.collect(),
false => doc
.select(&css(query_string, inspect))
.map(|selection| {
`query web --query` should return `list<list<string>>` like the scraper crate's `ElementRef::text()` (#11705) <!-- if this PR closes one or more issues, you can automatically link the PR with them by using one of the [*linking keywords*](https://docs.github.com/en/issues/tracking-your-work-with-issues/linking-a-pull-request-to-an-issue#linking-a-pull-request-to-an-issue-using-a-keyword), e.g. - this PR should close #xxxx - fixes #xxxx you can also mention related issues, PRs or discussions! --> # Description <!-- Thank you for improving Nushell. Please, check our [contributing guide](../CONTRIBUTING.md) and talk to the core team before making major changes. Description of your pull request goes here. **Provide examples and/or screenshots** if your changes affect the user experience. --> ## Problem I tried converting one of my Rust web scrapers to Nushell just to see how it would be done, but quickly ran into an issue that proved annoying to fix without diving into the source. For instance, let's say we have the following HTML ```html <p>Hello there, <span style="color: red;">World</span></p> ``` and we want to extract only the text within the `p` element, but not the `span`. With the current version of nu_plugin_query, if we run this code ```nushell echo `<p>Hello there, <span style="color: red;">World</span></p>` | query web -q "p" | get 0 # returns "Hello there, World" # but we want only "Hello there, " ``` we will get back a `list<string>` that contains 1 string `Hello there, World`. To avoid scraping the span, we would have to do something like this ```nushell const html = `<p>Hello there, <span style="color: red;">World</span></p>` $html | query web -q "p" | get 0 | str replace ($html | query web -q "p > span" | get 0) "" # returns "Hello there, " ``` In other words, we would have to make a sub scrape of the text we *don't* want in order to subtract it from the text we *do* want. ## Solution I didn't like this behavior, so I decided to change it. I modified the `execute_selector_query` function to collect all text nodes in the HTML element matching the query. Now `query web --query` will return a `list<list<string>>` ```nushell echo `<p>Hello there, <span style="color: red;">World</span></p>` | query web -q "p" | get 0 | to json --raw # returns ["Hello there, ","World"] ``` This also brings `query web --query`'s behavior more in line with [scraper's ElementRef::text()](https://docs.rs/scraper/latest/scraper/element_ref/struct.ElementRef.html#method.text) which "Returns an iterator over descendent text nodes", allowing you to choose how much of an element's text you want to scrape without resorting to string substitutions. ## Consequences As this is a user-facing change, the usage examples will produce different results than before. For example ```nushell http get https://phoronix.com | query web --query 'header' ``` will return a list of lists of 1 string each, whereas before it was just a list of strings. I only modified the 3rd example ```nushell # old http get https://www.nushell.sh | query web --query 'h2, h2 + p' | group 2 | each {rotate --ccw tagline description} | flatten # new http get https://www.nushell.sh | query web --query 'h2, h2 + p' | each {str join} | group 2 | each {rotate --ccw tagline description} | flatten ``` to make it behave like before because I thought this one ought to show the same results as before. However, the second reason I changed the 3rd example is because it otherwise panics! If we run the original 3rd example with my modifications, we get a panic ``` thread 'main' panicked at crates/nu-protocol/src/value/record.rs:34:9: assertion `left == right` failed left: 2 right: 17 ``` This happens because `rotate` receives a list of lists where the inner lists have a different number of elements. However this panic is unrelated to the changes I've made, because it can be triggered easily without using the plugin. For instance ```nushell # this is fine [[[one] [two]] [[three] [four]]] | each {rotate --ccw tagline description} # this panics! [[[one] [two]] [[three] [four five]]] | each {rotate --ccw tagline description} ``` Though beyond the scope of this PR, I thought I'd mention this bug since I found it while testing the usage examples. However, I intend to make a proper issue about it tomorrow. # User-Facing Changes <!-- List of all changes that impact the user experience here. This helps us keep track of breaking changes. --> `query web --query "css selector"` now returns a `list<list<string>>` instead of a `list<string>` to make it more in line with [scraper's ElementRef::text()](https://docs.rs/scraper/latest/scraper/element_ref/struct.ElementRef.html#method.text). # Tests + Formatting <!-- Don't forget to add tests that cover your changes. Make sure you've run and fixed any issues with these commands: - `cargo fmt --all -- --check` to check standard code formatting (`cargo fmt --all` applies these changes) - `cargo clippy --workspace -- -D warnings -D clippy::unwrap_used` to check that you're using the standard code style - `cargo test --workspace` to check that all tests pass (on Windows make sure to [enable developer mode](https://learn.microsoft.com/en-us/windows/apps/get-started/developer-mode-features-and-debugging)) - `cargo run -- -c "use std testing; testing run-tests --path crates/nu-std"` to run the tests for the standard library > **Note** > from `nushell` you can also use the `toolkit` as follows > ```bash > use toolkit.nu # or use an `env_change` hook to activate it automatically > toolkit check pr > ``` --> I ran `cargo fmt --all -- --check`, `cargo clippy --workspace -- -D warnings -D clippy::unwrap_used` and the tests in the plugin. # After Submitting <!-- If your PR had any user-facing changes, update [the documentation](https://github.com/nushell/nushell.github.io) after the PR is merged, if necessary. This will help us keep the docs up to date. --> PR that updates the documentation to match the new 3rd example: https://github.com/nushell/nushell.github.io/pull/1235
2024-02-03 01:40:47 +00:00
Value::list(
selection
.text()
`query web --query` should return `list<list<string>>` like the scraper crate's `ElementRef::text()` (#11705) <!-- if this PR closes one or more issues, you can automatically link the PR with them by using one of the [*linking keywords*](https://docs.github.com/en/issues/tracking-your-work-with-issues/linking-a-pull-request-to-an-issue#linking-a-pull-request-to-an-issue-using-a-keyword), e.g. - this PR should close #xxxx - fixes #xxxx you can also mention related issues, PRs or discussions! --> # Description <!-- Thank you for improving Nushell. Please, check our [contributing guide](../CONTRIBUTING.md) and talk to the core team before making major changes. Description of your pull request goes here. **Provide examples and/or screenshots** if your changes affect the user experience. --> ## Problem I tried converting one of my Rust web scrapers to Nushell just to see how it would be done, but quickly ran into an issue that proved annoying to fix without diving into the source. For instance, let's say we have the following HTML ```html <p>Hello there, <span style="color: red;">World</span></p> ``` and we want to extract only the text within the `p` element, but not the `span`. With the current version of nu_plugin_query, if we run this code ```nushell echo `<p>Hello there, <span style="color: red;">World</span></p>` | query web -q "p" | get 0 # returns "Hello there, World" # but we want only "Hello there, " ``` we will get back a `list<string>` that contains 1 string `Hello there, World`. To avoid scraping the span, we would have to do something like this ```nushell const html = `<p>Hello there, <span style="color: red;">World</span></p>` $html | query web -q "p" | get 0 | str replace ($html | query web -q "p > span" | get 0) "" # returns "Hello there, " ``` In other words, we would have to make a sub scrape of the text we *don't* want in order to subtract it from the text we *do* want. ## Solution I didn't like this behavior, so I decided to change it. I modified the `execute_selector_query` function to collect all text nodes in the HTML element matching the query. Now `query web --query` will return a `list<list<string>>` ```nushell echo `<p>Hello there, <span style="color: red;">World</span></p>` | query web -q "p" | get 0 | to json --raw # returns ["Hello there, ","World"] ``` This also brings `query web --query`'s behavior more in line with [scraper's ElementRef::text()](https://docs.rs/scraper/latest/scraper/element_ref/struct.ElementRef.html#method.text) which "Returns an iterator over descendent text nodes", allowing you to choose how much of an element's text you want to scrape without resorting to string substitutions. ## Consequences As this is a user-facing change, the usage examples will produce different results than before. For example ```nushell http get https://phoronix.com | query web --query 'header' ``` will return a list of lists of 1 string each, whereas before it was just a list of strings. I only modified the 3rd example ```nushell # old http get https://www.nushell.sh | query web --query 'h2, h2 + p' | group 2 | each {rotate --ccw tagline description} | flatten # new http get https://www.nushell.sh | query web --query 'h2, h2 + p' | each {str join} | group 2 | each {rotate --ccw tagline description} | flatten ``` to make it behave like before because I thought this one ought to show the same results as before. However, the second reason I changed the 3rd example is because it otherwise panics! If we run the original 3rd example with my modifications, we get a panic ``` thread 'main' panicked at crates/nu-protocol/src/value/record.rs:34:9: assertion `left == right` failed left: 2 right: 17 ``` This happens because `rotate` receives a list of lists where the inner lists have a different number of elements. However this panic is unrelated to the changes I've made, because it can be triggered easily without using the plugin. For instance ```nushell # this is fine [[[one] [two]] [[three] [four]]] | each {rotate --ccw tagline description} # this panics! [[[one] [two]] [[three] [four five]]] | each {rotate --ccw tagline description} ``` Though beyond the scope of this PR, I thought I'd mention this bug since I found it while testing the usage examples. However, I intend to make a proper issue about it tomorrow. # User-Facing Changes <!-- List of all changes that impact the user experience here. This helps us keep track of breaking changes. --> `query web --query "css selector"` now returns a `list<list<string>>` instead of a `list<string>` to make it more in line with [scraper's ElementRef::text()](https://docs.rs/scraper/latest/scraper/element_ref/struct.ElementRef.html#method.text). # Tests + Formatting <!-- Don't forget to add tests that cover your changes. Make sure you've run and fixed any issues with these commands: - `cargo fmt --all -- --check` to check standard code formatting (`cargo fmt --all` applies these changes) - `cargo clippy --workspace -- -D warnings -D clippy::unwrap_used` to check that you're using the standard code style - `cargo test --workspace` to check that all tests pass (on Windows make sure to [enable developer mode](https://learn.microsoft.com/en-us/windows/apps/get-started/developer-mode-features-and-debugging)) - `cargo run -- -c "use std testing; testing run-tests --path crates/nu-std"` to run the tests for the standard library > **Note** > from `nushell` you can also use the `toolkit` as follows > ```bash > use toolkit.nu # or use an `env_change` hook to activate it automatically > toolkit check pr > ``` --> I ran `cargo fmt --all -- --check`, `cargo clippy --workspace -- -D warnings -D clippy::unwrap_used` and the tests in the plugin. # After Submitting <!-- If your PR had any user-facing changes, update [the documentation](https://github.com/nushell/nushell.github.io) after the PR is merged, if necessary. This will help us keep the docs up to date. --> PR that updates the documentation to match the new 3rd example: https://github.com/nushell/nushell.github.io/pull/1235
2024-02-03 01:40:47 +00:00
.map(|text| Value::string(text, span))
.collect(),
span,
)
})
.collect(),
};
Move Value to helpers, separate span call (#10121) # Description As part of the refactor to split spans off of Value, this moves to using helper functions to create values, and using `.span()` instead of matching span out of Value directly. Hoping to get a few more helping hands to finish this, as there are a lot of commands to update :) # User-Facing Changes <!-- List of all changes that impact the user experience here. This helps us keep track of breaking changes. --> # Tests + Formatting <!-- Don't forget to add tests that cover your changes. Make sure you've run and fixed any issues with these commands: - `cargo fmt --all -- --check` to check standard code formatting (`cargo fmt --all` applies these changes) - `cargo clippy --workspace -- -D warnings -D clippy::unwrap_used` to check that you're using the standard code style - `cargo test --workspace` to check that all tests pass (on Windows make sure to [enable developer mode](https://learn.microsoft.com/en-us/windows/apps/get-started/developer-mode-features-and-debugging)) - `cargo run -- -c "use std testing; testing run-tests --path crates/nu-std"` to run the tests for the standard library > **Note** > from `nushell` you can also use the `toolkit` as follows > ```bash > use toolkit.nu # or use an `env_change` hook to activate it automatically > toolkit check pr > ``` --> # After Submitting <!-- If your PR had any user-facing changes, update [the documentation](https://github.com/nushell/nushell.github.io) after the PR is merged, if necessary. This will help us keep the docs up to date. --> --------- Co-authored-by: Darren Schroeder <343840+fdncred@users.noreply.github.com> Co-authored-by: WindSoilder <windsoilder@outlook.com>
2023-09-03 14:27:29 +00:00
Value::list(vals, span)
}
pub fn css(selector: &str, inspect: bool) -> ScraperSelector {
if inspect {
ScraperSelector::parse("html").expect("Error unwrapping the default scraperselector")
} else {
ScraperSelector::parse(selector).expect("Error unwrapping scraperselector::parse")
}
}
#[cfg(test)]
mod tests {
use super::*;
const SIMPLE_LIST: &str = r#"
<ul>
<li>Coffee</li>
<li>Tea</li>
<li>Milk</li>
</ul>
"#;
`query web --query` should return `list<list<string>>` like the scraper crate's `ElementRef::text()` (#11705) <!-- if this PR closes one or more issues, you can automatically link the PR with them by using one of the [*linking keywords*](https://docs.github.com/en/issues/tracking-your-work-with-issues/linking-a-pull-request-to-an-issue#linking-a-pull-request-to-an-issue-using-a-keyword), e.g. - this PR should close #xxxx - fixes #xxxx you can also mention related issues, PRs or discussions! --> # Description <!-- Thank you for improving Nushell. Please, check our [contributing guide](../CONTRIBUTING.md) and talk to the core team before making major changes. Description of your pull request goes here. **Provide examples and/or screenshots** if your changes affect the user experience. --> ## Problem I tried converting one of my Rust web scrapers to Nushell just to see how it would be done, but quickly ran into an issue that proved annoying to fix without diving into the source. For instance, let's say we have the following HTML ```html <p>Hello there, <span style="color: red;">World</span></p> ``` and we want to extract only the text within the `p` element, but not the `span`. With the current version of nu_plugin_query, if we run this code ```nushell echo `<p>Hello there, <span style="color: red;">World</span></p>` | query web -q "p" | get 0 # returns "Hello there, World" # but we want only "Hello there, " ``` we will get back a `list<string>` that contains 1 string `Hello there, World`. To avoid scraping the span, we would have to do something like this ```nushell const html = `<p>Hello there, <span style="color: red;">World</span></p>` $html | query web -q "p" | get 0 | str replace ($html | query web -q "p > span" | get 0) "" # returns "Hello there, " ``` In other words, we would have to make a sub scrape of the text we *don't* want in order to subtract it from the text we *do* want. ## Solution I didn't like this behavior, so I decided to change it. I modified the `execute_selector_query` function to collect all text nodes in the HTML element matching the query. Now `query web --query` will return a `list<list<string>>` ```nushell echo `<p>Hello there, <span style="color: red;">World</span></p>` | query web -q "p" | get 0 | to json --raw # returns ["Hello there, ","World"] ``` This also brings `query web --query`'s behavior more in line with [scraper's ElementRef::text()](https://docs.rs/scraper/latest/scraper/element_ref/struct.ElementRef.html#method.text) which "Returns an iterator over descendent text nodes", allowing you to choose how much of an element's text you want to scrape without resorting to string substitutions. ## Consequences As this is a user-facing change, the usage examples will produce different results than before. For example ```nushell http get https://phoronix.com | query web --query 'header' ``` will return a list of lists of 1 string each, whereas before it was just a list of strings. I only modified the 3rd example ```nushell # old http get https://www.nushell.sh | query web --query 'h2, h2 + p' | group 2 | each {rotate --ccw tagline description} | flatten # new http get https://www.nushell.sh | query web --query 'h2, h2 + p' | each {str join} | group 2 | each {rotate --ccw tagline description} | flatten ``` to make it behave like before because I thought this one ought to show the same results as before. However, the second reason I changed the 3rd example is because it otherwise panics! If we run the original 3rd example with my modifications, we get a panic ``` thread 'main' panicked at crates/nu-protocol/src/value/record.rs:34:9: assertion `left == right` failed left: 2 right: 17 ``` This happens because `rotate` receives a list of lists where the inner lists have a different number of elements. However this panic is unrelated to the changes I've made, because it can be triggered easily without using the plugin. For instance ```nushell # this is fine [[[one] [two]] [[three] [four]]] | each {rotate --ccw tagline description} # this panics! [[[one] [two]] [[three] [four five]]] | each {rotate --ccw tagline description} ``` Though beyond the scope of this PR, I thought I'd mention this bug since I found it while testing the usage examples. However, I intend to make a proper issue about it tomorrow. # User-Facing Changes <!-- List of all changes that impact the user experience here. This helps us keep track of breaking changes. --> `query web --query "css selector"` now returns a `list<list<string>>` instead of a `list<string>` to make it more in line with [scraper's ElementRef::text()](https://docs.rs/scraper/latest/scraper/element_ref/struct.ElementRef.html#method.text). # Tests + Formatting <!-- Don't forget to add tests that cover your changes. Make sure you've run and fixed any issues with these commands: - `cargo fmt --all -- --check` to check standard code formatting (`cargo fmt --all` applies these changes) - `cargo clippy --workspace -- -D warnings -D clippy::unwrap_used` to check that you're using the standard code style - `cargo test --workspace` to check that all tests pass (on Windows make sure to [enable developer mode](https://learn.microsoft.com/en-us/windows/apps/get-started/developer-mode-features-and-debugging)) - `cargo run -- -c "use std testing; testing run-tests --path crates/nu-std"` to run the tests for the standard library > **Note** > from `nushell` you can also use the `toolkit` as follows > ```bash > use toolkit.nu # or use an `env_change` hook to activate it automatically > toolkit check pr > ``` --> I ran `cargo fmt --all -- --check`, `cargo clippy --workspace -- -D warnings -D clippy::unwrap_used` and the tests in the plugin. # After Submitting <!-- If your PR had any user-facing changes, update [the documentation](https://github.com/nushell/nushell.github.io) after the PR is merged, if necessary. This will help us keep the docs up to date. --> PR that updates the documentation to match the new 3rd example: https://github.com/nushell/nushell.github.io/pull/1235
2024-02-03 01:40:47 +00:00
const NESTED_TEXT: &str = r#"<p>Hello there, <span style="color: red;">World</span></p>"#;
#[test]
fn test_first_child_is_not_empty() {
assert!(!execute_selector_query(
SIMPLE_LIST,
"li:first-child",
false,
false,
Span::test_data()
)
.is_empty())
}
#[test]
fn test_first_child() {
let item = execute_selector_query(
SIMPLE_LIST,
"li:first-child",
false,
false,
Span::test_data(),
);
let config = nu_protocol::Config::default();
Name the `Value` conversion functions more clearly (#11851) # Description This PR renames the conversion functions on `Value` to be more consistent. It follows the Rust [API guidelines](https://rust-lang.github.io/api-guidelines/naming.html#ad-hoc-conversions-follow-as_-to_-into_-conventions-c-conv) for ad-hoc conversions. The conversion functions on `Value` now come in a few forms: - `coerce_{type}` takes a `&Value` and attempts to convert the value to `type` (e.g., `i64` are converted to `f64`). This is the old behavior of some of the `as_{type}` functions -- these functions have simply been renamed to better reflect what they do. - The new `as_{type}` functions take a `&Value` and returns an `Ok` result only if the value is of `type` (no conversion is attempted). The returned value will be borrowed if `type` is non-`Copy`, otherwise an owned value is returned. - `into_{type}` exists for non-`Copy` types, but otherwise does not attempt conversion just like `as_type`. It takes an owned `Value` and always returns an owned result. - `coerce_into_{type}` has the same relationship with `coerce_{type}` as `into_{type}` does with `as_{type}`. - `to_{kind}_string`: conversion to different string formats (debug, abbreviated, etc.). Only two of the old string conversion functions were removed, the rest have been renamed only. - `to_{type}`: other conversion functions. Currently, only `to_path` exists. (And `to_string` through `Display`.) This table summaries the above: | Form | Cost | Input Ownership | Output Ownership | Converts `Value` case/`type` | | ---------------------------- | ----- | --------------- | ---------------- | -------- | | `as_{type}` | Cheap | Borrowed | Borrowed/Owned | No | | `into_{type}` | Cheap | Owned | Owned | No | | `coerce_{type}` | Cheap | Borrowed | Borrowed/Owned | Yes | | `coerce_into_{type}` | Cheap | Owned | Owned | Yes | | `to_{kind}_string` | Expensive | Borrowed | Owned | Yes | | `to_{type}` | Expensive | Borrowed | Owned | Yes | # User-Facing Changes Breaking API change for `Value` in `nu-protocol` which is exposed as part of the plugin API.
2024-02-17 18:14:16 +00:00
let out = item.to_expanded_string("\n", &config);
`query web --query` should return `list<list<string>>` like the scraper crate's `ElementRef::text()` (#11705) <!-- if this PR closes one or more issues, you can automatically link the PR with them by using one of the [*linking keywords*](https://docs.github.com/en/issues/tracking-your-work-with-issues/linking-a-pull-request-to-an-issue#linking-a-pull-request-to-an-issue-using-a-keyword), e.g. - this PR should close #xxxx - fixes #xxxx you can also mention related issues, PRs or discussions! --> # Description <!-- Thank you for improving Nushell. Please, check our [contributing guide](../CONTRIBUTING.md) and talk to the core team before making major changes. Description of your pull request goes here. **Provide examples and/or screenshots** if your changes affect the user experience. --> ## Problem I tried converting one of my Rust web scrapers to Nushell just to see how it would be done, but quickly ran into an issue that proved annoying to fix without diving into the source. For instance, let's say we have the following HTML ```html <p>Hello there, <span style="color: red;">World</span></p> ``` and we want to extract only the text within the `p` element, but not the `span`. With the current version of nu_plugin_query, if we run this code ```nushell echo `<p>Hello there, <span style="color: red;">World</span></p>` | query web -q "p" | get 0 # returns "Hello there, World" # but we want only "Hello there, " ``` we will get back a `list<string>` that contains 1 string `Hello there, World`. To avoid scraping the span, we would have to do something like this ```nushell const html = `<p>Hello there, <span style="color: red;">World</span></p>` $html | query web -q "p" | get 0 | str replace ($html | query web -q "p > span" | get 0) "" # returns "Hello there, " ``` In other words, we would have to make a sub scrape of the text we *don't* want in order to subtract it from the text we *do* want. ## Solution I didn't like this behavior, so I decided to change it. I modified the `execute_selector_query` function to collect all text nodes in the HTML element matching the query. Now `query web --query` will return a `list<list<string>>` ```nushell echo `<p>Hello there, <span style="color: red;">World</span></p>` | query web -q "p" | get 0 | to json --raw # returns ["Hello there, ","World"] ``` This also brings `query web --query`'s behavior more in line with [scraper's ElementRef::text()](https://docs.rs/scraper/latest/scraper/element_ref/struct.ElementRef.html#method.text) which "Returns an iterator over descendent text nodes", allowing you to choose how much of an element's text you want to scrape without resorting to string substitutions. ## Consequences As this is a user-facing change, the usage examples will produce different results than before. For example ```nushell http get https://phoronix.com | query web --query 'header' ``` will return a list of lists of 1 string each, whereas before it was just a list of strings. I only modified the 3rd example ```nushell # old http get https://www.nushell.sh | query web --query 'h2, h2 + p' | group 2 | each {rotate --ccw tagline description} | flatten # new http get https://www.nushell.sh | query web --query 'h2, h2 + p' | each {str join} | group 2 | each {rotate --ccw tagline description} | flatten ``` to make it behave like before because I thought this one ought to show the same results as before. However, the second reason I changed the 3rd example is because it otherwise panics! If we run the original 3rd example with my modifications, we get a panic ``` thread 'main' panicked at crates/nu-protocol/src/value/record.rs:34:9: assertion `left == right` failed left: 2 right: 17 ``` This happens because `rotate` receives a list of lists where the inner lists have a different number of elements. However this panic is unrelated to the changes I've made, because it can be triggered easily without using the plugin. For instance ```nushell # this is fine [[[one] [two]] [[three] [four]]] | each {rotate --ccw tagline description} # this panics! [[[one] [two]] [[three] [four five]]] | each {rotate --ccw tagline description} ``` Though beyond the scope of this PR, I thought I'd mention this bug since I found it while testing the usage examples. However, I intend to make a proper issue about it tomorrow. # User-Facing Changes <!-- List of all changes that impact the user experience here. This helps us keep track of breaking changes. --> `query web --query "css selector"` now returns a `list<list<string>>` instead of a `list<string>` to make it more in line with [scraper's ElementRef::text()](https://docs.rs/scraper/latest/scraper/element_ref/struct.ElementRef.html#method.text). # Tests + Formatting <!-- Don't forget to add tests that cover your changes. Make sure you've run and fixed any issues with these commands: - `cargo fmt --all -- --check` to check standard code formatting (`cargo fmt --all` applies these changes) - `cargo clippy --workspace -- -D warnings -D clippy::unwrap_used` to check that you're using the standard code style - `cargo test --workspace` to check that all tests pass (on Windows make sure to [enable developer mode](https://learn.microsoft.com/en-us/windows/apps/get-started/developer-mode-features-and-debugging)) - `cargo run -- -c "use std testing; testing run-tests --path crates/nu-std"` to run the tests for the standard library > **Note** > from `nushell` you can also use the `toolkit` as follows > ```bash > use toolkit.nu # or use an `env_change` hook to activate it automatically > toolkit check pr > ``` --> I ran `cargo fmt --all -- --check`, `cargo clippy --workspace -- -D warnings -D clippy::unwrap_used` and the tests in the plugin. # After Submitting <!-- If your PR had any user-facing changes, update [the documentation](https://github.com/nushell/nushell.github.io) after the PR is merged, if necessary. This will help us keep the docs up to date. --> PR that updates the documentation to match the new 3rd example: https://github.com/nushell/nushell.github.io/pull/1235
2024-02-03 01:40:47 +00:00
assert_eq!("[[Coffee]]".to_string(), out)
}
#[test]
fn test_nested_text_nodes() {
let item = execute_selector_query(
NESTED_TEXT,
"p:first-child",
false,
false,
Span::test_data(),
);
let out = item
.into_list()
`query web --query` should return `list<list<string>>` like the scraper crate's `ElementRef::text()` (#11705) <!-- if this PR closes one or more issues, you can automatically link the PR with them by using one of the [*linking keywords*](https://docs.github.com/en/issues/tracking-your-work-with-issues/linking-a-pull-request-to-an-issue#linking-a-pull-request-to-an-issue-using-a-keyword), e.g. - this PR should close #xxxx - fixes #xxxx you can also mention related issues, PRs or discussions! --> # Description <!-- Thank you for improving Nushell. Please, check our [contributing guide](../CONTRIBUTING.md) and talk to the core team before making major changes. Description of your pull request goes here. **Provide examples and/or screenshots** if your changes affect the user experience. --> ## Problem I tried converting one of my Rust web scrapers to Nushell just to see how it would be done, but quickly ran into an issue that proved annoying to fix without diving into the source. For instance, let's say we have the following HTML ```html <p>Hello there, <span style="color: red;">World</span></p> ``` and we want to extract only the text within the `p` element, but not the `span`. With the current version of nu_plugin_query, if we run this code ```nushell echo `<p>Hello there, <span style="color: red;">World</span></p>` | query web -q "p" | get 0 # returns "Hello there, World" # but we want only "Hello there, " ``` we will get back a `list<string>` that contains 1 string `Hello there, World`. To avoid scraping the span, we would have to do something like this ```nushell const html = `<p>Hello there, <span style="color: red;">World</span></p>` $html | query web -q "p" | get 0 | str replace ($html | query web -q "p > span" | get 0) "" # returns "Hello there, " ``` In other words, we would have to make a sub scrape of the text we *don't* want in order to subtract it from the text we *do* want. ## Solution I didn't like this behavior, so I decided to change it. I modified the `execute_selector_query` function to collect all text nodes in the HTML element matching the query. Now `query web --query` will return a `list<list<string>>` ```nushell echo `<p>Hello there, <span style="color: red;">World</span></p>` | query web -q "p" | get 0 | to json --raw # returns ["Hello there, ","World"] ``` This also brings `query web --query`'s behavior more in line with [scraper's ElementRef::text()](https://docs.rs/scraper/latest/scraper/element_ref/struct.ElementRef.html#method.text) which "Returns an iterator over descendent text nodes", allowing you to choose how much of an element's text you want to scrape without resorting to string substitutions. ## Consequences As this is a user-facing change, the usage examples will produce different results than before. For example ```nushell http get https://phoronix.com | query web --query 'header' ``` will return a list of lists of 1 string each, whereas before it was just a list of strings. I only modified the 3rd example ```nushell # old http get https://www.nushell.sh | query web --query 'h2, h2 + p' | group 2 | each {rotate --ccw tagline description} | flatten # new http get https://www.nushell.sh | query web --query 'h2, h2 + p' | each {str join} | group 2 | each {rotate --ccw tagline description} | flatten ``` to make it behave like before because I thought this one ought to show the same results as before. However, the second reason I changed the 3rd example is because it otherwise panics! If we run the original 3rd example with my modifications, we get a panic ``` thread 'main' panicked at crates/nu-protocol/src/value/record.rs:34:9: assertion `left == right` failed left: 2 right: 17 ``` This happens because `rotate` receives a list of lists where the inner lists have a different number of elements. However this panic is unrelated to the changes I've made, because it can be triggered easily without using the plugin. For instance ```nushell # this is fine [[[one] [two]] [[three] [four]]] | each {rotate --ccw tagline description} # this panics! [[[one] [two]] [[three] [four five]]] | each {rotate --ccw tagline description} ``` Though beyond the scope of this PR, I thought I'd mention this bug since I found it while testing the usage examples. However, I intend to make a proper issue about it tomorrow. # User-Facing Changes <!-- List of all changes that impact the user experience here. This helps us keep track of breaking changes. --> `query web --query "css selector"` now returns a `list<list<string>>` instead of a `list<string>` to make it more in line with [scraper's ElementRef::text()](https://docs.rs/scraper/latest/scraper/element_ref/struct.ElementRef.html#method.text). # Tests + Formatting <!-- Don't forget to add tests that cover your changes. Make sure you've run and fixed any issues with these commands: - `cargo fmt --all -- --check` to check standard code formatting (`cargo fmt --all` applies these changes) - `cargo clippy --workspace -- -D warnings -D clippy::unwrap_used` to check that you're using the standard code style - `cargo test --workspace` to check that all tests pass (on Windows make sure to [enable developer mode](https://learn.microsoft.com/en-us/windows/apps/get-started/developer-mode-features-and-debugging)) - `cargo run -- -c "use std testing; testing run-tests --path crates/nu-std"` to run the tests for the standard library > **Note** > from `nushell` you can also use the `toolkit` as follows > ```bash > use toolkit.nu # or use an `env_change` hook to activate it automatically > toolkit check pr > ``` --> I ran `cargo fmt --all -- --check`, `cargo clippy --workspace -- -D warnings -D clippy::unwrap_used` and the tests in the plugin. # After Submitting <!-- If your PR had any user-facing changes, update [the documentation](https://github.com/nushell/nushell.github.io) after the PR is merged, if necessary. This will help us keep the docs up to date. --> PR that updates the documentation to match the new 3rd example: https://github.com/nushell/nushell.github.io/pull/1235
2024-02-03 01:40:47 +00:00
.unwrap()
.into_iter()
`query web --query` should return `list<list<string>>` like the scraper crate's `ElementRef::text()` (#11705) <!-- if this PR closes one or more issues, you can automatically link the PR with them by using one of the [*linking keywords*](https://docs.github.com/en/issues/tracking-your-work-with-issues/linking-a-pull-request-to-an-issue#linking-a-pull-request-to-an-issue-using-a-keyword), e.g. - this PR should close #xxxx - fixes #xxxx you can also mention related issues, PRs or discussions! --> # Description <!-- Thank you for improving Nushell. Please, check our [contributing guide](../CONTRIBUTING.md) and talk to the core team before making major changes. Description of your pull request goes here. **Provide examples and/or screenshots** if your changes affect the user experience. --> ## Problem I tried converting one of my Rust web scrapers to Nushell just to see how it would be done, but quickly ran into an issue that proved annoying to fix without diving into the source. For instance, let's say we have the following HTML ```html <p>Hello there, <span style="color: red;">World</span></p> ``` and we want to extract only the text within the `p` element, but not the `span`. With the current version of nu_plugin_query, if we run this code ```nushell echo `<p>Hello there, <span style="color: red;">World</span></p>` | query web -q "p" | get 0 # returns "Hello there, World" # but we want only "Hello there, " ``` we will get back a `list<string>` that contains 1 string `Hello there, World`. To avoid scraping the span, we would have to do something like this ```nushell const html = `<p>Hello there, <span style="color: red;">World</span></p>` $html | query web -q "p" | get 0 | str replace ($html | query web -q "p > span" | get 0) "" # returns "Hello there, " ``` In other words, we would have to make a sub scrape of the text we *don't* want in order to subtract it from the text we *do* want. ## Solution I didn't like this behavior, so I decided to change it. I modified the `execute_selector_query` function to collect all text nodes in the HTML element matching the query. Now `query web --query` will return a `list<list<string>>` ```nushell echo `<p>Hello there, <span style="color: red;">World</span></p>` | query web -q "p" | get 0 | to json --raw # returns ["Hello there, ","World"] ``` This also brings `query web --query`'s behavior more in line with [scraper's ElementRef::text()](https://docs.rs/scraper/latest/scraper/element_ref/struct.ElementRef.html#method.text) which "Returns an iterator over descendent text nodes", allowing you to choose how much of an element's text you want to scrape without resorting to string substitutions. ## Consequences As this is a user-facing change, the usage examples will produce different results than before. For example ```nushell http get https://phoronix.com | query web --query 'header' ``` will return a list of lists of 1 string each, whereas before it was just a list of strings. I only modified the 3rd example ```nushell # old http get https://www.nushell.sh | query web --query 'h2, h2 + p' | group 2 | each {rotate --ccw tagline description} | flatten # new http get https://www.nushell.sh | query web --query 'h2, h2 + p' | each {str join} | group 2 | each {rotate --ccw tagline description} | flatten ``` to make it behave like before because I thought this one ought to show the same results as before. However, the second reason I changed the 3rd example is because it otherwise panics! If we run the original 3rd example with my modifications, we get a panic ``` thread 'main' panicked at crates/nu-protocol/src/value/record.rs:34:9: assertion `left == right` failed left: 2 right: 17 ``` This happens because `rotate` receives a list of lists where the inner lists have a different number of elements. However this panic is unrelated to the changes I've made, because it can be triggered easily without using the plugin. For instance ```nushell # this is fine [[[one] [two]] [[three] [four]]] | each {rotate --ccw tagline description} # this panics! [[[one] [two]] [[three] [four five]]] | each {rotate --ccw tagline description} ``` Though beyond the scope of this PR, I thought I'd mention this bug since I found it while testing the usage examples. However, I intend to make a proper issue about it tomorrow. # User-Facing Changes <!-- List of all changes that impact the user experience here. This helps us keep track of breaking changes. --> `query web --query "css selector"` now returns a `list<list<string>>` instead of a `list<string>` to make it more in line with [scraper's ElementRef::text()](https://docs.rs/scraper/latest/scraper/element_ref/struct.ElementRef.html#method.text). # Tests + Formatting <!-- Don't forget to add tests that cover your changes. Make sure you've run and fixed any issues with these commands: - `cargo fmt --all -- --check` to check standard code formatting (`cargo fmt --all` applies these changes) - `cargo clippy --workspace -- -D warnings -D clippy::unwrap_used` to check that you're using the standard code style - `cargo test --workspace` to check that all tests pass (on Windows make sure to [enable developer mode](https://learn.microsoft.com/en-us/windows/apps/get-started/developer-mode-features-and-debugging)) - `cargo run -- -c "use std testing; testing run-tests --path crates/nu-std"` to run the tests for the standard library > **Note** > from `nushell` you can also use the `toolkit` as follows > ```bash > use toolkit.nu # or use an `env_change` hook to activate it automatically > toolkit check pr > ``` --> I ran `cargo fmt --all -- --check`, `cargo clippy --workspace -- -D warnings -D clippy::unwrap_used` and the tests in the plugin. # After Submitting <!-- If your PR had any user-facing changes, update [the documentation](https://github.com/nushell/nushell.github.io) after the PR is merged, if necessary. This will help us keep the docs up to date. --> PR that updates the documentation to match the new 3rd example: https://github.com/nushell/nushell.github.io/pull/1235
2024-02-03 01:40:47 +00:00
.map(|matches| {
matches
.into_list()
`query web --query` should return `list<list<string>>` like the scraper crate's `ElementRef::text()` (#11705) <!-- if this PR closes one or more issues, you can automatically link the PR with them by using one of the [*linking keywords*](https://docs.github.com/en/issues/tracking-your-work-with-issues/linking-a-pull-request-to-an-issue#linking-a-pull-request-to-an-issue-using-a-keyword), e.g. - this PR should close #xxxx - fixes #xxxx you can also mention related issues, PRs or discussions! --> # Description <!-- Thank you for improving Nushell. Please, check our [contributing guide](../CONTRIBUTING.md) and talk to the core team before making major changes. Description of your pull request goes here. **Provide examples and/or screenshots** if your changes affect the user experience. --> ## Problem I tried converting one of my Rust web scrapers to Nushell just to see how it would be done, but quickly ran into an issue that proved annoying to fix without diving into the source. For instance, let's say we have the following HTML ```html <p>Hello there, <span style="color: red;">World</span></p> ``` and we want to extract only the text within the `p` element, but not the `span`. With the current version of nu_plugin_query, if we run this code ```nushell echo `<p>Hello there, <span style="color: red;">World</span></p>` | query web -q "p" | get 0 # returns "Hello there, World" # but we want only "Hello there, " ``` we will get back a `list<string>` that contains 1 string `Hello there, World`. To avoid scraping the span, we would have to do something like this ```nushell const html = `<p>Hello there, <span style="color: red;">World</span></p>` $html | query web -q "p" | get 0 | str replace ($html | query web -q "p > span" | get 0) "" # returns "Hello there, " ``` In other words, we would have to make a sub scrape of the text we *don't* want in order to subtract it from the text we *do* want. ## Solution I didn't like this behavior, so I decided to change it. I modified the `execute_selector_query` function to collect all text nodes in the HTML element matching the query. Now `query web --query` will return a `list<list<string>>` ```nushell echo `<p>Hello there, <span style="color: red;">World</span></p>` | query web -q "p" | get 0 | to json --raw # returns ["Hello there, ","World"] ``` This also brings `query web --query`'s behavior more in line with [scraper's ElementRef::text()](https://docs.rs/scraper/latest/scraper/element_ref/struct.ElementRef.html#method.text) which "Returns an iterator over descendent text nodes", allowing you to choose how much of an element's text you want to scrape without resorting to string substitutions. ## Consequences As this is a user-facing change, the usage examples will produce different results than before. For example ```nushell http get https://phoronix.com | query web --query 'header' ``` will return a list of lists of 1 string each, whereas before it was just a list of strings. I only modified the 3rd example ```nushell # old http get https://www.nushell.sh | query web --query 'h2, h2 + p' | group 2 | each {rotate --ccw tagline description} | flatten # new http get https://www.nushell.sh | query web --query 'h2, h2 + p' | each {str join} | group 2 | each {rotate --ccw tagline description} | flatten ``` to make it behave like before because I thought this one ought to show the same results as before. However, the second reason I changed the 3rd example is because it otherwise panics! If we run the original 3rd example with my modifications, we get a panic ``` thread 'main' panicked at crates/nu-protocol/src/value/record.rs:34:9: assertion `left == right` failed left: 2 right: 17 ``` This happens because `rotate` receives a list of lists where the inner lists have a different number of elements. However this panic is unrelated to the changes I've made, because it can be triggered easily without using the plugin. For instance ```nushell # this is fine [[[one] [two]] [[three] [four]]] | each {rotate --ccw tagline description} # this panics! [[[one] [two]] [[three] [four five]]] | each {rotate --ccw tagline description} ``` Though beyond the scope of this PR, I thought I'd mention this bug since I found it while testing the usage examples. However, I intend to make a proper issue about it tomorrow. # User-Facing Changes <!-- List of all changes that impact the user experience here. This helps us keep track of breaking changes. --> `query web --query "css selector"` now returns a `list<list<string>>` instead of a `list<string>` to make it more in line with [scraper's ElementRef::text()](https://docs.rs/scraper/latest/scraper/element_ref/struct.ElementRef.html#method.text). # Tests + Formatting <!-- Don't forget to add tests that cover your changes. Make sure you've run and fixed any issues with these commands: - `cargo fmt --all -- --check` to check standard code formatting (`cargo fmt --all` applies these changes) - `cargo clippy --workspace -- -D warnings -D clippy::unwrap_used` to check that you're using the standard code style - `cargo test --workspace` to check that all tests pass (on Windows make sure to [enable developer mode](https://learn.microsoft.com/en-us/windows/apps/get-started/developer-mode-features-and-debugging)) - `cargo run -- -c "use std testing; testing run-tests --path crates/nu-std"` to run the tests for the standard library > **Note** > from `nushell` you can also use the `toolkit` as follows > ```bash > use toolkit.nu # or use an `env_change` hook to activate it automatically > toolkit check pr > ``` --> I ran `cargo fmt --all -- --check`, `cargo clippy --workspace -- -D warnings -D clippy::unwrap_used` and the tests in the plugin. # After Submitting <!-- If your PR had any user-facing changes, update [the documentation](https://github.com/nushell/nushell.github.io) after the PR is merged, if necessary. This will help us keep the docs up to date. --> PR that updates the documentation to match the new 3rd example: https://github.com/nushell/nushell.github.io/pull/1235
2024-02-03 01:40:47 +00:00
.unwrap()
.into_iter()
.map(|text_nodes| text_nodes.coerce_into_string().unwrap())
`query web --query` should return `list<list<string>>` like the scraper crate's `ElementRef::text()` (#11705) <!-- if this PR closes one or more issues, you can automatically link the PR with them by using one of the [*linking keywords*](https://docs.github.com/en/issues/tracking-your-work-with-issues/linking-a-pull-request-to-an-issue#linking-a-pull-request-to-an-issue-using-a-keyword), e.g. - this PR should close #xxxx - fixes #xxxx you can also mention related issues, PRs or discussions! --> # Description <!-- Thank you for improving Nushell. Please, check our [contributing guide](../CONTRIBUTING.md) and talk to the core team before making major changes. Description of your pull request goes here. **Provide examples and/or screenshots** if your changes affect the user experience. --> ## Problem I tried converting one of my Rust web scrapers to Nushell just to see how it would be done, but quickly ran into an issue that proved annoying to fix without diving into the source. For instance, let's say we have the following HTML ```html <p>Hello there, <span style="color: red;">World</span></p> ``` and we want to extract only the text within the `p` element, but not the `span`. With the current version of nu_plugin_query, if we run this code ```nushell echo `<p>Hello there, <span style="color: red;">World</span></p>` | query web -q "p" | get 0 # returns "Hello there, World" # but we want only "Hello there, " ``` we will get back a `list<string>` that contains 1 string `Hello there, World`. To avoid scraping the span, we would have to do something like this ```nushell const html = `<p>Hello there, <span style="color: red;">World</span></p>` $html | query web -q "p" | get 0 | str replace ($html | query web -q "p > span" | get 0) "" # returns "Hello there, " ``` In other words, we would have to make a sub scrape of the text we *don't* want in order to subtract it from the text we *do* want. ## Solution I didn't like this behavior, so I decided to change it. I modified the `execute_selector_query` function to collect all text nodes in the HTML element matching the query. Now `query web --query` will return a `list<list<string>>` ```nushell echo `<p>Hello there, <span style="color: red;">World</span></p>` | query web -q "p" | get 0 | to json --raw # returns ["Hello there, ","World"] ``` This also brings `query web --query`'s behavior more in line with [scraper's ElementRef::text()](https://docs.rs/scraper/latest/scraper/element_ref/struct.ElementRef.html#method.text) which "Returns an iterator over descendent text nodes", allowing you to choose how much of an element's text you want to scrape without resorting to string substitutions. ## Consequences As this is a user-facing change, the usage examples will produce different results than before. For example ```nushell http get https://phoronix.com | query web --query 'header' ``` will return a list of lists of 1 string each, whereas before it was just a list of strings. I only modified the 3rd example ```nushell # old http get https://www.nushell.sh | query web --query 'h2, h2 + p' | group 2 | each {rotate --ccw tagline description} | flatten # new http get https://www.nushell.sh | query web --query 'h2, h2 + p' | each {str join} | group 2 | each {rotate --ccw tagline description} | flatten ``` to make it behave like before because I thought this one ought to show the same results as before. However, the second reason I changed the 3rd example is because it otherwise panics! If we run the original 3rd example with my modifications, we get a panic ``` thread 'main' panicked at crates/nu-protocol/src/value/record.rs:34:9: assertion `left == right` failed left: 2 right: 17 ``` This happens because `rotate` receives a list of lists where the inner lists have a different number of elements. However this panic is unrelated to the changes I've made, because it can be triggered easily without using the plugin. For instance ```nushell # this is fine [[[one] [two]] [[three] [four]]] | each {rotate --ccw tagline description} # this panics! [[[one] [two]] [[three] [four five]]] | each {rotate --ccw tagline description} ``` Though beyond the scope of this PR, I thought I'd mention this bug since I found it while testing the usage examples. However, I intend to make a proper issue about it tomorrow. # User-Facing Changes <!-- List of all changes that impact the user experience here. This helps us keep track of breaking changes. --> `query web --query "css selector"` now returns a `list<list<string>>` instead of a `list<string>` to make it more in line with [scraper's ElementRef::text()](https://docs.rs/scraper/latest/scraper/element_ref/struct.ElementRef.html#method.text). # Tests + Formatting <!-- Don't forget to add tests that cover your changes. Make sure you've run and fixed any issues with these commands: - `cargo fmt --all -- --check` to check standard code formatting (`cargo fmt --all` applies these changes) - `cargo clippy --workspace -- -D warnings -D clippy::unwrap_used` to check that you're using the standard code style - `cargo test --workspace` to check that all tests pass (on Windows make sure to [enable developer mode](https://learn.microsoft.com/en-us/windows/apps/get-started/developer-mode-features-and-debugging)) - `cargo run -- -c "use std testing; testing run-tests --path crates/nu-std"` to run the tests for the standard library > **Note** > from `nushell` you can also use the `toolkit` as follows > ```bash > use toolkit.nu # or use an `env_change` hook to activate it automatically > toolkit check pr > ``` --> I ran `cargo fmt --all -- --check`, `cargo clippy --workspace -- -D warnings -D clippy::unwrap_used` and the tests in the plugin. # After Submitting <!-- If your PR had any user-facing changes, update [the documentation](https://github.com/nushell/nushell.github.io) after the PR is merged, if necessary. This will help us keep the docs up to date. --> PR that updates the documentation to match the new 3rd example: https://github.com/nushell/nushell.github.io/pull/1235
2024-02-03 01:40:47 +00:00
.collect::<Vec<String>>()
})
.collect::<Vec<Vec<String>>>();
assert_eq!(
out,
vec![vec!["Hello there, ".to_string(), "World".to_string()]],
);
}
}