Rollup of 5 pull requests
Successful merges:
- #5825 (Add the new lint `same_item_push`)
- #5869 (New lint against `Self` as an arbitrary self type)
- #5870 (enable #[allow(clippy::unsafe_derive_deserialize)])
- #5871 (Lint .min(x).max(y) with x < y)
- #5874 (Make the docs clearer for new contributors)
Failed merges:
r? @ghost
changelog: rollup
Make the docs clearer for new contributors
It confused me before, so I made it extra obvious that you need to run
a script to set up your toolchain before you can build Clippy.
I also added a note so that new contributors aren't confused when
Clippy doesn't build as a result of a change in rustc's internals.
changelog: make `CONTRIBUTING.md` clearer for new contributors
enable #[allow(clippy::unsafe_derive_deserialize)]
Before this change, this lint could not be allowed, because the code it fires on is automatically generated.
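For illustration, a minimal sketch of the usage this enables (assuming serde's derive feature; `Foo` and its method are made up, not taken from the test suite):
```rust
use serde::Deserialize;

// The lint normally fires because `Foo` both derives `Deserialize` and has
// `unsafe` code in its inherent impl; with this change the `allow` attribute
// placed on the type itself is respected.
#[allow(clippy::unsafe_derive_deserialize)]
#[derive(Deserialize)]
struct Foo {
    data: Vec<u8>,
}

impl Foo {
    unsafe fn first_unchecked(&self) -> u8 {
        *self.data.get_unchecked(0)
    }
}
```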
changelog: Enable using the `allow` attribute on top of an ADT linted by [`unsafe_derive_deserialize`].
Fixes: #5789
Fix ICE in `loops` module
changelog: Fix ICE related to `needless_collect` when a call to `iter()` was not present.
I went for restoring the old suggestion of `next().is_some()` over `get(0).is_some()`, given that `iter()` is not necessarily present (it could be e.g. `into_iter()` or `iter_mut()`) and that the `get(0)` suggestion could change semantics, e.g. when a call to `filter()` is present between `iter()` and the collect part.
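For context, a rough sketch of the kind of chain under discussion (hypothetical, not the exact case from the issue):
```rust
// The iterator source need not be `iter()`, and adapters such as
// `filter()` can sit between the source and `collect()`:
let v = vec![1_u8, 2, 3];
let _ = v.into_iter().filter(|&x| x > 1).collect::<Vec<_>>().is_empty();
```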
Fixes #5872
By moving `{known,used}_attrs` from `SessionGlobals` to `Session`. This
means they are accessed via the `Session`, rather than via TLS. A few
`Attr` methods and `librustc_ast` functions are now methods of
`Session`.
All of this required passing a `Session` to lots of functions that didn't
already have one. Some of these functions also had arguments removed, because
those arguments could be accessed directly via the `Session` argument.
`contains_feature_attr()` was dead, and is removed.
Some functions were moved out of `librustc_ast` because they now need to
access `Session`, which isn't available in that crate.
- `entry_point_type()` --> `librustc_builtin_macros`
- `global_allocator_spans()` --> `librustc_metadata`
- `is_proc_macro_attr()` --> `Session`
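A rough sketch of the resulting pattern, using simplified stand-in types rather than the actual rustc definitions:
```rust
use std::collections::HashSet;
use std::sync::Mutex;

// Previously these sets lived in a thread-local `SessionGlobals`;
// after this change they hang off `Session` and are reached through `&Session`.
struct Session {
    known_attrs: Mutex<HashSet<String>>,
    used_attrs: Mutex<HashSet<String>>,
}

impl Session {
    // What used to be free functions in `librustc_ast` (or methods on `Attr`)
    // become methods on `Session`:
    fn mark_attr_used(&self, name: &str) {
        self.used_attrs.lock().unwrap().insert(name.to_string());
    }

    fn is_attr_known(&self, name: &str) -> bool {
        self.known_attrs.lock().unwrap().contains(name)
    }
}

// Callers that previously reached the sets via TLS now take a `&Session`:
fn check_attribute(sess: &Session, name: &str) -> bool {
    sess.mark_attr_used(name);
    sess.is_attr_known(name)
}
```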
* Add an easy-to-see note at the top of `CONTRIBUTING.md` that points
new contributors to the Basics docs
* Add a note about compiler errors as a result of internals changes
that break Clippy
Check whether locals are too large instead of whether accesses into them are too large
Essentially this stops const prop from attempting to optimize
```rust
let mut x = [0_u8; 5000]; // local considered too large: const prop skips it up front
x[42] = 3;                // individual accesses are no longer what gets size-checked
```
I don't expect this to be a perf improvement on its own without #73656 (which is also where the absence of this PR would show up as a perf regression).
r? @wesleywiser
try_err: Consider Try impl for Poll when generating suggestions
There are two different implementations of the `Try` trait for the `Poll` type:
`Poll<Result<T, E>>` and `Poll<Option<Result<T, E>>>`. Take them into
account when generating suggestions.
For example, for `Err(e)?` suggest either `return Poll::Ready(Err(e))` or
`return Poll::Ready(Some(Err(e)))` as appropriate.
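A small sketch of the two cases, with simplified types (not the lint's test code); the bodies show the form the suggestions now take:
```rust
use std::task::Poll;

// In a function returning `Poll<Result<_, _>>`, `Err(e)?` is now suggested as:
fn poll_flush(res: Result<(), String>) -> Poll<Result<(), String>> {
    if let Err(e) = res {
        return Poll::Ready(Err(e));
    }
    Poll::Ready(Ok(()))
}

// And in a function returning `Poll<Option<Result<_, _>>>`:
fn poll_next(res: Result<(), String>) -> Poll<Option<Result<(), String>>> {
    if let Err(e) = res {
        return Poll::Ready(Some(Err(e)));
    }
    Poll::Ready(None)
}
```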
Fixes #5855
changelog: try_err: Consider Try impl for Poll when generating suggestions
Add derive_ord_xor_partial_ord lint
Fix #1621
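For context, a minimal sketch of the pattern the new lint targets (a derived `Ord` next to a hand-written `PartialOrd`; the type is made up):
```rust
use std::cmp::Ordering;

#[derive(PartialEq, Eq, Ord)]
struct Version(u32);

// A manual `PartialOrd` can disagree with the derived `Ord`,
// which is what the lint warns about.
impl PartialOrd for Version {
    fn partial_cmp(&self, other: &Self) -> Option<Ordering> {
        other.0.partial_cmp(&self.0) // reversed on purpose
    }
}
```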
Some remarks:
This PR follows the example of the analogous derive_hash_xor_partial_eq lint where possible.
I initially tried using the `match_path` function to identify `Ord` implementations, like the derive_hash_xor_partial_eq lint currently does for `Hash` implementations, but that didn't work.
Specifically, the structs at the top level were getting paths that matched `&["$crate", "cmp", "Ord"]` instead of `&["std", "cmp", "Ord"]`. While trying to figure out what to do instead, I saw the comment at the top of [clippy_lints/src/utils/paths.rs](f5d429cd76/clippy_lints/src/utils/paths.rs (L5)), which mentioned [this issue](https://github.com/rust-lang/rust-clippy/issues/5393) and suggested using diagnostic items instead of hardcoded paths whenever possible. I looked for a way to identify `Ord` implementations with diagnostic items, but (possibly because this was the first time I had heard of diagnostic items) I was unable to find one.
Eventually I tried using `get_trait_def_id` and comparing `DefId` values directly, and that seems to work as expected. Maybe there's a better approach, however?
changelog: new lint: derive_ord_xor_partial_ord
Handle mapping to Option in `map_flatten` lint
Fixes#4496
The existing [`map_flatten`](https://rust-lang.github.io/rust-clippy/master/index.html#map_flatten) lint suggests changing `expr.map(...).flatten()` to `expr.flat_map(...)` when `expr` is an `Iterator`. This PR changes the suggestion to `filter_map` instead of `flat_map` when mapping to `Option`, because it is more natural.
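A small illustration of the rewrite described above (made-up example, not from the test suite):
```rust
let v = vec![1_i32, 2, 3];

// mapping to an `Option` and then flattening ...
let _: Vec<_> = v.iter().map(|&x| if x % 2 == 0 { Some(x * 2) } else { None }).flatten().collect();

// ... is now suggested as `filter_map` instead of `flat_map`:
let _: Vec<_> = v.iter().filter_map(|&x| if x % 2 == 0 { Some(x * 2) } else { None }).collect();
```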
Also here are some questions:
* If an expression has a type which implements the `Iterator` trait (`match_trait_method(cx, expr, &paths::ITERATOR) == true`), how can I get the type of the iterator elements? Currently I use the return type of the closure inside `map`, but that's probably not a good way to do it.
* I would like to change the suggestion range to cover only `.map(...).flatten()`, that is, from:
```
let _: Vec<_> = vec![5_i8; 6].into_iter().map(|x| 0..x).flatten().collect();
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ help: try using `flat_map` instead: `vec![5_i8; 6].into_iter().flat_map
```
to
```
let _: Vec<_> = vec![5_i8; 6].into_iter().map(|x| 0..x).flatten().collect();
^^^^^^^^^^^^^^^^^^^^^^^^ help: try using `flat_map` instead: `.flat_map(|x| 0..x)`
```
Is it ok?
* Is the `map_flatten` lint intentionally in the `pedantic` category, or could it be moved to `complexity`?
changelog: Handle mapping to Option in [`map_flatten`](https://rust-lang.github.io/rust-clippy/master/index.html#map_flatten) lint
needless_collect: catch x: Vec<_> = iter.collect(); x.into_iter() ...
changelog: Expand the needless_collect lint as suggested in #5627 (WIP).
This PR is WIP because I can't figure out how to make the multi-part suggestion include its changes in the source code (the fixed output is identical to the source, despite the lint making suggestions). Aside from that one issue, I think this should be good.
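For illustration, a made-up instance of the pattern from the title that the expanded lint is meant to catch:
```rust
let iter = (0..10).map(|i| i * 2);
let x: Vec<_> = iter.collect();     // intermediate collection ...
let sum: i32 = x.into_iter().sum(); // ... only used to iterate again
// could be written without the `collect()` round-trip:
// let sum: i32 = (0..10).map(|i| i * 2).sum();
```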