mirror of
https://github.com/anchore/syft
synced 2024-11-10 06:14:16 +00:00
Speed up cataloging by replacing globs searching with index lookups (#1510)
* replace raw globs with index equivalent operations
* add cataloger test for alpm cataloger
* fix import sorting for binary cataloger
* fix linting for mock resolver
* separate portage cataloger parser impl from cataloger
* enhance cataloger pkgtest utils to account for resolver responses
* add glob-based cataloger tests for alpm cataloger
* add glob-based cataloger tests for apkdb cataloger
* add glob-based cataloger tests for dpkg cataloger
* add glob-based cataloger tests for cpp cataloger
* add glob-based cataloger tests for dart cataloger
* add glob-based cataloger tests for dotnet cataloger
* add glob-based cataloger tests for elixir cataloger
* add glob-based cataloger tests for erlang cataloger
* add glob-based cataloger tests for golang cataloger
* add glob-based cataloger tests for haskell cataloger
* add glob-based cataloger tests for java cataloger
* add glob-based cataloger tests for javascript cataloger
* add glob-based cataloger tests for php cataloger
* add glob-based cataloger tests for portage cataloger
* add glob-based cataloger tests for python cataloger
* add glob-based cataloger tests for rpm cataloger
* add glob-based cataloger tests for rust cataloger
* add glob-based cataloger tests for sbom cataloger
* add glob-based cataloger tests for swift cataloger
* allow generic cataloger to run all mimetype searches at once
* remove stutter from php and javascript cataloger constructors
* bump stereoscope
* add tests for generic.Search
* add exceptions for java archive git ignore entries
* enhance basename and extension resolver methods to be variadic
* don't allow * prefix on extension searches
* add glob-based cataloger tests for ruby cataloger
* remove unnecessary string casting
* incorporate surfacing of leaf link resolutions from stereoscope results
* [wip] switch to stereoscope file metadata
* [wip + failing] revert to old globs but keep new resolvers
* index files, links, and dirs within the directory resolver
* fix several resolver bugs and inconsistencies
* move format testutils to internal package
* update syft json to account for file type string normalization
* split up directory resolver from indexing
* update docs to include details about searching
* [wip] bump stereoscope to development version
* fix linting
* adjust symlinks fixture to be fixed to digest
* fix all-locations resolver tests
* fix test fixture reference
* rename file.Type
* bump stereoscope
* fix PR comment to exclude extra *
* bump to dev version of stereoscope
* bump to final version of stereoscope
* move observing resolver to pkgtest

Signed-off-by: Alex Goodman <alex.goodman@anchore.com>
This commit is contained in:
parent
550e2fc7c3
commit
988041ba6d
201 changed files with 4166 additions and 1370 deletions
.golangci.yaml

@@ -6,7 +6,6 @@ issues:
# include:
#   - EXC0002 # disable excluding of issues about comments from golint

linters:
# inverted configuration with `enable-all` and `disable` is not scalable during updates of golangci-lint
disable-all: true
@@ -14,25 +13,24 @@ linters:
- asciicheck
- bodyclose
- depguard
- dogsled
- dupl
- errcheck
- errorlint
- exportloopref
- forcetypeassert
- funlen
- gocognit
- goconst
- gocritic
- gocyclo
- gofmt
- tparallel
- importas
- goimports
- goprintffuncname
- gosec
- gosimple
- govet
- ineffassign
- misspell
- nolintlint
- nakedret
- revive
- staticcheck
- stylecheck
@@ -41,6 +39,7 @@ linters:
- unparam
- unused
- whitespace

linters-settings:
funlen:
# Checks the number of lines in a function.
@@ -57,7 +56,7 @@ run:
timeout: 10m

# do not enable...
# - dogsled # found to be to niche and ineffective
# - deadcode # The owner seems to have abandoned the linter. Replaced by "unused".
# - goprintffuncname # does not catch all cases and there are exceptions
# - nakedret # does not catch all cases and should not fail a build
# - gochecknoglobals
@@ -73,7 +72,11 @@ run:
# - lll # without a way to specify per-line exception cases, this is not usable
# - maligned # this is an excellent linter, but tricky to optimize and we are not sensitive to memory layout optimizations
# - nestif
# - prealloc # following this rule isn't consistently a good idea, as it sometimes forces unnecessary allocations that result in less idiomatic code
# - scopelint # deprecated
# - nolintlint # as of go1.19 this conflicts with the behavior of gofmt, which is a deal-breaker (lint-fix will still fail when running lint)
# - prealloc # following this rule isn't consistently a good idea, as it sometimes forces unnecessary allocations that result in less idiomatic code
# - rowserrcheck # not in a repo with sql, so this is not useful
# - scopelint # deprecated
# - structcheck # The owner seems to have abandoned the linter. Replaced by "unused".
# - testpackage
# - wsl # this doens't have an auto-fixer yet and is pretty noisy (https://github.com/bombsimon/wsl/issues/90)
# - varcheck # The owner seems to have abandoned the linter. Replaced by "unused".
# - wsl # this doens't have an auto-fixer yet and is pretty noisy (https://github.com/bombsimon/wsl/issues/90)
DEVELOPING.md

@@ -118,45 +118,55 @@ sequenceDiagram

Catalogers are the way in which syft is able to identify and construct packages given some amount of source metadata.
For example, Syft can locate and process `package-lock.json` files when performing filesystem scans.
-See: [how to specify file globs](https://github.com/anchore/syft/blob/main/syft/pkg/cataloger/javascript/cataloger.go#L16-L21)
-and an implementation of the [package-lock.json parser](https://github.com/anchore/syft/blob/main/syft/pkg/cataloger/javascript/cataloger.go#L16-L21) fora quick review.
+See: [how to specify file globs](https://github.com/anchore/syft/tree/v0.70.0/syft/pkg/cataloger/javascript/cataloger.go#L16-L21)
+and an implementation of the [package-lock.json parser](https://github.com/anchore/syft/tree/v0.70.0/syft/pkg/cataloger/javascript/cataloger.go#L16-L21) for a quick review.

#### Building a new Cataloger

-Catalogers must fulfill the interface [found here](https://github.com/anchore/syft/blob/main/syft/pkg/cataloger.go).
+Catalogers must fulfill the interface [found here](https://github.com/anchore/syft/tree/v0.70.0/syft/pkg/cataloger.go).
This means that when building a new cataloger, the new struct must implement both method signatures of `Catalog` and `Name`.
-A top level view of the functions that construct all the catalogers can be found [here](https://github.com/anchore/syft/blob/main/syft/pkg/cataloger/cataloger.go).
+A top level view of the functions that construct all the catalogers can be found [here](https://github.com/anchore/syft/tree/v0.70.0/syft/pkg/cataloger/cataloger.go).
When an author has finished writing a new cataloger, this is the spot to plug in the new catalog constructor.

-For a top level view of how the catalogers are used see [this function](https://github.com/anchore/syft/blob/main/syft/pkg/cataloger/catalog.go#L41-L100) as a reference. It ranges over all catalogers passed as an argument and invokes the `Catalog` method:
+For a top level view of how the catalogers are used see [this function](https://github.com/anchore/syft/tree/v0.70.0/syft/pkg/cataloger/catalog.go#L41-L100) as a reference. It ranges over all catalogers passed as an argument and invokes the `Catalog` method:

Each cataloger has its own `Catalog` method, but this does not mean that they are all vastly different.
-Take a look at the `apkdb` cataloger for alpine to see how it [constructs a generic.NewCataloger](https://github.com/anchore/syft/blob/main/syft/pkg/cataloger/apkdb/cataloger.go).
+Take a look at the `apkdb` cataloger for alpine to see how it [constructs a generic.NewCataloger](https://github.com/anchore/syft/tree/v0.70.0/syft/pkg/cataloger/apkdb/cataloger.go).

`generic.NewCataloger` is an abstraction syft uses to make writing common components easier. First, it takes the `catalogerName` to identify the cataloger.
On the other side of the call it uses two key pieces which inform the cataloger how to identify and return packages: the `globPatterns` and the `parseFunction`.
- The first piece is a `parseByGlob` matching pattern used to identify the files that contain the package metadata.
-  See [here for the APK example](https://github.com/anchore/syft/blob/main/syft/pkg/apk_metadata.go#L16-L41).
+  See [here for the APK example](https://github.com/anchore/syft/tree/v0.70.0/syft/pkg/apk_metadata.go#L16-L41).
- The other is a `parseFunction` which informs the cataloger what to do when it has found one of the above matched files.
-  See this [link for an example](https://github.com/anchore/syft/blob/main/syft/pkg/cataloger/apkdb/parse_apk_db.go#L22-L102).
+  See this [link for an example](https://github.com/anchore/syft/tree/v0.70.0/syft/pkg/cataloger/apkdb/parse_apk_db.go#L22-L102).
If you're unsure about using the `Generic Cataloger` and think the use case being filled requires something more custom,
just file an issue or ask in our slack, and we'd be more than happy to help on the design.

-Identified packages share a common struct so be sure that when the new cataloger is constructing a new package it is using the [`Package` struct](https://github.com/anchore/syft/blob/main/syft/pkg/package.go#L16-L31).
+Identified packages share a common struct, so be sure that when the new cataloger is constructing a new package it is using the [`Package` struct](https://github.com/anchore/syft/tree/v0.70.0/syft/pkg/package.go#L16-L31).

Metadata Note: Identified packages are also assigned specific metadata that can be unique to their environment.
-See [this folder](https://github.com/anchore/syft/tree/main/syft/pkg) for examples of the different metadata types.
+See [this folder](https://github.com/anchore/syft/tree/v0.70.0/syft/pkg) for examples of the different metadata types.
These are plugged into the `MetadataType` and `Metadata` fields in the above struct. `MetadataType` informs which type is being used. `Metadata` is an interface converted to that type.

Finally, here is an example of where the package construction is done in the apk cataloger. The first link is where `newPackage` is called in the `parseFunction`. The second link shows the package construction:
-- [Call for new package](https://github.com/anchore/syft/blob/6a7d6e6071829c7ce2943266c0e187b27c0b325c/syft/pkg/cataloger/apkdb/parse_apk_db.go#L96-L99)
-- [APK Package Constructor](https://github.com/anchore/syft/blob/6a7d6e6071829c7ce2943266c0e187b27c0b325c/syft/pkg/cataloger/apkdb/package.go#L12-L27)
+- [Call for new package](https://github.com/anchore/syft/blob/v0.70.0/syft/pkg/cataloger/apkdb/parse_apk_db.go#L106)
+- [APK Package Constructor](https://github.com/anchore/syft/tree/v0.70.0/syft/pkg/cataloger/apkdb/package.go#L12-L27)

If you have more questions about implementing a cataloger, or questions about one you might be currently working on, always feel free to file an issue or reach out to us [on slack](https://anchore.com/slack).
#### Searching for files

All catalogers are provided an instance of the [`source.FileResolver`](https://github.com/anchore/syft/blob/v0.70.0/syft/source/file_resolver.go#L8) to interface with the image and search for files. The implementations for these
abstractions leverage [`stereoscope`](https://github.com/anchore/stereoscope) in order to perform searching. Here is a
rough outline of how that works:

1. A stereoscope `file.Index` is searched based on the input given (a path, glob, or MIME type). The index is relatively fast to search, but requires results to be filtered down to the files that exist in the specific layer(s) of interest. This is done automatically by the `filetree.Searcher` abstraction. This abstraction will fall back to searching directly against the raw `filetree.FileTree` if the index does not contain the file(s) of interest. Note: the `filetree.Searcher` is used by the `source.FileResolver` abstraction.
2. Once the set of files is returned from the `filetree.Searcher`, the results are filtered down further to return the most unique file results. For example, you may have requested files by a glob that returns multiple results. These results are deduplicated by real file, so if a result contains two references to the same file, say one accessed via symlink and one accessed via the real path, then the real path reference is returned and the symlink reference is filtered out. If both were accessed by symlink then the first (by lexical order) is returned. This is done automatically by the `source.FileResolver` abstraction.
3. By the time results reach the `pkg.Cataloger` you are guaranteed to have a set of unique files that exist in the layer(s) of interest (relative to what the resolver supports).

## Testing

### Levels of testing
go.mod
@@ -52,7 +52,7 @@ require (
 github.com/CycloneDX/cyclonedx-go v0.7.1-0.20221222100750-41a1ac565cce
 github.com/Masterminds/sprig/v3 v3.2.3
 github.com/anchore/go-logger v0.0.0-20220728155337-03b66a5207d8
-github.com/anchore/stereoscope v0.0.0-20230203152723-c49244e4d66f
+github.com/anchore/stereoscope v0.0.0-20230208154630-5a306f07f2e7
 github.com/docker/docker v23.0.0+incompatible
 github.com/google/go-containerregistry v0.13.0
 github.com/invopop/jsonschema v0.7.0
@@ -60,7 +60,7 @@ require (
 github.com/opencontainers/go-digest v1.0.0
 github.com/sassoftware/go-rpmutils v0.2.0
 github.com/vbatts/go-mtree v0.5.2
-golang.org/x/exp v0.0.0-20220823124025-807a23277127
+golang.org/x/exp v0.0.0-20230202163644-54bba9f4231b
 gopkg.in/yaml.v3 v3.0.1
 )

@@ -70,6 +70,7 @@ require (
 github.com/Masterminds/semver/v3 v3.2.0 // indirect
 github.com/Microsoft/go-winio v0.6.0 // indirect
 github.com/anchore/go-struct-converter v0.0.0-20221118182256-c68fdcfa2092 // indirect
+github.com/becheran/wildmatch-go v1.0.0 // indirect
 github.com/containerd/containerd v1.6.12 // indirect
 github.com/containerd/stargz-snapshotter/estargz v0.12.1 // indirect
 github.com/davecgh/go-spew v1.1.1 // indirect
@@ -127,7 +128,7 @@ require (
 golang.org/x/sync v0.1.0 // indirect
 golang.org/x/sys v0.5.0 // indirect
 golang.org/x/text v0.7.0 // indirect
-golang.org/x/tools v0.1.12 // indirect
+golang.org/x/tools v0.2.0 // indirect
 golang.org/x/xerrors v0.0.0-20220907171357-04be3eba64a2 // indirect
 google.golang.org/genproto v0.0.0-20221227171554-f9683d7f8bef // indirect
 google.golang.org/grpc v1.52.0 // indirect
go.sum
@@ -90,8 +90,8 @@ github.com/anchore/go-version v1.2.2-0.20200701162849-18adb9c92b9b h1:e1bmaoJfZV
 github.com/anchore/go-version v1.2.2-0.20200701162849-18adb9c92b9b/go.mod h1:Bkc+JYWjMCF8OyZ340IMSIi2Ebf3uwByOk6ho4wne1E=
 github.com/anchore/packageurl-go v0.1.1-0.20230104203445-02e0a6721501 h1:AV7qjwMcM4r8wFhJq3jLRztew3ywIyPTRapl2T1s9o8=
 github.com/anchore/packageurl-go v0.1.1-0.20230104203445-02e0a6721501/go.mod h1:Blo6OgJNiYF41ufcgHKkbCKF2MDOMlrqhXv/ij6ocR4=
-github.com/anchore/stereoscope v0.0.0-20230203152723-c49244e4d66f h1:hyEFgDzqZRr/+q1cPfjgIKXWJ7lMHDHmDXAOrhKMhRA=
-github.com/anchore/stereoscope v0.0.0-20230203152723-c49244e4d66f/go.mod h1:YerDPu5voTWZUmjrAHhak7gGGdGLilqroEEFLA/PUHo=
+github.com/anchore/stereoscope v0.0.0-20230208154630-5a306f07f2e7 h1:PrdFBPMyika+AM1/AwDmYqrVeUATDU90wbrd81ugicU=
+github.com/anchore/stereoscope v0.0.0-20230208154630-5a306f07f2e7/go.mod h1:TUCfo52tEz7ahTUFtKN//wcB7kJzQs0Oifmnd4NkIXw=
 github.com/andreyvit/diff v0.0.0-20170406064948-c7f18ee00883/go.mod h1:rCTlJbsFo29Kk6CurOXKm700vrz8f0KW0JNfpkRJY/8=
 github.com/andybalholm/brotli v1.0.1/go.mod h1:loMXtMfwqflxFJPmdbJO0a3KNoPuLBgiu3qAvBg8x/Y=
 github.com/andybalholm/brotli v1.0.4 h1:V7DdXeJtZscaqfNuAdSRuRFzuiKlHSC/Zh3zl9qY3JY=
@@ -102,6 +102,8 @@ github.com/armon/go-metrics v0.0.0-20180917152333-f0300d1749da/go.mod h1:Q73ZrmV
 github.com/armon/go-metrics v0.3.10/go.mod h1:4O98XIr/9W0sxpJ8UaYkvjk10Iff7SnFrb4QAOwNTFc=
 github.com/armon/go-radix v0.0.0-20180808171621-7fddfc383310/go.mod h1:ufUuZ+zHj4x4TnLV4JWEpy2hxWSpsRywHrMgIH9cCH8=
 github.com/armon/go-radix v1.0.0/go.mod h1:ufUuZ+zHj4x4TnLV4JWEpy2hxWSpsRywHrMgIH9cCH8=
+github.com/becheran/wildmatch-go v1.0.0 h1:mE3dGGkTmpKtT4Z+88t8RStG40yN9T+kFEGj2PZFSzA=
+github.com/becheran/wildmatch-go v1.0.0/go.mod h1:gbMvj0NtVdJ15Mg/mH9uxk2R1QCistMyU7d9KFzroX4=
 github.com/beorn7/perks v0.0.0-20180321164747-3a771d992973/go.mod h1:Dwedo/Wpr24TaqPxmxbtue+5NUziq4I4S80YR8gNf3Q=
 github.com/beorn7/perks v1.0.0/go.mod h1:KWe93zE9D1o94FZ5RNwFwVgaQK1VOXiVxmqh+CedLV8=
 github.com/beorn7/perks v1.0.1/go.mod h1:G2ZrVWU2WbWT9wwq4/hrbKbnv/1ERSJQ0ibhJ6rlkpw=
@@ -637,8 +639,8 @@ golang.org/x/exp v0.0.0-20191227195350-da58074b4299/go.mod h1:2RIsYlXP63K8oxa1u0
 golang.org/x/exp v0.0.0-20200119233911-0405dc783f0a/go.mod h1:2RIsYlXP63K8oxa1u096TMicItID8zy7Y6sNkU49FU4=
 golang.org/x/exp v0.0.0-20200207192155-f17229e696bd/go.mod h1:J/WKrq2StrnmMY6+EHIKF9dgMWnmCNThgcyBT1FY9mM=
 golang.org/x/exp v0.0.0-20200224162631-6cc2880d07d6/go.mod h1:3jZMyOhIsHpP37uCMkUooju7aAi5cS1Q23tOzKc+0MU=
-golang.org/x/exp v0.0.0-20220823124025-807a23277127 h1:S4NrSKDfihhl3+4jSTgwoIevKxX9p7Iv9x++OEIptDo=
-golang.org/x/exp v0.0.0-20220823124025-807a23277127/go.mod h1:cyybsKvd6eL0RnXn6p/Grxp8F5bW7iYuBgsNCOHpMYE=
+golang.org/x/exp v0.0.0-20230202163644-54bba9f4231b h1:EqBVA+nNsObCwQoBEHy4wLU0pi7i8a4AL3pbItPdPkE=
+golang.org/x/exp v0.0.0-20230202163644-54bba9f4231b/go.mod h1:CxIveKay+FTh1D0yPZemJVgC/95VzuuOLq5Qi4xnoYc=
 golang.org/x/image v0.0.0-20190227222117-0694c2d4d067/go.mod h1:kZ7UVZpmo3dzQBMxlp+ypCbDeSB+sBbTgSJuh5dn5js=
 golang.org/x/image v0.0.0-20190802002840-cff245a6509b/go.mod h1:FeLwcggjj3mMvU+oOTbSwawSJRM1uh48EjtB4UJZlP0=
 golang.org/x/lint v0.0.0-20181026193005-c67002cb31c3/go.mod h1:UVdnD1Gm6xHRNCYTkRU2/jEulfH38KcIWyp/GAMgvoE=
@@ -904,8 +906,9 @@ golang.org/x/tools v0.1.2/go.mod h1:o0xws9oXOQQZyjljx8fwUC0k7L1pTE6eaCbjGeHmOkk=
 golang.org/x/tools v0.1.3/go.mod h1:o0xws9oXOQQZyjljx8fwUC0k7L1pTE6eaCbjGeHmOkk=
 golang.org/x/tools v0.1.4/go.mod h1:o0xws9oXOQQZyjljx8fwUC0k7L1pTE6eaCbjGeHmOkk=
 golang.org/x/tools v0.1.5/go.mod h1:o0xws9oXOQQZyjljx8fwUC0k7L1pTE6eaCbjGeHmOkk=
-golang.org/x/tools v0.1.12 h1:VveCTK38A2rkS8ZqFY25HIDFscX5X9OoEhJd3quQmXU=
-golang.org/x/tools v0.1.12/go.mod h1:hNGJHUnrk76NpqgfD5Aqm5Crs+Hm0VOH/i9J2+nxYbc=
+golang.org/x/tools v0.2.0 h1:G6AHpWxTMGY1KyEYoAQ5WTtIekUUvDNjan3ugu60JvE=
+golang.org/x/tools v0.2.0/go.mod h1:y4OqIKeOV/fWJetJ8bXPU1sEVniLMIyDAZWeHdV+NTA=
 golang.org/x/xerrors v0.0.0-20190717185122-a985d3407aa7/go.mod h1:I/5z698sn9Ka8TeJc9MKroUUfqBBauWjQqLJ2OPfmY0=
 golang.org/x/xerrors v0.0.0-20191011141410-1b5146add898/go.mod h1:I/5z698sn9Ka8TeJc9MKroUUfqBBauWjQqLJ2OPfmY0=
 golang.org/x/xerrors v0.0.0-20191204190536-9bdfabe68543/go.mod h1:I/5z698sn9Ka8TeJc9MKroUUfqBBauWjQqLJ2OPfmY0=
@@ -28,3 +28,10 @@ func StringInSlice(a string, list []string) bool {
 	}
 	return false
 }
+
+func SplitAny(s string, seps string) []string {
+	splitter := func(r rune) bool {
+		return strings.ContainsRune(seps, r)
+	}
+	return strings.FieldsFunc(s, splitter)
+}
@@ -104,3 +104,37 @@ func TestTruncateMiddleEllipsis(t *testing.T) {
 		})
 	}
 }
+
+func TestSplitAny(t *testing.T) {
+
+	tests := []struct {
+		name   string
+		input  string
+		fields string
+		want   []string
+	}{
+		{
+			name:   "simple",
+			input:  "a,b,c",
+			fields: ",",
+			want:   []string{"a", "b", "c"},
+		},
+		{
+			name:   "empty",
+			input:  "",
+			fields: ",",
+			want:   []string{},
+		},
+		{
+			name:   "multiple separators",
+			input:  "a,b\nc:d",
+			fields: ",:\n",
+			want:   []string{"a", "b", "c", "d"},
+		},
+	}
+	for _, tt := range tests {
+		t.Run(tt.name, func(t *testing.T) {
+			assert.Equal(t, tt.want, SplitAny(tt.input, tt.fields))
+		})
+	}
+}
@@ -1,6 +1,7 @@
 package file

 import (
+	"github.com/anchore/stereoscope/pkg/file"
 	"github.com/anchore/syft/internal/log"
 	"github.com/anchore/syft/syft/source"
 )
@@ -20,7 +21,7 @@ func allRegularFiles(resolver source.FileResolver) (locations []source.Location)
 			continue
 		}

-		if metadata.Type != source.RegularFile {
+		if metadata.Type != file.TypeRegular {
 			continue
 		}
 		locations = append(locations, resolvedLocation)
@@ -3,6 +3,7 @@ package file
 import (
 	"testing"

+	"github.com/google/go-cmp/cmp"
 	"github.com/scylladb/go-set/strset"
 	"github.com/stretchr/testify/assert"
 	"github.com/stretchr/testify/require"
@@ -69,8 +70,8 @@ func Test_allRegularFiles(t *testing.T) {
 				virtualLocations.Add(l.VirtualPath)
 			}
 		}
-		assert.ElementsMatch(t, tt.wantRealPaths.List(), realLocations.List(), "mismatched real paths")
-		assert.ElementsMatch(t, tt.wantVirtualPaths.List(), virtualLocations.List(), "mismatched virtual paths")
+		assert.ElementsMatch(t, tt.wantRealPaths.List(), realLocations.List(), "real paths differ: "+cmp.Diff(tt.wantRealPaths.List(), realLocations.List()))
+		assert.ElementsMatch(t, tt.wantVirtualPaths.List(), virtualLocations.List(), "virtual paths differ: "+cmp.Diff(tt.wantVirtualPaths.List(), virtualLocations.List()))
 		})
 	}
 }
@@ -11,6 +11,7 @@ import (
 	"github.com/wagoodman/go-partybus"
 	"github.com/wagoodman/go-progress"

+	"github.com/anchore/stereoscope/pkg/file"
 	"github.com/anchore/syft/internal"
 	"github.com/anchore/syft/internal/bus"
 	"github.com/anchore/syft/internal/log"
@@ -65,7 +66,7 @@ func (i *DigestsCataloger) catalogLocation(resolver source.FileResolver, locatio
 	}

 	// we should only attempt to report digests for files that are regular files (don't attempt to resolve links)
-	if meta.Type != source.RegularFile {
+	if meta.Type != file.TypeRegular {
 		return nil, errUndigestableFile
 	}
@@ -146,7 +146,7 @@ func TestDigestsCataloger_MixFileTypes(t *testing.T) {
 	if err != nil {
 		t.Fatalf("unable to get file=%q : %+v", test.path, err)
 	}
-	l := source.NewLocationFromImage(test.path, *ref, img)
+	l := source.NewLocationFromImage(test.path, *ref.Reference, img)

 	if len(actual[l.Coordinates]) == 0 {
 		if test.expected != "" {
@@ -6,6 +6,7 @@ import (
 	"testing"

 	"github.com/stretchr/testify/assert"
+	"github.com/stretchr/testify/require"

 	"github.com/anchore/stereoscope/pkg/file"
 	"github.com/anchore/stereoscope/pkg/imagetest"
@@ -50,8 +51,9 @@ func TestFileMetadataCataloger(t *testing.T) {
 			path:   "/file-1.txt",
 			exists: true,
 			expected: source.FileMetadata{
+				Path:    "/file-1.txt",
 				Mode:    0644,
-				Type:    "RegularFile",
+				Type:    file.TypeRegular,
 				UserID:  1,
 				GroupID: 2,
 				Size:    7,
@@ -62,8 +64,9 @@ func TestFileMetadataCataloger(t *testing.T) {
 			path:   "/hardlink-1",
 			exists: true,
 			expected: source.FileMetadata{
+				Path:            "/hardlink-1",
 				Mode:            0644,
-				Type:            "HardLink",
+				Type:            file.TypeHardLink,
 				LinkDestination: "file-1.txt",
 				UserID:          1,
 				GroupID:         2,
@@ -74,8 +77,9 @@ func TestFileMetadataCataloger(t *testing.T) {
 			path:   "/symlink-1",
 			exists: true,
 			expected: source.FileMetadata{
+				Path:            "/symlink-1",
 				Mode:            0777 | os.ModeSymlink,
-				Type:            "SymbolicLink",
+				Type:            file.TypeSymLink,
 				LinkDestination: "file-1.txt",
 				UserID:          0,
 				GroupID:         0,
@@ -86,8 +90,9 @@ func TestFileMetadataCataloger(t *testing.T) {
 			path:   "/char-device-1",
 			exists: true,
 			expected: source.FileMetadata{
+				Path:     "/char-device-1",
 				Mode:     0644 | os.ModeDevice | os.ModeCharDevice,
-				Type:     "CharacterDevice",
+				Type:     file.TypeCharacterDevice,
 				UserID:   0,
 				GroupID:  0,
 				MIMEType: "",
@@ -97,8 +102,9 @@ func TestFileMetadataCataloger(t *testing.T) {
 			path:   "/block-device-1",
 			exists: true,
 			expected: source.FileMetadata{
+				Path:     "/block-device-1",
 				Mode:     0644 | os.ModeDevice,
-				Type:     "BlockDevice",
+				Type:     file.TypeBlockDevice,
 				UserID:   0,
 				GroupID:  0,
 				MIMEType: "",
@@ -108,8 +114,9 @@ func TestFileMetadataCataloger(t *testing.T) {
 			path:   "/fifo-1",
 			exists: true,
 			expected: source.FileMetadata{
+				Path:     "/fifo-1",
 				Mode:     0644 | os.ModeNamedPipe,
-				Type:     "FIFONode",
+				Type:     file.TypeFIFO,
 				UserID:   0,
 				GroupID:  0,
 				MIMEType: "",
@@ -119,11 +126,13 @@ func TestFileMetadataCataloger(t *testing.T) {
 			path:   "/bin",
 			exists: true,
 			expected: source.FileMetadata{
+				Path:     "/bin",
 				Mode:     0755 | os.ModeDir,
-				Type:     "Directory",
+				Type:     file.TypeDirectory,
 				UserID:   0,
 				GroupID:  0,
 				MIMEType: "",
+				IsDir:    true,
 			},
 		},
 	}
@@ -131,11 +140,9 @@ func TestFileMetadataCataloger(t *testing.T) {
 	for _, test := range tests {
 		t.Run(test.path, func(t *testing.T) {
 			_, ref, err := img.SquashedTree().File(file.Path(test.path))
-			if err != nil {
-				t.Fatalf("unable to get file: %+v", err)
-			}
+			require.NoError(t, err)

-			l := source.NewLocationFromImage(test.path, *ref, img)
+			l := source.NewLocationFromImage(test.path, *ref.Reference, img)

 			assert.Equal(t, test.expected, actual[l.Coordinates], "mismatched metadata")
@@ -5,7 +5,7 @@ import (
 	"regexp"
 	"testing"

-	"github.com/anchore/syft/syft/formats/common/testutils"
+	"github.com/anchore/syft/syft/formats/internal/testutils"
 )

 var updateCycloneDx = flag.Bool("update-cyclonedx", false, "update the *.golden files for cyclone-dx encoders")
@@ -5,7 +5,7 @@ import (
 	"regexp"
 	"testing"

-	"github.com/anchore/syft/syft/formats/common/testutils"
+	"github.com/anchore/syft/syft/formats/internal/testutils"
 )

 var updateCycloneDx = flag.Bool("update-cyclonedx", false, "update the *.golden files for cyclone-dx encoders")
@@ -44,7 +44,7 @@ func Identify(by []byte) sbom.Format {
 	for _, f := range Formats() {
 		if err := f.Validate(bytes.NewReader(by)); err != nil {
 			if !errors.Is(err, sbom.ErrValidationNotSupported) {
-				log.Debugf("format %s returned err: %+v", f.ID(), err)
+				log.WithFields("error", err).Tracef("format validation for %s failed", f.ID())
 			}
 			continue
 		}
@@ -168,7 +168,7 @@ func populateImageCatalog(catalog *pkg.Catalog, img *image.Image) {
 		Name:    "package-1",
 		Version: "1.0.1",
 		Locations: source.NewLocationSet(
-			source.NewLocationFromImage(string(ref1.RealPath), *ref1, img),
+			source.NewLocationFromImage(string(ref1.RealPath), *ref1.Reference, img),
 		),
 		Type:    pkg.PythonPkg,
 		FoundBy: "the-cataloger-1",
@@ -188,7 +188,7 @@ func populateImageCatalog(catalog *pkg.Catalog, img *image.Image) {
 		Name:    "package-2",
 		Version: "2.0.1",
 		Locations: source.NewLocationSet(
-			source.NewLocationFromImage(string(ref2.RealPath), *ref2, img),
+			source.NewLocationFromImage(string(ref2.RealPath), *ref2.Reference, img),
 		),
 		Type:    pkg.DebPkg,
 		FoundBy: "the-cataloger-2",
@@ -5,7 +5,7 @@ import (
 	"regexp"
 	"testing"

-	"github.com/anchore/syft/syft/formats/common/testutils"
+	"github.com/anchore/syft/syft/formats/internal/testutils"
 )

 var updateSpdxJson = flag.Bool("update-spdx-json", false, "update the *.golden files for spdx-json encoders")

@@ -5,7 +5,7 @@ import (
 	"regexp"
 	"testing"

-	"github.com/anchore/syft/syft/formats/common/testutils"
+	"github.com/anchore/syft/syft/formats/internal/testutils"
 	"github.com/anchore/syft/syft/pkg"
 	"github.com/anchore/syft/syft/sbom"
 	"github.com/anchore/syft/syft/source"

@@ -8,7 +8,7 @@ import (
 	"github.com/go-test/deep"
 	"github.com/stretchr/testify/assert"

-	"github.com/anchore/syft/syft/formats/common/testutils"
+	"github.com/anchore/syft/syft/formats/internal/testutils"
 )

 func TestEncodeDecodeCycle(t *testing.T) {

@@ -5,10 +5,11 @@ import (
 	"regexp"
 	"testing"

+	stereoFile "github.com/anchore/stereoscope/pkg/file"
 	"github.com/anchore/syft/syft/artifact"
 	"github.com/anchore/syft/syft/cpe"
 	"github.com/anchore/syft/syft/file"
-	"github.com/anchore/syft/syft/formats/common/testutils"
+	"github.com/anchore/syft/syft/formats/internal/testutils"
 	"github.com/anchore/syft/syft/linux"
 	"github.com/anchore/syft/syft/pkg"
 	"github.com/anchore/syft/syft/sbom"
@@ -107,26 +108,26 @@ func TestEncodeFullJSONDocument(t *testing.T) {
 		FileMetadata: map[source.Coordinates]source.FileMetadata{
 			source.NewLocation("/a/place").Coordinates: {
 				Mode:    0775,
-				Type:    "directory",
+				Type:    stereoFile.TypeDirectory,
 				UserID:  0,
 				GroupID: 0,
 			},
 			source.NewLocation("/a/place/a").Coordinates: {
 				Mode:    0775,
-				Type:    "regularFile",
+				Type:    stereoFile.TypeRegular,
 				UserID:  0,
 				GroupID: 0,
 			},
 			source.NewLocation("/b").Coordinates: {
 				Mode:            0775,
-				Type:            "symbolicLink",
+				Type:            stereoFile.TypeSymLink,
 				LinkDestination: "/c",
 				UserID:          0,
 				GroupID:         0,
 			},
 			source.NewLocation("/b/place/b").Coordinates: {
 				Mode:    0644,
-				Type:    "regularFile",
+				Type:    stereoFile.TypeRegular,
 				UserID:  1,
 				GroupID: 2,
 			},

@@ -14,10 +14,10 @@ type File struct {
 }

 type FileMetadataEntry struct {
-	Mode            int             `json:"mode"`
-	Type            source.FileType `json:"type"`
-	LinkDestination string          `json:"linkDestination,omitempty"`
-	UserID          int             `json:"userID"`
-	GroupID         int             `json:"groupID"`
-	MIMEType        string          `json:"mimeType"`
+	Mode            int    `json:"mode"`
+	Type            string `json:"type"`
+	LinkDestination string `json:"linkDestination,omitempty"`
+	UserID          int    `json:"userID"`
+	GroupID         int    `json:"groupID"`
+	MIMEType        string `json:"mimeType"`
 }

@@ -89,7 +89,7 @@
 		}
 	},
 	"schema": {
-		"version": "6.1.0",
-		"url": "https://raw.githubusercontent.com/anchore/syft/main/schema/json/schema-6.1.0.json"
+		"version": "6.2.0",
+		"url": "https://raw.githubusercontent.com/anchore/syft/main/schema/json/schema-6.2.0.json"
 	}
 }

@@ -78,7 +78,7 @@
 	},
 	"metadata": {
 		"mode": 775,
-		"type": "directory",
+		"type": "Directory",
 		"userID": 0,
 		"groupID": 0,
 		"mimeType": ""
@@ -91,7 +91,7 @@
 	},
 	"metadata": {
 		"mode": 775,
-		"type": "regularFile",
+		"type": "RegularFile",
 		"userID": 0,
 		"groupID": 0,
 		"mimeType": ""
@@ -111,7 +111,7 @@
 	},
 	"metadata": {
 		"mode": 775,
-		"type": "symbolicLink",
+		"type": "SymbolicLink",
 		"linkDestination": "/c",
 		"userID": 0,
 		"groupID": 0,
@@ -125,7 +125,7 @@
 	},
 	"metadata": {
 		"mode": 644,
-		"type": "regularFile",
+		"type": "RegularFile",
 		"userID": 1,
 		"groupID": 2,
 		"mimeType": ""

@@ -9,7 +9,7 @@
 	"locations": [
 		{
 			"path": "/somefile-1.txt",
-			"layerID": "sha256:6afd1cb55939d87ba4c298907d0a53059bb3742c2d65314643e2464071cf0a2d"
+			"layerID": "sha256:fb6beecb75b39f4bb813dbf177e501edd5ddb3e69bb45cedeb78c676ee1b7a59"
 		}
 	],
 	"licenses": [
@@ -40,7 +40,7 @@
 	"locations": [
 		{
 			"path": "/somefile-2.txt",
-			"layerID": "sha256:657997cff9a836139186239bdfe77250239a700d0ed97d57e101c295e8244319"
+			"layerID": "sha256:319b588ce64253a87b533c8ed01cf0025e0eac98e7b516e12532957e1244fdec"
 		}
 	],
 	"licenses": [],
@@ -64,11 +64,11 @@
 	],
 	"artifactRelationships": [],
 	"source": {
-		"id": "c85f7ae1b0ac38342c1cf1a6f7ea2b4b1ddc49cd1b24219ebd05dc10b3303491",
+		"id": "1a678f111c8ddc66fd82687bb024e0dd6af61314404937a80e810c0cf317b796",
 		"type": "image",
 		"target": {
 			"userInput": "user-image-input",
-			"imageID": "sha256:b5c0bfa8bcf70c75d92ebebbf76af667906d56e6fad50c37e7f93df824a64b79",
+			"imageID": "sha256:3c51b06feb0cda8ee62d0e3755ef2a8496a6b71f8a55b245f07f31c4bb813d31",
 			"manifestDigest": "sha256:2731251dc34951c0e50fcc643b4c5f74922dad1a5d98f302b504cf46cd5d9368",
 			"mediaType": "application/vnd.docker.distribution.manifest.v2+json",
 			"tags": [
@@ -78,17 +78,17 @@
 			"layers": [
 				{
 					"mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip",
-					"digest": "sha256:6afd1cb55939d87ba4c298907d0a53059bb3742c2d65314643e2464071cf0a2d",
+					"digest": "sha256:fb6beecb75b39f4bb813dbf177e501edd5ddb3e69bb45cedeb78c676ee1b7a59",
 					"size": 22
 				},
 				{
 					"mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip",
-					"digest": "sha256:657997cff9a836139186239bdfe77250239a700d0ed97d57e101c295e8244319",
+					"digest": "sha256:319b588ce64253a87b533c8ed01cf0025e0eac98e7b516e12532957e1244fdec",
 					"size": 16
 				}
 			],
-			"manifest": "eyJzY2hlbWFWZXJzaW9uIjoyLCJtZWRpYVR5cGUiOiJhcHBsaWNhdGlvbi92bmQuZG9ja2VyLmRpc3RyaWJ1dGlvbi5tYW5pZmVzdC52Mitqc29uIiwiY29uZmlnIjp7Im1lZGlhVHlwZSI6ImFwcGxpY2F0aW9uL3ZuZC5kb2NrZXIuY29udGFpbmVyLmltYWdlLnYxK2pzb24iLCJzaXplIjo2NzMsImRpZ2VzdCI6InNoYTI1NjpiNWMwYmZhOGJjZjcwYzc1ZDkyZWJlYmJmNzZhZjY2NzkwNmQ1NmU2ZmFkNTBjMzdlN2Y5M2RmODI0YTY0Yjc5In0sImxheWVycyI6W3sibWVkaWFUeXBlIjoiYXBwbGljYXRpb24vdm5kLmRvY2tlci5pbWFnZS5yb290ZnMuZGlmZi50YXIuZ3ppcCIsInNpemUiOjIwNDgsImRpZ2VzdCI6InNoYTI1Njo2YWZkMWNiNTU5MzlkODdiYTRjMjk4OTA3ZDBhNTMwNTliYjM3NDJjMmQ2NTMxNDY0M2UyNDY0MDcxY2YwYTJkIn0seyJtZWRpYVR5cGUiOiJhcHBsaWNhdGlvbi92bmQuZG9ja2VyLmltYWdlLnJvb3Rmcy5kaWZmLnRhci5nemlwIiwic2l6ZSI6MjA0OCwiZGlnZXN0Ijoic2hhMjU2OjY1Nzk5N2NmZjlhODM2MTM5MTg2MjM5YmRmZTc3MjUwMjM5YTcwMGQwZWQ5N2Q1N2UxMDFjMjk1ZTgyNDQzMTkifV19",
-			"config": "eyJhcmNoaXRlY3R1cmUiOiJhbWQ2NCIsImNvbmZpZyI6eyJFbnYiOlsiUEFUSD0vdXNyL2xvY2FsL3NiaW46L3Vzci9sb2NhbC9iaW46L3Vzci9zYmluOi91c3IvYmluOi9zYmluOi9iaW4iXSwiV29ya2luZ0RpciI6Ii8iLCJPbkJ1aWxkIjpudWxsfSwiY3JlYXRlZCI6IjIwMjItMDgtMjVUMTY6MjI6MDguODkxMzY0Mjc4WiIsImhpc3RvcnkiOlt7ImNyZWF0ZWQiOiIyMDIyLTA4LTI1VDE2OjIyOjA4Ljc2MzMzMDMyM1oiLCJjcmVhdGVkX2J5IjoiQUREIGZpbGUtMS50eHQgL3NvbWVmaWxlLTEudHh0ICMgYnVpbGRraXQiLCJjb21tZW50IjoiYnVpbGRraXQuZG9ja2VyZmlsZS52MCJ9LHsiY3JlYXRlZCI6IjIwMjItMDgtMjVUMTY6MjI6MDguODkxMzY0Mjc4WiIsImNyZWF0ZWRfYnkiOiJBREQgZmlsZS0yLnR4dCAvc29tZWZpbGUtMi50eHQgIyBidWlsZGtpdCIsImNvbW1lbnQiOiJidWlsZGtpdC5kb2NrZXJmaWxlLnYwIn1dLCJvcyI6ImxpbnV4Iiwicm9vdGZzIjp7InR5cGUiOiJsYXllcnMiLCJkaWZmX2lkcyI6WyJzaGEyNTY6NmFmZDFjYjU1OTM5ZDg3YmE0YzI5ODkwN2QwYTUzMDU5YmIzNzQyYzJkNjUzMTQ2NDNlMjQ2NDA3MWNmMGEyZCIsInNoYTI1Njo2NTc5OTdjZmY5YTgzNjEzOTE4NjIzOWJkZmU3NzI1MDIzOWE3MDBkMGVkOTdkNTdlMTAxYzI5NWU4MjQ0MzE5Il19fQ==",
+			"manifest": "eyJzY2hlbWFWZXJzaW9uIjoyLCJtZWRpYVR5cGUiOiJhcHBsaWNhdGlvbi92bmQuZG9ja2VyLmRpc3RyaWJ1dGlvbi5tYW5pZmVzdC52Mitqc29uIiwiY29uZmlnIjp7Im1lZGlhVHlwZSI6ImFwcGxpY2F0aW9uL3ZuZC5kb2NrZXIuY29udGFpbmVyLmltYWdlLnYxK2pzb24iLCJzaXplIjo2NzMsImRpZ2VzdCI6InNoYTI1NjozYzUxYjA2ZmViMGNkYThlZTYyZDBlMzc1NWVmMmE4NDk2YTZiNzFmOGE1NWIyNDVmMDdmMzFjNGJiODEzZDMxIn0sImxheWVycyI6W3sibWVkaWFUeXBlIjoiYXBwbGljYXRpb24vdm5kLmRvY2tlci5pbWFnZS5yb290ZnMuZGlmZi50YXIuZ3ppcCIsInNpemUiOjIwNDgsImRpZ2VzdCI6InNoYTI1NjpmYjZiZWVjYjc1YjM5ZjRiYjgxM2RiZjE3N2U1MDFlZGQ1ZGRiM2U2OWJiNDVjZWRlYjc4YzY3NmVlMWI3YTU5In0seyJtZWRpYVR5cGUiOiJhcHBsaWNhdGlvbi92bmQuZG9ja2VyLmltYWdlLnJvb3Rmcy5kaWZmLnRhci5nemlwIiwic2l6ZSI6MjA0OCwiZGlnZXN0Ijoic2hhMjU2OjMxOWI1ODhjZTY0MjUzYTg3YjUzM2M4ZWQwMWNmMDAyNWUwZWFjOThlN2I1MTZlMTI1MzI5NTdlMTI0NGZkZWMifV19",
+			"config": "eyJhcmNoaXRlY3R1cmUiOiJhbWQ2NCIsImNvbmZpZyI6eyJFbnYiOlsiUEFUSD0vdXNyL2xvY2FsL3NiaW46L3Vzci9sb2NhbC9iaW46L3Vzci9zYmluOi91c3IvYmluOi9zYmluOi9iaW4iXSwiV29ya2luZ0RpciI6Ii8iLCJPbkJ1aWxkIjpudWxsfSwiY3JlYXRlZCI6IjIwMjItMDgtMDFUMjA6MDk6MjIuNTA5NDIxNzEyWiIsImhpc3RvcnkiOlt7ImNyZWF0ZWQiOiIyMDIyLTA4LTAxVDIwOjA5OjIyLjQ4Nzg5NTUxOVoiLCJjcmVhdGVkX2J5IjoiQUREIGZpbGUtMS50eHQgL3NvbWVmaWxlLTEudHh0ICMgYnVpbGRraXQiLCJjb21tZW50IjoiYnVpbGRraXQuZG9ja2VyZmlsZS52MCJ9LHsiY3JlYXRlZCI6IjIwMjItMDgtMDFUMjA6MDk6MjIuNTA5NDIxNzEyWiIsImNyZWF0ZWRfYnkiOiJBREQgZmlsZS0yLnR4dCAvc29tZWZpbGUtMi50eHQgIyBidWlsZGtpdCIsImNvbW1lbnQiOiJidWlsZGtpdC5kb2NrZXJmaWxlLnYwIn1dLCJvcyI6ImxpbnV4Iiwicm9vdGZzIjp7InR5cGUiOiJsYXllcnMiLCJkaWZmX2lkcyI6WyJzaGEyNTY6ZmI2YmVlY2I3NWIzOWY0YmI4MTNkYmYxNzdlNTAxZWRkNWRkYjNlNjliYjQ1Y2VkZWI3OGM2NzZlZTFiN2E1OSIsInNoYTI1NjozMTliNTg4Y2U2NDI1M2E4N2I1MzNjOGVkMDFjZjAwMjVlMGVhYzk4ZTdiNTE2ZTEyNTMyOTU3ZTEyNDRmZGVjIl19fQ==",
 			"repoDigests": [],
 			"architecture": "",
 			"os": ""
@@ -112,7 +112,7 @@
 		}
 	},
 	"schema": {
-		"version": "6.1.0",
-		"url": "https://raw.githubusercontent.com/anchore/syft/main/schema/json/schema-6.1.0.json"
+		"version": "6.2.0",
+		"url": "https://raw.githubusercontent.com/anchore/syft/main/schema/json/schema-6.2.0.json"
 	}
 }

Binary file not shown.
@@ -5,6 +5,7 @@ import (
 	"sort"
 	"strconv"

+	stereoscopeFile "github.com/anchore/stereoscope/pkg/file"
 	"github.com/anchore/syft/internal"
 	"github.com/anchore/syft/internal/log"
 	"github.com/anchore/syft/syft/artifact"
@@ -137,7 +138,7 @@ func toFileMetadataEntry(coordinates source.Coordinates, metadata *source.FileMe

 	return &model.FileMetadataEntry{
 		Mode:            mode,
-		Type:            metadata.Type,
+		Type:            toFileType(metadata.Type),
 		LinkDestination: metadata.LinkDestination,
 		UserID:          metadata.UserID,
 		GroupID:         metadata.GroupID,
@@ -145,6 +146,31 @@ func toFileMetadataEntry(coordinates source.Coordinates, metadata *source.FileMe
 	}
 }

+func toFileType(ty stereoscopeFile.Type) string {
+	switch ty {
+	case stereoscopeFile.TypeSymLink:
+		return "SymbolicLink"
+	case stereoscopeFile.TypeHardLink:
+		return "HardLink"
+	case stereoscopeFile.TypeDirectory:
+		return "Directory"
+	case stereoscopeFile.TypeSocket:
+		return "Socket"
+	case stereoscopeFile.TypeBlockDevice:
+		return "BlockDevice"
+	case stereoscopeFile.TypeCharacterDevice:
+		return "CharacterDevice"
+	case stereoscopeFile.TypeFIFO:
+		return "FIFONode"
+	case stereoscopeFile.TypeRegular:
+		return "RegularFile"
+	case stereoscopeFile.TypeIrregular:
+		return "IrregularFile"
+	default:
+		return "Unknown"
+	}
+}
+
 func toPackageModels(catalog *pkg.Catalog) []model.Package {
 	artifacts := make([]model.Package, 0)
 	if catalog == nil {

@@ -8,6 +8,7 @@ import (
 	"github.com/stretchr/testify/assert"
 	"github.com/stretchr/testify/require"

+	"github.com/anchore/stereoscope/pkg/file"
 	"github.com/anchore/syft/internal"
 	"github.com/anchore/syft/syft/formats/syftjson/model"
 	"github.com/anchore/syft/syft/source"
@@ -97,3 +98,64 @@ func Test_toSourceModel(t *testing.T) {
 	// assert all possible schemes were under test
 	assert.ElementsMatch(t, allSchemes.List(), testedSchemes.List(), "not all source.Schemes are under test")
 }
+
+func Test_toFileType(t *testing.T) {
+
+	badType := file.Type(0x1337)
+	var allTypesTested []file.Type
+	tests := []struct {
+		ty   file.Type
+		name string
+	}{
+		{
+			ty:   file.TypeRegular,
+			name: "RegularFile",
+		},
+		{
+			ty:   file.TypeDirectory,
+			name: "Directory",
+		},
+		{
+			ty:   file.TypeSymLink,
+			name: "SymbolicLink",
+		},
+		{
+			ty:   file.TypeHardLink,
+			name: "HardLink",
+		},
+		{
+			ty:   file.TypeSocket,
+			name: "Socket",
+		},
+		{
+			ty:   file.TypeCharacterDevice,
+			name: "CharacterDevice",
+		},
+		{
+			ty:   file.TypeBlockDevice,
+			name: "BlockDevice",
+		},
+		{
+			ty:   file.TypeFIFO,
+			name: "FIFONode",
+		},
+		{
+			ty:   file.TypeIrregular,
+			name: "IrregularFile",
+		},
+		{
+			ty:   badType,
+			name: "Unknown",
+		},
+	}
+	for _, tt := range tests {
+		t.Run(tt.name, func(t *testing.T) {
+			assert.Equalf(t, tt.name, toFileType(tt.ty), "toFileType(%v)", tt.ty)
+			if tt.ty != badType {
+				allTypesTested = append(allTypesTested, tt.ty)
+			}
+		})
+	}
+
+	assert.ElementsMatch(t, allTypesTested, file.AllTypes(), "not all file.Types are under test")
+}

@@ -6,7 +6,7 @@ import (

 	"github.com/go-test/deep"

-	"github.com/anchore/syft/syft/formats/common/testutils"
+	"github.com/anchore/syft/syft/formats/internal/testutils"
 )

 var updateTableGoldenFiles = flag.Bool("update-table", false, "update the *.golden files for table format")

@@ -6,7 +6,7 @@ import (

 	"github.com/stretchr/testify/assert"

-	"github.com/anchore/syft/syft/formats/common/testutils"
+	"github.com/anchore/syft/syft/formats/internal/testutils"
 )

 var updateTmpl = flag.Bool("update-tmpl", false, "update the *.golden files for json encoders")

@@ -4,7 +4,7 @@ import (
 	"flag"
 	"testing"

-	"github.com/anchore/syft/syft/formats/common/testutils"
+	"github.com/anchore/syft/syft/formats/internal/testutils"
 )

 var updateTextEncoderGoldenFiles = flag.Bool("update-text", false, "update the *.golden files for text encoder")

syft/pkg/cataloger/alpm/cataloger_test.go (new file, 209 lines)
@@ -0,0 +1,209 @@
+package alpm
+
+import (
+	"testing"
+
+	"github.com/google/go-cmp/cmp/cmpopts"
+
+	"github.com/anchore/syft/syft/artifact"
+	"github.com/anchore/syft/syft/file"
+	"github.com/anchore/syft/syft/pkg"
+	"github.com/anchore/syft/syft/pkg/cataloger/internal/pkgtest"
+	"github.com/anchore/syft/syft/source"
+)
+
+func TestAlpmCataloger(t *testing.T) {
+
+	expectedPkgs := []pkg.Package{
+		{
+			Name:         "gmp",
+			Version:      "6.2.1-2",
+			Type:         pkg.AlpmPkg,
+			FoundBy:      "alpmdb-cataloger",
+			Licenses:     []string{"LGPL3", "GPL"},
+			Locations:    source.NewLocationSet(source.NewLocation("var/lib/pacman/local/gmp-6.2.1-2/desc")),
+			CPEs:         nil,
+			PURL:         "",
+			MetadataType: "AlpmMetadata",
+			Metadata: pkg.AlpmMetadata{
+				BasePackage:  "gmp",
+				Package:      "gmp",
+				Version:      "6.2.1-2",
+				Description:  "A free library for arbitrary precision arithmetic",
+				Architecture: "x86_64",
+				Size:         1044438,
+				Packager:     "Antonio Rojas <arojas@archlinux.org>",
+				License:      "LGPL3\nGPL",
+				URL:          "https://gmplib.org/",
+				Validation:   "pgp",
+				Reason:       1,
+				Files: []pkg.AlpmFileRecord{
+					{
+						Path:    "/usr",
+						Type:    "dir",
+						Digests: []file.Digest{},
+					},
+					{
+						Path:    "/usr/include",
+						Type:    "dir",
+						Digests: []file.Digest{},
+					},
+					{
+						Path: "/usr/include/gmp.h",
+						Size: "84140",
+						Digests: []file.Digest{
+							{Algorithm: "md5", Value: "76595f70565c72550eb520809bf86856"},
+							{Algorithm: "sha256", Value: "91a614b9202453153fe3b7512d15e89659108b93ce8841c8e13789eb85da9e3a"},
+						},
+					},
+					{
+						Path: "/usr/include/gmpxx.h",
+						Size: "129113",
+						Digests: []file.Digest{
+							{Algorithm: "md5", Value: "ea3d21de4bcf7c696799c5c55dd3655b"},
+							{Algorithm: "sha256", Value: "0011ae411a0bc1030e07d968b32fdc1343f5ac2a17b7d28f493e7976dde2ac82"},
+						},
+					},
+					{
+						Path:    "/usr/lib",
+						Type:    "dir",
+						Digests: []file.Digest{},
+					},
+					{
+						Path:    "/usr/lib/libgmp.so",
+						Type:    "link",
+						Link:    "libgmp.so.10.4.1",
+						Digests: []file.Digest{},
+					},
+					{
+						Path:    "/usr/lib/libgmp.so.10",
+						Type:    "link",
+						Link:    "libgmp.so.10.4.1",
+						Digests: []file.Digest{},
+					},
+					{
+						Path: "/usr/lib/libgmp.so.10.4.1",
+						Size: "663224",
+						Digests: []file.Digest{
+							{Algorithm: "md5", Value: "d6d03eadacdd9048d5b2adf577e9d722"},
+							{Algorithm: "sha256", Value: "39898bd3d8d6785222432fa8b8aef7ce3b7e5bbfc66a52b7c0da09bed4adbe6a"},
+						},
+					},
+					{
+						Path:    "/usr/lib/libgmpxx.so",
+						Type:    "link",
+						Link:    "libgmpxx.so.4.6.1",
+						Digests: []file.Digest{},
+					},
+					{
+						Path:    "/usr/lib/libgmpxx.so.4",
+						Type:    "link",
+						Link:    "libgmpxx.so.4.6.1",
+						Digests: []file.Digest{},
+					},
+					{
+						Path: "/usr/lib/libgmpxx.so.4.6.1",
+						Size: "30680",
+						Digests: []file.Digest{
+							{Algorithm: "md5", Value: "dd5f0c4d635fa599fa7f4339c0e8814d"},
+							{Algorithm: "sha256", Value: "0ef67cbde4841f58d2e4b41f59425eb87c9eeaf4e649c060b326342c53bedbec"},
+						},
+					},
+					{
+						Path:    "/usr/lib/pkgconfig",
+						Type:    "dir",
+						Digests: []file.Digest{},
+					},
+					{
+						Path: "/usr/lib/pkgconfig/gmp.pc",
+						Size: "245",
+						Digests: []file.Digest{
+							{Algorithm: "md5", Value: "a91a9f1b66218cb77b9cd2cdf341756d"},
+							{Algorithm: "sha256", Value: "4e9de547a48c4e443781e9fa702a1ec5a23ee28b4bc520306cff2541a855be37"},
+						},
+					},
+					{
+						Path: "/usr/lib/pkgconfig/gmpxx.pc",
+						Size: "280",
+						Digests: []file.Digest{
+							{Algorithm: "md5", Value: "8c0f54e987934352177a6a30a811b001"},
+							{Algorithm: "sha256", Value: "fc5dbfbe75977057ba50953d94b9daecf696c9fdfe5b94692b832b44ecca871b"},
+						},
+					},
+					{
+						Path:    "/usr/share",
+						Type:    "dir",
+						Digests: []file.Digest{},
+					},
+					{
+						Path:    "/usr/share/info",
+						Type:    "dir",
+						Digests: []file.Digest{},
+					},
+					{
+						Path: "/usr/share/info/gmp.info-1.gz",
+						Size: "85892",
+						Digests: []file.Digest{
+							{Algorithm: "md5", Value: "63304d4d2f0247fb8a999fae66a81c19"},
+							{Algorithm: "sha256", Value: "86288c1531a2789db5da8b9838b5cde4db07bda230ae11eba23a1f33698bd14e"},
+						},
+					},
+					{
+						Path: "/usr/share/info/gmp.info-2.gz",
+						Size: "48484",
+						Digests: []file.Digest{
+							{Algorithm: "md5", Value: "4bb0dadec416d305232cac6eae712ff7"},
+							{Algorithm: "sha256", Value: "b7443c1b529588d98a074266087f79b595657ac7274191c34b10a9ceedfa950e"},
+						},
+					},
+					{
+						Path: "/usr/share/info/gmp.info.gz",
+						Size: "2380",
+						Digests: []file.Digest{
+							{Algorithm: "md5", Value: "cf6880fb0d862ee1da0d13c3831b5720"},
+							{Algorithm: "sha256", Value: "a13c8eecda3f3e5ad1e09773e47a9686f07d9d494eaddf326f3696bbef1548fd"},
+						},
+					},
+				},
+				Backup: []pkg.AlpmFileRecord{},
+			},
+		},
+	}
+
+	// TODO: relationships are not under test yet
+	var expectedRelationships []artifact.Relationship
+
+	pkgtest.NewCatalogTester().
+		FromDirectory(t, "test-fixtures/gmp-fixture").
+		WithCompareOptions(cmpopts.IgnoreFields(pkg.AlpmFileRecord{}, "Time")).
+		Expects(expectedPkgs, expectedRelationships).
+		TestCataloger(t, NewAlpmdbCataloger())
+
+}
+
+func TestCataloger_Globs(t *testing.T) {
+	tests := []struct {
+		name     string
+		fixture  string
+		expected []string
+	}{
+		{
+			name:    "obtain description files",
+			fixture: "test-fixtures/glob-paths",
+			expected: []string{
+				"var/lib/pacman/local/base-1.0/desc",
+				"var/lib/pacman/local/dive-0.10.0/desc",
+			},
+		},
+	}
+
+	for _, test := range tests {
+		t.Run(test.name, func(t *testing.T) {
+			pkgtest.NewCatalogTester().
+				FromDirectory(t, test.fixture).
+				ExpectsResolverContentQueries(test.expected).
+				IgnoreUnfulfilledPathResponses("var/lib/pacman/local/base-1.0/mtree", "var/lib/pacman/local/dive-0.10.0/mtree").
+				TestCataloger(t, NewAlpmdbCataloger())
+		})
+	}
+}

@@ -1,9 +1,8 @@
 package alpm

 import (
-	"strings"
-
 	"github.com/anchore/packageurl-go"
+	"github.com/anchore/syft/internal"
 	"github.com/anchore/syft/syft/linux"
 	"github.com/anchore/syft/syft/pkg"
 	"github.com/anchore/syft/syft/source"
@@ -15,7 +14,7 @@ func newPackage(m pkg.AlpmMetadata, release *linux.Release, locations ...source.
 		Version:      m.Version,
 		Locations:    source.NewLocationSet(locations...),
 		Type:         pkg.AlpmPkg,
-		Licenses:     strings.Split(m.License, " "),
+		Licenses:     internal.SplitAny(m.License, " \n"),
 		PURL:         packageURL(m, release),
 		MetadataType: pkg.AlpmMetadataType,
 		Metadata:     m,

@@ -65,6 +65,10 @@ func parseAlpmDB(resolver source.FileResolver, env *generic.Environment, reader
 		metadata.Backup = filesMetadata.Backup
 	}

+	if metadata.Package == "" {
+		return nil, nil, nil
+	}
+
 	return []pkg.Package{
 		newPackage(*metadata, env.LinuxRelease, reader.Location),
 	}, nil, nil

@@ -16,10 +16,12 @@ import (
 func TestDatabaseParser(t *testing.T) {
 	tests := []struct {
 		name     string
+		fixture  string
 		expected pkg.AlpmMetadata
 	}{
 		{
-			name: "test alpm database parsing",
+			name:    "test alpm database parsing",
+			fixture: "test-fixtures/files",
 			expected: pkg.AlpmMetadata{
 				Backup: []pkg.AlpmFileRecord{
 					{

@@ -90,7 +92,7 @@ func TestDatabaseParser(t *testing.T) {

 	for _, test := range tests {
 		t.Run(test.name, func(t *testing.T) {
-			f, err := os.Open("test-fixtures/files")
+			f, err := os.Open(test.fixture)
 			require.NoError(t, err)
 			t.Cleanup(func() { require.NoError(t, f.Close()) })

@@ -0,0 +1 @@
+bogus desc file

@@ -0,0 +1 @@
+bogus files

@@ -0,0 +1 @@
+bogus desc file

@@ -0,0 +1,44 @@
+%NAME%
+gmp
+
+%VERSION%
+6.2.1-2
+
+%BASE%
+gmp
+
+%DESC%
+A free library for arbitrary precision arithmetic
+
+%URL%
+https://gmplib.org/
+
+%ARCH%
+x86_64
+
+%BUILDDATE%
+1653121258
+
+%INSTALLDATE%
+1665878640
+
+%PACKAGER%
+Antonio Rojas <arojas@archlinux.org>
+
+%SIZE%
+1044438
+
+%REASON%
+1
+
+%LICENSE%
+LGPL3
+GPL
+
+%VALIDATION%
+pgp
+
+%DEPENDS%
+gcc-libs
+sh

@@ -0,0 +1,21 @@
+%FILES%
+usr/
+usr/include/
+usr/include/gmp.h
+usr/include/gmpxx.h
+usr/lib/
+usr/lib/libgmp.so
+usr/lib/libgmp.so.10
+usr/lib/libgmp.so.10.4.1
+usr/lib/libgmpxx.so
+usr/lib/libgmpxx.so.4
+usr/lib/libgmpxx.so.4.6.1
+usr/lib/pkgconfig/
+usr/lib/pkgconfig/gmp.pc
+usr/lib/pkgconfig/gmpxx.pc
+usr/share/
+usr/share/info/
+usr/share/info/gmp.info-1.gz
+usr/share/info/gmp.info-2.gz
+usr/share/info/gmp.info.gz

Binary file not shown.
syft/pkg/cataloger/apkdb/cataloger_test.go (new file, 30 lines)
@@ -0,0 +1,30 @@
+package apkdb
+
+import (
+	"testing"
+
+	"github.com/anchore/syft/syft/pkg/cataloger/internal/pkgtest"
+)
+
+func TestCataloger_Globs(t *testing.T) {
+	tests := []struct {
+		name     string
+		fixture  string
+		expected []string
+	}{
+		{
+			name:     "obtain DB files",
+			fixture:  "test-fixtures/glob-paths",
+			expected: []string{"lib/apk/db/installed"},
+		},
+	}
+
+	for _, test := range tests {
+		t.Run(test.name, func(t *testing.T) {
+			pkgtest.NewCatalogTester().
+				FromDirectory(t, test.fixture).
+				ExpectsResolverContentQueries(test.expected).
+				TestCataloger(t, NewApkdbCataloger())
+		})
+	}
+}

@@ -7,6 +7,7 @@ import (
 	"strconv"
 	"strings"

+	"github.com/anchore/syft/internal"
 	"github.com/anchore/syft/internal/log"
 	"github.com/anchore/syft/syft/artifact"
 	"github.com/anchore/syft/syft/file"
@@ -350,19 +351,12 @@ func discoverPackageDependencies(pkgs []pkg.Package) (relationships []artifact.R
 	return relationships
 }

-func splitAny(s string, seps string) []string {
-	splitter := func(r rune) bool {
-		return strings.ContainsRune(seps, r)
-	}
-	return strings.FieldsFunc(s, splitter)
-}
-
 func stripVersionSpecifier(s string) string {
 	// examples:
 	//	musl>=1              --> musl
 	//	cmd:scanelf=1.3.4-r0 --> cmd:scanelf

-	items := splitAny(s, "<>=")
+	items := internal.SplitAny(s, "<>=")
 	if len(items) == 0 {
 		return s
 	}

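The hunk above lifts the package-local `splitAny` helper into the shared `internal` package so the alpm cataloger can reuse it for license fields. The helper is a thin wrapper over `strings.FieldsFunc`, splitting on any rune in a separator set and dropping empty fields; a standalone sketch (name and placement here are illustrative, not the exact `internal.SplitAny` source):

```go
package main

import (
	"fmt"
	"strings"
)

// splitAny splits s at every rune found in seps, discarding empty fields,
// mirroring the helper promoted into the shared internal package.
func splitAny(s string, seps string) []string {
	return strings.FieldsFunc(s, func(r rune) bool {
		return strings.ContainsRune(seps, r)
	})
}

func main() {
	// stripping version specifiers from APK dependency strings
	fmt.Println(splitAny("musl>=1", "<>="))              // [musl 1]
	fmt.Println(splitAny("cmd:scanelf=1.3.4-r0", "<>=")) // [cmd:scanelf 1.3.4-r0]

	// splitting an ALPM license field on spaces or newlines
	fmt.Println(splitAny("LGPL3\nGPL", " \n")) // [LGPL3 GPL]
}
```

Because `FieldsFunc` never yields empty strings, inputs like `"musl>=1"` (two adjacent separators) still split into exactly two fields.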
@@ -0,0 +1 @@
+bogus db contents

@@ -32,6 +32,7 @@ func (c Cataloger) Catalog(resolver source.FileResolver) ([]pkg.Package, []artif
 	var relationships []artifact.Relationship

 	for _, cls := range defaultClassifiers {
+		log.WithFields("classifier", cls.Class).Trace("cataloging binaries")
 		pkgs, err := catalog(resolver, cls)
 		if err != nil {
 			log.WithFields("error", err, "classifier", cls.Class).Warn("unable to catalog binary package: %w", err)

@@ -13,7 +13,7 @@ import (
 	"github.com/anchore/syft/syft/source"
 )

-func TestClassifierCataloger_DefaultClassifiers_PositiveCases(t *testing.T) {
+func Test_Cataloger_DefaultClassifiers_PositiveCases(t *testing.T) {
 	tests := []struct {
 		name       string
 		fixtureDir string

@@ -440,7 +440,7 @@ func TestClassifierCataloger_DefaultClassifiers_PositiveCases(t *testing.T) {
 	}
 }

-func TestClassifierCataloger_DefaultClassifiers_PositiveCases_Image(t *testing.T) {
+func Test_Cataloger_DefaultClassifiers_PositiveCases_Image(t *testing.T) {
 	tests := []struct {
 		name         string
 		fixtureImage string
@@ -521,39 +521,57 @@ func assertPackagesAreEqual(t *testing.T, expected pkg.Package, p pkg.Package) {
 }

 type panicyResolver struct {
-	globCalled bool
+	searchCalled bool
 }

-func (p panicyResolver) FileContentsByLocation(location source.Location) (io.ReadCloser, error) {
+func (p *panicyResolver) FilesByExtension(_ ...string) ([]source.Location, error) {
+	p.searchCalled = true
+	return nil, errors.New("not implemented")
+}
+
+func (p *panicyResolver) FilesByBasename(_ ...string) ([]source.Location, error) {
+	p.searchCalled = true
+	return nil, errors.New("not implemented")
+}
+
+func (p *panicyResolver) FilesByBasenameGlob(_ ...string) ([]source.Location, error) {
+	p.searchCalled = true
+	return nil, errors.New("not implemented")
+}
+
+func (p *panicyResolver) FileContentsByLocation(_ source.Location) (io.ReadCloser, error) {
+	p.searchCalled = true
 	return nil, errors.New("not implemented")
 }

-func (p panicyResolver) HasPath(s string) bool {
+func (p *panicyResolver) HasPath(s string) bool {
 	return true
 }

-func (p panicyResolver) FilesByPath(paths ...string) ([]source.Location, error) {
+func (p *panicyResolver) FilesByPath(_ ...string) ([]source.Location, error) {
+	p.searchCalled = true
 	return nil, errors.New("not implemented")
 }

-func (p *panicyResolver) FilesByGlob(patterns ...string) ([]source.Location, error) {
-	p.globCalled = true
+func (p *panicyResolver) FilesByGlob(_ ...string) ([]source.Location, error) {
+	p.searchCalled = true
 	return nil, errors.New("not implemented")
 }

-func (p panicyResolver) FilesByMIMEType(types ...string) ([]source.Location, error) {
+func (p *panicyResolver) FilesByMIMEType(_ ...string) ([]source.Location, error) {
+	p.searchCalled = true
 	return nil, errors.New("not implemented")
 }

-func (p panicyResolver) RelativeFileByPath(_ source.Location, path string) *source.Location {
+func (p *panicyResolver) RelativeFileByPath(_ source.Location, _ string) *source.Location {
 	return nil
 }

-func (p panicyResolver) AllLocations() <-chan source.Location {
+func (p *panicyResolver) AllLocations() <-chan source.Location {
 	return nil
 }

-func (p panicyResolver) FileMetadataByLocation(location source.Location) (source.FileMetadata, error) {
+func (p *panicyResolver) FileMetadataByLocation(_ source.Location) (source.FileMetadata, error) {
 	return source.FileMetadata{}, errors.New("not implemented")
 }

@@ -563,5 +581,5 @@ func Test_Cataloger_ResilientToErrors(t *testing.T) {
 	resolver := &panicyResolver{}
 	_, _, err := c.Catalog(resolver)
 	assert.NoError(t, err)
-	assert.True(t, resolver.globCalled)
+	assert.True(t, resolver.searchCalled)
 }

@@ -21,7 +21,7 @@ func Test_ClassifierCPEs(t *testing.T) {
 			fixture: "test-fixtures/version.txt",
 			classifier: classifier{
 				Package:         "some-app",
-				FileGlob:        ".*/version.txt",
+				FileGlob:        "**/version.txt",
 				EvidenceMatcher: fileContentsVersionMatcher(`(?m)my-verison:(?P<version>[0-9.]+)`),
 				CPEs:            []cpe.CPE{},
 			},

@@ -32,7 +32,7 @@ func Test_ClassifierCPEs(t *testing.T) {
 			fixture: "test-fixtures/version.txt",
 			classifier: classifier{
 				Package:         "some-app",
-				FileGlob:        ".*/version.txt",
+				FileGlob:        "**/version.txt",
 				EvidenceMatcher: fileContentsVersionMatcher(`(?m)my-verison:(?P<version>[0-9.]+)`),
 				CPEs: []cpe.CPE{
 					cpe.Must("cpe:2.3:a:some:app:*:*:*:*:*:*:*:*"),

@@ -47,7 +47,7 @@ func Test_ClassifierCPEs(t *testing.T) {
 			fixture: "test-fixtures/version.txt",
 			classifier: classifier{
 				Package:         "some-app",
-				FileGlob:        ".*/version.txt",
+				FileGlob:        "**/version.txt",
 				EvidenceMatcher: fileContentsVersionMatcher(`(?m)my-verison:(?P<version>[0-9.]+)`),
 				CPEs: []cpe.CPE{
 					cpe.Must("cpe:2.3:a:some:app:*:*:*:*:*:*:*:*"),

@@ -41,8 +41,8 @@ func ImageCatalogers(cfg Config) []pkg.Cataloger {
 		alpm.NewAlpmdbCataloger(),
 		ruby.NewGemSpecCataloger(),
 		python.NewPythonPackageCataloger(),
-		php.NewPHPComposerInstalledCataloger(),
-		javascript.NewJavascriptPackageCataloger(),
+		php.NewComposerInstalledCataloger(),
+		javascript.NewPackageCataloger(),
 		deb.NewDpkgdbCataloger(),
 		rpm.NewRpmDBCataloger(),
 		java.NewJavaCataloger(cfg.Java()),

@@ -63,8 +63,8 @@ func DirectoryCatalogers(cfg Config) []pkg.Cataloger {
 		ruby.NewGemFileLockCataloger(),
 		python.NewPythonIndexCataloger(),
 		python.NewPythonPackageCataloger(),
-		php.NewPHPComposerLockCataloger(),
-		javascript.NewJavascriptLockCataloger(),
+		php.NewComposerLockCataloger(),
+		javascript.NewLockCataloger(),
 		deb.NewDpkgdbCataloger(),
 		rpm.NewRpmDBCataloger(),
 		rpm.NewFileCataloger(),

@@ -96,8 +96,8 @@ func AllCatalogers(cfg Config) []pkg.Cataloger {
 		ruby.NewGemSpecCataloger(),
 		python.NewPythonIndexCataloger(),
 		python.NewPythonPackageCataloger(),
-		javascript.NewJavascriptLockCataloger(),
-		javascript.NewJavascriptPackageCataloger(),
+		javascript.NewLockCataloger(),
+		javascript.NewPackageCataloger(),
 		deb.NewDpkgdbCataloger(),
 		rpm.NewRpmDBCataloger(),
 		rpm.NewFileCataloger(),

@@ -111,8 +111,8 @@ func AllCatalogers(cfg Config) []pkg.Cataloger {
 		rust.NewAuditBinaryCataloger(),
 		dart.NewPubspecLockCataloger(),
 		dotnet.NewDotnetDepsCataloger(),
-		php.NewPHPComposerInstalledCataloger(),
-		php.NewPHPComposerLockCataloger(),
+		php.NewComposerInstalledCataloger(),
+		php.NewComposerLockCataloger(),
 		swift.NewCocoapodsCataloger(),
 		cpp.NewConanCataloger(),
 		portage.NewPortageCataloger(),

syft/pkg/cataloger/cpp/cataloger_test.go (new file, 33 lines)
@@ -0,0 +1,33 @@
package cpp

import (
	"testing"

	"github.com/anchore/syft/syft/pkg/cataloger/internal/pkgtest"
)

func TestCataloger_Globs(t *testing.T) {
	tests := []struct {
		name     string
		fixture  string
		expected []string
	}{
		{
			name:    "obtain conan files",
			fixture: "test-fixtures/glob-paths",
			expected: []string{
				"somewhere/src/conanfile.txt",
				"somewhere/src/conan.lock",
			},
		},
	}

	for _, test := range tests {
		t.Run(test.name, func(t *testing.T) {
			pkgtest.NewCatalogTester().
				FromDirectory(t, test.fixture).
				ExpectsResolverContentQueries(test.expected).
				TestCataloger(t, NewConanCataloger())
		})
	}
}
@@ -0,0 +1 @@
bogus conan.lock
@@ -0,0 +1 @@
bogus conan file
syft/pkg/cataloger/dart/cataloger_test.go (new file, 32 lines)
@@ -0,0 +1,32 @@
package dart

import (
	"testing"

	"github.com/anchore/syft/syft/pkg/cataloger/internal/pkgtest"
)

func TestCataloger_Globs(t *testing.T) {
	tests := []struct {
		name     string
		fixture  string
		expected []string
	}{
		{
			name:    "obtain pubspec files",
			fixture: "test-fixtures/glob-paths",
			expected: []string{
				"src/pubspec.lock",
			},
		},
	}

	for _, test := range tests {
		t.Run(test.name, func(t *testing.T) {
			pkgtest.NewCatalogTester().
				FromDirectory(t, test.fixture).
				ExpectsResolverContentQueries(test.expected).
				TestCataloger(t, NewPubspecLockCataloger())
		})
	}
}
@@ -0,0 +1 @@
bogus pubspec.lock
@@ -4,7 +4,6 @@ Package dpkg provides a concrete Cataloger implementation for Debian package DB
 package deb

 import (
-	"github.com/anchore/syft/syft/pkg"
 	"github.com/anchore/syft/syft/pkg/cataloger/generic"
 )

@@ -13,5 +12,7 @@ const catalogerName = "dpkgdb-cataloger"
 // NewDpkgdbCataloger returns a new Deb package cataloger capable of parsing DPKG status DB files.
 func NewDpkgdbCataloger() *generic.Cataloger {
 	return generic.NewCataloger(catalogerName).
-		WithParserByGlobs(parseDpkgDB, pkg.DpkgDBGlob)
+		// note: these globs have been intentionally split up in order to improve search performance,
+		// please do NOT combine into: "**/var/lib/dpkg/{status,status.d/*}"
+		WithParserByGlobs(parseDpkgDB, "**/var/lib/dpkg/status", "**/var/lib/dpkg/status.d/*")
 }
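The comment in the hunk above hints at why splitting the globs helps: a glob whose basename is a fixed string can be answered from a path index keyed by basename, with no directory walk, while a brace/wildcard basename forces a scan. A minimal stdlib-only sketch of that idea (the `index` type here is hypothetical, not syft's actual index structure):

```go
package main

import (
	"fmt"
	"path"
	"strings"
)

// index is a toy path index keyed by basename; syft's real flattened index is richer.
type index map[string][]string

func newIndex(paths []string) index {
	idx := index{}
	for _, p := range paths {
		idx[path.Base(p)] = append(idx[path.Base(p)], p)
	}
	return idx
}

// lookupGlob answers "**/<fixed-suffix>"-style globs straight from the basename index.
func (idx index) lookupGlob(glob string) []string {
	base := path.Base(glob)
	if strings.ContainsAny(base, "*?[") {
		// patterned basename: would require walking every indexed path (omitted in this sketch)
		return nil
	}
	suffix := strings.TrimPrefix(glob, "**")
	var matches []string
	for _, p := range idx[base] {
		// only candidates that share the basename are checked against the full suffix
		if strings.HasSuffix("/"+p, suffix) {
			matches = append(matches, p)
		}
	}
	return matches
}

func main() {
	idx := newIndex([]string{
		"var/lib/dpkg/status",
		"var/lib/dpkg/status.d/pkg-1.0",
		"etc/os-release",
	})
	fmt.Println(idx.lookupGlob("**/var/lib/dpkg/status")) // prints: [var/lib/dpkg/status]
}
```

Combining the two dpkg globs into one brace pattern would put a wildcard in the basename position and fall off this fast path, which is what the "do NOT combine" note guards against.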
@@ -81,3 +81,29 @@ func TestDpkgCataloger(t *testing.T) {
 		Expects(expected, nil).
 		TestCataloger(t, c)
 }
+
+func TestCataloger_Globs(t *testing.T) {
+	tests := []struct {
+		name     string
+		fixture  string
+		expected []string
+	}{
+		{
+			name:    "obtain db status files",
+			fixture: "test-fixtures/glob-paths",
+			expected: []string{
+				"var/lib/dpkg/status",
+				"var/lib/dpkg/status.d/pkg-1.0",
+			},
+		},
+	}
+
+	for _, test := range tests {
+		t.Run(test.name, func(t *testing.T) {
+			pkgtest.NewCatalogTester().
+				FromDirectory(t, test.fixture).
+				ExpectsResolverContentQueries(test.expected).
+				TestCataloger(t, NewDpkgdbCataloger())
+		})
+	}
+}
@@ -83,7 +83,7 @@ func packageURL(m pkg.DpkgMetadata, distro *linux.Release) string {
 func addLicenses(resolver source.FileResolver, dbLocation source.Location, p *pkg.Package) {
 	metadata, ok := p.Metadata.(pkg.DpkgMetadata)
 	if !ok {
-		log.WithFields("package", p.String()).Warn("unable to extract DPKG metadata to add licenses")
+		log.WithFields("package", p).Warn("unable to extract DPKG metadata to add licenses")
 		return
 	}

@@ -103,7 +103,7 @@ func addLicenses(resolver source.FileResolver, dbLocation source.Location, p *pk
 func mergeFileListing(resolver source.FileResolver, dbLocation source.Location, p *pkg.Package) {
 	metadata, ok := p.Metadata.(pkg.DpkgMetadata)
 	if !ok {
-		log.WithFields("package", p.String()).Warn("unable to extract DPKG metadata to file listing")
+		log.WithFields("package", p).Warn("unable to extract DPKG metadata to file listing")
 		return
 	}
@@ -0,0 +1 @@
bogus status
@@ -0,0 +1 @@
bogus package
syft/pkg/cataloger/dotnet/cataloger_test.go (new file, 32 lines)
@@ -0,0 +1,32 @@
package dotnet

import (
	"testing"

	"github.com/anchore/syft/syft/pkg/cataloger/internal/pkgtest"
)

func TestCataloger_Globs(t *testing.T) {
	tests := []struct {
		name     string
		fixture  string
		expected []string
	}{
		{
			name:    "obtain deps.json files",
			fixture: "test-fixtures/glob-paths",
			expected: []string{
				"src/something.deps.json",
			},
		},
	}

	for _, test := range tests {
		t.Run(test.name, func(t *testing.T) {
			pkgtest.NewCatalogTester().
				FromDirectory(t, test.fixture).
				ExpectsResolverContentQueries(test.expected).
				TestCataloger(t, NewDotnetDepsCataloger())
		})
	}
}
@@ -0,0 +1 @@
bogus deps.json
syft/pkg/cataloger/elixir/cataloger_test.go (new file, 32 lines)
@@ -0,0 +1,32 @@
package elixir

import (
	"testing"

	"github.com/anchore/syft/syft/pkg/cataloger/internal/pkgtest"
)

func TestCataloger_Globs(t *testing.T) {
	tests := []struct {
		name     string
		fixture  string
		expected []string
	}{
		{
			name:    "obtain mix.lock files",
			fixture: "test-fixtures/glob-paths",
			expected: []string{
				"src/mix.lock",
			},
		},
	}

	for _, test := range tests {
		t.Run(test.name, func(t *testing.T) {
			pkgtest.NewCatalogTester().
				FromDirectory(t, test.fixture).
				ExpectsResolverContentQueries(test.expected).
				TestCataloger(t, NewMixLockCataloger())
		})
	}
}
@@ -0,0 +1 @@
bogus mix.lock
syft/pkg/cataloger/erlang/cataloger_test.go (new file, 32 lines)
@@ -0,0 +1,32 @@
package erlang

import (
	"testing"

	"github.com/anchore/syft/syft/pkg/cataloger/internal/pkgtest"
)

func TestCataloger_Globs(t *testing.T) {
	tests := []struct {
		name     string
		fixture  string
		expected []string
	}{
		{
			name:    "obtain rebar.lock files",
			fixture: "test-fixtures/glob-paths",
			expected: []string{
				"src/rebar.lock",
			},
		},
	}

	for _, test := range tests {
		t.Run(test.name, func(t *testing.T) {
			pkgtest.NewCatalogTester().
				FromDirectory(t, test.fixture).
				ExpectsResolverContentQueries(test.expected).
				TestCataloger(t, NewRebarLockCataloger())
		})
	}
}
@@ -0,0 +1 @@
bogus rebar.lock
@@ -47,16 +47,13 @@ func (c *Cataloger) WithParserByMimeTypes(parser Parser, types ...string) *Catal
 	c.processor = append(c.processor,
 		func(resolver source.FileResolver, env Environment) []request {
 			var requests []request
-			for _, t := range types {
-				log.WithFields("mimetype", t).Trace("searching for paths matching mimetype")
-
-				matches, err := resolver.FilesByMIMEType(t)
-				if err != nil {
-					log.Warnf("unable to process mimetype=%q: %+v", t, err)
-					continue
-				}
-				requests = append(requests, makeRequests(parser, matches)...)
+			log.WithFields("mimetypes", types).Trace("searching for paths matching mimetype")
+			matches, err := resolver.FilesByMIMEType(types...)
+			if err != nil {
+				log.Warnf("unable to process mimetypes=%+v: %+v", types, err)
+				return nil
 			}
+			requests = append(requests, makeRequests(parser, matches)...)
 			return requests
 		},
 	)
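The hunk above replaces a per-type query loop with a single batched call: one resolver round trip and one log line for all MIME types, instead of one each. A stdlib-only sketch of the same refactor (the `store` type is hypothetical, standing in for the resolver's MIME-type index):

```go
package main

import "fmt"

// store maps a MIME type to the file paths indexed under it.
type store map[string][]string

// filesByMIMEType answers one batched query for any number of types,
// mirroring the single variadic call in the hunk above.
func (s store) filesByMIMEType(types ...string) []string {
	var matches []string
	for _, t := range types {
		matches = append(matches, s[t]...)
	}
	return matches
}

func main() {
	s := store{
		"application/x-executable":  {"/bin/busybox"},
		"application/x-mach-binary": {"/usr/local/bin/syft"},
	}
	// one query covers all requested types in a single pass over the index
	fmt.Println(s.filesByMIMEType("application/x-executable", "application/x-mach-binary"))
}
```

Batching also changes error semantics, as the diff shows: a failure now abandons the whole request set (`return nil`) rather than skipping one type (`continue`).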
syft/pkg/cataloger/golang/cataloger_test.go (new file, 58 lines)
@@ -0,0 +1,58 @@
package golang

import (
	"testing"

	"github.com/anchore/syft/syft/pkg/cataloger/internal/pkgtest"
)

func Test_Mod_Cataloger_Globs(t *testing.T) {
	tests := []struct {
		name     string
		fixture  string
		expected []string
	}{
		{
			name:    "obtain go.mod files",
			fixture: "test-fixtures/glob-paths",
			expected: []string{
				"src/go.mod",
			},
		},
	}

	for _, test := range tests {
		t.Run(test.name, func(t *testing.T) {
			pkgtest.NewCatalogTester().
				FromDirectory(t, test.fixture).
				ExpectsResolverContentQueries(test.expected).
				IgnoreUnfulfilledPathResponses("src/go.sum").
				TestCataloger(t, NewGoModFileCataloger())
		})
	}
}

func Test_Binary_Cataloger_Globs(t *testing.T) {
	tests := []struct {
		name     string
		fixture  string
		expected []string
	}{
		{
			name:    "obtain binary files",
			fixture: "test-fixtures/glob-paths",
			expected: []string{
				"partial-binary",
			},
		},
	}

	for _, test := range tests {
		t.Run(test.name, func(t *testing.T) {
			pkgtest.NewCatalogTester().
				FromDirectory(t, test.fixture).
				ExpectsResolverContentQueries(test.expected).
				TestCataloger(t, NewGoModuleBinaryCataloger())
		})
	}
}
@@ -0,0 +1 @@
销睨
@@ -0,0 +1 @@
// bogus go.mod
syft/pkg/cataloger/haskell/cataloger_test.go (new file, 34 lines)
@@ -0,0 +1,34 @@
package haskell

import (
	"testing"

	"github.com/anchore/syft/syft/pkg/cataloger/internal/pkgtest"
)

func TestCataloger_Globs(t *testing.T) {
	tests := []struct {
		name     string
		fixture  string
		expected []string
	}{
		{
			name:    "obtain stack and cabal files",
			fixture: "test-fixtures/glob-paths",
			expected: []string{
				"src/stack.yaml",
				"src/stack.yaml.lock",
				"src/cabal.project.freeze",
			},
		},
	}

	for _, test := range tests {
		t.Run(test.name, func(t *testing.T) {
			pkgtest.NewCatalogTester().
				FromDirectory(t, test.fixture).
				ExpectsResolverContentQueries(test.expected).
				TestCataloger(t, NewHackageCataloger())
		})
	}
}
@@ -0,0 +1 @@
cabal.project.freeze
@@ -0,0 +1 @@
bogus stack.yaml
@@ -0,0 +1 @@
bogus stack.yaml.lock
syft/pkg/cataloger/internal/pkgtest/observing_resolver.go (new file, 222 lines)
@@ -0,0 +1,222 @@
package pkgtest

import (
	"fmt"
	"io"
	"sort"

	"github.com/scylladb/go-set/strset"

	"github.com/anchore/syft/syft/source"
)

var _ source.FileResolver = (*ObservingResolver)(nil)

type ObservingResolver struct {
	decorated          source.FileResolver
	pathQueries        map[string][]string
	pathResponses      []source.Location
	contentQueries     []source.Location
	emptyPathResponses map[string][]string
}

func NewObservingResolver(resolver source.FileResolver) *ObservingResolver {
	return &ObservingResolver{
		decorated:          resolver,
		pathResponses:      make([]source.Location, 0),
		emptyPathResponses: make(map[string][]string),
		pathQueries:        make(map[string][]string),
	}
}

// testing helpers...

func (r *ObservingResolver) ObservedPathQuery(input string) bool {
	for _, vs := range r.pathQueries {
		for _, v := range vs {
			if v == input {
				return true
			}
		}
	}
	return false
}

func (r *ObservingResolver) ObservedPathResponses(path string) bool {
	for _, loc := range r.pathResponses {
		if loc.RealPath == path {
			return true
		}
	}
	return false
}

func (r *ObservingResolver) ObservedContentQueries(path string) bool {
	for _, loc := range r.contentQueries {
		if loc.RealPath == path {
			return true
		}
	}
	return false
}

func (r *ObservingResolver) AllContentQueries() []string {
	observed := strset.New()
	for _, loc := range r.contentQueries {
		observed.Add(loc.RealPath)
	}
	return observed.List()
}

func (r *ObservingResolver) AllPathQueries() map[string][]string {
	return r.pathQueries
}

func (r *ObservingResolver) PruneUnfulfilledPathResponses(ignore map[string][]string, ignorePaths ...string) {
	if ignore == nil {
		return
	}
	// remove any paths that were ignored for specific calls
	for k, v := range ignore {
		results := r.emptyPathResponses[k]
		for _, ig := range v {
			for i, result := range results {
				if result == ig {
					results = append(results[:i], results[i+1:]...)
					break
				}
			}
		}
		if len(results) > 0 {
			r.emptyPathResponses[k] = results
		} else {
			delete(r.emptyPathResponses, k)
		}
	}

	// remove any paths that were ignored for all calls
	for _, ig := range ignorePaths {
		for k, v := range r.emptyPathResponses {
			for i, result := range v {
				if result == ig {
					v = append(v[:i], v[i+1:]...)
					break
				}
			}
			if len(v) > 0 {
				r.emptyPathResponses[k] = v
			} else {
				delete(r.emptyPathResponses, k)
			}
		}
	}
}

func (r *ObservingResolver) HasUnfulfilledPathRequests() bool {
	return len(r.emptyPathResponses) > 0
}

func (r *ObservingResolver) PrettyUnfulfilledPathRequests() string {
	var res string
	var keys []string

	for k := range r.emptyPathResponses {
		keys = append(keys, k)
	}

	sort.Strings(keys)

	for _, k := range keys {
		res += fmt.Sprintf("   %s: %+v\n", k, r.emptyPathResponses[k])
	}
	return res
}

// For the file path resolver...

func (r *ObservingResolver) addPathQuery(name string, input ...string) {
	r.pathQueries[name] = append(r.pathQueries[name], input...)
}

func (r *ObservingResolver) addPathResponse(locs ...source.Location) {
	r.pathResponses = append(r.pathResponses, locs...)
}

func (r *ObservingResolver) addEmptyPathResponse(name string, locs []source.Location, paths ...string) {
	if len(locs) == 0 {
		results := r.emptyPathResponses[name]
		results = append(results, paths...)
		r.emptyPathResponses[name] = results
	}
}

func (r *ObservingResolver) FilesByPath(paths ...string) ([]source.Location, error) {
	name := "FilesByPath"
	r.addPathQuery(name, paths...)

	locs, err := r.decorated.FilesByPath(paths...)

	r.addPathResponse(locs...)
	r.addEmptyPathResponse(name, locs, paths...)
	return locs, err
}

func (r *ObservingResolver) FilesByGlob(patterns ...string) ([]source.Location, error) {
	name := "FilesByGlob"
	r.addPathQuery(name, patterns...)

	locs, err := r.decorated.FilesByGlob(patterns...)

	r.addPathResponse(locs...)
	r.addEmptyPathResponse(name, locs, patterns...)
	return locs, err
}

func (r *ObservingResolver) FilesByMIMEType(types ...string) ([]source.Location, error) {
	name := "FilesByMIMEType"
	r.addPathQuery(name, types...)

	locs, err := r.decorated.FilesByMIMEType(types...)

	r.addPathResponse(locs...)
	r.addEmptyPathResponse(name, locs, types...)
	return locs, err
}

func (r *ObservingResolver) RelativeFileByPath(l source.Location, path string) *source.Location {
	name := "RelativeFileByPath"
	r.addPathQuery(name, path)

	loc := r.decorated.RelativeFileByPath(l, path)

	if loc != nil {
		r.addPathResponse(*loc)
	} else {
		results := r.emptyPathResponses[name]
		results = append(results, path)
		r.emptyPathResponses[name] = results
	}
	return loc
}

// For the content resolver methods...

func (r *ObservingResolver) FileContentsByLocation(location source.Location) (io.ReadCloser, error) {
	r.contentQueries = append(r.contentQueries, location)
	reader, err := r.decorated.FileContentsByLocation(location)
	return reader, err
}

// For the remaining resolver methods...

func (r *ObservingResolver) AllLocations() <-chan source.Location {
	return r.decorated.AllLocations()
}

func (r *ObservingResolver) HasPath(s string) bool {
	return r.decorated.HasPath(s)
}

func (r *ObservingResolver) FileMetadataByLocation(location source.Location) (source.FileMetadata, error) {
	return r.decorated.FileMetadataByLocation(location)
}
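The ObservingResolver above is a decorator: it wraps a real resolver, forwards every call, and records which queries were made and which came back empty, so tests can assert on search behavior rather than only on results. A minimal stdlib-only sketch of the same pattern over a hypothetical one-method interface (not syft's actual `source.FileResolver`):

```go
package main

import "fmt"

// resolver is a trimmed stand-in for a file resolver interface.
type resolver interface {
	FilesByGlob(patterns ...string) []string
}

// mapResolver is a toy backing resolver keyed by glob pattern.
type mapResolver map[string][]string

func (m mapResolver) FilesByGlob(patterns ...string) []string {
	var out []string
	for _, p := range patterns {
		out = append(out, m[p]...)
	}
	return out
}

// observingResolver decorates another resolver, recording all queries
// and flagging those that returned no locations.
type observingResolver struct {
	decorated   resolver
	queries     []string
	unfulfilled []string
}

func (o *observingResolver) FilesByGlob(patterns ...string) []string {
	o.queries = append(o.queries, patterns...)
	locs := o.decorated.FilesByGlob(patterns...)
	if len(locs) == 0 {
		o.unfulfilled = append(o.unfulfilled, patterns...)
	}
	return locs
}

func main() {
	inner := mapResolver{"**/go.mod": {"src/go.mod"}}
	obs := &observingResolver{decorated: inner}
	obs.FilesByGlob("**/go.mod")
	obs.FilesByGlob("**/go.sum") // nothing matches: recorded as unfulfilled
	fmt.Println(obs.queries, obs.unfulfilled)
}
```

This is what lets the glob-centric cataloger tests in this PR fail when a cataloger's glob never matches its fixture, even though no packages are being compared.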
@@ -1,6 +1,7 @@
 package pkgtest

 import (
+	"fmt"
 	"io"
 	"os"
 	"strings"

@@ -8,6 +9,7 @@ import (

 	"github.com/google/go-cmp/cmp"
 	"github.com/google/go-cmp/cmp/cmpopts"
+	"github.com/stretchr/testify/assert"
 	"github.com/stretchr/testify/require"

 	"github.com/anchore/stereoscope/pkg/imagetest"

@@ -21,20 +23,35 @@ import (
 type locationComparer func(x, y source.Location) bool

 type CatalogTester struct {
-	expectedPkgs          []pkg.Package
-	expectedRelationships []artifact.Relationship
-	env                   *generic.Environment
-	reader                source.LocationReadCloser
-	resolver              source.FileResolver
-	wantErr               require.ErrorAssertionFunc
-	compareOptions        []cmp.Option
-	locationComparer      locationComparer
+	expectedPkgs                   []pkg.Package
+	expectedRelationships          []artifact.Relationship
+	assertResultExpectations      bool
+	expectedPathResponses          []string // this is a minimum set, the resolver may return more than just this list
+	expectedContentQueries         []string // this is a full set, any other queries are unexpected (and will fail the test)
+	ignoreUnfulfilledPathResponses map[string][]string
+	ignoreAnyUnfulfilledPaths      []string
+	env                            *generic.Environment
+	reader                         source.LocationReadCloser
+	resolver                       source.FileResolver
+	wantErr                        require.ErrorAssertionFunc
+	compareOptions                 []cmp.Option
+	locationComparer               locationComparer
 }

 func NewCatalogTester() *CatalogTester {
 	return &CatalogTester{
 		wantErr:          require.NoError,
 		locationComparer: DefaultLocationComparer,
+		ignoreUnfulfilledPathResponses: map[string][]string{
+			"FilesByPath": {
+				// most catalogers search for a linux release, which will not be fulfilled in testing
+				"/etc/os-release",
+				"/usr/lib/os-release",
+				"/etc/system-release-cpe",
+				"/etc/redhat-release",
+				"/bin/busybox",
+			},
+		},
 	}
 }

@@ -90,6 +107,7 @@ func (p *CatalogTester) WithEnv(env *generic.Environment) *CatalogTester {
 }

 func (p *CatalogTester) WithError() *CatalogTester {
+	p.assertResultExpectations = true
 	p.wantErr = require.Error
 	return p
 }

@@ -129,12 +147,33 @@ func (p *CatalogTester) IgnorePackageFields(fields ...string) *CatalogTester {
 	return p
 }

 func (p *CatalogTester) WithCompareOptions(opts ...cmp.Option) *CatalogTester {
 	p.compareOptions = append(p.compareOptions, opts...)
 	return p
 }

 func (p *CatalogTester) Expects(pkgs []pkg.Package, relationships []artifact.Relationship) *CatalogTester {
+	p.assertResultExpectations = true
 	p.expectedPkgs = pkgs
 	p.expectedRelationships = relationships
 	return p
 }

+func (p *CatalogTester) ExpectsResolverPathResponses(locations []string) *CatalogTester {
+	p.expectedPathResponses = locations
+	return p
+}
+
+func (p *CatalogTester) ExpectsResolverContentQueries(locations []string) *CatalogTester {
+	p.expectedContentQueries = locations
+	return p
+}
+
+func (p *CatalogTester) IgnoreUnfulfilledPathResponses(paths ...string) *CatalogTester {
+	p.ignoreAnyUnfulfilledPaths = append(p.ignoreAnyUnfulfilledPaths, paths...)
+	return p
+}
+
 func (p *CatalogTester) TestParser(t *testing.T, parser generic.Parser) {
 	t.Helper()
 	pkgs, relationships, err := parser(p.resolver, p.env, p.reader)

@@ -144,9 +183,30 @@ func (p *CatalogTester) TestParser(t *testing.T, parser generic.Parser) {

 func (p *CatalogTester) TestCataloger(t *testing.T, cataloger pkg.Cataloger) {
 	t.Helper()
-	pkgs, relationships, err := cataloger.Catalog(p.resolver)
-	p.wantErr(t, err)
-	p.assertPkgs(t, pkgs, relationships)
+
+	resolver := NewObservingResolver(p.resolver)
+
+	pkgs, relationships, err := cataloger.Catalog(resolver)
+
+	// this is a minimum set, the resolver may return more than just this list
+	for _, path := range p.expectedPathResponses {
+		assert.Truef(t, resolver.ObservedPathResponses(path), "expected path query for %q was not observed", path)
+	}
+
+	// this is a full set, any other queries are unexpected (and will fail the test)
+	if len(p.expectedContentQueries) > 0 {
+		assert.ElementsMatchf(t, p.expectedContentQueries, resolver.AllContentQueries(), "unexpected content queries observed: diff %s", cmp.Diff(p.expectedContentQueries, resolver.AllContentQueries()))
+	}
+
+	if p.assertResultExpectations {
+		p.wantErr(t, err)
+		p.assertPkgs(t, pkgs, relationships)
+	} else {
+		resolver.PruneUnfulfilledPathResponses(p.ignoreUnfulfilledPathResponses, p.ignoreAnyUnfulfilledPaths...)
+
+		// if we aren't testing the results, we should focus on what was searched for (for glob-centric tests)
+		assert.Falsef(t, resolver.HasUnfulfilledPathRequests(), "unfulfilled path requests: \n%v", resolver.PrettyUnfulfilledPathRequests())
+	}
 }

 func (p *CatalogTester) assertPkgs(t *testing.T, pkgs []pkg.Package, relationships []artifact.Relationship) {

@@ -175,12 +235,31 @@ func (p *CatalogTester) assertPkgs(t *testing.T, pkgs []pkg.Package, relationshi
 		),
 	)

-	if diff := cmp.Diff(p.expectedPkgs, pkgs, p.compareOptions...); diff != "" {
-		t.Errorf("unexpected packages from parsing (-expected +actual)\n%s", diff)
+	{
+		var r diffReporter
+		var opts []cmp.Option
+
+		opts = append(opts, p.compareOptions...)
+		opts = append(opts, cmp.Reporter(&r))
+
+		if diff := cmp.Diff(p.expectedPkgs, pkgs, opts...); diff != "" {
+			t.Log("Specific Differences:\n" + r.String())
+			t.Errorf("unexpected packages from parsing (-expected +actual)\n%s", diff)
+		}
 	}

-	if diff := cmp.Diff(p.expectedRelationships, relationships, p.compareOptions...); diff != "" {
-		t.Errorf("unexpected relationships from parsing (-expected +actual)\n%s", diff)
+	{
+		var r diffReporter
+		var opts []cmp.Option
+
+		opts = append(opts, p.compareOptions...)
+		opts = append(opts, cmp.Reporter(&r))
+
+		if diff := cmp.Diff(p.expectedRelationships, relationships, opts...); diff != "" {
+			t.Log("Specific Differences:\n" + r.String())
+
+			t.Errorf("unexpected relationships from parsing (-expected +actual)\n%s", diff)
+		}
 	}

@@ -223,3 +302,28 @@ func AssertPackagesEqual(t *testing.T, a, b pkg.Package) {
 		t.Errorf("unexpected packages from parsing (-expected +actual)\n%s", diff)
 	}
 }
+
+// diffReporter is a simple custom reporter that only records differences detected during comparison.
+type diffReporter struct {
+	path  cmp.Path
+	diffs []string
+}
+
+func (r *diffReporter) PushStep(ps cmp.PathStep) {
+	r.path = append(r.path, ps)
+}
+
+func (r *diffReporter) Report(rs cmp.Result) {
+	if !rs.Equal() {
+		vx, vy := r.path.Last().Values()
+		r.diffs = append(r.diffs, fmt.Sprintf("%#v:\n\t-: %+v\n\t+: %+v\n", r.path, vx, vy))
+	}
+}
+
+func (r *diffReporter) PopStep() {
+	r.path = r.path[:len(r.path)-1]
+}
+
+func (r *diffReporter) String() string {
+	return strings.Join(r.diffs, "\n")
+}
@@ -29,5 +29,5 @@ func NewJavaCataloger(cfg Config) *generic.Cataloger {
 // Pom files list dependencies that may not be locally installed yet.
 func NewJavaPomCataloger() *generic.Cataloger {
 	return generic.NewCataloger("java-pom-cataloger").
-		WithParserByGlobs(parserPomXML, pomXMLDirGlob)
+		WithParserByGlobs(parserPomXML, "**/pom.xml")
 }
syft/pkg/cataloger/java/cataloger_test.go (new file, 87 lines)
@@ -0,0 +1,87 @@
package java

import (
	"testing"

	"github.com/anchore/syft/syft/pkg/cataloger/internal/pkgtest"
)

func Test_ArchiveCataloger_Globs(t *testing.T) {
	tests := []struct {
		name     string
		fixture  string
		expected []string
	}{
		{
			name:    "obtain java archive files",
			fixture: "test-fixtures/glob-paths",
			expected: []string{
				"java-archives/example.jar",
				"java-archives/example.war",
				"java-archives/example.ear",
				"java-archives/example.par",
				"java-archives/example.sar",
				"java-archives/example.jpi",
				"java-archives/example.hpi",
				"java-archives/example.lpkg",
				"archives/example.zip",
				"archives/example.tar",
				"archives/example.tar.gz",
				"archives/example.tgz",
				"archives/example.tar.bz",
				"archives/example.tar.bz2",
				"archives/example.tbz",
				"archives/example.tbz2",
				"archives/example.tar.br",
				"archives/example.tbr",
				"archives/example.tar.lz4",
				"archives/example.tlz4",
				"archives/example.tar.sz",
				"archives/example.tsz",
				"archives/example.tar.xz",
				"archives/example.txz",
				"archives/example.tar.zst",
				"archives/example.tzst",
				"archives/example.tar.zstd",
				"archives/example.tzstd",
			},
		},
	}

	for _, test := range tests {
		t.Run(test.name, func(t *testing.T) {
			pkgtest.NewCatalogTester().
				FromDirectory(t, test.fixture).
				ExpectsResolverContentQueries(test.expected).
				TestCataloger(t, NewJavaCataloger(Config{
					SearchUnindexedArchives: true,
					SearchIndexedArchives:   true,
				}))
		})
	}
}

func Test_POMCataloger_Globs(t *testing.T) {
	tests := []struct {
		name     string
		fixture  string
		expected []string
	}{
		{
			name:    "obtain java pom files",
			fixture: "test-fixtures/glob-paths",
			expected: []string{
				"src/pom.xml",
			},
		},
	}

	for _, test := range tests {
		t.Run(test.name, func(t *testing.T) {
			pkgtest.NewCatalogTester().
				FromDirectory(t, test.fixture).
				ExpectsResolverContentQueries(test.expected).
				TestCataloger(t, NewJavaPomCataloger())
		})
	}
}
@@ -18,7 +18,6 @@ import (
 )

 const pomXMLGlob = "*pom.xml"
-const pomXMLDirGlob = "**/pom.xml"

 var propertyMatcher = regexp.MustCompile("[$][{][^}]+[}]")
@@ -34,6 +34,9 @@ var genericTarGlobs = []string{
 	"**/*.txz",
 	// zst
 	"**/*.tar.zst",
+	"**/*.tzst",
+	"**/*.tar.zstd",
+	"**/*.tzstd",
 }

 // TODO: when the generic archive cataloger is implemented, this should be removed (https://github.com/anchore/syft/issues/246)
syft/pkg/cataloger/java/test-fixtures/glob-paths/archives/.gitignore (new vendored file, 2 lines)
@@ -0,0 +1,2 @@
# we want to override some of the root level ignores just for the fixtures that we know are safe
!*
@@ -0,0 +1 @@
example archive

(the one-line "example archive" stub above is repeated for each of the 20 new archive fixture files under test-fixtures/glob-paths/archives/)
syft/pkg/cataloger/java/test-fixtures/glob-paths/java-archives/.gitignore (new vendored file, 2 lines)
@@ -0,0 +1,2 @@
# we want to override some of the root level ignores just for the fixtures that we know are safe
!*
@@ -0,0 +1 @@
example archive

@@ -0,0 +1 @@
example archive
Some files were not shown because too many files have changed in this diff.