Integration test for Docker and perf stats #2489
Conversation
All (the pull request submitter and all commit authors) CLAs are signed, but one or more commits were authored or co-authored by someone other than the pull request submitter. We need to confirm that all authors are OK with their commits being contributed to this project. Please have them confirm that by leaving a comment that contains only the confirmation text. Note to project maintainer: there may be cases where the author cannot leave a comment, or the comment is not properly detected as consent. In those cases, you can manually confirm consent of the commit author(s).
CLAs look good, thanks!
@dashpole, you will find a proposal for extending integration tests in this PR. The change is that integration tests are now run both for a "normal" build of cAdvisor (i.e. with no tags) and for a custom one (with
build/integration-in-docker.sh (outdated):

```shell
    CADVISOR_ARGS="$CADVISOR_ARGS" /usr/local/bin/runner.sh build/integration.sh"
}

run_tests "-race" "sudo"
```
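The two-pass idea under discussion (run the distro-agnostic suite with no tags, then run the perf suite with a perf-enabling build tag) can be sketched as a small wrapper. This is a minimal illustration, not the PR's actual script: the tag name `libpfm4`, the `GO_FLAGS` variable, and the `run_suite` helper are all assumptions.

```shell
#!/usr/bin/env bash
# Sketch of a two-pass integration run: once with no build tags for the
# distro-agnostic tests, once with a perf-enabling tag. The tag name
# "libpfm4" and the GO_FLAGS variable are illustrative guesses only.
set -euo pipefail

run_suite() {
  local go_flags="$1"
  # The real script would launch build/integration.sh inside the runner
  # container; here we only print the command that would be executed.
  echo "GO_FLAGS='${go_flags}' build/integration.sh"
}

run_suite "-race"                # "normal" build, distro-agnostic tests
run_suite "-race -tags libpfm4"  # custom build with perf support
```

Keeping the two passes separate also supports the suggestion below of a dedicated target, so users who only want the distro-agnostic tests never see perf-related failures.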
Is there a reason why we want to do perf tests separately? Are they expected to fail in some scenarios?
Just read the comment above. I think this is a reasonable solution, although I think we should consider having a separate target. That way users who just want to run the "distro agnostic" tests can do so without having errors mixed in.
integration/tests/api/perf_test.go (outdated):

```go
assert.Nil(t, err)
assert.Len(t, info.Stats, 1)
assert.Greater(t, len(info.Stats[0].PerfStats), 0, "Length of info.Stats[0].PerfStats is not greater tahn zero")
```
s/tahn/than
```go
assert.Greater(t, len(info.Stats[0].PerfStats), 0, "Length of info.Stats[0].PerfStats is not greater tahn zero")
for k, stat := range info.Stats[0].PerfStats {
	// Everything beyond name is non-deterministic unfortunately.
	assert.Contains(t, []string{"context-switches", "cpu-migrations-custom"}, stat.Name, "Wrong metric name for key %d: %#v", k, stat)
```
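The whitelist check in the test (every observed perf event name must be one of the expected events) can be spot-checked outside the Go suite. A minimal shell analogue, with made-up sample event names; the real test reads the names from the v1 API response:

```shell
#!/usr/bin/env bash
# Mirror of the test's name whitelist: every observed perf event name
# must be one of the expected events. The sample data below is made up.
set -euo pipefail

allowed=("context-switches" "cpu-migrations-custom")
observed=("context-switches" "cpu-migrations-custom" "context-switches")

check_name() {
  local name="$1" a
  for a in "${allowed[@]}"; do
    [[ "$name" == "$a" ]] && return 0
  done
  return 1
}

for name in "${observed[@]}"; do
  check_name "$name" || { echo "unexpected perf event: $name" >&2; exit 1; }
done
echo "all perf event names valid"
```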
Are we guaranteed to have more than 0 of either of these events? Even that check can sometimes catch problems...
In waitForContainerInfo() we wait for v1.ContainerInfo to be available. Perf stats are part of this structure and I would consider it a bug if they were not available.
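For context, a waitForContainerInfo-style poll (retry until the data is available or a budget is exhausted) can be sketched generically. The helper below is an illustrative shell analogue, not the test's actual Go code; its name and parameters are assumptions.

```shell
#!/usr/bin/env bash
# Generic polling helper in the spirit of waitForContainerInfo: retry a
# command until it succeeds or the attempt budget runs out.
set -euo pipefail

wait_for() {
  local attempts="$1" delay="$2"; shift 2
  local i
  for ((i = 0; i < attempts; i++)); do
    if "$@"; then
      return 0
    fi
    sleep "$delay"
  done
  return 1
}

# Example: wait (up to ~1s total) for a marker file to appear.
touch /tmp/cadvisor-ready-example
wait_for 5 0.2 test -f /tmp/cadvisor-ready-example && echo "container info available"
rm -f /tmp/cadvisor-ready-example
```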
cc @dims
+1 to "perf tests separately" please! looks reasonable otherwise :)
Signed-off-by: Maciej "Iwan" Iwanowski <[email protected]>
@dashpole @dims - is this approach acceptable? The changes are:
Eventually it would make sense to execute unit tests in Docker too, I think. There are some unit tests against the libpfm4 part of the code that can be executed without installing the library. What is your opinion?
@iwankgb works for me!
dashpole
left a comment
lgtm
Integration tests for Docker container and perf stats.