
[exporter/elasticsearch] goroutine leaks in tests #35638

Closed
mauri870 opened this issue Oct 7, 2024 · 8 comments

Comments


mauri870 commented Oct 7, 2024

Component(s)

exporter/elasticsearch

Describe the issue you're reporting

Description

The goleak tool reports goroutine leaks in tests.

Steps to Reproduce

cd exporter/elasticsearchexporter
go test ./... -v

Expected Result

No leaks reported by goleak.

Actual Result

goleak: Errors on successful test run: found unexpected goroutines:
[Goroutine 765 in state select, with net/http.(*persistConn).writeLoop on top of the stack:
net/http.(*persistConn).writeLoop(0xc000485320)
        /home/mauri870/gopath/pkg/mod/golang.org/[email protected]/src/net/http/transport.go:2519 +0xe7
created by net/http.(*Transport).dialConn in goroutine 823
        /home/mauri870/gopath/pkg/mod/golang.org/[email protected]/src/net/http/transport.go:1875 +0x15a5
 Goroutine 764 in state IO wait, with internal/poll.runtime_pollWait on top of the stack:
internal/poll.runtime_pollWait(0x70ffb101fb60, 0x72)
        /home/mauri870/gopath/pkg/mod/golang.org/[email protected]/src/runtime/netpoll.go:351 +0x85
internal/poll.(*pollDesc).wait(0xc000418d80?, 0xc000590000?, 0x0)
        /home/mauri870/gopath/pkg/mod/golang.org/[email protected]/src/internal/poll/fd_poll_runtime.go:84 +0x27
internal/poll.(*pollDesc).waitRead(...)
        /home/mauri870/gopath/pkg/mod/golang.org/[email protected]/src/internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Read(0xc000418d80, {0xc000590000, 0x1000, 0x1000})

[...]

Related issues

@mauri870 mauri870 added the needs triage New item requiring triage label Oct 7, 2024

github-actions bot commented Oct 7, 2024

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

@mauri870 mauri870 changed the title goroutine leaks in tests [exporter/elasticsearch] goroutine leaks in tests Oct 7, 2024

mauri870 commented Oct 7, 2024

As of now, two leaks are detected. Following the goleak docs, I was able to pinpoint the exact tests that leak:

$ export GOTOOLCHAIN=go1.23.1
$ export GOSUMDB=sum.golang.org

$ cd exporter/elasticsearchexporter
$ go test -c -o tests; for test in $(go test -list . | grep -E "^(Test|Example)"); do ./tests -test.run "^$test\$" &>/dev/null && echo -n "." || echo -e "\n$test failed"; done

.......................
TestComponentLifecycle failed
................

I was unable to fix this particular test, but commenting it out makes goleak report TestExporterMetrics instead, which I fixed in #35639. After that, no more leaks are reported.

I would appreciate it if someone could take a look at TestComponentLifecycle.


carsonip commented Oct 7, 2024

/label -needs-triage

@github-actions github-actions bot removed the needs triage New item requiring triage label Oct 7, 2024

mauri870 commented Oct 7, 2024

It turns out TestExporterMetrics was being reported because of a flaky assertion. We should probably fix TestComponentLifecycle first and then verify that goleak reports no other tests.


carsonip commented Oct 18, 2024

I have found another leak from an unclosed bulk indexer; fixing it in 210306e as part of #35865.


axw commented Nov 21, 2024

Is this still relevant? I'm not able to reproduce the failure locally.

@carsonip

Can no longer reproduce any leaks. This issue should be good to close.

@ChrsMark

> Can no longer reproduce any leaks. This issue should be good to close.

I'm gonna close this for now then. Feel free to re-open if there is still evidence that it's not fixed.
