Describe the bug

JSONRPC prematurely closes the connection with this panic:
```
http: superfluous response.WriteHeader call from github.com/tendermint/tendermint/rpc/jsonrpc/server.(*responseWriterWrapper).WriteHeader (http_server.go:225)
Panic during RPC panic recovery: write tcp 127.0.0.1:28657->127.0.0.1:44936: i/o timeout
goroutine 8687815 [running]:
runtime/debug.Stack(0xc001b270a8, 0x126d2a0, 0xc004577bd0)
	/usr/lib/go-1.16/src/runtime/debug/stack.go:24 +0x9f
github.com/tendermint/tendermint/rpc/jsonrpc/server.RecoverAndLogHandler.func1.1(0x16a1230, 0xc0002007e0)
	/root/go/pkg/mod/github.com/tendermint/[email protected]/rpc/jsonrpc/server/http_server.go:162 +0x76
panic(0x126d2a0, 0xc004577bd0)
	/usr/lib/go-1.16/src/runtime/panic.go:965 +0x1b9
github.com/tendermint/tendermint/rpc/jsonrpc/server.WriteRPCResponseHTTPError(0x16a0c00, 0xc0076b2e40, 0x1f4, 0x13a26ca, 0x3, 0x1688a00, 0x16630c8, 0x0, 0x0, 0x0, ...)
	/root/go/pkg/mod/github.com/tendermint/[email protected]/rpc/jsonrpc/server/http_server.go:105 +0x29c
github.com/tendermint/tendermint/rpc/jsonrpc/server.RecoverAndLogHandler.func1.2(0xc0076b2e40, 0x16a60a8, 0xc0010a05e0, 0xc03902fc502625af, 0x75accab3043c, 0x1e51de0, 0xc0015de200)
	/root/go/pkg/mod/github.com/tendermint/[email protected]/rpc/jsonrpc/server/http_server.go:193 +0x378
panic(0x126d2a0, 0xc004577bd0)
	/usr/lib/go-1.16/src/runtime/panic.go:965 +0x1b9
github.com/tendermint/tendermint/rpc/jsonrpc/server.WriteRPCResponseHTTP(0x16a0c00, 0xc0076b2e40, 0x13a26ca, 0x3, 0x1688a00, 0xc003d4b4d8, 0xc00288a000, 0x9bdd, 0xf07a, 0x0)
	/root/go/pkg/mod/github.com/tendermint/[email protected]/rpc/jsonrpc/server/http_server.go:117 +0x297
github.com/tendermint/tendermint/rpc/jsonrpc/server.makeHTTPHandler.func2(0x16a0c00, 0xc0076b2e40, 0xc0015de200)
	/root/go/pkg/mod/github.com/tendermint/[email protected]/rpc/jsonrpc/server/http_uri_handler.go:64 +0x6f3
net/http.HandlerFunc.ServeHTTP(0xc001ee6600, 0x16a0c00, 0xc0076b2e40, 0xc0015de200)
	/usr/lib/go-1.16/src/net/http/server.go:2049 +0x44
net/http.(*ServeMux).ServeHTTP(0xc003aa2c40, 0x16a0c00, 0xc0076b2e40, 0xc0015de200)
	/usr/lib/go-1.16/src/net/http/server.go:2428 +0x1ad
github.com/tendermint/tendermint/rpc/jsonrpc/server.maxBytesHandler.ServeHTTP(0x16871e0, 0xc003aa2c40, 0xf4240, 0x16a0c00, 0xc0076b2e40, 0xc0015de200)
	/root/go/pkg/mod/github.com/tendermint/[email protected]/rpc/jsonrpc/server/http_server.go:240 +0xd9
github.com/tendermint/tendermint/rpc/jsonrpc/server.RecoverAndLogHandler.func1(0x16a1230, 0xc0002007e0, 0xc0015de200)
	/root/go/pkg/mod/github.com/tendermint/[email protected]/rpc/jsonrpc/server/http_server.go:213 +0x39a
net/http.HandlerFunc.ServeHTTP(0xc001d72f30, 0x16a1230, 0xc0002007e0, 0xc0015de200)
	/usr/lib/go-1.16/src/net/http/server.go:2049 +0x44
net/http.serverHandler.ServeHTTP(0xc000f44540, 0x16a1230, 0xc0002007e0, 0xc0015de200)
	/usr/lib/go-1.16/src/net/http/server.go:2867 +0xa3
net/http.(*conn).serve(0xc00b067360, 0x16a5858, 0xc004b2b6c0)
	/usr/lib/go-1.16/src/net/http/server.go:1932 +0x8cd
created by net/http.(*Server).Serve
	/usr/lib/go-1.16/src/net/http/server.go:2993 +0x39b
```
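For context, the trace above suggests the response write fails on a timed-out connection, panics, and the recovery path then tries to write an error response on the same ResponseWriter, producing the "superfluous response.WriteHeader" warning and a second panic. The following is a minimal net/http sketch of that failure mode, not Tendermint's actual code; the handler names and the simulated panic message are hypothetical:

```go
package main

import (
	"fmt"
	"log"
	"net/http"
)

// recoverAndLog mimics a panic-recovery wrapper around an RPC handler.
func recoverAndLog(next http.Handler) http.Handler {
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		defer func() {
			if rec := recover(); rec != nil {
				log.Printf("Panic during RPC handling: %v", rec)
				// If the wrapped handler already committed headers before
				// panicking, this second WriteHeader triggers the
				// "http: superfluous response.WriteHeader call" warning,
				// and writing to a dead connection can panic again.
				w.WriteHeader(http.StatusInternalServerError)
				fmt.Fprintln(w, `{"error":"internal error"}`)
			}
		}()
		next.ServeHTTP(w, r)
	})
}

// flakyHandler stands in for an RPC method whose response write fails
// after the status line has already been sent.
func flakyHandler(w http.ResponseWriter, r *http.Request) {
	w.WriteHeader(http.StatusOK) // headers are now committed
	// Simulated write failure surfacing as a panic, as the trace above
	// shows happening inside WriteRPCResponseHTTP.
	panic("write tcp 127.0.0.1:28657->127.0.0.1:44936: i/o timeout")
}

func main() {
	mux := http.NewServeMux()
	mux.Handle("/", recoverAndLog(http.HandlerFunc(flakyHandler)))
	log.Fatal(http.ListenAndServe("127.0.0.1:28657", mux))
}
```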
How to reproduce

Unclear. It seems to happen with a variety of queries:
```
2021/07/30 06:39:50 [error] 60150#60150: *176853 upstream prematurely closed connection while reading response header from upstream, client: 10.42.42.215, server: , request: "GET /emoney/api/auth/accounts/emoney14ep4hfm733sfvuaxt78mf4fkq4jzrr2mmvqe6m HTTP/1.1", upstream: "http://127.0.0.1:3317/auth/accounts/emoney14ep4hfm733sfvuaxt78mf4fkq4jzrr2mmvqe6m"
2021/07/30 06:48:55 [error] 60150#60150: *177430 upstream prematurely closed connection while reading response header from upstream, client: 10.42.43.182, server: , request: "GET /emoney/api/supply/circulating HTTP/1.1", upstream: "http://127.0.0.1:3317/supply/circulating"
2021/07/30 06:51:57 [error] 60150#60150: *177462 upstream prematurely closed connection while reading response header from upstream, client: 10.42.44.234, server: , request: "GET /emoney/api/supply/circulating HTTP/1.1", upstream: "http://127.0.0.1:3317/supply/circulating"
2021/07/30 06:51:57 [error] 60150#60150: *177776 upstream prematurely closed connection while reading response header from upstream, client: 10.42.43.182, server: , request: "GET /emoney/api/staking/pool HTTP/1.1", upstream: "http://127.0.0.1:3317/staking/pool"
2021/07/30 06:51:57 [error] 60150#60150: *177602 upstream prematurely closed connection while reading response header from upstream, client: 10.42.42.215, server: , request: "GET /emoney/api/staking/validators/emoneyvaloper1qjk97vct5gs995xmf9pk0zxt9ptpdyzpkpwt6v/delegations HTTP/1.1", upstream: "http://127.0.0.1:3317/staking/validators/emoneyvaloper1qjk97vct5gs995xmf9pk0zxt9ptpdyzpkpwt6v/delegations"
2021/07/30 06:53:10 [error] 60150#60150: *177602 upstream prematurely closed connection while reading response header from upstream, client: 10.42.42.215, server: , request: "GET /emoney/api/supply/circulating HTTP/1.1", upstream: "http://127.0.0.1:3317/supply/circulating"
```
However, it seems more prevalent when requests arrive in bursts, so this could be a concurrency issue:
```
10.42.42.215 - - [30/Jul/2021:06:51:57 +0000] "GET /cosmoshub/health HTTP/1.1" 200 50 "-" "Amazon CloudFront"
10.42.43.182 - - [30/Jul/2021:06:51:57 +0000] "GET /emoney/health HTTP/1.1" 200 50 "-" "Amazon CloudFront"
10.42.44.234 - - [30/Jul/2021:06:51:57 +0000] "GET /emoney/api/supply/circulating HTTP/1.1" 502 166 "-" "Amazon CloudFront"
10.42.43.182 - - [30/Jul/2021:06:51:57 +0000] "GET /emoney/api/staking/pool HTTP/1.1" 502 166 "-" "Amazon CloudFront"
10.42.42.215 - - [30/Jul/2021:06:51:57 +0000] "GET /emoney/api/staking/validators/emoneyvaloper1qjk97vct5gs995xmf9pk0zxt9ptpdyzpkpwt6v/delegations HTTP/1.1" 502 166 "-" "Amazon CloudFront"
10.42.43.182 - - [30/Jul/2021:06:51:57 +0000] "GET /emoney/api/auth/accounts/emoney14ep4hfm733sfvuaxt78mf4fkq4jzrr2mmvqe6m HTTP/1.1" 200 354 "-" "Amazon CloudFront"
10.42.44.234 - - [30/Jul/2021:06:51:57 +0000] "GET /emoney/api/supply/total HTTP/1.1" 200 278 "-" "Amazon CloudFront"
10.42.42.215 - - [30/Jul/2021:06:51:57 +0000] "GET /emoney/api/distribution/community_pool HTTP/1.1" 200 339 "-" "Amazon CloudFront"
10.42.43.182 - - [30/Jul/2021:06:51:57 +0000] "GET /emoney/api/staking/validators/emoneyvaloper1zgv6tqess9q6y4cj28ldpjllrqlyzqqh80fpgu/delegations HTTP/1.1" 200 3656 "-" "Amazon CloudFront"
```
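A rough way to probe the burst theory is to fire a batch of concurrent requests at the node's REST endpoint and count non-200 responses. This is only a sketch: the base URL and paths below are assumptions taken from the nginx logs above, not a confirmed reproduction recipe.

```go
package main

import (
	"fmt"
	"net/http"
	"sync"
)

func main() {
	// Hypothetical local REST endpoint, matching the upstream in the logs above.
	base := "http://127.0.0.1:3317"
	paths := []string{
		"/supply/circulating",
		"/supply/total",
		"/staking/pool",
		"/distribution/community_pool",
	}

	var wg sync.WaitGroup
	// Fire several bursts of concurrent requests to mimic the CloudFront traffic pattern.
	for burst := 0; burst < 10; burst++ {
		for _, p := range paths {
			wg.Add(1)
			go func(url string) {
				defer wg.Done()
				resp, err := http.Get(url)
				if err != nil {
					fmt.Println("error:", url, err)
					return
				}
				resp.Body.Close()
				if resp.StatusCode != http.StatusOK {
					fmt.Println("non-200:", resp.StatusCode, url)
				}
			}(base + p)
		}
	}
	wg.Wait()
}
```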