This repository has been archived by the owner on Apr 26, 2024. It is now read-only.
All times are UTC, and all log snippets are from t2bot.io's Synapse 0.20 instance. Logs are large, and may have sensitive information in them. Please message me (@travis:t2l.io) for the full log package. Snippets for clarity are included below.
At 18:33:10,723 on April 19th, 2017, @voyager:t2bot.io attempted to join #freenode_#ubuntu:matrix.org (!DeuJfzJakkOlzOGEdi:matrix.org). That particular room has tens of thousands of users in it (possibly more), and likely quite a few participating servers (I've not joined the room myself).
I'm not sure whether this affects the outcome of the analysis, but voyager does some immediate processing of room information once joined. It relies on cached data as much as possible, but it does query room state and user state for each member in the room for its own internal processes.
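To illustrate why that matters in a room this size, here is a hypothetical sketch (not voyager's actual code) of a cache-first member-state lookup: on a cold cache, one join still fans out into one lookup per member, so tens of thousands of members means tens of thousands of round-trips.

```python
# Hypothetical sketch of a cache-first state lookup, as described above.
# `fetch_member_state` stands in for a real lookup such as
# GET /_matrix/client/r0/rooms/{roomId}/state/m.room.member/{userId}.

class MemberStateCache:
    def __init__(self, fetch_member_state):
        self._fetch = fetch_member_state
        self._cache = {}
        self.fetches = 0  # count of actual round-trips

    def get(self, room_id, user_id):
        key = (room_id, user_id)
        if key not in self._cache:
            self.fetches += 1
            self._cache[key] = self._fetch(room_id, user_id)
        return self._cache[key]


def process_room(cache, room_id, members):
    # One state lookup per member: O(n) round-trips on a cold cache.
    return {m: cache.get(room_id, m) for m in members}
```

Subsequent passes over the same room are cheap, but the first pass after a join of a huge room is not.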
Just over a minute later, the join attempt failed:
2017-04-19 18:34:29,145 - synapse.federation.federation_client - 608 - ERROR - POST-2346789- Failed to send_join via matrix.org: [<twisted.python.failure.Failure twisted.web._newclient.ConnectionAborted: >, <twisted.python.failure.Failure twisted.web.http._DataLoss: >]
Traceback (most recent call last):
File "/usr/local/lib/python2.7/dist-packages/synapse/federation/federation_client.py", line 541, in send_join
content=pdu.get_pdu_json(time_now),
ResponseFailed: [<twisted.python.failure.Failure twisted.web._newclient.ConnectionAborted: >, <twisted.python.failure.Failure twisted.web.http._DataLoss: >]
2017-04-19 18:34:29,146 - synapse.handlers.federation - 953 - INFO - POST-2346789- Processing queued PDU $149262682795251bZRhq:matrix.org which was received while we were joining !DeuJfzJakkOlzOGEdi:matrix.org
2017-04-19 18:34:29,147 - synapse.util.async - 208 - INFO - POST-2346789- Releasing linearizer lock 'member' for key ('!DeuJfzJakkOlzOGEdi:matrix.org',)
2017-04-19 18:34:29,149 - synapse.http.server - 139 - ERROR - POST-2346789- Failed handle request synapse.http.server._async_render on <synapse.rest.ClientRestResource object at 0x7f2daabbb3d0>: <XForwardedForRequest at 0x7f2d88d487a0 method=POST uri=/_matrix/client/r0/join/%23freenode_%23ubuntu%3Amatrix.org?access_token=<redacted> clientproto=HTTP/1.0 site=8008>
Traceback (most recent call last):
File "/usr/local/lib/python2.7/dist-packages/synapse/http/server.py", line 116, in wrapped_request_handler
yield request_handler(self, request, request_metrics)
File "/usr/local/lib/python2.7/dist-packages/twisted/internet/defer.py", line 1299, in _inlineCallbacks
result = result.throwExceptionIntoGenerator(g)
File "/usr/local/lib/python2.7/dist-packages/twisted/python/failure.py", line 393, in throwExceptionIntoGenerator
return g.throw(self.type, self.value, self.tb)
File "/usr/local/lib/python2.7/dist-packages/synapse/http/server.py", line 255, in _async_render
callback_return = yield callback(request, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/twisted/internet/defer.py", line 1299, in _inlineCallbacks
result = result.throwExceptionIntoGenerator(g)
File "/usr/local/lib/python2.7/dist-packages/twisted/python/failure.py", line 393, in throwExceptionIntoGenerator
return g.throw(self.type, self.value, self.tb)
File "/usr/local/lib/python2.7/dist-packages/synapse/rest/client/v1/room.py", line 269, in on_POST
third_party_signed=content.get("third_party_signed", None),
File "/usr/local/lib/python2.7/dist-packages/twisted/internet/defer.py", line 1299, in _inlineCallbacks
result = result.throwExceptionIntoGenerator(g)
File "/usr/local/lib/python2.7/dist-packages/twisted/python/failure.py", line 393, in throwExceptionIntoGenerator
return g.throw(self.type, self.value, self.tb)
File "/usr/local/lib/python2.7/dist-packages/synapse/handlers/room_member.py", line 174, in update_membership
content=content,
File "/usr/local/lib/python2.7/dist-packages/twisted/internet/defer.py", line 1299, in _inlineCallbacks
result = result.throwExceptionIntoGenerator(g)
File "/usr/local/lib/python2.7/dist-packages/twisted/python/failure.py", line 393, in throwExceptionIntoGenerator
return g.throw(self.type, self.value, self.tb)
File "/usr/local/lib/python2.7/dist-packages/synapse/handlers/room_member.py", line 268, in _update_membership
remote_room_hosts, room_id, target, content
File "/usr/local/lib/python2.7/dist-packages/twisted/internet/defer.py", line 1299, in _inlineCallbacks
result = result.throwExceptionIntoGenerator(g)
File "/usr/local/lib/python2.7/dist-packages/twisted/python/failure.py", line 393, in throwExceptionIntoGenerator
return g.throw(self.type, self.value, self.tb)
File "/usr/local/lib/python2.7/dist-packages/synapse/handlers/room_member.py", line 138, in remote_join
content,
File "/usr/local/lib/python2.7/dist-packages/twisted/internet/defer.py", line 1299, in _inlineCallbacks
result = result.throwExceptionIntoGenerator(g)
File "/usr/local/lib/python2.7/dist-packages/twisted/python/failure.py", line 393, in throwExceptionIntoGenerator
return g.throw(self.type, self.value, self.tb)
File "/usr/local/lib/python2.7/dist-packages/synapse/handlers/federation.py", line 889, in do_invite_join
ret = yield self.replication_layer.send_join(target_hosts, event)
File "/usr/local/lib/python2.7/dist-packages/twisted/internet/defer.py", line 1299, in _inlineCallbacks
result = result.throwExceptionIntoGenerator(g)
File "/usr/local/lib/python2.7/dist-packages/twisted/python/failure.py", line 393, in throwExceptionIntoGenerator
return g.throw(self.type, self.value, self.tb)
File "/usr/local/lib/python2.7/dist-packages/synapse/federation/federation_client.py", line 611, in send_join
raise RuntimeError("Failed to send to any server.")
RuntimeError: Failed to send to any server.
2017-04-19 18:34:29,151 - synapse.access.http.8008 - 91 - INFO - POST-2346789- 138.197.141.99 - 8008 - {@voyager:t2bot.io} Processed request: 78427ms (11ms, 3ms) (5ms/5) 55B 500 "POST /_matrix/client/r0/join/%23freenode_%23ubuntu%3Amatrix.org?access_token=<redacted> HTTP/1.0" "None"
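The RuntimeError at the bottom of the traceback comes from send_join exhausting its candidate servers. Conceptually (a simplified sketch of the pattern, not Synapse's actual implementation), it is a try-each-host loop:

```python
# Simplified sketch of the try-each-host pattern behind
# "Failed to send to any server." -- not Synapse's real code.

def send_join_via(hosts, send):
    """Try each candidate host in order; return the first success."""
    last_error = None
    for host in hosts:
        try:
            return send(host)
        except Exception as e:  # e.g. ConnectionAborted / _DataLoss
            last_error = e      # remember why the last attempt failed
    raise RuntimeError("Failed to send to any server.") from last_error
```

In this case matrix.org appears to be the only candidate, so a single aborted connection is enough to surface the error, even though (as the next log lines suggest) the remote side had effectively processed the join.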
Shortly after, it managed to re-send the PDU just fine and started backfilling data. Some curious log lines appear after this point, which may relate to existing issues:
2017-04-19 18:34:32,488 - synapse.access.https.8448 - 59 - INFO - PUT-2347053- 83.136.250.51 - 8448 - Received request: PUT /_matrix/federation/v1/send/1492620916994/
2017-04-19 18:34:32,491 - synapse.federation.transport.server - 138 - INFO - PUT-2347053- Request from matrix.org
2017-04-19 18:34:32,492 - synapse.federation.transport.server - 244 - INFO - PUT-2347053- Received txn 1492620916994 from matrix.org. (PDUs: 1, EDUs: 0, failures: 0)
2017-04-19 18:34:32,503 - synapse.handlers.federation - 180 - INFO - PUT-2347053- Still missing 1 events for room '!DeuJfzJakkOlzOGEdi:matrix.org': ['$149262684695670mVuwM:matrix.org']...
(Note that the above is an incoming request. If I'm reading the logs correctly, there are two ongoing requests that are backfilling.)
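The "Still missing 1 events" line is the incoming-PDU path discovering a gap: when a received event references prev_events the server doesn't have, it has to fetch the missing events from the sender before it can apply the new one. Roughly (a simplified sketch, not Synapse's actual handler):

```python
# Simplified sketch of gap detection on an incoming PDU -- the source of
# the "Still missing N events" log line. Not Synapse's actual handler.

def missing_prevs(pdu, store):
    """Return prev_event IDs referenced by `pdu` that we don't have."""
    return [eid for eid in pdu["prev_events"] if eid not in store]


def handle_pdu(pdu, store, fetch_missing):
    missing = missing_prevs(pdu, store)
    if missing:
        print("Still missing %d events for room %r: %r"
              % (len(missing), pdu["room_id"], missing))
        for ev in fetch_missing(missing):   # ask the sender to fill the gap
            store[ev["event_id"]] = ev
    store[pdu["event_id"]] = pdu
```

While a big room is being backfilled, these gap-filling fetches compete with everything else the server is doing.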
2017-04-19 18:48:59,967 - synapse.http.matrixfederationclient - 211 - WARNING - PUT-2347053- {GET-O-303880} Sending request failed to matrix.org: GET matrix://matrix.org/_matrix/federation/v1/event/$1472131296205420PQhiT:matrix.org/: DeferredTimedOutError - DeferredTimedOutError: 504: Timed out
2017-04-19 19:23:53,199 - synapse.util.async - 208 - INFO - PUT-2350438- Releasing linearizer lock 'fed_room_pdu' for key '!PeOkQLnTUKVtuXZJeT:t2l.io'
2017-04-19 19:23:53,200 - synapse.federation.federation_server - 178 - ERROR - PUT-2350438- Failed to handle PDU
Traceback (most recent call last):
File "/usr/local/lib/python2.7/dist-packages/synapse/federation/federation_server.py", line 171, in on_incoming_transaction
yield self._handle_received_pdu(transaction.origin, pdu)
File "/usr/local/lib/python2.7/dist-packages/twisted/internet/defer.py", line 1299, in _inlineCallbacks
result = result.throwExceptionIntoGenerator(g)
File "/usr/local/lib/python2.7/dist-packages/twisted/python/failure.py", line 393, in throwExceptionIntoGenerator
return g.throw(self.type, self.value, self.tb)
File "/usr/local/lib/python2.7/dist-packages/synapse/federation/federation_server.py", line 524, in _handle_received_pdu
yield self.handler.on_receive_pdu(origin, pdu, get_missing=True)
File "/usr/local/lib/python2.7/dist-packages/twisted/internet/defer.py", line 1299, in _inlineCallbacks
result = result.throwExceptionIntoGenerator(g)
File "/usr/local/lib/python2.7/dist-packages/twisted/python/failure.py", line 393, in throwExceptionIntoGenerator
return g.throw(self.type, self.value, self.tb)
File "/usr/local/lib/python2.7/dist-packages/synapse/handlers/federation.py", line 172, in on_receive_pdu
origin, pdu, prevs, min_depth
File "/usr/local/lib/python2.7/dist-packages/twisted/internet/defer.py", line 1299, in _inlineCallbacks
result = result.throwExceptionIntoGenerator(g)
File "/usr/local/lib/python2.7/dist-packages/twisted/python/failure.py", line 393, in throwExceptionIntoGenerator
return g.throw(self.type, self.value, self.tb)
File "/usr/local/lib/python2.7/dist-packages/synapse/handlers/federation.py", line 265, in _get_missing_events_for_pdu
timeout=10000,
File "/usr/local/lib/python2.7/dist-packages/twisted/internet/defer.py", line 1299, in _inlineCallbacks
result = result.throwExceptionIntoGenerator(g)
File "/usr/local/lib/python2.7/dist-packages/twisted/python/failure.py", line 393, in throwExceptionIntoGenerator
return g.throw(self.type, self.value, self.tb)
File "/usr/local/lib/python2.7/dist-packages/synapse/federation/federation_client.py", line 741, in get_missing_events
timeout=timeout,
File "/usr/local/lib/python2.7/dist-packages/twisted/internet/defer.py", line 1299, in _inlineCallbacks
result = result.throwExceptionIntoGenerator(g)
File "/usr/local/lib/python2.7/dist-packages/twisted/python/failure.py", line 393, in throwExceptionIntoGenerator
return g.throw(self.type, self.value, self.tb)
File "/usr/local/lib/python2.7/dist-packages/synapse/federation/transport/client.py", line 432, in get_missing_events
timeout=timeout,
File "/usr/local/lib/python2.7/dist-packages/twisted/internet/defer.py", line 1299, in _inlineCallbacks
result = result.throwExceptionIntoGenerator(g)
File "/usr/local/lib/python2.7/dist-packages/twisted/python/failure.py", line 393, in throwExceptionIntoGenerator
return g.throw(self.type, self.value, self.tb)
File "/usr/local/lib/python2.7/dist-packages/synapse/http/matrixfederationclient.py", line 384, in post_json
ignore_backoff=ignore_backoff,
File "/usr/local/lib/python2.7/dist-packages/twisted/internet/defer.py", line 1299, in _inlineCallbacks
result = result.throwExceptionIntoGenerator(g)
File "/usr/local/lib/python2.7/dist-packages/twisted/python/failure.py", line 393, in throwExceptionIntoGenerator
return g.throw(self.type, self.value, self.tb)
File "/usr/local/lib/python2.7/dist-packages/synapse/http/matrixfederationclient.py", line 134, in _request
ignore_backoff=ignore_backoff,
File "/usr/local/lib/python2.7/dist-packages/twisted/internet/defer.py", line 1301, in _inlineCallbacks
result = g.send(result)
File "/usr/local/lib/python2.7/dist-packages/synapse/util/retryutils.py", line 82, in get_retry_limiter
destination=destination,
NotRetryingDestination: Not retrying server t2l.io.
Note: this is for an unrelated room, but the events that unfolded appear to have caused issues with pre-existing rooms.
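The NotRetryingDestination at the bottom is Synapse's federation backoff refusing to contact t2l.io because a recent attempt against it failed. The gate works roughly like this (a minimal sketch of the idea, not retryutils.py itself; the interval values are made up):

```python
import time

# Minimal sketch of a per-destination retry gate, as in the
# NotRetryingDestination error above. Not Synapse's actual retryutils.

class NotRetryingDestination(Exception):
    pass


class RetryGate:
    def __init__(self, min_interval=1.0, backoff_factor=2.0, clock=time.time):
        self._state = {}  # destination -> (retry_at, current_interval)
        self._min = min_interval
        self._factor = backoff_factor
        self._clock = clock

    def check(self, destination):
        """Raise if `destination` is still inside its backoff window."""
        retry_at, _ = self._state.get(destination, (0, 0))
        if self._clock() < retry_at:
            raise NotRetryingDestination(
                "Not retrying server %s." % destination)

    def record_failure(self, destination):
        _, interval = self._state.get(destination, (0, 0))
        interval = max(self._min, interval * self._factor)
        self._state[destination] = (self._clock() + interval, interval)

    def record_success(self, destination):
        self._state.pop(destination, None)
```

The upshot here: fetching the missing prev-events from t2l.io was refused outright because that destination happened to be inside its backoff window, so the whole PDU for the unrelated room failed to be handled.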
There are likely more issues, and someone much more experienced in analyzing the logs (which are horrendously long) may have better luck diagnosing them. There also appear to be multiple cases where the entire server stalls for several seconds before the log blurs on past, such as at this boundary:
2017-04-19 21:38:25,856 - synapse.http.outbound - 152 - INFO - - {PUT-O-366436} [ptman.name] Sending request: PUT matrix://ptman.name/_matrix/federation/v1/send/1492011074836/
2017-04-19 21:38:55,145 - twisted - 131 - INFO - - Starting factory <synapse.crypto.keyclient.SynapseKeyClientFactory instance at 0x7f2ceb438050>
(turt2live changed the issue title from "Joining #freenode_#ubuntu took 3+ hours (voyager/t2bot.io)" to "Joining #freenode_#ubuntu took 4 hours (voyager/t2bot.io)" on Apr 20, 2017.)
One more curious warning, at 18:34:41 (during the post-join flood), shows the content-hash check redacting events:
2017-04-19 18:34:41,566 - synapse.federation.federation_base - 142 - WARNING - POST-2346789- Event content has been tampered, redacting $14828390682765754NYgej:matrix.org: {'origin': 'matrix.org', 'signatures': {'matrix.org': {'ed25519:auto': 'bg5zWUE+hpeOJfM3AJhtd1hdBbjD3/wlsDxDSPNkt81NV9+hyiChb0S3X4xfqtYEzwUqssE6S2tK3zWArnN8Bw'}}, 'origin_server_ts': 1482839068433, 'sender': '@freenode_gerry_dublin_dub:matrix.org', 'event_id': '$14828390682765754NYgej:matrix.org', 'prev_events': [['$14828390582765725izalW:matrix.org', {'sha256': 'ABLHAtmB4MILBirO7GnFLbodGUpYc+FbTRBuAL2RqH4'}]], 'membership': 'join', 'state_key': '@freenode_gerry_dublin_dub:matrix.org', 'unsigned': {'age': 9787813094}, 'content': {'membership': 'join'}, 'depth': 762377, 'prev_state': [], 'room_id': '!DeuJfzJakkOlzOGEdi:matrix.org', 'auth_events': [['$1438843017169306rgPBE:matrix.org', {'sha256': 'b0upddEeaxEVYH4NT/+t1nFN68AyOU+lAKiGDENGl0c'}], ['$1438843018169308XyBOV:matrix.org', {'sha256': '0txEe+YSQm68eTTbWzXOwLju5jRUB14L1E8iCCUcLUg'}], ['$1438843020169313RcxmM:matrix.org', {'sha256': 'hRY6/gURv+TMHaT0TUPMUTDAxbvzfU1qC3WYAIFgI1o'}]], 'hashes': {'sha256': 'xu7mjuWgD6Rdh8EryJDK9A8O0AXHIfQHqU0CdZROOEM'}, 'type': u'm.room.member'}
This appears many, many times, just in different forms.
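That "tampered" warning is the federation content-hash check: the receiving server recomputes the event's content hash and, on mismatch, redacts the event rather than rejecting it outright. A minimal sketch of the recompute-and-compare idea (the real Matrix canonical-JSON rules, unpadded-base64 hash encoding, and redaction algorithm are more involved than this):

```python
import hashlib
import json

# Minimal sketch of the content-hash check behind the "tampered" warning.
# Matrix actually uses canonical JSON and unpadded base64 for the hash;
# this only illustrates the recompute-and-compare idea.

def content_hash(event):
    # Hash the event minus the fields excluded from the content hash.
    stripped = {k: v for k, v in event.items()
                if k not in ("signatures", "unsigned", "hashes")}
    canonical = json.dumps(stripped, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()


def check_event(event):
    """Return (event, ok); redact the content if the stored hash mismatches."""
    expected = event.get("hashes", {}).get("sha256")
    if expected != content_hash(event):
        redacted = dict(event)
        redacted["content"] = {}   # drop the unverified content
        return redacted, False
    return event, True
```

Seeing this warning in bulk during a backfill suggests many events arriving whose content no longer matches their recorded hashes, each of which gets stripped down to its redacted form.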