chore: purging calldata hash #4984

Merged 1 commit on Mar 7, 2024
4 changes: 2 additions & 2 deletions docs/docs/developers/debugging/sandbox-errors.md
@@ -185,9 +185,9 @@ Users may create a proof against a historical state in Aztec. The rollup circuit

## Sequencer Errors

- "Calldata hash mismatch" - the sequencer assembles a block and sends it to the rollup circuits for proof generation. Along with the proof, the circuits return the hash of the calldata that must be sent to the Rollup contract on L1. Before doing so, the sequencer sanity checks that this hash is equivalent to the calldata hash of the block that it submitted. This could be a bug in our code e.g. if we are ordering things differently in circuits and in our transaction/block (e.g. incorrect ordering of encrypted logs or queued public calls). Easiest way to debug this is by printing the calldata of the block both on the TS (in l2Block.getCalldataHash()) and C++ side (in the base rollup)
- "Txs effects hash mismatch" - the sequencer assembles a block and sends it to the rollup circuits for proof generation. Along with the proof, the circuits return the hash of the transaction effects that must be sent to the Rollup contract on L1. Before doing so, the sequencer sanity checks that this hash is equivalent to the transaction effects hash of the block that it submitted. This could be a bug in our code, e.g. if we are ordering things differently in circuits and in our transaction/block (e.g. incorrect ordering of encrypted logs or queued public calls). The easiest way to debug this is by printing the txs effects hash of the block on both the TS side (in l2Block.getTxsEffectsHash()) and the Noir side (in the base rollup)

- "${treeName} tree root mismatch" - like with calldata mismatch, it validates that the root of the tree matches the output of the circuit simulation. The tree name could be Public data tree, Note Hash Tree, Contract tree, Nullifier tree or the L1ToL2Message tree,
- "${treeName} tree root mismatch" - like with the txs effects hash mismatch, it validates that the root of the tree matches the output of the circuit simulation. The tree name could be the Public data tree, Note Hash tree, Contract tree, Nullifier tree, or the L1ToL2Message tree.

- "${treeName} tree next available leaf index mismatch" - validating a tree's root is not enough; it also checks that the `next_available_leaf_index` is as expected. This is the next index we can insert new values into. Note that for the public data tree this check is skipped, since it is a sparse tree unlike the others.
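The sequencer checks above can be sketched as follows. This is an illustrative TypeScript sketch, not the actual sequencer code; all names here (`checkTxsEffectsHash`, `TreeSnapshot`, `validateTree`) are hypothetical stand-ins for the checks the errors describe.

```typescript
// Hypothetical sketch of the sequencer sanity checks described above.
// The real checks live in the sequencer; these names are illustrative only.

interface TreeSnapshot {
  root: string;
  nextAvailableLeafIndex: number;
}

// "Txs effects hash mismatch": the hash computed from the assembled block
// must equal the hash returned by the rollup circuits.
function checkTxsEffectsHash(blockHash: string, circuitHash: string): void {
  if (blockHash !== circuitHash) {
    throw new Error(`Txs effects hash mismatch: block=${blockHash} circuit=${circuitHash}`);
  }
}

// "${treeName} tree root mismatch" / "... next available leaf index mismatch":
// compare the local tree snapshot against the circuit simulation output.
// The leaf-index check is skipped for sparse trees (e.g. the public data tree).
function validateTree(
  treeName: string,
  local: TreeSnapshot,
  circuit: TreeSnapshot,
  isSparse = false,
): void {
  if (local.root !== circuit.root) {
    throw new Error(`${treeName} tree root mismatch`);
  }
  if (!isSparse && local.nextAvailableLeafIndex !== circuit.nextAvailableLeafIndex) {
    throw new Error(`${treeName} tree next available leaf index mismatch`);
  }
}
```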

2 changes: 1 addition & 1 deletion docs/docs/developers/privacy/main.md
@@ -87,7 +87,7 @@ A 'Function Fingerprint' is any data which is exposed by a function to the outsi

> Note: many of these were mentioned in the ["Crossing the private -> public boundary"](#crossing-the-private---public-boundary) section.

> Note: the calldata submitted to L1 is [encoded](https://github.com/AztecProtocol/aztec-packages/blob/master/l1-contracts/src/core/libraries/Decoder.sol) in such a way that all categories of data are packed together, when submitted. E.g. all commitments from all txs in a block are arranged as contiguous bytes of calldata. But that _doesn't_ mean the data from a particular tx is garbled in with all other txs' calldata: the distinct Tx Fingerprint of each tx can is publicly visible when a tx is submitted to the L2 tx pool.
> Note: the transaction effects submitted to L1 are [encoded](https://github.com/AztecProtocol/aztec-packages/blob/master/l1-contracts/src/core/libraries/Decoder.sol) but not garbled with other transactions: the distinct Tx Fingerprint of each tx is publicly visible when a tx is submitted to the L2 tx pool.

#### Standardizing Fingerprints

@@ -17,7 +17,7 @@ For this to work we import the `get_withdraw_content_hash` helper function from
The `exit_to_l1_public` function enables anyone to withdraw their L2 tokens back to L1 publicly. This is done by burning tokens on L2 and then creating an L2->L1 message.

1. Like with our deposit function, we need to create the L2 to L1 message. The content is the _amount_ to burn, the recipient address, and who can execute the withdraw on the L1 portal on behalf of the user. It can be `0x0` for anyone, or a specified address.
2. `context.message_portal()` passes this content to the [kernel circuit](../../../learn/concepts/circuits/kernels/public_kernel.md) which creates the proof for the transaction. The kernel circuit then adds the sender (the L2 address of the bridge + version of aztec) and the recipient (the portal to the L2 address + the chain ID of L1) under the hood, to create the message which gets added as rollup calldata by the sequencer and is stored in the outbox for consumption.
2. `context.message_portal()` passes this content to the [kernel circuit](../../../learn/concepts/circuits/kernels/public_kernel.md) which creates the proof for the transaction. The kernel circuit then adds the sender (the L2 address of the bridge + version of aztec) and the recipient (the portal to the L2 address + the chain ID of L1) under the hood, to create the message which gets added as part of the transaction data published by the sequencer and is stored in the outbox for consumption.
3. Finally, you also burn the tokens on L2! Note that burning is done at the end to follow the checks-effects-interactions pattern. Note that the caller has to first approve the bridge contract to burn tokens on its behalf. Refer to [burn_public function on the token contract](../writing_token_contract.md#burn_public). The nonce parameter refers to the approval message that the user creates - also refer to [authorizing token spends here](../writing_token_contract.md#authorizing-token-spends).
- We burn the tokens from the `msg_sender()`. Otherwise, a malicious user could burn someone else’s tokens and mint tokens on L1 to themselves. One could add another approval flow on the bridge but that might make it complex for other applications to call the bridge.
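The message content in step 1 above can be sketched in TypeScript. This is a hedged illustration only: `withdrawContentHash` is a hypothetical name, and sha256 over 32-byte-padded fields stands in for Aztec's real content hashing, whose actual encoding differs.

```typescript
import { createHash } from "crypto";

// Illustrative sketch of the withdraw message content from step 1: the amount
// to burn, the L1 recipient, and who may execute the withdrawal on the portal
// (0x0 for anyone). sha256 over 32-byte-padded fields stands in for the real
// Aztec content hashing.
function withdrawContentHash(amount: bigint, recipient: string, callerOnL1: string): string {
  const pad32 = (hex: string) =>
    Buffer.from(hex.replace(/^0x/, "").padStart(64, "0"), "hex");
  const data = Buffer.concat([
    pad32(amount.toString(16)),
    pad32(recipient),
    pad32(callerOnL1),
  ]);
  return "0x" + createHash("sha256").update(data).digest("hex");
}
```

A zero `callerOnL1` models "anyone can execute"; a concrete address restricts execution to that caller, which is why the caller is part of the hashed content.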

16 changes: 8 additions & 8 deletions l1-contracts/slither_output.md
@@ -94,11 +94,18 @@ src/core/libraries/decoders/MessagesDecoder.sol#L168-L170


- [ ] ID-10
Dubious typecast in [Inbox.batchConsume(bytes32[],address)](src/core/messagebridge/Inbox.sol#L122-L143):
uint256 => uint32 casting occurs in [expectedVersion = uint32(REGISTRY.getVersionFor(msg.sender))](src/core/messagebridge/Inbox.sol#L128)

src/core/messagebridge/Inbox.sol#L122-L143


- [ ] ID-11
Dubious typecast in [HeaderLib.decode(bytes)](src/core/libraries/HeaderLib.sol#L145-L189):
bytes => bytes32 casting occurs in [header.lastArchive = AppendOnlyTreeSnapshot(bytes32(_header),uint32(bytes4(_header)))](src/core/libraries/HeaderLib.sol#L153-L155)
bytes => bytes4 casting occurs in [header.lastArchive = AppendOnlyTreeSnapshot(bytes32(_header),uint32(bytes4(_header)))](src/core/libraries/HeaderLib.sol#L153-L155)
bytes => bytes32 casting occurs in [header.contentCommitment.txTreeHeight = uint256(bytes32(_header))](src/core/libraries/HeaderLib.sol#L158)
bytes => bytes32 casting occurs in [header.contentCommitment.txsHash = bytes32(_header)](src/core/libraries/HeaderLib.sol#L159)
bytes => bytes32 casting occurs in [header.contentCommitment.txsEffectsHash = bytes32(_header)](src/core/libraries/HeaderLib.sol#L159)
bytes => bytes32 casting occurs in [header.contentCommitment.inHash = bytes32(_header)](src/core/libraries/HeaderLib.sol#L160)
bytes => bytes32 casting occurs in [header.contentCommitment.outHash = bytes32(_header)](src/core/libraries/HeaderLib.sol#L161)
bytes => bytes32 casting occurs in [header.stateReference.l1ToL2MessageTree = AppendOnlyTreeSnapshot(bytes32(_header),uint32(bytes4(_header)))](src/core/libraries/HeaderLib.sol#L164-L166)
@@ -121,13 +128,6 @@ Dubious typecast in [HeaderLib.decode(bytes)](src/core/libraries/HeaderLib.sol#L
src/core/libraries/HeaderLib.sol#L145-L189


- [ ] ID-11
Dubious typecast in [Inbox.batchConsume(bytes32[],address)](src/core/messagebridge/Inbox.sol#L122-L143):
uint256 => uint32 casting occurs in [expectedVersion = uint32(REGISTRY.getVersionFor(msg.sender))](src/core/messagebridge/Inbox.sol#L128)

src/core/messagebridge/Inbox.sol#L122-L143


## missing-zero-check
Impact: Low
Confidence: Medium
4 changes: 2 additions & 2 deletions l1-contracts/src/core/Rollup.sol
@@ -61,8 +61,8 @@ contract Rollup is IRollup {
HeaderLib.validate(header, VERSION, lastBlockTs, archive);

// Check if the data is available using availability oracle (change availability oracle if you want a different DA layer)
if (!AVAILABILITY_ORACLE.isAvailable(header.contentCommitment.txsHash)) {
revert Errors.Rollup__UnavailableTxs(header.contentCommitment.txsHash);
if (!AVAILABILITY_ORACLE.isAvailable(header.contentCommitment.txsEffectsHash)) {
revert Errors.Rollup__UnavailableTxs(header.contentCommitment.txsEffectsHash);
}

// Decode the cross-chain messages (Will be removed as part of message model change)
12 changes: 6 additions & 6 deletions l1-contracts/src/core/availability_oracle/AvailabilityOracle.sol
@@ -17,16 +17,16 @@ contract AvailabilityOracle is IAvailabilityOracle {
mapping(bytes32 txsHash => bool available) public override(IAvailabilityOracle) isAvailable;

/**
* @notice Publishes transactions and marks its commitment, the TxsHash, as available
* @notice Publishes transactions and marks its commitment, the TxsEffectsHash, as available
* @param _body - The block body
* @return txsHash - The TxsHash
* @return txsEffectsHash - The TxsEffectsHash
*/
function publish(bytes calldata _body) external override(IAvailabilityOracle) returns (bytes32) {
bytes32 _txsHash = TxsDecoder.decode(_body);
isAvailable[_txsHash] = true;
bytes32 txsEffectsHash = TxsDecoder.decode(_body);
isAvailable[txsEffectsHash] = true;

emit TxsPublished(_txsHash);
emit TxsPublished(txsEffectsHash);

return _txsHash;
return txsEffectsHash;
}
}
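The publish/lookup flow of the contract above can be mimicked in TypeScript to make the renamed commitment concrete. This is a sketch under stated assumptions: `AvailabilityOracleSketch` is a made-up class, and sha256 stands in for `TxsDecoder.decode`, which actually recomputes the txs effects hash from the block body.

```typescript
import { createHash } from "crypto";

// TS mimic of the AvailabilityOracle flow above: hash the body, record the
// commitment (the TxsEffectsHash) as available, and return it.
class AvailabilityOracleSketch {
  private readonly available = new Map<string, boolean>();

  // Publishes a block body and marks its commitment as available.
  publish(body: Buffer): string {
    const txsEffectsHash = "0x" + createHash("sha256").update(body).digest("hex");
    this.available.set(txsEffectsHash, true);
    return txsEffectsHash;
  }

  isAvailable(txsEffectsHash: string): boolean {
    return this.available.get(txsEffectsHash) ?? false;
  }
}
```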
4 changes: 2 additions & 2 deletions l1-contracts/src/core/interfaces/IAvailabilityOracle.sol
@@ -3,9 +3,9 @@
pragma solidity >=0.8.18;

interface IAvailabilityOracle {
event TxsPublished(bytes32 txsHash);
event TxsPublished(bytes32 txsEffectsHash);

function publish(bytes calldata _body) external returns (bytes32);

function isAvailable(bytes32 _txsHash) external view returns (bool);
function isAvailable(bytes32 _txsEffectsHash) external view returns (bool);
}
6 changes: 3 additions & 3 deletions l1-contracts/src/core/libraries/HeaderLib.sol
@@ -26,7 +26,7 @@ import {Hash} from "./Hash.sol";
* | 0x0020 | 0x04 | lastArchive.nextAvailableLeafIndex
* | | | ContentCommitment {
* | 0x0024 | 0x20 | txTreeHeight
* | 0x0044 | 0x20 | txsHash
* | 0x0044 | 0x20 | txsEffectsHash
* | 0x0064 | 0x20 | inHash
* | 0x0084 | 0x20 | outHash
* | | | StateReference {
@@ -84,7 +84,7 @@ library HeaderLib {

struct ContentCommitment {
uint256 txTreeHeight;
bytes32 txsHash;
bytes32 txsEffectsHash;
bytes32 inHash;
bytes32 outHash;
}
@@ -156,7 +156,7 @@

// Reading ContentCommitment
header.contentCommitment.txTreeHeight = uint256(bytes32(_header[0x0024:0x0044]));
header.contentCommitment.txsHash = bytes32(_header[0x0044:0x0064]);
header.contentCommitment.txsEffectsHash = bytes32(_header[0x0044:0x0064]);
header.contentCommitment.inHash = bytes32(_header[0x0064:0x0084]);
header.contentCommitment.outHash = bytes32(_header[0x0084:0x00a4]);

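The ContentCommitment byte layout in the header comment above (txTreeHeight at 0x24, txsEffectsHash at 0x44, inHash at 0x64, outHash at 0x84) can be sketched outside Solidity too. A hedged TypeScript equivalent of the slice logic in `HeaderLib.decode`, with a hypothetical function name:

```typescript
// Sketch of the ContentCommitment slice offsets from HeaderLib.decode:
// 0x24..0x44 txTreeHeight, 0x44..0x64 txsEffectsHash,
// 0x64..0x84 inHash, 0x84..0xa4 outHash.
interface ContentCommitment {
  txTreeHeight: bigint;
  txsEffectsHash: string;
  inHash: string;
  outHash: string;
}

function decodeContentCommitment(header: Buffer): ContentCommitment {
  const hex = (from: number, to: number) =>
    "0x" + header.subarray(from, to).toString("hex");
  return {
    txTreeHeight: BigInt(hex(0x24, 0x44)),
    txsEffectsHash: hex(0x44, 0x64),
    inHash: hex(0x64, 0x84),
    outHash: hex(0x84, 0xa4),
  };
}
```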
4 changes: 2 additions & 2 deletions l1-contracts/test/decoders/Base.sol
@@ -36,11 +36,11 @@ contract DecoderBase is Test {
struct Data {
bytes32 archive;
bytes body;
bytes32 calldataHash;
DecodedHeader decodedHeader;
bytes header;
bytes32 l1ToL2MessagesHash;
bytes32 publicInputsHash;
bytes32 txsEffectsHash;
}

struct DecodedHeader {
@@ -68,7 +68,7 @@
bytes32 inHash;
bytes32 outHash;
uint256 txTreeHeight;
bytes32 txsHash;
bytes32 txsEffectsHash;
}

struct PartialStateReference {
15 changes: 12 additions & 3 deletions l1-contracts/test/decoders/Decoders.t.sol
@@ -74,7 +74,11 @@ contract DecodersTest is DecoderBase {
contentCommitment.txTreeHeight,
"Invalid txTreeSize"
);
assertEq(header.contentCommitment.txsHash, contentCommitment.txsHash, "Invalid txsHash");
assertEq(
header.contentCommitment.txsEffectsHash,
contentCommitment.txsEffectsHash,
"Invalid txsEffectsHash"
);
assertEq(header.contentCommitment.inHash, contentCommitment.inHash, "Invalid inHash");
assertEq(header.contentCommitment.outHash, contentCommitment.outHash, "Invalid outHash");
}
@@ -192,8 +196,13 @@

// Txs
{
bytes32 txsHash = txsHelper.decode(data.block.body);
assertEq(txsHash, data.block.calldataHash, "Invalid txs hash");
bytes32 txsEffectsHash = txsHelper.decode(data.block.body);
assertEq(txsEffectsHash, data.block.txsEffectsHash, "Invalid txs effects hash");
assertEq(
txsEffectsHash,
data.block.decodedHeader.contentCommitment.txsEffectsHash,
"Invalid txs effects hash"
);
}

// The public inputs are computed based on these values, but are not directly part of the decoding per se.
14 changes: 7 additions & 7 deletions l1-contracts/test/fixtures/empty_block_0.json
@@ -35,23 +35,23 @@
]
},
"block": {
"archive": "0x17dd18fadaaca367dbe7f1e3f82d1ca7173eb7fbf3d417371010d6ee2b78d6d2",
"archive": "0x1bb1134e3fda61b56e838d68034e7c1e3a8da99d2321533b579eb1ae7588cd51",
"body": "0x0000001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000400000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000",
"calldataHash": "0xdeb8be229731acd5c13f8dbdbfc60fdc8f7865f480d77f84c763d625aefbd6b1",
"txsEffectsHash": "0xdeb8be229731acd5c13f8dbdbfc60fdc8f7865f480d77f84c763d625aefbd6b1",
"decodedHeader": {
"contentCommitment": {
"inHash": "0x0000000000000000000000000000000000000000000000000000000000000000",
"outHash": "0xc78009fdf07fc56a11f122370658a353aaa542ed63e44c4bc15ff4cd105ab33c",
"txTreeHeight": 2,
"txsHash": "0xdeb8be229731acd5c13f8dbdbfc60fdc8f7865f480d77f84c763d625aefbd6b1"
"txsEffectsHash": "0xdeb8be229731acd5c13f8dbdbfc60fdc8f7865f480d77f84c763d625aefbd6b1"
},
"globalVariables": {
"blockNumber": 1,
"chainId": 31337,
"timestamp": 0,
"version": 1,
"coinbase": "0xf016058fa5c84a01a8e42abcbb7decabd09e8f4e",
"feeRecipient": "0x0e0b51d2fc596c37eb6ef04325c69533627bd7bfbe8caa99eea9d66aed66b85f"
"coinbase": "0x56a54c9ad4f77919e45f9b9a18cf55468a60ebe5",
"feeRecipient": "0x02db69f955a50583c56b7405d887a720030cefc20293682c3eba3574e4c77846"
},
"lastArchive": {
"nextAvailableLeafIndex": 1,
@@ -82,8 +82,8 @@
}
}
},
"header": "0x0f045bd8180c4de901e18a10e9393ae42d9ef7928fe6b68568cb48b91d1355a7000000010000000000000000000000000000000000000000000000000000000000000002deb8be229731acd5c13f8dbdbfc60fdc8f7865f480d77f84c763d625aefbd6b10000000000000000000000000000000000000000000000000000000000000000c78009fdf07fc56a11f122370658a353aaa542ed63e44c4bc15ff4cd105ab33c1864fcdaa80ff2719154fa7c8a9050662972707168d69eac9db6fd3110829f800000001016642d9ccd8346c403aa4c3fa451178b22534a27035cdaa6ec34ae53b29c50cb000001000bcfa3e9f1a8922ee92c6dc964d6595907c1804a86753774322b468f69d4f278000001801864fcdaa80ff2719154fa7c8a9050662972707168d69eac9db6fd3110829f80000000040572c8db882674dd026b8877fbba1b700a4407da3ae9ce5fa43215a28163362b000000c00000000000000000000000000000000000000000000000000000000000007a69000000000000000000000000000000000000000000000000000000000000000100000000000000000000000000000000000000000000000000000000000000010000000000000000000000000000000000000000000000000000000000000000f016058fa5c84a01a8e42abcbb7decabd09e8f4e0e0b51d2fc596c37eb6ef04325c69533627bd7bfbe8caa99eea9d66aed66b85f",
"header": "0x0f045bd8180c4de901e18a10e9393ae42d9ef7928fe6b68568cb48b91d1355a7000000010000000000000000000000000000000000000000000000000000000000000002deb8be229731acd5c13f8dbdbfc60fdc8f7865f480d77f84c763d625aefbd6b10000000000000000000000000000000000000000000000000000000000000000c78009fdf07fc56a11f122370658a353aaa542ed63e44c4bc15ff4cd105ab33c1864fcdaa80ff2719154fa7c8a9050662972707168d69eac9db6fd3110829f800000001016642d9ccd8346c403aa4c3fa451178b22534a27035cdaa6ec34ae53b29c50cb000001000bcfa3e9f1a8922ee92c6dc964d6595907c1804a86753774322b468f69d4f278000001801864fcdaa80ff2719154fa7c8a9050662972707168d69eac9db6fd3110829f80000000040572c8db882674dd026b8877fbba1b700a4407da3ae9ce5fa43215a28163362b000000c00000000000000000000000000000000000000000000000000000000000007a6900000000000000000000000000000000000000000000000000000000000000010000000000000000000000000000000000000000000000000000000000000001000000000000000000000000000000000000000000000000000000000000000056a54c9ad4f77919e45f9b9a18cf55468a60ebe502db69f955a50583c56b7405d887a720030cefc20293682c3eba3574e4c77846",
"l1ToL2MessagesHash": "0x076a27c79e5ace2a3d47f9dd2e83e4ff6ea8872b3c2218f66c92b89b55f36560",
"publicInputsHash": "0x0f85e8c25f4736024ecf839615a742aa013dc0af1179bac8176ee26b3e1471c9"
"publicInputsHash": "0x0115bc353d66365ef2b850d1f9487476c969eda2b6bbeff9e747ab58189010c6"
}
}