Sanity Check: Why are slashes being sent to provers and not burnt? #300

Open · SeanMcOwen opened this issue May 14, 2024 · 2 comments
Labels: question (Further information is requested), Tier5


@SeanMcOwen (Contributor)

In s_slashes_to_prover:

def s_slashes_to_prover(
    params: AztecModelParams,
    _2,
    _3,
    state: AztecModelState,
    signal: SignalEvolveProcess,
):
    """
    Logic for keeping track of how many slashes have occurred.
    """
    old_slashes_to_prover = state["slashes_to_provers"]
    transfers: Sequence[Transfer] = signal.get("transfers", [])  # type: ignore

    # Calculate the number of slashes of each type to add
    delta_slashes_prover = len(
        [
            transfer
            for transfer in transfers
            if transfer.kind == TransferKind.slash_prover and transfer.to_prover
        ]
    )

    updated_slashes_to_prover = old_slashes_to_prover + delta_slashes_prover

    return ("slashes_to_provers", updated_slashes_to_prover)
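For context (a sketch, not part of the issue), a state update function with this signature is typically registered in a cadCAD partial state update block, where a policy supplies the transfers signal. The policy stub and dictionary below are hypothetical and only illustrate the wiring:

def p_evolve_process(params, substep, state_history, state):
    # Hypothetical stub: in the real model this policy would emit Transfer
    # objects; here it returns an empty signal purely to show the shape.
    return {"transfers": []}

partial_state_update_blocks = [
    {
        "policies": {"evolve_process": p_evolve_process},
        "variables": {"slashes_to_provers": s_slashes_to_prover},
    },
]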
@SeanMcOwen (Contributor, Author)

Likewise for slashes to sequencers. Is this just a misnomer, i.e. is it saying that sequencers are the ones being slashed when transfer.to_sequencer is set?

SeanMcOwen added the question label on May 14, 2024
jackhack00 added the Tier5 label and removed the Tier1 label on May 22, 2024
@jackhack00 (Contributor)

  • not affecting actual slashes, only the tracking of slashes
  • changed to Tier5
  • potentially make the code more legible (see the sketch below)
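A more legible variant (a sketch only, assuming the same Transfer and TransferKind types from the model) could count the matching transfers with sum instead of building an intermediate list:

def s_slashes_to_prover(params, _2, _3, state, signal):
    """Track the cumulative count of slash transfers paid out to provers."""
    transfers = signal.get("transfers", [])
    delta = sum(
        1
        for transfer in transfers
        if transfer.kind == TransferKind.slash_prover and transfer.to_prover
    )
    return ("slashes_to_provers", state["slashes_to_provers"] + delta)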
