
It's very interesting. A crude description:

The problem with the naive MuSig design was that fixing each participant's public key by hashing it, i.e. using H(Pi)Pi, didn't prevent an attacker from separating it out in such a way that they could perform Wagner's attack on it (you can see a description of this near the end of my blog post: https://joinmarket.me/blog/blog/avoiding-wagnerian-tragedies/ ), an attack that is extremely practical for large numbers of keys.

Now, that was fixed up with what is vaguely referred to as (2/n)

... solid security proof, is actually a very strong echo of the original solution for the pubkeys (i.e. it's also a kind of "delinearization").

Basically, each participant, rather counterintuitively, sends a *set* of nonces as their contribution rather than just one. Let's say M nonces each; then their *final* nonce will be a similarly "delinearized" combination of their starting nonces.

The reason this defeats Wagner's attack (which would otherwise allow the attacker to dupe the other signer(s) ...

(4/n)

... into signing a message they didn't intend to, *if* we tried to use a 2-round protocol, i.e. not bother to commit to the nonces upfront (3 rounds)), is because Wagner's attack kind of looks like:

a + b + c + d = e

where the attacker is getting a, b, c, d as hash outputs they can't predict, but is trying to make them add up to 'e', which is some fixed number.

As the paper points out, the attack only works because e is some fixed constant; if it varies with a,b,c,d it of course can't work.
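To make that a + b + c + d = e shape concrete, here is a toy sketch in Python (entirely my own illustration, not from the paper: a tiny 16-bit modulus and a simple meet-in-the-middle search rather than Wagner's full k-list algorithm). It finds four "unpredictable" hash outputs summing to a *fixed* target e; the only point is that a constant e is what makes such a search possible at all.

```python
import hashlib

MOD = 2 ** 16  # toy group size; real attacks target ~256-bit values

def h(tag: str, i: int) -> int:
    """Toy 'unpredictable' hash output, reduced mod MOD."""
    return int.from_bytes(hashlib.sha256(f"{tag}:{i}".encode()).digest(), "big") % MOD

# Four lists of hash outputs the attacker can't predict but can query freely.
A = [h("a", i) for i in range(256)]
B = [h("b", i) for i in range(256)]
C = [h("c", i) for i in range(256)]
D = [h("d", i) for i in range(256)]

e = 12345  # the *fixed* target; if e varied with a,b,c,d this search couldn't work

# Meet in the middle: index all sums a+b, then look up e - (c+d) among them.
left = {}
for a in A:
    for b in B:
        left[(a + b) % MOD] = (a, b)

solution = None
for c in C:
    for d in D:
        need = (e - c - d) % MOD
        if need in left:
            a, b = left[need]
            solution = (a, b, c, d)
            break
    if solution:
        break

a, b, c, d = solution
assert (a + b + c + d) % MOD == e
```

With 256 entries per list and a 16-bit modulus a match is found almost instantly; scaling the same idea to cryptographic sizes is exactly what Wagner's generalized birthday algorithm does.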

(5/n)

Just to flesh out something above a bit more clearly:

Key 1 sends R_1,1, R_1,2, R_1,3

Key 2 sends R_2,1, R_2,2, R_2,3

But Key 1's actual nonce will be something like:

H(1,message, pubkey, *all the R_ij of both parties*)R_1,1

+

H(2,message, pubkey, *all the R_ij of both parties*)R_1,2

+

H(3,message, pubkey, *all the R_ij of both parties*)R_1,3

... just fixing everything to prevent "separation" as mentioned.

Notice this is still 2 rounds, not 3.
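The combination above can be sketched end to end. This is a toy model, not real crypto: integers mod a prime stand in for curve points, the secrets and message are made up, and I follow the per-index hashes H(1,...), H(2,...), H(3,...) exactly as written above (the actual MuSig2 paper derives the coefficients as powers of a single hash value b, which is the same idea).

```python
import hashlib

P = 2 ** 61 - 1  # toy prime modulus; stand-in for a real elliptic-curve group
G = 5            # toy generator; a "point" here is just k*G mod P

def point(k: int) -> int:
    return (k * G) % P

def h(*parts) -> int:
    data = b"|".join(str(p).encode() for p in parts)
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % P

M = 3
msg = "toy message"          # hypothetical
aggregate_pubkey = point(42)  # placeholder; key aggregation not shown here

# Each signer i draws M secret nonces k_{i,j} and publishes R_{i,j} = k_{i,j}*G.
secrets = {1: [111, 222, 333], 2: [444, 555, 666]}  # toy secrets
R = {i: [point(k) for k in ks] for i, ks in secrets.items()}
all_R = [R[i][j] for i in sorted(R) for j in range(M)]

# The coefficient for slot j depends on *everything*: the index, the message,
# the pubkey, and all R_{i,j} of both parties -- so it varies with the nonces.
def coeff(j: int) -> int:
    return h(j, msg, aggregate_pubkey, *all_R)

# Signer 1's effective nonce, as in the sum written above (all mod P here):
R1_eff = sum(coeff(j + 1) * R[1][j] for j in range(M)) % P
# ...and signer 1 still knows the matching secret, so they can sign with it:
k1_eff = sum(coeff(j + 1) * secrets[1][j] for j in range(M)) % P
assert point(k1_eff) == R1_eff
```

Because every coefficient already commits to all the R_ij, there is nothing left to commit to in a separate first round, which is why this stays at 2 rounds.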

Still reading the paper.

(6/n)

... delinearization. More specifically it meant hashing *the entire keyset* into each key to prevent this "separating out".
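A toy sketch of that keyset-hashing fix (again my own illustrative Python, with integers mod a prime standing in for curve points): the naive coefficient H(P_i) depends only on the individual key, while the delinearized coefficient H(L, P_i) binds each key to the whole set L, so no single key's contribution can be manipulated independently of the others.

```python
import hashlib

P = 2 ** 61 - 1  # toy prime; stand-in for a real elliptic-curve group
G = 5            # toy generator

def h(*parts) -> int:
    data = b"|".join(str(p).encode() for p in parts)
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % P

# Toy public keys P_i = x_i * G (integers mod P stand in for curve points).
pubkeys = [(7 * G) % P, (11 * G) % P, (13 * G) % P]
L = tuple(sorted(pubkeys))  # the *entire keyset*, hashed into every coefficient

# Naive (vulnerable): each coefficient sees only its own key: H(P_i) * P_i
naive_agg = sum(h(Pi) * Pi for Pi in pubkeys) % P

# Delinearized (MuSig): each coefficient sees the whole set: H(L, P_i) * P_i
musig_agg = sum(h(L, Pi) * Pi for Pi in pubkeys) % P
```

In the naive version an attacker choosing their own key can grind H(P_i)P_i terms independently; in the delinearized version changing any key changes L and hence *every* coefficient at once.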

The problem was that for a rigorous security proof, the first stage of the multisignature protocol had to include a fixing of all participants' nonces (R_i) ahead of time. That made the protocol three rounds of communication, which is pretty icky (albeit fine in some scenarios).

The approach this new paper takes to remove that first round, while still retaining a (3/n)