This repository was archived by the owner on Nov 7, 2025. It is now read-only.
As I've gotten deeper into this, I've been pondering something: what would be the impact on this core privacy model if user bidding signals were:
- Partitioned in any untrusted or persistent environment;
- Viewable and deletable by the user in their browser;
- But viewable together, in a transient process, by a function running in an opaque environment such as a TEE, provided the output of that process still had differential privacy (DP) and k-anonymity enforced.
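To make the third point concrete, here is a minimal sketch of what such a transient, opaque aggregation step might look like. Everything here is hypothetical (the function name, parameters, and the choice of a count query are mine, not from any spec): signals from isolated partitions are only joined in memory inside the opaque process, and the only output released is k-thresholded and DP-noised.

```python
import random
from collections import Counter

def tee_aggregate(partitioned_signals, k=10, epsilon=1.0, sensitivity=1.0):
    """Hypothetical sketch: per-partition signal lists are joined only
    inside this transient process; the released output must satisfy both
    a k-anonymity threshold and differential privacy."""
    # Transiently view the cross-partition signals together.
    counts = Counter()
    for partition in partitioned_signals:
        counts.update(partition)

    released = {}
    for key, count in counts.items():
        if count < k:
            # k-anonymity: suppress groups smaller than k entirely.
            continue
        # DP: add Laplace(sensitivity / epsilon) noise, sampled as the
        # difference of two exponential draws.
        scale_rate = epsilon / sensitivity
        noise = random.expovariate(scale_rate) - random.expovariate(scale_rate)
        released[key] = count + noise

    # Only the protected aggregate leaves the opaque environment; the
    # joined raw signals are discarded when the process ends.
    return released
```

This is just a toy count query; the point is the shape of the flow (join transiently, enforce k and DP on the way out), not the specific mechanism.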
I haven't had the chance to work through the math here (some serious cobwebs to dust off for any proofing), but I wonder whether this would still meet the privacy model laid out here from a "happy path" perspective (meaning its impact on "reidentification across contexts"), with the full understanding that any compromise of that environment would incur a worse privacy loss than a compromise of a single-partition process.