Identity needs new legal protections

We are beyond traditional laws

Steve Jones
3 min read · Apr 22, 2024

Your identity is obviously a fundamental part of you, yet the rules and laws that exist were not designed for, and are not ready for, what AI is doing to the concept of identity. Microsoft’s VASA-1 makes real a threat that has loomed for years, and it underlines the issue of accountability for identity protection in AI.

(Had to use an uploaded version as Medium doesn’t support MP4 links)

VASA-1 is you, you aren’t VASA-1

From a simple audio file and a portrait photo, VASA-1 is now able to make you, or a life-like facsimile of you. Attach that to an LLM and suddenly you’ve got a chatbot with your face on it. If your audio and photo are out there on the internet, is this just “fair use” of your identity? We already know that professionals have had their voices ripped off, with ‘pre-AI’ contracts used as cover to justify the use.

So what are your rights? In Mike’s example, where his voice (the thing he makes his living from) was ripped off by a company reusing historical call-center recordings he’d done, the answer was “well, err, nothing”. With Fair Use being thrown about to justify the training of AIs, is there a risk that the same ‘rules’ are used to justify creating all manner of avatars based on people’s identities? What stops an AI version of Cameo? Can the same sort of ‘it isn’t really that’ dodge be used, as beloved by Halloween costume creators?

I mean sure it might look and sound like Robert Downey Jr but this is actually “Red Metal Suit Guy” wishing you happy birthday.

All of this is before we look at the ability to weaponize disinformation and commit social engineering attacks on a truly industrial scale.

The laws aren’t fit for AI

The new EU AI Act is one of the first regulatory frameworks around AI, and it does talk about privacy and identity, but not in the sense that a person has rights to that identity and to what can be done with it. It could be argued that GDPR covers this, but it is noticeable that GDPR talks about behaviors rather than the concept of identity, and it is built around Big Data, which a single image and an audio track are not.

Now, there are laws about impersonating someone, committing fraud and so on as a result, but historically those require a person to do the impression. The law simply is not set up for the idea that a digital facsimile of your identity can be created from a minimal dataset and without your knowledge. Relying on secondary legislation, like GDPR, to address that challenge is highly unlikely to produce positive outcomes.

You should own the rights to you

I’ve talked about this challenge for a lot of years: since 2017 in fact, and on this blog back in 2021.

There really needs to be legislation that enshrines the digital rights to a person’s identity, and explicitly requires opt-in for impersonation-level reuse of that identity. The sort of “mechanistic reproduction” clauses that Mike fell victim to are not fit for where we are today.

This is not ‘against AI’, but there should absolutely be controls and restrictions on impersonation-level technology, and detection approaches require significantly more investment. There should also be accountability for the researchers and firms creating such technologies, to help ensure they cannot be used nefariously.

At the heart of this needs to be the enshrinement of the idea that a person owns their own personal identity: not in terms of database records and marketing, but in terms of their mannerisms, intonations, style and experience.

(Image: a variation on Da Vinci’s famous Vitruvian Man, with half of the figure rendered as an android)

--

My job is to make exciting technology dull, because dull means it works. All opinions my own.