A look at Nightshade and Glaze, UChicago researchers' tools that help artists “mask” or even “poison” their work to break AI models later trained on the data (Melissa Heikkilä/MIT Technology Review) 23-10-2023

Melissa Heikkilä / MIT Technology Review:
A look at Nightshade and Glaze, UChicago researchers’ tools that help artists “mask” or even “poison” their work to break AI models later trained on the data — The tool, called Nightshade, messes up training data in ways that could cause serious damage to image-generating AI models.


Read more at Tech Meme
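
The blurb describes Nightshade only in general terms: small pixel-level changes that corrupt what an image-generating model later learns from the picture. As a rough, hypothetical illustration of that family of techniques (not Nightshade's published algorithm), the sketch below perturbs an image so that a feature extractor maps it near an unrelated target concept, while an L-infinity bound keeps the change visually subtle. `feature_extractor`, `target_features`, and all parameter values are assumptions for the example.

```python
import torch
import torch.nn.functional as F

def poison_image(image, feature_extractor, target_features,
                 epsilon=0.05, steps=100, lr=0.01):
    """Return a copy of `image` whose extracted features approximate
    `target_features`, with the pixel change bounded by `epsilon`
    (L-infinity). A generic feature-space poisoning sketch, not
    Nightshade's actual optimization."""
    delta = torch.zeros_like(image, requires_grad=True)
    optimizer = torch.optim.Adam([delta], lr=lr)
    for _ in range(steps):
        perturbed = (image + delta).clamp(0.0, 1.0)
        loss = F.mse_loss(feature_extractor(perturbed), target_features)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        # project the perturbation back into the imperceptibility budget
        delta.data.clamp_(-epsilon, epsilon)
    return (image + delta).detach().clamp(0.0, 1.0)
```

If enough images altered along these lines end up in a scraped training set, a model can learn distorted associations for the original concept, which is the kind of damage to image generators the article warns about.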
