One of the cringier aspects of AI as we know it today has been the proliferation of deepfake technology to make nude photos of anyone you want. What if you took away the abstraction and put the faker and subject in the same space? That’s the question the NUCA camera was designed to explore. [via 404 Media]
[Mathias Vef] and [Benedikt Groß] designed the NUCA camera “with the intention of critiquing the current trajectory of AI image generation.” The camera itself is a fairly unassuming device, a 3D-printed digital camera (19.5 × 6 × 1.5 cm) with a 37 mm lens. When the camera shutter button is pressed, a nude image is generated of the subject.
The final image is generated using a mixture of the picture taken of the subject, pose data, and facial landmarks. The photo is run through a classifier that identifies attributes such as age, gender, and body type, which are then used to generate a text prompt for Stable Diffusion. The subject’s original face is then stitched onto the nude image and aligned with the estimated pose. Many of the sample images on the project’s website show the bias toward certain beauty ideals from AI datasets.
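For the curious, here is a rough sketch of what that pipeline might look like in code. The authors haven’t published theirs, so everything below — the helper functions, the prompt wording, and the Stable Diffusion checkpoint loaded via the diffusers library — is an illustrative assumption, not the actual NUCA implementation.

```python
# Rough sketch of the pipeline described above -- NOT the authors' code
# (none has been published). Helper functions and the checkpoint name are
# illustrative placeholders.
import torch
from diffusers import StableDiffusionPipeline


def estimate_pose_and_landmarks(photo):
    """Placeholder: body pose keypoints and facial landmarks would come
    from an off-the-shelf estimator in a real build."""
    return None, None


def classify_attributes(photo):
    """Placeholder: a classifier guessing age, gender, body type, etc."""
    return {"age": "30s", "gender": "person", "body_type": "average"}


def build_prompt(attrs):
    # The classifier output becomes a plain text prompt for the generator.
    return (f"full body photo of a {attrs['age']} {attrs['gender']}, "
            f"{attrs['body_type']} build")


def composite_face(generated_image, photo, landmarks):
    """Placeholder: warp and paste the subject's real face onto the
    generated body, using the landmarks for alignment."""
    return generated_image


def fake_shutter(photo):
    # `pose` would likely also condition the generator in a real build.
    pose, landmarks = estimate_pose_and_landmarks(photo)
    prompt = build_prompt(classify_attributes(photo))

    pipe = StableDiffusionPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
    ).to("cuda")
    generated = pipe(prompt).images[0]  # plain text-to-image generation

    return composite_face(generated, photo, landmarks)
```

In a real system the pose data would most likely feed into the generation step itself (ControlNet-style conditioning) rather than being used only for face alignment, but the overall flow is the same.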
Looking for more ways to use AI with cameras? How about this one that uses GPS to imagine a scene instead. Prefer to keep AI out of your endeavors to invade personal space? How about building your own TSA body scanner?
Umm, I guess it’s an interesting if rather creepy take on exploring the ethics of AI image generation but I really hope they don’t open source the code for this.
There are already many smartphone apps that do the same.
Really? That would be useful for Tinder, modified to nearly nude.
Just kidding. My face on a fantastic body would very obviously be a fake.
Close but no cigar. I for one wouldn’t mind being deepfaked if the result was a near perfect body like those depicted in the examples. Want to raise alerts? Then rewrite it to add imperfections, or worse, generate some porn perversion.
It’s not the pictures that are the problem. It’s the people who view the deepfaked images of you that you need to be worried about.
Why in the hell would I ever worry about that even for one second?
The person could look at any erotic model out there, but the fact that it has to be me in particular, without so much as asking, would raise all kinds of warning flags. Having a stalker is an inconvenience at best.
> I’m not sure if it’s the image, or the effort that creeps me out.
It’s the sentiment. Sexualization is one thing; sexualization by a crazy person is different because you don’t know what their limits are if any.
Seems deliberately silly, as the body lines don’t match up for the men (who they are buffing up) and the women (who they are slimming down).
Ten minutes before you, ytrewq commented that he wants it to create images with near-perfect bodies. You are complaining it’s improving the bodies.
What a contrast.
From the examples on the website, the women are actually bigger in the “after” photo than they are in the “before” photo.
Basically “Vedo nudo” in real life, but without Nino Manfredi.
https://www.imdb.com/title/tt0080084/
AI imagery alert! The hero image has an extra finger on the right hand!
“Benedikt Groß”
Nominative determinism strikes again. This is gross.
For a project “with the intention of critiquing the current trajectory of AI image generation,” it sure looks like a for-profit product announcement.
Agree
What is the point of this? Stable Diffusion exists. The open-source models exist. For the same reason that you could not simply get rid of Linux, you cannot get rid of the public information. Oh, I just want to critique it. Thank you for pissing in the wind; your contribution is noted.
The purpose is to make it about themselves and point out what a good good boy he is
It’s an art, not some technical proof of concept.
One way to read the title is that the camera works only when it detects nudes.
Now, is this “deep fake it till you make it” going to work? From what I saw, I’ll say don’t cancel your p0rnHub or girlfriend subscription yet.
Stop calling it AI.
It doesn’t think.
It doesn’t form ideas.
It has no concept of self.
It is a probability algorithm. It guesses what comes next. That’s it.
That is not intelligence, that is regurgitation.
“That is not intelligence, that is regurgitation!”
Much like the US public education system.
(Sigh!)
Thank you, I keep telling people about the Chinese Room concept but people get fixated on anthropomorphizing things.
AI is always just software, nobody ever said it had a soul. And most humans are entirely incapable of anything except regurgitation as well. Your post (and mine) are two examples.
>nobody ever said it had a soul.
A whole bunch of people do, either by claiming the reverse — that thought is nothing but software (i.e. abstract logical rules) — or because they have some mystical misconception about what computers do.
There is even the minority view that says current AI is conscious, and the vast amount of negative reinforcement built into machine learning causes vast amounts of suffering to conscious beings (namely, the AIs). I’ve seen a semi-academic piece defending this view.
That is insane, but fantastic!
I come to conclusions based upon data I have previously ingested and the current context; therefore I do not have a soul and am running on algorithms and statistics. Much like how AI is artificially created and runs off statistics and algorithms.
I get where your pedantry is coming from, but like your cousins the Grammer nazis you kinda missed the point in an attempt to sound smurt :-/
Oy… tell me you have no concept of how LLMs work without telling me you have no concept of how LLMs work.
The ‘data’ LLMs ‘ingest’ is related to the conditional probability of one symbol occurring in proximity of another symbol.
LLMs don’t ‘come to conclusions’ at all. They just sit there as networks that encode a large and complex probability system. They’re elaborate lookup tables with no capacity to analyze the contents of their rows and columns.
All relevant processing occurs in the symbolic layer. LLMs aren’t capable of associating symbols with self-consistent ‘objects’ or the ‘senses’ of those objects used in semantics. They can’t decide whether a new pattern of input symbols creates associations that contradict other associations that have already been encoded. They don’t have the capacity to identify ‘statements’ as things that can be internally consistent or inconsistent, let alone ‘true’ or ‘false’.
Humans can use LLMs to generate new patterns that match the probability encoded in the network specifically related to a given piece of input data. LLMs can’t generate the concept of doing that, or generate their own input prompts in a sense that humans would consider meaningful.
As tempting as it is to say something snarky, it’s easy to prove that you are incapable of generating the kind of random nonsense and structured gibberish an LLM can produce. Assigning ‘intelligence’ to them is equivalent to saying a frying pan has a concept of ‘food’.
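For what it’s worth, the “guesses what comes next” framing is easy to illustrate with a toy example. A real LLM is vastly larger and uses learned representations rather than raw counts, but the sampling step looks roughly like this:

```python
# Toy bigram model: the only "knowledge" stored is how often one word
# follows another, and generation is just sampling from those counts.
import random
from collections import Counter, defaultdict

corpus = "the cat sat on the mat and the cat sat".split()

# Count which word follows which.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1


def next_word(word):
    counts = follows[word]
    # Pick a successor in proportion to how often it followed `word`.
    return random.choices(list(counts), weights=list(counts.values()))[0]


word, output = "the", ["the"]
for _ in range(8):
    word = next_word(word)
    output.append(word)
print(" ".join(output))
```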
“..bias toward certain beauty ideals..”
Booooooooooooring get new material
The photo looks weird in this article. One hand has too many fingers!
Wonder if it was taken using the camera?
I think that was on purpose. I could imagine the whole idea was maybe meant as a joke or a late April Fools’ trick.
The designers of that thing probably foresaw that nudity would get the attention of the public. Maybe it was a marketing stunt, too? 🙂
Did you read the site description or were you too busy gawking at pixelated nude bodies?
>NUCA is a speculative design and art project by Mathias Vef and Benedikt Groß. The project aims to provoke and question the current trajectory of generative AI in reproducing body images.
Further:
>To make this speculative (though very plausible) scenario tangible, NUCA is framed in the manner of a typical tech startup, trying with a private beta campaign to find its users.
Tell me it’s 2024 without saying it’s 2024
“Many of the sample images on the project’s website show the bias toward certain beauty ideals from AI datasets.”
All of the male pictures have a blur extending from the crotch to the knee to cover the genitals. What pervo sites did they use to model the male “equipment?”
That’s absolutely awful!… Where can we buy it?
Just make sure I have a big package.
USPS price hikes makes sure it stays at home.
That package was on target!
You all are aware that this is not a real product, right? It’s a single prototype plus some mocked up media pretending to be a tech startup. The whole point of it is to provoke conversation. It says right on their page.
Yeah. Navarre said it was “art” in the opening paragraph, no?
Whether it’s a real product or not only matters if you want to buy one. That it _could_ be a real product hopefully puts the point across just as well.
Pretty sure this is vaporware; those “before” photos really don’t look like they were taken by a homebrew camera, but rather bought from a stock photo service. I’m sure the software side is trivial these days, so these folks took a day project (or, more likely, a workflow someone else already developed for entertainment) and made a fake product out of it as a résumé booster, I guess.
The six-fingered hand holding the fake camera is a nice bit of irony at least.
It isn’t even “vaporware.” It is an art project intended to make you think about what would happen if there were a real device that took nude pictures of clothed people.
From the project page:
>NUCA is a speculative design and art project by Mathias Vef and Benedikt Groß. The project aims to provoke and question the current trajectory of generative AI in reproducing body images.
Further:
>To make this speculative (though very plausible) scenario tangible, NUCA is framed in the manner of a typical tech startup, trying with a private beta campaign to find its users.
There is no moral purpose for such a device.
Oh where did all the hippies go? 🙁
https://www.britannica.com/topic/hippie
https://www.britannica.com/topic/sexual-revolution
Discovered cocaine and BMWs, became yuppies. Voted Reagan.
Now are derided as ‘boomers’ by new generation commies/morons.
Since when does being a hippie mean having a camera that simulates sexual assault?
That’s the point of the project: to make you think about the morality of such a device before someone builds one for real.
>NUCA is a speculative design and art project by Mathias Vef and Benedikt Groß. The project aims to provoke and question the current trajectory of generative AI in reproducing body images.
Harmless fun, no big deal. ^^
In the ’80s/’90s there were these “X-ray goggles” that supposedly let you see “through” things.
Inside the goggles or plastic cameras there were pre-made pictures of skeletons or “nude” pictures (often hilariously fake).
Per Dude commenting in another Hackaday article:
“There was a scandal back in the day with a Sony point and shoot camera that used IR for better night shots. That inadvertently made women’s bras visible under certain kinds of t-shirts.”
Pretty sure OnePlus had a model that included an IR camera and they had to disable it through software because of complaints.
I want this built in to doorbell security cameras