AI Camera Only Takes Nudes

A pair of hands holds a digital camera. "NUCA" is written in the hood above the lens and a black grip is on the right hand side of the device (left side of image). The camera body is off-white 3D printed plastic. The background is a pastel yellow.

One of the cringier aspects of AI as we know it today has been the proliferation of deepfake technology to make nude photos of anyone you want. What if you took away the abstraction and put the faker and subject in the same space? That’s the question the NUCA camera was designed to explore. [via 404 Media]

[Mathias Vef] and [Benedikt Groß] designed the NUCA camera “with the intention of critiquing the current trajectory of AI image generation.” The camera itself is a fairly unassuming device, a 3D-printed digital camera (19.5 × 6 × 1.5 cm) with a 37 mm lens. When the camera shutter button is pressed, a nude image is generated of the subject.

The final image is generated using a mixture of the picture taken of the subject, pose data, and facial landmarks. The photo is run through a classifier which identifies features such as age, gender, body type, etc. and then uses those to generate a text prompt for Stable Diffusion. The original face of the subject is then stitched onto the nude image and aligned with the estimated pose. Many of the sample images on the project’s website show the bias toward certain beauty ideals from AI datasets.
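The prompt-building step described above can be sketched in a few lines of Python. This is a minimal illustration of the idea (classifier attributes flattened into a Stable Diffusion text prompt), not the project's actual code; the attribute keys and prompt wording are assumptions.

```python
# Sketch of the described pipeline stage: a classifier's output
# (age, gender, body type, etc.) is turned into a text prompt that
# would then be fed to Stable Diffusion. Keys and wording are
# illustrative guesses, not NUCA's real implementation.

def build_prompt(attributes: dict) -> str:
    """Flatten classifier attributes into a text-to-image prompt."""
    parts = []
    if "age" in attributes:
        parts.append(f"{attributes['age']}-year-old")
    for key in ("gender", "body_type", "hair"):
        if key in attributes:
            parts.append(str(attributes[key]))
    return "full-body photo of a " + ", ".join(parts)

# Hypothetical classifier output standing in for what a real
# classifier might return from the captured photo:
detected = {"age": 35, "gender": "man", "body_type": "average build"}
print(build_prompt(detected))
# → full-body photo of a 35-year-old, man, average build
```

The generated image would then be composited with the subject's original face, aligned using the estimated pose, per the description above.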

Looking for more ways to use AI with cameras? How about this one that uses GPS to imagine a scene instead. Prefer to keep AI out of your endeavors to invade personal space? How about building your own TSA body scanner?


52 thoughts on “AI Camera Only Takes Nudes”

  1. Close but no cigar. I for one wouldn’t mind being deepfaked if the result was a near perfect body like those depicted in the examples. Want to raise alerts? Then rewrite it to add imperfections, or worse, generate some porn perversion.

        1. The person could look at any erotic model out there, but the fact that it has to be me in particular, without so much as asking, would raise all kinds of warning flags. Having a stalker is an inconvenience at best.

          1. > I’m not sure if it’s the image, or the effort that creeps me out.

            It’s the sentiment. Sexualization is one thing; sexualization by a crazy person is different because you don’t know what their limits are if any.

    1. Ten minutes before you, ytrewq commented that he wants it to create near-perfect-body images. You are complaining it’s improving the bodies.

      What a contrast.

  2. What is the point of this? Stable Diffusion exists. The open-source models exist. For the same reason that you could not simply get rid of Linux, you cannot get rid of the public information. Oh, I just want to critique it. Thank you for pissing in the wind; your contribution is noted.

  3. One way to read the title is that the camera works only when it detects nudes.
    Now, is “deep fake it till you make it” going to work? From what I saw, I’ll say do not cancel your p0rnHub or girlfriend subscription yet.

  4. Stop calling it AI.

    It doesn’t think.
    It doesn’t form ideas.
    It has no concept of self.

    It is a probability algorithm. It guesses what comes next. That’s it.

    That is not intelligence, that is regurgitation.

    1. AI is always just software, nobody ever said it had a soul. And most humans are entirely incapable of anything except regurgitation as well. Your post (and mine) are two examples.

      1. >nobody ever said it had a soul.

        A whole bunch of people do, either through claiming the reverse that thought is nothing but software (i.e. abstract logical rules), or because they have some mystical misconception about what computers do.

        1. There is even the minority view that says current AI is conscious, and the vast amount of negative reinforcement built into machine learning causes vast amounts of suffering to conscious beings (namely, the AIs). I’ve seen a semi-academic piece defending this view.

    2. I come to conclusions based upon data I have previously ingested and the current context, therefore I do not have a soul and am running on algorithms and statistics. Much like how AI is artificially created and runs off statistics and algorithms.

      I get where your pedantry is coming from, but like your cousins the grammar nazis you kinda missed the point in an attempt to sound smurt :-/

      1. Oy… tell me you have no concept of how LLMs work without telling me you have no concept of how LLMs work.

        The ‘data’ LLMs ‘ingest’ is related to the conditional probability of one symbol occurring in proximity to another symbol.

        LLMs don’t ‘come to conclusions’ at all. They just sit there as networks that encode a large and complex probability system. They’re elaborate lookup tables with no capacity to analyze the contents of their rows and columns.

        All relevant processing occurs in the symbolic layer. LLMs aren’t capable of associating symbols with self-consistent ‘objects’ or the ‘senses’ of those objects used in semantics. They can’t decide whether a new pattern of input symbols creates associations that contradict other associations that have already been encoded. They don’t have the capacity to identify ‘statements’ as things that can be internally consistent or inconsistent, let alone ‘true’ or ‘false’.

        Humans can use LLMs to generate new patterns that match the probability encoded in the network specifically related to a given piece of input data. LLMs can’t generate the concept of doing that, or generate their own input prompts in a sense that humans would consider meaningful.

        As tempting as it is to say something snarky, it’s easy to prove that you are incapable of generating the kind of random nonsense and structured gibberish an LLM can produce. Assigning ‘intelligence’ to them is equivalent to saying a frying pan has a concept of ‘food’.

    1. I think that was on purpose. I could imagine the whole idea was meant as a joke, or a late April Fools’ trick.
      The designers of that thing probably foresaw that nudity would get the attention of the public. Maybe it was a marketing stunt, too? 🙂

      1. Did you read the site description or were you too busy gawking at pixelated nude bodies?
        >NUCA is a speculative design and art project by Mathias Vef and Benedikt Groß. The project aims to provoke and question the current trajectory of generative AI in reproducing body images.

        Further:
        >To make this speculative (though very plausible) scenario tangible, NUCA is framed in the manner of a typical tech startup, trying with a private beta campaign to find its users.

  5. “Many of the sample images on the project’s website show the bias toward certain beauty ideals from AI datasets.”

    All of the male pictures have a blur extending from the crotch to the knee to cover the genitals. What pervo sites did they use to model the male “equipment?”

  6. Pretty sure this is vaporware; those “before” photos really don’t look like they were taken by a homebrew camera, but rather bought from a stock-photo service. I’m sure the software side is trivial these days, so these folks took a day project (or more likely, a workflow someone else already developed for entertainment) and made a fake product out of it as a resume booster, I guess.

    The six-fingered hand holding the fake camera is a nice bit of irony, at least.

    1. It isn’t even “vaporware.” It is an art project intended to make you think about what would happen if there were a real device that took nude pictures of clothed people.

      From the project page:
      >NUCA is a speculative design and art project by Mathias Vef and Benedikt Groß. The project aims to provoke and question the current trajectory of generative AI in reproducing body images.

      Further:
      >To make this speculative (though very plausible) scenario tangible, NUCA is framed in the manner of a typical tech startup, trying with a private beta campaign to find its users.

    1. That’s the point of the project: to make you think about the morality of such a device before someone builds one for real.

      >NUCA is a speculative design and art project by Mathias Vef and Benedikt Groß. The project aims to provoke and question the current trajectory of generative AI in reproducing body images.

  7. Harmless fun, no big deal. ^^
    In the 80s/90s there were these “x-ray goggles” that supposedly let you see “through” things.
    Inside the goggles or plastic cameras there were pre-made pictures of skeletons or “nude” pictures (often hilariously fake).

    1. Per Dude commenting in another Hackaday article:
      “There was a scandal back in the day with a Sony point and shoot camera that used IR for better night shots. That inadvertently made women’s bras visible under certain kinds of t-shirts.”
