Earlier this year, Microsoft added a new key to Windows keyboards for the first time since 1994. Before the news dropped, your mind might have raced with the possibilities of a new addition. However, the key ended up being a Copilot launcher that doesn’t even work in a particularly innovative way.

Logitech announced a new mouse last week. I was disappointed to learn that the most distinctive feature of the Logitech Signature AI Edition M750 is a button located south of the scroll wheel. This button is preprogrammed to launch the ChatGPT prompt builder, which Logitech recently added to its peripherals configuration app, Options+.

Like Logitech, Nothing is trying to give its customers quick access to ChatGPT, in this case by pinching its earbuds. This month, Nothing announced that it “integrated Nothing earbuds and Nothing OS with ChatGPT to offer users instant access to knowledge directly from the devices they use most, earbuds and smartphones.”

In the gaming world, for example, MSI this year announced a monitor with a built-in NPU that can quickly alert League of Legends players when an enemy is approaching from outside their field of view.

Another example is AI Shark and its vague claims. This year, it announced technology that brands could license to make an “AI keyboard,” “AI mouse,” “AI game controller” or “AI headphones.” The products supposedly use some unspecified AI tech to learn gaming patterns and adjust accordingly.

Despite my pessimism about the droves of AI marketing hype, if not outright AI washing, likely to barrage tech announcements over the next couple of years, I have hope that consumer interest and common sense will produce enough skepticism to stop some of the worst so-called AI gadgets from getting popular or misleading people.

  • tal · 6 points · 8 months ago

    What are they using as input? Like, you can have software that controls a set of outputs learn which output combinations are good at producing a desired input signal.

    But you gotta have an input, and looking at their products, I don’t see sensors.

    I guess they have smartphone integration, and that’s got sensors, so if they can somehow figure out a way to get useful data from those about what’s arousing, that’d work.
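
    To make that concrete, here’s a minimal, purely hypothetical sketch of the kind of closed loop that would require. The feedback signal is simulated (there’s no obvious real sensor to supply one), the intensity levels are made up, and a simple epsilon-greedy bandit stands in for “learning which output settings produce the best input”; none of this reflects any actual vendor API.

    ```python
    # Hypothetical closed-loop sketch: adjust an output setting based on a
    # measured feedback signal. The sensor is simulated; nothing here is a
    # real Lovense/AI Shark interface.
    import random

    INTENSITY_LEVELS = [0, 1, 2, 3, 4, 5]                   # possible output settings
    estimates = {level: 0.0 for level in INTENSITY_LEVELS}  # running feedback estimates
    counts = {level: 0 for level in INTENSITY_LEVELS}

    def read_feedback(level: int) -> float:
        """Stand-in for a real sensor reading (e.g. from a paired phone); simulated here."""
        return -abs(level - 3) + random.random()             # pretend level 3 works best

    def choose_level(epsilon: float = 0.1) -> int:
        """Epsilon-greedy: mostly pick the best-known level, occasionally explore."""
        if random.random() < epsilon:
            return random.choice(INTENSITY_LEVELS)
        return max(estimates, key=estimates.get)

    def update(level: int, reward: float) -> None:
        """Incrementally average the observed feedback for the chosen level."""
        counts[level] += 1
        estimates[level] += (reward - estimates[level]) / counts[level]

    for _ in range(200):                                     # toy closed loop
        level = choose_level()
        update(level, read_feedback(level))

    print(max(estimates, key=estimates.get))                 # setting the loop settled on
    ```

    Without a real `read_feedback()` source, though, the loop has nothing to learn from, which is the whole point above.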

    googles

    https://techcrunch.com/2023/07/05/lovense-chatgpt-pleasure-companion/?guccounter=1

    Launched in beta in the company’s remote control app, the “Advanced Lovense ChatGPT Pleasure Companion” invites you to indulge in juicy and erotic stories that the Companion creates based on your selected topic. Lovers of spicy fan fiction never had it this good, is all I’m saying. Once you’ve picked your topics, the Companion will even voice the story and control your Lovense toy while reading it to you. Probably not entirely what those 1990s marketers had in mind when they coined the word “multimedia,” but we’ll roll with it.

    Riding off into the sunset in a galaxy far, far away? It’s got you (un)covered. A sultry Wild West drama featuring six muppets and a tap-dancing octopus? No problem, partner. Finally want to dip into that all-out orgy fantasy you have where you turn into a gingerbread man, and you’re leaning into the crisply baked gingerbread village? Probably . . . we didn’t try. But that’s part of the fun with generative AI: If you can think it, you can experience it.

    Of course, all of this is a way for Lovense to sell more of its remote controllable toys. “The higher the intensity of the story, the stronger and faster the toy’s reaction will be,” the company promises.

    Hmm.

    Okay, so the erotica text generation stuff is legitimately machine learning, but that’s not directly linked to their stuff.

    Ditto for LLM-based speech synth, if that’s what they’re doing to generate the voice.

    It looks like they’ve got some sort of text classifier to estimate the intensity (how erotic a given passage of the text is), and then they just scale up the intensity of the device their software is controlling based on that.

    The bit about trying to quantify the emotional content of text isn’t new – sentiment analysis is a thing – but I assume they’re using some existing system to do that, and that they aren’t themselves training the system further based on how people react to their specific product.

    I’m guessing that this is gluing together existing systems that were built with machine learning, rather than doing any learning themselves. Like, they aren’t learning what the relationship is between the settings on their device in a given situation and human arousal. They’re assuming a simple “people want higher device intensity at more intense portions of the text” relationship, and then using existing, already-trained systems as the input.
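
    As a rough illustration of that kind of glue: the sketch below runs each passage through an off-the-shelf sentiment classifier (standing in for whatever “how intense is this passage” model is actually used) and maps its score onto a made-up 0–20 device scale. The model choice, `MAX_INTENSITY`, and `set_intensity()` are assumptions for illustration, not anything Lovense has documented.

    ```python
    # Sketch of gluing a pre-trained text classifier to a device intensity setting.
    # The generic sentiment model is a stand-in for the real intensity classifier;
    # set_intensity() is a made-up placeholder, not a real device API.
    from transformers import pipeline            # pip install transformers

    classifier = pipeline("sentiment-analysis")  # pre-trained; no training happens here

    MAX_INTENSITY = 20                           # hypothetical device scale

    def passage_to_intensity(passage: str) -> int:
        """Map the classifier's confidence on a passage to a 0..MAX_INTENSITY setting."""
        result = classifier(passage)[0]          # e.g. {"label": "POSITIVE", "score": 0.98}
        score = result["score"] if result["label"] == "POSITIVE" else 1.0 - result["score"]
        return round(score * MAX_INTENSITY)

    def set_intensity(level: int) -> None:
        """Placeholder for whatever proprietary call actually drives the hardware."""
        print(f"device intensity -> {level}")

    for passage in ["The campfire crackled quietly.",
                    "Their hearts raced as the storm finally broke."]:
        set_intensity(passage_to_intensity(passage))
    ```

    In other words, all the “learning” happened when someone else trained the classifier; the product-specific part is just a fixed mapping from score to intensity.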

    • blackbelt352@lemmy.world · 8 months ago

      Lovense is basically just making a line go up and down to raise and lower vibration intensities with AI. They have tons of user generated patterns and probably have some tracking of what people are using through other parts of their app. It’s really not that complicated of an application.