A.I. company Worldcoin has rolled out 1,500 Orbs to more than 35 cities in a bid to create digital identities for the world’s citizens.

  • Arotrios@kbin.social · +48/−2 · 1 year ago

    Just hell no. Sounds like a spray paint campaign is in order. I’m gonna go post this on the anarchy subs and see how they feel about it (unless you already got there first).

  • fearout@kbin.social · +31 · edited · 1 year ago

    “I’ve been very interested in things like universal basic income and what’s going to happen to global wealth redistribution,” Sam Altman, Worldcoin’s cofounder

    Holy crap it’s Sam Altman, the CEO of OpenAI. After that recent article about his $2 Kenyan workers it’s much harder to believe in benevolent intentions.

    • explodicle@local106.com · +2 · 1 year ago

      Any time someone creates a new coin instead of using the thousands available, it’s 99.9% a scam. We don’t need a new money supply for a UBI, this has been discussed to death in crypto circles for over a decade.

  • Dubious_Fart@lemmy.ml · +31/−1 · 1 year ago

    Well I look forward to hearing the endless tales of people smashing the fuck out of these, and taking the hardware to figure out how to do greater damage to the entire project.

    • killernova@lemmy.world · +5 · 1 year ago

      I think these would look pretty cool in my art deco living room, and they’re free too! Such a great deal ;)

    • jonesy@lemmy.ml · English · +7 · 1 year ago

      Shiit, if you think about it, we kinda already have a social credit system in the US. It’s less social, I suppose, but it does affect things tied to our social status, like being able to finance a car or a house affordably.

  • NaibofTabr@infosec.pub · +18/−3 · 1 year ago

    Hmm, based on the pictures in the article this thing is basically a camera in a shiny ball about 1ft in diameter (it appears to be about the width of 3 bricks laid side by side). It’s not like a Cloud Gate-sized object. To get a scan of your irises you would have to be pretty close to it for at least a few seconds - it’s not like it could get a scan if you’re just walking by a few feet away. You’d have to walk up and point your face at it on purpose. The camera in it also looks fixed - I doubt it can rotate to follow you, that would be mechanically complex, expensive and prone to failure.

    Based on the description, their software takes an image of your irises and reduces it to a hash value. The original image is deleted (they claim) and the hash value is stored as an ID code. It seems likely that the hash value will be unique to their software - e.g. if you wrote your own code to produce hash values from images, you would get a different number even if you had the same picture of the same eyes. So the hash value doesn’t necessarily represent anything about your eyes that would be much of a privacy invasion… It’s just a mathematically derived number string which is unique to their software.
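    To make that concrete, here’s a toy sketch (purely hypothetical, not Worldcoin’s actual pipeline) of how the same image bytes could map to unrelated ID strings under different software, e.g. if each implementation mixes in its own salt:

```python
import hashlib

def iris_id(image_bytes: bytes, software_salt: bytes) -> str:
    """Reduce raw image bytes to an opaque ID string (hypothetical scheme)."""
    return hashlib.sha256(software_salt + image_bytes).hexdigest()

scan = b"raw-iris-pixels"  # stand-in for a real captured image

id_theirs = iris_id(scan, b"vendor-orb-v1")
id_yours = iris_id(scan, b"my-own-code")

# Same picture of the same eyes, different software -> unrelated IDs.
print(id_theirs != id_yours)  # True
```

    The ID reveals nothing about the eye itself; it only lets the same software recognize the exact same input again.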

    It’s not clear what part of this system is “AI”, though my guess would be it has something to do with re-identifying your eyes next time you want to access whatever is secured with your hash code. It’s really not clear how that would work… a new image of your eyes collected a year later under different lighting conditions would probably produce a different hash value, so how does this system match them, if it only records the hashes?

    FWIW, I think smashing or spray painting these things, while fun ideas in the rebellious teenager sense, is probably overkill and likely to get you more attention from law enforcement than you want. But, you could probably just walk up behind it and slap a sticker or tape over the camera… they’d still have to pay someone to go out and peel it off.

    • jon@kbin.social · +7 · 1 year ago

      Taking a picture instantly after would probably create a different hash value. The thing about hashing is that even if one bit is different between source images, the resulting hashes would look entirely different.
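      A quick demo of that avalanche effect, using an ordinary cryptographic hash (SHA-256 here, just as an illustration):

```python
import hashlib

x = b"\x00iris-scan"
y = b"\x01iris-scan"  # differs from x by a single bit

hx = hashlib.sha256(x).hexdigest()
hy = hashlib.sha256(y).hexdigest()

# One flipped input bit yields a completely unrelated digest.
print(hx != hy)  # True
```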

      I suppose I could conceive of a proprietary hash algorithm that allows fuzzy matching of iris photos, but as you said, photos of the same eyes taken years apart in different conditions wouldn’t match the original hash, or it would falsely match similar-looking eyes. It’s not like this system lets them get high-resolution, perfectly lit iris photos, after all.

      The whole thing sounds dubious, and I suspect AI is mentioned solely to secure investor funding, much like how several years back everything mentioned Blockchain.

      • UFO@programming.dev · +1 · 1 year ago

        They are likely using a form of https://en.wikipedia.org/wiki/Perceptual_hashing

        The noise level a perceptual hash is sensitive to can be tuned.
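        As a rough illustration of the idea (a toy “average hash”, not whatever Worldcoin actually uses): downscale the image to a tiny grayscale grid, threshold each cell against the mean, and compare hashes by Hamming distance, so small noise leaves the hash unchanged:

```python
def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel, thresholded at the mean.

    `pixels` stands in for an already-downscaled grayscale grid;
    a real implementation would first shrink the full image.
    """
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    return bin(a ^ b).count("1")

image = [10] * 32 + [200] * 32   # toy 8x8 "image": dark half, bright half
noisy = [12] * 32 + [197] * 32   # same image with slight sensor noise
other = [200] * 32 + [10] * 32   # a genuinely different image

print(hamming(average_hash(image), average_hash(noisy)))  # 0: noise tolerated
print(hamming(average_hash(image), average_hash(other)))  # 64: clearly different
```

        The grid size and the match threshold on the Hamming distance are the knobs that tune how much noise the hash tolerates.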

        Falsely matching “similar-looking” inputs is harder than one would expect. I used to work on an audio fingerprinting system that was extremely robust against “similar” audio matching: what sounded similar to us was always identified uniquely by the hash, with high confidence.

        For example: the same piano piece, played by the same artist on the same piano, performed as closely as they could to the original, never confused the perceptual hash given ~10 sec of audio. Not once. We could even identify how much of a pre-recorded song was used in a “live” performance.

        There are adversarial attacks on perceptual hashes, but “similar eyes” wouldn’t be one against a standard perceptual hash. A real collision looks more like: a picture of an abstract puppy happens to have the same hash as an eye.

        I’d be curious about the details of the hash; that’s what you’d need to know what the adversarial attacks are. But I see no mention of the details, which is suspicious on its own.

  • Erikatharsis@kbin.social · +11 · 1 year ago

    The absolute absurdity of a news article on nefarious data collection requiring that I enable JS to read it, just so that it can load a ridiculous number of trackers.

  • Tibert@compuverse.uk · +10 · 1 year ago

    What the hell is this? How can they even do this without getting shut down?

    I’m not sure I understood it correctly. Do people just need to look at the mirrored surface to get scanned, and then they get a coin?

    “You didn’t know and don’t have an account? Your bad! We have your eyes now!”

    Or do people need to read a privacy policy and accept everything before they get scanned?