Landing : Athabascau University

Latest comments

  • Those face recognition systems can go very wrong...

    Here are deepfake videos of Mark Zuckerberg and Obama. It can be extremely difficult for the general public to tell they're fake (I myself can't tell):

    https://www.cnn.com/2019/06/11/tech/zuckerberg-deepfake/index.html

    And even if some of the companies don't have bad intentions when collecting the data, their systems can be hacked and our data used for malicious purposes. Not only facial data: voice data could potentially be used to imitate our voices for unethical or illegal activities.

    I once got a call on my office number earlier this year, and it was strange that the caller kept asking me weird questions (for example, whether I could confirm my email address, and he wanted me to answer yes). It felt weird, so while still on the phone I googled whether this was some kind of scam, and then I saw this: https://www.cbc.ca/news/canada/edmonton/can-you-hear-me-phone-scam-warning-bbb-1.3970312

    Thank goodness I don't usually say the word "yes" but instead respond to questions with "yeah" (which really frustrated the scammer, since I wouldn't say "yes"...).

    However, if they can deepfake my voice, they can create the word "yes" themselves without me saying it. Here are some more articles on scammers potentially using deepfake voice technology; a CEO in the UK was scammed out of $243,000:

    https://www.forbes.com/sites/jessedamiani/2019/09/03/a-voice-deepfake-was-used-to-scam-a-ceo-out-of-243000/#7fb95fe92241

    https://www.wsj.com/articles/fraudsters-use-ai-to-mimic-ceos-voice-in-unusual-cybercrime-case-11567157402

    https://www.pandasecurity.com/mediacenter/news/deepfake-voice-fraud/

    Jenny Chun Chi Lien October 8, 2020 - 12:58pm

  • Both fascinating and horrific, Gio, thanks for sharing. Your iPhone only stores the data to log you in locally, like PIN and fingerprint data, and it's not your face as such, just a set of data points derived from it: in effect, much the same as a PIN. It's a piece of information that is only held in one physical place that, with luck, you are in complete control of. Apple go to great pains to try to prevent any possible access to it, even by determined professionals with access to the device. But, if they have that, then you have much bigger problems than facial recognition :-) The local secure storage is what makes it relatively secure, and it is why you need to set it up again independently on all your devices. Not *too* worrying! At least, not as worrying as passwords, the hash of which is stored on a server and thus, in principle, hackable even without physical access.

    But those public face recognition systems certainly are very worrying indeed: interesting to reflect on how your behaviour might change if you know you are in a panopticon (note that behavioural change was exactly the point of Bentham's original dystopic invention), especially one in which the perceivers are incredibly fallible and prone to error. There are lots of counter-technologies, of course - e.g. see https://www.wired.co.uk/article/avoid-facial-recognition-software for a top-down overview with some examples. Knowing your enemy is important - these are not intelligent systems, in the sense of being human-like in their perceptions of you! And, like the iPhone (or equivalents in Android, Windoze, etc), not all are evil. It would be interesting to reflect on precisely what it is about the others that makes them more or less evil - I suspect it might help to get to the heart of understanding how social media (and computers in general) have changed the conversation about privacy, and rights of individuals to it.

    Jon

    Jon Dron October 8, 2020 - 12:30pm
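Jon's point above, that a password hash stored on a server is in principle attackable even without physical access to a device, can be illustrated with a minimal Python sketch. The salted PBKDF2 scheme and the toy passwords here are illustrative assumptions, not anything from the thread:

```python
import hashlib
import os

def hash_password(password: str, salt: bytes) -> bytes:
    # Derive a slow, salted hash; this is the kind of record a server stores
    # instead of the password itself.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

# A hypothetical leaked server record: (salt, stored_hash).
salt = os.urandom(16)
stored = hash_password("hunter2", salt)

# An attacker who obtains the leaked record can guess offline,
# with no access to the user's device at all:
for guess in ["password", "letmein", "hunter2"]:
    if hash_password(guess, salt) == stored:
        print(f"cracked: {guess}")
        break
```

A locally stored Face ID template, by contrast, never leaves the device, so there is no server-side record to leak and attack offline in the first place.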

    For all my years in information technology, it seemed that the solutions we come up with are supposed to make things better, faster, easier, etc.  The byproduct of such efficiencies is that workload can be reduced, making organizational positions redundant.  Luckily, I have been in good companies, where people who lose their jobs to technological solutions are repurposed into other value-added activities for the company.  I suspect that is not always the case.  When the question arose about whether IT is making things better than they were, this situation from earlier in the year came to mind regarding Clearview AI's facial recognition software.  After watching this video, I was creeped out and felt like wearing a hood over anything I do in public space (I will not really do this...).  To think that any public image can be mined by this software and stored for easy retrieval within the app.  If you consider every place where there is a potential image, that could be billions of images, both structured (social media sites) and unstructured (street cameras).  The easy access to my likeness is why I don't use the facial recognition feature on my iPhone: I am fearful that somebody could reconstruct my image and possibly even create a realistic image of me somewhere I have never been.

    https://www.cnn.com/videos/business/2020/02/12/facial-recognition-clearview-ai-shorter-orig.cnn-business/video/playlists/business-artificial-intelligence/

    This next website has a list of facial recognition apps and their main use cases.  I found it interesting to review them and think about whether we are better off as a society with each of them or not.  Which ones do you think are really an asset to society and which ones "creep" you out?

    Gio

    Giovanni Tricarico October 8, 2020 - 11:39am

  • Really interesting - it is particularly disturbing that all of this occurs without transparency!

    Jon Dron September 28, 2020 - 12:05pm