© 2019 Estate Living. All rights reserved.


Artificially yours


Artificial intelligence – real stupidity, really creepy. Or really clever, really great? Depends on how you look at it. And how you use it.

Artificial intelligence (AI) goes by many names – some refer to it as machine learning, while others prefer neural networks. Irrespective of what you call it, AI is here to stay, with many experts predicting that this will be the year it hits the mainstream.

Chances are you’ve already experienced some form of AI in your life. Whether it is through that handy digital assistant on your phone, or a chatbot online, the technology has started to permeate virtually all facets of our lives.

Talk (dirty) to me

A few years ago, the likes of Facebook and Skype started incorporating bots as a friendlier and faster way of getting people the information they want. So, instead of working through a Help function or a Q&A document, you could type in a question and the bot would answer. Funnily enough, people started enjoying these ‘chats’ with bots – so much so that many other companies started using them to help people find all sorts of interesting things: cheap places to book for the Easter holidays, which restaurants are nearby, and even which neighbourhoods are good places to buy property. Of course, this information is built on the data the companies have stored in their back-end systems – the AI just unlocks it in a more personable manner.

But some people take the personable concept a bit far. In a hilarious episode of The Big Bang Theory, Raj falls in love with Siri. When he asks her to play ‘some smooth jazz’, she chooses his favourite artiste, and he responds, ‘This woman can read me like a book.’ Later, he even fantasises about giving her flowers. This is in line with the results of a survey by bot analytics platform Dashbot, which revealed that 2.5% of the images and information sent to bots are – uhm – ‘adult’ in nature. That’s not a huge percentage, but think it through: for every 200 queries about restaurants, books, movies, recipes and yoga poses, five people sent compromising selfies to … well, to a chunk of code!

You read right. There are people (mostly men but also women) who are happily sending naked selfies to bots. Given that there are already thousands of bots available for chats, that is a significant number of naked photos doing the rounds for the amusement of AIs everywhere. Perhaps it is simply a case of making sure the bots love us before taking over the world.

Trust the machine

More seriously, though, a survey by analytics specialist company SAS found that 47% of respondents said they would rather go to an AI doctor than a human one. Of course there must be some sample bias there because the survey was online, but even so. Already, the AI healthcare market is on track to hit more than US$6 billion by 2021. This has led some to think that AI-driven bots could replace apps as the next big thing. Think about it – instead of downloading an app to buy from Amazon, to summon an Uber or to swipe right on Tinder, you can have a bot doing it. The bot becomes your interface to the digital world.

Imagine Siri on steroids – only better, and able to understand what you are talking about.

Take CNN’s Kik bot as an example. Targeting young readers, it focuses on giving them the basics of big news stories. They can learn more by tapping a series of conversation prompts that offer specific details about what has been going on. With several rule-based chatbot tools available for download, more companies and people can create bots themselves.

Misplaced trust?

But, as with anything good, there’s bound to be a dark side, and the same is true for bots. South Africans have a better sense of privacy and security than most, but we still fall under the spell of technology when it makes things so convenient. Already your mobile device is storing a wealth of personal information. It’s tempting to think that it’s only celebs and public figures who need to be careful about their personal info because – really – who cares what brand of toothpaste you use, or where you go to gym?

It might seem innocent to tell your bot or virtual assistant (think Siri, Alexa and the like) how much you hate traffic, or how much you love a specific product – but you never know where that data is going to end up. At the less scary end of the privacy-intrusion spectrum is an online store sending you specials on that product you love. But it can get scarier – especially when the cyber world meets the real world.

Think through this scenario: if you regularly ask for traffic conditions at a specific time of day on a specific route, that tells anyone who may want to know that you will not be at home at that time and – possibly scarier – that you will be driving down a dark, lonely road at a specific time in your shiny new SUV. You don’t need a lot of imagination to work out what smart criminals can do with that info.

Yes, this might seem extreme, but just because you’re paranoid doesn’t mean ‘they’ aren’t out to get you. So, while embracing the AI life does offer advantages, remain vigilant in how you use it, and how much you ‘trust the machine’. When the bots rise up, they could very well use that information against you.

Perhaps then that naked selfie you sent may stand you in good stead. Or not!
