The Real Slim Shady

I have an admission to make. It’s something I have a visceral reaction to, and it’s a problem I’m still working on: I hate ghosting.

I’m no stranger to seeking out professional help: when I approached my business coach about the anger I felt when people didn’t respond to my sales emails, he said that just because we reach out to people, it doesn’t mean they have to respond. It took me a while to realise it, but it’s true: people are not obligated to communicate.

With that said, this non-obligation to communicate has earned terrible reputations for myriad industries, from recruitment to real estate and, in more personal circles, online dating. The vast, tentacled reach of the internet has resulted in the commodification of the human experience: we can say what we want, when we want it, and if we don’t get it, there’s always another search string, another platform, another AI prompt.

Speaking of which, it was only a matter of time before we got our own personal AI.

So many questions arise when technology is implemented in this way:

  1. Whose data did they train this AI on?

    When we’re training AI models, they learn from huge swathes of data. GPT-3, for example, was trained on hundreds of billions of tokens, and GPT-4 is reported to have been trained on trillions. If models get better the more (good) data they learn from, then the best models have learned from huge amounts of excellent data. The problem with getting an AI to learn from personal data is that the data is personal. I don’t remember opting my personal data in to train this, and I’m sure thousands of other people didn’t either, so whose data have they trained it on?

  2. Why and how did we rationalise this data?

    Data is only as good as its lack of bias. When we’re training AI, a key step is to rationalise the data it is trained on: deciding what goes in, what gets filtered out, and how its inevitable skews are balanced. (A deliberately simplified sketch of what this can look like in practice follows after this list.)

So when the Founder of Personal AI says that it is unbiased by the world, that claim is hard to square with how it learns: the nature of human existence, biased as it is, has had to be at the core of its training for it to learn anything at all. One could argue that, in order for this Personal AI to mirror your voice, your style and everything about you, it has had to imbibe those biases from its training data in order to make its inferences.

  3. Are we strengthening human relationships by allowing a machine to learn how we speak and answer on our behalf?

    Nothing screams “I cannot authentically communicate with you anymore” quite like “auto-reply with co-pilot.” Authenticity and vulnerability are key parts of the human condition, and they are powerful when we’re communicating human-to-human; training an AI to auto-respond is the antithesis of that.
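
For anyone who wants a concrete picture of what “rationalising” training data can mean in practice (the second question above), here is a deliberately simplified Python sketch. Nothing in it reflects Personal AI’s actual pipeline; the examples, source labels and weighting scheme are entirely hypothetical. It only illustrates the basic idea that you can measure which voices dominate a dataset and down-weight them, so the model doesn’t inherit that skew wholesale.

```python
# A toy illustration (not any vendor's real pipeline) of one common step in
# "rationalising" training data: measure how skewed the data is towards
# certain sources, then reweight so no single source dominates.

from collections import Counter

# Hypothetical training examples, each tagged with the source it came from.
examples = [
    {"text": "Happy to help, let me check.", "source": "work_email"},
    {"text": "Sounds good, see you at 7!",   "source": "personal_chat"},
    {"text": "Per my last email...",         "source": "work_email"},
    {"text": "Thanks for the update.",       "source": "work_email"},
    {"text": "Miss you, call me later x",    "source": "personal_chat"},
]

# Count how many examples come from each source.
counts = Counter(ex["source"] for ex in examples)

# Weight each example inversely to how common its source is, so
# over-represented sources don't drown out everything else.
weights = [1.0 / counts[ex["source"]] for ex in examples]

for ex, w in zip(examples, weights):
    print(f"{ex['source']:>14}  weight={w:.2f}  {ex['text']}")
```

Even in this toy form, the point stands: someone has to decide what counts as a “source”, what counts as over-represented, and what the balance should be. Those decisions are themselves a worldview, which is why the claim of being “unbiased by the world” is so hard to take at face value.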

The stoics amongst us will argue that this, potentially, is the end of ghosting. Gone will be the days of the Out of Office: we’ll always be connected, we won’t forget Mother’s Day or, dare I say it, our anniversaries. But I would argue that this presents a false standard: in order to ghost, there had to be something that was once alive.
