Friday, March 23, 2018

NEW on SHOWCASE


Fiction • “2018: The Year in Review,” by Kersley Fitzgerald •



Hey, it’s Roshni, and this is Organech, the podcast about where life meets technology.

So, it’s almost the new year, 2019, and we’ve seen some big changes this year, haven’t we? Heroin addiction is down, along with addiction to prescription painkillers. Suicides are lower than they have been in years, and still falling. Deaths from heart disease, down. Domestic violence, nose dive. Weirdly enough, pet ownership is up, as are musical instrument purchases and rentals.

But that’s not really what we talk about on Organech, so here are some stats more relevant to tech and how it influences our lives. Hours of computer games played were cut in half around the world. The number of social media users is up slightly, but obsessive use, you know, where people are on Twitter and Instagram all day long—nonexistent. E-books are selling like crazy, including my book, Security in the New World, so thank you very much for that. More strangely? ISIS recruitment on Twitter is nil.

But here's something we’re just beginning to understand—all of this, the drugs, the domestic violence, the video games—they’re all related. And that connection is the continuation of a story we’ve talked about before.

I’m sure you remember the podcasts we’ve done on Cambridge Analytica over the last couple of years. They’re a division of the British company Strategic Communication Laboratories Group, which uses studies of human behavior to tailor-make online ads for people, you know, like how you “like” a post about your friend’s new boots and all of a sudden every page has ads for boots. Unless, of course, you’ve downloaded Privacy Badger to make your surfing experience a little more anonymous.

Anywho, Cambridge Analytica is the more political arm of the SCL Group. You may remember they promised to push undecideds to vote for Brexit in the UK and then worked on the presidential campaigns of Ted Cruz and Ben Carson before briefly working with President Trump. They couldn’t quite deliver what they’d promised, which was nothing less than in-depth analysis of every voter in America as well as the social media tools to sway their vote. They claimed they didn’t have the time to complete a full workup, but we all knew it was just a matter of months before they had American voters in their sights.

Until the Big Hack. Of course, the data from your Alexa, Cortana, Siri, and Google Home were supposed to be secure, but we all know what happened in January. I really wonder where North Korea comes up with such good hackers. For at least a couple of weeks, we tossed all our voice-command systems and avoided social media except to announce that we’d all changed our passwords and gone back to paying our bills by check.

It was mid-February when Yoyodyne Cybernetics released Valet. The ad campaign didn’t focus on turning lights on and off, or writing a grocery list, or telling you if your outfit was on-trend. It was all about the security. And when neither the Ukrainians nor the Israelis nor the North Koreans could hack into Yoyodyne’s databases at the annual Zero Play hacker competition, Valet became the electronic butler of choice.

The great thing about Valet is that it combines all the Big Brothers into one. Off the bat, Yoyodyne had contracts with Apple, Google, Amazon, and even Facebook. Convenience without the push of a button, for the low, low price of your privacy.

But it worked. This thing is so sophisticated it can help with homework in the bedroom, suggest a grocery list for a diabetic in the kitchen, and not order that toy from Amazon that your toddler really wants in the den. There have been no security breaches. And since the data is held in server farms in Europe, for ten euros and a nicely worded letter, you can have a copy of your own profile.

Not that there’s much data to be stored. Valet works on algorithms. It analyzes what it tracks, finds the patterns, then turns those patterns into mathematical formulas without saving the raw data. When your actions match a formula, it responds with a corresponding formula that meets your needs.

By March, Apple, Google, and the others were benefiting so much from Valet that they were paying Yoyodyne. By April, the owner of Yoyodyne, Mr. Francisco A. Truist, had the cash to make another big move.

He bought Cambridge Analytica.

Surprised? Not very many people knew. We tried to get an interview with Mr. Truist, but we’re told he’s a very private person—ironic, no? We did get a nice letter. By mail. It reads:
Thank you for your interest, Roshni. I've been a fan of Organech for many years. [Aw, that's sweet. Or possibly stalkerish. Not sure.] I’m afraid I can’t give interviews, but I also don’t want any ambiguity about the purpose of Yoyodyne, our Valet, or Sociology Research Operations ("Cambridge Analytica" was so pompous!).

We simply exist to make the world a better place. We believe in privacy, efficiency, and good will. We do not believe in political pandering, gratuitous commercialism, or emotional manipulation. I think we’ve made a pretty good start, but we plan on getting better. Look out for our next venture: Solid Truth, an algorithm that will verify the accuracy of news articles.

Keep up the good work,
Al

So, we asked you, the listening audience, what you thought. Do you have a Valet? How’s it been? Are you going to keep it?

Charlie from Portland, Maine, wrote in. She said she got a Valet in August to help her eight-year-old son with homework. Within a week, she started seeing Facebook ads and articles about dyslexia treatments. So she brought it up with the doctor at her son’s next appointment, and guess what? After three months of therapy, his reading has improved by a full grade level.

Pablo called us from New Mexico. His wife was recently diagnosed with celiac disease. She was so sick that he had to do all the shopping and cooking, and he had no idea how to feed her. She had already been using Valet for her shopping, but he noticed the program started making its own lists with recipes attached. The dishes were mostly what they ate regularly, but the more he follows Valet’s instructions, the healthier his wife gets.

We have about four hundred of your letters, emails, and calls. Each one brought me a little closer to getting my own Valet—you remember the disaster last year when I tried a Google Home for a week! Most of your stories were about the little things. Like prompting you to reschedule a meeting so you could make your son’s band concert, finding that perfect gift for your mother-in-law, or posting a link that shows how you can actually do something about the cause du jour instead of just talking about it.

But it was Sherry from Tulsa who convinced me. She had cut off all her social media after the Big Hack, which doesn’t sound so bad except that she suffers from depression and uses Facebook to connect with her friends, family, and a support group. The day she took too many pills, Valet called her husband at work. He got there in time. When she came home, she opened Facebook for the first time in months. It was a while before she realized all the posts, all the Tweets, and all the Instagram photos she saw were positive and personal, as if they’d been hand-picked just for her. And she’s doing much better.

So, I jumped in. I got a Valet. It connects to my Apple Watch and my husband’s Android phone. So far it’s notified me when my son was choking in the other room, reminded me that my dinner guest is a recovering alcoholic, and tracked the GPS on Rex’s collar when he ran away. Twice.

It’s a weird kind of privacy, isn’t it? Instead of our personal data being stored somewhere, it’s like Valet is inside our heads.

Right now, it’s a privacy I’m willing to share.



Kersley Fitzgerald has been a member of the Stupefying Stories crew since before the beginning. She wrote this piece in response to the 12/22/17 Friday Challenge, and given recent events, we just had to publish it now. We stand by all attendant ironies.

Kersley writes that this piece was “totally inspired by Manoush Zomorodi and her podcast Note to Self. It’s awesome. But she had nothing to do with this.”
