
Multi-Party Privacy and AI Autonomous Cars


Multi-party privacy may be violated when you are included in a post of a group photo without permission. (GETTY IMAGES)

By Lance Eliot, the AI Trends Insider

You are at a bar and a friend of yours takes a selfie that includes you in the picture.

Turns out you’ve had a bit to drink and it’s not the most flattering of pictures.

In fact, you look totally plastered.

You are so hammered that you don’t even realize that your friend is taking the selfie and the next morning you don’t even remember there was a snapshot taken of the night’s efforts. About three days later, after becoming fully sober, you happen to look at the social media posts of your friend, and lo-and-behold there’s the picture, posted for her friends to see.

In a semi-panic, you contact your friend and plead with the friend to remove the picture.

The friend agrees to do so.

Meanwhile, turns out that the friends of that person happened to capture the picture, and many of them thought it was so funny that they re-posted it in other venues. It’s now on Facebook, Instagram, Twitter, etc. You look so ridiculous that it has gone viral. Some have even cut out just you from the picture and then made memes of you that are hilarious, and have spread like wildfire on social media.

People at work who only know you from the office (you don’t often associate with them outside the workplace) have come up to you to mention they saw the picture.

Your boss comes to ask you about it.

The company is a bit worried that it might somehow reflect poorly on the firm.

People that you used to know from high school and even elementary school have contacted you to say that you’ve really gone wild (you used to be the studious serious person). Oddly enough, you know that this was a one-time drinking binge and that you almost never do anything like this. You certainly hadn’t been anticipating that a picture would capture the rare moment.

Frustratingly, it’s a misleading characterization of who you are.

A momentary lapse that has been blown way out of proportion.

People that you don’t know start to bombard your social media sites with requests to get linked.

Anyone that parties that hard must be worth knowing. Unfortunately, most of the requests are from creepy people. Scarier still is that they have Googled you and found out all sorts of aspects about your personal life. These complete strangers are sending you messages that appear as though they know you, doing so by referring to places you’ve lived, vacations you’ve taken, and so on.

Sadly, this leads to identity theft attempts, such as raids on your bank account or the opening of credit cards in your name, and so on. It leads to cyber-stalking by nefarious hackers. Social phishing ensues.

If this seems like a nightmare, I’d say that you can wake up now and pretend that it was all a dream.

A really ugly dream.

Let’s also make clear that it could absolutely happen.

Multi-Party Privacy A Looming Issue

Many people using social media seem not to realize that their privacy is not necessarily controlled solely by themselves.

If you end up in someone else’s snapshot, maybe tangentially in the foreground or even in the background, there’s a chance that you’ll now be found on social media if that person posts the photo.

Facial recognition for photos has become quite proficient. In the early days of social media, a person’s face had to be fully forward-facing and completely visible to the camera, the lighting had to be topnotch, and basically only a pristine kind of facial shot could be recognized by the computer. Also, there were so few faces on social media that the computer could only determine that a face was present; it wasn’t able to guess whose face it was.

Nowadays, facial recognition is so good that your head can be turned and barely visible to the camera, the lighting can be crummy, and there can be blurs and other defects, and yet the computer can find a face. And it can label the face by using the millions of faces already found and tagged. Remaining obscure in a photo online is no longer feasible for very long.
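To make the tagging step concrete, here is a minimal sketch, in Python, of how a system might match a newly detected face against a database of already-tagged faces by comparing embedding vectors with cosine similarity. The names, the tiny 4-dimensional embeddings, and the threshold are all hypothetical; real systems use learned embeddings with hundreds of dimensions, but the matching logic looks roughly like this:

```python
import numpy as np

def tag_face(query_embedding, tagged_db, threshold=0.8):
    """Return the name of the closest tagged face, or None if no match.

    tagged_db maps a person's name to a face-embedding vector.
    """
    best_name, best_score = None, threshold
    q = query_embedding / np.linalg.norm(query_embedding)
    for name, emb in tagged_db.items():
        # Cosine similarity between the query face and a known face.
        score = float(np.dot(q, emb / np.linalg.norm(emb)))
        if score > best_score:
            best_name, best_score = name, score
    return best_name

# Hypothetical embeddings; real ones come from a deep network.
db = {
    "alice": np.array([1.0, 0.0, 0.0, 0.0]),
    "bob":   np.array([0.0, 1.0, 0.0, 0.0]),
}
print(tag_face(np.array([0.9, 0.1, 0.0, 0.0]), db))  # close to "alice"
```

The larger the tagged database grows, the more likely any new face in any new photo finds a confident match, which is exactly why obscurity erodes over time.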

People are shocked to find that they went to the mall and all of a sudden there are postings that have them tagged in the photos. You are likely upset because you were just minding your own business at the mall. You didn’t take a photo of yourself, and nor did a friend. But because other people were taking photos, and because of the widespread database of faces, once those fellow mall shoppers posted their pictures, it was easy enough for a computer to automatically tag you. No human intervention needed.

Notice also that in the story about being in a bar and a friend having taken a snapshot, even if your friend agrees to remove the picture from being posted, the odds are that once it’s been posted you’ll never be able to stop it from being promulgated.

There’s a rule-of-thumb these days that once something gets posted, it could be that it will last forever (unless you believe that someday the Internet will be closed down – good luck waiting for that day).

I realize you are likely already thinking about your own privacy when it comes to your own efforts, such as a selfie of yourself that you made and that you posted.

You might be very diligent about only posting selfies that you think showcase your better side.

You might be careful not to post a blog that might use foul words or offend anyone.

You might be cautious about filling in forms at web sites and be protective about private information.

Unfortunately, unless you live on a deserted island, the odds are that you are around other people, and the odds are that those people are going to capture you in their photos.

I suppose you could walk around all day long with a bag over your head. When people ask you why, you could tell them you are trying to preserve your privacy. You are trying to remain anonymous. I’d bet that you’d get a lot of strange stares and possibly people calling the police to come check-out the person wearing the bag over their head.

In some cases, you’ll perhaps know that you are in a photo and that someone that you know is going to post it. You went to a friend’s birthday party on Saturday and photos were taken. The friend already mentioned that an online photo album had been setup. You’ll be appearing in those photos. That’s something you knew about beforehand. There’s also the circumstance of being caught up in a photo that you didn’t know was being taken, and might have been a snapshot by a complete stranger, akin to the mall example earlier.

So, let’s recap:

  • You took a selfie, which you knew about because you snapped it, and then you posted it
  • You end-up in someone else’s photo, whom you know, and they posted it, but you didn’t know they would post it
  • You end-up in someone else’s photo, whom you know, and they posted it with your blessings
  • You end-up in someone else’s photo, a complete stranger, and they posted it but you didn’t know they would post it
  • You took a photo of others, whom you know, and you posted it, but you didn’t let them know beforehand
  • You took a photo of others, whom you know, and you posted it with their blessings
  • You took a photo of others, complete strangers, and you posted it but they didn’t know you would post it
  • Etc.

I purposely have pointed out in the aforementioned list that you can be both the person “victimized” by this and also the person that causes others to be victimized. I say this because I know some people that have gotten upset that others included them in a photo, and posted it without getting the permission of that person, and yet this same person routinely posts photos that include others and they don’t get their permission. Do as I say, not as I do, that’s the mantra of those people.

There’s a phrase for this multitude of participants involved in privacy: it is referred to as Multi-Party Privacy (MP).

Details About Multi-Party Privacy

Multi-Party Privacy has to do with trying to figure out what to do about intersecting privacy aspects in a contemporary world of global social media.

You might be thinking that privacy is a newer topic and that it has only emerged with the rise of the Internet and social media.

Well, you might be surprised to know that in 1948 the United Nations adopted a document known as the Universal Declaration of Human Rights (UDHR) and Article 12 refers to the right of privacy. Of course, few at that time could envision fully the world we have today, consisting of a globally interconnected electronic communications network and the use of social media, and for which it has made trying to retain or control privacy a lot harder to do.

When you have a situation involving MP, you can likely have an issue arise with conflict among the participants in terms of the nature of the privacy involved. In some cases, there is little or no conflict and the MP might be readily dealt with, thus it is easy to ensure the privacy of the multiple participants.

More than likely, you’ll have to deal with Multi-Party Privacy Conflicts (MPC), wherein one or more parties disagree about the privacy aspects of something that intersects them.

In the story about you being in the bar and your friend snapped the unbecoming picture and posted it, you might have been perfectly fine with this and therefore there was no MPC. But, as per the story, you later on realized what had happened, and so you objected to your friend about the posting. This was then a conflict.

This was an MPC: multiple parties involved in a matter of privacy, over which they have a conflict, because one of them was willing to violate the privacy of the other, while the other was not willing to allow it.

In this example, your friend quickly acquiesced and agreed to remove the posting.

This seemingly resolved the MPC.

As mentioned, even if the MPC seems to be resolved, it can unfortunately be a situation wherein the horse is already out of the barn. The damage is done and cannot readily be undone. Privacy can be usurped, even if the originating point of the privacy breach is later somehow fixed or undone.

I realize that some of you will say that you’ve had such a circumstance and that rather than trying to un-post the picture, you merely removed the tag that had labeled you in the picture. Yes, many of the social media sites allow you to un-tag something that was either manually tagged or automatically tagged. This would seem to put you back into anonymity.

If so, it is likely short-lived.

All it will take is for someone else to come along and decide to re-apply a tag, or an automated crawler that does it. Trying to return to a state of anonymity is going to be very hard to do as long as the picture still remains available. There will always be an open chance that it will get tagged again.

I’ll scare you even more so.

There are maybe thousands of photos right now with you in them, perhaps in the background while at the train station, or while in a store, or at a mall, or on vacation in the wilderness. You might not yet be tagged in any of those. The more that we continue to move toward this global massive inter-combining of social media and the Internet, and the more that computers advance and computing power becomes less costly, those seemingly “obscure” photos are bound to get labeled.

Every place that you’ve ever been, in every photo so captured and posted online, might ultimately become a tagged indication of where you were. Plus, the odds are that the photo has other info embedded in it, such as the date and time of the photo, and the latitude and longitude of the photo location. Not only are you tagged, but now we’ll know when you were there and where it was. Plus, whoever else is in the photo will be tagged, so we’ll all know who you were with.
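That embedded info is typically EXIF metadata: most cameras and phones record the capture timestamp plus GPS coordinates stored as degree/minute/second values. As a minimal sketch of how easily those become the decimal latitude/longitude a mapping service uses (the sample values below are hypothetical, roughly downtown Los Angeles):

```python
def dms_to_decimal(degrees, minutes, seconds, ref):
    """Convert EXIF-style degrees/minutes/seconds plus a hemisphere
    reference ('N', 'S', 'E', or 'W') to signed decimal degrees."""
    value = degrees + minutes / 60.0 + seconds / 3600.0
    return -value if ref in ("S", "W") else value

# Hypothetical EXIF GPSInfo values for a posted photo.
lat = dms_to_decimal(34, 3, 8.0, "N")
lon = dms_to_decimal(118, 14, 37.3, "W")
print(round(lat, 5), round(lon, 5))
```

A few lines like this, run over a scraped photo collection, yields a time-stamped map of where someone has been.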


Time to give it all up, and go live in a cave.

Say, are there any cameras in that cave?

Multi-Party Privacy And AI Autonomous Cars

What does all this have to do with AI self-driving driverless autonomous cars?

At the Cybernetic AI Self-Driving Car Institute, we are developing AI software for self-driving cars. We’re also aware of the potential privacy aspects and looking at ways to deal with them from a technology perspective (it will also need to be dealt with from a societal and governmental perspective too).

I’ve already previously discussed at some length the matter of privacy and AI self-driving cars, so please do a quick refresher and take a look at that article:

I’ve also pointed out that many of the pundits in support of AI self-driving cars continually hammer away at the benefits of AI self-driving cars, such as the mobility possibilities, but they often do so with idealism and don’t seem to be willing to also consider the downsides such as privacy concerns – see my article about idealism and AI self-driving cars:

Some are worried that we’re heading towards a monstrous future by having AI self-driving cars, including the potential for large-scale privacy invasion, as such, see my article about the AI self-driving car as a Frankenstein moment:

Herein, let’s take a close look at Multi-Party Privacy and the potential for conflicts, or MPC as it relates to AI self-driving cars.

An AI self-driving car involves these key aspects as part of the driving task:

  • Sensor data collection and interpretation
  • Sensor fusion
  • Virtual world model updating
  • AI action plans
  • Car controls commands issuance

This is based on my overarching framework about AI self-driving cars, which you can read about here:
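As a rough sketch of the cycle those stages imply, the driving loop might look like the following. All class and function names here are illustrative, not any vendor’s actual API, and the "sensor fusion" and "action plan" steps are trivially simplified:

```python
from dataclasses import dataclass, field

@dataclass
class WorldModel:
    # Objects the AI currently believes surround the car.
    objects: list = field(default_factory=list)

def driving_cycle(sensors, world: WorldModel):
    # 1. Sensor data collection and interpretation
    readings = [s.read() for s in sensors]
    # 2. Sensor fusion: combine readings into one view (here, a flat merge)
    fused = [obj for r in readings for obj in r]
    # 3. Virtual world model updating
    world.objects = fused
    # 4. AI action plan (trivially: brake if anything is within 5 meters)
    plan = "brake" if any(o["dist"] < 5.0 for o in fused) else "cruise"
    # 5. Car controls commands issuance
    return {"throttle": 0.0 if plan == "brake" else 0.4, "plan": plan}

class FakeCamera:
    def read(self):
        return [{"kind": "pedestrian", "dist": 3.2}]

cmd = driving_cycle([FakeCamera()], WorldModel())
print(cmd)  # pedestrian at 3.2 m, so the plan is to brake
```

The privacy question arises at step 1: everything the loop "reads" is raw imagery and rangefinding of whatever, and whomever, is nearby.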

You would normally think about the sensors of an AI self-driving car that face outward and detect the world around the self-driving car. There are cameras involved, radar, sonar, LIDAR, and the like. These are continually scanning the surroundings and allow the AI to ascertain as best possible whether there is a car ahead, or whether there might be a pedestrian nearby, and so on. Sometimes one kind of sensor might be blurry or not getting a good reading, and thus the other sensors are relied upon more heavily. Sometimes they are all working well and a generally full sensing of the environment is possible.

One question for you is how long will this collected data about the surroundings be kept?

You could argue that the AI only needs the collected data from moment to moment. You are driving down a street in a neighborhood. As you proceed along, every second the sensors are collecting data. The AI is reviewing it to ascertain what’s going on. You might assume that this is a form of data streaming and there’s no “collection” per se of it.

You’d likely be wrong in that assumption.

Some or all of that data might indeed be collected and retained. For the on-board systems of the self-driving car, perhaps only portions are being kept. The AI self-driving car likely has an OTA (Over The Air) update capability, allowing it to use some kind of Internet-like communications to connect with an in-the-cloud capability of the automaker or tech firm that made the AI system. The data being collected by the AI self-driving car can potentially be beamed to the cloud via the OTA.

For my article about OTA, see:

There are some AI developers that are going to be screaming right now and saying that Lance, there’s no way that the entire set of collected data from each AI self-driving car is going to be beamed-up. It’s too much data, it takes too much time to beam-up. And, besides, it’s wasted effort because what would someone do with the data?

I’d counter-argue that with compression and with increasingly high-speed communications, it’s not so infeasible to beam-up the data.

Plus, the data could be stored temporarily on-board the self-driving car and then piped up at a later time.
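The store-then-upload pattern is routine telematics engineering. Here is a minimal sketch of on-board buffering with a compression step, using Python’s zlib; the buffering and the eventual "upload" are simulated, and the frame format is hypothetical:

```python
import json
import zlib

class SensorLogBuffer:
    """Buffer sensor frames on-board; compress when ready to upload."""

    def __init__(self):
        self.frames = []

    def record(self, frame: dict):
        self.frames.append(frame)

    def flush(self) -> bytes:
        # Compress the accumulated frames into a payload that could be
        # beamed up via OTA at a later, convenient time.
        payload = zlib.compress(json.dumps(self.frames).encode())
        self.frames = []
        return payload

buf = SensorLogBuffer()
for t in range(100):
    buf.record({"t": t, "speed": 12.5, "objects": ["car", "pedestrian"]})
blob = buf.flush()

# The receiving cloud side reverses the process.
frames = json.loads(zlib.decompress(blob))
print(len(blob), "bytes for", len(frames), "frames")
```

Repetitive sensor logs compress extremely well, which is part of why "it’s too much data to beam up" is a weaker objection than it sounds.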

In terms of what the data could be used for, well, that’s the million-dollar question.

Or, maybe billion-dollar question.

Exploiting Multi-Party Private Data

If you were an auto maker or tech firm, and you could collect the sensory data from the AI self-driving cars that people have purchased from you, would you want to do so?

Sure, why not.

You could use it presumably to improve the capabilities of the AI self-driving cars, mining the data and improving the machine learning capabilities across all of your AI self-driving cars.

That’s a pretty clean and prudent thing to do.

You could also use the data in case there are accidents involving your AI self-driving car.

By examining the data after an accident, perhaps you’d be able to show that the dog that the AI self-driving car hit was hidden from view and darted out into the street at the last moment. This might be crucial from a public perception that the seemingly evil AI ran over a dog in the roadway. The data might also have important legal value. It might be used for lawsuits against the automaker or tech firm. It might be used for insurance purposes to set rates of insurance. Etc.

Let’s also though put on our money making hats.

If you were an auto maker or tech firm, and you were collecting all of this data, could you make money from it? Would third parties be willing to pay for that data? Maybe so. When you consider that the AI self-driving car is driving around all over the place, and it is kind of mapping whatever it encounters, there’s bound to be business value in that data.

It could have value to the government too.

Suppose your AI self-driving car was driving past a gas station just as a thief ran out of the attached convenience store. Voila, your AI self-driving car might have captured the thief on the video that was being used by the AI to navigate the self-driving car.

In essence, with the advent of AI self-driving cars, wherever we are, whenever we are there, the roaming AI self-driving cars are going to up the ante on video capture. If you were already leery about the number of video cameras on rooftops and walls and poles, the AI self-driving car is going to increase that number exponentially.

Don’t think of the AI self-driving car as a car, instead think of it as a roaming video camera.

Right now, there are 250+ million cars in the United States.

Imagine if every one of those cars had a video camera, and the video camera had to be on whenever the car was in motion.

That’s a lot of videos. That’s a lot of videos of everyday activities.

I challenge you to later today, when in your car, look around and pretend that all the other cars have video cameras and are recording everything they see, every moment.

Eerie, yes?

Exponential Increase In Multi-Party Privacy Concerns

The point herein is that if you believe in the Multi-Party Privacy issue, the AI self-driving car is going to make the MP become really big-time.

And, the MPC, the conflicts over privacy, will go through the roof.

You opt to take your AI self-driving car to the local store. It captures video of your neighbors outside their homes, mowing the lawn, playing ball with their kids, watching birds, you name it. All of that video, in the normal everyday course of life activities. Suppose it gets posted someplace online. Did any of them agree to this? Would they even know they had been recorded?

I assure you that the sensors and video cameras on an AI self-driving car are so subtle that people are not going to realize that they are being recorded.

It’s not like the old days where there might be a large camera placed on the top of the car and someone holding up a sign saying you are being recorded. It will be done without any realization by people. Even if at first they are thinking about it, once AI self-driving cars become prevalent it will just become an accustomed aspect.

And, suppose the government mandated that a red recording light had to be placed on the top of an AI self-driving car, what would people do? Stop playing ball in the street, hide behind a tree, or maybe walk around all day with a bag over their heads?


One unanswered question right now is whether you as the owner of an AI self-driving car will get access to the sensor data collected by your AI self-driving car. You might insist that of course you should, it’s your car, darn it. The auto makers and tech firms might disagree and say that the data collected is not your data, it is data owned by them. They can claim you have no right to it, and furthermore that your having it might undermine the privacy of others. We’ll need to see how that plays out in real life.

Let’s also consider the sensors that will be pointing inward into the AI self-driving car.

Yes, I said pointing inward.

There are likely to be both audio microphones inside the AI self-driving car and cameras pointing inward. Why? Suppose you put your children into the AI self-driving car and tell the AI to take them to school. I’m betting you’d want to be able to see your children and make sure they are okay. You’d want to talk to them and let them talk to you. For this and a myriad of other good reasons, there are going to be cameras and microphones aimed inward inside AI self-driving cars.

If you were contemplating the privacy aspects of recording what the AI self-driving car detects outside of the self-driving car, I’m sure you’ll be dismayed at the recordings of what’s happening inside the AI self-driving car.

Here’s an example.

It’s late at night. You’ve been to the bar. You want to get home. You are at least aware enough to not drive yourself. You hail an AI self-driving car. You get into the AI self-driving car, there’s no human driver. While in the AI self-driving car, you hurl whatever food and drink you had ingested while at the bar. You freak out inside the AI self-driving car due to drunkenness and you ramble about how bad your life is. You yell about the friends that don’t love you. You are out of your head.

Suppose the AI self-driving car is a ridesharing service provided by the Acme company. They now have recorded all of your actions while inside the AI self-driving car. What might they do with it? There’s also the chance that the ridesharing service is actually somebody’s personal AI self-driving car, but they let it be used when they aren’t using it, trying to earn some extra dough. They now have that recording of you in the AI self-driving car.

Eerie (again), yes?

There might be some AI self-driving ridesharing services that advertise they will never ever violate your privacy and that they don’t record what happens inside their AI self-driving cars.

Or, there might be AI ridesharing services that offer for an extra fee they won’t record. Or, for an extra fee they will give you a copy of the recording.

You might say that it is a violation of your privacy to have such a recording made.

But, keep in mind that you willingly went into the AI self-driving car.

There might even be some kind of agreement you agreed to by booking the ridesharing service or by getting into the self-driving car.

Some have suggested that once people know they are being recorded inside of a self-driving car, they’ll change their behavior and behave.

This seems so laughable that I can barely believe the persons saying this believe it. Maybe when AI self-driving cars first begin, we’ll sit in them like we used to do in an airplane, and be well-mannered and such, but after AI self-driving cars become prevalent, human behavior will be human behavior.

There are some that are exploring ways to tackle this problem using technology. Perhaps, when you get into the AI self-driving car, you have some kind of special app on your smartphone that can mask the video being recorded by the self-driving car and your face is not shown and your voice is scrambled. Or, maybe there is a bag in the self-driving car that you can put over your head (oops, back to the bag trick).
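One simple way such a masking app could work on the video side is to pixelate the region containing a rider’s face, replacing each block of pixels with its average. The sketch below uses NumPy on a fake grayscale frame; the region coordinates stand in for the output of a face detector, which is assumed rather than shown:

```python
import numpy as np

def pixelate_region(frame, top, left, height, width, block=8):
    """Return a copy of frame with the given rectangle pixelated:
    each block-by-block patch is replaced by its average value."""
    out = frame.copy()
    for r in range(top, top + height, block):
        for c in range(left, left + width, block):
            patch = out[r:r + block, c:c + block]
            patch[:] = patch.mean()  # writes through the view into out
    return out

# A fake 64x64 grayscale frame; the "face" is at rows/cols 16..47.
frame = np.arange(64 * 64, dtype=float).reshape(64, 64)
masked = pixelate_region(frame, 16, 16, 32, 32)
print(np.array_equal(frame, masked))  # False: the face region is obscured
```

Pixelation is reversible-looking but lossy; the detail inside each block is genuinely discarded, unlike a mere un-tagging.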

The Multi-Party Privacy issue arises in this case because there is someone else potentially capturing your private moments and it is in conflict with how you want your private moments to be used. Let’s extend this idea. You get into an AI self-driving car with two of your friends. You have a great time in the self-driving car. Afterward, one of you wants to keep and post the video, the other does not. There’s another MPC.


Some people will like having the video recordings of the interior of the AI self-driving car.

Suppose you take the family on a road trip. You might want to keep the video of both the interior shenanigans and the video captured of where you went. In the past, you might show someone a few pictures of your family road trip. Nowadays, you tend to show them video clips. In the future, you could show the whole trip, at least from the perspective of whatever the AI self-driving car could see.

For my article about family road trips and AI self-driving cars, see:

I hope that this discussion about Multi-Party Privacy does not cause you to become soured on AI self-driving cars.

Nor do I want this to be something of an alarmist nature.

The point more so is that we need to be thinking now about what the future will consist of. The AI developers crafting AI self-driving cars are primarily focused on getting an AI self-driving car to be able to drive. We need to be looking further ahead, and considering what other qualms or issues might arise. I’d bet that MPC will be one of them.

Get ready for privacy conflicts.

There are going to be conflicts about conflicts, you betcha.

Copyright 2019 Dr. Lance Eliot

This content is originally posted on AI Trends.

[Ed. Note: For readers interested in Dr. Eliot’s ongoing business analyses about the advent of self-driving cars, see his online Forbes column:]



